Map Reduce Wordcount in Python using gRPC

Overview

Map-Reduce Word Count

This project is implemented in Python using gRPC. The input files are provided in .txt format and a word count is performed on them.

The Medium article for understanding the code better is Learn gRPC with an Example.

Description

MapReduce is a programming model and an associated implementation for processing and generating large data sets. Users specify a map function that processes a key/value pair to generate a set of intermediate key/value pairs, and a reduce function that merges all intermediate values associated with the same intermediate key. Programs written in this functional style are automatically parallelized and executed on a large cluster of commodity machines. The run-time system takes care of the details of partitioning the input data, scheduling the program's execution across a set of machines, handling machine failures, and managing the required inter-machine communication.

Installation and Usage

Setup

Clone this repository:

$ git clone https://github.com/divija-swetha/coding-exercise.git

Dependencies

$ python -V
    Python 3.8.5
$ python -m pip install grpcio
$ python -m pip install grpcio-tools

Code Structure

There are three main files: the client, the worker and the driver. The client supplies the input files, the number of output files and the worker ports (e.g. 127.0.0.1:4001). The worker nodes are launched with their ports and are responsible for the map and reduce operations. The driver takes the input from the client and distributes the work among all the worker nodes.

Proto Files

The code implementation begins with writing the proto files for the driver and worker.

driver.proto

The driver proto file exposes the RPC that launches the data processing operation, which is carried out when rpc launchDriver (launchData) returns (status); is executed in the code.
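A minimal sketch of what driver.proto might contain is shown below; the exact message fields (input path, number of reduce tasks, worker addresses) are assumptions and may differ from the actual file.

syntax = "proto3";

// Hypothetical message layout -- field names are assumptions.
message launchData {
    string inputPath = 1;             // folder containing the .txt input files
    int32 numReduce = 2;              // number of output (reduce) buckets
    repeated string workerPorts = 3;  // e.g. "4001", "4002", "4003"
}

message status {
    string message = 1;               // e.g. "task completed"
}

service Driver {
    rpc launchDriver (launchData) returns (status);
}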

Once the proto file is written, the following command is executed in the terminal.

$ python -m grpc_tools.protoc -I. --python_out=. --grpc_python_out=. driver.proto

This generates two files in the directory, named driver_pb2_grpc.py and driver_pb2.py.

worker.proto

The worker proto file sets the driver port and carries out the map and reduce operations. An additional method, die, is provided to terminate the process. The worker methods are as follows (a sketch of the full file is shown after the list):

rpc setDriverPort(driverPort) returns (status);
rpc map(mapInput) returns (status);
rpc reduce (rid) returns (status);
rpc die (empty) returns (status);
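For context, here is a sketch of how these RPCs could sit inside a complete worker.proto; the message fields are assumptions chosen to match the description in this README.

syntax = "proto3";

// Hypothetical messages -- field names are assumptions.
message driverPort { string port = 1; }                             // address of the driver
message mapInput   { string filename = 1; int32 numBuckets = 2; }   // one input file per map call
message rid        { int32 id = 1; }                                // reduce bucket id
message empty      {}
message status     { string message = 1; }

service Worker {
    rpc setDriverPort (driverPort) returns (status);
    rpc map (mapInput) returns (status);
    rpc reduce (rid) returns (status);
    rpc die (empty) returns (status);
}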

As with the driver, the following command is executed in the terminal.

$ python -m grpc_tools.protoc -I. --python_out=. --grpc_python_out=. worker.proto

This generates two files in the directory, named worker_pb2_grpc.py and worker_pb2.py.

Python Files

Once the proto files are ready, the Python files for the client, driver and worker are written.

worker.py

The files worker_pb2_grpc.py and worker_pb2.py are imported along with the required Python libraries. The map and reduce operations are defined in the worker class, which also connects to the driver port.
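A rough structural sketch of worker.py is given below. The servicer class and message names follow the worker.proto sketch above and are assumptions; the map and reduce bodies are covered under the MAP and REDUCE headings that follow.

# Sketch of worker.py -- simplified, field and class names are assumptions.
import sys
from concurrent import futures

import grpc
import worker_pb2
import worker_pb2_grpc


class Worker(worker_pb2_grpc.WorkerServicer):
    def __init__(self):
        self.driver_port = None

    def setDriverPort(self, request, context):
        # Remember where the driver can be reached.
        self.driver_port = request.port
        return worker_pb2.status(message="driver port set")


def serve(port):
    # Launch the gRPC server on the port passed on the command line.
    server = grpc.server(futures.ThreadPoolExecutor(max_workers=4))
    worker_pb2_grpc.add_WorkerServicer_to_server(Worker(), server)
    server.add_insecure_port(f"127.0.0.1:{port}")
    server.start()
    server.wait_for_termination()


if __name__ == "__main__":
    serve(sys.argv[1])   # e.g. python worker.py 4001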

MAP

Mapper function maps input key/value pairs to a set of intermediate key/value pairs. Maps are the individual tasks that transform input records into intermediate records. The transformed intermediate records do not need to be of the same type as the input records. A given input pair may map to zero or many output pairs. The number of maps is usually driven by the total size of the inputs, that is, the total number of blocks of the input files.

The input text files are opened in read mode. For a given input file, the text is converted to lower case, special characters (anything other than words) are removed, and the document is split into words. Each word is then assigned to a bucket, and the buckets are stored in intermediate files.
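A simplified sketch of this map step is shown below; the intermediate-file naming (temp/mr-<map_id>-<bucket>) and the bucketing rule are assumptions.

import re
from collections import defaultdict

def do_map(filename, num_buckets, map_id, temp_dir="temp"):
    # Read the whole input file and normalise it to lower case.
    with open(filename, "r") as f:
        text = f.read().lower()

    # Keep only words: anything that is not a run of letters is dropped.
    words = re.findall(r"[a-z]+", text)

    # Assign each word to a bucket. A deterministic rule (here, the first
    # letter) guarantees the same word always reaches the same reduce task,
    # regardless of which worker mapped it.
    buckets = defaultdict(list)
    for word in words:
        buckets[ord(word[0]) % num_buckets].append(word)

    # Write one intermediate file per bucket, e.g. temp/mr-<map_id>-<bucket>.
    for bucket_id, bucket_words in buckets.items():
        with open(f"{temp_dir}/mr-{map_id}-{bucket_id}", "w") as out:
            out.write("\n".join(bucket_words))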

REDUCE

The reducer reduces a set of intermediate values which share a key to a smaller set of values. The number of reduce operations is defined in client.py. In the reduce function, the glob library is used to collect all intermediate files that share the same bucket id. Then the Counter class (imported from the collections library) is used to generate a dictionary with the frequency of each word.
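A matching sketch of the reduce step, under the same assumed intermediate-file naming:

import glob
from collections import Counter

def do_reduce(bucket_id, temp_dir="temp", out_dir="out"):
    # Collect every intermediate file produced for this bucket id,
    # from all map tasks, and count the words.
    counts = Counter()
    for path in glob.glob(f"{temp_dir}/mr-*-{bucket_id}"):
        with open(path, "r") as f:
            counts.update(f.read().split())

    # Write the aggregated word frequencies for this bucket.
    with open(f"{out_dir}/out-{bucket_id}", "w") as out:
        for word, count in counts.items():
            out.write(f"{word} {count}\n")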

driver.py

The worker.py file, the files generated from the proto files and the required Python libraries are imported. The Driver class has several functions. In launchDriver, the input files are loaded, the worker ports are saved and connections are established with all the workers. For each worker, a map operation is sent along with its parameters; this loop continues until all the input files are mapped. The reduce operation is then performed in the same way as the map operation. Finally, the driver sends a task-completed update to the client.
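The dispatch loop described above might be sketched as below. For brevity the sketch hands files out round-robin instead of tracking idle workers, and the stub and message names mirror the proto sketches earlier in this README (they are assumptions).

# Sketch of the driver's map dispatch loop (simplified, single-threaded).
import grpc
import worker_pb2
import worker_pb2_grpc

def run_map_phase(files, worker_ports, num_buckets):
    # Open one channel and stub per worker.
    stubs = []
    for port in worker_ports:
        channel = grpc.insecure_channel(f"127.0.0.1:{port}")
        stubs.append(worker_pb2_grpc.WorkerStub(channel))

    # Hand one input file at a time to the next worker until all are mapped.
    for i, filename in enumerate(files):
        stub = stubs[i % len(stubs)]
        request = worker_pb2.mapInput(filename=filename, numBuckets=num_buckets)
        stub.map(request, timeout=10)   # 10 s timeout, as described below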

client.py

The built-in Python libraries are imported along with the files generated from the proto file. A connection channel is established with the driver. The input directory, the number of reduce operations and the worker ports are initialized when the client is launched. Once the word count operation is performed, the client exits with a message.
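A minimal sketch of client.py, assuming the driver listens on port 4000 and reusing the hypothetical launchData fields from the driver.proto sketch above; the argument layout follows the usage example in the Running the Code section.

# Sketch of client.py -- message fields and driver address are assumptions.
import sys

import grpc
import driver_pb2
import driver_pb2_grpc

if __name__ == "__main__":
    input_path = sys.argv[1]        # e.g. ./inputs
    num_reduce = int(sys.argv[2])   # e.g. 6 output files
    worker_ports = sys.argv[3:]     # e.g. 4001 4002 4003

    channel = grpc.insecure_channel("127.0.0.1:4000")
    stub = driver_pb2_grpc.DriverStub(channel)

    request = driver_pb2.launchData(
        inputPath=input_path,
        numReduce=num_reduce,
        workerPorts=worker_ports,
    )
    reply = stub.launchDriver(request)   # blocks until the driver reports completion
    print(reply.message)                 # e.g. "task completed"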

All the connections communicate through gRPC and every connection is timed. If a worker or driver connection doesn't respond within 10 seconds, the connection times out. The driver distributes the work based on each worker's state and assigns jobs to idle workers. If no workers are available, the driver waits and tries again. Once it receives the worker outputs, it aggregates the results.

The input text files are stored in the inputs folder. The intermediate files are stored in the temp folder and the outputs in the out folder. The video below shows how the files are created in temp and out during the execution of the code, running in Visual Studio in an Anaconda environment.

Video

Running the Code

You can run the code with any number of workers and output files. Here, I am running with 3 workers and 6 output files.

  1. Launch 3 workers by running the following command in the terminal. Provide the port number along with the command.
$ python worker.py 4001

Open another terminal and run the following command:

$ python worker.py 4002

Each worker needs to be declared in a separate terminal. Open a new terminal and run the following command:

$ python worker.py 4003
  2. Launch the driver in a new terminal using the following command:
$ python driver.py 4000
  3. Finally, launch the client in a new terminal using the following command:
$ python client.py ./inputs 6 4001 4002 4003

Additional Information

  1. Bloom RPC can be used to visualize the gRPC server-client communication.
  2. Stragglers can be handled by various methods such as replication.

References

GRPC Tutorial

Map Reduce

Map Reduce Tutorial using Python

Bloom RPC

Contributors

License

MIT LICENSE
