Python HDFS client

Overview

Because the world needs yet another way to talk to HDFS from Python.

Usage

This library provides a Python client for WebHDFS. NameNode HA is supported by passing in both NameNodes. Responses are returned as nice Python classes, and any failed operation will raise some subclass of HdfsException matching the Java exception.

Example usage:

>>> fs = pyhdfs.HdfsClient(hosts='nn1.example.com:50070,nn2.example.com:50070', user_name='someone')
>>> fs.list_status('/')
[FileStatus(pathSuffix='benchmarks', permission='777', type='DIRECTORY', ...), FileStatus(...), ...]
>>> fs.listdir('/')
['benchmarks', 'hbase', 'solr', 'tmp', 'user', 'var']
>>> fs.mkdirs('/fruit/x/y')
True
>>> fs.create('/fruit/apple', 'delicious')
>>> fs.append('/fruit/apple', ' food')
>>> with contextlib.closing(fs.open('/fruit/apple')) as f:
...     f.read()
...
b'delicious food'
>>> fs.get_file_status('/fruit/apple')
FileStatus(length=14, owner='someone', type='FILE', ...)
>>> fs.get_file_status('/fruit/apple').owner
'someone'
>>> fs.get_content_summary('/fruit')
ContentSummary(directoryCount=3, fileCount=1, length=14, quota=-1, spaceConsumed=14, spaceQuota=-1)
>>> list(fs.walk('/fruit'))
[('/fruit', ['x'], ['apple']), ('/fruit/x', ['y'], []), ('/fruit/x/y', [], [])]
>>> fs.exists('/fruit/apple')
True
>>> fs.delete('/fruit')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File ".../pyhdfs.py", line 525, in delete
  ...
pyhdfs.HdfsPathIsNotEmptyDirectoryException: `/fruit is non empty': Directory is not empty
>>> fs.delete('/fruit', recursive=True)
True
>>> fs.exists('/fruit/apple')
False
>>> issubclass(pyhdfs.HdfsFileNotFoundException, pyhdfs.HdfsIOException)
True
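
Since any failure raises a subclass of HdfsException, error handling can target whichever level of the hierarchy you need. A minimal sketch (the path is hypothetical):

# HdfsFileNotFoundException subclasses HdfsIOException, which subclasses
# HdfsException, so the more specific clause must come first.
try:
    fs.get_file_status('/no/such/path')
except pyhdfs.HdfsFileNotFoundException:
    print('path does not exist')
except pyhdfs.HdfsException as e:
    print('some other HDFS error:', e)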

The methods and return values generally map directly to WebHDFS endpoints. The client also provides convenience methods that mimic Python os methods and HDFS CLI commands (e.g. walk and copy_to_local).
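
For instance, a quick sketch of the copy helpers (file names are illustrative):

# copy_from_local mirrors `hdfs dfs -copyFromLocal`; extra keyword arguments
# such as overwrite are forwarded to create. copy_to_local is the reverse.
fs.copy_from_local('report.csv', '/tmp/report.csv', overwrite=True)
fs.copy_to_local('/tmp/report.csv', 'report_copy.csv')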

pyhdfs logs all HDFS actions at the INFO level, so turning on INFO level logging will give you a debug record for your application.
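
For example:

import logging

# pyhdfs logs each HDFS action at INFO; a root handler at INFO level
# is enough to surface them.
logging.basicConfig(level=logging.INFO)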

For more information, see the full API docs.

Installing

pip install pyhdfs

Python 3 is required.

Development testing

First run install-hdfs.sh x.y.z, which will download, extract, and run the HDFS NN/DN processes in the current directory. (Replace x.y.z with a real version.) Then run the following commands. Note they will create and delete hdfs://localhost/tmp/pyhdfs_test.

Commands:

python3 -m venv env
source env/bin/activate
pip install -e .
pip install -r dev_requirements.txt
pytest

Comments

  • client should return some info when successfully creating a file

    For example, the HDFS server may return a response with headers like this:

    HTTP/1.1 201 Created
    Location: webhdfs://<HOST>:<PORT>/<PATH>
    Content-Length: 0
    

    I want to get the Location from the response headers; however, client.create does not return anything.
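
    Until create exposes this, the raw WebHDFS handshake can be driven with requests directly. A sketch, not pyhdfs API (host, user, and path are placeholders):

    import requests

    url = ('http://nn1.example.com:50070/webhdfs/v1/fruit/apple'
           '?op=CREATE&user.name=someone&overwrite=true')
    # Step 1: the NameNode answers 307 with a Location header naming a
    # DataNode; disable auto-redirects so the header can be read.
    r1 = requests.put(url, allow_redirects=False)
    datanode_url = r1.headers['Location']
    # Step 2: PUT the bytes to the DataNode; the 201 Created response
    # carries the final webhdfs:// Location this issue asks for.
    r2 = requests.put(datanode_url, data=b'delicious')
    print(r2.status_code, r2.headers.get('Location'))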

    opened by cosven 7
  • Write error

    Hello. Mkdir and listdir work fine, but create didn't:

    fs.create('/fruit/apple', 'delicious')
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/root/miniconda2/lib/python2.7/site-packages/pyhdfs.py", line 426, in create
        metadata_response.headers['location'], data=data, **self._requests_kwargs)
      File "/root/miniconda2/lib/python2.7/site-packages/requests/api.py", line 126, in put
        return request('put', url, data=data, **kwargs)
      File "/root/miniconda2/lib/python2.7/site-packages/requests/api.py", line 58, in request
        return session.request(method=method, url=url, **kwargs)
      File "/root/miniconda2/lib/python2.7/site-packages/requests/sessions.py", line 512, in request
        resp = self.send(prep, **send_kwargs)
      File "/root/miniconda2/lib/python2.7/site-packages/requests/sessions.py", line 622, in send
        r = adapter.send(request, **kwargs)
      File "/root/miniconda2/lib/python2.7/site-packages/requests/adapters.py", line 513, in send
        raise ConnectionError(e, request=request)
    requests.exceptions.ConnectionError: HTTPConnectionPool(host='1566bb80c4dc', port=50075): Max retries exceeded with url: /webhdfs/v1/fruit/apple?op=CREATE&user.name=hdfs&namenoderpcaddress=localhost:8020&overwrite=false (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f644f364510>: Failed to establish a new connection: [Errno -2] Name or service not known',))
    
    opened by albertoRamon 4
  • requests.exceptions.ConnectionError: ('Connection aborted.', ConnectionResetError(10054, 'An existing connection was forcibly closed by the remote host.', None, 10054, None))

    Traceback (most recent call last):
      File "D:\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 601, in urlopen
        chunked=chunked)
      File "D:\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 357, in _make_request
        conn.request(method, url, **httplib_request_kw)
      File "D:\Anaconda3\lib\http\client.py", line 1239, in request
        self._send_request(method, url, body, headers, encode_chunked)
      File "D:\Anaconda3\lib\http\client.py", line 1285, in _send_request
        self.endheaders(body, encode_chunked=encode_chunked)
      File "D:\Anaconda3\lib\http\client.py", line 1234, in endheaders
        self._send_output(message_body, encode_chunked=encode_chunked)
      File "D:\Anaconda3\lib\http\client.py", line 1065, in _send_output
        self.send(chunk)
      File "D:\Anaconda3\lib\http\client.py", line 986, in send
        self.sock.sendall(data)
    ConnectionResetError: [WinError 10054] An existing connection was forcibly closed by the remote host.

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
      File "D:\Anaconda3\lib\site-packages\requests\adapters.py", line 440, in send
        timeout=timeout
      File "D:\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 639, in urlopen
        _stacktrace=sys.exc_info()[2])
      File "D:\Anaconda3\lib\site-packages\urllib3\util\retry.py", line 357, in increment
        raise six.reraise(type(error), error, _stacktrace)
      File "D:\Anaconda3\lib\site-packages\urllib3\packages\six.py", line 685, in reraise
        raise value.with_traceback(tb)
      File "D:\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 601, in urlopen
        chunked=chunked)
      File "D:\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 357, in _make_request
        conn.request(method, url, **httplib_request_kw)
      File "D:\Anaconda3\lib\http\client.py", line 1239, in request
        self._send_request(method, url, body, headers, encode_chunked)
      File "D:\Anaconda3\lib\http\client.py", line 1285, in _send_request
        self.endheaders(body, encode_chunked=encode_chunked)
      File "D:\Anaconda3\lib\http\client.py", line 1234, in endheaders
        self._send_output(message_body, encode_chunked=encode_chunked)
      File "D:\Anaconda3\lib\http\client.py", line 1065, in _send_output
        self.send(chunk)
      File "D:\Anaconda3\lib\http\client.py", line 986, in send
        self.sock.sendall(data)
    urllib3.exceptions.ProtocolError: ('Connection aborted.', ConnectionResetError(10054, 'An existing connection was forcibly closed by the remote host.', None, 10054, None))

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
      File "D:\workspace\phdfs\check_wrf.py", line 144, in <module>
        fs.copy_from_local(parname, "/test/fcst/china/10d_arwpost_sta/near/" + wrflisttime.format("YYYYMMDD") + "/" + parname, overwrite=True)
      File "D:\Anaconda3\lib\site-packages\pyhdfs.py", line 753, in copy_from_local
        self.create(dest, f, **kwargs)
      File "D:\Anaconda3\lib\site-packages\pyhdfs.py", line 426, in create
        metadata_response.headers['location'], data=data, **self._requests_kwargs)
      File "D:\Anaconda3\lib\site-packages\requests\api.py", line 126, in put
        return request('put', url, data=data, **kwargs)
      File "D:\Anaconda3\lib\site-packages\requests\api.py", line 58, in request
        return session.request(method=method, url=url, **kwargs)
      File "D:\Anaconda3\lib\site-packages\requests\sessions.py", line 508, in request
        resp = self.send(prep, **send_kwargs)
      File "D:\Anaconda3\lib\site-packages\requests\sessions.py", line 618, in send
        r = adapter.send(request, **kwargs)
      File "D:\Anaconda3\lib\site-packages\requests\adapters.py", line 490, in send
        raise ConnectionError(err, request=request)
    requests.exceptions.ConnectionError: ('Connection aborted.', ConnectionResetError(10054, 'An existing connection was forcibly closed by the remote host.', None, 10054, None))

    opened by Georege 4
  • BUG: Chinese characters can't be copied to HDFS

    UnicodeEncodeError: 'latin-1' codec can't encode characters in position 2-3: Body ('张三') is not valid Latin-1. Use body.encode('utf-8') if you want to send it encoded in UTF-8.
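
    The workaround the message itself suggests seems to apply: requests encodes str bodies as Latin-1, so pass UTF-8 bytes instead. A minimal sketch (the path is hypothetical):

    # Passing bytes means requests never tries to Latin-1-encode the body.
    fs.create('/tmp/names.txt', '张三'.encode('utf-8'))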

    opened by yiershanxll 3
  • Help me, please. The second run of the function in the script results in an abnormal result

    I am a rookie~~!!

    The following code:

    list_info = [{"tenant": "coco", "hive_path": "/user/open_001_dev", "ftp_path": "/files/prov/001"},
                 {"tenant": "lili", "hive_path": "/user/open_002_dev", "ftp_path": "/files/prov/002"}]
    result = 0
    # randomize_hosts expects a bool; the string "false" is truthy.
    client = pyhdfs.HdfsClient(hosts="10.173.5.18:9000", user_name="hdfs", timeout=10,
                               max_tries=3, randomize_hosts=False)

    def hive_content_size():
        global result
        for item in range(2):
            if "hive_path" in list_info[item]:
                print(client.get_content_summary(list_info[item]["hive_path"]))

    hive_content_size()
    

    The result of the first loop is output normally, but the output of the second loop is abnormal.

    Below is the error report:

    ContentSummary(directoryCount=1258, fileCount=3773, length=141829751002, quota=4000000, spaceConsumed=425489253006, spaceQuota=659706976665600)
    
    Failed to reach to 10.173.5.18:9000 (attempt 3/3)
    Traceback (most recent call last):
      File "/usr/local/python/lib/python3.9/site-packages/urllib3-1.26.4-py3.9.egg/urllib3/connectionpool.py", line 445, in _make_request
        six.raise_from(e, None)
      File "<string>", line 3, in raise_from
      File "/usr/local/python/lib/python3.9/site-packages/urllib3-1.26.4-py3.9.egg/urllib3/connectionpool.py", line 440, in _make_request
        httplib_response = conn.getresponse()
      File "/usr/local/python/lib/python3.9/http/client.py", line 1347, in getresponse
        response.begin()
      File "/usr/local/python/lib/python3.9/http/client.py", line 307, in begin
        version, status, reason = self._read_status()
      File "/usr/local/python/lib/python3.9/http/client.py", line 268, in _read_status
        line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
      File "/usr/local/python/lib/python3.9/socket.py", line 704, in readinto
        return self._sock.recv_into(b)
    socket.timeout: timed out
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/usr/local/python/lib/python3.9/site-packages/requests-2.25.1-py3.9.egg/requests/adapters.py", line 439, in send
        resp = conn.urlopen(
      File "/usr/local/python/lib/python3.9/site-packages/urllib3-1.26.4-py3.9.egg/urllib3/connectionpool.py", line 755, in urlopen
        retries = retries.increment(
      File "/usr/local/python/lib/python3.9/site-packages/urllib3-1.26.4-py3.9.egg/urllib3/util/retry.py", line 532, in increment
        raise six.reraise(type(error), error, _stacktrace)
      File "/usr/local/python/lib/python3.9/site-packages/urllib3-1.26.4-py3.9.egg/urllib3/packages/six.py", line 735, in reraise
        raise value
      File "/usr/local/python/lib/python3.9/site-packages/urllib3-1.26.4-py3.9.egg/urllib3/connectionpool.py", line 699, in urlopen
        httplib_response = self._make_request(
      File "/usr/local/python/lib/python3.9/site-packages/urllib3-1.26.4-py3.9.egg/urllib3/connectionpool.py", line 447, in _make_request
        self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
      File "/usr/local/python/lib/python3.9/site-packages/urllib3-1.26.4-py3.9.egg/urllib3/connectionpool.py", line 336, in _raise_timeout
        raise ReadTimeoutError(
    urllib3.exceptions.ReadTimeoutError: HTTPConnectionPool(host='10.173.5.18', port=9000): Read timed out. (read timeout=10)
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/usr/local/python/lib/python3.9/site-packages/PyHDFS-0.3.1-py3.9.egg/pyhdfs/__init__.py", line 418, in _request
        response = self._requests_session.request(
      File "/usr/local/python/lib/python3.9/site-packages/requests-2.25.1-py3.9.egg/requests/sessions.py", line 542, in request
        resp = self.send(prep, **send_kwargs)
      File "/usr/local/python/lib/python3.9/site-packages/requests-2.25.1-py3.9.egg/requests/sessions.py", line 655, in send
        r = adapter.send(request, **kwargs)
      File "/usr/local/python/lib/python3.9/site-packages/requests-2.25.1-py3.9.egg/requests/adapters.py", line 529, in send
        raise ReadTimeout(e, request=request)
    requests.exceptions.ReadTimeout: HTTPConnectionPool(host='10.162.3.171', port=19888): Read timed out. (read timeout=10)
    Traceback (most recent call last):
      File "/home/hadoop/shay/monthly_report/test01.py", line 24, in <module>
        print(hive_content_size())
      File "/home/hadoop/shay/monthly_report/test01.py", line 22, in hive_content_size
        print(client.get_content_summary(list_info[item]["hive_path"]))
      File "/usr/local/python/lib/python3.9/site-packages/PyHDFS-0.3.1-py3.9.egg/pyhdfs/__init__.py", line 633, in get_content_summary
      File "/usr/local/python/lib/python3.9/site-packages/PyHDFS-0.3.1-py3.9.egg/pyhdfs/__init__.py", line 450, in _get
      File "/usr/local/python/lib/python3.9/site-packages/PyHDFS-0.3.1-py3.9.egg/pyhdfs/__init__.py", line 442, in _request
    pyhdfs.HdfsNoServerException: Could not use any of the given hosts
    

    ask for help~~!!!

    opened by qwe55982 2
  • HdfsFileAlreadyExistsException is not implemented?

    Hi! Thanks for your great work. I have noticed that some exceptions are not implemented right now?

    For example: if I try to upload a file to the same path, Python raises ConnectionError instead of HdfsFileAlreadyExistsException.

    The error message is as follows:

    Traceback (most recent call last):
      File "test_pyhdfs.py", line 12, in <module>
        fs.create('/xxx/xxx/images/test.png', data=file)
      File "/home/chiuhongyu/workplace/xxx/venv/lib/python3.6/site-packages/pyhdfs/__init__.py", line 504, in create
        metadata_response.headers['location'], data=data, **self._requests_kwargs)
      File "/home/chiuhongyu/workplace/xxx/venv/lib/python3.6/site-packages/requests/api.py", line 132, in put
        return request('put', url, data=data, **kwargs)
      File "/home/chiuhongyu/workplace/xxx/venv/lib/python3.6/site-packages/requests/api.py", line 61, in request
        return session.request(method=method, url=url, **kwargs)
      File "/home/chiuhongyu/workplace/xxx/venv/lib/python3.6/site-packages/requests/sessions.py", line 542, in request
        resp = self.send(prep, **send_kwargs)
      File "/home/chiuhongyu/workplace/xxx/venv/lib/python3.6/site-packages/requests/sessions.py", line 655, in send
        r = adapter.send(request, **kwargs)
      File "/home/chiuhongyu/workplace/xxx/venv/lib/python3.6/site-packages/requests/adapters.py", line 498, in send
        raise ConnectionError(err, request=request)
    requests.exceptions.ConnectionError: ('Connection aborted.', ConnectionResetError(104, 'Connection reset by peer'))
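
    If the goal is to replace the existing file: the WebHDFS CREATE operation takes an overwrite parameter, and pyhdfs forwards extra keyword arguments to it, so a sketch along these lines (reusing the path and file object from the report) may avoid the failure:

    # overwrite=True makes CREATE replace the existing path instead of failing.
    fs.create('/xxx/xxx/images/test.png', data=file, overwrite=True)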
    
    opened by james77777778 1
  • Support customized WEBHDFS_PATH

    In the latest version of pyhdfs, the WebHDFS prefix is set as the constant '/webhdfs/v1'. This works well in most scenarios, but users may need a customized HTTP URL; for example, they may run their own webhdfs service using Pylon and access their RESTful server with a customized URL pattern like http://<HOST>:<HTTP_PORT>/webhdfs/api/v2/<PATH>?op=...

    opened by SparkSnail 1
  • TypeError: __new__() got an unexpected keyword argument 'storagePolicy'

    I am using Hadoop 2.6 (with Docker: sudo docker run -i -t sequenceiq/hadoop-docker:2.6.0 /etc/bootstrap.sh -bash).

    When I use PyHDFS to call client.list_status, I get this error:

    Traceback (most recent call last):
      File "testhdfs.py", line 3, in <module>
        print(client.list_status('/'))
      File "...testenv/lib/python3.4/site-packages/pyhdfs.py", line 428, in list_status
        _json(self._get(path, 'LISTSTATUS', **kwargs))['FileStatuses']['FileStatus']
      File "...testenv/lib/python3.4/site-packages/pyhdfs.py", line 427, in <listcomp>
        FileStatus(**item) for item in
    TypeError: __new__() got an unexpected keyword argument 'storagePolicy'
    

    The code:

    from pyhdfs import HdfsClient
    client = HdfsClient(hosts='172.17.0.2:50070')
    print(client.list_status('/'))
    

    This issue is caused by the JSON from the server having an extra property, storagePolicy; adding it to pyhdfs.py fixes this. But I want to know whether this property is a standard property of HDFS/WebHDFS.

    bug 
    opened by robberphex 1
  • Why assert that the response is not empty?

    In pyhdfs.py, line 424

    assert not metadata_response.content
    

    In my client, I get some response when upload files.

    b'<html>\r\n<head><title>307 Temporary Redirect</title></head>\r\n<body bgcolor="white">\r\n<center><h1>307 Temporary Redirect</h1></center>\r\n<hr><center>nginx/1.13.8</center>\r\n</body>\r\n</html>\r\n'
    

    This response does not mean the upload failed, and I can successfully upload my files when I delete this line. Why was this line added? Could you please help me figure out this problem?

    opened by SparkSnail 0
  • Support setting webhdfs_path

    In the latest version of pyhdfs, the WebHDFS prefix is set as the constant '/webhdfs/v1'. This works well in most scenarios, but users may need a customized HTTP URL; for example, they may run their own webhdfs service using Pylon and access their RESTful server with a customized URL pattern like http://<HOST>:<HTTP_PORT>/webhdfs/api/v2/<PATH>?op=...

    opened by SparkSnail 0
  • Let pyhdfs visit HDFS in a Kerberos environment

    When HDFS requires Kerberos authentication, pyhdfs.py cannot access HDFS, so authentication information should be added to pyhdfs.py. In fact, Python calls the requests module when it accesses HDFS, so the authentication information could be added there.
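
    One way to experiment with this, sketched under two assumptions: that this pyhdfs version exposes the requests_session constructor argument, and that the third-party requests-kerberos package is installed with a valid ticket (kinit) available:

    import requests
    from requests_kerberos import HTTPKerberosAuth, OPTIONAL

    # Hand pyhdfs a session whose auth performs SPNEGO/Kerberos negotiation.
    session = requests.Session()
    session.auth = HTTPKerberosAuth(mutual_authentication=OPTIONAL)
    fs = pyhdfs.HdfsClient(hosts='nn1.example.com:50070', requests_session=session)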

    opened by LuckyNemo 0
  • got TypeError while appending to a file

      File "/usr/local/lib/python3.6/site-packages/pyhdfs/__init__.py", line 520, in append
        path, 'APPEND', expected_status=HTTPStatus.TEMPORARY_REDIRECT, **kwargs)
      File "/usr/local/lib/python3.6/site-packages/pyhdfs/__init__.py", line 466, in _post
        return self._request('post', path, op, expected_status, **kwargs)
      File "/usr/local/lib/python3.6/site-packages/pyhdfs/__init__.py", line 431, in _request
        _check_response(response, expected_status)
      File "/usr/local/lib/python3.6/site-packages/pyhdfs/__init__.py", line 933, in _check_response
        remote_exception['message'] = exception_name + ' - ' + remote_exception['message']
    TypeError: must be str, not NoneType

    opened by BingoZ 0
  • can't parse JSON with unprintable characters

    If a weird non-utf file name is created in HDFS, then the client fails when it can't interpret the response as a valid JSON string.

    e.g. it's possible to put a ctrl-r in the file name

    bug 
    opened by jingw 0

Releases: v0.3.1