Python HDFS client

Overview

Because the world needs yet another way to talk to HDFS from Python.

Usage

This library provides a Python client for WebHDFS. NameNode HA is supported by passing in both NameNodes. Responses are returned as nice Python classes, and any failed operation will raise some subclass of HdfsException matching the Java exception.

Example usage:

>>> import contextlib, pyhdfs
>>> fs = pyhdfs.HdfsClient(hosts='nn1.example.com:50070,nn2.example.com:50070', user_name='someone')
>>> fs.list_status('/')
[FileStatus(pathSuffix='benchmarks', permission='777', type='DIRECTORY', ...), FileStatus(...), ...]
>>> fs.listdir('/')
['benchmarks', 'hbase', 'solr', 'tmp', 'user', 'var']
>>> fs.mkdirs('/fruit/x/y')
True
>>> fs.create('/fruit/apple', 'delicious')
>>> fs.append('/fruit/apple', ' food')
>>> with contextlib.closing(fs.open('/fruit/apple')) as f:
...     f.read()
...
b'delicious food'
>>> fs.get_file_status('/fruit/apple')
FileStatus(length=14, owner='someone', type='FILE', ...)
>>> fs.get_file_status('/fruit/apple').owner
'someone'
>>> fs.get_content_summary('/fruit')
ContentSummary(directoryCount=3, fileCount=1, length=14, quota=-1, spaceConsumed=14, spaceQuota=-1)
>>> list(fs.walk('/fruit'))
[('/fruit', ['x'], ['apple']), ('/fruit/x', ['y'], []), ('/fruit/x/y', [], [])]
>>> fs.exists('/fruit/apple')
True
>>> fs.delete('/fruit')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File ".../pyhdfs.py", line 525, in delete
  ...
pyhdfs.HdfsPathIsNotEmptyDirectoryException: `/fruit is non empty': Directory is not empty
>>> fs.delete('/fruit', recursive=True)
True
>>> fs.exists('/fruit/apple')
False
>>> issubclass(pyhdfs.HdfsFileNotFoundException, pyhdfs.HdfsIOException)
True

The methods and return values generally map directly to WebHDFS endpoints. The client also provides convenience methods that mimic Python os methods and HDFS CLI commands (e.g. walk and copy_to_local).
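
For example, a few of these convenience helpers together (a hedged sketch; the hosts and paths are placeholders):

import pyhdfs

fs = pyhdfs.HdfsClient(hosts='nn1.example.com:50070', user_name='someone')

# Upload a local file, then fetch it back (mirrors hdfs dfs -put / -get).
fs.copy_from_local('report.csv', '/tmp/report.csv', overwrite=True)
fs.copy_to_local('/tmp/report.csv', 'report_copy.csv')

# Recursively visit directories and files, like os.walk.
for dirpath, dirnames, filenames in fs.walk('/tmp'):
    print(dirpath, filenames)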

pyhdfs logs all HDFS actions at the INFO level, so turning on INFO level logging will give you a debug record for your application.
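
A minimal way to turn that on, assuming the library's logger follows the module name:

import logging

# Emit INFO-level records (including pyhdfs's per-action logs) to stderr.
logging.basicConfig(level=logging.INFO)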

For more information, see the full API docs.

Installing

pip install pyhdfs

Python 3 is required.

Development testing

First run install-hdfs.sh x.y.z, which will download, extract, and run the HDFS NN/DN processes in the current directory. (Replace x.y.z with a real version.) Then run the following commands. Note they will create and delete hdfs://localhost/tmp/pyhdfs_test.

Commands:

python3 -m venv env
source env/bin/activate
pip install -e .
pip install -r dev_requirements.txt
pytest
Comments
  • client should return some info when successfully creating a file

    For example, the HDFS server may return a response with headers like this:

    HTTP/1.1 201 Created
    Location: webhdfs://<HOST>:<PORT>/<PATH>
    Content-Length: 0

    I want to get the location from the response headers; however, client.create does not return anything. (A workaround sketch follows below.)

    opened by cosven 7
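
    As a hedged workaround sketch, the two-step WebHDFS CREATE can be issued directly with requests so that the final 201 response (and its Location header) is visible. The host, path, and user below are placeholders, not values taken from pyhdfs.

    import requests

    # Hypothetical NameNode address, target path, and user for illustration.
    host, path, user = 'nn1.example.com:50070', '/tmp/demo.txt', 'someone'

    # Step 1: ask the NameNode where to write; it replies 307 with a DataNode URL.
    r1 = requests.put(
        f'http://{host}/webhdfs/v1{path}',
        params={'op': 'CREATE', 'user.name': user, 'overwrite': 'true'},
        allow_redirects=False,
    )
    datanode_url = r1.headers['Location']

    # Step 2: send the data to the DataNode; its 201 reply carries the Location header.
    r2 = requests.put(datanode_url, data=b'delicious')
    print(r2.status_code, r2.headers.get('Location'))
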
  • Write error

    Hello. mkdirs and listdir work fine, but create doesn't:

    fs.create('/fruit/apple', 'delicious')
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/root/miniconda2/lib/python2.7/site-packages/pyhdfs.py", line 426, in create
        metadata_response.headers['location'], data=data, **self._requests_kwargs)
      File "/root/miniconda2/lib/python2.7/site-packages/requests/api.py", line 126, in put
        return request('put', url, data=data, **kwargs)
      File "/root/miniconda2/lib/python2.7/site-packages/requests/api.py", line 58, in request
        return session.request(method=method, url=url, **kwargs)
      File "/root/miniconda2/lib/python2.7/site-packages/requests/sessions.py", line 512, in request
        resp = self.send(prep, **send_kwargs)
      File "/root/miniconda2/lib/python2.7/site-packages/requests/sessions.py", line 622, in send
        r = adapter.send(request, **kwargs)
      File "/root/miniconda2/lib/python2.7/site-packages/requests/adapters.py", line 513, in send
        raise ConnectionError(e, request=request)
    requests.exceptions.ConnectionError: HTTPConnectionPool(host='1566bb80c4dc', port=50075): Max retries exceeded with url: /webhdfs/v1/fruit/apple?op=CREATE&user.name=hdfs&namenoderpcaddress=localhost:8020&overwrite=false (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f644f364510>: Failed to establish a new connection: [Errno -2] Name or service not known',))
    
    opened by albertoRamon 4
  • requests.exceptions.ConnectionError: ('Connection aborted.', ConnectionResetError(10054, 'An existing connection was forcibly closed by the remote host.', None, 10054, None))

    Traceback (most recent call last):
      File "D:\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 601, in urlopen
        chunked=chunked)
      File "D:\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 357, in _make_request
        conn.request(method, url, **httplib_request_kw)
      File "D:\Anaconda3\lib\http\client.py", line 1239, in request
        self._send_request(method, url, body, headers, encode_chunked)
      File "D:\Anaconda3\lib\http\client.py", line 1285, in _send_request
        self.endheaders(body, encode_chunked=encode_chunked)
      File "D:\Anaconda3\lib\http\client.py", line 1234, in endheaders
        self._send_output(message_body, encode_chunked=encode_chunked)
      File "D:\Anaconda3\lib\http\client.py", line 1065, in _send_output
        self.send(chunk)
      File "D:\Anaconda3\lib\http\client.py", line 986, in send
        self.sock.sendall(data)
    ConnectionResetError: [WinError 10054] An existing connection was forcibly closed by the remote host.

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
      File "D:\Anaconda3\lib\site-packages\requests\adapters.py", line 440, in send
        timeout=timeout
      File "D:\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 639, in urlopen
        _stacktrace=sys.exc_info()[2])
      File "D:\Anaconda3\lib\site-packages\urllib3\util\retry.py", line 357, in increment
        raise six.reraise(type(error), error, _stacktrace)
      File "D:\Anaconda3\lib\site-packages\urllib3\packages\six.py", line 685, in reraise
        raise value.with_traceback(tb)
      File "D:\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 601, in urlopen
        chunked=chunked)
      File "D:\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 357, in _make_request
        conn.request(method, url, **httplib_request_kw)
      File "D:\Anaconda3\lib\http\client.py", line 1239, in request
        self._send_request(method, url, body, headers, encode_chunked)
      File "D:\Anaconda3\lib\http\client.py", line 1285, in _send_request
        self.endheaders(body, encode_chunked=encode_chunked)
      File "D:\Anaconda3\lib\http\client.py", line 1234, in endheaders
        self._send_output(message_body, encode_chunked=encode_chunked)
      File "D:\Anaconda3\lib\http\client.py", line 1065, in _send_output
        self.send(chunk)
      File "D:\Anaconda3\lib\http\client.py", line 986, in send
        self.sock.sendall(data)
    urllib3.exceptions.ProtocolError: ('Connection aborted.', ConnectionResetError(10054, 'An existing connection was forcibly closed by the remote host.', None, 10054, None))

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
      File "D:\workspace\phdfs\check_wrf.py", line 144, in <module>
        fs.copy_from_local(parname, "/test/fcst/china/10d_arwpost_sta/near/" + wrflisttime.format("YYYYMMDD") + "/" + parname, overwrite = True)
      File "D:\Anaconda3\lib\site-packages\pyhdfs.py", line 753, in copy_from_local
        self.create(dest, f, **kwargs)
      File "D:\Anaconda3\lib\site-packages\pyhdfs.py", line 426, in create
        metadata_response.headers['location'], data=data, **self._requests_kwargs)
      File "D:\Anaconda3\lib\site-packages\requests\api.py", line 126, in put
        return request('put', url, data=data, **kwargs)
      File "D:\Anaconda3\lib\site-packages\requests\api.py", line 58, in request
        return session.request(method=method, url=url, **kwargs)
      File "D:\Anaconda3\lib\site-packages\requests\sessions.py", line 508, in request
        resp = self.send(prep, **send_kwargs)
      File "D:\Anaconda3\lib\site-packages\requests\sessions.py", line 618, in send
        r = adapter.send(request, **kwargs)
      File "D:\Anaconda3\lib\site-packages\requests\adapters.py", line 490, in send
        raise ConnectionError(err, request=request)
    requests.exceptions.ConnectionError: ('Connection aborted.', ConnectionResetError(10054, 'An existing connection was forcibly closed by the remote host.', None, 10054, None))

    opened by Georege 4
  • BUG: Chinese characters can't be copied to HDFS

    UnicodeEncodeError: 'latin-1' codec can't encode characters in position 2-3: Body ('张三') is not valid Latin-1. Use body.encode('utf-8') if you want to send it encoded in UTF-8.

    (A workaround sketch follows below.)

    opened by yiershanxll 3
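
    A minimal workaround sketch, following the error message's own suggestion: requests encodes str bodies as Latin-1, so pass bytes instead. The host and path are placeholders.

    import pyhdfs

    fs = pyhdfs.HdfsClient(hosts='nn1.example.com:50070', user_name='someone')

    # Encode the text yourself so UTF-8 bytes are sent instead of a Latin-1 str.
    fs.create('/tmp/names.txt', '张三'.encode('utf-8'), overwrite=True)
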
  • Help me, please. The second run of the function in the script results in an abnormal result

    I am a rookie~~!!

    The following code:

    list_info = [{"tenant": "coco", "hive_path": "/user/open_001_dev", "ftp_path": "/files/prov/001"},
                 {"tenant": "lili", "hive_path": "/user/open_002_dev", "ftp_path": "/files/prov/002"}]
    result = 0
    client=pyhdfs.HdfsClient(hosts="10.173.5.18:9000",user_name="hdfs",timeout=10,max_tries=3,randomize_hosts="false")
    def hive_content_size():
        global result
        for item in range(2):
            if "hive_path" in list_info[item]:
                print(client.get_content_summary(list_info[item]["hive_path"]))
    
    hive_content_size()
    

    The result of the first iteration is output normally, but the output of the second iteration is abnormal.

    Below is the error report:

    ContentSummary(directoryCount=1258, fileCount=3773, length=141829751002, quota=4000000, spaceConsumed=425489253006, spaceQuota=659706976665600)
    
    Failed to reach to 10.173.5.18:9000 (attempt 3/3)
    Traceback (most recent call last):
      File "/usr/local/python/lib/python3.9/site-packages/urllib3-1.26.4-py3.9.egg/urllib3/connectionpool.py", line 445, in _make_request
        six.raise_from(e, None)
      File "<string>", line 3, in raise_from
      File "/usr/local/python/lib/python3.9/site-packages/urllib3-1.26.4-py3.9.egg/urllib3/connectionpool.py", line 440, in _make_request
        httplib_response = conn.getresponse()
      File "/usr/local/python/lib/python3.9/http/client.py", line 1347, in getresponse
        response.begin()
      File "/usr/local/python/lib/python3.9/http/client.py", line 307, in begin
        version, status, reason = self._read_status()
      File "/usr/local/python/lib/python3.9/http/client.py", line 268, in _read_status
        line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
      File "/usr/local/python/lib/python3.9/socket.py", line 704, in readinto
        return self._sock.recv_into(b)
    socket.timeout: timed out
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/usr/local/python/lib/python3.9/site-packages/requests-2.25.1-py3.9.egg/requests/adapters.py", line 439, in send
        resp = conn.urlopen(
      File "/usr/local/python/lib/python3.9/site-packages/urllib3-1.26.4-py3.9.egg/urllib3/connectionpool.py", line 755, in urlopen
        retries = retries.increment(
      File "/usr/local/python/lib/python3.9/site-packages/urllib3-1.26.4-py3.9.egg/urllib3/util/retry.py", line 532, in increment
        raise six.reraise(type(error), error, _stacktrace)
      File "/usr/local/python/lib/python3.9/site-packages/urllib3-1.26.4-py3.9.egg/urllib3/packages/six.py", line 735, in reraise
        raise value
      File "/usr/local/python/lib/python3.9/site-packages/urllib3-1.26.4-py3.9.egg/urllib3/connectionpool.py", line 699, in urlopen
        httplib_response = self._make_request(
      File "/usr/local/python/lib/python3.9/site-packages/urllib3-1.26.4-py3.9.egg/urllib3/connectionpool.py", line 447, in _make_request
        self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
      File "/usr/local/python/lib/python3.9/site-packages/urllib3-1.26.4-py3.9.egg/urllib3/connectionpool.py", line 336, in _raise_timeout
        raise ReadTimeoutError(
    urllib3.exceptions.ReadTimeoutError: HTTPConnectionPool(host='10.173.5.18', port=9000): Read timed out. (read timeout=10)
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/usr/local/python/lib/python3.9/site-packages/PyHDFS-0.3.1-py3.9.egg/pyhdfs/__init__.py", line 418, in _request
        response = self._requests_session.request(
      File "/usr/local/python/lib/python3.9/site-packages/requests-2.25.1-py3.9.egg/requests/sessions.py", line 542, in request
        resp = self.send(prep, **send_kwargs)
      File "/usr/local/python/lib/python3.9/site-packages/requests-2.25.1-py3.9.egg/requests/sessions.py", line 655, in send
        r = adapter.send(request, **kwargs)
      File "/usr/local/python/lib/python3.9/site-packages/requests-2.25.1-py3.9.egg/requests/adapters.py", line 529, in send
        raise ReadTimeout(e, request=request)
    requests.exceptions.ReadTimeout: HTTPConnectionPool(host='10.162.3.171', port=19888): Read timed out. (read timeout=10)
    Traceback (most recent call last):
      File "/home/hadoop/shay/monthly_report/test01.py", line 24, in <module>
        print(hive_content_size())
      File "/home/hadoop/shay/monthly_report/test01.py", line 22, in hive_content_size
        print(client.get_content_summary(list_info[item]["hive_path"]))
      File "/usr/local/python/lib/python3.9/site-packages/PyHDFS-0.3.1-py3.9.egg/pyhdfs/__init__.py", line 633, in get_content_summary
      File "/usr/local/python/lib/python3.9/site-packages/PyHDFS-0.3.1-py3.9.egg/pyhdfs/__init__.py", line 450, in _get
      File "/usr/local/python/lib/python3.9/site-packages/PyHDFS-0.3.1-py3.9.egg/pyhdfs/__init__.py", line 442, in _request
    pyhdfs.HdfsNoServerException: Could not use any of the given hosts
    

    ask for help~~!!!

    opened by qwe55982 2
  • HdfsFileAlreadyExistsException is not implemented?

    Hi! Thanks for your great work. I have noticed that some exceptions are not implemented right now?

    For example: if I try to upload a file to the same path, Python raises ConnectionError instead of HdfsFileAlreadyExistsException.

    The error message is as follows:

    Traceback (most recent call last):
      File "test_pyhdfs.py", line 12, in <module>
        fs.create('/xxx/xxx/images/test.png', data=file)
      File "/home/chiuhongyu/workplace/xxx/venv/lib/python3.6/site-packages/pyhdfs/__init__.py", line 504, in create
        metadata_response.headers['location'], data=data, **self._requests_kwargs)
      File "/home/chiuhongyu/workplace/xxx/venv/lib/python3.6/site-packages/requests/api.py", line 132, in put
        return request('put', url, data=data, **kwargs)
      File "/home/chiuhongyu/workplace/xxx/venv/lib/python3.6/site-packages/requests/api.py", line 61, in request
        return session.request(method=method, url=url, **kwargs)
      File "/home/chiuhongyu/workplace/xxx/venv/lib/python3.6/site-packages/requests/sessions.py", line 542, in request
        resp = self.send(prep, **send_kwargs)
      File "/home/chiuhongyu/workplace/xxx/venv/lib/python3.6/site-packages/requests/sessions.py", line 655, in send
        r = adapter.send(request, **kwargs)
      File "/home/chiuhongyu/workplace/xxx/venv/lib/python3.6/site-packages/requests/adapters.py", line 498, in send
        raise ConnectionError(err, request=request)
    requests.exceptions.ConnectionError: ('Connection aborted.', ConnectionResetError(104, 'Connection reset by peer'))
    
    opened by james77777778 1
  • Support customized WEBHDFS_PATH

    In the latest version of pyhdfs, the WebHDFS path is set as the constant '/webhdfs/v1'. This works well in most scenarios, but users may need a customized HTTP URL: for example, they may run their own WebHDFS service behind Pylon and access their RESTful server through a URL pattern like http://<HOST>:<HTTP_PORT>/webhdfs/api/v2/<PATH>?op=... (A stopgap sketch follows below.)

    opened by SparkSnail 1
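
    If the constant is module-level, as the report describes, one possible stopgap is monkey-patching it before constructing the client. This sketch relies on an internal detail (the WEBHDFS_PATH name comes from the issue title, not a stable API) and may break across versions.

    import pyhdfs

    # Stopgap: override the module-level constant before any request is made.
    pyhdfs.WEBHDFS_PATH = '/webhdfs/api/v2'

    fs = pyhdfs.HdfsClient(hosts='proxy.example.com:8080', user_name='someone')
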
  • TypeError: __new__() got an unexpected keyword argument 'storagePolicy'

    I am using Hadoop 2.6 (with Docker: sudo docker run -i -t sequenceiq/hadoop-docker:2.6.0 /etc/bootstrap.sh -bash).

    When I use PyHDFS to call client.list_status, I get this error:

    Traceback (most recent call last):
      File "testhdfs.py", line 3, in <module>
        print(client.list_status('/'))
      File "...testenv/lib/python3.4/site-packages/pyhdfs.py", line 428, in list_status
        _json(self._get(path, 'LISTSTATUS', **kwargs))['FileStatuses']['FileStatus']
      File "...testenv/lib/python3.4/site-packages/pyhdfs.py", line 427, in <listcomp>
        FileStatus(**item) for item in
    TypeError: __new__() got an unexpected keyword argument 'storagePolicy'
    

    The code:

    from pyhdfs import HdfsClient
    client = HdfsClient(hosts='172.17.0.2:50070')
    print(client.list_status('/'))
    

    This issue is caused by the JSON from the server containing the extra property storagePolicy; adding it to pyhdfs.py fixes this. But I want to know whether this property is a standard property of HDFS/WebHDFS. (An illustration and a tolerant-parsing sketch follow below.)

    bug 
    opened by robberphex 1
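
    For illustration, a minimal sketch of why FileStatus(**item) breaks on unexpected server fields, plus a tolerant-parsing workaround. The field list is an illustrative subset, not the real FileStatus definition.

    from collections import namedtuple

    # Illustrative subset of fields; the real FileStatus has more.
    FileStatus = namedtuple('FileStatus', ['pathSuffix', 'type', 'length'])

    item = {'pathSuffix': 'benchmarks', 'type': 'DIRECTORY', 'length': 0,
            'storagePolicy': 0}  # extra property some servers return

    # FileStatus(**item) raises TypeError; dropping unknown keys tolerates it.
    known = {k: v for k, v in item.items() if k in FileStatus._fields}
    print(FileStatus(**known))
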
  • why response assert not empty

    In pyhdfs.py, line 424:

    assert not metadata_response.content
    

    In my client, I get some response when upload files.

    b'<html>\r\n<head><title>307 Temporary Redirect</title></head>\r\n<body bgcolor="white">\r\n<center><h1>307 Temporary Redirect</h1></center>\r\n<hr><center>nginx/1.13.8</center>\r\n</body>\r\n</html>\r\n'
    

    This response does not mean the upload failed, and I can successfully upload my files when I delete this line. Why was this line added? Could you please help me figure out this problem?

    opened by SparkSnail 0
  • Support setting webhdfs_path

    In the latest version of pyhdfs, the WebHDFS path is set as the constant '/webhdfs/v1'. This works well in most scenarios, but users may need a customized HTTP URL: for example, they may run their own WebHDFS service behind Pylon and access their RESTful server through a URL pattern like http://<HOST>:<HTTP_PORT>/webhdfs/api/v2/<PATH>?op=...

    opened by SparkSnail 0
  • Let pyhdfs visit HDFS in a Kerberos environment

    When HDFS requires Kerberos authentication, pyhdfs.py cannot reach HDFS. So maybe you should add authentication information in pyhdfs.py. In fact, Python calls the requests module when it talks to HDFS, so the authentication information could be added there. (A sketch of one approach follows below.)

    opened by LuckyNemo 0
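
    A sketch of one way to wire this up today, assuming your HdfsClient version accepts a requests_session argument and that the third-party requests_kerberos package is installed; neither assumption comes from this report.

    import requests
    from requests_kerberos import HTTPKerberosAuth, OPTIONAL  # third-party package

    import pyhdfs

    # Attach SPNEGO/Kerberos auth to every request the client makes.
    session = requests.Session()
    session.auth = HTTPKerberosAuth(mutual_authentication=OPTIONAL)

    fs = pyhdfs.HdfsClient(hosts='nn1.example.com:50070', requests_session=session)
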
  • got TypeError while appending to a file

      File "/usr/local/lib/python3.6/site-packages/pyhdfs/__init__.py", line 520, in append
        path, 'APPEND', expected_status=HTTPStatus.TEMPORARY_REDIRECT, **kwargs)
      File "/usr/local/lib/python3.6/site-packages/pyhdfs/__init__.py", line 466, in _post
        return self._request('post', path, op, expected_status, **kwargs)
      File "/usr/local/lib/python3.6/site-packages/pyhdfs/__init__.py", line 431, in _request
        _check_response(response, expected_status)
      File "/usr/local/lib/python3.6/site-packages/pyhdfs/__init__.py", line 933, in _check_response
        remote_exception['message'] = exception_name + ' - ' + remote_exception['message']
    TypeError: must be str, not NoneType

    opened by BingoZ 0
  • can't parse JSON with unprintable characters

    If a weird non-UTF-8 file name is created in HDFS, then the client fails when it can't interpret the response as a valid JSON string.

    e.g. it's possible to put a Ctrl-R character in the file name

    bug 
    opened by jingw 0
Releases (v0.3.1)