Threat Intel Platform for T-POTs

Overview

GreedyBear

The goal of the project is to extract data about the attacks detected by a T-Pot, or a cluster of them, and to generate feeds that can be used to prevent and detect attacks.

Official announcement here.

Feeds

Public feeds

There are public feeds provided by The Honeynet Project on this site: greedybear.honeynet.org (see the site for an example feed).

Please do not make too many requests to extract the feeds, or you will be banned.

If you want to be updated regularly, please download the feeds only once every 10 minutes (this is the time between each internal update).

Available feeds

The feeds are reachable through the following URL (a download example is shown after the parameter lists below):

https://<greedybear_site>/api/feeds/<feed_type>/<attack_type>/<age>.<format>

The available feed_type values are:

  • log4j: attacks detected by the Log4Pot honeypot.
  • cowrie: attacks detected by the Cowrie honeypot.
  • all: get all types at once.

The available attack_type values are:

  • scanner: IP addresses captured by the honeypots while performing attacks.
  • payload_request: IP addresses and domains extracted from payloads that would have been executed if a specific attack had been successful.
  • all: get all types at once.

The available age values are:

  • recent: the most recent IOCs, seen in the last 3 days.
  • persistent: IOCs that have been seen regularly by the honeypots. This feed starts empty when no prior data has been collected and grows over time.

The available format values are:

  • txt: plain text (just one line for each IOC)
  • csv: CSV-like file (just one line for each IOC)
  • json: JSON file with additional information regarding the IOCs
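
To make the URL composition above concrete, here is a minimal Python sketch that downloads one feed from the public instance mentioned earlier. The default parameters (log4j, scanner, recent, txt) are just examples taken from the lists above; treat this as an illustration, not an official client.

import requests

BASE_URL = "https://greedybear.honeynet.org/api/feeds"

def download_feed(feed_type="log4j", attack_type="scanner", age="recent", fmt="txt"):
    # Compose <feed_type>/<attack_type>/<age>.<format> and fetch the feed.
    # Remember the note above: poll at most once every 10 minutes.
    url = f"{BASE_URL}/{feed_type}/{attack_type}/{age}.{fmt}"
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return response.json() if fmt == "json" else response.text

if __name__ == "__main__":
    feed = download_feed()
    # The txt format contains one IOC per line.
    print(feed.splitlines()[:5])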

Run GreedyBear on your environment

The tool was created not only to provide the feeds from The Honeynet Project's cluster of T-Pots.

If you manage one or more T-Pots of your own, you can get the code of this application and run GreedyBear in your environment. In this way, you can provide new feeds of your own.

Comments
  • Added Basic Testcases

    Added Basic Testcases

    Description

    Added Testcases for Views and Models

    Related issues

    Fixes #21

    Type of change

    Please delete options that are not relevant.

    • [ ] Bug fix (non-breaking change which fixes an issue).
    • [ ] New feature (non-breaking change which adds functionality).
    • [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected).

    Checklist

    • [ ] I have read and understood the rules about how to Contribute to this project
    • [ ] The pull request is for the branch dev
    • [ ] The tests gave 0 errors.
    • [ ] Linters (Black, Flake, Isort) gave 0 errors. If you have correctly installed pre-commit, it does these checks and adjustments on your behalf.
    • [ ] The commits were squashed into a single one (optional, they will be squashed anyway by the maintainer)

    Important Rules

    • If your changes decrease the overall tests coverage (you will know after the Codecov CI job is done), you should add the required tests to fix the problem
    • Every time you make changes to the PR and you think the work is done, you should explicitly ask for a review
    opened by uzaxirr 11
  • Create authenticated enrichment service

    Create authenticated enrichment service

    We could provide a service that could be queried via API key. In this way, it would be possible to understand whether an IOC is in the GreedyBear database without having to download and manage all the feeds from GreedyBear.

    It would be a simple enrichment service.

    We would need:

    • a basic GUI (#11) to allow people to register and get an API key.
    • limit API usage to avoid abuse.
    • allow different kinds of API usage limits
    • create new API endpoint (#17)
    • Integrate it in IntelOwl (https://github.com/intelowlproject/IntelOwl/issues/817)
    opened by mlodic 9
  • Create feeds for other honeypot types

    Create feeds for other honeypot types

    GreedyBear works by extracting the data from the T-Pot logs generated by the honeypots.

    As a first alpha release we integrated just Log4Pot + Cowrie.

    We should also integrate all the other honeypots available in T-Pot. Glutton should be the first.

    opened by mlodic 8
  • Fixes #17: Added API for Enrichment

    Fixes #17: Added API for Enrichment

    Description

    Added the enrichment endpoint, to get the details of an observable by its name. Endpoint: /api/enrichment?query=<observable_name>

    Please ignore the vague changes in settings.py regarding env vars; I did that because of #23 and will revert them when the PR is good to go.

    Added fake data in the DB through the admin panel for testing purposes.
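
    For illustration only, a client-side query against the enrichment endpoint described above might look like the sketch below; the instance URL, the authentication header, and the response shape are assumptions, not part of this PR.

    import requests

    GREEDYBEAR_URL = "https://greedybear.example.org"  # hypothetical instance
    API_TOKEN = "your-api-token"                       # hypothetical credential

    def enrich(observable: str) -> dict:
        # Ask GreedyBear whether an observable (IP address or domain) is in its database.
        response = requests.get(
            f"{GREEDYBEAR_URL}/api/enrichment",
            params={"query": observable},
            headers={"Authorization": f"Token {API_TOKEN}"},  # auth scheme is an assumption
            timeout=30,
        )
        response.raise_for_status()
        return response.json()

    print(enrich("1.2.3.4"))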

    Related issues

    Fixes and Closes #17

    Type of change

    Please delete options that are not relevant.

    • [x] New feature (non-breaking change which adds functionality).

    Checklist

    • [x] I have read and understood the rules about how to Contribute to this project
    • [x] The pull request is for the branch dev
    • [ ] The tests gave 0 errors.
    • [ ] Linters (Black, Flake, Isort) gave 0 errors. If you have correctly installed pre-commit, it does these checks and adjustments on your behalf.
    • [ ] The commits were squashed into a single one (optional, they will be squashed anyway by the maintainer)

    Screenshots

    API Response

    For a record that exists in the DB

    Screenshot from 2022-01-05 21-41-55

    For a record that does not exist in the DB

    Screenshot from 2022-01-05 21-42-07

    Details of the searched observable in the DB

    Screenshot from 2022-01-02 23-24-40

    All records in the DB

    Screenshot from 2022-01-02 23-24-27

    opened by uzaxirr 7
  • Configured Read the Docs

    Configured Read the Docs

    Description

    Configured Read the Docs

    Changes I have done:

    • added .readthedocs.yaml file
    • made some changes to docs/source/conf.py
    • added documentation link in readme

    Things to complete:

    I created only the empty md files in docs but haven't added any documentation to them; the docs for openapi and redoc still need to be added.

    Related issues

    This PR partially solves issue #27

    Type of change

    • [x] New feature (non-breaking change which adds functionality).

    Checklist

    • [x] I have read and understood the rules about how to Contribute to this project
    • [x] The pull request is for the branch dev
    • [x] The tests gave 0 errors.
    • [x] Linters (Black, Flake, Isort) gave 0 errors. If you have correctly installed pre-commit, it does these checks and adjustments on your behalf.
    • [ ] The commits were squashed into a single one (optional, they will be squashed anyway by the maintainer)

    Important Rules

    • If your changes decrease the overall tests coverage (you will know after the Codecov CI job is done), you should add the required tests to fix the problem
    • Every time you make changes to the PR and you think the work is done, you should explicitly ask for a review
    opened by yaswanthsaivendra 4
  • Added elasticsearch container for development

    Added elasticsearch container for development

    Description

    Added elasticsearch container for development

    Related issues

    closes #23

    Type of change

    Please delete options that are not relevant.

    • [ ] Bug fix (non-breaking change which fixes an issue).
    • [X] New feature (non-breaking change which adds functionality).
    • [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected).

    Checklist

    • [X] I have read and understood the rules about how to Contribute to this project
    • [X] The pull request is for the branch dev
    • [X] The tests gave 0 errors.
    • [X] Linters (Black, Flake, Isort) gave 0 errors. If you have correctly installed pre-commit, it does these checks and adjustments on your behalf.
    • [ ] The commits were squashed into a single one (optional, they will be squashed anyway by the maintainer)

    Important Rules

    • If your changes decrease the overall tests coverage (you will know after the Codecov CI job is done), you should add the required tests to fix the problem
    • Every time you make changes to the PR and you think the work is done, you should explicitly ask for a review
    opened by devmrfitz 4
  • Elasticsearch installation error

    Elasticsearch installation error

    I'm encountering an error while setting up GreedyBear locally, after running the docker-compose -p greedybear up command. It originates from settings.py, where the Elasticsearch client is being initialized. The ELASTIC_ENDPOINT variable in my env file is empty. Screenshot from 2022-01-02 19-51-22

    opened by uzaxirr 4
  • updated feeds view to make use of DRF and added durin authentication

    updated feeds view to make use of DRF and added durin authentication

    Description

    • Made changes to feeds View to make use of DRF
    • Added token authentication using django-rest-durin.

    Related issues

    This PR solves issue #26.

    Type of change

    • [ ] New feature (non-breaking change which adds functionality).

    Checklist

    • [ ] I have read and understood the rules about how to Contribute to this project
    • [ ] The pull request is for the branch dev
    • [ ] The tests gave 0 errors.
    • [ ] Linters (Black, Flake, Isort) gave 0 errors. If you have correctly installed pre-commit, it does these checks and adjustments on your behalf.
    • [ ] The commits were squashed into a single one (optional, they will be squashed anyway by the maintainer)
    opened by yaswanthsaivendra 3
  • Rate limiting for admin and API

    Rate limiting for admin and API

    Description

    Rate limiting for admin and API
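
    As a hedged sketch only: one common way to rate-limit DRF-based endpoints is Django REST Framework's built-in throttling settings, shown below. The throttle classes are standard DRF, but the rates are placeholders and this is not necessarily what the PR implements.

    # settings.py (sketch): global rate limiting via DRF's built-in throttle classes.
    REST_FRAMEWORK = {
        "DEFAULT_THROTTLE_CLASSES": [
            "rest_framework.throttling.AnonRateThrottle",
            "rest_framework.throttling.UserRateThrottle",
        ],
        "DEFAULT_THROTTLE_RATES": {
            "anon": "10/minute",  # placeholder rate for unauthenticated clients
            "user": "60/minute",  # placeholder rate for authenticated clients
        },
    }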

    Related issues

    #31

    Type of change

    Please delete options that are not relevant.

    • [ ] Bug fix (non-breaking change which fixes an issue).
    • [X] New feature (non-breaking change which adds functionality).
    • [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected).

    Checklist

    • [X] I have read and understood the rules about how to Contribute to this project
    • [X] The pull request is for the branch dev
    • [X] The tests gave 0 errors.
    • [X] Linters (Black, Flake, Isort) gave 0 errors. If you have correctly installed pre-commit, it does these checks and adjustments on your behalf.
    • [X] The commits were squashed into a single one (optional, they will be squashed anyway by the maintainer)

    Important Rules

    • If your changes decrease the overall tests coverage (you will know after the Codecov CI job is done), you should add the required tests to fix the problem
    • Every time you make changes to the PR and you think the work is done, you should explicitly ask for a review
    opened by devmrfitz 2
  • Add CONTRIBUTING.md file

    Add CONTRIBUTING.md file

    Can we please add, or link to, a URL containing the guidelines for future contributors? I see there's nothing mentioned about it in the readme or docs for this repo.

    opened by ManishShah120 2
  • Integrate GreedyBear inside T-Pot installation

    Integrate GreedyBear inside T-Pot installation

    This would require that all of these issues were solved first:

    • #11 , #12 , #10 , #21 , #27

    Plus, we would need to work with the T-Pot team to properly integrate the project there. The goal is to try to reduce the complexity of the overall application to allow for easy integration.

    opened by mlodic 2
  • Allow to do customized feeds lookups

    Allow to do customized feeds lookups

    We could add more ways to extract data feeds from GreedyBear other than the free "recent" and "persistent" feeds.

    These new ways must be protected with authentication to avoid abuse.

    We could give the users the chance to:

    • download the data extracted in the last X hours (customization of "recent")
    • download the data that was seen more than X times in the last X days (customization of "persistent")
    opened by mlodic 0
  • Filter IP addresses from known scanners

    Filter IP addresses from known scanners

    We should periodically download this batch of data: https://raw.githubusercontent.com/stamparm/maltrail/master/trails/static/mass_scanner.txt and add those IPs to whitelists to reduce the number of false positives, as sketched below.
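
    A minimal sketch of the download-and-filter step, assuming the whitelist is kept in memory and applied to a list of IOC strings; how GreedyBear would actually store and apply the whitelist is an assumption, not described in the issue.

    import requests

    # URL referenced in the issue above.
    MASS_SCANNERS_URL = (
        "https://raw.githubusercontent.com/stamparm/maltrail/master/trails/static/mass_scanner.txt"
    )

    def load_mass_scanner_entries() -> set:
        # Download the list and keep the address part of each line, dropping inline comments.
        text = requests.get(MASS_SCANNERS_URL, timeout=30).text
        entries = set()
        for line in text.splitlines():
            entry = line.split("#", 1)[0].strip()
            if entry:
                entries.add(entry)
        return entries

    def filter_known_scanners(iocs):
        # Drop IOCs that exactly match a whitelisted entry (CIDR ranges are not expanded here).
        whitelist = load_mass_scanner_entries()
        return [ioc for ioc in iocs if ioc not in whitelist]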

    opened by mlodic 0
  • Add the chance to select which honeypot we want to extract data from

    Add the chance to select which honeypot we want to extract data from

    Right now there is no way to do that: GreedyBear automatically extracts data from all the configured honeypots.

    We should allow the app administrator to enable/disable honeypot extraction from the Django Admin. In that way we can also filter out logs which state that the honeypot is not running.

    opened by mlodic 0
Releases (v1.0.2)

Owner: The Honeynet Project