Harvis

Harvis is designed to automate your C2 infrastructure, currently supporting Mythic C2.

πŸ“Œ What is it?

Harvis is a Python tool that helps you create multiple hauls for a specific operation during a red team engagement. It can automatically create your C2 machine and redirector machines, and set up SSL, the .htaccess configuration, firewall rules, and more. Harvis also automates the operator's job of identifying burned domains/redirectors that may be caught during the operation. After identifying a burned domain, it can rotate the infrastructure, setting up a different redirector and assigning a different domain.

πŸ“Œ How?

πŸ”¨ Harvis uses the Digital Ocean API to automate the creation and deletion of droplets.

πŸ”¨ The Namecheap API is used to set DNS records for the redirectors.

πŸ”¨ The APIVoid API is used to periodically verify the state of the redirectors and check whether any domain has been blacklisted.
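
For example, creating a redirector droplet amounts to a single authenticated POST against the Digital Ocean v2 API. The following is a minimal sketch of that kind of call (not Harvis's actual code; the droplet name, region, size, and image are placeholder values):

# Minimal sketch of a Digital Ocean droplet creation call using `requests`.
# The name/region/size/image values below are placeholders, not Harvis defaults.
import requests

digital_ocean_token = "YOUR_DO_TOKEN"

resp = requests.post(
    "https://api.digitalocean.com/v2/droplets",
    headers={"Authorization": f"Bearer {digital_ocean_token}"},
    json={
        "name": "redirector-short",  # hypothetical droplet name
        "region": "nyc3",
        "size": "s-1vcpu-1gb",
        "image": "ubuntu-20-04-x64",
    },
    timeout=30,
)
resp.raise_for_status()
print("Created droplet:", resp.json()["droplet"]["id"])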

Features

Harvis has several features to help you organize your available domains and redirector/C2 machines.

  • Namecheap Interaction - Harvis lets you buy domains directly from the command line, as long as you have credits in your Namecheap account.
  • Multiple Hauls - You can create as many hauls as you need, each with its own configuration.
  • Multiple C2 Profiles per Haul - Harvis allows multiple C2 profiles per haul: for example, two HTTP listeners on one C2 and three on another, each listening on a different port.
  • Multiple Redirector Configurations - Each redirector can have a different .htaccess configuration, defined by the operator.
  • Customizable Firewall Rules - Harvis ships with default firewall rules for the redirectors and C2s; however, these are easily customizable.
  • Priority System - Each haul keeps a queue of domains, so a new redirector replaces a blacklisted one using the next domain in the queue (see the sketch after this list).
  • Infrastructure Rotation - Harvis identifies any blacklisted redirector and prints the results to the operator. It lets the operator create a temporary droplet to replace the blacklisted one. It does not configure the migration of any active agents, since how an agent is migrated/spawned to connect to the new domain tends to be an engagement-specific decision. After creating the temporary droplet, you can migrate any active agents and destroy the old redirector.
  • Crash Recovery - If the script crashes, all state is saved in the backup.py file. Restarting the script restores your infrastructure exactly as it was.
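
The queue behavior can be pictured like this (a minimal sketch of the idea, not Harvis's actual implementation):

# Sketch of the per-haul domain queue: when the active domain is
# blacklisted, the next backup domain in the queue takes its place.
from collections import deque

haul_queue = deque(["domain2.com", "domain3.com"])  # backup domains for one haul
active_domain = "domain1.com"

def rotate(blacklisted):
    """Swap a blacklisted active domain for the next one in the queue."""
    global active_domain
    if blacklisted == active_domain and haul_queue:
        active_domain = haul_queue.popleft()
    return active_domain

print(rotate("domain1.com"))  # -> domain2.com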

Installation

git clone https://github.com/thiagomayllart/Harvis/
cd Harvis
pip3 install -r requirements.txt

Running

python3 harvis.py

Configuring API Keys

Harvis requires valid API keys from Digital Ocean, Namecheap, and APIVoid.

These API keys should be added to the config.py file on the respective lines. Don't forget to set your Namecheap username in the namecheap_username variable:

digital_ocean_token = ""
...
namecheap_key = ""
...
apivoid_key = ""
...
namecheap_username = ""
...

For more information regarding these API keys, visit:

https://app.apivoid.com/
https://ap.www.namecheap.com/
https://cloud.digitalocean.com/

First Run

Before running the script, you should apply some modifications to the config file, which describes the configuration of your infrastructure:

  1. Modify the "names" variable.

This variable holds the name of each haul you want in your infrastructure. These names are reused in the variables that follow. You can have as many hauls as you want. You can also specify which domains should already be assigned to each haul. If you don't specify the domains, you will be asked to move domains to each haul on the first run. If you don't have any available domains in your Namecheap account, you can buy them directly through Harvis. It is also possible to have more than one domain in each haul: the first one will be used by the redirector and the others will go straight to the backup list for later infrastructure rotations. Example:

names = {"short":[],"long":[],"exploitation":[],"testing":[]}
names = {"short":["domain1.com"],"long":["domain2.com","domain3.com"],"exploitation":["domain4.com"],"testing":["domain5.com"]}
  1. Modify the "config_htaccess_dic" variable:

You can customize the .htaccess rules for each haul in this variable. This variable is a dictionary, so remember to add an .htaccess entry for each haul you added to the "names" variable, like:

config_htaccess_dic = \
    {"short":"""
RewriteEngine On
RewriteCond %{REQUEST_URI} ^/({1})/?$ [NC]
RewriteRule ^.*$ https://{2}%{REQUEST_URI} [P]
RewriteRule ^.*$ http://{3}? [L,R=302]
""","long":"""
RewriteEngine On
RewriteCond %{REQUEST_URI} ^/({1})/?$ [NC]
RewriteRule ^.*$ https://{2}%{REQUEST_URI} [P]
RewriteRule ^.*$ http://{3}? [L,R=302]
""","exploitation":"""
RewriteEngine On
RewriteCond %{REQUEST_URI} ^/({1})/?$ [NC]
RewriteRule ^.*$ https://{2}%{REQUEST_URI} [P]
RewriteRule ^.*$ http://{3}? [L,R=302]
""","testing":"""
RewriteEngine On
RewriteCond %{REQUEST_URI} ^/({1})/?$ [NC]
RewriteRule ^.*$ https://{2}%{REQUEST_URI} [P]
RewriteRule ^.*$ http://{3}? [L,R=302]
"""}

Note the placeholder fields {1}, {2}, and {3}. If you customize this variable, do not remove them. {1} holds the parameters your Mythic agent will use to communicate with the C2 (it is also customizable). {2} is the IP address of your C2. {3} is the location the redirector will redirect non-matching traffic to (it is also customizable).
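
For instance, this is roughly what the substitution looks like for one haul (a minimal sketch; the URI parameters, C2 IP, and redirect target are hypothetical example values that Harvis substitutes for you):

# Sketch of how the {1}, {2}, {3} placeholders get filled in.
# Plain str.replace is used so the Apache %{REQUEST_URI} syntax is untouched.
template = config_htaccess_dic["short"]
rendered = (template
            .replace("{1}", "data|index")        # {1}: agent URI parameters (hypothetical)
            .replace("{2}", "203.0.113.10")      # {2}: C2 IP address (example value)
            .replace("{3}", "www.example.com"))  # {3}: decoy redirect target (hypothetical)
print(rendered)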

  1. Modify the "agent_profiles" variable:

This variable describes the HTTP parameters your agent will use to communicate with your C2. Mythic lets you set these parameters when creating the agent, so these values should match the ones you configure in the agent. The first is the GET parameter and the second is the POST parameter. Also, remember to once again add a configuration for each haul you created previously:

agent_profiles = {"short":{"URI":"data|index"},"long":{"URI":"q|id"}... ...
  1. Modify "domain_front_redirector" variable:

This variable holds the domain your redirectors will redirect visitors to when their requests don't match the expected agent traffic. Add a configuration for each haul you created previously:

domain_front_redirector = {"short":"www.example.com","long":"www.example2.com"... ... ...

  1. Modify "c2_profiles" variable:

With the "c2_profiles" variable you can create different listener profiles for each Haul you created. The format is exactly the same as the JSON you may find when accessing Configuring a C2 Profile in Mythic. You can also have other profiles than HTTP, however, depending on the protocol used, it may be necessary to change firewall rules in the C2 or the redirector (further explained).

Example:


c2_profiles = {"short":[{"name":"HTTP","config":"""{
  "instances": [
  {
    "ServerHeaders": {
      "Server": "NetDNA-cache/2.2",
      "Cache-Control": "max-age=0, no-cache",
      "Pragma": "no-cache",
      "Connection": "keep-alive",
      "Content-Type": "application/javascript; charset=utf-8"
    },
    "port": 443,
    "key_path": "privkey.pem",
    "cert_path": "cert.pem",
    "debug": true
    }
  ]
}"""}],"long":[{"name":"HTTP","config":"""{
  "instances": [
  {
    "ServerHeaders": {
      "Server": "NetDNA-cache/2.2",
      "Cache-Control": "max-age=0, no-cache",
      "Pragma": "no-cache",
      "Connection": "keep-alive",
      "Content-Type": "application/javascript; charset=utf-8"
    },
    "port": 443,
    "key_path": "privkey.pem",
    "cert_path": "cert.pem",
    "debug": true
    }
  ]
}"""}] }
  1. Modify the "check_infra_state" variable:

This variable holds the interval, in seconds, between each APIVoid check for blacklisted domains.
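
Conceptually, the interval drives a loop like the following (a sketch with a hypothetical check_blacklists() helper, not Harvis's actual code):

import time

check_infra_state = 3600  # seconds between APIVoid blacklist checks

def check_blacklists():
    ...  # query APIVoid for each active redirector domain (hypothetical helper)

while True:
    check_blacklists()
    time.sleep(check_infra_state)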

  1. Modify "ip_allowed_to_connect_c2":

Replace it with the IP you will use as the proxy to connect to your Mythic C2 panel. You can use your public IP, but this is not recommended.

  1. Replace "username":

Replace it with the username you will use during the engagement. This variable is used to tag each droplet created in Digital Ocean, making it easier to distinguish each operator's droplets. It also prevents the tool from erasing another user's droplets (in case you are sharing the same API keys).
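
For example, these two lines of config.py might end up looking like this (placeholder values):

ip_allowed_to_connect_c2 = "198.51.100.7"  # proxy/VPN IP allowed to reach the Mythic panel
username = "operator1"                     # tag applied to this operator's droplets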

  9. Modify the firewall rules (OPTIONAL):

If you want to modify the firewall rules for the C2/redirector, you will find them respectively at:

C2: C2_setup.py, function firewall_rules
Redirector: redirect_setup.py, function firewall_rules

Important

Harvis whitelists the IP of the machine you deploy it from, allowing that machine to access your Mythic C2 panel. It is highly recommended to use a VPC to deploy Harvis.

Owner

Thiago Mayllart - Information Security Researcher / Red Team