tlspuffin

TLS Protocol Under FuzzINg
A symbolic-model-guided fuzzer for TLS

Disclaimer: The term "symbolic-model-guided" should not be confused with symbolic execution or concolic fuzzing.

Description

Fuzzing implementations of cryptographic protocols is challenging. In contrast to traditional fuzzing of file formats, cryptographic protocols require a specific flow of cryptographic and mutually dependent messages to reach deep protocol states. The specification of the TLS protocol describes sound flows of messages and cryptographic operations.

Although the specification has been formally verified multiple times with significant results, a gap has emerged from the fact that implementations of the same protocol have not undergone the same logical analysis. Because the development of cryptographic protocols is error-prone, multiple security vulnerabilities have already been discovered in implementations of TLS which are not present in its specification.

Inspired by symbolic protocol verification, we present a reference implementation of a fuzzer named tlspuffin which employs concrete semantics to execute TLS 1.2 and 1.3 symbolic traces. In fact, attacks which mix TLS versions are within the scope of this implementation. This method allows us to use a genetic fuzzing algorithm to fuzz protocol flows, which proceeds in the following three stages (a minimal sketch of this loop follows the list).

  • By mutating traces we can deviate from the specification to test for logical flaws.
  • Selection of interesting protocol flows advances the fuzzing procedure.
  • A security violation oracle supervises executions and checks them for vulnerabilities.
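
The following is a minimal sketch of this loop in Rust. All names used here (Trace, Observation, mutate, execute, fuzz_loop) are hypothetical stand-ins chosen for illustration and are not the actual tlspuffin API.

// Illustrative sketch only: the types and helpers below are hypothetical, not the tlspuffin API.
#[derive(Clone)]
struct Trace;                        // a symbolic protocol trace

struct Observation {                 // result of executing a trace against the library under test
    new_coverage: bool,
    security_violation: bool,
}

// Stage 1: mutate a trace so that it deviates from the specification.
fn mutate(trace: &Trace) -> Trace {
    trace.clone()
}

// Execute the (mutated) trace in the test harness.
fn execute(_trace: &Trace) -> Observation {
    Observation { new_coverage: false, security_violation: false }
}

fn fuzz_loop(mut corpus: Vec<Trace>) {
    // Runs until the fuzzer is stopped externally.
    loop {
        // Stage 1: pick a trace from the corpus and mutate it.
        let candidate = mutate(&corpus[0]);
        let observation = execute(&candidate);

        // Stage 3: the security violation oracle supervises the execution.
        if observation.security_violation {
            println!("potential attack trace found");
        }

        // Stage 2: keep interesting traces to advance the fuzzing procedure.
        if observation.new_coverage {
            corpus.push(candidate);
        }
    }
}

fn main() {
    fuzz_loop(vec![Trace]);
}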

This novel approach allows rediscovering known vulnerabilities which are out of scope for classical bit-level fuzzers, proving that it is capable of reaching critical protocol states. Despite the promising methodology, no new vulnerabilities have been found by tlspuffin so far. This can be explained by the fact that the implementation effort for TLS protocol primitives and extensions is high and not all features of the specification have been implemented yet. Nonetheless, the approach is promising in terms of quickly reaching high edge coverage, the expressiveness of executable protocol traces, and a stable and extensible implementation.

Features

  • Uses the LibAFL fuzzing framework
  • Fuzzer which is inspired by the Dolev-Yao symbolic model used in protocol verification
  • Domain specific mutators for Protocol Fuzzing!
  • Supported Libraries Under Test: OpenSSL 1.0.1f, 1.0.2u, 1.1.1k and LibreSSL 3.3.3
  • Reproducible for each LUT. We use Git submodules to link to forks that live in the tlspuffin organisation
  • 70% Test Coverage
  • Written in Rust!

Building

Now, to build the project:

git clone git@github.com:tlspuffin/tlspuffin
cd tlspuffin
git submodule update --init --recursive
cargo build

Running

Fuzz using four clients on cores 0 through 3:

cargo run --bin tlspuffin -- --cores 0-3

Note: After switching the Library Under Test or its version, do a clean rebuild (cargo clean), for example when switching from OpenSSL 1.0.1 to 1.1.1.
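
For example, a clean rebuild after switching the library looks like this:

cargo clean
cargo build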

Testing

cargo test

Command-line Interface

The syntax of the tlspuffin command line is:

      tlspuffin [⟨options⟩] [⟨sub-command⟩]

Global Options

Before we explain each sub-command, we first go over the global options. A combined example invocation follows the list.

  • -c, --cores ⟨spec⟩

    This option specifies the cores on which the fuzzer schedules its worker processes. The cores can either be given as a comma-separated list ("0,1,2,7") or as a range ("0-7"). By default, the fuzzer runs only on core 0.

  • -i, --max-iters ⟨i⟩

    This option bounds the number of iterations the fuzzer performs. If omitted, the fuzzer runs indefinitely.

  • -p, --port ⟨n⟩

    The initial communication between the fuzzer broker and its workers happens over TCP/IP. Therefore, the broker requires a port allocation. The default port is 1337.

  • -s, --seed ⟨n⟩

    Defines an initial seed for the PRNG used for mutations. Note that this does not make fuzzing deterministic, because the multiprocessing introduces additional randomness.
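
As an illustration, the global options can be combined in a single invocation; the concrete values below are arbitrary examples:

cargo run --bin tlspuffin -- --cores 0-7 --port 1440 --seed 42 --max-iters 100000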

Sub-commands

Now we will go over the sub-commands execute, plot, experiment, and seed.

  • execute ⟨input⟩

    This sub-command executes a single trace persisted in a file. The path to the file is provided by the ⟨input⟩ argument.

  • plot ⟨input⟩ ⟨format⟩ ⟨output_prefix⟩

    This sub-command plots the trace stored at ⟨input⟩ in the format specified by ⟨format⟩. The created graphics are stored at a path derived from ⟨output_prefix⟩. The option --multiple can be provided to create a separate file for each step in the trace. If the option --tree is given, then only a single graphic which contains all steps is produced.

  • experiment

    This sub-command initiates an experiment. Experiments are stored in a directory named experiments/ in the current working directory; each experiment gets its own sub-directory. The title and description of the experiment can be specified with --title ⟨t⟩ and --description ⟨d⟩ respectively (see the example after this list). Both strings are persisted in the metadata of the experiment, together with the current commit hash of tlspuffin, the version, and the current date and time.

  • seed

    This sub-command serializes the default seed corpus into a directory named corpus/ in the current working directory. The default corpus is defined in the source code of tlspuffin using the trace DSL.
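
For instance, an experiment with a title and a description could be started as follows; the strings are placeholders:

cargo run --bin tlspuffin -- experiment --title "openssl-1.1.1k" --description "baseline run"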

Rust Setup

Install rustup.

The toolchain will be automatically downloaded when building this project. See ./rust-toolchain.toml for more details about the toolchain.

Make sure that you have the clang compiler installed. Optionally, also install llvm to get additional tools like sancov. Also make sure that the usual build tools like make, gcc etc. are installed; they may be needed to build OpenSSL.
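
For reference, on a Debian or Ubuntu system the prerequisites could be installed roughly as follows (package names differ between distributions):

curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
sudo apt install clang llvm make gcc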

Advanced Features

Running with ASAN

ASAN_OPTIONS=abort_on_error=1 \
    cargo run --bin tlspuffin --features asan -- --cores 0-3

It is important to enable abort_on_error; otherwise the fuzzer workers fail to restart on crashes.

Generate Corpus Seeds

cargo run --bin tlspuffin -- seed

Plot Symbolic Traces

To plot SVGs do the following:

cargo run --bin tlspuffin -- plot corpus/seed_client_attacker12.trace svg ./plots/seed_client_attacker12

Note: This requires the dot binary to be on your PATH.
Note: The utility tools/plot-corpus.sh plots a whole directory.

Execute a Symbolic Trace (with ASAN)

To analyze crashes, you can execute a single trace which crashes the testing harness:

cargo run --bin tlspuffin -- execute test.trace

To do the same with ASAN enabled:

ASAN_OPTIONS=detect_leaks=0 \
      cargo run --bin tlspuffin --features asan -- execute test.trace

Crash Deduplication

The following script creates log files for each crash and parses the ASAN output to group similar crashes together:

tools/analyze-crashes.sh

Benchmarking

There is a benchmark in benchmark.rs which compares the execution of the dynamic functions with calling them directly. You can run it using:

cargo bench
xdg-open target/criterion/report/index.html

Documentation

The following generates the documentation for this crate and opens it in the browser. It also includes the documentation of every dependency, such as LibAFL or rustls.

cargo doc --open

You can also view the up-to-date documentation here.

