A Python pickling decompiler and static analyzer

Overview

Fickling

Fickling is a decompiler, static analyzer, and bytecode rewriter for Python pickle object serializations.

Pickled Python objects are in fact bytecode that is interpreted by a stack-based virtual machine built into Python called the "Pickle Machine". Fickling can take pickled data streams and decompile them into human-readable Python code that, when executed, will deserialize to the original serialized object.
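For example, the standard library's pickletools module will show the raw opcode stream that fickling decompiles:

import pickle
import pickletools

# Disassemble the opcodes that the Pickle Machine would execute for a simple list
pickletools.dis(pickle.dumps([1, 2, 3, 4]))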

The authors do not prescribe any meaning to the “F” in Fickling; it could stand for “fickle,” … or something else. Divining its meaning is a personal journey in discretion and is left as an exercise to the reader.

Installation

Fickling has been tested on Python 3.6 through Python 3.9 and has very few dependencies. It can be installed through pip:

pip3 install fickling

This installs both the library and the command line utility.

Usage

Fickling can be run programmatically:

>>> import ast
>>> import pickle
>>> from fickling.pickle import Pickled
>>> print(ast.dump(Pickled.load(pickle.dumps([1, 2, 3, 4])).ast, indent=4))
Module(
    body=[
        Assign(
            targets=[
                Name(id='result', ctx=Store())],
            value=List(
                elts=[
                    Constant(value=1),
                    Constant(value=2),
                    Constant(value=3),
                    Constant(value=4)],
                ctx=Load()))])

Fickling can also be run as a command line utility:

$ fickling pickled.data
result = [1, 2, 3, 4]

This is of course a simple example. However, Python pickle bytecode can run arbitrary Python commands (such as exec or os.system), so unpickling untrusted data is a security risk. You can test for common patterns of malicious pickle files with the --check-safety option:

$ fickling --check-safety pickled.data
Warning: Fickling failed to detect any overtly unsafe code, but the pickle file may still be unsafe.
Do not unpickle this file if it is from an untrusted source!
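For illustration, here is a minimal sketch (assuming Python 3.9 for ast.unparse) of a malicious pickle built with __reduce__ and decompiled by fickling without ever being executed; patterns like this are what --check-safety looks for:

import ast
import os
import pickle

from fickling.pickle import Pickled

class Malicious:
    def __reduce__(self):
        # The unpickler would call os.system("id") when loading this object
        return os.system, ("id",)

evil = pickle.dumps(Malicious())

# Decompiling surfaces the os.system call without running it
print(ast.unparse(Pickled.load(evil).ast))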

You can also safely trace the execution of the Pickle virtual machine without exercising any malicious code with the --trace option.

Finally, you can inject arbitrary Python code that will be run on unpickling into an existing pickle file with the --inject option.
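The same can be done programmatically; a minimal sketch using Pickled.insert_python_exec, where the dump() call used to write the result back out is assumed here:

import pickle

from fickling.pickle import Pickled

pickled = Pickled.load(pickle.dumps([1, 2, 3, 4]))

# Splice in code that the Pickle Machine will exec() during unpickling
pickled.insert_python_exec("print('hello from the pickle machine')")

# Write the modified stream back out (dump() is assumed)
with open("injected.pickle", "wb") as output:
    pickled.dump(output)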

License

This utility was developed by Trail of Bits. It is licensed under the GNU Lesser General Public License v3.0. Contact us if you're looking for an exception to the terms. © 2021, Trail of Bits.

Comments
  • Injections not cleaning up after themselves.

    Injections not cleaning up after themselves.

    The injected malicious code doesn't clean up the stack after itself, which is what prevents it from being injected at arbitrary locations. It is also the easiest way to detect pickles you've injected into. A "correct" pickle leaves only one value on the stack when everything is done: the reference to the final object. I've never seen a real pickle not comply with this, so using pickletools.dis or your symbolic interpreter you can detect pickles you've injected into, because they leave two values on the stack whether you inject at the beginning or the end.

    You can make the injections more covert by adding a POP instruction to the end of the payload so that it cleans up after itself. You would then also be able to inject at an arbitrary location, as I do in https://github.com/coldwaterq/pickle_injector/blob/main/inject.py.

    To replace the output, you would instead add the POP instruction at the beginning of your payload (i.e., at the end of the real pickle), throwing away everything the original pickle created and replacing it with what your payload creates.
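    A hand-rolled sketch of that idea (this is not fickling's injection code; the payload bytes and the print string are illustrative): pickletools.dis simulates the stack, so the version without the trailing POP is flagged, while the cleaned-up version passes.

    import pickle
    import pickletools

    benign = pickle.dumps([1, 2, 3], protocol=2)
    # GLOBAL builtins.exec, MARK, UNICODE source, TUPLE, REDUCE
    payload = b"cbuiltins\nexec\n(Vprint('injected')\ntR"

    noisy = payload + benign            # exec()'s return value is left on the stack
    stealthy = payload + b"0" + benign  # "0" is POP: discard the extra value

    pickletools.dis(stealthy)           # stack is balanced at STOP; no complaint

    try:
        pickletools.dis(noisy)
    except ValueError as error:
        print("detected:", error)       # reports a non-empty stack after STOP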

    opened by coldwaterq 4
  • NotImplementedError: TODO: Add support for Opcode BININT

    NotImplementedError: TODO: Add support for Opcode BININT

    File "/Users/abenavides/workspace/enricher/fury_fda-models-hub-enricher/venvpython3/lib/python3.8/site-packages/fickling/pickle.py", line 106, in new raise NotImplementedError(f"TODO: Add support for Opcode {info.name}") NotImplementedError: TODO: Add support for Opcode BININT

    bug enhancement 
    opened by abenavidesmeli 2
  • NotImplementedError: TODO: Add support for Opcode BINFLOAT

    NotImplementedError: TODO: Add support for Opcode BINFLOAT

    I was trying out something sophisticated with a simple model pre-trained on MNIST, but I got this error.

    Traceback (most recent call last):
      File ".\pytorch_poc.py", line 147, in <module>
        exfil_model.pickled.insert_python_exec(
      File ".\pytorch_poc.py", line 58, in pickled
        self._pickled = Pickled.load(pickle_file)
      File "C:\Python38\lib\site-packages\fickling\pickle.py", line 343, in load
        opcodes.append(Opcode(info=info, argument=arg, data=data, position=pos))
      File "C:\Python38\lib\site-packages\fickling\pickle.py", line 105, in __new__
        raise NotImplementedError(f"TODO: Add support for Opcode {info.name}")
    NotImplementedError: TODO: Add support for Opcode BINFLOAT
    

    I guess the project still needs more work to support a full-fledged ML-based attack. Any plans for when this will be completed?

    opened by shreyansh26 2
  • Bump pypa/gh-action-pip-audit from 1.0.2 to 1.0.3

    Bump pypa/gh-action-pip-audit from 1.0.2 to 1.0.3

    Bumps pypa/gh-action-pip-audit from 1.0.2 to 1.0.3.

    Release notes

    Sourced from pypa/gh-action-pip-audit's releases.

    Release 1.0.3

    Full Changelog: https://github.com/pypa/gh-action-pip-audit/compare/v1.0.2...v1.0.3

    dependencies 
    opened by dependabot[bot] 1
  • Bump pypa/gh-action-pip-audit from 1.0.0 to 1.0.1

    Bump pypa/gh-action-pip-audit from 1.0.0 to 1.0.1

    Bumps pypa/gh-action-pip-audit from 1.0.0 to 1.0.1.

    Release notes

    Sourced from pypa/gh-action-pip-audit's releases.

    Release 1.0.1

    Full Changelog: https://github.com/pypa/gh-action-pip-audit/compare/v1.0.0...v1.0.1

    dependencies 
    opened by dependabot[bot] 1
  • Make sure to inject our code after PROTO and FRAME

    Make sure to inject our code after PROTO and FRAME

    Resolves issue #30.

    When injecting new code, preserve leading PROTO and FRAME opcodes.

    Also adds an analysis to detect invalid PROTO opcodes that can be an indicator of tampering.

    bug enhancement 
    opened by ESultanik 1
  • Errors when scanning Stable Diffusion/Textual Inversion embeddings pickle file

    Errors when scanning Stable Diffusion/Textual Inversion embeddings pickle file

    I'm trying to give the Stable Diffusion community the ability to trade Textual Inversion embeddings (basically, fine-tuning the model) with each other. When I run fickling against one of my embeddings, I see this:

    $ fickling -t data.pkl

    PROTO
    EMPTY_DICT
        Pushed {}
    BINPUT
        Memoized 0 -> {}
    MARK
        Pushed MARK
    BINUNICODE
        Pushed 'string_to_token'
    BINPUT
        Memoized 1 -> 'string_to_token'
    EMPTY_DICT
        Pushed {}
    BINPUT
        Memoized 2 -> {}
    BINUNICODE
        Pushed ''
    BINPUT
        Memoized 3 -> ''
    GLOBAL
    Traceback (most recent call last):
      File "/home/berble/.local/bin/fickling", line 8, in <module>
        sys.exit(main())
      File "/home/berble/.local/lib/python3.8/site-packages/fickling/cli.py", line 82, in main
        print(unparse(trace.run()))
      File "/home/berble/.local/lib/python3.8/site-packages/fickling/tracing.py", line 54, in run
        self.on_statement(added)
      File "/home/berble/.local/lib/python3.8/site-packages/fickling/tracing.py", line 38, in on_statement
        print(f"\t{unparse(statement).strip()}")
      File "/home/berble/.local/lib/python3.8/site-packages/astunparse/__init__.py", line 13, in unparse
        Unparser(tree, file=v)
      File "/home/berble/.local/lib/python3.8/site-packages/astunparse/unparser.py", line 38, in __init__
        self.dispatch(tree)
      File "/home/berble/.local/lib/python3.8/site-packages/astunparse/unparser.py", line 66, in dispatch
        meth(tree)
      File "/home/berble/.local/lib/python3.8/site-packages/astunparse/unparser.py", line 113, in _ImportFrom
        interleave(lambda: self.write(", "), self.dispatch, t.names)
      File "/home/berble/.local/lib/python3.8/site-packages/astunparse/unparser.py", line 19, in interleave
        f(next(seq))
      File "/home/berble/.local/lib/python3.8/site-packages/astunparse/unparser.py", line 66, in dispatch
        meth(tree)
      File "/home/berble/.local/lib/python3.8/site-packages/astunparse/unparser.py", line 856, in _alias
        if t.asname:
    AttributeError: 'alias' object has no attribute 'asname'

    Any idea where I could start looking? We'd really like to be able to share embeddings safely!

    Here's a base64-encoded copy of my data.pkl:

    gAJ9cQAoWA8AAABzdHJpbmdfdG9fdG9rZW5xAX1xAlgBAAAAKnEDY3RvcmNoLl91dGlscwpfcmVi dWlsZF90ZW5zb3JfdjIKcQQoKFgHAAAAc3RvcmFnZXEFY3RvcmNoCkxvbmdTdG9yYWdlCnEGWAEA AAAwcQdYAwAAAGNwdXEIS010cQlRSwEpKYljY29sbGVjdGlvbnMKT3JkZXJlZERpY3QKcQopUnEL dHEMUnENc1gPAAAAc3RyaW5nX3RvX3BhcmFtcQ5jdG9yY2gubm4ubW9kdWxlcy5jb250YWluZXIK UGFyYW1ldGVyRGljdApxDymBcRB9cREoWAgAAAB0cmFpbmluZ3ESiFgLAAAAX3BhcmFtZXRlcnNx E2gKKVJxFGgDY3RvcmNoLl91dGlscwpfcmVidWlsZF9wYXJhbWV0ZXIKcRVoBCgoaAVjdG9yY2gK RmxvYXRTdG9yYWdlCnEWWAEAAAAxcRdYBgAAAGN1ZGE6MHEYTQADdHEZUUsASwFNAAOGcRpNAANL AYZxG4loCilScRx0cR1ScR6IaAopUnEfh3EgUnEhc1gIAAAAX2J1ZmZlcnNxImgKKVJxI1gbAAAA X25vbl9wZXJzaXN0ZW50X2J1ZmZlcnNfc2V0cSRjX19idWlsdGluX18Kc2V0CnElXXEmhXEnUnEo WA8AAABfYmFja3dhcmRfaG9va3NxKWgKKVJxKlgWAAAAX2lzX2Z1bGxfYmFja3dhcmRfaG9va3Er TlgOAAAAX2ZvcndhcmRfaG9va3NxLGgKKVJxLVgSAAAAX2ZvcndhcmRfcHJlX2hvb2tzcS5oCilS cS9YEQAAAF9zdGF0ZV9kaWN0X2hvb2tzcTBoCilScTFYGgAAAF9sb2FkX3N0YXRlX2RpY3RfcHJl X2hvb2tzcTJoCilScTNYGwAAAF9sb2FkX3N0YXRlX2RpY3RfcG9zdF9ob29rc3E0aAopUnE1WAgA AABfbW9kdWxlc3E2aAopUnE3WAUAAABfa2V5c3E4fXE5aANOc3VidS4=

    bug 
    opened by BeanCounterTop 1
  • Error using check-safety/trace features (AttributeError: 'alias' object has no attribute 'asname')

    Error using check-safety/trace features (AttributeError: 'alias' object has no attribute 'asname')

    Hello! Great tool, I like that it also includes a way to check for potentially malicious opcodes in pickle files.

    I injected a payload into a stylegan2-ada pickle file and it behaves as expected. :)

    Now, when running either the --check-safety or --trace command, the following error is shown:

    !fickling --check-safety /tmp/network-snapshot-000250.backdoor.pkl
    
    Traceback (most recent call last):
      File "/usr/local/bin/fickling", line 8, in <module>
        sys.exit(main())
      File "/usr/local/lib/python3.7/dist-packages/fickling/cli.py", line 79, in main
        return [1, 0][check_safety(pickled)]
      File "/usr/local/lib/python3.7/dist-packages/fickling/analysis.py", line 38, in check_safety
        shortened, already_reported = shorten_code(node)
      File "/usr/local/lib/python3.7/dist-packages/fickling/analysis.py", line 23, in shorten_code
        code = unparse(ast_node).strip()
      File "/usr/local/lib/python3.7/dist-packages/astunparse/__init__.py", line 13, in unparse
        Unparser(tree, file=v)
      File "/usr/local/lib/python3.7/dist-packages/astunparse/unparser.py", line 38, in __init__
        self.dispatch(tree)
      File "/usr/local/lib/python3.7/dist-packages/astunparse/unparser.py", line 66, in dispatch
        meth(tree)
      File "/usr/local/lib/python3.7/dist-packages/astunparse/unparser.py", line 113, in _ImportFrom
        interleave(lambda: self.write(", "), self.dispatch, t.names)
      File "/usr/local/lib/python3.7/dist-packages/astunparse/unparser.py", line 19, in interleave
        f(next(seq))
      File "/usr/local/lib/python3.7/dist-packages/astunparse/unparser.py", line 66, in dispatch
        meth(tree)
      File "/usr/local/lib/python3.7/dist-packages/astunparse/unparser.py", line 856, in _alias
        if t.asname:
    AttributeError: 'alias' object has no attribute 'asname'
    

    Let me know if there is anything more needed to debug the issue.

    Greetings!

    bug duplicate 
    opened by wunderwuzzi23 1
  • NotImplementedError: TODO: Add support for Opcode LONG1

    NotImplementedError: TODO: Add support for Opcode LONG1

    When attempting to use fickling on PyTorch models I get this error. I believe these models were just the weights. So I'm curious whether this is hard to fix, and if you don't have time to fix it, I'd appreciate any guidance about the code base to help me attempt to patch it.

    opened by coldwaterq 1
  • Add DICT and INT opcodes

    Add DICT and INT opcodes

    Two very simple additions: The INT opcode is used to declare constant integer values. The DICT opcode reads values from the stack until it reaches a MARK opcode, alternating between keys and values.

    Take the following script:

    import ast
    import pickletools
    from fickling.pickle import Pickled

    if __name__ == '__main__':
        pickled = b"(I1\nI2\nd."

        for op, arg, pos in pickletools.genops(pickled):
            print(f"{pos}: {op.name} {arg}")

        ast_data = Pickled.load(pickled).ast
        print(ast.dump(ast_data))
    

    Before, it would output:

    0: MARK None
    1: INT 1
    4: INT 2
    7: DICT None
    8: STOP None
    Traceback (most recent call last):
      File "test.py", line 12, in <module>
        ast_data = Pickled.load(pickled).ast
      File "/home/carlos/.local/lib/python3.7/site-packages/fickling/pickle.py", line 343, in load
        opcodes.append(Opcode(info=info, argument=arg, data=data, position=pos))
      File "/home/carlos/.local/lib/python3.7/site-packages/fickling/pickle.py", line 105, in __new__
        raise NotImplementedError(f"TODO: Add support for Opcode {info.name}")
    NotImplementedError: TODO: Add support for Opcode INT
    

    Now it outputs:

    0: MARK None
    1: INT 1
    4: INT 2
    7: DICT None
    8: STOP None
    Module(body=[Assign(targets=[Name(id='result', ctx=Store())], value=Dict(keys=[Constant(value=1)], values=[Constant(value=2)]))])
    
    opened by 00xc 1
  • Add EMPTY_SET Opcode

    Add EMPTY_SET Opcode

    Thanks for this great tool! This Opcode appears to be necessary to fickle some uses of PyTorch modules such as torch.nn.Linear, which I'd love to have support for.

    Here is an example which triggers the following error:

    # example.py
    import pickle
    from fickling.pickle import Pickled
    from torch import nn
    
    filename = "model.pt"
    model = nn.Linear(2, 1)
    
    with open(filename, "wb") as model_file:
        pickle.dump(model, model_file)
    
    with open(filename, "rb") as model_file:
        pickled = Pickled.load(model_file)
    
    $ python example.py
    Traceback (most recent call last):
      File "~/ml-attacks/src/pickle_deserialization/example.py", line 12, in <module>
        pickled = Pickled.load(model_file)
      File "~/ml-attacks/.venv/lib/python3.9/site-packages/fickling/pickle.py", line 343, in load
        opcodes.append(Opcode(info=info, argument=arg, data=data, position=pos))
      File "~/ml-attacks/.venv/lib/python3.9/site-packages/fickling/pickle.py", line 105, in __new__
        raise NotImplementedError(f"TODO: Add support for Opcode {info.name}")
    NotImplementedError: TODO: Add support for Opcode EMPTY_SET
    

    Is anything else needed?

    opened by willclarktech 1
  • Bump pypa/gh-action-pip-audit from 1.0.2 to 1.0.4

    Bump pypa/gh-action-pip-audit from 1.0.2 to 1.0.4

    Bumps pypa/gh-action-pip-audit from 1.0.2 to 1.0.4.

    Release notes

    Sourced from pypa/gh-action-pip-audit's releases.

    Release 1.0.4

    Full Changelog: https://github.com/pypa/gh-action-pip-audit/compare/v1.0.3...v1.0.4

    Release 1.0.3

    Full Changelog: https://github.com/pypa/gh-action-pip-audit/compare/v1.0.2...v1.0.3

    dependencies 
    opened by dependabot[bot] 0
  • runpy._run_code and torch.jit.unsupported_tensor_ops.execWrapper

    runpy._run_code and torch.jit.unsupported_tensor_ops.execWrapper

    Hello,

    I've been playing around with some alternative ways to execute Python via pickles, and discovered both runpy._run_code and torch.jit.unsupported_tensor_ops.execWrapper can be used to call into exec without fickling detecting it. I have some demo code here that will create pickles using these techniques: https://bitbucket.org/hiddenlayersec/sai/src/master/pytorch_inject/torch_picke_inject.py

    runpy._run_code produces no warnings, and execWrapper generates a "Call to execWrapper(...) can execute arbitrary code and is inherently unsafe" warning.

    It might be worth adding explicit checks for both of these methods and flagging them as overtly unsafe.
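    For reference, a minimal sketch of the runpy._run_code variant described above (the Payload class and the print string are illustrative, not taken from the linked demo):

    import pickle
    import runpy

    class Payload:
        def __reduce__(self):
            # The unpickler calls runpy._run_code(source, {}), which exec()s the
            # source without the pickle referencing exec or os.system directly.
            return runpy._run_code, ("print('executed on unpickle')", {})

    malicious_bytes = pickle.dumps(Payload())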

    Many thanks btw for the awesome library!

    Best regards,

    Tom

    opened by hidden-tom 0
  • Possible to apply heuristics scan to pickle files?

    Possible to apply heuristics scan to pickle files?

    I'm not so familiar with pickling and these scans. However, I wondered if maybe there are heuristics or signatures for certain types of pickle files that could be evaluated.

    If you knew, for example, that a pickle file should contain a Stable Diffusion model, certain properties could be examined to help verify it a bit further.

    If so, you could set up something like a /signatures directory and let people submit definitions via pull request, and then scan with something like -security -sig='signatures/typename'.

    This can be closed; I just wanted to pass the idea along in case it could be useful.

    opened by neural-loop 0
  • Add direct support for PyTorch/TorchScript serialized models

    Add direct support for PyTorch/TorchScript serialized models

    Right now, pytorch_poc.py injects malicious code into the pickle files contained within the PyTorch standard model format. Both this format and the TorchScript serialization format are ZIP archives containing pickle files. It would be great to build on this and provide users with easy-to-use functions that can directly manipulate these files, since they're relatively common.
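    For example, the embedded pickle can already be pulled out of a checkpoint with the standard zipfile module; a minimal sketch (the "data.pkl" member name is an assumption, so check namelist() for a given file):

    import ast
    import zipfile

    from fickling.pickle import Pickled

    with zipfile.ZipFile("model.pt") as archive:
        # torch.save() archives typically keep the main pickle in a member ending
        # in "data.pkl"; adjust if the archive uses a different layout.
        member = next(name for name in archive.namelist() if name.endswith("data.pkl"))
        pickled = Pickled.load(archive.read(member))

    print(ast.dump(pickled.ast))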

    enhancement 
    opened by suhacker1 2
Releases
v0.0.4

Owner
Trail of Bits (more code: binary lifters @lifting-bits, blockchain @crytic)