Share constant definitions between programming languages and make your constants constant again

Overview

Reconstant lets you share constant and enum definitions between programming languages.

Constants are defined in a YAML file and converted to idiomatic definitions in multiple programming languages.

Supported outputs include C/C++ header files, Python 3 (using the enum module), Python 2, JavaScript, Vue mixins, and Java.

This is still a work in progress. Feel free to open an issue on GitHub with questions or a PR adding support for additional languages.

Example

Create an input file

test.yaml

constants:
- name: SOME_CONSTANT
  value: "this is a constant string"
- name: OTHER_CONSTANT
  value: 42

enums:
- name: SomeEnum
  values:
    - A
    - B
    - C
- name: OtherEnum
  values:
    - FOO
    - BAR

outputs:
  python:
    path: autogenerated_constants.py
  javascript:
    path: autogenerated_constants.js
  c:
    path: autogenerated_constants.h
  python2:
    path: autogenerated_constants_py2.py
  vue:
    path: autogenerated_vue_constants.js
  java:
    path: AutogeneratedConstants.java

Install and run reconstant

pip install git+https://github.com/aantn/reconstant.git
reconstant test.yaml

Generated Output Files

Python

autogenerated_constants.py

# autogenerated by reconstant - do not edit!
from enum import Enum

# constants
SOME_CONSTANT = "this is a constant string"
OTHER_CONSTANT = 42

# enums
class SomeEnum(Enum):
	A = 0
	B = 1
	C = 2

class OtherEnum(Enum):
	FOO = 0
	BAR = 1
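
For reference, the generated Python module can be consumed like any other module. A minimal sketch, assuming autogenerated_constants.py is on the import path:

# import the generated constants and enums
from autogenerated_constants import SOME_CONSTANT, OTHER_CONSTANT, SomeEnum

# constants are plain module-level values
print(SOME_CONSTANT)            # this is a constant string
print(OTHER_CONSTANT + 1)       # 43

# enums are regular enum.Enum classes, so members carry .name and .value
state = SomeEnum.A
print(state.name, state.value)  # A 0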

JavaScript

autogenerated_constants.js

// autogenerated by reconstant - do not edit!

// constants
export const SOME_CONSTANT = "this is a constant string"
export const OTHER_CONSTANT = 42

// enums
export const SomeEnum = {
	A : 0,
	B : 1,
	C : 2,
}
export const OtherEnum = {
	FOO : 0,
	BAR : 1,
}

C/C++

autogenerated_constants.h

// autogenerated by reconstant - do not edit!
#ifndef AUTOGENERATED_CONSTANTS_H
#define AUTOGENERATED_CONSTANTS_H

// constants
const char* SOME_CONSTANT = "this is a constant string";
const int OTHER_CONSTANT = 42;

// enums
typedef enum { A, B, C } SomeEnum;
typedef enum { FOO, BAR } OtherEnum;

#endif /* AUTOGENERATED_CONSTANTS_H */

Python 2-Compatible Output

autogenerated_constants_py2.py

# autogenerated by reconstant - do not edit!

# constants
SOME_CONSTANT = "this is a constant string"
OTHER_CONSTANT = 42

# enums
SOME_ENUM_A = 0
SOME_ENUM_B = 1
SOME_ENUM_C = 2
OTHER_ENUM_FOO = 0
OTHER_ENUM_BAR = 1
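
Because Python 2 has no built-in enum module, enum members are flattened into prefixed module-level integers. A minimal usage sketch, assuming the generated module above is importable:

# enum members are plain ints, so comparisons are ordinary integer comparisons
from autogenerated_constants_py2 import OTHER_CONSTANT, SOME_ENUM_A

current = SOME_ENUM_A
if current == SOME_ENUM_A:
    print("in state A, OTHER_CONSTANT is %d" % OTHER_CONSTANT)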

Vue Mixins

autogenerated_vue_constants.js

// autogenerated by reconstant - do not edit!

// constants
export const SOME_CONSTANT = "this is a constant string"
export const OTHER_CONSTANT = 42

// enums
export const SomeEnum = {
	A : 0,
	B : 1,
	C : 2,
}

SomeEnum.Mixin = {
  created () {
      this.SomeEnum = SomeEnum
  }
}
export const OtherEnum = {
	FOO : 0,
	BAR : 1,
}

OtherEnum.Mixin = {
  created () {
      this.OtherEnum = OtherEnum
  }
}

Java

AutogeneratedConstants.java

// autogenerated by reconstant - do not edit!
public final class AutogeneratedConstants {

// constants
	public static final String SOME_CONSTANT = "this is a constant string";
	public static final int OTHER_CONSTANT = 42;

// enums
	public enum SomeEnum {
		A, 
		B, 
		C
	}
	public enum OtherEnum {
		FOO, 
		BAR
	}

}