Additional tools for particle accelerator data analysis and machine information

Overview

PyLHC Tools

This package is a collection of useful scripts and tools for the Optics Measurements and Corrections group (OMC) at CERN.

Documentation

Getting Started

This package is Python 3.7+ compatible, and can be installed through pip:

pip install pylhc

One can also clone the repository and install it from the local copy:

git clone https://github.com/pylhc/PyLHC
pip install /path/to/PyLHC

Or install directly from the master branch, which is kept stable:

pip install git+https://github.com/pylhc/PyLHC.git#egg=pylhc

After installing, scripts can be run with either python -m pylhc.SCRIPT --FLAG ARGUMENT or by calling the .py files directly.
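For example, both of the following run the BSRT analysis script listed under Functionality below (--FLAG ARGUMENT stands for whatever options the script defines, and the path assumes a local clone with the usual pylhc/ package layout):

python -m pylhc.bsrt_analysis --FLAG ARGUMENT
python /path/to/PyLHC/pylhc/bsrt_analysis.py --FLAG ARGUMENT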

Note: some scripts access functionality that is only available on the CERN Technical Network. To use those, make sure to install the relevant extra dependencies with pip install path/to/PyLHC[cern].

Description

This package provides tools for particle accelerator data analysis, simulation management and machine information extraction, complementing the optics measurement analysis tools of the omc3 package.

Functionality

  • Forced DA Analysis - Script to analyse forced DA. (forced_da_analysis.py)
  • Machine Settings Info - Prints an overview of the machine settings at a given time. (machine_settings_info.py)
  • BSRT Logger and BSRT Analysis - Saves data coming straight from the LHC BSRT FESA class and allows subsequent analysis. (bsrt_logger.py & bsrt_analysis.py)
  • BPM Calibration Factors - Computes the BPM calibration factors using ballistic optics. Two methods are available: using the beta function and using the dispersion. (bpm_calibration.py) See the example after this list.
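As an illustration, a calibration run might look roughly like the following; the option names are assumptions made for this sketch (only the two methods come from the list above), so check the script's documentation for the actual interface:

python -m pylhc.bpm_calibration --inputdir /path/to/measurement --outputdir /path/to/results --method beta
python -m pylhc.bpm_calibration --inputdir /path/to/measurement --outputdir /path/to/results --method dispersion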

Quality checks

  • Unit and accuracy tests are run automatically in CI through GitHub Actions; see the workflow files in the repository.
  • Additional checks for code complexity, design rules, test coverage and duplication are made through CodeClimate.
  • Pull requests implementing functionality or fixes are merged into the master branch after passing CI and getting a reviewer's approval.

Changelog

See the CHANGELOG file.

Hints for Developers

If you want to contribute to PyLHC's development, install it in editable mode:

git clone https://github.com/pylhc/PyLHC
pip install --editable PyLHC

You can install extra dependencies (as defined in setup.py) suited to your use case with the following commands:

pip install --editable PyLHC[cern]
pip install --editable PyLHC[test]
pip install --editable PyLHC[test,doc]
pip install --editable PyLHC[all]

Open an issue, make your changes in a branch and submit a pull request.

Authors

  • pyLHC/OMC-Team - Working Group

License

This project is licensed under the MIT License - see the LICENSE file for details.

Comments
  • BPM Calibration

    opened by Mael-Le-Garrec 5
  • Properly organise extra dependencies for the CERN GPN

    This is essentially https://github.com/pylhc/omc3/issues/272 applied to this package. I will also rename the extra from tech to cern as agreed in https://github.com/pylhc/omc3/pull/273.

    Enhancement Feature Release
    opened by fsoubelet 2
  • Refactor for consistency

    Refactor for consistency with other pylhc packages, which should close #41.

    There are some import changes in there from PyCharm's "optimize imports", but the important changes are in setup.py and conf.py. Includes dependency version updates.

    Important change regarding pyjapc, which is not kept up-to-date on PyPI (and installing from master can mess things up badly in builds): it is now declared as an extra dependency ([tech]), as we discussed on Mattermost, but this can still change. To be decided in this PR.

    Moving to GA will be in another issue / PR.

    opened by fsoubelet 2
  • Update setup for consistency with pylhc packages

    Would also be a good time to think about the dependencies' versions.

    Since we're all fine with pandas 1.x in other packages, we probably shouldn't require 0.25.x here.

    Enhancement
    opened by fsoubelet 2
  • SDDS update for llong

    Hi,

    The new format for SDDS files often uses the llong format. I was able to use your code with the following modifications: in classes.py, lines 17-19, add the llong format (>i8, 8, int); in reader.py, line 139, convert num_dims to int, i.e. int(num_dims).

    Thank you for the good work! Is there any plan to add ASCII compatibility?

    opened by pbelange 2
  • LSA knob to MAD-X script converter

    Adds a script which can take an LSA knob and an optics it is defined for, and create both a definition TFS file and a MAD-X script which will reproduce the knob in simulations.

    The script can also be fed a text file with many knob definitions as well as their trim values, and run for all of these knobs. See examples in the module docstring.

    opened by fsoubelet 1
  • Remove submitter scripts

    Closes #71

    Removed:

    • Entrypoint scripts for job_submitter and autosix
    • The htc and sixdesk_tools modules
    • Tests for the above
    • Documentation files for the modules and the entrypoints
    • Mentions in the README

    Release Request
    opened by fsoubelet 1
  • Import ABCs from the proper modules

    The current job_submitter raises the following:

    DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated since Python 3.3, and in 3.9 it will stop working
      from collections import OrderedDict, Iterable

    Should be a very quick change to become compliant (the warning is for Iterable only, though).

    Enhancement
    opened by fsoubelet 1
  • Fix lsa to madx writer

    • LSA to MAD-X sign convention
    • Better trim naming
    • Check MAD-X names for allowed characters
    • Option to init all variables
    • Machine settings info takes ISO time

    opened by JoschD 0
  • CI Updates

    Updates to the CI workflows, making use of newer versions of certain official actions.

    • Caching and cache management left to the setup-python action
    • Properly call pip as a module everywhere
    • Do not run the build (and build check) twice in the publish workflow

    CI/CD
    opened by fsoubelet 0
  • Rewrite Forced_DA to da_analysis

    Allow input of measurements from blown-up beams, (single-)kicked beams and excited beams (forced DA) via switches.

    • Add missing formulas
    • Input to be checked for heated (no kick)
    • Input also the nominal emittance (HL-LHC)

    opened by JoschD 0
  • Add script to calculate RDT from tracking data

    Complementary to metaclass/opticsclass, create a script which takes PTC trackone data and returns processed RDTs.

    Compared to omc3 RDT reconstruction, no px reconstruction is necessary.

    Enhancement
    opened by mihofer 3
Releases (latest: 0.7.4)
  • 0.7.4 (Oct 19, 2022)

    Patch release 0.7.4:

    Changes in Machine Settings Info

    • Default behaviour when no knobs are given (no --knobs or knobs=None): no knobs are extracted.
    • The old behaviour of extracting all knobs is restored by giving knobs = ["all"] (CLI: --knobs all).
    • The option ["default"] extracts the default knobs as used in OMC3 (CLI: --knobs default).
    • Additional debug logging
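    For example (the --knobs values are those listed above; --time and its value are illustrative placeholders for the script's other options):

    python -m pylhc.machine_settings_info --time "2022-10-19T12:00:00"                  # no --knobs given: no knob extraction
    python -m pylhc.machine_settings_info --time "2022-10-19T12:00:00" --knobs all      # old behaviour: extract all knobs
    python -m pylhc.machine_settings_info --time "2022-10-19T12:00:00" --knobs default  # default knobs as used in OMC3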

    What's Changed

    • Machine settings debug logging and knobs by @JoschD in https://github.com/pylhc/PyLHC/pull/105

    Full Changelog: https://github.com/pylhc/PyLHC/compare/0.7.3...0.7.4

  • 0.7.3 (Oct 11, 2022)

    Release 0.7.3 is a patch release which fixes:

    • LSA to MAD-X sign convention
    • Better trim naming
    • Check MAD-X names for allowed characters
    • Option to init all variables
    • Machine settings info takes ISO time

    What's Changed

    • CI Updates by @fsoubelet in https://github.com/pylhc/PyLHC/pull/103
    • Fix lsa to madx writer by @JoschD in https://github.com/pylhc/PyLHC/pull/104

    Full Changelog: https://github.com/pylhc/PyLHC/compare/0.7.2...0.7.3

  • 0.7.2 (May 31, 2022)

    Release 0.7.2 brings a fix to the lsa_to_madx module, ensuring it does not make the user run into a MAD-X bug later on when using the created knobs.

    Fixed:

    • Trim variables generated in the MAD-X script are guaranteed to be no longer than 47 characters (a hard MAD-X limit) and not to start with an underscore or a digit.

    What's Changed

    • Fix: MAD-X Variable Name Length Limit by @fsoubelet in https://github.com/pylhc/PyLHC/pull/102

    Full Changelog: https://github.com/pylhc/PyLHC/compare/0.7.1...0.7.2

  • 0.7.1 (May 23, 2022)

    Release 0.7.1 brings a fix to the lsa_to_madx module.

    Fixed:

    • Knobs that were not found by LSA in the provided optics are no longer written to disk.

    What's Changed

    • Fix: Do not attempt to write when knob isn't found in LSA optics by @fsoubelet in https://github.com/pylhc/PyLHC/pull/101

    Full Changelog: https://github.com/pylhc/PyLHC/compare/0.7.0...0.7.1

  • 0.7.0 (May 23, 2022)

    Release 0.7.0 contains the following changes:

    Added:

    • Added a new module, pylhc.lsa_to_madx, with functionality to parse LSA knobs from the command line or a text file, retrieve relevant information from LSA and create MAD-X files with the commands necessary to reproduce these knobs in simulations. This is of particular use when trying to reproduce a specific machine configuration in simulations.
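    A sketch of a typical invocation (the optics and knob names are made-up placeholders, and the exact option names are assumptions to be checked against the module documentation):

    python -m pylhc.lsa_to_madx --optics "SOME_OPTICS_NAME" --knobs "LHCBEAM/SOME_KNOB"   # knobs given on the command line
    python -m pylhc.lsa_to_madx --optics "SOME_OPTICS_NAME" --file knobs.txt              # knob definitions and trims from a text file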

    What's Changed

    • LSA knob to MAD-X script converter by @fsoubelet in https://github.com/pylhc/PyLHC/pull/100

    Full Changelog: https://github.com/pylhc/PyLHC/compare/0.6.2...0.7.0

  • 0.6.2 (Apr 23, 2022)

    Release 0.6.2 adds a flag to the info functionality of pylhc.kickgroups to display a copy-pastable list of kick files one can use in the GUI to load them all at once.

    What's Changed

    • Kickgroups file list by @JoschD and @fsoubelet in https://github.com/pylhc/PyLHC/pull/99

    Full Changelog: https://github.com/pylhc/PyLHC/compare/0.6.1...0.6.2

  • 0.6.1 (Apr 22, 2022)

    Release 0.6.1 brings a fix to the kickgroups module.

    Fixed:

    • Correctly detect the plane of the excitationSettings being read.
    • Better handling of kickgroups with no kickfiles.

    Changed:

    • The command-line commands have been renamed to list and info.
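    With these names, usage would look roughly like the following (whether list and info are sub-commands or flags, and which further arguments they take, is an assumption of this sketch; the kickgroup name is a placeholder):

    python -m pylhc.kickgroups list                   # list available kickgroups
    python -m pylhc.kickgroups info SOME_KICKGROUP    # show information on one kickgroup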

    What's Changed

    • fix by @JoschD in https://github.com/pylhc/PyLHC/pull/98

    Full Changelog: https://github.com/pylhc/PyLHC/compare/0.6.0...0.6.1

  • 0.6.0 (Apr 22, 2022)

    Release 0.6.0 contains the following changes:

    Added:

    • Added a new module, pylhc.kickgroups, with functionality to query available kickgroup files from a location, retrieve information about a given kickgroup, and retrieve relevant information for all kicks in a kickgroup. It can be called as a script (python -m pylhc.kickgroups) to print out copy-pastable information to put in the OMC logbook.

    What's Changed

    • Added KickGroup Infos by @JoschD and @fsoubelet in https://github.com/pylhc/PyLHC/pull/97

    Full Changelog: https://github.com/pylhc/PyLHC/compare/v0.5.0...0.6.0

  • v0.5.0 (Apr 20, 2022)

    What's Changed

    • Cron workflow fix by @fsoubelet in https://github.com/pylhc/PyLHC/pull/95
    • Removed IRNL RDT Correction by @JoschD in https://github.com/pylhc/PyLHC/pull/96, which can now be found as its own package at https://github.com/pylhc/irnl_rdt_correction

    Full Changelog: https://github.com/pylhc/PyLHC/compare/0.4.2...v0.5.0

  • 0.4.2 (Mar 31, 2022)

  • v0.4.1 (Feb 20, 2022)

    Minor bugfixes in machine_settings_info.

    • Added:

      • time and start_time can now be given as AccDatetime objects.
    • Fixed:

      • The trims variable is now initialised to None; previously it was left uninitialised when no trims were found, but used later on.
  • v0.4.0 (Feb 16, 2022)

    What's Changed

    • Add Zenodo DOI to README by @fsoubelet in https://github.com/pylhc/PyLHC/pull/89
    • Adds check for non-existing knobs by @JoschD in https://github.com/pylhc/PyLHC/pull/90
    • Update CI by @fsoubelet in https://github.com/pylhc/PyLHC/pull/91
    • Lsa with timerange by @JoschD in https://github.com/pylhc/PyLHC/pull/92

    Release 0.4.0 brings the trim-history option to the machine-info extractor. To enable this, one needs to provide a start_time. The return values are now organized into a dictionary.

    Full Changelog: https://github.com/pylhc/PyLHC/compare/0.3.0...v0.4.0

  • 0.3.0 (Nov 16, 2021)

    Release 0.3.0 brings the following:

    Added:

    • Non-linear correction script for the (HL)LHC Insertion Regions Resonance Driving Terms, including feed-down effects.

    Changed:

    • The package's license has been moved from GPLv3 to MIT.

    Note: if one wishes to extend the IRNL correction script to a different accelerator, there are valuable pointers in the following PR comment.

  • 0.2.0 (Nov 3, 2021)

    This is the first release of pylhc since its omc3 dependency became available on PyPI.

    Added:

    • BPM calibration script to get calibration factors from different BPMs
    • Proper mocking of CERN TN packages (functionality imported from omc3)

    Changed:

    • Minimum required tfs-pandas version is now 3.0.2
    • Minimum required generic-parser version is now 1.0.8
    • Minimum required omc3 version is now 0.2.0
    • Extras related to the CERN TN are now installed with python -m pip install pylhc[cern]

    Removed:

    • The HTCondor and AutoSix functionality has been removed and extracted into its own package at https://github.com/pylhc/submitter
  • 0.1.0 (Dec 11, 2020)

    • Added:

      • Job submitter script to easily generate and schedule jobs through HTCondor.
      • Autosix script to easily generate and submit parametric SixDesk studies through HTCondor.
      • Script to analyse forced dynamic aperture data.
      • Scripts for logging and analysis of LHC BSRT data.
      • Utility modules supporting functionality for the above scripts.
    • Changed:

      • License moved to GNU GPLv3 to comply with the use of the omc3 package.
    • Miscellaneous:

      • Introduced extra dependencies tailored to different use cases of the package.
      • Reworked package organisation for consistency.
      • Set minimum requirements versions.
      • Moved CI/CD setup to Github Actions.
      • Improved testing and test coverage.
Owner

PyLHC - Organisation for the OMC Team at CERN