Overview

GitHub Stats Visualization

Generate visualizations of GitHub user and repository statistics using GitHub Actions.

This project is currently a work-in-progress; there will always be more interesting stats to display.

Background

When someone views a profile on GitHub, it is often because they are curious about a user's open source projects and contributions. Unfortunately, that user's stars, forks, and pinned repositories do not necessarily reflect the contributions they make to private repositories. The data likewise does not present a complete picture of the user's total contributions beyond the current year.

This project aims to collect a variety of profile and repository statistics using the GitHub API. It then generates images that can be displayed in repository READMEs, or in a user's Profile README.

Since the project runs on GitHub Actions, no server is required to regularly regenerate the images with updated statistics. Likewise, since the user runs the analysis code themselves via GitHub Actions, they can use their GitHub access token to collect statistics on private repositories that an external service would be unable to access.
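
For concreteness, here is a minimal sketch of what an authenticated, asynchronous GitHub API request with aiohttp might look like. The endpoint and the ACCESS_TOKEN environment variable name are illustrative assumptions, not necessarily what this project's code does internally.

    # Minimal sketch: an authenticated GitHub API request using aiohttp.
    import asyncio
    import os

    import aiohttp


    async def fetch_viewer_login() -> str:
        token = os.environ["ACCESS_TOKEN"]  # personal access token, e.g. from a repository secret
        headers = {"Authorization": f"token {token}"}
        async with aiohttp.ClientSession(headers=headers) as session:
            async with session.get("https://api.github.com/user") as resp:
                resp.raise_for_status()
                data = await resp.json()
                return data["login"]  # the authenticated user's login name


    if __name__ == "__main__":
        print(asyncio.run(fetch_viewer_login()))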

Disclaimer

If the project is used with an access token that has sufficient permissions to read private repositories, it may leak details about those repositories in error messages. For example, the aiohttp library, which is used for asynchronous API requests, may include the requested URL in exceptions, and that URL can contain the name of a private repository. If aiohttp raises an exception, it will be viewable in the Actions tab of the repository, and anyone who can view that tab may be able to see the name of one or more private repositories.
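
As an illustration (a hypothetical example, not code from this project), an aiohttp error raised for a failed request keeps a reference to the requested URL, so simply printing or logging the exception can expose a private repository's name:

    # Hypothetical example of how an aiohttp error can expose a request URL.
    import asyncio

    import aiohttp


    async def fetch(url: str) -> None:
        async with aiohttp.ClientSession() as session:
            async with session.get(url) as resp:
                # Raises ClientResponseError on a 4xx/5xx status; the exception
                # keeps a reference to the request, including its URL.
                resp.raise_for_status()


    try:
        asyncio.run(fetch("https://api.github.com/repos/someuser/private-repo"))
    except aiohttp.ClientResponseError as exc:
        # Logging the exception (or exc.request_info.url) reveals the repository name.
        print(exc.request_info.url)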

Due to known issues with the GitHub statistics API, it sometimes returns inaccurate results. In particular, the repository view counts and the total lines of code modified are probably somewhat inaccurate. Counterintuitively, these values become more accurate over time as GitHub caches statistics for your repositories. Additionally, repositories that were last contributed to more than a year ago may not be included in the statistics, due to limitations in the results returned by the API.

For more information on these inaccuracies, see issues #2, #3, and #13.

Installation

  1. Create a personal access token (not the default GitHub Actions token) using the instructions here. The personal access token must have the read:user and repo permissions. Copy the access token when it is generated; if you lose it, you will have to regenerate the token.
    • Some users have reported that it can take a few minutes for the personal access token to start working. For more, see #30.
  2. Click here to create a copy of this repository. Note: this is not the same as forking it, because it copies everything fresh, without the huge commit history.
  3. If you are viewing this README in your copy of the repository, click this link to go to the "Secrets" page. Otherwise, go to the "Settings" tab of the newly created repository and open the "Secrets" page (bottom left).
  4. Create a new secret with the name ACCESS_TOKEN and paste the copied personal access token as the value.
  5. Optionally, change which statistics are reported (see the sketch after this list for how these settings might be read by the code).
    • To ignore certain repos, add them, separated by commas, to a new secret (created as before) called EXCLUDED. Use the owner/name format, e.g., jstrieb/github-stats.
    • To ignore certain languages, add them (separated by commas) to a new secret called EXCLUDED_LANGS.
    • To show statistics only for "owned" repositories, and not forks with contributions, add an environment variable (under the env header in the main workflow) called EXCLUDE_FORKED_REPOS with a value of true.
  6. Go to the Actions page and press "Run Workflow" on the right side of the screen to generate images for the first time. After that, the images will be regenerated automatically every hour, but they can also be regenerated on demand by running the workflow manually.
  7. Check out the images that have been created in the generated folder.
  8. To add your statistics to your GitHub Profile README, copy and paste the following lines of code into your markdown content. Change the username value to your GitHub username.
    ![](https://github.com/username/github-stats/blob/master/generated/overview.svg)
    ![](https://github.com/username/github-stats/blob/master/generated/languages.svg)
  9. Link back to this repository so that others can generate their own statistics images.
  10. Star this repo if you like it!
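
As referenced in step 5, below is a minimal sketch of how the EXCLUDED, EXCLUDED_LANGS, and EXCLUDE_FORKED_REPOS settings might be read once the workflow exposes them as environment variables. The variable names come from the steps above; the parsing helper is hypothetical and may not match this project's code exactly.

    # Sketch: reading the optional exclusion settings from the environment.
    import os


    def comma_separated(name: str) -> set[str]:
        """Split a comma-separated environment variable into a set of values."""
        raw = os.environ.get(name, "")
        return {item.strip() for item in raw.split(",") if item.strip()}


    excluded_repos = comma_separated("EXCLUDED")        # e.g. {"jstrieb/github-stats"}
    excluded_langs = comma_separated("EXCLUDED_LANGS")  # e.g. {"html", "tex"}
    exclude_forked = os.environ.get("EXCLUDE_FORKED_REPOS", "false").lower() == "true"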

Support the Project

There are a few things you can do to support the project:

  • Star the repository (and follow me on GitHub for more)
  • Share and upvote on sites like Twitter, Reddit, and Hacker News
  • Report any bugs, glitches, or errors that you find

These things motivate me to keep sharing what I build, and they provide validation that my work is appreciated! They also help me improve the project. Thanks in advance!

If you are insistent on spending money to show your support, I encourage you to instead make a generous donation to an organization that advocates for Internet freedoms. Organizations like these help me to feel comfortable releasing work publicly on the Web.
