Overview

LuniBot for Discord

Displays market info for the LUNI/LUNA token on the Terra blockchain. It currently pulls its data by web-scraping CoinMarketCap, and will evolve over time :)

Install

--Install Dependencies

pip3 install beautifulsoup4 discord.py python-dotenv flask requests

--Add a .env file to the repository and put the following line in it:

token = REPLACEWITHTOKEN <---- your Discord bot token, obtained from the Discord Developer Portal. DO NOT SHARE THIS TOKEN WITH ANYONE

Set up your bot in the Discord Developer Portal and click "Bot" in the left menu to find the "Token" section underneath the Username input.

Copy that token and use it to replace REPLACEWITHTOKEN in the line above.

This token is effectively a private key for your bot, so make sure not to leave it anywhere someone you don't trust can get it. Anyone who has it could potentially mess with your server.
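For reference, here is a minimal sketch of how main.py might read that token with python-dotenv. The key name "token" matches the .env line above; the rest is an assumption, not the repo's exact code:

```python
# minimal sketch: load the bot token from the .env file (assumes python-dotenv)
import os
from dotenv import load_dotenv

load_dotenv()               # reads .env from the current working directory
TOKEN = os.getenv("token")  # same key as the "token = ..." line in .env
```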

Add Discord Bot to your server

Click "OAuth2" on the left menu on the Discord Developer Portal

Go to the second section, "URL Generator"

Assign the permissions you want the bot to have, and make sure you know what you're giving it access to, as granting too much control can cause problems.

Copy the link at the bottom of the page; it should look something like "https://discord.com/api/oauth2/authorize?client_id=??????????".

Open that link in a new tab and choose the server you want the bot to join (you will have to confirm some permissions first), and your bot will then be connected.

It will show offline until you finalize and run the program.

Finished

And just like that, you should be good. As long as you have the necessary dependencies installed, you should see a smooth startup when you run main.py from VSCode (or your preferred editor).

Now go ahead and type "$luni" into a channel the bot is in (while it is online) and watch it print out the top info for LUNI.

I also added a print function that prints the price to the console whenever it is requested from Discord.
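To illustrate the shape of such a command, here is a rough, hypothetical sketch of a "$luni" handler built with discord.py, requests, and BeautifulSoup. The CoinMarketCap URL and the price selector are placeholders (the site's markup changes often), so treat this as a template rather than the repo's actual main.py:

```python
# hypothetical sketch of a "$luni" command handler (not the repo's exact code)
import os
import requests
import discord
from bs4 import BeautifulSoup
from dotenv import load_dotenv

load_dotenv()

# discord.py 2.x needs the message-content intent to read "$luni";
# it must also be enabled under "Bot" in the Discord Developer Portal
intents = discord.Intents.default()
intents.message_content = True
client = discord.Client(intents=intents)

def get_luni_price() -> str:
    # placeholder URL and selector: CoinMarketCap's markup changes, adjust as needed
    page = requests.get("https://coinmarketcap.com/currencies/luni/", timeout=10)
    soup = BeautifulSoup(page.text, "html.parser")
    tag = soup.find("span", class_="priceValue")  # hypothetical class name
    return tag.get_text(strip=True) if tag else "price not found"

@client.event
async def on_message(message):
    if message.author == client.user:
        return
    if message.content.startswith("$luni"):
        price = get_luni_price()
        print(f"LUNI price requested: {price}")  # the console print mentioned above
        await message.channel.send(f"LUNI price: {price}")

client.run(os.getenv("token"))
```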

Tips

**This bot will have access to all your channels, so if you want to limit the command to one channel, I recommend setting permissions for the bot's role (it is automatically named after whatever you called the bot in the Discord Developer Portal) so that it cannot view the channels you don't want it replying in.
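If you prefer to enforce this in code as well (a different approach from the role-permission route above), a simple allow-list check inside the message handler works; the channel ID below is a placeholder:

```python
# hypothetical in-code restriction: only reply in allow-listed channels
ALLOWED_CHANNEL_IDS = {123456789012345678}  # placeholder; use your channel's real ID

# inside on_message, before handling "$luni":
#     if message.channel.id not in ALLOWED_CHANNEL_IDS:
#         return
```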

**If your token is ever compromised (e.g. you push to GitHub with your token readily available to anybody who checks your code), there is usually no cause for concern: Discord will automatically let you know and the token will be invalidated. This does mean, however, that you must re-copy and re-enter a fresh token into your local code for the bot to continue working (after rerunning main.py, of course).

**Lastly, I recommend enabling "slow mode" or otherwise limiting how often the commands can be used: the more info the bot pulls, the longer each request takes. I am currently looking into ways to make the pull far more efficient, so the program needs less effort to access the data.
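One simple way to ease that load (an idea of mine, not something the repo does yet) is to cache the scraped price for a short window so repeated "$luni" calls don't hit CoinMarketCap every time:

```python
# hypothetical caching layer around the scraper to cut down on repeated requests
import time

_cache = {"price": None, "fetched_at": 0.0}
CACHE_TTL = 60  # seconds; tune to taste

def get_luni_price_cached(fetch_fn):
    """Return a recent price, re-scraping only when the cached one is stale."""
    now = time.time()
    if _cache["price"] is None or now - _cache["fetched_at"] > CACHE_TTL:
        _cache["price"] = fetch_fn()  # e.g. the get_luni_price() sketch above
        _cache["fetched_at"] = now
    return _cache["price"]
```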

**This bot is super simple, so feel free to mess with it, add to it, or change it. It is meant as a template, since not many solid ones are out there for this sort of thing. Hopefully this helps with that!

PS: Added the logo for the LuniBot avatar :)

Let me know any issues or suggestions! I'm not an expert by any means, so I don't expect this to work flawlessly :P
