Ethereum: Web scraping in Python for P2P data extraction
As a passionate cryptocurrency enthusiast, you are probably aware of how important it is to keep track of market prices and trends. An effective way to do this is web scraping, which means extracting data from websites with automated scripts. In this article, we will explore how to use Python to scrape price data from popular exchanges such as Binance and KuCoin.
Requirements
Before you start, make sure you have the following requirements installed (for example with `pip install beautifulsoup4 requests`):
“beautifulsoup4”
“requests”
Choosing a library
There are several web scraping libraries available for Python. For this example, we will use “BeautifulSoup4”, which is widely used and well documented.
```python
import requests
from bs4 import BeautifulSoup
```
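To show how BeautifulSoup fits into the picture, here is a minimal sketch that downloads a page and extracts text from it. The URL and the `price` CSS class are placeholders of mine, not real selectors from Binance or KuCoin.

```python
# Minimal sketch: fetch an HTML page and pull out price-like elements.
# The URL and the 'price' class below are hypothetical placeholders.
response = requests.get('https://example.com/p2p-prices')
soup = BeautifulSoup(response.text, 'html.parser')

for tag in soup.find_all('span', class_='price'):
    print(tag.get_text(strip=True))
```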
Web scraping Binance
Binance offers a robust API that lets us extract data from its platform. We will use the following endpoint to fetch prices for the USDT/ARS pair: https://api.binance.com/api/v3/ticker/price.
```python
def binance_data():
    url = 'https://api.binance.com/api/v3/ticker/price'
    headers = {'accept': 'application/json'}
    params = {
        # Trading pair to query; 'USDTARS' is used here as an example and should be
        # replaced with a symbol actually listed on Binance spot (e.g. 'BTCUSDT').
        'symbol': 'USDTARS',
    }
    response = requests.get(url, headers=headers, params=params)
    if response.status_code == 200:
        # The endpoint returns a small JSON object with 'symbol' and 'price' fields.
        return response.json()
    else:
        print("Failed to retrieve the Binance data.")
        return None
```
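Assuming the request succeeds and the symbol is listed, calling the function and printing the result is enough to sanity-check it:

```python
# Quick check of binance_data(); prints the raw ticker payload.
data = binance_data()
if data is not None:
    print(data)  # a dict such as {'symbol': ..., 'price': ...}
```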
Web scraping KuCoin
KuCoin's P2P (OTC) pages can be queried in a similar way:
```python
def usdt_prices(exchange='binance'):
    prices = []
    for symbol in ['USDT', 'ARS']:
        # Note: the exact KuCoin OTC URL and response format may vary; this assumes
        # the endpoint answers with JSON containing a 'data' field.
        url = f'https://www.kucoin.com/es/otc/buy/{symbol}-{exchange}'
        headers = {'accept': 'application/json'}
        params = {
            'limit': 100,                     # fetch up to 100 price data points
            'symbol': symbol,
            'market': f'{symbol}-{exchange}'
        }
        response = requests.get(url, headers=headers, params=params)
        if response.status_code == 200:
            try:
                prices.append(response.json()['data'])
            except (ValueError, KeyError):
                print(f"Unexpected response format for {symbol}.")
    return prices
```
```python
exchange = 'binance'  # replace with your favourite exchange
prices = usdt_prices(exchange)

# Print the data fetched for each pair
for i, symbol in enumerate(prices):
    print(f'{i + 1}. {symbol["symbol"]} ({symbol["price"]})')
```
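When a script like this runs repeatedly, it is good practice to identify the client and pause between requests so the exchanges' servers are not hammered. Here is a small sketch; the User-Agent string, the example pairs and the one-second delay are all arbitrary choices, not requirements of either exchange.

```python
import time
import requests

# Polite scraping: send a descriptive User-Agent and wait between calls.
headers = {
    'accept': 'application/json',
    'User-Agent': 'p2p-price-scraper-example/0.1',  # arbitrary example identifier
}

for pair in ['BTCUSDT', 'ETHUSDT']:  # example pairs listed on Binance spot
    response = requests.get('https://api.binance.com/api/v3/ticker/price',
                            headers=headers, params={'symbol': pair})
    if response.status_code == 200:
        print(response.json())
    time.sleep(1)  # one-second pause between requests
```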
Data processing and storage
After extracting the desired prices, we usually want to process and store them. For example, a simple script can save them to a JSON file:
```python
import json

# Collect the scraped entries and write them all to a JSON file.
records = []
for i, symbol in enumerate(prices):
    records.append({
        'timestamp': str(i + 1),   # simple sequential identifier, not a real timestamp
        'symbol': symbol['symbol'],
        'price': symbol['price']
    })

with open('prices.json', 'w') as f:
    json.dump(records, f)
```
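To reuse the stored prices later, the file can be read back with the same `json` module. A minimal sketch, assuming `prices.json` was written by the script above:

```python
import json

# Load the previously saved prices back into a list of dicts.
with open('prices.json') as f:
    saved_prices = json.load(f)

for record in saved_prices:
    print(record['symbol'], record['price'])
```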
Conclusion
In this article, we have shown how to use Python to scrape price data from popular exchanges such as Binance and KuCoin. We chose “BeautifulSoup4” as the web scraping library and used the API endpoints offered by each exchange to extract data. The extracted prices were then processed with simple scripts and saved to a JSON file.
Feel free to explore more websites, adjust the parameters, and improve your extraction process for better results. Happy programming!