Python async plus coroutines: fetching Bitcoin market information

Source: Internet
Author: User

Goal

  Select several trading platforms with a high volume of Bitcoin transactions, look up their APIs, and fetch the ticker and depth information for the relevant currency pairs. We choose four exchanges: Bitfinex, Okex, Binance, and Gdax, with the corresponding trading pairs BTC/USD, BTC/USDT, BTC/USDT, and BTC/USD.

I. The CCXT library

  At first I thought of requesting each exchange's market API directly and parsing the returned data, but on GitHub I found a fairly good Python library, CCXT, which wraps the functions for fetching Bitcoin market data, so the time spent analyzing each API is saved. I only have to pass in the exchange and the corresponding currency pair and use the library's fetch_ticker and fetch_order_book functions to get the exchange's ticker and depth information (see the CCXT manual for specific usage). Taking the exchange Okex as an example, use the CCXT library to get Okex's ticker and depth information:

# import the library
import ccxt

# instantiate the exchange
exchange = ccxt.okex()
# trading pair
symbol = 'BTC/USDT'
# get ticker info
ticker = exchange.fetch_ticker(symbol)
# get depth info
depth = exchange.fetch_order_book(symbol)
print('ticker:%s, depth:%s' % (ticker, depth))

After running it, you can see from the output that the ticker and depth information has been obtained.
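For reference, here is a minimal sketch, assuming the CCXT unified ticker and order-book fields that the later sections of this article rely on, of how the returned data can be inspected:

# A minimal sketch, assuming the CCXT unified ticker/order-book fields
# that the rest of this article uses.
import ccxt

exchange = ccxt.okex()
symbol = 'BTC/USDT'

ticker = exchange.fetch_ticker(symbol)
depth = exchange.fetch_order_book(symbol)

# fields used in the following sections
print(ticker.get('timestamp'), ticker.get('high'), ticker.get('low'), ticker.get('last'))
print(ticker.get('info'))      # raw, exchange-specific payload
print(depth.get('asks')[:5])   # top five ask levels, each [price, amount]
print(depth.get('bids')[:5])   # top five bid levels, each [price, amount]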

II. Getting market information from four exchanges (for loop)

   Next we get the information from all four exchanges. The depth contains asks and bids with quite a lot of data, so here I only take the first five levels; for the ticker I only extract the info field (refer to the corresponding exchange's API for what it means). After a simple wrapper, the first approach that came to mind was a for loop. Let's start with that:

# import libraries
import ccxt
import time

now = lambda: time.time()
start = now()

def getData(exchange, symbol):
    data = {}  # for storing ticker and depth info
    # get ticker info
    tickerinfo = exchange.fetch_ticker(symbol)
    # get depth info
    depth = {}
    exchange_depth = exchange.fetch_order_book(symbol)
    # take the first 5 asks and bids (at most 5 levels)
    asks = exchange_depth.get('asks')[:5]
    bids = exchange_depth.get('bids')[:5]
    depth['asks'] = asks
    depth['bids'] = bids
    data['ticker'] = tickerinfo
    data['depth'] = depth
    return data

def main():
    # instantiate the exchanges
    exchanges = [ccxt.binance(), ccxt.bitfinex2(), ccxt.okex(), ccxt.gdax()]
    # trading pairs
    symbols = ['BTC/USDT', 'BTC/USD', 'BTC/USDT', 'BTC/USD']
    for i in range(len(exchanges)):
        exchange = exchanges[i]
        symbol = symbols[i]
        data = getData(exchange, symbol)
        print('exchange: %s data is %s' % (exchange.id, data))

if __name__ == '__main__':
    main()
    print('Run time:%s' % (now() - start))

   After running it, you will find that although the information for every exchange is obtained, execution takes almost 5.7 seconds, because the requests run synchronously, that is, one after another. If we want a snapshot of all four exchanges at roughly the same moment, this result clearly does not meet our requirements.

III. Async and coroutines

  Although the loop above produces the results, it takes too long and does not achieve the desired effect, so next we turn to async and coroutines (refer to the article mentioned above). To use them we first need the asyncio library, which is only available from Python 3.4 onward. It provides a mechanism that lets you write concurrent code in a single-threaded environment using coroutines and multiplexed I/O. Here is a small example from the Python documentation.

import asyncio

async def compute(x, y):
    print("Compute %s + %s ..." % (x, y))
    await asyncio.sleep(1.0)
    return x + y

async def print_sum(x, y):
    result = await compute(x, y)
    print("%s + %s = %s" % (x, y, result))

loop = asyncio.get_event_loop()
loop.run_until_complete(print_sum(1, 2))
loop.close()


When the event loop starts running, it looks for a coroutine in its task queue to schedule. Because print_sum() has been registered with the event loop, print_sum() is called and executes the statement result = await compute(x, y) (equivalent to result = yield from compute(x, y)). Since compute() is itself a coroutine, print_sum() is suspended and compute() is added to the event loop. The program flow then executes the print statement in compute(), printing "Compute %s + %s ...", and goes on to execute await asyncio.sleep(1.0). Because asyncio.sleep() is also a coroutine, compute() is suspended in turn, waiting for the one-second timer to expire. During this second the event loop checks its queue for coroutines that can be scheduled; since print_sum() and compute() are both suspended, the event loop has nothing to dispatch and simply waits. When the timer expires, control returns to compute(), which executes its return statement, and the result is passed back to result in print_sum(). Finally, the result is printed; there are no more tasks in the event queue that can be scheduled, loop.close() closes the event loop, and the program ends.
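The note that result = await compute(x, y) is equivalent to result = yield from compute(x, y) refers to the generator-based coroutines introduced in Python 3.4. For comparison, here is a minimal sketch of the same example in that older style, assuming Python 3.4's @asyncio.coroutine decorator:

# The same example written as Python 3.4 generator-based coroutines.
# @asyncio.coroutine plus yield from is the pre-3.5 equivalent of async/await.
import asyncio

@asyncio.coroutine
def compute(x, y):
    print("Compute %s + %s ..." % (x, y))
    yield from asyncio.sleep(1.0)
    return x + y

@asyncio.coroutine
def print_sum(x, y):
    result = yield from compute(x, y)
    print("%s + %s = %s" % (x, y, result))

loop = asyncio.get_event_loop()
loop.run_until_complete(print_sum(1, 2))
loop.close()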

  Next we use async and coroutines to fetch the market data (note: the CCXT library also provides a corresponding async version). After running, it takes only about 1.9 seconds, many times faster than before.

Run time:1.9661316871643066

Related code:

# import libraries
import ccxt.async as ccxt
import asyncio
import time

now = lambda: time.time()
start = now()

async def getData(exchange, symbol):
    data = {}  # for storing ticker and depth info
    # get ticker info
    tickerinfo = await exchange.fetch_ticker(symbol)
    # get depth info
    depth = {}
    exchange_depth = await exchange.fetch_order_book(symbol)
    # take the first 5 asks and bids (at most 5 levels)
    asks = exchange_depth.get('asks')[:5]
    bids = exchange_depth.get('bids')[:5]
    depth['asks'] = asks
    depth['bids'] = bids
    data['ticker'] = tickerinfo
    data['depth'] = depth
    return data

def main():
    # instantiate the exchanges
    exchanges = [ccxt.binance(), ccxt.bitfinex2(), ccxt.okex(), ccxt.gdax()]
    # trading pairs
    symbols = ['BTC/USDT', 'BTC/USD', 'BTC/USDT', 'BTC/USD']
    tasks = []
    for i in range(len(exchanges)):
        task = getData(exchanges[i], symbols[i])
        tasks.append(asyncio.ensure_future(task))
    loop = asyncio.get_event_loop()
    loop.run_until_complete(asyncio.wait(tasks))

if __name__ == '__main__':
    main()
    print('Run time:%s' % (now() - start))


IV. Scheduled crawling and saving the data to MongoDB

  Building on the previous code, we add a scheduled task that crawls the data at a fixed interval and saves it to a MongoDB database. Only small changes to the earlier code are needed; the code and results are as follows:

import asyncio
import ccxt.async as ccxt
import time
import pymongo

# get ticker and depth information
async def get_exchange_tickerdepth(exchange, symbol):
    # exchange is an instantiated exchange object
    while True:
        print('%s is run %s' % (exchange.id, time.ctime()))
        # get ticker info
        tickerinfo = await exchange.fetch_ticker(symbol)
        ticker = tickerinfo.get('info')
        if type(ticker) == type({}):
            ticker['timestamp'] = tickerinfo.get('timestamp')
            ticker['high'] = tickerinfo.get('high')
            ticker['low'] = tickerinfo.get('low')
            ticker['last'] = tickerinfo.get('last')
        else:
            ticker = tickerinfo
        # get depth info
        depth = {}
        exchange_depth = await exchange.fetch_order_book(symbol)
        # take the first 5 asks and bids (at most 5 levels)
        asks = exchange_depth.get('asks')[:5]
        bids = exchange_depth.get('bids')[:5]
        depth['asks'] = asks
        depth['bids'] = bids
        data = {'exchange': exchange.id,
                'countries': exchange.countries,
                'symbol': symbol,
                'ticker': ticker,
                'depth': depth}
        # save the data
        save_exchangedate(exchange.id, data)
        print('********* %s is finished, time %s *********' % (exchange.id, time.ctime()))
        # wait before the next round
        await asyncio.sleep(2)

# store the data in MongoDB
def save_exchangedate(exchangename, data):
    # connect to MongoDB
    connect = pymongo.MongoClient(host='localhost', port=27017)
    # create the database
    exchangedata = connect['exchangedataasyncio']
    # create the collection
    exchangeinformation = exchangedata[exchangename]
    # deduplicate before saving
    count = exchangeinformation.count()
    if not count > 0:
        exchangeinformation.insert_one(data)
    else:
        for item in exchangeinformation.find().skip(count - 1):
            lastdata = item
        if lastdata['ticker']['timestamp'] != data['ticker']['timestamp']:
            exchangeinformation.insert_one(data)

def main():
    exchanges = [ccxt.binance(), ccxt.bitfinex2(), ccxt.okex(), ccxt.gdax()]
    symbols = ['BTC/USDT', 'BTC/USD', 'BTC/USDT', 'BTC/USD']
    tasks = []
    for i in range(len(exchanges)):
        task = get_exchange_tickerdepth(exchanges[i], symbols[i])
        tasks.append(asyncio.ensure_future(task))
    loop = asyncio.get_event_loop()
    try:
        loop.run_forever()
    except Exception as e:
        print(e)
        loop.stop()
        loop.run_forever()
    finally:
        loop.close()

if __name__ == '__main__':
    main()
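To verify that the scheduled crawl is actually writing deduplicated documents, a small sketch along these lines could read back the latest record per collection; it assumes the same local MongoDB instance and the exchangedataasyncio database used above, and a reasonably recent pymongo:

# A verification sketch, assuming the local MongoDB instance and the
# 'exchangedataasyncio' database used by the crawler above.
import pymongo

connect = pymongo.MongoClient(host='localhost', port=27017)
exchangedata = connect['exchangedataasyncio']

for name in exchangedata.list_collection_names():
    collection = exchangedata[name]
    # latest document by insertion order (_id increases monotonically)
    latest = collection.find_one(sort=[('_id', pymongo.DESCENDING)])
    if latest:
        print(name, collection.count_documents({}), latest['ticker']['timestamp'])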

V. Summary

  Using coroutines makes concurrent tasks efficient. Python introduced coroutines in 3.4, but those were based on generator objects; 3.5 added dedicated syntax for them. This has been a simple use of asyncio. Of course, asyncio is not the only coroutine implementation: Tornado and gevent provide similar functionality. One problem remains for me: after running for a while, a request timeout or similar error on one of the exchanges may cause its coroutine to stop running. How can I catch the error and restart the corresponding coroutine? Advice is welcome.
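One direction I am considering (a sketch only, not a verified fix) is to catch exceptions inside each crawl loop, so that a single request timeout just skips one round instead of stopping the coroutine:

# A sketch only, not a verified fix: catch errors inside the loop so that a
# request timeout skips one round instead of stopping the whole coroutine.
import asyncio
import time

async def get_exchange_tickerdepth_safe(exchange, symbol):
    while True:
        try:
            tickerinfo = await exchange.fetch_ticker(symbol)
            exchange_depth = await exchange.fetch_order_book(symbol)
            # ... build the data dict and call save_exchangedate() as above ...
            print('%s fetched at %s' % (exchange.id, time.ctime()))
        except Exception as e:
            # log the error and carry on; the next iteration retries automatically
            print('%s error at %s: %s' % (exchange.id, time.ctime(), e))
        await asyncio.sleep(2)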
