Problems with ordinary database connections
Scenario one:
Disadvantage: every request creates a brand-new database connection, so the number of connections becomes too large.
```python
import pymysql

def index():
    conn = pymysql.connect()
    cursor = conn.cursor()
    cursor.execute('select * from tb where id > %s', [5])
    result = cursor.fetchall()
    cursor.close()
    conn.close()
    print(result)

def update():
    # Disadvantage: each request repeatedly creates a database connection,
    # so the number of connections grows too large
    conn = pymysql.connect()
    cursor = conn.cursor()
    cursor.execute('update userinfo set username=%s where id > %s', ['ctz', 5])
    conn.commit()
    cursor.close()
    conn.close()
    return 'Hello'
```
The problem: every request has to create its own database connection. Processing the data takes only a moment, but establishing the connection takes a long time, and creating a connection per request produces too many connections and degrades database performance.
Scenario two:
Disadvantage: cannot support concurrency.
```python
import pymysql

conn = pymysql.connect()

def index():
    cursor = conn.cursor()
    cursor.execute('select * from tb where id > %s', [5])
    result = cursor.fetchall()
    cursor.close()
    conn.close()
    print(result)

def update():
    cursor = conn.cursor()
    cursor.execute('update userinfo set username=%s where id > %s', ['ctz', 5])
    conn.commit()
    cursor.close()
    conn.close()
    return 'Hello'
```
Putting the database connection in a global variable means it is created only once, but this cannot support concurrency: if several threads execute the same function at the same time, they all share the single conn, and the second thread comes in before the first has finished, which raises errors. We can work around the errors with a lock:
```python
import threading

import pymysql

LOCK = threading.RLock()
CONN = pymysql.connect()

def index():
    with LOCK:
        cursor = CONN.cursor()
        cursor.execute('select * from tb where id > %s', [5])
        result = cursor.fetchall()
        cursor.close()
        print(result)
```
This fixes the errors caused by threads competing for the single connection, but the lock makes the program serial, so throughput drops.
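The serialization cost is easy to demonstrate without a real database. The sketch below (an illustration, with `time.sleep` standing in for a query) runs five "queries" from five threads under a single lock and shows that they execute one after another rather than in parallel:

```python
import threading
import time

LOCK = threading.RLock()

def query_with_lock():
    # Stand-in for a database query that takes 0.1 s while holding the lock
    with LOCK:
        time.sleep(0.1)

start = time.time()
threads = [threading.Thread(target=query_with_lock) for _ in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.time() - start
print(elapsed >= 0.5)  # True: the five "queries" ran one after another
```

With no lock, the five sleeps would overlap and finish in roughly 0.1 s total; the lock forces them to take about 0.5 s.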
To solve the problems of both scenarios above, the solution is to create a database connection pool.
Database connection pool
DBUtils is a Python module that implements database connection pooling.
The pool supports two connection modes:
- Mode one: a connection is created for each thread. Even if the thread calls close(), the connection is not closed; it is simply put back for that same thread to reuse. The connection is closed automatically when the thread terminates.
Implementation principle: based on threading.local, one connection is created per thread. Calling close() does not really close it; the next call from the same thread gets back the connection created at the start. The database connection is truly closed only when the thread terminates.
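The per-thread principle can be sketched with threading.local alone. This is a toy illustration, not DBUtils' actual code; the "connection" is just a dict produced by a hypothetical creator function:

```python
import threading

class ThreadLocalPool:
    """Toy per-thread 'pool': one connection per thread via threading.local.
    (Illustrates the principle only; a real pool would also close the
    connection when the thread exits.)"""

    def __init__(self, creator):
        self._creator = creator
        self._local = threading.local()  # separate attribute storage per thread

    def connection(self):
        # Create the connection lazily, once per thread, then always reuse it
        if not hasattr(self._local, 'conn'):
            self._local.conn = self._creator()
        return self._local.conn

pool = ThreadLocalPool(creator=lambda: {'thread': threading.get_ident()})

results = []

def worker():
    c1 = pool.connection()
    c2 = pool.connection()   # second call returns the very same object
    results.append(c1 is c2)

threads = [threading.Thread(target=worker) for _ in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(all(results))  # True: each thread reused its own single connection
```

Each thread sees only its own `conn` attribute on the `threading.local` instance, which is exactly why mode one never needs a lock.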
"""Create a connection for each thread, thread.local implementation. """ fromDbutils.persistentdbImportPersistentdbImportPymysqlpool=Persistentdb (creator=pymysql,#modules that use linked databasesMaxusage=none,#the maximum number of times a link is reused, none means unlimitedSetsession=[],#a list of commands to execute before starting the session. such as: ["Set Datestyle to ...", "Set time zone ..."]ping=0,#Ping the MySQL server to check if the service is available. # for example: 0 = None = never, 1 = default = Whenever it is requested, 2 = When a cursor is created, 4 = When a query is executed, 7 = Alwayscloseable=False,#If False, Conn.close () is actually ignored for the next use, and the link is automatically closed when the thread is closed. If True, Conn.close () closes the link, and then calls Pool.connection again with an error, because the connection is actually closed (pool.steady_connection () can get a new link)Threadlocal=none,#This thread is exclusive to the object that holds the linked object, if the linked object is resethost='127.0.0.1', Port=3306, the user='Root', Password='123', Database='Pooldb', CharSet='UTF8')deffunc ():#conn = steadydbconnection ()conn =pool.connection () cursor=conn.cursor () cursor.execute ('SELECT * from TB1') Result=Cursor.fetchall () cursor.close () Conn.close ( )#not really closed, but fake off. conn = Pymysql.connect () conn.close ()Conn=pool.connection () cursor=conn.cursor () cursor.execute ('SELECT * from TB1') Result=Cursor.fetchall () cursor.close () Conn.close ( )ImportThreading forIinchRange (10): T= Threading. Thread (target=func) T.start ()
- Mode two: create a batch of connections in the pool for all threads to use: take one when needed, and put it back into the pool when finished.
PS: Because the threadsafety value of pymysql, MySQLdb, etc. is 1, maxshared has no effect in this mode: connections are never actually shared between threads, and each thread gets a dedicated connection from the pool.
```python
import time
import threading

import pymysql
from DBUtils.PooledDB import PooledDB, SharedDBConnection
# DBUtils 2.x renamed the module: from dbutils.pooled_db import PooledDB

POOL = PooledDB(
    creator=pymysql,    # module used to create connections
    maxconnections=6,   # maximum connections allowed in the pool; 0 or None = unlimited
    mincached=2,        # idle connections created at initialization; 0 = create none
    maxcached=5,        # maximum idle connections kept in the pool; 0 or None = unlimited
    maxshared=3,        # maximum shared connections; 0 or None = share all.
                        # PS: useless here, because threadsafety of pymysql/MySQLdb
                        # is 1, so _maxshared is always 0 and connections are
                        # never actually shared.
    blocking=True,      # if no connection is free: True = wait, False = raise an error
    maxusage=None,      # maximum number of times a connection is reused; None = unlimited
    setsession=[],      # commands executed before each session begins,
                        # e.g. ["set datestyle to ...", "set time zone ..."]
    ping=0,             # ping the MySQL server to check whether it is still alive:
                        # 0 = None = never, 1 = default = whenever it is requested,
                        # 2 = when a cursor is created, 4 = when a query is executed,
                        # 7 = always
    host='127.0.0.1',
    port=3306,
    user='root',
    password='123',
    database='pooldb',
    charset='utf8',
)

def func():
    # If the number of connections in use has reached maxconnections,
    # either wait (blocking=True) or raise TooManyConnections.
    # Otherwise, prefer an idle SteadyDBConnection created at initialization,
    # wrap it in a PooledDedicatedDBConnection and return it; if none is idle,
    # create a new SteadyDBConnection, wrap it and return it.
    conn = POOL.connection()
    cursor = conn.cursor()
    cursor.execute('select * from tb1')
    result = cursor.fetchall()
    conn.close()  # returns the connection to the pool for other threads to reuse

    conn = POOL.connection()
    cursor = conn.cursor()
    cursor.execute('select * from tb1')
    result = cursor.fetchall()
    conn.close()

func()
```
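The take-and-return behavior of mode two, including blocking when the pool is exhausted, can be illustrated with a toy pool built on queue.Queue. This is a sketch of the idea only, not DBUtils' implementation; the "connections" are bare objects:

```python
import queue

class ToyPool:
    """Toy shared pool (not DBUtils' code): pre-create a fixed number of
    'connections'; threads take one and put it back when finished."""

    def __init__(self, creator, maxconnections):
        self._q = queue.Queue(maxsize=maxconnections)
        for _ in range(maxconnections):
            self._q.put(creator())

    def connection(self, blocking=True):
        # blocking=True waits for a free connection, mirroring PooledDB's
        # blocking=True; blocking=False raises queue.Empty instead
        # (roughly analogous to PooledDB raising TooManyConnections).
        return self._q.get(block=blocking)

    def release(self, conn):
        self._q.put(conn)  # return the connection to the pool

pool = ToyPool(creator=object, maxconnections=2)

c1 = pool.connection()
c2 = pool.connection()
try:
    pool.connection(blocking=False)   # pool is empty, third request fails
    exhausted = False
except queue.Empty:
    exhausted = True
pool.release(c1)
c3 = pool.connection()                # succeeds once a connection is returned
print(exhausted, c3 is c1)  # True True
```

A real pool additionally validates connections (ping), caps reuse (maxusage), and replaces dead connections, but the queue captures the core take/return discipline.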
PS:
If connection sharing were actually in effect and three threads asked the pool for connections, a single connection could serve all three threads (and so could two or three connections). But maxshared is useless with pymysql/MySQLdb, because their threadsafety is 1; the PooledDB source shows why:

```python
if maxshared and threadsafety > 1:
    self._maxshared = maxshared
    self._shared_cache = []  # the cache for shared connections
else:
    self._maxshared = 0
```

So whatever value you set, _maxshared is always 0 and connections are never shared.
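You can check a driver's thread-safety level yourself: PEP 249 requires every DB-API 2.0 module to expose a `threadsafety` attribute. The example uses the stdlib sqlite3 module so it runs without MySQL installed; pymysql exposes the same attribute and reports 1:

```python
import sqlite3  # any DB-API 2.0 driver exposes `threadsafety` (PEP 249)

# 0 - threads may not share the module
# 1 - threads may share the module, but not connections (pymysql, MySQLdb)
# 2 - threads may also share connections
# 3 - threads may also share cursors
print(sqlite3.threadsafety)
```

Whatever the driver reports here is what PooledDB compares against when deciding whether maxshared can take effect.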