The newest feature in the software I'm developing at Just Software is the workstream, which is our aggregate name for microblog and activity stream functionality. Since we're using a relational DB, we've modeled this module in a way that every workstream message is attached to a user's stream in a simple m:n relation. To reduce the number of INSERT statements, the list of streams is assembled in the business service and then passed to the database service, which does something like this:
INSERT INTO WorkStream (MsgId, UserID) VALUES (5, 1), (5, 2), ...;
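For illustration, here is a minimal sketch of how such a multi-row statement could be assembled with a JDBC PreparedStatement. The table and column names are taken from the example above; the class, method, and variable names (WorkStreamDao, insertWorkStreamEntries, userIds) are hypothetical:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class WorkStreamDao {

    // Inserts one message into many user streams with a single multi-row
    // INSERT. Two placeholders per row, so the total parameter count is
    // 2 * userIds.size(), which is exactly where the limit bites.
    static void insertWorkStreamEntries(Connection conn, long msgId, List<Long> userIds)
            throws SQLException {
        StringBuilder sql = new StringBuilder("INSERT INTO WorkStream (MsgId, UserID) VALUES ");
        for (int i = 0; i < userIds.size(); i++) {
            sql.append(i == 0 ? "(?, ?)" : ", (?, ?)");
        }
        try (PreparedStatement ps = conn.prepareStatement(sql.toString())) {
            int idx = 1;
            for (Long userId : userIds) {
                ps.setLong(idx++, msgId);
                ps.setLong(idx++, userId);
            }
            ps.executeUpdate();
        }
    }
}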
This however can result in a huge number of VALUES (...) parameters when the platform has many users. It didn't occur to me as a problem when implementing it; PostgreSQL can handle large data, no? Well, it's at least limited: when you use a JDBC prepared statement, the number of parameters has a hard limit of 32767. If you don't obey this limit, you'll get something like this:
java.io.IOException: Tried to send an out-of-range integer as a 2-byte value: 40000
Not a very concrete message, right? When I first hit this exception I was kind of baffled and frustrated. But after narrowing down the problem in a debug session and looking at the PostgreSQL JDBC driver's source code, the cause was obvious: the PostgreSQL client/backend protocol dictates that the number of parameters sent from the client to the Postgres backend is encoded as a 2-byte integer (aaah, now the above message actually makes sense). You'll find the details in the PostgreSQL protocol documentation (my 2-byte friend is defined in the Parse message).
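If you wonder why 40000 in particular is "out of range": a signed 2-byte integer, Java's short, tops out at exactly 32767, so 40000 simply doesn't fit. A quick demo:

public class TwoByteDemo {
    public static void main(String[] args) {
        System.out.println(Short.MAX_VALUE);  // 32767, the hard limit
        System.out.println((short) 40000);    // -25536: 40000 wraps around
    }
}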
All right, got the bugger, so what's the solution? Easy: just split the parameter list up into chunks of, say, 30,000 parameters per INSERT statement.
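Here is a minimal sketch of that chunking, reusing the hypothetical insertWorkStreamEntries helper from above. Note that each row carries two parameters, so 15,000 rows per chunk means 30,000 parameters, safely below the 32767 limit:

import java.sql.Connection;
import java.sql.SQLException;
import java.util.List;

// Could live in the same hypothetical WorkStreamDao class as above.
// 2 parameters per row: 15,000 rows per chunk = 30,000 parameters < 32767.
private static final int MAX_ROWS_PER_INSERT = 15_000;

static void insertWorkStreamEntriesChunked(Connection conn, long msgId, List<Long> userIds)
        throws SQLException {
    for (int from = 0; from < userIds.size(); from += MAX_ROWS_PER_INSERT) {
        int to = Math.min(from + MAX_ROWS_PER_INSERT, userIds.size());
        // subList is just a view, so chunking is cheap; one INSERT per chunk
        insertWorkStreamEntries(conn, msgId, userIds.subList(from, to));
    }
}

Each chunk results in one statement and one round trip, so picking a chunk size close to the limit keeps the number of INSERTs low.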
----------------------------------
Problem: When the PostgreSQL JDBC driver sends the parameter set of a prepared statement, the parameter count is encoded as a 2-byte integer on the client side.
Workaround: The hard limit in the current driver version is 32767 parameters; it's recommended to split the list into chunks of at most 30,000 parameters per execution.