Using the LOB columns of the Dream database, and what they mean for performance: databases frequently need large-object (LOB) types, such as LONG, BLOB, and CLOB in Oracle; TEXT and IMAGE in SQL Server; TEXT and LONGTEXT in MySQL; and the CLOB and BLOB types in the Dream database. The information stored in them falls into two main categories: long text, such as a large passage of prose (an ordinary VARCHAR can hold only about 4 KB, which often is not enough), and binary data, such as uploaded files. That said, a LOB column does not have to hold huge payloads; long articles, icons, and small pictures are typical uses. Keeping genuinely large data in the database, however, raises several problems:
1. Slow queries: a table's query speed is affected not only by its row count but also by its physical size. With little data, queries show no noticeable difference. But if a column holds a large block of text or a large file, the table's physical size grows rapidly, and that one column can account for more than 90% of the space the whole table occupies. Add hundreds of thousands or millions of rows on top of that and the table reaches an alarming size, and query speed suffers badly.
2. Inconvenient handling: to serve a file stored in the database, you must open a stream from the database, read it into a buffer, and write it out through a ServletOutputStream. This holds a database connection and adds load to the database, and if the user aborts the download midway you must also clean up the database resources, which easily causes performance problems. Reading the entire value into memory before writing it out consumes a lot of memory. If the file lives on disk instead, you can simply return a URL for it; even if you do not want users to access the file directly, you can construct an I/O stream to serve it, which consumes no database resources and transfers quickly.
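The disk-file approach described above can be sketched as a chunked copy loop: memory use is bounded by the buffer size, not the file size. This is a minimal illustration; the class name, the 8 KB buffer, and the in-memory streams (standing in for a disk file and a ServletOutputStream) are all assumptions of the sketch.

```java
import java.io.*;

public class FileStreamDemo {
    // Copy an input stream to an output stream in fixed-size chunks,
    // so memory use stays bounded by the buffer, not the file size.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[8192];
        long total = 0;
        int n;
        while ((n = in.read(buffer)) != -1) {
            out.write(buffer, 0, n);
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[100_000];                          // pretend this is a file on disk
        ByteArrayOutputStream sink = new ByteArrayOutputStream(); // stands in for the servlet response
        long copied = copy(new ByteArrayInputStream(data), sink);
        System.out.println(copied);
    }
}
```

In a real servlet you would pass `response.getOutputStream()` as the sink and a `FileInputStream` as the source; the database is never touched.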
3. Performance problems: in particular, when many users download large files at once, the application server must hold the file contents in memory. With 10 users each downloading a 10 MB file, the JVM needs at least 100 MB at peak just for those buffers, which can easily crash the JVM.
Therefore, database LOB columns are not suitable for storing very large data; such data can hurt the database's storage performance.
Getting back to the point: operating on the Dream database's LOB columns with Hibernate takes the following steps:
1. Create a table to hold the LOB data, including the content and its type;
2. Obtain a data stream representing the uploaded file;
3. Perform the save operation.
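The three steps above can be sketched as an entity class plus a save call. Everything here is illustrative: the `Testlob` class and its field names mirror the TESTLOB table defined below, but in a real project the mapping would be declared in an hbm.xml file or with Hibernate's `@Lob` annotation (a CLOB column typically maps to a `String`, a BLOB column to a `byte[]`), and the commented-out `session.save(...)` stands in for the actual Hibernate call.

```java
import java.nio.charset.StandardCharsets;

// Hypothetical entity for the TESTLOB table; with Hibernate the CLOB
// column maps to a String and the BLOB column to byte[], both usually
// marked @Lob in annotation mappings.
public class Testlob {
    private final int id;
    private final String title;
    private final String clobName;
    private final String clobContent; // maps to the CLOB column
    private final String blobName;
    private final byte[] blobContent; // maps to the BLOB column

    public Testlob(int id, String title, String clobName, String clobContent,
                   String blobName, byte[] blobContent) {
        this.id = id;
        this.title = title;
        this.clobName = clobName;
        this.clobContent = clobContent;
        this.blobName = blobName;
        this.blobContent = blobContent;
    }

    public String getClobContent() { return clobContent; }
    public byte[] getBlobContent() { return blobContent; }

    public static void main(String[] args) {
        // Step 2: in a real upload these bytes would come from the request's input stream.
        byte[] fileBytes = "fake file bytes".getBytes(StandardCharsets.UTF_8);
        Testlob row = new Testlob(1, "demo", "a.txt", "a long passage of text...",
                                  "a.bin", fileBytes);
        // Step 3: with Hibernate this would be roughly
        //   Session session = sessionFactory.openSession();
        //   session.save(row);  // assuming Testlob is mapped to TESTLOB
        System.out.println(row.getBlobContent().length);
    }
}
```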
All right, let's start by creating a table in the Dream database containing two LOB-typed columns (one CLOB, one BLOB):
-- create the table
create table TESTLOB(
ID int primary key,
TITLE varchar(50),
CLOBNAME varchar(50),
CLOBCONTENT clob,
BLOBNAME varchar(50),
BLOBCONTENT blob
);
Next, create a sequence to generate the table's sequential ID values:
-- create the sequence
CREATE SEQUENCE "TESTLOB"."SEQ_TESTLOB_ID"
INCREMENT BY 1 START WITH 1 MAXVALUE 100000 MINVALUE 1
NOCYCLE;
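As a quick check that the sequence works, an insert can draw its ID from `SEQ_TESTLOB_ID.NEXTVAL`. This is an illustrative fragment: the placeholder column values are made up, NULL is used for the LOB columns, and depending on the current schema the sequence may need to be qualified as "TESTLOB"."SEQ_TESTLOB_ID" as in its definition above.

```sql
-- illustrative use of the sequence (column values are placeholders)
INSERT INTO TESTLOB (ID, TITLE, CLOBNAME, CLOBCONTENT, BLOBNAME, BLOBCONTENT)
VALUES (SEQ_TESTLOB_ID.NEXTVAL, 'demo', 'a.txt', NULL, NULL, NULL);
```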