How Log Explorer was used to restore the CSDN Forum's SQL Server database from its logs

Source: Internet
Author: User

How to use Log Explorer:
After installation, open Log Explorer: File => Attach Log File -> select the server and login mode -> Connect ->
select the database -> Attach -> Browse -> View Log. The log records appear in the left-hand pane.
Click "View DDL Commands"; there are many DROP TABLE commands there.
Click the "Undo" button below to generate the table-structure statement (CREATE TABLE ...).
Click the "Salvage" button below to generate the INSERT statements (INSERT INTO ... VALUES ...).
(The method above was provided by lynx1111.)
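The Undo and Salvage output is ordinary T-SQL. A hypothetical sketch of its shape (the table and columns here are invented for illustration, not taken from the actual forum schema):

```sql
-- Hedged sketch of what the Undo/Salvage output looks like.
-- "Undo" of a DROP TABLE yields the structure:
CREATE TABLE t_user (
    ID       int          NOT NULL,
    username varchar(50)  NULL
);
-- "Salvage" yields the row data:
INSERT INTO t_user (ID, username) VALUES (1, 'alice');
INSERT INTO t_user (ID, username) VALUES (2, 'bob');
```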
Following the "Salvage" step above, I generated the INSERT statements for the deleted tables. In fact, the SQL
scripts generated this way already contain the CREATE TABLE statements. Generation took about 8 hours, so slow
that the recovery looked almost hopeless; the largest table's script exceeded 1 GB.
After all the SQL scripts were generated, I stopped the database and copied the log (.LDF) and .MDF files out of
the DATA folder (for fear of damaging the log file, I did not use a database backup). The total file size was 5.7 GB.
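Taking the database offline before copying the files can be sketched with the standard SQL Server 2000 detach/attach procedures (the database name and file paths below are hypothetical):

```sql
-- Hedged sketch: detach the database so the .MDF/.LDF files can be copied safely.
EXEC sp_detach_db 'forum_db';

-- ...copy forum_db.mdf and forum_db_log.ldf out of the DATA folder at the OS level...

-- Re-attach the original files afterwards.
EXEC sp_attach_db 'forum_db',
    'C:\MSSQL\Data\forum_db.mdf',
    'C:\MSSQL\Data\forum_db_log.ldf';
```

Detaching keeps the log file exactly as it is on disk, which matters here because Log Explorer reads the log directly.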
After that, the formal recovery began. I created a new database and first ran the scripts for the small tables;
no problems. However, when a large SQL script file was loaded, Query Analyzer reported an error.
I consulted realgz and learned that Log Explorer has good support for big scripts, so I switched to Log Explorer's
Run SQL Script function. Sure enough, the large files could be restored.
However, after running it, I found that tables with ntext fields were recovered abnormally slowly:
the recovery script uses WRITETEXT to write the ntext data. Restoring one table with 300,000 rows took nearly
12 hours, and the database has a large number of such tables. To speed things up, I set up several more machines;
after three days, all the tables had been processed.
The process produced a small number of errors.
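The generated scripts drive ntext columns through WRITETEXT, which writes one value per statement via a text pointer; that per-row round trip is why these tables were so slow. A minimal sketch of the pattern, with hypothetical table and column names:

```sql
-- Hedged sketch of the WRITETEXT pattern in the generated recovery script.
DECLARE @ptr binary(16);

-- Insert the row with a placeholder so the ntext column has a valid text pointer.
INSERT INTO t_post (post_id, title, body) VALUES (1, N'example', N'');

-- Fetch the text pointer, then write the recovered ntext payload through it.
SELECT @ptr = TEXTPTR(body) FROM t_post WHERE post_id = 1;
WRITETEXT t_post.body @ptr N'...recovered ntext payload...';
```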
Next, I exported the tables from the several machines into one database. However, the restored tables contain no
indexes or identity columns, so the indexes, identity columns, defaults, and triggers all had to be re-created.
When creating a primary key, I found that some of the data was actually duplicated... and the duplicates had to
be deleted. Normally you can use select distinct * into t_new from t_old to remove duplicates, but that method
cannot be used on tables with ntext fields. In the end I used

delete from t_table
where ID in (select ID from t_table a
             where (select count(*) from t_table where ID = a.ID) > 1)

to delete the duplicate records.
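Re-creating what Salvage does not restore can be sketched like this (all object and column names are hypothetical; the real statements must match the original forum schema):

```sql
-- Hedged sketch: restoring the constraints and indexes the recovered tables lost.
ALTER TABLE t_table ADD CONSTRAINT PK_t_table PRIMARY KEY CLUSTERED (ID);
CREATE INDEX IX_t_table_posttime ON t_table (posttime);
ALTER TABLE t_table ADD CONSTRAINT DF_t_table_hits DEFAULT (0) FOR hits;

-- The IDENTITY property cannot be added to an existing column in SQL Server 2000;
-- the usual workaround is to rebuild the table with an identity column:
-- SELECT IDENTITY(int, 1, 1) AS ID, title, hits INTO t_new FROM t_old;
```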
After 72 hours of effort, 99.9% of the data was restored, and the website came back up on the evening of April 9.
At that point, some users reported that they could not log on: a small part of the data had been lost, namely the
rows for which Log Explorer had reported errors... There was nothing for it. I opened the SQL scripts again with
UltraEdit and searched for that data. A closer look showed that some of these rows contain a large number of
carriage returns, which Log Explorer's script runner cannot handle, so those rows were skipped.
Well, the customer is God, so there was no way around it: we restored the user table to a local machine again,
recorded the IDs of the rows that errored, then extracted those statements and ran them in Query Analyzer
(Query Analyzer can run them).
We have now set up a maintenance plan that takes a full backup every week, and the database operation process has
been standardized to prevent such incidents from happening again.
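The weekly full backup in the new maintenance plan boils down to a statement like this (the database name and backup path are hypothetical):

```sql
-- Hedged sketch of the weekly full backup now scheduled in the maintenance plan.
BACKUP DATABASE forum_db
TO DISK = 'D:\Backup\forum_db_full.bak'
WITH INIT, NAME = 'forum_db weekly full backup';
```

In a maintenance plan this runs on a schedule; keeping several generations of the .bak file (rather than overwriting one) guards against a backup that is itself damaged.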
*******************************************************************************
1. Use text/ntext fields with caution.
2. Log Explorer's script-execution tool handles large files well, but it misreads data that contains multiple carriage returns.
3. Don't worry if you have questions: go to CSDN and ask the experts for help. They will be very enthusiastic.

 

It is not easy working for someone else. Whatever goes wrong, they think it's your fault.

You said the database didn't even have basic backups. And besides, all I needed was to import a few stored procedures.

I'm too lazy to talk to that group of idiots.
