I recently implemented a large file import. It parses the file into the data structures needed for processing
and stores a large amount of data in arrays.
The processing consumes a lot of memory. It is not a memory_limit configuration issue; I have already set that high enough.
Inside the function I also unset() the data once I am done with it. Yet when I compare the memory_get_usage() output taken before the function call with the output taken after it,
there is no noticeable drop after the call, even though the large arrays have all been unset.
If I import a small file there is no out-of-memory error, but once the file grows past a certain size the script fails, because during execution,
and even after the function has returned, memory usage stays very high, which leads to a memory overflow.
This has me a bit stuck. Shouldn't the space used by local variables be freed once the function call ends?
Is this some kind of memory leak? Has anyone run into the same problem, or does anyone know what is going on? Let's discuss. Thank you.
PS: I only have a small score to offer as a bounty; that's all I have.
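Editor's note: a minimal sketch (with made-up data, not the OP's code) of the before/after measurement described above. memory_get_usage() without arguments reports what the script itself is currently using, while memory_get_usage(true) reports what the Zend memory manager has reserved from the operating system; after unset() the first figure usually falls back while the second often stays high.

<?php
// Illustration only: measure memory around a call that builds and frees a big array.
function buildBigArray() {
    $rows = array();
    for ($i = 0; $i < 100000; $i++) {
        $rows[] = str_repeat('x', 100);   // stand-in for parsed file data
    }
    unset($rows);                         // free the local array before returning
}

echo "before: ", memory_get_usage(), " / ", memory_get_usage(true), "\n";
buildBigArray();
echo "after:  ", memory_get_usage(), " / ", memory_get_usage(true), "\n";
// The first figure normally drops back once the array is gone; the second can stay
// elevated because the memory manager keeps the pages for later reuse.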
Reply to discussion (solution)
Are you calling unset() on the whole array or just on an array element?
Also, is the array used inside the function a local variable or a global one?
Well, that is exactly what I have been looking into.
I unset() the whole array.
I also know that inside a function, unset() on a variable imported with global
only removes the temporary alias inside the function.
So I have also unset it through the $GLOBALS array.
The result is still the same.
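Editor's note: for readers who have not hit this before, the difference between the two forms of unset mentioned above looks roughly like this (a small illustration, not the OP's code):

<?php
$bigData = range(1, 100000);

function freeWrong() {
    global $bigData;
    unset($bigData);            // destroys only the local alias; the global array survives
}

function freeRight() {
    unset($GLOBALS['bigData']); // removes the entry from the global symbol table
}

freeWrong();
var_dump(isset($GLOBALS['bigData'])); // bool(true)
freeRight();
var_dump(isset($GLOBALS['bigData'])); // bool(false)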
I have never run into anything like this before, because I have never processed data this large.
Now that I have the problem, I am rethinking my approach.
Today I also read up on the principles of PHP memory management.
I wonder whether anyone else has run into this.
The code is too long, so I will just post one function.
function ImportTestSuiteFormArray1($db, $parentID, $tproject_id, $userID, $duplicateLogic, &$testsuiteArray, &$testcaseArray)
{
    global $productIndex;
    $unexistPF = array();    // platforms present in the Excel file but not in the current project
    $produceRelFB = array();
    $resultMap = null;
    $tsResult = null;        // test suite import results

    $tables = tlObject::getDBTables("platforms");
    $pfSQL = "SELECT id, name FROM " . $tables['platforms'] . " WHERE testproject_id = $tproject_id";
    $pfRe = $db->fetchColumnsIntoMap($pfSQL, "id", "name");
    $pfRe = is_null($pfRe) ? array() : $pfRe;
    foreach ($productIndex as $vKey => $pItem) {
        if (!in_array($vKey, $pfRe)) {
            $unexistPF[] = $vKey;
        }
    }

    $memory1 = memory_get_usage();
    //$fileName = date("md-His");
    //$hand = fopen("e:/testcase/" . $fileName . ".txt", "a+");
    //$bTime = microtime_float();
    $tempTSArray = array();
    $createSuc = "created successfully.";
    $updateSuc = "The update was successful.";

    // first pass: create or update the test suites
    if (is_array($testsuiteArray) && count($testsuiteArray) > 0) {
        foreach ($testsuiteArray as $key => $tsItem) {
            //$begin = microtime_float();
            if ($tsItem['name'] != '') {
                if ($tsItem['parentNum'] == 0) {
                    $parID = $parentID;
                } else {
                    $parID = $tempTSArray[$key]['parentID'];
                }
                $tsuiteMgr = new testsuite($db);
                $info = $tsuiteMgr->get_by_name($tsItem['name'], $parID);
                if (is_null($info)) {
                    $ret = $tsuiteMgr->create($parID, $tsItem['name'], "", $tsItem['node_order']);
                    $tsuiteID = $ret['id'];
                    $tsResult[] = array($tsItem['name'], $createSuc);
                } else {
                    $tsuiteID = $info[0]['id'];
                    $ret = $tsuiteMgr->update($tsuiteID, $tsItem['name'], "", null, $tsItem['node_order']);
                    $tsResult[] = array($tsItem['name'], $updateSuc);
                }
                if (is_array($tsItem['children']) && count($tsItem['children']) > 0) {
                    foreach ($tsItem['children'] as $val) {
                        $tempTSArray[$val]['parentID'] = $tsuiteID;
                    }
                }
                $end = microtime_float();
                //fwrite($hand, "Time per suite: " . ($end - $begin) . "\r\n");
            }
        }
    }

    echo "before testsuiteArray " . memory_get_usage() . "<br>";
    $testsuiteArray = null;
    //echo "after testsuiteArray " . memory_get_usage() . "<br>";
    $GLOBALS['testsuiteArray'] = null;
    //echo "after global testsuiteArray " . memory_get_usage() . "<br>";

    // second pass: build the flat test case data and import it
    if (is_array($testcaseArray) && count($testcaseArray) > 0) {
        $tcData = array();
        //$begin = microtime_float();
        $flag = 0;
        foreach ($testcaseArray as $key => $tsItem) {
            if ($tsItem['parentNum'] == 0) {
                $parID = $parentID;
            } else {
                $parID = $tempTSArray[$key]['parentID'];
            }
            $tcData[$flag] = array(
                "name"       => $tsItem['name'],
                "node_order" => $tsItem['order'],
                "parentid"   => $parID,
            );
            if (is_array($tsItem['property']) && count($tsItem['property']) > 0) {
                foreach ($tsItem['property'] as $pKey => $val) {
                    $tcData[$flag][$pKey] = $val;
                }
            }
            if (is_array($tsItem['custom_fields']) && count($tsItem['custom_fields']) > 0) {
                foreach ($tsItem['custom_fields'] as $cfName => $cfValue) {
                    $tcData[$flag][customfields][] = array("name" => $cfName, "value" => $cfValue);
                }
            }
            if (is_array($tsItem['srs']) && count($tsItem['srs']) > 0) {
                $tcData[$flag]['srs'] = $tsItem['srs'];
            }
            if (is_array($tsItem['produce']) && count($tsItem['produce']) > 0) {
                $tcData[$flag]['produce'] = $tsItem['produce'];
            }
            $flag++;
        }
        //$end = microtime_float();
        //fwrite($hand, "Case loop time: " . ($end - $begin) . "\r\n");
        //echo "before testcaseArray " . memory_get_usage() . "<br>";
        $testcaseArray = null;
        //echo "after testcaseArray " . memory_get_usage() . "<br>";
        $GLOBALS['testcaseArray'] = null;
        //echo "after global testcaseArray " . memory_get_usage() . "<br>";
        $tempTSArray = null;
        //echo "after tempTSArray " . memory_get_usage() . "<br>";

        if (is_array($tcData) && count($tcData) > 0) {
            //$begin = microtime_float();
            $resultMap = SaveImportedTCData1($db, $tcData, $tproject_id, $userID, null, $duplicateLogic, $produceRelFB, $pfRe);
            $tcData = null;
            //echo "tcData " . memory_get_usage() . "<br>";
            //$memory2 = memory_get_usage();
            //$end = microtime_float();
            //fwrite($hand, "Save cases total time: " . ($end - $begin) . "\r\n");
            //fwrite($hand, "Total time: " . ($end - $bTime) . "\r\n");
            //fwrite($hand, "Memory consumption: " . ($memory2 - $memory1) . "\r\n");
        }
    }

    $return = array(
        "resultMap"  => $resultMap,
        "tsResult"   => $tsResult,
        "unexistPF"  => $unexistPF,
        "produceRel" => $produceRelFB,
    );
    return $return;
    //return $resultMap;
}
To explain: $testsuiteArray and $testcaseArray are the arrays produced by parsing the file. They can be tens or even hundreds of MB.
The problem is not that the arrays are too large to work with for other purposes. The problem is that after I call this function, and after the SaveImportedTCData1 function it calls has finished, memory usage does not drop noticeably. I have tried destroying the returned results and all the variables passed by reference, and memory stays very high, which ends in an overflow. Sorry the code is tiring to read, and thanks.
Nobody? ...
You can refer to
http://www.laruence.com/2011/03/04/1894.html
Haha, I have read that article and studied what the author says.
It says the memory is held by the symbol table, so I have been thinking about how to get the symbol table released.
I even asked the author about it. Thanks for the reply.
After parsing the file, the arrays take up about 20 MB.
Then come the database operations, and there are a lot of them.
After they finish, memory usage is close to 1 GB, far more than at the start.
Is a result like that normal?
During the process I unset() the data I no longer need.
In fact, after analysing it carefully,
those arrays are released once the call completes.
So why is so much memory still in use after the call?
Is it because of the huge number of database operations?
I just really want to get this question answered. Thanks.
There are about 5,000 arrays, each containing quite a lot of data.
Processing them performs more than 170,000 database operations, with over 40 seconds of database execution time.
Everything runs in a loop.
Memory usage grows gradually as each array is processed.
In principle, the local variables are the same ones reused on every pass through the loop.
Only the array that holds the database results grows, and it only grows by small amounts.
It cannot possibly need that much memory.
So where does this extra memory footprint come from?
It is not released until the function call completes. Why is that?
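Editor's note: one way to narrow this down is to log memory inside the loop every few hundred iterations, so you can see whether the growth tracks the result array, the database layer, or something else. A self-contained sketch with stand-in data (processItem() and $items are hypothetical):

<?php
// Hypothetical instrumentation: log memory every N iterations to locate the growth.
function processItem($item) {
    return strtoupper($item);   // stand-in for the real per-row database work
}

$items   = array_fill(0, 5000, str_repeat('x', 4096));  // stand-in for the 5,000 parsed arrays
$results = array();
foreach ($items as $i => $item) {
    $results[] = processItem($item);
    if ($i % 500 === 0) {
        error_log(sprintf("iteration %d: %d bytes used, %d bytes allocated",
                          $i, memory_get_usage(), memory_get_usage(true)));
    }
}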
Why, why, why?
This bottleneck really has me down.
Experienced gurus,
big brothers and uncles with great achievements,
where did that extra memory come from?
Help .....
Try releasing the database connection; the memory may be held by the database operations.
The database connection is released as well.
Could the operations in between be causing a memory leak?
Has anyone else run into this situation?
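Editor's note: for the database angle, "releasing" usually means freeing each result set inside the loop, not just closing the connection at the end. The thread's $db is a custom wrapper, so the exact call differs; purely as an illustration, with mysqli and hypothetical credentials and table names it looks like this:

<?php
// Illustration only (mysqli; hypothetical credentials and table).
$mysqli = new mysqli('localhost', 'user', 'pass', 'testdb');
for ($i = 0; $i < 1000; $i++) {
    $result = $mysqli->query("SELECT id, name FROM nodes LIMIT 100");
    while ($row = $result->fetch_assoc()) {
        // ... use $row ...
    }
    $result->free();   // release the buffered result set before the next iteration
}
$mysqli->close();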
I have run into a memory leak with create_function before, but you are not using it here.
You will probably just have to narrow it down step by step yourself.
Xdebug has several features that can help you analyse memory.
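Editor's note: the Xdebug features in question are most likely function traces. With Xdebug 2 (the version current at the time) a trace with per-call memory deltas can be produced roughly as below; treat the exact setting names as something to verify against your installed version.

<?php
// php.ini, Xdebug 2:
//   xdebug.show_mem_delta = 1   ; adds a memory-delta column to human-readable trace files
xdebug_start_trace('/tmp/import_trace');   // hypothetical output path

// ... run the import here ...

xdebug_stop_trace();
echo "peak memory: ", xdebug_peak_memory_usage(), " bytes\n";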
This is driving me crazy --
Maybe I am overthinking it.
You should check how much space the $return value that carries the results occupies.
The return value $return has been unset.
It still occupies a lot of memory.
$resultMap = SaveImportedTCData1($db, $tcData, $tproject_id, $userID, null, $duplicateLogic, $produceRelFB, $pfRe);
The memory usage before and after this call differs enormously.
I have unset the returned $resultMap, as well as $tcData, $produceRelFB and the other related data, and it makes no difference.
Shouldn't all the temporary space allocated while a function executes be released when the call ends? How can this happen? I cannot figure out where that memory went.
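Editor's note: one documented case where a call's temporary space genuinely is not released when it ends is a reference cycle; plain reference counting cannot free it, and it is only reclaimed when the cycle collector runs (gc_collect_cycles(), PHP 5.3+). A minimal illustration, unrelated to the OP's code:

<?php
// A reference cycle keeps memory alive after the function returns under pure refcounting.
function makeCycle() {
    $a = array('payload' => str_repeat('x', 100000));
    $a['self'] = &$a;    // the array now references itself
}                        // $a leaves scope here, but the cycle keeps the payload alive

for ($i = 0; $i < 100; $i++) {
    makeCycle();
}
echo "before collection: ", memory_get_usage(), "\n";
echo "cycles collected:  ", gc_collect_cycles(), "\n";   // PHP 5.3+ cycle collector
echo "after collection:  ", memory_get_usage(), "\n";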
Take a look at PHP's garbage collection mechanism.
I have known about garbage collection since studying Java's GC as a student.
PHP's GC works on a similar principle, and I have been familiar with it for a long time.
I don't think that is the problem here.
I think I need to dig into the PHP internals and study its memory allocation mechanism.
With Xdebug I only get execution information, not memory usage.
I am viewing the output with WinCacheGrind.exe.
Or maybe I just have not found the right option --
If you have never processed data this large,
with this many database operations,
you generally will not run into these problems in an ordinary system.
This function is not used very often,
but as a management tool
it sometimes has to handle imports this big.
Once this is sorted out
it will be another step forward, and I will pay more attention to efficiency and related issues.
I am determined to solve it.
It does feel pretty lonely, though.
Your function also calls other custom functions and instantiates custom classes, and either of those can cause problems.
You will have to work through them one by one.
Then check the saveImportedTCData1 function to see whether the problem is inside it.
Also, if the data comes from the database, check whether the recordset resources have been released ...
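Editor's note: one concrete example of the "custom classes" point is visible in the posted function, where a new testsuite object is constructed on every iteration of the suite loop. Assuming get_by_name() / create() / update() do not require a fresh instance per call (worth verifying against the TestLink API), the constructor could be hoisted out of the loop, roughly:

<?php
// Fragment of the posted function: build the manager once instead of once per suite.
$tsuiteMgr = new testsuite($db);
foreach ($testsuiteArray as $key => $tsItem) {
    if ($tsItem['name'] != '') {
        // $parID is resolved exactly as in the original loop
        $info = $tsuiteMgr->get_by_name($tsItem['name'], $parID);
        // ... same create/update logic as before, reusing $tsuiteMgr ...
    }
}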
Thanks to the two of you above for the answers.
Yes, custom classes and functions are involved.
The biggest problem is indeed inside saveImportedTCData1.
I release everything that function outputs.
Even after it has finished, the memory reported by memory_get_usage() is still many times what it was before the call,
roughly an order of magnitude more.
I think I will concentrate on checking that function.
You said Xdebug can show memory consumption; can I see it through WinCacheGrind?
I cannot see any memory figures there.
Thanks again for the answers.
PS: If Baidu or Google could give me the answer I want, I generally would not ask. I have found that most of the questions I ask do not get the answers I am after, but talking it through with you all still teaches me a lot. I will keep trying to find the cause.
You will have to troubleshoot this yourself; outsiders cannot really help, especially without knowing how saveImportedTCData1 is written.
Try passing by reference and test again; the problem may be that a variable inside your function is itself very large.
$resultMap = SaveImportedTCData1($db, &$tcData, $tproject_id, $userID, null, $duplicateLogic, &$produceRelFB, $pfRe);
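Editor's note: call-time pass-by-reference (the & at the call site in the line above) is deprecated as of PHP 5.3 and removed in later versions; the reference belongs in the function declaration instead. A hypothetical skeleton (not the OP's actual function) of what that looks like:

<?php
// Declare the large parameters by reference in the signature; the call site then uses no &.
function SaveImportedTCData1($db, &$tcData, $tproject_id, $userID,
                             $cfDefs, $duplicateLogic, &$produceRelFB, $pfRe) {
    // ... import work ...
    $tcData = null;   // clears the caller's array too, since both names share one variable
    return array();
}

Also worth remembering: because of copy-on-write, passing a large array by value does not duplicate it immediately; a copy is only made if the callee modifies its parameter.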
Well, my function prototype already takes $tcData and $produceRelFB by reference.
I have just checked it again.
There is a line of code in the function above:
$tcData[$flag][customfields][] = array("name" => $cfName, "value" => $cfValue);
I forgot to quote [customfields]. I remember the manual mentions this problem; it makes the code inefficient
because PHP has to do a lot of extra checking. I am normally in the habit of quoting keys; I must have been careless here.
With this much data the extra cost suddenly shows up. If I remember correctly the space overhead is something like twenty times worse, never mind the time cost.
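Editor's note: on the PHP 5.x versions of that era, the unquoted key is treated as an undefined constant, which raises an E_NOTICE on every assignment and then falls back to the string 'customfields' (and would silently use a constant's value if one with that name were ever defined). The fix is simply to quote the key:

<?php
$flag = 0; $cfName = 'priority'; $cfValue = 'high';   // stand-in values
$tcData = array();
// Unquoted:  $tcData[$flag][customfields][] = ...   triggers the undefined-constant fallback each time.
// Quoted (correct):
$tcData[$flag]['customfields'][] = array("name" => $cfName, "value" => $cfValue);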
I am still tracking down a remaining problem that shows up in another function.
After re-profiling and analysing with Xdebug,
the problem is finally solved.
Thanks to all the friends above.
Respect to the OP.
But how did you solve it in the end? I have been running into this sort of problem lately. What was the solution?
How did the OP solve it, and what were the main causes?