I recently worked on porting a 32-bit program to 64-bit under Windows XP x64 with VS2005 and ran into several problems typical of 64-bit programming: inline assembly (solved by rewriting it as C++ code), changes to the long type, and, most critically, a 64-bit process that needs to call a 32-bit DLL. Some of those 32-bit DLLs have no source code and cannot be recompiled as 64-bit DLLs, so the only option was to find a way for a 64-bit process to call a 32-bit DLL. This problem had me scratching my head for several days.
Related information:
Microsoft's official website describes the issue as follows:
On 64-bit Windows, a 64-bit process cannot load a 32-bit DLL, and a 32-bit process cannot load a 64-bit DLL. However, 64-bit Windows does support interprocess communication (RPC) between 64-bit and 32-bit processes, both on the same machine and across machines. On 64-bit Windows, an out-of-process 32-bit COM server can communicate with 64-bit clients, and an out-of-process 64-bit COM server can communicate with 32-bit clients. So if you have a 32-bit DLL that is not a COM DLL, you can wrap it in an out-of-process COM server and use COM to call into the DLL from a 64-bit process. (I didn't quite understand that last sentence at first, haha.)
Verification:
Workflow:
1. Create an out-of-process COM server (EXE).
2. Wrap the 32-bit DLL's exported functions as methods on the COM server's interface.
3. Register the COM server: *.exe /regserver (unregister with *.exe /unregserver).
4. The 64-bit process calls the 32-bit COM server's interface successfully, thereby indirectly letting a 64-bit process call a 32-bit DLL.
Specific steps:
First I created a simple DLL project that exports only one function, int Add(int a, int b), and generates the .lib and .dll.
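A minimal sketch of what that DLL might have looked like (LegacyAdd and the ADD_API macro are my own placeholder names, not from the original project):

// LegacyAdd.h: hypothetical header for the 32-bit DLL
#pragma once
#ifdef LEGACYADD_EXPORTS
#define ADD_API __declspec(dllexport)
#else
#define ADD_API __declspec(dllimport)
#endif
extern "C" ADD_API int Add(int a, int b);

// LegacyAdd.cpp: the single exported function
#include "LegacyAdd.h"
extern "C" ADD_API int Add(int a, int b)
{
    return a + b;   // trivial implementation for the test
}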
Then create an out-of-process COM server (EXE type) that links against the DLL, and add a method Add(LONG* c) whose implementation calls the DLL's Add and writes the result to *c. Compile and build.
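In an ATL-style project the wrapper method might look roughly like this; a sketch assuming an ATL EXE server, where the class/interface names and the a/b parameters are my own elaboration of the original Add(LONG* c):

// LegacyWrapper.cpp: COM object method that wraps the 32-bit DLL
#include "stdafx.h"
#include "LegacyWrapper.h"
#include "LegacyAdd.h"          // header of the 32-bit DLL; link against its .lib

STDMETHODIMP CLegacyWrapper::Add(LONG a, LONG b, LONG* c)
{
    if (c == NULL)
        return E_POINTER;
    *c = ::Add((int)a, (int)b); // runs inside the 32-bit EXE server process
    return S_OK;
}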
Then register the COM server: *.exe /regserver.
Finally, create a 64-bit Win32 project and verify that the method can be called correctly in a 64-bit environment. Verified, it works!!!
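A rough sketch of what the 64-bit test client does, assuming the MIDL-generated header and GUID file from my hypothetical LegacyServer COM project above (file and symbol names are placeholders):

// Client64.cpp: 64-bit console program calling the 32-bit out-of-process COM server
#include <windows.h>
#include <stdio.h>
#include "LegacyServer.h"       // MIDL-generated interface definitions
#include "LegacyServer_i.c"     // MIDL-generated CLSID/IID definitions

int main()
{
    CoInitialize(NULL);

    ILegacyWrapper* pWrapper = NULL;
    // CLSCTX_LOCAL_SERVER starts the registered 32-bit EXE server in its own process
    HRESULT hr = CoCreateInstance(CLSID_LegacyWrapper, NULL, CLSCTX_LOCAL_SERVER,
                                  IID_ILegacyWrapper, (void**)&pWrapper);
    if (SUCCEEDED(hr))
    {
        LONG result = 0;
        hr = pWrapper->Add(2, 3, &result);   // call marshaled across the 64/32 boundary
        printf("Add(2, 3) = %ld (hr = 0x%08lX)\n", result, hr);
        pWrapper->Release();
    }

    CoUninitialize();
    return 0;
}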
Conclusion: the above approach solves the problem of a 64-bit process calling a 32-bit DLL.
Calling a 64-bit DLL from a 32-bit process should be solvable the same way, because 64-bit Windows installs both the 32-bit and the 64-bit COM subsystems.
Problems and thoughts on 64-bit porting
1. Pointers and long conversions
This is the most basic thing to deal with. Because addresses on a 32-bit system are 32 bits wide, conversions like this appear in a lot of code:
void* pData;
LONG  lData;
lData = (LONG)pData;
Addresses are now 64 bits, so the original conversion discards the high 4 bytes of the address.
This kind of conversion has always been considered unsafe, yet it still appears in huge amounts of code. In practice, if a programmer keeps a pointer in a pointer variable there is simply no need to stash it in a long; the same problem shows up when function pointers are stored this way.
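If an integer really has to hold a pointer, the pointer-sized types from the Windows SDK are the safe choice. A minimal sketch of the difference (KeepPointer is just an illustrative function):

#include <windows.h>             // pulls in basetsd.h, where LONG_PTR is defined

void KeepPointer(void* pData)
{
    // Unsafe on x64: LONG stays 32 bits, so the high 4 bytes of the address are lost
    // LONG lData = (LONG)pData;

    // Safe: LONG_PTR is 32 bits on x86 and 64 bits on x64
    LONG_PTR lData = (LONG_PTR)pData;
    void* pBack = (void*)lData;  // round-trips without truncation
    (void)pBack;
}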
Thinking:
Perhaps the idea that "whatever exists is reasonable" is doing mischief here: many programmers figure "this code has always run fine", so they keep writing it this way regardless of the risk, even though the safe version is only a few characters longer.
Of course, many programmers have no concept of 16-bit, 32-bit, or 64-bit at all.
Another possibility is that the Windows message mechanism is misleading: wParam and lParam often smuggle data pointers that are then force-cast, but if you look at their definitions there is an extra layer in between. It is only a small syntactic difference, yet it shows real foresight, and many people copy the pattern without understanding the subtlety behind it.
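That extra layer is visible in the SDK headers: WPARAM and LPARAM are not defined as fixed 32-bit integers but as pointer-sized types, which is what lets the same message parameters carry pointers on both 32-bit and 64-bit Windows. A simplified excerpt (comments mine):

// Simplified from basetsd.h / WinDef.h
#ifdef _WIN64
    typedef unsigned __int64 UINT_PTR;
    typedef __int64          LONG_PTR;
#else
    typedef unsigned int     UINT_PTR;
    typedef long             LONG_PTR;
#endif

typedef UINT_PTR WPARAM;   // pointer-sized, not a fixed 32-bit integer
typedef LONG_PTR LPARAM;   // so (LPARAM)pData stays lossless on x64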
2. Changes in the PE import and export tables
This came up while working on a PE hack. Earlier, someone had found that on 32-bit Vista a function that we originally needed to replace in the import table could no longer be found there, so they went and modified the export table instead. Leaving aside how dangerous modifying the export table is, when Vista x64 came along I found a small change in the PE format.
Microsoft widened the addresses in the import table to 64 bits, but the export table is still 32-bit, presumably on the assumption that a module's code will never span more than 4GB. That makes patching the export table difficult: different DLLs are loaded in different address ranges, and the gap between them can exceed 4GB, so the only option is to force your own DLL to load within the same 4GB range as the target DLL.
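The asymmetry shows up in the winnt.h structures: in PE32+ the import thunks (the IAT slots that get patched with absolute addresses) are widened to 64 bits, while the export directory still stores 32-bit RVAs. A trimmed excerpt (comments mine):

// Simplified from winnt.h
typedef struct _IMAGE_THUNK_DATA64 {
    union {
        ULONGLONG ForwarderString;
        ULONGLONG Function;        // IAT slot: patched with a full 64-bit address
        ULONGLONG Ordinal;
        ULONGLONG AddressOfData;
    } u1;
} IMAGE_THUNK_DATA64;

typedef struct _IMAGE_EXPORT_DIRECTORY {
    /* ... */
    DWORD NumberOfFunctions;
    DWORD NumberOfNames;
    DWORD AddressOfFunctions;      // RVA of an array of 32-bit RVAs, so exported
    DWORD AddressOfNames;          // code must sit within 4GB of the image base
    DWORD AddressOfNameOrdinals;
} IMAGE_EXPORT_DIRECTORY;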
Looking back, this change is extremely dangerous, because the OS tries to keep the export table globally unique; if your DLL exits without properly restoring the original values, every other program that uses that DLL will crash.
In the end I found that the function we needed to hook was listed in the delay-load import table, and a simple modification there solved the problem.
If the cause had not been found and we had gone ahead with modifying the export table, it would have been miserable; the testing department had already been seeing regular system crashes.
Thinking:
There was a detour here, and it is hard to blame anyone for the project time it cost, but the programmer's failure to analyze the problem is the bigger issue. Vista retains good compatibility in many respects; if you examine the Windows directory you will find it is basically not that different from XP, and the functionality of many low-level DLLs has not changed (which makes me doubt the published claim that 70% of Vista's code was rewritten). With just a little more care when examining the DLL, the problem would have been easy to find.
The other point is that Microsoft's definition of the PE format really was far-sighted and keeps things loosely coupled; in fact the PE format has seen no major changes since around 2001, back when we had basically no notion of 64-bit. That is something worth learning from. Still, I have reservations about the export table not being extended to 64 bits: do they really believe 4GB will never be exceeded? Didn't uncle Gates claim 640KB was enough? ^.^
3. Problems with Delayload
Delayload has been around since VC6, but most people never come into contact with it, myself included; without this incident I would never have looked at it.
I recommend this article:
http://www.microsoft.com/msj/0200/hood/hood0200.aspx
It gives a detailed explanation for anyone interested. The principle is actually very simple: a DLL is loaded only when one of its interfaces is actually used. This can save space, because some of a DLL's interfaces may never be called during the program's lifetime, and if a DLL that is never used happens to be missing, the rest of the program can still run.
It really is an ingenious design: the first call pays a slight performance penalty and the code grows slightly, but it can bring much larger space savings.
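Using it takes little more than a linker switch. A minimal sketch (MyLegacy.dll, MyLegacy.h and its Add function are placeholder names, not from the original project):

// Build roughly like:
//   cl /c DelayClient.cpp
//   link DelayClient.obj MyLegacy.lib delayimp.lib /DELAYLOAD:MyLegacy.dll
#include "MyLegacy.h"     // declares int Add(int, int) exported by MyLegacy.dll

int main()
{
    // MyLegacy.dll is not mapped at process start; the helper from delayimp.lib
    // (__delayLoadHelper2) loads it and resolves Add on this first call.
    return Add(1, 2);
}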
But then the question arises: why is such a useful feature so unfamiliar? Personally I see the following reasons:
1. The technique is obviously too low-level for most people; how many care about how a compiled program is stored on disk and how it gets loaded and run?
2. The benefits of Delayload are invisible to many people: as long as my program compiles and runs correctly, why bother saving some space or improving compatibility?
3. Hardware is so good now, why bother with this optimization? If memory usage is high, just buy another 1GB of RAM and the problem is solved, right?
4. The build system is not smart enough; perhaps it would be better if Visual Studio were more intelligent and could automatically analyze the program and generate delay loads?