25 years ago, how did developers cram a game into so little memory?
The question was read by 500,000 people, and Dave Baggett's answer to it received more than 6,000 upvotes, including from well-known figures in the games industry.
Crash Bandicoot game cover
Problem description
Home game consoles of that era shipped software on 64K to 128K cartridges, yet offered a rich variety of graphics, sprites, and sound that could keep you playing for hours. It is astonishing that so much functionality (utility functions, libraries, hardware acceleration, graphics routines, and so on), plus all the images, music, sound effects, animation, and game algorithms, could be squeezed into such a small amount of storage, not to mention 25 years ago!
That amount of storage now seems roughly equivalent to a single medium-resolution, moderately compressed JPEG file. I am very curious about what software development was like in that era. I am fairly sure that developers back then had no so-called open-source collaborative development environment, and it was hardly an era in which that kind of software development could reap huge economic returns.
I am curious what knowledge, techniques, ideas, or insights the developers of that era relied on to build these games. Is it possible that some of those ideas have been lost, or were simply never recorded? There were so many kinds of video games, and millions of people spent hundreds of hours mastering them; developing games that efficiently is clearly a feat. That efficiency also reminds me of demoscene demos. Putting this together, my question is: how can one better understand the computer-science principles and techniques involved, for example how a 64K demo is written?
The crux of this question is the professional skill of that era: why were developers so successful back then? Which of the techniques, solutions, or algorithmic tricks they used have since been lost?
Dave Baggett's answer
Here is an anecdote from the late 1990s. At the time, Andy Gavin and I were writing Crash Bandicoot for the PS1.
At the time, RAM was still the main problem. The PS1 had only 2MB of RAM, so we had to do crazy things to make the game fit the hardware. Our game data was about 10MB in size, so it had to be paged in and out dynamically, without any hitches, that is, without loading lags that would drop the frame rate below 30Hz.
It worked mainly because Andy wrote an incredible paging system that swapped 64K data pages in and out as Crash traversed a level. This was a "full stack" achievement for its time: the paging system ran the gamut from high-level memory management down to opcode-level DMA (Direct Memory Access) code. Andy even controlled the physical layout of the bytes on the CD-ROM disc, so that even at the disc's 300KB/s read rate, the PS1 could have the right data loaded by the time the game reached a given location.
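To make the idea concrete, here is a minimal sketch in C of a fixed pool of resident 64K page slots. It is purely illustrative, not the actual Crash Bandicoot code: the names page_slot, get_page, and load_page_from_cd are hypothetical, and the real system overlapped loads with gameplay via DMA and chose what to evict from the level layout rather than round-robin.

#include <stdint.h>

#define PAGE_SIZE (64 * 1024)   /* one 64K data page                 */
#define NUM_SLOTS 16            /* fixed pool of resident page slots */

/* One resident slot: which on-disc page it holds, plus the bytes. */
typedef struct {
    int     page_id;            /* -1 means the slot is empty */
    uint8_t data[PAGE_SIZE];
} page_slot;

static page_slot slots[NUM_SLOTS];

/* Hypothetical stand-in for the DMA/CD read that streams one page. */
extern void load_page_from_cd(int page_id, uint8_t *dest);

void pages_init(void)
{
    for (int i = 0; i < NUM_SLOTS; i++)
        slots[i].page_id = -1;
}

/* Return the requested page, loading it into a slot if necessary.
   Eviction here is plain round-robin, only to keep the sketch short. */
uint8_t *get_page(int page_id)
{
    static int next_victim = 0;

    for (int i = 0; i < NUM_SLOTS; i++)
        if (slots[i].page_id == page_id)
            return slots[i].data;          /* already resident */

    int victim = next_victim;
    next_victim = (next_victim + 1) % NUM_SLOTS;

    load_page_from_cd(page_id, slots[victim].data);
    slots[victim].page_id = page_id;
    return slots[victim].data;
}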
I was mainly responsible for writing the packing tools. Their job was to take the resource files, such as sounds, art, and the Lisp control code for the critters, and pack them into 64K data pages to feed Andy's system. (Incidentally, this problem, packing objects of arbitrary size into fixed-size pages, is NP-complete, so it is most likely impossible to find the optimal solution in a reasonable, let alone linear, amount of time.)
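As a concrete illustration of the packing problem, here is a small first-fit sketch in C. It is an assumption made for illustration, not the actual tool: each resource goes into the first 64K page that still has room, and a new page is opened only when nothing fits (each resource is assumed to fit in a single page).

#include <stdio.h>

#define PAGE_SIZE 65536   /* each data page is 64K */
#define MAX_PAGES 64

/* First-fit: place each resource in the first page with enough room
   left; open a new page only when nothing fits.  Returns the number
   of pages used, or -1 if MAX_PAGES is exceeded. */
int first_fit(const int *sizes, int n, int page_free[MAX_PAGES])
{
    int pages = 0;

    for (int i = 0; i < n; i++) {
        int p;
        for (p = 0; p < pages; p++)
            if (page_free[p] >= sizes[i])
                break;
        if (p == pages) {                 /* need a fresh page */
            if (pages == MAX_PAGES)
                return -1;
            page_free[pages++] = PAGE_SIZE;
        }
        page_free[p] -= sizes[i];
    }
    return pages;
}

int main(void)
{
    /* made-up resource sizes in bytes (sounds, art, control code...) */
    int sizes[] = { 40000, 30000, 20000, 50000, 15000, 10000 };
    int page_free[MAX_PAGES];

    printf("pages used: %d\n",
           first_fit(sizes, sizeof sizes / sizeof sizes[0], page_free));
    return 0;
}

Best-fit differs only in choosing, among the pages that fit, the one with the least space left over. Neither heuristic is guaranteed to hit the optimum, which is exactly where the NP-completeness bites.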
Even so, since no single algorithm was right for every case, my packing tool used a variety of algorithms (first-fit, best-fit, and so on) to hunt for the best packing, including a stochastic search akin to the gradient-descent process used in simulated annealing. Basically, I wrote a whole bunch of different packing strategies, tried each one, and kept whichever gave the best result.
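One simple way to realize the "try a bunch of strategies and keep the best" idea is a random-restart search over the item order: shuffle the resources, pack the shuffled order with first-fit, and remember the fewest pages seen. The sketch below is a hypothetical stand-in for that idea, not the original packer, and it is far cruder than real simulated annealing (there is no temperature schedule, just restarts).

#include <stdio.h>
#include <stdlib.h>

#define PAGE_SIZE 65536
#define MAX_PAGES 64

/* Compact first-fit: pages needed for the items in the given order. */
static int pack_count(const int *sizes, int n)
{
    int free_bytes[MAX_PAGES], pages = 0;

    for (int i = 0; i < n; i++) {
        int p;
        for (p = 0; p < pages; p++)
            if (free_bytes[p] >= sizes[i])
                break;
        if (p == pages) {
            if (pages == MAX_PAGES)
                return MAX_PAGES + 1;      /* treat overflow as "bad" */
            free_bytes[pages++] = PAGE_SIZE;
        }
        free_bytes[p] -= sizes[i];
    }
    return pages;
}

/* Random-restart search: shuffle the item order many times, pack each
   ordering with first-fit, and keep the best (fewest pages) found. */
int stochastic_pack(int *sizes, int n, int tries)
{
    int best = pack_count(sizes, n);

    for (int t = 0; t < tries; t++) {
        for (int i = n - 1; i > 0; i--) {       /* Fisher-Yates shuffle */
            int j = rand() % (i + 1);
            int tmp = sizes[i]; sizes[i] = sizes[j]; sizes[j] = tmp;
        }
        int pages = pack_count(sizes, n);
        if (pages < best)
            best = pages;
    }
    return best;
}

int main(void)
{
    int sizes[] = { 40000, 30000, 20000, 50000, 15000, 10000, 33000, 27000 };

    srand(12345);   /* fixed seed: same shuffles, same result every run */
    printf("best packing: %d pages\n",
           stochastic_pack(sizes, sizeof sizes / sizeof sizes[0], 1000));
    return 0;
}

Note the fixed srand seed: with it, every run reproduces the same shuffles and the same answer. Reseed from the clock, or change the input even slightly, and a lucky packing may never be found again, which is exactly the trouble described next.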
The problem with a random guided search like that, though, is that you never know whether you will get the same result again. Some Crash Bandicoot levels fit within the maximum allowed number of pages (21, as I recall) only because the stochastic packer happened to "get lucky". That meant that once a level was packed, you might change the code for a turtle and never be able to find a 21-page packing again. A few times an artist would modify some content and blow out the page count, and we would have to change other things semi-randomly until the packer found a workable packing again. And I had to explain all of this to a grumpy artist at 3 o'clock in the morning.
Looking back, the best part of that period, and also the worst, was squeezing down the core C/assembly code. We were only days away from the final release, and those days were our last chance to make the holiday season before we lost an entire year. We were randomly permuting C code that was semantically identical but syntactically different, hoping the compiler would produce code that was 200 bytes smaller, then 125, then 50, and finally fewer than 8 bytes smaller.
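To illustrate what "semantically identical but syntactically different" means in practice, here is a contrived pair of routines (hypothetical code, not from the game): both clear x bytes, but the second form drops the separate index variable, which may let a compiler of that era reuse a register and emit slightly smaller machine code.

/* Variant A: the straightforward form. */
void clear_a(unsigned char *buf, int x)
{
    int i;
    for (i = 0; i < x; i++)
        buf[i] = 0;
}

/* Variant B: semantically identical for non-negative x, but
   syntactically different; count x down and bump the pointer instead
   of keeping a separate index. */
void clear_b(unsigned char *buf, int x)
{
    while (x--)
        *buf++ = 0;
}

Whether variant B actually comes out smaller depends entirely on the compiler and its flags, which is why the only way to know was to try a permutation and measure the result.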
Permuting the form meant things like taking "for (i=0; i < x; i++)" and asking: what happens if we rewrite it as a while loop that reuses a variable we already used above for something else? This was after we had already exhausted all the usual tricks, such as stuffing data into the lowest two bits of pointers (which only works on the R3000 because all addresses are 4-byte aligned).
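The pointer trick can be sketched like this (a hypothetical illustration, written with C99's uintptr_t rather than the plain casts 1990s code would have used): because every word address on the R3000 is 4-byte aligned, the low two bits of such a pointer are always zero, so they can carry two bits of data as long as they are masked off before the pointer is used.

#include <assert.h>
#include <stdint.h>

/* Pack a 2-bit tag into the low bits of a 4-byte-aligned pointer. */
void *tag_ptr(void *p, unsigned tag)
{
    uintptr_t bits = (uintptr_t)p;
    assert((bits & 0x3) == 0 && tag <= 3);   /* pointer must be aligned */
    return (void *)(bits | tag);
}

/* Read back the two smuggled bits. */
unsigned get_tag(void *p)
{
    return (unsigned)((uintptr_t)p & 0x3);
}

/* Mask the tag bits off before dereferencing the pointer. */
void *strip_tag(void *p)
{
    return (void *)((uintptr_t)p & ~(uintptr_t)0x3);
}

A per-object flag that would otherwise need its own byte (or a whole word, after structure padding) rides along for free inside a pointer the structure already stores.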
In the end, Crash Bandicoot fit into the PS1's memory with 4 bytes to spare. Yes, 4 bytes out of 2,097,152. Those were good times.