I see a lot of people keep asking this question.
From what I've heard, the underlying KVM implementation supposedly doesn't really let J2ME stream video/audio, though I never understood the reasoning behind that claim.
Now someone abroad has proposed the approach below, and he claims to have gotten it working on a real Nokia 6260 [device data: Nokia6260/2.0 (3.0436.0) SymbianOS/7.0s Series60/2.1 Configuration/CLDC-1.0], tested over two network modes (both Bluetooth and GPRS). I suspect, though, that his premise is: "your phone must allow multiple Player instances to enter the prefetched state (pre-reading the sound stream) at the same time." The steps:
Step 1:
Declare two Players, A and B;
Step 2:
An HttpConnection requests the first segment of the audio file from the server; we set the number of bytes to read to 18 KB;
Step 3:
Once the first segment of data has arrived, Player A realizes, prefetches, and starts playing;
Step 4:
While Player A is playing (18 KB of AMR data lasts about 10 seconds), the HttpConnection goes on to request the second segment of data (assuming GPRS transfers about 3 KB per second, 18 KB takes about 6 seconds to download; even counting connection setup and teardown overhead, it should not exceed 10 seconds);
Step 5:
When the second segment of data is in place, and assuming Player A has not yet finished playing (an assumption you have to tune against your own data and bit rates), feed the data to Player B and let it realize and prefetch;
Step 6:
When Player A finishes playing, we get the event notification and Player B starts playing.
And so on, alternating back and forth, as in the sketch below.
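Here is a minimal, hypothetical sketch of that ping-pong scheme in J2ME (MMAPI). The URL, the 18 KB chunk size, and the use of an HTTP Range header are my own assumptions, as is the current/next swap in place of explicitly named Players A and B; the server has to honor range requests, and each chunk has to be something the handset's AMR decoder accepts as a standalone clip (later chunks may need the "#!AMR\n" file header prepended).

import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;

import javax.microedition.io.Connector;
import javax.microedition.io.HttpConnection;
import javax.microedition.media.Manager;
import javax.microedition.media.MediaException;
import javax.microedition.media.Player;
import javax.microedition.media.PlayerListener;

// Hypothetical sketch of the two-Player "ping-pong" streaming scheme above.
public class PingPongStreamer implements PlayerListener {

    private static final String AUDIO_URL = "http://example.com/clip.amr"; // assumption
    private static final int CHUNK_SIZE = 18 * 1024;                       // 18 KB per step 2

    private Player current;   // the Player that is audible right now
    private Player next;      // the Player prefetching the following chunk
    private int offset = 0;   // byte offset of the next chunk to fetch

    public void startStreaming() throws IOException, MediaException {
        // Steps 2-3: fetch the first 18 KB and start the first Player immediately.
        current = buildPrefetchedPlayer(fetchChunk());
        current.addPlayerListener(this);
        current.start();

        // Steps 4-5: while it is playing, fetch and prefetch the next chunk.
        next = buildPrefetchedPlayer(fetchChunk());
    }

    // Step 6: when the audible Player hits END_OF_MEDIA, swap in the prefetched one.
    public void playerUpdate(Player p, String event, Object data) {
        if (PlayerListener.END_OF_MEDIA.equals(event) && p == current) {
            try {
                current.close();
                current = next;
                current.addPlayerListener(this);
                current.start();
                // Keep the pipeline full. In real code this blocking fetch
                // should run on its own thread, not in the event callback.
                next = buildPrefetchedPlayer(fetchChunk());
            } catch (Exception e) {
                e.printStackTrace();   // real code would surface this to the UI
            }
        }
    }

    // Requests one chunk with an HTTP Range header (assumes server support).
    private byte[] fetchChunk() throws IOException {
        HttpConnection hc = (HttpConnection) Connector.open(AUDIO_URL);
        try {
            hc.setRequestMethod(HttpConnection.GET);
            hc.setRequestProperty("Range",
                    "bytes=" + offset + "-" + (offset + CHUNK_SIZE - 1));
            DataInputStream in = new DataInputStream(hc.openInputStream());
            byte[] buf = new byte[CHUNK_SIZE];
            in.readFully(buf);   // simplification: the last chunk will be shorter
            in.close();
            offset += CHUNK_SIZE;
            return buf;
        } finally {
            hc.close();
        }
    }

    // Realizes and prefetches a Player over an in-memory chunk so it can start instantly.
    private Player buildPrefetchedPlayer(byte[] chunk) throws IOException, MediaException {
        // Caveat (assumption): chunks after the first may need the "#!AMR\n"
        // header prepended before the decoder will accept them.
        Player p = Manager.createPlayer(new ByteArrayInputStream(chunk), "audio/amr");
        p.realize();
        p.prefetch();
        return p;
    }
}

The key point is that the next Player is already sitting in the PREFETCHED state before the current one reaches END_OF_MEDIA, so the swap in playerUpdate() is nearly gapless.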
Let's see whether this theory actually holds up.
I tested it myself on a Nokia 7610, and the premise I raised above does hold: "your phone must allow multiple Player instances to enter the prefetched state (pre-reading the sound stream)." A real Nokia phone can do this: I ran two threads, each with its own Player, each calling m_player.realize() and m_player.prefetch() and then waiting.
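For reference, a small hypothetical check of that premise. The original test put one Player in each of two threads; this sketch prefetches both on a single thread for brevity, which exercises the same restriction, namely whether two Players can hold the PREFETCHED state at once. "/test.amr" is a placeholder for any small AMR clip packaged in the MIDlet JAR.

import java.io.InputStream;

import javax.microedition.media.Manager;
import javax.microedition.media.Player;

// Hypothetical check: can two Players sit in the PREFETCHED state at once?
// Call checkTwoPrefetchedPlayers() from a MIDlet; it returns true if the
// handset lets both Players reach PREFETCHED without complaint.
public class PrefetchCheck {

    public static boolean checkTwoPrefetchedPlayers() {
        Player a = null;
        Player b = null;
        try {
            a = prefetch("/test.amr");   // placeholder resource name
            b = prefetch("/test.amr");
            return a.getState() == Player.PREFETCHED
                && b.getState() == Player.PREFETCHED;
        } catch (Exception e) {
            // Typically the second prefetch() fails here on handsets that
            // refuse to hold two prefetched Players at the same time.
            return false;
        } finally {
            if (a != null) a.close();
            if (b != null) b.close();
        }
    }

    // Builds a Player over a bundled resource and walks it to PREFETCHED.
    private static Player prefetch(String resource) throws Exception {
        InputStream in = PrefetchCheck.class.getResourceAsStream(resource);
        Player p = Manager.createPlayer(in, "audio/amr");
        p.realize();
        p.prefetch();
        return p;
    }
}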