WebRTCDemo.apk code reading (7): thread relationships
Reprinted; please indicate the original source: http://blog.csdn.net/wanghorse
Besides ProcessThreadImpl, the actual worker threads of WebRTC mainly include:
1. The CreateThread(UdpSocketManagerPosixImpl::Run) thread under VoiceChannelTransport, used to send and receive network packets
2. The AudioTrackJni::PlayThreadProcess thread, used for audio playback
3. The internal thread of SLAndroidSimpleBufferQueueItf under OpenSlesInput, which mainly captures audio
4. The VoiceEngine_startSend thread: OpenSlesInput::StartRecording calls CreateThread(CbThread) to create a thread that processes the audio packets captured by the hardware
5. The ThreadWrapper::CreateThread(ChannelDecodeThreadFunction) thread in ViEChannel, used for video decoding
6. The CreateThread(IncomingVideoStream::IncomingVideoStreamProcess) thread of VideoEngine_startRender, used to fetch decoded video data
7. The VideoEngine_startRender thread CreateThread(VideoRenderAndroid::JavaRenderThreadProcess), used for hardware-level rendering
8. The Java thread, which calls ProvideCameraFrame
9. The VideoEngine_allocateCaptureDevice thread CreateThread(ViECapturer::ViECaptureProcess), used to process the captured camera data
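All of the CreateThread(...) calls above follow the same pattern: a wrapper object spawns a worker thread that repeatedly invokes a "run function" (such as UdpSocketManagerPosixImpl::Run or ViECapturer::ViECaptureProcess) until the function returns false or a stop is requested. The sketch below illustrates that pattern with standard C++ threads; the names ThreadWrapper, RunFunction, and CountingRun are simplified stand-ins, not WebRTC's real implementation.

```cpp
#include <atomic>
#include <thread>

// Signature of a WebRTC-style run function: called repeatedly on the worker
// thread; returning false ends the loop.
using RunFunction = bool (*)(void* obj);

class ThreadWrapper {
 public:
  ThreadWrapper(RunFunction func, void* obj) : func_(func), obj_(obj) {}

  void Start() {
    stop_requested_ = false;
    thread_ = std::thread([this] {
      // Keep calling the run function until it returns false or Stop() is
      // requested -- the same loop shape as the nine threads listed above.
      while (!stop_requested_ && func_(obj_)) {
      }
    });
  }

  // Ask the loop to exit and wait for the thread to finish.
  void Stop() {
    stop_requested_ = true;
    Join();
  }

  // Wait for the run function itself to return false.
  void Join() {
    if (thread_.joinable()) thread_.join();
  }

 private:
  RunFunction func_;
  void* obj_;
  std::atomic<bool> stop_requested_{false};
  std::thread thread_;
};

// Toy run function: increments a counter and stops itself after 5 iterations.
static bool CountingRun(void* obj) {
  int* counter = static_cast<int*>(obj);
  return ++(*counter) < 5;  // returning false ends the worker loop
}

int RunDemo() {
  int counter = 0;
  ThreadWrapper t(&CountingRun, &counter);
  t.Start();
  t.Join();  // blocks until CountingRun returns false
  return counter;
}
```

The real wrappers add priorities, names, and platform-specific thread creation, but the run-until-false loop is the core idea.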
1. For audio receiving: the network audio packets are received, RTP packets are parsed, and NetEQ processing and queuing are done directly in thread 1; decoding, speech synthesis, and playback happen in thread 2.
2. For audio sending: audio capture is completed in thread 3 (inside the system API) and placed in WebRTC's queue; audio processing, encoding, packetization, and sending are handled in thread 4.
3. For video receiving: the network video packets are received and RTP is parsed in thread 1; video decoding is performed in thread 5; the decoded data is handled in thread 6; hardware-level rendering is completed in thread 7.
4. For video sending: the hardware camera data is captured in thread 8; thread 9 processes the camera data, encodes it, packetizes it into RTP, and sends it.
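In both sending paths, one thread produces media into a queue and another drains it for encoding and sending (thread 3 feeding thread 4 for audio, thread 8 feeding thread 9 for video). A minimal sketch of that hand-off, using an illustrative FrameQueue class (not WebRTC's actual queue):

```cpp
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>

// Hypothetical blocking queue between a capture thread and a processing
// thread. Names are illustrative only.
class FrameQueue {
 public:
  void Push(int frame) {
    {
      std::lock_guard<std::mutex> lock(mutex_);
      frames_.push(frame);
    }
    cond_.notify_one();
  }

  // Blocks until a frame is available; returns false once the queue is
  // closed and fully drained.
  bool Pop(int* frame) {
    std::unique_lock<std::mutex> lock(mutex_);
    cond_.wait(lock, [this] { return !frames_.empty() || closed_; });
    if (frames_.empty()) return false;
    *frame = frames_.front();
    frames_.pop();
    return true;
  }

  void Close() {
    {
      std::lock_guard<std::mutex> lock(mutex_);
      closed_ = true;
    }
    cond_.notify_all();
  }

 private:
  std::mutex mutex_;
  std::condition_variable cond_;
  std::queue<int> frames_;
  bool closed_ = false;
};

// Capture thread produces 10 frames (like ProvideCameraFrame delivering
// camera data); the processing thread drains them (standing in for
// encode + RTP packetize + send).
int ProcessedFrameCount() {
  FrameQueue queue;
  int processed = 0;
  std::thread capture([&queue] {
    for (int i = 0; i < 10; ++i) queue.Push(i);
    queue.Close();  // no more frames: let the consumer exit
  });
  std::thread process([&queue, &processed] {
    int frame;
    while (queue.Pop(&frame)) ++processed;
  });
  capture.join();
  process.join();
  return processed;
}
```

Decoupling capture from encoding this way keeps the capture callback (which runs inside the system API) short, so the hardware is never blocked waiting on the encoder.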