This article introduces WebRTC (translated and collated; translator: weizhenwei, proofreader: blacker), originally published on "Weaving Wind Net".
Please support the original and credit the source when reprinting. You are welcome to follow my public account blacker (ID: blackerteam or WEBRTCORGCN).
Technically speaking, using a webcam for online broadcasting does not require WebRTC. The camera is itself a server: it can connect to a router and stream video content over the network. So why do we need WebRTC?
There are at least two reasons for this:
1. As more and more viewers watch the broadcast, network bandwidth gradually becomes inadequate. If the audience keeps growing, the webcam itself also becomes a bottleneck.
2. As mentioned above, the webcam itself is a server. But what protocol does it use to deliver video to a browser or mobile device? Most likely, the camera streams video over HTTP, frame by frame, as JPEG images. However, HTTP is not well suited to real-time video streaming. In a video-on-demand scenario, where interactivity and low latency do not matter, HTTP performs perfectly well: if you are watching a movie, a delay of 5-10 seconds does not hurt the viewing experience, unless you are watching it together with someone else. For example, Alice and Bob watch the same movie while chatting, and Alice, whose stream is a few seconds ahead, blurts out "Oh my God, Jack murdered her!" before Bob has seen the key scene. That is a spoiler.
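To make the "JPEG frames over HTTP" delivery concrete, here is a minimal sketch, not the camera's actual protocol: assuming the frames simply arrive concatenated in the byte stream, a naive way to split them out is to scan for the JPEG SOI (FF D8) and EOI (FF D9) markers. Real parsers rely on the multipart boundary and Content-Length instead, since these marker bytes can in principle occur inside entropy-coded data.

```python
def extract_jpeg_frames(data: bytes) -> list[bytes]:
    """Split a byte stream into JPEG frames by scanning for the
    SOI (FF D8) and EOI (FF D9) markers.

    Naive illustration only: assumes the markers never occur inside
    a frame's entropy-coded data.
    """
    frames = []
    pos = 0
    while True:
        start = data.find(b"\xff\xd8", pos)   # start of image
        if start == -1:
            break
        end = data.find(b"\xff\xd9", start + 2)  # end of image
        if end == -1:
            break  # incomplete trailing frame, wait for more bytes
        frames.append(data[start:end + 2])
        pos = end + 2
    return frames
```

In a real client this function would be fed incrementally from the HTTP socket, carrying any incomplete tail over to the next read.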
Another option is to use the RTSP/RTP transport protocol with the H.264 video codec. However, in this case the browser needs a video player plugin installed beforehand, such as VLC or QuickTime. These plugins can receive and play the video like a standalone player. But we want a truly browser-based video stream, without the help of any plugin.
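With RTSP/RTP, RTSP only negotiates the session; the media itself travels in RTP packets, each beginning with a fixed 12-byte header defined in RFC 3550. As a sketch of what this transport looks like on the wire, the function below parses that header from a synthetic packet; for H.264 the payload type is typically a dynamic value such as 96, agreed in the SDP exchange:

```python
import struct

def parse_rtp_header(packet: bytes) -> dict:
    """Parse the fixed 12-byte RTP header (RFC 3550, section 5.1)."""
    if len(packet) < 12:
        raise ValueError("packet shorter than the fixed RTP header")
    b0, b1, seq, ts, ssrc = struct.unpack("!BBHII", packet[:12])
    return {
        "version": b0 >> 6,            # always 2 for current RTP
        "padding": bool(b0 & 0x20),
        "extension": bool(b0 & 0x10),
        "csrc_count": b0 & 0x0F,
        "marker": bool(b1 & 0x80),     # e.g. last packet of a video frame
        "payload_type": b1 & 0x7F,     # e.g. 96 (dynamic) for H.264
        "sequence": seq,               # for loss detection / reordering
        "timestamp": ts,               # 90 kHz clock for video
        "ssrc": ssrc,                  # stream identifier
    }
```

Sequence numbers and timestamps are exactly what lets an RTP receiver reorder and pace frames, which plain HTTP delivery of JPEGs cannot do.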
First, let's get to know the webcam we are going to use and find out exactly what it sends to the browser. As our test subject, we use a D-Link DCS-7010L webcam:
You can read more about the camera's installation and configuration in later chapters; here we just examine how it delivers the video stream. When we log in to the camera's web management interface, we see a picture like this:
The image opens in every browser at roughly the same frame rate. If the camera and the computers are all connected to the same router, playback should be smooth, but in fact it is not. HTTP seems to be the reason; let's run Wireshark to confirm this suspicion:
Here we see a sequence of TCP segments, each 1514 bytes long:
Finally, an HTTP 200 OK response message carries the length (Content-Length) of the JPEG image received:
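The response for each frame has roughly the following shape (the header values here are illustrative, not captured from the actual device):

```http
HTTP/1.1 200 OK
Content-Type: image/jpeg
Content-Length: 23459
Connection: keep-alive

<23459 bytes of JPEG data>
```

So every displayed frame costs a full request/response round trip, which is where the overhead comes from.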
Then we open Chrome's Developer Tools (Network tab) to watch the real-time HTTP GET requests and the images transmitted over HTTP:
This is not the HTTP stream we hoped for: playback is not fluent, and the HTTP requests jitter. How many such HTTP requests can the camera handle concurrently? We suspect that with more than 10 concurrent requests the camera will stall or keep failing, and the delay will become noticeable.
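To get an intuition for this kind of concurrency test, here is a self-contained sketch that stands up a local stub server in place of the camera (the real DCS-7010L would be the actual target; the URL, port, and frame bytes here are fabricated for the experiment) and fires N GET requests at once:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

# Stand-in for a real JPEG frame served by the camera.
FAKE_JPEG = b"\xff\xd8" + b"\x00" * 64 + b"\xff\xd9"

class FrameHandler(BaseHTTPRequestHandler):
    """Answers every GET with one fake JPEG frame, like an MJPEG snapshot URL."""
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "image/jpeg")
        self.send_header("Content-Length", str(len(FAKE_JPEG)))
        self.end_headers()
        self.wfile.write(FAKE_JPEG)

    def log_message(self, fmt, *args):
        pass  # silence per-request logging

def hammer(url: str, clients: int) -> int:
    """Fire `clients` concurrent GETs at `url`; return how many succeeded."""
    ok = [0] * clients

    def fetch(i: int):
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                if resp.status == 200 and resp.read().startswith(b"\xff\xd8"):
                    ok[i] = 1
        except OSError:
            pass  # timeout / refused: the "camera" dropped this viewer

    threads = [threading.Thread(target=fetch, args=(i,)) for i in range(clients)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sum(ok)

if __name__ == "__main__":
    server = ThreadingHTTPServer(("127.0.0.1", 0), FrameHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    url = f"http://127.0.0.1:{server.server_address[1]}/image.jpg"
    print(hammer(url, 10), "of 10 requests succeeded")
    server.shutdown()
```

Pointed at an embedded camera instead of this stub, the success count and per-request latency would reveal where the device starts to buckle.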
Let's take a look at the HTML source of the webcam's web management interface, where you can find code like the following:
We need the RTSP/RTP protocol to play the video smoothly. But does that work in the browser? No! Well, it might work after installing the QuickTime plugin, but we want a pure, plugin-free browser video stream.
Another notable option is Flash Player, which can receive an RTMP stream converted by Wowza from RTSP/RTP/H.264. But Flash Player is also a browser plugin, albeit far more widespread than VLC or QuickTime.
In our scenario, we take the same RTSP/RTP stream but play it in a WebRTC-compatible browser, without any additional plugins. We will set up a conversion server that pulls the video stream from the webcam and broadcasts it over the Internet to users watching in WebRTC browsers, with no limit on the number of viewers.
Android/iOS WebRTC Audio and Video Development Summary (83): Broadcasting webcam video with WebRTC (Part 1)