1 WiFi Display Introduction
1.1 WiFi Display Overview
WiFi Display (WFD) is a concept proposed by the Wi-Fi Alliance on the basis of existing technologies, intended to make video/audio transmission and sharing easier and faster. The Wi-Fi Alliance set up a certification program for it, Miracast, which certifies whether a device supports the WiFi Display functionality.
Figure 1 shows the technologies that support the WiFi Display function. The most important of them is WiFi Direct: two devices connect directly without an AP (access point), which is the premise that lets two WiFi-enabled devices deliver high-quality/HD video at any time. In addition, several other underlying technologies are required:
11n: the 802.11n protocol, supporting transmission speeds of up to 540 Mbit/s;
WMM: short for WiFi Multimedia; it guarantees stable, high-quality transmission for different kinds of data content;
WPA2: the Wi-Fi Alliance certification program for the 802.11i protocol and its stronger encryption algorithms;
WiFi Protected Setup: another Wi-Fi Alliance certification program, meant to simplify wireless LAN installation and security configuration for users;
WiFi Direct: indicates that devices can connect directly without the participation of an AP;
WiFi Miracast: the certification program that verifies whether the WiFi Display function can be implemented.
Figure 1 WiFi Display technical support architecture
In addition, the Wi-Fi Alliance describes a simplified working model of WiFi Display (Figure 2). In this model, Miracast defines the device that transmits the video/audio data as the source side, and the device that receives the data and re-renders it as the sink side. As the figure shows, the source side must be able to store, download, or generate data content and to encode that data; the sink side must be able to decode the data and re-present the video/audio. Miracast defines how the two devices maintain a session, the standard formats in which data can be transmitted, session control, and related matters.
Figure 2 Working model of WiFi Display
1.2 Important WiFi Display Specifications and Standards
The Wi-Fi Alliance defines the video/audio format standards that Miracast supports:
Figure 3 Display, video, and audio format standards supported by Miracast
At the same time, Miracast also standardizes device negotiation (Figure 4) and session establishment (Figure 5). These describe in detail how, after establishing a physical connection, two devices complete a WiFi Display session through a set of standard steps and then start transferring data. For more information on each step, see the official Miracast documentation.
Figure 4 Device negotiation procedure defined by Miracast
Figure 5 Display session establishment procedure defined by Miracast
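As an aid to reading Figure 5, the sketch below lists the RTSP message exchange commonly labeled M1-M7 in the Miracast specification, as the author understands it. The wrapper class is purely illustrative and the one-line summaries are paraphrases, not protocol text.

    // Illustrative summary of the Miracast RTSP negotiation and session
    // setup (M1-M7). Not a runnable protocol implementation.
    final class WfdSessionSteps {
        static final String[] STEPS = {
            "M1: source -> sink  RTSP OPTIONS  (source asks which methods the sink supports)",
            "M2: sink -> source  RTSP OPTIONS  (sink asks the same of the source)",
            "M3: source -> sink  GET_PARAMETER (query capabilities, e.g. wfd_video_formats)",
            "M4: source -> sink  SET_PARAMETER (source picks the formats and RTP port to use)",
            "M5: source -> sink  SET_PARAMETER wfd_trigger_method: SETUP",
            "M6: sink -> source  RTSP SETUP    (establishes the RTP transport)",
            "M7: sink -> source  RTSP PLAY     (starts streaming)",
        };
        public static void main(String[] args) {
            for (String s : STEPS) System.out.println(s);
        }
    }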
2 Introduction to the Main Modules
Because the WFD function mainly involves the WiFi P2P function and the display function, this section introduces the two Android modules involved: WifiP2pService and SurfaceFlinger.
2.1 WiFi P2P
2.1.1 WiFi P2P Introduction
WiFi P2P is an important technical specification proposed by the Wi-Fi Alliance that defines how two WiFi devices can connect and communicate without a router. By definition, a device that supports WiFi P2P acts as either a P2P Group Owner or a P2P Client in order to form a P2P Group:
Figure 6 WiFi P2P workgroup model
The P2P Group Owner device plays the role of a traditional router: it controls the WiFi P2P workgroup, enables communication between devices, and so on. P2P Client devices connect to the P2P Group Owner device to form the workgroup and communicate.
Based on the above working model, WiFi P2P specifies the following technical items:
Figure 7 The P2P Discovery specification defined by WiFi P2P
The P2P Discovery specification defines the details of discovering devices (Device Discovery) and building a workgroup (Group Formation). To discover devices, a device first enters the scan phase (Scan Phase), in which it sends Probe Request frames, and then enters the find phase (Find Phase), in which it alternates between the Search state and the Listen state: across the two phases it sends Probe Request frames, listens for Probe Request frames, and sends Probe Response frames. Once a nearby P2P device has been found, a workgroup can be built. This includes negotiating the Group Owner (GO Negotiation) and exchanging security configuration information (Provisioning) so that the client device can use that configuration to connect to the GO.
In addition, WiFi P2P defines the Group Operation technical item, which describes the scenarios and processes for interacting with a workgroup, for example:
· A P2P Device joins a GO (Group Owner)
· A P2P Device joins a GC (Group Client)
· A GC invites a P2P Device
· A GO invites a P2P Device
· ...
2.1.2 WiFi P2P in Android
The WiFi P2P module in Android mainly involves the following parts:
Figure 8 Modules involved in WiFi P2P on Android
WifiP2pSettings is the part that interacts with the user: it mainly provides the UI that lets the user enable/disable the WiFi P2P function, presents discovered P2P devices, and so on. WifiP2pSettings implements its functionality by calling the interfaces in WifiP2pManager, and WifiP2pManager in turn interacts with WifiP2pService; they rely on the Android Binder mechanism for inter-process communication. WifiP2pService is the core module for managing the WiFi P2P functionality in Android:
Figure 9 Modules involved in WiFi P2P on Android
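As a concrete illustration of the WifiP2pManager path just described, here is a minimal sketch of how a client such as WifiP2pSettings can request peer discovery. It uses only the public WifiP2pManager API; permission checks, broadcast-receiver registration, and error handling are omitted.

    import android.app.Activity;
    import android.content.Context;
    import android.net.wifi.p2p.WifiP2pManager;

    public class P2pDiscoveryDemo extends Activity {
        void startDiscovery() {
            WifiP2pManager manager =
                    (WifiP2pManager) getSystemService(Context.WIFI_P2P_SERVICE);
            final WifiP2pManager.Channel channel =
                    manager.initialize(this, getMainLooper(), null);
            manager.discoverPeers(channel, new WifiP2pManager.ActionListener() {
                @Override public void onSuccess() { /* request accepted; results come via broadcast */ }
                @Override public void onFailure(int reason) { /* e.g. BUSY or P2P_UNSUPPORTED */ }
            });
            // After a WIFI_P2P_PEERS_CHANGED_ACTION broadcast, fetch the list:
            manager.requestPeers(channel,
                    peers -> { /* peers.getDeviceList() holds the discovered WifiP2pDevice entries */ });
        }
    }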
Inside the WifiP2pService class there is an inner class, P2pStateMachine, which manages the different states of WiFi P2P and performs the appropriate actions in each state. In addition, WifiP2pService creates a WifiMonitor object to receive messages from wpa_supplicant and drive the state machine by forwarding the received messages to P2pStateMachine. As the figure shows, WifiP2pService and WifiMonitor both interact with the interfaces in WifiNative.java, which are all native functions: because the wpa_supplicant process is implemented in C, Android calls into the local interfaces in android_net_wifi_Wifi.cpp through the JNI mechanism, and these local interfaces ultimately interact with wpa_supplicant (wifi.c packages the messages sent to wpa_supplicant).
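The state-machine pattern is easier to see in miniature. The sketch below is a deliberately tiny, hypothetical rendering of the idea: WifiMonitor feeds supplicant event strings into a state machine whose current state decides the reaction. The real P2pStateMachine in WifiP2pService.java is far richer; only the event prefixes shown here are actual wpa_supplicant event names.

    // Hypothetical miniature of the P2pStateMachine pattern.
    enum P2pState { INACTIVE, DISCOVERING, GROUP_NEGOTIATING, CONNECTED }

    class TinyP2pStateMachine {
        private P2pState state = P2pState.INACTIVE;

        // Called by the monitor thread for each supplicant event string.
        void onSupplicantEvent(String event) {
            switch (state) {
                case DISCOVERING:
                    if (event.startsWith("P2P-DEVICE-FOUND")) {
                        // record the peer; stay in DISCOVERING
                    } else if (event.startsWith("P2P-GO-NEG-REQUEST")) {
                        state = P2pState.GROUP_NEGOTIATING;
                    }
                    break;
                case GROUP_NEGOTIATING:
                    if (event.startsWith("P2P-GROUP-STARTED")) {
                        state = P2pState.CONNECTED;
                    }
                    break;
                default:
                    break;
            }
        }
    }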
wpa_supplicant is an open-source project that implements many of the features in the Wi-Fi Alliance specifications; see its official documentation for details. Figure 10 is a sketch of wpa_supplicant and the layers beneath it:
Figure 10 Sketch of wpa_supplicant and the layers beneath it
2.2 SurfaceFlinger
As is well known, SurfaceFlinger is the Android module that manages graphics output to the framebuffer: it collects the image data drawn by all applications in the system and composites it onto the physical screen. Figure 11 is a simplified block diagram of the Android display system:
Figure 11 Simplified block diagram of the Android display system
Figure 11 depicts the scenario under the OpenGL ES graphics development specification. First, each application draws into its own window, where the window (Window-2) actually corresponds to a set of buffers; SurfaceFlinger then manages those buffers centrally. Finally, each frame's graphic data is sent to the framebuffer and actually displayed on the device.
For upper-level applications, Window-2 is implemented in Android by the SurfaceTextureClient class. Because this sits inside the OpenGL ES development framework, SurfaceTextureClient actually inherits from ANativeWindow and implements some of its interfaces locally (see the EGL documentation):
Figure 12 Window implementation for upper-level applications
hook_dequeueBuffer() and hook_queueBuffer() are local functions that the framework requires an implementation to provide: through them the upper-layer application dequeues a buffer from the buffer queue, and queues the buffer back once drawing is complete. The interaction is as follows:
Figure 13 Interaction between the local window and the buffer queue
As Figure 13 shows, the local window actually obtains buffers from the BufferQueue and enqueues them back to the BufferQueue after drawing. The BufferQueue is created when the application creates a Surface (which corresponds to a Layer on the SurfaceFlinger side); through the returned ISurface object, the local window obtains an ISurfaceTexture object, through which it ultimately interacts with the BufferQueue.
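From the Java side, the same dequeue/enqueue cycle can be observed through the public Surface API. In this sketch, Surface.lockCanvas() dequeues a buffer from the underlying BufferQueue and unlockCanvasAndPost() enqueues it for the consumer; SurfaceTexture stands in for the consumer end. A sketch only; a real application would drive this from a proper render path.

    import android.graphics.Canvas;
    import android.graphics.Color;
    import android.graphics.SurfaceTexture;
    import android.view.Surface;

    class BufferQueueDemo {
        static void drawOneFrame() throws Exception {
            SurfaceTexture consumer = new SurfaceTexture(0 /* texName */);
            consumer.setOnFrameAvailableListener(
                    st -> { /* consumer side: a new buffer was queued */ });
            Surface producer = new Surface(consumer);
            Canvas canvas = producer.lockCanvas(null);  // dequeueBuffer
            try {
                canvas.drawColor(Color.BLUE);           // render into the buffer
            } finally {
                producer.unlockCanvasAndPost(canvas);   // queueBuffer
            }
        }
    }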
On the SurfaceFlinger side there is also a local window, Window-1 (Figure 11), and SurfaceTextureClient again serves as that local window. However, the object that handles the dequeue-buffer and enqueue-buffer operations invoked through the standard OpenGL ES interface is different on the SurfaceFlinger side, and so are the operations performed after a buffer is enqueued:
Figure 14 The local window on the SurfaceFlinger side
As Figure 14 shows, the local window Window-1 on the SurfaceFlinger side has its own BufferQueue object and ultimately interacts with the Gralloc module of the HAL layer to actually allocate buffers. Different types of buffers are allocated depending on their purpose: the buffers allocated for upper applications are for OpenGL drawing, whereas the buffers allocated here are for SurfaceFlinger to use as the framebuffer (see gralloc.h).
For an upper-level application, after it fills a buffer and enqueues it (a producer/consumer model), the onFrameQueued() function of the Layer is invoked. If the graphics need to be rendered, the Layer must wait for a vsync signal (see Android's Project Butter); when the vsync signal arrives, the buffers that need rendering are composited according to their Z order. On the SurfaceFlinger side, after its own buffer is enqueued, the consumer eventually calls the framebuffer HAL interface to complete the actual display.
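The vsync pacing of Project Butter is also visible to applications through the Choreographer class, which defers work until the next vsync pulse, much as SurfaceFlinger waits for vsync before compositing queued buffers. A minimal sketch:

    import android.view.Choreographer;

    class VsyncDemo {
        void scheduleFrame() {
            // Must be called on a thread with a Looper (e.g. the UI thread).
            Choreographer.getInstance().postFrameCallback(frameTimeNanos -> {
                // Invoked on the next vsync; draw or animate here,
                // then re-post to stay in step with the display.
            });
        }
    }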
3 Implementation Details under Android
Android has supported the WiFi Display feature since version 4.2. Figure 15 is Miracast's official WFD working block diagram:
Figure 15 WiFi Display working block diagram
On the basis of the above working model, Android mainly did the following work:
• Added the WiFi Display portion of Settings to give the user options to operate;
• Added the DisplayDevice class to describe different display devices, with WiFi Display treated as a virtual display device;
• Modified the SurfaceFlinger module to support WiFi Display, so that SF can put display content on different devices;
• Added DisplayManagerService for unified management of the system's display devices;
• Implemented source-side and sink-side programs that manage the session process defined by Miracast.
3.1 Application Side
For the WiFi Display feature, Android provides an interface that applications can use to display content on a second display device. For details, refer to the official documentation; the key steps are shown in Figure 16:
Figure 16 Outline of WFD application programming on Android
As you can see, two classes are mainly involved: Presentation and MediaRouter. In fact, both of them call into the functionality provided by one important class, DisplayManager, and the implementation behind the DisplayManager interface lives in DisplayManagerService.java, in another process.
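A condensed sketch of the steps in Figure 16, using only public APIs: ask DisplayManager for a presentation-capable display (a connected WiFi Display qualifies) and show a Presentation dialog on it. The layout resource R.layout.remote_ui is a hypothetical placeholder.

    import android.app.Activity;
    import android.app.Presentation;
    import android.hardware.display.DisplayManager;
    import android.view.Display;

    public class SecondScreenDemo extends Activity {
        void showOnRemoteDisplay() {
            DisplayManager dm =
                    (DisplayManager) getSystemService(DISPLAY_SERVICE);
            Display[] displays =
                    dm.getDisplays(DisplayManager.DISPLAY_CATEGORY_PRESENTATION);
            if (displays.length > 0) {
                Presentation p = new Presentation(this, displays[0]);
                p.setContentView(R.layout.remote_ui); // hypothetical layout resource
                p.show();
            }
        }
    }

Alternatively, MediaRouter.getSelectedRoute() yields a RouteInfo whose getPresentationDisplay() returns the same kind of Display object.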
3.2 DisplayManagerService and Related Classes
DisplayManagerService is the core service for managing the WiFi Display feature in Android:
Figure 17 DisplayManagerService schematic diagram
Each display device (LocalDisplayDevice, WifiDisplayDevice) has a corresponding DisplayAdapter, and DisplayManagerService manages these adapters in order to manage the different display devices. For example, a WiFi Display request initiated through DisplayManager is forwarded by DisplayManagerService to WifiDisplayAdapter, which actually carries it out. In addition, DisplayManagerService calls WindowManagerService functions directly through the WindowManagerFuncs window-management interface to refresh window contents, and likewise calls InputManagerService directly through the InputManagerFuncs interface to set the display viewport information that the input system needs.
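The adapter arrangement can be summarized in a few lines. The following is a heavily condensed paraphrase of the structure in Figure 17, not the actual AOSP code; the method names are illustrative.

    // Condensed paraphrase of the DisplayAdapter structure (illustrative).
    abstract class DisplayAdapter {
        abstract void registerLocked();   // start watching for display devices
    }

    class LocalDisplayAdapter extends DisplayAdapter {
        @Override void registerLocked() { /* enumerate the built-in panel(s) */ }
    }

    class WifiDisplayAdapter extends DisplayAdapter {
        @Override void registerLocked() { /* hand the real work to WifiDisplayController */ }
        void requestScanLocked() { /* invoked when a WiFi Display scan is requested */ }
    }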
Operations on WifiDisplayAdapter are ultimately carried out in WifiDisplayController, which implements device scanning, device connection, and other operations:
Figure 18 WifiDisplayController schematic diagram
As for the WiFi Direct connection to other devices, Android has the dedicated WifiP2pService to manage it; here the connection is made simply by invoking the interfaces provided by WifiP2pManager (which interacts with WifiP2pService).
3.3 Source/Sink Device Session Management
After the devices establish a WiFi connection, RemoteDisplay listens for the RTSP connection used for subsequent session creation and for transmitting the live stream data.
Figure 19 Schematic diagram of the process of listening for the RTSP connection
As Figure 19 shows, the listening process ultimately ends up in RemoteDisplay, that is, in the creation of a RemoteDisplay object. Creating this native RemoteDisplay object creates an ANetworkSession object to manage the networking work and an ALooper object to dispatch and process the various messages; it also creates a WifiDisplaySource object to handle those messages (since in a WiFi Display scenario most devices running Android will play the source role, the object created here is a WifiDisplaySource):
Figure 20 RemoteDisplay schematic diagram
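Conceptually, the listening step amounts to the source waiting for the sink to open the RTSP control connection. The toy sketch below illustrates just that idea with java.net; the real implementation uses the native ANetworkSession instead, and 7236 is the default WFD RTSP port per the specification.

    import java.io.IOException;
    import java.net.ServerSocket;
    import java.net.Socket;

    // Conceptual illustration only, not the real native code path.
    class RtspListenSketch {
        void listen() throws IOException {
            try (ServerSocket server = new ServerSocket(7236)) {
                Socket sink = server.accept(); // sink connects after WiFi P2P setup
                // hand the socket to the session logic (M1..M7 exchange, then RTP)
            }
        }
    }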
In addition, the source directory includes C++ class files such as PlaybackSession.cpp, MediaPuller.cpp, Converter.cpp, TSPacketizer.cpp, and Sender.cpp, together with the corresponding header files. They are responsible for the source side's session management, mirrored media reading, encoding, TS packaging, and RTP packet delivery. The Converter object contains a MediaCodec object responsible for the encoding work, and the MediaCodec object actually calls the IOMX interface through an ACodec object to encode the mirrored data using the OpenMAX media framework.
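On the Java side, the encoder that Converter wraps corresponds to a MediaCodec configured for AVC. The sketch below shows a plausible configuration; the resolution, bitrate, and frame-rate values are arbitrary illustrative choices, since the real values come from the capabilities negotiated over RTSP (M3/M4).

    import android.media.MediaCodec;
    import android.media.MediaFormat;

    class EncoderSketch {
        MediaCodec createVideoEncoder() throws Exception {
            MediaFormat format =
                    MediaFormat.createVideoFormat("video/avc", 1280, 720);
            format.setInteger(MediaFormat.KEY_BIT_RATE, 5_000_000);
            format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
            format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
            // (the real Converter also sets a color format matching
            // the graphic buffers it is fed)
            MediaCodec codec = MediaCodec.createEncoderByType("video/avc");
            codec.configure(format, null /* surface */, null /* crypto */,
                    MediaCodec.CONFIGURE_FLAG_ENCODE);
            return codec;
        }
    }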
3.4 Video/Audio Data Acquisition
Figure 21 Schematic of media data acquisition at the source end
As Figure 21 shows, the source-side program manages the video/audio data to be rendered on the remote device through the MediaSource class. SurfaceMediaSource and AudioSource both inherit from MediaSource and correspond to video data and audio data respectively. Depending on the scenario, the source-side program can add a video or audio data source to the current session, and each MediaSource gets a Converter object and a MediaPuller object for encoding and reading the media data. Figure 22 is the program fragment that obtains the video data by calling MediaSource's generic read() interface.
Figure 22 Program fragment for obtaining the video data
3.4.1 Video Data Source
As shown in Figure 22, the video data is read from a BufferQueue: the top-level application puts the video data that needs to be displayed into the BufferQueue corresponding to a Surface, and from there it is delivered to the remote device.
Figure 23 SurfaceFlinger's management of different display devices
In more detail, the Surface above is held by WifiDisplayDevice, which is also related to the changes made to SurfaceFlinger to support putting the screen onto different devices. As Figure 23 shows, a DisplayDevice abstraction now represents both the local display device and the WiFi Display device; this can be seen in the handleTransactionLocked function in SurfaceFlinger.cpp:
Figure 24 SurfaceFlinger code snippet for handling different display devices
For a virtual display (a WiFi Display device), mFramebufferSurface is empty, and the SurfaceTextureClient directly uses the Surface carried in the state information. This Surface is exactly the one registered with SurfaceFlinger in the onDisplayConnected callback when the RTSP session was established (Section 3.3 describes listening for the connection). SurfaceFlinger thus posts the upper application's display data onto this Surface, and the source-side program takes the data out of it.
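Later Android releases expose essentially this mechanism as a public API: DisplayManager.createVirtualDisplay() returns a virtual display whose composed frames land in a caller-supplied Surface, which could be an encoder's input surface. A hedged sketch, with arbitrary size/density values:

    import android.hardware.display.DisplayManager;
    import android.hardware.display.VirtualDisplay;
    import android.view.Surface;

    class MirrorSketch {
        VirtualDisplay mirrorTo(DisplayManager dm, Surface encoderInput) {
            // Frames SurfaceFlinger composes for this display go to encoderInput.
            return dm.createVirtualDisplay("wfd-sketch",
                    1280, 720, 320 /* dpi */, encoderInput,
                    DisplayManager.VIRTUAL_DISPLAY_FLAG_PRESENTATION);
        }
    }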
3.4.2 Audio Data Source
As can be seen from Figure 21, audio data is acquired through AudioRecord. For the audio system, Android abstracts the AudioTrack object and the AudioRecord object, used respectively for outputting sound data and for capturing audio data. Audio data handling in the WiFi Display scenario is likewise done without disturbing the original architecture:
Figure 25 Simplified audio architecture in the WFD scenario
AudioRecord obtains the remote interfaces of AudioFlinger and AudioPolicyService through the AudioSystem object and ultimately interacts with the HAL layer module; in effect, it completes a read operation from a pipe. AudioTrack works on the same model, except that its process writes to the corresponding pipe in the HAL layer. Among the various Android components concerned with audio, AudioPolicyService plays the central role:
Figure 26 Interaction between the AudioPolicyService core and other components
In more detail: what AudioPolicyService interacts with directly is the "policy" HAL module, which provides a set of common interfaces; these common interfaces in turn call the interfaces provided by a concrete policy-management class, here AudioPolicyManagerALSA. Finally, the real HAL module is obtained according to the invocation parameters and interacts with the hardware; once AudioFlinger has been given the specific HAL module, it can operate on it directly.
These concrete HAL modules all use the ID AUDIO_HARDWARE_MODULE_ID and are simply compiled into library files with different names. The HAL module corresponding to WiFi Display does not need to deal with any real hardware: what it actually implements is a pipe:
Figure 27 WiFi Display HAL module program fragment
Therefore, for audio data that needs to be rendered on a remote device through WiFi Display, the playback side only needs the AudioTrack interface to put the audio data into the pipe in the HAL module above, and the WiFi Display source program only needs to call the AudioRecord interface to fetch the data. The subsequent encoding and audio/video packaging are handled by dedicated processing, without disturbing the original system architecture.
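For completeness, here is a sketch of the source-side capture described above using the public AudioRecord API. The REMOTE_SUBMIX audio source (exposed as a public constant in later Android releases) reads back the system's mixed output, i.e. the read end of the pipe; it is protected by a system permission, so ordinary applications cannot use it.

    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;

    class AudioCaptureSketch {
        AudioRecord createCapture() {
            int rate = 48000;
            int minBuf = AudioRecord.getMinBufferSize(rate,
                    AudioFormat.CHANNEL_IN_STEREO,
                    AudioFormat.ENCODING_PCM_16BIT);
            // REMOTE_SUBMIX: capture the system mix (system-protected source).
            return new AudioRecord(MediaRecorder.AudioSource.REMOTE_SUBMIX,
                    rate, AudioFormat.CHANNEL_IN_STEREO,
                    AudioFormat.ENCODING_PCM_16BIT, minBuf * 2);
        }
    }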
4 WiFi Display Application Scenarios and Related Products
4.1 Main Application Scenarios
WiFi Display technology realizes wireless remote rendering of video/audio data, and many kinds of equipment involve audio/video processing. The common application scenarios are:
1. Digital player and digital TV working scenario: in this scenario, because the source device has no screen for user interaction, the sink device must interact with the user to complete device pairing, transmission control, and so on;
Figure WFD Application Scenario 1
2. A working model with a digital set-top box, a DTV, and an AP: the source device in this model is connected both to the AP and to the sink device, and the data the source side presents is obtained from the network. This is allowed by the WiFi Direct model, since the underlying P2P connection actually corresponds to a virtual address;
Figure WFD Application Scenario 2
3. An application scenario with two sink devices: here the source device must discover and connect two sink devices and send video/audio data to each of them. At the same time, the two sink devices must also communicate with each other for synchronization and other information exchange.
Figure WFD Application Scenario 3
4.2 Related Application Products
There are already some WiFi Display products on the market, such as the Baidu video stick and the Xiaomi box. The Baidu video stick works as follows (official notes):
Figure 29 Baidu video stick 2S working diagram
The Baidu video stick series developed by Baidu is, in fact, a wireless video playback solution that takes full account of practical realities. Considering the actual situation of existing TVs (most only have HDMI), in this scheme the TV uses a wired HDMI connection to the Baidu 2S, and the real wireless module is the Baidu 2S itself. Here the source device is the mobile phone, and the sink device is actually the Baidu 2S.
The Xiaomi box follows basically the same working model, but according to the official instructions it also supports remote music playback as well as the AirPlay protocol and the DLNA protocol.
5 DLNA Technology and AirPlay Technology
5.1 DLNA Technology
DLNA (Digital Living Network Alliance) is a certification organization founded by Sony, Intel, Microsoft, and others, designed to let consumer electronics devices interconnect and share data across wired/wireless networks. Compared with the Miracast certification program, DLNA has its own framework and standards. First, DLNA classifies virtually all electronic products on the market (Figure 30), which on the one hand allows DLNA to standardize a wide range of devices, and on the other hand forms the basis for the standards governing interaction between different devices.
Figure 30 DLNA-defined device categories and types
Next, DLNA defines the devices' working architecture (Figure 31), in which UPnP is an important protocol layer for device discovery, connection, and control (see the reference articles), while media data transfers use either the HTTP transport protocol or the RTP protocol.
Figure 31 DLNA system architecture diagram
5.2 AirPlay Technology
AirPlay is Apple's solution for transferring media information between Apple products or products certified by Apple. AirPlay technology enables products to discover each other automatically and to transmit music, pictures, and video files to each other with ease.
AirPlay is based on the multicast DNS (mDNS) protocol and the DNS Service Discovery (DNS-SD) protocol, both proposed by the IETF Zeroconf working group as network protocols for automatically discovering devices and services; on top of these two protocols, Apple's digital home network framework is implemented.
AirPlay's message formats and sending rules are based on the mDNS protocol. mDNS, built on multicast, defines the basic message format and the receiving/sending rules between the devices in a home. The protocol derives from the DNS protocol, with some modifications to its message format and message ordering: for example, the DNS message header is simplified to focus on mutual discovery between home devices, and, given the use of multicast, mDNS makes many improvements to reduce network congestion and message redundancy, so that discovering devices and services within the LAN does not cause excessive message traffic.
On the basis of the mDNS protocol, the DNS-SD protocol provides a complete process for declaring and using services, i.e. which mDNS messages a device must send to fully declare and describe its services. The DNS-SD protocol uses three record types, PTR, SRV, and TXT, to describe the type of a service, its name, and the IP address and port number of its host.
With device and service discovery and description handled by DNS-SD, Apple's AirPlay protocol then specifies the transmission and control message formats for images, audio, and video, thereby enabling media sharing and collaboration between smart devices. After obtaining information about other devices and services through DNS-SD (i.e., the IP address and port number of the device or service), AirPlay uses HTTP messages for the transmission and control of pictures and video, and uses the RTSP protocol for audio transmission and control.
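Android's own DNS-SD client, NsdManager, can browse the same kind of mDNS/DNS-SD services; the sketch below looks for the service type that AirPlay receivers register ("_airplay._tcp"). Listener construction is omitted for brevity.

    import android.content.Context;
    import android.net.nsd.NsdManager;

    class AirplayBrowseSketch {
        void browse(Context ctx, NsdManager.DiscoveryListener listener) {
            NsdManager nsd = (NsdManager) ctx.getSystemService(Context.NSD_SERVICE);
            // Browse for AirPlay receivers advertised over mDNS/DNS-SD.
            nsd.discoverServices("_airplay._tcp",
                    NsdManager.PROTOCOL_DNS_SD, listener);
        }
    }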
The AirPlay working model is basically consistent with WiFi Display and DLNA technology, except that it is Apple's proprietary solution, and there is therefore no public specification (no official protocol standard).
PDF Download: http://download.csdn.net/detail/srw11/7645371
Research on the WiFi Display Function under Android