"Mission 4" shortly after the opening, Agent Hanave on the train station to wear contact lenses, holding a mobile phone, in the vast sea of people searching for tracking objects. Glasses will Hanave see the information automatically collected and quickly match the character database to lock the target quickly. The face meets the beauty is the fatal killer, the handset sends out the alarm sound, displays the beauty killer the information card, but is too late ... Such sci-fi scenes are being turned into reality by technology giants, such as smart glasses and smart watches, which are the product of a combination of search engines and virtual reality.
The fusion of search engines and virtual reality
Virtual reality refers to using digital means to simulate a real environment and give the user an immersive experience. It provides visual, auditory, tactile and other sensory simulation, so that users feel as if they were actually there and can observe things in three-dimensional space in real time and without restriction.
Now, search for "seaside" on Baidu and you will hear waves and seabirds; search for "cuckoo" and birdsong and the murmur of a spring evoke a forest; search for "lovelorn" and a heartbreaking sound plays... Giving search results sound is unprecedented, and some users may take time to get used to it, but this bold innovation pushes the search engine toward the era of virtual reality. All of these attempts aim to make search results more vivid and closer to reality, rather than just a dull list.
Search engines combine the virtual with the real
The search engine's move toward virtual reality is not a contrived need. The internet lets people obtain information beyond the limits of time and space, but it delivers little beyond the information itself: you cannot try on clothes before buying them on an e-commerce site, and you cannot smell the food before placing an order in a food-delivery app. The barrier of physical space still exists. Virtual reality can help the internet break through these constraints further, upgrading from sound and image to sound, image, smell, touch, taste, feeling...
Multimedia search perceives the real world
Search engines were initially based on keyword text search, then added natural-language understanding, and later came to support complex query sentences; on the whole, they have remained text-based.
Now more and more devices can take photos, shoot video, and record audio. Multimedia capabilities have become standard on mobile devices, along with positioning, motion sensing, gravity sensing, and other basic functions, and smart hardware carries sensors that collect health and environmental data. All of this data will become input for the search engines of the future.
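To make this concrete, here is a minimal sketch of what a multimodal query payload might look like. The MultimodalQuery structure and its field names are illustrative assumptions, not any real search API.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical payload combining the device signals described above.
# Field names are illustrative assumptions, not a real search API.
@dataclass
class MultimodalQuery:
    text: Optional[str] = None          # typed or spoken keywords
    image_jpeg: Optional[bytes] = None  # photo taken by the device camera
    audio_wav: Optional[bytes] = None   # recorded sound clip
    latitude: Optional[float] = None    # positioning data
    longitude: Optional[float] = None
    sensors: dict = field(default_factory=dict)  # e.g. heart rate, air quality

# Example: a query mixing text, location, and sensor readings.
query = MultimodalQuery(
    text="nearby jogging routes",
    latitude=39.9042,
    longitude=116.4074,
    sensors={"heart_rate_bpm": 92, "pm2_5": 35},
)
print(query)
```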
Making the input rich media is only the first step of multimedia search; combining search results with virtual reality so that users experience the results as if they were on the scene is the second step. The backdrop to both changes is the development of the mobile internet. The internet connects people with information; the mobile internet connects people with the physical world, and the real world is being mapped onto the internet in real time.
Maps provide the spatial skeleton, while streaming data such as information feeds, video streams, voice streams, and live broadcasts map the time dimension. Against this backdrop of the mobile internet mapping the physical world, the search engine's mission is also changing: connecting people with services and helping people explore the physical world.
The evolution of search engines
When people think of multimedia, they think of sound, images, and video; when they think of virtual reality, they think of the "perceptions" of sight, hearing, touch, taste, and smell. Which of these media forms, or perceptions, matters most? Simulating touch, taste, and smell is still technically difficult, while vision and hearing are already relatively mature.
Available data show that more than 90% of the information humans take in comes through the eyes, and Robin Li declared at the 2012 Baidu World conference that we had entered the era of reading images. Voice comes next, on a par with images in maturity, or perhaps even more mature.
For now, however, voice is used mainly as input, with output still delivered mostly as graphics and other visual content. But people expect to talk to a search engine as naturally as Iron Man converses with his assistant Jarvis. In the foreseeable future, more search results will be "read out" by voice, for example in the car, where a screen is unsuitable, and on the many smart devices that have no screen at all. With this in mind, it is easier to understand why Baidu makes search results simulate real-world sounds.
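As a small illustration of "reading out" a result on a screenless device, here is a minimal sketch using the pyttsx3 offline text-to-speech library; the result snippet is made up, and any connection to an actual search backend is assumed.

```python
import pyttsx3  # offline text-to-speech; install with: pip install pyttsx3

# A made-up search result snippet standing in for a real answer.
result_snippet = "Today's weather in Beijing: sunny, 22 degrees, light northerly wind."

engine = pyttsx3.init()          # use the platform's default TTS driver
engine.setProperty("rate", 160)  # speak a bit slower for in-car listening
engine.say(result_snippet)       # queue the text to be spoken
engine.runAndWait()              # block until speech has finished
```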
Voice can free the hands, and it can also free the eyes. But voice has scenarios it is not good at: noisy outdoor environments, or homes and offices where speaking aloud easily disturbs others.
In multimedia search, sound and image are not substitutes for each other; it is their collaboration that makes search engines more intelligent, simpler, and more natural. Future multimedia search will likely be dominated by voice and image interaction, each with its own applicable scenarios. And the search box will be like a window onto the real world, connecting people with information, people with services, and people with the physical world.