Author: Mark Liu
Download Sample Code
Introduction
In Android, it is easy to create an app that plays video clips, and it is easy to create a game that renders a 3D graphics plane. However, it is not easy to create an app that plays video on 3D graphics objects.
This article describes the app I created to address this challenge. The app renders video on a 3D plane and lets users interact with that plane while the video plays.
The application needs to address three major implementation issues:
- How do users manipulate the 3D plane while the video is playing?
- Although Android's default MediaPlayer supports full playback operations, it is difficult to obtain the decoded video image data from it and render that data in a customized way.
- How should the graphics component that renders the video data be designed?
In addition, the application is demanding on the device, because it must decode video and render 3D graphics at the same time. This article first introduces the features and use of the application, and then briefly describes its design.
Features
The application has the following features:
- It enables users to select video clips from their local media library.
- The application supports both landscape and portrait orientations, and the video continues playing when the orientation changes.
- It is available in three modes of playback:
- Normal mode: This is the standard playback mode. Video clips play as they would in a regular media player.
- Random rotation: The video plane rotates randomly about both the horizontal and vertical axes, and its color is periodically toggled between a red tint and normal colors.
- Touch rotation: The user can swipe to rotate the plane left and right, at a speed that depends on the swipe speed.
- When the app is moved to the background or closed, it saves the current video clip and plays it again when the app is reactivated.
Note: The app has no playback session controls, so users cannot pause, fast-forward, or rewind the video. If the app is moved to the background or closed, the video simply restarts from the beginning when the app resumes.
Starting playback of a video clip
The app does not include any video content of its own, so the user selects a video clip supported by the Android platform for playback. Clips encoded with the H.264/AVC codec in an MPEG-4 container are recommended, since the app was developed and tested with that format. When the app first starts, the file selector interface is displayed.
After the user chooses a file in the selector, the system shows an on-screen indication, and the selected video clip starts playing.
Note: The video sometimes takes a while, about 5 to 10 seconds, before it starts playing.
Choose a different mode
The app starts playback in Normal mode. To change the mode, the user can tap the options button in the UI to open the Options menu.
The Options menu has four entries: the first three select the playback mode, and the last lets the user choose a different video clip.
Random rotation
In this mode, the plane of the rendered video rotates randomly in 3D about the horizontal and vertical axes. In addition, the frame is periodically tinted red.
Touch rotation
In touch rotation mode, the user can swipe left and right to rotate the plane about the vertical axis. The faster the swipe, the faster the rotation; when the user stops swiping, the rotation gradually slows down.
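The velocity-based behavior described above can be sketched in Java as follows. This is an illustrative sketch, not the app's actual code; the class name, scaling constant, and decay factor are assumptions chosen to show the idea.

```java
// Illustrative sketch of touch-rotation: swipe speed sets the plane's angular
// velocity, which decays each frame once the finger stops moving.
public class TouchRotationController {
    private static final float VELOCITY_SCALE = 0.05f; // degrees per pixel of swipe (assumed)
    private static final float DECAY = 0.95f;          // per-frame slowdown factor (assumed)

    private float angleDeg = 0f;        // current rotation about the vertical axis
    private float angularVelocity = 0f; // degrees per frame

    /** Called from the touch handler with the horizontal swipe delta in pixels. */
    public void onSwipe(float deltaXPixels) {
        angularVelocity = deltaXPixels * VELOCITY_SCALE;
    }

    /** Called once per rendered frame; returns the angle to feed the renderer. */
    public float updateAngle() {
        angleDeg = (angleDeg + angularVelocity) % 360f;
        angularVelocity *= DECAY; // rotation slows when the user stops sliding
        return angleDeg;
    }
}
```

Keeping the decay per frame (rather than per touch event) is what makes the plane keep spinning briefly after the finger lifts, then coast to a stop.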
Design
The app has three main components: a UI component that presents the interactive application to the user, a video component that plays the clip and decodes video frames into a buffer, and a graphics component that renders those frames onto an OpenGL* plane with 3D effects in the special modes.
The video playback component has two threads. The video engine thread uses the MediaCodec class, introduced in Android Jelly Bean (API level 16). MediaCodec exposes a low-level Android media framework API, so playback can be controlled at the frame level.
For each video frame, the graphics component takes the image data and controls how the image is rendered.
In addition, the playback component runs an audio engine thread that plays the video clip's soundtrack. To synchronize the video and audio threads, so that a character's lips match the speech, we implemented an AV synchronization algorithm: video playback keeps its timestamps consistent with the audio timestamps, fine-tuning the playback speed so that each video frame stays within 30 milliseconds of the audio clock.
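The per-frame sync decision described above can be sketched as a small pure function. The class, enum names, and exact policy (wait when early, drop when late) are illustrative assumptions; the article only specifies the 30 ms tolerance.

```java
// Illustrative AV-sync decision: compare a video frame's presentation
// timestamp against the audio clock and keep it within a 30 ms window.
public class AvSync {
    public enum Action { RENDER, WAIT, DROP }

    private static final long SYNC_WINDOW_US = 30_000; // 30 ms tolerance, in microseconds

    /**
     * Decide what to do with a video frame whose presentation timestamp is
     * framePtsUs, given the current audio clock audioClockUs (both in us).
     */
    public static Action decide(long framePtsUs, long audioClockUs) {
        long diffUs = framePtsUs - audioClockUs;
        if (diffUs > SYNC_WINDOW_US) {
            return Action.WAIT;   // frame is early: delay rendering
        } else if (diffUs < -SYNC_WINDOW_US) {
            return Action.DROP;   // frame is late: skip it to catch up
        }
        return Action.RENDER;     // within tolerance: render now
    }
}
```

In a MediaCodec-based player, the frame timestamp would come from the decoder's buffer info and the audio clock from the audio engine thread.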
The graphics component is built on a GLSurfaceView, which lets a custom rendering class be embedded in it. The render class implements the Renderer interface and runs the OpenGL rendering algorithm. The algorithm applies the 3D rotation effect to the textured plane, and switches to other algorithms according to the user's input.
In the renderer, the rotation and coloring of video frames are controlled by a simple vertex shader and pixel shader. The rotation of the video plane is achieved by applying a series of Euler angle rotations, and the coloring of a frame is achieved by linearly blending the frame content with a fixed color (in this case, red) in the pixel shader.
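The math behind these two effects can be sketched in plain Java. In the actual app this work is done on the GPU by the shaders; the class and method names below are illustrative, and only a two-angle Euler rotation is shown.

```java
// Illustrative math for the two shader effects: Euler-angle rotation of the
// plane's vertices, and a linear blend of the frame color toward red.
public class PlaneEffects {
    /** Rotate point (x,y,z) about the X axis by ax, then the Y axis by ay (radians). */
    public static float[] rotateXY(float x, float y, float z, float ax, float ay) {
        // Rotation about the X axis
        float y1 = (float) (y * Math.cos(ax) - z * Math.sin(ax));
        float z1 = (float) (y * Math.sin(ax) + z * Math.cos(ax));
        // Followed by rotation about the Y axis
        float x2 = (float) (x * Math.cos(ay) + z1 * Math.sin(ay));
        float z2 = (float) (-x * Math.sin(ay) + z1 * Math.cos(ay));
        return new float[] { x2, y1, z2 };
    }

    /** Linearly blend an RGB color toward pure red (1,0,0) by factor t in [0,1]. */
    public static float[] blendRed(float r, float g, float b, float t) {
        return new float[] { r + (1f - r) * t, g * (1f - t), b * (1f - t) };
    }
}
```

In GLSL the blend is a one-line `mix()` between the sampled texel and the tint color, with `t` animated over time to produce the periodic red tint.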
In addition, the graphics component acts as a video frame listener for the video engine: it is registered with the MediaCodec-based engine during setup, so that its rendering function is invoked whenever a frame is available. The UI component exposes an option that lets users change the playback mode.
The MediaCodec decoding process runs in step with video frames, while the renderer in the graphics component runs in step with graphics frames. Therefore, when the two components access the shared frame buffer, their operations must be synchronized.
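One way to picture this requirement is a single-slot hand-off between the decoder thread and the renderer thread, so that only one side touches the frame data at a time. This is an illustrative sketch, not the app's actual synchronization code.

```java
// Illustrative single-slot frame buffer: the decoder thread publishes frames,
// the renderer thread consumes them, and the monitor lock serializes access.
public class FrameSlot {
    private byte[] frame;             // the shared frame buffer
    private boolean available = false;

    /** Decoder side: publish a decoded frame, waiting until the slot is free. */
    public synchronized void put(byte[] data) throws InterruptedException {
        while (available) wait();
        frame = data;
        available = true;
        notifyAll();
    }

    /** Renderer side: take the next frame, waiting until one is available. */
    public synchronized byte[] take() throws InterruptedException {
        while (!available) wait();
        available = false;
        notifyAll();
        return frame;
    }
}
```

A real player would typically use a small queue of buffers rather than one slot, but the invariant is the same: the decoder never overwrites a frame the renderer is still reading.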
Related links and resources:
- Porting OpenGL* Games to Intel® Atom™ Processor-based Android* Systems
- Android* MediaPlayer Sample Code Walk-through on Intel Architecture
- 3D Games on Intel® Processor Graphics
- Intel® Android* Developer Zone
To learn about many other Intel tools available for Android developers, visit the Intel® Android* Developer Zone.
About the authors
Mark Liu is a software engineer in Intel's Ultra-Mobile Group, responsible for developing a verification framework for Android-based devices. He has also participated in several other Android projects in the group, covering smartphones and tablet devices. Most of his work relates to media playback, video conferencing, and software stack performance tuning.
After joining the device software support team for Intel® Atom™ processor-based devices in Intel's Software and Services Group, he took on a number of different tasks, including developing sample Android media applications, optimizing the Windows* 8 media framework, and writing documentation for media applications.
Chris Kirkpatrick is a software engineer in Intel's Software and Services Group, providing development support for Intel graphics solutions on visual and interactive computing engineering teams. He holds a B.S. in Computer Science from Oregon State University.
Download: Video3d.zip (115.71 KB)
Copyright notice: This is an original blog article and may not be reproduced without the author's consent.