Mark Liu
Introduction
In Android, it's easy to create an app that plays video clips, and it's easy to create a game app with a 3D graphics plane. However, it is not easy to create an app that plays video on 3D graphics objects. This article describes the app I created to address this challenge: it renders video on a 3D plane and lets users interact with that plane while the video plays.
The application needs to address three major implementation issues:
- How do I build code to enable users to change the 3D plane while playing video?
- How do I obtain the video image data and render it in a customized fashion? Although the default MediaPlayer in Android provides full playback operations, it does not expose frame data for custom rendering.
- How do I design a graphics component to perform video data rendering?
In addition, the application places heavy demands on the device, because video decoding and 3D graphics must run simultaneously. This article first introduces the app's features and usage, and then briefly describes its design.
Features
The application has the following features:
- It enables users to select video clips from their local media library.
- The application supports both landscape and portrait orientations, and the video continues to play when the orientation changes.
- Three playback modes are available:
- Default mode: This is the normal playback mode; video clips play as if they were in a regular media player.
- Random rotation: The video plane rotates randomly with a 3D effect in both landscape and portrait orientation, and its color can also be toggled between a red tint and normal colors.
- Touch rotation: The user can swipe to rotate the plane left and right; the rotation speed depends on the swipe speed.
- When the app moves to the background or is closed, it saves the current video clip and resumes playing it when the app is reactivated.
Note: There are no video session controls in the app, so users cannot pause, fast-forward, or rewind the video. If the app moves to the background or is closed, the video can only restart from the beginning.
Starting video playback
The app does not include any video, so users can choose any video clip supported by the Android platform. However, clips encoded with the H.264/AVC codec in an MPEG-4 container are recommended, because we used this format during development. The file selector interface is displayed when the app is first launched.
After the user selects a preferred file browser, the system displays its selection screen. When the user selects a video clip, playback starts.
Note: The video sometimes takes a while, about 5 to 10 seconds, to start.
Choose a different mode
Default mode is used when the app starts playing. To change the mode, the user can tap the options button in the UI to open the Options menu.
The Options menu has four options: the first three are the playback mode options, and the last allows the user to select a different video clip.
Random rotation
This option makes the plane of the rendered video rotate randomly with a 3D effect about the horizontal and vertical axes. It also periodically renders the screen with a red tint.
Touch rotation
In touch rotation mode, the user can swipe left and right to rotate the plane about the vertical axis. Faster swipes accelerate the rotation, and when the user stops swiping, the rotation slows down.
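The behavior above can be sketched as a small piece of state logic: swipe speed sets an angular velocity, which then decays once the finger lifts. This is a minimal illustration, not the app's actual code; the class name, friction factor, and speed scaling are assumptions.

```java
// Sketch of touch-rotation: swipe speed drives angular velocity, which decays.
// FRICTION and SPEED_TO_VELOCITY are assumed tuning constants.
public class TouchSpin {
    private double angle;     // current rotation about the vertical axis, degrees
    private double velocity;  // degrees advanced per rendered frame

    static final double FRICTION = 0.95;          // assumed per-frame decay factor
    static final double SPEED_TO_VELOCITY = 0.5;  // assumed swipe-speed scaling

    // Called while the user is swiping: faster swipes spin the plane faster.
    public void onSwipe(double pixelsPerFrame) {
        velocity = pixelsPerFrame * SPEED_TO_VELOCITY;
    }

    // Called once per rendered frame: advance the angle, then decay the velocity
    // so the rotation slows down after the swipe stops.
    public double step() {
        angle = (angle + velocity) % 360.0;
        velocity *= FRICTION;
        return angle;
    }
}
```

A real implementation would derive `pixelsPerFrame` from Android touch events (for example, via a `VelocityTracker`), but the decay loop is the essential part.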
Design
The app has three main components: a UI component that presents the interactive interface to users, a video component that handles video playback, and a graphics component that renders the video frame buffer and creates the OpenGL* 3D plane effects in the special modes.
The video playback component has two threads. The video engine thread uses the MediaCodec class, available since Android Jelly Bean (API level 16). MediaCodec is a low-level Android media framework API, so playback can be controlled at the frame level. For each video frame, the graphics component can access the image data and change how it is rendered.
In addition, the playback component runs an audio engine thread that plays the video clip's soundtrack. To synchronize the video and audio threads, so that characters' lips match the speech they deliver, we implement an A/V synchronization algorithm: video playback keeps its timestamps consistent with the audio timestamps, fine-tuning the playback speed to keep each video frame within a 30-millisecond window.
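A common way to implement such an algorithm is a per-frame decision against the audio clock: render the frame if it is within the window, drop it if it lags too far behind, and wait if it is too far ahead. The sketch below illustrates that decision under the article's 30 ms window; the class, method names, and drop/wait policy are assumptions, not the app's actual code.

```java
// Sketch of an A/V sync decision: compare a video frame's presentation
// timestamp against the audio clock and keep drift within 30 ms.
public class AvSync {
    // Maximum allowed drift between video and audio, in milliseconds (from the article).
    static final long SYNC_WINDOW_MS = 30;

    public enum Action { RENDER, DROP, WAIT }

    public static Action decide(long videoPtsMs, long audioClockMs) {
        long drift = videoPtsMs - audioClockMs;
        if (drift > SYNC_WINDOW_MS) return Action.WAIT;   // video ahead: delay rendering
        if (drift < -SYNC_WINDOW_MS) return Action.DROP;  // video behind: drop the frame
        return Action.RENDER;                             // within the window: render now
    }
}
```

In the real app, the audio clock would come from the audio engine thread's playback position, and "wait" would translate into sleeping until the frame's deadline.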
The graphics component uses a GLSurfaceView that embeds a custom renderer class. The renderer class implements the GLSurfaceView.Renderer interface to execute the OpenGL* rendering algorithm. The algorithm applies a 3D rotation effect to the textured plane, switching to other algorithms based on the mode the user selects.
In the renderer, the rotation and tinting of video frames are controlled by a simple vertex shader and pixel shader. The rotation of the video plane is achieved by applying a series of Euler angle rotations; the tinting of a frame is achieved by linearly blending the video frame content with a fixed color (in this case, red) in the pixel shader.
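The two shader operations reduce to straightforward math, shown here on the CPU for illustration: an Euler rotation (a single vertical-axis rotation, for simplicity) as the vertex shader would apply to each vertex, and a linear blend toward red as the pixel shader would apply to each pixel. The names and the single-axis simplification are assumptions, not the app's actual shader code.

```java
// CPU illustrations of the renderer's two shader operations.
public class PlaneEffects {
    // Rotate a point (x, y, z) about the vertical (Y) axis by an Euler angle
    // in radians, as a vertex shader would via a rotation-matrix uniform.
    public static double[] rotateY(double[] p, double angle) {
        double c = Math.cos(angle), s = Math.sin(angle);
        return new double[] { c * p[0] + s * p[2], p[1], -s * p[0] + c * p[2] };
    }

    // Linearly blend an RGB color toward red by factor t in [0, 1],
    // equivalent to GLSL's mix(frameColor, vec3(1,0,0), t) in the pixel shader.
    public static double[] blendRed(double[] rgb, double t) {
        double[] red = { 1.0, 0.0, 0.0 };
        double[] out = new double[3];
        for (int i = 0; i < 3; i++) out[i] = rgb[i] * (1 - t) + red[i] * t;
        return out;
    }
}
```

A full Euler rotation chains rotations about all three axes; applying them in sequence to each vertex of the plane produces the 3D rotation effect.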
In addition, the graphics component acts as a video frame listener for the video engine: it is attached to the MediaCodec object during initialization so that its rendering function is invoked whenever a frame is available. The graphics component also exposes an option through the UI component so that users can change the playback mode.
The MediaCodec decoding process runs in step with the video frame rate, while the renderer in the graphics component runs in step with the graphics frame rate. Therefore, when the two components access the frame buffer, they must be kept in sync.
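One simple way to synchronize two threads running at different frame rates is a guarded handoff object: the decoder posts the latest frame, and the renderer waits for one before drawing. This is a minimal sketch of that pattern under plain Java monitors; the class and field names are assumptions, not the app's actual code.

```java
// Sketch of a decoder/renderer frame handoff guarded by a monitor.
public class FrameMailbox {
    private byte[] pendingFrame;  // latest decoded frame, or null if none

    // Decoder thread (video frame rate): publish a frame and wake the renderer.
    public synchronized void postFrame(byte[] frame) {
        pendingFrame = frame;
        notifyAll();
    }

    // Renderer thread (graphics frame rate): take the latest frame,
    // blocking until one is available.
    public synchronized byte[] takeFrame() throws InterruptedException {
        while (pendingFrame == null) wait();
        byte[] frame = pendingFrame;
        pendingFrame = null;
        return frame;
    }
}
```

Because only the latest frame is kept, a slow renderer naturally skips frames rather than falling behind, which fits the A/V sync policy described earlier.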
Related links and resources:
- Porting OpenGL* Games to Intel® Atom™ Processor-Based Android* Systems
- Android* MediaPlayer Sample Code Walk-Through on Intel® Architecture
- 3D Games on Intel® Processor Graphics
- Intel® Android* Developer Zone
To learn more about the Intel tools available for Android developers, visit:
About the author
Mark Liu is a software engineer in Intel's Ultra-Mobile Group, responsible for developing an Android-based device validation framework. He is also involved in several other Android projects in the division, covering smartphones and tablet devices. Most of his work relates to media playback, video conferencing, and software stack performance tuning.
After joining the device software support team for Intel® Atom™ processor-based platforms in Intel's software and devices division, he was responsible for a number of different tasks, including developing Android sample media applications, optimizing the Windows* 8 media framework, and writing documentation for media applications.
Chris Kirkpatrick is a software engineer at Intel Software and Services, supporting Intel graphics solutions on the Visual and Interactive Computing Engineering team. He holds a BS degree in Computer Science from Oregon State University.
Download: Video3d.zip (115.71 KB)