iOS Development - Multimedia

Source: Internet
Author: User

In iOS, audio playback falls into two categories: sound effects and music. Sound effects are short audio clips, usually used as embellishments, that need no progress or loop control. Music refers to longer audio, usually the main content, that requires precise control. iOS plays sound effects with AudioToolbox.framework and music with AVFoundation.framework.

I. Sound Effects

AudioToolbox.framework is a C-based framework used to play sound effects. In essence, it registers a short audio clip with the System Sound Service, a simple, low-level sound playback service. It is only suitable for small prompt or alert sounds, and it can also trigger the system vibration, but it has several limitations:
1. The audio must not be longer than 30 seconds.
2. The data must be in PCM or IMA4 format.
3. The audio must be packaged as a .caf, .aif, or .wav file.
4. The playback progress cannot be controlled.
5. The sound plays immediately when the method is called.
6. There is no loop playback or stereo control.
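As a minimal sketch of the flow just described (the resource name tap.wav is a placeholder; error checking is omitted), registering and playing a sound effect with System Sound Services looks like this:

```objc
#import <AudioToolbox/AudioToolbox.h>

// Register a short .wav file with the System Sound Service and play it.
SystemSoundID soundID;
NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"tap" ofType:@"wav"]];
AudioServicesCreateSystemSoundID((__bridge CFURLRef)url, &soundID);
AudioServicesPlaySystemSound(soundID);                // plays immediately, no progress control
// AudioServicesPlaySystemSound(kSystemSoundID_Vibrate); // vibration instead of sound
```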

II. Music

If you want to play a long audio file or need precise control over playback, the System Sound Service is unlikely to meet your needs. In that case, use AVAudioPlayer from AVFoundation.framework. You can think of AVAudioPlayer as an advanced player that supports a wide range of audio formats.

AVAudioPlayer can play audio files of any length, supports loop playback, can play multiple audio files in sync, allows control of the playback progress, and can start playing from any point in the file; for more advanced features, see the AVAudioPlayer documentation. To play a file with an AVAudioPlayer object, you only need to point it at an audio file and set a delegate object that implements the AVAudioPlayerDelegate protocol.

As long as the numberOfLoops property of AVAudioPlayer is set to a negative number, the audio file loops continuously until the stop method is called.

The AVAudioPlayer class encapsulates playback of a single sound. The player can be initialized with an NSURL or NSData. Note that the NSURL cannot be a network URL; it must be a local file URL, because AVAudioPlayer cannot play network audio. Since AVAudioPlayer can only play a complete file and does not support streaming, network audio must be fully buffered before it is played. One AVAudioPlayer plays one audio file; to mix audio, create multiple AVAudioPlayer instances, each of which acts as one track on a mixing board.
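A minimal sketch of the points above (the file name song.mp3 is a placeholder; error handling is omitted):

```objc
#import <AVFoundation/AVFoundation.h>

// AVAudioPlayer must be initialized with a local file URL, not a network URL.
NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"song" ofType:@"mp3"]];
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
player.numberOfLoops = -1;   // negative value = loop until -stop is called
[player prepareToPlay];      // allocate resources; the whole file is buffered
[player play];
// ... later: [player stop];
```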

Advanced topics
1. Set the background playback mode (both steps are required):
Step 1: In Info.plist, add "Required background modes" and set its first item to "App plays audio or streams audio/video using AirPlay".

Step 2: Set the AVAudioSession category to playback and call setActive: to activate the session:

AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryPlayback error:nil];
[audioSession setActive:YES error:nil];

2. Pause playback when the headset is unplugged:

// register for route-change notifications
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(routeChange:)
                                             name:AVAudioSessionRouteChangeNotification
                                           object:nil];

// the notification handler
- (void)routeChange:(NSNotification *)notification {
    NSDictionary *dic = notification.userInfo;
    int changeReason = [dic[AVAudioSessionRouteChangeReasonKey] intValue];
    if (changeReason == AVAudioSessionRouteChangeReasonOldDeviceUnavailable) {
        AVAudioSessionRouteDescription *routeDescription = dic[AVAudioSessionRouteChangePreviousRouteKey];
        AVAudioSessionPortDescription *portDescription = [routeDescription.outputs firstObject];
        if ([portDescription.portType isEqualToString:@"Headphones"]) {
            if ([self.audioPlayer isPlaying]) {
                [self.audioPlayer pause];
                self.timer.fireDate = [NSDate distantFuture];
            }
        }
    }
}

Configure iOS 9 to allow HTTP as well as HTTPS: in Info.plist, add the NSAppTransportSecurity dictionary and set its NSAllowsArbitraryLoads key to YES.

Shake demo

#import "ViewController.h"
// sound-effect audio framework
#import <AudioToolbox/AudioToolbox.h>

@interface ViewController ()
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
}

- (void)motionBegan:(UIEventSubtype)motion withEvent:(UIEvent *)event {
    NSLog(@"Shake start");
    // play a sound effect: create a system sound ID and register the resource
    SystemSoundID soundID;
    NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"shake" ofType:@"wav"]];
    AudioServicesCreateSystemSoundID((__bridge CFURLRef)(url), &soundID);
    // kSystemSoundID_Vibrate triggers the vibration effect
    AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
    // play the sound
    AudioServicesPlaySystemSound(soundID);
}

- (void)motionEnded:(UIEventSubtype)motion withEvent:(UIEvent *)event {
    NSLog(@"Shake end");
    // a page jump is usually written here
}

- (void)motionCancelled:(UIEventSubtype)motion withEvent:(UIEvent *)event {
    NSLog(@"Shake cancel");
}

@end

Method for playing music

#pragma mark - set up the music player
- (void)createAudioPlayer {
    // Method 1 (local): initialize with an NSURL -- the URL must be a local file URL, not a network URL
    NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:self.localMusicArray[_currentIndex] ofType:@"mp3"]];
    NSLog(@"%@", self.localMusicArray[_currentIndex]);
    _audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
    // pre-buffer // [_audioPlayer prepareToPlay];
    // play       // [_audioPlayer play];

    // Method 2 (network): buffer the file locally with NSData and then play it
    // _audioPlayer = [[AVAudioPlayer alloc] initWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:self.urlArray[_currentIndex]]] error:nil];

    // set the delegate
    _audioPlayer.delegate = self;
    // set the volume (0.0 - 1.0)
    _audioPlayer.volume = 0.5;
    // number of loops: negative = infinite loop, 0 = play once (the default), a positive number = play that many extra times
    _audioPlayer.numberOfLoops = 0;
    // playback position in seconds; playback can start from any point
    _audioPlayer.currentTime = 0;

    /*
    // number of audio channels (read-only)
    int channals = _audioPlayer.numberOfChannels;
    // enable audio metering
    _audioPlayer.meteringEnabled = YES;
    [_audioPlayer updateMeters]; // refresh the meter readings
    // obtain the average and peak power levels
    for (int i = 0; i < channals; i++) {
        float average = [_audioPlayer averagePowerForChannel:i]; // average level
        float peak = [_audioPlayer peakPowerForChannel:i];       // peak level
    }
    // total playback duration
    NSTimeInterval time = _audioPlayer.duration;
    */

    // allocate playback resources and add the player to the internal playback queue
    [_audioPlayer prepareToPlay];
    // [_audioPlayer play];
}

#pragma mark - delegate methods for handling playback events
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag {
    NSLog(@"playback completed");
    if (flag) {
        // the audio file played normally; loop / single-loop logic goes here
    } else {
        // playback finished, but the audio failed to decode correctly
    }
}

- (void)audioPlayerDecodeErrorDidOccur:(AVAudioPlayer *)player error:(NSError *)error {
    // data decoding error
}

// Handle playback interruptions (e.g. an incoming call or the user pressing the Home button).
// Before iOS 8 you must implement these two delegate methods; from iOS 8 on, the system handles interruptions automatically.
- (void)audioPlayerBeginInterruption:(AVAudioPlayer *)player {
    // pause on interruption
    if (_audioPlayer.isPlaying) {
        [_audioPlayer pause];
    }
}

- (void)audioPlayerEndInterruption:(AVAudioPlayer *)player {
    // resume when the interruption ends
    [_audioPlayer play];
}

Enable line control for background playback

#pragma mark - background playback and line control
- (void)dealWithDetail {
    // Background playback also requires the plist change described above.
    // The audio session manages behavior when multiple apps play audio at the same time.
    AVAudioSession *session = [AVAudioSession sharedInstance];
    // set the category to background playback
    [session setCategory:AVAudioSessionCategoryPlayback error:nil];
    // activate the session
    [session setActive:YES error:nil];
    // listen for changes of the output device (route)
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(routeChange:)
                                                 name:AVAudioSessionRouteChangeNotification
                                               object:nil];
}

- (void)routeChange:(NSNotification *)noti {
    // obtain the notification content
    NSDictionary *dic = noti.userInfo;
    // the reason for the route change
    int changeReason = [dic[AVAudioSessionRouteChangeReasonKey] intValue];
    // the old output device became unavailable (e.g. headphones unplugged)
    if (changeReason == AVAudioSessionRouteChangeReasonOldDeviceUnavailable) {
        // description of the previous route
        AVAudioSessionRouteDescription *description = dic[AVAudioSessionRouteChangePreviousRouteKey];
        // description of the previous output port
        AVAudioSessionPortDescription *portDescription = [description.outputs firstObject];
        // if the previous port was the headphones and audio is playing, pause
        if ([portDescription.portType isEqualToString:@"Headphones"]) {
            if (_audioPlayer.isPlaying) {
                // pause the player
                [_audioPlayer pause];
                // pause the timer
                [timer setFireDate:[NSDate distantFuture]];
            }
        }
    }
}

Recording demo

#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>

#define SCREEN_HEIGHT [UIScreen mainScreen].bounds.size.height
#define SCREEN_WIDTH  [UIScreen mainScreen].bounds.size.width

@interface ViewController () <AVAudioRecorderDelegate>
{
    UILabel *_timeLabel;            // shows the recording time
    AVAudioRecorder *_audioRecoder;
    AVPlayer *_player;
}
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    [self createUI];
    [self createAudioRecoder];
    // a timer refreshes the displayed recording duration
    [NSTimer scheduledTimerWithTimeInterval:0.01 target:self selector:@selector(culTime) userInfo:nil repeats:YES];
}

#pragma mark - timer callback
- (void)culTime {
    _timeLabel.text = [NSString stringWithFormat:@"%.2f s", _audioRecoder.currentTime];
}

#pragma mark - create the audio recorder
- (void)createAudioRecoder {
    // path of the recording file
    NSString *path = [NSHomeDirectory() stringByAppendingPathComponent:@"events/recoder.aac"];
    NSLog(@"%@", path);
    // recording settings
    NSDictionary *dic = @{
        AVFormatIDKey: @(kAudioFormatMPEG4AAC), // audio format
        AVSampleRateKey: @(4000.0),             // sample rate
        AVNumberOfChannelsKey: @(2)             // number of channels
    };
    // initialize the recorder
    _audioRecoder = [[AVAudioRecorder alloc] initWithURL:[NSURL fileURLWithPath:path] settings:dic error:nil];
    // the delegate reports errors in the recording file
    _audioRecoder.delegate = self;
    // prepare to record
    [_audioRecoder prepareToRecord];
}

#pragma mark - create the UI
- (void)createUI {
    _timeLabel = [[UILabel alloc] initWithFrame:CGRectMake(150, 100, 100, 40)];
    _timeLabel.backgroundColor = [UIColor redColor];
    [self.view addSubview:_timeLabel];
    // control buttons
    NSArray *arr = @[@"Record", @"Pause", @"Stop", @"Play"];
    for (int i = 0; i < arr.count; i++) {
        UIButton *btn = [UIButton buttonWithType:UIButtonTypeCustom];
        btn.frame = CGRectMake(SCREEN_WIDTH / 4 * i, 200, SCREEN_WIDTH / 4 - 20, 40);
        [btn setTitle:arr[i] forState:UIControlStateNormal];
        [btn setTitleColor:[UIColor blackColor] forState:UIControlStateNormal];
        btn.tag = 100 + i;
        [btn addTarget:self action:@selector(onClick:) forControlEvents:UIControlEventTouchUpInside];
        [self.view addSubview:btn];
    }
}

#pragma mark - button handler
- (void)onClick:(UIButton *)btn {
    switch (btn.tag) {
        case 100: [_audioRecoder record]; break;
        case 101: [_audioRecoder pause];  break;
        case 102: [_audioRecoder stop];   break;
        case 103: [self startPlay];       break;
        default: break;
    }
}

#pragma mark - play the recording
- (void)startPlay {
    // initialize the player
    _player = [[AVPlayer alloc] init];
    // obtain the playback item from the recorder's URL
    AVPlayerItem *item = [AVPlayerItem playerItemWithURL:_audioRecoder.url];
    NSLog(@"%@", _audioRecoder.url);
    // switch the playback source and play
    [_player replaceCurrentItemWithPlayerItem:item];
    [_player play];
}

@end

Video Playback

I. MPMoviePlayerController

MPMoviePlayerController supports playing both local and online videos. It implements the MPMediaPlayback protocol, so it has the usual player controls: play, pause, stop, and so on. However, MPMoviePlayerController is not a complete view controller; to show the video in your UI, you must add its view property to your interface.
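A minimal sketch of embedding the player's view (the frame and the local file name MovieTest.mp4 are placeholders; MPMoviePlayerController lives in MediaPlayer.framework and is deprecated from iOS 9):

```objc
#import <MediaPlayer/MediaPlayer.h>

// keep a strong reference (e.g. in a property), or playback stops immediately
MPMoviePlayerController *moviePlayer =
    [[MPMoviePlayerController alloc] initWithContentURL:
        [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"MovieTest" ofType:@"mp4"]]];
moviePlayer.view.frame = CGRectMake(0, 64, 320, 180); // embed at any size
[self.view addSubview:moviePlayer.view];              // the view must be added manually
[moviePlayer play];                                   // an MPMediaPlayback method
```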

II. MPMoviePlayerViewController

In fact, unless MPMoviePlayerController plays as an embedded video (for example, a video inside a news page), it usually occupies the full screen, especially on iPhone and iPod touch. So, from iOS 3.2 on, Apple reasoned: since MPMoviePlayerController's view is so often added to another view controller as a subview, why not provide a view controller that owns an MPMoviePlayerController and plays full screen by default, so developers can use it directly? That controller is MPMoviePlayerViewController, which inherits from UIViewController. MPMoviePlayerViewController has a moviePlayer property and a URL-based initializer. It also implements behavior specific to modal presentation: full-screen display by default, automatic playback after presentation, and automatic dismissal of the modal window when the "Done" button is tapped.

Implementation before iOS 9:

// initialize the player with a network video
MPMoviePlayerViewController *player = [[MPMoviePlayerViewController alloc] initWithContentURL:[NSURL URLWithString:@"http://video.szzhangchu.com/1442391674615_6895658983.mp4"]];
// or with a local video
// MPMoviePlayerViewController *player = [[MPMoviePlayerViewController alloc] initWithContentURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"MovieTest" ofType:@"mp4"]]];
// set the playback source type
player.moviePlayer.movieSourceType = MPMovieSourceTypeFile;
// full-screen playback controls
player.moviePlayer.controlStyle = MPMovieControlStyleFullscreen;
// preprocess the video before playback
[player.moviePlayer prepareToPlay];
// start playing
[player.moviePlayer play];
// present the player controller
[self presentViewController:player animated:YES completion:nil];

Implementation from iOS 9 (AVPlayerViewController, in AVKit):

// initialize the player controller
AVPlayerViewController *playerVC = [[AVPlayerViewController alloc] init];
// set the playback resource (url is the video URL)
AVPlayer *player = [AVPlayer playerWithURL:url];
playerVC.player = player;
[self presentViewController:playerVC animated:YES completion:nil];

III. AVPlayer

MPMoviePlayerController is powerful enough that a player can be finished in just a few lines of code, but precisely because of its high level of encapsulation, customizing the player is complicated or even impossible. For example, if you need a custom player style, MPMoviePlayerController is not appropriate. If you want full control over video playback, use AVPlayer. AVPlayer lives in AVFoundation and is closer to the underlying layer, so it is more flexible. AVPlayer by itself cannot display video; it has no view property like MPMoviePlayerController. To display video with AVPlayer, you must create a player layer, AVPlayerLayer. The player layer inherits from CALayer and can be added to the controller's view layer tree. To use AVPlayer, you first need to understand a few related classes:

AVAsset: used to obtain multimedia information. It is an abstract class and cannot be used directly.
AVURLAsset: a subclass of AVAsset. You can create an AVURLAsset object containing media information based on a URL path.
AVPlayerItem: a media resource management object. It contains a video's basic information and state; one AVPlayerItem corresponds to one video resource.

Function introduction:
Play and pause are the most basic functions, and AVPlayer provides the two corresponding methods: play and pause. The key question is how to tell whether a video is currently playing. The players covered earlier each expose a playback state, but AVPlayer has no such state property; the usual approach is to inspect the player's rate. A rate of 0 means the player is paused, and 1 means it is playing at normal speed.
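A minimal sketch tying the classes above together (the URL is the example link from this article; the frame is a placeholder):

```objc
#import <AVFoundation/AVFoundation.h>

// build the resource chain: URL -> AVPlayerItem -> AVPlayer
NSURL *url = [NSURL URLWithString:@"http://video.szzhangchu.com/1442391674615_6895658983.mp4"];
AVPlayerItem *item = [AVPlayerItem playerItemWithURL:url];
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];

// AVPlayer cannot display video itself; attach an AVPlayerLayer to a view's layer tree
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.frame = CGRectMake(0, 64, 320, 180);
[self.view.layer addSublayer:playerLayer];

[player play];
// rate == 0 means paused; rate == 1 means playing at normal speed
BOOL isPlaying = (player.rate != 0);
```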

So far, MPMoviePlayerController and AVPlayer are both quite capable of playing video, but they share an unavoidable problem: the supported video encodings are very limited, namely H.264 and MPEG-4, with the container extensions .mp4, .mov, .m4v, .m2v, .3gp, and .3g2. On the other hand, both MPMoviePlayerController and AVPlayer support most audio encodings.

IV. VLC video playback (integrating a third-party player)

Integration steps

1> Add libMobileVLCKit

2> Link the required libraries and frameworks: libstdc++, libiconv, libbz2, Security.framework, QuartzCore.framework, CoreText.framework, CFNetwork.framework, OpenGLES.framework, AudioToolbox.framework

3> Set the C++ Standard Library to libstdc++:
In [Build Settings], search for "c++" to find [Apple LLVM 5.1 - Language - C++];
for [C++ Standard Library], select the first option, [libstdc++ (GNU C++ standard library)].

4> Add a header search path: in [Build Settings], search for "search" and, under [Search Paths], edit [Header Search Paths].
Open [libMobileVLCKit] > [include] > [MobileVLCKit], show the folder, and copy its path.
Replace the first half of the path (up to and including the project folder name) with $(SRCROOT).

5> Import the header file: #import "VLCMediaPlayer.h"

6> Rename any implementation file that uses VLC from .m to .mm so it is compiled as Objective-C++.
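After integration, a minimal playback sketch might look like the following. The MobileVLCKit API names used here (VLCMediaPlayer, drawable, VLCMedia) are from memory and should be checked against the library's headers; videoView is a hypothetical UIView for video output:

```objc
#import "VLCMediaPlayer.h"

// create the player and give it a view to draw into
VLCMediaPlayer *vlcPlayer = [[VLCMediaPlayer alloc] init];
vlcPlayer.drawable = self.videoView; // hypothetical UIView outlet
// VLC handles many encodings that MPMoviePlayerController/AVPlayer cannot
vlcPlayer.media = [VLCMedia mediaWithURL:
    [NSURL URLWithString:@"http://video.szzhangchu.com/1442391674615_6895658983.mp4"]];
[vlcPlayer play];
```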

Example video link

http://video.szzhangchu.com/1442391674615_6895658983.mp4

Configure iOS 9 ATS as above: in Info.plist, add NSAppTransportSecurity with NSAllowsArbitraryLoads set to YES so the HTTP video link can load.
