"How to quickly develop a complete iOS Live App" (collection)



Effect

(Screenshot of the capture effect. I went all out to capture this material, so please ignore the person in the shot and focus on the technology.)


Basic Knowledge Introduction
  • AVFoundation: the AVFoundation framework is needed for audio and video capture.

  • AVCaptureDevice: a hardware device such as the microphone or camera; it lets you set properties of the physical device (camera focus, white balance, etc.).

  • AVCaptureDeviceInput: a hardware input object. You create an AVCaptureDeviceInput from the corresponding AVCaptureDevice to manage the data coming in from that device.
  • AVCaptureOutput: a hardware output object that receives the output data. You usually use its subclasses AVCaptureAudioDataOutput (audio data output) and AVCaptureVideoDataOutput (video data output).
  • AVCaptureConnection: when an input and an output are added to an AVCaptureSession, the session establishes a connection between them; the connection object can be obtained from the AVCaptureOutput.
  • AVCaptureVideoPreviewLayer: the camera preview layer, which lets you watch the photo or recording in real time. Creating it requires the corresponding AVCaptureSession object, because the session carries the video input data that the layer displays.
  • AVCaptureSession: coordinates the transfer of data between inputs and outputs.
    • Role in the system: operates the hardware devices.
    • How it works: the app creates a capture session with the system, which in effect wires the app to the hardware. We only need to add the hardware input and output objects to the session; the session automatically connects the inputs to the outputs so that the devices can stream audio and video data.
    • Real-life analogy: tenant (pays money, input), intermediary (session), landlord (provides the room, output). The tenant and the landlord both register with the intermediary, the intermediary puts them in contact, and from then on they can deal with each other directly.
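As a quick illustration, the analogy above maps onto a few lines of AVFoundation code. This is only a minimal sketch of the wiring pattern (it assumes ARC and that the session is also held by a strong property, as the full capture code later in this article does):

```objectivec
#import <AVFoundation/AVFoundation.h>

// The session is the "intermediary" that wires inputs to outputs.
AVCaptureSession *session = [[AVCaptureSession alloc] init];

// The "tenant": a hardware input registered with the intermediary.
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
if ([session canAddInput:input]) {
    [session addInput:input];
}

// The "landlord": an output registered with the same intermediary.
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
if ([session canAddOutput:output]) {
    [session addOutput:output];
}

// Once running, the session connects the input to the output automatically.
[session startRunning];
```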
Capturing audio and video, step by step (see the official documentation):
    • 1. Create an AVCaptureSession object.
    • 2. Get the AVCaptureDevice for video (camera) and for audio (microphone). Note that the device object itself does not capture data; it only configures the hardware.
    • 3. From each audio/video hardware device (AVCaptureDevice), create the hardware input object (AVCaptureDeviceInput) that actually manages data input.
    • 4. Create a video data output object (AVCaptureVideoDataOutput) and set its sample buffer delegate (setSampleBufferDelegate) to receive the captured video data.
    • 5. Create an audio data output object (AVCaptureAudioDataOutput) and set its sample buffer delegate (setSampleBufferDelegate) to receive the captured audio data.
    • 6. Add the input objects (AVCaptureDeviceInput) and output objects (AVCaptureOutput) to the session (AVCaptureSession); the session automatically connects the audio input to the audio output and the video input to the video output.
    • 7. Create the video preview layer (AVCaptureVideoPreviewLayer), specify the session for it, and add the layer to the display container layer.
    • 8. Start the AVCaptureSession; data only flows from inputs to outputs once the session is running.
Capturing audio and video:

- (void)setupCaptureVideo {
    // 1. Create the capture session; it must be strongly referenced or it will be released
    AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
    _captureSession = captureSession;

    // 2. Get the camera device (front camera here)
    AVCaptureDevice *videoDevice = [self getVideoDevice:AVCaptureDevicePositionFront];

    // 3. Get the audio device
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];

    // 4. Create the corresponding video device input object
    AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:nil];
    _currentVideoDeviceInput = videoDeviceInput;

    // 5. Create the corresponding audio device input object
    AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:nil];

    // 6. Add the inputs to the session
    // Note: check whether the input can be added first; the session cannot add a nil input
    // 6.1 Add video
    if ([captureSession canAddInput:videoDeviceInput]) {
        [captureSession addInput:videoDeviceInput];
    }
    // 6.2 Add audio
    if ([captureSession canAddInput:audioDeviceInput]) {
        [captureSession addInput:audioDeviceInput];
    }

    // 7. Create the video data output
    AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    // 7.1 Set the delegate to capture video sample data
    // Note: the queue must be a serial queue and cannot be nil
    dispatch_queue_t videoQueue = dispatch_queue_create("Video Capture Queue", DISPATCH_QUEUE_SERIAL);
    [videoOutput setSampleBufferDelegate:self queue:videoQueue];
    if ([captureSession canAddOutput:videoOutput]) {
        [captureSession addOutput:videoOutput];
    }

    // 8. Create the audio data output
    AVCaptureAudioDataOutput *audioOutput = [[AVCaptureAudioDataOutput alloc] init];
    // 8.1 Set the delegate to capture audio sample data
    // Note: the queue must be a serial queue and cannot be nil
    dispatch_queue_t audioQueue = dispatch_queue_create("Audio Capture Queue", DISPATCH_QUEUE_SERIAL);
    [audioOutput setSampleBufferDelegate:self queue:audioQueue];
    if ([captureSession canAddOutput:audioOutput]) {
        [captureSession addOutput:audioOutput];
    }

    // 9. Get the video connection, used to tell video and audio data apart in the delegate
    _videoConnection = [videoOutput connectionWithMediaType:AVMediaTypeVideo];

    // 10. Add the video preview layer
    AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
    previewLayer.frame = [UIScreen mainScreen].bounds;
    [self.view.layer insertSublayer:previewLayer atIndex:0];
    _previewLayer = previewLayer;

    // 11. Start the session; data only flows from inputs to outputs after starting
    [captureSession startRunning];
}

// Get the camera device for the given position
- (AVCaptureDevice *)getVideoDevice:(AVCaptureDevicePosition)position {
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if (device.position == position) {
            return device;
        }
    }
    return nil;
}

#pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate
// Receives captured sample data; it may be either audio or video
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (_videoConnection == connection) {
        NSLog(@"Captured video data");
    } else {
        NSLog(@"Captured audio data");
    }
}
Video capture extra feature 1: switching cameras
    • Steps to switch cameras
      • 1. Get the current video device input object.
      • 2. Determine whether the current camera is front- or rear-facing.
      • 3. Determine the position to switch to.
      • 4. Get the camera device for that position.
      • 5. Create the corresponding camera input object.
      • 6. Remove the previous video input object from the session.
      • 7. Add the new video input object to the session.
Switching cameras:

- (IBAction)toggleCapture:(id)sender {
    // Get the current camera position
    AVCaptureDevicePosition curPosition = _currentVideoDeviceInput.device.position;

    // Work out the position to switch to
    AVCaptureDevicePosition togglePosition = curPosition == AVCaptureDevicePositionFront ? AVCaptureDevicePositionBack : AVCaptureDevicePositionFront;

    // Get the camera device to switch to
    AVCaptureDevice *toggleDevice = [self getVideoDevice:togglePosition];

    // Create the new camera input object
    AVCaptureDeviceInput *toggleDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:toggleDevice error:nil];

    // Remove the previous camera input from the session
    [_captureSession removeInput:_currentVideoDeviceInput];

    // Add the new camera input to the session
    [_captureSession addInput:toggleDeviceInput];

    // Record the current camera input
    _currentVideoDeviceInput = toggleDeviceInput;
}
Video capture extra feature 2: focus cursor
    • Focus cursor steps
      • 1. Tap the preview screen.
      • 2. Convert the tapped point to a point on the camera; this conversion must go through the video preview layer ( AVCaptureVideoPreviewLayer ).
      • 3. Position the focus cursor image at the tapped point and animate it.
      • 4. Set the camera device's focus mode and exposure mode. (Note: the configuration must be locked with lockForConfiguration first, otherwise an error occurs.)
Tap the screen to show the focus view:

- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    // Get the tap location
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self.view];

    // Convert the tap location to a point on the camera
    CGPoint cameraPoint = [_previewLayer captureDevicePointOfInterestForPoint:point];

    // Position the focus cursor
    [self setFocusCursorWithPoint:point];

    // Set focus and exposure
    [self focusWithMode:AVCaptureFocusModeAutoFocus exposureMode:AVCaptureExposureModeAutoExpose atPoint:cameraPoint];
}

/**
 * Set the focus cursor position
 *
 * @param point cursor position
 */
- (void)setFocusCursorWithPoint:(CGPoint)point {
    self.focusCursorImageView.center = point;
    self.focusCursorImageView.transform = CGAffineTransformMakeScale(1.5, 1.5);
    self.focusCursorImageView.alpha = 1.0;
    [UIView animateWithDuration:1.0 animations:^{
        self.focusCursorImageView.transform = CGAffineTransformIdentity;
    } completion:^(BOOL finished) {
        self.focusCursorImageView.alpha = 0;
    }];
}

/**
 * Set focus and exposure
 */
- (void)focusWithMode:(AVCaptureFocusMode)focusMode exposureMode:(AVCaptureExposureMode)exposureMode atPoint:(CGPoint)point {
    AVCaptureDevice *captureDevice = _currentVideoDeviceInput.device;

    // Lock the configuration
    [captureDevice lockForConfiguration:nil];

    // Set focus
    if ([captureDevice isFocusModeSupported:focusMode]) {
        [captureDevice setFocusMode:focusMode];
    }
    if ([captureDevice isFocusPointOfInterestSupported]) {
        [captureDevice setFocusPointOfInterest:point];
    }

    // Set exposure
    if ([captureDevice isExposureModeSupported:exposureMode]) {
        [captureDevice setExposureMode:exposureMode];
    }
    if ([captureDevice isExposurePointOfInterestSupported]) {
        [captureDevice setExposurePointOfInterest:point];
    }

    // Unlock the configuration
    [captureDevice unlockForConfiguration];
}
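One caveat about the focus code above: it passes nil to lockForConfiguration:, so a failure to lock the device goes unnoticed. A safer sketch (using the same _currentVideoDeviceInput ivar as above; the error handling shown is a suggested pattern, not part of the original demo) checks the returned BOOL and NSError:

```objectivec
// lockForConfiguration: returns NO and fills in the NSError when the
// device cannot be locked (e.g. it is being configured elsewhere).
AVCaptureDevice *device = _currentVideoDeviceInput.device;
NSError *error = nil;
if ([device lockForConfiguration:&error]) {
    if ([device isFocusPointOfInterestSupported]) {
        [device setFocusPointOfInterest:point];
    }
    // Always balance a successful lock with an unlock
    [device unlockForConfiguration];
} else {
    NSLog(@"Could not lock device for configuration: %@", error);
}
```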
Conclusion

Follow-up posts will share more about live streaming. I hope to teach every reader to build a live app from scratch, and the demo will be improved step by step.
Demo: click to download

    • The FFmpeg library is fairly large, about 100 MB.
    • I originally wanted to upload all the code, but after an hour the upload still had not succeeded, so I gave up.
    • Here is another option, which requires you to import the ijkplayer library yourself:
    • After downloading the demo, opening YZLiveApp.xcworkspace reports an error.

Problem opening YZLiveApp.xcworkspace
    • Running pod install will fix it.
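Assuming CocoaPods is already installed and the demo's root directory contains the Podfile (the directory name below is an assumption based on the workspace name), the fix looks like this:

```shell
# From the demo's root directory (the one containing the Podfile)
cd YZLiveApp
pod install
# Then open the workspace, not the .xcodeproj
open YZLiveApp.xcworkspace
```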

(Screenshot of the error: Snip20160830_12.png)
    • Download the ijkplayer library (click to download).
    • Drag ijkplayer directly into the same directory as Classes, then run the program; it will succeed.

(Screenshot: ijkplayer dragged into the same directory as Classes)
    • Note: you do not need to open the project and drag ijkplayer into it; just copy the ijkplayer library into the same directory as Classes.
    • Wrong approach: do not do it as shown below.

(Screenshot of the wrong approach: Snip20160830_14.png)



    Author: 袁峥 Seemygo (Jianshu author)
    Original link: http://www.jianshu.com/p/c71bfda055fa
    Copyright belongs to the author. Please contact the author for authorization to reproduce, and credit the "Jianshu author".

