The basic idea of sending pictures and audio files is:
First convert the picture into binary data (NSData), base64-encode that data into a string, add a child node to the outgoing message, and set the node's stringValue to the encoded string. On the receiving side, after a message is taken out of the stream, first judge whether it is picture information; if it is, find the child node by the name you set earlier, take out its stringValue (which is the base64 string), decode it back into NSData, and rebuild the image from it.
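The round trip described above can be sketched independently of XMPPFramework. Below is a minimal illustration in Python rather than Objective-C, only to show the encode/embed/decode logic; the recipient address is a placeholder, a real XMPP stanza needs proper namespaces, and the node name "attachment" and body value "image" simply match what this post uses later.

```python
import base64
import xml.etree.ElementTree as ET

def build_message_stanza(data: bytes, body_name: str) -> str:
    """Build a chat stanza whose <attachment> child carries base64-encoded data."""
    message = ET.Element("message", {"type": "chat", "to": "friend@example.com"})
    ET.SubElement(message, "body").text = body_name
    # The child node's string value is the base64 form of the binary payload
    ET.SubElement(message, "attachment").text = base64.b64encode(data).decode("ascii")
    return ET.tostring(message, encoding="unicode")

def extract_attachment(stanza: str) -> bytes:
    """Receiving side: take the child node's stringValue back out and decode it."""
    message = ET.fromstring(stanza)
    return base64.b64decode(message.find("attachment").text)

payload = b"\x89PNG fake image bytes"
stanza = build_message_stanza(payload, "image")
assert extract_attachment(stanza) == payload  # round trip recovers the bytes
```

The same shape works for audio: only the payload bytes and the body string change.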
Previous posts in this series:
XMPP Notes: Sending and displaying chat messages http://www.cnblogs.com/dsxniubility/p/4307073.html
XMPP Notes: Quick environment configuration (with installation package) http://www.cnblogs.com/dsxniubility/p/4304570.html
XMPP Notes: Importing and introducing the XMPPFramework http://www.cnblogs.com/dsxniubility/p/4307057.html
XMPP Notes: User network connection and friend management http://www.cnblogs.com/dsxniubility/p/4307066.html
One. Sending pictures
If you are not reading this article on Dong Platinum's blog on cnblogs, please click through to view the original.
The picture is sent by tapping the plus button in the chat interface, which pops up the photo album; tapping a picture in the album dismisses it and sends the picture.
- (IBAction)setPhoto {
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.delegate = self;
    [self presentViewController:picker animated:YES completion:nil];
}
This is the plus button's tap method. Set the UIImagePickerController delegate and conform to the corresponding protocols.
Note that in addition to UIImagePickerControllerDelegate, the class must also conform to UINavigationControllerDelegate.
The following delegate method fires after a picture is tapped in the pop-up album; these are commonly used methods and need no extra explanation.
#pragma mark - ******************** image picker delegate method
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = info[UIImagePickerControllerOriginalImage];
    NSData *data = UIImagePNGRepresentation(image);
    [self sendMessageWithData:data bodyName:@"image"];
    [self dismissViewControllerAnimated:YES completion:nil];
}
sendMessageWithData:bodyName: is a custom method.
Its job is to take binary data plus a type name for that data and send the file out.
The bodyName parameter lets the caller pass in a type name, so that picture messages and audio messages can be told apart.
The code in the method is as follows:
/** Send a binary file */
- (void)sendMessageWithData:(NSData *)data bodyName:(NSString *)name {
    XMPPMessage *message = [XMPPMessage messageWithType:@"chat" to:self.chatJID];
    [message addBody:name];
    // Convert to base64 encoding
    NSString *base64Str = [data base64EncodedStringWithOptions:0];
    // Set the node content
    XMPPElement *attachment = [XMPPElement elementWithName:@"attachment" stringValue:base64Str];
    // Attach the child node
    [message addChild:attachment];
    // Send the message
    [[SXXMPPTools sharedXMPPTools].xmppStream sendElement:message];
}
The flow in this method is, as described at the start: encode first, then send. This custom method is reused later for sending audio.
Two. Displaying pictures
In the tableView data source method, add a layer of judgment before the retrieved message is assigned to the cell: if it is picture information, assign it with the following method.
If you have forgotten the basic sending flow, see the plain-text chat message post linked at the top.
if ([message.body isEqualToString:@"image"]) {
    XMPPMessage *msg = message.message;
    for (XMPPElement *node in msg.children) {
        // Take out the node's string and decode it
        NSString *base64Str = node.stringValue;
        NSData *data = [[NSData alloc] initWithBase64EncodedString:base64Str options:0];
        UIImage *image = [[UIImage alloc] initWithData:data];
        // Display the picture in the label via a text attachment
        NSTextAttachment *attach = [[NSTextAttachment alloc] init];
        attach.image = [image scaleImageWithWidth:200];
        NSAttributedString *attachStr = [NSAttributedString attributedStringWithAttachment:attach];
        // Use the label's attributed-text property instead of the plain-text assignment
        cell.messageLabel.attributedText = attachStr;
        [self.view endEditing:YES];
    }
}
This uses a scaleImageWithWidth: method, which takes a maximum allowed width. Inside, it first checks whether the picture exceeds that width; if not, the picture keeps its original size. If the picture is wider than the maximum, its overall size is scaled down proportionally so that its width exactly equals the maximum. This uses Quartz2D graphics-context knowledge.
This method can be written as a UIImage category; the code is as follows.
/** Scale the image down to the specified width */
- (UIImage *)scaleImageWithWidth:(CGFloat)width {
    if (self.size.width < width || width <= 0) {
        return self;
    }
    CGFloat scale = self.size.width / width;
    CGFloat height = self.size.height / scale;
    CGRect rect = CGRectMake(0, 0, width, height);
    // Open a context of the target size
    UIGraphicsBeginImageContext(rect.size);
    // Draw the image into the specified area
    [self drawInRect:rect];
    // Get the drawing result out of the context
    UIImage *resultImage = UIGraphicsGetImageFromCurrentImageContext();
    // Close the context and return the result
    UIGraphicsEndImageContext();
    return resultImage;
}
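The size arithmetic in that category method can be checked on its own: divide the original width by the target width to get the scale factor, then divide the height by the same factor. Here is a quick sketch of just the math (in Python for easy checking, not the UIImage drawing code):

```python
def scaled_size(width: float, height: float, max_width: float) -> tuple:
    """Return the (width, height) that scaleImageWithWidth: would draw at."""
    if width < max_width or max_width <= 0:
        return (width, height)  # already small enough: keep the original size
    scale = width / max_width  # same factor applied to both dimensions
    return (max_width, height / scale)

print(scaled_size(400, 300, 200))  # → (200, 150.0), aspect ratio preserved
```

Because both dimensions are divided by the same factor, the aspect ratio never changes; only the overall size shrinks.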
Three. Sending audio
Sending audio shares some similarities with sending pictures, and has some differences. The core idea of audio transmission is: press the button to start recording, release to stop and save the recording. So the button needs two listeners, one for press (TouchDown) and one for release (TouchUpInside). There is a catch here: a small custom button cannot handle TouchDown and TouchUpInside independently. The goal is one action while the finger is held down and another on release, but with a custom button both fire together the moment the finger lifts (unless the button is particularly large; a small one cannot listen to both events separately). Apple's own system buttons work regardless of size, for example UIButtonTypeContactAdd (the small plus button). So the sound button is given a hidden input view: tapping it pops up a system button that can handle both events at once, and that button controls starting and stopping the recording.
Once saved, the recording is converted into NSData and base64-encoded, then added as a child node, much like a picture. On receipt, the stringValue is likewise taken out of the node and decoded. When the sound is displayed in a tableView cell, tapping the cell triggers playback of the audio, and some style changes inside the cell can also be controlled while it plays.
First, wire up the tap event of the sound button in the interface.
- (IBAction)setRecord {
    // Switch focus to pop up the recording button
    [self.recordText becomeFirstResponder];
}
It is actually just a casually created UITextField; tapping the sound button gives it focus, which pops up the button set as its input view.
The lazy loading of this text field is as follows.
- (UITextField *)recordText {
    if (_recordText == nil) {
        _recordText = [[UITextField alloc] init];
        UIButton *btn = [UIButton buttonWithType:UIButtonTypeContactAdd];
        _recordText.inputView = btn;
        [btn addTarget:self action:@selector(startRecord) forControlEvents:UIControlEventTouchDown];
        [btn addTarget:self action:@selector(stopRecord) forControlEvents:UIControlEventTouchUpInside];
        [self.inputMessageView addSubview:_recordText];
    }
    return _recordText;
}
For the various audio-file handling, it is best to pull the code out into a tool class and call it when needed; other projects can then drop the class in and use it directly.
The properties it needs are as follows.
@interface SXRecordTools () <AVAudioPlayerDelegate>
/** Recorder */
@property (nonatomic, strong) AVAudioRecorder *recorder;
/** Recording file URL */
@property (nonatomic, strong) NSURL *recordURL;
/** Player */
@property (nonatomic, strong) AVAudioPlayer *player;
/** Playback-finished callback */
@property (nonatomic, copy) void (^playCompletion)();
@end
The start-recording and stop-recording methods are as follows.
/** Start recording */
- (void)startRecord {
    [self.recorder record];
}

/** Stop recording */
- (void)stopRecordSuccess:(void (^)(NSURL *url, NSTimeInterval time))success andFailed:(void (^)())failed {
    // currentTime can only be read here, before the recorder stops
    NSTimeInterval time = self.recorder.currentTime;
    [self.recorder stop];
    if (time < 1.5) {
        if (failed) {
            failed();
        }
    } else {
        if (success) {
            success(self.recordURL, time);
        }
    }
}
Starting and stopping the recording are methods provided by the framework. The main extra work is judging the audio length: a recording shorter than 1.5 seconds triggers the failure block.
Note that the recorder's currentTime must be read inside this method, before [self.recorder stop] is called; once the recorder has stopped, the value can no longer be obtained.
Then, in the controller, the tool-class methods are invoked from the press and release listeners of the small plus button.
#pragma mark - ******************** recording methods
- (void)startRecord {
    NSLog(@"start recording");
    [[SXRecordTools sharedRecorder] startRecord];
}

- (void)stopRecord {
    NSLog(@"stop recording");
    [[SXRecordTools sharedRecorder] stopRecordSuccess:^(NSURL *url, NSTimeInterval time) {
        // Send the sound data
        NSData *data = [NSData dataWithContentsOfURL:url];
        [self sendMessageWithData:data bodyName:[NSString stringWithFormat:@"audio:%.1f seconds", time]];
    } andFailed:^{
        [[[UIAlertView alloc] initWithTitle:@"Prompt" message:@"Time too short" delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil] show];
    }];
}
As you can see, when sending a voice message the duration is formatted into the bodyName parameter of sendMessageWithData:, while the audio bytes themselves are stored in the message's child node.
Four. Displaying audio files
As with pictures, check whether the retrieved message is audio information; if so, traverse the child nodes, take out the string and decode it, and trim off the "audio:" prefix so the tableView cell shows only the duration.
else if ([message.body hasPrefix:@"audio"]) {
    XMPPMessage *msg = message.message;
    for (XMPPElement *node in msg.children) {
        NSString *base64Str = node.stringValue;
        NSData *data = [[NSData alloc] initWithBase64EncodedString:base64Str options:0];
        // Strip the "audio:" prefix so only the duration is shown
        NSString *newStr = [message.body substringFromIndex:6];
        cell.messageLabel.text = newStr;
        cell.audioData = data;
    }
}
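The body string built on the sending side and the substringFromIndex:6 trim used here fit together because the prefix "audio:" is exactly 6 characters long, so index 6 leaves only the duration text. A quick sanity check of that string logic (sketched in Python; mirrors, not replaces, the Objective-C above):

```python
def make_body(seconds: float) -> str:
    """Mirror of the bodyName string built on the sending side."""
    return "audio:%.1f seconds" % seconds

def display_text(body: str) -> str:
    """Mirror of [message.body substringFromIndex:6]."""
    return body[6:]

body = make_body(3.27)
assert body.startswith("audio")          # the hasPrefix: check
assert display_text(body) == "3.3 seconds"  # only the duration remains
```

If the prefix ever changed, the index in substringFromIndex: would have to change with it, which is why both sides must agree on the exact format string.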
audioData is a dedicated property for storing the sound data. However, because tableView cells are reused, a freshly reused cell must not keep the previous row's audio; to avoid conflicts and overlap, it is recommended to add a line when the cell is first dequeued:
cell.audiodata = nil;
Five. Playing sound files
Although the framework itself has a sound-file playback method, a lot of extra handling is still needed, so it is recommended to first write a method in the tool class that plays a data file and takes a completion callback, namely playData:completion:. In the playback method, first check whether a sound is already playing; if so, stop it before starting the new one. The player's delegate is then set inside the method, so the delegate method fires when the sound file finishes playing. The completion block passed in must therefore first be recorded in a member variable, and then executed in the delegate method that runs when playback finishes.
- (void)playData:(NSData *)data completion:(void (^)())completion {
    // If something is already playing, stop it first
    if (self.player.isPlaying) {
        [self.player stop];
    }
    // Record the completion block for later
    self.playCompletion = completion;
    // Create the player and listen for playback completion via the delegate
    self.player = [[AVAudioPlayer alloc] initWithData:data error:NULL];
    self.player.delegate = self;
    [self.player play];
}
The delegate method that fires when the sound file finishes playing executes the saved block:
#pragma mark - ******************** playback-finished delegate method
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag {
    if (self.playCompletion) {
        self.playCompletion();
    }
}
With the tool-class method finished, it can be called from outside. Add a touch method to the custom SXChatCell: the label text is the default color, changes to red when tapped, and the playback-completion callback then restores it to the default color.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // If the cell has audio data, play it directly
    if (self.audioData != nil) {
        // Play the audio and mark the cell as playing
        self.messageLabel.textColor = [UIColor redColor];
        // If a block held by a singleton references self, be sure to use a weak reference
        __weak SXChatCell *weakSelf = self;
        [[SXRecordTools sharedRecorder] playData:self.audioData completion:^{
            weakSelf.messageLabel.textColor = [UIColor blackColor];
        }];
    }
}
A red cell indicates the one currently playing.
With that, sending and displaying picture and audio messages is complete.
XMPP Notes: Sending picture messages and voice messages