XMPP Notes: Sending Image and Audio Messages



The basic idea of sending images and audio files is:

First, convert the image (or audio clip) to binary data, then Base64-encode that data into a string. Add a child node to the outgoing message and set the node's stringValue to the encoded string. On the receiving side, when a message arrives, inspect the message body to determine whether it is an image message. If it is, look up the child node by the name you set earlier and read its stringValue — that is the Base64 string — then decode it back to binary data.

Previous posts in this series:

XMPP notes: sending and displaying chat messages http://www.cnblogs.com/dsxniubility/p/4307073.html

XMPP notes: quick environment configuration (with installation package) http://www.cnblogs.com/dsxniubility/p/4304570.html

XMPP notes: importing and introducing XMPPFramework http://www.cnblogs.com/dsxniubility/p/4307057.html

XMPP notes: user network connection and friend management http://www.cnblogs.com/dsxniubility/p/4307066.html

I. Sending Images

If you are not reading this article on Dong Boran's blog, click through to view the original.

Tapping the plus button on the page opens the photo album; tapping an image dismisses the album and sends that image.

- (IBAction)setPhoto {
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.delegate = self;
    [self presentViewController:picker animated:YES completion:nil];
}

This method handles the plus-button tap. It sets the UIImagePickerController's delegate, so the controller must adopt the corresponding protocols.

Note that when adopting UIImagePickerControllerDelegate, you must also adopt UINavigationControllerDelegate.

The following is the delegate method triggered by tapping an image in the pop-up album; the commonly used steps are shown here.

#pragma mark - ********************* image picker delegate method
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = info[UIImagePickerControllerOriginalImage];
    NSData *data = UIImagePNGRepresentation(image);
    [self sendMessageWithData:data bodyName:@"image"];
    [self dismissViewControllerAnimated:YES completion:nil];
}

sendMessageWithData:bodyName: is a custom method.

It takes a binary data object and a type name, then sends the file.

Every call passes a bodyName — a type string that distinguishes image messages from audio messages.

The code in the method is as follows:

/** Send a binary file */
- (void)sendMessageWithData:(NSData *)data bodyName:(NSString *)name {
    XMPPMessage *message = [XMPPMessage messageWithType:@"chat" to:self.chatJID];
    [message addBody:name];
    // Convert the data to a Base64-encoded string
    NSString *base64str = [data base64EncodedStringWithOptions:0];
    // Set the node content
    XMPPElement *attachment = [XMPPElement elementWithName:@"attachment" stringValue:base64str];
    // Attach the child node
    [message addChild:attachment];
    // Send the message
    [[SXXMPPTools sharedXMPPTools].xmppStream sendElement:message];
}

The flow inside this method is the one described at the beginning: encode, then send. The same custom method is reused for sending audio.
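For reference, a message built this way would look roughly like the following stanza on the wire. The recipient JID and the truncated Base64 payload are placeholders; the attachment element name matches the code above:

```xml
<message type="chat" to="friend@example.com">
  <body>image</body>
  <attachment>iVBORw0KGgoAAAANSUhEUgAA...</attachment>
</message>
```

Binary data cannot travel in XML as-is, which is why the payload is Base64-encoded into plain text first.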

 

 

II. Displaying Images

In the tableView data-source method, a layer of judgment is added before the retrieved message is assigned to the cell. For an image message, assign it using the following approach.

 

For the basic display flow of ordinary text messages, see the earlier post on sending and displaying chat information; only the image branch is shown here:

if ([message.body isEqualToString:@"image"]) {
    XMPPMessage *msg = message.message;
    for (XMPPElement *node in msg.children) {
        // Decode the Base64 string back to binary data
        NSString *base64str = node.stringValue;
        NSData *data = [[NSData alloc] initWithBase64EncodedString:base64str options:0];
        UIImage *image = [[UIImage alloc] initWithData:data];
        // Display the image in the label via a text attachment
        NSTextAttachment *attach = [[NSTextAttachment alloc] init];
        attach.image = [image scaleImageWithWidth:200];
        NSAttributedString *attachStr = [NSAttributedString attributedStringWithAttachment:attach];
        // Assign through attributedText; the plain-text assignment used for ordinary cells is skipped
        cell.messageLabel.attributedText = attachStr;
        [self.view endEditing:YES];
    }
}

A scaleImageWithWidth: method is used here. It takes the maximum allowed width and checks whether the image is wider than that. If not, the image is returned as-is; if it is, the whole image is scaled down proportionally until its width equals the maximum. This uses Quartz 2D context drawing.

The method can be written as a category on UIImage. The code is as follows:

/** Scale the image down to the specified width */
- (UIImage *)scaleImageWithWidth:(CGFloat)width {
    if (self.size.width < width || width <= 0) {
        return self;
    }
    CGFloat scale = self.size.width / width;
    CGFloat height = self.size.height / scale;
    CGRect rect = CGRectMake(0, 0, width, height);
    // Open an image context with the target size
    UIGraphicsBeginImageContext(rect.size);
    // Draw the image into the specified rect
    [self drawInRect:rect];
    // Fetch the rendered result from the context
    UIImage *resultImage = UIGraphicsGetImageFromCurrentImageContext();
    // Close the context
    UIGraphicsEndImageContext();
    return resultImage;
}

 

 

III. Sending Audio

Sending audio is similar to sending images, with a few differences. The core idea: press the button to start recording, release it to stop and save the recording. That means handling both the button's touch-down and touch-up events. However, a small custom UIButton cannot reliably receive both TouchDown and TouchUpInside — press and release will not both fire unless the button is very large. The system buttons Apple provides can, no matter how small, for example the one from buttonWithType:UIButtonTypeContactAdd (the small plus button). So tapping the audio button brings up an inputView containing such a system button, which handles both events and is used to start and stop the recording.

After the recording is saved, it is converted to binary data, Base64-encoded, added as a child node, and sent just like an image. On receipt, the node's stringValue is decoded the same way. The tableView cell displays the audio's duration, and tapping the cell triggers playback; the cell's style can be changed while the audio plays.

First, hook up the audio button's tap event in the interface.

- (IBAction)setRecord {
    // Shift focus so the recording button's inputView appears
    [self.recordText becomeFirstResponder];
}

In fact, this is just a textField that becomes first responder when tapped; the button then appears in the input area at the bottom.

The lazy loading of textField is as follows:

- (UITextField *)recordText {
    if (_recordText == nil) {
        _recordText = [[UITextField alloc] init];
        UIButton *btn = [UIButton buttonWithType:UIButtonTypeContactAdd];
        _recordText.inputView = btn;
        [btn addTarget:self action:@selector(startRecord) forControlEvents:UIControlEventTouchDown];
        [btn addTarget:self action:@selector(stopRecord) forControlEvents:UIControlEventTouchUpInside];
        [self.inputMessageView addSubview:_recordText];
    }
    return _recordText;
}

 

It is best to extract the audio-handling operations into a tool class that can be called when needed and dragged into other projects later.

First, the following properties are needed.

@interface SXRecordTools () <AVAudioPlayerDelegate>
/** Recorder */
@property (nonatomic, strong) AVAudioRecorder *recorder;
/** Recording file URL */
@property (nonatomic, strong) NSURL *recordURL;
/** Player */
@property (nonatomic, strong) AVAudioPlayer *player;
/** Callback on playback completion */
@property (nonatomic, copy) void (^palyCompletion)();
@end

The start and end recording methods are as follows:

/** Start recording */
- (void)startRecord {
    [self.recorder record];
}

/** Stop recording */
- (void)stopRecordSuccess:(void (^)(NSURL *url, NSTimeInterval time))success andFailed:(void (^)())failed {
    // currentTime can only be read here, before the recorder stops
    NSTimeInterval time = self.recorder.currentTime;
    [self.recorder stop];
    if (time < 1.5) {
        if (failed) {
            failed();
        }
    } else {
        if (success) {
            success(self.recordURL, time);
        }
    }
}

These two methods start and stop recording. The stop method mainly checks whether the recording lasted less than 1.5 seconds and, if so, calls the failure block.

Note that recorder.currentTime is the length of the current recording and must be read before calling stop; once the recorder has stopped, the value can no longer be obtained.

Then, in the controller, the press and release handlers of the small plus button call the tool-class methods.

#pragma mark - ********************** recording methods
- (void)startRecord {
    NSLog(@"Start recording");
    [[SXRecordTools sharedRecorder] startRecord];
}

- (void)stopRecord {
    NSLog(@"Stop recording");
    [[SXRecordTools sharedRecorder] stopRecordSuccess:^(NSURL *url, NSTimeInterval time) {
        // Send the audio data
        NSData *data = [NSData dataWithContentsOfURL:url];
        [self sendMessageWithData:data bodyName:[NSString stringWithFormat:@"audio: %.1f seconds", time]];
    } andFailed:^{
        [[[UIAlertView alloc] initWithTitle:@"Prompt"
                                    message:@"Recording too short"
                                   delegate:nil
                          cancelButtonTitle:@"OK"
                          otherButtonTitles:nil] show];
    }];
}

Notice that when sendMessageWithData: is called, the audio duration is embedded in the bodyName parameter. That string is stored in the message body, the encoded audio goes into the child node, and the message is sent.
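Under the same assumptions as before (placeholder JID, truncated Base64 payload), an audio message's stanza would differ from the image one only in what the body carries:

```xml
<message type="chat" to="friend@example.com">
  <body>audio: 2.3 seconds</body>
  <attachment>UklGRiQAAABXQVZFZm10...</attachment>
</message>
```

The receiver branches on the body's prefix, so the duration shown in the cell comes straight from this string rather than from decoding the audio.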

 

 

IV. Displaying Audio

This works the same way as images. First determine whether the message retrieved for this row is an audio message. If so, traverse the nodes, take out the string and decode it, and trim the "audio:" prefix from the body so the tableView cell displays only the duration.

else if ([message.body hasPrefix:@"audio"]) {
    XMPPMessage *msg = message.message;
    for (XMPPElement *node in msg.children) {
        NSString *base64str = node.stringValue;
        NSData *data = [[NSData alloc] initWithBase64EncodedString:base64str options:0];
        // Strip the "audio:" prefix so the cell shows only the duration
        NSString *newstr = [message.body substringFromIndex:6];
        cell.messageLabel.text = newstr;
        cell.audioData = data;
    }
}

The audioData property stores the audio payload on the cell. However, cells are reused, so stale audio data could linger in a reused cell and clash with its new content. It is recommended to clear it as soon as the cell is dequeued:

cell.audioData = nil;

 

 

V. Playing Audio

Although the framework has its own way of playing audio, several extra steps are needed, so it is best to write a playData:completion: method in the tool class that plays a data object and runs a callback once playback finishes. Inside, first check whether a sound is already playing, and stop it if so. Then set the player's delegate in the method, so that a delegate method fires when the sound file finishes playing. The completion block passed in must therefore be stored in a member variable and executed in that delegate method.

- (void)playData:(NSData *)data completion:(void (^)())completion {
    // If something is already playing, stop it first
    if (self.player.isPlaying) {
        [self.player stop];
    }
    // Keep the completion block for the delegate callback
    self.palyCompletion = completion;
    // Create a new player and listen for playback completion
    self.player = [[AVAudioPlayer alloc] initWithData:data error:NULL];
    self.player.delegate = self;
    [self.player play];
}

 

The delegate method then executes the saved block after the audio file finishes playing.

#pragma mark - ********************* playback-finished delegate method
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag {
    if (self.palyCompletion) {
        self.palyCompletion();
    }
}

 

After writing the tool-class methods, call them from the cell. Add a touch handler to the custom SXChatCell: the label starts in the default color, turns red when tapped to play, and the playback-completion callback restores the default color.

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // If audio data exists, play it directly
    if (self.audioData != nil) {
        // Turn the label red while playing
        self.messageLabel.textColor = [UIColor redColor];
        // The singleton's block captures self, so use weakSelf to avoid a retain cycle
        __weak SXChatCell *weakSelf = self;
        [[SXRecordTools sharedRecorder] playData:self.audioData completion:^{
            weakSelf.messageLabel.textColor = [UIColor blackColor];
        }];
    }
}

A red label indicates the cell whose audio is playing.

 


With that, sending images and audio files is complete.
