iOS Remote Control: Gesture Monitoring and UI Implementation

Source: Internet
Author: User



This article walks through an example page that implements gesture recognition similar to a gesture-based TV remote control.

The finished effect looks as follows (screenshots omitted):

Touch handling is implemented by overriding the system's touch event methods.

The touch points are cached in an array and analyzed later.



- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (!self.allowsInteraction) return;

    UITouch *touch = [touches anyObject];
    CGPoint start = [touch locationInView:self.view];

    [_gestureManager beginMonitorWithPoint:start];
    [self showLightAtPoint:start];

    NSLog(@"touch begin");
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (!self.allowsInteraction) return;

    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self.view];

    __weak typeof(&*self) weakSelf = self;
    [_gestureManager updateMonitorWithPoint:point action:^{
        [weakSelf showLightAtPoint:point];
    }];
}

When a touch begins or moves, a dedicated class is used to record points and manage gesture-related behavior: the _gestureManager member.




- (void)beginMonitorWithPoint:(CGPoint)point
{
    [self addPoint:point];
}

- (void)updateMonitorWithPoint:(CGPoint)point action:(dispatch_block_t)actionBlock
{
    _curTime++;
    int delta = (int)(_curTime - _lastSpawnTime);

    if (delta >= TIME_GAP) {
        if (actionBlock) {
            actionBlock();
        }

        _lastSpawnTime = _curTime;
        [self addPoint:point];
    }
}

While monitoring, we do not need to record a point and render a light effect for every single touch point the system delivers. Instead, counter members and a TIME_GAP threshold throttle the sampling, controlling how dense the recorded points are.
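The gap-based throttling used here amounts to a simple counter gate. A minimal C sketch of that idea (the TIME_GAP value below is illustrative, not the project's actual constant):

```c
#include <stdbool.h>

#define TIME_GAP 3  /* hypothetical: accept roughly every 3rd tick */

typedef struct { long curTime, lastSpawnTime; } Throttle;

/* Returns true when enough ticks have elapsed since the last
 * accepted event, mirroring updateMonitorWithPoint:action:. */
static bool throttleShouldAccept(Throttle *t) {
    t->curTime++;
    if (t->curTime - t->lastSpawnTime >= TIME_GAP) {
        t->lastSpawnTime = t->curTime;
        return true;
    }
    return false;
}
```

Fed nine consecutive ticks, this gate accepts only three of them, which is exactly the point-density control the article describes.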



- (void)endMonitor
{
    _curTime = 0;
    _lastSpawnTime = 0;
    [self pathAnalysis];
    [self.pointPath removeAllObjects];
}



When the touch ends, monitoring ends as well: the counter members are reset, the gesture is analyzed, and the point array is cleared.

The following describes how to analyze gestures.

Compute the difference between the start point and the end point, compare the x and y components to determine the direction, and then check the midpoint to decide whether the track is a special gesture (back, menu, and so on).



- (void)pathAnalysis
{
    int count = self.pointPath.count;
    NSLog(@"points count: %d", count);

    if (count > JUDGE_CONTAIN) {
        goto SendNone;
    } else if (count == 1) {
        [self sendDelegateResult:MonitorResultTypeChosen];
    } else {
        CGPoint start = valueToPoint([self.pointPath firstObject]);
        CGPoint end   = valueToPoint([self.pointPath lastObject]);
        int deltaX = pSub(start, end).x;
        int deltaY = pSub(start, end).y;

        int midIndex = count / 2;
        CGPoint mid = valueToPoint(self.pointPath[midIndex]);

        if (abs(deltaX) > JUDGE_X && abs(deltaY) < JUDGE_Y) { // horizontal direction

            if (deltaX < 0) { // right direction
                if (![self checkIsAlwaysCorrectDirection:MonitorResultTypeRight start:0 end:self.pointPath.count - 1]) goto SendNone;
                if (pSub(start, mid).y > JUDGE_Y / 2) {
                    if ([self checkTrackIsMenu]) [self sendDelegateResult:MonitorResultTypeMenu];
                    else goto SendNone;
                } else if (abs(pSub(start, mid).y) < JUDGE_Y) {
                    [self sendDelegateResult:MonitorResultTypeRight];
                } else goto SendNone;
            } else { // left
                if (![self checkIsAlwaysCorrectDirection:MonitorResultTypeLeft start:0 end:self.pointPath.count - 1]) goto SendNone;

                if (pSub(start, mid).y > JUDGE_Y / 2) {
                    if ([self checkTrackIsMenu]) {
                        [self sendDelegateResult:MonitorResultTypeMenu];
                    } else goto SendNone;
                } else if (abs(pSub(start, mid).y) < JUDGE_Y) {
                    [self sendDelegateResult:MonitorResultTypeLeft];
                } else goto SendNone;
            }
        } else if (abs(deltaX) < JUDGE_X && abs(deltaY) > JUDGE_Y) { // vertical direction

            if (deltaY < 0) { // down
                if (![self checkIsAlwaysCorrectDirection:MonitorResultTypeDownwards start:0 end:self.pointPath.count - 1]) goto SendNone;
                if (pSub(start, mid).x > JUDGE_X / 2) {
                    if ([self checkTrackIsBack]) [self sendDelegateResult:MonitorResultTypeBack];
                    else goto SendNone;
                } else if (abs(pSub(start, mid).x) < JUDGE_X) {
                    [self sendDelegateResult:MonitorResultTypeDownwards];
                } else goto SendNone;
            } else { // up
                if (![self checkIsAlwaysCorrectDirection:MonitorResultTypeUpwards start:0 end:self.pointPath.count - 1]) goto SendNone;
                if (abs(pSub(start, mid).x) < JUDGE_X) [self sendDelegateResult:MonitorResultTypeUpwards];
                else goto SendNone;
            }
        } else goto SendNone;
    }
    return;

SendNone:
    [self sendDelegateResult:MonitorResultTypeNone];
    return;
}
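Stripped of the UIKit specifics, the endpoint-delta classification above can be sketched in portable C. The thresholds and names below are illustrative, not the project's actual declarations:

```c
#include <math.h>

/* Hypothetical thresholds -- the real project tunes its own JUDGE_X / JUDGE_Y. */
#define JUDGE_X 50
#define JUDGE_Y 50

typedef enum { DirNone, DirLeft, DirRight, DirUp, DirDown } Direction;

/* Classify a swipe from its start and end points, using the article's
 * delta = start - end convention: deltaX < 0 means the finger moved right,
 * and since screen y grows downward, deltaY < 0 means the finger moved down. */
static Direction classify(double sx, double sy, double ex, double ey) {
    double dx = sx - ex;
    double dy = sy - ey;
    if (fabs(dx) > JUDGE_X && fabs(dy) < JUDGE_Y)   /* horizontal */
        return dx < 0 ? DirRight : DirLeft;
    if (fabs(dx) < JUDGE_X && fabs(dy) > JUDGE_Y)   /* vertical */
        return dy < 0 ? DirDown : DirUp;
    return DirNone;  /* too short or too diagonal to decide */
}
```

Requiring one delta to be large while the other stays small is what rejects diagonal or ambiguous tracks up front.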

A few helper functions are also used:



UIKIT_STATIC_INLINE UIImageView * quickImageView(NSString *imgName) {
    UIImageView *iv = [[UIImageView alloc] initWithImage:ImageCache(imgName)];
    return iv;
}

UIKIT_STATIC_INLINE CGPoint pSub(CGPoint a, CGPoint b) {
    return CGPointMake(a.x - b.x, a.y - b.y);
}

UIKIT_STATIC_INLINE NSValue * pointToValue(CGPoint a) {
    return [NSValue valueWithCGPoint:a];
}

UIKIT_STATIC_INLINE CGPoint valueToPoint(NSValue *v) {
    return [v CGPointValue];
}

Because these functions are called frequently, they are declared static inline.



The method that checks whether the track consistently moves in a given direction is as follows:


- (BOOL)checkIsAlwaysCorrectDirection:(MonitorResultType)direct start:(int)start end:(int)end
{
    PathLogicBlock block;
    switch (direct) {
        case MonitorResultTypeRight:
            block = ^(CGPoint v) {
                BOOL ret = (v.x >= 0) ? NO : YES;
                return ret;
            };
            break;
        case MonitorResultTypeLeft:
            block = ^(CGPoint v) {
                BOOL ret = (v.x <= 0) ? NO : YES;
                return ret;
            };
            break;
        case MonitorResultTypeUpwards:
            block = ^(CGPoint v) {
                BOOL ret = (v.y <= 0) ? NO : YES;
                return ret;
            };
            break;
        case MonitorResultTypeDownwards:
            block = ^(CGPoint v) {
                BOOL ret = (v.y >= 0) ? NO : YES;
                return ret;
            };
            break;
        default:
            return NO;
    }

    for (int i = start; i + POINT_GAP < end; i += POINT_GAP) {
        CGPoint s = valueToPoint(self.pointPath[i]);
        CGPoint e = valueToPoint(self.pointPath[i + POINT_GAP]);
        CGPoint d = pSub(s, e);

        if (!block(d)) { return NO; }
    }

    return YES;
}

Here a block encapsulates the per-direction condition; the loop then samples point pairs along the track and returns NO as soon as one pair violates the condition.
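The same sampled-pair check can be sketched in plain C, with a function pointer standing in for the Objective-C block. POINT_GAP and the point type are stand-ins here, not the project's real declarations:

```c
#include <stdbool.h>
#include <stddef.h>

#define POINT_GAP 3  /* hypothetical sampling stride */

typedef struct { double x, y; } Pt;
typedef bool (*PairPredicate)(Pt delta);  /* plays the role of PathLogicBlock */

/* delta.x < 0 means the later point lies further right (start - end convention). */
static bool movingRight(Pt d) { return d.x < 0; }

/* Walk the path in strides of POINT_GAP; fail fast as soon as one
 * sampled pair moves against the expected direction. */
static bool alwaysCorrectDirection(const Pt *path, size_t count, PairPredicate ok) {
    for (size_t i = 0; i + POINT_GAP < count; i += POINT_GAP) {
        Pt d = { path[i].x - path[i + POINT_GAP].x,
                 path[i].y - path[i + POINT_GAP].y };
        if (!ok(d)) return false;
    }
    return true;
}
```

A steadily rightward path passes `movingRight`, while a single sampled pair that stalls or backtracks makes the whole check fail, which is what filters out wobbly tracks.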



Other checks are also done by traversal, and most analyses complete within a single pass. For example, checking whether the track is a menu gesture or a back gesture:


- (BOOL)checkTrackIsMenu
{
    int start = 0;
    int end = self.pointPath.count - 1;
    BOOL flag = NO;

    while (valueToPoint(self.pointPath[start]).y >= valueToPoint(self.pointPath[start + 1]).y) { start++; }
    while (valueToPoint(self.pointPath[end]).y >= valueToPoint(self.pointPath[end - 1]).y) { end--; }

    if (abs(start - end) < 2 * POINT_GAP) { flag = YES; }

    return flag;
}

- (BOOL)checkTrackIsBack
{
    int start = 0;
    int end = self.pointPath.count - 1;
    BOOL flag = NO;

    while (valueToPoint(self.pointPath[start]).x >= valueToPoint(self.pointPath[start + 1]).x) { start++; }
    while (valueToPoint(self.pointPath[end]).x >= valueToPoint(self.pointPath[end - 1]).x) { end--; }

    if (abs(start - end) < 2 * POINT_GAP) { flag = YES; }

    return flag;
}


For image display, I preload the images to be used once the controller has loaded.



- (void)loadGestureManager
{
    _gestureManager = [MIGestureManager sharedManager];
    _gestureManager.delegate = self;
    [_gestureManager preloadResources];
}

// gesture manager method
- (void)preloadResources
{
    for (int i = 0; i < INITIAL_COUNT; i++) {
        UIImageView *iv = quickImageView(PointImage);
        [self.imageSet addObject:iv];
    }

    _upImageView     = quickImageView(UpwardsImage);
    _downImageView   = quickImageView(DownwardsImage);
    _leftImageView   = quickImageView(LeftImage);
    _rightImageView  = quickImageView(RightImage);
    _homeImageView   = quickImageView(HomeImage);
    _backImageView   = quickImageView(BackImage);
    _menuImageView   = quickImageView(MenuImage);
    _chosenImageView = quickImageView(chosenImages[0]);

    NSMutableArray *aniArr = [NSMutableArray array];
    for (int i = 0; i < 4; i++) {
        UIImage *image = ImageCache(chosenImages[i]);
        [aniArr addObject:image];
    }
    _chosenImageView.animationImages = aniArr;
    _chosenImageView.animationDuration = 0.7;
    _chosenImageView.animationRepeatCount = 1;
}

View hierarchy:



Notice that the remote control effects all appear beneath a grid. This is achieved by overlaying a grid view on top of the view that displays the point track.


Source code: Rannie/MIRemoteControl


Of course, there are still open issues in this project, which are also mentioned in the project's README.md:


1. The point collection is maintained with the system's NSMutableArray. Because C structs cannot be stored in it directly, the points are boxed and unboxed:

static inline NSValue * pointToValue(CGPoint a) {
    return [NSValue valueWithCGPoint:a];
}

static inline CGPoint valueToPoint(NSValue *v) {
    return [v CGPointValue];
}

You can avoid this overhead by implementing a data structure of your own to maintain the point sequence.
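As suggested, a plain growable struct buffer avoids the per-point NSValue boxing entirely. A minimal C sketch under that assumption (names are illustrative):

```c
#include <stdlib.h>

typedef struct { double x, y; } Pt;

/* A tiny growable array of points -- no per-point heap boxing. */
typedef struct {
    Pt *data;
    size_t count, capacity;
} PointPath;

static void pathInit(PointPath *p) {
    p->data = NULL;
    p->count = p->capacity = 0;
}

static int pathAppend(PointPath *p, Pt pt) {
    if (p->count == p->capacity) {
        size_t newCap = p->capacity ? p->capacity * 2 : 16;
        Pt *grown = realloc(p->data, newCap * sizeof *grown);
        if (!grown) return 0;  /* out of memory: leave path untouched */
        p->data = grown;
        p->capacity = newCap;
    }
    p->data[p->count++] = pt;
    return 1;
}

static void pathClear(PointPath *p) { p->count = 0; }  /* keep capacity for reuse */

static void pathFree(PointPath *p) {
    free(p->data);
    pathInit(p);
}
```

Clearing without freeing mirrors endMonitor's removeAllObjects: the buffer is reused across gestures instead of reallocated each time.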


2. The point texture uses UIImageView; it could instead be implemented with lightweight CALayers by setting their contents.


3. Touch events are monitored in the controller here. Alternatively, you can subclass UIGestureRecognizer to listen for UITouch events; to do so, you need to import the UIGestureRecognizerSubclass.h header. For details, see the Using UIGestureRecognizer with Swift tutorial.


4. The path analysis of the points is relatively simple. With some grounding in statistics, better analysis formulas could be devised.
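For instance, one statistically grounded alternative would be a least-squares line fit over the whole track, judging direction from the fitted slope rather than from the endpoints alone. A sketch of that idea, not the project's method:

```c
#include <stddef.h>

typedef struct { double x, y; } Pt;

/* Least-squares slope dy/dx of the track: a near-zero slope suggests a
 * horizontal swipe, a large magnitude a vertical one. Returns 0 for
 * degenerate input (fewer than 2 points or zero x-variance). */
static double trackSlope(const Pt *p, size_t n) {
    if (n < 2) return 0.0;
    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    for (size_t i = 0; i < n; i++) {
        sx  += p[i].x;
        sy  += p[i].y;
        sxx += p[i].x * p[i].x;
        sxy += p[i].x * p[i].y;
    }
    double denom = (double)n * sxx - sx * sx;
    if (denom == 0.0) return 0.0;
    return ((double)n * sxy - sx * sy) / denom;
}
```

Unlike the endpoint delta, the fit uses every recorded point, so a single stray sample at the start or end of the track barely affects the result.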



That is all the content of this post. Corrections and comments are welcome.


