100 Lines of Python Code to Implement a Helper Program for the "One-Hop" Game
Preface
This afternoon I wrote a helper program for the "one-hop" game in Python, and I would like to share it with you. I had originally planned to use a Raspberry Pi to drive a "mechanical finger" that replaces human touch operations, but that solution is still under development and will be shared later. Here I will show how to play "one-hop" with a software-only approach.
Principle
The principle is actually very simple; each jump follows these four steps (a minimal loop sketch follows the list):
- Before each jump, take a screenshot of the phone screen and save it to the local computer;
- Calculate the distance d between the doll's position and the center of the platform it should jump to;
- Convert the distance d into the corresponding touch duration s;
- Send a simulated touch command to the phone, with the touch duration set to s.
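The sketch below shows one way to wire these four steps together. The helper functions screenshot, calculate_jump_distance, distance2time, and jump are the ones defined in the complete code at the end of this article, so treat this as an outline rather than a standalone program.

import time

def main():
    while True:
        screenshot()                          # 1. capture the phone screen and pull it to the computer
        distance = calculate_jump_distance()  # 2. measure the doll-to-platform distance (px)
        jump(distance2time(distance))         # 3 + 4. convert the distance to a press duration (ms) and send it
        time.sleep(1)                         # wait for the jump animation to finish before the next shot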
Implementation
I have only done Android development, so the following describes the implementation on the Android platform only.
Step 1
You can use the adb tool officially provided by Android. First, download the adb build for your operating system. Next, connect your phone to your computer and enable Settings > Developer Options > USB debugging. Then call the adb tool on the command line to check whether the phone is detected:
adb devices
PS: if you add the adb directory to your PATH environment variable, you can call adb directly on the command line; otherwise, you must type the full path to the adb executable.
If device information is printed after the preceding command runs, the phone is connected successfully and you can continue with the following operations.
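If you prefer to verify the connection from Python, a minimal sketch using the standard subprocess module is shown below; the check_device helper is my own illustration and is not part of the article's code.

import subprocess

def check_device():
    # 'adb devices' prints a header line followed by one line per attached device
    output = subprocess.check_output(["adb", "devices"]).decode("utf-8")
    devices = [line for line in output.strip().splitlines()[1:] if line.strip()]
    if not devices:
        raise RuntimeError("No device detected; is USB debugging enabled?")
    return devices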
Use the following command to capture the screen image of the mobile phone and save it on the SD card:
adb shell screencap -p /mnt/sdcard/screencap.png
Then, run the following pull command to copy the image to your computer:
adb pull /mnt/sdcard/screencap.png C:/screencap.png
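In Python, these two adb calls can be wrapped in one helper, which is essentially what the screenshot function in the complete code below does. A minimal sketch, assuming the image is pulled into the system temp directory:

import os
import tempfile

SCREENSHOT_PATH = os.path.join(tempfile.gettempdir(), "screenshot.png")

def screenshot():
    # capture the screen on the phone, then pull the file to the local temp directory
    os.system("adb shell screencap -p /mnt/sdcard/screencap.png")
    os.system("adb pull /mnt/sdcard/screencap.png {}".format(SCREENSHOT_PATH))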
Step 2
Step 2 is the key to the whole problem. To calculate the distance between the doll and the center of the platform it will jump to, you must identify the coordinates of the doll and the coordinates of the platform's center.
We take the center of the doll's bottom row of pixels as the doll's position.
The bottom of the doll can be identified as follows. The RGB values at the bottom of the doll fall between (53, 57, 95) and (59, 61, 103), so we scan the pixels row by row, find the rows whose RGB values fall in this range, and take the last such row as the doll's bottom. Once the bottom row is found, the center coordinates of that row are easily calculated. A sketch of this scan follows.
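A minimal sketch of that row-by-row scan, assuming the screenshot has already been opened with Pillow; the standalone find_role_position helper is my own illustration, and the complete code at the end folds the same logic into calculate_jump_distance.

from PIL import Image

def is_part_of_role(rgb):
    # the doll's body color sits in this narrow RGB range (measured on a 1920x1080 screen)
    return 53 < rgb[0] < 59 and 57 < rgb[1] < 61 and 95 < rgb[2] < 103

def find_role_position(image):
    bottom_y, bottom_xs = None, []
    for y in range(image.height):
        row_xs = [x for x in range(image.width) if is_part_of_role(image.getpixel((x, y)))]
        if row_xs:
            bottom_y, bottom_xs = y, row_xs  # keep overwriting; the last matching row is the doll's bottom
    if bottom_y is None:
        raise Exception("The doll's position could not be identified!")
    # the doll's position is the center of its bottom row
    return (sum(bottom_xs) // len(bottom_xs), bottom_y)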
Next, we need to identify the center of the platform that the doll will jump to. To get the center's coordinates, we only need to find the coordinates of two of the platform's vertices, vertex1 and vertex2:
We again scan the pixels from left to right and from top to bottom to find vertex1. Before scanning, record the RGB value of the background color by sampling any blank area (for example, on a 1920x1080 phone the point (40, 500) is guaranteed to lie in the blank background). During the scan, the first point whose color differs from the background color is a "mutation", and that point is vertex1.
We record the RGB value at vertex1 as the platform's color. As the scan continues, we check whether the RGB value of the current point is "similar" to the recorded value; similarity means the point "belongs" to the platform. Among all points that belong to the platform, vertex2 is the one with the smallest abscissa, so its coordinates can also be found.
Obviously, the abscissa of the platform's center equals the abscissa of vertex1, and its ordinate equals the ordinate of vertex2. A sketch of this vertex scan follows.
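A minimal sketch of the vertex scan, under the same assumptions as above (a Pillow image already loaded, BACKGROUND_POS sampled from a blank area). The standalone find_platform_center helper is my own simplification; the complete code below adds extra conditions (the 20 px windows) to make vertex2 detection more robust, and this sketch assumes the target platform appears above the doll in the scanned region.

BACKGROUND_POS = (40, 500)  # a point guaranteed to lie in the blank background on a 1920x1080 screen

def is_similar(rgb1, rgb2, degree=10):
    # two colors are "similar" when every channel differs by at most `degree`
    return all(abs(a - b) <= degree for a, b in zip(rgb1[:3], rgb2[:3]))

def find_platform_center(image):
    background_rgb = image.getpixel(BACKGROUND_POS)
    vertex1, vertex2, platform_rgb = None, None, None
    for y in range(BACKGROUND_POS[1], image.height):
        for x in range(image.width):
            rgb = image.getpixel((x, y))
            if vertex1 is None:
                # the first "mutation" away from the background color is the top vertex
                if not is_similar(background_rgb, rgb):
                    vertex1, platform_rgb = (x, y), rgb
            elif is_similar(platform_rgb, rgb, 5):
                # track the platform point with the smallest abscissa seen so far
                if vertex2 is None or x < vertex2[0]:
                    vertex2 = (x, y)
    if vertex1 is None or vertex2 is None:
        raise Exception("The platform could not be identified!")
    # center of the platform: abscissa of vertex1, ordinate of vertex2
    return (vertex1[0], vertex2[1])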
Step 3
After multiple attempts, I found that converting the distance d (in px) into the touch duration s (in milliseconds) with the following formula works well:
s = d * 1.35
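For example, with this ratio a measured distance of d = 500 px translates into a press of s = 500 * 1.35 = 675 ms.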
Step 4
After obtaining the touch duration, we again use the adb tool to simulate the touch-screen press. The relevant command is:
adb shell input swipe 0 0 0 0 1000
The last parameter of the preceding command is the duration of the simulated press, in milliseconds.
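Putting Steps 3 and 4 together in Python gives the following minimal sketch; both helpers also appear, in essentially the same form, in the complete code below.

import os

DISTANCE_TO_TIME_RATIO = 1.35  # found empirically; may need tuning for other resolutions

def distance2time(distance):
    # convert the jump distance (px) into the press duration (ms)
    return int(distance * DISTANCE_TO_TIME_RATIO)

def jump(touch_time):
    # a swipe whose start and end points coincide acts as a long press of touch_time ms
    os.system("adb shell input swipe 0 0 0 0 {}".format(touch_time))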
Effect
Connect your phone to your computer (USB debugging must be enabled on the phone), open the "one-hop" game, and then run the code on your computer to jump automatically.
Complete code
The following is the complete code. On my phone (1920x1080), it hits the target platform in most cases, misses it in a few cases, and in rare cases the doll jumps off the platform entirely. For phones with other resolutions, you may need to adjust the BACKGROUND_POS and DISTANCE_TO_TIME_RATIO parameters.
import math
import os
import tempfile
import time
from functools import reduce

from PIL import Image

BACKGROUND_POS = (40, 500)          # a point guaranteed to lie in the blank background (1920x1080)
DISTANCE_TO_TIME_RATIO = 1.35       # converts jump distance (px) into press duration (ms)
SCREENSHOT_PATH = tempfile.gettempdir() + "/screenshot.png"


def calculate_jump_distance():
    im = Image.open(SCREENSHOT_PATH)
    background_rgb = im.getpixel(BACKGROUND_POS)
    role_pos_list = None            # pixels of the doll in the last row that contained the doll
    role_line_flag = True           # True while the current row has shown no doll pixel yet
    vertex1_pos = None              # top vertex of the target platform
    vertex2_pos = None              # left vertex of the target platform
    block_background_rgb = None     # color of the target platform, sampled at vertex1

    for y in range(BACKGROUND_POS[1], im.height):
        # once the doll's rows have ended, its bottom row has been captured; stop scanning
        if role_pos_list and role_line_flag:
            break
        role_line_flag = True
        for x in range(BACKGROUND_POS[0], im.width):
            current_rgb = im.getpixel((x, y))
            next_rgb = im.getpixel((x + 1, y)) if x + 1 < im.width else (0, 0, 0)

            # recognize vertex1: the first "mutation" away from the background color
            if x > BACKGROUND_POS[0] and y > BACKGROUND_POS[1] and not vertex1_pos \
                    and not is_similar(background_rgb, current_rgb) \
                    and is_similar(current_rgb, next_rgb):
                vertex1_pos = (x, y)
                block_background_rgb = current_rgb

            # recognize vertex2: the platform point with the smallest abscissa
            if block_background_rgb and is_similar(current_rgb, block_background_rgb, 5):
                if vertex2_pos:
                    if x < vertex2_pos[0] and vertex2_pos[0] - x < 20 and y - vertex2_pos[1] < 20:
                        vertex2_pos = (x, y)
                else:
                    vertex2_pos = (x, y)

            # recognize the doll: collect the matching pixels of the current row
            if is_part_of_role(current_rgb):
                if role_line_flag:
                    role_pos_list = []
                    role_line_flag = False
                role_pos_list.append((x, y))

    if not role_pos_list:
        raise Exception("The doll's position could not be identified!")

    # the doll's position is the center of its bottom row
    pos_sum = reduce(lambda o1, o2: (o1[0] + o2[0], o1[1] + o2[1]), role_pos_list)
    role_pos = (int(pos_sum[0] / len(role_pos_list)), int(pos_sum[1] / len(role_pos_list)))
    # the platform's center: abscissa of vertex1, ordinate of vertex2
    destination_pos = (vertex1_pos[0], vertex2_pos[1])
    return int(linear_distance(role_pos, destination_pos))


def is_part_of_role(rgb):
    # the doll's body color sits in this narrow RGB range
    return 53 < rgb[0] < 59 and 57 < rgb[1] < 61 and 95 < rgb[2] < 103


def linear_distance(xy1, xy2):
    return math.sqrt(pow(xy1[0] - xy2[0], 2) + pow(xy1[1] - xy2[1], 2))


def is_similar(rgb1, rgb2, degree=10):
    return abs(rgb1[0] - rgb2[0]) <= degree \
        and abs(rgb1[1] - rgb2[1]) <= degree \
        and abs(rgb1[2] - rgb2[2]) <= degree


def screenshot():
    # capture on the phone, pull to the local temp directory, and silence adb's output
    os.system("adb shell screencap -p /mnt/sdcard/screencap.png")
    os.system("adb pull /mnt/sdcard/screencap.png {} > {}/jump.out".format(SCREENSHOT_PATH, tempfile.gettempdir()))


def jump(touch_time):
    # a swipe whose start and end points coincide acts as a long press of touch_time ms
    os.system("adb shell input swipe 0 0 0 0 {}".format(touch_time))


def distance2time(distance):
    return int(distance * DISTANCE_TO_TIME_RATIO)


if __name__ == '__main__':
    count = 1
    while True:
        screenshot()
        distance = calculate_jump_distance()
        touch_time = distance2time(distance)
        jump(touch_time)
        print("#{}: distance={}, time={}".format(count, distance, touch_time))
        count += 1
        time.sleep(1)
Summary
The above is the 100-line Python helper program for the "one-hop" game shared by the editor. I hope it is helpful to you. If you have any questions, please leave a message and the editor will reply promptly. Thank you very much for your support of the Help House website!