Touch detection of 3D models in an iOS 11 AR scene


AR in the new iOS 11 is particularly hot, so I went online to find a few demos to play with. The core code boils down to the following.

The AR view, which displays the 3D content:

@property (nonatomic, strong) ARSCNView *arSCNView;

Add a model, method one:

// Create a new scene
SCNScene *scene = [SCNScene sceneNamed:@"art.scnassets/ship.scn"];

// Set the scene on the view
self.arSCNView.scene = scene;

Add a model, method two:

1. Use SCNScene to load the .scn file (.scn is a 3D model format; such files can be created with tools like 3ds Max, and the Xcode template ships with a default 3D aircraft). I added a number of 3D models to the project; to use a different one, just replace the file name.

SCNScene *scene = [SCNScene sceneNamed:@"art.scnassets/ship.scn"]; // or e.g. @"Models.scnassets/chair/chair.scn"

2. Get the aircraft node. A scene can contain multiple nodes; for brevity we assume the aircraft node is the first child node of the scene.

Every scene has exactly one root node; all other nodes are children of that root node.

SCNNode *shipNode = scene.rootNode.childNodes[0];

self.planeNode = shipNode;

The aircraft model is fairly large, so scale it down and adjust its position so that it sits in the middle of the screen:

shipNode.scale = SCNVector3Make(0.5, 0.5, 0.5);

shipNode.position = SCNVector3Make(0, -15, -15);

A 3D aircraft model is rarely made in one piece; it may be stitched together from many child nodes, so the child nodes have to be adjusted along with it, otherwise the changes above may have no effect:

for (SCNNode *node in shipNode.childNodes) {
    node.scale = SCNVector3Make(0.5, 0.5, 0.5);
    node.position = SCNVector3Make(0, -15, -15);
}

3. Add the aircraft node to the current scene:

[self.arSCNView.scene.rootNode addChildNode:shipNode];

The environment setup code is at the end of the article, after the split line.

======================== Opening split Line ====================

My first thought, of course, was to add a gesture recognizer and see whether that works. But attaching one to the ARSCNView is somewhat blind work: the 3D model does not cover the whole ARSCNView, so the gesture handler fires whether or not the tap actually lands on the model. I then looked at SCNScene, but SCNScene inherits from NSObject, and studying its API led to a dead end. Going back to ARSCNView, there is this method:

/*!
 @method hitTest:options:
 @abstract Returns an array of SCNHitTestResult for each node that contains a specified point.
 @param point A point in the coordinate system of the receiver.
 @param options Optional parameters (see the "Hit test options" group for the available options).
 */
- (NSArray<SCNHitTestResult *> *)hitTest:(CGPoint)point options:(nullable NSDictionary<SCNHitTestOption, id> *)options;

The method returns an array of SCNHitTestResult; each element's node contains the specified point (CGPoint).

An analogy: the ARSCNView is like a multi-layer composite board. Each tap of the finger is like a needle pushed through the board, and the method returns an array of all the points (nodes) the needle passed through; each result contains the location of the tap (CGPoint). By traversing the node of each SCNHitTestResult in the array, walking up through its parent nodes and comparing each node's name against the name of the 3D model's root node, we can tell whether the tap landed on the 3D model.

The code is as follows:

SCNNode *vaseNode = scene.rootNode.childNodes[0];

vaseNode.name = @"Virtual object root node"; // important: we compare by this name later

4. Set the position of the vase node to the position of the detected plane. If it is not set, it defaults to the origin, which is the camera's position.

vaseNode.position = SCNVector3Make(planeAnchor.center.x, 0, planeAnchor.center.z);

5. Add the vase node to the current scene.

//!!! Note: the vase node is added to the node captured in the delegate callback, not to the AR view's root node, because the captured plane anchor is in a local coordinate system, not the world coordinate system.

[node addChildNode:vaseNode];
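For context, the `node` and `planeAnchor` used above come from the ARSCNViewDelegate callback that fires when plane detection adds an anchor. A minimal sketch, assuming the controller conforms to ARSCNViewDelegate; the model path is illustrative:

```objc
// ARSCNViewDelegate: called when the session adds a node for a newly detected anchor
- (void)renderer:(id<SCNSceneRenderer>)renderer didAddNode:(SCNNode *)node forAnchor:(ARAnchor *)anchor {
    // Only react to plane anchors (plane detection was enabled in the configuration)
    if (![anchor isKindOfClass:[ARPlaneAnchor class]]) {
        return;
    }
    ARPlaneAnchor *planeAnchor = (ARPlaneAnchor *)anchor;

    // Load the model and name its root node so the hit test can identify it later
    SCNScene *scene = [SCNScene sceneNamed:@"Models.scnassets/vase/vase.scn"]; // illustrative path
    SCNNode *vaseNode = scene.rootNode.childNodes[0];
    vaseNode.name = @"Virtual object root node";

    // Position the model in the plane anchor's local coordinate system
    vaseNode.position = SCNVector3Make(planeAnchor.center.x, 0, planeAnchor.center.z);

    // Attach to the anchor's node, not the scene's root node
    [node addChildNode:vaseNode];
}
```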

The system touch method:

- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    if (self.arType == ARTypePlane && self.rootNode) {
        self.currentTouches = touches;
        UITouch *touch = [self.currentTouches anyObject];
        // The point where the finger tapped
        CGPoint tapPoint = [touch locationInView:self.arSCNView];
        NSDictionary *hitTestOptions = [NSDictionary dictionaryWithObjectsAndKeys:@(YES), SCNHitTestBoundingBoxOnlyKey, nil];
        NSArray<SCNHitTestResult *> *results = [self.arSCNView hitTest:tapPoint options:hitTestOptions];
        // Traverse the node of every returned result
        for (SCNHitTestResult *res in results) {
            if ([self isNodePartOfVirtualObject:res.node]) {
                [self doSomething];
                break;
            }
        }
    }
}

Searching for the specified node:

- (BOOL)isNodePartOfVirtualObject:(SCNNode *)node {
    if ([@"Virtual object root node" isEqualToString:node.name]) {
        return YES;
    }
    if (node.parentNode != nil) {
        return [self isNodePartOfVirtualObject:node.parentNode];
    }
    return NO;
}
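As mentioned at the start, the same hit test can also be driven from a gesture recognizer attached to the AR view instead of overriding touchesBegan:withEvent:. A minimal sketch under the same assumptions (isNodePartOfVirtualObject: and doSomething are the methods above; handleTap: is an illustrative name):

```objc
// In viewDidLoad, attach the recognizer once
- (void)viewDidLoad {
    [super viewDidLoad];
    UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self
                                                                          action:@selector(handleTap:)];
    [self.arSCNView addGestureRecognizer:tap];
}

- (void)handleTap:(UITapGestureRecognizer *)gesture {
    CGPoint tapPoint = [gesture locationInView:self.arSCNView];
    NSArray<SCNHitTestResult *> *results =
        [self.arSCNView hitTest:tapPoint options:@{SCNHitTestBoundingBoxOnlyKey : @YES}];
    // Same traversal as in touchesBegan:withEvent:
    for (SCNHitTestResult *res in results) {
        if ([self isNodePartOfVirtualObject:res.node]) {
            [self doSomething];
            break;
        }
    }
}
```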

=========== End Split Line ==============

#pragma mark - Build the ARKit environment

Lazily load the session tracking configuration:

- (ARSessionConfiguration *)arSessionConfiguration
{
    if (_arSessionConfiguration != nil) {
        return _arSessionConfiguration;
    }
    // 1. Create a world tracking session configuration (ARWorldTrackingSessionConfiguration tracks better); requires an A9 chip or later
    ARWorldTrackingSessionConfiguration *configuration = [[ARWorldTrackingSessionConfiguration alloc] init];
    // 2. Set the plane detection direction (detect horizontal planes; this will be used later)
    configuration.planeDetection = ARPlaneDetectionHorizontal;
    _arSessionConfiguration = configuration;
    // 3. Adaptive lighting (smooths the transition when the camera moves quickly from dark to bright light)
    _arSessionConfiguration.lightEstimationEnabled = YES;
    return _arSessionConfiguration;
}
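Since world tracking needs an A9 chip or later, it is worth checking device support before creating the configuration and falling back to the base configuration on older devices. A hedged sketch against the iOS 11 beta API names used in this article:

```objc
// Guard the world tracking configuration with a support check
if ([ARWorldTrackingSessionConfiguration isSupported]) {
    // A9 or later: full world tracking with plane detection
    ARWorldTrackingSessionConfiguration *configuration = [[ARWorldTrackingSessionConfiguration alloc] init];
    configuration.planeDetection = ARPlaneDetectionHorizontal;
    _arSessionConfiguration = configuration;
} else {
    // Older devices: basic orientation-only tracking
    _arSessionConfiguration = [[ARSessionConfiguration alloc] init];
}
```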

Lazily load the capture session:

- (ARSession *)arSession
{
    if (_arSession != nil) {
        return _arSession;
    }
    // 1. Create the session
    _arSession = [[ARSession alloc] init];
    _arSession.delegate = self;
    // 2. Return the session
    return _arSession;
}

Create the AR view:

- (ARSCNView *)arSCNView
{
    if (_arSCNView != nil) {
        return _arSCNView;
    }
    // 1. Create the AR view
    _arSCNView = [[ARSCNView alloc] initWithFrame:self.view.bounds];
    // 2. Set the delegate; detected planes are reported in the delegate callbacks
    _arSCNView.delegate = self;
    // 3. Set the view's session
    _arSCNView.session = self.arSession;
    // 4. Automatically update lighting (useful for 3D games; can be ignored here)
    _arSCNView.automaticallyUpdatesLighting = YES;
    return _arSCNView;
}

- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    // 1. Add the AR view to the current view
    [self.view addSubview:self.arSCNView];
    // 2. Run the AR session (the camera starts working at this point)
    [self.arSession runWithConfiguration:self.arSessionConfiguration];
}
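To balance the run in viewDidAppear:, the session is typically paused when the view goes away, which stops tracking and releases the camera. A minimal sketch:

```objc
- (void)viewWillDisappear:(BOOL)animated
{
    [super viewWillDisappear:animated];
    // Stop tracking and turn off the camera while the view is not visible
    [self.arSession pause];
}
```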
