The CraftAR iOS Augmented Reality SDK allows you to create AR apps that render the experiences created with the CraftAR service. If you’re not yet familiar with the general steps, read How to add augmented reality into your app.
An Augmented Reality app using the iOS native SDK can be implemented in two steps: first, set up the UIViewController, and then trigger the Augmented Reality experiences.
If you want to see an example that combines Cloud Image Recognition (see Tutorial: Use Cloud Image Recognition on iOS) with Tracking, take a look at the open-source samples available in our GitHub repository: https://github.com/Catchoom/craftar-example-ios.
Setting up the SDK in your UIViewController
Once you have added the CraftAR SDK to your Xcode project, it’s time to implement the UIViewController that will show the experience.
1. Adopt CraftARSDKProtocol in your UIViewController
Adopt the <code>CraftARSDKProtocol</code> and declare the <code>CraftARSDK</code> and <code>CraftARTracking</code> references in your UIViewController.
```objc
#import "MyViewController.h"

@interface MyViewController () <CraftARSDKProtocol> {
    // CraftAR SDK reference
    CraftARSDK *_sdk;
    CraftARTracking *_tracking;
}
@end
```
2. Get the instance of the CraftARSDK
Once the view is loaded, get the shared instance of the CraftARSDK and set your view controller as its delegate.
```objc
- (void)viewDidLoad {
    [super viewDidLoad];

    // Set up the CraftAR SDK
    _sdk = [CraftARSDK sharedCraftARSDK];

    // Implement the CraftARSDKProtocol to know when the previewView is ready
    [_sdk setDelegate:self];
}
```
3. Start the VideoCapture module
When the view is about to appear, start the VideoCapture module of the CraftARSDK, passing the UIView that will display the camera preview.
```objc
- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];

    // Start the video preview for search and tracking
    [_sdk startCaptureWithView:self.videoPreviewView];
}
```
Note: the <code>videoPreviewView</code> you provide will be taken over by a rendering view, and no other subviews will be displayed inside it. If you need to display other UIViews as part of MyViewController, add them to self.view of MyViewController (i.e. at the same level as <code>videoPreviewView</code>).
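As a minimal sketch of that layout advice, a hypothetical overlay control would be added to <code>self.view</code> rather than to the preview view (the button and its frame here are illustrative, not part of the SDK):

```objc
// Hypothetical overlay button; add it to self.view, NOT to self.videoPreviewView,
// so it is rendered on top of the camera preview.
UIButton *closeButton = [UIButton buttonWithType:UIButtonTypeSystem];
closeButton.frame = CGRectMake(20, 40, 80, 44);
[closeButton setTitle:@"Close" forState:UIControlStateNormal];
[self.view addSubview:closeButton];
```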
4. Get the instance of Tracking
Once the VideoCapture module is ready, it calls the didStartCapture delegate method. This is where you can set up the Tracking interface.
```objc
- (void)didStartCapture {
    // Get the Tracking instance
    _tracking = [CraftARTracking sharedTracking];
}
```
Rendering Augmented Reality
Once your ViewController adopts the necessary protocol and holds the Tracking instance, it’s time to add the code that starts the Augmented Reality experience.
In most cases, you’ll use the Cloud Image Recognition service from CraftAR to recognize the object and obtain the necessary AR scene in return. Take a look at the tutorial about Cloud Image Recognition to see the flow.
The following example shows the calls required to start rendering the scenes attached to the CraftARItems returned by a search.
```objc
- (void)loadAndRenderARExperiences:(NSArray *)arExperiences {
    BOOL haveContent = NO;
    for (CraftARItem *item in arExperiences) {
        // Set up the AR experience with the content associated with this CraftARItem
        if ([item isKindOfClass:[CraftARItemAR class]]) {
            CraftARItemAR *arItem = (CraftARItemAR *)item;
            // If the item has contents, add them to the AR experience
            if ([arItem allContents].count > 0) {
                [_tracking addARItem:arItem];
                haveContent = YES;
            }
        }
    }
    if (haveContent) {
        // Start the AR experience
        [_tracking startTracking];
    }
}
```
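When the view disappears, you will typically want to stop the AR experience and release the camera. A minimal sketch, assuming the SDK exposes the stop/remove counterparts of the calls used above (<code>stopTracking</code>, <code>removeAllARItems</code> and <code>stopCapture</code>; check the SDK headers for the exact names in your version):

```objc
- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];

    // Stop the AR experience and remove the items from the tracker
    [_tracking stopTracking];
    [_tracking removeAllARItems];

    // Stop the video capture to release the camera
    [_sdk stopCapture];
}
```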