Intermediate Tutorial

Creating Image Tracking AR Content for the Web

30 minutes

Posted on: September 7, 2019

Tags: augmented reality, web-based AR

In this tutorial, you will learn about creating an AR application using Amazon Sumerian, AWS Amplify, and 8th Wall.

In this tutorial, you create a scene in Amazon Sumerian to place a video in Augmented Reality (AR) on top of your business card. The published experience is sharable via a URL in supported browsers (for example, Chrome on an Android device or Safari on an iOS device).

To gain an understanding of the real-world environment, AR applications use technologies such as simultaneous localization and mapping (SLAM) and image tracking. Image tracking first recognizes a particular image and then uses that image as the base for an AR entity, so that the entity's position and rotation always match the image, even as the image moves relative to the mobile AR device. At the time of this writing, the pending WebXR specification promises to expose platform SLAM technology to browsers, but it is still under development. Until it is released and supported by browsers, you have to choose your own JavaScript SLAM implementation to create web-based AR applications. Alternatively, you can build native Android or iOS Sumerian AR applications. The advantage of a web-based AR application, however, is that it is directly sharable by using a URL.
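As a sketch of that decision, you can feature-detect native WebXR AR support at runtime and fall back to a JavaScript SLAM library when it is absent. The `navigator.xr` check below targets the WebXR Device API as drafted at the time of writing, and `pickArBackend` is a hypothetical helper name, not part of any SDK:

```javascript
// Hypothetical helper: choose an AR backend given whether native
// WebXR AR support was detected.
function pickArBackend(nativeArSupported) {
  return nativeArSupported ? 'webxr' : 'js-slam'; // e.g. a library like 8th Wall
}

// Browser-side detection using the (still-evolving) WebXR Device API.
async function detectNativeAr() {
  if (typeof navigator === 'undefined' || !navigator.xr || !navigator.xr.isSessionSupported) {
    return false;
  }
  try {
    return await navigator.xr.isSessionSupported('immersive-ar');
  } catch (err) {
    return false; // treat detection errors as "not supported"
  }
}
```

A page might call `detectNativeAr().then((ok) => pickArBackend(ok))` during startup and load the chosen library before initializing the scene.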

In this tutorial, we use the third-party, commercially available 8th Wall library for SLAM and image tracking.

You’ll learn about:

  • Tracking and anchoring an entity to an image
  • Using a video file as a texture for display on a surface
  • Integrating the 8th Wall JavaScript library


Before you begin, you should have completed the following tasks and tutorials:

Step 1: Create a New Project on 8th Wall

To begin, you will need to create a new 8th Wall project using the 8th Wall dashboard.

  1. Log in to your 8th Wall account, and click “Start a new project”.
  2. Give the project a meaningful name, and click “Create”.
  3. Authorize your test device (e.g. your phone or tablet) by clicking the “Device Authorization” button and following the instructions.

    Caution: If you use a QR reader to scan the mobile QR code, be sure to then launch that link in the native browser on your phone before authorizing. Otherwise, you may be authorizing the embedded browser of the QR reader app but not the main web browser on your phone.

  4. Copy and save your 8th Wall app key for use later in this tutorial. You can find your app key by clicking on the cog icon on the left of the 8th Wall project dashboard.

Step 2: Add 8th Wall Image Targets

In this step, you upload an image onto which a video will be pasted in AR. You can use any image that you have a physical copy of in the real world, such as your business card or something similar.

  1. Follow the 8th Wall documentation to add an image target. You can add more than one image if you want, for example, if you want your video quad to be anchored to your business card or your driver’s license.

  2. Crop your image so that it doesn’t have empty space around it (you can do this in the 8th Wall interface as you upload the image). See 8th Wall’s documentation for best practices about choosing and cropping an image target.

Step 3: Start a Project in Sumerian

  1. Log in to Sumerian from the AWS console.
  2. From the Sumerian Dashboard, under Create scene from template, use the Empty template to create a new scene. Give your scene a meaningful name.

    Caution: Sumerian provides a template called the Augmented Reality Template, but do not use that template for this tutorial. That template is meant for use in native mobile applications and won’t be a good basis for our 8th Wall AR scene.

  3. Download the scene assets ZIP file and import it into your scene. To import it, click the folder icon in the Asset Panel, or drag and drop the ZIP file from your desktop into the Sumerian Editor window.
  4. After the import completes, from the Entities panel delete the extra camera named DELETE ME.
  5. Select the entity named videoQuad. It has a script attached to it named 8th Wall Init, which exposes a property called App Key. Paste the 8th Wall app key you saved in Step 1 into that App Key field.
  6. Select the Default Camera entity and turn off its Follow Editor Camera option.
  7. Set the Default Camera position to 0, 0.5, 0. 8th Wall uses the camera height to scale virtual content, so the height cannot be zero (see 8th Wall’s troubleshooting guide for more information).

Your scene can now be published and viewed on your test device! Give it a try. Tip: Use a URL shortener service to create a URL that’s easier to type into your phone’s browser. Note that the scene will not work in Preview mode in the editor (pressing the Play button at the bottom of the canvas); it works only on mobile AR devices.

Step 4: Exploring the scene’s construction

Script: playVideo

Some browsers prevent videos from starting automatically, a restriction often referred to as “autoplay blocking”. When this happens, a user gesture is needed to grant permission to play. It’s a best practice to check the value of the Promise returned from a video play attempt to see whether autoplay was blocked, and to seek out a user gesture, such as a touch event, if it was. If autoplay is blocked, the playVideo script emits a VideoAutoplayBlocked signal. The VideoAutoplayBlocked State Machine Behavior in turn responds to this signal, obtains a user gesture, and emits a PlayVideo signal to retry playing the video.
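As a sketch of that pattern (not the tutorial’s actual playVideo source), the Promise returned by `play()` can be inspected; a rejection named `NotAllowedError` is the conventional sign that autoplay was blocked:

```javascript
// Pure helper: does this play() rejection indicate autoplay blocking?
function isAutoplayBlocked(err) {
  return Boolean(err) && err.name === 'NotAllowedError';
}

// Browser-side usage (assumes an HTMLVideoElement). Older browsers may
// return undefined from play() rather than a Promise, so check first.
function playWithFallback(video, onBlocked) {
  const attempt = video.play();
  if (attempt && typeof attempt.catch === 'function') {
    attempt.catch((err) => {
      if (isAutoplayBlocked(err)) {
        onBlocked(); // e.g. show a tappable play button, then retry
      }
    });
  }
}
```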

Script: imageTargetAnchor

This script adjusts the transform of an entity so that it is placed on a tracked image target. It responds to xrimageupdated events from 8th Wall to anchor the entity it is attached to, in this case the videoQuad entity, to a tracked image in the real world.
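A hedged sketch of that wiring is shown below. The field names in the event detail follow 8th Wall’s documented xrimageupdated payload (verify them against your SDK version), and `applyToEntity` is a hypothetical hook standing in for the engine’s transform API:

```javascript
// Pure helper: turn an xrimageupdated event detail into a plain
// transform record.
function detailToTransform(detail) {
  return {
    position: detail.position, // { x, y, z } in world space
    rotation: detail.rotation, // quaternion { w, x, y, z }
    scale: detail.scale,       // uniform scale of the tracked image
  };
}

// Browser-side wiring: anchor an entity to one named image target.
function anchorToImage(targetName, applyToEntity) {
  window.addEventListener('xrimageupdated', (event) => {
    if (event.detail.name === targetName) {
      applyToEntity(detailToTransform(event.detail));
    }
  });
}
```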

Script: 8th Wall Init

This script handles loading and initializing the 8th Wall JavaScript SDK. You’ll find this script attached to the videoQuad entity.
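A minimal sketch of such an initializer follows. It builds the 8th Wall script URL from the app key and waits for the SDK’s xrloaded window event; treat the URL pattern and event name as assumptions drawn from 8th Wall’s documentation, not as the tutorial script’s actual source:

```javascript
// Build the 8th Wall SDK script URL from an app key.
function buildSdkUrl(appKey) {
  return 'https://apps.8thwall.com/xrweb?appKey=' + encodeURIComponent(appKey);
}

// Browser-side loader: inject the script tag and resolve once the SDK
// signals readiness via the xrloaded window event.
function load8thWall(appKey) {
  return new Promise((resolve, reject) => {
    window.addEventListener('xrloaded', resolve, { once: true });
    const script = document.createElement('script');
    script.async = true;
    script.src = buildSdkUrl(appKey);
    script.onerror = () => reject(new Error('Failed to load the 8th Wall SDK'));
    document.head.appendChild(script);
  });
}
```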

State Machine Behavior: VideoAutoplayBlocked

This behavior hides the entity it is attached to and listens for the VideoAutoplayBlocked signal. When the signal is received, the behavior shows the entity and waits for the user to click or tap it. When the user does, it hides the entity again and emits a PlayVideo signal, which in turn causes the playVideo script to retry playing the video. You’ll find this behavior attached to the playButton entity.

This behavior is needed to gracefully handle autoplay blocking. If a play attempt fails because of autoplay blocking, the scene must obtain a user gesture, such as a touch or tap, and then retry the play attempt. This behavior handles obtaining the user gesture and signaling that a retry should be attempted.
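The signal flow above can be modeled in a few lines. The tiny event bus and the `button` object below are stand-ins for Sumerian’s own messaging and entity APIs (which this sketch does not reproduce); only the VideoAutoplayBlocked and PlayVideo signal names come from the scene:

```javascript
// Tiny stand-in event bus modeling the scene's signal flow.
function createBus() {
  const handlers = {};
  return {
    on(signal, fn) { (handlers[signal] = handlers[signal] || []).push(fn); },
    emit(signal) { (handlers[signal] || []).forEach((fn) => fn()); },
  };
}

// Wire up the retry flow: when autoplay is blocked, show a play button;
// when the user taps it, hide the button and ask playVideo to retry.
function wireRetryFlow(bus, button) {
  button.visible = false;
  bus.on('VideoAutoplayBlocked', () => { button.visible = true; });
  button.onTap = () => {
    button.visible = false;
    bus.emit('PlayVideo');
  };
}
```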

Optional: Debugging on AR Devices

To debug issues, you might find that you need web developer tools such as the JavaScript console and debugger. On both Android and iOS, desktop web developer tools can remotely debug a mobile device. Connect the AR mobile device to your computer with a USB cable.

  • For instructions on remote debugging on Android using Chrome, see the Chrome DevTools remote debugging documentation.
  • For remote debugging on iOS using Safari, connect your mobile device to your computer with a USB cable, and then open Safari’s Develop menu and select the AR device from the Device List.

You should now have a much better understanding of how to create an AR application using AWS Amplify and 8th Wall. To learn more, check out the following tutorials:

Back to Tutorials

© 2019 Amazon Web Services, Inc. or its affiliates. All rights reserved.