Intermediate Tutorial


Augmented Reality Using Amazon Sumerian, AWS Amplify, and 8th Wall


60 minutes

Posted on: September 7, 2019


In this tutorial, you learn how to build an augmented reality application with Amazon Sumerian, AWS Amplify, and 8th Wall.

In this tutorial, you create content in Amazon Sumerian and publish a simple browser-based augmented reality (AR) experience to place a video in AR on top of your business card. This is sharable via a URL in supported browsers (for example, Chrome on an Android device or Safari on an iOS device).

To understand the real-world environment, AR applications use technologies such as simultaneous localization and mapping (SLAM) and image tracking (finding and tracking an image in the real world). At the time of writing, the pending WebXR specification promises to expose platform SLAM technology to browsers, but it is still under development. Until it’s released and supported by browsers, you have to choose your own JavaScript SLAM implementation to create web-based AR applications. Alternatively, you can build native Android or iOS Sumerian AR applications. The advantage of a web-based AR application, however, is that it is directly sharable by using a URL.

In this tutorial, we use the third-party, commercially available 8th Wall library for SLAM and image tracking.

Note: To complete this tutorial, you must have completed or be familiar with the steps in both the Streaming Video from Amazon S3 tutorial and the Getting Started with Amazon Sumerian, AWS Amplify, and the React Component tutorial.

You’ll learn about:

  • Authoring an AR-viewable scene
  • Publishing a scene privately
  • Sumerian Scripting API
  • Integrating the 8th Wall JavaScript library
  • Tracking and anchoring an entity to an image
  • Publishing a React app to support user authentication

Prerequisites

Before you begin, you should have completed the following tutorials:

  • Streaming Video from Amazon S3
  • Getting Started with Amazon Sumerian, AWS Amplify, and the React Component

Step 1: Start a Project in Sumerian

  1. From the Dashboard, navigate to the scene templates.

  2. Choose the Augmented Reality scene template.

  3. In the New Scene dialog box, choose a descriptive name for your scene, and then choose Create.

Step 2: Upload a Video to Amazon S3 and Display It on a Quad Entity

  1. Complete the Streaming Video from Amazon S3 tutorial in your new AR scene. This tutorial walks you through uploading a video to Amazon S3 and creating a Quad primitive to display it on.

  2. For this tutorial, name the quad that will display your video videoQuad.

Step 3: Add a Script to Anchor the videoQuad Entity in AR

In this step, you create a script that responds to xrimageupdated events from 8th Wall to anchor your videoQuad entity to a tracked image in the real world.

  1. Select the videoQuad entity you created in Step 2 by following the Streaming Video from Amazon S3 tutorial.
  2. In the Inspector panel, expand the Script component and add a script by clicking the + (plus) icon next to the drop input.

  3. Choose Custom (Legacy Format) from the menu that opens.

  4. Edit the script by clicking the pencil icon in the script’s panel.

  5. In the Text Editor, click the pencil icon next to Script in the Documents area on the left of the window, and then rename the script “imageTargetAnchor”.

  6. In the lower left of the Text Editor, choose Save.

  7. Replace the contents of the imageTargetAnchor script with the following.

     'use strict';
     function setup(args, ctx) {
       ctx.firstUpdate = true;
       ctx.imageFound = false;
       ctx.baseRotationMatrix = new sumerian.Matrix3(sumerian.Matrix3.IDENTITY);
       ctx.matrix = new sumerian.Matrix3();
       ctx.quatMatrix = new sumerian.Matrix3();
       ctx.quaternion = new sumerian.Quaternion();
       ctx.baseScale = new sumerian.Vector3(1,1,1);
       ctx.baseTranslation = new sumerian.Vector3(0,0,0);
       ctx.entity.hide();
    
       ctx.worldData.onImageFound = event => {
         ctx.imageFound = true;
         if (ctx.entity.isHidden) {
           ctx.entity.show();
         }
       };
    
       ctx.worldData.onImageLost = event => {
         ctx.imageFound = false;
       };
    
       ctx.worldData.onImageUpdated = event => {
         // Fired when an image's location is updated, either by SLAM or by image tracking. We only
         // want to update the imageTargetAnchor for image tracking, indicated by ctx.imageFound.
         if (!ctx.firstUpdate && ctx.imageFound) {
           // Rotation
           ctx.quaternion.set(event.rotation.x, event.rotation.y, event.rotation.z, event.rotation.w);        
           ctx.quatMatrix.copyQuaternion(ctx.quaternion);
           ctx.matrix.mult(ctx.quatMatrix, ctx.baseRotationMatrix);
           ctx.entity.transformComponent.setRotationMatrix(ctx.matrix);        
    
           // Translation
           ctx.entity.transformComponent.setTranslation(ctx.baseTranslation.x + event.position.x, ctx.baseTranslation.y + event.position.y, ctx.baseTranslation.z + event.position.z);
    
           // Scale
           ctx.entity.transformComponent.setScale(event.scale * ctx.baseScale.x, event.scale * ctx.baseScale.y, event.scale * ctx.baseScale.z);
         }
       };
    
       // See https://docs.8thwall.com/web/#sumerian-events for additional
       // available 8th Wall events.
       sumerian.SystemBus.addListener('xrimagefound', ctx.worldData.onImageFound);
       sumerian.SystemBus.addListener('xrimagelost', ctx.worldData.onImageLost);
       sumerian.SystemBus.addListener('xrimageupdated', ctx.worldData.onImageUpdated);
     }
    
     function update(args, ctx) {
       if (ctx.firstUpdate) {
         // Stash the unmodified entity's scale and rotation to add on to the
         // image target's location during the xrimageupdated callback
         ctx.firstUpdate = false;
         ctx.baseRotationMatrix.copy(ctx.entity.transformComponent.getRotationMatrix());
         ctx.baseScale.set(ctx.entity.transformComponent.getScale());
         ctx.baseTranslation.set(ctx.entity.transformComponent.getTranslation());
       }
     }
    
     function cleanup(args, ctx) {
       sumerian.SystemBus.removeListener('xrimagefound', ctx.worldData.onImageFound);
       sumerian.SystemBus.removeListener('xrimagelost', ctx.worldData.onImageLost);
       sumerian.SystemBus.removeListener('xrimageupdated', ctx.worldData.onImageUpdated);
     }
    
  8. In the lower left of the Text Editor, choose Save.

  9. Under the Camera component, make sure the AR Camera entity is set as the Main Camera.

  10. With the AR Camera still selected, in the Transform component, enter a Translation value of 0.4 in Y. 8th Wall uses the camera height to scale virtual content, so the height cannot be zero (see 8th Wall’s troubleshooting guide for more information).

  11. Delete the Default Camera and any other cameras in your Entities panel. You want the AR Camera to be the only camera in your scene.
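The transform math in the imageTargetAnchor script can be tried outside the engine with plain arrays. The sketch below (helper names are illustrative, not the Sumerian API) mirrors the xrimageupdated handler: the event quaternion is converted to a rotation matrix and composed with the stashed base rotation, while the event’s uniform scale and position combine with the base scale and translation.

```javascript
// Convert a unit quaternion {x, y, z, w} to a 3x3 row-major rotation matrix.
// This is the standard quaternion-to-matrix formula, the same conversion
// ctx.quatMatrix.copyQuaternion performs in the script.
function quatToMatrix({x, y, z, w}) {
  return [
    [1 - 2 * (y * y + z * z), 2 * (x * y - z * w),     2 * (x * z + y * w)],
    [2 * (x * y + z * w),     1 - 2 * (x * x + z * z), 2 * (y * z - x * w)],
    [2 * (x * z - y * w),     2 * (y * z + x * w),     1 - 2 * (x * x + y * y)]
  ];
}

// Compose two 3x3 matrices, mirroring ctx.matrix.mult(quatMatrix, baseRotation).
function multiply(a, b) {
  return a.map((row, i) => b[0].map((_, j) =>
    row.reduce((sum, _, k) => sum + a[i][k] * b[k][j], 0)));
}

// The event's single uniform scale multiplies each base-scale axis,
// and the event position offsets the stashed base translation.
const applyScale = (s, base) => base.map(v => s * v);
const applyTranslation = (p, base) => base.map((v, i) => v + p[i]);
```

Because the event’s scale is one uniform factor, every base axis is multiplied by the same number, matching the setScale call in the script.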

Step 4: Set Up a React App with AWS Amplify

Follow the tutorial Getting Started with Amazon Sumerian, AWS Amplify, and the React Component to set up a React web app to host your scene privately.

Be aware of the following as you complete that tutorial:

  1. Use the scene you created in the Publish the Sumerian Scene Privately and Add It to the Amplify Project section of that tutorial.

  2. When you test the React app locally on your computer, the Main Camera won’t be placed correctly, because your computer doesn’t have the rear-facing camera and orientation sensors found in AR devices. The Main Camera will be placed correctly when you test on an AR device later in this tutorial.

  3. In the tutorial’s final section, Running on a VR or AR Device, in which you add hosting and publish your React app, choose PROD. Your app must be served over HTTPS for the 8th Wall library to work correctly.

Step 5: Create an 8th Wall Developer Account

Now that the AR scene is hosted privately in a React app, we integrate the 8th Wall JavaScript library to handle image tracking and SLAM. It’s free to set up an 8th Wall developer account and to test your AR web app locally (see the 8th Wall pricing plans for details).

Complete the following sections of the 8th Wall setup tutorial:

  1. Create a ‘Web Developer’ account.
  2. Create an app key. Copy the app key created in this process. You will need it in Step 7.
  3. Authorize your AR device.

Step 6: Add 8th Wall Image Targets

In this step, you upload an image onto which the imageTargetAnchor script you added in Step 3 will anchor Sumerian entities. This can be any image. For this tutorial, try using your business card or something similar.

  1. Follow the 8th Wall documentation to add an image target. You can add more than one image if you want, for example, if you want your video quad to be anchored to your business card and your driver’s license.

  2. Crop your image so that it doesn’t have empty space around it (you can do this in the 8th Wall interface as you upload the image).

Step 7: Integrate 8th Wall into the React app

  1. In your React app directory, open public/index.html. Then add the following line between the <head> ... </head> tags, and replace APP_KEY with the app key you copied from Step 5.
     <head>
     ...
       <script async src="https://apps.8thwall.com/xrweb?appKey=APP_KEY"></script>
     ...
     </head>
    
  2. Add code to initialize the 8th Wall library. In src/App.js, replace the line import Amplify from 'aws-amplify'; and the App class definition with the following.

     // ...
     import Amplify, {XR as awsXR} from 'aws-amplify';
     // ...
    
     class App extends Component {
       render() {
         return (
           <div id="sumerian-scene-dom-id" style={ {height: '100vh'} }>
             <p id="loading-status">Loading...</p>
           </div>
         );
       }
    
       componentDidMount() {
         this.loadAndStartScene();
       }
    
       async loadAndStartScene() {
         await awsXR.loadScene('scene1', 'sumerian-scene-dom-id');
    
         const world = awsXR.getSceneController('scene1').sumerianRunner.world;
    
         window.sumerian.SystemBus.addListener('xrerror', (params) => {
           // Add error handling here
         });
    
         window.sumerian.SystemBus.addListener('xrready', () => {
           // Both the Sumerian scene and XR8 camera have loaded. Dismiss loading status
           const loadingStatus = window.document.getElementById('loading-status');
           if (loadingStatus && loadingStatus.parentNode) {
             loadingStatus.parentNode.removeChild(loadingStatus);
           }
         });
    
         window.XR8.Sumerian.addXRWebSystem(world);
    
         awsXR.start('scene1');
       }
     };
    

     Note: This code assumes you named your Sumerian scene scene1 during the amplify add xr step in _Step 4: Set Up a React App with AWS Amplify_. If you named your scene something other than scene1, update the code to use that name.
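The xrerror listener in the code above is left as a placeholder. One possible way to fill it in, sketched here as a suggestion (the formatXrError helper and the shape of the error event are assumptions, not part of the Amplify or 8th Wall API), is to turn the event into a user-facing message and surface it in the loading-status element:

```javascript
// Hypothetical helper: map an xrerror event to a message a user can act on.
// The `error` property read here is an assumption about the event's shape.
function formatXrError(params) {
  const detail = (params && params.error) ? String(params.error) : 'unknown error';
  return `AR failed to start (${detail}). Check camera permissions and reload.`;
}

// Inside loadAndStartScene, the handler could then update the DOM:
// window.sumerian.SystemBus.addListener('xrerror', (params) => {
//   const el = window.document.getElementById('loading-status');
//   if (el) { el.textContent = formatXrError(params); }
// });
```

Camera-permission failures are a common cause of xrerror on first load, so pointing users at permissions is usually a reasonable default message.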

Step 8: Build, Deploy, and Test

Build and deploy your React app created in AWS Amplify, as follows.

    amplify publish --invalidateCloudFront

This builds and deploys your app and gives you a URL in the terminal window for viewing your React app on your AR device. The --invalidateCloudFront flag forces the Amazon CloudFront cache to be cleared for the publish, which is needed when you’re iterating and publishing in quick succession.

Optional: Debugging on AR Devices

To debug issues, you might need web developer tools such as the JavaScript console and debugger. On both Android and iOS, desktop web developer tools can remotely debug mobile devices; simply connect the AR device to your computer with a USB cable.

  • For instructions on remote debugging on Android using Chrome, see this documentation.
  • For remote debugging on iOS using Safari, connect your mobile device to your computer with a USB cable, and then open Safari’s Develop menu and select the AR device from the Device List.

Optional: Interaction Improvement - Do Not Play Hidden Video

As an exercise, see if you can modify the videoPlayback script so that instead of calling video.play() on load, it waits for the first xrimagefound event to start the video when the image target is first detected.
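One possible shape for that exercise is sketched below. To make the logic runnable outside the engine, a minimal stand-in replaces sumerian.SystemBus, and a plain object stands in for the video element created by the videoPlayback script (both stand-ins are illustrative assumptions).

```javascript
// Minimal stand-in for sumerian.SystemBus so the logic can run outside the engine.
const SystemBus = {
  listeners: {},
  addListener(name, fn) {
    (this.listeners[name] = this.listeners[name] || []).push(fn);
  },
  removeListener(name, fn) {
    this.listeners[name] = (this.listeners[name] || []).filter(f => f !== fn);
  },
  emit(name, evt) {
    (this.listeners[name] || []).slice().forEach(fn => fn(evt));
  }
};

// Hypothetical stand-in for the HTMLVideoElement used by videoPlayback.
const video = { playCount: 0, play() { this.playCount += 1; } };

// Instead of calling video.play() on load, start playback on the first
// xrimagefound event, then detach the listener so later events are ignored.
const onFirstImageFound = () => {
  video.play();
  SystemBus.removeListener('xrimagefound', onFirstImageFound);
};
SystemBus.addListener('xrimagefound', onFirstImageFound);
```

In the real script, you would register this listener in setup and remove it in cleanup, as the imageTargetAnchor script does for its own listeners.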

You should now have a much better understanding of how to create an AR application using Amazon Sumerian, AWS Amplify, and 8th Wall. To learn more, check out the other Learn Sumerian tutorials.


© 2019 Amazon Web Services, Inc. or its affiliates. All rights reserved.