
Build an AR Portal app with ARCore and Sceneform
While trying to learn more about ARCore, I felt the urge to build an AR Portal.
Most of the examples I found on the web were made with Unity. I tried it and it works great indeed. ARCore with Unity is very powerful.
But what if an AR experience is simply a bonus to your app and not the main feature? I want to build a native Android app with a bit of AR flair, not a full-powered AR app.
Thus, I decided to take on the challenge of building an AR Portal with only plain ARCore and Sceneform. To add to the challenge, we are going to do everything, even generate the assets.
Our AR Portal is built with a 360 degree image to represent the virtual world, not a 3D model.
If you just want to check the code, here it is https://github.com/shinmen/ARPortal
Launch the app and marvel at your own AR Portal.

This article is composed of two parts:
- How to generate the assets
- How to use our assets to build our AR Portal app
The first part is optional. You can read it if you want to dive a bit deeper into how the assets were built and create your own.
Create your assets
If you just want to use the included assets and apply your own 360 image, feel free to skip this part.
I am definitely not a 3D artist and I don’t know one who could help me, but I found clara.io to create and export my assets as .obj files. It’s free and seems very powerful.
However, feel free to use any other software, this is just an example.
Here’s what we aim to build.

To make this, we superimpose two spheres and delete parts of them to create the “door” to the virtual world and the “door” back to the real world.
- The first sphere that is going to contain the virtual world is very simple to build.
Create a scene -> add a sphere to the scene -> (left panel) click on Tools -> select Multi-SubObject (looks like 4 dots) -> select a few tiles on the sphere to create the door toward the virtual world and click Delete on the left panel.
- The second sphere is the virtual world. This one is a bit more complex and requires a bit of explanation.
When we apply an image to a sphere, it sticks to the outer surface.
Why? Because of the normals. As I said, I’m not a 3D artist, but from what I understand, the normals indicate which way each face points, and therefore how light (and the texture) is applied to the object.

If we stick an image to the object with the default normals, it gives us this.
However, if we flip the normals, the rays of light point toward the inside and the image is stuck to the interior surface.

So the process for the second sphere is:
- In the menu -> Model -> Normals -> flip normals
- (left panel) click on Tools -> select Multi-SubObject (4 dots as a square) -> select a few tiles on the inside of the sphere to create the door toward the real world and click Flip on the left panel. This flips the normals on these tiles back to their original direction.
- Select these tiles on the outside of the sphere and click delete.
This way, from the outside we will be able to see the virtual world but from the inside, we will see the real world.
The last step is to apply an image.
File -> Import Files -> import a 360 jpeg image into the Material Library -> right-click on the jpeg -> Create Material from Image
Finally our two spheres are ready. Let’s export our new 3D objects.
Select the first sphere in the left panel -> File -> Export Selected -> Wavefront OBJ
Repeat the operation for the second sphere.
Build your AR Portal app
First we need to add ARCore and Sceneform to our app.
It’s pretty straightforward following Google’s “get started” tutorial.
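At the time of writing, the Gradle side boils down to roughly this in the app module’s build.gradle.kts (version numbers are examples; check the guide for the exact, current setup):

dependencies {
    // ARCore
    implementation("com.google.ar:core:1.15.0")
    // Sceneform UX, which provides the ArFragment we will use
    implementation("com.google.ar.sceneform.ux:sceneform-ux:1.15.0")
}

The guide also covers the manifest entries (camera permission and the ARCore meta-data) and the Sceneform Android Studio plugin that we will use later to import assets.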
Once we’re done, we must import our models to create our virtual world.
The two models we’ll use are a sphere container and a sphere with our virtual world inside it. Both have a hole of the same size to mimic the portal door.
.obj models generally come with a material file (.mtl) that describes the object’s appearance. The object file references the material file inside its definition.
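For instance, the .obj of one of our spheres contains directives like these (the file and material names are hypothetical, yours depend on how things were named in clara.io):

# load the material library shipped alongside the .obj
mtllib virtual_world_sphere.mtl
# apply one of its materials to the faces that follow
usemtl sphere_material
f 1/1/1 2/2/2 3/3/3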
Our virtual world sphere applies a material that sticks the 360 image onto it.
The sphere container, for now, applies a basic opaque material.
We want the sphere container’s material to become transparent in order to give the effect of a door floating in mid-air.
First, let’s import the virtual world sphere:
Right-click on the .obj file -> Import Sceneform Asset -> Finish
Sceneform will create a .sfa and a .sfb file. If you change values inside the .sfa file, it will automatically propagate those values to the .sfb file.
The .sfb file is what is used to display the 3D object in the Sceneform viewer widget.
The material that comes with the sphere container applies an opaque coat to the surface of the sphere. What we want is a transparent material.
One can be found in the sceneform repository: sceneform_face_mesh_occluder_material.mat
What’s important is colorWrite: false and depthWrite: true.
The meaning is obvious: we want the object to be transparent but still occlude what’s behind it.
The parameters entry name: unused is there, I think, because of a strange behavior where an empty parameters array triggers an error when importing an object with this material.
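For reference, the relevant parts of that material look roughly like this (reconstructed from memory of the Filament material format; check the actual file in the Sceneform repository for the exact contents):

material {
    name : "Occluder",
    shadingModel : unlit,
    parameters : [
        { type : float, name : unused }
    ],
    colorWrite : false,
    depthWrite : true
}

fragment {
    void material(inout MaterialInputs material) {
        prepareMaterial(material);
    }
}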
Download this material and import it into your sampledata folder alongside your 3D objects and materials.
Next, let’s import the sphere container with this new material:
Right-click on the .obj file -> Import Sceneform Asset -> Material Path: set the path of your transparent material -> Finish
In our generated sfa file, we can see that the referenced material is our new transparent one.
One last task: the 3D objects are rather small, and unless we design our portal for hobbits, we must enlarge them.
That’s easy: in each sfa file, in the model object, you can add a scale entry and assign a value. With a bit of trial and error, I settled on 10.
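The model block of the .sfa then looks something like this (the paths and names are examples; the scale line is the only addition):

model: {
  file: 'sampledata/virtual_world_sphere.obj',
  name: 'virtual_world_sphere',
  recenter: 'root',
  scale: 10,
},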
Everything is ready, we can code now.
The code is pretty straightforward: we build both models.
The only clarification I’d like to make is about this line:
renderPriority = Renderable.RENDER_PRIORITY_LAST
There is an issue with the rendering order of the objects that renders a black shape instead of our materials.
https://github.com/google-ar/sceneform-android-sdk/issues/167
In a nutshell, the camera feed is rendered last (priority value 7) and objects are sorted by distance. We make all our objects render last as well and let the camera render first within this group to avoid the issue.
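Concretely, loading each model looks something like this (a sketch; the helper and the resource names for the two generated .sfb files are mine):

import android.content.Context
import android.util.Log
import com.google.ar.sceneform.rendering.ModelRenderable
import com.google.ar.sceneform.rendering.Renderable

// Load a generated .sfb and force it into the "render last" group,
// as discussed above, to avoid the black-shape issue.
private fun loadRenderable(context: Context, resId: Int, onLoaded: (ModelRenderable) -> Unit) {
    ModelRenderable.builder()
        .setSource(context, resId)
        .build()
        .thenAccept { renderable ->
            renderable.renderPriority = Renderable.RENDER_PRIORITY_LAST
            onLoaded(renderable)
        }
        .exceptionally { throwable ->
            Log.e("ARPortal", "Unable to load renderable", throwable)
            null
        }
}

// Usage, with hypothetical resource names for the two .sfb assets:
// loadRenderable(this, R.raw.sphere_container) { portalRenderable = it }
// loadRenderable(this, R.raw.virtual_world) { roomRenderable = it }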
In the onTapListener, we create an anchor and attach a node to it.
We create an invisibleNode from our transparent sphere container as a child of our anchorNode.
We use the localPosition property to place it further away from the anchorNode and bring it closer to the floor.
Look at those great articles for further information on how to position our nodes with Vector3 and Quaternion.
Since the 3D objects’ holes (doorways) face away from the camera, we have to rotate the objects on their y axis to face the camera. The localRotation property does just that.
Now we create a roomNode, which contains our virtual world, and set its parent to the invisibleNode so that it is positioned relative to the sphere container. We set the virtual world node’s scale to be just a bit smaller than the sphere container’s to avoid overlaps.
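Put together, the tap handler looks roughly like this (a sketch; portalRenderable and roomRenderable are the two renderables loaded earlier, and the position, angle, and scale values are examples to tune for your own scene):

import com.google.ar.sceneform.AnchorNode
import com.google.ar.sceneform.Node
import com.google.ar.sceneform.math.Quaternion
import com.google.ar.sceneform.math.Vector3
import com.google.ar.sceneform.rendering.ModelRenderable
import com.google.ar.sceneform.ux.ArFragment

private fun setUpPortal(arFragment: ArFragment,
                        portalRenderable: ModelRenderable,
                        roomRenderable: ModelRenderable) {
    arFragment.setOnTapArPlaneListener { hitResult, _, _ ->
        // Anchor the portal where the user tapped on a detected plane.
        val anchorNode = AnchorNode(hitResult.createAnchor()).apply {
            setParent(arFragment.arSceneView.scene)
        }

        // The invisible node holds the transparent sphere container.
        val invisibleNode = Node().apply {
            setParent(anchorNode)
            renderable = portalRenderable
            // Push the portal away from the anchor and closer to the floor.
            localPosition = Vector3(0f, -1f, -2f)
            // Turn the doorway around so it faces the camera.
            localRotation = Quaternion.axisAngle(Vector3.up(), 180f)
        }

        // The room node holds the virtual world, relative to its container
        // and slightly smaller to avoid overlapping surfaces.
        Node().apply {
            setParent(invisibleNode)
            renderable = roomRenderable
            localScale = Vector3(0.95f, 0.95f, 0.95f)
        }
    }
}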
Launch the app now and enter your own AR portal.
Many optimizations could be made for a better UX, but the idea is there: create your own portal to a virtual world.
Finally, it would look a lot better with a skilled 3D artist providing the assets, but it’s still pretty satisfying to do it all by yourself.