ARCore Flutter Further Tests

The main issue I’m experiencing currently is the inability to set the scale of an object. I’ve now forked the main repository and I’m going to look deeper into it and make any changes I may need.

The problem I have is that when a 3D object is rendered it has a colossal scale; I assumed this was constrained by some kind of Vector3 limit on the camera. For example, I am rendering a vintage telephone that I’d like to place on a table in the camera view. However, it appears the size of a gigantic spaceship.

There is a scale parameter for the Node (which represents the 3D model of the telephone, in AR-speak), but this doesn’t appear to have any effect. The parameter takes a Vector3 object (x, y, z), which is a measurement relative to the parent object. However, given that the Node’s parent object is not something I have access to, I can’t set this. Either way, I’ve tried setting the scale to tiny values but it makes no difference. I’ve also tried wrapping the Node in other nodes, but this hasn’t helped either.

I have checked out the underlying ARCore Java library and understand that the scale ought to be relative to the estimated size of the detected Plane (the horizontal plane of my desktop, for example). This size is taken from the estimated real-world coordinates and should be accurate to within a metre. The relevant attributes are ExtentX and ExtentZ. From these values it should be possible to scale the Node relatively. I’m going to check out the Java source code and see if I can spot anything.
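
The idea reduces to simple arithmetic: take the fraction of the plane you want the model to occupy, and divide by the model’s own bounding size. A sketch in Python (the function name, the target fraction, and the assumption that the model’s units are metres are all mine, not the library’s):

```python
def scale_for_plane(model_size, plane_extent_x, plane_extent_z, target_fraction=0.25):
    """Uniform scale factor so a model occupies target_fraction of the
    smaller plane extent. model_size is the model's largest bounding-box
    dimension in its own units; extents are in metres, as reported by
    the Plane's ExtentX/ExtentZ getters in the Java API."""
    usable = min(plane_extent_x, plane_extent_z) * target_fraction
    return usable / model_size

# A telephone modelled 40 units tall, placed on a ~1 m x 1 m desk plane,
# taking up a quarter of the plane:
print(scale_for_plane(40.0, 1.0, 1.0))  # 0.00625
```

That factor would then go into the Node’s Vector3 scale on all three axes.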

Reformatting the Object File
I couldn’t find anything wrong in the code at first glance. The object scale should be relative to the plane upon which it’s placed. So, I turned to my object files again. I noticed that while the earlier tests using the KhronosGroup models were big (oversized yellow duck!), my own were spaceship-sized. So my attention turned to the glTF encoding of my models. I went through the specification again and cross-checked the Duck file with my telephone-cum-spaceship. It’s not easy to see anything amiss like this, as it’s all transformations and rotations – numbers which are all relative to one another.

But I did have a thought about the origins of these 3D objects. I got them from Sketchfab, where you can download them directly in glTF format. Great! Maybe not. I noticed that even Windows 3D Viewer couldn’t open the telephone. I went back to Sketchfab and downloaded the telephone again, this time in USDZ format – a format created by Pixar that’s becoming more and more associated with AR design. It’s a single file with the textures etc. incorporated. I imported this into Adobe Dimension, and the first thing I noticed was a spaceship-sized telephone. I panned out of the ‘scene’ to see the telephone at its more earthly scale.

My hypothesis is that Sketchfab auto-converts the source objects into glTF as scenes rather than just objects, which could explain the scale issues. I hope this is the case, anyway. I’ll export the telephone from Dimension in glTF format and test it in AR again.
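
One way to check that hypothesis directly: a .gltf file is plain JSON, so any scene-level scaling an exporter injects should show up as a `scale` or `matrix` property on a node. A small, hypothetical inspection helper (the function name is mine):

```python
import json

def non_identity_transforms(gltf_path):
    """Return (index, name, key, value) for every glTF node carrying a
    'scale' or 'matrix' property - the places where an exporter's
    scene-level scaling would hide."""
    with open(gltf_path) as f:
        gltf = json.load(f)
    hits = []
    for i, node in enumerate(gltf.get("nodes", [])):
        for key in ("scale", "matrix"):
            if key in node:
                hits.append((i, node.get("name", "<unnamed>"), key, node[key]))
    return hits
```

Running this against the Duck and the telephone should reveal whether the telephone’s root node carries a transform the Duck’s doesn’t.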

Telephone in Dimension, waiting to be exported.

Once exported, I moved the files into the web project from which I’m serving these objects.

GLTF files

And deploy to firebase hosting:

deploy to firebase hosting

The result was certainly a step in the right direction. It’s no longer the size of the USS Enterprise, but it seems to be fixed to the size of the detected plane – which I suspect is estimated at one square metre – and it’s just floating about in the air like a drone. I shall work on the scaling further and try to understand why it’s not anchoring to the plane correctly.

Giant vintage ARCore Phone in Flutter

Reference Points

This is a good place to copy a few reference points from Google’s Java API docs for ARCore, as they are written succinctly and help to keep the different concepts of AR development in mind.


“Describes the current best knowledge of a real-world planar surface.”


“Represents an immutable rigid transformation from one coordinate space to another. As provided from all ARCore APIs, Poses always describe the transformation from object’s local coordinate space to the world coordinate space.”


“As ARCore’s understanding of the environment changes, it adjusts its model of the world to keep things consistent. When this happens, the numerical location (coordinates) of the camera and Anchors can change significantly to maintain appropriate relative positions of the physical locations they represent. These changes mean that every frame should be considered to be in a completely unique world coordinate space.”
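
To make the Pose definition above concrete: applying a pose to a local point is a rotation by the pose’s unit quaternion followed by a translation, p′ = R(q)·p + t. The arithmetic can be sketched in Python (this mirrors what the Java API’s `Pose.transformPoint` does, but it is not the actual API):

```python
import math

def transform_point(q, t, p):
    """Apply a pose (unit quaternion q = (x, y, z, w), translation t)
    to a local point p, returning world coordinates: p' = R(q)*p + t."""
    x, y, z, w = q
    px, py, pz = p
    # Rotate p by q via p' = p + 2w(u x p) + 2(u x (u x p)), where
    # u = (x, y, z), avoiding a full quaternion class.
    ux = y * pz - z * py          # u x p
    uy = z * px - x * pz
    uz = x * py - y * px
    uux = y * uz - z * uy         # u x (u x p)
    uuy = z * ux - x * uz
    uuz = x * uy - y * ux
    rx = px + 2 * (w * ux + uux)
    ry = py + 2 * (w * uy + uuy)
    rz = pz + 2 * (w * uz + uuz)
    return (rx + t[0], ry + t[1], rz + t[2])

# 90-degree rotation about the y axis, then a 1 m shift along x:
s, c = math.sin(math.pi / 4), math.cos(math.pi / 4)
print(transform_point((0.0, s, 0.0, c), (1.0, 0.0, 0.0), (1.0, 0.0, 0.0)))
# ≈ (1.0, 0.0, -1.0)
```

The last quote is the important caveat: because the world coordinate space can shift every frame, these transforms should always be recomputed from the current frame’s Pose rather than cached.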


I was eventually able to scale my 3D object correctly using a combination of GLTF settings and ARCore config.

With some shader work within Flutter I’ve created a Noiresque look, in which the vintage 1940s 3D telephone I got from Sketchfab (see link in video description) is positioned consistently in the AR or ‘Mixed Reality’ world based on the detected horizontal Plane, the Pose of the object and, of course, the World Coordinate Space.

ARCore with Flutter Tests

In order to get to the bottom of the issues I have been experiencing with ARCore for Flutter, I decided to test and document different combinations of factors. Given that there is a multitude of possible test cases – different types of 3D object file, different versions of Sceneform and heaps of other variables – I wanted to test each case one by one and record my findings, as I’m convinced this is doable with Flutter and ARCore.

V1.15 Gradle

implementation ''
implementation ''
implementation ''
implementation ''

Tests with KhronosGroup Duck model

1. Test glTF with Draco compression

Remote glb (github) : no (causes app to crash)

signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x4

2. Test Binary GLB

Remote glb (github) : no , not renderable

E/ModelRenderable( 8021): Unable to load Renderable registryId=''

3. Test Embedded glTF

Remote glb (github) : no, can’t load embedded URIs

Error : FileNotFoundException
Possible Solution : PR for embedded base64 extraction?

4. Test Quantized glTF

Remote glb (github) : no, fails silently

5. Test Original Format glTF
Remote glb (github) : yes, WORKS!

6. Test Original Format glTF (alternative model)

Remote glb (github) : no, fails silently

Info from library : I/com.difrancescogianmarco.arcore_flutter_plugin.ArCoreView( 8021):  addArCoreNode


Next test: look in the source code of Flutter ARCore for silently swallowed exceptions, to understand why test 6 fails when it is the same format as test 5, which passes. I turned on debugging in the Flutter library (it is a property of the controller, but marked as final, so not manageable from the app level)… and the 3D object appeared! Without explanation at this stage. I was expecting a failure and some useful debugging info, but it suddenly works, and it’s quite a relief to see a decent, detailed 3D object in AR in Flutter (that isn’t a rubber duck!).

The only additional debug during this test:

I/com.difrancescogianmarco.arcore_flutter_plugin.ArCoreView( 8021): addArCoreNode
I/com.difrancescogianmarco.arcore_flutter_plugin.ArCoreView( 8021): addNodeWithAnchor inserted test
I/com.difrancescogianmarco.arcore_flutter_plugin.ArCoreView( 8021): addNodeToSceneWithGeometry: NOT PARENT_NODE_NAME

7. Test another alternative remote glTF in debug mode

Fails silently.

Disappointing, because the previous alternative loaded OK with debug mode on.

I’ve tested some custom objects, as well as one I converted myself from OBJ format, and on the whole they work. Some are slow due to download speeds, but this is potentially resolved if I can use a localhost server.

However, with one example there are persistent cache-related errors that I may need to look into:
Caused by: /data/user/0/dev.matwright.noirscape/cache/1612617961633-0/textures/Material.001_baseColor.jpeg: open failed: ENOENT (No such file or directory)

No textures are found within the cache directory. I suspect this is related to my dev environment, though, and I’ll come back to this particular model to retest that theory.

In terms of download speed, I can also reduce the size of the textures. There is plenty of room for reduction for use on a mobile screen.
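
As a rough sketch of that reduction, the resize arithmetic is just capping the largest texture dimension while preserving aspect ratio (the 1024 px cap is my assumption, not a measured requirement):

```python
def downscaled(width, height, max_dim=1024):
    """New (width, height) with the largest dimension capped at max_dim,
    preserving aspect ratio; mobile screens rarely benefit from
    textures much above ~1K on a side."""
    if max(width, height) <= max_dim:
        return width, height
    factor = max_dim / max(width, height)
    return round(width * factor), round(height * factor)

print(downscaled(4096, 2048))  # (1024, 512)
```

Any image tool can then perform the actual resample on the glTF’s texture files.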

Dev: 3D Object into AR with Flutter

I encountered a number of issues getting complex 3D objects into a Flutter AR app. The Flutter AR component takes either a remote glTF file – a format designed for loading 3D objects in a web browser – or an SFB file, a format specific to Sceneform, a 3D rendering component for smartphones. The web format is fine for simple objects that need to be loaded on the fly, but anything with complex, detailed textures can quickly run into big file sizes. Furthermore, I am hosting my files on Firebase Storage, and a glTF object is actually a bundle of files. I will need to handle this at some point, but my initial task is to implement a complex, detailed 3D asset in the AR view. To do this, given the file size and the fact there will only be a few of them, I wanted to embed the complex items with the app’s assets. This requires the SFB format.

The trouble here is that there are different versions of the Sceneform plugin. The original version by Google was frozen in time, archived as version 1.17.1 and open-sourced; this was a copy of the final closed-source version by Google (1.15.0). A further version, 1.16.0, was built as a Gradle module and supports the glTF format instead of SFA and SFB. glTF is supported within the Flutter ARCore module as a URL-based resource, not an asset. However, I could get around this by serving assets through an integrated HTTP server within my app for the complex static objects.
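
As a stand-in for that last idea, here is a minimal sketch of serving a directory of glTF bundles over localhost using Python’s standard library (the directory name and port are assumptions; in the app itself this would be an embedded Dart HTTP server, not Python):

```python
import functools
import threading
from http.server import HTTPServer, SimpleHTTPRequestHandler

def serve_models(directory="assets/models", port=8080):
    """Serve a directory of glTF files on localhost in a background
    thread; returns the server so the caller can shut it down."""
    handler = functools.partial(SimpleHTTPRequestHandler, directory=directory)
    server = HTTPServer(("127.0.0.1", port), handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

# server = serve_models()
# ...point the AR node at http://127.0.0.1:8080/telephone.gltf ...
# server.shutdown()
```

The AR plugin then treats the bundled asset exactly like any other remote URL, sidestepping the SFB requirement for embedded assets.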

Yet despite these caveats, upon testing the 1.15.0 version I WAS able to load a remote glTF file using reference 3D objects from the KhronosGroup GitHub repository. The Khronos Group is an American not-for-profit organisation which focuses on the creation of open-standard, royalty-free APIs.

BUT, only some gltf models would show in the AR space.

A duck model works fine:

But models more akin to the type of thing I want to use would not show; such as a lantern.

Both the duck and the lantern are part of the same Khronos glTF 2.0 sample library; each has non-embedded textures – separate 2D PNG files and a bin file. But something is preventing certain glTF files from rendering in ARCore and not others. I feel like getting to the bottom of this is going to be a drain on time, but I do need a solution. Unless I can understand the constraints and limitations of using different types of 3D objects with different implementations of the Sceneform 3D engine, I will be feeling my way in the dark, and this is not how I like to proceed. I will carry out further tests before reaching a conclusion.


Narrative Design [WIP]

The Noirscape app narrative design features several conceptual layers.

The present narrative is choice-based, loosely built on the CYOA Quest pattern. The pattern uses small, tightly-grouped clusters of nodes allowing many ways to approach a single situation (Ashwell 2015).

Quest Pattern

“This mode [Quest Pattern] is well-suited for journeys of exploration, focused on setting; the quest’s structure tends to be organised by geography rather than time.”

Sam Kabo Ashwell, Standard Patterns in Choice-Based Games, These Heterogenous Tasks, 2015


A Conceptual Interactive & Spatialised Narrative Design – Part 2

Storyboard of User Experience

The short YouTube presentation below charts the app customer Kevin’s step-by-step process, from purchasing the boxed product to completing the initial investigation/story.

Prototype of Noirscape

The boxed product is sold on the high street in selected specialised stores.

The product’s cover design will be highly stylised to the ‘Noir’ look and feel.

It is clear from the cover illustration that this is a hybrid board-app game.

The localised nature of the product holds special appeal as it provides a bespoke look and feel and a personal connection through the relationship with the consumer’s home town.


  • The boxed product is attractively illustrated and crafted
  • The product includes an NFC ID card, which allows a radio signal from the card to be detected by the user’s phone. This is used to activate the app, so that only people in possession of a card can play. It is also used to provide hints to players when they are stuck on an enigma.
  • The smartphone application is available in English and French.
  • The app uses GPS tracking to anchor gameplay in geographic places.
  • 360-degree video with special effects is used to create a sense of immersion
  • All features are widely supported by smartphones including budget models


Technical experiments have been validated in the following areas:

  • Applying special effects to 360 degree film
  • Applying superimposed footage in a 360 degree film
  • Accessing the 360 footage via Flutter, the cross-platform app development framework


  • Familiarity Hypothesis – building a product which is tailored to a user’s home town generates increased curiosity, intrigue and a sense of personal association, but also a level of respect and appreciation for the product innovators for acknowledging their town.
  • Thematic Approach – allows for the possibility of future releases of new editions of the product along different themes: Cyberpunk, Medieval, Victorian, etc. I decided to use Film Noir for the prototype primarily because I am personally inspired by the genre, but also because it lends itself so well to the embedded narrative of investigating past events, and to the notion of searching and solving enigmas, which is synonymous with escape-style games.
  • GPS/Geo – The arrival of smartphones, which are effectively pocket sized computers, has opened up new ways to experience a crossover between different types of spaces, places and narrative.

“The development of mobile technology, global positioning systems (GPS), and augmented reality counters the tendency of computers to lure sedentary users into virtual worlds by replacing simulated environments with real-world settings and by sending users on a treasure hunt in the physical space” [Ryan, Foote & Azaryahu, 2016, pp102]

  • Embedded & Spatialised Narrative Design – an innovative approach to combining storyworlds with the real world using recent smartphone technologies.

“The search for the hidden story takes advantage of the visual resources of digital systems by sending the player on a search for clues hidden in the storyworld” [Ryan, Foote & Azaryahu, 2016, pp108]

“In embedded narrative, space is there to be searched, since it contains the clues to the story that need to be retrieved” [Ryan, Foote & Azaryahu, 2016, pp110]


  • Some older, lower-end devices may not support NFC card-reading functionality
  • App requires above-average storage space due to 360-degree video media
  • Potential safety/responsibility issues concerning the public interaction aspect of the app (places)
  • Product can be passed on from one user to another potentially without purchase


  • Additional releases for other towns including major cities, in France & worldwide
  • New episodes can be developed and added via in-app purchases
  • More Augmented Reality (AR) features
  • Opportunity for users to ‘leave their mark’ with geo-stamped AR.


  • Build working cross-platform prototype of Noirscape
  • User test in my home town
  • Prepare crowdfunding / Kickstarter campaign for further towns

“A storyline becomes an option whenever a chronological or a thematic sequential structure is introduced into a spatial arrangement of coexistent elements in the form of routes and paths that direct movement in space” [Ryan, Foote & Azaryahu, 2016, pp158]


The app itself will be built using Flutter, a cross-platform app development framework created by Google. As an app developer, I have been using Flutter for about a year, and I have experimented with a range of concepts including machine-learning-driven games and narrative-based educational apps.

The app will require quite advanced video-editing skills, as it uses 360-degree film and special effects. To address this requirement, I have taken a number of professional Adobe training courses which lead to industry certification by Adobe:

  • Adobe Illustrator – graphic design/icons, etc
  • Adobe Premiere – 360 film editing
  • Adobe After Effects – 360 film special effects & animations
  • Adobe Photoshop – photo editing and effects

I recognise shortcomings in my knowledge and skills in respect of the physical boxed product’s design and production. To address part of this weakness, I have been working with 3D design software to envisage visual aspects of the boxed product. I have also registered for an Adobe InDesign course in January to help with the packaging design of the physical product.

I still need to research how to source and manufacture the box and some of the included items. However, I have already purchased samples of the NFC cards that will be used for the detective’s interactive ID card, and I have researched printing equipment which can be used to illustrate these plastic cards, too.

The project is being managed using an Agile methodology via Trello, a popular Kanban app, organising the development into stages while integrating the user-centred design approach alongside the personas and the storyboard.


WALSER, Randall. 1990. ‘Elements of a Cyberspace Playhouse’. Proceedings of the National Computer Graphics Association 1990, Anaheim, CA, 403–410.

RYAN, Marie-Laure, Kenneth E. FOOTE and Maoz AZARYAHU. 2016. Narrating Space/Spatializing Narrative: Where Narrative Theory and Geography Meet. Columbus: The Ohio State University Press.

AARSETH, Espen. 2001. ‘Allegories of Space: The Question of Spatiality in Computer Games’.





A Conceptual Interactive & Spatialised Narrative Design – Part 1

Design Challenge

To design an immersive and interactive smartphone application using 360 video and GPS to send users on a film noir themed investigation into the physical space of their local town.


Persuasive Purpose

  • To incite people to engage with the narrative of spaces and places of their town.
  • To encourage them to visit their town centre.
  • To promote physical exercise through walking and exploring.

Industrial Design

Noirscape is a physical boxed product that includes several items for gameplay as well as a GPS/AR smartphone application.

People at the mall

Type of Person

Residents who visit their local town rarely, or not at all, favouring instead peripheral commercial centres, and who are therefore rapidly losing contact with their local town’s character, economic and heritage value.

They are likely:

  • Ages 20 – 60
  • Active Smartphone users
  • Suburban
  • Comfortable with technology
  • Yearning for adventure

Persona 1 – Product Consumer

young man bringing groceries home

Kevin Rousseau

  • Age 25
  • Office Worker
  • Earns €3k per month
  • Single

“I seem to spend my whole life shopping, at work or sleeping”

Kevin Rousseau

“I hear there’s a new bar opened in town but I haven’t been down the high street for ages”

Kevin Rousseau

“I’d take a walk around town this weekend; but, no one else will join me”

Kevin Rousseau

Persona 2 – Product Retailer

young woman sitting on couch studying using the phone

Julie Liebereau

  • Age 29
  • Business Owner
  • Earns €4k per month
  • Sells Board Games
  • In a couple

“I’d love the opportunity to work with a local designer and cut out the middle-man”

Julie Liebereau

“I sometimes feel like I’m just selling fancy boxes. There’s so much scope for something new & innovative on the board game scene”

Julie Liebereau

“I love the idea of something local with a bespoke feel.”

Julie Liebereau