Target Audience Research: User Survey

While the allure of Noir is an assumption I hold about the target audience, it is just as important that the audience is interested in role-playing, in interacting with other participants, and in doing so within the context of their local town. I made assumptions about attitudes towards a product tailored to a person’s home town, as well as about the concept of leaving one’s mark within the fictional world of the experience. To test and respond to these assumptions, I carried out some primary research, which I have detailed below.

Although I only polled one hundred people, I used an initial qualifying filter to exclude anyone who answered no to the question “Could you be interested in a themed digital interactive experience?”. Of the filtered participants, 83% said they would take part in the digital interactive experience if it were Film Noir themed. I then tested the perceptions, knowledge and preferences the participants held about both Film Noir and the nature of the type of app I am building. Among the findings: a majority (61%) would prefer to download the app via an app store and print out any physical assets rather than purchase a boxed product. This is not to say that a boxed product is unpopular; 39% would prefer a boxed edition, although the survey did not give details of pricing differences. My original assumption early in the product development cycle was to ship the app as a hybrid boxed app, but I have also come up against logistical and cost problems with a supplier. It is therefore very likely, given the research data, that the app will initially be distributed primarily via the app stores.

Demographics Summary

Several other assumptions were validated by the research. 59% of participants prefer an open narrative as opposed to a prewritten one, although a greater majority (77%) prefer a narrative with an ending. This latter finding suggests that an episodic approach could satisfy a large majority: participants control much of the narrative experience while encountering closure through periodic endings to sub-plots. A staggering 82% of those surveyed agreed to some degree that it is better when a digital interactive experience is tailored to a participant’s local town or city. 62% of participants said they would be moderately or extremely likely to engage with interactive digital content within their local town as part of the experience; a further 22% said they would be slightly likely. Leaving one’s mark within an interactive experience, such as by leaving hidden messages at locations for other participants to discover, was deemed moderately to extremely important by 82% of people. Those surveyed agreed overwhelmingly (87%) that digital interactive experiences on mobile devices could encourage people to engage more with their local towns if tailored to include familiar content such as well-known local buildings and places.

Using the demographic data obtained from the survey, I was able to identify certain trends that help to pinpoint more specific target audience cross-sections.

18–24: 92%
25–34: 81%
35–44: 84%
45–54: 69%
Over 54: 71%

While the idea of a digital interactive Film Noir experience is popular across all age ranges, it is most popular with younger participants. When asked about the likelihood of engaging with digital content in their local town, the results were broadly consistent among age groups.

18–24: 67%
25–34: 53%
35–44: 68%
45–54: 67%

Generally, across the results there was a suggestion of some scepticism in the 25–34 age group and the greatest level of interest in the 18–24 group, while the over-35s generally sat between these two groups in terms of interest. There was no disparity in the results between iPhone and Android users. Some other demographic filters produced variations in answering yes to the question of whether they would want to participate in a Film Noir experience.

University: 93%
Postgraduate: 93%
High school: 76%
Single: 81%
Married: 96%
Male: 96%
Female: 80%

Overall, the ideal target candidate based on this data would be:

Male, married, university educated and over 35.

Although all demographics responded positively across the survey, with the 18–24 age group the most positive of all, the number of participants polled is limited; it serves as a litmus test for general take-up of the idea. Finally, there was one further result which I found noteworthy. I asked participants which themes other than Noir would interest them, and the most popular of the ten choices were Comedy (64%) and Romance (43%). These are elements which could, and perhaps should, be included in the ongoing development of Noirscape’s gameplay. Noir film was certainly not without a sense of humour, albeit a dark one.

“I always cry at weddings. Especially my own.”

Humphrey Bogart, Film Noir Actor

Raw Data: Survey Results from March 2021

The first question sought to gauge the general public’s perception of Film Noir. The keywords love, detective and crime came out on top, along with ‘black & white’.

Interactive Rotary Phone Demo

I added some final sparkle to my vintage rotary dialler. Using various calculations, I am able to determine which number has been dialled in a way that quite closely emulates the real thing. For example, a digit is only registered when the dial is turned fully to the catch.

I also added sound effects. Each digit has its own sound file corresponding to the length of the rotation, which of course varies for each number. The sound effect also has to respond when the dial is released too early and therefore has an irregular return-rotation timespan.
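As a sketch of that timing logic (the asset naming and return-sweep rate below are my own placeholders, not the real Noirscape values):

// Sketch only: asset names and sweep rate are assumptions.
String soundAssetForDigit(int digit) => 'assets/audio/dial_$digit.mp3';

// The return sweep is longer the further the digit is from the stopper;
// an early release means a shorter, irregular return duration.
Duration returnDuration(double degreesSwept,
        {double returnDegreesPerSecond = 300}) =>
    Duration(
        milliseconds: (degreesSwept / returnDegreesPerSecond * 1000).round());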

Lastly, I added some texture to the background, plus some text in a typewriter-style font showing the participant’s name and their own telephone number within the experience.

I had great fun putting this into the hands of my two children, aged 9 and 12; they had no idea how to work a rotary dial phone. Who needs to invent enigmas and puzzles when we can just emulate the trials and tribulations of analog technology!

I also shared a demo with friends and colleagues, as well as on social media, and I was quite surprised by the number of people suggesting this could become an app in its own right. I haven’t researched what’s already in the app stores, but an interesting idea would certainly be a standalone rotary dialler app which interacts with the phone’s call API to initiate real calls. I am already thinking about the fun that could be had creating different versions: the 1970s, 1890s, 1950s.

Anyway, here is a video I put together to demonstrate the feature in action.

https://www.youtube.com/watch?v=Kr54QQu3U4Q

Custom Animated Interactivity #Dialler

Problem: Calculating which button has been pressed within a painted canvas

I created a telephone dial using Flutter’s CustomPainter API. This permitted me to draw the various elements onto the screen relative to the size of the screen. The problem now is: how to detect which number has been dialled? There are no workable gesture-detection strategies within the custom painter that allow me to detect touches on individual elements, like the numbers. I can, however, detect when the canvas is touched using the hitTest method of the CustomPainter class. This provides me with the screen coordinates (an Offset) of the x and y position of the ‘hit’: the position where the user has touched the screen.
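As a rough sketch of that setup (class and widget names here are mine, not the Noirscape source), the painter opts in to hit testing across the whole canvas and a GestureDetector supplies the canvas-relative offset of each touch:

import 'package:flutter/material.dart';

class DialPainter extends CustomPainter {
  @override
  void paint(Canvas canvas, Size size) {
    // Draw the dial face relative to the canvas size.
    canvas.drawCircle(
        size.center(Offset.zero), size.shortestSide / 2, Paint());
  }

  @override
  bool hitTest(Offset position) => true; // register hits anywhere on the canvas

  @override
  bool shouldRepaint(covariant CustomPainter oldDelegate) => false;
}

class Dial extends StatelessWidget {
  const Dial({Key? key}) : super(key: key);

  @override
  Widget build(BuildContext context) {
    return GestureDetector(
      onTapDown: (details) {
        // details.localPosition is the x/y offset of the 'hit'.
        debugPrint('hit at ${details.localPosition}');
      },
      child: CustomPaint(
        size: const Size.square(410),
        painter: DialPainter(),
      ),
    );
  }
}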

Now, to work out how that position corresponds to a number on the dialler, I need to do some math. Given the size of the canvas, I can easily calculate the center by dividing the x/y values by two. To illustrate this, I used the CustomPainter to draw two red circles on my canvas: one at the center and one where a hit occurs.

Initial Dial Drawing with center point and touched point highlighted

Now, if I can calculate the angle from the calculated center to the position of the hit, I will be able to map that value to the known position of a number on the dial.

In the above example, the tapped area shown on the left at the number seven is 50 (x) and 240 (y). It’s important to note that the values used by Flutter are logical pixels, not physical ones, so these values don’t correspond to the actual screen resolution. In this case, the red spot at dial seven is 50 logical pixels along the x axis and 240 logical pixels along the y axis, both relative to the top-left of the canvas.

The center, in my case, working on a physical Pixel 3 XL, is at (205, 205); the canvas is therefore a square with sides of 410 logical pixels.

I can now calculate the angle from the center of the canvas to the hit point using Dart’s built-in atan2 function:

double rads = atan2(position.dy - (canvasSize.height / 2),
    position.dx - (canvasSize.width / 2));

ref : https://api.flutter.dev/flutter/dart-math/atan2.html

Using the x and y coordinates above equates to the following:

rads = atan2(240 - 205, 50 - 205);

Which returns 2.92 radians (rounded to 2 decimal places).
To convert the angle to degrees, Dart has a built-in function. Otherwise the formula is:

(2.92 * 180.0) / pi

Which returns 167 degrees and is confirmed by Google:

Google Calculator to validate Radians and Degrees conversion
The Dial with a protractor image from Mathsisfun.com superimposed to show the angles look right
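For good measure, the same check in Dart (toDegrees is a tiny helper of my own; the built-in conversion mentioned above may be degrees() from package:vector_math, though that is an assumption on my part):

import 'dart:math' show atan2, pi;

double toDegrees(double radians) => radians * 180.0 / pi;

void main() {
  final rads = atan2(240 - 205, 50 - 205);
  print(rads);            // ~2.92 radians
  print(toDegrees(rads)); // ~167 degrees
}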

Dart’s atan2 function returns a range from -pi (3.14159…) to +pi, and pi radians equals 180 degrees, so the final value in degrees will always be between -180 and +180. The sign (+/-) therefore determines whether the hit was in the upper or lower half of the dial.

I can now calculate which number is being dialled based on the initial hit. The next step is to prevent the dial from turning beyond the stopper. The stopper was used as a catch on rotary phones to indicate the desired number before the dial automatically returned to its original position. The caller would place a finger inside the circular opening above the desired number and rotate the dial clockwise until reaching the stopper. The numbers remained stationary, and the angle of rotation was used to register each digit of the telephone number being called.
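To make the hit-to-digit lookup concrete, here is a minimal sketch. The angles in the table are illustrative assumptions, apart from 7 at roughly 167 degrees, which comes from the worked example above; the real values would be measured from the dial artwork:

import 'dart:math';
import 'dart:ui';

// Angle (degrees, atan2 convention, y pointing down) of each digit's
// finger hole. Assumed values using a 30-degree spacing; only 7 (~167)
// is taken from the worked example above.
const Map<int, double> digitAngles = {
  1: -13, 2: 17, 3: 47, 4: 77, 5: 107,
  6: 137, 7: 167, 8: -163, 9: -133, 0: -103,
};

double hitAngle(Offset hit, Size canvas) {
  final rads = atan2(hit.dy - canvas.height / 2, hit.dx - canvas.width / 2);
  return rads * 180.0 / pi; // -180..+180
}

int digitFromHit(Offset hit, Size canvas) {
  final angle = hitAngle(hit, canvas);
  var best = 0;
  var bestDiff = double.infinity;
  digitAngles.forEach((digit, a) {
    // Wrap-around distance, so +170 and -170 are 20 degrees apart.
    var d = (angle - a).abs() % 360;
    if (d > 180) d = 360 - d;
    if (d < bestDiff) {
      bestDiff = d;
      best = digit;
    }
  });
  return best; // e.g. digitFromHit(Offset(50, 240), Size(410, 410)) -> 7
}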

ARCore Flutter Further Tests

The main issue I’m currently experiencing is the inability to set the scale of an object. I’ve now forked the main repository, and I’m going to look deeper into it and make any changes I may need.

The problem I have is that when a 3D object is rendered, it has a colossal scale; I assumed this was constrained by some kind of Vector3 limit on the camera. For example, I am rendering a vintage telephone that I’d like to place on a table in the camera view; however, it appears the size of a gigantic spaceship.

There is a scale parameter on the Node (which represents the 3D model of the telephone, in AR-speak), but this doesn’t appear to have any effect. The parameter takes a Vector3 object (x, y, z), which is a measurement relative to the parent object. However, given that the Node’s parent object is not something I have access to, I can’t set this. Either way, I’ve tried setting the scale to tiny values, but it makes no difference. I’ve also tried wrapping the Node in other nodes, but that hasn’t helped either.
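For reference, this is the kind of thing I’ve been attempting (a sketch of my attempts, not a fix; the URL is a placeholder, and I’m assuming I’ve read the plugin’s ArCoreReferenceNode API correctly):

import 'package:arcore_flutter_plugin/arcore_flutter_plugin.dart';
import 'package:vector_math/vector_math_64.dart' as vector;

// Placeholder URL; the scale value seems to be ignored in my tests.
final telephoneNode = ArCoreReferenceNode(
  name: 'telephone',
  objectUrl: 'https://example.com/models/telephone.gltf',
  position: vector.Vector3(0.0, 0.0, -1.0),
  scale: vector.Vector3(0.01, 0.01, 0.01),
);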

https://github.com/KhronosGroup/glTF/tree/master/specification/2.0

I have checked out the underlying ARCore Java library and understand that the scale ought to be relative to the estimated size of the detected Plane (the horizontal plane of my desktop, for example). This size is taken from the estimated real-world coordinates and should be accurate to at least a metre. The attributes are ExtentX and ExtentZ. From these values it should be possible to scale the Node relatively. I’m going to check out the Java source code and see if I can spot anything.

https://developers.google.com/ar/reference/java/com/google/ar/core/Plane#getExtentX()
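If those extents were surfaced to the Flutter layer, the relative scaling could look something like this (a hypothetical helper, since the plugin doesn’t currently expose the Plane to me):

import 'package:vector_math/vector_math_64.dart' as vector;

// Hypothetical: derive a uniform node scale from the plane's estimated
// extents (metres) so the model covers a fraction of the smaller extent.
vector.Vector3 scaleFromPlaneExtents(double extentX, double extentZ,
    {double modelWidthMetres = 1.0, double coverage = 0.25}) {
  final target = (extentX < extentZ ? extentX : extentZ) * coverage;
  final s = target / modelWidthMetres;
  return vector.Vector3(s, s, s);
}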

Reformatting the Object File
I couldn’t find anything wrong in the code at first glance. The object scale should be relative to the plane upon which it’s placed. So, I turned to my object files again. I noticed that while the earlier tests using the KhronosGroup models were big (oversized yellow duck!), my own were spaceship-sized. So my attention turned to the GLTF encoding of my models.

I went through the specification again and cross-checked the Duck file with my telephone-turned-spaceship. It’s not easy to spot anything amiss this way, as it’s all transformations and rotations: numbers, all relative to one another. But I did have a thought about the origins of these 3D objects. I got them from Sketchfab, where you can download them directly in GLTF format. Great! Maybe not. I noticed that even Windows 3D Viewer couldn’t open my telephone.

I went back to Sketchfab and downloaded the telephone again, this time in USDZ format, a format created by Pixar that is becoming more and more associated with AR design. It’s a single file with the textures etc. incorporated. I imported this into Adobe Dimensions, and the first thing I noticed was a spaceship-sized telephone. I panned out of the ‘scene’ to see the telephone at its more earthly scale. My hypothesis is that Sketchfab auto-converts the source objects into GLTF as scenes rather than just objects, which could explain the scale issues. I hope this is the case, anyway. I’ll export the telephone from Dimensions in GLTF format and test it in AR again.

Telephone in Dimensions, waiting to be exported.

Once exported, I moved the files into the web project from which I’m serving these objects.

GLTF files

And deployed to Firebase Hosting:

deploy to firebase hosting

The result was certainly a step in the right direction. It’s no longer the size of the USS Enterprise, but it seems fixed to the size of the detected plane, which I suspect is estimated at one square metre; and it’s just floating about in the air like a drone. I shall work on the scaling further and try to understand why it’s not anchoring to the plane correctly.

Giant vintage ARCore Phone in Flutter

Reference Points

This is a good place to copy a few reference points from Google’s Java API docs for ARCore, as they are written succinctly and help to keep in mind the different concepts of AR development.

PLANE

“Describes the current best knowledge of a real-world planar surface.”

https://developers.google.com/ar/reference/java/com/google/ar/core/Plane

POSE

“Represents an immutable rigid transformation from one coordinate space to another. As provided from all ARCore APIs, Poses always describe the transformation from object’s local coordinate space to the world coordinate space.”

https://developers.google.com/ar/reference/java/com/google/ar/core/Pose


WORLD COORDINATE SPACE

“As ARCore’s understanding of the environment changes, it adjusts its model of the world to keep things consistent. When this happens, the numerical location (coordinates) of the camera and Anchors can change significantly to maintain appropriate relative positions of the physical locations they represent. These changes mean that every frame should be considered to be in a completely unique world coordinate space.”

https://developers.google.com/ar/reference/java/com/google/ar/core/Pose#world-coordinate-space

Conclusion

I was eventually able to scale my 3D object correctly using a combination of GLTF settings and ARCore config.

With some shader work within Flutter, I’ve created a Noir-esque look in which the vintage 1940s 3D telephone I got from Sketchfab (see link in the video description) is positioned consistently in the AR or ‘Mixed Reality’ world, based on the detected horizontal Plane, the Pose of the object and, of course, the World Coordinate Space.

ARCore with Flutter Tests

In order to get to the bottom of the issues I have been experiencing with ARCore for Flutter, I decided to test and document different combinations of factors. Given that there is a multitude of possible test cases, with different types of 3D object file, different versions of Sceneform and plenty of other variables, I wanted to test each case one by one and record my findings, as I’m convinced this is doable with Flutter and ARCore.

V1.15 Gradle

implementation 'com.google.ar.sceneform.ux:sceneform-ux:1.15.0'
implementation 'com.google.ar.sceneform:core:1.15.0'
implementation 'com.google.ar:core:1.15.0'
implementation 'com.android.support:multidex:1.0.3'

Tests with KhronosGroup Duck model
https://github.com/KhronosGroup/glTF-Sample-Models/raw/master/2.0/Duck/

1. Test GLTF with Draco compression

https://raw.githubusercontent.com/KhronosGroup/glTF-Sample-Models/master/2.0/Duck/glTF-Draco/Duck.gltf

Remote glb (github) : no (causes app to crash)

signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x4

2. Test Binary GLB

https://github.com/KhronosGroup/glTF-Sample-Models/blob/master/2.0/Duck/glTF-Binary/Duck.glb?raw=true

Remote glb (github) : no , not renderable

E/ModelRenderable( 8021): Unable to load Renderable registryId='https://github.com/KhronosGroup/glTF-Sample-Models/blob/master/2.0/Duck/glTF-Binary/Duck.glb?raw=true'

3. Test Embedded gltf

https://raw.githubusercontent.com/KhronosGroup/glTF-Sample-Models/master/2.0/Duck/glTF-Embedded/Duck.gltf

Remote glb (github): no, can’t load embedded URIs

Error : FileNotFoundException
Possible Solution : PR for embedded base64 extraction?


4. Test Quantized gltf

https://raw.githubusercontent.com/KhronosGroup/glTF-Sample-Models/master/2.0/Duck/glTF-Quantized/Duck.gltf

Remote glb (github) : no, fails silently

5. Test Original Format gltf

https://raw.githubusercontent.com/KhronosGroup/glTF-Sample-Models/master/2.0/Duck/glTF/Duck.gltf
Remote glb (github) : yes, WORKS!

6. Test Original Format gltf (alternative model)

https://raw.githubusercontent.com/KhronosGroup/glTF-Sample-Models/master/2.0/AntiqueCamera/glTF/AntiqueCamera.gltf

Remote glb (github) : no, fails silently

Info from library : I/com.difrancescogianmarco.arcore_flutter_plugin.ArCoreView( 8021):  addArCoreNode

ACTION

Step through the source code of Flutter ARCore looking for silent exceptions, to understand why test 6 fails when it is the same format as test 5, which passes.

I turned on debugging in the Flutter library (it is a property of the controller, but marked as final, so not manageable from the app level) and… the 3D object appeared! Without explanation at this stage. I was expecting a failure and some useful debugging info, but it suddenly works, and it’s quite a relief to see a decent, detailed 3D object in AR in Flutter (one that isn’t a rubber duck!).

The only additional debug during this test:

I/com.difrancescogianmarco.arcore_flutter_plugin.ArCoreView( 8021):  addArCoreNode
I/com.difrancescogianmarco.arcore_flutter_plugin.ArCoreView( 8021): addNodeWithAnchor inserted test
I/com.difrancescogianmarco.arcore_flutter_plugin.ArCoreView( 8021): addNodeToSceneWithGeometry: NOT PARENT_NODE_NAME

7. Test another alternative remote gltf in debug mode

https://github.com/KhronosGroup/glTF-Sample-Models/raw/master/2.0/Buggy/glTF/Buggy.gltf

Fails silently.

Disappointing, because the previous alternative loaded OK with debug mode on.

https://raw.githubusercontent.com/KhronosGroup/glTF-Sample-Models/master/2.0/Lantern/glTF/Lantern.gltf

TEST: OK – WORKS!
I’ve tested some custom objects, as well as one I converted myself from OBJ format, and on the whole they work. Some are slow due to download speeds, but this is potentially resolved if I can use a localhost.

However, with one example there are persistent cache-related errors that I may need to look into.
Caused by:

java.io.FileNotFoundException: /data/user/0/dev.matwright.noirscape/cache/1612617961633-0/textures/Material.001_baseColor.jpeg: open failed: ENOENT (No such file or directory)

No textures are found within the cache directory. I suspect this is related to my dev environment, though, and I’ll come back to this particular model to retest that assumption.

In terms of download speed, I can also reduce the size of the textures; there is plenty of room for reduction for display on a mobile screen.