Noirscape Swipe Card Development

Noirscape uses NFC (Near Field Communication) technology to add some fun to the signing-in process. A detective ID card is among the items that will be included in the boxed version of the game, but the cards will also be available via the future website and online merchandise store.

An NFC tag is a tiny, low-cost, batteryless radio transmitter which allows app developers to embed the featherweight, wafer-thin device into clothing, toys, posters and just about anything else. In my case, I am using a business card format. The actual NFC component sits inside the plastic and is only about an inch in diameter. For the purpose of my testing I have created my own version using card and a stick-on NFC tag.

Prototype ID cards used for a gamified sign-in process for the Noirscape experience.

Near Field Communication (NFC) is a standards-based short-range wireless connectivity technology that makes life easier and more convenient for consumers around the world by making it simpler to make transactions, exchange digital content, and connect electronic devices with a touch. NFC is compatible with hundreds of millions of contactless cards and readers already deployed worldwide.

https://nfc-forum.org/what-is-nfc/ [01/04/21]

The ID card for Noirscape contains a unique reference code that may only be used once and is used in conjunction with anonymous sign-in through Google Firebase integration. It is a gamification feature rather than a security element, although there are plenty of possibilities for future feature development, including ‘friending’ other participants in the real world by letting them scan one another’s cards.
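As a rough sketch of how the card scan and the anonymous sign-in could hang together in Flutter (the nfc_manager package is an assumption on my part, and the server-side validation of the one-time code is only hinted at in a comment):

import 'package:firebase_auth/firebase_auth.dart';
import 'package:nfc_manager/nfc_manager.dart';

Future<void> signInWithDetectiveCard() async {
  // Anonymous Firebase sign-in first; the card code is gamification, not security.
  final credential = await FirebaseAuth.instance.signInAnonymously();

  // Then wait for the detective ID card to be tapped against the device.
  await NfcManager.instance.startSession(onDiscovered: (NfcTag tag) async {
    final payload = Ndef.from(tag)?.cachedMessage?.records.first.payload;
    if (payload != null) {
      final cardCode = String.fromCharCodes(payload);
      // The one-time code would then be validated and linked to
      // credential.user server-side (e.g. via a Cloud Function) - not shown here.
      print('Card $cardCode scanned for user ${credential.user?.uid}');
    }
    await NfcManager.instance.stopSession();
  });
}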

Video walkthrough of the sign-in sequence upon first installation of the app.

AR Gamification : snaps

Finding hidden objects in an augmented reality (AR) experience is fun, especially when the items have interactive qualities which trigger new content, clues and narrative. Furthermore, given the quality of 3D assets, AR lighting, shadow and reflection prediction and generation, coupled with the Noirscape black & white filter, it’s a great little feature to be able to capture an augmented view within the home and share it with others on social media or messaging.

The user may take a picture of their AR scene and/or reposition the object to their liking
The user then reframes the image to their liking.

The Noirscape snap can be shared across social media or installed apps using the device’s native Share API
The image is now used within the app’s inventory for current user and may be shared to social media.

The functionality allows the participant to take an AR photo snap of their newfound fictional objects within the real-world space of their home. The image can then be shared on social media or sent to a recipient using the device’s sharing API. This provides content for the user to share, a memory from the experience and marketing value for the app brand.
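A minimal sketch of the sharing step, assuming the share_plus package is used to hand the captured snap to the device’s native share sheet (the file path and caption are placeholders):

import 'package:share_plus/share_plus.dart';

Future<void> shareNoirscapeSnap(String imagePath) async {
  // Hand the captured AR snap to the platform share sheet so the user
  // can post it to social media or send it to a contact.
  await Share.shareXFiles(
    [XFile(imagePath)],
    text: 'A snap from my Noirscape investigation.',
  );
}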

Noirscape AR shot shared to Twitter from within the App

In the above image, the dots represent the AR plane object (the horizontal plane detected in the real environment). Using the Flutter ArCore package, there was no way to hide these once the AR object had been placed, and it spoils the photograph somewhat. Fortunately I was able to fork the main package, add the functionality and send a pull request to the package maintainer. So, I am now able to hide these dots just prior to taking the capture.

One of the advantages of Flutter is the great community and the opportunity for contributing to packages.

AR : Fiction or Virtual?

Augmented Reality (AR) provides a way to place realistic-looking virtual objects into a real-world scene. While the object may merely exist upon the screen of a phone, there are features of AR which combine the two worlds, fiction and reality, beyond the two-dimensional surface of a smartphone. For example, when a 3D object is placed within the AR scene, it may not be physically present upon the targeted surface, but certain qualities of that surface are projected into the machine-generated final composition (the horizontality, width, height and distance of the plane), and ultimately this data is processed within the mind of the person who momentarily accepts the presence of a fictional item within their immediate real-world environment.

There are a few ways to look at this. One way is the simple matter of tricking the mind. I myself, during testing of my AR functionality, had a fall [no long-term damage to me or the phone] while walking about and testing placement of fictional virtual items in my home. My perception of the real-world space around me was tricked by the magic window of the smartphone through which I had been focusing for several minutes while placing a 1940s vintage telephone on various surfaces. I got confused and tripped over. Maybe it was just me being clumsy, but there is, it seems to me, something to be said about the nature of fiction and how we, as humans, easily incorporate representations of real items into our mental processing of reality. This isn’t even new, or a result of the emergence of high tech. Whenever we look at a photograph, we are looking at paper and markings, yet we see a person, or a real thing. Everyone recognises Magritte’s famous painting on this subject, The Treachery of Images.

La Trahison des images, 1929; René Magritte

In Magritte’s case, the representative object in question is more evidently two-dimensional and contrived. Although the mind interprets the markings as being a placeholder for a real pipe, that is all it is – a placeholder, in the same way that the word P I P E is a placeholder for the physical and usable real thing. In Magritte’s painting, the image is really a matter of linguistic semantics. The children’s book style in which the image is constructed intends to poke fun at the way we learn to identify things from images in the same way we do from words; in this case the pipe image is little more than a word, a modern hieroglyphic.

Around the same time as the famous painting, Magritte published a fascinating article in the journal La Révolution surréaliste entitled ‘Les Mots et les images’ (‘Words and Images’), in which he shares a number of observations, or platitudes maybe, surrounding the nature of words, images and their role in our interpretation of reality.

Magritte, 1929, Les Mots et les images, p. 32

The above illustrations are about depictions of reality and the way in which our minds relate to the concept of things, particularly within the scheme of language and words, but also just the nature of things. For example, one image remarks how an object leads one to believe that there are other objects behind it. Or, from another page, how the visible contours of objects, in reality, touch one another as though forming a mosaic.

Magritte, 1929, Les Mots et les images

So, what impact does Augmented Reality have in respect to these kinds of platitudes, the expectation that objects hide other objects and so forth? AR techniques allow devices and software to imitate reality and then embed the imitation within reality, capturing the direction of light within the scene to cast convincing-looking shadows and reflections. In the case of Noirscape, a participant discovers a fictional telephone and can place the telephone in their real-world environment rather convincingly. The app also features a rotary dialler which is associated with the fictional telephone. Within the scope of the app, the dialler is used to call fictional characters. But if it were connected to the device’s real calling capabilities, in other words if the participant could call someone in the real world by interacting with the representative dial of the AR telephone, then it’s hard for me to make a distinction between using a physical phone or the augmented reality one. I think maybe, in this case, it is no longer a question of being a fictional telephone, but rather a virtual telephone. Magritte depicts a pipe, and I cannot smoke a mere depiction; whereas I could hook up my fictional telephone to the real world and make a call.

3D telephone from the Noirscape app’s AR feature

References

http://ideophone.org/magritte-on-words-and-images/ [accessed 25/03/21]

https://plato.stanford.edu/entries/fiction/ [accessed 25/03/21]

http://ideophone.org/description-and-depiction/ [accessed 25/03/21]

360 Flashback Interaction

The challenge is to keep the app’s media files small so that they can be served on demand from a content delivery network (CDN). This ensures only content that is relevant to the current user is downloaded and the main app size is kept small. Instead of using an mp4 or other type of movie file to convey the town 360 scene, I am using the newer Google WebP format, which I have previously validated within the 360 Flutter component. My aim is to keep this file as small as possible so it can be quickly served from a remote server. Other content, such as voice and 3D characters, which is part of the experience but not specific to a town, may be compiled as assets with the build, as it will not need to update as often as other interactions. There is also the option to stream the non-town-specific video over the underlying 360 animated WebP image file.

Using an Adobe Media Encoder plugin I was able to export a short section of my 360 movie into a WebM file, the movie counterpart of the WebP format. However, this format turned out not to be supported by the 360 component I am using. So, I am looking to convert to WebP, as I will not need embedded audio, which can be played from a separate file.

I found that Google provides decent documentation for webp as well as a number of command line programs to help convert.

https://developers.google.com/speed/webp/docs/using

The conversion worked nicely. I am now able to open up my town scene with movement, in this case rainfall. However (there is always a however…), while the 360 plugin supports the standard Flutter Image widget, which in turn supports animated WebP images, I am so far unable to loop the animation, so it stops after the final frame.

The tools I am using can be downloaded here:

https://storage.googleapis.com/downloads.webmproject.org/releases/webp/index.html

Instructions for configuring the lightweight image sequence with the Google tools, followed by deployment to the CDN
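For reference, the WebP tools are driven from the command line. An invocation along these lines (the frame names, quality and timings are illustrative) assembles a PNG sequence into an animated WebP, and the -loop 0 option bakes infinite looping into the file itself, which looks relevant to the looping issue noted above:

# -loop 0 = loop forever, -d 100 = 100 ms per frame, -lossy -q 70 = smaller file
img2webp -loop 0 -lossy -q 70 -d 100 frame_*.png -o town_rain.webp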

Adding Sound and Interaction

As the 360 visual content of the spatial narrative is town-based, the relevant media file is served from the CDN. It’s important that the image files are as optimised as possible so they transfer quickly. Sound effects will be embedded within the app as they are common to all users, whereas narrative speech depends, like the town-based visual content, on the user’s language and also on the path they take through the adventure based on their decision making. So, it also makes sense to serve this content from the CDN. Speech files do not need to load instantly as they are usually triggered through an interaction. So, the final recipe involves playing embedded sound effects and an embedded visual effect while the main 360 media loads. The media is then cached, so the slight delay is only noticeable on the first playback. The narrative sound is then played on top of the 360 visual content at the appropriate moments.
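A rough sketch of that recipe, assuming the flutter_cache_manager package for the CDN fetch/caching and just_audio for playback (both package choices are assumptions on my part; the URLs and asset names are placeholders):

import 'dart:io';

import 'package:flutter_cache_manager/flutter_cache_manager.dart';
import 'package:just_audio/just_audio.dart';

final sfxPlayer = AudioPlayer();
final voicePlayer = AudioPlayer();

Future<File> loadTownScene(String townSceneUrl) async {
  // An embedded sound effect plays immediately while the town-specific
  // 360 WebP is fetched; the cache makes subsequent loads near-instant.
  await sfxPlayer.setAsset('assets/audio/rain_loop.mp3');
  sfxPlayer.play();
  return DefaultCacheManager().getSingleFile(townSceneUrl);
}

Future<void> playNarration(String narrationUrl) async {
  // Narration is served from the CDN per language and story branch, then
  // layered over the 360 scene when the interaction triggers it.
  await voicePlayer.setUrl(narrationUrl);
  await voicePlayer.play();
}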

For the sound I have opted for an AI-generated voice, albeit with intonation and a Hollywood accent.

Interactive 360 Scene Demo

Spatial Narrative Content

Noirscape experiments with a cross-reality approach to narrative. Participants search for and find fictional objects in their own physical home through the AR (augmented reality) feature. One of these objects is a door, and its keyhole is a doorway between an augmented view of one world and the entirely fictional world of another. Furthermore, Noirscape binds the narrative to physical spaces in the nearby town. In my case, and for the purpose of the pilot version of Noirscape, this is the French town of Bourg-en-Bresse.
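As a hedged sketch of how that binding to physical places might be checked on the device, assuming the geolocator package (the package choice, coordinates and radius below are illustrative only; the real trigger points come from the narrative data):

import 'package:geolocator/geolocator.dart';

// Illustrative trigger point in Bourg-en-Bresse.
const flashbackLat = 46.2056;
const flashbackLng = 5.2289;
const triggerRadiusMetres = 30.0;

Future<bool> isAtFlashbackLocation() async {
  final position = await Geolocator.getCurrentPosition();
  final distance = Geolocator.distanceBetween(
      position.latitude, position.longitude, flashbackLat, flashbackLng);
  // Only unlock the 360 flashback when the participant is physically nearby.
  return distance <= triggerRadiusMetres;
}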

I previously carried out fieldwork collecting 360 panoramic photographic content in and around the town, at over twenty locations selected not necessarily for their prominence but for their intrigue, whether that be a curious street name, a bygone brewery turned warehouse or the site of a house where a celebrated artist and philosopher once lived. The Noirscape experience will take participants into their town, where segments of narrative are revealed through what I call flashbacks. These are a staple component of film noir, where the protagonist, who is usually since deceased or condemned to a life in prison, recounts events from the past which played an important role in their current whereabouts [or absence thereof].

Opening Sequence to Sunset Boulevard, Paramount, 1950

My challenge is to take my 360 content from the town and combine it with fictional Noir narrative to give an augmented or combined immersive experience, whereby the content is triggered only by visiting the place in the physical world, at which point a flashback from a fictional past occurs. To achieve this I decided to work with digital 3D character creation and animation. I had previously arranged to work with a friend who is also an actor, but it’s complicated right now with the pandemic to meet up and spend enough quality time to get something filmed; I was planning to use my green screens and then take the content into the 360 editor using Adobe After Effects and Premiere Pro. One thing led to another and I opted for digital characters. I initially hoped I’d be able to use Adobe software, but they have discontinued their Fuse product, a character designer app that could be used with Mixamo, their recently acquired character animation service. I decided to use Reallusion’s Character Creator instead due to the vast amount of resources available. I used Headshot, their AI face generator, to base the character on my own face (although I’ve reworked it somewhat since!) and I imported custom objects like a Fedora hat and set the character up in a black coat.

A base character in Reallusion Character Creator software with an AI interpretation of my face projected onto it.
My clothed and hatted character in a T pose
Closer shot

Experimenting with different predefined pose templates

Next I took the character into iClone, Reallusion’s 3D animation suite. The challenge with iClone was to bring in my 360 photo and create my own scene within the panorama. However, I ran into problems with this at first. While export to 360 panorama format is supported in iClone, I couldn’t achieve this using photography without experiencing problems with the way the image was being wrapped, due to distortion at the poles of the sphere of the Skybox object. The Skybox object in iClone, and more generally in 3D design, is the imagery used to define the farthest visible details; this would normally be the sky, hence the name, but may also be a distant mountain range. Usually this would only be thought of as a backdrop, with far more focus on the foreground and midground detail. In my case the Skybox would be represented by a complete 360 photo, in which I would place 3D assets like a person, a vehicle, etc.

Example of 0 degrees (ground) when the 360 image is wrapped within Photoshop

Ground shot taken in iClone with the same 360 photo set as the Skybox image

I discussed the issue in the Reallusion support forum, and one solution put forward was to create my own 3D sphere object and set my 360 image as its texture. This did produce a slightly better outcome, but not satisfactory enough for what I need. Reallusion is fantastic nonetheless; what I am seeking to do is certainly not a typical use case by any means. One really good feature of iClone, and one of the key reasons for setting a photo as the Skybox, is for calculating light within a scene. The iClone software will identify from the image, in my case the 360 photo, which direction light is coming from and therefore where to cast light and shade on the 3D assets added to the scene. So, although I chose not to use iClone with the 360 photo visible, I still used it for the lighting work.

Scene from within iClone with my 3D character and other assets placed within my photo.

Within iClone I applied some subtle animation to my character: his tie blows in the wind and he blinks and moves a little while he waits for his rendezvous. I applied rain effects with splashes and flickering light effects. In order to export my animation without the Skybox image, so that I could bring it into Adobe After Effects, I needed to export it as an image sequence to ensure a transparent background. The sequence is 30 seconds long at 30 frames per second, so the software rendered 900 images in total, which I then imported into After Effects.

Within After Effects the first challenge was to align the two-dimensional representation of my sequence within a 360 environment. If I place it as-is, then it will be forcibly bent into a banana shape as it is interpreted through a 360 viewer. So, to avoid this, it’s important to alter the curvature of the 2D assets to align with the 360 image in equirectangular-panoramic format.

The 2D animation curvature is altered to match that of the 360 scene so that when wrapped into a sphere it looks correct.
My Animation positioned within the 360 photo with field of view warping to match 360 sphere position.
Adobe After Effects Settings Using the VR Plane to Sphere effect to warp the field of view.

I’m generally pleased with the outcome and, although it took quite a bit of time to get what I wanted, I now have a documented workflow for the process, a character ready to deploy to new scenarios and the know-how to create others much more quickly. A small issue I have with the end result is that the animation is too subtle to really see properly on a mobile device, but this is easily tweaked. For now, I’m going to settle with what I have for the purpose of integrating with the app. The next step is to create a looping image-based version of the scene in WebP format, as I have shown in a previous post. I will then play the audio channel, with the voice narration and sound effects, via the app/device rather than the media file itself. This will keep the size of the media file down and allow me to serve the localised element (the view using footage from a specific town) separately from the global content – the spoken narrative.

Mobile phone view of interactive scene
Interactive YouTube Version

Target Audience Research : User Survey

While the allure of Noir is an assumption I hold about the target audience, it is just as important that the audience is interested in role-playing and in interacting with other participants within the context of their local town. I made assumptions about attitudes towards a product which is tailored to a person’s home town, as well as the concept of leaving one’s mark within the fictional world of the experience. In order to test and respond further to these assumptions I carried out some primary research, which I have detailed below.

Although I only polled one hundred people, I used an initial qualifier filter to exclude people who answered no to the question “Could you be interested in a themed digital interactive experience?”. 83% of the initially filtered participants said they would participate in the digital interactive experience if it were Film Noir themed. I then tested the perceptions, knowledge and preferences that the poll participants held about both Film Noir and the nature of the type of app I am building. Among the findings was that a majority (61%) would prefer to download the app via an app store and print out any physical assets rather than purchase a boxed product. This is not to say that a boxed product is unpopular; 39% would prefer a boxed edition. However, the survey did not give details of pricing differences. My original assumption early on in the product development cycle had been to ship the app as a hybrid boxed product, but I have come up against logistical and cost problems with a supplier as well. Therefore it is very likely, given the data from the research, that the app will initially be distributed primarily via the app stores.

Demographics Summary

Several other assumptions were validated within the research. 59% of participants prefer an open narrative as opposed to a prewritten one, although a greater majority (77%) prefer a narrative with an ending. This latter finding suggests that an episodic approach could satisfy a large majority, whereby the participants have control over much of the narrative experience while encountering closure through periodic narrative endings to sub-plots. A staggering 82% of those surveyed agreed to some degree that it is better when a digital interactive experience is tailored to a participant’s local town or city. 62% of participants said they would be either moderately or extremely likely to engage with interactive digital content within their local town as part of the experience; a further 22% said they would be slightly likely. Leaving one’s mark within an interactive experience, such as by leaving hidden messages at locations for other participants to discover, was deemed between moderately and extremely important by 82% of people. Those surveyed agreed overwhelmingly (87%) that digital interactive experiences through mobile devices could encourage people to engage with their local towns more if tailored to include familiar content, such as well-known local buildings and places.

Using the demographic data obtained from the survey I was able to determine certain trends which help to better identify more specific target audience cross-sections.

92% 18 – 24
81% 25 – 34
84% 35 – 44
69% 45 – 54
71% > 54

While the idea of a digital interactive Film Noir experience is popular across all age ranges, it is most popular with younger participants. When asked about the likelihood of engaging with digital content in their local town, the results were consistent among age groups.

67%  18 – 24
53%  25 – 34
68%  35 – 44
67%  45 – 54

Generally, across the results there was a suggestion of some scepticism in the 25 – 34 age group and the greatest level of interest in the 18 – 24 group, while the over-35s were generally in between these two groups in terms of levels of interest. There was no disparity in the results in terms of whether the person was using iPhone or Android. Some other demographic filters produced variations when answering yes to the question of whether they would want to participate in a Film Noir experience.

93% Uni
93% PostGrad
76% High School
81% Single
96% Married
96% Male
80% Female

Overall, the ideal target candidate based on this data would be:

Male, Married, University educated and over 35.

Although all demographics in the 18 – 24 age group responded positively across the survey, the quantity of participants polled is limited; it serves, though, as a litmus test for the general take-up of the idea. Finally, there was one further result which I found noteworthy. I asked participants which themes other than Noir would be of interest to them, and the most popular responses out of ten choices were Comedy (64%) and Romantic (43%). These are elements which could, and perhaps should, be included within the ongoing development of the gameplay of Noirscape. Noir film was certainly not without a sense of humour, albeit a dark one.

I always cry at weddings. Especially my own.

Humphrey Bogart, Film Noir Actor

Raw Data: Survey Results from March 2021

The first question seeks to get an idea of the general public’s perception of what Film Noir is. The keywords ‘love’, ‘detective’ and ‘crime’ came out on top, along with ‘black & white’. This was more of an intro question, to serve as a guide and a reminder of what is meant by Noir for the survey the participant was about to engage with.

I was interested to understand perceptions of Noir in terms of suitability for younger audiences. My assumption is that the app would be for over-sixteens. But it’s also true that at the height of Noir cinema there were tight rules and regulations regarding content and language. Writers and directors made creative use of language, lighting, scenes and so on to evoke the ‘forbidden’ aspects of the content without breaking the rules.

I wanted to learn about what an audience would do when given the power to drive the narrative themselves; to understand which Noiresque scenarios could be most popular.

The 100 participants who took part were pre-filtered: they had all previously agreed that they would be open to participating in a digital interactive experience. However, they were not told of the theme, so question four is a good test of whether the Noir theme is alluring or not. Those who answered no were then taken straight to question 6.

Question 5 seeks to further understand audience perception and preferences, this time in relation to role playing.

Question 6 was answered exclusively by those who said they would not wish to participate in a Noir-themed digital interactive experience. Six of the initially pre-filtered 100 participants seemingly changed their minds about their openness to such experiences. Just 9 out of the initial 100 stated they didn’t like the theme.

Participants who stated that they didn’t like digital interactive experiences in question 6 were not asked any further questions. The remaining 94 participants were then asked about other themes they would be interested in, either as well as Noir or, in the case of the 9 people who don’t like Noir, instead of it. This question provides a good indicator of where the Noir theme stands in relation to other more or less [assumed] popular themes.

The remaining questions were theme-neutral and sought to find out more about the audience’s perception of the nature of digital interactive experiences, along with some specific questions about assumptions carried by my proposed app.

Interactivity between participants would be a popular feature; 73% would prefer this. But 27% is a significant enough proportion to warrant further analysis. What are the reasons why, for example? Can those reasons be overcome by reassurances? Should the app offer a solo mode? This piece of research does not answer those questions.

The following question is troublesome in some ways to the premise and assumption that the Noirscape experience can be ongoing and expanding. Interestingly, it shows a very similar distribution to the previous question, and further analysis could be done to look at any correlation between the data from these two questions.

Question 12 validates an important assumption and premise of Noirscape – which of course is designed to be tailored to a participant’s locality.

Question 13 reinforces my locality hypothesis further [along with Q12].

Leaving one’s mark in a digital interactive experience is important to a large majority of the people questioned.

Finally, along with Q12 and Q13, the responses confirmed the assumption that an experience with elements tailored to their local town could encourage people to engage more with real local places.

Interactive Rotary Phone Demo

I added some final sparkle to my vintage rotary dialler. Using various calculations I am able to determine which number has been dialled in a way that quite closely emulates the real thing. For example, a digit is only registered when the dial is turned fully to the catch.

I also added sound effects. Each digit has its own sound file corresponding to the length of the rotation, which of course varies for each number. The sound effect also has to respond when the dial is released too early and therefore has an irregular return-rotation timespan.

Lastly, I added some texture to the background and some text using a typewriter-style font which shows the participant’s name and their own telephone number within the experience.

I had great fun putting this into the hands of my two children, aged 9 and 12; they had no idea how to work a rotary dial phone. Who needs to invent enigmas and puzzles when we can just emulate the trials and tribulations of analogue technology!

I also shared a demo with friends and colleagues, as well as on social media, and I was quite surprised by the number of people suggesting this could become an app in its own right. I haven’t researched what’s already in the app stores, but certainly an interesting idea would be to make a standalone rotary dialler app which interacts with the phone’s call API to initiate real calls. I am already thinking about the fun that could be had creating different versions: the 1970s, the 1890s, the 1950s.

Anyway, here is a video I put together to demonstrate the feature in action.

https://www.youtube.com/watch?v=Kr54QQu3U4Q

Custom Animated Interactivity #Dialler

Problem: Calculating the button which has been pressed within a Painted Canvas

I created a telephone dial using Flutter’s CustomPainter API. This permitted me to draw the various elements onto the screen relative to the size of the screen. The problem now is: how to detect which number has been dialled? There are no workable gesture detection strategies within the custom painter which allow me to detect touches on individual elements, like the numbers, directly. I can, however, detect when the canvas is touched using the hitTest method of the CustomPainter class. This provides me with the screen coordinates (offset) of the x and y position of the ‘hit’, the position where the user has touched the screen.

Now, to be able to work out how that position corresponds to a number on the dialler, I need to do some math. Given the size of the canvas I can easily calculate the center by dividing the x/y values by two. To illustrate this I used the CustomPainter to draw two red circles on my canvas: one at the center and one where a hit occurs.

Initial Dial Drawing with center point and touched point highlighted

Now, if I can calculate the angle from the position of the calculated center to the position of the hit I will be able to map the value to a known position of a number on the dial.

In the above example, the tapped area shown on the left at the number seven is 50 (x) and 240 (y). It’s important to note that the values used by Flutter are logical pixels and not physical ones, so these values don’t correspond to the actual screen resolution. In this case, the red spot at dial number seven is 50 logical pixels along the x axis and 240 logical pixels along the y axis, both relative to the top-left of the canvas.

The center, in my case, working on a physical Pixel 3 XL, is at 205,205 (the canvas is therefore a square with 410 logical pixel sides)

I can now calculate the angle from the center of the canvas to the hit point using Dart’s built-in atan2 function:

double rads = atan2(position.dy - (canvasSize.height / 2),
    position.dx - (canvasSize.width / 2));

ref : https://api.flutter.dev/flutter/dart-math/atan2.html

Using the x and y coordinates above equates to the following:

rads = atan2(240 - 205, 50 - 205);

Which returns 2.92 radians (rounded to 2 decimal places)
To convert the angle to degrees there are helper functions available; otherwise the formula is:

(2.92 * 180.0) / pi

Which returns 167 degrees and is confirmed by Google:

Google Calculator to validate Radians and Degrees conversion
The Dial with a protractor image from Mathsisfun.com superimposed to show the angles look right

Dart’s atan2 function returns a range from -pi (3.14159…) to +pi, and pi radians equals 180 degrees, so the final value in degrees will always be between -180 and +180. Therefore the sign (+/-) determines whether the hit was in the upper or lower half of the dial.

I can now calculate which number is being dialled based on the initial hit. The next step is to prevent the dial from turning beyond the stopper. The stopper was used as a catch on rotary phones to indicate the desired number before automatically returning the dial back to its original position. The caller would place a finger inside the circular opening above the desired number and rotate the dial clockwise until they reached the stopper. The numbers remained stationary, and the angle of rotation was used to signal each digit of the telephone number being called.
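To make that mapping concrete, here is a simplified sketch of how the angle (in degrees, as above) might be clamped at the stopper and quantised into a dialled digit. The geometry constants are placeholders rather than the app’s measured values:

import 'dart:math';
import 'dart:ui';

// Angle from the canvas center to the touch point, in degrees 0-360,
// measured clockwise from the positive x axis (screen y points down).
double hitAngleDegrees(Offset position, Size canvasSize) {
  final rads = atan2(position.dy - canvasSize.height / 2,
      position.dx - canvasSize.width / 2);
  final degs = rads * 180.0 / pi;
  return degs < 0 ? degs + 360.0 : degs;
}

// Maps an angle to one of the ten finger holes (1..9 then 0), or null if the
// touch falls outside the numbered arc. Start angle and spacing are assumed.
int? digitForAngle(double degrees) {
  const firstHoleDegrees = 60.0;
  const degreesPerHole = 30.0;
  final steps = ((degrees - firstHoleDegrees) / degreesPerHole).round();
  if (steps < 0 || steps > 9) return null;
  return steps == 9 ? 0 : steps + 1;
}

// Prevents a drag from rotating the dial past the stopper.
double clampToStopper(double rotationDegrees) {
  const maxRotation = 300.0; // roughly where the stopper sits (assumed)
  return rotationDegrees.clamp(0.0, maxRotation).toDouble();
}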

Interactive Components

The Noirscape participant collects fictional objects in the augmented and immersive reality parts of the experience. Beyond the fun of finding these items and seeing them appear in situ on your living room table, they also provide important props and gameplay. The objects are associated with actions. Sometimes an object will have multiple actions; sometimes objects will depend on one another to work together. Here are a couple of examples of interactive fictional objects I’m currently developing.

Interactive rotary dial built using Flutter Custom Painter

A phone is discovered within the AR space and now appears in the ‘found objects’ screen. The phone has a ‘call someone’ action. A participant will need to find the right number; maybe it’s on a scrap of paper, hidden in a desk drawer or in the pocket of a recently murdered man. Either way, the participant will be able to have some fun with the interactive object.

The vintage phone found in AR space and now listed in the user’s found objects inventory

My aim with this object is to allow the participant to operate an old-fashioned rotary-dial phone. I have used Flutter’s built-in painting library and decorators to build the interface. The interface is built up of shapes and lines which are calculated with elementary maths; for example, the numbers are positioned dynamically. I referenced a previous project I worked on to build an art-deco style clock, as the phone dial takes a similar approach. However, the numbers are not in the same order, and they are not evenly spaced around the dial face as they would be on a clock face. But the previous project helped me get started.

Example snippet of code to position the dial numbers. Among the challenges is to make the digits (text) position vertically by default.
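As an illustration of the kind of maths involved (a reconstruction rather than the original snippet), the digits can be laid out on a circle around the canvas center and each one painted upright with its own TextPainter; the start angle, spacing and styling here are assumptions:

import 'dart:math';
import 'package:flutter/material.dart';

// Paints the ten dial digits around the center of the canvas.
void paintDialDigits(Canvas canvas, Size size) {
  final center = Offset(size.width / 2, size.height / 2);
  final radius = size.width * 0.38; // distance of the digits from the center
  const digits = [1, 2, 3, 4, 5, 6, 7, 8, 9, 0];

  for (var i = 0; i < digits.length; i++) {
    // Holes are spaced 30 degrees apart, starting from an assumed offset.
    final angle = (60.0 + i * 30.0) * pi / 180.0;
    final position = center + Offset(cos(angle), sin(angle)) * radius;

    final painter = TextPainter(
      text: TextSpan(
        text: '${digits[i]}',
        style: const TextStyle(color: Colors.white, fontSize: 24),
      ),
      textDirection: TextDirection.ltr,
    )..layout();

    // Center each digit on its calculated point so the text itself stays upright.
    painter.paint(
        canvas, position - Offset(painter.width / 2, painter.height / 2));
  }
}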

I took a few screenshots as the painting progressed. The end result looks quite satisfactory. The next stage was to work on making this interactive. I am currently able to rotate the dial at specific angles, and I am also able to detect a user’s drag interaction with the screen. My next task is to combine these together so that a user can turn the dial and the dial then returns automatically to the starting position. When a correct number (one found within the gameplay) is dialled, the user learns new narrative or clues. When they play around with random numbers, there’ll be some fun too.

stage one
stage two

stage three
stage four
stage five

Interactive Door and Keyhole

When the participant finds an old fictional wooden door in their living room they notice it has a keyhole. Clearly there will be a key to find somewhere. But, until they do find the key, they can still interact with the door by peering through the keyhole.

Interactive door found in AR

The keyhole idea came about by accident while working on the AR object scanner. As I implemented a vignette to give the AR mode a noir feel, I mistakenly set the vignette to the wrong settings and created the pin-hole type effect. Although I will improve the shape and look of the keyhole view, it does work well, I find. I have restricted the viewing angles to represent the constraints of looking through a keyhole, as clearly it wouldn’t be right if you could see a full 360-degree view! The 360 image viewer works well with animated files too; I tested it with an image in Google’s newer WebP format, which allows GIF-style looping animations.
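For what it’s worth, here is a rough sketch of one way a keyhole-shaped mask could sit over the camera/360 view: a full-screen dark overlay with the keyhole punched out of it using a custom clipper. This is purely illustrative; the app’s actual vignette approach may differ.

import 'package:flutter/material.dart';

// Dark overlay with a keyhole-shaped hole cut out of it, intended to be
// stacked on top of the 360 viewer so the scene only shows through the hole.
class KeyholeOverlay extends StatelessWidget {
  const KeyholeOverlay({super.key});

  @override
  Widget build(BuildContext context) {
    return ClipPath(
      clipper: _KeyholeClipper(),
      child: Container(color: Colors.black.withOpacity(0.95)),
    );
  }
}

class _KeyholeClipper extends CustomClipper<Path> {
  @override
  Path getClip(Size size) {
    final center = Offset(size.width / 2, size.height / 2);
    // Keyhole = a circle with a tapered slot hanging below it.
    final hole = Path()
      ..addOval(Rect.fromCircle(center: center.translate(0, -40), radius: 60))
      ..addPolygon([
        center.translate(-30, 0),
        center.translate(30, 0),
        center.translate(55, 120),
        center.translate(-55, 120),
      ], true);
    // Keep everything except the keyhole so the scene shows through it.
    return Path.combine(
        PathOperation.difference, Path()..addRect(Offset.zero & size), hole);
  }

  @override
  bool shouldReclip(covariant CustomClipper<Path> oldClipper) => false;
}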

Narrative Props & Backstory

While, in itself, the narrative of Noirscape is to be interpreted at will by its participants, the structure of the environment in which the experience takes place will of course provide the underlying props and backstory, along with the opportunity for interactivity with these props, people and different types of spaces – fictional, semi-fictional and real-world.

To help anchor my own understanding of the interactive aspect I have put together a flowchart to represent a snapshot of the app’s experience flow. This diagram illustrates how the narrative interactivity moves between levels of fiction, semi-fiction and realworld spaces while providing the participant with the props, events and actions that make the app ‘playable’.

Flow chart to illustrate process of moving between ‘realities’ within a single narrative.

While this diagram only illustrates a section of the app’s world, it would represent a satisfactory achievement if I am able to build the functionality over the next four to six weeks. From then on I would be able to focus on adding new interactive items, 360 events and so on. The functionality to power the above snapshot is to be fed by cloud-based data, hence I am able to add new elements on the fly without having to bake them into the app itself and push out many releases.

Found Objects

A common theme throughout the experience is the acquisition of Noirscape objects. These objects can be found in the Augmented Reality (AR) part of the app, in the immersive 360 world and outside in the local town.

The objects each have their own unique properties and actions. Actions represent what a participant can do with each item. For example, as per the flow chart, the participant will experience the old-fashioned, real-sized doorway appearing in their own home within the AR part of the app. They will learn that the door is locked, as each item has a description. A common action in choose-your-own-adventure games is ‘examine’. The participant may examine the door. The door object is pre-configured to possess this ‘examinability’ quality, along with a consequence of carrying out the action; in the case of the door, the participant finds a keyhole as a consequence of the examine action. At some point they will surely find a key with which to ‘open’ the door. Until they find the key, though, it remains shut. The keyhole is also treated as an interactive object which the participant has found, albeit one that cannot be removed from the door, which is its parent object. However, the participant can ‘look through’ the keyhole, and this will reveal a keyhole view of a fictional space on the other side of the door.

An object (door) with available actions
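To make the object/action idea concrete, here is a small illustrative data model (the class names and fields are my own sketch, not the app’s actual schema; in practice this data is fed from the cloud):

// An action such as 'examine', 'open' or 'look through', with a consequence.
class ObjectAction {
  ObjectAction({required this.verb, this.requiresObjectId, required this.onPerform});

  final String verb;
  final String? requiresObjectId; // e.g. 'open' requires the key
  final void Function() onPerform; // reveal narrative, unlock a child object...
}

// A found object, its description and the actions it exposes.
class FoundObject {
  FoundObject({
    required this.id,
    required this.name,
    required this.description,
    this.actions = const [],
    this.children = const [], // e.g. the keyhole nested inside the door
  });

  final String id;
  final String name;
  final String description;
  final List<ObjectAction> actions;
  final List<FoundObject> children;
}

// Example: the locked door with its examine / open / look-through flow.
final door = FoundObject(
  id: 'door',
  name: 'Old wooden door',
  description: 'A real-sized door standing in your living room. It is locked.',
  actions: [
    ObjectAction(verb: 'examine', onPerform: () => print('You notice a keyhole.')),
    ObjectAction(
        verb: 'open',
        requiresObjectId: 'key',
        onPerform: () => print('The door creaks open...')),
  ],
  children: [
    FoundObject(
      id: 'keyhole',
      name: 'Keyhole',
      description: 'A brass keyhole, cold to the touch.',
      actions: [
        ObjectAction(
            verb: 'look through',
            onPerform: () => print('A rain-soaked street flickers beyond.')),
      ],
    ),
  ],
);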

This transitioning through nuances of fiction and reality is an important aspect of Noirscape, very much inspired by my research into the success of Bandersnatch but also by the creative influence of other filmmakers, and indeed videogame makers and thinkers, who are keen to explore and exploit these blurred boundaries.

Refs/Resources

Text Adventure Game Design in 2020, Chris Ainsley
https://medium.com/@model_train/text-adventure-game-design-in-2020-608528ac8bda