360 Flashback Interaction

The challenge is to keep the app’s media files small so that they can be served on demand from a content delivery network (CDN). This ensures that only content relevant to the current user is downloaded and that the main app size is kept small. Instead of using an MP4 or another movie format to convey the town’s 360 scene, I am using Google’s WebP format, which I have previously validated within the 360 Flutter component. My aim is to keep this file as small as possible so it can be served quickly from a remote server. Other content, such as voice and 3D characters, which is part of the experience but not specific to a town, may be compiled as assets with the build, as it will not need updating as often as other interactions. There is also the option to stream the non-town-specific video over the underlying animated 360 WebP image file.

Using an Adobe Media Encoder plugin, I was able to export a short section of my 360 movie into a WebM file, the movie counterpart of the WebP format. However, this format turned out not to be supported by the 360 component I am using, so I am looking to convert to WebP instead; I will not need embedded audio, which can be played from a separate file.

I found that Google provides decent documentation for WebP, as well as a number of command-line programs to help with the conversion.

https://developers.google.com/speed/webp/docs/using

The conversion worked nicely, and I am now able to open up my town scene with movement; in this case, rainfall. However (there is always a however…), while the 360 plugin supports the standard Flutter Image widget, which in turn supports animated WebP images, I have so far been unable to make the animation loop: it stops after the final frame.
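One thing worth checking here is the loop count stored inside the animated WebP itself: Flutter’s image codec honours it, and in the WebP ANIM chunk a count of 0 means ‘loop forever’. Assuming a reasonably recent libwebp build (the -loop option is not in very old releases) and an exported image sequence, with purely placeholder file names, the frames can be assembled into an infinitely looping file like this:

```
# Assemble PNG frames into an animated WebP that loops forever.
# -loop 0      -> infinite loop (WebP ANIM chunk convention)
# -d 33        -> ~30 fps (per-frame duration in milliseconds)
# -lossy -q 70 -> trade a little quality for a smaller CDN payload
img2webp -loop 0 -d 33 -lossy -q 70 frame_*.png -o town_rain.webp
```

Running webpmux -info on the output afterwards confirms the loop count that was written.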

The tools I am using can be downloaded here:

https://storage.googleapis.com/downloads.webmproject.org/releases/webp/index.html

Instructions for configuring the lightweight image sequence with Google tools, followed by deployment to the CDN

Adding Sound and Interaction

As the 360 visual content of the spatial narrative is town-based, the relevant media file is served from the CDN. It’s important that the image files are as optimized as possible so they transfer quickly. Sound effects will be embedded within the app, as they are common to all users. Narrative speech, on the other hand, depends, like the town-based visual content, on the user’s language, and also on the path they take through the adventure based on their decision making; so it also makes sense to serve this content from the CDN. Speech files do not need to load instantly, as they are usually triggered through an interaction. The final recipe, then, involves playing embedded sound effects and an embedded visual effect while the main 360 media loads. The media is then cached, so the slight delay is only noticeable on the first playback. The narrative sound is then played on top of the 360 visual content at the appropriate moments.
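As a minimal sketch of that load-then-cache flow (assuming the flutter_cache_manager package and a purely hypothetical CDN URL), the first playback downloads the file and every later one is served from local storage:

```dart
import 'dart:io';

import 'package:flutter_cache_manager/flutter_cache_manager.dart';

// Hypothetical CDN location of a town's animated 360 scene.
const sceneUrl = 'https://cdn.example.com/towns/bourg-en-bresse/rain.webp';

/// Fetches the 360 scene; only the first call hits the network.
/// While the returned future is pending, the app can play its
/// embedded sound and visual effects to cover the delay.
Future<File> loadTownScene() {
  return DefaultCacheManager().getSingleFile(sceneUrl);
}
```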

For the sound I have opted for an AI-generated voice, albeit one with intonation and a Hollywood accent.

Interactive 360 Scene Demo

Spatial Narrative Content

Noirscape experiments with a cross-reality approach to narrative. Participants search for and find fictional objects in their own physical home through the AR (augmented reality) feature. One of these objects is a door, its keyhole a doorway between an augmented view of one world and the entirely fictional world of another. Furthermore, Noirscape binds the narrative to physical spaces in the nearby town. In my case, and for the purpose of the pilot version of Noirscape, this is the French town of Bourg-en-Bresse.

I previously carried out fieldwork collecting 360 panoramic photographic content in and around the town, at over twenty locations selected not necessarily for their prominence but for their intrigue; whether that be a curious street name, a bygone brewery turned warehouse, or the site of a house where a celebrated artist and philosopher once lived. The Noirscape experience will take participants into their town, where segments of narrative are revealed through what I call flashbacks. These are a staple component of film noir, in which the protagonist, who is usually since deceased or condemned to a life in prison, recounts events from the past which played an important role in their current whereabouts [or absence thereof].

Opening Sequence to Sunset Boulevard, Paramount, 1950

My challenge is to take my 360 content from the town and combine it with fictional noir narrative to give an augmented, combined immersive experience whereby the content is triggered only by visiting the place in the physical world, at which point a flashback from a fictional past occurs. To achieve this I decided to work with digital 3D character creation and animation. I had previously arranged to work with a friend who is also an actor, but with the pandemic it is complicated right now to meet up and spend enough quality time to get something filmed; I was planning to use my green screens and then take the content into the 360 editor using Adobe After Effects and Premiere Pro. One thing led to another and I opted for digital characters. I initially hoped I’d be able to use Adobe software, but they have discontinued their Fuse product, a character designer app that could be used with Mixamo, their recently acquired character animation service. I decided to use Reallusion’s Character Creator instead, due to the vast amount of resources available. I used Headshot, their AI face generator, to base the character on my own face (although I’ve reworked it somewhat since!) and I imported custom objects like a fedora hat and set up the character in a black coat.

A base character in Reallusion Character Creator software with an AI interpretation of my face projected onto it.
My clothed and hatted character in a T pose
Closer shot

Experimenting with different predefined pose templates

Next I took the character into iClone, Reallusion’s 3D animation suite. The challenge with iClone was to bring in my 360 photo and create my own scene within the panorama. However, I ran into problems with this at first. While export to 360 panorama format is supported in iClone, I couldn’t achieve this using photography without problems with the way the image was being wrapped, due to distortion at the poles of the sphere of the Skybox object. The Skybox object in iClone, and more generally in 3D design, is the imagery used to define the farthest visible details; this would normally be the sky, hence the name, but may also be a distant mountain range. Usually this would only be thought of as a backdrop, with far more focus on the foreground and midground detail. In my case the Skybox would be represented by a complete 360 photo, in which I would place 3D assets like a person, a vehicle, and so on.

Example of 0 degrees (the ground) when the 360 image is wrapped within Photoshop

Ground shot taken in iClone with the same 360 photo set as the Skybox image

I discussed the issue in the Reallusion support forum, and one solution put forward was to create my own 3D sphere object and set my 360 image as its texture. This did produce a slightly better outcome, but not satisfactory enough for what I need. The Reallusion software is fantastic nonetheless; what I am seeking to do is certainly not a typical use case by any means. One really good feature of iClone, and one of the key reasons for setting a photo as the Skybox, is calculating light within a scene. The iClone software will identify from the image, in my case the 360 photo, which direction light is coming from, and therefore where to cast light and shade on the 3D assets added to the scene. So, although I chose not to use iClone with the 360 photo visible, I still used it for the lighting work.

Scene from within iClone with my 3D character and other assets placed within my photo.

Within iClone I applied some subtle animation to my character; his tie blows in the wind and he blinks and moves a little while he waits for his rendez-vous. I applied rain effects with splashes and flickering light effects. In order to export my animation without the Skybox image so that I could bring it into Adobe After Effects, I needed to export it as an image sequence to ensure a transparent background. The sequence is 30 seconds long at 30 frames per second, so the software rendered 900 images in total, which I then imported into After Effects.

Within After Effects, the first challenge was to align the two-dimensional representation of my sequence within a 360 environment. If I place it as-is, it will be forcibly bent into a banana shape as it is interpreted through a 360 viewer. To avoid this, it’s important to alter the curvature of the 2D assets to align with the 360 image in equirectangular panoramic format.

The 2D animation curvature is altered to match that of the 360 scene so that when wrapped into a sphere it looks correct.
My Animation positioned within the 360 photo with field of view warping to match 360 sphere position.
Adobe After Effects settings, using the VR Plane to Sphere effect to warp the field of view.

I’m generally pleased with the outcome, and although it took quite a bit of time to get what I wanted, I now have a documented workflow for the process; I have a character ready to deploy to new scenarios and the know-how to create others much more quickly. A small issue I have with the end result is that the animation is too subtle to see properly on a mobile device, but this is easily tweaked. For now, I’m going to settle with what I have for the purpose of integrating with the app. The next step is to create a looping image-based version of the scene in WebP format, as I have shown in a previous post. I will then play the audio channel, with the voice narration and sound effects, via the app/device rather than the media file itself. This will keep the size of the media file down and allow me to serve the localised element (the view using footage from a specific town) separately from the global content: the spoken narrative.
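As a sketch of that last step (assuming the just_audio package, which is only one of several Flutter audio options, and a hypothetical URL), the narration can be started on top of the looping visuals at the right story moment:

```dart
import 'package:just_audio/just_audio.dart';

final narrator = AudioPlayer();

/// Plays a narration clip over the looping 360 scene.
/// The URL would point at a per-language speech file on the CDN.
Future<void> playNarration(String url) async {
  await narrator.setUrl(url); // buffers the remote file
  await narrator.play();      // speech plays over the visuals
}
```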

Mobile phone view of interactive scene
Interactive YouTube Version

Photoshoot and After Effects Improvements

During my recent experiments I have been working with Adobe After Effects to apply special effects and animations to 360-degree video footage. My first experiments were quite successful. I was able to do what I set out to do, with a couple of areas needing further attention. One such area was the export format from After Effects, which was producing an unusual effect whereby the 360 image was not wrapping correctly.

In the image below it can be seen how a 360 image looks when flattened out. There is a multitude of settings and formats within After Effects related to 360 formatting; some of these depend on the destination platform. For example, whether the video will be used in VR or as a Facebook or YouTube video. After much trial and error with different formats [I was unable to find sufficient documentation for my specific case within the Adobe support], I was able to output a satisfactory video, complete with the colour filters and rain effects.

360 image in After Effects

I took about 20 pictures at various locations around my town. I had previously used 360 video as the source format. However, I have since opted to work with 360 photos, to which I then apply animated effects and further layers of superimposed footage within the Adobe suite of software. This approach means I can opt for a higher definition of the underlying image source (a photo is higher resolution than a video frame), and it also means the final video size is much smaller. A further advantage is that it is more straightforward to edit out any unwanted elements. For example, I took one of my photographs into Adobe Photoshop and used its 3D tools to remove the camera tripod and the shadow it was casting.

Initial image after some Photoshop clean-up

Below, the image can be seen within Photoshop (usually associated with flat images), with filters applied and the ability to move and edit in 360 mode.

Manipulating a 360 image within Adobe Photoshop

The following short clip demonstrates how what was originally a static colour 360 image can be transformed into a more lively and ambient scene with just a few special effects.

After effects applied to animate a static image

Using this approach, I will be able to create some interesting scenes by adding further animated scenarios and providing clues to the player about how to proceed within the gameplay.

Delivering 360 Video from Device Assets

In my previous post I wrote about the 360 video viewer plugin I am experimenting with using Flutter, a cross-platform development framework, and a practice clip I edited within After Effects. While there is plenty of support for 360 images within Flutter, I was surprised to find only a single plugin supporting 360 video. The plugin uses a Google SDK for Cardboard VR. Although Google has since released a newer SDK for Cardboard, I only need limited functionality (to view a 360 video), so I am hoping this plugin will suffice.

The plugin uses Google VR SDK

In my previous experiment, there was a fairly serious limitation, in that I had to call the 360 clip via a URL.

The plugin only supports 360 video via a URL

360 video files can be quite big; the short ones I have been experimenting with are around 10 MB, and even on a fast Wi-Fi connection there was a significant delay while the video downloaded in its entirety before the 360 video player started playing. This would make for a poor user experience in terms of the wait. I could potentially add some gameplay functionality into this waiting time, but it’s still a big ask to download large files, especially if the user is outdoors and using mobile data, which may be limited or even expensive for them.

I spent some time analysing the plugin code, as well as the underlying Google VR SDK, to see how much work would be involved should I want to implement a feature myself to allow local, on-device files to load rather than via a URL. I found that this would not be a trivial project, so I sought an alternative approach.

I could not expect my users to access large video files via their mobile data, so I had to think up a new solution. Rather than develop new functionality for the plugin or build my own implementation of a VR SDK plugin, I decided to experiment with setting up a mini HTTP server within my app, and thus serving the 360 video asset from a local server URL.

Dart, the programming language which underpins the Flutter framework, includes built-in HTTP server functionality via its HttpServer class.

Dart’s HttpServer class

Using this approach I would be able to create a lightweight HTTP server within my app from which to serve the 360 videos and satisfy the 360 video viewer’s play-from-URL requirement. I even found an existing Flutter plugin which uses this technique and maps HTTP requests directly to a specific assets folder.

Flutter plugin implementing the http server class for a Flutter app scenario
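For illustration, here is a minimal sketch of the technique (the port and asset layout are hypothetical): Dart’s HttpServer is bound to the loopback address and each request is answered with the bytes of a bundled asset, so the 360 viewer can be pointed at a URL such as http://127.0.0.1:8080/clip.mp4.

```dart
import 'dart:io';

import 'package:flutter/services.dart' show rootBundle;

/// Serves bundled assets over HTTP so that a play-from-URL-only
/// plugin can load them, e.g. /clip.mp4 -> assets/video360/clip.mp4.
Future<void> startAssetServer() async {
  final server = await HttpServer.bind(InternetAddress.loopbackIPv4, 8080);
  server.listen((HttpRequest request) async {
    try {
      final data = await rootBundle.load('assets/video360${request.uri.path}');
      request.response
        ..headers.contentType = ContentType('video', 'mp4')
        ..add(data.buffer.asUint8List(data.offsetInBytes, data.lengthInBytes));
    } catch (_) {
      request.response.statusCode = HttpStatus.notFound; // unknown asset
    }
    await request.response.close();
  });
}
```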

I built a test app combining both the local assets server and the 360 viewer, along with my test 360 clip, and the video now loads much more quickly using the locally served file. This means I can now consider either bundling all my app’s 360 content into the app build itself, or downloading it to local filespace as and when the user has Wi-Fi access; thus overcoming both the slowness and the use of the user’s mobile data when accessing these clips from the web.

The 360 clip now loads instantly when the button is clicked.

Testing 360 Video with Flutter App Development

Following on from my 360 video experimentation in Adobe After Effects, I wanted to test out options for embedding my 360 videos within a mobile app using Flutter.

I found a number of existing packages available for 360 images; however, support for 360 video is less common. I found one package called video_player_360, which acts as a plugin for the Google VR SDK on both iOS and Android, allowing videos to be played via a remote URL.
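A call into the package looks roughly like this; the URL is a placeholder, and playVideoURL is the entry point the package’s documentation showed at the time of writing, so treat this as a sketch and double-check against the README:

```dart
import 'package:video_player_360/video_player_360.dart';

// Launches the native 360 player (Google VR SDK) with a remote clip.
Future<void> playTestClip() {
  return VideoPlayer360.playVideoURL('https://example.com/test-360.mp4');
}
```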

I have a couple of reservations about this package. For a start, it uses a Google SDK which is now archived; although, for just the 360 video functionality, this should not be a problem. Still, it will be worth looking at the new SDK which Google has released as a replacement, to see whether it has similar video player functionality; perhaps I could build a new Flutter plugin using the updated Google VR SDK.

New Google VR SDK for Android and iOS

The second issue I have with this plugin is that it only works with remote URLs. This means that a user must be connected to the internet, and given the nature of 360 video, these can be quite big files even for a short clip. In my test I was using a 12 MB file hosted on GitHub. My phone was connected to my office Wi-Fi and it took a good five seconds to load. Once it did load, the video worked nicely on my mobile device (a Google Pixel 3). I was able to move the phone about and experience the immersive effect of the video.

Testing 360 Video within a Flutter mobile app

360 Film Experimentation

A feature of my app concept is to use footage from within local towns. I would like to use 360 film if possible, for several reasons.

  1. It will provide a more immersive user experience
  2. It will provide opportunity to hide clues to the game within the virtual environment which are a little more challenging to find
  3. It will generally make for a more unique and innovative feel to the app

There are several areas I need to explore and experiment with in order to test the feasibility of this idea.

I would like to explore the possibility of shooting studio footage (for example, character-acted scenes) using a green screen, which would then be integrated into the town footage. This would mean I could reuse the same studio footage in multiple towns and therefore offer the app to a greater number of users. As my theme for the app is film noir, I could also use other props and footage, such as period vehicles and so on.

I have begun to experiment with 360 footage and stock character footage using Adobe Premiere and Adobe After Effects.

Initial 360 video footage in colour

I started by opening a short 360 video from a local woodland in Adobe Premiere. I then applied some filters to create a black and white effect.

360 Footage with Black & White Filter

Next I took the footage into Adobe After Effects to add rain and a darker ambience. The rain and darkness will be a common theme throughout my ‘Film Noir’ style app.

My next task was to test superimposing footage filmed on a green screen. For testing purposes I grabbed a free clip from Pixabay. Using the Keylight plugin provided with After Effects, a transparency is created, allowing the content to integrate with the existing scene.

Keylight helps to convert the green screen footage into a transparency
Keylight Transparency

In order to make the 2D footage work within the 360 environment, I had to convert the layer to a 3D layer and use the VR tools within After Effects to anchor the layer down within the 360 sphere, so that it remains in place even when the viewer navigates around the 360 environment.

Here is the 360 version on YouTube:

Conclusions and Next Steps

I found the experiments with 360 film quite promising. There are a few areas I need to iron out, though. The output from After Effects is producing an ill-formatted sphere, as can be seen in the YouTube video above. I am not sure why this is happening; the view is fine within After Effects, so I think it is a problem with the export settings. The VR controls in After Effects are still quite daunting to me, but I’m pleased I’ve been able to apply some effects like the ‘Noir’ filter and the rain and mist effects, as well as bring in the extra character.

My next steps are to iron out the export issues and make another short clip using new 360 footage from my town, which I intend to capture this coming weekend. I will also film some test acting scenes using a green screen and try to put together a solid example of a clip suitable for use in my app.

I would then like to test the integration of 360 film into a Flutter app, to see what controls are available.

One consideration that I am now aware of is that the output file sizes are quite big. This could be an issue for a mobile app. I will seek to keep the clips as short as possible, loop them, and use Flutter for the audio rather than embedding it in the video. This way I can keep the video files as compact as possible. It will also be a good idea to look at managing the download of forthcoming videos for the gameplay and the removal of previous ones, so that the app is neither a huge initial download nor becomes increasingly bloated over time.
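As a closing sketch of that idea (the directory layout and file names are hypothetical), the app could keep only the clips needed for upcoming scenes and delete the ones already played:

```dart
import 'dart:io';

/// Deletes any downloaded clip that is not needed by an upcoming
/// scene, so local storage never accumulates played content.
Future<void> pruneClips(Directory clipDir, Set<String> upcoming) async {
  await for (final entity in clipDir.list()) {
    final name = entity.uri.pathSegments.last;
    if (entity is File && !upcoming.contains(name)) {
      await entity.delete(); // scene already played; reclaim space
    }
  }
}
```

Pre-fetching the next clip while the user is on Wi-Fi could follow the same pattern in reverse, downloading ahead of the gameplay rather than on demand.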