Film

Voodoonaut – Feature Film

Associate Producer, Audio Mixing

Music Videos:

Colossus – Strike Up The Band

Performance, Editing

Colossus – Without Me

Performance, Editing

Colossus – Sharp As A Knife

Performance, Creative Development, Video Editing

Stash Magnetic – My Future

SFX

Dark Days – Cold in Berlin

Camera Operation, SFX, Creative Development

The Power – Cold in Berlin

Lighting, SFX, Creative Development

DREAM

DREAM is an EC-funded project that will deliver the next generation robot-enhanced therapy (RET). It develops clinically relevant interactive capacities for social robots that can operate autonomously for limited periods under the supervision of a psychotherapist. DREAM will also provide policy guidelines to govern ethically compliant deployment of supervised autonomy RET. The core of the DREAM RET robot is its cognitive model which interprets sensory data (body movement and emotion appearance cues), uses these perceptions to assess the child’s behaviour by learning to map them to therapist-specific behavioural classes, and then learns to map these child behaviours to appropriate robot actions as specified by the therapists.*

I worked on this project as a research assistant between 2015 and 2019.

Briefly, the main challenges addressed by the DREAM project are:

  • Child-robot interaction strategies: We focus on evidence-based therapies suitable for improving social interaction skills. The robot will assist the therapist in teaching the child social skills like turn-taking, imitation and joint attention.
  • Cognitive social behaviour for supervised autonomy: We will develop a new platform-independent cognitive controller based on the needs of human social interaction, exploiting the propensity of people to “fill in the gaps” in social interaction by generating behaviour that facilitates interpretation by ASD children.
  • Child-specific behaviour assessment: We will create a model that can track and quantify changes in child behaviour based on off-body sensory data, as well as provide quantitative inter-individual comparisons to assist therapists in their tasks.
  • Multi-sensory data fusion and interpretation for diagnostic support: Multi-sensory data will be used to provide quantitative support for the diagnosis and care/treatment of ASD, replacing current labour-intensive techniques involving paper-and-pencil assessment or manual video analysis.
  • Ethics of human-robot interaction: We will ensure that the technical design of the robot complies with existing ethical, social, and legal norms. We also address new ethical and legal issues raised by the kind of interaction that emerges between children and autonomous robots in therapeutic contexts.

* (text copied from www.dream2020.eu)
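
The two-stage mapping at the heart of the cognitive model (sensory cues mapped to a therapist-specific behaviour class, then to a robot action) can be illustrated with a minimal sketch. This is not DREAM project code: the feature names, behaviour classes, thresholds, and actions below are hypothetical stand-ins for the mappings the real system learns from annotated therapy sessions.

```python
# Hypothetical sketch of the pipeline described above:
# sensory cues -> behaviour class -> therapist-approved robot action.
from dataclasses import dataclass

@dataclass
class Perception:
    gaze_on_robot: float   # fraction of the time window the child looks at the robot
    motion_energy: float   # overall body-movement level, 0..1
    valence: float         # estimated emotional valence, -1..1

def classify_behaviour(p: Perception) -> str:
    """Stand-in for the learned mapping from low-level cues to behaviour classes."""
    if p.gaze_on_robot > 0.6 and p.valence > 0.0:
        return "engaged"
    if p.motion_energy > 0.7:
        return "agitated"
    return "disengaged"

# Stand-in for the learned mapping from behaviour class to robot action.
ACTION_POLICY = {
    "engaged": "continue_turn_taking_game",
    "agitated": "pause_and_offer_calming_prompt",
    "disengaged": "initiate_joint_attention_prompt",
}

def select_action(p: Perception) -> str:
    return ACTION_POLICY[classify_behaviour(p)]

print(select_action(Perception(gaze_on_robot=0.8, motion_energy=0.2, valence=0.5)))
# -> continue_turn_taking_game
```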

Sweet Dreams

“Champagne rain, custard lightning, lakes of swan soup, forests of black forest gateaux.” The tasty morsels of food and drink you consume in the experience have been developed by Mark Garston and Chef Michele Stanco, both of whom are former team members of Heston Blumenthal’s Fat Duck Group.

Sweet Dreams was a project I worked on with a group of artists from Marshmallow Laser Feast.

Sweet Dreams was developed with the support of the BFI’s Film Fund.

Ethereum Visualisation

Freelance job working with Marshmallow Laser Feast to create a visualisation of the Ethereum network. The piece was shown at the annual Blockchains developer conference on two screens – an LED wall behind a holographic scrim layer. The visualisation was developed as a representation of the network’s history, evolving and being manipulated by historical data from across its various forks.


Midnight Turbo!

So, not actually a game… more like an interactive music-video toy. This was my solo entry for the 2018 Global Game Jam: a plaything you can fiddle with while listening to this rad retrowave song I found online (Midnight Rider by Fazzio – https://fazzioretro.bandcamp.com/track/midnight-rider).

https://globalgamejam.org/2018/games/midnight-turbo 

Swoop along! WooHoo! (You need an Xbox controller to play this – it’s a Mac build, so you will probably need to download the controller drivers separately.)

Midnight Turbo

Wavelength

Wavelength is a casual ‘dot-to-dot zen’ game where players try to link as many coloured dots as possible before the timer runs out.

Play it now.

Follow the project

Honours:

  • Game Republic Student Showcase at the University of Leeds – Honourable mention for Best Game Design
  • 2017 Global Game Jam (Jork Jam Site) – Best Art (on site), and honourable mentions for Best Audio, Best Concept and Best Overall Entry

What Does It Take?

Interactive video with sound. A collaboration with Marisa Tapper.

A monitor or projected screen displays the feed from a camera hidden below it and directed towards the viewer. The feed cycles through several forms of distortion intended to convey the machine’s digitisation of the viewer. A robotic voice drones out the inner monologue of a computer consciousness as it tries to make sense of the world around it. The monologue carries a narrative alluding to the machine’s arrogance and subtle disdain for the human race; the machine, however, is also lonely and seems to crave interaction with the audience. The piece is intended to highlight our tendency to project human qualities onto machines throughout the modern era, especially in the face of growing interest in artificial intelligence and virtual assistants.
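
The cycling distortion can be sketched roughly as follows. This is not the installation’s code – just a minimal illustration using OpenCV, where the particular effects (pixelation, edge detection, inversion) and the five-second switching interval are assumptions.

```python
# Minimal sketch: show a live camera feed that cycles through distortion effects.
import time
import cv2

def pixelate(frame, factor=16):
    h, w = frame.shape[:2]
    small = cv2.resize(frame, (w // factor, h // factor), interpolation=cv2.INTER_LINEAR)
    return cv2.resize(small, (w, h), interpolation=cv2.INTER_NEAREST)

def edges(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cv2.cvtColor(cv2.Canny(gray, 80, 160), cv2.COLOR_GRAY2BGR)

def invert(frame):
    return cv2.bitwise_not(frame)

EFFECTS = [pixelate, edges, invert]

cap = cv2.VideoCapture(0)                 # camera hidden below the screen
start = time.time()
while True:
    ok, frame = cap.read()
    if not ok:
        break
    effect = EFFECTS[int(time.time() - start) // 5 % len(EFFECTS)]  # switch every 5 s
    cv2.imshow("What Does It Take?", effect(frame))
    if cv2.waitKey(1) & 0xFF == 27:       # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```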

Between

Points of light float towards the player in a patterned movement. The player stands in front of the piece holding a PS Move controller (a small baton illuminated at one end to act as a computer-vision tracking beacon), which controls the movement of an eye-like glowing avatar on screen. The player’s goal is to move the avatar in line with the oncoming points of light and collect as many of them as possible. As the player collects them, a narrative starts to unfold verbally.

The theme of the piece relates to the concept of reincarnation, and so the narrative follows the story of a character’s life, highlighting a key moment from their experience as a human. At several gameplay thresholds (or levels), the player is confronted by a large rotating geometric shape which they must pass through. If the player has collected enough light, they pass through the shape into the next level, where the process continues. The light is symbolic of forgiveness, carrying the idea that souls repeat their journeys through the earth plane, learning lessons and letting go of their pent-up karma. Each level is symbolic of a different chakra, represented by its colours, sonic palette, and narrative theme.

If the player fails to collect enough light to pass through any of the seven levels, they are returned to the beginning of the experience, but with a different narrative signifying that the soul has returned to the earth plane, lived another life, and is now attempting ascendance again.
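
The pass-or-restart rule can be summarised with a small sketch. This is not the installation’s code: the seven chakra names come from the piece’s theme, but the light threshold and the narrative-variant counter are assumed purely for illustration.

```python
# Hypothetical sketch of the level-progression rule described above.
CHAKRAS = ["root", "sacral", "solar plexus", "heart", "throat", "third eye", "crown"]
LIGHT_REQUIRED = 30   # points of light needed to pass each threshold (assumed value)

def run_cycle(lights_collected_per_level, narrative_variant=0):
    """Return (ascended, next_narrative_variant). Failing a threshold sends the
    soul back to the start of the experience with a different narrative."""
    for level, collected in zip(CHAKRAS, lights_collected_per_level):
        if collected < LIGHT_REQUIRED:
            # Not enough light/forgiveness: back to the earth plane, another life.
            return False, narrative_variant + 1
    return True, narrative_variant        # passed through all seven chakras

# Example: the player falls short at the heart chakra and restarts with a new narrative.
print(run_cycle([35, 40, 31, 12, 0, 0, 0]))   # -> (False, 1)
```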

The piece was originally developed as an installation exhibited in Sweden and Denmark, with the visuals projected onto a large screen and participants standing before it, their movements tracked by a PS Move tracking system. A later version of the project was developed in VR.

We’ve been porting Between over to the Oculus Rift recently (WIP):