Sweet Dreams

“Champagne rain, custard lightning, lakes of swan soup, forests of black forest gateaux.” The tasty morsels of food and drink you consume in the experience have been developed by Mark Garston and Chef Michele Stanco, both of whom are former team members of Heston Blumenthal’s Fat Duck Group.

Sweet Dreams was a project I worked on with a group of artists from Marshmallow Laser Feast.

Sweet Dreams was developed with the support of the BFI’s Film Fund.

DREAM

DREAM is an EC-funded project that will deliver the next generation robot-enhanced therapy (RET). It develops clinically relevant interactive capacities for social robots that can operate autonomously for limited periods under the supervision of a psychotherapist. DREAM will also provide policy guidelines to govern ethically compliant deployment of supervised autonomy RET. The core of the DREAM RET robot is its cognitive model which interprets sensory data (body movement and emotion appearance cues), uses these perceptions to assess the child’s behaviour by learning to map them to therapist-specific behavioural classes, and then learns to map these child behaviours to appropriate robot actions as specified by the therapists.*

I worked on this project as a research assistant from 2015 to 2019.

Briefly, the main challenges addressed by the DREAM project are:

  • Child-robot interaction strategies: We focus on evidence-based therapies suitable for improving social interaction skills. The robot will assist the therapist in teaching the child social skills like turn-taking, imitation and joint attention.
  • Cognitive social behaviour for supervised autonomy: We will develop a new platform-independent cognitive controller based on the needs of human social interaction, exploiting the propensity of people to “fill in the gaps” in social interaction by generating behaviour that facilitates interpretation by ASD children.
  • Child-specific behaviour assessment: We will create a model that can track and quantify changes in child behaviour based on off-body sensory data, as well as provide quantitative inter-individual comparisons to assist therapists in their tasks.
  • Multi-sensory data fusion and interpretation for diagnostic support: Multi-sensory data will be used to provide quantitative support for the diagnosis and care/treatment of ASD, replacing current labour-intensive techniques involving paper and pencil, or manual video analysis.
  • Ethics of human-robot interaction: We will ensure that the technical design of the robot complies with existing ethical, social, and legal norms. We also address new ethical and legal issues raised by the particular kind of interaction that emerges in the interaction between children and autonomous robots in therapeutic contexts.

* (text copied from dream2020.eu)
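
To make the supervised-autonomy loop a little more concrete, here is a minimal, purely illustrative Python sketch of the perception → behaviour-class → robot-action mapping described in the cognitive model above. None of this is DREAM project code: the feature layout, behaviour classes, prototype values and action names are all hypothetical.

```python
# Illustrative sketch only – not DREAM project code. The feature layout,
# behaviour classes, prototype values and robot actions below are assumptions.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Perception:
    body_movement: float  # e.g. normalised motion energy from off-body sensors
    emotion_cue: float    # e.g. a valence estimate from expression analysis

# Therapist-specific behaviour classes: a prototype feature vector plus the
# robot action the therapists specify as the appropriate response.
BEHAVIOUR_CLASSES = {
    "engaged":    {"prototype": (0.3, 0.8),  "robot_action": "praise_and_continue"},
    "distracted": {"prototype": (0.7, 0.2),  "robot_action": "prompt_joint_attention"},
    "distressed": {"prototype": (0.9, -0.6), "robot_action": "pause_and_alert_therapist"},
}

def classify_behaviour(p: Perception) -> str:
    """Map a perception to the nearest therapist-defined behaviour class."""
    def distance(proto):
        return (p.body_movement - proto[0]) ** 2 + (p.emotion_cue - proto[1]) ** 2
    return min(BEHAVIOUR_CLASSES, key=lambda name: distance(BEHAVIOUR_CLASSES[name]["prototype"]))

def select_action(behaviour: str, therapist_override: Optional[str] = None) -> str:
    """Supervised autonomy: a therapist override always wins over the learned mapping."""
    return therapist_override or BEHAVIOUR_CLASSES[behaviour]["robot_action"]

if __name__ == "__main__":
    perception = Perception(body_movement=0.65, emotion_cue=0.25)
    behaviour = classify_behaviour(perception)
    print(behaviour, "->", select_action(behaviour))
```

In the real system the class mapping is learned from annotated therapy sessions rather than fixed prototypes; the override argument here simply stands in for the psychotherapist’s supervision.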

Ethereum Visualisation

Freelance job working with Marshmallow Laser Feast to create a visualisation of the Ethereum network. The piece was shown at the annual Blockchains developer conference on two display surfaces: an LED wall behind a holographic scrim layer. The visualisation was developed as a representation of the network’s history, evolving and being manipulated by historic data across the various forks.
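
For flavour, here is a minimal sketch of how historic chain data might be sampled to drive a visualisation like this, assuming web3.py and a hypothetical RPC endpoint. The actual piece’s data pipeline and tooling are not described here; the only non-assumed values are the well-known mainnet fork block heights.

```python
# Sketch only: one way to sample historic Ethereum block data to feed a
# visualisation. The RPC endpoint and sampled fields are assumptions.

from web3 import Web3

RPC_URL = "https://example-ethereum-node.invalid"  # hypothetical endpoint
w3 = Web3(Web3.HTTPProvider(RPC_URL))

# Well-known mainnet fork heights, used here as visual "eras".
FORKS = {
    "Homestead": 1_150_000,
    "Byzantium": 4_370_000,
    "Constantinople": 7_280_000,
}

def sample_blocks(start: int, end: int, step: int = 10_000):
    """Yield (block number, timestamp, gas used) tuples; requires a live node."""
    for number in range(start, end, step):
        block = w3.eth.get_block(number)
        yield number, block.timestamp, block.gasUsed

if __name__ == "__main__":
    # Without a live endpoint we just list the eras the visuals could key off.
    for era, height in FORKS.items():
        print(era, "begins at block", height)
```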


Voodoonaut

“Has your imagination been colonised?”

See more at voodoonaut.com

Voodoonaut is a weird art film disguised as a science fiction experience, from visionary director/creator John Clay. Between 2022 and 2024 I assisted in the creation of this film, primarily taking on the role of Assistant Producer. In addition to this role, I wore a number of other hats, such as creative consultant, audio editor, camera technician, VFX engineer and composer.

To save the world from an otherworldly threat, Lt Julia Earle’s consciousness is projected into outer space. All is well until her body disappears.

Featuring the acting talents of Sofia Martins Gray, Camille Alexander, Lars Chittka, Barbara Pugilise, Hazel Turnock, Nicholas Anson, Patrick Lyons, Ashleigh Cole, Sarah Merrifield, Ziwa Gwatidzo, Liam Rigney, Annalie Wilson, Ola Kitchen, Nathan Ridley, Iggy Crespop, Arran Goodchild, Kyriaki Karadelis and Neil Anderson.

Filmed By Lou Smith, Neil Anderson, Sofia Martins Gray, Aloha Dead, Rob Homewood, Ludovic Cairtey, Levant Kucukkaya, Sergio Angot, Shogo Hino and Andras Paul.

‘It’s a broken alien black box recorder of a puzzle movie. This film isn’t wilfully trying to confound you, it just has its own way of talking. There is no wrong answer, there is only your need to question cinematic space exploration.’

John Clay, Director

Instagram – @voodoonautfilm
Facebook – @chemicallysinister

Soundtrack – “Voodoonaut – The Soundtrack”

Music Videos

Colossus – Strike Up The Band

Performance, Editing

Colossus – Without Me

Performance, Editing

Colossus – Sharp As A Knife

Performance, Creative Development, Video Editing

Stash Magnetic – My Future

SFX

Dark Days – Coldin Berlin

Camera operation, SFX, Creative development

The Power – Coldin Berlin

Lighting, SFX, Creative development

Midnight Turbo!

So, not actually a game… more like an interactive music video toy sorta thing. This was my entry for the 2018 Global Game Jam. I worked by myself and created a sort of plaything that you can fiddle with while listening to this rad retro wave song I found online (Midnight Rider by Fazzio – https://fazzioretro.bandcamp.com/track/midnight-rider).

https://globalgamejam.org/2018/games/midnight-turbo 

Swoop along! WooHoo! (You need an Xbox controller to play this… it’s for Mac, so you will prolly need to download the drivers from somewhere, lol)

Midnight Turbo

Wavelength

Wavelength is a casual ‘dot-to-dot zen’ game where players try to link as many coloured dots as possible before the timer runs out.

Play it now.

Follow the project

Honours:

  • Game Republic Student Showcase at the University of Leeds – Honourable mention for Best Game Design
  • 2017 Global Game Jam (Jork Jam Site) – Best Art (on site), and honourable mentions for Best Audio, Best Concept and Best Overall Entry

Press:

What Does It Take?

Interactive video with sound. A collaboration with Marisa Tapper.

A monitor or projected screen displays the feed from a camera hidden below it and directed towards the viewer. The feed cycles through several forms of distortion intended to convey the digitisation of the viewer by the machine. A robotic voice drones out the inner monologue of a computer consciousness as it tries to figure out the world around it. The monologue narrates a story alluding to the machine’s arrogance and subtle disdain for the human race. The machine, however, is also lonely and seems to crave interaction with the audience. The piece is intended to highlight our anthropomorphisation of machines throughout the modern era, especially in the face of growing interest in artificial intelligence and virtual assistants.
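
As a rough illustration of the mechanism (a hidden camera feed cycled through distortions), here is a minimal OpenCV sketch. The actual effects, timings and tooling used in the piece are not documented here, so every filter and parameter below is an assumption.

```python
# Illustrative sketch only: cycle a live camera feed through a few simple
# distortions, in the spirit of the piece described above.

import time
import cv2
import numpy as np

def pixelate(frame, scale=0.05):
    """Downsample then upsample for a blocky, 'digitised' look."""
    h, w = frame.shape[:2]
    small = cv2.resize(frame, (max(1, int(w * scale)), max(1, int(h * scale))))
    return cv2.resize(small, (w, h), interpolation=cv2.INTER_NEAREST)

def hard_threshold(frame, level=110):
    """Flatten the viewer into hard black and white."""
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(grey, level, 255, cv2.THRESH_BINARY)
    return cv2.cvtColor(binary, cv2.COLOR_GRAY2BGR)

def channel_shift(frame, offset=15):
    """Offset the red channel horizontally for a glitchy colour separation."""
    shifted = frame.copy()
    shifted[:, :, 2] = np.roll(frame[:, :, 2], offset, axis=1)
    return shifted

DISTORTIONS = [pixelate, hard_threshold, channel_shift]
CYCLE_SECONDS = 10  # assumed cycle length

def main():
    capture = cv2.VideoCapture(0)  # the hidden camera below the screen
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        effect = DISTORTIONS[int(time.time() / CYCLE_SECONDS) % len(DISTORTIONS)]
        cv2.imshow("What Does It Take?", effect(frame))
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break
    capture.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    main()
```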