Kusmos Live is an innovative piece of audiovisual software developed by the Russian design studio Kuflex Lab.

During these tough and uncertain times it has been inspiring to see so many audiovisual artists and events coming up with ideas to overcome the social distancing restrictions.

A few weeks back the guys from Kuflex Lab reached out to showcase their latest tool that stands out among all the virtual/hybrid solutions we have seen so far.

Kusmos Live allows artists to turn online concerts into interactive digital shows. Pretty neat, huh?

Kusmos - Kuflex Lab - Audiovisual Interactive Tool

Let’s dive in and explore how it works. Down to the nitty-nerdy-gritty: Kusmos is custom VJ software built with openFrameworks + VDMX5 + Syphon.
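
To picture how those pieces typically connect, here is a minimal sketch (ours, not Kuflex’s code) of an openFrameworks app publishing its frames over Syphon so that a mixer such as VDMX5 can pick them up as a live input. It assumes the commonly used ofxSyphon addon; the server name and the toy circle scene are purely illustrative.

```cpp
// Minimal openFrameworks app publishing its output over Syphon,
// so a mixer such as VDMX5 can use it as a live input on the same Mac.
// Assumes the ofxSyphon addon is installed; names are illustrative.
#include "ofMain.h"
#include "ofxSyphon.h"

class ofApp : public ofBaseApp {
public:
    ofxSyphonServer syphonOut;                 // shares our frames with other apps

    void setup() override {
        ofSetFrameRate(60);
        syphonOut.setName("Kusmos Output");    // the name VDMX5 will see in its Syphon input list
    }

    void draw() override {
        // ... draw the generative scene here (stand-in content below) ...
        ofBackground(0);
        ofDrawCircle(ofGetWidth() / 2, ofGetHeight() / 2,
                     100 + 50 * sin(ofGetElapsedTimef()));

        syphonOut.publishScreen();             // publish the current frame over Syphon
    }
};

int main() {
    ofSetupOpenGL(1280, 720, OF_WINDOW);
    ofRunApp(new ofApp());
}
```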

In spring 2020 the Kuflex studio began an experimental project, Kusmos Live. The purpose of the experiment was to upgrade the Kusmos system in order to create interactive online home shows.

In 2018-2019 the Russian audiovisual studio had already experimented with the tool together with the SILA SVETA studio, on a Therr Maitz concert, at Caprices Festival 2018 and on Nina Kraviz’s experimental performance at Coachella 2019.

OK, really cool, but how exactly does the interactivity work? We went back to Kuflex Lab with a few tricky questions to better understand the full potential of this new audiovisual tool.

KUFLEX: The Tracker program receives depth-sensor data, calibrates the point cloud as we need it (we can rotate the cloud, cut off everything we don’t need, keeping only the data of the artist themselves, and merge the point clouds from two sensors) and sends it to the renderer (the UE4 scene).

The scene’s functions build a 3D model using the data provided by the Tracker, and then we can layer all kinds of effects onto the model and also transform and distort it according to the artistic concept.
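
To make the Tracker’s steps more concrete, here is a small, purely illustrative C++ sketch (not Kuflex’s actual code) of the operations described above: rotating a sensor’s cloud into a shared space, cropping to a box around the artist, and merging the clouds from two sensors. All names and numbers are hypothetical.

```cpp
// Illustrative tracker-style point cloud steps: rotate, crop, merge.
#include <cmath>
#include <iostream>
#include <vector>

struct Point3 { float x, y, z; };

// Rotate a cloud around the vertical (Y) axis to align one sensor's view with the other.
std::vector<Point3> rotateY(const std::vector<Point3>& cloud, float angleRad) {
    std::vector<Point3> out;
    out.reserve(cloud.size());
    const float c = std::cos(angleRad), s = std::sin(angleRad);
    for (const auto& p : cloud)
        out.push_back({ c * p.x + s * p.z, p.y, -s * p.x + c * p.z });
    return out;
}

// Keep only the points inside an axis-aligned box around the performer.
std::vector<Point3> cropToBox(const std::vector<Point3>& cloud,
                              Point3 minCorner, Point3 maxCorner) {
    std::vector<Point3> out;
    for (const auto& p : cloud)
        if (p.x >= minCorner.x && p.x <= maxCorner.x &&
            p.y >= minCorner.y && p.y <= maxCorner.y &&
            p.z >= minCorner.z && p.z <= maxCorner.z)
            out.push_back(p);
    return out;
}

// Merge two already-aligned clouds into the single cloud sent to the renderer.
std::vector<Point3> merge(std::vector<Point3> a, const std::vector<Point3>& b) {
    a.insert(a.end(), b.begin(), b.end());
    return a;
}

int main() {
    std::vector<Point3> sensorA = {{0.1f, 1.5f, 2.0f}, {3.0f, 1.5f, 2.0f}};  // toy data
    std::vector<Point3> sensorB = {{-0.2f, 1.4f, 2.1f}};
    auto aligned = rotateY(sensorB, 3.14159f / 2.0f);          // align sensor B with sensor A
    auto cropped = cropToBox(merge(sensorA, aligned),
                             {-1.0f, 0.0f, 1.0f}, {1.0f, 2.0f, 3.0f});
    std::cout << cropped.size() << " points sent to the renderer\n";
}
```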

We also captured video from the laptop’s webcam and sometimes showed its picture. The 3D scene acting as the artist’s surroundings, a set of virtual cameras capturing different viewpoints and the visual effects were all set up in advance in Unreal Engine.

Then OBS captures the video of the running scene and sends it to the video streaming server.

Kusmos Live - audiovisual tool

During our second live experiment we tested some new features for the viewers’ interactive communication with the stream.

While Leksha (Smolensk, Russia) was playing his ambient set, a VJ (Moscow, Russia) was controlling the visual effects in real time with commands sent via the YouTube chat. Our team had implemented this function between the concerts and decided not to tell the viewers about it.

During the stream, noticing weird messages in the chat, some of the viewers started to realise that they could not only send messages to the chat but even affect the visualisation.

In the end the concert turned into a digital quest. Some viewers picked up the effect controls by sending particular commands to the chat. We have yet to work out how to develop this function in the future.

AVC: We think it’s a really interesting and innovative project responding to the challenges of the pandemic. In terms of appeal for VJs, there is the risk of it being extremely limiting for audiovisual artists’ creativity, and I guess the aesthetic would always be the same.

It would be key for the tool to offer artists a wider range of customization possibilities, otherwise they might get tired of it soon. I guess they all want to leave their unique mark on the scene.

KUFLEX: Kusmos is a universal software tool that allows for a variety of visual and interactive solutions. As a rule, our team creates a virtual stage specifically for each musician’s performance.

Of course, we want to upgrade the program by creating a database of different scenes and effects. In that case, users will be able to construct the scene themselves and combine the effects for their own live show or stream.

AVC: It’s a bit unusual to keep promoting the idea of “God is a DJ”. Is the DJ persona so relevant that the viewer wants to watch it for the entire show in a virtual environment?

KUFLEX: For the Kusmos Live project, the Kuflex team is collaborating with various musicians. We wanted to support the performers.

So this approach determined the emphasis on the figure of the musician on the virtual stage, around whose musical personality and sound we come up with a visual solution. We do not just shoot a video of the musician, as is often done in broadcasts, but create a digital avatar that changes depending on the script, the music and the VJ’s control.

At the same time, Kusmos Live is primarily a show, a kind of mix of a live performance, a computer game and a sci-fi movie.

The virtual camera can fly through the digital space, we can switch between different visual elements of the scene and add extra visual effects in time with the music.

Kusmos Live - Audiovisual Tool

We also try to achieve the effect of real interaction with the viewer. Our team is developing a chat interaction feature – viewers’ comments fall into the scene, and viewers can affect the content through certain chat commands. But Kusmos can also be used by artists from other genres.

In the near future we want to try to create a dance performance. We are currently discussing this idea with a Russian choreographer.

We will explore the topic of distance – where physical space ends and digital space begins, and the relationship between body and sound. Both the dance and the music will be performed live.

Technically, the performer will move between different areas of the cameras’ scanning range, and on screen we will see how their digital avatar changes. Again, it will all be like shooting a movie in a single take, in real time!

360 Visual Festival - audiovisual event

AVC: We truly appreciate the viewers becoming active, participating and communicating with the stream. That creates a collective experience.

Could people really tell whether this was a live performance or not? It feels like the interaction should be more meaningful somehow, with a bigger impact on the overall audiovisual artwork.

KUFLEX: Our team members worked on the concert from different cities. We thought about how best to organise the remote control of virtual cameras and effects.

Then, suddenly, our creative director had the idea of controlling the content through the YouTube chat. During the broadcast he wrote commands like cam1 (switch to camera 1) or stars (launch the star effect).

We intentionally did not mention this function in advance, to turn it into a quest. As a result, some viewers guessed and began to help manage the scene. We are exploring other possibilities for interaction.

In the future we want to create a client application for connecting to the broadcast via a mobile phone, desktop PC screen or VR. We intend to develop Kusmos as an art tool.

Our team believes in the power of collective interaction. We want to offer a palette of visual solutions and effects. Let’s all create beauty together, here and now! This idea is a sincere inspiration for us.

AVC: We would like to know more about the collaboration you did with Leksha, where the viewers were controlling the visual effects in real time with commands sent via the YouTube chat.

How does the data input modify the visuals? Is it like live coding or ordinary human language, and how does it relate to the VR software?

Kusmos Live - Kuflex and Leksha - audiovisual artists

KUFLEX: Usually, in an offline format, we work like this: the musicians play music and the VJ manages the visual content live using MIDI controllers.

Some effects are linked to the amplitude of the sound. But in a situation where we did not have the opportunity to be onsite with the whole team, we decided to implement control through chat commands.
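
As a rough illustration of that amplitude link (our sketch, not Kuflex’s code), an effect parameter can simply follow the RMS level of the latest audio buffer; the gain value and function names below are hypothetical.

```cpp
// Link an effect parameter to the loudness of the incoming audio.
#include <cmath>
#include <iostream>
#include <vector>

// Root-mean-square amplitude of one buffer of samples in [-1, 1].
float rmsLevel(const std::vector<float>& samples) {
    if (samples.empty()) return 0.0f;
    double sum = 0.0;
    for (float s : samples) sum += static_cast<double>(s) * s;
    return static_cast<float>(std::sqrt(sum / samples.size()));
}

// Map the level into a 0..1 effect intensity, with a little gain and clamping.
float effectIntensity(const std::vector<float>& samples, float gain = 4.0f) {
    float v = rmsLevel(samples) * gain;
    return v > 1.0f ? 1.0f : v;
}

int main() {
    std::vector<float> buffer = {0.1f, -0.2f, 0.3f, -0.4f};   // stand-in for one audio callback buffer
    // e.g. drive particle brightness or bloom amount with this value each frame
    std::cout << "effect intensity: " << effectIntensity(buffer) << "\n";
}
```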

We wrote a special function for our software that receives data from the YouTube chat using the Google API. We came up with several commands, for example cam1, cam2, skin1, skin2, electric noise, lasers and the like. When someone in the chat wrote one of these words, a certain visual effect was triggered or the corresponding camera was switched to in our program.
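
To make that mechanism concrete, here is a small, hypothetical C++ sketch of the mapping from recognised chat words to scene actions. The actual fetching of messages from the YouTube live chat through the Google API is left out (simulated here), and the Scene type and effect names are illustrative, not Kuflex’s real code.

```cpp
// Map recognised chat words such as "cam1" or "lasers" onto scene actions.
#include <functional>
#include <iostream>
#include <map>
#include <string>
#include <vector>

struct Scene {
    void switchCamera(int index) { std::cout << "switch to camera " << index << "\n"; }
    void toggleEffect(const std::string& name) { std::cout << "toggle effect " << name << "\n"; }
};

int main() {
    Scene scene;

    // Chat word -> scene action, mirroring commands like cam1, cam2, stars, lasers.
    std::map<std::string, std::function<void()>> commands = {
        {"cam1",   [&] { scene.switchCamera(1); }},
        {"cam2",   [&] { scene.switchCamera(2); }},
        {"stars",  [&] { scene.toggleEffect("stars"); }},
        {"lasers", [&] { scene.toggleEffect("lasers"); }},
    };

    // In the real tool these would come from polling the YouTube chat;
    // here we just simulate a few incoming messages.
    std::vector<std::string> incoming = {"hello!", "cam2", "stars", "great set"};

    for (const auto& message : incoming) {
        auto it = commands.find(message);
        if (it != commands.end()) it->second();   // a recognised command triggers an action
        // unrecognised messages are simply ignored (or shown inside the scene)
    }
}
```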

In general, we have an idea to expand the number of commands and their syntax, so that it looks more like live coding. For example, we could add numerical arguments to the commands to further specify the parameters of a particular visual effect.
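
As a rough idea of what such commands with numerical arguments could look like (our speculative sketch of a feature that does not exist yet), a message like "lasers 0.8" could be split into a command name and an optional parameter value:

```cpp
// Parse a chat message into a command name plus an optional numeric argument.
#include <iostream>
#include <optional>
#include <sstream>
#include <string>

struct ChatCommand {
    std::string name;     // e.g. "lasers"
    float value = 1.0f;   // optional argument; defaults to full intensity
};

// "lasers 0.8" -> {name = "lasers", value = 0.8}; "stars" keeps the default value.
std::optional<ChatCommand> parseCommand(const std::string& message) {
    std::istringstream in(message);
    ChatCommand cmd;
    if (!(in >> cmd.name)) return std::nullopt;  // empty or whitespace-only message
    float v;
    if (in >> v) cmd.value = v;                  // numeric argument is optional
    return cmd;
}

int main() {
    for (const std::string msg : {"lasers 0.8", "stars", "cam1 2"}) {
        if (auto cmd = parseCommand(msg))
            std::cout << cmd->name << " -> " << cmd->value << "\n";
    }
}
```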

As for virtual reality, we have plans to create our own application for viewers, who will be able to watch live broadcasts using VR devices, immersing themselves more deeply in the atmosphere of the digital scene and receiving additional personal effects.

Kusmos Live - audiovisual tool

AVC: This idea of a “universal tool” for real-time content and interactive shows is fascinating. We think it’s important now to dig deeper into how it can involve more people in the creative process.

It opens up new ways of investigating how to make every audiovisual experience unique, not only in terms of the aesthetics of the piece but also in terms of the narrative.

KUFLEX: Yes, this is the main object of research for us. Usually only a limited number of people can come to an offline exhibition, so we want to overcome any spatial limits. With Kusmos we don’t have any restrictions online! We can find ourselves in amazing digital worlds that would be impossible in the physical world.

Now that Kuflex Lab and its creation Kusmos have entered our radar, we will most definitely keep following their progress, as always supporting innovation and creativity in the audiovisual art world.

Marco Savo

Production Manager and New Media Art Curator based in London. Restless and passionate, a special blend of communication and creativity. I am the director of Audiovisual City. All my life has been dedicated to promoting the arts and new media in contemporary society.
