An issue of interaction in a lecture by Piotr Madej: engage, appeal and provoke

Piotr Madej

As part of the Pandemic Media Space project, media artist Piotr Madej gave a lecture, “An issue of interaction”, on November 25, 2020. Drawing on his own experience, the artist spoke about what interests him most in artistic work and how it can be useful to composers.

Piotr Madej about himself:

I’m a media artist mostly focused on sound art. To be more precise, I’m a composer, sound designer, foley artist, author of art installations, and a programmer. What interests me most in artistic work? The issue of interaction. Privately, I’m also the husband of the smart and beautiful Beata, who is also an artist, and we support each other in our creative work. I’m now a doctoral student, working on my Ph.D. project. I don’t want to create a monograph of all possible interaction strategies in the arts, because the subject is larger than the time we have and would also require longer preparatory studies. I just want to share a handful of insights from my experience with my fellow artists. Perhaps someone will find them helpful in their research.

© Piotr Madej

Interaction means…

It’s a phenomenon that engages a participant’s sensitivity, appeals to his curiosity, and provokes his need to experiment. An artist prepares a set of tools designed to process raw artistic matter in a planned manner. By placing these tools in the right context, the artist invites the participant to act. And then a specific form of artistic collaboration begins: each participant, regardless of his artistic preparation, has a chance to experience a unique fragment of the piece he participates in. In the world of sound, interaction takes many different shapes.

  • In some way, every musical instrument has the power to inspire a player to explore new ways of creating sound. The main difference starts when we put this instrument into a new context. 
  • At the opposite end, we have algorithmic composition, where the composer takes certain conditions, parametrizes them, and builds a musical structure from them in a sophisticated way. In this case, we can talk about the internal interaction that occurs between the parameters of the composition (see the sketch after this list). 
  • An interesting example is intuitive improvisation, which relies on inspirational feedback between all the musicians involved, the sounds they are playing, and all the other conditions that can influence them. 
  • Another interesting example is music designed for video games, where some parts of the piece are strictly written, but the final arrangement depends directly on the decisions the player makes during the game. 
  • My understanding of interaction is more intuitive than strict, so I try to avoid deep philosophy. In my practical artwork, I try to explore the whole area of interaction, and I look for different strategies that fit different situations.
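
To make the idea of internal interaction between parameters concrete, here is a minimal, hypothetical sketch in Python (not taken from any actual piece): two composition parameters, density and register, modulate each other while a short sequence of note events is generated.

```python
import random

def algorithmic_fragment(steps=16, seed=1):
    """Toy algorithmic composition in which two parameters interact.

    'density' (probability of a note per step) and 'register' (pitch centre)
    are not set independently: each step, the register drifts depending on
    the current density, and the density reacts to how high the register
    has climbed.
    """
    random.seed(seed)
    density, register = 0.5, 60          # start: medium density, middle C
    events = []                          # (step, midi_pitch) pairs
    for step in range(steps):
        if random.random() < density:    # denser passages produce more notes
            pitch = int(register + random.gauss(0, 4))
            events.append((step, max(0, min(127, pitch))))
        # internal interaction: the parameters modulate each other
        register += (density - 0.5) * 6                    # dense music pushes the register up
        density = min(0.95, max(0.05, density + (72 - register) * 0.005))
    return events

if __name__ == "__main__":
    for step, pitch in algorithmic_fragment():
        print(f"step {step:2d}  note {pitch}")
```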

The situation we are experiencing now is a good moment to review and analyze existing interactive artistic strategies. Nobody knows how long it will last. Pandemic Media Space is a great project because it tries to respond to the situation and constructively take advantage of the new limitations. In the works I have made over the last few years, I used a few interactive strategies. Most of them turned out to be useless under present conditions, so I decided to subject them to critical analysis. In the last part of my lecture, I will present conclusions and postulates regarding my current and future artistic work, taking into account the conditions observed over the past few months.

Most of my artworks can be divided into two categories: installations and audiovisual performances.

In installation works, I try to catch the spectator as a potential participant. In performances the audience is traditionally more passive, but I prepare the tools I use (the instruments, controllers, and scenography elements) to give me as many interactive possibilities as I need.

Usually I prepare some kind of background, a "tape track", and what I do is an improvisation with controllers that follows the basic form of the piece.

Meetin’ is an audiovisual performance presented at the Audio Art Festival in 2013. On the technical side, the interaction was based on light sticks moved in front of a camera. Each light had a different color, and a patch in Max/MSP determined the position of a given color in the frame using color tracking. This position was then turned into a parameter specifying the sound properties. With four sticks I could control eight live parameters simultaneously. I performed a solo part over a tape track background, and Sania Rodobolska also participated as a performer.
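
The original interaction was built as a Max/MSP patch; as a rough, hypothetical equivalent, the sketch below uses Python with OpenCV to track a single colored stick by hue and turn its normalized x/y position in the frame into two control parameters (four sticks would give eight). The hue range and the parameter mappings are assumptions.

```python
import cv2
import numpy as np

# Hue range for one light stick (assumed values; tune per stick and colour).
LOWER = np.array([40, 80, 80])    # greenish, in HSV
UPPER = np.array([80, 255, 255])

def track_stick(frame):
    """Return the stick's normalized (x, y) position in the frame, or None."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)
    m = cv2.moments(mask)
    if m["m00"] < 1e-3:               # colour not found in this frame
        return None
    h, w = mask.shape
    return m["m10"] / m["m00"] / w, m["m01"] / m["m00"] / h

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    pos = track_stick(frame)
    if pos is not None:
        x, y = pos
        # Map the position to two sound parameters, e.g. filter cutoff and gain;
        # in practice these values would be sent on to the synthesis engine (OSC/MIDI).
        cutoff_hz = 200 + x * 4000
        gain = 1.0 - y
        print(f"cutoff={cutoff_hz:7.1f} Hz   gain={gain:.2f}")
    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) & 0xFF == 27:   # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```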

Minotaur, another of my performances, refers to a series of late works by Pablo Picasso in which the artist sketches the figure of a bull with a flashlight, in a dark room, in front of a photo camera set to long exposure. In this piece I use light sticks, a camera, a multimedia projector, and a four-channel sound system. I move the sticks in front of the camera, and the trail of light made by my movements stays on the screen behind me. The basic form of the piece is divided into the series of pictures I sketch. Each of them has its own tape track looped in the background. While I sketch, my movements are sonified and work as a solo instrument part. After finishing one picture, I delete the sketch and start to draw the next. Each subsequent one is bound to more parameters that I can alter, both in the sound and in the picture. Each picture is more complex, and we can hear the roars of the Minotaur, angry that we are approaching the center of the maze. Finally, a clear picture of the bull sketched by Pablo Picasso emerges.
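
The "long exposure" trail can be approximated very simply by keeping, for every pixel, the brightest value seen so far. The sketch below is a generic OpenCV illustration of that idea, not the actual realisation of the piece.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture(0)
trail = None   # accumulated "long exposure" image

while True:
    ok, frame = cap.read()
    if not ok:
        break
    if trail is None:
        trail = np.zeros_like(frame)
    # Keep the brightest value ever seen at each pixel: moving lights leave trails.
    trail = np.maximum(trail, frame)
    cv2.imshow("sketch", trail)
    key = cv2.waitKey(1) & 0xFF
    if key == ord("d"):        # 'd' deletes the sketch and starts the next picture
        trail = np.zeros_like(frame)
    elif key == 27:            # Esc quits
        break
cap.release()
cv2.destroyAllWindows()
```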

Bohdan Yemets
Folded Maps of Time, Lwów, 2019, Lem Station

Ayee

Another performance, Ayee, is about the basics of communication. The work relates to a piece by the great Polish media artist Wojciech Bruszewski, Yyaa. If we take certain frequencies in specific combinations, the formants, we start to hear vowels, sounds that begin to be meaningful for us in some way. Using a Leap Motion controller, a tool designed for precise tracking of hand movement, I turn my gestures into a sound that may resemble a human voice. The live video part of the performance tracks the numeric data flowing from my gestures.

At a certain moment, the data view is interrupted by rhythmically arising data noise. The noise is made of typographic signs. As it grows, it starts to form itself into the shape of Wojciech Bruszewski’s face, yelling at us from the screen constantly, without taking a breath. In this piece I also use a background tape track, independent of the sound performed live. The piece ends with my part in a duet with the recorded Wojciech Bruszewski.
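
As an illustration of how a gesture can become a vowel-like sound, here is a hypothetical sketch (not the actual Ayee patch): a normalized hand position is mapped to the first two formant frequencies, and the harmonics of a fixed fundamental near those formants are emphasized.

```python
import numpy as np

SR = 44100

def vowel_tone(x, y, f0=110.0, dur=0.5):
    """Render a vowel-like tone whose formants follow a hand position.

    x and y in [0, 1] (e.g. a normalized Leap Motion palm position) are mapped
    to the first two formant frequencies; harmonics of f0 near a formant are
    boosted, which is enough to suggest different vowels. The mapping ranges
    are assumptions for the sketch.
    """
    f1 = 250 + x * 600      # roughly 250-850 Hz
    f2 = 800 + y * 1700     # roughly 800-2500 Hz
    t = np.linspace(0, dur, int(SR * dur), endpoint=False)
    signal = np.zeros_like(t)
    for k in range(1, int(4000 // f0)):          # additive harmonics up to ~4 kHz
        fk = k * f0
        amp = (np.exp(-((fk - f1) / 120.0) ** 2) +
               0.7 * np.exp(-((fk - f2) / 180.0) ** 2))
        signal += amp * np.sin(2 * np.pi * fk * t)
    return signal / np.max(np.abs(signal))       # normalize

if __name__ == "__main__":
    tone = vowel_tone(x=0.3, y=0.7)              # one hand position, one "vowel"
    print(tone.shape, tone.dtype)
```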

Harp

“Harp” is an experimental Mixed Reality work existing in both physical and augmented space. Part of it is a kind of installation using living tree trunks with colored yarn stretched between them in the shape of harp strings. The augmentation is pinned to the geographical place, defined by GPS coordinates. The visual part of the augmentation is a 3D form corresponding to and crossing over the physical one. By moving a device that can display the augmentation, such as a smartphone or tablet, you can also explore its sonic part. It consists of small, interactive sound sources arranged topographically around the visual form, and it acts like a virtual instrument based on granular sound synthesis. When the moving device passes these places, it spatially mixes and alters the virtual sound sources. The altered sounds are based on the voices of animals, which you can hear simultaneously with the real, natural soundscape of the place.
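
The spatial mixing can be sketched as follows: each virtual sound source has geographic coordinates, and its gain falls off with the device's distance from it. The coordinates, radii, and source names below are invented for illustration.

```python
import math

# Hypothetical virtual sound sources pinned around the physical "harp"
# (latitude, longitude, audible radius in metres).
SOURCES = {
    "owl":     (50.06452, 19.94498, 8.0),
    "cricket": (50.06458, 19.94510, 6.0),
    "frog":    (50.06447, 19.94487, 5.0),
}

def metres(lat1, lon1, lat2, lon2):
    """Approximate distance in metres for points a few metres apart."""
    dy = (lat2 - lat1) * 111_320.0
    dx = (lon2 - lon1) * 111_320.0 * math.cos(math.radians(lat1))
    return math.hypot(dx, dy)

def mix_gains(device_lat, device_lon):
    """Gain per source: 1.0 at the source, 0.0 at or beyond its radius."""
    gains = {}
    for name, (lat, lon, radius) in SOURCES.items():
        d = metres(device_lat, device_lon, lat, lon)
        gains[name] = max(0.0, 1.0 - d / radius)
    return gains

if __name__ == "__main__":
    # The device position would come from the phone/tablet GPS in the real work.
    print(mix_gains(50.06454, 19.94500))
```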

Perception of spheres

It’s an installation focused on a special object called an armillary sphere. The sphere was originally designed for measuring the movement of celestial bodies. One day I imagined that if we could reverse this action and use it to make the celestial bodies move, it could be a great instrument. I prepared a virtual, simulated model of the Solar System. Each object in the simulation (the planets, the Earth’s Moon, the Sun, and some asteroids) has sound generators designed to sonify the basic physical parameters of that object.

The visualization can be immersive when a dome projection system is used, but it can also be projected as a panoramic view. The sound is projected by a multichannel sound system and mixed by an ambisonic virtual mixing console based on the ICST ambisonic objects in Max 8. The position of each celestial object in the simulation is tracked synchronously in both the video and audio projection. We can use the model of the Sphere to navigate the Solar System that surrounds us: each rotation of the Sphere translates into the orientation of the simulation. The sound does not correspond to the real conditions prevailing in space; as we know, sound does not propagate in a vacuum. But the sonification does correspond to the actual spatial relations between the objects.

The sound generators are tuned according to the basic known physical quantities that define celestial bodies, such as mass, diameter, density, escape velocity, rotation period, length of the day, mean distance to the Sun, mean diameter of the orbit, and orbital period. As these parameters span a huge range of values, I used Frequency Modulation synthesis generators, which can produce sound in the acoustic band even if the carrier or modulator oscillator frequencies are tuned outside this range. In summary, a virtual instrument in the form of an audiovisual installation was created by implementing an algorithmic composition based on sonification of the Solar System's parameters and interaction with the physical controller.
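
A minimal sketch of this idea, with made-up mapping ranges: the huge physical values are compressed logarithmically into the carrier and modulator frequencies of a simple FM generator. Even when those frequencies land outside the audible band, the sidebands at f_c ± k·f_m can still fall inside it.

```python
import numpy as np

SR = 44100

def log_map(value, lo, hi, out_lo, out_hi):
    """Map a value spanning many orders of magnitude onto a target range."""
    v = np.clip((np.log10(value) - np.log10(lo)) / (np.log10(hi) - np.log10(lo)), 0, 1)
    return out_lo + v * (out_hi - out_lo)

def fm_voice(mass_kg, rotation_period_h, dur=2.0):
    """One FM 'voice' per celestial body (the mapping ranges are assumptions)."""
    carrier = log_map(mass_kg, 1e20, 2e30, 30.0, 4000.0)             # Hz
    modulator = log_map(rotation_period_h, 9.0, 6000.0, 0.5, 400.0)  # Hz
    index = 8.0                                                      # modulation depth
    t = np.linspace(0, dur, int(SR * dur), endpoint=False)
    return np.sin(2 * np.pi * carrier * t + index * np.sin(2 * np.pi * modulator * t))

if __name__ == "__main__":
    earth = fm_voice(mass_kg=5.97e24, rotation_period_h=23.9)
    jupiter = fm_voice(mass_kg=1.90e27, rotation_period_h=9.9)
    mix = (earth + jupiter) / 2
    print(mix.min(), mix.max())
```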

© Piotr Madej

Reflections/Refractions — Paradox

This art installation was created during the international art conference Paradox in 2015 at CK Zamek, Poznan. The process of gentrification can be observed on one of the most important streets in the center of the city, St. Martin. A lot of shop windows and service premises were deserted; most of the premises still operating are bank representative offices. The building of the connected Alpha skyscrapers dominates the historical perspective of the street’s architecture. I used the view of these skyscrapers as a kind of mosaic canvas. As we pass near the view, the mosaic is filled with random bits of photos taken along the street, and at the same time a sound collage of field recordings comes in. Just like the street that used to be bustling with life, whose premises changed owners and were finally abandoned, when we move away only the view of the skyscrapers remains.
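
In outline, the mechanic could look like this hypothetical sketch: a proximity value decides what fraction of the mosaic tiles show random street photos instead of the skyscraper view, and also sets the level of the field-recording collage. The grid size and file names are illustrative.

```python
import random

GRID_W, GRID_H = 16, 9                                        # illustrative mosaic grid
STREET_PHOTOS = [f"street_{i:03d}.jpg" for i in range(120)]   # hypothetical files

def mosaic_state(proximity):
    """proximity in [0, 1]: 0 = far away, 1 = right in front of the view.

    Returns which tiles are 'filled' with a random street photo and the
    gain for the field-recording collage.
    """
    tiles = {}
    for ty in range(GRID_H):
        for tx in range(GRID_W):
            if random.random() < proximity:
                tiles[(tx, ty)] = random.choice(STREET_PHOTOS)
            # tiles left out keep showing their fragment of the skyscraper view
    collage_gain = proximity
    return tiles, collage_gain

if __name__ == "__main__":
    near_tiles, gain = mosaic_state(proximity=0.8)
    print(len(near_tiles), "tiles filled, collage gain", gain)
```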

Leyden Jar

The installation was created during an international art workshop in The Hague in 2016.

During our visit to the Netherlands we were supposed to visit Leiden, and I was supposed to be the guide for this trip. As part of the tour, I intended to prepare a model of a Leyden jar from available materials. Unfortunately, during the whole trip we did not reach Leiden, but the Leyden jar inspired me to create an art installation that was presented at the end of the workshop. An instrument was created containing excerpts from field recordings from our trips in the Netherlands. The instrument has the form of 16 bottles connected to a touch board controller. When we touch individual bottles, sounds appear. Each bottle contains its own set of sounds, appearing in random order. Some of them make an electrical click, like the discharge of an original Leyden jar.
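
In outline (with hypothetical file names and a generic touch callback; the real work uses a touch board controller), the behaviour is: each of the 16 bottles owns its set of recordings, and touching it plays one chosen at random, occasionally replaced by an electrical click.

```python
import random

# Hypothetical sample sets: each of the 16 bottles owns a few field recordings.
BOTTLE_SOUNDS = {n: [f"bottle{n:02d}_{i}.wav" for i in range(4)] for n in range(16)}
CLICK = "leyden_discharge_click.wav"

def on_touch(bottle_index, click_probability=0.2):
    """Called by the touch controller when a bottle is touched.

    Returns the sound file to play: usually a random sample from that
    bottle's own set, sometimes an electrical click like the discharge
    of an original Leyden jar.
    """
    if random.random() < click_probability:
        return CLICK
    return random.choice(BOTTLE_SOUNDS[bottle_index])

if __name__ == "__main__":
    for touched in [0, 7, 15, 7]:
        print(f"bottle {touched:2d} ->", on_touch(touched))
```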

 

An electroacoustic piece from 2014 is based on data on the course of future astronomical events: due to large fluctuations in the gravitational interactions between objects in the Solar System, precise prediction of the positions of celestial bodies is only possible for a limited time. An ephemeris was a kind of diary in Ancient Greece.

The works I have mentioned present a set of different interaction strategies. Some of them are based on proximity detection, others on touch. Some of them use the participants’ own devices, and others (I mean the performances) are based on my improvisational interaction with the prepared elements of the work. By the end of 2019 it seemed that the possibilities of using interactive techniques in artistic activities would only increase, but the epidemic situation forced a reduction of some of them: first those based on direct contact and touching the work, and then all additional activities in shared physical social space.

The Salt of the Earth

The limitations caused by the pandemic and the lockdown came when I was finishing the work for my master’s thesis defense. The subject of the work was in some way narrative: I wanted to tell the story of my great-grandfather, Franciszek Krzeczkowski, who one hundred years ago was a salt miner in the Wieliczka salt mine. His fascination with beautiful salt crystals, also interesting because of their properties, was contagious for his friends, and I became infected with it too. So I decided that the work would best take the form of a live audiovisual performance in the original space of the mine, to provide a particular kind of immersion. Unfortunately, the lockdown came a month before the planned date of the presentation.

I definitely had to rethink the form of the work. I expected that the second deadline might be announced while the lockdown was still in progress, so I decided to add several elements to compensate for not presenting the work in the mine. I took the Unity game engine, in which I placed fragments of the mine’s 3D space, reconstructed from plans and photos, along with other abstract spaces related to the subject of the work.

I wanted to use interaction strategies known from computer games, and here I immediately realized that they have a lot of limitations. FPS games have movement mechanics based on the simultaneous use of the mouse and keyboard. People of my generation, who have played this type of game since the mid-90s, know that this system did not come into existence right away but evolved with the development of computer graphics capabilities. Currently the standard is to use the W, A, S, and D keys to move the character in the XY plane, while the mouse lets you turn and look around freely. Other movements or physical mechanics are activated with other keys if needed. This makes it necessary to learn a specific motor coordination between the right and left hands, which is also a common feature of playing a musical instrument.
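
The coordination described here can be shown in a few lines: the keyboard state sets a movement vector in the XY plane, and relative mouse motion sets the yaw that the vector is rotated by. This is a generic, hypothetical sketch using pygame, not code from the actual Unity project.

```python
import math
import pygame

pygame.init()
screen = pygame.display.set_mode((640, 360))
pygame.event.set_grab(True)          # capture the mouse for relative motion
pygame.mouse.set_visible(False)

x, y, yaw = 0.0, 0.0, 0.0            # character position and facing angle
SPEED, SENSITIVITY = 3.0, 0.003
clock = pygame.time.Clock()
running = True

while running:
    dt = clock.tick(60) / 1000.0
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False

    yaw += pygame.mouse.get_rel()[0] * SENSITIVITY    # mouse turns the view

    keys = pygame.key.get_pressed()                   # WASD moves in the XY plane
    forward = keys[pygame.K_w] - keys[pygame.K_s]
    strafe = keys[pygame.K_d] - keys[pygame.K_a]
    x += (math.cos(yaw) * forward - math.sin(yaw) * strafe) * SPEED * dt
    y += (math.sin(yaw) * forward + math.cos(yaw) * strafe) * SPEED * dt
    if keys[pygame.K_ESCAPE]:
        running = False

pygame.quit()
print(f"final position ({x:.2f}, {y:.2f}), yaw {yaw:.2f} rad")
```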

This comparison is as bold as it is justified, because mastering smooth movement in 3D space this way requires many hours of practice, and mastering the technique to the point of "forgetting" about the medium of keyboard and mouse and experiencing full immersion in the 3D environment on the computer screen requires at least a few weeks of practice. Of course, I know that the timescale for mastering basic proficiency on a musical instrument is much longer. Still, this is a serious obstacle to the exploration of the created 3D space for people who have not yet had the opportunity to learn to move in FPS games.

The use of a computer game environment allowed for the preparation of interactions in the form of several interesting mechanics:

  • the already mentioned free exploration of the vast system of rooms, 
  • gravitational interaction with some objects, such as collisions or the use of a flux of force, 
  • the use of proximity switches in various combinations of dependencies, 
  • the application of all these techniques also at the level of sound realization, which allows defining certain fragments of the work as virtual instruments based on samples, 
  • the use of an interactive graphical interface with information and navigation functions, 
  • using artificial intelligence to control Franciszek Krzeczkowski’s character.

Regardless of the technical aspect, I placed graphic, audio, and video materials in these spaces, which in the initial version constituted documentation of the preparations for the defense. There are also materials constituting the content of the artistic work itself: music and animations prepared with stop-motion techniques as well as 2D and 3D computer animation.

An important part of the work was the LAB stations intended for the sonification of physical phenomena involving salt, which were meant to be performed live during the presentation. To provide access to these LAB stations, as well as to the live space of my studio, from within the virtual space, I prepared a system of monitoring screens in the virtual mine that used USB cameras to observe the situation in my studio live.
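
A minimal, hypothetical sketch of the camera side of such a system: frames are grabbed from each USB camera with OpenCV so they could be handed on as textures for the in-game monitoring screens (the hand-off into the game engine is not shown, and the camera indices are assumptions).

```python
import cv2

# Indices of the USB cameras observing the studio and the LAB stations (assumed).
CAMERA_INDICES = [0, 1, 2]
captures = [cv2.VideoCapture(i) for i in CAMERA_INDICES]

def grab_monitor_frames():
    """Return the latest frame from each camera, keyed by camera index."""
    frames = {}
    for idx, cap in zip(CAMERA_INDICES, captures):
        ok, frame = cap.read()
        if ok:
            # In the installation, each frame would be uploaded as a texture
            # onto a monitoring screen inside the virtual mine.
            frames[idx] = cv2.resize(frame, (512, 288))
    return frames

if __name__ == "__main__":
    frames = grab_monitor_frames()
    print("cameras delivering frames:", sorted(frames))
    for cap in captures:
        cap.release()
```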

The entire presentation was conducted online under the conditions described. Thanks to this, it was possible to maintain the character of an audiovisual performance, as the only person who interacted with the work during the presentation was me. The commission and the guests watched the presentation on the MS Teams platform and on YouTube. In practice, it turned out that with the measures taken it is impossible to make the work generally available online.

The first limitation is the size of the project file: the maximum size of a 3D scene that can be put online is approximately 300 MB, while my project is close to 3 GB.

Another limitation is the use of video players that are not available in the online build.

The next limitation results from the live camera monitoring system used in the work; for logical reasons, this element cannot be transferred online:

  1. either I would have to give unlimited online access to my cameras,
  2. or assume that each user has the same number of cameras.

This can be solved by replacing the live image with recorded documentary material, but it is already a serious interference in the structure of the work.

To sum up: the work, prepared in the era of the pandemic under assumptions resulting from the known limitations, turned out in practice to be completely unsuitable for later presentation in the form of an online service. The only effective method of presenting the work live is to use a VNC server. Such a server is designed to transmit video and audio from the host’s desktop and can also give the client control over the keyboard and mouse.

This requires launching the VNC server together with the program and preparing the cameras and the LAB stations that are part of it. In addition, it is possible to hand keyboard and mouse control over to interested participants. Several presentations of this kind have been made. Independently, the work exists online in the form of several short video clips documenting fragments of the presentation. I have also prepared a 360° video presentation.

© Piotr Madej. Ph: Beata Madej

Which interaction strategies seem effective?

Limited opportunities to participate in live artistic events, such as concerts, exhibitions, festivals, performances, and other forms of artistic presentation, mean that most activity has moved to the Internet. Apart from forms of presentation that do not involve participants in interaction, such as portals presenting multimedia content (e.g. SoundCloud, YouTube, Vimeo), there are several options:

The first is creating interactive interfaces using web development tools such as HTML5 and JavaScript. The second is participation in 3D spaces adapted for interaction with other participants, such as Mozilla Hubs, Second Life, or VRChat. The possibilities of these systems are large, but here too you have to take into account limitations resulting from what the creators of these systems provide; in some situations this may involve additional payments or simply technical constraints.

The third way I see is implementing custom spaces with WebGL. Unity makes it possible to compile the generated worlds to WebGL. Here, apart from the limitations of WebGL itself, the possibilities of artistic expression seem to be the greatest, but it requires considerable technical knowledge and time to prepare. I am currently testing this path for the Pandemic Media Space project.

Finally, a quick overview of technologies that allow non-contact forms of interaction in physical space:

Movement tracking and detection:

– Camera

– Microphones

– Leap Motion

– Kinect

– Other sensors on Arduino (or similar):

  – Infrared (temperature changes)

  – Ultrasonic (Distance to the obstacle)

  – Microwave (Velocity of an object)

– Photoresistors

Another interesting, hybrid kind of sensor, which requires some touch:

– Gyroscopic and accelerometric

– Smartphones

– Nintendo Wiimote

– Touchboard

– Light Sticks (my approach) and many more.

As you can see, although some interactive artistic strategies have been curtailed, there are still a lot of them left. I cannot present all the possibilities, because they depend on each artist's individual creative work. Everyone can combine the existing possibilities in their own creative way, and even find simple solutions I didn't mention because I didn't think of them. It seems that the new limitations still leave the door open to unlimited experimentation with the possibilities that remain. As we know, artists are always able to explode known limitations with their creativity!
