72U Project, Summer 2016
Role: Creativity & Development
Technology: Processing, Arduino
Something feels hard. Is it you? Or me? Why don’t you touch it then? Fondle is an interactive sound installation that plays with the senses of touch and sound to explore what it means to seek out pleasure. Through recorded tracks and a touch-sensitive interface, the installation makes the familiar act of picking up an avocado sound intensely intimate and feel slightly uncomfortable. Go ahead and give it a hand job.
72U, Summer 2016
Role: Creativity & Development
FaceTheColors is an interactive installation where people use their face as an interface to ‘draw’ their emotions with different palettes of colors on a big video projection. A face recognition system reads the position and expression of each face in front of a camera and draws an “emotion”, with different colors ‘emerging’ from each person’s facial expressions.
The video projection works as a mirror, drawing an abstract color representation of people’s expressions. A colorful experiment to play with using your face: each person and each expression has its own palette of colors.
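The mapping from face to canvas can be sketched roughly as follows. This is only an illustration of the approach, not the installation's actual code; the emotion labels, palettes, and function names are assumptions.

```python
# Minimal sketch: map a detected facial expression to a color palette and
# mirror the face position onto the projection. All labels and RGB values
# below are illustrative assumptions, not the installation's real ones.

PALETTES = {
    "happy":    [(255, 200, 0), (255, 120, 0), (255, 60, 60)],
    "sad":      [(40, 60, 120), (70, 90, 160), (120, 140, 200)],
    "surprise": [(200, 0, 200), (255, 80, 255), (255, 160, 255)],
}

def pick_color(emotion, intensity):
    """Choose a color from the emotion's palette; a stronger expression
    (intensity 0.0-1.0) picks a later palette entry."""
    palette = PALETTES.get(emotion, [(128, 128, 128)])
    index = min(int(intensity * len(palette)), len(palette) - 1)
    return palette[index]

def brush_position(face_x, face_y, cam_w, cam_h, proj_w, proj_h):
    """Map the camera-space face position onto the projection, flipping X
    so the projection behaves like a mirror."""
    return ((cam_w - face_x) * proj_w // cam_w, face_y * proj_h // cam_h)
```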
The first prototype of this project was developed during the “Interactive computation: artificial vision applied to scenic arts and dance” workshop at Hangar in February 2016, based on a group idea.
Workshop, March 2016
Role: Idea & Development
Technology: Processing, Openframeworks
R-Control is a tangible, touchable, collaborative interface and controller for interacting in a different way with analog synthesizers and electronic instruments. Conceived for live gigs as the centerpiece of the performance, it allows you to control, play, and mix any kind of synthesizer that has a MIDI interface. It is strongly focused on the interface itself, and thanks to the visual feedback it provides, it is easy to understand and create with.
The main idea is to unite the world of musical interaction with the hardware and get the best out of each, maximizing performance and interactive sound: each part does its specific job, complemented by the visual appeal the interface provides.
This was the final project of the New Interfaces for Musical Expression postgraduate program at the Music Technology Group.
Master’s Degree Project, September 2014
Technology: Pure Data, Openframeworks
XMas Dlights was a project made for a Spanish radio station, Kiss FM, for the 2014 Christmas campaign. The goal was to replace the tedious, well-known traditional Christmas carols around the Christmas tree with the year’s number-one radio hits, specifically the ones the Kiss FM audience loved most.
With this creative goal in mind we conceived a complete standalone product: a little box in well-designed packaging, sent as a gift to influencers and a small number of Kiss FM listeners.
In technological terms it was an ambitious project built in record time: a standalone radio gadget that plays a special Christmas-only streaming service over Wi-Fi and makes the Christmas tree lights integrated with the gadget blink to the rhythm of the music.
To this end we used the Raspberry Pi mini-computer platform with specific software and hardware built for the project, providing a good user experience and an easy step-by-step installation at home.
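The lights-to-the-music idea can be sketched as follows: compute a short-term amplitude envelope over the audio and switch the lights on whenever it crosses a threshold. This is a minimal Python illustration, not the gadget's real firmware; the frame size and threshold are assumptions, and on the real device the output would drive a GPIO pin on the Raspberry Pi.

```python
# Sketch of "blink the lights to the music": a per-frame amplitude envelope
# thresholded into on/off light states. Frame size and threshold are
# illustrative values, not the real gadget's.

def envelope(samples, frame_size=4):
    """Mean absolute amplitude per frame of audio samples (-1.0 to 1.0)."""
    frames = [samples[i:i + frame_size] for i in range(0, len(samples), frame_size)]
    return [sum(abs(s) for s in f) / len(f) for f in frames]

def light_states(samples, threshold=0.3, frame_size=4):
    """True = lights on for that frame, False = off."""
    return [e > threshold for e in envelope(samples, frame_size)]
```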
Lola Mullen Lowe, December 2014
Role: Software Development
Technology: Raspberry Pi, Python, C++
Awarded a Silver Lion at the Cannes Festival
After the great collaboration with my friend and musician Rodrigo Rammsy on the development of R-Control, we decided to create something new in a different way, exploring the audiovisual boundaries of live electronic music.
He was invited to play a new live gig at Telenoika, Barcelona’s well-known audiovisual open creative community, so it was the opportunity to build a new, experimental visual set for his hardware-synth-based live electronic music.
It was my first time working with live visuals. Rather than learning typical VJ software, I decided to build on what I had learned from the R-Control project and created new software from scratch that knows what each hardware synth in his live set is playing and renders a visual interpretation of it. In addition, I connected a MIDI controller to a bank of visual filters implemented in the software, letting me reshape the visuals with the dynamics of the music for a totally free live interpretation during the gig.
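The synth-to-visuals routing can be sketched like this. It is a hypothetical illustration of the idea rather than the actual openFrameworks code: the channel assignments, synth names, and message format are assumptions.

```python
# Hypothetical sketch: route incoming MIDI from several hardware synths
# (assumed one MIDI channel each) to per-synth intensity values that the
# visuals can react to. Channel/synth assignments are illustrative.

SYNTH_BY_CHANNEL = {0: "bass", 1: "lead", 2: "drums"}

def route(messages):
    """Reduce a batch of MIDI messages to one intensity (0.0-1.0) per synth,
    taken from the loudest note-on velocity seen for that synth."""
    intensity = {name: 0.0 for name in SYNTH_BY_CHANNEL.values()}
    for msg in messages:
        if msg["type"] == "note_on" and msg["channel"] in SYNTH_BY_CHANNEL:
            name = SYNTH_BY_CHANNEL[msg["channel"]]
            intensity[name] = max(intensity[name], msg["velocity"] / 127)
    return intensity
```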
The software was built with the openFrameworks C++ framework and has been in continuous development ever since. If you’re interested, you can check the code in the GitHub repository.
Personal Project, November 2014
Role: Idea & Development
Technology: Openframeworks, OSMC, MIDI
Skate Wall is a project born at LOLA Madrid in collaboration with Nomad Skateboards to decorate the walls of the agency in Barcelona. They gave us some recycled skateboard decks to turn into new artistic expressions.
The basis of the work is a digital illustration printed on vinyl. From there, we leveraged technology to give the idea more power: we installed a small Arduino-based circuit that controls a proximity sensor, red LEDs, and a small speaker.
Thanks to the proximity sensor we know whether someone is in front of the board, how far away they are, and for how long. Based on these inputs we decide which of the sounds stored on a mini sound card to trigger. The sounds are amplified by another small chip that drives a loudspeaker, while the red LEDs embedded in the wood blink.
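The trigger logic can be sketched as a simple decision function. This is only an illustration of the kind of rules involved; the thresholds and sound indices are assumptions, not the values in the real Arduino firmware.

```python
# Illustrative sketch of the Skate Wall trigger logic: choose which stored
# sound to fire based on distance and dwell time. Thresholds and indices
# are assumed for illustration.

def pick_sound(distance_cm, dwell_seconds):
    """Return the index of the sound to trigger, or None if nobody is close."""
    if distance_cm > 150:       # nobody near the board
        return None
    if dwell_seconds > 10:      # a lingering visitor gets a special sound
        return 2
    if distance_cm < 50:        # someone very close
        return 1
    return 0                    # default greeting sound
```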
Personal Project at Lola Mullen Lowe, June 2015
Role: Concept & Development
Technology: Arduino, Audio, Electronics
This installation was a light show running throughout a DJ set at Mira Festival 2014 in Barcelona, made together with all the participants of the “Lights vs Pixels: Playing with Interactivity” workshop at the Mira Live Visual Arts Camp, conducted by the French interaction design studio Screen Club.
In the workshop we developed the skills needed to map images, videos, and animations onto a low-resolution matrix of light bulbs, first from an interaction design perspective and then from the technical implementation side.
The final light show was an 8x8 matrix of light bulbs controlled with custom software written in Processing. The software allowed us to create several music-reactive animations and to manage them with all kinds of external MIDI controllers.
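The image-to-bulbs mapping can be sketched as follows: average each region of a grayscale frame down to one brightness value per bulb, then flatten the grid into sequential dimmer levels. A minimal Python illustration; the real software ran in Processing with DMX output, and the channel ordering here is an assumption since real fixtures may be patched differently.

```python
# Sketch of mapping an image onto an 8x8 bulb matrix: block-average a
# grayscale frame to one brightness per bulb, then flatten row-major into
# a list of DMX-style channel levels (0-255).

def downsample(frame, grid=8):
    """frame: 2D list of grayscale pixels (0-255). Returns grid x grid means.
    Assumes the frame dimensions are multiples of the grid size."""
    h, w = len(frame), len(frame[0])
    bh, bw = h // grid, w // grid
    out = []
    for gy in range(grid):
        row = []
        for gx in range(grid):
            block = [frame[y][x]
                     for y in range(gy * bh, (gy + 1) * bh)
                     for x in range(gx * bw, (gx + 1) * bw)]
            row.append(sum(block) // len(block))
        out.append(row)
    return out

def to_dmx(levels):
    """Flatten the bulb grid row-major into a channel list."""
    return [v for row in levels for v in row]
```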
Workshop Collaborative Project, November 2014
Technology: Processing, DMX, MIDI, Lighting
The app that allows kids to see what speed their cars can reach. TrueSpeed detects the speed of a Hot Wheels car and converts it into real scale.
Kids will not only measure the exact speed of a Hot Wheels car in km/h, they will also be able to see what that speed would be if the toy cars were real-size cars. And since we want kids to spend more time in the real world and less in the virtual one, the app also allows them to play with friends, challenging each other to break speed records with their respective cars.
A simple and fun mobile app that uses technology to encourage kids to play one of the most popular real-life games of all time: car racing.
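The conversion behind the idea is simple scale math: measure the toy's speed and multiply by the scale factor to get the full-size equivalent. A minimal sketch, assuming the standard 1:64 Hot Wheels scale; this is an illustration, not the app's actual code.

```python
# Sketch of the toy-to-real speed conversion. SCALE assumes the common
# 1:64 Hot Wheels scale.

SCALE = 64

def real_speed_kmh(toy_speed_kmh):
    """Convert the measured toy speed to its full-size-car equivalent."""
    return toy_speed_kmh * SCALE
```

So a toy clocked at 5 km/h would read as 320 km/h at real scale.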
My role here was to study the viability of the application and create a working prototype. After that, an independent development studio was in charge of the full implementation.
World Cup Race was a “phygital” experiment by the Lola Hack Lab: an old Scalextric set with two hacked cars connected to Twitter, visualizing the digital world in the physical one.
The stream of official "hashflags" implemented by Twitter for supporting the different national teams at the 2014 World Cup was converted into a real race: two slot cars powered by those tweets competed against each other, each tweet for a team made its car go further, and the fans who tweeted the most won the race!
The Scalextric track was modified using an Arduino platform in two ways. First, we connected the movement of the race cars in real time to the official country hashflags being tweeted during the match: each hashflag mention advanced the corresponding car a fixed distance along the track. Second, so we could see who was winning, we installed a digital lap counter. In addition, you could watch the cars race via live streaming from two strategically placed webcams. Each race started an hour before the match and continued throughout the game.
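The tweet-to-track logic can be sketched like this: count hashflag mentions per team and advance the matching car a fixed step for each one. A minimal illustration only; the hashtags and the step distance are assumptions, not the real campaign's values.

```python
# Sketch of World Cup Race's core rule: each tweet mentioning a team's
# hashflag advances that team's car a fixed distance. Tags and STEP_CM
# are illustrative assumptions.

STEP_CM = 5  # distance a car advances per qualifying tweet (assumed)

def advance_cars(tweets, teams=("#ESP", "#NED")):
    """Return the total distance advanced per team for a batch of tweets."""
    distance = {team: 0 for team in teams}
    for text in tweets:
        for team in teams:
            if team in text:
                distance[team] += STEP_CM
    return distance
```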
Lola Mullen Lowe, June 2014
Role: Technical Help and Video Stream
Technology: Arduino, Processing, Video Streaming
The Sónar+D Innovation Challenge (SIC) is organised at the Sónar Festival in collaboration with the Music Technology Group of the UPF. It is an accelerated innovation program based on creative technology and open collaboration between companies and top international talent, helping companies to quickly innovate in topics that directly impact their business, to find the right talent along the way, and to create forward-looking projects.
The challenge I was selected for in the 2017 edition, together with Teosto Futures Lab from Finland, was measuring emotions to enhance the shared music experience at music festivals.
After some initial remote work with the other team members and Teosto, we came up with an idea to produce during the festival: a real-time visualization of festival attendees’ emotional and excitement levels during a live music experience. We built a prototype that measured heart rate and galvanic skin response with Bitalino sensors while attendees watched Sónar Festival gigs streamed in real time. The sensors were connected to bananas that people held in their hands; bananas were used for their conductivity, replacing the laboratory-like setup where sensors are taped onto test subjects. Sensor data was filtered and analyzed in real time with Python, and music features extracted from the live stream were matched against it.
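The real-time smoothing step can be sketched as a sliding-window moving average plus normalization for the visuals. A minimal illustration, assuming noisy scalar sensor samples; the window size and normalization bounds are illustrative choices, not the prototype's actual parameters.

```python
# Sketch of real-time biosensor smoothing: a sliding-window moving average
# over incoming samples, then clamping/scaling to 0-1 for the visualization.
# Window size and bounds are illustrative assumptions.

from collections import deque

class MovingAverage:
    def __init__(self, window=4):
        self.buf = deque(maxlen=window)  # drops oldest sample automatically

    def push(self, sample):
        """Add one sample and return the current windowed mean."""
        self.buf.append(sample)
        return sum(self.buf) / len(self.buf)

def normalize(value, lo, hi):
    """Clamp and scale a sensor reading into 0.0-1.0 for the visuals."""
    if hi == lo:
        return 0.0
    return min(max((value - lo) / (hi - lo), 0.0), 1.0)
```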
Freelance, June 2017
Role: Concept & Creation
Technology: Processing, OpenGL, Python
Pfadfinderai is a company with broad experience in interactive installations and art & technology projects. I worked with them on this new interactive installation, helping to shape the bridge between the computer vision system and the interactive visualization.
This interactive installation was part of a series designed to run for long periods aboard a cruise ship sailing the Mediterranean Sea, so it was mandatory to build a robust, error-free system requiring no maintenance. Chef’s Tables was a system of connected kitchen tables designed to help students follow the cooking class.
An industrial infrared camera captures the chef’s hands while he or she is cooking and sends this imagery to the students’ tables. Each student table detects where every plate and glass sits on its surface, and the projected background interacts with those positions in real time. I was in charge of integrating the FLIR infrared camera’s computer vision pipeline into the Unity 3D engine.
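The table-top detection step can be sketched as thresholding the infrared image (warm objects read brighter) and finding each bright region's centroid via connected components. Sketched here in pure Python for brevity; the production system ran inside Unity in C++/C#, and the threshold is an illustrative value.

```python
# Pure-Python sketch of object detection on an infrared frame: threshold,
# then flood-fill connected components (4-connectivity) and return one
# centroid per bright region. Illustrative only.

def find_objects(image, threshold=128):
    """image: 2D list of grayscale pixels (0-255).
    Returns a list of (row, col) centroids, one per bright region."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    centroids = []
    for y in range(h):
        for x in range(w):
            if image[y][x] >= threshold and not seen[y][x]:
                stack, pixels = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and image[ny][nx] >= threshold \
                                and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                row = sum(p[0] for p in pixels) / len(pixels)
                col = sum(p[1] for p in pixels) / len(pixels)
                centroids.append((row, col))
    return centroids
```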
Role: CV Development
Technology: C++/C#, Unity3D
Google MyAccount Interactive Wall was an interactive installation shown at different venues and exhibitions to explain in a simple way how privacy works on the Google platform. I was the technical lead at the B-Reel Berlin office when we developed a second version of this project for Google.
The first version was a short-throw projection onto a big plywood stand that worked as a screen, with fixed touch points connected to an Arduino platform that sent commands to a web-based application. It was not easy to move and configure from venue to venue, so we simplified the second version: it replaced that mechanism with a large industrial touch screen running almost the same software as before, an HTML5 application in a standard browser.
I worked with different stakeholders and providers to source new hardware and adapt the existing software for every new venue or event.
B-Reel Berlin, Summer 2017
Role: Technical Lead
Technology: WebGL, HTML5, Interaction
ScooP is a virtual and augmented reality platform made by the Sopra Steria Spain innovation team in partnership with Microsoft, with the main objective of bringing the industrial 3D CAD environment to the manufacturing workshop through augmented reality tools such as Microsoft HoloLens, tablets, and smartphones.
I started working on this project as Product Design Lead and human-computer interaction specialist, bringing ease of use and understanding to the 3D environment and managing the new product iteration to cover a broad range of industry use cases. After that, I also collaborated on the first implementation of the platform in the aeronautics industry.
The platform understands complex 3D CAD models and translates them into lightweight data models that can be used on any kind of device. With more advanced devices that understand and interact with the space, such as the MS HoloLens, we can overlay virtual data onto the real world, showing the user augmented spatial data at the right moment.
One of the big steps in the new product design experience was the shared multiuser experience, which made it possible for different people in different places to interact with the same object: for example, an engineer can remotely guide an operator in the workshop through maintenance work, sharing videos or documents in the context of the 3D space.
Sopra Steria Madrid, 2020
Role: Product Design Lead
Technology: Unity3D, AR/VR, Hololens
Smart Bag was a research project created by the design team of the BBVA Invisible Payments department. I started working with them as a creative technologist when the project was just an idea, and I took charge of developing the first product iterations from a technical and functional point of view, as well as demonstrating their economic viability.
To make the idea tangible, I managed the project using a design thinking approach focused on short-term results and a user-centric vision. Two-week sprints organized the team into several iterations to create functional hardware and software prototypes. Different stakeholders were involved: the ESNE Prototyping Lab, product design experts in fabrics, ergonomics, and industrial design, as well as different areas of BBVA such as business, technology, and design.
We achieved three product iterations. The first delivered a working software and hardware prototype, useful for evaluating the state-of-the-art technology and the viability of the project. The second focused on reducing the prototype’s unit cost and size and on researching ergonomics and aesthetics. The third focused on ad-hoc PCB development to estimate the cost of mass production.
The resulting product after these three iterations was an IoT device, easily integrated into different shopping bag designs, able to detect when the consumer put a product into the bag or left it on the shelf, plus a smartphone application showing real-time product and shopping information and able to complete the payment just before the consumer left the shop.
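The core detection logic can be sketched as diffing successive inventory scans: the reader periodically reports the set of tags currently inside the bag, and comparing two scans tells us what was added or removed. A minimal illustration under that assumption; tag IDs are made up for the example.

```python
# Sketch of the Smart Bag detection step: compare two RFID inventory
# scans to find which tagged products were added to or removed from
# the bag. Tag IDs are illustrative.

def diff_scans(previous, current):
    """Return (added, removed) tag sets between two inventory scans."""
    previous, current = set(previous), set(current)
    return current - previous, previous - current
```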
BBVA Madrid, 2019-2020
Role: Product Lead
Technology: RFID, BT, Angular
Sopra Steria Madrid, 2020
Role: Product Design
Technology: WebGL, AR/VR, SparkAR
With my expertise I’ve collaborated with big companies and well-known media outlets, and I’ve presented my projects at music tech festivals and events. I love contributing to the advertising and creative industries, and I enjoy creating unusual forms of interaction. But one of the things I love most is enhancing the user experience with the latest technologies.