Photogrammetry avatar at Hollywood and Vine, image © 2014 John Craig Freeman

Artist Interview: Brain Sensing and Augmented Reality

December 3, 2014
Desi Gonzalez, Graduate Student, Comparative Media Studies, MIT

Desi Gonzalez is a graduate student in Comparative Media Studies at MIT. Her dissertation research involves investigating art and technology programs in museums, among them our Art + Technology Lab. Recently, she talked to artist John Craig Freeman, one of the Art + Technology Lab grant recipients. Freeman is working on a project called EEG AR: Things We Have Lost. The acronyms in the title refer to two technologies the artist has been experimenting with: electroencephalography, or brainwave sensing, and augmented reality, which overlays computer-generated imagery onto the real world. Desi and John Craig sat down to discuss his project, its origins, and the use of public space as a site for technology-based art.

Desi Gonzalez: What is your plan for EEG AR: Things We Have Lost, and where did this idea come from?

John Craig Freeman: This project is based on an earlier work of the same title that I did with Scott Kildall in Liverpool, in the U.K. I went out into the city, asked people what they had lost, and recorded their responses on video. Then I made a database of the lost things and, using augmented reality technology and geolocation, I returned the objects to the places where I had spoken with each individual. By the time the project was complete, the city was littered with all these lost things, which people could see using a tablet or a phone. It was a portrait of what the city had lost, a collective imagination of what's important to Liverpool.
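The mechanics Freeman describes here, a database of lost objects, each tagged with the geolocation where the conversation took place and then rendered in AR when a viewer is nearby, could be sketched roughly as follows. This is a minimal illustration under assumed names; the record fields and the viewing radius are not the project's actual schema or code.

```python
# Illustrative sketch (assumed fields, not the project's actual data model):
# a geotagged "lost object" record and the query an AR browser might run
# to decide which objects to render near the viewer's current position.
import math
from dataclasses import dataclass

@dataclass
class LostObject:
    name: str        # what the person said they had lost
    video_url: str   # recorded interview clip (hypothetical field)
    latitude: float  # where the conversation took place
    longitude: float

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters (haversine formula)."""
    r = 6_371_000  # Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def objects_in_view(db, viewer_lat, viewer_lon, radius_m=100):
    """Return the lost objects close enough to the viewer to show in AR."""
    return [o for o in db
            if distance_m(viewer_lat, viewer_lon, o.latitude, o.longitude) <= radius_m]
```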

JCF: In Los Angeles, I am using a technology called photogrammetry, in which you take photographs of something from multiple angles and extract the 3-D geometry to make a virtual object. I'm interested not only in what people have lost; I'm also interested in the people themselves. Rather than simply making the objects, I am making avatar representations of people that I can place around the city. Then I'll bring these characters to life so they're not just inanimate avatars; the idea is that they will become responsive to you.

DG: How do you plan on using EEG, or brainwave sensing, in this project? 

JCF: I'm planning to transform the Art + Technology Lab into a performance/installation space, with a dentist's chair and the kind of privacy screens you might see in a clinic. The public can drop in or make an appointment and be hooked up to brainwave sensors that measure brain states such as attention. I've written a patch that hacks into the software so that when the attention value rises, the brainwaves trigger a call to the database and a random lost object appears on an iPad a few meters in front of the person. The idea is that the public will conjure these lost objects by imagining them into existence.
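The triggering logic Freeman describes, an attention reading that crosses a threshold and causes a random record to be pulled from the database and displayed, could be wired up roughly like the sketch below. The function names, the 0-100 attention scale, and the threshold are assumptions for illustration; this is not the artist's actual patch or any particular headset vendor's API.

```python
# Illustrative sketch only, with assumed names: when the headset's
# "attention" reading crosses a threshold, pull a random lost object
# from the database and hand it to the AR display.
import random
import time

ATTENTION_THRESHOLD = 70  # assumed 0-100 attention scale

def read_attention() -> int:
    """Stand-in for the EEG headset's attention value (replace with the real SDK call)."""
    return random.randint(0, 100)

def show_in_ar(lost_object: str) -> None:
    """Stand-in for the call that places the object a few meters in front of the viewer."""
    print(f"Conjuring: {lost_object}")

def run(lost_object_db, poll_seconds=0.5):
    conjured = False
    while True:
        attention = read_attention()
        if attention >= ATTENTION_THRESHOLD and not conjured:
            show_in_ar(random.choice(lost_object_db))
            conjured = True   # fire once per threshold crossing, not on every poll
        elif attention < ATTENTION_THRESHOLD:
            conjured = False  # re-arm once attention drops back below the threshold
        time.sleep(poll_seconds)

# Example: run(["a set of house keys", "a childhood photograph", "a grandmother's ring"])
```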

DG: You use emergent technologies to intervene in public spaces. How is the concept of public space in Los Angeles different from that of other places where you’ve worked?

JCF: In places like Europe, or Boston, where I am based, public space is still identifiable; a common or a public square is built into the city. Los Angeles was built without a central public space; it is completely distributed. In addition, most public space in L.A. is corporate-owned. It's interesting how the plaza at LACMA comes alive in the evening. It's because people will find a way to gather and make a space theirs.

I intend to use the LACMA plaza as a kind of studio. I’ll be in residence in L.A. from January to May. As I develop my project, I’ll hold events at which people hanging out in the plaza can view objects through an AR viewing device that I’ve built. Later on, in the spring, I’ll give walking tours of the city.


Viewing lost things, LACMA, image © 2014 John Craig Freeman

DG: Are you working with any of the Art + Technology partners?

JCF: One of the Art + Technology partners is Brian, the CEO of a company called DAQRI, which focuses on augmented reality. They've been doing a lot of work with brainwave sensing and have been developing software that interfaces it with augmented reality. Brian's thinking is that it might be possible to get direct signatures for specific objects: if a person thinks about losing their teeth, the software produces the lost teeth object. It's kind of a cool but creepy thought.

Note: John Craig Freeman will be presenting components of his project EEG AR: Things We Have Lost on LACMA's BP Pavilion and in the Art + Technology Lab in February and March. For more information, visit lacma.org/lab or join our mailing list.