
Interview with Ariel Garten, CEO and co-founder of InteraXon


Early this month at NeuroGaming 2013 we had the awesome opportunity to interview Ariel Garten, CEO and co-founder of InteraXon, a Toronto-based BCI company that creates cutting-edge thought-controlled computing products and applications.

Melanie from NeuroGadget was lucky enough to sit down with Ariel on May 1, 2013 at the first-ever NeuroGaming Conference and Expo in San Francisco, where a wide variety of people, from neurosurgeons to game designers, came together to discuss the present state and future of neurogaming.

A big thank you to Ariel for taking the time to talk with us and share some insights on InteraXon’s Muse headband, and even on the origins of Google Glass, despite being constantly busy and followed around by crowds of people at the conference.

As you might notice, the audio quality of the video is far from perfect. We apologize for the background noise; hopefully the full transcript below the video will make the interview easier to follow.

Full transcript of our interview with Ariel Garten at NeuroGaming 2013

Melanie (NeuroGadget): I was hoping you could tell me a little bit about the Muse headset, what you guys are showing with it here, and hopefully follow up with a little bit about what you guys hope to do with it. Maybe if you’re going to integrate it with some of the other technologies like augmented reality.

Ariel Garten (InteraXon): Sure, well, this is Muse in case you haven’t seen it yet. It has sensors on the forehead and behind the ears. So I was here at the panel just talking about thought controlled computing, brain-computer interface, and specifically it was a panel on emotion. I brought the Muse up so that you can see it, because there are a lot of people who have ordered it on IndieGoGo who are looking forward to it coming out.

Melanie: So when is it supposed to come out exactly?

Ariel: It will be out at the end of the year.

Melanie: So that would be December or so?

Ariel: It’s November/December.

Melanie: Excellent. So what is your target audience for the Muse? What sort of people are buying it?

Ariel: So Muse is a device for everyone – it’s slim, simple, easy to use. [As for] the target audience, it seems the people buying it are interested in mental health: interested in being able to improve their cognition or their memory, or decrease their stress. They are interested in self-discovery, so quantified self and self-tracking. We have a large developer community that has already signed up to build on top of Muse, and we continue to find new segments – moms with kids with ADD, people with disabilities, and what we always joke are the people with ‘tin foil hats’: the people who believe that in some way the government is reading their brains, and who hope we can save them from it.

Melanie: Yes. As a neuroscientist that is my one pet peeve. As soon as I tell people about BCI technology they get really concerned – they’re like ‘does that mean you can read my thoughts?’ – and I’m always like ‘No! No it doesn’t!’ So you come from a neuroscience background as well, right? How does that tie into developing something like the Muse? Most of the people I’ve spoken to about branching into the BCI world say that it’s mostly for computer scientists.

Ariel: My background is in neuroscience and design, and both of those play very strongly into Muse. Our team is focused both on EEG processing with a strong signal-processing backend and an understanding of how it works, as well as on the design side. I think your question was whether BCI is slanted more towards developers or towards neuroscientists?

Melanie: Yes.

Ariel: Yeah. When I started what’s really a neurotech company, I was approaching it from the neuroscience side without realizing that most of my hires were going to be in the engineering, signal-processing, or computer-science space rather than the neuroscience space.

InteraXon Muse headband

Melanie: So what, as a neuroscientist, do you actually do? Are you on the signal processing side of things?

Ariel: So the neuroscientists on our team do a bunch of things. They run our in-house research studies – we’re constantly doing studies looking at brain and behaviour, and how they interact with EEG. They are responsible for interfacing with our partner labs – we have partner labs in both Canada and the US – and they’re also involved in understanding where in the brain signals are created, in order to understand the holistic EEG and try to do things like source localization, which is still in very, very early days.

Melanie: So people are trying to do source localization with the Muse?

Ariel: It’s very, very early days for source localization.

Melanie: It’s difficult enough with a full EEG cap – four sensors seems like it would be kind of an impossible task.

Ariel: You start to get a hint at best.

Melanie: Yeah. Have you ever considered branching out into other types of sensors? There are a lot of fNIRS things that are coming to the front, I recently saw that there are portable MEG sensors coming out – are you thinking of branching into those things at all, or just sticking with EEG?

Ariel: So we always think about the different sensor modalities you can combine to get readings. So Muse reads the whole brain – EKG, if you want it. Our sensor system is really, really sensitive. The accelerometer can actually detect your pulse if you’re trying to sit very, very still – you can’t fool the accelerometer. We haven’t done a lot of playing with fNIRS and no playing with MEG, but certainly when we think about the future of BCI and how it grows, we always look at how you can combine sensors to get better data.
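On that accelerometer point, pulse detection from motion data usually comes down to band-passing the signal around typical heart-rate frequencies and counting the peaks. The sketch below is our own minimal illustration, not InteraXon’s implementation – the 50 Hz sampling rate, the single vertical axis, and the SciPy-based approach are all assumptions.

    # Rough pulse estimate from an accelerometer trace of a person sitting still.
    # Assumptions (not from the Muse SDK): ~50 Hz sampling, one vertical axis, SciPy available.
    import numpy as np
    from scipy.signal import butter, filtfilt, find_peaks

    def estimate_pulse_bpm(accel_z, fs=50.0):
        # Band-pass around typical heart rates: 0.8–3.0 Hz is roughly 48–180 bpm.
        b, a = butter(2, [0.8 / (fs / 2), 3.0 / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, accel_z - np.mean(accel_z))
        # Peaks must be at least 0.33 s apart, capping the estimate near 180 bpm.
        peaks, _ = find_peaks(filtered, distance=int(0.33 * fs))
        if len(peaks) < 2:
            return None  # not enough beats detected to estimate a rate
        beat_intervals = np.diff(peaks) / fs   # seconds between successive beats
        return 60.0 / np.mean(beat_intervals)  # beats per minute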

Melanie: Ok, that’s cool! Right! So, integration with other types of technologies – not just branching into other sensing technologies, but with something like Google Glass coming out, would you ever consider integrating with it?

Ariel: So we’ve already done that, not with Google Glass, but we come out of Steve Mann’s lab at UofT (University of Toronto).

Melanie: Oh! I didn’t realize that Steve Mann was at UofT…wow!

Ariel: We have a Canadian hero!

Melanie: I don’t know how I didn’t make that connection given how much I’ve read about him…

Ariel: Yeah! So Steve Mann is at UofT. About a decade ago we started working on EEG with Steve and Chris, who’s my co-founder at InteraXon, and they worked on the EyeTap wearable system. So Chris basically worked on Google Glass, with Steve, before Google Glass even existed, and together with Steve we’ve done integrations with EyeTap and Brain Base and spent a lot of time thinking about how the two interact. So we’re pretty deeply versed in the space.

Melanie: That’s really exciting. Is there anything else you want to tell me about the Muse?

Ariel: Want to try it on?

Melanie: Sure! So this interfaces with iPhone?

Ariel: Smartphone or tablet.

Melanie: And it’s cross-platform?

Ariel: Mhm. [sharing anecdotes]

Melanie: So now that you have the Muse all done, what is kind of the next step? Because this is already a beautiful, sleek, simple device.

Ariel: Thank you. So the next step is actually completing the optimization on the Muse. We’ve made a few little tweaks to make the sensor contact softer, smoother, better, and to change the shaping around the ears a little bit so that it sits at the right angle on your head and really stays out of the muscle activity on your brows. So next is completing the tweaks on the Muse; we’ve already started with our manufacturer, so the process there is going pretty smoothly.

Melanie: So on a scale from 1 to ‘fully hackable,’ how easy is it to just take the Muse apart and get readings from it without interfacing with the software it’s meant to interface with?

Ariel: Oh, that’s a great question! Muse comes with a cross-platform SDK, so you can dump the data into Puredata, or MaxMSP, or Matlab, or whatever your favourite software environment is. If you wanted to completely route around that, you probably can – go for it.
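To give a flavour of what dumping the data into your favourite environment can look like: the early Muse tooling streamed samples over OSC, so a small listener is often all you need. The snippet below is a minimal sketch using the python-osc library; the ‘/muse/eeg’ address and port 5000 are assumptions to check against the SDK documentation, not guaranteed defaults.

    # Minimal sketch: receive streamed EEG samples over OSC with the python-osc library.
    # The "/muse/eeg" address and port 5000 are assumptions – check the SDK docs for the
    # exact paths and port your version of the streaming tool uses.
    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer

    def on_eeg(address, *channels):
        # Each OSC message carries one sample per EEG channel (four on Muse).
        print(address, channels)

    dispatcher = Dispatcher()
    dispatcher.map("/muse/eeg", on_eeg)

    server = BlockingOSCUDPServer(("127.0.0.1", 5000), dispatcher)
    server.serve_forever()  # prints incoming samples until interrupted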

Melanie: Cool, that’s good to know. So you are a completely different system than the NeuroSky chips, yes?

Ariel: Yes. NeuroSky did a great job of opening up the market, but what we wanted to do with Muse was go well beyond the capabilities that were offered. So it’s a discrete-component system that delivers real, near-clinical-quality EEG.

Melanie: Great! Thank you so much!

Ariel: Thank you.