Well, today you can buy apps that look at things like how many steps you take, that track how much you jump on your snowboard, and that look at your location and tell you who is around you. But that’s just a small part of what the sensors in mobile devices can really know about what you’re doing. Motion sensors can tell you not just how much you walk, but where you walk: up the stairs, down the stairs, on a rough street, in the mountains, wherever. And there’s a lot of work on telling exactly where you are going.
And you can, of course, do tracking not just with GPS; Google and others can track you indoors. Maps of buildings are being built, so they know exactly where you go in the mall, where you go in the university, and so on. And there’s more. A lot of the work we did used what is called audio scene recognition. As everybody knows, if you close your eyes and just listen, you very often know where you are.
And you can show that with the microphone in your smartphone, even without recording a continuous audio stream, which can be a problem for privacy reasons. From just small snippets you can deduce things like a busy street, a football stadium, a church. And you can also deduce what people are doing: washing hands in a bathroom, making coffee, mowing the lawn. Everything you do, or at least a lot of what you do, leaves a sound signature. And you can show (we did this work quite a while ago) that you can very nicely recognise what people are doing just from the sound stream on a phone, with high accuracy.
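To make the idea concrete, here is a minimal sketch of snippet-based scene recognition, assuming crude log band-energy features and a nearest-centroid classifier. The actual systems use far richer features and models; every name and parameter here is illustrative.

```python
import numpy as np

def snippet_features(snippet, n_bands=8):
    """Log energies in coarse frequency bands of a short audio snippet.
    (Real systems would use richer features such as MFCCs.)"""
    power = np.abs(np.fft.rfft(snippet)) ** 2
    bands = np.array_split(power, n_bands)
    return np.log(np.array([b.sum() for b in bands]) + 1e-10)

def train_centroids(labelled):
    """labelled: dict mapping scene label -> list of example snippets."""
    return {label: np.mean([snippet_features(s) for s in snips], axis=0)
            for label, snips in labelled.items()}

def classify_scene(snippet, centroids):
    """Assign the snippet to the scene with the closest feature centroid."""
    f = snippet_features(snippet)
    return min(centroids, key=lambda lab: np.linalg.norm(f - centroids[lab]))
```

In practice you would train on many labelled snippets per scene; the point is only that a second or two of audio already carries a usable signature.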
And there’s more you can do. This is one of our favourite papers from a long time ago, but it’s still cool: we looked at trying to monitor nutrition. From many aspects of health, this is an interesting problem: can you see how much, and what, people are eating? And it’s not easy to do. You can do it by inserting electrodes into the throat and things like that, which doesn’t have much public appeal. What we did instead was put a microphone in a person’s ear. You know that you can hear yourself chewing. And by analysing that microphone signal, you can really count how many chews a person takes.
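A rough sketch of the chew-counting idea, assuming a simple energy-envelope detector: chews show up as bursts in the smoothed amplitude of the in-ear signal. The window sizes and thresholds here are illustrative, not the values from the actual study.

```python
import numpy as np

def count_chews(audio, sr, min_gap_s=0.3, thresh_ratio=3.0):
    """Count chewing events as energy bursts in the smoothed envelope
    of an in-ear audio signal.  All thresholds are illustrative."""
    win = max(sr // 20, 1)  # ~50 ms smoothing window
    envelope = np.convolve(np.abs(audio), np.ones(win) / win, mode="same")
    above = envelope > thresh_ratio * np.median(envelope)
    rises = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    # merge bursts closer together than min_gap_s into a single chew
    chews, last = 0, -min_gap_s * sr
    for i in rises:
        if i - last >= min_gap_s * sr:
            chews += 1
            last = i
    return chews
```

The median-based threshold adapts to the background level, so the same code works for quiet and noisy recordings.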
And you can recognise with reasonable accuracy whether you’re eating chips or an apple or meat or whatever. Again, from personal experience, you can hear it yourself. We can take it a bit further. Staying with the microphone, you can follow the course of a conversation, for example, by detecting whether it’s you speaking or someone else. And looking at other smartphone sensors, you can go on and see who is in the room. If I did a scan of Bluetooth IDs in this room, probably something like 30% of you would have Bluetooth on, because of your car speaker and so on. And those are things that allow me to identify you personally.
So it’s really quite interesting: today there’s this discussion about the NSA and privacy, and I get into discussions with people about it. And then you see they have their Bluetooth on and their Wi-Fi on, so they can be tracked and identified. People don’t understand how easy it is to extract information about people. There are other things you can do. If you analyse the voice on the phone, for example, you can get very good information about people’s emotional state, stress, and things like that. Again, those are things we did in different projects, and they were shown to be quite accurate. So in summary: it’s a small phone, but it’s really a Big Brother.
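As an illustration of the kind of voice feature such analyses build on (a sketch, not the method from those projects): pitch variability is one of several crude cues associated with emotional arousal, and it can be estimated with a simple autocorrelation pitch tracker.

```python
import numpy as np

def frame_pitch(frame, sr, fmin=75, fmax=400):
    """Estimate the fundamental frequency of one voiced frame from the
    lag of the autocorrelation peak within the plausible pitch range."""
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)
    return sr / (lo + np.argmax(ac[lo:hi]))

def pitch_variability(signal, sr, frame_s=0.04):
    """Standard deviation of frame-wise pitch estimates.  Raised pitch
    variability is one (crude) correlate of stress and arousal."""
    n = int(frame_s * sr)
    pitches = [frame_pitch(signal[i:i + n], sr)
               for i in range(0, len(signal) - n + 1, n)]
    return float(np.std(pitches))
```

A real system would combine many such features (energy, speaking rate, spectral shape) in a trained model; this only shows that the raw material is easy to extract.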
This thing can know a lot about you. And again, what you need to understand is that the phone is just one device. One last example I wanted to give you of the complex things you can do with a phone: we had a project with a psychiatric hospital on the diagnosis of bipolar disorder, the illness you very often see with Hollywood stars, where people swing between mania and depression. And you can show that, just using simple smartphone sensors, you can recognise with something like 95% accuracy, on a daily basis, whether people are depressed, manic, or normal. We verified that in clinical studies. So again, that’s just a phone.
And what you’re really seeing today is an invasion of devices that are smart and sensor-enabled, Google Glass among them. Recently, I got myself a scale that connects to my iPhone and gives me the data. Not that it helped anything, but it gives a feeling of doing something. And since it didn’t help, I have the proof that I need more gadgets. That’s the way you sell those things. So let me show you what you can do with other cool devices. Recently, we started working with Google Glass. Essentially it’s just a pair of glasses that has a camera, a small display, and a couple of sensors.
One is an inertial measurement unit, which essentially follows the motions of the head. The other one is something very interesting: a blink sensor. It’s an infrared LED that shines into your eye and measures the reflected energy, which lets you see when you blink. What you can do with that is try to recognise activity. Here’s a nice video of the blink detection so I can show you how it worked. You can see him wearing the Google Glass, and there’s the output of the sensor. The sensor was conceived by Google to notice when you take the device off, so it can switch itself off.
But if you make small, involuntary blinks, as everybody does, you will see these blinks as small blips here. And people in psychology have found that the frequency of your involuntary blinks depends on what you’re doing. If you concentrate, you have a different frequency than when you’re doing sports, for example. So we did a study where we had people do a couple of activities: reading, playing games, doing a maths problem, doing some home improvement work, or just relaxing in front of the TV.
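A sketch of how those blips can be turned into an activity decision, assuming a simple threshold detector on the IR signal plus head-motion variance from the IMU. The centroid numbers below are made up for illustration; they are not the study’s data.

```python
import numpy as np

def blinks_per_minute(ir_signal, duration_s, thresh=0.5):
    """Count blink 'blips' as rising threshold crossings of the IR
    proximity signal, and convert to a rate."""
    above = ir_signal > thresh
    blips = np.count_nonzero(above[1:] & ~above[:-1])
    return 60.0 * blips / duration_s

def activity_features(ir_signal, head_accel, duration_s):
    """Two features: mean blink frequency and head-motion variance."""
    return np.array([blinks_per_minute(ir_signal, duration_s),
                     np.var(head_accel)])

# Hypothetical class centroids in (blinks/min, accel variance) space.
CENTROIDS = {
    "reading": np.array([8.0, 0.05]),
    "watching TV": np.array([18.0, 0.1]),
    "home improvement": np.array([25.0, 2.0]),
}

def classify_activity(features, centroids=CENTROIDS):
    """Nearest centroid after rescaling so blink rate does not dominate."""
    scale = np.array([10.0, 1.0])
    return min(centroids,
               key=lambda k: np.linalg.norm((features - centroids[k]) / scale))
```

In the real study the centroids come from labelled training sessions per activity, but the two-feature geometry is the same idea.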
And what you can show is that if you plot, for these activities, the mean blink frequency versus the variance of the head motion from the acceleration sensor, they separate pretty nicely. So the interesting thing is that just with something like Google Glass, which has a very simple diode looking into your eyes, you can deduce what sort of complex activities people are doing, with reasonably high accuracy. And again, that’s just a commercial device. Right? There’s more that you can do. I just can’t resist showing you one more piece of our cool wearable stuff. We devised something which is essentially a neck-worn touchpad.
It’s a piece of conductive textile that you wear around the neck. In a real device we wouldn’t have all this hardware here; you would just have a normal collar. It doesn’t have to be tight, it doesn’t strangle you, and all the students survived. What you can do with it, through simple physics, is essentially look inside your neck. You can see things that you’re swallowing, things that are moving inside your neck. That’s Jin Hwang, the post-doc who actually did the project, and you can see the system detecting her chewing, swallowing, head movements, and talking.
And you can actually deduce what a person is swallowing, whether it’s a piece of bread or a sip of water. Again, it’s something you could put into your tie. Finally, a good reason to wear a tie: you would know what you’re eating, and you could give yourself a small electric shock if it’s too much chocolate. So we can classify things like head shakes and swallowing and so on.