
Learning touch locations on a phone screen from motion sensors

For the final project of Principles of Artificial Intelligence, we built a neural network using TensorFlow, TFLearn, and scikit-learn. We wrote our own JavaScript collection program, analyzed the waveforms from the smartphone's motion sensors, and detected touch events from them. Going beyond previous work that could only deduce 4-digit PINs, our model outputs the coordinates of the touch location on the phone screen, and it successfully recognized a full sentence.
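As a rough illustration of the analysis side, here is a minimal Python sketch of the two core steps: detecting a tap as a spike in accelerometer magnitude, and regressing the tapped (x, y) point from a fixed-length sensor window with TFLearn. The window length, channel count, peak thresholds, and layer sizes below are illustrative assumptions, not the values used in the project:

```python
import numpy as np
from scipy.signal import find_peaks
import tflearn

WINDOW = 200   # samples per tap window (assumed)
CHANNELS = 6   # e.g. 3-axis accelerometer + 3-axis gyroscope (assumed)

def detect_taps(accel, height=1.5, min_gap=30):
    """Find tap events as peaks in the accelerometer magnitude.

    accel: (n_samples, 3) array of accelerometer readings.
    The thresholds are illustrative and would be tuned on real traces.
    """
    magnitude = np.linalg.norm(accel, axis=1)
    peaks, _ = find_peaks(magnitude, height=height, distance=min_gap)
    return peaks  # sample indices of candidate taps

def build_model():
    """Small fully connected regressor from a sensor window to (x, y)."""
    net = tflearn.input_data(shape=[None, WINDOW, CHANNELS])
    net = tflearn.fully_connected(net, 128, activation='relu')
    net = tflearn.fully_connected(net, 64, activation='relu')
    net = tflearn.fully_connected(net, 2, activation='linear')  # screen (x, y)
    net = tflearn.regression(net, optimizer='adam',
                             loss='mean_square', learning_rate=0.001)
    return tflearn.DNN(net)

# model = build_model()
# model.fit(X_windows, Y_coords, n_epoch=50, validation_set=0.1)
# x, y = model.predict(window[np.newaxis])[0]
```

The training data itself comes from the JavaScript collector, which records the browser's motion-sensor stream while the user taps; the sketch above covers only the offline analysis.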

In the GIF below, we used the coordinates output by the neural network to tap the on-screen keyboard. Only one letter was wrong (x → c), and with auto-correction we succeeded in re-creating the sentence: A secret!

[GIF: the predicted touch coordinates replayed on the keyboard to reproduce the sentence]
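As an illustration of how the predicted coordinates drive the keyboard replay, here is a small sketch that snaps a predicted point to the nearest key. The key-center coordinates are hypothetical and depend on the device and keyboard layout:

```python
# Hypothetical key centers in screen pixels; real values depend on the
# device resolution and keyboard layout (only a few keys shown).
KEY_CENTERS = {
    'q': (54, 1380), 'w': (162, 1380), 'e': (270, 1380),
    'a': (108, 1510), 's': (216, 1510), 'd': (324, 1510),
}

def nearest_key(x, y):
    """Return the key whose center is closest to the predicted touch point."""
    return min(KEY_CENTERS,
               key=lambda k: (KEY_CENTERS[k][0] - x) ** 2 +
                             (KEY_CENTERS[k][1] - y) ** 2)

print(nearest_key(100, 1500))  # -> 'a'
```

Snapping to the nearest key tolerates small coordinate errors, and the remaining mistakes (like x → c) are the kind an auto-correction pass can recover.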

For more information, please see our slides (in English) and report (in Chinese).
