Allows users to input text using only the movements of their eyes.
The Eye Type application allows users to input text to a mobile device, using only
the movements of their eyes. The application was developed for the IEEEmadC 2015
(Mobile Application Development Contest) http://ieeemadc.org/ and won the Computer
Society Special Award!
Developer: Evangelos, Greece, IEEE Region 8
Link to demonstration video: https://youtu.be/6KOoxkY7KBc
APPLICATION DESCRIPTION
This mobile application aspires to be a step towards ubiquitous gaze tracking
using handheld devices. The ultimate goal is to assist people with ALS, locked-in
syndrome, tetraplegia, or anyone else who can only move their eyes, in using
their handheld device to communicate, browse the Internet, or carry out other
everyday tasks.
The Eye Type application consists of a visual keyboard, whose keys are “clicked”
by estimating the user's gaze (line of sight). Instead of using the traditional
QWERTY keyboard layout, which would make it very hard to accurately
determine which key the user is looking at, the application's main layout
comprises several large keys. Text is composed using predictive text input,
which depends on the combination of the keys clicked (each containing a specific
subset of letters). The visual keyboard layout consists of four keys at the corner
of the screen containing the letters of the alphabet equally distributed (hence
making them “ambiguous” keys) and three control keys (in order to accept a
word, correct mistypings and scroll through the predicted words).
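To illustrate how such an ambiguous-key layout can be disambiguated, the following
sketch (in Java, since Eye Type is an Android application) indexes a dictionary by
key sequence, in the style of T9 predictive input. The letter grouping over the four
keys and the tiny dictionary are assumptions for illustration, not the application's
actual engine or data.

    import java.util.*;

    // Minimal sketch of a disambiguation engine for four "ambiguous" keys.
    // The letter grouping and the tiny dictionary are illustrative assumptions.
    public class Disambiguator {

        // Letters of the alphabet distributed over four keys (keys 0..3).
        private static final String[] KEYS = {
            "abcdefg", "hijklm", "nopqrs", "tuvwxyz"
        };

        private static int keyOf(char c) {
            for (int k = 0; k < KEYS.length; k++) {
                if (KEYS[k].indexOf(c) >= 0) return k;
            }
            return -1;
        }

        // Encode a word as the key sequence the user would "click" with their gaze.
        private static String encode(String word) {
            StringBuilder sb = new StringBuilder();
            for (char c : word.toLowerCase().toCharArray()) sb.append(keyOf(c));
            return sb.toString();
        }

        // Dictionary indexed by key sequence; every word sharing a sequence
        // becomes one of the suggestions offered to the user.
        private final Map<String, List<String>> index = new HashMap<>();

        public Disambiguator(Collection<String> dictionary) {
            for (String w : dictionary) {
                index.computeIfAbsent(encode(w), k -> new ArrayList<>()).add(w);
            }
        }

        public List<String> suggestionsFor(String keySequence) {
            return index.getOrDefault(keySequence, Collections.emptyList());
        }

        public static void main(String[] args) {
            Disambiguator d = new Disambiguator(Arrays.asList("hello", "help", "gaze", "type"));
            // "hello" corresponds to the key sequence 1,0,1,1,2.
            System.out.println(d.suggestionsFor("10112"));
        }
    }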
DESCRIPTION OF USE
The Eye Type application works optimally on large-screen devices such as tablets.
The user is asked to keep his head still during the eye typing session. In order
to facilitate this, an object such as a large book may be used as a chin rest. The
illumination must be adequate so that the eyes of the user are correctly detected.
The user presses the 'calibrate' button in order to calibrate the system and then
can start typing using their gaze. During the calibration phase, green dots appear
successively at certain points on the screen, followed by red shrinking dots. The
green dots remain visible for ~1 second and aim to prepare the user to direct his
gaze towards that point. When the red shrinking dots appear, the positions of the
eyes of the user are captured. Thus, if the user needs to blink, he should do
so when the green dots appear.
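A minimal sketch of how the captured calibration samples could be turned into a
gaze-to-screen mapping, assuming an independent least-squares linear fit per axis
(the application's actual mapping procedure is not documented here):

    import java.util.List;

    // Minimal sketch: fit a linear mapping from detected eye positions to
    // screen coordinates using the calibration samples. The per-axis linear
    // model is an assumption for illustration.
    public class GazeCalibration {

        // One calibration sample: detected eye position and the known on-screen dot.
        public static class Sample {
            final double eyeX, eyeY, screenX, screenY;
            Sample(double ex, double ey, double sx, double sy) {
                eyeX = ex; eyeY = ey; screenX = sx; screenY = sy;
            }
        }

        private double ax, bx, ay, by; // screenX = ax*eyeX + bx, screenY = ay*eyeY + by

        // Least-squares fit of a line y = a*x + b.
        private static double[] fitLine(double[] xs, double[] ys) {
            int n = xs.length;
            double sx = 0, sy = 0, sxx = 0, sxy = 0;
            for (int i = 0; i < n; i++) {
                sx += xs[i]; sy += ys[i]; sxx += xs[i] * xs[i]; sxy += xs[i] * ys[i];
            }
            double a = (n * sxy - sx * sy) / (n * sxx - sx * sx);
            double b = (sy - a * sx) / n;
            return new double[] { a, b };
        }

        public void calibrate(List<Sample> samples) {
            int n = samples.size();
            double[] ex = new double[n], ey = new double[n], px = new double[n], py = new double[n];
            for (int i = 0; i < n; i++) {
                Sample s = samples.get(i);
                ex[i] = s.eyeX; ey[i] = s.eyeY; px[i] = s.screenX; py[i] = s.screenY;
            }
            double[] fx = fitLine(ex, px);
            double[] fy = fitLine(ey, py);
            ax = fx[0]; bx = fx[1]; ay = fy[0]; by = fy[1];
        }

        // Estimate where on the screen the user is looking.
        public double[] estimate(double eyeX, double eyeY) {
            return new double[] { ax * eyeX + bx, ay * eyeY + by };
        }
    }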
Once the calibration session is completed, the typing view containing the visual
keyboard automatically appears and a magenta-colored circle indicates the
position of the gaze estimations. The user directs his gaze at the keys containing
the letters he wants to input. A button on the screen is considered clicked upon
fixation on it for a specific time interval (~1 second) and momentarily turns green.
A disambiguation/predictive text engine predicts the desired word and allows the
user to either accept it, or select an alternative suggestion. As the user continues to
type, the predictive text engine attempts to determine which word the user means
to input and also offers alternative predictions (list of suggestions).
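The dwell-based click described above can be sketched as follows, assuming a fixed
1-second threshold and a hypothetical KeyResolver hit-test; the real application may
smooth or filter the gaze estimates differently.

    // Minimal sketch of dwell-time click detection: a key counts as "clicked"
    // once consecutive gaze estimates have stayed on it for the dwell interval.
    // The fixed 1-second threshold and the KeyResolver interface are assumptions.
    public class DwellClickDetector {

        public interface KeyResolver {
            // Returns the id of the key under (x, y), or null if none.
            String keyAt(double x, double y);
        }

        private static final long DWELL_MS = 1000; // ~1 second fixation

        private final KeyResolver resolver;
        private String currentKey = null;
        private long dwellStartMs = 0;

        public DwellClickDetector(KeyResolver resolver) {
            this.resolver = resolver;
        }

        // Feed each new gaze estimate; returns the clicked key id, or null.
        public String onGazeSample(double x, double y, long nowMs) {
            String key = resolver.keyAt(x, y);
            if (key == null || !key.equals(currentKey)) {
                // Gaze moved to a different key (or off the keyboard): restart the timer.
                currentKey = key;
                dwellStartMs = nowMs;
                return null;
            }
            if (nowMs - dwellStartMs >= DWELL_MS) {
                dwellStartMs = nowMs; // avoid repeated clicks while the gaze stays put
                return key;
            }
            return null;
        }
    }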
Last updated on 11/03/2016
Minor bug fixes and improvements. Install or update to the newest version to check it out!
Submitted by
Victor Manuel Serna Monroy
Requirements
Android 3.0+