Eye Gaze Guided Interface Code (interface code, real time eye movement classification algorithms)
This distribution contains the code for an eye-gaze-guided photo
viewing application (iGaze).
The code can be used as a test bed for creating any type of eye-gaze-guided
interface in Windows and for testing ideas related to usability,
such as widget layout, size, and visual feedback. The implemented
functionality is not limited to photo browsing and can be extended to
other examples of computer use where primary or auxiliary input is
provided by eye movements.
iGaze contains an implementation of two real-time eye movement
classification algorithms: Velocity Threshold Identification (I-VT) and
Kalman Filter Identification (I-KF). Both algorithms are implemented
under the umbrella of the Real Time Eye Movement Identification
(REMI) protocol. The real-time eye movement classification
capabilities implemented in iGaze allow testing of various research ideas
related to the accuracy, stability, and jitter of eye-gaze input.
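To illustrate the simpler of the two algorithms, the sketch below shows the core idea behind I-VT: each gaze sample is labeled a saccade if its point-to-point velocity exceeds a threshold, and a fixation otherwise. This is a minimal, hypothetical Python sketch for clarity only; the function and parameter names are our own, and the actual iGaze implementation (a Windows application operating within the REMI protocol) differs in structure and detail.

```python
import math

def classify_ivt(points, threshold_deg_s, sample_rate_hz):
    """Label gaze samples as 'fixation' or 'saccade' using I-VT.

    points: list of (x, y) gaze positions in degrees of visual angle
    threshold_deg_s: velocity threshold in degrees per second
    sample_rate_hz: eye tracker sampling rate in Hz
    """
    labels = []
    for i, (x, y) in enumerate(points):
        if i == 0:
            # No velocity can be computed for the first sample;
            # defaulting to 'fixation' here is an arbitrary choice.
            labels.append("fixation")
            continue
        px, py = points[i - 1]
        # Point-to-point angular distance times sampling rate gives
        # an instantaneous velocity estimate in degrees/second.
        velocity = math.hypot(x - px, y - py) * sample_rate_hz
        labels.append("saccade" if velocity >= threshold_deg_s
                      else "fixation")
    return labels
```

For example, with a 120 Hz tracker (as on the Tobii x120) and a 30 deg/s threshold, a jump of several degrees between consecutive samples is labeled a saccade, while sub-degree jitter stays a fixation. I-KF replaces this raw velocity estimate with one predicted by a Kalman filter, which smooths sensor noise before thresholding.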
The release of the iGaze code summarizes several years of
work done at the HCI lab at Texas State University. This release
summarizes the work presented in the following papers:
1) Calibration Accuracy and Resulting Interface Component Placement
O. V. Komogortsev, C. Holland, and J. Camou, Adaptive
eye-gaze-guided interfaces: design and performance evaluation,
In Proceedings of the ACM Conference on Human Factors in Computing
Systems (CHI), 2011, pp. 1255-1260. [link]
2) Real Time Eye Movement Identification (REMI) protocol to
facilitate real time eye movement classification
D. H. Koh, M. Gowda, and O. V. Komogortsev,
Real Time Eye Movement Identification Protocol, In
Proceedings of ACM Conference on Human Factors in Computing Systems
(CHI), Atlanta, GA, 2010, pp. 1-6. [.pdf]
3) Implementation and performance testing of real time eye movement
classification algorithms. General recommendations on the interface
component size, layout, and feedback.
D. H. Koh, S. A. M. Gowda, and O. V. Komogortsev, Input
evaluation of an Eye-Gaze-Guided interface: Kalman filter vs. Velocity
Threshold Eye Movement Identification, In Proceedings of ACM
Symposium on Engineering Interactive Computing Systems, Pittsburgh, PA,
USA, 2009, pp. 197-202. [.pdf]
We have decided to make the software available to the
eye-tracking research community free of charge. If you use this software in
your research, please reference one of the papers mentioned above, depending
on the aspect of the implementation that you use.
Two types of eye trackers were employed to work with this
software: a) any non-wearable Tobii eye tracker (the code
was tested with the x120 system), and b) the non-commercial web-camera-based
ITU Gaze Tracker, an eye tracker with an open-source implementation.
The executable for Gaze Tracker 2.0 beta is included with the distribution
for convenience and with permission from the ITU group. It is possible to
connect other types of eye trackers to iGaze; however, in that case the
code must be modified.
Download the iGaze source code here. By downloading
the software, you agree to the copyright notice at
the bottom of this page.
The compressed files are password protected. Please email Dr. Oleg
Komogortsev for the password, indicating your university/industry affiliation
and briefly describing how you plan to use the software.
Please use the words "iGaze software" in the subject line.
Acknowledgment: special thanks to Mr. Corey Holland for the optimizations,
bug fixes, and GUI included in the software. Currently this project is funded
in part by NSF CAREER award #CNS-1250718, in part by grant #60NANB12D234 from
the National Institute of Standards and Technology, and by funds from Texas
State University. In the past this project was funded in part by grant
#60NANB10D213 from the National Institute of Standards and Technology.
COPYRIGHT NOTICE STARTS HERE--------------------------------------------------------------
Copyright © 2024 Texas State University
All rights reserved.
The software is distributed under the following license and is to be cited as:
O. V. Komogortsev, C. Holland, and J. Camou, Adaptive
Eye-gaze-guided Interfaces: Design and Performance Evaluation,
In Proceedings of the ACM Conference on Human Factors in Computing
Systems (CHI), Vancouver, BC, Canada, 2011, pp. 1255-1260.
or
D. H. Koh, M. Gowda, and O. V. Komogortsev, Real Time Eye
Movement Identification Protocol, In Proceedings of ACM
Conference on Human Factors in Computing Systems (CHI), Atlanta, GA,
2010, pp. 1-6.
or
D. H. Koh, S. A. M. Gowda, and O. V. Komogortsev, Input
evaluation of an Eye-Gaze-Guided interface: Kalman filter vs. Velocity
Threshold Eye Movement Identification, In Proceedings of ACM
Symposium on Engineering Interactive Computing Systems, Pittsburgh, PA,
USA, 2009, pp. 197-202.
IN NO EVENT
SHALL THE TEXAS STATE UNIVERSITY BE LIABLE TO ANY PARTY FOR
DIRECT, INDIRECT, SPECIAL, INCIDENTAL, OR CONSEQUENTIAL DAMAGES ARISING
OUT OF THE USE OF THIS SOFTWARE AND ITS DOCUMENTATION, EVEN IF THE TEXAS
STATE UNIVERSITY HAS BEEN ADVISED OF THE POSSIBILITY OF
SUCH DAMAGE.
THE TEXAS
STATE UNIVERSITY SPECIFICALLY DISCLAIMS ANY WARRANTIES,
INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY
AND FITNESS FOR A PARTICULAR PURPOSE. THE SOFTWARE IS PROVIDED ON AN "AS
IS" BASIS, AND THE TEXAS STATE UNIVERSITY HAS NO OBLIGATION
TO PROVIDE MAINTENANCE, SUPPORT, UPDATES, ENHANCEMENTS, OR
MODIFICATIONS.
COPYRIGHT NOTICE ENDS HERE-----------------------------------------------------------------