Wednesday, March 6, 2013

Samsung's Galaxy S IV Smartphone Could Have Eye Scrolling

Ref: http://mashable.com/2013/03/04/samsung-galaxy-s4-eye-scroll/

So, not a bad idea as you can see.

Thanks to Ruwan for spotting this :)

Monday, October 15, 2012

Using Camera API

The Android SDK provides access to the built-in camera. Using the camera to take photos is relatively easy; it is somewhat harder to set up the camera preview to work properly.

In our main activity, we create the Preview object. This object will create the Camera object and return it to the CameraDemo activity.
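The code listing this paragraph refers to appears to be missing from the post, so here is a minimal sketch of what such a Preview class might look like. It uses the legacy android.hardware.Camera API (current at the time of writing, deprecated since API 21); the class name follows the text, but the method bodies are a reconstruction, not the original listing.

```java
import android.content.Context;
import android.hardware.Camera;
import android.view.SurfaceHolder;
import android.view.SurfaceView;

class Preview extends SurfaceView implements SurfaceHolder.Callback {
    Camera camera;

    Preview(Context context) {
        super(context);
        // Register so we are notified when the underlying surface
        // is created and destroyed.
        getHolder().addCallback(this);
    }

    public void surfaceCreated(SurfaceHolder holder) {
        camera = Camera.open();               // acquire the camera hardware
        try {
            camera.setPreviewDisplay(holder); // draw preview frames onto this surface
            camera.startPreview();
        } catch (Exception e) {
            camera.release();
            camera = null;
        }
    }

    public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
        // A real implementation would pick a supported preview size here.
    }

    public void surfaceDestroyed(SurfaceHolder holder) {
        if (camera != null) {
            camera.stopPreview();
            camera.release();                 // the camera is a shared resource
            camera = null;
        }
    }
}
```

The activity would then instantiate this Preview and use the Camera object it acquires.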


Next we register a couple of callback methods with the Camera, to be invoked when the user takes a photo.

shutterCallback is called when the shutter opens and the picture is taken. rawCallback and jpegCallback receive the data for the raw and JPEG encodings of the photo. It's up to you to do something with this data, such as saving it to the SD card.
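The original callback listing is also missing from the post; the sketch below is an illustrative reconstruction, assuming the legacy android.hardware.Camera API. The callback names follow the text; the bodies and the output path are assumptions.

```java
import android.hardware.Camera;
import android.hardware.Camera.PictureCallback;
import android.hardware.Camera.ShutterCallback;
import java.io.FileOutputStream;

public class PhotoCallbacks {
    ShutterCallback shutterCallback = new ShutterCallback() {
        public void onShutter() {
            // Called at the moment of capture; often used to play a shutter sound.
        }
    };

    PictureCallback rawCallback = new PictureCallback() {
        public void onPictureTaken(byte[] data, Camera camera) {
            // Uncompressed sensor data; may be null on devices without raw support.
        }
    };

    PictureCallback jpegCallback = new PictureCallback() {
        public void onPictureTaken(byte[] data, Camera camera) {
            // Illustrative: write the JPEG bytes to the SD card.
            try (FileOutputStream out = new FileOutputStream("/sdcard/photo.jpg")) {
                out.write(data);         // JPEG-encoded photo bytes
            } catch (Exception e) {
                // handle the I/O failure
            }
            camera.startPreview();       // preview stops after a capture; restart it
        }
    };

    void takePhoto(Camera camera) {
        camera.takePicture(shutterCallback, rawCallback, jpegCallback);
    }
}
```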

Sunday, July 22, 2012

The Camera class in Android

The Camera class is used to set image capture settings, start/stop preview, snap pictures, and retrieve frames for encoding for video. This class is a client for the Camera service, which manages the actual camera hardware.

http://developer.android.com/reference/android/hardware/Camera.html#setPreviewCallback%28android.hardware.Camera.PreviewCallback%29
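As a minimal sketch of the setPreviewCallback() hook linked above: it delivers each preview frame as a byte array, which is where an image-processing pipeline (for example, an eye tracker) would plug in. The handler body here is illustrative.

```java
// Per-frame access to preview data. Frames arrive in the preview pixel
// format (NV21 by default on most devices).
camera.setPreviewCallback(new Camera.PreviewCallback() {
    public void onPreviewFrame(byte[] data, Camera camera) {
        // 'data' holds one preview frame; hand heavy processing
        // off to a worker thread rather than blocking this callback.
    }
});
```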

Tuesday, July 10, 2012

How Does an Eye Tracker Work?


Most commercial eye-tracking systems available today measure point-of-regard by the “corneal-reflection/pupil-centre” method (Goldberg & Wichansky, 2003). These trackers usually consist of a standard desktop computer with an infrared camera mounted beneath (or next to) a display monitor, plus image-processing software to locate and identify the features of the eye used for tracking. In operation, infrared light from an LED embedded in the camera is first directed into the eye to create strong reflections in the target eye features, making them easier to track (infrared light is used to avoid dazzling the user with visible light). The light enters the retina and a large proportion of it is reflected back, making the pupil appear as a bright, well-defined disc (known as the “bright pupil” effect). The corneal reflection (or first Purkinje image) is also generated by the infrared light, appearing as a small but sharp glint.

Once the image-processing software has identified the centre of the pupil and the location of the corneal reflection, the vector between them is measured, and, with further trigonometric calculations, point-of-regard can be found. Although it is possible to determine approximate point-of-regard from the corneal reflection alone, tracking both features critically allows eye movements to be dissociated from head movements (Duchowski, 2003; Jacob & Karn, 2003).
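A worked sketch of the vector step: compute the pupil-centre/corneal-reflection difference vector, then map it to screen coordinates. The linear mapping below stands in for the trigonometric model, and all pixel coordinates and coefficients are made up for illustration (in practice the coefficients come from calibration).

```java
public class GazeVector {
    // Difference vector between pupil centre (px, py) and corneal
    // reflection (cx, cy), both in camera-image pixel coordinates.
    static double[] vector(double px, double py, double cx, double cy) {
        return new double[] { px - cx, py - cy };
    }

    // A linear mapping from the vector to screen coordinates, standing in
    // for the trigonometric model; screenX = a0 + a1*vx + a2*vy, and
    // likewise for screenY.
    static double[] pointOfRegard(double[] v, double[] coeffX, double[] coeffY) {
        double x = coeffX[0] + coeffX[1] * v[0] + coeffX[2] * v[1];
        double y = coeffY[0] + coeffY[1] * v[0] + coeffY[2] * v[1];
        return new double[] { x, y };
    }

    public static void main(String[] args) {
        // Pupil at (320, 240), glint at (310, 236) -> vector (10, 4).
        double[] v = vector(320, 240, 310, 236);
        double[] por = pointOfRegard(v,
                new double[] { 960, 40, 0 },   // illustrative coefficients
                new double[] { 540, 0, 40 });
        System.out.printf("gaze at (%.0f, %.0f)%n", por[0], por[1]);
    }
}
```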

Video-based eye trackers need to be fine-tuned to the particularities of each person’s eye movements by a “calibration” process. Calibration works by displaying a dot on the screen; if the eye fixates for longer than a certain threshold time and within a certain area, the system records the pupil-centre/corneal-reflection relationship as corresponding to a specific (x, y) coordinate on the screen. This is repeated over a 9- to 13-point grid pattern to obtain an accurate calibration over the whole screen (Goldberg & Wichansky, 2003).
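The calibration step can be sketched as fitting a mapping from recorded vectors to the known dot positions. A real tracker uses the full 9-13 point grid and a least-squares (often polynomial) fit; the sketch below fits an affine model exactly from just three points to keep the example short. All numbers are illustrative.

```java
public class Calibration {
    // vectors[i] = {vx, vy} measured while the user fixates screen[i] = {sx, sy}.
    // Returns {coeffX, coeffY}, each {a0, a1, a2} so that
    // screen coordinate = a0 + a1*vx + a2*vy.
    static double[][] calibrate(double[][] vectors, double[][] screen) {
        double[][] m = new double[3][3];
        double[] bx = new double[3], by = new double[3];
        for (int i = 0; i < 3; i++) {
            m[i] = new double[] { 1, vectors[i][0], vectors[i][1] };
            bx[i] = screen[i][0];
            by[i] = screen[i][1];
        }
        return new double[][] { solve3(m, bx), solve3(m, by) };
    }

    // Solve the 3x3 system M * a = b by Cramer's rule.
    static double[] solve3(double[][] m, double[] b) {
        double det = det3(m);
        double[] a = new double[3];
        for (int col = 0; col < 3; col++) {
            double[][] mc = new double[3][3];
            for (int r = 0; r < 3; r++)
                for (int c = 0; c < 3; c++)
                    mc[r][c] = (c == col) ? b[r] : m[r][c];
            a[col] = det3(mc) / det;
        }
        return a;
    }

    static double det3(double[][] m) {
        return m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
             - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
             + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]);
    }

    public static void main(String[] args) {
        // Three calibration samples: vector measured vs. dot shown.
        double[][] vectors = { { 0, 0 }, { 10, 0 }, { 0, 8 } };
        double[][] screen  = { { 960, 540 }, { 1560, 540 }, { 960, 940 } };
        double[][] coeff = calibrate(vectors, screen);
        // Apply the fitted model to a new vector (5, 4).
        double x = coeff[0][0] + coeff[0][1] * 5 + coeff[0][2] * 4;
        double y = coeff[1][0] + coeff[1][1] * 5 + coeff[1][2] * 4;
        System.out.printf("(%.0f, %.0f)%n", x, y);
    }
}
```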

Eye Tracking Technologies and Techniques


The most widely used current designs are video-based eye trackers. A camera focuses on one or both eyes and records their movement as the viewer looks at some kind of stimulus. Most modern eye trackers use the center of the pupil and infrared / near-infrared non-collimated light to create corneal reflections (CR). The vector between the pupil center and the corneal reflections can be used to compute the point of regard on a surface or the gaze direction. A simple calibration procedure for the individual is usually needed before using the eye tracker.
Two general types of eye tracking techniques are used: Bright Pupil and Dark Pupil. Their difference is based on the location of the illumination source with respect to the optics. If the illumination is coaxial with the optical path, then the eye acts as a retroreflector as the light reflects off the retina creating a bright pupil effect similar to red eye. If the illumination source is offset from the optical path, then the pupil appears dark because the retroreflection from the retina is directed away from the camera.
Bright Pupil tracking creates greater iris/pupil contrast, allowing more robust eye tracking with all iris pigmentations, and greatly reduces interference caused by eyelashes and other obscuring features. It also allows tracking in lighting conditions ranging from total darkness to very bright. However, bright pupil techniques are not effective for tracking outdoors, as extraneous IR sources interfere with monitoring.
Eye tracking setups vary greatly; some are head-mounted, some require the head to be stable (for example, with a chin rest), and some function remotely and automatically track the head during motion. Most use a sampling rate of at least 30 Hz. Although 50/60 Hz is most common, today many video-based eye trackers run at 240, 350 or even 1000/1250 Hz, which is needed in order to capture the detail of the very rapid eye movement during reading, or during studies of neurology.
Eye movement is typically divided into fixations, when the eye gaze pauses in a certain position, and saccades, when it moves to another position. The resulting series of fixations and saccades is called a scanpath. Most information from the eye is made available during a fixation, not during a saccade. The central one or two degrees of the visual angle provide the bulk of visual information; the input from larger eccentricities (the periphery) is less informative. Hence, the locations of fixations along a scanpath show which loci on the stimulus were processed during an eye-tracking session. On average, fixations last around 200 ms during the reading of linguistic text and 350 ms during the viewing of a scene. Preparing a saccade towards a new goal takes around 200 ms.
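Splitting a gaze recording into fixations and saccades can be sketched with a simple dispersion-threshold (I-DT) detector: samples whose x/y spread stays under a threshold for a minimum duration are grouped into one fixation, and the jumps between groups are the saccades. The thresholds and sample data below are illustrative.

```java
import java.util.ArrayList;
import java.util.List;

public class FixationDetector {
    // p[i] = {x, y} gaze samples; minSamples approximates the minimum
    // fixation duration (e.g. 100 ms at 60 Hz is 6 samples).
    // Returns {startIndex, endIndex} pairs, one per detected fixation.
    static List<int[]> fixations(double[][] p, double maxDispersion, int minSamples) {
        List<int[]> out = new ArrayList<>();
        int start = 0;
        while (start + minSamples <= p.length) {
            int end = start + minSamples;
            if (dispersion(p, start, end) <= maxDispersion) {
                // Grow the window while dispersion stays under the threshold.
                while (end < p.length && dispersion(p, start, end + 1) <= maxDispersion)
                    end++;
                out.add(new int[] { start, end - 1 });
                start = end;
            } else {
                start++;  // slide past a saccade sample
            }
        }
        return out;
    }

    // Dispersion = (max x - min x) + (max y - min y) over samples [from, to).
    static double dispersion(double[][] p, int from, int to) {
        double minX = p[from][0], maxX = minX, minY = p[from][1], maxY = minY;
        for (int i = from + 1; i < to; i++) {
            minX = Math.min(minX, p[i][0]); maxX = Math.max(maxX, p[i][0]);
            minY = Math.min(minY, p[i][1]); maxY = Math.max(maxY, p[i][1]);
        }
        return (maxX - minX) + (maxY - minY);
    }

    public static void main(String[] args) {
        double[][] gaze = {
            { 100, 100 }, { 101, 99 }, { 100, 101 }, { 102, 100 }, // fixation 1
            { 300, 300 },                                          // saccade sample
            { 400, 400 }, { 401, 401 }, { 399, 400 }, { 400, 399 } // fixation 2
        };
        for (int[] f : fixations(gaze, 5.0, 3))
            System.out.println("fixation: samples " + f[0] + "-" + f[1]);
    }
}
```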
Scanpaths are useful for analyzing cognitive intent, interest, and salience. Other biological factors (some as simple as gender) may affect the scanpath as well. Eye tracking in HCI typically investigates the scanpath for usability purposes, or as a method of input in gaze-contingent displays, also known as gaze-based interfaces.