THE EYES HAVE IT – EYE TRACKING EVOLUTION IN MEDICAL DEVICE DEVELOPMENT

Citation: Lance P, Seeney P, “The Eyes Have It – Eye Tracking Evolution in Medical Device Development”. ONdrugDelivery Magazine, Issue 92 (Dec 2018), pp 36-40.

Philip Lance and Phil Seeney discuss the recent surge in the use of eye-tracking technology in the field of human factors, providing an overview of the technique’s history and how it may be applied to medical devices in the future.

RECENT GROWTH OF EYE TRACKING

Today we are seeing a rapid increase in the use of eye tracking technology in the development of products across multiple industries, reflected by the surge of publications on the subject (Figure 1). This adoption of eye tracking is the consequence of decades of work developing the technology to a point where it is now more effective and affordable. Certain industries, notably aerospace, automotive, marketing and the human-computer interaction sciences, derive considerable benefit from using eye tracking when developing user interfaces. However, the medical device industry has been slower to adopt and exploit the potential of eye tracking.

Figure 1: A histogram of publications relevant to eye tracking data visualisation techniques, showing that the number of published articles, conference papers and books has increased greatly during the last decade.1

WHAT IS EYE TRACKING?

Eye tracking is a tool used to measure and record the eye movements and gaze positions of an individual as they perform a task or use a device or piece of equipment. These data help us to understand how different designs are actually being used, and then infer how users are interpreting instructions and engaging (or not) with the product, allowing us to identify and address user issues or problems.

Whilst eye trackers differ in form, the fundamental logic is common: capture where the eye focuses its attention and capture the movement between these points of attention. Typical language to describe this is:

“Eye tracking is a tool used to measure and record the eye movements and gaze positions of an individual as they perform a task or use a device or piece of equipment…”

  • Gaze: Where the eye is looking.
  • Fixation: Where the gaze pauses in a particular position/on a particular area. Most of the information the eye collects is gathered during fixations.
  • Saccade: The rapid movement (jumps) between fixations. There is little to no information collected during a saccade.
  • Scan path: A sequence of fixations and saccades, also known as a gaze plot.
  • Area of Interest (AOI): An area or region on the object being tested that is important for a design hypothesis.
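To make these definitions concrete, the sketch below shows one common way fixations and saccades can be separated from raw gaze samples, using a dispersion-threshold approach of the kind described in the eye tracking literature. The sample format and the threshold values are illustrative assumptions, not those of any particular eye tracker.

```python
# Illustrative dispersion-threshold fixation detection.
# Assumes gaze samples arrive as (timestamp_ms, x, y) tuples;
# the thresholds are example values that would need tuning to
# the tracker, viewing distance and task.

def detect_fixations(samples, max_dispersion=30.0, min_duration_ms=100.0):
    """Group consecutive gaze samples into fixations.

    A window of samples counts as a fixation if its spatial spread
    (x-range plus y-range) stays under max_dispersion and it lasts at
    least min_duration_ms; the jumps between fixations are saccades.
    """
    fixations = []
    i = 0
    while i < len(samples):
        j = i
        # Grow the window while the gaze stays tightly clustered.
        while j + 1 < len(samples):
            window = samples[i:j + 2]
            xs = [s[1] for s in window]
            ys = [s[2] for s in window]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            j += 1
        duration = samples[j][0] - samples[i][0]
        if duration >= min_duration_ms:
            window = samples[i:j + 1]
            cx = sum(s[1] for s in window) / len(window)
            cy = sum(s[2] for s in window) / len(window)
            fixations.append({"x": cx, "y": cy,
                              "t_ms": samples[i][0],
                              "duration_ms": duration})
            i = j + 1  # resume after the fixation just recorded
        else:
            i += 1  # window too brief to be a fixation; slide on
    return fixations
```

Read in order, the fixations returned form the scan path, and any fixation whose co-ordinates fall inside a defined rectangle can be attributed to that AOI.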

With current eye tracking technology, eye movements across an object can be observed and partially evaluated in real time. This enables human-factors moderators to observe the gaze of the study participant as it happens (Figure 2).

Figure 2: Eye tracking equipment being used in a usability study to observe and monitor.

This observation provides the moderator with an insight into the potential reasoning or thought processes the study participant is following. For example, a participant may repeatedly return their gaze to a feature on the device, suggesting there is something about this feature that they find confusing or unsettling. Using these insights, the moderator can explore with the participant why they are concerned with a feature. With this information it is usually possible for designers and engineers to improve the device’s affordance (the property of a user interface to intuitively imply its function).

The power of eye tracking in such situations can be increased by integrating it with other biometric technologies that gather bio-signals, such as electroencephalogram (EEG), electrocardiogram (ECG) and galvanic skin response (GSR), also known as electrodermal activity (EDA). These bio-signals help to provide insights into the emotional state of the participant. For instance:

  • EEG monitors the electrical activity of the brain. This can be used to help identify if, when an individual looks at a feature, they have increased concentration or not.
  • ECG monitors electrical heart activity, which can indicate the stress levels of the participant, and help recognise if an individual found looking at a feature taxing.
  • GSR monitors the electrical characteristics of the skin. Skin conductance increases as sweat gland activity increases. Sweat glands are controlled by the sympathetic nervous system, which responds to psychological or physiological arousal. Thus, by monitoring GSR it is possible to identify when a participant looks at a feature and becomes more emotionally aroused. It is not yet possible to tell whether the emotion is positive, such as “happy”, or negative, such as “angry”.

A range of biometric sensors has been used successfully in several studies; however, certain biometric technologies require sensors to be attached to the participant, and sometimes this level of intrusion can be detrimental to the aims of the study, so their use must be carefully considered. Of these bio-signal gathering devices, GSR requires only a couple of sensors strapped to the fingers and, used judiciously, its results have been found to be very insightful when used in conjunction with eye tracking.
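As an illustration of how the two data streams can be combined, the sketch below aligns timestamped GSR readings with fixations that land inside an area of interest. The data structures, the rectangle test and the one-second conductance latency are illustrative assumptions; commercial biometric suites perform this synchronisation automatically.

```python
# Illustrative alignment of GSR readings with fixations on an AOI.
# Both streams are assumed to share a clock. Skin conductance lags
# the triggering stimulus, so the window is shifted by an assumed
# latency (the value used here is only an example).

def arousal_during_fixations(fixations, gsr, aoi, latency_ms=1000):
    """Mean skin conductance during each fixation inside the AOI.

    fixations: dicts with 't_ms', 'duration_ms', 'x', 'y'
    gsr:       (timestamp_ms, conductance_microsiemens) tuples
    aoi:       (x_min, y_min, x_max, y_max) rectangle
    """
    x0, y0, x1, y1 = aoi
    results = []
    for f in fixations:
        if not (x0 <= f["x"] <= x1 and y0 <= f["y"] <= y1):
            continue  # fixation fell outside the area of interest
        start = f["t_ms"] + latency_ms  # shift for conductance lag
        end = start + f["duration_ms"]
        readings = [g for t, g in gsr if start <= t <= end]
        if readings:
            results.append(sum(readings) / len(readings))
    return results
```

A run of elevated values from such an analysis would flag the AOI as one that arouses participants, prompting the moderator to probe why.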

Further insights can be obtained through the analysis of the eye tracking data on fixations and saccades. A popular technique for analysing scan paths is the creation of a heat map, whereby the scan paths of a number of participants are overlaid to identify where there is commonality. Such analyses can inform and provide a deeper understanding of the human-device interaction (Figure 3).

Figure 3: Analysing the eye tracking data from study participants.
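One way such a heat map might be built is sketched below: duration-weighted fixations from every participant are accumulated onto a pixel grid and then smoothed for display. The screen dimensions, dwell-time weighting and blur radius are illustrative choices.

```python
# Illustrative heat-map aggregation across participants. Each fixation
# deposits its dwell time at its pixel; Gaussian smoothing then turns
# the point counts into the familiar "hot spot" rendering.
import numpy as np
from scipy.ndimage import gaussian_filter

def build_heat_map(all_fixations, width=1920, height=1080, sigma=40):
    """all_fixations: one list of fixation dicts per participant."""
    grid = np.zeros((height, width))
    for participant in all_fixations:
        for f in participant:
            x, y = int(f["x"]), int(f["y"])
            if 0 <= x < width and 0 <= y < height:
                grid[y, x] += f["duration_ms"]  # weight by dwell time
    return gaussian_filter(grid, sigma=sigma)
```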

An example of an insight this technique can provide is the identification of a design flaw often referred to as the “vampire effect”. This is where an element of the design draws the attention of the user away from more important interactional elements of the device’s user interface, potentially increasing the risk of errors being made by the user.2 Once an element causing a vampire effect has been identified, that aspect of the design, be it in the instructions for use (IFU) or the product itself, can be redesigned to remove the distraction and improve the affordance of the user interface. Furthermore, a follow-up heat map can demonstrate, and potentially quantify, the degree of improvement achieved.
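Quantifying such an improvement can be as simple as comparing, before and after the redesign, the share of total dwell time each AOI captures. A minimal sketch, reusing the fixation dictionaries and AOI rectangles assumed above:

```python
# Illustrative dwell-time share per AOI, comparable across design
# iterations of the same user interface.

def dwell_share(fixations, aois):
    """aois: dict mapping name -> (x_min, y_min, x_max, y_max)."""
    totals = {name: 0.0 for name in aois}
    overall = 0.0
    for f in fixations:
        overall += f["duration_ms"]
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= f["x"] <= x1 and y0 <= f["y"] <= y1:
                totals[name] += f["duration_ms"]
    return {name: t / overall for name, t in totals.items()} if overall else totals
```

If the share of time spent on a decorative element falls whilst the share on, say, the dose window rises, the redesign has measurably reduced the distraction.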

A phrase commonly dropped into device requirements specifications is “must be easy to use”, but defining “easy to use” is, in itself, not easy. However, through the use of eye tracking, potential design improvements can be made and evaluated in a way that can demonstrate an improvement in ease of use, such as through the observed improvement in affordance.

WHERE HAS EYE TRACKING COME FROM?

With the recent increase in interest, it would be easy to assume the concept of eye tracking is relatively new. However, eye tracking dates back 150 years. Examples of this early work are cited by Yarbus, such as Lamansky’s 1869 work on eye movement when changing fixation points, and Landolt’s 1891 work on characterising the jerkiness of eye movements when studying stationary objects. This early eye tracking was done by observation, typically involving some use of optics such as an arrangement of mirrors and telescopes.3

The use of still and motion-picture photography was introduced to eye tracking research early in the twentieth century by Dodge and Cline in 1901, building on Lamansky’s work. Analysing a series of photographs, they found there is a slight quiver at the beginning and end of an eye movement, and identified that a pair of eyes do not move absolutely together.4 This work helped form the foundations of our understanding of how eyes move, which is the basis for how we track eye movement today. The use of a series of images quickly moved on to film. In 1905, Judd, McAllister and Steele used film to record the movement of eyes (Figure 4).5

Figure 4: Images from Judd, McAllister and Steele’s 1905 film, showing the eyes moving whilst the head is kept still.

It was in the late 1940s when eye movement research moved from characterising and understanding, to being applied to practical problems. Fitts, Jones and Milton of the US Air Force began studying pilots’ eye movements during instrument flight – specifically for the purpose of improving instrument and instrument panel design. According to them, this research “provides the answers to many questions encountered in designing aircraft instruments and instrument panels on which a large number of instruments must be arranged in the most effective way”.6 Using eye tracking, they determined how pilots’ eyes moved between and fixated on instruments (Figure 5).

However, due to the size and weight of early film cameras, the cameras that filmed the movement of the eyes could not be mounted on the user’s head. This was problematic as it prevented tracking of all eye movements: when users moved their head, the camera lost its clear view of the eyes. Head-mounted cameras only became a realistic proposition in the 1960s, when the first practical head-mounted eye-tracking equipment was produced.7

Figure 5: Eye tracking data collected and analysed in 1949 by Fitts, Jones and Milton, revealing a pilot’s eye movement between instruments on an aircraft’s instrument panel.

By the 1970s, eye tracking had progressed from solely studying eye movement to the first attempts at understanding what specific eye movements might mean, relating eye fixations to cognitive processes. In 1971, early pioneers in this work, Noton and Stark, identified and described “scan paths”8 and, in 1976, Just and Carpenter examined how the duration of a fixation might be linked to the mental processing involved. They also investigated scanning strategies, such as looking at a picture containing a number of coloured dots: they noticed that, whilst participants could not identify all the dots, they could still recognise that the highest proportion of them were red.9 By the 1980s, work on the relationship between eye movement and cognitive load measures was being conducted, such as the 1986 work by O’Donnell and Eggemeier.10

Since the 1980s, most developments in eye tracking have been in the domain of technology and engineering. With the improvement of sensors, materials and optics, eye tracking can now be carried out either remotely, without being intrusive to the study participant, or with a pair of simple glasses worn by the study participant, as in Figure 2. Additionally, the improved computing power, software and algorithms of recent years have enabled rapid and semi-automatic processing and analysis of results, as in Figure 3.

“A popular technique for analysing scan paths is the creation of a heat map, whereby the scan paths of a number of participants are overlaid to identify where there is commonality…”

However, the fundamentals concerned with the meaning and interpretation of eye movements have progressed very little in the last 50 years. Whilst our understanding of the relationship between eye tracking, attention and cognitive processing has developed incrementally since the 1970s, we are still only able to use eye tracking to help gain insights from study participants. Eye-tracking information without context is limited and potentially misleading. For example, suppose two participants (A and B) both read an IFU, and the eye-tracking data shows that A’s eyes followed each line systematically whilst B’s jumped around the instructions. Which participant understood the instructions? Could it be that A, having read each word in turn, understands the instructions, whilst B, who jumped about, missed a lot of detail? Or could it be that A was out of their depth and followed each line hoping to gain some understanding but was left confused, whilst B, already well versed in the subject, quickly scanned for key points just for confirmation? Could it be that both are confused, or that both fully understand the instructions? Currently, eye tracking data alone is not sufficient to answer these questions, and traditional human factors techniques are still needed to fully interpret the situation.

For eye tracking to reach its full potential, further breakthroughs in our understanding of the meaning of eye movements are needed.

FUTURE DEVELOPMENTS IN EYE TRACKING

“It is the early adopters of this technology that are leading the way, notably the aerospace and automotive industries, researching and obtaining a better understanding of the complex attentional and cognitive processes of users…”

Currently, there is increasing focus on improving our understanding of the meaning behind eye movements. Again, it is the early adopters of this technology that are leading the way, most notably the aerospace and automotive industries, researching and obtaining a better understanding of the complex attentional and cognitive processes of users. For instance, the aerospace industry is investigating more deeply how people monitor instruments and why there are lapses in that monitoring.11 The automotive industry is conducting work to better understand the cognitive load (also known as mental workload) a driver experiences whilst driving, and how different interactions and distractions affect it.12 Much of this work is driven by a wish to increase safety and reduce user-related errors/accidents. These are also major drivers in medical and drug delivery device development and parallel learnings can be obtained and positively applied.

However, there are limitations to the aerospace and automotive research approaches: working with comparatively small numbers of study participants cannot provide the quantity of data needed to develop the necessary depth of understanding. Fortunately, a possible answer to this constraint is provided by the emerging discipline of visual analytics, designed to process and analyse vast amounts of eye tracking data. Using complex algorithms, it allows the mining of vast databases, which will enable the common structures and strategies used by people to be uncovered.13 By combining miniaturised, low-cost eye tracking equipment with the power of artificial intelligence (AI) and machine learning, huge databases can be created and analysed to reveal insights into how subjects maintain attention and how they manage varying degrees of cognitive load.
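One established visual analytics device for mining scan paths at scale is to encode each path as a string of AOI labels, one letter per fixation, and compare paths by edit distance, yielding a similarity measure that clustering and machine learning methods can build on. A minimal sketch; the encoding and the use of plain Levenshtein distance are illustrative choices:

```python
# Illustrative scan-path comparison: paths encoded as AOI-label strings
# (e.g. "ABBC" = label, warnings, warnings, dose window) compared by
# Levenshtein edit distance via the classic dynamic programme.

def edit_distance(a, b):
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]
```

Across a large database, small pairwise distances group participants who share a strategy, systematic line-by-line readers versus key-point scanners, for instance, which is exactly the distinction at stake with Participants A and B above.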

Perhaps with visual analytics it will be possible for eye tracking to tell us whether, in the earlier example, it was Participant A or Participant B who understood the IFU. Thanks to the potential of low-cost electronics, powerful computers and AI, in the future we will have the capability to evaluate and assess the usability of a medical device’s user interface simply by issuing study participants with a miniaturised, non-intrusive eye tracking kit and then waiting for the AI to pass its judgement.

On this note perhaps, at some further point in the future, medical devices will have their own eye tracking capability and AI. Maybe then these devices will watch us and decide if we are using them correctly, and interactively train or coach us if we are not!

REFERENCES

  1. Blascheck T et al, “State-of-the-Art of Visualization for Eye Tracking Data”. Eurographics Conference on Visualization (EuroVis), Swansea University, UK, June 9-13, 2014.
  2. Manhartsberger M, Zellhofer N, “Eye tracking in usability research: What users really see,” Empowering Software Quality: How can Usability Engineering reach these goals? 1st Usability Symposium, HCI&UE Workgroup, Vienna, Austria, 8 Nov, 2005.
  3. Yarbus AL, Haigh B (translator), “Eye Movements and Vision”. Plenum Press, New York, 1967.
  4. Dodge R, Cline TS, “The angle velocity of eye movements”. Psychol Rev, 1901, Vol 8(2), pp 145–157.
  5. Judd CH, McAllister CN, Steele WM, “General introduction to a series of studies of eye movements by means of kinetoscopic photographs”. Psychol Rev Monographs, 1905, Vol 7(1), pp 1–16.
  6. Fitts PM, Jones RE, Milton JL, “Eye Fixations of Aircraft Pilots: III. Frequency, Duration, and Sequence of Fixations when Flying Air Force Ground-Controlled Approach System (GCA)”. USAF Technical Report No. 5967, Wright-Patterson Air Force Base, Dayton, Ohio, 1949.
  7. Shackel B, “Note on Mobile Eye Viewpoint Recording”. J Opt Soc Am, 1960, Vol 50, pp 763–768.
  8. Noton D, Stark L, “Scanpaths in saccadic eye movements while viewing and recognizing patterns”. Vision Research, 1971, Vol 11(9), pp 929–942.
  9. Just MA, Carpenter PA, “Eye Fixations and Cognitive Processes”. Cogn Psychol, 1976, Vol 8, pp 441–480.
  10. O’Donnell RD, Eggemeier FT, “Workload assessment methodology”. Handbook of Perception and Human Performance, Vol 2: Cognitive Processes and Performance, Boff KR, Thomas JP, Kaufman L (eds), John Wiley & Sons, 1986.
  11. Peysakhovich V et al, “The Neuroergonomics of Aircraft Cockpits: The Four Stages of Eye-Tracking Integration to Enhance Flight Safety”. Safety, Feb 2018, Vol 4(1), pp 8–23.
  12. Palinko O et al, “Estimating cognitive load using remote eye tracking in a driving simulation”. Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications (ETRA), Austin, Texas, US, Mar 22-24, 2010, pp 141–144.
  13. Andrienko G, Andrienko N, Burch M, Weiskopf D, “Visual analytics methodology for eye movement studies”. IEEE Trans Vis Comput Graph, Dec 2012, Vol 18(12), pp 2889–2898.