Highlights from the ATIA 2019 Conference – Part 1

The ATIA Conference provides great opportunities for learning, sharing, and networking in assistive technology, and the 2019 edition was no exception. Here’s the first of two posts featuring a few of the highlights.

I was fortunate to attend the ATIA 2019 Conference recently, held in Orlando, FL. Yes, it was nice to escape Michigan’s polar vortex and subzero temperatures, but even nicer was the chance to learn from others and connect with the great folks who work in the AT field.

Here are a few of the highlights from ATIA 2019. Note that there were about 400 presentations, 120 exhibitors, and 3,000 attendees, so this is just one tiny sample representing one person's ATIA experience. Even this tiny sample quickly got unwieldy in a single post, so I'll start with just the highlights related to mouse access, then follow up in a second post with highlights in some other topic areas.

“Hands-on” with some hands-free mice

A real strength of the ATIA Conference is the large number of exhibitors, offering a chance to try a wide range of devices and discuss characteristics with the people who make them.

Because I recently wrote several blog pieces on available hands-free mice, I was excited to see quite a few developers of those devices in the exhibit hall, and I was able to try out eight different ones. My original idea was actually to use KPR's Compass software to collect performance data on my use of each device, but in real life it didn't seem right to spring that on exhibitors. So in the end I just tried each one informally, talking with the developers and taking notes.

A major take-home point is that if you are helping end users choose a hands-free mouse (or any access interface), it is really important to actually try the device yourself. Even if you are not the target user for the device, some experience with it will help you understand what the device is asking of its user, and what it feels like to use it. As a regular mouse/trackpad user, I found some of these devices much more similar to what I am used to than others, basically allowing me to walk up and use them without any particular instruction or practice. Others provided a very new interaction paradigm, which took a bit of practice.

Here are a few notes from my trials with these devices. These reflect only my brief experiences with them, as a person with typical motor function, in a busy exhibit hall. Please take them with a grain of salt.

Wearable sensors and target trackers

In the general categories of wearable sensors and target trackers (often used with head control), I tried the HeadMouse Nano, Quha Zono, and Glassouse 1.2. All of these provided very precise cursor control in response to my head movements. The HeadMouse Nano remains a high-performing classic: it is very easy to set up and requires only a small dot sticker worn on the forehead. The tracking camera found my dot right away, and I was up and running with no need for calibration or any setup. The Zono and Glassouse both have a wearable sensor, with the Zono providing more flexibility in how and where that sensor can be mounted. The Glassouse is worn like eyeglasses (but with nothing in front of the eyes). It was a bit heavier than I had imagined, but it stayed put during my short trial and was comfortable. I used the Zono as a headset, but at least five other mounting options are available. The headset felt like it might be a little slippery on my head, but it stayed put just fine and the control was excellent.

Lip/chin joysticks

For lip/chin joysticks, I was able to try the IntegraMouse, Jouse3, and BJOY Chin. Control had a bit more of a learning curve than with the wearables above; perhaps there is a technical difference in how joystick signals are mapped to mouse movement. For example, with the Jouse and the BJOY there was a bit of a "dead zone" at the joystick center, so a tiny movement right near center would not move the mouse cursor at all. This could have advantages in some ways, but it felt different at first. The BJOY Chin is a bit clunkier, with the Jouse being more streamlined and better suited to movement using the lips. I did not experience a "dead zone" with the IntegraMouse; overall control with it was tight and precise, requiring very little movement of the stick but a fair amount of force. My impression is that you could have success with either the IntegraMouse or the Jouse with less head movement than the wearables above require, and you don't need to wear anything. But it would probably take some practice to get really good at controlling the cursor quickly and accurately.
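For readers curious about what a "dead zone" means mechanically, it can be sketched as a mapping from joystick deflection to cursor velocity. This is a minimal illustration with made-up parameter values, not the actual firmware behavior of any of the devices above:

```python
def joystick_to_cursor_velocity(x, dead_zone=0.15, gain=600.0):
    """Map a joystick axis deflection in [-1, 1] to cursor velocity (px/s).

    Deflections smaller than dead_zone produce no movement at all,
    which is why a tiny motion near the joystick center does nothing.
    Beyond the dead zone, velocity ramps up from zero so there is no
    sudden jump at the edge. All values here are illustrative.
    """
    if abs(x) < dead_zone:
        return 0.0
    sign = 1.0 if x > 0 else -1.0
    # Rescale the remaining deflection so velocity starts at 0 at the
    # dead-zone boundary and reaches full gain at full deflection.
    return sign * gain * (abs(x) - dead_zone) / (1.0 - dead_zone)
```

A larger dead zone makes the device more forgiving of small unintended movements, at the cost of requiring a bigger deliberate movement to start the cursor moving, which may explain why it "felt different at first."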

Eye- and face-tracking

Finally, I spent some time at the Tobii Dynavox booth, trying both eye-tracking (for mouse control) and SmyleMouse (which is included as an access method with I-Series+ devices). I found these less intuitive than the other devices I tried. Eye-tracking required a short 9-target calibration, after which it seemed to follow my eyes pretty well. To do a mouse action, you first choose an action from the palette (such as left-click), then dwell with your eyes on the item you want to click on. (For eye-gaze typing, it can be set up to click automatically with every dwell, avoiding an extra gaze action per letter.) This sounds straightforward, but for whatever reason it was hard for me to consistently know what the eye-tracking system wanted me to do; my mental model needed to get more aligned with the system design. Similarly, with SmyleMouse I did achieve cursor control, but I didn't understand all the visual cues provided by the interface. These cues are intended to be helpful, and if I had taken the time to learn them, they might have been. The click-on-smile feature worked but required attention: you need to control your smiling and talking or accept unintended clicks.
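The dwell selection described above follows a common pattern: a click fires once the gaze has stayed within a small region for long enough, and looking away restarts the timer. Here is a rough sketch of that general logic; the radius and timing values are invented for illustration and are not Tobii Dynavox's actual parameters:

```python
import math

def detect_dwell(samples, radius=30.0, dwell_time=1.0):
    """Return the time at which a dwell click fires, or None.

    samples: ordered list of (t_seconds, x, y) gaze points. A click
    fires once gaze has stayed within `radius` pixels of an anchor
    point for `dwell_time` seconds. Parameters are illustrative only.
    """
    anchor = None
    for t, x, y in samples:
        if anchor is None:
            anchor = (t, x, y)
            continue
        t0, ax, ay = anchor
        if math.hypot(x - ax, y - ay) > radius:
            anchor = (t, x, y)   # gaze moved away: restart the timer
        elif t - t0 >= dwell_time:
            return t             # held steady long enough: click fires
    return None
```

The trade-off users feel directly: a short dwell time makes selection faster but produces unintended clicks whenever the gaze lingers, while a long dwell time is safer but fatiguing, which is part of why dwell-based systems take some adjustment.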

Again, these are just my initial impressions, from very limited use, and from the perspective of a person with typical motor function. Individuals really need to trial these devices for themselves, ideally collecting performance data as part of the trials. And we clearly need a better understanding of which hands-free devices work best for whom; if you know of research or data that addresses this, please let me know.

Mouse for iOS is (almost) here

People have been wondering for years if mouse-style access will ever be possible on iOS devices like the iPad. Since iOS does not really have the concept of a mouse cursor, by design, it has been a difficult technical and political problem to solve. But now it looks like the wait is almost over: two companies have announced intent to sell an iOS mouse adapter sometime in 2019. This means that any pointing device that supports USB, including many hands-free mice, will be usable on an iPad.

The first is the AbleNet Sidekick. This requires some sort of switch to perform clicks; dwell selection is not available at this time. AbleNet had this working at their booth; pretty cool to be able to move a mouse cursor around an iPad!

AbleNet Sidekick mouse adapter for iOS, shown with a Bigtrack pointing device.

The second is the AMAneo USB, made by CCS Microsystems in Germany. I saw it working at the Quha booth, where a Quha Zono was controlling an iPad.

AMAneo mouse adapter for iOS

The two iOS adapters seem very similar to each other, but there are a few differences. The Sidekick has a wired connection to the iPad, while the AMAneo connects via Bluetooth. The AMAneo USB also appears to have an adjustable dwell click feature, as well as an adjustable tremor compensation, which may be helpful for some individuals.

I do not know the price points for these — if you’ve heard anything, please let us know in the comments.

If you went to the ATIA 2019 Conference, what were some of your highlights? What topics would you like to see at ATIA in the future?

Stay tuned for Part 2 of these highlights, and see you at ATIA 2020!
