D2 Projects

D2 Development of TR Tools for Communications Technology

Task Leader:

Richard Simpson, PhD, ATP

Co-Investigator:

Heidi Koester, PhD

Collaborator(s):

Andy Jinks, MACCC-SLP, ATP

John Coltellaro, MS, ATP

Project Overview:

The proposed tools will make extensive use of the Infrastructure Core. VISYTER will allow multiple parties (client, local clinician, remote clinician, caregivers, family members) to participate in assessment and training sessions. VISYTER will also archive sessions to allow interested parties to review assessment and training activities after the fact. The ability to record training for later review will be used to provide self-guided training activities to clients. VISYTER will also allow clinicians to view a coordinated replay from multiple streams during a “live” (as opposed to archived) session. This will be especially useful for clients with very slow access methods or who are prone to fatigue. Currently, a clinician might have a client perform two or three trials while the clinician observes the client’s posture, then two or three more while observing the client’s actions with the input device (e.g., pointing device), and then two or three more while observing the screen. The ability to view coordinated replays from multiple viewpoints will allow the clinician to obtain that information from a single set of trials.

Secure Store and Forward Technology (SSFT) will be used to store information from data logging hardware and software. The tele-monitoring portal (part of the TR Portal) will be modified to provide appropriate displays for the data.
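To make the intended data path concrete, the sketch below batches logged event records, compresses them, and posts them to a store-and-forward service over HTTPS. The endpoint URL, credential, and payload fields are hypothetical placeholders; the actual SSFT interface is provided by the Infrastructure Core and may differ.

# Minimal sketch of forwarding logged events to a store-and-forward service.
# The endpoint, token, and payload fields are hypothetical placeholders.
import gzip
import json
import requests  # third-party HTTP client

SSFT_URL = "https://example.org/ssft/upload"   # hypothetical endpoint
API_TOKEN = "replace-with-issued-token"        # hypothetical credential

def upload_events(client_id, events):
    """Compress a batch of event records and post them to the SSFT."""
    payload = json.dumps({"client_id": client_id, "events": events}).encode("utf-8")
    response = requests.post(
        SSFT_URL,
        data=gzip.compress(payload),
        headers={
            "Authorization": f"Bearer {API_TOKEN}",
            "Content-Type": "application/json",
            "Content-Encoding": "gzip",
        },
        timeout=30,
    )
    response.raise_for_status()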

Project Objective(s):

The following specific aims will be pursued:

Method(s):

Interface Design Assistant (IDA)

Members of the investigative team developed Compass (Koester'97; Koester'98; LoPresti'02; Ashlock'03; Koester'06; Koester'07), a computerized battery of skills tests used to measure user performance with computer access devices, and the IDA (Koester'06; Simpson'06; Koester'07; Simpson'07; LoPresti'08; Koester, in press). IDA uses the results of Compass tests to recommend changes in system settings (e.g., mouse sensitivity, repeat keys) and the configuration of pointing devices, text entry devices, and switch interfaces.

The original Compass/IDA code was written to facilitate development of a web-based version consisting of lightweight stand-alone modules to provide asynchronous follow-up support to clients while minimizing additional costs to clients, clinicians or third-party payers. The web-based version of IDA will be integrated with the Tele-monitoring Portal. Clinicians will be able to program specific tests for a client, and the tests will provide recommendations on whether, and how, to modify the configuration of AAC and computer access devices to improve performance.
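As a rough illustration of this flow, the fragment below maps a few summary measures from a Compass-style test to configuration suggestions. The measure names, thresholds, and suggestion text are hypothetical and stand in for the actual IDA rule set.

# Illustrative mapping from Compass-style test measures to setting suggestions.
# Measure names and threshold values are hypothetical, not the IDA rules.
def recommend_settings(results):
    """Return a list of (setting, suggestion) pairs from summary test measures."""
    suggestions = []
    # Long dwell times on keys often produce unintended repeats.
    if results.get("mean_key_hold_time_ms", 0) > 500:
        suggestions.append(("repeat_delay", "increase key repeat delay or enable Filter Keys"))
    # Frequent overshoot of targets suggests the pointer is moving too fast.
    if results.get("target_overshoot_rate", 0) > 0.25:
        suggestions.append(("mouse_sensitivity", "reduce pointer speed / sensitivity"))
    # High miss rates on small targets suggest larger on-screen objects.
    if results.get("miss_rate_small_targets", 0) > 0.2:
        suggestions.append(("icon_size", "increase icon and button size"))
    return suggestions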

Computer Activity Monitor (CAM)

Language activity monitoring (LAM) features are already integrated into many AAC devices. Similar technology has been developed for use on computers, for reasons such as monitoring employee productivity, but not for clinical purposes. Collecting data on how, and how often, input devices are used can provide valuable outcomes information to clinicians, clients and third-party payers.

CAM software will be developed to record and timestamp all text entry and pointing activity and upload these data to the SSFT. The software will distinguish between multiple input devices. To protect privacy, the type of key that is entered during a text entry event (i.e., alphanumeric, modifier, function) will be recorded, but the actual key will not. Similarly, the location of the mouse cursor and the occurrence of button presses will be recorded, but the actual buttons, menus, and links that are selected will not. Several measures of pointing and text entry performance do not require knowledge of the intended target of the action (Hwang'02; Keates'02a; Keates'02b; Hwang'03; Hwang'04), and investigators have demonstrated that it is sometimes possible to evaluate performance in the absence of this information (Trewin'97; Trewin'99; Trewin'02; Trewin'04; Hurst'08).
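A minimal sketch of this kind of anonymized logging is shown below, assuming a cross-platform hook library such as pynput; the CAM software itself targets Windows directly and its record format may differ. Only the category of each keystroke and the coordinates of each click are written out, never the character typed or the control that was clicked.

# Sketch of anonymized activity logging, assuming the pynput hook library.
# Records event category and timing only, never key identity or click targets.
import json
import time
from pynput import keyboard, mouse

LOG_PATH = "cam_events.jsonl"  # hypothetical log file name
MODIFIER_PREFIXES = ("shift", "ctrl", "alt", "cmd")

def write_event(record):
    record["t"] = time.time()
    with open(LOG_PATH, "a") as f:
        f.write(json.dumps(record) + "\n")

def classify_key(key):
    """Reduce a key press to a category without storing which key it was."""
    if isinstance(key, keyboard.KeyCode):
        return "alphanumeric"
    if key.name.startswith(MODIFIER_PREFIXES):  # shift_l, ctrl_r, cmd, ...
        return "modifier"
    return "function_or_other"

def on_press(key):
    write_event({"event": "key", "category": classify_key(key)})

def on_click(x, y, button, pressed):
    if pressed:
        # Store only the cursor position, not the button/menu/link under it.
        write_event({"event": "click", "x": x, "y": y})

k_listener = keyboard.Listener(on_press=on_press)
m_listener = mouse.Listener(on_click=on_click)
k_listener.start()
m_listener.start()
k_listener.join()  # block so the hooks keep collecting events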

Expected Findings and Deliverables:

The figure below shows the proposed infrastructure for synchronous TR service delivery. Remote team members will interact with the client and local team members through VISYTER using multiple cameras focused on the client, the client's activation site(s), and the input device(s). The remote team members will view the screen of the AAC device or PC through a camera or by screen sharing. PCs and PC-based AAC devices will also directly record input and output events.

Asynchronous TR tools for use between interactive clinical sessions will also be explored. LAM is already integrated into many AAC devices. CAM software will provide similar capabilities for computer access interventions and for PC-based AAC devices. IDA will provide recommendations regarding the configuration of a client’s computer or AAC device.

Service Delivery Scenarios


Figure: Service delivery scenarios. In panel (a), the client is using an AAC device that is not PC-based. One camera focuses on the device, one on the client's activation method, and one on the client's face and body. In panel (b), the client is using a PC or a PC-based AAC device that can share its screen via the Internet.


Project Update

2012

We developed a version of the CAM software that is compatible with Windows XP, Vista, and 7, and we have verified its compatibility with several computer access solutions. The software records the occurrence of text entry and pointing events (but not the content of those events) and stores a record in a data file. Initial testing indicated that the data file can grow very large, so the software was modified to periodically compress the data file. Some software (speech recognition in particular) does not generate "standard" keyboard and mouse events; in those cases, CAM can record the duration of computer use but cannot distinguish the exact type of actions being performed. The software has been made available through the RERC website.
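The rotation scheme below illustrates one way the periodic compression could work, gzipping the active log once it passes a size threshold; the file name and threshold are placeholders rather than the values used in the released software.

# Illustrative log rotation: compress the active log once it grows too large.
# The file name and the size threshold are placeholders.
import gzip
import os
import shutil
import time

LOG_PATH = "cam_events.jsonl"
MAX_BYTES = 5 * 1024 * 1024  # rotate after ~5 MB (placeholder threshold)

def rotate_if_large(path=LOG_PATH, max_bytes=MAX_BYTES):
    """Gzip the current log into a timestamped archive and start a fresh file."""
    if not os.path.exists(path) or os.path.getsize(path) < max_bytes:
        return
    archive = f"{path}.{int(time.time())}.gz"
    with open(path, "rb") as src, gzip.open(archive, "wb") as dst:
        shutil.copyfileobj(src, dst)
    open(path, "w").close()  # truncate the active log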

The IDA pointing wizard has been further modified to make recommendations about the size of objects (i.e., icons, menus, buttons). Different solutions had to be implemented for Windows XP, Vista and 7 to accommodate quirks in each operating system. The pointing wizard and keyboard wizard are both functional and ready to be deployed on subjects' computers.
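To give a rough sense of the object-size logic, the sketch below takes the spread of click endpoints observed around trial targets and suggests a minimum object size that would capture most of them. The 95th-percentile rule and pixel values are illustrative assumptions, not the wizard's actual algorithm.

# Illustrative target-size recommendation from observed pointing error.
# The percentile rule and minimum size are assumptions, not the IDA algorithm.
import math

def recommend_target_size(endpoint_errors_px, coverage=0.95, minimum_px=16):
    """Suggest an object size (in pixels) large enough to capture most clicks.

    endpoint_errors_px: distances (pixels) between each click and its target center.
    """
    if not endpoint_errors_px:
        return minimum_px
    ordered = sorted(endpoint_errors_px)
    idx = min(len(ordered) - 1, math.ceil(coverage * len(ordered)) - 1)
    radius = ordered[idx]
    # A square object of side 2 * radius would contain that fraction of clicks.
    return max(minimum_px, int(math.ceil(2 * radius)))

# Example: errors observed during Compass-style pointing trials (pixels)
print(recommend_target_size([3, 5, 4, 8, 12, 6, 7, 10, 5, 9]))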

2011

During this reporting period, we developed a version of the CAM software that is compatible with Windows XP, Vista, and 7, and we verified its compatibility with several computer access solutions. The software records the occurrence of text entry and pointing events (but not the content of those events) and stores a record in a data file. Initial testing indicated that the data file can grow very large, so the software was modified to periodically compress the data file. Some software (speech recognition in particular) does not generate "standard" keyboard and mouse events; in those cases, CAM can record the duration of computer use but cannot distinguish the exact type of actions being performed.

The IDA pointing wizard has been further modified to make recommendations about the size of objects (i.e., icons, menus, and buttons). Different solutions had to be implemented for Windows XP, Vista and 7 to accommodate quirks in each operating system. The pointing wizard and keyboard wizard are both functional and ready to be deployed on subjects' computers.


2010

During this reporting period, we have identified several potential solutions for the CAM software. The AAC Institute has developed keystroke tracking software, and an open-source application that tracks keyboard and mouse activities has been developed for research in human-computer interaction. Neither of these exactly matches our design needs, however. We are examining both programs to determine whether we can modify them to meet our needs. Specifically, we would need to modify either program to record keyboard and mouse activity while maintaining the subject's privacy (e.g., recording that a keystroke occurred but not which keystroke it was, and recording that a button or menu item was selected but not which one).

We have also established design requirements for the web-based IDA software. We have implemented and tested a "keyboard wizard" that observes the user entering text and makes recommendations about appropriate changes to the accessibility settings for the keyboard. We have also implemented a "pointing wizard" and are currently engaged in usability testing. Both the keyboard wizard and the pointing wizard should have broad utility outside of this project.
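As a loose illustration of what such observation might involve, the sketch below flags likely unintended key repeats from inter-keystroke timing and suggests a Filter Keys style change; the threshold and the suggestion wording are assumptions, not the keyboard wizard's actual analysis.

# Illustrative timing analysis in the spirit of the keyboard wizard.
# The 80 ms threshold and the suggestion text are assumptions.
def suggest_keyboard_settings(key_timestamps_ms, repeat_threshold_ms=80):
    """Flag bursts of very fast successive keystrokes as likely unintended repeats."""
    intervals = [b - a for a, b in zip(key_timestamps_ms, key_timestamps_ms[1:])]
    fast = sum(1 for dt in intervals if dt < repeat_threshold_ms)
    if intervals and fast / len(intervals) > 0.1:
        return "Consider enabling Filter Keys or increasing the key repeat delay."
    return "No keyboard accessibility changes suggested."

# Example: keystroke timestamps (ms) containing several near-simultaneous extras
print(suggest_keyboard_settings([0, 30, 400, 820, 850, 1200, 1230, 1700]))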

