Journal of Rehabilitation Research & Development

Volume 45, Number 4, 2008
Pages 505–522

Distributed cognitive aid with scheduling and interactive task guidance

Edmund F. LoPresti, PhD;1* Richard C. Simpson, PhD, ATP;2-3 Ned Kirsch, PhD;4 Debra Schreckenghost, MS;5 Stephen Hayashi, BS3

1Assistive Technology Sciences, Pittsburgh, PA; 2Department of Rehabilitation Sciences and Technology, University of Pittsburgh, Pittsburgh, PA; 3Department of Veterans Affairs Pittsburgh Healthcare System, Pittsburgh, PA; 4Department of Physical Medicine and Rehabilitation, University of Michigan, Ann Arbor, MI; 5TRACLabs, Houston, TX

Abstract — A cognitive assistive technology system has been designed for use by people with memory and organizational impairments. This system will provide a distributed architecture for both scheduling assistance and task guidance, as well as intelligent, automatic replanning on the levels of both the schedule and individual tasks. A prototype of this architecture has been developed that focuses on interactive task guidance capabilities. Scheduling software has been developed but not fully integrated with the task guidance features. The system has been preliminarily tested through simulated trials, monitored use of the prototype in a clinical setting, and usability trials of the task-design interface with rehabilitation professionals. Participants were able to respond appropriately to cues provided by the system and complete prescribed tasks.

Key words: activities of daily living, assistive technology, brain injury, cognitive impairment, dementia, human-computer interface, man-machine systems, prospective memory, software design, task guidance.

Abbreviations: ADL = activities of daily living, AI = artificial intelligence, COACH = Cognitive Orthosis For Assisting Activities in the Home, COGORTH = cognition orthosis (programming language), HTML = hypertext markup language, HTN = hierarchical task net, PDA = personal digital assistant, PEAT = Planning Execution Assistant and Training, RAPS = Reactive Action Package System, SD = standard deviation, TBI = traumatic brain injury, XML = extensible markup language.
*Address all correspondence to Edmund F. LoPresti, PhD; AT Sciences, LLC, 160 N Craig St, Ste 117, Pittsburgh, PA 15213; 412-687-1181; fax: 412-687-1181. Email: edlopresti@at-sciences.com
DOI: 10.1682/JRRD.2007.06.0078

The ability to independently initiate and perform daily activities can be compromised by a variety of acquired neurological disorders and conditions. These conditions include traumatic brain injury (TBI), cerebrovascular accident, infectious and toxic encephalopathies, and dementia. The specific cognitive abilities affected can include activity planning, problem solving, self-initiation, attention, and prospective memory (the ability to remember activities that need to be performed and to carry out these activities at the appropriate time) [1-3]. These cognitive impairments can limit a person's independence in activities of daily living (ADL), as well as vocational, educational, and leisure activities.

External cueing systems can assist people with cognitive disabilities in the performance of ADL [4]. Such devices can remediate prospective memory limitations by reminding someone to perform a task at the appropriate time. A person may also need assistance with multistep activities because of problems remembering the steps in the task, problems with sequential processing (e.g., add flour and butter before adding water), or other difficulties [5-6]. Therefore, people may also benefit from a task-guidance system that provides instructions in sequence.

In addition to prospective memory and sequential processing limitations, a person may have difficulty with problem solving [7]. If events do not occur in the expected order, the person may not be able to recover by finding alternative ways to accomplish a task. In most cases, this limitation is shared by technology. However, artificial intelligence (AI) systems can adjust plans and instructional sequences in response to unexpected events or user confusion by reasoning about the constraints that specify the sequencing of steps in the plan or instruction, thus providing problem solving support for people with cognitive disabilities.

A number of devices and systems have been developed to provide reminders of scheduled events [8]. Typically, schedule data and the user interface exist on a single machine, such as a personal digital assistant (PDA). Most such devices do not provide for access to remote servers or distributed users and essentially function as an alarm clock. These devices and systems provide a single cue to remind the user to perform a given task, without providing a method for performing the task.

Several systems do provide some separation of the schedule data from the user interface. The ISAAC™ (Cogent Systems, Inc; Ft. Lauderdale, Florida) [9] and Jogger™ (Independent Concepts, Inc; Ambridge, Pennsylvania) [10] systems allow a clinician to create the user's schedule on one computer then download it to the user's device (e.g., downloading from a desktop computer to a PDA). The Jogger further allows for uploading patient activity response information from the PDA to the therapist's computer for outcomes tracking and analysis. The NeuroPage [11] and CellMinder (Institute for Cognitive Prosthetics; Bala Cynwyd, Pennsylvania) provide scheduling information remotely over a pager and cellular telephone, respectively.

Some systems, such as MAPS [12] and the Pocket Coach (AbleLink Technologies; Colorado Springs, Colorado) [6] provide step-by-step task guidance instead of scheduling assistance. Others have proposed combining the concept of task guidance with the separation of task data and the user interface [13].

Prior work in interactive task guidance for individuals with cognitive disabilities includes the COGORTH (from COGnition ORTHosis) programming language. COGORTH allows a clinician to design tasks for the client, including cues that would be displayed on a computer screen. These cues can be used to provide information about how to complete an activity, recover from errors or interruptions, and perform multiple activities simultaneously. In a series of studies, participants with memory impairments were able to perform tasks with reduced occurrence and severity of errors when receiving COGORTH cues, compared with a baseline condition with only written cues [14-15].

More recently, Kirsch et al. completed a feasibility study examining the effectiveness of alphanumeric paging for reminding [16]. This work used an e-mail and scheduling application (Groupwise©, Novell; Waltham, Massachusetts) to develop a prototype messaging system that was modifiable in real time. Specifically, Groupwise was used to send alphanumeric pages to a person with a TBI, with cues about making entries in a memory log. An ABA′ single-case design was used for the study. During prepager trials, the participant was reminded at the beginning of the day to make entries in his memory log when each therapy session concluded. During pager trials, he received an alphanumeric page 5 min before the end of every therapy session reminding him to write in his log, and during postpager trials, he again was asked to rely only on a verbal cue. Memory log entries increased dramatically during pager trials. Return to baseline was achieved (with some data overlap between conditions), suggesting that without ongoing intervention the memory log could not be maintained.

The Planning and Execution Assistant and Training (PEAT) System (Attention Control Systems, Inc; Mountain View, California) is a commercially available system that uses AI to automatically generate daily plans and to replan in response to unexpected events. Using manually entered appointments in conjunction with a library of ADL scripts, PEAT generates the best plan to complete all the required steps and assists with plan execution by using visual and auditory cues that alert the user to scheduled appointments. The user provides input to the device when a task has been completed or if more time is required to complete the task [7].

AI is also used for cognitive assistive technology by the Cognitive Orthosis for Assisting Activities in the Home (COACH) [17], an adaptable device to help people with dementia complete handwashing with less dependence on a caregiver. The COACH used artificial neural networks, plan recognition, and a single video camera connected to a computer to automatically monitor progress and provide prerecorded verbal prompts. It is able to automatically adapt its cueing strategies according to a user's preferences and past performance of the ADL and to play multiple levels of cue detail [17]. While the COACH is able to provide sophisticated monitoring and adaptive task guidance for this given task (handwashing), training it for a new task can be time intensive.

Another system that reasons about when to provide reminders is Autominder, which uses an automated planner to schedule and track reminders [18] and can perform operations such as allocating adequate time to complete tasks, checking through interaction whether required resources (e.g., tools) are available, and rescheduling tasks when circumstances change such that insufficient time is available to complete a task. Autominder incorporates a Plan Manager, which uses constraint-based temporal reasoning to store the schedule of required tasks; a Client Modeler, which makes inferences about events in the world based on available sensor data; and a Reminder Generator, which compares the user's schedule with sensed events to determine when a reminder is necessary. This allows the system to provide reminders only when they are needed and also allows for the user to remember on his or her own without receiving an unnecessary prompt. Autominder has been incorporated in the Personal Robotic Assistants for the Elderly (NurseBot) project, which uses mobile robots as a platform to deliver reminders and provide way-finding assistance [19].

The research described previously has addressed scheduling assistance, distributed systems, interactive task guidance, and adaptive planning. However, no system has combined these capabilities in a single cohesive system for aiding people with cognitive impairments. The goal of this study is to develop a single system called ICue that takes a distributed approach to providing both scheduling assistance and task guidance, as well as intelligent automatic replanning of both the schedule and the instructions for tasks in the schedule. By housing the primary schedule maintenance and task-guidance software on a central server, the user delivery platform is hardware independent, allowing users to choose whatever hardware best fits their needs (e.g., laptop, PDA, smart phone) or to use different hardware in different settings (e.g., home, work, community). This distributed architecture also aids the caregiver in remotely managing and monitoring client performance; the caregiver or rehabilitation professional can adjust the client's schedule, track whether he or she is successfully completing daily tasks, and be informed of emergencies while distant from the client. The system will further allow two-way interaction between the client and the system. In existing systems, the user simply acknowledges receipt of one prompt to move on to the next prompt. Our system will react to user input, adjusting task instructions or schedules based on information gleaned from feedback about the user's situation and context, including whether an activity or instruction has been completed or whether additional time or instructions are needed. Instruction sets will provide for branching from one instruction to another, based on client responses, in addition to simple sequential presentation of instructions. Using this feature, a caregiver who is familiar with a client's clinically observed difficulties can incorporate contingency planning in the instruction set so that clients can avoid errors or recover from them more effectively.

The architecture of ICue's Cognition Manager resembles the integration between deliberative and reactive planning in 3T [20], a control architecture for mobile robots and crew space systems [21-22]. In both cases, deliberative planning software maintains an entire schedule. Reactive planning software keeps track of the scheduled task(s) that are currently active and adjusts tasks in reaction to new information (e.g., sensor data from a robot or a user response). When a schedule item becomes active, the deliberative planner passes the new goal to the reactive planner. The reactive planner finds a task that will accomplish the goal and begins to execute this task. When the task is complete, the reactive planner provides feedback to the deliberative planner indicating whether the task was successful in achieving the goal. The 3T approach has been used to track humans performing tasks, specifically astronauts performing procedures [23], but has not previously been used to provide instructional assistance integrated with task tracking.
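The deliberative/reactive division of labor described above can be illustrated with a short sketch. The following Python rendering is purely hypothetical (ICue's own components were built in Java, Lisp, and RAPS); the class names, goal names, and data shapes are invented for the example:

```python
# Hypothetical sketch of the 3T-style handoff described above: a deliberative
# layer owns the whole schedule; a reactive layer executes the active task and
# reports success or failure back. Names are illustrative, not from ICue.

class ReactivePlanner:
    """Tracks the currently active task and executes it step by step."""
    def __init__(self, task_library):
        self.task_library = task_library  # maps a goal to an ordered step list

    def execute(self, goal):
        steps = self.task_library.get(goal)
        if steps is None:
            return False  # no known task accomplishes this goal
        for step in steps:
            pass  # a real system would deliver a cue here and await feedback
        return True  # feedback to the deliberative layer: goal achieved

class DeliberativePlanner:
    """Maintains the whole schedule; delegates active items downward."""
    def __init__(self, schedule, reactive):
        self.schedule = schedule  # list of (hour, goal) pairs
        self.reactive = reactive
        self.results = {}

    def tick(self, now):
        for hour, goal in self.schedule:
            if hour == now and goal not in self.results:
                # A schedule item just became active: pass the goal down.
                self.results[goal] = self.reactive.execute(goal)

reactive = ReactivePlanner({"take-medication": ["get pills", "drink water"]})
planner = DeliberativePlanner([(9, "take-medication")], reactive)
planner.tick(9)
```

The essential design point is the narrow interface: the deliberative layer passes down only a goal and receives back only a success/failure report, never the step-by-step detail.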

A prototype of the ICue system has been developed, focusing on interactive task-guidance capabilities. Scheduling software has been developed but not fully integrated with the task-guidance features. The system has been preliminarily tested through simulated trials and monitored use of the prototype in a clinical setting.

Software Design

ICue was designed to allow a caregiver to organize a client's activities into a daily schedule and instruct the client on how to perform activities in the schedule. ICue has four components (Figure 1):

· An Activity Assistant that guides a client through instructions to perform daily tasks as they arise on his or her schedule. The Activity Assistant resides on the server and delivers instructions to a standard Web browser on the user's computer or PDA.
· A Design Assistant that aids a caregiver in defining the steps within a task and creating a schedule of multiple tasks.
· A Cognition Manager that (1) builds a client schedule using information supplied by the caregiver and (2) generates client instructions and monitors client feedback using knowledge from task analyses encoded by the caregiver and client feedback during activity execution.
· An Information Server that hosts the Cognition Manager.

Figure 1. Overview of ICue system. PDA = personal digital assistant, HTN = hierarchical task net.

The Cognition Manager consists of a Schedule Supervisor that builds and tracks activities in the client's schedule and an Instruction Sequencer that dynamically constructs the sequence of steps needed to accomplish a task. The Schedule Supervisor is a deliberative planner based on the Adversarial Planner [24], originally developed for the military. The Instruction Sequencer is the reactive planner developed using the Reactive Action Package System (RAPS) [25], designed for use with mobile robots. The Instruction Sequencer guides the client in performing the currently active task, providing subsequent steps as the user progresses through the task. It can automatically alter the sequence of steps in response to problems or based on client responses.

The Instruction Sequencer provides an instruction for the current step of the ongoing task to the Activity Assistant. Based on this information, the Activity Assistant dynamically generates a Web page as a task cue to the client. The cue can include text, a picture, speech, nonspeech audio information, or any combination of the above. The Activity Assistant makes this dynamically generated Web page available over the Internet. The client receives the cue through the Web browser on his or her computer, PDA, or other device with Internet access. As the client progresses through the task, his or her Web browser remains directed to the same Web address. The Activity Assistant dynamically changes the Web page at this address to reflect the current step in the task as further information is provided by the Cognition Manager. The Activity Assistant also collects information based on the client's response and/or the passage of time, and returns this information to the Cognition Manager for use in selecting the next step. For example, if the user clicks a button, the Cognition Manager will move to the appropriate next step in the task and provide a corresponding cue through the Activity Assistant. On the other hand, if a preset time expires without a user response, the Cognition Manager will provide a follow-up instruction for the same step or may end the task and record that it was not successful.
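The button/timeout behavior just described amounts to a small per-step state machine. The Python sketch below is illustrative only (the actual Instruction Sequencer is a RAPS-based reactive planner); the cue text, timing value, and event encoding are invented:

```python
# Illustrative state machine for one task step: a button press advances to the
# next step; each timeout escalates to a more detailed follow-up cue; running
# out of follow-ups ends the step unsuccessfully. Names are not from ICue.

class StepCue:
    def __init__(self, cues, repeat_after, next_on_success):
        self.cues = cues                  # first cue, then follow-up cues
        self.repeat_after = repeat_after  # seconds before the next follow-up
        self.next_on_success = next_on_success

def run_step(step, events):
    """events: sequence of ('click', button) or ('elapsed', seconds) tuples.
    Returns (next_step_or_None, cues_shown)."""
    shown = [step.cues[0]]
    level = 0
    for kind, value in events:
        if kind == "click":
            return step.next_on_success, shown  # user responded: advance
        if kind == "elapsed" and value >= step.repeat_after:
            level += 1
            if level >= len(step.cues):
                return None, shown  # no follow-ups left: record as unsuccessful
            shown.append(step.cues[level])  # escalate to a more detailed cue
    return None, shown

wet = StepCue(cues=["Wet your hands.",
                    "Turn the faucet handle and hold your hands under the water."],
              repeat_after=30, next_on_success="apply-soap")
```

A click immediately yields the caregiver-defined next step, while repeated silence walks down the list of increasingly detailed cues, mirroring the escalation behavior described above.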

A prototype of the Schedule Supervisor has been implemented but has not been fully integrated with the Instruction Sequencer. When integrated, the Schedule Supervisor will track a client's progress on the activities in the schedule. When it is time for an activity to begin, it will notify the client by launching the Instruction Sequencer for the instruction corresponding to the activity. It marks a planned activity as initiated when the client begins interacting with the Instruction Sequencer to perform the first instruction corresponding to the activity. It marks a planned activity complete when the client exits the final instruction normally. A planned activity is marked as failed if the client fails to start the activity in a timely manner or exits the activity abnormally. When an activity is marked as failed, the Schedule Supervisor may replan the remainder of the day by shifting the remaining activities forward, unless they are constrained to a fixed time (such as an appointment). If an activity runs over the time allocated in the schedule, the Schedule Supervisor can delay activities deemed less important. Such automatic replanning is constrained to shifting activities to an earlier time or canceling activities. The caregiver can interact with the Schedule Supervisor to modify a client's schedule in other ways, such as reordering activities, adding new activities, or planning beyond the end of the current day.
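As a concrete illustration of the shift-forward replanning just described, the sketch below walks the remaining activities of a day, moving non-fixed activities earlier while fixed-time appointments keep their slots. It is a greatly simplified stand-in for the HTN-based Schedule Supervisor, with invented field names:

```python
# Simplified sketch of shift-forward replanning: non-fixed activities move
# earlier to absorb time freed by a failed or canceled task, while fixed-time
# activities (e.g., appointments) keep their start times. Fields are invented.

def replan(activities, now):
    """activities: dicts with 'name', 'start', 'duration' (hours), 'fixed'."""
    cursor = now
    replanned = []
    for act in sorted(activities, key=lambda a: a["start"]):
        start = act["start"] if act["fixed"] else cursor
        replanned.append({**act, "start": start})
        cursor = start + act["duration"]
    return replanned
```

This greedy walk ignores cases where an advanced activity would overrun an upcoming fixed appointment; the real planner reasons about such temporal constraints rather than simply sliding items forward.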

The Design Assistant allows caregivers (rehabilitation professionals, paraprofessionals, or family members) to define the steps necessary to complete an activity. We believe that most members of the target user population for ICue will not be able to independently perform the task design process and therefore a caregiver will perform task definition; however, the Design Assistant is designed to be easily usable, and high-functioning clients may be able to design their own tasks independently or with clinician assistance. Even when a caregiver is the primary task designer, client feedback will be necessary to ensure that task cues are appropriate for the individual client.

The Design Assistant provides a graphical editor for defining, viewing, and adjusting the content and ordering of the steps of an instruction as well as the presentation of the instructional steps to the user. The caregiver has flexibility to determine the level of detail for the task analysis, since some users will benefit from high-level instructions (e.g., "fry eggs"), while others will need more detailed steps ("get out frying pan," "crack eggs into pan," "turn burner to high heat," etc.). The caregiver can define the task steps at whatever level of detail is appropriate for the client. The caregiver can also select both the modality of prompts (text, pictures, vocal prompts, nonspeech sounds) and the content (e.g., specific wording, pictures of the user's own bathroom). In addition to conveying content (text, recorded speech, pictures), the caregiver has the option to assign an alarm sound to a given step. When the step becomes active, and whenever it repeats due to lack of feedback from the user, the alarm will sound to alert the user to check the device.

The Design Assistant was implemented in the Java programming language. Figure 2 shows the user interface for defining a task and the cue information for each step in the task. The task title, expected duration, and list of steps are shown on the left side of the screen. The right side of the screen displays information for the selected step. Here, the caregiver can define the cue for this step, entering text and/or selecting appropriate picture and sound files. The caregiver can choose to have multiple cues for the same step. For example, the system might first display a simple prompt, but if a certain amount of time elapses, the system will present a second, more detailed cue. The caregiver can define the amount of time to allow between repeats. If the client still does not respond after the final repeat, the caregiver can determine whether the system will move to another step by default or instead end the task. In addition, the caregiver can define "user responses": buttons that will appear on the cue screen and allow the client to respond to the cue. The caregiver can design these responses to allow the client to indicate success (thus moving to the next step in the task), a choice (e.g., what the client wants for breakfast) that will affect which step is selected next by the Cognition Manager, or a difficulty. Using these features, a caregiver who is familiar with a client's typical difficulties will be able to incorporate contingency steps in the task so that clients can recover from common errors.
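The branching structure a caregiver builds this way can be pictured as a graph of steps whose buttons name the next step. The following Python encoding is hypothetical (ICue stores tasks as XML/RAPS, not Python dicts), with an invented example task:

```python
# Hypothetical encoding of a Design Assistant task: each step carries cue text
# and response buttons that branch to other steps, including a contingency
# ("help") step for recovering from a common difficulty. All names invented.

task = {
    "title": "Make toast",
    "steps": {
        "start":     {"cue": "Do you want toast?",
                      "buttons": {"Yes": "get-bread", "No": "end"}},
        "get-bread": {"cue": "Get two slices of bread.",
                      "buttons": {"Done": "toast"}},
        "toast":     {"cue": "Put the bread in the toaster and press down.",
                      "buttons": {"Done": "end", "I need help": "help"}},
        "help":      {"cue": "Ask a family member for help with the toaster.",
                      "buttons": {"OK": "end"}},
        "end":       {"cue": "All done!", "buttons": {}},
    },
}

def walk(task, responses):
    """Follow a sequence of button presses; returns the visited step ids."""
    step_id, visited = "start", []
    for press in responses:
        visited.append(step_id)
        step_id = task["steps"][step_id]["buttons"][press]
    visited.append(step_id)
    return visited
```

Note how the "I need help" button routes through a contingency step rather than simply failing, which is the error-recovery pattern the caregiver designs for.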

Figure 2. ICue system Design Assistant user interface for entering task and cue information.

Cue content, including cue text, button text, graphics, and sound files, must be generated by the caregiver. Cue and button text are entered directly into the Design Assistant; graphics and sound files must be saved on the caregiver's computer and then opened within the Design Assistant. In a final product, libraries of standard graphics and alarm sounds could be provided, and text-to-speech capabilities may be desirable as an alternative to recorded speech cues. However, some people with disabilities will need at least some customized graphics (e.g., showing their stove, not just a generic stove) and will respond more readily to a familiar recorded voice rather than a computer-generated or generic recorded voice. Future versions of the Design Assistant could include support for taking photos and recording sound files as an integral part of the task-design process.

Once the caregiver has defined the task and its steps, he or she can save the task. The task structure defined using the Design Assistant is translated into an extensible markup language (XML) document. The XML document is sent to the Information Server, where the task information is stored as a RAPS procedure that can be used by the Cognition Manager.
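The translation into XML can be sketched as follows. Because the actual schema ICue uses is not given in this article, the element and attribute names below are invented; only the overall shape (task, steps, cues, response buttons) follows the description above:

```python
# Sketch of serializing a task definition to XML, roughly as the Design
# Assistant does before sending it to the Information Server. The element and
# attribute names here are assumptions, not ICue's actual schema.

import xml.etree.ElementTree as ET

def task_to_xml(title, duration_min, steps):
    root = ET.Element("task", {"title": title, "duration": str(duration_min)})
    for step in steps:
        el = ET.SubElement(root, "step", {"id": step["id"]})
        ET.SubElement(el, "cue").text = step["cue"]
        # Each response button records its label and the step it branches to.
        for label, target in step.get("buttons", {}).items():
            ET.SubElement(el, "response", {"label": label, "next": target})
    return ET.tostring(root, encoding="unicode")

doc = task_to_xml("Wash hands", 5, [
    {"id": "wet", "cue": "Turn on the water.", "buttons": {"Done": "soap"}},
    {"id": "soap", "cue": "Apply soap.", "buttons": {"Done": "end"}},
])
```

On the server side, a corresponding translator would then compile such a document into a RAPS procedure for the Cognition Manager.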

The Design Assistant will also allow the caregiver to compose the client's daily schedule. It combines automated planning software with software to assist caregivers in building the client's daily schedule. The client's daily schedule is constructed by first defining the set of activities that should occur at the same time every day. These include such items as meals, sleep time, bathing, and other routine activities. These activities are assigned to a fixed time period. The remaining blocks of time are viewed as open or "free time." We use automated planning techniques to fill these open blocks with activities that achieve the client goals for the day as specified by the caregiver. The fixed activities can vary for different days of the week, if needed. We believe the regularity of this approach matches well to the clients' need for consistency in their day-to-day activities.

To populate the open blocks with activities, the Design Assistant uses the Adversarial Planner [24], a hierarchical task net (HTN) planner. HTN planners use predefined hierarchical groupings of goals and associate them with constraints that are matched during searches (called goal decomposition). We chose to use HTN technology because goal decomposition improves planning efficiency by constraining searches when action sequences are built. The Design Assistant integrates this HTN planner with simple scheduling techniques for automatically ordering the goals passed to the plan [26]. The scheduler first determines which goals are possible candidates at the specified time by identifying all goals whose temporal constraints, resource constraints, and user preferences are met at the time specified. Given this list of candidate goals, the scheduler then determines which goal is most important to the user. To determine this, the scheduler looks at the optimization criteria set by the user. The optimization criteria are evaluated in a user-specified order and consist of the following categories:

· Meet required temporal constraints.
· Meet user-preferred temporal constraints.
· Prefer high-priority goals.
· Prefer goals with longer expected duration.
· Prefer goals using critical resources.

The user can adjust these criteria to create several variations of a plan whose initial conditions and goals are the same but whose scheduling criteria are different.

The selected goal is passed to the HTN planner. The planner decomposes the goal into primitive tasks and reduces the amount of time that is available in the open block. The scheduler then recomputes the candidate goals based on the effects that the scheduled task has on the schedule, e.g., resource usage, temporal changes. The cycle repeats, with candidate goals being identified, the highest ranking goal being selected, and the remaining time being returned for scheduling until the open block is filled or no tasks remain that fit in the block of time being scheduled. If the scheduler fails to find a goal for the open block, then the planner allocates the remainder of the block to free time. The planner then continues planning the day until all open blocks are filled. Because the planner schedules free time whenever an open block cannot be filled, the planner never fails to build a schedule. Schedules can be suboptimal, however, in that goals are not met, resulting in schedules with unallocated time periods (i.e., free time).
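A greedy rendering of this fill-the-block cycle is sketched below. For brevity the lexicographic criteria are reduced to priority-then-duration, and goal decomposition is omitted; the actual system couples this loop to the Adversarial Planner. Goal names and fields are invented:

```python
# Greedy sketch of the scheduling cycle described above: repeatedly pick the
# best candidate goal that fits the remaining time; any minutes left over
# become free time, so a schedule is always produced (possibly suboptimal).

def fill_block(block_minutes, goals):
    """goals: dicts with 'name', 'priority' (higher = more important),
    and 'duration' in minutes. Returns (scheduled names, free minutes)."""
    remaining, scheduled = block_minutes, []
    pool = list(goals)
    while True:
        # Candidate goals are those whose constraints (here, just duration)
        # can still be met in the remaining time.
        candidates = [g for g in pool if g["duration"] <= remaining]
        if not candidates:
            break
        # Lexicographic preference: higher priority first, then longer duration.
        best = max(candidates, key=lambda g: (g["priority"], g["duration"]))
        scheduled.append(best["name"])
        remaining -= best["duration"]
        pool.remove(best)
    return scheduled, remaining  # leftover minutes are left as free time
```

Because the loop terminates by allocating whatever time it cannot fill as free time, it never fails outright, matching the behavior described above.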

Once a schedule has been created, the user interacts with the Design Assistant to review its activities and adjust them if needed. When the schedule is deemed acceptable, it is stored in a database. Schedules in the database can be viewed remotely using a Web interface.

Currently, software for designing both task instructions and activities in the schedule has been implemented but not integrated. Figure 3 shows the user interface for organizing user tasks into a daily schedule. The caregiver is presented with a list of modeled tasks that he or she can select to be included in the schedule. For each selected task, the caregiver can provide timing constraints (e.g., the task must take place at 2:00 or the task is preferred to take place between 1:00 and 4:00). The caregiver can also prioritize the tasks, if some tasks (e.g., taking medication) are considered more important than other tasks (e.g., doing a crossword puzzle). Based on the expected duration of each task, the schedule constraints, and the priorities, the Schedule Supervisor creates a schedule. The results are presented to the caregiver, including the proposed schedule and a list of any selected tasks that could not be scheduled because of an inability to meet the constraints. The scheduling user interface and the Schedule Supervisor were implemented using the Java and Lisp programming languages.

Figure 3. ICue system user interface for building client schedules.

Currently, versions of the Design Assistant, Instruction Sequencer, and Activity Assistant have been implemented but not fully integrated as they will be in a final product. The Design Assistant can be used to design multistep tasks, as described earlier. Task plans are saved as XML documents but are also automatically translated into RAPS format by the Design Assistant. The RAPS document can be manually uploaded to the Instruction Sequencer. A second, distinct Design Assistant can be used to define a user's schedule and will automatically generate a schedule in response to constraints set by the user, as described previously.

In the absence of a fully functional Schedule Supervisor, an investigator must select a task to be performed (Figure 4). Once a task is selected, the Instruction Sequencer and Activity Assistant work together as described earlier to deliver task cues to the user. The task is activated within the Instruction Sequencer, which provides the appropriate cues in response to user feedback (or lack of feedback within a specified time period). The cue information (text, graphic, alarm sound, and/or sound file with spoken instructions), timing information, and user response information (buttons) are provided to the Activity Assistant. The Activity Assistant creates a hypertext markup language (HTML) document with the current cue and makes it available over the Internet; a client can then access it on a standard PDA or computer Web browser. In this way, the Activity Assistant will deliver the first cue to the client (Figure 5). Once the client begins to perform the task, he or she will be prompted for each step of the task and asked to give feedback when he or she completes the step (Figures 6-7). If the client does not respond within the time set by the caregiver, the system will present an alternative cue or end the task as previously determined by the caregiver using the Design Assistant (Figures 7-9).
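The HTML-generation step might look roughly like the minimal sketch below. The markup, function name, and form-post convention are assumptions for illustration, not ICue's actual output:

```python
# Minimal sketch of rendering a cue as an HTML page for a standard browser,
# roughly as the Activity Assistant does. Each button posts the client's
# response back to the server. Markup and names are illustrative only.

import html

def render_cue(text, image_url=None, buttons=()):
    parts = ["<html><body>", f"<p>{html.escape(text)}</p>"]
    if image_url:
        parts.append(f'<img src="{html.escape(image_url)}" alt="cue picture">')
    for label in buttons:
        # One small form per button; submitting it reports which was pressed.
        parts.append(
            f'<form method="post"><button name="response" '
            f'value="{html.escape(label)}">{html.escape(label)}</button></form>')
    parts.append("</body></html>")
    return "".join(parts)
```

Serving each newly rendered page at the same fixed URL is what lets the client's browser stay pointed at one address while the displayed cue changes step by step.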

Figure 4. ICue system Web page allowing caregiver/investigator to select task that will be presented to client. Figure 5. ICue system task cue prompting for user decision.
Figure 6. ICue system cue asking for user to acknowledge when step is complete or allowing user to back up to previous cue. Figure 7. ICue system cue for step in task with text and picture.
Figure 8. ICue system alternative cue for task shown in Figure 7. Figure 9. ICue system cue indicating that task has been abandoned and system will summon live assistance.
Design Rationale

A distributed architecture was selected for the ICue system to provide flexibility with hardware and to support telerehabilitation applications. Because the client's schedule resides on a central server, he or she may use a device that is most appropriate to his or her needs or multiple devices in different environments (e.g., a desktop computer at home and a PDA in the community). Also, a caregiver is able (with the client's permission) to access the client's schedule to track the client's success with his or her daily activities. A Web interface was selected for the client user interface (Activity Assistant) to maximize this flexibility. The caregiver user interface (Design Assistant) is a local program running on the caregiver's computer; this provides more flexibility for the caregiver to define tasks while offline or online. As Web 2.0 technologies mature, a Web interface may be implemented for the Design Assistant.

The appearance of the client's user interface is largely defined by the selections made by the caregiver in the Design Assistant. The Design Assistant, in turn, was designed to provide caregivers with the flexibility they need to customize prompts for their clients, without providing so many features that the user interface becomes confusing and daunting. The selection of features for the Design Assistant was based on clinical expertise from Dr. Kirsch's 22 years of experience in applying computer technology to the needs of people with cognitive impairments, particularly acquired brain injury. Feedback was provided from other rehabilitation professionals through a focus group and usability trials. The Design Assistant was designed with the expectation that caregivers will typically define tasks and that these caregivers will have a wide range of computer experience and rehabilitation expertise; accordingly, the Design Assistant is designed to be usable by people with relatively little computer experience. To date, the Design Assistant has not been designed specifically for people with cognitive impairments to design their own tasks, nor has it been tested for usability with this population; however, high-functioning individuals with cognitive disabilities may be able to design their own tasks and schedules using this tool.

Human Subjects Protection

A series of trials were conducted to evaluate the ICue system. Studies involving people with disabilities were approved by the University of Michigan institutional review board, and these studies were conducted exclusively by investigators affiliated with the University of Michigan Health System. Studies involving caregivers were approved by the University of Pittsburgh and University of Michigan institutional review boards. All participants provided informed consent for their involvement.

Simulated Trials

Prior to development of the ICue software, the primary clinical features of the system were tested with various technological approaches, such as standalone PDA applications and simple server-side, database-driven Web pages, presented on both PDAs and laptop computers. In effect, these early studies simulated later ICue features so that the feasibility of providing interactive cueing on a PDA could be assessed [5]. Single-case pilot studies were conducted to examine whether patients with significant cognitive impairments could interact effectively with a PDA and whether the instruction sets presented by the PDA could facilitate task performance. Two tasks were identified for study based on difficulties experienced by patients in the clinical practice of one of the investigators: setting a clock radio and navigating through a 14,000 ft² rehabilitation facility. For both tasks, a task analysis was completed and individualized instruction sequences were predefined. For the navigation task, each cue was hard-coded in HTML as a sequence of links to successive and preceding instructions. For some instructions, choices were offered, with branching links appropriate to a patient's response. All these HTML pages were then copied onto an iPAQ 3650 (Hewlett-Packard; Palo Alto, California), where they could be accessed through the Microsoft Pocket Internet Explorer Web browser (Microsoft Corp; Redmond, Washington). For the clock radio task, a series of interactive screens was designed for wireless presentation on a laptop computer. These pages were maintained in a server-side database accessed with Macromedia® ColdFusion (Adobe Systems, Inc; San Jose, California). In both studies, the investigator therefore manually performed the roles of the Design Assistant and Activity Assistant, converting a task description into HTML Web pages. However, the actual presentation of successive (and preceding) instructions was controlled entirely by user response. The experience for the patient was similar to what would be available with ICue, although these pilot systems lacked ICue's ability to adapt to user difficulties or other contingencies.
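The hand-built cue pages described above amounted to a linked chain of simple HTML documents. A minimal generator for such a chain might look like the sketch below; the file names, wording, and page structure are hypothetical, intended only to illustrate the next/back linking pattern.

```python
def cue_page(step, total, text):
    """Return a minimal HTML cue page for one instruction,
    linking back and forward through the instruction sequence."""
    links = []
    if step > 1:
        links.append(f'<a href="step{step - 1}.html">Back</a>')
    if step < total:
        links.append(f'<a href="step{step + 1}.html">Next</a>')
    return (f"<html><body><p>{text}</p>"
            f"<p>{' | '.join(links)}</p></body></html>")

# Illustrative two-step sequence; the real pilots used longer,
# individualized instruction sets.
instructions = ["Walk to the red circle and tap the screen.",
                "Walk to the blue circle and tap the screen."]
pages = [cue_page(i + 1, len(instructions), text)
         for i, text in enumerate(instructions)]
```

Branching choices, as used in some of the pilot instructions, would simply add extra links pointing at different successor pages.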

Single-case trials were conducted for each of two clients. The first client was a 69-year-old woman with cognitive impairment following a TBI and additional evidence of early-stage multi-infarct dementia. Based on input from the client and family, a task was selected requiring that the client set her clock radio. A sequence of cues was developed consisting of both verbal and graphic material, providing her with visual guidance (such as where on the alarm clock to find the buttons she would have to press) as well as verbal instructions about details such as the time to be set. A series of iterative pilot trials was first conducted to assess error patterns and then develop modifications to the cueing sequence that specifically addressed her characteristic errors. A variant of an ABA´ single-case design was then implemented: a series of single-trial "blocks" was introduced, with the client alternating between cued and independent performance of the task. The results are presented in Figure 10, adapted from Kirsch et al. [5]. Across successive alternating trials, her cued performance was characterized by consistently fewer errors. Learning over time was evident, even during the noncued condition. However, her improvement was fragile, with deterioration during the last two study trials for reasons independent of the intervention (specifically, she was anxious about a family trip planned for later on the day of testing, resulting in markedly reduced attention), although the pattern of improved performance with cueing was maintained [5].

Figure 10. Alarm clock task: Number of errors during unassisted (A) and computer-assisted (B) trials.

For the second study, a navigation task was investigated with a 19-year-old man who had severe memory and orientation deficits 3 months after sustaining a TBI. He had been unable to learn any routes from therapy office to therapy office, despite repetition over several weeks. Thirty routes were identified within the rehabilitation facility (e.g., between occupational therapy and physical therapy or between speech therapy and occupational therapy). Each route comprised three to six directional choice points. A large colored circle was placed on the walls of the facility at each choice point. The colors for these circles were all distinct and unmistakably identifiable from as far as 100 feet. Each route in the facility was represented by a unique sequence of colored circles. Each route represented a task to be completed, and instruction sets were developed for each route. Each instruction directed the patient to walk to the next colored circle for the route being followed and then to tap the screen. A sample instruction for this task is shown in Figure 11.
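The route scheme above reduces each navigation task to a sequence of colored choice points, from which per-step cues follow mechanically. The sketch below illustrates that representation; the route names, colors, and cue wording are invented for illustration and are not the facility's actual markings.

```python
# Each route is keyed by (origin, destination) and stored as the
# ordered sequence of colored circles the patient must reach.
ROUTES = {
    ("occupational_therapy", "physical_therapy"): ["red", "blue", "green"],
    ("speech_therapy", "occupational_therapy"): ["yellow", "purple"],
}

def route_cues(origin, destination):
    """Generate one cue per choice point along the route."""
    colors = ROUTES[(origin, destination)]
    return [f"Walk to the {c} circle, then tap the screen." for c in colors]

cues = route_cues("occupational_therapy", "physical_therapy")
```

Because every route is a unique color sequence, the same generic cue template serves all thirty routes; only the color list changes.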

Figure 11. Personal digital assistant with cue for navigation task.

A standard ABA´ single-case design was employed. During A (baseline) trials, the patient attempted navigation independently; during B (intervention) trials, the simulated intervention was used; and during A´ (return-to-baseline) trials, the simulated intervention was withdrawn. Investigators recorded the number of errors made during each trial that required therapist intervention (e.g., turning down the wrong corridor). When errors occurred, the patient was instructed to correct himself during A and A´ trials and to consult the PDA during B trials. Data are presented graphically in Figure 12. During A trials, a substantial number of errors were made. During B trials, the number of errors was reduced. During A´ trials, the number of errors was similar to that of B trials but more variable [5].

Figure 12. Navigation activity: Number of navigation errors during unassisted (baseline A and return-to-baseline A´) and computer-assisted (B) trials.

The results from these studies (using technological approaches later incorporated into ICue) indicate that the interventions did have an immediate benefit, based on the improvement from noncued to cued trials in each case. For both studies, interpretation is complicated by performance that did not return to baseline levels when the intervention was withdrawn. This may indicate that the intervention resulted in some degree of learning for these tasks, or it may indicate that the participants also experienced general learning due to repetition, and this learning effect was conflated with the effect of the intervention. However, the results were sufficiently encouraging to merit continued software development.

Field Trial of Current Prototype

The current version of ICue was used in preliminary field trials with a single client who had cognitive impairment following TBI. Task analyses were performed for two tasks of relevance for the client. The first task involved using a glucometer to measure blood sugar, and the second task involved learning to use a memory book. Based on these task analyses, each task was entered into ICue using the Design Assistant. The tasks were made available over the Internet via the Cognition Manager. The investigator selected which task would be active at a given time. ICue would then present the appropriate cues to the client.

The client was observed performing both tasks with cues from ICue. No control conditions were used for reference in these observations. The purpose of these exploratory studies was simply to demonstrate that the client could interact with sequential directions presented by the ICue system and perform the prescribed task. In each case, the client was able to respond appropriately to the cues provided by ICue and complete each task independently. For example, for the glucometer task, the client responded to a combination of visual and verbal cueing by preparing the glucometer, taking a blood glucose reading, and recording the result in a notebook. Because these studies were designed to assess client response independent of clinical outcome, we made no comparisons of task performance with and without cueing, other than the clinical observation that the target tasks were completed successfully with cueing, in contrast to therapy sessions in which these tasks were performed with multiple errors. These observations did not confirm the clinical utility of ICue but indicated that ICue can potentially provide appropriate and usable cues for relevant tasks to people with cognitive impairments. However, controlled trials are needed to evaluate the efficacy of ICue compared with a person's independent performance.

Usability Trials

Clinical effectiveness will require that ICue is useful not only for people with cognitive disabilities, but also for the people who prepare their schedules. This includes rehabilitation professionals, paraprofessionals, family members, and people with disabilities themselves. The design of the Design Assistant is intended to provide a simple approach for caregivers to follow when defining the steps of an activity. Rather than provide a programming-style interface, we are using a graphical approach that allows caregivers to specify each step and the relationship between steps from a display of options.

Six rehabilitation professionals were recruited for usability testing of the Design Assistant. The primary purposes of formal usability trials were to assess the extent to which the Design Assistant is considered by clinicians to be easy to learn and use, whether the user interface is clearly and efficiently designed, and whether all features identified by clinicians during focus groups have been adequately represented. Participants included two neuropsychologists, one occupational therapist, one center for independent living program manager, one center for independent living skills trainer, and one peer counselor.

Each participant received a kit with a set of tasks that led to the creation of a set of activities and a schedule for a hypothetical client. At the end of the evaluation, each participant completed a short survey asking for interface feature requests, bug reports, and an assessment of the system's usability and usefulness. Some survey questions used a 5-point Likert-type scale ranging from "Strongly Disagree" to "Strongly Agree." Survey questions included the following (where "Solo" is the former name for ICue):

1. It was easy to learn how to use the Solo Design Assistant.
2. Using the Design Assistant was a very frustrating experience.
3. It would take too long to use the Design Assistant to define tasks in a clinical environment.
4. My clients would benefit from a system such as Solo.
5. I frequently guessed correctly when trying to figure out what to do next.
6. Most of my actions while using the Design Assistant were correct.
7. It is worth the effort to use Solo.
8. I feel I am capable of independently using the Design Assistant with a client after this experience.
9. I would be likely to purchase Solo if it were a commercial product.

Data collected included the scores for these Likert-type questions, answers to specific questions related to the schedule development user interface, and open-ended questions about aspects of the system that subjects liked, did not like, or would like to see added. Subjects reported that ICue seemed beneficial (mean ± standard deviation [SD] score of 4.2 ± 0.75 out of 5) and that they would purchase ICue if it were a commercial product. Subject ratings of ease of use were mildly positive (mean ± SD of 3.7 ± 0.82, 3.2 ± 0.98, 3.7 ± 0.82, and 3.7 ± 0.82 for questions 1, 5, 6, and 8, respectively), with participant responses and investigator observations indicating usability issues to be addressed in future work. Despite these issues, subjects reported that using ICue was not frustrating (mean ± SD score of 1.7 ± 0.52 out of 5 for frustration) and was worth the effort to use (mean ± SD of 4.3 ± 0.82 out of 5). Some expressed concern that ICue might take too long to use in a clinical setting (mean ± SD of 2.5 ± 1.38 out of 5), a concern that will need to be addressed in future work. Results of these usability trials, including specific user comments and feature requests, will be used to inform system revisions and guide enhancements.

Examples of observed difficulties include participants expecting the name of a step to be mirrored in the cue text, confusion resulting from two "Add" buttons on the screen (one for a new step and one for a new client feedback button on the current step), and a tendency to type "minutes" or "seconds" when entering a time rather than using a separate box to select time units.
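Summary statistics of the form "mean ± SD" used throughout these results can be reproduced with the Python standard library, as sketched below. The ratings listed are hypothetical examples, not the study's actual response data, which are reported here only in summary form.

```python
import statistics

# Hypothetical 5-point Likert-type responses from six participants.
ratings = [4, 5, 4, 3, 5, 4]

mean = statistics.mean(ratings)
sd = statistics.stdev(ratings)  # sample SD (n - 1 denominator)

summary = f"{mean:.1f} \u00b1 {sd:.2f}"  # e.g., "4.2 ± 0.75" format
```

Note that `statistics.stdev` computes the sample standard deviation (dividing by n - 1), which is the convention usually reported for small usability samples like this one.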
Participant comments emphasized the need to support repeating events in the schedule (at repeat rates other than daily); schedule constraints related to whether the user is leaving the house; the option of presenting the user with a choice between two tasks in some instances, rather than a single task; easier rearrangement of steps within a task; task templates; and tasks scheduled relative to earlier tasks. Participants also noted that a system such as ICue could reduce resistance to a family member's prompts, because it is not a person telling the client what to do. Regarding how the system makes scheduling decisions, some participants wanted the system to minimize free time (to provide structure and avoid boredom), but one expressed concern that overloading the schedule could lead to fatigue, while another emphasized that the clinician should be able to provide more free time as a client progresses.


These preliminary results indicate that a system such as ICue has the potential to benefit people with memory and organizational impairments following TBI or other disabilities. However, considerable work is needed to establish the efficacy and clinical practicality of the system.

Currently, task instructions must be selected by a person, such as a caregiver or investigator. The automated scheduler must be fully implemented and integrated with the Instruction Sequencer and other components of the ICue system to provide automatic selection of tasks. Other features of the Cognition Manager, such as the ability to contact a caregiver in emergency situations, must also be implemented.

Currently, the interaction between ICue and the client is monitored by the Instruction Sequencer but is not recorded for future review. In future versions of ICue, all such interaction will be logged for use by the caregiver in measuring outcomes and adjusting instructions. This update will accommodate changes in a user's needs over time. This information can be used to determine which tasks were difficult for a client and which were performed with ease and perhaps did not require reminders. The caregiver can also identify and revise instructional steps that are not effective for a client.

Evaluation trials are needed to determine the true efficacy of the ICue system for people with cognitive disabilities. Such trials are planned for the current system, using ABA´ single-subject designs similar to those performed for the simulated system. Clinical trials will include three participants with a history of acquired cognitive impairments. The goal of the clinical trials will be to demonstrate that the cueing and guidance provided by a caregiver can be transferred to ICue, thereby increasing the possibility that the patient will be able to engage in these activities independently.

For each participant, one activity will be selected for intervention. During intervention sessions, each participant will be provided with a handheld computer. This device will be used to present instructions developed with the Design Assistant and presented by a prototype version of the Activity Assistant. The instructions will be based on a task analysis of the activity and will be much the same as those that would otherwise be offered by a caregiver.

For every activity or behavior, data will be collected, including (1) time to task completion, (2) number of sequencing errors, (3) number of subtask omission errors, and (4) number of interventions required from the therapist. Performance of the target activity using ICue will be compared with baseline performance. Based on the specific client and targeted activity, typical hypotheses may be (1) time to completion will be shorter, (2) sequencing errors will be fewer, (3) errors during performance of subtasks will be fewer, (4) subtask omissions will be fewer, (5) therapist interventions will be fewer, or (6) the targeted behavior will occur more or less frequently, depending on clinically identified goals. Results will be used to inform system revisions and guide subsequent enhancements.

Similar trials will be conducted for the full ICue system when the scheduler has been integrated. Once the software is fully implemented, more extensive and controlled evaluation trials will also be performed.

Additional usability trials are planned to evaluate the usability of the Design Assistant for potential users without formal rehabilitation training, for example, family members and people with cognitive disabilities. The protocol for these trials will be similar to that described earlier for rehabilitation professionals.

Maintaining the user's schedule and task plans on the Internet allows the user to receive prompts from any device with a Web browser and supports caregivers in adjusting and monitoring the user's schedule. However, it presents two difficulties: the risk of lost connectivity and the risk of compromised privacy.

If the computing device with which the user accesses ICue (e.g., computer, PDA, or smartphone) loses contact with the Internet, no means currently exist for the user to receive prompts. For the system to be practical for real-world use, or even for extensive field trials, this limitation must be addressed. First, a back-up system local to the user's device will be needed. This back-up system may lack some of the intelligent features of the full system but would provide basic prompting in accord with the user's schedule as of the time Internet connectivity was lost. When an Internet connection is available, the back-up system will periodically download the user's schedule so that it is available locally if needed. Second, the server-side system will continually monitor its connection with each active user. If a connection is suddenly lost, ICue will automatically alert a designated caregiver, who can then take action to ensure the user's safety.
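The proposed local back-up behavior amounts to a fall-back cache: refresh the schedule while connected, and keep serving the last successful copy when the connection drops. A minimal sketch, assuming a hypothetical class name and fetcher interface not specified in the design:

```python
class ScheduleCache:
    """Keeps the last schedule successfully downloaded from the server,
    so basic prompting can continue if connectivity is lost."""

    def __init__(self):
        self._cached = None

    def refresh(self, fetch):
        """Try the server; on failure, keep the cached schedule."""
        try:
            self._cached = fetch()
            return True   # online, cache updated
        except ConnectionError:
            return False  # offline, last known schedule remains

    def current_schedule(self):
        return self._cached

cache = ScheduleCache()
cache.refresh(lambda: ["08:00 Check glucose"])  # simulated successful fetch

def offline_fetch():
    raise ConnectionError("no connectivity")

online = cache.refresh(offline_fetch)           # simulated dropout
```

After the simulated dropout, `current_schedule()` still returns the schedule as of the last successful download, which is exactly the degraded-but-safe behavior described above; server-side dropout detection and caregiver alerting would live on the server and are not shown.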

A risk to the user's privacy also exists if someone were to gain illicit access to the ICue server containing users' schedules and task plans. The final ICue system will need to implement security measures to reduce this risk.


An interactive scheduling and task guidance system has been designed for use by people with memory and organizational impairments. A preliminary prototype of the system has been developed and implemented as a distributed system [27]. This distributed approach will allow for remote access by the client and one or more caregivers, allowing the caregivers to adjust the client's schedule, monitor outcomes, and provide emergency assistance. Initial simulated trials and field trials indicate the potential for this system to provide meaningful assistance to people with cognitive disabilities. More controlled clinical trials are needed to establish the true efficacy of the system. Usability trials are also planned to determine how easily a caregiver can enter tasks and schedules into the system. Ease of use for both clients and caregivers is necessary for a truly practical and useful system.


This material was based on work supported by a Phase I Small Business Innovation Research grant from the National Institutes of Health, National Institute of Child Health and Human Development, 5 R43 HD44277-02. This study was sponsored by the National Institutes of Health through a Small Business Innovation Research grant to AT Sciences, LLC.

Data collection, analysis, and interpretation of data for trials with people with disabilities were conducted by and at the University of Michigan.

If successful, AT Sciences plans to commercialize the ICue software with licenses to the University of Michigan and TRACLabs. AT Sciences had a direct role in the study design; in the collection, analysis, and interpretation of data for caregivers; and in writing and submitting this report for publication.

1. Cole E, Dehdashti P. Computer-based cognitive prosthetics: Assistive technology for treatment of cognitive disabilities. In: Proceedings of the Third International ACM Conference on Assistive Technologies; 1998; Los Angeles, California. New York (NY): Association for Computing Machinery; 1998. p. 11-18.
2. Ellis JA. Prospective memory or the realization of delayed intentions: A conceptual framework for research. In: Brandimonte M, Einstein GO, McDaniel MA, editors. Prospective memory: Theory and applications. Mahwah (NJ): Lawrence Erlbaum Associates; 1996. p. 1-22.
3. Levine S, Horstmann H, Kirsch N. Performance considerations for people with cognitive impairments in accessing assistive technologies. J Head Trauma Rehabil. 1992;7(3): 46-58.
4. Kime SK, Lamb DG, Wilson BA. Use of a comprehensive program of external cueing to enhance procedural memory in a patient with dense amnesia. Brain Inj. 1995;10(1):17-25.[PMID: 8680389]
5. Kirsch NL, Shenton M, Spirl E, Rowan J, Simpson R, Schreckenghost D, LoPresti EF. Web-based assistive technology interventions for cognitive impairments after traumatic brain injury: A selective review and two case studies. Rehabil Psychol. 2004;49(3):200-12.
6. Davies DK, Stock SE, Wehmeyer M. A palmtop computer-based intelligent aid for individuals with intellectual disabilities to increase independent decision making. Res Pract Pers Sev Disabil. 2004;28(4):182-93.
7. Levinson R. PEAT: The planning and execution assistant and training system. J Head Trauma Rehabil. 1997;12(2): 245-52.
8. LoPresti EF, Mihailidis A, Kirsch N. Technology for cognitive rehabilitation and compensation: State of the art. Neuropsychol Rehabil. 2004;14(1/2):5-39.
9. Gorman P, Dayle R, Hood CA, Rumrell L. Effectiveness of the ISAAC cognitive prosthetic system for improving rehabilitation outcomes with neurofunctional impairment. NeuroRehabilitation. 2003;18(1):57-67. [PMID: 12719621]
10. Jinks A, Robson-Brandi C. Designing an interactive prosthetic memory system. In: Sprigle S, editor. Proceedings of the Rehabilitation Engineering Society of North America (RESNA); 1997; Arlington, Virginia. Arlington (VA): RESNA Press; 1997. p. 526-28.
11. Hersh N, Treadgold L. NeuroPage: The rehabilitation of memory dysfunction by prosthetic memory aid cueing. NeuroRehabilitation. 1994;4(3):187-97.
12. Carmien S. MAPS: PDA scaffolding for independence for persons with cognitive impairments. In: Proceedings of the 2002 Human Computer Interaction Consortium; 2002 Feb; Winter Park, Colorado.
13. Thöne-Otto A, Walther K, Schulze H. MEMOS-Evaluation of an interactive electronic memory aid for brain-injured patients. J Int Neuropsychol Soc. 2003;9(4):583.
14. Kirsch NL, Levine SP, Lajiness-O'Neill R, Schneider M. Computer-assisted interactive task guidance: Facilitating the performance of a simulated vocational task. J Head Trauma Rehabil. 1992;7(3):13-25.
15. Kirsch NL, Levine SP, Fallon-Kreuger M, Jaros L. The microcomputer as an "orthotic" device for patients with cognitive deficits. J Head Trauma Rehabil. 1987;2(4):77-86.
16. Kirsch NL, Shenton M, Rowan J. A generic, "in-house," alphanumeric paging system for prospective activity impairments after traumatic brain injury. Brain Inj. 2004; 18(7):725-34. [PMID: 15204332]
17. Mihailidis A, Fernie GR, Barbenel JC. The use of artificial intelligence in the design of an intelligent cognitive orthotic for people with dementia. Assist Technol. 2001; 13(1):23-39. [PMID: 12212434]
18. Pollack ME, Brown L, Colbry D, McCarthy CE, Orosz C, Peintner B, Ramakrishnan S, Tsamardinos I. Autominder: An intelligent cognitive orthotic system for people with memory impairment. Robot Auton Syst. 2003;44(3-4): 273-82.
19. Pollack ME, Engberg S, Matthews JT, Thrun S, Brown L, Colbry D, Orosz C, Peintner B, Ramakrishnan S, Dunbar-Jacob J, McCarthy C, Montemerlo M, Pineau J, Roy N. Pearl: A mobile robotic assistant for the elderly. In: Proceedings of the AAAI Workshop on Automation as Caregiver; 2002 Jul 29; Edmonton, Alberta, Canada. Menlo Park (CA): AAAI Press; 2002.
20. Bonasso RP, Firby JR, Gat E, Kortenkamp D, Miller DP, Slack MG. Experiences with an architecture for intelligent, reactive agents. J Exp Theoretical Artif Intelligence. 1997; 9:237-56.
21. Bonasso RP, Kortenkamp D, Thronesbery C. Intelligent control of a water recovery system: Three years in the trenches. AI Magazine. 2003;24(1):19-44.
22. Schreckenghost D, Ryan D, Thronesbery C, Bonasso P, Poirot D. Intelligent control of life support systems for space habitats. Proc Conf Artif Intell. 1998;15:1140-45.
23. Bonasso RP, Kortenkamp D, Whitney T. Using a robot control architecture to automate space shuttle operations. Proc Conf Artif Intell. 1997;14:949-62.
24. Elsaesser C, MacMillan R. Representation and algorithms for multiagent adversarial planning. Technical Report MTR-91W000207. Washington (DC): MITRE; 1991.
25. Firby RJ. The RAPS language manual. Chicago (IL): I/Net, Inc; 1999.
26. Schreckenghost D, Hudson M, Thronesbery C, Kusy K. When automated planning is not enough: Assisting users in building human activity plans. In: Battrick B, editor. Proceedings of the Seventh International Symposium on Artificial Intelligence, Robotics and Automation in Space; 2003 May 19-23; Nara, Japan. The Netherlands: European Space Agency; 2005 Aug.
27. LoPresti EF, Simpson RC, Kirsch NL, Schreckenghost D. Distributed cognitive aid with scheduling and interactive task guidance. United States patent pending, application 60/663664. 2005 Mar 21.
Submitted for publication June 1, 2007. Accepted in revised form October 12, 2007.
