Journal of Rehabilitation Research & Development (JRRD)

Volume 49, Number 10, 2012
Pages 1547–1556

Feasibility of computerized brain plasticity-based cognitive training after traumatic brain injury

Matthew S. Lebowitz, AB;* Kristen Dams-O’Connor, PhD; Joshua B. Cantor, PhD

Department of Rehabilitation Medicine, Mount Sinai School of Medicine, New York, NY

Abstract — The present study investigates the feasibility and utility of using a computerized brain plasticity-based cognitive training (BPCT) program as an intervention for community-dwelling individuals with traumatic brain injury (TBI). In a pre-post pilot study, 10 individuals with mild to severe TBI who were 6 mo to 22 yr postinjury were asked to use a computerized BPCT intervention—designed to improve cognitive functioning through a graduated series of structured exercises—at their homes in an urban community. Outcome measures included objective neuropsychological and self-report measures of cognitive functioning. All participants were able to use the software in their homes. Some mild fatigue was reported, which tended to dissipate over time. Few technical difficulties were reported. Remote support was sufficient to address the technical assistance that was needed. Participants reported subjective improvement in cognitive functioning, and small to large effect sizes were observed on self-report and neuropsychological measures. We conclude that BPCT may be a viable intervention for TBI outpatients as an adjunct to comprehensive neurorehabilitation. The intervention can be delivered in patients’ homes with support provided remotely. Results of this study demonstrate the potential for treatment-related improvements many years after injury. Further study in controlled trials is warranted.

Key words: attention, brain injury, cognitive symptoms, computer-assisted therapy, feasibility, human information processing, neuronal plasticity, neuropsychology, rehabilitation, self-report.

Abbreviations: ANAM4 = Automated Neuropsychological Assessment Metrics Version 4, BPCT = brain plasticity-based cognitive training, CFQ = Cognitive Failures Questionnaire, FrSBe = Frontal Systems Behavior Scale, SD = standard deviation, TBI = traumatic brain injury, UES = user experience survey.
*Address all correspondence to Matthew S. Lebowitz, MS; Department of Psychology, Yale University, 2 Hillhouse Avenue, New Haven, CT 06520.
http://dx.doi.org/10.1682/JRRD.2011.07.0133
INTRODUCTION

Traumatic brain injury (TBI) typically affects a range of cognitive functions, including attention, processing speed, memory, and executive functioning, all of which can affect day-to-day functioning and cognitive efficiency [1]. A growing literature has suggested that rehabilitation interventions can be effective in treating these cognitive symptoms of brain injury [2–4]. The most common approaches to post-TBI cognitive remediation focus on teaching compensatory strategies to minimize the functional effect of cognitive impairments [5]. Another approach to cognitive remediation aims to restore impaired functions through the use of repetitive exercises or massed practice of specific tasks. However, success with this approach has been relatively limited; generalization of training to nontrained domains of cognitive function has not been consistently demonstrated [6].

In the present study, we examined the feasibility of using a widely available software program of computer-based mental exercises (sold commercially by Posit Science Corporation as Cortex with InSight; San Francisco, California) as an intervention for community-dwelling individuals with a history of TBI. This approach to cognitive rehabilitation, known as brain plasticity-based cognitive training (BPCT), is a theory-driven intervention intended as an "exercise program for the brain" [7]. It is administered on a laptop or desktop computer and consists of repeated trials on game-like tasks such as selecting a target stimulus out of an array of distracters or visually tracking an occluded, moving target stimulus. The software is designed such that the speed and complexity of the exercises increase as the user’s performance improves in order to consistently maintain a high proportion of successful trials while stimulating the brain with gradually more demanding tasks. Note that, while the present study only examined the feasibility of one particular software program, it is not intended to endorse any particular brand of software or to compare different software products to each other, but rather to investigate the feasibility of computerized BPCT as a remotely delivered intervention in TBI rehabilitation.
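
To make the adaptive mechanism concrete, this kind of difficulty adjustment resembles a staircase procedure from psychophysics: the task is made slightly harder after each success and slightly easier after each failure, so that accuracy settles at a fixed, high level. The sketch below is illustrative only and is not the vendor's proprietary algorithm; the response model, step sizes, and parameter names are all hypothetical.

```python
# Illustrative sketch of a performance-adaptive ("staircase") training loop.
# This is not the vendor's algorithm; all names, step sizes, and the toy
# response model below are hypothetical.
import random


def simulate_response(duration_ms: float) -> bool:
    """Stand-in psychometric model: longer presentations are easier to detect."""
    p_correct = min(0.98, 0.35 + duration_ms / 600.0)
    return random.random() < p_correct


def run_adaptive_block(n_trials: int = 60,
                       duration_ms: float = 500.0,
                       min_duration_ms: float = 30.0,
                       step_down: float = 0.95,   # correct -> shorten stimulus (harder)
                       step_up: float = 1.35):    # error -> lengthen stimulus (easier)
    """Adjust stimulus duration trial by trial.

    With these asymmetric multiplicative steps, the loop settles where the
    expected log-change in duration is zero, i.e., at a success rate of
    roughly 85 percent, mirroring the goal of keeping most trials successful
    while difficulty tracks the user's performance.
    """
    history = []
    for _ in range(n_trials):
        correct = simulate_response(duration_ms)
        history.append((duration_ms, correct))
        duration_ms = max(min_duration_ms,
                          duration_ms * (step_down if correct else step_up))
    return history


if __name__ == "__main__":
    block = run_adaptive_block()
    accuracy = sum(correct for _, correct in block) / len(block)
    print(f"final duration: {block[-1][0]:.0f} ms, block accuracy: {accuracy:.0%}")
```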

As evidence has accumulated that human brains retain plasticity throughout the lifespan, conceptions of the brain as fixed and unchanging have given way to a consensus view of the brain—and the cerebral cortex especially—as highly receptive to the transformative influence of experience and environment [8]. BPCT is based on theory and research suggesting that structured learning and intensive, repetitive practice of increasingly demanding sensory tasks with appropriate behavioral reinforcement can bring about changes in cortical representations of incoming sensory information, such that these neural representations will more accurately and precisely reflect the stimuli being perceived [7]. Cognitive domains including memory, information processing, and attention have been hypothesized to depend on the quality of such sensory inputs [6]. Thus, by improving the quality of sensory information in the brain, BPCT aims to improve areas of cognitive function that are among those principally disrupted in TBI.

In recent years, studies using brain imaging technologies have suggested that neuroplasticity and the brain’s responsiveness to training and experience can play an important role in recovery from neurological injuries and illnesses [9]. Some evidence suggests that plasticity may be heightened after TBI and that the environment and experiences to which the brain is exposed after an injury can effect neurological recovery [10–12]. Moreover, as BPCT progresses, the stimuli used in training tasks more closely approximate real-world input in order to facilitate generalization to everyday functions. Existing evidence suggests that computerized delivery of cognitive rehabilitation to individuals with TBI-related cognitive impairments is feasible [13]; a particular benefit of the BPCT approach examined here is that the training program is self-contained and does not require direct participation of a therapist. The current study investigates whether such an intervention could be implemented remotely, delivered in the homes of individuals with TBI.

Mounting empirical support already exists for the benefits of BPCT from research using the same software as the present study and earlier versions made by the same manufacturer and featuring the same approach to cognitive training. Two of the largest studies of computer-based cognitive training, the IMPACT Study (Improvement in Memory with Plasticity-Based Adaptive Cognitive Training) and the ACTIVE Study (Advanced Cognitive Training for the Independent and Vital Elderly), demonstrated that adults above age 65 who completed as little as 10 h of BPCT performed better on cognitive tasks than active controls (i.e., mnemonic memory training or serial reasoning training) or no-contact controls (i.e., individuals undergoing assessments only and no intervention) [6,14–15]. Follow-up data collected 5 yr after BPCT training demonstrated significant protection against the decline in health-related quality of life that was found among controls [15]. Other studies in older adults demonstrate that individuals who train with BPCT, as compared with both active (i.e., educational DVDs and quizzes administered via computer) and no-contact control groups, demonstrate strong generalization of training improvements to untrained, standardized neuropsychological assessments [7,16].

One recent study found that older adults who had completed a course of BPCT experienced measurable improvements in visual working memory, which corresponded with significant changes in the amplitude of neural responses to stimuli in the visual cortices as measured by electroencephalography [17]. Moreover, two studies have found that cognitively normal older adults who completed 10 h of BPCT had superior performance on timed instrumental activities of daily living than active (i.e., computerized cognitive tasks) or no-contact controls. These findings support the generalization of BPCT improvements to tasks of daily living [18–19].

Other studies with BPCT have demonstrated improved cognitive functioning in clinical populations with known cognitive dysfunction. Adults with mild cognitive impairment demonstrated improved memory performance after 40 h of BPCT training, as compared with individuals in an active (i.e., computerized cognitive tasks) control group whose performance declined at posttest [20]. Among individuals with schizophrenia, BPCT has been shown to improve performance on a variety of cognitive outcome measures [21–23]. Another study found that BPCT, when delivered along with vocational training and supported-employment interventions for individuals with schizophrenia, had a synergistic effect on employment outcomes. This combined intervention yielded improvements in employment variables (work performance, hours worked, percent employed) that exceeded those seen by individuals who received only employment interventions; these gains persisted well beyond the training period [24]. Though the profiles of cognitive deficits in the clinical populations that have benefitted from BPCT are not identical to that of individuals with TBI, the fact that BPCT’s effectiveness has been empirically supported in clinical populations with neurological deficits in attention, memory, and information processing may be seen as enhancing the promise of BPCT as a potential cognitive intervention for TBI.

Because of the range of cognitive impairments that are common following TBI, it is important to establish whether people with TBI are able to use the intervention prior to the initiation of larger trials examining BPCT’s efficacy. Thus, the current study explored whether the software is usable and well tolerated in a sample of community-dwelling individuals with TBI. The study was also designed to identify barriers to use of BPCT by individuals with TBI and determine whether or not any adverse events would result from it so that any such impediments might be addressed in future studies.

The present research is a pre-post preliminary feasibility study without a control group and was not designed to assess the intervention’s effectiveness. Indeed, the study’s design prevents its results from being taken as a demonstration of efficacy. However, effect-size estimates were calculated on both neuropsychological and self-report measures for potential use in the development of future studies aimed at examining the effectiveness of BPCT as an intervention for cognitive deficits associated with TBI.

METHODS
Measures

The primary study outcomes were independent use of the software and user feedback about the intervention. In addition, measures of basic neuropsychological function and self-report measures of real-world functioning were administered.

At the end of their participation in the study, participants completed a user experience survey (UES), developed by Posit Science Corporation to solicit qualitative feedback from users of the software, in which they described their experiences with the program. The nine questions on this survey (e.g., "How fatigued did you get as a result of training?," "Were some exercises more fatiguing than others?," "Do you think the program was helpful to you in any way, and if so, how?") allowed for open-ended responses, with each question followed by two to four blank lines for participants to write their answers. The survey asked participants whether they experienced any difficulties with the computer or mouse they used during their training in order to ascertain the extent to which technical issues impeded their use of the software. It also asked how many days per week they trained (and for how long each day) on average to assess their compliance with the intervention. Other questions on the survey aimed to gauge fatigue and other difficulties participants experienced with the exercises themselves and to provide insight into participants’ overall thoughts about the program (whether they felt it helped them, what they might change about it, etc.).

Before and after training with the software, participants’ neuropsychological functioning was assessed using the TBI battery of the Automated Neuropsychological Assessment Metrics Version 4 (ANAM4), a validated, computerized neuropsychological battery that takes about 30 min to administer [25]. The ANAM4 consists of five computerized tests of cognitive functioning: simple reaction time (a measure of processing speed and efficiency), mathematical processing (a measure of working memory), procedural reaction time (a measure of attention and concentration), code substitution (a measure of encoding and memory), and matching to sample (a measure of spatial processing and visuospatial working memory). The simple reaction time subtest is administered twice and code substitution is repeated as a measure of delayed memory, resulting in a total of seven subtest scores for each participant.

The Cognitive Failures Questionnaire (CFQ) [26] is a 25-item self-report inventory with items regarding difficulties in several cognitive domains (memory, perception, and motor function) over the preceding 6 mo (e.g., "Do you bump into people?," "Do you find you forget appointments?"). A 5-point Likert-type scale (0 = never, 4 = always) is used for responses. Scores range from 0 to 100. Its reliability [27] and validity are well documented [28–29].
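
Since the CFQ total is simply the sum of its 25 item responses, its scoring can be reproduced in a few lines. The following is a minimal sketch, assuming responses are already coded 0 (never) through 4 (always) as described above; it is not the instrument's published scoring software.

```python
def score_cfq(responses):
    """Total a completed CFQ: 25 items, each coded 0 (never) to 4 (always).

    Returns a value from 0 to 100; higher totals indicate more frequent
    self-reported cognitive failures.
    """
    if len(responses) != 25:
        raise ValueError("The CFQ has exactly 25 items")
    if any(r not in (0, 1, 2, 3, 4) for r in responses):
        raise ValueError("Each response must be an integer from 0 to 4")
    return sum(responses)


# Example: answering 2 to every item yields the scale midpoint.
print(score_cfq([2] * 25))  # 50
```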

The self-report version of the Frontal Systems Behavior Scale (FrSBe) is a reliable, validated 46-item measure that assesses behavioral symptoms resulting from frontal lobe and/or frontal systems injury using 5-point Likert scale items [30]. In addition to a total score, three subscales can be generated (Apathy, Disinhibition, and Executive Dysfunction), allowing for richer qualitative and clinical information. This test was used to establish subjective levels of executive impairment.

Participants

Subjects were 10 individuals with mild to severe TBI living in an urban community. In order to participate in the study, potential participants had to meet the following inclusion criteria: have a diagnosis of mild to severe TBI; be at least 18 yr old; be at least 6 mo postinjury; and have, in the judgment of the research team, adequate motor ability to use a computer and mouse. Two additional participants consented to participate in the study but were not included because they chose not to complete the intervention after realizing that their home computers did not meet the software’s specifications. The remaining 10 participants were 6 mo to 22 yr postinjury (mean = 9 yr, 3 mo; standard deviation [SD] = 8 yr, 4 mo). Nine of the 10 were female, and the mean participant age was 46.3 yr (SD = 16.6). Three were injured in motor vehicle crashes, four were injured as pedestrians struck by automobiles or motorcycles, one was injured in a fall, and two had injuries with other etiologies. Based on self-report information on injury characteristics, we classified severity of injuries using criteria set forth by the American Congress of Rehabilitation Medicine [31]. Five participants had mild injuries, two had moderate injuries, and two had severe injuries. One individual was unable to provide sufficient information to allow for injury severity classification. The sample was racially diverse: four participants were white, two were black, two were South Asian, one was East Asian, and one was Hispanic. Participants had completed a mean of 19.0 yr of education (SD = 3.6).

Procedures

Potential participants were drawn from a pool of individuals who had participated in previous studies at the research center or who had been former clinical patients of neuropsychologists on the research team (all of whom had agreed to be contacted about future research). After being contacted about the study, potential participants who expressed interest were seen at the research center and informed of further details regarding participation, including study procedures, possible risks and benefits of study participation, participant confidentiality, the voluntary nature of the research, and the participant’s right to withdraw from the study at any time. Participants’ capacity to give informed consent was assessed using the Aid to Capacity Evaluation [32]. Only individuals deemed to have capacity to give informed consent were enrolled in the study.

After consenting to participation, subjects completed the preintervention study measures and were given the software to use at home with instructions to use it for 40 min/d, 5 d/wk for 6 wk (approximately 20 h of training in total). They received weekly telephone calls from a research assistant to remind them to continue with their training and to report any problems or complications they had encountered. The research assistant also sent daily emails (Monday through Friday) reminding participants to complete their training, except in the case of two participants who opted out of receiving such emails at the time of consenting. After 6 wk, participants were scheduled for an appointment to allow for collection of postintervention data.

All participants installed and used the study software on their home computers, with the exception of one participant who borrowed a laptop from the research center to use at home for the duration of her participation. Participants were informed that they could call the research center with any study-related questions and were also given the toll-free customer service and technical support telephone number of the software manufacturer to use in the case of technical problems with the software. Participants trained on the software at home without supervision. Study personnel maintained contact with the manufacturer’s technical support team and research staff for the duration of the study, discussing with them any technical difficulties encountered by study participants. All participants were permitted to keep the software after the study ended.

RESULTS
User Experiences

Though some participants required varying amounts of assistance from study personnel or the software manufacturer’s technical support staff, all 10 participants who completed the study were able to install and use the software in their homes without any in-person supervision. On the UES, a majority of participants (7 of 10) reported little or no difficulty using the software.

Most participants (8 of 10) reported using the software with at least the requested frequency of 40 min/d, 5 d/wk, on average. All participants reported using it for 40 min/d at least 3 d/wk. We obtained participants’ self-reports of time spent completing the intervention because independent documentation of their usage time was not reliably available. A research assistant was able to view logs of training time completed by participants whose computers were connected to the Internet at the time of training, which allowed for targeted follow-up with those who were falling behind in their intervention compliance. Some participants were not connected to the Internet when they completed the training or were only connected during some of their training time, precluding full examination of objective usage data for all participants.

Though most participants (8 of 10) reported that the program caused some fatigue, their responses on the UES indicated that this tended to be relatively mild and dissipated over the course of participation (e.g., "as the weeks went by it was less taxing"). Two participants reported that using the software caused headaches, and one reported experiencing eyestrain. None described any side effects severe enough to prevent use of the software altogether.

Most participants (7 of 10) reported noticing real-world benefits after using the BPCT software. Areas of improvement reported on the UES included concentration (e.g., "It is helpful for staying focused on what I am doing"), executive functioning (e.g., "It was tremendously helpful with planning, establishing routines, [and] organizing"), visual processing (e.g., "My visual processing has improved," "[I]t helped me zone in visually in chaotic situations"), memory, processing speed (e.g., "My processing . . . became faster and more accurate"), and cognitive stamina. Table 1 displays demographic information, injury characteristics, and user experience feedback for each participant.


Table 1. Demographic information, injury characteristics, and user experience feedback for each participant.
Neuropsychological Findings

For each of the seven ANAM4 subtests, effect-size estimates (Cohen d) were computed using the mean difference between participants’ preintervention and postintervention "throughput" scores (a measure that combines speed and accuracy of responses). All ANAM4 scores reported are throughput scores. The same effect-size estimates were computed using the differences between participants’ pre- and postintervention scores on the CFQ (and its 4 subscales) and the FrSBe (and its 3 subscales), for a total of 16 effect sizes. All but four of these were greater than 0.2, considered the threshold for a "small" effect. The largest effect sizes were observed for the CFQ total score (d = 0.58) and for a reduction in self-reported daily cognitive "blunders" (d = 1.45).
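
The text does not state which standardizer was used for these pre-post effect sizes. One common convention for single-group pre-post designs divides the mean change by the pooled standard deviation of the pre- and posttest scores; the sketch below illustrates that convention only, and the example scores are invented rather than study data.

```python
# Illustrative only: one common convention for a single-group pre-post
# Cohen d (mean change divided by the pooled SD of pre and post scores).
# The example numbers are invented and are not data from this study.
import statistics


def cohens_d_prepost(pre, post):
    """Return mean(post - pre) divided by the pooled SD of pre and post."""
    if len(pre) != len(post):
        raise ValueError("pre and post scores must be paired (same length)")
    mean_change = statistics.mean(b - a for a, b in zip(pre, post))
    pooled_sd = ((statistics.stdev(pre) ** 2 + statistics.stdev(post) ** 2) / 2) ** 0.5
    return mean_change / pooled_sd


# Hypothetical throughput scores for 10 participants:
pre = [38, 42, 45, 40, 50, 36, 44, 41, 39, 47]
post = [41, 44, 48, 42, 53, 37, 47, 44, 40, 50]
print(f"d = {cohens_d_prepost(pre, post):.2f}")
```

Other choices, such as standardizing by the pretest SD alone or by the SD of the change scores, would yield somewhat different values, which is worth bearing in mind when comparing these estimates across studies.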

Table 2 shows the pre- and postintervention mean scores and SDs for each study measure, along with the estimated effect size of the change from pre- to postintervention.


Table 2. Pre- and postintervention mean scores (SD) and estimated effect sizes for each study measure.
DISCUSSION

The results of this pilot feasibility study suggest that the type of BPCT software studied in the present research can be used by community-dwelling individuals with a history of TBI. BPCT was found to be well tolerated—and in some cases, enjoyed—by the participants. The most frequently reported side effect was fatigue. Participants found it easy to use, suggesting it was not too complex for this population. The intervention has face validity, and the participants’ responses on the UES indicated that they perceived it to be beneficial. Participants reported a perception that aspects of their cognitive functioning improved as a result of using the program, and although throughput scores worsened slightly on one of the ANAM4’s two reaction-time measures, participants’ perception of improvement was generally corroborated by the rest of our neuropsychological measures. This is especially encouraging in light of the fact that all of the participants were at least 6 mo postinjury and several were many years postinjury. Therefore, any benefits that accrued were unlikely to have been due to spontaneous recovery from TBI. Controlled trials would be needed to more definitively test BPCT’s efficacy in individuals with TBI.

Our findings suggest that many individuals with TBI may be able to use BPCT software at home independently or with little guidance, although others may require significant support in the form of reminders, technical assistance, or encouragement. No adverse events occurred in the present study, and most participants reported no significant technical difficulties. Most participants reported fatigue and, more rarely, headaches and eyestrain. The fact that fatigue and other side effects did not prevent any participant from completing the intervention may suggest another advantage of the approach used in this study, especially considering that fatigue is typically cited as one of the main functional impediments faced by individuals with TBI [33].

LIMITATIONS

One limitation of the present study is its use of a convenience sample, which may not have been demographically representative of community-dwelling individuals with TBI. In particular, the sample tended to be well educated and consisted primarily of women, so further research may be needed to determine whether the feasibility documented here generalizes to more demographically diverse populations of individuals with TBI. Also, half of the participants in the present study were classified as having mild injuries, and our sample size is not large enough to warrant differential conclusions about the intervention’s feasibility at different levels of injury severity. However, most TBIs suffered in the United States meet criteria for a mild classification [34], suggesting that mild injuries were not overrepresented in our sample.

Additionally, independent documentation of time spent completing the intervention was not reliably available. Thus, although participants did report using the software consistently and benefitting cognitively from it, their compliance with the intervention cannot be verified with certainty. Future trials examining the efficacy of this intervention will need to ensure accurate documentation of usage. The manufacturer of the software used in the present study has addressed this issue by delivering the entire intervention through a Web portal, thus ensuring accurate monitoring of usage by participants.

CONCLUSIONS

Because attention, memory, and information processing are thought to be among the principal cognitive abilities that can improve with BPCT, this type of intervention might be used in tandem with existing empirically validated approaches to post-TBI cognitive remediation [3–4]. For example, improvements in basic cognitive abilities attained through BPCT could potentially enhance patients’ ability to learn and remember metacognitive strategies aimed at compensating for cognitive deficits or improving self-regulation [35]. It is nonetheless important to recognize that empirically supported cognitive interventions are not widely available or accessible to many individuals with TBI. The fact that individuals can obtain BPCT software independently and that those with TBI seem able to use it unsupervised suggests that its efficacy is worth evaluating as a treatment for individuals who would otherwise not have access to cognitive remediation, such as individuals in underserved rural areas.

The present study was designed to evaluate feasibility rather than efficacy. A more rigorous controlled clinical trial could examine whether or not the intervention leads to significant improvements in cognitive functioning and the extent to which effects of this type of cognitive rehabilitation generalize to everyday function.

Author Contributions:
Acquisition of data: M. S. Lebowitz.
Analysis and interpretation of data: M. S. Lebowitz, J. B. Cantor, K. Dams-O’Connor.
Drafting of manuscript: M. S. Lebowitz, J. B. Cantor, K. Dams-O’Connor.
Critical revision of manuscript for important intellectual content: M. S. Lebowitz, J. B. Cantor, K. Dams-O’Connor.
Study supervision: J. B. Cantor.
Financial Disclosures: The authors have declared that no competing interests exist.
Funding/Support: This material was based on work supported by the National Institute on Disability and Rehabilitation Research, U.S. Department of Education (grant H133P050004), and the Centers for Disease Control and Prevention (grant 1R49CE001171-01). Computer software was provided by Posit Science Corporation, San Francisco, California.
Additional Contributions: The authors thank Lisa Spielman, PhD, for assisting with statistical analyses and Laila Spina, PsyD, for assisting with study conceptualization and technical support during the study. We are also grateful to Tausif Billah for his invaluable assistance in collecting and managing data. Mr. Lebowitz has earned his Master’s of Science since this research was conducted. He is now with the Department of Psychology, Yale University, New Haven, Connecticut.
Institutional Review: The Institutional Review Board at Mount Sinai School of Medicine approved the study reported here.
Participant Follow-Up: The authors do not plan to inform participants of the publication of this study, but participants were informed at the time of participation that they could contact the research center for information about study results and corresponding publications.