Association for Behavior Analysis International

The Association for Behavior Analysis International® (ABAI) is a nonprofit membership organization with the mission to contribute to the well-being of society by developing, enhancing, and supporting the growth and vitality of the science of behavior analysis through research, education, and practice.

34th Annual Convention; Chicago, IL; 2008

Event Details



Symposium #175
CE Offered: BACB
Critical Outcome Measures for Education Programs for Students with Autism
Sunday, May 25, 2008
10:30 AM–11:50 AM
Continental B
Area: AUT/DDA; Domain: Applied Research
Chair: Gregory S. MacDuff (Princeton Child Development Institute)
CE Instructor: Dawn B. Townsend, Ph.D.
Abstract:

Systematic replication may be defined as purposefully varying one or more variables across experiments to establish the reliability and generality of results. Although systematic replication may be common with regard to intervention technology, it has not been applied with the same vigor to the design of intervention program systems. This symposium will present annual outcome measures for the Education Program of the Princeton Child Development Institute (PCDI) and data for three systematic replications: the New York Child Learning Institute, the Institute for Educational Achievement, and the Somerset Hills Learning Institute. Measures will include annual reviews of individualized programs, home programming services, consumer evaluations, and staff training and evaluation outcomes.

 
Consumer Evaluation: The Role of Measures of Social Validity in Program-Wide Decision Making.
KEVIN J. BROTHERS (Somerset Hills Learning Institute), Edgar D. Machado (Somerset Hills Learning Institute), Sandra R. Gomes (Somerset Hills Learning Institute)
Abstract: Measures of social validity, that is, consumer evaluations of the significance of a program’s goals, the appropriateness of its procedures, and the importance of its intervention effects, have long been held in high regard by behavior analysts (Wolf, 1978). Such measures enable human-service program developers to learn consumers’ perspectives on various aspects of a program (e.g., its implementation and outcomes) and to respond to consumer feedback systematically rather than reactively. This paper will describe the annual process for obtaining consumer feedback that is in use across four autism intervention agencies. Results of measures used to assess consumer satisfaction among parents, child study teams, Boards of Trustees, and staff members across these four programs, and the usefulness of these measures for decision making, will be discussed.
 
Using Collective Data from Individual Behavior Change Programs to Effectively Evaluate an Entire Education Program.
DAWN B. TOWNSEND (Institute for Educational Achievement)
Abstract: Data on client progress are highly regarded in our field and accepted as an absolute necessity for evaluating behavior change at the individual client level. Such data, however, when aggregated across the total client population of an education program, can also be used to evaluate the effectiveness of that program as a whole. The purpose of the current presentation is to highlight the importance of evaluating the collective data from all individual behavior change programs and their use in critically evaluating the effectiveness and fidelity of educational programs serving individuals with autism. The presenter will define the components of individualized behavior change programs considered important and relevant to the evaluation, discuss measures used to evaluate each component, and share data from the last three years for four educational programs serving learners with autism (i.e., PCDI, NYCLI, IEA, SHLI). These data will demonstrate the value of this measure in evaluating effective treatment at the educational program level and highlight the importance of replicating results over time to substantiate claims related to the delivery of quality services to individuals with autism.
 
Empowering the Parents of Children with Autism.
SUSAN M. VENER (New York Child Learning Institute)
Abstract: The skills children with autism acquire in the classroom often fail to generalize to home and community settings, and skills acquired in the presence of instructors often do not generalize to parents. Similarly, skills acquired at home are unlikely to generalize to the classroom or other community settings. The purpose of this presentation is to discuss some of the ways in which intervention settings can encourage parent participation and increase the likelihood that responses will generalize across settings. This talk will also identify ways to objectively measure whether intervention programs include parents as partners in intervention and successfully program for generalization of skills across people and settings. Given collaborative efforts between instructors and parents, desirable outcomes can be achieved.
 
Using a Data-Based Protocol to Train and Evaluate Behavior Analysts for Autism Intervention.
EDWARD C. FENSKE (Princeton Child Development Institute), Gregory S. MacDuff (Princeton Child Development Institute), Patricia J. Krantz (Princeton Child Development Institute), Lynn E. McClannahan (Princeton Child Development Institute)
Abstract: Applied behavior analysis has been documented to be effective in addressing a wide range of skill deficits and behavior excesses displayed by individuals with autism. The plethora of scientific evidence supporting applied behavior analysis has led many reviewers to label it the treatment of choice. Outcomes for individuals with autism who receive behavioral intervention services vary; some achieve skills within the normal range and placement in mainstream classrooms. Several studies have investigated variables, such as treatment intensity and age at intervention, that may affect treatment outcomes. Some researchers have suggested that the integrity and quality of intervention services deserve investigation. While several publications have specified academic courses of study that provide clinicians with important theoretical knowledge, there is little information on methods for specifying clinical training goals and assessing staff competency. A staff training and evaluation protocol that includes direct measures of staff and student behavior will be presented. Benchmarks for criterion performances have been derived from data collected at PCDI and the three systematic replications.
 
