Current Research on Staff Performance: From Initial Training to Maintenance
Sunday, May 27, 2012
3:30 PM–4:50 PM
603 (Convention Center)
Area: OBM/AUT; Domain: Applied Research
Chair: Michele D. Wallace (California State University, Los Angeles)
Discussant: Keith D. Allen (Munroe-Meyer Institute)
CE Instructor: Michele D. Wallace, Ph.D.
Abstract: Training staff to be effective and lasting behavior change agents is a critical but often-overlooked endeavor. Research has demonstrated that multiple-component training programs can be effective in developing staff performance; however, most of these programs depend upon the use of instructions, modeling, role-play, and feedback. The first paper, entitled "Staff Training on Naturalistic Teaching Strategies: A Component Analysis," specifically addresses the necessity of various training components (i.e., feedback and modeling). Recently, an alternative approach to staff training has emerged in the form of a video scoring procedure. The second paper, entitled "Using Video Scoring to Train Staff to Implement 3-Step Guided Compliance, Differential Reinforcement and Extinction," evaluates the effects of including video scoring as a component in staff training. Most research on staff performance addresses initial training; the third paper, entitled "The Impact of Collecting Performance Feedback Data on the Treatment Integrity of the Data Collector," examines how performance feedback can increase maintenance of skills in implementing a specific procedure. Finally, Keith Allen will provide an insightful discussion of the three papers as well as the general area of staff training.
Keyword(s): Component Analysis, Staff Training, Treatment Integrity

Staff Training on Naturalistic Teaching Strategies: A Component Analysis
ROBYN LEE (Autism Behavior Intervention, Inc.), Christine Soliva (California State University, Los Angeles), Michele D. Wallace (California State University, Los Angeles), Marla Saltzman (Autism Behavior Intervention, Inc.)
Abstract: A number of procedures are generally utilized to train staff to implement behavioral procedures, and naturalistic teaching strategies (NTS) are no exception. Given the importance of presenting a high number of learning trials when working with children with autism, we conducted a component analysis to examine the effects of modeling and feedback on staff members' ability to utilize NTS when working with these children. Specifically, we measured the rate at which a skills trainer presented teaching opportunities to the child within the context of play following various staff training components. Four skills trainers who needed additional training in NTS were divided into 2 groups. One group received instructions and modeling first and, if they did not meet criteria, feedback was added. The other group received instructions and feedback first and, if they did not meet criteria, modeling was added. Initial results indicate that although the skills trainers improved with either instructions plus feedback or instructions plus modeling, they did not meet competency until they received all 3 components (instructions plus modeling plus feedback or instructions plus feedback plus modeling). Implications and future research regarding staff training will be discussed.

Using Video Scoring to Train Staff to Implement 3-Step Guided Compliance, Differential Reinforcement and Extinction
WING YAN LAM (California State University, Los Angeles), Daniel B. Shabani (California State University, Los Angeles)
Abstract: Staff training methodologies such as workshops, modeling, and feedback have been investigated extensively in the research literature. In recent years, video scoring has been proposed as an alternative training methodology in the occupational safety arena (e.g., Alvero & Austin, 2004; Nielsen, Sigurdsson, & Austin, 2009), and preliminary data suggested that it was effective. In video scoring, participants are required to observe and then score a video. Scoring is completed across a number of dimensions and usually includes scoring whether or not the model implemented the procedures correctly. Video scoring differs from training procedures based on modeling and feedback, since in the latter procedures trainees are only required to observe the model. The added active scoring component is thought to promote better attention to the model and thereby better learning. In the current study, the effectiveness of workshops, video modeling, and video scoring plus scoring feedback was evaluated in a multiple-baseline design. Staff in a classroom serving students with special needs and behavior problems were trained to implement 3 commonly used behavior management strategies: (1) 3-step guided compliance, (2) differential reinforcement of alternative behavior, and (3) extinction. Results indicated that workshops led to a moderate improvement in performance, while video modeling resulted in further improvements; however, these improvements were not significant enough to reach the mastery criterion. The value of using video scoring plus scoring feedback to teach multicomponent behavior management strategies to paraprofessional staff will be discussed. In addition, suggestions regarding the effective use of video scoring as a staff training methodology will be provided.

The Impact of Collecting Performance Feedback Data on the Treatment Integrity of the Data Collector
MONICA HOWARD (Munroe-Meyer Institute), Raymond V. Burke (The Prevention Group), Janie Peterson (Behaven Kids), Roger Peterson (Behaven Kids), Jessica Wachtler (Behaven Kids), Keith D. Allen (Munroe-Meyer Institute)
Abstract: Treatment integrity is a common concern in treatment settings. Surprisingly, research has shown experienced supervisors to be as vulnerable to treatment drift as novice staff. Performance feedback has repeatedly been demonstrated to be an effective way to respond to treatment drift and increase levels of treatment integrity; however, studies have primarily focused on a target subject without examining ancillary effects on the data collector. This study used a multiple-baseline design to evaluate the extent to which supervisor treatment integrity improved as a function of collecting performance feedback data on a staff member. Participants were supervisors in a day treatment center for young children with behavior problems and were responsible for delivering services to children in addition to managing staff. Data confirm that supervisor treatment integrity is susceptible to drift and suggest that integrity increases significantly when the supervisor is instructed to collect data on a staff member. Implications are discussed.