Although NASP 2021 will not be in person this year, there are still so many opportunities to connect, learn more about your favorite PAR products and authors, and interact with PAR staff members. Join us during one of the following sessions:
LIVE session! Attendees will receive NASP CPD credit
Wednesday, February 24, 1:30–2:30 p.m.
Teleassessment With Children: Strategies for Success
Presented by Carrie A. Champ Morera, PsyD, NCSP, LP; Terri D. Sisson, EdS; and Dan Lee, BS
On-demand sessions! Attendees can claim CPD self-study credit
A Process Oriented Approach for Identifying and Remediating Reading Disabilities
Presented by Steven G. Feifer, DEd, and Jack A. Naglieri, PhD
The Neuropsychology of Written Language Disorders
Presented by Steven G. Feifer, DEd, author of the FAR, FAM, and FAW
Identifying Students with ADHD: Evidence-Based Assessment with the BRIEF2
Presented by Peter K. Isquith, PhD, coauthor of the BRIEF2
Tests and Scales: Evidence for Integrative Assessment of Executive Function
Helping Students Show What They Know: Enhancing Executive Functioning
Presented by Steven C. Guy, PhD, coauthor of the BRIEF2
State of Affairs: Trauma Assessment Practices in Children and Adolescents
Presented by Carrie A. Champ Morera, PsyD, NCSP, LP, PAR Project Director
Visit the booth!
Enjoy our industry-best customer support at our virtual booth, and stop by to download your coupon for 15% off all orders placed February 23 to March 9.
Register for NASP now! We can’t wait to “see” you online!
Last week, during the National Association of School Psychologists (NASP) Annual Conference in Atlanta, PAR sponsored the Trainers of School Psychology (TSP) poster session. Of 37 submissions to the poster session, three were randomly chosen as winners.
PAR is proud to announce these three posters as winners of the TSP poster contest!
Best practices in enhancing suicidality assessment skills using simulated patients
Stefany Marcus, PsyD, and Alexa Beck, MS, Nova Southeastern University
An empirical study of the perceptions of program accreditation by university program coordinators
Alana Smith, Ashley Carlucci, Dr. Jim Deni, Dr. Elizabeth M. Power, St. Rose University
Teaching psychoeducational assessment: Putting evidence-based practice to work
Sandra Glover Gagnon, Hannah Walker, and Haley Black, Appalachian State University
Congratulations to the winners and thank you to all the participants!
The National Association of School Psychologists (NASP) Annual Convention will be held February 13 to 16 in Chicago, and PAR will be there. If you’ll be attending NASP, we hope you’ll visit the PAR booth and say hello! You can view product samples and even meet some of your favorite authors!
Here’s the schedule of when the authors will be available at our booth.
1-2 p.m.: Peter Isquith, PhD, coauthor of the Behavior Rating Inventory of Executive Function®, Second Edition (BRIEF®2)
2:30-3:30 p.m.: Sandra Chafouleas, PhD, coauthor of DBR Connect™
5:30-6:30 p.m.: Steven Feifer, DEd, author of the Feifer Assessment of Reading™ (FAR™) and Feifer Assessment of Mathematics™ (FAM™)
11:30 a.m.-12:30 p.m.: Steven Feifer, DEd
2-3 p.m.: Peter Isquith, PhD
We also hope you’ll make time to attend these informative sessions presented by our PAR authors.
MS039: DBR Connect: Using Technology to Enhance Screening and Progress Monitoring
Sandra Chafouleas, PhD
SS031: 20 Years of Evidence: Assessing Executive Function With the BRIEF
Peter Isquith, PhD
SS018: PSW Method for SLD Eligibility: Empirical Findings and Case Studies
Steven Feifer, DEd
MS083: The Neuropsychology of Emotional Disorders: A Framework for Effective Interventions
MS165: Advanced Evidence-Based Assessment of Executive Function With the BRIEF2
MS112: Using a Neuropsychological Approach to Identify and Remediate Reading Disorders
MS166: Tests and Scales: Evidence for Integrative Assessment of Executive Function
As always, PAR will be offering special discounts on any purchases made at the PAR booth during NASP. You’ll save 15% plus get free ground shipping on your order!
Hope to see you in Chicago!
The concept of direct behavior rating (DBR) began in the late 1960s with school psychologist Calvin Edlund. He proposed a program in which teachers first explained to students what acceptable behavior was and then rated them at the end of each lesson. Unlike rating scale assessments, which ask teachers and parents to recall a child’s behavior over the previous 30 days or so, direct behavior rating relies on real-time observation.
DBR combines the strength of a rating scale with the benefit of direct observation. Using this system, teachers can not only identify specific behaviors in real time but also rate those behaviors.
From this idea, DBR Connect was created. PAR recently spoke with DBR Connect coauthors Sandra M. Chafouleas, PhD, and T. Chris Riley-Tillman, PhD, to learn more about how this product can help students and teachers succeed.
Q: Direct behavior rating has been around for quite some time. Historically, what changes have taken place to get us to where we are today?
Drs. Chafouleas and Riley-Tillman: Yes, direct behavior ratings were developed from daily behavior report cards, home–school notes, and other tools that educators and parents have used for decades to communicate information about child behavior. We took that rich history of use and worked to standardize the instrumentation and procedures, which allowed for comprehensive evaluation of the psychometric evidence for screening and progress monitoring purposes. DBR Connect is the result of all of that research and development, which supports the conclusion that DBR Connect can provide data that are reliable, valid, and sensitive to change.
Q: How does DBR tie into positive behavioral support and/or multitiered models of delivery of services?
Multitiered models of service delivery and positive behavioral support are founded in prevention—that is, early identification and remediation of difficulties. These frameworks require the use of ongoing data to inform decisions about continuing, modifying, or terminating supports, and DBR Connect functions as an ideal prevention-oriented method for progress monitoring assessment.
Q: You have described DBR Connect as a hybrid tool. What do you mean by that?
DBR Connect offers strengths of both traditional rating scales and systematic direct observation. Like systematic direct observation, a predefined observation period is selected, and repeated assessment allows for comparison of data across assessment periods, as required in progress monitoring. Like rating scales, the instrumentation and procedures are highly efficient because only a brief rating of the defined targets is needed to record data.
Q: You mention in your book that one of the roles of DBR is communication. Can you talk a bit about that?
Yes, DBR has a rich history of use for communication, whether teacher–teacher, teacher–parent, teacher–student, or parent–student. It is easy to understand at all levels and provides a simple format for discussing behavior expectations.
Q: What guided your decision to focus on the three core behavioral competencies that you chose for DBR Connect?
Our research started with a broad review of the literature on school-based behavior expectations—including consideration of indicators of student success and those areas most concerning to educators. We narrowed the literature to items that could be defined in both broad and narrow terms, and then conducted a series of research studies to identify those target behaviors that resulted in the strongest evidence for use. In the end, the core school-based behavioral competencies—that is, those behaviors that every student should display in order to fully access instruction and participate in the school environment—are academically engaged, disruptive, and respectful. That said, we also acknowledge that some situations may call for additional targets; thus, we maintain flexibility by supporting use for any behavior of relevance to a particular context.
Q: Who is the target audience for DBR Connect?
Teachers are the primary users of DBR Connect, meaning they serve as the primary raters and producers of data summaries for decision making. However, all educators (e.g., administrators, school psychologists) can benefit from data reports to inform decision making, and there may be some situations in which other users may serve as appropriate raters (e.g., monitoring of behavior progress during counseling sessions). Remember, an important strength of a DBR data stream is the capacity to share with students and parents to communicate information about behavior.
For more information on DBR Connect or to take a tour, visit http://www.mydbrconnect.com/.