Evidence for the interpretation of Social, Academic, and Emotional Behavior Risk Screener (SAEBRS) scores: An argument-based approach to screener validation

Stephen P. Kilgus, Wes E. Bonifay, Nathaniel P. von der Embse, Amanda N. Allen, Katie R. Eklund

Research output: Contribution to journal › Article

Abstract

In accordance with an argument-based approach to validation, the purpose of the current study was to yield evidence regarding the interpretation of Social, Academic, and Emotional Behavior Risk Screener (SAEBRS) scores. Bifactor item response theory (IRT) analyses were performed to examine SAEBRS item functioning. Structural equation modeling (SEM) was used to simultaneously evaluate intra- and inter-scale relationships, expressed through (a) a measurement model specifying a bifactor structure for SAEBRS items, and (b) a structural model specifying convergent and discriminant relations with an outcome measure (i.e., the Behavioral and Emotional Screening System [BESS]). Finally, hierarchical omega coefficients were calculated to evaluate the model-based internal reliability of each SAEBRS scale. IRT analyses supported the adequate fit of the bifactor model, indicating that items adequately discriminated between moderate- and high-risk students. SEM results further supported the fit of the latent bifactor measurement model, which yielded superior fit relative to alternative models (i.e., unidimensional and correlated-factors models). SEM analyses also indicated that the latent SAEBRS-Total Behavior factor was a statistically significant predictor of all BESS subscales, the SAEBRS-Academic Behavior factor predicted the BESS Adaptive Skills subscale, and the SAEBRS-Emotional Behavior factor predicted the BESS Internalizing Problems subscale. Hierarchical omega coefficients indicated that the SAEBRS-Total Behavior factor was associated with adequate reliability. In contrast, after accounting for the total scale, each of the SAEBRS subscales was associated with somewhat limited reliability, suggesting that variability in these scores is largely driven by the Total Behavior scale. Implications for practice and future research are discussed.
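The reliability conclusions above rest on the bifactor decomposition of score variance. As a point of reference, the following is a minimal sketch, in standard LaTeX notation rather than reproduced from the article, of how hierarchical omega is conventionally defined for a bifactor model with a general factor and orthogonal specific factors; here \lambda_{iG} denotes item i's loading on the general factor, \lambda_{is} its loading on specific factor s, and \theta_i its residual variance:

\omega_H = \frac{\left(\sum_i \lambda_{iG}\right)^2}{\left(\sum_i \lambda_{iG}\right)^2 + \sum_s \left(\sum_{i \in s} \lambda_{is}\right)^2 + \sum_i \theta_i}

\omega_{HS} = \frac{\left(\sum_{i \in s} \lambda_{is}\right)^2}{\left(\sum_{i \in s} \lambda_{iG}\right)^2 + \left(\sum_{i \in s} \lambda_{is}\right)^2 + \sum_{i \in s} \theta_i}

Under this decomposition, a high \omega_H for the total scale paired with low \omega_{HS} values for the subscales, as reported in the abstract, indicates that most of the reliable variance in subscale scores reflects the general Total Behavior factor rather than anything specific to the Social, Academic, or Emotional Behavior domains.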

Original language: English (US)
Pages (from-to): 129-141
Number of pages: 13
Journal: Journal of School Psychology
Volume: 68
DOI: 10.1016/j.jsp.2018.03.002
ISSN: 0022-4405
Publisher: Elsevier BV
State: Published - Jun 1 2018
Externally published: Yes

Keywords

  • Behavior assessment
  • Item response theory
  • Structural equation modeling
  • Universal screening

ASJC Scopus subject areas

  • Education
  • Developmental and Educational Psychology

Cite this

Evidence for the interpretation of Social, Academic, and Emotional Behavior Risk Screener (SAEBRS) scores: An argument-based approach to screener validation. / Kilgus, Stephen P.; Bonifay, Wes E.; von der Embse, Nathaniel P.; Allen, Amanda N.; Eklund, Katie R.

In: Journal of School Psychology, Vol. 68, 01.06.2018, p. 129-141.
