Exploring the use of network meta-analysis in education: examining the correlation between ORF and text complexity measures


Annals of Dyslexia, Volume 69(3) – Jul 27, 2019

Publisher
Springer Journals
Copyright
Copyright © 2019 by The International Dyslexia Association
Subject
Linguistics; Language and Literature; Psycholinguistics; Education, general; Neurology
ISSN
0736-9387
eISSN
1934-7243
DOI
10.1007/s11881-019-00180-y

Abstract

Calls for empirical investigations of the Common Core State Standards (CCSS) for English Language Arts have been widespread, particularly in the area of text complexity in the primary grades (e.g., Hiebert & Mesmer, Educational Research, 42(1), 44–51, 2013). The CCSS mention that qualitative methods (such as Fountas and Pinnell levels) and quantitative methods (such as Lexiles) can be used to gauge text complexity (CCSS Initiative, 2010). However, researchers have questioned the validity of these tools for several decades (e.g., Hiebert & Pearson, 2010). In an effort to establish the criterion validity of these tools, individual studies have compared how well they correlate with actual student reading performance measures, most commonly reading comprehension and/or oral-reading fluency (ORF). ORF is a key aspect of reading success and, as such, is often used for progress-monitoring purposes. However, to date, studies have not been able to evaluate different text complexity tools and their relation to reading outcomes across studies. This is challenging because the pairwise meta-analytic model cannot synthesize several independent variables that differ both within and across studies. It therefore cannot answer pressing research questions in education, such as: which text complexity tool is most strongly correlated with student ORF (and is thus a good measure of text difficulty)? This question is timely given that the CCSS explicitly mention various text complexity tools, yet the validity of such tools has been repeatedly questioned by researchers. This article provides preliminary evidence to answer that question using an approach borrowed from the field of medicine: Network Meta-Analysis (NMA; Lumley, Statistics in Medicine, 21, 2313–2324, 2002). A systematic search yielded 5 studies using 19 different text complexity tools with ORF as the measured reading outcome. Both frequentist and Bayesian NMAs were conducted to pool the correlations of each text complexity tool with students’ ORF. While the results differed slightly across the two approaches, there is preliminary evidence supporting the hypothesis that text complexity tools that incorporate more fine-grained sub-lexical variables are more strongly correlated with student outcomes. While the results of this example cannot be generalized because of the small number of included studies, the article shows that NMA is a promising new analytic tool for synthesizing educational research.
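
To make the pooling step concrete, below is a minimal sketch in Python of the Fisher-z, inverse-variance machinery that underlies a correlation-based synthesis of this kind. The tool names, correlations, and sample sizes are hypothetical, not data from the paper, and the sketch performs a simple fixed-effect pooling per tool; the article's actual frequentist and Bayesian NMAs additionally model the full network of tools compared within and across studies, which this sketch omits.

import numpy as np

# Hypothetical (tool, correlation with ORF, sample size) records.
# Illustrative values only -- NOT data from the paper.
records = [
    ("Lexile",            0.55, 60),
    ("Lexile",            0.48, 45),
    ("Fountas & Pinnell", 0.40, 60),
    ("Fountas & Pinnell", 0.35, 80),
]

def pooled_correlation(rows):
    # The Fisher z-transform makes sample correlations approximately
    # normal, with Var(z) ~= 1 / (n - 3).
    z = np.arctanh([r for _, r, _ in rows])
    w = np.array([n - 3.0 for _, _, n in rows])  # inverse variances
    z_bar = np.sum(w * z) / np.sum(w)            # fixed-effect pooled z
    se = 1.0 / np.sqrt(np.sum(w))
    lo, hi = np.tanh(z_bar - 1.96 * se), np.tanh(z_bar + 1.96 * se)
    return np.tanh(z_bar), (lo, hi)              # back-transform to r

for tool in sorted({t for t, _, _ in records}):
    r, (lo, hi) = pooled_correlation([rec for rec in records if rec[0] == tool])
    print(f"{tool}: pooled r = {r:.2f} (95% CI {lo:.2f} to {hi:.2f})")

In practice, a synthesis like the one reported here would be fit with dedicated software (for example, the netmeta package in R for a frequentist NMA, or a Bayesian hierarchical model in Stan or JAGS) rather than hand-rolled pooling.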

Journal

Annals of Dyslexia, Springer Journals

Published: Jul 27, 2019
