Overall Scores by State

Column

All-student Reading Performance by State and Year

Column

Notes

This figure shows average NAEP 4th grade reading scores by state, plus the District of Columbia (DC) and the Department of Defense Education Activity (DoDEA).

  • Connecticut is highlighted in blue. The national average is in red.

  • Hover over data points for exact averages and ranks (lower rank is better).

Connecticut’s overall performance looks reasonably good, but disaggregating by race or by Title 1 status of schools tells a more troubling story; see the following pages.

Gap: Black/White

Column

Connecticut in Context

Connecticut Gap over Time

Column

Notes

Upper Panel

Connecticut and the National Average are highlighted.

Among the 42 jurisdictions (40 states plus DC and DoDEA) with a calculable black/white achievement gap, Connecticut’s is the 6th largest, 16 rank positions worse than the national average.

States are ranked from smallest gap (leftmost) to largest (rightmost); the rank shown in the tooltip follows the same order, so a lower rank indicates a smaller achievement gap (lower rank is better).

Samples of black students for 10 states (at the far right) were too small to reliably estimate state averages for the group; no gap can be computed for these states.
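
As a minimal sketch of the gap-and-rank calculation just described, the hypothetical code below assumes a long-format table naep_read_g4 with one row per jurisdiction and reporting group, and columns jurisdiction, race, and mean_score; these names are illustrative, not the variables actually used to build this dashboard.

    ## Hypothetical sketch: compute each jurisdiction's black/white gap in
    ## average 4th grade reading score, then rank jurisdictions from smallest
    ## gap (rank 1) to largest. Table and column names are assumptions.
    library(dplyr)

    gaps <- naep_read_g4 %>%
      group_by(jurisdiction) %>%
      summarize(
        gap = if (all(c("White", "Black") %in% race)) {
                mean_score[race == "White"] - mean_score[race == "Black"]
              } else {
                NA_real_  # group average not reported (sample too small)
              },
        .groups = "drop"
      ) %>%
      filter(!is.na(gap)) %>%      # drop jurisdictions with no calculable gap
      mutate(rank = rank(gap))     # lower rank = smaller gap (better)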

Lower Panel

The lower figure shows the magnitude of Connecticut’s achievement gap in NAEP 4th grade reading scores between White and African American students over the most recent 18-year period.

Connecticut’s gap has been relatively stable over that 18-year period, although there may be a slight trend toward improvement.

Gap: Title 1 (≤ 40%)

Column

Connecticut in Context

Connecticut Gap over Time

Column

Notes

Title 1 of the Elementary and Secondary Education Act provides funds to school systems with high percentages of students from low-income families so that these schools can provide extra instructional support to low-achieving children. Specific formulas determine whether a school qualifies for Title 1 funds. These vary somewhat across the four Title 1 funding programs, but all take into account the proportion of low-income students in the school, as well as the governing state’s own funding formula.

In general, a school receiving Title 1 funds is required to focus expenditure of that money on students at greatest risk of academic difficulty. But if low-income students make up more than 40% of a school’s student population, it may use Title 1 funds throughout the school as a whole.

The figures on this page contrast schools that receive no Title 1 funds at all with Title 1 schools where fewer than 40% of students are low-income, and whose Title 1 funds are therefore meant to be used for low-income students alone.
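
As a rough sketch of how that contrast might be drawn in code (a minimal illustration only: the school-level table naep_read_g4_schools, the variable title1_status, and its category labels are assumptions, not the codes used in the underlying NAEP tables):

    ## Hypothetical sketch: keep only the two school groups contrasted on this
    ## page and compute each group's average score by jurisdiction.
    library(dplyr)

    title1_contrast <- naep_read_g4_schools %>%
      filter(title1_status %in% c("No Title 1 funds",
                                  "Title 1, under 40% low-income")) %>%
      group_by(jurisdiction, title1_status) %>%
      summarize(mean_score = mean(score, na.rm = TRUE), .groups = "drop")

    ## The gap for each jurisdiction is then the difference between the two
    ## group means, ranked across jurisdictions as on the previous page.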

By this measure, Connecticut’s gap is the 17th largest, only 5 rank positions worse than the national average.

Aggregation by Title 1 status is not available for 2003.

https://nces.ed.gov/fastfacts/display.asp?id=158

https://www2.ed.gov/programs/titleiparta/index.html

About

Column

About the NAEP

The National Assessment of Educational Progress (NAEP) is a federally mandated project administered by the National Center for Education Statistics (NCES) within the Department of Education’s Institute of Education Sciences. It was initiated in 1969 to serve as a nationally representative assessment of what U.S. students know and can do in various academic subjects, and how that changes over time. The NAEP is designed primarily to provide group-level data on student achievement across subjects.

Periodic reports based on NAEP results are commonly known as The Nation’s Report Card. These reports aggregate data at the state level. The NCES does not release NAEP results for individual students, classrooms, or schools. For each state, NCES does report aggregate NAEP results for various demographic groups, including divisions by gender, socioeconomic status, and race/ethnicity.

Assessments of mathematics, reading, and science take place in odd-numbered calendar years. Test results for these core subjects are collected in grades 4, 8, and 12. Other subjects are assessed less frequently, in fewer grades, and typically in even-numbered years. These subjects include the arts, civics, economics, geography, technology and engineering, and U.S. history.

Column

About the Author

David Braze is a researcher and consultant with a background in linguistics and reading research. He has more than 25 years of experience investigating the cognitive foundations of language, literacy, and educational achievement, including 17 years as a Senior Research Scientist at Haskins Laboratories. His research at Haskins, funded by the National Institutes of Health, emphasized the neurobiology of language and reading and its applications to education. Dr. Braze consults widely in the business, government, and non-profit sectors.

  email:
  website: davebraze.org

Column

About the Software

All data summaries in this dashboard were produced with the R statistical environment, version 4.1.0. The dashboard itself was built with an R Markdown workflow. The following table lists the non-base R packages used in building the dashboard. To see a full citation for a specific package (assuming you have both R and that package installed), call, for example, citation("dplyr") at the R prompt; a short example follows the table.

R packages used in this dashboard.
    package        version       date
    dplyr          1.1.0         2023-01-29
    FDBpub         0.0.1.9999    2022-04-29
    FDButils       0.0.10        2022-01-29
    flexdashboard  0.5.2         2020-06-24
    forcats        0.5.1         2021-01-27
    ggplot2        3.4.1         2023-02-10
    here           1.0.1         2020-12-13
    MetBrewer      0.2.0         2022-03-21
    plotly         4.10.1        2022-11-07
    readxl         1.3.1         2019-03-13
    stringr        1.5.0         2022-12-02
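
For example, to print the full citation for dplyr and check which version is installed:

    ## Both functions ship with base R (the utils package).
    citation("dplyr")          # full citation for the installed dplyr package
    packageVersion("dplyr")    # version of dplyr currently installed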

The code used to build this dashboard and the interactive figures it contains can be found on GitHub at: https://github.com/davebraze/ct-achievement-gap.