TN School Letter Grades Explained

Introduction

Whenever you write something, you’re supposed to have an audience in mind. I don’t have a single one in mind for this blog; it can be for educators or laypeople. I just want people to understand what is going on in education in the state of TN, what the implications are for their school, and how to interpret the letter grades they are seeing for their school.

I will not debate the merits of this letter-grade system here. It’s already done, and I already spoke at the public forum that was offered. In addition, I have signed more than one letter regarding these letter grades, and everything we warned about has already borne fruit in less than 24 hours. The marks are already out here marking. Seeing that happen is what inspired me to explain how these grades work and what they mean.

Disclaimer

I am not going to dumb this down. It isn’t that complex, but it’s more complex than, say, basic sports stats. Laypeople shouldn’t have any problems understanding this, but I just want to ward off any criticism that I’m writing some sort of arcane nonsense. If that’s what it is to you, then that’s just what it is.

Also, to be clear, I am writing this in my capacity as a private citizen, and any opinions here are my own and are not representative of the Campbell County School System or of East Tennessee State University.

Something that needs to be said

  • We weren’t informed that we would be given these grades until this Fall. This is like a teacher telling students that the way their grade is calculated is totally different from what it was all semester.

  • The state has issued accountability protocols with the accountability metrics. We’re used to having these at the beginning of the school year for that year’s data, not after the fact for the previous year’s data.

How were we “graded” before?

Accountability, agreed to in our ESSA plan, looked at several different factors, including Achievement, Growth, Chronic Absenteeism, Graduation Rate, Ready Graduate Rate, and English Language Proficiency Assessment. You can see how this accountability is laid out below (source).

Those are the weights and measures. The final score is calculated by weighting all students at 60% and students in subgroups (Black/Hispanic/Native American, Students with Disabilities, Economically Disadvantaged, English Language Learners) at 40%. This is what a typical federal heat map looks like.

As you can see, this school scored 3.5. That was high enough to be a reward school in years past (any school with a score greater than 3.2 was designated a reward school).

You can see they had an “Absolute” and a “Target” score, and the state takes the higher of the two. The TDOE completely did away with this. This is why many schools that are used to having high accountability scores are receiving low grades this year. The goalposts have moved.

How we are “graded” now:

This is a much simpler formula, and that might seem like a positive thing at first glance, but once you dig into how all of these metrics are determined, you really see how different a system this is.

Let’s look first at how achievement scores are determined by federal accountability.

This means that schools with 45% or better of their students scoring proficient on the state test, or meeting their double AMO (twice the amount set for their Annual Measurable Objective), will receive a score of 4 out of 4 for the achievement category. Let’s look at how the state changed this for its accountability.

They split it into three grade bands and set the elementary and middle school achievement thresholds higher, at 49.5% and 45.4% respectively, while the high school threshold is lower at 40.1%. These are the thresholds to earn maximum points. This raises the question: why didn’t they include the pathway for improvement?

Growth

Growth is measured the same way in both accountability systems; however, growth has been interesting in the state of TN since COVID. Because Shelby County and Davidson County had multiple virtual years, growth has really changed for smaller counties. Growth is a metric based on comparing students to their peers: specifically, it compares how students perform from one year to the next (have fun reading about it here) for grades 3-8, and it compares how students perform against a predicted score. It’s hard to show growth when over a hundred thousand students in the pool had artificially low scores during and right after COVID. Can you still show growth despite this? Yes, but you would have to really outscore the pre-COVID means. This is why, despite showing tremendous gains in Achievement, Campbell County schools are still having trouble showing growth. Look at the improvement Campbell County has had in Achievement over the past three years. It is counterintuitive that this did not also translate into growth the way it would have in pre-COVID years.
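Since the peer-comparison mechanics are easier to see with numbers, here is a toy Python sketch of the idea. This is emphatically not TVAAS itself, and every number in it is made up; the point is only that each student is judged against a baseline derived from the pool, so when the pool shifts, the baseline shifts with it.

import numpy as np

# Toy peer comparison: students are measured against a prediction
# derived from the statewide pool. All numbers are made up.
prior = np.array([310.0, 322.0, 298.0])    # last year's scale scores
actual = np.array([335.0, 330.0, 315.0])   # this year's scale scores

pool_mean_gain = 12.0                      # hypothetical statewide mean gain
predicted = prior + pool_mean_gain         # what the pool "expects" of each student

growth = actual - predicted                # positive = outperformed the pool
print(growth)                              # [13. -4.  5.]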

Subgroups

The following groups of students are recognized as historically underserved student groups:

  • BHN (Black/Hispanic/Native American)

  • ED (Economically Disadvantaged)

  • SWD (Students with Disabilities)

  • ELL (English Language Learners)

For federal accountability, we were held accountable for how these groups of students performed. It is a huge part of our planning. As you can see below, they’re weighted at 40% of federal accountability.

Schools with a high percentage of these students have depended on their performance for a high federal accountability score. Repeating myself here, but it’s 40% of the school’s accountability score.

For state accountability, this completely changed. Now, these students only count for 10% of the accountability score, and it’s the growth score for only the students who score in the bottom 25%. If that sentence is confusing, you’re not alone. Having groups of students go from 40% of your accountability to 10% is shocking enough, but using a metric like a bottom quartile also makes this group of students fluid and not easily identified for intervention.
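To see why the group is fluid, here is a minimal Python sketch (with made-up scores) of how a bottom-quartile group gets recomputed from each year’s distribution rather than being a fixed roster of students:

import pandas as pd

# Made-up scores; the cutoff is recalculated from each year's distribution,
# so the students in the bottom quartile change from year to year.
scores = pd.DataFrame({
    "student": ["A", "B", "C", "D"],
    "scale_score": [280, 305, 330, 355],
})

cutoff = scores["scale_score"].quantile(0.25)
bottom_25 = scores[scores["scale_score"] <= cutoff]
print(bottom_25)   # membership shifts whenever the distribution shifts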

CCR (not the band)

College and Career Ready is a new metric that the state is using to see the percentage of students who are taking advantage of Early Postsecondary Opportunities (EPSOs). This is like the federal metric, Ready Graduate, but it differs slightly, and it’s an easier metric to attain. Here is a chart of how they differ. Also included is a metric for TISA graduates, which is something that triggers additional outcome funding, but it isn’t part of the accountability model.

Conclusion

Hopefully this will help you understand how Letter Grade accountability differs from what schools are used to. Let’s keep in mind that all of this is based on some pretty flawed logic: how a student performs on one test on one day is interesting information for a teacher, but it shouldn’t be used to evaluate a teacher, school, or district.

If you want to read more about accountability, I suggest this blog post.

If you want to read a Director of School’s thoughts on all that’s happening here, please read this.

Creating GPTs and the future of EdTech

ChatGPT Plus users now have the ability to make their own GPTs. Think of these like personalized ChatGPT bots that have specific parameters. For example, the first one I created is a writing tutor for students in TN. It doesn’t take any special coding to build one; you just chat your way through it with ChatGPT. You can upload documents, and that’s what I did: I found the latest writing rubrics for TN and the anchor papers that are provided. Because ChatGPT doesn’t care about formatting, I wrote a Python script to combine all of those dense PDFs into one RTF file, and I uploaded it.
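For anyone curious, the script was nothing fancy. Here is a minimal sketch of the idea, assuming the pypdf library; the folder and file names are hypothetical, and for simplicity this version writes plain text rather than RTF:

from pathlib import Path
from pypdf import PdfReader

pdf_dir = Path("tn_writing_rubrics")        # hypothetical folder of rubric PDFs
out_path = Path("tn_writing_materials.txt") # one combined file for upload

with out_path.open("w", encoding="utf-8") as out:
    for pdf_file in sorted(pdf_dir.glob("*.pdf")):
        reader = PdfReader(pdf_file)
        out.write(f"\n===== {pdf_file.name} =====\n")
        for page in reader.pages:
            # extract_text() can return None on image-only pages
            out.write((page.extract_text() or "") + "\n")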

I know a lot of teachers are going to fear that students will use these programs to cheat, so I told ChatGPT that I didn’t want it to create writing for students, only to give feedback on it. Instantly, I had built the fence I wanted around my GPT.

People like me are going to be building a ton of these, and eventually, we will probably have the opportunity to monetize them like YouTube videos or TikToks. And honestly, that’s my plan. I want to build great resources that are specific to TN, and if I make some extra dough on that, good for me. Honestly, building it is its own reward for me.

You can keep track of my GPTs here: http://jasonhorne.org/gpt

Josh Heupel's Penalty Rates

Despite Josh Heupel’s success as a football coach at both the University of Central Florida (UCF) and the University of Tennessee (UT), and despite perennially gaudy offensive numbers, his teams have been among the most penalized in college football over his six years as a head coach. I will look at the extent of this, but I won’t get into why it is true; someone else can do that.

I took data from https://www.teamrankings.com/ to look at how Josh Heupel’s teams did with penalties at UCF and UT. For comparison, I wanted to see how Gus Malzahn’s UCF teams have done over the past three years, and I also wanted to see how Jeremy Pruitt’s UT teams did in the three years before Heupel became the coach.

Because penalties aren’t the complete picture, I also wanted to look at winning percentage and the number of plays per game. Just looking at six years of UCF and UT, it’s clear that penalties show no obvious relationship to winning percentage over a season.

So why do Josh Heupel’s teams commit so many penalties? His teams’ penalty rankings seem to be inversely related to their plays-per-game rankings. Heupel shows an average ranking of 116.2 (out of 133 teams in most years) in penalties per game (a ranking near the bottom means more penalties per game). Conversely, he shows an average ranking of 20.7 for plays per game, even holding the number 1 ranking in 2020-2021. Because of this, I wanted to see whether there was a relationship between plays per game and penalties per game.

I scraped the penalties-per-game and plays-per-game data from https://www.teamrankings.com/ for all college football teams. I combined it all into one data frame that included the average penalties per game, average plays per game, and the rankings for both. To look at the relationship, I wanted a scatterplot of average penalties per game vs. average plays per game, and I also wanted the Pearson r correlation coefficient.
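Here is a minimal sketch of that workflow. It assumes the stat pages expose their data as HTML tables that pandas can read directly; the URL paths, the shared “Team” column, and the season column names are assumptions for illustration, not verified details of the site:

import pandas as pd
import matplotlib.pyplot as plt
from scipy.stats import pearsonr

pen_url = "https://www.teamrankings.com/college-football/stat/penalties-per-game"
plays_url = "https://www.teamrankings.com/college-football/stat/plays-per-game"

penalties = pd.read_html(pen_url)[0]   # first table on each page
plays = pd.read_html(plays_url)[0]

df = penalties.merge(plays, on="Team", suffixes=("_pen", "_plays"))

# Season-average columns; names assumed for illustration.
r, p = pearsonr(df["2023_pen"], df["2023_plays"])
print(f"r = {r:.2f}, p = {p:.4f}")

df.plot.scatter(x="2023_plays", y="2023_pen",
                xlabel="Avg plays per game", ylabel="Avg penalties per game")
plt.show()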

Scatterplot of penalties vs. plays

The scatterplot shows a weak relationship between average penalties and average plays per game (r = 0.16, p < 0.05). Even though the relationship is weak, it is statistically significant. Ultimately, high penalty counts are a trend with Heupel-coached teams, and it doesn’t appear that he can blame them on the number of plays his teams run per game.

A deeper analysis of Tennessee’s penalties might show where they are happening. Are they happening more frequently on offense or defense? Are certain players or position groups committing these penalties? Against which teams are the most penalties committed: teams like Alabama and Georgia, or is it teams that aren’t as stiff competition?

The evidence suggests that while Heupel's offensive strategy correlates with a high volume of plays, it does not inherently lead to increased penalties, hinting at other factors at play. The nature of the penalties, their timing, and their distribution among players and game situations are dimensions still to be explored. In-depth examination could offer insights into whether the penalties are a byproduct of aggressive play-calling, lack of discipline, or strategic trade-offs deemed acceptable by the coaching staff.

Understanding the subtleties behind these penalties can be crucial for refining practice strategies and in-game decision-making. It can help in developing targeted coaching interventions to mitigate unnecessary losses while maintaining the aggressive edge that characterizes Heupel's approach. For the University of Tennessee, such insights are not just academic; they could be the key to fine-tuning a powerful offensive engine into a more efficient, disciplined unit that capitalizes on its strengths without succumbing to self-inflicted setbacks.

For now, the data provides a starting point for a more nuanced discussion on the interplay between plays and penalties under Heupel's tenure. Further research may illuminate the path to optimizing performance where it matters most — on the field where every play, and every penalty, can alter the course of the game.

Tennessee TN Ready/EOC Rankings and Average Proficiency

I wanted to see where my county ranked annually with other school systems in the state in terms of TN Ready/EOC proficiency. I also wanted to compare our county with other benchmark districts. Click here to access the Looker Studio.

I built this Looker Studio (formerly Google Data Studio) to illustrate this data. While I was at it, I also built some charts to look at proficiency over time.

I pulled the raw data from the state data downloads page here.

I used Python to concatenate the files from 2018 through 2023. I also used Python to clean the data and separate it into two different files (EOC and TN Ready). I then uploaded those files to Google Sheets for my Looker Studio.
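Here is a minimal sketch of that concatenate-clean-split step. The file pattern and the “test” column are assumptions for illustration; the real state files use their own names:

import glob
import pandas as pd

# Hypothetical file and column names; the real state downloads differ.
frames = [pd.read_csv(f) for f in sorted(glob.glob("data/*_assessment.csv"))]
df = pd.concat(frames, ignore_index=True)   # 2018 through 2023 in one frame

# Split End-of-Course (high school) results from TN Ready (grades 3-8).
eoc = df[df["test"] == "EOC"]
tnready = df[df["test"] == "TNReady"]

eoc.to_csv("eoc_combined.csv", index=False)
tnready.to_csv("tnready_combined.csv", index=False)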

Here is the raw TN Ready Data.
Here is the raw EOC Data.

Update: I added a spot for school ranks and average proficiency and included data up to 2022. We still do not have 2023 data for the entire state, and my local data is currently embargoed.

I will update that when it is public.

Building an Artificial Intelligence tool to predict test scores

You can find the tool here.

Using different variables, how reliably can we predict test scores? For this project, the answer was an R-squared of about 0.40; in other words, the model explained roughly 40% of the variance in scores.

Project Explanation

This project is focused on using three different datasets to predict the ScaleScore for students on their Tennessee Comprehensive Assessment Program (TCAP) exam. The data sources used for this project include NWEA MAP Data, Attendance Data, and TCAP data from the years 2020-2021 and 2021-2022. NWEA MAP is a benchmarking test series that provides a snapshot of student progress through a grade level and is also used to predict achievement levels. Attendance data shows how many days a student attended school in a year and includes demographic data that adds additional variables for multiple regression analysis. Finally, the TCAP data includes the ScaleScore for each student. The data files are merged using the student’s state ID as the primary key.

The datasets used for this project include:

  • 2020-2021 Absentee Data: 2021_abs_data.csv

  • 2021-2022 Absentee Data: 2022_abs_data.csv

  • 2020-2021 MAP Test Data: MAP2021W_results.csv

  • 2021-2022 MAP Test Data: MAP2122W_results.csv

  • 2020-2021 TCAP Test Data: TCAP2021_results.csv

  • 2021-2022 TCAP Test Data: TCAP2122_results.csv

This document will cover the following sections:

  1. Project Explanation (this section)

  2. Data Cleaning

  3. Data Visualization

  4. Data Training

  5. Data Predictions

  6. Website Encoding

  7. Reflection

Data Cleaning

When cleaning the data, I opted to do some of it manually in Excel by removing unnecessary columns and creating a new column in the absentee data files for the number of days present. I did this to ensure that any identifying information was removed from the files and to get a more accurate picture of the relationship between attendance and Scale Scores. However, I understand that it's best practice to do all the cleaning in the code, and I did the remaining cleaning in Python.

When I tried to merge the files together, I ran into some challenges because I overcomplicated the process. At first, I thought an outer join would work, then I considered a left join. But both approaches resulted in a lot of NaN values and over 140k rows of data. I eventually realized that I only needed to merge the Attendance, MAP, and TCAP files together after concatenating them, because the year of the test was not important for the final result.

To prepare the data for analysis, I had to make some changes. First, I converted the Course and ContentAreaCode columns from strings into numeric data, coding Math as 1 and English as 2. Next, I removed rows with Science and Social Studies because my analysis was only interested in Math and English scores. After that, I realized that I didn’t need both the Course and ContentAreaCode columns, so I deleted the latter. In the end, the Course column didn’t show a strong correlation with the target variable, so I ultimately excluded it from the analysis.
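Here is a minimal sketch of those two steps: the concatenate-then-merge approach that finally worked, followed by the recoding. The file names come from the dataset list above, but the join key and column names (StateID, Course, ContentAreaCode) are assumptions based on the narrative:

import pandas as pd

# Concatenate the two school years first, then merge the three sources,
# so the year of the test never has to be part of the join key.
attendance = pd.concat([pd.read_csv("2021_abs_data.csv"),
                        pd.read_csv("2022_abs_data.csv")], ignore_index=True)
map_data = pd.concat([pd.read_csv("MAP2021W_results.csv"),
                      pd.read_csv("MAP2122W_results.csv")], ignore_index=True)
tcap = pd.concat([pd.read_csv("TCAP2021_results.csv"),
                  pd.read_csv("TCAP2122_results.csv")], ignore_index=True)

# Inner merge keeps only students present in all three sources
# (the outer/left joins produced the NaN-filled 140k-row frame).
df = tcap.merge(map_data, on="StateID").merge(attendance, on="StateID")

# Keep only Math and English rows, then encode Course numerically.
df = df[df["Course"].isin(["Math", "English"])]
df["Course"] = df["Course"].map({"Math": 1, "English": 2})
df = df.drop(columns=["ContentAreaCode"])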

To clean the data, I deleted more than 100 columns from the original files using Excel. While best practice is to do all of this in code, for this project I only needed one-off files, so it was easier for me to clean the data in Excel than to script it all in Python. Although this method isn’t ideal for automation, it suited my project’s purposes.

Data Visualization

In the visualization stage of the project, I performed a Pearson correlation to determine the correlation between each variable and the TCAP Scale Score, which is the target variable. Based on the results (see below), only a few variables showed any correlation with the Scale Score. These variables were TestDurationMinutes, TestRITScore, EnrolledGrade, ED, SWD, and n_days_present. However, I excluded TestDurationMinutes as it was not logical to use data from the MAP test for predicting the TCAP score. Ultimately, I focused on RIT Score, Grade Level, the number of days present, the economically disadvantaged status, and the student with disabilities status. I chose to work with these variables as they showed the most significant correlation with the Scale Score.

Course                 0.013974
TestDurationMinutes    0.199303
TestRITScore           0.523997
SchoolNumber          -0.048202
EnrolledGrade         -0.125896
TestGrade             -0.126014
ScaleScore             1.000000
school                -0.067635
n_absences            -0.166537
Black                 -0.037727
Hispanic              -0.029100
Native                 0.029105
BHN                   -0.041469
HPI                   -0.012874
Asian                  0.033111
White                  0.026005
ED                    -0.158510
SWD                   -0.316250
EL                    -0.029312
n_days_present         0.172471
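For reference, a listing like the one above comes from a single pandas call; a minimal sketch, assuming the merged frame built earlier is named df:

# Correlation of every numeric column with the target variable.
print(df.corr(numeric_only=True)["ScaleScore"])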

After I selected the variables I wanted to focus on, I created scatterplots to visualize the relationships between the data. I specifically focused on the attendance and MAP data, as grade level, SWD, and ED wouldn't be suitable for a scatterplot. The scatterplot for TestRITScore and ScaleScore displayed a clear linear relationship, while the attendance data was more scattered, with more outliers for ScaleScore than RITScore. These visualizations helped me confirm that my data was suitable for analysis and gave me confidence in my approach.

Scatterplot

For the other data, since ED and SWD are binary (0 or 1) and grade levels are on a 3-8 scale, I chose box plots to visualize them. A 0 means the student isn’t Economically Disadvantaged or a Student with Disabilities.

Economically Disadvantaged BoxPlot

Students with Disabilities BoxPlot

Grade-Level BoxPlot

I also made a heat map that shows how the correlations for each of these variables compare.

Heat Map

Data Training

To prepare the data for machine learning, I needed to split it into two sets: the data training set and the data testing set. With 63,655 rows of data, the training set will be composed of 20% or 12,731 rows, and the testing set will be composed of 80% or 50,924 rows. I plan to experiment with these numbers to see how different splits affect the results of the machine learning model.

During the process of splitting the data into training and testing sets, I needed to ensure that there were no NaN values present in either set. It took me a while to troubleshoot the issue because a particular variable kept showing NaN and Infinite values, and I couldn't locate them for some reason. Eventually, I realized that I had forgotten to remove the NaN values from the x_train and x_pred variables, even though they were not present in other sets. This was a valuable lesson for me to learn in terms of checking all relevant variables for data inconsistencies.
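Here is a minimal sketch of the split and the NaN lesson, assuming scikit-learn and the df built in the earlier sketch; the feature list follows the variables chosen above:

from sklearn.model_selection import train_test_split

features = ["n_days_present", "EnrolledGrade", "TestRITScore", "ED", "SWD"]

# The lesson learned the hard way: drop NaNs before splitting.
df = df.dropna(subset=features + ["ScaleScore"])

X = df[features]
y = df["ScaleScore"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=0.2, random_state=42   # 20% train / 80% test, as above
)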

I ended up running a multiple regression, and here is the data it returned:

Training/Testing: 20%/80%
Coefficients: [0.26169216, -7.37082399, 1.17469616, -2.25918833, -4.18241256]
Intercept: 70.8358039289374
R-squared: 0.39895715901860773
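For reference, here is a minimal sketch of the regression itself, assuming scikit-learn and the split above; it prints the three values reported in the table:

from sklearn.linear_model import LinearRegression

model = LinearRegression().fit(X_train, y_train)
print(model.coef_)                   # one weight per feature, in order
print(model.intercept_)
print(model.score(X_test, y_test))   # R-squared on the held-out 80%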

The coefficients represent the weights assigned to each variable in the multiple regression model that was trained on the 20% data set. In other words, the coefficients indicate the relative importance of each variable in predicting the target variable (TCAP Scale Score).

In this case, the coefficients are for n_days_present, EnrolledGrade, TestRITScore, ED, and SWD in that order. A positive coefficient indicates that the variable has a positive effect on the target variable, while a negative coefficient indicates that the variable has a negative effect on the target variable.

For example, the coefficient for n_days_present is 0.2617, which means that for each additional day a student is present in school, their predicted TCAP Scale Score will increase by 0.2617 points. Similarly, the coefficient for EnrolledGrade is -7.3708, which means that as a student's enrolled grade increases, their predicted TCAP Scale Score will decrease by 7.3708 points.

The intercept value of 70.8358 represents the predicted TCAP Scale Score when all of the predictor variables in the model are equal to zero.

The R-squared value of 0.3990 indicates that the model explains 39.9% of the variance in the target variable, which means that the model is moderately accurate in predicting TCAP Scale Scores based on the selected variables.

Website Encoding

Although I had prior experience with HTML dating back to the late 1990s, I struggled to implement my project on my website using Squarespace due to my limited coding skills. Consequently, I sought assistance from ChatGPT to create a JavaScript feature that could take input from a form and use the coefficients to generate a predicted score. The feature turned out to be a great addition to my website, and it worked seamlessly.
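Under the hood, that JavaScript just evaluates the regression equation. Here is the same arithmetic as a Python sketch, using the coefficients from the table above and a made-up student:

# Coefficients and intercept from the table above, in the order
# n_days_present, EnrolledGrade, TestRITScore, ED, SWD.
coefs = [0.26169216, -7.37082399, 1.17469616, -2.25918833, -4.18241256]
intercept = 70.8358039289374

# Made-up student: 170 days present, grade 6, RIT 215, ED, not SWD.
x = [170, 6, 215, 1, 0]
predicted = intercept + sum(c * v for c, v in zip(coefs, x))
print(round(predicted, 1))   # 321.4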

Reflection

As I reflect upon the completion of this project, I have gained valuable insights in a relatively short span of time about utilizing Python for data analysis and visualization. This experience has been truly enjoyable and has revealed an intriguing parallel that I had never considered before. Just as English is my native language, Microsoft Excel is my primary data language. I tend to visualize all data in Excel sheets in the same way that I perceive foreign languages through the framework of English. When learning Spanish, I contemplate its relation to French and subsequently to English. Even while studying French, despite being proficient enough to think in the language, I occasionally revert to my native tongue during the thought process. This phenomenon is identical to my experience with Excel, which is why I opted to modify my CSV files in Excel prior to working with them in Python. If I had learned Python first, I might have preferred it for handling data. This unexpected mental revelation has left me wondering when I will begin to perceive the world through Python. Experiencing my first Python dream last night was a noteworthy milestone, as dreaming in a foreign language is often an indicator of the formation of appropriate neural pathways in the brain.

I have thoroughly enjoyed this project and eagerly anticipate creating more with an expanded range of variables. This approach can provide substantial insights into the mathematical dynamics of a student cohort, and incorporating additional years and data types will enable us to further train and test the model, ultimately achieving a greater degree of certainty in our predictions. The moderate correlation (r = 0.52) between RIT and TCAP is already promising, indicating the reliability of RIT as a benchmark exam. I am enthusiastic about broadening the scope of this project over time and discovering new possibilities in the realm of school data analysis.

Three Months of ChatGPT Part III: Learning

Of all the uses that ChatGPT has, how it has helped me learn things is my favorite. Currently, I’m working on a second master’s degree in Data Science, and it has proved invaluable in helping me learn.

School Learning

I’m taking a course, CSCI 5260: Artificial Intelligence. This course uses a book called Artificial Intelligence: A Modern Approach. Because the newest edition of this book was published in 2022, ChatGPT doesn’t know it, but it does know the previous edition. Before I do my reading, I ask it to outline the chapter for me with the key points. I use this outline to fill in my notes. I’ll show you what this looks like.

Reading Notes

This really prepares me to read the chapter, and it also gives me a guide for taking notes.

Once my professor posts his PowerPoint, I export it to an RTF file, and I have it make an outline of it. This gives me something to use when I’m taking notes during his lecture. By the time his lecture is finished, I have my notes from the book and notes from him, and I have the ChatGPT chapter summary and PowerPoint summary. I take all of that information and put it back into ChatGPT, and I ask it to make a study guide. I use this study guide to help me take my open-notes quiz. Note that I’m not taking any shortcuts here; I’m only deepening my learning by using ChatGPT to do the part that nobody has time for (making summaries and study guides).

A potential shortcut to this would be to record the audio of the lecture, and use a voice to text program to submit that straight into ChatGPT, but I’m not there yet. Those are some extra steps that could keep me from even having to take notes, but I find that taking notes improves my understanding.

During lectures, if the professor is explaining something that’s out of my depth, I ask ChatGPT to explain it to me. Here is an example from my last class.

Explaining Something

As you can see, I didn’t really understand how these two types of search were related, and ChatGPT explained it to me. So far, I’ve only shown you examples from a graduate-level Computer Science course.

Let’s see how it could help a third-grade student understand fractions.

Explain Equivalent Fractions to a Third Grader

Or if you need help helping them, it can suggest that, too.

Help a Third-Grader

What about high school stuff?

Let’s see how it does with something that nearly everyone has to learn. I’m going to let the whole thing play out.

A freshman reading Romeo and Juliet for the first time could benefit from having ChatGPT within reach rather than just struggling through the text and giving up on it. This is just like CliffsNotes, except you can interact with it, and if it throws a word at you, just ask what it means. Instead of getting a dictionary definition, it will tell you what the word means in context.

Also, you can get loopy and ask it to rewrite the scene as if it’s an episode of Seinfeld or Friends just for fun.

Coding

I’m in two different classes that require me to use Python to do my work. One, CSCI 5010: Programming for Data Analytics, is basically an intro to Python class. I don’t use ChatGPT for help in it much because it’s designed for us to go through several steps to learn how to code with Python. If I used ChatGPT for that, I would never learn the basic code, and it would really stunt my growth in Python. The usefulness of ChatGPT is highly dependent on the user's level of knowledge, as it becomes increasingly powerful for those who are knowledgeable and relatively useless for those who lack knowledge. I am not going to be able to use it for advanced coding if I make it do my simple coding, too. Having given that disclaimer, the coding required in CSCI 5260 is much more complex than the coding in 5010, and I have used ChatGPT several times to explain what’s going on in the code so that I can learn more and understand it. Here is an example:

Analyzing Code

And if your code gets an error message, you can ask it to explain the error message to you, and it will even suggest code to fix it.

ChatGPT Fixing a Problem

It isn’t limited to Python, of course. You can have it write in other languages. This could be very helpful with HTML. For example, if I wanted to write some HTML code to insert into my site that has a LOOK AT THIS HTML in big letters using a cursive font, I could just ask ChatGPT how to do that.

LOOK AT THIS HTML

If we only had this back in the MySpace days.

For the uninitiated, the way everyone has fixed their code for the past two decades is to go to sites like GitHub or Stack Overflow, or to just Google it. That would rarely produce a 1:1 fix for your code; you would have to extrapolate a fix from someone else’s code solving something adjacent. And while that was always a pretty decent learning experience, it just doesn’t compare to having ChatGPT look at your code for you. The more you talk to it about what you’re trying to accomplish, the more precise it gets with your code.

Laws and Policies

I would not use ChatGPT in place of hiring an attorney, but sometimes you just need to understand what a law means. Let’s say you’re watching TV and someone pleads the fifth. Let’s see what ChatGPT says about that.

I plead the fifth

Maybe you want to figure out how that got into the Bill of Rights in the first place.

Still Pleading the Fifth

Let’s dig even deeper. Let’s say you’re a big Hamilton fan and want to see what Hamilton wrote about The Fifth Amendment in the Federalist Papers.

Hamilton Pleads the Fifth

I wonder if there are any exceptions to the Fifth Amendment.

Exceptions to the Fifth Amendment

This is all in one conversation that takes a minute or two to do. And if you check it against the primary source or other sources, you can see that it stands up. I like how it includes quotes from the sources if you ask about it. For example, if you ask about the sources from the Federalist Papers, it’s going to include quotes. I would be cautious about using ChatGPT as a source, and I definitely wouldn’t ask it to tell you how to cite something. I asked it to cite the Federalist Papers in APA format, and here’s what happened. I had to correct it.

APA Oops

Suggestions for Students

  • Understand that this is still in its infancy, but it is already a powerful tool that you need to know how to use.

  • Don’t trust it as a source any more than you would anything else, but it is reliable, especially if you focus it on specific information that you want.

  • Try to get class materials into copyable text so you can put them into it and have it organize and analyze them for you.

  • Tell it the name of your textbook and have it create guides for each chapter to make it easier to take notes on your reading.

  • Have it help you make study guides.

  • Feed your own typed notes into it and have it organize them for you.

  • Have it create a quiz for you and grade your responses. This works well if you give one response at a time.

  • If your professor provides you with a rubric, you can feed it into ChatGPT, and then paste your paper and have it evaluate your paper using the rubric. Ask it to give you feedback. This will be different than a human, but it will still be insightful.

  • Have it help you write emails to your teachers or professors that sound professional.

  • Have it proofread your papers for you. If you have it write the paper for you, it will get caught by an AI detector. Just tell it to show you where your mistakes are and to put them in a table (it’s easier to keep track of them there).

  • Tell it to explain something complex to you like you’re ten years old if you don’t really understand it.

  • If you don’t understand something, tell it what you don’t understand about it.

  • Ask it to explain books to you and why they’re important.

  • Pretend it’s a friend who read the same book as you and talk to it like you’re hanging out discussing the book. If its tone is too impersonal, tell it to pretend it’s your friend.

  • If you are given an open-ended assignment, have it give you ideas on what you could do. (Think sixth-grade science fair.)

  • Have it help you brainstorm during the writing process.

  • Free write ideas and have it organize them for you.

That’s enough homework for now.

Three Months of ChatGPT Part II: Teaching

Generally, I teach two different classes: MEDA 3570 - Educational Technology for Pre-Service Teachers, and ELPA 5400/6400 - Developing Learners and Instructional Leaders for future PreK-12 Administrators. I want to write about how ChatGPT has impacted my teaching this semester.

Planning

In my MEDA 3570 class, I redid my projects with the help of ChatGPT, which not only helped me organize the instructions better than they were but also helped me develop a rubric to grade the projects. I could have done this by myself, but it was good having ChatGPT to think it through.

What was really great is that I had assignments from another class that I fed into ChatGPT and said, “make my assignments more like these.” I have found that ChatGPT is best when it’s improving something, not when it’s creating from nothing. Teachers who want to enhance their existing materials will find it more useful, with fewer obstacles, than those who prefer to create from scratch.

Collecting Qualitative Data

I really like keeping a log of all my feedback to students on an assignment and then having ChatGPT look through that feedback for commonalities. I think of it like the item analysis you would use for an assessment, but instead of looking at who got #1 or #12 wrong, I’m looking at what I’ve said over and over to students. This tool frees me to give feedback in a loose, casual way, because ChatGPT is going to see through that and just tell me what I’ve had to say to students. It’s really worth trying, especially if you’re teaching something mechanical like writing or art that has an element of subjectivity to it.

Getting to Know Your Students as Learners

I also use this as a way to get to know my students better. In my MEDA 3570 class, I always give a beginning-of-the-year survey where I get to know them. As in my previous blog post, I take the open-ended answers and feed them into ChatGPT for analysis and have it make a summary of my class. The insights gained from ChatGPT’s analysis of student responses can be invaluable. Moreover, these analyses can be compared across different classes and semesters, providing a valuable tool for tracking progress and identifying areas of improvement.

In my ELPA 5400/6400 class, part of the program is for students to use Strengthsfinder 2.0 to identify their strengths. It’s always interesting to look at a matrix of the strengths of a class. It gives each student five strengths, so there is a lot of information to process, and it’s a good tool for putting together groups, but I wanted to see how ChatGPT would process them. I didn’t go too deep with it, but I did have it make a table of the strengths and then summarize them. This is what it returned:

“This information can give you a general understanding of the dominant strengths of the class and can inform your teaching and learning activities, such as creating group projects that utilize the strengths of individual students or creating a positive and supportive learning environment that plays to the strengths of the group.

 Based on the strengths data provided, this class seems to be a group of students with diverse skills and abilities. The strengths of the students include a mix of interpersonal skills (such as Relator, Woo, and Connectedness), analytical skills (such as Analytical, Input, and Intellection), leadership skills (such as Achiever, Activator, and Responsibility), and strategic skills (such as Strategic, Ideation, and Futuristic). Additionally, the class includes individuals who are adaptable (Adaptability), positive (Positivity), and self-assured (Self-Assurance).

Overall, this group of students seems to have a good balance of different strengths, which could make for a collaborative and dynamic learning environment. The presence of different skills could also allow for students to work together on projects, leveraging their diverse strengths to achieve their goals.”

While these applications of ChatGPT may not seem groundbreaking, they provide a starting point for its potential in the K-12 space. By inputting data such as learning styles and readiness levels, ChatGPT could be used to create small groups and perform other analyses to save teachers time. The true power of ChatGPT lies not just in what it can do with one simple query and answer, but in what it can accomplish during a lengthy conversation.

The Administrative Part of Teaching

Whether you’re an adjunct, a full professor, or a K-12 teacher, you’re going to get emails or phone calls from upset students, parents, or both. It’s just part of the job. ChatGPT can really help you respond to these in a balanced and professional way.

I’m going to feed this email into ChatGPT, and let’s see how it suggests I respond to it. Here is the email:

“Dr. Horne,

I’m very upset that my son made a 55 in your class. He has done all of his work and he’s a straight-A student. You gave him zeros those tests from when we was at Disneyland for a week and you wouldn’t come to school at 6:30am for him to make them up. You need to give my son and A. He’s a straight A student! I want to know what you’re going to do about this!

Upset Parent”

Sample Email

This type of tool is very helpful because it doesn’t do what a template would do; a template doesn’t react to a specific situation. And honestly, this is one of those emails that, prior to ChatGPT, I would have retyped 20 times until all of my anger and frustration were gone from it.

Coming at this from a different angle, if you’re writing an email and need it proofed or for someone to look it over, ChatGPT can also do that for you more quickly and more precisely than another human can. Also, it keeps you from having to bother a co-worker.

Letters of Recommendation

I encourage you to be careful when using it to write letters of recommendation because they’re all going to end up sounding the same. I’m joking. They all sound the same already. Here is a good process to follow when using it to write letters of recommendation:

  • Input the student’s CV, GPA, ACT/SAT score, etc. Don’t put the student’s name or your name because you don’t want that in the deep recesses of an AI.

  • I would have the student send you their letter of intent or whatever they had to write. Input that as well.

  • Tell ChatGPT to write a letter of recommendation for whatever it is, but leave room for you to add personal comments at the end.

  • This will save you so much time, but it will also give you an opportunity to have something better than your usual letter of recommendation. It will be personalized for the student and include all the information you normally do, but it will include personal comments. How you do that is up to you, but this can help you do it quickly and effectively.

Here is an example. I’m going to input my own CV and tell ChatGPT that I’m writing a letter for a colleague who is applying to be Headmaster of the Van Buren School for Boys.

Sample Letter of Recommendation

I then asked it to recommend the same person for an Associate Professor position at the University of American Samoa. Look at how that letter differs from the first.

Sample Recommendation Letter

Thought Partner

One of my favorite uses of ChatGPT is as a thought partner. This does not replace good colleagues with diverse experiences, but it does provide someone with whom to talk and think when you don’t have a colleague available, and it also provides a different perspective than colleagues, all of whom have their own biases and quirks.

Here is an example:

Many of my students were not following my instructions, and I thought they were very clear. I asked ChatGPT about it, and I pasted in the instructions that I had given, and it gave me suggestions for improving them.

Instructions Suggestions

And in its previous response, it gave some advice on what to think about when students are not following your instructions.

ChatGPT Instructions Advice

I know that isn’t exactly mind-blowing, but again, it’s just a start. It’s a simple thing that every teacher encounters, and it’s a quick and easy way to think through it.

What to try

Here are some things I would definitely try in ChatGPT to see how they work out for you.

  • Feed it your Evaluation rubric scores and see what feedback it gives you.

  • Ask it to help you with an issue you’ve been having in your classroom.

  • Feed your student feedback comments and scores into it and see what it tells you.

  • Get advice on helping a student with a specific disability (be careful not to give it confidential information like a student’s name).

  • Ask it to help you write a lesson on something you don’t enjoy teaching. Ask it to help you find a way to really enjoy teaching it.

  • Ask it to help you write a lesson on something your students really dislike learning. Ask it to help you find a way to get them to really enjoy it.

  • Ask it to analyze your syllabus.

  • Ask it to make suggestions for your students on how to be successful in your class after you’ve had it analyze your syllabus.

  • Input your writing rubric into ChatGPT, then have it analyze student writing samples. See if it scores them any differently than you do. How does its feedback differ from yours?

  • Have it analyze your classroom procedures and rules for clarity and fairness.

  • Have it analyze a dress code for sexism.

  • Have it analyze a student handbook for clarity and bias.

  • Have it write a letter to a parent about a student who is failing your class.

  • Using a student’s CV and other information, have it write a letter of recommendation.

  • Using your CV, have it write a letter of intent for you.

  • Using your CV, have it write a short biography for you.

There is always so much more to say about this tool, but this should be a good starting place for anyone.

Three Months of ChatGPT Part I: Work

Admittedly, I’m not going to use ChatGPT to write this. One reason is that I need as many samples of my own writing as I can get in order to train it to write like me, and the other reason is that I think that would be a little too meta and beyond good taste.

I have been using ChatGPT for three months now, and I want to talk about how it has transformed my work, my education, my teaching, and my life. It really serves several different roles for me, and it’s worth it to document them here.

Work

In my role as a central office administrator, I do a lot of reading and a lot of writing. The writing typically involves communication, interpretation, and extrapolation. I have found that ChatGPT can do a lot of the hard work of my job: comparing a situation to policy, comparing an updated policy to an old one, and pulling key details from a text or an email.

The most powerful use for me so far has been the work it did on the qualitative section of a survey that I sent out. Instead of poring over hundreds of qualitative responses and making meaning of them myself, I had ChatGPT categorize them for me. Of course, I didn’t just blindly copy and paste; everything I do with ChatGPT starts off as a conversation. I call him “Greg” in my head. For this situation in particular, I started off with why I was doing the survey, what the questions were, and what I hoped to learn from it. I didn’t need Greg for the quantitative part, but Greg made the qualitative part happen in a matter of minutes. This saved me hours of work, and he did it more thoroughly than I would have, given the low priority of this work and the time constraints.

Here is what this looks like. First, here is what a typical Microsoft (or Google) Forms spreadsheet looks like:

Microsoft Forms Excel Export

Taking the information from this spreadsheet and turning it into data is easy for the quantitative parts. In fact, Microsoft Forms does that for you anyway. The qualitative parts aren’t quite as easy. In real research, a sophisticated coding process is typically used for this, and I haven’t tried to use ChatGPT for that, but this is a very watered-down version of it. I took all the qualitative responses for an individual question and had Greg look at them.

Here is what happened when I gave this input: “In a survey given to teachers about a new grading scale, they were asked: Have you noticed any positive or negative effects on student motivation since implementing the new grading scale? (open-ended) Please categorize their answers and quantify the results,” and then I pasted the text from the column.

Greg replied, “Categorized answers and their corresponding quantities:

Positive Effects: 10
Negative Effects: 4
Mixed/Neutral Effects: 10
Not Applicable/Not Noticed: 5

The most common type of answer is mixed/neutral effects, with 10 responses falling under this category. Meanwhile, positive effects come in second place with 10 responses, and negative effects come in third with 4 responses. There are also 5 responses indicating that the respondent did not notice any effects or that the question was not applicable to their situation.”

Then I asked, “Can you make me a list of the responses in these categories in a table?” Here is what it gave:

Now, imagine doing this for ten questions like this and then having Greg write a summary. It gave this summary:

“Key Themes in the qualitative responses:

  1. Concern about the long-term effects of the new grading scale: Some respondents expressed concern about the potential long-term consequences of the new grading scale, such as lower expectations for students and a false sense of accomplishment.

  2. Mixed feelings about the benefits of the new grading scale: Some respondents saw the new grading scale as a positive development that gave students more opportunities to succeed, while others were worried that it could lower standards and expectations.

  3. Importance of setting high expectations and holding students accountable: Some respondents emphasized the importance of setting high expectations for students and holding them accountable for their work, even if the grading scale is more lenient.

  4. Differences in opinion about the appropriateness of the new grading scale: There were differing opinions about whether the new grading scale was a good idea or not, with some respondents expressing support and others expressing concern.

  5. Need for clear communication about the grading scale: Some respondents mentioned the importance of clearly communicating the new grading scale to students, parents, and other teachers.

  6. Concern about the impact on college readiness: Some respondents were worried that the new grading scale could give students a false sense of their abilities and preparedness for college.”

This is just one example of many, but this type of analysis work is the real time-saver with Greg. I know a lot of folks use it for email, but I do not like for Greg to write my emails for me, generally, unless they are very low-stakes. I would never send out something heartfelt and personal that was generated by AI. That’s how you wind up in a situation like this. However, some emails that are purely professional with no emotional stakes are perfectly fine for Greg. With high-stakes emails, I like for Greg to look over them for me and to look for bias, unprofessional language, and of course, for any typos. It’s hard to proof your own work, and instead of tying up a colleague to do it, now I have Greg for that (that’s a really poorly-worded sentence, and I wish I hadn’t already boxed myself in by saying I wouldn’t have Greg look at it).

I’ve also noticed when someone “GPTs me” at work. Like, seriously? You had GPT write that to me? It makes me feel some kind of way, but I’m not sure which kind of way yet. I’m mindful of that when I’m using it for email.

Greg has saved me countless hours at work. I started keeping track of everything I used him for, but it became way too much for me to keep up with. This is what I had before I quit keeping track of it.

In my next installment in this series, I’ll talk about how I use it to enhance my teaching as an adjunct.

ChatGPT Discussion Question

This is the discussion question I’m going to post for my undergrad Educational Technology class.

How do you think AI will change education in the future? Play around with ChatGPT. Here are some things to get it to do for you: 

  1. Have it write a lesson plan for something you think you might teach.

    1. Keep chatting with it to refine the lesson plan.

  2. Paste your biography from the introduction page from your website and get it to rewrite it for you.

    1. Do you like its writing better than yours? 

  3. Ask it to write a study guide for the Google Certified Educator Exam. 

  4. Ask it to explain a concept that you plan on teaching and ask it to explain it on the level you plan on teaching (for example: "explain mitochondria as if I were a high school freshman."). 

  5. Ask it to rewrite one of your tool descriptions in instructional tools. 

  6. Ask it to write all of your tools in the Assessment/Management section. 

Let me know how this goes for you. How is this going to change education? Please don't give solely positive or negative views. 

ChatGPT and Education

ChatGPT is a new tool that has the potential to revolutionize the way we approach education. Developed by OpenAI, ChatGPT is a large language model that is trained to generate human-like responses to natural language inputs.

One of the key features of ChatGPT is its ability to understand and respond to context. This means that it can carry on a conversation with a user and provide relevant and coherent responses, even when the conversation is about a complex or unfamiliar topic.

This capability has exciting implications for education. Imagine a student asking ChatGPT a question about a difficult concept in a textbook. Rather than simply providing a pre-written answer, ChatGPT could engage the student in a conversation and provide explanations and examples that are tailored to the student's level of understanding. This could provide a more personalized and engaging learning experience for the student.

Another potential benefit of ChatGPT is its ability to generate unique content. This could be useful in a variety of educational contexts, such as providing personalized writing prompts for students or generating customized practice problems.

Overall, ChatGPT has the potential to change the way we approach education. By providing personalized and engaging learning experiences, it has the potential to improve student outcomes and make learning more effective.

I had ChatGPT write this for me.

Data Primer

Because of some TikToks I had seen where teachers were complaining about how irrelevant looking at last year’s testing was to their teaching, I thought I’d make a Data Primer so teachers could find useful ways to use data and so principals could see which kinds of data are the most helpful to teachers.

This presentation is geared towards teachers in my school system, but it is easily applied to any teaching situation.

Click here to watch the video



The State of Education in Tennessee

I wrote this as part of an application package to get into a learning academy. I am sharing it here for posterity and because I think you’ll find it interesting.

The state of education in Tennessee is the story of a state that has been at the forefront of the education reform movement in the United States, starting with Race to the Top in 2010 (U.S. Department of Education, 2010). “Tennessee has been the fastest improving state since 2011, with students showing more growth over time than students in any other state” (Tennessee Department of Education [TNDOE], n.d.). However, the COVID-19 pandemic has set those gains back, and now most efforts at the state department of education are aimed at undoing the learning loss that students suffered during the ongoing pandemic (Tennessee Department of Education, 2020).

Chief among the reforms adopted as part of Race to the Top were the Common Core standards (Loveless, 2021) that, after two revisions, became the Tennessee Academic Standards (Tennessee Department of Education, n.d.). The adoption of Common Core began a period of parents and others questioning whether the federal government was too involved in education. Because of the backlash against President Obama, Common Core became associated with his presidency and his Secretary of Education, Arne Duncan, and was viewed in a negative light by the president’s political detractors and standardized-testing detractors alike (The Momma Bears, 2015). One of the reasons Tennessee won one of the first Race to the Top grants was its promise to adopt the new standards, and within five years, they were being replaced with slightly different, and differently named, standards (Aldrich, Common Core is out. Tennessee Academic Standards are in. Here’s how teachers are prepping for the change., 2017).

Another victim of shifting political winds was the accountability testing program, the Partnership for Assessment of Readiness for College and Careers (PARCC). Initially, Tennessee was set to adopt PARCC (Wesson, 2013), but it became associated with the Common Core State Standards, and Tennessee legislators quickly distanced themselves from the test (Aldrich, Why Tennessee legislators share blame, too, for TNReady testing headaches, 2018). This resulted in the General Assembly passing legislation requiring the TNDOE to adopt an RFP process and bid out the test. Tennessee cycled through two different companies, Measurement Inc. and Questar, both of which had major issues (Tatter, 2016). After suffering through issues with both, Tennessee finally adopted Pearson as a testing vendor to deliver the TN Ready and TCAP tests (Aldrich, Tennessee chooses testing giant Pearson to take over TNReady, 2019). This has created stability in the testing program, whereas before, confidence was low that the test would ever be delivered or would work online. Having the same test for multiple years has allowed TNDOE to give teachers tools like SchoolNet to analyze student performance in real time during the school year rather than waiting on test results during the summer months.

The other major event shaping the state of education in Tennessee today has been the COVID-19 pandemic. In March of 2020, Governor Lee urged schools to close as quickly as possible between March 20 and March 30, 2020 (Tennessee Office of the Governor, 2020). Most schools did not reopen to in-person students that semester; rather, nearly every school went virtual following plans issued by TNDOE (Tennessee Department of Education, n.d.). The legislature issued a hold harmless to drop TN Ready tests and allow LEAs to shorten their school years (Aldrich, Sweeping Tennessee legislation filed to drop TNReady tests, allow fewer instructional days as schools close due to COVID-19, 2020).

Even though virtual schools have been in operation in Tennessee since 2011 (Potts, 2016), fewer than a dozen LEAs had experience operating them. This meant only a few LEAs were able to deliver quality educational programs to their students through existing Learning Management Systems like Canvas or Blackboard. Other LEAs were essentially experimenting with this learning modality for the remainder of the year, and since everyone, including students, was being held harmless, LEAs encouraged their teachers to innovate.

In the 2020-2021 school year, the state issued re-opening guidance that gave specific requirements for virtual schools, and LEAs had to submit a virtual school plan to the state. Different systems used different models for virtual instruction. Those who had already employed an LMS for their teachers to use with their daily instruction or 1-to-1 programs employed synchronous models where students would log in much like they were attending school in person. Other school systems used programs like Edgenuity and had teachers supplement the computer-based instruction with tutoring and managing the students’ work.

Meanwhile, students who were attending school in person were doing so with many protective measures in place, such as frequent handwashing, social distancing, and staggered schedules. Schools were also contact tracing, which meant quarantining and isolating students for long periods of time in accordance with state guidelines (Tennessee Department of Health, 2020). Students in quarantine or isolation were supposed to receive online instruction at home, but many rural school systems were unable to provide it, and many students lacked the technology at home to access it. These challenges made virtual instruction look like one of the culprits for learning loss, and the TNDOE made it much more difficult for schools to offer both in-person and online learning (Testino & Mangrum, 2021).

Because of the loss of instruction in spring 2020, a disjointed 2020-2021 school year, and varying quality in virtual school models, the data from the 2020-2021 school year showed that students experienced significant learning loss (Tennessee Department of Education, 2020). To equip LEAs with tools to combat learning loss, the Tennessee General Assembly passed the Tennessee Learning Loss Remediation and Student Acceleration Act (Tennessee Code Annotated, 2021). Among the programs the state mandated through this legislation are the following:

- Summer learning camps

- Learning loss bridge camps

- After-school learning mini-camps

Additionally, the General Assembly added a caveat that "any 3rd grade student scoring at 'approaching' or 'below' grade level on the ELA TCAP assessment must receive additional supports and services. This involves the choice of multiple pathways, including summer programming, TN ALL Corps participation, retesting, and/or retention" (Tennessee Department of Education, 2020). This provision has made many worry that large numbers of third graders will be unable to meet the requirements to pass third grade, creating a logjam at that grade level when the law takes effect in 2023-2024 (Aldrich, 2021a).

In addition to the state legislation, the federal government created the Elementary and Secondary School Emergency Relief (ESSER) Fund as part of the CARES Act (Office of Elementary and Secondary Education, n.d.). Congress initially set aside $13.2 billion in a grant fund that LEAs access through their State Education Agency (SEA); subsequent relief bills added to the fund in disbursements commonly known as ESSER I, ESSER II, and ESSER III. According to a U.S. Department of Education (2021) fact sheet, the LEA must "reserve at least 20 percent of funds to address learning loss through the implementation of evidence-based interventions and ensure that those interventions respond to students' social, emotional, and academic needs and address the disproportionate impact of COVID-19 on underrepresented student subgroups". LEAs can spend the remainder of the funds on a wide range of products, services, and other items that help districts improve their ability to educate students during a pandemic.
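
To make the 20 percent set-aside concrete, here is a minimal sketch of the arithmetic; the allocation amount and names below are hypothetical illustrations, not figures for any actual district.

```python
# Minimal sketch of the ESSER learning-loss set-aside arithmetic.
# The allocation figure is hypothetical and purely illustrative.

RESERVE_RATE = 0.20  # "at least 20 percent of funds" per the fact sheet


def minimum_learning_loss_reserve(allocation: float) -> float:
    """Return the minimum amount an LEA must reserve for
    evidence-based learning-loss interventions."""
    return allocation * RESERVE_RATE


allocation = 5_000_000  # hypothetical ESSER award for one LEA
reserve = minimum_learning_loss_reserve(allocation)

print(f"Minimum learning-loss reserve: ${reserve:,.0f}")               # $1,000,000
print(f"Flexible remainder:            ${allocation - reserve:,.0f}")  # $4,000,000
```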

To help LEAs spend their money based on needs, the TNDOE has provided LEAs with planning resources (Tennessee Department of Education, 2021a). Organizations like SCORE have also written papers to help LEAs spend this money wisely (Tang, Price, & Brown, 2021). One suggestion is that districts put the money toward one-time investments so that they are not overextended when it runs out. Despite this warning, many districts have had to hire additional personnel to meet their students' needs.

Hiring personnel is more difficult than it has been in the past. Some reports say that nearly 22% of the state's teaching force is considering leaving the profession (Davis, 2021). In reaction, the state has started a Grow Your Own model and recently announced a Teacher Residency Program partnership between Austin Peay State University and the Clarksville-Montgomery County School System (Tennessee Department of Education, 2022). It is not yet clear whether teachers will continue to resign from education at their recent rate, but some suggest that finding solutions to this crisis will be education's biggest challenge in the upcoming year (Morrison, 2021).

Beyond student achievement data and learning modalities, the TNDOE is also looking to change how schools are funded (Tennessee Department of Education, 2021b). Town halls have been conducted across the state to get the public's input on funding local education agencies. Governor Bill Lee has tasked the TNDOE with seeking public input based on the following investment strategies:

- Prioritize students over systems

- Empower parents to engage in their child's education

- Incentivize student outcomes

- Ensure all students, regardless of location or learning needs, are served at a high level

- Reflect Tennesseans' values

- Create flexible funding that prepares students for postsecondary success (Tennessee Department of Education, 2021b).

This new formula will permanently change the way schools are funded and, in turn, how schools operate. Eighty-nine school systems currently have a lawsuit against the state of Tennessee over the Basic Education Program (BEP) funding formula set to go to trial (Aldrich, 2021b). The BEP has not been overhauled since 1992, and the Comptroller and others have called for an overhaul to reflect new realities (Associated Press, 2011). While the new funding model that will replace the BEP is not yet impacting the state of education in Tennessee, it will almost certainly impact LEAs more than any other change the state has made in decades.

The 2023-2024 school year is shaping up to be one of the most transformational in Tennessee public education because of the culmination of events that year: the state will change to a new funding formula for schools, ESSER money will have run out, and third graders who are not proficient in reading will have to be held back or put through rigorous summer courses in order to reach grade level. Additionally, it is likely that the COVID-19 pandemic will have lessened or come to an end by then. No one can guess what the future holds, but it seems like it will be an interesting time to be in public education in Tennessee. LEAs need to prepare for the end of ESSER and the replacement of the BEP now so that they are in a better position to meet their needs in the future. Developing low-cost, high-impact programs for learning loss and reading intervention will be a key to success.

Works Cited

Aldrich, M. W. (2017, June 26). Common Core is out. Tennessee Academic Standards are in. Here’s how teachers are prepping for the change. Retrieved from Chalkbeat Tennessee: https://tn.chalkbeat.org/2017/6/26/21102827/common-core-is-out-tennessee-academic-standards-are-in-here-s-how-teachers-are-prepping-for-the-chan

Aldrich, M. W. (2018, April 23). Why Tennessee legislators share blame, too, for TNReady testing headaches. Retrieved from Chalkbeat Tennessee: https://tn.chalkbeat.org/2018/4/23/21104918/why-tennessee-legislators-share-blame-too-for-tnready-testing-headaches

Aldrich, M. W. (2019, May 30). Tennessee chooses testing giant Pearson to take over TNReady. Retrieved from Chalkbeat Tennessee: https://tn.chalkbeat.org/2019/5/30/21108233/tennessee-chooses-testing-giant-pearson-to-take-over-tnready

Aldrich, M. W. (2020, March 17). Sweeping Tennessee legislation filed to drop TNReady tests, allow fewer instructional days as schools close due to COVID-19. Retrieved from Chalkbeat Tennessee: https://tn.chalkbeat.org/2020/3/17/21196031/sweeping-tennessee-legislation-filed-to-drop-tnready-tests-allow-fewer-instructional-days-as-schools

Aldrich, M. W. (2021a, January 21). Tennessee strengthens third-grade retention requirements as legislature takes up flurry of education bills in special session. Retrieved from Chalkbeat Tennessee: https://tn.chalkbeat.org/2021/1/21/22243450/tennessee-legislature-strengthens-third-grade-retention-requirements

Aldrich, M. W. (2021b, September 2). Tennessee’s lingering school funding trial is reset before a new court. Retrieved from Chalkbeat Tennessee: https://tn.chalkbeat.org/2021/9/2/22654926/tennessee-school-funding-trial-bep-memphis-nashville

Associated Press. (2011, December 9). Comptroller calls for TN school funding overhaul. Retrieved from Deseret News: https://www.deseret.com/2011/12/9/20237278/comptroller-calls-for-tn-school-funding-overhaul

Davis, C. (2021, December 7). Survey: 22% of Tennessee teachers say they're ready to leave education. Retrieved from News Channel 5 Nashville: https://www.newschannel5.com/news/survey-22-of-tennessee-teachers-say-theyre-ready-to-leave-education

Loveless, T. (2021, March 18). Why Common Core failed. Retrieved from The Brookings Institution: https://www.brookings.edu/blog/brown-center-chalkboard/2021/03/18/why-common-core-failed/

Morrison, N. (2021, December 24). Stopping The Great Teacher Resignation Will Be Education’s Big Challenge For 2022. Retrieved from Forbes: https://www.forbes.com/sites/nickmorrison/2021/12/24/stopping-the-great-teacher-resignation-will-be-educations-big-challenge-for-2022/?sh=495fb1ea157c

Office of Elementary and Secondary Education. (n.d.). Elementary and Secondary School Emergency Relief Fund. Retrieved January 2022, from Office of Elementary and Secondary Education: https://oese.ed.gov/offices/education-stabilization-fund/elementary-secondary-school-emergency-relief-fund/

Potts, K. (2016, March). Virtual Schools in Tennessee. Retrieved from Tennessee Comptroller of the Treasury: https://comptroller.tn.gov/content/dam/cot/orea/advanced-search/2016/2016_OREA_VirtualSchools.pdf

Tang, P., Price, M., & Brown, K. (2021, January 21). How Tennessee School Districts Can Make The Most Of ESSER II COVID Relief Funding. Retrieved from SCORE: https://tnscore.org/how-tennessee-school-districts-can-make-the-most-of-esser-ii-covid-relief-funding/

Tatter, G. (2016, April 27). Tennessee fires TNReady testmaker, suspends tests for grades 3-8. Retrieved from Chalkbeat Tennessee: https://tn.chalkbeat.org/2016/4/27/21098160/tennessee-fires-tnready-testmaker-suspends-tests-for-grades-3-8

Tennessee Code Annotated. (2021). Tennessee Learning Loss Remediation and Student Acceleration Act. Retrieved from LexisNexis: https://advance.lexis.com/documentpage/?pdmfid=1000516&crid=6c045431-33b5-453c-9d41-9fa8cd077fbb&nodeid=ABXAAGAAPAAB&nodepath=/ROOT/ABX/ABXAAG/ABXAAGAAP/ABXAAGAAPAAB&level=4&haschildren=&populated=false&title=49-6-1501.%20Short%20title.&config=025054JABlO

Tennessee Department of Education. (2020, September 23). Tennessee Releases Data Showing Significant Learning Loss Among K-12 Students. Retrieved from Tennessee Department of Education: https://www.tn.gov/education/news/2020/9/23/tennessee-releases-data-showing-significant-learning-loss-among-k-12-students.html

Tennessee Department of Education. (2021a). ESSER District Planning Resources. Retrieved 2021, from Tennessee Department of Education: https://www.tn.gov/education/top-links/esser.html

Tennessee Department of Education. (2021b). Tennessee K-12 Public Education Funding Engagement. Retrieved January 2021, from Tennessee Department of Education: https://www.tn.gov/education/tnedufunding.html

Tennessee Department of Education. (2022, January). Grow Your Own. Retrieved from Tennessee Department of Education: https://www.tn.gov/education/educator-resources/educator-pipelines-innovative-pathways/grow-your-own.html

Tennessee Department of Education. (n.d.-a). Academic Standards. Retrieved January 2022, from Tennessee Department of Education: https://www.tn.gov/education/instruction/academic-standards.html

Tennessee Department of Education. (n.d.-b). National Assessment of Educational Progress (NAEP). Retrieved January 2022, from Tennessee Department of Education: https://www.tn.gov/education/assessment/naep.html

Tennessee Department of Education. (n.d.-c). Reopening Guidance. Retrieved January 2022, from Tennessee Department of Education: https://www.tn.gov/education/health-and-safety/update-on-coronavirus/reopening-guidance.html

Tennessee Department of Health. (2020, September 2). COVID-19 Response Rubric. Retrieved from Tennessee Department of Health: https://www.tn.gov/content/dam/tn/education/health-&-safety/TDH%20COVID%20Case%20Response%20and%20School%20District%20Decision%20Making%20Protocol.pdf

Tennessee Office of the Governor. (2020, March 16). Governor Lee Issues Statement Regarding Statewide School Closure. Retrieved from Tennessee Office of the Governor: https://www.tn.gov/governor/news/2020/3/16/governor-lee-issues-statement-regarding-statewide-school-closure.html

Testino, L., & Mangrum, M. (2021, April 26). Tennessee schools can't offer both in-person and virtual instruction starting this fall. Retrieved from Memphis Commercial Appeal: https://www.commercialappeal.com/story/news/education/2021/04/26/tennessee-schools-cant-offer-both-person-and-virtual-instruction-in-fall-covid-19-updates/7342955002/

The Momma Bears. (2015, May 21). McQueen's children too Blessed to be TN READY. Retrieved from The Momma Bears: http://www.mommabears.org/blog/category/parcc

U.S. Department of Education. (2010, July 22). Race to the Top - Game-Changing Reforms. Retrieved from U.S. Department of Education: https://www.ed.gov/comment/3490

U.S. Department of Education. (2021). ESSER Fact Sheet. Retrieved from U.S. Department of Education: https://oese.ed.gov/files/2021/03/FINAL_ARP-ESSER-FACT-SHEET.pdf

Wesson, L. (2013, July). TCAP and PARCC Student Tests. Retrieved from Tennessee Comptroller of the Treasury: https://comptroller.tn.gov/content/dam/cot/orea/advanced-search/2013/2013_OREA_TCAPandPARCC.pdf

Initial Research into Financial, Geographical, and Population Effects on Academic Achievement Part 2

After Part 1, a lot of people asked me to share the rankings for the school systems in TN. I could not find specific financial information for municipal/special districts. I am embedding these reports from Google Data Studio. If you want to access them directly, go here.

Rankings for individual school systems. You can change the system with the button at the top.

School System Ranking Table. Click the header to sort.

Raw Data