If you’ve ever looked at your students’ reading data and wondered, “What am I actually supposed to do with this?”—this episode is for you.
In this episode, we discuss:
- Why reading data often feels more like compliance than instruction
- How to actually use assessment data to guide day-to-day teaching decisions
Assessment is a constant in education—but too often, it’s disconnected from what happens in the classroom. In this conversation, we unpack one of the biggest misconceptions about reading data. As our guests explain, different types of assessments (screening, diagnostic, progress monitoring) are designed to answer different questions—and when we misunderstand that, instruction suffers.
We also explore what effective data-based decision making really looks like in practice. From thinking of instruction as a hypothesis to understanding why progress monitoring is one of the most underused tools in literacy, this episode offers a clearer, more actionable way to connect data to instruction. The goal isn’t more data—it’s better use of the data you already have.
Guests:
Dr. Jessica Toste, Associate Professor at the University of Texas at Austin and Editor of The Reading League Journal
Andrea Setmeyer, School Psychologist and Chapter Director at The Reading League
Show Notes
Resources mentioned:
Measuring What Matters (Open Access Article)
💬 Want more insights like this?
Subscribe to the Literacy Talks Podcast Digest for episode recaps, resources, and teaching takeaways delivered straight to your inbox!
Do you teach Structured Literacy in a K–3 setting?
Sign up for a free license of Reading Horizons Discovery® LIVE and start teaching right away—no setup, no hassle. Sign Up Now.
Coming Soon: Reading Horizons Ascend™
From Pre-K readiness to advanced fluency, Ascend™ offers a consistent, needs-based reading experience across every grade, tier, and model—so every student can build mastery, one skill at a time. Learn More.
Transcript
Unknown:
Welcome to Literacy Talks, the podcast for literacy leaders and champions everywhere, brought to you by Reading Horizons. Literacy Talks is the place to discover new ideas, trends, insights and practical strategies for helping all learners reach reading proficiency. Our hosts are Stacy Hurst, a professor at Southern Utah University and Chief Academic Advisor for Reading Horizons. Donell Pons, a recognized expert and advocate in literacy, dyslexia and special education, and Lindsay Kemeny, an elementary classroom teacher, author and speaker. Now let's talk literacy.
Stacy Hurst:
Hello and welcome to this episode of Literacy Talks. I'm Stacy Hurst, and we have the unique situation today where I am the only host. Lindsay and Donell are not able to be here because they're out doing the thing: teaching. But we're so happy to have the guests that we have and the topic that we're going to discuss. We are talking about assessment, and visiting with us today are Dr. Jessica Toste and Andrea Setmeyer from The Reading League. So just to get started, Jessica, you are the editor of The Reading League Journal, which many of our listeners, if not all, have heard of. I know we've covered articles and other things from the journal. Do you want to give us a little bit of your background?
Jessica Toste:
Sure, thanks for having us here today. I am an associate professor at the University of Texas at Austin in the Department of Special Education. I have been there now for 13 years, and my work focuses on reading development and helping all students become successful in their reading. I primarily focus my work on students who have intensive and persistent difficulties with reading, so reading disabilities or difficulties, for whatever reason they might be coming up, and I really try to understand ways that we can make interventions more responsive for them, ways we can intensify those interventions, and how we can better meet students' needs.
Stacy Hurst:
Very important work. Yeah, that's great. Thank you, and thank you for being here. We also have Andrea Setmeyer, and Andrea, you have actually been a guest on our podcast before, when I had FOMO about missing the last one. Do you mind taking a minute and introducing yourself to our listeners?
Andrea Setmeyer:
Yeah, I'm so delighted to be back. Thanks, Stacy. I'm a school psychologist by training. That's how I came into this work and into caring a lot about reading, from working with students in public schools for 10 years. Then I founded the Indiana chapter of The Reading League. I found The Reading League from Indianapolis and followed them virtually for a long time, then founded the Indiana chapter, and I have been the chapter director at The Reading League for the last four years, which is a complete gift: I get to immerse myself in research with other folks who care a lot about evidence-aligned practices, get to know the folks that make up The Reading League, and connect with other literacy advocates like yourself.
Stacy Hurst:
And thank you for that. It is a great segue into what we've been focusing on in the last few episodes, and that is assessment. Both of you have lived in that world, and Jessica, right now you're also the editor of The Reading League Journal. Would you like to say a little bit about the most recent issue? It is all about assessment, so if you haven't gotten hold of it yet, I recommend it.
Jessica Toste:
Yeah, we're very excited about our most recent issue. Our January/February 2026 issue is focused on reading assessment, really trying to dive deep into instructional decision making and how assessment data plays a critical role in our instructional decision making. We have, I think, a really nice collection of articles that address different aspects of assessment, and hopefully they can be useful in translating into practice.
Stacy Hurst:
I think that's such a strength of the journal, and I'm really happy that you're leading it, because we do need to translate research into action. Assessment is not something that's new to educators, right? We are immersed in it from our preservice days. But what would you say is one of the most common misconceptions about assessment that educators might have, no matter what role they're in? Even administrators sometimes have misconceptions.
Jessica Toste:
Yeah, I think because in recent years, and maybe I'm giving a longer timeline as I get older, recent becomes longer, there's been an increasing focus on using assessment for the purposes of compliance. Assessment does serve an important role in compliance in lots of ways, but because schools and educators are required to collect certain types of data, we don't think enough about the purpose that different sources of data are serving for us. So we lose this idea that a comprehensive reading assessment system really has to, by nature, integrate multiple sources of data, and each of those sources has a different purpose and varies in scope, and we need to be clear on that purpose when we're trying to interpret those data and be responsive to them. We often look to a data source, especially one that we're required to collect from our students, and we expect those data to answer all of the questions we have about our instruction, and when they don't, that becomes very frustrating. It makes us feel like assessment isn't valuable because it's not serving us the way we want it to, but really that often comes from the fact that we're not looking at the right data sources to answer those questions. So I think one of the biggest misconceptions is putting all assessment in the same box, and not being very clear about what each assessment was designed for and what questions we should be asking when interpreting that assessment.
Stacy Hurst:
I think that's really important to point out, and I believe it's the first article in this issue that talks about that. And Andrea, you have lived in this space for a long time as well. What would you say is one of the most common misconceptions?
Andrea Setmeyer:
Thanks. I just want to add on to that: Dr. Toste and her colleagues wrote an article that gives a really fantastic, clear definition of the different types of data, how they can be used, and what they should not be used for. It's the open access article, so that article, Measuring What Matters, is available for free on the thereadingleague.org journal page. Make sure you check it out, because I would agree that is the number one misconception I still see in schools today: that misalignment. I can have an evidence-aligned test or a research-based test, but still misuse it for the wrong purpose and not actually add value to my work.
Stacy Hurst:
Yeah, that's really important. And we will link that in our show notes, as they say, so everybody can have access to that article. With my preservice teachers, I really emphasize, and this is also something I'm so happy to see be a focus, that the number alone really does not mean much. You have to look into the data and see what that number is telling you, make it tell a story, and it can really help guide your instruction. There's a lot of reading data out there, and sometimes it is hard to translate those numbers into a story that then leads to actionable instruction. So what does effective data-based decision making look like in practice?
Jessica Toste:
Yeah, that's such a big question. I feel like that could be a 10-hour episode. I'll start by saying that we should enter every instructional decision we make keeping in mind that the decision is, at its core, a hypothesis about how we think our students, or one of our students, learns, and what they need instructionally. And because each of our instructional decisions is really a hypothesis, we then need evidence to make sure that our hypothesis is accurate and that things are working out the way we want them to. The only way we can know whether our instructional decisions are having the impact we expect is by collecting data to serve as evidence to test those hypotheses. And in order to do that, we're not just collecting those data; we're thinking about what data we're collecting regularly, what additional data we might need to create a bigger, clearer picture of our students and their learning, and then we're regularly interpreting patterns in those data. What do we see happening with the data? What is the evidence I have that my student is learning, is growing, is making progress or not? And if they're not, we try to understand more about what's going on, and that leads directly into generating new hypotheses. So if a student is making great progress with instruction, then I'm moving forward with the hypothesis that this instruction is working, and I'm going to keep checking to make sure that's true. If they're not making the progress I would expect, then I need additional hypotheses and questions that I ask myself about how to adjust my instruction in order to better meet their needs.
So I think the most effective data-based decision making is very iterative and cyclical, and it is deeply embedded in the way that we think about our instruction. It isn't a way to collect a data point so that we can report out at the end of the year, or at a parent meeting, or to our administrators, that something worked or didn't. It really directly serves us in our instructional decisions.
Stacy Hurst:
And it's part of the teaching process; it's not something separate from it. Yeah, that's important. I know that, especially lately, and I use the term the same way you do, Dr. Toste, we have had a focus on the MTSS model, so people are familiar with the terms screening, diagnostic, progress monitoring, and outcome assessment. What can either of you share about how teachers should be thinking about the relationship between the different types of assessment we have in practice?
Andrea Setmeyer:
I think it's really important to understand the why. This goes back to purpose, because if you appreciate the why of each of those different types, you see how they go together. They're not in competition; they're not pulling you away from other things that are more important. They each serve a specific need that helps guide your instruction. So screening helps us identify, really quickly and efficiently, who is at risk of developing reading difficulties. It doesn't tell us everything we should know to teach; it just tells us really quickly who is at risk, and that's so important so that we can start intervention right away. Sometimes, in early grades, or if you have a placement test for your intervention, you don't need additional diagnostic data to get started in the intervention. But sometimes you do. Sometimes that universal screener is just an indicator, and then you need to do some diagnostic testing. And by diagnostic, I just mean uncovering the underlying skills that might be contributing to the risk that you're seeing. That would be diving into word recognition skills, phonics and phonemic awareness, and even, for older students, thinking about their ability to decode multisyllabic words. It's also thinking equally about the skills that contribute to language comprehension, so oral language skills and syntax and some of those pieces. So: diagnostic testing only as needed to make your instructional hypotheses, and then get started with the intervention as soon as possible, and only for students who really need it, because in a true MTSS process you're going to start progress monitoring all of those students receiving intervention so that you know if your instructional hypotheses are working and you can make adjustments as you go. And to me, progress monitoring holds the most promise for our students who are still striving readers, and yet I think it's the area that we neglect the most.
And so I'd love to lift that up in conversation: what does good progress monitoring look like, and why is it so important that it's worth our time in intervention?
Jessica Toste:
Yeah, I think Andrea captured a great overview, and she started mentioning the different purpose of each of those, aligned with an MTSS model, as you started the question with. As we think of moving up that pyramid (I'm using my hands, which doesn't work great in podcast form), from all students, to a group of students who are receiving intervention, to a smaller group of students who have more intensive intervention, keep in mind that the level of instructional decision making becomes more fine-grained, which means we require greater frequency of data collection. So as students move into intervention, collecting data more frequently is critical to monitor their progress and to make sure that we're using our time well, and their time well, so that they benefit from instruction.
Stacy Hurst:
Yeah, that's really important. I'm thinking back to my days as a literacy coach, when we had a whole system for gathering that progress monitoring data, and I think there are some important nuances to keep in mind as we're looking at it. Hopefully you are doing it frequently enough to make those determinations about your hypothesis. We would progress monitor students every other week, and from week to week that score could vary quite a lot. Teachers would get excited if there were two scores in a row that were going up and improving, and then the next one would be regressing a bit, and we'd have to have conversations like: look at the overall trajectory; each point is just a snapshot, a moment in time. And Andrea, you said that progress monitoring is something we could really lift up and focus on, that it would make a big difference for our instruction. Anything else to add about how to successfully progress monitor a student, things we have to keep in mind when we're doing that?
Andrea Setmeyer:
Yes. Selecting the right measure is really, really important. Again, it has to be quick. Usually we're talking about curriculum-based measures that are one-minute passages or probes of equal difficulty, measuring the same skill over time. It's really important to think about your measure: you're collecting data that helps you analyze whether you're working towards that bigger goal. It may feel difficult for students as you get started, and you can absolutely normalize that for them. I have worked with students with reading disabilities for a long time as a school psychologist, and students can do really hard things for one minute with the right context and the right support in place to help them be comfortable with that one-minute task. So make sure you pick the right measure, often a curriculum-based measurement, and then have a plan in place before you start: who's going to monitor progress, how often, who's going to look at the data, and how that's going to be communicated with the team. All of those pieces have to be really solidly in place, or it's too easy to get lost in the shuffle of daily life at school. Yeah, that's good.
Stacy Hurst:
Jessica, what would you add?
Jessica Toste:
Yeah, I'll add to that about interpreting the data and things to think about in progress monitoring. One thing, which has come up already, is interpreting progress data as a trend: interpreting progress data over time, multiple data points over time. This can get tricky, because schools often use a measure that looks almost the same, or is the same, for screening and progress monitoring. Screening is meant to be a single time point; it's a snapshot of performance, and it's meant to be interpreted that way. But that is not how progress data are meant to be interpreted. They're meant to be interpreted over time. So the frequency with which we collect those data matters: not responding immediately to a single data point, but really looking at how students are progressing over time, is very important. And to the point you made already, really understand that variability in progress data is the norm. It should be seen as the norm rather than an exception. This is true for most students, and for students with intensive learning needs it's especially true; there's a lot of variability in their performance over time. We collectively, as humans, tend to react when data points go up or down, when they fall or grow very quickly. If you look at your retirement savings, and you get to see a graph of how your savings are doing, we tend to panic or get really excited based on a single movement. But just like we would with those graphs we see with our financial advisors, we should be interpreting the trends over time, allowing that variability to take place, and not feeling panicky when we see it.
Stacy Hurst:
Yeah, that's important, and that's a really good comparison. In fact, I think I'll apply that the next time I check my retirement savings. I'll just take a deep breath and remember Jessica said we're looking at trends.
Jessica Toste:
I feel like I say this all the time about CBM progress graphs. And then every year when I meet with my financial advisor, he shows me this thing, and I'm like, oh my goodness, what happened this year? Everything's terrible. And he's like, no, no, no, don't look at that.
Stacy Hurst:
Good, a bonus life lesson right there. That's helpful. And as you were talking, it made me think about the next question I wanted to ask. Your research has focused on intensifying interventions for students with persistent challenges. Over the years, has that work changed the way you think about assessment? How did that interact with what your focus was?
Jessica Toste:
Absolutely, yeah, absolutely. I think anyone who's interested in intensive intervention is, by default, interested in assessment and data, because it's difficult, if not impossible, to successfully intensify intervention, which requires very individual-level decision making for students, without having assessment data to inform your decisions. It's interesting; I was just talking to a colleague recently about how I feel like I live in the assessment world a lot, but if I had to list my top 10 areas of research, I probably wouldn't write the word assessment. I'd write data-based instruction and CBM. But it's something that's so aligned with intensive intervention: you have to be collecting data, and you have to be thinking about what assessments are right for your students and right for the decisions you're making, so that you can make interventions more responsive to your students' needs.
Stacy Hurst:
Yeah, that's really great. And Andrea, you've had kind of the inverse experience. As a school psychologist, you're trained in assessment, and we met in the middle. What would you say about how that's been for you? Because you started at the opposite end of that spectrum.
Andrea Setmeyer:
In my training, you're right, we focused a lot on assessment, and I feel like I had a really deep understanding of the different purposes for assessment, the limitations of each type, and how to apply them. I was really surprised to go into schools and find out that teachers didn't value data the same way that I did. I think it goes back to our broader conversation in the science of reading about the differences in training. But this was one example where I would come in and say, I think this student needs to be progress monitored with oral reading fluency, let's start that protocol, and the teacher would say, well, I don't care how fast they read, I want to know if they can comprehend what they're reading. That real mismatch between our knowledge bases and what we valued can make it honestly challenging to have those conversations. But in the world of research and in the world of education, I don't think you can separate instruction from assessment. We have to keep our eyes on the student's learning. Did they learn it? Can they apply it? Have they mastered it? Those are questions that can only be answered by data.
Stacy Hurst:
And honestly, I think people get overwhelmed if you try to separate it. If you just think of it as part of your everyday teaching, it's less overwhelming. That also makes me think about how we're interpreting that data, and that the assessment we choose matters too. At the preservice level, I start very simplistically. We know about the simple view of reading, and that's a good framework: when you're watching a student and developing that hypothesis, is this a decoding issue, or is it a language comprehension issue? That can drive the assessment you give and help inform instruction. There's been a lot of conversation recently, and this has been around for a long time, but I think it's been highlighted: the instructional hierarchy, or the learning hierarchy. I think that's another framework that is useful in this conversation when we're talking about assessment and moving a student through it. And Andrea, I'm thinking about what you just said. I was that person, the literacy coach who said, and I'm just going to say the name, we were doing the DRA, and now we're going to do the Fountas & Pinnell benchmarking. And I heard that all the time from teachers: yeah, their accuracy was in the toilet, but they understood. But when you put even that information in the framework of the simple view of reading, we know why. It answers the question: they were over-relying on their language comprehension to try to compensate for their lack of decoding ability. I think I lost the plot there, but is there anything you could add to that, about how we can contextualize the data that we're analyzing?
Jessica Toste:
One thing I'll add, which I think helps to demonstrate the value of data and also helps us contextualize and use those data: even though we've probably said the word data 400 times in the last little bit, and even though we're talking about assessments and using those data to make good decisions, interpreting those data lives alongside teachers' professional judgment. Professional judgment has to be a driver of how data are interpreted and then acted upon. When I think of progress monitoring, we graph those data, and we use those graphs to support decision making. Those graphs aren't end products; they're decision-making tools, tools that teachers can use to balance and weigh their own professional judgment. There are many times when students' data will indicate that maybe they're not making progress, or that something is going on with their learning, and it's only the teacher's knowledge of that student, what's going on in their life, what's going on during instruction, and all the things they know about that student, that can help inform the most impactful decision. So always bring together this idea: these data are so important for our decision making, but that is what they are. They are tools to support our decision making.
Andrea Setmeyer:
I'm so glad you lifted that up, Jessica. I'm finding the same thing. Just as you made that connection, Stacy, to the simple view of reading: as teachers' knowledge about the development of reading and the science behind teaching reading grows, they're able to see the why behind the assessment. It's actually making it easier to have those conversations about why a guided reading leveled assessment isn't meeting the needs, because it's not a screener, it's not a diagnostic tool, and it's not valid for progress monitoring. So let's think about where it fits in, why students might be performing a certain way, and what we could use that really answers the questions we have based on our knowledge. That's a really great insight,
Stacy Hurst:
which makes sense, because learning how to read is multifaceted, and the assessment involved in that process should be too. But for teachers who have that knowledge and are growing it, which most of our listeners are, we identify as learners first, it actually gets a little easier over time. At least I've found I'm better able to target which assessment and which instruction to use for my student. I really appreciate what you've both said about that. And Jessica, you have been the editor of The Reading League Journal for how long now? Is this your second year?
Jessica Toste:
Yeah, part of the second year.
Stacy Hurst:
So how did you make the decision to say, you know what, let's focus on assessment for this particular issue?
Jessica Toste:
Yeah, well, when I became editor, we were moving towards having thematic issues, and 2026 was the first year I got to think of all three issues in the year and what themes to focus on. I made a list of topics that were coming up through The Reading League's activities, meetings, and conversations with practitioners, and that I hear in the field when I talk with teachers, and assessment was one of the topics that came up over and over again. So it seemed like a great way to start off the year, by talking about assessment.

When students need support, every moment of instruction counts. Ascend Focus by Reading Horizons is an adaptive K–12 intervention that identifies skill gaps and delivers individualized, developmentally aligned support, helping students accelerate progress and reach grade-level performance. Built on the same proven method and trusted foundational software educators have relied on for over 40 years, Ascend Focus makes targeted intervention simple, effective, and measurable. Ascend Focus will be available for district and school implementation beginning with the 2026 school year. Visit readinghorizons.com/ascend to learn more and sign up for updates.
Stacy Hurst:
I think the journal is such a good resource. It's a good example of really taking research and turning it into actionable information for teachers, educators, and researchers, so many people. How did you become associated with The Reading League Journal? What drew you to take on that role?
Jessica Toste:
Yeah, well, I've known about The Reading League for many years, probably for 10 years, since its inception, and have been for years just so impressed with the pace at which The Reading League has grown, the professionalism in all of the work that comes out of it, and honestly the impact it has had, which so many organizations and so many of us in the reading research world have been thinking about and striving for for so long. Thinking about the journal, I just felt like it lives in a unique space, where it brings together rigorous research and talks directly about rigorous research, while also translating that into information and practices that can be implemented in real schools and real classrooms with real students. And I'm at the point in my career where I'm very motivated by the challenge of ensuring that the work I do advances into meaningful practice. That doesn't mean every single study and every single research question, but it is all in service of translating to meaningful practice. So I thought that being involved with the journal would give me a nice way to have conversations with a wider, broader number of people doing this work, to think about how to translate that work in really accessible ways, to be responsive in how we translate that work, and to have conversations about research.
Stacy Hurst:
Great. I know it's serving that purpose, and I love this issue. My copy is actually at my office on campus, because I'm always showing it to people. What is your vision for the journal over the next few years?
Jessica Toste:
Yeah, well, my hope is that we will be able to plan ahead for the themes of our issues each year, so that we can have more time to do calls within the community, broadly the educational community of folks doing work on the ground that relates to the different themes we're talking about, so that we can have a wider range of columns from people showing how research in whatever theme, whatever topic, is translating into practice. My hope is that those themes really touch on what we're talking about at the time when they're published, so that they help answer people's questions and give people easy access to the information that they want, and that we continue to have a nice balance between articles that talk about rigorous research, understanding rigorous research and the findings that come from that work, balanced with articles about how to do these things, how to do new practices, how to apply things within school and classroom contexts.
Stacy Hurst:
Yeah, that's great. And I don't know if you can tell us this, but what is the next issue going to be about?
Unknown:
Yeah, I can tell you for the rest of the year. Our spring issue is focused on writing, so it will have a series of features on writing instruction in different contexts, supporting learners at different ages and different kinds of writing practices. And then our fall issue, which is the last issue of 2026, is going to be focused on language and the role of language in literacy.
Stacy Hurst:
Oh, I'm so excited. That's really great. Those are all topics that have been running around in my head for the last few months.
Unknown:
That's the hope: that we're touching on the things people have been talking about the most.
Stacy Hurst:
Yeah, also timely for me. And I don't mean to make this about me, but I am right now. Creating a course that will begin in the fall, that we get to be focusing on writing. So I'm so excited for that perfect and just to remind our listeners too, membership in the reading league is free. But is that correct me? If I'm wrong? Andrea, and then tell us about how to get the journal. Could you do that?
Unknown:
Yes, Stacy, we used to offer membership at the national level, and it was free. We've actually changed the structure, so now membership belongs at the chapter level, and we really encourage people to join their local state chapter. We have one in DC, and we have 46 chapters across the country right now, so that's where membership lives. And then the journal is its own subscription, so anybody can subscribe. If you go online to thereadingleague.org, you'll see the journal home page and a couple of things to highlight there. So yes, subscribe, and then you will get it sent to your house three times a year. I always call it the best mail day, because I love getting something happy in the mail. But with your subscription you can also access the digital archives, so you'll have access to every single article that has ever been published by The Reading League Journal. And there are some fantastic gems in there. You could really get a lot of content for PLCs, for coaching, and for higher education; there are a lot of gems in there that you could use. So that treasure trove is worth mentioning. And then there is also a collection of freebie articles, open access content, on that journal home page. So if you haven't ever seen The Reading League Journal and just want to read a couple of articles to see how practitioner friendly they are, what our focus is, and how we're lifting up different voices in the field, the journal website is a great place to check out.
Stacy Hurst:
Great. Well, I feel like that was a little commercial break, because everybody should subscribe. And selfishly, I always ask this too: how do you see what is offered in the journal benefiting pre-service teachers?
Unknown:
Yeah, I think, for any listeners, myself included, who teach courses with pre-service teachers, a lot of these articles are great reads as an entry into big issues, whether you're talking about certain theories or various aspects of reading development. Across all the years of the journal, we have some really awesome articles that have done deep dives into impactful studies. We have a column called Game Changers that takes a deep dive into a study that changed the way we think about a certain area of reading. Those can be really nice articles to supplement reading research articles: you read a theoretical research article, and then this shorter column translates it, asking why was this article important, what did it mean, how did it change practice? And then in every issue we have articles that are meant for practitioners, so they talk about how to apply practices in classrooms and schools, and hopefully those can supplement instruction that you're providing in university-level courses.
Stacy Hurst:
Yeah, I think this winter issue is a really good example of that, because we have the research articles about assessment and then really actionable things for teachers to implement, even in how they think about assessment. One thing I really appreciated about this issue is how the research articles were written. I probably read more research articles than the average educator, but there are parts where I still have to really slow down, or I'm googling, what is this statistical term again? I don't have that same experience with the journal; it makes sense in a way that gets into my long-term memory a little bit easier. So well done. As the editor, it's no small feat to make sure that researchers and educators are able to communicate with each other, and that's a really good way to do it. You've talked about the next few issues, and now I'm super excited. As you pointed out, there are three issues a year, and that is something to look forward to. It'll be a happy mail day for us. What else do you think about assessment? If you could have educators walk away with one thing from this issue about assessment, what would it be?
Unknown:
I think my one takeaway is not to think about assessment as one test or one moment-in-time decision, but really to think about assessment as that bigger decision-making process, one in which teacher professionalism and judgment play a big role. It's the accumulation of seeing that student read in a variety of settings, responding to a variety of tests, perhaps. It's not one thing; it's that whole big process of making sure students acquire the skills they need to be successful.
Stacy Hurst:
Great. Jessica?
Unknown:
Yes, I will say yes to what Andrea said, and I will add on to that. I hope the one big takeaway, whenever anyone does a PD on assessment or goes through a teacher prep program, is really to see student data as something that is meant to be interpreted and acted upon, and to see assessment data as an essential part of responsive, evidence-informed teaching.
Stacy Hurst:
Great, and not to have that dread when it comes up, because it really does move the dial for students. I know, Jessica, you have really seen that in your work, right? And Andrea, you already knew that to start out with, because you're a psychologist. So I think that is a good note to end on, other than that we have had, and will have, a series of four episodes on assessment leading up to The Reading League Summit. And I know both of you will be there, right? I'm so excited. I don't have to have FOMO this year because I will be there too. Do you want to tell us about the summit, how it's organized, when it is, and how people can participate?
Unknown:
Yeah, absolutely. The Reading League Summit is all about assessment and really making meaningful use of literacy and language data, so it's a perfect tie-in to this issue of the journal. It will be held in Syracuse, New York, for two days, May 5 and 6 this year, and the model for the summit is different from a typical conference. When you go to The Reading League Conference or another big conference, you're thinking about something like a choose-your-own-adventure series, where you're carving out the topics and breakout rooms you want to go to. The Reading League Summit is different in that we're designing a two-day, cohesive learning experience where everybody is in the same room, learning all together. We always choose a topic that we think needs to be highlighted in the field for a particular reason, and this year it's data, for all the reasons we've just talked about. We're making gains in our ability to really think about evidence-aligned instruction, and we need to make sure our knowledge of assessment keeps pace with that so we can make those connections. So we are diving deep into some misconceptions and myths and into those purposes of assessment. We're talking about language assessment in a panel format. And then we also have three data workshops: Dr. Adria Truckenmiller and Dr. Anita Archer are going to be guiding us through workshops where we look at literacy and language data, instructional data, in real time and practice talking to each other about the decisions we'd make. So it's going to be an incredible two days. Dr. Toste will be there; I don't know if you want to talk a little bit about what you're diving into. Yeah, maybe not surprisingly, I'm going to be on the panel that's on student-level data, so instructional decision making at the student level. We'll be talking across our panel.
We'll touch on screening, progress monitoring, and diagnostic assessment, all around student-level instructional decisions.
Stacy Hurst:
Wow, that's good. I'm literally counting the days; I'm so excited. We usually do a recap episode, so we'll do that too, and I will be so happy to report on what you've put together and what we learn there. I'm looking forward to it, as I'm sure many other people are. Can they still register? It's April at the time of this recording.
Unknown:
Yes, we have capacity at this space that we're back in, in Syracuse, New York, so everybody can come. And we're really encouraging you to think about coming as a school team. What a great way to have conversations: Are we using these words in the same way? Where are we strong in our assessment model, and where do we have some gaps we might need to think about filling? Do our teachers and school teams have this knowledge in their everyday work? Coming as a school team is also a really fantastic idea.
Stacy Hurst:
Great. What a wealth of knowledge you two have been today. Thank you so much for this conversation. And this is minor, but I want to note that the three of us all pronounce data the same way. I don't remember that being mentioned in the journal, but I was talking to a colleague just yesterday, and she kept pronouncing data one way while I kept saying it the other, and finally she was like, why are we saying it differently? Well, there's alignment here. So is there anything else you want to add about assessment or about the journal? I feel like we've had such a great conversation.
Unknown:
I'll echo Andrea's recommendation to come to the summit with your school team, if you can come to the summit. That recommendation, to be with others and collaborate around data, is so important. The more you can look at student data with your school team, your grade-level team, anyone at your school you can convince to have a conversation about data with you, the better, because that kind of analysis, hypothesizing, and interpretation of data only benefits from more minds talking about it.
Stacy Hurst:
Yeah, and once you've seen it modeled, I think it really does have an impact; you always get new ideas about how to look at data and how to interpret it. And I know there will be a lot of higher ed faculty members attending as well. We have the unique situation of asking, what data do we look at with our students, and how do we guide them? They're not in the room with actual students at this point, having a classroom of their own. But just reading the journal gave me so many ideas, and I think there will be a group of us in higher ed, so if that's something we want to focus on while we're there, let's do it. It's such an important skill to focus on, and I'm really looking forward to it. Thank you both for all of your work, for everything you do to help us translate research into actionable practice, and most importantly for the reason we're all here: to make meaningful differences in the lives of our students so they can actually read and gain all there is to gain from life through that one very important skill. So thanks. We might invite you back for the recap episode after the summit, so we're looking forward to those conversations too. Thank you.
Unknown:
We'll be wearing our "I love data" T-shirts for that.
Stacy Hurst:
There are T-shirts? I need one. Yes, I think we should. And we could have a pronunciation guide for the word data as well.
Unknown:
You can say it however you want, as long as you're using it.
Stacy Hurst:
That's a good point. Let's not make something that's not the thing, the thing, right? Well, thank you very much, and thanks to our listeners for joining us for this episode. We hope you join us for our next episode of Literacy Talks.
Unknown:
Thanks for joining us today. Literacy Talks comes to you from Reading Horizons, where literacy momentum begins. Visit readinghorizons.com/literacytalks to access episodes and resources to support your journey in the science of reading.