March has come in like a lion…and I don’t mean the weather. SBA (Smarter Balanced Assessment) time is upon us and the lion is breathing, somewhat uncomfortably, down the back of my neck.
For me, this began last year at the end of June, when teams of teachers at all levels in my district met for a week of full work days to design performance-based assessments to be used in classrooms this year. Our goal in the high school ELA group was to create standards-aligned tasks that would provide an SBAC-like experience. We began the year with these tasks, and in December we shifted our focus to the SBA Interim Assessment options. It was decided our students would complete the Smarter Balanced Interim Comprehensive Assessment (ICA) for ELA: a Computer Adaptive Test (CAT) section as well as a Classroom Activity (CA) followed by a Performance Task (PT) section.
As with any change, there were grumblings. However, we knew students needed this exposure and to be honest, we wanted to see what this process and test were going to look like. Our English department administered the ICA in February: lessons were learned and reflection has followed.
- Log in and get a feel for the system before you have to use it.
- The training videos, PowerPoints, and manuals are good starting points but actually accessing and practicing with the system is essential.
- Practice setting up a testing session before test day.
Students need practice typing while thinking and practice using the test’s universal tools
- I’m always flabbergasted by my students’ keyboarding skills when it comes to typing responses. They can text with lightning speed, using only their thumbs, on teeny tiny keyboards, but on a normal-sized keyboard they struggle to produce anything that resembles fluid movement or reflects the speed of their thought. Building in opportunities to practice typing extended responses is essential.
- The universal tools (page 8 in the link) did not seem intuitive for several of my students. Depending on a student’s comfort level with the machine they are testing on, or with computer-based assessment in general, the tools are not necessarily easy to access. Some coaching and practice was needed.
We all like to know what to expect on a test
- For the students, this administration had value simply in exposing them to the format, and even in not being a smooth process: they had to be patient and they had to adjust.
- For me, it is important to see how my students handled the question types and think about how I can best address some of the pitfalls I see before it’s live test time. I definitely have information to inform instruction.
Hand scoring is the elephant in the room
- Most of the English teachers in my building have anywhere from 90 to 150 students, and they will need to hand score 3 test items per student for the CAT portion. Yes, that has them scoring anywhere from 270 to 450 responses.
- We are extremely fortunate to have 2 theme readers in our building’s Writing Center who are scoring the Performance Task portions of the test. However, this poses a problem in the next area.
Accessing the data has some roadblocks
- The Online Reporting System will only provide the data for your students once all parts of the CAT and PT have been scored and marked as complete.
- Once you mark scores as complete, the student’s responses no longer appear in the scoring system. This poses a problem for our use of the data to inform instruction if we are waiting for scoring of the PT portions to be completed.
Overall, the Interim Assessment was a necessary experience that had value for my students and for me. In hindsight, I might have argued for using the training test and some of the Interim Assessment Blocks for more targeted evaluations and experiences. But for now, I am scoring the questions and asking myself:
- How will I use the data to do what is best for my students?
- What have I learned from this process and how will I apply it now and in the future?
- Is the time, for both my students and me, worth the information I end up with?