E journal #11
What was the rubric process like for your team?
Who participated and how?
The rubric process worked well for us, drawing on the strengths of individuals on our team. We met as a team (Marean Jordan, Debbie Newport, and Shawn Diez, Board Chair of the SFF Education Committee) and brainstormed how to approach the rubric process. We then met once more to write a draft of the first rubric to understand the process. Through lengthy discussion we worked through all seven of the principles (as a case study); then Marean and Debbie each took three of the principles to articulate and “score,” and we combined the results into one document. We then sent the final draft to the team to adapt, proofread, and finalize.
When you came together to talk about the rubric, was there agreement among team members?
There was general consensus about where we are in the developmental process, and we tried to be realistic about where we want to be. Our initiative is pretty comprehensive, so trying to address all of the components of our S2S plan using the rubric was challenging.
What differences, if any, surfaced through this process?
Not much, although we often found that we’re between stages on the continuum. For example, we sometimes judged ourselves at the embedding stage on some criteria based on evidence, but decided on deepening to allow for growth and maturation.
Did filling out the rubric help you reflect on your project’s progress?
It did in general, challenging us to look closely at many components of our progress and assess where we are at the end of our fourth year. Within the last two weeks we also held a meeting with our entire S2S team and looked at the plan we established in 2014. Although some of the people and plans have shifted, much of what we envisioned has been established and is deepening or embedding. Using the rubric helped us define how we will work toward completion of our plan in year 5. The rubric is also going to help us target key areas for assessing our work and establish protocols for assessment each year.
What, if any, new insights about your project did filling out the rubric generate?
It helped us see that we are well on our way, and that decisions moving forward need to sustain and build on the work that has already been done. Comprehensive programming in visual arts and music and some arts integration in core subjects are in place; the next step is to offer additional professional development opportunities to enhance and deepen that work. The arts are increasingly seen as a hallmark and vital asset of the educational experience for students in the Sisters School District.
What, if any, aspects of your project did you identify that could use improvement/more attention?
There are a couple of different components we are looking at improving in year 5. We plan to use technology more effectively in our arts programming, showcases, and assessments (e.g., digital portfolios to track students’ progress, videos of performances, etc.). We are also looking at developing a music technology class at the high school to complement the Americana Project for the most accomplished musicians, incorporating live sound, recording technology, auditorium tech, lights, and staging into a modern music class that provides a more well-rounded, career-related experience for those on that track. Additionally, through presenting the Missoula Children’s Theatre this spring as a one-week residency, we learned that there is great interest at the middle school level in developing a theater arts class for grade 7/8 students in year 5.
What feedback do you have about the rubric or the pilot process?
I personally find it challenging to do, but very effective in helping us gauge progress and reflect on where we are and what we are trying to accomplish. I appreciate the opportunity to report on all seven principles to see our successes and challenges as we move forward.
What, if any, changes to the rubric itself or the process of using it would you recommend?
I don’t have many recommendations for improvement, although we found there was some redundancy when we were reporting on all seven principles. We weren’t always clear about the relationship between evidence and examples and what level of specificity we should be providing. We would like some ideas about how we can translate this information into data we can share. Perhaps this is the next step in the process. We look forward to hearing from OCF about what you’re learning and how it can shape our planning for assessment and evaluation.