Thursday, December 14, 2017

One Year in AP: All Learners Write (Week Seventeen)

By Mark Heintz

Context

I have two main focuses as I write this weekly blog, two driving questions that I keep in mind while making decisions:
  • How do I know if my students know? 
  • How do I get them to know if they know?  
Whether it is a skill or content, I want to know if they know it. I no longer think it is acceptable for me to guess or rely on a feeling about whether they know it. Getting the students to know if they know it is downright hard, but I am really attempting to get to a point where the students can recognize their understanding of the content and their progress on the skills. Therefore, the purpose of this year of reflection is to see how I make progress toward these two goals and to elicit feedback from staff, students, and, hopefully, people who follow along in the journey. You can read how last week went here.

Week Seventeen: Answer the Question

This week the content focus was primarily on the Columbian Exchange and how emperors legitimized their rule.  The primary focus was the 1450-1750 time period.  Here were the standards for this unit:
  1. List five results of the Columbian Exchange.  
  2. List two examples of how each of the following empires legitimized its rule: Ming, Qing, Ottoman, Mughal, and Aztec.
This week's skill focus was aimed at improving students' abilities on the stimulus multiple-choice questions and the document-based question.
  1. Analyze primary and secondary sources.
  2. Analyze images.
  3. Write a thesis in response to the claim.
  4. Pull evidence from a document to support a claim. 
  5. Contextualize the prompt.
Cite Specific Evidence

How do I know the students learned?

The students filled out a chart each day with their understandings of the content. Having them fill out the chart helped me see their comprehension of the content: I walked around and saw what each student put in it, which was an easy way to see whether they had learned. It was a simple chart, just a place to house what they learned. The picture below is a sample of the chart.



The main way the students showcased their understanding of the content and the skills was their writing. I have stated numerous times in previous blog posts that they write all the time, and they still write all the time. This week, the students wrote multiple times to demonstrate their abilities in each of the five skill focuses. The purpose of some writing prompts was simply to review and refresh skills, while others served to develop new understandings of the skills. In other words, students were writing to learn. While the students showcased their ability in each of the skills, they also displayed their comprehension of the content. That is another reason I love having students write daily.


How do I know the students learned and how do I know if they know what they were supposed to learn?

Last week, I used a Schoology quiz to have students evaluate different levels of the contextualization skill. This week, I used the same process but with different skills, following similar steps as last week. Each student wrote a sample body paragraph of an essay. I chose several of those student samples representing various levels of proficiency. For each sample, I turned the items of the rubric into a multiple-choice quiz. When the students took the quiz, they chose which criteria best matched the proficiency of the sample. In the sample below, the students had nine criteria to mark as either proficient or not. In this case, the student incorrectly marked two criteria as mastered and failed to identify one proficiency.



Explain the Reasoning 

How do I know the students have learned?

I continue to spend a great deal of class time on writing. I feel that through writing the students show me what they know. Furthermore, writing allows them to use the content in a meaningful way, and it is far superior to demonstrating understanding on a discrete-point assessment. Writing shows their true comprehension of the content. I cannot stress this enough. Writing displays that students have mastered the content and can use or manipulate it in a meaningful way. It also shows what they don't understand. They cannot fake their way through writing when they fail to fully grasp a concept.


How do I know the students learned and how do I know if they know what they were supposed to learn?

As for the content chart, I wish I had stayed more on top of it for the students' sake. I did not go back to it every day. If I had been more intentional about referencing the chart daily, students would have been more reflective about their own learning.



As for the Schoology quiz, I absolutely love it. The students were questioning me as to why certain parts of the writing were proficient. They were analyzing each part. That point is so crucial. Each skill received special attention; each skill was evaluated to determine whether each student knew it. The quiz allowed the students to see what their understandings were, and it showed me what they knew and needed help on. All students took the quiz; therefore, I was able to quickly see everyone's understandings. It is such a great learning tool.

Reflection and Impact

I need to be better at having the students quickly assess their understandings of the content in a meaningful way, and I need to do this on a daily and weekly basis. While I know that the students are doing well with the content, they need to see that more frequently in class rather than only through the weekly online quizzes.

Tuesday, December 12, 2017

6-Step Process to Designing Curriculum (Part 3)


From Kern, Thomas, and Hughes. See link above.
I am currently taking a Foundations of Curriculum and Instruction course at UIC. Our textbook, while a medical curriculum textbook, reminds us that curriculum design crosses education fields and that what we are doing in our classes every year has its grounding in research. Kern, Thomas, and Hughes provide a 6-step approach to curriculum development in their book. My goal is to share the theory behind our current practices to serve as a guide as we design and redesign our courses. Steps 1 and 2 can be found here.

Step 3: Goals and Objectives


Once we have identified the needs of our learners, we need to clarify our goals and objectives.  While there may be some differences in how these terms are used in our various institutions, on the broader level, goals are the overall purposeful outcomes while objectives are measurable elements.  

When we state our goals, they help us to define and clarify our content, priorities, learning methods, and evaluation/assessment outcomes.  These are not only important for ourselves as we re/design our courses, but they are also important to communicate clearly to all stakeholders--peers, parents, students, administrators, etc.


According to Kern et al., there are five elements to keep in mind when writing a clear and measurable objective: who will do how much (or how well) of what by when? A key here is to use descriptors that are less open to interpretation. For example, how can we measure the verb "know" as opposed to "identify," "list," "recite," or "define"?


In practice:


For this step I wonder a few things, but many depend on the situations at our individual institutions.  For example, do we know the overall purpose of our course?  If there is not a clear "out" such as an AP exam or placement test, why do we cover the content and skills that we do?  Do we consider the general and targeted needs assessments as described in the previous blog posts?

Furthermore, does our course align to and build upon the goals of the courses before/after it in the sequence?  Does it need to?  If we are unable to articulate these goals and objectives, then we often end up duplicating assessments if not content and essential questions.  

Additionally, we may end up over-emphasizing a specific assessment type when it doesn't really measure the outcome we are looking for. For example, is the focus of my English class to read a novel, or is it to practice skills via the novel? If we are not clear on the distinction between materials and objectives, we may over-assess in some areas (for example, the plot of a text) when our main goal is something more sophisticated. We also may overweight questions that are not the focus of our stated objectives. If we want students to practice higher-level skills, more of our assessments should be weighted that way.

This further leads to a question of assessing Socio-Emotional skills and other subjective measures.  If we assess objectives like "paying attention," are those elements we instruct and model?  Are they truly the objectives we want to assess in our course?  If they are, we should be clear about how they are instructed and assessed.  If they are not, we should realign our focus to the objectives we do want to measure.

Once these goals and objectives are established, we are then able to move on to Step 4: Instructional Strategies, which I will discuss in the next blog post.

Thursday, December 7, 2017

One Year in AP: Student Suggestions (Week Sixteen)

By Mark Heintz

Context

I have two main focuses as I write this weekly blog, two driving questions that I keep in mind while making decisions:
  • How do I know if my students know? 
  • How do I get them to know if they know?  
Whether it is a skill or content, I want to know if they know it. I no longer think it is acceptable for me to guess or rely on a feeling about whether they know it. Getting the students to know if they know it is downright hard, but I am really attempting to get to a point where the students can recognize their understanding of the content and their progress on the skills. Therefore, the purpose of this year of reflection is to see how I make progress toward these two goals and to elicit feedback from staff, students, and, hopefully, people who follow along in the journey. You can read how last week went here.

Week Sixteen: Answer the Question

This week the content focus was primarily on the Columbian Exchange and the global flow of silver.  The primary focus was the 1450-1750 time period.  Here were the standards for this unit:
  1. List five results of the Columbian Exchange.  
  2. List three effects of the global flow of silver.
This week's skill focus was aimed at improving students' abilities on the stimulus multiple-choice questions and the document-based question.
  1. Analyze primary and secondary sources.
  2. Analyze images.
  3. Write a claim in response to the prompt.
  4. Pull evidence from a document to support a claim. 
  5. Contextualize the prompt.

Cite Specific Evidence

How do I know the students have learned?

The students took two mini quizzes this week. They were more traditional in nature, as in they were on paper and "graded" by me, the teacher. One was a short quiz on the content objectives above; the students averaged 80% on it. The second quiz was a stimulus-based assessment; the students averaged 70% on it. Here is a sample question from the stimulus-based exam.


How do I know the students learned and how do I know if they know what they were supposed to learn?

This question is still kicking my butt. I try each week to know how they know. This week I took a suggestion from a student about using Schoology to have each student evaluate writing samples. To do this, I first had each student write one part of the DBQ essay in a discussion post. After they submitted their samples, I took several of the writing samples posted by the students and created a Schoology quiz. In the quiz, I included one of the samples, and each answer choice was an item from the rubric. The students selected each answer choice they felt the writing sample earned. In the sample below, the student felt only four of the points were earned.



Explain the Reasoning 

How do I know the students have learned? They consistently perform well on the content tests. The students requested more frequent content assessments to hold them accountable, so I gave them more assessments! Their performance was at the same level as on the last unit exam, when more objectives were being tested. I am not sure what to make of that data point. Since they performed at about the same level, should I keep giving the more frequent assessments or hold out until the end of the unit?

What the more frequent assessments did reveal was that there were a few gaps in their content knowledge.  The students need instruction on the economic systems involved in the Columbian Exchange and the role certain empires played.

There was some good news to using more frequent content assessments: there were fewer outliers. On the last content test, there were a few extreme cases of students "bombing" the test. On this assessment, there were no such cases. There was much more of a middle norm, which suggests that more frequent assessments may help preserve the positive narrative of the course.

How do I know the students learned and how do I know if they know what they were supposed to learn? Using the Schoology quiz to assess student writing was amazing! First, it held all students to submitting a written response and then evaluating specific responses. In the past, when I walked around the room, the students typically wrote in pairs, and I could not get to every response.

The new use of the Schoology quiz enabled me to have all the students record their evaluations. I wanted to know if they knew what excellent writing was. Second, the quiz revealed to me, the teacher, that they do in fact know what excellent contextualization writing is. Almost every student was extremely close to accurately evaluating their peer's writing sample. I cannot wait to use this method again!


As for the stimulus exam, the students scored at the same level as they did on the previous unit exam. But this time the students spent a day going through the test with explanations of the answers. I had written a detailed explanation of why each answer choice was right or wrong. Having the students spend the day going through the test was fantastic. Hearing the dialogue and conversations centered on the questions revealed that the students read most of the questions too quickly. Even though they averaged about the same, they are understanding what they don't know and why they are getting questions wrong. There is still a lot of work to do in this arena.


Reflection and Impact

As I stated last week, I need to get students' input more frequently. They have given me such great ideas to improve student learning. I used two of their suggestions this week and I know they helped. I am going to use the Schoology quiz more frequently to have the students evaluate student work. It is quick and simple to do, and it reveals whether students know what is expected of them. It is such a powerful instructional tool. I cannot wait to use it again!

Tuesday, December 5, 2017

The Lost Art of Field-Tripping

        
The best field trip I experienced was when I took my first group of students to see Much Ado About Nothing at Chicago Shakespeare Theater over twenty years ago.  The students, who had never seen live theater, were fascinated with the lighting, scenery, the physical comedy of the actors, and the entire sensory experience.  The play jumped off the page and turned into something entirely different in the theater space. 

AP students? Far from it. These were lower-level students, some of the weakest readers in the building. Why did it work so well?



Create barriers, but not financial ones.

This is a privilege, a delightful treat, not a march. We built a barrier (study guide, quiz mastery, lunchtime discussion/preparation) that allowed students who were interested and intrigued to join us, but left behind those who were not yet ready.

We made a point to take students who came from disadvantaged backgrounds. Despite living thirty minutes from world-class museums, these students had never entered the doors of these institutions.

Anticipate where your students may struggle.
 
What will intimidate your students?  When will they feel uncertain or uncomfortable? Directly teach behaviors.  Before I took my lower level class to see Shakespeare, we talked about what to do if they found it boring, had to go to the bathroom, or got hungry. We discussed the difference between applause and yelling out.  Students worry about getting lost, what to wear, and when they will get to eat. Openly address those fears. 

Let students design the trip.

A few years ago, we were studying the novel Things Fall Apart, and a student wondered what some of the African cultural references actually looked like. A few weeks later, the students walked into the African Art wing at the Art Institute of Chicago and exploded with awe when they saw what was at the entrance: the towering dramatic costumes of the mysterious egwugwu. Suddenly, the students understood the power of these intimidating demi-gods.

Prepare yourself and prepare your hosts.

We use the education departments at these institutions.  When my students went to the Art Institute in conjunction with our reading of Things Fall Apart, the volunteer docents took the time to read the novel themselves the week before our trip so they could be better prepared to help students find those cultural connections.


Friday, December 1, 2017

One Year in AP: Do Students Feel the Same as I do? (Week Fifteen)

By Mark Heintz

Context

I have two main focuses as I write this weekly blog, two driving questions that I keep in mind while making decisions:
  • How do I know if my students know? 
  • How do I get them to know if they know?  

Whether it is a skill or content, I want to know if they know it. I no longer think it is acceptable for me to guess or rely on a feeling about whether they know it. Getting the students to know if they know it is downright hard, but I am really attempting to get to a point where the students can recognize their understanding of the content and their progress on the skills. Therefore, the purpose of this year of reflection is to see how I make progress toward these two goals and to elicit feedback from staff, students, and, hopefully, people who follow along in the journey. You can read how last week went here.


Week Fifteen: Answer the Question

This week's post is a bit different than my usual weekly reflection.  Overall, I felt that the class was generally progressing positively towards the outcomes of the course with a few exceptions.  But how did the students feel?  After last week's post, and this nagging question of how they felt, I knew it would be important to get the students' perspective on the course.



An additional prompt to find out how my students felt came in the form of a text message earlier in the week from Kim Miklusak, who shared a few thoughts from her PhD program. Her texts were, "Can the participants of your class tell the narrative of your class? Does the assessment result in a narrative impact?" From my previous blog and her texts, I knew I needed to hear from my students.

Her first question ("Can the participants of your class tell the narrative of your class?") is one that I think students do not often get the chance to reflect on and share with the teacher. Students are great about sharing their narrative with their friends, or with the teacher if something is going great or horribly wrong. But if they are doing just fine? Can they still tell the narrative? More importantly, do they share that narrative with teachers? I wanted to find out if they could.



The second question ("Does the assessment result in a narrative impact?") is more troublesome.  If a student is generally feeling good about the course and the assessment goes poorly, does it change their narrative?

I needed to know the answers to these questions.  To find out, I shared with my students my previous blog post on how the assessment went. I asked the students to read it and then respond in a Schoology discussion to help me understand their narrative of the course.  Did they agree with my assessment or did they feel differently than me?





Cite Specific Evidence

Here are some of the student responses recorded in the discussion post:

Katie
After the last unit, I feel so much better about writing the part of the DBQ that we've learned; there's still a ways to go, but it doesn't really freak me out anymore. In that aspect, I think the test grades reflect that really well. That being said, the stimulus still makes me a little uneasy. A lot of the time, Mr. Heintz reassures me that I'm on track, but it's just hard to grasp in your head when you've gone through years of school thinking that B's just aren't good enough.

Kaspar
I would say that I am making progress from where I was in the beginning of the year. For me, the content is very easy and my knowledge in that category is very good. When I do the checklists, I try to finish all of them weeks in advance. During the rest of unit, I watch the videos again and take notes which I later use to review before the test. Doing all these things have really helped me succeed on the content (I missed only 1 question in this unit). Although we spend most of our time in class on writing, I think that you should give us practice to do outside of class where I can try by myself without someone guiding me. Furthermore, what I struggle with the most is the stimulus section. I definitely need to work on my timing and sometimes I have a hard time understanding the passages. However, I do think I’m making progress this year and so are my other classmates.

Raymond
I think that I have performed very well in this class and have learned how to apply what I've learned much better. Learning the information through checklists has allowed for more time to work on skills that will be on the AP test. This has helped improve our scores greatly since the focus in class is to work on DBQs since if you did the checklist, you should know all of the information. To corroborate the growth of applying the knowledge we know better, the DBQ average score have increased by half a point. This means that the class is getting smarter due to the increase in test scores which means that the learning of dbqs are improving people's scores, helping us learn more and apply better. This is why I believe that this works and I love this teaching style since it prepares us for the more important things in the class.

Maggie
I disagree with the test results. The class may have grown and increased their test scores but personally my test scores have decreased from the previous test.

Quinn
I think that this class isn’t that hard I just don’t put in enough effort and time into the class to do well enough. I didn’t like the DBQ part of the last unit exam because I didn’t understand what the prompt was and I didn’t even know where to start. Content part was alright but I know I could’ve done way better had I actually studied the night before. And stimulus I felt was easy. Even though my grade shows otherwise, I feel that I have gotten better in this class since the beginning of the year. I feel that the checklists are difficult and shouldn’t reflect our grade in the class as much as it does, I understand why it needs to be done, and why it is done the way it is, it’s just the review quizzes are to hard to pass unless I try them 15 times.

Julian  
I agree with pretty much everything said in the blog post, I see a lot of my classmates becoming more confident and well versed in the class material, which was reflected in the scores listed. I like this class a lot more than what my grade reflects, the teaching style of the checklists makes the information stick very well and the in class discussions are always very interesting. However, I feel like the trade-off that comes with these checklists is that they are often daunting when you are behind and hard to catch up on, especially when it comes to the review quizzes at the every end.


Explain the Reasoning 

I tried to pull a variety of viewpoints. While most of the responses corroborated my narrative, there were some that did not.  The samples above hopefully are reflective of that range of viewpoints.

Katie, Kaspar, and Raymond all more or less adhered to my narrative. I love Kaspar's response because it offered a few suggestions for how to impact student learning. As of the time I am writing this, I have implemented a few of them into the general instruction. Raymond's response was included because of his style: he wrote it in a format similar to that of the essays. It was a proud moment and I wanted to include it.

Maggie's response is a tough one. I sat down with her to discuss it. She was one of the students whose narrative was impacted by the assessment. She had been feeling good about the course and was happy about her progress, not overly glowing, but positive. However, the assessment changed that viewpoint. She was one of the outliers on the assessment, and now she has a negative narrative of the course. This is troublesome, and I need to do something about it.

Quinn's response is interesting because he claims that the test was easy and he could do better, but the assessment had a different result. His response makes me question the use of the checklists. He knows he can come in to get help on them instead of taking them that many times, but he doesn't. So, how do I reach him? Also, how can a student's perception be that far off from the results? I need to build in more reflections so I can monitor students' beliefs.

Julian's response furthers Quinn's beliefs about the checklists. Julian missed a portion of school due to health reasons and has had trouble catching up. This is true of any AP class, but it still represents a larger problem: how can students feel successful when they fall behind, so that they stay motivated to complete the work?

Reflection and Impact

I need to ask the students more frequently how they feel.  I do ask informally, but I need to do it more formally to get all students to respond and gauge how all students are feeling.

I also asked the students if they had any suggestions for me. Not only did I ask them to respond to my narrative, I wanted suggestions if they had them. Most students did not have any, but a few did, and I have already implemented their suggestions.

Dave
The most challenging part of the class to me and many others would be the DBQ/writing elements. When we work in class and peer grade, we don't always get accurate/good feedback, which can be confusing at times. I'm not sure what we can do to fix this, but that's one thing I've noticed and confused myself with. Possibly more group grading?

Kaspar
I think that you should have a writing check on Schoology in the form of a timed practice quiz every two or three days. You could make us do a thesis one day, a contextualization the next day, and a full DBQ another day. Additionally, you could give us an example writing to grade where we, the students, can check the example based of the rubric.

Taking a moment to pause and figure out what is working is crucial to improving the instructional practices that impact learning. I do not want my class to burden students or cause them to feel inferior. Asking students for feedback and responding to it is a way to help all students.

There's No Feeling Like Skateboarding. S1E6 We Are EG Podcast

We hope you have enjoyed our previous episodes in our We Are EG Podcast series. This series is dedicated to telling the stories of Elk Grove High School. Staff, students and community members share their stories and in turn we get a glimpse of our commonality, building an understanding layered on the experiences of those who have lived and worked, studied and played, in the halls of Elk Grove High School.

In this week's episode of We Are EG, Ryan Asmussen sits down with fellow EG English teacher Matt Snow and two students to discuss skateboarding. Matt was at one time a semi-professional skateboarder and is currently the sponsor of the EG Skate Club.



If you have a story to share, or you have interest in learning about podcasting and/or joining us in the production of the We Are EG Podcast, we'd love to hear from you!

Tuesday, November 21, 2017

One Year in AP: Assessment Data (Week Fourteen)

By Mark Heintz

Context

I have two main focuses as I write this weekly blog, two driving questions that I keep in mind while making decisions:
  • How do I know if my students know? 
  • How do I get them to know if they know?  
Whether it is a skill or content, I want to know if they know it. I no longer think it is acceptable for me to guess or rely on a feeling about whether they know it. Getting the students to know if they know it is downright hard, but I am really attempting to get to a point where the students can recognize their understanding of the content and their progress on the skills. Therefore, the purpose of this year of reflection is to see how I make progress toward these two goals and to elicit feedback from staff, students, and, hopefully, people who follow along in the journey. You can read how last week went here.

Week Fourteen: Answer the Question


This week the content focus was on all of the standards the class has covered this year so far.  The primary focus was the 600-1450 time period.  Here were the standards for this unit:

    1. List and locate the major trade routes and major cities during the time period of 600-1450.
    2. List and locate the Sui, Tang, Song, Mali, Byzantine, Caliphates, and Mongol Empires.
    3. List technologies used and provide one example as to how the technology encouraged interregional trade of luxury items.
    4. List five historical examples as to how empires facilitated Afro-Eurasian trade.
    5. List four examples as to how empires used conquered people in their economies.
    6. List three methods used in the expansion of Islam throughout Afro-Eurasia.
    7. List one historical example as to how cross-cultural interactions resulted in the diffusion of advancements in Afro-Eurasia.
    8. List the crops/diseases diffused and their effects.
    9. Provide two examples as to how the Byzantine, Sui, Tang, and Song empires reconstituted classical-era governments through traditional and innovative sources of power to legitimize their rule.
    10. Provide two examples as to how new forms of governance emerged in Afro-Eurasia. 
    11. Provide one example as to how the Islamic Caliphate and Japan each synthesized local with foreign traditions. 
    12. Provide two examples as to how the Inca and Aztec created an imperial system in the Americas. 
    13. Provide one example for how contacts or conflicts encouraged significant technological and cultural transfers.
    14. The fate of cities: Provide three reasons that contributed to the decline of urban areas between 600-1450.
    15. Provide five reasons that contributed to urban revival between 600-1450.
    16. Provide one example as to how women increased their status in the Mongol Empire, West Africa, Japan and Southeast Asia.
    17. Provide one detail as to how serfdom in Europe/Japan and mita in the Inca Empire exemplified coerced labor.
    18. Provide one example as to how peasants revolted in China and the Byzantine Empire.
    19. Explain how the diffusion of Buddhism, Christianity, Islam, and Neo Confucianism changed gender relations. 

This week's skill focus was centered on analyzing charts, maps, and texts and pulling evidence from documents to support a claim.

    1. Write one cause/effect and one comparative short response that reflect understanding of essential content.
    2. Analyze charts, maps, graphs, and texts.
    3. Write a thesis statement, contextualize a prompt, and draw evidence from two documents to support the thesis.  

Cite Specific Evidence

First, how do I know that the students know that they know the content and can perform the skills?

I largely focused on this question in last week's blog post. I was pretty confident in their ability on the DBQ and the stimulus multiple choice, which is documented in that post.  To reinforce and assess content knowledge, I relied heavily on the Schoology checklists that the students complete each week. They couldn't take the assessments until they finished the checklists.  It is a small insurance policy to ensure that the students know they know the content.

How do I know that the students were successful on the big three focuses: content, writing, and analyzing?

The students took three exams.

1. An eighty-question multiple-choice content exam.  These questions are basic fact recall on all content that has been covered this year.  The majority of the questions were centered on the current unit, but a fourth of the questions were pulled from previous units.

The students scored an average of 67/80, an 83%.  The exam was taken in a 48-minute period, and all of the students were able to finish in time.

2. A document-based question (DBQ) writing portion, with two documents, on the role cities played in Muslim society.

The students were assessed using the following rubric:


The students scored an average of 2.68/4.  Argumentation was the point they missed most often.

3. A twenty-question multiple-choice stimulus exam.

The students scored an average of 14.1/20.


Explain the Reasoning

What do all of those numbers mean and tell me?

1. For the eighty-question multiple-choice content exam, the students improved by 2% over the last content exam, which is great!  We covered 900 additional years of history in this unit, and the students are still doing better.  While the average has increased, there are some students who did not do so well.  I am working on how to better serve those students while maintaining what is working for the majority.  Overall, the content checklists, an ongoing formative progress check throughout the unit, seem to be working for the students.

The biggest concepts missed were centered on sinification, the spread of Islam, and the Song Dynasty. I am very fortunate to work in a district with MasteryManager; the item analysis feature is amazing.  To give you a sample:




The students struggled with how Islam spread into particular areas. I recognize that the spread is nuanced, but the course deals in generalities for these questions.

Also, there were a few errors on my part, such as odd wording, and a few questions that could pertain to two or more empires. The averages above reflect those corrections.

2. For the DBQ, a 2.68 is amazing! The students scored a 2.1 on the previous unit's DBQ, which used the same rubric and assessed the same skills.  More than a half point of growth is great.  This is also the skill we have practiced day in and day out this unit, and it is great to see that work pay off.

3. For the stimulus exam, the students answered, on average, about two more questions correctly than on the previous unit's exam.  That is a big deal.  Two questions on a stimulus exam might be the difference between scoring a 3 and a 4, or a 4 and a 5.  I credit this growth to the document work that we do on a daily basis.  They are reading a ton of primary and secondary sources that are difficult to digest.  Their daily grind appears to have worked!

After the exam, the students went through the tests and recorded what they missed and why they missed those questions. I borrowed heavily from Dan Saken and his Learning Celebrations, a great tool for getting students to go through their assessments and reflect.

Reflection and Impact

Overall, I am extremely happy with the progress they are making.  There is still a long way to go: three more units and several more skills to master.  But the daily feedback on their writing and interpreting appears to be working.

I need to do something before the exams to ensure they know the content.  There needs to be a student reflection for them to know if they are ready and if they know it.  I am not sure what that looks like yet, but I am working on it for second semester.