We just completed our Kindergarten unit on living and nonliving things. As with all our elementary science units, we collect pre- and post-assessment data to get an overall idea of how well students learn the intended concepts. Of course, the main idea of the unit is differentiating between living and nonliving things, so we focus our pre- and post-assessments on this skill with a simple paper-and-pencil activity:
I remember back in my grad school days reading about student preconceptions (don't you dare call them misconceptions now!) about what's alive and what's not. I believe it was a study by Inagaki and Hatano (2002) reviewing research on conceptual change in children to learn how they think about living things. Good reading, even if it's not required for a response paper!
Our own data from 7 classes of Kindergartners shows some interesting trends:
I broke it down a little further by categorizing these items as animals (cat), plants (tree, flower), animate objects (car, cloud, computer) and inanimate objects (ball, teddy bear, block):
Not surprisingly, the animal (the cat) is the most obvious of all the items to Kindergarten students, but I was impressed that the plants followed so closely behind. Our Pre-K students do have a general plants-and-animals unit, however, so perhaps that has some influence here.
What's most interesting to me is noticing which nonliving objects are misidentified the most: the car, computer, cloud, and teddy bear are each misidentified by about a third of the students! It's pretty easy to understand why: three of them move and change, and one resembles a living thing.
Over the course of the unit, students learn about the characteristics of living things (they grow and change) and their needs. They experiment by trying to grow both seeds and blocks, and they take care of a goldfish to learn what it needs to live. In addition, they play lots of sorting games to reinforce new ideas about what's living and nonliving. How well does it work? Check out the post-assessment results:
Note: These results are missing one class that I haven't received data from yet.
Not too shabby. Granted, this is not a very deep assessment (we have others for investigating students' understanding in a more meaningful way), but for a quick check of factual knowledge this is good to see. It's also useful to see what wasn't learned as well: clouds appear to be especially confusing for some Kindergartners. Thinking about it, they do fit our characteristic of living things by growing and changing, and it's difficult for students to investigate what clouds need or don't need. So I'll have to think more about this one. Maybe the right approach would be a research investigation: posing the question "Are clouds living things?" and having students consult various resources (library books, websites, parent interviews) to collect "data" and then reach a conclusion as a class. Something for next year!
And that's the wonderful thing about doing pre/post assessments (besides being a great example of learning for student portfolios): it's almost impossible to collect data like this and NOT have it influence and improve your teaching. So give it a try. I dare ya!
I read this yesterday and had to try it out with my favorite 4-year-old. Daughter Tabitha attends half-day pre-K at the local public school. She started to circle the teddy bear, saying, "A bear is alive." "A teddy bear is?" I said in my best clarifying tone (she can't read yet). "No, a bear is alive, but a teddy bear is not." She went on to circle the cat, the tree, and the flower, finishing with, "That's all I can find."
"How do you know the tree is alive?" I asked innocently. "It drinks."
Over the course of the next few minutes, I was able to get out of her that they had studied living things in pre-K.
Thanks for sharing, Christopher. That's exactly the way to deepen a shallow pen-and-paper assessment like this: with follow-up questions. Questioning like that reveals a lot more of a student's thinking, just as it did with your daughter and the teddy bear. We try to add oral components to our assessments, but of course that's time-consuming when you have a class of 20 (or more!). This is always a challenge with young students, and I'm still trying to come up with ways to make assessment more feasible…