My 3rd graders will be starting off the year with a unit on sound and light, which has always been one of my favorites (so much to play with and observe, and so obviously relevant to students’ lives). The sound part of our unit is pretty heavily based on the FOSS unit Physics of Sound, which many teachers are probably familiar with. However, there appear to be some changes coming down the pipeline with the impending national science curriculum and this summer’s release of the K-12 science framework from the NRC (for a primer on this, check out this article from Education Week). The new framework takes a new approach to physical science standards that could change the emphasis of units like mine.
Instead of grouping sound and light under energy standards, as many standards documents have done in the past, the NRC’s committee decided to create a separate physical science standard called “Waves and their applications in technologies for information transfer.” Whoa, that’s a mouthful, but it’s cool to see information technology specifically mentioned. Why call out waves specifically? In the committee’s words on page 88:
“This idea is included in recognition of the fact that organizing science instruction around core disciplinary ideas tends to leave out the applications of those ideas. The committee included this fourth idea to stress the interplay of physical science and technology, as well as to expand student’s understanding of light and sound as mechanisms of both energy transfer and transfer of information between objects that are not in contact.”
I think this is dead on: even thinking about my own unit, we spend a lot of time playing with tuning forks and flashlights, but very rarely emphasize more technological applications such as speakers or radio waves. I realize there are developmental reasons to keep a sound and light unit focused on observable, visible phenomena, but surely there are ways to make technological applications of waves more accessible to elementary and middle school students. Here’s one idea that I’ve used with success in the past to make sound waves visible:
Get a microphone probe, such as the one sold by Vernier at right. No need to buy one: I borrowed one from my high school physics department, so ask there first. You’ll also need to borrow the software that goes along with the probe to graph its output, for example Logger Pro. Now it’s time to play: get the probe set up to collect data, and put it near an instrument that can create pure tones (an electronic keyboard works well, especially on a setting like “Whistle”). Start collecting data and playing your instrument, and you should see the probe’s output being graphed. It will look like a mess at first, so stop recording and play with the axes of the graph. You’ll need to stretch out time on the x-axis, because most audible sound waves will be so packed together that you won’t be able to see the wave. Once you’ve stretched out the x-axis enough that you can see a wave (like the one pictured on the left), now comes the fun part! Start collecting data again, and experiment with playing notes at higher and lower pitches as well as louder and softer volumes. Pretty neat, huh? You’ll be able to see exactly how the sound wave changes: how volume is related to the height of the wave and pitch is related to its wavelength (or speed of vibration).
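For anyone without probe hardware handy, the same two relationships the probe makes visible (louder → taller wave, higher pitch → more tightly packed cycles) can be sketched in a few lines of Python. This is just an illustrative stand-in, not part of the Vernier/Logger Pro workflow; the function names and numbers here are my own invention:

```python
import math

def sine_wave(freq_hz, amplitude, duration_s=0.01, sample_rate=44100):
    """Sample a pure tone, like a keyboard note played near the probe."""
    n = int(duration_s * sample_rate)
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n)]

def peak(samples):
    """Wave height -- this is what changes when you play louder or softer."""
    return max(abs(s) for s in samples)

def upward_zero_crossings(samples):
    """Count how often the wave crosses zero going up; more crossings
    in the same time window means a higher pitch (shorter wavelength)."""
    return sum(1 for a, b in zip(samples, samples[1:]) if a < 0 <= b)

soft_low = sine_wave(440, 0.2)    # a soft A4
loud_high = sine_wave(880, 0.8)   # a loud A5, one octave up

print(peak(loud_high) > peak(soft_low))                              # True: louder note, taller wave
print(upward_zero_crossings(loud_high) > upward_zero_crossings(soft_low))  # True: higher note, more cycles
```

Graphing those sample lists (even on paper) reproduces what the probe shows on screen, which could make a nice extension for older students.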
Once you’ve got the hang of it, any invisible sound wave can be made visible… which can lead to all sorts of class investigations. I wish I could find an online version of a sound wave visualizer that uses a laptop’s built-in mic, because that would allow many students to work with it at the same time. If anyone out there knows of one, let me know!
Any other ideas out there? How else can traditional teaching of waves be updated to include modern technology applications? I’m all ears!