Yesterday was another good day for me at BCCE. Here are some nuggets:
- SEE-I Technique: this can be used in chemistry to give students an opportunity to clarify and explain their understanding of a concept. The “-I” part is the most critical, as students are asked to provide an illustration of what the concept means. For instance, for Charles’s law, you might describe the behavior of a 2-year-old being given more and more candy.
- If you read this blog often, you’ll know I have an unhealthy obsession with Google Docs. I use it extensively with my classes, but I’d never considered using the Chat feature to facilitate in-class collaboration during labs. The thought of high schoolers having free rein in a chat window is risky, but one presenter suggested keeping it anonymous and moderating it by adding guiding questions.
- I’m not a flipper, but I have been creating problem-solving videos for my students to access outside of class for particularly challenging topics. One talk suggested hosting videos on YouTube, because you can gather a lot of information about how effective they are. For instance, you can track not only how many times a video has been viewed, but at what point students stopped watching, and which parts they rewound and watched again. This information could inform corrections to a video, or shape how you make future ones.
- One talk made me consider whether I’m soliciting feedback often enough in my classes. While I probably won’t do it weekly, I think you can gather valuable information about your course’s strengths and weaknesses on a unit-by-unit basis. We’ll see.
That’s all for now. I’m ready for another full day of talks. Follow along with my notes, and add any comments or questions you may have, and I’ll try my best to clarify.
While not always in realtime (thank you Google Docs offline!), you can follow and ask questions/post comments on my notes from the BCCE conference here.
Day 1 was short, but pretty good. While not all of the sessions I attended fit my needs, some were really good. I’ll only discuss one today.
The end of cookbook labs: Student-designed procedures: This talk was by Dr. Laura Lanni who has experience teaching both undergraduate chemistry and AP chemistry, so her perspective was very relevant for me. In her experience, when students are given a procedure, they tend to follow it blindly, finish it quickly, and ultimately forget what the purpose of the experiment was. In switching to student-designed procedures (DIY), students are forced to think about what they’re doing, and why they’re doing it.
She begins with a seemingly simple, and notably non-chemistry, activity: she asks her students to tell her how to make a peanut butter and jelly sandwich. She observes that students are very eager to just DO IT, rather than thinking about the process. Once the students realize that “put the peanut butter on one slice and the jelly on another slice” doesn’t yield their desired result (picture a jar of peanut butter and a jar of jelly sitting on two slices of bread), they begin to focus on the steps of the process more closely.
Rather than presenting a prompt and leaving students to tackle the task individually, her DIY approach is very discussion-based. A standout example involved a lab that I generally dread due to the mindless procedure-following: Determination of the Equilibrium Constant of an Iron(III) Thiocyanate Reaction. The outcome of the lab depends heavily on solution preparation, and on an understanding of how to use the data once it’s collected. There are so many steps that students quickly lose sight of what they’re doing, and perhaps never really understand why they’re doing it. Great numerical results can often hide the fact that their conceptual understanding of equilibrium is lacking.
Dr. Lanni presents this lab as a DIY experiment. While students’ impulsivity doesn’t necessarily decrease over the course of the year (“Let’s just mix X and Y”), through discussion, students are able to move beyond using equal concentrations of each reactant to thinking about what information they actually need to obtain the equilibrium concentration. Let’s start with 0.1 M of each reactant. How does this get us to Keq? Is it an initial or an equilibrium concentration? Don’t we need at least one equilibrium concentration to find Keq? How can we find an equilibrium concentration? We’ve used titration to find concentration; will that work? Damn you Le Châtelier! What other tools do we have to determine concentration? (Surprise! Beer’s law.) Students can map out their ICE table, along with an experimental plan, and obtain their data with a firmer understanding of what they’re doing and why they’re doing it. And pretty soon, students will notice that their results are consistent with other groups’ (Hey! The equilibrium constant is… constant!).
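For anyone curious why a single Beer’s law measurement is enough, here’s a minimal sketch (the molar absorptivity and concentrations are made-up illustration numbers, not data from the talk) of finishing the ICE table for Fe³⁺ + SCN⁻ ⇌ FeSCN²⁺ once the equilibrium [FeSCN²⁺] has been measured:

```python
# Assumed values for illustration only -- not real experimental data.
EPSILON = 4700.0   # molar absorptivity of FeSCN2+ (L mol^-1 cm^-1), assumed
PATH = 1.0         # cuvette path length (cm)

def keq_from_absorbance(fe_init, scn_init, absorbance):
    """Fe3+ + SCN- <=> FeSCN2+ : finish the ICE table from one measurement."""
    # Beer's law gives the one equilibrium concentration we need:
    x = absorbance / (EPSILON * PATH)   # [FeSCN2+] at equilibrium
    # The C row of the ICE table: each reactant lost x mol/L.
    fe_eq = fe_init - x
    scn_eq = scn_init - x
    return x / (fe_eq * scn_eq)

# One trial: 0.0010 M of each reactant, measured absorbance 0.47
print(keq_from_absorbance(0.0010, 0.0010, 0.47))
```

Every group’s numbers differ, but the computed Keq should come out (roughly) the same, which is exactly the “the equilibrium constant is… constant!” payoff.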
Dr. Lanni’s example of DIY with this equilibrium lab definitely inspired me to rethink how I approach this lab. Is it really necessary to find the Keq for a half-dozen different concentration ratios? Is it essential that students obtain their own FeSCN calibration curve? (or is it possible to sneak this into a separate Beer’s law lab?). If not a fully DIY approach, in what ways can I better assess during the lab that students are not missing the underlying concepts of the lab?
On to Day 2…
Over the next several days, I will be attending the Biennial Conference on Chemical Education, held this year at Penn State. This is the second time I've attended this conference. It's a great opportunity to connect with chemistry teachers and professors who are passionate about chemistry and, more importantly, about improving the quality of student learning. It is similar to ChemED (held in alternate years), but more geared towards undergraduate chemistry.
The conference begins with symposia on Sunday afternoon. As is usually the case, there are far more interesting talks than I am physically able to attend. As of now, I plan to split my time Sunday between three lab-centered symposia:
- Issues on Teaching and Learning in the Chemistry Laboratory
- Electronic Notebooks in the Chemistry Laboratory
- Guided Inquiry
I'll be taking notes/updating my schedule in this Google Doc (feel free to comment!), and I will blog about the highlights each day. Follow #BCCE2012 on twitter this week to hear from many more voices on this year's conference.
We’ve used Vernier probes and LoggerPro frequently in our classes for years, usually using a computer/LabPro interface to collect data. We of course always get good data, but the limiting factor is always the computer, as boot up/login time on our network could take anywhere from 5-10 minutes. Since we’ve got 40-minute periods on most days, this made it difficult to fit in labs as needed. This past year, one of our biology teachers ordered a class set of LabQuests, which we were able to borrow from time to time. The LabQuests made the data collection process much more seamless, and we were able to collect data much more efficiently in our 40- and 80-minute periods. We decided very early on that we wanted to order a dedicated chemistry set for next year.
Then the LabQuest 2 came out.
LabQuest vs. LabQuest 2
The original LabQuest boots up noticeably faster than the LabQuest 2. I’ve found that the LabQuest 2 takes at least 45 seconds to 1 minute, whereas the original LabQuest is up in 10-15 seconds.
Both models have a rechargeable battery. Even with heavy use, both can last a full class day before they need to be recharged (when powered down during downtime).
The LabQuest 2 comes with significantly more screen real estate than the original LabQuest. (13 cm vs. 9 cm diagonal). Like a tablet, it has a horizontal and vertical orientation (turned off by default). Switching from one orientation to another is very clunky (1-2 seconds of blank screen when you turn it), and the vertical orientation doesn’t seem to add anything useful to the collection mode.
The touch screen is significantly improved on the LabQuest 2. You can easily get by without a stylus. It is most responsive when navigating between tabs and menus; when entering text or numbers, however, it can be a little too sensitive.
The keyboard is annoying in both models, for different reasons. On the original LabQuest, even with a stylus, students had trouble typing due to the small screen space. With the LabQuest 2 there’s plenty of room and the keys are relatively easy to hit, but at times too sensitive. The main keyboard would be improved if it included a period, comma, and basic symbols, rather than burying them 2 or 3 clicks deep (it took me about 2 minutes to find the percent symbol, for instance). The placement of the space bar is also awkward. I suppose I’ve just become spoiled by the simplicity of iOS keyboards.
The LabQuest 2 has 3 analog ports (Channels 1-3) and two digital/sonic ports (DIG 1-2). The original LabQuest has 4 analog ports and two digital ports.
Both models have a USB and a mini-USB port. The Original LabQuest has an SD expansion slot, while the LabQuest 2 has a micro SD expansion slot.
Both models have a built-in microphone and ambient temperature sensor. The original LabQuest has a sound sensor. The LabQuest 2 has GPS, an accelerometer, and light sensor.
Both interfaces can connect directly to a computer via a micro-USB port for added flexibility in data collection. Users can save data files and graphs to external flash drives. The LabQuest 2 stands out, however, because it has built-in WiFi. It’s incredibly easy to get connected and set up email. This allows users to print (to a wireless HP printer), email screenshots (.png), graphs (.pdf), LabQuest data files (.qmbl), and text files (.txt).
Students can also use any web-enabled device to connect to the LabQuest 2, either by scanning a QR code or by entering a short URL into a browser. There, they can perform many of the same analyses as in the Qualitative Analysis iPad app, or simply watch live readings.
The Periodic Table app is similar in both models, but there is more data in the LabQuest 2 app. Both models have a stopwatch. In addition, the LabQuest 2 has an Audio Function Generator.
The added screen space on the LabQuest 2, along with the improved touch screen, makes it much easier to collect and analyze data without connecting to a computer. For instance, we never had much success with managing the SpectroVis probe on the original LabQuest, and instead connected it directly to computers via USB.
Inspired by a recent Journal of Chemical Education article, I decided to add a spectrophotometry challenge as one possible lab challenge for my post-AP seniors.
Design and perform a colorimetry experiment to determine the percentage of cranberry juice in a cranberry-apple juice cocktail.
The paper suggests that apple juice absorbs around 392 nm, and cranberry around 520 nm. I collected an absorbance vs. wavelength full spectrum for several samples:
There is a single clear peak in the apple juice sample around 391 nm, which is consistent with the paper. This same peak is present in the Cran-Apple sample, so I assumed the other peak could be attributed to the “cranberry” juice. However, when you look at the “Just Cranberry” (ingredients: cranberries, water) sample, there is also a peak at 392 nm, suggesting that the pigments present in apples are also present in cranberries (and in significant amounts). So using the “cranberry only” peak, I produced a calibration curve with diluted samples of Just Cranberry.
We can then use the linear(ish) relationship between absorbance and concentration to estimate the percent concentration of cranberry juice in the other samples.
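For anyone who wants to see the arithmetic behind the calibration curve, here’s a minimal sketch in Python. The standard dilutions and the unknown’s absorbance below are made-up stand-ins, not my actual readings:

```python
# Hypothetical calibration standards -- stand-ins for the Just Cranberry
# dilutions: (percent cranberry juice, absorbance at ~520 nm).
standards = [(20, 0.11), (40, 0.22), (60, 0.34), (80, 0.45), (100, 0.55)]

# Ordinary least-squares line through the standards
# (Beer's law predicts absorbance is proportional to concentration).
n = len(standards)
mean_x = sum(p for p, _ in standards) / n
mean_y = sum(a for _, a in standards) / n
slope = (sum((p - mean_x) * (a - mean_y) for p, a in standards)
         / sum((p - mean_x) ** 2 for p, _ in standards))
intercept = mean_y - slope * mean_x

def percent_from_absorbance(a):
    """Invert the calibration line to estimate % cranberry in an unknown."""
    return (a - intercept) / slope

# Absorbance of the cranberry-apple cocktail at the "cranberry only" peak
print(percent_from_absorbance(0.075))
```

The same inversion works for any unknown read at the cranberry-only wavelength, as long as its absorbance falls within (or near) the range spanned by the standards.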
Though we’ve used the SpectroVis with this sort of protocol in several contexts, none of the three groups who did this challenge took this approach to answer the question. Still, all three obtained very similar values (between 13-14% cranberry content).
All in all, the LabQuest 2 is a great improvement over the original LabQuest. Next year, I hope we’ll be able to explore the possibilities of the Connected Science System to enhance our lab experience.
Since break interrupted our current unit, I’ve been exploring using Juno to create a review for my students. Today, I tried to convert it into a mini mobile “textbook” for the rest of the unit. I found it refreshingly easy to do.
Click “New Slide” for a content page. This can be plain text, or you can insert a media file. Juno will automatically create multiple pages as needed to optimize viewing on devices of all sizes. Students can also adjust font size from their device.
If you click “Media,” you can insert an image, audio file, or video via upload or web link. You can also incorporate media files into the question and the answer choices. It is easiest to add videos via a youtube or vimeo link to ensure browser compatibility, and faster opening. You can also type in a regular web address to create a hyperlink.
You can see the (limited) formatting options by clicking “Format.”
Once you’ve created multiple slides or questions, you can reorganize by dragging and dropping.
I find it easiest to create a separate lesson document for each topic. These lessons can then be organized into a bundle.
Bundling your Documents
When you first open a new bundle, there’s not much there yet.
Repeat for all of the documents you wish to add to the bundle. You can drag and drop to reorganize them, or click “Add Section” to organize into subsections. It looks as though you can add bundles to your bundles, which will help organize your textbook into subsections.
Once you are finished, you can post the entire bundle, or individual lessons within your bundle. To post in bulk, click on the bundle in your document list, and click “Start/Stop.”
Or, for individual lessons/assignments (preferable):
Click on individual students or whole classes to provide access to the lessons, then click save. It will be immediately accessible to student accounts.
Students can view the lessons from an HTML5 web browser, or from a mobile device (iOS for sure, haven’t checked Android). Here’s what students will see from an iPod touch/iPhone:
Pros and Cons
Juno is still in beta, so I’m hoping they’ll make some improvements before the full launch. For now, here are some pros and cons for the textbook component of Juno:

Pros:
- Very simple, intuitive UI for creating and viewing lessons.
- Easy to add questions for understanding checks, with immediate feedback to students.
- Can embed video: great for showing chemical demonstrations, flipped lessons, or solutions to sample problems.
- Integrated with my grade book, JupiterGrades. Once out of beta, it will likely be linked to other online gradebooks. See Terie’s post for more on assessments with Juno.
- As with JupiterGrades, the customer service is very responsive and helpful. The Juno help link also provides lots of information for creating your documents.

Cons:
- Once a student opens a lesson, or you share the bundle/lesson on the marketplace, there are limited options for editing it. I haven’t explored it enough to see what happens from term to term. For now, I will post lesson by lesson.
- Limited formatting. I’m glad that there are subscripts and superscripts, but it isn’t even possible to import basic symbols, or copy and paste a symbol from a Word document. LaTeX integration would make this more useful for math and science.
- No offline access. It would be great to make this available for offline reading, or easily exportable as an ePub document.
If you use JupiterGrades, give Juno a try with your classes. They make it very easy to import your rosters into Juno. If you don’t use JupiterGrades, you can still set up an account to use this with your students (sans gradebook integration).