New paper: Exploring teacher agency as they use mathematics curriculum materials

Well, it’s been quite a long time since I wrote a blog post — not surprising, I suppose, given the state of the world — but I’m happy to be back to share a new paper that just came out. This paper, published in Teaching and Teacher Education, reports the main finding of my practicum study.

As most of my blog readers will know, I spent many years writing mathematics curriculum materials before I began my PhD. As such, I’m very interested in the ways these materials are used by teachers. As a developer, my focus was on making the materials as educative and useful as possible. I saw my role as providing the information and support for teachers to create lessons that gave students opportunities to engage in rich mathematical experiences. As I spent more time talking to teachers, however, I started thinking about all the influences on teachers’ thinking and decisions that had little to do with their curriculum materials and more to do with all the other aspects of their professional obligations and pressures. It occurred to me that even with the perfect curriculum materials and abundant professional development, teachers will still face challenges in organizing their mathematics instruction around student thinking if pressure to produce good test scores or stick to a master schedule feels in conflict with the lessons they would otherwise plan and teach.

The existing research on these issues leaves little question as to whether things like standardized testing and limited class time play into teachers’ planning and teaching. However, I didn’t find much work that really dug into the nuances of teachers’ reasoning. In particular, there didn’t seem to be many studies that looked at teacher agency, or teachers’ sense of control over their decisions. When teachers plan lessons around test prep rather than rich mathematical tasks, for example, is it because they choose to do so? Or do they feel they have little choice?

I decided to focus my practicum on exploring this issue. For this study I used linguistic cues to explore teachers’ agency as they used mathematics curriculum materials. I examined when teachers felt agency to plan their instruction around considerations of students and how that sense of agency was shaped by other factors within teachers’ professional contexts. Unsurprisingly, I found that standards and assessments were a significant constraint on teachers’ agency. But I also found that students played into their thinking in complex ways despite these constraints. The two teachers in my study found space to consider their students even within strong constraints. And occasionally, student needs even overpowered other contextual constraints.

I’m not sure I discovered anything brand new in this study, but I do think it provides a new perspective on a chronic issue in mathematics teaching and learning. I also walked away from the study with a stronger appreciation of all the considerations, goals, and pressures teachers coordinate as they do their important and difficult work.

One more thing I want to mention: This paper was desk rejected from two journals before I submitted it to TATE, and the first round of reviews from TATE asked for significant revisions. The framing and background had to be completely redone, and I also had to redo a part of the analysis. But in the end, it was accepted. So, the moral of the story is: Don’t give up on a manuscript! This acceptance was all the sweeter for the work it took.

New paper: Applying levels of abstraction to mathematics word problems

A new paper stemming from the CT4EDU work just came out in TechTrends. I think of this paper as a companion piece to this one that came out last year in the Journal of Computers in Mathematics and Science Teaching. That article was an empirical analysis of student work, showing that looking at mathematics problem solving through the lens of levels of abstraction—an idea I drew from the computer science education literature—could provide a new and interesting perspective on what is going on with students’ reasoning when they make common errors. This new TechTrends piece is theoretical and looks at how we could adapt an instructional framework from CS education research for use in elementary mathematics (and perhaps more importantly, why it makes sense to attempt such an adaptation). In short, the empirical piece establishes a problem, or at least demonstrates a new perspective on a problem. This new theoretical piece looks at how we might solve it.

(Here is a link to a free view-only version if you don’t have access to TechTrends. As always, contact me if you’d like a preprint.)

The principal mathematics instructional issue that these papers focus on is the ways in which our culture of schooling tends to lead children to produce answers to mathematics problems without adequately attending to context. Every elementary school teacher, and probably every parent with kids old enough to have gone through elementary school, has likely watched a child skim over a contextualized problem, pick out the numbers, make a guess at which operation to use on them, and report a numerical answer without ever considering whether that answer makes sense in context. Even though a number of instructional frameworks have been developed to combat this issue (a few are cited in the paper), the problem endures. Yesterday a colleague lamented to me that her daughter had produced an answer of 1/3 of a frog to a word problem. I’m also teaching math methods to preservice elementary teachers this semester, and last week, in response to an assignment requiring him to do a task-based interview with a child, one of my students expressed surprise that his interviewee was perfectly capable of executing the arithmetic required for the task, but seemed to have no idea how to make sense of what the word problems were asking him to do.

The framework that we (my coauthor Aman Yadav and I) present in this paper has yet to be tested empirically, so I can’t make any claims as to its efficacy for solving this issue. However, as someone who finds herself solidly situated at the intersection of two domains, I see a major part of my work as identifying ways that mathematics and computer science education might work together. Sometimes, that collaborative work can take the form of integrated instruction. Other times, I have found that importing ideas from one discipline into the other can be really useful, even if the benefit is to children’s learning in only one of the disciplines rather than both. In this case, I have found the computer science education literature’s focus on moving among levels of abstraction as a skill to be fostered in students very useful for highlighting a common instructional issue in mathematics. And I figure that new ways of looking at enduring problems might be helpful—and certainly can’t hurt.


New paper: Teacher profiles for CT implementation

Another paper based on the CT4EDU project is now available in Education and Information Technologies. This paper, written in collaboration with Aman Yadav and Rachel Larimore, presents an analysis of the classroom video we collected during the first year of the project. Each of our partner teachers implemented at least one unplugged math or science lesson in the 2018-2019 school year where they had intentionally planned to include attention to one or more computational thinking (CT) practices (abstraction, decomposition, patterns, or debugging). We coded the videos in an effort to make sense of how these teachers translated the CT ideas into their teaching practice.

(Note that this link will take you to a free, read-only online version. If you would like a preprint feel free to contact me.)

We found a lot of interesting and rich variation in the ways these teachers provided opportunities for their students to engage in CT. They used three primary strategies to do so:

  • They framed lessons around a CT practice. For example, one teacher introduced a science activity by pointing out how students would be engaging in debugging as they tested and refined the design of their rubber-band rockets.
  • They prompted students to use a CT practice in the moment. For example, it was common for one teacher to suggest that students stop and look for patterns as they worked through a math task.
  • They invited reflection on CT by pointing out or highlighting occurrences of CT that had already happened. For example, at the end of a math lesson, one teacher asked students to think of examples of how they had used abstraction during the lesson.

While most teachers used all three of these strategies at some point in their lessons, the ways in which they combined the strategies varied. We grouped our teachers into four profiles to highlight the ways in which they incorporated CT through use of the strategies:

  • Some teachers frequently used all three strategies, and tended to reference all four of the CT practices during the course of a lesson. These teachers seemed to want to support students in seeing the CT practices as general and widely applicable problem-solving strategies.
  • Other teachers tended to focus their lessons on one practice, often connecting their within-lesson prompting to pre-lesson framing or post-lesson reflection opportunities. They seemed to want to provide multiple extended opportunities for students to use one practice in a single lesson.
  • One teacher seemed to use CT more as a tool to guide her own thinking about a lesson than to communicate anything particular about CT to students. She did not use the CT vocabulary often, although she asked questions and directed discussions that seemed to be aimed at helping students engage in the higher-level thinking involved in CT.
  • Finally, one teacher mostly relied on in-the-moment prompting with little use of the other strategies.

Because we did not collect student data to evaluate the impact of these strategies and profiles of implementation, we worked hard to avoid placing a value judgement on them. Rather, in the discussion, we connected the strategies and profiles to work from other disciplines and reflected on the kinds of professional development that could support teachers in using these or other strategies as they bring CT into their classrooms — particularly CT that is integrated into another subject. I think the primary contribution of this paper is a shift from thinking about how to support teachers in learning about CT to thinking about how to support teachers to use CT thoughtfully in the context of their classrooms.

I hope others find the paper interesting and useful. I enjoyed writing this one, and my prolonged engagement with the video made me all the more appreciative of teachers.

New Paper: Levels of Abstraction in Math Problem Solving

As most of you know, one of the many intellectual puzzles I ponder in my research and professional development work is trying to understand the relationship between the thinking processes highlighted in the computer science literature and the thinking processes used in mathematics — in particular, elementary mathematics.

This morning, a new paper I co-authored with Aman Yadav and Marissa Zhu came out in Journal of Computers in Mathematics and Science Teaching. This paper is the result of some of the examinations I’ve done of how mathematicians versus computer scientists talk about abstraction, specifically. In mathematics, the common narrative is that we should structure instruction around a basic progression from concrete representations and experiences to more abstract representations and experiences. The focus is on the distinction (or sometimes, continuum) between concrete and abstract. The words concrete and abstract aren’t things we ask students to think about in most cases. Rather, they are terms that guide pedagogical decisions or frame research studies (e.g., Agrawal & Morin, 2016; Dubinsky, 2000; Harel & Tall, 1991; Jao, 2013).

In computer science, on the other hand, abstraction isn’t discussed in relation to concreteness. Rather, computer scientists tend to draw distinctions between different levels of abstraction (e.g., Armoni, 2013; Hillis, 1998; Wing, 2006). The levels don’t vary in concreteness, per se, but rather in scope and level of detail. At higher levels of abstraction, one can consider a wider scope of a problem, but only at a coarser level of detail. At lower levels of abstraction, one can consider a finer level of detail within a narrower scope. Learning to move among levels of abstraction to change your view of a problem is discussed as an important learning goal for students (Armoni, 2013; Hazzan, 2008).
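To make the scope-versus-detail idea concrete, here is a small illustrative sketch of my own (not an example from the paper, and the problem and function names are just mine): the same word-problem computation written at two levels of abstraction, with a high-level plan that hides the arithmetic and a lower-level routine that attends to its details.

```python
# A toy illustration (mine, not the paper's) of "levels of abstraction" as
# scope vs. detail, using a classic word problem: how many buses are needed
# to carry a group of students?

def buses_needed(num_students: int, seats_per_bus: int) -> int:
    """Higher level of abstraction: the whole problem in one view, coarse
    detail. The plan is simply 'divide and round up'; how the division is
    carried out is hidden at this level."""
    return round_up_divide(num_students, seats_per_bus)

def round_up_divide(total: int, group_size: int) -> int:
    """Lower level of abstraction: narrower scope, finer detail. Here we
    attend to the arithmetic itself, including the remainder that produces
    a 'fraction of a bus' if it is ignored."""
    full_groups, remainder = divmod(total, group_size)
    return full_groups + (1 if remainder else 0)

if __name__ == "__main__":
    # 130 students, 40 seats per bus -> 4 buses, not 3.25 buses.
    print(buses_needed(130, 40))
```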

I was intrigued by this idea of levels of abstraction and the need to change one’s viewpoint of the problem at different points in the problem-solving process. Armoni (2013) discussed levels of abstraction in the context of algorithm design. In this new paper, we identified different levels of abstraction students needed to employ and move among in order to solve a commonplace elementary mathematics task. Using these levels as a lens, we examined fourth- and fifth-grade students’ work on the task. It turned out, interestingly but not surprisingly, that many of the errors students made while they solved the task related to challenges in moving among the levels of abstraction. Abstraction difficulties accounted for many more errors than difficulties with executing mathematical skills like counting or addition.

In the paper, we argue that this result suggests bringing CS’s explicit attention to levels of abstraction into mathematics instruction could improve students’ mathematics performance. I’m excited to share these findings with our CT4EDU teacher partners and see how they take up the ideas in their classrooms this year.

References

Agrawal, J., & Morin, L. L. (2016). Evidence-based practices: Applications of the concrete-representational-abstract framework across math concepts for students with mathematics disabilities. Learning Disabilities Research and Practice, 31(1), 34–44.

Armoni, M. (2013). On teaching abstraction in computer science to novices. Journal of Computers in Mathematics and Science Teaching, 32(3), 265–284.

Dubinsky, E. (2000). Mathematical literacy and abstraction in the 21st century. School Science and Mathematics, 100(6), 289–297.

Harel, G., & Tall, D. (1991). The general, the abstract, and the generic in advanced mathematics. For the Learning of Mathematics, 11(1), 38–42.

Hazzan, O. (2008). Reflections on teaching abstraction and other soft ideas. ACM SIGCSE Bulletin, 40(2), 40–43.

Hillis, W. D. (1998). The pattern on the stone. New York: Basic Books.

Jao, L. (2013). From sailing ships to subtraction symbols: Multiple representations to support abstraction. International Journal for Mathematics Teaching and Learning, September 2013, 15 pages.

Wing, J. M. (2006). Computational thinking. Communications of the ACM, 49(3), 33–35.

New paper: Elementary teachers and integration of CT

I’m proud to share a newly-released article in the Journal of Technology and Teacher Education, written with MSU faculty Aman Yadav and Christina Schwarz, entitled “Computational thinking, mathematics, and science: Elementary school teachers’ perspectives on integration.” This is among the first journal articles we have out based on the CT4EDU project, which has been the focus of my research assistantship for two years. I’m excited that our findings are starting to roll out.

This piece focuses on what we learned from interviews we conducted with our partner teachers before the project began in earnest. The interviews focused on how teachers connected computational thinking to their existing teaching practices in mathematics and science. Our goal was to use the information we gained in the interviews to help us design professional development experiences that met teachers where they were.

In my view, one of the main contributions of this piece is that we worked hard at pushing against the view that the goal of educative experiences for teachers (and for students, for that matter) should be to identify and correct or eradicate misconceptions. Rather than focusing on identifying what teachers said that was contrary to commonly accepted views of computational thinking, we focused on the positive connections between CT and teachers’ existing practices. We viewed teachers’ current conceptions about CT as resources to build upon instead of mistakes to correct. Yes, the teachers tended to talk about algorithmic thinking as following predetermined steps. But moving from following algorithms to developing algorithms is a smaller step than moving from no knowledge of algorithms to developing algorithms. Similarly, they tended to talk about automation in terms of students automatically answering basic math facts, which is different than thinking about using a computer to automate something. Yet, they talked about students’ automaticity as lessening the cognitive burden for kids — and automation on computers can do that, too.

This paper took several rounds of review to get accepted, but that turned out to be a positive thing because it allowed us to talk specifically, in the discussion, about ways we built upon teachers’ thinking in our professional development sessions.

On a more personal note, I’ll also mention that I’m particularly proud of this piece because conducting the interviews was my first official foray into formal data collection in graduate school. So it’s nice to see that effort come to fruition both in the PD sessions and in publication. I hope others find the paper useful.


FIRST JOURNAL ARTICLE! Synergies and differences in mathematical and computational thinking

I have reached a milestone in my academic career: My first peer-reviewed journal article has been published!

Written with my UChicago colleagues Liesje Spaepen, Carla Strickland, and Cheryl Moran, the article is called, “Synergies and differences in mathematical and computational thinking: Implications for integrated instruction.” It is currently available online (this link will get you to a free eprint) and will eventually appear in a special issue on computational thinking from a disciplinary perspective in Interactive Learning Environments.

The history of this piece, and the special issue that will contain it, is rather interesting. Back in 2016, I attended a PI meeting for the NSF Cyberlearning program. I wasn’t a PI at the time, but the program invited interested folks to apply and come for learning and discussion. I applied and got accepted. That meeting included time for working groups to meet and discuss potential collaborations. There was a working group on computational thinking, and one of the outcomes of that working group was a structured poster session at AERA 2017. In the discussion period at the end of that session, the idea of a special issue came up. Several years later, the issue is finally close to finished!

The Everyday Computing project contributed two posters to the AERA poster session. One of those posters was the debut of our first learning trajectories, on Sequence, Repetition, and Conditionals. But during the time between the poster session and the official start of work on the special issue, we published those LTs in the 2017 ICER proceedings. We were left, then, with the choice to either drop out of the special issue or come up with something else to contribute. I am reluctant to give up opportunities, so I campaigned for the latter.

Knowing the special issue was not focused just on CT, but specifically on looking at CT from disciplinary perspectives, I spent some time thinking about the discussions we were having (and continue to have) as a team about the relationship between CT and the kinds of thinking kids are already doing in elementary mathematics. Kids do decomposition and abstraction as they engage in mathematics, sure. And they put things in order, and repeat steps, and develop algorithms. But are all those things CT? When do they start being CT?

During our processes of integrated curriculum development, we explored some connections that really seemed to hold promise for leveraging mathematics to get kids ready to engage in meaningful computing. But we had also explored lots of others that fell apart under scrutiny. I was interested in seeing if we could come up with a way to systematically look across the elementary mathematics curriculum and make sense of the opportunities for connecting mathematics to computing.

So, we ended up doing a document analysis of the K-5 Common Core State Standards for Mathematics, mining it for potential connections to CT. We used our own trajectories as a starting point for the CT ideas we used to code the standards — a choice that was self-serving but, I would argue, justified given the work and detail we put into the articulation of those LTs and the ways in which they have been readily taken up by others. We looked, thoroughly and systematically, for ways in which general ideas connected to CT — like precision, completeness, order, identifying repetition, and examining cumulative effects — appear in K-5 mathematics. Then we scrutinized each connection, asking ourselves in each case if the thinking happening in mathematics could be built upon to forge a pathway into computing.

Unsurprisingly, we found lots of variation, including both connections that surprised us in their potential — like ways third graders think about when to stop counting as they subtract, and how well those map onto different kinds of loops — and surface-level connections that got thorny when we thought through the details — like the many subtle differences among examples of conditional-like thinking in K-5 math, and how few of the examples seemed synergistic with computing (or at least the kinds of computing kids might do in elementary school or shortly after).

There are lots of both kinds of examples in the paper, although we did not have enough room to explain most of them at the level of detail I might have liked.
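To give a flavor of the first kind of connection mentioned above, here is a rough sketch of my own (it is not code or an example taken from the paper, and the function names are just mine): a child’s “counting up” strategy for subtraction behaves like a condition-controlled (while) loop, whereas counting back a known amount behaves like a count-controlled (for) loop.

```python
# A rough sketch (mine, not the paper's) of the kind of mapping we looked
# for: counting up to subtract resembles a condition-controlled loop, while
# counting back a known number of steps resembles a count-controlled loop.

def subtract_by_counting_up(start: int, target: int) -> int:
    """Count up from `start` until reaching `target`, like a while loop
    whose stopping condition is checked at every step."""
    steps = 0
    current = start
    while current < target:
        current += 1
        steps += 1
    return steps

def subtract_by_counting_back(target: int, amount: int) -> int:
    """Count back a known number of times, like a for loop that runs a
    predetermined number of iterations."""
    current = target
    for _ in range(amount):
        current -= 1
    return current

if __name__ == "__main__":
    print(subtract_by_counting_up(47, 52))   # 52 - 47 = 5
    print(subtract_by_counting_back(52, 5))  # 52 - 5 = 47
```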

More than the specific examples, though, there are two bigger ideas that I took away from writing this paper.

First, the question of whether or not CT-like ideas that appear in mathematics are useful leverage points for starting computing instruction, or even building readiness for later computing instruction, can’t be decided at a general level. There is no overall, general answer to this. Not all of mathematics is going to support computing instruction. On the other hand, not all integration of CT into mathematics is meaningless. We have lots of work to do figuring out our best avenues.

Second and even more importantly, working on this paper helped me (or at the risk of speaking for my coauthors, us) to articulate for myself a truth I think is both fundamental and often forgotten in debates about unplugged CT: Skilled curriculum development can’t be done one activity at a time. Ideas aren’t learned through one activity. They are developed across a curriculum. Evaluations of whether what students are doing is or is not CT don’t make sense to me when pointing to one activity completed in one hour of one school day. The finish line could be miles away, but that doesn’t mean kids aren’t making progress towards it. We need to think about development of ideas across time and give kids and teachers the space to think and learn.

I’m becoming less and less interested in debates about what CT is or is not. It needs to be decided, perhaps, but I’m willing to let others hash that out. I’m more interested in using the admittedly ill-articulated conceptualizations of CT I have so far, and imagining and studying how we can psychologize it for kids a la Dewey (1902) and spiral a curriculum around it a la Bruner (1960). Go ahead and keep shifting the finish line a bit. I’ll just keep trying to point my elementary school students in the right direction.

References

Bruner, J. S. (1960). The process of education. Vintage Books.

Dewey, J. (1902). The child and the curriculum (No. 5). Chicago, IL: The University of Chicago Press.


New Conference Paper: Time Tracking and Fact Fluency

I just returned from the 2019 AERA Annual Meeting in Toronto! It was a great meeting where I heard about lots of interesting research, met new colleagues, and best of all, got to present some of my own work.

My contribution to AERA this year was a paper I wrote with my friend and colleague, Dr. Meg Bates of UChicago STEM Education. A few years ago, we were given access to de-identified data from kids using an online fact practice game associated with the Everyday Mathematics curriculum. One of the most interesting features of the game is that students self-select one of three timing modes:

  • They can play in a mode with no time limit or tracking, where they take as much time as they need to answer each question and no information about their speed is reported to them.
  • They can play in a mode called Beat Your Time, where they still take as much time as they need to answer each question, but their total time is reported at the end of the round and compared to their best previous time. So, time is tracked but not limited.
  • Lastly, they can play in a mode with a 6-second time limit on each question.

When we noticed this feature of the game (and its user data), we started digging into research on timed fact drills. It’s a highly discussed and controversial issue in elementary mathematics education. On one hand, several prominent researchers argue the potential connections to mathematics anxiety and inhibition of flexible thinking outweigh any benefits (e.g., Boaler, 2014; Kling & Bay-Williams, 2014). On the other hand, it’s well established that efficient production of basic facts is connected to later mathematics achievement (e.g., Baroody, Eiland, Purpura, & Reid, 2013; Geary, 2010). And arguably, even if it does not have to be discussed directly with kids, efficiency involves some amount of speed.

Overall, we were surprised at the inconclusive nature of the research when taken as a whole. There may be connections between timed fact drills and outcomes we don’t want (like math anxiety), but there has not been much unpacking of what features of timed testing are problematic. The game data — in particular, the contrast between the time limit mode and the Beat Your Time mode — gave us an opportunity to look at one particular issue: Does a focus on time always lead to detriments in fact performance, or is it specifically when time is limited?

Our analysis suggests that time limits may be the culprit. We compared students’ overall levels of accuracy and speed across modes, and found that students playing in the time limit mode had significantly (and practically) lower accuracy than when the same students played in the other two modes — but there was no practical difference in accuracy between the no time mode and Beat Your Time mode. So, in short, time limits were associated with lower accuracy, but time tracking was not.

When it came to speed, students were fastest in the time limit mode, but were still faster in the Beat Your Time mode than in the no time mode. So, Beat Your Time mode seemed to promote speed, without the detriment to accuracy associated with the time limit mode.
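For readers who like to see the mechanics, here is a minimal sketch of the kind of per-student, per-mode comparison described above. It is not our actual analysis code, and the column names and toy data are assumptions of mine; the significance testing we reported is omitted here.

```python
import pandas as pd

# A minimal sketch (not our actual analysis) of comparing accuracy and speed
# across the three self-selected timing modes. Assumed format: one row per
# answered fact, with student, mode, correctness, and response time.

responses = pd.DataFrame({
    "student_id":   [1, 1, 1, 2, 2, 2],
    "mode":         ["no_time", "beat_your_time", "time_limit"] * 2,
    "correct":      [1, 1, 0, 1, 1, 1],
    "response_sec": [4.2, 3.1, 2.0, 5.0, 3.8, 2.5],
})

# Compute each student's accuracy and mean response time in each mode, then
# average across students so every student contributes equally.
per_student = (
    responses
    .groupby(["student_id", "mode"])
    .agg(accuracy=("correct", "mean"), response_sec=("response_sec", "mean"))
    .reset_index()
)
print(per_student.groupby("mode")[["accuracy", "response_sec"]].mean())
```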

We were excited by this result. Although we can make no causal claims, the results do suggest that challenging kids to monitor their own speed when practicing facts could support the development of speed without promoting anxiety or other negative outcomes that can lead to lower accuracy (and general bad feelings about math). We did not see this result coming, but upon reflection it makes sense to us that self-monitoring could be helpful. In the future, we hope to do (or inspire others to do) more research on how metacognitive strategies could be applied to fact learning.

You can read the conference paper here and check out my slides here.

References

Baroody, A. J., Eiland, M. D., Purpura, D. J., & Reid, E. E. (2013). Can computer-assisted discovery learning foster first graders’ fluency with the most basic addition combinations? American Educational Research Journal, 50(3), 533-573.

Boaler, J. (2014). Research suggests that timed tests cause math anxiety. Teaching Children Mathematics, 20(8), 469-474.

Geary, D. C. (2010). Mathematical disabilities: Reflections on cognitive, neuropsychological, and genetic components. Learning and Individual Differences, 20, 130-133.

Kling, G., & Bay-Williams, J. M. (2014). Assessing basic fact fluency. Teaching Children Mathematics, 20(8), 488-497.

New Conference Paper: CT Implementation Profiles

Hi, all.

I just returned from the 2019 Annual Meeting of the Society for Information Technology and Teacher Education (SITE), where I made my first official presentation for the CT4EDU project. I wanted to share a bit about the paper and presentation for any interested folks who were not there.

We’re just finishing up our pilot year of the CT4EDU project. The project is an NSF-funded research-practice partnership (RPP). Michigan State University (PI Aman Yadav and co-PIs Christina Schwarz, Niral Shah, and Emily Bouck) is working with the American Institutes for Research and the Oakland Intermediate School District to partner with elementary classroom teachers to integrate computational thinking into their math and science instruction. Over the spring and summer of 2018, we introduced our partner teachers to four computational thinking ideas: Abstraction, Decomposition, Patterns, and Debugging. Then we worked with our partner teachers to screen their existing mathematics and science lessons for places to enhance or add opportunities for students to engage in these ideas. In the fall, the teachers implemented their planned lessons and we collected classroom video. (Note that in this first round of implementation, all of the lessons were in unplugged contexts.)

One of the first things we noticed was that there were some clear differences among teachers’ implementations of CT. In this work-in-progress paper, we share three patterns of implementation that we identified:

Pattern A: Using CT to Guide Teacher Planning
Some teachers were explicit within their plans about where they saw the CT ideas in their lessons, but did not make the CT ideas explicit to students during implementation.

Pattern B: Using CT to Structure Lessons
Among the teachers who did make CT explicit to students, some focused a lesson strongly on one particular CT idea. We described this pattern as structuring the lesson around a CT idea.

Pattern C: Using CT as Problem-Solving Strategies
Other teachers who made CT explicit in implementation seemed to reference the CT ideas more opportunistically. Rather than structuring opportunities to engage with one CT idea, they pointed out connections to one or more CT ideas as they worked through problems.

We’re looking forward to exploring how these different patterns of implementation relate to student thinking about CT as we go into our last year of the project — particularly as we begin considering ways to bridge students’ work in unplugged contexts to plugged activities.

You can find the conference paper here.

There is a version of the slides here.
(Sadly, the slides are missing the classroom video, which is clearly the best part of the presentation!)

Many thanks to everyone who came to my presentation.

New paper: Debugging LT

The LTEC project has a new paper out in the SIGCSE 2019 proceedings, authored by myself and my colleagues Carla Strickland, Andrew Binkowski, and Diana Franklin. It’s a new addition to our series of SIGCSE and ICER papers that detail learning trajectories that we developed through review of CS education literature. This time, the trajectory is about debugging (Rich, Strickland, Binkowski, & Franklin, 2019).

(If it’s helpful, you can read my description of what a learning trajectory is here.)

Although the overall approach we used to develop all of our trajectories was basically the same, we’ve tried to make a unique contribution in each publication by making particular parts of our process transparent. In our paper from SIGCSE 2017, we talked about the overall literature review and what we noticed as we examined the learning goals embedded in the pieces we read. In our paper from ICER 2017, we shared how we adapted our overall process from other work in mathematics education and focused on our synthesis of learning goals into consensus goals. In our paper from ICER 2018, we focused on one trajectory to give us room to discuss every decision we made in ordering the consensus goals.

This time, in addition to sharing a new trajectory, we also highlighted how we used the theoretical construct of dimensions of practice (Schwarz et al., 2009) to help us organize our consensus goals. We’re also really excited to be able to share more about the role that our learning trajectories played in the curriculum development we’ve been working on for two years now. We’re dedicating a significant piece of our presentation at SIGCSE to sharing an activity we are really proud of and how the trajectory shaped its development.

If you’ll be at SIGCSE, we hope you’ll come and check us out on Friday at 2:10 in Millennium: Grand North! (If you don’t come for me, come for Carla! She’s a great speaker whose PD facilitation is famous on Twitter.)

If not, please check out the paper if you are interested. Right now, the link above and the one on my CV page take you to the normal ACM Digital Library page. I’ll be switching the link in my CV to a paywall-free version as soon as the Author-izer tool links this paper to my author page. At that time, we’ll also be sure to add a paywall-free link to the LTEC project page.

Although we have one more learning trajectory (on variables) that has been developed but not yet published, I suspect this might be the last conference paper from this work that I first author. The project is continuing to do wonderful work and you’ll be hearing more from us, but I’m in the thick of graduate school and not nearly as involved in the work anymore. So, I just want to say that working with my colleagues at UChicago STEM Education on this line of work has been among my proudest and most gratifying professional experiences. I want to thank all of my collaborators, and also say a particular thank you to Andy Isaacs and George Reese, as without their graciousness I never would have had the opportunity to co-PI the project.

I’d also like to say thanks to all the folks in the CS education community who have been so receptive to our work and offered us such wonderful and helpful feedback. We’re particularly grateful for the shoutout that Mark Guzdial is giving us in his SIGCSE keynote this year.

From the bottom of my heart, thanks to all of you for making this longtime math educator who wandered into the CS education space feel welcome and like her contributions are worthwhile.

References

Rich, K. M., Strickland, C., Binkowski, T. A., & Franklin, D. (2019). A K–8 debugging learning trajectory derived from research literature. In Proceedings of the 2019 ACM SIGCSE Technical Symposium on Computer Science Education (pp. 745–751). New York: ACM.

Schwarz, C. V., Reiser, B. J., Davis, E. A., Kenyon, L., Acher, A., Fortus, D., Shwartz, Y., Hug, B., & Krajcik, J. (2009). Developing a learning progression for scientific modeling: Making scientific modeling accessible and meaningful for learners. Journal of Research in Science Teaching, 46(6), 632–645.


Vocabulary, Part 3: What I Learned (3rd edition)

This is my last post for the semester! At the end of my last two semesters, I wrote posts listing five things I had learned. I figured since this post is also serving as Part 3 of my vocabulary series, I’d try highlighting five things I caught myself saying recently that illustrate some significant learning over the last year and a half.


  1. At our last research team meeting, in a discussion about a study my lab-mate is planning, I said: Oh, so if that’s when you’re doing the measure, that works. Then it’s a delayed treatment design. I entered my research design class rather skeptical at the beginning of this semester, but it’s clear I picked up some useful vocabulary for talking about research. Even if I understood the concept of delayed treatment before, I couldn’t articulate it.
  2. It has been similar with specific processes of data analysis. A few days ago, my office-mate asked me something to the effect of: You have three research questions but you’re using content analysis for all three, right? My response made very specific use of the terms content analysis, thematic analysis, linguistic analysis, and discourse analysis with a particular meaning behind each one. I can’t write perfect definitions, but I understand the difference. A year ago those would all have had the same connotation to me. (Essentially, they all meant to look at text and try to find patterns. Which is not entirely wrong, but overgeneralizes.)
  3. In a class recently, while I was talking with the professor about an assignment, he said, You always seem to anticipate me disagreeing more than I actually do. My response? Well, I mean, I’m just participating in discourse as I’m thinking. This was a joke — one you probably won’t get unless you have read some of Sfard’s work recently. I was joking that as I’m writing papers, I have a hypothetical discussion with my mentors in my head, trying to anticipate how they’d respond. This fits with Sfard’s (2008) notion of thinking as communicating with oneself, a theory we had discussed that day in class. The joke isn’t all that funny, even if you do know Sfard, but I did think it was interesting the way I was able to spontaneously use her definition of participation in context.
  4. This semester, I wrote and rewrote my practicum proposal justification at least three times from scratch. This was a rough experience in a lot of ways — wildly frustrating and anxiety-provoking. But I have to admit that when I finally landed on an approach that was working, I thought to myself, Oh my gosh. I think I know what a concept paper is! We talked about concept papers in one of my courses last year, and even after reading examples and attempting to write one, I really had no idea what it was. In the end, I’m reasonably sure the first half of my practicum proposal became a concept paper. I could use that term correctly now in conversation. That feels like a victory given how much I struggled with it last year.
  5. And now for a bit of sappiness (it’s the holidays, after all). When I started my master’s program, I can remember needing to learn the specific meaning of cohort used in academia. I knew of the word before that, but I knew it as an old-fashioned way to refer to a friend, collaborator, or partner-in-crime. I didn’t know the collective meaning of a group of students who enter a program together. I re-learned it over the last year and a half, and it’s become even more meaningful as a PhD student. My cohort is my tribe, and I’m grateful for them.

Have a lovely break, everyone. Reflect on all you’ve learned. Try not to think too much about how much there is still to go.

Reference

Sfard, A. (2008). Thinking as communicating. Cambridge: Cambridge University Press.