
Thinking Computationally About Computational Thinking

Jonathan Weber


About the Author

Jonathan is a digital steward on Pinnguaq’s delivery team, where he travels to communities to provide hands-on opportunities for both students and adults to develop digital skills such as coding and digital art. He has bachelor’s degrees in chemical engineering and computing technology, as well as a master’s degree in education, and has previously worked as a software developer and analyst in government and higher education.

App Development, Computational Thinking, Computer Fundamentals
Article

“Computational thinking” is a phrase that seems to be popping up more and more as of late. At its core, computational thinking means being able to understand computational concepts and processes, and to use them to approach and solve problems. Over the course of this article, I will be describing and using four concepts (decomposition, pattern recognition, algorithms, and abstraction) to explain computational thinking. Through that process, you will hopefully build a better understanding of each concept and of computational thinking as a whole. I chose these concepts because they are the ones most commonly and most notably grouped together to describe computational thinking, though they don’t represent the entirety of what computational thinking contains.

Why computational thinking?

“But Jonathan,” you might ask, “why would I want to learn about computational thinking? What possible value could this have?” To that, I will provide two answers: one is a very big-picture reasoning; the other is a very practical reasoning.

The Big Picture

Nationally and globally, there seems to be a surge of interest in developing digital literacy and digital skills, especially in young people, and with good reason: as economies and societies modernize and technologize, digital literacy and digital skills are becoming increasingly necessary to understand the world and fully participate in everyday activities. This interest is driving educators, academics, and policymakers, among others, to look for ways to integrate digital- and technology-related content into their teaching and curricula. Many of the organizations and initiatives actively pursuing this goal (code.org, the BBC, the International Society for Technology in Education, and the K–12 Computer Science Framework, to name a few), ourselves included, support the development of computational thinking skills as one way to achieve it.

To build an understanding of where the technology we use comes from, how it was made, and how it functions, you cannot avoid learning about computers and computing, hence the “computational” aspect of computational thinking. You also need to be able to adjust your perspective or “thinking” to align with the way problems are codified, expressed, and solved in a computational world. The shift in thinking required to think computationally leads into the second point.

The practical reason

While developing computational thinking skills certainly sets the stage for the development of computational artifacts such as code, programs, and applications, computational thinking skills themselves can be thought of as potentially new, different, or useful ways of approaching problems and understanding even the non-computational world. The computational thinking concepts that I will be using as examples below can be used in all sorts of contexts; in fact, I will be applying them in a decidedly non-computational fashion throughout this article. Practically speaking, developing computational thinking skills and understanding computational concepts simply means that you have more tools in your problem-solving and world-understanding toolbox.

The four-concept model of computational thinking

Figure: the four-concept model of computational thinking. Source: BBC, “Introduction to computational thinking”. Retrieved from https://www.bbc.co.uk/bitesize/guides/zp92mp3/revision/1

As mentioned above, the four computational concepts of decomposition, pattern recognition, algorithms, and abstraction are one of the more common and more popular groupings of computational concepts. The four-concept model is in use by the BBC and code.org, among others, and so it is the one that you will most likely encounter in the wild, at least at first. To help illustrate each concept, I will be applying them, as best I can, to describing and explaining computational thinking. Just a note: the term “problem” is a convenient way to frame the use of computational thinking in terms of developing a “solution”, but computational thinking is just as much about ways of seeing and understanding the world as it is about coming up with solutions, however interrelated those two things may be.

Decomposition

Decomposition is the process of breaking bigger and more complex things apart into smaller, more easily manageable and understandable units. The idea of decomposition is that the smaller units of a problem might lend themselves more readily to being solved, or might have already been solved. I’ve actually already applied decomposition to some degree to help develop our understanding of computational thinking, for example:

  • Earlier, I decomposed the term computational thinking into the two constituent words of “computational” and “thinking” and looked at what each word contributes to the meaning of the term.
  • I’ve broken the overarching concept of computational thinking into these four computational thinking component concepts, which, even if you’re going solely on the names of the concepts, might already give you some idea of what they are.
  • I could possibly also break apart the words that make up the names of these concepts and examine their components (e.g. their etymologies) to try to understand them further.

Even just looking for the different ways to decompose a problem can lead you to developing a better understanding of the problem and potential insight into a solution.
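Decomposition maps directly onto how programmers structure code. As a hypothetical sketch (the task, scores, and function names here are invented purely for illustration), the larger problem of “summarize some test scores” can be decomposed into smaller functions, each easy to understand and test on its own:

```python
# Hypothetical example: decompose "summarize the scores" into small pieces.

def mean(scores):
    """Smaller sub-problem: the average of the scores."""
    return sum(scores) / len(scores)

def highest(scores):
    """Smaller sub-problem: the best score."""
    return max(scores)

def lowest(scores):
    """Smaller sub-problem: the worst score."""
    return min(scores)

def summarize(scores):
    """The bigger problem, solved by composing the smaller units."""
    return {
        "mean": mean(scores),
        "high": highest(scores),
        "low": lowest(scores),
    }

print(summarize([72, 85, 91, 64]))
```

Each small function can be understood, fixed, or reused independently, which is exactly the payoff decomposition promises.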

Pattern recognition

Pattern recognition is the process of looking for patterns to help you understand, express, or model a problem. This could mean patterns in the occurrence of the problem, patterns in what the problem produces, or other problems similar to the one you’re currently dealing with. The idea is that finding patterns could reduce the complexity of the problem at hand by making it possible to re-express the problem as the repetition of smaller, easier-to-solve problems. It can also be used to extend how our knowledge might be applied. It is in this sense that pattern recognition helps us understand computational thinking: I can recognize the pattern of computational terms being applied more generically, that is, in a non-computational sense, which would give me insight into what other computational concepts might be included beyond the four-concept model.

Admittedly, applying pattern recognition to understand computational thinking does not provide that much more information immediately. It does do two things, however:

  • It gives an idea of where else we might look to better develop our understanding of computational thinking.
  • It illustrates that not every computational thinking concept is equally useful or applicable to every problem.
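In code, pattern recognition often looks like spotting repetition and re-expressing something long as something short repeated. This hypothetical Python sketch (the sequences are invented for illustration) finds the shortest repeating unit of a sequence, reducing a “complex” input to a simpler description:

```python
# Hypothetical example: re-express a long sequence as a short pattern repeated.

def shortest_repeating_unit(seq):
    """Return the shortest prefix whose repetition rebuilds seq."""
    for size in range(1, len(seq) + 1):
        unit = seq[:size]
        # Decision: does repeating this unit reproduce the whole sequence?
        if len(seq) % size == 0 and unit * (len(seq) // size) == seq:
            return unit
    return seq  # no shorter pattern: the sequence is its own "unit"

print(shortest_repeating_unit("abcabcabc"))  # the nine characters reduce to "abc"
```

Once the pattern is recognized, the original problem shrinks: whatever we can say about "abc" applies to every repetition of it.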

Algorithms

An algorithm is simply a set of steps or instructions designed to perform a specific operation or complete a specific task. Applying algorithmic thinking can mean expressing a proposed solution to a problem in a structured, algorithmic manner, making clear the expected inputs, decision points, and expected outputs. Expressing a solution in this way allows you to systematically tweak the various elements of your solution, potentially enabling you to narrow down possible solutions to find a viable, functioning, or correct solution. Algorithmic thinking can also be applied to understand existing algorithms, that is, trying to break apart complex or unknown processes into smaller, simpler, and known steps. 

Applied to the task at hand of building an understanding of computational thinking, I can use algorithmic thinking to systematically examine each of the four computational concepts in similar ways, accomplishing the dual purposes of creating a predictable structure for you, the reader, to follow, as well as making sure that each concept is covered in sufficient detail.
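A classic way to see an algorithm’s inputs, decision points, and outputs laid bare is binary search. This is a standard algorithm rather than anything specific to this article, and the list and target below are invented for illustration:

```python
# A standard algorithm, written so each structured element is explicit:
# input (a sorted list and a target), decision points (the comparisons),
# and output (the position of the target, or None if it is absent).

def binary_search(sorted_items, target):
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:    # decision point: found it
            return mid
        elif sorted_items[mid] < target:   # decision point: search the right half
            low = mid + 1
        else:                              # decision point: search the left half
            high = mid - 1
    return None                           # output when the target is absent

print(binary_search([2, 5, 8, 12, 16], 12))  # → 3
```

Because every step is explicit, each element (the starting bounds, the midpoint rule, the comparisons) can be tweaked and tested systematically, which is exactly the benefit described above.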

Abstraction

Abstraction is the process of creating simplified forms or representations of more complex things. The idea behind abstraction is that not all details of a problem are always relevant all the time. Abstractions can help us understand a problem by hiding the complexity and details of some of the problem’s parts thereby allowing us to focus on fewer, more essential aspects. It can also help us find a solution by potentially re-expressing a problem in such a way that it resembles, partially or otherwise, an already solved problem. When I think of applying abstraction, I try to think about what “type” of thing something might be. For example, if I were trying to understand a blue jay, I can abstract away the specific details of the blue jay to realize that it is a kind of bird. Now, if I knew things that were true about all birds, I would be able to also apply that knowledge to the blue jay.
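The blue jay example can be sketched in code as well. In this hypothetical Python snippet, the Bird class is the abstraction: it hides the details of any particular bird, so knowledge attached to it automatically applies to the blue jay:

```python
# Hypothetical example: "Bird" abstracts away the details of specific birds.

class Bird:
    def lays_eggs(self):
        return True            # true of all birds, so true of any Bird

class BlueJay(Bird):
    plumage = "blue"           # a specific detail the abstraction ignores

jay = BlueJay()
print(jay.lays_eggs())         # knowledge about birds applies to the blue jay
```

Inheritance is only one way programming languages express this idea, but it captures the point: reasoning done once at the abstract level carries over to every concrete case.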

Applied to understanding computational thinking, I would say that computational thinking can be abstracted as a collection of thinking processes or problem-solving approaches. In this sense, this abstraction of computational thinking might serve to demystify the “computational” aspect and open up possibilities for the computational concepts within it to be applied in non-computational settings.

Conclusion: Not just thinking like a computer

While the approach to the core concept of this article—using computational thinking concepts to help develop an understanding of computational thinking—was meant to be a bit tongue in cheek, I hope that it also served to describe to you the practical applications of computational thinking. I will leave you with an excerpt from a book written by Seymour Papert, titled Mindstorms. In Mindstorms, Papert imagines how computation can be used to, among other things, help children learn and connect to and better understand what they learn. It is here that Papert positions computational thinking as one of many ways of knowing. He writes:

The advice “think like a computer” could be taken to mean always think about everything like a computer. This would be restrictive and narrowing. But the advice could be taken in a much different sense, not precluding anything but making a powerful addition to a person’s stock of mental tools. Nothing is given up in return. … In my experience, the fact that I ask myself to “think like a computer” … simply opens new ways for approaching thinking. … True computer literacy is not just knowing how to make use of computers and computational ideas. It is knowing when it is appropriate to do so.

Papert 1980, 155

Further reading

Code.org (https://code.org/curriculum/course3/1/Teacher)

BBC (https://www.bbc.co.uk/bitesize/guides/zp92mp3/revision/1)

K–12 Computer Science Framework (https://k12cs.org/computational-thinking/)

Notes

Barr, V., & Stephenson, C. (2011). Bringing computational thinking to K-12: what is Involved and what is the role of the computer science education community? ACM Inroads, 2(1). Retrieved from https://dl.acm.org/doi/10.1145/1929887.1929905

Papert, S. (1980). Mindstorms: Children, computers, and powerful ideas. New York, NY: Basic Books.