Even at the height of the digital revolution, schools all over the world seem to be struggling with what computer literacy should mean for them.
The most technophile view is that everybody should learn how to program in order to take full advantage of the creative potential of computer technology.
Given the digitization of everything, another position is that everybody should at least be a competent user of computer technology, including the ability to create digital content and not just consume it.
Finally, some argue that computers have become mature and ubiquitous enough that they are just a tool that should fade into the background like books or blackboards. After all, there is no subject dedicated to the study of paper-and-pencil technology, even though K-3 students do spend a lot of time learning to master those basic writing utensils.
While I agree that some level of programming or algorithmic thinking should be part of any general education, not everybody needs to be a programmer or be able to design computer systems.
But I also don't think that anytime soon computers will become so transparent and self-effacing to their users that they matter as little to most people as what kind of air we breathe or what kind of paper we write on.
The sad truth about computer usage is that there are still many incompatible platforms and systems, many of them obsolete within a few years. Any user needs to be prepared to constantly learn new systems, even for doing roughly the same tasks, such as writing a letter or doing some calculations.
The only knowledge of lasting value in such an environment lies in understanding concepts and ideas, not just the most popular current implementation.
In order to achieve a more abstract, meta-level understanding, one needs to be exposed to several different instances of the same concept or idea. For example, learning foreign languages often heightens the understanding of our own native language and opens the door to understanding culture on a more abstract level. For centuries, a classical education has included the study of Latin as a foundation for understanding European languages and culture, regardless of whether anyone actually needed to speak Latin in daily life. The difference between education and skill training is that the former should strive to enable students to think on their own and understand the concepts behind what they are doing. The same should be true for computer education.
For those reasons, schools should resist the temptation to use the most common, most popular computing platforms on the premise that they are the easiest to use and best prepare students for the real world. Yes, like studying Latin, other foreign languages, or the fundamentals of mathematics, doing so is hard work for both students and teachers. But we should be ready to put in the extra effort for the sake of gaining a deeper understanding.
Substituting the market-leading commercial software products with their closest open-source equivalents has become a feasible way to build a workable all-round computing environment for schools, especially thanks to education-focused platform efforts like Raspberry Pi, Lernstick, Edubuntu, and others.
In addition, open-source systems have the specific advantage of not imposing any limits on curiosity. Open-source software continues the academic culture of intellectual and scientific freedom, where arguments are made in the open and are always subject to scrutiny.
With last week's announcement of the Raspberry Pi 2, we also learned that the new version will be able to run Windows 10, free of charge for qualifying users. The promise of making things easier and more relevant for users in schools might turn out to be a Pyrrhic victory.
Schools should not adopt open-source software merely because it is cheap, or as a political or ideological statement, but above all because it is the right thing to do for fundamental didactic reasons.