

25. 2. 2025

I read the article "New Junior Developers Can't Actually Code" by Namanyay
(https://nmn.gl/blog/ai-and-learning) and felt that I, too, needed to write
something about the subject.

Background: I am a self-taught computer user from Finland, born in 1993. I have
owned a computer since I was nine years old. I started using Linux in 2005 and
have always been interested in maintaining my own server. I have learned how to
do that by doing it. I have also taught myself how to code in various
programming languages, including C, Assembly, PHP, JavaScript and BASIC (not in
any particular order). I have written my own operating system for IBM PC
compatible computers.

Almost no one from my generation can use computers as well as I can, but they
at least understand the basics. They are able to install programs, and they are
(or at least used to be) able to understand the basic concept of server-client
communication. They know and understand the concept of files and directories.
They easily understand the concept of a command line interface and learn a few
commands quickly. And why wouldn't they? All of that is stuff that a human with
a normal level of intelligence should be able to do.

The same things cannot be said about the so-called "Generation Z". They cannot
use computers. Most of them literally don't understand the basic concepts - and
this is also true for those who study computer science. They don't even have
the very basic knowledge that would be required to actually understand the more
advanced concepts of computing. MOST OF THEM DON'T EVEN HAVE A COMPUTER. Often
they get a personal computer from their school, and that is the first computer
they have ever had in their possession.

Based on what I understand, there are usually two ways in which students submit
their programming assignments: they either write the code and send it to the
teacher, who then compiles it and checks that it runs properly, or they use
PuTTY or some other SSH client to connect to a remote computer, compile their
code there and run it. The "school laptop" has all the necessary programs for
that pre-installed. The student does not know that they are remotely using a
computer that runs some kind of *nix operating system, nor do they know that
they are using the SSH protocol to do it. They don't understand any of that
stuff - they are just "using PuTTY" to do some things.
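
To make concrete what "using PuTTY" actually means: PuTTY is just one client
for the SSH protocol, and the assignment gets compiled and run on an ordinary
remote *nix machine. A minimal sketch of the kind of assignment and the typical
commands involved (the compiler invocation and file names are my assumptions
about a typical setup, not any specific course environment):

    /* hello.c - a minimal assignment of the kind described above.
     * On the remote machine, reached over SSH, the student would
     * typically compile and run it with something like:
     *
     *     cc hello.c -o hello
     *     ./hello
     *
     * (These commands are an assumption about a typical *nix setup.)
     */
    #include <stdio.h>

    int main(void)
    {
        printf("Hello, world!\n");
        return 0;
    }

Nothing magical happens there: a shell on another computer receives the
commands, the compiler turns the source file into an executable, and the
executable runs. That is the understanding the students are missing.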

The "school laptop" is usually very restricted in what it can do. The Gen Z
student does not have root privileges to it, and they cannot install programs
to it. They are not allowed to run any other programs than the pre-installed
ones on it, which also means that they cannot run and test their own code
locally. They cannot install a virtualization software and create virtual
machines to try different operating systems or any low-level stuff - they don't
even know what a virtual machine is. They cannot even grasp the concept of it,
as they are also unable to do with stuff like command lines and filesystems.

The thing that strikes me the most is their complete lack of interest in any of
this stuff, and in technological things in general. In addition to the fact
that they don't know anything about computers, they are also not interested in
learning. Somehow it does not bother them at all that they lack the crucial
knowledge needed to even understand what they are doing. They just want a
programming job, and everything else is indifferent to them. Actually knowing
something about the workings of a technical device is considered a "boomer
thing".

Becoming a good programmer is impossible without knowing both the basics and
the advanced concepts of computing. In universities the bar has already been
set very low to make sure that enough students pass the courses. It is now
possible to get a computer engineering degree without being able to install an
operating system, or even knowing what an operating system is, so of course
they don't know how to code.

The students alone are not to blame; there are also flaws in the teaching
system. In Finland we have a thing called "digital skills teaching" at all
levels of education. The teaching is done in such a way that the student never
actually learns the concepts or anything that would be generally useful -
instead it mainly focuses on using one specific computer program. For example,
when studying spreadsheets, the curriculum is carefully structured so that
everything learned applies only to the newest version of Microsoft Excel, not
to spreadsheet programs in general. The computers in the classroom didn't even
have a Scroll Lock key, which is one of the most important keys for navigating
efficiently in a spreadsheet. Naturally, word processing in schools also isn't
about actually learning word processing - instead they only teach how to use
the newest version of Microsoft Word. And the same problem exists with
profession-specific programs at the upper levels of education.

The computers have also changed. In the past the computer did everything the
user told it to do. In that sense an IBM PC is not really that different from
those archaic computers that ran programs from a punched tape. Instead of a
punched tape you now had a boot sector. The only thing that changed was the
medium used to pass instructions to the computer - the computer still did
everything the user wanted it to do. But that's not the case anymore. Most new
computers have all kinds of boot restrictions and other DRM stuff and can no
longer be used for trying out and learning low-level stuff.
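
To illustrate what a boot sector is: on an IBM PC compatible, the BIOS reads
the first 512 bytes of the boot medium into memory at address 0x7C00 and jumps
there, provided the sector ends with the signature bytes 0x55 0xAA. Here is a
minimal C sketch that checks a disk image for that signature (the image file
is just an example argument):

    /* bootcheck.c - checks whether a disk image has a valid boot
     * signature. The BIOS only boots a sector whose bytes at offsets
     * 510 and 511 are 0x55 and 0xAA. */
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        if (argc != 2) {
            fprintf(stderr, "usage: %s <disk-image>\n", argv[0]);
            return 1;
        }

        FILE *f = fopen(argv[1], "rb");
        if (!f) {
            perror("fopen");
            return 1;
        }

        unsigned char sector[512];
        size_t n = fread(sector, 1, sizeof sector, f);
        fclose(f);

        if (n != sizeof sector) {
            fprintf(stderr, "could not read a full 512-byte sector\n");
            return 1;
        }

        if (sector[510] == 0x55 && sector[511] == 0xAA)
            printf("valid boot signature (0x55 0xAA) found\n");
        else
            printf("no boot signature - the BIOS would not boot this\n");

        return 0;
    }

Everything the machine did at power-on followed from those 512 bytes, which is
exactly why writing them yourself used to be such an effective way to learn.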