What is a computer system?
What does computer system mean?
A computer system is a basic, complete and functional configuration of hardware and software with everything needed to make a computer work.
This is the basic definition of a computer system as we know it, but the form it takes has changed many times over the past decades.
Techopedia explains the computer system
While this definition at first glance seems rather abstract, there are some fundamental aspects of computing that a computer system must facilitate.
First, there is the ability to receive input from the user. Then there is the ability to process that data. Finally, there is the ability to store the resulting information and produce output.
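The input, processing, storage, and output cycle described above can be sketched as a toy program. This is purely illustrative; the function names and the list standing in for storage are hypothetical, not part of any real system:

```python
# Minimal sketch of the input -> process -> store -> output cycle
# that any computer system must support (illustrative only).

storage = []  # stands in for persistent storage

def process(data: str) -> str:
    # "Processing": here, just normalize the input text.
    return data.strip().upper()

def run(user_input: str) -> str:
    result = process(user_input)   # process the data
    storage.append(result)         # store the information
    return result                  # produce output

print(run("hello, world"))
```

Real systems implement each stage in hardware and software many layers deep, but the overall shape of the cycle is the same.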
That is a computer system in a nutshell, but understanding what a computer system is also means looking back at how the computer has evolved over the decades.
Computer systems: first models
To trace the history of the computer system, we have to go back to Charles Babbage’s Difference Engine. This machine (which was never fully built) predated and foreshadowed the mainframes and large-scale computers of the mid-20th century, such as the von Neumann machine and its ilk, when bulky, monolithic computers first appeared in the world.
Then, the personal computer or desktop computer was born. This model persisted for a long time, where the computer case or shell was the central hardware and used peripherals like a monitor, keyboard, and mouse, as well as software that was fed to the computer via floppy disks.
The operating system emerged early on as a convention to support the entire computer system in the box and ensure that users had a universal way to manage the software that was running on that hardware.
Next, alongside the operating system came files, applications and executables: software products built to run on a given operating system.
Over time, as Moore’s Law continued to apply and hardware got smaller, the laptop was born. Then came the mobile phone, and eventually the peripheral interface model with the mouse, keyboard, and monitor plugged in was replaced with a single touchscreen device, so no peripherals were needed.
At the same time, a key software breakthrough took hold. Cloud storage and software-as-a-service models meant that software was delivered digitally over the Internet, instead of being sold on physical media such as floppy disks and, later, compact discs. “Out-of-the-box” software has become somewhat obsolete, particularly in enterprise computing.
More recently, virtualization has revolutionized the way we think about hardware and software configurations. A modern computer system may not be a physical piece of hardware at all; instead, it may be a virtualized computer system, or virtual machine, that uses the pooled resources of a grid to operate.
So what we think of as a computer system has changed in form, but not in substance. It still has all of these basic features: receiving user input, processing data, and storing information – it just does them in a much more elegant and capable way.
As the interface evolves and we approach a new world of AI and machine learning, we see how powerful computer systems can be.