Information Technology timeline
The next diagram shows the main advances in IT relevant to our concerns, i.e., the usability of operating systems, applications and information handling approaches over recent decades, including a forecast for the coming years. Each decade is labeled with its dominant technologies.
Computers began to leave the labs and be used intensively by corporate and governmental organizations in the 60's and 70's, in the form of large hardware units containing all the components for processing and storing information, the so-called 'mainframes', which users accessed through a 'terminal' (a device with a screen and a keyboard).
Operating systems (such as UNIX) and their specific applications were operated through text-based commands, which made it difficult for ordinary users to learn how such systems worked. Information was stored in files, recorded on tapes, and in databases, recorded on hard drives.
Personal Computers (PCs)
In the 80's, smaller and more affordable computers appeared on the market, which meant that small companies and private users could take advantage of computing technology. Clones of IBM's PC design enjoyed enormous success.
The dominant operating system was Microsoft's MS-DOS, and applications were still based on text commands, but the essential difference from mainframes was that the OS and apps were executed locally, so the owner of the PC was responsible for keeping them working properly, which increased the difficulties for non-expert users. Information was stored mainly in files and directories, recorded on floppy disks or hard drives.
Graphical User Interfaces
The invention of graphic-based commands was a true revolution in terms of software usability, enabling almost anyone to “dare” to control the computer. In 1984, Apple's MacOS and its companion applications introduced this paradigm to the market, although it was Microsoft's Windows that truly popularized GUI software on PCs from the early 90's.
GUI systems continued to organize information in files and directories, although they displayed a graphical representation of these entities. Directories were renamed 'folders'.
In parallel with the relentless popularization of GUIs, the Internet phenomenon started to grow in the form of millions of web pages and electronic mail. Although computer networks had existed since the late 60's, the introduction of web browsers and graphical e-mail readers made it possible for all kinds of people to communicate with each other through a single 'hypernetwork'.
In the 2000's, web apps appeared as a new way to process information inside a web browser. Indeed, web applications have become the standard graphical interfaces to servers, since the app actually executes (totally or partially) on a remote computer.
Although a web browser works just like any other application on the local computer, its role is rather similar to that of an operating system, since the browser provides the computing resources that web applications need to run. Web app information is stored in server databases, but it is usually presented to the user as 'files & folders', which can be downloaded to local storage.
In the current decade, the 'cloud computing' concept has become popular. In practice, this approach concerns advances in task distribution and the efficient scaling of virtual and physical servers. From the point of view of ordinary users, however, working with a cloud-based app feels the same as working with a server-based web app. Therefore, there is no breakthrough in terms of software usability.
On the other hand, the information management approach of cloud computing has slowly shifted from the traditional 'files & folders' toward a more “database-like” orientation: the cloud app provides the user with all the data handling commands, but at the cost that the data is “trapped in the cloud”. Nevertheless, cloud apps also provide options to download that data as files to local drives.
Also in the current decade, the GUI systems of traditional computers (desktops and laptops) have had to evolve to meet the specific constraints of mobile devices (smaller screens, touch-based input, etc.). Hence, mobile OSs and apps are designed with a reduced number of commands, avoiding large menus and hidden options. As a result, mobile apps are easier to use than web apps, which in turn are easier to use than desktop apps, but at the cost of reduced functionality.
As for information handling, data is simply represented and managed inside the apps, but it can be synchronized across multiple platforms (desktops, mobiles and clouds). In Apple's iOS, apps keep their data objects internally (in app-specific databases), while Google's Android apps can use the Linux file system to “expose” data objects as files and folders, although ordinary users rarely make use of this capability.
The near future of IT
In the coming decades, nanotechnology and biotechnology are expected to dramatically change the design of sensors and automated mechanisms which, together with advances in robotics, will give rise to a new generation of “intelligent” devices that will eventually be connected to our IT systems, a trend currently known as 'the web of things'.
In this scenario, the design of device apps will be the next challenge for software developers, along with the design of a 'ubiquitous OS' able to smoothly control all these types of devices from a homogeneous framework, and to integrate the great variety of information they generate into some kind of 'hyperdatabase'.
In the end, future IT systems will have to do much more than current ones, while surely being much easier to operate.
Conclusion: IT usability is continually improving
From this overview of IT advances, it is clear that usability and user experience have been receiving increasing attention from software developers, and that cryptic text-based systems have evolved into much more user-friendly graphical, touch-enabled and portable systems. As evidence, note that the number of IT users has been growing significantly since the late eighties and early nineties. For example, the next plot shows the remarkable growth of Internet users since 1995:
(source: The Open University)
However, it is common to see users struggling with their modern computing devices. On the page 'User's handicaps' we analyze some of the reasons for this apparent paradox.