
Evolution and History of Computers – PPT PDF: The Milestones and Challenges of Computing



Apple Computer, Inc. was founded on April 1, 1976, by college dropouts Steve Jobs and Steve Wozniak, who brought to the new company a vision of changing the way people viewed computers. Jobs and Wozniak wanted to make computers small enough for people to have in their homes or offices. Simply put, they wanted a computer that was user-friendly.


Jobs and Wozniak started out building the Apple I in Jobs' garage, selling it without a monitor, keyboard, or casing (which they decided to add in 1977). The Apple II revolutionized the computer industry with the introduction of the first-ever color graphics. Sales jumped from $7.8 million in 1978 to $117 million in 1980, the year Apple went public.







Apple's market share declined steadily after its peak in 1990, and by 1996 experts believed the company to be doomed. It was not until 1997, when Apple was desperately in need of an operating system, that it bought NeXT Software (Jobs' company) and the board of directors decided to ask an old friend for help: Steve Jobs. Jobs became interim CEO, or iCEO as he called himself (he was not officially CEO until 2000), and set about making changes at Apple. He forged an alliance with Microsoft to create a Mac version of its popular office software, a decision that proved to be a turning point for the company. Jobs revamped the computer line and introduced the iBook (a personal laptop), followed by the iPod, an MP3 player that became the market leader.


"The Morris" was the first Computer virus which spread extensively in the wild in 1988. It was written by Robert Morris, a graduate student from Cornell University who wanted to use it to determine the size of the internet. His approach used security holes in sendmail and other Unix applications as well as weak passwords, but due to a programming mistake it spread too fast and started to interfere with the normal operation of the computers. It infected around 15,000 computers in 15 hours, which back then was most of the internet.


In 1991, the "Michelangelo" virus was first discovered in Australia. It would lie dormant until 6 March each year and then overwrite the first one hundred sectors of the storage device with zeros, preventing the computer from booting. In the end, only about 20,000 computers were reported infected.


2000 was the year of "ILOVEYOU". It too arrived via email, but this time it sent itself to every contact in the victim's address book. It also overwrote Office, image, and audio files. The virus originated in the Philippines and infected over 50 million computers in less than ten days. Many companies at the time resorted to turning off their email servers to stop the virus from spreading.


In 2013, a new form of malware, ransomware, arrived with the CryptoLocker virus. Many variants have followed, including Locky and WannaCry, as well as Petya (though not yet its latest incarnation). The original CryptoLocker infected about half a million computers. Some of its clones, such as TorrentLocker and CryptoWall, were specifically designed to target computers in Australia.


2017 brought virus attacks that spread very fast: WannaCry and NotPetya. Both exploited a security hole in SMB, the protocol Windows uses to access files over the network. The exploit for this hole, named EternalBlue, was made public by a hacker group called the "Shadow Brokers", who stole it from the US National Security Agency (NSA). Although Microsoft released a patch for the vulnerability in March 2017, the number of systems worldwide running obsolete or unsupported software, or that had not yet applied the latest updates, allowed WannaCry to gain a strong foothold, reportedly beginning with a phishing email attack. WannaCry infected around 200,000 computers across 150 countries before a "kill switch" was discovered and stopped the virus from spreading further.


Computer-aided manufacturing also emerged in the 1950s, when computers were first used to create G-code, which was in turn translated into punched cards and tapes that could control machines. Producing punch tapes under computer control sped up both the creation of instructions and the manufacturing itself.
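To make that translation step concrete, here is a minimal sketch in Python, purely illustrative rather than historical: the function name, point list, and feed rate are all invented for the example. It shows the kind of work CAM software does, turning coordinates into G-code movement commands.

```python
# Illustrative sketch only: turns a list of (x, y) coordinates into
# simple G-code movement commands, the kind of instructions early CAM
# systems punched onto cards or tape. Points and feed rate are
# made-up sample values.

def to_gcode(points, feed_rate=120):
    lines = [
        "G21 ; units in millimeters",
        "G90 ; absolute positioning",
    ]
    for x, y in points:
        # G01 = controlled linear move to the given position
        lines.append(f"G01 X{x:.2f} Y{y:.2f} F{feed_rate}")
    lines.append("M02 ; end of program")
    return "\n".join(lines)

# Trace the outline of a 10 x 10 square
print(to_gcode([(0, 0), (10, 0), (10, 10), (0, 10), (0, 0)]))
```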


By the 1990s, huge numbers of personal computers were being connected as the technology became more affordable. Then, in 1999, Salesforce became the first company to offer applications over the internet, heralding the arrival of Software as a Service. Three years later, the industry grew massively, with video, music, and other media being hosted and delivered online. The rise of UX design meant that lay people were gaining access to data previously reserved for programmers and the code-literate.


On August 25, 2006, Amazon Web Services launched Elastic Compute Cloud (EC2), enabling people to rent virtual computers and run their own programs and applications online; this was quickly followed by Google Docs. One year later, binge-watching became a thing when Netflix launched its video streaming service. IBM jumped on the bandwagon with SmartCloud, Apple launched iCloud, and around the same time Oracle released its own cloud offering.


A computer is an electronic machine that accepts information, stores it, processes it according to the instructions provided by a user, and then returns the result. Today we take computers for granted, and they have become part of our everyday activities. While computers as we know them today are relatively recent, the concepts and ideas behind them have quite a bit of history: time for a whirlwind tour of how we got to the age of email, YouTube, and Facebook.
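That accept-store-process-return cycle fits in a few lines. The snippet below is a toy illustration (the sample numbers are arbitrary), not anything historical:

```python
# Toy illustration of the accept -> store -> process -> return cycle.
data = [3, 1, 4, 1, 5, 9]            # accept and store information
average = sum(data) / len(data)      # process it per the instructions
print(f"average = {average}")        # return the result to the user
```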


In the 1830s, the English mathematician Charles Babbage conceived an analytical engine that could be programmed with punched cards to carry out calculations. It differed from its predecessors in being able to make decisions based on its own computations, using sequential control, branching, and looping. Almost all computers in use today follow this basic idea laid out by Babbage, which is why he is often referred to as 'the father of computers.' The analytical engine was so complex that Babbage was never able to build a working model of his design, and it has never been completed; London's Science Museum did, however, build his simpler Difference Engine No. 2 more than 150 years later.


Many different types of mechanical devices followed that built on the idea of the analytical engine. Some of the first freely programmable computers were developed by Konrad Zuse in Germany between 1935 and 1941. His relay-based Z3 was the first working, programmable, fully automatic digital computer. The original was destroyed in World War II, but a replica has been built by the Deutsches Museum in Munich. Because his machines implemented many of the concepts we still use in modern-day computers, Zuse is often regarded as the 'inventor of the computer.'


Around the same time, the British built the Colossus computer to break encrypted German codes for the war effort, and the Americans built the Electronic Numerical Integrator and Computer, or ENIAC. Built between 1943 and 1945, ENIAC weighed 30 tons and was 100 feet long and eight feet high. Both Colossus and ENIAC relied heavily on vacuum tubes, which act as electronic switches that can be turned on or off much faster than the mechanical switches used until then. Computer systems using vacuum tubes are considered the first generation of computers.


Vacuum tubes, however, consume massive amounts of energy, turning a computer into an oven. The semiconductor transistor was first patented in 1926, but only in 1947 was a solid-state, reliable transistor developed that was suitable for use in computers. Like a vacuum tube, a transistor controls the flow of electricity, but it was only a few millimeters in size and generated little heat. Computer systems using transistors are considered the second generation of computers.


It took a few years for transistor technology to mature; meanwhile, in 1954, IBM introduced the 650, the first mass-produced computer. Today's computers still use transistors, although they are much smaller. By 1958 it became possible to combine several components, including transistors, and the circuitry connecting them on a single piece of semiconductor material: the first integrated circuit. Computer systems using integrated circuits are considered the third generation of computers. Integrated circuits led to the computer processors we use today.


Computers quickly became more powerful. By the early 1970s it became possible to squeeze all the integrated circuits that make up a single computer's processor onto a single chip called a microprocessor. Computer systems using microprocessors are considered the fourth generation of computers.


In the early 1970s, computers were still mostly used by larger corporations, government agencies, and universities. The first device that could be called a personal computer was introduced in 1975: the Altair 8800, made by Micro Instrumentation and Telemetry Systems. It included an Intel 8080 processor and 256 bytes of memory. There was no keyboard; programs and data were entered using switches. There was no monitor; results were read by interpreting a pattern of small red lights.


The 1980s saw a rapid growth in the use of computer systems. By the mid-1980s, both Apple and Microsoft released operating systems with a graphical user interface. This is when personal computers started to look a lot more like the devices we use today. Since then there have been numerous technological advances and computers have become easier to use, more robust and much faster, but the fundamentals of how personal computers work were developed in this period, from around 1970 to 1985.


One key development in computing has been the growth in processing power. Modern computers use transistors: microscopic switches that control the flow of electricity. Each switch can be turned on or off, which makes it possible to store binary information, and more transistors mean that more information can be processed. The capability of computers has grown from a handful of transistors in the 1950s to the billions of transistors in modern processors.
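To illustrate why on/off switches are enough (the sample value below is chosen arbitrarily): each transistor can hold one binary digit, so n of them can distinguish 2^n values. Here, eight such "switches" encode a number.

```python
# Eight on/off "switches" (bits) can represent 2**8 = 256 distinct values.
value = 42
bits = format(value, "08b")     # '00101010': one digit per switch
print(bits)
print(int(bits, 2))             # decoding the switches recovers 42
print(f"{len(bits)} switches -> {2 ** len(bits)} possible values")
```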

