
Intel History

Intel is the largest manufacturer of microprocessors in the world, and it got its start as the brains behind the world's most advanced consumer calculator. In 1971, the Busicom high-powered business calculator was released, powered by the Intel 4004, Intel's first microprocessor. Shortly after, in 1974, Intel broke into the personal computer market when its 8080 microprocessor went into the Altair 8800, the first successful personal computer ever released. In 1979, Intel introduced the 8088 microprocessor, which IBM chose to power the brand-new IBM PC for home and small business use. With an ad campaign that featured a re-creation of Charlie Chaplin's "Little Tramp" character, the IBM PC went on to be a huge success and established Intel as a premier microchip manufacturer.

Significance

In 1982, Intel released the 80286 microprocessor, a name eventually shortened to just the 286. This was Intel's first attempt to create a microchip that could run any of the software written for previous Intel processors. Prior to the release of the 286, no Intel processor was backwards compatible--able to run programs written for previous generations of processors. Backwards compatibility with all previous generations is now standard in Intel products. The expanded compatibility of the 286 resulted in the sale of over 15 million personal computers throughout the world.

Time Frame

The 386 generation of microprocessors was released in 1985, and it was the first processor to allow a computer to multi-task, that is, to run more than one program simultaneously. The programs were simple, and they were limited to only two or three at one time, but this was a huge jump in technology for home computing. The next-generation 486 was released in 1989, and this processor had a built-in math co-processor that allowed it to do complicated computations in a fraction of the time of previous generations. The 486 also allowed for a wider array of colors and for the introduction of true point-and-click technology. Prior to the 486, it was necessary to purchase a math co-processor separately to get the maximum speed out of an Intel microprocessor.

Effects

The Pentium processor was first introduced in 1993 at speeds of 60 MHz and 66 MHz. It contained over 3 million transistors, which greatly expanded the processor's computing capability and increased its speed. In 2000, Intel introduced the Pentium 4 family of processors, with an initial speed of 1.5 GHz. Intel continued to make design changes to the Pentium line, including dual-core and quad-core processors that were the equivalent of two or four processors in one. Intel eventually moved its flagship brand from Pentium to the Core family, built on new microarchitectures such as Merom.

Considerations

The Pentium line of processors was actually going to be called the 586 line, but Intel found it difficult to trademark a product that was referred to only by a number, so it decided to use the Pentium name instead. The name "Pentium" was created by a marketing firm named Lexicon Branding in 1992 and then used by Intel in its 1993 release. The very first line of Pentium processors was not very successful. A floating-point flaw in the processor caused it to miscalculate certain division operations, and this prompted one of the largest recalls in the history of the computer industry. It wound up costing Intel over $450 million to recall the defective chips. To avoid the problem ever happening again, Intel created a quality control division that checks each microprocessor before it leaves the factory.

AMD Processor

Advanced Micro Devices, Inc. (AMD) (NYSE: AMD) is an American multinational semiconductor company based in Sunnyvale, California, that develops computer processors and related technologies for commercial and consumer markets. Its main products include microprocessors, motherboard chipsets, embedded processors and graphics processors for servers, workstations and personal computers, and processor technologies for handheld devices, digital television, automobiles, game consoles, and other embedded systems applications.


AMD is the second-largest global supplier of microprocessors based on the x86 architecture after Intel Corporation, and the third-largest supplier of graphics processing units, behind Intel and Nvidia. It also owns 21 percent of Spansion, a supplier of non-volatile flash memory. In 2007, AMD ranked eleventh among semiconductor manufacturers in terms of revenue.


Advanced Micro Devices was founded on May 1, 1969, by a group of former executives from Fairchild Semiconductor, including Jerry Sanders III, Ed Turney, John Carey, Sven Simonsen, Jack Gifford and three members from Gifford's team, Frank Botte, Jim Giles, and Larry Stenger. The company began as a producer of logic chips, then entered the RAM chip business in 1975. That same year, it introduced a reverse-engineered clone of the Intel 8080 microprocessor. During this period, AMD also designed and produced a series of bit-slice processor elements (Am2900, Am29116, Am293xx) which were used in various minicomputer designs.


During this time, AMD attempted to embrace the perceived shift towards RISC with its own AMD 29K processor, and it attempted to diversify into graphics and audio devices as well as EPROM memory. It had some success in the mid-80s with the AMD7910 and AMD7911 "World Chip" FSK modem, one of the first multistandard devices that covered both Bell and CCITT tones at up to 1200 baud half duplex or 300/300 full duplex. While the AMD 29K survived as an embedded processor and AMD spinoff Spansion continues to make industry-leading flash memory, AMD was not as successful with its other endeavors, and it decided to switch gears and concentrate solely on Intel-compatible microprocessors and flash memory. This put it in direct competition with Intel for x86-compatible processors, as well as in the secondary market for flash memory.


AMD announced a merger with ATI Technologies on July 24, 2006. AMD paid $4.3 billion in cash and 58 million shares of its stock for a total of US$5.4 billion. The merger completed on October 25, 2006 and ATI is now part of AMD.

Solaris History

Solaris, the Unix-based operating system developed by Sun Microsystems, displays that company's ability to be innovative and flexible. Solaris, one could argue, is perpetually ahead of the curve in the computer world. Sun continually adapts to the changing computer environment, trying to anticipate where the computer world is going and what will be needed next, and develops new versions of Solaris to take that into account.


Solaris was born in 1987 out of an alliance between AT&T and Sun Microsystems to combine the leading Unix versions (BSD, XENIX, and System V) into one operating system. Four years later, in 1991, Sun replaced its existing Unix operating system (SunOS 4) with one based on SVR4. This new OS, Solaris 2, contained many new advances, including the OpenWindows graphical user interface, NIS+, and Open Network Computing (ONC) functionality, and was specially tuned for symmetric multiprocessing.


This kicked off Solaris' history of constant innovation, with new versions of Solaris being released almost annually over the next fifteen years. Sun was constantly striving to stay ahead of the curve, while at the same time adapting Solaris to the existing, constantly evolving wider computing world. The innovations in the Solaris OS are too numerous to list here, but a few milestones are worth mentioning. Solaris 2.5.1 in 1996 added CDE, the NFSv3 file system and NFS/TCP, expanded user and group IDs to 32 bits, and included support for the Macintosh PowerPC platform. Solaris 2.6 in 1997 introduced the WebNFS file system, Kerberos 5 security encryption, and large file support to increase Solaris' internet performance.


Solaris 2.7 in 1998 (renamed simply Solaris 7) included many new advances, such as native support for file system meta-data logging (UFS logging). It was also the first 64-bit release, which dramatically increased its performance, capacity, and scalability. Solaris 8 in 2000 took it a step further: it was the first OS to combine datacentre and dot-com requirements, offering support for IPv6 and IPsec, Multipath I/O, and IPMP. Solaris 9 in 2002 saw the writing on the wall in the server market, dropped OpenWindows in favour of Linux compatibility, and added a Resource Manager, the Solaris Volume Manager, extended file attributes, and the iPlanet Directory Server.


Solaris 10, the current version, was released to the public in 2005 free of charge and with a host of new developments, and the latest advances in the computing world are constantly being incorporated into new versions of Solaris 10 released every few months. To mention just a few: Solaris 10 offers ever more compatibility with Linux and IBM systems, introduced the Java Desktop System based on GNOME, and added Dynamic Tracing (DTrace), NFSv4, and later, in 2006, the ZFS file system.


Also in 2006, Sun set up the OpenSolaris Project. Within the first year, the OpenSolaris community had grown to 14,000 members with 29 user groups globally, working on 31 active projects. While displaying a deep commitment to open-source ideals, the project also provides Sun with thousands of developers essentially working for free.


The development of the Solaris OS demonstrates Sun Microsystems' ability to be on the cutting edge of the computing world without losing touch with the current computing environment. Sun regularly releases new versions of Solaris incorporating the latest developments in computer technology, while also adding more cross-platform compatibility and incorporating the advances of other systems. The OpenSolaris project is the ultimate display of these twin strengths: Sun has tapped into the creative energy of developers across the world and receives instant feedback about what its audience wants and needs. If all software companies took a lesson from Sun, imagine how exciting and responsive the industry could be.

Linux History

To understand the popularity of Linux, we need to travel back in time. In the early days, a computer was as large as a house, or even a stadium, so size and portability were serious problems. Worse still, every computer had a different operating system. Software was always customized to serve a specific purpose, and software written for one system did not run on another; being able to work with one system did not automatically mean you could work with another. This was difficult both for users and for system administrators, and those computers were also quite expensive. Technologically the world was not that advanced, so people had to live with the size problem for another decade. In 1969, a team of developers at Bell Labs started working on a solution to the software problem, to address these compatibility issues. They developed a new operating system that was simple and elegant, was written in the C programming language instead of assembly language, and, most importantly, could recycle code. The Bell Labs developers named this project "UNIX".

Unix was developed around a small piece of code named the kernel. The kernel is the only piece of code that needs to be adapted for every specific system, and it forms the base of the UNIX system. The operating system and all other functions were built around this kernel and written in a higher-level programming language, C, which was developed especially for creating the UNIX system. Using this new technique, it was much easier to develop an operating system that could run on many different types of hardware. This naturally affected the cost of the Unix operating system: vendors would sell the software at many times its original cost. The source code of Unix, once taught in universities courtesy of Bell Labs, was no longer published publicly, so developers tried to find a solution to this problem.

A solution seemed to appear in the form of MINIX. It was written from scratch by Andrew S. Tanenbaum, a US-born Dutch professor who wanted to teach his students the inner workings of a real operating system. It was designed to run on the Intel 8086 microprocessors that had flooded the world market.

As an operating system, MINIX was not superb, but it had the advantage that its source code was available. Anyone who got hold of the book 'Operating Systems: Design and Implementation' by Tanenbaum could study its 12,000 lines of code, written in C and assembly language. For the first time, an aspiring programmer or hacker could read the source code of an operating system, something that software vendors had until then guarded vigorously. A superb author, Tanenbaum captivated the brightest minds of computer science with his lively, elaborate discussion of the art of creating a working operating system. Computer science students all over the world pored over the book, reading through the code to understand the very system running their computers.

One of them was Linus Torvalds, then a second-year computer science student at the University of Helsinki and a self-taught hacker. MINIX was good, but it was still simply an operating system for students, designed as a teaching tool rather than an industrial-strength one. At that time, programmers worldwide were greatly inspired by the GNU project of Richard Stallman, a software movement to provide free and quality software. Stallman had begun his career in the famous Artificial Intelligence Laboratory at MIT, and during the mid and late seventies he created the Emacs editor.

In the early eighties, commercial software companies lured away many of the brilliant programmers of the AI lab, and negotiated stringent nondisclosure agreements to protect their secrets. But Stallman had a different vision: unlike other products, software should be free from restrictions against copying or modification, in order to make better and more efficient computer programs. With his famous 1983 manifesto declaring the beginnings of the GNU project, he started a movement to create and distribute software that conveyed his philosophy. (Incidentally, the name GNU is a recursive acronym standing for 'GNU's Not Unix'.) But to achieve his dream of ultimately creating a free operating system, he needed to create the tools first. So, beginning in 1984, Stallman started writing the GNU C Compiler (GCC), an amazing feat for an individual programmer. With his formidable technical skill, he alone outclassed entire groups of programmers from commercial software vendors in creating GCC, considered one of the most efficient and robust compilers ever created.

Inspired by MINIX and the GNU project, Linus set out to write an operating system kernel of his own, though he himself did not believe his creation was going to be big enough to change computing forever. Linux version 0.01 was released by mid-September 1991 and put on the net. Enthusiasm gathered around this new kid on the block, and code was downloaded, tested, tweaked, and returned to Linus. Version 0.02 came on October 5th.

Further Development

During the development of Linux, Linus faced some difficulties, including differences of opinion with other people, most notably Tanenbaum, the great teacher who wrote MINIX. Tanenbaum wrote to Linus:

"I still maintain the point that designing a monolithic kernel in 1991 is a fundamental error. Be thankful you are not my student. You would not get a high grade for such a design." Linus later admitted that it was the worst point in the development of Linux. Tanenbaum was certainly a famous professor, and anything he said certainly mattered. But he was wrong about Linux, for Linus was one stubborn guy who never liked defeat. Tanenbaum also famously remarked that "Linux is obsolete." Yet very soon thousands of people had formed a community and joined the camp. Powered by programs from the GNU project, Linux was ready for the actual showdown. It was licensed under the GNU General Public License, ensuring that the source code would be free for all to copy, study, and change. Students and computer programmers grabbed it.

Everyone tried out and edited the source code, and this gave commercial vendors their start: they compiled various software packages and distributed them with the operating system in a form people were familiar with. Red Hat and Debian gained the most response from the outside world. With new graphical interface systems like KDE and GNOME, Linux became popular. And the best thing about Linux today is its powerful commands.

Rise of the Desktop Linux

What is the biggest complaint about Linux? Its text mode. Many people are scared off by the command-line interface, which at first seems incomprehensible. But once you start learning the commands, it becomes an interesting way to learn more about the operating system. These days, very friendly GUIs are available as well, and anyone can install Linux without prior experience; everything is well explained during installation. Most distributions are also available in Live CD format, which users can simply put in their CD drives and boot without installing anything to the hard drive, making Linux accessible to newcomers. Most important of all, Linux is open source, so computer users on a low budget can obtain and learn Linux for free.

Linux's Logo - Penguin

The logo of Linux is a penguin, called Tux in the technology world. Tux, as the penguin is lovingly called, symbolizes the carefree attitude of the whole movement. This cute logo has an interesting history. As Linus tells it, no logo had initially been selected for Linux. Then, on a vacation to the southern hemisphere, Linus encountered a penguin not unlike the current logo, and as he tried to pat it, the penguin bit his hand. This amusing incident led, sometime later, to the selection of a penguin as the logo of Linux.

Unix History

Since it began to escape from AT&T's Bell Laboratories in the early 1970's, the success of the UNIX operating system has led to many different versions: recipients of the (at that time free) UNIX system code all began developing their own different versions in their own, different, ways for use and sale. Universities, research institutes, government bodies and computer companies all began using the powerful UNIX system to develop many of the technologies which today are part of a UNIX system.


Computer aided design, manufacturing control systems, laboratory simulations, even the Internet itself, all began life with and because of UNIX systems. Today, without UNIX systems, the Internet would come to a screeching halt. Most telephone calls could not be made, electronic commerce would grind to a halt and there would have never been "Jurassic Park"!


By the late 1970's, a ripple effect had come into play. By now the under- and post-graduate students whose lab work had pioneered these new applications of technology were attaining management and decision-making positions inside the computer system suppliers and among its customers. And they wanted to continue using UNIX systems.


Soon all the large vendors, and many smaller ones, were marketing their own, diverging, versions of the UNIX system optimized for their own computer architectures and boasting many different strengths and features. Customers found that, although UNIX systems were available everywhere, they seldom were able to interwork or co-exist without significant investment of time and effort to make them work effectively. The trade mark UNIX was ubiquitous, but it was applied to a multitude of different, incompatible products.


In the early 1980's, the market for UNIX systems had grown enough to be noticed by industry analysts and researchers. Now the question was no longer "What is a UNIX system?" but "Is a UNIX system suitable for business and commerce?"


Throughout the early and mid-1980's, the debate about the strengths and weaknesses of UNIX systems raged, often fuelled by the utterances of the vendors themselves who sought to protect their profitable proprietary system sales by talking UNIX systems down. And, in an effort to further differentiate their competing UNIX system products, they kept developing and adding features of their own.


In 1984, another factor brought added attention to UNIX systems. A group of vendors concerned about the continuing encroachment into their markets and control of system interfaces by the larger companies, developed the concept of "open systems."


Open systems were those that would meet agreed specifications or standards. This resulted in the formation of X/Open Company Ltd whose remit was, and today in the guise of The Open Group remains, to define a comprehensive open systems environment. Open systems, they declared, would save on costs, attract a wider portfolio of applications and competition on equal terms. X/Open chose the UNIX system as the platform for the basis of open systems.


Although UNIX was still owned by AT&T, the company did little commercially with it until the mid-1980's. Then the spotlight of X/Open showed clearly that a single, standard version of the UNIX system would be in the wider interests of the industry and its customers. The question now was, "which version?".


In a move intended to unify the market in 1987, AT&T announced a pact with Sun Microsystems, the leading proponent of the Berkeley derived strain of UNIX. However, the rest of the industry viewed the development with considerable concern. Believing that their own markets were under threat they clubbed together to develop their own "new" open systems operating system. Their new organization was called the Open Software Foundation (OSF). In response to this, the AT&T/Sun faction formed UNIX International.


The ensuing "UNIX wars" divided the system vendors between these two camps clustered around the two dominant UNIX system technologies: AT&T's System V and the OSF system called OSF/1. In the meantime, X/Open Company held the center ground. It continued the process of standardizing the APIs necessary for an open operating system specification.


In addition, it looked at areas of the system beyond the operating system level where a standard approach would add value for supplier and customer alike, developing or adopting specifications for languages, database connectivity, networking and mainframe interworking. The results of this work were published in successive X/Open Portability Guides.


XPG 4 was released in October 1992. During this time, X/Open had put in place a brand program based on vendor guarantees and supported by testing. Since the publication of XPG4, X/Open has continued to broaden the scope of open systems specifications in line with market requirements. As the benefits of the X/Open brand became known and understood, many large organizations began using X/Open as the basis for system design and procurement. By 1993, over $7 billion had been spent on X/Open branded systems. By the start of 1997 that figure had risen to over $23 billion. To date, procurements referencing the Single UNIX Specification amount to over $5.2 billion.


In early 1993, AT&T sold its UNIX System Laboratories to Novell, which was looking for a heavyweight operating system to link to its NetWare product range. At the same time, the company recognized that vesting control of the definition (specification) and trademark with a vendor-neutral organization would further facilitate the value of UNIX as a foundation of open systems. So the constituent parts of the UNIX System, previously owned by a single entity, are now quite separate.


In 1995 SCO bought the UNIX Systems business from Novell, and UNIX system source code and technology continues to be developed by SCO.


In 1995 X/Open introduced the UNIX 95 brand for computer systems guaranteed to meet the Single UNIX Specification. The Single UNIX Specification brand program has now achieved critical mass: vendors whose products have met the demanding criteria now account for the majority of UNIX systems by value.


For over ten years, since the inception of X/Open, UNIX had been closely linked with open systems. X/Open, now part of The Open Group, continues to develop and evolve the Single UNIX Specification and associated brand program on behalf of the IT community. The freeing of the specification of the interfaces from the technology is allowing many systems to support the UNIX philosophy of small, often simple tools that can be combined in many ways to perform often complex tasks. The stability of the core interfaces preserves existing investment, and is allowing development of a rich set of software tools. The Open Source movement is building on this stable foundation and is creating a resurgence of enthusiasm for the UNIX philosophy. In many ways Open Source can be seen as the true delivery of Open Systems that will ensure it continues to go from strength to strength.
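As a concrete illustration of that philosophy, here is a minimal sketch of our own (not from the original text; it assumes a Unix-like system with the standard sort and uniq utilities installed, and a placeholder input file named words.txt). The Python fragment below chains two small, single-purpose tools into a pipeline that counts duplicate lines, just as a shell user would write "sort words.txt | uniq -c":

    # Two small tools, each doing one job, combined with a pipe.
    # Assumes a Unix-like system with sort/uniq; words.txt is a placeholder.
    import subprocess

    sort = subprocess.Popen(["sort", "words.txt"], stdout=subprocess.PIPE)
    uniq = subprocess.Popen(["uniq", "-c"], stdin=sort.stdout,
                            stdout=subprocess.PIPE)
    sort.stdout.close()  # let sort receive SIGPIPE if uniq exits early
    output, _ = uniq.communicate()
    print(output.decode())

Each stage stays simple and reusable; the power comes from the combination.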

A Brief History of Computing - Operating Systems

1970

Development of the UNIX operating system started. It was later released as C source code to aid portability, and subsequently versions became obtainable for many different computers, including the IBM PC. It and its clones (such as Linux) are still widely used on network and Internet servers. Originally developed by Ken Thompson and Dennis Ritchie.

1975

Unix marketed (see 1970).

1980 - October

Development of MS-DOS/PC-DOS began. Microsoft (known mainly for their programming languages) were commissioned to write the Operating System for the PC; Digital Research failed to get the contract (there is much legend as to the real reason for this). DR's operating system, CP/M-86, was later shipped, but it was actually easier to adapt older CP/M programs to DOS than to CP/M-86, and CP/M-86 cost $495. As Microsoft didn't have an operating system to sell, they bought Seattle Computer Products' 86-DOS, which had been written by Tim Paterson earlier that year (86-DOS was also known as Q-DOS, the Quick & Dirty Operating System; it was a more-or-less 16-bit version of CP/M). The rights were actually bought in July 1981. It is reputed that IBM found over 300 bugs in the code when they subjected the operating system to their testing, and re-wrote much of the code.

Tim Paterson's DOS 1.0 was 4000 lines of assembler.

1981 - August 12

MS-DOS 1.0, PC-DOS 1.0
Microsoft (known mainly for their programming languages) were commissioned by IBM to write the operating system; they bought a program called 86-DOS from Tim Paterson which was loosely based on CP/M 80. The final program from Microsoft was marketed by IBM as PC-DOS and by Microsoft as MS-DOS; collaboration on subsequent versions continued until version 5.0 in 1991.

Compared to modern versions of DOS, version 1 was very basic. The most notable difference was the presence of just one directory, the root directory, on each disk. Subdirectories were not supported until version 2.0 (March, 1983).

MS-DOS (and PC-DOS) was the main operating system for all IBM-PC compatible computers until 1995, when Windows '95 began to take over the market and Microsoft turned its back on MS-DOS (leaving MS-DOS 6.22 from 1994 as the last version written - although the DOS Shell in Windows '95 calls itself MS-DOS version 7.0, and has some improved features like long filename support). According to Microsoft, in 1994, MS-DOS was running on some 100 million computers world-wide.

1982 - March

MS-DOS 1.25, PC-DOS 1.1

1983 - March

MS-DOS 2.0, PC-DOS 2.0
Introduced with the IBM XT, this version included a UNIX-style hierarchical sub-directory structure, and altered the way in which programs could load and access files on the disk.

1983 - May

MS-DOS 2.01

1983 - October

PC-DOS 2.1 (for the PC Jr). Like the PC Jr, this was not a great success and quickly disappeared from the market.

1983 - October

MS-DOS 2.11

1984 - August

MS-DOS 3.0, PC-DOS 3.0
Released for the IBM AT, it supported larger hard disks as well as High Density (1.2 MB) 5¼" floppy disks.

1985 - March

MS-DOS 3.1, PC-DOS 3.1
This was the first version of DOS to provide network support, adding some new functions to handle networking.

1985 - October

MS-DOS 2.25
Version 2.25 included support for foreign character sets, and was marketed in the Far East.

1985 - November

Microsoft Windows launched. Not really widely used until version 3, released in 1990, Windows required DOS to run and so was not a complete operating system (until Windows '95, released on August 21, 1995). It merely provided a G.U.I. similar to that of the Macintosh; in fact, so similar that Apple tried to sue Microsoft for copying the 'look and feel' of their operating system. This court case was not dropped until August 1997.

1985 - December

MS-DOS 3.2, PC-DOS 3.2
This version was the first to support 3½" disks, although only the 720KB ones. Version 3.2 remained the standard version until 1987 when version 3.3 was released with the IBM PS/2.

1987

Microsoft Windows 2 released. It was more popular than the original version, but nothing special; Windows 3 (see 1990) was the first really useful version.

1987 - April

MS-DOS 3.3, PC-DOS 3.3
Released with the IBM PS/2, this version included support for the High Density (1.44 MB) 3½" disks. It also supported hard disk partitions, splitting a hard disk into 2 or more logical drives.

1987 - April

OS/2 launched by Microsoft and IBM. A later enhancement, OS/2 Warp, provided many of the 32-bit enhancements boasted by Windows '95 - but several years earlier. Yet the product failed to dominate the market in the way Windows '95 did 8 years later.

1987 - October/November

Compaq DOS (CPQ-DOS) v3.31 released to cope with disk partitions >32MB. Used by some other OEMs, but not distributed by Microsoft.

1988 - July/August?

PC-DOS 4.0, MS-DOS 4.0

Versions 3.4 - 4.x are confusing due to a lack of correlation between IBM & Microsoft and also the USA & Europe. Several 'Internal Use only' versions were also produced.
This version reflected increases in hardware capabilities: it supported hard drives greater than 32 MB (up to 2 GB) and also EMS memory.

This version was not properly tested and was bug-ridden, causing system crashes and loss of data. The original release was IBM's, but Microsoft's version 4.0 (in October) was no better, and version 4.01 was released (in November) to correct this, then version 4.01a (in April 1989) as a further improvement. However, many people could not trust this and reverted to version 3.3 while they waited for the complete re-write (version 5, three years later). Betas of Microsoft's version 4.0 were apparently shipped as early as '86 & '87.

1988 - November

MS-DOS 4.01, PC-DOS 4.01
This corrected many of the bugs seen in version 4.0, but many users simply switched back to version 3.3 and waited for a properly re-written and fully tested version - which did not come until version 5 in June 1991. Support for disk partitions >32MB.

1990 - May 22

Introduction of Windows 3.0 by Bill Gates & Microsoft. It is a true multitasking system (or pretends to be, on computers below an 80386, by operating in 'Real' mode). It maintained compatibility with MS-DOS; on an 80386 it even allows such programs to multitask - which they were not designed to do. This created a real threat to the Macintosh, and despite a similar product, IBM's OS/2, it was very successful. Various improvements were made in versions 3.1 and 3.11, but the next major step did not come until Windows '95 in 1995, which relied much more heavily on the features of the 80386 and provided support for 32-bit applications.

1991 - June

MS-DOS 5.0, PC-DOS 5.0
In order to promote OS/2, Bill Gates took every opportunity after its release to say 'DOS is dead'; however, the development of DOS 5.0 led to the permanent dropping of OS/2 development.

This version, after the mess of version 4, was properly tested through the distribution of Beta versions to over 7,500 users. This version included the ability to load device drivers and TSR programs above the 640KB boundary (into UMBs and the HMA), freeing more RAM for programs. This version marked the end of collaboration between Microsoft and IBM on DOS.
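To make the memory changes concrete, a typical MS-DOS 5.0 CONFIG.SYS along the following lines (a sketch of ours, not from the original text; the directory paths are placeholders) would load the HIMEM.SYS memory manager, move DOS itself into the High Memory Area, and load a device driver into an upper memory block:

    REM Sketch of a typical MS-DOS 5.0 CONFIG.SYS; paths are placeholders.
    DEVICE=C:\DOS\HIMEM.SYS
    REM EMM386 with the NOEMS switch provides upper memory blocks on a 386.
    DEVICE=C:\DOS\EMM386.EXE NOEMS
    REM Load DOS into the High Memory Area and let it manage the UMBs.
    DOS=HIGH,UMB
    REM DEVICEHIGH loads a driver above 640KB instead of conventional RAM.
    DEVICEHIGH=C:\DOS\ANSI.SYS

The result was noticeably more free conventional memory for applications.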

1991 - August

Linux is born with the following post to the Usenet Newsgroup comp.os.minix:

Hello everybody out there using minix - I'm doing a (free) operating system (just a hobby, won't be big and professional like gnu) for 386(486) AT clones.

The post was by a Finnish college student, Linus Torvalds, and this hobby grew from these humble beginnings into one of the most widely used UNIX-like operating systems in the world today. It now runs on many different types of computer, including the Sun SPARC and the Compaq Alpha, as well as many ARM, MIPS, PowerPC and Motorola 68000 based computers.

In 1992, the GNU project (http://www.gnu.org/) adopted the Linux kernel for use on GNU systems while they waited for the development of their own (Hurd) kernel to be completed. The GNU project's aim is to provide a complete and free UNIX-like operating system, combining the Linux or Hurd platform with a complete suite of free software to run on it. In order to allow it to carry the GNU name, the Linux kernel copyright was changed to the GNU General Public License (http://www.gnu.org/copyleft/gpl.html) on the 1st of February 1992.

1992 - April

Introduction of Windows 3.1

1993 - July 27

Windows NT 3.1, the first release of the Windows NT series, was released. Its name was chosen to match the current version of the 16-bit version of Microsoft Windows. NT contained a completely new 'kernel' at the core of the operating system; unlike Windows 3.x it was not based on top of MS-DOS. It was designed to be platform independent; original development was targeted at the Intel i860 processor, but it was ported to MIPS and then to Intel's popular 80386 processor. The 'Win32' API was developed for Windows NT, providing a native 32-bit API that programmers used to the 16-bit versions of Microsoft Windows would be at home with.

1993 - December

MS-DOS 6.0. This included a hard-disk compression program called DoubleSpace, but a small computing company called 'Stac' claimed that DoubleSpace was partly a copy of their compression program, Stacker. After paying damages, Microsoft withdrew DoubleSpace from MS-DOS 6.2, releasing a new program - DriveSpace - with MS-DOS version 6.22. In operation and programming interface DriveSpace remains virtually identical to DoubleSpace. MS-DOS 6.22 remains the last version of MS-DOS released, since Microsoft turned its efforts to Windows '95. The Windows '95 (and later) DOS shell reports itself as DOS 7 - and includes a few enhancements, e.g. support for long filenames.

1994 - March 14

Linus Torvalds released version 1.0 of the Linux Kernel.

1994 - September

PC-DOS 6.3
Basically the same as version 5.0, this release by IBM included more bundled software, such as Stacker (the program that caused Microsoft so much embarrassment) and anti-virus software.

1994 - September 21

Microsoft released Windows NT 3.5. This included many features missing from the original 3.1 release, including support for compressed files and NetWare compatibility.

1995 - March

Linus released Linux Kernel v1.2.0 (Linux'95).

1995 - May 30

The main feature of Windows NT 3.51 was a version supporting IBM's PowerPC processor. Delays in the release of the processor meant delays in the release of Windows NT 3.51 (NT 3.51 only exists because the processor wasn't ready in time for NT 3.5). As the development team waited for the release of the processor, they fixed bugs in the existing codebase. This made NT 3.51 reliable and therefore popular with customers.

1995 - August 21 [poss. 23]

Windows '95 was launched by Bill Gates & Microsoft. Unlike previous versions of Windows, Windows '95 is an entire operating system - it does not rely on MS-DOS (although some remnants of the old operating system still exist). Windows '95 was written specially for the 80386 and compatible computers to make 'full' use of its 32-bit processing and multitasking capabilities, and thus in some respects it is much more similar to Windows NT than Windows 3.x. Both Windows 95 and Windows NT provide the Win32 API for programmers, and when Windows NT 4 was released it had an almost identical user interface to Windows 95. Unfortunately, in order to maintain backwards compatibility, Windows 95 doesn't impose the same memory protection and security measures that NT does, and so suffers from much worse stability, reliability and security. Despite being remarkably similar in function to OS/2 Warp (produced by IBM and Microsoft several years earlier, but marketed by IBM), Windows '95 has proved very popular.

1996

Windows '95 OSR2 (OEM Service Release 2) was released - partly to fix bugs found in release 1 - but only to computer retailers for sale with new systems. There were actually two separate releases of Windows 95 OSR2 before the introduction of Windows '98, the second of which contained both USB and FAT32 support - the main selling points of Windows '98. FAT32 is a new file system that provides support for disk partitions bigger than 2.1 GB and is better at coping with large disks (especially in terms of wasted space).

1996 - June 9

Linux 2.0 released. 2.0 was a significant improvement over the earlier versions: it was the first to support multiple architectures (originally developed for the Intel 386 processor, it now supported the Digital Alpha and would very soon support Sun SPARC and many others). It was also the first stable kernel to support SMP, kernel modules, and much more.

1996 - July 31

Windows NT 4.0 was released. The main feature was an update of the user interface to match Windows 95.

1998 - June 25

Microsoft released Windows '98. Some U.S. attorneys tried to block its release, since the new O/S interfaces closely with other programs such as Microsoft Internet Explorer and so effectively closes the market for such software to other companies. Microsoft fought back with a letter to the White House suggesting that 26 of its industry allies said a delay in the release of the new O/S could damage the U.S. economy. The main selling points of Windows '98 were its support for USB and for disk partitions greater than 2.1 GB.

1999 - Jan 25

Linux Kernel 2.2.0 released. The number of people running Linux is estimated at over 10 million, making it not only an important operating system in the Unix world, but an increasingly important one in the PC world.

2000 - Feb 17

Official launch of Windows 2000 - Microsoft's replacement for Windows 95/98 and Windows NT. Claimed to be faster and more reliable than previous versions of Windows, it is actually a descendant of the NT series, and so the trade-off for increased reliability is that it won't run some old DOS-based games. To keep the home market happy, Microsoft also released Windows ME, the newest member of the 95/98 series.

2001 - Jan 4

Linux kernel 2.4.0 released.

2001 - March 24

Apple released MacOS X. At its heart is 'Darwin', an open-source operating system based on FreeBSD. Using this, MacOS X finally gives Mac users the stability benefits of a protected memory architecture, along with many other enhancements such as preemptive multitasking. The BSD base also makes porting UNIX applications to MacOS easier and gives Mac users a fully featured command line interface alongside their GUI.

2001 - October 25

Microsoft released Windows XP - the latest version of their Windows operating system. Based on the NT series kernel, it was intended to bring together both the NT/2000 series and the Windows 95/98/ME series into one product. Of course, it was originally hoped that this would happen with Windows 2000, but that failed, largely because of compatibility problems with some older applications - notably, for home users, MS-DOS based games. Windows XP owes its success in part to some improvements in compatibility, and in part to time having passed - rendering much of the incompatible software obsolete anyway.

2003 - April 24

Windows Server 2003 is the latest incarnation of what began life as Windows NT. Windows Server 2003 is, as the name suggests, targeted at servers rather than workstations and home PCs; those are the realm of Windows XP. Security and reliability were key aims during the development and release of Windows Server 2003 - critical if Windows is to replace the UNIX systems that serve many enterprises.

2003 - October 24

MacOS 10.3 continues to improve MacOS X, with major updates to 'Aqua' (the user interface) as well as performance improvements and new features.

2003 - December 17

Linux kernel 2.6.0 released. Many features from uClinux (designed for embedded microcontrollers) have been integrated, along with support for NUMA (used in large, multi-processor systems). An improved scheduler and scalability improvements help ensure Linux will maintain its reputation for running on everything from small embedded devices to large enterprise-class servers and even mainframes. As always, support for new classes of hardware has been significantly improved.

© Copyright 1996-2005, Stephen White

A Brief History of the Internet and Related Networks

Introduction

In 1973, the U.S. Defense Advanced Research Projects Agency (DARPA) initiated a research program to investigate techniques and technologies for interlinking packet networks of various kinds. The objective was to develop communication protocols which would allow networked computers to communicate transparently across multiple, linked packet networks. This was called the Internetting project and the system of networks which emerged from the research was known as the "Internet." The system of protocols which was developed over the course of this research effort became known as the TCP/IP Protocol Suite, after the two initial protocols developed: Transmission Control Protocol (TCP) and Internet Protocol (IP).
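As a present-day illustration (our own sketch, not part of the original account; the host name and port are placeholders), the few lines of Python below open a TCP connection to a web server: IP carries the individual packets across the linked networks, while TCP gives the two endpoints a reliable byte stream over them.

    # TCP provides a reliable byte stream on top of IP's packet delivery.
    import socket

    with socket.create_connection(("www.example.com", 80)) as conn:
        conn.sendall(b"HEAD / HTTP/1.0\r\nHost: www.example.com\r\n\r\n")
        print(conn.recv(256).decode(errors="replace"))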

In 1986, the U.S. National Science Foundation (NSF) initiated the development of the NSFNET which, today, provides a major backbone communication service for the Internet. With its 45 megabit per second facilities, the NSFNET carries on the order of 12 billion packets per month between the networks it links. The National Aeronautics and Space Administration (NASA) and the U.S. Department of Energy contributed additional backbone facilities in the form of the NSINET and ESNET respectively. In Europe, major international backbones such as NORDUNET and others provide connectivity to over one hundred thousand computers on a large number of networks. Commercial network providers in the U.S. and Europe are beginning to offer Internet backbone and access support on a competitive basis to any interested parties.

"Regional" support for the Internet is provided by various consortium networks and "local" support is provided through each of the research and educational institutions. Within the United States, much of this support has come from the federal and state governments, but a considerable contribution has been made by industry. In Europe and elsewhere, support arises from cooperative international efforts and through national research organizations. During the course of its evolution, particularly after 1989, the Internet system began to integrate support for other protocol suites into its basic networking fabric. The present emphasis in the system is on multiprotocol interworking, and in particular, with the integration of the Open Systems Interconnection (OSI) protocols into the architecture.

Both public domain and commercial implementations of the roughly 100 protocols of the TCP/IP protocol suite became available in the 1980's. During the early 1990's, OSI protocol implementations also became available and, by the end of 1991, the Internet had grown to include some 5,000 networks in over three dozen countries, serving over 700,000 host computers used by over 4,000,000 people.

A great deal of support for the Internet community has come from the U.S. Federal Government, since the Internet was originally part of a federally-funded research program and, subsequently, has become a major part of the U.S. research infrastructure. During the late 1980's, however, the population of Internet users and network constituents expanded internationally and began to include commercial facilities. Indeed, the bulk of the system today is made up of private networking facilities in educational and research institutions, businesses and in government organizations across the globe.

The Coordinating Committee for Intercontinental Research Networks (CCIRN), which was organized by the U.S. Federal Networking Council (FNC) and the European Reseaux Associees pour la Recherche Europeenne (RARE), plays an important role in the coordination of plans for government-sponsored research networking. CCIRN efforts have been a stimulus for the support of international cooperation in the Internet environment.

Internet Technical Evolution

Over its fifteen year history, the Internet has functioned as a collaboration among cooperating parties. Certain key functions have been critical for its operation, not the least of which is the specification of the protocols by which the components of the system operate. These were originally developed in the DARPA research program mentioned above, but in the last five or six years, this work has been undertaken on a wider basis with support from Government agencies in many countries, industry and the academic community. The Internet Activities Board (IAB) was created in 1983 to guide the evolution of the TCP/IP Protocol Suite and to provide research advice to the Internet community.

During the course of its existence, the IAB has reorganized several times. It now has two primary components: the Internet Engineering Task Force and the Internet Research Task Force. The former has primary responsibility for further evolution of the TCP/IP protocol suite, its standardization with the concurrence of the IAB, and the integration of other protocols into Internet operation (e.g. the Open Systems Interconnection protocols). The Internet Research Task Force continues to organize and explore advanced concepts in networking under the guidance of the Internet Activities Board and with support from various government agencies.

A secretariat has been created to manage the day-to-day function of the Internet Activities Board and Internet Engineering Task Force. IETF meets three times a year in plenary and its approximately 50 working groups convene at intermediate times by electronic mail, teleconferencing and at face-to-face meetings. The IAB meets quarterly face-to-face or by videoconference and at intervening times by telephone, electronic mail and computer-mediated conferences.

Two other functions are critical to IAB operation: publication of documents describing the Internet and the assignment and recording of various identifiers needed for protocol operation. Throughout the development of the Internet, its protocols and other aspects of its operation have been documented first in a series of documents called Internet Experiment Notes and, later, in a series of documents called Requests for Comment (RFCs). The latter were used initially to document the protocols of the first packet switching network developed by DARPA, the ARPANET, beginning in 1969, and have become the principal archive of information about the Internet. At present, the publication function is provided by an RFC editor.

The recording of identifiers is provided by the Internet Assigned Numbers Authority (IANA), which has delegated one part of this responsibility to an Internet Registry which acts as a central repository for Internet information and which provides central allocation of network and autonomous system identifiers, in some cases to subsidiary registries located in various countries. The Internet Registry (IR) also provides central maintenance of the Domain Name System (DNS) root database which points to subsidiary distributed DNS servers replicated throughout the Internet. The DNS distributed database is used, inter alia, to associate host and network names with their Internet addresses and is critical to the operation of the higher level TCP/IP protocols including electronic mail.
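To make the DNS's role concrete, the short Python sketch below (our own illustration; the host name is a placeholder) asks the locally configured resolver, and through it the distributed DNS database, to map a name to its Internet address:

    # The resolver consults the distributed DNS database on our behalf.
    import socket

    print(socket.gethostbyname("www.example.com"))  # prints an IPv4 address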

There are a number of Network Information Centers (NICs) located throughout the Internet to serve its users with documentation, guidance, advice and assistance. As the Internet continues to grow internationally, the need for high quality NIC functions increases. Although the initial community of users of the Internet were drawn from the ranks of computer science and engineering, its users now comprise a wide range of disciplines in the sciences, arts, letters, business, military and government administration.

Related Networks

In 1980-81, two other networking projects, BITNET and CSNET, were initiated. BITNET adopted the IBM RSCS protocol suite and featured direct leased line connections between participating sites. Most of the original BITNET connections linked IBM mainframes in university data centers. This rapidly changed as protocol implementations became available for other machines. From the beginning, BITNET has been multi-disciplinary in nature with users in all academic areas. It has also provided a number of unique services to its users (e.g., LISTSERV). Today, BITNET and its parallel networks in other parts of the world (e.g., EARN in Europe) have several thousand participating sites. In recent years, BITNET has established a backbone which uses the TCP/IP protocols with RSCS-based applications running above TCP.

CSNET was initially funded by the National Science Foundation (NSF) to provide networking for university, industry and government computer science research groups. CSNET used the Phonenet MMDF protocol for telephone-based electronic mail relaying and, in addition, pioneered the first use of TCP/IP over X.25 using commercial public data networks. The CSNET name server provided an early example of a white pages directory service and this software is still in use at numerous sites. At its peak, CSNET had approximately 200 participating sites and international connections to approximately fifteen countries.

In 1987, BITNET and CSNET merged to form the Corporation for Research and Educational Networking (CREN). In the Fall of 1991, CSNET service was discontinued having fulfilled its important early role in the provision of academic networking service. A key feature of CREN is that its operational costs are fully met through dues paid by its member organizations.

A Brief History of Computers and Networks


As early as the 1640's, mechanical calculators are manufactured for sale. Records exist of earlier machines, but Blaise Pascal invents the first commercial calculator, a hand-powered adding machine. Although attempts to multiply mechanically were made by Gottfried Leibniz in the 1670s, the first true multiplying calculator appears in Germany shortly before the American Revolution.

In 1801 a Frenchman, Joseph-Marie Jacquard, builds a loom that weaves by reading punched holes stored on small sheets of hardwood. These plates are then inserted into the loom, which reads (retrieves) the pattern and creates (processes) the weave. Powered by water, this "machine" came 140 years before the development of the modern computer.

Shortly after the first mass-produced calculator (1820), Charles Babbage begins his lifelong quest for a programmable machine. Although Babbage was a poor communicator and record-keeper, his designs are sufficiently developed by 1842 that Ada Lovelace translates and extends a paper describing his proposed Analytical Engine; she is generally regarded as the first programmer. Twelve years later George Boole, while professor of Mathematics at Cork University, writes An Investigation of the Laws of Thought (1854), and is generally recognized as the father of computer science.

The 1890 census is tabulated on punch cards similar to the ones used 90 years earlier to create weaves. Developed by Herman Hollerith of MIT, the system uses electric power (non-mechanical). The Hollerith Tabulating Company is a forerunner of today's IBM.

Just prior to the introduction of Hollerith's machine, the first printing calculator is introduced. In 1892 William Burroughs, a sickly ex-teller, introduces a commercially successful printing calculator. Although hand-powered at first, Burroughs quickly introduces an electric model.

In 1925, unaware of the work of Charles Babbage, Vannevar Bush of MIT builds a machine he calls the differential analyzer. Using a set of gears and shafts, much like Babbage's, the machine can handle simple calculus problems, but accuracy is a problem.

The period from 1935 through 1952 gets murky with claims and counterclaims of who invents what and when. Part of the problem lies in the international situation that makes much of the research secret. Other problems include poor record-keeping, deception and lack of definition.

In 1935, Konrad Zuse, a German construction engineer, builds a mechanical calculator to handle the math involved in his profession. Shortly after completion, Zuse starts on a programmable electronic device which he completes in 1938.

John Vincent Atanasoff begins work on a digital computer in 1936 in the basement of the Physics building on the campus of Iowa State. A graduate student, Clifford Berry, assists. The "ABC" is designed to solve linear equations common in physics. It displays some early features of later computers, including electronic calculations. He shows it to others in 1939 and leaves the patent application with attorneys for the school when he leaves for a job in Washington during World War II. Unimpressed, the school never files, and the ABC is cannibalized by students.

The Enigma, a complex mechanical encoder, is used by the Germans, who believe it to be unbreakable. Several people involved, most notably Alan Turing, conceive machines to handle the problem, but none are technically feasible. Turing proposes a "Universal Machine" capable of "computing" any algorithm in 1937. That same year George Stibitz creates his Model K(itchen), a conglomeration of otherwise useless and leftover material, to solve complex calculations. He improves the design while working at Bell Labs and on September 11, 1940, Stibitz uses a teletype machine at Dartmouth College in New Hampshire to transmit a problem to his Complex Number Calculator in New York and receives the results. It is the first example of a network.

First in Poland, and later in Great Britain and the United States, the Enigma code is broken. Information gained by this shortens the war. To break the code, the British, led by Turing, build the Colossus Mark I. The existence of this machine is a closely guarded secret of the British Government until 1970. The United States Navy, aided to some extent by the British, builds a machine capable of breaking not only the German code but the Japanese code as well.

In 1943 development begins in earnest on the Electronic Numerical Integrator And Computer (ENIAC) at the Moore School of the University of Pennsylvania. Designed by John Mauchly and J. Presper Eckert, the project gets help from John von Neumann and others. In 1944, the Harvard Mark I is introduced. Based on a series of proposals from Howard Aiken in the late 1930's, the Mark I computes complex tables for the U.S. Navy. It uses a paper tape to store instructions, and Aiken hires Grace Hopper ("Amazing Grace") as one of three programmers working on the machine. Thomas J. Watson Sr. plays a pivotal role involving his company, IBM, in the machine's development.

Early in 1945, with the Mark I stopped for repairs, Hopper notices a moth in one of the relays, possibly causing the problem. From this day on, Hopper refers to fixing the system as "debugging". The same year von Neumann proposes the concept of a "stored program" in a paper that is never officially published.

Work completes on ENIAC in 1946. Although only three years old, the machine is woefully behind in technology, but the inventors opt to continue while working on a more modern machine, the EDVAC. Programming ENIAC requires it to be rewired; a later version eliminates this problem. To make the machine appear more impressive to reporters during its unveiling, a team member (possibly Eckert) puts translucent spheres (halved ping pong balls) over the lights. The US patent office will later recognize this as the first computer.

The next year scientists employed by Bell Labs complete work on the transistor (John Bardeen, Walter Brattain and William Shockley receive the Nobel Prize in Physics in 1956), and by 1948 teams around the world work on a "stored program" machine. The first, nicknamed "Baby", is a prototype of a much larger machine under construction in Britain and is shown in June 1948.

The impetus over the next 5 years for advances in computers is mostly the government and military. UNIVAC, delivered in 1951 to the Census Bureau, results in a tremendous financial loss to its manufacturer, Remington-Rand. The next year Grace Hopper, now an employee of that company, proposes "reusable software," code segments that could be extracted and assembled according to instructions in a "higher level language." The concept of compiling is born. Hopper would revise this concept over the next twenty years, and her ideas would become an integral part of all modern computers. CBS uses one of the 46 UNIVAC computers produced to predict the outcome of the 1952 Presidential Election. They do not air the prediction for 3 hours because they do not trust the machine.

IBM introduces the 701 the following year. It is the first commercially successful computer. In 1956 FORTRAN is introduced (proposed in 1954, it took nearly 3 years to develop the compiler). Two additional languages, LISP and COBOL, are added in 1957 and 1958. Other early languages include ALGOL and BASIC. Although never widely used, ALGOL is the basis for many of today's languages.

With the introduction of Control Data's CDC 1604 in 1958, the first transistor-powered computer, a new age dawns. Brilliant scientist Seymour Cray heads the development team. This year integrated circuits are introduced by two men, Jack Kilby and Robert Noyce, working independently. The second network is developed at MIT. Over the next three years computers begin affecting the day-to-day lives of most Americans. The addition of MICR characters at the bottom of checks is common.

In 1961 Fairchild Semiconductor introduces the integrated circuit. Within ten years all computers use these instead of the transistor. Formerly building-sized computers are now room-sized, and are considerably more powerful. The following year the Atlas becomes operational, displaying many of the features that make today's systems so powerful, including virtual memory, pipeline instruction execution and paging. Designed at the University of Manchester, some of the people who developed Colossus twenty years earlier make contributions.

On April 7, 1964, IBM introduces the System/360. While a technical marvel, the main feature of this machine is business oriented: IBM guarantees the "upward compatibility" of the system, reducing the risk that a business would invest in outdated technology. Dartmouth College, where the first network was demonstrated 25 years earlier, moves to the forefront of the "computer age" with the introduction of TSS (Time Share System), a crude (by today's standards) networking system. It is the first Wide Area Network. In three years Randy Golden, President and Founder of Golden Ink, would begin working on this network.

Within a year MIT returns to the top of the intellectual computer community with the introduction of a greatly refined network that features shared resources and uses the first minicomputer (DEC's PDP-8) to manage telephone lines. Bell Labs and GE play major roles in its design.

In 1969 Bell Labs, unhappy with the direction of the MIT project, leaves and develops its own operating system, UNIX. One of the many precursors to today's Internet, ARPANet, is quietly launched. Alan Kay, who will later become a designer for Apple, proposes the "personal computer." Around the same time, a group of technicians unhappy with Fairchild Semiconductor had left to form their own company, founded in 1968 and known as Intel. The movie Colossus: The Forbin Project has a supercomputer as the villain. The next year, The Computer Wore Tennis Shoes is the first feature-length movie with the word computer in the title. In 1971, Texas Instruments introduces the first "pocket calculator." It weighs 2.5 pounds.

With the country embroiled in a crisis of confidence known as Watergate, in 1973 a little-publicized judicial decision takes the patent for the computer away from Mauchly and Eckert and awards it to Atanasoff. Xerox introduces the mouse. Proposals are made for the first local area networks.

In 1975 the first personal computer is marketed in kit form. The Altair features 256 bytes of memory. Bill Gates, with others, writes a BASIC interpreter for the machine. The next year Apple begins to market PC's, also in kit form, including a monitor and keyboard. The earliest RISC platforms become stable. In 1976, Queen Elizabeth goes on-line with the first royal email message.

During the next few years the personal computer explodes on the American scene. Microsoft, Apple and many smaller PC-related companies form (and some die). By 1977 stores begin to sell PC's. Continuing today, companies strive to reduce the size and price of PC's while increasing capacity. Entering the fray, IBM introduces its PC in 1981 (actually IBM's second attempt - the first failed miserably). Time selects the computer as its Machine of the Year for 1982. Tron, a computer-generated special-effects extravaganza, is released the same year.

(Source: Golden Ink)
 