
The History of Computing V - Linux

Some thought that the problem with Windows wasn't that it was nearly universal (there are advantages to everyone, as far as that's concerned) but that it was proprietary. That is, it was owned or controlled by one company, which means that no one outside of that company knows exactly how the product works. In computer science, the opposite of proprietary is open-source. That is, no one company owns or controls the product, and anyone can look at how the product works and even make suggestions on how to improve it.

In 1991, Linus Torvalds, a student in Finland, made an innocent post to an e-mail group. He wrote that he was working on a free version of UNIX, an operating system that had been in use for twenty years. The Torvalds version, which came to be called Linux (Linus UNIX), was released under an agreement called the GNU General Public License. In essence, anyone could use Linux in any way and modify it in any way, as long as any changes made were not hidden or charged for. This ensured that Linux would remain "open."

"Windows versus Linux" has become the main battleground in the larger war of "proprietary versus open-source." Adherents on both sides believe their approach will lead to more effective software.

Source of Information : Broadway-Computer Science Made Simple 2010

The History of Computing IV - Microsoft Windows

Those earning Microsoft certifications may be interested to know more about the history of the company. When IBM was readying its PC for launch, it needed an operating system, the core program that allows a computer to function. This is the program that starts to run when you turn on a computer, before you touch the keyboard or the mouse. Microsoft purchased an operating system from another company and adapted it for use on the IBM PC, calling its software MS-DOS, for Microsoft Disk Operating System.

Microsoft's initial fortunes were made with MS-DOS, but once Microsoft chairman Bill Gates saw the Apple Macintosh, he knew the days of MS-DOS were numbered. In 1985, Microsoft released its own GUI, a program that ran on top of MS-DOS, and called it "Windows."

Few people remember the original Windows today, or its sequel, Windows 2.0. They were crude interfaces by today's standards. They are most famous for starting a long legal battle with Apple, which claimed that the "look and feel" of Windows legally infringed on the designs of Apple computers. Microsoft, in turn, claimed that Apple had stolen the important ideas from Xerox anyway. Microsoft eventually won.

In 1990, Windows 3.0 was released, the first version that was truly popular with users and developers. Windows 95, released in 1995, was, unlike the previous versions, an entire operating system and interface in one, and did not require that MS-DOS be installed beforehand on the system. Windows NT, Windows 98, Windows 2000, and Windows XP followed. These products dominate the industry, and the vast majority of computers run some form of Windows.

As Microsoft became ubiquitous in the computing world, a backlash developed. Critics claimed the company's software presence on almost every computer in the world gave it an unfair competitive advantage. The company became entangled in antitrust lawsuits from rival companies and governments around the globe.

Source of Information : Broadway-Computer Science Made Simple 2010

The History of Computing III - Apple Macintosh

Although a great improvement over punch cards, some computer scientists saw limitations in a computer with only a keyboard for input and text for output. It was fine for researchers and computer experts to interact with the machine through obscure commands and oblique text messages, but if the computer was going into every home, it needed to interact with users in a different way.

In the early 1970s, researchers at Xerox developed a series of computers that communicated with the user through pictures, not just words. The culmination of their early efforts was the Xerox Star. It had "windows," a "mouse," and many other elements you would recognize today. Eventually this method of computer use (mostly visual, with little text) would be called a graphical user interface (or GUI), and every computer would have one. Unfortunately for the executives at Xerox, they proved better at funding interesting projects than at marketing the results.

Steve Jobs, a co-founder of Apple Computer, toured the Xerox research facility in 1979, having traded some Apple stock for Xerox stock. He'd been told about this new interface and wanted to see it. He left impressed, and decided that Apple's new computer, the "Apple Lisa," would be the first mass-produced computer with a graphical user interface. Many of the Xerox researchers would soon be working at Apple.

Not many Apple Lisas were sold. It was an expensive computer, costing $10,000 when it debuted in 1983. But because Jobs was convinced that the GUI was the model for the future, he tried again.

During the Super Bowl in 1984, Apple ran one of the most famous commercials in history to introduce its next computer, the Apple Macintosh. Directed by Ridley Scott, director of the movie Blade Runner, it depicted an Orwellian future of gray-clad workers who mindlessly pay homage to a "Big Brother" figure on a huge video screen, until an athletic woman in running clothes smashes the screen with a flying hammer. What this had to do with the Macintosh was never clear, but the commercial was widely discussed around office water coolers and was repeated on news programs. Soon everyone had heard of the "Mac." The new computer was cheaper than the Lisa, but less powerful. As with the Lisa, it was a slow seller at first, but the company stuck with it. There was no turning back to text-based computers.

Source of Information : Broadway-Computer Science Made Simple 2010

The History of Computing II - The IBM PC

For decades after World War II, computers were shared. Though they had grown smaller since the days of ENIAC, they were still very large and expensive. As a consequence, entire universities or companies, or even groups of schools and companies, would share the use of a single, large, and (for the time) powerful computer.

The people who used this mainframe, as these computers were called, would often never see it. The computer was generally locked in a secure location, and users would connect to it through phone lines or other wiring. At first, all the user saw was a teletype, which is like a combination of an electric typewriter and a printer. But later, video screens were introduced, which showed green or orange text on a black background.

By the late 1970s, computer scientists realized that smaller, less powerful computers could be placed right on users' desks. "Garage" companies, so named because they were so small they might literally consist of one or two people working out of a garage, began making small computers for individuals, not whole companies. Sometimes these computers came in the form of kits.

IBM, the company that grew out of Hollerith's census tabulators, was at this time very successful making mainframes. At first, IBM was skeptical that a market even existed for smaller computers, but eventually it decided to act. In 1981 the company introduced the IBM PC. The "PC" stood for "personal computer," which is where that term for a single-user computer originates.

The price was over $1,500, which in 1981 was a lot of money, but it was still remarkably inexpensive compared to mainframes. In addition, the IBM name gave the personal computer instant legitimacy. Yet the IBM PC did not mark a revolution in technology. The machine had been built almost entirely from off-the-shelf parts. What was revolutionary was the concept: a computer would appear on our desks at work, would come into our homes, and would become an ordinary appliance like the telephone. At the start of 1983, when Time magazine would normally have named its "Man of the Year" for 1982, the editors instead selected "The Computer" as its "Machine of the Year." The computer had come of age.

As you will find, today's computers are not simply a little faster than the original desktops; a gymnasium full of IBM PCs would not equal the power of a single system today. For now, just marvel at how little $1,500 once bought (or, to be optimistic, how much $1,500 buys now).

Source of Information : Broadway-Computer Science Made Simple 2010

History of Computing I

The Abacus
The first efforts toward mechanical assistance aided in counting, not computation. An abacus is a mechanical device with beads sliding on rods, which is used as a counting device. It dates to at least the Roman Empire, and its ancestor, the counting board, was in use as far back as 500 B.C. The abacus is considered a counting device because all the computation is still done by the person using the device. The abacus did show, however, that a machine could be used to store numbers.


Jacquard's Mechanical Loom
A loom is a machine for weaving a pattern into fabric. Early loom designs were operated by hand. For each "line" in the design, certain threads were "pulled" by an experienced weaver (or a poor worker under the direction of a weaver) to get the finished pattern. As you might guess, this process was slow and tedious.

In 1801, Frenchman Joseph-Marie Jacquard invented a mechanical loom in which the threads to be pulled at each stage of the pattern were dictated by a series of punch cards. A punch card encodes data with holes in specific locations. In the case of weaving, every thread that could be pulled had a location on the card. If there was a hole in that location, the thread was pulled. Jacquard's loom used a series of these punch cards on a belt. The loom would weave the line dictated by the current card, then automatically advance to the next card.
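To make the idea concrete, here is a minimal sketch in Python (a toy model, not anything Jacquard built): each card is a row of hole/no-hole positions, and a hole means "pull that thread." The card markings and output symbols are invented for illustration.

# A toy model of the punch-card idea: each card is a row of positions,
# and a hole ('X') at a position means "pull that thread" for that line.
cards = [
    "X.X.X.X.",
    ".X.X.X.X",
    "XX..XX..",
]

def weave(cards):
    """Print one line of 'fabric' per card, pulling the marked threads."""
    for card in cards:
        print("".join("|" if spot == "X" else "-" for spot in card))

weave(cards)   # a different stack of cards would weave a different pattern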

Jacquard's loom is not necessarily a computer, because it does no mathematical calculations, but it introduced the important idea of a programmable machine. The loom is a "universal weaver" that processes different sets of punch cards to make different woven fabrics.


Babbage's Counting Machine
Eventually someone was bound to put the two ideas together, a machine that could count and a machine that could follow a program, and to wonder whether a machine could be made to compute numbers. Charles Babbage, an English researcher, spent much of the 1800s trying to develop just such a machine.

One of Babbage's early designs was for a device he called the "Difference Engine," which produced successive terms in a mathematical series while an operator turned a crank. This may not seem a dramatic development, but at the time, mathematicians relied on tables of mathematical functions in which each value had been painstakingly calculated by hand. Thus, the Difference Engine was revolutionary.
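The trick behind the Difference Engine is the method of finite differences: successive values of a polynomial can be produced by additions alone, with no multiplication. A minimal sketch in Python (the principle, not Babbage's hardware), generating the squares 0, 1, 4, 9, ... from their starting differences:

# Successive values of f(x) = x*x produced purely by addition.
# The starting values are f(0) = 0, the first difference f(1) - f(0) = 1,
# and the constant second difference 2.
def difference_engine(value=0, first_diff=1, second_diff=2, terms=10):
    results = []
    for _ in range(terms):
        results.append(value)
        value += first_diff        # the next term is one addition away
        first_diff += second_diff  # the difference itself grows by addition
    return results

print(difference_engine())  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]

Each turn of the crank played roughly the role of one pass through this loop.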

Its success led Babbage to a more ambitious design: the Analytical Engine. Rather than being tied to a specific task like the Difference Engine, the Analytical Engine was conceived as a general-purpose computing device. Different programs would be fed to the machine using a belt of punch cards, just as in Jacquard's loom.

The Analytical Engine was never built, because Babbage ran out of money. Like many researchers today, he was dependent on government grants to continue his work. In addition, his design may not have been possible to implement using the technology of the day. He was undoubtedly a man ahead of his time.


Hollerith's Punch Cards
Under its Constitution, the U.S. government is required every ten years to count how many people reside in each state, a process known as the census. These numbers are used to determine the proportion of representatives each state receives in the House of Representatives.

Originally, this process was done entirely by hand. Census takers would fill out forms for each household, and then the results of these forms would be tabulated by state. The method was so onerous that the 1880 census took more than ten years to complete, which meant that the next census was starting before the results of the previous one were known. Clearly, something had to be done.

The government created a contest to find the best solution to the problem. In 1890 it was won by a census agent named Herman Hollerith. In his design, each census form was encoded into a punch card. Machines called "tabulators" could rapidly process stacks of these cards.

This method was not only dramatically faster than manual tabulation, but also allowed the government to track demographics as never before, ask more questions of each citizen, and break up data along multiple categories. For example, rather than counting men who were above a certain age or were veterans, the tabulators could count the men who were in both categories, which allowed the government to better anticipate the funds that would be needed for veterans' pensions.
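In modern terms this is a cross-tabulation: counting the records that satisfy several conditions at once. A minimal sketch in Python, using a handful of invented records (not real census data):

# Count the records that fall into both categories at once: above a
# certain age AND a veteran. The records below are invented examples.
census_records = [
    {"age": 72, "veteran": True},
    {"age": 45, "veteran": True},
    {"age": 70, "veteran": False},
    {"age": 68, "veteran": True},
]

count = sum(1 for person in census_records
            if person["age"] >= 65 and person["veteran"])
print(count)  # 2 records meet both conditions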

The system was a resounding success and led to Hollerith's founding of the Tabulating Machine Company, which, several mergers later, became International Business Machines, or IBM, a company that would later dominate the world of computers for decades.


ABC
In the period from 1939 to 1942, John Atanasoff, a professor at Iowa State University, and Clifford Berry, a graduate student at the same school, created what is now considered the first modern computer. Their machine, which they called the Atanasoff-Berry Computer, or ABC, weighed about 700 pounds and had to be housed in the basement of the physics department. By current standards, it was a terribly slow machine, reportedly capable of only a single calculation every fifteen seconds. In contrast, a computer today can perform billions of calculations a second.

Atanasoff and Berry never completed the patent process on their work, and the machine itself was dismantled a few years after it was built, when the physics department needed its basement space back. This was unfortunate, as their pioneering work in the field was underappreciated. Credit for the first modern computer was instead bestowed on a more famous project: ENIAC.


ENIAC
Like Herman Hollerith's punch cards, the ENIAC story is driven by governmental need. When World War II began, the United States was woefully underprepared for military operations. The army needed to develop and test a large number of weapons in a short period of time. In particular, it had to perform a number of ballistics tests to create artillery tables: in essence, a book showing how far an artillery shell would fly from a specific gun, given wind conditions, the angle of the gun barrel, and so on.

Like the mathematical tables of Babbage's time, these artillery tables had been created by hand, but by now the army already had some devices for assisting in calculation. Called differential analyzers, they operated on mechanical principles (much like Babbage's machines), not on electronics. But something better was needed, so the army hired John Mauchly and J. Presper Eckert, computer scientists at the University of Pennsylvania. The machine they proposed, completed in 1946, was called ENIAC, which stands for Electronic Numerical Integrator and Computer. Like the ABC, it was truly a modern computer.

The term "modern" might seem too strong if you actually saw this machine. Computers of that era relied on the vacuum tube, a device that resembled a lightbulb and through which one electrical current can control another. This controlling aspect was used to build logical circuits, because by itself one vacuum tube doesn't do much. Indeed, ENIAC required about 19,000 vacuum tubes to do its work, filled an entire room, weighed thirty tons, and drew about 200 kilowatts (that is, 200,000 watts) of power. In comparison, a desktop computer purchased today draws about 400 watts, which means ENIAC drew about 500 times more power, even though its actual ability to compute is dwarfed by the most inexpensive desktop computers of today.

What makes ENIAC so important is its reliance on electronics to solve a real-world problem. It had as few mechanical parts as possible, although some mechanics were inevitable. For example, ENIAC still used punch cards for input and output, and the parts that read and produced these cards were mechanical. The vacuum tubes were wired into small circuits that performed elementary logical functions, those circuits were combined into larger circuits, and the larger circuits were combined into larger ones still, a design idea that is still used today.
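The layering idea is easy to mimic in software. Here is a minimal sketch in Python (an illustration of the principle, not ENIAC's actual circuitry): elementary logical functions are combined into a half adder, and two half adders are combined into a full adder, which could in turn be a building block in a still larger circuit.

# Elementary logical functions, then small circuits built from them.
def and_gate(a, b):
    return a & b

def xor_gate(a, b):
    return a ^ b

def or_gate(a, b):
    return a | b

def half_adder(a, b):
    """Add two bits; return (sum, carry)."""
    return xor_gate(a, b), and_gate(a, b)

def full_adder(a, b, carry_in):
    """Add two bits plus an incoming carry, built from two half adders."""
    partial_sum, carry1 = half_adder(a, b)
    total_sum, carry2 = half_adder(partial_sum, carry_in)
    return total_sum, or_gate(carry1, carry2)

print(full_adder(1, 1, 1))  # (1, 1): one plus one plus one is binary 11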

For decades the ENIAC was considered the first computer, but in the 1970s the judge in a patent infringement case determined that ENIAC was based on the designs of the ABC. Other claims were also made, including those of Konrad Zuse, whose work in wartime Germany wasn't known to the rest of the world for decades; and the Mark I, a computer developed around the same time at Harvard. The question of what was the first modern computer may never be settled.


Knuth�s Research
To this point computers were seen as increasingly useful tools, but computer science was not considered a serious discipline, separate from mathematics. One of the leading figures who changed this was Donald Knuth.

As an undergraduate studying physics and mathematics at the Case Institute of Technology in Cleveland in the 1950s, Knuth had his first contact with the school's IBM computer. From then on, computers and programs were his obsession. He wrote programs for the IBM computer to analyze the college's basketball statistics, and published research papers while still an undergraduate. When he completed the work for his bachelor's degree, the college was so impressed with his computer work that he was awarded a master's degree at the same time. His most famous accomplishment is The Art of Computer Programming, a proposed masterwork of seven volumes, of which three are completed. It's no exaggeration to say that Donald Knuth's writings are to computer science what those of Albert Einstein are to physics.

Source of Information : Broadway-Computer Science Made Simple 2010

SUBJECT AREAS IN COMPUTER SCIENCE

Within the computer science field, computer scientists can work in many areas. Depending on the profession, some computer scientists may need to know a little about each area, while others may need deep knowledge of one or two areas.


Artificial Intelligence
Artificial intelligence can be described as programming computers to perform tasks that would require intelligence if humans were performing them. This is not the only definition, though, and of all the areas in computer science, this one has perhaps the most contentious boundaries. Some researchers believe artificial intelligence must mimic the processes of the human brain; others are interested only in solving problems that seem to require intelligence, like understanding a request written in English.


Theory of Computation
The theory of computation establishes limits on what can be computed. Some limits are practical: it may be shown, for instance, that a computer could solve a certain problem, but that it would take hundreds of years to get the result. Other limits are absolute. Strange as it may seem, some questions have a fixed, numerical answer that cannot be computed. Scientists in this area also compare programming solutions to specific tasks in a formal way. For example, a common computing task is sorting, which just means putting items in some order (like alphabetizing a list of student records by last name). Countless approaches to the sorting problem exist, each with advantages and disadvantages, and computational theory is used to determine which approach is best suited to a particular situation.
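As a small illustration of those trade-offs, here is a sketch in Python comparing a hand-written insertion sort with the language's built-in sort on the last-name example (the names are invented). Insertion sort is easy to understand and fine for short lists; the built-in sort scales far better, and reasoning formally about that difference is exactly the kind of work this area does.

# Two ways to put student records in order by last name.
last_names = ["Nguyen", "Adams", "Zhang", "Baker"]

def insertion_sort(items):
    """Simple quadratic-time sort: insert each item into its proper place."""
    items = list(items)                      # work on a copy
    for i in range(1, len(items)):
        current = items[i]
        j = i - 1
        while j >= 0 and items[j] > current:
            items[j + 1] = items[j]          # shift larger names to the right
            j -= 1
        items[j + 1] = current
    return items

print(insertion_sort(last_names))   # ['Adams', 'Baker', 'Nguyen', 'Zhang']
print(sorted(last_names))           # the built-in sort gives the same order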


Human-Computer Interaction
The computer scientist working in human-computer interaction investigates how people use computers now and how people and computers can work together better in the future. This research is similar to graphic design. A graphic designer is a specialist who knows how the colors, fonts, arrangement of text, pictures, and other elements make a book, magazine, or advertisement easier for the viewer to understand. Now that computer interfaces are increasingly graphical, the same kinds of ideas are used, except that a computer is interactive. For example, many programs now have a "toolbar," which is a row of pictures that allows the user to select commonly used operations without navigating the entire menu of options. This kind of design innovation is a result of study in human-computer interaction.
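For a concrete (if bare-bones) picture of the toolbar idea, here is a minimal sketch using tkinter, the GUI toolkit in Python's standard library. The window, button names, and commands are invented placeholders, not taken from any particular program.

# A toolbar in miniature: common commands exposed as a row of buttons
# instead of being buried in menus. The commands here just print a message.
import tkinter as tk

root = tk.Tk()
root.title("Toolbar sketch")

toolbar = tk.Frame(root, bd=1, relief=tk.RAISED)
for name in ("New", "Open", "Save", "Print"):
    tk.Button(toolbar, text=name,
              command=lambda n=name: print(n, "clicked")).pack(side=tk.LEFT, padx=2, pady=2)
toolbar.pack(side=tk.TOP, fill=tk.X)

tk.Text(root).pack(fill=tk.BOTH, expand=True)   # the document area under the toolbar
root.mainloop()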


Information Management
A database in general usage is any organized collection of data. In computer science, a database specifically means a collection of data that is stored in a computer-readable form. Examples include an online book catalog at the library or the account information for each person who has a VISA card. The information management area is concerned with how databases are created, stored, accessed, shared, updated, and secured.
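As a small, concrete example of that lifecycle, the sketch below uses sqlite3, the database module in Python's standard library, to create a table, store a few rows, and query them back. The catalog data is invented for illustration.

# Create, store, and access: the basic database lifecycle in miniature.
import sqlite3

conn = sqlite3.connect(":memory:")   # an in-memory database, just for the example
conn.execute("CREATE TABLE books (title TEXT, author TEXT, year INTEGER)")
conn.executemany("INSERT INTO books VALUES (?, ?, ?)", [
    ("An Example Catalog Entry", "A. Author", 1995),
    ("Another Example Entry", "B. Author", 2003),
])
conn.commit()

for title, year in conn.execute("SELECT title, year FROM books ORDER BY year"):
    print(year, title)
conn.close()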


Computer Graphics
Computer graphics is the generation of images through computers. It includes simple text displays as well as images that appear to be in three dimensions. An important part of computer graphics is computer visualization, which attempts to pictorially display data in a way that is most understandable for the user. For instance, a visualization can allow surgeons to preview a surgical procedure before actually performing it. Other forms of visualization involve data that have no natural pictorial form. As a result, these must be displayed in a form that tells a story or makes a point. If you've seen a graph or chart generated with a computer program that seemed to have no clear meaning, you know why this area is important. Like many areas in computer science, computer visualization is as much human psychology as machine capability. The computer visualization expert asks, "How do we as human beings process visuals?"
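A tiny example of visualization, assuming the third-party matplotlib library is installed (it is not part of Python's standard library); the numbers are invented placeholders, there only to show a table of values becoming a picture:

# The same small table of numbers, shown as a bar chart instead of text.
import matplotlib.pyplot as plt

years = ["1981", "1990", "2000", "2010"]
values = [1, 24, 135, 350]   # illustrative values only

plt.bar(years, values)
plt.xlabel("Year")
plt.ylabel("Value (illustrative)")
plt.title("A table of numbers becomes a picture")
plt.show()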

As computer graphics become more advanced, they approach a point called virtual reality, in which graphics and sensory feedback are all-encompassing. This can be seen, for example, in a room in which every surface is covered with synchronized computer displays. It's important to note that virtual reality does not promise an experience indistinguishable from the "real world," although that may be a goal for some researchers. Rather, it is an experience in which the outside world is temporarily blocked from our senses.


Software Engineering
As previously discussed, a software engineer is involved with the entire process of a program's development, not just the programming. Software engineering is concerned with improving the process of making software. This means creating new processes for software engineers to follow, new techniques for project management, new methods to test software to ensure its quality, and new metrics for measuring how effective any of the other new ideas have been.


MYTHS OF COMPUTER SCIENCE


Computer Science Is All about Math
The kind and degree of math involved with computer science depends on what area one works in. Most programming involves math no more advanced than high school algebra, but some specialties require more. Someone who writes a mortgage interest calculator would need to understand financial calculations. Someone who writes a program to plot the trajectory of a satellite through space needs to understand trigonometry and calculus. Most programs, though, are built upon basic operations like addition and multiplication.
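For instance, the mortgage calculator mentioned above rests on one standard amortization formula: the monthly payment equals P * r * (1 + r)^n / ((1 + r)^n - 1), where P is the amount borrowed, r the monthly interest rate, and n the number of monthly payments. A minimal sketch in Python, with invented loan figures:

# Fixed-rate mortgage payment: ordinary algebra, nothing more advanced.
def monthly_payment(principal, annual_rate, years):
    r = annual_rate / 12                 # monthly interest rate
    n = years * 12                       # total number of payments
    growth = (1 + r) ** n
    return principal * r * growth / (growth - 1)

# A $200,000 loan at 6% for 30 years costs about $1,199.10 per month.
print(round(monthly_payment(200_000, 0.06, 30), 2))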


Men Are Better Suited to Computer Science than Women
Judging by the number of men and women working in the field, one could say that men as a group are more interested in computer science than women. But there's nothing to suggest that men are better at it. Women may have avoided computer science because of an aversion to math (which is probably caused by another myth) and because of media portrayals of computer scientists as socially awkward, pasty-faced "geeks." Computer science is a field that rewards excellence, regardless of gender or ethnicity, and all those interested should apply.


Computer Science Is for Geniuses
Genius never hurt anyone in the sciences, but having a high IQ and having a knack for programming and other computer science concepts are two different things. While the people at the top of any profession usually have extraordinary abilities (that's why they're at the top), plenty of "ordinary" people have excelled in this field.


Computer Security
Much sensitive data is stored on computers, including tax records, credit card bills, bank accounts, and medical histories. And with computers increasingly interconnected, it's become easier for data to be stolen. The old adage "A chain is only as strong as its weakest link" shows its truth here in the information age, where every computer is a link to another. So it's no surprise that computer security is a rapidly growing field. Computer security involves finding ways to protect data from unauthorized access. This includes installing software to limit intrusions to a network, instructing employees on safe habits, and analyzing the aftermath of a system break-in to learn how to prevent a recurrence. A related field is computer forensics, though in fact this is almost the reverse of computer security, since it involves breaking through security to retrieve partially deleted files. The purpose of this "break-in" is to obtain and analyze evidence to be used in a court trial.
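As one small, concrete example of protecting data from unauthorized access (only one of the many practices mentioned above), the sketch below stores a salted hash of a password rather than the password itself, using the hashlib, secrets, and hmac modules from Python's standard library:

# Store a salted hash of a password so a stolen database does not reveal it.
import hashlib
import hmac
import secrets

def hash_password(password):
    salt = secrets.token_bytes(16)                     # random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)      # constant-time comparison

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("a wrong guess", salt, digest))                 # False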

Source of Information :  Broadway-Computer Science Made Simple 2010

What Is Computing?

Computers have become a ubiquitous feature of modern life. It would be difficult to get through a day without some activity involving a computer, be it composing e-mail on a computer sitting on a desk, using a computer hidden inside a cell phone, or receiving a bill generated by a computer at the power company. Computer science allows all these activities to happen.

But what is computer science? It sounds simple enough: computer science is a branch of science that studies computers. But not everyone who works with computers is a computer scientist. The use and development of computers comprise a number of overlapping disciplines.

Before these disciplines are discussed, you need to understand a few terms.

A program is a series of steps to accomplish a given task. In general usage, a program might refer to everyday instructions, written in English, such as instructions to change a tire or register for a college class. In computer science, however, the term "program" refers to a series of steps given to a computer.
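A minimal illustration of that definition, written in Python; the task and figures are invented, but each line is one step the computer carries out in order:

# A program in this sense: a short series of steps given to the computer.
prices = [3.50, 12.25, 7.00]                  # step 1: start with the item prices
total = sum(prices)                           # step 2: add them up
total_with_tax = total * 1.08                 # step 3: apply an 8% tax (illustrative rate)
print(f"Total due: ${total_with_tax:.2f}")    # step 4: report the result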

A computer is an electronic device for performing logical and mathematical operations based on its programs. The term includes not only the obvious electronic devices that have a screen, keyboard, printer, and so on, but also computers that are embedded into devices like those at supermarket checkout counters or in DVD players. What makes computers interesting and powerful is that they can be given arbitrary sets of instructions to perform.

Hardware refers to all the physical devices that make up a computer system, both those inside the computer "case" and those outside the case, like monitor, keyboard, and mouse.

Software refers to the programs the computer executes. For example, the word processor Microsoft Word, or the computer game "Half-Life," is software, as is a program that enables a cell phone display so the user can select a new ring-tone. By analogy, when you play a movie in a DVD player, the movie is the software and the player is the hardware.

A programmer is someone who creates programs.

User refers to a person who uses a software program or computer.

Source of Information :  Broadway-Computer Science Made Simple 2010

Major Fields in Computing

You may have heard terms like "computer engineering," "computer science," "information systems," and "information technology" and wondered if they were synonymous. They are not, but they're related through the levels of abstraction. You need to understand these terms too.

Computer engineering focuses on the bottom levels of abstraction: hardware design and use. It is an extension of electrical engineering, covering the design and analysis of computer hardware. In fact, in a university, computer engineering is most often taught in the school of engineering. While a computer engineer understands programming, most of his or her focus is on hardware, not software.

Computer science, the subject of this book, is the systematic study of computing processes. It is concerned with the middle levels of abstraction, from hardware use to software use. Computer scientists work primarily in the development of software, either at the practical level (improving the speed at which a web page performs a search) or the theoretical (exploring the limits of computers' recognition of human speech). The hope is that every subject that begins as theoretical ends as practical. To better understand software design and development, the computer scientist must understand how a computer works, even if he or she can't create one, and thus is also an experienced computer user. In a university, computer science is most often taught in the school of science and mathematics.

Covering the top levels of abstraction is information systems, which is the study of how computing technology is used in business. Someone schooled in information systems has knowledge of business operations and programming, but is more interested in solving a business problem with existing solutions than in trying to invent novel solutions. In a university, information systems may be taught in the same department as computer science or it may be taught in the business school.

Finally, information technology is a broad term that does not refer to a particular field but instead covers all the levels of abstraction. While often restricted to mean the use of technology in business (similar to information systems), in general, the term encompasses the design, development, and implementation of computer hardware and software.

Source of Information :  Broadway-Computer Science Made Simple 2010
 