Computer History (What's a Computer?)


The discipline of Computer Science emerged in the 1940s, with the merging of
algorithm theory and mathematical logic and the invention of the electronic
computer with stored-program capability. It was Alan Turing and Kurt Gödel
who, in the 1930s, succeeded in integrating algorithms, logic, and
mathematical calculation and realizing them in an instrument or a system of
rules. The principle of the algorithm they used came from Ada Lovelace,
who had developed it nearly a century earlier.

The inventor of the algorithm itself is recorded in early history as a man
named Abu Abdullah Muhammad Ibn Musa al-Khwarizmi. Al-Khwarizmi was a
mathematician from Khwarezm, in present-day Uzbekistan, who lived in
the years 770-840 AD. In Western literature he is better known by the
title Algorizm, and the word algorithm is derived from this title.
The analog computer, meanwhile, was invented by Vannevar Bush in the 1920s,
followed by the electronic computers developed by Howard Aiken
and Konrad Zuse in the 1930s.

Then, in 1945, John von Neumann demonstrated a phenomenal work:
a computer architecture known as the "von Neumann machine,"
in which programs are stored in memory. This architecture
has been used by modern computers ever since.

The 1960s opened a new chapter in the formalization of Computer Science.
Computer Science departments began to spring up at universities. This new
discipline came to be known as Computer Science, Computer Engineering,
Computing, or Informatics.

Definition

Along with the development of Computer Science, many researchers today
attempt to study and define the field. Its foundations, however, remain
mathematics and engineering: mathematics contributes methods of analysis,
and engineering contributes methods of design.

Denning, in his well-known paper, defines the discipline of computer
science [1]. That paper is the final report of the project and task force
on the Core of Computer Science established by the two largest scientific
societies in the field of computing, the ACM [4] (http://acm.org) and the
IEEE Computer Society [5] (http://computer.org).

Several other, more abstract definitions are:

Computer Science is the systematic study of algorithmic processes that
describe and transform information: their theory, analysis, design,
efficiency, implementation, and application. The fundamental question
underlying Computer Science is, "What can be efficiently automated?"

Misconceptions About Computer Science

Some misguided perceptions of Computer Science can be summarized as
follows:

Computer science is the study of computers. In fact, Computer Science
does not merely study computers, just as astronomy is not the
science of telescopes, nor biology the science of microscopes.
Computers, telescopes, and microscopes are instruments of science,
not the science itself.

Computer science is the study of how to write computer programs.

Computer science is the study of the uses of computer applications.

The word "Computer"


Over the years there have been several quite different meanings of the word "computer," and several different words for the thing we now usually call a computer.

For example "computer" are generally never used to mean hiring people to perform arithmetic calculations, with or without engine assist. According to the Barnhart Concise Dictionary of Etymology, the word is used in English in 1646 as the word for "people who count" and then before 1897 is also to "mechanical calculators. During World War II, the word refers to the workers the U.S. and British women whose jobs take a big war artillery road with such machines.

Charles Babbage designed one of the first calculating machines, called the Analytical Engine, but because of technological problems it was not built in his lifetime. Various simple mechanical devices, such as the slide rule, were also called computers. In some cases they were referred to as "analog computers," since they represent numbers as continuous physical quantities rather than as discrete binary digits. What we now simply call "computers" were once generally called "digital computers" to distinguish them from these other devices (which are still used in analog signal processing, for example).

In thinking of other words for the computer, it is worth observing that the words chosen in other languages do not always have the same literal meaning as the English word. In French, for example, the word is "ordinateur," which means roughly "organizer" or "sorting machine." Spanish uses "ordenador," with the same meaning, although in some countries the anglicism "computadora" is used. In Italian, a computer is a "calcolatore," a calculator, emphasizing its computational uses over logical ones such as sorting. In Swedish, a computer is called a "dator," from "data"; until at least the 1950s they were called "matematikmaskin" (mathematics machine). In Chinese, a computer is called "diàn nǎo," literally "electric brain." In English, other words and phrases have also been used, such as "data processing machine."

How Computers Work


Although the technology used in digital computers has changed dramatically since the first computers of the 1940s (see the history of computing hardware for more detail), most computers still use the von Neumann architecture, proposed in 1945 by John von Neumann.

The von Neumann architecture describes a computer with four main sections: the arithmetic and logic unit (ALU), the control unit, the memory, and the input and output devices (collectively called I/O). These parts are interconnected by bundles of wires called "buses."

In this system, memory is a sequence of numbered cells (or "pigeonholes"), each containing a small piece of information. The information may be an instruction telling the computer what to do, or it may be data the computer needs in order to carry out an instruction. Any cell may contain either, and what is data at one moment may become instructions at another.
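
To make the stored-program idea concrete, here is a minimal sketch of a toy von Neumann-style machine in Python. The opcodes (LOAD, ADD, STORE, HALT) are invented for illustration and belong to no real instruction set; the point is only that instructions and data sit side by side in one memory:

    # A toy stored-program machine: instructions and data share one memory.
    def run(memory):
        acc = 0  # accumulator register
        pc = 0   # program counter: address of the next instruction
        while True:
            op, *args = memory[pc]  # fetch and decode the next instruction
            pc += 1
            if op == "LOAD":        # copy a memory cell into the accumulator
                acc = memory[args[0]]
            elif op == "ADD":       # add a memory cell to the accumulator
                acc += memory[args[0]]
            elif op == "STORE":     # write the accumulator back into memory
                memory[args[0]] = acc
            elif op == "HALT":
                return memory

    # Cells 0-3 hold the program; cells 4-6 hold the data it works on.
    memory = [
        ("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT",),
        2, 3, 0,
    ]
    print(run(memory)[6])  # prints 5: the cell that held 0 now holds 2 + 3

Nothing in the machine itself marks cells 0-3 as program and cells 4-6 as data; that is exactly the property the paragraph above describes.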

Memory stores all of these forms of information as binary numbers. Information that is not yet in binary form is encoded (translated) by some set of rules into a sequence of digits. For example, the letter F is stored as the decimal number 70 (in binary digits, 1000110) under one method of encoding. More complex schemes can be used to store images, sound, video, and many other kinds of information. The amount of information that can be stored in one cell is called a byte.
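
The letter-F example is easy to verify; a short Python sketch:

    # Under the ASCII encoding, the letter F and the number 70 coincide.
    print(ord("F"))             # 70   -- the character's numeric code
    print(bin(ord("F")))        # 0b1000110 -- the same value in binary digits
    print(chr(70))              # F    -- decoding goes the other way
    print("F".encode("ascii"))  # b'F' -- one byte of storage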

In general, memory can be rewritten millions of times; it can be thought of as a blackboard and chalk that can be written on and erased again, rather than as a notepad written in ink that cannot be erased.

The size of each cell, and the number of cells, varies greatly from computer to computer, and the technologies used to build memory have changed enormously: from electromechanical relays, to tubes filled with mercury (and later springs) in which acoustic pulses were formed, to matrices of permanent magnets, to individual transistors, to integrated circuits with millions of transistors on a single silicon chip.

Embedded Computers


Over roughly the last 20 years, many household devices, most notably video game consoles but also cell phones, video cassette recorders, PDAs, and many other household, industrial, automotive, and electronic appliances, have come to contain electronic circuits that qualify as Turing-complete computers in the sense above (with the note that the programs of these devices are often written directly into a ROM chip, which would have to be replaced in order to change the machine's program). These special-purpose computers are generally known as "microcontrollers" or "embedded computers." Many people therefore restrict the definition of a computer to machines whose main purpose is processing information, rather than forming part of a larger system such as a telephone, microwave oven, or airplane, and which can be adapted to various purposes by their users without physical modification. Mainframe computers, minicomputers, and personal computers (PCs) are the main types of computers that fit this definition.

Computers


Computers are tools used to process data according to procedures that have been formulated. The word "computer" was originally used to describe people whose work was performing arithmetic calculations, with or without mechanical aids, but the meaning of the word was later transferred to the machines themselves. Originally, the processing of information was related almost exclusively to arithmetical problems, but modern computers are used for many tasks unrelated to mathematics.

Under such a definition, devices like the slide rule, mechanical calculators of all types from the abacus onward, and all contemporary electronic computers would count as computers. A better term for this broad sense of "computer" is "information processor" or "information-processing system."

Nevertheless, the definition above includes many special-purpose devices that can perform only one or a few functions. When we consider modern computers, their most important property, the one that distinguishes them from earlier calculating devices, is that, given the right programming, any computer can emulate the behavior of any other (though perhaps limited by storage capacity and differing speeds); indeed, it is believed that today's machines can emulate any computing device we will create in the future (though undoubtedly more slowly). In a sense, this limit is a useful test for recognizing "general-purpose" computers as distinct from the earlier special-purpose devices. The definition of "general purpose" can be formalized as the requirement that a machine be able to emulate a universal Turing machine. Machines meeting this definition are known as Turing-complete, and the first of them appeared in the mid-1940s amid development efforts around the world. See the article on the history of computing for more details of this period.

Computer parts


A computer consists of two major parts: software and hardware.

Hardware

* Processor, or CPU, the unit that processes data
* RAM memory, which stores data temporarily
* Hard drive, a semi-permanent storage medium
* Input devices, the media used to enter data for the CPU to process, such as mice, keyboards, and tablets
* Output devices, the media used to display the results of the CPU's processing, such as monitors and printers

Software

*Operating System
The basic program on a computer that connects the user with the computer hardware. Commonly used operating systems include Linux, Windows, and Mac OS. The operating system's tasks include (but are not limited to) controlling the execution of the programs that run on it, coordinating input, output, and processing, managing memory, and installing software. A brief illustration follows this list.

*Application programs
Additional applications installed to suit the operating system
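
As a small illustration of this division of labor, a program does not drive the hardware itself; it asks the operating system for services. The Python sketch below queries the OS for a few facts and asks it to start another program ("echo" is assumed to be available, as on any Unix-like system):

    # Programs obtain hardware access and process control as OS services.
    import os
    import subprocess

    print(os.getpid())     # the process ID the OS assigned to this program
    print(os.cpu_count())  # how many CPUs the OS reports as available

    # Ask the OS to execute another program and collect its output.
    result = subprocess.run(["echo", "hello from a child process"],
                            capture_output=True, text=True)
    print(result.stdout, end="")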

Understanding Computers


Today, computers are increasingly sophisticated. But computers were not always as small, sophisticated, and light as they are now. In the history of computers, five generations are commonly distinguished.
