A general history of the computer


When did people start using numbers?

Human use of numbers and calculation is the basic prerequisite for the computer being developed at all. We do not know exactly when humans started counting, but the earliest method we know of was to carve notches into sticks or stones to record amounts. In 1960, on the border between Congo and Uganda, a bone was found with notches that are believed to represent numbers. The bone is about 20,000 years old and is named the Ishango bone after the people who lived in the area at that time.

The most famous calculating tool from ancient times is the abacus of the Romans. The small counting stones were called calculi in Latin, and this is where the verb 'to calculate' comes from. The abacus spread along trade routes to the East, where the Chinese and Japanese developed it further and invented the bead frame, on which calculations were made with wooden beads sliding on rods.

In ancient times, however, the Greeks also had sophisticated astronomical instruments, such as the Antikythera mechanism, which could track the orbits of the moon and the sun through mechanical gear wheels over 2,000 years ago. It was probably used for ship navigation and for calculating the cycle of years between the Olympic Games.

The idea of the number 0 did not arise until around the year 600. It was Indian mathematicians who started using zero in their calculations, along with negative numbers to represent debt. The invention of zero made it possible to calculate with very large numbers, and it is a prerequisite for the logical algebra that underlies the mathematics we use today.

What does the word 'computer' mean?

The word computer comes from the Latin verb computare, which means to calculate something mathematically. According to the Great Danish Encyclopedia, 'computer' was originally the designation for a person who makes calculations. In the 19th century, the word began to be used for the mechanical machines that inventors designed to perform mathematical calculations.

The first calculators

It is hard to say what the first computer was. As early as the 16th and 17th centuries, astronomers and astrologers began to construct calculating machines that could help them perform mathematical calculations, for example when ships navigated by the sun, the stars and the moon. One example is the German astronomer and mathematician Wilhelm Schickard, who built a mechanical calculator that could add, subtract, divide and multiply. The only information we have about this machine comes from letters that Schickard wrote to his colleague, the astronomer Johannes Kepler, in 1623.

It was, however, the then only 19-year-old Frenchman Blaise Pascal (1623-1662) who in 1642 began to develop the principles behind what would later become the first calculating machine. Pascal was the son of a tax collector and wanted to invent a machine that would ease his father's workload and make tax calculations more reliable. After three years of work, Pascal arrived at a prototype he was satisfied with, and in 1649 he was granted a royal privilege as France's only manufacturer of calculating machines.

Despite this, Pascal's invention never became widespread, as it was difficult and expensive to manufacture with the mechanical technology of the time.

Throughout the 1700s, the need for complicated but reliable mathematical calculations grew. The need was driven partly by expanding state administration tasks, such as more complicated tax collection and population statistics, and partly by new engineering work in construction and astronomy. Professionals often relied on more than 100 books of tables to assist them in their work.

In order to meet this demand and reduce the number of errors in state administration, the British government in 1822 awarded the English mathematician Charles Babbage (1791-1871) a grant for research into automated calculation. Babbage went on to develop the first design for a machine that in many ways resembles today's computer. He dubbed his invention the Analytical Engine. Like our computers, it had a control unit and a calculating unit, called the mill, as well as a memory unit, called the store. The numbers to be processed were represented as holes in punched cards, while the result, or output, was delivered by a curve plotter or a bell. Although Charles Babbage worked on his machine for almost 50 years, he never fully managed to translate his theories into practice.

Babbage's prototypes and articles were, however, enough to inspire another English figure, the noblewoman Ada Lovelace (1815-1852). The 19-year-old Lovelace met Babbage in 1833 at a demonstration of a prototype of his machine. Lovelace, who had worked with mathematics since her childhood, was deeply fascinated by the machine and its potential. She therefore began a lifelong collaboration with Babbage on the development of analytical machines. In one of her articles, Lovelace described how coding could be applied to the analytical machine so that, in addition to numbers, it would be able to handle and process symbols and letters. This was, in effect, a form of computer program, and Ada Lovelace can therefore be described as the world's first computer programmer. She was also visionary: where Charles Babbage saw only mathematical potential in his machine, Lovelace predicted that future analytical machines would be able to translate and process things such as music, images and text in digital form, something that only became reality more than 100 years after her death.

While Babbage and Lovelace worked on the analytical machine, the first series-produced calculating machines began to appear under the name arithmometer. These new machines did not have the same degree of complexity and development potential as Babbage's model, but they nevertheless formed the basis for far more efficient calculation in companies and state administrations.

What was the punched card machine?

When Charles Babbage designed his analytical machine, punched cards were central both when the machine was to be fed with data to be processed and when the results were to be read out. The punched card technique that Babbage used in his model was inspired by the French inventor Joseph Marie Jacquard, who had developed the technique around 1800 in the construction of a mechanical loom. On Jacquard's loom, the weaver activated chains of punched cards via a pedal, which controlled which warp threads should be lifted on the loom. In this way the weaving of fabric was mechanized, and when the loom had to weave intricate patterns, it avoided the human errors that could easily arise if the weaver, for example, forgot to lift a warp thread.

Punched card machines were developed further in the latter half of the 1800s. In the United States, such a machine was introduced for the census of 1890. It was Herman Hollerith who developed this punched card system for the occasion. All information about an individual citizen was represented by holes in specific positions on a card, and the machine could then detect the holes at the different positions. The machine read the holes electrically: current passed through a hole and activated a counting lever. In this way, information about different population groups could be sorted quickly. Herman Hollerith founded the Tabulating Machine Company, which later developed into IBM, one of the most significant companies in the history of the computer. The punched card technique was also used in the first electronic computers, before keyboards and screens were used to input and read data.
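To make the principle concrete, here is a minimal sketch in Python. The card layout and names are a hypothetical illustration of our own, not Hollerith's actual 1890 format: each attribute gets a fixed position on the card, a hole at that position means the attribute applies, and tabulation counts which positions are punched.

```python
# Hypothetical card layout (illustrative, not Hollerith's real format):
# each position on the card stands for one attribute of a citizen.
POSITIONS = {0: "male", 1: "female", 2: "married", 3: "farmer", 4: "city dweller"}

def punch_card(attributes):
    """Encode a citizen as a card: the set of positions that are punched."""
    return {pos for pos, meaning in POSITIONS.items() if meaning in attributes}

def tabulate(cards):
    """Count how many cards are punched at each position, like the
    counting levers advanced by current passing through a hole."""
    counts = {meaning: 0 for meaning in POSITIONS.values()}
    for card in cards:
        for pos in card:              # 'current flows' only where there is a hole
            counts[POSITIONS[pos]] += 1
    return counts

cards = [
    punch_card({"female", "married", "farmer"}),
    punch_card({"male", "city dweller"}),
    punch_card({"female", "city dweller"}),
]
print(tabulate(cards))
# {'male': 1, 'female': 2, 'married': 1, 'farmer': 1, 'city dweller': 2}
```

In this sketch a card is nothing but the set of its punched positions; the tabulating loop plays the role of the electric current that only flows where there is a hole.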

What is logical algebra?

The programming of a computer is based on the binary number system, a number system built solely on the two digits 1 and 0. According to Datamuseum.dk, it was the German mathematician Gottfried Leibniz who wrote a treatise on this system in 1703. Instead of the base-10 system, he used a base-2 system and represented all numbers using only combinations of the digits 1 and 0. His system, however, was complicated, and it was not used in practice until the English mathematician George Boole in the 1850s laid down some concrete rules: he let 0 represent the logical value false and 1 the value true, based on the logic that every logical statement can only be false or true. These rules are known as logical algebra (see sources).

In the 1930s, the American Claude Shannon built on Boole's logic and demonstrated that electrical circuits could be described with Boole's symbolic logic. The currents in an electrical circuit can signal 1 and 0: if current flows in a circuit, it signals the digit 1, and if no current flows, it signals the digit 0. According to Datamuseum.dk, this logic is central in computer science because a computer's components have only two states:

· a switch is on or off
· a wire is conducting or non-conducting
· a field on a punched card is punched or not punched
· a field on a computer disk is magnetized one way or the other

Based on this logic, the English mathematician Alan Turing developed a universal model for how all computers work: the 'Turing machine'. According to the article "Turing transformed the computer from human to machine", the model is not a specific machine but describes how a computer can be programmed from combinations of the digits 1 and 0. The Turing machine thus has nothing to do with the physical components of the computer; it is an abstract model of a computer program.
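As an illustration, here is a minimal sketch in Python (our own example, not taken from the article's sources). The first lines show Boole's two values in action; the rest simulates a tiny Turing machine whose rule table, built only from the symbols 0 and 1, adds 1 to a binary number.

```python
# Boole's two values in Python: the integers 1 and 0 double as true and false.
assert (1 and 1) == 1   # true AND true  -> true
assert (1 and 0) == 0   # true AND false -> false
assert (1 or 0) == 1    # true OR false  -> true

# A minimal Turing machine that adds 1 to a binary number (illustrative names).
# The rule table maps (state, symbol read) -> (symbol to write, head move, next state).
RULES = {
    ("carry", "1"): ("0", -1, "carry"),  # 1 plus a carry gives 0, carry moves left
    ("carry", "0"): ("1", -1, "done"),   # 0 plus a carry gives 1, no more carry
    ("carry", " "): ("1", -1, "done"),   # past the leftmost digit: write a new 1
}

def run(tape):
    """Simulate the machine on a tape holding a binary number."""
    cells = list(tape)
    head = len(cells) - 1            # start at the least significant digit
    state = "carry"
    while state != "done":
        symbol = cells[head] if head >= 0 else " "
        write, move, state = RULES[(state, symbol)]
        if head < 0:                 # extend the tape to the left when needed
            cells.insert(0, write)
            head = 0
        else:
            cells[head] = write
        head += move
    return "".join(cells)

print(run("1011"))  # -> 1100  (11 + 1 = 12)
print(run("111"))   # -> 1000  ( 7 + 1 =  8)
```

The point of the sketch is that everything the machine 'knows' sits in the rule table: swap the table and the same loop computes something else, which is what makes Turing's abstract model universal.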
