abonnement Unibet Coolblue Bitvavo
  Thursday 21 August 2003 @ 08:51:44 #1
39225 Pink_Panther
Campy damn beast
pi_12596902
Hi,

This central topic has been set up specifically to keep the flood of computer topics of the past while somewhat in check. From now on, computer news items go in this central topic. This also means that flaming about computers is not welcome here. I hope everyone can live with that. There is now one place where computer articles can be posted without snowing the rest of the topic list under.

Pink
  Thursday 21 August 2003 @ 08:53:02 #2
3288 MikeyMo
jou are een essol!
pi_12596920
damn, come on now. At least I still meant my topic seriously

is there a central linux/unix topic yet?

[b]On Friday 7 November 2008 08:54 santax wrote the following:[/b]
[..]
Glad there are still people here I can actually identify with.
You, mister MikeyMo, are my new FOK! hero _O_
pi_12596928
The monitor is that device you sit staring at all the time.
  Thursday 21 August 2003 @ 08:54:25 #4
39225 Pink_Panther
Campy damn beast
pi_12596934
quote:
On Thursday 21 August 2003 08:53 MarkyB wrote the following:
The monitor is that device you sit staring at all the time.
Thank you!
See, this is how we'll get somewhere.
Pink
pi_12596943

This is a computer game!

I don't suffer from insanity; I enjoy every minute of it!
pi_12596958
The History of the Computer

The earliest computing instrument is the abacus, which was first used 2,000 years ago. It is a simple wooden rack holding parallel wires on which beads are strung. Moving these beads along the wires lets the user carry out ordinary arithmetic operations.

In 1642 Blaise Pascal built the first "digital calculating machine". It could add numbers that were entered into the machine using dials. Pascal initially built it to help his father, who was a tax collector. In 1671 Gottfried Wilhelm von Leibniz invented a machine that could add and multiply. Multiplication was done by successive adding and shifting. This machine was built in 1694 and used a special "stepped gear" mechanism (which is still in use) for introducing the addend digits. The prototypes built by Pascal and Leibniz were not widely used; they remained curiosities until more than a century later, when Thomas of Colmar (Charles Xavier Thomas) developed the first commercially successful mechanical calculator in 1820. This calculator was capable of adding, subtracting, multiplying and dividing. By 1890 the calculator had developed into an apparatus that could accumulate results, store them and print them.
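Leibniz's "successive adding and shifting" is still the principle behind hardware multipliers today. A minimal sketch in Python — binary rather than his decimal, purely for brevity:

```python
def shift_and_add_multiply(a: int, b: int) -> int:
    """Multiply two non-negative integers by successive adding and
    shifting -- the principle of Leibniz's stepped reckoner, shown
    here in binary rather than his decimal."""
    product = 0
    while b:
        if b & 1:             # lowest digit of the multiplier set?
            product += a      # add the shifted multiplicand
        a <<= 1               # shift the multiplicand one place
        b >>= 1               # consume one multiplier digit
    return product

print(shift_and_add_multiply(127, 43))  # → 5461
```

Each digit of the multiplier either adds the (shifted) multiplicand or skips it — no digit-by-digit multiplication table is ever needed.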

Charles Babbage realized (1812) that many long computations, especially those needed to prepare mathematical tables, consisted of routine operations that were regularly repeated; from this he surmised that it ought to be possible to do these operations automatically. He began to design an automatic mechanical calculating machine, which he called a "difference engine," and by 1822 he had built a small working model for demonstration. With financial help from the British government, Babbage started construction of a full-scale difference engine in 1823. It was intended to be steam-powered; fully automatic, even to the printing of the resulting tables; and commanded by a fixed instruction program.
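The "routine operations that were regularly repeated" are additions of finite differences: given a polynomial's starting value and its differences, a whole table can be produced using nothing but addition. A sketch of the principle (the function name is ours, and x² + x + 41 is merely a convenient example polynomial, often mentioned in connection with Babbage's demonstrations):

```python
def tabulate(initials, n):
    """Tabulate a polynomial by repeated addition of finite
    differences -- the principle of the difference engine.
    `initials` holds the starting value followed by its successive
    differences; only addition is ever performed."""
    col = list(initials)
    values = []
    for _ in range(n):
        values.append(col[0])
        for i in range(len(col) - 1):
            col[i] += col[i + 1]   # fold each difference upward
    return values

# x**2 + x + 41 at x = 0 is 41; its first difference there is 2,
# and its second difference is constantly 2.
print(tabulate([41, 2, 2], 5))  # → [41, 43, 47, 53, 61]
```

A degree-d polynomial has a constant d-th difference, so the engine needs only d + 1 registers and a fixed cycle of additions per table entry.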

The difference engine, although of limited flexibility and applicability, was conceptually a great advance. Babbage continued work on it for 10 years, but in 1833 he lost interest because he had a "better idea" --the construction of what today would be described as a general-purpose, fully program-controlled, automatic mechanical digital computer. Babbage called his machine an "analytical engine"; the characteristics aimed at by this design show true prescience, although this could not be fully appreciated until more than a century later. The plans for the analytical engine specified a parallel decimal computer operating on numbers (words) of 50 decimal digits and provided with a storage capacity (memory) of 1,000 such numbers. Built-in operations were to include everything that a modern general-purpose computer would need, even the all-important "conditional control transfer" capability, which would allow instructions to be executed in any order, not just in numerical sequence. The analytical engine was to use a punched card (similar to that used on a Jacquard loom), which was to be read into the machine from any of several reading stations. The machine was designed to operate automatically, by steam power, and it would require only one attendant.

Babbage's computers were never completed. Various reasons are advanced for his failure, most frequently the lack of precision machining techniques at the time. Another conjecture is that Babbage was working on the solution of a problem that few people in 1840 urgently needed to solve.

After Babbage there was a temporary loss of interest in automatic digital computers. Between 1850 and 1900 great advances were made in mathematical physics, and it came to be understood that most observable dynamic phenomena can be characterized by differential equations, so that ready means for their solution and for the solution of other problems of calculus would be helpful. Moreover, from a practical standpoint, the availability of steam power caused manufacturing, transportation, and commerce to thrive and led to a period of great engineering achievement. The designing of railroads and the construction of steamships, textile mills, and bridges required differential calculus to determine such quantities as centers of gravity, centers of buoyancy, moments of inertia, and stress distributions; even the evaluation of the power output of a steam engine required practical mathematical integration. A strong need thus developed for a machine that could rapidly perform many repetitive calculations.

A step toward automated computation was the introduction of punched cards, which were first successfully used in connection with computing in 1890 by Herman Hollerith and James Powers, working for the U.S. Census Bureau. They developed devices that could automatically read the information that had been punched into cards, without human intermediation. Reading errors were consequently greatly reduced, work flow was increased, and, more important, stacks of punched cards could be used as an accessible memory store of almost unlimited capacity; furthermore, different problems could be stored on different batches of cards and worked on as needed.

These advantages were noted by commercial interests and soon led to the development of improved punch-card business-machine systems by International Business Machines (IBM), Remington-Rand, Burroughs, and other corporations. These systems used electromechanical devices, in which electrical power provided mechanical motion--such as for turning the wheels of an adding machine. Such systems soon included features to feed in automatically a specified number of cards from a "read-in" station; perform such operations as addition, multiplication, and sorting; and feed out cards punched with results. By modern standards the punched-card machines were slow, typically processing from 50 to 250 cards per minute, with each card holding up to 80 decimal numbers. At the time, however, punched cards were an enormous step forward.

By the late 1930s punched-card machine techniques had become well established and reliable, and several research groups strove to build automatic digital computers. One promising machine, constructed of standard electromechanical parts, was built by an IBM team led by Howard Hathaway Aiken. Aiken's machine, called the Harvard Mark I, handled 23-decimal-place numbers (words) and could perform all four arithmetic operations. Moreover, it had special built-in programs, or subroutines, to handle logarithms and trigonometric functions. The Mark I was originally controlled from prepunched paper tape without provision for reversal, so that automatic "transfer of control" instructions could not be programmed. Output was by card punch and electric typewriter. Although the Mark I used IBM rotating counter wheels as key components in addition to electromagnetic relays, the machine was classified as a relay computer. It was slow, requiring 3 to 5 seconds for a multiplication, but it was fully automatic and could complete long computations. Mark I was the first of a series of computers designed and built under Aiken's direction.


Electronic Digital Computers

The outbreak of World War II produced a desperate need for computing capability, especially for the military. New weapons systems were produced for which trajectory tables and other essential data were lacking. In 1942, J. Presper Eckert, John W. Mauchly, and their associates at the Moore School of Electrical Engineering of the University of Pennsylvania decided to build a high-speed electronic computer to do the job. This machine became known as ENIAC, for Electronic Numerical Integrator and Computer (or Calculator). The size of its numerical word was 10 decimal digits, and it could multiply two such numbers at the rate of 300 products per second, by finding the value of each product from a multiplication table stored in its memory. Although difficult to operate, ENIAC was still many times faster than the previous generation of relay computers.

ENIAC used 18,000 standard vacuum tubes, occupied 167.3 sq m (1,800 sq ft) of floor space, and consumed about 180,000 watts of electrical power. It had punched-card input and output and arithmetically had 1 multiplier, 1 divider-square rooter, and 20 adders employing decimal "ring counters," which served as adders and also as quick-access (0.0002 seconds) read-write register storage. The executable instructions composing a program were embodied in the separate units of ENIAC, which were plugged together to form a route through the machine for the flow of computations. These connections had to be redone for each different problem, together with presetting function tables and switches. This "wire-your-own" instruction technique was inconvenient, and only with some license could ENIAC be considered programmable; it was, however, efficient in handling the particular programs for which it had been designed. ENIAC is generally acknowledged to be the first successful high-speed electronic digital computer (EDC) and was productively used from 1946 to 1955. A controversy developed in 1971, however, over the patentability of ENIAC's basic digital concepts, the claim being made that another U.S. physicist, John V. Atanasoff, had already used the same ideas in a simpler vacuum-tube device he built in the 1930s at Iowa State College. In 1973 the court found in favor of the company using the Atanasoff claim.

Intrigued by the success of ENIAC, the mathematician John von Neumann undertook (1945) a theoretical study of computation that demonstrated that a computer could have a very simple, fixed physical structure and yet be able to execute any kind of computation effectively by means of proper programmed control without the need for any changes in hardware. Von Neumann contributed a new understanding of how practical fast computers should be organized and built; these ideas, often referred to as the stored-program technique, became fundamental for future generations of high-speed digital computers.

The stored-program technique involves many features of computer design and function besides the one named; in combination, these features make very-high-speed operation feasible. Details cannot be given here, but a glimpse may be provided by considering what 1,000 arithmetic operations per second implies. If each instruction in a job program were used only once in consecutive order, no human programmer could generate enough instructions to keep the computer busy. Arrangements must be made, therefore, for parts of the job program called subroutines to be used repeatedly in a manner that depends on how the computation progresses. Also, it would clearly be helpful if instructions could be altered as needed during a computation to make them behave differently. Von Neumann met these two needs by providing a special type of machine instruction called conditional control transfer--which permitted the program sequence to be interrupted and reinitiated at any point--and by storing all instruction programs together with data in the same memory unit, so that, when desired, instructions could be arithmetically modified in the same way as data.
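Both ideas can be seen in a toy stored-program machine: instructions and data occupy one and the same memory, and a conditional jump lets the same instructions run repeatedly. The opcodes below are invented for illustration — they are not von Neumann's actual instruction set:

```python
def run(mem):
    """Execute a program that lives in the same memory as its data.
    Instruction cells are tuples (op, a, b); data cells are plain
    integers, addressed by position in the very same list."""
    pc = 0
    while True:
        op, a, b = mem[pc]
        if op == "ADD":                # mem[a] <- mem[a] + mem[b]
            mem[a] += mem[b]
        elif op == "JNZ" and mem[a]:   # conditional control transfer:
            pc = b                     # jump to b if mem[a] is non-zero
            continue
        elif op == "HLT":
            return mem
        pc += 1

mem = [
    ("ADD", 5, 6),   # 0: total   += counter
    ("ADD", 6, 4),   # 1: counter += -1
    ("JNZ", 6, 0),   # 2: loop while the counter is non-zero
    ("HLT", 0, 0),   # 3: stop
    -1,              # 4: constant -1
    0,               # 5: total
    5,               # 6: counter
]
print(run(mem)[5])   # 5 + 4 + 3 + 2 + 1 → 15
```

Without the JNZ instruction, the loop body would have to be written out once per iteration; with it, a four-instruction program does arbitrarily long computations. And because instructions sit in ordinary memory cells, a program could in principle overwrite them arithmetically, just as the text describes.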

As a result of these techniques and several others, computing and programming became faster, more flexible, and more efficient, with the instructions in subroutines performing far more computational work. Frequently used subroutines did not have to be reprogrammed for each new problem but could be kept intact in "libraries" and read into memory when needed. Thus, much of a given program could be assembled from the subroutine library. The all-purpose computer memory became the assembly place in which parts of a long computation were stored, worked on piecewise, and assembled to form the final results. The computer control served as an errand runner for the overall process. As soon as the advantages of these techniques became clear, the techniques became standard practice.

The first generation of modern programmed electronic computers to take advantage of these improvements appeared in 1947. This group included computers using random access memory (RAM), which is a memory designed to give almost constant access to any particular piece of information. These machines had punched-card or punched-tape input and output devices and RAMs of 1,000-word capacity with an access time of 0.5 microseconds (0.5 × 10⁻⁶ seconds); some of them could perform multiplications in 2 to 4 microseconds. Physically, they were much more compact than ENIAC: some were about the size of a grand piano and required 2,500 small electron tubes, far fewer than required by the earlier machines. The first-generation stored-program computers required considerable maintenance, attained perhaps 70 percent to 80 percent reliable operation, and were used for 8 to 12 years. Typically, they were programmed directly in machine language, although by the mid-1950s progress had been made in several aspects of advanced programming. This group of machines included EDVAC and UNIVAC (see UNIVAC), the first commercially available computers.

Early in the 1950s two important engineering discoveries changed the image of the electronic-computer field, from one of fast but often unreliable hardware to an image of relatively high reliability and even greater capability. These discoveries were the magnetic-core memory and the transistor-circuit element.

These new technical discoveries rapidly found their way into new models of digital computers; RAM capacities increased from 8,000 to 64,000 words in commercially available machines by the early 1960s, with access times of 2 or 3 microseconds. These machines were very expensive to purchase or to rent and were especially expensive to operate because of the cost of expanding the programming. Such computers were typically found in large computer centers--operated by industry, government, and private laboratories--staffed with many programmers and support personnel. This situation led to modes of operation enabling the sharing of the high capability available; one such mode is batch processing, in which problems are prepared and then held ready for computation on a relatively inexpensive storage medium, such as magnetic drums, magnetic-disk packs, or magnetic tapes. When the computer finishes with a problem, it typically "dumps" the whole problem--program and results--on one of these peripheral storage units and takes in a new problem. Another mode of use for fast, powerful machines is called TIME-SHARING. In time-sharing the computer processes many waiting jobs in such rapid succession that each job progresses as quickly as if the other jobs did not exist, thus keeping each customer satisfied. Such operating modes require elaborate "executive" programs to attend to the administration of the various tasks.
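The time-sharing mode can be sketched as a round-robin "executive": each job gets one time slice per pass, and unfinished jobs rejoin the back of the queue. Generators stand in for programs here; the scheme is illustrative, not any historical system's actual executive:

```python
from collections import deque

def time_share(jobs):
    """Round-robin executive: cycle through the queue, giving each
    waiting job one time slice per pass, so every job keeps making
    steady progress."""
    queue, finished = deque(jobs), []
    while queue:
        job = queue.popleft()
        try:
            next(job)            # one time slice for this job
            queue.append(job)    # not done yet: back of the queue
        except StopIteration as done:
            finished.append(done.value)
    return finished

def job(name, slices):
    """A 'program' that needs the processor for `slices` slices."""
    for _ in range(slices):
        yield                    # relinquish the processor
    return name + " done"

print(time_share([job("A", 3), job("B", 1), job("C", 2)]))
# → ['B done', 'C done', 'A done']
```

Short jobs finish early without waiting for long ones to complete, which is exactly why each customer of a time-shared machine felt as if the other jobs did not exist.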

In the 1960s efforts to design and develop the fastest possible computers with the greatest capacity reached a turning point with the completion of the LARC machine for Livermore Radiation Laboratories of the University of California by the Sperry-Rand Corporation, and the Stretch computer by IBM. The LARC had a core memory of 98,000 words and multiplied in 10 microseconds. Stretch was provided with several ranks of memory having slower access for the ranks of greater capacity, the fastest access time being less than 1 microsecond and the total capacity in the vicinity of 100 million words.

During this period the major computer manufacturers began to offer a range of computer capabilities and costs, as well as various peripheral equipment--such input means as consoles and card feeders; such output means as page printers, cathode-ray-tube displays, and graphing devices; and optional magnetic-tape and magnetic-disk file storage. These found wide use in business for such applications as accounting, payroll, inventory control, ordering supplies, and billing. Central processing units (CPUs) for such purposes did not need to be very fast arithmetically and were primarily used to access large amounts of records on file, keeping these up to date. By far the greatest number of computer systems were delivered for the more modest applications, such as in hospitals for keeping track of patient records, medications, and treatments given. They are also used in automated library systems, such as MEDLARS, the National Medical Library retrieval system, and in the Chemical Abstracts system, where computer records now on file cover nearly all known chemical compounds.

The trend during the 1970s was, to some extent, away from extremely powerful, centralized computational centers and toward a broader range of applications for less-costly computer systems. Most continuous-process manufacturing, such as petroleum refining and electrical-power distribution systems, now use computers of relatively modest capability for controlling and regulating their activities. In the 1960s the programming of applications problems was an obstacle to the self-sufficiency of moderate-sized on-site computer installations, but great advances in applications programming languages are removing these obstacles. Applications languages are now available for controlling a great range of manufacturing processes, for computer operation of machine tools, and for many other tasks.

Moreover, a new revolution in computer hardware came about, involving miniaturization of computer-logic circuitry and of component manufacture by what are called large-scale integration, or LSI, techniques. In the 1950s it was realized that "scaling down" the size of electronic digital computer circuits and parts would increase speed and efficiency and thereby improve performance--if only manufacturing methods were available to do this. About 1960 photoprinting of conductive circuit boards to eliminate wiring became highly developed. Then it became possible to build resistors and capacitors into the circuitry by photographic means (printed circuit boards). In the 1970s vacuum deposition of transistors became common, and entire assemblies, such as adders, shifting registers, and counters, became available on tiny "chips." In the 1980s very large-scale integration (VLSI), in which hundreds of thousands of transistors are placed on a single chip, became increasingly common. Many companies, some new to the computer field, introduced in the 1970s the programmable minicomputer supplied with software packages. The size-reduction trend continued with the introduction of personal computers, which are programmable machines small enough and inexpensive enough to be purchased and used by individuals. Many companies, such as Apple Computer and Radio Shack, introduced very successful personal computers in the 1970s. Augmented in part by a fad in computer, or video, games, development of these small computers expanded rapidly.

In the 1980s the enormous success of the personal computer and resultant advances in microprocessor technology initiated a process of attrition among giants of the computer industry. That is, as a result of advances continually being made in the manufacture of chips, rapidly increasing amounts of computing power could be purchased for the same basic costs. Microprocessors equipped with ROM, or read-only memory (which stores constantly used, unchanging programs), now were also performing an increasing number of process-control, testing, monitoring, and diagnosing functions, as in automobile ignition systems, automobile-engine diagnosis, and production-line inspection tasks. By the early 1990s these changes were forcing the computer industry as a whole to make striking adjustments. Long-established and more recent giants of the field--most notably, such companies as IBM, Digital Equipment Corporation, and Italy's Olivetti--were reducing their work staffs, shutting down factories, and dropping subsidiaries. At the same time, producers of personal computers continued to proliferate and specialty companies were emerging in increasing numbers, each company devoting itself to some special area of manufacture, distribution, or customer service. These trends will probably continue for the foreseeable future.

Computers continue to dwindle to increasingly convenient sizes for use in offices, schools, and homes. Programming productivity has not increased as rapidly, and as a result software has become the major cost of many systems. New programming techniques such as object-oriented programming, however, have been developed to help alleviate this problem. The computer field as a whole continues to experience tremendous growth. As computer and telecommunications technologies continue to integrate, computer networking, computer mail, and electronic publishing are just a few of the applications that have matured in recent years.

Of the worst case scenario.
  Thursday 21 August 2003 @ 08:58:52 #7
3288 MikeyMo
jou are een essol!
pi_12596970
and if the plug isn't in the socket, it won't work.
pi_12597015
quote:
On Thursday 21 August 2003 08:58 MikeyMo wrote the following:
and if the plug isn't in the socket, it won't work.
Never seen a laptop, I take it?
pi_12597023
quote:
On Thursday 21 August 2003 08:58 MikeyMo wrote the following:
and if the plug power supply isn't in the device, it won't work.
We'll keep on whispering our mantras.
  Thursday 21 August 2003 @ 09:45:51 #10
13250 Lod
Sapere aude!
pi_12597521
And most computers have a built-in coffee-cup holder.
GNU Terry Pratchett
pi_12597565
If you can't email, that doesn't mean your computer is broken
Where facts are few, experts are many.
pi_12597568
quote:
On Thursday 21 August 2003 08:53 MikeyMo wrote the following:


is there a central linux/unix topic yet?


is interested!!
- Because an elephant's is the longest, after all
pi_12597583
if you pull an unsuspecting colleague's network plug out of the switch, it can make for hilarious moments
- Because an elephant's is the longest, after all
pi_12597606
Quickly installing Windows 3.11 on school PCs with a bootable Ghost CD is also quite brilliant (yes, we're very childish)
We'll keep on whispering our mantras.
pi_12597708
And if you cancel your own internet connection, you're not switching off the Internet.
A car and a man and a river
From here
Everything is what it seems.
pi_12597746
Lemmings isn't just A computer game
but a BRILLIANT computer game :P
  Thursday 21 August 2003 @ 13:29:11 #17
17137 Sander
Nerds do it rarely
pi_12602405
>> ONZ.
  Thursday 21 August 2003 @ 13:38:46 #18
30259 MarcellicA
Master of Muppets
pi_12602727
quote:
On Thursday 21 August 2003 09:48 Atreidez wrote the following:
If you can't email, that doesn't mean your computer is broken
If your computer is broken, that doesn't mean you can't email.
Life is what happens, while you're busy making other plans |||| Trumpet players don't die, they just fade away
pi_12602780
The fact that you know someone who's good with computers doesn't mean he always feels like coming over to help you.
A gentle wave of heat flows over the FOK! forum
Get woke, go broke!
pi_12602843
Lemming.
And it really belongs in DIG.
But they don't want it there.
So what are we supposed to do with it here?
pi_12602940
quote:
On Thursday 21 August 2003 13:43 Pjederdy wrote the following:
Lemming.
And it really belongs in DIG.
But they don't want it there.
So what are we supposed to do with it here?
just leave it, it's a nice topic after all, as long as people don't copy/paste whole walls of text...
A gentle wave of heat flows over the FOK! forum
Get woke, go broke!
  Thursday 21 August 2003 @ 14:40:17 #22
44346 junkiesietze
Proud scooter rider.
pi_12604421
Reason ownt
I'll book you with my neon, you know.
and I have a car too.
  Thursday 21 August 2003 @ 14:54:35 #23
32814 Tessje
washed up, has-been drug addict
pi_12604817
You turn speakers on by pressing the button
or, as with my little brother's, by turning the knob
But you have every right to ask why I was shaving your dog while wearing your bathing suit.
Aventura, FOK!'s own walking fairground attraction
Xboxlive: Tessje
pi_12604940
You don't have to give computer mice water and feed; moreover, these mice don't eat cheese either.

In general, women are less afraid of a computer mouse than of a mouse-mouse

As you can see, the computer mouse has beaten nature and is many times more popular. The mouse-mouse evolved from a reptile's droppings. In just a few years, with only minimal changes, the computer mouse has gained enormous ground...

Let us mark this with a huzzah huzzah huzzaah
