How does a computer work

  1. How do supercomputers work?
  2. How do computers represent data?
  3. How Binary Code Works and How Computers Use It
  4. HTG Explains: How Does a CPU Actually Work?
  5. How Microprocessors Work
  6. How Does a Quantum Computer Work?
  7. Explaining Computers: How Does a Computer Work?
  8. What Is a PC?



How do supercomputers work?

Last updated: March 28, 2023. Roll back time a half-century or so and the smallest computer in the world was a gargantuan machine that filled a room. At the other end of the scale today sits the supercomputer—a computer that's millions of times faster than a desktop PC and capable of crunching the world's most complex scientific problems. What makes supercomputers different from the machine you're using right now? Let's take a closer look!

What is a supercomputer?

Before we make a start on that question, it helps if we understand what a computer is.

Serial and parallel processing

What's the difference between serial and parallel? An ordinary computer does one thing at a time, so it does things in a distinct series of operations; that's called serial processing. It's a bit like a person sitting at a grocery store checkout, picking up items from the conveyor belt and running them through the scanner one at a time. A typical modern supercomputer works much more quickly by splitting problems into pieces and working on many pieces at once, which is called parallel processing. It's like arriving at the checkout with a giant cart full of items, but then splitting your items up between several different friends. Each friend can go through a separate checkout with a few of the items and pay separately. Once you've all paid, you can get together again, load up the cart, and leave. The more items there are and the more friends you have, the faster it is to do things by parallel processing—at least, in the...
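To make the checkout analogy concrete, here is a minimal Python sketch, my own illustration rather than anything from the article, that runs the same batch of independent pieces of work serially and then in parallel with the standard library's concurrent.futures; the toy task and piece sizes are made up.

    # A minimal sketch of serial vs. parallel processing with a CPU-bound toy task.
    import time
    from concurrent.futures import ProcessPoolExecutor

    def crunch(n):
        """A stand-in for one 'piece' of a larger problem."""
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        pieces = [2_000_000] * 8          # eight independent pieces of work

        start = time.perf_counter()
        serial_results = [crunch(n) for n in pieces]     # one piece at a time
        print(f"serial:   {time.perf_counter() - start:.2f} s")

        start = time.perf_counter()
        with ProcessPoolExecutor() as pool:              # several "friends" at separate checkouts
            parallel_results = list(pool.map(crunch, pieces))
        print(f"parallel: {time.perf_counter() - start:.2f} s")

        assert serial_results == parallel_results        # same answers, less wall-clock time

On a multi-core machine the parallel run typically finishes several times faster, just as several friends clear the cart faster than one.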

How do computers represent data?

It really has more to do with the way computers work. At the most fundamental level, electrical signals are how the computer represents anything (this is where you can find binary). A wire can either be carrying an electrical signal or not (there is no in-between for on and off, after all). This means the state of a wire can only take one of 2 possible values. As such, binary is used.

Good question! There's a lot of jargon in the world of computers, so it's possible that I use jargon that some folks aren't familiar with. A GIF is a type of image file that's popular on the internet these days, but you're right, "GIF" is jargon. I'd encourage learners to search the internet for jargon that is unfamiliar or ask a question as you've done here. I can then decide whether to reword something to avoid the jargon. There is a vocabulary review, but it only covers the high-level vocabulary addressed by the exam; it does not include all the jargon used in the articles and exercises.

Nobody really programs in binary code anymore. People realized a while ago how difficult it is to write programs in binary, so they created alternative programming languages that were much easier to use. Now, we use languages such as JavaScript and Python. If you are interested in learning to program, Khan Academy has lessons for beginners.

Luckily for us, we don't have to input data into our computers through binary, as that would be extremely tedious and error-prone...
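As a side note to the answers above, here is a small Python sketch, my own illustration rather than part of the original Q&A, showing how one letter comes down to a pattern of on/off bits:

    # How a piece of data reduces to bits (illustrative only).
    letter = "A"
    code_point = ord(letter)            # the number a computer stores for "A"
    bits = format(code_point, "08b")    # that number written as eight on/off values

    print(code_point)   # 65
    print(bits)         # 01000001

    # Going the other way: read a bit pattern back as a number, then a letter.
    value = int("01000001", 2)
    print(chr(value))   # A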

How Binary Code Works and How Computers Use It

Want to learn how binary code works and how computers use it to store and process data? Keep reading! The basics of binary are within your reach! Without communication, we’d be nowhere, and we would have no way to talk or share coherently. The same is true for computers, except they don’t communicate with traditional language, as we do. Computers have a limited vocabulary, composed of a language called binary code. Instead of letters, the computer alphabet — if you can call it that — is made up of 1’s and 0’s. When compiled together, they create a complex language that only computers can understand. Well, that’s not entirely true; we can understand binary code too if we invest the time to learn it!

What Is Binary Code?

The concept of binary code is quite simple. Every 0 means off or disabled, and every 1 means on or enabled. In other words, you can look at them as a switch or lever. If you’re looking at binary code and you see a 1, then you know that particular data point is “on,” or has been initiated. The opposite is true when you see a 0 in the code. While that’s a rudimentary definition of binary code, it will certainly help you understand the language much better. To quantify how that kind of code can tell a computer what to do, or communicate rather, you must consider how these machines handle the information. Not necessarily in this order, computers will:

• Receive inputs or commands from users through applications
• Collect, store, and process data as information...
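To ground the switch-and-lever picture, here is a short Python sketch, my own illustration and not from the article, that treats a row of eight on/off switches as one binary number:

    # A row of eight switches: True = on (1), False = off (0). Illustrative only.
    switches = [True, False, True, True, False, False, True, False]

    # Read the row as a binary string, most significant switch first.
    bit_string = "".join("1" if s else "0" for s in switches)
    print(bit_string)              # 10110010

    # The same row of switches interpreted as a number.
    print(int(bit_string, 2))      # 178

    # Flip the last switch "on" and the value the computer sees changes.
    switches[7] = True
    print(int("".join("1" if s else "0" for s in switches), 2))   # 179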

HTG Explains: How Does a CPU Actually Work?

Most things in a computer are relatively simple to understand: the RAM, the storage, the peripherals, and the software all work together to make a computer function. But the heart of your system, the CPU, seems like magic even to many tech people. Here, we’ll do our best to break it down. One note before we begin: modern CPUs are orders of magnitude more complex than what we’re outlining here. It’s nearly impossible for one person to understand every nuance of a chip with over a billion transistors. However, the basic principles of how it all fits together remain the same, and understanding the basics will give you a better understanding of modern systems.

Starting Small

Computers operate in binary. Modern computers use billions of transistors to perform calculations, but at the lowest levels, you only need a handful to form the most basic components, known as gates.

Logic Gates

Stack a few transistors properly, and you have what’s known as a logic gate: OR, AND, XOR, NOR, NAND, XNOR, and NOT.

Doing Math With Gates

With just two gates you can do basic binary addition: a half adder, built from an XOR gate and an AND gate. This gives us a simple setup with three distinct outputs: zero, one, and two. But one bit can’t store anything higher than 1, and this machine isn’t too useful as it only solves one of the simplest math problems possible. But this is...
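Here is a minimal Python sketch of that half adder, my own illustration following the standard XOR-plus-AND construction rather than the article's missing diagram:

    # Half adder: adds two single bits, producing a carry bit and a sum bit.
    def AND(a, b):
        return a & b

    def XOR(a, b):
        return a ^ b

    def half_adder(a, b):
        """Return (carry, sum) for two input bits a and b."""
        return AND(a, b), XOR(a, b)

    # The three distinct results mentioned above: zero, one, and two.
    for a in (0, 1):
        for b in (0, 1):
            carry, s = half_adder(a, b)
            print(f"{a} + {b} -> carry {carry}, sum {s}  (value {2 * carry + s})")

Chaining adders together is how real hardware adds wider numbers, which is the standard next step beyond this one-bit machine.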

How Microprocessors Work

" " Microprocessors are at the heart of all computers. Jorg Greuel/Getty Images The computer you are using to read this page uses a microprocessor to do its work. The microprocessor is the heart of any normal computer, whether it is a A microprocessor — also known as a CPU or central processing unit — is a complete computation engine that is fabricated on a single chip. The first microprocessor was the Intel 4004, introduced in 1971. The 4004 was not very powerful — all it could do was add and subtract, and it could only do that 4 If you have ever wondered what the microprocessor in your computer is doing, or if you have ever wondered about the differences between types of microprocessors, then read on. In this article, you will learn how fairly simple digital logic techniques allow a computer to do its job, whether it's playing a game or spell checking a document! " " Introduced by Intel in 1974, the 8080 microprocessor was the first microprocessor powerful enough to build a computer around. Science & Society Picture Library/Getty Images The first microprocessor to make it into a home computer Since 2004, Intel has introduced microprocessors with multiple cores and millions more transistors. But even these microprocessors follow the same general rules as earlier chips. An Intel Core i9 processor can have Intel's product range has widened substantially from the 1970s. As of this writing, the company still makes Pentium and Core CPUs for computers, but higher-performance PC...

How Does a Quantum Computer Work?

If someone asked you to picture a quantum computer, what would you see in your mind? Maybe you see a normal computer, just bigger, with some mysterious physics magic going on inside? Forget laptops or desktops. Forget computer server farms. A quantum computer is fundamentally different in both the way it looks and, more importantly, in the way it processes information. There are currently several ways to build a quantum computer, but let’s start by describing one of the leading designs to help explain how it works. Imagine a lightbulb filament, hanging upside down, but it’s the most complicated light you’ve ever seen: instead of one slender twist of wire, it is an elaborate structure of hardware with a small chip at its heart, all hanging inside a vessel chilled to extraordinarily low temperatures. At such low temperatures, the tiny superconducting circuits in the chip take on their quantum properties. And it’s those properties, as we’ll soon see, that could be harnessed to perform computational tasks that would be practically impossible on a classical computer.

Traditional computer processors

Classical computers are designed to follow specific inflexible rules. This makes them extremely reliable, but it also makes them ill-suited for solving certain kinds of problems—in particular, problems where you’re trying to find a needle in a haystack. This is where quantum computers come in. If you think of a computer solving a problem as a mouse working its way through a maze, a classical computer has to try one path after another. What if, instead of solving the maze through trial and error, you could consider all possible routes simultaneously? Quantum computers do...
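The "consider all possible routes simultaneously" intuition comes from superposition. Here is a very rough Python sketch, my own simplified state-vector illustration rather than anything from the article or real quantum hardware, of a single qubit being put into an equal superposition of 0 and 1:

    import math

    # A qubit's state is two amplitudes: one for |0> and one for |1>.
    # Start in the definite state |0>.
    state = [1.0, 0.0]

    # Apply a Hadamard gate, which mixes the two amplitudes equally.
    h = 1 / math.sqrt(2)
    state = [h * state[0] + h * state[1],
             h * state[0] - h * state[1]]

    # Measurement probabilities are the squared amplitudes.
    p0 = state[0] ** 2
    p1 = state[1] ** 2
    print(p0, p1)   # ~0.5 and ~0.5: equally likely to be read as 0 or 1

Until it is measured, the qubit carries both possibilities at once, which is what lets quantum algorithms work with many candidate answers in a single state.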

Explaining Computers: How Does a Computer Work?

The first modern computers were built in the 1940s, but they were not only primitive by today’s standards but also took up an incredible amount of space. The first computer that was a clear ancestor of the PC sitting on your desk today came decades later. PCs have come a long way since those days, but the way they work hasn’t changed that much. Most of the fundamentals for explaining computers apply equally today and 40 years ago. Let’s look at how computers operate.

Fundamental Computer Concepts

Understanding a few basic computing concepts helps make a lot of the mysteries of their operation a bit clearer.

Binary Code

First, it’s useful to remember that computers can really only recognize two numbers – 0 and 1. This is known as binary code. Everything you do with a computer comes down to those two numbers. The magic of modern technology comes from the processor’s ability to take large numbers of those 1’s and 0’s and combine them in particular ways. Computers can’t think, they can only evaluate information. It’s helpful to remember that when you run into problems, the computer isn’t out to get you — it’s evaluating things in a different way than you would like.

Hardware vs. Software

Hardware and software work together to perform all the tasks that you do on your PC. Hardware is the physical technology in your computer. This includes memory, storage, the processor, displays, and any other equipment built into the system. Software refers to the programs and applications running on the hardware...
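Picking up the binary-code point above, here is a small Python sketch, my own illustration and not from the article, of how combining 1's and 0's in particular ways yields ordinary arithmetic: multiplication built from nothing but bit shifts and additions.

    # Multiply two non-negative numbers using only bit operations and addition,
    # the kind of combining of 1's and 0's a processor does constantly.
    def multiply(a, b):
        result = 0
        while b:
            if b & 1:            # if the lowest bit of b is 1 ...
                result += a      # ... add the current shifted copy of a
            a <<= 1              # shift a left: a, 2a, 4a, 8a, ...
            b >>= 1              # move on to the next bit of b
        return result

    print(bin(6), bin(7))        # 0b110 0b111
    print(multiply(6, 7))        # 42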

What Is a PC?

" " PCs include laptop computers, desktop computers and even handheld computers like tablets and smartphones. Penpak Ngamsathain/Getty Images The word A microprocessor is a small electronic device that can process data in the blink of an eye. You can find microprocessors in many devices you use each day, such as cars, refrigerators and personal computer, or PC. In fact, the concept of a computer has become nearly synonymous with the term PC. When you hear about a PC, you probably envision an enclosed device with an attached video screen, keyboard and some type of a pointing device, like a mouse or touchpad. You might also envision different forms of personal computers, such as desktop computers, towers, laptops and handhelds. The term PC has been associated with certain brands, such as Intel processors or Microsoft operating systems. In this article, though, we define a PC as a more general computing device with these characteristics: • designed for use by one person at a time • runs an operating system to interface between the user and the microprocessor • has certain common internal components described in this article, like a CPU and RAM • runs software applications designed for specific work or play activities • allows for adding and removing hardware or software as needed Initially, computers were huge, taking up large rooms with attached terminals allowing access by multiple users. In the 1970s, a man named Ed Roberts began to sell computer kits based on a microproce...