All About Computers

Computers coloring page: Download/Print Coloring Page

History of computers coloring page: Download/Print Coloring Page

 

Video Transcript:

Hey there, I’m Yvonne and you’re watching Facts to Relax, the show where I research a topic I find interesting and want to learn more about, then share my newfound knowledge with you in a calm and soothing voice while painting some pictures on the screen. If you’re a big nerd like me and you like the idea of learning new, random facts while also relaxing, I suggest you subscribe to this channel now and check back often for new videos on all manner of interesting topics. Today’s episode is all about computers. As someone who works on a computer all day, every day for my job, it’s a tad bit embarrassing how little I really understood about how they work, and I made it my mission with this episode to change that. We’ll talk about the history of computers, some of the things computers can do, what the cloud is, why Y2K scared so many people, and really dive into the inner workings of the modern computer. I put some topics with timecodes in the description below in case you want to skip around and find something specific, or you can just sit back and get comfy with me while I start from the beginning. First though, let’s take a deep breath in and then slowly let it out as we settle down to color and chill.

Though “computer” used to be the job title of a person who did mathematical computations, nowadays a computer is an electronic device that processes, stores, and displays information. At first, computers were simply used for numerical calculations, but it quickly became clear that any information that can be numerically encoded (which turns out to be just about all information) can be processed by a computer, and ever since then, we’ve seen the different uses for computers explode. Besides the big desktop or portable laptop that you might think of as a computer, tablets and smartphones are also computers, and even more than that, so many other things we use every day contain their own computers. Newer washing machines, cars, microwaves, televisions, and robotic vacuum cleaners are all examples of equipment that process data and compute the answers we need, like “how’s my gas mileage today?” or “how long does this cycle need to run to properly clean a large load of really dirty socks?” Computers can sequence DNA, predict stock market trends, and even track our friends and family on real-time maps to within mere feet of where they’re actually standing. It’s mind-blowingly amazing, all the things computers are capable of, if you really think about it. And we humans are the ones who invented them. A quick, collective pat on the back for us.

What is a computer, though, and how does it work? Well, computers see data as a series of ones and zeros and somehow, magically, they combine those ones and zeros and output them as something our human brains can understand, like a video or a photo or an Excel document. I’m not even going to try to explain how this all works, partly because I’m unconvinced that the answer isn’t actually just “witchcraft,” but also because it would take much longer than we have in this video to teach properly. I am, however, linking to a lovely video in the description below that may help explain it, if you want to dive a little deeper into actual programming.
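That said, here’s a tiny taste of the ones-and-zeros idea, as a minimal Python sketch (Python is just one language choice for illustration): every character of text has a numeric code, and every number can be written out in binary digits.

```python
# A tiny taste of the "ones and zeros" idea: every character of text
# has a numeric code, and every number can be written in binary.
message = "Hi!"

for ch in message:
    code = ord(ch)              # the character's numeric code point
    bits = format(code, "08b")  # the same number as eight binary digits
    print(f"{ch!r} -> {code:3d} -> {bits}")

# 'H' ->  72 -> 01001000
# 'i' -> 105 -> 01101001
# '!' ->  33 -> 00100001
```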

But here’s what I can tell you: computers use a combination of hardware and software to work. Hardware is any physical part that makes up the computer, including the monitor, the keyboard, the mouse, and all the wires, fans, and funny little parts inside. Software is what makes all those physical parts work. Think of all the different programs you use: the internet browser or YouTube app you’re watching this on, Microsoft Word, Photoshop, games, all that stuff. Those are all software. Basically, the software uses bits of code to tell the hardware what to do. Think of it like a workplace: the software is the boss, telling all the employees what tasks must get done and how to do them, and the hardware is all the workers getting it done. Your operating system, for example, the software that comes with your computer, is what tells the pointer on the screen to hit the like button on this video when you press the mouse (hardware) over it. Go ahead, try it. Magic, right?!
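If you’re curious what that boss-and-workers relationship looks like in actual code, here’s a minimal, hypothetical sketch using Python’s built-in tkinter toolkit: the software sits waiting for a hardware event (a mouse click) and decides what happens next. The button label and message are made up for illustration.

```python
# Minimal sketch: software (this program) reacting to hardware (the mouse).
# Uses tkinter, which ships with most Python installations.
import tkinter as tk

def on_click():
    # The operating system noticed the mouse button going down over our
    # button and delivered that hardware event to this software handler.
    label.config(text="You clicked! The software told the hardware what to draw.")

root = tk.Tk()
root.title("Software meets hardware")

button = tk.Button(root, text="Like", command=on_click)
button.pack(padx=20, pady=10)

label = tk.Label(root, text="Press the mouse button over 'Like'...")
label.pack(padx=20, pady=10)

root.mainloop()
```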

Now, let’s take a look at the hardware inside the computer, all the physical parts we could see if we were to take our computer apart.

The Motherboard is a large circuit board inside the computer which contains many of the important parts that make it work. Here are some of the things on the motherboard: 

The CPU, or Central Processing Unit, may look like just a small square of plastic and metal bits, but underneath its simple exterior, it’s actually the brains of the whole computer. It processes information and carries out commands. Now, this thing does so much work that it actually gets really hot, which is why it’s covered by a little metal piece called a heatsink that draws heat away from it. The computer will also have a fan inside that helps cool things down, and you may hear it running when the computer is working extra hard.

Next comes the Random Access Memory, or RAM. RAM is the short-term memory of the computer, used to complete calculations. It gets wiped whenever you shut off your computer and is not used to store files, but it is important for the speed of the computer. When you’re shopping for a computer, the RAM may simply be referred to as “Memory,” and the more you have, the more apps you can run at once and the better performance you’ll get. If you find your computer getting laggy or freezing up while you’re using it, it’s probably because you don’t have enough RAM to run all the programs you’re using at once. As software updates can make programs bigger and more memory-intensive over time, an older computer may simply not have the amount of RAM needed to run newer programs, which is one of the reasons you may find yourself having to replace your computer every few years.
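If you want to peek at your own computer’s RAM, here’s a short sketch using the third-party psutil library (an assumption for this example; you’d install it with pip install psutil). The exact numbers will, of course, depend on your machine.

```python
# Peek at the computer's RAM, the short-term memory described above.
# Requires the third-party psutil package: pip install psutil
import psutil

mem = psutil.virtual_memory()
gib = 1024 ** 3

print(f"Total RAM:     {mem.total / gib:.1f} GiB")
print(f"Available now: {mem.available / gib:.1f} GiB")
print(f"In use:        {mem.percent:.0f}%")

# When "In use" stays near 100%, the machine starts shuffling data to
# disk instead, which is the lag and freezing described above.
```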

Since RAM doesn’t save anything after the computer is shut down, the computer also contains a hard drive, where all long-term memory is saved. So that paper you’re writing or photo you’re editing would be stored here. The hard drive may use spinning magnetic platters, or it may be a solid-state drive; solid-state drives are much sturdier and faster, and also more expensive. When buying a computer, you may see the hard drive listed as “storage,” and you’ll want to estimate how much you’re planning to save on the computer when choosing how much you need. If you’re editing a lot of videos, you need more space than if you’re just saving a text document now and then. If you see “SSD storage” listed, that means it’s a solid-state drive. You can also purchase external hard drives, which can be plugged into the computer to save files on instead of saving them to your computer’s internal hard drive. Again, you may find this is a good option if you have an older computer and have run out of storage space, as newer files get larger and larger. I remember my first website ever was hosted by a company called “50megs.com,” which meant that your website had fifty whole megabytes of storage space, which was quite a lot at the time. Now, a few photos from a newer digital camera can add up to fifty megs easily, which is why hard drives keep getting bigger, into the gigabytes or even terabytes at this point. External hard drives can also be useful for backing up files in case your computer breaks and you lose everything saved on it. Cloud storage from companies like Google and Dropbox is another alternative for backing up files, but we’ll talk more about that in a few minutes.
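To put those storage sizes in perspective, here’s a quick back-of-the-envelope calculation in Python; the 12 MB photo size is just an assumed, illustrative figure.

```python
# Back-of-the-envelope storage math from the paragraph above.
# The 12 MB photo size is an assumption for illustration.
site_quota_mb = 50               # the old 50megs.com hosting quota
photo_mb = 12                    # rough size of one modern camera photo (assumed)
drive_tb = 1                     # a common hard drive size today

drive_mb = drive_tb * 1_000_000  # 1 TB is about 1,000,000 MB in decimal units

print(f"Photos that fit in {site_quota_mb} MB: {site_quota_mb // photo_mb}")
print(f"Photos that fit on a {drive_tb} TB drive: {drive_mb // photo_mb:,}")

# Photos that fit in 50 MB: 4
# Photos that fit on a 1 TB drive: 83,333
```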

A desktop computer may also contain expansion slots, which allow you to add things like a video card for faster graphics or a wireless card to connect to a wireless network. Most laptops, and anything smaller like tablets and smartphones, don’t contain expansion slots. This is only one of the reasons that desktop computers tend to be more powerful than portable devices.

Along with the hard drive, RAM, and CPU, inside the computer you’ll also find the power supply unit, which takes the electricity from the wall outlet and turns it into the power the computer’s parts need to run. Portable devices like laptops, smartphones, and tablets also have batteries that can be charged, so they can be used without being plugged in.

Anyways, essentially, what we’re talking about right now is a “personal computer,” which includes your home computer, laptop, smartphone, and so on, but there’s another type of computer that we all use but never see: a server. Basically, a server sends information along a network. For example, there is a server out there right now “serving” you up this very video. The internet works off web servers around the world that share information, and on a smaller scale, you may use a server at work that allows you to share files with coworkers. This is also what makes cloud storage work: when you’re saving files to the “cloud,” you’re actually saving them to a server, or multiple servers, somewhere else in the world, rather than on your own personal computer’s hard drive or an external hard drive that you physically plug into the computer. A benefit of cloud storage is that it’s generally a pretty inexpensive way to back up files, and you have the peace of mind of knowing you’ve got another copy out there somewhere. A downside, however, is that cloud storage is often sold as a monthly subscription, and if you can’t pay for whatever reason in a given month, you may lose all your data. You’re really committing to continuously paying a company to store your files indefinitely, rather than just purchasing your own external hard drive and knowing that no one can delete all your favorite selfies if you’re short on cash one month.
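If you’d like to see the client-and-server idea in miniature, here’s a sketch using only Python’s standard library: one command serves files from a folder, and a few lines of client code request one of them, much like your browser does. The port number and the hello.txt file are assumptions for the example.

```python
# Minimal client/server sketch using only the standard library.
# First, in one terminal, start a server that "serves" files from the
# current folder on port 8000:
#
#     python -m http.server 8000
#
# Then run this client in another terminal. It asks the server for a
# file, just like your browser asks a web server for this video's page.
from urllib.request import urlopen

response = urlopen("http://localhost:8000/hello.txt")  # assumes hello.txt exists
print(response.status)           # 200 means the server found and sent the file
print(response.read().decode())  # the file's contents, delivered over the network
```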

One of the biggest concerns about cloud storage is that your data may be lost if one of those hard drives breaks, but if you’re paying for cloud storage from a reputable company, that data is normally backed up on multiple servers, so the likelihood of losing your files may actually be considerably lower than if you’re saving them on your own computer or external hard drive. In fact, I can’t even tell you what my children looked like in the years between 2003 and 2009, because all those pictures are on an external hard drive that won’t actually load anymore when I plug it into my computer. Sad, right? Additionally, if you’re using a mega-corporation like Google to store your files, you can feel pretty confident that the company’s not going to just implode and leave you with nothing overnight. That being said, another concern with cloud storage is that your information, files, and data can be hacked and stolen or sold to other companies. I honestly don’t feel like I have a strong enough opinion either way to share with you here, and really, everyone should do their own research and weigh the pros and cons of cloud storage when considering using it. Just because I think the risk of a random stranger getting ahold of my old photos is worth actually being able to look at them, rather than staring at the whirling ball of death my never-loading external hard drive shows me, doesn’t mean that another person is wrong to be concerned about the risks, and vice versa. I don’t think it’s an invalid concern by any stretch of the imagination, but keep in mind that most cloud storage companies now promise encryption, which will at least make it harder for hackers to get in and steal your data if you use cloud storage.

But moving on from this controversial topic to my favorite part: the history of computers. When researching subjects, I like to go as far back into their history as I can; that’s usually what makes things make the most sense to me. So, when talking about computers, I’m going to start with the abacus. An abacus is a simple, completely non-electronic tool that can help humans perform mathematical calculations, including addition, subtraction, multiplication, and division. First used almost 5,000 years ago, the abacus can be thought of as the first calculator. Essentially, it’s a frame with a number of rows of movable beads, which represent different numerical digits, and as you move them around, you figure out the solution to your math problem. Which bead represents which number? Well, honestly, that depends on the style of abacus you’re using and what kind of equation you’re trying to solve. While I can’t properly teach you how to use an abacus here, there are many videos that can, and I’ve linked one I found helpful below. Though it’s mostly been replaced by the modern calculator, some people around the world still use abacuses. For example, they can be used to teach math to children, or by people who need to do math problems but have an impairment that keeps them from being able to use a calculator.

The next big step in computer history came when a Scottish mathematician, John Napier, discovered logarithms in 1614. Logarithms transform a multiplication problem into an addition problem (there’s a small worked example after this paragraph). While the abacus is a digital device, logarithms led to the first analog calculating devices, like the slide rule; clocks with hands and faces and vinyl records are other examples of analog technology. It’s funny, because we’ve now moved widely back to digital devices and away from analog, but at the time, this was innovative and led us to where we are now. Then, Gottfried Leibniz created the binary number system in 1680, a system in which just 0 and 1 can represent everything and can be used to solve problems. In 1801, French weaver and merchant Joseph Marie Jacquard created a weaving loom that used punch cards to automate the design of woven fabrics. Instead of hand-weaving, cards were punched and assembled ahead of time to show the loom what the design should look like, and then the loom implemented the designs automatically. One could say it was “programmed.” All of these discoveries and inventions would come together later to create computers.
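To see what “multiplication becomes addition” actually means, here’s a small worked example in Python using the identity log(a×b) = log(a) + log(b), the same trick slide rules were built on.

```python
# Logarithms turn multiplication into addition: log(a*b) = log(a) + log(b).
# This is the trick slide rules were built on.
import math

a, b = 3_456, 789

log_sum = math.log10(a) + math.log10(b)  # add the logs...
product = 10 ** log_sum                  # ...then undo the log

print(f"Direct multiplication: {a * b}")
print(f"Via logarithms:        {product:.0f}")

# Direct multiplication: 2726784
# Via logarithms:        2726784
```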

Next comes the Difference Engine, a machine designed in 1823 by a British gentleman by the name of Charles Babbage, in response to the need he saw for a mechanical device that could automate long astronomical calculations. He never actually got around to building it, thanks to disagreements with the engineer in charge of construction and the eventual loss of the government funds that had been allotted for the project. Had it been built, it would have stood about eight feet high, weighed about a ton, and computed tables of numbers. Side note: several replicas of the Difference Engine based on Babbage’s designs have since been built and can be found at the Museum of London, the National Museum of American History, and in the IBM Corporation collection. The Science Museum in London even has a working version. About a decade after Babbage came up with the plans for the Difference Engine, a man named Georg Scheutz heard about his work and began working on a similar version. It took twenty years, but eventually Scheutz, with help from his son, created a machine that could process fifteen-digit numbers and calculate fourth-order differences. What does that mean? I had no idea either, but the little sketch below gives a taste of the idea, and in any case it was impressive enough that they won a gold medal at the Exhibition of Paris in 1855. Later, they sold the machine to the Dudley Observatory in Albany, New York, which used it to calculate the path of Mars.
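For the curious, the trick these machines mechanized is the method of finite differences: building a table of a polynomial using nothing but addition, which is exactly what gears can do. Here’s a minimal Python sketch of the simpler second-order case (a table of squares), rather than the fourth-order version Scheutz’s machine handled.

```python
# The method of finite differences, which difference engines mechanized:
# tabulate a polynomial using nothing but addition.
# For f(n) = n^2, the second difference is a constant 2:
#   0, 1, 4, 9, 16   (values)
#    1, 3, 5, 7      (first differences)
#     2, 2, 2        (second differences: constant)

value, first_diff, second_diff = 0, 1, 2  # f(0), f(1)-f(0), constant

for n in range(10):
    print(f"{n}^2 = {value}")
    value += first_diff        # next square, by addition only
    first_diff += second_diff  # next first difference, by addition only
```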

Now, as we get to the end of the 1800s, the United States government found itself in a bit of a quandary. Every ten years, it needs to take a census, and the 1880 census was a bit of a doozy, in part because of the population boom and partially because the questionnaire had been expanded, and all of a sudden the government found itself with more data than it could reasonably compute by hand. This led to the invention of the punch card tabulation system by a man named Herman Hollerith, which was put to work on the 1890 census. It saved the government $5 million and a whole bunch of time and manpower in tabulating the results.

The next big leaps in computer history came during World War II. Several new computing machines were built, including an electromechanical machine that Alan Turing designed while working for the British military, which was used to break the coded messages of the German army. The British also built an electronic computer called the Colossus; it was a top-secret project, and nobody even knew about it until much later. Meanwhile, over in Germany, Konrad Zuse created a series of machines, including the Z3 in 1941, the world’s first working, programmable, fully automatic digital computer. It worked by writing programs onto a strip of punched film and feeding it into the machine. Now, each of these machines had only one purpose. This was pretty much the case for all computers at the time, and there really were quite a few being invented. They were all single-purpose machines, used for things like designing wind tunnels, counting census numbers, predicting the weather, or guessing who might win the next presidential election, but they could only do the one thing they were designed to do. And did I mention that each one could take up the space of an entire large room? So yeah, a giant computing machine that can only do one thing isn’t very practical, and engineers began to realize the need for machines that could do multiple operations. This essentially meant that at the same time they were inventing computers, they also had to create new languages to actually make their inventions work as they envisioned them. This opened the door to a whole new career option: computer programming. Early computer programmers would write code, often by hand, and feed it into the machines. In 1953, a computer scientist named Grace Hopper, YES FINALLY A GIRL, developed one of the first computer languages, work that eventually led to COBOL. It allowed people to use English words instead of numbers to give the computer instructions.

Since then, innovations in computing have been flying at us, hard and fast. A few notable happenings: Douglas Engelbart created a prototype for a modern computer that included a mouse in 1963. In 1970, the first DRAM chip was introduced; in 1971, the floppy disk was invented by a team of IBM engineers, and in the same year, Xerox created the world’s first laser printer. At the same time, computer networks and email were starting to come into use. In 1975, the Altair 8800, the world’s first microcomputer kit, was featured in Popular Electronics magazine, and Paul Allen and Bill Gates created a software company called Microsoft. In 1976, Steve Jobs and Steve Wozniak founded Apple Computer and introduced the Apple I, a single-circuit-board computer, the first of its kind. A few years later, the Apple II was released, complete with color graphics and an audio cassette drive to store stuff on. In 1981, IBM released its first personal computer, code-named “Acorn,” which had an Intel chip and not one but TWO floppy disk drives. In 1982, Time Magazine named the computer its “Machine of the Year,” which makes sense because that was the year the CD-ROM, able to hold 550 megabytes of pre-recorded data, was introduced to the public, and Microsoft brought us Word, a program that’s still one of the most widely used for typing documents today. In 1984, the Macintosh, a mouse-driven computer with a graphical interface, was sold by Apple for $2,500. Not to be outdone, in 1985, Microsoft introduced the Windows operating system, which offered multitasking and a graphical user interface. In 1990, the HTML coding language, which is still the basis for creating websites, was created by English programmer Tim Berners-Lee. In 1991, Apple released its first PowerBook laptop computers. In 1995, the first DVDs were introduced, and in 1996, the Palm Pilot was invented and Google was developed by Stanford University students Sergey Brin and Larry Page. In 1999, Wi-Fi was introduced and connecting to the internet was changed forever. That same year, the world geared up for the devastation that Y2K was sure to bring. Since many computer programs stored years as only two digits, like 99 instead of 1999, there was concern that computers wouldn’t be able to operate when the date changed from 99 to 00, and that computer infrastructure such as banks, government systems, financial databases, airline reservation systems, and power plants would be brought down as the new year arrived. However, thanks to the enormous amounts of money and effort poured into software patches and workarounds, when the year 2000 came around, very few actual issues occurred.
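For the curious, here’s a tiny Python sketch of the bug at the heart of the Y2K scare: code that stores years as two digits works fine until the century rolls over. The age calculation is a made-up, illustrative example.

```python
# The heart of the Y2K bug: storing years as only two digits.
def age_two_digit(birth_yy: int, current_yy: int) -> int:
    # Naive 20th-century code: assumes both years share the same century.
    return current_yy - birth_yy

# Works fine in 1999 for someone born in 1970:
print(age_two_digit(70, 99))  # 29 -- correct

# Breaks the moment the clock rolls over to 2000 ("00"):
print(age_two_digit(70, 0))   # -70 -- the failure everyone feared
```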

Speaking of the turn of the century, it was then that the first camera phone was introduced in Japan, with a maximum resolution of 0.11 megapixels, and the USB flash drive, an alternative to the easily scratched CD, was brought to the market, though at the time not all computers had a built-in USB port to plug it into.

Over the next few years, Mac OS X, Windows XP, iTunes, Blu-ray, MySpace, Firefox, Facebook, and YouTube were all unveiled. In 2007, the first iPhone became available, giving users a computer-like experience in a handheld phone, the Kindle allowed people to forgo paper books, and Dropbox brought cloud-based storage to everyday users. In 2010, Apple’s iPad tablet was introduced, and in 2011, the Chromebook was released. In 2016, the first reprogrammable quantum computer was created. Nowadays, it seems like everything we use depends on a computer. For example, I’m an artist, and I very rarely choose to paint by hand anymore, choosing instead to use an electronic drawing tablet. Robots slide through grocery stores taking note of stocking needs while we shop. Social media platforms come and go constantly, allowing us to socialize with people we probably never would have met otherwise, all without ever having to see anyone face-to-face. The internet has made it easier for companies to market to potential consumers they could never have reached a few decades ago, and the constant electronic connection of carrying high-powered computers in our pockets everywhere means that we can get our news almost as quickly as it happens. Thanks to COVID, many of us learned that with computers and technology, we don’t even need to leave our houses to do our jobs. Whether we think this technological infiltration into our lives is a good thing or a bad thing, computers are here, and it appears they aren’t going anywhere.

Now, on that note, I think it’s time to end this episode of Facts to Relax, and if you have not found all this talk about computer technology and how it’s rapidly taking over the world as relaxing as you may have hoped, I sincerely apologize and offer you this photo of my cat Harry sitting on a rock to help you get back to your happy place. Just look at how cute he is, thinking he’s a wild jaguar or something when really he’s just the sweetest, most adorable little poopsie doodle. And despite everything else good and bad about computers, my friends, let’s just take a deep breath and appreciate this little gift that the technology has given us: the ability to share pictures of our cats with strangers on the internet.

 

Bye.

❁❁ SOURCES & RESOURCES ❁❁