Why do computers only understand 0s and 1s?


Computers

Computers are electronic devices. Their building blocks have evolved from vacuum tubes to transistors to integrated circuits.

Integrated circuits

An integrated circuit or monolithic integrated circuit is a set of electronic circuits on one small flat piece of semiconductor material, usually silicon.

— Wikipedia

So what 🤔?

The only thing a computer, or indeed any electronic device, can understand is High/Low, or On/Off. Put in simple human terms, there are only two letters in the electrical alphabet. We humans have agreed to denote them as 0 and 1, where 1 means 'High' or 'On' and 0 means 'Low' or 'Off'. That leaves us no choice but to build an equivalent notation for every letter, digit, and symbol using just 0s and 1s.
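As a quick illustration, here is a small Python sketch of that idea: each character is stored as a number, and that number is written with only 0s and 1s (the 8-bit codes shown are standard ASCII).

```python
# Every character is stored as a number, and that number as bits.
for ch in "Hi!":
    code = ord(ch)              # the character's numeric code (ASCII/Unicode)
    bits = format(code, "08b")  # the same number written using only 0s and 1s
    print(ch, code, bits)
# H 72 01001000
# i 105 01101001
# ! 33 00100001
```

So when you type "Hi!", the hardware really stores three groups of eight On/Off states.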


Extras 🎁✨

So everything you are currently looking at on your monitor resides in your system as a series of On and Off transistors in a strict order. Everything, from the characters you see to images, videos, software packages, and programs, is stored this way. That should explain the answers to the following questions:

  1. Why do high-quality images take up a lot of memory?
  2. Why does your system heat up when you run heavy software packages?
  3. Why does it slow down when your laptop gets very hot?
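For question 1, a back-of-the-envelope estimate shows why images need so many bits. This Python sketch assumes an uncompressed Full HD image with 3 bytes (RGB) per pixel; both numbers are illustrative assumptions, and real image formats compress this considerably.

```python
# Rough size of an uncompressed image: one byte per colour channel per pixel.
width, height = 1920, 1080   # assumed Full HD resolution
bytes_per_pixel = 3          # assumed RGB, 8 bits per channel

size_bytes = width * height * bytes_per_pixel
print(size_bytes)                            # 6220800 bytes
print(round(size_bytes / (1024 * 1024), 2))  # about 5.93 MiB before compression
```

Millions of pixels, each needing its own group of On/Off transistors, add up quickly.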

Thank you for reading this through! 🤩

Please let me know your thoughts on this article. If you have any questions similar to this one, let me know in the comments.

Have a nice day! 👋
