People often ask: what is a computer? It is not only the laptops and desktops we see around us.

Present-day computers are all around us: your cell phone, digital watch, music player, game console, and IPTV box, as well as many medical devices such as CT and MRI scanners.

A computer consists of hardware, an operating system, software programs, and input/output peripherals, all of which fall under this category.

All the equipment required for full operation can be referred to as a computer system, which may perform many tasks or one dedicated task.

The term may also be used for a group of related computers that work together, as in a network or computer cluster.

Definition of a modern computer


A modern computer consists of at least one central processing unit (CPU), usually in the form of a microprocessor, along with some type of memory.

The processing element carries out arithmetic and logical operations, and a sequencing and control unit can change the order of operations in response to stored information.

Peripheral units include input devices (keyboards, mice, joysticks, etc.), output devices (monitor screens, printers, etc.), and touchscreens, which perform both functions.

Peripheral devices allow data to be retrieved from an external source, and they allow the results of operations to be saved and retrieved.

Applications


Computers are part of special-purpose devices like microwave ovens, music players, air conditioners, and remote controls.

They also power general-purpose devices such as PCs and smartphones. The Internet runs on computers and connects hundreds of millions of systems and their users.

Computers are used in control systems for a vast variety of industrial and consumer devices and as part of production assembly lines.

Manufacturing uses them in industrial robots for car production and in computer-aided design (CAD) for designing products.

History

The history of the computer is far older than we might think, and early computers were nothing like what we see today.

The history of computers starts around 2,000 years ago with the Antikythera mechanism, found underwater off the Greek island of Antikythera. Surprisingly, it was an astronomical calculating computer built from mechanical parts such as gears.

Early computers were conceived solely as calculating devices. Simple aids like the abacus, which helped humans do calculations, can be considered ancient computers.

Early in the Industrial Revolution, some mechanical devices were built to automate long, tedious tasks, such as guiding the patterns for looms.

History of the digital computer

Electrical devices did analog calculations in the early 20th century. The first digital calculating machines were developed during World War II, building on the theoretical model of computation known as the Turing machine.

Alan Turing is the founding father of the modern computer concept; he is also credited with breaking the Enigma code.

Early digital computers were built from relays and vacuum tubes, and these components were the main hurdle in the development of digital computers: they were power-hungry, produced a large amount of heat, and could not be shrunk to integrate more calculating circuits.

The history changed dramatically when the transistor was introduced in the late 1940s. It led to the development of the MOSFET (metal-oxide-semiconductor field-effect transistor).

Without a doubt, this cleared the path to the monolithic integrated circuit (IC) chip, which became possible by the late 1950s.

Finally, we got the microprocessor, the benchmark that made microcomputers possible in the early 1970s. From here, the modern definition of the computer comes into existence.

The speed, energy efficiency, and versatility of computer systems have been improving dramatically ever since.

Moore’s law


Moore's Law then came into existence as well, although it is not expected to hold beyond the next decade or so.

The number of transistors in a processor doubles roughly every two years.


As transistor features shrink from the 10-nanometer toward the 5-nanometer range, the quantum tunneling effect comes into the picture.

With the help of FinFET transistors, the 20-nanometer barrier was crossed, but Moore's law will surely end soon.
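To see what doubling every two years means in practice, here is a small back-of-the-envelope sketch in Python. The starting point (the Intel 4004's roughly 2,300 transistors in 1971) is real, but the function itself is only an idealized model of Moore's law, not a measured trend:

```python
# Idealized Moore's law: transistor count doubles every two years,
# starting from the Intel 4004's ~2,300 transistors in 1971.
def transistors(year, base_year=1971, base_count=2300):
    """Estimate transistor count assuming one doubling every two years."""
    doublings = (year - base_year) / 2
    return int(base_count * 2 ** doublings)

print(transistors(1971))  # 2300
print(transistors(1991))  # 2355200 -- ten doublings later
```

Real processors only roughly follow this curve, which is why the law is a trend, not a physical rule.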

Generations

From time to time, developments in science and technology have also driven the evolution of the computer. Accordingly, computers are divided into the following generations.

First Generation (Electromechanical)

Calculators: Pascal's calculator, Arithmometer, Difference engine, Quevedo's analytical machines

Programmable devices: Jacquard loom, Analytical engine, IBM ASCC/Harvard Mark I, Harvard Mark II, IBM SSEC, Z1, Z2, Z3

Second Generation (Vacuum Tubes)

Calculators: Atanasoff–Berry Computer, IBM 604, UNIVAC 60, UNIVAC 120

Programmable devices: ENIAC, Manchester Baby, EDSAC, Mark I, Pegasus, Mercury, CSIRAC, EDVAC, UNIVAC I, IBM 701, IBM 702, IBM 650, Z22

Third Generation (discrete transistors and SSI, MSI, LSI integrated circuits)

Mainframes: IBM 7090, IBM 7080, IBM System/360, BUNCH

Minicomputers: Hewlett-Packard HP 2116A, IBM System/32, IBM System/36, LINC, PDP-8, PDP-11

Desktop computers: Programma 101, HP 9100

Fourth Generation (VLSI integrated circuits)

Minicomputers: VAX, IBM System i

4-bit microcomputers: Intel 4004, Intel 4040

8-bit microcomputers: Intel 8008, Intel 8080, Motorola 6800, Motorola 6809, MOS Technology 6502, Zilog Z80

16-bit microcomputers: Intel 8088, Zilog Z8000, WDC 65816/65802

32-bit microcomputers: Intel 80386, Pentium, Motorola 68000, ARM

64-bit microcomputers: Alpha, MIPS, PA-RISC, PowerPC, SPARC, x86-64, ARMv8-A

Embedded computers: Intel 8048, Intel 8051, ATmega, ARM

Personal computers: PC, laptop, personal digital assistant (PDA), portable computer, tablet PC, wearable computer

Theoretical/Experimental (future generations of computers)

Devices: quantum computer, chemical computer, DNA computing, optical computer, spintronics-based computer, wetware/organic computer

Products: D-Wave, IBM

Classification

The classification mainly falls under two categories.

Classification based on architecture

1 Analog

2 Digital

3 Hybrid

4 Harvard architecture

5 Von Neumann architecture

6 Reduced instruction set

Classification based on size

1 Mainframe

2 Supercomputer

3 Minicomputer

4 Microcomputer

5 Workstation

6 Personal computer

7 Laptop

8 Tablet

9 Smartphone

10 Single-board

What is hardware

Any kind of computer consists of hardware, software, and firmware. The physical circuits and devices, made of tangible material, are known as hardware.

Input devices (definition)

The devices used to provide, or input, data to the central processing unit are known as input devices.

Input devices can collect data automatically with the help of sensors, or they may need to be operated manually.

For example, a microwave or air conditioner (AC) collects temperature-related data automatically. According to your settings, the CPU processes this data to produce the desired output temperature.

Some common examples are:

1 Keyboard 

2 Digital camera

3 Mouse

4 Scanner

5 Joystick

6 Microphone

7 Trackball

8 Touchscreen

Output devices (definition)

Devices that convert CPU output into some action, or into a desired form of data, are considered output devices.

For example, a speaker converts CD/DVD file data into sound waves, and a printer prints on paper.

Here again, some common examples are:

1 Monitor

2 PC speaker

3 Printer

4 Projector

5 Coin-operated barrier

What is a CPU

[Image: a CPU and its internal parts]

The central processing unit (CPU) manages the computer's various components.

In a way, the CPU is the main part, consisting of input/output ports, memory, and buses for moving data and instructions.

The most important part inside the CPU is the processor. It reads and interprets (decodes) the program instructions, transforming them into control signals that activate other components.

Control units in advanced systems can also change the order of execution of some instructions to enhance performance.

What is a processor

A common feature of all processors is the program counter, a special group of cells (a register) that keeps track of the location in memory from which the next instruction is to be read.

The program counter is (conceptually) just another set of memory cells, so it can be modified by calculations done in the arithmetic logic unit.

Functioning of a processor

[Image: architecture of a computer processor]

The processor works as follows:

1 Read the code for the next instruction from the cell indicated by the program counter.

2 Decode the numerical code for the instruction into a set of commands or signals for each of the other systems.

3 Increment the program counter so it points to the next instruction.

4 Read whatever data the instruction requires from cells in memory (or possibly from an input device). The location of this required data is typically stored within the instruction code.

5 Provide the necessary data to the arithmetic logic unit or a register.

6 If the instruction requires the arithmetic logic unit or specialized hardware to complete, instruct the hardware to perform the requested operation.

7 Write the result from the arithmetic logic unit back to a memory location, a register, or possibly an output device.

8 Jump back to step (1).
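The steps above can be sketched as a toy fetch-decode-execute loop in Python. The opcodes (LOAD, ADD, STORE, HALT), the accumulator register, and the memory layout are all invented for illustration; real instruction sets differ:

```python
# A toy fetch-decode-execute loop for a made-up instruction set.
# Instructions and data share one "memory" list, Von Neumann style.
def run(memory):
    pc = 0       # program counter: address of the next instruction
    acc = 0      # accumulator register
    while True:
        opcode, operand = memory[pc]   # step 1: fetch the instruction
        pc += 1                        # step 3: advance the program counter
        if opcode == "LOAD":           # steps 2, 4, 5: decode, read data
            acc = memory[operand]
        elif opcode == "ADD":          # step 6: ALU operation
            acc += memory[operand]
        elif opcode == "STORE":        # step 7: write the result back
            memory[operand] = acc
        elif opcode == "HALT":
            return acc                 # stop instead of jumping to step 1

# Program: load cell 4, add cell 5, store into cell 6, then halt.
mem = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0), 7, 8, 0]
print(run(mem))   # 15
print(mem[6])     # 15, written back to memory by STORE
```

A real processor does the same loop in hardware, billions of times per second.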

Arithmetic logic unit (ALU)

The arithmetic logic unit is capable of performing two types of operations: arithmetic and logic.

Its arithmetic operations might include multiplication, division, trigonometric functions such as sine and cosine, and square roots.

Some ALUs can only operate on whole numbers (integers), while others use floating point to represent real numbers, albeit with limited precision.

However, any computer that can perform the simplest operations can be programmed to break more complex operations down into simple steps it can perform.

Therefore, a machine can be programmed to execute any arithmetic operation, although it will take more time if its arithmetic logic unit does not directly support the operation.

Basic function of Arithmetic logic unit

The instruction set an arithmetic logic unit can understand may be restricted to addition and subtraction.

Additionally, an arithmetic logic unit may compare numbers and return Boolean truth values (true or false) depending on whether one is equal to, greater than, or less than the other.

Logic operations involve Boolean logic: AND, OR, XOR, and NOT. These are useful both for building complex conditional statements and for processing Boolean logic.
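A minimal sketch of such an ALU can be written in a few lines of Python. The operation names here are invented for the example; a hardware ALU selects the operation with control signals rather than strings:

```python
# A toy ALU: one function selects among a few arithmetic, logic,
# and comparison operations, mirroring the text above.
def alu(op, a, b):
    if op == "ADD":
        return a + b        # arithmetic
    if op == "SUB":
        return a - b
    if op == "AND":
        return a & b        # bitwise Boolean logic
    if op == "OR":
        return a | b
    if op == "XOR":
        return a ^ b
    if op == "LT":
        return a < b        # comparison returning a Boolean truth value
    raise ValueError(f"unsupported operation: {op}")

print(alu("ADD", 6, 7))            # 13
print(alu("XOR", 0b1100, 0b1010))  # 6, i.e. 0b0110
print(alu("LT", 3, 5))             # True
```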

What is memory

A computer's memory can be described as a group of cells into which binary digits (0 or 1) can be stored and from which they can be read. Each cell has an "address" and can hold a single number.

The information stored in memory can represent practically anything: letters, numbers, even computer programs can be placed into memory with equal ease.

In almost all modern computers, memory is organized to store binary digits in groups of 64 bits.

The CPU has a set of memory cells known as registers that can be read and written much more quickly than main memory.
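The cell-and-address idea can be illustrated with a few lines of Python, using a list as a stand-in for memory. This is a deliberate simplification (real memory is hardware, not a list), but the addressing principle is the same:

```python
# Memory modeled as a flat list of numbered cells: the index is the
# address, and each cell holds one number.
memory = [0] * 8       # eight cells, addresses 0 through 7

memory[3] = 65         # store a number at address 3
print(memory[3])       # 65
print(chr(memory[3]))  # "A": the same stored value, read as a letter
```

The last line shows how the same bits can represent a number or a letter, depending only on interpretation.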

Memory comes in two most important varieties:

1 Random-access memory or RAM

2 Read-only memory or ROM

What is RAM

Random-access memory, or RAM, is the computer's primary memory. All running programs and software reside in RAM.

The processor primarily communicates with RAM: all instructions and data are fetched from RAM and written back to it.

Because the processor can read from any memory address of RAM directly, and write the same way, its speed, or data transfer rate, is higher than that of the other memory and storage units.

But RAM can only retain data until the power goes off. Modern flash drives behave similarly to RAM, although flash is non-volatile memory.

What is ROM

Read-only memory, or ROM, is preloaded with data and software programs that never change.

Consequently, the CPU can only read from it. ROM is typically used to store the computer's initial start-up instructions, known as the BIOS.

The software stored in ROM is frequently called firmware because it behaves more like hardware than software.

Flash memory blurs the distinction between ROM and RAM, as it retains its data when powered off but is at the same time rewritable.

What is firmware

[Image: a computer BIOS firmware IC]

Many embedded devices are equipped with firmware so that they function every time they boot.

Firmware is any software program stored in a chip (IC), such as the BIOS ROM, that cannot be modified without a defined procedure.

It is also referred to as "write once, read many". When we boot our computer, the BIOS starts functioning; first, it restores the basic input/output routine services.

What is software

A definition of the computer cannot be complete without a discussion of software.

Software refers to the parts of a computer that do not have a physical form: binary-encoded data or instructions, such as programs, data, and protocols, loaded into the hardware.

Software in a computer also includes libraries and related non-executable data, such as online documentation or digital media.

What is a program

A program is a set of instructions that can be given to the computer to execute.

Modern computers based on the Von Neumann architecture usually hold code written in a programming language.

In practical terms, a program may be just a few instructions or extend to many millions of instructions.

Applications such as word processors and web browsers are good examples.

A modern computer can execute billions of instructions per second and rarely makes a mistake over many years of operation.
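To make the idea concrete, here is a minimal example program in Python. It is only a sketch of the principle: a word processor works the same way, just with vastly more instructions:

```python
# A tiny program: an ordered list of instructions executed one after
# another, operating on some data.
prices = [19, 23, 8]     # the data the program works on
total = 0
for price in prices:     # this instruction repeats for each item
    total += price
print("Total:", total)   # Total: 50
```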

What is a bug

Large software packages consisting of several million lines of code may take teams of programmers years to write.

Due to the complexity of the task, they almost certainly contain errors. Errors in programs are called "bugs".

Bugs may not affect the usefulness of a program and may have only subtle effects.

But in some cases, they may cause the program or the whole system to "hang", becoming unresponsive.

The scientist Grace Hopper, developer of the first compiler, is credited with the first recorded use of the term "bug" in computing, after a dead moth was found shorting a relay in the Harvard Mark II in September 1947.

What is an operating system

An operating system (OS) is software that controls and manages the hardware and makes it possible for the user to interact with input/output devices and ports.

It manages applications, drivers, and all other kinds of software. Some popular operating systems are Microsoft Windows 10, Linux, Apple macOS and iOS, and Android (for mobile).

What is a programming language

Every digital computer works on binary digits, and writing any program in binary is a very tedious job.

Here the compiler and the assembler come into the picture.

We can write instructions in a human-readable programming language, which is later converted into machine language for execution.

Low level programming languages

Machine languages, and the assembly languages that represent them, are collectively termed low-level programming languages.

Assembly language is a good example of a low-level programming language. Its main disadvantage is that it is processor-specific.

For instance, an ARM architecture device such as a smartphone cannot understand the machine language of an x86 CPU.

High level programming languages

Writing lengthy programs in a low-level language, although it gives fine control, is often tough and error-prone.

Therefore, most practical applications are written in high-level programming languages. High-level languages are less tied to specific processors than assembly language.

They are more closely related to the language and structure of the problem to be solved by the program.

Ada, BASIC, C, C++, C#, COBOL, Fortran, PL/I, REXX, Java, Lisp, Pascal, Python, and Perl are a few examples of high-level programming languages.
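As a quick illustration of the gap between the two levels, Python's standard dis module shows the low-level bytecode instructions that the interpreter generates from high-level source (bytecode is a machine-independent intermediate form, not x86 or ARM machine code, but the contrast is the same):

```python
# High-level source says *what* to do; a compiler or interpreter
# translates it into low-level instructions. The dis module lets us
# inspect the bytecode produced for a tiny function.
import dis

def add(a, b):
    return a + b

print(add(2, 3))   # 5
dis.dis(add)       # prints low-level instructions (LOAD_FAST, an add op, ...)
```

The exact instruction names printed vary between Python versions, but the one-line function always expands into several lower-level steps.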

Future

There is ongoing research into building computers out of many promising new technologies, such as optical, DNA, neural, and quantum computers.

[Image: a D-Wave quantum computer]

Most computer systems are universal and are capable of calculating any computable function, limited only by their memory capacity and operating speed.

However, special designs of computer systems can provide very high performance for specific problems.

For instance, quantum computers can potentially break some present-day encryption algorithms (by quantum factoring) very quickly.

More Links:

1 Windows 10 on raspberry pi

2 Raspberry Pi

3 Mycroft AI

4 Alexa
