Classification of Microcontrollers:
Microcontrollers are classified based on different aspects such as architecture, programming language used, bus width, memory, instruction set, etc. Here we will discuss only three of these classifications.
- Bus width
- Microcontrollers are classified according to the size of the internal bus, e.g. 4-bit, 8-bit, 16-bit, 32-bit, etc. When we say we are using a 16-bit microcontroller, it means the controller moves and processes data in chunks of 16 bits, or 2 bytes, at a time. (If you need to understand what bits and bytes are, read about binary digits in the digital number system tutorial.)
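To make the bus-width idea concrete, here is a minimal C sketch (the variable name and value are made up for illustration) showing that a 16-bit quantity is really two bytes, which an 8-bit controller must handle one byte at a time while a 16-bit controller handles it in a single step:

```c
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    /* A 16-bit quantity: an 8-bit MCU moves and processes it one byte
     * at a time, while a 16-bit MCU can handle it in one operation. */
    uint16_t reading = 0x1234;

    uint8_t low_byte  = (uint8_t)(reading & 0xFF);  /* 0x34, lower 8 bits */
    uint8_t high_byte = (uint8_t)(reading >> 8);    /* 0x12, upper 8 bits */

    printf("high byte: 0x%02X, low byte: 0x%02X\n", high_byte, low_byte);
    return 0;
}
```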
- Architecture
- Architecture is the conceptual design and operational structure of a system. There are two main architectures considered in microcontroller design: Harvard and von Neumann. They differ in the way program instructions and data are stored and accessed.
- Harvard Architecture: In this architecture, physically separate memories with their own dedicated buses are used for instructions and data. Instruction fetches and data accesses can therefore happen in parallel, which allows faster execution.
- von Neumann: In contrast to the Harvard architecture, the controller can either fetch an instruction or read/write data from/to memory, but not both at the same time, because instructions and data share the same bus.
As you may have already guessed, the Harvard architecture has the advantage here, and most microcontrollers use it.
- Instruction Set
- Another type of classification is based on the instruction set. Computers operate on hardware instructions, also called the “command set”, which is the basic set of commands a microcontroller or microprocessor understands. These commands are invoked by software to initiate various actions on the CPU, such as ADD, SUBTRACT and LOAD.
There are two basic design philosophies dominating the market today: CISC and RISC.
- Pronounced “sisk”, CISC stands for “Complex Instruction Set Computing”. It uses complex instructions which are microcoded in ROM (Read Only Memory) and called by the software whenever required. Most computer CPUs have used this architecture. Basically it is a set of large, complex instructions which operate directly on memory, with the microcode acting as a translation layer between the instructions and the electronics. In the early days of computing, CISC was a huge leap in performance and was also less expensive, because the instructions were microcoded rather than built in hardware. However, a CPU could require dozens of memory cycles to execute a single instruction.
- With each new generation of computers, chip hardware became more and more complex, and microcoding each and every instruction started to cause trouble. This is when designers and researchers thought of using a small set of simplified instructions and reducing the number of cycles required to execute an instruction. This new design was known as “Reduced Instruction Set Computing”, or RISC (pronounced “risk”). RISC architectures were faster, required simpler hardware, and were less expensive than CISC (simple instructions use fewer transistors and are easier to design). Most CPUs and microcontrollers today use a RISC architecture, while CISC is fighting for survival.
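As a rough illustration of the difference, here is a hypothetical sketch of how the same line of C might translate to instructions on the two kinds of machines. The instruction sequences in the comments are stylized pseudo-assembly, not actual compiler output for any particular chip:

```c
/* One C statement: add a value held in memory to a running total. */
int accumulate(int total, const int *value)
{
    /* CISC style (a single complex instruction can use a memory operand):
     *     ADD  reg, [value]         ; fetch from memory and add in one step
     *
     * RISC style (load/store design: only loads and stores touch memory):
     *     LOAD reg2, [value]        ; fetch the operand from memory
     *     ADD  reg1, reg1, reg2     ; simple register-to-register add
     *
     * The RISC version needs more instructions, but each one is simple,
     * executes quickly, and requires far less decoding hardware. */
    return total + *value;
}
```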
So which design wins? There is ongoing disagreement among experts as to which design is better: some say CISC is too complex, others say RISC pushes the complexity into software. But for us it hardly matters; whichever design serves our purpose is good enough.