My Summary and Most Interesting Topics in the Advanced Computer Organization 2 Master's Course
1. Introduction
This lecture was intended for master's students in the 2015 Fall Semester at the Graduate School of Science and Technology, Department of Computer Science and Electrical Engineering, Kumamoto University. It was delivered by Associate Professor Morihiro Kuga of the Computer Architecture Lab and attended by 12 international students. The lecture begins with the fundamental topics of representation of information, logic circuits, hardware description language (HDL) design and Verilog HDL, 24-hour digital clock design, and then microprocessor behavior and its design. After the fundamentals, it covers hardware algorithms for arithmetic units, hardware sorters, array processors and systolic arrays, functional memory and content-addressable memory, and interconnection networks.

This is my own summary of the lecture on how computer hardware works; it may give you a picture of what is inside the lecture, or it may not, given the difficulty of the material. I wrote this summary personally and license it under a customized Creative Commons Attribution-ShareAlike (CC-BY-SA) license: anyone may share, reuse, republish, and sell it, on the condition that they credit me as the original author and mention the open, original version available here. The source codes on GitHub are licensed under the GNU General Public License 3.0, though they are just tutorial codes and nothing more.
2. The Fundamentals
The very first part of the lecture is about what mathematical form information takes and how a computer can receive, process, and output that information. In mathematics, information in the real world can be described as an analog signal; for example, a sound is a 1D analog signal and an image is a 2D analog signal. An analog signal ideally has almost infinite precision, with a distinct value at every point. A computer, however, does not process analog signals directly. The signal is perceived by a sensor, and not all of the information is captured: the sensor takes samples of the information, producing what is known as a discrete signal. Today's computers are electrical, and their main representative values are "1" and "0"; a discrete signal represented with ones and zeros is called a digital signal. To convert an analog signal into a digital signal, it mainly goes through sampling and quantization…
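To make sampling and quantization a little more concrete, here is a minimal sketch in Python. It is not from the lecture or its GitHub codes: the function names, the 5 Hz sine tone, the 100 Hz sampling rate, and the 3-bit resolution are my own illustrative choices. The idea is simply to evaluate an "analog" signal at evenly spaced instants (sampling) and then map each sample to one of a small number of binary-representable levels (quantization).

```python
import numpy as np

def sample(signal_fn, duration_s, sampling_rate_hz):
    """Sampling: evaluate the continuous signal only at evenly spaced instants."""
    t = np.arange(0, duration_s, 1.0 / sampling_rate_hz)
    return t, signal_fn(t)

def quantize(samples, num_bits):
    """Quantization: map each real-valued sample to one of 2**num_bits levels."""
    levels = 2 ** num_bits
    lo, hi = samples.min(), samples.max()
    step = (hi - lo) / (levels - 1)
    codes = np.round((samples - lo) / step).astype(int)  # integer codes, each fits in num_bits binary digits
    return codes, lo + codes * step                      # codes and the reconstructed (approximate) values

if __name__ == "__main__":
    # Hypothetical example values, not taken from the lecture.
    analog = lambda t: np.sin(2 * np.pi * 5 * t)          # "analog" 5 Hz tone (a 1D signal)
    t, discrete = sample(analog, duration_s=0.2, sampling_rate_hz=100)
    codes, digital = quantize(discrete, num_bits=3)
    print(codes[:10])    # e.g. [4 5 6 6 7 7 7 6 6 5] -- 3-bit codes, i.e. ones and zeros
    print(digital[:10])  # quantized approximation of the original samples
```

The point of the sketch is that the discrete signal (the sampled values) still has real-valued amplitudes, and only after quantization does it become a digital signal that can be stored as ones and zeros; the lost precision between the two printed arrays is the quantization error.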