The processing unit and the control unit form the central processing unit (CPU). Because the design had only one processing unit, the programs written for that kind of design were sequential, and most programming languages were created to be used in a sequential manner, a practice still found in today's programming languages, including C#. The biggest disadvantage of such applications is that whenever your application had to wait for something to happen, the whole system would freeze, creating a very unpleasant user experience. Threads were introduced to minimize this kind of problem.

Working with Threads

Less than 20 years ago, most consumer operating systems (OS) could run only a single process with one thread of execution. (A thread is the smallest unit of execution that can be independently scheduled by the OS.) In a single-threaded OS, the computer runs only one application at a time. There was normally a command-line interpreter that interpreted the commands entered by the user. When a command was entered, the interpreter transferred control of the processor to the application the command referred to. When the application was done, it transferred control back to the interpreter. If you think about it, this made a lot of sense, considering that you had only one thread. The biggest problem was that the user could feel the computer freeze when an application did one of the following two things:

- Performed intensive calculations
- Fetched some data from the I/O

When your application had to do intensive calculations, there wasn't much you could do except either use a quicker computer to decrease the time the calculation took, or split the problem into smaller ones and distribute them across several computers; both are expensive options, and sometimes the calculations might even take longer.

When your application fetched data from the I/O, your CPU was waiting for the data to arrive, doing no
processing in the meantime. To improve the responsiveness of your application, the notion of multithreading was introduced. In a multithreaded application, one thread would spawn another thread to do the fetching and waiting while the parent thread continued to do other work. When the data was needed, the parent thread blocked, waiting for the spawned thread to finish its work. This pattern is known as the fork-join pattern.

ADVICE FROM THE EXPERTS: Understanding Threads
Although threads are not explicitly required for the exam, it is the authors' firm belief that a good understanding of how threads work in Windows can help you become a better programmer and understand this chapter. If you are already familiar with this subject, you can skip this section and jump to the next one, "Spawning New Threads by Using ThreadPool," after going through the code in this section.

CHAPTER 7: MULTITHREADING AND ASYNCHRONOUS PROCESSING

Having one processor meant that only one thread could run at any given time. This can be achieved in two different ways:

- Collaboratively: Every thread must give up control so that another thread can execute.
- Preemptively: The operating system has a component called the scheduler that makes sure no thread monopolizes the CPU. This is how Windows is implemented.

The Windows scheduler works as follows:

1. Every thread gets a priority assigned when it is created. A created thread is not automatically started; you have to do that.
2. When a thread is started, it is added to a queue with all the threads that can be run.
3. The scheduler takes the thread with the highest priority from the queue and starts to run it.
4. If several threads have the same priority, the scheduler schedules them in circular order (round robin).
5. When the allotted time is up, the scheduler suspends the thread, adding it at the end of the queue. After that, it picks up a new thread to run.
6. If there is no other thread with a higher priority than the one just interrupted, that thread executes again.
7. When a thread is blocked because it has to wait for an I/O operation, or for some other reason such as locking (discussed later in this chapter in the "Synchronizing Resources" section), the thread is removed from the queue and another thread is scheduled to run.
8. When the reason for blocking ends, the thread is added back to the queue to get a chance to run.
9. When a thread finishes its work, the scheduler can pick another thread to run.

There is one thread, called the System idle process, that does nothing except keep the processor busy when there is no other thread to run. This process of time slicing creates the impression that your operating system can run several applications at the same time, including answering the user interface (UI) commands you send, such as moving the mouse or moving windows around.

In .NET, all applications have several threads.
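The fork-join pattern and the thread life cycle described above can be sketched in C# with the `System.Threading.Thread` class. This is a minimal sketch, not the book's own listing; the helper name `FetchData` and the simulated I/O delay are assumptions for illustration. It shows a thread being created (with a priority), started explicitly, and joined by its parent.

```csharp
using System;
using System.Threading;

class ForkJoinSketch
{
    // Holds the result produced by the worker thread (hypothetical field for this sketch).
    static string _fetchedData;

    // Stands in for a slow I/O operation; the Sleep simulates waiting for data.
    static void FetchData()
    {
        Thread.Sleep(100);
        _fetchedData = "data from I/O";
    }

    static void Main()
    {
        // Fork: spawn a worker thread to do the fetching and waiting.
        Thread worker = new Thread(FetchData);
        worker.Priority = ThreadPriority.Normal; // every thread has a priority when created
        worker.Start();                          // a created thread is not started automatically

        // The parent thread continues doing other work in the meantime.
        Console.WriteLine("Parent thread keeps working...");

        // Join: the parent blocks until the spawned thread finishes its work.
        worker.Join();
        Console.WriteLine(_fetchedData);
    }
}
```

While the worker sleeps inside `FetchData`, the scheduler removes it from the run queue (it is blocked), so the parent thread gets the processor and prints its message first; `Join` then blocks the parent until the worker completes.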