
Chapter 2

Threads

At a Glance

Instructor’s Manual Table of Contents

Chapter Overview

Chapter Principles and Objectives

Lecture Notes

Quick Quizzes

Teaching Tips

Further Readings or Resources

Discussion Questions

Key Terms

Chapter Overview

Chapter 2 describes threads and their role in the operating system in detail. Threads are the mechanism an operating system uses to support the concurrent execution of multiple programs. Each individual sequence of instructions is a separate thread, and the threads appear to execute simultaneously. There are several reasons for multi-threaded programming, including maximizing a system's responsiveness and the utilization of its resources.

When there are more threads than processors, the operating system must divide the processor's cycles among the various threads to simulate their concurrent execution. It is the operating system's responsibility to organize this switching between threads.

Switching between threads is a complex operation: the operating system must freeze the currently running thread, saving its context, and reactivate another thread by restoring its saved context. Switching between threads can either be cooperative, in which case the active thread must explicitly yield control to another thread, or preemptive, in which case the operating system interrupts threads at intervals and gives other threads a chance to run.

The chapter concludes with a discussion on security issues.

Chapter Principles and Objectives

Principles / Objectives
Threads are the primary unit of concurrent execution. /
  • Understand what threads, programs, and processes are and how they differ.

The operating system is responsible for simultaneous execution of multiple threads. /
  • Explain that an operating system schedules the switching between threads.
  • Demonstrate the information that must be saved and restored to switch between threads.
  • Discuss the differences between cooperative and preemptive multitasking.

Most programming languages support multi-threaded programming. /
  • Explain the abstract notion of threaded programming.
  • Discuss the APIs for threading provided by Java, POSIX, and Windows.
  • Discuss the advantages of multi-threaded programming in terms of resource usage, responsiveness, and modularization of code.

Security is an important concern for multi-threaded programming. /
  • Identify causes of security issues.

Lecture Notes

Introduction

A thread is a single series of computational steps. A program has at least one thread and may have many more. Each program is also a process, which is an operating system-level container for the one or more threads started by the program.

The simplest programs have only a single sequence of instructions and thus only a single thread. Multiple threads can come into existence either because the user runs multiple programs at once or because a single, more complex program is written to be multi-threaded. If a computer has multiple processors, the operating system can obviously execute multiple threads simultaneously, or concurrently. However, whenever there are more threads than processors (and this is almost always the case), the operating system must switch between the instructions of the various threads in order to simulate the effect of simultaneous execution.

Distinction Between a Program and a Thread / Program – Contains a series of instructions
Thread – Contains the computational steps used to execute those instructions

Examples of Multi-Threaded Programs

Multi-threaded programming is often a convenient way to develop applications. Nearly every programming language provides a mechanism for threaded programming. The Java programming language provides a Thread class, which can execute anything that implements the Runnable interface. UNIX systems provide a standardized POSIX API for the C programming language that supports similar thread creation and execution. The Windows operating system also has a similar C API for threading.

Whenever a program begins, its instructions are executed in a single thread. The program can choose to start, or spawn, additional threads to accomplish its purpose. Most commonly, multiple threads are used to implement programs that have multiple tasks to accomplish at the same time. One example is a Web server that needs to serve multiple client requests simultaneously. Another is a video game, which needs to simultaneously run the AI for the characters in the game, handle input from the player, and draw the game's visuals to the screen. In both of these examples, the operation of the program can be accomplished by dedicating a different thread to each task.
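
A minimal Java sketch can make the idea of spawning a thread concrete; the class name and printed messages below are illustrative only and are not taken from the textbook's figures:

public class SpawnDemo {
    public static void main(String[] args) {
        // The parent (main) thread spawns a child thread; the Runnable
        // supplies the instructions that the child executes concurrently.
        Thread child = new Thread(new Runnable() {
            public void run() {
                System.out.println("Child thread running.");
            }
        });
        child.start(); // begin concurrent execution of the child
        System.out.println("Parent thread continues without waiting.");
    }
}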

Thread APIs / POSIX – Available on all versions of UNIX (Linux, BSD, Mac OS X, etc.)
Java – The Java thread and runnable classes
Windows – The Windows native thread API (Win32 API)
Most other languages including Python, C #, Lisp, and Scheme

Quick Quiz

  1. How does an operating system allow multiple threads to run simultaneously?
    ANSWER: The operating system interleaves the operation of multiple threads on a single processor to simulate the effect of all threads running simultaneously.
  2. What is the POSIX API?
    ANSWER: It is an interface for threaded programming in C that is portable across different versions of the UNIX operating system.
  3. How do multiple processors help threading?
    ANSWER: When there are multiple processors, the operating system can actually run multiple threads simultaneously.

Reasons for Using Concurrent Threads

There are two primary reasons for using threads in a program. First, they can make a program that responds to multiple input sources more responsive. For example, if a single-threaded Web server is servicing a slow client, then all other clients will be slowed as well. If the Web server dedicated a separate thread to each client, the slow client would not affect the others.

Second, a multi-threaded program can make more efficient use of a computer's hardware resources. The most obvious example is a multi-processor computer, but the benefit also exists on a single-processor system. Many pieces of hardware operate much more slowly than the processor; for example, reading data from a hard drive or from a network connection is very slow. In such situations, a multi-threaded program lets the processor run a different thread while one thread is waiting for data from a piece of hardware.

Multi-threaded programming can also make the implementation of a complex system, such as a Web server, significantly simpler by modularizing its operation. Rather than focusing on servicing all clients with a single thread, a multi-threaded Web server can dedicate a single thread to each user. This means that the programmer implementing the instructions run by the thread only needs to write a program that services a single request. This is a much simpler program to write.
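
As a concrete illustration of this thread-per-client structure, an instructor might show a sketch along the following lines; the class name, port number, and the empty handleClient body are placeholders rather than code from the text:

import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;

public class ThreadPerClientServer {
    public static void main(String[] args) throws IOException {
        ServerSocket listener = new ServerSocket(8080); // illustrative port
        while (true) {
            final Socket client = listener.accept(); // wait for the next client
            // Dedicate a fresh thread to this one client; the code in run()
            // only has to service a single request at a time.
            new Thread(new Runnable() {
                public void run() {
                    handleClient(client);
                }
            }).start();
        }
    }

    private static void handleClient(Socket client) {
        // A real server would read the request and write a response here.
        try {
            client.close();
        } catch (IOException e) {
            // ignore errors while closing in this sketch
        }
    }
}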

Benefits of Threaded Programming / Responsiveness – A slow input source does not slow responses to the others
Efficiency – Lets one task use the processor while another task is waiting for interactions with a slow piece of hardware
Modularity – Makes it easier to write a program that accomplishes numerous tasks simultaneously by encapsulating each task in a separate list of instructions

Quick Quiz

  1. How does multi-threaded programming aid in the responsiveness of a program?
    ANSWER: If an individual thread handles each client of a program, the operation of a slow client does not hinder the speed of responses to the others.
  2. How does multi-threaded programming aid in the efficient utilization of hardware?
    ANSWER: Multi-threaded programming makes maximal use of multiple processors. It also ensures that the processor has something to do while a program is waiting for a piece of slow hardware, such as a hard drive or network connection.
  3. How does multi-threaded programming make programming easier?
    ANSWER: Multi-threaded programming can modularize a complex program into simpler computations, each handled by a separate thread (for example, one thread per client request).

Switching Between Threads

Switching between threads is a complex operation for the operating system. There is a great deal of state, or context, that must be saved for the current thread in order to move it off the processor. The context of the thread that is going to run next must also be restored.

The context includes the location in memory of the stack that the thread is using, the contents of the processor's registers, and the location of the next instruction that the thread was going to execute. The operating system saves this information into the task/thread control block (TCB). To activate another thread, the information in that thread's TCB is restored by loading its saved stack pointer into the processor's stack pointer register, restoring the contents of the processor's registers, and setting the program counter to the appropriate instruction. Execution is then handed over to the thread, which continues as if it had never been interrupted. The operating system generally simplifies this process by pushing the program counter and the registers onto the thread's stack prior to switching to a new thread.
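
The saved context can be pictured with a purely conceptual sketch of a thread control block; the Java class below is illustrative only (the field names are invented, and a real operating system keeps this record as a kernel data structure and manipulates it in assembly language, not Java):

// Conceptual sketch only: the per-thread record (TCB) an operating system
// might keep so that a suspended thread can later be resumed.
class ThreadControlBlock {
    long savedStackPointer;       // top of the thread's stack when it was suspended
    long savedInstructionPointer; // next instruction the thread was going to execute
    long[] savedRegisters;        // contents of the processor's registers
    // Switching threads means copying the processor's current state into the
    // outgoing thread's TCB and loading the incoming thread's saved state
    // back into the processor.
}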

There is an important distinction between thread switching within a single program and process switching between different programs. In the latter case, the operating system must do much more complicated things since the two programs have different permissions and virtual address spaces.

Thread Context / Stack Pointer –The current state of functions that are executing
Instruction Pointer – The next command for the processor to execute
Processor Register –The current state of the CPU

Preemptive Multitasking

The operating system switches between threads in one of two ways: through cooperative multitasking or through preemptive multitasking.

If the operating system uses cooperative multitasking, then a thread must proactively call a yield function which indicates to the operating system that the thread is ready to give up control of the processor to another task. Cooperative multitasking can be problematic if a malicious or poorly programmed thread monopolizes access to the processor. In such cases, other threads do not get a chance to run.

Instead of cooperative multitasking, most modern operating systems use preemptive multitasking. In preemptive multitasking, the operating system interrupts threads at a given interval and gives other threads a chance to run. This is accomplished by setting a hardware timer to interrupt the execution of a thread after a specified interval. The interrupt causes a jump to the operating system's interrupt handler, which decides which thread to execute next. Even under preemptive multitasking, a thread may still voluntarily yield and give up control of the processor when it knows that is the appropriate thing to do.
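
If a quick demonstration of voluntary yielding is wanted, Java's Thread.yield() can be used. The class name and loop bound below are arbitrary; yield is only a hint to the scheduler, and under preemptive multitasking the two threads would be interleaved even without it:

public class YieldDemo {
    public static void main(String[] args) {
        Runnable worker = new Runnable() {
            public void run() {
                for (int i = 0; i < 5; i++) {
                    System.out.println(Thread.currentThread().getName() + " step " + i);
                    Thread.yield(); // voluntarily offer to give up the processor
                }
            }
        };
        new Thread(worker, "A").start();
        new Thread(worker, "B").start();
    }
}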

Security and Threads

There are two major security issues with threads. First, a thread may monopolize hardware resources, such as the processor, network card, or other pieces of hardware. Second, there may be unwanted interaction between threads. For example, one thread may read data in storage that is considered confidential by a second thread. The solutions to these security problems will be addressed in future chapters.

Quick Quiz

  1. What information must the operating system save and restore when switching between threads?
    ANSWER: The operating system saves the stack pointer, the instruction pointer, and the contents of the processor's registers.
  2. What is the difference between cooperative and preemptive multitasking?
    ANSWER: In cooperative multitasking, thread switching only occurs when a thread yields control back to the operating system. In preemptive multitasking, the operating system interrupts the execution of threads.
  3. Why is preemptive multitasking more secure?
    ANSWER: A malicious or poorly programmed thread cannot prevent other threads from executing.

Teaching Tips

  • Most UNIX systems, such as Linux, contain extensive manual pages documenting most functionality. These manual pages are accessed with the “man” command. Try demonstrating “man pthread_create” in class to show students how to obtain more information about POSIX threading.
  • If your students do not have a strong Java background, they may find the Simple2Threads.java program in Figure 2.3 difficult to understand. You can help them out by providing the following alternative version of the program:

public class Simple2Threads {

    public static void main(String args[]) {
        // Spawn a child thread whose instructions come from MyRunnable's run method.
        Thread childThread = new Thread(new MyRunnable());
        childThread.start();

        // Meanwhile, the parent (main) thread keeps running and sleeps 5 seconds.
        sleep(5000);
        System.out.println("Parent is done sleeping 5 seconds.");
    }

    public static void sleep(int milliseconds) {
        try {
            Thread.sleep(milliseconds);
        } catch (InterruptedException e) {
            // ignore this exception; it won't happen anyhow
        }
    }
}

class MyRunnable implements Runnable {

    public void run() {
        Simple2Threads.sleep(3000);
        System.out.println("Child is done sleeping 3 seconds.");
    }
}

Further Readings or Resources

  • A tutorial on POSIX threads:
  • A Sun Microsystems tutorial on Java threads:

Discussion Questions

Some interesting topics of discussion in this chapter include the following:

Are your students convinced that multi-threaded programming can be easier in some circumstances? What do they see as potential pitfalls? Does the amount of actual experience with multi-threaded programming affect their opinion?

Do any of your students remember the days of cooperative multitasking operating systems? If so, how does their experience with those operating systems compare to their current experience with modern operating systems? What is better? Is anything worse?

Key Terms

Cooperative multitasking – Form of multi-threading where programs must proactively give up control of the processor

POSIX – Portable thread API for UNIX operating systems

Preemptive multitasking – Form of multi-threading where the operating system interrupts threads and allows other threads to run

Process – Operating system-level container for the one or more threads created by a program

Thread – A series of computational steps strung together

Thread context – Information required by a thread to be reactivated after being frozen

Thread switching – Process by which an operating system supports concurrent threads