Threading Complexities

As I explained previously, threads are like workers with separate to-do lists that share the same tools and materials and perform tasks at the same time. There are a couple of tough situations that these coworkers will often find themselves in, however, and you need to make sure that their employer has the proper processes to provide solutions.  Yes, I am going to run this metaphor straight into the ground.

First, you need to know what “concurrent” means to a computer.  Dictionary.com defines it as “occurring or existing simultaneously or side by side”.  Your computer, however, defines it as “switching between the tasks fast enough so that nobody notices that they aren’t occurring simultaneously”.  So, when I say that your employees are working at the same time, I actually mean that they are working one at a time but switching between who is working fast enough that the boss doesn’t realize they’re taking breaks.  The CPU is constantly juggling which thread gets to execute, based on priority and (usually) the order in which they arrive.  This same kind of juggling is happening with all of the processes that are currently running.
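
You can watch this juggling happen with a short sketch.  I’ll use Python for the examples in this post (rather than the WPF/C# I’ll get to later) just to keep them small and runnable; the names here are mine, not from any particular API.  Two “workers” record their names into a shared list, and the order of the entries depends entirely on how the scheduler slices time between them:

```python
import threading

log = []  # shared list both workers write to

def worker(name, steps):
    # Each entry records who had the CPU; the operating system's
    # scheduler decides when each thread actually gets to run.
    for _ in range(steps):
        log.append(name)

t1 = threading.Thread(target=worker, args=("A", 5))
t2 = threading.Thread(target=worker, args=("B", 5))
t1.start(); t2.start()
t1.join(); t2.join()

# Both workers finished all of their tasks, but the interleaving of
# A's and B's entries varies from run to run.
print(log)
```

Run it a few times and you may see the A’s and B’s interleave differently – that’s the scheduler doing its job.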

If you don’t notice, why should you care?  Well, there is a delay when switching between threads (or processes) called a context switch.  During a context switch, the CPU must save the state of the currently active thread, choose the next thread to give time to, restore that thread’s state, and continue its execution.  What this boils down to is that there is a cost associated with multithreading.  You need to be aware of this cost; otherwise, you may find your application running slower with multiple threads.  This happens when your threads are unbalanced – when they switch back and forth so much that the time spent on all of the context switching is greater than the time you save running two tasks simultaneously!

I ran into an example of this recently when attempting to improve the responsiveness of an application.  I was trying to do things on a separate thread to keep the UI smooth for the user; however, I had to perform a task on a set of elements on the original thread (more on that later).  This forced me to call back to the original thread so often that the experience ended up being even worse.  I’ll explain in more detail how that works in WPF in a future post, but here’s the basic idea in pseudocode:

create a new thread to perform a CPU-intensive task; run:
   foreach object in somelist
      call back to original thread with the following task:
         perform an action on object

Because I went back to the original thread so often and so quickly, the CPU was spending most of its time context switching.  In order to fix it, I added one line:

create a new thread to perform a CPU-intensive task; run:
   foreach object in somelist
      call back to original thread with the following task:
         perform an action on object
      sleep for x amount of time

I added a sleep command in the separate thread.  This gave the main thread time to perform the task on the object, redraw, and settle in a little before I gave it another task.  This added a visual delay to the action on-screen (since there is x amount of time between each object being acted upon), but that was acceptable in this case to give the user a smooth experience.
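Here’s a runnable Python sketch of that pattern (a stand-in for the WPF dispatcher mechanics I’ll cover later – the queue and names are my own, not a real framework API).  A plain queue plays the role of the main thread’s message loop, and the worker sleeps between callbacks so the main thread has time to drain it:

```python
import queue
import threading
import time

somelist = list(range(10))
dispatch_queue = queue.Queue()  # stands in for the UI thread's message queue
results = []

def worker():
    # Background thread: for each object, post a small task back to
    # the "main" thread, then yield for a moment so the main thread
    # can catch up (the one-line fix from above).
    for obj in somelist:
        dispatch_queue.put(lambda o=obj: results.append(o * 2))
        time.sleep(0.01)  # give the main thread time to settle in
    dispatch_queue.put(None)  # sentinel: no more work coming

t = threading.Thread(target=worker)
t.start()

# The "main" thread drains the queue, much like a UI message loop.
while True:
    task = dispatch_queue.get()
    if task is None:
        break
    task()

t.join()
print(results)  # every object was handled by the main thread, in order
```

Without the sleep, the worker floods the queue and the two threads spend their time bouncing control back and forth; with it, each callback gets room to finish.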

Second, there is a key part of this metaphor that you have to consider: each worker is sharing resources.  This is good – it means that each thread can access the data it needs while executing; however, it comes with a caveat: you have to ensure no thread is changing that data while another thread is trying to access it.

Imagine if we have two workers sharing a drill.  Worker One is going to use it to screw a shelf to a wall, while Worker Two is going to drill a hole for the next shelf.  Now, imagine that Worker One has placed the screw where he wants it and is about to pull the trigger on the drill when a context switch occurs.  Worker One freezes, and Worker Two grabs the drill.  He pulls out the Phillips head bit that Worker One was using and replaces it with a drill bit.  Like an episode of Seinfeld, the worst thing happens at the worst possible time: another context switch.  Worker One takes the drill back and uses the drill bit on his screw, damaging the screw and quite possibly his hand.  This is called a race condition: multiple independent algorithms depend on a single shared resource, making the timing of each access to that resource critical to the success of each algorithm.
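You can force this exact worst-case interleaving in Python by using events as artificial “context switches” (the drill scenario and names are just my illustration): Worker One looks at the shared bit, gets paused at the worst possible moment, and then acts on stale information after Worker Two has swapped it:

```python
import threading

drill_bit = {"type": "phillips"}   # the shared resource
one_checked = threading.Event()    # Worker One has looked at the bit
two_swapped = threading.Event()    # Worker Two has changed it
outcome = {}

def worker_one():
    seen = drill_bit["type"]  # One believes the Phillips bit is in
    one_checked.set()
    two_swapped.wait()        # forced "context switch": Two runs now
    # One resumes, still acting on what it saw earlier...
    outcome["assumed"] = seen
    outcome["actual"] = drill_bit["type"]

def worker_two():
    one_checked.wait()
    drill_bit["type"] = "drill"  # swaps the bit while One is paused
    two_swapped.set()

t1 = threading.Thread(target=worker_one)
t2 = threading.Thread(target=worker_two)
t1.start(); t2.start()
t1.join(); t2.join()

print(outcome)  # One assumed "phillips" but the bit is now "drill"
```

In real code the interleaving isn’t forced like this – it just happens occasionally, which is exactly what makes race conditions so painful to reproduce and debug.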

This means you have to be wary of using your global variables or the members of your class in a thread.  You must be certain that you aren’t changing something that your other thread is depending on.  The common way to handle this is by using mutual exclusion (or mutex) algorithms.  The basic concept is that, when using a variable that is common to other threads, you must ensure that the variable is not currently in use by another thread, often via queues or access flags.  Take a look at the previous link for a list of well-known algorithms with examples.  There is a wealth of knowledge related to solving race conditions, and I’m not even going to attempt to address it all.
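In Python, the simplest mutual exclusion tool is threading.Lock.  Here’s a minimal sketch: four threads increment a shared counter, and the lock guarantees that each read-modify-write happens one thread at a time, so no increment is lost:

```python
import threading

counter = 0
counter_lock = threading.Lock()  # the mutex guarding the shared variable

def increment(n):
    global counter
    for _ in range(n):
        # Only one thread at a time may run this read-modify-write.
        # Without the lock, two threads could read the same old value
        # and one of the increments would be silently lost.
        with counter_lock:
            counter += 1

threads = [threading.Thread(target=increment, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 40000 – every increment survived
```

The `with counter_lock:` block is the “only one worker may hold the drill” rule from the metaphor, enforced in code.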

If you take a close look at the pseudocode above, you’ll notice that I’m using a single thread to perform all actions on the set of objects, thus avoiding race conditions (as only one thread accesses them at a time).  This isn’t by my own design, however; this is a restriction given to UI elements in most, if not all, languages.  Because of their nature, UI elements aren’t thread safe.  They can be accessed by you, the graphics engine, or even the user.  Because of the amount of overhead required to allow UI elements to work in threads, they are restricted to being accessed only by the thread that created them.  Above, since I cannot access the UI elements in my worker thread, I have to call back to the thread that created the object and tell it to do the modifications I need.  This makes moving work to separate threads in a UI-heavy application complicated at times, but you get used to it pretty quickly.
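Here’s a toy Python sketch of that restriction (FakeUiElement and its ownership check are made up for illustration – WPF’s real mechanism is the Dispatcher, which I’ll cover in that future post).  The element remembers which thread created it, refuses access from any other thread, and so the worker has to marshal its change back through a queue that the creating thread drains:

```python
import queue
import threading

class FakeUiElement:
    """Toy stand-in for a UI element with thread affinity."""

    def __init__(self):
        self._owner = threading.get_ident()  # thread that created it

    def set_text(self, text):
        # Mimics the check UI frameworks perform: only the creating
        # thread is allowed to touch the element.
        if threading.get_ident() != self._owner:
            raise RuntimeError("accessed from the wrong thread")
        self.text = text

element = FakeUiElement()  # created on the main thread
calls = queue.Queue()

def worker():
    # Direct access from the worker thread fails...
    try:
        element.set_text("from worker")
    except RuntimeError:
        # ...so we marshal the call back to the creating thread.
        calls.put(lambda: element.set_text("from main, on behalf of worker"))
    calls.put(None)  # sentinel: worker is done

t = threading.Thread(target=worker)
t.start()
while True:  # the creating thread drains the queue, like a message loop
    task = calls.get()
    if task is None:
        break
    task()
t.join()
print(element.text)
```

The worker never touches the element directly; it only hands the main thread a task to perform, which is exactly the call-back pattern from my pseudocode above.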

So, it doesn’t sound nearly as simple as it did in my first post; however, don’t let this scare you away from using multithreading.  Once you get the hang of it, it is really quite simple to use.  Besides, you’ll need to be familiar with it before digging your hands into any serious user-oriented application.
