iOS Concurrency: Getting Started with NSOperation and Dispatch Queues

Concurrency is often considered a monster of a subject in iOS development, a dangerous area that many developers try their best to avoid. Rumor has it that multithreaded code should be avoided as much as possible. I agree that concurrency can be dangerous, but only if you don't understand it well; it is dangerous only because of the unknown. Think of how many risky activities people take on in their lives: once they master them, they become a piece of cake. Concurrency is a double-edged sword that you should master and learn how to use. It helps you write efficient, fast, and responsive apps, but misusing it will ruin your app with no mercy. That's why, before writing any concurrent code, you should first ask why you need concurrency and which API is the right one for the problem. iOS offers several APIs you can use. In this tutorial we will talk about two of the most commonly used ones: NSOperation and dispatch queues.

Demo project: https://www.dropbox.com/s/5spx0l0gnjmqx8l/ConcurrencyDemoStarter.zip?dl=0

Why Do We Need Concurrency?

I know you're a good developer with experience in iOS. No matter what kind of apps you build, however, you will need to understand concurrency to make your apps responsive and fast. Here is a summary of the advantages of learning and using concurrency:
  • Utilize iOS devices' hardware: All iOS devices now have multi-core processors that allow developers to execute multiple tasks in parallel. You should take advantage of this and get the most out of the hardware.
  • Better user experience: You have probably written code to call web services, handle I/O, or perform other heavy tasks. As you know, doing these kinds of operations on the UI thread freezes the app, making it unresponsive. Once a user hits that situation, the first thing he or she will do is kill or close your app without a second thought. With concurrency, all of these tasks can be done in the background without hanging the main thread or disturbing your users. They can still tap buttons, scroll, and navigate through your app while it handles the heavy work in the background.
  • APIs like NSOperation and dispatch queues make concurrency easy to use: Creating and managing threads is not an easy task, which is why most developers get scared when they hear the terms concurrency and multithreaded code. In iOS we have great, easy-to-use concurrency APIs that make your life easier: you don't have to worry about creating threads or managing any low-level details, because the APIs do all of that for you. Another important advantage is that they help you achieve synchronization easily and avoid race conditions. A race condition happens when multiple threads try to access a shared resource at the same time, which leads to unexpected results. By using synchronization, you protect a resource from being accessed by more than one thread at a time.

Part 1: GCD (Grand Central Dispatch)

GCD is the most commonly used API to manage concurrent code and execute operations asynchronously at the Unix level of the system. GCD provides and manages queues of tasks. First, let’s see what queues are.

What Are Queues?

Queues are data structures that manage objects in first-in, first-out (FIFO) order. Queues are similar to the line at the ticket window of a movie theatre: tickets are sold on a first-come, first-served basis, so the people at the front of the line buy their tickets before those who arrived later. Queues in computer science work the same way, because the first object added to the queue is the first object removed from it.
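
To make the first-in, first-out idea concrete, here is a tiny, hypothetical queue type in Swift. It is purely an illustration of the data structure and is not part of GCD or of this tutorial's demo project:

```swift
// A minimal FIFO queue, purely to illustrate first-in, first-out order.
struct Queue<Element> {
    private var elements: [Element] = []

    mutating func enqueue(_ element: Element) {   // join the back of the line
        elements.append(element)
    }

    mutating func dequeue() -> Element? {          // serve whoever arrived first
        elements.isEmpty ? nil : elements.removeFirst()
    }
}

var ticketLine = Queue<String>()
ticketLine.enqueue("Alice")
ticketLine.enqueue("Bob")
print(ticketLine.dequeue() ?? "empty")   // "Alice": first in, first out
```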



Dispatch Queues : 
Dispatch queues are an easy way to perform tasks asynchronously and concurrently in your application. They are queues to which your app submits tasks in the form of blocks (blocks of code). There are two varieties of dispatch queues: (1) serial queues and (2) concurrent queues. Before talking about the differences, you need to know that tasks assigned to either kind of queue are executed on threads separate from the one they were created on. In other words, you create blocks of code and submit them to dispatch queues from the main thread, but these tasks (blocks of code) run on separate threads instead of the main thread.
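
As a rough sketch of what submitting a block to a dispatch queue looks like, here is a minimal Swift example. It uses the modern DispatchQueue API, whose async method plays the role of the C-level dispatch_async function; the queue label "com.example.worker" is made up for illustration:

```swift
import Foundation

// A sketch of submitting a block (closure) to a dispatch queue.
let workerQueue = DispatchQueue(label: "com.example.worker")

print("Submitting from the main thread? \(Thread.isMainThread)")   // true

workerQueue.async {
    // The block runs on a thread managed by GCD, not on the thread
    // that submitted it, so the caller is never blocked.
    print("Block on the main thread? \(Thread.isMainThread)")      // false
}

// Keep a command-line script alive long enough for the block to run.
Thread.sleep(forTimeInterval: 1)
```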

Serial Queues : 

When you create a queue as a serial queue, the queue can execute only one task at a time. All tasks in the same serial queue respect each other and execute serially. However, they don't care about tasks in separate queues, which means you can still execute tasks concurrently by using multiple serial queues. For example, you can create two serial queues; each queue executes only one task at a time, but up to two tasks can still execute concurrently.
Serial queues are great for managing a shared resource. They provide guaranteed serialized access to the resource and prevent race conditions. Imagine there is a single ticket booth and a bunch of people who want to buy cinema tickets; the staff at the booth are a shared resource. It would be chaotic if the staff had to serve everyone at the same time. To handle this, people are required to queue up (a serial queue), so that the staff can serve customers one at a time.
Again, that doesn't mean the cinema can only handle one customer at a time. If it sets up two more booths, it can serve three customers at once. This is why I said you can still perform multiple tasks in parallel by using several serial queues.
The advantages of using serial queues are:
  1. Guaranteed serialized access to a shared resource, which avoids race conditions.
  2. Tasks are executed in a predictable order. When you submit tasks to a serial dispatch queue, they are executed in the same order as they were inserted.
  3. You can create any number of serial queues.
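
Here is a minimal sketch of these properties using the modern Swift DispatchQueue API. The queue label is illustrative, and the short sleep at the end only exists to keep a command-line script alive long enough for the blocks to run:

```swift
import Foundation

// A serial queue executes one task at a time, in the order submitted.
// The label "com.example.serial" is just an illustrative reverse-DNS name.
let serialQueue = DispatchQueue(label: "com.example.serial")   // serial by default

for i in 1...3 {
    serialQueue.async {
        // Task 1 finishes before task 2 starts, and so on.
        print("Serial task \(i)")
    }
}

// Keep a command-line script alive long enough for the blocks to run.
Thread.sleep(forTimeInterval: 1)
```

Because every block on this queue runs alone, routing all reads and writes of a shared resource through one such queue is a simple way to get the guaranteed serialized access described above.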

Concurrent Queues : 

As the name suggests, concurrent queues allow you to execute multiple tasks in parallel. The tasks (blocks of code) start in the order in which they were added to the queue, but they all run concurrently and don't have to wait for each other to start. Concurrent queues guarantee that tasks start in the same order, but you won't know the order in which they finish, how long they take to execute, or how many tasks are running at a given point.
For example, suppose you submit three tasks (task #1, #2 and #3) to a concurrent queue. The tasks execute concurrently and start in the order in which they were added to the queue, but their execution times and finish times vary. Even if it takes a while for task #2 and task #3 to start, both of them can complete before task #1 does. It's up to the system to decide how the tasks are executed.
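
A minimal sketch of this behavior, again using the modern Swift DispatchQueue API with an illustrative label and some fake work so the finish order varies:

```swift
import Foundation

// A concurrent queue: tasks start in FIFO order but run in parallel,
// so the order in which they finish is up to the system.
let concurrentQueue = DispatchQueue(label: "com.example.concurrent",
                                    attributes: .concurrent)

for i in 1...3 {
    concurrentQueue.async {
        print("Task \(i) started")
        Thread.sleep(forTimeInterval: Double.random(in: 0.1...0.5))  // stand-in for real work
        print("Task \(i) finished")
    }
}

// Keep a command-line script alive long enough to see the output.
Thread.sleep(forTimeInterval: 1)
```

The "started" lines usually appear as 1, 2, 3 because the tasks are dequeued in that order, while the "finished" lines can come back in any order, which is exactly the behavior described above.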

Using Queues : 

Now that we have explained serial and concurrent queues, it's time to see how to use them. By default, the system provides each application with a single serial queue and four concurrent queues. The main dispatch queue is the globally available serial queue that executes tasks on the application's main thread. It is used to update the app's UI and to perform all tasks that touch UIViews. Only one task executes at a time, which is why the UI gets blocked when you run a heavy task on the main queue.
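
As a sketch of the pattern this implies, heavy work goes off the main queue and only the UI update hops back onto it. The view controller, image view, and placeholder rendering below are all invented for illustration and are not from the tutorial's demo project:

```swift
import UIKit

// Everything here (the view controller, the image view, the placeholder
// rendering) is invented for illustration.
final class AvatarViewController: UIViewController {
    let avatarImageView = UIImageView()

    func reloadAvatar() {
        // Heavy work goes off the main queue...
        DispatchQueue.global().async {
            let image = self.renderPlaceholderAvatar()   // stands in for a slow task
            // ...and only the UI update hops back onto the main queue.
            DispatchQueue.main.async {
                self.avatarImageView.image = image
            }
        }
    }

    // Stand-in for an expensive operation (network call, image decoding, etc.).
    private func renderPlaceholderAvatar() -> UIImage {
        let renderer = UIGraphicsImageRenderer(size: CGSize(width: 100, height: 100))
        return renderer.image { context in
            UIColor.systemBlue.setFill()
            context.fill(CGRect(x: 0, y: 0, width: 100, height: 100))
        }
    }
}
```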
Besides the main queue, the system provides four concurrent queues, which we call the global dispatch queues. These queues are global to the application and differ only in their priority level. To use one of them, you get a reference to your preferred queue with the function dispatch_get_global_queue, which takes as its first parameter one of these values:
  • DISPATCH_QUEUE_PRIORITY_HIGH
  • DISPATCH_QUEUE_PRIORITY_DEFAULT
  • DISPATCH_QUEUE_PRIORITY_LOW
  • DISPATCH_QUEUE_PRIORITY_BACKGROUND
These queue types represent the priority of execution: the HIGH queue has the highest priority and BACKGROUND the lowest, so you can pick a queue based on the priority of your task. Also note that Apple's own APIs use these queues, so your tasks are not the only ones in them.
Lastly, you can create any number of serial or concurrent queues of your own. For concurrent queues, though, I highly recommend using one of the four global queues, even though you can also create your own.
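
For reference, the C-level dispatch_get_global_queue call named above is reached in current Swift through DispatchQueue.global(qos:). The comments below show a rough correspondence between the old priority constants and the quality-of-service classes used today (treat it as a guide, not something stated in this tutorial); the custom queue label is illustrative:

```swift
import Foundation

// Getting the global concurrent queues in current Swift.
// Rough correspondence with the old priority constants:
//   DISPATCH_QUEUE_PRIORITY_HIGH       -> .userInitiated
//   DISPATCH_QUEUE_PRIORITY_DEFAULT    -> .default
//   DISPATCH_QUEUE_PRIORITY_LOW        -> .utility
//   DISPATCH_QUEUE_PRIORITY_BACKGROUND -> .background
let highPriorityQueue = DispatchQueue.global(qos: .userInitiated)
let backgroundQueue   = DispatchQueue.global(qos: .background)

highPriorityQueue.async {
    print("Work the user is actively waiting on")
}
backgroundQueue.async {
    print("Housekeeping the user never notices")
}

// You can also create your own concurrent queue when you really need one;
// the label "com.example.custom" is illustrative.
let customConcurrentQueue = DispatchQueue(label: "com.example.custom",
                                          attributes: .concurrent)

// Keep a command-line script alive long enough for the blocks to run.
Thread.sleep(forTimeInterval: 1)
```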

GCD Cheatsheet



Now you should have a basic understanding of dispatch queues. Here is a simple cheat sheet for your reference; it contains all the information you need to know about GCD.
