Multi-Threading in Swift (Part 1)

Ajaya Mati
9 min read · Jun 15, 2023



“What is multi-threading?”

Even if you are a beginner, you probably have some basic idea of multi-threading: running multiple threads at the same time.

So the first question is: why do we need multi-threading?

Consider a real-life scenario where a user needs to download a file from a server. A single-core processor with a single thread can do only one task at a time. So in a single-threaded application, the user must wait for the download to finish before using the application, because during that time there is no thread available to process user interactions.

How do we solve this?

To tackle this issue we can turn to multi-threading. In the above scenario, we can use two threads: one to handle the download and one to handle user interaction.

So what’s a Thread?

A thread is a context in which commands are executed. There are several ways to create Threads in Swift.

You can either create an Objective-C method and pass it via a selector as the thread's entry point,

Thread.init(target: Any, selector: Selector, object: Any?)

pass a closure to Thread.detachNewThread,

Thread.detachNewThread {
}

or subclass Thread:

class CustomThread: Thread {
    override func start() {
        print("Running on Custom Thread ", self.debugDescription)
        super.start()
    }

    override func main() {
        print("Main called for ", self.debugDescription)
    }
}

let thread1 = CustomThread()
thread1.start()

Thread has two other important methods: cancel() and exit(). cancel() simply sets the thread's isCancelled property to true; it is up to the code running on the thread to check that flag and wind down gracefully. Thread.exit(), on the other hand, terminates the current thread immediately and provides no opportunity for cleanup or resource deallocation.
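To make the cancellation behaviour concrete, here's a minimal sketch (my own illustration) of cooperative cancellation: main() checks isCancelled on every iteration and returns early once cancel() has been called.

class DownloadThread: Thread {
    override func main() {
        for chunk in 1...100 {
            // cancel() only sets this flag; it's up to us to check it and stop
            if isCancelled {
                print("Cancelled before chunk \(chunk)")
                return
            }
            Thread.sleep(forTimeInterval: 0.05) // simulate downloading one chunk
        }
        print("Download finished")
    }
}

let downloadThread = DownloadThread()
downloadThread.start()
downloadThread.cancel() // the thread winds down the next time it checks isCancelled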

Serialization-Concurrency

Serialization means carrying out execution on a single thread, so only one task runs at a time. In contrast, concurrency refers to carrying out two or more tasks in the same window of time, though not necessarily simultaneously. Concurrency can be achieved in two ways: context switching and parallelism.

Context switching is the process performed by an operating system to switch the CPU from executing one thread or process to another. When a context switch occurs, the current state of the running thread or process is saved, including its program counter, register values, and other relevant information.

In parallelism, multiple CPU cores process multiple threads at literally the same time.

  • So in a single-core environment, concurrency is achieved through context switching.
  • In a multi-core environment, concurrency can also be achieved via parallelism.

Sync and Async

Synchronous

Consider two tasks,

  • downloading a remote file
  • opening the file to show it to the user

These two tasks need to be done one after another: we must first completely download the file, and only then can we open it. This is called synchronous execution. Each task waits for the previous task to complete and then gets executed.

Single-Thread

(Diagram: single-threaded sync execution)

Multi-Thread

(Diagram: multi-threaded sync execution)

Asynchronous

Now consider two independent tasks, e.g. downloading a file from a remote server and updating the application UI. We need to do both tasks independently and at the same time. This is called asynchronous execution. Each task can be executed without waiting for any other task to complete.

Single-Thread

(Diagram: async single-threaded execution using context switching)

Multi-Thread

(Diagram: async multi-threaded execution using parallelism)

In synchronous execution, the caller thread is blocked and waits for the callee to finish executing; in asynchronous execution, the caller and callee execute independently without blocking each other.
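As a rough sketch of the difference, here are two hypothetical download functions, one synchronous and one asynchronous, with sleeps standing in for the actual work:

// Synchronous: the caller is blocked until the work is done
func downloadFileSync() {
    Thread.sleep(forTimeInterval: 1) // simulate a long-running download
    print("Sync download finished")
}

// Asynchronous: the work happens on another thread and the caller returns immediately
func downloadFileAsync(completion: @escaping () -> Void) {
    Thread.detachNewThread {
        Thread.sleep(forTimeInterval: 1)
        completion()
    }
}

downloadFileSync()
print("Printed only after the sync download is done")

downloadFileAsync {
    print("Async download finished") // printed later, from the detached thread
}
print("Printed immediately; the caller was not blocked")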

Asynchronous and Concurrency?

We can see that no matter how many threads are present, in synchronous execution every task is performed one by one, whereas in asynchronous execution multiple tasks get executed at a time. So asynchronous execution is the way to achieve concurrency.

Concurrency/serialization describes how two threads run, whereas async/sync describes how two threads relate to each other, i.e. whether one waits for the other.

Multithreading and concurrency became a must with the transition from single-core to multi-core processors. But creating and managing threads by hand is difficult. By default, threads do not have an autorelease pool, and things like memory management, race conditions, and deadlocks become complex, with a lot of boilerplate code.

Luckily, Apple provides us with an alternative: GCD.

Grand Central Dispatch

It offers a high-level and efficient approach to concurrent programming by providing a set of APIs that abstract away many low-level details of multithreading, making it easier to write efficient and scalable concurrent code.

GCD provides Dispatch Queues.

What’s a Queue?

GCD manages a collection of dispatch queues; we submit tasks to a queue as closures, and GCD schedules them for execution.

There are two types of Dispatch Queues available:

  • Serial Dispatch Queue:- Serial queues, as the name suggests, execute the tasks assigned to them serially, i.e. one after another. The order of execution is guaranteed.
  • Concurrent Dispatch Queue:- Concurrent queues run multiple tasks at a time. Smaller tasks may finish sooner, so the order of completion is not guaranteed.

Out of the box, GCD provides us with two kinds of queues, although we can also create our own dispatch queues.

Main Queue (DispatchQueue.main)

The main queue is a serial queue that runs on the main thread, the thread on which UI updates and user interactions are handled.

DispatchQueue.main.async {
    print("This will run on main Thread")
}

In general, it is best to use async instead of sync whenever possible. async does not block the current thread, so it cannot cause a deadlock. Calling sync on the queue you are already running on, for example DispatchQueue.main.sync from the main thread, will deadlock.
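For example, as a sketch of what not to do, code already running on the main thread must never call sync on the main queue:

// Assume this is called from code already running on the main thread
// (e.g. inside viewDidLoad). sync blocks the main thread until the closure
// has run, but the closure can only run on the main thread, which is now
// blocked: a guaranteed deadlock.
DispatchQueue.main.sync {
    print("This line is never reached")
}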

Global Queues (DispatchQueue.global())

These are concurrent queues. The tasks assigned to them get executed concurrently on threads other than the main thread. There are six different global queues based on quality of service (QoS):

  • .userInteractive - This queue is for tasks that need to be executed immediately to keep the app responsive, such as user input handling and animations.
  • .userInitiated - This queue is for tasks initiated by the user that require a quick response, such as opening a file or loading data in response to a user action. These tasks get a high priority.
  • .default - This queue is for tasks that don't need to be executed quickly but should still be responsive.
  • .utility - This queue is for longer-running tasks that can be executed in the background, such as file I/O and network requests.
  • .background - This queue is for tasks that can be executed entirely in the background and don't need to be responsive, such as image processing and data compression.
  • .unspecified - This queue has no specific QoS level. It is up to the system to decide how to prioritize tasks that are submitted to it.

DispatchQueue.global(qos: .utility).async {
    // runs on a thread from the global utility-QoS queue
    print("Running on a global queue")
}
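A very common pattern is to do the heavy lifting on a global queue and hop back to the main queue for UI updates. Here's a minimal sketch, with a sleep standing in for the real work:

DispatchQueue.global(qos: .userInitiated).async {
    // expensive work, kept off the main thread
    Thread.sleep(forTimeInterval: 1)
    let result = "processed data"

    DispatchQueue.main.async {
        // anything that touches the UI belongs back on the main queue
        print("Updating UI with \(result) on the main thread")
    }
}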

Custom Queue

We can make our own queue with GCD.

let queue = DispatchQueue(label: "com.projectx.gcd", attributes: .concurrent)

queue.async {
    print("Asynchronously does the work on the custom queue")
}

queue.sync {
    print("Synchronously does the work on the custom queue")
}

The label is the name of the queue and is a required parameter. By default, custom queues are serial, but we can explicitly set the attributes parameter to .concurrent to create a concurrent queue.
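As a quick sketch of the difference, tasks submitted to the serial queue below always print in order, while the concurrent queue gives no such guarantee:

let serialQueue = DispatchQueue(label: "com.projectx.serial")
let concurrentQueue = DispatchQueue(label: "com.projectx.concurrent", attributes: .concurrent)

for i in 1...5 {
    serialQueue.async {
        print("Serial task \(i)")      // always 1, 2, 3, 4, 5
    }
    concurrentQueue.async {
        print("Concurrent task \(i)")  // completion order is not guaranteed
    }
}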

Managing Asynchronous Functions

Managing asynchronous calls can be a little tricky. The question arises: how can we handle multiple asynchronous calls and order their execution the way we want?

  1. Running Tasks in Series

The first problem is simulating synchronous execution using asynchronous functions. By synchronous execution I mean waiting for one task to complete before running another.

This is a very common scenario in applications that make network calls. Imagine you have two APIs: one to log in and another to fetch user details. In order to fetch the user details, the user first has to log in.
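For illustration, assume login and fetchUserDetail are asynchronous helpers along the following lines. This is a hypothetical sketch: the real functions would make network calls and report results or errors, but here a sleep on a global queue stands in for the work.

func login(userId: String, password: String, completion: @escaping () -> Void) {
    DispatchQueue.global().async {
        Thread.sleep(forTimeInterval: 1) // simulate the login request
        completion()
    }
}

func fetchUserDetail(id: String, completion: @escaping () -> Void) {
    DispatchQueue.global().async {
        Thread.sleep(forTimeInterval: 1) // simulate fetching the user details
        completion()
    }
}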

login(userId: "user", password: "secret") {
    fetchUserDetail(id: "1234") {
        // do tasks afterward
    }
}

In the above code, I used completion handlers (nested callbacks) to solve the problem. This is one way to do it. It looks okay with only two calls, but imagine a whole series of calls, one after another.

func1 {
    func2 {
        func3 {
            func4 {

            }
        }
    }
}

Now this doesn't look cool, does it? It'll be very hard to understand and maintain this code. In coders' language, it's called “The Pyramid of Doom”.

Let’s move on to the next problem statement.

2. Running tasks in parallel

The second problem is to perform some task only after all the asynchronous calls have completed.

Now consider a similar scenario where we have to fetch two kinds of user details: one API fetches the user's personal details like name, profile picture, email, and contact details, and the other fetches all the stories the user has published (you can think of the Medium app).

In the app, there will be separate sections to show these details, so the UI updates can be done separately and we can fetch the details independently and in parallel. But we want to show a loader when fetching starts and hide it only when we have responses for both API calls.

var flag1 = false
var flag2 = false
let id = "1234"

func startFetch() {
    // Show the loader
    startLoader()

    fetchUserProfileDetails(id: id) {
        flag1 = true
        hideLoaderIfShould()
    }

    fetchUserPublicDetails(id: id) {
        flag2 = true
        hideLoaderIfShould()
    }
}

func startLoader() {
    loader.show()
}

func hideLoaderIfShould() {
    if flag1 && flag2 {
        loader.hide()
    }
}

Above is one way to handle the situation: keep two flags and check whether both are true, i.e. whether both calls have completed.

Surely the way we handled the above problem statement doesn't look good. Not to worry: GCD provides us with another API, the DispatchGroup.

DispatchGroup

DispatchGroup provides four important methods:

  1. enter(): Increments the task count of the DispatchGroup instance by 1.
  2. wait(): Blocks the current thread until every task that has entered the group has left it.
  3. leave(): Decrements the task count of the DispatchGroup instance by 1.
  4. notify(queue:execute:): Schedules a closure on the given queue to run once all the tasks in the group have completed.
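Note that enter()/leave() are needed when the work inside a task is itself asynchronous, like a network call with a completion handler. When the work is fully contained in the closure you submit, you can instead pass the group to async(group:) and skip the manual bookkeeping. A small sketch:

let group = DispatchGroup()
let queue = DispatchQueue.global(qos: .utility)

queue.async(group: group) {
    print("Task 1") // the group tracks this block automatically
}
queue.async(group: group) {
    print("Task 2")
}

group.notify(queue: .main) {
    print("Both tasks completed")
}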

Now let's solve the previously discussed problem statements using these APIs.

  1. Running tasks in Series

let dispatchGroup = DispatchGroup()

// starts task-1
dispatchGroup.enter()
login(userId: "user", password: "secret") {
    // ends task-1
    dispatchGroup.leave()
}

// waits for task-1 to complete (run this code on a background queue,
// since wait() blocks the thread it is called on)
dispatchGroup.wait()

// this starts task-2, but only after task-1 is complete
dispatchGroup.enter()
fetchUserDetail(id: "1234") {
    // do tasks afterward

    // ends task-2
    dispatchGroup.leave()
}

// this gets called when all the tasks are complete
dispatchGroup.notify(queue: .main) {
    print("Execution completed")
}

2. Running tasks in parallel

let id = "1234"
let dispatchGroup = DispatchGroup()

func startFetch() {
    // Show the loader
    loader.show()

    // starts task-1
    dispatchGroup.enter()
    fetchUserProfileDetails(id: id) {
        // ends task-1
        dispatchGroup.leave()
    }

    // starts task-2 without waiting for task-1 to complete
    dispatchGroup.enter()
    fetchUserPublicDetails(id: id) {
        // ends task-2
        dispatchGroup.leave()
    }

    // this gets called when all the tasks are complete
    dispatchGroup.notify(queue: .main) {
        print("Execution completed")
        // hide the loader
        loader.hide()
    }
}

Here every task begins with a call to enter() to let the dispatch group know a task has started, and leave() is called on completion of every task to let the dispatch group know it has finished. Calling wait() halts further execution until the tasks entered so far have completed, and finally the notify method gets called when all the tasks are completed.

Summing Up

In this article, we covered the concept of multi-threading by discussing:

  1. Thread
  2. Serial-Concurrent-Parallelism
  3. Sync-Async
  4. Queue
  5. Dispatch Group

Read more on issues in a multithreaded environment in part 2 of this series here.
