Rust Intermediate

Concurrency and Multithreading in Rust

This chapter explores concurrency and multithreading in Rust. Concurrency lets a program make progress on multiple tasks at once, while multithreading can exploit multiple CPU cores for truly parallel execution. Rust's strong ownership and type system help ensure safe and efficient concurrent programming.

Introduction to Concurrency and Parallelism

Concurrency is the concept of executing multiple tasks seemingly simultaneously. It's crucial for efficiently utilizing system resources and building responsive applications. Parallelism takes concurrency further by executing tasks simultaneously using multiple threads or processes.

Rust's ownership system and memory safety make it a robust choice for concurrent programming. The Rust compiler enforces strict rules that prevent data races and memory access issues, ensuring safer multithreaded code.

Using Rust's std::thread for Multithreading

Rust's standard library provides the std::thread module for creating and managing threads. Threads allow different parts of your program to execute independently and in parallel. Here's a basic example of using threads to perform tasks concurrently:

use std::thread;

fn main() {
    let handle1 = thread::spawn(|| {
        for i in 1..=5 {
            println!("Thread 1: {}", i);
        }
    });

    let handle2 = thread::spawn(|| {
        for i in 1..=5 {
            println!("Thread 2: {}", i);
        }
    });

    handle1.join().unwrap();
    handle2.join().unwrap();

    println!("Main thread completed.");
}

In this example, two threads are spawned to print numbers concurrently; because the operating system schedules them independently, their output may interleave in any order. The join method ensures that the main thread waits for these spawned threads to complete before continuing.

Safely Sharing Data between Threads with std::sync

When multiple threads share data, the potential for data races arises, leading to unpredictable behavior. Rust addresses this issue with its ownership model and the std::sync module, which provides synchronization primitives for safe data sharing.

The Arc (atomically reference-counted pointer) and Mutex (mutual exclusion lock) types are commonly used together to share data safely between threads:

use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    let counter = Arc::new(Mutex::new(0));
    let mut handles = vec![];

    for _ in 0..10 {
        let counter_clone = Arc::clone(&counter);
        let handle = thread::spawn(move || {
            let mut value = counter_clone.lock().unwrap();
            *value += 1;
        });
        handles.push(handle);
    }

    for handle in handles {
        handle.join().unwrap();
    }

    println!("Final counter value: {:?}", *counter.lock().unwrap());
}

In this example, Arc keeps the counter alive and reference-counted across threads, while the Mutex grants each thread exclusive access when incrementing, preventing data races. After all threads have joined, we lock the mutex one last time and safely print the final counter value.

This chapter covers only the fundamentals; Rust's concurrency and multithreading features are worth a deeper look before you write serious concurrent code.
