Desync is a Rust library I’ve been working on that provides a ‘hassle-free’ approach to writing asynchronous code. By ‘hassle-free’, I mean that it provides a very small API with a surprisingly large amount of functionality. It can replace both threads and synchronisation structures like Mutex in many situations. It’s excellent for situations where a developer wants to make a single-threaded data type safe to use from multiple threads, and for building asynchronous code where the interactions between different objects are made clear.

desync = "0.2"

The basic way it works is very straightforward: a type that needs to work asynchronously can be wrapped in a Desync object. Desync serialises access to a particular object: no two operations may happen simultaneously and every operation is performed in the order that it’s received. It’s very like a Mutex in this respect.
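As a rough mental model (a sketch of the idea only, not desync’s actual implementation), serialised access can be pictured as a worker that drains a queue of operations one at a time; the `run_serialised` function here is hypothetical:

```rust
use std::sync::mpsc;
use std::thread;

// A minimal model of serialised access: operations are sent to a single
// worker thread, which runs them one at a time, in the order they arrive.
fn run_serialised() -> Vec<i32> {
    let (tx, rx) = mpsc::channel::<Box<dyn FnOnce(&mut Vec<i32>) + Send>>();

    let worker = thread::spawn(move || {
        let mut value = Vec::new();       // the "wrapped" data
        for operation in rx {
            operation(&mut value);        // never two operations at once
        }
        value
    });

    // These never run simultaneously, and always run in the order sent
    tx.send(Box::new(|v: &mut Vec<i32>| v.push(1))).unwrap();
    tx.send(Box::new(|v: &mut Vec<i32>| v.push(2))).unwrap();
    drop(tx);

    worker.join().unwrap()
}

fn main() {
    assert_eq!(run_serialised(), vec![1, 2]);
}
```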

Desync provides two main ways of operating on the data that it contains: async performs an operation in the background and returns immediately to the caller, while sync performs an operation and returns only once it has completed. sync has nearly identical semantics to locking a mutex, with the exception that it makes a very strong guarantee about ordering: no operation queued after the sync call is made will execute before the sync call completes.

use desync::*;

let desync_thing = Desync::new(slow_object);

// Process in background...
desync_thing.async(|slow_object| slow_object.process());

// We can do something else while the slow object performs its processing

// Wait for and retrieve the result
let result = desync_thing.sync(|slow_object| slow_object.get_result());

The desync library takes care of all of the thread processing and synchronisation required to make this work. It works best with fairly coarse operations, and is particularly useful for things like performing database updates or sharing a data model between threads.

Using the futures library

Desync objects can work with the futures library, by generating a future indicating when an operation has completed (handy for turning any operation into a future):

let future_result = desync_thing.future(|slow_object| slow_object.get_result());

It also supports piping streams as of v0.2.0, via the pipe and pipe_in functions. This provides an easy way to perform background processing on a stream, as well as providing a way to implement an event loop directly from a stream.

// input_stream is the stream of values to process
let desync_thing      = Arc::new(desync_thing);
let processing_stream = pipe(Arc::clone(&desync_thing), input_stream,
    |slow_object, input_value| slow_object.process_input(input_value));

pipe_in is similar but does not return a results stream: it’s useful for collating a stream into a Desync object or perhaps handling a stream of events. Together, these functions provide a very convenient way to implement stream processing.
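The idea behind piping can be sketched with plain channels and a worker thread (again a conceptual model, not the library’s internals, and `pipe_through` is a name invented for this sketch): each input is processed against the wrapped state in arrival order, and the results form an output stream:

```rust
use std::sync::mpsc;
use std::thread;

// Conceptual model of piping a stream through serialised state: a worker
// owns the state, processes each input in order, and emits the results.
fn pipe_through(inputs: Vec<i32>) -> Vec<i32> {
    let (input_tx, input_rx) = mpsc::channel();
    let (output_tx, output_rx) = mpsc::channel();

    let worker = thread::spawn(move || {
        let mut running_total = 0;          // the "wrapped" state
        for value in input_rx {
            running_total += value;         // the processing step
            output_tx.send(running_total).unwrap();
        }
    });

    for value in inputs {
        input_tx.send(value).unwrap();
    }
    drop(input_tx);

    let results = output_rx.iter().collect();
    worker.join().unwrap();
    results
}

fn main() {
    // Each output reflects every input processed before it, in order
    assert_eq!(pipe_through(vec![1, 2, 3]), vec![1, 3, 6]);
}
```

A pipe without an output channel, where the worker simply folds each input into its state, corresponds to the pipe_in case.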


One advantage of the way desync is designed is that there’s no hard requirement for tasks to run on separate threads. When compiled for a WebAssembly target, desync disables its threaded scheduler and instead resolves queued asynchronous operations whenever a sync operation is performed. This provides some limited support for WebAssembly and allows many programs that use desync to work even though WebAssembly has no threading support.

Putting things together

Desync’s sync and async functions are great for turning previously synchronous code into asynchronous code with little hassle. It’s possible to use async almost anywhere you want to start a long-running operation and retrieve the result later.

Something that’s worth noting, and very useful for writing code that flows, is that sync takes a closure that can borrow from the calling scope. This makes it easy to use variables from the calling function, something that other asynchronous frameworks can make extremely tedious to achieve:

let mut val = 0;
some_object.sync(|some_object| val = some_object.get_val());

// val is now set to some_object.get_val() (with no complaints from the borrow checker)

Calling sync is very similar to locking a mutex: if it’s the only method you use, it’s really identical. The main difference is that if any async operations have been queued, it will wait for them first, and once the call has been made, the order of operations is fixed.
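This ordering behaviour can be sketched with std primitives (a model of the guarantee, not how desync implements it; `sync_after_async` is a name invented for the sketch): a ‘sync’ call is just another queued operation that hands its result back over a channel, so it naturally waits for everything queued ahead of it:

```rust
use std::sync::mpsc;
use std::thread;

type Op = Box<dyn FnOnce(&mut Vec<i32>) + Send>;

// Queue two background operations, then a "sync" operation that sends its
// result back; blocking on that result waits for everything queued earlier.
fn sync_after_async() -> Vec<i32> {
    let (queue_tx, queue_rx) = mpsc::channel::<Op>();
    let worker = thread::spawn(move || {
        let mut data = Vec::new();
        for op in queue_rx {
            op(&mut data);
        }
    });

    // Two "async" operations: queued now, run later, in order
    queue_tx.send(Box::new(|d| d.push(1))).unwrap();
    queue_tx.send(Box::new(|d| d.push(2))).unwrap();

    // A "sync" operation: queued like the others, but we block on its reply,
    // so it cannot complete before the operations queued ahead of it
    let (reply_tx, reply_rx) = mpsc::channel();
    queue_tx.send(Box::new(move |d| reply_tx.send(d.clone()).unwrap())).unwrap();
    let snapshot = reply_rx.recv().unwrap();

    drop(queue_tx);
    worker.join().unwrap();
    snapshot
}

fn main() {
    assert_eq!(sync_after_async(), vec![1, 2]);
}
```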

The pipe methods provide a way to communicate with a Desync object without needing to call a method every time (as well as a more convenient way to execute Stream objects). As well as simple data streams, these are a way to pass in message queues, and allow Desync objects to communicate with one another without needing direct dependencies. This makes Desync a great foundation for a project that wants to use a component-oriented architecture with a high degree of code isolation (translated for humans: large software built from small, completely independent programs). This is also the basic principle behind the concept of actors, so desync can be used to build that kind of program as well.

So, that’s the desync rust library. Its API is small: all that’s really needed to use it are the sync and async functions. These can be combined to create any kind of asynchronous program, and desync’s central principle of scheduling operations against data often means a given algorithm can be expressed more concisely with desync than with more traditional synchronisation structures, as well as making it easier to add asynchronous behaviour to previously synchronous code.