
The Road to (Functional) Reactive Programming in JavaScript, Part 2

This is part two of a three-part series on the road to reactive programming in JavaScript: part one covered the path that brought us here, this post reviews the strategies available for handling asynchrony, and part three will cover streams and the observer pattern.


The ongoing journey: Async strategies

After taking the path so far into account (covered in part one), we’re going to review the different architectures available for managing asynchronous flows. In essence, they all rely on these two categories:

  • Web APIs provided by the hosting environment: These APIs accept a function (the callback) and, once their work is completed, add it to the event handler queue so it gets fired. Examples include handling an AJAX request, an onload handler within the DOM, or a web worker response.

  • Timing event functions: setTimeout(callback, time) runs a callback once, after the specified amount of time, while setInterval(callback, time) runs a callback repeatedly, at the specified interval. These functions don’t react to changes; they simply run our code later, which lets us break synchronous execution into asynchronous parts. It’s important to note that they don’t guarantee execution after exactly the specified amount of time. They only guarantee that the callback will be added to the event handler queue, which, as we saw earlier, might already have handlers waiting for execution. So, the “time” argument should be treated as “not earlier than, but some time after, the specified time” (see the sketch below).
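
A minimal sketch of that last point: even with a zero-millisecond delay, the callback only runs once the currently executing synchronous code has finished and the event loop picks it off the queue.

console.log('first');

// Queued right away, but it can only run once the call stack is empty
setTimeout(function () {
   console.log('third');
}, 0);

console.log('second');
// Output order: first, second, third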

Callbacks

What is a callback?

We’ve seen this term before, but in reality, callbacks don’t exist as a separate construct. A callback is just a regular JavaScript function, named after the role it plays in a given context. Usually, callbacks are used for I/O operations, e.g., downloading data, reading a file, querying a database, etc.

const result = sum(2, 3);   // synchronous: the value is ready on the next line
console.log(result);        // 5

const userList = fetchJSON('https://server.com/users.json'); // asynchronous I/O
console.log(userList);      // userList is undefined, the data hasn't arrived yet

The trip to the server won’t complete now; it takes time. So, for now, all we can say is that we will get the result later.

fetchJSON('https://server.com/users.json', readJSON);

function readJSON (error, users) {
  if (error) console.error('Download error!', error);
  else console.log('Download finished', users);
}

The correct way of handling this is by passing a (callback) function to fetchJSON, which will run once the data has been fetched (or the request has failed). The readJSON callback is just a way of storing some work to be done at a later time.

What callbacks imply

  • The order in which things happen does not read top-to-bottom; it jumps around based on when things complete. Our brains plan things out in sequential, blocking, single-threaded, semantic ways, but callbacks express asynchronous flow in a rather non-linear, non-sequential way, which makes reasoning properly about such code harder.

  • Callbacks aren’t trustworthy, nor composable: the only way to chain them is by nesting, which produces the infamous callback hell, a.k.a. the pyramid of doom (see the sketch below). Callbacks also suffer from an inversion of control, in that they implicitly hand control over to another party. This transfer of control leads to trust issues, such as whether the callback is called more times than we expect.
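
To picture that composability problem, here is a sketch of what chaining dependent requests with callbacks tends to look like (fetchJSON, the posts and comments endpoints, and the users[0].id shape are all made up for illustration):

// Hypothetical nested callbacks: each step can only start inside the previous one
fetchJSON('https://server.com/users.json', function (error, users) {
   if (error) return console.error(error);
   fetchJSON('https://server.com/posts/' + users[0].id, function (error, posts) {
      if (error) return console.error(error);
      fetchJSON('https://server.com/comments/' + posts[0].id, function (error, comments) {
         if (error) return console.error(error);
         console.log(comments); // the "pyramid of doom" keeps growing to the right
      });
   });
});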

This being said, it is very important to understand how a callback works, because every other async construct in the language is just an iteration on this concept, or sugar on top of it; behind the scenes, there is always a callback, i.e., a deferred function.
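
A quick illustration of that point, using the Fetch API covered below: even when we move to the constructs in the rest of this article, what we hand over are still plain functions; the surrounding machinery only decides when, and how many times, they get called.

fetch('flowers.jpg').then(
   function onFulfilled(response) { console.log('Got it', response); }, // still a callback
   function onRejected(error) { console.error('Failed', error); }       // still a callback
);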

Promises

What is a promise?

Native Promises have been supported in JavaScript since the ES2015 spec. They represent the eventual completion (or failure) of an asynchronous operation and its resulting value.
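
For example, here is a minimal sketch of how a callback-based timer from the previous section can be wrapped into a promise (delay is just a name chosen for this example):

// Wrap setTimeout's callback into a promise we can chain on
function delay(ms) {
   return new Promise(function (resolve) {
      setTimeout(resolve, ms); // resolve the promise once the timer fires
   });
}

delay(1000).then(function () {
   console.log('at least one second later');
});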

Promises uninvert the inversion of control of callbacks, restoring trustability and composability. They don’t get rid of callbacks; they just redirect the orchestration of those callbacks to a trustable intermediary mechanism that sits between us and the other utility.

With promises, you can write async code that still appears to execute in a top-down way, which helps our brains plan and maintain async JavaScript code better. They also handle more types of errors, thanks to the try/catch-style error handling they encourage.

// The Fetch API returns a Promise
const myImage = document.querySelector('img'); // target <img> element for the result

fetch('flowers.jpg').then(function(response) {
   if (response.ok) {
     return response.blob();
   }
   throw new Error('Network response was not ok.');
}).then(function(myBlob) {
   const objectURL = URL.createObjectURL(myBlob);
   myImage.src = objectURL;
}).catch(function(error) {
   console.log('Error in fetch operation: ', error.message);
});

What promises imply

Promises represent a great step forward in comparison to callbacks. Even so, promises still have limitations:

  • Sequence error handling is still muddled.
  • Single value and single resolution.
  • They are uncancellable.
  • The way of reading them is still not completely aligned with a sequence of steps.
  • Also, since each .then callback has its own scope, data resolved in an earlier step is not directly accessible inside a later .then unless it is explicitly passed along (see the sketch below).
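
That last point looks like this in practice (a sketch; the posts endpoint and the users[0].id shape are made up):

fetch('https://server.com/users.json')
   .then(function (response) {
      return response.json();
   })
   .then(function (users) {
      // `users` is visible here…
      return fetch('https://server.com/posts/' + users[0].id); // hypothetical endpoint
   })
   .then(function (postsResponse) {
      // …but not here: the earlier `users` value is out of scope in this callback,
      // so it has to be threaded through return values or captured by nesting.
      return postsResponse.json();
   });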

Generators

What is a generator?

Generators are another concept introduced as part of ES2015. They are a special function type that does not run-to-completion like normal functions.

Instead, the generator can be paused in mid-completion, entirely preserving its state without pausing the whole program. It can later be resumed from where it left off, but, it doesn’t necessarily ever have to finish.

// Minimal iterator-driving helper: advances the generator one step and
// resolves the single yielded promise (enough for this example)
function run(gen) {
   var args = [].slice.call(arguments, 1), it;
   // Create the generator's iterator, forwarding any extra arguments
   it = gen.apply(this, args);
   // Advance to the first yield; next.value is the yielded promise
   var next = it.next();
   return next.value.then(
      function(value) {
         // Resume the generator, handing it the fulfillment value
         it.next(value);
      },
      function(error) {
         // Throw the rejection reason into the generator's try/catch
         it.throw(error);
      }
   );
}

// Generator: the asynchrony hides behind the yield keyword
function *readJSON() {
   try {
      // fetch returns a Promise; parse the response body as JSON before yielding
      const jsonContent = yield fetch('https://server.com/users.json')
         .then(response => response.json());
      console.log(jsonContent);
   }
   catch (err) {
      console.error(err);
   }
}

run(readJSON);

Generators don’t entail asynchrony per se, but when combined with promises, their power is immense. To do this, an auxiliary function (run, in our case) and some boilerplate are needed. This technique, at the cost of slightly more complex plumbing, lets your async code appear to execute in a top-down fashion, in a more understandable way than when using promises alone.

We are able to yield a promise, and through that promise control the generator’s iterator. The iterator listens for the promise to resolve (fulfillment or rejection), and then either resumes the generator with the fulfillment message or throws an error into the generator with the rejection reason.

The yield/next dualism is not just a control mechanism; it’s a two-way message-passing mechanism too. A yield expression pauses, waiting for a value, and the next() call passes a value (or an implicit undefined) back to that paused yield expression.
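
A minimal sketch of that two-way channel, with no asynchrony involved:

function *conversation() {
   // yield sends a value out and pauses until next() supplies an answer
   const answer = yield 'What is 2 + 3?';
   console.log('The answer was', answer);
}

const it = conversation();
console.log(it.next().value); // "What is 2 + 3?"; the generator is now paused at the yield
it.next(5);                   // resumes the yield expression with 5; logs "The answer was 5"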

What generators imply

Generators’ power goes far beyond async flow control, but their benefit in this regard is the ability to express our code in a sync/sequential fashion. This lets us reason much more naturally about the code, hiding potential asynchrony behind the yield keyword, and moving that logic to where the generator’s iterator is controlled.

Also, in contrast with single promises, you can emit multiple values over time (see the sketch below), or write code that looks like it’s running in parallel. The drawback of this technique is that it requires either the help of an auxiliary library or hand-written boilerplate like our run helper.
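
For instance, a single generator can hand back several values over time, something a promise, with its single resolution, cannot do:

function *countdown(from) {
   for (let i = from; i > 0; i--) {
      yield i; // pause after each value; the consumer decides when to pull the next one
   }
}

for (const n of countdown(3)) {
   console.log(n); // 3, 2, 1
}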

async/await

What is async/await?

Async functions are an ES2017 feature that further wraps generators and promises in a higher-level syntax.

They can be thought of as generators yielding promises, but with dedicated syntax that lets you combine those two concepts without any boilerplate.

async function run() {
   try {
      const response = await fetch('https://server.com/users.json');
      const jsonContent = await response.json(); // parse the body once the response has arrived
      console.log(jsonContent);
   }
   catch (err) {
      console.error(err);
   }
}

run();

As you can see, there are no special calls: no need for a library utility or for fiddling with generator function declarations anymore. Now run() is a new kind of function, an async function, and instead of yielding a promise, we await it.

The async function automatically knows what to do when you await a promise: it pauses the function (just like with generators) until the promise resolves. Calling an async function like run() automatically returns a promise that resolves whenever the function finishes completely.
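
Since run() itself returns a promise, the caller can chain onto it like any other promise:

run().then(function () {
   // Reached once run() has finished completely, including the awaited fetch
   console.log('run() finished');
});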

This architecture combines promises with sync-looking flow control code: the best of both worlds, effectively addressing almost all of the major concerns we have outlined so far.
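
For instance, the rest of the promise toolbox still applies inside async functions; awaiting Promise.all keeps several requests in flight while the code still reads top-down (a sketch; the posts.json endpoint is made up):

async function loadAll() {
   // Both requests start immediately; await only pauses until both have settled
   const [users, posts] = await Promise.all([
      fetch('https://server.com/users.json').then(response => response.json()),
      fetch('https://server.com/posts.json').then(response => response.json()) // hypothetical endpoint
   ]);
   console.log(users, posts);
}

loadAll();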

What async/await implies

We can say that the async/await feature is the epitome of asynchronous control flow in JavaScript, and indeed, you probably won’t need anything else to solve most of the async concerns you’ll come across.

Today, requirements for web apps continue to grow and face new challenges, like managing continuous streams of incoming data, or having to constantly poll something like the mouse position or a device’s gyroscope. While not very common in classical web apps, these things can actually be managed with async/await, or even with generators and promises.

But it just so happens that there is a pattern that maps onto these concepts quite directly: Observables.


Next, in this series, we’ll continue to the final phase of our journey: streams and the observer pattern. You can read it here.


