
Ruan Yifeng
2020-02-09

# Asynchronous Applications of Generator Functions

Asynchronous programming is extremely important for the JavaScript language. JavaScript's execution environment is single-threaded, so without asynchronous programming the language would be unusable: any slow operation would block everything else. This chapter introduces how Generator functions can handle asynchronous operations.

# Traditional Methods

Before ES6, there were roughly four methods for asynchronous programming.

  • Callback functions
  • Event listeners
  • Publish/subscribe pattern
  • Promise objects

Generator functions brought JavaScript asynchronous programming into a completely new era.

# Basic Concepts

# Asynchronous

The so-called "asynchronous" simply means that a task is not completed continuously. It can be understood as the task being artificially divided into two stages: the first stage executes first, then other tasks are performed, and when everything is ready, the second stage is executed.

For example, there is a task of reading and processing a file. The first stage of the task is to send a request to the operating system to read the file. Then, the program executes other tasks, and when the operating system returns the file, it continues with the second stage of the task (processing the file). This discontinuous execution is called asynchronous.

Correspondingly, continuous execution is called synchronous. Because execution is continuous and no other tasks can be inserted, the program can only wait idly while the operating system reads the file from the hard disk.
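
The two-stage split can be sketched with a timer standing in for the file read (a minimal illustration, not Node's actual fs API):

```javascript
// Stage one runs now; stage two is queued and runs only after the
// current synchronous work (the "other tasks") has finished.
var log = [];

log.push('stage one: request sent');

setTimeout(function () {
  log.push('stage two: result processed');  // runs later, discontinuously
}, 0);

log.push('other tasks run in the meantime');

// At this moment only the synchronous part has executed:
console.log(log);
// → [ 'stage one: request sent', 'other tasks run in the meantime' ]
```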

# Callback Functions

JavaScript's implementation of asynchronous programming is callback functions. A callback function means writing the second stage of the task in a separate function and calling that function directly when the task resumes execution. The English name callback literally translates to "calling back".

Reading and processing a file is written like this.

fs.readFile('/etc/passwd', 'utf-8', function (err, data) {
  if (err) throw err;
  console.log(data);
});

In the code above, the third parameter of the readFile function is the callback function, which is the second stage of the task. It only executes after the operating system returns the /etc/passwd file.

An interesting question is: why does Node.js convention require that the first parameter of a callback function must be an error object err (if there is no error, this parameter is null)?

The reason is that execution is divided into two stages. After the first stage completes, the context of the task has already ended. Errors thrown after that cannot be caught by the original context and can only be passed as parameters to the second stage.
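
This can be seen in a short sketch, where readConfigAsync and the manual queue are hypothetical stand-ins for a real async API and the event loop:

```javascript
var queue = [];  // stand-in for the event loop's task queue

// Hypothetical async API: stage two is queued, not run immediately.
function readConfigAsync(callback) {
  queue.push(function () {
    callback(new Error('file not found'), null);  // error-first convention
  });
}

var handled = null;

try {
  readConfigAsync(function (err, data) {
    if (err) handled = 'stage two saw: ' + err.message;
  });
} catch (e) {
  handled = 'never reached';  // the error cannot surface here: this
                              // try block exits before stage two runs
}

queue.forEach(function (task) { task(); });  // "later", the loop runs stage two
console.log(handled);  // → stage two saw: file not found
```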

# Promise

There is nothing inherently wrong with callback functions; the problem arises with multiple nested callback functions. Suppose after reading file A, you need to read file B:

fs.readFile(fileA, 'utf-8', function (err, data) {
  fs.readFile(fileB, 'utf-8', function (err, data) {
    // ...
  });
});

It is not hard to imagine that if you need to read more than two files in sequence, multiple levels of nesting will occur. The code does not grow vertically but horizontally, quickly becoming tangled and unmanageable. This is because multiple asynchronous operations form tight coupling — if any one operation needs modification, its upper-level and lower-level callbacks may also need modification. This situation is called "callback hell".

Promise objects were proposed to solve this problem. They are not new syntax features, but a new way of writing code that allows nested callbacks to be changed to chained calls. Using Promise, reading multiple files in sequence is written as follows.

var readFile = require('fs-readfile-promise');

readFile(fileA)
.then(function (data) {
  console.log(data.toString());
})
.then(function () {
  return readFile(fileB);
})
.then(function (data) {
  console.log(data.toString());
})
.catch(function (err) {
  console.log(err);
});

In the code above, the fs-readfile-promise module is used, which returns a Promise version of the readFile function. Promise provides the then method for loading callbacks and the catch method for catching errors thrown during execution.

As can be seen, Promise's approach is just an improvement over callback functions. Using the then method, the two-stage execution of asynchronous tasks is clearer, but beyond that, there is nothing new.

Promise's biggest problem is code redundancy. The original task is wrapped with Promise, and no matter what the operation, everything is a bunch of thens, making the original semantics very unclear.

So, is there a better way to write this?

# Generator Functions

# Coroutines

Traditional programming languages have long had solutions for asynchronous programming (which are actually solutions for multitasking). One of them is called "coroutines", meaning multiple threads cooperate to complete asynchronous tasks.

Coroutines are somewhat like functions and somewhat like threads. Their execution flow is roughly as follows.

  • Step 1: Coroutine A starts executing.
  • Step 2: Coroutine A pauses in the middle of execution, and control is transferred to coroutine B.
  • Step 3: (After some time) Coroutine B returns control.
  • Step 4: Coroutine A resumes execution.

Coroutine A in the flow above is the asynchronous task because it is executed in two (or more) stages.

For example, here is the coroutine approach to reading a file.

function* asyncJob() {
  // ...other code
  var f = yield readFile(fileA);
  // ...other code
}

The key to the asyncJob function in the code above is the yield command. It means that at this point, execution control is handed to another coroutine. In other words, the yield command is the dividing line between the two asynchronous stages.

A coroutine pauses at a yield command, and when control returns, it resumes from where it paused. Its greatest advantage is that the code looks very much like synchronous operations — remove the yield command and it is virtually identical.

# Generator Function Implementation of Coroutines

Generator functions are the ES6 implementation of coroutines, and their greatest feature is the ability to yield the execution control of a function (i.e., pause execution).

The entire Generator function is a wrapped asynchronous task, or a container for asynchronous tasks. Wherever an asynchronous operation needs to be paused, a yield statement is used. The Generator function is executed as follows.

function* gen(x) {
  var y = yield x + 2;
  return y;
}

var g = gen(1);
g.next() // { value: 3, done: false }
g.next() // { value: undefined, done: true }

In the code above, calling the Generator function returns an internal pointer (i.e., iterator) g. This is another way Generator functions differ from regular functions: executing it does not return a result; what is returned is a pointer object. Calling the pointer g's next method moves the internal pointer (i.e., executes the first stage of the asynchronous task) to the first yield statement encountered. In the example above, this is executing up to x + 2.

In other words, the next method's role is to execute the Generator function in stages. Each call to next returns an object representing information about the current stage (the value property and done property). The value property is the value of the expression after the yield statement, representing the current stage's value; the done property is a boolean indicating whether the Generator function has finished executing, i.e., whether there is a next stage.

# Data Exchange and Error Handling in Generator Functions

Generator functions can pause and resume execution, which is the fundamental reason they can encapsulate asynchronous tasks. Beyond that, they have two additional features that make them a complete solution for asynchronous programming: data exchange inside and outside the function body, and an error handling mechanism.

The value property of the value returned by next is data output from the Generator function; the next method can also accept a parameter to input data into the Generator function body.

function* gen(x){
  var y = yield x + 2;
  return y;
}

var g = gen(1);
g.next() // { value: 3, done: false }
g.next(2) // { value: 2, done: true }

In the code above, the value property of the first next method returns the value 3 of the expression x + 2. The second next method carries a parameter 2, which can be passed into the Generator function as the result of the previous asynchronous stage, received by the variable y inside the function body. Therefore, the value property of this step returns 2 (the value of variable y).

Generator functions can also deploy error handling code internally to catch errors thrown from outside the function body.

function* gen(x){
  try {
    var y = yield x + 2;
  } catch (e){
    console.log(e);
  }
  return y;
}

var g = gen(1);
g.next();
g.throw('Error occurred');
// Error occurred

In the last line of the code above, the error thrown using the pointer object's throw method outside the Generator function body can be caught by the try...catch block inside the function body. This means that the error-producing code and the error-handling code are separated in time and space, which is undoubtedly very important for asynchronous programming.

# Encapsulating Asynchronous Tasks

Let us see how to use Generator functions to execute a real asynchronous task.

var fetch = require('node-fetch');

function* gen(){
  var url = 'https://api.github.com/users/github';
  var result = yield fetch(url);
  console.log(result.bio);
}

In the code above, the Generator function encapsulates an asynchronous operation that first reads a remote API and then parses information from the JSON-format data. As mentioned before, this code looks very much like synchronous operations, except for the addition of the yield command.

Here is how to execute this code.

var g = gen();
var result = g.next();

result.value.then(function(data){
  return data.json();
}).then(function(data){
  g.next(data);
});

In the code above, the Generator function is first executed to obtain an iterator object, then the next method (second line) runs the first stage of the asynchronous task. Since the fetch module returns a Promise object, the then method is used to wait for it before calling the next method again.

As can be seen, although Generator functions express asynchronous operations very concisely, flow management is inconvenient (i.e., when to execute the first stage, when to execute the second stage).

# Thunk Functions

Thunk functions are one method of automatically executing Generator functions.

# Parameter Evaluation Strategies

Thunk functions originated as early as the 1960s.

At that time, programming languages were just getting started, and computer scientists were still studying how best to write compilers. A focal point of debate was "evaluation strategies" — when exactly should function parameters be evaluated.

var x = 1;

function f(m) {
  return m * 2;
}

f(x + 5)

The code above first defines a function f, then passes the expression x + 5 to it. The question is: when should this expression be evaluated?

One opinion is "call by value", meaning the value of x + 5 (which is 6) is computed before entering the function body, and then this value is passed to function f. The C language uses this strategy.

f(x + 5)
// With call by value, equivalent to
f(6)

Another opinion is "call by name", meaning the expression x + 5 is passed directly into the function body and only evaluated when it is actually used. The Haskell language uses this strategy.

f(x + 5)
// With call by name, equivalent to
(x + 5) * 2

Which is better — call by value or call by name?

The answer is that each has its pros and cons. Call by value is simpler, but when a parameter is evaluated, it may not actually be used yet, potentially causing performance loss.

function f(a, b){
  return b;
}

f(3 * x * x - 2 * x - 1, x);

In the code above, the first parameter of function f is a complex expression, but the function body never uses it. Evaluating this parameter is actually unnecessary. Therefore, some computer scientists favor "call by name" — only evaluating when executing.

# Meaning of Thunk Functions

The compiler's implementation of "call by name" often involves placing the parameter in a temporary function and then passing that temporary function into the function body. This temporary function is called a Thunk function.

function f(m) {
  return m * 2;
}

f(x + 5);

// Equivalent to

var thunk = function () {
  return x + 5;
};

function f(thunk) {
  return thunk() * 2;
}

In the code above, function f's parameter x + 5 is replaced by a function. Wherever the original parameter is used, the Thunk function is evaluated instead.

This is the definition of a Thunk function — it is an implementation strategy of "call by name", used to replace a certain expression.
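
A quick way to see the call-by-name behavior: because the thunk re-evaluates its expression at use time, it picks up the current value of x. A minimal sketch of the idea above:

```javascript
// The thunk captures the expression, not its value: evaluation happens
// inside f, at the moment the parameter is actually used.
var x = 1;

var thunk = function () {
  return x + 5;
};

function f(t) {
  return t() * 2;  // "call by name": evaluate only here
}

x = 10;  // a change made before the call is visible to the thunk
console.log(f(thunk));  // → 30, i.e. (10 + 5) * 2
```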

# Thunk Functions in JavaScript

JavaScript uses call by value, so the meaning of Thunk functions in JavaScript is somewhat different. In JavaScript, Thunk functions replace not expressions but multi-parameter functions, converting them into single-parameter functions that only accept a callback function as a parameter.

// Normal version of readFile (multi-parameter version)
fs.readFile(fileName, callback);

// Thunk version of readFile (single-parameter version)
var Thunk = function (fileName) {
  return function (callback) {
    return fs.readFile(fileName, callback);
  };
};

var readFileThunk = Thunk(fileName);
readFileThunk(callback);

In the code above, the fs module's readFile method is a multi-parameter function with two parameters: file name and callback function. After processing with a converter, it becomes a single-parameter function that only accepts a callback function as a parameter. This single-parameter version is called the Thunk function.

Any function whose parameters include a callback function can be written in Thunk function form. Below is a simple Thunk function converter.

// ES5 version
var Thunk = function(fn){
  return function (){
    var args = Array.prototype.slice.call(arguments);
    return function (callback){
      args.push(callback);
      return fn.apply(this, args);
    }
  };
};

// ES6 version
const Thunk = function(fn) {
  return function (...args) {
    return function (callback) {
      return fn.call(this, ...args, callback);
    }
  };
};

Using the converter above to generate a Thunk function for fs.readFile.

var readFileThunk = Thunk(fs.readFile);
readFileThunk(fileA)(callback);

Below is another complete example.

function f(a, cb) {
  cb(a);
}
const ft = Thunk(f);

ft(1)(console.log) // 1

# Thunkify Module

For production environments, it is recommended to use the Thunkify module as the converter.

First, install it.

$ npm install thunkify

Usage is as follows.

var thunkify = require('thunkify');
var fs = require('fs');

var read = thunkify(fs.readFile);
read('package.json')(function(err, str){
  // ...
});

Thunkify's source code is very similar to the simple converter in the previous section.

function thunkify(fn) {
  return function() {
    var args = new Array(arguments.length);
    var ctx = this;

    for (var i = 0; i < args.length; ++i) {
      args[i] = arguments[i];
    }

    return function (done) {
      var called;

      args.push(function () {
        if (called) return;
        called = true;
        done.apply(null, arguments);
      });

      try {
        fn.apply(ctx, args);
      } catch (err) {
        done(err);
      }
    }
  }
};

Its source code mainly adds a checking mechanism: the variable called ensures that the callback function only runs once. This design is related to Generator functions discussed later. See the example below.

function f(a, b, callback){
  var sum = a + b;
  callback(sum);
  callback(sum);
}

var ft = thunkify(f);
var print = console.log.bind(console);
ft(1, 2)(print);
// 3

In the code above, since thunkify only allows the callback function to execute once, only one line of results is output.

# Flow Management of Generator Functions

You might ask: what is the use of Thunk functions? The answer is that they were indeed not very useful before, but with ES6's Generator functions, Thunk functions can now be used for automatic flow management of Generator functions.

Generator functions can be automatically executed.

function* gen() {
  // ...
}

var g = gen();
var res = g.next();

while(!res.done){
  console.log(res.value);
  res = g.next();
}

In the code above, the Generator function gen automatically executes through all steps.

However, this is not suitable for asynchronous operations. If it must be guaranteed that the previous step finishes before the next step executes, the automatic execution above will not work. This is where Thunk functions come in handy. Using file reading as an example, the following Generator function encapsulates two asynchronous operations.

var fs = require('fs');
var thunkify = require('thunkify');
var readFileThunk = thunkify(fs.readFile);

var gen = function* (){
  var r1 = yield readFileThunk('/etc/fstab');
  console.log(r1.toString());
  var r2 = yield readFileThunk('/etc/shells');
  console.log(r2.toString());
};

In the code above, the yield command is used to transfer execution control out of the Generator function, so a method is needed to return control to the Generator function.

This method is the Thunk function, because it can return control to the Generator function inside a callback function. To make this easier to understand, let us first see how to manually execute the Generator function above.

var g = gen();

var r1 = g.next();
r1.value(function (err, data) {
  if (err) throw err;
  var r2 = g.next(data);
  r2.value(function (err, data) {
    if (err) throw err;
    g.next(data);
  });
});

In the code above, variable g is the internal pointer of the Generator function, indicating the current execution point. The next method is responsible for moving the pointer to the next step and returning information about that step (the value property and done property).

Looking closely at the code above, we can see that the execution process of the Generator function is actually passing the same callback function repeatedly into the value property of the next method. This allows us to use recursion to automate this process.

# Automatic Flow Management with Thunk Functions

The true power of Thunk functions lies in their ability to automatically execute Generator functions. Below is a Generator executor based on Thunk functions.

function run(fn) {
  var gen = fn();

  function next(err, data) {
    var result = gen.next(data);
    if (result.done) return;
    result.value(next);
  }

  next();
}

function* g() {
  // ...
}

run(g);

The run function in the code above is an automatic executor for Generator functions. The internal next function is the Thunk callback function. next first moves the pointer to the Generator function's next step (gen.next method), then checks whether the Generator function has ended (result.done property). If it has not ended, it passes the next function into the Thunk function (result.value property); otherwise, it exits directly.

With this executor, executing Generator functions is much more convenient. No matter how many asynchronous operations there are internally, simply pass the Generator function to the run function. Of course, the prerequisite is that each asynchronous operation must be a Thunk function, meaning what follows the yield command must be a Thunk function.

var g = function* (){
  var f1 = yield readFileThunk('fileA');
  var f2 = yield readFileThunk('fileB');
  // ...
  var fn = yield readFileThunk('fileN');
};

run(g);

In the code above, function g encapsulates n asynchronous file read operations. Just by executing the run function, all these operations are completed automatically. This way, asynchronous operations can not only be written like synchronous operations, but can also be executed with a single line of code.

Thunk functions are not the only solution for automatic execution of Generator functions. The key to automatic execution is that there must be a mechanism to automatically control the flow of the Generator function, receiving and returning program execution control. Callback functions can accomplish this, and so can Promise objects.

# co Module

# Basic Usage

The co module is a small tool released by the well-known programmer TJ Holowaychuk in June 2013 for automatically executing Generator functions.

Below is a Generator function for reading two files in sequence.

var gen = function* () {
  var f1 = yield readFile('/etc/fstab');
  var f2 = yield readFile('/etc/shells');
  console.log(f1.toString());
  console.log(f2.toString());
};

The co module lets you avoid writing a Generator function executor yourself.

var co = require('co');
co(gen);

In the code above, the Generator function just needs to be passed to the co function and it will automatically execute.

The co function returns a Promise object, so you can use the then method to add callback functions.

co(gen).then(function (){
  console.log('Generator function execution complete');
});

In the code above, when the Generator function finishes executing, a notification message is output.

# Principles of the co Module

Why can co automatically execute Generator functions?

As mentioned earlier, a Generator is a container for asynchronous operations. Its automatic execution requires a mechanism that can automatically return execution control when an asynchronous operation produces a result.

Two methods can achieve this.

(1) Callback functions. Wrap asynchronous operations as Thunk functions and return control inside the callback function.

(2) Promise objects. Wrap asynchronous operations as Promise objects and use the then method to return control.

The co module is essentially both automatic executors (Thunk functions and Promise objects) wrapped into one module. The prerequisite for using co is that after the yield command of a Generator function, there can only be Thunk functions or Promise objects. If all members of an array or object are Promise objects, co can also be used — see the examples below.

The previous section already introduced the Thunk function-based automatic executor. Now let us look at the Promise object-based automatic executor. This is necessary for understanding the co module.

# Promise Object-Based Automatic Execution

Using the same example as above, first wrap the fs module's readFile method into a Promise object.

var fs = require('fs');

var readFile = function (fileName){
  return new Promise(function (resolve, reject){
    fs.readFile(fileName, function(error, data){
      if (error) return reject(error);
      resolve(data);
    });
  });
};

var gen = function* (){
  var f1 = yield readFile('/etc/fstab');
  var f2 = yield readFile('/etc/shells');
  console.log(f1.toString());
  console.log(f2.toString());
};

Then, manually execute the Generator function above.

var g = gen();

g.next().value.then(function(data){
  g.next(data).value.then(function(data){
    g.next(data);
  });
});

Manual execution is essentially using then methods to add layers of callback functions. Understanding this point, we can write an automatic executor.

function run(gen){
  var g = gen();

  function next(data){
    var result = g.next(data);
    if (result.done) return result.value;
    result.value.then(function(data){
      next(data);
    });
  }

  next();
}

run(gen);

In the code above, as long as the Generator function has not reached the last step, the next function calls itself, achieving automatic execution.
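
The executor is not tied to fs: anything that yields a Promise works. Below is the same run function, repeated for self-containment and exercised with a hypothetical delay helper (the names delay and steps are illustrative, not part of the original example):

```javascript
// The run executor from above, driven by a generic Promise source.
function run(gen) {
  var g = gen();

  function next(data) {
    var result = g.next(data);
    if (result.done) return result.value;
    result.value.then(function (data) {
      next(data);
    });
  }

  next();
}

// A stand-in for readFile: resolves with its value after a short delay.
function delay(value) {
  return new Promise(function (resolve) {
    setTimeout(function () { resolve(value); }, 10);
  });
}

var steps = [];
run(function* () {
  var a = yield delay('first');
  steps.push(a);
  var b = yield delay('second');
  steps.push(b);
  console.log(steps.join(','));  // → first,second
});
```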

# Source Code of the co Module

co is an extension of the automatic executor above, and its source code is only a few dozen lines — very simple.

First, the co function accepts a Generator function as a parameter and returns a Promise object.

function co(gen) {
  var ctx = this;

  return new Promise(function(resolve, reject) {
  });
}

Inside the returned Promise object, co first checks whether the parameter gen is a Generator function. If it is, it executes the function to get an internal pointer object; if not, it returns and changes the Promise object's state to resolved.

function co(gen) {
  var ctx = this;

  return new Promise(function(resolve, reject) {
    if (typeof gen === 'function') gen = gen.call(ctx);
    if (!gen || typeof gen.next !== 'function') return resolve(gen);
  });
}

Next, co wraps the next method of the Generator function's internal pointer object as the onFulfilled function. This is mainly to be able to catch thrown errors.

function co(gen) {
  var ctx = this;

  return new Promise(function(resolve, reject) {
    if (typeof gen === 'function') gen = gen.call(ctx);
    if (!gen || typeof gen.next !== 'function') return resolve(gen);

    onFulfilled();
    function onFulfilled(res) {
      var ret;
      try {
        ret = gen.next(res);
      } catch (e) {
        return reject(e);
      }
      next(ret);
    }
  });
}

Finally, the key next function, which calls itself repeatedly.

function next(ret) {
  if (ret.done) return resolve(ret.value);
  var value = toPromise.call(ctx, ret.value);
  if (value && isPromise(value)) return value.then(onFulfilled, onRejected);
  return onRejected(
    new TypeError(
      'You may only yield a function, promise, generator, array, or object, '
      + 'but the following object was passed: "'
      + String(ret.value)
      + '"'
    )
  );
}

In the code above, the internal code of the next function has only four lines.

Line one checks whether the Generator function is at its last step; if so, it returns.

Line two ensures that each step's return value is a Promise object.

Line three uses the then method to add a callback function to the return value, then calls the next function again through onFulfilled.

Line four, if the parameters do not meet the requirements (parameters are neither Thunk functions nor Promise objects), changes the Promise object's state to rejected, thereby terminating execution.
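
The helpers toPromise and isPromise are not shown above. A hedged sketch of what they do might look like the following; the real co module also converts arrays, objects, and nested Generator functions:

```javascript
// A thenable check: co treats anything with a then method as a Promise.
function isPromise(obj) {
  return !!obj && typeof obj.then === 'function';
}

// Normalize a yielded value: Promises pass through; functions are treated
// as Thunk functions and wrapped so their callback settles a Promise.
function toPromise(value) {
  if (isPromise(value)) return value;
  if (typeof value === 'function') {
    return new Promise(function (resolve, reject) {
      value(function (err, res) {
        if (err) return reject(err);
        resolve(res);
      });
    });
  }
  return value;  // anything else falls through to the TypeError branch
}
```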

# Handling Concurrent Asynchronous Operations

co supports concurrent asynchronous operations, meaning certain operations can proceed simultaneously, and only after all are complete does the next step begin.

In this case, concurrent operations are placed in an array or object, following a yield statement.

// Array notation
co(function* () {
  var res = yield [
    Promise.resolve(1),
    Promise.resolve(2)
  ];
  console.log(res);
}).catch(onerror);

// Object notation
co(function* () {
  var res = yield {
    1: Promise.resolve(1),
    2: Promise.resolve(2),
  };
  console.log(res);
}).catch(onerror);

Below is another example.

co(function* () {
  var values = [n1, n2, n3];
  yield values.map(somethingAsync);
});

function* somethingAsync(x) {
  // do something async
  return y
}

The code above allows three somethingAsync asynchronous operations to proceed concurrently. Only after all three complete does the next step begin.
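
Under the hood, what co does with a yielded array is roughly a Promise.all: all operations start immediately, and the results come back in array order regardless of which finishes first. A sketch with plain Promises (delay is a hypothetical helper, not part of co):

```javascript
// A stand-in async operation: resolves with value after ms milliseconds.
function delay(value, ms) {
  return new Promise(function (resolve) {
    setTimeout(function () { resolve(value); }, ms);
  });
}

Promise.all([delay(1, 20), delay(2, 10)])
  .then(function (res) {
    console.log(res);  // → [ 1, 2 ] — array order, even though 2 finished first
  });
```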

# Example: Handling Streams

Node provides Stream mode for reading and writing data, which processes only a portion of data at a time — the data is processed piece by piece, like a "data stream". This is very beneficial for processing large-scale data. Stream mode uses the EventEmitter API and emits three events.

  • data event: the next chunk of data is ready.
  • end event: the entire "data stream" has been processed.
  • error event: an error occurred.

Using the Promise.race() function, we can determine which of these three events occurs first. Only when the data event occurs first do we proceed to process the next data chunk. This way, we can use a while loop to complete reading all the data.

const co = require('co');
const fs = require('fs');

const stream = fs.createReadStream('./les_miserables.txt');
let valjeanCount = 0;

co(function*() {
  while(true) {
    const res = yield Promise.race([
      new Promise(resolve => stream.once('data', resolve)),
      new Promise(resolve => stream.once('end', resolve)),
      new Promise((resolve, reject) => stream.once('error', reject))
    ]);
    if (!res) {
      break;
    }
    stream.removeAllListeners('data');
    stream.removeAllListeners('end');
    stream.removeAllListeners('error');
    valjeanCount += (res.toString().match(/valjean/ig) || []).length;
  }
  console.log('count:', valjeanCount); // count: 1120
});

The code above uses Stream mode to read the text file of "Les Miserables". For each data chunk, the stream.once method is used to add one-time callback functions on the data, end, and error events. The variable res only has a value when the data event fires, and then accumulates the count of the word valjean appearing in each data chunk.
