One of the best things about working in open source is that you can pick and choose what you find most interesting to work on, which makes it easy to stay involved. But it's not always easy.
In this volume of Need to Node, you can find the latest news on the Node.js version 14 release, Diagnostics in Node.js, and The Cost of JavaScript Frameworks.
Need to Node is a weekly bulletin designed to keep you up-to-date with the latest news on the Node.js project, events and articles. You are always welcome to collaborate and participate. Please let us know if we missed a piece of content you think should be included!
Node.js version 14 Released — Woo-hoo, it’s finally here! Node.js v14 now becomes the current release line, and it will become an LTS (Long Term Support) release in October. For production applications, it’s recommended to keep using Node.js version 12 for now. Some of the most exciting features include:
The Cost of JavaScript Frameworks. With JavaScript you end up paying a performance tax no less than four times:
If you find any Node.js or JavaScript related content over the next week (or beyond!), never hesitate to reach out to us on Twitter at @NodeSource to share and get it included in Need to Node - our DMs are open if you don’t want to share publicly!
In this volume of Need to Node, you can find the latest news on the Node.js v13.11.0 (Current) Release, npm is joining GitHub and Three Things You Didn't Know You Could Do with npm Scripts.
Notable changes in this release include updates to async_hooks, cli, and fs.

Three Things You Didn't Know You Could Do with npm Scripts — no ^, >=, or * required. Exciting! By Valeri Karpov.
In my previous blog post, I explored the differences, advantages, and disadvantages of three of the most popular Node.js frameworks: Express, Koa, and Hapi. In this blog post, I’m going to examine the differences between three more very popular frameworks: Next, Nuxt, and Nest. These three frameworks support server-side rendering, and they are closely related to React, Vue, and Angular (the three most widely used front-end frameworks), respectively.
The comparison is based on:
Next is the most popular of the three frameworks: it has more weekly npm downloads, GitHub stars, and contributors than the other two.
Next.js is a React framework that lets you build server-side rendering and static web applications using React.
Install it:
npm install --save next react react-dom
and add a script to your package.json like this:
{
"scripts": {
"dev": "next",
"build": "next build",
"start": "next start"
}
}
After that, the file-system is the main API. Every .js file becomes a route that gets automatically processed and rendered.
Populate ./pages/index.js
inside your project:
function Home() {
return <div>Hello world!</div>;
}
export default Home;
Then just run npm run dev
and go to http://localhost:3000
. To use another port, you can run npm run dev -- -p <your port here>
.
To measure performance, I used Apache Bench for benchmarking, which highlights how many requests per second the app is capable of serving. I also used Lighthouse to audit performance, accessibility, best practices, and SEO.
This is a basic Hello World app in Next.js. It handles 550.87 requests per second. This value is the result of dividing the number of requests by the total time taken. The average time spent per request is 18.153 ms.
Compared to the other two frameworks, Next.js scored better overall than Nuxt.js but worse than Nest.js.
In the report provided by Lighthouse, we can see that the performance, accessibility, best practices, and SEO scores are all above 70, which is good. Compared with the other two frameworks, however, Next.js had the lowest score in performance and the highest score in best practices.
The Next.js community communicates through chat, Slack, and issues and pull requests on GitHub.
Also, the awesome-nextjs repo contains a list of essentials, articles, boilerplates, extensions, apps, books, and videos that are useful for developers using Next.js.
Nuxt is a Vue.js Meta Framework to create complex, fast, and universal web applications quickly.
Install it:
$ npm i nuxt
To create a basic app:
$ npx create-nuxt-app <project-name>
You can start directly with the CLI create-nuxt-app for the latest updates, or you can start from one of the starter templates:

- starter: basic Nuxt.js project template
- express: Nuxt.js + Express
- koa: Nuxt.js + Koa
- adonuxt: Nuxt.js + AdonisJS
- micro: Nuxt.js + Micro
- nuxtent: Nuxt.js + Nuxtent module for content-heavy sites
This is the most basic example of a “Hello World!” app on Nuxt:
<template>
<div>
<h1>Hello world!</h1>
<NLink to="/about">
About Page
</NLink>
</div>
</template>
<script>
export default {
head: {
title: 'Home page'
}
}
</script>
This is a basic Hello World app in Nuxt.js. It handles 190.05 requests per second. The average time spent per request is 52.619 ms. On this metric, Nuxt.js performs the worst compared to the other two frameworks.
Nuxt.js has the highest score in three of the four measures: performance, accessibility, and SEO.
There is a GitHub organization where you can find modules and projects from the Nuxt.js community. There is also a curated list of awesome things related to Nuxt.js, awesome-nuxt, including modules, tools, mentions of Nuxt.js, showcases, tutorials, blogs, books, starter templates, official examples, and projects using Nuxt.js.
The community communicates through a Gitter chat room, Telegram, a Russian community, Discord, Twitter, and a YouTube channel.
A progressive Node.js framework for building efficient, scalable, and enterprise-grade server-side applications on top of TypeScript and JavaScript (ES6, ES7, ES8), Nest is heavily inspired by Angular.
Nest is a framework for building efficient, scalable Node.js server-side applications. It uses modern JavaScript, is built with TypeScript (preserves compatibility with pure JavaScript) and combines elements of OOP (Object Oriented Programming), FP (Functional Programming), and FRP (Functional Reactive Programming).
Under the hood, Nest makes use of Express but also provides compatibility with a wide range of other libraries, such as Fastify, allowing for easy use of the myriad third-party plugins that are available.
Install it:
$ npm i @nestjs/cli
$ nest new project-name
Alternatively, to install the TypeScript starter project with Git:
$ git clone https://github.com/nestjs/typescript-starter.git project
$ cd project
$ npm install
$ npm run start
After installing Nest.js with the npm cli
command, and creating a new project with nest new project-name
, a src/
directory will be created and populated with several core files, including main.ts
.
The main.ts
includes an async function, which will bootstrap our application:
import { NestFactory } from '@nestjs/core';
import { ApplicationModule } from './app.module';
async function bootstrap() {
const app = await NestFactory.create(ApplicationModule);
await app.listen(3000);
}
bootstrap();
Then, to run the app listening on port 3000, execute:
$ npm run start
This is a basic Hello World app in Nest.js. It handles 928.18 requests per second. The average time spent per request is 10.774 ms. On this metric, Nest.js performed the best out of the three frameworks we compared.
In the report provided by Lighthouse, Nest.js has very high performance but scored comparatively lower on the other key factors: accessibility, best practices, and SEO.
There is a group of developers providing handy packages on the NestJS Community organization on GitHub. Some of their popular packages are:

- nestjs-config, a config module for NestJS using dotenv
- nest-access-control, role- and attribute-based access control for NestJS
- nestjs-flub, a pretty error stack viewer
Even if Nest is not the most popular framework, it is the one with the best performance, and it has many advantages. You should give it a try!
The community has a Spectrum chat and Twitter.
Streams in Node.js have a reputation for being hard to work with, and even harder to understand.
In the words of Dominic Tarr: “Streams are Node’s best and most misunderstood idea.” Even Dan Abramov, creator of Redux and a core team member of React.js, is afraid of Node streams.
This article will help you understand streams and how to work with them. So, don’t be afraid. We can figure this out!
Streams are one of the fundamental concepts that power Node.js applications. They are a data-handling method, used to read input and write output sequentially.
Streams are a way to handle reading/writing files, network communications, or any kind of end-to-end information exchange in an efficient way.
What makes streams unique is that instead of a program reading a file into memory all at once, the traditional way, streams read chunks of data piece by piece, processing the content without keeping it all in memory.
This makes streams really powerful when working with large amounts of data. For example, a file can be larger than your free memory space, making it impossible to read the whole file into memory in order to process it. That’s where streams come to the rescue! Using streams to process data in smaller chunks makes it possible to read larger files.
Take “streaming” services such as YouTube or Netflix, for example: these services don’t make you download the video and audio feed all at once. Instead, your browser receives the video as a continuous flow of chunks, allowing you to start watching and/or listening almost immediately.
However, streams are not only about working with media or big data. They also give us the power of ‘composability’ in our code. Designing with composability in mind means several components can be combined in a certain way to produce the same type of result. In Node.js it’s possible to compose powerful pieces of code by piping data to and from other smaller pieces of code, using streams.
Streams basically provide two major advantages compared to other data handling methods:

- Memory efficiency: you don’t need to load large amounts of data into memory before you are able to process it.
- Time efficiency: you can start processing data as soon as the first chunk arrives, instead of waiting until the whole payload is available.

Node.js exposes streams through several core APIs. For example:

- fs.createWriteStream() lets us write data to a file using streams.
- fs.createReadStream() lets us read the contents of a file.
- net.Socket is both a readable and a writable stream (a duplex stream).
If you have already worked with Node.js, you may have come across streams. For example, in a Node.js based HTTP server, request
is a readable stream and response
is a writable stream. You might have used the fs
module, which lets you work with both readable and writable file streams. Whenever you use Express, you are using streams to interact with the client; streams are also being used by every database connection driver you can work with, because TCP sockets, the TLS stack, and other connections are all based on Node.js streams.
We first require the Readable stream, and we initialize it.
const Stream = require('stream')
const readableStream = new Stream.Readable()
Now that the stream is initialized, we can send data to it:
readableStream.push('ping!')
readableStream.push('pong!')
It’s highly recommended to use async iterators when working with streams. According to Dr. Axel Rauschmayer, asynchronous iteration is a protocol for retrieving the contents of a data container asynchronously (meaning the current “task” may be paused before retrieving an item). It’s also important to mention that the stream async iterator implementation uses the 'readable' event internally.

You can use async iterators when reading from readable streams:
import * as fs from 'fs';
async function logChunks(readable) {
for await (const chunk of readable) {
console.log(chunk);
}
}
const readable = fs.createReadStream(
'tmp/test.txt', {encoding: 'utf8'});
logChunks(readable);
// Output:
// 'This is a test!\n'
It’s also possible to collect the contents of a readable stream in a string:
import {Readable} from 'stream';
import * as assert from 'assert';
async function readableToString2(readable) {
let result = '';
for await (const chunk of readable) {
result += chunk;
}
return result;
}
const readable = Readable.from('Good morning!', {encoding: 'utf8'});
assert.equal(await readableToString2(readable), 'Good morning!');
Note that, in this case, we had to use an async function because we wanted to return a Promise.
It’s important to keep in mind not to mix async functions with EventEmitter because, currently, there is no way to catch a rejection when it is emitted within an event handler, causing hard-to-track bugs and memory leaks. The current best practice is to always wrap the content of an async function in a try/catch block and handle errors, but this is error-prone. This pull request aims to solve this issue once it lands on Node core.
To learn more about Node.js streams via async iteration, check out this great article.
stream.Readable.from(iterable, [options])
is a utility method for creating readable streams out of iterables, which holds the data contained in the iterable. The iterable can be a synchronous or an asynchronous iterable. The options parameter is optional and can, among other things, be used to specify a text encoding.
const { Readable } = require('stream');
async function * generate() {
yield 'hello';
yield 'streams';
}
const readable = Readable.from(generate());
readable.on('data', (chunk) => {
console.log(chunk);
});
According to the Streams API, readable streams effectively operate in one of two modes: flowing and paused. A Readable stream can be in object mode or not, regardless of whether it is in flowing mode or paused mode.
In paused mode, the stream.read() method must be called explicitly to read chunks of data from the stream.

In flowing mode, to read data from a stream you can listen to the data event and attach a callback. When a chunk of data is available, the readable stream emits a data event and your callback executes. Take a look at the following snippet:
var fs = require("fs");
var data = '';
var readerStream = fs.createReadStream('file.txt'); //Create a readable stream
readerStream.setEncoding('UTF8'); // Set the encoding to be utf8.
// Handle stream events --> data, end, and error
readerStream.on('data', function(chunk) {
data += chunk;
});
readerStream.on('end',function() {
console.log(data);
});
readerStream.on('error', function(err) {
console.log(err.stack);
});
console.log("Program Ended");
The function call fs.createReadStream()
gives you a readable stream. Initially, the stream is in a static state. As soon as you listen to the data event and attach a callback, it starts flowing. After that, chunks of data are read and passed to your callback. The stream implementor decides how often a data event is emitted. For example, an HTTP request may emit a data event once every few KB of data are read. When you are reading data from a file, you may decide to emit a data event once a line is read.
When there is no more data to read (end is reached), the stream emits an end event. In the above snippet, we listen to this event to get notified when the end is reached.
Also, if there is an error, the stream will emit an error event and notify you.
In paused mode, you just need to call read() on the stream instance repeatedly until every chunk of data has been read, like in the following example:
var fs = require('fs');
var readableStream = fs.createReadStream('file.txt');
var data = '';
var chunk;
readableStream.on('readable', function() {
while ((chunk=readableStream.read()) != null) {
data += chunk;
}
});
readableStream.on('end', function() {
console.log(data)
});
The read() function reads some data from the internal buffer and returns it. When there is nothing to read, it returns null. So, in the while loop, we check for null and terminate the loop. Note that the readable event is emitted when a chunk of data can be read from the stream.
All Readable
streams begin in paused mode but can be switched to flowing mode in one of the following ways:
- Adding a 'data' event handler.
- Calling the stream.resume() method.
- Calling the stream.pipe() method to send the data to a Writable.

The Readable can switch back to paused mode using one of the following:

- Calling the stream.pause() method.
- Removing all pipe destinations, e.g. by calling the stream.unpipe() method.

The important concept to remember is that a Readable
will not generate data until a mechanism for either consuming or ignoring that data is provided. If the consuming mechanism is disabled or taken away, the Readable
will attempt to stop generating the data.
Adding a readable event handler automatically makes the stream stop flowing, and the data has to be consumed via readable.read(). If the readable event handler is removed, then the stream will start flowing again if there is a data event handler.
To write data to a writable stream you need to call write()
on the stream instance. Like in the following example:
var fs = require('fs');
var readableStream = fs.createReadStream('file1.txt');
var writableStream = fs.createWriteStream('file2.txt');
readableStream.setEncoding('utf8');
readableStream.on('data', function(chunk) {
writableStream.write(chunk);
});
The above code is straightforward. It simply reads chunks of data from an input stream and writes them to the destination using write(). This function returns a boolean value indicating whether the write succeeded immediately: if true, you can keep writing more data; if false, the stream’s internal buffer is full for the moment and you should stop writing. The writable stream will let you know when you can start writing more data again by emitting a drain event.
Calling the writable.end()
method signals that no more data will be written to the Writable. If provided, the optional callback function is attached as a listener for the 'finish' event.
// Write 'hello, ' and then end with 'world!'.
const fs = require('fs');
const file = fs.createWriteStream('example.txt');
file.write('hello, ');
file.end('world!');
// Writing more now is not allowed!
Using a writable stream you can read data from a readable stream:
const Stream = require('stream')
const readableStream = new Stream.Readable()
const writableStream = new Stream.Writable()
writableStream._write = (chunk, encoding, next) => {
console.log(chunk.toString())
next()
}
readableStream.pipe(writableStream)
readableStream.push('ping!')
readableStream.push('pong!')
writableStream.end()
You can also use async iterators to write to a writable stream, which is recommended:
import * as util from 'util';
import * as stream from 'stream';
import * as fs from 'fs';
import {once} from 'events';
import * as assert from 'assert';
const finished = util.promisify(stream.finished); // (A)
async function writeIterableToFile(iterable, filePath) {
const writable = fs.createWriteStream(filePath, {encoding: 'utf8'});
for await (const chunk of iterable) {
if (!writable.write(chunk)) { // (B)
// Handle backpressure
await once(writable, 'drain');
}
}
writable.end(); // (C)
// Wait until done. Throws if there are errors.
await finished(writable);
}
await writeIterableToFile(
['One', ' line of text.\n'], 'tmp/log.txt');
assert.equal(
fs.readFileSync('tmp/log.txt', {encoding: 'utf8'}),
'One line of text.\n');
The default version of stream.finished() is callback-based, but it can be turned into a Promise-based version via util.promisify() (line A).

In this example, the following two patterns are used:
Writing to a writable stream while handling backpressure (line B):
if (!writable.write(chunk)) {
await once(writable, 'drain');
}
Closing a writable stream and waiting until writing is done (line C):
writable.end();
await finished(writable);
Piping is a mechanism where we provide the output of one stream as the input to another stream, normally used to get data from one stream and pass it on to another. There is no limit on piping operations; in other words, piping can be used to process streamed data in multiple steps.
stream.pipeline() was introduced in Node 10.x. It is a module method to pipe between streams that forwards errors, properly cleans up, and provides a callback when the pipeline is complete.
Here is an example of using pipeline:
const { pipeline } = require('stream');
const fs = require('fs');
const zlib = require('zlib');
// Use the pipeline API to easily pipe a series of streams
// together and get notified when the pipeline is fully done.
// A pipeline to gzip a potentially huge video file efficiently:
pipeline(
fs.createReadStream('The.Matrix.1080p.mkv'),
zlib.createGzip(),
fs.createWriteStream('The.Matrix.1080p.mkv.gz'),
(err) => {
if (err) {
console.error('Pipeline failed', err);
} else {
console.log('Pipeline succeeded');
}
}
);
pipeline
should be used instead of pipe
, as pipe is unsafe: unlike pipeline, it does not forward errors or destroy the remaining streams when one of them fails.
The Node.js stream module provides the foundation upon which all streaming APIs are built.
The Stream module is a native module, shipped by default in Node.js. The Stream class extends EventEmitter, which handles events asynchronously in Node; because of this, streams are inherently event-based.
To access the stream module:
const stream = require('stream');
The stream
module is useful for creating new types of stream instances. It is usually not necessary to use the stream
module to consume streams.
Due to their advantages, many Node.js core modules provide native stream handling capabilities, most notably:
- net.Socket is the main node API that streams are based on, and it underlies most of the following APIs
- process.stdin returns a stream connected to stdin
- process.stdout returns a stream connected to stdout
- process.stderr returns a stream connected to stderr
- fs.createReadStream() creates a readable stream to a file
- fs.createWriteStream() creates a writable stream to a file
- net.connect() initiates a stream-based connection
- http.request() returns an instance of the http.ClientRequest class, which is a writable stream
- zlib.createGzip() compresses data using gzip (a compression algorithm) into a stream
- zlib.createGunzip() decompresses a gzip stream
- zlib.createDeflate() compresses data using deflate (a compression algorithm) into a stream
- zlib.createInflate() decompresses a deflate stream
Here are some important events related to writable streams:

- error – Emitted to indicate that an error has occurred while writing or piping.
- pipe – When a readable stream is piped into a writable stream, this event is emitted by the writable stream.
- unpipe – Emitted when you call unpipe on the readable stream and stop it from piping into the destination stream.

This was all about the basics of streams. Streams, pipes, and chaining are the core and most powerful features in Node.js. Streams can indeed help you write neat and performant code to perform I/O.
Also, there is a Node.js strategic initiative worth looking into, called BOB, aiming to improve Node.js streaming data interfaces, both within Node.js core internally and, hopefully, as future public APIs.
Special thanks to Matteo Collina and Jeremiah Senkpiel for your feedback!
This article was first published on the NodeSource blog in January 2017.
As with any programming language, platform, or tool, the first step to using it is getting it installed. Many of them typically come with a speedy way to upgrade when a new version is available.
By default, there's no way to upgrade your version of Node.js from within Node.js itself. That said, there's a fantastic tool for the community called nvm that allows you to manage the versions of Node.js you've got installed locally.
One awesome aspect of nvm
is that it manages versions of Node.js rather than just upgrading them. This means you can have the latest version of Node.js, the latest versions of all the LTS release lines, and any number of other versions you want to use or test as well.
In this quick tutorial, we'll take a look at how to install nvm, and then how to start using it as your version manager for Node.js. Once we've completed the tutorial, you'll be ready to take the next step with Node.js.
This guide covers installing nvm on macOS and Linux - note that not every version of Node.js supports every version of macOS or Linux.
Here's the abbreviated guide, highlighting the major steps:
Download and run the nvm install script via cURL:

curl -o- https://raw.githubusercontent.com/creationix/nvm/v0.33.0/install.sh | bash

Verify the install by running nvm --version, which should return the version of nvm installed.

Install the version of Node.js you want:

nvm install node
nvm use node

Or install and use the latest LTS release:

nvm install --lts
nvm use --lts
In some cases, like when installing Node.js releases from their source or installing versions of Node.js before 0.8.6
(when the project started shipping binaries), you'll need to ensure that your system has the appropriate C++ build tools.
For LTS and modern releases, you will not need this step. That said, it's nice to have, as it ensures that the majority of requirements are met in any scenario.
On macOS, you've got two options for a C++ compiler: the full XCode application or the stand-alone Command Line Tools portion of Xcode.
To get these on macOS, you can follow these steps:
Run xcode-select --install as a command, then click “Install” when prompted.
On Linux, the C++ compiler will vary from distribution to distribution. For example, on Debian and Ubuntu you'll need to install build-essential and libssl-dev, but this may be different on your given Linux distribution.
To get build-essential and libssl-dev on Debian and Ubuntu distributions, you can run these commands:
sudo apt-get install build-essential # Install the build-essential package - let this run to completion
sudo apt-get install libssl-dev # Install the libssl-dev package - also let this one run to completion
Once you've got the right C++ compiler for your system, now it's time to run the nvm install script. Here are the single-step install scripts for both macOS and Linux. You've got the option of cURL or Wget but both achieve the same result.
Note: If your Linux system doesn't have either cURL or Wget, you can run sudo apt-get install curl
and use the cURL method.
To install nvm with the cURL method, run the following command in your terminal:
curl -o- https://raw.githubusercontent.com/creationix/nvm/v0.33.0/install.sh | bash
To install nvm with the Wget method, run the following command in your terminal:
wget -qO- https://raw.githubusercontent.com/creationix/nvm/v0.33.0/install.sh | bash
After running the install script from Step 2, nvm should have successfully installed. To ensure that nvm is up and running on your machine, you can test it with the following command:
nvm --version
This command will return something like (though not necessarily exactly) the following:
nvm --version # The command we ran - it checks the currently installed version of nvm
0.33.0 # The current version of nvm - yours may differ!
Can't find the nvm command after running the install script?

If you're using macOS, you may be missing a .bash_profile file - to troubleshoot this, you can run touch ~/.bash_profile in your command line and re-run the installer script.
If the problem persists after that, you can open the existing .bash_profile
file (using your favorite text editor) and add the following line to it:
source ~/.bashrc
If you're still having issues, you can take a peek at this issue to find a discussion of the problem and a collection of possible resolutions.
Congratulations! You've now got nvm
- a tool to easily allow you to manage and swap out the versions of Node.js you've got installed locally. Now, let's get you started with doing just that.
To install the latest available version of Node.js, you can use the following command:
nvm install node
Next, to use that version of Node.js in any new shell, you can simply run the use
command:
nvm use node
To install the latest available LTS version of Node.js, you can run the following command:
nvm install --lts
And to use that latest LTS version of Node.js in any new shell, you can simply run the use
command:
nvm use --lts
Now you've got a fantastic version manager for Node.js. It's time to start building!
We've got some resources to get you kickstarted! Both the breadth and depth of the Node.js and JavaScript ecosystems are quite large - in addition to developer tools like NodeSource N|Solid and Certified Modules, we've got a ton of tutorials, guides, and articles to help you get kickstarted with Node.js.
If you're interested in keeping your code clean, maintainable, and collaborative, take a peek at our post on using ESLint for linting your JavaScript applications. Are you interested in building web applications with Node.js? One of the most challenging aspects of web apps is security - you can learn security best practices for Express to lock down your web apps, to prevent breaches and attacks. Or, maybe you want to deploy your Node.js apps with Docker? Then you should definitely read our article on dockerizing your Node.js applications.
That said, if you want to keep in touch with the Node.js ecosystem, you should follow @NodeSource on Twitter! We'll keep you updated with important news from the core Node.js project, fresh and useful Node.js tutorials, and more.