Your latest Node.js content, news and updates in one place.

Webinar: The end of monoliths? A guide to backend architecture in 2020

You can’t build a stable backend without solid architecture, and without a proper backend, you can’t develop a good application. But what does a stable backend mean? And how do you keep up with all the architectural trends in the ever-changing, fast-paced reality of modern development? 📈

Is monolithic architecture already dead? Should your team focus on microservices instead? Is following the trends a good approach, or should you rather lean on thoroughly tested solutions? 🤔

During our upcoming webinar, Adam Polak – Head of the Node.js Team at The Software House – will guide you through the current backend architecture trends and solutions, pointing out the most important (but not always obvious) issues that you should pay attention to. Here’s an excerpt from the extensive agenda:

  • analysis of the key elements of a backend: architecture, infrastructure, environment, communication;
  • monolithic architecture – is it still a good idea?
  • modular monolith vs microservices;
  • why are most services still monolith-based?
  • what were the most significant architectural changes of the last few years?
  • REST API, BFF, GraphQL and communication;
  • what does a stable backend mean in 2020?

The subjects discussed during the event will be 100% based on real-life cases and experience. Adam will also answer your questions during a Q&A session after the webinar. The whole event will last approximately an hour.

Read More
Node Is Simple - Part 4

tl;dr: This is the fourth article of the Node is Simple article series, in which I discuss how to create a simple and secure NodeJS, Express, MongoDB web application.

To catch up on the past tutorials, see Node is Simple Part 1, Node is Simple Part 2 and Node is Simple Part 3.

Hello fellow developers, now that you’ve come across this article series, let’s get to it, shall we? In the past articles, I discussed how to create simple CRUD endpoints with MongoDB as the database, and how to use Postman to test those endpoints. In this tutorial, I will discuss how to upload files to MongoDB using MongoDB GridFS and how to view (or stream) them.

So what is this GridFS?

We can easily upload files to a folder with the Multer middleware. This is a good reference for that.

Uploading Files and Serve Directory Listing Using NodeJS

Here, though, I am going to discuss how to use MongoDB itself as the file storage. There is a little hiccup: a single document in MongoDB’s BSON format can store at most 16 MB. To overcome this limit, MongoDB offers a feature called GridFS.

Instead of storing a file in a single document, GridFS divides the file into parts, or chunks [1], and stores each chunk as a separate document.

Enough with the theory

Yes, let’s move on to the implementation. First, we have to create a GridFS driver. Let’s create a gridfs-service.js file inside the /database directory.

const mongoose = require("mongoose");
const config = require("../config");
const dbPath = config.MONGO_URI;
const chalk = require("chalk");

const GridFsStorage = require("multer-gridfs-storage");
const Grid = require("gridfs-stream");
Grid.mongo = mongoose.mongo;

const conn = mongoose.createConnection(dbPath, {
  useNewUrlParser: true,
  useUnifiedTopology: true,
  useCreateIndex: true,
  useFindAndModify: false
});

conn.on("error", () => {
  console.log(chalk.red("[-] Error occurred from the database"));
});

let gfs, gridFSBucket;

conn.once("open", () => {
  gridFSBucket = new mongoose.mongo.GridFSBucket(conn.db, {
    bucketName: "file_uploads"
  });
  // Init stream
  gfs = Grid(conn.db);
  gfs.collection("file_uploads");
  console.log(
    chalk.yellow(
      "[!] The database connection opened successfully in GridFS service"
    )
  );
});

const getGridFSFiles = id => {
  return new Promise((resolve, reject) => {
    gfs.files.findOne({ _id: mongoose.Types.ObjectId(id) }, (err, files) => {
      if (err) reject(err);
      // Check if files
      if (!files || files.length === 0) {
        resolve(null);
      } else {
        resolve(files);
      }
    });
  });
};

const createGridFSReadStream = id => {
  return gridFSBucket.openDownloadStream(mongoose.Types.ObjectId(id));
};

const storage = new GridFsStorage({
  url: dbPath,
  cache: true,
  options: { useUnifiedTopology: true },
  file: (req, file) => {
    return new Promise(resolve => {
      const fileInfo = {
        filename: file.originalname,
        bucketName: "file_uploads"
      };
      resolve(fileInfo);
    });
  }
});

storage.on("connection", () => {
  console.log(chalk.yellow("[!] Successfully accessed the GridFS database"));
});

storage.on("connectionFailed", err => {
  console.log(chalk.red(err.message));
});

module.exports = mongoose;
module.exports.storage = storage;
module.exports.getGridFSFiles = getGridFSFiles;
module.exports.createGridFSReadStream = createGridFSReadStream;
/database/gridfs-service.js file (Contains the GridFS driver configurations)

In the connection’s open handler above, the bucketName is set to “file_uploads”. This means the collections we use for storing files are prefixed with file_uploads. After running the application, if you go to MongoDB Compass, you can see that two new collections have been created.

file_uploads.chunks // contains the file chunks (one file is divided into chunks of 255 kB each)

file_uploads.files // contains the metadata of the file (such as length, chunkSize, uploadDate, filename, md5 hash, and contentType)

Now we need to install several packages for this. Let’s install them.

$ npm install multer multer-gridfs-storage gridfs-stream

Multer is the middleware that handles multipart/form-data. This is a good reference if you are not familiar with it.

Understanding HTML Form Encoding: URL Encoded and Multipart Forms

The other two packages handle the file uploads with MongoDB GridFS.

After creating this file, let’s create the GridFS middleware. It is really easy to integrate with Express, since Express is handy with middleware. Let’s create the /middleware/gridfs-middleware.js file.

const multer = require("multer");
const { storage } = require("../database/gridfs-service");

const upload = multer({
  storage
});

module.exports = function GridFSMiddleware() {
  return upload.single("image");
};
/middleware/gridfs-middleware.js file (Contains GridFS middleware configuration)

There is something to mention here. In the upload.single("image") call, I set “image” as the form field name. You can set it to anything you like, but you have to remember it for later use. FYI, there are several methods of uploading files.

upload.single("field_name"); //for uploading a single file

upload.array("field_name"); //for uploading an array of files

upload.fields([{name: 'avatar'}, {name: 'gallery'}]); //for uploading an array of files with multiple field names

upload.none(); //not uploading any files but contains text fields as multipart form data

Here I used upload.single("image") because I want to upload only one image (probably I’ll change this to uploading the student’s profile picture later). However, the things I’ve described in the above code block are not something I brewed up myself 🤣. They are all described beautifully in the multer documentation.

Now we can set up the controller to upload files to MongoDB GridFS. Let’s update the /controllers/index.js file.

const GridFSMiddleware = require("../middleware/gridfs-middleware");
/** @route  POST /image
 *  @desc   Upload profile image
 *  @access Public
 */
router.post(
  "/image",
  [GridFSMiddleware()],
  asyncWrapper(async (req, res) => {
    const { originalname, mimetype, id, size } = req.file;
    res.send({ originalname, mimetype, id, size });
  })
);

Now let’s try to upload an image, shall we? Let’s start the web application as usual.

$ pm2 start


Figure 1: POST request containing the image file

As in figure 1, you have to select the request body type as form-data and set the key to image. Then select the key type as “File” (as seen in figure 2).


Figure 2: Selecting the key type of the form data
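If you prefer the command line to Postman, a roughly equivalent request can be made with curl (a sketch of mine, assuming the server from part 1 is listening on https://localhost:8081; -k accepts the self-signed certificate, and profile.png is a hypothetical local file):

$ curl -k -F "image=@profile.png" https://localhost:8081/image

The -F flag makes curl send multipart/form-data, with image as the field name, matching what we set in upload.single("image").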


Figure 3: Response from uploading the image to the DB

If you see a response like the one in figure 3, it means everything worked out. Now let’s see how MongoDB Compass shows the uploaded file.


Figure 4: file_uploads.files collection (Contains the metadata of the uploaded file)


Figure 5: file_uploads.chunks collection (Contains the file chunks of the uploaded file)

As you can see in figure 4, the file metadata is shown. In figure 5, you can see only one chunk. That is because the image we uploaded is less than 255 kB in size.

Now that we have uploaded the file, how do we view it? We cannot see the image from the DB, now can we? So let’s implement a way to view the uploaded image.

Now let’s update the /controllers/index.js file.

const { getGridFSFiles } = require("../database/gridfs-service");
const { createGridFSReadStream } = require("../database/gridfs-service");
/** @route   GET /image/:id
 *  @desc    View profile picture
 *  @access  Public
 */
router.get(
  "/image/:id",
  asyncWrapper(async (req, res) => {
    const image = await getGridFSFiles(req.params.id);
    if (!image) {
      // Return early so we do not also try to stream a missing image
      return res.status(404).send({ message: "Image not found" });
    }
    res.setHeader("content-type", image.contentType);
    const readStream = createGridFSReadStream(req.params.id);
    readStream.pipe(res);
  })
);

What we are doing here is: we get the image id from the request and check for the image in the DB; if the image exists, we stream it, and if not, we return a 404 with a not-found message. It is as simple as that.

Now let’s try that out.


Figure 6: GET request to view the image.

Remember the image id is what gets returned as the id in figure 3.


Figure 7: Image is streamed (Image is from [https://nodejs.org/en/about/resources/](https://nodejs.org/en/about/resources/))

If we send the GET request from a web browser, we’ll see the image in the browser. (If you get a “Your connection is not private” message, ignore it.) It will be the same as in figure 7.

So that was a lot of work, wasn’t it? But totally worth it. Now you can save your precious images inside a MongoDB database and view them from a browser. Go show off that talent to your friends and impress them 😁.

This is the full /controllers/index.js file for your reference.

/controllers/index.js file (Updated to reflect the view and upload image endpoints)

So, this is the end for this tutorial and in the next tutorial, I will tell you how to create and use custom middleware with Express. (Remember I told you Express is great with middleware). As usual, you can see the whole code behind this tutorial in my GitHub repository. (Check for the commit message “Tutorial 4 checkpoint”.)

Node is Simple

So until we meet again, happy coding…

Read More
Node Is Simple — Part 3

tl;dr: This is the third article of the Node is Simple article series, in which I discuss how to create a simple and secure NodeJS, Express, MongoDB web application.

To catch up on the past tutorials, see Node is Simple Part 1 and Node is Simple Part 2.

Hello, fellow developers, I am once again asking you to learn some NodeJS with me. In this tutorial, I will be discussing how to create basic CRUD operations in NodeJS. So without further ado, let’s start, shall we?

CREATE endpoint

As you may remember from my previous article, I created a simple endpoint to POST a student’s name and city and create a record (document) for that student. What I am going to do now is enhance what I’ve already done. Let’s update the model and the service to reflect our needs.

We are going to add some new fields to the student document.

const mongoose = require("../database");
const Schema = mongoose.Schema;

const studentSchema = new Schema(
  {
    _id: {
      type: mongoose.SchemaTypes.String,
      unique: true,
      required: true,
      index: true
    },
    name: { type: mongoose.SchemaTypes.String, required: true },
    city: { type: mongoose.SchemaTypes.String, required: true },
    telephone: { type: mongoose.SchemaTypes.Number, required: true },
    birthday: { type: mongoose.SchemaTypes.Date, required: true }
  },
  { strict: true, timestamps: true, _id: false }
);

const collectionName = "student";

const Student = mongoose.model(collectionName, studentSchema, collectionName);

module.exports = {
  Student
};
Updated /models/index.js file

What I’ve done here is add _id, telephone, and birthday as the new fields. And I have disabled the Mongoose default _id and specified my own.

Now let’s update the service file.

const { Student } = require("../models");

module.exports = class StudentService {
  async registerStudent(data) {
    const { _id, name, city, telephone, birthday } = data;

    const new_student = new Student({
      _id,
      name,
      city,
      telephone,
      birthday
    });

    const response = await new_student.save();
    const res = response.toJSON();
    delete res.__v;
    return res;
  }
};
Updated /services/index.js file

Before we test what we have done, I am going to let you in on a super-secret. If you have experience developing NodeJS applications, you have probably come across the Nodemon tool, which restarts the server whenever you change a file. But today I am going to tell you about an amazing tool called PM2.

What is PM2?

PM2 is a production-grade process management tool. It has various capabilities such as load balancing (which I will discuss in a later tutorial), enhanced logging features, managing environment variables, and many more. Its documentation is super nifty and worth checking out.

So let’s install PM2 and start using it in our web app.

$ npm install pm2@latest -g

Let’s create an ecosystem.config.js file in the project root, which will contain all of our environment variables, keys, secrets, and PM2 configuration.

const fs = require("fs");

const SERVER_CERT = fs.readFileSync(__dirname + "/config/server.cert", "utf8");
const SERVER_KEY = fs.readFileSync(__dirname + "/config/server.key", "utf8");

module.exports = {
  apps: [
    {
      name: "node-is-simple",
      script: "./index.js",
      watch: true,
      env: {
        NODE_ENV: "development",
        SERVER_CERT,
        SERVER_KEY,
        HTTP_PORT: 8080,
        HTTPS_PORT: 8081,
        MONGO_URI: "mongodb://localhost/students"
      }
    }
  ]
};
ecosystem.config.js file (The configuration file for PM2)

Since we have moved all of our keys into environment variables, we now have to change the /config/index.js file to reflect these changes.

module.exports = {
  SERVER_CERT: process.env.SERVER_CERT,
  SERVER_KEY: process.env.SERVER_KEY,
  HTTP_PORT: process.env.HTTP_PORT,
  HTTPS_PORT: process.env.HTTPS_PORT,
  MONGO_URI: process.env.MONGO_URI
};
Updated /config/index.js file (here we have moved all the secrets to the ecosystem.config.js file as environment variables)

Now, remember: /config/index.js is safe to commit to a public repository, but the ecosystem.config.js file is not. Also, never commit your private keys to a public repository (I’ve done it here purely for demonstration purposes). Protect your secrets like your cash 😂.
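One way to keep those secrets out of a repository (my suggestion, not something from the tutorial repo) is to list the sensitive files in .gitignore:

# .gitignore: keep PM2 secrets and TLS keys out of version control
ecosystem.config.js
config/server.cert
config/server.key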

Since we have done our initial setup, let’s run our application. One thing to keep in mind: if you start the application as usual,

$ node index.js

it won’t work, because the environment variables the Node web application needs now live in the PM2 configuration. We have to use PM2 to start the application instead. Go to the project root folder and run the following command.

$ pm2 start


Figure 1: PM2 startup

If you see this (figure 1) in your console, it is working as it should. Now, to see the logs, run the following command.

$ pm2 logs


Figure 2: PM2 logs

If you see this (figure 2) in your console then the node app is working as it should.

Enough with PM2

Yes, let’s move on to testing our enhanced CREATE endpoint.


Figure 3: POST request in Postman

Now create a request like this (figure 3) and hit Send.


Figure 4: POST response in Postman

If you see this response (figure 4), it is safe to say everything works as it should. Yay!

Now since we have a CREATE endpoint, let’s add a READ endpoint.

READ endpoint

Now let’s read what’s inside our student collection. We can view all the students who are registered, or we can see the details of a single student.

GET /students

Now let’s get all the students’ details. We only get the _id, name, and city of each student here. Add these lines to the /controllers/index.js file.

/** @route  GET /students
 *  @desc   Get all students
 *  @access Public
 */
router.get(
  "/students",
  asyncWrapper(async (req, res) => {
    const response = await studentService.getAllStudents();
    res.send(response);
  })
);

And add these lines (inside the StudentService class) to the /services/index.js file.

async getAllStudents() {
  return Student.find({}, "_id name city");
}

Let me give a summary of the above lines. Student.find() is the method for querying MongoDB. Since we need all the students, we pass an empty object as the first argument. The second argument (called a projection) specifies which fields to return and which to leave out. Here I want _id, name, and city. We can use "-field_name" to exclude a field, as the sketch below shows.
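For example (hypothetical queries of mine, not from the tutorial code), both inclusion and exclusion projections work:

// Inclusion projection: return only _id, name and city
Student.find({}, "_id name city");

// Exclusion projection: return everything except telephone and birthday
Student.find({}, "-telephone -birthday");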

GET /students/:id

Now let’s get all the details of a single student.

Now add these lines to the /controllers/index.js file.

/** @route  GET /students/:id
 *  @desc   Get a single student
 *  @access Public
 */
router.get(
  "/students/:id",
  asyncWrapper(async (req, res) => {
    const response = await studentService.getStudent(req.params.id);
    res.send(response);
  })
);

In the Express router, we can specify a path parameter via the /:param syntax and then access it via req.params.param. This is basic Express; to learn more, please refer to the documentation, which is a great source of knowledge.

Now add these lines to the /services/index.js file.

async getStudent(_id) {
  return Student.findById(_id, "-__v -createdAt -updatedAt");
}

Here we provide the _id of the student to get the details. As the projection, we exclude the __v, createdAt, and updatedAt fields.

Now let’s check these endpoints.


Figure 5: Request /students


Figure 6: Response /students

I have added two more students’ details, so I got three records.

Now let’s check the single student endpoint.


Figure 7: Request /students/stud_1


Figure 8: Response /students/stud_1

If you get results similar to figures 6 and 8, let’s say it was a success.

UPDATE endpoint

We have created student records and viewed them; now it is time to update them.

To PUT, or to PATCH?

So this is the big question: to update a resource, should we use PUT or PATCH? The answer is somewhat simple. If you want to replace the whole resource every time, use PUT. If you want to update the resource partially, use PATCH. It is that simple. This article clarifies the dilemma.

Differences between PUT and PATCH

Since I am going to partially update the student record, I will be using PATCH.
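To illustrate with hypothetical request bodies (the values are made up): a PUT replaces the whole resource, so every field must be sent, while a PATCH carries only the fields that change.

PUT /students/stud_1
{ "name": "Jane", "city": "Colombo", "telephone": 1234567890, "birthday": "2000-01-01" }

PATCH /students/stud_1
{ "city": "Colombo" }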

PATCH /students/:id

Now let’s update the /controllers/index.js file.

/** @route  PATCH /students/:id
 *  @desc   Update a single student
 *  @access Public
 */
router.patch(
  "/students/:id",
  asyncWrapper(async (req, res) => {
    const response = await studentService.updateStudent(
      req.params.id,
      req.body
    );
    res.send(response);
  })
);

After adding this let’s update the /services/index.js file.

async updateStudent(_id, { name, city, telephone, birthday }) {
  return Student.findOneAndUpdate(
    { _id },
    {
      name,
      city,
      telephone,
      birthday
    },
    {
      new: true,
      omitUndefined: true,
      fields: "-__v -createdAt -updatedAt"
    }
  );
}

Let me give a brief description of what’s going on here. We update the student by the given id, and we provide the name, city, telephone, and birthday data to be updated. As the third argument we provide new: true to return the updated document to us, omitUndefined: true to partially update the resource, and fields: "-__v -createdAt -updatedAt" to remove those fields from the returned document.

Now let’s check this out.


Figure 9: Updating the name of the student (stud_1)


Figure 10: The student’s name has changed from June to Jane

So if you get results similar to figure 10, then let’s say yay! Now let’s move on to the final part of this tutorial, which is DELETE.

DELETE endpoint

Since we can now create, read, and update, it is time to delete some students 😁.

DELETE /students/:id

Since we should not delete all the students at once (it is a best practice IMO), let’s delete a single student by the provided student_id.

Now let’s update the /controllers/index.js file.

/** @route  DELETE /students/:id
 *  @desc   Delete a single student
 *  @access Public
 */
router.delete(
  "/students/:id",
  asyncWrapper(async (req, res) => {
    const response = await studentService.deleteStudent(req.params.id);
    res.send(response);
  })
);

Now let’s update the /services/index.js file.

async deleteStudent(_id) {
  await Student.deleteOne({ _id });
  return { message: `Student [${_id}] deleted successfully` };
}

Now it’s time to see this in action.


Figure 11: Request to delete the student with student_id [stud_2]


Figure 12: Response of deleting the student with student_id [stud_2]

If the results are similar to figure 12, then it is safe to assume the application works as it should.

So here are the current /controllers/index.js file and /services/index.js file for your reference.

const router = require("express").Router();
const asyncWrapper = require("../utilities/async-wrapper");
const StudentService = require("../services");
const studentService = new StudentService();

/** @route  GET /
 *  @desc   Root endpoint
 *  @access Public
 */
router.get(
  "/",
  asyncWrapper(async (req, res) => {
    res.send({
      message: "Hello World!",
      status: 200
    });
  })
);

/** @route  POST /register
 *  @desc   Register a student
 *  @access Public
 */
router.post(
  "/register",
  asyncWrapper(async (req, res) => {
    const response = await studentService.registerStudent(req.body);
    res.send(response);
  })
);

/** @route  GET /students
 *  @desc   Get all students
 *  @access Public
 */
router.get(
  "/students",
  asyncWrapper(async (req, res) => {
    const response = await studentService.getAllStudents();
    res.send(response);
  })
);

/** @route  GET /students/:id
 *  @desc   Get a single student
 *  @access Public
 */
router.get(
  "/students/:id",
  asyncWrapper(async (req, res) => {
    const response = await studentService.getStudent(req.params.id);
    res.send(response);
  })
);

/** @route  PATCH /students/:id
 *  @desc   Update a single student
 *  @access Public
 */
router.patch(
  "/students/:id",
  asyncWrapper(async (req, res) => {
    const response = await studentService.updateStudent(
      req.params.id,
      req.body
    );
    res.send(response);
  })
);

/** @route  DELETE /students/:id
 *  @desc   Delete a single student
 *  @access Public
 */
router.delete(
  "/students/:id",
  asyncWrapper(async (req, res) => {
    const response = await studentService.deleteStudent(req.params.id);
    res.send(response);
  })
);

module.exports = router;
Updated /controllers/index.js file (contains CRUD endpoints)
const { Student } = require("../models");

module.exports = class StudentService {
  async registerStudent(data) {
    const { _id, name, city, telephone, birthday } = data;

    const new_student = new Student({
      _id,
      name,
      city,
      telephone,
      birthday
    });

    const response = await new_student.save();
    const res = response.toJSON();
    delete res.__v;
    return res;
  }

  async getAllStudents() {
    return Student.find({}, "_id name city");
  }

  async getStudent(_id) {
    return Student.findById(_id, "-__v -createdAt -updatedAt");
  }

  async updateStudent(_id, { name, city, telephone, birthday }) {
    return Student.findOneAndUpdate(
      { _id },
      {
        name,
        city,
        telephone,
        birthday
      },
      {
        new: true,
        omitUndefined: true,
        fields: "-__v -createdAt -updatedAt"
      }
    );
  }

  async deleteStudent(_id) {
    await Student.deleteOne({ _id });
    return { message: `Student [${_id}] deleted successfully` };
  }
};
Updated /services/index.js file (Contains all the CRUD service methods)

So this is it for this tutorial, and we will meet again in a future tutorial about uploading files to MongoDB using GridFS. As usual, you can find the code here (Check for the commit message “Tutorial 3 checkpoint”).

node-is-simple

So until we meet again, happy coding…

Read More
Node Is Simple — Part 2

tl;dr: This is the second article of the Node is Simple article series, in which I discuss how to create a simple and secure NodeJS, Express, MongoDB web application.

Hello, my friends, this is part two of the Node is Simple article series, where I will be discussing how to add MongoDB to your web application. If you haven’t read my first article, you can read it here: Node is Simple - Part 1


So what is MongoDB?

If you haven’t heard of MongoDB, that is news to me. It is one of the most popular databases for modern web applications. (Yeah, yeah, Firebase and Firestore are there too.) MongoDB is a NoSQL database in which the document is the atomic data structure, and a group of documents is a collection. With these documents and collections, we can store data however we want.

{
  "_id": "5cf0029caff5056591b0ce7d",
  "firstname": "Jane",
  "lastname": "Wu",
  "address": {
    "street": "1 Circle Rd",
    "city": "Los Angeles",
    "state": "CA",
    "zip": "90404"
  },
  "hobbies": ["surfing", "coding"]
}

This is how a simple document is constructed in MongoDB. For our tutorial, we will use MongoDB Atlas, a MongoDB-as-a-service platform. I am not going to explain how to create an Atlas account and set up a database, since it is really easy and the following is a really good reference.

Getting Started with MongoDB Atlas: Overview and Tutorial

After creating the MongoDB account and setting up the database, obtain the MongoDB URI, which looks like this:

mongodb+srv://your_user_name:your_password@cluster0-v6q0g.mongodb.net/database_name

You can specify the following in the URI:

your_user_name: The username of the MongoDB database

your_password: The password for the MongoDB database

database_name: The MongoDB database name

Now that you have the MongoDB URI, you can use MongoDB Compass to view the database and create new collections.

Now let’s move on to the coding, shall we?

First of all, let me tell you something awesome about linting. To catch errors in your code before running any tests, it is vital to lint. In linting, tools like ESLint look at your code and display warnings and errors about it. So let’s set up ESLint in your development environment.

  1. Setting up ESLint in VSCode

This is a good reference to setting up ESLint in VSCode: Linting and Formatting with ESLint in VS Code

  2. Setting up ESLint in WebStorm

First, install ESLint in your project path.

$ npm install eslint --save-dev

There are several ESLint configurations to choose from. Let’s create a .eslintrc.json file in the project root folder and specify the configuration.

{
  "env": {
    "browser": true,
    "commonjs": true,
    "es6": true,
    "node": true
  },
  "extends": ["eslint:recommended"],
  "globals": {
    "Atomics": "readonly",
    "SharedArrayBuffer": "readonly"
  },
  "parserOptions": {
    "ecmaVersion": 2018
  },
  "rules": {}
}
.eslintrc.json file (The configuration file for ESLint)

After that go to settings in WebStorm and then select Manual ESLint configuration.


ESLint configuration settings on WebStorm

Then click OK, and it’s done and dusted.

Show me the Code!!!

Alright, alright, let’s move on to the real deal. Now we are going to create a Mongoose connection for the MongoDB database and create some models. Mongoose is the Object Document Mapper (ODM) for MongoDB in the NodeJS environment. It is really easy to use, and the documentation is really good. Let me tell you a super-secret: read the documentation, always read the documentation. Documentation is your best friend. 🤣

So first let's install the mongoose package inside the project folder.

$ npm install mongoose --save

Now let’s create the database connection file index.js in the /database folder.

const mongoose = require("mongoose");
const config = require("../config");
const dbPath = config.MONGO_URI;
const chalk = require("chalk");

mongoose
  .connect(dbPath, {
    useNewUrlParser: true,
    useUnifiedTopology: true,
    useCreateIndex: true,
    useFindAndModify: false
  })
  .then(() => {
    console.log(chalk.yellow("[!] Successfully connected to the database"));
  })
  .catch(err => {
    console.log(chalk.red(err.message));
  });

const db = mongoose.connection;

db.on("error", () => {
  console.log(chalk.red("[-] Error occurred from the database"));
});

db.once("open", () => {
  console.log(
    chalk.yellow("[!] Successfully opened connection to the database")
  );
});

module.exports = mongoose;
/database/index.js file (Contains MongoDB configurations)

Now let’s update the /config/index.js file.

const fs = require("fs");

const SERVER_CERT = fs.readFileSync(__dirname + "/server.cert", "utf8");
const SERVER_KEY = fs.readFileSync(__dirname + "/server.key", "utf8");

module.exports = {
  SERVER_CERT,
  SERVER_KEY,
  HTTP_PORT: 8080,
  HTTPS_PORT: 8081,
  MONGO_URI:
    "mongodb+srv://your_user_name:your_password@cluster0-v6q0g.mongodb.net/students"
};
/config/index.js file (Contains configurations of the project)

Remember to change the MONGO_URI to the one you obtained from your MongoDB Atlas instance.

Now let’s create a simple mongoose model. Create an index.js file inside the /models folder.

const mongoose = require("../database");
const Schema = mongoose.Schema;

const studentSchema = new Schema(
  {
    name: { type: mongoose.SchemaTypes.String },
    city: { type: mongoose.SchemaTypes.String }
  },
  { strict: true, timestamps: true }
);

const collectionName = "student";

const Student = mongoose.model(collectionName, studentSchema, collectionName);

module.exports = {
  Student
};
/models/index.js file (Contains Mongoose model schemas)

As simple as that. Now let’s create a simple service to create a student via the endpoint. Create an index.js file inside the /services folder.

const { Student } = require("../models");

module.exports = class StudentService {
  async registerStudent(data) {
    const { name, city } = data;

    const new_student = new Student({
      name,
      city
    });

    const response = await new_student.save();
    const res = response.toJSON();
    delete res.__v;
    return res;
  }
};
/services/index.js file

Simple, right? Now let’s use this service inside a controller. Remember the controller we created in the first article? Let’s update it.

const router = require("express").Router();
const asyncWrapper = require("../utilities/async-wrapper");
const StudentService = require("../services");
const studentService = new StudentService();

/** @route  GET /
 *  @desc   Root endpoint
 *  @access Public
 */
router.get(
  "/",
  asyncWrapper(async (req, res) => {
    res.send({
      message: "Hello World!",
      status: 200
    });
  })
);

/** @route  POST /register
 *  @desc   Register a student
 *  @access Public
 */
router.post(
  "/register",
  asyncWrapper(async (req, res) => {
    const response = await studentService.registerStudent(req.body);
    res.send(response);
  })
);

module.exports = router;
/controllers/index.js file (Contains Express Router to handle requests)

It’s not over yet. If you run this application as it is now and send a POST request to the /register endpoint, it will just return a big error message. That is because our application still doesn’t know how to parse a JSON payload; it will just complain that req.body is undefined. So let’s teach our web application how to parse a JSON payload. It is not much, but it’s honest work. We will use a simple Express middleware called body-parser for this. Now let’s set it up.

First, install the following packages inside the project folder.

$ npm install body-parser helmet --save

Create the file common.js inside the /middleware folder.

const bodyParser = require("body-parser");
const helmet = require("helmet");

module.exports = app => {
  app.use(bodyParser.json());
  app.use(helmet());
};
/middleware/common.js file (Contains all the common middleware for the Express app)

body-parser is the middleware that parses the body payload, and Helmet secures your web application by setting various security-related HTTP headers.
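As a rough illustration (the exact set depends on your helmet version, so treat this as an assumption rather than a definitive list), helmet’s defaults add response headers along these lines:

X-DNS-Prefetch-Control: off
X-Frame-Options: SAMEORIGIN
X-Download-Options: noopen
X-Content-Type-Options: nosniff
X-XSS-Protection: 1; mode=block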

After that, let’s export our middleware as one combined middleware. Now, if we want to add new middleware, all we have to do is update the common.js file. Create an index.js file inside the /middleware folder.

const CommonMiddleware = require("./common");

const Middleware = app => {
  CommonMiddleware(app);
};

module.exports = Middleware;
/middleware/index.js file (Exports all the common middleware)

We are not done yet. Now we have to include this main middleware file inside the index.js root file. Remember we created the index.js file inside the project root folder. Let’s update it.

const express = require("express");
const chalk = require("chalk");
const http = require("http");
const https = require("https");
const config = require("./config");

const HTTP_PORT = config.HTTP_PORT;
const HTTPS_PORT = config.HTTPS_PORT;
const SERVER_CERT = config.SERVER_CERT;
const SERVER_KEY = config.SERVER_KEY;

const app = express();
const Middleware = require("./middleware");
const MainController = require("./controllers");

Middleware(app);
app.use("", MainController);
app.set("port", HTTPS_PORT);

/**
 * Create HTTPS Server
 */

const server = https.createServer(
  {
    key: SERVER_KEY,
    cert: SERVER_CERT
  },
  app
);

const onError = error => {
  if (error.syscall !== "listen") {
    throw error;
  }

  const bind =
    typeof HTTPS_PORT === "string"
      ? "Pipe " + HTTPS_PORT
      : "Port " + HTTPS_PORT;

  switch (error.code) {
    case "EACCES":
      console.error(chalk.red(`[-] ${bind} requires elevated privileges`));
      process.exit(1);
      break;
    case "EADDRINUSE":
      console.error(chalk.red(`[-] ${bind} is already in use`));
      process.exit(1);
      break;
    default:
      throw error;
  }
};

const onListening = () => {
  const addr = server.address();
  const bind = typeof addr === "string" ? `pipe ${addr}` : `port ${addr.port}`;
  console.log(chalk.yellow(`[!] Listening on HTTPS ${bind}`));
};

server.listen(HTTPS_PORT);
server.on("error", onError);
server.on("listening", onListening);

/**
 * Create HTTP Server (HTTP requests will be 301 redirected to HTTPS)
 */
http
  .createServer((req, res) => {
    res.writeHead(301, {
      Location:
        "https://" +
        req.headers["host"].replace(
          HTTP_PORT.toString(),
          HTTPS_PORT.toString()
        ) +
        req.url
    });
    res.end();
  })
  .listen(HTTP_PORT)
  .on("error", onError)
  .on("listening", () =>
    console.log(chalk.yellow(`[!] Listening on HTTP port ${HTTP_PORT}`))
  );

module.exports = app;
/index.js file (Contains the Express app configuration)

Well done, people, we have now finished setting up the MongoDB connection and the model. So how do we test this? It is simple: we just have to use a super easy tool called Postman.

Testing API endpoints with Postman

I hope you all know what Postman does. (Not the one that delivers letters ofc.) Install Postman and fire it up.


Figure 1: Postman Settings

As in figure 1, change the setting for SSL certificate validation and set it to off, since we only have a self-signed SSL certificate (remember tutorial one). After that, we are ready to go.


Figure 2: Request creation in Postman

As in figure 2, create your JSON body request. Add your name and city. Then click Send.


Figure 3: Response to the request

If everything goes as it should, you’ll see this response (figure 3), and voila, everything works correctly. Now let’s take a look at MongoDB Compass.


Figure 4: MongoDB Compass view

As in figure 4, you'll see how it appears in the database. Here, the MongoDB database is “students”, the collection is “student”, and the document is the one shown in the view.

Now that was easy, right? So that’s it for this tutorial. In the coming tutorial, I’ll add some more CRUD operations to give you more details on working with MongoDB. All the code is saved in the GitHub repo; look for the commit message “Tutorial 2 checkpoint”.

Niweera/node-is-simple

Until we meet again, happy coding…

Read More
Node is Simple — Part 1

tl;dr: This is the first article of the Node is Simple article series, in which I discuss how to create a simple and secure NodeJS, Express, MongoDB web application.

First of all, let me give a big shout-out to JavaScript, which turns 25 years old this year. Without JavaScript, the world would be a much darker place indeed. 😁 In this article series, I am going to create an API with NodeJS, ExpressJS, and MongoDB. I know there is a vast ocean of tutorials out there describing how to build an API with these technologies, but I have never found a truly comprehensive, all-in-one tutorial that covers all of the following things.

  1. Setting up a basic NodeJS, Express web app with SSL/TLS.
  2. Setting up ESLint in your favorite editor or IDE.
  3. Adding MongoDB as the database.
  4. Creating basic CRUD endpoints and testing with Postman.
  5. Uploading files and viewing them using MongoDB GridFS.
  6. Creating custom middleware with Express.
  7. Adding logging to the web application.
  8. Securing endpoints with JWT authentication.
  9. Validating the input using @Hapi/Joi.
  10. Adding an OpenAPI Swagger Documentation.
  11. Caching the responses with Redis.
  12. Load balancing with PM2.
  13. Testing the API using Chai, Mocha.
  14. Creating a CI/CD pipeline.
  15. Deploying to your favorite platform.

Well, if you have already done all of these things with your web application, that’s awesome. Since I learned all of them the hard way, I want to share them with all of you, because sharing is caring 😇.

Since this tutorial is a series, I won’t make this a long, boring read. In this first article, I will describe how to set up a simple NodeJS and Express app with SSL/TLS. Just that, nothing else.


On your feet soldier, we are starting.

First of all, let’s create the folder structure of our web application.


Figure 1: node-is-simple folder structure

As shown in figure 1, you need to create the folders and the index.js file. If you use Linux or Git Bash on Windows, let’s make a simple script to do this, so you won’t have to do it again when you need to create another application. (We are so lazy, aren’t we? 😂)

#!/usr/bin/env bash

############################################################
# Remember to create a folder for your project first       #
# And run `npm init` to initialize a node project          #
# Inside that project folder run this bootstrap.sh script  #
############################################################

# Create the folders
mkdir config controllers errors middleware models services swagger utilities database

# Create the index.js file
touch index.js

############################################################
# Remember to check if you have node and npm installed     #
# I assume you have installed node and npm                 #
############################################################

# Install required packages
npm install express chalk --save

#That's it folks!

Let’s go through it again.

First, you need to create a folder for your project and run npm init to initialize your application. With this, you can specify a great name for your project, your license of preference, and more. Ah, I almost forgot: you need to check that you have a current version of Node and NPM installed on your machine. I hope you know how to install NodeJS on your computer; if not, please refer to the following links.

How to Download & Install Node.js - NPM on Windows

How To Install Node.js on Ubuntu 18.04

After doing that, just run the bootstrap.sh script inside your project folder, and it will scaffold a simple Node/Express application. So simple, right?

After creating the necessary folder structure, let’s set up the Express application. Go to the index.js file in the root folder and add these lines.

const express = require("express");
const chalk = require("chalk");
const http = require("http");
const https = require("https");
const config = require("./config");

const HTTP_PORT = config.HTTP_PORT;
const HTTPS_PORT = config.HTTPS_PORT;
const SERVER_CERT = config.SERVER_CERT;
const SERVER_KEY = config.SERVER_KEY;

const app = express();
const MainController = require("./controllers");

app.use("", MainController);
app.set("port", HTTPS_PORT);

/**
 * Create HTTPS Server
 */

const server = https.createServer(
  {
    key: SERVER_KEY,
    cert: SERVER_CERT
  },
  app
);

const onError = error => {
  if (error.syscall !== "listen") {
    throw error;
  }

  const bind =
    typeof HTTPS_PORT === "string"
      ? "Pipe " + HTTPS_PORT
      : "Port " + HTTPS_PORT;

  switch (error.code) {
    case "EACCES":
      console.error(chalk.red(`[-] ${bind} requires elevated privileges`));
      process.exit(1);
      break;
    case "EADDRINUSE":
      console.error(chalk.red(`[-] ${bind} is already in use`));
      process.exit(1);
      break;
    default:
      throw error;
  }
};

const onListening = () => {
  const addr = server.address();
  const bind = typeof addr === "string" ? `pipe ${addr}` : `port ${addr.port}`;
  console.log(chalk.yellow(`[!] Listening on HTTPS ${bind}`));
};

server.listen(HTTPS_PORT);
server.on("error", onError);
server.on("listening", onListening);

/**
 * Create HTTP Server (HTTP requests will be 301 redirected to HTTPS)
 */
http
  .createServer((req, res) => {
    res.writeHead(301, {
      Location:
        "https://" +
        req.headers["host"].replace(
          HTTP_PORT.toString(),
          HTTPS_PORT.toString()
        ) +
        req.url
    });
    res.end();
  })
  .listen(HTTP_PORT)
  .on("error", onError)
  .on("listening", () =>
    console.log(chalk.yellow(`[!] Listening on HTTP port ${HTTP_PORT}`))
  );

module.exports = app;

Don’t run this yet, since we aren’t done yet. I know this is a bit much; just bear with me, I’ll explain everything. What we are doing here is creating an HTTPS server alongside a plain HTTP server, and redirecting all the HTTP traffic to the HTTPS server. First of all, we need to create certificates to use inside the HTTPS server. It is a little bit of a pain, but it’ll be worth it. This is a good resource about creating SSL certificates for your Node application.

How to Use SSL/TLS with Node.js

I am a little bit of a lazy person so I’ll just generate server.cert and server.key files using this link.

Self-Signed Certificate Generator

Inside the server name input, you just have to provide your domain name. For this purpose, I’ll use localhost as the domain name.
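If you’d rather generate the pair locally instead of using a website, an openssl one-liner like the following should do the job (assuming openssl is installed; the CN is set to localhost to match):

$ openssl req -x509 -newkey rsa:2048 -nodes -keyout server.key -out server.cert -days 365 -subj "/CN=localhost"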

After generating the *.cert and *.key files, copy them to the /config folder and rename them to server.cert and server.key.

Now let’s import the certificates and make them useful. Inside the /config folder create the file index.js and add these lines.

const fs = require("fs");

const SERVER_CERT = fs.readFileSync(__dirname + "/server.cert", "utf8");
const SERVER_KEY = fs.readFileSync(__dirname + "/server.key", "utf8");

module.exports = {
  SERVER_CERT,
  SERVER_KEY,
  HTTP_PORT: 8080,
  HTTPS_PORT: 8081
};

Not so simple after all right? Well, let’s see.

Now that we have set up the certificates, let’s add a simple controller (endpoint) to our application and check it out. First, go to the /controllers folder and create an index.js file. Inside it, add the following lines.

const router = require("express").Router();
const asyncWrapper = require("../utilities/async-wrapper");

/** @route  GET /
 *  @desc   Root endpoint
 *  @access Public
 */
router.get(
  "/",
  asyncWrapper(async (req, res) => {
    res.send({
      message: "Hello World!",
      status: 200
    });
  })
);

module.exports = router;

asyncWrapper what is that???

Let me tell you about that. asyncWrapper is a wrapper function that catches all the errors that happen inside your code and passes them to the error-handling middleware. I’ll explain this a little more when I discuss Express middleware. Now let’s create this infamous asyncWrapper. Go to the /utilities folder and create the following two files.

**async-wrapper.js**

module.exports = requestHandler => (req, res, next) =>
  requestHandler(req, res).catch(next);

**async.wrapper.d.ts** (Async wrapper type definition file)
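(The actual definition file is in the repo; as a rough sketch, such a declaration might look like the following. This is my assumption, not the author’s exact file.)

// Hypothetical sketch of async-wrapper.d.ts (not the author's exact file)
import { Request, Response, RequestHandler } from "express";

declare function asyncWrapper(
  requestHandler: (req: Request, res: Response) => Promise<unknown>
): RequestHandler;

export = asyncWrapper;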

We have done it, folks, we have done it. Now let’s see this beautiful Express app at work. Go to the project folder and, in the terminal (or Git Bash on Windows), run the following line.

node index.js

After that let’s fire up your favorite browser (mine is Chrome 😁) and hit the endpoint at,

https://localhost:8081/

Don’t worry if your browser says it is not secure to go to this URL. You trust yourself, don’t you? So let’s accept the risk and go ahead.


This is the warning I told you about. (FYI, this screenshot is from Firefox)

Now if you can see something like this,


Response from [https://localhost:8081](https://localhost:8081)

You have successfully created a NodeJS, Express application. Also, since we have set up an HTTP server, you can try this URL too: http://localhost:8080

If you go to this URL, you’ll see that you’ll be redirected to https://localhost:8081

Now that’s the magic with our Express application. Awesome right? So this is it for this tutorial. In the next tutorial, I’ll tell you all about adding MongoDB as the database.

https://github.com/Niweera/node-is-simple

This is the GitHub repo for this tutorial, and I’ll update it as the tutorial series grows. Check the commit messages to find the snapshot of the repo for a particular tutorial. If you have any queries, don’t forget to hit me up on any social media shown on my website.

https://niweera.gq/

So until we meet again with part two of this tutorial series, happy coding…

Read More
Running Your Node.js App With Systemd - Part 1

You've written the next great application, in Node, and you are ready to unleash it upon the world. This means you can no longer run it on your laptop; you're actually going to have to put it up on a server somewhere and connect it to the real Internet. Eek.

There are a lot of different ways to run an app in production. This post is going to cover the specific case of running something on a "standard" Linux server that uses systemd, which means that we are not going to be talking about using Docker, AWS Lambda, Heroku, or any other sort of managed environment. It's just going to be you, your code, and a terminal with an ssh session, my friend.

Before we get started though, let's talk for just a brief minute about what systemd actually is and why you should care.

What is systemd Anyway?

The full answer to this question is big, as in, "ginormous" sized big. So we're not going to try and answer it fully, since we want to get on to the part where we can launch our app. What you need to know is that systemd is a thing that runs on "new-ish" Linux servers that is responsible for starting / stopping / restarting programs for you. If you install mysql, for example, and find that mysql is already running whenever you reboot the server, that happens because systemd knows to turn mysql on when the machine boots up.

This systemd machinery has replaced older systems such as init and upstart on "new-ish" Linux systems. There is a lot of arguably justified angst in the world about exactly how systemd works and how intrusive it is to your system. We're not here to discuss that though. If your system is "new-ish", it's using systemd, and that's what we're all going to be working with for the foreseeable future.

What does "new-ish" mean specifically? If you are using any of the following, you are using systemd:

  • CentOS 7 / RHEL 7
  • Fedora 15 or newer
  • Debian Jessie or newer
  • Ubuntu Xenial or newer

Running our App Manually

I'm going to assume you have a fresh installation of Ubuntu Xenial to work with, and that you have set up a default user named ubuntu that has sudo privileges. This is what the default will be if you spin up a Xenial instance in Amazon EC2. I'm using Xenial because it is currently the newest LTS (Long Term Support) version available from Canonical. Ubuntu Yakkety is available now, and is even newer, but Xenial is quite up-to-date at the time of this writing and will be getting security updates for many years to come because of its LTS status.

Use ssh with the ubuntu user to get into your server, and let's install Node.

$ sudo apt-get -y install curl
$ curl -sL https://deb.nodesource.com/setup_6.x | sudo bash -
$ sudo apt-get -y install nodejs

Next let's create an app and run it manually. Here's a trivial app I've written that simply echoes out the user's environment variables.

const http = require('http');

const hostname = '0.0.0.0';
const port = process.env.NODE_PORT || 3000;
const env = process.env;

const server = http.createServer((req, res) => {
  res.statusCode = 200;
  res.setHeader('Content-Type', 'text/plain');
  for (var k in env) {
    res.write(k + ": " + env[k] + "\n");
  }
  res.end();
});

server.listen(port, hostname, () => {
  console.log("Server running at http://" + hostname + ":" + port + "/");
});

Using your text editor of choice (which should obviously be Emacs but I suppose it's a free country if you want to use something inferior), create a file called hello_env.js in the user's home directory /home/ubuntu with the contents above. Next run it with

$ /usr/bin/node /home/ubuntu/hello_env.js

You should be able to go to

http://11.22.33.44:3000

in a web browser now, substituting 11.22.33.44 with whatever the actual IP address of your server is, and see a printout of the environment variables for the ubuntu user. If that is in fact what you see, great! We know the app runs, and we know the command needed to start it up. Go ahead and press Ctrl-c to close down the application. Now we'll move on to the systemd parts.

Creating a systemd Service File

The "magic" that's needed to make systemd start working for us is a text file called a service file. I say "magic" because for whatever reason, this seems to be the part that people block on when they are going through this process. Fortunately, it's much less difficult and scary than you might think.

We will be creating a file in a "system area" where everything is owned by the root user, so we'll be executing a bunch of commands using sudo. Again, don't be nervous, it's really very straightforward.

The service files for the things that systemd controls all live under the directory path

/lib/systemd/system

so we'll create a new file there. If you're using Nano as your editor, open up a new file there with:

sudo nano /lib/systemd/system/hello_env.service

and put the following contents in it:

[Unit]
Description=hello_env.js - making your environment variables rad
Documentation=https://example.com
After=network.target

[Service]
Environment=NODE_PORT=3001
Type=simple
User=ubuntu
ExecStart=/usr/bin/node /home/ubuntu/hello_env.js
Restart=on-failure

[Install]
WantedBy=multi-user.target

Let's go ahead and talk about what's in that file. In the [Unit] section, the Description and Documentation variables are obvious. What's less obvious is the part that says

After=network.target

That tells systemd that if it's supposed to start our app when the machine boots up, it should wait until after the main networking functionality of the server is online to do so. This is what we want, since our app can't bind to NODE_PORT until the network is up and running.

Moving on to the [Service] section we find the meat of today's project. We can specify environment variables here, so I've gone ahead and put in:

Environment=NODE_PORT=3001

so our app, when it starts, will be listening on port 3001. This is different than the default 3000 that we saw when we launched the app by hand. You can specify the Environment directive multiple times if you need multiple environment variables. Next is

Type=simple

which tells systemd how our app launches itself. Specifically, it lets systemd know that the app won't try and fork itself to drop user privileges or anything like that. It's just going to start up and run. After that we see

User=ubuntu

which tells systemd that our app should be run as the unprivileged ubuntu user. You definitely want to run your apps as unprivileged users so that attackers can't aim at something running as the root user.

The last two parts here are maybe the most interesting to us

ExecStart=/usr/bin/node /home/ubuntu/hello_env.js
Restart=on-failure

First, ExecStart tells systemd what command it should run to launch our app. Then, Restart tells systemd under what conditions it should restart the app if it sees that it has died. The on-failure value is likely what you will want. Using this, the app will NOT restart if it goes away "cleanly". Going away "cleanly" means that it either exits by itself with an exit value of 0, or it gets killed with a "clean" signal, such as the default signal sent by the kill command. Basically, if our app goes away because we want it to, then systemd will leave it turned off. However, if it goes away for any other reason (an unhandled exception crashes the app, for example), then systemd will immediately restart it for us. If you want it to restart no matter what, change the value from on-failure to always.
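One way to convince yourself of this behavior (the PID below is hypothetical) is to kill the process uncleanly and watch systemd bring it back:

$ systemctl show --property MainPID hello_env   # note the PID
$ sudo kill -9 12345                            # an "unclean" death
$ sudo systemctl status hello_env               # shows a fresh PID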

Last is the [Install] stanza. We're going to gloss over this part as it's not very interesting. It tells systemd how to handle things if we want to start our app on boot, and you will probably want to use the values shown for most things until you are a more advanced systemd user.

Using systemctl To Control Our App

The hard part is done! We will now learn how to use the system-provided tools to control our app. To begin with, enter the command

$ sudo systemctl daemon-reload

You have to do this whenever any of the service files change at all so that systemd picks up the new info.

Next, let's launch our app with

$ sudo systemctl start hello_env

After you do this, you should be able to go to

http://11.22.33.44:3001

in your web browser and see the output. If it's there, congratulations, you've launched your app using systemd! If the output looks very different than it did when you launched the app manually don't worry, that's normal. When systemd kicks off an application, it does so from a much more minimal environment than the one you have when you ssh into a machine. In particular, the $HOME environment variable may not be set by default, so be sure to pay attention to this if your app makes use of any environment variables. You may need to set them yourself when using systemd.
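If your app does rely on something like $HOME, you can set it yourself right in the service file; for instance (an illustration, not part of the unit file above):

[Service]
Environment=NODE_PORT=3001
Environment=HOME=/home/ubuntu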

You may be interested in what state systemd thinks the app is in, and if so, you can find out with

$ sudo systemctl status hello_env

Now, if you want to stop your app, the command is simply

$ sudo systemctl stop hello_env

and unsurprisingly, the following will restart things for us

$ sudo systemctl restart hello_env

If you want to make the application start up when the machine boots, you accomplish that by enabling it

$ sudo systemctl enable hello_env

and finally, if you previously enabled the app, but you change your mind and want to stop it from coming up when the machine starts, you correspondingly disable it

$ sudo systemctl disable hello_env
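
If you lose track of whether the app is currently set to come up on boot, systemctl can tell you directly:

$ sudo systemctl is-enabled hello_env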

Wrapping Up

That concludes today's exercise. There is much, much more to learn and know about systemd, but this should help get you started with some basics. In a follow-up blog post, we will learn how to launch multiple instances of our app and load balance them behind Nginx to illustrate a more production-ready example.


This article was first published on the NodeSource blog in November 2016

Read More
6-min.jpg
Configuring Your .npmrc for an Optimal Node.js Environment

This blog post was first published in March 2017. Find out more here


For Node.js developers, npm is an everyday tool. It's literally something we interact with multiple times on a daily basis, and it's one of the pieces of the ecosystem that's led to the success of Node.js.

One of the most useful, important, and enabling aspects of the npm CLI is that it's highly configurable. It provides an enormous amount of configurability that enables everyone from huge enterprises to individual developers to use it effectively.

One part of this high-configurability is the .npmrc file. For a long time I'd seen discussion about it - the most memorable being the time I thought you could change the name of the node_modules directory with it. For a long time, I didn't truly understand just how useful the .npmrc file could be, or how to even use it.

So, today I've collected a few of the optimizations that .npmrc allows that have been awesome for speeding up my personal workflow when scaffolding out Node.js modules and working on applications long-term.

Automating npm init Just a Bit More

When you're creating a new module from scratch, you'll typically start out with the npm init command. One thing that some developers don't know is that you can actually automate this process quite substantially with a few choice npm config set ... commands that set default values for the npm init prompts.

You can easily set your name, email, URL, license, and initial module version with a few commands:

npm config set init.author.name "Hiro Protagonist"
npm config set init.author.email "hiro@snowcrash.io"
npm config set init.author.url "http://hiro.snowcrash.io"
npm config set init.license "MIT"
npm config set init.version "0.0.1"

In the above example, I've set up some defaults for Hiro. This personal information won't change too frequently, so setting up some defaults is helpful and allows you to skip over entering the same information manually every time.

Additionally, the above commands set up two defaults that are related to your module.

The first default is the initial license that will be automatically suggested by the npm init command. I personally like to default to MIT, and much of the rest of the Node.js ecosystem does the same. That said, you can set this to whatever you'd like - it's a nice optimization to be able to select your license of choice nearly automatically.

The second default is the initial version. This is actually one that made me happy, as whenever I tried building out a module I never wanted it to start out at version 1.0.0, which is what npm init defaults to. I personally set it to 0.0.1 and then increment the version as I go with the npm version [ major | minor | patch ] command.
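
For reference, each npm config set command above simply writes a key/value pair into your ~/.npmrc, so after running all five the file should contain something like:

init.author.name=Hiro Protagonist
init.author.email=hiro@snowcrash.io
init.author.url=http://hiro.snowcrash.io
init.license=MIT
init.version=0.0.1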

Change Your npm Registry

As time moves forward, we're seeing more options for registries arise. For example, you may want to set your registry to a cache of the modules you know you need for your apps. Or, you may be using Certified Modules as a custom npm registry. There's even a separate registry for Yarn, a topic that is both awesome and totally out of scope for this post.

So, if you'd like to set a custom registry, you can run a pretty simple one-line command:

npm config set registry "https://my-custom-registry.registry.nodesource.io/"

In this example, I've set the registry URL to an example of a Certified Modules registry - that said, the exact URL in the command can be replaced with any registry that's compatible. To reset your registry back to the default npm registry, you can simply run the same command pointing to the standard registry:

npm config set registry "https://registry.npmjs.org/"
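
If you're ever unsure which registry is currently in effect, you can check it with:

npm config get registry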

Changing the console output of npm install with loglevel

When you run npm install, a bunch of information gets piped to you. By default, the npm command line tool limits how much of this information is actually output into the console when installing. There are varying degrees of output that you can assign at install, or by default, if you change it with npm config in your .npmrc file. The options, from least to most output, are: silent, error, warn, http, info, verbose, and silly.

Here's an example of the silent loglevel: npm install express --loglevel silent

And here's an example of the silly loglevel: npm install express --loglevel silly

If you'd like to get a bit more information (or a bit less, depending on your preferences) when you npm install, you can change the default loglevel.

npm config set loglevel="http"

If you tinker around with this config a bit and would like to reset to what the npm CLI currently defaults to, you can run the above command with warn as the loglevel:

npm config set loglevel="warn"
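
As with the registry, you can confirm the value currently in effect with:

npm config get loglevel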

Change Where npm Installs Global Modules

This is a really awesome change - it has a few steps, but is well worth it. With a few commands, you can change where the npm CLI installs global modules by default. Normally, it installs them to a privileged system folder - this requires administrative access, meaning that a global install requires sudo access on UNIX-based systems.

If you change the default global prefix for npm to an unprivileged directory, for example, ~/.global-modules, you'll not need to authenticate when you install a global module. That's one benefit - another is that globally installed modules won't be in a system directory, reducing the likelihood of a malicious module (intentionally or not) doing something you didn't want it to on your system.

To get started, we're going to create a new hidden folder called .global-modules and set the npm prefix to it:

mkdir ~/.global-modules
npm config set prefix "~/.global-modules"

Next, if you don't already have a file called ~/.profile, create one in your home directory. Then add the following line to the ~/.profile file:

export PATH=~/.global-modules/bin:$PATH

Adding that line to the ~/.profile file will add the global-modules directory to your PATH, and enable you to use it for npm global modules.

Now, flip back over to your terminal and run the following command to update the PATH with the newly updated file:

source ~/.profile
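
To sanity-check the new setup, you could install any small global package and confirm that its binary resolves from the new directory (http-server is just an arbitrary example):

npm install -g http-server
which http-server

If everything is wired up correctly, the second command should print a path under ~/.global-modules/bin, and neither command should have needed sudo.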

Just one more thing...

If you'd like to keep reading about Node.js, npm, configuration options, and development with the Node.js stack, I've got some fantastic articles for you.

Our most recent guide is a deep-dive into the core concepts of the package.json file. You'll find a ton of info about package.json in there, including plenty of super helpful configuration information. We also published an absolute beginner's guide to npm that you may be interested in reading - even though it's a beginner's guide, I'd bet you'll find something useful in it.

With this article, the intent was to help you set up a great configuration for Node.js development. If you'd like to take the leap and ensure that you're always on a rock-solid platform when developing and deploying your Node.js apps, check out NodeSource Certified Modules - it's a new tool we launched last week that will help enable you to spend more time building apps and less time worrying about modules.

Learn more and get started with NCM, or create your free NodeSource account.

Read More