What are some commonly used timing features of Node.js?
- setTimeout/clearTimeout – used to delay code execution by at least a given number of milliseconds.
- setInterval/clearInterval – used to run a block of code repeatedly at a given interval.
- setImmediate/clearImmediate – used to execute code at the end of the current event loop cycle (in the check phase).
- process.nextTick – used to execute code before the next event loop iteration begins, i.e. immediately after the current operation completes.
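A minimal sketch of how these interleave; note that the relative order of setTimeout(0) and setImmediate at the top level of a script is not guaranteed:
setTimeout(() => console.log('setTimeout'), 0);
setImmediate(() => console.log('setImmediate'));
process.nextTick(() => console.log('process.nextTick'));
console.log('synchronous code');
// "synchronous code" then "process.nextTick" always print first;
// setTimeout and setImmediate follow in an order that may vary from run to run.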
What are the advantages of using promises instead of callbacks?
The main advantage of using promises is that you get an object on which you can decide the action that needs to be taken after the async task completes. This gives more manageable code and helps avoid callback hell.
What is fork in Node.js?
fork is generally used to spawn child processes. In Node.js, child_process.fork() creates a new Node.js process (a new V8 instance) so that multiple workers can run code in parallel, communicating with the parent over an IPC channel.
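A minimal sketch of child_process.fork(); the file names parent.js and worker.js are illustrative:
// parent.js
const { fork } = require('child_process');

const child = fork('./worker.js');   // spawns a new Node.js process with its own V8 instance
child.on('message', (msg) => console.log('from worker:', msg));
child.send({ task: 'start' });       // fork() sets up an IPC channel automatically

// worker.js
process.on('message', (msg) => {
  process.send({ done: true, received: msg });
});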
How many types of API functions are there in Node.js?
There are two types of API functions:
- Asynchronous, non-blocking functions – mostly I/O operations, which can be forked out of the main loop.
- Synchronous, blocking functions – mostly operations that run in the main loop and block the process until they complete.
List the two arguments that async.queue takes as input.
- Task Function
- Concurrency Value
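A minimal sketch using the async npm package (assumes npm install async; the task names are illustrative):
const async = require('async');

// 1st argument: the task (worker) function, 2nd argument: the concurrency value
const queue = async.queue((task, callback) => {
  console.log('processing', task.name);
  setTimeout(callback, 100);   // simulate async work
}, 2);                         // at most 2 tasks run at the same time

queue.push({ name: 'task1' }, () => console.log('task1 finished'));
queue.push({ name: 'task2' }, () => console.log('task2 finished'));
queue.push({ name: 'task3' }, () => console.log('task3 finished'));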
What tools can be used to assure consistent code style?
ESLint can be used with any IDE to ensure a consistent coding style which further helps in maintaining the codebase.
What do you understand by callback hell?
async_A(function(){
  async_B(function(){
    async_C(function(){
      async_D(function(){
        ....
      });
    });
  });
});
In the above example we keep passing nested callback functions, which makes the code unreadable and unmaintainable; we should restructure the async logic to avoid this.
If Node.js is single threaded then how does it handle concurrency?
The main loop is single-threaded, and all async calls are managed by the libuv library.
For example:
const crypto = require("crypto");
const start = Date.now();

function logHashTime() {
  crypto.pbkdf2("a", "b", 100000, 512, "sha512", () => {
    console.log("Hash: ", Date.now() - start);
  });
}

logHashTime();
logHashTime();
logHashTime();
logHashTime();
This gives the output:
Hash: 1213
Hash: 1225
Hash: 1212
Hash: 1222
This is because libuv sets up a thread pool to handle such work. The pool has four threads by default, regardless of the number of cores, and you can override the size with the UV_THREADPOOL_SIZE environment variable.
How does Node.js overcome the problem of blocking of I/O operations?
Node has an event loop that handles all I/O operations asynchronously, without blocking the main function.
For example, if a network call needs to happen, it is scheduled through the event loop instead of being executed on the main thread. If there are multiple such I/O calls, each one is queued and completed separately from the main thread.
Thus, even though JavaScript runs on a single thread, I/O operations are handled in a non-blocking way.
How can we use async await in node.js?
Here is an example of using the async/await pattern to retry a request with exponential backoff:
// this code retries with exponential backoff
// `request` is assumed to be a promise-returning HTTP client available in scope
function wait(timeout) {
  return new Promise((resolve) => {
    setTimeout(() => {
      resolve();
    }, timeout);
  });
}

async function requestWithRetry(url) {
  const MAX_RETRIES = 10;
  for (let i = 0; i <= MAX_RETRIES; i++) {
    try {
      return await request(url);
    } catch (err) {
      const timeout = Math.pow(2, i);
      console.log('Waiting', timeout, 'ms');
      await wait(timeout);
      console.log('Retrying', err.message, i);
    }
  }
}
What are Node.js streams?
Streams are instances of EventEmitter that can be used to work with streaming data in Node.js. They can be used for handling and manipulating large streaming files (videos, mp3s, etc.) over the network. They use buffers as their temporary storage.
There are mainly four types of the stream:
- Writable: streams to which data can be written (for example, fs.createWriteStream()).
- Readable: streams from which data can be read (for example, fs.createReadStream()).
- Duplex: streams that are both Readable and Writable (for example, net.Socket).
- Transform: Duplex streams that can modify or transform the data as it is written and read (for example, zlib.createDeflate()).
What are node.js buffers?
In general, a buffer is temporary memory, mainly used by streams to hold data until it is consumed. Buffers were introduced with additional use cases beyond JavaScript's Uint8Array and are mainly used to represent a fixed-length sequence of bytes. They also support encodings such as ASCII, UTF-8, base64 and hex. A buffer is a fixed (non-resizable) chunk of memory allocated outside the V8 heap.
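A minimal sketch of Buffer basics:
const buf = Buffer.from('hello', 'utf8');   // fixed-length sequence of bytes
console.log(buf.length);                    // 5
console.log(buf.toString('hex'));           // 68656c6c6f
console.log(buf.toString('base64'));        // aGVsbG8=

const fixed = Buffer.alloc(10);             // 10 zero-filled bytes, not resizable
fixed.write('abc');
console.log(fixed.toString('utf8', 0, 3));  // abc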
What is middleware?
Middleware sits between your request and your business logic. It is mainly used to capture logs and to implement rate limiting, routing, authentication, and basically anything that is not part of the business logic. There is also third-party middleware such as body-parser, and you can write your own middleware for a specific use case.
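A minimal sketch of a custom Express middleware that logs every request before it reaches the business logic (assumes Express is installed):
const express = require('express');
const app = express();

app.use((req, res, next) => {
  console.log(`${req.method} ${req.url}`);
  next();   // pass control to the next middleware or route handler
});

app.get('/', (req, res) => res.send('ok'));
app.listen(3000);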
Explain the Reactor Pattern in Node.js?
The Reactor pattern is a pattern for non-blocking I/O operations, though in general it is used in any event-driven architecture.
There are two components in it: 1. Reactor 2. Handler.
Reactor: its job is to dispatch each I/O event to the appropriate handler.
Handler: its job is to actually work on those events.
Why should you separate Express app and server?
The server is responsible for network concerns, such as binding to a port and wiring up the routes and middleware, whereas the app holds the business logic that those routes serve. This ensures that the business logic is encapsulated and decoupled from the application setup, which makes the project more readable, maintainable, and easier to test.
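A minimal sketch of one common way to split the two; the file names app.js and server.js are illustrative:
// app.js – the Express app with routes and business logic, exported without listening
const express = require('express');
const app = express();

app.get('/health', (req, res) => res.json({ status: 'ok' }));

module.exports = app;

// server.js – only the network concern of binding the app to a port
const app = require('./app');

app.listen(3000, () => console.log('listening on 3000'));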
Describe the exit codes of Node.js?
Exit codes tell us how a process was terminated and the reason behind the termination.
A few of them are:
- Uncaught Fatal Exception (code 1) – there was an uncaught exception that was not handled.
- Unused (code 2) – reserved by Bash for built-in misuse.
- Fatal Error (code 5) – there was a fatal error in V8, with a description of it written to stderr.
- Internal Exception Handler Run-Time Failure (code 7) – there was an uncaught exception, and the internal exception handler itself threw an error while handling it.
- Internal JavaScript Evaluation Failure (code 4) – the JavaScript source code used in Node.js's bootstrapping process failed to return a function value when evaluated.
Explain the concept of stub in Node.js?
Stubs are used when writing tests, which are an important part of development. A stub replaces a dependency of the code under test with a controllable fake.
This helps in scenarios where we need to test:
- External calls which make tests slow and difficult to write (e.g HTTP calls/ DB calls)
- Triggering different outcomes for a piece of code (e.g. what happens if an error is thrown/ if it passes)
For example, this is the function:
const request = require('request');

const getPhotosByAlbumId = (id) => {
  const requestUrl = `https://jsonplaceholder.typicode.com/albums/${id}/photos?_limit=3`;
  return new Promise((resolve, reject) => {
    request.get(requestUrl, (err, res, body) => {
      if (err) {
        return reject(err);
      }
      resolve(JSON.parse(body));
    });
  });
};

module.exports = getPhotosByAlbumId;
To test this function, the stub looks like this:
const expect = require('chai').expect;
const request = require('request');
const sinon = require('sinon');
const getPhotosByAlbumId = require('./index');

describe('with Stub: getPhotosByAlbumId', () => {
  before(() => {
    sinon.stub(request, 'get')
      .yields(null, null, JSON.stringify([
        {
          "albumId": 1,
          "id": 1,
          "title": "A real photo 1",
          "url": "https://via.placeholder.com/600/92c952",
          "thumbnailUrl": "https://via.placeholder.com/150/92c952"
        },
        {
          "albumId": 1,
          "id": 2,
          "title": "A real photo 2",
          "url": "https://via.placeholder.com/600/771796",
          "thumbnailUrl": "https://via.placeholder.com/150/771796"
        },
        {
          "albumId": 1,
          "id": 3,
          "title": "A real photo 3",
          "url": "https://via.placeholder.com/600/24f355",
          "thumbnailUrl": "https://via.placeholder.com/150/24f355"
        }
      ]));
  });

  after(() => {
    request.get.restore();
  });

  it('should getPhotosByAlbumId', (done) => {
    getPhotosByAlbumId(1).then((photos) => {
      expect(photos.length).to.equal(3);
      photos.forEach(photo => {
        expect(photo).to.have.property('id');
        expect(photo).to.have.property('title');
        expect(photo).to.have.property('url');
      });
      done();
    });
  });
});
How are worker threads different from clusters?
Cluster:
- There is one process per CPU core, with IPC used to communicate between them.
- Clusters are helpful when we want multiple servers accepting HTTP requests on a single port.
- Because a separate process (with its own Node instance) is spawned per core, each has separate memory, which increases overall memory usage.
Worker threads:
- There is only one process in total, with multiple threads.
- Each thread has its own Node instance (one event loop, one JS engine) with most of the APIs accessible.
- Memory can be shared with other threads (e.g. via SharedArrayBuffer).
- Worker threads can be used for CPU-intensive tasks like processing data or accessing the file system; since Node.js's main loop is single-threaded, such synchronous work can be made more efficient by offloading it to worker threads (see the sketch below).
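A minimal worker_threads sketch (Node 12+); the whole example lives in a single file, and the doubling work stands in for any CPU-heavy task:
const { Worker, isMainThread, parentPort, workerData } = require('worker_threads');

if (isMainThread) {
  // Main thread: spawn a worker and hand it some data
  const worker = new Worker(__filename, { workerData: { n: 42 } });
  worker.on('message', (result) => console.log('result from worker:', result));
} else {
  // Worker thread: do the CPU-intensive work and post the result back
  parentPort.postMessage(workerData.n * 2);
}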
How to measure the duration of async operations?
The Performance API provides us with tools to figure out the necessary performance metrics. A simple example would be using async_hooks and perf_hooks:
'use strict';
const async_hooks = require('async_hooks');
const {
  performance,
  PerformanceObserver
} = require('perf_hooks');

const set = new Set();

const hook = async_hooks.createHook({
  init(id, type) {
    if (type === 'Timeout') {
      performance.mark(`Timeout-${id}-Init`);
      set.add(id);
    }
  },
  destroy(id) {
    if (set.has(id)) {
      set.delete(id);
      performance.mark(`Timeout-${id}-Destroy`);
      performance.measure(`Timeout-${id}`,
        `Timeout-${id}-Init`,
        `Timeout-${id}-Destroy`);
    }
  }
});
hook.enable();

const obs = new PerformanceObserver((list, observer) => {
  console.log(list.getEntries()[0]);
  performance.clearMarks();
  observer.disconnect();
});
obs.observe({ entryTypes: ['measure'], buffered: true });

setTimeout(() => {}, 1000);
This would give us the exact time it took to execute the callback.
How to measure the performance of async operations?
Performance API provides us with tools to figure out the necessary performance metrics.
A simple example would be:
const { PerformanceObserver, performance } = require('perf_hooks');

const obs = new PerformanceObserver((items) => {
  console.log(items.getEntries()[0].duration);
  performance.clearMarks();
});
obs.observe({ entryTypes: ['measure'] });

performance.measure('Start to Now');
performance.mark('A');

// doSomeLongRunningProcess is a placeholder for any async operation you want to measure
doSomeLongRunningProcess(() => {
  performance.measure('A to Now', 'A');
  performance.mark('B');
  performance.measure('A to B', 'A', 'B');
});
What is Node.js Process Model?
Node.js runs in a single process and the application code runs in a single thread, thereby needing fewer resources than other platforms. All user requests to your web application are handled by a single thread, and all I/O work or long-running jobs are performed asynchronously for a particular request. So this single thread doesn't have to wait for the request to complete and is free to handle the next request. When the asynchronous I/O work completes, it processes the request further and sends the response.
What are the data types in Node.js?
Primitive Types
- String
- Number
- Boolean
- Undefined
- Null
- Symbol
(Note: RegExp is an object type in JavaScript, not a primitive.)
Buffer: Node.js includes an additional data type called Buffer (not available in browser JavaScript). Buffer is mainly used to store binary data, such as when reading from a file or receiving packets over the network.
How to create a simple server in Node.js that returns Hello World?
Step 01: Create a project directory
mkdir myapp
cd myapp
Step 02: Initialize the project and link it to npm
npm init
This creates a package.json file in your myapp folder. The file contains references for all npm packages you have downloaded to your project. The command will prompt you to enter a number of things. You can enter your way through all of them EXCEPT this one:
entry point: (index.js)
Rename this to:
app.js
Step 03: Install Express in the myapp directory
npm install express --save
Step 04: app.js
var express = require('express');
var app = express();

app.get('/', function (req, res) {
  res.send('Hello World!');
});

app.listen(3000, function () {
  console.log('Example app listening on port 3000!');
});
Step 05: Run the app
node app.js
How to make an HTTP POST request using Node.js?
const https = require('https')

const obj = {
  "userId": 1,
  "id": 1,
  "title": "whatever",
  "completed": false
}
const data = JSON.stringify(obj)

const options = {
  hostname: 'jsonplaceholder.typicode.com',
  port: 443,
  path: '/todos',
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    // use the byte length, which is safe for multi-byte characters
    'Content-Length': Buffer.byteLength(data)
  }
}

const req = https.request(options, res => {
  console.log(`statusCode: ${res.statusCode}`)

  res.on('data', d => {
    process.stdout.write(d)
  })
})

req.on('error', error => {
  console.error(error)
})

req.write(data)
req.end()
What does the runtime environment mean in Node.js?
The Node.js runtime is the software stack responsible for installing your web service’s code and its dependencies and running your service.
The Node.js runtime for App Engine in the standard environment is declared in the app.yaml file:
runtime: nodejs10
The runtime environment is literally just the environment your application is running in. This can be used to describe both the hardware and the software that is running your application. How much RAM, what version of Node, what operating system, and how many CPU cores can all be referenced when talking about a runtime environment.
How does Node.js work?

Node is completely event-driven. Basically the server consists of one thread processing one event after another.
A new request coming in is one kind of event. The server starts processing it and when there is a blocking IO operation, it does not wait until it completes and instead registers a callback function. The server then immediately starts to process another event (maybe another request). When the IO operation is finished, that is another kind of event, and the server will process it (i.e. continue working on the request) by executing the callback as soon as it has time.
So the server never needs to create additional threads or switch between threads, which means it has very little overhead. If you want to make full use of multiple hardware cores, you just start multiple instances of node.js
The Node.js platform does not follow the request/response multi-threaded stateless model. It follows a single-threaded model with an event loop. The Node.js processing model is mainly based on the JavaScript event-based model with the JavaScript callback mechanism.
Single Threaded Event Loop Model Processing Steps:
- Clients send requests to the web server.
- The Node.js web server internally maintains a limited thread pool to provide services to client requests.
- The Node.js web server receives those requests and places them into a queue, known as the "Event Queue".
- The Node.js web server internally has a component known as the "Event Loop". It gets this name because it uses an indefinite loop to receive requests and process them.
- The Event Loop uses a single thread only. It is the heart of the Node.js processing model.
- The Event Loop checks whether any client request is placed in the Event Queue. If not, it waits for incoming requests indefinitely.
- If yes, it picks up one client request from the Event Queue.
- It starts processing that client request.
- If that client request does not require any blocking I/O operations, it processes everything, prepares the response, and sends it back to the client.
- If that client request requires blocking I/O operations, such as interacting with a database, file system, or external services, then it follows a different approach:
- It checks thread availability in the internal thread pool.
- It picks up one thread and assigns the client request to that thread.
- That thread is responsible for taking the request, processing it, performing the blocking I/O operations, preparing the response, and sending it back to the Event Loop.
- The Event Loop, in turn, sends that response to the respective client.
What is an error-first callback?
The pattern used across all the asynchronous methods in Node.js is called Error-first Callback. Here is an example:
const fs = require('fs');

fs.readFile("file.json", function (err, data) {
  if (err) {
    return console.error(err);
  }
  console.log(data);
});
Any asynchronous method expects one of the arguments to be a callback. The full callback argument list depends on the caller method, but the first argument is always an error object or null. When we go for the asynchronous method, an exception thrown during function execution cannot be detected in a try/catch statement. The event happens after the JavaScript engine leaves the try block.
In the preceding example, if any exception is thrown during the reading of the file, it lands on the callback function as the first and mandatory parameter.
What is callback hell in Node.js?
Callback hell is a phenomenon that afflicts a JavaScript developer when they try to execute multiple asynchronous operations one after the other.
An asynchronous function is one where some external activity must complete before a result can be processed; it is “asynchronous” in the sense that there is an unpredictable amount of time before a result becomes available. Such functions require a callback function to handle errors and process the result.
getData(function(a){
  getMoreData(a, function(b){
    getMoreData(b, function(c){
      getMoreData(c, function(d){
        getMoreData(d, function(e){
          ...
        });
      });
    });
  });
});
Techniques for avoiding callback hell
- Using Async.js
- Using Promises
- Using Async-Await
- Managing callbacks using Async.js
Async is a really powerful npm module for managing the asynchronous nature of JavaScript. Along with Node.js, it also works for JavaScript written for browsers.
Async provides lots of powerful utilities to work with asynchronous processes under different scenarios.
npm install --save async
ASYNC WATERFALL
var async = require('async');

async.waterfall([
  function(callback) {
    // doSomething
    callback(null, paramx); // paramx will be available as the first parameter to the next function
    /**
     * The 1st parameter passed to callback:
     * if @null, @undefined or @false, control moves to the next function in the array;
     * if @true or a @string, control immediately moves to the final callback function
     * and the rest of the functions in the array are not executed.
     */
  },
  function(arg1, callback) {
    // doSomething else
    // arg1 now equals paramx
    callback(null, result);
  },
  function(arg1, callback) {
    // do more
    // arg1 now equals 'result'
    callback(null, 'done');
  },
  function(arg1, callback) {
    // even more
    // arg1 now equals 'done'
    callback(null, 'done');
  }
], function (err, result) {
  // final callback function
  // finally do something when all functions are done
  // result now equals 'done'
});
ASYNC SERIES
var async = require('async');

async.series([
  function(callback) {
    // do some stuff ...
    callback(null, 'one');
    /**
     * The 1st parameter passed to callback:
     * if @null, @undefined or @false, control moves to the next function in the array;
     * if @true or a @string, control immediately moves to the final callback function
     * with err set to that value, and the rest of the functions in the array
     * are not executed.
     */
  },
  function(callback) {
    // do some more stuff ...
    callback(null, 'two');
  }
],
// optional callback
function(err, results) {
  // results is now equal to ['one', 'two']
});
Managing callback hell using promises
Promises are an alternative to callbacks when dealing with asynchronous code. Promises return the value of the result or an error exception. The core of promises is the .then() function, which waits for the promise object to be returned. The .then() function takes two optional functions as arguments, and depending on the state of the promise only one will ever be called. The first function is called when the promise is fulfilled (a successful result). The second function is called when the promise is rejected.
var outputPromise = getInputPromise().then(function (input) {
  // handle success
}, function (error) {
  // handle error
});
Using Async Await
Async/await makes asynchronous code look like it's synchronous. This has only been possible because of the reintroduction of promises into Node.js. Async/await only works with functions that return a promise.
const getRandomNumber = function() {
  return new Promise((resolve, reject) => {
    setTimeout(() => {
      resolve(Math.floor(Math.random() * 20));
    }, 1000);
  });
}

const addRandomNumber = async function() {
  const sum = await getRandomNumber() + await getRandomNumber();
  console.log(sum);
}

addRandomNumber();
What are Promises in Node.js?
A promise allows you to associate handlers with an asynchronous action's eventual success value or failure reason. This lets asynchronous methods return values like synchronous methods: instead of the final value, the asynchronous method returns a promise for the value at some point in the future.
Early promises in Node.js promised to do some work and then had separate callbacks that would be executed for success and failure, as well as handling timeouts. Another way to think of promises in Node.js is as emitters that can emit only two events: success and error. The cool thing about promises is that you can combine them into dependency chains (do Promise C only when Promise A and Promise B complete).
The core idea behind promises is that a promise represents the result of an asynchronous operation. A promise is in one of three different states:
- pending – The initial state of a promise.
- fulfilled – The state of a promise representing a successful operation.
- rejected – The state of a promise representing a failed operation. Once a promise is fulfilled or rejected, it is immutable (i.e. it can never change again).
Creating a Promise
var myPromise = new Promise(function(resolve, reject){
  ....
});
What tools can be used to assure consistent style?
- ESLint
- Standard
When should you use npm and when Yarn?
- npm
It is the default method for managing packages in the Node.js runtime environment. It relies upon a command-line client and a database made up of public and premium packages known as the npm registry. Users can access the registry via the client and browse the many packages available through the npm website. Both npm and its registry are managed by npm, Inc.
node -v
npm -v
- Yarn
Yarn was developed by Facebook in an attempt to resolve some of npm's shortcomings. Yarn isn't technically a replacement for npm since it relies on modules from the npm registry. Think of Yarn as a new installer that still relies upon the same npm structure. The registry itself hasn't changed, but the installation method is different. Since Yarn gives you access to the same packages as npm, moving from npm to Yarn doesn't require you to make any changes to your workflow.
npm install yarn --global
Comparing Yarn vs npm
- Fast: Yarn caches every package it downloads so it never needs to download it again. It also parallelizes operations to maximize resource utilization, so install times are faster than ever.
- Reliable: Using a detailed, but concise, lock file format, and a deterministic algorithm for installs, Yarn is able to guarantee that an install that worked on one system will work exactly the same way on any other system.
- Secure: Yarn uses checksums to verify the integrity of every installed package before its code is executed.
- Offline Mode: If you’ve installed a package before, you can install it again without any internet connection.
- Deterministic: The same dependencies will be installed the same exact way across every machine regardless of install order.
- Network Performance: Yarn efficiently queues up requests and avoids request waterfalls in order to maximize network utilization.
- Multiple Registries: Install any package from either npm or Bower and keep your package workflow the same.
- Network Resilience: A single request failing won’t cause an install to fail. Requests are retried upon failure.
- Flat Mode: Resolve mismatching versions of dependencies to a single version to avoid creating duplicates.
What is a stub?
Stubbing and verification for Node.js tests enables you to validate and override the behaviour of nested pieces of code such as methods, require() and npm modules, or even instances of classes. Such libraries are inspired by node-gently, Mock JS and mock-require.
Features of Stub:
- Produces simple, lightweight objects capable of extending down their tree
- Compatible with Node.js
- Easily extendable directly or through an ExtensionManager
- Comes with predefined, usable extensions
Stubs are functions/programs that simulate the behavior of components/modules. Stubs provide canned answers to function calls made during test cases. Also, you can assert what these stubs were called with.
A use-case can be a file read, when you do not want to read an actual file:
var fs = require('fs');
var sinon = require('sinon');
var expect = require('chai').expect;   // the 'called' assertion also needs the sinon-chai plugin

// in current versions of sinon the fake implementation is supplied via callsFake()
var readFileStub = sinon.stub(fs, 'readFile').callsFake(function (path, cb) {
  return cb(null, 'filecontent');
});

expect(readFileStub).to.be.called;
readFileStub.restore();
What is a test pyramid? How can you implement it when talking about HTTP APIs?
The “Test Pyramid” is a metaphor that tells us to group software tests into buckets of different granularity. It also gives an idea of how many tests we should have in each of these groups. It shows which kinds of tests you should be looking for in the different levels of the pyramid and gives practical examples on how these can be implemented.

Mike Cohn’s original test pyramid consists of three layers that your test suite should consist of (bottom to top):
- Unit Tests
- Service Tests
- User Interface Tests
How can you secure your HTTP cookies against XSS attacks?
1. When the web server sets cookies, it can provide some additional attributes to make sure the cookies won’t be accessible by using malicious JavaScript. One such attribute is HttpOnly.
Set-Cookie: [name]=[value]; HttpOnly
HttpOnly makes sure the cookie cannot be read by client-side JavaScript (for example via document.cookie), which limits what an injected script can steal.
2. The “Secure” attribute can make sure the cookies are sent over secured channel only.
Set-Cookie: [name]=[value]; Secure
3. The web server can use X-XSS-Protection response header to make sure pages do not load when they detect reflected cross-site scripting (XSS) attacks.
X-XSS-Protection: 1; mode=block
4. The web server can use the HTTP Content-Security-Policy response header to control which resources a user agent is allowed to load for a given page. It helps prevent various types of attacks such as cross-site scripting (XSS) and data injection.
Content-Security-Policy: default-src 'self' *.sometrustedwebsite.com
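A minimal sketch of how these cookie attributes can be set from an Express application (the route and values are illustrative):
const express = require('express');
const app = express();

app.get('/login', (req, res) => {
  res.cookie('session', 'token-value', {
    httpOnly: true,    // not readable from client-side JavaScript
    secure: true,      // sent over HTTPS only
    sameSite: 'strict' // not sent on cross-site requests
  });
  res.send('cookie set');
});

app.listen(3000);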
How can you make sure your dependencies are safe?
The only option is to automate the update / security audit of your dependencies. For that there are free and paid options:
- npm outdated
- Trace by RisingStack
- NSP
- GreenKeeper
- Snyk
- npm audit
- npm audit fix
What is Event loop in Node.js? How does it work?
The event loop is what allows Node.js to perform non-blocking I/O operations — despite the fact that JavaScript is single-threaded — by offloading operations to the system kernel whenever possible.
Node.js is a single-threaded application, but it can support concurrency via the concepts of events and callbacks. Every API of Node.js is asynchronous and, being single-threaded, it uses async function calls to maintain concurrency. Node uses the observer pattern: the Node thread keeps an event loop, and whenever a task gets completed, it fires the corresponding event, which signals the event-listener function to execute.
Event-Driven Programming
In an event-driven application, there is generally a main loop that listens for events, and then triggers a callback function when one of those events is detected.
Although events look quite similar to callbacks, the difference lies in the fact that callback functions are called when an asynchronous function returns its result, whereas event handling works on the observer pattern. The functions that listen to events act as observers. Whenever an event gets fired, its listener function starts executing. Node.js has multiple built-in events available through the events module and the EventEmitter class, which are used to bind events and event listeners as follows:
// Import events module
var events = require('events');
// Create an eventEmitter object
var eventEmitter = new events.EventEmitter();
Example:
// Import events module
var events = require('events');
// Create an eventEmitter object
var eventEmitter = new events.EventEmitter();

// Create an event handler as follows
var connectHandler = function connected() {
  console.log('connection successful.');
  // Fire the data_received event
  eventEmitter.emit('data_received');
}

// Bind the connection event with the handler
eventEmitter.on('connection', connectHandler);

// Bind the data_received event with the anonymous function
eventEmitter.on('data_received', function() {
  console.log('data received successfully.');
});

// Fire the connection event
eventEmitter.emit('connection');

console.log("Program Ended.");
What is REPL? What purpose it is used for?
REPL (Read, Eval, Print, Loop) is a computer environment similar to a shell (Unix/Linux) or the command prompt. Node comes with the REPL environment when it is installed. The system interacts with the user through the output of the commands/expressions used. It is useful for writing and debugging code. The working of REPL can be understood from its full form:
- Read: It reads the input from the user and parses it into a JavaScript data structure, which is then stored in memory.
- Eval: The parsed JavaScript data structure is evaluated for the result.
- Print: The result is printed after the evaluation.
- Loop: The input command is looped over until the user exits. To come out of the Node REPL, press Ctrl+C twice.
Simple Expression
$ node
> 10 + 20
30
> 10 + ( 20 * 30 ) - 40
570
>
What is the difference between Asynchronous and Non-blocking?
1. Asynchronous
The asynchronous architecture means that a message that is sent will not get a reply immediately, just like sending mail and not getting a reply at once. There is no dependency or ordering, which improves system efficiency and performance. The server stores the information and notifies you when the action is done.
2. Non-Blocking
Non-blocking responds immediately with whatever data is available. Moreover, it does not block any execution and keeps running as per the requests. If an answer cannot be retrieved, the API returns immediately with an error. Non-blocking is mostly used with I/O (input/output). Node.js is itself based on a non-blocking I/O model. There are a few ways of signalling that a non-blocking I/O operation has completed: typically a callback function is called when the operation is completed.
- Asynchronous VS Non-Blocking
- Asynchronous does not respond immediately, while non-blocking responds immediately if the data is available, and if not, it simply returns an error.
- Asynchronous improves efficiency by doing the task fast: the response might come later and, meanwhile, other tasks can be completed. Non-blocking does not block any execution and, if the data is available, retrieves the information quickly.
- Asynchronous is the opposite of synchronous, while non-blocking I/O is the opposite of blocking. They are fairly similar, but asynchronous is used with a broader range of operations while non-blocking is mostly used with I/O.
How to debug an application in Node.js?
One option is node-inspector:
npm install -g node-inspector
Run
node-debug app.js
- Debugging
- Debugger
- Node Inspector
- Visual Studio Code
- Cloud9
- Brackets
- Profiling
1. node --prof ./app.js
2. node --prof-process ./the-generated-log-file
- Heapdumps
- node-heapdump with Chrome Developer Tools
- Tracing
- Interactive Stack Traces with TraceGL
- Logging
- Libraries that output debugging information: Caterpillar, Tracer, scribbles
- Libraries that enhance stack trace information: Longjohn
What are some of the most popular packages of Node.js?
- Async: Async is a utility module which provides straight-forward, powerful functions for working with asynchronous JavaScript.
- Browserify: Browserify will recursively analyze all the require() calls in your app in order to build a bundle you can serve up to the browser in a single <script> tag.
- Bower: Bower is a package manager for the web. It works by fetching and installing packages from all over, taking care of hunting, finding, downloading, and saving the stuff you're looking for.
- Csv: csv module has four sub modules which provides CSV generation, parsing, transformation and serialization for Node.js.
- Debug: Debug is a tiny node.js debugging utility modelled after node core’s debugging technique.
- Express: Express is a fast, un-opinionated, minimalist web framework. It provides small, robust tooling for HTTP servers, making it a great solution for single page applications, web sites, hybrids, or public HTTP APIs.
- Grunt: is a JavaScript Task Runner that facilitates creating new projects and makes performing repetitive but necessary tasks such as linting, unit testing, concatenating and minifying files (among other things) trivial.
- Gulp: is a streaming build system that helps you automate painful or time-consuming tasks in your development workflow.
- Hapi: is a rich framework for building applications and services, enabling developers to focus on writing reusable application logic instead of spending time building infrastructure.
- Http-server: is a simple, zero-configuration command-line http server. It is powerful enough for production usage, but it’s simple and hackable enough to be used for testing, local development, and learning.
- Inquirer: A collection of common interactive command line user interfaces.
- jQuery: jQuery is a fast, small, and feature-rich JavaScript library.
- Jshint: Static analysis tool to detect errors and potential problems in JavaScript code and to enforce your team’s coding conventions.
- Koa: Koa is web app framework. It is an expressive HTTP middleware for node.js to make web applications and APIs more enjoyable to write.
- Lodash: The lodash library exported as a node module. Lodash is a modern JavaScript utility library delivering modularity, performance, & extras.
- Less: The less library exported as a node module.
- Moment: A lightweight JavaScript date library for parsing, validating, manipulating, and formatting dates.
- Mongoose: It is a MongoDB object modeling tool designed to work in an asynchronous environment.
- MongoDB: The official MongoDB driver for Node.js. It provides a high-level API on top of mongodb-core that is meant for end users.
- Npm: is package manager for JavaScript.
- Nodemon: It is a simple monitor script for use during development of a node.js app, It will watch the files in the directory in which nodemon was started, and if any files change, nodemon will automatically restart your node application.
- Nodemailer: This module enables e-mail sending from a Node.js application.
- Optimist: is a node.js library for option parsing with an argv hash.
- Phantomjs: An NPM installer for Phantom JS, headless webkit with JS API. It has fast and native support for various web standards: DOM handling, CSS selector, JSON, Canvas, and SVG.
- Passport: A simple, unobtrusive authentication middleware for Node.js. Passport uses the strategies to authenticate requests. Strategies can range from verifying username and password credentials or authentication using OAuth or OpenID.
- Q: Q is a library for promises. A promise is an object that represents the return value or the thrown exception that the function may eventually provide.
- Request: Request is Simplified HTTP request client make it possible to make http calls. It supports HTTPS and follows redirects by default.
- Socket.io: It's a Node.js realtime framework server.
- Sails: Sails : API-driven framework for building Realtime apps, using MVC conventions (based on Express and Socket.io)
- Through: It enables simplified stream construction. It is easy way to create a stream that is both readable and writable.
- Underscore: Underscore.js is a utility-belt library for JavaScript that provides support for the usual functional suspects (each, map, reduce, filter…) without extending any core JavaScript objects.
- Validator: A NodeJS module for a library of string validators and sanitizers.
- Winston: A multi-transport async logging library for Node.js
- Ws: A simple to use, blazing fast and thoroughly tested websocket client, server and console for node.js
- Xml2js: A Simple XML to JavaScript object converter.
- Yo: A CLI tool for running Yeoman generators
- Zmq: Bindings for node.js and io.js to ZeroMQ .It is a high-performance asynchronous messaging library, aimed at use in distributed or concurrent applications.
What is Event Emitter in Node.js?
All objects that emit events are instances of the EventEmitter class. These objects expose an eventEmitter.on() function that allows one or more functions to be attached to named events emitted by the object.
When the Event Emitter object emits an event, all of the functions attached to that specific event are called synchronously. All values returned by the called listeners are ignored and will be discarded.
Example:
var events = require('events');
var eventEmitter = new events.EventEmitter();

// listener #1
var listener1 = function listener1() {
  console.log('listener1 executed.');
}

// listener #2
var listener2 = function listener2() {
  console.log('listener2 executed.');
}

// Bind the connection event with the listener1 function
eventEmitter.addListener('connection', listener1);

// Bind the connection event with the listener2 function
eventEmitter.on('connection', listener2);

var eventListeners = eventEmitter.listenerCount('connection');
console.log(eventListeners + " Listener(s) listening to connection event");

// Fire the connection event
eventEmitter.emit('connection');

// Remove the binding of the listener1 function
eventEmitter.removeListener('connection', listener1);
console.log("Listener1 will not listen now.");

// Fire the connection event
eventEmitter.emit('connection');

eventListeners = eventEmitter.listenerCount('connection');
console.log(eventListeners + " Listener(s) listening to connection event");

console.log("Program Ended.");
Now run the main.js
$ node main.js
Output
2 Listener(s) listening to connection event
listener1 executed.
listener2 executed.
Listener1 will not listen now.
listener2 executed.
1 Listener(s) listening to connection event
Program Ended.
How many types of streams are present in node.js?
Streams are objects that let you read data from a source or write data to a destination in a continuous fashion. There are four types of streams:
- Readable − Stream which is used for read operation.
- Writable − Stream which is used for write operation.
- Duplex − Stream which can be used for both read and write operation.
- Transform − A type of duplex stream where the output is computed based on input.
Each type of stream is an EventEmitter instance and emits several events at different points in time.
Example:
- data − This event is fired when there is data is available to read.
- end − This event is fired when there is no more data to read.
- error − This event is fired when there is any error receiving or writing data.
- finish − This event is fired when all the data has been flushed to underlying system.
Reading from a Stream
var fs = require("fs");
var data = '';

// Create a readable stream
var readerStream = fs.createReadStream('input.txt');

// Set the encoding to be utf8.
readerStream.setEncoding('UTF8');

// Handle stream events --> data, end, and error
readerStream.on('data', function(chunk) {
  data += chunk;
});

readerStream.on('end', function() {
  console.log(data);
});

readerStream.on('error', function(err) {
  console.log(err.stack);
});

console.log("Program Ended");
Writing to a Stream
var fs = require("fs");
var data = 'Simply Easy Learning';

// Create a writable stream
var writerStream = fs.createWriteStream('output.txt');

// Write the data to the stream with utf8 encoding
writerStream.write(data, 'UTF8');

// Mark the end of file
writerStream.end();

// Handle stream events --> finish, and error
writerStream.on('finish', function() {
  console.log("Write completed.");
});

writerStream.on('error', function(err) {
  console.log(err.stack);
});

console.log("Program Ended");
Piping the Streams
Piping is a mechanism where we provide the output of one stream as the input to another stream. It is normally used to get data from one stream and to pass the output of that stream to another stream. There is no limit on piping operations.
var fs = require("fs");
// Create a readable stream
var readerStream = fs.createReadStream('input.txt');
// Create a writable stream
var writerStream = fs.createWriteStream('output.txt');
// Pipe the read and write operations
// read input.txt and write data to output.txt
readerStream.pipe(writerStream);
console.log("Program Ended");
Chaining the Streams
Chaining is a mechanism to connect the output of one stream to another stream and create a chain of multiple stream operations. It is normally used with piping operations.
var fs = require("fs");
var zlib = require('zlib');

// Compress the file input.txt to input.txt.gz
fs.createReadStream('input.txt')
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('input.txt.gz'));

console.log("File Compressed.");
What is crypto in Node.js? How do you cipher the secure information in Node.js?
The Node.js crypto module supports cryptography. It provides cryptographic functionality that includes a set of wrappers for OpenSSL's hash, HMAC, cipher, decipher, sign and verify functions.
- Hash: A hash is a fixed-length string of bits that is procedurally and deterministically generated from some arbitrary block of source data.
- HMAC: HMAC stands for Hash-based Message Authentication Code. It is a process for applying a hash algorithm to both data and a secret key that results in a single final hash.
- Encryption Example using Hash and HMAC
const crypto = require('crypto');
const secret = 'abcdefg';
const hash = crypto.createHmac('sha256', secret)
  .update('Welcome to JavaTpoint')
  .digest('hex');
console.log(hash);
Encryption example using Cipher
// Note: crypto.createCipher is deprecated; crypto.createCipheriv is preferred in current Node.js versions
const crypto = require('crypto');
const cipher = crypto.createCipher('aes192', 'a password');
var encrypted = cipher.update('Hello JavaTpoint', 'utf8', 'hex');
encrypted += cipher.final('hex');
console.log(encrypted);
Decryption example using Decipher
// Note: crypto.createDecipher is likewise deprecated in favour of crypto.createDecipheriv
const crypto = require('crypto');
const decipher = crypto.createDecipher('aes192', 'a password');
var encrypted = '4ce3b761d58398aed30d5af898a0656a3174d9c7d7502e781e83cf6b9fb836d5';
var decrypted = decipher.update(encrypted, 'hex', 'utf8');
decrypted += decipher.final('utf8');
console.log(decrypted);
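Since createCipher/createDecipher are deprecated, here is a minimal sketch of the same idea with the key/IV based APIs; the scrypt salt and parameters are illustrative assumptions:
const crypto = require('crypto');

const password = 'a password';
const key = crypto.scryptSync(password, 'salt', 24);   // 24-byte key for aes-192-cbc
const iv = crypto.randomBytes(16);                     // initialization vector

const cipher = crypto.createCipheriv('aes-192-cbc', key, iv);
let encrypted = cipher.update('Hello JavaTpoint', 'utf8', 'hex');
encrypted += cipher.final('hex');

const decipher = crypto.createDecipheriv('aes-192-cbc', key, iv);
let decrypted = decipher.update(encrypted, 'hex', 'utf8');
decrypted += decipher.final('utf8');
console.log(decrypted);   // Hello JavaTpoint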
What is the use of DNS module in Node.js?
The DNS module is a Node module used to perform name resolution (as provided by the operating system) as well as actual DNS lookups. There is no need to memorize IP addresses: DNS servers provide a nifty solution of converting domain or subdomain names to IP addresses. This module provides an asynchronous network wrapper and can be imported using the following syntax:
const dns = require('dns');
Example: dns.lookup() function
const dns = require('dns');

dns.lookup('www.google.com', (err, addresses, family) => {
  console.log('addresses:', addresses);
  console.log('family:', family);
});
Example: resolve4() and reverse() functions
const dns = require('dns');

dns.resolve4('www.google.com', (err, addresses) => {
  if (err) throw err;
  console.log(`addresses: ${JSON.stringify(addresses)}`);
  addresses.forEach((a) => {
    dns.reverse(a, (err, hostnames) => {
      if (err) {
        throw err;
      }
      console.log(`reverse for ${a}: ${JSON.stringify(hostnames)}`);
    });
  });
});
Example: print the localhost name using the lookupService() function
const dns = require('dns');

dns.lookupService('127.0.0.1', 22, (err, hostname, service) => {
  console.log(hostname, service);
  // Prints: localhost ssh
});
What are the security mechanisms available in Node.js?
Using the Helmet module
Helmet helps to secure your Express applications by setting various HTTP headers, like:
- X-Frame-Options to mitigate clickjacking attacks,
- Strict-Transport-Security to keep your users on HTTPS,
- X-XSS-Protection to prevent reflected XSS attacks,
- X-DNS-Prefetch-Control to disable browsers' DNS prefetching.
const express = require('express')
const helmet = require('helmet')
const app = express()
app.use(helmet())
Validating user input
Validating user input is one of the most important things to do when it comes to the security of your application. Failing to do it correctly can open up your application and users to a wide range of attacks, including command injection, SQL injection or stored cross-site scripting.
To validate user input, one of the best libraries you can pick is joi. Joi is an object schema description language and validator for JavaScript objects.
const Joi = require('joi');

const schema = Joi.object().keys({
  username: Joi.string().alphanum().min(3).max(30).required(),
  password: Joi.string().regex(/^[a-zA-Z0-9]{3,30}$/),
  access_token: [Joi.string(), Joi.number()],
  birthyear: Joi.number().integer().min(1900).max(2013),
  email: Joi.string().email()
}).with('username', 'birthyear').without('password', 'access_token')

// Return result (note: recent versions of joi use schema.validate(value) instead of Joi.validate)
const result = Joi.validate({
  username: 'abc',
  birthyear: 1994
}, schema)
// result.error === null -> valid
Securing your Regular Expressions
Regular expressions are a great way to manipulate text and get the parts that you need from it. However, there is an attack vector called a Regular Expression Denial of Service (ReDoS) attack, which exploits the fact that most regular expression implementations may reach extreme situations for specially crafted input, causing them to work extremely slowly.
The regular expressions that can do such a thing are commonly referred to as evil regexes. These expressions contain grouping with repetition and, inside the repeated group, either repetition or alternation with overlapping.
Examples of Evil Regular Expressions patterns:
(a+)+
([a-zA-Z]+)*
(a|aa)+
Security.txt
Security.txt defines a standard to help organizations define the process for security researchers to securely disclose security vulnerabilities.
const express = require('express')
const securityTxt = require('express-security.txt')

const app = express()

app.get('/security.txt', securityTxt({
  // your security address
  contact: 'email@example.com',
  // your pgp key
  encryption: 'encryption',
  // if you have a hall of fame for security researchers, include the link here
  acknowledgements: 'http://acknowledgements.example.com'
}))
Name the types of API functions in Node.js?
There are two types of API functions in Node.js:
- Asynchronous, Non-blocking functions
- Synchronous, Blocking functions
1. Blocking functions
In a blocking operation, all other code is blocked from executing until an I/O event that is being waited on occurs. Blocking functions execute synchronously.
Example:
const fs = require('fs');
const data = fs.readFileSync('/file.md'); // blocks here until file is read
console.log(data);
// moreWork(); will run after console.log
The second line of code blocks the execution of additional JavaScript until the entire file is read. moreWork() will only be called after console.log.
2. Non-blocking functions
In a non-blocking operation, multiple I/O calls can be performed without the execution of the program being halted. Non-blocking functions execute asynchronously.
Example:
const fs = require('fs');
fs.readFile('/file.md', (err, data) => {
  if (err) throw err;
  console.log(data);
});
// moreWork(); will run before console.log
Since fs.readFile() is non-blocking, moreWork() does not have to wait for the file read to complete before being called. This allows for higher throughput.
How does Node.js handle child threads?
Node.js is a single-threaded runtime which, in the background, uses multiple threads to execute asynchronous code. Node.js is non-blocking, which means that all functions (callbacks) are delegated to the event loop and they are (or can be) executed by different threads. That is handled by the Node.js run-time.
- The primary Node.js application runs in an event loop, which is on a single thread.
- Background I/O runs in a thread pool that is only accessible to C/C++ or other compiled/native modules and is mostly transparent to the JS.
- Node v11/12 now has experimental worker threads, which is another option.
- Node.js does support forking multiple processes (which are executed on different cores).
- It is important to know that state is not shared between the master and forked processes.
- We can pass messages to a forked process (which is a different script) and to the master process from the forked process with the send function.
What is the preferred method of resolving unhandled exceptions in Node.js?
Unhandled exceptions in Node.js can be caught at the process level by attaching a handler for the uncaughtException event.
process.on('uncaughtException', function(err) {
  console.log('Caught exception: ' + err);
});
process is a global object that provides information about the current Node.js process. It is also an event emitter, so listener functions can be attached to its events.
A few of these events are:
- exit
- disconnect
- uncaughtException
- rejectionHandled
What are the important command line options in Node.js?
Some of the important command line options in Node.js are as follows:
- -i or --interactive: This option opens the REPL.
- --no-warnings: We use it to suppress the printing of warnings.
- --v8-options: It is used to print V8 engine command line options.
- --preserve-symlinks: This option instructs the module loader to preserve symbolic links while resolving and caching modules.
How can we avoid Callback Hell in Node.js?
Node.js heavily depends on Callback functions. But at times developers create programs that create heavily nested Callbacks. This leads to unreadable spaghetti code that is difficult to comprehend.
We can use a divide-and-conquer approach to avoid callback hell. We have to create loosely coupled modules in our Node.js program that handle specialized functions.
Within each module, we define independent functions to which we can pass parameters.
How do you resolve unhandled exceptions in a Node.js program?
We can create a handler for the uncaughtException event in Node.js. This handler can handle the unhandled exceptions.
But this is not a professional method to handle unhandled exceptions. In general, we should modularize our program and create domains. Any unhandled exception coming from a domain has to be handled at that level. This helps in handling the exception before letting it reach the process level.
What is the use of Query String in Node.js?
We use the Query String module to handle and process URL query strings. This module provides utilities to parse and format URL query strings.
Some of the useful methods are querystring.parse(), querystring.stringify(), querystring.escape() and querystring.unescape().
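A minimal sketch of the Query String module (the sample query strings are illustrative):
const querystring = require('querystring');

const parsed = querystring.parse('name=node&version=18');
console.log(parsed.name, parsed.version);   // node 18

const str = querystring.stringify({ page: 2, sort: 'asc' });
console.log(str);                           // page=2&sort=asc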
How will you get the amount of free memory on the server in which Node.js application is running?
We can use OS module utilities to get the amount of free memory on the server.
The function to use is os.freemem(). This function returns the amount of free system memory in bytes as an integer.
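A minimal sketch (values are in bytes):
const os = require('os');

console.log('Free memory :', os.freemem());
console.log('Total memory:', os.totalmem());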
What is a Global object in Node.js?
There are Global objects in Node.js that are accessible to all the parts and modules of Node application. Some of the Global objects are as follows:
- Buffer: This object is used to handle binary data in Node.js.
- Console: This object is used to print to stderr and stdout.
- Process: This is the Global Process object in Node.js. It provides information and control on the current Node process.
What is the use of Zlib in Node.js?
Zlib is a module in Node.js that provides compression and decompression utilities based on Gzip and Deflate/Inflate. Since there is a large amount of I/O in a Node.js application, it makes sense to use compression to save bandwidth and transfer time.
How will you convert a Buffer content into readable String in Node.js?
We can use string_decoder module APIs to decode buffer objects. This module provides utilities to decode a buffer in a way that preserves encoded multi-byte UTF-8 and UTF-16 characters.
In this way we can convert the Buffer contents into readable String.
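A minimal sketch of decoding Buffer contents with string_decoder; the euro-sign split is just an illustration of multi-byte handling:
const { StringDecoder } = require('string_decoder');
const decoder = new StringDecoder('utf8');

console.log(decoder.write(Buffer.from('Hello Node.js')));   // Hello Node.js

// A multi-byte character split across chunks is held back until it is complete
const euro = Buffer.from([0xE2, 0x82, 0xAC]);   // € in UTF-8
console.log(decoder.write(euro.slice(0, 1)));   // '' (waits for the remaining bytes)
console.log(decoder.write(euro.slice(1)));      // €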
How do you write unit test cases in Node.js?
We use the Assert module to implement simple unit tests in a Node.js application. The Assert module has functions like assert.deepEqual(), assert.deepStrictEqual(), etc. to write different unit test cases.
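A minimal sketch of a unit check with the built-in assert module (runs directly with node, no test runner assumed):
const assert = require('assert');

assert.strictEqual(1 + 1, 2);
assert.deepStrictEqual({ a: 1 }, { a: 1 });
console.log('all assertions passed');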
What are some steps to handle maintenance problems in Node?
We can start with a code review to handle maintenance issues. Using microservices and working to improve code quality can also help. We can also aim for better documentation and update the whole stack.
What are the different custom directive types in AngularJS?
AngularJS supports a number of different directives, which also depend on the level to which we want to restrict them.
So in all, there are four different kinds of custom directives.
- Element Directives (E)
- Attribute Directives (A)
- CSS Class Directives (C)
- Comment Directives (M)
How to create a custom directive in AngularJS?
To create a custom directive, we have to first register it with the application object by calling the directive() function. While invoking the directive() function, we need to give the name of the directive and the function implementing its logic.
For example, in the below code, we have created a copyright directive which returns a copyright text.
app.directive('myCopyRight', function () {
  return {
    template: '@CopyRight MyDomain.com '
  };
});
Note – A custom directive should follow the camel case format as shown above.
Does Node.js support multi-core platforms? And is it capable of utilizing all the cores?
Yes, Node.js will run on a multi-core system without any issue. But it is by default a single-threaded application, so it can't completely utilize a multi-core system on its own.
However, Node.js can facilitate deployment on multi-core systems where it does use the additional hardware. It ships with a Cluster module which is capable of starting multiple Node.js worker processes that share the same port, as sketched below.
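A minimal sketch of the Cluster module sharing one port across workers (one worker per CPU core is an illustrative choice):
const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isMaster) {
  // Fork one worker per CPU core
  for (let i = 0; i < os.cpus().length; i++) {
    cluster.fork();
  }
} else {
  // Every worker binds to the same port; the master distributes connections
  http.createServer((req, res) => {
    res.end(`handled by worker ${process.pid}`);
  }).listen(3000);
}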
What is the local installation of dependencies?
By default, npm installs any dependency in local mode. This means that the package gets installed in the "node_modules" directory, which is present in the same folder where the Node application is placed. Locally installed packages are accessible via require(). Following is the syntax to install a Node project locally:
C:\Nodejs_WorkSpace>npm install express
What is the global installation of dependencies?
Globally installed packages/dependencies are stored in the <user-directory>/npm directory. Such dependencies can be used from the command-line interface (CLI) of any Node.js project, but cannot be imported using require() in a Node application directly.
To install a Node project globally, use the -g flag:
C:\Nodejs_WorkSpace>npm install express -g