
UNIT-2 (Node.js)

Working with JSON, Using the Buffer Module to Buffer Data, Using the Stream
Module to Stream Data, Accessing the File System from Node.js- Opening,
Closing, Writing, Reading Files and other File System Tasks. Implementing
HTTP Services in Node.js- Processing URLs, Processing Query Strings and
Form Parameters, Understanding Request, Response, and Server Objects,
Implementing HTTP Clients and Servers in Node.js, Implementing HTTPS
Servers and Clients. Using Additional Node.js Modules-Using the os Module,
Using the util Module, Using the dns Module, Using the crypto Module.
Accessing the File System from Node.js
• Interacting with the file system in Node.js is important especially if you need to manage
dynamic files to support a web application or service.
• Node.js provides a good interface for interacting with the file system in the fs module. This
module provides the standard file access APIs that are available in most languages to open,
read, write, and interact with files
• To include the File System module, use the require() method:
• var fs = require('fs');
• Common use for the File System module:
• Read files
• Create files
• Update files
• Delete files
• Rename files
read.js
// The fs.readFile() method is used to read files on your computer

var http = require('http');

var fs = require('fs');
fs.readFile('file1.txt', function(err, data) {
console.log('file1 content: ' + data);
});
Read Files
//READ FILE SYNCHRONOUS CALL

var http = require('http');

var fs = require('fs');

var data=fs.readFileSync('file1.txt');

console.log('file1 content:'+ data);


Read Files
The fs.readFile() method is used to read files on your
computer.

file1.html
<html>
<body>
<h1>My Header</h1>
<p>My paragraph.</p>
</body>
</html>
read.js
// The fs.readFile() method is used to read files on your computer
var http = require('http');
var fs = require('fs');
http.createServer(function (req, res) {
fs.readFile('file1.html', function(err, data) {
res.writeHead(200, {'Content-Type': 'text/html'});
res.write(data);
return res.end();
});
}).listen(8080);
Write Files
//The fs.writeFile() method replaces the specified file and
content:
//Example:Replace the content of the file "file3.txt":

var fs = require('fs');
fs.writeFile('file3.txt', 'This is my text', function (err) {
if (err) throw err;
console.log('Replaced!');
});
APPEND FILES
//The fs.appendFile() method appends specified content to
a file.
//If the file does not exist, the file will be created:

var fs = require('fs');
fs.appendFile('file1.txt', 'FULL STACK DEVELOPMENT',
function (err) {
if (err) throw err;
console.log('Saved!');
});
Delete Files
//To delete a file with the File System module, use the
fs.unlink() method.
//Delete "file2.txt":

var fs = require('fs');
fs.unlink('file2.txt', function (err) {
if (err) throw err;
console.log('File deleted!');
});
Rename files
//To rename a file with the File System module, use the
fs.rename() //method. Rename "file1.txt" to
"renamedfile.txt":

var fs = require('fs');
fs.rename('file1.txt', 'renamedfile.txt', function (err) {
if (err) throw err;
console.log('File Renamed!');
});
Synchronous Versus Asynchronous
Synchronous Versus Asynchronous File System Calls:
• The fs module provided in Node.js makes almost all functionality available in
two forms: asynchronous and synchronous.
• Synchronous file system calls block until the call completes and then control is
released back to the thread. This has advantages but can also cause severe
performance issues in Node.js if synchronous calls block the main event thread
or too many of the background thread pool threads. Therefore, synchronous file
system calls should be limited in use when possible.
• Asynchronous calls are placed on the event queue to be run later. This allows
the calls to fit into the Node.js event model; however, this can be tricky when
executing your code because the calling thread continues to run before the
asynchronous call gets picked up by the event loop
Synchronous Versus Asynchronous
Important differences between synchronous and asynchronous file system calls in
Node.js:
■ Asynchronous calls require a callback function as an extra parameter. The callback
function is executed when the file system request completes, and typically contains an
error as its first parameter.
■ Exceptions are automatically handled by asynchronous calls, and an error object is
passed as the first parameter if an exception occurs. Exceptions in synchronous calls
must be handled by your own try/catch blocks of code.
■ Synchronous calls are run immediately, and execution does not return to the current
thread until they are complete. Asynchronous calls are placed on the event queue, and
execution returns to the running thread code, but the actual call will not execute until
picked up by the event loop.
Opening and Closing Files

Node provides synchronous and asynchronous methods for opening files.


Once a file is opened, you can read data from it or write data to it depending on the
flags used to open the file.
To open files in a Node.js app, use one of the following statements for asynchronous or
synchronous:
fs.open(path, flags, [mode], callback)
fs.openSync(path, flags, [mode])
The path parameter specifies a standard path string for your file system.
The flags parameter specifies what mode to open the file in—read, write, append, and
so on.
The optional mode parameter sets the file access mode and defaults to 0666, which is
readable and writable.
Flags that define how files are opened
Mode Description
r Open file for reading. An exception occurs if the file does not exist.

r+ Open file for reading and writing. An exception occurs if the file does not exist.

rs Open file for reading in synchronous mode.

rs+ Same as rs except the file is also opened for writing.

w Open file for writing. The file is created if it does not exist or truncated if it does
exist.
wx Same as w but fails if the path exists.

w+ Open file for reading and writing. The file is created if it does not exist or truncated if
it exists.
wx+ Same as w+ but fails if path exists

a Open file for appending. The file is created if it does not exist.

ax Same as a but fails if the path exists

a+ Open file for reading and appending. The file is created if it does not exist.

ax+ Same as a+ but fails if the path exists.


Opening and Closing Files
• In the case of the asynchronous close() call, you also need to specify a callback function:
• fs.close(fd, callback)
• fs.closeSync(fd)
Asynchronous mode: Notice that a callback function is specified that receives an err and an fd
parameter. The fd parameter is the file descriptor that you can use to read or write to the file:
fs.open("myFile", 'w', function(err, fd){
if (!err){
fs.close(fd);
}
});
synchronous mode: Notice that there is no callback function and that the file descriptor used to read
and write to the file is returned directly from fs.openSync():
var fd = fs.openSync("myFile", 'w');
fs.closeSync(fd);
Writing Files
• The fs module provides four different ways to write data to files.
• You can write data to a file in a single call, write chunks using synchronous writes, write
chunks using asynchronous writes, or stream writes through a Writable stream.
• Each of these methods accepts either a String or a Buffer object as input.
1.Simple File Write
• The simplest method for writing data to a file is to use one of the writeFile() methods.
• These methods write the full contents of a String or Buffer to a file.
• fs.writeFile(path, data, [options], callback)
• fs.writeFileSync(path, data, [options])
• The path parameter specifies the path to the file. The data parameter specifies the String or
Buffer object to be written to the file. The optional options parameter is an object that can
contain encoding, mode, and flag properties that define the string encoding as well as the
mode and flags used when opening the file. The asynchronous method also requires a callback
that is called when the file write has been completed.
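As a quick illustration of the synchronous variant described above, the following minimal sketch writes a string with writeFileSync() (the file name 'config.txt' is only an example, not part of the slides):

var fs = require('fs');
try {
  // writes the full string to the file, creating or replacing it
  fs.writeFileSync('config.txt', 'some configuration data');
  console.log('Write complete');
} catch (err) {
  // synchronous calls throw, so errors are handled with try/catch
  console.log('Write failed: ' + err);
}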
Writing Files
• 2. Synchronous File Writing:
• The synchronous method of file writing writes the data to the file before returning execution to the
running thread.
• To write to a file synchronously, first open it using openSync() to get a file descriptor and then use
fs.writeSync() to write data to the file.
• The syntax for fs.writeSync():
• fs.writeSync(fd, data, offset, length, position)
• The fd parameter is the file descriptor returned by openSync().
• The data parameter specifies the String or Buffer object to be written to the file.
• The offset parameter specifies the index in the input data to begin reading from; if you want to begin
at the current index in the String or Buffer, this value should be null.
• The length specifies the number of bytes to write; specifying null writes until the end of the data
buffer.
• The position argument specifies the position in the file to begin writing at; specifying null for this
value uses the current file position.
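The following is a minimal sketch of the synchronous openSync()/writeSync()/closeSync() sequence described above; the file name 'sync.txt' is illustrative, and offset, length, and position are omitted so the whole string is written at the current file position:

var fs = require('fs');
// open the file for writing and get a file descriptor back
var fd = fs.openSync('sync.txt', 'w');
// returns the number of bytes written
var written = fs.writeSync(fd, 'Written synchronously\n');
console.log('Wrote ' + written + ' bytes');
fs.closeSync(fd);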
Writing Files
• 3.Asynchronous File Writing
• The asynchronous method of file writing puts the write request on the event queue and then
returns control back to the calling code. The actual write does not take place until the event
loop picks up the write request and executes it.
• To write to a file asynchronously, first open it using open() and then after the callback from the
open request has executed, use fs.write() to write data to the file.
• The syntax for fs.write():
• fs.write(fd, data, offset, length, position, callback)
• fd , data , offset, length, position are same as above.
• The callback argument must be a function that can accept two parameters, error and bytes,
where error is an error that occurred during the write and bytes specifies the number of bytes
written
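A minimal sketch of the asynchronous open()/write()/close() sequence; the file name 'async.txt' is illustrative:

var fs = require('fs');
fs.open('async.txt', 'w', function(err, fd) {
  if (err) throw err;
  // the callback receives the error and the number of bytes written
  fs.write(fd, 'Written asynchronously\n', function(err, bytes) {
    if (err) throw err;
    console.log('Wrote ' + bytes + ' bytes');
    fs.close(fd, function(err) { if (err) throw err; });
  });
});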
Writing Files
• 4.Streaming File Writing
• One of the best methods to use when writing large amounts of data to a file is the
streaming method.
• This method opens the file as a Writable stream.
• Writable streams can easily be implemented and linked to Readable streams using the
pipe() method, which makes it easy to write data from a Readable stream source such as
an HTTP request.
• To stream data to a file asynchronously, you first need to create a Writable stream object
using the following syntax:
• fs.createWriteStream(path, [options])
• The path parameter specifies the path to the file and can be relative or absolute. The
optional options parameter is an object that can contain encoding, mode, and flag
properties that define the string encoding as well as the mode and flags used when
opening the file.
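A minimal sketch of streaming writes with createWriteStream(); the file name 'grains.txt' and its contents are illustrative:

var fs = require('fs');
var grains = ['wheat', 'rice', 'oats'];
var writeStream = fs.createWriteStream('grains.txt');
writeStream.on('close', function() {
  console.log('File Closed.');
});
grains.forEach(function(grain) {
  writeStream.write(grain + ' ');   // each write() appends a chunk to the stream
});
writeStream.end();                   // finish writing and close the stream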
Reading Files
• The fs module also provides four different ways to read data from files. You can read data in
one large chunk, read chunks of data using synchronous reads, read chunks of data using
asynchronous reads, or stream reads through a Readable stream.
1.Simple File Read:
The simplest method for reading data from a file is to use one of the readFile() methods. These
methods read the full contents of a file into a data buffer. The following shows the syntax for
the readFile() methods:
fs.readFile(path, [options], callback)
fs.readFileSync(path, [options])
2. Synchronous File Reading
The synchronous method of file reading reads the data from the file before returning
execution to the running thread. To read to a file synchronously, first open it using openSync()
to get a file descriptor and then use readSync() to read data from the file. The following shows
the syntax for readSync():
fs.readSync(fd, buffer, offset, length, position)
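A minimal sketch of the synchronous openSync()/readSync() sequence; the file name 'sync.txt' and the 64-byte buffer size are illustrative:

var fs = require('fs');
var fd = fs.openSync('sync.txt', 'r');
var buf = Buffer.alloc(64);                           // destination buffer
var bytes = fs.readSync(fd, buf, 0, buf.length, 0);   // read up to 64 bytes from position 0
console.log('Read ' + bytes + ' bytes: ' + buf.toString('utf8', 0, bytes));
fs.closeSync(fd);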
Reading Files
• 3.Asynchronous File Reading
• The asynchronous method of file reading puts the read request on the event queue and then returns
control back to the calling code. The actual read does not take place until the event loop picks up the
read request and executes it.
• To read from a file asynchronously, first open it using open() and then after the callback from the
open request has executed, use read() to read data from the file. The following shows the syntax for
read():
• fs.read(fd, buffer, offset, length, position, callback).
• The callback argument must be a function that can accept three parameters: error, bytes, and buffer.
The error parameter is an error that occurred during the read, bytes specifies the number of bytes
read, and buffer is the buffer with data populated from the read request.
• 4. Streaming File Reading
• To stream data from a file asynchronously, you first need to create a Readable stream object using the
following syntax:
• fs.createReadStream(path, [options])
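A minimal sketch of streaming reads with createReadStream(); the file name 'grains.txt' is illustrative:

var fs = require('fs');
var readStream = fs.createReadStream('grains.txt');
readStream.on('data', function(chunk) {
  console.log('Chunk: ' + chunk);      // fired for each chunk read from the file
});
readStream.on('end', function() {
  console.log('Read complete.');
});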
Deleting Files
• To delete a file from Node.js, use one of the following commands:
• fs.unlink(path, callback)
• fs.unlinkSync(path)
• The unlinkSync(path) call throws an exception if the delete fails.
• The asynchronous unlink() call passes back an error value to the callback function if
an error is encountered when deleting the file.
• The following code snippet illustrates the process of deleting a file named new.txt
using the unlink() asynchronous fs call:
• fs.unlink("new.txt", function(err){
• console.log(err ? "File Delete Failed" : "File Deleted");
• });
Renaming Files and Directories
• You might also need to rename files and folders in your Node.js application to make room for
new data, archive old data, or apply changes made by a user. Renaming files and folders uses
the fs calls shown here:
• fs.rename(oldPath, newPath, callback)
• fs.renameSync(oldPath, newPath)
• The oldPath specifies the existing file or directory path, and the newPath specifies the new
name.
• The renameSync() call throws an exception if the file or directory cannot be
successfully renamed.
• The asynchronous rename() call passes an error value to the callback function if an error is
encountered when renaming the file or directory
Truncating Files
• Truncating a file means reducing the size of the file by setting the end to a smaller value than
the current size.
• You might want to truncate a file that grows continuously but does not contain critical data,
such as a temporary log.
• To truncate a file, use one the following fs calls and pass in the number of bytes you want the
file to contain when the truncation completes:
• fs.truncate(path, len, callback)
• fs.truncateSync(path, len)
• The truncateSync() call throws an exception if the file cannot be successfully
truncated. The asynchronous truncate() call passes an error value to the callback function if an
error is encountered when truncating the file.
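A minimal sketch of the asynchronous truncate() call; the file name 'temp.log' and the length of 0 bytes are illustrative:

var fs = require('fs');
// truncate 'temp.log' down to 0 bytes
fs.truncate('temp.log', 0, function(err) {
  console.log(err ? 'Truncate Failed' : 'File Truncated');
});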
Other File System Tasks
• Verifying Path Existence
• Before doing any kind of read/write operation on a file or directory, you might want to verify
whether the path exists. This can easily be done using one of the following methods:
• fs.exists(path, callback)
• fs.existsSync(path)
• The fs.existsSync(path) returns true or false based on the path existence.
• If you use fs.exists(), you need to implement a callback that is executed when the call
completes. The callback is passed a Boolean value of true or false depending on whether the
path exists.
• For example, the following code verifies the existence of a file named filesystem.js in the
current path and displays the results:
fs.exists('filesystem.js', function (exists) {
console.log(exists ? "Path Exists" : "Path Does Not Exist");
});
Getting File Info
• You can get basic information about file system objects such as the file size, the mode, the modify time,
and whether the entry is a file or folder. This information can be obtained using one of the following
calls: fs.stat(path, callback)
• fs.statSync(path)
• The fs.statSync() method returns a Stats object, whereas the fs.stat() method executes asynchronously and the
Stats object is passed to the callback function as the second parameter.
• Attributes and methods of Stats objects for file system entries
Attribute/Method Description
isFile() Returns true if the entry is a file
isDirectory() Returns true if the entry is a directory
dev Specifies the device ID on which the file is
located
mode Specifies the access mode of the file
size Specifies the number of bytes in the file
atime Specifies the time the file was last accessed
mtime Specifies the time the file was last modified
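A minimal sketch using fs.stat() and a few of the Stats members listed above; the file name 'filesystem.js' is reused from the earlier existence example:

var fs = require('fs');
fs.stat('filesystem.js', function (err, stats) {
  if (err) throw err;
  console.log('Is file: ' + stats.isFile());
  console.log('Is directory: ' + stats.isDirectory());
  console.log('Size: ' + stats.size + ' bytes');
  console.log('Last modified: ' + stats.mtime);
});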
Making and Removing Directories
• The fs module provides the functionality to add and remove directories as necessary.
• To add a directory from Node.js, use one of the following fs calls.
• The path can be absolute or relative. The optional mode parameter allows you to specify the
access mode for the new directory.
• fs.mkdir(path, [mode], callback)
• fs.mkdirSync(path, [mode])
To delete a directory from Node.js, use one of the following fs calls. The path can be absolute
or relative.
fs.rmdir(path, callback)
fs.rmdirSync(path)
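A minimal sketch that creates and then removes a directory; the directory name './data' is illustrative:

var fs = require('fs');
fs.mkdir('./data', function(err) {
  console.log(err ? 'Make Directory Failed' : 'Directory Created');
  // remove the directory again once it has been created
  fs.rmdir('./data', function(err) {
    console.log(err ? 'Remove Directory Failed' : 'Directory Removed');
  });
});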
Node.js Modules

•In Node.js, Modules are the blocks of encapsulated code that communicate with an external
application on the basis of their related functionality.
•Modules can be a single file or a collection of multiple files/folders.
•The reason programmers are heavily reliant on modules is because of their reusability as well
as the ability to break down a complex piece of code into manageable chunks.
• Different types of Node.js Modules:
 Core Modules/ Built-in
 Local Modules
 Third-party modules
Core Modules(BUILT-IN)
•Node.js has many built-in modules that are part of the platform and come with the Node.js installation. These modules
can be loaded into the program by using the require() function.
•Syntax: const module = require('module_name');
•The require() function will return a JavaScript type depending on what the particular module returns.
const http = require('http');
http.createServer(function (req, res) {
res.writeHead(200, { 'Content-Type': 'text/html' });
res.write('Welcome to this page!');
res.end();
}).listen(3000);

• In the above example, the require() function returns an object because the http module exposes its functionality as
an object. The callback passed to http.createServer() will be executed when someone tries to access the computer
on port 3000. The first argument to res.writeHead() is the status code, where 200 means OK, while the second
argument is an object containing the response headers.
Core Modules

Core Modules Description

http creates an HTTP server in Node.js.

assert set of assertion functions useful for testing.

fs used to handle file system.

path includes methods to deal with file paths.

process provides information and control about the current Node.js process.

os provides information about the operating system.

querystring utility used for parsing and formatting URL query strings.

url module provides utilities for URL resolution and parsing.


Local Modules(User-defined)
•Unlike built-in and external modules, local modules are created locally in your Node.js application.
Let’s create a simple calculating module that calculates various operations.
•// Filename: calc.js
exports.add = function (x, y) {
return x + y;
};
exports.sub = function (x, y) {
return x - y;
};
exports.mult = function (x, y) {
return x * y;
};
exports.div = function (x, y) {
return x / y;
};
Since this file provides attributes to the outer world via exports, another file can use its exported
functionality using the require() function.
Local Modules
// Filename: module.js
const calculator = require('./calc.js');
let x = 50, y = 10;
console.log("Addition of 50 and 10 is" + calculator.add(x, y));
console.log("Subtraction of 50 and 10 is " + calculator.sub(x, y));
console.log("Multiplication of 50 and 10 is "+ calculator.mult(x, y));
console.log("Division of 50 and 10 is " + calculator.div(x, y));
____________________________________________________
Output:
Addition of 50 and 10 is 60
Subtraction of 50 and 10 is 40
Multiplication of 50 and 10 is 500
Division of 50 and 10 is 5
Third-party modules (NPM)
•Third-party modules are modules that are available online using the Node Package
Manager(NPM).
• These modules can be installed in the project folder or globally.
• Some of the popular third-party modules are Mongoose, express, angular, and React.
•Example:
 npm install express
 npm install mongoose
 npm install angular
 npm install react
Using Additional Node.js Modules
• OS
• UTIL
• DNS
• CRYPTO
The os module
The os module provides a useful set of functions that allow you to get information from the operating
system (OS).
Method Description

tmpdir() Returns a string path to the default temp directory for the OS. Useful if you
need to store files temporarily and then remove them later.
endianness() Returns BE or LE for big endian or little endian, depending on the architecture
of the machine.
hostname() Returns the hostname defined for the machine. This is useful when
implementing network services that require a hostname.
type() Returns the OS type as a string.
platform() Returns the platform as a string; for example, win32, linux, or freeBSD.
arch() Returns the platform architecture; for example, x86 or x64.
release() Returns the OS version release.
uptime() Returns a timestamp in seconds of how long the OS has been running.
The os module

loadavg() On UNIX-based systems, returns an array of values containing the system
load value for [1, 5, 15] minutes.
totalmem() Returns an integer specifying the system memory in bytes.
freemem() Returns an integer specifying the free system memory in bytes.

cpus() Returns an array of objects that describes the model, speed, and times.
This array contains the amount of time the CPU has spent in user, nice,
sys, idle, and irq.
networkInterfaces() Returns an array of objects describing the address and family of
addresses bound on each network interface in your system.
EOL Contains the appropriate End Of Line characters for the operating system; for
example, \n or \r\n. This can be useful to make your application cross-platform
compatible when processing string data.
The os module
//Demonstrate OS MODULE functions
var os = require('os');
console.log("tmpdir :\t" + os.tmpdir());
console.log("endianness :\t" + os.endianness());
console.log("hostname :\t" + os.hostname());
console.log("type :\t\t" + os.type());
console.log("platform :\t" + os.platform());
console.log("arch :\t\t" + os.arch());
console.log("release :\t" + os.release());
console.log("uptime :\t" + os.uptime());
console.log("loadavg :\t" + os.loadavg());
console.log("totalmem :\t" + os.totalmem());
console.log("freemem :\t" + os.freemem());
console.log("EOL :\t" + os.EOL);
console.log("cpus :\t\t" + JSON.stringify(os.cpus()));
console.log("networkInterfaces : " + JSON.stringify(os.networkInterfaces()));
• tmpdir : C:\Users\91966\AppData\Local\Temp
• endianness : LE
• hostname : DESKTOP-N5NOQHC
• type : Windows_NT
• platform : win32
• arch : x64
• release : 10.0.22621
• uptime : 207517.671
• loadavg : 0,0,0
• totalmem : 8430252032
• freemem : 851951616
• EOL :
• cpus : [{"model":"Intel(R) Core(TM) i5-10210U CPU @ 1.60GHz","speed":2112,"times":
{"user":1352843,"nice":0,"sys":560953,"idle":20264609,"irq":41109}},{"model":"Intel(R) Core(TM) i5-10210U CPU @ 1.60GHz","speed":2112,"times":
{"user":1688953,"nice":0,"sys":356250,"idle":20132984,"irq":10125}},{"model":"Intel(R) Core(TM) i5-10210U CPU @ 1.60GHz","speed":2112,"times":
{"user":1836437,"nice":0,"sys":521140,"idle":19820671,"irq":11046}},{"model":"Intel(R) Core(TM) i5-10210U CPU @ 1.60GHz","speed":2112,"times":
{"user":2336656,"nice":0,"sys":376593,"idle":19464968,"irq":7390}},{"model":"Intel(R) Core(TM) i5-10210U CPU @ 1.60GHz","speed":2112,"times":
{"user":1557625,"nice":0,"sys":493812,"idle":20126796,"irq":11890}},{"model":"Intel(R) Core(TM) i5-10210U CPU @ 1.60GHz","speed":2112,"times":
{"user":1244921,"nice":0,"sys":335031,"idle":20598281,"irq":7468}},{"model":"Intel(R) Core(TM) i5-10210U CPU @ 1.60GHz","speed":2112,"times":
{"user":1273156,"nice":0,"sys":454500,"idle":20450578,"irq":10781}},{"model":"Intel(R) Core(TM) i5-10210U CPU @ 1.60GHz","speed":2112,"times":
{"user":1351343,"nice":0,"sys":397312,"idle":20429578,"irq":7640}}]
• networkInterfaces : {"VMware Network Adapter VMnet1":
[{"address":"fe80::16ab:dd25:1c82:58d4","netmask":"ffff:ffff:ffff:ffff::","family":"IPv6","mac":"00:50:56:c0:00:01","internal":false,"cidr":"fe80::16ab:dd2
5:1c82:58d4/64","scopeid":8},
{"address":"192.168.111.1","netmask":"255.255.255.0","family":"IPv4","mac":"00:50:56:c0:00:01","internal":false,"cidr":"192.168.111.1/24"}],"VMwar
e Network Adapter VMnet8":
[{"address":"fe80::6a49:dc4:e71d:73a9","netmask":"ffff:ffff:ffff:ffff::","family":"IPv6","mac":"00:50:56:c0:00:08","internal":false,"cidr":"fe80::6a49:dc4:e
71d:73a9/64","scopeid":14},
{"address":"192.168.204.1","netmask":"255.255.255.0","family":"IPv4","mac":"00:50:56:c0:00:08","internal":false,"cidr":"192.168.204.1/24"}],"Wi-Fi":
[{"address":"fe80::8c10:d740:d138:2f86","netmask":"ffff:ffff:ffff:ffff::","family":"IPv6","mac":"6c:6a:77:8c:65:04","internal":false,"cidr":"fe80::8c10:d740
:d138:2f86/64","scopeid":21},
{"address":"192.168.0.102","netmask":"255.255.255.0","family":"IPv4","mac":"6c:6a:77:8c:65:04","internal":false,"cidr":"192.168.0.102/24"}],"Loopba
Using the util Module
The util module in Node.js provides utility functions that help with debugging,
formatting, and working with objects and types.
The util module is a catch-all module that provides functions for
• formatting strings,
• converting objects to strings,
• checking object types,
• performing synchronous writes to output streams,
• and some object inheritance enhancements.
Syntax:
const util = require('util');
Formatting Strings
The util.format() function accepts a formatter string as the first argument and returns a
formatted string.
util.format(format[, ...args])
The format argument is a string that can contain zero or more placeholders.
Each placeholder begins with a % character and is replaced with the converted string value
from its corresponding argument.
The first formatter placeholder represents the second argument and so on. The following is a list
of supported placeholders:
■ %s: Specifies a string
■ %d: Specifies a number (can be integer or float)
■ %i: Specifies an integer
■ %f: Specifies a floating point value
■ %j: Specifies a JSON stringifyable object
■ %: If left empty afterward, does not act as a placeholder
Formatting Strings
The following is a list of things to keep in mind when using format():
■ When there are not as many arguments as placeholders, the placeholder is not
replaced. For example:
util.format('%s = %s', 'Item1'); // 'Item1 = %s'

■ When there are more arguments than placeholders, the extra arguments are
converted to strings and concatenated with a space delimiter.
util.format('%s = %s', 'Item1', 'Item2', 'Item3'); // 'Item1 = Item2 Item3'

■ If the first argument is not a format string, then util.format() converts each
argument to a string, concatenates them together using a space delimiter, and then
returns the concatenated string. For example:
util.format(1, 2, 3); // '1 2 3'
Checking Object Types

• It is often useful to determine whether an object you have received back from a
command is of a certain type.
• To do this, you can use the instanceof operator, which compares the object types and
returns true or false.
• For example:
• ([1,2,3] instanceof Array) //true
Formatting Strings & checking Object types
var util=require('util')
console.log(util.format('%s = %s', 'Item1'));
console.log(util.format('%s = %s', 'Item1', 'Item2',
'Item3'));
console.log(util.format(1,2,3));
console.log(([1,2,3] instanceof Array));
Output:
Item1 = %s
Item1 = Item2 Item3
1 2 3
true
Converting JavaScript Objects to Strings
• Especially when debugging, you need to convert a JavaScript object to a
string representation.
• The util.inspect() method allows you to inspect an object and then return a
string representation of the object.
• The following shows the syntax for the inspect() method:
• util.inspect(object, [options])
• The object parameter is the JavaScript object you want to convert to a string.
• The options parameter allows you to control certain aspects of the formatting
process.
Converting JavaScript Objects to Strings
options can contain the following properties:
■ showHidden: When set to true, the non-enumerable properties of the
object are also converted to the string. Defaults to false.
■ depth: Limits the number of levels deep the inspect process traverses
while formatting properties that are also objects. This can prevent infinite loops .
Defaults to 2; if it is null, it can recurse forever.
■ colors: When set to true, the output is styled with ANSI color codes.
Defaults to false.
■ customInspect: When set to false, any custom inspect() functions defined on
the objects being inspected are not called. Defaults to true.
Converting JavaScript Objects to Strings
var obj = { first:'Caleb', last:'Dayley' };
obj.inspect = function(depth) {
return '{ name: "' + this.first + " " + this.last + '" }';
};
console.log(util.inspect(obj));

//Outputs:
{ first: 'Caleb', last: 'Dayley', inspect: [Function (anonymous)] }
{ name: "Caleb Dayley" }
Inheriting Functionality from Other Objects
The util module provides the util.inherits() method to allow you to
create objects that inherit the prototype methods from another.
When you create the new object, the prototype methods are
automatically used.
The following shows the format of the util.inherits() method:
util.inherits(constructor, superConstructor)
The prototype of constructor is set to the prototype of
superConstructor and executed when a new object is created.
You can access the superConstructor from your custom object
constructor using the constructor.super_ property
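A minimal sketch of util.inherits(); the Writer constructor and its write() method are illustrative, and EventEmitter is used as the superConstructor:

var util = require('util');
var events = require('events');

function Writer() {
  events.EventEmitter.call(this);            // invoke the super constructor
}
util.inherits(Writer, events.EventEmitter);  // Writer inherits EventEmitter's prototype

Writer.prototype.write = function(data) {
  this.emit('data', data);                   // emit() is inherited from EventEmitter
};

var w = new Writer();
w.on('data', function(data) { console.log('Received: ' + data); });
w.write('some data');
console.log(Writer.super_ === events.EventEmitter);   // true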
Using the dns Module
• The dns module in Node.js provides functionalities for domain name
resolution. It allows you to perform DNS lookups, or do reverse lookups,
resolve hostnames, and interact with DNS records
• A DNS lookup contacts the domain name server and requests records
about a specific domain name.
• A reverse lookup contacts the domain name server and requests the DNS
name associated with an IP address.
• The dns module provides functionality for most of the lookups that you
may need to perform.
• Syntax:
• const dns = require('dns');
Methods that can be called on the dns Module
Method Description

lookup(domain, [family], callback) Resolves the domain. The family attribute can be 4, 6, or null,
where 4 resolves into the first found A (IPv4) record, 6 resolves
into the first found AAAA (IPv6) record, and null resolves both.
The default is null.

resolve(domain, [rrtype], callback) Resolves the domain into an array of record types specified by
rrtype.
rrtype can be ■ A: IPV4 addresses, the default ■ AAAA: IPV6
addresses ■ MX: Mail eXchange records ■ TXT: Text records ■
SRV: SRV records ■ PTR: Reverse IP lookups ■ NS: Name
Server records ■ CNAME: Canonical Name records

reverse(ip, callback) Does a reverse lookup on the ip address. The callback function
receives an error object if one occurs and an array of domains if
the lookup is successful.
Methods that can be called on the dns Module
//Performing lookups and then reverse lookups on domains and IP addresses
var dns = require('dns');
console.log("Resolving www.google.com . . .");
dns.resolve4('www.google.com', function (err, addresses) {
console.log('IPv4 addresses: ' + JSON.stringify(addresses, false, ' '));
addresses.forEach(function (addr) {
dns.reverse(addr, function (err, domains) {
console.log('Reverse for ' + addr + ': ' + JSON.stringify(domains));
});
});
});
Output:
Resolving www.google.com . . .
IPv4 addresses: [
"172.217.163.196"
]
Reverse for 172.217.163.196: ["maa05s06-in-f4.1e100.net"]
Using the crypto Module
The crypto module in Node.js provides cryptographic functionality, including encryption,
decryption, hashing, and key generation.
It supports various cryptographic algorithms like AES, RSA, SHA, and HMAC.
It creates cryptographic information, or in other words, enables secure communication using
secret code. The easiest way to ensure crypto is available is to wrap the require() call in a simple try/catch block;
for example:
let crypto;
try {
crypto = require('crypto');
} catch (err) {
console.log('crypto support is disabled!');
}
The crypto module includes several classes that provide functionality to encrypt and decrypt
data and streams.
Classes that can be used in the crypto module
Class description

certificate Used for working with SPKAC (a certificate signing request mechanism), primarily
to handle output of the HTML5 <keygen> element.
cipher Used to encrypt data in either a stream that is both readable and writable, or using the
cipher.update and cipher.final methods
decipher The opposite of cipher. Used to decrypt data using either a readable and writable
stream or the decipher.update and decipher.final methods
diffieHellman Used to create key exchanges for Diffie-Hellman (a specific method for exchanging
cryptographic keys)
hash Used to create hash digests of data using a readable and writable stream or hash.update
and hash.digest
sign Used to generate signatures.

verify Used in tandem with sign to verify the signatures


Classes that can be used in the crypto module
The most common use for the crypto module is to use the Cipher and Decipher
classes to create encrypted data that can be stored and decrypted later;
for example, passwords.
Initially, passwords are entered as text, but it would be foolish to actually store
them as text.
Instead, passwords are encrypted using an encryption algorithm such as the
('aes192') method.
This allows you to store data encrypted so if it is accessed without decrypting,
your password is protected from prying minds.
Classes that can be used in the crypto module
//encrypt_password.js: Using cipher and decipher to encrypt and then decrypt data
var crypto = require('crypto');
var crypMethod = 'aes192';
var secret = 'MySecret';
function encryptPassword(pwd){
var cipher = crypto.createCipher(crypMethod, secret);
var cryptedPwd = cipher.update(pwd,'utf8','hex');
cryptedPwd += cipher.final('hex');
return cryptedPwd;
}
function decryptPassword(pwd){
var decipher = crypto.createDecipher(crypMethod, secret);
var decryptedPwd = decipher.update(pwd,'hex','utf8');
decryptedPwd += decipher.final('utf8');
return decryptedPwd;
}
Classes that can be used in the crypto module
var encryptedPwd = encryptPassword("BadWolf");
console.log("Encrypted Password");
console.log(encryptedPwd);
console.log("\nDecrypted Password");
console.log(decryptPassword(encryptedPwd))

Output:
Encrypted Password
0ebc7d846519b955332681c75c834d50

Decrypted Password
BadWolf
Implementing HTTP Services in Node.js
• One of the most important aspects of Node.js is the ability to quickly implement
HTTP and HTTPS servers and services.
• Node.js provides the http and https modules, and they provide the basic
framework to do most everything you need from an HTTP and HTTPS
standpoint.
• In fact, it is not difficult to implement a full webserver using just the http
module.
• We will likely use a different module, such as express, to implement a full-on
webserver.
• This is because the http module is pretty low level. It doesn’t provide calls to
handle routing, cookies, caching, and so on.
• You can create basic HTTP servers that provide an interface for communications
behind your firewall and then basic HTTP clients that interact with those
services.
Processing URLs
• The Uniform Resource Locator (URL) acts as an address label for the HTTP
server to handle requests from the client.
• It provides all the information needed to get the request to the connect server on
a specific port and access the proper data.
• The URL can be broken down into several different components, each providing
a basic piece of information for the webserver on how to route and handle the
HTTP request from the client.
• The URL object properties listed next show the basic structure of a URL and the components that may be
included.
Properties of the URL object

Property Description

href This is the full URL string that was originally parsed.
protocol The request protocol lowercased
host The full host portion of the URL including port information lowercased.
auth The authentication information portion of a URL.
hostname The hostname portion of the host lowercased.
port The port number portion of the host
pathname The path portion of the URL including the initial slash if present

search The query string portion of the URL including the leading question mark
path The full path including the pathname and search.
query This is either the parameter portion of the query string or a parsed object containing the query string
parameters and values if the parseQueryString is set to true
hash The hash portion of the URL including the pound sign (#).
Understanding the URL Object
• To use the URL information more effectively, Node.js provides the url module that
provides functionality to convert the URL string into a URL object.
• To create a URL object from the URL string, pass the URL string as the first
parameter to the following method:
• url.parse(urlStr, [parseQueryString], [slashesDenoteHost])
• The url.parse() method takes the URL string as the first parameter.
• The parseQueryString parameter is a Boolean that when true also parses the query
string portion of the URL into an object literal. The default is false.
• The slashesDenoteHost is also a Boolean that when true parses a URL with the format
of //host/path to {host: 'host', pathname: '/path'} instead of {pathname:
'//host/path'}. The default is false.

• url.format(urlObj): takes a URL object or string and returns a formatted URL
string derived from it.
Understanding the URL Object
//An example of parsing a URL string into an object and then converting it
//back into a string:
var url = require('url');
var urlStr = 'http://user:pass@host.com:80/resource/path?query=string#hash';
var urlObj = url.parse(urlStr);
urlString = url.format(urlObj)
console.log('urlObj='+urlObj)
console.log('urlString='+urlString)

Output:
urlObj=[object Object]
urlString=http://user:pass@host.com:80/resource/path?query=string#hash
Resolving the URL Components
• Another useful feature of the url module is the ability to resolve URL components in
the same manner as a browser would.
• This allows you to manipulate the URL strings on the server side to make
adjustments in the URL.
• For example, you might want to change the URL location before processing the
request because a resource has moved or changed parameters.
• To resolve a URL to a new location use the following syntax:
• url.resolve(from, to)
• The from parameter specifies the original base URL string.
• The to parameter specifies the new location where you want the URL to resolve.
Resolving the URL Components
//Resolving the URL Components
var url = require('url');
var originalUrl = 'http://user:pass@host.com:80/resource/path?query=string#hash';
var newResource = '/another/path?querynew';
console.log(url.resolve(originalUrl, newResource));

Output:

http://user:pass@host.com:80/another/path?querynew
Processing Query Strings and Form Parameters
• HTTP requests often include query strings in the URL or parameter data in the
body for form submissions.
• The query string can be obtained from the URL object defined in the previous
section.
• The parameter data sent by a form request can be read out of the body of the
client request.
• The query string and form parameters are just basic key-value pairs.
• To actually consume these values in your Node.js webserver you need to
convert the string into a JavaScript object using the parse() method from the
querystring module:
• querystring.parse(str, [sep], [eq], [options])
Processing Query Strings and Form Parameters
• querystring.parse(str, [sep], [eq], [options])
• The str parameter is the query or parameter string.
• The sep parameter allows you to specify the separator character used. The
default separator character is &.
• The eq parameter allows you to specify the assignment character to use when
parsing. The default is =.
• The options parameter is an object with the property maxKeys that allows you
to limit the number of keys the resulting object can contain. The default is
1000. If you specify 0, there is no limit.
• You can also go back the other direction and convert an object to a query
string using the stringify() function shown here:
• querystring.stringify(obj, [sep], [eq])
Processing Query Strings and Form Parameters
//an example of using parse() and stringify() to parse a query string:
var qstring = require('querystring');
var params = qstring.parse("name=Brad&color=red&color=blue");
console.log(params)
var str = qstring.stringify(params);
console.log(str)
Output:

{ name: 'Brad', color: [ 'red', 'blue' ] }

name=Brad&color=red&color=blue
Read the Query String
The function passed into the http.createServer() has a req argument that represents the
request from the client, as an object (http.IncomingMessage object).
This object has a property called "url" which holds the part of the url that comes after
the domain name:
//demo_url.js
var http = require('http');
http.createServer(function (req, res) {
res.writeHead(200, {'Content-Type': 'text/html'});
res.write(req.url);
res.end();
}).listen(8080);
Read the Query String
Initiate demo_url.js:

C:\Users\Your Name>node demo_url.js

http://localhost:8080/summer?year=2009&month=AUG
Will produce this result:
/summer?year=2009&month=AUG
http://localhost:8080/winter?year=2009&month=AUG
Will produce this result:
/winter?year=2009&month=AUG
Split the Query String
//write a Node JS program to read form data from query string and generate response
//using NodeJS.
var http = require('http');
var url = require('url');
http.createServer(function (req, res) {
res.writeHead(200, {'Content-Type': 'text/html'});
var q = url.parse(req.url, true).query;
var txt = q.year + " " + q.month;
res.end(txt);
}).listen(8080);
Output:
http://localhost:8080/?year=2017&month=July
Will produce this result:
2017 July
Understanding Request, Response, and Server Objects
• To use the http module in Node.js applications, you first need to understand
the request and response objects.
• They provide the information and much of the functionality that comes into
and out of the HTTP clients and servers.
• Once you see the makeup of these objects—including properties, events, and
methods they provide—it will be simple to implement your own HTTP servers
and clients.
• ClientRequest,
• ServerResponse,
• IncomingMessage,
• and Server objects
The http.ClientRequest Object
• The ClientRequest object is created internally when you call http.request() when
building the HTTP client.
• This object represents the request while it is in progress to the server. You use the
ClientRequest object to initiate, monitor, and handle the response from the server.
• The ClientRequest implements a Writable stream, so it provides all the functionality of a
Writable stream object.
• To implement a ClientRequest object, use the following syntax:
http.request(options, callback)
• The options parameter is an object whose properties define how to open and send the
client HTTP request to the server.
• The callback parameter is a callback function that is called after the request is sent to the
server and handles the response back from the server.
• The only parameter to the callback is an IncomingMessage object that will be the
response from the server.
The http.ClientRequest Object
• The ClientRequest object provides several events that enable you to handle the various states
the request may experience.
The http.ClientRequest Object
• In addition to events, the ClientRequest object also provides several methods that can be
used to write data to the request, abort the request, or end the request.
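A minimal sketch of creating a ClientRequest with http.request(), handling the response, listening for the 'error' event, and calling end() to send the request; the hostname, port, and path are illustrative:

var http = require('http');
var options = {
  hostname: 'localhost',
  port: 8080,
  path: '/',
  method: 'GET'
};
var req = http.request(options, function(res) {       // res is an IncomingMessage
  var body = '';
  res.on('data', function(chunk) { body += chunk; });
  res.on('end', function() { console.log('Response: ' + body); });
});
req.on('error', function(err) {                        // 'error' event on the ClientRequest
  console.log('Request failed: ' + err.message);
});
req.end();                                             // end() finalizes and sends the request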
The http.ServerResponse Object
• The ServerResponse object is created by the HTTP server internally when a
request event is received. It is passed to the request event handler as the second
argument.
• You use the ServerResponse object to formulate and send a response to the
client.
• The ServerResponse implements a Writable stream, so it provides all the
functionality of a Writable stream object.
• For example, you can use the write() method to write to it as well as pipe a
Readable stream into it to write data back to the client.
• When handling the client request, you use the properties, events, and methods
of the ServerResponse object to build and send headers, write data, and send
the response.
The http.ServerResponse Object
• Events available on ServerResponse objects
The http.ServerResponse Object (methods)
HTTP server
• To start the HTTP server, you need to first create a Server object using the
createServer() method shown below. This method returns the Server object.
• The optional requestListener parameter is a callback that is executed when the
request event is triggered.
• The callback should accept two parameters.
• The first is an IncomingMessage object representing the client request, and the
second is a ServerResponse object you use to formulate and send the response:
http.createServer([requestListener])
• Once you have created the Server object, you can begin listening on it by calling the
listen() method on the Server object:
• listen(port, [hostname], [backlog], [callback])
The http.ServerResponse Object(METHODS)-EXAMPLE

// code shows an example of starting an HTTP server and listening on port 8080.
var http = require('http');
http.createServer(function (req, res) {
res.writeHead(200, {'Content-Type': 'text/plain'});
res.write('Hello World!');
res.end();
}).listen(8080);
Implementing HTTP Clients and Servers in Node.js
• 1. Serving Static Files
• The most basic type of HTTP server is one that serves static files.
• To serve static files from Node.js, you need to first start the HTTP server and listen on a
port.
• Then in the request handler, you open the file locally using the fs module and write the file
contents to the response.
Steps:
1. Create a file ‘httpstaticserver.js’ in current directory
2. Make a directory with name ‘html’ in the current directory
3. Create a new file ‘1.txt’ and ‘hello.html’ in ‘html ’ directory
4. Run the file ‘httpstaticserver.js’ with node
5. open the web browser and request ‘ localhost:8080/1.txt’ or ‘ localhost:8080/hello.html’
content of 1.txt or hello.html will be on the web page…
Implementing HTTP Clients and Servers in Node.js
// Write a Node JS program to demonstrate accessing static files
// using http webserver and client.
// httpstaticfileserver.js
var fs = require('fs');
var http = require('http');
var url = require('url');
var ROOT_DIR = "html/";
http.createServer(function (req, res) {
  var urlObj = url.parse(req.url, true, false);
  fs.readFile(ROOT_DIR + urlObj.pathname, function (err, data) {
    if (err) {
      res.writeHead(404);
      res.end(JSON.stringify(err));
      return;
    }
    res.writeHead(200);
    res.end(data);
  });
}).listen(8080);
Implementing HTTP Clients and Servers in Node.js
// Basic web client retrieving static files
// httpstaticfileclient.js
var http = require('http');
var options = {
  hostname: 'localhost',
  port: '8080',
  path: '/hello.html'
};
function handleResponse(response) {
  var serverData = '';
  response.on('data', function (chunk) {
    serverData += chunk;
  });
  response.on('end', function () {
    console.log(serverData);
  });
}
http.request(options, function(response){
  handleResponse(response);
}).end();
Implementing HTTP Clients and Servers in Node.js
hello.html
<html>
<head>
<title>Static Example</title>
</head>
<body>
<h1>Hello from a Static File</h1>
</body>
</html>

Output:
Request: localhost:8080/hello.html
Output:
Hello from a Static File
Implementing HTTP Clients and Servers in Node.js
In modules/html/ 1.txt

1.txt
welcome to static file server......

Request: localhost:8080/1.txt
Output:
welcome to static file server......
In another terminal, run the file ‘httpstaticfileclient.js’
It will display the content of the file hello.html
Implementing HTTP Clients and Servers in Node.js
2.Implementing Dynamic GET Servers

• More often than not you will use Node.js webservers to serve dynamic content
rather than static content.
• This content may be dynamic HTML files or snippets, JSON data, or a number
of other data types.
• To serve a GET request dynamically, you need to implement code in the
request handler that dynamically populates the data you want to send back to
the client, writes it out to the response, and then calls end() to finalize the
response and flush the Writable stream.
Implementing HTTP Clients and Servers in Node.js
//Write a Node JS program to demonstrate accessing dynamic content through GET method using http webserver and
//client.(httpserverget.js)
var http = require('http');
var messages = [
'Hello World',
'From a basic Node.js server',
'Take Luck'];
http.createServer(function (req, res) {
res.setHeader("Content-Type", "text/html");
res.writeHead(200);
res.write('<html><head><title>Simple HTTP Server</title></head>');
res.write('<body>');
for (var idx in messages){
res.write('\n<h1>' + messages[idx] + '</h1>');
}
res.end('\n</body></html>');
}).listen(8080)
Implementing HTTP Clients and Servers in Node.js
http_client_get.js: Basic web client that makes a GET request to the server

var http = require('http');
var options = {
  hostname: 'localhost',
  port: '8080'
};
function handleResponse(response) {
  var serverData = '';
  response.on('data', function (chunk) {
    serverData += chunk;
  });
  response.on('end', function () {
    console.log("Response Status:", response.statusCode);
    console.log("Response Headers:", response.headers);
    console.log(serverData);
  });
}
http.request(options, function(response){
  handleResponse(response);
}).end();
Implementing HTTP Clients and Servers in Node.js
Output:
1. Run the ‘httpserverget.js’ file
2. In the browser: localhost:8080 gives the response in web page.

3.In another terminal, run the ‘http_client_get.js’ file,


It gives the response status, Headers and file content.
Handling Data I/O in node.js
• Most active web applications and services have a lot of data flowing through them.
• That data comes in the form of text, JSON strings, binary buffers, and data
streams.
• For that reason, Node.js has many mechanisms built in to support handling the data
I/O from system to system.
• It is important to understand the mechanisms that Node.js provides to implement
effective and efficient web applications and services.
we need to understand:
• manipulating JSON data,
• managing binary data buffers,
• Implementing readable and writable streams, and
• compressing and decompressing data.
Working with JSON
• One of the most common data types that you work with when implementing Node.js web
applications and services is JSON (JavaScript Object Notation).
• JSON is a lightweight method to convert JavaScript objects into a string form and then
back again.
• This provides an easy method when you need to serialize data objects when passing them
from client to server, process to process, stream to stream, or when storing them in a
database.
• There are several reasons to use JSON to serialize your JavaScript objects over XML
including the following:
 JSON is much more efficient and takes up fewer characters.
 Serializing/deserializing JSON is faster than XML because of its simpler syntax.
 JSON is easier to read from a developer’s perspective because it is similar to JavaScript
syntax.
Converting JSON to JavaScript Objects
• A JSON string represents the JavaScript object in string form. The string
syntax is similar to code, making it easy to understand.
• You can use the JSON.parse(string) method to convert a string that is
properly formatted with JSON into a JavaScript object.
• For example, the following code snippet defines accountStr as a formatted
JSON string and converts it to a JavaScript object using JSON.parse().
var accountStr = '{"name":"Jedi", "members":["Yoda","Obi Wan"], \
"number":34512, "location": "A galaxy far, far away"}’;
var accountObj = JSON.parse(accountStr);
• Then member properties can be accessed via dot notation.
Converting JSON to JavaScript Objects
//formatted JSON string and converts it to a JavaScript object using JSON.parse()
var accountStr = '{"name":"Jedi", "members":["Yoda","Obi Wan"], \
"number":34512, "location": "A galaxy far, far away"}';
var accountObj = JSON.parse(accountStr);
console.log(accountObj)
console.log(accountObj.name);
console.log(accountObj.members);
console.log(accountObj.number);
console.log(accountObj.location);
Output:
{
name: 'Jedi',
members: [ 'Yoda', 'Obi Wan' ],
number: 34512,
location: 'A galaxy far, far away'
}
Jedi
[ 'Yoda', 'Obi Wan' ]
34512
A galaxy far, far away
Converting JavaScript Objects to JSON String
Node also allows you to convert a JavaScript object into a properly
formatted JSON string.
Thus the string form can be stored in a file or database, sent across an HTTP
connection, or written to a stream/buffer.
Use the JSON.stringify(object) method to convert a JavaScript object into a
JSON string:
For example, the following code defines a JavaScript object that includes string,
numeric, and array properties.
Using JSON.stringify(), it is all converted to a JSON string.
Converting JavaScript Objects to JSON
//converting JavaScript object to JSON string JSON.stringify()
var accountObj = {
name: "Baggins",
number: 10645,
members: ["Frodo, Bilbo"],
location: "Shire"
};
var accountStr = JSON.stringify(accountObj);
console.log(accountStr);
Output:
{"name":"Baggins","number":10645,"members":["Frodo, Bilbo"],"location":"Shire"}
Reading JSON file
//READING JSON FILE
var http = require('http');
var fs = require('fs');
http.createServer(function (req, res) {
fs.readFile('CUSTOMER.json', 'utf-8', function(err, data) {
if(err)
throw err;
res.writeHead(200, {'Content-Type': 'application/json'});
res.write(data);
//console.log(data);
return res.end();
});
}).listen(8080);
Reading JSON file
//converting JSON FILE to OBJECT AND STRING
var http = require('http');
var fs = require('fs');
fs.readFile('CUSTOMER.json', 'utf-8', function(err,data) {
if(err)
throw err;

var custObj = JSON.parse(data);


console.log(custObj);
var string1=JSON.stringify(custObj);
console.log(string1)
});
Reading JSON file(CUSTOMER.json from mockaroo)
[{"id":1,"first_name":"Tybi","last_name":"O'Kane","email":"tokane0@goo
gle.ru","gender":"Female","ip_address":"102.121.156.208"},
{"id":2,"first_name":"Martha","last_name":"MacHoste","email":"mmachost
e1@chron.com","gender":"Female","ip_address":"220.237.49.213"},
{"id":3,"first_name":"Kent","last_name":"Carsey","email":"kcarsey2@net
scape.com","gender":"Male","ip_address":"236.185.136.75"},
{"id":4,"first_name":"Shell","last_name":"Le
Marquand","email":"slemarquand3@diigo.com","gender":"Male","ip_address
":"134.149.174.248"},
{"id":5,"first_name":"Susie","last_name":"Furmedge","email":"sfurmedge
4@stumbleupon.com","gender":"Female","ip_address":"52.156.20.21"}]
Using the Buffer Module to Buffer
Data
While JavaScript is Unicode friendly, it is not good at managing
binary data.
However, binary data is useful when implementing some web
applications and services.
For example:
■ Transferring compressed files
■ Generating dynamic images
■ Sending serialized binary data
Understanding Buffered Data
• Buffered data is made up of a series of octets in big endian or little endian
format. That means they take up considerably less space than textual data.
• Therefore, Node.js provides the Buffer module that gives you the
functionality to create, read, write, and manipulate binary data in a
buffer structure.
• The Buffer module is global, so you do not need to use the require()
statement to access it.
• Buffered data is stored in a structure similar to that of an array but is
stored outside the normal V8 heap in raw memory allocations. Therefore a
Buffer cannot be resized.
• When converting buffers to and from strings, you need to specify the
explicit encoding method to be used.
Understanding Buffered Data
• Big Endian and Little Endian:
• Binary data in buffers is stored as a series of octets or a sequence of eight
0s and 1s that can be a hexadecimal value of 0x00 to 0xFF (from 00000000
to 11111111.).
• It can be read as a single byte or as a word containing multiple bytes.
• Endian defines the ordering of significant bits when defining the word.
• Big endian stores the most significant byte first, and little endian stores
the least significant byte first.
• For example, the words 0x0A 0x0B 0x0C 0x0D would be stored in the
buffer as [0x0A, 0x0B, 0x0C, 0x0D] in big endian but as [0x0D, 0x0C,
0x0B, 0x0A] in little endian.
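As a small sketch of the example above, the Buffer methods writeUInt32BE() and writeUInt32LE() show how the same 32-bit word lands in a buffer in opposite byte orders (Buffer.alloc() is used here; new Buffer(4) behaves the same way in older Node.js versions):
// Writing the word 0x0A0B0C0D in both byte orders
var buf = Buffer.alloc(4);              // 4-byte buffer

buf.writeUInt32BE(0x0A0B0C0D, 0);       // big endian: most significant byte first
console.log(buf);                       // <Buffer 0a 0b 0c 0d>

buf.writeUInt32LE(0x0A0B0C0D, 0);       // little endian: least significant byte first
console.log(buf);                       // <Buffer 0d 0c 0b 0a>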
Understanding Buffered Data
Table 5.1 Methods of encoding between strings and binary buffers

Method   | Description
---------|-------------------------------------------------------------
utf8     | Multi-byte encoded Unicode characters, used as the standard in most documents and webpages.
utf16le  | Little endian encoded Unicode characters of 2 or 4 bytes.
ucs2     | Same as utf16le.
base64   | Base64 string encoding.
hex      | Encodes each byte as two hexadecimal characters.
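The following sketch illustrates a few of these encodings by converting the same string to and from a buffer (Buffer.from() is the newer equivalent of new Buffer(string, encoding)):
// Converting between a string and a buffer with explicit encodings
var buf = Buffer.from("Some UTF8 Text \u00b6", 'utf8');

console.log(buf.toString('utf8'));      // back to the original text
console.log(buf.toString('hex'));       // each byte as two hex characters
console.log(buf.toString('base64'));    // base64 string encoding

// Decoding uses the same encoding name
var fromHex = Buffer.from(buf.toString('hex'), 'hex');
console.log(fromHex.toString('utf8'));  // Some UTF8 Text ¶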
Creating Buffers
Buffer objects are actually raw memory allocations; therefore, their
size must be determined when they are created.
The three methods for creating Buffer objects using the new keyword
are
new Buffer(sizeInBytes)
new Buffer(octetArray)
new Buffer(string, [encoding])
For example, the following lines of code define buffers using a byte
size, octet buffer, and a UTF8 string:
var buf256 = new Buffer(256);
var bufOctets = new Buffer([0x6f, 0x63, 0x74, 0x65, 0x74, 0x73]);
var bufUTF8 = new Buffer("Some UTF8 Text \u00b6 \u30c6 \u20ac",
'utf8');
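Note that recent Node.js versions deprecate the new Buffer() constructor in favour of the Buffer.alloc() and Buffer.from() factory methods; as a sketch, the same three buffers can be created like this:
var buf256    = Buffer.alloc(256);                                          // zero-filled, 256 bytes
var bufOctets = Buffer.from([0x6f, 0x63, 0x74, 0x65, 0x74, 0x73]);          // from an octet array
var bufUTF8   = Buffer.from("Some UTF8 Text \u00b6 \u30c6 \u20ac", 'utf8'); // from a UTF8 string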
Writing to Buffers
You cannot extend the size of a Buffer object after it has been
created, but you can write data to any location in the buffer.
buffer_write.js: Various ways to write to a Buffer
object
buf256 = new Buffer(256);           // allocate a 256-byte buffer
buf256.fill(0);                     // fill the whole buffer with zeros
buf256.write("add some text");      // write at the beginning of the buffer
console.log(buf256.toString());
buf256.write("more text", 9, 9);    // write 9 bytes starting at offset 9
console.log(buf256.toString());
buf256[18] = 43;                    // set byte 18 directly (43 is the '+' character)
console.log(buf256.toString());
Output:
C:\node buffer_write.js
add some text
add some more text
add some more text+
Reading from Buffers
There are several methods for reading from buffers. The simplest is to use the toString()
method to convert all or part of a buffer to a string. However, you can also access specific
indexes in the buffer directly or by using read().
buffer_read.js: Various ways to read from a Buffer object
bufUTF8 = new Buffer("Some UTF8 Text \u00b6 \u30c6 \u20ac", 'utf8');
console.log(bufUTF8.toString());              // the whole buffer as a string
console.log(bufUTF8.toString('utf8', 5, 9));  // only bytes 5 to 9

var StringDecoder = require('string_decoder').StringDecoder;
var decoder = new StringDecoder('utf8');
console.log(decoder.write(bufUTF8));          // decode using a StringDecoder

Output buffer_read.js: Reading data from a Buffer object

Some UTF8 Text ¶ テ €
UTF8
Some UTF8 Text ¶ テ €
Determining Buffer Length
A common task when dealing with buffers is determining the
length, especially when you create a buffer dynamically from a
string.
The length of a buffer can be determined by calling .length on
the Buffer object.
"UTF8 text \u00b6".length;
//evaluates to 11
Buffer.byteLength("UTF8 text \u00b6", 'utf8');
//evaluates to 12
Buffer("UTF8 text \u00b6").length;
//evaluates to 12
Copying Buffers
An important part of working with buffers is the ability to copy
data from one buffer into another buffer.
Node.js provides the copy() method on Buffer objects:
copy(targetBuffer, [targetStart], [sourceStart], [sourceEnd])
The targetBuffer parameter is another Buffer object,
and targetStart, sourceStart, and sourceEnd are indexes inside
the source and target buffers.
You can also copy data from one buffer to the other by indexing
them directly, for example:
destinationBuffer[index] = sourceBuffer[index]
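The following sketch (with data chosen here purely for illustration) shows both ways of copying:
// Copy the first five bytes of alphabet into blank, starting at index 3
var alphabet = Buffer.from("abcdefghijklmnopqrstuvwxyz");
var blank = Buffer.alloc(26, 0x2e);        // 26 bytes filled with '.'

alphabet.copy(blank, 3, 0, 5);             // copy(targetBuffer, targetStart, sourceStart, sourceEnd)
console.log(blank.toString());             // prints "...abcde" followed by dots

// Copying a single byte by direct indexing
blank[0] = alphabet[25];                   // copies the byte for 'z'
console.log(blank.toString());             // prints "z..abcde" followed by dots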
Slicing Buffers
Another important aspect of working with buffers is the ability
to divide them into slices.
A slice is a section of a buffer between a starting index and an
ending index.
Slicing a buffer allows you to manipulate a specific chunk.
Slices are created using the slice([start], [end]) method,
which returns a Buffer object that points to start index of the
original buffer and has a length of end – start.
Keep in mind that a slice is different from a copy. If you edit a
copy, the original does not change.
However, if you edit a slice, the original does change.
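A short sketch of the difference between a slice and a copy (the variable names are illustrative):
// A slice shares memory with the original buffer
var buf = Buffer.from("this is a test buffer");

var slice = buf.slice(10, 14);             // points at the bytes for "test"
console.log(slice.toString());             // test

slice.write("TEST");                       // editing the slice...
console.log(buf.toString());               // this is a TEST buffer (...changes the original)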
Concatenating Buffer objects
• You can also concatenate two or more Buffer objects
together to form a new buffer.
• The concat(list, [totalLength]) method accepts an array of
Buffer objects as the first parameter, and totalLength defines
the maximum bytes in the buffer as an optional second
argument.
• The Buffer objects are concatenated in the order they appear
in the list, and a new Buffer object is returned containing the
contents of the original buffers up to totalLength bytes.
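For example, a minimal sketch of Buffer.concat():
var greet = Buffer.from("Hello ");
var name  = Buffer.from("world");

var joined = Buffer.concat([greet, name]);
console.log(joined.toString());            // Hello world
console.log(joined.length);                // 11

// With totalLength, the result is limited to that many bytes
var shorter = Buffer.concat([greet, name], 8);
console.log(shorter.toString());           // Hello wo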
Using the Stream Module to Stream
Data
• An important module in Node.js is the stream module.
• Data streams are memory structures that are readable, writable,
or both.
• Streams are used all over in Node.js, for example, when
accessing files or reading data from HTTP requests and in several
other areas.
• Use stream module to create streams as well as read and write
data from them.
• The purpose of streams is to provide a common mechanism to
transfer data from one location to another.
• They also expose events, such as when data is available to be
read, when an error occurs, and so on.
• You can then register listeners to handle the data when it
becomes available in a stream or is ready to be written to.
Advantages
• Streams basically provide two major advantages compared
to other data handling methods:
• Memory efficiency: you don’t need to load large amounts of
data in memory before you are able to process it.
• Time efficiency: you can start processing data as soon as the
first chunk arrives, rather than waiting until the entire payload
has been transmitted.
What is a stream?
• Streams are data collected from a source and brought to another location in a
sequence.
• Streaming a video online is an example: while the video content is passed to
you in a sequence, the full content is not available yet.
• Streams are divided into four categories:
• Writable,
• Readable,
• Duplex,
• and Transform.
What is a stream?
• Readable streams read data from a file or source and pass it to the main
application. A buffer then stores the data in case there is a delay passing the data
to the application.
• Writable streams work in the opposite direction: data flows from the
application out to a file or other destination. Again, a buffer holds the
data if the destination cannot accept it quickly enough.
• Duplex streams, on the other hand, are a mixture of both the readable and
writable streams where both streams are independent of each other.
• Transform streams are also duplex, but their readable and writable sides
are connected. The application writes data to the stream, the data is
manipulated (transformed), and only then does it become available on the
readable side.
Using the Stream Module to Stream Data
• There are four fundamental stream types in Node.js: Readable, Writable,
Duplex, and Transform streams.
• A readable stream is an abstraction for a source from which data can be
consumed. An example of that is the fs.createReadStream method.
• A writable stream is an abstraction for a destination to which data can be
written. An example of that is the fs.createWriteStream method.
• A duplex stream is both Readable and Writable. An example of that is a TCP
socket.
• A transform stream is basically a duplex stream that can be used to modify or
transform the data as it is written and read. An example of that is the
zlib.createGzip stream to compress the data using gzip.
• All streams are instances of EventEmitter. They emit events that can be used to
read and write data. However, we can also consume stream data in a simpler way
using the pipe() method, as sketched below.
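As a sketch of the pipe() method mentioned above, the following chains a Readable file stream through the zlib.createGzip() Transform stream into a Writable stream (the file names are placeholders):
var fs = require('fs');
var zlib = require('zlib');

fs.createReadStream('file1.txt')               // Readable stream
  .pipe(zlib.createGzip())                     // Transform stream that compresses the data
  .pipe(fs.createWriteStream('file1.txt.gz'))  // Writable stream
  .on('finish', function() {
    console.log('file1.txt compressed to file1.txt.gz');
  });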
There are 4 types of streams in Node.js (examples):
• For example, in a Node.js based HTTP server, request is a readable stream
and response is a writable stream.
• You might have used the fs module, which lets you work with both readable
and writable file streams.
• Whenever you’re using Express you are using streams to interact with the
client,
• also, streams are being used in every database connection driver that you
can work with, because of TCP sockets, TLS stack and other connections are
all based on Node.js streams.
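For example, a minimal sketch of streaming a file to the client in an HTTP server (reusing file1.html and port 8080 from the earlier examples), where req is a Readable stream and res is a Writable stream:
var http = require('http');
var fs = require('fs');

http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/html'});
  // Pipe the file straight into the response instead of reading it into memory first
  fs.createReadStream('file1.html').pipe(res);
}).listen(8080);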
Readable Streams
• Readable streams provide a mechanism to easily read data
coming into your application from another source.
Some common examples of readable streams are:
■ HTTP responses on the client
■ HTTP requests on the server
■ fs read streams
■ zlib streams
■ crypto streams
■ TCP sockets
■ Child processes stdout and stderr
■ process.stdin
Readable Streams
Readable streams provide the read([size]) method to read data
where size specifies the number of bytes to read from the stream.
read() can return a String, Buffer or null.
Readable streams also expose the following events:
■ readable: Emitted when a chunk of data can be read from the
stream.
■ data: Similar to readable except that when data event handlers are
attached, the
stream is turned into flowing mode, and the data handler is
called continuously until all data has been drained.
■ end: Emitted by the stream when data will no longer be provided.
■ close: Emitted when the underlying resource, such as a file, has
been closed.
■ error: Emitted when an error occurs receiving data.
Readable Streams
Readable streams methods:
stream_read.js: Implementing a Readable stream object
// read stream
var fs = require("fs");
var data = '';
var readerStream = fs.createReadStream('file1.txt'); //Create a readable stream
readerStream.setEncoding('UTF8'); // Set the encoding to be utf8.
// Handle stream events --> data, end, and error
readerStream.on('data', function(chunk) {
data += chunk;
});
readerStream.on('end',function() {
console.log(data);
});
readerStream.on('error', function(err) {
console.log(err.stack);
});
console.log("Program Ended");
stream_read.js: Implementing a Readable stream object
file1.txt
hello world how are you
thank you
bye

Output:
Program Ended
hello world how are you
thank you
bye
Writable Streams
Writable streams are designed to provide a mechanism to write data
into a form that can easily be consumed in another area of code. Some
common examples of Writable streams are:
■ HTTP requests on the client
■ HTTP responses on the server
■ fs write streams
■ zlib streams
■ crypto streams
■ TCP sockets
■ Child process stdin
■ process.stdout, process.stderr
Writable Streams
Writable streams provide the write(chunk, [encoding], [callback])
method to write data into the stream, where chunk contains the data to
write, encoding specifies the string encoding if necessary, and callback
specifies a callback function to execute when the data has been fully
flushed.
The write() function returns true if the data was written successfully.
Writable streams also expose the following events:
■ drain: After a write() call returns false, the drain event is emitted to
notify listeners when it is okay to begin writing more data.
■ finish: Emitted when end() is called on the Writable object; all data is
flushed and no more data will be accepted.
■ pipe: Emitted when the pipe() method is called on a Readable stream to
add this Writable as a destination.
■ unpipe: Emitted when the unpipe() method is called on a Readable
stream to remove this Writable as a destination.
Writable Streams
Writable stream methods:
Writable Streams
//write stream
var fs = require('fs');
var readableStream = fs.createReadStream('file1.txt');   // source
var writableStream = fs.createWriteStream('file2.txt');  // destination
readableStream.setEncoding('utf8');
// Write each chunk read from file1.txt into file2.txt
readableStream.on('data', function(chunk) {
writableStream.write(chunk);
});
console.log("data copied....from file1 to file2");
Writable Streams
Output:
File1.txt
hello world
how are you

File2.txt
hello world
how are you
Duplex Streams
• Duplex Streams implement both the Readable and Writable
interfaces.
• A prime example of Duplex Streams is a Socket.
• Sockets provide two passages for sending and receiving
information.
• Other examples of Duplex Streams are TCP sockets, zlib
streams, and crypto streams.
• When creating a Duplex stream (see the sketch after this list):
a) start off with the built-in PassThrough stream, a standard Duplex
implementation that acts as a connecting tunnel between a Readable
stream and a Writable stream.
b) the next step is reading a file into the tunnel and piping it on into a
Writable stream created with fs.createWriteStream().
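A minimal sketch of the approach described above, using the built-in PassThrough stream as a tunnel between a read stream and a write stream (the file names reuse the earlier examples):
var fs = require('fs');
var stream = require('stream');

var tunnel = new stream.PassThrough();             // standard Duplex implementation
var reader = fs.createReadStream('file1.txt');
var writer = fs.createWriteStream('file2.txt');

// Data flows: file1.txt -> PassThrough -> file2.txt
reader.pipe(tunnel).pipe(writer);

tunnel.on('data', function(chunk) {
  console.log('passing through: ' + chunk.toString());
});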
Transform Streams
Another type of stream is the Transform stream.
• A Transform stream extends the Duplex stream but
modifies the data between the Writable stream and the
Readable stream.
• This can be useful when you need to modify data from
one system to another.
• Some examples of Transform streams are
■ zlib streams
■ crypto streams
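A minimal sketch of a custom Transform stream that upper-cases data as it passes from the writable side to the readable side:
var stream = require('stream');

var upperCase = new stream.Transform({
  transform: function(chunk, encoding, callback) {
    // Modify the data between the Writable side and the Readable side
    callback(null, chunk.toString().toUpperCase());
  }
});

// Everything typed on stdin is emitted upper-cased on stdout
process.stdin.pipe(upperCase).pipe(process.stdout);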