Introduction
The term CSV is an abbreviation that stands for comma-separated values.
A CSV file is a plain text file containing data in the CSV format: each line represents a record, and the fields within a record are separated from one another by commas.
It's very convenient to store tabular data in CSV:
Name,Surname,Age,Gender
John,Snow,26,M
Clair,White,33,F
Fancy,Brown,78,F
Here, the first row represents the titles of the columns/fields of our CSV records and then there are 3 records that represent certain people. As you can see, the values are delimited by commas and each record starts on a new row.
Hey, but what if we want to include commas or line breaks in some of the fields stored in the CSV format?
There are several approaches to solving this issue; for example, we could wrap such values in double quotes. Some CSV implementations don't support this feature by design, though.
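For instance, a field containing a comma can be wrapped in double quotes, and the record is still parsed as four fields (the extended surname here is made up for illustration):

```csv
Name,Surname,Age,Gender
John,"Snow, the Bastard of Winterfell",26,M
```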
CSV Standardization
One of the most commonly used CSV standards is described in RFC 4180.
According to it, the CSV format is described by these 7 rules:
- Each record is located on a separate line, delimited by a line break (CRLF).
- The last record in the file may or may not have an ending line break.
- There may be an optional header line appearing as the first line of the file with the same format as normal record lines. This header will contain names corresponding to the fields in the file and should contain the same number of fields as the records in the rest of the file (the presence or absence of the header line should be indicated via the optional "header" parameter of this MIME type).
- Within the header and each record, there may be one or more fields, separated by commas. Each line should contain the same number of fields throughout the file. Spaces are considered part of a field and should not be ignored. The last field in the record must not be followed by a comma.
- Each field may or may not be enclosed in double quotes (however some programs, such as Microsoft Excel, do not use double quotes at all). If fields are not enclosed with double quotes, then double quotes may not appear inside the fields.
- Fields containing line breaks (CRLF), double quotes, and commas should be enclosed in double-quotes.
- If double-quotes are used to enclose fields, then a double-quote appearing inside a field must be escaped by preceding it with another double quote.
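The last three rules, which cover quoting and escaping, can be sketched in a few lines of JavaScript. This is an illustrative helper (the function name escapeCsvField is my own), not a complete CSV writer:

```javascript
// Wrap a field in double quotes when it contains a comma, a double
// quote, or a line break, doubling any inner double quotes (RFC 4180).
function escapeCsvField(field) {
  const text = String(field)
  if (/[",\r\n]/.test(text)) {
    return '"' + text.replace(/"/g, '""') + '"'
  }
  return text
}

console.log(escapeCsvField('plain'))    // plain
console.log(escapeCsvField('a,b'))      // "a,b"
console.log(escapeCsvField('say "hi"')) // "say ""hi"""
```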
If you're interested in reading more, with multiple examples, you can study the original RFC 4180 document, linked above.
Reading CSV Files in Node.js
Many different npm modules let you read from a CSV file. Most of them are based on streams, like csv-parser or node-csv. Those are great for dealing with CSV in a production system.
I like to keep things simple when performance is not a concern, for example for a one-time parsing of a CSV file that I had to do to consolidate my backend systems.
To do so, I used neat-csv, a package that exposes the csv-parser functionality through a simple async/await interface.
Install it using npm install neat-csv and require it in your app:
const neatCsv = require('neat-csv');
then load the CSV from the filesystem and invoke neatCsv, passing the content of the file:
const fs = require('fs')

fs.readFile('./file.csv', async (err, data) => {
  if (err) {
    console.error(err)
    return
  }
  console.log(await neatCsv(data))
})
Now you can start doing whatever you need to do with the data, which is formatted as a JavaScript array of objects.
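To illustrate that shape without the dependency, here is a naive split-based sketch. It handles only unquoted fields (real parsers like csv-parser also handle quoting), and the function name csvToObjects is my own:

```javascript
// Naive CSV-to-objects sketch: splits on commas and newlines only,
// so it does NOT handle quoted fields -- illustration only.
function csvToObjects(text) {
  const [headerLine, ...lines] = text.trim().split(/\r?\n/)
  const headers = headerLine.split(',')
  return lines.map(line => {
    const values = line.split(',')
    return Object.fromEntries(headers.map((h, i) => [h, values[i]]))
  })
}

const rows = csvToObjects('Name,Surname,Age,Gender\nJohn,Snow,26,M')
console.log(rows)
// [ { Name: 'John', Surname: 'Snow', Age: '26', Gender: 'M' } ]
```

Note that every value comes back as a string; converting, say, Age to a number is up to you.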
Writing CSV Files in Node.js
A great library you can use to quickly write an array of objects to a CSV file using Node.js is objects-to-csv.
Many other libraries exist, of course. I found this useful for a project of mine where I had to generate a one-time CSV file, so I wrote this little tutorial.
Using a stream-based library like fast-csv might suit your needs in more performance-oriented applications.
Install it using:
npm install objects-to-csv
then require it in your Node.js code:
const ObjectsToCsv = require('objects-to-csv')
When you have an array of objects ready to write to CSV, initialize a new ObjectsToCsv object instance:
const csv = new ObjectsToCsv(list)
then call csv.toDisk(), passing the file you want to write to (relative to your app base path):
await csv.toDisk('./list.csv')
This is a promise-based API and I used await, so you need to call this inside an async function.
The column names in the CSV are automatically inferred from the object property names.
Note that this command overwrites the existing content of the file. To append to that file, pass a second object with the append property set to true:
await csv.toDisk('./list.csv', { append: true })
Conclusion
Reading and writing CSV files with Node.js is a common development task, as the CSV format is commonly used to store structured tabular data. Many npm modules provide this functionality, so you should choose the one that best suits your needs and has ongoing support.