reading-large-files-with-node-streams

Commits

List of commits on branch main:

  • feat: show total of inserts (44947d8003883556f1603791eae32a8c0156e66a, thenriquedb, a month ago)
  • fix: use mjs extension instead js (076d2df2b0b1408d24a85c294bc11ddb12d677f9, thenriquedb, a month ago)
  • docs: README.md (8ce3bd14821f5ad1d4a21dbf92bfb47122be8e47, thenriquedb, a month ago)
  • refactor: move database to separate file (10dee625e802a493ba38420ba212af2c5a59757a, thenriquedb, a month ago)
  • feat: first commit (6d5c1d7397b414a13c2847cafb9040faf0c70cd0, thenriquedb, a month ago)

README

CSV Read/Write with Node.js Streams and SQLite

This is a proof of concept (POC) demonstrating how to read and write large CSV files efficiently using Node.js streams. As the CSV is read, each row is also inserted into an SQLite database, making the approach suitable for handling large datasets without consuming excessive memory.
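
As a rough sketch of the idea, the snippet below streams a CSV line by line and inserts each row into SQLite as it is read. The file name, the users(id, name) column layout, and the better-sqlite3 driver are illustrative assumptions, not necessarily what this repository actually uses.

```js
// ingest.mjs: a minimal sketch, not the repository's actual implementation.
// Assumptions: data.csv has an "id,name" header and better-sqlite3 is installed.
import { createReadStream } from 'node:fs';
import { createInterface } from 'node:readline';
import Database from 'better-sqlite3';

const db = new Database('app.db');
db.exec('CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, name TEXT)');
const insert = db.prepare('INSERT INTO users (id, name) VALUES (?, ?)');

// readline over a read stream yields one line at a time,
// so the whole file is never buffered in memory
const lines = createInterface({
  input: createReadStream('data.csv', { encoding: 'utf8' }),
  crlfDelay: Infinity,
});

let total = 0;
let isHeader = true;
for await (const line of lines) {
  if (isHeader) { isHeader = false; continue; } // skip the header row
  const [id, name] = line.split(',');
  insert.run(Number(id), name); // insert each row as it is read
  total += 1;
}

console.log(`Inserted ${total} rows`);
db.close();
```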

Features

  • Efficiently read large CSV files using Node.js streams.
  • Write data to new CSV files with minimal memory usage (see the sketch after this list).
  • Insert CSV data into an SQLite database as it is read, line by line.
  • Suitable for processing large datasets without loading the entire file into memory.
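
For the writing side, a new CSV can be produced through a write stream while respecting backpressure, so memory stays bounded even for very large outputs. The sketch below simply copies rows from one file to another; the file names are assumptions, and any per-row transformation would go inside the loop.

```js
// copy-csv.mjs: a minimal sketch of writing a CSV with bounded memory.
// Assumptions: data.csv exists and data-copy.csv is the desired output name.
import { createReadStream, createWriteStream } from 'node:fs';
import { createInterface } from 'node:readline';
import { once } from 'node:events';

const input = createInterface({
  input: createReadStream('data.csv', { encoding: 'utf8' }),
  crlfDelay: Infinity,
});
const output = createWriteStream('data-copy.csv');

for await (const line of input) {
  // a per-row transformation could be applied to `line` here
  if (!output.write(line + '\n')) {
    await once(output, 'drain'); // wait for the buffer to flush before writing more
  }
}
output.end();
```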

Install

  1. Install dependencies:
yarn
  2. Generate the CSV file (a sketch of such a seed script is shown after this list):
# If the number of records is not passed, the default value will be 50
yarn seed <number of records>
  3. Run the application:
yarn migrate
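
The seed step suggests the input CSV is generated up front. One way to produce an arbitrarily large file without holding it in memory is to pipe a lazy generator into a write stream, as sketched below. The record shape (id,name) and the output file name are illustrative assumptions, not the repository's actual seed script.

```js
// seed.mjs: a minimal sketch of generating a large CSV with streams.
// Assumptions: output file is data.csv and rows have an "id,name" shape.
import { createWriteStream } from 'node:fs';
import { Readable } from 'node:stream';
import { pipeline } from 'node:stream/promises';

const total = Number(process.argv[2] ?? 50); // default to 50 records, as in the README

// a generator yields one row at a time, so the full file never lives in memory
function* rows() {
  yield 'id,name\n';
  for (let i = 1; i <= total; i += 1) {
    yield `${i},user-${i}\n`;
  }
}

// pipeline handles backpressure between the readable source and the file
await pipeline(Readable.from(rows()), createWriteStream('data.csv'));

console.log(`Wrote ${total} records to data.csv`);
```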