
sunday-driver

process a large file, at a steady cruise

🕶️

- slow is smooth, smooth is fast -

sunday-driver works through a large file at a responsible pace - it pauses at given points to let you consider the data, and resumes working once that's done.

this allows processing a large file, in sizable chunks, without any race conditions or memory leaks.

(heavily) inspired by line-by-line, by Markus Ostertag 🙏

npm i sunday-driver
const sundayDriver = require('sunday-driver')

let options = {
	file: './my/large/file.tsv',
	splitter: '\n',
	start: '80%', //as percentages, or in bytes
	end: '100%',
	//do your thing, for each segment
	each: (chunk, resume) => {
		console.log(chunk) //do your thing..
		resume()
	},
	//log progress-based events
	atPercent: {
		50: (status) => {
			console.log('50%!')
		},
		75: () => {
			console.log('75%!')
		},
	},
	//log time-based events
	atInterval: {
		'1min': (status) => {
			console.log('1 minute')
		},
		'2mins': () => {
			console.log('2 minutes')
		},
	}
}

sundayDriver(options).then((status) => {
	console.log('done!')
})

any of these event/interval callbacks will provide you with the details of the current reader's status:

/*{
	chunksDone: 10,     // how many times we've called the 'each' function
	bytesDone: 20480,   // how many bytes we've processed so far
	filesize: 61440,    // size of the whole file
	position: 34.42,    // where, in percentage, we are in the file (if we didn't start at the top!)
	progress: 68.84     // how far, in percentage, we are to being complete
}*/
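
for example, you could use these fields inside an interval callback to log a rough sense of what's left - a small sketch, assuming the fields above and a run over the whole file (the arithmetic is just illustrative):

const sundayDriver = require('sunday-driver')

sundayDriver({
	file: './my/large/file.tsv',
	splitter: '\n',
	each: (chunk, resume) => resume(),
	atInterval: {
		'1min': (status) => {
			//rough count of bytes still to go, using the fields shown above
			let bytesLeft = status.filesize - status.bytesDone
			console.log(status.progress + '% done, ~' + bytesLeft + ' bytes left')
		},
	},
}).then(() => {
	console.log('done!')
})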

it was built to support unleashing multiple workers on the same file, and letting them run safely and responsibly, without blowing any fuses.
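
for instance, you could hand each worker its own non-overlapping slice of the file with the start/end options shown above - a minimal sketch, with two workers in the same process:

const sundayDriver = require('sunday-driver')

//worker #1 takes the first half of the file
let firstHalf = sundayDriver({
	file: './my/large/file.tsv',
	splitter: '\n',
	start: '0%',
	end: '50%',
	each: (chunk, resume) => {
		//worker #1 does its thing..
		resume()
	},
})

//worker #2 takes the second half
let secondHalf = sundayDriver({
	file: './my/large/file.tsv',
	splitter: '\n',
	start: '50%',
	end: '100%',
	each: (chunk, resume) => {
		//worker #2 does its thing..
		resume()
	},
})

Promise.all([firstHalf, secondHalf]).then(() => {
	console.log('both halves done!')
})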

MIT
