New sort-csv command #1295
Since --low-memory-mode cannot sort or remove duplicate entries, it would be nice to have a command that can do this in post-processing.

sort-csv: sort and remove duplicate detections

@hitenkoku Since you did the -X, --remove-duplicate-detections option, could I ask you to do this one? We need to keep the CSV header at the top, sort by timestamp, and then remove duplicate entries, but ignore the EvtxFile column if there is one, because sometimes different .evtx files will have the same records (backup files, etc...). I think this is the same logic as -X, --remove-duplicate-detections, is that correct?

Options:
    -f, --file <FILE>            Input file
    -d, --directory <DIRECTORY>  Input directory
    -o, --output <FILE>          Output file
    -C, --clobber                Overwrite files when saving

Comments

@hitenkoku In order to sort without using a lot of memory, it might be good to import the CSV data into a temporary sqlite database, sort the sqlite database, and then export the results back out to CSV. What do you think?

So that an investigator can collect many CSV files from endpoints with Velociraptor, we should also support directory input.
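One suggestion in the discussion is to sort without holding everything in memory by importing the CSV into a temporary SQLite database and letting SQLite do the sorting. A minimal Python sketch of that idea, not Hayabusa's actual implementation; the `Timestamp` column name and the helper function are assumptions:

```python
# Hypothetical sketch: stream CSV rows into an on-disk SQLite table,
# let SQLite sort them, then stream the sorted rows back out to CSV.
import csv
import sqlite3
import tempfile


def sort_csv_via_sqlite(in_path: str, out_path: str, sort_column: str) -> None:
    # The temporary file keeps memory usage low; SQLite spills to disk.
    with tempfile.NamedTemporaryFile(suffix=".db") as tmp:
        con = sqlite3.connect(tmp.name)
        with open(in_path, newline="", encoding="utf-8") as fin:
            reader = csv.reader(fin)
            header = next(reader)
            cols = ", ".join(f'"{c}"' for c in header)
            placeholders = ", ".join("?" for _ in header)
            con.execute(f"CREATE TABLE rows ({cols})")
            # executemany consumes the reader lazily, row by row.
            con.executemany(f"INSERT INTO rows VALUES ({placeholders})", reader)
            con.commit()
        with open(out_path, "w", newline="", encoding="utf-8") as fout:
            writer = csv.writer(fout)
            writer.writerow(header)  # keep the CSV header at the top
            for row in con.execute(f'SELECT * FROM rows ORDER BY "{sort_column}"'):
                writer.writerow(row)
        con.close()
```

In practice the sort key would be the timestamp column, and directory input would loop this over every CSV found (or insert them all into one table before sorting).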
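The sort-and-deduplicate rule requested in the issue (keep the header, sort by timestamp, then drop rows that are identical once the EvtxFile column is ignored) could be sketched as follows. This is a hypothetical illustration, not the -X implementation; the column names and helper function are assumptions:

```python
# Hypothetical sketch: sort rows by timestamp, then remove duplicates
# while ignoring the EvtxFile column (different .evtx files, e.g.
# backups, can contain the same records).
import csv


def dedup_rows(header, rows, timestamp_col="Timestamp", ignore_col="EvtxFile"):
    ts_idx = header.index(timestamp_col)
    # The EvtxFile column may not exist; ignore it only when present.
    ignore_idx = header.index(ignore_col) if ignore_col in header else None
    rows = sorted(rows, key=lambda r: r[ts_idx])  # sort by timestamp first
    seen = set()
    result = []
    for row in rows:
        # Comparison key excludes the EvtxFile column, if any.
        key = tuple(v for i, v in enumerate(row) if i != ignore_idx)
        if key not in seen:
            seen.add(key)
            result.append(row)
    return result
```

Because the comparison key omits EvtxFile, two records that differ only in their source .evtx file collapse to one row, which matches the behavior described for -X, --remove-duplicate-detections.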