text editor to delete duplicates from pool of large files
#3
(07-13-2023, 02:56 PM)Snoopy Wrote:
(07-13-2023, 02:47 PM)ataman4uk Wrote: Hello
Could you please give me some advice on which text editor, or maybe other software, I could use to remove duplicates from a folder full of .txt files totalling about 200 GB? I mean not the duplicates within each file separately, but duplicates across all the files in that folder.

I am using Notepad to open those 10+ GB .txt files, but I need to clean them up.
So I am thinking about two approaches:
merge all the .txt files into one large file (its size will be larger than my RAM)
find a tool/software that can clean up a folder with several .txt files

just for searching for file duplicates

the only thing you can do is write yourself a little script or use basic Linux programs like cat -> sort -> uniq
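
(In concrete terms, that suggestion is roughly the following in Git Bash; a minimal sketch, where deduped.out is just an example output name, given a different extension so the *.txt glob does not pick it up again:

# concatenate every .txt file, sort the lines, drop adjacent duplicates
cat *.txt | sort | uniq > deduped.out

GNU sort spills to temporary files on disk when the input does not fit in RAM, so the data only has to fit on disk, not in memory.)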

Thank you.
I am using Windows as my OS, and I have used Git Bash, for example, to split those files so that each one is 4 GB and it does not take an hour to open a 200 GB text file. Now I am confused about how to clean them.
It seems like I should merge them back into one large file.
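
(For what it's worth, merging should not be necessary: GNU sort, the version bundled with Git Bash, accepts all the split files at once and performs an external merge sort on disk, so the data never has to fit in RAM. A rough sketch, assuming the split parts are in the current folder and /d/sorttmp, i.e. D:\sorttmp, is just an example temp directory on a drive with enough free space:

# treat lines as plain bytes instead of locale-aware text (much faster)
export LC_ALL=C

# -u      keep each distinct line only once
# -S 2G   use about 2 GB of RAM before spilling to temporary files
# -T DIR  put the temporary files on a drive with plenty of free space
sort -u -S 2G -T /d/sorttmp -o deduped.out *.txt

The temporary files can take roughly as much space as the input, so plan for another ~200 GB of free disk space somewhere.)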