Report duplicate files and documents
Pull reports on duplicated files or documents. Clean them up afterwards.

This is not in our short- or medium-term plans. We understand the need, but we don't feel it's our priority right now as we focus on the adoption of the modern workplace.
-
Anonymous commented
Being able to run deduplication reports for pre-migration cleanup or post-migration management.
-
lon.ramsey commented
It would be nice if this checked not only for duplicate file names but also for duplicate content. I have a site with several folders where users have used duplicate file names, but the content is actually different.
-
Karyn commented
This would be so awesome, especially as we migrate from file shares into O365.
-
Anonymous commented
Reporting duplicate files and documents is very critical functionality.
-
Anonymous commented
This is killing our project right now
-
Anonymous commented
Hello, do we have any news from the product group regarding this functionality?
-
Neil Howell commented
Have just started a file share migration to a SharePoint library. I am utilising column metadata instead of folder names organised by "Year & Month". I am seeing lots of duplicate file names that simply overwrite the previous file, even though it is not the same file. Maybe add an option to rename a file when a duplicate name is detected, like a tick box in the options.
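A rough sketch of how such a rename-on-duplicate option could behave, assuming a simple numeric-suffix scheme (the function and naming convention here are illustrative, not ShareGate's actual behaviour):

```python
from pathlib import Path

def dedupe_name(name: str, existing: set[str]) -> str:
    """Append ' (2)', ' (3)', ... to the stem until the name no longer collides."""
    if name not in existing:
        return name
    stem, suffix = Path(name).stem, Path(name).suffix
    n = 2
    while f"{stem} ({n}){suffix}" in existing:
        n += 1
    return f"{stem} ({n}){suffix}"

# Example: three same-named files survive side by side instead of overwriting.
seen: set[str] = set()
for original in ["report.docx", "report.docx", "report.docx"]:
    unique = dedupe_name(original, seen)
    seen.add(unique)
    print(unique)  # report.docx, then report (2).docx, then report (3).docx
```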
-
Anonymous commented
It would be great to have an inventory of duplicate documents stored across site collections, based on document name or title.
-
Anonymous commented
Hello,
Please consider adding a "Duplicate Files" report to the Reporting section. The report should help identify files with the same name across libraries and lists.
Thank you for your consideration.
-
Can you scan my environment to find all duplicate files and documents, then ask the user which one is the right one, so that the unused versions can be cleaned up at the end?
-
Chris Eaheart commented
Going cross-tenant is key here: looking at a single site collection won't cut it. It must also look across site collections created as part of an O365 Group.
-
Bob S commented
A true duplicate would check the file contents using a file hash. But the entities you mention would be great!
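A minimal sketch of what a content-hash duplicate check like this could look like, assuming direct file-system access; the share path and function names are illustrative only:

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def file_hash(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash file contents in 1 MiB chunks so large files don't exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def find_duplicates(root: str) -> dict[str, list[Path]]:
    """Group every file under `root` by content hash; groups of 2+ are duplicates."""
    groups: dict[str, list[Path]] = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            groups[file_hash(path)].append(path)
    return {h: paths for h, paths in groups.items() if len(paths) > 1}

# Example: report duplicate groups under a (hypothetical) file share path.
for digest, paths in find_duplicates(r"\\fileserver\share").items():
    print(digest[:12], *paths, sep="\n  ")
```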
-
Fontys Hogescholen commented
We are also very curious about the developments.
-
Theo Verdel commented
Hey Nathalie,
Can you tell me whether there are any developments concerning duplicated files? Such a feature would be very helpful for us.
-
Anonymous commented
Is there any way to find duplicate files? This would be extremely helpful for us when migrating from shared drives to SharePoint, and even for document libraries within SharePoint. For instance, one file share has a million files, and another file analysis tool we use found over 20,000 duplicates in it. It would be great if this is something ShareGate could help track. I didn't even see the option when creating a custom report. Thank you!
-
Anonymous commented
It would be handy if Inventory had a report on duplicate files. I have not made progress eliminating folders and continue to find the same document, or a different version of it, in multiple folders.
-
Dave Kuehling commented
Hi Sharegate!
For me it would need to be based on more than just the metadata: an actual bit-level check. I have found, especially when looking to pull in content from file shares, that the file content will be the same, but the name, date stamps, and other attributes can differ, and the file will likely be in a different folder. I think the data points you mention are good to use as well, but having that bit-level check will truly identify dups.
Thanks!!
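A bit-level check along these lines could be layered on top of hash grouping; this sketch (consuming groups shaped like the output of the earlier hashing example, with illustrative names) confirms each candidate pair byte for byte:

```python
import filecmp
from itertools import combinations
from pathlib import Path

def confirm_duplicates(groups: dict[str, list[Path]]) -> list[tuple[Path, Path]]:
    """Reduce hash-grouped candidates to pairs whose contents are bit-identical.

    `groups` maps a content hash to the files sharing it, e.g. the output of
    a find_duplicates-style pass like the one sketched above.
    """
    confirmed = []
    for paths in groups.values():
        for a, b in combinations(paths, 2):
            # shallow=False makes filecmp compare actual bytes, so names,
            # timestamps, and folder locations are ignored entirely.
            if filecmp.cmp(a, b, shallow=False):
                confirmed.append((a, b))
    return confirmed
```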
-
Anonymous commented
I'm not sure if I'm just missing it, but it would be great to have an option to get duplicate file information, and maybe even see who owns each duplicate so we can contact them. Just a thought.
-
It would be nice to see the duplicate items in file shares so we don't migrate them twice.