Run a Source Analysis (formerly Inventory) from PowerShell
Please add the option of running an inventory from PowerShell.

This is not in our plans for now as we are in the process of rethinking the inventory feature.
-
Spikemans commented
Not having this option available in PowerShell makes all our migrations very labour-intensive.
Please, please make this available in PowerShell. -
Nandy1234 commented
Any update on this feature? Is there any possibility to perform a source analysis or get an inventory report of file shares using Sharegate PowerShell commands?
-
Vaibhav Chopra commented
This would be a very good feature to have. We have been using Sharegate to automate a lot of our migrations, and this feature would help us resolve a lot of dependencies and cut down on manual labor.
-
Michael Baker commented
Similar in functionality to running Copy-Site with "-WhatIf", adding the ability to invoke the same source analysis process via the console would give a massive boost to the efficiency of migration projects for large on-prem farms.
Even though on the surface it would seem trivial, the amount of typing, clicking, and navigating (per source) is almost entirely repetitive and, for very large farms, adds up to a massive amount of dedicated (but nonproductive) time for a migration SME. Starting a source analysis is very simple, but doing it exclusively in the GUI requires more manual work than it would appear at the start.
First, a URL can only be selected via the search results dropdown. So, if a farm holds, say, 3,000 sites, initializing the process against a subset of 100 of them means manually typing short search strings, over and over, until all the URLs are selected. That alone is 400 mouse clicks -- and not a single bit of data has been analyzed yet. What's more, if you want to re-analyze the same sites later for comparison, the whole site-selection process has to be started over from scratch.
Also, after the selected sites complete their analysis, the task list does not offer display filtering beyond narrowing it down to one day. When hundreds of analysis reports are run, by multiple people, throughout the day, it becomes nearly impossible to find the reports you ran, as they are intermixed with everyone else's.
Based on my experience, over the course of a large SharePoint migration project, this can easily add up to 25-30 hours per FTE just wading through these steps. I am fully in favor of potentially destructive processes (e.g. site migration) being slow and deliberate by design. But source analysis is strictly a read-only mechanism, which makes the wasted hours even more painful to accept.
Being able to script this process would take a fraction of the time and, more importantly, would become more valuable as a project progresses; the script, once written, could be used again and again. The person running the script could be tracked, and reports could be extracted based on that information.
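To make the request concrete, here is a minimal sketch of what such a scripted analysis could look like. This is purely illustrative: `Connect-Site` is Sharegate's existing cmdlet for connecting to a site, but `Start-SourceAnalysis` and its `-ReportLabel` parameter are invented names for the feature being requested and do not exist today.

```powershell
# Hypothetical sketch: script a source analysis against a chosen subset of sites.
# Start-SourceAnalysis and -ReportLabel are invented for illustration only.
$siteUrls = Get-Content .\sites-to-analyze.txt   # one URL per line, e.g. 100 of 3,000 sites

foreach ($url in $siteUrls) {
    $site = Connect-Site -Url $url
    Start-SourceAnalysis -Site $site -ReportLabel "wave-1-$(Get-Date -Format yyyyMMdd)"
}
```

A label parameter like the invented `-ReportLabel` above would also address the report-filtering complaint: each operator could tag their own runs and retrieve only those later.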
-
SjoerdV commented
Ridiculous that you are stuck using a bulky UI which insists that you log in to the target location first, while all you want is to make a local inventory of the source...
-
Kevin Bryan commented
I think this is a critical element that should be added. In a large organization where file shares are in many locations and there may be reasons for retaining old/disabled user directories, I could run an active-user report in PowerShell, dump it into Excel, and then use that active list to run the inventory and pre-check.
-
Fredrik commented
Yes
This would really make it easier to plan migrations in large organisations where you need to group users into reasonable batches. -
Wenting Huang commented
Good idea; that's one step closer to automation.
-
mats.warnolf commented
A File Share Inventory PowerShell module would be lovely. That way I could produce separate user reports with a nice foreach loop.
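The foreach loop mentioned above might look something like this sketch. Note that `Start-FileShareInventory` and `Export-InventoryReport` are invented names standing in for the requested functionality; no such cmdlets exist in Sharegate's PowerShell module today, and the UNC paths are made-up examples.

```powershell
# Hypothetical sketch: inventory each file share and export one report per owner.
$shares = '\\fs01\home', '\\fs01\projects'   # example UNC paths

foreach ($share in $shares) {
    $inventory = Start-FileShareInventory -Path $share            # invented cmdlet
    foreach ($owner in ($inventory.Owners | Sort-Object -Unique)) {
        Export-InventoryReport -Inventory $inventory -Owner $owner `
            -Destination ".\reports\$owner.xlsx"                  # invented cmdlet
    }
}
```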