Run a Source Analysis (formerly Inventory) from PowerShell
Please add the option of running an inventory from PowerShell.
This is not in our plans for now as we are in the process of rethinking the inventory feature.
Michael Baker commented
Similar in functionality to running Copy-Site with "-WhatIf", the ability to invoke the same source analysis process from the console would be a massive efficiency boost for migration projects on large on-prem farms.
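For reference, the dry-run pattern the comment compares against looks roughly like this with the existing migration cmdlets (a sketch only; exact parameter names should be checked against the product documentation, and the URLs are placeholders):

```powershell
# Connect to source and destination, then do a dry run of the copy.
# -WhatIf reports what would be migrated without moving any data.
$src = Connect-Site -Url "http://onprem-farm/sites/teamA"
$dst = Connect-Site -Url "https://tenant.sharepoint.com/sites/teamA"
Copy-Site -Site $src -DestinationSite $dst -WhatIf
```

The request is for source analysis to be invokable in the same way, ideally without requiring a destination connection at all.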
Even though it seems trivial on the surface, the typing, clicking, and navigating required per source is almost entirely repetitive and, for very large farms, adds up to a massive amount of dedicated (but nonproductive) time for a migration SME. Starting a source analysis is very simple, but doing it exclusively in the GUI requires more manual work than it would appear at the start.
First, a URL can only be selected via the search results dropdown. So, if a farm holds, say, 3,000 sites, initializing the process against a subset of 100 of them means manually typing short search strings, over and over, until all the URLs are selected. That alone is 400 mouse clicks -- and not a single bit of data has been analyzed yet. What's more, if you want to re-analyze the same sites later for comparison, the whole site-selection process has to be started over from scratch.
Also, after the selected sites complete their analysis, the task list does not offer display filtering beyond narrowing it down to one day. When hundreds of analysis reports are run, by multiple people, throughout the day, it becomes nearly impossible to find the reports you ran, as they are intermixed with all the others.
Based on my experience, over the course of a large SharePoint migration project, this can easily add up to 25-30 hours per FTE just wading through these steps. I am fully in favor of potentially destructive processes (e.g. site migration) being slow and deliberate by design. But source analysis is strictly a read-only mechanism, which makes the wasted hours even more painful to accept.
Being able to script this process would take a fraction of the time and, more importantly, become more valuable as the project progresses: the script, once written, could be used again and again. The resource running the script could be tracked, and reports could be extracted based on that information.
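The kind of script this comment has in mind might look like the sketch below. Note that `Start-SourceAnalysis` is a hypothetical cmdlet name used purely for illustration; no such command exists today, which is exactly the request. The CSV path and column name are also assumptions:

```powershell
# HYPOTHETICAL: Start-SourceAnalysis does not exist yet.
# Re-runnable batch analysis driven by a CSV of site URLs (column: Url).
$sites = Import-Csv -Path "C:\migration\wave1-sites.csv"
foreach ($site in $sites) {
    $src = Connect-Site -Url $site.Url
    Start-SourceAnalysis -Site $src -ReportName "wave1-$($env:USERNAME)"
}
```

Because the URL list lives in a file, the same batch could be re-analyzed later for comparison, and tagging each report with the operator's name would also address the task-list filtering problem described above.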
Ridiculous that you are stuck using a bulky UI which insists that you log in to the target location first, when all you want is a local inventory of the source...
Kevin Bryan commented
I think this is a critical element that should be added. In a large organization where file shares are in many locations and there may be reasons for retaining old/disabled user directories, I can run an active-user report in PowerShell, dump it into Excel, and then use that active list to run the inventory and pre-check.
This would really make it easier to plan migrations in large organisations where you need to group users into reasonable batches.
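The batching step described here (splitting an active-user list into reasonably sized waves) needs no product support at all; a minimal sketch in plain PowerShell, assuming an exported CSV of active users:

```powershell
# Split an exported active-user list into migration batches of 50.
$users = Import-Csv -Path "C:\migration\active-users.csv"
$batchSize = 50
$batches = for ($i = 0; $i -lt $users.Count; $i += $batchSize) {
    # Unary comma keeps each slice as a single nested array element.
    ,($users[$i..([Math]::Min($i + $batchSize - 1, $users.Count - 1))])
}
# Each $batches[n] could then drive an inventory/pre-check run for that wave.
```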
Wenting Huang commented
Good idea, that's one more step closer to automation.
A File Share Inventory PowerShell module would be lovely. That way I could produce separate user reports with a nice foreach loop.