
In this article by Prashant G Bhoyar and Martin Machado, authors of the book PowerShell for Office 365, you will learn about OneDrive, another key component of the digital workplace. Individuals can safely store and sync content in the cloud, making it available anywhere, while businesses can manage and retain content securely. In this article, we'll go over common provisioning and management scenarios.


In the early days, SharePoint was positioned as a great replacement for file shares. SharePoint addressed some important pain points of file shares: versioning, recycle bin, check in/check out, history/auditing, a web interface, and custom metadata features, to name a few. Fast forward to the present, and SharePoint and other content management system products have effectively replaced file shares in the collaboration space. Yet file shares remain very relevant to personal storage, although you would hardly call OneDrive for Business a file share (at least not one from 10 years ago). Officially defined as file-hosting products, OneDrive and OneDrive for Business still offer the operating system integration that makes file shares convenient while adopting important features from the CMS world.

Recently, Microsoft has also rolled out OneDrive for Office Groups, making the case for small group collaboration through OneDrive.

Why start with SharePoint in an article on OneDrive, you ask? I am glad you did. At the time of writing, OneDrive and SharePoint are closely intertwined. All the OneDrive administration commands are included within the SPO API, OneDrive's web interface is a trimmed-down SharePoint site, and you can use the same SharePoint CSOM/REST APIs to work with OneDrive. From an administrator's perspective, OneDrive can be thought of as a synchronization client (in charge of keeping data consistent between local copies and online storage) plus a web interface (a branded and customized SharePoint site).

Will this be the case in the long run? At the moment, we are going through a transition period. From the writer's point of view, SharePoint will continue to provide the infrastructure for OneDrive and other services. However, Microsoft is making an ongoing effort to provide one API for all its online services, and as the platform matures, the lines between OneDrive, SharePoint, Exchange, and other services seem to blur more and more. In the long run, it is quite possible that these products will merge or change in ways we have not thought of. As the Microsoft Graph API (the promised API to access all your services) matures, the internal implementation of the services will matter less to developers and administrators.

In the Graph API, both OneDrive and SharePoint document libraries are referred to as ‘drives’ and files or list items within them as ‘driveitems’. This is an indication that even though change is certain, both feature sets will remain similar.
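As a quick illustration of this convergence, the contents of a OneDrive can already be read as a 'drive' through the Graph API. The following is a minimal sketch, assuming you have already acquired an OAuth bearer token for Microsoft Graph and stored it in $token (token acquisition is outside the scope of this article):

# List the files in the signed-in user's OneDrive through the Graph 'drive' endpoint
# ($token is assumed to hold a valid OAuth bearer token)
$headers = @{ Authorization = "Bearer $token" }
$response = Invoke-RestMethod -Uri 'https://graph.microsoft.com/v1.0/me/drive/root/children' -Headers $headers
$response.value | Select-Object name, size, lastModifiedDateTime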

In this article, we will cover OneDrive administration, which can be divided into three different areas:

  • Feature configuration
  • Personal site management
  • Data migration

Feature configuration

The following parameters of the Set-SPOTenant command can be used to configure the OneDrive user experience:

  • OneDriveStorageQuota: By default, OneDrive’s storage quota is set to 1 TB. The policy value can be changed through the Set-SPOTenant command, and existing site quotas can be changed through the Set-SPOSite command. This value is set in megabytes (1048576 for 1 TB) and will be capped by the user’s assigned license.

In the following example, we change the quota policy to 6 TB, but the value is effectively set at 5 TB as it is the highest value allowed for standard licenses:

$quota = 6TB / 1024 / 1024
Set-SPOTenant -OneDriveStorageQuota $quota
Get-SPOTenant | Select OneDriveStorageQuota

OneDriveStorageQuota
--------------------
             5242880

Individual site quotas can be reviewed and updated using the Get-SPOSite and Set-SPOSite commands. In the following sample, note that after updating the quotas for the individual sites, we have to call Get-SPOSite again to see the updated values (changes to sites will not be reflected in local variables):

$mySites = Get-SPOSite -IncludePersonalSite $true -Filter { Url -like '/personal/' }
$mySites | Select StorageQuota, StorageUsageCurrent

StorageQuota StorageUsageCurrent
------------ -------------------
     1048576                   6
     1048576                   1
     5242880                  15

$quota = 3TB / 1024 / 1024
foreach ($site in $mySites) {
    Set-SPOSite -Identity $site -StorageQuota $quota
}
$mySites = Get-SPOSite -IncludePersonalSite $true -Filter { Url -like '/personal/' }
$mySites | Select StorageQuota

StorageQuota
------------
     3145728
     3145728
     3145728
  • NotifyOwnersWhenInvitationsAccepted: When set to true, the OneDrive owner will be notified when an external user accepts an invitation to access a file or folder.
  • NotifyOwnersWhenItemsReshared: When set to true, the OneDrive owner will be notified when a file or folder is re-shared by another user.
  • OrphanedPersonalSitesRetentionPeriod: When a user is deleted, their OneDrive site is retained for 30 days by default; after that threshold, the site is deleted. The value is expressed in days, from 30 to 3650.
  • ProvisionSharedWithEveryoneFolder: If set to true, a 'Shared with Everyone' folder is created when a OneDrive site is provisioned. This folder is not accessible through the OneDrive client, but it can be used through the browser and is accessible to all users.
  • SpecialCharactersStateInFileFolderNames: Allows the use of special characters in file and folder names (applies to both SharePoint and OneDrive). Currently, the only special characters that can be allowed are # and %. Microsoft has announced that support for additional special characters will be rolled out soon.
  • Client synchronization: Synchronization restrictions for the OneDrive client (for example, limiting sync to computers joined to specific domains) are configured through a separate command, Set-SPOTenantSyncClientRestriction, as shown in the sketch after this list.
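As a hedged illustration (the values below are examples, not recommendations), several of these settings can be applied in a single Set-SPOTenant call, while sync restrictions are configured through their own command:

# Example values only: apply several OneDrive-related tenant settings at once
Set-SPOTenant -NotifyOwnersWhenInvitationsAccepted $true `
    -NotifyOwnersWhenItemsReshared $true `
    -OrphanedPersonalSitesRetentionPeriod 60 `
    -ProvisionSharedWithEveryoneFolder $false `
    -SpecialCharactersStateInFileFolderNames Allowed
# Limit the sync client to domain-joined machines (the GUID below is a hypothetical domain GUID)
Set-SPOTenantSyncClientRestriction -Enable -DomainGuids '786548DD-877B-4760-A749-6B1EFBC1190A'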

Personal site management

Historically, personal sites (or My Sites) have been a management problem. When planning a deployment, you have to consider your user base, the turnover in your organization, the internal policy for content storage, and many other factors. In Office 365, some of these factors have been addressed, but the My Sites deployment (as with any other large-scale site deployment) largely remains a challenge.

With the introduction of quotas, you can cap both the storage and the resources allocated to a site. By default, My Sites get 1 TB of space; unfortunately, quotas cannot be set in the Request-SPOPersonalSite command, which is used for the provisioning of personal sites.

Another issue with personal sites is that they take a few minutes to set up, so it is very common for an administrator to pre-provision personal sites for the organization. At the time of writing, OneDrive is implemented as personal sites, which means that the scripts we will review also apply to provisioning OneDrive. This is a very common task for migrations to the cloud:

Request-SPOPersonalSite -UserEmails <String[]> [-NoWait <SwitchParameter>]

The Request-SPOPersonalSite command has only two parameters, yet its usage is worth documenting due to some common issues.

If you are deploying for a small list of users, an inline array of strings will schedule the creation of the sites. It is worth noting that the command will not return errors if the users are not found or if the user count exceeds 200 items. In general, you will have to validate that the process has completed:

Request-SPOPersonalSite -UserEmails '[email protected]', '[email protected]' -NoWait

It is very common that the list of users will be read from a file or a CSV input. In the following example, we parse a comma-separated list of e-mails using Split. Even though the documentation specifies an array of strings, this call will not work unless we transform the string array into an object array through the use of the Where-Object command:

Request-SPOPersonalSite -UserEmails ('[email protected],[email protected]'.Split(',') | Where-Object { $true })

Another common scenario is to deploy personal sites for a list of users already in SharePoint Online. The following script will retrieve all users with a valid login (login in the form of an e-mail). Note the use of the ExpandProperty parameter to return just the LoginName property of the users:

$users = Get-SPOUser -Site https://mytest321.sharepoint.com |
    Where-Object { $_.IsGroup -ne $true -and $_.LoginName -like '*@*.*' } |
    Select-Object -ExpandProperty LoginName

If the list is small, we can iterate over the users individually or schedule the provisioning in one call. It is safe to schedule the personal site for a user who already has one (the request will be silently skipped), but there will be no warning when submitting over 200 requests:

# individual requests
$users | ForEach-Object {
    Request-SPOPersonalSite -UserEmails $_
}
# bulk
Request-SPOPersonalSite -UserEmails $users

If you are dealing with many users, you can create groups of 200 items instead and submit them in bulk:

# Group the requests into batches of 200 e-mails
$groups = $users | Group-Object { [int]($users.IndexOf($_) / 200) }
# Send the requests in batches of 200; do not wait for a response
$groups | ForEach-Object {
    $logins = $_.Group
    Write-Host 'Creating sites for: ' $logins
    Request-SPOPersonalSite -NoWait -UserEmails $logins
}

It is up to the administrator to verify that the requests have completed and that none of the users were skipped because they were not found.
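One possible verification approach, sketched here under the assumption that personal site URLs encode the login name by replacing '@' and '.' with underscores (the current SharePoint Online convention), is to compare the requested logins against the provisioned site URLs:

# Flag users whose personal site does not appear to exist yet ($users as defined above)
$mySiteUrls = (Get-SPOSite -IncludePersonalSite $true -Filter { Url -like '/personal/' }).Url
$pending = $users | Where-Object {
    $encoded = $_.Replace('@', '_').Replace('.', '_')
    -not ($mySiteUrls -like "*/personal/$encoded*")
}
if ($pending) { Write-Host 'Sites still pending or users not found:' $pending }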

To complete the scenario, the following script will select and delete all personal sites:

$mySites = Get-SPOSite -IncludePersonalSite $true -Filter { Url -like '/personal/' }
$mySites | Remove-SPOSite -Confirm:$false

To access and manage OneDrives, administrators need to be site collection administrators of each OneDrive (remember that it is a SharePoint site). The SharePoint tenant administration site has an option to add a secondary administrator when sites are provisioned, but this setting will not apply to sites that have already been created. In the following script, we add an additional site collection administrator to all existing OneDrives:

$mySites = Get-SPOSite -IncludePersonalSite $true -Filter { Url -like '/personal/' }
foreach ($site in $mySites) {
    Set-SPOUser -Site $site -LoginName [email protected] -IsSiteCollectionAdmin $true
}

Data migration

The last topic concerning site collections is document migration. All the content covered in this article also applies to SharePoint sites. There are three alternative methods to upload data to Office 365:

  • The CSOM API
  • The SPO API
  • Office 365 Import Service

Let’s look at each one in detail.

CSOM API

Initially, the CSOM API was the only method available to upload documents to SharePoint Online. CSOM is a comprehensive API that is used for application development and administration. It is a great tool for a myriad of scenarios, but it is not specialized for content migrations. When used for this purpose, we can go over the API throttling limits (Microsoft has purposely not put a specific number on these, as they depend on multiple factors). Your scripts might get temporarily blocked (requests will get a 429 'Too Many Requests' HTTP error), and if the misuse continues for an extended period of time, your tenant might get blocked altogether (503 'Service Unavailable'). The tenant administrator would have to take action in this case.

API throttling is put in place to guarantee platform health. The Patterns and Practices (PnP) Core.Throttling sample shows how to work around this limitation for legitimate scenarios at https://github.com/SharePoint/PnP/tree/dev/Samples/Core.Throttling.
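The essence of the workaround is incremental back-off: when a request is throttled, wait and retry with a growing delay. The following is a simplified sketch of that pattern rather than the PnP sample itself; the function name and parameters are ours:

# Retry throttled (HTTP 429) requests with an exponentially growing delay
function Invoke-WithBackoff {
    param([scriptblock]$Action, [int]$MaxRetries = 5)
    $delaySeconds = 5
    for ($attempt = 1; $attempt -le $MaxRetries; $attempt++) {
        try { return & $Action }
        catch [System.Net.WebException] {
            $response = $_.Exception.Response
            if ($response -ne $null -and [int]$response.StatusCode -eq 429) {
                Write-Host "Throttled, waiting $delaySeconds seconds (attempt $attempt)..."
                Start-Sleep -Seconds $delaySeconds
                $delaySeconds *= 2   # double the delay on every retry
            } else { throw }
        }
    }
    throw "Request still throttled after $MaxRetries retries."
}
# Usage: Invoke-WithBackoff { $clientContext.ExecuteQuery() }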

Moreover, the bandwidth allocated for the CSOM API will only allow you to upload approximately 1 GB per hour (depending on multiple factors such as file size, the number of files, networking, and concurrent API usage), which makes it impractical for large content migrations.

In the next sections, you will see faster and easier approaches to bulk migrations, yet the CSOM API remains relevant in this scenario. This is because at the time of writing this, it is the only method that allows metadata modification. It is also worth mentioning that CSOM changes are reflected immediately, whereas updates through the other methods will take some time to be effective due to the architecture of the process.

In our experience doing content migrations, the bulk of the tasks are done with the SPO API, yet CSOM is better suited for last minute changes or ad-hoc requests.

The following sample shows how to upload a file and set its metadata. This method can be used for small migrations or to set file metadata:

# Load the CSOM assemblies (adjust the path to where the SharePoint Online
# Client Components SDK is installed) and prompt for credentials
Add-Type -Path 'C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll'
Add-Type -Path 'C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll'
$spoCreds = Get-Credential
$siteUrl = "https://mytest321.sharepoint.com/personal/admin1"
$clientContext = New-Object Microsoft.SharePoint.Client.ClientContext($siteUrl)
$credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($spoCreds.UserName, $spoCreds.Password)
$clientContext.Credentials = $credentials
# Upload the file stream directly to the target server-relative URL
$stream = [System.IO.File]::OpenRead('C:\temp\fileToMigrate.xml')
$overwrite = $true
$fileUrl = '/personal/admin1/Documents/file.xml'
[Microsoft.SharePoint.Client.File]::SaveBinaryDirect($clientContext, $fileUrl, $stream, $overwrite)
$stream.Close()
# Update the metadata of the uploaded file
$listItem = $clientContext.Web.GetFileByServerRelativeUrl($fileUrl).ListItemAllFields
$listItem["Title"] = 'Updated via script'
$listItem.Update()
$clientContext.ExecuteQuery()

SPO Migration API

The SPO API has a handful of commands to support the migration of content to SharePoint or OneDrive sites. The main advantage in this case is that the migration package is first uploaded to Azure Blob storage. The contents are encrypted while in the temporary storage and can be processed in parallel. Being able to take advantage of the enhanced bandwidth and parallel processing makes this approach necessary when dealing with hundreds of gigabytes or many different destinations (typically the case when migrating OneDrive content). The costs of transferring and storing your data are minimal considering that the upload speed increases ten-fold in comparison to the CSOM approach. With this approach, you can submit multiple packages and execute them in parallel. When first released, the platform allowed up to 16 concurrent migrations; however, this number has increased lately. As an administrator, you will have to monitor the state and results of each migration package. Let's look at a few commands that will help us achieve this:

  • New-SPOMigrationPackage:
    New-SPOMigrationPackage -OutputPackagePath <String>
    -SourceFilesPath <String> [-IgnoreHidden <SwitchParameter>]
    [-IncludeFileSharePermissions <SwitchParameter>]
    [-NoAzureADLookup <SwitchParameter>]
    [-NoLogFile <SwitchParameter>]
    [-ReplaceInvalidCharacters <SwitchParameter>]
    [-TargetDocumentLibraryPath <String>]
    [-TargetDocumentLibrarySubFolderPath <String>]
    [-TargetWebUrl <String>]

    We begin by creating a migration package using New-SPOMigrationPackage. The command will create a package with the contents of a folder and include options to match accounts by name, include file permissions, and upload to a specific subfolder of a library: 

    $sourceFolder = 'C:\mydocs'
    $packageFolder = 'C:\temp\package1'
    $targetWeb = 'https://mytest321-my.sharepoint.com/personal/admin1'
    $targetLib = 'Documents'
    New-SPOMigrationPackage -SourceFilesPath $sourceFolder `
        -OutputPackagePath $packageFolder -NoAzureADLookup
  • ConvertTo-SPOMigrationTargetedPackage: The ConvertTo-SPOMigrationTargetedPackage command allows you to set the target website URL, library, and folder for the migration. In the following sample, we use the ParallelImport and PartitionSizeInBytes parameters to break the migration up into multiple packages, which can significantly reduce the overall migration time ($finalPackage, the output folder for the converted packages, is defined first):
    $finalPackage = 'C:\temp\finalPackage'
    $packages = ConvertTo-SPOMigrationTargetedPackage -ParallelImport `
        -SourceFilesPath $sourceFolder -SourcePackagePath $packageFolder `
        -OutputPackagePath $finalPackage -TargetWebUrl $targetWeb `
        -TargetDocumentLibraryPath $targetLib `
        -TargetDocumentLibrarySubFolderPath 'migration3' `
        -Credentials $spoCreds -PartitionSizeInBytes 500MB
    $packages

    PackageDirectory FilesDirectory
    ---------------- --------------
    1                C:\mydocs
    2                C:\mydocs
  • Invoke-SPOMigrationEncryptUploadSubmit: The next step is to upload the packages. Invoke-SPOMigrationEncryptUploadSubmit will upload the contents of the package into Azure blob storage and create a migration job:
    $jobs = $packages | ForEach-Object {
        Invoke-SPOMigrationEncryptUploadSubmit `
            -SourceFilesPath $_.FilesDirectory.FullName `
            -SourcePackagePath $_.PackageDirectory.FullName `
            -Credentials $spoCreds -TargetWebUrl $targetWeb
    }
    Creating package for folder:
    C:\mydocs
    Converting package for office 365:
    C:\temp\finalPackage
    $jobs

    JobId                                ReportingQueueUri
    -----                                -----------------
    f2b3e45c-e96d-4a9d-8148-dd563d4c9e1d https://sposn1ch1m016pr.queue.core.windows.net/...
    78c40a16-c2de-4c29-b320-b81a38788c90 https://sposn1ch1m001pr.queue.core.windows.net/...
  • Get-SPOMigrationJobStatus: Get-SPOMigrationJobStatus will return the status of the active jobs. This command can be used to monitor the status and wait until all the submitted jobs are completed:
    # Retrieve the job status individually
    foreach ($job in $jobs) {
        Get-SPOMigrationJobStatus -TargetWebUrl $targetWeb `
            -Credentials $spoCreds -JobId $job.JobId
    }
    None
    Processing

    In a real-world scenario, you can use the command without the JobId parameter to get an array of job statuses and wait until all are complete. Running jobs will have the 'Processing' state, and completed jobs have the 'None' status. Completed jobs are removed automatically, so the job status array is not guaranteed to have the same length on each call and will eventually shrink to zero.

    In the following example, we wait until the active number of jobs is 15 or less before continuing with the script:

    $jobs = Get-SPOMigrationJobStatus -TargetWebUrl $targetWeb
    while ($jobs.Count -ge 15)
    {
        $active = $jobs | Where-Object { $_.JobState -eq 'Processing' }
        Write-Host 'Too many jobs: ' $jobs.Count ' active: ' $active.Length ', pausing...'
        Start-Sleep 60
        $jobs = Get-SPOMigrationJobStatus -TargetWebUrl $targetWeb
    }
  • Get-SPOMigrationJobProgress: The Get-SPOMigrationJobProgress command will return the result of each job. By default, a log file is placed in the folder specified in SourcePackagePath, and the command waits for the job to complete unless the DontWaitForEndJob parameter is used:
    foreach ($job in $jobs) {
        Get-SPOMigrationJobProgress -AzureQueueUri $job.ReportingQueueUri.AbsoluteUri `
            -Credentials $spoCreds -TargetWebUrl $targetWeb `
            -JobIds $job.JobId -EncryptionParameters $job.Encryption -DontWaitForEndJob
    }
    Total Job(s) Completed = 1, with Errors = 0, with Warnings = 1
    Total Job(s) Completed = 1, with Errors = 0, with Warnings = 0
  • Remove-SPOMigrationJob: If needed, you can manually remove jobs with the Remove-SPOMigrationJob command:
    $jobStatus = Get-SPOMigrationJobStatus -TargetWebUrl $targetWeb `
        -Credentials $spoCreds -JobId $job.JobId
    if ($jobStatus -eq 'None') {
        Write-Host 'Job completed:' $job.JobId
        Remove-SPOMigrationJob -JobId $job.JobId `
            -TargetWebUrl $targetWeb -Credentials $spoCreds
    }

Summary

OneDrive offers a compelling service to store files across multiple devices and operating systems, and it continues to evolve to target individuals and small collaboration groups. As an administrator, you can help your organization quickly migrate to this service and manage its use through the different scripting methodologies we reviewed.
