Migrating enterprise content to the cloud is no small undertaking, but Microsoft is looking to make that process quicker and easier with its new Office 365 Import Service.
Currently available in preview, the service supports data migration from on-premises deployments of SharePoint and Exchange to the Office 365 suite of cloud services. It also supports migration of content from OneDrive for Business and regular file shares.
Given Microsoft's cloud-first approach in recent years, data migration has become a priority. Previous methods -- such as transferring data over a dedicated Internet line, or relying on users to manually drag and drop files and wait for the uploads to finish -- could take years, given the volume of data at stake. That was a common reservation for enterprises considering cloud migration. Microsoft is now acknowledging that issue and billing this new offering as an attempt to speed up the process, as well as make it more user-friendly.
In this piece, I'll give you a quick head start on using the service to populate a SharePoint tenant.
To use the service, you need to bust out your PowerShell scripting skills. Download the SharePoint Online Management Shell, which adds support for a number of SharePoint Online and Office 365-related cmdlets.
Once you download and install the shell, start by uploading data over the network. For an upload, you do the migration prep work, and then send your data over the wire to an Azure storage account. The automated service picks up the work from there. This is how it's done:
1. Open PowerShell as an administrator and run the following cmdlet: Connect-SPOService -Url https://yourtenant.sharepoint.com
2. Log in to the Office 365 Administration Console and click the "import" link.
3. Click the "+" and then choose "Upload data over network."
4. Click "Show key," and then "Show URL." Be sure to record both of these values somewhere safe -- you will need them later on.
5. Next, create the migration package. The migration package is the container that holds the data you are migrating to the cloud. You can create it in one of two ways:
- If you are bundling up files on a standard file server, you can use this cmdlet, including the last tag that preserves the permissions on the access control list, so that you can later restore them in the cloud:
New-SPOMigrationPackage -SourceFilesPath D:\data -OutputPackagePath D:\package -IncludeFileSharePermissions
- If you are migrating a lot of SharePoint content from an on-premises site, use the following cmdlet instead:
Export-SPWeb -Identity http://site -Path "d:\data" -ItemUrl "/shared" -NoFileCompression
- For the Identity parameter, use the GUID or the URL of the SharePoint Server site being exported -- the top-level one. For the Path parameter, name a directory that will hold the package when the cmdlet finishes. For the ItemUrl parameter, use the path of the subsite you are exporting, such as "/shared/teamsite" or similar. You must include the -NoFileCompression flag, as compression is not currently supported by the Office 365 Import Service.
6. Back in the SharePoint PowerShell window, use the following cmdlet to process the package:
ConvertTo-SPOMigrationTargetedPackage -SourceFilesPath D:\data -SourcePackagePath D:\package -OutputPackagePath D:\targetpkg -TargetWebUrl https://yourtenant.sharepoint.com/ -TargetDocumentLibraryPath "Shared Documents"
7. Then, commence the upload, using the account name and key you recorded from the portal in an earlier step:
Set-SPOMigrationPackageAzureSource -SourceFilesPath D:\data -SourcePackagePath D:\targetpkg -AccountName youraccountnamehere -AccountKey yourkeyhere -FileContainerName files -PackageContainerName package
8. Next, you need to create a mapping file that tells the import service the destination of the files -- specifically, which tenant and site your import package will reside in. The mapping file is a CSV file, which you can easily create in Microsoft Excel. For SharePoint packages, you just need to create a mapping that links the files and the package to the right site URL. Keep the headings at the top of the CSV intact, as those are fixed and defined by Microsoft.
9. Once your mapping file is done, go back to the Office 365 portal and click "I'm done uploading my files," and "I have access to the mapping file." Upload that CSV to the service and click "Finish"; the automated minions will then pick up your job and finish the import.
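If you are mapping many packages, you may prefer to generate the CSV with a script rather than in Excel. Here is a minimal sketch in Python; note that the column headers and values below are placeholders for illustration only -- the real headers are fixed by Microsoft, so copy them from the sample file or documentation before using this.

```python
import csv

# Sketch: generate the import mapping CSV programmatically.
# NOTE: these header names are HYPOTHETICAL placeholders; substitute the
# exact headers Microsoft defines for the Office 365 Import Service.
headers = ["FilePath", "TargetSiteUrl", "TargetDocumentLibrary"]
rows = [
    {
        "FilePath": "D:\\targetpkg",
        "TargetSiteUrl": "https://yourtenant.sharepoint.com/",
        "TargetDocumentLibrary": "Shared Documents",
    },
]

with open("mapping.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=headers)
    writer.writeheader()  # the service rejects files without the fixed headings
    writer.writerows(rows)
```

Scripting the file this way also avoids Excel quietly reformatting URLs or adding a byte-order mark.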
Shipping drives to Microsoft
If you have a ton of data, it may be faster to put it on hard drives and send them to Microsoft via UPS or FedEx, so the data can be plugged in and imported directly in the data center. This option is generally worthwhile only if you have more than 5 TB to 10 TB of data to move, since transferring that much over the network would take a long time at typical bandwidth and transfer rates. Much of the prep process for shipping drives is the same as above.
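The break-even point is easy to estimate for your own environment. A quick back-of-envelope sketch, assuming an illustrative 100 Mbps line and the service's sub-6 TB per-drive ceiling (your real numbers will differ):

```python
import math

def network_days(data_tb: float, mbps: float = 100.0) -> float:
    """Days to push data_tb terabytes over an mbps link at full utilization."""
    bits = data_tb * 1e12 * 8        # terabytes -> bits
    seconds = bits / (mbps * 1e6)    # bits / bits-per-second
    return seconds / 86400

def drives_needed(data_tb: float, drive_tb: float = 6.0) -> int:
    """Drives required given the service's per-drive capacity limit."""
    return math.ceil(data_tb / drive_tb)

print(round(network_days(10), 1))  # 10 TB at 100 Mbps -> about 9.3 days
print(drives_needed(10))           # 10 TB fits on 2 drives under the 6 TB cap
```

At 10 TB, more than a week of saturated uplink is already in courier territory; real-world throughput is usually well below line rate, which tilts the math further toward shipping.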
1. Obtain the right hard drives. You'll need 3.5-inch SATA II or SATA III hard drives smaller than 6 TB.
2. Use the Microsoft Azure Import/Export Tool to prepare the drives. Download and run it; it's easy and requires just a few clicks.
3. Follow steps two to seven above, only instead of choosing the network option in step three, choose "Ship data on physical hard drives."
4. Mount your physical drives, and then copy the manifest and the data. You'll need your Azure account key from earlier.
WAImportExport.exe PrepImport /j:C:\journalFiles\SPDiskShipping1.jrn /id:spopackageingestiontest /sk:<AzAccountKey> /t:F /srcdir:E:\Ingestion\Ingestion\SPDiskShipping\SPOPackageIngestionTest /dstdir:spopackageingestiontest /encrypt /Disposition:overwrite /logdir:c:\azureImportLogs
WAImportExport.exe PrepImport /j:C:\journalFiles\SPDiskShipping1.jrn /id:spodataSourceIngestionTest /srcdir:E:\Ingestion\Ingestion\SPDiskShipping\SPDataSourceIngestionTest /dstdir:spdatasourceingestiontest /Disposition:overwrite
5. Duplicate step eight from the previous section to create the mapping file.
6. Back in the portal window, confirm that you have prepared the drives, have the journal files and have access to the mapping file. Then, upload the journal files from the Azure Import/Export Tool -- these contain the BitLocker keys used to encrypt the drives for transit -- followed by the mapping CSV. Copy the shipping address and then ship the drives.
7. Enter the tracking number on the Ingestion page in the Office 365 portal, and then enter your UPS or FedEx account number as well, so your drives can be returned to you after the data is copied from them.
The preview version of the Office 365 Import Service is free, and it's slated for general release sometime in the fourth quarter of 2015. Microsoft hasn't announced pricing for the general release version, so if you're anxious to use this, I'd get started today. Whether this is a valuable addition to the existing migration options remains to be seen; the technical aspects of the service seem sound and imports generally complete successfully, so pricing will likely make or break this offering.