Exchange Online PST Import Part 2: Upload, Mapping, Import

You need to have the pre-reqs and initial steps complete before proceeding, as detailed in Exchange Online PST Import Part 1.

Upload PST files to Office 365

The Azure AzCopy tool is used to upload the PST files into Azure Blob storage.

The PST files need to be accessible via a network share (i.e. a UNC path), for example \\X500FS01\AzurePSTUpload.  You cannot specify a local physical path.

From a command prompt, go to the AzCopy program folder.  If you went with the defaults, that should be C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy.

AzCopy command syntax is:

AzCopy.exe /Source:[UNC Path] /Dest:[SAS URL] /V:[Log File] /Y

AzCopy example:

AzCopy.exe /Source:"\\X500FS01\AzurePSTUpload" /Dest:"https://3c3e5952a2764023ad14984.blob.core.windows.net/ingestiondata?sv=2012-02-12&se=9999-12-31T23%3A59%3A59Z&sr=c&si=IngestionSasForAzCopy201601121920498117&sig=Vt5S4hVzlzMcBkuH8bH711atBffdrOS72TlV1mNdORg%3D" /V:"c:\Users\SteveB.X500\Documents\AzCopy.log" /Y

Note all parameters are required unless stated otherwise, and the values must be wrapped in "quotation marks".

/Source is the UNC path of where the PST files are located.  Here that's a share on my fileserver ("\\X500FS01\AzurePSTUpload").

/Dest is the SAS URL that we obtained earlier (see Part 1).

When the PST files are uploaded to Azure Blob storage, they’ll go into the root of a folder called ingestiondata.  This happens because the folder is specified in the SAS URL.  If you want to upload files into a different folder (e.g. PSTUploads), add the folder name to the SAS URL, after ingestiondata.  For example:

/Dest:"https://3c3e5952a2764023ad14984.blob.core.windows.net/ingestiondata/PSTUploads?sv=2012-02-12&se=9999-12-31T23%3A59%3A59Z&sr=c&si=IngestionSasForAzCopy201601121920498117&sig=Vt5S4hVzlzMcBkuH8bH711atBffdrOS72TlV1mNdORg%3D"

/V (Optional) specifies a log file location if you don’t want to go with the default (AzCopyVerbose.log in %LocalAppData%\Microsoft\Azure\AzCopy).

/S (Optional) specifies recursive mode, i.e. the AzCopy tool will upload files that are in subfolders of the specified /Source folder.  If you are using this switch, be sure to check the path of the uploaded files in the Azure Blob storage using Azure Storage Explorer.

/Y allows the use of write-only SAS tokens when uploading the PST files to Azure Blob storage.
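Putting those parameters together, the command line can also be assembled in a short script.  Here's a hypothetical Python helper as a sketch; the share path, SAS URL, and log path below are illustrative placeholders, not real values:

```python
# Sketch: assemble an AzCopy (v7-style) command line for a PST upload.
# All paths and URLs below are illustrative placeholders.
def build_azcopy_command(source_unc, sas_url, log_path=None, recursive=False):
    parts = ['AzCopy.exe', f'/Source:"{source_unc}"', f'/Dest:"{sas_url}"']
    if log_path:        # /V is optional; the default is AzCopyVerbose.log
        parts.append(f'/V:"{log_path}"')
    if recursive:       # /S is optional; also uploads PSTs in subfolders
        parts.append('/S')
    parts.append('/Y')  # required for write-only SAS tokens
    return ' '.join(parts)

cmd = build_azcopy_command(
    r'\\FILESERVER\PSTShare',
    'https://example.blob.core.windows.net/ingestiondata?sv=...&sig=...',
    log_path=r'C:\Logs\AzCopy.log',
)
print(cmd)
```

The helper just mirrors the syntax shown above; you'd still paste the resulting string into a command prompt on the machine that can reach the share.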

When the command is executed, you’ll get status updates of the PST file upload.  When it’s finished, you’ll get a final status message.

Cmd2

Create a PST Import Mapping File

Now that the PST files have been uploaded to Azure Blob storage, a CSV file needs to be created that maps each PST file to a target mailbox, together with some import options.

This is the header row of the CSV file:

Workload,FilePath,Name,Mailbox,IsArchive,TargetRootFolder,ContentCodePage,SPFileContainer,SPManifestContainer,SPSiteUrl

Example data:

Workload,FilePath,Name,Mailbox,IsArchive,TargetRootFolder,ContentCodePage,SPFileContainer,SPManifestContainer,SPSiteUrl
Exchange,,SteveTest1.pst,steve.test1@x500.co.uk,FALSE,/ImportedArchive,,,,
Exchange,,SteveTest2.pst,steve.test2@x500.co.uk,FALSE,/ImportedArchive,,,,

In this example, I am uploading PST files for two users.  The import is going into their mailbox (not their archive mailbox), into a root folder called ‘/ImportedArchive’.
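For more than a handful of users, the mapping file can be generated with a short script rather than edited by hand.  A sketch using Python's csv module, with the same two illustrative PST names and addresses as the example above:

```python
import csv
import io

# The fixed header row the PST Import service expects.
HEADER = ['Workload', 'FilePath', 'Name', 'Mailbox', 'IsArchive',
          'TargetRootFolder', 'ContentCodePage', 'SPFileContainer',
          'SPManifestContainer', 'SPSiteUrl']

def build_mapping_rows(pst_to_mailbox, target_root='/ImportedArchive'):
    """One row per PST: Exchange workload, default FilePath, primary mailbox."""
    rows = []
    for pst_name, mailbox in pst_to_mailbox.items():
        rows.append({'Workload': 'Exchange', 'FilePath': '', 'Name': pst_name,
                     'Mailbox': mailbox, 'IsArchive': 'FALSE',
                     'TargetRootFolder': target_root, 'ContentCodePage': '',
                     'SPFileContainer': '', 'SPManifestContainer': '',
                     'SPSiteUrl': ''})
    return rows

buf = io.StringIO()  # in a real run, open('mapping.csv', 'w', newline='')
writer = csv.DictWriter(buf, fieldnames=HEADER)
writer.writeheader()
writer.writerows(build_mapping_rows({
    'SteveTest1.pst': 'steve.test1@x500.co.uk',  # illustrative addresses
    'SteveTest2.pst': 'steve.test2@x500.co.uk',
}))
csv_text = buf.getvalue()
print(csv_text)
```

Writing through csv.DictWriter keeps the column order fixed and the empty columns in place, which matters because the import service matches values by position.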

Mapping options…

Workload: this is always Exchange

FilePath: if you’ve uploaded the PST files into the default ingestiondata folder (i.e. you didn’t specify a subfolder in the /Dest parameter), this value can be left blank.  If you did specify a subfolder, enter it here.

Name: the name of the PST file, again confirm it using Azure Storage Explorer.

Mailbox: the email address of the Exchange Online mailbox where the PST file is going to be imported into.  If the mailbox is inactive, you need to specify the GUID of the mailbox.

IsArchive: FALSE if the import is into a mailbox, TRUE if it’s going into the archive mailbox.  The default if you leave it blank is the mailbox.

TargetRootFolder: which folder the PST file should be imported into.  In my example, I’m specifying a folder called ‘/ImportedArchive’.  The folder will be created under the root of the mailbox if it doesn’t already exist.

If I wanted the PST contents merged straight into the existing folder structure at the root of the mailbox (so items from the PST’s Inbox land in the mailbox’s Inbox, and so on), I’d specify “/”.  If I don’t enter anything, the contents are imported into a folder called ‘Imported’ at the root of the mailbox.

ContentCodePage: specifies the numeric code page to use when importing ANSI PST files.  This mainly applies to PST files from Chinese, Japanese, and Korean (CJK) organisations, which typically use a double byte character set (DBCS) for character encoding.  Leave it blank for Unicode PST files, or specify a value such as 932 (the code page identifier for ANSI/OEM Japanese).

The remaining three columns aren’t applicable as they relate to SharePoint Online only: SPFileContainer, SPManifestContainer, and SPSiteUrl.
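Since the portal rejects a mapping file that fails validation, it can save a round trip to sanity-check the CSV locally first.  A rough sketch of some basic checks, assuming the header row shown above (this is not the portal's actual validation logic, just a pre-flight check):

```python
import csv
import io

# The expected header row, in order.
REQUIRED_HEADER = ['Workload', 'FilePath', 'Name', 'Mailbox', 'IsArchive',
                   'TargetRootFolder', 'ContentCodePage', 'SPFileContainer',
                   'SPManifestContainer', 'SPSiteUrl']

def check_mapping_csv(text):
    """Return a list of problems found; an empty list means the checks passed."""
    problems = []
    reader = csv.DictReader(io.StringIO(text))
    if reader.fieldnames != REQUIRED_HEADER:
        problems.append('header row does not match the expected columns')
        return problems
    for i, row in enumerate(reader, start=2):  # line 1 is the header
        if row['Workload'] != 'Exchange':
            problems.append(f'line {i}: Workload must be Exchange')
        if not row['Name'].lower().endswith('.pst'):
            problems.append(f'line {i}: Name should be a .pst file')
        if row['IsArchive'] not in ('', 'TRUE', 'FALSE'):
            problems.append(f'line {i}: IsArchive must be TRUE, FALSE, or blank')
        if not row['Mailbox']:
            problems.append(f'line {i}: Mailbox is empty')
    return problems

sample = ('Workload,FilePath,Name,Mailbox,IsArchive,TargetRootFolder,'
          'ContentCodePage,SPFileContainer,SPManifestContainer,SPSiteUrl\n'
          'Exchange,,SteveTest1.pst,steve.test1@x500.co.uk,FALSE,/ImportedArchive,,,,\n')
print(check_mapping_csv(sample))  # prints [] (no problems found)
```

A check like this won't catch everything the service validates (e.g. whether the mailbox actually exists), but it catches the easy typos before you upload the file.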

Create a PST Job

I’ve already got a PST job ready to go (created as part of the initial setup).  If you haven’t got one, create a new Import Job.

Check the I’m done uploading my files and I have access to the mapping file boxes.  Click Next.

ImportStart1

Click + Select mapping file.

ImportStart2

Browse to the CSV file.  Highlight the CSV file, click Open.

ImportStart3

Click Validate.

ImportStart4

The CSV file has to pass validation to continue.  Here the CSV file name has changed from red to green text, indicating validation was successful.  If it had failed, you’d get a View log link.  Click Save.

ImportStart5

Click Close.

ImportStart6

You’ll now get a pop-out status window.  The initial status is Analysis in progress.  Click Close.

ImportStart7

Click Refresh to update the status. 

ImportStart8

Refresh again; 1 of 2 files is now complete.

ImportStart9

When the status changes to Ready to import to Office 365, click on the link.

ImportStart10

Here you see the status of the PST file imports.  Both are complete.  Click Import to Office 365.

ImportStart11

Filter & Start the PST Import Job

When the PST files are analysed, Office 365 looks at the age of the messages in the PST files.  You have the option to import all of the data in the PST files, or specify filters to control what gets imported and reduce the imported data volume.

Here I want to import everything.  Next time I do this, if I need to apply filters, I’ll write it up.

Check No, I want to import everything.  Click Next.

ImportStart12

Now I get confirmation that 12.02 GB of data will be imported.  Click Import data.

ImportStart13

Click Close.

ImportStart14

Click Refresh to update the status information.

ImportStart16

If you click on the job, you’ll get a pop-out window showing detailed information for each file.

ImportStart15

The import job is complete.  Click Close.

ImportStart18

Click View log for detailed information.

ImportStart19
