High Volume Mailbox Moves

One of the challenges of planning your migration to Office 365 is determining how fast you can go. Office 365 migration performance and best practices covers some great information, but I'll add to it here based on my experience with real-world projects.

Spoilers Ahead

One of my most recent engagements is wrapping up and I have done some analysis on the summary statistics. Note this was a move from a legacy dedicated version of Office 365, so the throughput can be a bit higher than coming from an on-premises Exchange deployment. On average (throwing out the high and low values) we moved about 3,000 mailboxes per week. One of the most impressive things about this migration was that it also included a deployment of Office Pro Plus. There were only a couple of months for planning, and deploying to over 30,000 workstations with very little impact to the helpdesk was a great surprise.

On another project, we have just started pilot migrations from on-premises Exchange 2010 servers. Initially, we saw fairly limited performance when routing traffic through the typical network infrastructure (e.g. a hardware load balancer). When we changed the configuration, we more than doubled our throughput, and with continued tuning our last test came in at over 50 GB/hr (our initial test was closer to 4 GB/hr). Not too bad!

Migration Architecture

How did we get this speed boost? In a typical architecture, mail access (in this case the /EWS endpoint) happens over HTTPS (443) from the client to the hardware load balancer (HLB). You may have a reverse proxy in front of the HLB, and you may have an additional interior firewall. Some customers do not allow external access to the EWS virtual directory, but it is required as part of establishing hybrid connectivity with Office 365.

[Diagram: MRS traffic following the existing client data path through the reverse proxy, HLB, and interior firewall to the Exchange servers]

You may simply reuse the same endpoint for the MRS traffic. In this case your mailbox migrations will follow the same data path as the rest of your traffic. A few additional constraints need to be met: a publicly signed certificate, and the SSL traffic cannot be bridged (decrypted and re-encrypted). If you meet this bar, then this design will meet the minimal requirements for MRS (a quick way to verify this is sketched below). However, it may not perform very well because of the many layers of infrastructure the traffic traverses, and it may eat into the total available bandwidth of those devices. Creating 1:1 MRS endpoints is a way to bypass all of this infrastructure and ramp up throughput.
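
Before queuing any mailboxes, it is worth confirming that whichever endpoint you choose actually meets the MRS requirements. A minimal sketch from Exchange Online PowerShell is below; the host name mail.rosenlabs.com is just an example, and the credentials are an on-premises account with migration rights.

# Prompt for the on-premises credentials MRS will use
$creds = Get-Credential

# Confirm Exchange Online can reach the remote move (MRS proxy) endpoint
Test-MigrationServerAvailability -ExchangeRemoteMove `
    -RemoteServer mail.rosenlabs.com `
    -Credentials $creds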

[Diagram: dedicated 1:1 MRS endpoints, each DNS name resolving directly to a specific Exchange server and bypassing the shared infrastructure]

In this example, three new DNS names are created, each resolving to a specific server. The firewall must allow traffic to the on-premises servers only from Exchange Online (see Office 365 URLs and IP address ranges). The certificate with the additional MRS names will have to be redeployed to all the infrastructure (e.g. the HLB) and the Exchange servers (unless you are using a wildcard certificate, e.g. *.rosenlabs.com). Now when you create migration requests you can spread them across the endpoints. For most customers, the ACL on the firewall is enough security to allow this configuration, at least for the duration of the mailbox migrations.
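
To illustrate, here is a minimal sketch from Exchange Online PowerShell of creating the dedicated endpoints and pointing a migration batch at one of them. The mrs1/mrs2/mrs3.rosenlabs.com names, the delivery domain, and the CSV path are placeholders for your own values.

# Create one migration endpoint per dedicated MRS host name
$creds = Get-Credential
'mrs1','mrs2','mrs3' | ForEach-Object {
    New-MigrationEndpoint -ExchangeRemoteMove `
        -Name "MRS-$_" `
        -RemoteServer "$($_).rosenlabs.com" `
        -Credentials $creds
}

# Queue a batch of mailboxes against a specific endpoint
New-MigrationBatch -Name "Wave1-mrs1" `
    -SourceEndpoint "MRS-mrs1" `
    -TargetDeliveryDomain "rosenlabs.mail.onmicrosoft.com" `
    -CSVData ([System.IO.File]::ReadAllBytes("C:\Migrations\wave1.csv")) `
    -AutoStart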

Other Considerations

There is always a bottleneck in the system; the question is whether you hit it before you achieve the velocity you would like. I work with customers to walk through every point in the data flow and identify where the bottleneck will be. In the original architecture above, the first bottleneck is nearly always the HLB, either because of its network connection or the load it is already under. After that, the source Exchange servers tend to be unable to keep up and cause migration stalls. Also be aware of things like running backups that can severely impact resources. Finally, other items, like day-after helpdesk support capacity or network downloads (the OAB, or perhaps changing your offline cache value), may also limit your velocity. MRS migrations usually have a very low failure rate, but ancillary things like mobile clients that are coupled with the migration need to be considered too.
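
One way to see which bottleneck you are hitting is to watch the move requests themselves while a wave is running. The sketch below uses the standard move request cmdlets; the StalledDueTo* status details point at the resource that is holding things up.

# Summarize move requests by status detail (look for StalledDueTo* values)
Get-MoveRequest | Get-MoveRequestStatistics |
    Group-Object StatusDetail |
    Sort-Object Count -Descending |
    Format-Table Name, Count

# Inspect per-mailbox throughput for the slowest moves
Get-MoveRequest | Get-MoveRequestStatistics |
    Sort-Object BytesTransferredPerMinute |
    Select-Object -First 10 DisplayName, StatusDetail, PercentComplete, BytesTransferredPerMinute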

Office 365 Consumption Analytics

One of the great things about Office 365 is that you get rich telemetry data to understand the actual consumption of your tenant. The Office 365 admin portal has built-in usage reports that give some quick high-level stats (e.g. emails sent/received, OneDrive for Business storage consumed, the number of Office activations, etc.). But what if you want to slice the data differently, like by department or other properties? The Office 365 Adoption pack, currently in preview, is a Power BI solution that is customizable to your organization's specific needs.

Office 365 Adoption Pack Installation

The overall steps are detailed in this post; I'll walk through a summary of them here. First, you must enable the usage reports to work with Power BI. To do this, open the Admin portal (portal.office.com) and open the Admin center. Under Reports > Usage, you will see a section for enabling the content pack, shown below in the bottom-right pane.

[Screenshot: Usage reports page in the Admin center, with the Power BI content pack activation pane at the bottom right]

Once you click the 'Get started' button, it will take a while (between 2 and 48 hours) before you can move on. Eventually you will see that your data is ready, along with your tenant ID (needed for a later step).
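
If you would rather grab the tenant ID from PowerShell than copy it from the page, a quick sketch using the AzureAD module (an assumption about tooling; the value shown on the Usage page works just as well) looks like this:

# Requires the AzureAD module: Install-Module AzureAD
Connect-AzureAD

# The tenant's ObjectId is the tenant ID the content pack asks for
(Get-AzureADTenantDetail).ObjectId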

[Screenshots: content pack status showing the data is ready, and the tenant ID to use during setup]

At this point the reports will not have anonymized data. If anonymization is required, open Settings > Services & add-ins from the left nav menu in the Admin center. Find Reports and you can toggle on anonymous identifiers instead of names on all reports. This applies to all usage reports, not just the Adoption pack.

Now that you have the infrastructure configured, you need to set up the Power BI dashboard.

Configuring the Office 365 Adoption Content Pack

There are several ways to install the content pack, but I'm going to highlight one deployment option here. In this scenario, I will create an app workspace and configure the content pack there. The benefit of this model is that it makes the deployment independent of a specific user account. I could have deployed the content pack into my own workspace and shared it as needed, but what would happen if I left the company or changed job roles? Sharing would break for everyone. Note that for each of these deployment scenarios you will need to check that everyone is properly licensed. At the time of writing, all internal users need a Power BI Pro or greater license.

Open up Power BI from the app selector or from the Usage report page. Under Workspaces, there is a button to create a new workspace. Behind the scenes this creates an Office 365 group. There are several configuration options, such as allowing members to have edit rights. Open the workspace and, under Microsoft AppSource, click 'Get' under Services.

Search for the Office 365 Adoption Preview and click 'Get it now'.

There can only be one deployment of the content pack per organization. You will need to input the tenant ID (from the earlier step), and it will then have you authenticate and start importing the data. Once loaded, you can interact with the usage data and drill down to see the nitty-gritty details. Other members can view and interact with the data as well.

That should get you running with the out-of-box dashboards and reports. In another post I may show some neat things you can do to extend the capabilities. In the meantime, for more information on how to customize the solution, check out this web page.

(Dis)Organization

Ever since I can remember, I've had challenges with organization, and I've tried a lot of 'gimmicks' to figure out something that would help. I tried paper calendars, sticky notes, virtual sticky notes, OneNote, Evernote, Wunderlist, Planner... nothing really worked for me. I needed something that was easy and accessible. Wunderlist came very close since it has a mobile client and a fairly simple model for tracking tasks. But I mainly work in Outlook, and it just felt very disconnected.

I decided to write a modern plug-in for it, only to find out that Wunderlist was getting an official add-in. This was better, but it still didn't feel very integrated, and it was slow.

Enter Microsoft To-Do

Microsoft To-Do is the eventual replacement for Wunderlist (brought to you by the same team). What makes it special is that it fully integrates with Exchange Online tasks in Office 365. This means I can manage my tasks in Outlook and they will surface in To-Do, and vice versa. I can use the Quick Steps feature in Outlook to create a task from an email, or simply drag an email to the task icon. If I'm mobile, the Android To-Do app makes it easy to quickly add a new task as well. My routine still needs some work, but I set aside the first hour of my day to use To-Do to plan my day. 'My Day' lets you prioritize tasks with intelligent suggestions. The best part is the rewarding 'ding' sound when you complete a task.

Note that this is still in preview and must be enabled on your Office 365 tenant. The instructions can be found here, and there is some fine print to be aware of before enabling the feature. Once enabled, you can control who can use it through licenses at the user object level.
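
As a sketch of that license-level control using the MSOnline module: the SKU and the To-Do service plan name below are examples (plan names vary by SKU), so check Get-MsolAccountSku for the exact values in your tenant.

# Requires the MSOnline module: Install-Module MSOnline
Connect-MsolService

# List SKUs and their service plans to find the To-Do plan name
Get-MsolAccountSku | Select-Object AccountSkuId, ServiceStatus

# Example: turn off the To-Do plan for one user while leaving the rest of the SKU enabled
$options = New-MsolLicenseOptions -AccountSkuId "rosenlabs:ENTERPRISEPACK" -DisabledPlans "BPOS_S_TODO_2"
Set-MsolUserLicense -UserPrincipalName user@rosenlabs.com -LicenseOptions $options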

I encourage you to check it out – let me know what you think in the comments.

MyAnalytics

One of the greatest benefits of Office 365 (in my humble opinion) is that having all your data in the cloud unlocks new capabilities. Easier sharing and data insights are a couple of examples. One of the early features that took advantage of this centralized information store is Delve, an application that lets you see what others are working on (documents) and related activity feeds. Delve also provided some analytics, Delve Analytics, and that feature has evolved over time into 'MyAnalytics'.

On a weekly basis I get an email with highlights of my week (a real example is shown below), and it is also surfaced in my Outlook client. At first, my reaction was the same as a lot of my customers: turn this off. However, as the feature matured and I spent some time reviewing it and understanding its value, I completely changed my mind.

[Screenshot: weekly MyAnalytics summary email]

“If it can be measured, it can be fixed”

-Lots of people

While there is a lot of data, here are the top two things that jumped out at me. The first is rethinking how I do things. One challenge is understanding and drawing conclusions from the raw data.

[Chart: MyAnalytics estimate of time spent working during meetings]

Working during meetings probably means I'm not really paying attention to the meeting. I've been guilty on more than one occasion of not paying attention; someone calls on me and I have to ask them to repeat their question. Maybe I'm not really required to be in the meeting, and I can use this as an opportunity to rethink which meetings I actually accept.

A second example, which probably hits home for many of you, is email overload. I do a lot of email.  This graphic is one snip from my data.  There is a great new feature that breaks this down by people, but to protect the innocent I won’t show it here.

[Chart: MyAnalytics breakdown of email activity by hour, showing after-hours spikes]

This may look like a familiar pattern to some of you: once the family goes to bed, work can begin. The system also gives you some strategies for changing your behavior.

Protecting your data in Office 365

Having a hard time understanding the data protection technologies in Office 365? This post addresses service keys, BYOK, and HYOK.

The old adage "If you are confused about something, then someone else probably is too" applies to this post. I am working with several customers who want to understand their options when it comes to securing their data in Office 365.

There are multiple technologies available, and it's important to understand the scenarios and risks you are trying to protect against so you can apply the right controls. To better understand how Office 365 protects customer data in a multi-tenant (shared) platform, take a look at Scott Schnoll's post that makes it easy to view material in the risk assurance library. This site links to an interactive PowerPoint that, if you select Encryption, links to a whitepaper. There is a section titled 'Office 365 Service Encryption' that explains how data at rest and in transit is protected for the various workloads (e.g. SharePoint Online, Skype for Business Online, Exchange Online). The data-at-rest protection acts as a security boundary and is analogous to drive encryption (like BitLocker). Azure RMS, by contrast, addresses data both at rest and in transit. In summary, the protection features I'm discussing are service encryption (with Customer Key) and Azure Rights Management (Azure Information Protection).
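
To make the Azure RMS side concrete, here is a small sketch from Exchange Online PowerShell that applies protection to mail in transit. It assumes Azure RMS is already activated for the tenant, and the rule name and condition are just examples.

# Enable Azure RMS for Exchange Online IRM features (one-time setup)
Set-IRMConfiguration -AzureRMSLicensingEnabled $true

# Automatically apply the Do Not Forward template to matching messages
New-TransportRule -Name "Protect payroll mail" `
    -SubjectOrBodyContainsWords "payroll" `
    -ApplyRightsProtectionTemplate "Do Not Forward"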

The FAQ ‘Customer Key for Office 365 FAQ‘ summarizes the difference between the two very well:

Both options enable you to provide and control your own encryption keys; however, service encryption with Customer Key encrypts your data at rest, residing in Office 365 servers at-rest, while BYOK with Azure Information Protection for Exchange Online encrypts your data in-transit and provides persistent online and offline protection for email messages and attachments for Office 365. Customer Key and BYOK with Azure Information Protection for Exchange Online are complementary, and whether you choose to use Microsoft’s service-managed keys or your own keys, encrypting your data at-rest and in-transit can provide added protection from malicious attacks.

Each of these features has options around key management: either Microsoft manages your keys (the default) or the customer manages the keys. It is best not to share a key across service encryption and Azure RMS, but to use distinct keys for each.

99% of the time, I highly recommend customers leave the default and have Microsoft do the key management. Managing your own keys is not trivial, and if done incorrectly you risk losing access to all your tenant data, which Microsoft will not be able to fix.
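
For those who do go down the Customer Key path, the Exchange Online side is managed through data encryption policies. A minimal sketch is below; the Azure Key Vault key URLs and mailbox are placeholders, and the Key Vault and subscription setup described in the Customer Key documentation must already be in place.

# Create a data encryption policy that references two customer-controlled Azure Key Vault keys
New-DataEncryptionPolicy -Name "Rosenlabs Customer Key Policy" `
    -AzureKeyIDs "https://rl-vault1.vault.azure.net/keys/key1","https://rl-vault2.vault.azure.net/keys/key2"

# Assign the policy to a mailbox
Set-Mailbox -Identity user@rosenlabs.com -DataEncryptionPolicy "Rosenlabs Customer Key Policy"

# Check which policy a mailbox is using
Get-MailboxStatistics -Identity user@rosenlabs.com | Format-List DataEncryptionPolicyID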

HYOK vs BYOK

Now that we are clear on the difference between service keys (Customer Key) and RMS, let's look at the multiple options for RMS. (I call these *YOK: star-your-own-key.)

BYOK is simply using the Azure RMS (AIP) service with customer-managed keys. HYOK is an architecture that integrates your on-premises AD RMS with Azure RMS. The benefit of HYOK is that, for some data, you keep it out of the cloud and use your OWN AD, your OWN RMS server, and your OWN HSM. Sounds great, but there are some important considerations, which are well documented in Azure Information Protection with HYOK (Hold Your Own Key).

To summarize: 99% of the time customers should use the default, Microsoft-managed RMS keys. If you need your own keys, 99% of the time you should go with the BYOK architecture.

Why?

When I go down this path with a customer who wants to manage their own keys, I ask what they are trying to achieve. The main reason for managing your own keys is that, should you ever leave the service, you can ensure that no one has any access to your data (by revoking the key). While you are in the service, Microsoft still needs and has access to the data; otherwise features like search and indexing would not work. Given the added complexity and the potential for catastrophic data loss, this is a decision that needs careful consideration and a clear understanding of why a customer should go down this path.

Note: edited to remove link to outdated whitepaper.