Introduction To Data Loss Prevention in SharePoint 2016

Everywhere your data exists, moves or is shared, you need to protect it. With a Data Loss Prevention (DLP) policy in SharePoint Server 2016, you can identify, monitor, and automatically protect sensitive information across your site collections.  Learn the basics of DLP and how you can start better protecting your data.

What is Data Loss Prevention?

  • Data loss prevention (DLP) is a strategy for making sure that end users do not send sensitive or critical information outside the corporate network.
  • DLP software products help a network administrator control what data end users can transfer so that users cannot accidentally or maliciously share data that could put the organization at risk.
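
That monitoring usually starts with content inspection: pattern matching plus validation. Below is a minimal sketch of the idea (illustrative only, not SharePoint's actual DLP engine) that flags credit-card-looking numbers using a regex plus the Luhn checksum:

```python
import re

# Runs of 13-16 digits, optionally separated by spaces or dashes.
CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_valid(number: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    digits = [int(d) for d in reversed(number)]
    total = sum(digits[0::2])          # every other digit, starting from the end
    for d in digits[1::2]:             # remaining digits are doubled
        d *= 2
        total += d - 9 if d > 9 else d
    return total % 10 == 0

def find_card_numbers(text: str) -> list[str]:
    """Flag substrings that look like card numbers and pass the Luhn check."""
    hits = []
    for match in CARD_PATTERN.finditer(text):
        digits = re.sub(r"[ -]", "", match.group())
        if luhn_valid(digits):
            hits.append(digits)
    return hits
```

The checksum step is what separates real card numbers from arbitrary digit runs (phone numbers, order IDs), which is why DLP engines pair patterns with validators to cut false positives.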

I recently created a presentation for a Lunch & Learn at my company, AvePoint, and decided to share the slides:

The information comes primarily from Microsoft’s documentation, with a couple of slides based on information from Vlad Catrinescu’s presentation at Ignite 2016.

I am working on adding information about the differences between SharePoint 2016 and Office 365 and will update this post with the new slides when they are complete.

I hope you find it informative.

Stopping Mass Downloads in Office 365

Recently Microsoft has been focusing a lot of effort on security and compliance when it comes to information management.  In an effort to make a one-stop portal, they have created the Security & Compliance portal for Office 365.


It can be accessed in multiple ways: through the icon on the Waffle menu in the upper left-hand corner as pictured above, or through the Admin Center:


Or you can go straight to the URL: 

From there you can take care of your security needs: set alerts, manage permissions, set your data loss prevention policies, and many other things.  You can visit for complete details, and I will be following up with posts detailing more of its features.  The focus of this post is one particularly useful new feature I discovered that is now available: the ability to alert when someone is doing mass downloads and to suspend that user.

While a lot of governance and compliance focuses on securing information and limiting access to data, a major problem arises when seemingly good or trusted people turn into bad actors and download things that the company wouldn’t want them to (think Snowden and Wikileaks).

To enable this feature, you must have an Office 365 E5 license, because it is an Advanced Security Management feature.  If you have this license, you can go to the Security & Compliance Center –> Manage advanced alerts and click on the Go to Advanced Security Management button:


This will take you to the Policies screen:

Click on Create Policy –> Active policy:

From the Policy template, choose Mass download by a single user and fill out the form.  For example, flag anyone who downloads 30 items in 5 minutes:

You can have the policy send an alert email to your administrator and suspend the user until the admin has time to evaluate the situation and decide whether the user has a legitimate business reason for downloading so many files.
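
The detection logic behind such a policy can be sketched as a sliding-window counter per user (purely illustrative, not Microsoft's implementation), using the 30-items-in-5-minutes threshold from the example above:

```python
from collections import defaultdict, deque

WINDOW_SECONDS = 5 * 60   # 5-minute sliding window
THRESHOLD = 30            # downloads allowed inside the window

class MassDownloadDetector:
    def __init__(self, window: int = WINDOW_SECONDS, threshold: int = THRESHOLD):
        self.window = window
        self.threshold = threshold
        self.events = defaultdict(deque)   # user -> recent download timestamps

    def record_download(self, user: str, timestamp: float) -> bool:
        """Record one download event; return True if the user should be flagged."""
        q = self.events[user]
        q.append(timestamp)
        # Evict events that have fallen out of the window.
        while q and timestamp - q[0] > self.window:
            q.popleft()
        return len(q) >= self.threshold

detector = MassDownloadDetector()
# 35 downloads within seconds of each other trips the threshold.
flagged = any(detector.record_download("alice", float(t)) for t in range(35))
```

A real service would feed this from the audit log stream and trigger the alert-and-suspend action when `record_download` returns True.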


After you hit Create, you will see your policy appear on the Policies screen:


This is a great new feature to help stop the loss of data from your organization, and just one of many useful security options that Microsoft has released.  It will be exciting to see future features and enhancements in the area of Security & Compliance.

DevOps in the ‘Burb

Those that know me know I am always looking to learn new things and meet new people who share my passion for learning and sharing information in the tech world.  In the past I have frequented several user groups, organized a SharePoint group, and organized an Azure group.  I co-chair and am organizing my 4th SharePoint Saturday Chicago Suburbs, and I co-chair and am in the process of organizing my 3rd Cloud Saturday.

One of the biggest trends in the IT world has been the evolution of DevOps.  There are a lot of thoughts around what DevOps is, but I kind of like Amazon’s definition:

DevOps is the combination of cultural philosophies, practices, and tools that increases an organization’s ability to deliver applications and services at high velocity: evolving and improving products at a faster pace than organizations using traditional software development and infrastructure management processes.

In an attempt to broaden my knowledge of DevOps and help develop a DevOps community in the Chicago area, I have co-founded the “DevOps in the Burbs” user group.  We meet the first Thursday of the month, and so far the first two meetings have been really enjoyable, with a lot of good conversations.  The group is founded on the thought that:

Delivering high-quality, modern applications requires DevOps tools and processes. This users group is for people who want to learn and discuss the advances being made to the application life-cycle management offerings that enable development teams to be more productive. We will be talking about AWS, Azure, Docker, Chef, and all the other great platforms, tools, and approaches driving the DevOps movement.

Our next meeting is February 2nd, 2017.  If you are in the Naperville area, stop by and see us.



70-532 Developing Microsoft Azure Solutions

In order to help my own personal development and further my understanding of Azure, I decided to take the Azure Developer certification.  I know some people don’t place much value in certs, but I think they are a great way to motivate oneself to learn areas of a topic you might not learn otherwise.  Besides, I had gotten a free voucher for the exam, so what could it hurt?

My only doubt about taking it was that Azure had changed a lot since the exam was rolled out.  Would it still be relevant?  Much to my surprise, Microsoft had recently revised, revamped and republished the exam last month (Nov. 2016).  I am encouraged that they are keeping it up to date and look forward to taking it.

For those that don’t know it, below is the Exam Outline.  I will be updating with notes and links as I progress through the learning process.

Create and manage Azure Resource Manager Virtual Machines (30‒35%)

  • Deploy workloads on Azure Resource Manager (ARM) Virtual Machines (VMs)
    • Identify workloads that can and cannot be deployed; run workloads, including Microsoft and Linux; create VMs
  • Perform configuration management
    • Automate configuration management by using PowerShell Desired State Configuration and VM Agent (custom script extensions); configure VMs using a configuration management tool, such as Puppet or Chef; enable remote debugging
  • Configure ARM VM networking
    • Configure static IP addresses, Network Security Groups (NSG), DNS, User Defined Routes (UDRs), external and internal load balancing with HTTP and TCP health probes, public IPs, firewall rules, and direct server return; design and implement Application Gateway
  • Scale ARM VMs
    • Scale up and scale down VM sizes, deploy ARM VM Scale Sets (VMSS), configure ARM VMSS auto-scale
  • Design and implement ARM VM storage
    • Configure disk caching, plan for storage capacity, configure shared storage using Azure File service, configure geo-replication, implement ARM VMs with Standard and Premium Storage
  • Monitor ARM VMs
    • Configure ARM VM monitoring, configure alerts, configure diagnostic and monitoring storage location
  • Manage ARM VM availability
    • Configure multiple ARM VMs in an availability set for redundancy, configure each application tier into separate availability sets, combine the Load Balancer with availability sets

Design and implement a storage and data strategy (25‒30%)

  • Implement Azure Storage blobs and Azure files
    • Read data, change data, set metadata on a container, store data using block and page blobs, stream data using blobs, access blobs securely, implement async blob copy, configure Content Delivery Network (CDN), design blob hierarchies, configure custom domains, scale blob storage
  • Implement Azure storage tables and queues
    • Implement CRUD with and without transactions, design and manage partitions, query using OData, scale tables and partitions, add and process messages, retrieve a batch of messages, scale queues
  • Manage access and monitor storage
    • Generate shared access signatures, including client renewal and data validation; create stored access policies; regenerate storage account keys; configure and use Cross-Origin Resource Sharing (CORS); set retention policies and logging levels; analyze logs
  • Implement Azure SQL Databases
    • Choose the appropriate database tier and performance level, configure and perform point-in-time recovery, enable geo-replication, import and export data and schema, scale Azure SQL databases
  • Implement Azure DocumentDB
    • Create databases and collections, query documents, run DocumentDB queries
  • Implement Redis caching
    • Choose a cache tier, implement data persistence, implement security and network isolation, tune cluster performance
  • Implement Azure Search
    • Create a service index, add data, search an index, handle search results
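
The storage objectives above lend themselves to small experiments.  As one example, “generate shared access signatures” boils down to an HMAC-SHA256 signature over a canonical string-to-sign, keyed with the storage account key.  The sketch below is deliberately simplified (the real Azure string-to-sign has more fields in a fixed order, and the parameter names here only mimic the SAS query keys); it illustrates the signing scheme, not the exact service format:

```python
import base64
import hashlib
import hmac
from urllib.parse import urlencode

def make_sas_token(account_key_b64: str, resource_path: str,
                   permissions: str, expiry: str) -> str:
    """Build a simplified SAS-style token: sign a canonical string with
    the account key and return the signed query parameters."""
    # Canonical string-to-sign (simplified; Azure's has more fields).
    string_to_sign = "\n".join([permissions, expiry, resource_path])
    key = base64.b64decode(account_key_b64)
    sig = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return urlencode({
        "sp": permissions,                        # granted permissions
        "se": expiry,                             # expiry time
        "sig": base64.b64encode(sig).decode(),    # the HMAC signature
    })
```

Because the server recomputes the same HMAC with its copy of the key, any tampering with the permissions or expiry invalidates the signature, which is what makes SAS tokens safe to hand to clients.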

Manage identity, application, and network services (15‒20%)

  • Integrate an app with Azure Active Directory (Azure AD)
    • Develop apps that use WS-federation, OAuth, and SAML-P endpoints; query the directory using Graph API
  • Design and implement a communication strategy
    • Implement hybrid connections to access data sources on-premises, leverage site-to-site (S2S) VPN and ExpressRoute to connect to an on-premises infrastructure
  • Design and implement a messaging strategy
    • Develop and scale messaging solutions using service bus queues, topics, relays, and notification hubs; monitor service bus queues, topics, relays, event hubs, and notification hubs
  • Develop apps that use Azure AD B2C and Azure AD B2B
    • Design and implement .NET MVC, Web API, and Windows desktop apps that leverage social identity provider authentication, including Microsoft account, Facebook, Google+, Amazon, and LinkedIn; leverage Azure AD B2B to design and implement applications that support partner-managed identities

Design and implement Azure PaaS compute and web and mobile services (25–30%)

  • Design Azure App Service Web Apps
    • Define and manage App Service plans; configure Web Apps settings, certificates, and custom domains; manage Web Apps by using the API, Azure PowerShell, and Xplat-CLI; implement diagnostics, monitoring, and analytics; implement web jobs; design and configure Web Apps for scale and resilience
  • Implement Azure Functions
    • Create Azure Functions, implement a webhook function, create an event processing function, implement an Azure-connected function
  • Implement API management
    • Create managed APIs, configure API management policies, protect APIs with rate limits, add caching to improve performance, monitor APIs, customize the Developer Portal
  • Design Azure App Service API Apps
    • Create and deploy API Apps, automate API discovery by using Swashbuckle, use Swagger API metadata to generate client code for an API app, monitor API Apps
  • Develop Azure App Service Logic Apps
    • Create a Logic App connecting SaaS services, create a Logic App with B2B capabilities, create a Logic App with XML capabilities, trigger a Logic App from another app, create custom and long-running actions, monitor Logic Apps
  • Develop Azure App Service Mobile Apps
    • Create a Mobile App, add offline sync to a Mobile App, add authentication to a Mobile App, add push notifications to a Mobile App
  • Design and implement Azure Service Fabric apps
    • Create a Service Fabric application; build an Actors-based service; add a web front end to a Service Fabric application; monitor and diagnose services; migrate apps from cloud services; create, secure, upgrade, and scale Service Fabric Cluster in Azure; scale a Service Fabric app


5 Challenges with a FastTrack Migration

FastTrack is a collection of resources, tools, and a team of hundreds of engineers around the globe, committed to ensuring successful Office 365, Enterprise Mobility + Security (EMS), and/or Azure experiences for IT professionals and partners.  Microsoft created the FastTrack program to ensure that its customers could get up and running with their new services as quickly as possible.

This sounds great: Microsoft helps you get up and running as soon as possible and moves your data for you!  This may be a great fit for you and your organization, but there are some scenarios where it might not be the best fit for you and your data:

  1. You have special requirements as to where your data can go!  Many organizations can’t have data leave the country it resides in.  Currently, data migrated through the FastTrack services may be transferred to, stored, and processed in the United States or anywhere that Microsoft or its third-party suppliers maintain facilities (except as otherwise provided for your particular FastTrack engagement).  The FastTrack services aren’t designed or intended for data subject to special legal or regulatory requirements.
  2. No full-fidelity migration capabilities.  While FastTrack does a good job of migrating your base data, it doesn’t move all of the metadata and other settings associated with those files.  What won’t migrate:
    • Files larger than 2 GB
    • Sharing
    • Ownership history
    • Previous versions
    • Windows file and folder attributes (for example, Read Only, Hidden)
    • Windows New Technology File System (NTFS) special permissions and advanced settings
    • Corrupted and inaccessible documents
    • Conversion of embedded URLs/links in source documents
    • Files of certain types or files that exceed Office 365 service limits
    • Windows NTFS Auditing Configuration
    • Windows Advanced NTFS permissions
    • Additional file metadata provided by File Classification Infrastructure (FCI)

    The only way to move documents with full fidelity is to use a third-party tool.

  3. No content reorganization.  A FastTrack migration is a “lift and shift”: the data is “lifted” from the source environment and “shifted” to Office 365, SharePoint Online, or OneDrive for Business.  The problem is that there is no way to reorganize the data.  Often during such an upgrade or transition to a new environment, organizations want to reorganize their data and look to put governance rules and policies in place to better manage it.  These are not things that FastTrack helps with; it is only concerned with getting your data moved so that you can begin using the new environment the same way as the old.
  4. No mechanism to check for new sensitive data created or imported.  FastTrack does not have the ability to search the source environment for sensitive information prior to the migration.
  5. FastTrack only migrates data.  Many companies have spent valuable time and resources creating business processes, custom workflows, branding, etc. in their old environments; these are not items that will be migrated.
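
On point 4, a pre-migration sweep for sensitive data doesn’t have to be fancy: even a small script that walks the source file share and flags sensitive-looking patterns gives you a triage list before anything moves.  A rough sketch (the SSN pattern and the `.txt` filter are illustrative assumptions; a real sweep would cover more patterns and file types):

```python
import re
from pathlib import Path

# Illustrative pattern only: US SSN-style strings (###-##-####).
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def scan_share(root: str) -> dict[str, int]:
    """Walk a file share and count SSN-like hits per text file."""
    hits = {}
    for path in Path(root).rglob("*.txt"):
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue  # skip unreadable files
        count = len(SSN_PATTERN.findall(text))
        if count:
            hits[str(path)] = count
    return hits
```

Run against the source share, the resulting file-to-hit-count map tells you which folders need review or exclusion before the migration starts.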

SharePoint 2016 Newsfeed Not Working

A few weeks ago, I was assigned a SharePoint 2016 implementation.  Yea me!  Wanting to get a jump on the project, I quickly spun up a few Azure virtual machines, downloaded SP2016, and created a small SharePoint farm.  While I was doing all the project prep work (analysis, requirement gathering, etc.), I didn’t have much time to play with it.

Fast forward to today.  I went out and created a new site collection.  A relatively easy task, but I noticed the Newsfeed feature was displaying an error:

Something went wrong

SharePoint returned the following error: The operation failed because an internal error occurred. Internal type name: Microsoft.Office.Server.Microfeed.MicrofeedException. Internal error code: 54. Contact your system administrator for help in resolving this problem.

I quickly began researching the issue online and found a few articles pointing to the service accounts not having the correct permissions; some suggested resetting the Distributed Cache service, and a couple advised checking the User Profile Service.  Nothing worked, so I decided to check the logs (I know, I should have done that first), where I quickly found the issue:
Unexpected SPMicrofeedContext.SetMySiteHostForContext failed Microsoft.SharePoint.SPException: The trial period for this product has expired.
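
If you find yourself in a similar spot, a small script that sweeps the ULS logs for Unexpected-level entries can save you the detour I took.  A sketch; the log directory below is the SP2016 default and is an assumption, so adjust it for your install:

```python
from pathlib import Path

# Default SP2016 ULS log location (an assumption; adjust for your farm).
LOG_DIR = r"C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\LOGS"

def find_unexpected(log_dir: str = LOG_DIR, needle: str = "Unexpected"):
    """Yield (file name, line) pairs for ULS entries mentioning the needle."""
    for log_file in sorted(Path(log_dir).glob("*.log")):
        with open(log_file, errors="ignore") as f:
            for line in f:
                if needle in line:
                    yield log_file.name, line.rstrip()

# Example usage:
# for name, line in find_unexpected():
#     print(name, line)
```

Scanning for "Unexpected" first narrows thousands of ULS lines down to the handful worth reading, which is how the trial-expiration error above surfaces immediately.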
A quick trip out to MSDN and I was able to get an SP2016 license key from my subscription benefits.  I then went to Central Admin –> Upgrade and Migration –> Convert farm license type and added the license.
I then did an IIS reset, refreshed my site, and my Newsfeed was restored!
Thanks for reading and I hope this helps you out.