Radi Atanassov

SharePoint MCM, MVP, MCT and owner of OneBit Software

Testing and Simulating HTTP 429 Too Many Requests with Fiddler Extensions

This post is applicable to any scenario where you want to simulate the return of HTTP 429 Too Many Requests messages. In my case, it is applicable to SharePoint Online as I want to test code that handles 429 errors.

See the end for a link to a code sample in GitHub.

Problem and Solution

My requirement is simple - test code that can recover from SharePoint Online's throttling mechanisms.

If you are unfamiliar with request throttling in SharePoint Online, please read these:

What we don't want to do is actually stress test SharePoint Online and play with its throttling mechanism. That's just weak, and chances are we can't generate enough requests from a single developer machine to get throttled anyway. The only way to load a cloud is with another cloud, and that is expensive.

My solution isn't complicated either - use a Fiddler extension to tamper with the response packets from SharePoint Online. We let some requests through, then transform the HTTP response code and status message of the others to 429, so in code we can capture true 429s that appear to come from SharePoint Online.

My use case is to test that ExecuteQueryRetry works in OfficeDev.PnP Sites Core: https://github.com/SharePoint/PnP-Sites-Core/blob/master/Core/OfficeDevPnP.Core/Extensions/ClientContextExtensions.cs#L73 - so we can further develop the ExecuteQueryRetryAsync method as well.
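For context, the behaviour under test boils down to a retry pattern along these lines (a simplified sketch of my own, not the actual PnP implementation, which is more elaborate):

```csharp
using System;
using System.Net;
using System.Threading;

public static class RetryHelper
{
    // Retries an action when the server responds with 429 Too Many Requests,
    // backing off exponentially between attempts.
    public static void ExecuteWithRetry(Action action, int retryCount = 10, int delayMs = 500)
    {
        int attempt = 0;
        while (true)
        {
            try
            {
                action();
                return;
            }
            catch (WebException ex)
            {
                var response = ex.Response as HttpWebResponse;

                // Only retry on 429; anything else propagates to the caller.
                if (response == null || (int)response.StatusCode != 429 || ++attempt >= retryCount)
                    throw;

                Thread.Sleep(delayMs);
                delayMs *= 2; // exponential back-off
            }
        }
    }
}
```

With the Fiddler extension injecting 429s, this loop should back off and eventually succeed once the throttling window clears.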

Fiddler Extension Build and Deploy

  1. In order for Fiddler to pick up your extension, it needs to go in one of two folders:
  • %userprofile%\Documents\Fiddler2\Scripts makes the extension available to the current user. This is the "My Documents" folder.
  • %programfiles(x86)%\Fiddler2\Scripts makes the extension available to all users on the machine. This is the 32-bit Program Files folder; change it to wherever your Fiddler application is installed.
  • Your paths might differ if you have installed things differently or use a newer version.

  2. You need to build a class library with a public class that implements IFiddlerExtension. In your class library, add a reference to Fiddler.exe, straight from the folder where it is installed. Your DLL will go in the \Scripts folder relative to the Fiddler.exe you are referencing.

  3. You need to set a Fiddler.RequiredVersion assembly attribute somewhere in your code. I do it in my class that implements IFiddlerExtension. I couldn't get the extension to work without this, so I am assuming it is required.

If all of the above are in place, the Extensions tab in Fiddler Options will show that your assembly is loaded.
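Putting steps 2 and 3 together, a minimal skeleton might look like this (the class name and version number are illustrative - check your Fiddler build for the exact version to require):

```csharp
using Fiddler;

// Tells Fiddler the minimum version this extension supports;
// without this attribute the extension may not load at all.
[assembly: Fiddler.RequiredVersion("4.6.0.0")]

namespace Http429.SimulateThrottling
{
    public class ThrottlingSimulator : IFiddlerExtension
    {
        // Called once after Fiddler has finished loading all extensions.
        public void OnLoad() { }

        // Called when Fiddler is shutting down.
        public void OnBeforeUnload() { }
    }
}
```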

The following post-build command on the project automates my DLL to go in the right folder:

  • copy "$(TargetPath)" "%userprofile%\My Documents\Fiddler2\Scripts\$(TargetFilename)" /Y

Fiddler Extension Design - Throttling Simulation

You can implement the "public void AutoTamperResponseAfter(Session oSession)" method from the IAutoTamper2 interface. The passed Session object gives you access to the HTTP packet, amongst other things.

In my case, I want to allow 10 packets through within 5 seconds, then tamper with any further packets inside that moving 5-second window. I use a ConcurrentQueue collection because you typically have multiple in-flight requests, and it handles the cross-thread coordination of the counters perfectly.

In the end, I modify the HTTPResponseCode and HTTPResponseStatus properties. Note that the returned Content-Type must be "text/html" and the body must not be empty.
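The sliding-window gate itself can be isolated from Fiddler entirely. Here is a minimal sketch (class and member names are my own, not taken from the sample):

```csharp
using System;
using System.Collections.Concurrent;

// Allows up to 'limit' requests within a sliding time window;
// anything beyond that should be tampered into a 429.
public class ThrottleGate
{
    private readonly int _limit;
    private readonly TimeSpan _window;
    private readonly ConcurrentQueue<DateTime> _timestamps = new ConcurrentQueue<DateTime>();

    public ThrottleGate(int limit, TimeSpan window)
    {
        _limit = limit;
        _window = window;
    }

    public bool ShouldAllow(DateTime now)
    {
        // Drop timestamps that have fallen out of the window.
        DateTime oldest;
        while (_timestamps.TryPeek(out oldest) && now - oldest > _window)
            _timestamps.TryDequeue(out oldest);

        if (_timestamps.Count >= _limit)
            return false; // over the limit - simulate a 429

        _timestamps.Enqueue(now);
        return true;
    }
}
```

Inside AutoTamperResponseAfter, a session denied by the gate would get its response code set to 429 with status text "Too Many Requests", a "text/html" Content-Type and a non-empty body.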

   

I also do simple exclusions, as I only want to throttle *.sharepoint.com requests in my use case. Change this to suit your situation. 

Testing and Validation

Here it is in action. You can see that requests get throttled once more than 10 arrive within a 5-second window. Then, reissued requests pass through as expected. This lets me test PnP's ExecuteQueryRetry method.

Sample Code

All of the above is uploaded on GitHub here: https://github.com/OneBitSoftware/Http429.SimulateThrottling

Feel free to fix issues or add functionality, or just say hello.

Happy throttling!

Asset Bundles in Unity - 2017/2018 Edition

This post is a detailed write-up on what Asset Bundles are in Unity, what they should be used for and how they work internally.

Asset Bundles in Unity are an editor and platform feature that let you configure and deliver dynamic content to your games or apps. You can do "in-game" downloading of external content.

The key use cases for Asset Bundles are:

  • Download in-game content during the first run of the game with the goal of minimizing the initial installation size
  • Update game content by replacing the asset bundles located on a server - a very poor man's game objects patching system
  • You can deliver variants of content for different devices and runtimes - for example, differentiate between high-resolution and low-resolution assets, or Android vs Windows assets, potentially allowing you to optimize for devices
  • The Asset Bundles API lets you manage what is stored in memory for a potential manual memory management use-case
  • Separate the asset bundles from the standard build - the key reason for this could be to reduce build times

What are Asset Bundles?

  • They are containers for objects and assets. You can think of them as packages with lots of assets inside.
  • Asset Bundles store serialized versions of Unity game objects, such as scenes and everything that a scene could contain.
  • An Asset Bundle can contain an entire scene, but if it does - it can't contain anything else (in practical terms).
  • Asset Bundles are platform-specific
  • Asset Bundles use LZMA or LZ4 compression (depending on the Editor version), which can significantly reduce your build size.
  • Asset Bundles use memory. Just in case it is not obvious, the HTTP client needs to download the bundle (and dump it from memory to disk), release memory, then go about decompressing, which again consumes memory. In big projects we really have to manage this process carefully.
  • Asset Bundles are not part of the build output. You build the bundle separately and deal with the output on your own. Your typical next step is to upload the bundle to a web server (which you also manage on your own) and get your game/app to download it and store it on disk.

The entire Asset Bundles functionality is encapsulated in the following places:

  • An Asset Bundle Manager and a (quite crappy) DEV/TEST server, which are delivered through the Unity Asset Store. That's right - you have to "import" it into your game/app. It feels *very* unnatural and more like a hack, however it is understandable why it is implemented that way. It puts the asset build pipeline in total control of the developer.
  • The UnityEditor.BuildPipeline class within the UnityEditor.dll assembly. This is where Unity contains coordination logic regarding asset bundles, but the actual bundle packing mechanism is buried in the native part of Unity.

The Asset Bundle Manager

This tool is something you install/import in your game as raw .cs files from the Asset Store. Some scripts are targeted for the Editor experience, while others are examples of how you use asset bundles at runtime in your game/app.

The AssetBundle Manager introduces Editor screens to help you build Asset Bundles. This capability will go over the project hierarchy to detect AssetBundles, then build them in the project's AssetBundles folder.

Since AssetBundles internally use WWW.LoadFromCacheOrDownload, the tooling provides a local web server (a bit too simple) and a Simulation mode so you don't have to download from a web server at all.

   

Apart from the fact that the AssetBundle Manager sample doesn't build with newer versions of Unity (straight from the Asset Store, and created and maintained by the Unity team!!), the problem with it is that it is written terribly, has TODOs in the code and consumes quite a lot of resources. See below:

The main class itself is called AssetBundleManager. Although every function in it is static, it inherits from MonoBehaviour to make use of the public void Update method called on each frame. The fact that the class is not static, yet every function call is, means that you end up with a singleton-style component carrying global state.

The Initialize method must be called to load something called the AssetBundleManifest.

All calls to load an asset bundle are represented by classes that inherit the abstract class AssetBundleLoadOperation and get stored in a load queue:

static Dictionary<string, WWW> m_DownloadingWWWs = new Dictionary<string, WWW> ();

The AssetBundleManager class Update method (MonoBehaviour, fires on every frame) checks m_DownloadingWWWs and executes all pending AssetBundleLoadOperations. When done, they are removed from the queue and disposed. 

Out of all of the above, probably the only exciting thing is the m_DownloadingErrors Dictionary:

static Dictionary<string, string> m_DownloadingErrors = new Dictionary<string, string> ();

You can use that to check if your download tasks have failed.
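For comparison, the runtime pattern the manager wraps can be sketched roughly like this, using the era-appropriate WWW API directly rather than the manager (the URL, version number and asset name are illustrative):

```csharp
using System.Collections;
using UnityEngine;

public class BundleLoader : MonoBehaviour
{
    // Illustrative URL - point this at wherever your bundles are hosted.
    private const string BundleUrl = "http://localhost:7888/AssetBundles/mybundle";

    private IEnumerator LoadBundle(int version)
    {
        // Downloads the bundle, or serves it from the on-disk cache
        // if this version has been fetched before.
        using (WWW www = WWW.LoadFromCacheOrDownload(BundleUrl, version))
        {
            yield return www;

            if (!string.IsNullOrEmpty(www.error))
            {
                Debug.LogError("Bundle download failed: " + www.error);
                yield break;
            }

            AssetBundle bundle = www.assetBundle;
            GameObject prefab = bundle.LoadAsset<GameObject>("MyPrefab");
            Instantiate(prefab);

            // Unload the bundle container but keep the loaded assets in memory.
            bundle.Unload(false);
        }
    }
}
```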

Where did the Web Player Asset Bundles go?

As of version 5.3, Unity deprecated the Web Player build target - an in-browser plugin to run Unity games. During the time it still existed, around the 4.0 to 5.3 era, many build pipelines created .unity3d packages. These are Web Player packages in most cases. Some games (such as Hearthstone) use the file extension for their asset bundle build pipeline, but in essence and as far as asset bundling goes, they are asset bundles :)

You can verify this even in the documentation for WWW.LoadFromCacheOrDownload (https://docs.unity3d.com/ScriptReference/WWW.LoadFromCacheOrDownload.html):


The Asset Bundle Manager in 5.4 and later packs files without file extensions. This is fine for most use-cases, however some web file servers require extensions on files. This is why it is a good idea to give them an extension.

You can use tools such as Disunity and Unity Asset Bundle Manager to extract .unity3d files.

Asset Bundle Browser

This is another tool you install from the Asset Store:

   

Its primary purpose is to help you manage asset bundle dependencies, as well as help you build them with more control:

   

Apart from that, it is a great set of (badly?) written code samples to help you understand the Asset Bundles API better, so go check them out:

   

   

AssetBundle Graph Tool

The Unity team has also released the AssetBundle Graph Tool, a set of editor tools to improve and visualise the AssetBundle workflow and build experience.   

   

There is plenty of reading material on it, so no reason for me to repeat it:

The future of Asset Bundles

This article is written on the 30th of December, 2017 and rechecked in January. At the time of writing, AssetBundle Addressables are not yet released for Unity and the current public version is 2017.3. I'm not part of the Unity team (although I'd love to be) so everything in this heading is taken from the demos performed by the Unity team at Unite events.

  • Addressables - The long-awaited replacement of the Resources folder. The idea here is to remove/reduce the overhead of management of AssetBundles through addressable assets and a basic online hosting setup.
  • Resource Manager API - a new set of API's designed to work with Addressables to automate dependency loading in a much better way than the current AssetBundle API.   

We're expecting releases and fixes to the releases soon. I don't have an ETA for you.

Resources

The following are articles you should check out on Asset Bundles:

   

Good luck with your bundling.

   

A sample ASP.NET Core starter solution with Identity, Entity Framework 6 and a Repository Pattern Implementation

This sample solution makes use of ASP.NET Core with ASP.NET Identity over Entity Framework Core running side-by-side with Entity Framework 6. Look at this if you don't want Entity Framework Core for your data layer, but still want to use ASP.NET Identity in ASP.NET Core and benefit from its out-of-the-box implementation (which is awesome!).

Goals and inspiration

I was inspired by Mosh Hamedani's work on "Repository Pattern with C# and Entity Framework, Done Right". I think the theory that Mosh has put forward on the difference between a repository pattern and EF's DbContext is well structured and well explained. The sample code is also worth checking out.

I wanted to take it one step forward and put the pattern into an ASP.NET Core starter solution. The key point I want to demonstrate is using critical components of an ASP.NET Core MVC web application with an implementation of the repository that works side-by-side with the ASP.NET Identity 3 implementation. I wanted to make the repository asynchronous as well, because there's not many good examples out there.

The catch: ASP.NET Identity 3 uses ASP.NET Core and Entity Framework Core. That's all great, but many are not ready for EFCore and want the battle-tested Entity Framework 6. EF6 is feature packed, while EFCore is still very feature-limited.

In summary, this sample answers these questions:

  • How do you inject an EF6 context with a connection string in the ASP.NET Core service collection?
  • How do you run ASP.NET Identity on EFCore and the rest of the application on EF6?
  • How do you properly implement the Repository pattern with EF6?
  • How do you make your Repository async?
  • How do you maintain one database for both your custom entities and ASP.NET Identity?
  • How and where do you implement a Services layer?
  • How do you make your Controller methods Async?
  • How do you use Services with Controllers?
  • How do you keep the connection string in one place?

GitHub Repository: https://github.com/OneBitSoftware/AspNetCore.Ef6.Identity3.StarterSolution

The Data class library

I've taken all data access classes relating to EF6 and the Repository pattern into a separate class library. This library doesn't do anything apart from the responsibility of data access. Technically you can split it into more assemblies if that makes sense for your case.

There's no need for me to describe the Repository implementation itself, because Mosh has done it very well. Follow his video or code sample for background knowledge. I am focusing on using it in an ASP.NET Core MVC application.

This is what my *.Data project looks like:

Asynchronous Repository

It makes sense for many operations against Entity Framework 6 to be asynchronous. You get the true benefit when the async chain goes all the way up to the controller. That is why my repository interface returns Task&lt;T&gt;.

public interface IRepository&lt;TEntity&gt; where TEntity : class
{
    Task&lt;TEntity&gt; GetAsync(int id);
    Task&lt;IEnumerable&lt;TEntity&gt;&gt; GetAllAsync();
    Task&lt;IEnumerable&lt;TEntity&gt;&gt; FindAsync(Expression&lt;Func&lt;TEntity, bool&gt;&gt; match);
    Task&lt;TEntity&gt; AddAsync(TEntity entity);
    Task RemoveAsync(TEntity entity);
    Task RemoveRangeAsync(IEnumerable&lt;TEntity&gt; entity);
}
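To give an idea of how this interface maps onto EF6, here is a sketch of a possible implementation (my own illustrative code, not necessarily line-for-line what the sample ships):

```csharp
using System;
using System.Collections.Generic;
using System.Data.Entity; // EF6 async extensions (ToListAsync, FindAsync)
using System.Linq;
using System.Linq.Expressions;
using System.Threading.Tasks;

public class Repository<TEntity> : IRepository<TEntity> where TEntity : class
{
    protected readonly DbContext Context;

    public Repository(DbContext context)
    {
        Context = context;
    }

    public async Task<TEntity> GetAsync(int id)
    {
        return await Context.Set<TEntity>().FindAsync(id);
    }

    public async Task<IEnumerable<TEntity>> GetAllAsync()
    {
        return await Context.Set<TEntity>().ToListAsync();
    }

    public async Task<IEnumerable<TEntity>> FindAsync(Expression<Func<TEntity, bool>> match)
    {
        return await Context.Set<TEntity>().Where(match).ToListAsync();
    }

    public async Task<TEntity> AddAsync(TEntity entity)
    {
        Context.Set<TEntity>().Add(entity);
        await Context.SaveChangesAsync();
        return entity;
    }

    public async Task RemoveAsync(TEntity entity)
    {
        Context.Set<TEntity>().Remove(entity);
        await Context.SaveChangesAsync();
    }

    public async Task RemoveRangeAsync(IEnumerable<TEntity> entity)
    {
        Context.Set<TEntity>().RemoveRange(entity);
        await Context.SaveChangesAsync();
    }
}
```

Whether AddAsync/RemoveAsync should call SaveChangesAsync themselves or leave that to a unit of work is a design choice; Mosh's material argues for the latter.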

The Web project

All of my other bits and pieces are in the MVC project. I have strictly kept to the empty ASP.NET Core MVC project structure - my Services and View models are here. I have not touched the ApplicationDbContext ASP.NET Identity DbContext class, to allow for a quick start.

If you choose, you can break this project into more class libraries. In this sample I keep my MVC(S) Services where my ViewModels are, so Services return ViewModels, which also means that they encapsulate the Factory calls to map Models to ViewModels. If you wish, you can move that responsibility to the Controllers and have your Services return Models (and essentially move Services to their own library).

The Visual Studio tooling, and specifically the MVC scaffolding, is unfortunately designed (as of now) around having everything in one project - that is when you get good scaffolding possibilities.

Here is what my *.Web project looks like:

Common database for entities and ASP.NET Identity tables

Nobody wants to deal with more than one database unless it is really necessary. In our case we want a web application with objects and user accounts - not a hugely complex thing. A single database would be appropriate for many scenarios.

The sample does just that - both contexts (ApplicationDbContext and my custom entity context) access the same database. They use the same connection string during registration in the ConfigureServices startup method.

services.AddDbContext&lt;ApplicationDbContext&gt;(options =&gt;
    options.UseSqlServer(Configuration.GetConnectionString("DefaultConnection")));

services.AddScoped&lt;CarApplicationDbContext&gt;(_ =&gt;
    new CarApplicationDbContext(Configuration.GetConnectionString("DefaultConnection")));

Here is what the database looks like. EF6 uses the migration history tables to deal with migrations as it should; the ContextKey column here is key.

Target Framework

Entity Framework 6 is dependent on the full .NET Framework. This is why the solution targets .NET Framework (4.5.2) but uses the ASP.NET Core framework and libraries to run the application.

Code on GitHub

Feel free to use as you please. I'd be glad to get your feedback and comments.

GitHub Repository: https://github.com/OneBitSoftware/AspNetCore.Ef6.Identity3.StarterSolution

A Rational Guide to Dealing with Azure's "MySQL in App" App Service Capability and PHP Web Applications - Part 2

This is a multi-part series with tips and solutions to roadblocks that I faced in dealing with the "MySQL in App" service.

Part 2 - Migrating a really large MySQL database to Azure's "MySQL in App"

You can use the phpMyAdmin interface to do most operations on your database, but it has never been good with importing/exporting databases over 8MB, and these days 8MB is not that big. 

I had to operate on a 3GB database, and using phpMyAdmin to restore it was not an option at all.

Export your database from your source with the "mysqldump.exe" command:

mysqldump -u [user_name] -p[db_password] [database_name] --routines > 20170416_dbbackup_full.sql

(notice the -pPASSWORD syntax, a bit weird but there is no space there).

NOTE: if your .sql file is "huge", see splitting a .sql file on Windows below. "Huge" means the import times out given the size of the App Service resources you have provisioned.

Soon we will use the Kudu console, but first access the FTP service that is enabled on your Azure App Service. Get the FTP hostname and credentials from the Overview screen, and upload your large database.

FTP your large database to the D:\home folder:

Once your file is there, you can manipulate it with mysql.exe.

The key things to gather are the path and port of your MySQL instance (phpMyAdmin can tell you both). The accounts you need to know are "root" and "azure", both with password "password". You also need to set the --bind-address parameter to localhost.

Pretty much your command looks like this:

D:\home\sql_restore>"D:\Program Files (x86)\mysql\5.7.9.0\bin\mysql.exe" --user=root --password=password --port=56172 --bind-address=127.0.0.1 mydbname-test-01 < 20170401_dbbackup_mini.sql

This is enough for you to restore a large database, but there is one big problem with this approach: the Kudu console will time out (I think after 30 or 60 minutes) and kill the process it is hosting. BAD. Basically it looks like this:

Eventually you might get a timeout of the console before the operation completes, terminating it half-way.

See my next post for a solution to the above problem: Part 3 - Splitting a really large MySQL database dump/export file into smaller pieces

A Rational Guide to Dealing with Azure's "MySQL in App" App Service Capability and PHP Web Applications - Part 1

If you haven't heard, Azure's App Service PaaS offering comes with a built-in MySQL instance. See these blog posts for welcoming info:

This is great because you can move any PHP solution (like WordPress) that uses MySQL to an Azure App Service instance. You could even use the free tier and host your blog or anything else to your liking.

Working with PHP and MySQL in Azure has its intricacies, and there definitely isn't a lot of information out there.

This is a multi-part series with tips and solutions to roadblocks that I faced in dealing with the "MySQL in App" service.

I was tasked with moving a pretty big PHP solution with a big database to Azure.

Here are some key bits of information that were either not documented or that I had a hard time figuring out.

Part 1 - Overview and different interfaces to manage PHP

Azure's portal doesn't have much, but it has a few crucial buttons worth knowing about. I will first explain the Kudu console, because it is a prerequisite to tweaking PHP settings.

Kudu Console, aka Advanced Tools

You might be familiar with the Kudu project, a cool set of tools to manage your Azure App Service instance. This is key to working with PHP and MySQL.

Access it through the App Service interface in portal.azure.com.

The Kudu UI is simple enough and has tons of useful information:

To access the console, go to the Debug console menu and click CMD or PowerShell, depending on what you need.

This gives you a neat console into the VM behind the App Service (yes, they are VMs!).

Everything under C:\ is blocked, but you can do all you wish under D:\

PHP Settings

An Azure App Service has PHP installed by default, and it has various versions available too. These are changeable in the Application Settings screen on the App Service:

The actual php.ini file is located in "D:\local\Config\PHP-5.5.38\php.ini" (or as per the version you are using). You can edit that, however it is much better to do it the "right way" and create a ".user.ini" file. To do this, add a custom php.ini override file with the configurations you need, place it in your application root and deploy your application. Restart it so you force PHP to re-read its configuration.

Alternatively, you can place a ".ini" file on the D: drive, such as d:\home\site\ini, then add the 'PHP_INI_SCAN_DIR' configuration settings key in "App settings":

Both will work. This is documented much better here: Configure PHP in Azure App Service Web Apps

Managing the MySQL service

You can manage the MySQL service through the Azure portal UI (very limited) or through the well-known phpMyAdmin interface. The Azure App Service comes with phpMyAdmin installed. Reach it through the hard-to-see "MySQL in App" option and the "Manage" portal button/link, or through the following URL: https://[appserviceaddress].scm.azurewebsites.net/phpMyAdmin/ - mine was https://blog-prod-01.scm.azurewebsites.net/phpMyAdmin/

You will be automatically logged in with an "azure" account.

And the not-so-visible Manage button:

You will reach the phpMyAdmin interface:

This is enough to get you going with basic PHP and MySQL management. My next post describes how to import large MySQL databases: Part 2 - Migrating a really large MySQL database to Azure's "MySQL in App"

Jerky mouse movement with a laser Logitech mouse

This is one of those off-topic blog posts that I eventually write, but I spent significant time trying to figure out why my Logitech G700s mouse doesn't move in a straight line, so I think it is worth sharing.

I couldn't find solutions for this on the web. My problem is simple - I move the mouse and it doesn't travel smoothly in a straight line. It jumps sporadically across the movement vector, about 0.5-1cm off its course, making it impossible to use for games or anything else for that matter. I can't really demonstrate it easily, but it was bad.

I initially got sent off in the wrong direction - drivers. I did all kinds of installs and uninstalls, both native Windows and Logitech. I played with the pointer speed settings both in Windows devices and in the Logitech mouse driver software. None of it solved my problem.

I even tried different mouse pads.

The issue at hand is that the mouse was just dirty - the laser heads had very tiny dust particles that are very difficult to see. I tried blowing them off, but I'm not a good blower.

My fix is easy - I stuck an almost dried-out wet wipe tissue into the laser hole and rotated it a few times. The laser heads were visibly cleaner.

Now it works like brand new. This is one of the smoothest mice I have owned and it is fantastic.

Happy cleaning!

The Mystery of Microsoft.IdentityModel.Extensions

When doing development with SharePoint Server and SharePoint Online, we have to make API calls that are authenticated and authorized. Authentication and authorization over HTTP against the SharePoint API's is, for the better part, based nowadays on principles and techniques stepping on the OAuth protocol, which by definition deals with authorization. We have to do handshaking with token providers and resource owners, and include HTTP headers with our calls. From a development perspective, as long as you have an HTTP request/response interceptor and code that can generate/manipulate/transform HTTP packets and tokens - you pretty much have whatever you need to call an OAuth-secured endpoint, such as the SharePoint API's.

Every modern web development platform has these capabilities. You can build SharePoint provider-hosted add-ins and console applications on whatever platform you want and use whatever HTTP mangling language you want (See the Python example here for proof https://github.com/SharePoint/PnP/tree/master/Samples/Python.Office365.AppAuthentication). The only thing you really need to do is deal with the authentication and authorization side of things, then just call the API's.

When it comes to the Microsoft promoted development set of tools, we use Visual Studio, ASP.NET and Office Developer Tools to build provider-hosted apps for SharePoint. Our development model steps on ASP.NET, which steps on Windows Identity Foundation and ASP.NET (System.Web and supporting libraries). Visual Studio knows nothing about SharePoint add-ins unless you install Office Developer Tools.

If you've been around for a while, you will recall Windows Identity Foundation and how it got integrated into the .NET Framework (4.5) (see this Namespace Mapping between WIF 3.5 and WIF 4.5 and Guidelines for Migrating an Application Built Using WIF 3.5 to WIF 4.5). "Beginning with .NET 4.5, Windows Identity Foundation (WIF) has been fully integrated into the .NET Framework". *Almost* everything under the Microsoft.IdentityModel namespaces has been moved to System.Security.Claims, System.ServiceModel.Security and the System.IdentityModel namespace.

One odd fella, Microsoft.IdentityModel.Extensions, has been left aside. This is the namespace in the Microsoft.IdentityModel.Extensions.dll file, and it is where the code that SharePoint provider-hosted apps use for mangling OAuth and S2S tokens is located. During that era SharePoint development was in general decline, mostly due to the growth of Office 365 and Azure AD, so the libraries got left aside (btw, they are all owned by the almighty and inhuman idol of mine, Vittorio Bertocci). Microsoft.IdentityModel.Extensions is not maintained by anyone, yet SharePoint add-ins depend on it. The future of SharePoint server-side development depends on it. Development of SharePoint add-ins on the Microsoft development platform depends on it. SharePoint PnP depends on it, so we need to show that library some deep love.

Enough history, let's look at where we use it and where it comes from.

 

Understanding the dependency chain

I'm going to focus on SharePoint provider-hosted add-ins and applications that use app-only calls, because that's where this stuff gets used for the most part.

You create a SharePoint Add-in project in Visual Studio and you get a bunch of stuff in the project. Some are .cs/.vb files, others are assembly references and a few NuGet packages.

If you look at the AppForSharePointOnlineWebToolkit NuGet package, it contains Framework assembly references. Notice the Microsoft.IdentityModel.Extensions assembly and the fact that it doesn't exist in the NuGet package itself. AppForSharePointOnlineWebToolkit brings the SharePointContext and TokenHelper classes into your project (together with the ASP.NET MVC SharePointContextFilter attribute). Before the AppForSharePointOnlineWebToolkit package was available, we used to copy those classes to get a console application to authenticate with SharePoint. Here is a view of the package in NuGet Package Explorer. You might recognize SharePointContext.cs/.vb and TokenHelper.cs/.vb.
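For context, this is roughly how those classes get used for an app-only call (the site URL is illustrative, and the client ID/secret are assumed to be configured the usual way in app settings):

```csharp
using System;
using Microsoft.SharePoint.Client;

class Program
{
    static void Main()
    {
        Uri siteUri = new Uri("https://contoso.sharepoint.com/sites/dev");

        // TokenHelper (from AppForSharePointOnlineWebToolkit) does the
        // OAuth handshake with ACS and hands back an app-only token...
        string realm = TokenHelper.GetRealmFromTargetUrl(siteUri);
        string accessToken = TokenHelper.GetAppOnlyAccessToken(
            TokenHelper.SharePointPrincipal, siteUri.Authority, realm).AccessToken;

        // ...which CSOM then attaches to every request.
        using (ClientContext context =
            TokenHelper.GetClientContextWithAccessToken(siteUri.ToString(), accessToken))
        {
            context.Load(context.Web, w => w.Title);
            context.ExecuteQuery();
            Console.WriteLine(context.Web.Title);
        }
    }
}
```

TokenHelper is where the direct calls into Microsoft.IdentityModel.Extensions live, which is why the DLL has to be present at runtime.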

So, where does Microsoft.IdentityModel.Extensions.dll come from? This is the mysterious question that we have to answer, so we can reliably build PnP tooling that helps developers.

Let's look at the Microsoft.SharePoint.CSOM NuGet package:

No dependencies. No Microsoft.IdentityModel.Extensions.dll to be found.

This is quite a bad design decision - both Microsoft.SharePointOnline.CSOM and AppForSharePointOnlineWebToolkit require Microsoft.IdentityModel.Extensions.dll to be present; TokenHelper has direct managed code calls to classes in it. Inherently, our OfficeDevPnP.Core library depends on Microsoft.IdentityModel.Extensions.dll, and this is where it affects PnP.

The answer to the question: the actual assembly gets delivered with Office Developer Tools for Visual Studio 2015 (https://www.visualstudio.com/vs/office-tools/) through one of its packaged MSIs, which steps on the Web Platform Installer. This means the entire development toolchain depends on the abovementioned dependencies.

The problem doesn't exist only here, either. There are quite a few NuGet packages that depend on Framework assemblies, and that is OK. Our problem is that this one is delivered with a Visual Studio extension.

NOTE: At one point in our careers, a colleague and I did a silly thing and packaged the DLL as a NuGet package and published it on nuget.org. We delisted it, but can never remove it from the feed (that's how nuget.org rolls): https://www.nuget.org/packages/Microsoft.IdentityModel.Extensions/1.0.0

 

You can actually download the MSI

Funny enough, you can directly download the MSI packages that copy over the DLL to your machine:

If you open the MSI with 7-ZIP (yes, you can do that), or any other MSI explorer, you can see that it only contains that DLL.

This is the only way I know of to get the assembly without VS Office Developer Tools… until…

SharePointPnP.IdentityModel

As part of PnP, we want to help developers build stuff. We pay a lot of attention to what you need to get going and how easy it is to get it. We currently have a new agenda, too - getting PnP to run on .NET Core. This is possible; see my previous blog posts on Developing the ASP.NET Core Authentication/Authorization Middleware for SharePoint Provider-Hosted Apps (Add-ins), and the OfficeDevPnP.Core.Framework.Authentication library. Keep an eye out for developments here, because we're investing time into it.

The OfficeDevPnP.Core.Framework.Authentication library achieves one awesome goal - it lets you build an ASP.NET Core web application that authenticates with SharePoint, just like every other SharePoint provider-hosted add-in, but you can make use of the ASP.NET Core stack, which is tons better than System.Web.Mvc etc. You can use CSOM with it to manipulate SharePoint. Awesome, but it doesn't solve the next challenge - using OfficeDevPnP.Core to help you manipulate SharePoint better.

OfficeDevPnP.Core naturally depends on Microsoft.IdentityModel.Extensions.dll, which depends on System.IdentityModel.dll, and that causes hell when you want to run in .NET Core.

Our solution is simple – port Microsoft.IdentityModel.Extensions.dll to a .NET Core project in Visual Studio. The hard work of this is done here: https://github.com/SharePoint/PnP/tree/dev/Solutions/AspNetCore.Authentication/src/SharePointPnP.IdentityModel

For the time being, you can build a custom OfficeDevPnP.Core assembly with a reference to SharePointPnP.IdentityModel instead of Microsoft.IdentityModel.Extensions and happily continue referencing it in .NET Core.

We are working with the PnP team to do this once and for all in January 2017. Stay tuned…

Azure App Service CORS versus Web API CORS

I'm blogging this because it wasted about three hours of my time. I ended up debugging the Microsoft.AspNetCore.Cors library, and that takes time.

I was having issues with cross-domain AJAX requests (with AngularJS). Firefox was pretty bad at telling me the problem – the developer tools just show the failed request. IE was pretty verbal:

SEC7122: Credentials flag was set to true, but Access-Control-Allow-Credentials was not present or was not set to "true".

Right on the spot. Using Fiddler, I spent some time reviewing the HTTP packets (which is fiddly with HTTPS while debugging), and IE was pretty much right: the response did not include Access-Control-Allow-Credentials: true.

I tested it locally and it worked fine. Debugging at runtime in Azure is difficult (it would probably have cost me another three hours), so I ended up guessing. It turns out that configuring CORS on the Azure Web Site overrides what the underlying layers do. Even though ASP.NET Core is doing the right thing, somewhere (most likely on packet exit) Azure CORS takes control. So, if you want to use Web API CORS in ASP.NET, avoid configuring any allowed origins in the Azure portal's CORS settings.

Solution: remove all of them, including a "*" if you have one.
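For reference, the Web API side of this – the part that App Service CORS silently overrides – is wired up roughly like this in ASP.NET Core. This is an illustrative configuration sketch; the policy name and the origin URL are placeholders, not values from my app:

```csharp
public void ConfigureServices(IServiceCollection services)
{
    // Define a CORS policy that allows credentials.
    // "AllowMyClient" and the origin URL are illustrative placeholders.
    services.AddCors(options =>
        options.AddPolicy("AllowMyClient", builder =>
            builder.WithOrigins("https://myclient.example.com")
                   .AllowAnyHeader()
                   .AllowAnyMethod()
                   .AllowCredentials()));

    services.AddMvc();
}

public void Configure(IApplicationBuilder app)
{
    // The CORS middleware must run before MVC in the pipeline.
    app.UseCors("AllowMyClient");
    app.UseMvc();
}
```

This only takes effect when the CORS list in the Azure portal is empty – otherwise App Service CORS answers first and the policy above never gets a say.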

Unfortunately (for me), this is also documented (I went to document it myself, but it was already there!). All I had to do was find it :)

https://azure.microsoft.com/en-gb/documentation/articles/app-service-api-cors-consume-javascript/#app-service-cors-versus-web-api-cors clearly states:

"Don't try to use both Web API CORS and App Service CORS in one API app. App Service CORS will take precedence and Web API CORS will have no effect. For example, if you enable one origin domain in App Service, and enable all origin domains in your Web API code, your Azure API app will only accept calls from the domain you specified in Azure."

And even further, there is something on StackOverflow:

http://stackoverflow.com/questions/36860423/enable-access-control-allow-credentials-header-in-azure-website-azure-app-servi

Ouch, painful, lesson learnt. Hope this helps you and saves you time.

For the record, Microsoft.AspNetCore.Cors ignores everything if the Origin header is not present in the request.
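In rough pseudocode form (a simplified paraphrase for illustration, not the library's actual source), the evaluation short-circuits like this:

```csharp
// Simplified paraphrase of the policy evaluation:
var origin = request.Headers["Origin"];
if (origin == null)
{
    // No Origin header means this isn't a CORS request,
    // so no CORS response headers are emitted at all.
    return;
}
```

So if you are testing with Fiddler or curl, remember to send an Origin header yourself or the middleware will appear to do nothing.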

Microsoft.AspNetCore.Authentication.ActiveDirectory updated to ASP.NET Core RTM

Since the ASP.NET Core RTM bits came out, it's time to update all RC1/RC2 solutions and NuGet packages to run under RTM.

Find it here:

The steps to update are not that difficult, but here they are in general:

  1. global.json should be updated, and the version here is important. Notice the bits are still in preview (at the time of writing we still don't have RTM VS2015 tooling):

{
  "projects": [ "src" ],
  "sdk": { "version": "1.0.0-preview2-003121" }
}

  2. Change all dependencies from Microsoft.AspNet.Something to Microsoft.AspNetCore.Something. Some libraries have been totally rearranged.
  3. Update the project.json file. There are quite a few things to be done and IntelliSense is good at pointing them out. The most important:

"frameworks": {
  "netcoreapp1.0": {},
  "net451": {}
}

This essentially means the library can run in both frameworks, cool.
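If a dependency only applies to one of the two frameworks, project.json also lets you scope it per target. A sketch of what that might look like (the package entry here is just an example):

```json
"frameworks": {
  "netcoreapp1.0": {
    "dependencies": {
      "Microsoft.NETCore.App": {
        "version": "1.0.0",
        "type": "platform"
      }
    }
  },
  "net451": {}
}
```

Packages listed in the top-level "dependencies" section apply to both targets; packages listed under a specific framework apply only there.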

The code has also significantly changed/improved – lots of work around redirects and working with the ASP.NET Cookie Middleware.

Feel free to ping me if you have issues.

Getting Started with ASP.NET Core Add-ins for SharePoint Online

Overview

With the introduction and growth of ASP.NET Core, we SharePoint/Office 365 developers need a story that allows us to build Add-ins on top of provider-hosted ASP.NET Core deployments. What we have now works with old-school ASP.NET and MVC 5 (yes, that is now old :)).

In my previous post, Developing the ASP.NET Core Authentication/Authorization Middleware for SharePoint Provider-Hosted Apps (Add-ins), I explained why this might be appealing and the technical challenges we are faced with. I have put together a library that allows us to develop SharePoint Add-ins that run on ASP.NET Core, and in this blog post I explain how you can get started with including it in your own ASP.NET Core projects.

We have now updated the library to .NET Core and ASP.NET Core RTM. Previously it was running on RC1 only.

If you just want to have a look, get the sample project and run it. Please share your experiences and feedback; they are an important indicator of where the PnP team and I need to put effort.

Before you start

  • You don't have to complete these steps manually; the PnP repository has a sample project located here: https://github.com/OfficeDev/PnP/tree/master/Solutions/AspNetCore.Authentication
  • Use Visual Studio 2015 Update 3 if you can. It is not strictly required, but some new features in the tooling make things easier.
  • You need to register your app and get a ClientId and ClientSecret. This is not detailed here as it is nothing new. By default, ASP.NET Core web applications run on https://localhost:5000

Things you need to know before you start (the reasons behind all of these are explained here):

  • At the time of writing (09.09.2016) there is no support for High-Trust Add-ins yet. This is on our roadmap.
  • Our PnP library still targets .NET Framework 4.5.1.

Adding SharePoint Authentication to ASP.NET Core

Step 1: Create a new ASP.NET Core project based on the Web Application template

Step 2: Change the target framework of the web application project

This might change in future. Currently, our PnP library only runs on .NET 4.5.1. See why in this post: Developing the ASP.NET Core Authentication/Authorization Middleware for SharePoint Provider-Hosted Apps (Add-ins)

Change this:

"frameworks": {
  "netcoreapp1.0": {
    "imports": [
      "dotnet5.6",
      "portable-net45+win8"
    ]
  }
},

To this (you can, of course, keep the imports that you need):

"frameworks": {
  "net451": {}
},

Then, remove the dependency on Microsoft.NETCore.App (it is only for netcoreapp1.0 and above):

"dependencies": {
  "Microsoft.NETCore.App": {
    "version": "1.0.0",
    "type": "platform"
  },

If you need the NETStandard.Library package, just add it under dependencies:

"dependencies": {
  "NETStandard.Library": {
    "version": "1.6.0",
    "type": "platform"
  },

Step 3: Add the Microsoft.SharePointOnline.CSOM NuGet package

You need this to write SP CSOM code that works with SharePoint :) Get it here: https://www.nuget.org/packages/Microsoft.SharePointOnline.CSOM/

PS Command: Install-Package Microsoft.SharePointOnline.CSOM

 

Step 4: Add the OfficeDevPnP.Core.Framework.Authentication NuGet package

With the current release of the VS tooling for .NET Core projects (Preview 2 at the time of writing) you can't add a reference to a DLL directly; you need to add it through a NuGet package.

As pointed out, at the time of writing the PnP Core authentication code is not yet part of the Core library (see the roadmap later in this post; we are working on that). This means you need to bundle the OfficeDevPnP.Core.Framework.Authentication DLL into a NuGet package.

You can do this by following the steps outlined on the NuGet site (https://docs.nuget.org/create/creating-and-publishing-a-package) or just by using NuGet Package Explorer (click here).
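As a sketch, a minimal .nuspec for wrapping the DLL might look like the following. The id, version, authors and file path are placeholders to adapt to your build output:

```xml
<?xml version="1.0"?>
<package>
  <metadata>
    <id>OfficeDevPnP.Core.Framework.Authentication</id>
    <version>1.0.0</version>
    <authors>you</authors>
    <description>Wrapper package for the PnP authentication DLL.</description>
  </metadata>
  <files>
    <!-- Path to the built DLL; adjust to your build output. -->
    <file src="bin\Release\OfficeDevPnP.Core.Framework.Authentication.dll"
          target="lib\net451" />
  </files>
</package>
```

Then pack it with `nuget pack YourPackage.nuspec` and point your project at the resulting .nupkg via a local package source.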

 UPDATE (18.09.2016):

I have added a ready-made NuGet package to the project. You can find it here:

 

Step 5: Add other necessary NuGet packages to your project.json

You will need these:

"Microsoft.AspNetCore.Authentication": "1.0.0",
"Microsoft.AspNetCore.Session": "1.0.0",
"Microsoft.AspNetCore.Server.Kestrel.Https": "1.0.0",

 

Step 6: Configure your Service Collection and pipeline in Startup.cs

Now it is time to add code. I hope you are impressed with the minimal footprint.

Add using OfficeDevPnP.Core.Framework.Authentication; and using OfficeDevPnP.Core.Framework.Authentication.Events;.

Include AddSession() and AddAuthentication() to your service collection.

public void ConfigureServices(IServiceCollection services)
{
    // Add framework services.
    services.AddMvc();

    // Add Session to the service collection
    services.AddSession();

    // Add the authentication middleware and set SharePoint as the default authentication sign-in scheme
    services.AddAuthentication(sharedOptions =>
        sharedOptions.SignInScheme = SharePointAuthenticationDefaults.AuthenticationScheme);
}

Then, go ahead and configure your pipeline to include the middleware:

public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
{
    // Required to store the SP cache key session data;
    // must also call AddSession on the IServiceCollection
    app.UseSession();

    // Add SharePoint authentication capabilities
    app.UseSharePointAuthentication(
        new SharePointAuthenticationOptions()
        {
            ClientId = Configuration["SharePointAuthentication:ClientId"],
            ClientSecret = Configuration["SharePointAuthentication:ClientSecret"],

            AutomaticAuthenticate = true, // set to false if you prefer to manually call Authenticate on the handler
            // OPTIONAL: CookieAuthenticationScheme = "AspNet.ApplicationCookie",

            // Handle events thrown by the auth handler
            SharePointAuthenticationEvents = new SharePointAuthenticationEvents()
            {
                OnAuthenticationSucceeded = succeededContext => {
                    return Task.FromResult<object>(null);
                },
                OnAuthenticationFailed = failedContext => {
                    return Task.FromResult<object>(null);
                }
            }
        }
    );
}

NOTE: Some web applications/add-ins will require multiple authentication mechanisms. The library is built to allow this and co-exist with other Authentication middleware.

NOTE 2: Keep in mind that this is a pipeline, so order matters. The right place will depend on your app.

 

Step 7: Add your ClientId and ClientSecret

Configuration in ASP.NET Core happens in the appsettings.json file. Add the following under the root object:

"SharePointAuthentication": {
  "ClientId": "Add id here",
  "ClientSecret": "Add secret here"
}

Step 8: Modify your Kestrel Web Server to run on HTTPS

Since we're doing Add-in authentication based on the OAuth protocol, we need to run over SSL. ASP.NET Core can run both on IIS and on Kestrel; I prefer Kestrel. This is what you need to do:

In Program.cs, change this:

public static void Main(string[] args)
{
    var host = new WebHostBuilder()
        .UseKestrel()
        .UseContentRoot(Directory.GetCurrentDirectory())
        .UseIISIntegration()
        .UseStartup<Startup>()
        .Build();

    host.Run();
}

To this:

public static void Main(string[] args)
{
    var host = new WebHostBuilder()
        .UseKestrel(options =>
        {
            options.UseHttps(@"..\..\certificates\localhost_ssl.pfx", "pass@word1");
            options.NoDelay = true;
        })
        .UseUrls("https://localhost:5000")
        .UseContentRoot(Directory.GetCurrentDirectory())
        .UseStartup<Startup>()
        .Build();

    host.Run();
}

You can wrap it in a try/catch if you wish. I have also added an IgnoreSslErrorsConnectionFilter class in the PnP sample to get rid of SSL handshake errors with untrusted certificates. This is optional.

The SSL certificate is committed to the certificates folder. Note the https://localhost:5000 URL. Make sure everything runs at this point.
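If you would rather generate your own development certificate than use the committed one, a self-signed PFX can be produced with OpenSSL. The file names and password below mirror the sample's; adjust them as needed:

```shell
# Create a self-signed certificate and private key for localhost (valid 1 year)
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=localhost" \
  -keyout localhost.key -out localhost.crt -days 365

# Bundle them into a PFX that Kestrel's UseHttps call can load
openssl pkcs12 -export -inkey localhost.key -in localhost.crt \
  -out localhost_ssl.pfx -passout pass:pass@word1
```

Because the certificate is self-signed, browsers will warn about it; that is where something like the IgnoreSslErrorsConnectionFilter mentioned above comes in handy during development.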

Make sure your project is firing up https://localhost:5000 and your F5 is not kicking off IIS Express, but rather the actual web application (which now runs as a console application). This will start Kestrel.

Step 9: Add some CSOM code

You are all set. Write some CSOM Code to test:

var spContext = SharePointContextProvider.Current.GetSharePointContext(HttpContext);
var spLists = new List<SharePointListViewModel>();

if (spContext == null) return View();

// Build a client context to work with data
using (var clientContext = spContext.CreateUserClientContextForSPHost())
{
    if (clientContext != null)
    {
        var lists = clientContext.Web.Lists;
        clientContext.Load(lists);
        clientContext.ExecuteQuery();

        foreach (var list in lists)
        {
            spLists.Add(new SharePointListViewModel() { ListTitle = list.Title });
        }
    }
}

Step 10: Test it!

Now that everything is set, go to the SharePoint Online site where your app is configured, then click its link. It will go through AppRedirect.aspx and then lead your browser to https://localhost:5000 if everything is configured right. The ASP.NET Core PnP middleware will intercept the request (based on the SPHostUrl query string), get the context token, work through it and eventually allow you to instantiate the SPContext object needed to work with CSOM data. Pretty cool, hey?

Summary

Overall, it takes 10 steps to heaven :) We are working hard to improve everything we can in regards to the developer experience; that is the fun part of building tooling and APIs.

Please take the time to give feedback, share your issues or even a high five. Microsoft and the PnP team need to see activity and the need for this library, so we have justification to increase its priority.

 

Roadmap for the ASP.NET Core Authentication library

As mentioned, things are still evolving. We have tons of issues to solve in terms of dependencies, and we need to figure out how to make it easier to plug-and-play all of this. It might seem hard at first, but so were Apps when they first came out. The PnP team is working hard to make it easier for developers, and this is all done in our free time. Please help us out with constructive feedback and contributions: https://github.com/OfficeDev/PnP.

Here is a list of targets that we have:

  • Enhance the sample to demonstrate the usage of App-Only access tokens.
  • Enhance the sample to demonstrate on-the-fly authentication through the Authentication Code flow.
  • Remove dependencies on Microsoft.IdentityModel.Extensions and framework assemblies.
  • Implement High-Trust authentication capabilities.
  • Add to the current OfficeDev PnP Core library.
  • Build a true .NET Core App compatible library.