Channel: Jerrie Pelser's Blog RSS Feed

5 Weeks of ASP.NET Weekly Tools and Libraries - Part 4


I publish a weekly newsletter for ASP.NET Developers called ASP.NET Weekly which contains a curated list of the best ASP.NET related content I came across during the previous week.

Each week I feature a tool or library which I think ASP.NET developers will find useful. This is part 4 in a 5 part series of blog posts during which I recap all the tools and libraries I featured during the past year.

I re-publish these with the exact descriptions I used in the newsletter, so see my commentary in that context…

31. Material Design Lite

If you are a lover of all things Material Design, you will be happy to know that you can now apply some of that Material Design love to your website. Google has released the Material Design Lite CSS framework which allows you to apply Material Design styles and principles to your website.

32. ExpressMapper

AutoMapper has long been one of my go-to libraries which I use in most projects. Well, it seems that there is now a new kid on the block which wants to upset the status quo. The ExpressMapper API seems fairly similar to AutoMapper, so it should be familiar to developers who currently use AutoMapper.

It does however claim massive performance gains over AutoMapper and other similar libraries. If you’re an existing AutoMapper user then it may be worth your while to check it out.

33. Visual Studio Web Development Extensions

This week’s tool is not a single tool; rather, I am pointing you to this web page which lists a whole bunch of Visual Studio extensions that assist with web development.

There are 15+ extensions listed, so go through them. You are bound to find at least one which you will find handy.

34. WebApiThrottle

WebApiThrottle is an ASP.NET Web API rate limiter for IIS and OWIN hosting. It is designed to control the rate of requests that clients can make to a Web API based on IP address, client API key and request route.

35. Designing for Performance

This week’s tool is actually a book. Lara Callender Hogan has graciously made available for free her book entitled “Designing for Performance”. The book helps you approach your web development projects with page speed in mind, showing you how to test and benchmark which design choices are most critical.

You can read it free on the web, or show your gratitude to her by buying the print or ebook versions.

36. Bootbox.js

Bootbox.js is a small JavaScript library which allows you to create programmatic dialog boxes using Bootstrap modals, without having to worry about creating, managing or removing any of the required DOM elements or JS event handlers.

37. Bootstrap Starter Kit

I prefer not showcasing commercial products in this section, but I feel I need to make an exception with this one. You can think of Bootstrap Starter Kit as an HTML theme on steroids. It allows you to very rapidly build websites by dragging and dropping predefined building blocks on a design surface.

Check out the live demo to get an idea of how you can rapidly build a website.

38. Tag-it

Tag-it is a jQuery UI plugin to handle multi-tag fields, as well as tag suggestions/autocomplete. If you need to give your users the ability to enter multiple tags, then this is a good option for you to look at.

39. Bootstrap Tour

Bootstrap Tour is an open source library that allows you to easily add a product tour to your application by using Bootstrap popovers.

40. Shouldly

Shouldly is a .NET library which allows you to write much better assertions for your unit tests. It goes beyond just ease of use: it gives great failure messages that allow developers to track down why a test failed much more easily. For more examples on how to use it, you can check out the documentation.


Top blog posts of 2015


2015 in review and plans for 2016


Update 26 January 2016

I had barely written this blog post when some changes happened. Read more about it.

Looking back at 2015

This time of the year you tend to see a lot of posts from people looking back at the previous year. I enjoy reading these, and I like to do one myself, as it highlights the progress one has made over the past year.

For me 2015 started off with the Killing of a Rockstar and me working for a company in Chiang Mai.

The work ended in July for a whole range of reasons, the most important of which was that I just did not fit in. The cultural differences and language barrier were just too difficult to overcome.

After that I floated around for a few months until I decided to see whether all the traffic coming to my blog could gain me some work. It worked out great for me, and I am currently working for 2 clients - which is as much as I can handle at one time.

I have had so many enquiries from people approaching me (around 3 per week) that I decided to remove the sections at the bottom of each post stating that I am available for work, as I was turning down everyone in any case.

BTW, if you want to start doing some freelancing, check out my post Finding work as a Freelance Software Developer.

During 2015 I blogged fairly steadily at about one post a week, and the blog’s traffic grew nicely.

I completed the OAuth for ASP.NET website as a reference on how to configure all the different OAuth providers for ASP.NET.

I started ASP.NET Weekly and subscriber numbers grew nicely to around 3,000. They could have been much higher had I not removed the subscriber popups on my blog - more on this in a bit.

I also started with AspnetCasts and produced a number of videos, but that too came to a screeching halt.

Overall I am happy with 2015. Some things worked and some did not. Most importantly for myself was that I figured out more about myself and what I like and don’t like to do.

So on to 2016, and here is what is in store…

Themes for 2016

Travel

In early 2012 I left South Africa on a grand mission to travel around the world. I traveled through Southeast Asia for about 5 months and got to Chiang Mai, Thailand and decided it was time for a break from traveling.

The plan was to settle down for a maximum of 6 months and focus on a Windows 8.1 app I was writing. I got so comfortable, and liked the place so much, that I got stuck for 2 and a half years!

Well in 2016 all that is changing. I left Thailand in December and right now I am back home in the Lowveld region of South Africa visiting my mom.

Later in January I will head up to Pretoria for a few days and then I am off to Orlando, FL for about a week until the end of January. That is as much as is booked at this moment in time. The rough plan is to then head south to Mexico for a few months and hopefully I can return to the USA end of March for Build - if I can get a ticket.

Otherwise I may stay longer in Mexico and then head further south to South America. I don’t plan these things too far ahead and make it up as I go along. The general plan however is to spend the next few years in South America. On my list is Chile, Argentina, Colombia, Ecuador, Peru and Uruguay.

Meeting up with blog readers

I have already made some friends online (through my blog, Twitter and the work I do) in some of the countries I wish to visit, so I hope to meet up with some of them.

I will still add a section to this blog stating my general itinerary and allowing people in those countries to reach out to me. I really hope I can use the blog as a vehicle to meet up with some of the blog readers in the coming year. I love meeting people from different countries and learning about their countries and cultures.

I will also make use of co-working spaces in those countries which I know from past experience will give a lot of opportunity to meet up with locals and other like-minded people - i.e. computer nerds… ;)

Learning Spanish

Having already learned my lesson about what a great problem the language barrier can be, I started learning Spanish on Duolingo a few months ago. I am still at a very beginner level, but hopefully this will improve over the coming year as I am exposed to the language in Latin America.

Creating content and training materials

This year I plan to produce a lot of varied content - blog posts, the newsletter, more short videos and full-blown video training courses. My plan is to keep this content free as much as possible. I get paid enough to not have to worry about money and this is my contribution back to the community.

Some of the full-blown courses I will charge money for, but this is something I will decide on later as the circumstances dictate.

I mentioned previously about removing the newsletter signup popup from the blog. This is something I am committing to this year, and that is to put the content front and center.

No more annoying popups. No affiliate links. No advertisements. I will not cheapen this blog with a thousand popups and disrespect my readers in the process. That is my commitment to you. If I waver from it, please call me to task.

I will place some links on my website to point you to some more of my own content. If I need work again I will make it prominent on the blog that I am available for hire. That is as much of “advertising” as you will see.

Previously I have tried to put out blog posts on a set schedule every Tuesday and it worked great. With the greater focus on the videos and also the full video training courses I may not be able to keep to this. I will turn it around a bit this year. Instead of publishing on a set schedule, I will commit to producing content on a set schedule.

This may mean more or less blog posts. I will have to see how this works out. Producing the videos takes time, and this is one reason I am looking at charging for the full courses, as it will bring in money to allow me to hire someone to actually edit the videos, which in turn will free up time to allow me to produce more of the free content.

My mind is not made up 100% on this. As with most other things in my life right now, I will figure it out as I go along.

Generating Income

My freelance work is what brings in the money to keep this party going. Right now I have 2 clients that I do work for. I like the people I work for and I like the work they give me. I hope to carry on with these 2 clients for the foreseeable future.

If either of those contracts end, I am confident enough that I will be able to pick up some more work through the blog again.

For me it is important to keep doing the freelance work (besides the fact that it generates income). For one it means I stay up to date with my skills and learn new skills and technologies, but the other thing is that it also generates ideas for blog posts and videos.

As mentioned before I also hope to generate some income from full-blown courses. I do not foresee that this will generate a sustainable income for me - at least not this year. Maybe somewhere in the future it can.

Specific goals for 2016

OK, time to get specific with goals for 2016. Here is my list for 2016:

Producing Content

  • Produce at least 30 blog posts
  • Produce at least 30 short videos (< 10 minutes)
  • Produce 2 full video courses
  • Rework the OAuth for ASP.NET website to cater for ASP.NET 5
  • Publish ASP.NET Weekly every week

Personal and Career Improvement

  • Learn Spanish. By the end of the year, I must be able to produce one of my short videos speaking only in Spanish. This one scares me, but then again English isn’t my native language either ;)
  • Attend one major developer conference in the US.

Travel

  • I have no specific goals, other than to keep moving at a fairly steady pace, and not get stuck in one place for more than 3 months.

Spiritual and Health

  • I have started with the Read Scripture series from the people at The Bible Project, so my goal is to read the entire Bible this year. BTW, if you are a Christian or in any other way interested in the Bible, you should go check out their YouTube channel. They are doing some really great animated videos explaining the different books and concepts in the Bible.
  • Reach a target weight of 82 kg, as I put on a bit of weight the past 3 years in Asia. I need to drop about 10kg or perhaps more as I have not been on a scale for a while.

Keep myself accountable

  • I will start doing the monthly reviews again, to keep accountable and focused on my goals. At the end of each month I will review my progress in the past month, and set some short-term goals for the following month.

5 Weeks of ASP.NET Weekly Tools and Libraries - Part 5


I publish a weekly newsletter for ASP.NET Developers called ASP.NET Weekly which contains a curated list of the best ASP.NET related content I came across during the previous week.

Each week I feature a tool or library which I think ASP.NET developers will find useful. This is part 5 in a 5 part series of blog posts during which I recap all the tools and libraries I featured during the past year.

I re-publish these with the exact descriptions I used in the newsletter, so see my commentary in that context…

41. Light Bootstrap Dashboard

Light Bootstrap Dashboard is an Admin Dashboard template designed to be beautiful and simple. It is built on top of Bootstrap 3 and is fully responsive. It comes with a big collection of elements that will offer you multiple possibilities to create an application that best fits your needs.

42. At.js

At.js is a great little library which allows you to easily add auto-complete for mentions and emojis, like you see in GitHub’s editor. Also be sure to check out the demo.

43. Nancy

Nancy is a lightweight, low-ceremony framework for building HTTP based services on .NET and Mono. You can use it as an alternative to ASP.NET MVC for building web applications.

44. Epoch

Epoch is a library which helps you to create dates and spans with a fluent API. It makes it very easy to create relative dates, adjust existing dates and even create time spans.

45. smartcrop.js

Smartcrop is a JavaScript library which implements intelligent cropping of images by being aware of the content of the image and cropping to whatever the primary focus of the image should be.

46. Front-end Developer Handbook

The Front-end Developer Handbook is a giant index to all sort of resources for front-end developers. When you have some downtime, go through the list of resources, as I am sure that you are bound to come across some interesting stuff which will help you enhance your skills.

47. EnjoyHint

EnjoyHint is a free web-tool that is created to guide users through a site or app by adding simple hints that prompt users to navigate a website or an app intuitively and easily.

48. Bootstrap 4

Most web developers are familiar with the Bootstrap framework, as it has become the standard for use in web applications, and most web templates are also built on top of Bootstrap. Version 4 is in the works, and you can start playing around with the Alpha version to familiarise yourself with what is new and changing.

49. Node School

Yeah, this is an ASP.NET newsletter, but I also strongly believe that programmers should be proficient in multiple programming languages. Having good JavaScript skills is especially useful for any sort of web developer.

A few months ago I attended a Node School which was organised in the city I lived in, and really enjoyed the experience. Have a look at the Node School website. They have plenty of details on upcoming workshops all over the world, as well as details on how to host your own workshop.

Even if you decide not to attend (or host) a Node School, you can still work through the lessons on your own time at home.

50. Cerberus

Just about every single application has to send out some form of emails. Making emails work across the various email clients can however be quite a daunting task. This is where Cerberus comes in. Cerberus is a set of responsive email patterns which you can use in your applications and help you ease some of the pain of creating email templates.

51. Specs.For<>

SpecsFor is a library to help you test your web applications. It is flexible enough to support most approaches to testing - from “old-school” plain-jane test methods to full-blown BDD-style specifications.

52. MailKit

MailKit is a cross-platform mail client library for .NET applications. It includes an SMTP client, POP3 client and IMAP4 client and works across most modern .NET platforms.

53. You Don’t Need jQuery

As a developer I tend to reach for jQuery when I want to do DOM manipulation, but is it really necessary? This project summarises most of the jQuery method alternatives in native implementation, with IE 10+ support.

Assign execute permissions with Git


Recently I had to configure a build on Jenkins for the work I am doing at Auth0 and ran into an issue with a shell script that did not want to execute and failed with a “Permission Denied” error.

Being new to the Linux world, I reached out to a colleague, and it turned out the solution was an easy one. It was new to me, so I am sharing it in case it helps you in the future.

The solution is to use the Git update-index command to assign the execute permissions.

Let’s say the bash script in question is named foo.sh, then go to your shell (or Git shell if you’re on Windows like me) and execute the following command:

git update-index --chmod=+x foo.sh

This will assign execute permissions to the bash file. After that you can commit the changes to the repo.
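For anyone wanting to try this end to end, here is a rough sketch of the workflow; the repo and file names below are made up purely for the demo:

```shell
# Throwaway repo just for the demo
git init -q demo && cd demo
echo '#!/bin/sh' > foo.sh
git add foo.sh

# Flip the executable bit on the staged (index) copy of the file
git update-index --chmod=+x foo.sh

# Verify: the staged mode should now read 100755 rather than 100644
git ls-files --stage foo.sh
```

Note that this changes only the index, which is exactly why it works from Windows, where the file system has no Unix execute bit; after that you commit and push as usual.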

Hope this helps someone out there :)

New job at Auth0


The new role

I have been doing freelancing work for the past 4 months for a couple of clients, one of which is a company called Auth0. They offer an identity platform which allows developers to easily add authentication (and other identity management) tasks to their applications.

The initial freelance work I did for them involved writing the C# SDK for their Authentication and Management APIs. Early in January they offered me a permanent position which I gladly accepted.

The new position involves writing SDKs, sample applications, quick starts and other documentation. This is the sort of work I enjoy doing as it allows me to cover a wide range of technologies.

What does this mean for this blog and my other plans for 2016?

So what does this mean for this blog (and my other plans for 2016)?

Well, as far as the blog is concerned, it will probably move away from being focused on ASP.NET, and I will start to cover a range of other technologies.

My initial focus at Auth0 is to work on the .NET related stuff, but over time I will move into other areas. This is good for me as it means I will start learning a whole lot of stuff in languages and frameworks that I did not previously do work in.

I also still plan to do more videos this year, though I suspect these will be about some other cool things I come across, and not necessarily ASP.NET. I actually anticipated this change in focus last year already, and started rebranding my YouTube channel from AspnetCasts to just using my personal name.

As far as my travel plans are concerned, these are still in place. The position at Auth0 is a remote position, which means I can work from anywhere in the world. Right now I am in Orlando, and heading up to Seattle next week, for a week. After that I am off to Mexico for perhaps 3 months and then hopefully off to South America.

The only thing that’s really going to change is that I am reneging on my goal to do monthly reviews. These were planned because of the long-term goal of creating a business around my content and earning an income from it. With the new job this is not my focus anymore, and though I still plan on producing a lot of content I am not worried too much if things slip a little, and therefore I am not too worried about keeping myself accountable with this.

Hey, looking for a job?

BTW, Auth0 is hiring across the board in many roles. Have a look at the job openings on their website.

Using JsonExtensionData with JSON.NET


Background

One of the issues we faced when developing the .NET SDK for Auth0 was that user profiles can have different properties based on the origin of the user profile. There are a set of standard properties for a user profile, but over and above that many of the providers can “enhance” the user profile with extra properties.

Here is an example of a user profile for a database user:

And here is one for a user profile from LinkedIn:

The LinkedIn screenshot is cropped and does not display nearly all the available fields; there are just too many of them. But as you can see, the LinkedIn user has a bunch of extra properties, unique to LinkedIn, which will most probably never be present in profiles from other providers.

This also surfaces when retrieving the user profile through the Management API.

Here is the JSON returned for the database user:

{"email":"jerrie@jerriepelser.com","email_verified":true,"user_id":"auth0|568a366de8e57fe426d23100","picture":"https://s.gravatar.com/avatar/6222081fd7dcea7dfb193788d138c457?s=480&r=pg&d=https%3A%2F%2Fcdn.auth0.com%2Favatars%2Fje.png","nickname":"jerrie","identities":[{"user_id":"...","provider":"auth0","connection":"Username-Password-Authentication","isSocial":false}],"updated_at":"2016-01-22T07:36:32.419Z","created_at":"2016-01-04T09:07:57.302Z","name":"jerrie@jerriepelser.com","last_password_reset":"2016-01-17T16:50:27.407Z","last_ip":"154.119.56.252","last_login":"2016-01-22T07:36:32.418Z","logins_count":16,"blocked_for":[]}

And once again for the LinkedIn user:

{"email":"jerrie@jerriepelser.com","given_name":"Jerrie","family_name":"Pelser","picture":"https://media.licdn.com/mpr/mprx/0_zVjRUCq9dLg1h2I8vUxXU3rUdCZr82R8UYdkU31k9Xuhxe2hMpDczTbEF74fiIVuqJ0LNQDBKd3H","name":"Jerrie Pelser","apiStandardProfileRequest":{"headers":{"_total":1,"values":[{"name":"x-li-auth-token","value":"..."}]},"url":"https://api.linkedin.com/v1/people/ORklSDbtFm"},"currentShare":{...removedforbrevity...},"distance":0,"headline":"Software Developer, Blogger, Screencaster, Curator of ASP.NET Weekly.","industry":"Information Technology and Services","location":{"country":{"code":"th"},"name":"Thailand"},"numConnections":331,"numConnectionsCapped":false,"positions":{"_total":4,"values":[{...},{...},{...},{...}]},"publicProfileUrl":"https://www.linkedin.com/in/jerriepelser","relationToViewer":{"distance":0},"siteStandardProfileRequest":{"url":"......","email_verified":true,"updated_at":"2016-01-14T14:33:25.720Z","user_id":"linkedin|ORklSDbtFm","nickname":"jerrie","identities":[{"provider":"linkedin","user_id":"...","connection":"linkedin","isSocial":true}],"created_at":"2016-01-14T14:33:25.720Z","last_ip":"197.229.128.5","last_login":"2016-01-14T14:33:25.716Z","logins_count":1,"blocked_for":[]}

Once again in the case of the LinkedIn document, I have removed a lot of the information returned from the API for brevity. But you get the idea that the API returns vastly different information depending on the original source of the user profile.

For the .NET SDK we wanted to return a strongly typed User object but obviously this left us with a decision to make:

  1. The first option was to simply return the properties from the normalized user profile and then somehow make the other properties available dynamically.
  2. The second option was to return a specialized User class for each user profile, depending on the original source. So for a user from LinkedIn we could for example return a LinkedInUser object which had all the extra properties for LinkedIn profiles.
  3. The last option was to simply add every possible property to the User class.

The last option was quickly discounted, and after some deliberation we decided that the second option was also potentially too much work, as it meant that every time the core API team added an extra provider, or decided to retrieve extra attributes from a specific provider, we would have to update the SDK as well.

So finally we settled on the first option, but we were still not sure how to do this.

After some investigation I stumbled across the JsonExtensionData attribute in JSON.NET. What this allows you to do is serialize elements of a JSON document which do not have matching properties on the destination object into a dictionary decorated with the [JsonExtensionData] attribute. (Also see the JSON.NET documentation on the various Serialization Attributes.)

Example

As a practical example, let us assume the following JSON array is being returned from an API call:

[
  {
    "first_name": "Jerrie",
    "last_name": "Pelser",
    "initials": "JJ",
    "profile_image": "http://www.gravatar.com/some_image"
  },
  {
    "first_name": "Peter",
    "last_name": "Parker",
    "initials": "P",
    "profile_image": "http://www.gravatar.com/another_image",
    "address": {
      "city": "New York",
      "suburb": "Forest Hills"
    },
    "family":
    [
      {
        "first_name": "May",
        "last_name": "Parker"
      }
    ]
  }
]

We are trying to deserialize this into the following User class:

public class User
{
    [JsonProperty("first_name")]
    public string FirstName { get; set; }

    [JsonProperty("last_name")]
    public string LastName { get; set; }

    [JsonProperty("initials")]
    public string Initials { get; set; }

    [JsonProperty("profile_image")]
    public string ProfileImage { get; set; }
}

The standard properties for the User class are FirstName, LastName, Initials and ProfileImage. These are common across all our user objects.

As you can see, however, the second object in the JSON array has extra attributes which have no backing properties on our User class. What we can do is update the User class by adding an AdditionalData property of type IDictionary<string, JToken> which is decorated with the [JsonExtensionData] attribute:

public class User
{
    [JsonProperty("first_name")]
    public string FirstName { get; set; }

    [JsonProperty("last_name")]
    public string LastName { get; set; }

    [JsonProperty("initials")]
    public string Initials { get; set; }

    [JsonProperty("profile_image")]
    public string ProfileImage { get; set; }

    [JsonExtensionData]
    public IDictionary<string, JToken> AdditionalData { get; set; }
}

Now when we deserialize the JSON array, all the extra attributes for the JSON documents which are not mapped to properties in the class will be added to the AdditionalData dictionary.
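As a quick sanity check, here is a minimal, self-contained sketch of reading those unmapped members back. DemoUser is an illustrative type for this snippet, not the SDK's actual class:

```csharp
using System;
using System.Collections.Generic;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

// Only first_name is mapped; everything else lands in AdditionalData
public class DemoUser
{
    [JsonProperty("first_name")]
    public string FirstName { get; set; }

    [JsonExtensionData]
    public IDictionary<string, JToken> AdditionalData { get; set; }
}

public static class Demo
{
    public static void Main()
    {
        var json = @"{
            ""first_name"": ""Peter"",
            ""address"": { ""city"": ""New York"" }
        }";

        var user = JsonConvert.DeserializeObject<DemoUser>(json);

        // The mapped property is populated as usual
        Console.WriteLine(user.FirstName);

        // The unmapped address object is available as a JToken
        Console.WriteLine((string)user.AdditionalData["address"]["city"]);
    }
}
```

Because the dictionary values are JTokens, you can drill into nested objects with the usual JToken indexing, or call ToObject<T>() on them if you later decide to map them to a concrete type.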

Below you can see a screenshot of the Visual Studio debugger at runtime, and you can see the extra attributes being added to the dictionary:

Conclusion

JSON.NET is very easy to use for the most common cases, but what makes it so powerful is that it also has a few really advanced features which allow you to control how data gets serialized and deserialized, so you can almost always find a way to work around those tricky edge cases.

Using custom converters in JSON.NET: Array or Object?


The problem

During the development of the Auth0 .NET SDK, I ran into an issue with one of our Management API calls where it could return a different JSON structure based on parameters passed in by the user.

The offending API call in question was the Users endpoint, where you could pass in a parameter called include_totals which will return the list of users, along with the total number of records and some other paging information. If you did not pass in this parameter (or specified a value of false), it would simply return an array of users.

To make this a bit more visual, here is an approximate example of the response when you ask the API call to include the totals:

{"start":0,"limit":50,"length":36,"total":36,"users":[{"email":"john.doe@gmail.com","email_verified":false,"username":"johndoe","phone_number":"+199999999999999","phone_verified":false,"user_id":"usr_5457edea1b8f33391a000004","created_at":"","updated_at":"","identities":[[{"connection":"Initial-Connection","user_id":"5457edea1b8f22891a000004","provider":"auth0","isSocial":false}]],"app_metadata":{},"user_metadata":{},"picture":"","name":"","nickname":"","multifactor":[""],"last_ip":"","last_login":"","logins_count":0,"blocked":false},{},{}]}

So you can see that the API returns a JSON object with the paging information, and then a users property containing an array with the list of users.

In the case where you ask to not include the totals, the API would simply return an array of users, e.g.:

[{"email":"john.doe@gmail.com","email_verified":false,"username":"johndoe","phone_number":"+199999999999999","phone_verified":false,"user_id":"usr_5457edea1b8f33391a000004","created_at":"","updated_at":"","identities":[[{"connection":"Initial-Connection","user_id":"5457edea1b8f22891a000004","provider":"auth0","isSocial":false}]],"app_metadata":{},"user_metadata":{},"picture":"","name":"","nickname":"","multifactor":[""],"last_ip":"","last_login":"","logins_count":0,"blocked":false},{},{}]

In my case I wanted to always return an instance of the PagedList<User> class to the user, which is defined as follows:

public class PagedList<T> : List<T>
{
    public PagedList()
    {
    }

    public PagedList(IEnumerable<T> collection) : base(collection)
    {
    }

    public PagedList(IEnumerable<T> collection, PagingInformation paging) : base(collection)
    {
        Paging = paging;
    }

    public PagedList(int capacity) : base(capacity)
    {
    }

    public PagingInformation Paging { get; set; }
}

public class PagingInformation
{
    [JsonProperty("length")]
    public int Length { get; set; }

    [JsonProperty("limit")]
    public int Limit { get; set; }

    [JsonProperty("start")]
    public int Start { get; set; }

    [JsonProperty("total")]
    public int Total { get; set; }

    public PagingInformation(int start, int limit, int length, int total)
    {
        Start = start;
        Limit = limit;
        Length = length;
        Total = total;
    }
}

Depending on whether they request paging information, the Paging property will contain the paging information.

JsonConverter to the rescue

As you can expect, this creates a problem when trying to deserialize the resulting JSON to a specific .NET type, because we are working with two very different potential JSON structures being deserialized.

Thankfully JSON.NET offers a solution by allowing you to create a custom converter which specifies how an object is serialized or deserialized. All you need to do is inherit from JsonConverter and then provide implementations for the CanConvert, WriteJson and ReadJson methods.

The logic then is fairly simple; I simply check whether the JSON being serialized is an object or an array.

If it is an object, I know that the user requested the totals, so I extract the various paging information properties and deserialize the “users” property of the JSON object to a list of User. I then return a PagedList<User> with the list of users as well as the paging information.

In the case where the JSON is an array, I know that paging information was not requested, so I simply convert the JSON to a list of User, and return a PagedList<User> with the list of users and no paging information:

internal class UserPagedListConverter : JsonConverter
{
    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
    {
        throw new System.NotImplementedException();
    }

    public override bool CanConvert(Type objectType)
    {
        return typeof(PagedList<User>).GetTypeInfo().IsAssignableFrom(objectType.GetTypeInfo());
    }

    public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
    {
        if (reader.TokenType == JsonToken.StartObject)
        {
            JObject item = JObject.Load(reader);

            if (item["users"] != null)
            {
                var users = item["users"].ToObject<IList<User>>(serializer);

                int length = item["length"].Value<int>();
                int limit = item["limit"].Value<int>();
                int start = item["start"].Value<int>();
                int total = item["total"].Value<int>();

                return new PagedList<User>(users, new PagingInformation(start, limit, length, total));
            }
        }
        else
        {
            JArray array = JArray.Load(reader);

            var users = array.ToObject<IList<User>>();

            return new PagedList<User>(users);
        }

        // This should not happen. Perhaps better to throw an exception at this point?
        return null;
    }
}
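The object-or-array branching the converter performs can also be sketched language-neutrally. Here is a rough Python illustration of the same decision (a sketch only, not the SDK's actual code):

```python
import json

def parse_users(payload):
    """Parse a response that is either a bare array of users or an
    object carrying paging info plus a nested "users" array."""
    data = json.loads(payload)
    if isinstance(data, dict):
        # Totals were requested: pull out the paging properties.
        paging = {k: data[k] for k in ("start", "limit", "length", "total")}
        return data["users"], paging
    # No totals requested: the payload is just the array of users.
    return data, None
```

An object payload yields both the user list and the paging information, while a bare array yields the user list and no paging information.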

Simple as that. Then when you want to deserialize a JSON string, you can simply pass along the UserPagedListConverter to the JsonConvert.DeserializeObject method, e.g.

JsonConvert.DeserializeObject<PagedList<User>>(content, new UserPagedListConverter());

ASP.NET Core: No more worries about checking in secrets


A number of articles have been written about the new Configuration model in ASP.NET Core, but one thing which does not seem to be highlighted very often is how it can protect you from accidentally checking secrets (such as connection string passwords or OAuth keys) into source control.

There have been various cases in the media over the past number of years where people ended up on the wrong side of an Amazon Web Services bill after an unscrupulous operator managed to get hold of their AWS keys and used them to create EC2 instances.

In ASP.NET Core, keeping secrets like these out of source control is dead simple.

Let us first look at a sample piece of code from the Startup class generated by one of the default ASP.NET Core application templates:

public class Startup
{
    public Startup(IHostingEnvironment env)
    {
        var builder = new ConfigurationBuilder()
            .SetBasePath(env.ContentRootPath)
            .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
            .AddJsonFile($"appsettings.{env.EnvironmentName}.json", optional: true)
            .AddEnvironmentVariables();
        Configuration = builder.Build();
    }

    // Rest of class omitted for brevity...
}

As you can see in the configuration of the ConfigurationBuilder, by default the configuration will be read from 3 different sources.

  1. The appsettings.json file
  2. The appsettings file which correlates with the current environment, e.g. appsettings.Development.json
  3. The environment variables

Configuration settings will be read from these 3 sources in order.
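The practical effect of this ordering can be sketched with a simple dictionary merge. This is a rough Python illustration of the precedence rule only (the keys and values below are hypothetical placeholders, not the framework's implementation):

```python
# Each configuration source is modeled as a flat dict of colon-delimited keys.
appsettings = {"twitter:consumerKey": "placeholder", "logging:level": "Warning"}
appsettings_dev = {"twitter:consumerKey": "dev-key"}   # appsettings.Development.json
environment = {"logging:level": "Debug"}               # environment variables

config = {}
for source in (appsettings, appsettings_dev, environment):  # registration order
    config.update(source)  # later sources win on duplicate keys
```

A key declared in more than one source ends up with the value from the last source registered, which is why environment variables can override values from appsettings.json.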

Using the environment-specific appsettings file

One of the first ways you can avoid checking in secrets is by using the environment-specific appsettings file and excluding that file from source control.

So in the configuration specified above, if you run on your local machine and have configured the Development environment (read more about Working with Multiple Environments), then the ASP.NET Core runtime is going to try and load settings from an optional appsettings.Development.json file.

As you can see in the code snippet, this file is specified as optional, so what you can do is to specify your secret values inside this file, e.g.

{
  "twitter": {
    "consumerKey": "your consumer key goes here",
    "consumerSecret": "your consumer secret goes here"
  }
}

This will make the configuration settings with the keys twitter:consumerKey and twitter:consumerSecret available inside your application.
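The colon-delimited keys come from flattening the nested JSON structure. A rough sketch of that mapping (an illustration of the shape of the behaviour, not the actual framework code):

```python
def flatten(obj, prefix=""):
    """Flatten nested JSON-style dicts into colon-delimited config keys."""
    flat = {}
    for key, value in obj.items():
        full_key = f"{prefix}:{key}" if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, full_key))  # recurse into nested sections
        else:
            flat[full_key] = value
    return flat

settings = flatten({"twitter": {"consumerKey": "abc", "consumerSecret": "xyz"}})
```

So the nested "twitter" section produces the keys twitter:consumerKey and twitter:consumerSecret.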

All you need to do is exclude the file from source control, so if you use Git then simply add the file to your .gitignore file so it does not get checked in.

You can even make it more explicit that the file contains secrets, by naming it secrets.json and excluding the secrets.json file from source control:

public class Startup
{
    public Startup(IHostingEnvironment env)
    {
        var builder = new ConfigurationBuilder()
            .SetBasePath(env.ContentRootPath)
            .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
            .AddJsonFile($"appsettings.{env.EnvironmentName}.json", optional: true)
            .AddJsonFile("secrets.json", optional: true)
            .AddEnvironmentVariables();
        Configuration = builder.Build();
    }

    // Rest of class omitted for brevity...
}

Store them as environment variables

You may have noticed the call to AddEnvironmentVariables in the code samples above. What this does is that it will load configuration values from environment variables.

So using the example of the Twitter Consumer Key and Secret above, I can simply specify environment variables twitter:consumerKey and twitter:consumerSecret with the relevant values:

And because of the call to AddEnvironmentVariables, the configuration settings with the keys twitter:consumerKey and twitter:consumerSecret will once again be available inside my application.

Use the Secret Manager tool

One more (and probably the best) way in which you can do this is to actually use the Secret Manager Tool which is available as a .NET Core tool and was built specifically for this purpose.

You can read the article above for more detail on exactly how to use this, but what it boils down to is that there is a .NET Core tool available which you can add to your application, called the Secret Manager Tool.

Using this tool, you can specify the values for any secrets you use inside your application, and they will be stored securely on your local machine without any chance of being checked into source control.

At runtime you can use the AddUserSecrets method to load the values of the configuration variables from the secret storage:

public Startup(IHostingEnvironment env)
{
    var builder = new ConfigurationBuilder()
        .SetBasePath(env.ContentRootPath)
        .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
        .AddJsonFile($"appsettings.{env.EnvironmentName}.json", optional: true);

    if (env.IsDevelopment())
    {
        // For more details on using the user secret store see http://go.microsoft.com/fwlink/?LinkID=532709
        builder.AddUserSecrets();
    }

    builder.AddEnvironmentVariables();
    Configuration = builder.Build();
}

A handy side-effect for our Auth0 samples

Let’s quickly look again at the code for specifying the configuration sources inside our application:

var builder = new ConfigurationBuilder()
    .SetBasePath(env.ContentRootPath)
    .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
    .AddJsonFile($"appsettings.{env.EnvironmentName}.json", optional: true)
    .AddEnvironmentVariables();
Configuration = builder.Build();

I mentioned before that configuration settings get loaded in the order in which the configuration sources are specified, but what I did not make clear is that multiple configuration sources can declare configuration settings with the same key. What happens in this case is that the value from a subsequent configuration source will override the value from a previous configuration source.

This is useful for me when developing our Auth0 samples, because we have a clever little trick where we replace configuration values with the actual values from your Auth0 instance.

Here is the contents from the configuration file of one of our samples:

{
  "AppSettings": {
    "SiteTitle": "Auth0 - ASP.NET 5 Web App Sample"
  },
  "Auth0": {
    "ClientId": "{CLIENT_ID}",
    "ClientSecret": "{CLIENT_SECRET}",
    "Domain": "{DOMAIN}",
    "RedirectUri": "http://localhost:5001/"
  }
}

Do you see those values {CLIENT_ID}, {CLIENT_SECRET} and {DOMAIN}? When you download this sample application through our documentation website, and you are signed in to your Auth0 account, we will automatically replace those with the correct values from your Auth0 instance, so you do not have to do any configuration of the application after you have downloaded it - you can just run it immediately and it is pre-configured to work with your specific Auth0 instance.

Now previously, when I worked on these samples to code and test them, I had to set those configuration settings to the actual values. So instead of {CLIENT_ID}, I would have to specify the actual Client ID.

I then also had to remember, every time I checked a sample application in to GitHub, to replace the actual Client ID I used while testing with the string {CLIENT_ID} again, so our sample downloader worked correctly.

From time to time I forgot to do this…

With the new multiple configuration sources in ASP.NET Core, this is a thing of the past. I never have to touch the values of those configuration settings in appsettings.json again. All I do is specify environment variables with the correct values, which then override the values in the appsettings.json file because of the call to AddEnvironmentVariables.

So when I run the samples on my computer, the values from my environment variables are used, but when a user downloads a sample they get the correct values in appsettings.json, and I do not have to worry about messing things up by accident.

Accessing the Request object inside a Tag Helper in ASP.NET Core


Last week I was doing a little experiment for our Auth0 support for ASP.NET Core which involved writing a Tag Helper. For this Tag Helper I had to access the actual URL for the request, so I therefore had to somehow get a hold of the HttpRequest inside of the Tag Helper.

Injecting IHttpContextAccessor

The Request is not available as a property of the TagHelper base class so I figured that I needed to inject IHttpContextAccessor into my Tag Helper’s constructor, for example:

public class LockTagHelper : TagHelper
{
    private readonly IHttpContextAccessor _contextAccessor;

    public LockTagHelper(IHttpContextAccessor contextAccessor)
    {
        _contextAccessor = contextAccessor;
    }
}

The Request can then later be accessed as follows:

var request = _contextAccessor.HttpContext.Request;

On my first try I got the following exception:

InvalidOperationException: Unable to resolve service for type ‘Microsoft.AspNetCore.Http.IHttpContextAccessor’ while attempting to activate ‘Auth0.AspNetCore.Mvc.TagHelpers.LockTagHelper’.

I knew this worked before when I used ASP.NET Core (then still called ASP.NET 5) last year, and after a bit of research it seemed that the default behaviour had changed and you now have to register IHttpContextAccessor manually with the DI framework.

So inside the ConfigureServices method of your Startup class, simply add the following line:

services.AddSingleton<IHttpContextAccessor, HttpContextAccessor>();

This worked great, but it posed a problem for me. This particular Tag Helper would be available as a NuGet package, and I did not want to require users to configure IHttpContextAccessor with the DI framework in order for my Tag Helper to work correctly.

Using ViewContextAttribute

I needed a way which was less error-prone, and after posing the question on GitHub, Pranav supplied a much better solution.

Simply declare a property of type ViewContext and decorate it with the [ViewContext] attribute.

You can then access the HttpRequest through the ViewContext.HttpContext.Request property.

public class LockTagHelper : TagHelper
{
    protected HttpRequest Request => ViewContext.HttpContext.Request;

    protected HttpResponse Response => ViewContext.HttpContext.Response;

    [ViewContext]
    public ViewContext ViewContext { get; set; }

    // Code omitted for brevity
}

Adding parameters to the OpenID Connect Authorization URL


I am busy working on some more samples for ASP.NET Core to demonstrate various techniques people can use to authenticate their users with Auth0. In most of our samples we use the standard OpenID Connect middleware, and one of the things I wanted to do was to pass extra parameters when the request is made to the Authorization endpoint.

At Auth0 we allow users to authenticate with multiple social and enterprise providers. Usually when the Authorization endpoint is called, we will display Lock, which will prompt the user for their username and password, and also allow them to sign in with any of the connected social or enterprise providers.

We can however also directly invoke any of the social connections, bypassing Lock completely and directing the user directly to the Authorization page for the relevant service. So as an example we can send the user directly to the Google login by passing along the query string parameter connection=google-oauth2.
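In plain OAuth2 terms, that just means one extra query string parameter on the authorization URL. Here is a hypothetical Python sketch of building such a URL (the domain and client values are placeholders, and only the "connection" parameter is Auth0-specific):

```python
from urllib.parse import urlencode

def build_authorize_url(domain, client_id, redirect_uri, connection=None):
    """Build an OAuth2/OIDC authorization URL, optionally appending
    Auth0's "connection" parameter to skip Lock for a specific provider."""
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
    }
    if connection:
        params["connection"] = connection  # e.g. send users straight to Google
    return f"https://{domain}/authorize?{urlencode(params)}"
```

Calling it with connection="google-oauth2" yields a URL that takes the user directly to the Google login.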

So how do you do this when using the OpenID Connect middleware?

All you need to do is handle the OnRedirectToIdentityProvider event when configuring the OpenIdConnectOptions, and add the extra query string parameters by calling the ProtocolMessage.SetParameter method on the supplied RedirectContext:

app.UseOpenIdConnectAuthentication(new OpenIdConnectOptions("Auth0")
{
    // Set the authority to your Auth0 domain
    Authority = "https://YOUR_AUTH0_DOMAIN",

    // Configure the Auth0 Client ID and Client Secret
    ClientId = "CLIENT ID",
    ClientSecret = "CLIENT SECRET",

    // Do not automatically authenticate and challenge
    AutomaticAuthenticate = false,
    AutomaticChallenge = false,

    // Set response type to code
    ResponseType = "code",

    // Set the callback path
    CallbackPath = new PathString("/signin-auth0"),

    // Configure the Claims Issuer to be Auth0
    ClaimsIssuer = "Auth0",

    Events = new OpenIdConnectEvents
    {
        OnRedirectToIdentityProvider = context =>
        {
            context.ProtocolMessage.SetParameter("connection", "google-oauth2");

            return Task.FromResult(0);
        }
    }
});

Now the user will be sent directly to the Google login page whenever the OIDC middleware is invoked.

This however means that the user will always be directed to sign in with their Google account. What if we want to make this configurable somehow?

At the moment the Login action in the AccountController which issues the challenge to the OIDC middleware looks as follows:

public IActionResult Login()
{
    return new ChallengeResult("Auth0", new AuthenticationProperties() { RedirectUri = "/" });
}

What we need to do is add a connection parameter to the Login action and then if the user passed in a value for that parameter we can add it to the Items dictionary of the AuthenticationProperties instance which is passed along with the challenge:

public IActionResult Login(string connection)
{
    var properties = new AuthenticationProperties() { RedirectUri = "/" };

    if (!string.IsNullOrEmpty(connection))
        properties.Items.Add("connection", connection);

    return new ChallengeResult("Auth0", properties);
}

And then also change the OnRedirectToIdentityProvider delegate to check if the connection property was passed along, and if it was, append the value to the ProtocolMessage parameters:

app.UseOpenIdConnectAuthentication(new OpenIdConnectOptions("Auth0")
{
    // Set the authority to your Auth0 domain
    Authority = "https://YOUR_AUTH0_DOMAIN",

    // Configure the Auth0 Client ID and Client Secret
    ClientId = "CLIENT ID",
    ClientSecret = "CLIENT SECRET",

    // Do not automatically authenticate and challenge
    AutomaticAuthenticate = false,
    AutomaticChallenge = false,

    // Set response type to code
    ResponseType = "code",

    // Set the callback path
    CallbackPath = new PathString("/signin-auth0"),

    // Configure the Claims Issuer to be Auth0
    ClaimsIssuer = "Auth0",

    Events = new OpenIdConnectEvents
    {
        OnRedirectToIdentityProvider = context =>
        {
            if (context.Properties.Items.ContainsKey("connection"))
                context.ProtocolMessage.SetParameter("connection", context.Properties.Items["connection"]);

            return Task.FromResult(0);
        }
    }
});

Now, when you go to http://YOUR_URL/Account/Login, the OIDC middleware will get invoked and Auth0 Lock will be displayed as always. However, if you go to http://YOUR_URL/Account/Login?connection=google-oauth2, the user will be sent directly to the Google authorization page. Likewise, if you go to http://YOUR_URL/Account/Login?connection=github, the user will be sent directly to the GitHub authorization page.

Using Roles with the ASP.NET Core JWT middleware


Here is a great find: The JWT middleware in ASP.NET Core knows how to interpret a “roles” claim inside your JWT payload, and will add the appropriate claims to the ClaimsIdentity. This makes using the [Authorize] attribute with Roles very easy.

This is best demonstrated with a simple example.

First of all I head over to JWT.io and create a JSON Web Token with the following payload:

{
  "iss": "http://www.jerriepelser.com",
  "aud": "blog-readers",
  "sub": "123456",
  "exp": 1499863217,
  "roles": [
    "Admin",
    "SuperUser"
  ]
}

Note the array of roles in the “roles” claim.

This is an HS256 token and signed with the secret “mysuperdupersecret”, as can be seen in the following screenshot:

In my ASP.NET Core application I am configuring the JWT middleware:

public class Startup
{
    public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
    {
        var keyAsBytes = Encoding.ASCII.GetBytes("mysuperdupersecret");

        var options = new JwtBearerOptions
        {
            TokenValidationParameters =
            {
                ValidIssuer = "http://www.jerriepelser.com",
                ValidAudience = "blog-readers",
                IssuerSigningKey = new SymmetricSecurityKey(keyAsBytes)
            }
        };
        app.UseJwtBearerAuthentication(options);

        app.UseMvc();
    }
}

When I make a request to my API with the JWT created above, the array of roles in the “roles” claim in the JWT will automatically be added as claims with the type http://schemas.microsoft.com/ws/2008/06/identity/claims/role to my ClaimsIdentity.
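Conceptually, the middleware decodes the JWT payload and fans the "roles" array out into one role claim per entry. A rough Python sketch of that mapping (signature validation is deliberately omitted; this is an illustration, not the middleware's actual code):

```python
import base64
import json

ROLE_CLAIM_TYPE = "http://schemas.microsoft.com/ws/2008/06/identity/claims/role"

def payload_to_claims(payload_b64):
    """Decode a JWT payload segment and turn the "roles" array into
    one (type, value) claim per role; other payload entries map 1:1."""
    padded = payload_b64 + "=" * (-len(payload_b64) % 4)  # restore base64 padding
    payload = json.loads(base64.urlsafe_b64decode(padded))
    claims = []
    for key, value in payload.items():
        if key == "roles":
            claims.extend((ROLE_CLAIM_TYPE, role) for role in value)
        else:
            claims.append((key, str(value)))
    return claims
```

A payload with roles ["Admin", "SuperUser"] therefore produces two separate role claims, which is what the [Authorize] attribute later checks against.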

You can test this by creating the following simple API method that returns the user’s claims:

public class ValuesController : Controller
{
    [Authorize]
    [HttpGet("claims")]
    public object Claims()
    {
        return User.Claims.Select(c => new { Type = c.Type, Value = c.Value });
    }
}

So when I make a call to the /claims endpoint above, and pass the JWT generated before, I will get the following JSON returned:

[
  { "type": "iss", "value": "http://www.jerriepelser.com" },
  { "type": "aud", "value": "blog-readers" },
  { "type": "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/nameidentifier", "value": "123456" },
  { "type": "exp", "value": "1499863217" },
  { "type": "http://schemas.microsoft.com/ws/2008/06/identity/claims/role", "value": "Admin" },
  { "type": "http://schemas.microsoft.com/ws/2008/06/identity/claims/role", "value": "SuperUser" }
]

Where this gets really interesting is when you consider that passing Roles to the [Authorize] attribute will actually check whether there is a claim of type http://schemas.microsoft.com/ws/2008/06/identity/claims/role with the value of the role(s) you are authorizing.

This means that I can simply add [Authorize(Roles = "Admin")] to any API method, and that will ensure that only JWTs where the payload contains the claim “roles” containing the value of Admin in the array of roles will be authorized for that API method.
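That check can be sketched as a simple predicate over the claims list (a Python analogy of what [Authorize(Roles = "...")] evaluates, not the actual MVC code):

```python
ROLE_CLAIM_TYPE = "http://schemas.microsoft.com/ws/2008/06/identity/claims/role"

def is_in_role(claims, role):
    """claims is a list of (type, value) tuples, as produced by the middleware.
    The user is in a role if any role-typed claim carries that value."""
    return any(t == ROLE_CLAIM_TYPE and v == role for t, v in claims)
```

Authorization succeeds only when at least one role claim matches the required role.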

public class ValuesController : Controller
{
    [Authorize(Roles = "Admin")]
    [HttpGet("ping/admin")]
    public string PingAdmin()
    {
        return "Pong";
    }
}

This makes things very easy.

What makes this doubly interesting is that this works with the OpenID Connect middleware as well. So in other words, if the ID Token returned when you authorize a user using the OIDC middleware contains a "roles" claim, the exact same principle applies - simply decorate the MVC controllers with [Authorize(Roles = "Admin")] and only users whose ID Token contains those claims will be authorized.

So bottom line: Ensure the “roles” claim of your JWT contains an array of roles assigned to the user, and you can use [Authorize(Roles = "???")] in your controllers. It all works seamlessly.

Running a specific test with .NET Core and NUnit


I converted the Unit tests for the Auth0.NET SDK to .NET Core. Currently the unit testing framework being used is NUnit, and NUnit 3 comes with a test runner for .NET Core.

You can make use of it by configuring your project.json as follows:

{
  "version": "1.0.0-*",
  "dependencies": {
    "NUnit": "3.5.0",
    "dotnet-test-nunit": "3.4.0-beta-3"
  },
  "testRunner": "nunit",
  "frameworks": {
    "netcoreapp1.0": {
      "imports": "portable-net45+win8",
      "dependencies": {
        "Microsoft.NETCore.App": {
          "version": "1.0.0-*",
          "type": "platform"
        }
      }
    }
  }
}

The configuration above is current as of the writing of this blog post. Please refer to the NUnit 3 Test Runner for .NET Core GitHub page to obtain the most up-to-date information on how to configure it.

With this in place you can easily run your unit tests from the command line by simply running the command

dotnet test

This will however run all the tests in a particular assembly (except for the Explicit ones), but what if you want to run only a specific unit test?

Well, for that you can refer to the documentation for the Console Command Line. According to that documentation, one of the parameters you can pass to the Console Runner is --test, which allows you to specify a comma-separated list of names of tests to run.

You can also pass this --test parameter to the dotnet test runner, which then passes it on to the NUnit .NET Core test runner. So, for example, if I wanted to run the unit test Auth0.ManagementApi.IntegrationTests.UsersTests.Test_users_crud_sequence, I could execute the following command:

dotnet test --test Auth0.ManagementApi.IntegrationTests.UsersTests.Test_users_crud_sequence

And that will then only run that particular unit test:

Using Configuration files in .NET Core Unit Test Projects


So another thing I came across while converting the integration tests for the Auth0.NET SDK to .NET Core was that I had to make use of configuration files to specify the settings the integration tests need to talk to Auth0.

Here are some of the basics which got it working for me…

Add the configuration file

First, add a client-secrets.json file to the Integration test project, e.g.

{
  "AUTH0_CLIENT_ID": "...",
  "AUTH0_CLIENT_SECRET": "..."
}

Configure the client-secrets.json file to be copied to the output directory by updating the buildOptions in the project.json file:

{
  "version": "1.0.0-*",
  "buildOptions": {
    "copyToOutput": {
      "include": [ "client-secrets.json" ]
    }
  },
  "dependencies": {
    "..."
  },
  "testRunner": "nunit",
  "frameworks": {
    "net461": {}
  }
}

Include the .NET Core Configuration NuGet package

Include the JSON Configuration file NuGet package (Microsoft.Extensions.Configuration.Json) in your project.json

{
  "version": "1.0.0-*",
  "buildOptions": {
    "copyToOutput": {
      "include": [ "client-secrets.json" ]
    }
  },
  "dependencies": {
    "...",
    "Microsoft.Extensions.Configuration.Json": "1.0.0"
  },
  "testRunner": "nunit",
  "frameworks": {
    "net461": {}
  }
}

Be sure to run dotnet restore after you have added the package.

Use the configuration in your unit tests

You can now use the configuration file in your unit tests by using the ConfigurationBuilder class:

var config = new ConfigurationBuilder()
    .AddJsonFile("client-secrets.json")
    .Build();

And then access any configuration value:

var clientId = config["AUTH0_CLIENT_ID"];

You can read more about how configuration works in .NET Core projects in the ASP.NET Core Configuration documentation.

Assign execute permissions with Git


Recently I had to configure a build on Jenkins for the work I am doing at Auth0 and ran into an issue with a shell script that did not want to execute and failed with a “Permission Denied” error.

Being new to the Linux world, I reached out to a colleague and it turned out the solution was an easy one. It was new to me, so I am sharing it in case it helps you in the future.

The solution is to use the Git update-index command to assign the execute permissions.

Let’s say the bash script in question is named foo.sh, then go to your shell (or Git shell if you’re on Windows like me) and execute the following command:

git update-index --chmod=+x foo.sh

This will assign execute permissions to the bash file. After that you can commit the changes to the repo.

Hope this helps someone out there :)


New job at Auth0


The new role

I have been doing freelancing work for the past 4 months for a couple of clients, one of which is a company called Auth0. They offer an identity platform which allows developers to easily add authentication (and other identity management) tasks to their applications.

The initial freelance work I did for them involved writing the C# SDK for their Authentication and Management APIs. Early in January they offered me a permanent position which I gladly accepted.

The new position involves writing SDKs, sample applications, quick starts and other documentation. This is the sort of work I enjoy doing as it allows me to cover a wide range of technologies.

What does this mean for this blog and my other plans for 2016?


Well, as far as the blog is concerned, it will probably move away from being focused on ASP.NET and I will start to cover a range of other technologies.

My initial focus at Auth0 is to work on the .NET related stuff, but over time I will move into other areas. This is good for me as it means I will start learning a whole lot of stuff in languages and frameworks that I did not previously do work in.

I also still plan to do more videos this year, though I suspect these will be about some other cool things I come across, and not necessarily ASP.NET. I actually anticipated this change in focus last year already, and started rebranding my YouTube channel from AspnetCasts to just using my personal name.

As far as my travel plans are concerned, these are still in place. The position at Auth0 is a remote position, which means I can work from anywhere in the world. Right now I am in Orlando, and heading up to Seattle next week, for a week. After that I am off to Mexico for perhaps 3 months and then hopefully off to South America.

The only thing that’s really going to change is that I am reneging on my goal to do monthly reviews. These were planned because of the long-term goal of creating a business around my content and earning an income from it. With the new job this is not my focus anymore, and though I still plan on producing a lot of content I am not worried too much if things slip a little, and therefore I am not too worried about keeping myself accountable with this.

Hey, looking for a job?

BTW, Auth0 is hiring across the board in many roles. Have a look at the job openings on their website.

Using JsonExtensionData with JSON.NET


Background

One of the issues we faced when developing the .NET SDK for Auth0 was that user profiles can have different properties based on the origin of the user profile. There is a set of standard properties for a user profile, but over and above that many of the providers can "enhance" the user profile with extra properties.

Here is an example of a user profile for a database user:

And here is one for a user profile from LinkedIn:

The LinkedIn image is cropped and does not display nearly all the available fields in the screenshot. There are just too many of them. But as you can see the LinkedIn user has a bunch of extra properties which were populated that are unique to LinkedIn and which will most probably not ever be present in profiles from other providers.

This also surfaces when retrieving the user profile through the Management API.

Here is the JSON returned for the database user:

{
  "email": "jerrie@jerriepelser.com",
  "email_verified": true,
  "user_id": "auth0|568a366de8e57fe426d23100",
  "picture": "https://s.gravatar.com/avatar/6222081fd7dcea7dfb193788d138c457?s=480&r=pg&d=https%3A%2F%2Fcdn.auth0.com%2Favatars%2Fje.png",
  "nickname": "jerrie",
  "identities": [
    {
      "user_id": "...",
      "provider": "auth0",
      "connection": "Username-Password-Authentication",
      "isSocial": false
    }
  ],
  "updated_at": "2016-01-22T07:36:32.419Z",
  "created_at": "2016-01-04T09:07:57.302Z",
  "name": "jerrie@jerriepelser.com",
  "last_password_reset": "2016-01-17T16:50:27.407Z",
  "last_ip": "154.119.56.252",
  "last_login": "2016-01-22T07:36:32.418Z",
  "logins_count": 16,
  "blocked_for": []
}

And once again for the LinkedIn user:

{
  "email": "jerrie@jerriepelser.com",
  "given_name": "Jerrie",
  "family_name": "Pelser",
  "picture": "https://media.licdn.com/mpr/mprx/0_zVjRUCq9dLg1h2I8vUxXU3rUdCZr82R8UYdkU31k9Xuhxe2hMpDczTbEF74fiIVuqJ0LNQDBKd3H",
  "name": "Jerrie Pelser",
  "apiStandardProfileRequest": {
    "headers": {
      "_total": 1,
      "values": [
        {
          "name": "x-li-auth-token",
          "value": "..."
        }
      ]
    },
    "url": "https://api.linkedin.com/v1/people/ORklSDbtFm"
  },
  "currentShare": { ...removed for brevity... },
  "distance": 0,
  "headline": "Software Developer, Blogger, Screencaster, Curator of ASP.NET Weekly.",
  "industry": "Information Technology and Services",
  "location": {
    "country": {
      "code": "th"
    },
    "name": "Thailand"
  },
  "numConnections": 331,
  "numConnectionsCapped": false,
  "positions": {
    "_total": 4,
    "values": [ {...}, {...}, {...}, {...} ]
  },
  "publicProfileUrl": "https://www.linkedin.com/in/jerriepelser",
  "relationToViewer": {
    "distance": 0
  },
  "siteStandardProfileRequest": {
    "url": "......"
  },
  "email_verified": true,
  "updated_at": "2016-01-14T14:33:25.720Z",
  "user_id": "linkedin|ORklSDbtFm",
  "nickname": "jerrie",
  "identities": [
    {
      "provider": "linkedin",
      "user_id": "...",
      "connection": "linkedin",
      "isSocial": true
    }
  ],
  "created_at": "2016-01-14T14:33:25.720Z",
  "last_ip": "197.229.128.5",
  "last_login": "2016-01-14T14:33:25.716Z",
  "logins_count": 1,
  "blocked_for": []
}

Once again in the case of the LinkedIn document, I have removed a lot of the information returned from the API for brevity. But you get the idea that the API returns vastly different information depending on the original source of the user profile.

For the .NET SDK we wanted to return a strongly typed User object but obviously this left us with a decision to make:

  1. The first option was to simply return the properties from the normalized user profile and then somehow make the other properties available dynamically
  2. The second option was to return a specialized User class for each user profile, depending on the original source. So for a user from LinkedIn we could for example return a LinkedInUser object which had all the extra properties for LinkedIn profiles.
  3. The last option was to simply add every possible property to the User class.

The last option was quickly discounted, and after some deliberation we decided that the second option was also potentially too much work, as it meant that every time the core API team added an extra provider, or decided to retrieve extra attributes from a specific provider, we had to update the SDK as well.

So finally we settled on the first option, but we were still not sure how to do this.

After some investigation I stumbled across the JsonExtensionData attribute in JSON.NET. What this allows you to do is to deserialize elements of a JSON document which do not have matching properties on the destination object into the dictionary which is decorated with the [JsonExtensionData] attribute. (Also see the JSON.NET documentation on the various Serialization Attributes.)

Example

As a practical example, let us assume the following JSON array is being returned from an API call:

[
  {
    "first_name": "Jerrie",
    "last_name": "Pelser",
    "initials": "JJ",
    "profile_image": "http://www.gravatar.com/some_image"
  },
  {
    "first_name": "Peter",
    "last_name": "Parker",
    "initials": "P",
    "profile_image": "http://www.gravatar.com/another_image",
    "address": {
      "city": "New York",
      "suburb": "Forest Hills"
    },
    "family": [
      {
        "first_name": "May",
        "last_name": "Parker"
      }
    ]
  }
]

We are trying to deserialize this into the following User class:

public class User
{
    [JsonProperty("first_name")]
    public string FirstName { get; set; }

    [JsonProperty("last_name")]
    public string LastName { get; set; }

    [JsonProperty("initials")]
    public string Initials { get; set; }

    [JsonProperty("profile_image")]
    public string ProfileImage { get; set; }
}

The standard properties for the User class are FirstName, LastName, Initials and ProfileImage. These are common across all our user objects.

As you can see, however, the second object in the JSON array has extra attributes with no backing properties on our User class, so what we can do is update the User class by adding an AdditionalData property of type IDictionary&lt;string, JToken&gt; which is decorated with the [JsonExtensionData] attribute:

public class User
{
    [JsonProperty("first_name")]
    public string FirstName { get; set; }

    [JsonProperty("last_name")]
    public string LastName { get; set; }

    [JsonProperty("initials")]
    public string Initials { get; set; }

    [JsonProperty("profile_image")]
    public string ProfileImage { get; set; }

    [JsonExtensionData]
    public IDictionary<string, JToken> AdditionalData { get; set; }
}

Now when we deserialize the JSON array, all the extra attributes for the JSON documents which are not mapped to properties in the class will be added to the AdditionalData dictionary.

Below you can see a screenshot of the Visual Studio debugger at runtime, and you can see the extra attributes being added to the dictionary:
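The same catch-all pattern is easy to mimic in any language. Here is a small Python sketch (field names follow the example above; this is an analogy for the behaviour, not JSON.NET itself):

```python
import json

# Properties with a declared counterpart on the destination object.
KNOWN_FIELDS = {"first_name", "last_name", "initials", "profile_image"}

def deserialize_user(raw):
    """Known JSON properties map to fields; everything else lands in an
    additional_data dictionary, mimicking [JsonExtensionData]."""
    data = json.loads(raw)
    user = {field: data.get(field) for field in KNOWN_FIELDS}
    user["additional_data"] = {k: v for k, v in data.items() if k not in KNOWN_FIELDS}
    return user
```

Deserializing the second object from the array above would populate the four known fields and put "address" and "family" into additional_data.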

Conclusion

JSON.NET is very easy to use for the most common cases, but what makes it so powerful is that it also has a few really advanced features which allow you to control how data gets serialized and deserialized, so you can almost always find a way to work around those tricky edge cases.

Using custom converters in JSON.NET: Array or Object?


The problem

During the development of the Auth0 .NET SDK, I ran into an issue with one of our Management API calls where it could return a different JSON structure based on parameters passed in by the user.

The offending API call in question was the Users endpoint, where you could pass in a parameter called include_totals which would return the list of users along with the total number of records and some other paging information. If you did not pass in this parameter (or specified a value of false), it would simply return an array of users.

To make this a bit more visual, here is an approximate example of the response when you ask the API to include the totals:

{
  "start": 0,
  "limit": 50,
  "length": 36,
  "total": 36,
  "users": [
    {
      "email": "john.doe@gmail.com",
      "email_verified": false,
      "username": "johndoe",
      "phone_number": "+199999999999999",
      "phone_verified": false,
      "user_id": "usr_5457edea1b8f33391a000004",
      "created_at": "",
      "updated_at": "",
      "identities": [
        {
          "connection": "Initial-Connection",
          "user_id": "5457edea1b8f22891a000004",
          "provider": "auth0",
          "isSocial": false
        }
      ],
      "app_metadata": {},
      "user_metadata": {},
      "picture": "",
      "name": "",
      "nickname": "",
      "multifactor": [""],
      "last_ip": "",
      "last_login": "",
      "logins_count": 0,
      "blocked": false
    },
    {},
    {}
  ]
}

So you can see that the API returns a JSON object with the paging information, and then a users property containing an array with the list of users.

In the case where you ask to not include the totals, the API would simply return an array of users, e.g.:

[
  {
    "email": "john.doe@gmail.com",
    "email_verified": false,
    "username": "johndoe",
    "phone_number": "+199999999999999",
    "phone_verified": false,
    "user_id": "usr_5457edea1b8f33391a000004",
    "created_at": "",
    "updated_at": "",
    "identities": [
      {
        "connection": "Initial-Connection",
        "user_id": "5457edea1b8f22891a000004",
        "provider": "auth0",
        "isSocial": false
      }
    ],
    "app_metadata": {},
    "user_metadata": {},
    "picture": "",
    "name": "",
    "nickname": "",
    "multifactor": [""],
    "last_ip": "",
    "last_login": "",
    "logins_count": 0,
    "blocked": false
  },
  {},
  {}
]

In my case I wanted to always return an instance of the PagedList<User> class to the user, which is defined as follows:

public class PagedList<T> : List<T>
{
    public PagedList()
    {
    }

    public PagedList(IEnumerable<T> collection) : base(collection)
    {
    }

    public PagedList(IEnumerable<T> collection, PagingInformation paging) : base(collection)
    {
        Paging = paging;
    }

    public PagedList(int capacity) : base(capacity)
    {
    }

    public PagingInformation Paging { get; set; }
}

public class PagingInformation
{
    [JsonProperty("length")]
    public int Length { get; set; }

    [JsonProperty("limit")]
    public int Limit { get; set; }

    [JsonProperty("start")]
    public int Start { get; set; }

    [JsonProperty("total")]
    public int Total { get; set; }

    public PagingInformation(int start, int limit, int length, int total)
    {
        Start = start;
        Limit = limit;
        Length = length;
        Total = total;
    }
}

If the user requested paging information, the Paging property will contain it.

JsonConverter to the rescue

As you can expect, this creates a problem when trying to deserialize the resulting JSON to a specific .NET type, because we are working with two very different potential JSON structures being deserialized.

Thankfully JSON.NET offers a solution by allowing you to create a custom converter which specifies how an object is serialized or deserialized. All you need to do is inherit from JsonConverter and then provide implementations for the CanConvert, WriteJson and ReadJson methods.

The logic is then fairly simple; I simply check whether the JSON being deserialized is an object or an array.

If it is an object, I know that the user requested the totals, so I extract the various paging information properties and deserialize the users property of the JSON object to a list of User. I then return a PagedList<User> with the list of users as well as the paging information.

In the case where the JSON is an array, I know that paging information was not requested, so I simply convert the JSON to a list of User, and return a PagedList<User> with the list of users and no paging information:

internal class UserPagedListConverter : JsonConverter
{
    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
    {
        throw new System.NotImplementedException();
    }

    public override bool CanConvert(Type objectType)
    {
        return typeof(PagedList<User>).GetTypeInfo().IsAssignableFrom(objectType.GetTypeInfo());
    }

    public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
    {
        if (reader.TokenType == JsonToken.StartObject)
        {
            JObject item = JObject.Load(reader);

            if (item["users"] != null)
            {
                var users = item["users"].ToObject<IList<User>>(serializer);

                int length = item["length"].Value<int>();
                int limit = item["limit"].Value<int>();
                int start = item["start"].Value<int>();
                int total = item["total"].Value<int>();

                return new PagedList<User>(users, new PagingInformation(start, limit, length, total));
            }
        }
        else
        {
            JArray array = JArray.Load(reader);
            var users = array.ToObject<IList<User>>(serializer);

            return new PagedList<User>(users);
        }

        // This should not happen. Perhaps better to throw an exception at this point?
        return null;
    }
}

Simple as that. Then when you want to deserialize a JSON string, you can simply pass along the UserPagedListConverter to the DeserializeObject method, e.g.

JsonConvert.DeserializeObject<PagedList<User>>(content, new UserPagedListConverter());

ASP.NET Core: No more worries about checking in secrets


A number of articles have been written about the new Configuration model in ASP.NET Core, but one of the things which does not seem to be highlighted very often is how it can protect you from accidentally checking secrets (such as connection string passwords or OAuth keys) into source control.

There have been various cases in the media over the past number of years where people have ended up on the wrong side of an Amazon Web Services bill after an unscrupulous operator has managed to get hold of their AWS keys and used them to create EC2 instances.

In ASP.NET Core, avoiding this is dead simple.

Let us first look at a sample piece of code from the Startup class generated by one of the default ASP.NET Core application templates:

public class Startup
{
    public Startup(IHostingEnvironment env)
    {
        var builder = new ConfigurationBuilder()
            .SetBasePath(env.ContentRootPath)
            .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
            .AddJsonFile($"appsettings.{env.EnvironmentName}.json", optional: true)
            .AddEnvironmentVariables();
        Configuration = builder.Build();
    }

    // Rest of class omitted for brevity...
}

As you can see in the configuration of the ConfigurationBuilder, by default the configuration will be read from 3 different sources.

  1. The appsettings.json file
  2. The appsettings file which correlates with the current environment, e.g. appsettings.Development.json
  3. The environment variables

Configuration settings will be read from these 3 sources in order.

Using the environment-specific appsettings file

One of the first ways you can avoid checking in secrets is by using the environment-specific appsettings file and excluding that file from source control.

So in the configuration specified above, if you run on your local machine and have configured the Development environment (read more about Working with Multiple Environments), then the ASP.NET Core runtime is going to try and load settings from an optional appsettings.Development.json file.

As you can see in the code snippet, this file is specified as optional, so what you can do is to specify your secret values inside this file, e.g.

{
  "twitter": {
    "consumerKey": "your consumer key goes here",
    "consumerSecret": "your consumer secret goes here"
  }
}

This will make the configuration settings with the keys twitter:consumerKey and twitter:consumerSecret available inside your application.

All you need to do is exclude the file from source control, so if you use Git then simply add the file to your .gitignore file so it does not get checked in.

You can even make it more explicit that the file contains secrets, by naming it secrets.json and excluding the secrets.json file from source control:
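For example, the relevant .gitignore entries might look as follows (assuming these files live in the project root):

```
# Local configuration secrets - never check these in
secrets.json
appsettings.Development.json
```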

public class Startup
{
    public Startup(IHostingEnvironment env)
    {
        var builder = new ConfigurationBuilder()
            .SetBasePath(env.ContentRootPath)
            .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
            .AddJsonFile($"appsettings.{env.EnvironmentName}.json", optional: true)
            .AddJsonFile("secrets.json", optional: true)
            .AddEnvironmentVariables();
        Configuration = builder.Build();
    }

    // Rest of class omitted for brevity...
}

Store them as environment variables

You may have noticed the call to AddEnvironmentVariables in the code samples above. What this does is load configuration values from environment variables.

So using the example of the Twitter Consumer Key and Secret above, I can simply specify environment variables twitter:consumerKey and twitter:consumerSecret with the relevant values.
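For example, on Linux or macOS this might look as follows (a colon is not a valid character in a shell variable name, so ASP.NET Core also accepts a double underscore as the section separator):

```shell
# Double underscore stands in for the colon in the configuration key
export twitter__consumerKey="your consumer key goes here"
export twitter__consumerSecret="your consumer secret goes here"

echo "$twitter__consumerKey"
```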

And because of the call to AddEnvironmentVariables, the configuration settings with the keys twitter:consumerKey and twitter:consumerSecret will once again be available inside my application.

Use the Secret Manager tool

One more (and probably the best) way in which you can do this is to use the Secret Manager tool, which is available as a .NET Core tool and was built specifically for this purpose.

You can read the article above for more detail on exactly how to use this, but what it boils down to is that there is a .NET Core tool available which you can add to your application, called the Secret Manager Tool.

You can use this tool to specify the values for any secrets you use inside your application, and they will be stored securely on your local machine without any chance of being checked into source control.
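As a rough sketch, setting and inspecting secrets from the command line looks something like this (assuming the Secret Manager tool has been added to your project, and using the hypothetical Twitter keys from earlier):

```shell
# Store the secrets in the local user secret store (outside the project folder)
dotnet user-secrets set "twitter:consumerKey" "your consumer key goes here"
dotnet user-secrets set "twitter:consumerSecret" "your consumer secret goes here"

# List all secrets stored for this project
dotnet user-secrets list
```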

At runtime you can use the AddUserSecrets method to load the values of the configuration variables from the secret storage:

public Startup(IHostingEnvironment env)
{
    var builder = new ConfigurationBuilder()
        .SetBasePath(env.ContentRootPath)
        .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
        .AddJsonFile($"appsettings.{env.EnvironmentName}.json", optional: true);

    if (env.IsDevelopment())
    {
        // For more details on using the user secret store see http://go.microsoft.com/fwlink/?LinkID=532709
        builder.AddUserSecrets();
    }

    builder.AddEnvironmentVariables();
    Configuration = builder.Build();
}

A handy side-effect for our Auth0 samples

Let’s quickly look again at the code for specifying the configuration sources inside our application:

var builder = new ConfigurationBuilder()
    .SetBasePath(env.ContentRootPath)
    .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
    .AddJsonFile($"appsettings.{env.EnvironmentName}.json", optional: true)
    .AddEnvironmentVariables();
Configuration = builder.Build();

I mentioned before that configuration settings get loaded from the configuration sources in the order in which the sources are specified, but what I did not make clear is that multiple configuration sources can declare configuration settings with the same key. What happens in this case is that the value from a subsequent configuration source will override the value from a previous configuration source.
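As a rough sketch of this override behavior (using in-memory sources purely for illustration; the keys and values are assumptions):

```csharp
var config = new ConfigurationBuilder()
    .AddInMemoryCollection(new Dictionary<string, string>
    {
        ["Auth0:ClientId"] = "{CLIENT_ID}"           // stand-in for appsettings.json
    })
    .AddInMemoryCollection(new Dictionary<string, string>
    {
        ["Auth0:ClientId"] = "my-actual-client-id"   // stand-in for an environment variable
    })
    .Build();

// The later source wins, so config["Auth0:ClientId"] is "my-actual-client-id"
```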

This is useful for me when developing our Auth0 samples, because we have a clever little trick where we replace configuration values with the actual values from your Auth0 instance.

Here is the contents from the configuration file of one of our samples:

{
  "AppSettings": {
    "SiteTitle": "Auth0 - ASP.NET 5 Web App Sample"
  },
  "Auth0": {
    "ClientId": "{CLIENT_ID}",
    "ClientSecret": "{CLIENT_SECRET}",
    "Domain": "{DOMAIN}",
    "RedirectUri": "http://localhost:5001/"
  }
}

Do you see those values {CLIENT_ID}, {CLIENT_SECRET} and {DOMAIN}? When you download this sample application through our documentation website, and you are signed in to your Auth0 account, we will automatically replace those with the correct values from your Auth0 instance, so you do not have to do any configuration of the application after you have downloaded it - you can just run it immediately and it is pre-configured to work with your specific Auth0 instance.

Now previously, when I worked on these samples to code and test them, I had to set those configuration settings to actual values. So instead of {CLIENT_ID}, I would have to specify the actual Client ID.

I then also had to remember, every time I checked a sample application in to GitHub, to replace the actual Client ID I used while testing with the string {CLIENT_ID} again, so our sample downloader worked correctly.

From time to time I forgot to do this…

With the new multiple configuration sources in ASP.NET Core, this is a thing of the past. I never have to touch the values of those configuration settings in appsettings.json again. All I do is specify environment variables with the correct values, which will then override the values in the appsettings.json file because of the call to AddEnvironmentVariables.

So when I run the samples on my computer, the environment variables I have specified get used, but when a user downloads a sample they will get the correct values in appsettings.json, and I do not have to worry about messing things up by accident.

Accessing the Request object inside a Tag Helper in ASP.NET Core


Last week I was doing a little experiment for our Auth0 support for ASP.NET Core which involved writing a Tag Helper. For this Tag Helper I had to access the actual URL for the request, so I therefore had to somehow get a hold of the HttpRequest inside of the Tag Helper.

Injecting IHttpContextAccessor

The Request is not available as a property of the TagHelper base class so I figured that I needed to inject IHttpContextAccessor into my Tag Helper’s constructor, for example:

public class LockTagHelper : TagHelper
{
    private readonly IHttpContextAccessor _contextAccessor;

    public LockTagHelper(IHttpContextAccessor contextAccessor)
    {
        _contextAccessor = contextAccessor;
    }
}

The Request can then later be accessed as follows:

var request = _contextAccessor.HttpContext.Request;

On my first try I got the following exception:

InvalidOperationException: Unable to resolve service for type ‘Microsoft.AspNetCore.Http.IHttpContextAccessor’ while attempting to activate ‘Auth0.AspNetCore.Mvc.TagHelpers.LockTagHelper’.

I knew this worked before when I used ASP.NET Core (then still called ASP.NET 5) last year, and after a bit of research it seemed that the default behaviour had changed and you now have to register IHttpContextAccessor manually with the DI framework.

So inside the ConfigureServices method of your Startup class, simply add the following line:

services.AddSingleton<IHttpContextAccessor, HttpContextAccessor>();

This worked great, but it posed a problem for me. This particular Tag Helper would be available as a NuGet package, and I did not want to expect users to have to configure IHttpContextAccessor with the DI container in order for my Tag Helper to work correctly.

Using ViewContextAttribute

I needed a way which was less error-prone, and after posing the question on GitHub, Pranav supplied a much better solution.

Simply declare a property of type ViewContext and decorate it with the [ViewContext] attribute.

You can then access the HttpRequest through the ViewContext.HttpContext.Request property.

public class LockTagHelper : TagHelper
{
    protected HttpRequest Request => ViewContext.HttpContext.Request;

    protected HttpResponse Response => ViewContext.HttpContext.Response;

    [ViewContext]
    public ViewContext ViewContext { get; set; }

    // Code omitted for brevity
}