ASP.NET Blogs News Feed 
Sunday, May 25, 2014  |  From ASP.NET Blogs

This is a good page showing the available AppBarButton icons for your Windows 8.1 and Windows Phone 8.1 universal apps.

http://msdn.microsoft.com/en-us/library/windows/apps/xaml/jj841127.aspx

appbarbuttons

Friday, May 23, 2014  |  From ASP.NET Blogs

If you're looking to catch up with me, you can find me at my current blog, which is at http://devhammer.net/

Earlier this year, I moved on from my role at Microsoft and I am currently pursuing freelance development opportunities with ASP.NET, HTML, JavaScript, and web technologies in general. I'm also available for development and consulting for Windows Store apps and Windows Phone apps. 

Friday, May 23, 2014  |  From ASP.NET Blogs

ASP.NET Web API does not provide any output caching capabilities out of the box, other than those you would traditionally find in the ASP.NET caching module. Fortunately, Filip wrote a very nice library that you can use to decorate your Web API controller methods with an [OutputCaching] attribute, similar to the one you find in ASP.NET MVC. The library lets you configure different persistence stores for the cached data, using memory by default. In this post, I will show how you can implement your own persistence provider for AppFabric, in order to support distributed caching in web applications running on premises.

Read more here 
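As a rough sketch of the idea described above (the [OutputCaching] attribute name and its parameters are assumptions for illustration, not the library's actual API, so a stand-in attribute class is included to make the sketch self-contained):

```csharp
using System;
using System.Collections.Generic;
using System.Web.Http;

// Stand-in attribute so the sketch compiles on its own; a real library
// would implement caching behavior in an action filter.
public class OutputCachingAttribute : Attribute
{
    public int ServerTimeSpan { get; set; }
}

public class ProductsController : ApiController
{
    // The intent: the serialized response of GetAll() is cached
    // server-side for 60 seconds instead of being recomputed per request.
    [OutputCaching(ServerTimeSpan = 60)]
    public IEnumerable<string> GetAll()
    {
        return new[] { "Product 1", "Product 2" };
    }
}
```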

 

Wednesday, May 21, 2014  |  From ASP.NET Blogs

The Speaker Idol contest of the Montréal .NET Community is back!!!

For the last meeting of the year, we invite you to come present your favorite technology, your go-to library, your innovative open source project, or any other software development topic. In fact, the choice of technology is not that important; what matters is gaining experience presenting a technical subject. That is a soft skill that is essential to your career development. Sooner or later you will have to give presentations to clients or to your bosses, and a botched or poorly delivered presentation can stall a project, a sale, or even a promotion.

> Remember that what will be judged is your presentation, not the technology you present.

 

Tempted by the adventure?

We suggest watching the free Pluralsight course "Get Involved" by Scott Hanselman and Jeff Atwood: http://getinvolved.hanselman.com

As well as the following:

Prizes* to be won by the presenters:

  • 1 Xbox One*!!
  • 1 certificate* for a free training course of your choice from Intertech.com
  • 1 Telerik DevCraft Complete license*
  • 1 JetBrains license* of your choice (including ReSharper)
  • 1 Mindscape license* of your choice (except the MegaPack)
  • 2 Cerebrata Azure Management Studio licenses*

*The prizes (except the Xbox One) are courtesy of the vendors, with no commitment on the part of the .NET Community.

Contest information

10-minute presentation: in French or in English, with visual support such as a PowerPoint deck and code.
Be careful: 10 minutes is very short for live demos. Make sure you introduce your topic, explain the problem it tries to solve, demonstrate it, and conclude/summarize at the end – all in only 10 minutes! Yes, it's a big challenge, so make sure you focus on the essentials and on the message you want to get across.

A first public presentation: The contest is open only to people who have never given a technical presentation at a user group or a conference.

Deadline: You have until Monday, May 26 at 11:59 PM to submit your entry. Please send a brief description (200 words max.) of your presentation, along with your bio, to info@dotnetmontreal.com

Maximum number of participants: From the entries received, the 8 best will be chosen to present. The selected candidates will be announced on Friday, May 30.

Presentation order: Alphabetical, by last name.

Expert panel: After each presentation, a panel of experts will give the participants feedback based on:

  • Mastery of the subject
  • Quality of the presentation
  • Ability to get your message across
  • Quality of the PowerPoint deck

Important: the experts are there to help you improve by giving you advice.

What will be provided: A laptop with the latest Visual Studio tools and SQL Server Express. If you need specific tools, please bring your own laptop.

What you must bring: In any case, make sure you have a USB key with your PowerPoint presentation and your code.

Audience vote: At the end of the evening, the audience in the room will vote, and prizes will be awarded to the best presentations (1 per participant, see the list above).

The winner will have the opportunity to give a full one-hour presentation next season.

Wednesday, May 21, 2014  |  From ASP.NET Blogs

VG.net Designer in Visual Studio 2013


We have released version 8.5 of the VG.net vector graphics system. This release supports Visual Studio 2013. Companies who purchased a VG.net license after October 1, 2013, are eligible for a free upgrade. We will be sending you an email.


There is one cosmetic problem that wasted our time, as we could not find a workaround. It occurs when your display is set to a high DPI. You can see the problem in the image of the toolbox below, which uses a DPI setting of 125% on Windows 7:


Visual Studio 2013 High DPI Toolbox Bug


The ToolboxItem class accepts only Bitmaps with a size of 16x16. We tried many sizes and many bitmap formats. As you can see, this tiny Bitmap is then scaled by the toolbox, and the scaling algorithm adds artifacts. This is an "improvement" Microsoft recently added to Visual Studio 2013.


Monday, May 19, 2014  |  From ASP.NET Blogs

This is part of a series of posts about NHibernate Pitfalls. See the entire collection here.

When saving a new entity that has references to other entities (one to one, many to one), one has two options for setting their values:

  • Load each of these references by calling ISession.Get and passing the foreign key;
  • Load a proxy instead, by calling ISession.Load with the foreign key.

So, what is the difference? Well, ISession.Get goes to the database and tries to retrieve the record with the given key, returning null if no record is found. ISession.Load, on the other hand, just returns a proxy to that record, without going to the database. This turns out to be the better option, because we really don’t need to retrieve the record – and all of its non-lazy properties and collections – we just need its key.

An example:

//going to the database
OrderDetail od = new OrderDetail();
od.Product = session.Get<Product>(1);    //a product is retrieved from the database
od.Order = session.Get<Order>(2);        //an order is retrieved from the database
session.Save(od);

//creating in-memory proxies
OrderDetail od = new OrderDetail();
od.Product = session.Load<Product>(1);    //a proxy to a product is created
od.Order = session.Load<Order>(2);        //a proxy to an order is created
session.Save(od);

So, if you just need to set a foreign key, use ISession.Load instead of ISession.Get.

Monday, May 19, 2014  |  From ASP.NET Blogs

Hello friends,
Microsoft has published two more of my articles, where I explain in detail:
Using GPS with Maps on Windows Phone 8.1 - http://msdn.microsoft.com/pt-br/library/dn690107.aspx
Windows Phone 8.1 with a SQL Server Compact 3.5/4 Database - http://msdn.microsoft.com/pt-br/library/dn690241.aspx

Good studying to all, and success.
Renatão

Monday, May 19, 2014  |  From ASP.NET Blogs

If you’re one of the two people who have followed my blog for many years, you know that I’ve been going at POP Forums for almost 15 years now. Publishing it as an open source app has been a big help, because it helps me understand how people want to use it, and having it translated into six languages is pretty sweet. Despite this warm and fuzzy group hug, there has been an ugly hack hiding in there for years.

One of the things we find ourselves wanting to do is hide some kind of regularly running process inside of an ASP.NET application. The motivation for this has always been that a lot of people simply don’t have a choice, because they’re running the app on shared hosting, or don’t otherwise have access to a box that can run some kind of regular background service. In POP Forums, I “solved” this problem years ago by hiding some static timers in an HttpModule. Truthfully, this works well as long as you don’t run multiple instances of the app, which in the cloud world is always a possibility. With the arrival of WebJobs in Azure, I’m going to solve this problem.

This post isn’t about that.

The other little hacky problem that I “solved” was spawning a background thread to queue emails to subscribed users of the forum. This evolved quite a bit over the years, starting with a long-running page to mail users in real time, when I had only a few hundred. By the time it got into the thousands, or tens of thousands, I needed a better way. What I did was launch a new thread that read in all of the user data, then wrote a queued email to the database (as in, the entire body of the email, every time), with the properly formatted opt-out link. It was super inefficient, but it worked.

Then I moved my biggest site using it, CoasterBuzz, to an Azure Website, and it stopped working. So let’s start with the first stupid thing I was doing. The new thread was simply created with delegate code inline. As best I can tell, Azure Websites are more aggressive about garbage collection, because that thread didn’t queue even one message. When the calling server response went out of scope, so went the magic background thread. Duh, all I had to do was move the thread to a private static variable in the class. That’s the way I was able to keep stuff running from the HttpModule. (And yes, I know this is still prone to failure, particularly if the app recycles. For as infrequently as it’s used, I have not, however, experienced this.)

It was still failing, but this time I wasn’t sure why. It would queue a few dozen messages, then die. Running in Azure, I had to turn on application logging and FTP in to see what was going on. That led me to a helper method I was using as a delegate to build the unsubscribe links. The idea here is that I didn’t want yet another config entry to describe the base URL, appended with the right path that would match the routing table. No, I wanted the app to figure it out for you, so I came up with this little thing:

public static string FullUrlHelper(this Controller controller, string actionName, string controllerName, object routeValues = null)
{
	var helper = new UrlHelper(controller.Request.RequestContext);
	var requestUrl = controller.Request.Url;
	if (requestUrl == null)
		return String.Empty;
	var url = requestUrl.Scheme + "://";
	url += requestUrl.Host;
	url += (requestUrl.Port != 80 ? ":" + requestUrl.Port : "");
	url += helper.Action(actionName, controllerName, routeValues);
	return url;
}


And yes, that should have been done with a string builder. This is useful for sending out the email verification messages, too. As clever as I thought I was with this, I was using a delegate in the admin controller to format these unsubscribe links for tens of thousands of users. I passed that delegate into a service class that did the email work:



Func<User, string> unsubscribeLinkGenerator = 
	user => this.FullUrlHelper("Unsubscribe", AccountController.Name, new { id = user.UserID, key = _profileService.GetUnsubscribeHash(user) });
_mailingListService.MailUsers(subject, body, htmlBody, unsubscribeLinkGenerator);


Cool, right? Actually, not so much. If you look back at the helper, this delegate then will depend on the controller context to learn the routing and format for the URL. As you might have guessed, those things were turning null after a few dozen formatted links, when the original request to the admin controller went away. That this wasn’t already happening on my dedicated server is surprising, but again, I understand why the Azure environment might be eager to reclaim a thread after servicing the request.



It’s already inefficient that I’m building the entire email for every user, but going back to check the routing table for the right link every time isn’t a win either. I put together a little hack to look up one generic URL, and use that as the basis for a string format. If you’re wondering why I didn’t just use the curly braces up front, it’s because they get URL formatted:



var baseString = this.FullUrlHelper("Unsubscribe", AccountController.Name, new { id = "--id--", key = "--key--" });
baseString = baseString.Replace("--id--", "{0}").Replace("--key--", "{1}");
Func<User, string> unsubscribeLinkGenerator =
	user => String.Format(baseString, user.UserID, _profileService.GetUnsubscribeHash(user));
_mailingListService.MailUsers(subject, body, htmlBody, unsubscribeLinkGenerator);


And wouldn’t you know it, the new solution works just fine. It’s still kind of hacky and inefficient, but it will work until this somehow breaks too.
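Picking up the StringBuilder aside from earlier, a hedged sketch of the helper without repeated string concatenation might look like this (same System.Web.Mvc types as the original; the method is renamed here only to avoid clashing with the version above):

```csharp
using System;
using System.Text;
using System.Web.Mvc;

public static class UrlExtensions
{
    // Same logic as FullUrlHelper above, but accumulating into a
    // StringBuilder instead of concatenating intermediate strings.
    public static string FullUrlHelperSb(this Controller controller, string actionName, string controllerName, object routeValues = null)
    {
        var helper = new UrlHelper(controller.Request.RequestContext);
        var requestUrl = controller.Request.Url;
        if (requestUrl == null)
            return String.Empty;
        var url = new StringBuilder();
        url.Append(requestUrl.Scheme).Append("://").Append(requestUrl.Host);
        if (requestUrl.Port != 80)
            url.Append(':').Append(requestUrl.Port);
        url.Append(helper.Action(actionName, controllerName, routeValues));
        return url.ToString();
    }
}
```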

Monday, May 19, 2014  |  From ASP.NET Blogs

Most modern programming languages, including C#, support object-oriented programming. Features such as encapsulation, inheritance, overloading and polymorphism are code-level features. Using these features is just one part of the story. Equally important is applying some object-oriented design principles while writing your C# code. SOLID is a set of five such principles: the Single Responsibility Principle, Open/Closed Principle, Liskov Substitution Principle, Interface Segregation Principle and Dependency Inversion Principle. Applying these time-proven principles makes your code structured, neat and easy to maintain. This article discusses the SOLID principles and illustrates how they can be applied to your C# code.

http://www.binaryintellect.net/articles/7f857089-68f5-4d76-a3b7-57b898b6f4a8.aspx
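To make one of the five principles concrete, here is a minimal, hypothetical C# sketch of the Dependency Inversion Principle (the types are invented for illustration, not taken from the linked article):

```csharp
using System;

// The high-level OrderProcessor depends on an abstraction, not on a
// concrete logger, so implementations can be swapped without touching it.
public interface ILogger
{
    void Log(string message);
}

public class ConsoleLogger : ILogger
{
    public void Log(string message) => Console.WriteLine(message);
}

public class OrderProcessor
{
    private readonly ILogger _logger;

    // The dependency is injected, inverting control of its construction.
    public OrderProcessor(ILogger logger) => _logger = logger;

    public void Process(int orderId) => _logger.Log($"Processing order {orderId}");
}
```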

 

Sunday, May 18, 2014  |  From ASP.NET Blogs

I love my MSDN Azure account. I can spin up a demo/dev app or VM in seconds. In fact, it is so easy to create a virtual machine that Azure shut down my whole account!

Last night I spun up a Java Virtual Machine to play with some Android stuff. My mistake was that I didn’t read the Virtual Machine pricing warning:

“I have a MSDN Azure Benefit subscription. Can I use my monthly Azure credits to purchase Oracle software?

“No, Azure credits in our MSDN offers are not applicable to Oracle software. In order to purchase Oracle software in the MSDN Azure Benefit subscription, customers need to turn off their {0} spending limit and pay at the regular pay-as-you-go rate. Otherwise, Oracle usage will hit the {1} spending limit and the subscription will be immediately disabled.

 Immediately disabled? Yup. Everything connected to the subscription was shut off, deallocated, rendered useless - even the free Web sites and the free Sendgrid email service.

 The fix? I had to remove the spending limit from my account so I could pay $0.49 (49 cents) for the JVM usage. I still had $134.10 in credits remaining for regular usage with 6 days left in the billing month.

 Now the restoration/clean-up begins… figuring out how to get the web sites and services back online.

 To me, the preferable way would be for Azure to warn me when setting up a JVM that I had no way of paying for the service. In the alternative, shut down just the offending services – the ones that can’t be covered by the regular credits.

What a mess.

Saturday, May 17, 2014  |  From ASP.NET Blogs

This is part of a series of posts about NHibernate Pitfalls. See the entire collection here.

NHibernate allows you to force loading additional references (many to one, one to one) or collections (one to many, many to many) in a query. You must know, however, that this is incompatible with paging. It’s easy to see why.

Let’s say you want to get 5 products starting on the fifth, you can issue the following LINQ query:

session.Query<Product>().Take(5).Skip(5).ToList();

This will produce the following SQL in SQL Server:





SELECT
    TOP (@p0) product1_4_,
    name4_,
    price4_
FROM
    (select
        product0_.product_id as product1_4_,
        product0_.name as name4_,
        product0_.price as price4_,
        ROW_NUMBER() OVER(
    ORDER BY
        CURRENT_TIMESTAMP) as __hibernate_sort_row
    from
        product product0_) as query
    WHERE
        query.__hibernate_sort_row > @p1
    ORDER BY
        query.__hibernate_sort_row;

If, however, you also wanted to bring the associated order details, you might be tempted to try this:

session.Query<Product>().Fetch(x => x.OrderDetails).Take(5).Skip(5).ToList();

Which, in turn, will produce this SQL:





SELECT
    TOP (@p0) product1_4_0_,
    order1_3_1_,
    name4_0_,
    price4_0_,
    order2_3_1_,
    product3_3_1_,
    quantity3_1_,
    product3_0__,
    order1_0__
FROM
    (select
        product0_.product_id as product1_4_0_,
        orderdetai1_.order_detail_id as order1_3_1_,
        product0_.name as name4_0_,
        product0_.price as price4_0_,
        orderdetai1_.order_id as order2_3_1_,
        orderdetai1_.product_id as product3_3_1_,
        orderdetai1_.quantity as quantity3_1_,
        orderdetai1_.product_id as product3_0__,
        orderdetai1_.order_detail_id as order1_0__,
        ROW_NUMBER() OVER(
    ORDER BY
        CURRENT_TIMESTAMP) as __hibernate_sort_row
    from
        product product0_
    left outer join
        order_detail orderdetai1_
            on product0_.product_id=orderdetai1_.product_id
        ) as query
WHERE
    query.__hibernate_sort_row > @p1
ORDER BY
    query.__hibernate_sort_row;

However, because of the JOIN, if your products have more than one order detail, you will get several records – one per order detail – per product, which means that pagination will be broken.

There is a workaround, which forces you to write your LINQ query in another way:





session.Query<OrderDetail>().Where(x => session.Query<Product>().Select(y => y.ProductId).Take(5).Skip(5).Contains(x.Product.ProductId)).Select(x => x.Product).ToList();

Or, using HQL:





session.CreateQuery("select od.Product from OrderDetail od where od.Product.ProductId in (select p.ProductId from Product p skip 5 take 5)").List<Product>();

The generated SQL will then be:





select
    product1_.product_id as product1_4_,
    product1_.name as name4_,
    product1_.price as price4_
from
    order_detail orderdetai0_
left outer join
    product product1_
        on orderdetai0_.product_id=product1_.product_id
where
    orderdetai0_.product_id in (
        SELECT
            TOP (@p0) product_id
        FROM
            (select
                product2_.product_id,
                ROW_NUMBER() OVER(
            ORDER BY
                CURRENT_TIMESTAMP) as __hibernate_sort_row
            from
                product product2_) as query
        WHERE
            query.__hibernate_sort_row > @p1
        ORDER BY
            query.__hibernate_sort_row);

Which will get you what you want: for 5 products, all of their order details.


Thursday, May 15, 2014  |  From ASP.NET Blogs

In 2010 I had the experience of working for a business that had lots of challenges.

One of those challenges was a lack of technical architecture and business value recognition, which translated into spending an enormous amount of manpower and money on creating C++ solutions for the desktop client without using .NET, in order to minimize the “footprint” (#2) of the client application in deployment environments. This was an awkward experience, considering that the custom C++ code was written from scratch to make clients talk to a .NET backend, while simply taking .NET as a dependency would have cut time to market by at least 50% (and I’m downplaying the estimate). Regardless, the recent Microsoft announcement about .NET vNext reminded me of that experience and of how short-sighted the architecture at that company was. The investment made in a C++ client that cannot be maintained internally, by a team that specializes in .NET, has created a situation where the code will become more brutal to maintain over time, and the number of developers who understand it will keep shrinking. Not only that: the ability to go cross-platform (#3) and the performance gains of native compilation (#1) would be an immediate payback.

Why am I saying all this? To make a simple point and remind myself again: when working on a product that needs to get to market, make it simple, make it work, and then watch how technology changes and how you can adapt. Simplicity will not let you down. A complex solution always will.

image

Wednesday, May 14, 2014  |  From ASP.NET Blogs


The following is a link to cross-platform data access training with Xamarin and C#. It is intended for use on iPhone, iPad, and Android devices. The course covers local data in SQLite, calling web services via REST and JSON, and calling SQL Server.



Url: http://www.learnnowonline.com/course/cpx2/xamarin-cross-platform-data-access/ 

Course Data 


Applications live on data. These applications can vary from an online social network service, to a company’s internal database, to simple data, and all points in between. This course focuses on how to easily access data on the device, communicate back and forth with a web service, and finally talk to a SQL Server database.

Outline


  • Local Data (27:36)

    • Introduction (00:36)

    • Problem (01:57)

    • Solution (02:01)

    • LINQ (02:03)

    • LINQ Status (00:48)

    • SQLite (02:18)

    • SQLite - .Net Developers (00:50)

    • SQLite-net (01:07)

    • SQLite-net Attributes (02:10)

    • Getting Started (01:09)

    • CRUD (01:05)

    • SQLite Platforms (01:17)

    • Demo: SQLite – Android (04:53)

    • Demo: SQLite – iOS (04:56)

    • Summary (00:20)


  • Web Services Data (32:43)

    • Introduction (00:19)

    • Async Commands (03:15)

    • HttpClient (01:26)

    • HTTP Verbs (01:29)

    • Notes (00:58)

    • GET Operation (01:37)

    • JSON.NET (01:50)

    • Images (01:16)

    • Other Http Verbs (01:27)

    • Post (03:18)

    • Demo: Http – iOS prt1 (05:26)

    • Demo: Http – iOS prt2 (05:28)

    • Demo: Http – Android (04:20)

    • Summary (00:27)

  • Direct Data (12:33)

    • Introduction (00:23)

    • Remote Data - Direct (02:47)

    • Sql Server (01:15)

    • Demo: Sql Server – iOS (04:15)

    • Demo: Sql Server – Android (01:49)

    • "codepage 1252 not supported" (01:03)

    • Other Resources (00:43)

    • Summary (00:15)


Note: Thanks to Frank Krueger for his data access library SQLite-net. It is very helpful, and I have used it in other projects beyond just this training session.


Wednesday, May 14, 2014  |  From ASP.NET Blogs

Introduction

It is normal in databases to have hierarchical tables, that is, tables that are related to themselves, forming a parent-child relationship. For example, consider this:

image

The parent_id column points to the parent record, which, in some cases, will not exist.

So, imagine we have a number of records, such as:

INSERT INTO dbo.list (id, parent_id) VALUES (1, NULL)
INSERT INTO dbo.list (id, parent_id) VALUES (2, 1)
INSERT INTO dbo.list (id, parent_id) VALUES (3, 2)
INSERT INTO dbo.list (id, parent_id) VALUES (4, 3)

How can we find the id of the topmost parent? In this case, it will always be 1, of course.



In SQL Server, we have two options:



  1. A Common Table Expression (CTE);

  2. A recursive function.

Let’s see how to implement each.


Common Table Expression Approach



We need to write a CTE that starts with some record and goes all the way up until it finds the parent. Let’s wrap it in a nice scalar function:





CREATE FUNCTION dbo.GetTopmostParentCTE
(
    @id INT
)
RETURNS INT
AS
    BEGIN
        DECLARE @parentId INT

        ;WITH cte AS
        (
            SELECT a.id, a.parent_id
            FROM dbo.list AS a
            WHERE a.id = @id
            UNION ALL
            SELECT b.id, b.parent_id
            FROM dbo.list AS b
            INNER JOIN cte AS c
            ON c.parent_id = b.id
        )
        SELECT TOP 1 @parentId = id
        FROM cte
        WHERE parent_id IS NULL

        RETURN @parentId
    END
GO

I won’t explain here how CTEs work; they have been around for quite some time, and there are several posts out there about them.



Recursive Function Approach



The other approach is using a recursive function. The gotcha here is that when we create a function it is compiled, and if it has a reference to itself – which doesn’t exist yet – it will fail. Therefore, we need to first create a dummy function and then alter it to do what we want:





CREATE FUNCTION dbo.GetTopmostParent
(
    @id INT
)
RETURNS INT
AS
BEGIN
    RETURN
    (
        SELECT 0
    )
END
GO

ALTER FUNCTION dbo.GetTopmostParent
(
    @id INT
)
RETURNS INT
AS
BEGIN
    RETURN
    (
        SELECT TOP 1 CASE WHEN parent_id IS NULL THEN id ELSE dbo.GetTopmostParent(parent_id) END
        FROM dbo.list
        WHERE id = @id
    )
END
GO

Conclusion



You can get results from the two functions by running the following T-SQL queries:





SELECT dbo.GetTopmostParent(4)
SELECT dbo.GetTopmostParentCTE(4)

Interestingly, both execution plans are exactly the same:



image



I can’t really recommend one over the other, since from my tests, both took the same amount of time (you will need far more records than the ones from my sample to tell that).



So, any thoughts from database gurus out there?

Wednesday, May 14, 2014  |  From ASP.NET Blogs

One of my earlier articles shows how to create cascading DropDownLists by making Ajax calls to MVC action methods. While that approach is quite common and popular, a reader recently asked whether something similar can be done without making any Ajax calls. If you want to implement cascading DropDownLists purely on the client side, you will need to "eagerly load" all the data they need at the time the page loads. This data can be stored in a hidden field and used as and when needed. Obviously this technique is not suitable for huge amounts of data, since everything is loaded at once on the client side. However, if the data is small and you understand the implications of loading it in advance, here is how you can accomplish the task.

http://www.binaryintellect.net/articles/36efdcf6-8280-4ba6-abb3-f846147c1266.aspx
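The core of the technique can be sketched in a few lines of JavaScript (the country/city values here are made-up sample data; the linked article has the full MVC wiring):

```javascript
// Lookup data loaded eagerly with the page (sample values made up here).
// In the article's approach this JSON would live in a hidden field rendered
// by the server and be read back with JSON.parse(hiddenField.value).
var countryCities = {
  "USA": ["New York", "Chicago", "Los Angeles"],
  "India": ["Mumbai", "Pune", "Bangalore"]
};

// Pure lookup used by the parent dropdown's change handler: returns the
// child options for the selected parent, or an empty list.
function getChildOptions(data, parentValue) {
  return data[parentValue] || [];
}

// In the page you would rebuild the child <select> from this array inside
// the parent's onchange handler - no Ajax call involved.
console.log(getChildOptions(countryCities, "India"));
// ["Mumbai", "Pune", "Bangalore"]
```

Because the lookup is local, the child list updates instantly, at the cost of shipping the whole data set with the page.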

 

Wednesday, May 14, 2014  |  From ASP.NET Blogs

You have probably seen the tel: URI:

tel:8885551212

However, here's a trick to dial in to a conference call that requires you to punch in a conference id, indicate whether or not you are a leader, and then press # afterwards.

 tel:8885551212;postd=ppp12345#ppp#ppp# 

where 8885551212 is the phone number and 12345 is the meeting access code.

This string will join you as an anonymous participant.
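If you generate these links, a tiny helper keeps the pause convention in one place. A sketch in JavaScript (the helper name is made up, and the "ppp" pause runs follow the example above; how long each pause actually lasts depends on the phone's dialer):

```javascript
// Build a tel: URI that dials a number, pauses, sends the conference id
// followed by #, then pauses and sends # twice more (matching the
// pattern shown above). Each "p" is a post-dial pause character.
function conferenceTelUri(phoneNumber, accessCode) {
  return "tel:" + phoneNumber + ";postd=ppp" + accessCode + "#ppp#ppp#";
}

console.log(conferenceTelUri("8885551212", "12345"));
// tel:8885551212;postd=ppp12345#ppp#ppp#
```

You could then drop the result into an anchor's href attribute in an email or mobile page.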

Try creating one and emailing it to yourself on your smartphone.  It works on almost all modern phones. 

 

See also: http://answers.oreilly.com/topic/2135-how-to-initiate-a-phone-call-from-your-mobile-website/ 

Tuesday, May 13, 2014  |  From ASP.NET Blogs

I've been a big fan of cloud-based infrastructure for a long time. I was fortunate enough to be on a small team of developers who built the reputation system for MSDN back in 2010, on Microsoft's Azure platform. Back then, there was a serious learning curve because Azure was barely a product. At the end of the day, we built something that easily could handle millions of transactions per month without sweating, and that was a sweet opportunity. Most people never get to build stuff to that scale.


My personal projects, specifically CoasterBuzz and PointBuzz, have been on everything from shared hosting in the early days to dedicated servers at my house and in various data centers. These sites are pretty modest in terms of traffic (low millions of requests per month), and the forum engine I wrote (POP Forums) is pretty efficient for the most part, so they don't require a ton of horsepower. That said, the overall cost of the various cloud services was still too high, or in some cases they just didn't make a lot of sense to use. Bandwidth was the biggest cost problem. Even then, some services, like Amazon, might have been roughly equivalent on price, but if it's mostly just replacing a server with a virtual machine, that's a lateral move.


I've been a fan of Azure because the toolbox is so big. More specifically, their platform-as-a-service (PaaS) options take a lot of the nonsense out of running and administering stuff, which makes things a lot more fun. The Azure Web Sites and Cloud Service products, essentially purpose-built virtual machines, are really fantastic. Throw in the storage, queues, caches, etc., and there's a lot to love.


With the recent price cuts, it was time to make the switch. The daily use and monitoring is a different topic to write about, and certainly I want to wait a few weeks until I have some experience with it. I want to talk about the migration effort here, which was relatively easy, but I do see a pretty big flaw that frankly should have been addressed years ago.


There isn't anything horribly exotic about my sites. They range from ASP.NET Webforms to MVC. There are also some other minor sites that are a little scary and do suboptimal things, mostly because they haven't changed in many years. As is typical, the little stuff ends up causing the most work. For example, the old CoasterBuzz blog site writes its MP3s to the file system. That, as it turns out, is a little tricky to handle because you can't just RDP into the server and set file permissions. Instead, you have to do that as part of the deployment. I haven't totally figured that out, so I can't share how that's done just yet. We have a similar function where we FTP up photos and bulk import them to photo albums.


Microsoft recently introduced the concept of a "hosting plan," which is roughly equivalent to a virtual server (or many instances thereof) that has all of the sites you want to group together. You can scale these up (server and CPU) and out (multiple instances behind a load balancer) as a group. This is cool, but it's also something of a minor liability. My sites tend to do a ton of caching, but that caching happens in the local memory. Therefore, I can't scale out, because the memory isn't shared across instances, and cache invalidation would be broken. I've actually done some prototyping on POP Forums that makes this easy to fix, but the older sites, especially PointBuzz, aren't ready for that. Fortunately, I don't expect to have to scale out any time soon. My stuff isn't built for multiple instances.
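The cache problem described above can be sketched in a few lines of JavaScript (a toy illustration of the general issue, not POP Forums code):

```javascript
// Two "instances" of a site behind a load balancer, each with its own
// purely local in-memory cache. Nothing propagates invalidations
// between them - which is exactly why scaling out breaks.
function createLocalCache() {
  var store = {};
  return {
    set: function (key, value) { store[key] = value; },
    get: function (key) { return store[key]; },
    invalidate: function (key) { delete store[key]; }
  };
}

var instanceA = createLocalCache();
var instanceB = createLocalCache();

instanceA.set("forum:topic:42", "old title");
instanceB.set("forum:topic:42", "old title");

// A user edits the topic via instance A...
instanceA.invalidate("forum:topic:42");

// ...but instance B happily keeps serving the stale entry.
console.log(instanceA.get("forum:topic:42")); // undefined
console.log(instanceB.get("forum:topic:42")); // "old title"
```

The usual fix is to move the cache (or at least the invalidation signal) into something shared, like a distributed cache or a message bus, which is the kind of change the older sites would need before scaling out.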


Setting up the sites in the Azure portal is super easy. Most of the default settings are good, though you have to open up web sockets for your SignalR stuff (the forums use it). Beyond that, there are some backup options to enable, and this requires you to be on the standard tier. Deployment is super easy as well from Visual Studio, as it performs web.config transforms as necessary and connects right up to the platform.


The one thing that has required a bit of work is the weekly execution of a stored procedure. I calculate the top 100 roller coasters by running a sproc (those who know me will find that odd, because I so don't like doing this sort of thing with SQL). What I ended up doing was firing this from an MVC action that looks for a certain request body over SSL. Azure has a scheduler that handles this call. The problem is that the scheduler won't update via the management portal to include the text you want to send in that body, so that's broken. I reached out on the forums and found someone else with the same problem. No MSFT response so far.


As I expected, SQL is still a big mess. On one hand, I am so endlessly impressed with the way that Microsoft is iterating quickly with the web products and frameworks, and open sourcing much of it on top of that. Ditto with the ever expanding Azure toolbox. Moving stuff to Azure has been pretty easy for the most part, even into PaaS components, despite much of my stuff being written for traditional n-tier server scenarios.


But SQL Azure still has the worst migration story ever. I understand most of the other constraints that the platform has when compared to the on-premise version of SQL Server. You basically have two ways to get your data into SQL Azure. You can use the sync framework, which will junk up your database with triggers and stuff (as far as I remember... I haven't looked at it in years). The other alternative is you can import a BACPAC file that you have uploaded into blob storage. I chose the latter, and here's what happened.


The CoasterBuzz database weighs in over 8 gigs. Yes, I'm storing image data in there, but that only accounts for around a gig, maybe two, of that data. I created the BACPAC on my old dedicated server, and it took about an hour and 15 minutes. I should also mention that using SQL Studio caused it to lock up, but the command line version worked fine. It's an old box with slow disks, so whatever. I can hydrate the resulting BACPAC on my MacBook Air in less than 10 minutes. Restoring it to a SQL Azure database, using the "business" tier, took about 2 hours and 15 minutes. That's pretty terrible. What was even more terrible, however, was trying to import it into one of the new basic or standard tiers, which will eventually replace web and business. In the basic tier, I did test runs for the much smaller PointBuzz database (around 2 gigs in size), and I gave up after 6 hours. Reading the MSDN forums, this is a widespread problem, in that people with even a slightly large database can't get the data in fast enough to avoid significant down time. I get throttling the database under normal use conditions, but importing really shouldn't be neutered like that.


All told, it took me four hours to move the database, which for anything less trivial than a site for roller coaster nerds would be totally unacceptable. A lot of the reason comes down to the fact that they won't support two tried-and-true methods that would minimize, if not eliminate, down time. In the old world of dedicated servers, we had two options. We could turn on replication between databases (ignoring licensing for a moment, because an indie publisher like me certainly wouldn't have two licenses). Then you have a near-real-time copy of the database, and you can mostly just flip the DNS switch to the new location and/or connection strings and you're done. Alternatively, you can do a full backup, as in .bak files, and hydrate the database in the new location. When you're ready to cut over, you take down the site, do an incremental backup, then apply that backup to the new location. This is what I did on my last move back in 2010, and it worked like a champ. I was down for 10 minutes, late at night when I had minimal traffic anyway.


I'm super enthusiastic about Azure, but if this were some bigger thing related to a "day job" project, I would not be OK with this crappy migration story. With all of this magic and innovation, it's weird that the SQL story is so poor. It has been that way since the beginning. (Disclaimer: I have some context about why that might be from my time working in Redmond, so that may color my discontent.) Considering how important the SQL part of the Microsoft stack is, you would think this would be a high priority. I mean, they still have an awful Silverlight-based management portal for SQL Azure that doesn't even work in Chrome.


Putting all of that complaining aside, I can say that I'm super happy to be migrated and done. I love the idea that I no longer have to feed and care for a dedicated server. No more shipping transaction logs or patching or configuring. Stuff mostly just works now, and whenever I need something new, it can be created in a matter of seconds. That's where the magic is. Where we would not, once upon a time, ever think about using things like queues and table storage and service buses, now we can. That's so powerful, and it's not expensive.

Tuesday, May 13, 2014  |  From ASP.NET Blogs

This week at TechEd, the ASP.NET team announced some pretty exciting updates on the way for ASP.NET.

Top Links

Blog Posts

ASP.NET team session videos from TechEd

ASP.NET site content

Getting Involved

What Is It?

In case you haven't read up on it, I'll just quote from the ASP.NET site:

The next version of ASP.NET (“ASP.NET vNext”) has been re-designed from the ground up. The goal is to create a lean and composable .NET stack for building modern cloud-based apps.

Here are some of the features of ASP.NET vNext:

  • vNext includes new cloud-optimized versions of MVC, Web API, Web Pages, SignalR, and Entity Framework.
  • MVC, Web API, and Web Pages will be merged into one framework, called MVC 6. The new framework removes a lot of overlap between the existing MVC and Web API frameworks. It uses a common set of abstractions for routing, action selection, filters, model binding, and so on. You can use the framework to create both UI (HTML) and web APIs.
  • ASP.NET vNext apps can use a cloud-optimized subset of .NET vNext. This subset is factored for server and web workloads, has a smaller footprint than the full .NET vNext, and supports side-by-side deployment.
  • MVC 6 has no dependency on System.Web. The result is a leaner framework, with faster startup time and lower memory consumption.
  • vNext will support true side-by-side deployment. If your app uses the cloud-optimized subset of .NET vNext, you can bin deploy all of your dependencies, including the .NET vNext (cloud optimized) packages. That means you can update your app without affecting other applications on the same server.
  • vNext is host agnostic. You can host your app in IIS, or self-host in a custom process. (Web API 2 and SignalR 2 already support self-hosting; ASP.NET vNext brings this same capability to MVC.)
  • Dependency injection is built into the framework. Use your preferred IoC container to register dependencies.
  • vNext uses the Roslyn compiler to compile code dynamically. You will be able to edit a code file, refresh the browser, and see the changes without rebuilding the project.
  • vNext is open source and cross platform.

To me as a web developer, this means I get:

  • All the advantages of the .NET platform (performance, stability, security, comprehensive API), and
  • The development experience of C# and Visual Studio... with
  • The simplicity, portability, quick dev refresh cycle and flexibility of an interpreted web framework.

And I like the sound of that.

Source Code

All the source code and samples are published under a new ASP.NET organization on GitHub. There are lots of interesting repos to look at; here are some top ones to get started with:

Home repository

This is the place to get started. The readme for this repo explains how to install and run the Hello World samples.

Music Store sample

Ah, the Music Store. The team wanted some samples to validate and test vNext as they developed it, and this was one of them. I updated the source code to ASP.NET MVC 5 and threw it over the wall to them, and it seems to have held up. Cephas Lin has a Music Store walkthrough posted in the vNext content on the ASP.NET site, and it's pretty easy to follow along.

BugTracker sample

The BugTracker is a single page application using SignalR, Knockout.js and Web API. I'm pretty happy that they had a single page application as one of their validation cases from the beginning.

KRuntime

If you're feeling adventurous, this is the actual runtime. It includes things like the compilation system, SDK tools, and the native CLR hosts. (Parental advisory warning if stumbling across some C++ gives you nightmares.)

Quick Walkthrough

I promised I'd skip the detailed walkthrough, because you really should be looking at the "official" ones I've linked to above. My point here is not really to guide you through them, but to give you a look at what the experience is like if you're not feeling up to doing it yourself. So let's see what I can get running in 30 minutes or so (until my next meeting). If you want to follow along, read the walkthroughs.

Important Notes Before We Get Started

I'm doing this on my dev machine. It runs side by side with my existing .NET and Visual Studio 2013 installs.

This looks a bit fiddly because we're doing this all from the command line. Don't worry if that's not your bag - this will all be supported via Visual Studio. This is an early preview. But, it's good that this level of control is available. Note that I'm doing all of this without firing up Visual Studio or installing any other software.

You'll see the letter K pop up from time to time. This was internally called Project K before it was released. I have no idea if the k will go away now, but I kind of like it. So we've got kvm (k version manager), kre (k runtime engine), kpm (k package manager), and k (the actual bootstrapper to run our app).

The Home Repo samples

First, let's try out the ASP.NET vNext Home repo. I've already got this locally, but for the purposes of science I'll pretend that I don't. Since I'm just kicking the tires here, instead of cloning the repo I'll just download the zip to my desktop, unblock if necessary, and unzip it.

[screenshot: downloading the Home repo zip]

Here's what that gets me:

C:\Users\Jon\Desktop\Home-master\Home-master>dir
Volume in drive C has no label.
Volume Serial Number is 5E2E-AE5E

Directory of C:\Users\Jon\Desktop\Home-master\Home-master

05/13/2014 02:29 PM <DIR> .
05/13/2014 02:29 PM <DIR> ..
05/13/2014 01:51 PM 851 .gitattributes
05/13/2014 01:51 PM 245 .gitignore
05/13/2014 01:51 PM 1,513 CONTRIBUTING.md
05/13/2014 01:51 PM 356 kvm.cmd
05/13/2014 01:51 PM 17,278 kvm.ps1
05/13/2014 01:51 PM 28 kvmsetup.cmd
05/13/2014 01:51 PM 592 LICENSE.txt
05/13/2014 01:51 PM 481 NuGet.Config
05/13/2014 01:51 PM 6,390 README.md
05/13/2014 02:29 PM <DIR> samples
9 File(s) 27,734 bytes
3 Dir(s) 11,363,086,336 bytes free


The next step in the readme tells me to execute kvmsetup.cmd, which tells me this:



Copying file C:\Users\Jon\.kre\bin\kvm.ps1
Copying file C:\Users\Jon\.kre\bin\kvm.cmd
Adding C:\Users\Jon\.kre\bin to process PATH
Adding C:\Users\Jon\.kre\bin to user PATH
Adding C:\Program Files\KRE;%USERPROFILE%\.kre to process KRE_HOME
Adding C:\Program Files\KRE;%USERPROFILE%\.kre to machine KRE_HOME
Press any key to continue ...


And with that, we've got the version manager installed. Important: this is the version manager, not the runtime. We can install multiple versions of the runtime engine, and use kvm to select the active one for a project.



Next, the readme tells me to install a named version of the K Runtime Engine: kvm install 0.1-alpha-build-0421



C:\Users\Jon\Desktop\Home-master\Home-master>kvm install 0.1-alpha-build-0421
Downloading KRE-svr50-x86.0.1-alpha-build-0421 from https://www.myget.org/F/aspnetvnext/api/v2/
Installing to C:\Users\Jon\.kre\packages\KRE-svr50-x86.0.1-alpha-build-0421
Adding C:\Users\Jon\.kre\packages\KRE-svr50-x86.0.1-alpha-build-0421\bin to process PATH


Now we've got a runtime installed, so we can run some samples. The readme recommends running the console sample first, and I think that makes sense since it's an incredibly simple app that verifies things are installed. So I cd to samples\ConsoleApp and run kpm restore. This looks scary, but it's really fast, and it's a good thing. The idea is that instead of running on big, monolithic framework assemblies, ASP.NET vNext is grabbing a bunch of small, focused NuGet packages.



C:\Users\Jon\Desktop\Home-master\Home-master>cd samples\ConsoleApp

C:\Users\Jon\Desktop\Home-master\Home-master\samples\ConsoleApp>kpm restore

C:\Users\Jon\Desktop\Home-master\Home-master\samples\ConsoleApp>CALL "C:\Users\Jon\.kre\packages\KRE-svr50-x86.0.1-alpha-build-0421\bin\KLR.cmd" --lib "C:\Users\Jon\.kre\packages\KRE-svr50-x86.0.1-alpha-build-0421\bin\;C:\Users\Jon\.kre\packages\KRE-svr50-x86.0.1-alpha-build-0421\bin\lib\Microsoft.Framework.PackageManager" "Microsoft.Framework.PackageManager" restore
Restoring packages for C:\Users\Jon\Desktop\Home-master\Home-master\samples\ConsoleApp\project.json
Attempting to resolve dependency ConsoleApp >= 1.0.0
Attempting to resolve dependency System.Console >= 4.0.0.0
GET https://www.myget.org/F/aspnetvnext/FindPackagesById()?Id='System.Console'
GET https://nuget.org/api/v2/FindPackagesById()?Id='System.Console'
Attempting to resolve dependency mscorlib >=
Attempting to resolve dependency System >=
Attempting to resolve dependency System.Core >=
Attempting to resolve dependency Microsoft.CSharp >=
Attempting to resolve dependency ConsoleApp >= 1.0.0
Attempting to resolve dependency System.Console >= 4.0.0.0
OK https://nuget.org/api/v2/FindPackagesById()?Id='System.Console' 931ms
OK https://www.myget.org/F/aspnetvnext/FindPackagesById()?Id='System.Console' 972ms
GET https://www.myget.org/F/aspnetvnext/api/v2/package/System.Console/4.0.0.0
OK https://www.myget.org/F/aspnetvnext/api/v2/package/System.Console/4.0.0.0 1696ms
Attempting to resolve dependency System.IO >= 4.0.0.0
GET https://www.myget.org/F/aspnetvnext/FindPackagesById()?Id='System.IO'
GET https://nuget.org/api/v2/FindPackagesById()?Id='System.IO'
Attempting to resolve dependency System.Runtime >= 4.0.0.0
GET https://www.myget.org/F/aspnetvnext/FindPackagesById()?Id='System.Runtime'
GET https://nuget.org/api/v2/FindPackagesById()?Id='System.Runtime'
OK https://nuget.org/api/v2/FindPackagesById()?Id='System.Runtime' 659ms
OK https://www.myget.org/F/aspnetvnext/FindPackagesById()?Id='System.IO' 838ms
OK https://www.myget.org/F/aspnetvnext/FindPackagesById()?Id='System.Runtime' 841ms
GET https://www.myget.org/F/aspnetvnext/api/v2/package/System.Runtime/4.0.20.0
OK https://nuget.org/api/v2/FindPackagesById()?Id='System.IO' 954ms
GET https://www.myget.org/F/aspnetvnext/api/v2/package/System.IO/4.0.0.0
OK https://www.myget.org/F/aspnetvnext/api/v2/package/System.IO/4.0.0.0 1779ms
Attempting to resolve dependency System.Text.Encoding >= 4.0.0.0
GET https://www.myget.org/F/aspnetvnext/FindPackagesById()?Id='System.Text.Encoding'
GET https://nuget.org/api/v2/FindPackagesById()?Id='System.Text.Encoding'
Attempting to resolve dependency System.Threading.Tasks >= 4.0.0.0
GET https://www.myget.org/F/aspnetvnext/FindPackagesById()?Id='System.Threading.Tasks'
GET https://nuget.org/api/v2/FindPackagesById()?Id='System.Threading.Tasks'
OK https://www.myget.org/F/aspnetvnext/api/v2/package/System.Runtime/4.0.20.0 1919ms
OK https://nuget.org/api/v2/FindPackagesById()?Id='System.Text.Encoding' 746ms
OK https://www.myget.org/F/aspnetvnext/FindPackagesById()?Id='System.Text.Encoding' 837ms
GET https://www.myget.org/F/aspnetvnext/api/v2/package/System.Text.Encoding/4.0.10.0
OK https://www.myget.org/F/aspnetvnext/FindPackagesById()?Id='System.Threading.Tasks' 843ms
OK https://nuget.org/api/v2/FindPackagesById()?Id='System.Threading.Tasks' 1051ms
GET https://www.myget.org/F/aspnetvnext/api/v2/package/System.Threading.Tasks/4.0.0.0
OK https://www.myget.org/F/aspnetvnext/api/v2/package/System.Text.Encoding/4.0.10.0 1356ms
OK https://www.myget.org/F/aspnetvnext/api/v2/package/System.Threading.Tasks/4.0.0.0 1721ms
Resolving complete, 8357ms elapsed
Installing System.Console 4.0.0.0
Installing System.Runtime 4.0.20.0
Installing System.IO 4.0.0.0
Installing System.Text.Encoding 4.0.10.0
Installing System.Threading.Tasks 4.0.0.0
Restore complete, 8495ms elapsed


Now I run it with k run:



C:\Users\Jon\Desktop\Home-master\Home-master\samples\ConsoleApp>k run
Hello World

C:\Users\Jon\Desktop\Home-master\Home-master\samples\ConsoleApp>


Like I said, not all that exciting. Just a quick verification check. Now that I know that's working, I'll quickly pop into one of the other sandbox samples, the HelloWeb one. Notice how simple the startup.cs file is (the official walkthrough explains it in detail).



C:\Users\Jon\Desktop\Home-master\Home-master\samples\HelloWeb>dir
Volume in drive C has no label.
Volume Serial Number is 5E2E-AE5E

Directory of C:\Users\Jon\Desktop\Home-master\Home-master\samples\HelloWeb

05/13/2014 02:29 PM <DIR> .
05/13/2014 02:29 PM <DIR> ..
05/13/2014 01:51 PM 310,647 image.jpg
05/13/2014 01:51 PM 506 project.json
05/13/2014 01:51 PM 227 Startup.cs
3 File(s) 311,380 bytes
2 Dir(s) 11,368,951,808 bytes free

C:\Users\Jon\Desktop\Home-master\Home-master\samples\HelloWeb>copy startup.cs con
using Microsoft.AspNet.Builder;

namespace KWebStartup
{
    public class Startup
    {
        public void Configure(IBuilder app)
        {
            app.UseStaticFiles();
            app.UseWelcomePage();
        }
    }
}
 1 file(s) copied.

C:\Users\Jon\Desktop\Home-master\Home-master\samples\HelloWeb>


Now I'll call kpm restore, just like before. This time kpm restore takes a bit longer, because there are more packages to pull down (listed in project.json), plus their dependencies.



And I'm ready to run it. This time, instead of k run, I'll call k web since it's a web app. If I forget and call k run, it reminds me what's what:



C:\Users\Jon\Desktop\Home-master\Home-master\samples\HelloWeb>k run
'HelloWeb' does not contain a static 'Main' method suitable for an entry point


Fine, k web it is. It tells me the server's started, but how do I view it? Well, the readme tells me it's at http://localhost:5001, but if I didn't know I could consult the commands section of project.json:



"commands": {
"web": "Microsoft.AspNet.Hosting server=Microsoft.AspNet.Server.WebListener server.urls=http://localhost:5001"
}
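A rough sketch of what k does with that commands section (this is an illustration of the idea, not the real KRuntime implementation):

```javascript
// Illustration only: given a parsed project.json, "k web" looks up the
// "web" entry and splits it into a host entry point plus its arguments.
var projectJson = {
  commands: {
    web: "Microsoft.AspNet.Hosting server=Microsoft.AspNet.Server.WebListener server.urls=http://localhost:5001"
  }
};

function resolveCommand(project, name) {
  var line = project.commands[name];
  if (!line) return null;           // unknown command name
  var parts = line.split(" ");
  return { entryPoint: parts[0], args: parts.slice(1) };
}

console.log(resolveCommand(projectJson, "web").entryPoint);
// Microsoft.AspNet.Hosting
```

So "k web" ends up launching the hosting entry point with the WebListener server and the URL you see in project.json.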


Simple enough. And browsing to that gives me a cool hello world page:



[screenshot: welcome page]



Nerd note: That page is shown because we've got app.UseWelcomePage() in startup.cs. There are no images or CSS in the project because everything is contained in the emitted HTML. It's actually really impressive - it's got embedded fonts, images as data: URLs, CSS, and a minified version of jQuery (for some nice animations), so it's a 287KB HTML payload... but since it's being served locally and you're not going to run this in production, that's not a problem.


Music Store



Okay, now that I've got this stuff installed, let's see how fast I can get the Music Store sample running.



Step 1: Grab the zip from https://github.com/aspnet/MusicStore, verify it's not blocked, and unzip on desktop.



Step 2: Run kpm restore.



Step 3: Select the hosting option (Helios, SelfHost, CustomHost - explained here). In this case I'll stick with selfhost, so I run k web and browse to localhost:5002:



[screenshot: Music Store home page]



Yippee!



Again, the point is that it's pretty quick and painless to get started and play with the samples; just follow the walkthroughs. If you're used to Node or Rails, a lot of these steps should seem pretty familiar. If you're not and this freaks you out, don't be freaked out... this will all work smoothly from within Visual Studio in the release version. This is a preview.



Take a look! Have Fun! Let us know what you think!

Monday, May 12, 2014  |  From ASP.NET Blogs

Web Camps are free, no-fluff, lots-of-code events where you can learn what's new in the Microsoft web platform and how you can put it to use right away. They're by developers, for developers - no marketing, just building web apps.


Upcoming events

Here's the list, with speakers:

Fresh ASP.NET and Visual Studio 2013.2 Content

We'll be covering all kinds of great new stuff in ASP.NET, Visual Studio 2013.2 and Azure for web developers. We organized the content so we start with tools and frameworks (e.g. ASP.NET MVC, ASP.NET Web API, Visual Studio 2013.2, Azure Web Sites), then dig into some specific scenarios for modern web application development. Here's the general agenda (varies by location, but should give you the general idea):

  • Keynote
  • Introduction to ASP.NET and Visual Studio 2013 Web Tools
  • Building Web Applications using the latest ASP.NET technologies
  • Building web front ends for both desktop and mobile using the latest web standards
  • API Services for both web and devices
  • Running, improving and maintaining a site in the real world
  • Real-time Communications with SignalR
  • Wrap Up

Go on, register (links are up above)! And let your web dev friends and co-workers who don't read blogs know, too, because this is a great way for them to get a recap on what's going on with ASP.NET, Visual Studio 2013, HTML5, JavaScript, and more.

And... if we're not coming to a city near you (this time), you can play along from home. There's a complete video recording of our Web Camp from Vancouver posted on Channel 9, and the Web Camps Training Kit includes all the decks and demos, plus 5 in-depth hands-on labs.

 ASP.NET Blogs News Feed