Tuesday, May 08, 2018

Cross Platform Test Code Coverage for .NET Core: Coverlet

I ran across a Medium post entitled Setting up Coveralls with Coverlet for a .NET Core project. This brief and informative article details the setup and use of this new library from Toni Solarin-Sodara.


As per the instructions, I issued the following from the prompt:
dotnet add package coverlet.msbuild
Here was the result:

Then I ran dotnet restore to update the dependencies:


The next step, per the article, was to run the dotnet test command with the CollectCoverage property set to true:
dotnet test /p:CollectCoverage=true
Here is the output:



As you can see from the output, I need to create more unit tests to increase coverage! In any event, to see the latest code up close, go to the unit tests github repository.
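Beyond the boolean switch, coverlet.msbuild also accepts properties that control the report format and output path; the Coveralls article linked above uses these to produce an lcov file for upload. A sketch (property names are from the Coverlet docs - verify them against the version you install):

```shell
# Run tests with coverage, writing the report as lcov to a custom path.
# /p:CoverletOutputFormat and /p:CoverletOutput are coverlet.msbuild properties.
dotnet test /p:CollectCoverage=true \
            /p:CoverletOutputFormat=lcov \
            /p:CoverletOutput=./lcov.info
```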



Friday, May 04, 2018

Encouraging the Wisdom of the Collective - The Up Vote

We have long known that the wisdom and experience of the many is greater, simply by number and volume, than that of the few. This echoes the line from Star Trek II: The Wrath of Khan (1982), where Spock sacrifices himself for the entire ship and tells Captain Kirk, "The needs of the many outweigh the needs of the few."


The concept of more knowledge existing in the collective than in the individual is not a new idea sparked by the information age. We see it as early as Aristotle, and again 13 years ago in James Surowiecki's book, The Wisdom of Crowds: Why the Many Are Smarter Than the Few and How Collective Wisdom Shapes Business, Economies, Societies and Nations.




It is when I am prompted by a coding challenge that I best experience the knowledge of the collective. It is not that I do not try to solve the current dilemma myself; in fact, I will often spend ample time trying to come up with a solution. However, once I realize that I have exhausted what I know to do, I go to the wisdom of the many, namely Google or Bing.

One of the best sites for collective intelligence, long known to software developers, is Stack Overflow.

I have recently realized that after finding the answer to a coding problem from the collective, I typically test the information, see that it works, and then get on with the next challenge without voting up the solution. A solution, by the way, that saved me both time and stress.

What is voting an answer up? Per Stack Overflow, "When you vote up, you are moving that content 'up' so it will be seen by more people." In short, others who likely have the same issue will see the solution sooner and be helped sooner rather than later. Also, "Upvotes on a question give the asker +5 reputation [points]. Upvotes on an answer give the person that answered the question +10 reputation [points]."

Therefore, when you find the answer to your problem, clicking the up arrow just seems like the right thing to do: it gives those who researched and worked hard to provide the answer both reputation points and encouragement to continue to help. The goal: be more consistent in up voting the helpful answers before running on to the next challenge, so that the collective is supported and encouraged.



Wednesday, May 02, 2018

Interstitial Journaling...whaaaaat?



I ran across an interesting post entitled Replace Your To-Do List With Interstitial Journaling To Increase Productivity by Coach Tony. My initial reaction was: whaaaaaat journaling? I have to admit that I had to look this adjective up in the dictionary. Per merriam-webster.com, the word is defined as "pertaining to, situated in, or forming interstices." OK, that did not help. What are interstices? Again per merriam-webster.com, interstices is defined as: "1 : a space that intervenes between things; especially : one between closely spaced things.... 2 : a short space of time between events." Therefore, interstitial journaling is a form of writing that one does between certain events or milestones.

I have long been a fan of writing a daily journal. I have been using 750words.com for some time. In fact, I have long enjoyed its analysis of my content along with its badge system.


However, the idea of journaling in between the steps of executing a plan is novel to me. After considering it, I see the value of writing down one's thoughts before moving to the next stage in a plan. What am I thinking right now at this phase? What went well the last hour? How was I most effective this time in that task? It is during those short breaks that you can use the exercise of journaling to recognize what just happened, or did not happen, and to have an honest, revealing education about one's self.

Tony states, "Anyone who has ever done journaling in other contexts knows this — your journal is an opportunity for truth and honesty about yourself that you don’t normally have. I’m too ambitious about what I take on, while being cowardly about working hard. Putting those thoughts into a journal moves them from feelings that secretly rule my decisions to rational concepts that I can analyze and solve."

Moreover, we often think we make rational decisions and are guided by facts as we move through our day. However, upon close examination, this may not be the case. Per the Coach, "I almost never explain any productivity topic without referencing the book, Thinking, Fast and Slow. The book covers two modes of decision making. One is a rational but effortful mode. This is what we wish ruled our life. The other is an emotional and habitual mode that sits just below our consciousness. This is what actually rules our life. The magic of journaling is that it is almost always effective at bringing thoughts and feelings up to a place that triggers your rational mind. The net effect is that you’re rebalancing and being more rational."

Therefore, instead of just journaling at one part of the day, morning or evening, I think I will journal throughout the day and/or between the day's tasks. Sounds rational, right?

Friday, April 27, 2018

ASP.Net Core 2 with Postgresql on Linux Part 11: Deploying to Heroku via Docker

In Part I, I discussed the rationale for porting an ASP.Net Core 2 app to my Linux system, replacing SQL Server with Postgresql, and placing it on Heroku.

With Part II, we transferred the SQL Server data to Postgres.

In Part III, I created the ASP.Net MVC Core 2.x app, from which we will proceed, and got Entity Framework Core installed.

Then in Part IV, I looked at the configuration differences between the ASP.Net MVC app that uses SQL Server and the new app, in preparation for porting the old app to use Postgresql.

In Part V, we compared the differences in the two projects' structures and ported the existing project code, starting with the data model layer, to the new project. In addition, we got the jquery.countdown jQuery plugin installed, which is used to implement a JavaScript timer. More on that in a later article. Finally, we used the dotnet ef dbcontext scaffold command, which builds a DbContext and entity types for our Postgresql database. And, oh yes, we added the project to github, which frankly should have been done on day one.

In Part VI, we got the NUnit testing framework in place in order to test both the new and future code.

Part VII saw the Service layer set up along with its unit tests.

In Part VIII, we imported the Controller classes from the existing app and set up unit tests for them.

Then in Part IX, we migrated the Razor (*.cshtml) files as well as the associated JavaScript, Cascading Style Sheet (CSS), and image files from the existing app.

Part X detailed the Program.cs and Startup.cs files as well as the appsettings.json and project file (*.csproj) content that was migrated and updated from utilizing SQL Server to accessing Postgresql for data.

Now to deploy this new app to Heroku with Docker from Visual Studio Code on my Ubuntu 16.04 laptop.

First, you will need to download and install the Heroku CLI for your system. Go here for instructions. Then create a Heroku account.


Next, make sure you have Docker installed. Start here.


A quick thing to note is that I renamed my root folder from myApp to myapp as per this. In short, the repo name must be all lowercase. After renaming it, the following commands to deploy the app worked.

Publish your app - pack the application and its dependencies into a folder for deployment to a hosting system, in this case ./bin/Release/netcoreapp2.0/publish - by issuing the following from a command prompt:

 $ dotnet publish -c Release  
Add a file named Dockerfile to the directory where your app was published for release - in my case, the ./bin/Release/netcoreapp2.0/publish directory. Here is the content:
 # Base image containing the ASP.NET Core runtime
 FROM microsoft/aspnetcore  
 # Working directory inside the container
 WORKDIR /app  
 # Copy the published output into the image
 COPY . .  
 # Listen on the port Heroku assigns via $PORT, then start the app
 CMD ASPNETCORE_URLS=http://*:$PORT dotnet myapp.dll  
Build the Docker Image:
 $ docker build -t myapp.dll ./bin/Release/netcoreapp2.0/publish  
Login to Heroku and from your dashboard create a new, free Heroku app to which we will deploy this app.
In the directory of the new app I did the following:
 $ heroku login  
 $ heroku container:login  
Then I tagged the target image that will be deployed:

 $ docker tag myapp.dll:latest registry.heroku.com/newpostgresapp/web  
Now to push the docker image to Heroku:
 $ docker push registry.heroku.com/newpostgresapp/web  
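Putting the steps above together, the whole deployment can be sketched as a single sequence (the app name newpostgresapp and image tag myapp.dll are the values used in this post; substitute your own):

```shell
# Publish the app, build the image from the publish folder,
# then tag and push it to the Heroku container registry.
dotnet publish -c Release
docker build -t myapp.dll ./bin/Release/netcoreapp2.0/publish
heroku login
heroku container:login
docker tag myapp.dll:latest registry.heroku.com/newpostgresapp/web
docker push registry.heroku.com/newpostgresapp/web
```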
Now, when browsing to the new app URL, it works!


Feel free to create an account by selecting the Register link in the upper-right corner of the view. Also, let me know what you think.



Wednesday, April 25, 2018

ASP.Net Core 2 with Postgresql on Linux Part 10: Program, Startup, appsettings.json, etc.

In Part I, I discussed the rationale for porting an ASP.Net Core 2 app to my Linux system, replacing SQL Server with Postgresql, and placing it on Heroku.

With Part II, we transferred the SQL Server data to Postgres.

In Part III, I created the ASP.Net MVC Core 2.x app, from which we will proceed, and got Entity Framework Core installed.

Then in Part IV, I looked at the configuration differences between the ASP.Net MVC app that uses SQL Server and the new app, in preparation for porting the old app to use Postgresql.

In Part V, we compared the differences in the two projects' structures and ported the existing project code, starting with the data model layer, to the new project. In addition, we got the jquery.countdown jQuery plugin installed, which is used to implement a JavaScript timer. More on that in a later article. Finally, we used the dotnet ef dbcontext scaffold command, which builds a DbContext and entity types for our Postgresql database. And, oh yes, we added the project to github, which frankly should have been done on day one.

In Part VI, we got the NUnit testing framework in place in order to test both the new and future code.

Part VII saw the Service layer set up along with its unit tests.

In Part VIII, we imported the Controller classes from the existing app and set up unit tests for them.

Then in Part IX, we migrated the Razor (*.cshtml) files as well as the associated JavaScript, Cascading Style Sheet (CSS), and image files from the existing app.

In this post, the Program.cs and Startup.cs files, as well as the appsettings.json and project file (*.csproj) content, will be migrated and then updated from utilizing SQL Server to accessing Postgresql for data.

Here is the compare of those root files:




Since these files are specific to the application, a simple copy and replace cannot be made. Examining the content of the existing app (on the left), we see the bower.json and .bowerrc files, which are part of the Bower package manager. This was addressed earlier in Part V, where we discussed the recommended move from Bower to Yarn.

First, let's look at the appsettings.json files. After moving the Logging and Appsettings object settings, the remaining difference was the database connection strings, which specify the database platform used.
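As a sketch, the two connection strings differ along these lines (the server, database, and credential values below are placeholders, not the post's actual settings). The existing app's SQL Server string looks something like:

```json
{
  "ConnectionStrings": {
    "DefaultConnection": "Server=(localdb)\\mssqllocaldb;Database=myapp;Trusted_Connection=True;"
  }
}
```

while the new app's Postgresql string uses the Npgsql keyword style:

```json
{
  "ConnectionStrings": {
    "DefaultConnection": "Host=localhost;Port=5432;Database=myapp;Username=postgres;Password=secret"
  }
}
```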




Next, when comparing the two apps' project files you see several differences. 



A few of the differences are residual settings that were needed when the existing app was moved from .NET Core 1.x to 2.0. For example, the following setting:


<PropertyGroup>
  <UserSecretsId>....</UserSecretsId>
</PropertyGroup>
was needed because UserSecrets 1.0.0 required the existence of project.json; when the project was moved to a csproj file, the setting above was required instead. See this: https://github.com/aspnet/Announcements/issues/209.

AssetTargetFallback below was set so that, with the move to .NET Core 2.0, any NuGet package that is compatible with .NET Framework 4.6.1 or higher can be used without additional configuration.


<AssetTargetFallback>$(AssetTargetFallback);portable-net45+win8+wp8+wpa81;</AssetTargetFallback>
The other settings are/were necessary for the current app's functionality.

The final file to consider is the Startup.cs file.



The difference here is between the SQL Server and Postgresql extension methods called on the DbContextOptionsBuilder object, which is used in the lambda expression as options:
options => options.UseNpgsql

The DbContextOptionsBuilder object, "provides a simple API surface for configuring DbContextOptions. Databases (and other extensions) typically define extension methods on this object that allow you to configure the database connection (and other options) to be used for a context."
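The change in Startup.cs boils down to swapping that provider extension method in ConfigureServices. Here is a sketch; the context class name MyAppContext and the connection-string key "DefaultConnection" are illustrative, not the actual names in this project, and UseNpgsql comes from the Npgsql.EntityFrameworkCore.PostgreSQL package:

```csharp
public void ConfigureServices(IServiceCollection services)
{
    // Existing app (SQL Server):
    // services.AddDbContext<MyAppContext>(options =>
    //     options.UseSqlServer(Configuration.GetConnectionString("DefaultConnection")));

    // New app (Postgresql via Npgsql):
    services.AddDbContext<MyAppContext>(options =>
        options.UseNpgsql(Configuration.GetConnectionString("DefaultConnection")));

    services.AddMvc();
}
```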

OK. Now we are ready to run the new ASP.Net MVC .Net Core 2.0 app.



Browsing to https://localhost:5000, we see the initial page.


In the next and final post in this series, we will deploy this app to Heroku via Docker.


Tuesday, April 24, 2018

ASP.Net Core 2 with Postgresql on Linux Part 9: The View Layer

In Part I, I discussed the rationale for porting an ASP.Net Core 2 app to my Linux system, replacing SQL Server with Postgresql, and placing it on Heroku.

With Part II, we transferred the SQL Server data to Postgres.

In Part III, I created the ASP.Net MVC Core 2.x app, from which we will proceed, and got Entity Framework Core installed.

Then in Part IV, I looked at the configuration differences between the ASP.Net MVC app that uses SQL Server and the new app, in preparation for porting the old app to use Postgresql.

In Part V, we compared the differences in the two projects' structures and ported the existing project code, starting with the data model layer, to the new project. In addition, we got the jquery.countdown jQuery plugin installed, which is used to implement a JavaScript timer. More on that in a later article. Finally, we used the dotnet ef dbcontext scaffold command, which builds a DbContext and entity types for our Postgresql database. And, oh yes, we added the project to github, which frankly should have been done on day one.

In Part VI, we got the NUnit testing framework in place in order to test both the new and future code.

In Part VII, we got the Service layer set up along with its unit tests.

In Part VIII, we imported the Controller classes from the existing app and set up unit tests for them.

In this post, we will migrate the Razor (*.cshtml) files as well as the associated JavaScript, Cascading Style Sheet (CSS), and image files from the existing app.

As we have done in past posts, let's use Beyond Compare to see what needs to move. 



So, as with the other tiers, we have some files to move in this layer.

After using Beyond Compare to move the Razor files, here is the view:

Next, let's look at the contents of the wwwroot folder, which holds the JavaScript, Cascading Style Sheet (CSS), and image files.



From here I simply replaced the contents of the new app with the JavaScript, Cascading Style Sheet (CSS), and image files of the existing app. 



Now, we need to build and see what happens. 


So, thus far, all is good. However, we know that the content of the Program.cs and Startup.cs files, as well as the appsettings.json and project files (*.csproj), is still different. In the next post, Part X, we will migrate them from utilizing SQL Server to accessing Postgresql for data.



Monday, April 23, 2018

ASP.Net Core 2 with Postgresql on Linux Part 8: The Controller Layer

In Part I, I discussed the rationale for porting an ASP.Net Core 2 app to my Linux system, replacing SQL Server with Postgresql, and placing it on Heroku.

With Part II, we transferred the SQL Server data to Postgres.

In Part III, I created the ASP.Net MVC Core 2.x app, from which we will proceed, and got Entity Framework Core installed.

Then in Part IV, I looked at the configuration differences between the ASP.Net MVC app that uses SQL Server and the new app, in preparation for porting the old app to use Postgresql.

In Part V, we compared the differences in the two projects' structures and ported the existing project code, starting with the data model layer, to the new project. In addition, we got the jquery.countdown jQuery plugin installed, which is used to implement a JavaScript timer. More on that in a later article. Finally, we used the dotnet ef dbcontext scaffold command, which builds a DbContext and entity types for our Postgresql database. And, oh yes, we added the project to github, which frankly should have been done on day one.

In Part VI, we got the NUnit testing framework in place in order to test both the new and future code.

In Part VII, we got the Service layer set up along with its unit tests.

In this post, we will import the Controller classes from the existing app and set up unit tests for them.

Here is a look at the existing and new Controller folders from Beyond Compare:



First, I copied the existing Controller classes to the new app and then updated the namespace settings as well as adjusted other dependencies. With that done, the Controller classes are moved.


Then, I created unit tests for the main Controller methods. Instead of displaying the code in a screen shot, go to the unit tests github repository to see them all. Moreover, to see the corresponding Controller classes, go here to the github repo.
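For reference, a controller test in this setup looks roughly like the following NUnit sketch. The controller, service, and action names here are hypothetical, not the actual classes in the repo, and the Moq mocking library is an assumption on my part:

```csharp
using Microsoft.AspNetCore.Mvc;
using Moq;
using NUnit.Framework;

[TestFixture]
public class HomeControllerTests
{
    [Test]
    public void Index_ReturnsViewResult()
    {
        // Mock the hypothetical service dependency so the test
        // exercises only the controller.
        var service = new Mock<IItemService>();
        var controller = new HomeController(service.Object);

        var result = controller.Index();

        Assert.IsInstanceOf<ViewResult>(result);
    }
}
```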

With that done, the next post, Part IX, will detail the migration of the Razor (*.cshtml) files as well as the associated JavaScript, Cascading Style Sheet (CSS), and image files from the existing ASP.Net MVC app to the new app.