Sunday, December 09, 2018

Cross Platform .Net Core 2.0 C# Unit Testing

I wanted to make some changes to my personal .Net Core 2.0 website that lives on Heroku and is deployed via Docker. But before doing this, I wanted to set up a basic unit test project where I could test some of what I wanted to accomplish. Given that I wanted the unit tests to be cross-platform, I decided to use xUnit within Visual Studio Code.

Here are the steps. First, I created a folder to hold both the objects under test and the unit tests.


After changing into the new unitTestsCSharp folder, I created the directory for the objects under test, named ouut, which stands for Objects Under Unit Tests.


From ouut I created a new project using the dotnet new classlib command for the...well, you guessed it...the code to be tested.


Then, back in the unitTestsCSharp directory I created a folder to hold the actual unit tests that is creatively entitled tests.


From the new tests folder I ran the dotnet new xunit command to create a new project for the unit tests.


I then opened the unitTestsCSharp folder in Visual Studio Code (VS Code). You may see a dialog in VS Code informing you that some assets are missing for building and debugging and asking to add them. Select the Yes button.
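The screenshots of these steps are not reproduced here; reconstructed from the description above, the command sequence was roughly as follows (folder names are taken from the text, and the .NET Core SDK is assumed to be installed):

```shell
mkdir unitTestsCSharp
cd unitTestsCSharp

# Objects Under Unit Tests: the class library to be tested
mkdir ouut
cd ouut
dotnet new classlib
cd ..

# The xUnit test project
mkdir tests
cd tests
dotnet new xunit
```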



From the unitTestsCSharp/ouut folder I renamed Class1.cs to MonthDateStuff.cs as I wanted to do some basic Date/Time parsing. As many of you know, when using the test-driven development (TDD) methodology, you start with a failing test; so the first version of the MonthDateStuff class is a stub that simply throws.



 using System;  
 namespace ouut  
 {  
   public class MonthDateStuff  
   {  
     public int GetMonthAsInteger()  
     {  
       throw new NotImplementedException("Sanity Test as this should fail!");  
     }  
   }  
 }  

Now we can look at the tests. Let's add the MonthDateStuff class library as a dependency to the test project using the dotnet add reference command.
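The command itself is not shown in the post; assuming the folder layout described above, it would look something like this, run from the tests folder:

```shell
# Path is inferred from the layout described above
dotnet add reference ../ouut/ouut.csproj
```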


Next, I renamed UnitTest1.cs to MonthDateStuffTests.cs, added the using ouut directive, and then added a ReturnIntegerGivenValidDate method as an initial test.

 using System;  
 using Xunit;  
 using ouut;  
 namespace tests  
 {  
   public class MonthDateStuffTests  
   {  
     private ouut.MonthDateStuff _monthDateStuff;  
     public MonthDateStuffTests(){  
       _monthDateStuff = new MonthDateStuff();  
     }  
     [Fact]  
     public void ReturnIntegerGivenValidDate()  
     {  
       var result = _monthDateStuff.GetMonthAsInteger();  
      Assert.Equal(1, result); // "Jan 1, 2019" should yield month 1  
     }  
   }  
 }  

The [Fact] attribute lets the xUnit framework know that the ReturnIntegerGivenValidDate method is to be run by the test runner. From the tests folder I executed dotnet test to build the tests and the class library and then run the tests. The xUnit test runner contains the program entry point to run your tests; the dotnet test command starts the test runner using the unit test project. Here, as a sanity check, we want the test to fail, to at least make sure everything is set up properly.


Now, let's get the test to pass. Here is the code in the updated MonthDateStuff class.

 using System;  
 namespace ouut  
 {  
   public class MonthDateStuff  
   {  
     public int GetMonthAsInteger()  
     {  
       var dateInput = "Jan 1, 2019";  
       var theDate = DateTime.Parse(dateInput);  
       return theDate.Month;  
     }  
   }  
 }  

The test result.


Let's add more features. xUnit has other attributes that enable you to write a suite of similar tests. Here are a few:

[Theory] represents a suite of tests that execute the same code but have different input arguments.

[InlineData] attribute specifies values for those inputs.

Rather than creating several tests, use these two attributes to create a single theory. In this case, the theory is a method that tests several month integer values to validate that each is a valid month, between 1 and 12 inclusive:

 [Theory]  
 [InlineData(1)]  
 [InlineData(2)]  
 [InlineData(3)]  
 public void ReturnTrueGivenValidMonthInteger(int monthInt)  
 {  
  Assert.True(_monthDateStuff.IsValidMonth(monthInt));  
 }  


Note that 4 tests were run in total: the theory ran three times, once per [InlineData] value, plus the original [Fact] test.
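The IsValidMonth method called by the theory above never appears in the post; a minimal sketch of what it could look like inside the MonthDateStuff class (the body is an assumption based on the test's intent):

```csharp
// Added to the ouut.MonthDateStuff class (sketch; the body is assumed)
public bool IsValidMonth(int monthInt)
{
    // Valid month integers run from 1 (January) through 12 (December)
    return monthInt >= 1 && monthInt <= 12;
}
```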

Get more information about xUnit here.

Thursday, July 12, 2018

Synthesis (learning and growing) is not Easy




What does the title mean by Synthesis? Ever heard of Hegel’s Dialectic? For Georg Wilhelm Friedrich Hegel, the dialectic method of historical and philosophical progress consists of (1) a beginning idea called a thesis, (2) a competing and opposite proposal of that thesis called the antithesis, and (3) a synthesis whereby the two conflicting ideas are reconciled or “synthesized” to form a new and more advanced idea.



Isn’t that the idea of learning? To take your current ideas and put them against opposing viewpoints to achieve a better understanding?  


Here is an example of the dialectic method used in an English class:




As you can see from above, a thesis or viewpoint on a topic is not always correct. If we know that there are things to be gained from Hegel’s Dialectic, why don’t we do it more often? Simple: it is hard on us psychologically, and it is mental and emotional work. When the antithesis is introduced to us from a social media post or something we have just read, then there is struggle and pain to achieve synthesis.

When the antithesis is introduced to us from a social media post...then there is struggle and pain to achieve synthesis.

When your identity and thinking are on the side of, or beholden to, a particular thesis, considering the antithesis is difficult. You have to take time and make an effort to consider antithetical viewpoints. You have to overcome preconceived ideas about those who hold the antithesis. This is not easy. No wonder we protect ourselves by surrounding ourselves with those who think like us. This is also known as an "echo chamber," where we hear the same things repeated on a continual basis.

No wonder we protect ourselves by surrounding ourselves with those who think like us.

To help in this, you must trust that once the process of synthesis is completed, or even while in the painful process, you gain in various areas. First, you become more empathetic. You better understand what "they" have been feeling and thinking. Next, you learn that the topic being considered is not as simple as you originally thought. Moreover, you also better understand those complexities. However, as was previously stated, there is an element of discomfort when you hold something as true and then consider the opposing viewpoint. In short, it’s not easy but, more often than not, worth it.

... there is an element of discomfort when you hold something as true and then consider the opposing viewpoint. In short, it’s not easy but, more often than not, worth it.

Tuesday, July 10, 2018

AI and More Free Time



Technology has given us more free time. Imagine what our days would be like if we had to grow our own food. AI will likely give us even more free time in the future. My question is, what will we do with it? Or, to ask the question another way, "What are we doing now that we will no longer have to do, which should result in more free time?"

Some things that come to my mind are driving a car, completing standard forms such as annual tax returns, getting regular medical tests, and grocery shopping.

AI driving a car is not hard to imagine. We are on the cusp of that reality.

Filling out standard forms would be something AI could do given the data that is already part of our everyday lives. Imagine the 1040EZ or 1040A forms being completed and submitted based on the information from where you work, the bills you pay, and the charities that you support.

Concerning medical tests, what if each time you urinated or defecated, your home restroom facilities would run a chemical analysis on your urine or stool and share that info with you?

Another reality coming soon is having your groceries brought to your doorstep. Imagine a smart fridge that automatically reorders your commonly purchased items as they run low. How much time would you save by simply supplying your personal AI with a list of needed items that then arrive at your home in less than 12 hours?

Finally, if these and other tasks are fulfilled by AI, what would you do with that freed-up time? Learn something new? Go for more walks? Spend more time with a loved one?

Tuesday, July 03, 2018

Most Advanced Yet Acceptable Ideas

In his book Hit Makers, Derek Thompson discusses an acronym, the MAYA Principle: "Most Advanced. Yet Acceptable."



This acronym was set forth by Raymond Loewy, known as the father of industrial design. Loewy's Lucky Strike cigarette package, the Exxon logo, and the blue nose of Air Force One are only a few of his famous designs.

Per Thompson, in his 2017 Atlantic article, "Loewy had an uncanny sense of how to make things fashionable. He believed that consumers are torn between two opposing forces: neophilia, a curiosity about new things; and neophobia, a fear of anything too new. As a result, they gravitate to products that are bold, but instantly comprehensible. Loewy called his grand theory 'Most Advanced Yet Acceptable.'"

How would this work in the area of ideas? How can we better form concepts that are novel, but quickly understandable?

Tuesday, June 26, 2018

AI helping us be more ethical.



With the advent of self-driving cars, we see AI starting to make ethical decisions. How, you may ask?

These automobiles will have to decide what to do when human lives are at stake. For example, let's say a couple is in a hurry. Instead of obeying the traffic signal not to cross the street, they attempt to cross it, not seeing the oncoming vehicle. Moreover, it is a self-driving car that is about to hit them. The AI calculates that it does not have enough time to come to a safe stop. In a millisecond the AI reviews the scenario. Should the ethics algorithms decide to hit the single pedestrian on the sidewalk instead? Hit the couple crossing the street? Or risk the lives of the passengers in the car, as well as those in the neighboring office building, by careening into it?

Let's take this concept a step further. What if the AI knows who all the humans are that are in danger via face recognition? Also, what if the algorithms instantly calculate that one of the people in peril crossing the street has a high proclivity to criminal behavior?

Once intelligent machines start making, at least hypothetically, better and more informed ethical decisions based on data, will we then slowly start to hand over our own ethical decisions to them? Why not have an AI (artificial intelligence), with access to massive amounts of information and the ability to process it a thousandfold faster than we can, at least assist us in those dilemmas?

Tuesday, June 19, 2018

Less is More?

Blaise Pascal famously wrote, "I would have written a shorter letter, but I did not have the time." 


It takes effort to be brief. Speaking of that, I listen to various podcasts throughout the day. One that I listen to is only one minute in length. What is interesting is that I find myself really concentrating as I listen to it. I listen attentively because it only takes a minute of my time, and I can afford that effort during my busy day. Moreover, I will soon have another minute to listen to it again. Secondly, I want to make sure I get the singular thought that is communicated in that one-minute podcast. What I end up doing with the other podcasts, which run from 15 minutes to an hour in length, is listening to them once, often at increased speed, and never hearing them again.

Why is it I pay more attention and listen more carefully to the shorter podcast? 

First, it simply takes less time. Next, because only a single idea or concept is contained, I want to make sure that I obtain what is so briefly being shared.

In short, I likely spend more time listening to and then considering the shorter podcast than I do with the longer!


Thursday, May 31, 2018

Are You, Like the Emerging AI, More than the Sum of your Parts?


I recently subscribed to Medium. What I typically do is take the daily email from Medium, browse through the articles, and if I find something of interest, save it to Instapaper to read or, more commonly, listen to with Instapaper's audio feature on its mobile app.

I was listening to an article, On Metamodernism by Seth Abramson. In that article he stated that "people reduce you to your data in a way that’s soul-crushing." Immediately, what came to my mind was the Aristotelian quote, "The whole is more than the sum of its parts."

A week or so later, I was listening to the Triangulation Podcast from This Week in Technology, that featured the new book, The Fourth Age: Smart Robots, Conscious Computers, and the Future of Humanity by Byron Reese. Reese seemed optimistic and also realistic in his assessment of Artificial Intelligence (AI) and its current and potential interaction with humanity.

My typical process, after hearing of a book that I would like to read or hear, is to check my local library's web site to see if the audio or ebook is available. It had neither. However, the Overdrive app that it uses does have the ability to recommend an audio or ebook. So, I recommended the audio book, which also puts you first in line to borrow the resource. A few days later, I received an email stating that my local library, which I love by the way, had purchased the audio book and that I could start accessing it at will.

Again, my modus operandi is to run in the morning, listening to audio books and/or podcasts. So, the next morning after downloading the audio book, I am moving along listening. At the very outset Reese brings up the philosophical discussion of monism vs dualism or what is more frequently termed the mind/body problem. Before we look at these ideologies, note that where you fall on that continuum will influence your consideration of the possibility of AI becoming conscious.

To be sure, these views span a continuum. But for discussion's sake and for clarity, let's look at the two systems in contrast. Concerning consciousness, the earliest discussions of dualist ideas are found in the writings of Plato, who held that intelligence could not be identified with, or explained in terms of, the physical body. The best-known version of dualism comes from René Descartes (1641), who held that the mind is a non-extended, non-physical substance, separate from the physical brain. To be sure, the brain is necessary in both schools of thought; however, for the dualist, consciousness or the mind "emerges" from the physical brain.

In short, a simplistic understanding is this: if you think that you are no more than the sum of your parts, then you are a monist. Material is all there is; your sense of self-consciousness is nothing more than a "trick" of your brain. In contrast, if you think that you are more than the sum of your physical makeup, then you are a dualist, and consciousness "emerges" from the brain.

So, a monist would typically respond that of course AI systems can become conscious, given that they possess the same kind of physical properties that our human brains possess. However, the dualist would say, "Hold on here, how do we know whether there has been an emergent product of those physical properties that is like human consciousness, or whether they simply mimic the characteristics of consciousness?"

What do you "think?"


Thursday, May 24, 2018

Brotopia: It's Bad for Everyone

I recently listened to the audio book Brotopia: Breaking Up the Boys' Club of Silicon Valley by Emily Chang during my morning jogs. 

My initial thought as I started to listen to the book was that the underrepresentation of women and minorities in tech was due to not as many women and minorities pursuing STEM degrees and programs as white men. Boy (pun intended), was I wrong!

Romans of the Decadence (1847), by Thomas Couture, as updated to parody Silicon Valley’s male-dominated sexual and sexist culture. Photo Illustration by Darrow.


What I found out from the book is that not only is there inequality but also a toxic culture of white, privileged males that are intentionally or at least subconsciously barring others who look different than they do from entering and taking part in the fast and furious tech world of Silicon Valley. 

While Cincinnati, Ohio is not Silicon Valley, it still has a good deal of tech-based businesses, as well as insurance and banking companies that are essentially IT organizations. What this means is that there are ample opportunities here for the alienation of women and minorities by white males such as myself.

What I hope to take away from this book is to learn how I am contributing to the problem and put an immediate stop to it. Before being exposed to Brotopia, I did not consider myself part of the problem; however, after listening to this book I am now aware that this is likely not true. While not desiring to actively suppress others, I have no doubt committed sins of omission by not combatting the toxicity.

In short, I hope to be active in two ways. First, encourage and assist my co-workers and associates in tech who look different than me. This can be done by making sure that they know that I am listening to them, that I in fact prize their input, and that they are of value to the team. While I "think" I have been doing that, I have no doubt been remiss in actively communicating that I know they are smart and productive.

Secondly, I can be proactive in seeing that women and minorities enter the tech field by helping the various groups that have emerged in the last 5 years, such as Girls Who Code. Here, I can volunteer time and resources. Also, as an Adjunct Instructor at a local university teaching a programming class to Business Informatics students, I can encourage the women and minorities in the class to push forward and fight the good fight.

What has been most painful in listening to the book is putting myself in the place of the many who, while working endless hours in a taxing industry, had to battle the emotional stress of sexist and mean comments, sexual advances, belittlements, and generally being demeaned and alienated, all while trying not only to advance, but simply to survive.

Ms. Chang is correct in her hope for her sons to work and thrive in whatever future they choose. Believe me, the same type of persons that create a toxic environment for women and minorities often bully and demean their male coworkers as well. Truly, "If one is oppressed, all are oppressed."

Updating pgAdmin 4 on Ubuntu 16.04

When I opened pgAdmin on my Ubuntu 16.04 system this morning, I received a notification that there was a new version, 3.0. So, in order to upgrade, I did the following:
 $ virtualenv -p python3 pgadmin4   
 $ cd pgadmin4   
 $ source bin/activate   
 $ pip3 install https://ftp.postgresql.org/pub/pgadmin/pgadmin4/v3.0/pip/pgadmin4-3.0-py2.py3-none-any.whl  

The *.whl (wheel file) above is a ZIP-format archive with a specially formatted filename and the .whl extension and is a built-package format for Python. Go to https://www.postgresql.org/ftp/pgadmin/pgadmin4/ for the latest python wheel files.

All looked good until I opened the new version and attempted to run the Query Tool. Whenever I tried to open the Query Tool through the drop-down Tools menu, I received the error dialog "Query Tool Initialize Error."


By experimenting I found that when I changed the URL from http://127.0.0.1:5050 to http://localhost:5050, I was able to use the Query Tool via the drop-down menus.

Therefore, what I did was edit the pgAdmin4/lib/python3.5/site-packages/pgadmin4/config.py file by updating the line:
 DEFAULT_SERVER = '127.0.0.1'  
to
 DEFAULT_SERVER = 'localhost'  
Now when I start pgAdmin, all is good.
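The edit above can also be scripted; here is a small sketch that swaps the DEFAULT_SERVER value with sed (the config.py path is the one from this post; adjust it to wherever your virtualenv installed pgadmin4):

```shell
# Swap pgAdmin's DEFAULT_SERVER setting in config.py, keeping a .bak backup
patch_default_server() {
  sed -i.bak "s/DEFAULT_SERVER = '127.0.0.1'/DEFAULT_SERVER = 'localhost'/" "$1"
}

# Usage (path is an assumption; adjust for your install):
# patch_default_server pgAdmin4/lib/python3.5/site-packages/pgadmin4/config.py
```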


Tuesday, May 08, 2018

Cross Platform Test Code Coverage for .NET Core: Coverlet

I ran across a post on Medium, entitled, Setting up Coveralls with Coverlet for a .NET Core project. This brief and informative article details the setup and use of this new library from Toni Solarin-Sodara.  


As per the instructions, I issued the following from the prompt:
dotnet add package coverlet.msbuild
Here was the result:

Then, to update the dependencies, I ran dotnet restore:


The next step, per the above article, was to issue the dotnet test command with the CollectCoverage property set to true:
dotnet test /p:CollectCoverage=true
Here is the output:



As you can see from the output, I need to create more unit tests to increase coverage! In any event, to see the latest code up close, go to the unit tests github repository.



Friday, May 04, 2018

Encouraging the Wisdom of the Collective - The Up Vote

We have long known that the wisdom and experience of the many is greater, due simply to number and volume, than that of the few. This echoes the line from the movie Star Trek II: The Wrath of Khan (1982), where Spock sacrifices himself for the entire ship and tells Captain Kirk, "The needs of the many outweigh the needs of the few."


The concept of more knowledge existing in the collective than in the individual is not a new idea sparked by the information age. We see it as early as Aristotle and also reflected 13 years ago with James Surowiecki's book, The Wisdom of Crowds: Why the Many Are Smarter Than the Few and How Collective Wisdom Shapes Business, Economies, Societies and Nations.




It is when I am prompted by a coding challenge that I best experience the knowledge of the collective. It is not that I do not try to solve the current dilemma myself. In fact, I will often spend ample time trying to come up with a solution. However, after I realize that I have exhausted what I know to do, I go to the wisdom of the many, namely Google or Bing.

One of the best sites for collective intelligence, long known to software developers, is Stack Overflow.

I have recently realized that after finding answers to a coding problem from the collective, I typically test the information, see that it works, and then get on with the next challenge without voting up the solution from the collective. A solution, by the way, that saved me both time and stress.

What is voting an answer up? Per Stack Overflow, "When you vote up, you are moving that content 'up' so it will be seen by more people." In short, others who likely have the same issue will see the solution sooner and will likely be helped sooner rather than later. Also, "Upvotes on a question give the asker +5 reputation [points]. Upvotes on an answer give the person that answered the question +10 reputation [points]."

Therefore, when you find the answer to your problem, it just seems like the right thing to do to click the up arrow, giving those who researched and worked hard to provide the answer both reputation points and encouragement to continue helping. The goal: be more consistent in upvoting the helpful answers before running on to the next challenge, so that the collective is supported and encouraged.



Wednesday, May 02, 2018

Interstitial Journaling...whaaaaat?



I ran across an interesting post entitled Replace Your To-Do List With Interstitial Journaling To Increase Productivity by Coach Tony. My initial reaction was whaaaaaat journaling? I have to admit that I had to look this adjective up in the dictionary. Per merriam-webster.com, the word is defined as "pertaining to, situated in, or forming interstices." OK, that did not help. What is an interstice? Again per merriam-webster.com, interstice is defined as:
"1 : a space that intervenes between things; especially : one between closely spaced things.... 2 : a short space of time between events." Therefore, interstitial journaling is a form of writing that one does between certain events or milestones.

I have long been a fan of writing a daily journal. I have been using 750words.com for some time. In fact, I have long enjoyed its analysis of my content along with its badge system.


However, the idea of journaling in between the steps of executing a plan is novel to me. After considering it, the value of introspection, and of writing those thoughts down before moving to the next stage in a plan, is clear. What am I thinking right now at this phase? What went well in the last hour? How was I most effective this time in that task? It is during those short breaks that you can use the exercise of journaling to recognize what just happened, or did not happen, and to gain an honest and revealing education about yourself.

Tony states, "Anyone who has ever done journaling in other contexts knows this — your journal is an opportunity for truth and honesty about yourself that you don’t normally have. I’m too ambitious about what I take on, while being cowardly about working hard. Putting those thoughts into a journal moves them from feelings that secretly rule my decisions to rational concepts that I can analyze and solve."

Moreover, we often think we commonly make rational decisions and are guided by facts as we move through our day. However, upon close examination, this may not be the case. Per the Coach, "I almost never explain any productivity topic without referencing the book, Thinking, Fast and Slow. The book covers two modes of decision making. One is a rational but effortful mode. This is what we wish ruled our life. The other is an emotional and habitual mode that sits just below our consciousness. This is what actually rules our life. The magic of journaling is that it is almost always effective at bringing thoughts and feelings up to a place that triggers your rational mind. The net effect is that you’re rebalancing and being more rational."

Therefore, instead of just journaling at one part of the day, morning or evening, I think I will journal throughout the day and/or between the day's tasks. Sounds rational, right?

Friday, April 27, 2018

ASP.Net Core 2 with Postgresql on Linux Part 11: Deploying to Heroku via Docker

In Part I, I discussed the rationale for porting an ASP.Net Core 2 app to my Linux system, replacing SQL Server with Postgresql, and placing it on Heroku.

With Part II, we transferred the SQL Server data to Postgres.


In Part III, I created the ASP.Net MVC Core 2.x app, from which we will proceed, and got Entity Framework Core installed.


Then in Part IV, I looked at the configuration differences between the ASP.Net MVC app that uses SQL Server and the new app, in preparation for porting the old app to use Postgresql.

In Part V, we compared the differences in both projects' structures and ported the existing project code, starting with the data model layer, to the new project. In addition to this, we got the jquery.countdown jQuery plugin installed, which is used to implement a JavaScript timer. More on that in a later article. Finally, we used the dotnet ef dbcontext scaffold command, which builds a DbContext and entity types for our Postgresql database. And, oh yes, we added the project to github, which frankly should have been done on day one.


In Part VI, we got the NUnit testing framework in place in order to test both the new and future code.


Part VII saw the Service layer set up along with its unit tests.

In Part VIII, we imported the Controller classes from the existing app and set up unit tests for them.

Then in Part IX, we migrated the Razor (*.cshtml) files as well as the associated JavaScript, Cascading Style Sheet (CSS), and image files from the existing app.

Part X detailed the Program.cs and Startup.cs files as well as the appsettings.json and project files (*.csproj) content that was then migrated and updated from utilizing SQL Server to accessing Postgresql for data.

Now to deploy this new app to Heroku with Docker from Visual Studio Code on my Ubuntu 16.04 laptop.

First, you will need to download and install the Heroku CLI for your system. Go here for instructions. Then create a Heroku account.


Next, make sure you have Docker installed. Start here.


A quick thing to note is that I renamed my root folder from myApp to myapp, as per this. In short, the repo name must be all lowercase letters. After renaming it, the following commands to deploy the app worked.

Publish your app - pack the application and its dependencies into a folder for deployment to a hosting system, in this case ./bin/Release/netcoreapp2.0/publish - by issuing the following from a command prompt:

 $ dotnet publish -c Release  
Add a file named Dockerfile to the directory where your app was published for release; in my case, the ./bin/Release/netcoreapp2.0/publish directory. Here is the content:
 FROM microsoft/aspnetcore  
 WORKDIR /app  
 COPY . .  
 CMD ASPNETCORE_URLS=http://*:$PORT dotnet myapp.dll  
Build the Docker Image:
 $ docker build -t myapp.dll ./bin/Release/netcoreapp2.0/publish  
Log in to Heroku and, from your dashboard, create a new, free Heroku app to which we will deploy this app.
In the directory of the new app I did the following:
 $ heroku login  
 $ heroku container:login  
Then I tagged the target image that will be deployed:

 $ docker tag myapp.dll:latest registry.heroku.com/newpostgresapp/web  
Now to push the docker image to Heroku:
 $ docker push registry.heroku.com/newpostgresapp/web  
Now, when browsing to the new app URL, it works!


Feel free to create an account by selecting the Register link in the upper right-hand corner of the view. Also, let me know what you think.



Wednesday, April 25, 2018

ASP.Net Core 2 with Postgresql on Linux Part 10: Program, Startup, appsettings.json, etc.

In Part I, I discussed the rationale for porting an ASP.Net Core 2 app to my Linux system, replacing SQL Server with Postgresql, and placing it on Heroku.

With Part II, we transferred the SQL Server data to Postgres.


In Part III, I created the ASP.Net MVC Core 2.x app, from which we will proceed, and got Entity Framework Core installed.


Then in Part IV, I looked at the configuration differences between the ASP.Net MVC app that uses SQL Server and the new app, in preparation for porting the old app to use Postgresql.

In Part V, we compared the differences in both projects' structures and ported the existing project code, starting with the data model layer, to the new project. In addition to this, we got the jquery.countdown jQuery plugin installed, which is used to implement a JavaScript timer. More on that in a later article. Finally, we used the dotnet ef dbcontext scaffold command, which builds a DbContext and entity types for our Postgresql database. And, oh yes, we added the project to github, which frankly should have been done on day one.


In Part VI, we got the NUnit testing framework in place in order to test both the new and future code.


Part VII saw the Service layer set up along with its unit tests.

In Part VIII, we imported the Controller classes from the existing app and set up unit tests for them.

Then in Part IX, we migrated the Razor (*.cshtml) files as well as the associated JavaScript, Cascading Style Sheet (CSS), and image files from the existing app.

In this post, the Program.cs and Startup.cs files as well as the appsettings.json and project files (*.csproj) content will be migrated and then updated from utilizing SQL Server to accessing Postgresql for data.

Here is the compare of those root files:




Since these files are specific to the application, a simple copy and replace cannot be made. Examining the content of the existing app (on the left), we see the bower.json and .bowerrc files, which are part of the Bower package manager. This was addressed earlier in Part V, where we discussed the recommended move from Bower to Yarn.

First, let's look at the appsettings.json files. After moving the Logging and Appsettings object settings, the remaining difference was the database connection strings, which specify the database platform used.
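The appsettings.json screenshots are omitted here; for illustration, a Postgresql connection string in that file takes roughly this shape (the connection-string key, database, and user names below are assumptions, not the post's actual values; the SQL Server version would instead use a `Server=...;Database=...` style string):

```json
{
  "ConnectionStrings": {
    "DefaultConnection": "Host=localhost;Port=5432;Database=myappdb;Username=myuser;Password=<secret>"
  }
}
```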




Next, when comparing the two apps' project files, you see several differences.



A few of the differences are residual settings needed when the existing app was moved from .Net Core 1.x to 2.0. For example, the following setting:


<PropertyGroup>
  <UserSecretsId>....</UserSecretsId>
</PropertyGroup>
was needed because UserSecrets 1.0.0 required the existence of a project.json file; when the project was moved to the csproj format, the above setting had to be added. See: https://github.com/aspnet/Announcements/issues/209.

AssetTargetFallback, below, was set so that with the move to .NET Core 2.0, any NuGet package that is compatible with .NET Framework 4.6.1 or higher can be used without additional configuration.


<AssetTargetFallback>$(AssetTargetFallback);portable-net45+win8+wp8+wpa81;</AssetTargetFallback>
The other settings are/were necessary for the current app's functionality.

The final file to consider is the Startup.cs file.



The difference here is between the SQL Server and the Postgresql configuration of the DbContextOptionsBuilder object, which is used in the lambda expression as options:
options => options.UseNpgsql

The DbContextOptionsBuilder object, "provides a simple API surface for configuring DbContextOptions. Databases (and other extensions) typically define extension methods on this object that allow you to configure the database connection (and other options) to be used for a context."
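To make that concrete, here is a minimal sketch of how the two ConfigureServices registrations might differ. The context class name (BlogContext) and the connection-string keys are assumptions for illustration, not the app's actual names:

```csharp
public void ConfigureServices(IServiceCollection services)
{
    // Existing app (SQL Server provider):
    // services.AddDbContext<BlogContext>(options =>
    //     options.UseSqlServer(Configuration.GetConnectionString("SqlServerConnection")));

    // New app (Postgresql via the Npgsql EF Core provider):
    services.AddDbContext<BlogContext>(options =>
        options.UseNpgsql(Configuration.GetConnectionString("PostgresConnection")));

    services.AddMvc();
}
```

The only substantive change is swapping the UseSqlServer extension method for UseNpgsql; the rest of the registration stays the same.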

Ok. Now we are ready to run the new ASP.Net MVC .Net Core 2.0 app.



Browsing to the https://localhost:5000 URL, we see the initial page.


In the next and final post in this series, we will deploy this app to Heroku via Docker.


Tuesday, April 24, 2018

ASP.Net Core 2 with Postgresql on Linux Part 9: The View Layer

In Part I, I discussed the rationale for porting an ASP.Net Core 2 app to my Linux system, replacing SQL Server with Postgresql, and placing it on Heroku.

With Part II, we transferred the SQL Server data to Postgres.


In Part III, I created the ASP.Net MVC Core 2.x app from which we will proceed and got Entity Framework Core installed.


Then in Part IV, I looked at the configuration differences between the ASP.Net MVC app that uses SQL Server and the new app, in preparation for porting the old app to use Postgresql.

In Part V, we compared the differences in the two projects' structures and ported the existing project code, starting with the data model layer, to the new project. In addition, we installed the jquery.countdown jQuery plugin, which is used to implement a JavaScript timer. More on that in a later article. Finally, we used the dotnet ef dbcontext scaffold command, which builds a DbContext and entity types for our Postgresql database. And, oh yes, we added the project to GitHub, which frankly should have been done on day one.


In Part VI, we got the NUnit testing framework in place in order to test both the new and future code.


In Part VII, we got the Service layer set up along with its unit tests.

In Part VIII, we imported the Controller classes from the existing app and set up unit tests for them.

In this post, we will migrate the Razor (*.cshtml) files as well as the associated JavaScript, Cascading Style Sheet (CSS), and image files from the existing app.

As we have done in past posts, let's use Beyond Compare to see what needs to move. 



So, as with the other tiers, we have some files to move in this layer.

After using Beyond Compare to move the Razor files, here is the view:

Next, let's look at the contents of the wwwroot folder, which holds the JavaScript, Cascading Style Sheet (CSS), and image files.



From here I simply replaced the contents of the new app with the JavaScript, Cascading Style Sheet (CSS), and image files of the existing app. 
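The copy itself can be scripted. Here is a sketch using throwaway paths under /tmp to stand in for the two projects; in practice the source and destination would be the existing and new apps' wwwroot directories:

```shell
# Set up a stand-in "existing app" wwwroot (illustrative paths only).
mkdir -p /tmp/oldapp/wwwroot/js /tmp/oldapp/wwwroot/css /tmp/oldapp/wwwroot/images
mkdir -p /tmp/newapp/wwwroot
echo 'body { margin: 0; }' > /tmp/oldapp/wwwroot/css/site.css

# Replace the new app's static assets with the existing app's files.
cp -r /tmp/oldapp/wwwroot/js /tmp/oldapp/wwwroot/css /tmp/oldapp/wwwroot/images /tmp/newapp/wwwroot/

# Confirm the assets arrived.
ls /tmp/newapp/wwwroot
```

Using Beyond Compare's folder sync accomplishes the same thing interactively, which is what I did here.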



Now, we need to build and see what happens. 


So, thus far, all is good. However, we know that the contents of the Program.cs and Startup.cs files, as well as the appsettings.json and project (*.csproj) files, are still different. In the next post, Part X, we will migrate them from utilizing SQL Server to accessing Postgresql for data.