LEARN MICROSOFT .NET TECHNOLOGY






DEPLOYING A MULTI-TIER MEAN APPLICATION ON MICROSOFT AZURE

Brij Mohan November 2, 2015
In this post I am going to connect all the dots from my previous three posts.
 1. Continuous deployment of Node, Express application on Azure using Github
 2. Simple CRUD operation using MEAN stack
 3. NodeJS connectivity to MongoDB on Cloud – Microsoft Azure and mongolab

In brief, I will restructure and reconfigure the application I created in post #2 above (Simple CRUD operation using MEAN stack), deploy it on Microsoft Azure with continuous deployment configured from Github, and finally connect the different layers using the concepts covered in NodeJS connectivity to MongoDB on Cloud.
Before starting this example I recommend going through the previous posts mentioned above; they give a little background on what I am going to explain in this post.
Step 1: Restructure the application. In my previous example “Simple CRUD operation using MEAN stack” I provided an application built on the MEAN stack. I am going to take that sample application and split it into two parts: a Client using AngularJS as the client-side technology, and a Server built on NodeJS + Express. Finally, the server connects to MongoDB, for which I have used mongolab as DBaaS (Database as a Service). The overall architecture of the application will be as follows.

So my application structure will be as follows.

Client and Server apps: AddressBookClient and AddressBookServer

Alternatively, you can sign up for free cloud-based MongoDB provided by MongoDB Atlas using this URL: https://www.mongodb.com/cloud/atlas

Step 2: Configure the Client and Server applications on Github. I have provided more details on application configuration on Github in Continuous deployment of Node, Express application on Azure using Github. I am following similar steps here, except that instead of one consolidated application I am deploying two applications, i.e. one for the server and another for the client.
Step 3: Configure the Client application on Microsoft Azure with Github as source control and continuous deployment enabled.


Step 4: Configure the Server application on Microsoft Azure with Github as source control and continuous deployment enabled.


Step 5: Create a new app settings key in the Microsoft Azure console by following these steps:
WEB APPS—>Your Application—>Configure—>Go to App Settings—>Add a new Key and Value as mentioned below.

Step 6: Configure your nodejs application with this key. Using this key, my nodejs server application will pull the DB connection string from the environment variable configured on Azure.
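
A minimal sketch of what that looks like in server.js, assuming a hypothetical app settings key named MONGOLAB_URI and the mongojs module from the Simple CRUD post:

// Read the MongoDB connection string from the Azure app setting.
// "MONGOLAB_URI" is a hypothetical key name; use the key you created in the portal.
var mongojs = require('mongojs');

var connectionString = process.env.MONGOLAB_URI || 'mongodb://localhost:27017/AddressBook';
var db = mongojs(connectionString, ['Persons']);   // 'Persons' collection name assumed from the CRUD post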

Step 7: Once your application is deployed on Azure, you can test your API by hitting the URL of the NodeJS Express API; in my case it is:
addressbookserver.azurewebsites.net/persons

This gave me the JSON response from the mongolab data store. In case you don’t have any sample data on mongolab, you can log in to your mongolab console and add a few documents with a schema similar to the one I have provided in the screen above. This is required only for testing purposes. From here onwards my client application is going to GET/PUT/POST and DELETE these data.
Step 8: So up to this point the DB server, Express API and Angular client are all deployed on Azure. Ideally, when I hit the client URL it should call the Express API, and using the mongodb driver the Express server should fetch the data from MongoDB hosted on the mongolab server. Let’s go ahead and test the application end to end. Even though I have data in MongoDB, I am not getting any output on my client; if you open the browser console you can see I am getting an error from my Express server.


> XMLHttpRequest cannot load http://addressbookserver.azurewebsites.net/persons.
> No ‘Access-Control-Allow-Origin’ header is present on the requested resource.
> Origin ‘http://addressbookclient.azurewebsites.net’ is therefore not allowed
> access.

This is due to the same-origin policy that browsers enforce on JavaScript: by default, scripts are prevented from making requests across domain boundaries, which has spawned various hacks for making cross-domain requests. CORS (Cross-Origin Resource Sharing) introduces a standard mechanism that can be used by all browsers for implementing cross-domain requests. The spec defines a set of headers that allow the browser and server to communicate about which requests are (and are not) allowed.
Since in this example I have broken my application into client and server and hosted them on two different domains, this extra layer of validation kicks in. CORS is a vast topic in itself, so instead of diverting from the actual topic I recommend going to the dedicated site for CORS and reading about it; that should give you a good understanding of the internals of CORS.
Coming back to the original article, my next step will be to configure my application for CORS.
Step 9: CORS (Cross-Origin Resource Sharing)
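
A minimal sketch of the middleware described below, added to server.js before the routes are registered (the exact header list is an assumption; the enable-cors.org link at the end of this step shows the same pattern):

// Allow only the Angular client hosted on its own Azure domain to call this API.
app.use(function (req, res, next) {
  res.header('Access-Control-Allow-Origin', 'http://addressbookclient.azurewebsites.net');
  res.header('Access-Control-Allow-Methods', 'GET, POST, PUT, DELETE');
  res.header('Access-Control-Allow-Headers', 'Origin, X-Requested-With, Content-Type, Accept');
  next();
});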

In the code above I am adding my client URL to the Access-Control-Allow-Origin header in the server.js file of the Express API. With this, every response served by the Express API carries the allowed origin, and the browser will make sure that no one other than my client domain, http://addressbookclient.azurewebsites.net, is able to access my API. You can refer to the expressjs implementation of CORS @ http://enable-cors.org/server_expressjs.html
Step 10: Now that we have everything in place, let’s try refreshing the client URL:
http://addressbookclient.azurewebsites.net
Now I am able to get the data from mongodb through the Express API running on Nodejs.


And if I inspect the response, I can see Access-Control-Allow-Origin is set to my client domain, which means any other URL that tries to access this API will get the same error I received in Step 8 above. But if you don’t want any constraints or restrictions, you can change this to Access-Control-Allow-Origin: ‘*’, where the wildcard asterisk allows any domain to access the API.

And this is pretty much all I had to share on this topic, but to get the complete picture on all the associated topics, don’t forget to visit my previous posts:

 1. Continuous deployment of Node, Express application on Azure using Github
 2. Simple CRUD operation using MEAN stack
 3. NodeJS connectivity to MongoDB on Cloud – Microsoft Azure and mongolab

YOU CAN FORK OR CLONE THE CODE @

 * Server: https://github.com/bmdayal/AddressBookAPIServer.git
 * Client: https://github.com/bmdayal/AddressBookAPIClient.git

References:

 * http://blog.mongolab.com/2013/02/node-js-and-mongolab-on-windows-azure/
 * https://github.com/mafintosh/mongojs



Categories: Atlas, Azure, Express, express-generator, Github, Javascript, MEAN,
Microsoft Cloud Infrastructure, MongoDB, mongolab, Nodejs, Windows Azure



NODEJS CONNECTIVITY TO MONGODB ON CLOUD – MICROSOFT AZURE AND MONGOLAB

Brij Mohan October 19, 2015

In this post I am going to cover different options for hosting your MongoDB database: DBaaS (Database as a Service), PaaS (Platform as a Service) and IaaS (Infrastructure as a Service) using Microsoft Azure.




OPTION 1 : USING INFRASTRUCTURE AS A SERVICE (IAAS)

In the first option I am using a dedicated Microsoft Azure VM to configure MongoDB. This means you can have your database in your favorite cloud, in the same location as your application tier.

Step 1: Configure your MongoDB database on Windows Server 2008 on a VM hosted in the cloud (Microsoft Azure). I found an excellent article which demonstrates how to configure mongodb on Windows Server 2008 hosted on Azure. Reiterating the same thing here does not make sense, so I am providing the link to the article. You can use it to configure your mongodb database server.

https://azure.microsoft.com/en-us/documentation/articles/virtual-machines-install-mongodb-windows-server/


OPTION 2: USING PLATFORM AS A SERVICE (PAAS)

For this option I am using mongolab via the Microsoft Azure console; alternatively you can go directly to https://mongolab.com and create a free or paid account.

I am starting from Step 0, just so that my Steps 1 to 3 match the steps in the Microsoft Azure wizard and to avoid any confusion.

Step 0: Once you are logged in to the Microsoft Azure console, select Marketplace from the list of available options.



Step 1: Select MongoLab as a developer service and select Next arrow



Step 2: Select your plan and click next arrow



Step 3: Click on Purchase

Step 4: Go to your MongoLab dashboard to manage your MongoLab service; this link will take you to https://mongolab.com/

Step 5: Once you are in the mongolab dashboard, you have full control of your mongodb databases, collections, documents, users, profile, etc.






TEST CONNECTIVITY USING COMMAND PROMPT IN WINDOWS OS

Step 1: Before you connect to the mongolab database, you need to create users for your database; you have the option to create a read-only user or users with full access.



Step 2: To test connectivity, open a command prompt and enter the following command

> $ mongo ds048368.mongolab.com:48368/MongoLab-3 -u <dbuser> -p <dbpassword>

This will take you to the mongodb console, where you can use mongodb commands/queries to access the documents and collections.

The same will be reflected in the web console, where you also have the option to directly edit your databases, collections, users and documents.


CONNECTING TO MONGOLAB USING NODEJS

Step 1: Install the mongodb driver for node using the command

> $ npm install mongodb

Step 2: Create a server.js file and add the following code.
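
A minimal sketch of that server.js, using the official mongodb driver installed in Step 1 (replace <dbuser>, <dbpassword> and the host/database with the values from your mongolab dashboard; the Persons collection name is only an assumption):

var MongoClient = require('mongodb').MongoClient;

// Connection string in the same form as the mongo shell command above.
var uri = 'mongodb://<dbuser>:<dbpassword>@ds048368.mongolab.com:48368/MongoLab-3';

MongoClient.connect(uri, function (err, db) {
  if (err) {
    console.log('Connection failed: ' + err);
    return;
  }
  console.log('Connected to mongolab');

  // Read a few documents just to prove the connection works.
  db.collection('Persons').find().limit(5).toArray(function (err, docs) {
    console.log(docs);
    db.close();
  });
});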



Step 3: Run the application to test connectivity to mongolab.

Now you are all set to start your development using MongoDB hosted on a cloud platform. Since I am using these technologies for learning purposes, I personally found mongolab the most convenient option, as it does not require a lot of configuration before you actually start development and you don’t have to run extra commands to make sure your server is running on the console.

Even though mongodb is hosted in both options (mongolab and a Windows Azure VM), the two options have entirely different infrastructure and concepts. For learning purposes you may choose either of them, but if you are seriously thinking of hosting mongodb for your enterprise applications then you should do a detailed study on which suits you better in terms of IaaS or PaaS.

Mongolab also has partnerships with cloud providers to offer both Infrastructure as a Service and Platform as a Service. You can learn more @ https://mongolab.com/company/partners/

References and other helpful links

 1. https://docs.mongodb.org/manual/
 2. https://mongodb.github.io/node-mongodb-native/api-articles/nodekoarticle1.html

Categories: Azure, MongoDB, mongolab, Nodejs



SIMPLE CRUD OPERATION USING MEAN STACK

Brij Mohan October 18, 2015

In this post I am going to cover basic CRUD operations using the MEAN (MongoDB, Express, Angular and NodeJS) stack. You can use this example to kick start your project using the MEAN stack.



Before I begin, I am assuming that the reader of this post has a basic understanding of AngularJS, MongoDB, NodeJS and client-side technologies like jQuery, HTML 5, CSS 3, etc.

Most of the commands and the environment I have used are for the Windows operating system, but there’s not much difference if you are using MacOS or Linux, and all the technologies used in this example are platform independent.

1. SETUP THE ENVIRONMENT

Follow the URLs to the respective sites to get detailed information on download and installation. These links provide installers for every OS supported by these frameworks, along with installation instructions.

 * Install/configure NodeJS
   * http://blog.teamtreehouse.com/install-node-js-npm-windows
 * Install/configure MongoDB
   * http://docs.mongodb.org/manual/tutorial/install-mongodb-on-windows/
 * If you want to download this sample you can fork or clone using github using
   the link
   * https://github.com/bmdayal/MEANSample

2. SETUP SOLUTION

Solution structure: In this example I am using a person database, and the only function of the application is to Create-Read-Update-Delete a person record. To achieve this I have created a simple solution, using the following structure.



Here I am using the app folder for most of my application code; this folder contains the angular controllers, services and views. I am keeping my node modules very simple, just exposing APIs which will be consumed by the AngularJS services. I have covered most of the server-side node code in just one file called server.js. I will try to cover a more complex node module/architecture in my next post.

3. DOWNLOADING, INSTALLATION AND CONFIGURATION OF DEPENDENCIES

Download and installation instructions: For this example my application uses the following node modules; to install them I am using the node package manager from GitBash.

 * body-parser: A node.js body parsing middleware for simple applications; for more detail please refer to https://github.com/expressjs/body-parser
   * Installation: $ npm install body-parser
 * express: Express is a minimal and flexible Node.js web application framework that provides a robust set of features for web and mobile applications (Reference: http://expressjs.com/). I am using express in my middleware to create the node.js server and APIs.
   * Installation: $ npm install express
 * mongojs: A node.js module for mongodb that emulates the official mongodb API as much as possible. It wraps mongodb-core and is available through npm (Reference: https://github.com/mafintosh/mongojs).
   * Installation: $ npm install mongojs
 * angular-loading-bar: I am using this on the client side for displaying a progress bar. The interesting thing is that due to the fast performance of my application I never get to see it in action.
   * Installation: $ npm install angular-loading-bar

Configuration of the AngularJS client application:

 * Angular JS CDN:
   https://ajax.googleapis.com/ajax/libs/angularjs/1.4.7/angular.min.js
 * Angular Route CDN:
   https://cdnjs.cloudflare.com/ajax/libs/angular.js/1.5.0-beta.1/angular-route.min.js
 * BOWER: bower install angular#1.4.7
 * Node package manager: npm install angular@1.4.7
 * Additional Modules: https://code.angularjs.org/1.4.7/

4. SETUP MIDDLEWARE

I am using node.js for my middleware; to set it up, first I have created my server in node.js.
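
A minimal sketch of this first version of server.js, assuming nothing beyond what the next paragraph describes:

var express = require('express');   // line 1: import the express module
var app = express();                // line 2: create an instance of express
app.listen(3000);                   // line 3: start the server listening on port 3000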

Here in lines 1 and 2 above I have imported the express module and created an instance of express, and in line 3 I have used the listen function of express to create a new server on port 3000, so my server is listening on port 3000. When I run this in GitBash I get the following message.



I still need to add a few more modules to my server.js to help the application connect to my mongodb server and configure the application working directories. But before I do that, let me first start creating the GET/POST/PUT and DELETE APIs in server.js which will communicate with my client module in AngularJS.

5. CREATE GET/POST/PUT AND DELETE SERVER API'S
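
A sketch of what that blueprint might look like; the HTTP verbs, the static-file setup and the empty stub bodies are assumptions, and the routes are the ones listed in the table below:

// server.js - blueprint of the API; the route bodies are stubs for now.
var express = require('express');
var app = express();

app.use(express.static(__dirname));                     // use the application root as the working folder

app.get('/persons', function (req, res) {               // get all persons
  console.log('GET /persons');
  res.json([]);
});

app.get('/person/:id', function (req, res) {            // get a single person by id
  console.log('GET /person/' + req.params.id);
  res.json({});
});

app.post('/addPerson', function (req, res) {            // add a new person
  res.json({});
});

app.delete('/deletePerson/:id', function (req, res) {   // delete a person by id
  res.json({});
});

app.put('/updatePerson', function (req, res) {          // update a person
  res.json({});
});

app.listen(3000);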

In the code above I have created a blueprint of my middleware in server.js.

Node JS API controller routes and descriptions:

 * /persons : Get all persons
 * /person/:id : Get a single person by person id
 * /addPerson : Add a new person
 * /deletePerson/:id : Delete a single person by id
 * /updatePerson : Update a single person

In the first few lines of the code I have created the instance of express and configured my application to use the root directory of the application as its working folder. Now let's run the server and check the output in a web browser.

In this example I have created stubs for the Get by Id, Get All, Update, and Delete person requests. To test the code, start the node server; initially you will see the same output I provided earlier, but when you enter the URLs on which I have created my APIs in the browser, you can see the outputs in the console.

I have provided examples of the GET requests. I will write Angular modules to test the POST/PUT and DELETE APIs.

GET All (localhost/persons)







GET By ID (localhost/person/1)



6. CONFIGURE ANGULAR JS CLIENT

I am using a simple MVC model in the AngularJS client, but you can fork or clone my Git repository to extend this solution. This example assumes that you have a basic understanding of AngularJS. To start, let's create a simple router which redirects the application to the home page, my person page.

Angular JS Routing
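
A sketch of the routing configuration described below; the module name addressBookApp is hypothetical, and it assumes ngRoute is loaded from the CDN listed earlier:

// app.js - module definition and client-side routing
var app = angular.module('addressBookApp', ['ngRoute']);

app.config(function ($routeProvider) {
  $routeProvider
    .when('/person', {
      templateUrl: 'app/views/person.html',        // list page
      controller: 'PersonCtrl'
    })
    .when('/person/:PersonId', {
      templateUrl: 'app/views/persondetail.html',  // detail page for a single person
      controller: 'PersonAddressCtrl'
    })
    .otherwise({
      redirectTo: '/person'
    });
});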

In the example above, I have mapped the paths /person and /person/:PersonId to the PersonCtrl and PersonAddressCtrl controllers respectively, with the corresponding views app/views/person.html and app/views/persondetail.html. This means that when I hit the URL http://localhost:3000/person, the app goes to person.html mapped to the PersonCtrl.js controller, and when I hit a single person out of the list using a personId, e.g. http://localhost:3000/person/1, the routing engine redirects the user to persondetail.html using PersonAddressCtrl.js.

Angular Controllers – PersonCtrl

onError : This is my common error module; this action method is used only for displaying the error when something goes wrong during the communication, whether at the server level or at the client level.



refresh : This is my default get-all-persons action method, used to load all person data from the mongodb database via nodejs. In this code I am calling the corresponding API ‘/persons’, created in nodejs and express, to fetch all the persons from mongodb. Once the data is returned, the callback function onPersonGetCompleted populates the person data in the scope of the current module. refresh() is called as soon as person.html, my default page, is loaded in the application, and every time we need to refresh the data from mongodb.



searchPerson : This action method is called when I want to view a single person by person id, for example when opening the update form for the selected person. It is associated with the callback function onGetByIdCompleted, which populates the scope with the single person result from mongodb.



addPerson : As the name suggests, this action method is used to add a new person. Its associated callback function is onAddPersonCompleted, which refreshes the data once the add is completed, so that the Person default page reflects up-to-date information.



deletePerson : As the name suggests, the only purpose of this action method is to delete the selected person. It calls the ‘/deletePerson/:id’ API in nodejs, and once the call returns it uses its callback function onPersonDeleteCompleted to refresh the data in the HTML page person.html.



updatePerson : This action updates the person information in mongodb. It calls the ‘/updatePerson’ API in nodejs. Once the update is completed, it calls the callback function onUpdatePersonCompleted in Angular to refresh the updated data in the Person page.



Let's put everything together to see my completed PersonCtrl.js. Next I am going to show how this integrates with the view, and the final step will be to write the body of my API controller which is used to transact with mongodb.
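
A sketch of that completed controller, reconstructed from the descriptions above; it assumes the app module from the routing sketch, and uses the action and callback names mentioned in this section:

// PersonCtrl.js - client-side controller for the person list page
app.controller('PersonCtrl', function ($scope, $http) {

  // common error handler used by every request below
  var onError = function (response) {
    $scope.error = 'Something went wrong: ' + response.status;
  };

  var onPersonGetCompleted = function (response) {
    $scope.persons = response.data;            // populate the person list in the scope
  };

  var onGetByIdCompleted = function (response) {
    $scope.selectedPerson = response.data;     // single person used by the edit modal
  };

  var onAddPersonCompleted = function () { $scope.refresh(); };
  var onPersonDeleteCompleted = function () { $scope.refresh(); };
  var onUpdatePersonCompleted = function () { $scope.refresh(); };

  // load all persons from mongodb via the nodejs API
  $scope.refresh = function () {
    $http.get('/persons').then(onPersonGetCompleted, onError);
  };

  // load a single person by id, e.g. before opening the edit form
  $scope.searchPerson = function (personId) {
    $http.get('/person/' + personId).then(onGetByIdCompleted, onError);
  };

  // add a new person, then refresh the list
  $scope.addPerson = function (person) {
    $http.post('/addPerson', person).then(onAddPersonCompleted, onError);
  };

  // delete the selected person, then refresh the list
  $scope.deletePerson = function (personId) {
    $http.delete('/deletePerson/' + personId).then(onPersonDeleteCompleted, onError);
  };

  // update an existing person, then refresh the list
  $scope.updatePerson = function (person) {
    $http.put('/updatePerson', person).then(onUpdatePersonCompleted, onError);
  };

  $scope.refresh();   // load the list as soon as person.html is loaded
});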



7. COMPLETING THE NODEJS EXPRESS API CONTROLLER

Now let's revisit the API controller, the skeleton of which I created in Step 5 above. Here I am going to write the bodies of my GET/POST/PUT and DELETE methods, which transact with the mongodb document database. In the chart below I have shown each NodeJS express API and the corresponding mongodb API that is called to complete the transaction.

Action : NodeJS Express API : MongoDB API

 * Get all Persons : /persons : db.Persons.find()
 * Get Person by id : /person/:id : db.Persons.findOne()
 * Add new Person : /addPerson : db.Persons.insert()
 * Delete Person by id : /deletePerson/:id : db.Persons.remove()
 * Update single Person : /updatePerson : db.Persons.findAndModify()

If you want to extend your knowledge beyond the list provided above, I suggest you look into the very detailed and excellent documentation provided by MongoDB: http://docs.mongodb.org/manual/applications/crud/

Now let's look at the completed nodejs API, shown below. In line #4, AddressBook is my collection name, which is the rough equivalent of a table in an RDBMS, and Person is the document. If you are still confused by documents, collections and the other jargon of MongoDB, you can look into http://docs.mongodb.org/manual/reference/glossary/, which covers the whole mongodb glossary.

I have also introduced body-parser here, which is used to parse the request and response bodies as JSON.

And again, for an extensive explanation of each of the operations I have used in the example below, you can refer to the MongoDB documentation: http://docs.mongodb.org/manual/applications/crud/
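
A sketch of that completed server.js, reconstructed from the description and the chart above; the AddressBook/Persons names and the _id-based lookups are assumptions you may need to adjust for your own data:

var express = require('express');
var bodyParser = require('body-parser');
var mongojs = require('mongojs');
var db = mongojs('AddressBook', ['Persons']);   // line 4: connect to AddressBook with a Persons collection (names taken from the post)

var app = express();
app.use(express.static(__dirname));
app.use(bodyParser.json());                     // parse JSON request bodies

app.get('/persons', function (req, res) {
  db.Persons.find(function (err, docs) { res.json(docs); });
});

app.get('/person/:id', function (req, res) {
  // lookup by Mongo _id; adjust if your documents use a custom id field
  db.Persons.findOne({ _id: mongojs.ObjectId(req.params.id) }, function (err, doc) {
    res.json(doc);
  });
});

app.post('/addPerson', function (req, res) {
  db.Persons.insert(req.body, function (err, doc) { res.json(doc); });
});

app.delete('/deletePerson/:id', function (req, res) {
  db.Persons.remove({ _id: mongojs.ObjectId(req.params.id) }, function (err, result) {
    res.json(result);
  });
});

app.put('/updatePerson', function (req, res) {
  var id = req.body._id;
  delete req.body._id;                           // the _id itself cannot be modified
  db.Persons.findAndModify({
    query: { _id: mongojs.ObjectId(id) },
    update: { $set: req.body },
    new: true
  }, function (err, doc) { res.json(doc); });
});

app.listen(3000);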



This is the least code you can write to implement CRUD operations using mongodb and node, but the possibilities are limitless if you want to extend this example. Some of the things you can do with very little effort to make this solution more structured and object oriented are:

1. Use the application generator tool, express, to quickly create an application
skeleton. http://expressjs.com/starter/generator.html

2. Use ORM/ODM tools like mongoose. Mongoose provides a straight-forward,
schema-based solution to model your application data. It includes built-in type
casting, validation, query building, business logic hooks and more, out of the
box.

3. Use routing on the server side. Although I have used routing on the client side, you can also use routing in the middleware to keep more control over your code. You can learn more on these topics @ http://expressjs.com/starter/basic-routing.html OR http://expressjs.com/guide/routing.html

These are just a few of the options that can be done with very little effort, but the possibilities are unlimited. You can fork or clone my example and extend it as far as you like for learning purposes.

Now let's get back to the pending items and wind up this post by creating the views.

8. ANGULAR JS AND HTML VIEWS

I am going to create a dashboard which displays all the persons I have in the AddressBook collection of my mongodb database. I have provided a screenshot below to show how the completed code will look. Here I have used AngularJS bindings for model binding, and CSS3 and HTML5 to make my page look a little decent. On click of Edit I am going to edit the person using a modal popup with pre-populated person data; similarly there is a modal popup for Add Person with empty controls which allows the user to add a new person record, and Delete simply removes the record from mongodb. My example is designed to take you to the persondetail.html page, which would have more details of the person such as the person's address, but for simplicity's sake I have not implemented that in this example.





To achieve this I need only a few pages, which are listed below.

index.html : This page has only the references to the controllers, angularjs, bootstrap and other client-side libraries. This page also has a

> <div data-ng-view="data-ng-view"></div>

This acts as a placeholder for displaying the views injected by the angularjs routing engine.

person.html

This is the home page, or default page, of my application. Screenshots of this page are provided above. The purpose of this page is to list all the persons in the mongodb database, and to add, edit and delete persons from mongodb.

I have used a very loosely defined model called person, and this is saved in mongodb directly without any further transformation. My view is bound to the persons model in the $scope through the PersonCtrl. Once the model is populated I use ng-repeat to display all the persons in the table. In line #17, I call searchPerson with the personId in PersonCtrl to get the selected person to edit from mongodb; here data-toggle and data-target tell the Edit button to display #personEditModal when it is clicked. For delete, in line #18 I call the deletePerson action in PersonCtrl, which in turn calls the NodeJS delete API and removes the person from the mongodb database.

For the Add and Edit modal popups I have a very basic design; refer to the screenshot above. I have provided the edit person example below. The only differences between edit and add person are the captions/headings and the save button: Edit calls the updatePerson(person) action in PersonCtrl, while the add person modal calls the addPerson(person) action in PersonCtrl.

So with this code I have completed the end-to-end data bindings, the client-side API using angular, and the server-side API using nodejs and express. You might have noticed I have not done anything like schema creation in mongodb, as mongodb is a schema-less, document-oriented database, so the same JSON I use to communicate between the different layers is stored directly in the mongodb database. But again, this is just the start; the things I have not covered in mongodb are relational-data examples, like person and person-address documents and how they can be associated with each other using references, complex model bindings, etc. You can clone, fork or download this example from my git repository @ https://github.com/bmdayal/MEANSample

REFERENCES AND OTHER HELPFUL LINKS

 * http://adrianmejia.com/blog/2014/10/03/mean-stack-tutorial-mongodb-expressjs-angularjs-nodejs/
 * http://mean.io/
 * https://angularjs.org/
 * http://mongodb.github.io/node-mongodb-native/2.0/tutorials/crud_operations/
 * https://scotch.io/tutorials/using-mongoosejs-in-node-js-and-mongodb-applications
 * https://mongodb.github.io/node-mongodb-native/api-articles/nodekoarticle1.html

Categories: AngularJS, CSS, CSS3, Express, express-generator, HTML 5, MEAN,
MongoDB, Nodejs



CONTINUOUS DEPLOYMENT OF NODE, EXPRESS APPLICATION ON AZURE USING GITHUB

Brij Mohan September 6, 2015

Normally we deploy projects to Azure using the Visual Studio IDE, but Azure is not limited to Microsoft technologies, especially now that the majority of developers are on open source or moving to open source.

In this post I am going to provide a step-by-step process to deploy your Nodejs project, with source control on Github, to the Microsoft Azure cloud.

The basic things you need to set up before you can start your deployment are as follows:

 1. Nodejs: You can download Nodejs from here: https://nodejs.org/en/
 2. Github repository account: First you need to set up a new account (if you don’t already have one) at https://github.com/
 3. Github for Desktop: I am using Github for Windows to commit and push my changes to the Github repository, but you can use other tools as well to push your files to Github. You can download it from here: https://desktop.github.com/.
 4. Microsoft Azure: If you don’t have an Azure subscription you can sign up for a 1-month free trial. This gives you $200 credit to spend on all the Azure services. You can visit this link to sign up for a month of free service: https://azure.microsoft.com/en-us/pricing/free-trial/

You don’t need very advanced knowledge of Node, Github or Microsoft Azure to follow the steps covered here, but I am assuming you have at least basic knowledge of Nodejs, Github and Azure. Once you have all the required setup installed on your machine, let's get started with the basic steps required to deploy your Nodejs sample application to Azure using Github.

Step 1 Setup your environment: Create a working folder for your application, and start Git Bash in that directory. Alternatively you can use the Windows command prompt. For this example I have created a folder on my local C drive as C:/Source/NodeOnAzureSample

If you already have an application you want to deploy to Azure, you can use that. For this example I am using express-generator to quickly create a sample application using Node, Express and Jade, just to save time and keep our focus on the actual topic.

Step 2 (Optional) Setup my demo project: This step is optional if you already have your application ready. Here I am running the command “npm install express-generator -g” to install the generator, which I will use to create my sample application template with a basic welcome page. You can find more detail on express-generator @ http://expressjs.com/starter/generator.html

The -g flag installs the generator globally under your C:/Users/YourUserName/AppData/Roaming/npm folder; you can install it in your local application folder instead by removing the -g from the command.

Now run the “express NodeOnAzureSample” command; this creates all the folders and files required to start your development.



Step 3 (Optional) Restore the dependencies: Run “npm install”; this restores all the dependencies your application relies on, like jade, express, etc.

Step 4 Test the application on localhost: After executing Steps 1 to 3 we have our application ready to run on the local machine. Let's go ahead and test the sample application by switching to the application directory, running the command “npm start” in Git Bash (you can use powershell or the Windows command prompt), and entering “localhost:3000” in your browser. This should give you the following screen, and your sample application is ready for hosting. In the next steps I am going to configure my application with a Github repository.



For simplicity's sake, and to stick to the topic, I am not going to explain the folder structure, code, and other granular details of this sample application.

Step 5: Setup a Github repository @ github.com: Go to https://github.com/new and create a new repository; once you have selected all your desired options as shown in the image below, click the “Create Repository” button.



You will be provided with step-by-step information about the new repository you have created.



Copy the URL to your clipboard or notepad; you are going to need it when you commit and push your local application files to the repository at github.com.

For this example my repository URL is :
https://github.com/bmdayal/NodeOnAzureSample.git

Step 6: Setup the Github repository on your desktop: Open Git Gui and select Create New Repository from the menu, which asks you to locate your application directory; select the folder you want to deploy to Azure. In my case I have mapped my “NodeOnAzureSample” application to the Git repository.

Once you select Create, Git GUI lists all the files which are either new or modified. The next step is to stage the changed files, commit and sign off; to perform these you first need the new repository on https://github.com, which I have already explained in Step 5 above.



Select “Stage changed” –> Sign Off –> Provide initial commit message
–>Commit—>and Push.

Selecting Push commits your files to the online github repository; to do this you will need the URL of your GitHub repository at github.com, and you will also be prompted to authenticate yourself.



Once your files are successfully pushed to the online repository, you will get a confirmation message as below.



Now that your files are ready for deployment, let's switch to Azure to configure continuous deployment.

Step 7 Configure the Azure website: To do this go to the https://manage.windowsazure.com dashboard, select Web Apps, and then select New from the footer menu.



This brings up a wizard to create a new Web App. For this example I am going to use the Quick Create option to create my web application, named NodeOnAzureSample. If the name is available you get a green check mark next to it. This provides you with a URL such as https://nodeonazuresample.azurewebsites.net. Once you are happy with the name, click the “Create Web App” button.



You can see the site is now listed in the Web Apps list and showing as Running; now click on the URL to test your web site @ http://nodeonazuresample.azurewebsites.net/.

This shows the default welcome page of an Azure website. In my next steps I am going to replace the welcome page with the actual application we have created and pushed to Github.



Step 8 Configure deployment using Github: Select the Web App listed in the image
above and Select Dashboard from the top menu.



From the dashboard, select Set up deployment from source control from right
navigation menu.



You will be provided with a list of options, where you need to select GitHub,
and select Next



You will need to authenticate yourself to GitHub where you have created your
repository that needs to be deployed on Azure.



On successful authorization, you will be provided with the list of repositories you have on Github; for this example I have selected my “NodeOnAzureSample” application.



Select Done when you have made your selection, and this is pretty much all we need to do to link the Github repository to the Azure website.

Deployment Stage 1

Deployment Stage 2: Notice that you now have a new menu called “Deployments” after Dashboard. Be patient; this might take a few seconds to a few minutes depending upon the size of your application.

Deployment Stage 3: Deployment completed. You can see the initial comment I provided during my commit to Github.

Step 9 Testing your deployment on the Azure website: Now that your application is deployed on Azure, the next step is to test it; no further configuration is required, as Azure configures everything for you. Let's go to the Web App dashboard and hit the URL: nodeonazuresample.azurewebsites.net



Yes our application is ready and running on Azure

Step 10 Continuous deployment: To verify continuous deployment, go to your index.js in the NodeOnAzureSample/routes folder and change the text from “Express” to “Node, Express, Azure and Github”.
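
For reference, the generated routes/index.js should look roughly like this after the edit (express-generator may produce slightly different boilerplate depending on its version):

// NodeOnAzureSample/routes/index.js
var express = require('express');
var router = express.Router();

/* GET home page. */
router.get('/', function (req, res, next) {
  res.render('index', { title: 'Node, Express, Azure and Github' });
});

module.exports = router;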



Save the file, go to your Github Desktop —> Rescan —> Commit and Push the file with the comment “Changed the welcome text from Express to Node, Express, Azure and Github”.



Once you complete these steps, go back to your Azure dashboard. Probably by the time you switch back, your changes will already have been deployed to your website. Let's go ahead and verify.



And here we go: on my Azure dashboard above I can see the latest changes are already deployed, with the comment I provided in my github commit.

Let's test this change in the browser by refreshing the page, and yes, my changes are reflected below.



This is just a simple example of how things are simplified in Microsoft Azure. This post should give you a head start on integrating Node, Express, Github and Azure.

You can use any of your favorite open-source technologies like Angular, Knockout, Backbone, Jade, etc. with your favorite source control like Visual Studio Online, GitHub, Dropbox, Codeplex, etc. and integrate with Microsoft Azure, and above all you don’t need to be a ninja in these technologies.

Happy Coding and Keep exploring

Categories: Azure, Express, express-generator, Github, Nodejs



SHARING THE COOKIES IN WEB FARM OR ACROSS DIFFERENT SERVERS

Brij Mohan December 22, 2014

Consider an application which gives you an authentication cookie using the ASP.NET Membership provider, and you are using this authentication cookie across multiple servers to access secured content. I have depicted this scenario in the diagram below.



Now, to read the client cookies across the applications you can simply use the following line of code to fetch the cookie:

HttpContext.Current.Request.Cookies[FormsAuthentication.FormsCookieName].Value;

But wait a minute, is it that simple? In fact yes, except that, to secure your session and prevent man-in-the-middle attacks, your cookies are encrypted using the machine key of the authentication server, which might look similar to the one below. This is configured at machine level, which means you will not usually find this key in your local web.config files.

<machineKey  
  validationKey="21F090935F6E49C2C797F69BBAAD8402ABD2EE0B667A8B44EA7DD4374267A75D7AD972A119482D15A4127461DB1DC347C1A63AE5F1CCFAACFF1B72A7F0A281B" 
  decryptionKey="ABAA84D7EC4BB56D75D217CECFFB9628809BDB8BF91CFCD64568A145BE59719F" 
  validation="SHA1" decryption="AES"/>



You might also be aware that every machine has its own machine.config file tailored to that particular machine, so if the cookie is encrypted using the machine key of Server1 then it cannot be decrypted using the machine key of Server2 or any other server.

So even if you manage to get the cookie, when you try to read data from it you first have to decrypt the cookie in order to read any key from it. I have provided below sample code which does exactly that.

private void SetFormsAuthenticationTicket()
{
    FormsAuthenticationTicket ticket = default(FormsAuthenticationTicket);
    if (System.Web.HttpContext.Current.Request.Cookies.Get(System.Web.Security.FormsAuthentication.FormsCookieName) != null)
    {
        // Decrypt the forms authentication cookie issued by the authentication server.
        ticket = System.Web.Security.FormsAuthentication.Decrypt(
            System.Web.HttpContext.Current.Request.Cookies.Get(
            System.Web.Security.FormsAuthentication.FormsCookieName).Value);

        // Roles are carried in the UserData portion of the ticket, separated by '|'.
        string[] roles = ticket.UserData.Split(new char[] { '|' });
        GenericIdentity userIdentity = new GenericIdentity(ticket.Name);
        GenericPrincipal userPrincipal = new GenericPrincipal(userIdentity, roles);

        // Attach the reconstructed principal to the current request.
        System.Web.HttpContext.Current.User = userPrincipal;
    }
}



To get a better view I have also provided the screenshot of the code below.



This does solve your problem, but when you try to run the code you might get the following exception.

> System.Web.HttpException : Unable to validate data. at
> System.Web.Configuration.MachineKeySection.EncryptOrDecryptData(Boolean
> fEncrypt, Byte[] buf, Byte[] modifier, Int32 start, Int32 length, IVType
> ivType, Boolean useValidationSymAlgo, Boolean signData)



This happens because, as I mentioned above, your cookie was created with the machine key of Server1, but part of your application is served by Server2, which tries to decrypt the cookie using the code above. To mitigate this issue, the first thing you need to do is generate a machineKey that can be shared across all the applications that share the cookie across your network. I have written a separate post on how to generate the machineKey using IIS 7.0+; you can visit the link:
http://www.dotnetglobe.com/2014/12/generate-machinekey-in-iis-70.html

Secondly, you have to place the same machineKey (validationKey and decryptionKey) in the local web.config of every application, and you are done.

You might encounter this type of scenario in small-scale applications, but for most complicated applications these days, where applications are placed on completely different domains, you will need to implement your own SSO architecture. The details are out of the scope of this article, so you might take a look at some other article such as Single Sign On (SSO) for cross-domain ASP.NET applications, Single Sign-On (SSO) for .NET, or using a third-party identity provider like Facebook, Google, etc.

> References:
> 
> http://msdn.microsoft.com/en-us/library/ff649308.aspx
> 
> http://www.codeproject.com/Articles/288631/Secure-ASP-NET-MVC-applications

Categories: .NET Framework, Cookies, General Topics, Visual Studio.NET 2008,
Visual Studio.NET 2010, Web Farm



GENERATE MACHINEKEY IN IIS 7.0+

Brij Mohan December 17, 2014

Normally a machineKey is already configured in your machine.config file and applied to all the applications on that machine, but often this is not the scenario; your applications may be distributed across different machines or web farms.

To deal with this, all you need to do is configure the same keys in the web.config of all the applications so they can decrypt the authentication cookie created on the client side.

To generate a new machine key, first open the IIS Admin Console, and:

1. Select the Server

2. Select the Machine Key from right hand side of the console



3. Double-click to open the Machine Key feature and then select ‘Generate Keys’ from Actions



4. You now have the Validation key and the Decryption key

You can use these keys in your web.config file, in a machineKey element similar to the one shown in my previous post.



Categories: IIS 7.0



UNSUBSCRIPTION EMAIL UTILITY

Brij Mohan December 16, 2014

In this post I am going to show a very simple yet very useful utility. All of us encounter lots of unwanted mails that we subscribed to at some point in our web exploration and later want to get rid of; moreover, handling this is an integral part of many business use cases.

I have used ASP.NET MVC as an example, but if you understand the concept this can be achieved using any of your favorite programming stacks as well.

I have tried to keep the example as simple as possible so that I can cover the concept and business logic in more detail. In this example, as soon as I run the application I call the Index action method with my hardcoded email id. In the real world you would want to make this more dynamic and pass it as a parameter.



Now I have written a utility method which generates the unsubscribe link that can be embedded in the marketing emails, so when the user clicks on this link, say “Unsubscribe Me”, he lands on a page where, with a single click of a button, his mail id is removed from the marketing email database.



For the sake of simplicity I have not used real database operations in this example, but for your implementation you will definitely need DB operations to update the user preferences.

Now let me highlight the most important logic of this GetUnSubscriptionLink method: since we are dealing with a key piece of user information, the email id, I have used a CryptoHelper utility to encrypt the email id of the consumer. This also helps filter out brute-force attacks where a hacker tries random emails from his list to invoke the unsubscribe method.

You can download your copy of CryptoHelper utility from here:
http://1drv.ms/1r2kisl

An example of the unsubscription link is shown in the screen below. As a better programming practice, you can configure the root URL in your config file.



Additionally, you can add a second level of validation where you prompt the end user to provide his email id in a text box and validate it against the decrypted string containing the email id from the query string; if both match, unsubscribe, otherwise ignore the unsubscribe request.

In my example, as soon as all the validation passes I call the Unsubscribe action method of my controller, which takes the encryptedText as a parameter and decodes the text to unsubscribe the user from marketing emails.



Sample screen once the user is unsubscribed



I am providing below a utility property which I found somewhere on the internet; it might be very useful to you if you are playing with URLs.



Crypto Helper Utility: http://1drv.ms/1r2kisl

You can download the full working copy of this example from this link:
http://1drv.ms/1vU5ZCO

Hope this helps; let me know in the comments if you have any trouble downloading the files or understanding my thoughts.

Categories: ASP.Net, ASP.NET MVC, CryptoHelper, Email, Unsubscribe, Visual
Studio 2013 RC



HELLO WORLD !!! USING APACHE CORDOVA AND VISUAL STUDIO 2013

Brij Mohan December 11, 2014

You heard it right: this is for all the .NET developers (developers from other platforms are also welcome) who have ever wished to build an app using Apache Cordova that targets multiple mobile platforms: Android, iOS, Windows, and Windows Phone.



Disclaimer: All images are copyright to their respective owners.

I am going to show you a very simple mobile application which displays Hello World, using Apache Cordova and Visual Studio 2013.

1. Setup the environment by installing the preview of Visual Studio Tooling Support for Apache Cordova; you can download the CTP from here.

Visual Studio Tools for Apache Cordova



2. Once the installation is complete, run Visual Studio 2013 and create a new project. Go to the Javascript templates, select Multi-Device Hybrid App and click the OK button. This gives you the default project structure.





3. For this post I am not going to explain all the files; rather I go directly to my index.html and change the default text to Hello World !!!, and we are done.



4. And finally select the desired Emulators and run the program.



And here is the output, that’s it and you are done.



You can double-click the config.xml file to customize the Custom and Core properties of your application individually for each platform.



Hope this makes your life simpler. Enjoy.

> Other helpful link:
> 
> http://msopentech.com/blog/2014/05/12/apache-cordova-integrated-visual-studio/
> 
> http://msdn.microsoft.com/en-us/library/dn757054.aspx

Categories: Andriod, Apache, Apple, Cordova, Windows 8, Windows Phone 8



ASP.NET VNEXT

Brij Mohan September 14, 2014

With ASP.NET vNext, Microsoft has remodeled the entire pattern, with lots of added features and improvements to the programming model. In this post I have tried to compile the information from a few useful sources into a fairly complete list of ASP.NET vNext features.

The best part is that it’s open source. In this post I have collected a few interesting features of ASP.NET vNext and the video of the two great Scotts (Scott Hanselman and Scott Hunter) originally presented at Microsoft TechEd North America 2014.




Here are some of the new features in ASP.NET vNext.

 * Cloud-ready out of the box
 * A single programming model for Web sites and services
 * Low-latency developer experience
 * Make high-performance and high-productivity APIs and patterns available –
   enable them both to be used and compose together within a single app
 * Fine-grained control available via command-line tools and standard file
   formats
 * Delivered via NuGet
 * Release as open source via the .NET Foundation
 * Can run on Mono, on Mac and Linux

Ref :
http://blogs.msdn.com/b/dotnet/archive/2014/05/12/the-next-generation-of-net-asp-net-vnext.aspx

REBUILT FROM THE GROUND UP

 * MVC, Web API, and Web Pages are merged into one framework, called MVC 6. The
   new framework uses a common set of abstractions for routing, action
   selection, filters, model binding, and so on.
 * Dependency injection is built into the framework. Use your preferred IoC
   container to register dependencies.
 * vNext is host agnostic. You can host your app in IIS, or self-host in a
   custom process. (Web API 2 and SignalR 2 already support self-hosting; vNext
   brings this same capability to MVC.)
 * vNext is open source and cross platform.

LEANER, FASTER

 * MVC 6 has no dependency on System.Web.dll. The result is a leaner framework,
   with faster startup time and lower memory consumption.
 * vNext apps can use a cloud-optimized runtime and subset of the .NET
   Framework. This subset of the framework is about 11 megabytes in size
   compared to 200 megabytes for the full framework, and is composed of a
   collection of NuGet packages.
 * Because the cloud-optimized framework is a collection of NuGet packages, your
   app can include only the packages you actually need. No unnecessary memory,
   disk space, loading time, etc.
 * Microsoft can deliver updates to the framework on a faster cadence, because
   each part can be updated independently.

TRUE SIDE-BY-SIDE DEPLOYMENT

The reduced footprint of the cloud-optimized runtime makes it practical to
deploy the framework with your app.

 * You can run apps side-by-side with different versions of the framework on the
   same server.
 * Your apps are insulated from framework changes on the server.
 * You can make framework updates for each app on its own schedule.
 * No errors when you deploy to production resulting from a mismatch between the
   framework patch level on the development machine and the production server.

NEW DEVELOPMENT EXPERIENCE

vNext uses the Roslyn compiler to compile code dynamically.

 * You can edit a code file, refresh the browser, and see the changes without
   rebuilding the project.
 * Besides streamlining the development process, dynamic code compilation
   enables development scenarios that were not possible before, such as editing
   code on the server using Visual Studio Online ("Monaco").
 * You can choose your own editors and tools.

You can find this video and many other interesting videos of TechEd North
America 2014 @ http://tena2014.eventpoint.com/topic/list

Link to Scott Hanselman blog:
http://www.hanselman.com/blog/IntroducingASPNETVNext.aspx

Complete list of feature in Text:

http://www.asp.net/vnext/overview/aspnet-vnext/getting-started-with-aspnet-vnext-and-visual-studio

http://www.asp.net/vnext

I hope this compiled information helps you kick start your journey with ASP.NET vNext. Please share your views in the comments below.

Categories: ASP.NET MVC, ASP.NET vNext, ASP.NET Web Api, Framework 4.5, HTML 5,
MVC 5, Visual Studio 2014



4.93 MILLION GOOGLE ACCOUNTS HACKED !!! – USERNAME LIST

Brij Mohan September 12, 2014

A couple of days back I heard that a few million Google users' account ids and passwords had been leaked online. Initially I thought it was just a rumor, but after reading about it in leading newspapers I realized the seriousness of the issue.

> Russian hackers have leaked the email IDs and passwords of as many as 4.93
> million Google accounts. The same Google account password is used across all
> Google products, such as Gmail, Drive, Plus, YouTube, Maps etc.
> 
> The account details have been posted on bitcoin forum btcsec.com by a user
> named Tvskit. On the forum, Tvskit has said that approximately 60% of the
> passwords are still active.

Source :
http://timesofindia.indiatimes.com/tech/tech-news/4-93-million-Gmail-passwords-leaked-by-hackers/articleshow/42241159.cms

http://www.nydailynews.com/news/world/5-million-gmail-usernames-passwords-posted-online-article-1.1935155



After digging into the thread I managed to download the complete list of users whose account passwords were hacked; remember, this is just the list of users, not their passwords. Although this list does not contain my user name, I found a couple of emails I am familiar with, so I thought to share the complete list with the readers of my blog and friends.

You can also query your email at the link here to see if your Google account is present in the hackers' database.

https://isleaked.com/en



I have uploaded the file here. If you find your username in this list, I suggest you take the necessary action to protect your account, as you might be aware that the same account is used across all your Google services like Gmail, Google Drive, YouTube, Android, etc.

Link to download the file containing all the Usernames: http://1drv.ms/1qOspGg

Categories: Gmail, Google, Hacked


