
URL: http://masnun.com/
Submission: On February 25 via manual from GB — Scanned from GB



ABU ASHRAF MASNUN


TALES OF A SOFTWARE CRAFTSMAN

 * Home
 * About Me
 * Tutorials
 * Codes
 * Contact




POLYGLOT.NINJA() – MY NEW INITIATIVE!

posted in Uncategorized on May 5, 2017 by maSnun with 0 Comments

So here it goes again: I have started another new blog/site –
http://polyglot.ninja/ – where I am planning to create programming-related
content. Mostly learning guides and tutorials, and who knows, maybe some video
tutorials too?

Why did I start another site? Well, for one, I love that domain. Also, being a
polyglot developer at heart, I feel the need to make the term “polyglot” more
familiar among developers. I would also like to convey the idea that learning
multiple programming languages is very important. Learning new ideas and
concepts from other languages does make us better developers.

I hope to continue writing there. If you are interested, you can subscribe to
the mailing list (I use the same list, so subscribing here would also get you
updates from that site). I will keep you posted on the new content I create
there.


I HAVE A NEW BLOG!

posted in Uncategorized on February 15, 2017 by maSnun with 0 Comments

Hey everyone,

It’s nice to see a lot of traffic coming to this site. However, I am moving to a
new domain: http://masnun.rocks

From now on, I will be writing on the new blog.


NETWORK SECURITY REQUIREMENTS

posted in Python on September 18, 2016 by maSnun with 16 Comments


The right network security solution will help your business stay compliant with
business and government regulations.

Rigorous standards are essential for compliance with local, state, and federal
laws.

System Requirements

There are no formal requirements for these standards. However, according to the
Government of the United States, the following requirements apply.

 * National Electrical Code
 * A computer and Internet connection
 * A single user account (no guest accounts)
 * 2.5 GHz radio devices that work in both the 2.4 GHz and 5 GHz bands, and an
   account manager or administrator that has a command and control (CnC) role
 * A video camera and microphone used for customer and/or employee safety
 * A program that encrypts network communication
 * A policy in place that limits the number of people using the network
 * Equal bandwidth
 * End-to-end encryption
 * Single application access

Google Cloud Platform

The following are the requirements for Google Cloud Platform:

Required Hardware

 * A computing device that has 4 GB of RAM
 * Any 2.5 GHz computer or a partner computer (e.g. Google Compute Engine) that
   has access to the Internet
 * (if using a partner computer) A 2.4 GHz computer or a partner computer (e.g.
   Google Compute Engine) that is connected to the Internet at least once a
   month

Optional Hardware

 * A wireless LAN router
 * A hard disk drive with sufficient storage capacity
 * A USB flash drive with sufficient storage capacity
 * An email server with enterprise management

For more information about Google Cloud Platform, see the Google Cloud Platform
Architecture overview and Introduction to Google Cloud Platform.

Self-Hosted Cloud Infrastructure

Typically, you can self-host your own servers on your own infrastructure. This
means you control the physical design of your servers; using application
servers for this is important, and you can get tools with the right application
access at sites like
https://www.fortinet.com/solutions/enterprise-midsize-business/network-access/application-access.
You are also in control of hardware upgrades and system maintenance. Some of
the drawbacks to self-hosting include:

Performance

Self-hosted cloud infrastructure typically performs worse than conventional
cloud infrastructure. On the other hand, because you control the server design,
you can choose the resources you need to maximize the performance of your
servers. For example, you can prioritize processing power for
performance-intensive operations, which minimizes the risk of system failures.

Operating costs

Because you are managing all your servers at the hardware level, you are
responsible for connecting them together and keeping them up to date. You are
also responsible for maintaining and upgrading them as required.

Cloud Automation

In some situations, automated systems in your environment may be able to scale
up or down to meet your needs, such as:

 * Processing more transactions
 * Processing more requests at a faster rate
 * Processing more incoming messages

For more information about Cloud Automation, see the Cloud Automation Overview.

Network Connectivity

Your cloud infrastructure needs to have a network connection. Different
companies choose different networks for this purpose.

Important: Don’t use your existing network to deploy your cloud infrastructure.
If you do so, you will add complexity to your existing network infrastructure,
which could negatively impact your Cloud Platform provisioning and functioning.


A BRIEF INTRODUCTION TO DJANGO CHANNELS

posted in Django, Python on September 11, 2016 by maSnun with 8 Comments

There’s a new updated version of this article here:
http://masnun.rocks/2016/09/25/introduction-to-django-channels/

--------------------------------------------------------------------------------

Django has long been an excellent web framework. It has helped many developers
and numerous businesses succeed over the years. But before the introduction of
Channels, Django only supported the HTTP protocol well. With the gradual
evolution of web technologies, standing here in 2016, supporting HTTP only is
simply not enough. Today, we are using websockets for real-time communication,
WebRTC is getting popular for real-time collaboration and video calling, and
HTTP/2 is being adopted by many. In the current state of the web, any modern
web framework needs to be able to support more and more protocols. This is
where Django Channels comes into play. Channels aims to add new capabilities to
Django, including support for modern web technologies like websockets and
HTTP/2.


HOW DOES “CHANNELS” WORK?

The idea behind Channels is quite simple. To understand the concept, let’s
first walk through an example scenario and see how Channels would process a
request.

An HTTP/websocket request hits the reverse proxy (i.e., nginx). This step is
not compulsory, but we’re conscientious developers and always make sure our
requests first go through a hardened, battle-proven reverse proxy before they
hit our application server.

Nginx passes the request on to an application server. Since we’re dealing with
multiple protocols now, let’s call it an “Interface Server” instead. This
interface server knows how to handle requests using different protocols. It
accepts the request, transforms it into a message, and then passes the message
on to a channel.

We have to write consumers which listen on specific channels. When new
messages arrive on those channels, the consumers process them and, if needed,
send a response back to a reply/response channel. The interface server listens
on these response channels; when we write back to them, it reads the message
and transmits it to the outside world (in this case, our user). The consumers
run in background worker processes. We can spawn as many workers as we like to
scale up.

So as you can see, the concept is really simple – an interface server accepts
requests and queues them as messages on channels. Consumers process these
queues and write responses back on response channels. The interface server
then sends the responses back out. Plain, simple, yet effective!
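The flow above can be sketched with a toy in-process model. This is purely an illustration of the idea (here `queue.Queue` stands in for a real channel layer; it is not how Channels is implemented internally):

```python
from queue import Queue

# Toy model: named channels backed by plain in-process queues.
channels = {"websocket.receive": Queue(), "reply": Queue()}

def interface_server(raw_text):
    # The interface server transforms a request into a message on a channel.
    channels["websocket.receive"].put({"text": raw_text, "reply_channel": "reply"})

def worker():
    # A background worker consumes one message and writes a response
    # back to the reply channel named inside the message.
    message = channels["websocket.receive"].get()
    reply = {"text": "You said: {}".format(message["text"])}
    channels[message["reply_channel"]].put(reply)

interface_server("hello")
worker()
print(channels["reply"].get()["text"])  # → You said: hello
```

The key point the sketch shows: the interface server and the worker only share channel names and messages, never function calls, which is what lets us scale them independently.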

There are channels which are already available for us. For example, the
http.request channel can be listened on if we want to handle incoming HTTP
messages, or websocket.receive can be used to process incoming websocket
messages. In reality, we would probably be less interested in handling
http.request ourselves and would rather let Django handle it. We would be more
interested in adding our custom logic for websocket connections or other
protocols. Besides the channels which are already available, we can also create
our own custom channels for different purposes. Since the project works by
passing messages to channels and handling them with background workers, we can
actually use it for managing our background tasks too. For example, instead of
generating thumbnails on the fly, we can pass the image information as a
message to a channel and have a worker do the thumbnailing in the background.
By default, Channels ships with a management command – runworker – which can
run background workers to listen on the channels. However, as of now, there is
no retry mechanism if message delivery somehow fails. In this regard, Celery
can be an excellent choice for writing, running, and managing the background
workers which process these channels.

Daphne is now the de facto interface server that works well with Channels. The
channels and message passing work through a “channel layer” which supports
multiple backends. The popular ones are In Memory, Redis, and IPC. As you can
guess, these backends and the channel layer are used to abstract away the
process of maintaining different channels/queues and allowing workers to
listen on them. The In Memory backend maintains the channels in memory and is
a good fit for local development, while a Redis cluster would be more suitable
in a production environment for scaling up.
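For production, a Redis-backed channel layer could be configured along these lines (this uses the asgi_redis backend from Channels 1.x; the host and port here are illustrative assumptions, not values from the post):

```python
# Hypothetical production channel layer using the Redis backend.
# The routing module path mirrors the local-development example below.
CHANNEL_LAYERS = {
    "default": {
        "BACKEND": "asgi_redis.RedisChannelLayer",
        "CONFIG": {
            # Where the Redis server lives; adjust for your cluster.
            "hosts": [("localhost", 6379)],
        },
        "ROUTING": "realtime.routing.channel_routing",
    },
}
```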


LET’S BUILD A WEBSOCKET ECHO SERVER

Enough talk. Let’s build a simple echo server. But before we can do that, we
first have to install the package.




Shell

pip install channels



That should install Django (as it’s a dependency of Channels) and channels
along with the other necessary packages. Start a Django project with
django-admin and create an app.

Now add channels to the INSTALLED_APPS list in your settings.py. For local
development, we are fine with the in-memory channel layer, so we need to put
these lines in settings.py to define the default channel layer:




Python

CHANNEL_LAYERS = {
    "default": {
        "BACKEND": "asgiref.inmemory.ChannelLayer",
        "ROUTING": "realtime.routing.channel_routing",
    },
}



In the above code, please note the ROUTING key. As its value, we have to pass
the path to our channel routing. In my case, I have an app named realtime with
a module named routing.py which contains the channel routing.




Python

from channels.routing import route
from .consumers import websocket_receive

channel_routing = [
    route("websocket.receive", websocket_receive, path=r"^/chat/"),
]



In the channel routing list, we define our routes, which look very similar to
Django’s URL patterns. When we receive a message through a websocket
connection, the message is passed on to the websocket.receive channel, so we
defined a consumer to consume messages from that channel. We also defined a
path to indicate that websocket connections to /chat/ should be handled by
this particular route. If we omit the path, clients can connect to any URL on
the host and we can catch them all! But defining a path helps us namespace
things, and it serves another purpose which we will see later in this article.

And here’s the consumers.py:




Python

def websocket_receive(message):
    text = message.content.get('text')
    if text:
        message.reply_channel.send({"text": "You said: {}".format(text)})



The consumer is very basic. It retrieves the text we received via websocket
and replies back. Note that the websocket content is available on the content
attribute of the message, and reply_channel is the response channel here (the
interface server is listening on this channel). Whatever we send to this
channel is passed back to the websocket connection.

We have defined our channel layer, created our consumer, and mapped a route to
it. Now we just need to launch the interface server and the background workers
(which run the consumers). In a local environment, we can just run python
manage.py runserver as usual. Channels will make sure the interface server and
the workers are running in the background. (But this should not be used in
production; in production we must run Daphne separately and launch the workers
individually. See here).

Once our dev server starts up, let’s open up the web app. If you haven’t added
any Django views, no worries – you should still see the “It Worked!” welcome
page of Django, and that is fine for now. We need to test our websocket, and
we are smart enough to do that from the dev console. Open up your Chrome
DevTools (or Firefox’s, Safari’s, or any other browser’s dev tools) and
navigate to the JS console. Paste the following JS code:




JavaScript

socket = new WebSocket("ws://" + window.location.host + "/chat/");
socket.onmessage = function(e) {
    alert(e.data);
}
socket.onopen = function() {
    socket.send("hello world");
}



If everything worked, you should get an alert with the message we sent. Since
we defined a path, the websocket connection works only on /chat/. Try
modifying the JS code to send a message to some other URL and see how it
doesn’t work. Also remove the path from our route and see how you can catch
all websocket messages from all websocket connections, regardless of which URL
they were connected to. Cool, no?

Our websocket example was very short; we just tried to demonstrate how things
work in general. But Django Channels provides some really cool features for
working with websockets. It integrates with the Django auth system and
authenticates websocket users for you. Using the Group concept, it is very
easy to create group chats, live blogs, or any sort of real-time communication
in groups. Love Django’s generic views? There are generic consumers to help
you get started fast. The Channels docs are quite nice; I suggest you read
through them and try the concepts out.
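To make the Group idea concrete, here is a self-contained sketch of what a group buys you: one send() broadcast to every subscribed reply channel. The class and names are illustrative only; the real Channels Group API persists membership in the channel layer rather than in a Python list:

```python
# Illustrative model of the Group concept (not the real Channels API).
class Group:
    def __init__(self, name):
        self.name = name
        self.reply_channels = []

    def add(self, reply_channel):
        # Subscribe a client's reply channel to this group.
        self.reply_channels.append(reply_channel)

    def send(self, content):
        # Broadcast the same content to every subscribed reply channel.
        for channel in self.reply_channels:
            channel.append(content)

chat = Group("chat")
alice, bob = [], []  # lists stand in for two clients' reply channels
chat.add(alice)
chat.add(bob)
chat.send({"text": "hello everyone"})
print(alice[0]["text"], bob[0]["text"])  # → hello everyone hello everyone
```

This is exactly the shape of a group chat or a live blog: consumers add new connections to the group, and a single send reaches everyone.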


USING OUR OWN CHANNELS

We can create our own channels and add consumers to them. Then we can simply
send messages to those channels by using the channel name, like this:




Python

from channels import Channel

Channel("thumbnailer").send({
    "image_id": image.id
})




WSGI OR ASGI?

Since Daphne and ASGI are still new, some people prefer to handle their HTTP
requests via WSGI. In such cases, we can configure nginx to route requests to
different servers (WSGI/ASGI) based on URL, domain, or the Upgrade header.
Having the real-time endpoints under a particular namespace can then help us
easily configure nginx to send the requests under that namespace to Daphne
while sending all others to WSGI.
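As a hypothetical illustration of that split, an nginx server block could route the /chat/ namespace to Daphne and everything else to the WSGI server. The upstream names and ports here are assumptions for the sketch, not values from the post:

```nginx
# Illustrative only: /chat/ goes to the ASGI server (Daphne),
# everything else goes to the WSGI server.
upstream wsgi_server { server 127.0.0.1:8000; }
upstream asgi_server { server 127.0.0.1:9000; }

server {
    listen 80;

    location /chat/ {
        proxy_pass http://asgi_server;
        # Forward the websocket upgrade handshake.
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }

    location / {
        proxy_pass http://wsgi_server;
    }
}
```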


NODEJS: GENERATOR BASED WORKFLOW

posted in NodeJS on June 19, 2016 by maSnun with 0 Comments

This post assumes you already know about Promises and Generators, and focuses
on the co library and the generator-based workflow it supports. If you are not
very familiar with Promises or Generators, I would recommend you study them
first.


PROMISES ARE NICE

Promises can save us from callback hell and allow us to write
easy-to-understand code. Maybe something like this:




JavaScript

function promisedOne() {
  return new Promise((resolve, reject) => {
    setTimeout(() => {
      resolve(1)
    }, 3000);
  })
}

promisedOne().then(console.log);



Here we have a promisedOne function that returns a Promise. The promise is
resolved after 3 seconds with the value 1. To keep things simple, we used
setTimeout; we can imagine other use cases, like network operations, where
promises come in handy instead of callbacks.


GENERATOR BASED WORKFLOW

With the new ES2015 generators and a nice generator-based workflow library
like co (https://github.com/tj/co), we can do better. Something like this:




JavaScript

const co = require('co');

function promisedOne() {
  return new Promise((resolve, reject) => {
    setTimeout(() => {
      resolve(1)
    }, 3000);
  })
}

function * main() {
  try {
    let promised_value = yield promisedOne();
    console.log(promised_value)
  } catch (e) {
    console.error(e);
  }
}

co(main)



What’s happening here? The co function takes a generator function and executes
it. Whenever the generator function yields something, co checks whether the
object is one of the yieldables it supports. In our case it’s a Promise, which
is supported. So co takes the yielded promise, processes it, and returns the
result back into the generator, letting us grab the value from the yield
expression. If the promise is rejected or any error occurs, it is thrown back
into the generator function as well, so we can catch the error.
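For illustration only, the driving loop described above can be sketched in Python (co itself is JavaScript and far more featureful; here "yieldables" are just zero-argument callables standing in for promises):

```python
# A rough analogue of what co does: drive a generator, "resolve" each
# yielded value, and send the result (or throw the error) back in.
def run(gen_func):
    gen = gen_func()
    try:
        thunk = next(gen)               # run to the first yield
        while True:
            try:
                result = thunk()        # resolve the yieldable
            except Exception as exc:
                thunk = gen.throw(exc)  # propagate errors into the generator
                continue
            thunk = gen.send(result)    # send the resolved value back in
    except StopIteration as stop:
        return stop.value               # the generator's return value

def main():
    value = yield (lambda: 1)           # "await" a yieldable
    return value + 1

print(run(main))  # → 2
```

The same send/throw dance is what lets the generator body read like synchronous code while the driver handles the asynchrony.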

co returns a Promise. So if the generator function returns any values, we can
retrieve them using the promise APIs.




JavaScript

function * returnSomething() {
  return "Done"
}

co(returnSomething).then(console.log);

// Or

co(function * () {
  return "Something else"
}).then(console.log);

// Let's make it a little more complex

co(function *() {
  return yield new Promise((resolve, reject) => {
    setTimeout(() => {
      resolve(1)
    }, 3000);
  })
}).then(console.log);



So basically, we yield the yieldables, catch any errors, and return the
values. We don’t have to worry about how the generator is driven behind the
scenes.

The co library also has a useful function – co.wrap. It takes a generator
function, but instead of executing it directly, it returns a regular function
which returns a promise.




JavaScript

const co = require('co');

function promisedOne() {
  return new Promise((resolve, reject) => {
    setTimeout(() => {
      resolve(1)
    }, 3000);
  })
}

const main = co.wrap(function *() {
  return yield promisedOne();
});

// main function now returns a promise
main().then(console.log);



This can often come in handy when we want to use co with other
libraries/frameworks which don’t support a generator-based workflow. For
example, here’s a Gist that demonstrates how to use generators with the HapiJS
web framework – https://gist.github.com/grabbou/ead3e217a5e445929f14. The
route handlers are written as generator functions and then adapted for Hapi
using co.wrap.


BUILDING A FACEBOOK MESSENGER BOT WITH PYTHON

posted in Python on May 22, 2016 by maSnun with 33 Comments

Facebook now has the Messenger Platform, which allows us to build bots that
can accept messages from users and respond to them. In this tutorial, we shall
see how we can build a bot and add it to one of our pages so that users can
interact with the bot by sending messages to the page.

To get started, we have three requirements to fulfill:

 * We need a Facebook Page
 * We need a Facebook App
 * We need a webhook / callback URL to accept incoming messages

I am assuming you already have a Facebook Page. If you don’t, go ahead and
create one. It’s very simple.


CREATING AND CONFIGURING THE FACEBOOK APP

(1) First, we create a generic Facebook app. We need to provide the name,
namespace, category, and contact email. Simple and straightforward. This is
how it looks for me:


(2) Now we have to browse to the “Add Product” section and add “Messenger”.



(3) Generate an access token for a Page you manage. A popup will open asking
you for permissions. Grant the permission and you will soon see the access
token for that page. Please take note of this token; we shall use it later to
send messages to users on behalf of the page.



Next, click the “Webhooks” section.

(4) Before we can set up a webhook, we need a URL which is publicly accessible
on the internet. The URL must have SSL (that is, it needs to be https). To
meet this requirement and set up a local dev environment, we set up a quick
Flask app on our local machine.

Install Flask from PyPI using pip:


Shell

pip install Flask



Facebook will send a GET request to the callback URL we provide. The request
will contain a custom secret we add (while setting up the webhook) and a
challenge code from Facebook. They expect us to output the challenge code to
verify ourselves. To do so, we write a quick GET handler using Flask.




Python

## Filename: server.py

from flask import Flask, request

app = Flask(__name__)

@app.route('/', methods=['GET'])
def handle_verification():
    return request.args['hub.challenge']

if __name__ == '__main__':
    app.run(debug=True)



We run the local server using python server.py. The app will launch on port
5000 by default. Next, we use ngrok to expose the server to the internet.
ngrok is a fantastic tool and you should seriously give it a try for running
and debugging webhooks/callback URLs on your local machine.




Shell

ngrok http 5000



With that command, we will get an address like https://ac433506.ngrok.io. Copy
that URL and paste it into the webhook setup popup. Checkmark the events we’re
interested in (I checked them all). Then we input a secret, which our code
doesn’t care much about, so just add anything you like. The popup now looks
like this:



Click “Verify and Save”. If the verification succeeds, the popup will close and
you will be back to the previous screen.



Select a Page again and click “Subscribe”. Now our app should be added to the
page we selected. Please note: if we haven’t generated an access token for
that page in the earlier step, the subscription will fail. So make sure we
have an access token generated for that page.


HANDLING MESSAGES

Now every time someone sends a message to the “Masnun” page, Facebook will
make a POST request to our callback URL. So we need to write a POST handler
for that URL. We also need to respond back to the user using the Graph API.
For that, we will use the awesome requests module.




Shell

pip install requests



Here’s the code for accepting incoming messages and sending them a reply:




Python

from flask import Flask, request
import requests

app = Flask(__name__)

ACCESS_TOKEN = "EAAP9MMaGh1cBAHS7jZCnuQgm2GWx5grLraIElFlWlIw2r3Afb34m2c2rP0xdkkkKEeiBOykGINAP0tScwmL5NNBJQN9ayPCuq13syvWocmbYZA7BXL86FsZCyZBxTmkgYYp8MDulLc1Tx70FGdU5ebQZAJV28nMkZD"


def reply(user_id, msg):
    data = {
        "recipient": {"id": user_id},
        "message": {"text": msg}
    }
    resp = requests.post("https://graph.facebook.com/v2.6/me/messages?access_token=" + ACCESS_TOKEN, json=data)
    print(resp.content)


@app.route('/', methods=['POST'])
def handle_incoming_messages():
    data = request.json
    sender = data['entry'][0]['messaging'][0]['sender']['id']
    message = data['entry'][0]['messaging'][0]['message']['text']
    reply(sender, message[::-1])

    return "ok"


if __name__ == '__main__':
    app.run(debug=True)



The code here accepts a message, retrieves the user id and the message
content, reverses the message, and sends it back to the user. For this we use
the ACCESS_TOKEN we generated beforehand. The incoming request must be
answered with a status code 200 to acknowledge the message; otherwise Facebook
will retry the message a few more times and then disable the webhook. So
sending an HTTP status code 200 is important. We just output “ok” to do so.
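To see why the handler indexes `data['entry'][0]['messaging'][0]`, here is a minimal payload in that nested shape (the field values are made up for illustration; only the structure mirrors the handler's lookups):

```python
# A made-up webhook payload shaped like the one the handler indexes into.
data = {
    "entry": [
        {
            "messaging": [
                {
                    "sender": {"id": "12345"},
                    "message": {"text": "hello"},
                }
            ]
        }
    ]
}

# The same lookups the POST handler performs:
sender = data["entry"][0]["messaging"][0]["sender"]["id"]
message = data["entry"][0]["messaging"][0]["message"]["text"]
print(sender, message[::-1])  # → 12345 olleh
```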

You can now send a message to your page and see if it responds correctly. Check
out Flask’s and ngrok’s logs to debug any issues you might face.

You can download the sample code from here: https://github.com/masnun/fb-bot


PERSONAL LOANS AND DEBTS

posted in Python on May 3, 2016 by maSnun with 0 Comments


 * Personal loans often come with lower interest rates than credit cards.
 * Debt consolidation is an easy way to reduce your balance.
 * Debt consolidation is usually the fastest and easiest way to get out of
   debt.
 * You’ll be able to pay off your debt in full.
 * With no fees and no minimum payments, it’s a no-brainer.
 * Get a full list of how much you can borrow.
 * Compare the cost of debt consolidation with other methods.
 * Find out how you can consolidate your student loans.

Why It’s So Expensive

The cost of consolidation is a hefty price. To get the loan approved, you need
to complete an application, and then complete additional steps to bring your
loans up to date.

You’ll need to pay for a $300 application fee, which will likely be refunded if
you’re approved. In addition, you’ll also need to pay a $300 non-refundable
non-application fee. The rest of your loan is forgiven after you make a down
payment of at least 20 percent, make your first monthly payment of at least 12.5
percent, or make 10 years of payments. This will be a big deal for people who
can’t afford to put away as much money for a down payment, which is why it’s
important to do it right.

Get the Best Offers for Your Debt

Next, it’s important to do your homework on every lender. For example, you’ll
want to find out the best interest rate and the length of the loan, and what
kind of credit you can get on top of that, also if you need a loan quickly,
there are also other services that offer short term loans which are useful to
pay loans.

Avoid Getting Rushed

Even though a mortgage application can take weeks or even months, make sure you
give yourself plenty of time to do your research, to check out the available
deals and to make sure your lender is your best choice. In the long run, that
means you’ll get a better deal. It’s easy to spend a lot of time and money
researching a home, only to find out that the house is too expensive or the
lender doesn’t have enough credit for your situation. To avoid that, use your
time wisely.

Find Out How Much You Can Buy

Even though this is only the start of your research, you should now be able to
see if you can afford to buy the home you’re interested in. It’s always good to
ask for an appraisal, but you can also use a home inspection report from a local
lender, a home inspection report from an online company or a real estate agent
to get an accurate picture of your home’s market value. It’s important to find
out if your state allows an open house and a “walk-through” before you buy a
home. Learn more about state laws on home inspections.

Your Mortgage Lender

If you have a mortgage that is not in your name, a lender (like a bank) can
review your mortgage and credit history and may offer you more favorable terms
and conditions. Lenders may also help you find a better rate and rate details.


DJANGO: RUNNING MANAGEMENT COMMANDS INSIDE A DOCKER CONTAINER

posted in Django, Python on April 23, 2016 by maSnun with 3 Comments

Okay, so we have dockerized our Django app and we need to run a manage.py
command for some task. How do we do that? Simple: we have to locate the
container that runs the Django app, log in, and then run the command.


LOCATE THE CONTAINER

It’s very likely that our app uses multiple containers to compose the entire
system. For example, I have one container running MySQL, one running Redis,
and another running the actual Django app. If we want to run manage.py
commands, we have to log in to the one that runs Django.

While our app is running, we can find the running docker containers using the
docker ps command like this:



$ docker ps
CONTAINER ID        IMAGE                COMMAND                  CREATED             STATUS              PORTS                    NAMES
308f40bba888        crawler_testscript   "/sbin/my_init"          31 hours ago        Up 3 seconds        5000/tcp                 crawler_testscript_1
3a5ccc872215        crawler_web          "bash run_web.sh"        31 hours ago        Up 4 seconds        0.0.0.0:8000->8000/tcp   crawler_web_1
14f0e260fb2c        redis:latest         "/entrypoint.sh redis"   31 hours ago        Up 4 seconds        0.0.0.0:6379->6379/tcp   crawler_redis_1
252a7092870d        mysql:latest         "/entrypoint.sh mysql"   31 hours ago        Up 4 seconds        0.0.0.0:3306->3306/tcp   crawler_mysql_1



In my case, I am using Docker Compose and I know my Django app runs from the
crawler_web image, so we note the name of that container. In the above example,
that is crawler_web_1.

Nice, now we know which container we have to login to.


LOGGING INTO THE CONTAINER

We use the name of the container to log in to it, like this:




Shell

docker exec -it crawler_web_1 bash



The command above will connect us to the container and land us on a bash shell.
Now we’re ready to run our command.


RUNNING THE COMMAND

We cd into the directory if necessary and then run the management command.




Shell

cd /project
python manage.py <command>




SUMMARY

 * docker ps to list the running containers and locate the right one
 * docker exec -it [container_name] bash to log in to a bash shell on that
   container
 * cd to the django project and run python manage.py [command]


DJANGO REST FRAMEWORK: REMEMBER TO DISABLE WEB BROWSABLE API IN PRODUCTION

posted inDjango, Python on April 20, 2016 by maSnun with 7 Comments

So this is what happened: I built a url shortening service at work for internal
use. It's a very basic app that shortens urls and tracks clicks. There are two
models, URL and URLVisit. The URL model contains the full url, the slug for the
short url, the created time etc. URLVisit holds information related to each
click, like user IP, browser data, click time etc., plus a ForeignKey to URL as
expected.

Two different apps were using this service: one of mine, another from a
different team. I kept the Web Browsable API on so that developers from other
teams could try it out easily, and they were very happy about it. The only job
of this app was url shortening, so I didn't bother building a separate home
page; requests to the / page of the domain were redirected straight to /api/.

Things were going really great initially. The load on the service was light,
roughly 50-100 requests per second, which I would call minimal. The server also
had decent hardware and was running on an EC2 instance on AWS, with nginx in
front of the app served by uwsgi. Everything was smooth until, after a month
and a half, we started noticing very poor performance; sometimes the server
took up to 40 seconds to respond. I started investigating.

It took me some time to find out what actually happened. By then, we had
shortened more than a million urls. So whenever someone visited
/api/url-visit/, the web browsable API tried to render the HTML form. That form
lets the user choose one of the entries from the URL model inside a select
(dropdown). Rendering that page pushed CPU usage to 100% and blocked / slowed
down other requests. It's not really DRF's fault; loading a million entries
into a select like that would crash any app.

Even worse, remember the redirect from the home page straight to the /api/ url?
Search engine bots started crawling those urls, and as a result the app became
extremely slow and often unresponsive behind nginx. I initially thought I could
stop the crawls with a robots.txt file or simply by adding authentication to
the API. But developers from other teams would still visit the API from time to
time to try things out and make the app unresponsive again. So I did what I had
to: I disabled the web browsable API and added separate documentation
demonstrating the use of the API with curl, PHP and Python.

I added the following snippet in my production settings file to only enable
JSONRenderer for the API:




Python

REST_FRAMEWORK = {
    'DEFAULT_RENDERER_CLASSES': (
        'rest_framework.renderers.JSONRenderer',
    )
}



Things became pretty smooth afterwards. I can still enjoy the nice HTML
interface locally, where there are far fewer items, while the production
servers serve no web browsable API to cause any bottlenecks.
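A related variation (a sketch of the same idea, not the exact settings from this post) is to key the renderer list off Django's standard DEBUG flag, so the browsable API stays available in development and only production drops it. BrowsableAPIRenderer is DRF's built-in HTML renderer.

```python
# settings.py sketch -- DEBUG is normally defined earlier in the settings
# file; it is shown here only so the fragment is self-contained.
DEBUG = True

REST_FRAMEWORK = {
    'DEFAULT_RENDERER_CLASSES': (
        'rest_framework.renderers.JSONRenderer',
    )
}

# The browsable API is only added while DEBUG is on, so production serves
# plain JSON and never renders the heavy HTML forms.
if DEBUG:
    REST_FRAMEWORK['DEFAULT_RENDERER_CLASSES'] += (
        'rest_framework.renderers.BrowsableAPIRenderer',
    )
```

With DEBUG turned off in production, the tuple keeps only the JSON renderer and no settings need to be duplicated between environments.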


COMPOSITION OVER INHERITANCE

posted inPython on April 19, 2016 by maSnun with 7 Comments


INHERITANCE

If you know basic OOP, you know what Inheritance is. When one class extends
another, the child class inherits from the parent and thus has access to all
the variables and methods of the parent class.




Python

class Duck:
    speed = 30
 
    def fly(self):
        return "Flying at {} kmph".format(self.speed)
 
 
class MallardDuck(Duck):
    speed = 20
 
 
if __name__ == "__main__":
    duck = Duck()
    print(duck.fly())
 
    mallard = MallardDuck()
    print(mallard.fly())



Here MallardDuck extends Duck and inherits the speed class variable along with
the fly method. We override speed in the child class to suit our needs. When we
call fly on the mallard duck, it uses the fly method inherited from the parent.
Running the above code produces the following output:



Flying at 30 kmph
Flying at 20 kmph



This is inheritance in a nutshell.
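A child class can also override the inherited method itself and still reuse the parent's version through super(). RocketDuck below is a made-up extension of the duck example, not part of the original code:

```python
class Duck:
    speed = 30

    def fly(self):
        return "Flying at {} kmph".format(self.speed)


class RocketDuck(Duck):
    speed = 100

    def fly(self):
        # Call the parent's fly() and build on its result.
        return super().fly() + " with a rocket!"


print(RocketDuck().fly())  # Flying at 100 kmph with a rocket!
```

Note that super().fly() still reads self.speed, so the overridden class variable of RocketDuck is picked up inside the parent's method.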


COMPOSITION

Let’s first see an example:




Python

class GmailProvider:
    def send(self, msg):
        return "Sending `{}` using Gmail".format(msg)
 
 
class YahooMailProvider:
    def send(self, msg):
        return "Sending `{}` using Yahoo Mail!".format(msg)
 
 
class EmailClient:
    email_provider = GmailProvider()
    
    def setup(self):
        return "Initialization and configurations"
 
    def set_provider(self, provider):
        self.email_provider = provider
 
    def send_email(self, msg):
        print(self.email_provider.send(msg))
 
 
client = EmailClient()
client.setup()
 
client.send_email("Hello World!")
 
client.set_provider(YahooMailProvider())
client.send_email("Hello World!")



Here we're not implementing the email sending functionality directly inside the
EmailClient. Rather, we store an email provider in the email_provider variable
and delegate the responsibility of sending the email to it. When we need to
send_email, we call the send method on the email_provider. Thus we compose the
functionality of the EmailClient by sticking composable objects together. We
can also swap out the email provider at any time by passing a new provider to
the set_provider method.
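Notice that the providers work purely through duck typing: anything with a send(msg) method qualifies. If you want to spell that contract out explicitly, Python's typing.Protocol (3.8+) can express it. This is a sketch layered on the example above, not something from the original post:

```python
from typing import Protocol, runtime_checkable


@runtime_checkable
class MailProvider(Protocol):
    """Anything with a matching send(msg) method satisfies this protocol."""

    def send(self, msg: str) -> str: ...


class GmailProvider:
    def send(self, msg: str) -> str:
        return "Sending `{}` using Gmail".format(msg)


# Structural check: GmailProvider never subclasses MailProvider, yet it
# satisfies the protocol because the method shape matches.
print(isinstance(GmailProvider(), MailProvider))  # True
```

This keeps the loose coupling of composition while still documenting, and optionally type-checking, what a valid provider looks like.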


COMPOSITION OVER INHERITANCE

Let’s implement the above EmailClient using inheritance.




Python

class EmailClient:
    def setup(self):
        return "Initializations and configurations!"
 
    def send_email(self, msg):
        raise NotImplementedError("Use a subclass!")
 
 
class GmailClient(EmailClient):
    def send_email(self, msg):
        return "Sending `{}` from Gmail Client".format(msg)
 
 
class YahooMailClient(EmailClient):
    def send_email(self, msg):
        return "Sending `{}` from YMail! Client".format(msg)
 
 
client = GmailClient()
client.setup()
 
client.send_email("Hello!")
 
# If we want to send using Yahoo, we have to construct a new client
 
yahoo_client = YahooMailClient()
yahoo_client.setup()
 
yahoo_client.send_email("Hello!")



Here, we created a base class EmailClient with the setup method, then extended
it to create GmailClient and YahooMailClient. Things got interesting when we
wanted to start sending emails using Yahoo instead of Gmail: we had to
construct a brand new YahooMailClient instance, because the original client
only knew how to send emails through Gmail.

This is why composition is often favoured over inheritance. By delegating
responsibility to different composable parts, we form loose coupling: we can
swap those components out easily when needed, or inject them as dependencies
using dependency injection. With inheritance, things become tightly coupled and
are not easily swappable.
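As a quick sketch of that dependency injection idea, reusing the provider classes from the composition example, the provider can simply be passed into the constructor (here send_email returns the string instead of printing, purely for illustration):

```python
class GmailProvider:
    def send(self, msg):
        return "Sending `{}` using Gmail".format(msg)


class YahooMailProvider:
    def send(self, msg):
        return "Sending `{}` using Yahoo Mail!".format(msg)


class EmailClient:
    def __init__(self, email_provider):
        # The provider is injected from outside, so the client never
        # hard-codes a particular one.
        self.email_provider = email_provider

    def send_email(self, msg):
        return self.email_provider.send(msg)


print(EmailClient(GmailProvider()).send_email("Hello!"))
print(EmailClient(YahooMailProvider()).send_email("Hello!"))
```

Since the dependency arrives through __init__, tests can also pass in a fake provider without touching any real mail service.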


CATEGORIES

 * Bangla (53)
 * Clojure (4)
 * Data Science (2)
 * Django (12)
 * Javascript (22)
 * Linux (21)
 * Mac (8)
 * MySQL (5)
 * NodeJS (8)
 * Personal (22)
 * PHP (114)
 * Python (113)
 * Screencast (32)
 * Series (19)
 * Uncategorized (52)
 * Work (8)
