
Saturday 26 May 2018

NumPy basics in Python

NumPy is the fundamental package for scientific computing with Python. It contains among other things:

  • a powerful N-dimensional array object
  • sophisticated (broadcasting) functions
  • tools for integrating C/C++ and Fortran code
  • useful linear algebra, Fourier transform, and random number capabilities

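As a quick taste of broadcasting, arrays of different shapes can be combined without explicit loops (a minimal sketch with illustrative values):

```python
import numpy as np

# a (3, 1) column broadcast against a (3,) row produces a (3, 3) result
col = np.array([[0], [10], [20]])   # shape (3, 1)
row = np.array([1, 2, 3])           # shape (3,)
print(col + row)
# [[ 1  2  3]
#  [11 12 13]
#  [21 22 23]]
```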

Importing Numpy:

import numpy as np

Creating numpy Array:

 d = np.array([1,2,3,4,5])

Numpy range:

  d = np.arange(1,10)  # it will create a numpy array with values from 1 to 9
   

numpy shape:


The shape attribute returns a tuple with the array's size along each dimension.

d = np.array([1,2,3])
print(d)         # [1 2 3]
print(d.shape)   # (3,)

numpy reshape:



reshape() returns a new array with the same data in a different shape; it does not modify the array in place.

d = np.arange(1,10)    # array([1, 2, 3, 4, 5, 6, 7, 8, 9])
d.shape                # (9,)
d = d.reshape(3,3)     # reshape returns a new array, so reassign it
print(d)               # [[1 2 3]
                       #  [4 5 6]
                       #  [7 8 9]]

In the above example, the 9-element array is reshaped into a 3x3 matrix structure.


np.zeros()


    It will create an array filled with zeros. Pass the shape as a tuple:

np.zeros((3, 3))     # array([[0., 0., 0.],
                     #        [0., 0., 0.],
                     #        [0., 0., 0.]])


np.vstack()


It stacks arrays vertically (row-wise). Passing a 1-D array turns each element into its own row:

c = np.array([1,2,3])   # array([1, 2, 3])
np.vstack(c)            # array([[1],
                        #        [2],
                        #        [3]])
   

np.eye()


It will create an identity matrix.

np.eye(3)   # it will create a 3x3 identity matrix
            # array([[1., 0., 0.],
            #        [0., 1., 0.],
            #        [0., 0., 1.]])
        

     
np.dot()


 It computes the dot product (matrix multiplication) of two matrices.
 np.dot(M1, M2)
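A minimal sketch with two small example matrices (the values of M1 and M2 are illustrative, not from the post):

```python
import numpy as np

M1 = np.array([[1, 2],
               [3, 4]])
M2 = np.array([[5, 6],
               [7, 8]])
# matrix product: each entry is a row of M1 dotted with a column of M2
print(np.dot(M1, M2))
# [[19 22]
#  [43 50]]
```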

np.sum()


 It sums the elements of the given array.

M = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])
np.sum(M)           # 45, the sum of all elements in the array
np.sum(M, axis=0)   # array([12, 15, 18])
    
If axis=0, it sums column-wise.
If axis=1, it sums row-wise.
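For example, with the same M as above:

```python
import numpy as np

M = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])
print(np.sum(M, axis=0))   # [12 15 18]  column-wise sums
print(np.sum(M, axis=1))   # [ 6 15 24]  row-wise sums
```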


np.random.rand()


It produces arrays of random values sampled uniformly from [0, 1). Pass the desired dimensions as arguments.
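A small sketch (the shape 2x3 is an arbitrary example):

```python
import numpy as np

R = np.random.rand(2, 3)    # 2x3 array of uniform random values in [0, 1)
print(R.shape)              # (2, 3)
print(((R >= 0) & (R < 1)).all())   # True
```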


np.append()


It appends elements to an array and returns a new array; the original is unchanged.

 A = np.array([1, 2, 3])
 B = np.append(A, 4)             # array([1, 2, 3, 4])
 B = np.append(A, [4, 5, 6, 7])  # array([1, 2, 3, 4, 5, 6, 7])
        
Ansible basics for beginners

What is Ansible


Ansible interacts with machines via SSH, so nothing needs to be installed on the client machines. The only prerequisites are that Ansible is installed on the controller machine and that the clients have Python and SSH enabled.

Inventory:


Inventory file:


An inventory file is a simple text file that contains the list of machines Ansible is going to interact with. We can list single machines or groups of machines. We can also run modules directly from the command line using the ansible CLI:

ansible <group-name> -i <inventory-filename> -m <module-name> -a <module-params>
 
Inventory:
server1.mycomp.com
server2.mycomp.com
 
[clients] #group name
server3.mycomp.com
server4.mycomp.com  


Ex: 
ansible clients -i inventory -m ping
ansible clients -i inventory -m apt -a "name=mysql-server state=present"

    The inventory file can also be an executable. For example, if you don't know the number of instances running in AWS, you can simply write a script that returns the names of the running instances from AWS.
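A minimal sketch of such an executable inventory (the hostnames are placeholders; a real script would query the AWS API, e.g. with boto3, instead of returning a hard-coded list):

```python
#!/usr/bin/env python
# Minimal dynamic inventory sketch. Ansible invokes the script with --list
# and expects JSON describing groups and their hosts on stdout.
import json
import sys

def get_inventory():
    # placeholder hostnames standing in for instances discovered at runtime
    return {"clients": {"hosts": ["server3.mycomp.com", "server4.mycomp.com"]}}

if __name__ == "__main__":
    if len(sys.argv) > 1 and sys.argv[1] == "--list":
        print(json.dumps(get_inventory()))
    else:
        print(json.dumps({}))
```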

Ansible play books:


    An Ansible playbook is a simple YAML file that contains the list of tasks to be performed on the client machines mentioned in the inventory file.

playbook.yaml
---

- hosts: all
  tasks:
    - name: updating package list
      apt: update_cache=yes cache_valid_time=3600
- hosts: clients
  tasks:
    - name: installing mysql server
      apt: name=mysql-server state=present

In the above snippet, we used the apt module for updating and installing packages. "hosts: all" means the tasks run on all the host machines mentioned in the inventory file.

We can also target a specific group of hosts: "hosts: clients" means the tasks below it run only on the clients group we created in the inventory file. The "name" of each task is a human-readable message that is printed while the task runs, which is very helpful when monitoring the execution.

    Running playbook:

  ansible-playbook -i inventory playbook.yaml

Variables in playbooks:


Ansible uses the Jinja2 templating system for dealing with variables.

playbook.yaml
---
- hosts: all
  tasks:
    - name: updating package list
      apt: update_cache=yes cache_valid_time=3600
- hosts: clients
  vars:
    init_script: "create_db.sql"
  tasks:
    - name: installing mysql server
      apt: name=mysql-server state=present
    - name: copying init sql files
      copy: src=/tmp/{{init_script}} dest=/tmp/mysql/{{init_script}}


Variable loops in playbook:


playbook.yaml
---
- hosts: all
  tasks:
    - name: updating package list
      apt: update_cache=yes cache_valid_time=3600
- hosts: clients
  vars:
    init_script: "create_db.sql"
  tasks:
    - name: installing packages
      apt: name={{item}} state=present
      with_items:
        - python
        - python-pip
        - vim
    - name: copying init sql files
      copy: src=/tmp/{{init_script}} dest=/tmp/mysql/{{init_script}}

Alternatively, we can group the variables per host group:

playbook.yaml


---

- hosts: all
  tasks:
    - name: updating package list
      apt: update_cache=yes cache_valid_time=3600
- hosts: clients
  vars:
    init_script: "create_db.sql"
    packages:
      - python
      - python-pip
      - vim
  tasks:
    - name: installing packages
      apt: name={{item}} state=present
      with_items: "{{packages}}"
    - name: copying init sql files
      copy: src=/tmp/{{init_script}} dest=/tmp/mysql/{{init_script}}
     

Directory Group variables:


By default, Ansible looks for directories called "group_vars" and "host_vars" in the same location as the playbook. Any variables you define in a file under the group_vars directory are automatically applied to that specific group.

My folder structure:
    - inventory
    - playbook.yml
    - group_vars
            - all 
            - clients
    - host_vars
            - server.com

In the above folder structure, variables defined in the file called "all" under the group_vars directory are available to all hosts defined in the inventory. If you want to define variables for a specific host, create a file with the same hostname under the "host_vars" directory.
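For example, a group_vars/clients file defining the init_script variable used in the earlier playbooks might look like this (the value is illustrative):

```yaml
# group_vars/clients -- applied automatically to every host in the clients group
init_script: create_db.sql
```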

Inventory directory:


    Normally the inventory is a plain text file, but it can also be a directory.

     ansible-playbook -i <inventory-directory> playbook.yml

  • ansible-playbook -i uat deploy.yml
  • ansible-playbook -i dev deploy.yml
  • ansible-playbook -i prod deploy.yml

Directory structure of inventory folder:
        
        dev
              - hosts
              - group_vars
              - host_vars
        uat
              - hosts
              - group_vars
              - host_vars
        prod
              - hosts
              - group_vars
              - host_vars
        deploy.yml

Any text files in your inventory directory are treated by Ansible as inventory files.

Roles in ansible:


You can use a single playbook file to manage all the tasks of your infrastructure, but at some point that playbook becomes bigger and harder to manage. For this, Ansible has the "roles" feature, which lets you split your playbook YAML into more modular pieces.

You can create a directory called “roles” and create playbook modules.

Roles directory structure:

        dev
              - hosts
              - group_vars
              - host_vars
        roles
              - common
                    - defaults
                        - main.yml   # variable values
                    - tasks
                        - main.yml   # list of tasks need to be execute
                    - files
                        - server.py   # file need to be copy
                    - templates
                        - config.py.j2  # template file used for template module
                    - meta
                        - main.yml  # list the dependency task before perform
              - webserver
                      - defaults
                        - main.yml   # variable values
                    - tasks
                        - main.yml   # list of tasks
              - db
                    - tasks
                        - main.yml   # list of tasks
        deploy.yml

deploy.yml

- hosts: database-server
  roles:
    - common
    - db
- hosts: web-server
  roles:
    - common
    - webserver



Here we break the roles folder down into modules, as documented on the Ansible documentation site.
  • The defaults folder contains the variables to be registered
  • The tasks folder contains the tasks to be performed for that role
  • The files folder contains the files to be transferred
  • The templates folder is for the template module
  • The meta folder contains the dependency list for that specific role
    
        Ex:
                main.yml
                ---
                dependencies:
                    - common
                    - db

Sunday 11 February 2018

Introduction to Python Argparse


What is Python Command line arguments?


While executing a Python script, we can provide additional arguments on the command line. These arguments are passed into the program, and we can access them inside the program with the help of Python modules (sys, argparse, etc.). The "sys" module is one of the traditional and simplest ways of handling command line arguments.

my-script.py
import sys
print(len(sys.argv))
print(sys.argv)
print(sys.argv[0])
print(sys.argv[1])
print(sys.argv[2])


$ python my-script.py arg1 arg2
3
['my-script.py', 'arg1', 'arg2']
my-script.py
arg1
arg2


Python "argparse" module:


There are many Python modules available for handling command line arguments. One of the most popular is argparse. Argparse was added in Python 2.7 as a replacement for optparse, and it provides more features than the traditional sys approach.

Parsing command line arguments:


The ArgumentParser class has a function called "parse_args" which is used to parse the command line arguments. By default it takes the arguments from sys.argv[1:], but we can also provide our own list.

We define arguments using the add_argument function; parse_args then returns a Namespace object containing the parsed argument values.

import sys
import argparse
parser = argparse.ArgumentParser(description='sample app')
parser.add_argument("name", help="Please enter your name")
args = parser.parse_args()
print(args)
print(args.name)
$ python my-script.py -h
usage: my-script.py [-h] name

sample app

positional arguments:
  name        Please enter your name

optional arguments:
  -h, --help  show this help message and exit
$ python my-script.py Jerry
Namespace(name='Jerry')
Jerry
-h or --help is a default feature added to your script when you use argparse. It shows the available positional and optional arguments along with the help messages we provided.


argument type:


We can explicitly specify the type of argument argparse should accept. The "type" field specifies the cast: argparse converts the argument value to the specified type while parsing. If the value cannot be converted, it raises an error.

import sys
import argparse
parser = argparse.ArgumentParser(description='sample app')
parser.add_argument("square", type=int, help="Please enter an integer value")
args = parser.parse_args()
print(args.square**2)
$ python my-script.py 4
16

Optional arguments:


When you add a positional argument to the parser, you must provide a value for it, otherwise argparse throws an error. Optional arguments are actually optional: there is no error when running the program without them.

import sys
import argparse
parser = argparse.ArgumentParser(description='sample app')
parser.add_argument("--square", dest="square", default=2, type=int, help="Please enter integer value")
args = parser.parse_args()
print(args.square**2)
$ python my-script.py --square 4
16
$ python my-script.py
4

If we do not provide the argument on the command line, argparse takes the value from the "default" field. None is the default value of the default field.

Short options:


We can define short versions of the optional arguments, which is handy for frequently used options.

import sys
import argparse
parser = argparse.ArgumentParser(description='sample app')
parser.add_argument("-s","--square", dest="square", default=2, type=int)
args = parser.parse_args()
print(args.square**2)

$ python my-script.py -s 4
16

Argument Actions:


The action field of add_argument() specifies what kind of action to perform with that argument. The default value is "store", i.e. store the given value in the destination variable. The following six kinds of actions can be triggered when we add an argument:

  • store - it is a default value of an action field. it will store specified value to destination variable
  • store_const - store the value defined as part of argument specification
  • store_true/store_false - save boolean values to the variables
  • append - save the value to the list
  • append_const - store the value defined in the argument to list
  • version - prints the version details about the program

examples:


import sys
import argparse
parser = argparse.ArgumentParser(description='sample app')
parser.add_argument("-v", "--verbose", action="store_true", default=False)
parser.add_argument("-s","--square", dest="square", default=2, type=int)
parser.add_argument("-a","--add", dest="my_list", default=[], action="append")
args = parser.parse_args()
if args.verbose:
    print("printing verbose output")
print("square value ", args.square**2)
print("my list is ", args.my_list)
$ python my-script.py -v --square 4 -a 2 -a 3
printing verbose output
square value  16
my list is  ['2', '3']


Monday 29 January 2018

Creating simple hello world flask app using docker

First make sure Docker is installed on your machine and you have the necessary permissions to execute the following commands. If you want to know the basics of Docker, please refer to my previous blog post - Docker guide for beginners

The following sample files are available in this GitHub repo


Step 1 : Create working directory

mkdir flask-helloworld
cd flask-helloworld

Step 2: create following files inside the working directory


requirements.txt

Flask

app.py

from flask import Flask
app = Flask(__name__)

@app.route("/")
def hello():
    return "Hello World!"

if __name__ == "__main__":
    app.run(host='0.0.0.0')

Dockerfile

# base image
FROM python:3-onbuild
# specify the port that the container should expose
EXPOSE 5000
# run the application
CMD ["python", "./app.py"]


Step 3 : Build and Run your container


Build docker image:


The following command reads your Dockerfile and builds your custom image. Go to the directory where your Dockerfile is located and execute:

docker build -t <imagename>:<tag-name> <path of your dockerfile>
docker build -t sample-app:v1 .

The final dot (.) specifies the current directory. -t specifies the tag name of the image.



Listing docker images:


docker images
docker images -a




Running Docker container:


The docker run command is used to run an image inside a container. If the specified image is available on the local machine, Docker takes it from there; otherwise it downloads it from Docker Hub and stores it locally.
docker container run <image-name>
docker container run sample-app:v1
It will create a new container and run the sample-app image inside the container.




If you want to execute the container in the background, use the --detach (or -d) flag. It detaches the process from the foreground and runs it in the background, returning the unique container id.

docker container run -p 5000:5000 -d sample-app:v1

Executing commands inside the container:


The following command lets us log in inside the container. It is very helpful for debugging our application if something goes wrong; we can execute Linux commands inside the container.

docker run -it <image-name> sh

docker run -it sample-app:v1 sh




Stop the container:


The following command is used to stop the container.
docker container stop 9a425901d134

Deleting containers:


Every docker run creates a new container, which eats disk space, so it is best practice to clean up containers once you are done with them.

docker container rm <container id>

docker container rm 9a4

Thursday 2 February 2017

Integrate pylint into git hook and pycharm

What is pylint:


Pylint is a source code, bug and quality checker for the Python programming language. It follows the style recommended by PEP 8, the Python style guide. It is similar to Pychecker and Pyflakes, but includes the following features:
  • Checking the length of each line
  • Checking if variable names are well-formed according to the project's coding standard
  • Checking if declared interfaces are truly implemented.
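As a quick illustration, a snippet like this runs fine but would typically draw pylint warnings (for example, missing module and function docstrings); the exact messages depend on your pylint version and configuration:

```python
# example.py -- runs correctly, but pylint would typically flag the
# missing module docstring and missing function docstring
def add(a, b):
    result = a + b
    return result

print(add(2, 3))   # 5
```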

Installing pylint:

Install following pylint package using pip installer

mac/unix:

    pip install pylint

windows:

   python -m pip install pylint

Once you have installed pylint, check the pylint version using the following command to make sure it was installed properly:

 pylint --version

pylint example.py

Configure pylint into git hook:


This is a pre-commit hook for Git that checks Python code quality. The hook checks files ending with .py or that have a shebang (#!) containing python.
The script tries to find pylint configuration files in the order determined by pylint. It also looks for a [pre-commit-hook] section in the pylint configuration for commit-hook-specific options.
pip install git-pylint-commit-hook
Next go to your git-initialized folder and navigate into the .git/hooks/ directory. Rename the existing template file "pre-commit.sample" to "pre-commit".
Delete everything in that file and paste the following into it:
#!/bin/sh
git-pylint-commit-hook

Usage

The commit hook is automatically called when you run git commit. If you want to skip the checks for a certain commit, use the -n flag:
 git commit -n

pylint configuration

Settings are loaded by default from the .pylintrc file in the root of your repo.
[pre-commit-hook]
command=custom_pylint
params=--rcfile=/path/to/another/pylint.rc
limit=8.0
command is the actual pylint command to run, for instance if pylint is not installed globally but in a virtualenv inside the project itself.
params lets you pass custom parameters to pylint.
limit is the lowest pylint score you want to allow. Any lower than this, and the script fails and won't commit.

Integrate Pylint into pycharm IDE:


step 1: 
Go to file -> settings 


Step2:

select "Tools -> External tools-> click add icon"



Step 3:

Fill in the tool setting parameters. To be a little more flexible, you can use PyCharm macros. As an example, use the value "$FilePath$" for Working directory and "$Prompt$" for Parameters. This allows use in other projects, too.



Step 4:

Now pylint is configured in your IDE. Right-click a file and select pylint from the external tools menu to run pylint on that specific file.