
Posts Categorized / Python

  • Apr 01 / 2018
Linux, Python

Using procmail with custom python script

If you want to execute special commands, log something, or make an API call whenever an email arrives on your server, you can easily set this up with procmail. Procmail is designed to filter and sort emails, but it can trigger any call you want.

Here, as an example, we will set up a simple call to a Python script that reads the content of the mail (headers and body) and writes the information to a log file.

1. Install procmail

Depending on the OS you’re using, you should find a pre-compiled package in the common repositories.

For example, on a Debian-based system:

apt-get install procmail

or on a CentOS-based system:

yum install procmail

2. Build the python script you want to call to analyze the message

You now have to prepare the script that will be executed when an email is received; it will read and parse the content and log the interesting information to a file.

Let’s create a script called procmail_script.py

import os
import time
import email
import sys
import logging.handlers
import base64

# Constants
LOG_FILE_PATH = os.path.expanduser('/opt/mailAnalysis.log')

# Set up a logger
my_logger = logging.getLogger('MyLogger')
my_logger.setLevel(logging.INFO)
handler = logging.handlers.RotatingFileHandler(LOG_FILE_PATH,  maxBytes=500000, backupCount=4,)
formatter = logging.Formatter("%(asctime)s - %(levelname)s - %(message)s",  "%Y-%m-%d %H:%M:%S")
handler.setFormatter(formatter)
my_logger.addHandler(handler)

# Main function
def main():
    try:
        # Get message
        full_msg = sys.stdin.read()
        msg = email.message_from_string(full_msg)

        # Prepare dict containing data
        data = {}

        # Fill dict (decode headers, fall back to an empty string if missing)
        data['From']    = email.Header.decode_header(msg['From'])[0][0] if 'From' in msg else ''
        data['To']      = email.Header.decode_header(msg['To'])[0][0] if 'To' in msg else ''
        data['Subject'] = email.Header.decode_header(msg['Subject'])[0][0] if 'Subject' in msg else ''
        data['Body']    = msg.get_payload()

        # Add information to log
        my_logger.info('From:    ' + data['From'])
        my_logger.info('To:      ' + data['To'])
        my_logger.info('Subject: ' + data['Subject'])
        my_logger.info('Body:    ' + data['Body'])
    except Exception as e:
        my_logger.error('----- ERROR ENCOUNTERED')
        my_logger.error(str(e))
        my_logger.error('----- END OF ERROR')

# Main program
if __name__ == "__main__":
    start_time = time.time()
    my_logger.info('----- START: ' + str(time.time()))
    result = main()
    my_logger.info('----- END: ' + str(time.time()))
    my_logger.info('Elapsed Seconds: ' + str(time.time() - start_time))
    handler.close()
    sys.exit(result)
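
Note that msg.get_payload() returns a list of sub-parts for multipart mails (HTML alternative, attachments, …), so the Body line above would land in the error branch for those. If you need to handle them, here is a minimal helper you could drop into the script (get_text_body is a hypothetical name, not part of the original post):

def get_text_body(msg):
    # Return the first text/plain part of a (possibly multipart) message
    if msg.is_multipart():
        for part in msg.walk():
            if part.get_content_type() == 'text/plain':
                return part.get_payload(decode=True)
        return ''
    return msg.get_payload(decode=True)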

3. Configure your user to pass the mail to your script

To define the rules you want to execute when receiving an email, you need to create a hidden file called .procmailrc in the home directory of the user receiving the mail.

For example, to execute rules when receiving email for mailuser@mydomain.com, you will have to put that file in the user’s home directory, i.e. /home/mailuser/.procmailrc

LOGFILE=/var/log/procmail.log
VERBOSE=YES

:0:
* ^Subject:.*procmail.*
* ^TO_mailuser@mydomain\.(com|net)
{
  :0c
  procmail-dir/

  :0 w
  | /usr/bin/python /home/mailuser/procmail_script.py

  :0 e
  procmail-failed-dir/
}

This will perform multiple steps:

  1. Check that the mail recipient is mailuser@mydomain.com or mailuser@mydomain.net
  2. Put a copy of the email into the procmail-dir directory
  3. Pass the message to our Python script procmail_script.py
  4. Discard the message if the script succeeds (remove it from the queue), or save it to procmail-failed-dir if it fails (so you can process it later)

4. Prepare an email and perform a local test

First, create a sample mail that you will use for testing in a file called procmail_test.txt:

From: sender@otherdomain.com
To: mailuser@mydomain.com
Subject: This is a procmail testing

Hey there,
I hope this message will be parsed and logged properly as expected.
This is my first test for procmail deployment!

Then, you can test it by executing procmail manually:

procmail VERBOSE=on /home/mailuser/.procmailrc < /home/mailuser/procmail_test.txt

Now, open the file /opt/mailAnalysis.log and you should see something like:

2018-03-31 08:08:45 - INFO - ----- START: 1522570125.06
2018-03-31 08:08:45 - INFO - From:    sender@otherdomain.com
2018-03-31 08:08:45 - INFO - To:      mailuser@mydomain.com
2018-03-31 08:08:45 - INFO - Subject: This is a procmail testing
2018-03-31 08:08:45 - INFO - Body:    Hey there,
I hope this message will be parsed and logged properly as expected.
This is my first test for procmail deployment!
2018-03-31 08:08:45 - INFO - ----- END: 1522570125.12
2018-03-31 08:08:45 - INFO - Elapsed Seconds: 0.0624470710754
  • May 30 / 2017
Linux, Python

DNS queries from a file/list to CSV

It’s not always easy to perform bulk DNS resolution when you have many names/IPs to check. Here is a simple script that performs DNS resolution over a list of DNS names or IPs.

Here is a list of DNS entries (names and IPs) that we put in a file called listDNS.txt

www.python.org
www.pyython.org
208.67.220.220
www.bing.com

Let’s put the following script, which will do the job, into a file called resolverDNS.sh

#!/bin/bash
# Script file - resolverDNS.sh
# Checking existence of arg
if [ "$1" == "" ]
then
  # Display help if wrong usage
  echo "Usage: /bin/bash resolverDNS.sh /path/to/file"
  exit 35
else 
  # Loop over dns and resolve
  while IFS='' read -r line || [[ -n "$line" ]]; do
    dns=''
    # Resolve reverse DNS
    if [[ $line =~ ^[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}$ ]]; then
      dns=`dig +noall +answer -x $line +short|tr '\n' ' '`
    # Resolve A record
    else
      dns=`dig a $line +short|tr '\n' ' '`
    fi
    echo -e "$line\tis resolving into\t${dns}"
  done < "$1"
fi

And now, execute it by passing the file path as an argument, and see the output:

$ bash /home/user/resolverDNS.sh /home/user/listDNS.txt 
www.python.org	is resolving into	python.map.fastly.net. 151.101.60.223 
www.pyython.org	is resolving into	
208.67.220.220	is resolving into	resolver2.opendns.com. 
www.bing.com	is resolving into	www-bing-com.a-0001.a-msedge.net. a-0001.a-msedge.net. 204.79.197.200 13.107.21.200 

Resolution is done for every line, depending on whether it’s an IP or a name (and the result remains empty if it can’t be resolved).
Feel free to adjust the script according to your needs!
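
If you prefer to stay in Python (and get actual CSV output, as the title suggests), here is a minimal sketch using only the standard library; listDNS.txt and resultDNS.csv are just example file names, and it is written in Python 2 style like the other snippets on this page:

import csv
import socket

with open('listDNS.txt') as f, open('resultDNS.csv', 'wb') as out:
    writer = csv.writer(out)
    writer.writerow(['entry', 'resolution'])
    for line in f:
        entry = line.strip()
        if not entry:
            continue
        try:
            # Reverse lookup for IPv4 addresses, forward lookup for names
            if entry.count('.') == 3 and all(p.isdigit() for p in entry.split('.')):
                resolution = socket.gethostbyaddr(entry)[0]
            else:
                resolution = socket.gethostbyname(entry)
        except socket.error:
            resolution = ''
        writer.writerow([entry, resolution])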

  • Jun 09 / 2016
Linux, Python

openssl/pyOpenSSL – “SSL23_GET_SERVER_HELLO:tlsv1 alert internal error”

You’re getting this annoying error message again and again when trying to fetch a certificate and/or establish a connection to your website using openssl:

139647967614624:error:14077438:SSL routines:SSL23_GET_SERVER_HELLO:tlsv1 alert internal error:s23_clnt.c:769:

This issue is well known in several openssl versions, and a bug has been filed for the Ubuntu repositories:
https://bugs.launchpad.net/ubuntu/+source/openssl/+bug/1475228

For now, there’s a simple workaround to quickly fix it!

For openssl

If you’re facing it while using openssl directly, you can fix it by specifying the server name on the command line:

openssl s_client -connect www.mywebsite.com:443 -servername www.mywebsite.com

For pyOpenSSL

If you’re having this issue while using pyOpenSSL (the Python wrapper for OpenSSL), it can also be fixed with a quick workaround: call set_tlsext_host_name() on your “Connection” object to specify the server name.
You will get something like this:

import socket
from OpenSSL import SSL

# REPLACE WITH YOUR OWN WEBSITE
hostname = 'www.mywebsite.com'
# Create a TLS context and wrap a plain socket in a pyOpenSSL Connection
ctx = SSL.Context(SSL.TLSv1_METHOD)
sock = socket.socket()
ssl_sock = SSL.Connection(ctx, sock)
# Send the SNI extension so the server knows which certificate to present
ssl_sock.set_tlsext_host_name(hostname)
ssl_sock.connect((hostname, 443))
ssl_sock.do_handshake()
# Fetch and display the peer certificate
cert = ssl_sock.get_peer_certificate()
print cert.get_subject().commonName
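
The underlying cause is usually the missing SNI (Server Name Indication) extension, which both workarounds above add. For reference, the standard library ssl module (Python 2.7.9+/3.4+) sends SNI for you when you pass server_hostname, so a rough equivalent without pyOpenSSL could look like this:

import socket
import ssl

# REPLACE WITH YOUR OWN WEBSITE
hostname = 'www.mywebsite.com'
ctx = ssl.create_default_context()
sock = socket.create_connection((hostname, 443))
# server_hostname enables SNI (and hostname verification)
ssl_sock = ctx.wrap_socket(sock, server_hostname=hostname)
print ssl_sock.getpeercert()['subject']
ssl_sock.close()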

  • Mar 30 / 2016
Linux, Python

Generate SHA-512 hash on command-line with Python

Need to generate the hash for a password? No need to use an online generator, which is totally insecure for your passwords …
This simple command will ask you for the string you want to hash (first line) and a salt (second line), and will print the result after you press the “Enter” key!

python -c "import crypt, getpass, pwd; print crypt.crypt(raw_input(), '\$6\$' + raw_input() + '\$')"
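
If you are on Python 3 (where raw_input and the print statement are gone), a rough equivalent that also generates a random salt for you could be the following; it assumes the crypt module is available (it ships with Python on Unix up to 3.12):

python3 -c "import crypt; print(crypt.crypt(input('String to hash: '), crypt.mksalt(crypt.METHOD_SHA512)))"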

  • Oct 09 / 2015
Python

AWS – Update massively metadata using boto (python) on multiple S3 objects

A simple script that allows you to massively update the Content-Type of files in an S3 bucket.
This script is able to:

  • Browse a bucket recursively
  • Perform the action only on files matching a specific prefix
  • Auto-detect the type of a file depending on its extension

Obviously, you can add any extension you want to the update_md function to handle more types if needed.

#!/usr/bin/env python
# -*- coding: utf-8 -*-
import os, re, sys
import boto
from boto.s3.connection import S3Connection
from boto.s3.key import Key

# Variables
AWS_ACCESS_KEY_ID     = 'YOUR_AWS_ACCESS_KEY_ID'
AWS_SECRET_ACCESS_KEY = 'YOUR_AWS_SECRET_ACCESS_KEY'
AWS_BUCKET_NAME       = 'YOUR_BUCKET_NAME'

# Function to update MetaData
def update_md(k, akey):
    """
    Update the Content-Type metadata of an existing object.
    """
    # Get extension
    ext = k.name.split('.')[-1]
    if ext in ['bmp','BMP']:
        metadata = {'Content-Type':'image/bmp'}
    elif ext in ['jpg','jpeg','JPG','JPEG']:
        metadata = {'Content-Type':'image/jpeg'}
    elif ext in ['gif','GIF']:
        metadata = {'Content-Type':'image/gif'}
    elif ext in ['png','PNG']:
        metadata = {'Content-Type':'image/png'}
    elif ext in ['pdf','PDF']:
        metadata = {'Content-Type':'application/pdf'}
    elif ext in ['txt','TXT']:
        metadata = {'Content-Type':'text/plain'}
    elif ext in ['zip','ZIP']:
        metadata = {'Content-Type':'application/zip'}
    else:
        return
    # If the current type differs, copy the object onto itself with the new metadata
    if metadata['Content-Type'] != akey.content_type:
        akey.copy(AWS_BUCKET_NAME, k.name, metadata, preserve_acl=True)
    return k

# Main function
if __name__ == '__main__':
    # Connect to S3
    conn = boto.connect_s3(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)
    b = conn.get_bucket(AWS_BUCKET_NAME)

    # Select files to parse (prefix can be specified)
    rs = b.list(prefix="")

    # Browse files
    for k in rs:
        print k.name
        akey = b.get_key(k.name)
        # Print type before
        print "Before:",akey.content_type
        try:
            k = update_md(k, akey)
            akey = b.get_key(k.name)
            print "After: ",akey.content_type
        except Exception as e:
            print "Content-Type not handled by this script"

    print "Script finished!"
  • Sep 18 / 2015
Python

Python – Browse directory and push on S3 (with regex)

A simple script that browses a directory and uploads to S3 the files matching a regex.
Each file is uploaded under the same relative path it has locally.

#!/usr/bin/env python
# -*- coding: utf-8 -*-
import os, re
import boto
from boto.s3.connection import S3Connection
from boto.s3.key import Key

# Variables
AWS_ACCESS_KEY_ID     = 'YOUR_AWS_ACCESS_KEY_ID'
AWS_SECRET_ACCESS_KEY = 'YOUR_AWS_SECRET_ACCESS_KEY'
AWS_BUCKET_NAME       = 'YOUR_BUCKET_NAME'
DIR_TO_SCAN           = '/path/to/your/directory/'

# Prepare regex 
r = re.compile("[0-9]+\.jpg$")

# Open connection to S3
conn = boto.connect_s3(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)
b = conn.get_bucket(AWS_BUCKET_NAME)
k = Key(b)

for root, directories, filenames in os.walk(DIR_TO_SCAN):
    for filename in filenames:
        # Use search(): match() would anchor at the start of the full path and never match
        if r.search(os.path.join(root,filename)):
            print os.path.join(root,filename)
            # Remove the full path
            tp = os.path.join(root,filename).split(DIR_TO_SCAN)[1]
            # Push file to S3
            k.key = tp
            size = k.set_contents_from_filename(os.path.join(root,filename), replace=False)
            print "%d bytes uploaded for %s"%(size, tp)

