The django.test.client.Client

July 19, 2013

I like django and the more I work with it, the more I like it 🙂

For a unit test I needed to simulate requests coming from different remote addresses, and django.test.client.Client makes this pretty easy:

import random

from django.test import TestCase
from django.test.client import Client

class DistributedTestClient(Client):
    def request(self, **request):
        # pretend each request comes from a random 192.168.x.y address
        request["REMOTE_ADDR"] = "192.168.%i.%i" % (random.randint(1, 254), random.randint(1, 254))
        return super(DistributedTestClient, self).request(**request)

class DistributedClientTestCase(TestCase):
    client_class = DistributedTestClient

    def test_distributed_meep(self):
        test_stuff()

Thanks django!

sha512crypt for node

July 7, 2013

I implemented sha512crypt in nodejs here.

$ ./demo.js pass salt
$6$salt$3aEJgflnzWuw1O3tr0IYSmhUY0cZ7iBQeBP392T7RXjLP3TKKu3ddIapQaCpbD4p9ioeGaVIjOHaym7HvCuUm0

$ python -c 'import crypt; print(crypt.crypt("pass", "$6$salt"))'
$6$salt$3aEJgflnzWuw1O3tr0IYSmhUY0cZ7iBQeBP392T7RXjLP3TKKu3ddIapQaCpbD4p9ioeGaVIjOHaym7HvCuUm0

With that in place, I plan to extend the PassHash firefox plugin to use sha512crypt as the default algorithm for password generation.

rapt (restricted apt wrapper)

June 26, 2013

One of the projects I created a while ago is called “rapt (restricted apt)”. As I was asked about it on IRC recently I thought I should mention it here as well 🙂

It is a python-apt app that allows regular users to install/update software or install build-depends via sudo without giving them full root access. rapt will ensure that there is no interaction (like conffile prompts or debconf) that might allow the user to get a root shell. It supports blacklisting, and with a suitable sources.list it is an easy way to give limited access to more trusted users. One use-case is to allow developers to install build dependencies on a staging machine.

You can install it via

$ bzr branch lp:rapt

and just run the binary via sudo (with a sudoers file that allows running it). All it needs is python and python-apt (which is installed on most systems anyway).
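
To give a rough idea of the approach, here is a small sketch of the concept (my illustration, not rapt’s actual code): refuse blacklisted packages, force a non-interactive frontend and let python-apt do the actual work.

#!/usr/bin/python
# Sketch only -- not the real rapt code.
import os
import sys

import apt

# hypothetical example blacklist
BLACKLIST = set(["sudo", "passwd", "login"])

def restricted_install(pkgname):
    # never allow conffile or debconf prompts that could be abused
    os.environ["DEBIAN_FRONTEND"] = "noninteractive"
    if pkgname in BLACKLIST:
        sys.exit("%s is blacklisted" % pkgname)
    cache = apt.Cache()
    if pkgname not in cache:
        sys.exit("no such package: %s" % pkgname)
    cache[pkgname].mark_install()
    # fetches and installs the package (and its dependencies)
    cache.commit()

if __name__ == "__main__":
    restricted_install(sys.argv[1])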

PassHash sha512 support

June 9, 2013

I added sha512 support to the PassHash firefox extension here (and created a pull request to get it into the upstream branch). I felt it was important to do this after reading this article.
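
Conceptually the change is small. In Python terms (just a sketch of the idea; the extension itself is JavaScript and its exact output handling may differ) it boils down to swapping the digest:

import hashlib
import hmac
from base64 import b64encode

def passhash(master_pass, site_tag, digestmod=hashlib.sha512):
    # HMAC keyed with the master password over the site tag, now with sha512
    h = hmac.new(master_pass.encode("utf-8"), site_tag.encode("utf-8"), digestmod)
    return b64encode(h.digest()).decode("ascii")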

ansible ad-hoc data gathering

June 1, 2013

When using ansible and its “setup” module to gather ad-hoc facts data about multiple hosts, remember that it runs the jobs in parallel, which may result in out-of-order output. With “ansible -f1” the number of parallel processes can be limited to one to ensure this won’t happen. E.g.:

$ ansible all -f1 -m setup -a filter=ansible_mounts

(the filter argument for the facts module is also a nice feature).

Ansible and the facts from the “setup” module

May 30, 2013

I recently started using ansible to automate some server administration tasks.

It’s very cool and easy to learn/extend. One nice feature is the “facts” gathering. It will collect information about the host(s) and store it in its internal variables. This is useful for conditional execution of tasks (see below) but also as an ad-hoc way to gather data like DMI information or the running kernel.

To see all “facts” known to ansible about the hosts, run:

$ ansible all -m setup

To execute tasks conditionally you can do something like this:

- name: install vmware packages
  action: apt pkg=open-vm-tools
  only_if: "'$ansible_virtualization_type' == 'VMware'"

Note that ansible 1.2+ has a different (and simpler) conditional called “when” (e.g. when: ansible_virtualization_type == "VMware").

Ansible is available in Ubuntu 12.04+ via:

$ sudo apt-get install ansible

It is also available in Debian unstable and testing.

git fast-import apt

May 16, 2013

Due to popular demand I moved debian apt and python-apt from bzr to git today. Moving was pretty painless:

$ git init
$ bzr fast-export --export-marks=marks.bzr -b debian/sid /path/to/debian-sid | git fast-import --export-marks=marks.git

And then a fast-import for the debian-wheezy and debian-experimental branches too. Then a

$ git gc --aggressive

(thanks to Guillem Jover for pointing this out) and that was it.

The branches are available at:

Webkitgtk & SSL

April 30, 2013

For a project of mine I created a small app based on webkitgtk that talks to an SSL server.

And I almost forgot about the libsoup default behavior for SSL certificate checking. By default libsoup, and therefore webkitgtk, will not do any SSL certificate checks. You need to put something like the following snippet into your code (adjust for your language of choice):

from gi.repository import WebKit

session = WebKit.get_default_session()
session.set_property("ssl-use-system-ca-file", True)

If you don’t do this it will accept any certificate (including self-signed ones).

This is documented behavior in libsoup and they don’t want to change it there for compatibility reasons. But for webkit it’s unexpected behavior (at least to me) and I hope the webkitgtk developers will consider changing this default in webkit. I filed a bug about it. So if you use webkitgtk and SSL, remember to set the above property.
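
For completeness, here is a minimal example of where the snippet fits (a sketch, assuming the WebKit 1.x GObject introspection bindings):

#!/usr/bin/python
# Minimal sketch: a webkitgtk window that loads a https page with
# certificate checking enabled.
from gi.repository import Gtk, WebKit

# enable certificate validation *before* the first https request is made
session = WebKit.get_default_session()
session.set_property("ssl-use-system-ca-file", True)

window = Gtk.Window(title="webkit-ssl-demo")
window.connect("destroy", Gtk.main_quit)

view = WebKit.WebView()
view.load_uri("https://www.debian.org/")   # any https URL will do
window.add(view)

window.show_all()
Gtk.main()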

PassHash cmdline

April 18, 2013

I use the PassHash firefox extension to generate site-specific strong passwords. The idea behind the extension is that a master password and a siteTag (e.g. the domain name) are used to generate a sha1-based hash. This hash is used as the password for the website. In Python it’s essentially this code:

import hashlib, hmac
from base64 import b64encode
h = hmac.new(master_pass, site_tag, hashlib.sha1)
print(b64encode(h.digest())[:hash_len])

I wanted a command-line utility that outputs PassHash-compatible hashes for when I use w3m (or in case the extension stops working for some reason).

To my delight I discovered that the upstream git repo of PassHash already has a python helper to generate PassHash-compatible passwords. I added some tweaks to use Python’s argparse [1] and now I’m really happy with it:

$ ./tools/passhash.py --hash-size 14 slashdot.org
Please enter the master key: 
KPXveo7bq7j1%X

Nice and hard to brute-force, and it matches what the extension generates.
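
For reference, here is a minimal sketch of what such a helper looks like (my illustration of the idea, not the actual tools/passhash.py from the PassHash repo):

#!/usr/bin/python
# Illustration only -- the real tools/passhash.py may differ in details.
import argparse
import getpass
import hashlib
import hmac
from base64 import b64encode

def passhash(master_key, site_tag, hash_size):
    h = hmac.new(master_key.encode("utf-8"), site_tag.encode("utf-8"), hashlib.sha1)
    return b64encode(h.digest()).decode("ascii")[:hash_size]

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="generate a PassHash-style password")
    parser.add_argument("site_tag", help="site tag, e.g. the domain name")
    parser.add_argument("--hash-size", type=int, default=8,
                        help="length of the generated password (default is arbitrary)")
    args = parser.parse_args()
    master_key = getpass.getpass("Please enter the master key: ")
    print(passhash(master_key, args.site_tag, args.hash_size))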

squid-deb-proxy for Debian

April 11, 2013

I uploaded squid-deb-proxy into Debian unstable today and it’s in the NEW queue. I created it back in the days of Ubuntu 10.04 and some people voiced interest in having it in Debian as well, so I spent a bit of time to get it customized for Debian.

Squid-deb-proxy uses the well-known squid proxy with a custom configuration to cache deb packages and index files (like Packages.gz). It allows caching from the default archives and mirrors and rejects anything else by default.

The basic philosophy is that “it just works”. You run on your server:

root@server# apt-get install squid-deb-proxy

and on your clients:

root@client# apt-get install squid-deb-proxy-client

and that’s it. It does not require any fiddling with configuration (unless you want to 😉 ). The default will let you connect to .debian.org and nothing else.

The server will announce itself via avahi as _apt_proxy._tcp and the client will hook into apt to use Acquire::http::ProxyAutoDetect. The client is also useful with other proxy servers that announce themselves via avahi.
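
To illustrate roughly what the auto-detection amounts to (a sketch under my assumptions, not the actual helper shipped by squid-deb-proxy-client; it relies on avahi-browse from avahi-utils), the command configured via Acquire::http::ProxyAutoDetect simply has to print a usable proxy URL on stdout:

#!/usr/bin/python
# Sketch of an apt proxy auto-detect helper (illustration only).
import subprocess
import sys

def find_apt_proxy():
    # -r resolve services, -t terminate after the dump, -p parseable output
    out = subprocess.check_output(
        ["avahi-browse", "-rtp", "_apt_proxy._tcp"])
    for line in out.decode("utf-8", "replace").splitlines():
        fields = line.split(";")
        # resolved entries start with "=" and carry hostname, address, port
        if fields[0] == "=" and len(fields) >= 9 and fields[2] == "IPv4":
            address, port = fields[7], fields[8]
            return "http://%s:%s/" % (address, port)
    return None

if __name__ == "__main__":
    proxy = find_apt_proxy()
    if proxy:
        # apt uses whatever the helper prints on stdout as the proxy
        print(proxy)
    else:
        sys.exit(1)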

Packaging was a bit more work than anticipated because there is a bit of setup and teardown work in the initscript. For Debian a sysvinit script was needed while Ubuntu uses upstart, so it took a bit of refactoring to extract the code into a common helper.

If you want to try it now, it’s available via:

$ bzr branch lp:squid-deb-proxy
$ cd squid-deb-proxy
$ bzr-buildpackage

and in unstable once it leaves the NEW queue.