Friday, December 28, 2012

From raw idea to useful source code

A couple months ago I had an Idea.

I even blogged about it: A lookup service for US National Weather Service codes. Those codes are necessary to access their machine-readable products.

In this post, I will show how I developed the idea into some code, how I grew the code into a project, added structure and version control, and finally moved the project onto online hosting.

This is not the only way to create a project.
This is probably not the best way for many projects.
It's just the way I did it, so you can avoid the most common mistakes.

You can see my final product hosted online at Launchpad.

From Idea to Code:

I know barely enough C to be able to ask where the bathroom is, so it's easier for me to use Python.

Code starts out as a single Python script:

As we add more features, a single script gets big and unwieldy, and we break it into smaller pieces.

For example, this structure easily allows more interfaces to be added.
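As a sketch of that structure, the core lookup logic can live in one place while each interface stays a thin wrapper around it. All names and data here are invented for illustration, not the project's real files:

```python
def lookup(zipcode):
    """Core lookup logic, shared by every interface (stub data for illustration)."""
    stations = {'53207': ('kmke', 'wiz066')}   # zipcode -> (obs station, forecast zone)
    return stations.get(zipcode)

def cli_main(argv):
    """A thin command-line interface: parse args, call the core, format output."""
    result = lookup(argv[1])
    return 'not found' if result is None else ','.join(result)
```

Adding a new interface (dbus, http) then means adding another thin wrapper, not touching the lookup logic.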

Let's add a dbus interface, too. Dbus will send messages to the interface if it knows about it. We let dbus know about it using a service file.
- dbus_service_file.service

Let's add an http interface, so not everyone in the world needs to download 5-6MB of mostly-unused lookup databases:
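A minimal sketch of such an HTTP interface, using Python's built-in http.server. The URL scheme and the lookup data here are my inventions, not the project's real ones:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse

# Hypothetical stand-in for the real multi-megabyte database lookup
def lookup_zipcode(zipcode):
    return {'53207': 'kmke'}.get(zipcode, 'unknown')

def parse_request(path):
    """Split a request path like /zipcode/53207 into ('zipcode', '53207')."""
    parts = urlparse(path).path.strip('/').split('/')
    return tuple(parts) if len(parts) == 2 else None

class LookupHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        request = parse_request(self.path)
        if request and request[0] == 'zipcode':
            body, status = lookup_zipcode(request[1]).encode(), 200
        else:
            body, status = b'bad request', 400
        self.send_response(status)
        self.send_header('Content-Length', str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To serve for real (blocks forever):
# HTTPServer(('localhost', 8888), LookupHandler).serve_forever()
```

Clients then fetch one small answer over HTTP instead of downloading the databases.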

Let's go back and formalize how we create the databases:

We have a lot of hard-coded variables in these scripts. Let's break them out into a config file.
- configfile.conf

There are other possible files, that we're not using. For example:
- Upstart config file (if we want the service to run/stop at boot or upon some system signal, lives in /etc/init)
- Udev rule file (if we want the service to run/stop when a device is plugged in, lives in /etc/udev/rules.d)

But that's a lot of files and scripts! 8 files, plus the databases.

Adding Version Control:

It's time to get serious about these eight files. We have invested a lot of time creating them, and it's time to start organizing the project so others can contribute, so we can track bugs and features, and so we can protect all our invested work.

First, we need to introduce version control. Ideally, we would have done that from the start. But we didn't. So let's fix that.

Version control offers a lot of advantages:
    We can undo mistakes.
    It helps us package the software later.
    It helps us track bugs.
    It helps us apply patches.
    It helps us document changes.

There are plenty of good version control systems available. For this example, I'll use bazaar. The developers have a very good tutorial.

Installing bazaar:

$ sudo apt-get install bzr
$ bzr whoami "My Name "

Since we didn't start with proper version control, we need to create a new directory using version control, move our files into it, and add our files to version control.

$ bzr init-repo My_NEW_project_directory
$ bzr init My_NEW_project_directory
$ mv My_OLD_project_directory/* My_NEW_project_directory/
$ cd My_NEW_project_directory
$ bzr add *

Finally, we need to clean up the old directory, and commit the changes.

$ rmdir ../My_OLD_project_directory
$ bzr commit -m "Initial setup"

Organizing the code

My project directory is starting to get disorganized, with eight scripts and files, plus six database files, plus version control, plus more to come. I'm going to restructure my project folder like this:

  +-- data   (all the database files)
  +-- src    (all the python scripts and other non-data files)
  +-- .bzr   (bzr's version control tracking)

Once version control is active, we cannot just move things around. We need to use the version control tools so it can keep tracking the right files.

$ bzr mkdir data src
$ bzr mv *.gz data/
$ bzr mv *.py src/
$ bzr mv dbus_service_file.service src/
$ bzr mv configfile.conf src/

See how bazaar adds the directories and performs the moves?

Now My_project_directory should be empty of files. Once reorganization is complete, remember to commit the change:

$ bzr commit -m "Reorganize the files to a better project structure"

Integrating into the system:

We have a problem with our eight files. They run beautifully, but only if they are in our home directory.

That won't work in the long run. A server should not be run as a user with shell access - that's a security hole. Nor should it be run out of a user's /home. Nor should it be run as root. Also, other applications that are looking for our server won't find it - all the files are in the wrong places.

So we need to put our files into the right places. And often that means fixing the scripts to replace hard-coded temporary paths (like '~/server/foo') with the proper locations ('/usr/lib/foo-server/foo').

Where are the right places?

The Linux Filesystem Hierarchy Standard (FHS) is used by Debian to define the right places.

Two files are directly user-launched in regular use:
- /usr/bin
- /usr/bin

The database files are read-only and available to any application:
- database files: /usr/share

Three files are launched or imported by other applications or scripts:
- /usr/lib
- /usr/lib
- /usr/lib

One file is very rarely user-launched under unusual circumstances:

The dbus service file will be looked for by dbus in a specific location:
- geolocation_dbus.service: /usr/share/dbus-1/services

Config files belong in /etc
- geolocation.conf: /etc

Makefiles make organization easier:

Now that we know where the right places are, let's create a Makefile that will install and uninstall the files to the right place. Our originals stay where they are - the makefile copies them during the install, and deletes the copies during uninstall.

Makefiles are really config files for the make application (included in the build-essential metapackage). They tell make which files depend on which others, which files to compile (we won't be compiling), where the installed application files should be located, and how to remove the application.

Here is a sample makefile for my project (wbs-server):

DATADIR = $(DESTDIR)/usr/share/weather-location
LIBDIR  = $(DESTDIR)/usr/lib/wbs-server
BINDIR  = $(DESTDIR)/usr/bin
CONFDIR = $(DESTDIR)/etc
DBUSDIR = $(DESTDIR)/usr/share/dbus-1/services
CACHDIR = $(DESTDIR)/var/cache/wbs-webserver

install:
# Indents use TABS, not SPACES! Space indents will cause make to fail
	mkdir -p $(DATADIR)
	cp data/*.gz $(DATADIR)/

	mkdir -p $(LIBDIR)
	cp src/ $(LIBDIR)/
	cp src/ $(LIBDIR)/
	cp src/ $(LIBDIR)/
	cp src/ $(LIBDIR)/

	cp src/ $(BINDIR)/
	cp src/ $(BINDIR)/
	cp src/wbs-server.service $(DBUSDIR)/
	cp src/configfile.conf $(CONFDIR)/wbs-server.conf
	mkdir -p $(CACHDIR)

uninstall:
	rm -rf $(DATADIR)
	rm -rf $(LIBDIR)

	rm -f $(BINDIR)/
	rm -f $(BINDIR)/
	rm -f $(DBUSDIR)/wbs-server.service
	rm -f $(CONFDIR)/wbs-server.conf
	rm -rf $(CACHDIR)

Let's save the makefile as Makefile, and run it using sudo make install and sudo make uninstall.

We run a test:

$ sudo make install
$ /usr/bin/ zipcode 43210
bash: /usr/lib/wbs-server/ Permission denied

Uh-oh. Let's investigate:

$ ls -l /usr/lib/wbs-server/ 
-rw-r--r-- 1 root root 3287 Dec 23 20:46 /usr/lib/wbs-server/

Aha. Permissions are correct, but the executable flag is not set. Let's uninstall the application so we can fix the makefile.

$ sudo make uninstall

In the makefile, we can make a few changes if we wish. We can set the executable flag. We can also create links or symlinks, or rename the copy.

For example, is a rather obtuse name for a command-line executable. Instead of copying it to /usr/bin, let's copy it to /usr/lib with its fellow scripts, make it executable, and create a symlink to /usr/bin with a better name like 'weather-lookup'.

 cp src/ $(LIBDIR)/
 chmod +x $(LIBDIR)/
 ln -s $(LIBDIR)/ $(BINDIR)/weather-lookup

 rm -f $(BINDIR)/weather-lookup

Another example: It's a bad idea to run a webserver as root. So let's add a couple lines to the makefile to create (and delete) a separate system user to run the webserver.

USERNAM = wbserver
 adduser --system --group --no-create-home --shell /bin/false $(USERNAM)
 chgrp $(USERNAM) $(LIBDIR)/*
 chgrp $(USERNAM) $(CACHDIR)
 # Launch the webserver using the command 'sudo -u wbserver wbs-server'

 deluser --system --quiet $(USERNAM)

Sharing the code

Now we have a complete application, ready to distribute.

Thanks to the Makefile, we include a way to install and uninstall.

It's not a package yet. It's not even a source package yet. It's just source code and an install/uninstall script.

We can add a README file, a brief description of how to install and use the application.

We can also add an INSTALL file, detailed instructions on how to unpack (if necessary) and install the application.

It would be very very wise to add a copyright and/or license file, so other people know how they can distribute the code.

After all that, remember to add those files to version control! And to finally commit the changes:

bzr commit -m "Initial code upload. Add README, INSTALL, copyright, and license files."

Finally, we need a place to host the code online. Since I already have a Launchpad account and use bzr, I can easily create a new project on Launchpad.

And then uploading the version-controlled files is as simple as:

bzr launchpad-login my-launchpad-name
bzr push lp:~my-launchpad-name/my-project/trunk 

You can see my online code hosted at Launchpad.

Next time, we'll get into how to package this source code.

Sunday, December 16, 2012

Very Simple Database in Python

Experimenting with big lookup tables for my weather code lookup server. Instead of using a big configparser file, I want to try a small database.

Python's dbm bindings are included in the default install of Ubuntu. It's light and easy to use.

#!/usr/bin/env python3
import dbm.gnu                # python3-gdbm package
zipcodes = '/tmp/testdb'

# Create a new database with one entry
# Schema: Key is Zipcode
# Value is Observation_Station_Code, Radar_Station_Code, Forecast_Zone
zipc =, 'c')
zipc['53207'] = b'kmke,mkx,wiz066'

# Close and reopen the database
zipd =, 'r')

# List of database keys
keys = zipd.keys()

# Retrieve and print one entry

It works very well and is very fast. It's not easy to view or edit the database with other applications, since it is binary (not text).

Saturday, November 24, 2012

Dbus Tutorial - GObject Introspection instead of python-dbus

Network Manager
Create a Service
GObject Introspection

In previous posts, I have looked at using python-dbus to communicate with other processes, essentially using it the same way we use the dbus-send command.

There is another way to create DBus messages. It's a bit more complicated than python-dbus, and it depends upon Gnome, but it's also more robust and perhaps better maintained.

Using GObject Introspection as a replacement for python-dbus is described in several places, but the best example is here. Python-dbus, as a separate bindings project, has also suffered from complaints of being "lightly maintained," and an awkward method of exposing properties that has gone unfixed for years.

These examples only work for clients.  Gnome Bug #656330 shows that services cannot yet use PyGI.

Here's an example notification using Pygi instead of Python-DBus. It's based on this blog post by Martin Pitt, but expanded a bit to show all the variables I can figure out....

1) Header and load gi

#!/usr/bin/env python3

import gi.repository
from gi.repository import Gio, GLib

2) Connect to the DBus Session Bus

session_bus = Gio.BusType.SESSION
cancellable = None
connection = Gio.bus_get_sync(session_bus, cancellable)

3) Create (but don't send) the DBus message header

proxy_property = 0
interface_properties_array = None
destination = 'org.freedesktop.Notifications'
path = '/org/freedesktop/Notifications'
interface = destination
notify = Gio.DBusProxy.new_sync(connection, proxy_property,
                                interface_properties_array, destination,
                                path, interface, cancellable)

4) Create (but don't send) the DBus message data
The order is determined by arg order of the Notification system

application_name = 'test'
title = 'Hello World!'
body_text = 'Subtext'
id_num_to_replace = 0
actions_list = []
hints_dict = {}
display_milliseconds = 5000
icon = 'gtk-ok'  # Can use full path, too '/usr/share/icons/Humanity/actions/'
args = GLib.Variant('(susssasa{sv}i)', (application_name, id_num_to_replace,
                    icon, title, body_text, actions_list, hints_dict,
                    display_milliseconds))

5) Send the DBus message header and data to the notification service

method = 'Notify'
timeout = -1
result = notify.call_sync(method, args, proxy_property, timeout, cancellable)

6) (Optional) Convert the result value from a Uint32 to a python integer

id = result.unpack()[0]

Play with it a bit, and you will quickly see how the pieces work together.

Here is a different, original example of a DBus client using introspection, based on this askubuntu question. You can see it is a modified and simplified version of the above example:

#!/usr/bin/env python3
import gi.repository
from gi.repository import Gio, GLib

# Create the DBus message
destination = 'org.freedesktop.NetworkManager'
path        = '/org/freedesktop/NetworkManager'
interface   = 'org.freedesktop.DBus.Introspectable'
method      = 'Introspect'
args        = None
answer_fmt  = GLib.VariantType.new('(s)')
proxy_prpty = Gio.DBusCallFlags.NONE
timeout     = -1
cancellable = None

# Connect to DBus, send the DBus message, and receive the reply
bus   = Gio.bus_get_sync(Gio.BusType.SYSTEM, None)
reply = bus.call_sync(destination, path, interface,
                      method, args, answer_fmt,
                      proxy_prpty, timeout, cancellable)

# Convert the result value to a formatted python element
print(reply.unpack()[0])

Here is a final DBus client example, getting the properties of the current Network Manager connection

#!/usr/bin/env python3
import gi.repository
from gi.repository import Gio, GLib

# Create the DBus message
destination = 'org.freedesktop.NetworkManager'
path        = '/org/freedesktop/NetworkManager/ActiveConnection/19'
interface   = 'org.freedesktop.DBus.Properties'
method      = 'GetAll'
args        = GLib.Variant('(s)',
              ('org.freedesktop.NetworkManager.Connection.Active',))
answer_fmt  = GLib.VariantType.new('(a{sv})')
proxy_prpty = Gio.DBusCallFlags.NONE
timeout     = -1
cancellable = None

# Connect to DBus, send the DBus message, and receive the reply
bus   = Gio.bus_get_sync(Gio.BusType.SYSTEM, None)
reply = bus.call_sync(destination, path, interface,
                      method, args, answer_fmt,
                      proxy_prpty, timeout, cancellable)

# Convert the result value to a useful python object and print
[print(item[0], item[1]) for item in reply.unpack()[0].items()]

As you can see from this example, dbus communication is actually pretty easy using GLib: Assign the nine variables, turn the crank, and unpack the result.

GeoClue vs Geocode-Glib

This post has been superseded by a more recent post with updated information.  

GeoClue, used in Gnome and Unity, is a Dbus service that consolidates location input from multiple sources to estimate a best location. It's lightly maintained, and the most recent maintainer has essentially deprecated it in favor of his newer Geocode-Glib.

That announcement is here, plus a bit of reading-between-the-lines and a few subsequent clarifications.

The big advantage of geocode-glib is that it uses GObject introspection instead of specialty DBus bindings. The big disadvantage is that it leaves KDE and other non-Gnome environments unsupported...and has less functionality than the older GeoClue.

For now, it looks like I need to stick with GeoClue, and perhaps even help maintain it for my current weather-backend project. geocode-glib simply doesn't do the job I need GeoClue to do.

Here is an example of using python and geocode-glib to get geolocation information. I have not seen any examples of geocode-glib anywhere else, so I may be the first to use the lib with python:

#!/usr/bin/env python3
import gi.repository
from gi.repository import GeocodeGlib

# Create an object and add location information to it 
location = GeocodeGlib.Object()

# Do the geolocation and print the response dict
result = location.resolve()
[print(item[0], item[1]) for item in result.items()]

Give it a try...
The relevant packages are libgeocode-glib0 and gir1.2-geocodeglib-1.0
The Geocode-Glib source code and API reference are hosted at Gnome.

Saturday, November 3, 2012

NEW - US National Weather Service geolocation lookup server

Just completed my first hack of a US National Weather Service (NWS) geolocation lookup server.

It doesn't tell you the weather...instead it tells you the best Observation Station code and Forecast Zone code so your application can pull the right data directly from the NWS servers.

NWS lookup is important to US users for obvious reasons. Internationally, it's important, too - NWS is one of the easiest and most open sources of international METAR current-condition reports.

Proprietary services like wunderground and Yahoo! will feed weather data based on raw locations...but as the recent and sudden Google Weather shutdown showed us, it's good to keep more than one arrow in your quiver. And this doesn't require registration or an API's totally free.

This lookup feature has been sorely lacking. In the past, you needed to scrape a NWS web page to figure out those codes...or ask the user to go find them somehow.


Here's an example of how to use the lookup: I have a zipcode (53207), and I need to know the appropriate Observation Station to get current conditions from:

$ wget -q -O - \
               | sed 's/.*<station_code>\(.*\)<\/station_code>.*/\1/'


The local station is 'kmke'. Plug that into the NWS current conditions URL:

$ wget -q -O -

Nov 03, 2012 - 06:52 PM EDT / 2012.11.03 2252 UTC
Wind: from the NE (040 degrees) at 6 MPH (5 KT):0
Visibility: 10 mile(s):0
Sky conditions: mostly cloudy
Temperature: 39.0 F (3.9 C)
Dew Point: 30.0 F (-1.1 C)
Relative Humidity: 69%
Pressure (altimeter): 30.19 in. Hg (1022 hPa)
ob: KMKE 032252Z 04005KT 10SM FEW035 SCT120 BKN250 04/M01 A3019 RMK AO2 SLP231 T00391011
cycle: 23

How to use it

The lookup testing server at accepts 5-digit zipcodes (53207), three-letter airport codes (mke), place names (Milwaukee,WI), and lat/lon (lat=42.57n,lon=87.54w). Lat/Lon is available worldwide, and will return the closest METAR location; the rest are US-only.

It returns an XML string. The string includes three sections: A data_source credit, the original parsed request, and the location data with the Observation Station code (kmke), a descriptive location string (General Mitchell International Airport, WI), and a Forecast Zone code (wiz066). Non-US METAR sites lack the Forecast Zone, since NWS creates forecasts for the US only.

  <data_source>weather-util data files, Oct 25, 2012, v2.0,</data_source>
    <station_location>Entebbe Airport, Uganda</station_location>
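A client can pull the needed codes out of that XML response with the standard library. The sample below is assembled from the fragments shown above; the server's real tag layout may differ:

```python
import xml.etree.ElementTree as ET

# Made-up response in the shape the post describes (real tag names may differ)
SAMPLE = """<lookup>
  <data_source>weather-util data files, Oct 25, 2012, v2.0</data_source>
  <request>zipcode 53207</request>
  <location>
    <station_code>kmke</station_code>
    <station_location>General Mitchell International Airport, WI</station_location>
    <forecast_zone>wiz066</forecast_zone>
  </location>
</lookup>"""

root = ET.fromstring(SAMPLE)
station = root.findtext('location/station_code')
zone = root.findtext('location/forecast_zone')
```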

Limitations on name searches

Some large city requests (City,St) may return multiple locations, or unexpected results since the city may have multiple observation stations or multiple forecast zones within the boundaries. This is a limitation of the public data sets. For example, a search for /place/New+York+City,NY will return an error, since that's not really the city's name. A search for /place/Brooklyn,NY will return a closest Observation Station at Central Park instead of Kennedy Airport. Zipcode searches will generally return more accurate results.

Usage limits

The testing server is limited to five requests per day, so cache those results! Or just contact me if you want to be on the 'unlimited' list. Throttling is merely to spare my server from abuse - it's not meant to be a tease.
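Caching is easy because a zipcode's codes essentially never change. One minimal approach is functools.lru_cache around the request function; the lookup function and its data here are hypothetical stand-ins for a real HTTP request:

```python
import functools

@functools.lru_cache(maxsize=None)
def lookup_station(zipcode):
    # Hypothetical stand-in for one HTTP request to the lookup server
    lookup_station.requests += 1
    return {'53207': 'kmke'}.get(zipcode)

lookup_station.requests = 0
first = lookup_station('53207')
repeat = lookup_station('53207')   # answered from the cache; no second request
```

An on-disk cache (shelve, dbm) would survive restarts, which matters more for a five-requests-per-day limit.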

Give it a try, and let me know what you think.
If you find a bug, please e-mail me instead of leaving a comment.

Data sets and Credits

The dataset is merely the weather-util-data package, a set of reformatted and correlated lookup tables of freely-available US Government geospatial information. This USG public data is available for public use without copyright or restriction.

Thanks to the US National Weather Service for public access to so many cool products.

Special thanks to Jeremy Stanley, developer of the weather-util* packages, who created most of the important geolocation code and figured out a way to create the lookup tables. I merely wrapped an XML server around those tables. His code is released under the ISC license, which is GPL-compatible, and authorizes reuse.

Source code for this 0.0.1 (still hacking) version is at, since I haven't actually organized a project around this script yet.

Saturday, October 13, 2012

Dbus Tutorial - Create a service

Network Manager 
Create a Service
GObject Introspection

A dbus service is usable by other applications. It listens for input from another process, and responds with output.

When you create a service, you need to make a couple decisions about when you want to start your service, and when you want to terminate it...

Start: Startup, login, first-use, on-demand?
End: Each time? logout? Shutdown?
In other words, is this a single-use service, or a forever-running daemon?

Happily, the actual code differences are trivial. Dbus itself can launch a service that's not running yet. (Indeed, a lot of startup and login depends on that!)

Example Dbus Service in Python3

Here's an example of a self-contained daemon written in Python 3 (source). It's introspectable and executable from dbus-send or d-feet. When called, it simply returns a "Hello, World!" string.

It can be started by either dbus or another process (like Upstart or a script). Since it runs in an endless loop awaiting input, it will run until logout. It can also be manually terminated by uncommenting the Gtk.main_quit() command.

# This file is /home/me/
# Remember to make it executable if you want dbus to launch it
# It works with both Python2 and Python3

from gi.repository import Gtk
import dbus
import dbus.service
from dbus.mainloop.glib import DBusGMainLoop

class MyDBUSService(dbus.service.Object):
    def __init__(self):
        bus_name = dbus.service.BusName('', bus=dbus.SessionBus())
        dbus.service.Object.__init__(self, bus_name, '/org/me/test')

    @dbus.service.method('')
    def hello(self):
        #Gtk.main_quit()   # Terminate after running. Daemons don't use this.
        return "Hello,World!"

DBusGMainLoop(set_as_default=True)
myservice = MyDBUSService()
Gtk.main()    # Wait in an endless loop for input

Daemon that runs all the time

Just run the script at startup (or login). Or send a dbus-send message to the service, and dbus will start it. It will be terminated as part of shutdown (or logout). While it's running, it's introspectable and visible from d-feet.

Dbus-initiated start

Add a .service file. This file simply tells dbus how to start the service.

Here's an example service file:

# Service file: /usr/share/dbus-1/services/test.service
[D-BUS Service]

Dbus should automatically pick up the new service without need for any restart. Let's test if dbus discovered the service:

$ dbus-send --session --print-reply \
       --dest="org.freedesktop.DBus" \
       /org/freedesktop/DBus \
       org.freedesktop.DBus.ListActivatableNames \
       | grep test
string ""

The new service does not show up in d-feet until after it is run the first time, since before there is nothing to probe or introspect. But it does exist, and is findable and usable by other dbus-aware applications.

Let's try the new service:

$ dbus-send --session --print-reply \
--dest="" /org/me/test

method return sender=:1.239 -> dest=:1.236 reply_serial=2
   string "Hello,World!"

$ dbus-send --session --print-reply \
--dest="" /org/me/test

Error org.freedesktop.DBus.Error.UnknownMethod: Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/dbus/", line 654, in _message_cb
    (candidate_method, parent_method) = _method_lookup(self, method_name, interface_name)
  File "/usr/lib/python3/dist-packages/dbus/", line 246, in _method_lookup
    raise UnknownMethodException('%s is not a valid method of interface %s' % (method_name, dbus_interface))
dbus.exceptions.UnknownMethodException: org.freedesktop.DBus.Error.UnknownMethod: Unknown method: Frank is not a valid method of interface

It worked! Dbus launches the script, waits for the service to come up, then asks the service for the appropriate method. Upon execution of the method, the waiting loop terminates, and the script finishes and shuts down.

As a test, you can see that 'hello' is indeed a valid method and returns a valid response, while the invalid method 'Frank' causes a not-found error.

Dbus-initiated stop

Dbus doesn't stop scripts or processes. But a script can stop itself.

In order to wait for input, python-dbus uses a Gtk.main() loop. In this case, simply uncomment the line Gtk.main_quit(). When the method is called, the main() loop gets terminated, and the script continues to the next loop or end.

If you use on-demand starting and stopping, be aware that the service will exist, but will be visible in d-feet or introspectable only for the few seconds it's actually running.

Obsolete: Before python included introspection, you needed to include an interface definition. But you don't need this anymore - introspection seems to have replaced it. Avoid confusion - some old tutorials out there still include it.

<?xml version="1.0" encoding="UTF-8"?>
<!-- /usr/share/dbus-1/interfaces/ -->
<node name="/org/me/test">
        <interface name="">
                <annotation name="org.freedesktop.DBus.GLib.CSymbol" value="server"/>
                <method name="EchoString">
                        <arg type="s" name="original" direction="in" />
                        <arg type="s" name="echo" direction="out" />
                </method>
                <!-- Add more methods/signals if you want -->
        </interface>
</node>

Wednesday, October 3, 2012

Python 3: Using httplib2 to download just a page header

I want to use the National Weather Service geolocator for my private weather script.

When my laptop moves to a new location, the script automatically figures out the correct weather to show.

The NWS geolocator uses web page redirection. If I tell it that I want the web page for "Denver, CO," it redirects me to a web page for the appropriate Latitude/Longitude. I don't actually want the web page - I get the data from other sources...but I do sometimes want that redirect so I can parse the lat/lon pair.

>>> import httplib2
>>> url = ""
>>> h = httplib2.Http()
>>> h.follow_redirects = False
>>> head = h.request(url, "HEAD")
>>> head
({'status': '302', 'connection': 'Keep-Alive', 'location': '', 'content-length': '0', 'server': 'BigIP'}, b'')
>>> head[0]['location']

Saturday, September 29, 2012

Python3 configparser

configparser is a Python module, part of the standard library, that reads and writes config files and treats them almost like python dictionaries.

Well, almost.

Example dictionary
>>> test = {}
>>> test['Cleese'] = {'a':'1', 'b':'2'}
>>> test['John'] = {'f':True, 'g':'Hello'}

Example config object
>>> import configparser
>>> test = configparser.ConfigParser()
>>> test['Cleese'] = {'a':'1', 'b':'2'}
>>> test['John'] = {'f':True, 'g':'Hello'}

First, let's get the top-level list of sections:

Get the list of sections for a dict
>>> test
{'John': {'g': 'Hello', 'f': True}, 'Cleese': {'a': '1', 'b': '2'}}
>>> test.keys()
dict_keys(['John', 'Cleese'])
>>> list(test.keys())
['John', 'Cleese']

Let's try the same for a config object:
>>> test
<configparser.ConfigParser object at 0xb755c34c>
>>> test.keys()
KeysView(<configparser.ConfigParser object at 0xb755c34c>)
>>> test.sections()
['Cleese', 'John']

Second, let's get the list of keys in one section:

Get the list of keys for one section of a dict
>>> test['Cleese']
{'a': '1', 'b': '2'}
>>> list(test['Cleese'].keys())
['a', 'b']

Same task for a config object:
>>> test['Cleese']
<Section: Cleese>
>>> test['Cleese'].sections()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: 'SectionProxy' object has no attribute 'sections'
>>> [a for a in test['Cleese']]
['a', 'b']

Third, let's get the key-value pairs in the same section:

Get the key-value pair for one section of a dict
>>> test['Cleese']
{'a': '1', 'b': '2'}
>>> [(a, test['Cleese'][a]) for a in test['Cleese'].keys()]
[('a', '1'), ('b', '2')]
>>> [print(a, test['Cleese'][a]) for a in test['Cleese'].keys()]
a 1
b 2
[None, None]

Same task for a config object:
>>> [(a, test['Cleese'][a]) for a in test['Cleese']]
[('a', '1'), ('b', '2')]
>>> [print(a, test['Cleese'][a]) for a in test['Cleese']]
a 1
b 2
[None, None]

Finally, let's dump ALL the key-value pairs in the whole object:

Iterate and dump all key-value pairs in a dict
>>> [[print(b, test[a][b]) for b in test[a].keys()] for a in test.keys()]
g Hello
f True
a 1
b 2
[[None, None], [None, None]]

Dump all in a configparse object
>>> [[print(b, test[a][b]) for b in test[a]] for a in test]
a 1
b 2
g Hello
f True
[[], [None, None], [None, None]]
>>> [[print(b, test[a][b]) for b in test[a].keys()] for a in test.sections()]
a 1
b 2
g Hello
f True
[[None, None], [None, None]]

Saturday, September 22, 2012

Dbus Tutorial - Fun with Network Manager

Introspection: figuring out the rules 
Fun with Network Manager
Create a service 
Gobject Introspection

Let's figure out how to use dbus to get detailed information out of Network Manager (NM), and then to plug our own information back into NM.

Example #1: This script determines if a specific network is the active connection. This is handy for, say, mapping network printers or networked drives, or setting the time, or automating backups, or lots of other stuff.

# What network am I on?

# Get the Active Connection path
# Result should be like: /org/freedesktop/NetworkManager/ActiveConnection/187
active_connection_path=$( dbus-send --system --print-reply \
                          --dest=org.freedesktop.NetworkManager \
                          /org/freedesktop/NetworkManager \
                          org.freedesktop.DBus.Properties.Get \
                          string:"org.freedesktop.NetworkManager" \
                          string:"ActiveConnections" \
                          | grep ActiveConnection/ | cut -d'"' -f2 )

# Get the Access Point path
# Result should be like: /org/freedesktop/NetworkManager/AccessPoint/194
access_point_path=$( dbus-send --system --print-reply \
                     --dest=org.freedesktop.NetworkManager \
                     "$active_connection_path" \
                     org.freedesktop.DBus.Properties.Get \
                     string:"org.freedesktop.NetworkManager.Connection.Active" \
                     string:"SpecificObject" \
                     | grep variant | cut -d'"' -f2 )

# Get the Access Point ESSID
# Result should be something like "NETGEAR"
essid=$( dbus-send --system --print-reply \
                   --dest=org.freedesktop.NetworkManager \
                   "$access_point_path" \
                   org.freedesktop.DBus.Properties.Get \
                   string:"org.freedesktop.NetworkManager.AccessPoint" \
                   string:"Ssid" \
                   | grep variant | cut -d'"' -f2 )

# If we are on the HOME network
if [ "$essid" = "MyHomeNet" ]; then
   # Do network-specific changes here
   :

elif [ "$essid" = "WorkCorporateNet" ]; then
   # Do network-specific changes here
   :

else
   # Do changes for unrecognized network or no network at all here
   :
fi

exit 0

Example #2: Disable networking

$ dbus-send --system --print-reply \
            --dest=org.freedesktop.NetworkManager \
            /org/freedesktop/NetworkManager \
            org.freedesktop.DBus.Properties.Set \
            string:"org.freedesktop.NetworkManager" \
            string:"NetworkingEnabled" \
            variant:boolean:false

Example #3: Enable networking. It's exactly the same as the previous example, except for the last line.

$ dbus-send --system --print-reply \
            --dest=org.freedesktop.NetworkManager \
            /org/freedesktop/NetworkManager \
            org.freedesktop.DBus.Properties.Set \
            string:"org.freedesktop.NetworkManager" \
            string:"NetworkingEnabled" \
            variant:boolean:true

Example #4: Check networking status

$ dbus-send --system --print-reply \
            --dest=org.freedesktop.NetworkManager \
            /org/freedesktop/NetworkManager \
            org.freedesktop.DBus.Properties.Get \
            string:"org.freedesktop.NetworkManager" \
            string:"NetworkingEnabled"

method return sender=:1.4 -> dest=:1.325 reply_serial=2
   variant       boolean true

Example #5: Disable Wireless

$ dbus-send --system --print-reply \
            --dest=org.freedesktop.NetworkManager \
            /org/freedesktop/NetworkManager \
            org.freedesktop.DBus.Properties.Set \
            string:"org.freedesktop.NetworkManager" \
            string:"WirelessEnabled" \
            variant:boolean:false

Example #6: Enable Wireless

$ dbus-send --system --print-reply \
            --dest=org.freedesktop.NetworkManager \
            /org/freedesktop/NetworkManager \
            org.freedesktop.DBus.Properties.Set \
            string:"org.freedesktop.NetworkManager" \
            string:"WirelessEnabled" \
            variant:boolean:true

Example #7: Check Wireless Status

$ dbus-send --system --print-reply \
            --dest=org.freedesktop.NetworkManager \
            /org/freedesktop/NetworkManager \
            org.freedesktop.DBus.Properties.Get \
            string:"org.freedesktop.NetworkManager" \
            string:"WirelessEnabled"

method return sender=:1.4 -> dest=:1.326 reply_serial=2
   variant       boolean true

Example #9: List all active network connections

$ dbus-send --system --print-reply \
            --dest=org.freedesktop.NetworkManager \
            /org/freedesktop/NetworkManager \
            org.freedesktop.DBus.Properties.Get \
            string:"org.freedesktop.NetworkManager" \
            string:"ActiveConnections"

method return sender=:1.4 -> dest=:1.328 reply_serial=2
   variant       array [
         object path "/org/freedesktop/NetworkManager/ActiveConnection/18"

Example #10: Which interface is the active connection using?

$ dbus-send --system --print-reply \
            --dest=org.freedesktop.NetworkManager \
            /org/freedesktop/NetworkManager/ActiveConnection/18 \
            org.freedesktop.DBus.Properties.Get \
            string:"org.freedesktop.NetworkManager.Connection.Active" \
            string:"Devices"

method return sender=:1.4 -> dest=:1.331 reply_serial=2
   variant       array [
         object path "/org/freedesktop/NetworkManager/Devices/0"

$ dbus-send --system --print-reply \
            --dest=org.freedesktop.NetworkManager \
            /org/freedesktop/NetworkManager/Devices/0 \
            org.freedesktop.DBus.Properties.Get \
            string:"org.freedesktop.NetworkManager.Device" \
            string:"Interface"

method return sender=:1.4 -> dest=:1.332 reply_serial=2
   variant       string "wlan0"

Example #11: Get the current wireless access point

$ dbus-send --system --print-reply \
            --dest=org.freedesktop.NetworkManager \
            /org/freedesktop/NetworkManager/ActiveConnection/18 \
            org.freedesktop.DBus.Properties.Get \
            string:"org.freedesktop.NetworkManager.Connection.Active" \
            string:"SpecificObject"

method return sender=:1.4 -> dest=:1.334 reply_serial=2
   variant       object path "/org/freedesktop/NetworkManager/AccessPoint/209"

Example #12: Get the list of all visible wireless access points

$ dbus-send --system --print-reply \
            --dest=org.freedesktop.NetworkManager \
            /org/freedesktop/NetworkManager/Devices/0 \
            org.freedesktop.NetworkManager.Device.Wireless.GetAccessPoints

method return sender=:1.4 -> dest=:1.333 reply_serial=2
   array [
      object path "/org/freedesktop/NetworkManager/AccessPoint/209"
      object path "/org/freedesktop/NetworkManager/AccessPoint/208"
      object path "/org/freedesktop/NetworkManager/AccessPoint/207"
      object path "/org/freedesktop/NetworkManager/AccessPoint/206"

Example #13: Read the SSID of the active wireless access point

$ dbus-send --system --print-reply \
            --dest=org.freedesktop.NetworkManager \
            /org/freedesktop/NetworkManager/AccessPoint/209 \
            org.freedesktop.DBus.Properties.Get \
            string:"org.freedesktop.NetworkManager.AccessPoint" \
            string:"Ssid"

method return sender=:1.4 -> dest=:1.335 reply_serial=2
   variant       array of bytes "NETGEAR"

Example #14: Read the signal strength of the active wireless point

$ dbus-send --system --print-reply \
            --dest=org.freedesktop.NetworkManager \
            /org/freedesktop/NetworkManager/AccessPoint/209 \
            org.freedesktop.DBus.Properties.Get \
            string:"org.freedesktop.NetworkManager.AccessPoint" \
            string:"Strength"

method return sender=:1.4 -> dest=:1.340 reply_serial=2
   variant       byte 54
# The byte is printed in hexadecimal: 0x54 = 84 in decimal. Strength = 84%
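You can check that hex-to-decimal conversion from the shell; printf accepts a 0x prefix:

```shell
# Convert the reported hex strength byte 0x54 to a decimal percentage
printf '%d\n' 0x54
# → 84
```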

Example #15: Read the stored NM connection that is currently active

$ dbus-send --system --print-reply \
            --dest=org.freedesktop.NetworkManager \
            /org/freedesktop/NetworkManager/ActiveConnection/18 \
            org.freedesktop.DBus.Properties.Get \
            string:"org.freedesktop.NetworkManager.Connection.Active" \
            string:"Connection"

method return sender=:1.4 -> dest=:1.379 reply_serial=2
   variant       object path "/org/freedesktop/NetworkManager/Settings/10"

Example #16: Disconnect from the current network connection (auto-reconnect)

$ dbus-send --system --print-reply \
            --dest=org.freedesktop.NetworkManager \
            /org/freedesktop/NetworkManager \
            org.freedesktop.NetworkManager.DeactivateConnection \
            objpath:"/org/freedesktop/NetworkManager/ActiveConnection/18"

method return sender=:1.4 -> dest=:1.370 reply_serial=2

Example #17: Disconnect from the current network connection and stay disconnected

$ dbus-send --system --print-reply \
            --dest=org.freedesktop.NetworkManager \
            /org/freedesktop/NetworkManager/Devices/0 \
            org.freedesktop.NetworkManager.Device.Disconnect

method return sender=:1.4 -> dest=:1.354 reply_serial=2

Example #18: Connect to a specific wireless network

$ dbus-send --system --print-reply \
            --dest=org.freedesktop.NetworkManager \
            /org/freedesktop/NetworkManager \
            org.freedesktop.NetworkManager.ActivateConnection \
            objpath:"/org/freedesktop/NetworkManager/Settings/10" \
            objpath:"/org/freedesktop/NetworkManager/Devices/0" \
            objpath:"/org/freedesktop/NetworkManager/AccessPoint/209"

method return sender=:1.4 -> dest=:1.382 reply_serial=2
   object path "/org/freedesktop/NetworkManager/ActiveConnection/20"

Example #19: Dump all information about a stored connection

$ dbus-send --system --print-reply \
            --dest=org.freedesktop.NetworkManager \
            /org/freedesktop/NetworkManager/Settings/10 \
            org.freedesktop.NetworkManager.Settings.Connection.GetSettings

method return sender=:1.4 -> dest=:1.386 reply_serial=2
   array [## EDITED - it's really long]

Unity Application Indicators using Python 3

Unity replaced the old mess of confusing and inconsistent individual notifications with a standardized set of Application Indicators, each at the head of a customized menu.

Here's how to create your own Indicator and associated menu using Python 3.

Prerequisite packages: python3-gi (plus gir1.2-appindicator3-0.1 for the AppIndicator bindings)

sudo apt-get install python3-gi gir1.2-appindicator3-0.1

Documentation and API reference for python 3 and GTK+ 3.
Documentation for the Unity-specific AppIndicator class.

Hello World: Let's use the project's test script. There are two python scripts; use the one based on PyGI. The other, PyGTK, is deprecated, and Gtk is now included in PyGI.

The Hello World script is a bit long, so I won't copy it here. But it's easy enough to cut-and-paste.

The key points are:

1) Create an Application Indicator object:
      id = "Name of this client application"
      icon_name = "icon-name"       # Icons are in /usr/share/icons/ubuntu-mono-*-16
      category = appindicator.IndicatorCategory.APPLICATION_STATUS
      indicator = appindicator.Indicator.new(id, icon_name, category)

2) Create the associated menu:

    menu = Gtk.Menu()
    item = Gtk.MenuItem("Menu Item #1")
    menu.append(item)
    item = Gtk.MenuItem("Menu Item #2")
    menu.append(item)
    menu.show_all()

3) Attach the menu to the Application Indicator object:

    indicator.set_menu(menu)

4) Make it work:

    indicator.set_status(appindicator.IndicatorStatus.ACTIVE)
    Gtk.main()
Garmin etrex Vista and Linux

I have a new (to me) ancient Garmin etrex Vista handheld GPS ($30, ebay). It has a custom serial connector on the back, takes AA batteries, and happily talks in MGRS in addition to degree-minute-second. MGRS was the key - I need it to be compatible when I go out to the field with the Army Reserve.

Onboard features:
Discovering the Software Version and Unit ID: Main Menu -> Setup -> System -> (Zoom) Software Version.

Problem #1:
The date is showing as 2051. That's a known bug, and this page shows how to fix it by hard-resetting the device.

The gpsbabel package, in the Debian Archives and Ubuntu Repositories, can transfer waypoints, tracks, and routes (but not maps) to/from the device. See the online documentation. It's a command-line application. Here are some example commands:

gpsbabel -D9                                                           # Debug verbose flag
gpsbabel -i garmin,get_posn -f /dev/ttyUSB0 -o kml -F myposition.kml   # Get current position
gpsbabel -o garmin,power_off -F /dev/ttyS0                             # Send power-off command

Thursday, September 6, 2012

US National Weather Service info feeds

I'm looking at updating my old desktop-weather script, so today I researched some of the weather-related information sources I can use.

There is a lot out there, quite a different landscape than three years ago. Location services, for example, to tell you where the system thinks you are, seem to be pretty mature. Gnome includes one based on both IP address and GPS.

In the US, the National Weather Service (NWS) has a nifty service to translate approximate locations (like postal codes) into approximate lat/lon coordinates. This is handy because most of their services use lat/lon coordinates. They have a "use us, but don't overuse us to the point of a Denial-of-Service attack" policy.

The good old METAR aviation weather system keeps chugging along. Indeed, my old desktop script scraped METAR current condition reporting, and I likely will again. It's great for current conditions, and in places where METAR airports are frequent. It's not so good for forecasts or alerts...or for places not close to an airport.

Weather Underground is really incredible. Lots of worldwide data, an apparently stable API for it...but their terms of service seem oriented toward paid app developers. Maybe another time.

Looking at the desktop-weather info (not research or forecasting) systems out there, most seem to use METAR data...often funnelled through an intermediary such as Google.

It's a geographically and systemically fragmented market. So this time let's see how I can improve my existing NWS feed.

1) Geolocation

I change locations. It would be nice if the system noticed.

The National Weather Service (NWS) has a geolocation service...but they don't advertise it.
These services are intended for their customers - don't spam them with unrelated requests!

NWS Geolocation converts a City/State pair, a Zipcode, or an ICAO airport code into the appropriate latitude/longitude pair.

Here's an example of geolocation. Let's use an ICAO airport code (kmke), and see how the server redirects to a URL with Lat/Lon:

$ wget -O - -S --spider
Spider mode enabled. Check if remote file exists.
--2012-10-03 15:08:40--
Resolving (,
Connecting to (||:80... connected.
HTTP request sent, awaiting response... 
  HTTP/1.1 302 Moved Temporarily
  Server: Apache/2.2.15 (Red Hat)
  Content-Type: text/html; charset=UTF-8
  Content-Length: 0
  Cache-Control: max-age=20
  Expires: Wed, 03 Oct 2012 20:09:01 GMT
  Date: Wed, 03 Oct 2012 20:08:41 GMT
  Connection: keep-alive
Location: [following]
Spider mode enabled. Check if remote file exists.
--2012-10-03 15:08:41--
Connecting to (||:80... connected.
HTTP request sent, awaiting response... 
  HTTP/1.1 200 OK
  Server: Apache/2.2.15 (Red Hat)
  Content-Type: text/html; charset=UTF-8
  Cache-Control: max-age=82
  Expires: Wed, 03 Oct 2012 20:10:03 GMT
  Date: Wed, 03 Oct 2012 20:08:41 GMT
  Connection: keep-alive
Length: unspecified [text/html]
Remote file exists and could contain further links,
but recursion is disabled -- not retrieving.

So kmke is located near lat=42.96&lon=-87.9

We can script this to reduce the output. Let's try it with a zipcode:

header=$(wget -O - -S --spider "$zipcode" 2>&1)
radr=$(echo "$header" | grep "Location:" | cut -d'&' -f3 | cut -d'=' -f2)
lat=$(echo "$header" | grep "Location:" | cut -d'&' -f4 | cut -d'=' -f2)
lon=$(echo "$header" | grep "Location:" | cut -d'&' -f5 | cut -d'=' -f2)
echo "Result: $radr  $lat  $lon"
Result: ILN  39.9889  -82.9874

Let's try it with a City, ST pair. Replace all spaces ' ' with '+', and the comma is important!

$ wget -O - -S --spider -q,+CA
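The space-to-plus substitution is a single tr call (the city name here is just an example):

```shell
# Encode a "City, ST" pair for the query string: spaces become '+'
city="San Francisco, CA"
echo "$city" | tr ' ' '+'
# → San+Francisco,+CA
```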

Finally, it also works with zip codes:

$ wget -O - -S --spider -q

Alternative: This is a small script that estimates location, or accepts a manually-entered zipcode for a location. If run during network connection (by Upstart or by the /etc/network/if-up.d/ directory) it will determine an approximate Latitude and Longitude (within a zip code or two, perhaps)...close enough for weather. This assumes, of course, that GPS is not available, and that the system is not travelling far while online.

# Usage $ script zipcode

strip_tags () { sed -e 's/<[^>]*>//g'; }

# Determine latitude and longitude from USA zipcode using the 
# Weather Service's National Digital Forecast Database (NDFD) 
manual_zipcode () {
   xml_location=$(wget -q -O -${1})
   lat=$(echo $xml_location | strip_tags | cut -d',' -f1)
   lon=$(echo $xml_location | strip_tags | cut -d',' -f2); }

# Try to get a close lat/lon using the IP address
ip_lookup () { ip_location=$(wget -q -O - )
   lat=$(echo "$ip_location" | grep "li>Latitude" | cut -d':' -f2 | cut -c2-8)
   lon=$(echo "$ip_location" | grep "li>Longitude" | cut -d':' -f2 | cut -c2-8)
   zipcode=$(echo "$ip_location" | grep "li>Zip or postal code" | cut -d':' -f2 | cut -c2-6 )
   echo "Estimating location as zipcode ${zipcode}";}

# Test that a Zip Code was included in the command.
if [ "$(echo $1 | wc -c)" -eq 6 ]; then
   manual_zipcode $1
   # Test that the manual zipcode is valid.
   if [ $(echo "$lat" | wc -c) -eq 1 ]; then
      echo "$1 is not a valid US zipcode. Trying to calculate based on IP address..."
      ip_lookup
   fi
else
   ip_lookup
fi

echo "Zip code $zipcode is located at approx. latitude $lat and longitude $lon"
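One detail worth noting in that length test: wc -c counts the trailing newline from echo, so a five-digit zipcode measures as 6 characters:

```shell
# A five-digit zip plus the newline from echo = 6 bytes
echo 43210 | wc -c
# → 6
```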

This is just a beginning. It doesn't include Gnome's (or another desktop environment's) geolocation daemon, nor Ubuntu's geolocation IP lookup service, nor caching to prevent repeat lookups, nor associating networks (or other events) with locations.

2) National Weather Service location-based elements.

NWS has many types of feeds, but they are all based on three elements: Current Observations are based on the local Reporting Station. Radar images are based on the local Radar Location. Forecasts and watches/warnings/alerts are based on the State Zone.

There is no simple way to grab those three elements (Reporting Station, Radar Location, Zone), but they are built into the forecast pages, so I wrote a web scraper to figure them out from lat/lon.

# Usage $ script latitude longitude

# Pull a USA National Weather Service forecast page using lat and lon, 
# and scrape weather station, radar, and zone information.
web_page=$(wget -q -O - "${1}&lon=${2}")

Station=$(echo "$web_page" | \
          grep 'div class="current-conditions-location">' | \
          cut -d'(' -f2 | cut -d')' -f1 ) 

Station_Location=$(echo "$web_page" | \
                   grep 'div class="current-conditions-location">' | \
                   cut -d'>' -f2 | cut -d'(' -f1 ) 

Radar=$(echo "$web_page" | \
        grep 'div class="div-full">.*class="radar-thumb"' | \
        cut -d'/' -f8 | cut -d'_' -f1 )

radar_1=$(echo $Radar | tr '[:upper:]' '[:lower:]')
Radar_Location=$(wget -q -O - "${radar_web_page}${radar_1}" | \
                 grep "title>" | \
                 cut -d' ' -f5- | cut -d'<' -f1)

Zone=$(echo "$web_page" | \
       grep 'a href="obslocal.*>More Local Wx' | \
       cut -d'=' -f3 | cut -d'&' -f1)

echo "This location is in Weather Service zone $Zone"
echo "The closest weather station is $Station at $Station_Location"
echo "The closest radar is $Radar at $Radar_Location"

3) Current Conditions

NWS takes current observations at least once each hour. Each reading is released as a METAR report (both raw and decoded) and as a non-METAR report.

Raw METAR reports look like this:

$ wget -q -O -
2012/09/04 03:52
KMKE 040352Z 00000KT 10SM BKN140 BKN250 25/20 A2990 RMK AO2 SLP119 T02500200

Raw METAR can be tough to parse - lots of brevity codes to expand, and the number of fields can be variable. For example, if there are multiple layers of clouds, each gets reported.
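As a tiny illustration (not a full parser), the temperature/dewpoint group in the raw report above is the field shaped like 25/20, in whole degrees Celsius (an M prefix means minus), and can be pulled out with grep:

```shell
# Extract the temperature/dewpoint group from a raw METAR line
metar="KMKE 040352Z 00000KT 10SM BKN140 BKN250 25/20 A2990 RMK AO2 SLP119 T02500200"
group=$(echo "$metar" | grep -o '[0-9M][0-9]/[0-9M][0-9]' | head -1)
echo "Temperature ${group%/*}C, dew point ${group#*/}C"
# → Temperature 25C, dew point 20C
```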

Here's the same observation in decoded format. Note that the raw report is included on the next-to-last line:

$ wget -q -O -
Sep 03, 2012 - 11:52 PM EDT / 2012.09.04 0352 UTC
Wind: Calm:0
Visibility: 10 mile(s):0
Sky conditions: mostly cloudy
Temperature: 77.0 F (25.0 C)
Dew Point: 68.0 F (20.0 C)
Relative Humidity: 73%
Pressure (altimeter): 29.9 in. Hg (1012 hPa)
ob: KMKE 040352Z 00000KT 10SM BKN140 BKN250 25/20 A2990 RMK AO2 SLP119 T02500200
cycle: 4

Raw and decoded METAR reports are available from NWS for all international METAR stations, too. For example, try it for station HUEN (Entebbe airport, Uganda).

Finally, METAR reports are available in XML, too:

$ wget -q -O - ""
<?xml version="1.0" encoding="UTF-8"?>

  <data_source name="metars" />
  <request type="retrieve" />
  <errors />
  <warnings />
  <data num_results="2">
      <raw_text>KMKE 072052Z 36009KT 8SM -RA SCT023 BKN029 OVC047 18/15 A2982 RMK AO2 RAB31 SLP094 P0003 60003 T01780150 53005</raw_text>
      <sky_condition sky_cover="SCT" cloud_base_ft_agl="2300" />
      <sky_condition sky_cover="BKN" cloud_base_ft_agl="2900" />
      <sky_condition sky_cover="OVC" cloud_base_ft_agl="4700" />

Non-METAR reports are somewhat similar to the METAR XML, but there are some important differences. Non-METAR looks like this:

$ wget -q -O -
<current_observation version="1.0" xmlns:xsd="" xmlns:xsi="" xsi:nonamespaceschemalocation="">
 <credit>NOAA's National Weather Service</credit>
 <img />
  <title>NOAA's National Weather Service</title>
 <suggested_pickup>15 minutes after the hour</suggested_pickup>
 <location>Milwaukee, General Mitchell International Airport, WI</location>
 <observation_time>Last Updated on Sep 3 2012, 9:52 pm CDT</observation_time>
        <observation_time_rfc822>Mon, 03 Sep 2012 21:52:00 -0500</observation_time_rfc822>
 <weather>Mostly Cloudy</weather>
 <temperature_string>75.0 F (23.9 C)</temperature_string>
 <wind_string>Southeast at 4.6 MPH (4 KT)</wind_string>
 <pressure_string>1011.4 mb</pressure_string>
 <dewpoint_string>68.0 F (20.0 C)</dewpoint_string>

There's a lot of good stuff here - how often the data is refreshed, and when each hour to do so, all the current observations in a multitude of formats, and even a suggested icon URL and history. However, non-METAR observations tend to update about 10-15 minutes later than METAR reports.

Yeah, the NWS uses (at least) two different XML servers, plus an http server to serve the same current condition observations in (at least) four different formats. I don't understand why, either.

I use the following in my Current Conditions display: Station, Time, Sky Conditions, Temp, Humidity, Wind Direction, Wind Speed. So my script below handles only those fields.

# Usage $ script station [ metar | nonmetar ]
# $1 is the Station Code (KMKE)
# $2 is the metar/nonmetar flag

strip_tags () { sed -e 's/<[^>]*>//g'; }

# The information is the same, but formatted differently.
case $2 in
   metar)
      file=$(wget -q -O -${1}.TXT)
      Observation_zulu=$(echo $file | grep -o "${Station}).* UTC" | cut -d' ' -f14-15)
      Observation_Time=$(date -d "$Observation_zulu" +%H:%M)
      Sky_Conditions=$(echo $file | grep -o "Sky conditions: .* Temperature" | \
                       cut -d' ' -f3- | cut -d'T' -f1)
      Temperature="$(echo $file | grep -o "Temperature: .* F" | cut -d' ' -f2)F"
      Humidity=$(echo $file | grep -o "Humidity: .*%" | cut -d' ' -f2)
      Wind_Direction=$(echo $file | grep -o "Wind: .* degrees)" | cut -d' ' -f4)
      Wind_Speed=$(echo $file | grep -o "degrees) .* MPH" | cut -d' ' -f3-4);;

   nonmetar)
      file=$(wget -q -O -${1}.xml)
      Observation_Time=$(echo $file | grep -o "<observation_time>.*</observation_time>" | cut -d' ' -f7)
      Sky_Conditions=$(echo $file | grep -o "<weather>.*</weather>" | strip_tags)
      Temperature="$(echo $file | grep -o '<temp_f>.*</temp_f>' | strip_tags)F"
      Humidity="$(echo $file | grep -o '<relative_humidity>.*</relative_humidity>' | strip_tags)%"
      Wind_Direction=$(echo $file | grep -o '<wind_dir>.*</wind_dir>' | strip_tags)
      Wind_Speed="$(echo $file | grep -o '<wind_mph>.*</wind_mph>' | strip_tags) MPH";;
esac

echo "Observations at ${1} as of ${Observation_Time}"
Spacer='   '
echo "${Sky_Conditions} ${Spacer} ${Temperature} ${Spacer} ${Humidity} ${Spacer} ${Wind_Direction} ${Wind_Speed}"

The output is very close, but not exactly identical. See the example below - both are based on the same observation, but the humidity and wind information are slightly different. That's not my format error...the data is coming from NWS that way.

$ sh current-conditions KMKE metar
Observations at KMKE as of 11:52
partly cloudy      81.0F     71%     ESE 9 MPH

$ sh current-conditions KMKE nonmetar
Observations at KMKE as of 11:52
Partly Cloudy     81.0F     72%     East 9.2 MPH

Incidentally, this shows that METAR and non-METAR data use the same observation, so there's no improvement using both data sources. METAR updates sooner and has smaller files, but non-METAR is easier to parse.

4) Forecasts 

NWS has three sources for forecasts: Scraping the web pages, downloading normal text, and the National Digital Forecast Database xml server. Scraping the web page is easy, and scraping techniques are well beyond the scope of what I want to talk about. But here's an example of scraping a forecast using lat/lon:

$ wget -q -O - "" | grep -A 10 '"point-forecast-7-day"'
<ul class="point-forecast-7-day">
<li class="row-odd"><span class="label">This Afternoon</span> A 20 percent chance of showers and thunderstorms.  Mostly sunny, with a high near 85. Southeast wind around 5 mph. </li>
<li class="row-even"><span class="label">Tonight</span> A 30 percent chance of showers and thunderstorms after 1am.  Mostly cloudy, with a low around 69. Calm wind. </li>
<li class="row-odd"><span class="label">Wednesday</span> Showers and thunderstorms likely.  Mostly cloudy, with a high near 83. Southwest wind 5 to 10 mph.  Chance of precipitation is 70%. New rainfall amounts between a quarter and half of an inch possible. </li>
<li class="row-even"><span class="label">Wednesday Night</span> Mostly clear, with a low around 59. Northwest wind 5 to 10 mph. </li>
<li class="row-odd"><span class="label">Thursday</span> Sunny, with a high near 78. Northwest wind 5 to 10 mph. </li>
<li class="row-even"><span class="label">Thursday Night</span> A 20 percent chance of showers.  Partly cloudy, with a low around 60. West wind around 5 mph. </li>
<li class="row-odd"><span class="label">Friday</span> A 30 percent chance of showers.  Mostly cloudy, with a high near 73. West wind around 5 mph becoming calm  in the afternoon. </li>
<li class="row-even"><span class="label">Friday Night</span> A 30 percent chance of showers.  Mostly cloudy, with a low around 57. Calm wind becoming north around 5 mph after midnight. </li>
<li class="row-odd"><span class="label">Saturday</span> A 20 percent chance of showers.  Mostly sunny, with a high near 69.</li>
<li class="row-even"><span class="label">Saturday Night</span> Partly cloudy, with a low around 56.</li>

The same information is available in a much smaller file by downloading the forecast text for the zone. The text isn't as pretty, but extra information (like the release time) is included. Reformatting to lower case takes a bit of sed work:

$ wget -q -O -
FPUS53 KMKX 041354 AAA
854 AM CDT TUE SEP 4 2012

854 AM CDT TUE SEP 4 2012
15 MPH. 

Finally, the National Digital Forecast Database is a server that organizes many discrete bits of forecast data, like the high temperature for a specific 12-hour period, or the probability of precipitation. It's the data without all the words. Each forecast element applies to a 5km square and a 12-hour period. For example, try looking at this 7-day snapshot of a specific lat/lon (it's too big to reproduce here):

$ wget -q -O - ""

All the elements of the URL are described here. It's a really amazing system, though not oriented toward the casual end user. At the same time, some pieces may be very useful, like future watches/warnings, the 12-hour probability of precipitation, expected highs and lows, etc. Here's an example script to download and reformat the straight text for a zone forecast, followed by a sample run. All the reformatting has been moved into functions. You can see how it's not difficult to separate and reformat the various forecast elements.

# Usage $ script zone
# Example: $ script wiz066
# $1 is the zone (wiz066)

# Starts each forecast with a "@"
# Replace ". ."  with "@" This separates the second and subsequent forecasts.
# Replace the single " ."  with "@" at the beginning of the first forecast.
# Trim the "$$" at the end of the message. 
separate_forecasts () { sed -e 's|\. \.|@|g' \
                            -e 's| \.|@|g' \
                            -e 's| \$\$||g'; }
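To see the separator in action on a made-up fragment of zone text (a hypothetical sample, not real NWS output), pick out the second "@"-delimited forecast:

```shell
# Turn the NWS paragraph separators into '@' and drop the trailing $$
separate_forecasts () { sed -e 's|\. \.|@|g' \
                            -e 's| \.|@|g' \
                            -e 's| \$\$||g'; }

sample='854 AM .TONIGHT...RAIN. .WEDNESDAY...SUN. $$'
echo "$sample" | separate_forecasts | cut -d'@' -f2
# → TONIGHT...RAIN
```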

# Make uppercase into lowercase
# Then recapitalize the first letter in each paragraph.
# Then recapitalize the first letter in each new sentence.
# Then substitute a ":" for the "..." and capitalize the first letter.
lowercase () { tr '[:upper:]' '[:lower:]' | \
               sed -e 's|\(^[a-z]\)|\U\1|g' \
                   -e 's|\(\.\ [a-z]\)|\U\1|g' \
                   -e 's|\.\.\.\([a-z]\)|: \U\1|g'; }
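To see what the lowercase filter does in isolation (redefined here so the snippet stands alone; note that \U case conversion is a GNU sed extension):

```shell
# Lowercase everything, recapitalize sentence starts, and turn "..." into ": "
lowercase () { tr '[:upper:]' '[:lower:]' | \
               sed -e 's|\(^[a-z]\)|\U\1|g' \
                   -e 's|\(\.\ [a-z]\)|\U\1|g' \
                   -e 's|\.\.\.\([a-z]\)|: \U\1|g'; }

echo "TONIGHT...PARTLY CLOUDY. LOWS IN THE 60S." | lowercase
# → Tonight: Partly cloudy. Lows in the 60s.
```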

State=$(echo $1 | cut -c1-2)
raw_forecast=$(wget -q -O -${State}/${1}.txt)
for period in 1 2 3 4 5 6 7 8 9 10 11 12 13 14; do
   echo ""
   if [ ${period} -eq 1 ]; then
      header=$(echo $raw_forecast | separate_forecasts | cut -d'@' -f${period})
      header_zulu="$(echo $header | cut -d' ' -f4 | cut -c3-4):$(echo $header | cut -d' ' -f4 | cut -c5-6)Z"
      issue_time="$(date -d "$header_zulu" +%H:%M)"
      expire_time="$(echo $header | cut -d':' -f2 | cut -c9-10):$(echo $header | cut -d':' -f2 | cut -c11-12)"
      echo "Issue Time ${issue_time}, Expires ${expire_time}"
   else
      echo $raw_forecast | separate_forecasts | cut -d'@' -f${period} | lowercase
   fi
done
echo ""

And when you run it, it looks like:
$ sh forecast wiz066

Issue Time 15:35, Expires 09:15

Tonight: Partly cloudy until early morning then becoming mostly cloudy. Chance of thunderstorms in the evening: Then slight chance of thunderstorms in the late evening and overnight. Lows in the upper 60s. Southwest winds up to 5 mph. Chance of thunderstorms 30 percent

Wednesday: Thunderstorms likely. Highs in the lower 80s. Southwest winds 5 to 10 mph. Chance of thunderstorms 70 percent

Wednesday night: Partly cloudy. Lows in the lower 60s. Northwest winds 5 to 10 mph

Thursday: Sunny. Highs in the upper 70s. Northwest winds up to 10 mph

Thursday night: Partly cloudy through around midnight: Then mostly cloudy with a 20 percent chance of light rain showers after midnight. Lows in the upper 50s. West winds up to 5 mph

Friday: Mostly cloudy with chance of light rain showers and slight chance of thunderstorms. Highs in the lower 70s. Chance of precipitation 30 percent

Friday night: Mostly cloudy with a 40 percent chance of light rain showers. Lows in the upper 50s

Saturday: Mostly sunny with a 20 percent chance of light rain showers. Highs in the upper 60s

Saturday night: Mostly clear. Lows in the mid 50s

Sunday: Sunny. Highs in the lower 70s

Sunday night: Mostly clear. Lows in the upper 50s

Monday: Sunny. Highs in the lower 70s

Monday night: Partly cloudy with a 20 percent chance of light rain showers. Lows in the upper 50s

5) Radar Images

NWS radar stations are evenly spread across the United States, to form a (more-or-less) blanket of coverage in the lower 48 states, plus spots in Alaska, Hawaii, Guam, Puerto Rico, and others.

Radars have their own station codes that may not correspond with local reporting stations. Each radar takes about 10 seconds to complete a 360 degree sweep. NWS radar images are in .png format, and are a composite of an entire sweep. The images are subject to ground clutter, humidity, smoke, and other obscurants; NWS does not clean these up. An animated (time-lapse) radar is just a series of images. NWS releases updated images irregularly, about every 5-10 minutes.

There are three scales: local, regional composite, and national composite. The larger ones are just the locals quilted together. For our purpose, I think the locals are adequate. There are two ways to download radar images - the Lite method and the Ridge method. Both use the same data, create the same 600x550 pixel image, and can be used in animated loops. Lite images have no options - you get the current radar image in .png format (27k) with a fixed set of overlays that you cannot change. It's a very easy way to get a fast image. Here's a Lite image example of base reflectivity:

$ wget

In the Lite URL, which ends in /N0R/MKX_0.png: N0R is the radar image type (one of the N** options), MKX is the radar station, and 0.png is the latest image (1.png is the next oldest).
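A tiny sketch of assembling that image path from its parts (hypothetical variable names; the host portion of the URL is omitted here, as in the post):

```shell
# Build the Lite image path from its components
station=MKX   # radar station
type=N0R      # base reflectivity
frame=0       # 0 = latest image, 1 = next oldest
echo "${type}/${station}_${frame}.png"
# → N0R/MKX_0.png
```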

For more customized maps, the Ridge method provides a series of hideable overlays (base map, radar image, title and legend, county/state lines, major highways, etc). For an example of Ridge in action, see this page. When downloading images, it's a little more complex - each overlay must be downloaded separately, then combined on your system (not at NWS). Here's an example of a script that caches the overlays that don't change (counties), and compares the server update time with its own cached version to avoid needless downloads.

# Usage $ script radar_station [clear-radar-cache]
# Example: $ script MKX
# Example: $ script MKX clear-radar-cache

#$1 is the radar station (MKX)
#$2 is a flag to clear the cache of radar overlays. Most overlays don't 
#   change, and don't need to be re-downloaded every few minutes.

# Cache directory for the overlays and images
cache=/tmp/radar-cache
# Test for the clear-cache-flag. If so, delete the entire cache and exit.
[ "$2" = "clear-radar-cache" ] && echo "Clearing cache..." && \
                                  rm -r /tmp/radar-cache && \
                                  exit 0

# Test that the radar cache exists. If not, create it.
[ -d ${cache} ] || mkdir ${cache}

# Test for each of the overlays for the N0R (Base Reflectivity) radar image.
# If the overlay is not there, download it.
[ -f ${cache}/${1}_Topo_Short.jpg ] || wget -q -P ${cache}/${1}_Topo_Short.jpg
[ -f ${cache}/${1}_County_Short.gif ] || wget -q -P ${cache}/${1}_County_Short.gif
[ -f ${cache}/${1}_Highways_Short.gif ] || wget -q -P ${cache}/${1}_Highways_Short.gif
[ -f ${cache}/${1}_City_Short.gif ] || wget -q -P ${cache}/${1}_City_Short.gif

# Test for the radar timestamp file. Read it. If it doesn't exist, create it.
[ -f ${cache}/radar_timestamp ] || echo "111111" > ${cache}/radar_timestamp
latest_local=$(cat ${cache}/radar_timestamp)

# Get the latest radar time from the server and compare it to the latest known.
# This avoids downloading the same image repeatedly.
radar_time_string=$(wget -S --spider${1}_N0R_0.gif 2>&1 | \
                    grep "Last-Modified:" | cut -d':' -f2-)
radar_time=$(date -d "$radar_time_string" +%s)
echo "Current image is ${radar_time}, cached is ${latest_local}"

# If the local timestamp is different from the server,
# Download a new image and update the timestamp file.
# Then create a new final radar-image.gif file.
if [ "${radar_time}" -ne "${latest_local}" ]; then
   echo "Downloading updated image..."
   echo "${radar_time}" > ${cache}/radar_timestamp

   # Delete the old radar, warning, and legend layers, and replace them.
   [ -f ${cache}/${1}_N0R_0.gif ] && rm ${cache}/${1}_N0R_0.gif
   wget -q -O ${cache}/${1}_N0R_0.gif${1}_N0R_0.gif
   [ -f ${cache}/${1}_Warnings_0.gif ] && rm ${cache}/${1}_Warnings_0.gif
   wget -q -O ${cache}/${1}_Warnings_0.gif${1}_Warnings_0.gif
   [ -f ${cache}/${1}_N0R_Legend_0.gif ] && rm ${cache}/${1}_N0R_Legend_0.gif
   wget -q -O ${cache}/${1}_N0R_Legend_0.gif${1}_N0R_Legend_0.gif

   # Delete the old final radar-image. We are about to replace it.
   [ -f ${cache}/radar-image.jpg ] && rm ${cache}/radar-image.jpg

   # Create the final radar-image using imagemagick.
   composite -compose atop ${cache}/${1}_N0R_0.gif ${cache}/${1}_Topo_Short.jpg ${cache}/radar-image.jpg
   composite -compose atop ${cache}/${1}_County_Short.gif ${cache}/radar-image.jpg ${cache}/radar-image.jpg
   composite -compose atop ${cache}/${1}_Highways_Short.gif ${cache}/radar-image.jpg ${cache}/radar-image.jpg
   composite -compose atop ${cache}/${1}_City_Short.gif ${cache}/radar-image.jpg ${cache}/radar-image.jpg
   composite -compose atop ${cache}/${1}_Warnings_0.gif ${cache}/radar-image.jpg ${cache}/radar-image.jpg
   composite -compose atop ${cache}/${1}_N0R_Legend_0.gif ${cache}/radar-image.jpg ${cache}/radar-image.jpg

   echo "New radar image composite created at ${cache}/radar-image.jpg"
fi

exit 0

And here's the result of running the script with $ sh radar MKX: a composited radar image for the MKX (Milwaukee/Sullivan) station.

Another handy use of imagemagick is padding the sides or top/bottom of an image: it enlarges the canvas without distorting or resizing the original, so you can shift the image around the desktop instead of leaving it centered. This is especially handy so a top menu bar doesn't block the date/time title.

For example, this imagemagick command will add a transparent bar 15 pixels high to the top of the image, changing it from 600px wide by 550px tall to 600x565. See these instructions for more on how to use splice.

convert image.jpg -background none -splice 0x15 image.jpg

6) Warnings and Alerts

Warnings and Alerts are easily available in RSS format based on zone. For example, here is a zone with two alerts:
$ wget -q -O - ""

<?xml version = '1.0' encoding = 'UTF-8' standalone = 'yes'?>
<!--
This atom/xml feed is an index to active advisories, watches and warnings
issued by the National Weather Service.  This index file is not the complete
Common Alerting Protocol (CAP) alert message.  To obtain the complete CAP
alert, please follow the links for each entry in this index.  Also note the
CAP message uses a style sheet to convey the information in a human readable
format.  Please view the source of the CAP message to see the complete data
set.  Not all information in the CAP message is contained in this index of
active alerts.
-->

<feed xmlns:cap="urn:oasis:names:tc:emergency:cap:1.1" xmlns:ha="" xmlns="">

<!-- TZN = <cdt> -->
<!-- TZO = <-5> -->
<!-- http-date = Wed, 05 Sep 2012 11:49:00 GMT -->
<generator>NWS CAP Server</generator>

<title>Current Watches, Warnings and Advisories for Upper Jefferson (LAZ061) Louisiana Issued by the National Weather Service</title>
<link href=""></link>

<entry>
<title>Flash Flood Watch issued September 05 at 6:49AM CDT until September 05 at 12:00PM CDT by NWS</title>
<link href=""></link>
<cap:event>Flash Flood Watch</cap:event>
<cap:areadesc>LAZ069; Lower Jefferson; Lower St. Bernard; Orleans; Upper Jefferson; Upper Plaquemines; Upper St. Bernard</cap:areadesc>
<value>022051 022071 022075 022087</value>
<value>LAZ061 LAZ062 LAZ063 LAZ064 LAZ068 LAZ069 LAZ070</value>
</entry>

<entry>
<title>Heat Advisory issued September 05 at 4:30AM CDT until September 05 at 7:00PM CDT by NWS</title>
<link href=""></link>
<cap:event>Heat Advisory</cap:event>
<cap:areadesc>LAZ069; Lower Jefferson; Lower Lafourche; Lower St. Bernard; Orleans; St. Charles; St. James; St. John The Baptist; Upper Jefferson; Upper Lafourche; Upper Plaquemines; Upper St. Bernard</cap:areadesc>
<value>022051 022057 022071 022075 022087 022089 022093 022095</value>
<value>LAZ057 LAZ058 LAZ059 LAZ060 LAZ061 LAZ062 LAZ063 LAZ064 LAZ067 LAZ068 LAZ069 LAZ070</value>
</entry>

Each alert is within its own <entry>, and has <severity>, <published>, <updated>, and <expires> tags, among other cool info. Unlike radar, you can't check a timestamp to see if anything is new before downloading the whole feed, so tracking the various alerts must be done by your system.
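As a quick sketch of pulling a single tag's text out of an entry (the sample entry below is abbreviated, and its <severity> value is hypothetical, but the tag-stripping trick is the same one used in the script further down):

```shell
#!/bin/sh
# Extract one tag's text from a chunk of Atom XML using grep and sed.
# The sample entry is abbreviated; the <severity> value is hypothetical.
entry='<entry>
<title>Heat Advisory issued September 05 at 4:30AM CDT</title>
<cap:event>Heat Advisory</cap:event>
<severity>Minor</severity>
</entry>'

# Find the line containing the tag, then strip all markup from it.
get_tag () { echo "$entry" | grep "<$1>" | sed -e 's/<[^>]*>//g'; }

event=$(get_tag "cap:event")
severity=$(get_tag "severity")
echo "${event} (severity: ${severity})"
```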

That VTEC line seems pretty handy - a standard set of codes that explains most of the event, and includes a reference number. See here for more VTEC information.
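To make that concrete, here is a minimal sketch of splitting a P-VTEC string into its dot-separated fields with cut. The field meanings follow the published VTEC format; the specific event string below is made up:

```shell
#!/bin/sh
# Sketch: split a P-VTEC string into its fields. Sample string is hypothetical.
vtec="/O.NEW.KLIX.FF.A.0005.120905T1149Z-120905T1700Z/"

fields=$(echo "$vtec" | tr -d '/')        # drop the surrounding slashes
action=$(echo "$fields" | cut -d'.' -f2)  # NEW, CON, EXT, CAN, EXP...
office=$(echo "$fields" | cut -d'.' -f3)  # issuing office (KLIX = New Orleans)
phenom=$(echo "$fields" | cut -d'.' -f4)  # FF = Flash Flood
signif=$(echo "$fields" | cut -d'.' -f5)  # A = Watch, W = Warning, Y = Advisory
number=$(echo "$fields" | cut -d'.' -f6)  # event tracking number

echo "${office} event ${number}: ${phenom}.${signif} (${action})"
```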

Here's a sample script that caches alerts. When a new alert comes in, it pops up a notification. It also tracks which active alerts have already been notified, so you don't get spammed each time the script runs.

#!/bin/sh
# Usage $ script zone
# Example: $ script WIZ066

# $1 is the zone (WIZ066)
# Cache location - any writable directory will do
cache=/tmp/alert-cache-${1}

strip_tags () { sed -e 's/<[^>]*>//g'; }

# Get the RSS feed of active alerts in zone
alerts=$(wget -q -O - "${1}&y=0")

# No alerts - if a cache exists, delete it.
if [ $(echo "$alerts" | grep -c "There are no active watches") -eq "1" ]; then
  echo "No active alerts in zone ${1}"
  [ -d ${cache} ] && rm -r ${cache}/
  exit 0
fi

# Get the number of active alerts
num_of_alerts=$(echo "$alerts" | grep -c "<entry>")
echo "${num_of_alerts} active item(s)"

# Test for an existing cache. If lacking, create one.
# Create a list of cached alert ids. Each cached alert's filename is the id.
[ -d ${cache} ] || mkdir ${cache}
cached_alerts=$(ls ${cache})

# Loop through each online alert
for entry_startline in $(echo "$alerts" | grep -n "<entry>" | cut -d':' -f1); do
   alert=$(echo "$alerts" | tail -n +${entry_startline} | head -n 32)
   alert_id=$(echo "$alert" | grep "<id>" | strip_tags | cut -d"." -f8)
   alert_title=$(echo "$alert" | grep "<title>" | strip_tags )

   # Test if the alert is already cached.
   if [ $(echo "${cached_alerts}" | grep -c "${alert_id}") -eq 1 ]; then

      # The alert already exists. Do not notify it or re-cache it.
      echo "Alert ${alert_id}, ${alert_title} has already been notified."
   else

      # New alert. Notify and cache
      alert_body=$(echo "$alert" | grep "<summary>" | strip_tags )
      raw_alert_issued=$(echo "$alert" | grep "<published>" | strip_tags )
      alert_issued=$(expr $(expr $(date +%s) - $(date -d "${raw_alert_issued}" +%s)) / 60 )
      echo "New ${alert_title} issued ${alert_issued} minute(s) ago"
      notify-send "${alert_title}" "${alert_body}"
      echo "${alert}" > ${cache}/${alert_id}
   fi
done

# Loop through each item in the cache, and ensure it's not expired.
# If expired, delete it.
for alert in ${cached_alerts}; do
   raw_expire_time=$(cat ${cache}/${alert} | grep "<cap:expires>" | strip_tags)
   [ $(date -d "${raw_expire_time}" +%s) -le $(date +%s) ] && rm ${cache}/${alert}
done

exit 0
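Since the script above both caches new alerts and expires old ones each time it runs, it works well on a timer. One way is a crontab entry that checks every five minutes (the script path and name here are just examples):

```
# m h dom mon dow command
*/5 * * * * /usr/local/bin/ WIZ066
```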

Tuesday, August 28, 2012

Kernel hibernate bug

Bah. Got bit by a kernel bug preventing wakeup from hibernation.
3.0.0-12-generic wakes up fine.
3.2.0-30-generic doesn't wake up.
Seeing lots of flurry about various similar kernel bugs. So much activity that I'm not going to add to the confusion by trying to troubleshoot or see if it's already reported or fixed upstream. I'll just stay downgraded for now...

Since Ubuntu doesn't remove old kernels automatically, downgrading is easy: simply remove the newest kernel and run update-grub so it boots from the last (known working) kernel. Then reboot.

sudo apt-get remove linux-image-3.2.0-30-generic linux-headers-3.2.0-30*
sudo update-grub
sudo shutdown -r now 
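Before removing anything, it helps to see exactly which kernel packages are installed. A small sketch of filtering dpkg-style output for installed ("ii") packages; the sample lines imitate real dpkg -l output (on a real system you'd pipe dpkg -l 'linux-image-*' instead):

```shell
#!/bin/sh
# Sketch: keep only installed ("ii") packages from dpkg-style output.
# The sample text imitates 'dpkg -l' output; versions are examples.
sample='ii  linux-image-3.0.0-12-generic  3.0.0-12.20  Linux kernel image
rc  linux-image-3.2.0-30-generic  3.2.0-30.48  Linux kernel image'

installed=$(echo "$sample" | awk '/^ii/ {print $2}')
echo "Installed: ${installed}"
```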

External C-Media USB speakers on Ubuntu 11.10

I have a small-but-spiffy set of USB speakers, handy for travel. Here's how I got them to work.

First of all, there are a *lot* of moving pieces, and eliminating many potential sources of the problem (no sound emerges from the speakers) is critical to success.

For example, the speakers work on another system, so they aren't broken.

Other USB devices work in the port, so there's not a hardware defect.

Sound in general works on the laptop.

dmesg shows the speakers recognized appropriately when plugged in:
$ dmesg
[245788.608422] usb 1-1: ath9k_htc: USB layer initialized
[245789.124045] usb 3-1: new full speed USB device number 7 using ohci_hcd
[245789.261042] ADDRCONF(NETDEV_UP): wlan1: link is not ready
[245789.331223] input: C-Media USB Headphone Set   as /devices/pci0000:00/0000:00:13.1/usb3/3-1/3-1:1.3/input/input11
[245789.331380] generic-usb 0003:0D8C:000C.0003: input,hidraw0: USB HID v1.00 Device [C-Media USB Headphone Set  ] on usb-0000:00:13.1-1/input3

(If the hardware were not recognized, that would be a driver [kernel module] issue.)

alsa shows the hardware properly recognized and configured. Alsa also creates a GUI mixer for the USB "card" when it's plugged in:
$ aplay -l
**** List of PLAYBACK Hardware Devices ****
card 0: SB [HDA ATI SB], device 0: ALC272X Analog [ALC272X Analog]
  Subdevices: 1/1
  Subdevice #0: subdevice #0
card 1: Set [C-Media USB Headphone Set], device 0: USB Audio [USB Audio]
  Subdevices: 1/1
  Subdevice #0: subdevice #0

Pulseaudio offers a good clue. It shows the USB card, but there is no sink for the USB card's output. Aha!
$ pacmd list-cards
Welcome to PulseAudio! Use "help" for usage information.
>>> 2 card(s) available.
    index: 0
 owner module: 4
               # It's the onboard audio
               # ...
 active profile: 
  alsa_output.pci-0000_00_14.2.analog-stereo/#0: Internal Audio Analog Stereo
  alsa_output.pci-0000_00_14.2.analog-stereo.monitor/#0: Monitor of Internal Audio Analog Stereo
               # ...
    index: 7
 owner module: 35
  alsa.card = "1"
  alsa.card_name = "C-Media USB Headphone Set"
  alsa.long_card_name = "C-Media USB Headphone Set at usb-0000:00:13.1-1, full speed"
  alsa.driver_name = "snd_usb_audio"
  device.bus_path = "pci-0000:00:13.1-usb-0:1:1.0"
  sysfs.path = "/devices/pci0000:00/0000:00:13.1/usb3/3-1/3-1:1.0/sound/card1" = "usb-0d8c_C-Media_USB_Headphone_Set-00-Set"
  device.bus = "usb" = "0d8c" = "C-Media Electronics, Inc." = "000c" = "Audio Adapter"
  device.serial = "0d8c_C-Media_USB_Headphone_Set"
  device.form_factor = "headphone"
  device.string = "1"
  device.description = "Audio Adapter"
  module-udev-detect.discovered = "1"
  device.icon_name = "audio-headphones-usb"
  output:analog-stereo: Analog Stereo Output (priority 6000)
  output:analog-stereo+input:analog-mono: Analog Stereo Output + Analog Mono Input (priority 6001)
  input:analog-mono: Analog Mono Input (priority 1)
  off: Off (priority 0)
 active profile: 
  analog-output-speaker: Analog Speakers (priority 10000, available: unknown)
  analog-input-microphone: Analog Microphone (priority 8700, available: unknown)
                # See? No output sink. 

There is probably a way out of this using output profiles, but I don't understand those yet. Instead, what appears to be needed is a udev rule that creates an output sink for the USB card, then redirects the default stream to use the USB card.

Upon plugging in the USB speakers, that udev rule should make the speakers the default sound device. Upon unplugging the speakers, the card and sink's disappearance will automatically force Pulseaudio to reroute the stream defaults to the onboard audio card and speakers.
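For reference, the udev rule itself might look like the sketch below, saved in /etc/udev/rules.d/. The vendor and product IDs (0d8c:000c) come from the generic-usb line in the dmesg output above; the rule filename and script path are assumptions:

```
# /etc/udev/rules.d/99-usb-speakers.rules (hypothetical filename)
ACTION=="add", SUBSYSTEM=="sound", ATTRS{idVendor}=="0d8c", ATTRS{idProduct}=="000c", RUN+="/usr/local/bin/"
```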

Figure out the Pulseaudio index number of the USB card:
$ pacmd list-cards \
   | grep 'alsa.card_name = "C-Media USB Headphone Set"' --before-context=6 \
   | grep index \
   | cut -d' ' -f6


Figure out the alsa hardware number:
$ aplay -l | grep C-Media | cut -d' ' -f2 | cut -d: -f1


Add an alsa sink for the USB Card:
$ pacmd load-module module-alsa-sink device=hw:1,0

Figure out the Pulseaudio index number of the new alsa sink:
$ pacmd list-sinks | grep 'alsa.card_name = "C-Media USB Headphone Set"' --before-context=37 | grep index | cut -d' ' -f6


Change the default to the new sink
$ pacmd set-default-sink 2

Since the USB speakers are MUCH LOUDER than the onboard speakers, start them at a much lower level!
amixer -c 1 sset "Speaker" 5

So the final script should look something like:
#!/bin/sh
# This script makes the C-Media external USB speakers the
# default audio output. It should be run by udev.

# Figure out the Pulseaudio index number of the USB card
card_index=$( pacmd list-cards \
              | grep 'alsa.card_name = "C-Media USB Headphone Set"' \
              --before-context=6 | grep index | cut -d' ' -f6 )
echo "The USB card index is $card_index"

# Figure out the alsa hardware number
alsa_hardware=$( aplay -l | grep C-Media | cut -d' ' -f2 | cut -d: -f1 )
echo "The alsa hardware number is $alsa_hardware"

# Test if a USB sink already exists. If not, create one
if ! pacmd list-sinks | grep -q C-Media ; then
   # Add an alsa sink for the USB Card
   pacmd load-module module-alsa-sink device=hw:${alsa_hardware},0
   echo "New alsa sink added"
fi

# Figure out the Pulseaudio index number of the alsa sink
alsa_sink_index=$( pacmd list-sinks \
                  | grep 'alsa.card_name = "C-Media USB Headphone Set"' \
                  --before-context=50 | grep index | cut -d' ' -f6 )
echo "Alsa sink index is $alsa_sink_index"

# If the current default sink is not the USB, change it
current_sink_index=$(pacmd list-sinks | grep '*' | cut -d' ' -f5)
if [ "$current_sink_index" -ne "$alsa_sink_index" ]; then
   # Change the default to the alsa sink
   pacmd set-default-sink ${alsa_sink_index}
   echo "Default sink changed from $current_sink_index to $alsa_sink_index"
else
   echo "Default sink not changed - it is already $current_sink_index"
fi

# Set the USB speaker levels very low to start
amixer -c ${card_index} sset "Speaker" 5
echo "Speaker volume changed"

exit 0
Save the script, and remember to make it executable with chmod +x. Test it a bit. When the speakers are plugged in and the script is run manually, the audio from the USB speakers is excellent. After unplugging, sound reverts to the onboard speakers. This is not like a headphone jack - the audio must be stopped and restarted to use the changed default sink.