A few weeks ago there were a few cold and almost cloudless nights. Using the same setup described previously I captured images with a Raspberry Pi and glued them together.
More pictures from 2019 are in this Google album.
Using the excellent Heatflask you can visualize my cycling and running around Reading and Kiel.
Bike rides are blue and runs are red.
This August I managed to get a better picture of the Airbus Beluga in Hamburg. This amazing plane looks about as airworthy as a bumblebee, and just as unlikely to fly.
From the tower in Laboe I took a picture of the Kiel lighthouse with an approaching DFDS ferry.
And on the way we were treated to a nice sunset in Hoek van Holland onboard the Stena Britannica.
There are more pictures in my Google Photos album.
I wrote this a few years ago for my website, which I have now migrated to WordPress.
Soon after Google Latitude started I signed up for it and sporadically updated my location. This worked slightly better with more advanced phones, but until I got my hands on a Nexus S in autumn 2010 it always stayed very patchy. Just before Christmas 2012 a friend pointed me to Google Takeout, which lets you download all the data Google has stored about you, including the Latitude data if you have signed up for it. I had previously wondered how I could display my annual travel and had dabbled with downloading parts of my entire history in KML format from within Latitude. This, however, looked like it would make the whole project much simpler. So, I downloaded my data and thought I would quickly make a little map.
The first disappointment was that the Takeout data was in JSON format, and looking at it was not easy … After searching around for a while I found this script that converted the JSON to KML. At last I could see my track in Google Earth. Unfortunately, Latitude includes quite a bit of erroneous data, so the next step was to find a program that lets you edit a GPX trace. Again, it took a while to find a program that I liked; eventually I found gpsprune. Now, the problem was that the entire annual data set was a bit large, so I used gpsbabel to break it down into days.
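For the curious, the conversion step could be sketched along these lines with jq. This is only a rough stand-in for the script I linked to, and it assumes the latitudeE7/longitudeE7/timestampMs field names of the later Takeout exports, which the 2012 Latitude dump may not have used:

#!/bin/bash
# Hypothetical sketch: turn Takeout location JSON into a single GPX track.
{
  echo '<?xml version="1.0" encoding="UTF-8"?>'
  echo '<gpx version="1.1" creator="takeout2gpx"><trk><trkseg>'
  jq -r '.locations[] |
    "<trkpt lat=\"\(.latitudeE7/1e7)\" lon=\"\(.longitudeE7/1e7)\">" +
    "<time>\(.timestampMs|tonumber/1000|floor|todate)</time></trkpt>"' \
    LocationHistory.json
  echo '</trkseg></trk></gpx>'
} > latitude.gpx

With a GPX file like that in hand, the day-splitting script looks like this: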
#!/bin/bash
FirstDay=2011-12-24
for day in `seq 0 366`
do
  begindate=`date -d "$FirstDay +$(( ${day} ))days" +%Y%m%d`00
  enddate=`date -d "$FirstDay +$(( ${day} +1 ))days" +%Y%m%d`00
  echo "$day $begindate $enddate"
  gpsbabel -t -i gpx -f ../latitude-ordered.gpx -x transform,trk=wpt,del -x track,split,start=$begindate,stop=$enddate -o gpx -F ../split/$begindate.gpx
done
Finally, I could go through every day and delete points that were wrong. I soon noticed that Latitude keeps reporting the same wrong data, so another script was used to scrub out those points.
#!/bin/bash
GPSBABEL=/usr/local/bin/gpsbabel
file=$1
LAT=12.345
LON=6.7890
LAT0=1.2345
LON0=67.8901
$GPSBABEL -i gpx -f $file -x transform,wpt=trk,del -x radius,distance=0.2K,lat=$LAT,lon=$LON,nosort,exclude -x radius,distance=0.2K,lat=$LAT0,lon=$LON0,nosort,exclude -x transform,trk=wpt,del -o gpx -F tmp.gpx
rm $file
mv tmp.gpx $file
exit
A few days (or weeks) later I had cleaned the entire data set and glued the traces back together with gpsbabel.
#!/bin/bash
touch join.gpx  # actually this needs to be a file that has valid GPX but no points
                # in it. Otherwise gpsbabel throws a wobbly. I simply copied the
                # first gpx file and then deleted all track points.
for file in *.gpx
do
  echo $file
  gpsbabel -t -i gpx -f $file -i gpx -f join.gpx -x track,merge,title="2012" -o gpx -F join-tmp.gpx
  mv join-tmp.gpx join.gpx
done
exit
I could finally look at the result in Google Earth. It is a good idea to reduce the number of points a little, which is easily achieved with gpsbabel.
#!/bin/bash
GPSBABEL=/usr/local/bin/gpsbabel
$GPSBABEL -i gpx -f $1 -x transform,wpt=trk,del -x transform,trk=wpt,del -x simplify,count=7500 -x transform,trk=wpt,del -o kml,lines=1,points=1,line_width=4,trackdirection=1,labels=1 -F simple.kml
$GPSBABEL -i gpx -f $1 -x transform,wpt=trk,del -x transform,trk=wpt,del -x simplify,count=7500 -o gpx -F simple.gpx
exit
The final product still has a few glitches which are mainly due to me being a bit slow to switch on data roaming or because Latitude failed (I believe there are a few days in 2012 where this was the case for my account).
I have finally found some time to use my Raspberry Pi as a timelapse camera to capture some startrails. I found this useful page showing all the details needed to set the Pi camera to a maximum exposure of 6 s. I got the best results by setting the shared video memory to the maximum setting of 256MB; otherwise the image numbers would have gaps.
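For reference, the memory split can be changed through raspi-config or by editing /boot/config.txt directly; a single line does it, followed by a reboot:

# /boot/config.txt: give the GPU a 256MB share of the memory
gpu_mem=256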
Startrails taken with a Raspberry Pi
Now, I just need to repeat this next time there is good weather and new moon. Preferably I’m going to do this in a slightly darker spot with less light pollution.
I pointed /etc/rc.local to this script (adapted from the above-mentioned website):
#!/bin/bash
sleep 120                 # give the Pi time to finish booting
cd /home/pi/starttrails
# burst-mode timelapse: 6 s exposures (-ss is in microseconds), fixed white
# balance and ISO, running for 4 hours (-t is in milliseconds)
raspistill -bm -tl 0 -v -set -ISO 800 -awb off -awbg 1,1 -t 14400000 -ss 6000000 -o %06d.jpg
shutdown -hP now          # power down once the capture run is finished
exit
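The corresponding entry in /etc/rc.local might look roughly like this; capture.sh is just a placeholder name for the script above, and rc.local still needs to finish with exit 0:

# /etc/rc.local: launch the capture script in the background at boot
/home/pi/starttrails/capture.sh &
exit 0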
The pictures were assembled with the following lines:
#!/bin/bash
# trail.jpg is a copy of the first image 000000.jpg
for f in 0*.jpg
do
  echo $f
  # "lighten" keeps the brighter pixel from either image, so the star trails add up
  convert trail.jpg $f -gravity center -compose lighten -composite -format jpg trail.jpg
done
exit
I have also turned the stills into a short timelapse video:
~/bin/ffmpeg-3.1.4-64bit-static/ffmpeg -r 15 -i %06d.jpg -vcodec libx264 trail.mp4
I finally got round to writing up the analysis of the energy consumption of our home. Using almost two years’ worth of consumption and temperature readings, I work out the leakage of our house and look at the seasonality of our electricity consumption.
This is more a memo to self than a blog post. For an up-to-date map for the Garmin eTrex 30, go to Garmin Openstreetmap and download the required map. Then copy the file onto the SD card into a top-level folder called Garmin; my Great Britain map has the filename gmapsupp_GB.img, for instance. And that’s it: the map can now be enabled from the appropriate map menu. Routing works, but takes interesting turns for longer distances.
Similarly, tracks go into Garmin/gpx. They can be selected as routes from the menu. The file Barry_s_Bristol_Ball_Buster.gpx gets displayed as Barry's Bristol Ball Bust for selection.
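With the card mounted on a Linux machine, copying both map and track boils down to something like the following; the mount point is only an example, adjust it to wherever your card appears:

# example only: copy the OSM map image and a GPX track onto the eTrex SD card
mkdir -p /media/$USER/GARMIN_SD/Garmin/gpx
cp gmapsupp_GB.img /media/$USER/GARMIN_SD/Garmin/
cp Barry_s_Bristol_Ball_Buster.gpx /media/$USER/GARMIN_SD/Garmin/gpx/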
Last weekend we had potato pancakes to use up our surplus potatoes from Christmas. They were particularly yummy with German bacon and apple sauce from our own garden.
(Photo from the 2016 album)
The recipe for potato pancakes is simple:
Simply fry the mixture until golden brown and it’s done. They taste nice with apple sauce or any other sweet thing, like honey, jam or chocolate spread.
The cold weather earlier this week produced a stunning sunrise on Wednesday morning. Below is a picture I took from the Thames Path in Reading.
(Photo from the 2016 album)
For a while I had been contemplating replacing my 2008 MacBook (4,1) as it was stuck on OS X 10.6. But I couldn’t find any new affordable laptop that took my fancy (too expensive, too big, too small, just not right). I remembered reading about running Linux on MacBooks a while ago and thought: why don’t I try Linux?
I downloaded the Linux Mint 17.1 ISO (Cinnamon flavour) and followed the instructions to make a bootable USB stick.
I took out the internal hard disk and replaced it with an old one I had kicking about. After that, I plugged in the USB stick, rebooted the MacBook while holding down the ALT key, and after choosing the correct device (on the second attempt) the machine booted into Linux. I followed a set of instructions, most of which I ended up ignoring as they were not applicable. To get the WiFi working, this line
apt-get install firmware-b43-installer
sufficed. Another irritation is that the touchpad doesn’t quite work, but the interwebs have solutions for that as well. The relevant config file appears to live in
/usr/share/X11/xorg.conf.d/50-synaptics.conf
You can experiment with the settings using synclient.
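For example, listing and tweaking options on the fly looks like this; the exact option names depend on the synaptics driver version:

synclient -l                      # list all current touchpad settings
synclient TapButton1=1            # one-finger tap acts as a left click
synclient VertTwoFingerScroll=1   # enable two-finger vertical scrolling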
I also added the two lines
coretemp
applesmc
to /etc/modules so they are loaded at boot.
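To pick the modules up without rebooting and check that the sensors show up (sensors is part of the lm-sensors package):

sudo modprobe coretemp   # CPU core temperature sensors
sudo modprobe applesmc   # Apple SMC: fan speeds and further temperatures
sensors                  # print the current readings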
And it’s a good idea to switch off hardware acceleration in Chrome, as the screen otherwise goes black when watching YouTube videos.
After a weekend of testing, I splashed out on a 1TB SSD, put that in the MacBook, and now have a fairly zippy 4GB MacBook running a modern OS. Even Google’s TensorFlow installed without a hitch.
Between 22 November and 6 December 2015 there were 2068 failed attempts to log in to my machine exposed on the interwebs. I excluded the ones due to me being unable to type my super-safe password correctly. This equates to about 6 attempts per hour on a machine that is not widely advertised. Presumably, this is mainly due to random attacks that you can also investigate with a network telescope; in fact, my machine is a network telescope of sorts. Looking at the frequency of usernames used in failed attempts (or attempted break-ins, if you like), root is the most popular choice and the top 20 are:
| User Name | Count |
|-----------|-------|
| root      | 1272  |
| admin     | 207   |
| ubnt      | 106   |
| user      | 69    |
| pi        | 64    |
| support   | 30    |
| adm       | 25    |
| test      | 22    |
| zhangyan  | 20    |
| dff       | 18    |
| guest     | 15    |
| reception | 15    |
| a         | 9     |
| oracle    | 9     |
| from      | 7     |
| aaron     | 6     |
| ftp       | 6     |
| cisco     | 5     |
| ftpuser   | 5     |
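Counts like the ones above can be pulled straight out of the SSH log with a one-liner along these lines; this is only a sketch, since the log location and message wording vary between distributions:

# count usernames in failed SSH login attempts, most frequent first
grep "Failed password" /var/log/auth.log \
  | grep -oP "for (invalid user )?\K\S+" \
  | sort | uniq -c | sort -rn | head -20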
Plotting the rank of a user name against its frequency on logarithmic axes, we get the all-too-familiar picture of a heavy-tailed distribution or power law:
However, the x-axis here barely covers two decades, so one should be careful about declaring that we see a power law with a gradient of about -1.5. Another way to visualise the data is to treat the usernames used in chronological order (excluding repeats) as nodes in a network. Even this very small sample produces a surprisingly complicated graph.
This graph was produced by the twopi program of graphviz, which gave the prettiest result. The most connected nodes are root and admin. Is this useful? One interesting way to approach this question would be to compare this graph and the analysis above with a data set that contains only valid and successful logins. If the data looks suitably different, one could use this approach to get alerted to unwanted behaviour. Unfortunately, I do not have access to such a data set.
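For anyone wanting to reproduce something similar, a digraph for twopi can be put together roughly like this; usernames.txt is a hypothetical file holding one username per attempt in chronological order:

# build edges between consecutive (distinct) usernames and render with twopi
{
  echo 'strict digraph logins {'
  uniq usernames.txt | awk 'NR>1 {printf "  \"%s\" -> \"%s\";\n", prev, $0} {prev=$0}'
  echo '}'
} > logins.dot
twopi -Tpng logins.dot -o logins.png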
I was also intrigued to see the user name pi feature quite high in the charts; ten years ago it would have been much less popular.