19.02.2010 05:21
Notes on Alpine and Topal
Almost 9 months ago I wrote an article
about Alpine
and GnuPG. That article covers basics of
integrating Topal
and Alpine; here I'd like to add more notes and talk about some
usage scenarios. But first I should mention that if you are
an Arch Linux user
the topal
package is now available in the AUR. I'm the
maintainer and I would appreciate any feedback; so far it has only
one vote, and I'm hoping for more.
I remember trying to switch to GPG2 back when I was still using
the PinePG filter, and it was not as easy as I had hoped.
Official support was not there, and I'm sorry I failed to mention
in my last article that Topal works fine with gpg2. You can set
"gpg-binary=gpg2" in your ~/.topal/config file to
switch. Note that the old gpg option
"--no-use-agent/--use-agent" makes no difference; gpg2 always
requires the agent. What this means for you is that gpg2 will try to
start the PIN entry dialog every time it needs the passphrase (even
when you don't use the agent normally), so if you don't
run X11, set "pinentry-program"
to /usr/bin/pinentry-curses in
your ~/.gnupg/gpg-agent.conf file. If you would like to start
using the GnuPG Agent, instruct Topal to always connect to it with
"use-agent=3", and read my article on
the GnuPG
Passphrase Agent.
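Putting the pieces above together, the relevant bits of the two files could look like this (a sketch covering only the options mentioned; paths may differ on your system):

```
# ~/.topal/config
gpg-binary=gpg2
use-agent=3

# ~/.gnupg/gpg-agent.conf (for console sessions without X11)
pinentry-program /usr/bin/pinentry-curses
```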
Do you remember the big SHA1 scare last year? Many people
generated new keys, and many more dumped SHA1 for good. My article on
GnuPG
basics also covered personal cipher options, and forcing stronger
digests. However, Topal's "gpg-options" setting by default
includes --no-options, which instructs gpg2 not to read
your ~/.gnupg/gpg.conf file, so it falls back to SHA1
for signing. You should remove it in order to use your personal digest
and cipher algorithms; it's also useful because of other options
(like auto-key-retrieve, if you want to fetch missing
keys). You will find my own GnuPG configuration files in
my dotfiles
repository.
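For reference, once --no-options is gone a gpg.conf along these lines will force stronger digests and fetch missing keys. The exact preference lists here are my own picks, adjust to taste:

```
# ~/.gnupg/gpg.conf
personal-digest-preferences SHA256 SHA512
cert-digest-algo SHA256
personal-cipher-preferences AES256 AES192 AES
# fetch missing keys automatically when verifying signatures
keyserver-options auto-key-retrieve
```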
In the past year I wrote
several articles
on GnuPG and they bring a lot of visitors here. Some recurring
Google searches are (more or less) "sharing private keys" and
"splitting gpg keys". There's already a very popular
(although outdated) article
on the subject, so head over there. But if you are an Alpine
user, read on. The general scenario is this: you read your mail on a
remote server, which you can't trust as much as your workstation (or
removable storage). Reading mail on one host, verifying it and sending
from another, maintaining multiple key-rings, or even placing the
primary key on both machines... is tedious and risky. Using subkeys as
explained in that article is one way of solving the problem, but with
Topal you don't need to, thanks to its remote and server modes of
operation.
Each time Topal is invoked you can select the remote
mode of operation. In remote mode Topal will connect to your trusted
machine with SSH, transfer any files necessary
with SCP and then perform the requested GnuPG action. To
make use of the remote mode you need the SSH daemon running on the
secure machine and Topal started there in server mode ($ topal
-server). To make the whole procedure as transparent as possible
you can
employ SSH
public key authentication.
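Setting that up boils down to a few commands; host and user names here are placeholders:

```
$ ssh-keygen -t rsa                 # on the mail host, generate a key pair
$ ssh-copy-id user@trusted-host     # install the public key on the trusted machine
$ ssh user@trusted-host             # verify it now logs in without a password
```

With the key in place, and topal -server running on the trusted machine, the remote mode works without any extra prompts.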
03.02.2010 19:50
Awesome window manager revisited
One year ago I wrote
a small
article about my usage of
the awesome window
manager. Specifics of awesome and benefits of using it
(or tiling window managers in general) I would rather leave for
another article. In this one I'd like to revisit some of my projects
and code related to awesome. Last year I had just upgraded from
v2 to v3, which introduced the Lua programming language as a way
of configuring and extending awesome. My goal then was to recreate
the environment I had with v2. One of the major obstacles was
replacing the Ruby widget library called Amazing with
a Lua one called Wicked. I still remember my first widgets:
knowing very little Lua, I had to resort to Awk to grab battery
or mbox information. At the time I had just started visiting the
#awesome IRC channel
on OFTC. I clearly remember
someone saying "it would be nicer if it was in Lua"...
A year has passed, so what has
changed? Previously
I wrote about vicious,
a modular widget library written in Lua which builds on the
foundations laid down by Wicked. I had certain ideas about
widgets that were not shared by a lot of people, so I had to do it for
myself. Making Wicked modular would have been a big design
change, and on top of that I wasn't confident enough in my Lua so I
decided not to contribute back, but to create a new project. Now I am
very satisfied with how it turned out; I'm happy with the code and with
the contributions of other users. The result is a series of Lua modules that
gather data about your system, basically system monitors like those
provided by Conky... at the moment we use them to
feed awesome widgets but they could be used in other places
just as easily. For example, one could use them to populate
the Ion window
manager status-bar. I made the project public sometime in June; it
now counts 25+ widget types and gets 10+ downloads daily. It's hard to
make an estimate about the number of actual users, but the code was
downloaded well over 700 times.
Since I published the vicious git repository I wanted to use
the git web interface for more than just those few files, so I put my
awesome configuration in git and started pushing the
changes. This easy access, a lot of custom (and well commented) code
and my solutions to various usability problems quickly made
my awesome-configs
repo into a very popular starting point for new awesomers. It gets
almost as many clone requests and downloads as the vicious
repository. My Zenburn
theme also became very popular, in fact so popular that
from v3.4 it is a part of the awesome
distribution. That's not all I contributed to the awesome
tree, in recent months I started sending more and more code
contributions... I contributed to other open source projects but I'm
very proud of being a part of this one. It has a lot of users, most of
whom are experienced Unix users with an interest in
improving their productivity and desktop usability. As someone said on
IRC just yesterday "awesome is the ultimate sysadmin
console".
One of my modules that is just gaining some attention is
the Scratchpad
manager. It brings back functionality that was present in v2, but
also expands on it by providing a drop-down application
manager, contributed by the author
of Wicked. Former Ion users will also be familiar
with the scratch.pad functionality, while
the scratch.drop module allows users to have their favorite
terminal emulator, or application launcher like gmrun, slide
in from a screen edge. Another useful module that can be found in my
repo is
the On-Screen
Keyboard, initially written by another awesome user,
which I ported to v3.4. You can see it in action
in this
screenshot.
Finally let's see what other users have been up to. The author
of Wicked
wrote Eminent -
a dynamic tagging library (its functionality will be familiar
to WMII users)
and Rodentbane
- which allows for rapid control of the mouse pointer using only the
keyboard. Other notable modules
are Revelation
(implementing OS X-like
Exposé), Shifty
(dynamic tagging with advanced client matching)
and Obvious
(another widget library). With this I conclude my little tour of
planet awesome.
27.01.2010 03:54
His Last Bow
I watched the Sherlock Holmes reboot. It is a movie that
provides solid entertainment; a dark mystery, fantastic London scenery
and a good director. The plot and characters are no different than
anything else coming out of Hollywood these days, so it is
bearable... but when I combine what I just said with the fact that the
story revolves around one of my favorite characters ever, I can't
help but be utterly disappointed.
Reinventing, or better said rewiring, the character is acceptable... a
darker Holmes, filthy, unshaven and manic is legitimate. But they had
to poison it with Hollywood macho bullshit, which is especially hard
to stand. Holmes even has a woman in this story; he who is "not a
whole-souled admirer of womankind", that is just
preposterous. The movie ends by hinting there will be a sequel and I
hope it will be better than this. But now I am certain there will
never be a better Holmes than the role played
by Jeremy
Brett in the 1984 Granada TV series. He is, and will remain
without any doubt, The Holmes.
I love that adaptation, and enjoy watching those episodes; he
adds so much to the character: the short bursts of laughter, the mood
swings and his eccentric hand gestures... Watching Jeremy Brett play
is rewarding; he was Holmes, truly living the role and completely
absorbed in it. The Wikipedia article explains it all, and
sheds more light on what became of Mr. Brett. Even though Granada
filmed a lot of episodes they did not cover all the
stories. Fortunately we can get them all today in a single volume, my
edition is The Complete Stories, and they are also available
on-line
at Project
Gutenberg.
23.10.2009 22:14
Filesystem encryption 6 months later
Some 6 months ago
I
wrote about file-system encryption on GNU/Linux and my software of
choice,
eCryptfs. The entry was
followed by an
article
describing my setup, which I updated today. I learned from my
mistakes and those of others, and the article now introduces a pretty
different setup. Besides these changes I also did minor updates from
time to time, like adding information on two-factor
authentication and tmpfs with the purpose of further
securing your setup. If you by any chance followed my initial article,
converting your current setup can be done in minutes, assuming you
haven't already found your own path to a better practice.
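For the impatient: on Ubuntu the ecryptfs-setup-private helper does the whole job, and mounting a directory by hand is a single (interactive) command. Treat this as a sketch, the setup article has the full story:

```
$ sudo mount -t ecryptfs ~/.private ~/private
```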
Ubuntu remains the only distribution with a completely
integrated eCryptfs setup. Integration in their latest release is
excellent. I spent a lot of time with eCryptfs and Ubuntu developers,
and learning from their experience was invaluable. I also saw a lot of
Ubuntu users who lost their data (through their own fault). Which brings
me to stability and reliability of eCryptfs.
I must say that living with eCryptfs has been excellent. It is reliable
and the performance hit was always minimal. At the time I started using
it the only published benchmark was from Phoronix. The results were
pretty good; most of the time the performance impact was less
than 2%. Last week they published an
updated
benchmark and this time the results are not looking too good. I can't
say what happened; I never noticed it in my own day to day operation,
but Phoronix is more or less a credible source. It could be worse
though, just days ago LUKS users
were locked out of
their homes.
Linux Magazine published a big
article on eCryptfs
just yesterday. It was written
by Dustin Kirkland,
eCryptfs developer, and it's one of the best on the subject.
21.10.2009 22:34
Cloud Atlas Sextet
I didn't write anything for a month; I was reading a lot. I found some
good books, but one of them I can't get rid of: I think
about Cloud
Atlas every day.
This book by David Mitchell is composed of several shorter
stories, all connected by subtle links. None of them were bad; some
were good, but some were pure genius. It starts in the 19th century,
goes through the 20th across a few stories, and then moves all the
way forward to the end of civilization. The story following the
journalist Luisa Rey was an interesting thriller. The adventures
of Timothy Cavendish had me turning pages in suspense and
laughing like crazy on occasion. The story about Robert
Frobisher, the composer, was beautiful, but the one about clones
in the future superstate of Korea was genius.
I found a copy in the local library after failing to find an e-book in
English. You should invest some time in reading it; even if you don't
like it, it is undeniably an epic novel.
17.09.2009 21:59
Notes on lighttpd and git
Much to my
surprise
vicious
became very popular in just over a month. About a dozen people grab
the code every day, and they should be able to preview the repo and
browse through it, so I decided to serve it
with cgit, a fast web
interface for git. Connecting it to lighttpd
revealed some quirks, so I'll describe my setup in short.
On my web server all web sites are served from "/var/www" so
when building cgit I decided to install it in
"/var/www/cgit":
$ make CGIT_SCRIPT_PATH=/var/www/cgit

The cgit binary, the default css style sheet and the logo will be stored there; the configuration file is "/etc/cgitrc" and the cache (if enabled later) is stored in "/var/cache/cgit".
After creating a new sub-domain I proceeded to configure lighttpd.
# {{{ git.webhost.tld
$HTTP["host"] == "git.webhost.tld" {
    ## - force redirect HTTP to HTTPS
    #$HTTP["scheme"] == "http" {
    #    url.redirect = ("" => "https://${url.authority}${url.path}${qsa}")
    #}
    server.document-root = "/var/www/cgit"
    index-file.names     = ( "cgit.cgi" )
    cgi.assign           = ( "cgit.cgi" => "/var/www/cgit/cgit.cgi" )
    url.rewrite-once     = (
        # - main Cgit worker that maps repositories and commits
        "^/([^?/]+/[^?]*)?(?:\?(.*))?$" => "/cgit.cgi?url=$1&$2",
    )
}
# }}}

Before restarting lighttpd I made some quick changes to the default cgitrc. Since web sites are stored in "/var/www" I decided to keep public git repos in "/var/git" (as you might notice from the example repo below). Here are only the most relevant parts. Pay attention to virtual-root which, together with the above rewrite line, fixes the cgit cache; otherwise it would constantly serve one and the same page.
# URL used as root for all cgit links
# - fixes caching with the above rewrite
virtual-root=/

# Specify some default clone prefixes
# - repos are served only through http(s)
clone-prefix=http://git.sysphere.org

# Specify the css url
css=/cgit.css

# Use a custom logo
#logo=https://sysphere.org/images/cgit.png
logo=/cgit.png

# Set a custom footer message instead of default "generated by..."
footer=footer.html

## List of repositories
repo.url=myproject
repo.path=/var/git/myproject.git
repo.desc=my project that does something interesting
repo.owner=user
repo.readme=README.html

Serving it this way had other quirks besides the cache problem. The png logo cgit tried to serve as yet another repo, so I had to link to it directly. The css file on the other hand was OK, and the custom footer too.
Since I'm talking about git again I'll add a few notes on top of my previous article about it. For years I've been sharing my dotfiles through a simple directory index, and the most popular of them I would convert to HTML. I was getting tired of the whole process (even though Emacs makes it a bit easier with htmlize and scpaste), and now that I'm keeping my dotfiles in git anyway I decided to make that repo public too.
If you read my previous article it is evident that my dotfiles repo could be full of sensitive information; a lot of dotfiles contain passwords these days. Once you get that information in there and publish it (e.g. by mistake) it is hard to get it out. I thought about it for a few days and tried a few approaches. Maybe it would be best to strip all sensitive information and keep a separate repo, but that would require twice as much work, provided I was willing to stay on top of it.
Long story short, I eventually created a new branch called public in my dotfiles.git repo. I push only the public branch to the server and of course I'm careful that it stays clean of all sensitive data. When something changes in master, and it's worth publishing, I only cherry-pick specific commits.
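A toy version of that workflow looks like this; the repository, branch contents and file names are all made up for the demo:

```shell
# toy repo standing in for dotfiles.git; all names here are hypothetical
git init -q pubdemo
git -C pubdemo symbolic-ref HEAD refs/heads/master   # pin the branch name
git -C pubdemo config user.email demo@example.com
git -C pubdemo config user.name demo
echo 'set nocompatible' > pubdemo/vimrc
git -C pubdemo add vimrc && git -C pubdemo commit -qm 'add vimrc'
git -C pubdemo branch public             # the only branch ever pushed
echo 'password hunter2' > pubdemo/netrc
git -C pubdemo add netrc && git -C pubdemo commit -qm 'private: netrc'
echo 'syntax on' >> pubdemo/vimrc
git -C pubdemo add vimrc && git -C pubdemo commit -qm 'update vimrc'
git -C pubdemo checkout -q public
# publish only the safe commit; the netrc one never leaves master
git -C pubdemo cherry-pick $(git -C pubdemo log master --grep='update vimrc' --format=%H)
git -C pubdemo log --format=%s           # update vimrc / add vimrc
```

The public branch ends up with the vimrc history but never sees the netrc commit, which is exactly the property I want before pushing.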
23.08.2009 02:23
Notes on dotfiles and git
Most applications I use daily are running on my workstation and all
dotfiles are written, edited and kept there. From the workstation they
are distributed to other machines. What usually happens is that I do
massive syncing and then neglect remote systems, sometimes even for
months. I'm always trying to improve my environment, so I would often
end up with completely different working environments on each
machine. I needed a
better solution and git seemed
perfect for the job. I guess other distributed version control
systems would be just as good but I was already familiar with git. I
won't cover the actual git commands, by this
point tutorials are just about
everywhere, I will rather explain my work-flow.
There are many approaches; the most popular seem to be keeping your
dotfiles in a repo and symlinking them back to $HOME, or
keeping your whole $HOME in git and ignoring almost
everything. Some people also use a bit of trickery to change the git
work tree while keeping the repo in another directory and some employ
specific tools
like git-home.
After reviewing all the popular solutions I decided to go with another
approach. I created a repo in the "~/dotfiles.git" directory
and copied all important dotfiles there. I then use a script
called gitup
that for each file (or directory) in the current directory checks if
there is a dotfile with the same name in $HOME. If the file
exists and it changed the script copies it over. An acquaintance uses
this approach to easily share his dotfiles
via github. It seemed cleaner than symlinks to me, but
it does require more work.
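The idea can be sketched in a few lines of shell; the layout and naming here (repo files stored without the leading dot) are assumptions, and the real gitup script may well differ:

```shell
# toy layout standing in for $HOME and the dotfiles repo
mkdir -p home repo
echo 'set nocompatible' > home/.vimrc    # the live dotfile, edited in place
echo 'stale contents'   > repo/vimrc     # the tracked copy, now out of date
# for each tracked file, pull in the $HOME version if it changed
for f in repo/*; do
    dot="home/.$(basename "$f")"
    [ -e "$dot" ] || continue            # no matching dotfile: skip
    cmp -s "$dot" "$f" || cp "$dot" "$f" # copy only on content change
done
cat repo/vimrc                           # now matches home/.vimrc
```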
I do all modifications on the actual files and later
use gitup to get the repo up to date. After I commit they can
be distributed further. When I'm setting up a new machine, or an
account, I clone the workstation repo directly. Afterwards I
can push from the workstation or pull from remote
machines when a change happens. The workstation has only the master
branch but each remote machine is different and some need to have
specific local changes. When that happens I create a new
local branch and do my changes there. Most of the time I
"live" in the local branch; when something eventually changes
in master I rebase on top of it. The whole setup requires some
work and it's not perfect, but it certainly beats everything else I
tried.
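Here is a toy run of that flow, with local directories standing in for the workstation and a remote machine (all names are made up):

```shell
# "station" stands in for the workstation repo, "laptop" for a remote machine
git init -q station
git -C station symbolic-ref HEAD refs/heads/master   # pin the branch name
git -C station config user.email demo@example.com
git -C station config user.name demo
echo 'PS1="\u@\h \$ "' > station/bashrc
git -C station add bashrc && git -C station commit -qm 'initial dotfiles'
git clone -q station laptop              # setting up a new machine
git -C laptop config user.email demo@example.com
git -C laptop config user.name demo
git -C laptop checkout -q -b local       # machine-specific branch
echo 'set number' > laptop/vimrc
git -C laptop add vimrc && git -C laptop commit -qm 'local: vimrc tweaks'
# meanwhile master moves forward on the workstation...
echo 'export EDITOR=vim' >> station/bashrc
git -C station add bashrc && git -C station commit -qm 'set EDITOR'
# ...so on the remote machine, fetch and replay local changes on top
git -C laptop fetch -q origin
git -C laptop rebase -q origin/master
git -C laptop log --format=%s
```

After the rebase the local branch carries its machine-specific commit on top of everything new in master, which is the state I keep each remote machine in.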
15.08.2009 04:53
Notes on Alpine and CRM114
When I wrote about my
personal
e-mail solution I mentioned
that CRM114 Discriminator
can be used as a (solid) SPAM filter. I've been using it for about 3
years now, and it has served me well. I must admit I never achieved the
unbelievable 99.9% accuracy they claim but it is a good
solution nonetheless. It is fast, lightweight, scalable and
flexible.
I use procmail, which pipes all e-mail through crm114; it is
installed system wide, but I keep all "*.crm" and
"*.css" files in the "~/.crm114" directory. If you
decide to try it you should know that it's very well documented, and
the HOWTO document in particular will get you started in no
time. There is no need for me to describe the installation and setup
process here. What I will talk about is using crm114 together with
Alpine.
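For the curious, the delivery chain is roughly this kind of procmail recipe. The mailreaver.crm script and the X-CRM114-Status header are standard CRM114 conventions, but treat the exact rules as a sketch rather than my literal setup:

```
# pipe every message through crm114, which adds X-CRM114-Status headers
:0fw
| /usr/bin/crm -u $HOME/.crm114 mailreaver.crm

# file anything it flagged as SPAM
:0:
* ^X-CRM114-Status:.*SPAM
spam
```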
With the recent "BlameThorstenAndJenny" release the format of
.css files has changed for the first time since I've been
using it. I decided to rebuild them from scratch. The old ones were
not giving perfect results, as I said, and hopefully I can do a better
job this time around. Since I get huge amounts of SPAM daily, the
first day was a little scary but I managed to get on top of it
quickly. I decided not to use mailtrainer.crm, which can be
used to train huge amounts of e-mail at once. Instead I rewrote some
of my old scripts and did it "manually". The first few batches of SPAM
and HAM that I received I trained with
my crmtrain
script. I used the Export function in Alpine to export each
wrongly classified (or unsure) e-mail to the "~/spam"
directory and the script took it from there.
Now that things have slowed down I use two scripts to train the filter
directly from Alpine. I use the Pipe function in Alpine to pipe
each wrongly classified (or unsure) e-mail to the respective
script. The crmspam
script trains SPAM,
while crmham
does so for HAM e-mail. I saw that mutt users have a similar
setup where they send each e-mail back to themselves, but this time
telling the filter how to flag it correctly. Only piping it, while
stripping the old CRM114 headers, seems a bit faster and simpler.
14.08.2009 23:53
Vicious widgets for awesome WM 2.0
It's been a month since
my announcement
of the vicious library. I kept myself busy and there
are 24 widgets in there now... and I'm all out. I went over
those from the Todo list and even over
the Wishlist. The last changes were dedicated to some cleanup,
and I removed padding completely. I have no use for it, and if you
do I'm sure you'll manage to reuse the old code. If you used it only
in a widget or two think about using "string.format"
instead. More and more people requested the tarball every day and
users started sending me e-mail. After a while I gave up and
put vicious
in git due to popular demand. I serve a bare repo over
http now. First, an updated summary:
* Original code modularized
* Widgets ported from Wicked:
  - CPU, Date, FS, MEM, MPD, NET, Uptime
* Widgets written for Vicious:
  - Battery, Battery (acpitool), CPU Information, CPU Frequency,
    Disk I/O, Entropy, HDD Temperature, Mbox, Mbox Count, Maildir,
    Org-Mode, Pacman, Sysload, Thermal, Volume, Weather, Wireless

However, as the removal of padding shows, I am still writing this primarily for myself. There could be more major changes in vicious soon, or maybe none at all, since I'm not very pleased with the direction of awesome 3.4. I'm waiting to see how things will work out. Some parts of vicious will need to be completely rewritten in case I do switch. I also had other important decisions to make while writing all these widgets, especially about using libraries that are not in the standard Lua distribution. For example, the mail widgets for counting messages in both mbox and maildir folders could really benefit from the LuaFileSystem library. But I decided against it for the time being. However, the next development cycle, for 3.4, could include them.
In the last article I mentioned the obvious project, another modular widget library. They have a very different design; as mentioned, they handle all errors, format the output, set up buttons and timers, progress-bars and graphs... The project saw some major development recently, and unfortunately they kept making it more complex, going to the opposite extreme from vicious. Recently one vicious user reported a problem with the cpufreq module. It turned out his hardware does not support voltage scaling, yet I had bundled it together with frequency scaling. We had a short discussion and I told him: "with vicious I am giving each user only the framework to create his own stable configuration".
27.07.2009 04:26
The Secret Supper
I read some really good books recently, great stories from Spanish
authors. I'll mention the last one
first. The
Secret Supper by Javier Sierra is a wonderful crypto
treat. I really enjoyed the story, and all the "secret"
messages and early steganography made it into something
special. Since Leonardo Da Vinci plays a big role in the book
many people associated the work with Dan Brown and The Da
Vinci Code - this book is nothing like it, it functions on a
completely different level.
In January of 1497, Fray Augustin Leyre, a Dominican
Inquisitor and an expert on the interpretation of secret messages is
sent to supervise Leonardo Da Vinci's last touches to The Last
Supper painting. He was sent by Alejandro VI who had
heard that Da Vinci was painting the twelve apostles without their
halo of sanctity, that the chalice was missing, and that Leonardo had
painted himself in the painting with his back to Jesus. This could
have sent him to the inquisition. Why then did he do this?
Was Leonardo Da Vinci a heretic?
I felt that some of the puzzles were solved a bit too easily, but I
still enjoyed each one. Spending some time with Leonardo, and getting
inside the greatest mind that ever lived, was also a great experience.
I found
a good resource while reading the book: using
Arounder you can see
panoramic views of both the
church Santa
Maria delle Grazie
and The
Last Supper painting.