Jan 19 2017
 

Entropy.

Any serious cryptographic routine needs a good source of random numbers, and whilst Linux provides a random number generator by default, its sources of entropy can be somewhat limited. Especially when you’re talking about a virtual machine.

Indeed if you try to pull too much randomness out of the Linux entropy pool (particularly when it is already limited), what you get might not be quite as random as you expect.

Which is where hardware random number generators come in. And I finally have one (actually two), and have hooked them up. You may be able to guess what time I plugged it in from the graph below :-

So what real world difference does it make?

Well nothing is dramatically obvious, but :-

  1. I have slightly more confidence that any cryptographic software I might run has a good source of randomness and is less likely to accidentally perform poorly (in terms of cryptographic strength).
  2. Some cryptographic software blocks if the Linux entropy pool is empty; with a hardware source I can be more confident that any performance issues are not due to a lack of randomness.
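
For the record, the general shape of hooking such a device up is something like this — a sketch rather than an exact record of what I did, assuming the generator shows up as /dev/hwrng (device names vary) and that Debian’s rng-tools (with its HRNGDEVICE setting) is used to feed it into the kernel pool :-

apt-get install rng-tools
# Tell rngd where the hardware generator lives (the path is an assumption; adjust to suit)
echo 'HRNGDEVICE=/dev/hwrng' >> /etc/default/rng-tools
/etc/init.d/rng-tools restart
# Confirm the kernel pool is being topped up
cat /proc/sys/kernel/random/entropy_avail
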
Oct 17 2012
 

I have recently become interested in the amount of entropy available in Linux and decided to spend some time poking around on my Debian workstation, specifically looking to increase the amount of entropy available to improve the speed of random number generation. There are a variety of ways of accomplishing this, including hardware devices (some of which cost rather too much for a simple experiment).

Eh?

Linux has a device (/dev/random) which makes random numbers available to software packages that really need access to a high quality source of random numbers. Any decently written cryptographic software will use /dev/random (and not /dev/urandom, which does not generate “proper” random numbers of the same quality) to implement encryption.
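
A quick, purely illustrative way to see the two devices side by side :-

# A few bytes from the blocking device; this may pause if the entropy pool is low
head -c 16 /dev/random | od -An -tx1
# The same from the non-blocking device; this never waits
head -c 16 /dev/urandom | od -An -tx1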

Using poor quality random numbers can potentially result in encryption not being secure. Or perhaps more realistically, because Linux waits until there is sufficient entropy available before releasing numbers through /dev/random, software reading from that device may be subject to random stalling. Not necessarily long enough to cause a major problem, but perhaps enough to have an effect on performance.

Especially for a server in a virtualised environment!
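
You can get a rough feel for the stalling yourself; on an entropy-starved machine the following can pause for a surprisingly long time, whereas on a machine with plenty of entropy it returns almost instantly :-

# Time a modest read from the blocking device
time head -c 128 /dev/random > /dev/null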

Adding Entropy The Software Way (haveged)

HAVEGED is a way of using processor flutter to add entropy to the Linux /dev/random device. It can be installed relatively easily with :-

apt-get install haveged
/etc/init.d/haveged start

As soon as this was running, the amount of entropy available (cat /proc/sys/kernel/random/entropy_avail) jumped from several hundred to close to 4,000.
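
If you want to watch the pool being topped up in real time, something like this does the job (watch comes from the procps package) :-

# Re-check the kernel's entropy estimate every second
watch -n1 cat /proc/sys/kernel/random/entropy_avail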

Now does this increased entropy have an effect on performance? Copying a CD-sized ISO image file using ssh :-

Default entropy: 29.496
With HAVEGED:    28.636

A 2% improvement in performance is hardly dramatic, but every little bit helps, and the effect may well be larger on a server which regularly exhausts its entropy.
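
For anyone wanting to run a similar comparison, the test amounted to a timed copy of a CD-sized image over ssh; something along these lines will do, although the file name and host here are placeholders rather than what I actually used :-

# Time a copy of a CD-sized ISO over ssh (placeholder file and host)
time scp image.iso user@server:/tmp/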

Checking The Randomness

But hang on … more important than performance is the randomness of the numbers generated. And you cannot mess with the generation of random numbers without checking the results. The first part of checking the randomness is making sure you have the right tools installed :-

apt-get install rng-tools

Once installed you can test the current set of random numbers :-

dd if=/dev/random bs=1k count=32768 iflag=fullblock | rngtest

This produces a whole bunch of output, but the key bits are the “FIPS 140-2 failures” and “FIPS 140-2 successes”; if you have too many failures, something is wrong. For the record, my failure rate is 0.05% with haveged running (without: tests ongoing).
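
For a point of comparison, the same test can be pointed at /dev/urandom, which never blocks and so runs rather more quickly :-

# Feed the non-blocking device through the same FIPS 140-2 tests
dd if=/dev/urandom bs=1k count=32768 | rngtest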

Links

… to more information.