Compiled sources of the voting program
Requires you to trust only the makers of the hardware.
Instead of having to trust that the makers of the software didn't insert any backdoor, why not have a larger-scale version of TCCBoot, which would compile the voting program from source and store the result in a 'RAM disk' for execution by the voting machine? (This mechanism could possibly be implemented in a dedicated chip.)
A simple LCD screen on top of the machine would always show a checksum of the contents of the source-code drive and of the 'RAM disk', recomputing it every second to ensure that no changes have happened to the source code or the executables. (Any change could be signalled to the user by lights or sound.) [That display hardware needs to be separate from the computer, as we don't want people tampering with the display.]
===========
Also, for this technique to be effective, the source code should be open source.
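As a rough illustration of the mechanism above, here is a minimal sketch in Python (the language, the /source and /ramdisk paths, the voting.c file name, and the use of tcc for the boot-time build are all assumptions for the example; the print() call stands in for the separate LCD):

# Sketch only: compile the voting program from source into a RAM disk at
# boot, then recompute and display checksums of both every second.
import hashlib, os, subprocess, time

def tree_sha256(root):
    # Hash every file under 'root' in a fixed order so the result is stable.
    digest = hashlib.sha256()
    for dirpath, _, filenames in sorted(os.walk(root)):
        for name in sorted(filenames):
            path = os.path.join(dirpath, name)
            digest.update(path.encode())
            with open(path, "rb") as f:
                digest.update(f.read())
    return digest.hexdigest()

# Boot step: build straight from the (read-only) source drive into the RAM disk.
subprocess.run(["tcc", "/source/voting.c", "-o", "/ramdisk/voting"], check=True)

baseline = (tree_sha256("/source"), tree_sha256("/ramdisk"))

while True:
    current = (tree_sha256("/source"), tree_sha256("/ramdisk"))
    print("SRC", current[0][:16], "BIN", current[1][:16])  # stands in for the LCD
    if current != baseline:
        print("ALERT: source or executable has changed")  # lights/sound in the real thing
    time.sleep(1)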
TCCBoot
http://bellard.org/tcc/tccboot.html It boots the Linux kernel directly from source code. [mofosyne, Jul 29 2011]
Reflections on Trusting Trust
http://cm.bell-labs.../who/ken/trust.html Even if the source is fine, how can you trust the compiler? Or the hardware? [Wrongfellow, Jul 29 2011]
Open Source Digital Voting Foundation (OSDV)
http://www.osdv.org/ US Open Source Voting effort [jutta, Jul 29 2011]
The latest on the 2004 Ohio election.
http://freepress.or...isplay/19/2011/4239 Note that the interesting bit here happened (if anything interesting happened) not at the polling place but later, out of sight. [jutta, Jul 29 2011]
"Even if the source is fine, how can you trust the compiler? Or the hardware?"
Well, can't you just ensure that the hardware uses a well-known compiler (like the TinyC compiler)?
It means you can download the source code at home, run the checksum on the source and the resultant binary, then rock up and compare with the checksum display of the voting machine. (Who knows, maybe you could even have each machine tweet its checksum automatically.)
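A sketch of what that at-home check might look like, assuming the published source builds reproducibly, that the machine shows SHA-256 digests, and that tcc is the compiler; the file names here are made up:

# Build the published source yourself, then hash both the source and the
# resulting binary and compare them with the digests shown on the machine.
import hashlib, subprocess

def sha256_of(path):
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

subprocess.run(["tcc", "voting.c", "-o", "voting.bin"], check=True)
print("source checksum:", sha256_of("voting.c"))
print("binary checksum:", sha256_of("voting.bin"))
# Take these along (or look up the machine's tweeted checksums) and compare
# them with what the LCD on the voting machine displays on election day.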
As for the hardware, well, the very least we can do is ensure that it's custom made to be very minimalistic with low chip counts. Also ensure that it's hidden behind a transparent casing, so we can at least see that nobody blatantly bypasses the wiring. (VHDL-to-FPGA integrity checker? lol)
// Well can't you just ensure that the hardware uses a well-known compiler (like the TinyC compiler)? //
In that case, can't you just ensure that the hardware uses a well-known voting system?
// custom made to be very minimalistic with low chip counts //
Yeah, because when I look at impenetrable, complex blocks of industrially produced integrated circuitry with my bare eyes ... sometimes I have trouble keeping track of them if there are too many?
Ken Thompson's article is a classic, and still relevant after many years.
Voting systems have the added problem that the end points aren't really very interesting. But once our voter has pushed the button, that data goes somewhere else, and we somehow trust that other place to add numbers correctly.
Making voting systems open source is a good idea (making *all* critical systems open source is a good idea, IMHO), and a simple web search for "open source" and "voting" should find many existing efforts.
//ensure that the hardware uses a well-known compiler//
Then, you have to ensure that the compiler was compiled with a second well-known, trusted compiler.
How was the second compiler compiled? And so on.
This is the essence of the article I linked to.
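For anyone who hasn't read the linked article, here is a toy, deliberately simplified illustration of the attack it describes (nothing here is real compiler code; the point is only that the backdoor lives in the compiler binary, so auditing clean source never finds it):

# Toy model of Thompson's "Reflections on Trusting Trust" attack.
CLEAN_LOGIN_SOURCE = "def login(user, password): return check(user, password)"
CLEAN_COMPILER_SOURCE = "def compile(source): return translate(source)"

def trojaned_compiler_binary(source):
    # Stands in for a compiler *binary* that someone tampered with long ago.
    if "def login" in source:
        # Inject a backdoor even though the login source is clean.
        return "<binary of login + magic-password backdoor>"
    if "def compile" in source:
        # Re-inject the whole trick whenever it compiles a (clean) compiler.
        return "<binary that behaves like trojaned_compiler_binary>"
    return "<honest binary of: " + source + ">"

# Auditing the source of the login program and of the compiler finds nothing,
# yet every binary the chain produces still carries the backdoor.
print(trojaned_compiler_binary(CLEAN_LOGIN_SOURCE))
print(trojaned_compiler_binary(CLEAN_COMPILER_SOURCE))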
This would be the easiest voting system in the world to tamper with. My illiterate neighbour could crack it. Hell, I could probably crack it. Proprietary code is proprietary for a reason, no?
//... you cannot make a cryptographically secure open system. At some level there needs to be something like an escrow with a trusted third party. In particular with voting you cannot verify someone's identity in an open system without exposing identity tokens, for example, having your iris pattern or fingerprints snaffled at the voting station. //
Are you sure about all that?
My understanding is that it's widely held that modern cryptographic systems should be open source, and asymmetric cryptography deals with any need to keep secrets on the client.
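A minimal sketch of that last point, assuming the PyNaCl library and a made-up ballot format: the voting machine only ever holds the election authority's public key, so there is no secret on the client to steal, even with the source wide open.

# Illustration only: asymmetric crypto means the client holds no secrets.
from nacl.public import PrivateKey, SealedBox  # pip install pynacl

# Done once, offline, by the election authority; the private key never
# leaves their custody.
authority_key = PrivateKey.generate()
machine_key = authority_key.public_key        # this is all the machine gets

# On the open-source voting machine: seal the ballot to the authority.
ballot = b"precinct=42;choice=candidate_b"
ciphertext = SealedBox(machine_key).encrypt(ballot)

# At the counting site, only the holder of the private key can open it.
assert SealedBox(authority_key).decrypt(ciphertext) == ballot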
There is no variation on the theme of Secret Ballot which cannot be subverted when there are Dishonest People about.
I wish such people were so easily identifiable in real life as you make them in print (w/ caps). It would make it so much simpler to get them in my crosshairs that way.
Current drive is no longer valid>
//Even if the source is fine, how can you trust the compiler// Obviously, the voter writes their own compiler (This has the side-benefit of disenfranchising people who can't program.) ...
//Or the hardware?// ... and simulates it with pencil and paper.
Yeah, because we all know that programmers are the key demographic...
I think you're talking about another issue, [bigsleep].
When you say "you cannot make a cryptographically secure open system", you don't mean open source; you're talking about a system without private data (i.e. passwords).
So the voting booth isn't the only thing you have to trust: you also have to trust a counting device somewhere else (as [jutta] said).
I'm now wondering whether you actually do. Suppose that aggregate results were made public at the voting station. No one's privacy is compromised, and there are few enough of them for manual validation, declaring larger and larger hierarchical regions if necessary.
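To make that concrete, a small sketch with made-up numbers: if every station publishes its own totals, anyone can re-add them and compare against whatever the central count announces, region by region.

# Each station posts its totals publicly (figures below are invented).
station_totals = {
    "station_01": {"candidate_a": 312, "candidate_b": 287},
    "station_02": {"candidate_a": 154, "candidate_b": 198},
    "station_03": {"candidate_a": 401, "candidate_b": 377},
}

# Anyone can recompute the regional total from the published figures...
recomputed = {}
for totals in station_totals.values():
    for candidate, votes in totals.items():
        recomputed[candidate] = recomputed.get(candidate, 0) + votes

# ...and compare it with the figure announced by the central tallying system.
announced = {"candidate_a": 867, "candidate_b": 862}
print("totals match" if recomputed == announced else "MISMATCH: audit this region")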
It seems pointless to try to make a transparent system out of submicroscopic, practically invisible transistors when you can use normal-sized tools like pens or stamps and paper, where it is obvious whether they are working or not.
I am not sure if that is what you were suggesting as a well-known voting system, [jutta]; if so, I agree.
But the invisible details of the hardware running in the voting booth are the only really bad part of this. If you can trust the display and controls to be hooked up properly to the chips, and the chips to display a crypto hash of what is running, you are off to a much better start than with a closed electronic system.
Regarding trusting trust, I will guess that voting fraudsters are more likely to modify their own voting machines than an old PC that's been gathering dust in the garage for 8 years, or the old Linux discs near it. So I can get a relatively trusted compiler on there. It also has crypto hash functions, though probably not the latest ones.
With a sufficient number of offline PCs, a trusting-trust attack on everyone's computers would be much more likely to be discovered than modifications to hardware whose design is claimed to be a trade secret.
Though Stuxnet proved something about how insecure computers are.
[Alterother] // Proprietary code is proprietary for a reason, no? // There are plenty of reasons other than being more secure: not wanting to have it copied, or not wanting to admit what it does because it would look bad.
[caspian] All very good points.
I'd love to bake this a bun, but I just don't have the ingredients.
//voting fraudsters are more likely to modify their own voting machines than an old PC//
Surely that depends on the design of the voting system?
As in, the fraudsters will modify whatever they have to in order to affect the result, whether that's a voting machine, an old PC, or something else entirely.