Computer: Brain Implant
Human Math Coprocessor   (+2, -4)
Proven to work in chipmanzees!

I can't do math. Can't do it at all. You know what would be really nice? If I could have the missing "math" chunk of my brain replaced by an implanted coprocessor completely dedicated to mathematical operations, that would be nice. I said it requires magic because I don't know enough about the brain to say how this would actually work - when the brain attempted to do math, the signals would have to be rerouted to the processor to be calculated, then sent back to the brain.

It's not an internal calculator per se; the human still has to remember how to do equations, it would just speed up the operations.
-- DIYMatt, Dec 14 2010

Counting http://www.sciencem...otics/smart/115.asp
Test your math implant! [Jinbish, Dec 15 2010]

The story of Mathematics http://www.open2.net/storyofmaths/
Website related to BBC series looking at the history of maths. I don't think it mentioned the Romans, but there were many other cultures studied. Presented by Prof. du Sautoy. [Jinbish, Dec 16 2010]

Look Around! http://www.youtube....watch?v=drE5cHe6c3s
An excellent UK schools' educational film about maths from circa the 80s [Jinbish, Dec 21 2010]

Example of "Vedic" maths http://www.youtube....cyU&feature=related
Not what it seems but it does work [nineteenthly, Dec 22 2010]

The human brain is a fantastic processing machine. Replacing any part of it with a machine is probably a trade down...
-- Jinbish, Dec 14 2010


so much mfd fodder...

but arithmetic is such an imaginary thing... and it's not a survival trait in any sense so humans haven't bred for it.

Meanwhile I thought this was gonna be an idea for cloning S.Hawking's brain and grafting it on as a third lobe.

It isn't.

so meh.
-- FlyingToaster, Dec 15 2010


//the missing "math" chunk of my brain// That chunk is probably repurposed. Is there anything you're particularly good at? Interested in trading that for improved math skills?
-- mouseposture, Dec 15 2010


Find some way to build all of the equations into it, then I'll be truly interested.
-- mr_bigmouth_502, Dec 15 2010


Given that partial sight has been restored to blind people by implanting a grid of electrodes in contact with the visual cortex, and that chipmanzees (< mistype stays) have been able to control a computer game via implanted electrodes, I suspect that this is doable, and not actually terribly difficult.

As [mouseposture] implies, I wouldn't want part of my brain replaced; better to add a co-processor to a basically intact brain.

It would probably take some time to learn to interface with the new hardware, but eventually you should be able to think operations and have the answer pop up.

I like the idea of implanting electrodes into babies' brains, and interfacing them with computers, mobile devices etc. As they grow up, communing with the internet, computing resources, and each other would be as easy as thought, and something like telepathy and a hive mind would evolve. Ethics schmethics.
-- spidermother, Dec 15 2010


// Replacing any part of it with a machine is probably a trade down... //

Beware, [Jinbish].

[spidermother], join us. You'll wonder why you ever hesitated ...
-- 8th of 7, Dec 15 2010


[8th] But you're a giant evil hegemony, crushing and absorbing all in your path, with your advanced but remorseless technology... Wait, where do I sign?
-- spidermother, Dec 15 2010


Hey, I'm not disparaging cybernetics here, [8th], it's just that the organically grown neurons are really excellent at what they do. Training them to work as required is a better solution than replacing them with some clunky semiconductor.

So, unless you're Noonian Soong, it's all about the neurons!
-- Jinbish, Dec 15 2010


I'm with spidermother - doesn't seem like it would need magic to me.

This sort of brain enhancement has been covered extensively in science fiction.
I don't know of it being done for maths; sight and hearing both have been in at least nascent form. The senses are possibly an easier thing to do since the information is all output (from the point of view of the implant).

I can imagine that it might be possible to train the brain to communicate with the device over time - perhaps one would have to start with a very basic processor and work up. For example, numbers might have to be shifted into the device one digit at a time. Returning the result of various operations (addition, subtraction, multiplication, division) could be done effectively simultaneously, and I expect would be relatively easy both technologically and to learn.
A full general-purpose (i.e. Turing complete) processor would be much better, but probably harder to learn to interface to.
-- Loris, Dec 15 2010


This is not magic. Blind people have had brain implants in their visual cortices which produce grids of visual stimulation and it's possible to read brain activity to the extent that one can work out if someone's thinking about tennis or not. Taking the two of these together, the answer is to come up with a notation in which thinking about tennis in a particular rhythm represents bits: thinking about tennis equals one, not thinking about tennis equals zero. Then you learn ASCII in binary. The input device registers what you're thinking about and forms a kind of binary machine code, including arithmetic and logic operators and delimiters for start and end of sequents. The answer is then displayed as a series of dots in the visual field. However, you would be blind and it might be easier to do the maths than learn how to use the device.
-- nineteenthly, Dec 15 2010
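
A rough sketch (in Python) of the think-in-binary scheme [nineteenthly] describes above. The 8-bit ASCII frames, the '=' as the "compute now" signal, the tiny left-to-right calculator and the single row of answer dots are all details filled in for the example; the annotation itself only fixes bits-from-tennis-thoughts, operators and delimiters, and a dot display.

import re

def bits_to_text(bits):
    """Group a stream of 0/1 values (tennis = 1, not tennis = 0) into ASCII characters."""
    if len(bits) % 8:
        raise ValueError("incomplete byte in bit stream")
    return "".join(chr(int("".join(map(str, bits[i:i + 8])), 2))
                   for i in range(0, len(bits), 8))

def evaluate(expression):
    """Tiny calculator: integers and + - * only, applied left to right (an assumption)."""
    tokens = re.findall(r"\d+|[+*-]", expression)
    result = int(tokens[0])
    for op, num in zip(tokens[1::2], tokens[2::2]):
        n = int(num)
        result = result + n if op == "+" else result - n if op == "-" else result * n
    return result

def dot_display(value, width=16):
    """Render the answer as a row of 'dots' in the visual field (binary, most significant first)."""
    return "".join("*" if (value >> (width - 1 - i)) & 1 else "." for i in range(width))

# Pretend these bits came from a rhythm of tennis thoughts spelling "12*34=":
message = "12*34="
bits = [int(b) for ch in message for b in format(ord(ch), "08b")]
expr = bits_to_text(bits).rstrip("=")
print(expr, "=", evaluate(expr), "->", dot_display(evaluate(expr)))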


//but arithmetic is such an imaginary thing... and it's not a survival trait in any sense//

I would disagree, the ability to track and understand numbers is definitely a survival trait, and it has been since human interaction became more important than self reliance (probably some time around the agricultural revolution). This is still, admittedly, a vanishingly short time in evolutionary terms.
-- MechE, Dec 15 2010


Right.

First of all, as Jinbish says, you already have one of the most advanced, super-fast, massively parallel processing devices in your head, for free.

Cutting a bit out and replacing it with something that can only work at a snail's pace in comparison is not a good way to learn mathematics.

The magical part here, which I think is what nineteenthly has tried to address, is the interface by which you get to mentally encode the information you want mathified, and then to mentally decode the myriad images of tennis players back into a meaningful result. But in order to do that, you'd need to learn maths.

In order to 'do' maths (assuming that by "math" you mean applying mathematical techniques to a real situation, and not the 99% of maths that is often neglected by everyone else) you need to do 3 distinct things:

1) You take a real-world situation, define a question you want answering in strict enough terms for it to be encodified and collect all the information necessary to calculate the answer. (Ask a question)

2) You then apply various rules and translations on your data to come up with a result, (Calculate an Answer)

3) which then needs to be de-datafied back into a real-world answer. (Understand what the answer means)

If you're no good at part 1, you won't be able to do part 2 effectively, and if you're no good at part 3, then you may as well not have bothered in the first place.

So, assuming we are talking about an implant that does this 3-stage process - how do we go about stage 1? By what process do we shape the problem being addressed, define its important elements, and present them to the next step in a way that allows an answer to be generated - all in your mind?
-- zen_tom, Dec 15 2010
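
To make the three stages concrete, here is a toy walk-through in Python. The question (how many 330 ml glasses can be poured from a 2 litre bottle) and the numbers are invented for the example; only the ask / calculate / interpret split comes from [zen_tom]'s annotation.

import math

def ask_question():
    # Stage 1: turn a fuzzy real-world situation into strict terms plus the data needed.
    return {"bottle_ml": 2000, "glass_ml": 330}

def calculate(data):
    # Stage 2: apply the rules/translations to the encoded data.
    return math.floor(data["bottle_ml"] / data["glass_ml"])

def interpret(glasses):
    # Stage 3: turn the bare number back into a real-world answer.
    return f"{glasses} full glasses, with a small top-up left over"

print(interpret(calculate(ask_question())))   # -> 6 full glasses, with a small top-up left over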


The only way I can see for this to be achieved would be to hire someone who is good at maths to follow you around and perform the required calculations at your request.

The interface is well established and user friendly (unless you get a grouchy mathematician), the software is continuously updated and the power supply is rechargeable from a variety of sources.

Purchase cost may be a little high.
-- Twizz, Dec 15 2010


It has been suggested that we all have savantry skills which are sometimes buried.
-- nineteenthly, Dec 15 2010


//First of all, as Jinbish says, you already have one of the most advanced, super-fast, massively parallel processing devices in your head already, for free.//

That may be true, but can it do basic maths on several-digit numbers in the blink of an eye? Can it bollocks.
I *can* multiply two 2+ digit numbers together, but need pen and paper. With much rote learning I memorised a 12x12 multiplication table - but there are still areas which suffer from pattern interference. Division is the same - possible with an external memory, but slow.
Face it - there are some things brains are good at, and there are some things computers are good at - but they're not the same.

So I think that provided one knew how to do basic maths, a co-processor (essentially a calculator) would be really useful. A simple internal computer which you could program on the fly - well, that would mean that amazing things could be done.

I'm not even sure SF has done that possibility justice, come to think of it. In the stories I've read, characters are effectively described as using their senses to navigate a network/virtual space, rather than augmenting their intelligence.
-- Loris, Dec 15 2010


I am aware of people who have trained themselves (not inherent, definitely trained) to nearly instantly multiply numbers with many digits. It is learnable.
-- MechE, Dec 15 2010


As far as I can tell, once we have neural interfaces of a sufficiently high fidelity (interface with thousands to tens-of-thousands of neurons), performing fast mathematical computations is going to be the simplest and easiest of tasks. More interesting is going to be your Google implant. With neural network simulations, you'll be able to augment your intelligence far beyond what most people consider.

Brain implant to process numbers????

Consider this: You are wondering why your car/jet-pack/whatever is misbehaving, or how it works or whatever. Think on it for a few moments, then the answers will start to flow naturally from this interface as though you are remembering something you used to know.

There are going to be some fantastic advances in neural prostheses within a decade.
-- toodles, Dec 15 2010


The problem with this is that numbers are not the same as the notation used for them. If we were generally accustomed to using Roman numerals rather than Arabic ones we'd be struggling with how to perform operations on them. On the other hand, if we simply visualised certain mathematical and arithmetical processes, we'd be able to perform a lot of operations, including integer division, modulo, multiplication, squares, cubes and integer square and cube roots, addition and subtraction of positive integers, all up to a thousand or more, logarithms, calculus and trigonometrical functions intuitively, and probably other things. The trouble is that we get bogged down with the figures and how to manipulate them. They are significant as a sort of useful fiction which reveals certain truths and properties of numbers but there's another, i'm tempted to say deeper, reality pertaining to number and what can be done with it. However, there are limits to that too. All of those faculties are already there in our minds.
-- nineteenthly, Dec 15 2010


Oh.

I was expecting a large group of persons trained to rapidly perform small portions of large complex problems in their heads, then pool their answers with other persons who would mentally sum up the parts and present the solution.

Kind of like a choir for solving number problems.
-- normzone, Dec 15 2010


//Roman numerals// could work in an arithmetic environment given a bit of work: probably easier than Arabic numerals though more wordy ... you'd want to notate decimal fractions in reverse of course.

//choir for solving number problems//
"okay can anybody here read arabic numerals.... anybody.... oh, come on..."
-- FlyingToaster, Dec 15 2010


Roman numerals can be made to work and there are techniques for dealing with the likes of multiplication and division but in situations where they were the only numerical form of notation, the abacus, a place-value system, was used instead of bothering with them. My point is that Arabic numerals or any other system are only a representation of number, not number itself. Nothing magical happens when there are ten items in front of you instead of nine which necessitates the use of another symbol.

We're familiar with place value based notation using interchangeable symbols. Number bases don't generally alter the underlying situation. What we learn is not to get to grips with mentally placing two groups of objects together in our mind's eyes and apprehending the number immediately, but a series of symbol manipulation rules which are mathematically significant but not the same as actually doing arithmetic with raw quantities.

The resources exist within our own minds but are unavailable to us under everyday circumstances without practice.
-- nineteenthly, Dec 15 2010


Yikes. I'm about to take a few prerequisite math and science 12 courses after twenty some odd years out of school and could use one of these coprocessors I think.
<whispers>
it'sscary

Anything I could visualise (trigonometry, geometry and physics) I was a whiz at, but all the rest of mathematics just seems so bloody abstract to me.
<readjusts ammo belt, checks straps>
"Cover me, I'm goin in."
-- 2 fries shy of a happy meal, Dec 15 2010


Most mathematic-type subjects can be thought of with visualisation of some kind or other... if you're taught it in a particular way.
-- Jinbish, Dec 15 2010


If you have 10.2 pounds of mashed, and you want to divide it equally among 3 people, do you ...

a) get a weighscale and give each person... ummm... 3.4 pounds

b) Give each person equal cupfuls until there's less than 3 cupfuls left, then move down to tablespoons > teaspoons > cokespoons > etc.

c) build a tripart balance.


-- FlyingToaster, Dec 15 2010


//the organically grown neurons are really excellent at what they do// Tautology. Agree with [Loris].

[FlyingToaster] The answer is obviously c. Also: may I quote you?
-- mouseposture, Dec 16 2010


//Agree with [Loris]//
//various operations (addition, subtraction, multiplication, division) could be done effectively simultaneously//

I think it's about requirements. To what level of accuracy and what kind of speed do we need an implant for? For that matter, what 'math' do we need a co-processor to do? If all we're talking about is addition, subtraction, multiplication, and division, then how accurate do we need to be?

So I have to decide whether a 4 weekly rail season ticket is cheaper, or a monthly travelcard... That's not too tricky to do to the nearest dollar or pound - and if the difference comes down to pennies & cents, is it a significant factor?
Maybe I have a tax return form to do and there is lots of addition and then some %ages to do. An implanted chip might help here, I guess... but then, the real trick of the form is knowing what to put in which column/box...

Then there are more complicated functions like calculus (differentiation & integration and so on) - that's a point, I *hate* integration - but that's because I don't remember all the various rules... it's not the actual computation.

Matrices - now we're talking. That's some computation right there... except we don't often have a conscious need to compute a matrix. If there are complex functions - such as understanding the rate of change of height with vertical distance travelled, our brains often tacitly compute this.

And as for the mashed potato, you don't weigh it out. You iterate through with a series of "splodges". The chances are that your weighing/portioning will have a greater error in it than any maths you do in your head. Besides, anything left over after 2 runs of splodges is used with the trebuchet that you have just built with wooden spoons... "Have at you!"

{Guest1 implicitly calculates trajectory of mashed potato and takes evasive action. Potato hits wall with a "splat!"}
-- Jinbish, Dec 16 2010
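
For what it's worth, the season-ticket question above is about the level of arithmetic being discussed; a back-of-envelope Python sketch, with invented prices since the annotation gives no real fares:

# Prices are made up purely for illustration; only the comparison itself is from the annotation.
four_weekly_ticket = 120.00          # covers 28 days
monthly_travelcard = 131.00          # covers roughly 365.25 / 12 days on average

per_day_4weekly = four_weekly_ticket / 28
per_day_monthly = monthly_travelcard / (365.25 / 12)

print(f"4-weekly: {per_day_4weekly:.2f} per day")
print(f"monthly:  {per_day_monthly:.2f} per day")
print("4-weekly wins" if per_day_4weekly < per_day_monthly else "monthly wins")

As [Jinbish] says, the difference typically comes down to pennies, which is why nearest-pound mental arithmetic is usually enough.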


[spidermother] //chipmanzees// [marked-for-tagline]
-- FlyingToaster, Dec 16 2010


// //Roman numerals// could work in an arithmetic environment given a bit of work: probably easier than Arabic numerals though more wordy ... you'd want to notate decimal fractions in reverse of course.//

Proper roman numerals go i, ii, iii, iiii, v, vi, vii, viii, viiii, x, etc. (Sometimes j was used to break up the runs of i). The use of iv, ix, etc. is a modern convention; strictly, roman numerals are non-positional. They're also not really decimal, so the use of a decimal mantissa would not sit well with me. I would prefer to do as the Romans did, and use ordinary fractions.

Aside: how do you write the square root of -i in roman numerals?
-- spidermother, Dec 16 2010
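
A few lines of Python to show the purely additive form [spidermother] describes; the lower-case letters and the under-4000 limit are choices made for the sketch, not anything from the annotation.

# Additive ("proper") roman numerals: no iv/ix subtractive pairs, just repeated symbols.
ADDITIVE = [(1000, "m"), (500, "d"), (100, "c"), (50, "l"), (10, "x"), (5, "v"), (1, "i")]

def to_additive_roman(n):
    if not 0 < n < 4000:
        raise ValueError("out of range for this sketch")
    out = []
    for value, symbol in ADDITIVE:
        count, n = divmod(n, value)
        out.append(symbol * count)
    return "".join(out)

for n in (4, 9, 1987):
    print(n, "->", to_additive_roman(n))   # iiii, viiii, mdcccclxxxvii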


//10.2 pounds of mashed divided 3 ways//

That's a trick question right?
(d) All of the above.
-- 2 fries shy of a happy meal, Dec 16 2010


/Roman arithmetic/

I assume the working ledgers had separate columns for each power-of-10 grouping, each column being 5 characters wide, subdivided into 1,4 or 2,3 subcolumns. Even a long list being added or subtracted together is easily worked at systematically with a pencil and paper and you never have to run out of fingers. Doubtless this is how it was actually done. The finished sum of course would be written on the bill-of-sale or whatever without spacing.

I'll get back to you if I figure out multiplication & division... meanwhile, even though the Romans used twelfths for fractions, you could do decimals by reversing the order (i.e. .VIIILDCCMM would be .8572) and not have to worry about occasions when you wanted more or less detail.
-- FlyingToaster, Dec 16 2010
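
A Python sketch of the reversed-fraction notation, assuming the k-th digit after the point is written as digit x 10^(k-1) in additive style (so 8 is VIII); as [spidermother] points out in the next annotation, a zero digit simply vanishes, which is where the scheme gets ambiguous.

DIGIT_ONE = {1: "I", 10: "X", 100: "C", 1000: "M"}
DIGIT_FIVE = {1: "V", 10: "L", 100: "D"}

def digit_chunk(d, scale):
    """One decimal digit d (0-9) at a power-of-ten scale, written additively."""
    one = DIGIT_ONE[scale]
    if d < 5:
        return one * d
    five = DIGIT_FIVE.get(scale)
    if five is None:
        raise ValueError(f"no plain symbol for {5 * scale} in this sketch")
    return five + one * (d - 5)

def reversed_roman_fraction(digits):
    """Encode a decimal-fraction digit string, e.g. '8572' -> '.VIIILDCCMM'."""
    return "." + "".join(digit_chunk(int(d), 10 ** k) for k, d in enumerate(digits))

print(reversed_roman_fraction("8572"))   # .VIIILDCCMM
print(reversed_roman_fraction("805"))    # .VIIID -- the zero disappears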


//you could do decimals by reversing the order (ie .VIIILDCCMM would be .8572)//

Sure, but you'd have to read that as MM/(X) + DCC/M + L/C + VIII/X. That is (to me) an unaesthetic kludging of two not-very-compatible systems, and imposes a parsing algorithm which is not implied by roman numerals themselves, but rather imported to enforce an artificial compatibility with base 10 numbers.

And it breaks down when you want a zero in your decimal fraction (without still more kludging).
-- spidermother, Dec 16 2010


[MFD]//You know what would be really nice?// // that would be nice // //it requires magic//

I've got an idea, Laser Eyes. No wait, Hover Toes. Ooh ooh, a Rectal Turbine.
-- marklar, Dec 16 2010


//it breaks down when you want a zero in your decimal fraction// what ? does not... ah i c... well you coulda just said. "-i" indeed.
-- FlyingToaster, Dec 16 2010


Roman numerals can be dealt with directly but often weren't. The Roman abacus is not that similar to even the original numerals because it can have empty columns, i.e. zero, and Latin words for numerals are not generally connected to the notation (unlike Etruscan, oddly, which however used a different system but whose words used the subtractive principle found in later Roman numerals). So they used something else themselves. Roman numerals are mainly pictures of numbers rather than something used for arithmetic, on the whole. Arabic numerals are directly connected to arithmetic, though being decimal, still a fairly poor representation.

The Romans had a concept of negative numbers as it was possible to be in debt (in a sense, their concept of negative numbers was slavery). They also had a notation for fractions, but their units suggest they tried to avoid fractions by setting currency, weights and so forth at small enough values to eschew their use. Looking at it that way, what they did after the decimal point was to stop using decimal.
-- nineteenthly, Dec 16 2010


At what point beyond wetwiring do we start to resemble the Matrix flicks?
-- RayfordSteele, Dec 16 2010


Why does everyone care so much about the Romans - what have they ever done for us?

(me)//various operations (addition, subtraction, multiplication, division) could be done effectively simultaneously//

(jinbish)//I think it's about requirements. To what level of accuracy and what kind of speed do we need an implant for? For that matter, what 'math' do we need a co-processor to do? If all we're talking about is addition, subtraction, multiplication, and division, then how accurate do we need to be?//

What is most important is probably that the interface is optimised for the brain, rather than itself.

In my quote above I was imagining a 'simple' calculator - what would be the easiest way to ensure that the brain learnt what the data meant? I have the impression that this is easier where the system is always-on.

I propose that above a certain amount of complexity, the output could easily be configurable. A conceptually simple return pathway at this level is to write to one of the senses. For example, hook into the vision system somehow. We know the brain is flexible enough to cope with altered data - people have become used to seeing things upside down, for example.
So - just give people a HUD and it'll just work out.

However, the input (to the co-processor) would probably have to be very well targeted, so as to use as few channels as possible. I've already suggested some sort of shift register. With 4 channels one could have 16 states, if the brain can deal with chording inputs. A fifth would then be needed to indicate input-ready. So this could represent 0..9, end-of-number, remove-last-input, decimal point, sign and have a few left over.
So in the simplest useful system, one could push in several digits for one number, then again for a second, and know what the results of a variety of operations were. It would be nice to be able to then copy one of those results directly.
-- Loris, Dec 16 2010
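
A toy decoder for the four-data-channels-plus-strobe idea, as a Python sketch. The particular chord assignments (10 for end-of-number, 11 for delete-last, 12 for decimal point, 13 for sign) and the idea of returning all four results at once are details filled in here; only the channel counts and the symbol list come from [Loris]'s annotation.

END, DELETE, POINT, SIGN = 10, 11, 12, 13   # assumed chord codes; 0-9 are digits

def decode_number(chords):
    """Turn a list of 4-bit chords (ints 0-15) into one signed decimal number."""
    text, negative = [], False
    for chord in chords:
        if chord <= 9:
            text.append(str(chord))
        elif chord == POINT:
            text.append(".")
        elif chord == DELETE and text:
            text.pop()
        elif chord == SIGN:
            negative = not negative
    value = float("".join(text) or "0")
    return -value if negative else value

def all_operations(a, b):
    """Return the four basic results 'effectively simultaneously'."""
    return {"+": a + b, "-": a - b, "*": a * b, "/": a / b if b else float("nan")}

# Each chord arrives when the fifth (input-ready) channel fires:
stream = [1, 2, POINT, 5, END, 4, END]        # enter 12.5, then 4
first_end = stream.index(END)
a = decode_number(stream[:first_end])
b = decode_number(stream[first_end + 1:-1])
print(all_operations(a, b))   # {'+': 16.5, '-': 8.5, '*': 50.0, '/': 3.125}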


The Romans are not on the table. Their own way of writing numbers fails to represent them accurately. So does ours, though differently. Both of them, though, also provide a structure which illustrates certain mathematical properties of numbers and has mathematical properties of its own.

The main thing, though, is that we can just do maths anyway. We just need to get in touch with our ability to do it rather than having a calculator implanted. If we had something like that, it should be more than a calculator. It could at least be a multimedia player, computer and telepathy device as well.
-- nineteenthly, Dec 16 2010


All integers are accurately represented in the decimal representation. All other numbers can be represented to arbitrary precision, so I fail to see the problem.
That's not to say that the output would necessarily be of that form of course.

Some people have developed the ability to do some basic operations reasonably quickly. But:
1) Not as fast as even simple processors from a decade ago
2) Not all operations, generally not division
3) It takes considerable time and effort to learn, as evinced by the small proportion of the population capable of demonstrating it.
4) Not as accurately - I saw a demonstration on TV where the guy failed to carry a 2.

And all this covers is the very simplest 'calculator' set-up. Even this proof of concept could still be very well worth it.
With the lessons learned about interfacing with the brain, the next stage would put an accessible processor in there with a bit of storage, even just a few dozen characters of output and I'm sure people would soon be doing amazing things.
-- Loris, Dec 16 2010


If you have a multimedia player implant and in particular a telepathy one (and incidentally i think it should also be a universal remote control), obsolescence raises its ugly head. Calculators with which one interacts telepathically are less important, except that the point might come when in order to interact meaningfully with others, hold down a job, get into university and so forth, one might need an up to date implant. So it's semi-voluntary brain surgery time. Even the way it's plugged into your brain would change in the end.

Subconscious yes, so interaction with the senses directly would be an improvement. You hear a numbers station as an auditory hallucination and you can't get the melody for Lincolnshire Poacher out of your head.
-- nineteenthly, Dec 16 2010


I'd be a lot more cautious with any form of communication device. There's just so much more scope for contamination.

Incidentally, I once wrote something about conceptually transmitted disease. Still proud of that.
-- Loris, Dec 16 2010


If the possibility exists that something better will come along and it becomes standard practice to have an implant which effectively enhances one's cognition, a major problem of social exclusion could emerge. Who'd employ, for instance, an accountant who couldn't glance at a balance sheet for a split second and tell you if it made sense?
-- nineteenthly, Dec 16 2010


Social exclusion and prejudice already exist in many forms, I don't think this would change that... Except that innumeracy rates would presumably go down.

//Who'd employ, for instance, an accountant who couldn't glance at a balance sheet for a split second and tell you if it made sense?//

Hasn't the development of spreadsheets basically made such work obsolete in any case?

On the down-side, maybe those with implants wouldn't be able to go near strong magnets, e.g. to have an MRI scan.
-- Loris, Dec 16 2010


It's a hypothetical example, though the ability to do this might bring such a skill back. What i mean is, for a job which requires rapid or high-volume calculation, or more importantly for ordinary social interaction, the bar would have been raised, not because an implant was prohibitively expensive but because it would become obsolete.

This is more immediate than most other kinds of social exclusion. It's more like someone not being able to speak the native language than not being able to drive.

Concerning magnets, we haven't specified what the implant is made of. It could be neuronic for all we know.
-- nineteenthly, Dec 16 2010


It is a good question, but I don't think it would be such an issue in practice.

People do already become unemployable in their field for a variety of reasons. But somehow it doesn't seem to be as severe an issue as one might expect - people are adaptable, and experience goes a long way. The kids with V2 devices may potentially be better, but the oldies with V1 devices would really know how to use them.
Also, as they get older, the work often involves less 'doing' and more 'managing'.

For another thing, going beyond a certain level of computation must open some significant security vulnerabilities and other risks, even if it's not 'networked'.
I wouldn't want a brain implant which could run more than one program at once, or had lots of storage. I'd stick to something I could reasonably reset to factory-default without losing lots of stuff.
If you have lots of programs or data stored, a mistake or attack might mean losing the lot. Suppose you were persuaded to enter a program which output epilepsy-inducing strobing. You'd want to shut that off and delete it straight away.
If lots of programs could run in the background, someone might find a way to hack your brain, like a keylogger.
Too much storage space, and people would just fill it up with cruft, or stuff they hadn't recorded elsewhere.
-- Loris, Dec 16 2010


Flexible silicon substrate with many silicon "pins" (conductive points) placed straight on the gray matter. Wired to an optical com link so no electricity can bridge to the brain. A human brain is easily adaptable enough to use it.
-- Voice, Dec 16 2010


Yes, as i was typing that i thought it was similar to knowledge becoming outdated.

The thing is, i like your thinking with the simplicity but i think the utility of a relatively feature-free device wouldn't be that much more than what could be achieved through a form of savantry training, which might make it pointless.
-- nineteenthly, Dec 16 2010


[Loris] the problem with brain implants and MRI scans turns out not to be the magnetic field (not much, anyway), but rather the RF coil. The implant acts as an antenna, AC current flows through a circuit, the thing heats up, and cooks the brain. Fortunately, the pulse sequence can be tuned to not deposit too much energy in the tissue, so, within a very carefully calculated envelope, people with brain implants can have MRIs. If you get it wrong, though, the patient comes out of the scanner a vegetable.

//security vulnerabilities// Strongly agree.
-- mouseposture, Dec 17 2010


M$-Brain: it makes the mind boggle... no, really.
-- FlyingToaster, Dec 17 2010


// M$-Brain: it makes the mind boggle...//

sp: buggered
-- Jinbish, Dec 17 2010


Well obviously this would have to be open source otherwise it'd make us all stupid and mad. Concerning the brain roasting problem though, i think this is an argument for literally making it out of a neural net. There are animals which transmit and receive information electrically.

I think what's needed is some kind of benign teratoma made from the host's tissue, living somewhere other than in the brain but communicating with it like an electric catfish would.
-- nineteenthly, Dec 17 2010


//I think what's needed is some kind of benign teratoma//

Why not just train an existing part of the brain then?
-- Jinbish, Dec 17 2010


Hire a Chinese guy to follow you around.
-- Cuit_au_Four, Dec 18 2010


// communicating like an electric catfish // [marked-for-tagline]
-- baconbrain, Dec 18 2010


[Jinbish] I do hope you realise that the video you linked is a parody.
-- marklar, Dec 22 2010


There's the so-called "Vedic mathematics", but that only works for a few carefully chosen examples. I'll link to it on YouTube but it's not as useful as it seems, except for overcoming hangups about arithmetic.

I find counting in duodecimal on my fingers very helpful. There's a technique involving phalanges whereby one can count to a gross on one's fingers and sort of use them as an abacus.
-- nineteenthly, Dec 22 2010
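
The phalange trick mentioned above works out neatly in code: three phalanges on each of four fingers give twelve positions per hand, so two hands give a gross. A small Python sketch; the finger and joint labels are my own naming, not anything specified in the annotation.

FINGERS = ["index", "middle", "ring", "little"]
JOINTS = ["base", "middle", "tip"]

def one_hand(count):
    """Map 1..12 to the (finger, phalanx) the thumb points at; 0 means a closed hand."""
    if count == 0:
        return "closed"
    finger, joint = divmod(count - 1, 3)
    return f"{FINGERS[finger]} {JOINTS[joint]}"

def gross_count(n):
    """Split 1..144 between a dozens hand and a units hand."""
    if not 1 <= n <= 144:
        raise ValueError("one pair of hands only reaches a gross")
    dozens, units = divmod(n - 1, 12)
    return {"dozens hand": one_hand(dozens), "units hand": one_hand(units + 1)}

print(40, gross_count(40))   # 3 dozen and 4: dozens hand on index tip, units hand on middle base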


[marklar]: WHAT?! You mean my copybook is filled with nonsense? Damn you Internet!! Why have you forsaken me?!?!
-- Jinbish, Dec 22 2010


No worries, there are various different techniques, but i think all of them only work more quickly or easily for relatively few examples. I can't speak for the others but i remember that the multiplication algorithm is best for decimally-expressed numbers with low digits, like twenty-one or thirty-two. Once you get higher, the overheads of drawing the lines and so forth make it too much hassle.

(Note to self: save it for the blog).

Something else which might have some mileage is to use the phalangeal counting system in duodecimal but apply abacus techniques to it, then to "visualise" your hands doing it proprioceptively rather than physically doing it.
-- nineteenthly, Dec 22 2010


