robinbloke ([personal profile] robinbloke) wrote 2004-09-30 01:27 am

Musings from a synaptic breakpoint

Free will is one of the basic assumptions that most1 people take to be part of this great thing we call life, existence and everything. This often encompasses a quasi-spiritual and usually immeasurable thing called a 'soul'.
But it's not an uncommon argument that computing power will eventually reach the stage where the individual synapses, neurons and suchlike of the brain can be mapped into electronic ones and zeros in silicon (or diamond, or whatever we happen to be using by that stage), and lo and behold we'll have the equivalent of the human brain in machine form. At this stage the apocalypse is the usual projection for what follows. This could beg the question, "Why do we bother making things that in all likelihood will rise up and kill us?" The common counter-argument is that, of all technological pursuits, humans excel most at finding ways of killing each other.
So where am I going this time? As a code monkey, programmed responses are something I work with a lot: when I write programs I expect input A, process B, output C. Process B determines output C based on A, at least if the program is working.
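
(If you want that in code rather than words, here's a toy sketch; the function name and the numbers are made up on the spot, not anything from real work:)

    def process_b(a):
        # B is deterministic: the same input A always produces the same output C.
        return a + 12

    c = process_b(30)   # input A = 30 gives output C = 42, every single time
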
We work like this too; we are, after all, a multi-complex of responses, decisions and computations, all working at levels we don't really understand yet. But every now and then I catch one of these and notice my current line of code, like an interrupt watching the main thread as it pauses.
This morning I had a pause as I was buying my daily pint o' milk (semi-skimmed, lactose fans). The price was 30p; I handed over 30p. They took my coins and thanked me. I waited. She woke me up with "Did you want a receipt?" because I was waiting for change. I'm too used to receiving change for everything to carry on automatically after handing over the exact amount.

Just one cycle in my code sequences. Next week I map my brain into class diagrams and find out every thought process inherits from the superclass "sugar".


1 Let's not get too sidetracked here...
(deleted comment)

[identity profile] robinbloke.livejournal.com 2004-09-30 09:31 am (UTC)(link)
That is definitely unnerving; I've not studied philosophy myself but have often pondered doing so...

[personal profile] zotz 2004-09-30 11:07 am (UTC)(link)
I don't really find it unnerving, and I don't see that it has major implications for ethics (not being religious, I'm not going to comment on possible damage there).
(deleted comment)

[personal profile] zotz 2004-09-30 11:31 am (UTC)(link)
we couldn't reasonably be held 'responsible' for our actions

I don't agree, actually. The car's a false analogy. More complex decision-making systems - computerised Expert Systems, for instance - do, in practice, get blamed for stuff even though their actions are definitely deterministic.

It's unnerving to me because I hold an irrational and romantic notion that I am more than the sum of my parts

Saying that we are less free if we accept that we are subject to these mechanisms is like saying that we're less alive since developing an understanding of biochemistry.