Tags: consciousness

Old 1st March 2012, 03:12 PM   #2161
westprog
Philosopher
 
westprog's Avatar
 
Join Date: Dec 2006
Posts: 8,928
Originally Posted by Modified View Post
I'm not clear on what you're saying here. Of course it wouldn't be "any arbitrary computer", it would have to have software to emulate the multi-processor brain, have sufficient storage, and to run in real time be sufficiently fast. But if we eliminate the time constraint by slowing down the I/O (slowing down the universe, I suppose, which we can do if the universe is simulated), then any processor chip would be sufficient, given enough memory and a memory addressing scheme that makes it all available. Is an otherwise equivalent brain less conscious because it thinks more slowly?
You're still begging the question, as to whether something that implements a computation going on in the artificial brain is doing the same thing as the artificial brain. We can't easily test this hypothesis, because we can't plug the computer into a human body, unlike the artificial brain. In the event that we can produce an artificial brain, we can see if the human "works properly".
__________________
Dreary whiner, who gradually outwore his welcome, before blowing it entirely.
Old 1st March 2012, 03:13 PM   #2162
Piggy
Unlicensed street skeptic
 
Piggy's Avatar
 
Join Date: Mar 2006
Location: Ralph's side of the island
Posts: 15,924
Originally Posted by rocketdodger View Post
O I C, so the observer has to be completely ignorant, except not ignorant of your concept of "actually physically aggregating things?"
No.

Real aggregation is "physical addition" because it occurs in systems that don't require a brain state to be part of them.

If 2 wolves join a group of 5 wolves, or if 4 sticks drift into a group of 2 sticks, then real aggregation or physical addition has taken place.

This type of activity is evolutionarily important to animals, so it's not surprising their brains developed ways of noticing it, discriminating different degrees, and responding.

One thing the brain can do is "symbolic addition" which involves changing brain states in the animal, but no change in the physical state of any other system -- which is what makes it "symbolic", "logical", or "imaginary" rather than physical or "real".

We can even "farm out" some of the work to information processors, which help us with the symbol manipulation, whether we're talking about an abacus or a TI scientific calculator.

One indisputable feature of symbolic information processors, though (we could legitimately call the whole world of matter and energy a physical information processor), is that to work they require brain states to be part of the system.

Symbolic info processors can't be created (or imagined about natural objects) without a brain state. And unless there is a brain in the system somewhere, all we're left with is an object that cannot be determined to be an information processor at all.

If you don't believe me, then just look back at the video PixyMisa posted of the marble adding machine. But watch it with the sound off and pretend you don't know what the markings mean (that would be adding info in a brain state to the system) or pretend that they're covered with various colors of paint.

Track everything the marbles do. You will not be able to determine from the behavior of the machine alone -- that is to say, in a system which contains only the machine -- what it is intended to do symbolically.

In other words, the machine would have no way of knowing it was being used as a calculator, even if it knew everything about its own body (note that the "meanings" of the symbols painted on it are not facts about the machine, but about the shape of a brain somewhere).

If it did guess that it was being used as an aggregator, it would still not agree with the actual use of the machine, because it would think that every time you put three marbles in separate slots at the top, you end up with 3 marbles together in a bucket at the bottom -- from the point of view of the machine (without the brain that understands the symbols and process) any 3 marbles dropped by any route to the bottom would be 1 + 1 + 1 = 3.

Of course it might also guess that it was intended to be used exactly the way our brains understand it to be used symbolically... but it could have no more confidence in that guess than the 1 + 1 + 1 = 3 theory, or the theory that it's used to tell stories, or the theory that it's a piece of conceptual art, or a spare part for a larger machine that someone's dropping marbles through simply out of boredom.

That's what I mean when I say that, by definition, symbolic information processors like our laptop computers require brain states as part of the system to operate. Without the brain states (which we can label as programmer, user, or observer) the minimum necessary requirements for symbolic information processing to exist are not met.

Originally Posted by rocketdodger View Post
Aside from the fact that this is obviously special pleading on your part, can you explain this: when you "aggregate" rocks in your backyard, those rocks are actually "replacing" the air where they lie, so it isn't "aggregating" anything unless someone is there to tell the difference between rocks and air.

How does that factor into your argument?

Seems to me the only genuine "aggregation" would be the aggregation of fundamental particles in empty space.
I hope the above has clarified why no special pleading is necessary.

But as to the point about the rocks and the air....

If you want to claim that there is no difference between rocks and air if no one is there to notice it, then you're into solipsism, which I don't intend to argue against, I'll just have to leave you to it.

But I will say that it doesn't make much sense in terms of what's already been described.

If there were no inherent difference, then evolving beings wouldn't have noticed a difference, and would not have evolved brains that attend to that difference, and even develop ways to track it.
__________________
.
How can you expect to be rescued if you don't put first things first and act proper?
Old 1st March 2012, 03:23 PM   #2163
Piggy
Unlicensed street skeptic
 
Piggy's Avatar
 
Join Date: Mar 2006
Location: Ralph's side of the island
Posts: 15,924
Originally Posted by dlorde View Post
Emulated; neurons being emulated by artificial neurons.
There is all the difference in the world, though, between emulation by replication or emulation by representation.

If I create a tornado in a storm box, that's replication. (Both systems can be measured with the same tools, e.g. an anemometer, and neither system needs to include a brain state.)

If I create a tornado in a computer simulation, that's representation. (The physical computations of the second don't exhibit the properties which define the first, and a brain state must be part of the complete system for a full description.)

If you fail to keep the difference in mind when analyzing the second system, you are likely to grant the imaginary bits (the parts that require brain states, which is to say the "world of the simulation") the same physical reality as the non-imaginary bits (the physical computations of the simulator machine).

And if you use the term "emulation" for both processes, you are bound to make this mistake.
__________________
.
How can you expect to be rescued if you don't put first things first and act proper?
Old 1st March 2012, 03:45 PM   #2164
westprog
Philosopher
 
westprog's Avatar
 
Join Date: Dec 2006
Posts: 8,928
Originally Posted by Piggy View Post
No.

Real aggregation is "physical addition" because it occurs in systems that don't require a brain state to be part of them.

If 2 wolves join a group of 5 wolves, or if 4 sticks drift into a group of 2 sticks, then real aggregation or physical addition has taken place.
I agree up to a point - but the concept of "wolf" or "stick" is something we apply to the universe, not something inherent. We classify things according to their behaviour, but is there an objective addition going on? When do the wolves join the group? At what distance? How big does a stick have to be to be a stick?

IMO, the concept of "addition" is extremely difficult to tie down in a physical sense. We can apply addition as part of a physical model, but as a physical event, it's so universal as to not be particularly useful.
__________________
Dreary whiner, who gradually outwore his welcome, before blowing it entirely.
Old 1st March 2012, 03:51 PM   #2165
Piggy
Unlicensed street skeptic
 
Piggy's Avatar
 
Join Date: Mar 2006
Location: Ralph's side of the island
Posts: 15,924
Originally Posted by dlorde View Post
Wait a moment (I thought I might be making a mistake putting all that stuff in a single post).

I started with a black box that replaces the visual cortex, and you appeared to agree that if it could interface with the incoming and outgoing neurons appropriately and reproduce the same outputs given the same inputs, that this could work - the patient could see and remain conscious.
Oh, sorry, looks like I deserve my own complaint.

You're right, I thought we had gone back to the point in the discussion about swapping the whole brain out. My mistake, sorry.

Originally Posted by dlorde View Post
I then suggested a number of scenarios based on that, replacing more subsystems in the same way, and/or extending the scope of the original black box to encompass more of the brain function. Finally I suggested replacing the whole brain with a black box (half seriously, half in jest).

My purpose was to see, given you accepted visual cortex replacement (didn't you?), whether you feel there is a point beyond which replacing those subsystems with black boxes would 'break' consciousness, or whether you feel it is possible to have a human-like black box consciousness that doesn't necessarily function in terms of artificial neurons (this because of previous suggestions that the physical structure would need to be emulated).

These questions obviously require some knowledge and understanding of the functional architecture of the brain, and some idea or ideas of which parts of that architecture might be involved in consciousness (e.g. the frontal cortex, but not the cerebellum - which is an obvious black-box candidate). I assume you have enough knowledge of and opinions about these things, given your steadfast and authoritative statements in the thread.

In other words, I'm curious to know what extent of the brain you think it might be theoretically possible to replace in the way described, and still maintain conscious function. What do you think the constraints are, where do you feel problems might lie, etc. (given that we could produce such black boxes and connect them) ? I think it's germane to the thread, but I'll understand if you don't want to tackle it.

Part of my motivation is that I think it may soon be possible to do this kind of replacement for real with very simple brains - to monitor the substantive inputs and outputs of a neural subsystem, train a learning system to reproduce the functionality, ablate the monitored tissue, and use the monitoring probes to enable the learning system to replace it. Clearly it's a long way from the pure speculation above, but it was the stimulus for it (ha-ha).
OK, now I get you.

And yeah, this really gets us to the big questions about consciousness, and how it relates to the larger brain structure.

The real tangle with a question like this is that we have really just 2 choices in how to imagine the black box replacing V.

On the one hand, we can try to imagine how we would literally wire up such a box.

On the other hand, we can simply assume that we have something set up that accepts all the neural bombardment that V takes in, and spews out all the neural bombardment V emits, from and to the right locations, and not worry about where the apparatus lives.

The difference between these two options is not so great when we consider a brain with the consciousness function inoperable as when we consider a brain with the function up and running.

If we assume no consciousness, we can start with our simpler solution and just map inputs and outputs.

In theory, this should work... unless there are as-yet unknown factors in the electrochemical workings of the non-conscious brain that make physical arrangement in spacetime a factor -- which is possible, given what we know about power lines, for instance -- but since we don't have any reason to believe that at the moment, let's just let this conclusion stand provisionally and assume we can replace V with a black box that's I/O equivalent.

(This thing's gonna have a mind-boggling array of inputs and outputs, though.)
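Just to make "I/O equivalent" concrete, here's a deliberately crude Python sketch. Every name in it is made up, and a real V would have millions of fibres in and out rather than a handful:

Code:
# Purely illustrative "I/O equivalent" black box; all names are hypothetical.
class BlackBoxV:
    def __init__(self, io_table):
        # io_table maps an observed input spike pattern (a tuple of 0s and 1s,
        # one entry per incoming fibre) to the output pattern V was seen to emit.
        self.io_table = io_table

    def respond(self, input_spikes):
        # Hand back whatever the real V emitted for this input pattern;
        # for patterns we never recorded, stay silent (all zeros).
        return self.io_table.get(tuple(input_spikes), (0, 0, 0, 0))

# Two recorded input patterns and the outputs the real V produced for them:
box = BlackBoxV({(1, 0, 1, 0): (0, 1, 1, 0),
                 (0, 1, 0, 1): (1, 0, 0, 1)})
print(box.respond([1, 0, 1, 0]))   # (0, 1, 1, 0)

All the box does is hand outputs back for the inputs it has seen; nothing inside it performs the work V was doing to produce those outputs, which is the problem I get to next.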

But because our black box isn't doing any of the physical computation that V actually performed, any brain processes which depend on that work will either fail or go haywire in some way.

If consciousness depends at all on the coherence and synchronization of its "signature" global brain waves, which is to say if they're not just noise like the heat coming out of my computer or the noise coming out of my truck, then it's quite likely our black box just created a conscious "blind spot" which will remove visual activity from consciousness.

And although we don't know yet, my bet is that the brain waves are not noise, since we have few other good candidates for mechanisms to synchronize the activity of disparate brain regions, which we know is happening in conscious experience.

You could try various configurations of black boxes for different areas of the brain, and you'd just have to see by trial and error what happened to the reports of a subject and the observation of the signature waves.

I would bet that some configurations would create impaired consciousness and others would prohibit consciousness.

(Which means that all possible attempts to simulate a conscious brain may simply fail to produce consciousness because they are logically inconsistent, like the moat in Junction Pass which doesn't work because they switched from letters to pumpkins, which also can't work because the moat won't run.)

But even that model is simplistic, considering the role of bundling, such as the McGurk effect, in which the visual image of a speaker's lips determines which of any two ambiguous sounds we hear (say, B and F) even if we know in advance what the sound will actually be. Which means the visual and auditory streams have been merged upstream somehow in a way that consciousness doesn't have the tools to untangle.

This is why we are able to see a car rolling along the road, and not an overwhelming barrage of shapes, colors, sounds, and motions (which is what's really hitting our heads).

If a black box removes a physical component necessary for bundling across disparate regions, the effects could be quite unpredictable and very weird.
__________________
.
How can you expect to be rescued if you don't put first things first and act proper?
Old 1st March 2012, 04:03 PM   #2166
yy2bggggs
Master Poster
 
yy2bggggs's Avatar
 
Join Date: Oct 2007
Posts: 2,435
Originally Posted by Piggy View Post
No.

Real aggregation is "physical addition" because it occurs in systems that don't require a brain state to be part of them.

If 2 wolves join a group of 5 wolves, or if 4 sticks drift into a group of 2 sticks, then real aggregation or physical addition has taken place.
Your distinction is arbitrary, and I don't see the point of it. Why should I say that 2 wolves joining a pack of 5 wolves performs a "physical addition", and that this is the sort of thing worth recognizing as a type of thing in itself, different from all other sorts of addition? How come it's 2 wolves joining a group of 5 instead of 7 total wolves joining forces--in which case it's not an operation at all? Or, maybe the 5 that were a group were all stragglers at first, and then formed the group of 5... in which case, what physically happened was that 7 individual wolves joined together into a group, and we simply caught this mid-process--making this a physical multiplication?
Old 1st March 2012, 04:06 PM   #2167
Piggy
Unlicensed street skeptic
 
Piggy's Avatar
 
Join Date: Mar 2006
Location: Ralph's side of the island
Posts: 15,924
Originally Posted by rocketdodger View Post
I would like to try another approach to this that hopefully doesn't spiral out of control. Lets partake in a thought exercise, taking baby steps, and see where people start to diverge.

So everyone here is in some sort of a space, which they can perceive only to a finite distance ( meaning even if they are outdoors there is some limit to the distance they can see stuff happening at ), looking at a computer and thinking about what is on the screen.

Does anyone disagree?
So far, so good.

Originally Posted by rocketdodger View Post
Furthermore, in this situation, everything that exists is composed of fundamental particles arranged and behaving in various ways. There is no constraint that we humans are aware of all the types of particles, or that we understand them fully, only that everything in the situation must be composed of "something" and we can call the smallest units we can observe "fundamental particles," mainly because that is what physicists term them.

Does anyone disagree?
Still good.

Originally Posted by rocketdodger View Post
Furthermore, the behavior of the particles can be described with mathematics. It is possible that new behaviors are discovered, and current mathematics can't describe the newly discovered behaviors, but when that happens we just supplement and improve our mathematics, so that in the end we can always describe all the behavior of all the particles we can observe.

I understand that things like quantum uncertainty are present, but those types of things can be counted under behaviors that aren't necessarily fully observable at all times, and anyway I think they are probably immaterial to the discussion -- we are concerned with the deterministic behaviors of particles here.

Does anyone disagree?
I'm with you, brother.

Originally Posted by rocketdodger View Post
So to sum up the first baby step is to come to an agreement that we and the spaces we inhabit are composed of particles the behavior of which can be described mathematically. Whether the particles "follow" mathematics, or whether mathematics is just the "description" of the particles, or anything in between, isn't necessarily relevant.

If nobody disagrees we can move onto the next baby step in the thought exercise.
Cool.
__________________
.
How can you expect to be rescued if you don't put first things first and act proper?
Old 1st March 2012, 04:26 PM   #2168
westprog
Philosopher
 
westprog's Avatar
 
Join Date: Dec 2006
Posts: 8,928
Originally Posted by yy2bggggs View Post
Your distinction is arbitrary, and I don't see the point of it. Why should I say that 2 wolves joining a pack of 5 wolves performs a "physical addition", and that this is the sort of thing worth recognizing as a type of thing in itself, different from all other sorts of addition? How come it's 2 wolves joining a group of 5 instead of 7 total wolves joining forces--in which case it's not an operation at all? Or, maybe the 5 that were a group were all stragglers at first, and then formed the group of 5... in which case, what physically happened was that 7 individual wolves joined together into a group, and we simply caught this mid-process--making this a physical multiplication?
We are also adding all the hairs on the wolves, or multiplying the number of wolves by four to get the number of feet. I can't really see this as an actual physical process.
__________________
Dreary whiner, who gradually outwore his welcome, before blowing it entirely.
Old 1st March 2012, 04:32 PM   #2169
yy2bggggs
Master Poster
 
yy2bggggs's Avatar
 
Join Date: Oct 2007
Posts: 2,435
Originally Posted by westprog View Post
We are also adding all the hairs on the wolves, or multiplying the number of wolves by four to get the number of feet. I can't really see this as an actual physical process.
You two are on opposite ends of this. Whereas it may have no preferred description, it is a physical process, and the number of wolves that result is indeed really 7. And yes we can also discuss other aspects of the system, but that's not a discussion of the entities we label wolves.

Given a wolf is what we call it, and that is an entity that we learned to recognize (as opposed to invented), 7 is the only correct answer for how many there are.
Old 1st March 2012, 04:46 PM   #2170
Piggy
Unlicensed street skeptic
 
Piggy's Avatar
 
Join Date: Mar 2006
Location: Ralph's side of the island
Posts: 15,924
Originally Posted by rocketdodger View Post
I think that confusion stems from a refusal to accept that for many functions emulation is the same as simulation.

Case in point, if you have 4 neurons in series, connected to other neurons at both ends, and you replace all 4 with artificial emulators, the overall network should function the same.

Now replace the 4 with a single emulator, that internally functions by simulating each of the 4 original neurons. If an emulator works by simulating anything, those simulations are also emulations.

I think even piggy would agree that this will also preserve the properties of the network.

Yet when we extrapolate, there is some arbitrary point where people start to think the internal simulations cease to also be emulations.
It's funny, from our point of view, there's a similar point in your approach where the imaginary suddenly becomes real.

So let me tackle that.

And yeah, I do agree that in some cases there's no difference between simulation and emulation. In fact, I can describe those cases, I think: it's when no real work needs to be done by the physical system during that portion of the chain (except what might be coincidentally done by the original part and the replacement, such as output of random heat) and when the implementation of the hardware doesn't hinder the functioning of other subsystems.

And the necessary fallout from this is that simulations cannot also be emulations -- or in my lingo, representations cannot also be replicas -- at any point where the system does any real work that the replacement can't also perform.

That's why a representation (including a computer simulation) of a kidney can never replace the entire kidney -- it can't do the physical work. You have to have an actual dialysis machine for that.

That's why Pinocchio can never be a boy as long as he's made out of wood -- wood can't do the physical work that a human body does.

Or we can imagine Major Tom in a space suit, out repairing a space probe.

Let's say he can either use his jet pack to float back to his ship, or he can teleport like in Star Trek.

We'll ignore the problematic bits and just stipulate that he can be Wonkavisioned through space with the relative position and type of his particles preserved in the transfer of photons or whatever, and reassembled from that configuration into a corresponding collection of massive particles, a man in a spacesuit.

Now let's imagine that on this day Tom spots a space squid between him and the mother ship. A small one, but big enough to break something important, so he decides to kill it on his way back.

If David back on the ship teleports him, he can't kill the space squid.

But why not? The teleportation is real, the inputs match the outputs... but it's like I said, there is some real work to be done at that point in the chain which cannot be performed by the physical choice we've made for Tom's transportation (whatever it is) unless it kills anything in its beam, which it probably doesn't because then it would be too dangerous to use.

We could even program the transporter to rearrange the information about Tom's physical state along the way, run it through the same transformations the physical Tom would go through if he had killed the squid.

He would emerge with the memory of using his jet pack to return, and killing the squid along the way. Then he'd look out the portal and see the damn thing still there!

Ok, so getting back to the brain example, if we get down to the level of one neuron, it's likely (but not certain) that there's a medium available which we can use as a replica. And if there's only one action that's important (e.g. a neuron fired or it didn't), then it's hard to argue that such a replica isn't also a digital simulation.
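Just as an illustration (not a claim about how real neurons work), a toy integrate-and-fire unit reduces the neuron to exactly that single fired-or-didn't action:

Code:
# Toy leaky integrate-and-fire unit (illustrative only): the one externally
# visible "action" per step is a binary fired / didn't-fire event.
def lif_step(potential, input_current, leak=0.9, threshold=1.0):
    """Advance the membrane potential one step; return (new_potential, fired)."""
    potential = potential * leak + input_current
    if potential >= threshold:
        return 0.0, True     # fire and reset
    return potential, False

v = 0.0
for current in [0.3, 0.4, 0.5, 0.1]:
    v, fired = lif_step(v, current)
    print(fired)             # False, False, True, False

At that scale, whether you call the thing a replica or a digital simulation makes no practical difference.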

But if we try to zoom out to the level of the brain, specifically a conscious brain, then we'd have to assume that there is no physical work at all which is ever important in the functioning of the brain which will not also be produced coincidentally by a machine designed to run computer sims, if we are to believe that it could be replaced by a machine designed to run computer sims.

That's a tall order.

And it becomes an insurmountable one when we consider that consciousness is a phenomenon observable in spacetime, which makes it objectively real -- my body as a physical object is generating a conscious experience right now which is locatable in both time and space (i.e., it ain't happening tomorrow in Paris) -- and the laws of physics demand that all real phenomena require some sort of work.

Therefore we can conclude that a simulation/representation of a brain, rather than a replica of a brain, cannot generate a real instance of conscious awareness -- as happens, for instance, when a baby begins to exhibit that behavior -- because the physical work of the machine is too different from the physical work of a brain to make it happen.

So yes, the point of separation you're talking about is very real.
__________________
.
How can you expect to be rescued if you don't put first things first and act proper?
Old 1st March 2012, 04:51 PM   #2171
Piggy
Unlicensed street skeptic
 
Piggy's Avatar
 
Join Date: Mar 2006
Location: Ralph's side of the island
Posts: 15,924
Originally Posted by Modified View Post
That was my point. What you were saying was in no way a response to what I said, which was a very simple and clear statement about the nature of computation. It was phrased as if it was a response though. Such replies make communication difficult.
We are having communication difficulties, but it's both ways.

What I said was a response... I won't get into why because it doesn't matter....

Where were we? In all the confusion I lost count.
__________________
.
How can you expect to be rescued if you don't put first things first and act proper?
Old 1st March 2012, 05:25 PM   #2172
dlorde
Philosopher
 
dlorde's Avatar
 
Join Date: Apr 2007
Posts: 6,216
Originally Posted by westprog View Post
I mean a computer implementation that runs a computation supposedly equivalent to what is running in the brain - Turing-equivalent, technically. Such a computation wouldn't be running on artificial neurons, and it wouldn't be "plug-compatible" with the brain.
Why not? (to both).

Quote:
The disagreement is not about the artificial brain - I think that most people would agree with that as likely workable, in concept anyway. It is with the idea that the functionality of the said artificial brain could be expressed, without loss of functionality, as a computation, and that said computation could be implemented on any computing hardware.
But artificial neurons would work by computation; how else?
__________________
Simple probability tells us that we should expect coincidences, and simple psychology tells us that we'll remember the ones we notice...
Old 1st March 2012, 05:32 PM   #2173
dlorde
Philosopher
 
dlorde's Avatar
 
Join Date: Apr 2007
Posts: 6,216
Originally Posted by westprog View Post
... It's quite obvious that any arbitrary computer can't be just stuffed into the brain cavity. Therefore it cannot have "exactly the same functionality".
Why should its functionality depend on where it can be stuffed?

In any case, we could conceive of an IO interface in the brain cavity that is wirelessly connected to the artificial brain residing elsewhere. How would that make the functionality different?
__________________
Simple probability tells us that we should expect coincidences, and simple psychology tells us that we'll remember the ones we notice...
Old 1st March 2012, 05:34 PM   #2174
dlorde
Philosopher
 
dlorde's Avatar
 
Join Date: Apr 2007
Posts: 6,216
Originally Posted by Piggy View Post
Consciousness is not a thing, it's an event.
More than one, I hope. It's a process, isn't it?
__________________
Simple probability tells us that we should expect coincidences, and simple psychology tells us that we'll remember the ones we notice...
Old 1st March 2012, 05:38 PM   #2175
Piggy
Unlicensed street skeptic
 
Piggy's Avatar
 
Join Date: Mar 2006
Location: Ralph's side of the island
Posts: 15,924
Originally Posted by rocketdodger View Post
If I wasn't learning things from all the posts people are making, showing you how utterly wrong you are on all this stuff, I would have stopped reading this thread long ago.

That is how frustrating your empty arguments have become piggy.

Take this latest one for instance. Given that any computer we program must be "built" first by assembling the hardware, and the "programming" is done by changing physical properties of the hardware, and given that we know the brain typically develops over time more by changing synapse strength rather than actually growing neuron connections, which is very much closer to how we "program" computers rather than to how we "build" computers, but the brain must be "built" before it can be "programmed" anyway, I don't know wtf you are talking about when you try to distinguish between "building" something and "programming" something. According to *any* formal definition you could come up with, "programming" is merely fine-grained "building," -- there is zero qualitative difference between the two.

For someone who is trying to bark up the tree of equivocating physical processes when no "observer" is present, you sure do pull a lot of arbitrary and unexplained distinctions out of you-know-where.
I'm glad you brought this up, because I wasn't aware there was any confusion on this point. However, I can see why my too-narrow use of the word "built" may have caused it.

Brains are built and computers are built. That's true.

Which means that you are 100% correct that programming is "fine-grained building". What the programmer is doing at the keyboard is changing the structure of the machine it's connected to so that some of its parts behave in different ways when electricity runs through them than they did before.

(Fortunately, the programmer doesn't have to know what these physical changes are, or even be aware of them, to perform the operation remotely.)

But it is fine-grained building of a particular type. For instance, building a skyscraper isn't "programming it" in the sense we mean when we say computers are programmed.

Programming the computer involves manipulating it so that it moves in ways which are intended to mimic, in some useful form, a set of ideas in the head of the programmer, perhaps associated with some real-world system or some imaginary system... or perhaps randomly if anyone desires to be reminded of a random number or color or sound at any time in the future.

Computers can be made in many configurations, out of many types of materials, each with its own physical properties, some faster and easier to manage than others.

So the question is this: Can that kind of "building" alone allow us to take the material we're working in and turn it into a conscious object? Or will that end up being like programming a skyscraper into existence... the wrong kind of building?

Well, to answer that, we have to ask "What sort of building is necessary to make consciousness happen?"

The short answer to that, of course, is: We don't know.

There simply are no satisfactory answers right now.

That situation by itself means first that we cannot confirm that a machine which is not conscious can be made conscious by programming alone, without building some other structures designed for the purpose, not of supporting symbolic logic, but of enabling the phenomenon of conscious awareness.

But digging further, the brain is obviously doing some sort of real work to make the phenomenon happen. That's beyond doubt, and nobody in biology questions it that I know of.

If no work were involved, the phenomenon could not occur. Because there are no observable phenomena that have no real causes.

Are the signature waves noise, or are they doing some of the necessary work?

Nobody knows right now, but they're currently our best lead for a component of the work that has to happen somehow, the synchronization of patterns in disparate parts of the brain.

In any case, since we know that consciousness is the product of some sort of physical work of the brain, we're not going to get a human body, or a robot body either, conscious unless we've got some structures that are designed to do whatever minimum work turns out to be necessary.

This is what eliminates the possibility of a pure-programming solution.

It simply means that if the machine wasn't already equipped to perform this behavior to make the phenomenon occur in spacetime, you can't do the kind of fine-building called programming (which is intended to make the computer mimic some other system symbolically, regardless of the actual physical actions of the computer which could be any number of possibilities) and expect to make it perform that behavior.

It's like saying you can take a machine without a CD player and make it play CDs by programming alone.

Can't be done, because programming doesn't get you the real work you need.

Programming and a laser and the other stuff, then yeah, you can perform the task and get a real phenomenon going, air molecules bouncing around.

Programming and the right hardware -- which will be more than just the hardware required to run the logic, because the laws of physics demand it -- will get you a real event of conscious awareness.

If you want to say, no, the brain or machine representing things to itself can cause conscious awareness, then you're in trouble.

There are several reasons why, but one fatal reason is that the brain represents things to itself via feedback in many known ways which are not involved in conscious awareness, although some are. So this requires us to follow up by asking "What makes the difference between feedback loops involved in conscious experience, and those that aren't?"

That's kinda where we are now.
__________________
.
How can you expect to be rescued if you don't put first things first and act proper?
Old 1st March 2012, 05:46 PM   #2176
Piggy
Unlicensed street skeptic
 
Piggy's Avatar
 
Join Date: Mar 2006
Location: Ralph's side of the island
Posts: 15,924
Originally Posted by yy2bggggs View Post
I'm confused. The way I read this, you're claiming that you can derive the fact that consciousness cannot be programmed, but must be built, from the fact that you cannot see how a simulation would become conscious.

Is this your claim?
Not quite. A computer simulation cannot be conscious if consciousness requires any physical work on the part of the brain which is not also done by the computer.

Given what we know, we can safely conclude this is the case. (If it weren't, then it's difficult to see how consciousness could be postponed until after binding, and so forth.)

And if consciousness is a result, even in part, of the brute electrophysical activity of the brain, then building it would not constitute "programming" for the same reason that installing a traffic sensor under an intersection isn't considered programming.

I mean, you could call it that, but only if our universe is the computer.

Which is perfectly valid, but rather trivial.
__________________
.
How can you expect to be rescued if you don't put first things first and act proper?
Old 1st March 2012, 05:52 PM   #2177
Piggy
Unlicensed street skeptic
 
Piggy's Avatar
 
Join Date: Mar 2006
Location: Ralph's side of the island
Posts: 15,924
Originally Posted by westprog View Post
I agree up to a point - but the concept of "wolf" or "stick" is something we apply to the universe, not something inherent. We classify things according to their behaviour, but is there an objective addition going on? When do the wolves join the group? At what distance? How big does a stick have to be to be a stick?

IMO, the concept of "addition" is extremey difficult to tie down in a physical sense. We can apply addition as part of a physical model, but as a physical event, it's so universal as to not be particularly useful.
Actually, I'd rather not get off into that because, interesting as it is, and informative as it may be about certain things, it's going to get us off topic.

Suffice it to say that things in the universe do aggregate and de-aggregate, especially from the point of view of a critter with a field of vision who is taking in a terrestrial landscape, and that this real behavior is what our brains have learned to, in some ways, represent symbolically.
__________________
.
How can you expect to be rescued if you don't put first things first and act proper?
Old 1st March 2012, 06:02 PM   #2178
Piggy
Unlicensed street skeptic
 
Piggy's Avatar
 
Join Date: Mar 2006
Location: Ralph's side of the island
Posts: 15,924
Originally Posted by yy2bggggs View Post
Your distinction is arbitrary, and I don't see the point of it. Why should I say that 2 wolves joining a pack of 5 wolves performs a "physical addition", and that this is the sort of thing worth recognizing as a type of thing in itself, different from all other sorts of addition? How come it's 2 wolves joining a group of 5 instead of 7 total wolves joining forces--in which case it's not an operation at all? Or, maybe the 5 that were a group were all stragglers at first, and then formed the group of 5... in which case, what physically happened was that 7 individual wolves joined together into a group, and we simply caught this mid-process--making this a physical multiplication?
Well, you can move off into philosophical musings about it, but that's not really necessary in order to do what needs to be done here.

In fact, I don't see how it will help.

The point is, things do aggregate and deaggregate in the universe, moving into and out of clusters, and our brains evolved to notice this.

Arguing about where groups begin and end is like the argument over the border between a language and dialect or between atmosphere and outer space... a pointless exercise that doesn't help us use these terms in the contexts in which they're significant.

Aggregations in systems that don't need to include brain states -- whether that's raindrops filling up a footprint to make a small puddle, or predators closing ranks to make a more formidable fighting force -- are examples of physical addition.

So if a recipe says to add a pinch of salt to the batter, it's not asking you to do anything symbolic, but physical.

Now let's compare that to what PixyMisa's marble machine does.

Physically, it simply aggregates however many marbles you put on top into a single pile at the bottom.

So any configuration of 3 marbles at the top will be physically aggregated as a group of 3 marbles at the bottom.

That's physical addition of the separate marbles into a group (philosophical debates over the borders of groups notwithstanding).

But if you bring a brain into the system which understands the symbols painted on it and the rules for its operation, then (and only then) do you have symbolic addition, in which the observing brain may be reminded of the number 26 or 52 or 8 rather than 3 when the 3 marbles make their run from the top down into the bowl.

That is the difference, and it's real and it's clear, regardless of the philosophical woolgathering we could do about where groups begin and end.
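Here's the same contrast in a few lines of Python, purely as an illustration; the slot numbers and the binary place-value reading are made up to mimic a marble adder:

Code:
# Illustrative contrast between physical aggregation and symbolic addition.
# 'drops' records which slot each of three marbles was dropped into.
drops = [1, 3, 4]                                  # three marbles, three slots

marbles_in_bowl = len(drops)                       # physical aggregation: 3 marbles
symbolic_value = sum(2 ** slot for slot in drops)  # a brain that knows the convention
                                                   # reads this as 2 + 8 + 16

print(marbles_in_bowl)   # 3  -- what the machine physically does
print(symbolic_value)    # 26 -- what it "means" only to a brain in the system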
__________________
.
How can you expect to be rescued if you don't put first things first and act proper?
Old 1st March 2012, 06:18 PM   #2179
Piggy
Unlicensed street skeptic
 
Piggy's Avatar
 
Join Date: Mar 2006
Location: Ralph's side of the island
Posts: 15,924
Originally Posted by westprog View Post
We are also adding all the hairs on the wolves, or multiplying the number of wolves by four to get the number of feet. I can't really see this as an actual physical process.
Hairs that are always together on a wolf aren't aggregating. They are aggregated.

Ditto the feet.

However, if you look at a wolf and it has 4 feet and you glance away and back and it has 3, your brain is going to notice, and cause you to stare at the wolf intently.

That's because unexpected deaggregation of an animal's feet is important.
__________________
.
How can you expect to be rescued if you don't put first things first and act proper?
Old 1st March 2012, 06:28 PM   #2180
Piggy
Unlicensed street skeptic
 
Piggy's Avatar
 
Join Date: Mar 2006
Location: Ralph's side of the island
Posts: 15,924
Originally Posted by dlorde View Post
But artificial neurons would work by computation; how else?
An artificial neuron could only work by physical computation.

If it doesn't, it has no means of communicating with the neurons it connects, which operate exclusively by physical computation.

I mean, you could include a symbolic computation in the mix, but why would you?

If you can replace

Neuron -> Neuron -> Neuron

with

Neuron -> Replica -> Neuron

Why bother with

Neuron -> Modulator -> Simulation -> Demodulator -> Neuron?
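Spelled out as a toy Python sketch (every function name here is hypothetical, and everything hard about interfacing with real tissue is glossed over), the two chains look like this:

Code:
def replica(spike):          # the short chain: a physical stand-in, one step
    return spike

def modulator(spike):        # transduce a physical spike into a number
    return 1.0 if spike else 0.0

def simulation(x):           # symbolic model of the neuron's transfer behaviour
    return x >= 0.5

def demodulator(fired):      # transduce the model's answer back into a spike
    return bool(fired)

spike_in = True
print(replica(spike_in))                             # True
print(demodulator(simulation(modulator(spike_in))))  # True -- same output, with
                                                     # two extra transduction steps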
__________________
.
How can you expect to be rescued if you don't put first things first and act proper?
Old 1st March 2012, 06:33 PM   #2181
westprog
Philosopher
 
westprog's Avatar
 
Join Date: Dec 2006
Posts: 8,928
Originally Posted by yy2bggggs View Post
You two are on opposite ends of this. Whereas it may have no preferred description, it is a physical process, and the number of wolves that result is indeed really 7. And yes we can also discuss other aspects of the system, but that's not a discussion of the entities we label wolves.

Given a wolf is what we call it, and that is an entity that we learned to recognize (as opposed to invented), 7 is the only correct answer for how many there are.
Wolves are a tricky example, as they might be conscious of how many of them there are, but we can count all kinds of things in that system, and I don't think that any of them are more physical than any other. Suppose we want to count the pretty wolves, or the dirty ones?
__________________
Dreary whiner, who gradually outwore his welcome, before blowing it entirely.
Old 1st March 2012, 06:51 PM   #2182
Piggy
Unlicensed street skeptic
 
Piggy's Avatar
 
Join Date: Mar 2006
Location: Ralph's side of the island
Posts: 15,924
Originally Posted by dlorde View Post
More than one, I hope. It's a process, isn't it?
Any real thing can be described as an event, and any real event can be described as a process.

But the question of the unity of consciousness is also an interesting one.

The split brain studies are very illuminating here.

You're probably familiar, but if any lurkers are left who are not, there's a body of research on people who have had to have the connection cut between the two halves of their brain.

You can use physical apparatus to make it possible to communicate only with one or the other hemisphere of the brain, because of the ways our eyes work and our hands and mouths work, while the other one has no idea what y'all are discussing, or only finds out when it hears its own body say it.

Only one of these halves is controlling the language centers that can be used to talk to you (although both sides overhear the conversation).

Gazzaniga had one subject whose vocal side was religious, but whose mute side said (or wrote) that it no longer believed in God.

I probably wouldn't either, shut up in my own skull like that, only able to talk to anyone at the occasional lab test. What kind of God would allow that?

Anyway, there appear to be two coherent conscious entities in that skull now.

And there are the cases of people who lose consciousness of one half of their visual field.

The strange thing is, they don't particularly notice this, the way we don't (can't) notice our blind spot.

What they see fills up their entire conscious field of awareness, just like ours does. Looking at a mall from one end and then from the other, they would describe two completely different scenes. The fact that these views are not symmetrical doesn't bother these folks in the least.

If a fellow trips over a footrest, it wasn't because he couldn't see it, it's because he wasn't looking where he was going.

We know that some experience is bound and can't be untangled by consciousness.

But we also know that brain damage can short-circuit the entangling with very counterintuitive results, such as being able to perceive motion without any perception of shape or color or brightness.

But the evidence seems pretty clear at this point that conscious awareness has to involve real-time synchronization of activity across various regions of the brain, and there's a point at which coherence fails and the body stops being conscious.

That can happen suddenly (as with a lot of anesthesia -- I think the recent link up there mentions some anesthesia studies) or gradually, as when you fall asleep, but it always seems to follow a pattern of weak coherence followed by strengthening when consciousness is re-established.
__________________
.
How can you expect to be rescued if you don't put first things first and act proper?
Old 1st March 2012, 09:18 PM   #2183
yy2bggggs
Master Poster
 
yy2bggggs's Avatar
 
Join Date: Oct 2007
Posts: 2,435
Originally Posted by Piggy View Post
Well, you can move off into philosophical musings about it, but that's not really necessary in order to do what needs to be done here.
The point of my "philosophical musings" is that "addition" is simply a description, but you're trying to treat it as something special. By describing the same situation in multiple ways, I'm trying to show you that it's just a way to describe things.

But fine. Let's go straight to the heart of it.
Quote:
Arguing about where groups begin and end is like the argument over the border between a language and dialect or between atmosphere and outer space... a pointless exercise that doesn't help us use these terms in the contexts in which they're significant.
What exactly is significant about it?
Quote:
Aggregations in systems that don't need to include brain states is physical addition -- whether that's raindrops filling up a footprint to make a small puddle, or predators closing ranks to make a more formidable fighting force -- are examples of physical addition.
Okay, but this is a bit muddled. It's really important to clear this up because we're actually talking about building brain analogues.

I don't know what you mean by "need to include brain states".

Let's say we have an artificial object--a quarter. It rolls onto the floor next to another quarter. Is that physical addition? What if, instead, I put two quarters into a vending machine. Still physical addition? In this case, note that I intentionally put two quarters in--that certainly requires brain states.

Suppose that I put five quarters in, and press A1 to get a snack. Is that physical addition? Now, how about if instead I put one dollar bill in, and one quarter, and press A1 to get a snack. Still physical addition? If so, do we count the sum as the same as the first physical addition? Note that here, to get the required result, we do need $1.25.
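To put numbers on it (toy Python, with the denominations made up to match the example): the machine cares about a symbolic total, not the physical count of objects.

Code:
# Two physically different payments that the machine treats as the same $1.25.
payment_a = [0.25] * 5          # five quarters: 5 objects
payment_b = [1.00, 0.25]        # one bill and one quarter: 2 objects

print(len(payment_a), sum(payment_a))   # 5 1.25
print(len(payment_b), sum(payment_b))   # 2 1.25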
Quote:
Now let's compare that to what PixyMisa's marble machine does.
...
But if you bring a brain into the system which understands the symbols painted on it and the rules for its operation, then (and only then) do you have symbolic addition, in which the observing brain may be reminded of the number 26 or 52 or 8 rather than 3 when the 3 marbles make their run from the top down into the bowl.
But what's so special about a brain here? It's trivial to add a physical machine that converts these placements to a physical quantity. Are you ruling out brain states just to rule out brain states?
Old 1st March 2012, 09:30 PM   #2184
rocketdodger
Philosopher
 
rocketdodger's Avatar
 
Join Date: Jun 2005
Location: Hyperion
Posts: 6,884
Originally Posted by Piggy View Post
An artificial nueron could only work by physical computation.

If it doesn't, it has no means of communicating with the neurons it connects, which operate exclusively by physical computation.

I mean, you could include a symbolic computation in the mix, but why would you?

If you can replace

Neuron -> Neuron -> Neuron

with

Neuron -> Replica -> Neuron

Why bother with

Neuron -> Modulator -> Simulation -> Demodulator -> Neuron?
Probably because that is how it would be implemented.

Man this thread has just gone downhill...
Old 1st March 2012, 09:32 PM   #2185
rocketdodger
Philosopher
 
rocketdodger's Avatar
 
Join Date: Jun 2005
Location: Hyperion
Posts: 6,884
Originally Posted by Piggy View Post
I'm glad you brought this up, because I wasn't aware there was any confusion on this point. However, I can see why my too-narrow use of the word "built" may have caused it.

Brains are built and computers are built. That's true.

Which means that you are 100% correct that programming is "fine-grained building". What the programmer is doing at the keyboard is changing the structure of the machine it's connected to so that some of its parts behave in different ways when electricity runs through them than they did before.

(Fortunately, the programmer doesn't have to know what these physical changes are, or even be aware of them, to peform the operation remotely.)

But it is fine-grained building of a particular type. For instance, building a skyscraper isn't "programming it" in the sense we mean when we say computers are programmed.

Programming the computer involves manipulating it so that it moves in ways which are intended to mimic, in some useful form, a set of ideas in the head of the programmer, perhaps associated with some real-world system or some imaginary system... or perhaps randomly if anyone desires to be reminded of a random number or color or sound at any time in the future.

Computers can be made in many configurations, out of many types of materials, each with its own physical properties, some faster and easier to manage than others.

So the question is this: Can that kind of "building" alone allow us to take the material we're working in and turn it into a conscious object? Or will that end up being like programming a skyscraper into existence... the wrong kind of building?

Well, to answer that, we have to ask "What sort of building is necessary to make consciousness happen?"

The short answer to that, of course, is: We don't know.

There simply are no satisfactory answers right now.

That situation by itself means first that we cannot confirm that a machine which is not conscious can be made conscious by programming alone, without building some other structures designed for the purpose, not of supporting symbolic logic, but of enabling the phenomenon of conscious awareness.

But digging further, the brain is obviously doing some sort of real work to make the phenomenon happen. That's beyond doubt, and nobody in biology questions it that I know of.

If no work were involved, the phenomenon could not occur. Because there are no observable phenomena that have no real causes.

Are the signature waves noise, or are they doing some of the necessary work?

Nobody knows right now, but they're currently our best lead for a component of the work that has to happen somehow, the synchronization of patterns in disparate parts of the brain.

In any case, since we know that consciousness is the product of some sort of physical work of the brain, we're not going to get a human body, or a robot body either, conscious unless we've got some structures that are designed to do whatever minimum work turns out to be necessary.

This is what eliminates the possibility of a pure-programming solution.

It simply means that if the machine wasn't already equipped to perform this behavior to make the phenomenon occur in spacetime, you can't do the kind of fine-grained building called programming (which is intended to make the computer mimic some other system symbolically, regardless of the actual physical actions of the computer, which could be any number of possibilities) and expect to make it perform that behavior.

It's like saying you can take a machine without a CD player and make it play CDs by programming alone.

Can't be done, because programming doesn't get you the real work you need.

Programming and a laser and the other stuff, then yeah, you can perform the task and get a real phenomenon going, air molecules bouncing around.

Programming and the right hardware -- which will be more than just the hardware required to run the logic, because the laws of physics demand it -- will get you a real event of conscious awareness.

If you want to say, no, the brain or machine representing things to itself can cause conscious awareness, then you're in trouble.

There are several reasons why, but one fatal reason is that the brain represents things to itself via feedback in many known ways which are not involved in conscious awareness, although some are. So this requires us to follow up by asking "What makes the difference between feedback loops involved in conscious experience, and those that aren't?"

That's kinda where we are now.
No offense but I am convinced now that you honestly think the more sentences you type the more correct your logic becomes.

It doesn't work that way piggy.
rocketdodger is offline   Quote this post in a PM   Nominate this post for this month's language award Copy a direct link to this post Reply With Quote Back to Top
Old 1st March 2012, 09:33 PM   #2186
yy2bggggs
Master Poster
 
yy2bggggs's Avatar
 
Join Date: Oct 2007
Posts: 2,435
Originally Posted by Piggy View Post
In any case, since we know that consciousness is the product of some sort of physical work of the brain, we're not going to get a human body, or a robot body either, conscious unless we've got some structures that are designed to do whatever minimum work turns out to be necessary.

This is what eliminates the possibility of a pure-programming solution.
No, it does not rule out a pure-programming solution. What if the minimal work turns out to be the controlled manipulation of symbols which are invariant? That's essentially what we're doing with pure programming.

It simply does not follow. You cannot rule this out by speculating that there must be something essential that the brain does, because this could be that essential thing that the brain does. If you want to rule it out, you need to look at what pure-programming can actually do.
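
Just to make that concrete, here is a minimal sketch (Python, purely illustrative, not a model of anything in the brain): the same symbol-manipulation rule run on two different "substrates" does exactly the same work, which is the invariance that pure programming trades on.

Code:
# The same parity rule implemented two ways; if the essential work is the
# rule itself, both implementations do that work.
RULE = {("even", "1"): "odd", ("even", "0"): "even",
        ("odd", "1"): "even", ("odd", "0"): "odd"}

def run_with_dict(bits):
    # Walk the rule table directly.
    state = "even"
    for b in bits:
        state = RULE[(state, b)]
    return state

class ParityMachine:
    # Same rule, different representation: an object holding mutable state.
    def __init__(self):
        self.state = "even"

    def step(self, b):
        self.state = RULE[(self.state, b)]

def run_with_object(bits):
    m = ParityMachine()
    for b in bits:
        m.step(b)
    return m.state

# Both substrates perform the same symbol manipulation, so the results agree.
assert run_with_dict("10110") == run_with_object("10110") == "odd"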
Quote:
Programming and the right hardware -- which will be more than just the hardware required to run the logic, because the laws of physics demand it -- will get you a real event of conscious awareness.
Which law of physics?
Quote:
There are several reasons why, but one fatal reason is that the brain represents things to itself via feedback in many known ways which are not involved in conscious awareness, although some are. So this requires us to follow up by asking "What makes the difference between feedback loops involved in conscious experience, and those that aren't?"
This is true, but it's not fatal. You're not taking into account that an environment can be programmed.
yy2bggggs is offline   Quote this post in a PM   Nominate this post for this month's language award Copy a direct link to this post Reply With Quote Back to Top
Old 1st March 2012, 09:34 PM   #2187
rocketdodger
Philosopher
 
rocketdodger's Avatar
 
Join Date: Jun 2005
Location: Hyperion
Posts: 6,884
Originally Posted by Piggy View Post
It's funny, from our point of view, there's a similar point in your approach where the imaginary suddenly becomes real.

So let me tackle that.

And yeah, I do agree that in some cases there's no difference between simulation and emulation. In fact, I can describe those cases, I think: it's when no real work needs to be done by the physical system during that portion of the chain (except what might be coincidentally done by the original part and the replacement, such as output of random heat) and when the implementation of the hardware doesn't hinder the functioning of other subsystems.

And the necessary fallout from this is that simulations cannot also be emulations -- or in my lingo, representations cannot also be replicas -- at any point where the system does any real work that the replacement can't also perform.

That's why a representation (including a computer simulation) of a kidney can never replace the entire kidney -- it can't do the physical work. You have to have an actual dialysis machine for that.

That's why Pinocchio can never be a boy as long as he's made out of wood -- it can't do the physical work that a human body does.

Or we can imagine Major Tom in a space suit, out repairing a space probe.

Let's say he can either use his jet pack to float back to his ship, or he can teleport like in Star Trek.

We'll ignore the problematic bits and just stipulate that he can be Wonkavisioned through space with the relative position and type of his particles preserved in the transfer of photons or whatever, and reassembled from that configuration into a corresponding collection of massive particles, a man in a spacesuit.

Now let's imagine that on this day Tom spots a space squid between him and the mother ship. A small one, but big enough to break something important, so he decides to kill it on his way back.

If David back on the ship teleports him, he can't kill the space squid.

But why not? The teleportation is real, the inputs match the outputs... but it's like I said, there is some real work to be done at that point in the chain which cannot be performed by the physical choice we've made for Tom's transportation (whatever it is) unless it kills anything in its beam, which it probably doesn't because then it would be too dangerous to use.

We could even program the transporter to rearrange the information about Tom's physical state along the way, run it through the same transformations the physical Tom would go through if he had killed the squid.

He would emerge with the memory of using his jet pack to return, and killing the squid along the way. Then he'd look out the portal and see the damn thing still there!

Ok, so getting back to the brain example, if we get down to the level of one neuron, it's likely (but not certain) that there's a medium available which we can use as a replica. And if there's only one action that's important (e.g. a neuron fired or it didn't), then it's hard to describe that as not also being a digital simulation.

But if we try to zoom out to the level of the brain, specifically a conscious brain, then we'd have to assume that there is no physical work at all which is ever important in the functioning of the brain which will not also be produced coincidentally by a machine designed to run computer sims, if we are to believe that it could be replaced by a machine designed to run computer sims.

That's a tall order.

And it becomes an insurmountable one when we consider that consciousness is a phenomenon observable in spacetime, which makes it objectively real -- my body as a physical object is generating a conscious experience right now which is locatable in both time and space (i.e., it ain't happening tomorrow in Paris) -- and the laws of physics demand that all real phenomena require some sort of work.

Therefore we can conclude that a simulation/representation of a brain, rather than a replica of a brain, cannot generate a real instance of conscious awareness -- as happens, for instance, when a baby begins to exhibit that behavior -- because the physical work of the machine is too different from the physical work of a brain to make it happen.

So yes, the point of separation you're talking about is very real.
Yep, just like I thought.

You really need to condense your arguments into just a few sentences piggy.

Last edited by rocketdodger; 1st March 2012 at 11:30 PM.
rocketdodger is offline   Quote this post in a PM   Nominate this post for this month's language award Copy a direct link to this post Reply With Quote Back to Top
Old 1st March 2012, 09:36 PM   #2188
yy2bggggs
Master Poster
 
yy2bggggs's Avatar
 
Join Date: Oct 2007
Posts: 2,435
Originally Posted by westprog View Post
Suppose we want to count the pretty wolves, or the dirty ones?
Then we need a criterion for pretty wolves or dirty wolves, and since we don't have a well-defined criterion for this in common, there would no longer be only one correct answer.

But that's not a problem, because it has no bearing on the number of wolves, given our criterion for wolves. It's a non sequitur.
yy2bggggs is offline   Quote this post in a PM   Nominate this post for this month's language award Copy a direct link to this post Reply With Quote Back to Top
Old 1st March 2012, 09:42 PM   #2189
yy2bggggs
Master Poster
 
yy2bggggs's Avatar
 
Join Date: Oct 2007
Posts: 2,435
Originally Posted by Piggy View Post
Not quite. A computer simulation cannot be conscious if consciousness requires any physical work on the part of the brain which is not also done by the computer.
Of course.
Quote:
Given what we know, we can safely conclude this is the case. (If it weren't, then it would be difficult to see how consciousness could be postponed until after binding, and so forth.)
Maybe we can safely conclude this is the case. But that's the whole point--we need to safely conclude it. And I have seen no argument from you so far that allows that to happen.
yy2bggggs is offline   Quote this post in a PM   Nominate this post for this month's language award Copy a direct link to this post Reply With Quote Back to Top
Old 1st March 2012, 10:36 PM   #2190
westprog
Philosopher
 
westprog's Avatar
 
Join Date: Dec 2006
Posts: 8,928
Originally Posted by yy2bggggs View Post
Then we need a criterion for pretty wolves or dirty wolves, and since we don't have a well-defined criterion for this in common, there would no longer be only one correct answer.

But that's not a problem, because it has no bearing on the number of wolves, given our criterion for wolves. It's a non sequitur.
I figured this out by going back to first principles. It has nothing to do with physical location, definition, subjectivity - it all comes down to sets.

It's possible for a set to include any object. The set can be defined by a rule (all white men over six feet tall) or simply by enumeration. The rule is just a shortcut.

The set can include any object, compound or otherwise. The set can include wolves individually, or as a group, or each atom in each wolf.

The "addition" is simply the cardinality of the set.

It is entirely objective, but there are a lot of sets.
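
A throwaway sketch (Python, just to illustrate, the wolves are made up): each membership rule picks out a different set over the same objects, and the "addition" is just the cardinality of whichever set you defined -- objective in every case.

Code:
# Each rule defines a different set over the same objects; the count is
# just the cardinality of whichever set you chose.
wolves = [
    {"name": "A", "pretty": True,  "dirty": False},
    {"name": "B", "pretty": False, "dirty": True},
    {"name": "C", "pretty": True,  "dirty": True},
]

all_wolves    = {w["name"] for w in wolves}
pretty_wolves = {w["name"] for w in wolves if w["pretty"]}
dirty_wolves  = {w["name"] for w in wolves if w["dirty"]}

print(len(all_wolves), len(pretty_wolves), len(dirty_wolves))  # 3 2 2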
__________________
Dreary whiner, who gradually outwore his welcome, before blowing it entirely.
westprog is offline   Quote this post in a PM   Nominate this post for this month's language award Copy a direct link to this post Reply With Quote Back to Top
Old 1st March 2012, 11:56 PM   #2191
rocketdodger
Philosopher
 
rocketdodger's Avatar
 
Join Date: Jun 2005
Location: Hyperion
Posts: 6,884
Originally Posted by Piggy View Post
So far, so good.



Still good.



I'm with you, brother.



Cool.
Alright.

Now assume that we have a magical machine which is capable of the following:

1) It can apply an arbitrary spatial/temporal transformation to any number of particles.

2) It keeps a record of all such transformations that have been applied.

3) It alters the behavior of spacetime in a way that results in the interactions between any two particles remaining normal, as if neither of them have had any spatial/temporal transformations applied to them.

This may be hard to follow, so let me provide an example. You are sitting at your computer typing, and the machine decides to apply a 3 meter spatial translation to all the particles within a 0.3 meter diameter sphere, centered at the middle of your head, in a direction roughly "upwards" relative to you.

However, your perception of your space remains normal, and you remain living, because the machine magically ensures that the particles in the sphere and the particles outside the sphere end up producing the same results when they interact, or rather "would have" interacted were it not for the translation. For example, if a quark inside the sphere would have collided with a proton right outside the sphere, even though it is now 3 meters off target, the machine "fakes" it and applies effects identical to the collision on both the quark and the proton in question. Thus from the perspective of the quark and the proton, they really did collide.

Note that the machine doesn't actually alter causation, it merely applies a sort of inverse transformation to any relevant behaviors of the particles so that the results are identical to what they would have been without any funny stuff going on.

Do you agree that such a machine would result in your continued existence in a manner that was totally transparent to you if it applied such a transformation to your head?
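
If it helps, here is a toy 1-D sketch of what I mean (purely illustrative, nothing to do with real physics): apply a translation to some of the "particles", record it, and undo it whenever particles interact. The logical dynamics come out identical to the run with no translation at all.

Code:
# Translate some "particles", record the offset, and undo it whenever two
# particles interact; the result matches the untranslated run.
def simulate(positions, translated, offset, steps=100, dt=0.01):
    pos = list(positions)
    shift = [offset if i in translated else 0.0 for i in range(len(pos))]
    pos = [p + s for p, s in zip(pos, shift)]          # apply the transform
    vel = [0.0] * len(pos)
    for _ in range(steps):
        logical = [p - s for p, s in zip(pos, shift)]  # undo it for interactions
        for i in range(len(pos)):
            force = sum(logical[j] - logical[i] for j in range(len(pos)) if j != i)
            vel[i] += force * dt
        pos = [p + v * dt for p, v in zip(pos, vel)]
    return [p - s for p, s in zip(pos, shift)]         # logical end state

plain   = simulate([0.0, 1.0, 2.0], translated=set(), offset=0.0)
shifted = simulate([0.0, 1.0, 2.0], translated={1},   offset=3.0)
assert all(abs(a - b) < 1e-9 for a, b in zip(plain, shifted))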

Last edited by rocketdodger; 1st March 2012 at 11:58 PM.
rocketdodger is offline   Quote this post in a PM   Nominate this post for this month's language award Copy a direct link to this post Reply With Quote Back to Top
Old 2nd March 2012, 12:04 AM   #2192
!Kaggen
Illuminator
 
!Kaggen's Avatar
 
Join Date: Jul 2009
Location: Cape Town
Posts: 3,734
Originally Posted by westprog View Post
I figured this out by going back to first principles. It has nothing to do with physical location, definition, subjectivity - it all comes down to sets.

It's possible for a set to include any object. The set can be defined by a rule (all white men over six feet tall) or simply by enumeration. The rule is just a shortcut.

The set can include any object, compound or otherwise. The set can include wolves individually, or as a group, or each atom in each wolf.

The "addition" is simply the cardinality of the set.

It is entirely objective, but there are a lot of sets.
Yes, and life is the ability to keep adding .... ;-)
__________________
"Anyway, why is a finely-engineered machine of wire and silicon less likely to be conscious than two pounds of warm meat?" Pixy Misa
"We live in a world of more and more information and less and less meaning" Jean Baudrillard
http://bokashiworld.wordpress.com/
!Kaggen is offline   Quote this post in a PM   Nominate this post for this month's language award Copy a direct link to this post Reply With Quote Back to Top
Old 2nd March 2012, 01:36 AM   #2193
Modified
Illuminator
 
Modified's Avatar
 
Join Date: Sep 2006
Location: SW Florida
Posts: 4,620
Originally Posted by westprog View Post
You're still begging the question, as to whether something that implements a computation going on in the artificial brain is doing the same thing as the artificial brain. We can't easily test this hypothesis, because we can't plug the computer into a human body, unlike the artificial brain. In the event that we can produce an artificial brain, we can see if the human "works properly".
So we can plug an artificial brain made of multiple processors into a human body, but we can't plug in a single computer in the same way?
Modified is offline   Quote this post in a PM   Nominate this post for this month's language award Copy a direct link to this post Reply With Quote Back to Top
Old 2nd March 2012, 01:58 AM   #2194
Belz...
Suspended
 
Belz...'s Avatar
 
Join Date: Oct 2005
Location: In the details...
Posts: 36,686
Originally Posted by Piggy View Post
Not quite. A computer simulation cannot be conscious if consciousness requires any physical work on the part of the brain which is not also done by the computer.
And it can if it does. What's the point?
Belz... is offline   Quote this post in a PM   Nominate this post for this month's language award Copy a direct link to this post Reply With Quote Back to Top
Old 2nd March 2012, 02:21 AM   #2195
PixyMisa
Persnickety Insect
 
PixyMisa's Avatar
 
Join Date: Dec 2002
Location: Sunny Munuvia
Posts: 15,719
Originally Posted by Piggy View Post
Not quite. A computer simulation cannot be conscious if consciousness requires any physical work on the part of the brain which is not also done by the computer.
Yes, the magic bean theory of consciousness. Which you cannot support in any way whatsoever - or even coherently define.
__________________
Free blogs for skeptics... And everyone else. mee.nu
What, in the Holy Name of Gzortch, are you people doing?!?!!? - TGHO
PixyMisa is offline   Quote this post in a PM   Nominate this post for this month's language award Copy a direct link to this post Reply With Quote Back to Top
Old 2nd March 2012, 02:42 AM   #2196
PixyMisa
Persnickety Insect
 
PixyMisa's Avatar
 
Join Date: Dec 2002
Location: Sunny Munuvia
Posts: 15,719
For those who weren't present for the previous threads, I characterised Piggy's argument as consciousness requiring computation and a magic bean.

I've seen nothing since then to change my mind.
__________________
Free blogs for skeptics... And everyone else. mee.nu
What, in the Holy Name of Gzortch, are you people doing?!?!!? - TGHO
PixyMisa is offline   Quote this post in a PM   Nominate this post for this month's language award Copy a direct link to this post Reply With Quote Back to Top
Old 2nd March 2012, 04:28 AM   #2197
dlorde
Philosopher
 
dlorde's Avatar
 
Join Date: Apr 2007
Posts: 6,216
Originally Posted by Piggy View Post
Oh, sorry, looks like I deserve my own complaint.

You're right, I thought we had gone back to the point in the discussion about swapping the whole brain out. My mistake, sorry.
No problem

Quote:
On the one hand, we can try to imagine how we would literally wire up such a box.
Leave that for a rainy day...

Quote:
On the other hand, we can simply assume that we have something set up that accepts all the neural bombardment that V takes in, and spews out all the neural bombardment V emits, from and to the right locations, and not worry about where the apparatus lives.
Yup, this was my view.

Quote:
(This thing's gonna have a mind-boggling array of inputs and outputs, though.)
A couple of big reels of neural probe wire from Maplins should do it
I think it's only ever likely to remain hypothetical for even the smallest brains we think may support consciousness. Smaller brains, maybe.

Quote:
But because our black box isn't doing any of the physical computation that V actually performed, any brain processes which depend on that work will either fail or go haywire in some way.
It would be interesting to know which processes do depend on it.

Quote:
If consciousness depends at all on the coherence and synchronization of its "signature" global brain waves, which is to say if they're not just noise like the heat coming out of my computer or the noise coming out of my truck, then it's quite likely our black box just created a conscious "blind spot" which will remove visual activity from consciousness.

And although we don't know yet, my bet is that the brain waves are not noise, since we have few other good candidates for mechanisms to synchronize the activity of disparate brain regions, which we know is happening in conscious experience.
It seems likely - I've seen a number of articles that suggest synchronised waves across large areas of the cortex are a key feature of conscious mental states.
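
For illustration only (a textbook Kuramoto-style toy in Python, not a cortical model): coupled oscillators settle into phase alignment while uncoupled ones never do, which gives a feel for why that kind of coordination is treated as real work being done rather than as noise.

Code:
# Toy Kuramoto-style sketch: with coupling, phases align; without it, they drift.
import numpy as np

def order_parameter(phases):
    # 1.0 = perfectly synchronised, near 0 = incoherent.
    return abs(np.mean(np.exp(1j * phases)))

def run(coupling, n=50, steps=2000, dt=0.01, seed=0):
    rng = np.random.default_rng(seed)
    phases = rng.uniform(0, 2 * np.pi, n)
    freqs = rng.normal(1.0, 0.1, n)          # slightly different natural rates
    for _ in range(steps):
        mean_field = np.angle(np.mean(np.exp(1j * phases)))
        phases += dt * (freqs + coupling * np.sin(mean_field - phases))
    return order_parameter(phases)

print(run(coupling=2.0))   # close to 1: synchronised
print(run(coupling=0.0))   # stays low: no coordination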

Quote:
If a black box removes a physical component necessary for bundling across disparate regions, the effects could be quite unpredictable and very weird.
Interesting point.

Thanks for that analysis - just the sort of food for thought I was after.
__________________
Simple probability tells us that we should expect coincidences, and simple psychology tells us that we'll remember the ones we notice...
dlorde is offline   Quote this post in a PM   Nominate this post for this month's language award Copy a direct link to this post Reply With Quote Back to Top
Old 2nd March 2012, 04:37 AM   #2198
!Kaggen
Illuminator
 
!Kaggen's Avatar
 
Join Date: Jul 2009
Location: Cape Town
Posts: 3,734
Originally Posted by PixyMisa View Post
For those who weren't present for the previous threads, I characterised Piggy's argument as consciousness requiring computation and a magic bean.

I've seen nothing since then to change my mind.
The alternative is to make up a definition of a brain activity that can be explained without an empirical understanding of that brain activity.
And no, understanding how neurons work is not understanding how the brain works, but how neurons work.
We don't understand dolphins because they are made up of atoms which we do understand. We understand dolphins by studying their behaviour in their environment. You can build models based on the behaviour of dolphin atoms all you want; it's pointless. Unless, of course, you define dolphin behaviour by what their atoms do, in which case, since we know what atoms do, it's just a matter of finding the algorithm that uses dolphin-atom behaviour to predict dolphin behaviour.

It is convenient for pretending to know it all but pointless when it comes to empirical evidence.

Piggy has consistently suggested we study the brain in order to inform our definition of the brain activity called consciousness.
You have never expressed the same interest. Instead you claim we know already how consciousness works and have a definition for it.
__________________
"Anyway, why is a finely-engineered machine of wire and silicon less likely to be conscious than two pounds of warm meat?" Pixy Misa
"We live in a world of more and more information and less and less meaning" Jean Baudrillard
http://bokashiworld.wordpress.com/

Last edited by !Kaggen; 2nd March 2012 at 04:39 AM.
!Kaggen is offline   Quote this post in a PM   Nominate this post for this month's language award Copy a direct link to this post Reply With Quote Back to Top
Old 2nd March 2012, 05:44 AM   #2199
punshhh
Illuminator
 
punshhh's Avatar
 
Join Date: Jul 2010
Location: Rural England
Posts: 4,820
Originally Posted by westprog View Post
I think that some kind of underlying reality is the basic assumption of materialism. If there isn't some kind of underlying reality, then there is only our own subjective experience, and no reason to assume that any other kind is possible.
Quite, and the underlying reality of materialism is not known.
punshhh is offline   Quote this post in a PM   Nominate this post for this month's language award Copy a direct link to this post Reply With Quote Back to Top
Old 2nd March 2012, 05:55 AM   #2200
punshhh
Illuminator
 
punshhh's Avatar
 
Join Date: Jul 2010
Location: Rural England
Posts: 4,820
Originally Posted by rocketdodger View Post
First, you told me you aren't really a monist anyway.
I am primarily a monist.

Quote:
Second, obviously a full simulation of the universe would include a "simulation of timespace in order to capture any kind of dimensional experience."
Yes, I gather you are talking hypothetically, as I doubt that such simulations are doable with current technology, or with our current understanding of what exists or of the role played by timespace in that existence.

Remember we cannot determine what exists, all we can determine is what appears to exist.

If we simulate what appears to exist, we will have a simulation which only appears to be real, including any conscious entities therein.
punshhh is offline   Quote this post in a PM   Nominate this post for this month's language award Copy a direct link to this post Reply With Quote Back to Top