Wednesday, February 16, 2011

Can Google think about dogs?

Vallicella asks here whether a Martian scientist can determine the mental state corresponding to ‘thinking about dogs’ by monitoring the neural state of the thinking person.

And I ask whether a Martian scientist could determine the ‘software content’ corresponding to a search for ‘dogs’ in Google by monitoring the hardware states of the Google search engine. Probably not (see the complicated-looking picture of the inside of Google). Do we conclude that there is more to Google than what can be known even by a completed computer science?


Brandon said...

No, but we might well conclude that there is a genuine distinction between hardware and software -- if software content cannot be monitored by monitoring hardware states alone, that requires that the one isn't simply the same as the other. Thus a complete computer science is not a science of hardware states alone. Indeed, this is pretty much what we actually do conclude as a matter of practice, especially adding in factors like multiple realizability, since that's why we have a hardware/software distinction in the first place.

In other words, while you're probably right that it's too hasty to move from this sort of consideration to the conclusion that physicalism about the mind is false, simply speaking, it wouldn't be unreasonable to move from it to the conclusion that if you were physicalist about the mind you would have to at least be a nonreductive physicalist -- some form of Nagel-style physicalism might still be made plausible, but not (say) Churchland-style physicalism.

Edward Ockham said...

Thank you Siris, and nice to have you popping round for a cup of tea here. I agree. Mostly. Strictly speaking, though, the software is just another aspect of the hardware. If I search for the word ‘dog’ in MS Word then the application will somewhere have to store the binary equivalent for ‘dog’, which in 7-bit ASCII is 110010011011111100111 (I hope). Similarly there is a binary equivalent of the whole text in the document, e.g. ‘waiting for dogot’, which contains the ‘dog’ binary string. Finally there is a binary representation of the application or program itself, which takes the ‘dog’ string and cycles through the text in a mechanical way until it finds the corresponding string in the searched text. This program is also a set of 0s and 1s. We don’t think about this since we are used to thinking of program and data as separate things. But they’re not. Thus the ‘looking for dog’ search can be represented as a relation between the search string and the program, which is all a state of the hardware (specifically, the voltages of all the individual transistors that make up the physical memory of the computer).
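As a quick sanity check on the point above, here is a sketch in Python, assuming the 7-bit ASCII encoding mentioned there: the search string and the searched text are both just bit strings, and the one literally occurs inside the other.

```python
# Sketch of the point above: the search string and the searched text
# are both just bit strings. (The program itself is also, in principle,
# a bit string on disk, though we don't show that here.)

def to_bits(s: str) -> str:
    """Encode a string as concatenated 7-bit ASCII."""
    return ''.join(format(ord(c), '07b') for c in s)

needle = to_bits('dog')                # '110010011011111100111'
haystack = to_bits('waiting for dogot')

print(needle)                          # the 21-bit string quoted above
print(needle in haystack)              # True: the 'dog' bits occur in the text bits
```

Since each character occupies exactly 7 bits, the ‘dog’ bit string occurs in the text's bit string at a 7-bit-aligned offset, which is just the search rephrased one level of abstraction down.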

No sane person would want to do this, of course (although Turing reputedly despised compilers, preferring to work with binary strings – though this was in the days when programs were less complex).

The equivalent of the ‘world’ is the searched text – you might say this is different, since humans interact with the world, not with internal representations. But even there, I’m not sure. My eye is scanning a computer screen as I write this reply. What is actually happening is that my eye is engaging with the complex retinal image – moving the fovea around the image, focusing using its lens and so on. Suppose I search for the word ‘dog’. Then I am rotating my eye in order to move the fovea across the retinal image, until I locate the part of the image that contains ‘dog’. You will throw your hands up in horror at this, having read Austin and Moore and countless others.

Brandon said...

Always glad for a cup of tea.

To say that "the software is just another aspect of the hardware" is arguably misleading, and not strictly speaking true at all; not every hardware state contributes to or even influences software state, and major changes in software state may be associated with very minor changes in hardware state, and vice versa. The only features of hardware that are relevant to software are those whose actual operations are describable in terms of the underlying mathematics of operations on which programming is based, with respect to the inputs and outputs of different subsystems. That is, what counts as relevant about the hardware is determined entirely by the nature of the software it is designed to run, not vice versa. Likewise, software can be just as much about what the hardware is not doing as about what it is.

To put it paradoxically, it would usually make more sense to say that the hardware – or more accurately, the hardware behavior – is just another aspect of the software. Indeed, it is in principle possible for the same software to be run on a standard electronic computer, a photoelectric computer, a spintronic computer, or Babbage's Analytical Engine, even though the relevant features of the hardware would often be physically different for each. Computers are in this sense primarily software: the hardware is just whatever available system is capable of conforming to the mathematics of the software to an adequate level of precision.

What the Google analogy shows is that hardware and software are really distinct objects of study (as distinct as the material composition of a ball is from the dynamics of the ball's motion) that would both need to be considered and interrelated by a complete computer science. And that is, indeed, pretty much what actual computer scientists do.
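A toy illustration of the multiple-realizability point, for what it is worth: the 'software' below is written against an abstract sequence interface, and runs unchanged over three differently constituted in-memory representations of the same text. The representations are of course stand-ins for different hardware, not actual hardware.

```python
# One piece of "software" (a search routine written against an abstract
# sequence interface) running unchanged over physically different
# "realizations" of the same text.

def find(seq, sub):
    """Return the first index where sub occurs in seq, or -1."""
    n, m = len(seq), len(sub)
    for i in range(n - m + 1):
        if all(seq[i + j] == sub[j] for j in range(m)):
            return i
    return -1

as_str   = 'waiting for dogot'
as_bytes = as_str.encode('ascii')        # a different in-memory layout
as_codes = [ord(c) for c in as_str]      # yet another representation

print(find(as_str, 'dog'))                       # 12
print(find(as_bytes, b'dog'))                    # 12
print(find(as_codes, [ord(c) for c in 'dog']))   # 12
```

The search routine never mentions how the sequence is stored; only the input/output behavior of the representation matters, which is the sense in which the relevant features of the 'hardware' are fixed by the software.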

I'm less horrified by your description of the eye than you might think, although I would deny that your eye is engaging with the image at all; it's just undergoing muscular contractions while receiving light through an aperture.

Edward Ockham said...

>> major changes in software state may be associated with very minor changes in hardware state

I don't follow that. Whenever I load an application, I have changed the state of the hardware considerably.

Perhaps there is an analogy with matter and form. The hardware (considered without regard for the states it can take) is like matter. The software – an application that I load onto the hardware, setting its memory to a particular pattern of 0s and 1s – is like form.

>>it is in principle possible for the same software to be run on a standard electronic computer, a photoelectric computer, a spintronic computer, or Babbage's Analytical Engine,

Yes, and in this respect it is like form. The medieval analogy was different lumps of wax, and a seal. You heat the seal and make the same 'form' of impression on the individually different bits of wax.

>>although I would deny that your eye is engaging with the image at all;

That is precisely what I would assert, though I would need (a lot) more space to justify that.

Roughly, there is a proximate object of the eye's (and hence the mind's) operation, and that is the retinal image. Then there is a non-proximate object, and that is the object that is casting the image.

I was going to make a post or two about sense-data, in defence of what Hume says about them. Perhaps more later.

Brandon said...

Roughly, there is a proximate object of the eye's (and hence the mind's) operation, and that is the retinal image.

Interesting; I would deny that the retinal image is even the proximate object of the retina itself, much less the eye, much much less the mind -- it's projected on the back of the eye, and the fact that it is explains some aspects of vision given other facts about the visual system, but I don't see any action of the eye, or even of the rods and cones of the retina, as plausibly taking it as an object.

I'd definitely be interested in a defense of Hume's representationalism.

David Brightly said...

Regarding the software/hardware distinction, I offer the following in irenic spirit. We can view a computer as a network of discrete components such as resistors, capacitors, and transistors. Measurable electrical potentials and current flows occur at and between the nodes of the network. The computer is a dynamical system that we can model by a very large number of coupled differential equations. On this view a loaded program – in effect the distribution of voltages and charges representing 0s and 1s throughout the computer's memory cells and registers – constitutes the initial conditions of the dynamical system. So writing programs can be seen as preparing initial hardware states.

On the other hand, viewing the computer as a network of voltages and currents is already to see it at some level of abstraction. For example, the conductors connecting the circuit elements are not explicitly present, though they are implicit in the couplings between our equations. To understand how a transistor acts as a switch requires a quantum mechanical account of electrons in doped semiconductors, a rather lower level of abstraction. What we have is a physical system viewable at many different levels of abstraction.

One thing our Martian might notice is that the voltages and currents change state synchronously at regular intervals. One step he might therefore take in his reverse engineering project is to reduce a continuous problem to a discrete one. Another observation he might make is that voltages propagate around the circuit in groups of 8, 16, 32, or 64, say. So a further step would be to view the system as interconnected cells and registers of various widths. This would be a higher level of abstraction, but one that we would still label a 'hardware' view.

The upshot of these considerations is that it's not at all easy to make this hardware/software distinction. The notions of 'software content' and 'software state' are going to be hard to pin down. A low level of abstraction tends to be seen as a hardware view and a high level as a software view. At perhaps the highest level of all, where we say the system is looking for the string 'dog', we seem to have transcended the everyday notion of software, i.e. a program in some artificial language.

Does this reconcile the opposing views expressed in the previous comments? Yes, it's all 'hardware' really, because it's a single physical system doing its physical thing. Yes, it's all 'software' really, because every view we take will involve some degree of abstraction. At the higher levels the fact that we are in the first place talking about a lump of matter can easily be lost, and indeed the same description can apply to a radically differently constituted lump of matter.
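The Martian's first two steps might be sketched like this, moving from a continuous reading to a discrete one and then grouping bits into cells. The 5 V logic levels, the 2.5 V threshold, and the 8-bit grouping are assumptions for the sake of illustration, not facts about any particular machine.

```python
# A toy version of the Martian's reverse engineering: the same physical
# state read at successively higher levels of abstraction. The voltage
# values and the 8-bit cell width are assumed for illustration only.

voltages = [0.1, 4.9, 4.8, 0.2, 0.1, 4.7, 0.0, 0.1,   # first cell
            0.2, 4.8, 4.9, 0.1, 4.8, 4.9, 4.9, 4.8,   # second cell
            0.1, 4.9, 4.8, 0.0, 0.1, 4.9, 4.8, 4.9]   # third cell

# Continuous -> discrete: threshold each voltage to a bit.
bits = [1 if v > 2.5 else 0 for v in voltages]

# Bits -> cells: group into 8-bit registers.
cells = [bits[i:i + 8] for i in range(0, len(bits), 8)]

# Cells -> symbols: read each byte as an ASCII character.
text = ''.join(chr(int(''.join(map(str, cell)), 2)) for cell in cells)
print(text)  # 'dog'
```

Nothing in the voltages themselves dictates the threshold or the grouping; those choices are exactly the levels of abstraction the Martian has to guess at.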

Edward Ockham said...

Well, here's the sort of search function we are talking about:

function NaiveSearch(string s[1..n], string sub[1..m])
    for i from 1 to n-m+1
        for j from 1 to m
            if s[i+j-1] ≠ sub[j]
                jump to next iteration of outer loop
        return i
    return not found
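For what it's worth, the pseudocode transcribes directly into runnable Python, shifting the 1-based indices to Python's 0-based convention:

```python
# A direct transcription of the NaiveSearch pseudocode above.

def naive_search(s: str, sub: str):
    """Return the 0-based index of the first occurrence of sub in s,
    or None if it does not occur."""
    n, m = len(s), len(sub)
    for i in range(n - m + 1):
        for j in range(m):
            if s[i + j] != sub[j]:
                break          # jump to next iteration of outer loop
        else:
            return i           # inner loop completed: match found at i
    return None                # not found

print(naive_search('waiting for dogot', 'dog'))  # 12
```

Python's for-else does the work of the pseudocode's 'jump to next iteration of outer loop': the else branch runs only when the inner loop finishes without a mismatch.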

This could be rewritten in assembly language, for which there is a guide here.

Assembly is only one step up from machine language, which is 0s and 1s. And from there, a mere step to voltages. It's harder to make the voltages visible, of course (probably the best way is just to look at the machine code, lol).

David Brightly said...

Oops, did you not ask what hardware state corresponds to the program? Surely that's easy, isn't it? Compile and link the program and load the resulting binary into memory. The machine is now 'wound up' and ready to go. It's in the right initial state such that, once set going, the required search ensues as a matter of physics. The program is directly an encoding of this initial state, seen at some level of abstraction. There is an issue here with regard to what I mean by 'once set going', but I suspect that's not the basis of your objection, if there is one. Are we actually disagreeing?

Excellent choice of processor architecture!

Edward Ockham said...

>>Oops, did you not ask what hardware state corresponds to the program?

I certainly did. At least I meant to. Yes, I think we are agreeing.