He received his A.B. from Columbia University and his PhD from Princeton University. Until his retirement he was State of New Jersey Professor of Philosophy and Cognitive Science at Rutgers University in New Jersey, where he was later professor emeritus.
He is shy and voluble at the same time. Disagreeing with Jerry on a philosophical issue, especially one dear to his heart, can be a chastening experience. His quickness of mind, inventiveness, and sharp wit are not to be tangled with before your first cup of coffee in the morning.
Adding Jerry Fodor to the faculty at Rutgers University instantly put it on the map, Fodor being by common consent the leading philosopher of mind in the world. I had met him in England in the seventies. Fodor died on November 29, 2017, at his home in Manhattan.
Despite the changes in many of his positions over the years, the idea that intentional attitudes are relational has remained unchanged from its original formulation to the present.
Fodor considers two alternative hypotheses. The first completely denies the relational character of mental states and the second considers mental states as two-place relations.
The latter position can be further subdivided into the Carnapian view that such relations are between individuals and sentences of natural languages, and the Fregean view that they are between individuals and the propositions expressed by such sentences.
Further, mental representations are not only the objects of beliefs and desires, but are also the domain over which mental processes operate. They can be considered the ideal link between the syntactic notion of mental content and the computational notion of functional architecture.
These notions are, according to Fodor, our best explanation of mental processes. Following in the path paved by linguist Noam Chomsky , Fodor developed a strong commitment to the idea of psychological nativism.
For Fodor, this position emerges naturally out of his criticism of behaviourism and associationism. These criticisms also led him to the formulation of his hypothesis of the modularity of the mind.
Historically, questions about mental architecture have been divided into two contrasting views. The first can be described as a "horizontal" view because it sees mental processes as interactions between faculties which are not domain specific.
For example, a judgment remains a judgment whether it is judgment about a perceptual experience or a judgment about the understanding of language.
The second can be described as a "vertical" view because it claims that our mental faculties are domain specific, genetically determined, associated with distinct neurological structures, and so on. Gall claimed that mental faculties could be associated with specific physical areas of the brain. It is this domain specificity, together with the ability of such faculties to process information independently of an individual's background beliefs, that allows Fodor to give an atomistic and causal account of the notion of mental content.
The main idea, in other words, is that the properties of the contents of mental states can depend not only on the internal relations of the system of which they are a part, but also on their causal relations with the external world. He insisted that the mind is not "massively modular" and that, contrary to what these researchers would have us believe, the mind is still a very long way from having been explained by the computational, or any other, model.
This view is characterized, according to Fodor, by two distinct assertions. One of these regards the internal structure of mental states and asserts that such states are non-relational.
The other concerns the semantic theory of mental content and asserts that there is an isomorphism between the causal roles of such contents and the inferential web of beliefs. Among modern philosophers of mind, the majority view seems to be that the first of these two assertions is false, but that the second is true. Fodor departs from this view in accepting the truth of the first thesis but rejecting strongly the truth of the second.
As Putnam, Boyd, and others have emphasized, there is surely a presumptive inference from the predictive successes of a theory to the truth of that theory. Fodor maintains that if language is the expression of thoughts and language is systematic, then thoughts must also be systematic.
Fodor draws on the work of Noam Chomsky both to model his theory of the mind and to refute alternative architectures such as connectionism. The systematicity argument states that a cognizer is able to understand some sentences in virtue of understanding others. For example, no one who understands "John loves Mary" is unable to understand "Mary loves John", and no one who understands "P and Q" is unable to understand "P".
Systematicity itself is rarely challenged as a property of natural languages and logics, but some challenge the claim that thought is systematic in the same way languages are. If thought, like language, has such a combinatorial semantics, then there must be a language of thought. This argument touches on the relation between the representational theory of mind and models of its architecture. If the sentences of Mentalese require unique processes of elaboration, then they require a computational mechanism of a certain type.
The syntactic notion of mental representations goes hand in hand with the idea that mental processes are calculations which act only on the form of the symbols which they elaborate. And this is the computational theory of the mind. Consequently, the defence of a model of architecture based on classic artificial intelligence passes inevitably through a defence of the reality of mental representations.
In his view, syntax plays the role of mediation between the causal role of the symbols and their contents. The semantic relations between symbols can be "imitated" by their syntactic relations. The inferential relations which connect the contents of two symbols can be imitated by the formal syntax rules which regulate the derivation of one symbol from another. This idea of content contrasts sharply with the inferential role semantics to which he subscribed earlier in his career.
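Fodor's point that inferential (semantic) relations can be imitated by formal syntactic rules can be illustrated with a minimal sketch. The encoding and function name below are hypothetical illustrations, not anything from Fodor: an and-elimination rule derives each conjunct from a conjunction by inspecting only the shape of the symbols, never their meaning.

```python
# Hypothetical toy sketch: a "mental process" modeled as a purely
# syntactic operation. The and-elimination rule derives each conjunct
# from a conjunction by pattern-matching on the FORM of the expression
# alone; what the symbols stand for plays no role in the computation.

def and_elimination(expr):
    """Given a conjunction encoded as ('and', p, q), return its conjuncts.

    The function never consults what p or q mean; it manipulates symbol
    structure only, mirroring the semantic fact that a conjunction
    entails each of its conjuncts."""
    if isinstance(expr, tuple) and len(expr) == 3 and expr[0] == "and":
        return [expr[1], expr[2]]
    return []  # the rule does not apply to non-conjunctions

belief = ("and", "it_is_raining", "the_street_is_wet")
print(and_elimination(belief))  # ['it_is_raining', 'the_street_is_wet']
```

The derivation is driven entirely by the tuple's structure, which is the sense in which syntactic relations can "imitate" the inferential relations among contents.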
Fodor criticizes inferential role semantics (IRS) because its commitment to an extreme form of holism excludes the possibility of a true naturalization of the mental. But naturalization must include an explanation of content in atomistic and causal terms. He identifies the central problem with all the different notions of holism as the idea that the determining factor in semantic evaluation is the notion of an "epistemic bond".
Briefly, P is an epistemic bond of Q if the meaning of P is considered by someone to be relevant for the determination of the meaning of Q. Meaning holism strongly depends on this notion. The identity of the content of a mental state, under holism, can only be determined by the totality of its epistemic bonds. And this makes the realism of mental states an impossibility: if people differ in an absolutely general way in their estimations of epistemic relevance, and if we follow meaning holism and individuate intentional states by way of the totality of their epistemic bonds, the consequence will be that two people (or, for that matter, two temporal sections of the same person) will never be in the same intentional state.
Therefore, two people can never be subsumed under the same intentional generalizations. And, therefore, intentional generalization can never be successful. And, therefore again, there is no hope for an intentional psychology. For Fodor, in recent years, the problem of the naturalization of the mental has been tied to the possibility of specifying "the sufficient conditions for which a piece of the world is about (expresses, represents, is true of) another piece" in non-intentional and non-semantic terms.
If this goal is to be achieved within a representational theory of the mind, then the challenge is to devise a causal theory which can establish the interpretation of the primitive non-logical symbols of the LOT. The intuitive version of this causal theory is what Fodor calls the "Crude Causal Theory". According to this theory, the occurrences of symbols express the properties which are the causes of their occurrence.
The term "horse", for example, says of a horse that it is a horse. In order to do this, it is necessary and sufficient that certain properties of an occurrence of the symbol "horse" be in a law-like relation with certain properties which determine that something is an occurrence of horse.
There are two unavoidable problems with the idea that a symbol expresses whatever property causes its occurrences. The first is that not all horses cause occurrences of "horse". The second is that not only horses cause occurrences of "horse". Sometimes occurrences of "horse" are caused by horses, but at other times, when, for example, distance or conditions of low visibility lead one to mistake a cow for a horse, occurrences of "horse" are caused by cows.
This gives rise to what Fodor calls the "problem of disjunction". Fodor responds to this problem with what he defines as "a slightly less crude causal theory". According to this approach, it is necessary to break the symmetry at the base of the crude causal theory: Fodor must find some criterion for distinguishing the occurrences of "horse" caused by horses (true) from those caused by cows (false).
The point of departure, according to Fodor, is that while the false cases are ontologically dependent on the true cases, the reverse is not true. The first can subsist independently of the second, but the second can occur only because of the existence of the first: From the point of view of semantics, errors must be accidents: if in the extension of "horse" there are no cows, then it cannot be required for the meaning of "horse" that cows be called horses.
On the other hand, if "horse" did not mean what it means, it would never be possible for a cow to be erroneously called a "horse". Putting the two things together, it can be seen that the possibility of falsely saying "this is a horse" presupposes the existence of a semantic basis for saying it truly, but not vice versa.
If we put this in terms of the crude causal theory, the fact that cows cause one to say "horse" depends on the fact that horses cause one to say "horse"; but the fact that horses cause one to say "horse" does not depend on the fact that cows cause one to say "horse".

Their proposal was, first of all, to reject the then-dominant theories in philosophy of mind: behaviorism and the type identity theory. The type-identity theory, for its part, failed to explain the fact that radically different physical systems can find themselves in the identical mental state.
Besides being deeply anthropocentric (why should humans be the only thinking organisms in the universe?), the identity theory also implies the impossibility of referring to common mental states in different physical systems, not only between different species but also between organisms of the same species.
An illustration of multiple realizability: M stands for mental and P stands for physical. The diagram shows that more than one P can instantiate one M, but not vice versa; causal relations between states are represented by the arrows (M1 goes to M2, etc.). One can solve these problems, according to Fodor, with functionalism, a hypothesis which was designed to overcome the failings of both dualism and reductionism.
What is important is the function of a mental state regardless of the physical substrate which implements it. The foundation for this view lies in the principle of the multiple realizability of the mental. Under this view, for example, I and a computer can both instantiate ("realize") the same functional state though we are made of completely different material stuff (see the diagram above).
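The principle of multiple realizability can be loosely analogized in code. This is a hypothetical sketch (the class and function names are invented for illustration, not Fodor's own example): one functional role, many physical "implementations", with the state individuated by what it does rather than by the substrate that does it.

```python
# Loose illustrative analogy: multiple realizability as one functional
# role with many physical "implementations". The state is individuated
# by its causal/functional role, not by its physical realizer.

class NeuronSystem:
    """A state realized in carbon-based wetware."""
    def respond_to_damage(self):
        return "withdraw and report pain"

class SiliconController:
    """The same functional state realized in silicon circuitry."""
    def respond_to_damage(self):
        return "withdraw and report pain"

def same_functional_state(a, b):
    # Functionalism: two systems are in the same mental state if that
    # state plays the same functional role in each, whatever they are
    # physically made of.
    return a.respond_to_damage() == b.respond_to_damage()

print(same_functional_state(NeuronSystem(), SiliconController()))  # True
```

The two classes share no code or substrate, only a role; that is the sense in which a person and a computer could, on the functionalist view, instantiate the same state.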
On this basis functionalism can be classified as a form of token materialism. For example, the language of thought hypothesis has been accused of either falling prey to an infinite regress or of being superfluous. Specifically, Simon Blackburn suggested in an article that since Fodor explains the learning of natural languages as a process of formation and confirmation of hypotheses in the LOT, this leaves him open to the question of why the LOT itself should not be considered as just such a language, requiring yet another, more fundamental representational substrate in which hypotheses can be formed and confirmed so that the LOT itself can be learned.
On the other hand, if such a representational substrate is not required for the LOT, then why should it be required for the learning of natural languages? In this case, the LOT would be superfluous. Dennett suggested that it would seem, on the basis of the evidence of our behavior toward computers but also with regard to some of our own unconscious behavior, that explicit representation is not necessary for the explanation of propositional attitudes.
During a game of chess with a computer program, we often attribute such attitudes to the computer, saying such things as "It thinks that the queen should be moved to the left." The same is obviously true, suggests Dennett, of many of our everyday automatic behaviors, such as "desiring to breathe clear air" in a stuffy environment.

Kent Bach, for example, takes Fodor to task for his criticisms of lexical semantics and polysemy. Fodor claims that there is no lexical structure to such verbs as "keep", "get", "make" and "put".
Causal Theories of Mental Content
Reading his book requires a fair amount of technical knowledge in the philosophy of mind and language. This book is very important for anyone who is interested in the project of naturalizing intentionality. In this chapter, Fodor motivates the crude causal theory of mental content, which roughly states that all instantiations of a property cause one to token the corresponding concept; so an instantiation of red, for example, causes one to token the concept RED.
In us, it might be that we interpret our experience as picking out the one rather than all the alternatives because our language allows us to carve things up in this fine-grained way and then implicitly theorize about what is indicated. When we show movies of rabbit parts coming together and pulling apart, we see that two such nodes are activated when apart and one node when together, and this tracks the psychophysics. I skimmed the paper quickly and was quite pleased to discover a blob named after me! If we are talking about the function of some neuronal region, then the neuronal details will tend to matter more. My work, mentioned in addition to psychosemantics for 1 and 2, develops a detailed alternative. Was Psychosemantics a failure? I do not think so.
Review of J. Fodor's Psychosemantics. In Word and Object, Quine acknowledged the "practical indispensability" in daily life of the intentional idioms of belief and desire but disparaged such talk as an "essentially dramatic idiom" rather than something from which real science could be made in any straightforward way. Many who agree on little else have agreed with Quine about this, and have gone on to suggest one or another indirect way for science to accommodate folk psychology: Sellars, Davidson, Putnam, Rorty, Stich, the Churchlands, Schiffer, and myself, to name a few. This fainthearted consensus is all wrong, according to Fodor, whose new book is a vigorous, even frantic, defense of what he calls Intentional Realism: beliefs and desires are real, causally involved, determinately contentful states.