William Lycan's response as a functionalist is one of the most interesting replies to Searle's paper; it is also one of the emptiest. Lycan's functionalist reaction closely resembles the systems reply: both claim that while the individual person locked in the room does not understand the story, the system as a whole does. Lycan essentially writes a logically structured response to Searle's paper built on empty arguments, and he fails to back up his claim that a system other than a human brain is capable of understanding.
Both Searle and Lycan agree that an individual object within a system cannot be considered to be thinking. In other words, both believe that in the example of the Chinese room, the man does not understand the language by himself. It is obvious to Lycan that an object, as part of a system, cannot understand or think on its own; he argues that it must be part of a greater system, and that it is this whole system that understands the Chinese. Lycan criticizes Searle for looking too much at the individual parts of a system and not at the system as a whole. He even pokes fun at Searle when he says, "Neither my stomach nor Searle's liver nor a thermostat nor a light switch has beliefs and desires." The man who responds in Chinese using the "data banks" of Chinese symbols is, according to Lycan, understanding as part of a system. Although the man as an individual is unable to "understand" Chinese, the system of which he is a part can.
It is easy for Searle to respond to this claim, since there is no evidence he needs to refute. He even says that he is "embarrassed" to respond to the idea that a whole system other than the human brain is capable of understanding. He asks the key question that Lycan never answers: "Where is the understanding in this system?" Although Lycan tries to refute Searle's views, his arguments are not backed with proof. Lycan responds that Searle is looking only at the "fine details" of the system and not at the system as a whole. While it is possible that Searle is not looking at the system as a whole, this still does not explain or demonstrate in any way where the thinking in the system is located.
string. We ended up going to the airport, where my mom sent me to India.
The introduction to the article was interesting: “What has billions of individual pieces, trillions of connections, weighs about 1.4 kilograms, and works on electrochemical energy? If you guessed a minicomputer, you’re wrong. If you guessed the human brain, you’re correct!” I did not know the brain had quite this many connections. After reading our chapter I really started to grasp the complexity of the human brain and the amount of energy it expends. I felt that the article lacked facts like these further in; very few empirical figures are offered by the author, Eric Chudler.
My second argument against Clark’s claims applies to premise two: “the brain, like a computer, uses symbols to make calculations and perform functions.” Before I state what I find wrong with this claim, I should explain the example Clark uses to support this premise, which is drawn from the work of Jerry Fodor:
Doublethink has successfully taken away the people’s human nature, because part of being human is being able to try to understand, converse about, and explain one’s feelings. Therefore, the all-controlling government has created doublethink to take away this vital cognitive process and has implemented a quick and useless explanation in order to falsely explain life’s complexities. In Winston’s society, the adaptation of language also contributes to the animalization of the people. When in the cafeteria, Winston watches a telescreen of the mindless, robotic soldiers that Ingsoc has created. He explains that “The stuff that was coming out of him consisted of words, but it was not speech in a true sense: it was noise uttered in unconsciousness, like the quacking of a duck.” The absolute power of Big Brother has transformed these soldiers from mindful humans into mindless robots, and although they speak the same language, whenever they speak it has no effect on those around them, because meaningful language has been taken away from them. Syme later explains that there is in fact a word for this meaningless conversing: “duckspeak,” he called it. The creation of this word has also created the idea
Searle's argument delineates what he believes to be the invalidity of the computational paradigm's and artificial intelligence's (AI) view of the human mind. He first distinguishes between strong and weak AI. Searle finds weak AI perfectly acceptable as an investigation, in that it uses the computer as a powerful tool for studying the mind; it does not make any claims about the operation of the mind, but serves as another psychological, investigative mechanism. In contrast, strong AI states that a computer can be created such that it actually is a mind. We must first describe what exactly this entails. In order to be a mind, the computer must be able not only to understand, but to have cognitive states. Also, the programs by which the computer operates are the focus of the computational paradigm, and these programs are taken to explain the mental states. Searle's argument is against the claims of Schank and other computationalists, whose programs include SHRDLU and ELIZA, that their computer programs can (1) be ascribe...
Through newspaper articles we can take a glimpse at recorded history from the 18th century and see that rape was a crime committed by many criminals and dealt with harshly. Most victims of rape were young women who were “robbed of that which constitutes the fairest part of the female sex- her chastity and peace of mind” (Newgate Calendar, Paragraph 3). John Lennard created a reputation for himself as a man found guilty of raping a young woman by the name of Miss Ann Boss on the 15th of June, 1773. Not long after he committed the crime, Lennard’s name appeared in numerous newspapers reporting on the crime he was accused of. The newspapers followed him through his trial until after his execution on August 11, 1773. The newspapers used specific words and phrases that made Lennard appear to the public either as a dangerous criminal who had committed a particularly dangerous crime, or as one grouped with other criminals who may have committed less dangerous or harmful crimes. These newspapers also had a way of appealing to the reader’s emotions in an attempt to teach the reader a valuable lesson from Lennard’s life of crime and execution. Through the newspapers’ specific word choices and their appeals to our emotions and ideas around life lessons, we can analyse how John Lennard is characterized by the public and depicted in the press.
However, the human brain is not that simple, which makes it all the more sensitive and vulnerable to outside forces...
I will begin by providing a brief overview of the thought experiment and how Searle derives his argument. Imagine there is someone in a room, say Searle himself, and he has a rulebook that explains what to write when he sees certain Chinese symbols. On the other side of the room is a Chinese speaker who writes Searle a note. After Searle receives the message, he must respond; he uses the rulebook to write a perfectly coherent reply back to the actual Chinese speaker. From an objective perspective, you would not say that Searle is actually able to write Chinese fluently: he does not understand Chinese, he only knows how to manipulate symbols. Searle argues that this is exactly what happens when a computer responds to the note in Chinese. He claims that computers are only able to compute information without actually being able to understand the information they are computing. This undermines the first premise of strong AI. It also undermines the second premise of strong AI, because even if a computer were capable of understanding the conversation it is having in Chinese, this would not explain how that understanding occurs.
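To make the purely syntactic character of the rulebook concrete, the following short sketch, whose entries and names I have invented for illustration (they are not from Searle's paper), mimics what the man in the room does: it maps the shape of an incoming note to the shape of a reply without representing what either means.

```python
# The "rulebook": it pairs shapes of incoming symbols with shapes of outgoing
# symbols, with no reference to what any symbol means. Entries are invented.
RULEBOOK = {
    "你好吗？": "我很好。",          # hypothetical entry
    "你叫什么名字？": "我叫李明。",  # hypothetical entry
}

def room_occupant(note: str) -> str:
    """Produce a reply by matching the note's shape against the rulebook.

    Nothing here represents the meaning of the symbols; the function only
    compares and copies shapes, which is all Searle grants the computer.
    """
    return RULEBOOK.get(note, "请再说一遍。")  # fallback reply, equally meaningless to the occupant

print(room_occupant("你好吗？"))  # fluent-looking output, no understanding anywhere
```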
Computers are machines that take in only syntactic information and then operate according to a program that is itself made of syntactic information. They cannot change what that program does unless they are formally instructed to do so through more information. This is inherently different from a human mind, in that a computer never takes semantic information into account in its programming. Searle’s formal argument thus amounts to the following: brains cause minds; semantics cannot be derived from syntax alone; computers are defined by a formal, that is, syntactic, structure; and minds have semantic content. The argument then concludes that the way the mind functions in the brain cannot be likened to running a program in a computer, and that programs by themselves are insufficient to give a system thought (Searle, p. 682). In conclusion, a computer cannot think, and the view of strong AI is false. Further support for this argument comes from Searle’s Chinese Room thought experiment. In the Chinese Room, I, who do not know Chinese, am locked in a room that has several baskets filled with Chinese symbols. Also in the room is a rulebook that specifies various manipulations of the symbols based purely on their syntax, not their semantics. For example, a rule might say to move the squiggly
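The premise that semantics cannot be derived from syntax alone can be illustrated with a small sketch of my own; the tokens and "readings" below are invented stand-ins, not anything from Searle. The point is that one and the same formal rule is compatible with entirely different interpretations, so the rule by itself fixes no meaning.

```python
def formal_rule(token: str) -> str:
    """A purely formal rule: swap one uninterpreted token for another."""
    table = {"squiggle": "squoggle", "squoggle": "squiggle"}
    return table.get(token, token)

# Two rival readings of the very same tokens (both invented for illustration).
reading_a = {"squiggle": "hello", "squoggle": "goodbye"}
reading_b = {"squiggle": "add one", "squoggle": "subtract one"}

token = "squiggle"
output = formal_rule(token)

# The program behaves identically under both readings, so nothing in its
# formal structure settles what, if anything, the tokens mean.
print(reading_a[token], "->", reading_a[output])
print(reading_b[token], "->", reading_b[output])
```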
At the end of chapter two, Searle summarizes his criticism of functionalism in the following way. The mental processes of a mind are caused entirely by processes occurring inside the brain; there is no external cause that determines what a mental process will be. There is also a distinction between the identification of symbols and the understanding of what the symbols mean. Computer programs are defined by symbol identification rather than understanding, whereas minds define mental processes by the understanding of what a symbol means. The conclusion that follows is that computer programs by themselves are not minds and do not have minds, and that a mind cannot be the result of merely running a computer program. Therefore, minds and computer programs are not entities with the same mental states. They are quite different, and although both are capable of input and output interactions, only the mind is capable of truly thinking and understanding. This quality is what distinguishes the mental state of a mind from the systemic state of a digital computer.
I will first talk about how Searle was led to question the claim that computers could actually think and could be considered intelligent in the strong sense, a claim based on the assumptions made by Alan Turing. Turing developed a test called the "Turing test," or, in other words, the "Imitation Game." In this test, a person (the interrogator) asks two subjects (a human and a computer) a series of questions that aid the interrogator in determining which of the subjects is actually the human (A.M. Turing, 1950). The assumptions based on the test include: if something has the ability to have thought, then it is considered a thinker; and not only humans have the capability of having a mind, but other things, including objects, could also have a mind, which would make them thinking things. These assumptions made Searle question how they could be accurate, so, in order to argue that they are not valid, he created his thought experiment, the "Chinese Room Experiment." With this experiment, Searle was able to provide arguments against the claim proposed by the "Turing test," which I will discuss
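To show the structure of the test itself, here is a minimal sketch under my own simplifying assumptions; the respondents and their canned replies are placeholders of my invention, not Turing's. The interrogator sees only labelled answers and must decide which respondent is the human.

```python
import random

def human(question: str) -> str:
    # Placeholder human respondent.
    return "Let me think about that for a moment."

def machine(question: str) -> str:
    # The machine's aim in the game is to answer as a human would.
    return "Let me think about that for a moment."

def imitation_game(question: str) -> str:
    """One round: the interrogator sees only labelled answers, never the subjects."""
    respondents = {"A": human, "B": machine}
    answers = {label: respond(question) for label, respond in respondents.items()}
    # With indistinguishable answers, any decision rule collapses into a guess;
    # reliably failing to tell the two apart is what passing the test amounts to.
    guess = random.choice(sorted(answers))
    return f"Interrogator guesses that respondent {guess} is the human."

print(imitation_game("Do you play chess?"))
```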
Furthermore, it does not give a clear explanation of how the mind works; instead, it only argues that the mind is a non-physical thing, since the laws of physics cannot break it down into particles to determine how it works. Mind and body both exist, but they are both physical; in fact, it has been shown that the brain is responsible for human behavior. For example, the story of Phineas Gage tells us about the mind-body relationship: Gage was known to be a very friendly and smart person before his head injury, but after he suffered a head injury that affected his brain, he turned into a mean person, the complete opposite of the person his friends had known before (Lawhead 83). This shows that the brain is directly responsible for the mind and the behaviors of a
of our cognitive structure. And it is mentioned in Martinez et al. (2012) that people whom
Functionalism is a materialist stance in the philosophy of mind which argues that mental states are purely functional, and thus categorized by their input and output associations and causes rather than by the physical makeup that constitutes their parts. In this manner, functionalism argues that as long as something operates as a conscious entity, then it is conscious. Block describes functionalism, discusses its inherent dilemmas, and then discusses a more scientifically driven counterproposal called psychofunctionalism and its failings as well. Although Block’s assertions are cogent and well presented, the psychofunctionalist is able to provide counterarguments to support his viewpoint against Block’s criticisms. I shall argue that though both concepts are not without issue, functionalism offers a more acceptable description that philosophers can accept over psychofunctionalism’s chauvinistic disposition, which attempts to limit consciousness to the human race alone.
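To make the input-and-output characterization concrete, the following toy sketch (the state table is my own invention, not Block's example) treats a mental state as nothing more than a position in a table of stimuli, responses, and transitions; on the functionalist view, any system realizing the same table would share the same states.

```python
from dataclasses import dataclass

@dataclass
class FunctionalSystem:
    """A state is defined only by its causal role, not by what realizes it."""
    state: str = "neutral"

    def step(self, stimulus: str) -> str:
        # Transition and respond according to role-defining rules alone.
        if stimulus == "tissue damage":
            self.state = "pain"          # "pain" just names whatever plays this role
            return "wince and withdraw"
        if self.state == "pain" and stimulus == "relief":
            self.state = "neutral"
            return "relax"
        return "carry on"

# Anything implementing this same table, whether neurons, silicon, or a vast
# population of people passing messages, would count as being in the same
# states; that liberality is precisely what Block's dilemma targets.
s = FunctionalSystem()
print(s.step("tissue damage"))  # wince and withdraw
print(s.step("relief"))         # relax
```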
...lligent, intentional activity taking place inside the room and the digital computer. Searle's proponents, however, would counter that an entity that merely does computation, whether a human being or a computer, cannot understand the meanings of the symbols it uses. They maintain that digital computers do not understand the input given in or the output given out. But it cannot be claimed that digital computers as a whole cannot understand. Someone who only inputs data, being only a part of the system, cannot know about the system as a whole. If there is a person inside the Chinese room manipulating the symbols, that person is already intentional and has mental states; thus, due to the seamless integration of their hardware and software systems, which understand the inputs and outputs as whole systems, digital computers too have states of mind.