Originally posted by PalynkaYou represent yourself, only. I have never met a rude Portuguese person before. It actually surprised me, as Portuguese people have a high degree of goodwill in my eyes. You represent yourself, and I'm sure you would be very kind if I met you in person. Your behaviour has nothing to do with your nationality, only with you yourself. I think you're (self-)aware of this. 😉
Perhaps you'd like to blame my rudeness on my nationality?
Originally posted by FabianFnasSo why did you say my culture taught me to be a misogynist?
You represent yourself, only. I have never met a rude Portuguese person before. It actually surprised me, as Portuguese people have a high degree of goodwill in my eyes. You represent yourself, and I'm sure you would be very kind if I met you in person. Your behaviour has nothing to do with your nationality, only with you yourself. I think you're (self-)aware of this. 😉
Originally posted by FabianFnasIn secondary school our Religious Education text book made that ridiculous claim. I lost all respect for the writers of that text book.
Some say that animals are driven by instincts and Pavlovian behaviour, but I'm not so sure of that.
Anyone who has had a pet knows that each animal is unique, with thoughts, feelings and moods, and is most definitely no more an 'instinct / Pavlovian conditioned-response machine' than people are.
I think the flaw in the Chinese room example in the first post is that when Searle opened the black box and did not recognise the mechanism as an entity that 'understands' Chinese, he dismissed it. In my opinion he dismissed it too quickly.
Originally posted by ivanhoeI don't believe a human can do that. You can tell when someone understands a language.
..... or "Do computers have minds" ?
Chinese room thought experiment:
Searle requests that his reader imagine that, many years from now, people have constructed a computer that behaves as if it understands Chinese. It takes Chinese characters as input and, using a computer program, produces other Chinese characters, which it presents as output. Supp ...[text shortened]...
What do you think ? Do computers have minds ?
Originally posted by AThousandYoungExactly! And what tips you off? The speed and accuracy of their responses, proper grammar and pronunciation, continuity of conversational context (i.e. they don't reply quickly with a perfectly accurate, grammatically correct and expertly pronounced non-sequitur like "rain is wet" when asked about the weather), concision, proper word choice, etc...
I don't believe a human can do that. You can tell when someone understands a language.
So if a Chinese room were to be constructed that could converse in Chinese in a manner that could fool a native speaker, what would be the difference between that and someone who speaks Chinese? What part of "understanding" isn't captured when the room demonstrates exactly those aspects of language that we consider essential to an understanding of it?
I think twhitehead said it most eloquently and succinctly, the Chinese room system understands Chinese, even if the individual components do not.
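To make the "system understands, components don't" point concrete, here's a toy sketch of the room's rulebook as a lookup table. The symbols and rules are invented for illustration; Searle's thought experiment doesn't specify any particular mechanism, only that the operator blindly follows rules.

```python
# Toy sketch of the Chinese room: the "rulebook" is just a table mapping
# input symbols to output symbols. No single component (operator, table,
# paper) knows what the symbols mean, yet the system as a whole replies
# sensibly. Symbols and rules here are invented for illustration.

RULEBOOK = {
    "ni hao": "ni hao!",              # greeting -> greeting
    "tianqi zenmeyang": "xia yu le",  # "how's the weather?" -> "it's raining"
}

def chinese_room(input_symbols: str) -> str:
    """The operator blindly matches the input against the rulebook."""
    return RULEBOOK.get(input_symbols, "ting bu dong")  # "I don't understand"

print(chinese_room("ni hao"))  # -> ni hao!
```

Of course, a real conversational system would need far more than a flat table, but the point stands: the "understanding" (such as it is) lives in the rules plus the process, not in any one part.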
It seems to me you're all mixing up the concepts a little here. On one
hand you're talking about self-awareness. On the other hand you're
talking about understanding. And then you mix in a little linguistic skills.
These are all three distinctly different parts of an intellect.
Self-awareness can be found in many different species of animals. It can
be observed simply by the ability for such an animal to "relate" to the
surrounding environment. Were it not self-aware, it wouldn't be able to
learn from its mistakes and it wouldn't be able to make the distinction
between itself and others, which is crucial in order to survive in an ever
changing environment (possibly those changes are also the cause of
self-awareness).
Understanding is a much harder concept to explain. To "understand"
something is a more abstract concept, in my opinion. I understand that I
can't just run in front of a car simply because I can envision what the
outcome would most likely be and it's not desirable to me. I can
understand things as long as I can reasonably calculate their essence. If
I can't see why something is the way it is, I don't understand it.
Self-awareness is not really required to understand anything. Only the
facts and methods required to derive the correct answers are needed in
order to understand something. Which leads us to linguistic skills. You
can understand a language in so far that you can derive the meaning of
a sentence, paragraph and so on. You may be able to calculate what
seems a reasonable response, considering your level of knowledge on
the subject, but if it's incomprehensible to your reader, does that mean
you have no self-awareness, no understanding or no linguistic skill? Of
course not.
To test a computerised system for understanding, we use logs. Logs
detail not only what conclusions a system can derive, but exactly how it
came to those conclusions. From that we can tell if a system has an
"understanding" of the conversation being held, which would also answer
the question of whether or not the system truly "understands" the
language being used. A human language is far more complex than a
computer language, but the basic principles are the same. Therefore, in
theory, it should be possible to write programs that fully "understand" a
language, and to test them appropriately for that understanding.
Self-awareness can be tested in much the same way. Unlike the case
with two-year olds and other animals, we have detailed logs (if we wish)
over the entire process from sensory input to resulting output.
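The logging idea above can be sketched in a few lines: a system that returns not just its conclusion but a trace of every step it took to reach it, so an observer can inspect *how* it "understood". The facts and rules here are invented for illustration.

```python
# Toy sketch of testing "understanding" via logs: derive() returns both a
# conclusion and a trace of the reasoning steps that produced it.
# The facts and rules are invented for illustration.

def derive(entity, facts, rules):
    """Derive a conclusion about `entity`, recording each reasoning step."""
    trace = []
    category = facts.get(entity)
    trace.append(f"lookup: {entity} is a {category}")
    conclusion = rules.get(category)
    if conclusion:
        trace.append(f"rule applied: every {category} is {conclusion}")
    return conclusion, trace

facts = {"socrates": "man"}
rules = {"man": "mortal"}
answer, log = derive("socrates", facts, rules)
print(answer)  # -> mortal
print(log)     # the full derivation, step by step
```

Reading the trace tells you whether the right answer came from a genuine chain of inference or from a lucky lookup, which is exactly the distinction the post is after.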
Originally posted by JigtieSelf-awareness, or more precisely "consciousness", is a little more involved than simply relating to the environment and learning from mistakes. Any feedback system attached to a sensor can do that. Consciousness involves formulating the concept of "I", which is evident in many animals from watching their behaviour as you said, but is more complicated than simply adapting foraging habits or mating.
It seems to me you're all mixing up the concepts a little here. On one
hand you're talking about self-awareness. On the other hand you're
talking about understanding. And then you mix in a little linguistic skills.
These are all three distinctly different parts of an intellect.
Self-awareness can be found in many different species of animals. It can ...[text shortened]... f we wish)
over the entire process from sensory input to resulting output.
I don't think anyone ever said that a lack of linguistic skill means a lack of consciousness. In fact, the Chinese room experiment is predicated on the fact that the room appears to have impeccable linguistic skill, but the experiment has been interpreted as John Searle asserting that despite this skill the room can't have any understanding of the language it's producing. Consciousness is beside the point (although not to John Searle, as he believes anything that can't "think" can't "understand", where "thinking" involves consciousness). But I think we both agree that understanding involves symbol manipulation that produces the correct result (in a statistically significant way, of course).
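The "any feedback system attached to a sensor can do that" point is easy to illustrate: a bare thermostat loop relates to its environment and corrects its errors, yet plainly has no concept of "I". The model below is a deliberately trivial sketch, not a claim about how real thermostats work.

```python
# Toy feedback loop: reacts to sensor readings and corrects its "mistakes"
# (deviations from the target), with no self-model of any kind.

def thermostat(readings, target=20.0):
    """Return a heater action for each temperature reading."""
    actions = []
    for temp in readings:
        error = target - temp          # how far off are we?
        actions.append("heat on" if error > 0 else "heat off")
    return actions

print(thermostat([18.0, 22.0]))  # -> ['heat on', 'heat off']
```

If adapting to the environment were all self-awareness required, this loop would qualify, which is why the concept of "I" is doing the real work in the definition.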
Originally posted by PBE6What is a sense of self if not the recognition that my body is a different
Self-awareness, or more precisely "consciousness", is a little more involved than simply relating to the environment and learning from mistakes. Any feedback system attached to a sensor can do that. Consciousness involves formulating the concept of "I", which is evident in many animals from watching their behaviour as you said, but is more complicated than si ...[text shortened]... that produces the correct result (in a statistically significant way, of course).
entity than everything around me? And in recognising that fact, I am
also acknowledging the concept of "I", if not consciously. I don't think
you can say that self-awareness is anything more than that. A higher
form of intellect (a consciousness*) could start asking questions about its
own self in relation to the surrounding, but the self-awareness bit is
pretty much the same: the ability to recognise that you are one part of a
greater whole. (As an example, most of us are taught to ask these
questions that lead to our conscious awareness of "I", as opposed to
being a subroutine running in the back of our mind and "causing"
individuality. But we were self-aware even before we realised what it
means to be self-aware. We must have been, or we wouldn't have had
any sense of individuality to begin with. And all intelligent life, even
two-year-old babies and snails, has a sense of individuality. They display
it by choosing their own directions of movement and by choosing
different activities in which to immerse themselves, as opposed to just
following the tide.)
While I agree that understanding doesn't require a sense of self (or that
a sense of self requires any understanding of what it actually means), I'm
not sure that impeccable language skill can come without understanding
the rules of the language and the subject being discussed, hence it does
require "thinking" in the sense that data will have to be analysed and a
response calculated. But to test for understanding, we merely need to
read the logs detailing how the system calculated the proper responses.
* Actually, I just realised that consciousness is the wrong word, as it
means the exact same thing as self-awareness. I don't know the English
word, but I'm talking about an awareness not just of your own self in
relation to the surrounding, but also an understanding of what that
awareness is (an understanding of the abstract concept). It may seem
like a subtle difference, but I think it's a very important one to make.
Originally posted by JigtieSelf-awareness is usually credited as being aware that one exists as a separate individual, while self-consciousness (note the importance of the word self) relates to the development of an identity or ego. The two are related, but not the same.
What is a sense of self if not the recognition that my body is a different
entity than everything around me? And in recognising that fact, I am
also acknowledging the concept of "I", if not consciously. I don't think
you can say that self-awareness is anything more than that. A higher
form of intellect (a consciousness*) could start asking question abo ...[text shortened]... y seem
like a subtle difference, but I think it's a very important one to make.