31 July 2008
30 July 2008
I am currently enrolled in a Philosophy of Language class. It’s fascinating stuff, really, and my professor is ridiculously enthused about it all, so I generally enjoy the course. But this past Friday, we had a debate that left me feeling furious! It put me in such a bad mood that I had trouble sleeping throughout the weekend and ditched class on Monday (it’s a MWF-type class, by the way), and it was with great reluctance that I dragged myself to class today because I was still so angry about the whole thing.
What was this debate, you ask? It was whether meaning is in what someone literally says or in their intent. My professor said we weren’t allowed to be middling on the issue: we had to pick a side and argue for it. The class split perfectly in half, so it was 3 on 3 (and people told me horror stories about the huge class sizes at BYU! Ha!). I wasn’t convinced that all the way on one side or the other was very wise, so I considered the extremes: on one side, we have absolutely no wiggle room because every utterance is taken absolutely literally; on the other—wow, I shudder to think. It seemed to me that, if meaning is entirely in the intention, then structured language is totally unnecessary because people ought to just automatically know what I mean by this gesture or this facial expression and, if they misunderstand, that’s their problem. This latter prospect frightened me to death, so I argued in favor of absolute literalism and analyticity.
The class ended before the argument was really able to reach any sort of denouement, but I was happy to escape because the other side was driving me crazy with all these fluffy-bunny arguments that couldn’t be argued against because they were so insubstantial, and then we’d say something smart that they’d turn on its head because, I dunno, because nonsense always comes out on top.
So I was furious, as I said. Fury isn’t really something I’m naturally inclined toward, but I was positively livid about the whole ordeal because it seemed to me that linguistic anarchy had won against formal structure via stubborn belligerence, and I couldn’t take it.
But today I manned up to facing the class again, and I’m very glad I did because today I came to my own theory of meaning. It was mostly during the walk home from class that I refined it, but today’s discussions got my juices flowing in happy rivers again, and I’m sure that precipitated the brilliance that I am about to share with you.
The Schmetterlingian Theory of Meaning
I have decided that my notion of language is words organized into sentence structures that are governed by set rules. Gestures, facial expressions, tone of voice &c. are not parts of language (except, I suppose, in, say, ASL, which is strictly gestures, or, like, Chinese, in which tone is a part of pronunciation). But if I say, “That’s terrific,” the sentence “That’s terrific” has a literal meaning, and that meaning is not negotiable.
However, I, the speaker, may intend to convey that the antecedent of “that” is not, in fact, terrific—the exact negation of the meaning’s truth value—and perhaps I successfully convey this intention to my audience via a deliberate combination of vocal qualities, hand gestures, and facial expression. These are certainly elements of communication and augment the conveyance of my original intention, but they have no linguistic value and do not change the actual meaning of the sentence, though they may (and, I hope, will) help you to understand my intent in uttering them.
“Oh,” you might think, “so, while he said, ‘That’s terrific,’ he meant, ‘That’s not terrific.’”
NO!!! You aren’t paying attention! I did not mean anything: sentences mean; people intend! So, while I said, “That’s terrific,” I intended to convey my opinion that “that” is not, in fact, terrific.
I was tossing this idea around with my philosophy-major roommate, and I realized that I don’t really know what function words perform. Certainly, words can mean, but I think that meaning in words is more of an exception than a rule. The only evidence I can offer for this is that, if you translate a sentence from one language to another, you have to take the sentence as a whole: if you translate the individual words one at a time, you’re never going to get a very good translation. My roommate suggested that I can simply define words in terms of sentences, viz. “Words are the building blocks that we put together to create sentences.” This, to me, is, for now, at least, a satisfactory explanation, but I hope to come up with something better soon because that definition really makes me wonder how dictionaries can exist.
15 July 2008
Correct answer: too long ago.
So today's topic is art. I know few topics that are so innately volatile: I mean, just trying to nail down a solid definition of art can cause offense to some people. One of the problems we run into immediately is the difference between Art Perceived and Art Intended. I submit that Art Perceived is a null phrase. Art, to me, has a lot to do with creative process. If I discard a creation of mine as worthless, and some other guy finds it and thinks it's fantastic and beautiful and (dare I say it?) artistic, I don't think that makes it art. I'm not sure we have (in English, at least) a good name for that, but I think calling it "art" would be erroneous.
I got in a pretty exciting debate with a girl about this a few weeks back. She's a music major and a big fan of such things as John Cage's 4'33", which is hard for me to have a solid opinion on. I ran this girl in circles until, in a huff, she told me that anything a person wants to accept as art is art, which struck me more as a surrender than a definition--but whatever.
My problem with something like 4'33" is that I think an artist ought to manipulate in some way what it is the audience experiences. Perhaps John Cage intended his audience to simply appreciate the ambient sounds of their current environment; whether that be the buzzing of lights or the breeze through the bushes, certainly the audience could find something to listen to. This seems to me less of an artistic accomplishment and more of an exercise in meditative stillness--which I'm all for but cannot consider any sort of music. The argument is, of course, that silence plays a significant role in most any piece of music, so why not make it the dominant (or, in this case, sole) factor? I say because music is the organization (or, if we wanna be really liberal, the manipulation) of sounds to some end. A musician telling his audience to appreciate whatever ambient noises may be present is essentially the same as a photographer telling his fans to look out their bedroom window--certainly not a bad idea but not really any semblance of art.
I am not opposed to the concept that anything can be art, though. Perhaps if I constructed, or encouraged the construction of, a room in which absolute silence existed and then had people come in to experience the audible void, then we would be approaching art, I think, but more architectural than musical, though perhaps some amalgamation of the two.
Well, I feel I've rambled enough. Here is the definition of art I have created--the most satisfactory one I can think of--for you to support or refute; I thought it up myself so feel free to support it heartily or oppose it brutally--the latter being so much more the fun:
Art is the careful and honest expression of a sentiment.
14 July 2008
First, 2 Nephi 26:24 - "He doeth not anything save it be for the benefit of the world; for he loveth the world, even that he layeth down his own life that he may draw all men unto him. Wherefore, he commandeth none that they shall not partake of his salvation."
Last, Brother Joseph: "God will not command any thing, but what is peculiarly adapted in itself, to ameliorate the condition of every man under whatever circumstances it may find him, it matters not what kingdom or country he may be in."
So when God tells you to do something that you don't wanna do, just suck it up and remember it's just like your mother always told you: IT'S GOOD FOR YOU!!!
Big bowl of sauerkraut every single morning, if that's what it takes....
07 July 2008
Guilt is not something I'm generally apt to feel; I just don't really get worked up over stuff, and I sometimes have trouble understanding why people are so edgy. Someone offends me, I get over it; if it's really bad, I may have to sleep it off, but I usually can do that. I offend somebody else, if I'm aware that I've done it, I usually just don't really care.
Well no more; no more, I say! I'm going to start taking responsibility for my actions. I learned last night, by way of extreme social faux pas, that human relationships are fairly delicate things, that even the strongest of ties (not that I've had many like that, so I guess this is merely supposition) can be broken by a single thoughtless act. And for the first time in my life, perhaps, I feel bad about something I've done to someone else. And now that the guilt has started, I find myself seeking to purge myself of all previous offenses, walking around, apologizing to every person I've ever known, regardless of whether I can think of a good reason for doing so. It feels good, though most folks think I'm crazy, and I think it's long overdue. I suppose that if I were to become that guy who does nothing but apologize all the time, I'd become tedious and nobody would really like me any more, so I don't intend to take it to that extreme. But I am calling off my total disregard toward other people and their opinions, and I'm going to strive to live my life a little less recklessly.
Perhaps this may seem to some the sort of change that cannot be made all at once, but, given the way I feel, that's exactly what I intend it to be.
04 July 2008
by a rather angry butterfly
Saul Kripke, moron that he is—and, yes, that is ad hominem, but it also happens to be my thesis, so suck it!—sought (and, as I will show, failed) to disprove the following theory of naming (which I happen to agree with except for one minute detail, which I will point out and which will not, I don’t think, really interfere with what I’m trying to say):
(1) To every name or designating expression “X,” there corresponds a cluster of properties, namely the family of those properties φ such that A believes “φX.”
(2) One of the properties, or some conjointly, are believed by A to pick out some individual uniquely.
(3) If most, or a weighted most, of the φ’s are satisfied by one unique object y, then y is the referent of “X.”
(4) If the vote yields no unique object, “X” does not refer.
(5) The statement, “If X exists, then X has most of the φ’s” is known a priori by the speaker.
(6) The statement, “If X exists, then X has most of the φ’s” expresses a necessary truth (in the idiolect of the speaker).
(C) For any successful theory, the account must not be circular. The properties which are used in the vote must not themselves involve the notion of reference in such a way that it is ultimately impossible to eliminate.
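Incidentally, theses (2) through (4) describe what is essentially a weighted vote, and for the programmatically inclined it can be sketched in a few lines. Everything below (the candidates, the properties, the weights) is invented by me purely to illustrate the mechanism; it is a toy of my own, not anything Kripke or the cluster theorists actually wrote:

```python
# A toy model of the "vote" in theses (2)-(4): a name refers to
# whichever candidate uniquely satisfies most (or a weighted most)
# of the properties the speaker associates with the name; if the
# vote yields no unique winner, the name does not refer (thesis (4)).

def referent(properties, candidates):
    """properties: list of (predicate, weight) pairs.
    candidates: objects to vote over.
    Returns the unique top scorer, or None if the vote fails."""
    scores = [(obj, sum(w for pred, w in properties if pred(obj)))
              for obj in candidates]
    best = max((s for _, s in scores), default=0)
    winners = [obj for obj, s in scores if s == best and s > 0]
    return winners[0] if len(winners) == 1 else None

# Made-up miniature world: two candidates for the name "Aristotle".
people = [
    {"name": "Aristotle", "teacher": "Plato", "pupil": "Alexander"},
    {"name": "Xenocrates", "teacher": "Plato", "pupil": None},
]

# The speaker's cluster of properties for "Aristotle", with weights.
cluster = [
    (lambda p: p["teacher"] == "Plato", 1),
    (lambda p: p["pupil"] == "Alexander", 2),
]

# Both candidates studied under Plato, but only one taught Alexander,
# so the weighted vote picks out the first candidate uniquely.
```

Thesis (4) falls out for free here: if two candidates tie, or nobody scores at all, the function returns None and the name simply fails to refer.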
The only part of this rigmarole (for lack of better title, “gobbledygook” being the only other thing that comes to mind) that I disagree with is the “is known a priori” in (5). I do not think it matters whether the speaker knows empirically or a priori that “if X exists, then X has most of the φ’s”—I don’t think it really matters whether he (the speaker) knows it at all, only that he either assumes or asserts that X has most (or all) of the φ’s. I do not, however, think that this dissension will inhibit my dashing Kripke’s sophistry to ickle bits, so let’s get to it!
Kripke begins on very solid ground—ground so solid, in fact, that I can’t refute it. But he’s a moron, and that will become plenty apparent plenty soon.
He begins by quoting and refuting Searle, who said, in essence, that, when we refer to Aristotle (for one example), we refer to “the logical sum, inclusive disjunction, or properties commonly attributed to him….” Kripke refutes this, saying, “It just is not, in any intuitive sense of necessity, a necessary truth that Aristotle had the properties commonly attributed to him.” Earlier in this same lecture, Kripke used Richard Nixon as an example, saying that “Richard Nixon” does not mean “The President in 1970” because Richard Nixon would have been Richard Nixon even if Humphrey had been elected. This is true and, as I said, I cannot refute it, but little thought is required to realize that Kripke is really missing the point here, and that is what I will show you now.
Let’s go back to Aristotle. I don’t know much about him, but I know that he was a pupil of Plato and a teacher of Alexander the Great (this I know solely because Aristotle is used as an example in most of the philosophical essays &c. that I’ve read of late). I also know that he was called (or is at least now called by us English speakers) Aristotle. I will assume (though I can’t claim to know it for sure) that he was given the name “Aristotle” before he became well-known (this I will assume because, even if it isn’t actually true, it won’t hurt this example at all because I could do this same thing with any given historical figure; Aristotle himself is not vital to what I am saying). Kripke is saying that saying “Aristotle” is not the same as uttering “the pupil of Plato who taught Alexander the Great” because he would have been Aristotle even if he had never been Plato’s pupil or Alexander’s teacher. This, to me, seems just a little silly on Kripke’s part because, though that is absolutely true, the reason we still talk about Aristotle so long after his death is because of the great things he did (which includes, I suppose, learning from Plato and teaching Alexander). So, while “Aristotle” does not etymologically denote “the pupil of Plato who taught Alexander the Great,” when I say “Aristotle,” I mean “the pupil of Plato who taught Alexander the Great.”
Kripke, you’re a moron. How do I hate thee? Let me count the ways:
Evidence #2 that Kripke is barking up the wrong tree: he attacks (2), claiming that it defies (C)’s demand that “the account must not be circular.” He begins by saying, “Usually the properties in question are supposed to be some famous deeds of the person in question.” Really, Kripke? Is that really so? What if I want to talk about Cyrano de Bergerac because of his nose? Or Joseph Merrick (the Elephant Man) on account of his deformity? Certainly these are individuals known for doing things, but their physical appearances play major roles in that fame as well. Or what if I want to talk about Uluru (Ayers Rock)? It has no deeds to its name, though it does have a name and properties, too. The same is true of Rushmore and Everest and the
“Consider Richard Feynman,” Kripke says, “to whom many of us are able to refer. He is a leading contemporary theoretical physicist. Everyone here (I’m sure!) can state the contents of one of Feynman’s theories so as to differentiate him from Gell-Mann. However, the man in the street, not possessing these abilities, may still use the name ‘Feynman.’ When asked he will say: well he’s a physicist or something. He may not think that this picks out anyone uniquely. I still think he uses the name ‘Feynman’ as a name for Feynman.”
Stupid Kripke; he forgot to finish his sentence. Clever of him, I suppose, for the denouement of it is what ruins his argument: “I still think he uses the name ‘Feynman’ as a name for Feynman because he learned it in reference to Feynman, as is evidenced in his at least knowing that Feynman is a physicist.” See, the reason Feynman was known at the time was because he was such a crazy physicist; surely people would have heard of him even without understanding at all what he has done. Perhaps the same is true today of Stephen Hawking: you could probably find someone who would say, “Oh, that name sounds familiar; who is he?” Such an one would not be asserting that “Stephen Hawking” refers to no one but that it rather refers to someone who is, to him or her, unknown. Similarly, someone who would say of Stephen Hawking, “I’ve never heard of him,” would not be asserting that there are no Stephen Hawkings existing anywhere in the universe, merely that he or she has not heard of him. This is something Kripke can’t seem to wrap his head around.
Moving on, Kripke goes on to use
Let’s say, for example, that we know that
Honest to goodness, that’s where the man stops! No joke; I’ll give you his very next sentence—indeed, the entirety of his next paragraph—when next I call him stupid. So this is the entirety of his Cicero-Catiline example. Pretty pathetic, no? Can you imagine the following conversation?
Man 1: Let me tell you about
Man 2: Who’s that?
Man 1: Why, he’s the man who denounced Catiline!
Man 2: Who’s Catiline?
Man 1: Oh, merely a man Cicero denounced.
Man 2: Wait, so who’s
Kripke is absolutely right that this conversation isn’t going to go anywhere; Man 1 does not appear to know who either Cicero or Catiline were. But this is an absurd example wherein Kripke seems to assume that we’re all as dumb as he is. If Man 1 were to tell Man 2 the story of how Cicero denounced Catiline, he would probably explain to Man 2 that Cicero was one of Rome’s greatest orators and that Catiline was a 1st-century Roman politician who attempted to overthrow the aristocratic Senate of the Roman Republic—no circularity there! And is not this an example of the reason that we use names, so our audience can know what we’re talking about and thus understand what we’re saying about it?
Stupid Kripke; he goes on to say:
If we say Einstein was the man who discovered the theory of relativity, that certainly picks out someone uniquely. One can be sure, as I said, that everyone here can make a compact and independent statement of this theory and so pick out Einstein uniquely; but many people actually don’t know enough about this stuff, so when asked what the theory of relativity is, they will say: “Einstein’s theory,” and thus be led into the most straightforward sort of vicious circle.
GAH! NO! Stupid, stupid, stupid, stupid! Now you’re running in circles, Kripke, and you’ve given us essentially the same example twice—the names have changed, but the intent remains the same. Must you assume that all English speakers—or speakers of any language, for that matter—are as dumb as you?[1] Just because someone only knows of Einstein that he gave us the theory of relativity[2] doesn’t mean that “Einstein” and “the theory of relativity” have no meaning—to that person or in general!
One last nail in Kripke’s logical coffin and then I’m going to finish reading his essay (he goes on for another six pages or so, so I suppose he may actually redeem himself, in which case I will gladly write a retraction of all this):
I often used to hear that Einstein’s most famous achievement was the invention of the atomic bomb. So when we refer to Einstein, we refer to the inventor of the atomic bomb. But this is not so.
No, Kripke, we don’t. When I say “
My defense of Das Gobbledygook above goes something like this:
(1) This is a definition; even Kripke had naught against this.
(2) When I use a name, I use it to refer to a thing and all of its properties known to me. Generally, those properties are too numerous to list, which is why names are so handy. Rather than saying, “That hunk of rock that revolves around the earth, reflecting light upon earth and affecting earth’s tides, and is best known for being noticeable at night but is also occasionally visible during the day and which has been walked upon by a dozen men but hasn’t been visited in a while—you know, that big, be-cratered, heavenly body that only ever shows one side to us, goes through phases as the sunlight strikes it from different angles, is occasionally eclipsed by earth’s shadow, and sometimes eclipses the sun—the thing allegedly responsible for werewolves, flown across by witches, and worshipped by nocturnal pagans…” I’d much rather say, “The moon.”[3] And when I say the moon, the foregoing abbreviated description is exactly what I mean.
And now, I return to reading Kripke. He has 7 pages to redeem himself; I wish him the best of luck and, having vented my disapproval of what he’s already said, I meet him with open arms and an unbiased mind.
[1] There’s a pun in there somewhere, calling speakers “dumb,” but I don’t care enough to refine it just now.
[2] Can I just mention here that I think it’s really kinda ridiculous when people refer to theories being discovered? I discovered the theory of relativity by reading about it in a book; Einstein created the theory to describe and explain certain phenomena. Just one more reason I don’t like Kripke: he’s one of those.
[3] Especially since I’d have to substitute similarly burdensome descriptions in the places of “earth” and “sun” and “shadow” and “tide” and….
03 July 2008
The Bachelor and the Bobby-Soxer
Cary Grant opposite Myrna Loy and a teenaged Shirley Temple--can't go wrong, right? Right. This movie was so much fun! Can't say it's my favorite Cary Grant, but it was way good anyway. Some very good lines. My favorite? Hm. That's tough, especially only having seen it once, but one real winner was, "I couldn't help but overhear--my ear was to the door." (Maybe it's funnier in context; I dunno. The movie makes Happy Birthday pretty funny, too.)
I love Pixar; I still believe (as I long have, though I may have never stated it here) that every one of their films is quality. Wall-E had some magic moments. I was especially drawn in by the early scenes because it's truly amazing how much is portrayed sans spoken language; I was very impressed. I don't wish to say (or even to imply) that I lost interest in it as it progressed (because that is absolutely not true), but I'm just not sure how to take it. I think another viewing would probably help a lot. I wouldn't call it a political film, but it deals with a lot of politically charged topics (viz. environmentalism, obesity, technology dependency, media addiction, corporate power, society's herd mentality (especially as regards fashion and lifestyle), biased public education, and, were I even more nitpicky, probably several other things). I don't think that it was intended to preach any specific doctrine; I don't, in fact, believe that it was really intended to be a political movie. I suppose, if it's guilty of anything, it's guilty of reflecting the human condition disturbingly well, and I, for all my talk about how that's exactly what fiction ought to do, was a little put off by what I saw because I feel so above all of it (which probably makes me a part of the problem).
Anybody else seen Wall-E? Any thoughts anyone?
Both these flicks are worth your time, though; mighty fine movies, both.