Thursday, May 24, 2007
In most games, doing this makes the game very easy; you walk around with a Zeus who can just glide through the game, make monkeys out of all those evil-looking bosses, and sit back and casually watch all the pretty cut-scenes. I like to play this way. But Oblivion is different, because when your character levels, the world levels with you. While some quests are easier than they otherwise would be, others are ridiculously tough.
Anyway, that's not what I'm writing about. My question is, what is the point at which pretend evil becomes real evil? I ask this because certain video games, including Oblivion, at times encourage the player's character to do profoundly evil deeds. Certainly, any game featuring violence shows a certain callousness towards the finer points of moral theology. But some games, for example the original Legacy of Kain, Star Wars: Knights of the Old Republic, Grand Theft Auto, and Oblivion, provide opportunities for your character not only to use disproportionate force to defend himself from evil creatures, but also to unambiguously lie, steal, abuse, and commit virtual murder.
Right now in Oblivion, my character has joined an assassination cult called the "Dark Brotherhood." The Brotherhood's "theology" is interesting in itself because it is monotheistic amid a fantasy world governed by polytheistic religions, only its "God" is evil and delights in death; I suppose one might consider it gnostic in this sense. But it is curious because some of the dialogue seems to echo issues that then-Cardinal Ratzinger brought up in his Introduction to Christianity about the advent of monotheistic faith.
For the most part, my character's contracts have been against people who aren't particularly likable, which I'm sure is by design--the game eases you into the idea of murder by making your victims out to be villains, just like the slasher movies of the 1980s made sure that none of the doomed teenage cattle would be missed. However, from reading online guides, I know that one quest my character will soon be given involves the murder of an entire family that is absolutely blameless--again by design. The game wants to take you on a trip to nihilism.
The Pope has recently condemned all manner of violent and culture-less forms of media and entertainment. "Any trend to produce programs and products – including animated films and video games – which in the name of entertainment exalt violence and portray anti-social behavior or the trivialization of human sexuality is a perversion, all the more repulsive when these programs are directed at children and adolescents." (Link). His primary concern seems to be the formation of the consciences of children (and most of the games I mentioned above carry at least "M" ratings, which forces many game stores to check IDs before selling them).
I think perhaps a more helpful reference point would be any other form of play or pretend. Consider adults who reenact Civil War battles, "cosplay" at comic conventions, etc.; or even professional acting. Mature people are able to participate in a fiction which sometimes calls on individuals to play-act at doing profoundly evil deeds. In the case of professionals, the seriousness of their craft demands immersion, and thus a participation in evil thinking which must be less than comfortable. And I am referring as much to Shakespeare as I am to less culturally redeeming modern films.
Hm. I have more to write, but it's off to my little brother's graduation! What are your thoughts?
It seems as though the "choice" complicit in determining our beliefs is not so much a positive as a negative choice.
We do not actually choose what we believe; our beliefs are only the most consistent available terminations to the foreground of our thinking at a given time. We do not have direct, volitional access to change the content of our beliefs (they are, to use a computer term, "read-only"). Of course, by "beliefs" I mean earnest beliefs, not statements (which can be lies) or self-deceit.
On self-deceit: Self-deceit seems to be the closest that one comes to changing one's beliefs by sheer act of the will. However, it is still not direct. Recall that I defined beliefs as the most consistent available terminations to the *foreground* of our thinking. What people know and what they believe are not always consistent. Often, knowledge is present but has either never been made explicit (perhaps it is very subtle, like the truths that Socrates would bring to birth in his interlocutors), or has been "forgotten"--which is not to say that it has disappeared, but that it has receded from the foreground of thought. It is possible for us to willfully forget things, whether they are subtle points or unpleasant facts. These can be pushed into the background of thought, where they have no appreciable effect on conscious beliefs. This is done either by neglect of mental attention to those points, or by exaggerated mental attention on a peripheral point that, isolated from its ugly cousin, yields a pleasant outcome. But note that self-deception appears always to be a filter. We do not add an untrue thought to our thinking; we let the knowledge of its falsehood slide into obscurity.
But back to beliefs. Beliefs are the most consistent available terminations to the foreground of our thinking. They do not rest high above, aloof from thought, but rather make up a substantial portion *of* that thought (though thought is not simply a sequence of beliefs). Beliefs make up the soft boundaries of the thinkable; thus thought itself is not entirely accessible to the will. What part of it is?
To the extent that we have any control over our daily experiences and inputs, it seems that choice enters here. Like the food we eat, the influences that we imbibe can arrive on our doorstep according to circumstance, impulse, or strict planning. Of course, the extent to which individuals can do this is wildly variable, and socio-economic class is a strong factor--whether one has cable, Internet access, transportation, a public library, leisure, and the education to know what kinds of influences are where is strictly beyond anyone's control; those who have these things didn't normally earn them, or earn the tools necessary to earn them. Don't tell a destitute Guatemalan villager he can be anything he wants to be. Yet even among the wealthy, the breadth of this choice is tightly constricted: there is a grave disparity between the conveniently available little province of viewpoints that satisfies most people in a given first-world country, and an authentic breaking open of the intellect. Of course I don't mean to draw a line and put myself comfortably on the favorable side of it (though that may be a necessary consequence of merely writing about it). But there is a spectrum, and unfortunately it is precisely a part of the "lower end" of the spectrum that imagines itself to be at the "upper end." The market couldn't function if it couldn't persuade the masses that they are Inspired Independent Intellectual Individuals (and persuade each that he or she is more so than most).
Yet perhaps this is looking at it the wrong way. Certainly people have unequal access to the breadth of data generated by the human species in its recorded lifetime. But individually, each life can be said to be confronted with every possible choice, irrespective of complex or simple circumstances. No civilization is without its quarrels, its traditions, its wisdoms, and its foolishnesses. In one shape or another, the entire range of meaningful associations is accessible to every person. Academia is elitist; but human meaning is absolutely egalitarian. So we must personalize the question: not what one's influences are, but who.
Our chosen associations affect our thought and thus our beliefs. But how, exactly? It is not necessarily true that one's friends are also one's teachers; as mentioned, Pascal may have played dice with the godless, but not with his belief in God. Shared beliefs are but one of the possible bonds of friendship. Yet still nobody is impermeable to undesired influences; we are "porous", and Scripture frequently warns of associating with non-believers.
I mentioned before that self-deception occurs through the neglect of true thoughts (and these need not be religious thoughts; we deceive ourselves about the simplest and most mundane things). The mind is a bit like quicksand; left alone, objects will sink beneath the surface and disappear. Of course they are still there, but you wouldn't know it simply by looking.
How long can a person sustain the presence of true thoughts in the foreground of the mind? Alone, it would depend on his or her discipline and fervor for truth; yet even religious hermits depend on periodic gatherings and reminders of the brilliant yet subtle nature of their call. How much more necessary, then, would good associations be for those who also maintained associations indifferent to truth?
Again, I make a clear distinction: the danger is not that truth-indifferent associations would put dangerous or untrue thoughts in one's mind (it is scarcely possible today to avoid exposure to untrue thoughts; I do not believe intellectual provincialism is the friend of Christian communities anymore, if in fact it ever was). It is rather that, analogous to the self-deceiver who exaggerates a peripheral thought to forget the deeper one, truth-indifferent associations exaggerate the presence of peripheral and ephemeral ideas, of zeitgeists, and true thoughts can recede into the background of the mind if they are not brought forward again by good associations.
In short (too late!), it is not the associations that one has, but the associations that one lacks, that have the most profound effect on one's ability to have a breadth of views and truths always in the foreground of thought. The "choice" which is most relevant is not the choice for this or that association but the choice against.
OK, the next installment of this thing should finish it off. I know this essay is kind of weak; but I hope profound insights are forthcoming!
Tuesday, May 22, 2007
Indeed, Peter Berger's "The Heretical Imperative" argued precisely this: that whatever one's religion--from traditional Christianity to Mahayana Buddhism--it is a chosen religion and cannot be otherwise. He then goes on about plausibility structures as if this could be cheerful news for Christianity. Viva la heresy.
A point of analytic philosophy, a mere whisper among the enthusiasm for choice, is: can anyone be said to choose their beliefs at all? That was one of the first criticisms I read when studying William James' famous essay, "The Will to Believe". The very notion of a "chosen belief" seems self-contradictory. Presumably if someone believes something, it is believed for some reason--whether good or bad, well-articulated or not. If I were conscious of having a belief because I had chosen to have it, i.e., because it was born out of sheer will, then I could not actually be said to believe it, could I? I cannot choose to believe something which I know, or think, is false. And if I don't know, I might have a hypothesis--but the very word implies that it is less (hypo) than a real belief (thesis). One can entertain contradictory hypotheses; one cannot hold opposite beliefs. Choice alone does not a belief make.
In fact, I recall reading that William James himself acknowledged that this was the case, and suggested that his essay would have been better titled "The Right to Believe". But that does not change James' point that, even if we do not directly choose our beliefs in the manner I described above, still, desires and beliefs are mixed up somehow. A hint as to how is the bit of text from the Pensées that follows the so-called "Pascal's Wager":
"Go, then, and take holy water, and have masses said; belief will come and stupefy your scruples – Cela vous fera croire et vous abêtira."

Whereas some have accused Pascal of committing the error of "doxastic voluntarism", this single line actually vindicates him. Pascal never suggested that we choose our beliefs; we do not come to believe, rather, "belief will come." What we do choose is not our beliefs, but our surroundings, our influences--what we listen to, and what we do not listen to. In addition, it is notable that Pascal does not tell the reader what not to listen to, but suggests rather that one add to one's tributaries of influence one thing which might be lacking--exposure to worship. Pascal himself did not shy away from the company of his secularist peers; indeed, it was they, not he, who had constricted their scope of vision.
Blah, I've hit a block. I don't want to let this essaylet die; I do have a point. Need to take a break.
Saturday, May 12, 2007
(Some time later, I got a kick out of making guttural zombie throat noises in the dark hallway outside of our rooms, right in the ear of an unsuspecting Gregory. Ecgh! Ecchhhcg! Not the only time I played cruel jokes on him.)
I was excited to see 28 Weeks Later, upon seeing that the reviews were favorable. I have some thoughts (CAUTION: SPOILERS):
- The sequel is more predictable and Hollywood-ish than the first film. I knew, as soon as the story focused on a small group of survivors, that everybody was doomed except for the children and, perhaps, the attractive woman. I also knew that the upstanding, handsome American soldier would die in an act of heroic self-sacrifice.
- I focused on the black chopper pilot to see whether this movie would kill him off. Mainstream movies still don't know what to do with black characters, so they either kill them off, make them into black stereotypes, or turn them into mysterious "gatekeeper" figures that, as such, do not require a personality. In truth, this movie did a little bit of all three.
- SPOILER: The twist ending was neat; a little tacked on, but it actually makes me salivate for the third movie. It also dragged this movie out of sheer cliché. It makes it a tragedy in the true Shakespearean sense, if you catch my drift.
- The most unbelievable part of the film was that they would even think about resettling the country less than a year after the disaster.
- SPOILER: Plot hole. The Rage virus is a blood pathogen. The infected wife kissing her husband (with no notable open wounds in either) shouldn't have turned him into a zombie.
- There's some screwy logic going into both movies about the nature of the zombies. The lore seems confused about their motivation. They're not actually hungry, since they don't eat their victims--they just bite, and bite somebody else, and bite somebody else. Hence the infecteds died of starvation after the first film. If they were true cannibals, it would have taken the original infestation longer to die out than it did. It's as if the infecteds' "purpose" is neither mindless rage nor carnivorous hunger but simply spreading the disease. Some have commented on the strangeness that the infecteds do not attack each other; this may have less to do with intelligence or society (à la Romero's recent zombie movie) than with a kind of built-in repulsion between infecteds. They run in groups because they have common targets, no other reason. The hole in this theory is that (SPOILER) the infected husband attacked his infected wife. But that was different, perhaps, because his infected wife was just a carrier and thus may not have been emitting whatever repulsion is effective between infecteds.
- On a more relevant note, I do believe in angels and their fallen counterparts, and I have explored a notion that devils are in a sense just like spiritual "infecteds". Perfect Love and Perfect Freedom are, IMO, synonyms for God, and they are synonyms for salvation. In the films, to be infected is to have your entire being possessed by a single purpose: destruction. There are important differences between devils and infecteds, but the key similarity is that their wills are entirely collapsed into hatred; they cannot not destroy. The key difference is that devils have freely forfeited their wills, and also that they hunger as much for self-destruction as for the destruction of others. The infecteds fail to be a stronger analogy only because of plot points in place to make the films scarier. But the fact that infecteds are still technically alive makes them a curious analogue to the devils we see possess people in other movies and that we read about in books. To be evil is to be reduced to the barest thread of existence, and to hate even that thread. Despair turns to rage, and rage to murder. Murder and suicide are kin.
Tuesday, May 08, 2007
- Explain what attracted you to the ministry of teaching.
- Explain why you wish to teach in a Catholic school.
- What do you think are the challenges for today's students?
For the last six years I have been a seminarian, and I imagined that I could follow my passion: learning and teaching about the Catholic faith. However, my years in formation taught me that I did not want to sacrifice the depth of a dedicated teaching profession for the breadth of ministries demanded of a parish priest. The Catholic Tradition is my first love. It is a bottomless well of wisdom; it is the key to a happiness which transcends all illusions, all weakness, and all failures. My love for teaching is not only the thrill of engaging a classroom. It is simply my confidence in the Holy Spirit's activity in the life of the Church, and I wish to share this Gift.
I hope to teach in a Catholic school to collaborate with others and help a generation become faith-filled, exceptional contributors to society. I know that Catholic schools are not shelters from the 'real world' but among the best preparations for it. Thus I would not teach religion as an obscure system of rules binding only on those who bear the name. Rather, I would teach a truth which pierces the depths of each soul and manifests as a life different from the cynical, secular world; a life bound fast (re-ligata) to God.
Challenges facing high school students are legion, but I think that a single strand runs through them: the desire for autonomy. Perhaps teenagers do seek independence through non-conformity, but they also seek autonomy in a literal sense, that is, to "name oneself". This is the first time in their lives when they begin to own their beliefs, and to question beliefs that they are not ready to own. Because it is the first time, the sensation creates excitement and a demand for autonomy that perhaps surpasses our own.
To desire that one's identity and beliefs be chosen rather than passively accepted, I believe, is a good thing. There is an implied desire for authenticity and integrity, i.e., that one's life ought not be guided by a jumble of unproven authorities, but rather by deep personal understanding. Even the rebel who declares, "you can't tell me what to believe!" implicitly demands that his or her beliefs have an inner unity and coherence. The guarding of one's ideas against unproven authorities is the hint of a desire for the unity of universal Truth, a deep desire fulfilled only in Jesus Christ. Understood this way, even the rebel can be a prophet.
What this means for me as a teacher is that my curricula have a hybrid foundation: on the one hand, Jesus Christ, God living and walking among us; and on the other, the questions of the students, and my conviction that all of these are bound up in truth and all Truth is bound up in Christ. My philosophy and theology background has enabled me to explain the lines of continuity drawing the most impassioned teenage values together with the wisdom of the ancient Fathers, the saints, the philosophers, and the doctrine of faith. Concretely, this means demonstrating the lines of connection between popular culture and ancient truth, such as when I discussed atheism in the context of James Frey's bestseller, "A Million Little Pieces," and when I used the summer sci-fi movie "I, Robot" to talk about the Gospel of John and the hypostatic union.
There is nothing I enjoy more than eliciting the “aha” moment from students engaged in learning about our faith tradition, and I hope that, working in a Catholic school, I can be accorded many such opportunities.