Thursday, October 31, 2013

"The Scientific Worldview of the Bible"

Great, short article:

I'm putting it here so that I don't forget about it. I'd like to get back to it in the future but can't right now.

Catcalls, Photography, and Making Judgments

Hannah Price, a photographer, has captured otherwise uninteresting images with a very interesting motivation and narrative: since her move to Philadelphia, she has photographed the men who catcall her and gathered the images into a photo project.
Though she does not believe her response causes these men to reconsider their actions, she feels that documenting the encounter allows her to take control of the situation, turning the attention to their behavior rather than her physical appearance. Claiming that the series is neither a judgment on men or a comment on race, the MFA Yale candidate uses her photography as a means of understanding something unfamiliar, hoping to find some sort of common humanity in the process.
Source: Jenna Garrett, "Female Photographer Approaches Men Who Catcall at Her and Takes Their Portrait," Feature Shoot, October 30, 2013, accessed October 31, 2013,

You can see the entire project here at Ms. Price's website:

What I find interesting about Price's comment that the series is "neither a judgment on men or a comment on race" is that all of the men who catcalled her and appear in the pictures belong to ethnic minorities. Even if Price had no intention of commenting on that obvious pattern, the very nature of a photo series accompanied by Price's narrative is not only an explicit revelation of this aspect of our society but an implicit commentary, not only on race or men but on social norms and on female identity and status in relation to men.

The difficult tension in claiming that the series is not a "judgment" or a "comment" is that it emerged out of, as Price herself says, a motivation to "take control of the situation" (probably a somewhat humiliating if not outright degrading one) and "turn the attention to their behavior rather than her physical appearance." It is a rather ingenious reversal of "intentional image emphasis," in which human intention towards an image is focused or emphasized. The motivation itself implies a judgment about the entire situation, namely, that it is wrong, or at least unpleasant and less than ideal.

Finally, is it possible to have an "unspeaking" or "non-commenting" photo series, a series that by its very nature presents meaningful images, put together with human intentionality that narrates those images in a coherent context? I am inclined to think it isn't possible. If a computer slideshow program randomly selected photographs from a database of images uploaded by humans everywhere, then I would be inclined to believe that there is no narrative or comment being made, and any perception of continuity or narrative would be accidental or projected on my part. But if it is an intentional human being putting together such photographs from a database specifically assembled in response to a specific situation that arises in a specific social context, I would think otherwise.

But this isn't to put down Price's work. I think it's great, and it draws more attention to an experience that many women suffer from and a social practice that should stop but most likely never will until the end of time.

St. Bernard of Clairvaux and Pope Benedict XVI on Desiring Eternal Life

“Nightlong in my little bed I sought him whom my soul loves.” It is a great good to seek God; in my opinion the soul knows no greater blessing. It is the first of its gifts and the final stage in its progress. It is inferior to none, and it yields place to none. What could be superior to it, when nothing has a higher place? What could claim a higher place, when it is the consummation of all things? What virtue can be attributed to anyone who does not seek God? What boundary can be set for anyone who does seek him? The psalmist says: `Seek his face always.' Nor, I think, will a soul cease to seek him even when it has found him. It is not with steps of the feet that God is sought but with the heart's desire; and when the soul happily finds him its desire is not quenched but kindled. Does the consummation of joy bring about the consuming of desire? Rather it is oil poured upon the flames. So it is. Joy will be fulfilled, but there will be no end to desire, and therefore no end to the search. Think, if you can, of this eagerness to see God as not caused by his absence, for he is always present; and think of the desire for God as without fear of failure, for grace is abundantly present.
Source: St. Bernard of Clairvaux, Commentary on the Song of Songs, trans. by Killian Walsh, Paths of Love Website, accessed October 31, 2013, Sermon 84.1.
“What do you ask of the Church?” Answer: “Faith”. “And what does faith give you?” “Eternal life”. According to this dialogue, the parents were seeking access to the faith for their child, communion with believers, because they saw in faith the key to “eternal life”. Today as in the past, this is what being baptized, becoming Christians, is all about: it is not just an act of socialization within the community, not simply a welcome into the Church. The parents expect more for the one to be baptized: they expect that faith, which includes the corporeal nature of the Church and her sacraments, will give life to their child—eternal life. Faith is the substance of hope. But then the question arises: do we really want this—to live eternally? Perhaps many people reject the faith today simply because they do not find the prospect of eternal life attractive. What they desire is not eternal life at all, but this present life, for which faith in eternal life seems something of an impediment. To continue living for ever —endlessly—appears more like a curse than a gift. Death, admittedly, one would wish to postpone for as long as possible. But to live always, without end—this, all things considered, can only be monotonous and ultimately unbearable. [...]
Obviously there is a contradiction in our attitude, which points to an inner contradiction in our very existence. On the one hand, we do not want to die; above all, those who love us do not want us to die. Yet on the other hand, neither do we want to continue living indefinitely, nor was the earth created with that in view. So what do we really want? Our paradoxical attitude gives rise to a deeper question: what in fact is “life”? And what does “eternity” really mean? There are moments when it suddenly seems clear to us: yes, this is what true “life” is—this is what it should be like. Besides, what we call “life” in our everyday language is not real “life” at all. Saint Augustine, in the extended letter on prayer which he addressed to Proba, a wealthy Roman widow and mother of three consuls, once wrote this: ultimately we want only one thing—”the blessed life”, the life which is simply life, simply “happiness”. In the final analysis, there is nothing else that we ask for in prayer. Our journey has no other goal—it is about this alone. But then Augustine also says: looking more closely, we have no idea what we ultimately desire, what we would really like. We do not know this reality at all; even in those moments when we think we can reach out and touch it, it eludes us. “We do not know what we should pray for as we ought,” he says, quoting Saint Paul (Rom 8:26). All we know is that it is not this. Yet in not knowing, we know that this reality must exist. [...]
This unknown “thing” is the true “hope” which drives us, and at the same time the fact that it is unknown is the cause of all forms of despair and also of all efforts, whether positive or destructive, directed towards worldly authenticity and human authenticity. The term “eternal life” is intended to give a name to this known “unknown”. Inevitably it is an inadequate term that creates confusion. “Eternal”, in fact, suggests to us the idea of something interminable, and this frightens us; “life” makes us think of the life that we know and love and do not want to lose, even though very often it brings more toil than satisfaction, so that while on the one hand we desire it, on the other hand we do not want it. To imagine ourselves outside the temporality that imprisons us and in some way to sense that eternity is not an unending succession of days in the calendar, but something more like the supreme moment of satisfaction, in which totality embraces us and we embrace totality—this we can only attempt. It would be like plunging into the ocean of infinite love, a moment in which time—the before and after—no longer exists. We can only attempt to grasp the idea that such a moment is life in the full sense, a plunging ever anew into the vastness of being, in which we are simply overwhelmed with joy. This is how Jesus expresses it in Saint John's Gospel: “I will see you again and your hearts will rejoice, and no one will take your joy from you” (16:22). We must think along these lines if we want to understand the object of Christian hope, to understand what it is that our faith, our being with Christ, leads us to expect.
Source: Pope Benedict XVI, Spe Salvi, Encyclical letter on Christian hope, Vatican Website, November 30, 2007, accessed October 31, 2013, numbers 10-12.

Wednesday, October 30, 2013

St. Thomas Aquinas on Christ's Fitting Example Through Suffering

The sin of man consists in cleaving to bodily things and neglecting spiritual goods. Therefore the Son of God in his human nature fittingly showed by what he did and suffered that men should consider temporal goods or evils as nothing, lest a disordered love for them impede them from being dedicated to spiritual things. Thus Christ chose poor parents, although perfect in virtue, lest anyone glory in mere nobility of flesh and in the wealth of his parents. He led a poor life to teach us to despise riches. He lived without titles or office so as to withdraw men from a disordered desire for these things. He underwent labour, thirst, hunger and bodily afflictions so that men would not be fixed on pleasure and delights and be drawn away from the good of virtue because of the hardships of this life. In the end he underwent death, so that no one would desert the truth because of fear of death. And lest anyone fear a shameful death for the sake of the truth, he chose the most horrible kind of death, that of the cross. Thus it was fitting that the Son of God made man should suffer and by his example provoke men to virtue, so as to verify what Peter said (1 Pet 2:21): "Christ suffered for you, and left an example for you to follow in his steps."

Then, because not only good conduct and avoiding sins is necessary for salvation, but also the knowledge of truth so as to avoid error, it was necessary for the restoration of the human race that the only-begotten Word of God who assumed a human nature should ground people in truth by a sure knowledge of it. Truth taught by men is not so firmly believed, because man can deceive. Only by God can knowledge of the truth be confirmed without any doubt.

So the Son of God made man had to propose the teaching of divine truth to men, showing them that it came from God and not from man. He did this by many miracles. Since he did things that only God can do, such as raising the dead, giving sight to the blind etc., people had to believe that he spoke with God's authority.

Those who were present could see his miracles, but later generations might say they were made up. Therefore Divine Wisdom provided a remedy against this in Christ's state of weakness. For if he were rich, powerful and established in high dignity, it could be thought that his teaching and his miracles were received on account of his favour and human power. So to make the work of divine power apparent, he chose everything that was rejected and low in the world, a poor mother and a poor life, illiterate disciples and messengers, and allowed himself to be rebuked and condemned even to death by the magnates of this world. This made it apparent that his miracles and teaching were not received because of human power, but should be attributed to divine power. Thus in what he did or suffered, human weakness and divine power were joined together at the same time. Thus at his nativity he was wrapped in cloth and put in a manger, but praised by the angels and adored by the Magi led by a star. He was tempted by the devil, but ministered to by angels. He lived without money as a beggar, but raised the dead and gave sight to the blind. He died fixed to the cross and numbered among thieves, but at his death the sun darkened, the earth trembled, stones split, graves opened and the bodies of the dead were raised.

Therefore if anyone considers the great fruit of such beginnings, namely, the conversion of peoples over [nearly the whole] world to Christ, and wants further signs in order to believe, he must be considered harder than a stone, since at Christ's death even stones were shattered. Thus the Apostle says (1 Cor 1:18): "The message of the cross is folly for those who are on the way to ruin, but for those of us who are on the road to salvation it is the power of God."

Source: St. Thomas Aquinas, De Rationibus Fidei, trans. by Joseph Kenny, DHS Priory Website, accessed October 30, 2013, chapter 7.

Tuesday, October 29, 2013

Human Rights Campaign, the Discourse of Equality, and the Joker

Although I don't pay much attention to the same-sex marriage movement or its related equal-rights movement for the LGBT community, these things are almost impossible to avoid in our society. I've noticed two facets of these complementary movements that I would like to comment on: 1) their discourse, which capitalizes on the ambiguity of key words; and 2) their values, which supervene upon this ambiguity.

As with all discourse, there are key words and phrases that set the same-sex equality discourse apart from others. One can see these phrases collected or implied on the Human Rights Campaign website, for example, or in its nearly-ubiquitous equal-sign logo. The list could go on, but here are some off the top of my head: same, sex, movement, equality (and equal), right(s), human, marriage, love, partners, gays (and variants), choice, freedom, self-determination, tolerance, acceptance, advocate, attraction. These are the positive words, but there are also negative words, such as homophobia, bigot (and variants), fag, queer, unnatural, Bible, intolerant, un-accepting, religion, dogmatic, judgmental, and other vulgar or graphic terms.

Interestingly, when one looks at the broadest of these key terms, the discourse under examination becomes almost indistinguishable from other discourses of a similar kind, such as the pro-choice movement, libertarian thought, modern political liberalism/progressivism, secular advocacy, the advocacy of scientism, etc. These are worldviews similar to what Husserl called the Lebenswelt. They are discourses that collectively organize the perception and appropriation of reality in a way that "makes sense" or that attempts to minimize cognitive dissonance in the face of realities such as the physical and genetic difference between male and female, the reality of death, or, above all, the reality of God.

But if these broad key words are shared, then there is a foundational post-Enlightenment political-philosophical discourse that roots those words and makes sense of them; we could call it a "cenoscopic discourse" that gives birth to particular discourses. And these key words are necessarily ambiguous. For example, what is meant by same/equality? The answer is self-evident to a person who unquestioningly subscribes to the discourse. Its answer is as natural as a fish in water, but notice that in order to answer the question, such a subscriber will consult other key words within the same discourse. The web of key words is thus self-reinforced, hearkening back to Descartes' important contribution to the modern mind: self-produced certitude and all the solipsism that follows from that, culminating in Bertrand Russell's own admission of solipsism: "We can witness [only] what goes on in our heads, and [not] anything else at all" (1959: 26). It is literally impossible to think outside the box of a discourse without awareness and external light leading to Lonerganian insight.

The questioning may continue: what does love mean? In what way can one love be "equal" to another? Who or what adjudicates the validity or equality of any love? Why isn't the desire that leads to rape or pedophilia equal to the love that leads to same-sex or heterosexual relationships? Of course, such a question will cause a knee-jerk negative reaction, culminating in accusations of intolerance, judgmentalism, dogmatism, and bigotry. The reason is that even though the question does not declare any explicit judgment, it implies certain judgments that are contrary to the same-sex discourse. These judgments, one could say, are alien, are analogically homophobic to the homo-accepting Lebenswelt, and hence such judgments are seen as a threat. In fact, everyone understands at least implicitly that to frame a question in a certain way requires firstly a certain way of looking at things, because the frame implies the framer and vice versa; this is what makes rhetorical and loaded questioning possible. Discourse is a frame. The question reveals the frame just as the frame of a picture gives finality to the picture itself. A picture without a frame, without borders, becomes indistinguishable from the Real.

These questions, however, reveal the ambiguity of the same-sex discourse and its key words. They reveal that such words are used not simply analogically but ambiguously or, even worse, equivocally (perhaps because their users are nominalists). And the discourse and all related discourses rely absolutely upon such ambiguity. It is imperative that no one knows what love is, what "nature" is, what exactly it is that one "chooses" or "self-determines," but simply that one has "choice." It is like the anarchic Joker from Nolan's The Dark Knight, who says, "Do I look like a guy with a plan? [...] I'm a dog chasing cars. I wouldn't know what to do with one if I caught it! You know, I just do things." Oddly enough, the Joker's own discourse falls right into the modern political-philosophical discourse that I've been describing thus far. It is essential to the project (or "plan" or "scheme") that human rights remain floating values that one can pick and choose like apples from a tree, the tree being...what? And who planted it?

If love is an empty placeholder, then why not call it "equal" as well? If sex/gender are indistinguishable, then what is marriage? Yes, the destruction of distinctions leads inevitably to a discourse of "tolerance," which is impossible on its own terms. The discourse of tolerance, as is any discourse by its very terms, is intolerant of other discourses. A discourse, to be distinguishable from other discourses, must be intolerant in its foundation.

Yet again, discourses can be either ideoscopic or cenoscopic, particular or general, but that is another topic.

Thursday, October 24, 2013

Americans, Biblical Literacy, and the Magisterium

I enjoy reading Big Think articles because they post generally popular, quasi-intellectual topics in a way that reflects how most people, especially seculars, basically look at these same issues. It allows me to stay up-to-date on some of these issues and the perspectives that have been developing. It is also a catalyst for self-reflection and meditation. Finally, it is a source of frustration, probably not only for me but even for seculars who may notice that the issues are radically oversimplified for the sake of brevity.

Anyway, a recent article was on Americans and the Bible. Apparently, 80% of Americans believe the Bible is either the inerrant, literal Word of God or at least inspired by God, but less than 50% of Protestants or Catholics (Catholics actually scored higher) could pass a very simple Biblical literacy quiz. You can read the full article with links to the quiz here:

One professor that the article interviews, Joel Baden at Yale Divinity School of all places, notes the contradictory, jumbled elements of the Bible and that to invoke the Bible for cultural authority is therefore not so straight an issue as Americans would like it to be, say, for creationism, monogamy, or heterosexual marriage. The article and Baden therefore, somehow, conclude:
And so if we are to continue to invest as much authority in the Bible as we do, Baden says, we - as serious readers of the text - cannot pretend that the Bible is a single, clear statement of belief. Rather, "it is a jumble of beliefs," Baden says, "a combination of voices...embedded in the text right from the word 'Go.'" 
So of what use is the Bible? This book is both the ultimate source of authority and completely indecisive. But that does not mean we should throw it away, Baden says. "This text that our culture holds most sacred is a living reminder that human interaction is founded on dialogue and not monologue - the inclusion of differences, not their exclusion."
Source: Big Think Editors, "The Bible Paradox," Big Think, October 20, 2013, accessed October 24, 2013,

I can see how that makes sense if one assumes either: 1) there really is no supernatural origin for the Bible and therefore what we do—"dialogue"—is the most important element in our collective, cultural relation to the Bible; and/or 2) a Protestant approach to adjudicating Biblical interpretations.

But there's a third possibility, the Catholic one. Perhaps the Bible isn't meant to be read, as some put it, "literalistically," that is, with every story or verse taken to be literally true precisely as we happen to interpret it in the moment. Such a reading leaves no room (or at least only a little) for analogies, for differences of genre, or for spiritual interpretations of the text.

The Catholic solution is the need for a Magisterium to help guide us in the proper reading of the text. Perhaps the Biblical text wasn't meant to be read merely by individuals in isolation from one another. Perhaps the text arose within a structured, even hierarchical, community of religious believers. Perhaps the text arose simultaneously with an oral tradition as well as rituals and an authoritative context that made sense of all three elements. Perhaps sola Scriptura and "dialogue" are not as important as submission to the authority that God personally established in order to protect and hand on His self-revelation. Just saying.

Surveillance, Images, and the Devil

Photographer Sheri Lynn Behr put together a photo project that explores an increasingly-discussed topic, namely, the rapidly disintegrating line between the public sphere and one's private life, especially through the use of image-capturing technology.
NoSafeDistance addresses the idea of photography without permission—a concept practically unavoidable in what Behr calls today’s “post-privacy world and image obsessed society.” Separated only by glass store windows, Behr turns the camera on strangers just long enough to capture the surprise on the other side.
Source: Ben Marcin, "Too Close For Comfort: Portraits of Strangers Taken Without Their Permission," Feature Shoot, October 23, 2013, accessed October 24, 2013,

Behr remarked, "I wanted to see what other people would do, in this age of smart phones and surveillance.” How would you react? Earlier this evening, I photographed two seminarians in front of me during dinner conversation. At first they tried blocking their faces and stopping me. They jokingly asked what I was doing on my phone, whether I was uploading awkward photographs onto my Facebook or Twitter.

Behr continues:
I’m always interested in things that we are aware of, but don’t necessarily notice, like surveillance cameras. After the Boston Marathon tragedy, people were talking about how great it was that we caught the bombers because of security camera footage. I can’t deny that that was a good thing, but no one talks about the people who were misidentified on surveillance video and how their lives were disrupted. I think all this surveillance falls into really a gray area—there’s so much of it, and we need it to be out in the open and under discussion.
Source: Ibid. 

Earlier this year, 60 Minutes aired a segment on facial recognition technology. It notes that in the case of the Boston Marathon bombing, although there were images of the bomber, he was identified not by matching photo images but by fingerprints. The segment explores the issue more in depth:

The same technology shows up in the movie Minority Report, which the 60 Minutes segment briefly explores, including its influence on advertising and consumerism.

We also live in a society that praises constant self-assertion, whether through Facebook and other social media websites, through YouTube vlogs, through blogs, etc. Even this post is an example of such self-assertion. It's what Behr describes as "image-obsessed." We like to control our output, showing people exactly what we want them to see. We borrow and share familiar photographic angles, such as the famous selfie. We share status updates that provide only the slice of our life that we wish to share and have people notice, painting a picture of who we seem to be. I become what I describe, what I present, the image I project.

But we all understand that this self-assertion is an extension of a basic realization children have at some point when they look into the mirror: that they are looking not at another child who is imitating them but at themselves, and that their being involves an image, an objective side. We want to look in the mirror and see, instead, what we upload on Facebook; after all, people on the Internet don't have a mirror of us.

But when it comes to other people capturing images of us, most of us detest it. Some enjoy it, but it is interesting to meditate on why we might dislike having our images leave our control. Privacy, reputation, and manipulation come to mind, among other things. But even more fundamentally, we lose control of ourselves, which we equate with our self-output image. If I have no control over my image, I lose myself.

Again, this unconscious mentality is clearly contrary to the spiritual self-surrender necessary for the reformation of Christ's image in us, who, we must remember, are made in one image: the image and likeness of God.

It has been particularly unfortunate to see women especially take advantage of the selfie in order to post semi- or fully pornographic photographs of themselves. This phenomenon suggests another element: the ease with which the demonic enters into all of our attempts to take matters of our destiny and our identity into our own hands. The selfie becomes a new form of magic, by which we attempt to control the forces around us and thus save ourselves rather than allow ourselves to be saved. Even in the masochistic end of a relationship, such masochism is usually consciously chosen, and if not consciously so, then unconsciously provoked and encouraged. These are all assertions of self that very easily give room for the demonic to enter in.

It seems necessary to me, then, to reflect on what I put out before doing so, to reflect on my motives, my goal, and my relation to Christ at that moment. Am I trying to allow Christ's image to emerge through me, or am I asserting myself and in so doing allowing sin and the devil to be my PR managers?

Sunday, October 20, 2013

"One Day in History" by Andrea Gjestvang

A collection of photographs of the survivors and victims of the shooting at the Utøya summer camp on July 22, 2011. Click here to see the gallery.

Andrea noted the following:
We tend to forget survivors, as their condition is often not known until the news cycle is over. Or we assume that time [equals] healing. My goal with this work is to remind people, make them think and feel and want to understand. Even if time goes by and new stories hit the front pages of the newspapers, the mental and visible scars of the survivors of such terrible incidents remain. I want to show that grief (but also healing) comes in different shapes. These youths share the same experiences and to a large extent similar reactions, but the process is individual. Thinking of the terror attack of July 22nd, I want people to remember these youths and not the face of the terrorist.
Source: Sahara Borja, "Photographing The Unspeakable - Andrea Gjestvang’s Portraits of Norway’s July 22nd Survivors," Feature Shoot, August 29, 2013, accessed October 20, 2013,

Friday, October 18, 2013

Distinction: The Deception of the Senses and Descartes' Error

Distinctions are important and dangerous. They can endlessly multiply arguments, but they can also help discover and preserve truth. An example of a culturally-ubiquitous lack of distinction-leading-to-error is in order.

I came across a video today (one of many like it I've seen over the years; I frequently hear people carelessly making the same claim):

Source: BuzzFeedVideo, "Proof Your Senses Are Lying To You," YouTube, October 12, 2013, accessed October 18, 2013,

Descartes made many errors. There's one I want to focus on here, and it persists through the tradition of modern philosophy and into popular culture today. It's the belief that our "senses deceive us" or even can deceive us. What's going on here, really, is an ambiguous use of the word "senses," and Descartes even makes the distinction between what our senses in themselves report to our mind and the judgments that our minds make of those sensory reports:
When the wax is in front of us, we say that we see it, not that we judge it to be there from its colour or shape; and this might make me think that knowledge of the wax comes from what the eye sees rather than from the perception of the mind alone. But this is clearly wrong, as the following example shows. If I look out of the window and see men crossing the square, as I have just done, I say that I see the men themselves, just as I say that I see the wax; yet do I see any more than hats and coats that could conceal robots? I judge that they are men.
Source: René Descartes, Meditations on First Philosophy, "Second Meditation: The nature of the human mind, and how it is better known than the body," Marxists, accessed October 18, 2013.

Nevertheless, Descartes continues to insist that it is our senses that deceive us, and they are trustworthy only because of the goodness of God:
Thus I see plainly that the certainty and truth of all knowledge depends strictly on my awareness of the true God.
Source: Ibid., "Fifth Meditation."

Nevertheless, to be fair, Descartes seems to mean that we cannot derive clear and certain knowledge of things from the senses, not necessarily that sensory data itself is deceiving. But even here Descartes is unclear, failing to make the needed distinctions, or at least not applying them consistently; for example:
God has given me no way of recognizing any such ‘higher form’ source for these ideas; on the contrary, he has strongly inclined me to believe that bodies produce them. So if the ideas were transmitted from a source other than corporeal things, God would be a deceiver; and he is not. So bodies exist. They may not all correspond exactly with my sensory intake of them, for much of what comes in through the senses is obscure and confused.
Source: Ibid., "Sixth Meditation." 

At various points (since I don't have the works in front of me, I can't verify exactly where), I've noticed that many other modern philosophers have made claims similar to Descartes', namely, that the senses deceive and are untrustworthy. It's a claim that popular culture now takes for granted. Yet hardly ever is the distinction made between our sensory percepts and the judgments we form of those percepts. Error is found in a concrete judgment, a proposition that a thing is either so or not so. Deception requires firstly the possibility of an affirmation or negation, not a brute datum such as sensory stimuli. I am deceived about my judgment of a matter of fact, not about the reality that the senses report.

Our senses tell us exactly what they tell us. Yes, that's a tautology, and it is one of many factors that give rise to the possibility of judgment and hence also of truth and deception. Even people with "malfunctioning" sensory organs receive precisely what those organs communicate; deception can occur only in what the person judges of that sensory data.

Thursday, October 17, 2013

Zed Nelson, the Beauty Industry, and People as Objects

The following poignant commentary on the beauty industry and standards of Western beauty becoming globalized is from the photographer Zed Nelson:
Beauty is a $160 billion-a-year global industry. The worldwide pursuit of body improvement has become a new religion. 
We live in a society that celebrates and iconises youth, where the old, the aesthetically average and the fat seem to have been erased from the pages of our glossy magazines, advertising posters and television screens. 
The promise of bodily improvement is fuelled by advertising campaigns and a commercially-driven Western media, reflecting an increasingly narrow palette of beauty. The modern Caucasian beauty ideal has been packaged and exported globally, and just as surgical operations to 'Westernise' oriental eyes have become increasingly popular, so the beauty standard has become increasingly prescriptive. In Africa the use of skin-lightening and hair-straightening products is widespread. In South America women have operations that bring them eerily close to the Barbie doll ideal, and blonde-haired models grace the covers of most magazines. Anorexia is on the increase in Japan, and in China, beauty pageants, once banned as 'spiritual pollution', are now held across the country. 
'Westernising' the human body has become a new form of globalisation, with 'Beauty' becoming a homogenous brand. The more rigorously our vision is trained to appreciate the artificial, the more industries benefit. The current standard of beauty feeds the fashion, cosmetics, diet, medical and entertainment industries, with the homogenisation of appearance becoming part of an increasingly globalised consumer culture. 
But who creates this culture? However much we may confidently point the finger at certain industries, we can't deny our own tacit, albeit culturally conditioned, involvement. Like it or not, we are judged, and judge, by appearance. Perhaps we are obsessed with the way our own bodies look because we know how instinctively judgemental [sic] we are of the bodies that we look at. 
A recent scientific study reported that we make decisions about the attractiveness of people we meet in the space of 150 milliseconds. This superficial appraisal has profound implications. Those we consider most beautiful not only find sexual partners more readily but studies also show they get better jobs and more lenient treatment in court. 
We have created a world in which there are enormous social, psychological and economic rewards and penalties attached to the way we look. Can any of us honestly say, 'I don't want to be attractive'? Don't we all want to be loved? But have we been brainwashed into believing that in order to be loved we need smaller noses, bigger breasts, tighter skin, longer legs, flatter stomachs and to appear ever youthful? Where does it end?
The body has, in a sense, become just another consumer purchase. Everyone can, in the spirit of our age, go shopping for bodily transformation. Banks now offer loans for plastic surgery. American families with annual incomes under $25,000 account for 30 per cent of all cosmetic surgery patients. Americans spend more each year on beauty than they do on education. 
As our role models become ever younger and more idealised, we are so afraid of aging that the quest for youthful preservation generates an almost pathological obsession with our bodies. As we align our sense of self-worth with self-image, the psychological and emotional consequences are tortuous. The one thing we do know for certain is that our body will always, in the end, betray us.
Source: Zed Nelson, "Love Me: Introduction," Zed Nelson, 2009, accessed October 17, 2013,

The commentary speaks for itself. The photo project is quite striking, dark, and saddening, and I highly recommend viewing it. A warning, though: some of the pictures are quite graphic, depicting surgery as well as remains from surgery:

One quotation that Nelson attaches to one of the photographs is quite perceptive:
“Every society has notions of what one should believe, how one should behave, and how one should look like in order to avoid unpopularity.

“These social conventions are formulated in legal codes and religious doctrines, but also in a vast body of social judgements which we take for granted, which dictates what we wear, who we respect, how we lead our lives, and how we should look. We refrain from questioning the status quo, because we associate what is popular with what is right.
(Cited as: Alain de Botton, writer.)

A further difficulty is that cultural values work as a web, each element strengthening the others. To overcome one element requires that a person go up against every other element, at least implicitly. Doesn't this resistance culminate in the death of the martyr, whether red or white?

Women and men caught in the trap of societal standards of beauty, and each entangled in sexual sin: these are two sides of the same coin, or rather, two facets of a multi-faceted problem. The underworld that these practices create, the political corruption they encourage, the entertainment industry they foster, all of it converges to tell you that this is the only reality, and the attempt to look elsewhere seems to be met with the equivalent of the ending of Orwell's Nineteen Eighty-Four.

But there is a true solution. This page sums it up best:

Complementary Photos for Quine's "Desert Landscapes"

Wyman’s overpopulated universe is in many ways unlovely. It offends the aesthetic sense of us who have a taste for desert landscapes, but this is not the worst of it. Wyman’s slum of possibles is a breeding ground for disorderly elements. Take, for instance, the possible fat man in that doorway; and, again, the possible bald man in that doorway. Are they the same possible man, or two possible men? How do we decide? How many possible men are there in that doorway? Are there more possible thin ones than fat ones? How many of them are alike? Or would their being alike make them one? Are no two possible things alike? Is this the same as saying that it is impossible for two things to be alike? Or, finally, is the concept of identity simply inapplicable to unactualized possibles? But what sense can be found in talking of entities which cannot meaningfully be said to be identical with themselves and distinct from one another? These elements are well-nigh incorrigible. By a Fregean therapy of individual concepts, some effort might be made at rehabilitation; but I feel we’d do better simply to clear Wyman’s slum and be done with it.
Source: Willard Van Orman Quine, "On What There Is," Review of Metaphysics 2 (1948/1949): 23-24 [total: 21-38], accessed October 17, 2013,

I came across a photo project that complements Quine's taste for "desert landscapes" very well. It's called "Somewhere in the Middle of Nowhere"; you can find it here:

A nice little article on it is here:

Wednesday, October 16, 2013

The Psychological Mechanics of Personal Development

The following is an excerpt of a letter that my doctor sent to me. The original context is doing one of the Tai Chi forms, but I found that his words can apply to almost any other experience and process of development. I don't think he has ever read Bernard Lonergan, but his observations remind me of Lonergan's work.
As for instructor mistakes, at your level of exploration, they are of minor importance since the first agenda is to "find your body". That is, to develop an awareness of your physical presence in more than normal proportion and sensitivity. Although one could logically assume that incorrect instruction might inhibit proper development, the challenge of doing "the Form" is in and of itself part of the developmental process. Even if not what an advanced practitioner would do, your present study is still proportionately appropriate for your level of development. [...] 
After many years of study it is clear that every small advance in physical awareness is followed by a transitional period of adaptation. That is, the "homunculus" must first be coerced into accepting physical change and remodel itself to adapt to your demand for advancement in your practice. Be aware that the "homunculus" will resist any change that you attempt to develop. Consequently, only your desire and will to advance can drive your body to a greater awareness and ultimately result in significant advancement in development. 
Over the years I have noted in myself and my students that every quantum leap in development is accompanied by a confusing feeling that one has not advanced or changed at all. This frustratingly common confusion is the result of actual change! What appears to be the lack of advancement is actually the experiential reality of being in a metaphorically "new body" which must then again advance to a higher level of development in order to proceed to the next quantum level. When one realizes a new physical awareness you are essentially "starting from scratch" again. That is to say that the new physical and kinetic being that you have become perceives of itself as a novice body and transports you experientially back to your former sense of consciousness. Fortunately this illusion is of little consequence as the more rapid your development, the more often you have this "deja-vu" experience. In fact I would argue that this is a paradoxically good sign that you are progressing in the right direction. Those students who have the experience of having achieved a semblance of mastery usually have become quite static which is what creates the illusion of progress.

Tuesday, October 15, 2013

Video Games, Escapism, and Leisure

But the problem is that gamer [sic] can easily fall into the belief that he is partaking in something greater, for he is ‘accomplishing’ things and producing something (albeit something more-or-less wholly limited by the structures built into the game). Thus a gamer achieves ‘success’ physiologically while not leaving any external success. If one is unreflective he is in grave danger of being overtaken by this illusory success, of lauding himself for something that is not. 
The end of gaming is then this pseudo-success, this creation of a character, a ranking, a profile of achievements. And this telos is unending, but also, essentially, unreal. The gamer is not fundamentally involved in becoming more human, nor in helping others become more human (it may be that one does become more human but that is almost certainly incidental to the game itself). Insofar as the gamer is using games for relaxation and legitimate leisure they are not problematic. But there is always a danger that the illusion of production will exert an unnatural influence on the gamer, resulting in a hobby that is fundamentally dishonest.
Source: gentlemantheologian, "The End of Gaming," A Gentleman Theologian Blog, April 5, 2013, accessed October 15, 2013,

The author brings up a point that I've noticed in my self-reflection as well as the reflections of other gamers or ex-gamers, namely, the escapist function of the video game. This function is not limited to the video game; in fact, any reality, whether mind-independent or dependent, can take on a quality of unreality (ens rationis in the language of John of St. Thomas) through our own construction of a system of signifiers—e.g. "nature" becomes the "Mother Goddess" that I must defend by sitting in a tree in order to prevent this construction company from going forward with this building project; the "Mother Goddess" then becomes the cyclical victim of the cosmos, which I may partake in through ritualistic actions in the form of magic(k), and so forth.

And people are not oblivious to this escapist function nor are they silent on it. In fact, some deliberately seek escapism to avoid the brutality of reality. For example:
I used [this video game in order to] compensate the pressure and bullying that I had to deal with back [in high school], so no. But I'm happy other people were able to grow up like this.
Source: FutureJirachi, comment, [approx.] September 24, 2013, on "Pokemon Silver/Gold/Crystal - Ecruteak City/Cianwood City," YouTube, February 22, 2008, accessed October 15, 2013,

But, interestingly, this comment is in response to this comment, which is also an expression of escapism:
Such beautiful nostalgia. Remember being a kid and this being just the best thing in the world to you? Nothing else mattered. There were no troubles, there were no worries, there were no responsibilities. Just being a kid, having fun with your Gameboy and Pokemon :)
Source: ilikemetroid, comment, [approx.] August 15, 2013, on "Pokemon Silver/Gold/Crystal - Ecruteak City/Cianwood City," YouTube, February 22, 2008, accessed October 15, 2013,

Similar comments are endless; one need only go to any comparable video on a cherished video game to find them.

But what is interesting is the contrast between the two comments, a contrast of circumstance. The first comment reveals an escapism from bullying and pain that was immediate to the person's experience. The second comment is reflexive and based in hindsight, i.e., given my current "troubles, worries, responsibilities," this game was the "best thing in the world." As a child, the second commenter didn't seem to realize what he was escaping from even while in the act of escape. The first commenter therefore calls out the illusion of the second commenter's carefree childhood by reference to her own pain-plagued childhood, saying, "so no [things did matter, and I wanted to get away from them]."

But for the second commenter, who had a carefree childhood, the carefree quality wasn't an illusion but a reality, and the video game complemented it. It is only by way of comparison that the victim of bullying could characterize the other commenter's reflection as based on an illusion.

But both comments fall under the same category: reflection on escapism. For some it is deliberate; for others, not so. Some realize what is going on as it is; others realize it in hindsight. The question we could turn to then is: does such escapism make us human, and in what ways might it do so?

The virtue of playfulness (eutrapelia) and the necessity of leisure carry a long tradition in Catholic thought. That tradition could in fact be summarized in a witty saying: "We don't talk about politics and religion in polite company. That's why God invented football."

Anyway, I simply intend to bring out some of these points for consideration here. What is the line between playfulness in its truest sense and escapism? Is escapism itself a good thing, and under what conditions? What are we escaping from, individually and collectively?

Sunday, October 13, 2013

Hipster Culture, Consumerism, "Speaking" Photography

Interestingly, the subject’s consumption further supports the case that hipster photography is often appropriated to advertise a product thus further provoking more consumption of perhaps a different product. Although hipsters depicts [sic] a carefree world, in general they carefully pick or curate the consumable objects they wish to be surrounded by. [...]

The emergence of hipster photography via Instagram, Tumblr and numerous other platforms points to a crucial characteristic that must be considered in this context. One of the key elements in hipster photography is the fact that it is shared with others. These are not private photographs that are personal keepsakes hidden on the bottom of a shoebox. They are loud images that say ‘look at me’. It is not enough to just photograph something, but that photograph must be shared in order to justify its existence and indeed the existence of the person taking the photograph. In short, hipster photography is an existential exercise, a performance, even a ritual that marks out a tiny territory – an identity – in a world filled with images.


Source: admin [Marco Bohr], "What Is Hipster Photography?," Visual Culture Blog, October 11, 2013, accessed October 13, 2013,

Wednesday, October 9, 2013

Charles Taylor on the Three Consequences of Representational Epistemology

[p. 428] Husserl asks in the first meditation whether the “Trostlosigkeit” [“despair”; “hopelessness”] of our present philosophical predicament doesn’t spring from our having abandoned Descartes’s original “spirit of radical philosophical self-responsibility.” And he continues:

Should the supposedly exaggerated demand for a finally possible [and] disengaged philosophy of presuppositionlessness [or impartiality] not, on the contrary, belong rather to a philosophy that in the deepest sense shapes itself in real autonomy out of finally self-produced evidences and, so, is thereby absolutely self-responsible? (Cartesianische Meditationen (The Hague, 1950), 47)

This ideal of “self-responsibility” is foundational to modern culture. It emerges not only in our picture of the growth of modern science as the fruit of the heroism of the great scientist […] Copernicus, Galileo (he wobbled a bit before the Holy Office, but who can blame him?), Darwin, Freud. It is also closely linked to the modern ideal of freedom as self-autonomy, as the passage from Husserl implies. To be free in our modern sense is to be self-responsible, to rely on one’s own judgment, to find one’s purpose in oneself.

And so the epistemological tradition is also intricated in a certain notion of freedom, and the dignity attaching to us in virtue of this [autonomy]. The theory of knowledge partly draws its strength from this connection. But also reciprocally, the ideal of freedom has drawn strength from its sensed connection with the construal of knowledge seemingly favored by modern science. From this point of view it has been fateful that this notion of freedom has been interpreted as involving certain key theses about the nature of the human agent; we might call them anthropological beliefs. [... T]he three connected notions that I would like to mention here are in fact historically closely connected with the epistemological construal.

The first is the picture of the subject as ideally disengaged, that is, as free and rational to the extent that he has fully distinguished himself from his natural and social worlds, so that his identity is no longer to be defined in terms of what lies outside him in these worlds. The second, which flows from this, is a punctual view of the self, ideally ready qua free and rational to treat these worlds—and even some of the features of his own character—instrumentally, as subject to change and reordering in order the better to secure the welfare of himself and other like subjects. The third is the social consequence of the first two: an atomistic construal of society as constituted by, or ultimately to be explained in terms of, individual purposes.

The first notion emerges originally in classical dualism, where the subject withdraws even from his own body, which he is able to look on as an object [.... p. 429] The second originates in the ideals of the government and reform of the self that have such an important place in the seventeenth century [... and] it continues today in the tremendous force that instrumental reason and engineering models have in our social policy, medicine, psychiatry, politics, and so on. The third first takes shape in seventeenth-century social contract theories [but continues] in many of the assumptions of contemporary liberalism and mainstream social science. [...]

To challenge these is sooner or later to run up against the force of this tradition, which stands with them in a complex relation of mutual support. [...]


Source: Charles Taylor, "Overcoming Epistemology," in After Philosophy: End or Transformation, ed. Kenneth Baynes, James Bohman, and Thomas McCarthy (Cambridge, MA: MIT Press, 1987), in Twentieth-Century Philosophy, ed. Forrest E. Baird and Walter Kaufmann (Upper Saddle River, NJ: Prentice Hall, 2003), 428-429.

Charles Taylor on Representational Knowledge

[p. 424] In some circles it seems to be rapidly becoming a new orthodoxy that the whole enterprise from Descartes, through Locke and Kant, and pursued by various nineteenth- and twentieth-century succession movements, was a mistake. Within this new agreement, however, what is becoming less and less clear is what exactly it means to overcome the epistemological standpoint [….]

Rorty’s book seems to offer a clear and plausible answer. The heart of the old epistemology was the belief in a foundational enterprise (Princeton, 1979:132). What the positive sciences needed to complete them, on this view, was a rigorous discipline that could check the credentials of all truth claims. An alleged science could only be valid if its findings met this test; otherwise it rested on sand. Epistemology would ultimately make clear just what made knowledge claims valid, and what ultimate degree of validity they could lay claim to. (And, of course, one could come up with a rather pessimistic, skeptical answer to the latter question. […])

In practice, of course, epistemologists took their cue from what they identified as the successful sciences of their day, all the way from Descartes’s infatuation with mathematics to contemporary vogue for reduction to physics. But the actual foundational science was not supposed itself to be dependent on any of the empirical sciences, and this obviously on pain of a circularity that would sacrifice its foundational character. Arguments about the source of valid knowledge claims were not supposed to be empirical.

If we follow this description, then it is clear what overcoming epistemology has to mean. It will mean abandoning foundationalism. On this view, Quine would figure [p. 425] among the prominent leaders of this new philosophical turn, since he proposes to “naturalize” epistemology, that is, deprive it of its a priori status and consider it as one science among others, one of many mutually interacting departments of our picture of the world. […]

But there is a wider conception of the epistemological tradition [….] If I had to sum up this understanding in a single formula, it would be that knowledge is to be seen as correct representation of an independent reality. In its original form it saw knowledge as the inner depiction of an outer reality.[1]

The reason why some thinkers prefer to focus on this interpretation, rather than merely on the foundationalist ambitions that are ultimately (as Quine has shown) detachable from it, is that it is bound up with very influential and often not fully articulated notions about science and about the nature of human agency. […]

The link between this representational conception and the new, mechanistic science of the seventeenth century is, in fact, twofold. On one side, the mechanization of the world picture undermined the previously dominant understanding of knowledge and thus paved the way for the modern view. The most important traditional view was that of Aristotle, according to which when we come to know something, the mind <nous> becomes one with the object of thought (Cf., e.g., De Anima III, 430a20, also 431a1 and 431b20-23). Of course, this is not to say that they become materially the same thing; rather, the idea is that they are informed by the same <eidos> (Cf., e.g., De Anima III, 430a9 and 431b32). Here was a conception quite different from the representational model, even though some of the things Aristotle said could be construed as supporting this latter. The basic bent of Aristotle’s model could much better be described as participational: being informed by the same <eidos>, the mind participated in the being of the known object, rather than simply depicting it.

But this theory totally depends on the philosophy of forms. Once one no longer explains the way things are in terms of the species that inform them, this conception of knowledge is untenable and rapidly becomes close to unintelligible. We have great difficulty in understanding it today. The representational view can easily then appear as the only available alternative.

[… p. 426] If we see [perception] as another process in a mechanistic universe, we cannot but construe it as involving as a crucial component the passive reception of impressions from the external world. Knowledge then hangs on a certain relation holding between what is “out there” and certain inner states that this external reality causes in us. This construal, valid for Locke, applies just as much to the latest AI-inspired models of thinking. It is one of the mainsprings of the epistemological tradition.

The epistemological construal is, then, an understanding of knowledge that fits well with modern mechanistic science. This is one of its great strengths, and certainly this connection contributes to the present vogue of computer-based models of the mind. […] It is in fact heavily overdetermined. […]

[According to Descartes] if the object of my musings happens to coincide with real events in the world, this doesn’t give me knowledge of them. This congruence has to come about through a reliable method, generating well-founded confidence. Science requires certainty, and this can only be based on that undeniable clarity which Descartes called évidence. “Every science is a certain and evident knowledge,” runs the opening sentence of the second of the Rules for the Direction of the Mind.

Now certainty is something that the mind has to generate for itself. It requires a reflexive turn. […] The correct issue of science, that is, of certainty, can be posed—the issue of the correspondence of idea to reality, which Descartes raises and then disposes of through the supposition of the malin genie [evil genius] and the proof of his negation, the veracious God.

The confidence that underlies this whole operation is that certainty is something the thinker can generate for himself, by ordering his thoughts correctly—according to clear and distinct connections. […] The very fact of reflexive clarity is bound to improve our epistemic position, as long as knowledge is understood representationally. […]

[p. 427] Descartes is thus the originator of the modern notion that certainty is the child of reflexive clarity [….]

There is still a strong draw toward distinguishing and mapping the formal operations of our thinking. In certain circles it would seem that an almost boundless confidence is placed in the defining of formal relations as a way of achieving clarity and certainty about our thinking, be it in the (mis)application of rational choice theory to ethical problems or in the great popularity of computer models of the mind [….]

The plausibility of the computer as a model of thinking comes partly from the fact that it is a machine, hence living “proof” that materialism can accommodate explanations in terms of intelligent performance; but partly too from the widespread faith that our intelligent performances are ultimately to be understood in terms of formal operations. The computer, it can be said, is a “syntactic engine.” […] The most perspicuous critics of the runaway enthusiasm with the computer model, such as Hubert Dreyfus, tirelessly point out how implausible it is to understand certain of our intelligent performances in terms of a formal calculus, including our most common everyday ones, such as making our way around our rooms, streets, and gardens, picking up and manipulating the objects we use, and so on. But the great difficulties that computer simulations have encountered in this area don’t seem to have dimmed the enthusiasm of real believers in this model. It is as though they had been vouchsafed some certain revelation a priori that it must all be done by formal calculi. Now this “revelation,” I submit, [p. 428] comes from the depths of our modern culture and the epistemological model that is anchored in it, whose strength is based not just on its affinity to mechanistic science but also on its congruence to the powerful ideal of reflexive, self-given certainty.


1. Cf. Descartes’s statement in his letter to Gibieuf of 19 January 1642, where he declares himself “assured that I cannot have any knowledge of what is outside me except by the mediation of the ideas that I have had in me.” […] The notion that the modern epistemological tradition is basically dominated by this understanding of representation was pioneered by Heidegger [….]


Source: Charles Taylor, "Overcoming Epistemology," in After Philosophy: End or Transformation, ed. Kenneth Baynes, James Bohman, and Thomas McCarthy (Cambridge, MA: MIT Press, 1987), in Twentieth-Century Philosophy, ed. Forrest E. Baird and Walter Kaufmann (Upper Saddle River, NJ: Prentice Hall, 2003), 424-428.

Tuesday, October 8, 2013

John Deely on the Epistemology/Ontology Distinction

The famous problema pontis, the problem of building a bridge from what is in our individual mind to what is outside of our mind (which of course includes the mind of other humans), on the terms of modern philosophy, proves insoluble. The problema indeed is that there is no pons possible! In Bertrand Russell's famous summary,[1] "we can witness or observe what goes on in our heads," indeed, but "we cannot witness or observe anything else at all." 
That is where what the moderns came to call "epistemology" leads.[2] Of course the conclusion—solipsism, the isolation of every self within itself—is unacceptable to "common sense" and incompatible with common life in the every day. [...]
Behind the evident terminological diversity you will find underlying not a dime's worth of difference among any of the modern treatments of epistemology. Absent a placing of the understanding of the singularity of relation front and center in the treatment of knowledge, there is no way out of the modern dilemma, as our redefinition of medieval philosophy will show; for only the singularity of relation as circumstantially transcending the difference between ens reale and ens rationis, and thus also transcending in the contrast of a relation's fundament to the relation's terminus the distinction between inner and outer, do we find the basis for the prior possibility of the action of signs enabling all knowledge, brute or "rational", thus including the human awareness of things at once in their difference in principle and partial coincidence in fact with objects.
 1. [ftnt. 7] Bertrand Russell, My Philosophical Development (New York: Simon and Schuster, 1959), 26; second italic added.
2. [ftnt. 8] The division of philosophical study into "epistemology" as the study of knowledge and "ontology" as the study of being seems harmless enough, a merely descriptive distinction of areas of study. But the modern "epistemology", coined perhaps in Ferrier 1854, turns out to be a study which shows, explicitly in Kant but implicitly already in Descartes and Locke with the reduction of objectivity to mental self-representations, that in fact human knowledge cannot go beyond itself to grasp being in its own subjective and intersubjective structures—"as though", Maritain remarks of this modern implication (Jacques Maritain, Distinguish to Unite, or The Degrees of Knowledge, trans. Gerald B. Phelan (New York: Scribner, 1959), 66), "a philosophy of being could not also be a philosophy of mind".   
In fact, [...] with Aquinas and others, animal awareness in sensation does not begin with mental imagery, but originates rather in relations (the relations [proper] to common sensibles, in fact) that do not neatly fit the contrast between mind-independent and mind-dependent being, but descriptively and analytically antecede that division in providing the common root of objectivity as it will subsequently branch within perception into entangled mind-dependent and mind-independent aspects.   
For this reason, I prefer Maritain's description of the study of knowledge as "noetic" rather than "epistemology", in order to escape from the start the implicit consequent in the modern coinage of "epistemology" as a technical term within philosophy.
Source: John Deely, Medieval Philosophy Redefined (Scranton, PA: University of Scranton Press, 2010), xxvi-xxvii.

Brief History of the Fall of Scholasticism and Rise of Science

The 17th-century crash and burn of Scholasticism—the tradition of commentary on Aristotle (in philosophy) and Lombard (along with the Bible, in theology) begun in the late 1100s—resulted from accumulated abuses on the part of authorities civil and religious, abuses in which the scholastic “establishment” within the universities was all too often complicit. What discredited the Scholastics in the end was the actual demonstration by men we now call “scientists” of basic truths about the universe that scholastics [sic] denied—while encouraging church and state officials to take actions of repression and thought-control. Not until 1757 did the Roman Church lift its 1616 prohibition of books dealing with Copernicus’ view that the earth was not the center of the physical universe, and not until 1835 did an edition of the Index of Forbidden Books appear which no longer listed as prohibited the works of Copernicus, Galileo, and Kepler!
However understandable, the turning away from scholasticism in philosophy turned out to be a matter of throwing out the baby with the bathwater; for thinkers of the time were so taken with the experimental and mathematical techniques that had shown the earth to move and the stars to be other suns that they came to believe that the whole edifice of human knowledge, without remainder, could be rebuilt on the basis of science in this modern, empirical and mathematical sense. The ascendancy of this belief defined the historical epoch that has come to be called the Enlightenment: the belief that philosophers might ask questions, but only scientists could actually give answers. If you think that this Enlightenment attitude is a thing of the past, you are mistaken. Yet it has increasingly come to be recognized that if the whole of the knowledge we acquire before becoming scientists had no independent validity, then science itself would have no validity.
The first major thinker seriously to recognize this situation, or at least most completely to do so, was Charles Sanders Peirce. Borrowing a terminology coined by Jeremy Bentham, Peirce pointed to the difference between critical knowledge based on common experience, or “cenoscopy”, presupposed to the validity of the specialized foci of modern experimental and mathematical science, in contrast with the knowledge that only experimentation and mathematization of results can produce, or “ideoscopy”, which is science in the modern sense. Until now, philosophers generally, in desperation, have tried appealing to “common sense” as the basis upon which philosophy has a legitimacy of its own prior to and independent of science. But so discredited has the notion of “common sense” become in intellectual culture that appeal to it has little chance of persuading a wide audience. What is needed, rather, is the recognition that, while both science and “common sense” depend upon “the total everyday experience of many generations of multitudinous populations”, yet “such experience is worthless for distinctively scientific purposes”.[1]
The “distinctively scientific purposes” include, however, both exploration of human experience that requires experimentation to advance knowledge and the more general “scientific purpose” of evaluating and exposing in critically controlled terms that overall framework of knowledge within and on the basis of which scientific research comes to be conducted in the first place. Articulation of the presupposed overall framework of knowledge, and of independent results attainable within it, also requires “science” (as critically controlled objectification), but not ideoscopic science: here is the domain proper to philosophy, cenoscopic science. It has a legitimacy of its own, and this is what the early moderns lost sight of in their enthusiasm for the then firmly established possibilities of ideoscopy. Moreover, the most basic of the cenoscopic lines of investigation proves to be precisely inquiry into the action of signs, “semiosis”, because it turns out that cenoscopy and ideoscopy alike depend on this action throughout for whatever knowledge they succeed in establishing. […]
Philosophy, then, as cenoscopic science, not only precedes ideoscopic science and provides its framework. Philosophy also, rightly understood, shows the inevitability of ideoscopic development in order for human thought to reach maturation—just what the authorities, Church and Civil, in the closing Latin centuries failed to understand. Exactly as Hannam says in the subtitle of his book, “the medieval world laid the foundations of modern science”; but the Latins achieved this feat, as it were, indirectly, mainly as a consequence or by-product of their exploring the dimensions and depths of cenoscopic knowledge out of which ideoscopic inquiries inevitably arise. [...]
How are we to understand the medieval notion of "science" in relation to the modern notion of science? Well, to begin with, we are finally in a position to say right off that the Enlightenment notion that modern science simply displaces and replaces medieval science cannot possibly be the case. For science in the modern sense in fact presupposes the development of "science" in the ancient and medieval sense, in just the way that adult knowledge depends upon and develops out of the knowledge and experience of teenagers and, for that matter, of infants! If there were no validity to the knowledge that human beings begin to acquire in their earliest years, prior to and independent of any use of instruments extending the senses and mathematics systematizing the results of experiments and observations, then scientific knowledge itself as later acquired could have no validity, for no chain can be stronger than its weakest link.
 1. Charles Sanders Peirce, “Pragmaticism, Prag.,” CP 5.522.
Source: John Deely, Medieval Philosophy Redefined (Scranton, PA: University of Scranton Press, 2010), vii-viii, xxviii.