Does the Internet make us more Intelligent but less Human?
by Dr. Emanuel Paparella
2008-07-28 08:11:21
For more than a decade now we educated people of our global village have been spending a lot of time online, searching and surfing and sometimes adding to the great databases of the Internet. The Web has been a godsend to writers. Research that once required days in the stacks or periodical rooms of libraries can now be done in minutes. This is definitely progress, we are told by those who believe, with Hegel’s dialectics, that what comes at the end of an historical process is always better than what preceded it. If that is indeed so, then the latest statistics about EU literacy skills among young people cannot but give us pause and make us scratch our heads in perplexity.

Here are the statistics fresh from the European Commission:

In 2006, almost a quarter of 15-year-olds (24.1 percent) qualified as "low performers in reading" - up from 21.3 percent in 2000. Boys (30.4 percent) scored almost twice as badly as girls (17.6 percent). Romania and Bulgaria lie at the bottom of the chart, with over 50 percent of 15-year-olds in both countries performing poorly in reading and understanding a written text. The two countries are followed by Greece, with 27.7 percent performing poorly, Italy (26.4 percent) and Spain (25.7 percent). At the top of the chart, in Finland (4.8 percent), Ireland (12.1) and Estonia (13.6), teenagers have the best reading ability in the EU. This may be good news of a sort for Finland, but bad news for just about every other nation in the EU. Indeed, having one eye in the land of the blind is not exactly something to be too sanguine about.

As part of attempts to turn the EU into a knowledge-based economy, governments agreed that by 2010 at least 85 percent of 22-year-olds should have completed upper secondary education. In 2007, the EU average stood at slightly above 78 percent. Malta and Portugal performed the worst, while the Czech Republic, Poland and Slovenia came out best. Malta and Portugal are also the black sheep when it comes to early school leavers, with rates three times the 2010 threshold of ten percent. Poland and Slovakia, on the other hand, had the lowest proportion of dropouts in 2007.

Those are the facts, which do nothing but reveal the symptoms of the malaise, and it is a malaise found on both sides of the Atlantic in the Western world. What may be the causes? Let’s speculate a bit. For me, as for others, the Net is becoming a universal medium, the conduit for most of the information that flows through our eyes and ears and into our mind. The advantages of having immediate access to such an incredibly rich store of information are many, and they’ve been widely described and duly applauded. But that boon comes at a price. As the media theorist Marshall McLuhan pointed out in the 1960s (a point already discussed in another article), media are not just passive channels of information. They supply the stuff of thought, but they also shape the process of thought. And what the Net seems to be doing is chipping away at our capacity for concentration and contemplation. Our mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once we were scuba divers in the sea of words. Now we zip along the surface like a guy on a Jet Ski.

The human brain is almost infinitely malleable. People used to think that our mental meshwork, the dense connections formed among the 100 billion or so neurons inside our skulls, was largely fixed by the time we reached adulthood. But brain researchers have discovered that that’s not the case. James Olds, a professor of neuroscience who directs the Krasnow Institute for Advanced Study at George Mason University, says that even the adult mind “is very plastic.” Nerve cells routinely break old connections and form new ones. “The brain,” according to Olds, “has the ability to reprogram itself on the fly, altering the way it functions.”

So, the question that still needs to be fully answered is this: how does Internet use affect cognition? A recently published study of online research habits, conducted by scholars from University College London, suggests that we may well be in the midst of a sea change in the way we read and think. As part of the five-year research program, the scholars examined computer logs documenting the behavior of visitors to two popular research sites, one operated by the British Library and one by a U.K. educational consortium, that provide access to journal articles, e-books, and other sources of written information. They found that people using the sites exhibited “a form of skimming activity,” hopping from one source to another and rarely returning to any source they’d already visited. They typically read no more than one or two pages of an article or book before they would “bounce” out to another site. Sometimes they’d save a long article, but there’s no evidence that they ever went back and actually read it. It is clear that users are not reading online in the traditional sense; indeed there are signs that new forms of “reading” are emerging as users “power browse” horizontally through titles, contents pages and abstracts, going for quick wins. It almost seems that they go online to avoid reading in the traditional sense.

Thanks to the ubiquity of text on the Internet, not to mention the popularity of text-messaging on cell phones, we may well be reading more today than we did in the 1970s or 1980s, when television was our medium of choice. But it’s a different kind of reading, and behind it lies a different kind of thinking—perhaps even a new sense of the self. “We are not only what we read,” says Maryanne Wolf, a developmental psychologist at Tufts University and the author of Proust and the Squid: The Story and Science of the Reading Brain. “We are how we read.” Wolf worries that the style of reading promoted by the Net, a style that puts “efficiency” and “immediacy” above all else, may be weakening our capacity for the kind of deep reading that emerged when an earlier technology, the printing press, made long and complex works of prose commonplace. When we read online, she says, we tend to become “mere decoders of information.” Our ability to interpret text, to make the rich mental connections that form when we read deeply and without distraction, remains largely disengaged.

Reading, explains Wolf, is not an instinctive skill for human beings. It’s not etched into our genes the way speech is. We have to teach our minds how to translate the symbolic characters we see into the language we understand. And the media or other technologies we use in learning and practicing the craft of reading play an important part in shaping the neural circuits inside our brains. Experiments demonstrate that readers of ideograms, such as the Chinese, develop a mental circuitry for reading that is very different from the circuitry found in those of us whose written language employs an alphabet. The variations extend across many regions of the brain, including those that govern such essential cognitive functions as memory and the interpretation of visual and auditory stimuli. We can expect as well that the circuits woven by our use of the Net will be different from those woven by our reading of books and other printed works.

Let us go back to the 19th century. Sometime in 1882, Friedrich Nietzsche bought a typewriter—a Malling-Hansen Writing Ball, to be precise. His vision was failing, and keeping his eyes focused on a page had become exhausting and painful, often bringing on crushing headaches. He had been forced to curtail his writing, and he feared that he would soon have to give it up. The typewriter rescued him, at least for a time. Once he had mastered touch-typing, he was able to write with his eyes closed, using only the tips of his fingers. Words could once again flow from his mind to the page. But the machine had a subtler effect on his work. One of Nietzsche’s friends, a composer, noticed a change in the style of his writing. His already terse prose had become even tighter, more telegraphic. “Perhaps you will through this instrument even take to a new idiom,” the friend wrote in a letter, noting that, in his own work, his “‘thoughts’ in music and language often depend on the quality of pen and paper.” “You are right,” Nietzsche replied, “our writing equipment takes part in the forming of our thoughts.” Under the sway of the machine, writes the German media scholar Friedrich A. Kittler, Nietzsche’s prose “changed from arguments to aphorisms, from thoughts to puns, from rhetoric to telegram style.”

As we use what the sociologist Daniel Bell has called our “intellectual technologies”—the tools that extend our mental rather than our physical capacities—we inevitably begin to take on the qualities of those technologies. The mechanical clock, which came into common use in the 14th century, provides a compelling example. In Technics and Civilization, the historian and cultural critic Lewis Mumford described how the clock “disassociated time from human events and helped create the belief in an independent world of mathematically measurable sequences.” The “abstract framework of divided time” became “the point of reference for both action and thought.”

As the late MIT computer scientist Joseph Weizenbaum observed in his 1976 book, Computer Power and Human Reason: From Judgment to Calculation, the conception of the world that emerged from the widespread use of timekeeping instruments “remains an impoverished version of the older one, for it rests on a rejection of those direct experiences that formed the basis for, and indeed constituted, the old reality.” In deciding when to eat, to work, to sleep, to rise, we stopped listening to our senses and started obeying the clock. The process of adapting to new intellectual technologies is reflected in the changing metaphors we use to explain ourselves to ourselves. When the mechanical clock arrived, people began thinking of their brains as operating “like clockwork.” Today, in the age of software, we have come to think of them as operating “like computers.” But the changes, neuroscience tells us, go much deeper than metaphor. Thanks to our brain’s plasticity, the adaptation occurs also at a biological level.

The Internet promises to have particularly far-reaching effects on cognition. In a paper published in 1936, the British mathematician Alan Turing proved that a digital computer, which at the time existed only as a theoretical machine, could be programmed to perform the function of any other information-processing device. And that’s what we’re seeing today. The Internet, an immeasurably powerful computing system, is subsuming most of our other intellectual technologies. It’s becoming our map and our clock, our printing press and our typewriter, our calculator and our telephone, and our radio and TV.  When the Net absorbs a medium, that medium is re-created in the Net’s image. It injects the medium’s content with hyperlinks, blinking ads, and other digital gewgaws, and it surrounds the content with the content of all the other media it has absorbed. A new e-mail message, for instance, may announce its arrival as we’re glancing over the latest headlines at a newspaper’s site. The result is to scatter our attention and diffuse our concentration.

The Net’s influence doesn’t end at the edges of a computer screen, either. As people’s minds become attuned to the crazy quilt of Internet media, traditional media have to adapt to the audience’s new expectations. Television programs add text crawls and pop-up ads, and magazines and newspapers shorten their articles, introduce capsule summaries, and crowd their pages with easy-to-browse info-snippets. Never has a communications system played so many roles in our lives—or exerted such broad influence over our thoughts—as the Internet does today. Yet, for all that’s been written about the Net, there’s been little consideration of how, exactly, it’s reprogramming us. The Net’s intellectual ethic remains obscure.

About the same time that Nietzsche started using his typewriter, an earnest young man named Frederick Winslow Taylor carried a stopwatch into the Midvale Steel plant in Philadelphia and began a historic series of experiments aimed at improving the efficiency of the plant’s machinists. With the approval of Midvale’s owners, he recruited a group of factory hands, set them to work on various metalworking machines, and recorded and timed their every movement as well as the operations of the machines. By breaking down every job into a sequence of small, discrete steps and then testing different ways of performing each one, Taylor created a set of precise instructions—an “algorithm,” we might say today—for how each worker should work. Midvale’s employees grumbled about the strict new regime, claiming that it turned them into little more than automatons (Charlie Chaplin would later dedicate a surrealistic movie to the phenomenon), but it could not be denied that the factory’s productivity soared.

More than a hundred years after the invention of the steam engine, the Industrial Revolution had at last found its philosophy and its philosopher. Taylor’s tight industrial choreography—his “system,” as he liked to call it—was embraced by manufacturers throughout the country and, in time, around the world. Seeking maximum speed, maximum efficiency, and maximum output, factory owners used time-and-motion studies to organize their work and configure the jobs of their workers. The goal, as Taylor defined it in his celebrated 1911 treatise, The Principles of Scientific Management, was to identify and adopt, for every job, the “one best method” of work and thereby to effect “the gradual substitution of science for rule of thumb throughout the mechanic arts.” This was the advent of the world of “efficient ordering” envisioned by the likes of Bacon and Descartes. Once his system was applied to all acts of manual labor, Taylor assured his followers, it would bring about a restructuring not only of industry but of society, creating a utopia of perfect efficiency. “In the past the man has been first,” he declared; “in the future the system must be first.”

Taylor’s system is still very much with us; it remains the ethic of industrial manufacturing. And now, thanks to the growing power that computer engineers and software coders wield over our intellectual lives, Taylor’s ethic is beginning to govern the realm of the mind as well. The Internet is a machine designed for the efficient and automated collection, transmission, and manipulation of information, and its legions of programmers are intent on finding the “one best method”—the perfect algorithm—to carry out every mental movement of what we’ve come to describe as “knowledge work.”

Google’s headquarters, in Mountain View, California—the Googleplex—is the Internet’s high church, and the religion practiced inside its walls is Taylorism. Google, says its chief executive, Eric Schmidt, is “a company that’s founded around the science of measurement,” and it is striving to “systematize everything” it does. Drawing on the terabytes of behavioral data it collects through its search engine and other sites, it carries out thousands of experiments a day, according to the Harvard Business Review, and it uses the results to refine the algorithms that increasingly control how people find information and extract meaning from it. What Taylor did for the work of the hand, Google is doing for the work of the mind. The company has declared that its mission is “to organize the world’s information and make it universally accessible and useful.” It seeks to develop “the perfect search engine,” which it defines as something that “understands exactly what you mean and gives you back exactly what you want.” In Google’s view, information is a kind of commodity, a utilitarian resource that can be mined and processed with industrial efficiency. The more pieces of information we can “access” and the faster we can extract their gist, the more productive we become as thinkers.

Where does it end? Sergey Brin and Larry Page, the gifted young men who founded Google while pursuing doctoral degrees in computer science at Stanford, speak frequently of their desire to turn their search engine into an artificial intelligence, a HAL-like machine that might be connected directly to our brains. “The ultimate search engine is something as smart as people—or smarter,” Page said in a speech a few years back. “For us, working on search is a way to work on artificial intelligence.” In a 2004 interview with Newsweek, Brin said, “Certainly if you had all the world’s information directly attached to your brain, or an artificial brain that was smarter than your brain, you’d be better off.” Last year, Page told a convention of scientists that Google is “really trying to build artificial intelligence and to do it on a large scale.”

The largely unexamined but distressing assumption is that we’d all “be better off” if our brains were supplemented, or even replaced, by an artificial intelligence. It suggests a belief that intelligence is the output of a mechanical process, a series of discrete steps that can be isolated, measured, and optimized. In Google’s world, the world we enter when we go online, there’s little place for the fuzziness of contemplation. Ambiguity is not an opening for insight but a bug to be fixed. The human brain is just an outdated computer that needs a faster processor and a bigger hard drive. The idea that our minds should operate as high-speed data-processing machines is not only built into the workings of the Internet, it is the network’s reigning business model as well. The faster we surf across the Web—the more links we click and pages we view—the more opportunities Google and other companies gain to collect information about us and to feed us advertisements. Most of the proprietors of the commercial Internet have a financial stake in collecting the crumbs of data we leave behind as we flit from link to link—the more crumbs, the better. The last thing these companies want is to encourage leisurely reading or slow, concentrated thought. It’s in their economic interest to drive us to distraction.

It must be admitted though that there is another side of the coin here: just as there’s a tendency to glorify technological progress, there’s a countertendency to expect the worst of every new tool or machine. Going all the way back to the ancient Greeks, in Plato’s Phaedrus, Socrates bemoaned the development of writing. He feared that, as people came to rely on the written word as a substitute for the knowledge they used to carry inside their heads, they would, in the words of one of the dialogue’s characters, “cease to exercise their memory and become forgetful.” And because they would be able to “receive a quantity of information without proper instruction,” they would “be thought very knowledgeable when they are for the most part quite ignorant.” They would be “filled with the conceit of wisdom instead of real wisdom.” Socrates wasn’t wrong—the new technology did often have the effects he feared—but he was shortsighted. He couldn’t foresee the many ways that writing and reading would serve to spread information, spur fresh ideas, and expand human knowledge (if not exactly wisdom).

The arrival of Gutenberg’s printing press, in the 15th century, set off another round of worries. The Italian humanist Hieronimo Squarciafico worried that the easy availability of books would lead to intellectual laziness, making men “less studious” and weakening their minds. Others argued that cheaply printed books and broadsheets would undermine religious authority, demean the work of scholars and scribes, and spread sedition and debauchery. As New York University professor Clay Shirky notes, “Most of the arguments made against the printing press were correct, even prescient.” But, again, the doomsayers were unable to imagine the myriad blessings that the printed word would deliver.

So, yes, we ought to be skeptical of the skeptics. Perhaps those who dismiss critics of the Internet as Luddites or nostalgists will be proved correct after all, and from our hyperactive, data-stoked minds will spring a golden age of intellectual discovery and universal wisdom, but I wouldn’t bet on it. Then again, the Net isn’t the alphabet, and although it may replace the printing press, it produces something altogether different. The kind of deep reading that a sequence of printed pages promotes is valuable not just for the knowledge we acquire from the author’s words but for the intellectual vibrations those words set off within our own minds. In the quiet spaces opened up by the sustained, undistracted reading of a book, or by any other act of contemplation, for that matter, we make our own associations, draw our own inferences and analogies, foster our own ideas. Deep reading, as Maryanne Wolf argues, is indistinguishable from deep thinking. If we lose those quiet spaces, or fill them up with “content,” we will sacrifice something important not only in our selves but in our culture. In a recent essay, the playwright Richard Foreman eloquently described what’s at stake: “I come from a tradition of Western culture, in which the ideal (my ideal) was the complex, dense and ‘cathedral-like’ structure of the highly educated and articulate personality—a man or woman who carried inside themselves a personally constructed and unique version of the entire heritage of the West. [But now] I see within us all (myself included) the replacement of complex inner density with a new kind of self—evolving under the pressure of information overload and the technology of the ‘instantly available.’”

As we are drained of our “inner repertory of dense cultural inheritance,” Foreman concluded, we risk turning into “‘pancake people’—spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button.” In ethical terms, C.S. Lewis called them “men without chests.” We ought to be haunted by the final scene in 2001. What makes it so poignant, and so weird, is the computer’s emotional response to the disassembly of its mind: its despair as one circuit after another goes dark, its childlike pleading with the astronaut—“I can feel it. I can feel it. I’m afraid”—and its final reversion to what can only be called a state of innocence. HAL’s outpouring of feeling contrasts with the lack of emotion that characterizes the human figures in the film, who go about their business with an almost robotic efficiency. Their thoughts and actions feel scripted, as if they’re following the steps of an algorithm. In the world of 2001: A Space Odyssey, people have become so machinelike that the most human character turns out to be a machine. That’s the essence of Kubrick’s dark prophecy: as we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence. In another prophetic film, Terminator 2: Judgment Day, the mother at the end of the story is happy that her son has found a true friend, a computer that looks like a human being.

Do the above musings on artificial intelligence explain the troubling illiteracy of our young and not so young people nowadays? Perhaps not, but that idea of “pancake people” remains food for thought nonetheless.


Comments (17)

Sand, 2008-07-28 09:32:44
Although pancakes may seem food for thought, waffling seems to have no nourishing quality whatsoever.


Sand, 2008-07-28 09:50:40
If your statistics are any indication of the quality of your general perception of the current situation it seems your internet capabilities need a bit of polishing. According to https://www.cia.gov/library/publications/the-world-factbook/print/fi.html
Finnish literacy is at 100%.


Emanuel Paparella, 2008-07-28 14:50:22
No surprises in the above comments, which are true to form for Mr. Sand. Had he however bothered to actually read the article, before jumping on his horse for a customary slanderous attack with dubious intellectual tactics, he would have noticed that those statistics (culled verbatim, as clearly stated, from an official report of the EU Commission, see EU-Observer) were not about literacy per se (the ability to read at any level) but the ability to read at the appropriate level, wherein Finland commendably comes out on top with the least deficiency of all the EU countries, but still some deficiency. On the other hand, should the statistics be wrong, the appropriate and fair task for Mr. Sand is to take the matter up with the EU Commission and perhaps obtain a position in it as Grand Guardian of Finnish Literacy. Should the statistics be inaccurate, that would speak volumes in itself on the mathematical and statistical ability of the EU Commission and the entire continent, and would logically mean that the cognitive problem of the Internet is even worse than surmised and needs to be revisited.


Emanuel Paparella, 2008-07-28 15:22:44
P.S. By the way, to the editors: great complementary picture on the cover story. C. S. Lewis would approve enthusiastically. In his Abolition of Man he speaks of the "men without chests," all rationality and brain and little if any heart and wisdom.


Sand, 2008-07-28 15:31:22
If Mr. Paparella’s reading capabilities were up to snuff he would be aware that I did not supply any statistics of my own; I merely referred to a site which supplied statistics. I wonder why he incessantly creates pseudopositions with the pre-title of “Grand,” which makes me suspect he feels himself to be some sort of “Grand” something or other.

But the element of his submission which deserves much clarification is his use of the word “human” in the title. There is the definite implication there that to be more intelligent is somehow to be less human. Many of his submissions in the past have carried this odd implication, and many of his stated beliefs also indicate a rather strong worship of stupidity that I find perhaps complementary to his own condition, but not particularly attractive or worthwhile.

“Human” is quite an indefinite term to put forth, strangely, as being absolutely delightful. It is pandemic in humanity to mistreat spouses and offspring, to vandalize social structures, to act stupidly and viciously towards the environment and the welfare of helpless people. It is also human to act oppositely, to the benefit of society. And this has been true throughout history as well as today. They are both very human.

There is also the implication that machines are somehow not human. I have yet to see electric hand tools, bicycles, sewing machines, computers, microscopes or even automobiles spring spontaneously from the soil like mushrooms or blueberries. Humans are intricately involved in all these useful machines and their integration into society. No doubt they may cause problems in certain circumstances, but they are very human creations and make us no less human than wearing false teeth or eyeglasses.

I can understand and sympathize with problems arising from the proliferation of machines, but they are social problems. If there is something wrong with this, it does not help to characterize the problems with machines as not human. “Human” is not a useful term here.


Emanuel Paparella, 2008-07-28 16:27:08
http://www.metanexus.net/magazine/PastIssues/tabid/126/Default.aspx

Open the above link to Vol. 9, issue 3 (June 2008) of Global Spiral (of the Metanexus Institute) for an in-depth look at the issue of transhumanism that is a bit thicker than a pancake.

Also read the C.S. Lewis lectures on "The Abolition of Man." It may take some time, but it will certainly unburden the reader who may authentically be interested in the issue, rather than expressing an animus via argumentum ad hominem, of much ignorance and misconceptions in the matter.


Sand, 2008-07-28 16:47:37
It is, of course, a standard Paparella ploy when confronted with an unanswerable question to dispatch the questioner into a maze of vague literature in the hope the question will somehow go away.


Emanuel Paparella, 2008-07-28 18:34:33
Of course any perceptive reader knows full well that the standard ploy of Sand is that of rebutting any issue with which he disagrees with an argumentum ad hominem; if he does not like the message he ridicules the messenger; that way the issue does not have to be addressed; and in the process he makes a fool of himself.

On the issue of the connection of intelligence and ethical behavior, it actually begins with Socrates's dictum that "knowledge is virtue," by which the rationalist makes (see above comment by Sand) the egregious and fallacious judgment that knowledge is the equivalent of the good. History has more than disproven that fallacy. If the Wannsee Conference has taught us anything it is that intelligent men with Ph.D.s after their name are capable of rationalizing the greatest crimes and monstrosities ever committed, and that therefore knowledge, while useful for right and prudent action, does not necessarily guarantee ethical behavior. St. Paul may have had it more on target: I know the good but I see another law in my members which leads me to evil. Unless that propensity to evil is comprehended and acknowledged no amount of knowledge will ever defeat it.


John Lennon, 2008-07-28 19:17:49
All you need is love...


Sand, 2008-07-28 19:33:29
Strange and typical how your accusations of ad hominem are ad hominem. A clumsy way of supplying diversion away from the fact that you totally avoided the very basic issue I raised, namely: the term human in no way implies virtue.

It is interesting that there is a long tradition of nations at war denying human status to their opponents. During WWI Germans were described as inhuman Huns and falsely accused of chopping off children’s hands. During WWII there were real horrors on the side of the Axis countries, but the Japanese were buck-toothed “Japs” and traditionally depicted as monkey-like hairy midgets. In Vietnam the local people were “gooks” and treated as sub-human. When slavery of black people was extant in the USA they were given, for some purposes, fractional human status as “niggers” and by many not considered human at all. Thus this technique is extended by Paparella to include people who respect science and the search for knowledge. In this latest retort he openly associates ignorance with virtue, using the authority of one of his pet philosophers to give it acceptance. Again he reiterates that a few people with PhDs were the cause of the Holocaust horror, as if the rest of Germany were totally innocent, had no preparation by centuries of religious exhortation to vent their violence on the Jews, and could, as if by some magic command, be made to do anything. The Orwellian dictums that knowledge is bad, ignorance is good, stupidity is supreme seem firmly planted in Paparella’s very strange nervous system.

I have looked at his “men without chests” reference and discovered that his favored author C.S. Lewis has created an aesthetic addendum to Plato’s highly mistaken concept that absolutes exist outside of human mental constructions, floating in some cloud cuckoo land of perfect abstractions. Lewis put forth that something considered beautiful had beauty as an intrinsic part of its structure, a kind of Platonic aesthetic totally in contradiction to the obvious fact that styles of beauty vary greatly over different cultures and different times. I can understand that a zany mind like Paparella’s would get hooked on this total baloney.


Emanuel Paparella, 2008-07-28 20:36:48
It looks as if the voices have been visiting again and put words in my mouth. Don't believe them, they are liars. Thanks for the demonstration of a flattened pancake sort of artificial intelligence. No way can it now be denied that it exists. As for C.S. Lewis' comments on Plato's Natural Law, stay tuned and hold your horses for a while longer before charging with drawn sword and the shouting voices at the rear. No doubt the readers appreciate the Punch and Judy show. What did Fellini say? Bring in the clowns.


Sand, 2008-07-28 21:06:03
You have, as usual, not only totally avoided giving any cogent recognition to the pertinent questions I presented but continue to congratulate yourself for your abysmal stupidity. Such a blatant exhibition of disdain for intelligence and knowledge is positively freakish.


Emanuel Paparella, 2008-07-28 21:42:41
The Country of the Blind

Hard light bathed them - a whole nation of eyeless men,
Dark bipeds not aware how they were maimed. A long
Process, clearly, a slow curse,
Drained through centuries, left them thus.

At some transitional stage, then, a luckless few,
No doubt, must have had eyes after the up-to-date,
Normal type had achieved snug
Darkness, safe from the guns of heav'n;

Whose blind mouths would abuse words that belonged to their
Great-grandsires, unabashed, talking of light in some
Eunuch'd, etiolated,
Fungoid sense, as a symbol of

Abstract thoughts. If a man, one that had eyes, a poor
Misfit, spoke of the grey dawn or the stars or green-
Sloped sea waves, or admired how
Warm tints change in a lady's cheek,

None complained he had used words from an alien tongue,
None question'd. It was worse. All would agree. 'Of course,'
Came their answer. 'We've all felt
Just like that.' They were wrong. And he

Knew too much to be clear, could not explain. The words --
Sold, raped, flung to the dogs -- now could avail no more;
Hence silence. But the mouldwarps,
With glib confidence, easily

Showed how tricks of the phrase, sheer metaphors could set
Fools concocting a myth, taking the words for things.
Do you think this a far-fetched
Picture? Go then about among

Men now famous; attempt speech on the truths that once,
Opaque, carved in divine forms, irremovable,
Dear but dear as a mountain-
Mass, stood plain to the inward eye.

C. S. Lewis


Emanuel Paparella, 2008-07-28 22:10:34
Although I have this sense that I sit up there alone something less than two meters above my shoes where I peer out at the world through those two holes on either side of my nose, I must admit that I do have silent conversations with somebody else or perhaps with several somebody elses.

Now the show is ending and the dolls need mending.
The Punch and Judy show is never-ending.
Inside each one of us is a Punch and Judy,
In you, sir, you, ma'am - and in me, yours truly.
That's ... the Punch and Judy show goes on forever.


Emanuel Paparella, 2008-07-29 00:17:48
P.S. Within the Punch and Judy show, "the pertinent questions" presented are always questions to which one already knows the answer, intended not to search for the truth but to intimidate one's opponent, to win the debate and have the last word.


Sand, 2008-08-02 14:39:03
And who is attempting to have the last word?


Ramsay, 2008-10-27 19:51:33
This is blatant plagiarism of Nicholas Carr's "Is Google Making Us Stupid?" (http://www.theatlantic.com/doc/200807/google).
Write your own work or give credit where credit is due.

