Support: Public Netbase (Vienna)!

evacuate & flush

w e b l o g s
synthetic zero
corey c.
living art
guardian u.
zaa zaa furi
zen calm ink
ghost rocket
dev null
sensible erection
idea of the day
wood s lot
50 cups

c o n n e c t
teller (he's still alive!)
acid trip

w e b r i n g
- pitas! +
< BlogCanada >
< i'm twisted >

Tuesday, June 29, 2004

I still don't really understand why people want things that pretend to be other things. Here are some examples. Fire flickers in an appealing way, wood crumbles to ashes and spits. A rotating red bulb under semi-transparent plastic molded to look like logs is just dumb. Now maybe if it were proud to be a rotating red bulb under etc, that would be fine.

There's a nice word in Hayles' How We Became Posthuman: "Here I want to introduce another term from archaeological anthropology. A skeuomorph is a design feature that is no longer functional in itself but that refers back to a feature that was functional at an earlier time. The dashboard of my Toyota Camry, for example, is covered by vinyl molded to simulate stitching. The simulated stitching alludes back to a fabric that was in fact stitched, although the vinyl 'stitching' is formed by an injection mold. Skeuomorphs visibly testify to the social or psychological necessity for innovation to be tempered by replication. Like anachronisms, their pejorative first cousins, skeuomorphs are not unusual. On the contrary, they are so deeply characteristic of the evolution of concepts and artifacts that it takes a great deal of conscious effort to avoid them."

Some of the more interesting ideas I've heard about the formation of the universe/multiverse/whatever involve (a) evolution and (b) observation. In the evolution idea, every universe begins as a black hole in an earlier universe, with slight variance in the fundamental constants (alpha, for example). Universes that are more likely to produce black holes propagate, creating more and more black holes, just as biological evolution propagates a lifeform's genes. It's a little out there, but not nearly as out there as this:

The idea is that the universe didn't exist at all until life had evolved to see it. Since quantum states don't decohere without an observer, the earliest states of the universe don't exist until we (or some other intelligence) look back far enough to see them. So, by this theory, the whole universe is a big feedback loop. I don't really buy it (there are, in theory, other ways for particles to decohere) but it's interesting nonetheless.

[:: comment! :]

Monday, June 28, 2004
information is energy

One of the many key insights of Claude Shannon's information theory was that information is a form of energy: a certain inherent amount of energy is present in any piece of information. And in any operation where less information comes out than went in, the energy of the destroyed information is converted to heat.

The amount involved is not very great and doesn't matter for the kind of circuitry we use in microprocessors. They've got serious cooling problems, but that's not where the heat comes from. It does matter, though, for the people who have been experimenting with building computing devices from Josephson junctions, which rely on certain properties of superconductors. If there are too many gates and too much processing going on, the device heats up and stops being a superconductor.

A logical "NOT" does not destroy information. For each bit in, one bit goes out.

A logical "AND", however, takes two input bits and creates one output bit. Each time, one bit's worth of energy is converted to heat. (Which is why someone suggested a modified gate with one output bit equivalent to AND, plus extra output bits which, combined with the first, can reproduce the two input bits. That way, no information is destroyed. The extra bits would then be connected to a wire exiting the circuit, and destroyed elsewhere, releasing their heat elsewhere.)
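
This reversible-gate idea can be sketched with the three-bit Toffoli gate, a standard textbook example of a reversible AND (my choice of illustration, not necessarily the exact gate the suggestion had in mind):

```python
def toffoli(a, b, c):
    # Reversible 3-bit gate: flips c only when both a and b are 1.
    # With c = 0, the third output is exactly a AND b, while the
    # first two outputs carry the inputs through untouched.
    return a, b, c ^ (a & b)

# Every input triple maps to a unique output triple, so the gate
# is its own inverse: no information is destroyed, no heat is due.
for a in (0, 1):
    for b in (0, 1):
        out = toffoli(a, b, 0)
        assert out[2] == (a & b)            # acts as AND
        assert toffoli(*out) == (a, b, 0)   # applying twice restores inputs
```

The passthrough bits are the "wire exiting the circuit": discard them somewhere else and the heat is released there instead.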

The energy of information also matters in other ways. Since the information is "ordered", the energy is also "ordered". Since information is energy, devices which manipulate information are subject to the laws of thermodynamics. And thermodynamics says that some of the energy involved in any energy transaction becomes disordered and useless, in other words heat. Some of Shannon's key insights resulted from examining how thermodynamics affected the transmission and processing of the ordered energy inherent in information.

That's why redundancy can help. Forward error correction algorithms permit a certain amount of corruption in the transmitted bit stream while still delivering the information correctly.
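
The simplest forward error correction scheme makes the point: repeat every bit three times and decode by majority vote, so any single flipped bit per triple is recoverable. A toy sketch, not any real codec:

```python
def encode(bits):
    # Triple repetition: redundancy bought at 3x the bandwidth.
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(coded):
    # Majority vote per triple survives one corrupted copy.
    return [1 if sum(coded[i:i + 3]) >= 2 else 0
            for i in range(0, len(coded), 3)]

message = [1, 0, 1, 1]
sent = encode(message)
sent[4] ^= 1                      # noise flips a bit in transit
assert decode(sent) == message    # the information still arrives intact
```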

A digital representation of a picture inherently represents a certain amount of energy. If the image is complex, there's more energy; if it is simple there's less. The energy per pixel, or energy per bit, isn't necessarily the same for different images in the same format; it's a function of the image.

Compression algorithms take a bitstream and convert it to a different bitstream in which nearly every bit comes close to carrying maximum information. If the information density of the source was low, it will compress a lot. If it was already high, it won't compress very well.

But if the output is smaller than the input, and if it is lossless compression, then the output file will contain the same energy as the input, with more energy per bit.
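
Both claims are easy to check with an off-the-shelf lossless compressor; here is a sketch using Python's zlib:

```python
import os
import zlib

low = b"ab" * 5000           # highly patterned: low information density
high = os.urandom(10000)     # random bytes: already near-maximal density

z_low = zlib.compress(low)
z_high = zlib.compress(high)

assert len(z_low) < len(low) // 100        # the patterned stream collapses
assert len(z_high) > len(high) * 9 // 10   # the random one barely budges

# Lossless means every bit comes back: the same total information,
# packed into fewer bits, i.e. more "energy" per bit.
assert zlib.decompress(z_low) == low
assert zlib.decompress(z_high) == high
```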

An arithmetic "average" operation is another one which destroys information. Two or more numbers go in, one number comes out. All the energy associated with the discarded numbers is converted to heat. That's what pixellize algorithms do to pictures.

If you take an image and "pixellize" it, what you're doing is to divide it into sections, then average the numbers describing the R,G,B levels of the pixels in each section, yielding a single RGB number used for all those pixels in the output image. If the file is not compressed, the output will be the same number of bits as the input. But because information was destroyed, the resulting file contains less information total, and less per bit.
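
That block-averaging is short enough to write out; a sketch on a toy single-channel (grayscale) image rather than full R,G,B, with made-up pixel values:

```python
def pixellize(img, block=2):
    # Average each block x block tile to one value, then paint that
    # value back over the whole tile: same number of pixels out,
    # but most of the tile's information has been averaged away.
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            tile = [img[y][x]
                    for y in range(by, min(by + block, h))
                    for x in range(bx, min(bx + block, w))]
            avg = sum(tile) // len(tile)   # the information-destroying step
            for y in range(by, min(by + block, h)):
                for x in range(bx, min(bx + block, w)):
                    out[y][x] = avg
    return out

img = [[0, 4, 8, 12],
       [2, 6, 10, 14],
       [100, 100, 0, 0],
       [100, 100, 0, 0]]
flat = pixellize(img, block=2)
# Each 2x2 tile is now uniform; four distinct values have become one.
```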

To go from that back to the unpixellized version requires an operation which produces more energy than it consumes. If it was pixellized in 4*4 squares, then N bits worth of information go in, and N*16 bits of information comes out.

Or rather, N*16 low-energy bits go in, and N*16 higher energy bits come out. The resulting file has more total energy, and more per bit, and won't compress as much. That's perpetual motion. More energy is created than is consumed. And it's against the rules of the game, and you can't get out of the game. Sad, ain't it?

gzip the universe
Intelligence is distributed over the environment because we throw information away. On the long scale (light from above), and the short scale (you know the time, but you haven't looked at your watch yet). Artificial objects, created interfaces that don't obey distance, or object-hood, or texture: they're either confusing, or, if used right, remarkably useful illusions (television).

I'm also interested in the metabolic cycle. It's an autopoietic system (self-creating, an evolved system that's reached this point), with allopoietic components (parts that can't exist on their own, but that are used for creating something other than the autopoietic system). An analogy: Human society (auto); cars (allo). Not an analogy: The metabolic cycle (auto); life-instances (allo).

You need the whole cycle for the cycle to exist. There is no beginning or end. It's like a glider gun in Conway's Life. It exists, it continues, it cycles, it emits gliders (people) that go out to infinity, a side-effect of the cycle. (But not optional. Transformations are always conserved. If the plane of Life weren't infinite, would the gliders pollute/inscribe, and would life in Life eventually emerge?)

The cycle is instantiated in life, and passes from one instance to another in the form of chemicals. Are these chemicals the cycle? No. They're just slices of the organisation, in the same way my laptop screen is a slice of the human-computer-internet-society-history-equation_solutions assemblage. They're both easy-to-refer-to slices, with no real importance other than their semiotcratic affordance (in much the same way the selection mechanism 'attention' has high semiotcratic affordance but no actual reified existence in the brain).


Entropy is an inevitable bulk process of bulk properties, of multiplicities. There's a size limit there: entropy doesn't really function for tiny numbers. But is there a time limit too? Over even vaster numbers, vaster times, is there a new law that overrules entropy? Perhaps it's inevitable that a reproducing thing always emerges out of random motion where history is inscribed and there is the possibility of time-binding. Out of oceans, sediment will always build up, discontinuities will always form, tides will always inscribe and time-bind, symmetries will break, and life will always form. If not cellular life, then maybe a population of nuclear volcanoes that reproduce by exploding and China Syndroming through the crust. Whatever. Entropy++.

I would say, let's redefine life in terms of information. Hayles opens chapter 2 of How We Became Posthuman with a line from Bateson's Steps to an Ecology of Mind: "We might regard patterning or predictability as the very essence and raison d'etre of communication [...] communication is the creation of redundancy or patterning." [p412]

Which is just a wonderful way of talking about information. Or rather, about communication, which is the successful transfer of information. Communication is anything that duplicates itself at different physical coordinates, possibly transformed. So a telephone is communication because the air-pressure ripple patterning is duplicated (more or less) from the speaker's room to the listener's room. If you were to gzip the universe instantaneously before and after the call, it'd compress better afterwards.
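
That last claim is mechanically checkable: a duplicate of existing patterning adds almost nothing to the compressed size. A sketch with zlib standing in for gzip (the "universe" here is obviously made up):

```python
import zlib

# Some stand-in universe state: the patterning in the speaker's room.
room = ("air-pressure ripples, lamplight, a phone on the desk, "
        "dust motes drifting in no particular pattern. ") * 20

before = zlib.compress(room.encode())
# After the call, the same patterning exists in two rooms:
after = zlib.compress((room + room).encode())

# Twice the raw data, nowhere near twice the compressed size:
# the second copy is almost pure redundancy.
assert len(after) < 1.2 * len(before)
```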

What is life? Is it the medium-mechanism by which information is doubly-transformed, to an intermediate, temporary (and therefore matter-like) state, and back again? Information is a process, not a state. It's a becoming, but a becoming that folds back. Like a miniature metabolic cycle. And information puffs out matter-like states as it goes (people, eggs). Which is which, the rocks or the muddy holes?

But then, I feel we look at matter and information and we see the dichotomy because it's semiotcratic to do so. Just as we look at particles and see fermions (things that can't be in the same place at the same time) and bosons (things that can be so). Perhaps it's just an artefact of our measuring equipment. It's all string vibrations, further down. And rooms and corridors. Buildings and streets (tell that to those in Catalhoyuk!). And objects and textures, of course, animate/inanimate, background/attended. Mesh/tree, mesh-becoming/tree-becoming, branching/canalising, push/pull. But we've talked about that, or we will. We've created an arboreal world, we've also been created. We can't assign causality, only proximity. Does it make sense to talk about any thing if everything is every thing?

[:: comment! :]

Saturday, June 26, 2004
consciousness and memory

"Obviously, genetics drives plenty of behaviors, from ants marching to babies suckling. To make the leap from that to say that it must therefore drive all behaviours is disingenuous and silly. The great evolutionary advantage of consciousness and memory is that it allows us to conceive of and execute behaviors that are not encoded into our nervous systems. That's the whole frickin' point of consciousness. Those ancestors that were able to formulate responses in realtime to new conditions were far better suited than those whose behaviors were genetically encoded."

equations and programs

"The difference between equations and programs is typically in retained state. In your typical equation, you're expressing a relationship and there is typically no state (read: memory) in the relationship unless the relationship is recursive, whereby the state is implicit in the expression... State in software has been standard operating procedure for time immemorial and is a big thorn in the side of theoreticians who would prefer a purely functional language in which there is no state at all. State (and side-effects) are an expressional flexibility, but also the cause for most bugs in software."
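
The distinction in that quote fits in a few lines (names are mine):

```python
# Pure, equation-like: the output depends only on the input.
def f(x):
    return 2 * x + 1

# Stateful, program-like: retained memory means the same call can
# return different answers at different times: the expressive
# flexibility, and the bug farm, the quote is pointing at.
class Accumulator:
    def __init__(self):
        self.total = 0      # hidden state
    def add(self, x):
        self.total += x     # side effect
        return self.total

assert f(3) == 7 and f(3) == 7    # referentially transparent

acc = Accumulator()
assert acc.add(3) == 3
assert acc.add(3) == 6            # same input, different output
```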

[:: comment! :]

Friday, June 25, 2004
memory and evolutionary potential

"As molecular biologist and biosemiotician Jesper Hoffmeyer has succinctly put it: '[N]othing can evolve if it is not remembered (because then we talk about substitution).' Moreover, the reliable boundary constraints afforded by genes also afford novelty through mutations that are then nonlinearly amplified if viable.

"The accumulation of information (as well as the novelty of response) afforded by genes has to do with a property called displacement in linguistics that involves, as Pattee has put it, 'the apparent total lack of intrinsic connection between the time and place where we acquire new information and the time and place where it is selected or when we decide to use it in our actions and efforts to control.'"

learning in synaptic networks

"I'm not saying intelligent networks of people and cyberspace will necessarily emerge: I'm postulating that it's possible. The web is analogous to a brain at certain levels. Neural learning algorithms work well in an environment where the network starts with random connections and is built up through directed learning (strengthening and weakening connections in a goal-based way).

"In any case, a poor science fiction novel neither strengthens nor weakens the theory. I'm not sure what a testable hypothesis might be. That users of certain web communities might pull together facts 'out of the air' to solve given problems which they could not solve in isolation? In that case it's the connectedness which is providing the emergent intelligence."
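
"Strengthening and weakening connections in a goal-based way" is, concretely, what the perceptron learning rule does. A toy sketch, random start and directed updates, learning OR (all names and numbers mine):

```python
import random

random.seed(1)

# Training data: the OR function, a linearly separable "goal".
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

# Start with random connection strengths.
w = [random.uniform(-1, 1), random.uniform(-1, 1)]
bias = random.uniform(-1, 1)
rate = 0.2

def predict(x):
    return 1 if w[0] * x[0] + w[1] * x[1] + bias > 0 else 0

for _ in range(50):                 # a few passes over the examples
    for x, target in data:
        err = target - predict(x)   # the goal-based error signal
        w[0] += rate * err * x[0]   # strengthen or weaken each
        w[1] += rate * err * x[1]   # connection in proportion
        bias += rate * err

assert all(predict(x) == t for x, t in data)
```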

[:: comment! :]

Wednesday, June 23, 2004
mad world video (via nytimes)

watch travis dance (via absenter)

[:: comment! :]

Tuesday, June 22, 2004
the amazing daniel (via oblivio)

neutered dog haikus (via snarkout)

[:: comment! :]

Sunday, June 20, 2004
what am i?

Is there actually an identifiable self with continuity of existence which is typing these words? I really don't know. How much would that self have to change before we decide that the continuity has been disrupted? I think I don't want to find out.

Most of those kinds of questions either become moot or are easily answered within the context of standard religions. Those questions are uniquely troubling only for those of us who believe that life and intelligence are emergent properties of certain blobs of mass which are built in certain ways and which operate in certain kinds of environments. We might be forced to accept that identity is just as mythical as the soul. We might be deluding ourselves into thinking that identity is real because we want it to be true.

other people's memories

Here's another way to look at the issue. Our humanity is something separate from our personality and memories. Otherwise, people with advanced Alzheimer's would cease to be human beings, which is not a conclusion we accept. This shows, prima facie, that we don't consider personhood to be intrinsically dependent on having a memory.

Even patients with advanced Alzheimer's still have a sense of their own existence, of themselves as persons. That is why they get so agitated and upset at the confusion that results from the disease. This shows, again, that we are more than just collections of memories, and that there is a fundamental sense of being an agent in the world that precedes them and survives their destruction.

[:: comment! :]

Saturday, June 19, 2004
going somewhere (via plastic scitech)

Why dying people speak of taking journeys is anyone's guess. Drugs don't seem to play much of a role, hospice workers say, because the phenomenon occurs both in those who are taking painkillers and those who aren't. If anything, they say, the more drugs one takes, the less likely such conversations become.

Others speculate that the dying may be experiencing visions similar to those in a dream. "The mind has its own set of analgesics," said sociologist Robert Fulton, a University of Minnesota professor emeritus and a pioneer in the study of death and bereavement in the 1960s. "The mind is well capable of drugging itself. In a dream, there might be the euphoria of meeting a dead friend and having a conversation…. The brain is kind of cleaning itself up, like a computer downloading."

The dreams are reinforced by images of immortality, and of heaven and hell or reincarnation, embraced throughout history as well as in modern life. Kelley speculates that the dying recognize "that they're going from one world to another one" or the feeling that they're "going somewhere."

uncanny valley (via collision detection)

The Uncanny Valley can make games less engrossing. That's particularly true with narrative games, which rely on believable characters with whom you're supposed to identify. The whole point is to suspend disbelief and immerse yourself. But that's hard to do when the characters create goosebumps. You fight searing battles, solve brain-crushing puzzles, vanquish enemies, and what are you rewarded with? A chance to watch your avatar mince about the screen in some ghoulish parody of humanity.

The screwiest part of this phenomenon is that game designers pride themselves on the quality of their sepulchral human characters. It's part of the malaise that currently affects game design, in which too many designers assume that crisper 3-D graphics will make a game better. That may be true when it comes to scenery, explosions, or fog. But with human faces and bodies, we're harder to fool. Neuroscientists argue that our brains have evolved specific mechanisms for face recognition, because being able to recognize something "wrong" in someone else's face has long been crucial to survival. If that's true, then game designers may never be able to capture that last 1 percent of realism. The more they plug away at it—the more high-resolution their human characters become—the deeper they'll trudge into the Uncanny Valley.

Instead, maybe they should try climbing out, by going in the opposite direction and embracing low-rez simplicity. Roboticists have begun doing this. Like Mori, they've learned that a spare, stripped-down robot can seem more lifelike than an explicitly humanoid one. I own a Roomba, one of those Frisbee-shaped vacuum robots, and it doesn't look even vaguely human. Yet as it zips around my living room, it seems amazingly alive, and I can't help but feel warmly toward it. This is because of another quirk of our psychology: If something behaves in only a slightly human way, we'll fill in the blanks—we'll read humanness into it. (That's partly why our pets seem so intelligent and humanlike.)

[:: comment! :]

Thursday, June 17, 2004
paean to killing somebody! (via MT)

Assuming you have it inside a house where you can work on it a bit, the first thing you want to do is drain it of fluids. This will make it easier to cut up, and slow decomposition a little bit. The best way to do this quick and dirty is to perforate the body with a pointed knife, and then perform CPR on it. Cut the fronts of the thighs deep, diagonally, to slit the femoral arteries. Then pump the chest. The valves in the heart will still work when dead, and the springback of the ribcage can apply a fair amount of suction to the atria. Do this in a tub. Plug the drain, and mingle lots of bleach with the bodily fluids before unplugging the drain to empty the tub. This should help control the stench of death, which would otherwise reek from your gutter gratings. Do everything you can to control odors. Plug in an ionizer, burn candles, leave bowls of baking soda everywhere. Ventilate the room in the middle of the night, but otherwise keep it closed. Keep the body under a plastic sheet while it's in the tub.
paean to community college :D (via HR)
You go to community college because you are an ambitious kid whose parents don't have professional jobs. Because you are a girl in a family whose culture for thousands of years has valued education only for boys. Because you come from a family that never really thought about college for anyone, never saved for it or steered you toward it. You go to community college because you had a significant trauma during your adolescence: Perhaps you had an alcoholic parent, lost a sibling, lived in a household of chronic anger, suffered from depression or anorexia, did too many drugs. So you failed some of your high school courses, and the "good" colleges won't take you. You go to community college because you were born in another country and came to America too late to pick up English very easily. Because you landed a good job or gave birth to a beautiful baby right out of high school, and didn't look back for 10 or 15 years, when, suddenly, you thought about college. You go to community college because you have a learning disability, undiagnosed or untreated, that pushed you to the sidelines in school. Because you started at a four-year school and discovered that you weren't ready to leave home. And you go to community college because you believe that America is a society where intelligence is rewarded, and since you're such a fine, intelligent person, it's unnecessary for you to actually do any homework in high school, and suddenly you have a C average and your SATs are pretty good but, frankly, so are a lot of other people's, and the best offer you got from four-year colleges was their wait list.

[:: comment! :]

Wednesday, June 16, 2004
joe blow (via list)

Vinton Studios announces the release of its latest short film Joe Blow directed and conceived by Emmy Award-winning Vinton director Mark Gustafson. The film will premiere exclusively at Landmark Theatres starting April 16 in New York and LA with wider distribution to follow. The short film will run prior to the steamy drama Young Adam starring Ewan McGregor and Tilda Swinton.

Joe Blow is Gustafson's third film. Others include Mr. Resistor and Bride of Resistor. Joe Blow is the story of one man's quest for companionship, says Gustafson. It's a cautionary tale of passion, loss and pneumatics. Joe, who lives by himself in a trailer, finds that love can be a breathtaking experience.
joe blogs (via blogdex)
A few years ago, Mathew Gross, 32, was a free-lance writer living in tiny Moab, Utah. Rob Malda, 28, was an underperforming undergraduate at a small Christian college in Michigan. Denis Dutton, 60, was a professor of philosophy in faraway Christchurch, New Zealand. Today they are some of the most influential media personalities in the world. You can be one too.

Gross, Malda and Dutton aren't rich or famous or even conspicuously good-looking. What they have in common is that they all edit blogs: amateur websites that provide news, information and, above all, opinions to rapidly growing and devoted audiences drawn by nothing more than a shared interest or two and the sheer magnetism of the editor's personality. Over the past five years, blogs have gone from an obscure and, frankly, somewhat nerdy fad to a genuine alternative to mainstream news outlets, a shadow media empire that is rivaling networks and newspapers in power and influence. Which raises the question: Who are these folks anyway? And what exactly are they doing to the established pantheon of American media?

[:: comment! :]

Monday, June 14, 2004

"So what exactly is going on here? In Achbar's view, 'There's a real disenchantment with corporate culture.' Many people see corporations as having governmentlike power with almost no accountability and don't see the standard media outlets dealing with that issue. 'So they've got to go to a movie theater to see their values reflected,' he says."

"Among those interviewed in 'The Corporation' is Michael Moore himself, and toward the end of the film he argues that his own career is proof of a sort of greed loophole in the culture industry that allows dissenters to exploit the system to their own ends: the rich man will sell you the rope you will use to hang him if he thinks he can profit from it. (Provocative analysis, eh, Weinsteins?) Bakan says this is what's happening, and that it's happening because there is an authentic hunger to understand corporate power in a way that most media accounts don't."


"It's not just the rich that need their luxuries justified, though. Disneyland matters in the same way: The great genius of postwar America has been the creation of broad, vulgar luxuries, the hard-working pursuit of which by the broad, vulgar masses keeps us all from destitution."

"Yet, some people insist on saying they're helping others even as they exploit and demean those very same people. Those people are just greedy, selfish bastards and heartless cads who like to think of themselves as 'good people.'"

[:: comment! :]

Sunday, June 13, 2004
holographic data storage (via SE!)

"NTT unveiled its 1-gigabyte prototype earlier this year, and the first commercial versions are slated to hit the market in 2005. The see-through cards are manufactured using thin-film holography: Digital data is encoded into a two-dimensional image, a computer converts the image into a hologram, and lasers etch the hologram onto the card. The result is about 100 minute layers with varying refractive properties. Magnified several hundred times, the patterns etched into the layers look a bit like the dots and dashes of Morse code."

rock climbing robot (via SE :)

"While other climbing robots are designed to scale the sides of flat structures using suction cups or magnets for grip, tackling uneven geological surfaces is a far more difficult task. With a central body and four triple-jointed limbs, Lemur's gait resembles that of a human rock climber as it manoeuvres up an indoor climbing wall at Stanford (see graphic above, and videos here)."

[:: comment! :]

Friday, June 11, 2004


[:: comment! :]

Thursday, June 10, 2004
yingzi (via sideout)

proofrea (via gulfstream)

[:: comment! :]

Tuesday, June 8, 2004

As it happens, his accidental study provides a window onto a subject that has long stymied academics: white-collar crime. (Yes, shorting the bagel man is white-collar crime, writ however small.) Despite all the attention paid to companies like Enron, academics know very little about the practicalities of white-collar crime. The reason? There aren't enough data.

A key fact of white-collar crime is that we hear about only the very slim fraction of people who are caught. Most embezzlers lead quiet and theoretically happy lives; employees who steal company property are rarely detected. With street crime, meanwhile, that is not the case. A mugging or a burglary or a murder is usually counted whether or not the criminal is caught. A street crime has a victim, who typically reports the crime to the police, which generates data, which in turn generate thousands of academic papers by criminologists, sociologists and economists. But white-collar crime presents no obvious victim. Whom, exactly, did the masters of Enron steal from? And how can you measure something if you don't know to whom it happened, or with what frequency, or in what magnitude?


This sort of hypocrisy is woven deeply into the fabric of American business life. But how deeply I didn't appreciate until I sat in on some classes in ethics at the Haas School of Business at the University of California, Berkeley.

The place is, you might think, the natural home for woolly-headed business idealism. It is, after all, Berkeley. And the new dean, a former Republican congressman named Tom Campbell, has made the newly fashionable subject of business ethics something of a personal obsession. When we met he was trying -- and failing -- to gain entrance to a white-collar prison, so he could bring his students face to face with real-life business crooks. The point, he said, was "to show the students that the businesspeople who wind up in jail aren't really any different from them. They look like them, they talk like them; they just made bad decisions." In the meantime, to help keep his students out of jail, he had expanded the school's ethics curriculum.

[:: comment! :]

Monday, June 7, 2004
critical mass (via scitech!)

Ball gives a sprawling account of physics over the past several centuries - from thermodynamics to complexity theory - showing how fundamental insights about the behavior of matter can be adapted to understand the dynamics of society. He applies this claim intriguingly to a variety of social, economic, and political situations, showing, for instance, that voting follows patterns akin to magnetization, and marriage rates resemble the behavior of gases and liquids.

mass intelligence (via metafilter :)

As counterintuitive as it sounds, however, the mathematics work so long as Surowiecki's three key criteria - independence, diversity, and decentralization - are satisfied. He explains his three criteria in depth, and then shows how they play out in the three types of problems that groups can solve: cognition problems (such as who will win the Patriots game), coordination problems (such as how crowds move through a city), and cooperation problems (such as how to price a gallon of orange juice fairly).
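
The cognition-problem case is easy to simulate: independent, diverse, unbiased guesses, averaged, beat the typical individual guess (a toy model, all numbers made up):

```python
import random

random.seed(0)

truth = 100.0    # say, the weight of the ox, Galton-style
# 1000 independent guessers, each noisy but unbiased.
guesses = [truth + random.gauss(0, 20) for _ in range(1000)]

crowd_error = abs(sum(guesses) / len(guesses) - truth)
typical_error = sum(abs(g - truth) for g in guesses) / len(guesses)

# Independence and diversity make the individual errors cancel
# in the aggregate, so the crowd's estimate is far closer.
assert crowd_error < typical_error
```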

[:: comment! :]

Wednesday, June 2, 2004
stanislaw witkiewicz

Painter, playwright, novelist, aesthetician, and philosopher, Witkiewicz -- or Witkacy as he called himself -- belongs to the writers and thinkers known in Poland as catastrophists, who sprang up in the period framed by two world wars, the first of which brought the Polish state back into existence after nearly 150 years of dismemberment, and the second of which threatened the nation with total annihilation. Poised between cataclysms, Witkacy forecast an apocalyptic close to Western civilization and wrote with sardonic humor about the approaching end of the world.

The major theme in all of Witkacy's works is the growing mechanization of life, understood not as dehumanizing technology, but rather as social and psychic regimentation. In dozens of plays and two large novels, Witkacy portrays the threatened extinction of a decadent individualism. The degenerate remnants of a once creative mankind will be replaced by a new race of invading levelers who will establish the reign of mass conformity, modeled on the beehive and anthill -- by what Orwell calls "insect-men."
colin wilson
The most engaging part of his book is the factual stuff, about his early struggle to be a writer and his relationship with Joy and their children. Where it drags is when he gets on to his ideas. His philosophy is basically existentialism with non-rational excrescences and characterised by bizarre nomenclature - Faculty X, Upside Downness, Peak Experiences, Right Men, The Dominant Five Per Cent, King Rats. It seems to constitute an attempt to classify human feelings and behaviour as written by a Martian who has never met an Earthling. This is, of course, Wilson's weakness and also, in a way, his charm - he has no understanding of other people whatever. When I ask if he would say he is low in emotional intelligence, he readily agrees: 'That is fair, yes.'

As a child he was so introverted, so uninterested in other people, he might have been diagnosed today with Asperger's syndrome. 'I wouldn't be surprised. I wasn't cut off from other people, but, as I keep saying in The Outsider, other people were the trouble. They kept intruding into my world whether I wanted them to or not, because what they did was to drag me away from the world of ideas and abstractions I wanted to be in. When I was a teenager I was a total romantic escapist. My world was books. I felt as Axel did in the Villiers de L'Isle-Adam play - 'As for living, our servants can do that for us.' But that all changed when I was 16 and discovered Rabelais. Suddenly I had that wonderful feeling - my God, life is good after all!'

[:: comment! :]

Tuesday, June 1, 2004
what is mathematics? (via sciscoop)

So what does it mean when mainstream explanations of our physical reality are based on stuff that even scientists cannot comprehend? When nonscientists read about the strings and branes of the latest physics theories, or the Riemann surfaces and Galois fields of higher mathematics, how close are we to a real understanding? Despite the writer's best metaphors and analogies, what is lost in translation?

"It is a bit like trying to explain football to people who not only have no understanding of the word 'ball,' but are also rather hazy about the concept of the game, let alone the prestige attached to winning the Super Bowl," wrote Dr. Ian Stewart, professor of mathematics at the University of Warwick in England, in an email message.

Asked if there exist mathematical concepts that defy explanation to a popular audience, Dr. Stewart, author of "Flatterland: Like Flatland, Only More So," replied: "Oh, yes - possibly most of them. I have never even dared to try to explain noncommutative geometry or the cohomology of sheaves, even though both are at least as important as, say, chaos theory or fractals."

Dr. Keith Devlin, a mathematician at Stanford University and author of "The Millennium Problems," which tries to describe the most challenging problems in mathematics today, admits defeat in his last and most impenetrable chapter, where he is forced to interpret something called the Hodge conjecture. He suggests to readers, "If you find the going too hard, then the wise strategy might be to give up."
what are numbers? (via interconnected)
Our brain seems to be equipped from birth with a number sense. Elementary arithmetic appears to be a basic, biologically determined ability inherent in our species (and not just our own — since we share it with many animals). Furthermore it has a specific cerebral substrate, a set of neuronal networks that are similarly localized in all of us and that hold knowledge of numbers and their relations. In brief, perceiving numbers in our surroundings is as basic to us as echolocation is to bats or birdsong is to song birds.

It is clear that this theory has important, immediate consequences for the nature of mathematics. Obviously, the amazing level of mathematical development that we have now reached is a uniquely human achievement, specific to our language-gifted species, and largely dependent on cultural accumulation. But the claim is that basic concepts that are at the foundation of mathematics, such as numbers, sets, space, distance, and so on arise from the very architecture of our brain.

In this sense, numbers are like colors. You know that there are no colors in the physical world. Light comes in various wavelengths, but wavelength is not what we call color (a banana still looks yellow under different lighting conditions, where the wavelengths it reflects are completely changed). Color is an attribute created by the V4 area of our brain. This area computes the relative amount of light at various wavelengths across our retina, and uses it to compute the reflectance of objects (how they reflect the incoming light) in various spectral bands. This is what we call color, but it is purely a subjective quality constructed by the brain. It is, nonetheless, very useful for recognizing objects in the external world, because their color tends to remain constant across different lighting conditions, and that's presumably why the color perception ability of the brain has evolved in the way it has.

My claim is that number is very much like color. Because we live in a world full of discrete and movable objects, it is very useful for us to be able to extract number. This can help us to track predators or to select the best foraging grounds, to mention only very obvious examples. This is why evolution has endowed our brains and those of many animal species with simple numerical mechanisms. In animals, these mechanisms are very limited, as we shall see below: they are approximate, their representation becomes coarser for increasingly large numbers, and they involve only the simplest arithmetic operations (addition and subtraction). We humans have also had the remarkable good fortune to develop abilities for language and for symbolic notation. This has enabled us to develop exact mental representations for large numbers, as well as algorithms for precise calculations. I believe that mathematics, or at least arithmetic and number theory, is a pyramid of increasingly more abstract mental constructions based solely on (1) our ability for symbolic notation, and (2) our nonverbal ability to represent and understand numerical quantities.
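That "approximate, coarser for larger numbers" mechanism can actually be toyed with. Here's a minimal sketch (my own, not from the excerpt) of the standard model of it: each quantity is perceived as a noisy magnitude on a logarithmic scale, so telling two numbers apart depends on their ratio, not their difference (Weber's law). The Weber fraction of 0.15 is an assumed noise level, just for illustration.

```python
import math
import random

# Assumed noise level of the internal magnitude; real values vary by species.
WEBER_FRACTION = 0.15

def perceive(n, rng=random):
    """Return a noisy internal magnitude for the quantity n (log scale)."""
    return rng.gauss(math.log(n), WEBER_FRACTION)

def discriminate(a, b, trials=10_000, rng=random):
    """Fraction of trials on which the larger quantity is correctly picked."""
    correct = sum(
        (perceive(a, rng) > perceive(b, rng)) == (a > b)
        for _ in range(trials)
    )
    return correct / trials

# Same absolute difference, different ratios: 2 vs 4 is easy,
# 22 vs 24 is close to a coin flip.
for a, b in [(2, 4), (22, 24)]:
    print(a, "vs", b, "->", round(discriminate(a, b), 2))
```

Run it and you see exactly the coarsening the excerpt describes: accuracy is near perfect for 2 vs 4 and drops toward chance for 22 vs 24, even though both pairs differ by two.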

[:: comment! :]

a r c h i v e

w r i t t e n
get out
on violence
horror story
disney land
from earth
human matrix
crazy guy
future reference
bosco nought
US engagement

f r i e n d s

a b o u t m e