

the eighth day


March 27, 2009

What's in a Name?

The number of people in Britain with surnames like Cockshott, Balls, Death and Shufflebottom has declined by up to 75 per cent in the last century.

A study found the number of people with the name Cock shrank to 785 last year from 3,211 in 1881, those called Balls fell to 1,299 from 2,904, and the number of Deaths fell to 605 from 1,133.

People named Smellie decreased by 70 per cent, Dafts by 51 per cent, Gotobeds by 42 per cent, Shufflebottoms by 40 per cent, and Cockshotts by 34 per cent, said Richard Webber, visiting professor of geography at King's College in London.
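For readers who want to check the article's arithmetic, here is a small sketch (using only the 1881 and 2008 counts quoted above) that computes the percentage declines:

```python
# Percentage decline in surname counts, using the figures quoted in the
# article: (count in the 1881 census, count in 2008).
counts = {
    "Cock": (3211, 785),
    "Balls": (2904, 1299),
    "Death": (1133, 605),
}

def percent_decline(old, new):
    """Return the percentage fall from old to new."""
    return (old - new) / old * 100

for name, (in_1881, in_2008) in counts.items():
    print(f"{name}: down {percent_decline(in_1881, in_2008):.1f}%")
```

The numbers bear out the headline: the name Cock fell by just over 75 per cent, Balls by about 55 per cent and Death by about 47 per cent.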

"If you find the [absolute] number goes down, it's either because they changed their names or they emigrated," Professor Webber, author of the study, said.

He said that in many cases, people probably changed their surnames as they came to be regarded as in bad taste.

"It's because the meaning of words can change. Take the name Daft - its use as a term for a stupid person is a relatively recent innovation."

According to the Oxford English Dictionary, Daft meant "mild" or "meek" in Old English, whereas it means "foolish" today.

"That's why there are names which people think aren't really very pleasant names and you wonder why they persisted as long as they did."

Professor Webber, whose work can be seen on the website, got his data for 2008 from credit card firm Experian and mapping service Geowise. He then compared it with the census of 1881.

Webber also discovered that the most popular names in Britain have not changed over the past 127 years.

Last year, Smith, Jones, Williams, Brown, Taylor and Davies held the top six spots, in exactly the same order as they did in 1881.

Professor Webber also found that between 1996 and 2008, the names Zhang, Wang and Yang experienced the fastest growth. Zhang rose by 4,719 per cent, while Wang grew by 2,225 per cent.

- Reuters

Posted by gary at 12:09 AM | Comments (0)

March 26, 2009

What do you get if you divide science by God?

A prize-winning quantum physicist says a spiritual reality is veiled from us, and science offers a glimpse behind that veil. So how do scientists investigating the fundamental nature of the universe assess any role of God, asks Mark Vernon.

The Templeton Prize, awarded for contributions to "affirming life's spiritual dimension", has been won by French physicist Bernard d'Espagnat, who has worked on quantum physics with some of the most famous names in modern science.

Quantum physics is a hugely successful theory: the predictions it makes about the behaviour of subatomic particles are extraordinarily accurate. And yet, it raises profound puzzles about reality that have yet to be resolved.

Quantum physics originated in work conducted by Max Planck and Albert Einstein at the start of the 20th Century.
They discovered that light comes in discrete packets, or quanta, which we call photons.
The Heisenberg uncertainty principle says certain features of subatomic particles, such as momentum and position, cannot both be known precisely at the same time.
Gaps remain, like the search for the 'God Particle' that scientists hope to spot in the Large Hadron Collider; it is required to give other particles mass.
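The uncertainty principle mentioned above has a compact mathematical form; in standard notation (not from the article itself), for position x and momentum p:

```latex
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
```

where \(\hbar\) is the reduced Planck constant: the more precisely a particle's position is pinned down, the less precisely its momentum can be known, and vice versa.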
The bizarre nature of quantum physics has attracted its share of wacky speculation, but the theory also suggests to some serious scientists that reality, at its most basic, is perfectly compatible with what might be called a spiritual view of things.
Some suggest that observers play a key part in determining the nature of things. Legendary physicist John Wheeler said the cosmos "has not really happened, it is not a phenomenon, until it has been observed to happen."
D'Espagnat worked with Wheeler, though he himself reckons quantum theory suggests something different. For him, quantum physics shows us that reality is ultimately "veiled" from us.
The equations and predictions of the science, super-accurate though they are, offer us only a glimpse behind that veil. Moreover, that hidden reality is, in some sense, divine. Along with some philosophers, he has called it "Being".
In an effort to seek the answers to the "meaning of physics", I spoke to five leading scientists.

Nobel-prize winning physicist Steven Weinberg is well-known as an atheist. For him, physics reflects the "chilling impersonality" of the universe.
He would be thinking here of, say, the vast tracts of empty space, billions of light years across, that mock human meaning. He says: "The more the universe seems comprehensible, the more it seems pointless."
So for Weinberg, the notion that there might be an overlap between science and spirituality is entirely mistaken.

The Astronomer Royal and President of the Royal Society, Martin Rees, shows a distinct reserve when speculating about what physics might mean, whether that be pointlessness or meaningfulness.
He has "no strong opinions" on the interpretation of quantum theory: only time will tell whether the theory becomes better understood.
"The implications of cosmology for these realms of thought may be profound, but diffidence prevents me from venturing into them," he has written.
In short, it is good to be humble in the face of the mysteries that physics throws up.

Oxford physicist Roger Penrose differs again. He believes that mathematics suggests there is a world beyond the immediate, material one.
Can science explain all of life's meaning?
Ask yourself this question: would one plus one equal two even if I didn't think it? The answer is yes.
Would it equal two even if no-one thought it? Again, presumably, yes.
Would it equal two even if the universe didn't exist? That is more tricky to contemplate, but again, there are good grounds for a positive response.
Penrose, therefore, argues that there is what can be called a Platonic world beyond the material world that "contains" mathematics and other abstractions.

John Polkinghorne worked on quantum physics in the first part of his career, but then took up a different line of work: he was ordained an Anglican priest. For him, science and religion are entirely compatible.
The ordered universe science reveals is only what you'd expect if it was made by an orderly God. However, the two disciplines are different. He calls them "intellectual cousins".
"Physics is showing the world to be both more supple and subtle, but you need to be careful," he says.
If you want to understand the meaning of things you have to go beyond science, and the religious direction is, he argues, the best.

Brian Swimme is a cosmologist, and with the theologian Thomas Berry, wrote a book called The Universe Story: From the Primordial Flaring Forth to the Ecozoic Era.
It is avidly read by individuals in New Age and ecological circles, and tells the scientific story of the universe, from the Big Bang to the emergence of human consciousness, but does so as a new sacred myth.
Swimme believes that "the universe is attempting to be felt", which makes him a pantheist, someone who believes the cosmos in its entirety can be called God.

Mark Vernon is author of After Atheism: Science, Religion and the Meaning of Life. source:BBC

Posted by gary at 11:33 AM | Comments (0)

March 18, 2009

8 Brilliant Scientific Screw Ups

Hard work and dedication have their time and place, but the values of failure and ineptitude have gone unappreciated for far too long. They say that patience is a virtue, but the following eight inventions prove that laziness, slovenliness, clumsiness and pure stupidity can be virtues, too.

1. Anesthesia (1844)

Mistake Leading to Discovery: Recreational drug use
Lesson Learned: Too much of a good thing can sometimes be, well, a good thing

Nitrous oxide was discovered in 1772, but for decades the gas was considered no more than a party toy. People knew that inhaling a little of it would make you laugh (hence the name “laughing gas”), and that inhaling a little more of it would knock you unconscious. But for some reason, it hadn’t occurred to anyone that such a property might be useful in, say, surgical operations.

Finally, in 1844, a dentist in Hartford, Conn., named Horace Wells came upon the idea after witnessing a nitrous mishap at a party. High on the gas, a friend of Wells fell and suffered a deep gash in his leg, but he didn’t feel a thing. In fact, he didn’t know he’d been seriously injured until someone pointed out the blood pooling at his feet.

To test his theory, Wells arranged an experiment with himself as the guinea pig. He knocked himself out by inhaling a large dose of nitrous oxide, and then had a dentist extract a rotten tooth from his mouth. When Wells came to, his tooth had been pulled painlessly.

To share his discovery with the scientific world, he arranged to perform a similar demonstration with a willing patient in the amphitheatre of the Massachusetts General Hospital. But things didn’t exactly go as planned. Not yet knowing enough about the time it took for the gas to kick in, Wells pulled out the man’s tooth a little prematurely, and the patient screamed in pain. Wells was disgraced and soon left the profession. Later, after being jailed while high on chloroform, he committed suicide. It wasn’t until 1864 that the American Dental Association formally recognized him for his discovery.

2. Iodine (1811)

Mistake Leading to Discovery: Industrial accident
Lesson Learned: Seaweed is worth its weight in salt

In the early 19th century, Bernard Courtois was the toast of Paris. He had a factory that produced saltpeter (potassium nitrate), which was a key ingredient in ammunition, and thus a hot commodity in Napoleon's France. On top of that, Courtois had figured out how to fatten his profits by getting the potassium for his saltpeter for next to nothing. He simply took it straight from the seaweed that washed up daily on the shores. All he had to do was collect it, burn it, and extract the potassium from the ashes.

One day, while his workers were cleaning the tanks used for extracting potassium, they accidentally used a stronger acid than usual. Before they could say “sacre bleu!,” mysterious clouds billowed from the tank. When the smoke cleared, Courtois noticed dark crystals on all the surfaces that had come into contact with the fumes. When he had them analyzed, they turned out to be a previously unknown element, which he named iodine, after the Greek word for “violet.” Iodine, plentiful in saltwater, is concentrated in seaweed. It was soon discovered that goiters, enlargements of the thyroid gland, were caused by a lack of iodine in the diet. So, in addition to its other uses, iodine is now routinely added to table salt.

3. Penicillin (1928)

Mistake Leading to Discovery: Living like a pig
Lesson Learned: It helps to gripe to your friends about your job

Scottish scientist Alexander Fleming had a, shall we say, relaxed attitude toward a clean working environment. His desk was often littered with small glass dishes—a fact that is fairly alarming considering that they were filled with bacteria cultures scraped from boils, abscesses and infections. Fleming allowed the cultures to sit around for weeks, hoping something interesting would turn up, or perhaps that someone else would clear them away.

Finally one day, Fleming decided to clean the bacteria-filled dishes and dumped them into a tub of disinfectant. His discovery was about to be washed away when a friend happened to drop by the lab to chat with the scientist. During their discussion, Fleming griped good-naturedly about all the work he had to do and dramatized the point by grabbing the top dish in the tub, which was (fortunately) still above the surface of the water and cleaning agent. As he did, Fleming suddenly noticed a dab of fungus on one side of the dish, which had killed the bacteria nearby. The fungus turned out to be a rare strain of penicillium that had drifted onto the dish from an open window.

Fleming began testing the fungus and found that it killed deadly bacteria, yet was harmless to human tissue. However, Fleming was unable to produce it in any significant quantity and didn’t believe it would be effective in treating disease. Consequently, he downplayed its potential in a paper he presented to the scientific community. Penicillin might have ended there as little more than a medical footnote, but luckily, a decade later, another team of scientists followed up on Fleming’s lead. Using more sophisticated techniques, they were able to successfully produce one of the most life-saving drugs in modern medicine.

4. The Telephone (1876)

Mistake Leading to Discovery: Poor foreign language skills
Lesson Learned: A little German is better than none

In the 1870s, engineers were working to find a way to send multiple messages over one telegraph wire at the same time. Intrigued by the challenge, Alexander Graham Bell began experimenting with possible solutions. After reading a book by Hermann von Helmholtz, Bell got the idea to send sounds simultaneously over a wire instead. But as it turns out, Bell's German was a little rusty, and the author had mentioned nothing about the transmission of sound via wire. Too late for Bell, though; the inspiration was there, and he had already set out to do it.

The task proved much more difficult than Bell had imagined. He and his mechanic, Thomas Watson, struggled to build a device that could transmit sound. They finally succeeded, however, and came up with the telephone.

5. Photography (1835)

Mistake Leading to Discovery: Not doing the dishes
Lesson Learned: Put off today what you can do tomorrow

Between 1829 and 1835, Louis Jacques Mandé Daguerre was close to becoming the first person to develop a practical process for producing photographs. But he wasn't home free yet.

Daguerre had figured out how to expose an image onto highly polished plates covered with silver iodide, a substance known to be sensitive to light. However, the images he was producing on these polished plates were barely visible, and he didn’t know how to make them darker.

After producing yet another disappointing image one day, Daguerre tossed the silverized plate in his chemical cabinet, intending to clean it off later. But when he went back a few days later, the image had darkened to the point where it was perfectly visible. Daguerre realized that one of the chemicals in the cabinet had somehow reacted with the silver iodide, but he had no way of knowing which one it was … and there were a whole lot of chemicals in that cabinet.

For weeks, Daguerre took one chemical out of the cabinet every day and put it in with a newly exposed plate. But every day, he found a less-than-satisfactory image. Finally, as he was testing the very last chemical, he got the idea to put the plate in the now-empty cabinet, as he had done the first time. Sure enough, the image on the plate darkened. Daguerre carefully examined the shelves of the cabinet and found what he was looking for. Weeks earlier, a thermometer in the cabinet had broken, and Daguerre (being the slob that he was) didn't clean up the mess very well, leaving a few drops of mercury on the shelf. Turns out, it was the mercury vapor interacting with the silver iodide that produced the darker image. Daguerre incorporated mercury vapor into his process, and the Daguerreotype photograph was born.

6. Mauve Dye (1856)

Mistake Leading to Discovery: Delusions of grandeur
Lesson Learned: Real men wear mauve

In 1856, an 18-year-old British chemistry student named William Perkin attempted to develop a synthetic version of quinine, the drug commonly used to treat malaria. It was a noble cause, but the problem was, he had no idea what he was doing.

Perkin started by mixing aniline (a colorless, oily liquid derived from coal-tar, a waste product of the steel industry) with propylene gas and potassium dichromate. It’s a wonder he didn’t blow himself to bits, but the result was just a disappointing black mass stuck to the bottom of his flask. As Perkin started to wash out the container, he noticed that the black substance turned the water purple, and after playing with it some more, he discovered that the purple liquid could be used to dye cloth.

With financial backing from his wealthy father, Perkin began a dye-making business, and his synthetic mauve colorant soon became popular. Up until the time of Perkin’s discovery, natural purple dye had to be extracted from Mediterranean mollusks, making it extremely expensive. Perkin’s cheap coloring not only jumpstarted the synthetic dye industry (and gave birth to the colors used in J.Crew catalogs), it also sparked the growth of the entire field of organic chemistry.

7. Nylon (1934)

Mistake Leading to Discovery: Workplace procrastination
Lesson Learned: When the cat’s away, the mice should play

In 1934, researchers at DuPont were charged with developing synthetic silk. But after months of hard work, they still hadn’t found what they were looking for, and the head of the project, Wallace Hume Carothers, was considering calling it quits. The closest they had come was creating a liquid polymer that seemed chemically similar to silk, but in its liquid form wasn’t very useful. Deterred, the researchers began testing other, seemingly more promising substances called polyesters.

One day, a young (and apparently bored) scientist in the group noticed that if he gathered a small glob of polyester on a glass stirring rod, he could use it to pull thin strands of the material from the beaker. And for some reason (prolonged exposure to polyester fumes, perhaps?) he found this hilarious. So on a day when boss-man Carothers was out of the lab, the young researcher and his co-workers started horsing around and decided to have a competition to see who could draw the longest threads from the beaker. As they raced down the hallway with the stirring rods, it dawned on them: By stretching the substance into strands, they were actually re-orienting the molecules and making the liquid material solid.

Ultimately, they determined that the polyesters they were playing with couldn’t be used in textiles, like DuPont wanted, so they turned to their previously unsuccessful silk-like polymer. Unlike the polyester, it could be drawn into solid strands that were strong enough to be woven. This was the first completely synthetic fiber, and they named the material Nylon.

8. Vulcanized Rubber (1844)

Mistake Leading to Discovery: Obsession combined with butterfingers
Lesson Learned: A little clumsiness can go a long way

In the early 19th century, natural rubber was relatively useless. It melted in hot weather and became brittle in the cold. Plenty of people had tried to “cure” rubber so it would be impervious to temperature changes, but no one had succeeded … that is, until Charles Goodyear stepped in (or so he claims). According to his own version of the tale, the struggling businessman became obsessed with solving the riddle of rubber, and began mixing rubber with sulfur over a stove. One day, he accidentally spilled some of the mixture onto the hot surface, and when it charred like a piece of leather instead of melting, he knew he was onto something.

The truth, according to well-documented sources, is somewhat different. Apparently, Goodyear learned the secret of combining rubber and sulfur from another early experimenter. And it was one of his partners who accidentally dropped a piece of fabric impregnated with the rubber and sulfur mixture onto a hot stove. But it was Goodyear who recognized the significance of what happened, and he spent months trying to find the perfect combination of rubber, sulfur and high heat. (Goodyear also took credit for coining the term "vulcanization" for the process, but the word was actually first used by an English competitor.) Goodyear received a patent for the process in 1844, but spent the rest of his life defending his right to the discovery. Consequently, he never grew rich and, in fact, wound up in debtors' prison more than once. Ironically, rubber became a hugely profitable industry years later, with the Goodyear Tire & Rubber Co. at the forefront.

By Eric Elfman

Posted by gary at 08:20 PM | Comments (0)

March 17, 2009

When the Camera Angle Matters

These photos are creatively composed - the camera angle makes all the difference!

Posted by gary at 10:00 PM | Comments (0)

March 13, 2009

Work Value

In 1965, U.S. CEOs at major companies made 24 times a worker's pay -- by 2004, CEOs earned 431 times the pay of an average worker. From 1995 to 2005, average CEO pay increased five times faster than that of average workers. While CEO pay continues to increase at rates far exceeding inflation, wages for the vast majority of American workers have failed to keep up with rising prices. In fact, real wages for the 90% of Americans who earn under $92,000 a year have actually fallen since 2001.

Posted by gary at 07:34 AM | Comments (0)

March 10, 2009

A Changing World

Food for thought:

Posted by gary at 11:27 AM | Comments (0)

March 09, 2009

Tree Power

How Autumn works...

Posted by gary at 08:29 AM | Comments (2)
