Sunday, April 29, 2007
To put those remarks in some context, I was a child of the late 1950's and 1960's. Optimism about science and technology was everywhere (or so it seemed to me), and you couldn't help (or so it seemed to me) but be fascinated with space exploration, nuclear power, 'electronic brains', and so on. It's hard to convey that zeitgeist today. Some sense of it comes through in Homer Hickam's wonderfully evocative Rocket Boys; another avenue is simply to peruse the advertisements in the magazines of the era. It's hard to beat the effect of being led out into your backyard by your father to stare at a satellite tracing its path across the night sky (I am guessing it was an Echo satellite, as we could see it without binoculars or telescope).
In retrospect, I suppose our household would have been classified 'lower' middle-class as well as 'non-intellectual': my father was an auto mechanic, my mother a homemaker and nurse; both worked damn hard to make ends meet, so for this and other reasons there were no music lessons, foreign languages, philosophical discussions, or great books. But there were books everywhere in the house, more than enough to stimulate my lifelong love of reading and learning. I recall the excitement each week when my mother brought home a new volume of The Golden Book Encyclopedia, which was being sold as some sort of promotion through the A&P grocery stores. And I can still picture the cover of what I think was Volume 4, with its weird green background (now confirmed through the magic of Google, see photo to the right) and an image of a chemist holding a test tube up to his eye. Right then, in first grade, I knew that's what I wanted to do, and the subsequent childhood deviations (pilot, astronaut, engineer, architect) from that path were never very large.
I was careful to put 'non-intellectual' in quotation marks above. To be sure, my Golden Book Encyclopedia was a far cry from Julian Schwinger systematically working his way through the Encyclopedia Britannica and then the New York Public Library. But I was still very proud when I used my Golden Book knowledge when I was 5 or 6 to explain to my father's friends what the differential was on a car and why you needed it. This tight coupling between "book learning" and understanding the real world around us is precisely the intellectual engagement which so fascinates me and which drove me to experimental physics. I know that I would be a much better experimentalist if I had 1% of my father's (trade school derived) mastery of mechanics, electricity, plumbing, heating and cooling systems, etc., and as a result I have a real antipathy towards those who use a very narrow definition of 'intellectual' to establish some sort of academic hierarchy (see also Chanda's post on the silly genre prejudices about 'fundamental' physics versus more applied studies).
The hard work of my parents meant that we were able to live in solidly middle class suburbs (of Milwaukee, Wisconsin), and therefore take advantage of a fairly good public school system. Given that I was oriented towards science from a very early age, the most likely perturbation would have been away from this state (after all, what fraction of high school graduates become practicing scientists?), but I encountered very few bad teachers along the way. Somehow in first grade I was directed towards a book on how rockets work, which tried to explain action and reaction by what happens when you squeeze a bar of soap and it squirts out of your hands. Hardly the best explanation, but even in questioning it I was learning something. In second grade I found a wonderful book of science experiments you can do (I recall it was from UNESCO; I bet this is it). I now know it confused air drag with air pressure, so it wasn't perfect, but following its instructions to study evaporation or make a working barometer was fascinating. I think it was this same book that also led to my first safety violation in experimental physics: In describing how sound propagates in dense materials, it noted that American Indians were able to detect oncoming trains many miles (kilometers) away by placing their ears directly on the railroad track. Well, on my walk to school I crossed a railroad track, so I had to do the experiment, and this was duly reported to the teacher by a classmate; the teacher wrote a very concerned note to my parents (my protestations of scientific persecution notwithstanding). Her mistake was giving the note to me to bring home; my mistake was throwing it in the ditch... Somehow the subsequent fallout from these cascading bad decisions didn't deter me from continuing my pursuit of science.
Other than this isolated incident, the pattern of encouragement from these 'ordinary' schools continued. A 4th grade teacher who introduced me to different number systems. A 5th grade teacher who gave me a 'programmed learning' book on rudimentary set theory. A 6th grade teacher who taught me about logarithms and how to use a slide rule. A shop teacher in 9th grade who taught us about phase in AC circuits (really! And I still remember his description of the vectors in the R and L directions as being like horses of different strength; the resultant is the direction the wagon is going to go...).
I know Chanda and others mentioned the role of science fiction (in particular Robert Heinlein) in expanding their horizons; this was also true for me (one example: Heinlein's oft-repeated recommendation of Calculus Made Easy). I'm sure that it was also through Heinlein that I first became aware of some place called Caltech, which quickly became the place I wanted to go to college. This became even more true after I attended an NSF-sponsored summer institute in nuclear physics at MacMurray College, where I was introduced to the Feynman Lectures on Physics. I was so grateful when my parents let me purchase my own copy from the University of Wisconsin bookstore, especially given the extravagant price at the time ($9.95 for volumes I and II, $8.50 for volume III; these same very tattered hardbound volumes are within an arm's length as I type).
In reading Yidun Wan's post I was struck by his statement "So again, I have to say: 'I simply followed my destiny.'" I often feel that I have done the same, but in the sense that a light ray being bent by a lens is following its 'destiny'. That is, a very definite principle of least action on my part has combined with some extraordinary luck to position me where I am today. I applied to only three undergraduate schools (Caltech, Reed and Wisconsin), only one(!) graduate school, did not have to actively seek my first postdoc, and was asked to apply for the Columbia position I currently hold. At each of the stops along this path I was the beneficiary of some extraordinary mentoring (Caltech - Tom Tombrello; Berkeley - Ken Crowe; Pennsylvania - Sherman Frankel; Columbia - Shoji Nagamiya).
At the same time, it would be unwise for me to minimize the enormous amount of hard work required for a life in science. If you are simply a careerist, then there are far easier ways to make much more money. You have to have the sort of personality that 'forces' you to invest the time (sitzfleisch) because there is something you really want to know about Nature. There is ample evidence that this discovery process must be the most addictive substance around - in a typical career you get only a few 'hits' of the drug, but they are easily enough to keep you going. In my case, I would estimate one or two such hits on my thesis, one as a postdoc and perhaps two as a junior faculty member. The incredible thing about my time on the PHENIX experiment at RHIC is that the hits just keep coming. It's been an honor and a privilege to be associated with such an enterprise, and I can't wait to see what's in our next data set.
The superficial words I've written here perhaps convey a trajectory of extraordinary luck, beginning with a stable childhood located in the sweet spot of the space age. Indeed that's how I think of my career (so far). And of course the very pinnacle of the space age was July 20, 1969, when Neil Armstrong stepped out of the Lunar Module and spoke his immortal words. I vividly recall watching this on a small black-and-white television outdoors on a boat dock in Ontario, and taking immense pride in the praise of our Canadian hosts "You Yanks really did it right". Yet that joy was tempered with enormous grief, as my brother Mike, two years my junior, avid rocketeer and constant companion in tinkering, model building and general boyhood carousing, was not there to share in the moment, having been killed in a traffic accident just a month previous. I am not one for (public) introspection, and initially did not intend to include these remarks. But after some thought I decided that if I was to take advantage of Bee's hospitality and honestly describe those influences that led me to physics, I should avoid selection effects and include all those things that impact on the human condition. While it's clear to me that I had been interested in science and mathematics from my earliest days, it was not until Mike's death that I realized it was time to take this thing called school seriously in order to reach the opportunities that were out there. Only now do I realize (thinking about things as a parent) how difficult it must have been just two years later for my parents to send me off to a school they had never heard of 2000 miles from home. I take some comfort in knowing that the presence of my younger brother Steve at home was a great source of comfort to them as their prodigal son wandered through academia to his current position.
I already mentioned taking advantage of Bee's hospitality, and don't wish to further abuse the reader's interest, so I will avoid rambling on about many other thoughts that preoccupy me (the status of heavy ion physics, why I became an experimentalist, working in large collaborations, academic prejudice against alternative career paths, etc.) But no account of why I am who I am today would be complete without acknowledging the tremendous support of my loving wife Mary, and the joy we take from our sons Thomas and Kevin. I am hardly a role model for parent of the year, but I find inspiration from the existence proofs out there of those who combine a life in science with exceptional parenting. It is possible, and it provides a wonderful and essential balance to the intensity of scientific investigation.
Bill Zajc is a physics professor at Columbia University. He received his BS in physics from the California Institute of Technology in 1975, and his PhD in physics from the University of California at Berkeley in 1982. From 1997 to 2006 he was the spokesperson of the PHENIX experiment at RHIC. Since stepping down as spokesperson last December, he has been using his newly-found free time to answer the mail messages he has been neglecting for the previous decade.
See also the previous contributions to the inspiration-series by
and my related guest post at Asymptotia 'Sabine Hossenfelder: My Inspiration' (also available as pdf-file).
TAGS: PHYSICS, PHYSICISTS
Today is the birthday of Henri Poincaré. Physicists know the Poincaré group of translations and Lorentz transformations, and the Poincaré conjecture about the topology of 3-spheres became widely known last year as the one Millennium Problem that has been proved, by a reclusive Russian named Perelman.
But the man behind these concepts is probably not as well known as he deserves to be, considering that he contributed enormously to diverse areas of mathematics and physics. He is better known in France, where he was born on April 29, 1854, in Nancy in Lorraine, but beware: If you see a place or street in France called Poincaré, it is most probably named after his cousin Raymond Poincaré, who was prime minister and president of the French Republic in 1913-1920.
In physics, Henri Poincaré is most famous for his contributions to the three-body problem, and, of course, to the theory of the electron and the special theory of relativity.
Poincaré discussing with Marie Curie at the 1911 Solvay Congress, while Einstein stands behind. (Source: Solvay Congress 1911)
It is not so easy today to form an unbiased opinion of what Poincaré achieved with respect to relativity, and to give a fair tribute to his and Einstein's respective work and results. That's in part because his original papers about the special theory of relativity are not easily available - I had been searching for a long time before finding some scanned copies to get a glimpse of his 1905 paper, "Sur la dynamique de l'electron", Comptes Rendus 140, 1504-8, and his 1906 paper "Sur la dynamique de l'electron", Rendiconti del Circolo matematico di Palermo 21, 129-176.
It seems that although Poincaré stated a version of the principle of relativity, understood the problems involving simultaneity, formulated the group property of the Lorentz transformations, and postulated the invariance of the laws of physics with respect to different inertial frames, he stayed convinced that all this was a consequence of the detailed dynamics of matter in the rest frame of the ether. A good description of the current understanding of this issue by a historian of science is The Mystery of the Einstein Poincaré Connection, by Olivier Darrigol in Isis 95 (2004) 614–626. Alas, his birthday is too short, even using apparent time, to read all these papers, or the book by Galison, Einstein's Clocks, Poincaré's Maps, which discusses all these topics and contains a biographical sketch of Henri Poincaré...
TAGS: Henri Poincaré, physics, relativity
Saturday, April 28, 2007
For example, I had never heard before of John Backus, until I read his obituary in this week's Nature (subscription required). He was the creator of FORTRAN, the first high-level programming language. He passed away on March 17, 2007, at the age of 82. There have been several obituaries published before, e.g. in the New York Times, The Sydney Morning Herald, or the Guardian.
Backus tried, quite unsuccessfully, to become a chemical engineer and a medical doctor, before he found his vocation in mathematics. Around the time he was finishing his Master's degree at Columbia University, he visited a showroom exhibition of IBM, where the company presented its latest computers. He talked to some of the representatives there, and at the end of the day, he had a job at IBM.
At that time, in the early 1950s, IBM had started to build and sell computers in series. While hardware was thus more easily available - at least for big universities and companies, or governmental institutions - the problem of programming these computers became critical. Programs had to be written in machine code, in the series of zeros and ones the computer could understand, and this was a tedious and error-prone task, which required much training and could be done only by experts.
The IBM 704 computer room at Lawrence Livermore, October 1956. The FORTRAN compiler was developed by John Backus to write programs for this computer. (Source: Lawrence Livermore National Laboratory, Historical Computer Photos)
So, Backus had the idea to develop a program that would be able to translate some formal description of an algorithm into the right series of zeros and ones, the machine code the computer would operate with. This was how FORTRAN, the FORmula TRANslator, came into being. The FORTRAN compiler was released for general use with the IBM 704 computer just 50 years ago, in April 1957.
The title page of the IBM 704 Fortran Manual with an autograph by John Backus himself and the names of the people who worked on the first Fortran compiler. (Source: www.fortran.com)
By some curious coincidence, I never actually used FORTRAN for scientific computing. I had landed on an island of C/C++, while everyone else around me was constantly relying on FORTRAN for running simulations or coding small programs to find a quick numerical solution to some mathematical problem. I only realised that it had some funny formal features, for example with respect to the line length, which gave me some headaches when I once tried to write a small program in FORTRAN. But once you realise that, actually, this compiler is a reliable tool from the stone age of electronic computing, you probably can be tolerant of such idiosyncrasies.
More about Backus can be found in the IBM Archives.
Here is an interesting text about The history of the FORTRAN programming language.
Tags: John Backus, FORTRAN
Friday, April 27, 2007
I have a friend (who shall remain unnamed, but you occasionally find him in the comment section...) who has the habit of always arriving late, by at least 15 minutes. He is the truest academic that I know, and the 15 minutes are a part of his personality.
Here at PI, coming late to a seminar or a meeting is significantly tougher, because your BlackBerry beeps relentlessly and reminds you of your slackness (that's the true reason why we get one). But still, it is possible to be late. The easiest way to achieve it is to make a detour via the kitchen to grab a coffee. (You must know that our executive director Howard Burton made 'the importance of coffee and good food' really clear to the architects, and so it became a priority for the building, as you can read here.)
If that coffee detour doesn't take enough time, it is advisable to take the elevator, which - as everybody will confirm - is the slowest elevator in the whole world. But this too is of course not a bug but a feature, and belongs to the design of the building: the intention is to support communication among researchers in 'an abundance of natural light' and so on and so forth.
Indeed, even with the BlackBerry we are pretty successful with the delay tactics, I'd say. In fact, hardly any seminar or colloquium starts on time. But actually, I can't recall any institute or department for theoretical physics where this was the case...
... except back in Germany, where we'd always start late - right on time. It is called the 'academic quarter' (das Akademische Viertel) and it's announced with a c.t. after the seminar time - an abbreviation for 'cum tempore', the Latin expression for 'with time'. That is, 3 c.t. actually means 3:15. The exceptions are meetings announced with s.t., meaning 'sine tempore', Latin for 'without time'. I found this nice photo on Wikipedia which shows a plaque at Lund University (Sweden) announcing the entry into the realm of academia
For me, one of the most obvious differences between Europe and North America is how much faster time seems to run over here. Most of the time, I am not in time, but without time, even if not outside time.
And now I've successfully killed time. Weekend is close, and on Friday we have our wine and cheese... (the importance of good food, you know).
A nice weekend to all of you, hopefully one 'with time' :-)
Aside: My lesson of the day is that Google has something called SafeSearch that you should turn off before you expect to find any hits for a term containing 'cum'. But if you do so, you'll find a truly astonishing amount of Latin texts.
Wednesday, April 25, 2007
Therefore, the lunch remark of the day is: did you know that today is Wolfgang Pauli's birthday?
Wolfgang Pauli was born on April 25th, 1900 in Vienna. After receiving his early education in Vienna, he studied at the University of Munich under Arnold Sommerfeld. He obtained his doctor's degree in 1921 and spent a year at the University of Göttingen as assistant to Max Born and a further year with Niels Bohr at Copenhagen. (What I always liked most about quantum mechanics is that I know how to pronounce all the names of the people.)
Pauli is most famous for the exclusion principle which states that identical fermions (like electrons) can not occupy the same state. Thus, fermionic stuff can not clump together arbitrarily, and has an inherent stiffness. Among other things, the Pauli exclusion principle explains why electrons form nice shells around the atom core instead of all sitting in the lowest level, thus explaining the variety of chemical elements.
Wolfgang Pauli received the Nobel Prize in 1945 "for the discovery of the Exclusion Principle, also called the Pauli Principle".
There are a lot of entertaining stories around Pauli, which is why he makes a good lunch topic. Among other things, he was known for spoiling experiments by simply being present in the room, an effect that was dubbed the 'Pauli-effect'. Allegedly, Otto Stern even banned Pauli from his laboratory to avoid the Pauli-effect, despite their friendship.
Wolfgang Pauli was also known for ruthlessly criticising the work of his colleagues, which is the origin of the famous quotation
"That is not only not right, it is not even wrong!"
("Das ist nicht nur nicht richtig, es ist nicht einmal falsch!")
Now let me publish this post... just in time for lunch. Have a toast to Pauli!
Monday, April 23, 2007
The nabla symbol is used in maths (and physics of course) to denote a differential operator. It was introduced by Hamilton around 1837. Its name apparently goes back to a joke by Maxwell. According to Wikipedia, W. Thomson wrote in 1884:
"I took the liberty of asking Professor Bell whether he had a name for this symbol and he has mentioned to me nabla, a humorous suggestion of Maxwell's. It is the name of an Egyptian harp, which was of that shape"
I am kind of glad he didn't suggest using the Greek name 'psaltery', as I admittedly have no idea how to pronounce it. You might be interested to hear though that it makes an appearance in the Bible, Psalm 33:2
"Rejoice in the LORD, O ye righteous: for praise is comely for the upright.
Praise the LORD with harp: sing unto him with the psaltery and an instrument of ten strings. "
The word 'operator' is a very sophisticated expression for a thing that assigns things to things. The telephone operator, for example, assigns incoming calls to the desired connection. Its correct mathematical notation is that of a map from one set to another.
An operator can be almost everything. Your kid who never tidies up is an operator that assigns toys to places in your living room. If you buy tickets for the opera, the online booking system is an operator that assigns seats to the audience.
A differential operator specifically acts on functions by differentiating them. The nabla, for example, when applied to a scalar field, gives the gradient of that field. If you think about the scalar field as an altitude in a mountain range, then the gradient points in the direction where the increase is steepest.
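To make the mountain-range picture concrete, here is a minimal numerical sketch (the particular field and grid are just illustrative choices) that approximates the gradient of a scalar field by finite differences:

```python
import numpy as np

# A scalar field f(x, y) = x^2 + 3y, sampled on a grid
x = np.linspace(0.0, 1.0, 101)
y = np.linspace(0.0, 1.0, 101)
X, Y = np.meshgrid(x, y, indexing="ij")
f = X**2 + 3 * Y

# np.gradient approximates the nabla by central differences
# and returns one array per direction: (df/dx, df/dy)
dfdx, dfdy = np.gradient(f, x, y)

# Analytically df/dx = 2x and df/dy = 3; at the grid point
# x = y = 0.5 the numerical values agree to rounding error
print(dfdx[50, 50], dfdy[50, 50])  # ~1.0 and ~3.0
```

The vector (dfdx, dfdy) at each point is the direction of steepest increase - the 'uphill' direction in the mountain-range picture.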
Operators are the core concept of quantum mechanics. Quantities that in classical theories are functions, like the position or energy of an object, become operators. To make something useful out of them, they now have to act on a function - that being the purpose of an operator. In quantum mechanics, it is the well-known wave-function that they act on.
But the usefulness of the operator concept is that one can deal with them on their own without applying them all the time. It's a bit like replacing 'classical' money with a credit card. If you want to see something 'real' you have to 'apply' it to an ATM to get cash. Most often the result is quantized, say, you can only get multiples of $10 or so. You also typically have an offset, a smallest possible amount that you can get. But in most cases, you are fine dealing with the card itself. You have to be a bit careful though if you use it together with other cards, say the club card (payback card, member card, VIP card, whatever) from your local grocery store. For your total, it matters in which order you present them at the register. We say that the operators don't commute: the result depends on the order of use.
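The order-dependence is easy to see with two small matrices standing in for operators - a toy illustration (these happen to be Pauli matrices, not the position and momentum operators, which need infinite-dimensional spaces):

```python
import numpy as np

# Two 'operators' represented as 2x2 matrices
A = np.array([[0, 1], [1, 0]])   # sigma_x
B = np.array([[1, 0], [0, -1]])  # sigma_z

# The two products differ:
# A @ B is [[0, -1], [1, 0]], while B @ A is [[0, 1], [-1, 0]]
print(A @ B)
print(B @ A)

# The commutator [A, B] = AB - BA measures the difference;
# for commuting operators it would be the zero matrix
commutator = A @ B - B @ A
print(commutator)  # equals [[0, -2], [2, 0]], not zero
```

Applying the two operators in different orders really gives different results, which is exactly the credit-card-plus-club-card situation above.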
The nabla is essentially the operator that, when acting on the wave-function, gives the momentum. That is, up to a constant - in this case a relevant constant. But this may be the subject of another post.
TAGS: PHYSICS, NABLA, MATHEMATICS, OPERATOR
Saturday, April 21, 2007
Anyway, I only knew about that because the recent bee-problem was pointed out to me already last summer, I think by Lubos (I couldn't find the comment though). What is new this April is that the bee-problem got a catchy name - colony collapse disorder (already on Wikipedia) - and is blamed on mobile phones by The Independent: Are mobile phones wiping out our bees?. They base their argument on the results of a study by German researchers at Landau University:
"Now a limited study at Landau University has found that bees refuse to return to their hives when mobile phones are placed nearby. Dr Jochen Kuhn, who carried it out, said this could provide a "hint" to a possible cause."
The word 'limited' already sounds very suspicious, and indeed the link is more than weak. For one, it is extremely implausible that this effect should have set in rather suddenly last summer. Mobile phone networks have gradually been extended over a far longer period of time. Also, it is very unlikely that such a mobile-phone disorientation 'spreads out' as the disorder seems to have done from the USA to Europe; this doesn't make sense at all:
"The alarm was first sounded last autumn, but has now hit half of all American states. The West Coast is thought to have lost 60 per cent of its commercial bee population, with 70 per cent missing on the East Coast.
CCD has since spread to Germany, Switzerland, Spain, Portugal, Italy and Greece. And last week John Chapple, one of London's biggest bee-keepers, announced that 23 of his 40 hives have been abruptly abandoned."
But more importantly, mobile phones are rarely placed in bee hives. If one tries to find out what the researchers from Landau did, it turns out they indeed placed the 'base station of a mobile phone' directly in the bee hive. Now, if you don't speak German, let me be precise here: the German word commonly used for the English 'mobile phone' (or cell phone) is not 'Mobiltelefon', but 'Handy' (also written 'Händi' in best Denglish). The device the researchers used is not a cell phone, but a cordless home phone. What was placed directly in the bee hive was the base station of that phone.
A very concise description of their experiment is available online. Unfortunately, it is (except for the abstract) in German. But even if you don't speak German, look at the pictures and illustrations:
Verhaltensänderung der Honigbiene Apis mellifera unter elektromagnetischer Exposition (Behavioral changes of the honey bee Apis mellifera under electromagnetic exposure)
There is a follow-up article in Spiegel which tries to clarify the misinterpretation of the researchers' results (again in German, unfortunately): Werden Bienen tot telefoniert? (Are Bees Phoned to Death?), which also quotes Prof. Jürgen Tautz, bee researcher at the University of Würzburg: "I am sure: a healthy, not stressed bee colony will not be affected by cell phone networks."
Ah - I just realized this wasn't even the reason for me writing. No, I meant to comment on the quotation that has been attributed to Albert Einstein in this context:
“If the bee disappeared off the surface of the globe then man would only have four years of life left. No more bees, no more pollination, no more plants, no more animals, no more man.”
(„Wenn die Biene von der Erde verschwindet, dann hat der Mensch nur noch vier Jahre zu leben; keine Bienen mehr, keine Bestäubung mehr, keine Pflanzen mehr, keine Tiere mehr, keine Menschen mehr.“)
If one does a Google search on 'Einstein Bees' it gives 981,000 results today (two days ago it was only 893,000).
I've read a lot of Einstein stuff, and I can't recall that I ever came across anything remotely like this. Nowhere could I find a source for this alleged quotation. It seems it goes back to this article by Walter Haefeker (see last paragraph), but there it ends without a reference. It is not listed in any book of Einstein quotations. Various other people have pointed out that this quotation is most likely made up, or at least completely unconfirmed; see e.g. here, here, or here.
Update April 21st: See also Lubos' post.
Update April 29th: Gelf Magazine has an article about the alleged Einstein quotation titled 'Albert Einstein, Ecologist?' which confirms my doubts about its authenticity:
Roni Grosz, curator of the Albert Einstein Archives of the Hebrew University in Jerusalem, tells Gelf, "There is no proof of Einstein ever having said or written it." While Grosz notes that it is extremely difficult to disprove a quote, he "could not remember even one reference to bees in Einstein's writings."
TAGS: BEES, EINSTEIN, HONEY BEES, CCD
Friday, April 20, 2007
Frühling läßt sein blaues Band
Wieder flattern durch die Lüfte
Süße, wohlbekannte Düfte
Streifen ahnungsvoll das Land
Veilchen träumen schon,
Wollen balde kommen
Horch, von fern ein leiser Harfenton!
Frühling, ja du bist's!
Dich hab ich vernommen!
~ Eduard Mörike
(Spring lets its blue ribbon flutter through the breezes again; sweet, well-known scents drift, full of promise, across the land. Violets are dreaming already, wanting soon to arrive. Hark, from afar a soft harp tone! Spring, yes, it is you! It is you I have heard!)
Thursday, April 19, 2007
Blackberry the gadget is, of course, the first wireless handheld organizer that was able to receive and send email. It's an invention of the Waterloo, Ontario, based company Research in Motion, and earned them a lot of money. So much, indeed, that RIM president and co-chair Mike Lazaridis could afford to donate $50 million to the University of Waterloo to help establish the Institute for Quantum Computing, and $100 million to establish the Perimeter Institute for Theoretical Physics.
That's the reason why scientists working at the Perimeter Institute get equipped with one of these gadgets once they take up their positions there. And from my somewhat limited experience, these BlackBerries can be quite invasive as far as your way of living is concerned, since it may happen that you are constantly disturbed by beeps signaling newly incoming emails you have to read immediately ;-).
Which brings me back to blackberry, the fruit. Take, for example, this thicket of blackberries:
It is, actually, at the edge of the garden of my parents' house. The shrubs provide great fruits in August, and they make a great, impenetrable fence against your neighbours. However, if you do not take care, your lawn is invaded without mercy by this species, and soon you cannot access the apple trees anymore. It gives you a really cool impression of the hedge around the castle of Dornröschen (Sleeping Beauty). So, it's a good idea to dig out the blackberry shrubs entering the garden from time to time, and to make sure to leave no roots behind... That's actually how I spent last weekend, when I visited my mother and she had already prepared quite a long list of things to do in the garden...
The lesson from this is clear: Blackberries are delicious, and BlackBerries are great. You just should make sure they don't take over your garden :-)
Figure Credits: Wikipedia
TAGS: BlackBerries, blackberries
Wednesday, April 18, 2007
To promote spring feelings, 3Dchem has chosen SEX to be the molecule of the month for March 2007: you can play around with different positions using their nice Java applet.
According to the NICNAS fact-sheet, everything that contains more than 20% SEX is hazardous:
"Under normal conditions (20°C), there is enough moisture in the air to cause SEX to form carbon disulphide, a highly flammable, toxic gas that is readily absorbed through the skin. [...] It is poisonous but there is a lack of information about its health effects.
SEX is readily absorbed by the skin and is a skin and eye irritant [...].
Signs of high exposure are dizziness, tremors, difficulty breathing, blurred vision [...]. It causes severe skin, eye and respiratory irritation."
So, be careful...
TAGS: CHEMISTRY, SEX
Monday, April 16, 2007
After a week or so, I gave up. It then occurred to me that maybe one or another reader of our blog understands more about the technical details, so here is what I found out, along with a lot of open questions.
The issue is how wireless power for home use could work. That is, we are typically talking about distances of some meters or so, over which we want to transmit energy to power technological devices (say, your cellphone), preferably without roasting every human in the room.
To set the stage: energy can of course be transmitted without wires. There are generally two ways this can work:
One is to use electromagnetic radiation. Your microwave does that; actually, every light bulb does. The problem with this kind of energy transport over useful distances is that you either have a sender that broadcasts the energy in all directions and a receiver that picks up only a small part of it, which means a lot of energy is simply lost; or you focus the sender's radiation on the receiver, which means that for moving targets you have to track them. (The microwave oven is a box that avoids radiation loss by reflection, and you don't want to sit in it.)
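To get a feeling for how bad omnidirectional broadcasting is, here is a minimal back-of-envelope sketch. The antenna size and distance are made-up illustrative numbers, not taken from any of the articles:

```python
import math

def captured_fraction(receiver_area_cm2, distance_m):
    """Fraction of isotropically broadcast power hitting a receiver.

    Assumes the receiver is a small flat patch facing the sender;
    the broadcast power spreads over a sphere of radius distance_m.
    """
    sphere_area_cm2 = 4 * math.pi * (distance_m * 100) ** 2
    return receiver_area_cm2 / sphere_area_cm2

# A 10 cm^2 antenna (roughly cellphone-sized) at 3 m distance:
frac = captured_fraction(10, 3)
print(f"captured fraction: {frac:.2e}")  # ~ 1e-5, i.e. more than 99.99% of the power misses
```

The inverse-square spreading is the whole problem: the captured fraction falls with the square of the distance, so "useful distances" and "isotropic broadcast" do not go together.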
The other possibility is inductive coupling. In this case, energy is transferred from sender to receiver via the very near field, and the radiation loss is negligible. This technique works pretty well and very efficiently, and is in fact used to recharge many devices. The problem is that it only works over really short distances (say, a centimeter or so).
Neither of these two ways sounds very useful for wireless power at home. Now, while I was scanning through these articles, I found out that there are currently two ideas on the market that rely on different schemes.
A) The first has been proposed by a group of physicists from MIT: Marin Soljacic (assistant professor of physics), Aristeidis Karalis, and John Joannopoulos (professor of physics). They have a paper on the arXiv about it: physics/0611063. I could also find these slides from a talk one of them (M. Soljacic) gave. There is a news article on BBC, and the story has been echoed with slight alterations, e.g. here, here, or here.
B) The other is the technology used by a company called Powercast. They have a website that you can find here. It is almost devoid of any information. They have a form that you can fill out, and they send you some PDF files. These again hardly contain any information (except some general explanation of the allowed limits on power density). If you don't want to fill out the form, the PDFs are here: 1,2,3,4. They apparently presented their device at the Consumer Electronics Show 2007, see e.g. here, here, here, or here (they are all more or less identical).
Let me first comment on A: The paper proposes an 'efficient wireless non-radiative mid-range energy transfer'. Efficient means there is little energy loss. Mid-range means it potentially operates on the distances we are interested in. Non-radiative means it doesn't use radiation. The idea they build upon is a resonance effect. You know that from your car: your engine causes vibrations, and if you accelerate, their frequency changes. If the frequency of the vibration coincides with the resonance frequency of some parts in your car (say, the CDs in the glove box), they will also start to vibrate. This does extract energy from the engine, but for the car it is a totally negligible (though annoying) effect.
Now the technique of A proposes to use a sender and receiver system that are resonant objects. In addition, they state that they do not use the radiation field for this. Note that a field can very well be time-dependent (oscillating) without actually having an energy flow to far distances. For the sender and receiver they consider two examples: disks and loops. The paper hardly contains any calculation; it seems there isn't very much one can do analytically with these boundary conditions. But electrodynamics can be treated numerically without too many complications, so that's what they have done. Figure 2 from physics/0611063 shows what that field would look like for the two disks (sorry, I removed the figure for copyright reasons, see update note below).
Also interesting is Figure 6, which shows how the field gets distorted by a wall (to the right), and by a sample (the square) that simulates a human (again, sorry, I removed the displayed figure, please check the paper). In both cases, there is not too much distortion, which is really promising. As they write, 'the system performance deteriorates [...] only by acceptably small amounts'.
I was wondering why the 'human' isn't placed between the disks, wouldn't you too? But I have to say I find it actually reasonable. As long as you don't hit the resonance frequency you probably don't distort the field too much. A human body is very unlikely to contain resonating parts. Though I wouldn't want to have a pacemaker in such a room. More generally, one should ask how other technological (and metal) objects affect the field. Also, what does the field look like if sender and receiver are not parallel to each other?
Okay, I thought, great - so far I can make sense of this. But this is the static configuration, in which case there is not really a sender or receiver; it's just two coupled systems. To describe the realistic situation, one disk (loop) has to be the source that 'powers' the other one. The figure above is perfectly symmetric, so it can't describe this situation.
In particular the question is: where is the energy flow localized in that case? I mean, energy is a locally conserved quantity; it has to get from the sender to the receiver somehow. My naive guess would have been that it takes place in the cylindrical region of space whose end caps are the sender and receiver - this intuition relying on the simple fact that photons like to travel in straight lines. I was looking for something like the Poynting vector of that field, but couldn't find anything. The only 'explanation' I found was from Howstuffworks, which says:
"Electricity, traveling along an electromagnetic wave, can tunnel from one coil to the other as long as they both have the same resonant frequency. The effect is similar to the way one vibrating trumpet can cause another to vibrate."
This, excuse me, is simply bullshit. We're talking about a classical system; energy doesn't just 'tunnel' somewhere. The vibrating trumpet transfers its energy via air molecules - try it without air, you'll see. (You also find the tunnel-explanation in the BBC News article.) In this article at physorg, you will find the statement 'Most of the energy not picked up by a receiver would be reabsorbed by the emitter', which doesn't make sense to me either. Photons don't just turn around and fly back if they were not absorbed.
The reason why this puzzles me is that their paper considers as an example a 'useful extracted power' (p. 15) of 10 W (p. 16). Now I would expect this power to be transferred between the two loops, that is, for a diameter of 30 cm it is distributed over a surface of roughly 1000 cm2. Distributing the power over a surface that large does of course significantly lower the power density relative to that of a cable. But one still finds a power density of 10 mW/cm2 (the FCC limit, e.g. in the frequency range 30-300 MHz, is 0.2 mW/cm2). One can of course make the loops larger, just that - if you ask me - already a diameter of 30 cm doesn't appear very handy.
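For what it's worth, this back-of-envelope estimate is easy to redo in a few lines (a sketch; the 10 W, 30 cm, and 0.2 mW/cm2 numbers are those quoted above, using the actual disk area rather than the rounded 1000 cm2):

```python
import math

extracted_power_W = 10.0       # 'useful extracted power' from the paper
loop_diameter_cm = 30.0        # loop diameter considered in the paper
area_cm2 = math.pi * (loop_diameter_cm / 2) ** 2   # ~707 cm^2

density_mW_cm2 = extracted_power_W * 1000 / area_cm2
fcc_limit_mW_cm2 = 0.2         # FCC limit in the 30-300 MHz range

print(f"power density: {density_mW_cm2:.1f} mW/cm^2")   # ~14 mW/cm^2
print(f"over the FCC limit by a factor of ~{density_mW_cm2 / fcc_limit_mW_cm2:.0f}")
```

So even with the generous assumption that the flow spreads evenly over the whole loop area, the density comes out almost two orders of magnitude above the limit.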
In addition to this, I have tried to recall how these resonance effects work. In the symmetric configuration (neither of the two is a source), there is a phase shift between the oscillations of the two coupled systems, and energy is transferred periodically from one to the other. On average, this does not lead to an energy flow. In case energy is 'used' on one side, an average flow will take place. However, the energy of the total field is typically significantly larger than the fraction that is transferred. I am not sure I understand all the details, but it seems to me that the extracted energy is indeed only a small fraction of the total field. Then the power density of the total field is even larger than the above estimate, even though a large part of it does not lead to an effective energy flow, but just goes back and forth.
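This periodic back-and-forth between two weakly coupled resonators is the standard textbook two-pendulum phenomenon; a minimal sketch (arbitrary numbers, not the field configuration of the paper) shows the energy envelope sloshing completely from one oscillator to the other over a beat period:

```python
import math

m, k, kappa = 1.0, 1.0, 0.05             # mass, spring constant, weak coupling
w_low = math.sqrt(k / m)                  # in-phase normal mode
w_high = math.sqrt((k + 2 * kappa) / m)   # out-of-phase normal mode
delta = w_high - w_low

def envelopes(t):
    """Amplitude envelopes of the two oscillators.

    Initial condition: all displacement in oscillator 1, both at rest.
    Then x1(t) = cos(w_avg*t)*cos(delta*t/2) and x2(t) = sin(w_avg*t)*sin(delta*t/2),
    so the slowly varying envelopes are |cos(delta*t/2)| and |sin(delta*t/2)|.
    """
    return abs(math.cos(delta * t / 2)), abs(math.sin(delta * t / 2))

T_beat = 2 * math.pi / delta
print(envelopes(0))           # (1.0, 0.0): all amplitude in oscillator 1
print(envelopes(T_beat / 2))  # (~0.0, ~1.0): amplitude has moved to oscillator 2
```

Without extraction, this exchange averages to zero net flow, which is exactly why only the small 'used' fraction corresponds to an average energy transfer.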
I looked at the slides from the talk, and it seems to me that the configurations examined there indeed have a source and a receiver. But since I didn't hear the talk, I am not sure, and again I couldn't find anything about the energy transfer. So I wrote an email to the guy who posted the paper on the arXiv, Aristeidis Karalis. He kindly explained: "Think of two penduli connected with a spring. If you move one, energy will be moved to the other and then back and so on. The energy stays in the system and does not leak out. It just jumps from one to the other back and forth." I am not sure I can make more sense out of 'jump' than out of 'tunnel'.
I repeated my question about where the energy flow takes place, but it seems I exhausted his patience at some point (well, I know, I can be really annoying). Also interesting is what he wrote regarding my question why the human sample wasn't placed between the disks:
"The system of dielectric disks is more affected from extraneous objects than the system of loops. I initially made calculations for the 'human' between the two disks, and the numbers were still viable but worse. Therefore, I chose the positioning presented in the paper, because for application where humans are present most probably the loops would be used, while for applications where disks would be used (e.g. optical regime) the materials have much smaller indices and losses."
But as I said above, I actually believe that a human wouldn't make a big distortion because it's unlikely to hit the resonance frequency.
I then looked up who's who in this photo, and I thought maybe it would be more helpful to ask Dr. Marin Soljacic. So I wrote him an email, but he didn't reply - at least not yet. And this is where the story ends.
So let me summarize what I think about A. If there really is very little energy loss, then it seems to me this energy flow has to take place around the axis between sender and receiver, roughly distributed over a surface of their diameter. If the efficiency is indeed almost independent of the sender's and receiver's relative positions and orientations, this means it works somewhat like an automatic tracking mechanism. The problem then is that the energy density between sender and receiver shouldn't be too high. (Or you'd want to make sure you don't get in the way.) I am not too good with numbers (famous for losing factors of 10^6 or so). So I don't know: given that the limits on the power density are fulfilled, how long would it take to charge the average device?
Now to B: In essence the idea is to use a broadcaster that operates in the 900 MHz range with acceptably small power density. They call it the 'omnidirectional power beacon', and it 'will recharge devices within about a 1-meter range' [source]. This energy can be received by a device they call the 'power harvester'. Since there are constraints on the allowed power density, the field cannot be too strong, which means one can only use it to power really small devices. As they say:
"We have a technology that's here today, with FCC approval, that sends RF signals through the air to power very low- power devices directly or to recharge battery-powered devices," said Powercast vice president Keith Kressin. "Our wireless systems can recharge batteries in any consumer device smaller than a cell phone, from up to a meter away." [source]
It seems they actually have working products, and gave a demonstration that impressed many people last month. I have to say, though, I don't think I would want to work in an office where LED lights start glowing with energy they extract from the radiation field around me - no matter if you tell me the limits are compatible with what the government demands.
"You can forget their orientation, forget the use of coils; just watch the LED get brighter the closer you place your device to the Powercaster."
This technology seems to have been developed mainly by Dr. Marlin Mickle and collaborators in Pittsburgh. I checked some of his publications to find out how efficient this power transfer would be. I found a lot of technical details about antennas, but not what I was looking for (I could not access all of the papers). If you do better than I did, please let me know.
Besides my feeling uneasy about sitting in that power-transmitting field, my problem with B is that I am afraid there might be a considerable loss into radiation. In particular, contrary to what this article from the Alternative Consumer says, this can hardly be very 'green'. The Alternative Consumer essentially repeats Powercast's information sheet, which says nice things like 'Powercast Reduces High-tech Waste' because 'Continuous recharging of batteries via the Powercast Wireless Power Platform has the potential to reduce the huge waste stream of batteries to a mere trickle'. Indeed, instead of rechargeable batteries you then use the power beacon and harvester, and instead of transmitting power with negligible loss via a cable you radiate it generously into your apartment, where most of it goes bye-bye to outer space.
In addition, I wonder what happens if the guy in the apartment below me installs a 'power beacon' in his ceiling. And my neighbor to the right. And to the left...
To summarize: I wouldn't buy either A or B.
Update, April 17th: Yesterday, I sent an email to one of the authors of physics/0611063, Aristeidis Karalis, asking whether it was okay for me to display the figures from the paper. He replied that the paper is in the publication process and asked me to remove the figures. I was kind of afraid that would happen. So, I am sorry for the inconvenience, but you'll have to look at the PDF file.
Footnote 1: You can fill in the address fields with x, it works. They sent only the requested files and no spam.
Footnote 2: The use of the expression 'tunneling' is most likely due to a misunderstanding. The electromagnetic field configuration of the proposed system makes use of the Whispering Gallery modes, which have an exponentially decaying tail. As a solution of the wave equation, this is similar to the tunnel effect in quantum mechanics - just that in electrodynamics the amplitude is that of the electromagnetic field and not, as in quantum mechanics, a probability amplitude. The typical 'tunnel effect', in which a particle 'jumps' through a classically forbidden region, has nothing to do with the above described resonance.
Footnote 3: As Stefan pointed out in this comment, the ratio between the transferred power and that of the total field is of order 1000, which means the total power density would be in the kilowatt range - far above the allowed FCC limits.
Footnote 4: Yes, I checked the junk folder. I find it entirely possible that PI's highly efficient filter discards MIT senders as spam.
Footnote 5: As my husband just taught me, with a power of 1.5 W it takes 2 hours to charge a common battery of type AA. That is, the transmitted power they considered is quite realistic for applications, and going below it makes the scenario considerably less appealing. To meet the limits on the power density, you either have to wait 100 hours, or increase the diameter of the loops to a meter or so.
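A quick check of the numbers in Footnote 5 (a sketch; the 10 mW/cm2 figure is the rough power density estimate from further up in the post):

```python
import math

battery_Wh = 1.5 * 2            # 1.5 W for 2 hours ~ one AA charge (~3 Wh)
density_limit = 0.2             # FCC limit, mW/cm^2
density_scheme = 10.0           # rough estimate for 10 W over 30 cm loops, mW/cm^2

factor = density_scheme / density_limit   # power must drop by ~50x ...
print(f"charging time at the limit: {2 * factor:.0f} h")   # ~100 hours

# ... or the loop area must grow by ~50x, i.e. the diameter by sqrt(50):
new_diameter_cm = 30 * math.sqrt(factor)
print(f"required loop diameter: {new_diameter_cm / 100:.1f} m")   # ~2 m, i.e. of order a meter
```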
TAGS: WIRELESS POWER, ENERGY, PHYSICS
Sunday, April 15, 2007
Leonhard Euler (1707 - 1783) in a 1753 portrait by Emanuel Handmann
(Öffentliche Kunstsammlung Basel, via MacTutor)
Euler has left his imprint on all branches of mathematics, and created some new ones, such as graph theory. He also contributed to physics, astronomy, and engineering - Wikipedia has a list of about 50 topics that are named after him. Ed Sandifer, in the February 2007 issue of his monthly column "How Euler Did It", presents Euler's greatest hits in mathematics. Wow - so much stuff you have probably heard about: the polyhedral formula, the Königsberg bridge problem, the product formula, the Euler-Lagrange equations... Lubos has a nice post on the relevance of many of Euler's discoveries for string theory.
And, of course, there is the famous formula
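- presumably the identity meant here, which in modern notation reads:

```latex
e^{i\pi} + 1 = 0
```

a single line connecting e, i, π, 1, and 0.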
about which whole books have been written. When I was thinking about something a tiny bit original to post about Euler, I thought I might try to trace the origin of this formula in Euler's writings. Mathworld gives as the source of the full Euler identity
page 104 of Introductio in Analysin Infinitorum, Vol. 1. Lausanne, 1748. Now, this can be searched for in the Euler Archive, and we find it as entry E101: the Introduction to the Analysis of the Infinite, volume 1, where "Euler lays the foundations of modern mathematical analysis". The original text is available online from gallica.fr, and here is what we read on page 104:
There it is: From this can be seen how imaginary exponential quantities are reduced to sines and cosines of real arguments. It is
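In modern notation (Euler himself wrote √-1 rather than i), the relation stated there is:

```latex
e^{ix} = \cos x + i \sin x
```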
It's written in a now old-fashioned notation - in fact, the notation "i" was introduced only in 1777 by, guess whom, Euler - and in an even more old-fashioned language - but what it says is timeless!
There are several web sites commemorating Euler's birthday, for example at the Mathematical Association of America, the Euler Society, and the Leonhard Euler Tercentenary - Basel 2007.
The Euler biographies at Wikipedia and MacTutor have much information and many interesting links. If you want to read a book about Euler's life and time, Leonhard Euler by Emil A. Fellmann has not too much maths and no technical details at all - but it is a very readable biography, has a good choice of illustrations, and conveys a lively picture of 18th-century life in science and mathematics. And there is even a comic about Euler.
TAGS: Leonhard Euler, Imaginary Numbers
Saturday, April 14, 2007
Still, I was really running behind with preparing the talk. The problem was that once I realized I could actually talk about whatever I wanted, I could not decide what to talk about. Then I recalled that what I found most exciting as a student was not all the things that were known, but the open problems left for us to explore. So I thought that, instead of talking about all the things we know, I would - somewhat unusually for a scientific talk - focus on what we don't know, and on where the frontiers of our knowledge currently lie. Thus the title of the talk:
Frontiers of our Knowledge
Abstract: Theoretical and experimental physics work hand in hand to broaden our understanding of the universe that we live in and of man's place in the world. In the 21st century, nature has given us quite some puzzles to solve, in the microscopic (particle physics) as well as the macroscopic (cosmology) range. These open questions at the threshold of the unknown have led theoretical physicists to formulate possible solutions whose experimental tests are awaited soon. I will talk about these current limits to our knowledge, and about the insights that new experiments like the Large Hadron Collider can provide us with. A central point will be the possibility of large extra dimensions and black hole production at the LHC.
If you want to classify my current state of mind, I'd say I'm a high energy physicist trying to become a cosmologist. I think that in the near future more interest will shift from particle physics towards cosmology, which I find a tremendously exciting area right now. So in my talk I wanted to cover both cosmology and particle physics, especially the areas where they overlap, e.g. dark matter searches, and what that has to do with the 'big' questions: Where do we come from? What are we made of? (Why am I spending my Saturday at work?)
Well, at least that was the idea. But I have never before given a talk about cosmology (I don't even know what an 'erg' is - luckily, nobody asked). I was really kind of nervous (in addition, you should know that this lecture starts at 10 am, and I hadn't had any coffee because, due to some problem with the key cards, I couldn't get into my office).
PI's public outreach program is organized by Damian Pope, who told me the format is rather casual and the physics knowledge of the audience often pretty mixed. So I thought it best not to use too many equations, but to really explain every detail (be honest: usually you don't do that, since everybody has seen this figure a million times). Bruno gave a very nice introduction, and I watched the large seminar room filling up with people. You're not going to believe it, but soon all seats were taken. In fact, after 15 minutes, Damian asked me to interrupt my talk so we could move into the big lecture hall. As I said to Bruno: I think I'm in the wrong movie.
As you can guess, my timing for the talk was a complete disaster. I had to skip the biggest part of the second half, and I had to promise I would put the slides online, so here they are:
Frontiers of our Knowledge (Powerpoint Presentation, ~20 MB)
I am afraid the file is rather large because it contains a lot of photos. I have thrown out the movies that I showed; you can download them here:
I believe in recycling, so part of what I told today is based on posts I have written here, in particular Dark Matter, The World's Largest Microscope, Anomalous Alignments in the Cosmic Microwave Background (piecewise), Micro Black Holes and Extra Dimensions.
People had a lot of questions. The best question was without doubt: Do you know what caused the big bang? It was a good question, because it didn't ask what caused it, but if I know it. It happens only rarely that I can clearly answer a question with yes or, in this case, with no.
If I find the time, I will write a summary of the talk sometime next week; if I do, you will find it here.
All in all, giving the lecture was a great experience. In fact, I volunteered to do it again...
Friday, April 13, 2007
Einstein on the beach in Santa Barbara, 1933 (?) (Santa Barbara Historical Society, and Caltech Archives)
It seems that Isaacson's book is strong on Einstein's time in the US, but I am not sure I want to take the time to read through another 700+ page Einstein biography. I found Jürgen Neffe's recent book a very good read, with a well-balanced mix of the man and the science, taking into account the now-available sources about Einstein's personal life. An English translation is about to appear as "Einstein: A Biography". And for a comprehensive and authoritative exposition of Einstein's scientific work, I still know of no match for Abraham Pais' "Subtle is the Lord".
Einstein with astronomer Charles St. John at the Mt. Wilson Observatory in 1931, examining the apparatus for the (unsuccessful) measurement of the gravitational redshift in the solar spectrum. (The Observatories of the Carnegie Institution of Washington, in: Centennial History of the Carnegie Institution of Washington, p. 142)
Anyway, the TIME magazine excerpt of Walter Isaacson's book is accompanied by a sort of "Einstein FAQ": 20 Things You Need to Know About Einstein, answering questions from "Was Einstein a slow learner as a child?" (he was slow in learning how to speak) and "Did Einstein flunk math?" ("I never failed in mathematics"), over "Why did it take so long for Einstein to get a Nobel Prize?" (a longer story), to "Was Einstein disillusioned at the end?" (no, he wasn't).
It is definitely worth a click, if only for the wonderful collection of photos, most of which are not those that one usually sees!
TAGS: Albert Einstein, Biography
Thursday, April 12, 2007
From the comment section:
Roberto: The unidirectional time in the equations of physics is constructed from the original, circular time.
(now that explains everything)
Guest: does time really exist? Not in the least. Time is a construct. Created by mankind to assert ourselves over the other dimensions. We have freed ourself from the laws of physics and created a new prison, time.
(Gee, I didn't even notice we freed ourselves from the laws of physics. That's what can happen if you live in a small city in west Ontario - someone could have told me!)
Guest: if you ever get the opportunity, use DMT (dimethyltryptamine). you will see far beyond the illusion of time and realize many things that cannot be expained.
Guest: Time is an invention of convenience. Kind of like x^^0 = 1 .
(Aside: this is not an invention of convenience but a consequence of smoothness)
Guest: It's a tautological question. Time is what's measured by a clock, just as intelligence is what's measured by an IQ test.
(Comes back to the point that the question, as formulated, is essentially empty.)
Eric: Human sentience requires this illusion to give us the illusion of free will.
Robert: there is no such thing as time, instead it should be thought of as the cycle of mater and energy from light to total entropy to highly energized quantum materials which all together form the macro world
(that's what happens if you 'think horizontally')
Pierre: If we are going to think about time, shouldn't we first ask ourselves "are thoughts real?".
(I link therefore I am).
Jeff M: It doesn't matter - the question is moot.
(Thanks to Jeff M for enriching my vocabulary, I agree. Will try to use 'moot' from now on instead of 'nonsense'.)
pachorazy: if we change our perception of time can we change how fast we age?
(and how do you measure 'fast' if not with 'time'?)
Stuart: I suppose that it does not matter. Time is how we perceive it. What it may theoretically be has no effect on how it IS for us. So... it is what it is.
letibenson: Time is an allusion and here is why-- our congress can tell us when we need to set our clocks forward or back by an hour. If time was real it would be out of there hands
tculp: If we break down the question, we have an initial question - what is illusion?
Time flies like an arrow, fruit flies like a banana. (From the book "Instant Physics")
Alex: Time is a man-made idea, it is a construct of human society used to make life more convenient and easy.
Gary: It might be time to go back to the Greek philosophical definition of the universe and work our way forward again.
(have fun, say hello if you arrive in the 21st century)
Robert: the fascination that many people have these days with wanting to believe that everything happens all at once, there really is no time.
JD Bailey: Time is not an illusion. Time is a multidimensional relationship.
Jim: Life is illusion and time comes from life.
Ric: One of the secrets of TIME, which I have realized, is that TIME itself is the constant we call “infinity”.
Erika: And what about killing time?
Jana: Time is fake. Have a nice day.:)
TAGS: TIME, ILLUSION
Tuesday, April 10, 2007
Glendale, CA— March 13, 2007 – DreamWorks Animation SKG, Inc. (NYSE:DWA) announced today its intention to produce all of its films in stereoscopic 3D technology starting in 2009. To best take advantage of the technology, the company will now be creating films utilizing stereoscopic 3D from the beginning of its creative process.
“I believe that this is the greatest opportunity for movies and for the theatrical exhibition business that has come along in 30 years,” said Jeffrey Katzenberg, Chief Executive Officer of DreamWorks Animation.
The first time I saw a movie in 3D was in Disney World Florida, some time in the last century. I vividly recall the movie had an underwater scene with a shock moment where a large shark came in from the right, and the guy to my left hit me in the face trying to hold off the shark.
The principle of stereoscopic 3D is essentially the same as with the red-green glasses. To get a three-dimensional image, our eyes need two pictures from slightly different angles, one for each eye. When printed or projected on a screen, these two images are overlaid, and they have to be separated again before they reach the eyes.
In the red-green (or sometimes red-blue) version, one of the images is displayed in red and the other in green. The filters in the glasses allow only one image to enter each eye. Unfortunately, you cannot really have a color movie when color is used to provide the separation.
A better method is to project the two views onto the screen with a different polarization of the light, and to use glasses that filter out one polarization for each eye.
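The color-filter separation is easy to sketch in code. Here is a minimal composer for the red-cyan variant of that scheme (a sketch assuming numpy; the image data is synthetic):

```python
import numpy as np

def make_anaglyph(left_rgb, right_rgb):
    """Combine a left/right stereo pair into a red-cyan anaglyph.

    left_rgb, right_rgb: uint8 arrays of shape (H, W, 3).
    The red channel carries the left view, green and blue the right view,
    so red/cyan filter glasses separate the two images again.
    """
    out = np.empty_like(left_rgb)
    out[..., 0] = left_rgb[..., 0]      # red channel <- left eye's view
    out[..., 1:] = right_rgb[..., 1:]   # green, blue <- right eye's view
    return out

# Tiny synthetic example: a bright 2x2 'left' frame and a dark 'right' frame.
left = np.full((2, 2, 3), 200, dtype=np.uint8)
right = np.full((2, 2, 3), 50, dtype=np.uint8)
ana = make_anaglyph(left, right)
print(ana[0, 0])   # [200  50  50]: left view in red, right view in cyan
```

Polarization-based projection avoids exactly the drawback this makes visible: here the color channels themselves are sacrificed to carry the eye separation.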
I have to admit I never really liked these 3D movies; they hardly had any plot, and the scenes were never really convincing. It will be interesting to see what DreamWorks Animation makes of this technology.
And don't miss the trailer for the Bee Movie (In Theaters November 2, 2007).