The Internet has brought severe challenges for journalism, and especially for science journalism. The vast majority of private websites are financed through advertisements, which are impossible to miss. Exceptions are publicly funded governmental or educational institutions, and rare cases financed through donations, like Wikipedia. Advertisements are more profitable the more visitors a website has, which puts a major incentive on popularity. Though this incentive has always been present, it is much more pronounced today than it was with a clientele of subscribers, and the breathlessness of infotainment, with its emphasis on novelty, contributes its part. The trend of print newspapers has thus been to cut back on the length of reports, to make them increasingly simplistic, and to provide additional web content in an effort to adapt to the changing demands of their customers.
This, however, has not sufficed to keep newspapers financially healthy. Reporting on the results of a recent survey among newspaper executives, researchers of the Project for Excellence in Journalism summarize that the newspaper of today “has fewer pages than three years ago, the paper stock is thinner, and the stories are shorter. There is less foreign and national news, less space devoted to science, the arts, features and a range of specialized subjects”. Well over half (59%) of the 259 newspapers participating in the survey have reduced full-time newsroom staff over the past three years, mainly because of financial pressures. Roughly the same number (61%) also reported a decrease in the space available for stories. 46% of survey respondents said that the resources devoted to covering international affairs dropped within the last three years, 41% reported a drop for national politics, and 24% for science reporting.
While this development is of general concern, it is particularly worrisome for science reporting, where attention to detail, background knowledge, and accuracy are essential. The quality of information is relevant for citizens to make decisions, and it should thus be in our prime interest. The problem underlying this erosion of newspapers' substance (both in budget and content) lies in the link between personal interests and the resulting overall trend, a classical case of public choice. We have gotten used to information being provided for free, and to all the advantages and amenities connected to it. We consider it a public service. If this information were provided at the actual cost it causes, many people would likely not pay the price, thus eroding the basis of our democracies. Free information is desirable to keep our societies functioning well. The problem is that its provision is done by people who need to eat and sleep. Consequently, they should be financed as providers of a public service, either by governmental subsidies, or as tax-free non-profit organizations.
This discussion is overdue, and I was thus glad to see that Swensen and Schmidt picked up the question of alternative financing models in their recent NYT article “News You Can Endow”.
Related: Do we need science journalists?, When capitalism fails and Fact or fiction.
Friday, February 27, 2009
Thursday, February 26, 2009
What does the American Dream mean to you?
I am presently reading Jeremy Rifkin's book “The European Dream,” which is quite an interesting comparison between the United States of America and the “United States of Europe,” as he calls it. It is as much about the American Dream as about the dreamers and their European ancestors. The book is, however, full of sweeping generalizations. Though I can find some truth in most of what he writes, I find it hard to swallow statements like
“Americans covet exclusive space. Each person strives to be self-contained and autonomous. That's why we put a premium on privacy. Europeans seek inclusive space - being part of extended communities, including family, kin, ethnic and class affiliation. Privacy is less important than engagement.”
Which doesn't make much sense if I consider that Microsoft ran into problems with European privacy rights where Americans didn't care. This morning a complete stranger told me her friend Gerald just had a triple bypass some days before his 70th birthday. Not that I asked. Maybe it's just me, but these things happen to me constantly on this side of the Atlantic. I have yet to find a German who'd tell a random seat neighbor on a plane about her daughter's affair with the pharmacist. So much for the Americans' desire for privacy.
Another example:
“We Americans [...] if we can afford it, we'd much prefer to place our home at the very top of a hill, and at a distance from our nearest neighbors, afford us a daily reminder of our autonomy.”
That must be why millions of Americans have gathered in mega-cities like LA, Chicago and New York.
That being said, I am not sure how much to trust Rifkin's elaborations on the American Dream either. Thus, I thought I'd pass on the question: what does the American Dream mean to you?
Wednesday, February 25, 2009
Did you know... (VII)
... the origin of the word “travel”?
It goes back to the Old French word travail “suffering or painful effort, trouble” (12c.), from travailler “to toil, labor,” originally “to trouble, torture,” from Vulgar Latin tripaliare “to torture,” from tripalium (in Literary Latin trepalium) “instrument of torture.”
Source: The Online Etymology Dictionary
Yep.
Next time you fly from the East Coast to the West Coast, suffering from cheap airline coffee and dull movies, try to imagine you'd have taken the trip 200 years ago...
See also: Did you know...
... the opposite of eloquent? ... why the toast is called toast? ... what the Baconian method is? ... why Google is called Google?
Tuesday, February 24, 2009
Three Years of Backreaction
A while ago, Bee wrote a post about Risky Black Holes.
This seems to be a timeless topic – it could have been written today. It was three years ago, and it started this blog.
Happy Birthday, Backreaction!
Monday, February 23, 2009
Book Review: Naomi Klein "The Shock Doctrine"
The Shock Doctrine - The Rise of Disaster Capitalism
By Naomi Klein
Metropolitan Books (September 18, 2007)
In late 2007, I read an article in Harper's Magazine titled “Disaster Capitalism” which, well written and vividly argued, left an impression. I Googled the author, Naomi Klein, and was led to her website announcing the new book, of which the Harper's article had offered a glimpse. I watched a truly appalling promotional video, and reminded myself that the author probably wasn't responsible for the advertisements before ordering the book.
Summary
In “The Shock Doctrine,” Naomi Klein puts forward the thesis that, worldwide and over decades, shocks have been used to push through unpopular free market decisions, mostly privatization and deregulation, generally against the will of the people but always to the advantage of large corporations, the wealthy upper class, and corrupt governments. Shocks might be natural disasters, wars, terrorist attacks, or economic turmoil. The book is a collection of well researched and documented examples, ranging from Bolivia and Chile through Poland, Iraq, China, the UK, Russia, Iraq again, Israel, South Africa, Sri Lanka, and the Maldives to New Orleans.
The book begins by drawing parallels to the shock therapy of the 50s and 60s, the attempt to reset the human mind by whatever means into an infantile state, a “clean slate” onto which a new beginning could be imprinted. Klein reports how these insights were later used for purposes of torture all over the world.
Throughout the book, Klein traces the actions of Milton Friedman and his “Chicago Boys,” who provided the ideological and allegedly scientific backup for operations in the course of which hundreds of thousands of people were murdered, died, or committed suicide. (Jeffrey Sachs makes an appearance in several instances.) If her claim that a shocked nation is a desirable state in which to perform unpopular free market reforms feels as far-fetched to you as it did to me, it is actually quite well documented in many instances, and one doesn't have to look very far to find that this option was considered quite appropriate as a means to what some considered progress.
Over the decades, Klein argues, the ideology spread, packed into the wrapper that a free market maximizes social welfare, filled with a creamy myth of trickle-down. Extreme measures were more easily put into place in tyrannies, but found their way into democratic systems as well, through cloak-and-dagger operations, through exerting economic pressure, or just by corruption, all in the midst of states of confusion and shock:
“And that is how the crusade that Friedman began managed to survive the dreaded transition to democracy - not by its proponents persuading electorates of the wisdom of their world view, but by moving deftly from crisis to crisis, expertly exploiting the desperation of economic emergencies to push through policies that would tie the hands of fragile new democracies. Once the tactic was perfected, opportunities just seemed to multiply.”
Particularly shocking for me has been to learn about the role of the International Monetary Fund (IMF) in these dealings. In many cases, support has been tied to demands for privatization and deregulation, clearly interferences with nations' autonomy and often with their political landscape.
Naomi Klein traces the historical route to our present day, when a large and increasing sector of our global economy has specialized in disaster relief and security, up to the point that devastating news actually scores as good news on the stock market. Her account of which private companies made billion after billion from the war in Iraq and Hurricane Katrina, all with taxpayers' money (or rather, by increasing the government's debt), is sobering, especially in the face of how incompetently these tasks were performed (or paid for but not performed at all).
She finishes the book with examples of where the shock has worn off, especially in South America, where social democracy has put a halt to the selling off of countries and the destruction of communities, and established a new autonomy of these nations independent of the IMF. The main message I think she wants to get across is to be prepared for when a shock hits, for we can most easily be exploited when taken by surprise.
Comments
The book is an interesting read and an impressive collection of facts, quotations, and data. I can't help, however, finding Klein's account very one-sided. As meticulously as she has collected evidence in favour of her thesis, I did not get the impression she looked as carefully for evidence against it. Her narrative is compelling in its simplicity and certainly has some truth to it, but it lacks many qualifiers.
Most importantly, her arguments are targeted, directly and by name, at recommendations out of the pocketbook of neoclassical economics. But what she is actually criticising is not the free market or the drawbacks of particular regulations, but corporatism and corruption (not to mention torture). In fact, in the introduction she writes:
“A more accurate term for a system that erases the boundaries between Big Government and Big Business is not liberal, conservative or capitalist but corporatist. Its main characteristics are huge transfers of public wealth to private hands, often accompanied by exploding debt, an ever widening chasm between the dazzling rich and the disposable poor, and an aggressive nationalism that justifies bottomless spending on security. For those inside the bubble of extreme wealth created by such an arrangement, there can be no more profitable way to organize a society.”
She then, however, fails to explain why privatization and deregulation must necessarily be tied to the circumstances she is bemoaning, and hence why she constantly stabs at Friedman and the Chicago School. Instead of providing an argument for why this connection should hold, she just offers example after example. I don't find this very insightful, as I would have wanted to know what could have been done better, and why.
The book was recently turned into a documentary by directors Michael Winterbottom and Matt Whitecross, who make an effort to point out that it is not a conspiracy theory. Indeed, it is not. In no sense does Klein give the impression that these events were deliberately planned. Instead, the picture she paints is one in which ideological convictions have gone awry and people in power neglected to pay attention to reality - or just did not live up to their own expectations. As she writes repeatedly, there is no way in which the human mind or a country can be cleansed of its history to provide the perfect platform on which to build a flawless utopia from scratch. This striving for perfection is an illusion, and one in whose pursuit comes suffering. To add my own perspective: it is less a conspiracy and more a system failure. Most importantly, the system fails to correct its own problems, since those who hold the power have no incentive to do so.
Besides this, the book is nicely written and reads very well. It is, however, very repetitive, and fuzzy in articulating its main claims and conclusions. As far as I am concerned, I would have been fine with the first 100 pages and then a collection of facts and data. I really don't need to be told the story of the evil Friedmanians fifty times. And for my taste, she gives way too much room to the history of shock therapy treatment and its applications for torture. I don't doubt that these concepts were picked up by economists. Such cross-fertilization between different fields is interesting (the concept of a 'landscape' being picked up by physicists is a similar example), but I didn't want to buy a book on torture or electroshock treatments and would have appreciated fewer details on that matter.
Altogether, if this were an Amazon review, I'd give it three stars.
Saturday, February 21, 2009
International Mother Language Day - February 21
Today is the "International Mother Language Day", which has been proclaimed by the General Conference of UNESCO ... to promote linguistic and cultural diversity and multilingualism. You may be aware of the issue from our recent post What is the world coming to?.
German radio added drama this morning by reporting that some German dialects such as Kölsch, Bairisch (Bavarian), and Letzeburgisch are in danger of extinction - a bit of an exaggeration, as you can check out on the "UNESCO Interactive Atlas of the World's Languages in Danger" which lists eg Bavarian as merely "unsafe" (and let me tell you, if you don't know Bavarian, it is definitely unsafe to use it).
Anyway, our blog works quite well because we and you, dear readers and commenters, can communicate without major accidents in one common language, English (or so we hope). But this is not our native tongue, and maybe it isn't yours either.
Looking at the visitor statistics (see below) of Backreaction, it seems that more than 75% of all readers come from English-speaking countries - but this actually just reflects the "language settings" of the visiting web browsers. It doesn't necessarily say much about the mother language of the real people using these browsers (Bee, e.g., has her settings on US English).
So, to celebrate International Mother Language Day and to learn more about the background and diversity of our readers, we invite you to leave in the comments a few greetings in your actual mother tongue - and maybe you can add what language it actually is :-)
Willkommen bei "Backreaction" am Internationalen Tag der Muttersprache! Das ist heute die Gelegenheit, hier ein paar freundliche Worte in Eurer Muttersprache zu hinterlassen. Wir freuen uns auf Eure Kommentare!
(Don't understand German? Try the translation by Google!)
Thursday, February 19, 2009
Evidence for the Black Hole Event Horizon
Yesterday we had a very nice colloquium by Ramesh Narayan from Harvard.
You can find the recording at PIRSA: 09020024.
Black hole formation is a prediction of General Relativity (GR). We know that stars with masses more than a few times the solar mass cannot, once their nuclear fuel is burned out, stabilize at a finite radius: the gravitational pressure of their own mass will cause them to completely collapse. In this process, the density of the object increases, and the gravitational force on the surface gets stronger. If the gravitational force gets so strong that not even light can escape, we call the corresponding surface an event horizon. It is the characteristic feature of black holes. Classically, nothing can ever leave the region behind the event horizon.
Since the early 90s, evidence has mounted for astrophysical black holes. These come in two rough categories: stellar-mass black holes, with masses of a few times the mass of our sun, which form directly from the collapse of stars, and the so-called supermassive black holes, with masses about a million to a billion times the solar mass, which form through accretion in densely populated regions, mostly in the centers of galaxies.
The cheap way to label an object a black hole is to measure its mass (e.g. from the motions of nearby stars) and its radius (e.g. by determining the source area of its emission). For a black hole, we know the relation between both: R = 2GM/c², where G is the gravitational constant, c is the speed of light, R is the radius, and M is the mass. The radius of a black hole of about a solar mass would be roughly 3 km, and that of a supermassive black hole consequently some million to billion km. If one has data that allows one to estimate mass and radius, and there is too much mass in the observed region of spacetime, one can conclude it has to be a black hole. (Keep in mind this is astrophysics, so observables typically have large error bars and it takes some effort to pin down conclusions.)
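For illustration, here is a quick back-of-the-envelope check of these numbers in Python (my addition, not part of the original post):

```python
# Schwarzschild radius R = 2GM/c^2 for a solar-mass and a supermassive black hole.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_sun = 1.989e30   # solar mass, kg

def schwarzschild_radius(mass_kg):
    """Radius of the event horizon (in meters) for a given mass."""
    return 2 * G * mass_kg / c**2

print(schwarzschild_radius(M_sun) / 1e3)        # ~3 km for one solar mass
print(schwarzschild_radius(1e9 * M_sun) / 1e3)  # ~3 billion km for 10^9 solar masses
```

As expected, the horizon radius scales linearly with the mass.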
This is however somewhat unsatisfactory. What one would really like to know is whether the object does have an event horizon, which is the defining feature of a black hole. The question is then, what observables can help us to determine whether we are dealing with a compact object that has a surface, or with an object that has an event horizon?
First, let me emphasize that compact objects of the masses we are concerned with here, with a radius close to but not quite the radius of a black hole, are not possible in GR: such objects cannot be stabilized. But if one modifies GR, one can get away with this. People have looked into such modifications, but they are not very convincing options. The reason is simple: To avoid collapse, one needs a mechanism that stabilizes matter at a density at which the matter just barely does not form a black hole. That is, one needs a deviation from the standard theory at densities of about M/R³, and inserting the black hole radius, this goes as ~1/M². This means the more massive the black hole, the smaller the density at which you need deviations from the standard theory.
And this density can be arbitrarily small. It can be as small as densities we deal with every day. Take a supermassive black hole of 10⁸ times the mass of the sun, which has a radius of about 3×10⁸ km. Its mean density, the mass divided by the volume inside the horizon, comes out at roughly 10³ kg/m³, which is about the density of water. Not exactly a very extreme condition, and one that we have quite some experience with. From Einstein's field equations we further know the density scales like the background curvature. This means if you want to generally avoid the formation of black holes, you need modifications of GR in the arbitrarily small curvature regime. In this regime, the theory is extremely well tested, and we have not seen any deviations whatsoever.
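To make this concrete, here is a small sketch (again my addition, not from the talk) that computes the mean density inside the horizon and checks the 1/M² scaling mentioned above:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_sun = 1.989e30   # solar mass, kg

def mean_density(mass_kg):
    """Mean density inside the Schwarzschild radius, in kg/m^3."""
    r = 2 * G * mass_kg / c**2
    return mass_kg / (4.0 / 3.0 * math.pi * r**3)

# A 1e8 solar mass black hole has a mean density of order 1e3 kg/m^3,
# i.e. of order the density of water.
print(mean_density(1e8 * M_sun))

# The density falls as 1/M^2: doubling the mass quarters the mean density.
print(mean_density(2e8 * M_sun) / mean_density(1e8 * M_sun))  # ~0.25
```

The second print confirms the scaling argument: since R grows linearly with M, the mean density M/R³ goes as 1/M².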
But still, one would like to have observational evidence for the presence of the horizon (after all, it could be a naked singularity, no?). The key to this is to compare the emissions of an object that has a surface with those of an object that does not. Astrophysical black holes accrete matter, and that matter heats up, which leads to emissions. When the accreted matter hits a surface, this also leads to emissions, which - for compact astrophysical objects - can be violent nuclear explosions. An object with an event horizon, on the other hand, will have no contribution to the emitted radiation from a surface. The two will thus differ in their luminosity, which is observable.
In his talk, Narayan summarized the observations of the luminosity both of solar mass black hole candidates in our galaxy and of Sgr A*, the supermassive black hole in the center of our galaxy. In both cases, the observed emission is much smaller than would be expected if the object had a surface, which is thus clear evidence for the presence of an event horizon.
Related: Coincidentally, Moshe just today wrote a nice post on Frozen Stars.
Talk: Evidence for the Black Hole Event Horizon
Abstract: Astronomers have discovered many candidate black holes in the universe and have studied their properties in ever-increasing detail. Over the last decade, a few groups have developed observational tests for the presence of event horizons in candidate black holes. The talk will discuss one of these tests, which indicates that the supermassive black hole at the center of our Galaxy must have a horizon.
You can find the recording at PIRSA: 09020024.
Black hole formation is a prediction of General Relativity (GR). We know that stars that have masses more than a few times the solar mass can not, once their nuclear power is burned out, stabilize at a finite radius and the gravitational pressure of their own mass will cause them to completely collapse. In this process, the density of the object increases, and the gravitational force on the surface gets stronger. If the gravitational force on the surface gets so strong not even light can escape, we call this surface an event horizon. It is the characteristic feature of black holes. Classically, nothing can ever leave the region behind the event horizon.
Since the early 90s, evidence has mounted for astrophysical black holes. These come in two rough categories: solar size black holes, with masses of a few times the mass of our sun that form directly from collapse of stars, and the so-called supermassive black holes, with masses about a million to a billion times the solar mass, that form through accretion in densely populated areas, mostly in the center of galaxies.
The cheap way to label an object a black hole is to measure its mass (eg from the motions of nearby stars) and its radius (eg by determining the source area of its emission). For a black hole, we know the relation between the two: R = 2GM/c², where G is the gravitational constant, c is the speed of light, R is the radius, and M is the mass. The radius of a black hole of about a solar mass would be roughly 3 km, and that of a supermassive black hole is consequently some million to billion kilometers. If one has data that allows one to estimate mass and radius, and there is too much mass in the observed region of spacetime, one can conclude it has to be a black hole. (Keep in mind this is astrophysics, so observables typically have large error bars and it takes some effort to pin down conclusions.)
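To make the numbers concrete, here is a minimal sketch (my own, not from the talk) that evaluates the relation R = 2GM/c² for a solar-mass and a 10⁹ solar-mass black hole; the constants are standard SI values.

```python
# Evaluate the horizon radius R = 2GM/c^2 quoted above.

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg

def schwarzschild_radius_km(mass_kg):
    """Event-horizon radius of a non-rotating black hole, in km."""
    return 2.0 * G * mass_kg / c**2 / 1000.0

r_stellar = schwarzschild_radius_km(M_SUN)      # roughly 3 km
r_smbh = schwarzschild_radius_km(1e9 * M_SUN)   # roughly 3 x 10^9 km
```

Since R is linear in M, scaling the mass up by 10⁹ scales the radius up by the same factor, which is why the supermassive case lands in the billion-kilometer range.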
This is however somewhat unsatisfactory. What one would really like to know is whether the object does have an event horizon, which is the defining feature of a black hole. The question is then, what observables can help us to determine whether we are dealing with a compact object that has a surface, or with an object that has an event horizon?
First let me emphasize that compact objects of the masses we are concerned with here, with a radius close to but not quite that of a black hole, are not possible in GR: such objects cannot be stabilized. If one modifies GR, one can get away with this, and people have looked into such modifications, but these are not very convincing options. The reason is simple: To avoid collapse, one needs a mechanism that stabilizes matter at a density just low enough not to form a black hole. That is, one needs a deviation from the standard theory at densities of about M/R³, and inserting the black hole radius this goes as ~ 1/M². This means the more massive the black hole, the smaller the density at which you need deviations from the standard theory.
And this density can be arbitrarily small. It can be as small as densities we deal with every day. Take a supermassive black hole with 10⁹ times the mass of the sun, which has a radius of about 10⁹ km. This gives a density of about 10³⁹ kg per 10²⁷ km³, or 1 kg per dm³, which is about the density of water. Not exactly a very extreme condition, and one that we have quite some experience with. From Einstein's field equations we further know that the density scales like the background curvature. This means if you want to generally avoid the formation of black holes, you need modifications of GR in the arbitrarily small curvature regime. In this regime, the theory is extremely well tested, and we have not seen any deviations whatsoever.
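The estimate above drops the factor 4π/3 and rounds the radius down; here is a rough sketch (my own estimate, not from the post) that keeps those factors. The mean density of a 10⁹ solar-mass black hole then comes out at a few tens of kg/m³, between air and water, which only strengthens the point that no extreme densities are involved. The code also makes the 1/M² scaling explicit.

```python
# Mean density enclosed by the horizon: rho = M / ((4/3) pi R^3)
# with R = 2GM/c^2, which works out to rho = 3 c^6 / (32 pi G^3 M^2).
import math

G = 6.674e-11      # m^3 kg^-1 s^-2
c = 2.998e8        # m/s
M_SUN = 1.989e30   # kg

def mean_density(mass_kg):
    """Mass divided by the volume of a sphere with the horizon radius, kg/m^3."""
    r = 2.0 * G * mass_kg / c**2
    return mass_kg / (4.0 / 3.0 * math.pi * r**3)

rho_smbh = mean_density(1e9 * M_SUN)   # a few tens of kg/m^3 -- everyday stuff

# the density drops with the square of the mass:
ratio = mean_density(M_SUN) / mean_density(100.0 * M_SUN)   # 100^2 = 1e4
```

The 1/M² scaling is visible directly: making the black hole 100 times heavier lowers the mean density by a factor 10⁴.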
But still, one would like to have observational evidence for the presence of the horizon (after all, it could be a naked singularity, no?). The key is to compare the emissions of an object that has a surface with those of an object that does not. Astrophysical black holes accrete matter, and that matter heats up, which leads to emissions. When the accreted matter hits a surface this too leads to emissions, which can, in the case of compact objects with surfaces such as neutron stars, be violent nuclear explosions. An object with an event horizon, on the other hand, has no contribution to the emitted radiation from a surface. The two thus differ in their luminosity, which is observable.
In his talk, Narayan summarized the observations of the luminosity of both solar mass black holes in our galaxy, and for Sgr A*, the supermassive black hole in the center of our galaxy. In both cases, the observed emission is much smaller than would be expected if the object had a surface, and thus clear evidence for the presence of an event horizon.
Related: Coincidentally, Moshe just today wrote a nice post on Frozen Stars.
Wednesday, February 18, 2009
This and That
Some random things I came across recently and thought you'd enjoy:
- If you still haven't understood what universal health insurance is good for, read this: For Uninsured Young Adults, Do-It-Yourself Health Care. Statistically, this is the healthiest age group...
- Robert has an interesting post about and against refereeing fees. See also our posts Peer Review II, III, IV and V.
- Paul Kedrosky comments on Cheap Credit and Higher Education Tuition. Where is your country going?
- Great Photo
- Quotation of the week:
“If all economists were laid end to end, they would not reach a conclusion.”~ George Bernard Shaw
Sunday, February 15, 2009
Do we need Science Journalists?
Science bloggers and their sometimes troublesome relation to science journalists is a topic that I have come across many times since I started writing this blog. And in many instances I have heard statements of the sort that blogging will render journalism obsolete. Bora's recent post The Shock Value of Science Blogs is a nice example. He writes
“The job of translating Scientese into English (or whatever is the local language) has traditionally been done by professional science journalists. Unfortunately, most science journalists (hats off to the rare and excellent exceptions) are absolutely awful about it. They have learned the journalistic tools, but have no background in science. They think they are educated, but they only really know how to use the language to appear they are educated. Fortunately for everyone, the Web is allowing scientists to speak directly to the public, bypassing, marginalizing and pushing into extinction the entire class of science "journalists" because, after all, most scientists are excellent communicators. And those who are, more and more are starting to use blogs as a platform for such communication.
[I]n science journalism, there exist out there people with real expertise - the scientists themselves - who now have the tools and means to bypass you and make you obsolete because you cannot add any value any more.”
He then bashes around a bit on George Johnson and John Horgan, and ends by saying
“Perhaps if we remove those middle-men and have scientists and the public start talking to each other directly, then we will have the two groups start talking to each other openly, honestly and in an informal language that is non-threatening (and understood as such) by all. The two sides can engage and learn from each other. The people who write ignorant, over-hyping articles, the kinds we bloggers love to debunk (by being able to compare to the actual papers because we have the background) are just making the entire business of science communication muddled and wrong. Please step aside.”

Well, there are two things I have to say about that.
For one, as far as I am concerned most scientists are not particularly good writers (I include myself in that), and since I appreciate a piece of good writing I sincerely hope professional journalism will prevail. Having acquired the necessary skills and an appropriate education certainly helps in these matters. I don't know what Bora's standards are, but I find the vast majority of science blogs not particularly well written (YOU obviously belong to the minority of brilliant writers).
Second, reporting by scientists about their own research is always bound to be biased, and an important task of journalists is to provide an objective outside view. This might not always work out to the scientists' favour. John Horgan eg is certainly known for his cynical view on some branches of science, and it is of little surprise some scientists are put off by this. But such criticism fulfils an important function in disconnecting topics from people who are their direct originators, much like editorials in newspapers are not generally written by politicians.
I am not saying that science journalism presently fulfils this task very well (see eg the earlier posts Fact or Fiction?, and When Capitalism Fails for why this is the case), but it has its place and I think we need it. Science blogs can certainly contribute to communicating science, by providing the details that journalists don't cover - details about the research, or about life as a researcher. But leaving science journalism entirely to bloggers is not a good idea.
Friday, February 13, 2009
Assumptions and Limitations
I recently came across this astonishing quotation, also referred to as “the F-twist”:
“Truly important and significant hypotheses will be found to have “assumptions” that are wildly inaccurate descriptive representations of reality, and, in general, the more significant the theory, the more unrealistic the assumptions (in this sense). The reason is simple. A hypothesis is important if it “explains” much by little, that is, if it abstracts the common and critical elements from the mass of complex and detailed circumstances surrounding the phenomena to be explained and permits valid predictions on the basis of them alone. To be important, therefore, a hypothesis must be descriptively false in its assumptions; it takes account of, and accounts for, none of the many other attendant circumstances, since its very success shows them to be irrelevant for the phenomena to be explained.”
I imagine I'd send this as a reply to a referee report that criticizes my work on the grounds that I might have found a great dark matter candidate, but not only have I assumed unbroken supersymmetry, my model also has four neutrino generations and, oh, only 2 spatial dimensions. Assumptions that indeed qualify as unrealistic and wildly inaccurate. Not to say, bluntly wrong.
But maybe I am being unfair.
Let me guess what Friedman might have wanted to say. The more parsimonious a model, the easier it is to extract relevant features and get an understanding of its behaviour. That does not mean, however, that a model gets better the fewer and more unrealistic its assumptions are. Certainly, the standard model of particle physics would be nicer if all fermions were massless and chiral symmetry was unbroken. Unfortunately, it then doesn't describe Nature. That's why we distinguish between models of the real world and 'toy models' meant as testing grounds to increase our understanding of general features (see earlier post on Models and Theories).
But maybe also this is unfair.
He might have meant to say that a simplifying assumption does not have to be shown appropriate for a certain range of validity, the range in which predictions derived from the assumption can then be made. Instead, one can just see whether the model works and thus justify the assumption a posteriori. Unfortunately, that too is nonsense. If you don't specify the range of validity of your assumptions (typically by showing that the effect of deviations from the assumptions is negligible for the result), your model is not falsifiable and thus not scientific. If you test it and the outcome does not match your predictions, you can just go and say, well, the assumptions were not fulfilled.
Thus, I am afraid, unless you want to redefine what you mean by a scientific theory, this is not a good starting point. One wonders why he felt the need to put “explain” in quotation marks.
See also: Shut up and calculate
Thursday, February 12, 2009
My Computer has Insomnia
I don't sleep well at night. One gets used to it. Over the years I have developed a skill in writing imaginary equations on the ceiling. But last night I noticed I'm not the only one in my apartment who is up at night.
My computer presently sleeps on the couch (also known as 'the ouch' to our long-term readers). That's because I figured if I work in the living room during the winter I have one room less to heat. And electric heating at an outside temperature below -20 °C makes quite an impact on the bill, believe me. I recently acquired a new computer, a Lenovo Thinkpad, after my Dell had developed more and more bugs and one was eventually fatal (don't buy Dell).
I wake up around three in the morning and lie around for a while. “Beep!” I hear from the living room. Funny, I think, is this my computer? After a while I hear “Dschingeling!” Indeed, I think, it must be having some kind of dream. And after some more minutes there comes a “Plong!” which I recognize as an error message. My poor computer is having a nightmare.
So I get up and find my computer is wide awake. It seems it decided, after seven hours or so, to notify me that a program didn't start properly and if I want to send an error report to Microsoft, or maybe restart the program? I hit Ctrl Alt Delete, kill all zombie processes, and put it back to sleep. Not without first switching off the wireless feature. At least so I know it won't talk to other computers if it has a bad dream.
Tuesday, February 10, 2009
Science and Democracy IV
Dennis Overbye recently had a nice opinion piece in the NYT, titled “Elevating Science, Elevating Democracy.” Since we have discussed the topic of Science and Democracy repeatedly on this blog (Part I, Part II, Part III), I thought it worthwhile to comment on this piece (see also Daniel's comment).
Before Overbye spirals off into an elaboration on China's problems, he lays out values that are essential both for science and democracy:
“Those values, among others, are honesty, doubt, respect for evidence, openness, accountability and tolerance and indeed hunger for opposing points of view.”
I agree with him on that, but this is about where the similarities end. In its function, democracy serves an entirely different purpose than science, and it uses a different mechanism to reach its aim.
Democracy, as other forms of government, is a way to take people's opinions and come to a common conclusion about what to do, which eventually reflects in the organization of people's lives. In a monarchy, this process is pretty simple: neglect everybody's opinion except that of the king. In a grassroots democracy you might sum everything up and take the majority opinion. In a representative democracy the process is quite involved. It gets even more complicated due to the constraint that legislation should be self-consistent.
The aim of science on the other hand is not to come to a common conclusion about people's opinions by whatever mechanism. The aim is to come to a common conclusion about Nature. The decisions in the end are not made by scientists, but by the evidence we have gathered, whether we like that or not. In this process, opinions hopefully come to largely agree on some insights that then enter the established body of knowledge. Ideally, the evidence becomes so clear that virtually nobody in his right mind holds differing opinions.
But if you want to know what the scientific opinion is on a matter that has not yet been settled, you are not going to get a reply in unison. (Possibly not even if you ask one single person.) Indeed, if that were the case it would pretty much mean that science is completely dysfunctional. Instead, you might be offered a selection of different approaches with their pros and cons, the present status of research, and the missing pieces of the puzzle. But there is no formal process by which decisions about open questions are made.
In politics too there are certainly questions that are hotly discussed for some period and later become pretty much settled. Think about slavery, women's right to vote, or homosexuals' right to marry. (Well, there are so-called 'civilized' countries that are a bit behind on some of these issues.) But these are questions of opinion, opinions that evidently change over time, and much as you might want to argue otherwise, neither opinion is “wrong” in the scientific sense that it could be falsified by experiment.
What I had been writing about in my earlier posts (eg here or more recently here) is a different aspect of democracy in science, which does not address the question of how a scientific fact becomes established, but the process of knowledge discovery itself. As I have argued many times, the present organization of scientific research leads to an inefficient use of human, financial and time resources. Besides inertia, the dominant reason for this state of affairs to prevail is that scientists have virtually no influence on how the system they operate in is organized. That, sad as it is, corrupts the status of the above quoted values Overbye ranks so highly.
Overbye further has many nice words for scientists - he goes so far as to praise science as “the most successful human activity of all time.” I would have thought the most successful human activity is sex. But maybe I am confusing matters.
Monday, February 09, 2009
Singularities in your Kitchen
When Sabine was preparing her talk about black holes and information loss, we thought about other examples of singularities in physical theories besides the centres of black holes in General Relativity. Somehow, the topic seems to pursue me since then - the current issue of the Scientific American welcomes me with Naked Singularities on the title page, and there is even a newly created Singularity University.
(Photograph of a drop of a mixture of glycerol in water. The diameter of the drop is about 20 mm. The photo on the right shows the neck in detail. From "A Cascade of Structure in a Drop Falling from a Faucet" by X. D. Shi, Michael P. Brenner, and Sidney R. Nagel, Science 265 (1994) 219-222, via jstor.)
But I was fascinated most by what I've learned since then about singularities in fluid dynamics - singularities that actually occur in the kitchen, every time a drop of water falls off the tap.

A singularity in the mathematical formulation of a physical theory means that a variable which represents a physical quantity becomes infinite within a finite time. This is, actually, not that rare a phenomenon in non-linear theories. For example, in General Relativity, Einstein's field equations, when applied to the gravitational collapse of a very massive star, develop infinities in density and curvature at the centre of the system. Another famous example of a non-linear theory is fluid dynamics as described by the Navier-Stokes equations - and this is also a habitat of nice singularities.
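To see what "becomes infinite within a finite time" means in the simplest possible setting, here is a toy sketch (a generic textbook example of my choosing, not a model of drop pinch-off): the nonlinear ODE dy/dt = y² with y(0) = 1 has the exact solution y(t) = 1/(1 − t), which diverges as t approaches 1 even though nothing in the equation looks singular.

```python
# Toy finite-time blow-up: dy/dt = y^2, y(0) = 1, exact solution 1/(1 - t).
# A forward-Euler integration tracks the exact solution until both are
# overwhelmed by the blow-up at t = 1.

def y_exact(t):
    return 1.0 / (1.0 - t)

def y_euler(t_end, dt=1e-5):
    """Integrate dy/dt = y^2 from y(0) = 1 up to t_end < 1."""
    t, y = 0.0, 1.0
    while t < t_end - 0.5 * dt:   # guard against float round-off
        y += dt * y * y
        t += dt
    return y

y_mid = y_euler(0.9)     # close to the exact value 10
y_near = y_euler(0.99)   # close to the exact value 100, and climbing fast
```

The nonlinearity is essential: the linear equation dy/dt = y only grows exponentially and never diverges at finite time, which is the qualitative difference the post is pointing at.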
For example, when a thin jet of water decays into drops, the breakup is driven by surface tension, which tries to reduce the surface area. Such a reduction can be realised by diminishing the radius of the jet. Shrinking, triggered by tiny fluctuations of the surface, becomes more and more localised, and eventually, the jet breaks in finite time. The local radius goes to zero, local flow velocity and surface curvature diverge, and the surface is not smooth anymore. Something very similar happens when a drop forms and pinches off from a tap, as can be seen nicely in the photograph taken from the paper by Shi, Brenner, and Nagel. Breakup occurs just above the spherical droplet, where the radius of the thread of fluid shrinks to zero and the surface develops a kink.
Of course, a singularity in the Navier-Stokes equations at the pinch-off of a droplet doesn't mean anything mysterious. But it is a hint that in this situation, at small enough length scales, the equations do not make sense anymore, or at least disregard essential physics. In this case, we know of course that the molecular structure of matter becomes important, replacing the continuum description of matter implied by the Navier-Stokes equations. On the scale of molecules, the concept of a sharp and smooth surface is ambiguous, but already at length scales between 10 and 100 nanometers, van der Waals forces between molecules come into play, which are not considered in the continuum formulation.
It's a bit of a stretch to say that some similar effect might remove the singularity at the centre of a black hole, but on a very general level a similar breakdown of the theory that predicts a singularity might occur. In this case it would be General Relativity to be replaced by a theory of quantum gravity that accurately describes the region of strong curvature and high density.
Here are a few papers about singularities in fluid dynamics that I found interesting:
- Hydrodynamic Singularities by Jens Eggers, arXiv:physics/0110087v1
- A Brief History of Drop Formation by Jens Eggers, arXiv:physics/0403056v1
- Theory of Drop Formation by Jens Eggers, Phys. Fluids 7 (1995) 941 and arXiv:physics/0111003v1
- Sink Flow Deforms the Interface Between a Viscous Liquid and Air into a Tip Singularity by S. Courrech du Pont and J. Eggers, Phys. Rev. Lett. 96 (2006) 034501 and arxiv:physics/0512095v1 - That's a different kind of "singularity" than in the drop breakup, even extending along a line, and with nice data showing the "approach" towards the singularity.
If you know of other examples of singularities in fluid dynamics, or in other physical systems, I'll be glad to collect them in the comments!
Tags: Physics, Singularity, Drop formation
Sunday, February 08, 2009
Bizarre Snow Formation
This weekend it's been much warmer than the previous weeks, and most of the snow that had piled up in parking lots and backyards is melting off. Here are some photos from a quite bizarre snowpile I came across this afternoon. The whole thing is 4-5 meters high and makes a constant dripping and crunching sound (click to enlarge).
Thursday, February 05, 2009
What is the world coming to?
I have wondered for a while what the increasing connectivity and mobility will mean for historically local aspects of our lives, such as language and traditions.
In 1996, The Economist wrote “English might now be impregnably established as the world standard language: an intrinsic part of the global communications revolution” [1]. In the same year, an article titled “World Wide Web: Three English Words” in the New York Times let us know that “if you want to take full advantage of the Internet, there is only one way to do it: learn English.” And indeed, in the late 90s, 80% of online information was in English, a large fraction given that only an estimated 8% of the people in the world are native English speakers. However, by 2002, the fraction of English websites had dropped to less than 50%, by 2005 to about 1/3 [2], and in 2008 English had dropped to 29.4%, followed by 18.9% Chinese and 8.5% Spanish. Today the web thus reflects the diversity of languages in the world much more accurately than a decade ago.
The idea that English would take over the world through its dominance on the Internet was therefore wrong. Nevertheless, small countries whose languages are not widely used represent too small a market to dub movies or translate books into the local language (e.g. the Netherlands), which means that people growing up in such countries either learn foreign languages very early or miss a big part of what the world is talking about. It isn't particularly representative, but I know several couples with differing native languages who mostly communicate in English, meaning their children do (or will) grow up trilingual. And how I envy them for the ease with which they will be able to travel around the globe.
All in all, despite the fact that people do prefer their mother tongue, different languages pose a difficulty for communication, an obstacle that requires effort to overcome. Wouldn't it just be so much easier if we all spoke the same language? If I crank the clock forward some thousand years - provided that mankind still exists and the level of connectivity remains or increases - I thus think we will settle into one common language, which will very likely be none of the languages we speak today but some aggregate of present languages. Certainly, there will be interest groups for the preservation of language diversity. But once they have spent all their donations setting up their websites in 50 different languages, they will realize that fighting a trend towards simplicity is futile.
When it comes to traditions, however, the situation is different. Be it rituals, festivities, or recipes - though these do compete with each other for our time and taste, I see no strong reason for this diversity to dwindle. You might prefer to stick to people who share your traditions, but that is much more your private business than the language you need to get through your daily life. Again jumping ahead some thousand years, I thus don't think these differences will be erased, for much the same reason that interior design differs from one house to the next. I am undecided about cultural differences, as these fall somewhere between communication and tradition.
What do you think?
I have to admit though, there are advantages to not being a native speaker. If I want to get rid of solicitors, I conveniently forget I speak English.
[1] “The Coming Global Tongue,” The Economist, 21 December 1996.
[2] Numbers from “Who Controls the Internet?” by Goldstein and Wu
Tuesday, February 03, 2009
Corot-Exo-7b: A Venus in another World
German science blogs today are abuzz with reports about the discovery of an Earth-like planet around a Sun-like star in the constellation of Monoceros, at a distance of about 450 light years.
The newly discovered planet Corot-Exo-7b transiting in front of its star (left, illustration by Klaudia Einhorn), and Venus in front of the disk of the Sun on June 8, 2004 (right, photo by Martin Sloboda). As the sizes of both the stars and the planets are similar, a transit of Corot-Exo-7b would look very similar to the Venus transit.
The planet has a radius 1.75 times that of the Earth, and six to thirteen times the Earth's mass. The star is a main sequence star with roughly the same composition as the Sun, with slightly less mass and a slightly lower temperature. However, the distance of the planet from its star is only 1.7 percent of the distance of the Earth from the Sun - hence the revolution period, or "year", of the planet is only 20 hours, and its surface temperature is estimated to be between 1,000 and 1,500 degrees Celsius.
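These numbers fit together: a quick check with Kepler's third law reproduces the 20-hour year. In this little Python sketch, the stellar mass of 0.93 solar masses is my assumed value for "slightly less mass" than the Sun:

```python
import math

# Physical constants (SI units)
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # solar mass, kg
AU = 1.496e11          # astronomical unit, m

# Corot-Exo-7b system: orbital distance from the post, stellar mass assumed
M_star = 0.93 * M_SUN  # "slightly less mass" than the Sun (assumption)
a = 0.017 * AU         # 1.7 percent of the Earth-Sun distance

# Kepler's third law for a circular orbit: T = 2 pi sqrt(a^3 / (G M))
T = 2 * math.pi * math.sqrt(a**3 / (G * M_star))
print(f"Orbital period: {T / 3600:.1f} hours")   # comes out close to 20 hours
```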
The planet was discovered by the European satellite mission Corot - hence its name, Corot-Exo-7b, meaning the first planet in the 7th planetary system discovered by Corot. Corot uses the transit method to search for new planets: When a planet passes in front of the disk of a star, the light of the star is slightly dimmed.
Here is the light curve of Corot-Exo-7, the star around which the planet is in orbit, showing a drop in brightness of the order of 10⁻⁴:
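That order of magnitude follows from simple geometry: the dip is just the fraction of the stellar disk covered by the planet, (R_planet/R_star)². In this sketch, the stellar radius of 0.9 solar radii is my assumed value for a star slightly smaller than the Sun:

```python
# Transit depth = fraction of the stellar disk covered by the planet:
# delta F / F = (R_planet / R_star)^2
R_EARTH = 6.371e6   # Earth radius, m
R_SUN = 6.96e8      # solar radius, m

R_planet = 1.75 * R_EARTH   # from the Corot measurement
R_star = 0.9 * R_SUN        # assumption: slightly smaller than the Sun

depth = (R_planet / R_star) ** 2
print(f"Transit depth: {depth:.1e}")   # of the order of 10^-4
```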
The mass, radius, and orbital parameters of the planet could be extracted from this measurement, combined with further observations and data analysis using the radial velocity method - the method that had led to the first detection of an exoplanet back in 1995.
So, it's indeed the first Earth-like planet found around a Sun-like star - unfortunately, at a temperature close to the melting point of iron.
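For scale, the size of the stellar wobble that the radial velocity method has to detect can be estimated from the standard semi-amplitude formula for a circular orbit. The planet mass of 10 Earth masses (the middle of the quoted range), the stellar mass of 0.93 solar masses, and sin i ≈ 1 (reasonable for a transiting planet) are my assumptions here:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
M_EARTH = 5.972e24   # Earth mass, kg

T = 20 * 3600                 # orbital period in seconds, from the post
M_planet = 10 * M_EARTH       # assumed: middle of the 6-13 Earth-mass range
M_star = 0.93 * M_SUN         # assumed: slightly less massive than the Sun

# Radial-velocity semi-amplitude for a circular orbit with sin(i) ~ 1:
# K = (2 pi G / T)^(1/3) * M_planet / (M_star + M_planet)^(2/3)
K = (2 * math.pi * G / T) ** (1 / 3) * M_planet / (M_star + M_planet) ** (2 / 3)
print(f"RV semi-amplitude: {K:.1f} m/s")   # a few meters per second
```

A stellar wobble of just a few meters per second - walking speed - which gives an idea of how precise these spectroscopic measurements have to be.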
- Press release by the Observatoire de Paris: "Super-Earth found! The smallest transiting extrasolar planet ever discovered".
- Corot-Exo-7b fact sheet of the Extrasolar Planets Encyclopaedia
Tags: Astronomy, Exoplanet, Corot