Saturday, May 29, 2010

Interna

This weekend, I'll be on my way back to Sweden. My time here at Perimeter Institute turned out to be busier than expected, but it has also been very productive. It is somewhat sad that every time I come back, more people I knew have left. The postdocs I spent my years here with have either already left, are sitting on packed bags about to start a new job, or are due to apply for one this fall.

The weather here in Waterloo has been brilliant the last two weeks, and the construction at PI has proceeded rapidly. At the risk of boring you to death, here are more photos of the building extension. Meanwhile, one can imagine how the result will look. The photo below is from the back of the building. To the right you see the old part of the building; the glass boxes are the researchers' offices.



This is a close-up of the new part of the building, with its gold-shimmering glass front:



And this is again the view from the parking lot; compare it to three weeks ago. If I recall correctly, that's where the new main entrance will be.



So, now I have to pack my bag. You'll hear from me once I'm back in Stockholm. Meanwhile, a great weekend to all of you!

Thursday, May 27, 2010

Learning to deal with information

In the 21st century, information is cheap. Or is it? I have written several times on this blog that it is a naive illusion to think of the internet as a democratic provider of information. Moreover, the simple provision of information is not equivalent to people being well informed.

The availability of information on the internet is not democratic but, if anything, anarcho-capitalist. If you have the money to pay people who know something about search engine optimization, and others to spam links to your site wherever they won't be immediately deleted, you can pimp up your website's ranking dramatically. Even Google's PageRank algorithm itself is clearly not democratic: it gives more weight to a link from a site that itself has more links. That's what makes it so powerful and so useful. Sure, we all profit from this clever ordering of information. I'm certainly not complaining about it. It's just not democratic and shouldn't be sold as democratic, since not everybody's voice has the same weight. Google smartly does not call the algorithm itself "democratic" but writes that "PageRank relies on the uniquely democratic nature of the web." Um, which democratic nature are we talking about again? But maybe more importantly, Google's PageRank also doesn't tell you anything about the quality of information you obtain. That the voices of the wealthy have more impact is hardly surprising, and merely a reflection of what has been going on in the media and news press for a long time.
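For the technically inclined, here is a minimal sketch of the basic idea behind PageRank in Python: a power iteration on a toy link graph. The graph is made up for illustration, and this is the published textbook form of the algorithm, not whatever Google actually runs in production:

    # Minimal PageRank sketch: power iteration on a toy link graph.
    # links[i] lists the pages that page i links to.
    import numpy as np

    links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}
    n = len(links)
    d = 0.85                    # damping factor from the original PageRank paper

    rank = np.full(n, 1.0 / n)  # start with equal weight for every page
    for _ in range(100):
        new_rank = np.full(n, (1.0 - d) / n)
        for page, outgoing in links.items():
            for target in outgoing:
                # a page passes on its own rank, split evenly over its links,
                # so links from highly ranked pages carry more weight
                new_rank[target] += d * rank[page] / len(outgoing)
        rank = new_rank

    print(rank)  # page 2, linked from three pages, ends up with the highest rank

In this toy graph, page 2 collects links from three pages and ends up with the highest rank, and its single outgoing link to page 0 is consequently worth far more than a link from the barely linked page 3. More weight to those who already have weight - useful, but not one person, one vote.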

Now one could of course argue that it's up to me to just go through all the hits that my search brought up and find the best piece of information. But as a matter of fact, most people don't do that. I usually don't do it either. And that's not even irrational, because scanning through all the hits that one gets on a query is very time-intensive, and the result rarely justifies the effort. Thus, most people will skim maybe the first 20 hits, if at all, and conclude that they've gotten a fair cross-section of what there is to know about the topic. That's the part of the information that is "cheap." Everything else, for example checking sources, becomes increasingly costly in terms of time and effort. And since most websites don't list their sources, there are few shortcuts. What is left is that whoever dominates the "cheap" information does, for all practical purposes, dominate the information market. The only cure for that is information literacy.

The other day, I read an interesting article by Mark Moran. Moran is CEO of a Web publisher that offers free content and tools that teach students how to use the Web effectively. He writes:
"[A]s the founder of a company whose mission is to teach the effective use of the Internet, I have pored through dozens of studies, and recently oversaw one myself, that all came to the same conclusion: Students do not know how to find or evaluate the information they need on the Internet.

In a recent study of fifth grade students in the Netherlands, most never questioned the credibility of a Web site, even though they had just completed a course on information literacy. When my company asked 300 school students how they searched, nearly half answered: "I type a question." When we asked how students knew if a site was credible, the most common answers were "if it sounds good" or "if it has the information I need." Equally dismal was their widespread failure to check a source’s date, author or citations."
I find this seriously scary! As I have expressed in my earlier post Cast Away, the passing on of knowledge to the next generation is one of the most essential ingredients to continuing progress. How are people supposed to make informed decisions if they can't tell what the relevant information is to begin with? Where does that leave our political systems? But then I read the following:
"Every day, we are inundated with vast amounts of information. A 24-hour news cycle and thousands of global television and radio networks, coupled with an immense array of online resources, have challenged our long-held perceptions of information management. Rather than merely possessing data, we must also learn the skills necessary to acquire, collate, and evaluate information for any situation. This new type of literacy also requires competency with communication technologies, including computers and mobile devices that can help in our day-to-day decisionmaking. [...]

Though we may know how to find the information we need, we must also know how to evaluate it. Over the past decade, we have seen a crisis of authenticity emerge. We now live in a world where anyone can publish an opinion or perspective, whether true or not, and have that opinion amplified within the information marketplace."

Wise words, eh? Guess where that's from? Guess, don't Google! It's a press release from the White House. No, really. It's the announcement of the "National Information Literacy Awareness Month" last October, which somehow passed me by. While recognizing a problem isn't the same as solving it, it is certainly a good first step. Let's hope that other nations will follow that example. Yes, we can do it! Indeed, there is more hopeful news today: The Pew Research Center's Project for Excellence in Journalism published new data some days ago comparing the news coverage on blogs to that in the traditional press. Here's an interesting number: only 2% of the news in the traditional press is about science and technology. But on the blogs, it's 18%.

Sunday, May 23, 2010

On the Edge of Chaos

I saw this advert about a month ago, and it got me thinking. It doesn't matter much if you don't understand German; the visuals speak for themselves:


It's an advert for craftsmanship (Handwerk). The song lyrics roughly say: imagine how life would be without them. (The long list at the end is a list of professions.) To me it shows so nicely how incredibly complex our life has become, and how much of what we take for granted is only a very recent achievement in the history of mankind.

Friday, May 21, 2010

Terra Incognita

As you know, I am presently at a workshop at Perimeter Institute about the Laws of Nature: Their Nature and Knowability. Yesterday, we had a talk by Marcelo Gleiser titled “What can we know of the world?”. It occurred to me somewhat belatedly that I recently read an article by Gleiser in New Scientist, “The imperfect universe: Goodbye, theory of everything.” In that article, he writes that after “Fifteen years [as] a physicist hard at work hunting for a theory of nature that would unify the very big and the very small” he has come to the conclusion that “the very notion of a final theory is faulty.” In a nutshell, that was also what his talk was about.

The only thing that's interesting about this insight is that it took him 15 years to arrive there. And maybe why it got printed in New Scientist. Of course the notion of a final, fundamental theory of all and everything is faulty, for the simple reason that even if we had a theory that explained everything we know, we could never be sure it would eternally remain the theory of everything we know. As Popper already realized about a century ago, one cannot verify a theory; it can only be falsified. Thus, the theories we have are forever under test, always at the risk that some new data does not fit in. That's exactly what makes a theory scientific. It's also one of the points I made in my FQXi essay. You see, I'm an even Newer Scientist.

That we can never know whether a theory is truly fundamental and able to explain all observable phenomena of course does not mean there is no fundamental theory. It just means we can never know - so your belief that such a theory exists belongs in the realm of religion, not science.

In any case, in his talk (video and audio here), Gleiser touched on another topic that reminded me of something else. He had a sketch of our expanding knowledge, with a filled circle representing “The Known” in the middle, that is expanding into what is now the unknown (“perennial ignorance”) outside:



I used a similar, though slightly different analogy for the progress of science in my PI public lecture some years ago (which incidentally has the same title as the FQXi essay, I'm very into recycling). In this case though, I used a map of Middle Earth.

The message that I wanted to convey is that the process of knowledge discovery is very similar to exploring unknown territory. There are parts that you have already seen and that you know very well, though details may be missing. And let me be clear that with “The Known” I (in contrast to Gleiser) don't mean the laws themselves but the data from which the laws were extracted. Otherwise you lose possibly important information about the range of applicability (information you at first may not have thought relevant).

You try to explain the known by a theory, and if everything fits, you point somewhere into the unknown (make a prediction). Ideally, experimentalists go there and find what you told them they would find. You don't want to point too far out, because people today are quite impatient, and if your prediction is not measurable within their lifetime it won't help you get tenure. The other way progress happens is that there is data available for which a theoretical explanation is missing. Or a theory might be sketchy and not work very well. That's the situation of the experimentalist saying: we've seen something on the horizon, please explain that. The body of knowledge that we have is usually not a neat, simply connected region, but typically has some pieces that don't really match with anything else.

Which brings me back to Gleiser's article. The essential question is not whether you do or don't believe in a fundamental theory of everything. The essential question is what is a good and promising way to expand what is known. You can believe in flying spaghetti monsters, reincarnation, or a theory of everything: if it helps you with your research, by all means, go ahead, just don't put your beliefs in the abstract of your paper.

Experimental input is of course essential to progress all along. On the theoretical side, the obvious reason why people are looking for a unification of the known forces is that unification has worked previously and has been tremendously successful. The same holds for symmetry principles. Sure, that doesn't mean these procedures will continue to be successful, but it's the obvious thing to try. It's the same reason why a band's second hit sounds like the first, and why, after my move to Sweden, I first had to learn that asking to speak to a supervisor and complaining about lacking customer service is not a very successful tactic in this country. Similarly, we might have to reconsider our tactics and learn new ways of thinking if we remain unsuccessful in making headway on today's big questions in physics. For example when it comes to resolving the apparent tension between General Relativity and quantum mechanics, or to explaining the arrow of time, respectively the initial conditions of the universe: It's terra incognita, and there may be dragons.

That's why I find meetings like the current one at PI very useful for becoming more aware of our standard modes of thinking, for awareness and acknowledgement of the limitations of a procedure is the first step to improvement.

Monday, May 17, 2010

Abramowitz/Stegun goes online

Did you ever need to learn about the properties of some obscure mathematical function which turns up when you try to solve, say, the Schrödinger equation with a linear potential?
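To spell out that example: with a linear potential V(x) = Fx, the stationary Schrödinger equation turns, after a change of variables, into Airy's equation,

    -\frac{\hbar^2}{2m}\,\psi''(x) + F x\,\psi(x) = E\,\psi(x)
    \quad\longrightarrow\quad
    \frac{d^2\psi}{dz^2} = z\,\psi\,, \qquad
    z = \left(\frac{2mF}{\hbar^2}\right)^{1/3}\left(x - \frac{E}{F}\right),

and its solutions are the Airy functions Ai(z) and Bi(z), shown in the figure below.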

In the times before Wikipedia and Eric Weisstein's World of Mathematics/MathWorld, the usual way to proceed was to go to the library and look it up in the "Abramowitz/Stegun", a compilation of formulas, relations, graphs and data tables for all kinds of functions you can think of.



Airy functions Ai(x), Bi(x) and M(x). dlmf.nist.gov/9.3#F1.


Over the last few years, Milton Abramowitz' and Irene A. Stegun's time-honored "Handbook of Mathematical Functions" has been carried over to the internet age as the Digital Library of Mathematical Functions. Published by the US National Institute of Standards and Technology (NIST),


... the NIST Digital Library of Mathematical Functions (DLMF), is the culmination of a project that was conceived in 1996 at the National Institute of Standards and Technology (NIST). The project had two equally important goals: to develop an authoritative replacement for the highly successful Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables, published in 1964 by the National Bureau of Standards (M. Abramowitz and I. A. Stegun, editors); and to disseminate essentially the same information from a public Web site operated by NIST. (From the DLMF Preface)


Parts of the DLMF have been available for some time, but the complete site went online just last week, on May 11.

In comparison to the old printed book, there are more functions and formulas, which can all be copied as LaTeX or MathML code. And while the function graphs at MathWorld are interactive, the DLMF features more detailed descriptions of applications in mathematics and physics, and links to freely available software libraries.

Should I ever need to code Jacobian Elliptic Functions, I'll know where to look them up.
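These days one doesn't even have to leave the Python prompt for that: SciPy's special-function module implements many of the functions catalogued in the handbook, including the Airy and Jacobian elliptic functions mentioned above. A minimal sketch, with arbitrarily chosen sample points:

    # Evaluating Airy and Jacobian elliptic functions with scipy.special.
    import numpy as np
    from scipy import special

    x = np.linspace(-10, 2, 50)
    ai, aip, bi, bip = special.airy(x)       # Ai, Ai', Bi, Bi'

    u = np.linspace(0, 4, 50)
    sn, cn, dn, ph = special.ellipj(u, 0.5)  # sn, cn, dn and amplitude, for parameter m = 0.5

    print(ai[0], sn[-1])                     # just to show it runs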




Via bit-player, where you can also read more about the history of the Abramowitz/Stegun.

Saturday, May 15, 2010

The Future of the Conference

As you know, I am here at Perimeter Institute for the upcoming workshop on the Laws of Nature: Their Nature and Knowability. Every time I lift my bag onto a scale at a check-in counter, I wonder if there will come a time when, instead of stepping on a plane, we will meet in cyberspace.

The Past

The classical conference in academia is still omnipresent. You go, you sit and listen to a dozen talks a day, you smalltalk over coffee and cheap cookies and try to get to know some people at the "social event," typically a reception with buffet or a conference dinner. Also typical is that the average participant pays a horrendously high fee that covers the VIP guests' airfare and four-star hotels. But that's okay, because most of the participants have a travel grant for exactly this purpose.

That might sound a bit dull, and it frankly sometimes is, but there are a lot of good reasons both to organize and go to a conference.

On the participant's side: Most notably, conferences are useful to obtain or keep an overview of the research going on in one's field. An overview both of the what and of the who. It is possible to do that by other means, but a conference is an especially efficient way to do it, in particular if you're a newcomer. In contrast to reading a review, you can go and talk to the people who work on stuff similar to yours, and face-to-face communication is still the best way to exchange information. You might learn about some unfinished work and find a new collaborator. You might get to hear a talk by some of the more well-known people in the field who might be unlikely to pass by your own institution. And last but not least, you have the opportunity to communicate your own research and get feedback and advice.

On the organizer's side: Organizing a conference takes a lot of work and time. One doesn't make money with a scientific conference - in fact, the first step will be trying to find sponsors. Reasons for organizing a conference are most notably to advance the research in one's own field: to bring together and support the community, to spread ideas, to foster the formation of collaborations. Conferences are also frequently used to advertise the institution where they take place, which most often is one of the main sponsors. As an organizer one often has, to some extent, the possibility to select speakers that one is interested in hearing or meeting. And then there is the communication of the research field's relevance beyond one's own community. Many conferences will have a public lecture and will get some media coverage, at least in the news magazine of the university where they take place.

There are a few variations on the conference scheme. A workshop, for example, will typically have fewer participants and fewer talks, and these talks will be more specialized and leave more room for discussion. The large conferences will often separate talks into plenary talks and parallel sessions. Sometimes they will dump people into a poster session.

The Present

During the last years, with the spread of social networking tools and the continuing improvements in information technologies, one could start to see some modifications of the standard scheme. To begin with, it is nowadays possible to let a speaker deliver a talk by video link, and it is similarly possible to let people participate by streaming video. I have been to a few conferences where talks were given by video, and they typically are not very well attended. I'm not entirely sure why. It's like people think "Oh, he won't really be here." Or maybe it's that, not so surprisingly, these talks typically involve a lot of technical fiddling and are fault-prone. That however will improve the more often it happens.

Another quite obvious change that is now so common one easily takes it for granted is that most conferences put the slides of talks online, and in many cases even a recording. This is so omnipresent that indeed most conferences no longer publish proceedings. I find this extremely useful because I don't have to take notes or copy down a reference from the speaker's slide. I can just go and look it up later.

Then there are the changes in format. I wrote previously that I was at the SciBar Camp in Toronto and later at the SciFoo Camp in California. In this case the schedule is not set before the start of the meeting, but assembled upon arrival by self-organization and participants' interests, which is greatly aided if the participants have had a chance to exchange their interests online in advance. This spontaneous self-assembly has advantages and disadvantages. The advantage is that it's a more flexible format that expresses the participants' rather than the organizers' interests, and is more lively and interactive. The disadvantage is that most of the sessions will be pretty much unprepared. Since I actually prefer listening to thought-through arguments instead of improvised babble, I am not too much in favor of this mode of organization. I think that for scientific purposes a combination of both, the old-fashioned scheduled talks and some more flexible sessions, would be more suitable.

And then of course there is the use of social networking tools, like Twitter, FriendFeed and blogging, or just setting up a networking site specifically dedicated to the event. Whether or not that works well depends crucially on how many people make use of it. But if it works well, this can serve a lot of purposes. One is clearly the communication to the public. But besides this it also improves the exchange between the participants. In particular if you are at a large conference, you might not actually know who is interested in similar topics as you, or who you might want to talk to in the coffee break. If a conference wants to make good use of Web2.0 tools, the first thing they should do is aggregate the participants' feeds. It is somewhat ironic, but you might not actually know that the person sitting next to you is writing a blog you read frequently. And installations like a Tweetwall, for example (a screen displaying tweets by participants, see picture), add a completely new layer to the discussions that can take place at a meeting, and can greatly improve information exchange and facilitate networking.

It is interesting to see how more and more conference organizers are making use of these possibilities. It depends of course a lot on the technical support that they have. For example, I read the other day on Resonaances that the International Conference on High Energy Physics 2010 has launched an official blog and recruited some bloggers to cover the event. How cool is that? One should add that these things have of course been done for years in some communities that are especially dedicated to advancing these changes in networking, blogging, outreach and information technology. And Perimeter Institute's outreach efforts have been playing around with all these possibilities for years, for example with the recent Quantum2Cosmos Festival. The point I'm trying to make is that the use of these tools is now slowly spreading and becoming more common.

The Future?

So what's next? NatureNetworks organizes conferences that are live-streamed to Second Life. Will this become the conference of the future? I think it is very well possible. I don't think it is very likely that conferences will become entirely decoupled from physical reality in the sense that we exclusively meet online. But it will become increasingly common to attend a "real" conference "virtually" if one cannot be there in person for one reason or another, be it lack of funding or illness.

I also think that conferences will soon obtain several more virtual layers. For example, I imagine that you go to a talk and while you are there can "log in" to the respective website, and so could people who are not physically there but following online. You could then, for example, skip back and forth in the slides as you wish, or ask your colleague in the second row what he thinks about what the speaker just said. I think this happens to some extent today by people sending emails, but it could become much better aggregated. I am not sure however that such a complex environment as Second Life is necessary for these purposes. Though it has of course the advantage that the technology is already in place.

As to the increased flexibility in format: There is a quite obvious hurdle to having an academic conference that does not have a program online a month ahead with neatly scheduled speakers: Many people can only justify their participation and receive a reimbursement for their expenses when they are giving a talk. This is a typical example where requiring "accountability" can be misguided (see also) and hinders improvements. However, I think that this problem will resolve itself once funding agencies notice that there are other means to document one's participation in an event than being listed on the program. Basically, all they really want to know is that you didn't just spend the week on the beach at their expense. But taking part in online discussions or blogging can serve a similar purpose.


Bottomline

Real change is happening and I think we'll see more of it!

Thursday, May 13, 2010

Paradigm Shifts

One day, when I'm old and my hair is grey, and I'm sitting in a rocking chair stroking a cat on my lap when the neighbor's son comes with a book on the history of science, I want to say "Yes, I was there."

There are as many different motivations to become a physicist as there are physicists. But one of them is certainly the wish to be part of something greater, an event of historical importance. It's the wish to be there and have a say when our view of the world fundamentally changes; when a new picture comes into focus that will be passed on to future generations.

The change of our fundamental understanding of Nature, the emergence of a new way of thinking about the world, is what is known as a "paradigm shift." It's a notion that occasionally crops up in discussions. It most often does so either as a means of defense, when a new proposal is widely rejected, or when the speaker tries to make himself more interesting.

I was wondering the other day which paradigms might be shifting today. In the 22nd century's textbooks, what of today's understanding will appear in the historical appendix instead?

There are three such potential paradigm shifts that I've come across.

The first one is about the limits of reductionism. With the incredible success reductionism had in physics in the first half of the 20th century came the belief that one day we'll be able to explain everything by taking it apart into smaller and smaller pieces. This paradigm has by now pretty much shifted in favor of acknowledging that emergent features might not be possible to explain, either in practice or in principle, by reduction to more elementary constituents (see also). This change in our perception came along with the rise of chaos theory and complexity, features that are both very common in natural systems and hard if not impossible to address by reductionist approaches. It is funny, in fact, how silently this shift seems to have taken place. You sometimes find people today in talks vigorously arguing that reductionism has limitations, only to find there's nobody actually disagreeing with them. Except for the old professor in the front row.

The second potential paradigm shift that has crossed my way is the multiverse. The multiverse I have in mind is the one forced upon you by the string theory landscape, a vast number of possible universes with different laws of Nature, versus the previously prevailing idea that our universe is unique and is so for a reason that we have to find. Various other sorts of multiverses seem to creep up from other considerations. The multiverse is presently a very hotly discussed topic, with strong voices both for and against it. I have previously expressed my opinion that the multiverse isn't so much a new paradigm as a new way of thinking about an old paradigm. Instead of finding a way to derive the parameters of the standard model as a 'most optimal' configuration of some sort, one now searches for a measure on the multiverse by means of which our universe is (ideally) most likely. It's a watered-down version of the same game. In any case, I recall that Keith Dienes (the guy with the String Vacuum Project) spoke about the probabilistic attempt as a new way of thinking about "why" we have exactly these laws. And yes, I was thinking, maybe he's right, and some decades from now that will be how we think about our reality: that we're embedded in a vast number of different universes with different laws of Nature, and our grandchildren will laugh that we once thought we were unique.

The third potential paradigm shift is that spacetime might not be a fundamental entity. I think that everybody who works on quantum gravity (of whatever sort) is familiar with the idea. But I noticed on occasion, most recently when I was talking to Sophie about Verlinde's emergent gravity scenario, that the idea that space-time only seems to be a smooth, continuous manifold with a metric on it, and on small scales might be nothing like this, is not very widely spread outside the community. While there are many approaches towards finding a more fundamental description of spacetime, they each suffer from their own problems. So I think it is pretty unclear presently whether this will turn out to be a true (and useful) description of Nature in some way. But it's certainly a thought hanging in the air. On the completely opposite side is the idea that space-time is instead the only fundamental entity and that matter indeed emerges from it (an idea that dates back at least to Kaluza and Klein). Or that neither is fundamental, but both arise from something unified that's neither matter nor space-time.

These are all ideas that physicists have been chewing on for quite some while now. I am curious how people will think about them in the future, whether they will laugh about our foolishness or admire our imagination.


Tuesday, May 11, 2010

Under Construction

As you have probably seen from my Twitter feed, I have made it to Waterloo, despite ash cloud and all. Perimeter Institute is, as usual, buzzing with activity. Since I have to contribute my part to the buzz, here's just a short update on the building construction. The photos from February are here. It is oddly pleasing to see reality evolve according to plan, just as we've been shown in the models. It's so different from my everyday life...




Sunday, May 09, 2010

Knowledge for the sake of knowledge

In May 2007, Canadian Prime Minister Stephen Harper announced the government's new Science and Technology agenda. The location they chose for this happened to be Perimeter Institute. I blogged about the event here. To introduce the Prime Minister, Mike Lazaridis, founder of the Institute, said a few welcoming words (video and audio here). An excerpt:
“We believe that bold focus and continuing investments in theoretical physics and its applications result in further breakthroughs. These in turn will be the basis for tomorrow's goods and services [...]

If we learned anything from this great experience that is Perimeter and IQC, it is this: Canada can lead the world in key scientific fields from which future economic prosperity and job creation will flow - as long as the private sector and governments make bold, focused, and long-term investments in carefully chosen fields.”

I didn't write about Lazaridis' words then because I distinctly recall feeling let down. So all the fundamental research we do, in the end it's all to produce something that you can go and buy at the mall? Needless to say, I do believe there is value in knowledge just for the sake of knowledge. Curiosity and the wish to know - where we come from, what we are made of, what is out there - is one of the key drivers of our development, both as a species and as individuals. Material prosperity is a, certainly welcome and desired, result that better knowledge of the laws of Nature can bring. But knowledge itself also feeds our desires, even if it remains immaterial. Whenever somebody justifies fundamental research as an investment in future technologies (international competitiveness! economic prosperity! job creation!), they are missing half of the story. Yes, that's one of the reasons. But the other reason is that we just want to know.

Now, I've never talked to Lazaridis, and I actually don't know what his opinion is on the matter. It is very well possible that his speech for the Prime Minister was just a collection of “right things to say on such an occasion.” (My only encounter with Big Mike was a very short one. I had been to PI's gym on a weekend, and after half an hour or so on the treadmill had the sudden urge to look up something in a book. Sweaty and without glasses, I headed over to the library, where a group of important-looking grey suits were just being shown around. Thinking that I might leave a somewhat unfortunate impression, I silently vanished.) But leaving aside Lazaridis and the Canadians for a moment, the idea that tax-paid fundamental research should in the end produce some sort of *stuff* is unfortunately widespread.

Last year Paul Drayson, Minister of Science in the UK, said
“Scientists should be accountable where work is funded by the taxpayer and therefore I think it is right that scientists should be asked to think about the impact that they have had.”

(as quoted in THE "Science, we have a problem."). What bothers me about this quotation is the implicit assumption that scientists do not think about the impact they have. As if scientists would not care whether their work is relevant for the society they are part of. I previously wrote that in my experience it is a crude misconception. Most people I know who work in fundamental research do actually suffer from feeling useless for the exact reason that the impact of their work is difficult, if not impossible, to quantify. It is often even difficult to communicate. But think about it: there is the prospect that their work will fundamentally change the way we understand our own place in this world, possibly some centuries into the future. How do you want them to account for that?

To come back to the Canadians one more time, on page 20 of Canada’s Science and Technology Strategy, Mobilizing Science and Technology to Canada’s Advantage, you can read:
“Science and technology is not an end unto itself. It is a means by which we can pursue sustainable development.”
(via Jeff Sharom). I am all for sustainable development. I am also totally in favor of innovation, and I love my BlackBerry. I have a vivid imagination and can picture that our rapidly improving understanding of, for example, the first moments of the universe might one day lead in mysterious ways to an application. But sometimes science is an end unto itself.

Saturday, May 08, 2010

Interna

If the charming Icelandic volcano lets me, I'll be flying to Toronto on the weekend. Next week, I'll be attending the PI workshop on the "Laws of Nature: Their Nature and Knowability," which promises to be interesting. I have some more trips coming up this summer. Towards the end of June, there's a meeting of the "Working group on quantum black holes" in Bonn (which is part of the COST action "Black holes in a violent universe"), and SUSY 2010 happens to be in Bonn as well, in August. Planck 2010 is May 31 to June 4 at CERN, where I will not be going because it conflicts with my Toronto trip. And of course there's the ESQG 2010, July 12-16, here in Stockholm. To top off the summer, my 15-year high school reunion is also planned for August. Not the busiest summer ever, but it still seems I'll collect some frequent flyer miles.

You'll hear from me when I'm on the other side of the big water. Meanwhile, have a nice weekend or, as the Swedes say, ha det så bra.

Thursday, May 06, 2010

Why'd you have to go and make things so complicated?

    "Why'd you have to go and make things so complicated?
    I see the way you're actin' like you're somebody else
    Gets me frustrated
    Life's like this you
    You fall and you crawl and you break
    And you take what you get, and you turn it into
    Honestly, you promised me
    I'm never gonna find you fake it
    No no no"

Complexity

It is interesting, if you follow the news press, how frequently one finds references to "complex" problems, issues and questions: "Illegal immigration is a complex [...] issue with no easy solution," "Toyota faces complex legal woes as lawsuits mount," "Senate passes complex, controversial energy reform bill," "[T]he subject of radicalization [...] is a complex problem," and so on and so forth. One is left to wonder, what is not complex?

The difference between complex and complicated is that a complex system has new, emergent features that you would not have seen coming from studying its constituents alone. (For the meaning of "emergent", see my earlier post on Emergence and Reductionism.) The complex problem can't be decomposed. It can't be reduced. It's global, interrelated, it's on many timescales, and it doesn't respect professional boundaries either. Worse, you don't know where it begins and ends. It's full of "unknown unknowns." It's not only their problem, it's our problem too.

If you need any evidence for the popular appeal to complexity, even the Pope had something to say about it last year:
"The current global economic crisis must also be viewed as a test: are we ready to look at it, in all its complexity, as a challenge for the future and not just as an emergency that needs short-lived responses?"
In a recent article in the New York Times, David Segal wrote
"[C]omplexity has a way of defeating good intentions. As we clean up the messes, there's no point in hoping for a new age of simplicity. The best we can do is hope the solutions are just complicated enough to work."

Calling a problem "complex" nowadays seems to mean acknowledging that one doesn't really know how to handle it. A complicated problem, sure, we'd figure out what to do. After all, evolution has kindly endowed us with big brains. But a complex problem? Our political and social systems can't deal with that. *Shrug shoulders* Now what? Let's clean up the messes and hope that a complicated solution will do.

It is true that the problems we are facing are becoming ever more complex. This is a consequence of our world becoming increasingly more, and increasingly better, connected. This creates new opportunities and fosters progress, but along the way it causes interdependencies and, when left unattended, lowers resilience.

It is however not true that we don't know what to do with a complex problem. We just don't do it. In contrast to our political systems, humans are good at solving complex problems. It's the complicated ones that you'd better leave to a computer. Look at the quote from Avril Lavigne that is the title of this post. She's talking about relationships. Navigating a human society is a multi-layered task on many time-scales with unexpected emergent features. It's full of unknown unknowns. That's not a complicated problem - it's a complex one. We have the skills to deal with that.

The reason why we can't use our abilities to deal with economic or political problems is simply lack of input and lack of method. These are solvable problems. And they are neither complex nor complicated.

Optimization

I have written previously about the three requirements that have to be fulfilled for a system to be able to develop into an optimal state, to find a good solution to a problem. One is free variation. Democracy and a free market economy are good conditions for that. The second one is the ability to detect whether a small variation is an improvement or not. The third is the ability to react to the result of the variation. That's basically a poor man's way to find a maximum: go a step in each direction and take the direction that goes up*.
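In code, this poor man's optimizer takes only a few lines. A toy sketch in Python, for a one-dimensional objective f that stands in for whatever is actually being optimized:

    # Poor man's maximum search: try a small step in each direction and
    # keep the one that goes up. The comments mark the three requirements.
    def hill_climb(f, x, step=0.1, max_iter=1000):
        for _ in range(max_iter):
            candidates = [x + step, x - step]   # 1. free variation
            best = max(candidates, key=f)       # 2. detect whether it's an improvement
            if f(best) <= f(x):                 # no direction goes up: a (local) maximum
                return x
            x = best                            # 3. react to the result
        return x

    print(hill_climb(lambda x: -(x - 2)**2, x=0.0))  # converges to ~2.0

As the comments indicate, the three requirements map directly onto the three steps of the loop - and the failure modes discussed next correspond to step 2 or step 3 breaking down.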

This procedure however fails dramatically whenever there is either data missing to find out whether a change is an improvement or not, or no way to react to it. Take the recent economic crisis. There were people all over the place who noticed something odd was going on; that this money creation out of nothing didn't make sense. They had the data, but they had no way to act on it. There was no feedback mechanism for their odd feeling. Way too late one would hear them saying they had sensed all along that something was wrong. From a transcript of the radio broadcast "This American Life" (audio, pdf transcript):
    mortgage broker: ...it was unbelievable... my boss was in the business for 25 years. He hated those loans. He hated them and used to rant and say, “It makes me sick to my stomach the kind of loans that we do.”

    Wall St. banker: ...No income no asset loans. That's a liar's loan. We are telling you to lie to us. We're hoping you don't lie. Tell us what you make, tell us what you have in the bank, but we won't verify? We’re setting you up to lie. Something about that feels very wrong. It felt wrong way back when and I wish we had never done it. Unfortunately, what happened ... we did it because everyone else was doing it.
Italics added. (We previously discussed this in my post The Future of Rationality.)

It's not that nobody noticed what was going on. There was a variation taking place, but part of the change it was creating wasn't monitored. And there was no way to feed notice of the change back into the system. Computer programs made risk assessments. They might not have made sense, but you wouldn't question them, because everybody played the same game. In a recent NewScientist article, economist Ernst Fehr is quoted saying
"Almost everyone in business, finance or government studies some economics along the way and this is what they think is the norm. It's a biased way of perceiving the world."

"Biased" is another way to say there's input missing.

We notice similar failures in other examples. Our economic systems are slow at dealing with ecological problems, if not incapable of it, because the problems don't automatically feed back into the system (at least not on useful timescales). There is a variation, but the optimization process can't work properly.

Backreaction

The reason why this close monitoring of the system (our global political, social, ecological systems) has become necessary, and why no return to simplicity is possible, is that even small groups of humans can now cause a significant change to their environment. That may be a natural, social, or organizational environment, which could be summarized as the "background." In earlier days, we were trying to achieve an optimization in a fixed background. Now we can no longer neglect that we are changing the background by our own actions. In physics, this is commonly known as "backreaction."

If you take for example the deflection of light by the sun, then to compute the deviation you treat the photon as propagating in the fixed background field of the sun. That is an excellent approximation. Yet to be precise, the photon actually changes the background field too. If you took heavier objects passing by the sun, you'd eventually come to notice that they do contribute to the gravitational field too. The approximation of a fixed background is often made. For example, for the Hawking radiation of black holes, one commonly neglects the backreaction of the emitted radiation. This, again, is an excellent approximation, but one that breaks down at some point. (In this case, when the energy of the emitted particles comes close to the mass of the black hole itself.)
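To put a number to that example: in exactly this fixed-background approximation, the standard General Relativity result for a light ray grazing the sun is a deflection angle of

    \delta = \frac{4 G M_\odot}{c^2 R_\odot} \approx 8.5 \times 10^{-6}\ \mathrm{rad} \approx 1.75''\,,

the value famously confirmed by the 1919 eclipse expedition - and obtained while entirely neglecting what the photon does to the sun's gravitational field.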

If you are in a regime, however, where you can no longer neglect backreaction, as we are now with humans living on planet Earth, then you have to find a common solution for both the system and the background. Or you could say they form a common system. This necessity to find a solution for both the background and the objects in it is one of the great insights of Einstein's theory of General Relativity, where the background is space-time, formerly thought to be an unchanging, fixed entity. You cannot evolve any system in time and then just look at what the background will do, or the other way round. You have to find a solution for both together. It is somewhat of a stretch of the notion of a "background," but I think that we are facing exactly this problem today when we are trying to find a sustainable solution for mankind living on this planet. We can either return to an era where backreaction was negligible and the background was eternally static, unchanging, and at our disposal. Or we learn how to find a stable solution to the full problem: us and our environment.

This issue is far more complex than you might think. That's because we are now in a situation where the change we cause to our environment influences our own evolution and adaptation to that environment. Human culture has demonstrably been an evolutionary force for thousands of years already. And we are now only short of actively shaping our own evolution, not to mention that of other species. Whether that's a good idea or not depends on whether we are able to learn fast enough, i.e., whether assessment of and reaction to a change is fast enough so the system doesn't just run down the hill before we can say bullshit.

Bottomline

And that's why I keep saying we need to finish the scientific revolution. Trial and error may have worked well to organize our living together for thousands of years, but this method has its limits. In an increasingly interconnected world, errors are too costly. We need to use a smarter method, a scientific method.

To be able to find a stable, sustainable, and good integration of the ongoing human development into the environment, we need first of all to know what's going on. It is not too far-fetched to think that Google will play a role in that by creating a "real-time natural crisis tracking system," a "real-world issue reporting system," or by "collecting and organizing the world's urban data" (see: Project 10 to the 100). The next step is to find a good way to extract meaning from all this data, to be able to react in a timely manner to changes. People often seem to think that by that I mean the system's dynamics has to be predicted. And let us be clear again that the system we are talking about is the global political, economical and ecological system. Having a model that makes good predictions would be nice, but it is questionable whether this is possible or even desirable. It is in fact not necessary.

You don't need to predict the dynamics of the system. You just need to know in which parameter space it will operate smoothly, so the optimization works. You want to stay away from threshold effects, abrupt changes with potentially disastrous consequences. Think again about how we deal with human relationships. You don't predict what your friends, relatives or your partner will be doing. That would be pretty much impossible. But after you have gotten to know them, you'll have an idea what to expect from them, and you'll be able to maintain a sustainable relationship on a balance of taking and giving. The same holds for the systems that govern our lives. You don't need to predict their evolution. You just need to know the limits. Life's like this...


* This does not find you a global maximum, but that's a more complicated problem that we'll discuss some other time.

Tuesday, May 04, 2010

Physics Bits and Bites

Here are three interesting and intriguing physics items I came across recently:
  • Last year, the American Association for the Advancement of Science (AAAS) organized a symposium called "Quest for the Perfect Liquid: Connecting Heavy Ions, String Theory, and Cold Atoms". Perfect, low-viscosity liquids can be observed when there is a very strong interaction between the constituents of the fluid, as is the case for the quarks and gluons created in heavy ion collisions at RHIC, or for clouds of ultracold lithium atoms in optical traps. The strongly coupled quark-gluon plasma can be described using the AdS/CFT correspondence, which brings string theory into play (see also this earlier post). At the AAAS symposium, physicist-bloggers Clifford Johnson (from Asymptotia) and Peter Steinberg (from Entropy Bound) discussed this connection, and a write-up of their presentation has now come out as a feature article in the May 2010 issue of Physics Today, "What black holes teach about strongly coupled particles" (free access).

  • You may be aware of the ongoing quest for the densest possible packing of tetrahedra. The NYT wrote about this in January, and the article mentions that a paper on the subject "prompted Paul M. Chaikin, a professor of physics at New York University, to buy tetrahedral dice by the hundreds and have a high school student stuff them into fish bowls and other containers." This project has now resulted in a Physical Review Letter, with an experimentally determined volume fraction of 0.76±0.02 (the current theoretical "record" is at 0.856). The experiment was analyzed using Magnetic Resonance Imaging to look "into" the container crammed with tetrahedra, which shows that the packing is highly disordered. More background can be found in an article at "Physics", which also contains a free link to the PRL paper.

  • Also via Physics, I've learned about what is the fastest (and possibly smallest) analogue computer to perform Fourier transforms: a single iodine molecule. An iodine molecule consists of two iodine atoms, which can vibrate, realizing a tiny harmonic oscillator. During one period, a harmonic oscillator follows a circular trajectory in phase space, which means that the Wigner function describing the quantum state of the oscillator "switches" space and momentum coordinates every quarter period. Going from real space to momentum space corresponds to a Fourier transform, so when the wave function of the iodine molecule is prepared in real space, after a quarter of a period the wave function encodes the Fourier transform of the initial configuration. Using lasers, it is possible to prepare the molecule in a definite state, and to probe the state again later. This allows discrete Fourier transforms for four and eight elements, all within 145 femtoseconds, "which is shorter than the typical clock period of the current fastest Si-based computers by 3 orders of magnitudes." ("Ultrafast Fourier Transform with a Femtosecond-Laser-Driven Molecule", PRL). A sketch of why the quarter period amounts to a Fourier transform follows below.
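The sketch (in natural units \hbar = m = \omega = 1; this is the standard textbook argument, not taken from the paper itself): the harmonic oscillator eigenstates \psi_n are also eigenstates of the Fourier transform \mathcal{F}, with eigenvalues (-i)^n in the usual sign convention. Evolving for a quarter period, t = T/4 = \pi/2, therefore gives

    e^{-iHT/4}\,\psi_n = e^{-i\left(n+\frac{1}{2}\right)\frac{\pi}{2}}\,\psi_n = e^{-i\pi/4}\,(-i)^n\,\psi_n\,,

so, up to an irrelevant global phase, the quarter-period time evolution acts on any superposition of states exactly like the Fourier transform.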

Saturday, May 01, 2010

Publication Cut-off

The German Research Foundation (DFG) has taken an important and overdue step. To limit applicants' attempts to blind the reviewer with publications, from July 1st, 2010 on, a maximum of 5 publications can be listed in the CV. In addition, only papers that are already published can be listed. Previously, it was possible to also list papers that were submitted but not yet published. The change in this policy is apparently a reaction to an instance last year in which applicants (in the area of biodiversity) invented publications. (More details on the new regulation here.) It remains unclear to me whether a paper on the arXiv counts as published or unpublished.

With this decision, the DFG is clearly signaling that it's quality that matters, not quantity. Or at least that's what should matter for their referees. Another reason for the change is that other countries have similar restrictions. The NSF, for example, also has a limit of 5 publications relevant for the project, and the NIH one of 15.

Matthias Kleiner, President of the DFG said
“With this we want to show: For us it is the content that matters for the judgement and the support of science.”

And he bemoans that today
“The first question is often not anymore what somebody's research is but where and how much he has published.”

(As quoted in Physik Journal, April 2010, my translation).

The DFG is the funding source for scientific research in Germany - not the only one, but without doubt the most important one. This decision will therefore have a large impact. The impact however is limited in that the other major reason publication numbers are ever increasing is that hiring committees pay attention to these numbers - or at least are believed to pay attention, which is already sufficient to create the effect. The President of the German Higher Education Association (DHV*), Bernhard Kempen, comments
“To assess a candidate's qualification in a hiring process it should also be solely the content of provided publications, not their number, that is decisive for an appointment.”
(as quoted here, my translation.)

Since I have written many times that it hinders scientific progress when selection criteria set incentives for researchers to strive for secondary goals (many publications) instead of primary goals (good research), it should be clear that I welcome this decision by the DFG.



* DHV stands for Deutscher Hochschulverband. The literal translation of the German word "Hochschule" is "high school" but the meaning is different. "Hochschule" in Germany is basically all sorts of higher education, past finishing what's "high school" in America. The American "high school" is in German instead called "Oberstufe," lit. "upper step." See also Wikipedia.