Archive

Archive for December, 2008

Kepler’s Equation: 2009 Edition

December 24th, 2008 No comments

As anyone who’s used the systemic console knows, the numerical integration of planetary orbits is aggravatingly slow. For modern-day dynamicists, endless pages of algebra are often a thing of the past. Now it’s “hurry up and wait” while the computers grind through the integrations.

If you’re charting the courses of planets that have negligible planet-planet gravitational interactions, then life runs at interactive pace. Instead of integrating 6N coupled ordinary differential equations, you need only solve Kepler’s equation, M = E − e sin E, which parameterizes the position of the planet on its ellipse as a function of time.

In an era of environmental and economic collapse, solving M = E − e sin E for E doesn’t seem like a big problem. Simple iteration, for example, works quite well. Remarkably, however, as pointed out by Peter Colwell in his 1993 book Solving Kepler’s Equation Over Three Centuries, there have been scientific papers written about Kepler’s Equation and its solution in every decade since 1650. From the synopsis of Colwell’s book:
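For moderate eccentricities, Newton’s method knocks the equation out in a handful of iterations. Here’s a minimal sketch (the function name, starting guesses, and tolerances are my own choices, not anything canonical):

```python
import math

def kepler_E(M, e, tol=1e-12, max_iter=50):
    """Solve Kepler's equation M = E - e*sin(E) for the eccentric
    anomaly E via Newton-Raphson iteration."""
    E = M if e < 0.8 else math.pi  # common starting guesses
    for _ in range(max_iter):
        dE = (E - e * math.sin(E) - M) / (1.0 - e * math.cos(E))
        E -= dE
        if abs(dE) < tol:
            return E
    raise RuntimeError("Kepler solver failed to converge")

E = kepler_E(M=1.0, e=0.3)
# sanity check: plugging E back in recovers M
assert abs((E - 0.3 * math.sin(E)) - 1.0) < 1e-10
```

For nearly parabolic orbits (e close to 1) the Newton step can misbehave, which is part of why the literature on this "technical little problem" runs to three centuries.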

The sole subject of our work is Kepler’s Equation (KE) M = E − e sin E. In its narrowest form, the Kepler problem is to solve KE for E, given M in the interval [0, π] and e in the interval [0,1]. In virtually every decade from 1650 to the present there have appeared papers devoted to the Kepler problem and its solution. We can see from a list of them that the problem has enticed a wide variety of scientists to comment on or involve themselves in its solution.

It is surely not unique in science for a specific problem to be given so much attention over so long a period–particularly if it resists solution, if its partial solutions are inadequate or unsatisfactory, or if it is recharged with new interpretations and new applications. Still, it is curious that the Kepler problem should have continued to be this interesting to so many for so long. Admittedly it is a problem central to celestial mechanics, but it is a technical little problem for which a number of satisfactory solutions are long known. With the advent of calculators and computers, there is no impediment to achieving quick solutions of great accuracy. The problem has neither the broad appeal of an Olbers Paradox, nor the depth and intractability of a many-body problem.

In common with almost any scientific problem which achieves a certain longevity and whose literature exceeds a certain critical mass, the Kepler problem has acquired an undeniable luster and allure for the modern practitioner. Any new technique for the treatment of transcendental equations should be applied to this illustrious test case; any new insight, however slight, lets its conceiver join an eminent list of contributors.

Perhaps the most influential article of the 1990s that touches directly on Kepler’s equation is Wisdom and Holman’s 1991 paper that describes the N-body map. The basic idea is that the trajectories of interacting planets can be divided neatly into a part consisting of Keplerian motion, and a part consisting of the derangements brought on by the interplanetary gravitational tugs. A Wisdom-Holman integration avoids forcing the computer to continually rediscover Kepler’s ellipse, reducing much of the integration to repeated numerical evaluations of Kepler’s equation. For orbital integrations that don’t involve close encounters, this trick leads to an order-of-magnitude speed-up. N-body maps have made it possible to (for example) readily integrate the motion of the solar system planets for the lifetime of the solar system.
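The “Keplerian part” of such a map boils down to this: to drift a planet along its ellipse, convert elapsed time to mean anomaly, solve Kepler’s equation, and read off the position. A schematic version (2-D, units with GM★ = 1, pericenter on the +x axis; the function names are illustrative, not Wisdom and Holman’s actual code):

```python
import math

def solve_kepler(M, e, tol=1e-12):
    """Newton iteration for M = E - e*sin(E)."""
    E = M
    for _ in range(50):
        dE = (E - e * math.sin(E) - M) / (1.0 - e * math.cos(E))
        E -= dE
        if abs(dE) < tol:
            break
    return E

def kepler_drift(a, e, t):
    """Position on the orbital ellipse at time t (pericenter passage
    at t=0), in units where GM_star = 1."""
    n = a ** -1.5                   # mean motion, n = sqrt(GM/a^3)
    M = (n * t) % (2.0 * math.pi)   # mean anomaly
    E = solve_kepler(M, e)          # eccentric anomaly
    x = a * (math.cos(E) - e)
    y = a * math.sqrt(1.0 - e * e) * math.sin(E)
    return x, y

x, y = kepler_drift(a=1.0, e=0.1, t=0.0)  # at pericenter: (a(1-e), 0)
```

In a full Wisdom-Holman scheme this drift alternates with “kicks” from the interplanetary perturbations, but the drift is where Kepler’s equation gets evaluated over and over.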

As the first decade of the new millennium starts to draw to a close, I was pleased to see that the 350+ year tradition is continuing. In a recent astro-ph posting, Eric Ford shows how graphics cards can be commandeered to implement highly parallelized numerical evaluations of Kepler’s equation. Using mixed-precision arithmetic, he shows that graphics cards can offer a speed-up of a factor of ~600 over console-style evaluations of M = E − e sin E that use the regular ol’ CPU. Having the clock hands move 600 times faster really brings Markov Chains to stochastically vibrant life.
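The reason the GPU wins so handily is that the problem is embarrassingly parallel: each (M, e) pair is solved independently. A CPU-side sketch of the same idea using NumPy vectorization (a stand-in for actual CUDA kernels, with a fixed iteration count so every element follows the same branch-free path):

```python
import numpy as np

def kepler_E_vec(M, e, n_iter=20):
    """Solve M = E - e*sin(E) for many mean anomalies at once with
    vectorized, fixed-count Newton iterations."""
    E = M.copy()
    for _ in range(n_iter):
        E -= (E - e * np.sin(E) - M) / (1.0 - e * np.cos(E))
    return E

M = np.linspace(0.0, 2.0 * np.pi, 1_000_000, endpoint=False)
E = kepler_E_vec(M, e=0.3)
residual = np.max(np.abs(E - 0.3 * np.sin(E) - M))  # near machine precision
```

A million solves run in a fraction of a second even on a CPU this way; the mixed-precision GPU version in Ford’s paper pushes the same lockstep structure much further.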

And the 2010s? I think quantum computation might turn the order N^2 N-body problem into an order-N computation (see this post). That’ll free up the GPUs so that everyone can get back to playing Grand Theft Auto.

Categories: worlds Tags:

CenFlix

December 14th, 2008 3 comments

Image copyright 1951, 20th-Century Fox.

A search on “Alpha Centauri” in the news archives of the New York Times turns up an average of one or two hits per year, including a scattering of genuine astronomical news clippings about the stellar system itself.

For example, on August 31st, 1904, a bulletin datelined Lick Observatory reported that the distance to Alpha Centauri had been determined “spectroscopically”, although it’s fully uncommunicative of any further details. On December 27th, 1925, there was an item (unfortunately tagged pay-to-play) that seems very much in the oklo.org vein:

NEAREST STAR FLIES TO US FROM SPACE; Its speed is Fourteen Miles a second. TWENTY-FIVE thousand years hence New York will be able to see Alpha Centauri our nearest stellar neighbor. Alpha Centauri travels toward the earth at the rate…

In many of the citations, Alpha Cen hits the stands in its role as a cultural touchpoint. For example, in the Dec. 28th, 1969 edition, one finds a post-Apollo, pre-Watergate prediction (presumably a joke):

Reading the Tea Leaves — What will happen in 1970… Vice President Agnew, cut in on a split screen, suggests that the U.S. launch a crash program to go to Alpha Centauri, the nearest star.

Similarly, upon reading Friday’s NYT edition, 20th Century Fox executives must have been elated to find that their publicity stunt for The Day the Earth Stood Still had been given a promotional write-up in the science section. Last Friday at noon, it seems, the big-budget remake of the Cold-War classic was beamed in its entirety to Alpha Centauri. To one-sigma precision, the transmission will be illuminating Alpha Cen Bb sometime between Monday April 22nd, 2013 and Saturday April 29th, 2013, just a few months into Obama’s second term.

So what are the smart-money odds that the movie will actually get watched in the Alpha Cen system? Oklo.org recommends the following conditional probabilities:

fp = Chance of a habitable planet orbiting Alpha Cen B = 0.6

fl = Chance that life evolved on that planet = 0.01

fi = Chance that life developed intelligence = 0.1

fr = Chance that intelligence understood Maxwell’s Equations = 0.01

fn = Chance that Maxwell’s Equations are currently understood on Alpha Cen Bb = 64,000 / 3×10^9 = 0.0000213.

This gives (fp)x(fl)x(fi)x(fr)x(fn) = one in eight billion, with Alpha Cen Ab kicking in an additional one in a trillion chance.

The numerator in fn is a decision-market estimate corresponding to the long-term running mean (not median!) result of polling students in my classes as to how long they think we’ll remain capable of building radios. The denominator is an estimate of the span of past time during which Alpha Cen Bb could have conceivably harbored intelligence.
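Multiplying the chain of conditionals through, just to check the arithmetic:

```python
fp, fl, fi, fr = 0.6, 0.01, 0.1, 0.01
fn = 64_000 / 3e9            # ~2.13e-5

p = fp * fl * fi * fr * fn   # joint probability for Alpha Cen Bb
odds = 1.0 / p               # ~7.8e9, i.e. roughly one in eight billion
```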

Signals beamed to other worlds are readily subject to misinterpretation. I’ve always enjoyed Michael Arbib’s take on the 1961 version of the Drake signal turned upside down:

Friday’s transmission does make one thing clear, though. If a genuine ETI signal is ever beamed to Earth, it’ll almost certainly be a commercial advertisement. The primary problem of interpretation will simply be to figure out how to wire back our cash.

UPDATE:

In the comments section, bruce01 makes the following astute observation:

Alpha Centauri, at declination -60 degrees, is barely above the horizon even from Florida. The web site:

http://www.deepspacecom.net/

says they are located near the Kennedy Space Center which is north of latitude 28 degrees. This makes the zenith angle of Alpha Centauri greater than 88 degrees as seen from the Space Center. You need to add to your equation the probability that the “beamed” signal made it through the Earth’s atmosphere without being totally scattered.

Indeed. Furthermore, for the entire duration of the broadcast, Alpha Cen (RA 14h:39m, DEC -60deg:50min) was below the horizon as viewed from 28 35 06N, 80 39 04W. One can’t help but wonder whether bruce01 may have made a vital contribution to the solution of the long-running Fermi Paradox.
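bruce01’s geometry is easy to verify: for a star at declination δ observed from latitude φ, the altitude at upper culmination is 90° − φ + δ. Plugging in the coordinates quoted above:

```python
# Kennedy Space Center latitude, +28d 35m 06s
lat = 28 + 35/60 + 6/3600
# Alpha Centauri declination, -60d 50m
dec = -(60 + 50/60)

# altitude at upper culmination (degrees): 90 - lat + dec
alt_max = 90.0 - lat + dec
# alt_max ~ 0.58 degrees -- Alpha Cen never climbs more than about half
# a degree above the horizon from the Cape, and only at culmination.
```

At all other hour angles the star sits lower still, which is why it was below the horizon for the entire broadcast.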

I’m absolutely confident, though, that any organization with the reach and technical expertise advertised by the Deep Space Communications Network would maintain a fully staffed southern hemisphere station for their broadcasts to the southern skies.

Categories: worlds Tags:

80sec. 0.47mmag. (!)

December 7th, 2008 10 comments

I like it when remarkable exoplanet results are disguised within more-or-less innocuously titled papers. A nice example occurred this summer, with “The HARPS search for southern extra-solar planets. XIII. A planetary system with 3 Super-Earths (4.2, 6.9, & 9.2 Earth masses)”. While it’s true that the three planets orbiting HD 40307 are indeed cool, the Geneva team announced much bigger news in the discussion section of the article. They reported, almost offhandedly, that 1/3 of solar-type stars have sub-Neptune mass planets with periods of 50 days or less. That’s the most important planet news since the discovery of hot Jupiters.

Another instance can be found in last weekend’s astro-ph mailing under the file-to-read-later title, “A Smaller Radius for the Transiting Exoplanet WASP-10b“. In this article, John Johnson and collaborators demonstrate 0.47 millimagnitude per-sample photometry with a cadence of 1.3 minutes from the ground. At first glance, their light curve of a WASP-10b transit looks like it came from outer space:

For comparison, here’s the classic 2001 HST composite light curve of the HD 209458b transit that really did come from outer space:

The HST light curve has an 80 second (1.33 min) cadence, and a per-point precision of 0.11 millimagnitudes. Because of HST’s low-Earth orbit, however, it took four separate transits to assemble the composite light curve:

Since the HST composite draws on four transits, a single HST transit corresponds to an effective precision of 0.11 × √4 = 0.22 millimagnitudes. On a per-transit, inverse-variance basis, then, Johnson et al.’s ground-based photometry delivers (0.22/0.47)² ≈ 22% of the information content of the HST photometry. That is extraordinary value for the dollar.
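The per-transit comparison works out as follows (precision scales as the square root of the number of transits combined, and information content as the inverse variance):

```python
import math

hst_per_point = 0.11   # mmag, composite of 4 transits
ground = 0.47          # mmag, single ground-based transit

hst_single = hst_per_point * math.sqrt(4)   # 0.22 mmag per HST transit
info_fraction = (hst_single / ground) ** 2  # ~0.22 of HST's information
```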

The WASP-10 curve was obtained with a type of CCD called an orthogonal transfer array, which controls how the starlight is spilled onto the individual pixels. By distributing the incoming photons in a highly disciplined manner over a larger area of the detector, saturation is staved off, and the duty cycle is improved.

WASP-10 is a 12.7 magnitude star, and so its transit light curve certainly benefits from having control stars of similar magnitude in the field of view of the 2.2m telescope. The most interesting transiting planets occur around brighter stars (accessible to Spitzer). Nevertheless, it seems quite probable that an observational set-up using a neutral density spot filter for the primary star would allow similar precision on brighter stars. (Back in the day, Tim Castellano used the spot filter technique to check HD 187123 for transits by its hot Jupiter.)

It’s interesting to look at a few of the possibilities that open up if one can do 80sec–0.47mmag photometry from a facility that’s not dauntingly oversubscribed.

Transit timing is high on the list. TTV precision scales in direct proportion to photometric precision, and it scales with cadence to the -1/2 power. For the Wasp-10b transit, the moment of the central transit was measured to a precision of 7 seconds. At this level, it’s possible to sense the presence of very small perturbing planets, especially if one also has precise radial velocities. Stefano has been burning the midnight oil to improve the systemic console for research-grade use. One of the primary capabilities of the new console is an enhanced transit timing analysis suite that is capable of fully exploiting timing measurements at the 5-10 second level. We’ll be officially rolling out the new console quite soon. (In the interim, you can get the current build here.)
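Taking the 7-second WASP-10b measurement as the anchor, those scalings let one estimate the timing precision of a different setup. A back-of-the-envelope helper (the function and its defaults are my own framing; note that “cadence to the -1/2 power” is in terms of sampling rate, so the timing error grows as the square root of the sample interval):

```python
import math

def ttv_precision(sigma_phot, interval,
                  ref_sigma=7.0, ref_phot=0.47, ref_interval=80.0):
    """Scale the 7 s WASP-10b timing precision to another setup:
    proportional to photometric noise (mmag), and to the square root
    of the sample interval (seconds)."""
    return ref_sigma * (sigma_phot / ref_phot) * math.sqrt(interval / ref_interval)

# e.g. HST-quality 0.11 mmag photometry at the same 80 s cadence:
sigma = ttv_precision(0.11, 80.0)   # ~1.6 s central-transit timing
```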

Should transit timing indicate the presence of an Earth-mass perturbing companion, then there’s a reasonable chance that the perturber also transits the parent star. If the timing model can give good predictions for when the transit might occur, then 80sec–0.47mmag is fully sufficient to detect Earths from the ground.

In the figure just below, I’ve zoomed in on an out-of-transit portion of Johnson et al.’s Wasp-10b light curve. At this scale the 10^-4 depth of a transiting Earth is just resolved at weblog resolution. By binning the photometry into half-hour chunks, one reaches this precision. A transit by an Earth-sized planet could thus be a multi-sigma detection in a single night. Hot Damn!
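The binning arithmetic: a half-hour chunk holds about 22 of the 80-second samples, and uncorrelated noise beats down as the square root of the number of points:

```python
import math

per_point = 0.47              # mmag per 80 s sample
n_bin = 30 * 60 / 80          # samples per half-hour bin: 22.5

binned = per_point / math.sqrt(n_bin)
# ~0.099 mmag, i.e. right at the ~1e-4 depth of an Earth-sized transit
```

This assumes the noise is white on half-hour timescales; red noise from the atmosphere would degrade the gain somewhat.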

And then there’s the Transitsearch angle. There are a number of Neptune-mass planets that (to my knowledge) have not been adequately checked for transits because their predicted photometric depths were just too small. At the 80sec-0.47mmag level, these planets come right into play. A short list would include (1) 55 Cancri e (11 Earth masses, 10.1% transit probability, 0.065% transit depth), (2) HD 219828b (19 Earth masses, 15.6% transit probability, 0.027% transit depth), (3) HD 40307b (4.3 Earth masses, 6.8% transit probability, 0.052% transit depth), (4) HD 69830b (10.2 Earth masses, 4.9% probability, 0.072% transit depth), and (5) HD 160691d (14.38 Earth masses, 5.6% probability, 0.056% transit depth). Assuming that your RV fits are up to date and that you’re first on the sky with one of these bad boys, your expectation value can run into hundreds of thousands of Swiss Francs per hour.

Categories: worlds Tags: