Mar 9 Eclipse Day

I have done a lot of traveling for eclipses.  In 2013 we viewed the eclipse near the equator (6°N) off Liberia.  Last year we were in Svalbard (78°N).  This year we viewed the eclipse almost dead on the equator (0° 22.5′ S) [1], on the opposite side of the world from Svalbard.


Eclipse Pictures

Here are the pictures of the eclipse.  The times are all UTC from the camera. Based on the analysis below, the clock in the camera was slow by about 6.5 seconds. C2 was about 9:45 AM WIT (local time).   All exposures were at f/5.6.

Click any of the images for a full size view.

HDR Image

Capturing a solar eclipse with a camera is a notoriously difficult problem.  Since I did not want to load up this section with text, I will discuss some of the problems below.   The following image is a collaboration between Richard Bareford and me.  Richard used a more ambitious setup on the boat and thus was able to capture a much wider range of data than I did.  Richard contributed 11 different images and I contributed one. Thanks to him for allowing me to use his data.

To my eyes this version is a good representation of what I saw.

HDR image of 2016 Solar Eclipse Bareford/Hawley

This image is, of course, heavily processed. The goal of the processing is to reveal details that would normally not be apparent due to the limitations of camera and display technologies.  For a summary of how this picture was produced, see the description below.

This is my third attempt at creating this image. I released an earlier version on the boat; this is my first attempt since returning home.

2nd Contact with Beads

The bead-like effect is caused by the moon's surface being irregular.  High points along the moon's limb block the sun first; the valleys block it last.  Thus C2 and C3 are usually not instantaneous. Some light leaks through the valleys, causing the beads.


1/4000 ISO 200 00:44:48 UTC

2016 Solar Eclipse C2 Diamond Ring

2nd Contact Chromosphere

In this shot, a couple of seconds later, the Photosphere is mostly hidden and the thin Chromosphere is visible.

1/4000 ISO 200 00:44:56 UTC

2016 Solar Eclipse C2 Chromosphere

C2 Last Prominences

This next shot shows a row of small prominences at 8 to 7 o'clock just before they were covered by the moon.  The large prominence at 9 o'clock was visible throughout totality.

1/4000 ISO 200 00:45:03 UTC

2016 Solar Eclipse C2 last chromosphere

Inner Corona

This photo was actually taken before the previous one, but I put it here to group it with the other corona photos.

1/640 ISO 200 00:45:01 UTC


2016 Solar Eclipse Inner Corona

Outer Corona

The outer corona is significantly dimmer than the bright stuff and requires longer exposures.  I partially wimped out by shooting some images at a higher ISO, but kept 1/100 as my slowest shutter speed.  Richard Bareford was much more ambitious, hence my use of his photos rather than mine in the HDR.

If you would like to get an idea of the exposures needed for full coverage of the corona, see my Libya images.

Note the bright inner parts of the corona are fully saturated.

In the soundtrack included in the movie I note that the corona at 6 o'clock has a noticeable bend. You can start to see this in these images.  It is more apparent in the HDR.


These two images were converted with Apple Aperture.  Aperture tends to stretch the images, which in this case is useful.

1/640 ISO 1600 00:46:28 UTC

2016 Total Solar Eclipse Outer Corona

1/100 ISO 1600 00:46:26 UTC

2016 Solar Eclipse Outer Corona


C3 Chromosphere

The Chromosphere is revealed just before the sun's surface is exposed.  In 2013 I made a verbal note of this.  This year it happened so fast (or I lost attention due to the length of the eclipse) that I did not see it (or failed to note it verbally).

1/4000 ISO 200 00:47:50 UTC

2016 Solar Eclipse C3 Chromosphere

C3 Diamond Ring

I like this shot better than the 1/4000 shot taken a second later.  It was taken closer to the actual C3 and shows the prominence at 9 o'clock better. In this instance the sun was revealed through a flatter part of the moon's limb, so there were few, if any, beads.

1/640 ISO 200

2016 Solar Eclipse C3


Measuring the Time of C2 and C3

In late April, Xavier Jubier, author of Solar Eclipse Maestro, approached me to see if we could use my images as a test of his program's predictions. I located two images that were not part of my original image set but were closest in time to the contact events.

C2

The closest image I had to C2 was my image 1707, taken at an indicated time of 2016:03:09 00:44:50.98.  Based on his analysis this corresponds to 00:44:57.7 in real time, about 0.6 seconds (+/- 0.2 seconds) before C2.

C3

My image 1773 was the closest to C3. Unfortunately it was a slow exposure, but you can still tell that C3 had occurred in this image and had not yet happened in 1772.  Image 1773 was taken at an indicated time of 2016:03:09 00:47:52.04, which compares to a predicted C3 time of 00:47:58.4.  This puts it about 0.2 seconds after C3.

Averaging the errors from the two images implies that the clock in the camera was slow by about 6.5 seconds.  It also shows that Solar Eclipse Maestro was spot on with its predictions.
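For anyone who wants to check the arithmetic, here is a minimal sketch. The indicated times are the ones quoted above; the "real" times are reconstructed from the predicted contact times and the stated offsets, so treat them as illustrative rather than authoritative.

```python
from datetime import datetime

FMT = "%Y:%m:%d %H:%M:%S.%f"   # EXIF-style timestamps, as the camera records them

# (indicated camera time, reconstructed real time) for the two reference frames
frames = {
    "1707 (about 0.6 s before C2)": ("2016:03:09 00:44:50.98", "2016:03:09 00:44:57.70"),
    "1773 (about 0.2 s after C3)":  ("2016:03:09 00:47:52.04", "2016:03:09 00:47:58.60"),
}

offsets = []
for name, (indicated, real) in frames.items():
    slow_by = (datetime.strptime(real, FMT) - datetime.strptime(indicated, FMT)).total_seconds()
    offsets.append(slow_by)
    print(f"image {name}: camera clock slow by {slow_by:.2f} s")

print(f"average: {sum(offsets) / len(offsets):.2f} s")   # roughly 6.5-6.7 s
```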

Solar Eclipse Maestro prediction of the Le Soléal eclipse

Experiencing the Eclipse on the Ship

Our group was on board the Le Soléal.  Originally we were supposed to view the eclipse at sea near Ternate, Indonesia, in the eastern part of the Molucca Sea.  As we approached eclipse day the weather forecast for the Ternate area became increasingly worrisome.  The reality on eclipse day was captured in an APOD. We relocated about 100 miles west, to the western part of the Molucca Sea.


2016 Eclipse location of Le Soléal
click for a larger image

We awoke to scattered clouds, but with clearer weather ahead.  Our route took us virtually parallel to the equator.  About 7:30 AM we turned south.  The plan was to turn north before C1 (8:32) and arrive near the center line at 9:47, the midpoint of the eclipse.  Heading south we were entering thicker clouds, but to our north there were only widely scattered clouds: good eclipse weather.

We had had a full-up dress rehearsal on the 7th.  At that time we found the boat so stable that telescopes on tripods held the sun until it drifted out of the field of view.  Never have I been on a ship that offered such a land-quality platform.

We serious imagers set ourselves up on a small deck that is normally closed to passengers.  This turned out to be a perfect space.

Carrie on deck
Carrie, Joan, Skip
Imagers waiting
Looking in the direction of the sun.  I am on the left adjusting the camera; Richard is to the right.  Larry Shore is just visible over my knee.
Rob and Camera
 The pirate is back.  Arrggh!
courtesy of Larry Shore
Partial Eclipse at 00:01:02 UTC
The sunspot I focused on is at about 10 o'clock from the center of the sun. Yeah, it is pretty small.  Fortunately I had a SOHO image of the sun, so I knew what I was looking for.
APOD image of 2016 eclipse
The shadow of the moon over Eastern Indonesia

Image Credit: NASA, NOAA/DSCOVR 
 APOD 2016 March 11


I used the eye patch this time, as I have in previous eclipses.  The difference is dramatic, as you can hear in the video.  One change in procedure this time was to not try to take it off at C2; instead I planned a break about 30 seconds after C2 and removed it then.

What is it like to experience an eclipse?  This movie documents our experience.


My Equipment and Techniques

Based on my experience on prior cruises, I opted to mount the camera on a monopod using a 45° adapter to allow the monopod to be closer to vertical at the time of totality.  I did not count on the rock-stable boat and practically flat sea, which made this unnecessary.

I knew that focus was going to be critical.  I started about 7:30, right after the sun cleared some distant clouds.  I could not focus live since it was too bright to see the camera screen; during the rehearsal run we had discovered that even with a jacket held over the camera it was almost impossible to see the screen.  Instead I took an image, walked to the side of the ship, checked the focus, and repeated.  It took a while, but eventually I could see a tiny sunspot that I knew was there.  Once I could see it I knew I had focus.  I adjusted once between C1 and C2, but the camera otherwise held focus.  Since this camera requires manual focus with a very sensitive ring, I had my wife remove the solar filter so I would not disturb the focus.  I then had to reacquire the sun right before C2 without looking through the camera.

While I might have used the display screen on the camera, I opted for a more reliable method: I mounted a Televue Sol Searcher on my adapter. This is basically a tiny pinhole projector that casts an image of the sun onto a translucent window.  I calibrated the pinhole to align with my camera, so before C2 I just had to keep the projected sun in the window.  Presto, I could now image C2.

I used my Canon 60Da and 300 mm IS lens again.  I tried the extender, but as in 2013 that proved to be too much magnification. Learning from previous experience, I triggered the camera with my TC-80N3 programmable controller: one shot per second, full resolution, raw only. The camera had no problem with the data rate.

I was reluctant to shoot slower than 1/100 on a monopod.  That meant dialing back to a ±2.5 f-stop bracket centered on 1/640 @ f/5.6 @ ISO 200.  This is similar to what I have used before, but 1/2 stop more open. Richard shot with a tripod and got good results as slow as 1/30!
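To put numbers on that bracket, here is a small sketch. The half-stop step size is my assumption for illustration; only the ±2.5 stop range and the 1/640 center come from the text.

```python
# Shutter times for a +/-2.5 stop bracket centered on 1/640 s (each stop is a factor of 2).
base = 1.0 / 640.0
half_stops = [i * 0.5 for i in range(-5, 6)]   # -2.5, -2.0, ..., +2.5

for stop in half_stops:
    t = base * (2.0 ** stop)
    print(f"{stop:+4.1f} stops -> 1/{round(1 / t)} s")

# The extremes come out near 1/3620 and 1/113, which lines up nicely with the
# 1/4000 and 1/100 exposures listed above.
```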

During the rehearsal I convinced myself I needed more exposures, so I practiced changing the ISO from 200 to 1600.  That got me deeper, but Richard's longer exposures were far better.  That is why I asked him to collaborate with me on the HDR image above.


Technical Discussion of Eclipse Photography and HDR Imaging

Warning - Techie Stuff

It is easy to capture something by taking a picture of an eclipse. Virtually any camera, whether wide angle or narrow, fast exposure or slow, iPhone or telescope with an astronomical camera, will capture a memory.  To get something like what is displayed on this page is more challenging and requires an understanding of the problems involved.  This section will hopefully shed light on those problems and describe how (and why) an HDR image like the one at the top of the page is created.

Capturing an Image

Human eyes are amazing.  They can perceive differences in brightness of nearly 10 orders of magnitude (a 10-billion-fold difference). We can sit in a brightly lit football stadium and still look up and see the stars [4].  The eye does have its limits, as emphasized in all of the safety warnings during the partial phases, but it is perfectly suited to viewing the entire range of phenomena during totality, from the extremely bright prominences to the dim outer corona. And it can do this all in the same view, in real time! It does this by responding non-linearly to light.  To simplify: the eye is less sensitive to bright light (up to the point where the eye is damaged) and more sensitive to dim light, and these responses can be mixed in the same view [2]. That is why we can see both the players (who are brightly lit) and the stars.

Digital cameras are more limited. Unlike our eyes, which image in real time, cameras gather light and then present the image at the end of the exposure. To adapt to a wide range of brightnesses we change the exposure to regulate the amount of light hitting the CCD.  The difference is important. Within the range of brightnesses enabled by the exposure, the camera converts the light into counts, and it does so linearly. For example, if it reports a value of 12, that means the CCD received 20% more light than a pixel reporting a value of 10.  Most modern CCDs can record 65536 different levels of brightness (which I will henceforth abbreviate as 65K) and perform linearly over much of that range.  If the camera gets too much light it saturates (reports those values as 65K).  Too little light is reported as 0.
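A toy model makes the point. The brightness numbers below are made up for illustration; only the linear response and the 65K ceiling reflect the discussion above.

```python
import numpy as np

CEILING = 65535   # the "65K" limit of a 16-bit readout

def counts(brightness, exposure):
    """Toy linear sensor: counts are proportional to light, then clipped at the ceiling."""
    return np.clip(np.round(brightness * exposure), 0, CEILING)

# Illustrative relative brightnesses: prominences/inner corona, mid corona, outer corona.
scene = np.array([2_000_000.0, 20_000.0, 150.0])

for exposure in (0.01, 1.0):                 # a short and a long relative exposure
    print(exposure, counts(scene, exposure))
# Short exposure: the outer corona is ~2 counts, lost in the noise.
# Long exposure: the bright features clip at 65535. No single exposure captures both.
```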

To repeat my analogy of the football stadium, we can take a picture of the players (which will not show stars) or of the stars (where the field will be a white blur).

Thus we have defined the first problem of eclipse photography:

•    The camera can sense only a limited range of brightnesses.  Any exposure setting will capture some part of the eclipse; other parts will be saturated or underexposed.

What you can do with a camera is strategically select a series of exposures, each of which captures a portion of the eclipse.  That is what I have done above.  Just like a slice from a medical CT, you are looking at some portion of the whole.  To see a more complete picture requires a different approach: HDR.

High Dynamic Range

The way around the problem of limited exposure range is to combine multiple exposures into a single composite image. This is called High Dynamic Range, or HDR. HDR images are constructed by combining several regular images taken at different exposures.  The creation tool scales the various parts of the image so the dim parts have smaller values and the bright parts have larger values, in proportion to the exposures.  Since you are building a mathematical model of the image, you are no longer constrained by the camera and its 65K limit.  HDR images are typically built using floating-point numbers that can represent as many as 3.4 ⋅ 10³⁸ levels of brightness with a precision of 23 bits [3]. That is many times more than the range of brightnesses in an eclipse, and with more precision (23 vs 16 bits) than the camera provides.
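Those two figures are simply the properties of a standard 32-bit floating-point number, which numpy will happily confirm:

```python
import numpy as np

info = np.finfo(np.float32)
print(info.max)     # about 3.4e+38, the largest value a 32-bit float can hold
print(info.nmant)   # 23 bits of mantissa precision, vs. the camera's 16-bit counts
```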

However, this brings us to the second problem …

The Problem of Display

When it comes time to display the image the HDR process produces, we run into another problem.  While our HDR image can encode virtually any level of brightness with high precision, typical monitors can display only 256 levels at a time. Even radiologists using high-end monitors can only distinguish up to about 900 different levels of brightness.

If we simply divided the values in the HDR down to 256 levels we would lose much of the detail of the image. This leads to the second problem:

•    Our technology for image display requires some serious compromises.

Thus forming the composite HDR image, with its tremendous range of representation, is actually the easy part of the process.  Now the work begins.

How this image was processed

Encoded within the HDR image is the information our eyes see naturally. Displaying the data on current monitors requires reaching into the data to find the details.  What we have to do is compress the tremendous range of brightnesses in the data to highlight the interesting features and present them with only 256 levels of brightness.

Fortunately there are a number of tools that do just that. My iPhone contains HDR processing that extends the range of the sensor by combining three exposures and then manipulating the image for a compromise between the dim and bright parts.  It only works, though, when the differences are not very large.

In astrophotography (where the differences can be quite large) Photoshop was the classic tool used to manipulate images. Until recently it was the gold standard (and some still use it).  More recently astronomers have been using special-purpose tools such as ImagesPlus and PixInsight; the latter is in the process of becoming the new gold standard.  I use PixInsight. These tools can handle a much wider difference between the bright parts of an image and the dimmer parts.

Here are the steps used to create this HDR.  In my normal nighttime work I use only PixInsight; for this project I also had to use Photoshop, for the reasons described below.

Converting from Raw

The original images were Canon raw (CR2) files.  These are not directly usable, but can be converted into a form that is.  Because the next step required Photoshop, I initially used it to do the conversion, but Photoshop reduced the precision of the images and stretched them on its own (which affects the HDR composition).  I ended up using PixInsight to convert the raws to TIFF (a standard lossless image format) and then used the TIFF images for alignment.
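The conversion here was done in PixInsight. For the curious, a rough scripted equivalent is sketched below using the rawpy and tifffile Python libraries (my choice of tools, not part of this workflow); the key point is to keep the conversion linear, with no automatic stretching.

```python
import rawpy      # LibRaw wrapper for reading Canon CR2 files
import tifffile   # writes 16-bit TIFF files

def cr2_to_linear_tiff(cr2_path, tiff_path):
    """Convert a CR2 raw file to a 16-bit TIFF with no automatic stretch, so the
    pixel values stay proportional to the light the sensor actually received."""
    with rawpy.imread(cr2_path) as raw:
        rgb = raw.postprocess(gamma=(1, 1),          # no gamma curve
                              no_auto_bright=True,   # no automatic brightening
                              use_camera_wb=True,
                              output_bps=16)
    tifffile.imwrite(tiff_path, rgb)

# cr2_to_linear_tiff("IMG_1707.CR2", "IMG_1707.tif")   # hypothetical file names
```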

Align the Image

The sun moved between the exposures, so to combine the images they must first be aligned. While PixInsight has an alignment function, it did not work properly with this data (probably because the edge of the moon is obscured by flaring in the long exposures).  I therefore imported the images into Photoshop and aligned them manually, creating a set of aligned TIFF files for the next step.
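For what it is worth, alignment can also be scripted. The sketch below estimates the whole-pixel shift between two frames by phase correlation; it would likely struggle with the flared long exposures for the same reason PixInsight did, but it shows the idea.

```python
import numpy as np

def whole_pixel_shift(reference, image):
    """Estimate the (row, col) shift of `image` relative to `reference` by phase correlation."""
    R = np.fft.fft2(reference)
    G = np.fft.fft2(image)
    cross = np.conj(R) * G
    cross /= np.abs(cross) + 1e-12                   # keep only the phase information
    corr = np.abs(np.fft.ifft2(cross))
    peak = list(np.unravel_index(np.argmax(corr), corr.shape))
    for axis in (0, 1):                              # undo the FFT wrap-around
        if peak[axis] > corr.shape[axis] // 2:
            peak[axis] -= corr.shape[axis]
    return int(peak[0]), int(peak[1])

# Self-test on a synthetic lunar disk shifted by (3, -5) pixels:
y, x = np.mgrid[0:128, 0:128]
disk = (((x - 64) ** 2 + (y - 64) ** 2) < 900).astype(float)
print(whole_pixel_shift(disk, np.roll(disk, (3, -5), axis=(0, 1))))   # -> (3, -5)
```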

Build the HDR

2016 Eclipse HDR image
Now it is time to combine the images.  PixInsight has a tool called HDRComposition for just this purpose.  It reads the data and builds an image in which the longer exposures are scaled to smaller values.

The result looks like just the brightest image, but the data from all of the images (in this case 11) is there, scaled and combined so each image contributes its best range of values.
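Conceptually, the composition scales every frame into common units (counts per second of exposure) and, at each pixel, prefers the longest exposure that is not saturated. The sketch below is my own illustration of that idea, not PixInsight's actual HDRComposition algorithm.

```python
import numpy as np

SATURATION = 60000   # counts this close to the 16-bit ceiling are treated as unusable

def compose_hdr(frames, exposure_times):
    """Combine aligned 16-bit frames into a single floating-point image.

    frames: list of 2-D uint16 arrays, already registered to each other.
    exposure_times: matching shutter times in seconds.
    """
    hdr = None
    # Walk from shortest to longest exposure; longer, unsaturated pixels win because
    # they carry better signal-to-noise in the dim regions.
    for frame, t in sorted(zip(frames, exposure_times), key=lambda pair: pair[1]):
        scaled = frame.astype(np.float64) / t        # counts per second of exposure
        if hdr is None:
            hdr = scaled
        else:
            usable = frame < SATURATION
            hdr[usable] = scaled[usable]
    return hdr
```

With the 11 frames used here, the output is one float image whose values are no longer limited by the 65K ceiling of any single exposure.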

Remap the Brightness

2016 Eclipse Bareford/Hawley  After HT HDMT
As seen above, initially only the brightest image is visible in the combined image.  That is because the default display is a simple divide-by-256 presentation of the data.  The dim stuff resides in the lower numbers and hence does not make the cut.  Thus our first processing step is to remap the data so the dim stuff is visible.  To do this we have to increase the brightness non-linearly; in this case non-linearly means that dimmer values are boosted more strongly than brighter values.  This process tends to overcompensate, which clumps a lot of the data into the bright values.
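PixInsight's histogram stretch is built around a midtones transfer function; the sketch below shows that kind of non-linear remap on a handful of made-up pixel values (normalized to 0..1, as PixInsight does).

```python
import numpy as np

def mtf(x, midtone):
    """Midtones transfer function: maps 0 -> 0, midtone -> 0.5, 1 -> 1, boosting
    values below the midtone far more strongly than values near 1."""
    x = np.asarray(x, dtype=float)
    return ((midtone - 1.0) * x) / ((2.0 * midtone - 1.0) * x - midtone)

# Made-up normalized pixel values, from outer corona up to prominences:
pixels = np.array([1e-5, 1e-4, 1e-3, 1e-2, 1e-1, 0.9])

print(np.round(pixels * 255))              # linear display: the dimmest values all land on 0
print(np.round(mtf(pixels, 0.001) * 255))  # strong non-linear stretch: the dim structure shows up
```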

We then use another tool that analyzes the image for subtle changes in relative brightness; these represent the structural differences to highlight. It spreads the brightness values over a wider range of numbers, which allows more of the structure to be displayed.  After this step the image looks dimmer, since data hidden in similar bright values has been spread across more levels of brightness; this, however, is what makes the hidden details appear.

As you can see, the bright flare at 2 o'clock and the arcing corona at 6 o'clock are now visible.



Sharpen the Image

2016 Eclipse Bareford/Hawley after MMT
But we are not done.  Much more detail lies hidden in the above image.

Next we need to sharpen the edges to bring out the more subtle structure. The sharpening method PixInsight uses is hard to explain, but an analogy makes the technique clearer. Think of it like the equalizer on a stereo: if you want to hear the high frequencies of the flutes, you adjust the equalizer to amplify the higher frequencies more than the others. Now the flutes contribute more to what you hear.

PixInsight does something like this, but instead of frequencies it works on the size of the structures.  Does interesting detail exist at a 16-pixel scale? Then add more of that scale back when creating the processed image.  In this case the fine detail of the flares lived in small structures, while the large flare at 2 o'clock was a larger structure.  Each was tweaked so that the image was enhanced while introducing as few artifacts as possible.
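A crude version of this size-based "equalizer" looks like the sketch below: a difference-of-Gaussians stand-in for PixInsight's wavelet layers, with made-up gains.

```python
from scipy.ndimage import gaussian_filter

def multiscale_boost(image, sigmas=(1, 2, 4, 8), gains=(1.8, 1.4, 1.1, 1.0)):
    """Split a numpy image into detail layers by structure size (differences of
    progressively stronger blurs), multiply each layer by its own gain, and add
    everything back onto the smooth residual. A gain of 1.0 leaves a scale untouched."""
    working = image.astype(float)
    layers = []
    for sigma in sigmas:
        smoothed = gaussian_filter(working, sigma)
        layers.append(working - smoothed)   # detail living near this scale
        working = smoothed
    result = working                        # large-scale residual, left alone
    for layer, gain in zip(layers, gains):
        result = result + gain * layer
    return result
```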

This sounds ad hoc, but there are tools that show how the image is composed, so the user has an idea of which way to alter the image.

This all makes sense when you see how it works.

Photoshop users sharpen images with a function confusingly named Unsharp Mask, a name inherited from an old darkroom technique.
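The name makes more sense once you see the recipe: a blurred ("unsharp") copy is subtracted from the original to isolate the fine detail, and a fraction of that detail is added back. A minimal sketch:

```python
from scipy.ndimage import gaussian_filter

def unsharp_mask(image, radius=2.0, amount=0.8):
    """Sharpen a numpy image by adding back a fraction of the detail that blurring removes."""
    image = image.astype(float)
    detail = image - gaussian_filter(image, radius)
    return image + amount * detail
```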

Managing the Noise

2016 Eclipse Bareford/Hawley  Before noise reduction
click for a larger image
The original data probably has some noise in it naturally, and the operations above contribute their own.  PixInsight has several tools to remove the noise by selectively blurring it out; the trick is to blur the noise, not the details we previously revealed. The image to the left is the "before" image (a blow-up of the image in the previous paragraph).
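The sketch below illustrates the "blur the noise, not the detail" idea in its simplest possible form: smooth everywhere, then keep the original pixels where the local gradient says real structure lives. PixInsight's actual noise-reduction tools are considerably more sophisticated than this.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, gaussian_gradient_magnitude

def selective_denoise(image, blur_sigma=1.5, edge_sigma=1.0):
    """Blend a blurred copy with the original, weighted so that flat (noisy) areas
    get the blur while strong edges and detail keep their original pixels."""
    image = np.asarray(image, dtype=float)
    smoothed = gaussian_filter(image, blur_sigma)
    edges = gaussian_gradient_magnitude(image, edge_sigma)
    keep = edges / (edges.max() + 1e-12)    # ~0 in flat areas, ~1 on the strongest detail
    return keep * image + (1.0 - keep) * smoothed
```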


Since I liked my own fastest exposure better than Richard's, I took the liberty of overlaying his image of the prominences and moon with mine.

Together these operations reach into the data and pull out the details you see in the presented image.  The resulting image is a compromise.  With only 256 levels of display one has to be careful to preserve contrast (the point of remapping the brightness above) while still showing that some parts of the image are brighter than others. One also must not be too heavy-handed with the sharpening, or the image starts resembling an abstract painting. This is as much art as science.

Reality Check

Now that you have done all this work, is it EXACTLY what you saw?  Of course not.  While experts like Wong or Luethen (or my own humble work) can create images that amaze and remind us of the details we saw, nothing compares to the sight of the real eclipsed sun.  That is why those of us who do this book the next trip, and why others, having only these excellent pictures to go on, wonder why we bother.

Footnotes


1 For those ship captains keeping track, I have now crossed the equator 3 times by boat: once on Mar 7 and twice on Mar 9.  I have also crossed it 15 times by plane (I was above the equator returning this time).
2 This ignores dark adaptation, but this is an oversimplified model anyway.  Dark adaptation effectively allows dimmer things to be seen at the expense of viewing brighter ones. 
3 Actually PixInsight uses 64-bit floating point, which permits an insane range and precision. By convention PixInsight scales all brightness values between 0.0 (black) and 1.0 (white), so every brightness is some fractional value.
4 Lack of understanding of this point is the source of one of the Moon Landing Myths.

Copyright


Creative Commons License
The Indonesia 2016 pages by Robert J. Hawley are licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 United States License. This permits the non commercial use of the material on this site, either in whole or in part, in other works provided that I am credited for the work and you apply this or similar license to the result. Some of the included works bear a similar copyright.

Data used in the HDR image is Copyright(c) 2016 Richard Bareford  All Rights Reserved. Please contact me before incorporating this image into other pages.

rjh 11/1/17