- 1 Goddard Space Flight Center, NASA, Greenbelt, MD, United States
- 2 University of Maryland Baltimore County, Baltimore, MD, United States
The Earth Polychromatic Imaging Camera (EPIC), onboard the Deep Space Climate Observatory (DSCOVR) spacecraft at the Earth-Sun Lagrange 1 point, has captured a unique optical effect during lunar occultation, named “Gaia’s Crown.” In EPIC images, the phenomenon appears as a small “flange” at the Earth–Moon contact when the Moon is roughly half below Earth’s limb; it is present in the visible and near-infrared channels but absent in the ultraviolet. Using atmospheric data and a 3D, voxel-based ray tracing model, we identified this effect as a combination of atmospheric distortion and a complex mirage caused by variations in the Earth’s atmosphere. Additionally, we show that while satellites closer to the Earth can see a similar phenomenon, Gaia’s Crown presents unique distortion effects that demonstrate how EPIC’s vantage point at 1.5 million kilometers from Earth provides a different perspective on atmospheric optics.
1 Introduction
The Deep Space Climate Observatory (DSCOVR) is a NOAA-operated satellite that orbits the Earth-Sun Lagrange-1 (L1) point. This location, at approximately 1.5 million kilometers away from the Earth, offers a unique view of our planet where the disk is almost fully illuminated. The DSCOVR spacecraft serves as a host to both Sun- and Earth-observing instruments.
The Earth Polychromatic Imaging Camera (EPIC), a NASA instrument onboard DSCOVR, utilizes a 30 cm Cassegrain telescope with a 0.62° field of view to photograph the Earth from this vantage point. Equipped with a 2048 × 2048 charge-coupled device (CCD) sensor, it takes 13 or 22 image sets daily. Each set consists of 10 narrowband images covering wavelengths between 317 and 780 nm, with the more frequent sets occurring during the Northern Hemisphere summer (Marshak et al., 2018). This wavelength range permits a unique view of the Earth in ultraviolet, visible, and near infrared, which enables detection of ozone, sulfur dioxide, aerosols, vegetation, ocean, cloud properties, and other science applications (Marshak et al., 2018).
This brief report documents an observational phenomenon in EPIC imagery and demonstrates a first-order, physically plausible mechanism. We do not attempt a full radiative-transfer or global occurrence analysis. Our aim is identification and plausibility, leaving comprehensive modeling for follow-up work.
2 Observations
The EPIC camera nominally observes just the sunlit Earth, but its 0.62° field of view allows it to occasionally capture the Moon as well, as seen in the left portion of Figure 1. The instrument observes these events roughly four times a year: two events with the Moon passing in front of the Earth and the remainder with it passing behind.
Figure 1. Left: Color image taken by EPIC of the Moon passing behind the Earth on 27 September 2015. Right: A close-up of the 680 nm band that shows the lunar limb phenomenon where the Moon intersects with the Earth. The image has been gamma corrected to improve the visibility of the phenomenon.
In 2015, a special imaging campaign was performed in which the EPIC camera was trained on the Moon, following it as it transited behind the Earth. Upon more detailed inspection of the images, an interesting effect was seen as the Moon passed behind the Earth. In this sequence, as seen in the right, zoomed-in portion of Figure 1, a small “flange” appears, almost like a mount for the Moon against the planet’s surface. It was not known what this phenomenon was or how it originated. This paper discusses the origin of the phenomenon and the underlying optics.
A review of the original, unprocessed level 0 data, which contains the raw counts as received from the instrument, also revealed the feature, eliminating possible level 1a processing calibration artifacts as the cause. As part of this investigation, we determined that pixels in the feature are not saturated, indicating that the cause is not CCD bloom, nor does it mimic any known optical artifact of the telescope including stray light (Cede et al., 2021). This was true for all images investigated as part of this paper.
Because the sequence was part of a public outreach imaging campaign to acquire color photos of the Moon passing behind the Earth’s disk, the instrument recorded only the red (680 nm), green (551 nm), and blue (443 nm) channels. The phenomenon was seen in all three wavelengths.
A review of the EPIC archive was conducted and all images where the Moon passed behind the Earth were obtained. Out of the 16 image sets obtained where the Moon intersected the Earth’s horizon, only two additional sets contained the phenomenon. In the set of data collected over 10 years, it occurred only 16.7% of the time. As seen in Table 1, the only elements common to the occurrences are that the limb was dark and the Moon was halfway below the horizon. The orientation of the Moon on the CCD and the physical location where the lunar limb intersects the Earth appear to be irrelevant. The phenomenon is not seen when the limb is illuminated where the Moon intersects it.
In an additional dataset, all 10 bands were obtained (Figure 2), enabling a view of the limb phenomenon at different wavelengths. The phenomenon is mostly invisible in the ultraviolet (UV) bands at 317, 325, 340, and 388 nm and becomes increasingly clear as the bands progress through 443, 551, 680, 688, 764, and 780 nm. This indicates that the phenomenon is likely not due to atmospheric scattering, which is weaker in the visible and near-IR bands (Figure 2).
Initial inspection of the Moon passing behind the Earth from other spacecraft, such as GOES and the ISS, did not yield the same phenomenon (as seen in Figure 3). Furthermore, crew members on board the ISS who had viewed comparable Earth-Moon transits reported that no flange-like feature was visible to the naked eye, corroborating the absence of the phenomenon at low-Earth viewing geometry (Pettit and Hague, 2025). Both images from the International Space Station (ISS), at an altitude of ∼400 km, and Geostationary Operational Environmental Satellite (GOES) Advanced Baseline Imager (ABI), at ∼35,800 km, display expected optical compression from refraction through the atmosphere. While this image is not included in this paper, the optical compression can also be seen in images from Himawari-8 (Universe Space Tech, 2023). However, while this paper was in the review process, another, newly taken picture from Himawari-8 revealed a similar phenomenon to the flange. The origin of this and the situations in which it appears is discussed in Section 7.
Figure 3. Views of the Moon passing behind the Earth. Left: Astronaut photography from International Space Station (ISS) (NASA via Astronomy.com, 2023). Right: From GOES Advanced Baseline Imager (ABI) (NOAA NESDIS, 2020). Both exhibit typical optical compression from atmospheric distortion.
3 Mirage review
A further review of atmospheric phenomena was conducted, and from a catalogue of possible features (Naylor, 2002) the following observation was made: the appearance is very similar to a feature known as an “omega mirage,” an example of which is shown in Figure 4.
Figure 4. Example of Sun “omega mirage.” This inferior mirage occurs when the ocean creates a surface of low-density hot air (Inaglory, 2007).
Mirages are an optical phenomenon in which light is bent by contrasting layers of air temperature and humidity. This bending, produced by the differing refractive indices of the layers, causes displacement and distortion of distant objects in the sky.
Mirages are separated into roughly three classes: inferior, superior, and “complex.”
In inferior mirages, as seen in Figure 5, an inversion of the light rays causes a mirror reflection of an object to appear below it. This is caused by scenarios where the air has high temperatures near the ground followed by layers of cooler temperatures as the height increases. A common example of this is the “highway mirage” where a hot road appears as if it has a reflective puddle, but instead it is caused by the temperature differences above the road (Hecht, 2002).
Figure 5. Examples of superior and inferior mirages. Superior mirages, caused by layers of increasingly warmer air, make an object appear to float. Inferior mirages, caused by layers of hot air that cool with height, produce a mirror-like reflection of the object (Lorenzelli, 2014).
In superior mirages, as seen in Figure 5, light bends in such a way as to cause an object to appear to float above its original location. In this situation, layers of cool air followed by warm air cause an expansion effect, which makes objects appear higher. Superior mirages often cause objects that are visually below the line of sight to appear on the horizon when they would not in normal conditions (Lynch and Livingston, 1995).
The final class, the “complex mirage,” occurs when there are alternating layers of hot and cold air. In these mirages, known as fata morgana, the optical effects consist of inverted images, compressed and stretched layers, as well as objects “floating” on the horizon (van der Werf, 2022).
4 Theory development
Terrestrial mirages occur when different temperature layers cause changes in the air density, which lead to changes in the refractive index. Near the ground, these thermal layers form due to local conditions: for example, heat rising from the ocean, or cool air overlaid by warm air, creates distinctive optical distortions.
It was considered that the phenomenon was perhaps based on temporary atmospheric conditions. To investigate, we used the Goddard Earth Observing System Forward Processing (GEOS-FP) data set, a product that consists of near-real-time, global weather maps. GEOS-FP merges millions of satellite and ground observations with a numerical weather model every 6 h; this data-assimilation step produces hourly fields of temperature, humidity, winds, and other variables on a grid of roughly 25 km resolution (Global Modeling Assimilation Office, 2017). An example of GEOS data can be seen in Table 2.
Table 2. Vertical sample from a GEOS dataset (GEOS.fp.asm.inst3_3d_asm_Np.20230205_2100.V01.nc4) located at Goddard Space Flight Center (38.99N, 76.85W) for example purposes.
Atmospheric parameters from the GEOS product, including temperature and humidity, were obtained and reprojected into the EPIC projection and field of view at the times of EPIC partial lunar occultation events. However, the results revealed no evidence of a relationship between temporal atmospheric conditions and whether the phenomenon appeared.
Further review of the data was conducted, and it was observed that in all three occurrences the Moon was halfway below the horizon. Although this is not statistically significant, because no other explanation fits the observations as well, we adopt a working hypothesis: that this phenomenon is not fleeting but occurs every time the Moon passes behind the Earth and is at least halfway below the horizon. This is not as unlikely as it seems, as for the majority of the sets the imagery cadence was planned for normal, Earth-imaging operations and was not intended for capturing the Moon or this event. Two of the three datasets where the phenomenon was observed were fast-cadence imaging sequences specifically targeted at the Moon and therefore more likely to catch it in the right physical position. Importantly, the imagery serves only as an initial clue; the hypothesis is tested chiefly by a physics-based ray tracing model, rather than by imagery alone.
Under the assumption that this is a consistent event, we propose a new working hypothesis. The atmospheric layers, while generally characterized by a decrease in temperatures with an increase of altitude, do not follow a linear gradient. Instead, there is a switching back and forth of the temperature as altitude increases. The troposphere, stratosphere, mesosphere, and thermosphere are characterized by alternating cooling and rising temperatures. These different layers create an environment ripe for mirage-type optical distortion by providing contrasting refractive indices.
5 Methods
There are many ways to model the effect of atmospheric distortion on light. Common astronomical methods use heuristics as a function of viewing angle. Many, including the EPIC geolocation algorithm (Blank, 2019), use layer-based ray tracing models that treat the atmosphere as a linear, 2-dimensional pathway. However, these common methods had not revealed any type of optical phenomenon, other than compression, at distance. If the phenomenon is a mirage, resolving it requires a fully three-dimensional model rather than the simpler 2D approximations.
5.1 Ray tracer model overview
To this end, a custom, voxel-based ray tracer utilizing vectors was developed. Ray tracers are software that estimate the path of viewing rays in a virtual scene. Voxels, a portmanteau of “volumetric pixel,” are essentially 3D pixels, where each voxel is populated with a property. In this case, the property is the refractive index of a physical location. Using vector math permits a complete 3D calculation of the ray paths and interactions with the Earth’s atmosphere, determining how the rays are warped and where they intersect with the Moon. Furthermore, unlike most ray tracers that discard the intermediary products, a secondary voxel model can store all computed vectors and distortion levels, permitting inspection of the actual phenomenon in post-processing.
A basic renderer (Figure 6) would consist of GEOS-derived refractive indices and an Earth and Moon model which would be used to populate a voxel model. The vector-based ray tracer would then compute on top of the model, producing vector voxel maps of the resulting view rays.
Figure 6. Voxel ray tracer pipeline. GEOS data is used to derive the refractive indices, which are then combined with geophysical models and used to populate the voxel model. The 3D vector-based ray tracer then calculates optical paths and stores the results in the ray vector voxel map.
However, this is not without challenge. A full model of the EPIC, Earth, and Moon space, consisting of these refractive voxels, has a computational and storage complexity of O(n3), where n is a function of resolution. EPIC has a resolution of ∼8 km, but obtaining clear ray paths for a potential mirage, which might contain ray inversions, requires a voxel space that is oversampled by a factor of 3–4× (Shannon, 1949). A basic calculation of a 2 km voxel model of the Earth/Moon scene from EPIC yields a volume of ∼10.8 trillion km3, or ∼5.4 trillion voxels. Storing all calculations and values in double-precision floating point would require over 4.3 terabytes of memory, an impractical amount.
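As a rough sketch of this scaling problem, the storage requirement can be estimated with a short calculation. The helper below is ours, not part of the renderer, and assumes a single stored double per voxel; the actual total depends on the scene bounds and how many values each voxel carries.

```python
def voxel_storage_tb(volume_km3, voxel_km, bytes_per_voxel=8):
    """Storage needed for a dense voxel grid covering a scene.

    volume_km3:      total scene volume in cubic kilometers
    voxel_km:        edge length of one cubic voxel in kilometers
    bytes_per_voxel: bytes stored per voxel (8 = one double)
    Returns the requirement in terabytes.
    """
    n_voxels = volume_km3 / voxel_km ** 3
    return n_voxels * bytes_per_voxel / 1e12
```

Because the cost grows with the cube of the linear resolution, halving the voxel size multiplies the requirement by eight, which is what motivates the just-in-time approach below.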
To improve resource utilization, a “just-in-time” renderer was developed. In this paradigm, only the portions of the scene that are immediately being computed upon are rendered. A further reduction in computation is obtained by clipping the rendered scene to a subset of the optical pathways needed for the optical effect. To ensure continuity of the calculation, the computation retains both the current and previous slice information. Every value that is needed to ensure a complete calculation is propagated.
The renderer is prepared by developing abstract models that contain the values needed, but not in the voxel format. Data from GEOS, including pressure, temperature, and humidity for each atmospheric layer, is ingested in equirectangular projection and the air density and subsequent refractive index are calculated. An abstract 3D model of the Earth and the Moon, in Earth-Centered, Earth-Fixed (ECEF) (Nishihama et al., 1997) XYZ coordinates is also separately computed.
During computation, instead of having the entire voxel model precomputed, one slice along the X-axis is rendered at a time. The abstract models are used to generate look-up indices into the equirectangular refraction datasets, and these values are pulled into a 2D, YZ-dimension slice. The vector renderer then calculates the resulting angular distortions from refraction and the new ray-vectors, along with distorted YZ index offsets, which are then stored in an HDF (hierarchical data format) file. If an opaque surface is encountered, or the calculation yields a reflection, the ray is “snipped,” and no more computations occur on it. The renderer marches through the model space, one slice at a time, until it reaches the end.
In sections where there is only the vacuum of space and no refractive voxels, such as the area between the Earth and the Moon, the renderer will skip drawing ray slices and compute new YZ index offsets based on the last computed ray angles and the distances covered.
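A minimal sketch of this vacuum skip, assuming the slice-relative ray vector and YZ coordinates described above (the function name is ours):

```python
def skip_vacuum(y, z, vt, gap_slices):
    """Jump a ray across `gap_slices` consecutive vacuum slices in one
    step.  With no refractive voxels present, the ray travels in a
    straight line, so the new YZ offsets follow directly from the last
    computed ray vector and the distance covered."""
    f = gap_slices / vt[0]          # slices traveled per unit x-component
    return y + f * vt[1], z + f * vt[2]
```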
The product of the ray tracer is a full 3D map of the viewing rays’ vectors and the amount of distortion per ray, stored in an HDF file. Making this data useful requires additional software that can pull the relevant data and display it in a viewable format.
A ray tracer, which only models the path the rays take through a scene, is not a full simulator, as it does not implement the full lighting model and instrument model that would be required to duplicate the original image. A full lighting model would need to include Rayleigh scattering, which describes the wavelength-dependent way that air molecules scatter light; a radiative-transfer model, which tracks absorption, emission, scattering and the resulting gain or loss of radiance along the ray; and a physics-based shader, which determines surface brightness values based on specular and angular reflectance. In addition, fully replicating the original image would require an optical and sensor model for the EPIC instrument. The working theory is that the distorted ray paths are the primary contributor to the phenomenon, and that it is unnecessary to model the other physical aspects to understand the basic underlying behavior. The computational complexity of the ray tracer is very high, requiring millions of physics-based calculations, which makes it highly unlikely that the phenomenon would be replicated by accident. If the DSCOVR EPIC phenomenon and the GOES ABI compressive effect both stem from the refractive aspect of atmospheric optics, then a ray tracer will be able to produce both images, with the only change being the initial vectors set to imitate each instrument’s field of view.
For the voxel model, while imitating EPIC’s resolution would require only ∼10 km voxels, this would not be adequate for interpreting the ray paths. According to the Nyquist-Shannon sampling theorem (Shannon, 1949), the minimum sampling to resolve a signal is at least twice per resolution element, which means that the voxel model needs to be 5 km or finer to interpret the results.
5.2 Refractive index computation
To calculate refraction, the GEOS-FP model, which contains 3D assimilated states at various pressure levels, is used. There are 42 layers in this set, each at a different pressure level, covering altitudes from sea level to ∼65 km. Each layer contains, in equirectangular projection, values for temperature, humidity, and pressure from which the air density and refractive index can be derived.
The process for calculating air density is essentially to derive the amount of dry air versus water vapor at pressure (Picard et al., 2008; OmniCalculator, 2024). The first step is to calculate saturation vapor pressure, svp, in kPa via the Tetens equation (Equation 1) (Murray, 1967; Wikipedia, 2025), where T is the temperature in Celsius, obtained from the GEOS model:

svp = 0.61078 × exp(17.27 T / (T + 237.3))  (1)

Actual vapor pressure (avp), in kPa, is then calculated from the relative humidity RH (in percent), also obtained from the GEOS model (Equation 2):

avp = svp × (RH / 100)  (2)

Pressure (P), in units of kPa, is then used to estimate the dry air pressure, dap (kPa), via Equation 3:

dap = P − avp  (3)

Using Rd = 287.058 J/(kg K) as the specific gas constant for dry air, Rv = 461.495 J/(kg K) as the specific gas constant for water vapor, and Tk as the temperature in Kelvin, the air density ρ, in kg/m3, can be calculated via Equation 4 (pressures converted from kPa to Pa):

ρ = (1000 dap) / (Rd Tk) + (1000 avp) / (Rv Tk)  (4)

The refractive index (refindex) for visible light (Thormahlen, 1985) is then calculated (Equation 5) (MadSci, 1998), where 1.29 kg/m3 is the air density at room temperature and pressure, c is the speed of light in m/s, and 0.00029 is based on the Edlén dispersion formula for standard air pressure (Jones, 1981). The speed of light in the layer is v = c / (1 + 0.00029 (ρ / 1.29)), so that:

refindex = c / v = 1 + 0.00029 (ρ / 1.29)  (5)
This calculation is done for each layer of the GEOS dataset.
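This per-layer computation can be sketched compactly; the following is our illustrative implementation of Equations 1–5, and the function name is an assumption:

```python
import math

RD = 287.058   # specific gas constant for dry air, J/(kg K)
RV = 461.495   # specific gas constant for water vapor, J/(kg K)

def refractive_index(temp_c, rh_percent, pressure_kpa):
    """Refractive index of moist air from temperature (Celsius),
    relative humidity (%), and total pressure (kPa), per Equations 1-5."""
    # Eq 1: saturation vapor pressure (kPa), Tetens equation
    svp = 0.61078 * math.exp(17.27 * temp_c / (temp_c + 237.3))
    # Eq 2: actual vapor pressure (kPa)
    avp = svp * rh_percent / 100.0
    # Eq 3: dry air partial pressure (kPa)
    dap = pressure_kpa - avp
    # Eq 4: air density (kg/m^3); pressures converted kPa -> Pa
    t_k = temp_c + 273.15
    density = (dap * 1000.0) / (RD * t_k) + (avp * 1000.0) / (RV * t_k)
    # Eq 5: refractive index scaled from standard-air density (1.29 kg/m^3)
    return 1.0 + 0.00029 * density / 1.29
```

Applied to dry sea-level air (15 °C, 101.325 kPa) this yields a refractive index of roughly 1.00028, and the value falls with altitude as the density drops.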
5.3 Vector ray tracer
Refraction is modelled with Snell’s law (Equation 6) (Angel, 2000), which states that when a ray encounters a surface with a different index of refraction, the angle at which the light is transmitted through the surface depends on the ratio between the current and new refractive indices, as well as the angle of incidence. Essentially:

n1 sin θ1 = n2 sin θ2  (6)

Where n1 and n2 are the refractive indices on the incident and transmitted sides of the surface, θ1 is the angle of incidence, and θ2 is the angle of transmission.

To apply this property to vectors requires some adaptation (Angel, 2000). Two vectors are needed, one of which is the vector representing the current ray, defined as vr. The second is for the voxel coordinate of the atmospheric surface with which the ray vector is colliding. This vector is from the Earth’s center coordinate to the current voxel coordinate and is defined as va.

The intersection angle, θ1, is calculated from the two vectors (Equation 7):

cos θ1 = (vr · va) / (|vr| |va|)  (7)

The ratio between refractive indices is:

η = n1 / n2  (8)

And the angle the light is refracted is:

θ2 = arcsin(η sin θ1)  (9)

The refracted vector, vt, is then calculated via Equation 10, where n̂ = va / |va| is the unit surface normal:

vt = η vr + (η cos θ1 − cos θ2) n̂  (10)

If η sin θ1 > 1, the angle of incidence exceeds the critical angle, the ray is totally internally reflected, and it is snipped.
The amount of distortion for each ray is tracked and updated through each propagation.
The magnitude factor f of each ray is calculated by normalizing the x component transmitted vector, since the algorithm can only advance the rays on discrete slices:
Where the new coordinates are:
Where s refers to the current slice, and s+1 indicates results should be the new coordinates in the next slice to be computed on. Note that the x calculation can be skipped since it will always advance by 1 due to the discrete nature of the voxel model; it is shown here for completeness.
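The vector refraction and slice-advance steps can be sketched as follows. This is our illustration of Equations 6–12, not the project’s code; the radial vector va serves as the layer normal as described above, and a return value of None marks a snipped ray.

```python
import numpy as np

def refract(vr, va, n1, n2):
    """Refract ray vector vr at an atmospheric layer whose local normal
    is the radial vector va (Earth center to voxel), per Equations 6-10.
    Returns the transmitted unit vector, or None when the critical angle
    is exceeded and the ray is totally internally reflected."""
    i = vr / np.linalg.norm(vr)              # unit incident ray
    n = va / np.linalg.norm(va)              # unit layer normal
    cos_i = -np.dot(i, n)                    # Eq 7: cosine of incidence angle
    if cos_i < 0:                            # flip normal to oppose the ray
        n, cos_i = -n, -cos_i
    eta = n1 / n2                            # Eq 8: refractive-index ratio
    k = 1.0 - eta ** 2 * (1.0 - cos_i ** 2)  # Eq 9, in cosine form
    if k < 0:
        return None                          # total internal reflection: snip
    return eta * i + (eta * cos_i - np.sqrt(k)) * n   # Eq 10

def advance(y, z, vt):
    """Advance a ray one slice along x (Equations 11-12)."""
    f = 1.0 / vt[0]                          # Eq 11: magnitude factor
    return y + f * vt[1], z + f * vt[2]      # Eq 12: next-slice coordinates
```

The transmitted vector stays unit length, and its component perpendicular to the normal satisfies Snell’s law directly, which makes the routine easy to verify against Equation 6.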
Before execution of the ray tracer, it is necessary to set the state of the rays based on the optical configuration of the instrument. The initial ray vector slice that starts the process is calculated based on the field of view (FOV) of the instrument. Where the FOV angle is θfov, the angular offsets across the slice span Equation 13:

θoffset ∈ [−θfov / 2, +θfov / 2]  (13)

The rate of change across the slice is calculated, where vol is the voxel model dimension in the y- and z-axis (Equation 14):

Δθ = θfov / vol  (14)

Then each ray vector for the initialization slice is calculated, where i is the column, j is the row, and init is the initial ray slice (Equation 15):

init(i, j) = normalize( (1, tan((i − vol/2) Δθ), tan((j − vol/2) Δθ)) )  (15)
The result is an array where the center vector has an offset angle close to zero, and from there the offset angles symmetrically spread as they approach the edge of the volumetric space.
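The initialization step can be sketched as below; this is our illustration of Equations 13–15, and the function name and exact offset convention are assumptions:

```python
import numpy as np

def init_rays(fov_deg, vol_y, vol_z):
    """Initial ray-vector slice for an instrument with the given field
    of view, per Equations 13-15.  Returns a (vol_y, vol_z, 3) array of
    unit vectors: the center ray points nearly straight along +x, and
    the offset angles spread symmetrically toward the slice edges."""
    fov = np.radians(fov_deg)
    dy = fov / vol_y                  # Eq 14: angular step per column
    dz = fov / vol_z                  #        and per row
    off_y = (np.arange(vol_y) - vol_y / 2.0) * dy   # Eq 15: per-ray offsets
    off_z = (np.arange(vol_z) - vol_z / 2.0) * dz
    ty, tz = np.meshgrid(np.tan(off_y), np.tan(off_z), indexing="ij")
    rays = np.stack([np.ones_like(ty), ty, tz], axis=-1)
    return rays / np.linalg.norm(rays, axis=-1, keepdims=True)
```

For EPIC the call would use fov_deg = 0.62, and for GOES ABI fov_deg = 17.76; as discussed in Section 7, this is the only change needed to switch viewpoints.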
5.4 Full ray tracing process
The ray tracer takes the above algorithms and pulls them together in the following steps:
1. The initial parameters for the voxel model are set up according to the scene, and dimensions are determined by the locations of the Earth and the Moon and the voxel resolution.
2. The GEOS-FP dataset is ingested and the refractive indices for the atmospheric layers are calculated (Equations 1–5). If necessary, neighboring layers can be averaged together to meet the resolution of the voxel model.
3. The initial ray vector states are calculated (Equations 13–15), based on the FOV angle. This becomes the current ray vector.
4. The ray tracer then runs iteratively through the voxel model, advancing forward slice by slice until it reaches the end. This consists of the following steps:
a. The current slice is rendered by pulling the relevant refractive indices from the layer model into their proper location in the slice.
b. The Earth’s center coordinate to the current slice coordinate vector is built and the algorithm for calculating the transmitted angle is executed (Equations 6–10). Any rays that are reflected or encounter a body are snipped. If a body is encountered, the type is stored (i.e. Earth vs. Moon).
c. The amount of distortion for each ray is calculated (Equations 11 and 12).
d. Data concerning the ray vectors, refraction, angle of incidence, and transmitted angles are stored in a file for future analysis.
e. Ray information is propagated to the next slice calculation to ensure continuity between slices.
5. The final simulated image is calculated based on the states of the ray vectors and the calculated level of distortion.
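The steps above can be condensed into a toy slice-marcher. For brevity this sketch, which is ours, uses planar +x-facing layer interfaces rather than the radial normals of the real renderer, and omits the per-slice HDF logging (step 4d):

```python
import numpy as np

def march(index, rays, ys, zs):
    """Toy slice-marcher mirroring steps 4a-4e.  `index` is an
    (nx, ny, nz) grid of refractive indices; `rays` are initial ray
    vectors; `ys`, `zs` are per-ray starting coordinates in the first
    slice.  Returns the final (y, z) per ray, or None if snipped."""
    nx, ny, nz = index.shape
    results = []
    for v, y, z in zip(rays, ys, zs):
        v = np.asarray(v, dtype=float)
        v = v / np.linalg.norm(v)
        alive = True
        for x in range(nx - 1):
            iy, iz = int(round(y)), int(round(z))
            if not (0 <= iy < ny and 0 <= iz < nz):
                alive = False              # ray left the volume: snip
                break
            # 4a: pull refractive indices for this and the next slice
            n1, n2 = index[x, iy, iz], index[x + 1, iy, iz]
            # 4b: refract across the interface (planar +x normal here)
            eta = n1 / n2
            k = 1.0 - eta ** 2 * (1.0 - v[0] ** 2)
            if k < 0:                      # total internal reflection: snip
                alive = False
                break
            v = np.array([np.sqrt(k), eta * v[1], eta * v[2]])
            # 4c/4e: propagate coordinates into the next slice
            y += v[1] / v[0]
            z += v[2] / v[0]
        results.append((y, z) if alive else None)
    return results
```

In a uniform-index volume the rays travel in straight lines, which provides a simple sanity check before introducing the GEOS-derived gradients.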
6 Results
The ray tracer was run at 2 km resolution, producing Figures 7, 8. As seen in the ray-traced image compared to the actual image from EPIC, the same phenomenon appears. A vertical sampling of the rays reveals a maximum 0.45° distortion that decreases with distance away from the Earth. This is not unexpected, as the refractive indices decrease along with the air density, but it is interesting to note that the descent has several “bumps” where the distortion increases instead of decreasing.
Figure 7. Left: EPIC image of the Moon limb phenomenon. Right: Results from ray tracer. Because this model uses discrete rays, small black dots occur where there is no ray present due to diversion from refraction.
Figure 8. Above: Diagrams that show where the data was obtained from the simulation. Left: Per ray magnitude of distortion along vertical axis. Right: Per ray magnitude of distortion along horizontal axis.
An inspection of the rays as they traverse through the atmosphere (Figure 9) shows that the rays are primarily traveling through the stratosphere (∼20–50 km altitude) and lower mesosphere (∼50–85 km).
Figure 9. Illustration of the ray paths. This shows how the rays travel through the atmosphere and how they are affected by it. The rays are bent very slightly in this image; the effect seen in the phenomenon is due to a combination of this slight angular distortion and the distance between the Earth and the Moon. This view helps account for what happens to UV light, such as the EPIC 317, 325, and 340 nm bands, as it passes through the stratosphere and is filtered out by the long path along the ozone layer. The rays that hit the troposphere approach the critical angle and are reflected.
This would account for the apparent lack of the phenomenon in UV light, as the light passes almost horizontally through the lower stratosphere (15–30 km) and would be filtered out by the ozone layer. Although computing the atmospheric loss from scattering and absorption is outside the scope of this paper, for reference the loss at 41.8° above the horizon is approximately 99.8% at 300 nm and 47.8% at 350 nm (Spectra-AM1.5, 2025). The rays that hit the troposphere approach the critical angle and are reflected.
To ensure that the model’s result is not affected by the snipping of reflected rays, a review of the calculations that resulted in reflected rays was conducted. Of the 245,592,557 refraction calculations performed, only 776 resulted in a reflected ray, or 0.0003%. This indicates that reflection does not play a statistically significant role in the construction of the final image.
The model was run on a very small segment at 1 km resolution; the smaller voxel size did not alter the image morphology.
Viewing the full ray paths (Figure 10) shows the degree of distortion and where the pixels of the phenomenon originate. The small slice of atmosphere and the distortion accumulated over the 384,400 km distance between the Earth and the Moon generate over 3,000 km of bending in the rays.
Figure 10. Top: Slice from ray tracer results that shows the full path of viewing rays and the level of divergence that is caused by atmospheric refraction and the distance between the Earth and the Moon. The location of the Moon is shown in green on the right. Colors of the rays have no meaning other than to distinguish them from each other. Bottom: Zoomed-in view of rays above. This magnified view reveals several ray inversions (circled in red), indicating a complex mirage. Color of the rays have no meaning other than to distinguish them from each other. Slice is taken from same location as seen in Figure 8, top left.
Zooming in on the rays provides more detail in Figure 11. Here, several inversions of the light rays can be seen, showing that this phenomenon is not just optical distortion, but a complex mirage. It is interesting to note that while the appearance of the phenomenon is similar to an “omega mirage,” the effect is obtained through different means.
Figure 11. Zoomed-in view of Z-axis slice. Many ray inversions (circled in red) are seen, indicating aspects of a complex mirage. Color of the rays have no meaning other than to distinguish them from each other. Slice is taken from same location as seen in Figure 8, top right.
Inspection of the rays along the Z-axis yields a different picture. Here, the magnitude of distortion is less, but with much greater variance along the slice.
Viewing the rays shows a much more complex distortion in the horizontal vs. vertical axis. A zoomed-in view shows much more mirage-like ray inversions.
While the analysis shows that the phenomenon has aspects of a complex mirage, it does not demonstrate the differences that occur between the views seen by EPIC versus geostationary imagery.
7 Comparative analysis
In order to determine how the GOES scene differs from EPIC, it is necessary to run the ray tracer from the GOES point of view. The GOES ABI has a 17.76° field of view, and by initializing the first vectors with corresponding offsets, it is possible to create a GOES model of the scene.
Running the ray tracer for GOES produced the matching optical compressive effect, as seen in Figure 12, top left. When viewing the magnitudes of distortion (Figure 13), the degree of distortion correlates with the density of the atmosphere as a function of altitude. Unlike what is seen in the EPIC data, there are no “bumps” or aberrations that cause ray inversions. This is true of both the vertical and horizontal slices. Unlike the EPIC phenomenon, this is only a straightforward atmospheric lensing effect, with no evidence of a complex mirage.
Figure 12. Top left: GOES ABI image showing atmospheric distortion. In this image, the optical properties of the atmosphere “compress” the Moon. Top right: Simulated scene from ray tracer using GOES ABI field of view, shows similar effect. Bottom left: Himawari-8 image showing the limb phenomenon. In both simulated images, because the voxel model uses discrete rays, small black dots occur where there is no ray present due to diversion from refraction.
Figure 13. GOES ABI/Himawari-8 simulated magnitudes of refractive distortion. Left: Angular distortion of rays on horizontal axis. Right: Angular distortion of rays on vertical axis. Distances are negative since they are left of the center coordinate. In these charts, the rays bend in a more regular decline than EPIC, showing no evidence of ray inversions.
However, when looking at the magnitude of the rays for the geostationary platforms, as seen in Figure 13, there is a substantial difference between what is seen for GOES and Himawari-8 versus EPIC. The magnitude of distortion is far less, and there is no evidence of the magnitude changes that would indicate the ray inversions seen in Figure 11. (The Figure 11 plot is not duplicated for GOES/Himawari-8 because the angular distortion from these instruments’ wide field of view renders the relatively small effect of atmospheric disturbance invisible; adding the plot would also exceed the number of figures permitted by the publication.)
It was initially believed that this limb effect was unique to EPIC. However, an image taken by Himawari-8 on 11 August 2025, after an initial draft of this paper had been submitted for publication, showed otherwise, as seen in Figure 12, bottom left. We were able to replicate this, without modification to the ray tracer, after setting the Earth and Moon configuration to represent the Himawari-8 scene. The reason it was originally missed is that the time interval over which the geostationary version of the phenomenon occurs is very short. EPIC’s phenomenon occurs over the course of ∼30 min. Empirical analysis of Himawari-8’s phenomenon using the ray tracer indicates that it is likely visible for less than 2 min. The shorter time span is partially due to the lesser amount of distortion seen at geostationary versus L1, approximately 2.5 times less. Additionally, the Moon’s relative motion is faster from a geostationary platform: locked to the Earth’s rotation, the spacecraft sees the scene move at a rate of 15.04° per hour, compared to 0.041° per hour from L1.
What is happening in the EPIC image that is different from the geostationary view? How can two pictures of the same apparent scene produce different optical phenomena? The answer lies in the fact that although the images appear similar, that similarity is itself an illusion.
DSCOVR orbits the Earth-Sun Lagrange-1 point at over 1.5 million kilometers from the Earth; taking the pictures requires a telescope with a 0.62° field of view. GOES, on the other hand, is in geostationary orbit, a relatively cozy 35,000 km away from the Earth, and capturing the entire Earth in a single frame requires the instrument to image at a wide angle of 17.76° (Himawari-8 is very similar, with a 17.8° field of view) (Japan Meteorological Agency, 2015). This influences the scene; when the Moon is visible to EPIC, it is obscured by the Earth for GOES.
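The two fields of view follow directly from viewing geometry. A minimal sketch, assuming a spherical Earth of radius 6,371 km and a geostationary orbital radius of ~42,164 km from Earth’s center (illustrative values not stated in the text):

```python
import math

EARTH_RADIUS_KM = 6371.0

def earth_angular_diameter_deg(distance_km):
    """Full angular diameter of the Earth's disk as seen from a
    given distance to the Earth's center (km)."""
    return 2.0 * math.degrees(math.asin(EARTH_RADIUS_KM / distance_km))

# DSCOVR at L1: ~1.5 million km away.
epic_view_deg = earth_angular_diameter_deg(1.5e6)   # ~0.49 deg
# Geostationary orbit: ~42,164 km from Earth's center.
geo_view_deg = earth_angular_diameter_deg(42164.0)  # ~17.4 deg
```

Both results sit just inside the instruments’ stated fields of view (0.62° for EPIC, 17.76° for GOES ABI), confirming that each telescope is sized to frame the full disk with a small margin.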
When considering the angles and the slice of the atmosphere through which the two instruments view the scene, EPIC’s view cuts across a much wider swath of the atmosphere, while GOES’s is a narrow section near the center (Figure 14, bottom left and right). This changes the entry angles of the rays; the more extreme angle for EPIC causes a corresponding increase in the magnitude of refraction, producing a stronger response when a ray crosses layers of differing refractive index. This is why the ray inversions occur with EPIC, but not with GOES or other platforms closer to Earth, such as the ISS or other low-orbit satellites.
Figure 14. Top image: Difference in perspective between DSCOVR and GOES. The EPIC instrument has a narrow field of view, so the rays are more parallel, while the GOES ABI has a wide field of view. Although the images appear similar, they are obtained through different means. Bottom left: Horizontal view angles without atmospheric distortion. EPIC has a 0.62° field of view at 1.5 million km vs. GOES ABI’s 17.76° at 35,000 km. Bottom right: Horizontal view angles with atmospheric refraction distortion. EPIC’s rays enter closer to the edge of the atmospheric “lens”, which causes more bending of the rays than is seen with GOES ABI.
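The sensitivity of bending to entry angle can be illustrated with a single Snell’s-law interface, a toy stand-in for the full 3D voxel ray tracer used in this work (the sea-level refractive index of air, ~1.000293, is an assumed illustrative value):

```python
import math

def deflection_deg(incidence_deg, n1=1.0, n2=1.000293):
    """Angular deflection of a ray crossing from vacuum (n1) into
    air (n2) at the given incidence angle, via Snell's law:
    n1 * sin(theta1) = n2 * sin(theta2)."""
    theta1 = math.radians(incidence_deg)
    theta2 = math.asin(math.sin(theta1) * n1 / n2)
    return math.degrees(theta1 - theta2)

# Deflection grows sharply as rays graze the atmospheric "lens":
shallow_deg = deflection_deg(10.0)  # near-central ray, tiny bend
grazing_deg = deflection_deg(85.0)  # near-limb ray, much larger bend
```

Even this single-interface sketch shows a near-grazing ray bending tens of times more than a near-central one, which is the geometric reason EPIC’s limb-skimming rays respond so much more strongly to layered index variations than GOES’s near-central rays.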
Comparing the GOES results in Figure 13 with EPIC’s in Figure 8, it can be seen that the magnitude of distortion declines much more consistently. There is no evidence of ray inversions, only optical distortion from the atmosphere. Ray slice figures, such as the one shown for EPIC in Figure 10, are not presented for GOES because the ray paths are so heavily dominated by the wide-angle view of the instrument that the effect of refraction is hidden.
In the GOES and Himawari-8 versions of the distortion, the vertical distortion is ∼0.2°, while the horizontal distortion is comparatively weak at ∼0.02°; the vertical distortion dominates the resulting images. In EPIC, the vertical distortion reaches a maximum of ∼0.45°, and the horizontal distortion, at ∼0.11°, still has a substantial impact on the image. Although the general shapes of the distortion look similar for both spacecraft, on close inspection of the Himawari-8 image, the craters on the Moon can be clearly made out, although warped. The resolution of the EPIC data is far lower; however, at a resolution similar to Himawari-8’s, the images would likely look very different due to the additional horizontal distortion seen from EPIC. The mirage-like components caused by the ray inversions, seen only in the EPIC results, would create additional distortion in the instrument’s images, possibly creating a more fata morgana-like appearance. The end result is two phenomena that are structurally very different from each other.
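The relative importance of the two distortion axes can be summarized as a simple ratio (illustrative arithmetic on the peak values quoted above):

```python
# Peak angular distortion (degrees) quoted above.
geo_vertical, geo_horizontal = 0.2, 0.02
epic_vertical, epic_horizontal = 0.45, 0.11

# Vertical-to-horizontal ratio: ~10:1 for geostationary but only
# ~4:1 for EPIC, so horizontal warping contributes far more of the
# total distortion at L1 than it does at geostationary.
geo_ratio = geo_vertical / geo_horizontal     # ~10
epic_ratio = epic_vertical / epic_horizontal  # ~4.1
```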
8 Model validation
The above results were presented to the DSCOVR/EPIC operations group, who agreed to conduct a “special imaging” activity with the instrument in order to capture the phenomenon. The operations group advised imaging a single wavelength band, as that would permit the shortest interval between exposures. For the activity, it was decided to image the Moon as it passed behind the Earth at 3-min intervals in the 780 nm wavelength. The short interval between exposures would permit capturing the phenomenon in different stages, and 780 nm, as seen in Figure 2, would provide the clearest view. This configuration, along with the limits of onboard storage and downlink, would permit 40 images to be collected over a period of 2 h.
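The imaging budget reduces to a cadence check (the storage and downlink limits mentioned above are the real constraints; this just confirms the numbers are consistent):

```python
# 3-minute cadence across the 2-hour transit window.
cadence_min = 3
window_min = 2 * 60
n_images = window_min // cadence_min  # 40 exposures
```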
Based on the orbital dynamics, a favorable configuration of the Earth and Moon, permitting a transit, was found for 11 May 2025, from 21:30 to 23:30 UTC.
The ray tracer, as seen in Figure 15, correctly predicted the evolution of the flange-like shape: it begins broad and then shrinks as the Moon rises, until it eventually disappears once the Moon is more than halfway above the limb. This demonstrates that the working hypothesis, that the phenomenon disappears when the Moon is halfway above the limb, was correct. The changes in shape are due to the movement of the Moon out of the path of the highly bent rays that pass through the lower atmospheric layers and into the less bent rays of the upper atmospheric layers.
Figure 15. Lunar transit from 11 May 2025. The ray tracer correctly predicts the behavior of the mirage: a wide swath that shrinks as the Moon rises, until it eventually disappears once the Moon is greater than halfway above the limb. The rays are “longer” in the simulation because it does not include an atmospheric absorption model or an instrument sensitivity model, which would attenuate dimmer rays.
Table 3 contains the differences in length between the phenomenon as predicted by the simulation and as measured. While the level of difference may seem large, it is not unexpected: the predicted rays become more spread out toward the edges, where the phenomenon also becomes very dim in the original Level 1A data, averaging about 400 counts/second. For comparison, the middle of the phenomenon is ∼3,000 counts/second, the mean count value of the Earth is ∼14,000 counts/second, and the mean count for the dark space around the Earth is ∼80 counts/second. The quantum efficiency of EPIC at 780 nm is 58%, which would attenuate the signal further (Epic.gsfc.nasa.gov, 2024). Atmospheric loss from absorption and scattering would contribute as well, although not as significantly. Although computing the atmospheric loss from scattering and absorption is outside the scope of this paper, for reference it is ∼4.65% at 780 nm at 41.8° elevation (Spectra-AM1.5, 2025).
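The count rates quoted above give a sense of how faint the edge of the phenomenon is relative to its core (illustrative arithmetic only; no radiometric calibration is implied):

```python
# Level-1A count rates quoted above (counts/second).
edge = 400.0     # dim edge of the phenomenon
middle = 3000.0  # middle of the phenomenon
sky = 80.0       # dark space around the Earth

# Background-subtracted edge signal as a fraction of the core signal:
edge_fraction = (edge - sky) / (middle - sky)  # ~0.11

# Combined throughput from quantum efficiency (58% at 780 nm) and
# atmospheric transmission loss (~4.65%), further dimming the edges:
throughput = 0.58 * (1.0 - 0.0465)             # ~0.55
```

With the edge carrying only ~11% of the core’s background-subtracted signal, modest discrepancies in the measured extent of the phenomenon are unsurprising.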
The full set of images is available on the EPIC Gallery (https://epic.gsfc.nasa.gov/galleries). Many thanks to NOAA and NASA for conducting operations and permitting use of the instrument to investigate atmospheric optical properties.
9 Conclusion
The findings demonstrate that the phenomenon seen in the EPIC image when the Moon is at the horizon is a combination of atmospheric distortion effects, including refraction and ray inversions, making it similar to a complex mirage caused by regular temperature differences between atmospheric layers. Although a visually similar effect is seen from geostationary orbit, caused solely by atmospheric refraction, it lacks the ray inversions and distortions common to mirages. EPIC’s phenomenon, “Gaia’s Crown”, is an optical effect that appears where the Earth and the Moon visually intersect and is seen only from deep space, where the unique viewing geometry causes greater distortion from atmospheric refraction. The demonstrated sensitivity to refraction and high-altitude temperature inversions hints that L1 observations could provide a coarse probe of atmospheric structure. The technique could be applied to other planets with significant atmospheres to help understand their composition.
Data availability statement
Public input datasets used in this study are available at: DSCOVR/EPIC browse (color composites): https://epic.gsfc.nasa.gov; DSCOVR/EPIC science data: https://search.earthdata.nasa.gov/; DSCOVR/EPIC lunar-transit browse images: https://epic.gsfc.nasa.gov/galleries; GEOS (e.g., GEOS-FP) meteorological fields: https://gmao.gsfc.nasa.gov; Derived materials from this study (e.g., the ray tracer, configuration files, and rendered outputs) are subject to U.S. Government release procedures and require prior approval by the sponsoring agency; release timelines are outside the authors’ control. The algorithmic approach and parameter settings necessary to understand and reproduce the identification are described in Methods. Requests to access controlled materials may be directed to the corresponding author (who will forward them for agency review).
Author contributions
KB: Conceptualization, Investigation, Methodology, Software, Visualization, Writing – original draft, Writing – review and editing. JH: Conceptualization, Validation, Writing – review and editing. SD: Software, Validation, Writing – review and editing. AM: Writing – review and editing. AT: Data curation, Writing – review and editing.
Funding
The authors declare that financial support was received for the research and/or publication of this article. The work utilized the DSCOVR/EPIC data, which utilizes computational resources and services provided by the NASA Center for Climate Simulation (NCCS) at the Goddard Space Flight Center.
Conflict of interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Generative AI statement
The authors declare that no Generative AI was used in the creation of this manuscript.
Any alternative text (alt text) provided alongside figures in this article has been generated by Frontiers with the support of artificial intelligence and reasonable efforts have been made to ensure accuracy, including review by the authors wherever possible. If you identify any issues, please contact us.
Publisher’s note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
References
Angel, E. (2000). Interactive computer graphics: a top-down approach with OpenGL. 2nd ed. New York, NY: Addison Wesley Longman, Inc.
Astronomy.com. (2023). A distorted view of a full moon intersecting earth's horizon was photographed from the international space station. NASA. Available online at: https://www.astronomy.com/observing/everything-you-need-to-know-about-the-Moon/ (Accessed September 7, 2023).
Blank, K. (2019). EPIC geolocation and color imagery algorithm, revision 6. NASA Langley Research Center, Atmospheric Science Data Center. Available online at: https://asdc.larc.nasa.gov/documents/dscovr/DSCOVR_EPIC_Geolocation_V03.pdf (Accessed August 1, 2025).
Cede, A., Huang, L., McCauley, G., Herman, J., Blank, K., Kowalewski, M., et al. (2021). Raw EPIC data calibration. Front. Remote Sens. 2, 702275. doi:10.3389/frsen.2021.702275
Epic.gsfc.nasa.gov (2024). What is EPIC. Available online at: https://epic.gsfc.nasa.gov/about/epic (Accessed August 1, 2025).
Global Modeling Assimilation Office (2017). File specification for GEOS-5 FP (forward processing). Available online at: https://gmao.gsfc.nasa.gov/pubs/docs/Lucchesi1202.pdf (Accessed April 8, 2025).
Inaglory, B. (2020). Schematic diagram explaining fata Morgana. Available online at: https://commons.wikimedia.org/wiki/File:Fada_morgana_graphnn.JPG (Accessed April 8, 2025).
Inaglory, B. (2007). Sunset inferior mirage. Available online at: https://commons.wikimedia.org/wiki/File:Sunset_inferio_mirage.jpg (Accessed April 8, 2025).
Japan Meteorological Agency (2015). Himawari-8/9 Himawari standard data user’s guide, version 1.2. Available online at: https://www.data.jma.go.jp/mscweb/en/himawari89/space_segment/hsd_sample/HS_D_users_guide_en_v12.pdf (Accessed September 18, 2025).
Jones, F. (1981). The refractivity of air. J. Res. Natl. Bureau Stand. 86 (1), 27. doi:10.6028/jres.086.002
Lorenzelli, L. (2014). Basic diagram of the optical illusion. Available online at: https://commons.wikimedia.org/w/index.php?curid=37081455 (Accessed April 8, 2025).
Lynch, D., and Livingston, W. (1995). Color and light in nature. New York, NY: Cambridge University Press.
MadSci (1998). What is the speed of light as a function of air density? Available online at: https://www.madsci.org/posts/archives/feb98/888690999.Ph.r.html (Accessed April 8, 2025).
Marshak, A., Herman, J., Szabo, A., Blank, K., Cede, A., Carn, S., et al. (2018). Earth observations from DSCOVR/EPIC instrument. Bull. Am. Meteor. Soc. 99, 1829–1850. doi:10.1175/BAMS-D-17-0223.1
Naylor, J. (2002). Out of the blue: a 24-hour skywatcher’s guide. New York, NY: Cambridge University Press.
Nishihama, M., Wolfe, R., and Solomon, D. (1997). MODIS level 1A Earth location: algorithm theoretical basis document version 3.0. Available online at: https://modis.gsfc.nasa.gov/data/atbd/atbd_mod28_v3.pdf (Accessed April 8, 2025).
NOAA NESDIS (2020). An ode to the moon: how NOAA satellites capture earth’s satellite. Available online at: https://www.nesdis.noaa.gov/news/ode-the-Moon-how-noaa-satellites-capture-earths-satellite (Accessed August 1, 2025).
OmniCalculator (2024). Air density calculator. Available online at: https://www.omnicalculator.com/physics/air-density (Accessed April 8, 2025).
Pettit, D., and Hague, N. (2025). Expedition 72 postflight presentation. NASA Goddard Space Flight Center Video. Available online at: https://video.ibm.com/recorded/134509377 (Accessed September 18, 2025).
Picard, A., Davis, R. S., Gläser, M., and Fujii, K. (2008). Revised formula for the density of moist air (CIPM-2007). Metrologia 45, 149–155. doi:10.1088/0026-1394/45/2/004
Shannon, C. (1949). “Communication in the presence of noise,” in Proceedings in the institute of radio engineers. doi:10.1109/JRPROC.1949.232969
Spectra-AM1.5 (2025). Reference air mass 1.5 spectra. National Renewable Energy Laboratory. Available online at: https://www.nrel.gov/grid/solar-resource/spectra-am1.5.
Thormahlen, I., Straub, J., and Grigull, U. (1985). Refractive index of water and its dependence on wavelength, temperature, and density. J. Phys. Chem. Reference Data 14, 933–945. doi:10.1063/1.555743
Universe Space Tech (2023). The moon “photobombs” the Earth in the image of a Japanese satellite. Available online at: https://universemagazine.com/en/the-Moon-photobombs-the-earth-in-the-image-of-a-japanese-satellite/ (Accessed April 8, 2025).
van der Werf, S. (2022). Novaya zemlya effect and fata morgana. Raytracing in a spherically non-symmetric atmosphere. Comptes Rendus Phys. 23, 365–389. doi:10.5802/crphys.102
Wikipedia (2025). Tetens equation. Available online at: https://en.wikipedia.org/wiki/Tetens_equation (Accessed April 8, 2025).
Keywords: DSCOVR, EPIC, atmospheric optics, mirage, GOES (geostationary operational environmental satellite), Himawari-8
Citation: Blank K, Herman J, Dangelo S, Marshak A and Tennenbaum A (2026) Gaia’s Crown: a deep space mirage seen from DSCOVR/EPIC during lunar transit. Front. Remote Sens. 6:1640320. doi: 10.3389/frsen.2025.1640320
Received: 03 June 2025; Accepted: 24 November 2025;
Published: 12 January 2026.
Edited by:
Soo Chin Liew, National University of Singapore, Singapore
Reviewed by:
Santo V. Salinas, National University of Singapore, Singapore
Víctor Molina García, German Aerospace Center (DLR), Germany
Copyright © 2026 Blank, Herman, Dangelo, Marshak and Tennenbaum. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Karin Blank, karin.b.blank@nasa.gov