Draft
61 commits
69004f3
Start unit test for SEVIRI time tracking
gerritholl Jul 17, 2025
46a14f4
First skeleton on how implementation might look like
gerritholl Jul 17, 2025
505e1f3
First implementation for seviri per-pixel times
gerritholl Jul 21, 2025
d6259c9
in a variable?
gerritholl Jul 21, 2025
7ff5824
Add time as coordinate
gerritholl Jul 22, 2025
fc0f2f5
Resample time coordinates
gerritholl Jul 22, 2025
c4866fd
resample time coordinate for real
gerritholl Jul 22, 2025
59f23b8
Implement time-tracking as floats
gerritholl Jul 22, 2025
2f177a3
Add test for writing ValidDateID
gerritholl Jul 22, 2025
b4e68c6
Write ValidDateID to ninjogeotiff header
gerritholl Jul 22, 2025
329dab5
Adapt ninjo tag test
gerritholl Jul 22, 2025
ca2c263
Refactor getting valid time out of ninjogeotiff writer
gerritholl Jul 25, 2025
d2dd40c
Refactor Scene._resampled_scene()
pnuu Jul 28, 2025
ccbab5e
Move source area access to _reduce_data()
pnuu Jul 28, 2025
b77d8b9
Test writing valid time to filename
gerritholl Jul 28, 2025
2d8d883
Refactor data reduction
pnuu Jul 28, 2025
e635b7b
Write valid time to filename
gerritholl Jul 28, 2025
5b1db64
Pass dynamic fields as a set.
gerritholl Jul 29, 2025
b866941
Add documentation on writing valid time
gerritholl Jul 29, 2025
2aa3a43
Merge branch 'refactor_resampled_scene' into feature-valid-time
gerritholl Jul 29, 2025
f669619
Remove unneeded code
gerritholl Jul 30, 2025
9cebebe
Doc fixes
gerritholl Jul 30, 2025
4f55830
clarify writer docs
gerritholl Jul 30, 2025
823e1e9
Improve test coverage
gerritholl Jul 30, 2025
c3626c4
Changes based on Panu's review
gerritholl Aug 4, 2025
1780350
Merge branch 'main' into feature-valid-time
gerritholl Aug 4, 2025
66b9ef8
valid time is now mean time
gerritholl Aug 4, 2025
9452c5a
valid time is now mean time, also in docs
gerritholl Aug 4, 2025
78adb1c
fix small error in doc index.rst
gerritholl Aug 4, 2025
24fc0b4
More doc fixes re. valid/mean time
gerritholl Aug 4, 2025
546543c
Fix doc reference errors
gerritholl Aug 5, 2025
d619238
Split remaining ValueError test in two
gerritholl Aug 5, 2025
da329f4
Merge branch 'main' into feature-valid-time
gerritholl Mar 12, 2026
33e4761
Resampling time coordinate fails if reduce_data=False
gerritholl Mar 18, 2026
f868ecd
Merge branch 'main' into feature-valid-time
gerritholl Mar 20, 2026
637bb87
Add SingleBandCompositor time test and refactor
gerritholl Mar 20, 2026
87ca1b3
Test time coordinate retained in GenericCompositor
gerritholl Mar 20, 2026
ae477eb
Expand tests for time coordinate compositing
gerritholl Apr 1, 2026
f8c9016
Make sure time info in tests has epoch data
gerritholl Apr 1, 2026
f45e66b
Average time coordinates
gerritholl Apr 1, 2026
3028456
Average the coords not data values
gerritholl Apr 1, 2026
6735e68
Fix non-reduced coordinate resampling bug
gerritholl Apr 2, 2026
9494636
Retain time coordinate upon sunz correction
gerritholl Apr 2, 2026
16a4ef8
Retain time coordinates in more cases
gerritholl Apr 2, 2026
4ed2eeb
Verify no early dask computes with times
gerritholl Apr 2, 2026
811e5b6
Avoid early computation in time coordinate
gerritholl Apr 2, 2026
af1395a
remove check_times from tests
gerritholl Apr 2, 2026
cec188c
Avoid dropping G and B
gerritholl Apr 2, 2026
ae8259a
Avoid dropping G and B
gerritholl Apr 2, 2026
32558cb
Add test to ensure no early dask computations
gerritholl Apr 7, 2026
0c39ef7
Combine times prior to compositing
gerritholl Apr 7, 2026
1adce9e
Merge branch 'feature-valid-time' of github.com:gerritholl/satpy into…
gerritholl Apr 7, 2026
3fc4f47
Repair case with 1-D time coordinate
gerritholl Apr 7, 2026
f550c10
WIP toward better laziness
gerritholl Apr 7, 2026
e2e6569
Lazy tag calculations for mean time
gerritholl Apr 7, 2026
1fa02a6
Move atmospheric modifier tests to own test module
gerritholl Apr 8, 2026
bf75088
Add unit test for CO2Corrector
gerritholl Apr 8, 2026
2c3abfa
Merge branch 'test-co2-cerroctor' into feature-valid-time
gerritholl Apr 8, 2026
c3ee22d
Add test to confirm CO2 corrector does not compute
gerritholl Apr 8, 2026
a1611e1
Retain time coordinates without computation
gerritholl Apr 8, 2026
1d19b8e
Retain time coordinates in arithmetic compositors
gerritholl Apr 8, 2026
3 changes: 3 additions & 0 deletions doc/source/examples/index.rst
@@ -15,6 +15,7 @@ as explanations in the various sections of this documentation.

fci_l1c_natural_color
vii_l1b_nc
mean_time

.. list-table::
:header-rows: 1
@@ -45,3 +46,5 @@ as explanations in the various sections of this documentation.
- Generate Natural Color RGB from Meteosat Third Generation (MTG) FCI Level 1c data
* - :doc:`Reading EPS-SG Visible and Infrared Imager (VII) with Pytroll <vii_l1b_nc>`
- Read and visualize EPS-SG VII L1B test data and save it to an image
* - :doc:`Storing approximate measurement time <mean_time>`
- Read, resample, and store the approximate measurement time for supported readers and writers.
44 changes: 44 additions & 0 deletions doc/source/examples/mean_time.rst
@@ -0,0 +1,44 @@
Tracking measurement time
=========================

.. versionadded:: v0.58

For some readers and writers, it is possible to keep track of pixel-level
measurement times and store the average measurement time in the metadata
for the resampled image. This can be stored in the filename or the file
headers (or both). It is only supported for selected readers and writers.

By default, Satpy does not keep track of measurement times. To keep track
of measurement times, we must first tell the reader to add such times to
the metadata of each dataset. With supported readers, this can be done
by passing ``reader_kwargs={"track_time": True}`` to the :class:`~satpy.scene.Scene` constructor:

.. code-block:: python

sc = Scene(filenames={"seviri_l1b_hrit": seviri_files}, reader_kwargs={"track_time": True})
sc.load(["IR_108"])

The time is stored as a coordinate:

.. code-block:: python

sc["IR_108"].coords["time"]

To retain it upon resampling, pass ``resample_coords=True`` to :meth:`~satpy.scene.Scene.resample`:

.. code-block:: python

ls = sc.resample("eurol", resample_coords=True)

For supported writers, it can be stored in the filename by passing ``dynamic_fields={"mean_time"}``
to :meth:`~satpy.scene.Scene.save_datasets`:

.. code-block:: python

ls.save_datasets(
writer="geotiff",
filename="{platform_name}-{sensor}-{name}-{area.area_id}-{start_time:%Y%m%d%H%M}-{mean_time:%Y%m%d%H%M%S}.tif",
dynamic_fields={"mean_time"})

For supported writers, the mean time may also be written to the file headers.
Consult the documentation of your writer for details.
15 changes: 10 additions & 5 deletions doc/source/writing.rst
@@ -17,10 +17,15 @@ One common parameter across almost all Writers is ``filename`` and
... filename="{name}_{start_time:%Y%m%d_%H%M%S}.tif",
... base_dir="/tmp/my_ouput_dir")

.. versionchanged:: 0.10

The `file_pattern` keyword argument was renamed to `filename` to match
the `save_dataset` method"s keyword argument.
The ``filename`` argument can specify Python string formatting fields.
Those fields are mostly filled from attributes available on the individual
datasets. Following Python string formatting rules, attributes of
attributes can be referenced as well, for example ``area.name``. In
addition to dataset attributes, some reader/writer combinations support
dynamically calculated field values. Currently this exists only for
``{mean_time}``, and only if the keyword argument ``dynamic_fields={"mean_time"}``
is passed to :meth:`~satpy.scene.Scene.save_datasets`. See the
:doc:`example on storing the mean time </examples/mean_time>` for details.

.. _writer_table:

@@ -53,7 +58,7 @@ One common parameter across almost all Writers is ``filename`` and
-
* - GeoTIFF with NinJo tags (from NinJo 7)
- :class:`ninjogeotiff <satpy.writers.ninjogeotiff.NinJoGeoTIFFWriter>`
- Beta
- Nominal
-

Available Writers
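The attribute-access formatting described in the new paragraph above is plain Python ``str.format`` behavior. A minimal standalone sketch — the attribute values and the ``SimpleNamespace`` stand-in for an area object are hypothetical, not Satpy objects:

```python
import datetime as dt
from types import SimpleNamespace

# Hypothetical stand-ins for the attributes a dataset might carry.
attrs = {
    "name": "IR_108",
    "start_time": dt.datetime(2025, 1, 1, 12, 0, 0),
    "area": SimpleNamespace(area_id="eurol"),
}

# Python's format mini-language resolves dotted attribute access,
# so "{area.area_id}" reads the attribute off the object.
pattern = "{name}_{start_time:%Y%m%d_%H%M%S}_{area.area_id}.tif"
filename = pattern.format(**attrs)
print(filename)  # IR_108_20250101_120000_eurol.tif
```

Dynamic fields such as ``{mean_time}`` differ only in that the writer computes the value at save time instead of reading it from the dataset attributes.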
38 changes: 38 additions & 0 deletions satpy/cf/decoding.py
@@ -22,6 +22,8 @@
import datetime as dt
import json

import numpy as np


def decode_attrs(attrs):
"""Decode CF-encoded attributes to Python object.
@@ -74,3 +76,39 @@ def _str2datetime(string):
return dt.datetime.fromisoformat(string)
except (TypeError, ValueError):
return None


def lazy_decode_cf_time(xrda_encoded):
"""Lazily decode CF-encoded time in limited situations.

This is a restricted alternative to xarray.coding.times.decode_cf_datetime
that avoids dask computations on the values to be decoded. It is
restricted to standard calendars (proleptic gregorian).

Args:
xrda_encoded (array-like):
Xarray data array with units attribute. The units attribute should
be a string describing the time units using "x since timestamp",
UDUNITS-style.
"""
# An early iteration of this function was written with the
# assistance of GPT-5.4.

(unit, ref_str) = xrda_encoded.attrs["units"].split(" since ")
ref = np.datetime64(ref_str)

unit_map = {
"days": "timedelta64[D]",
"day": "timedelta64[D]",
"hours": "timedelta64[h]",
"hour": "timedelta64[h]",
"minutes": "timedelta64[m]",
"minute": "timedelta64[m]",
"seconds": "timedelta64[s]",
"second": "timedelta64[s]",
"milliseconds": "timedelta64[ms]",
"microseconds": "timedelta64[us]",
"nanoseconds": "timedelta64[ns]",
}

return ref + xrda_encoded.astype(unit_map[unit.lower()])
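The decoding in ``lazy_decode_cf_time`` boils down to one vectorized cast plus an offset. A self-contained numpy sketch of the same idea — the helper name here is hypothetical, not the Satpy function itself:

```python
import numpy as np

def decode_cf_time(values, units):
    """Decode CF-style numeric times, e.g. units="seconds since 2025-01-01T00:00:00"."""
    unit, ref_str = units.split(" since ")
    ref = np.datetime64(ref_str)
    # Map the UDUNITS word to a numpy timedelta64 code; the cast and the
    # addition are elementwise, so the operation stays lazy under dask too.
    codes = {"days": "D", "hours": "h", "minutes": "m", "seconds": "s"}
    return ref + values.astype(f"timedelta64[{codes[unit.lower()]}]")

encoded = np.array([0, 90, 3600])
decoded = decode_cf_time(encoded, "seconds since 2025-01-01T00:00:00")
print(decoded[1])  # 2025-01-01T00:01:30
```

This is why the function is restricted to the standard proleptic Gregorian calendar: ``numpy.datetime64`` arithmetic cannot express the other CF calendars.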
9 changes: 6 additions & 3 deletions satpy/composites/arithmetic.py
@@ -34,7 +34,8 @@ def __call__(self, projectables, nonprojectables=None, **attrs):
"""Generate the composite."""
if len(projectables) != 2:
raise ValueError("Expected 2 datasets, got %d" % (len(projectables),))
projectables = self.match_data_arrays(projectables)
projectables = self.match_data_arrays(projectables,
drop_coordinates=False)
info = combine_metadata(*projectables)
info["name"] = self.attrs["name"]
info.update(self.attrs) # attrs from YAML/__init__
@@ -52,7 +53,8 @@ def __call__(self, projectables, nonprojectables=None, **info):
"""Generate the composite."""
if len(projectables) != 2:
raise ValueError("Expected 2 datasets, got %d" % (len(projectables),))
projectables = self.match_data_arrays(projectables)
projectables = self.match_data_arrays(projectables,
drop_coordinates=False)
info = combine_metadata(*projectables)
info.update(self.attrs)

@@ -68,7 +70,8 @@ def __call__(self, projectables, nonprojectables=None, **info):
"""Generate the composite."""
if len(projectables) != 2:
raise ValueError("Expected 2 datasets, got %d" % (len(projectables),))
projectables = self.match_data_arrays(projectables)
projectables = self.match_data_arrays(projectables,
drop_coordinates=False)
info = combine_metadata(*projectables)
info["name"] = self.attrs["name"]

151 changes: 105 additions & 46 deletions satpy/composites/core.py
@@ -23,7 +23,6 @@
from typing import Optional, Sequence

import dask.array as da
import numpy as np
import xarray as xr

from satpy.dataset import DataID, combine_metadata
@@ -35,8 +34,6 @@
NEGLIGIBLE_COORDS = ["time"]
"""Keywords identifying non-dimensional coordinates to be ignored during composite generation."""

TIME_COMPATIBILITY_TOLERANCE = np.timedelta64(1, "s")


class IncompatibleAreas(Exception):
"""Error raised upon compositing things of different shapes."""
@@ -124,7 +121,8 @@
elif origin.get(key) is not None:
destination[key] = origin[key]

def match_data_arrays(self, data_arrays: Sequence[xr.DataArray]) -> list[xr.DataArray]:
def match_data_arrays(self, data_arrays: Sequence[xr.DataArray],
drop_coordinates: bool = True) -> list[xr.DataArray]:
"""Match data arrays so that they can be used together in a composite.

For the purpose of this method, "can be used together" means:
@@ -133,12 +131,14 @@
- Either all arrays should have an area, or none should.
- If all have an area, the areas should be all the same.

In addition, negligible non-dimensional coordinates are dropped (see
In addition, negligible non-dimensional coordinates can be dropped (see
:meth:`drop_coordinates`) and dask chunks are unified (see
:func:`satpy.utils.unify_chunks`).

Args:
data_arrays: Arrays to be checked
drop_coordinates: If True, drop negligible non-dimensional coordinates.
If False, combine them instead (see :meth:`combine_coordinates`).

Returns:
Arrays with negligible non-dimensional coordinates removed.
@@ -150,7 +150,7 @@
If some, but not all data arrays lack an area attribute.
"""
self.check_geolocation(data_arrays)
new_arrays = self.drop_coordinates(data_arrays)
new_arrays = self.drop_coordinates(data_arrays) if drop_coordinates else self.combine_coordinates(data_arrays)
new_arrays = self.align_geo_coordinates(new_arrays)
new_arrays = list(unify_chunks(*new_arrays))
return new_arrays
@@ -210,6 +210,8 @@
dimension. Negligible coordinates are defined in the
:attr:`NEGLIGIBLE_COORDS` module attribute.

See also: :meth:`combine_coordinates`.

Args:
data_arrays: Arrays to be checked
"""
@@ -225,6 +227,47 @@

return new_arrays

def combine_coordinates(self, data_arrays: Sequence[xr.DataArray]) -> list[xr.DataArray]:
"""Combine time coordinates.

Combine coordinates if they do not correspond to any
dimension. Currently only supports time coordinates.

See also: :meth:`drop_coordinates`.

Args:
data_arrays: Arrays to be checked
"""
new_time = self.combine_times(data_arrays)
return [data.assign_coords({"time": new_time}) if "time" in data.coords else data
for data in data_arrays]

@staticmethod
def combine_times(projectables):
"""Get a combined time coordinate between projectables.

Returns the arithmetic mean between the available time coordinates,
provided they have a common dimensionality and units. If no projectables
have time coordinates, return None. Time coordinates should be CF-encoded,
i.e. have a numeric dtype and a units attribute.
"""
_verify_times(projectables)
timed_projectables = [proj for proj in projectables
if hasattr(proj, "coords")
and "time" in proj.coords]
if len(timed_projectables) > 0:
with xr.set_options(keep_attrs=True):
# when the time coordinates are different, can't use xarray to
# sum them without this leading to expensive computations!
da_new_time = sum([x.coords["time"].data for x in timed_projectables]) / len(timed_projectables)
first = timed_projectables[0].coords["time"]
xr_new_time = xr.DataArray(
da_new_time,
dims=first.dims,
attrs=first.attrs,
coords={"y": first.coords["y"], "x": first.coords["x"]})
return xr_new_time.assign_coords(time=(first.dims, da_new_time))

@staticmethod
def align_geo_coordinates(data_arrays: Sequence[xr.DataArray]) -> list[xr.DataArray]:
"""Align DataArrays along geolocation coordinates.
@@ -428,7 +471,7 @@
data = xr.concat(projectables, "bands", coords="minimal")
data["bands"] = list(mode)
except ValueError as e:
LOG.debug("Original exception for incompatible areas: {}".format(str(e)))
LOG.exception("Original exception for incompatible areas: {}".format(str(e)))
raise IncompatibleAreas("Areas do not match.")

return data
@@ -483,17 +526,14 @@
return mode

def _check_datasets_and_data(self, datasets, mode):
time = self.combine_times(datasets)
datasets = self.match_data_arrays(datasets)
data = self._concat_datasets(datasets, mode)
# Skip masking if user wants it or a specific alpha channel is given.
if self.common_channel_mask and mode[-1] != "A":
data = data.where(data.notnull().all(dim="bands"))
# if inputs have time coordinates that may differ slightly between
# themselves, combine them into a single averaged time coordinate
time = check_times(datasets)
if time is not None and "time" in data.dims:
data["time"] = [time]
if time is not None:
data = data.assign_coords({"time": time})

return datasets, data

Expand All @@ -518,39 +558,58 @@
return new_attrs


def check_times(projectables):
"""Check that *projectables* have compatible times."""
times = []
for proj in projectables:
status = _collect_time_from_proj(times, proj)
if not status:
break
else:
return _get_average_time(times)


def _collect_time_from_proj(times, proj):
status = False
try:
if proj["time"].size and proj["time"][0] != 0:
times.append(proj["time"][0].values)
status = True
except KeyError:
# the datasets don't have times
pass
except IndexError:
# time is a scalar
if proj["time"].values != 0:
times.append(proj["time"].values)
status = True
return status


def _get_average_time(times):
# Is there a more gracious way to handle this ?
if np.max(times) - np.min(times) > TIME_COMPATIBILITY_TOLERANCE:
raise IncompatibleTimes("Times do not match.")
return (np.max(times) - np.min(times)) / 2 + np.min(times)
@staticmethod
def combine_times(projectables):
"""Get a combined time coordinate between projectables.

Returns the arithmetic mean between the available time coordinates,
provided they have a common dimensionality and units. If no projectables
have time coordinates, return None. Time coordinates should be CF-encoded,
i.e. have a numeric dtype and a units attribute.
"""
_verify_times(projectables)
timed_projectables = [proj for proj in projectables
if hasattr(proj, "coords")
and "time" in proj.coords]
if len(timed_projectables) > 0:
with xr.set_options(keep_attrs=True):
# when the time coordinates are different, we can't use xarray to
# sum them without triggering expensive computations. At this
# point, we must also make sure NO time coordinate is attached
# to the new time DataArray itself, or things will get messed up
# later.
first = timed_projectables[0].coords["time"]
new_coords = {}
if "y" in first.coords:
new_coords["y"] = first.coords["y"].copy()
if "x" in first.coords:
new_coords["x"] = first.coords["x"].copy()
da_new_time = sum([x.coords["time"].data for x in timed_projectables]) / len(timed_projectables)
xr_new_time = xr.DataArray(
da_new_time,
dims=first.dims,
attrs=first.attrs.copy(),
coords=new_coords)
return xr_new_time.assign_coords(time=(first.dims, da_new_time))


def _verify_times(projectables):
"""Verify that the times can be combined.

Times can be combined if they have consistent units and dimensions.
"""
times = [p.coords["time"] for p in projectables
if hasattr(p, "coords") and "time" in p.coords]
for time in times:
if "units" not in time.attrs:
raise ValueError("Time coordinate lacks units attribute")
if time.attrs["units"] != times[0].attrs["units"]:
raise IncompatibleTimes("Time coordinates have inconsistent units. "
"Conversions not implemented.")
if time.dims != times[0].dims:
raise IncompatibleTimes(
"Time coordinates have inconsistent dimensionality. Only "
"consistent dimensionality is implemented.")

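Because the time coordinates stay CF-encoded (numeric values plus a ``units`` attribute, as ``_verify_times`` checks), the combined coordinate is a plain elementwise average of the encoded values — no datetime conversion, and hence no dask compute, is needed. A numpy sketch of the core of ``combine_times`` (the array values are made up for illustration):

```python
import numpy as np

# Two per-pixel time coordinates, CF-encoded as seconds since the same
# reference epoch; consistent units are exactly what _verify_times checks.
units = "seconds since 2025-01-01T00:00:00"
time_a = np.array([0.0, 10.0, 20.0])
time_b = np.array([2.0, 12.0, 22.0])

# The combined coordinate is the elementwise arithmetic mean of the
# encoded values; summing the raw .data arrays (rather than the xarray
# objects) avoids coordinate alignment and stays lazy under dask.
timed = [time_a, time_b]
mean_encoded = sum(timed) / len(timed)

# Decoding only happens later, e.g. when a writer formats the metadata.
decoded = np.datetime64("2025-01-01T00:00:00") + mean_encoded.astype("timedelta64[s]")
```

The same expression works unchanged when ``time_a`` and ``time_b`` are dask arrays, which is the point of keeping the values encoded until write time.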
Check warning on line 612 in satpy/composites/core.py — CodeScene Delta Analysis / CodeScene Code Health Review (main): ❌ New issue: Complex Method — _verify_times has a cyclomatic complexity of 10, threshold = 9.

class RGBCompositor(GenericCompositor):