Multi-image comparisons (darsia.multi_image_analysis)#

Multi-image comparison tools to track dynamics in static images. These are employed, e.g., to study concentration analyses and local deformations (equivalent to the task of image registration). An important step in translating images to concentrations is transforming the signal into physically meaningful data, which can be achieved through scaling and similar operations. Tools to calibrate models of interest are provided in this subpackage.
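
As a minimal illustration of such a signal transformation (a hypothetical linear model operating on plain NumPy arrays rather than darsia.Image objects; function and variable names are illustrative only):

```python
import numpy as np

def signal_to_concentration(img, baseline, scaling):
    """Convert an image signal to a concentration map.

    Illustrative sketch: subtract the baseline image, clip negative
    differences, and apply a linear scaling (hypothetical model).
    """
    diff = np.clip(img.astype(float) - baseline.astype(float), 0, None)
    return scaling * diff

baseline = np.zeros((2, 2))
test = np.array([[0.0, 0.5], [1.0, 2.0]])
concentration = signal_to_concentration(test, baseline, scaling=2.0)
```

The scaling factor is exactly the kind of model parameter the calibration tools in this subpackage are designed to determine.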

Submodules#

darsia.multi_image_analysis.balancing_calibration module#

Module collecting several calibration tools for models.

In particular, objective functions for calibration in ConcentrationAnalysis.calibrate_balancing() are provided.

class AbstractBalancingCalibration[source]#

Bases: object

Abstract class for defining an objective function to be called in ConcentrationAnalysis.calibrate_balancing().

calibrate_balancing(images, options)[source]#

Utility for calibrating the balancing used in darsia.ConcentrationAnalysis.

NOTE: Requires combining darsia.ConcentrationAnalysis with a calibration model mixin via multiple inheritance.

Parameters:
  • images (list of darsia.Image) – calibration images

  • options (dict) – container holding tuning information for the numerical calibration routine

Returns:

success of the calibration study.

Return type:

bool

abstractmethod optimize_balancing(input_images, images_diff, relative_times, options)[source]#

Abstract method to define an objective function.

Returns:

objective function.

Return type:

callable

update_balancing_for_calibration(parameters, options)[source]#

Wrapper for updating the balancing (provided as a model), depending on whether it is a single model or a combined model.

Parameters:
  • parameters (np.ndarray) – model parameters,

  • options (dict) – further tuning parameters and extra info.

class ContinuityBasedBalancingCalibrationMixin[source]#

Bases: AbstractBalancingCalibration

Calibration of the balancing based on reducing jumps across interfaces for a given labeled image. Has to be combined with ConcentrationAnalysis.

calibrate_balancing(images, options)#

Utility for calibrating the balancing used in darsia.ConcentrationAnalysis.

NOTE: Requires combining darsia.ConcentrationAnalysis with a calibration model mixin via multiple inheritance.

Parameters:
  • images (list of darsia.Image) – calibration images

  • options (dict) – container holding tuning information for the numerical calibration routine

Returns:

success of the calibration study.

Return type:

bool

optimize_balancing(input_images, images_diff, relative_times, options)[source]#

Define the objective function such that its root is the minimum.

Parameters:
  • input_images (list of np.ndarray) – input for _convert_signal

  • images_diff (list of np.ndarray) – plain differences wrt background image

  • relative_times (list of float) – times

  • options (dict) – dictionary with objective value, here the injection rate

Returns:

optimized model parameters; bool: success flag

Return type:

np.ndarray

update_balancing_for_calibration(parameters, options)#

Wrapper for updating the balancing (provided as a model), depending on whether it is a single model or a combined model.

Parameters:
  • parameters (np.ndarray) – model parameters,

  • options (dict) – further tuning parameters and extra info.

darsia.multi_image_analysis.concentrationanalysis module#

Capabilities to analyze concentrations/saturation profiles based on image comparison.

class ConcentrationAnalysis(base=None, signal_reduction=None, balancing=None, restoration=None, model=None, labels=None, **kwargs)[source]#

Bases: object

Class providing the capabilities to determine concentration/saturation profiles based on image comparison, and interpretation of intensity maps.

find_cleaning_filter(baseline_images=None)[source]#

Determine structural noise by studying a series of baseline images. The resulting cleaning filter will be used prior to the conversion of signal to concentration. The cleaning filter should be understood as a thresholding mask.

Parameters:

baseline_images (list of images) – series of baseline_images; default: use internally available baseline images.
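
A crude sketch of the idea behind such a cleaning filter, here taken as the pixelwise maximum deviation among baseline images (illustrative only; the function name and logic are simplified stand-ins for darsia's implementation):

```python
import numpy as np

def find_cleaning_filter(baseline_images):
    """Estimate structural noise as the pixelwise maximum absolute
    deviation of a series of baseline images from the first one
    (illustrative sketch)."""
    base = baseline_images[0].astype(float)
    noise = np.zeros_like(base)
    for img in baseline_images[1:]:
        noise = np.maximum(noise, np.abs(img.astype(float) - base))
    # Later used as a thresholding mask: signals below this level
    # are attributed to noise and removed.
    return noise

imgs = [np.zeros((2, 2)), np.array([[0.1, 0.0], [0.0, 0.2]])]
threshold = find_cleaning_filter(imgs)
```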

read_cleaning_filter_from_file(path)[source]#

Read cleaning filter from file.

Parameters:

path (str or Path) – path to cleaning filter array.

update(base=None, mask=None)[source]#

Update of the baseline image or parameters.

Parameters:
  • base (Image, optional) – image array

  • mask (np.ndarray, optional) – boolean mask, detecting which pixels will be considered; all others will be ignored in the analysis.

write_cleaning_filter_to_file(path_to_filter)[source]#

Store cleaning filter to file.

Parameters:

path_to_filter (str or Path) – path for storage of the cleaning filter.

balancing#

Balancing for heterogeneous signals.

base#

Baseline image.

first_restoration_then_model#

Option for defining order of routines.

labels#

Indicator for heterogeneous image.

mask#

Mask.

model#

Signal to data conversion model.

restoration#

Restoration model.

signal_reduction#

Reduction to scalar signal.

verbosity#

Fetch verbosity. With increasing number, more intermediate results are displayed. Useful for parameter tuning.

class PriorPosteriorConcentrationAnalysis(base, signal_reduction, balancing, restoration, prior_model, posterior_model, labels=None, **kwargs)[source]#

Bases: ConcentrationAnalysis

Special case of the ConcentrationAnalysis performing a prior-posterior analysis, i.e., allowing one to review the conversion performed through a prior model.

find_cleaning_filter(baseline_images=None)#

Determine structural noise by studying a series of baseline images. The resulting cleaning filter will be used prior to the conversion of signal to concentration. The cleaning filter should be understood as a thresholding mask.

Parameters:

baseline_images (list of images) – series of baseline_images; default: use internally available baseline images.

read_cleaning_filter_from_file(path)#

Read cleaning filter from file.

Parameters:

path (str or Path) – path to cleaning filter array.

update(base=None, mask=None)#

Update of the baseline image or parameters.

Parameters:
  • base (Image, optional) – image array

  • mask (np.ndarray, optional) – boolean mask, detecting which pixels will be considered; all others will be ignored in the analysis.

write_cleaning_filter_to_file(path_to_filter)#

Store cleaning filter to file.

Parameters:

path_to_filter (str or Path) – path for storage of the cleaning filter.

balancing#

Balancing for heterogeneous signals.

base#

Baseline image.

first_restoration_then_model#

Option for defining order of routines.

labels#

Indicator for heterogeneous image.

mask#

Mask.

model#

Signal to data conversion model.

restoration#

Restoration model.

signal_reduction#

Reduction to scalar signal.

verbosity#

Fetch verbosity. With increasing number, more intermediate results are displayed. Useful for parameter tuning.

darsia.multi_image_analysis.imageregistration module#

Module containing a diffeomorphic image registration tool.

class DiffeomorphicImageRegistration(img_dst, **kwargs)[source]#

Bases: object

Class to detect the deformation between different images.

In essence, DiffeomorphicImageRegistration is a wrapper around TranslationAnalysis.

add(diffeomorphic_image_registration)[source]#

Update the stored translation by adding the translation of an external diffeomorphic image registration.

Parameters:

diffeomorphic_image_registration (darsia.DiffeomorphicImageRegistration) – Diffeomorphic image registration object holding a translation analysis.

apply(img, reverse=False)[source]#

Apply computed transformation onto arbitrary image.

Parameters:
  • img (np.ndarray or darsia.Image) – image

  • reverse (bool) – flag whether the translation is understood as from the test image to the baseline image, or reversed. The default is the latter.

Returns:

transformed image, if input is array; no output otherwise

Return type:

np.ndarray, optional

call_with_output(img, plot_patch_translation=False, return_patch_translation=False, mask=None)[source]#

Determine the deformation pattern and apply the diffeomorphism to the image, aiming at matching the reference/destination (dst) image.

This is, in the end, only a wrapper for the translation analysis.

Parameters:
  • img (darsia.Image) – test image

  • reverse (bool) – flag whether the translation is understood as from the test image to the dst image, or reversed. The default is the former.

  • plot_patch_translation (bool) – flag controlling whether the displacement is also visualized as vector field.

  • return_patch_translation (bool) – flag controlling whether the displacement in the patch centers is returned in the sense of img to dst, complying to the plot; default is False.

deduct(diffeomorphic_image_registration)[source]#

Effectively copy from an external DiffeomorphicImageRegistration.

Parameters:

diffeomorphic_image_registration (darsia.DiffeomorphicImageRegistration) – Diffeomorphic image registration object holding a translation analysis.

displacement()[source]#

Return displacement in metric units on all pixels.

evaluate(coords, reverse=False, units='metric')[source]#

Evaluate the diffeomorphism at arbitrary points.

Parameters:
  • coords (np.ndarray, or darsia.Patches) – coordinate array with shape num_pts x 2, or alternatively num_rows_pts x num_cols_pts x 2, identifying points in a mesh/patched image, or equivalently patch.

  • reverse (bool) – flag whether the translation is understood as from the test image to the baseline image, or reversed. The default is the latter.

  • units (str) – input and output units; “metric” default; otherwise assumed to be “pixel”.

Returns:

deformation vectors for all coordinates.

Return type:

np.ndarray
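
The evaluation of a displacement field at arbitrary points can be sketched as follows, using nearest-patch-center lookup as a crude stand-in for the smooth interpolation darsia builds internally (all names here are hypothetical):

```python
import numpy as np

def evaluate_displacement(coords, centers, displacements):
    """Evaluate a displacement field at arbitrary points by
    nearest-patch-center lookup (crude stand-in for the smooth
    interpolation of the actual diffeomorphism)."""
    coords = np.atleast_2d(coords).astype(float)
    result = np.empty_like(coords)
    for i, pt in enumerate(coords):
        # Pick the displacement vector of the closest patch center.
        dists = np.linalg.norm(centers - pt, axis=1)
        result[i] = displacements[np.argmin(dists)]
    return result

centers = np.array([[0.0, 0.0], [1.0, 1.0]])
disp = np.array([[0.1, 0.0], [0.0, 0.2]])
vectors = evaluate_displacement(np.array([[0.1, 0.1]]), centers, disp)
```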

plot(scaling=1.0, mask=None)[source]#

Plot the diffeomorphism.

Parameters:
  • scaling (float) – scaling for vectors.

  • mask (darsia.Image, optional) – active set.

update_dst(img_dst)[source]#

Update of dst image.

Parameters:

img_dst (np.ndarray) – image array

class ImageRegistration(img_dst, method=None, **kwargs)[source]#

Bases: object

apply(img, reverse=False)[source]#

Apply computed transformation onto arbitrary image.

Parameters:
  • img (np.ndarray or darsia.Image) – image

  • reverse (bool) – flag whether the translation is understood as from the test image to the baseline image, or reversed. The default is the latter.

Returns:

transformed image, if input is array; no output otherwise

Return type:

np.ndarray, optional

evaluate(coords, reverse=False, units='metric')[source]#

See evaluate in DiffeomorphicImageRegistration.

plot(scaling, mask=None)[source]#

Plot the displacement stored in the current image registration.

Parameters:
  • scaling (float) – scaling parameter to control the length of the arrows.

  • mask (np.ndarray) – active mask

class MultiscaleDiffeomorphicImageRegistration(img_dst, config, mask_dst=None, total_config=None, **kwargs)[source]#

Bases: object

Class for multiscale diffeomorphic image registration being capable of tracking larger deformations.

apply(img, reverse=False)[source]#

Apply computed transformation onto arbitrary image.

Parameters:
  • img (np.ndarray or darsia.Image) – image

  • reverse (bool) – flag whether the translation is understood as from the test image to the baseline image, or reversed. The default is the latter.

Returns:

transformed image, if input is array; no output otherwise

Return type:

np.ndarray, optional

evaluate(coords, reverse=False, units='metric')[source]#

See evaluate in DiffeomorphicImageRegistration.

plot(scaling, mask=None)[source]#

Plot the displacement stored in the current image registration.

Parameters:
  • scaling (float) – scaling parameter to control the length of the arrows.

  • mask (np.ndarray) – active mask

darsia.multi_image_analysis.model_calibration module#

Calibration tools.

In particular, objective functions for calibration in ConcentrationAnalysis.calibrate_model() are provided.

class AbsoluteVolumeModelObjectiveMixin[source]#

Bases: AbstractModelObjective

Calibration model based on matching absolute injected volumes. Has to be combined with ConcentrationAnalysis.

calibrate_model(images, options, plot_result=False)#

Utility for calibrating the model used in darsia.ConcentrationAnalysis.

NOTE: Requires combining darsia.ConcentrationAnalysis with a calibration model mixin via multiple inheritance.

Parameters:
  • images (list of darsia.Image) – calibration images

  • options (dict) – container holding tuning information for the numerical calibration routine

  • plot_result (bool) – flag controlling whether the calibration is displayed in a plot.

Returns:

success of the calibration study.

Return type:

bool

define_objective_function(input_images, images_diff, times, options)[source]#

Define objective function such that the root is the min.

Parameters:
  • input_images (list of np.ndarray) – input for _convert_signal

  • images_diff (list of np.ndarray) – plain differences wrt background image

  • times (list of float) – times

  • options (dict) – dictionary with objective value, here the injection rate

Returns:

objective function

Return type:

callable

update_model_for_calibration(parameters, options)#

Wrapper for updating the model, depending on whether it is a single model or a combined model.

Parameters:
  • parameters (np.ndarray) – model parameters,

  • options (dict) – further tuning parameters and extra info; here, the key “dofs” is used to determine which dofs to update.

class AbstractModelObjective[source]#

Bases: object

Abstract class for defining an objective function to be called in ConcentrationAnalysis.calibrate_model().

calibrate_model(images, options, plot_result=False)[source]#

Utility for calibrating the model used in darsia.ConcentrationAnalysis.

NOTE: Requires combining darsia.ConcentrationAnalysis with a calibration model mixin via multiple inheritance.

Parameters:
  • images (list of darsia.Image) – calibration images

  • options (dict) – container holding tuning information for the numerical calibration routine

  • plot_result (bool) – flag controlling whether the calibration is displayed in a plot.

Returns:

success of the calibration study.

Return type:

bool

abstractmethod define_objective_function(input_images, images_diff, times, options)[source]#

Abstract method to define an objective function.

Returns:

objective function.

Return type:

callable

update_model_for_calibration(parameters, options)[source]#

Wrapper for updating the model, depending on whether it is a single model or a combined model.

Parameters:
  • parameters (np.ndarray) – model parameters,

  • options (dict) – further tuning parameters and extra info; here, the key “dofs” is used to determine which dofs to update.

class InjectionRateModelObjectiveMixin[source]#

Bases: AbstractModelObjective

Calibration model based on matching injection rates. Has to be combined with ConcentrationAnalysis.

calibrate_model(images, options, plot_result=False)#

Utility for calibrating the model used in darsia.ConcentrationAnalysis.

NOTE: Requires combining darsia.ConcentrationAnalysis with a calibration model mixin via multiple inheritance.

Parameters:
  • images (list of darsia.Image) – calibration images

  • options (dict) – container holding tuning information for the numerical calibration routine

  • plot_result (bool) – flag controlling whether the calibration is displayed in a plot.

Returns:

success of the calibration study.

Return type:

bool

define_objective_function(input_images, images_diff, times, options)[source]#

Define objective function such that the root is the min.

Parameters:
  • input_images (list of np.ndarray) – input for _convert_signal

  • images_diff (list of np.ndarray) – plain differences wrt background image

  • times (list of float) – times (units assumed to be compatible)

  • options (dict) – dictionary with objective value, here the injection rate

Returns:

objective function

Return type:

callable
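
The shape of such an injection-rate objective can be sketched as a residual between observed injected volumes and the expected rate-times-time line (a hypothetical, simplified version; the actual routine integrates concentration maps over the images):

```python
import numpy as np

def injection_rate_objective(volumes, times, injection_rate):
    """Residual between observed injected volumes (in practice
    integrated from concentration maps) and the expected
    rate * time line; the calibration drives this toward zero
    (illustrative sketch)."""
    expected = injection_rate * np.asarray(times, dtype=float)
    return float(np.sum((np.asarray(volumes, dtype=float) - expected) ** 2))

# Perfectly matching data yields a zero residual.
residual = injection_rate_objective([0.0, 1.0, 2.0], [0.0, 1.0, 2.0], 1.0)
```

A root-finding or least-squares routine applied to such a function over the model parameters is what the calibration options dictionary configures.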

model_calibration_postanalysis()[source]#

Interpret calibration result.

Returns:

time (in seconds) at which the signal is zero (based on calibration)

Return type:

float

update_model_for_calibration(parameters, options)#

Wrapper for updating the model, depending on whether it is a single model or a combined model.

Parameters:
  • parameters (np.ndarray) – model parameters,

  • options (dict) – further tuning parameters and extra info; here, the key “dofs” is used to determine which dofs to update.

darsia.multi_image_analysis.segmentationcomparison module#

Class for comparing segmented images.

The object contains information about the different segmentations, as well as methods for comparing them and visualizing the result.

class SegmentationComparison(number_of_segmented_images=2, **kwargs)[source]#

Bases: object

Class for comparing segmented images.

Routines for comparing segmentations and creating visualizations of the comparison.

number_of_segmented_images#

Number of segmented images to compare.

Type:

int

segmentation_names#

list of names for each of the segmentations. Will affect legends in plots.

Type:

list[str]

components#

list of values of the different (active) components in the segmentations. As of now, up to two are allowed, and the default values are 1 and 2.

Type:

list

component_names#

list of names for each of the components. Will be visible in legends.

Type:

list[str]

gray_colors#

array of base gray colors (in RGB space) that accounts for different overlapping segmentations of different components.

Type:

np.ndarray

colors#

color values for the different unique segmentations. Default is created from a colormap (matplotlib) depending on the amount of present segmentations.

Type:

np.ndarray

color_dictionary#

dictionary relating all of the different colors to different overlapping segmentation situations.

Type:

dict

color_fractions(comparison_image, colors=None, depth_map=None)[source]#

Returns color fractions.

Parameters:
  • comparison_image (np.ndarray) – Comparison of segmentations

  • colors (np.ndarray) – array of color values in the comparison image

  • depth_map (np.ndarray, optional) – depth map for the image

Returns:

Dictionary relating each color to the fraction of the number of pixels that the color occupies over the total number of occupied pixels in the image.

Return type:

(dict)
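
The fraction computation can be sketched as follows (a simplified version without the optional depth weighting; names are illustrative):

```python
import numpy as np

def color_fractions(comparison_image):
    """Fraction of occupied pixels carried by each color, where a
    pixel counts as occupied if any channel is nonzero
    (illustrative sketch)."""
    flat = comparison_image.reshape(-1, comparison_image.shape[-1])
    occupied = flat[np.any(flat != 0, axis=1)]
    colors, counts = np.unique(occupied, axis=0, return_counts=True)
    return {tuple(c): n / occupied.shape[0] for c, n in zip(colors, counts)}

img = np.array([[[1, 0, 0], [1, 0, 0]], [[0, 0, 1], [0, 0, 0]]])
fractions = color_fractions(img)
```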

compare_segmentations_binary_array(*segmentations, **kwargs)[source]#

Compares segmentations and returns an array where each pixel contains an array of 1s and 0s depending on which segmentations are present there. At the current state, it does not distinguish between the different kinds of components.

Parameters:
  • *segmentations (tuple[np.ndarray, ...]) – The segmentations to be compared.

  • **kwargs – optional keyword arguments:

    roi (Union[tuple, np.ndarray]): roi where the segmentations should be compared; default is the maximal roi that fits in all segmentations. Should be provided in pixel coordinates using matrix indexing, either as a tuple of slices or as an array of corner points.

    components (tuple[int, …]): the components that should be recognized in the segmentations; default is [1, 2].
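
The per-pixel presence flags can be sketched with plain NumPy (ignoring the roi handling of the real routine; names are illustrative):

```python
import numpy as np

def compare_segmentations_binary(*segmentations, components=(1, 2)):
    """For each pixel, record with a 0/1 flag whether any active
    component is present in each segmentation (illustrative sketch,
    without distinguishing between components)."""
    shape = segmentations[0].shape
    result = np.zeros(shape + (len(segmentations),), dtype=int)
    for k, seg in enumerate(segmentations):
        # Flag pixels belonging to any of the active components.
        result[..., k] = np.isin(seg, components).astype(int)
    return result

seg_a = np.array([[1, 0], [2, 0]])
seg_b = np.array([[0, 0], [2, 1]])
flags = compare_segmentations_binary(seg_a, seg_b)
```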

get_combinations(*segmentation_numbers, num_segmentations=5)[source]#

Returns a list of all possible combinations of segmentations.

Parameters:
  • num_segmentations (int, optional) – Number of segmentations. Defaults to 5.

  • *segmentation_numbers (tuple[int, ...]) – The segmentation numbers that should be included in the combinations. Defaults to ().

Returns:

List of all possible combinations of segmentations.

Return type:

list[list[int]]
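
The enumeration of combinations can be sketched with itertools (an illustrative re-implementation, not the library's actual code):

```python
from itertools import combinations

def get_combinations(*segmentation_numbers, num_segmentations=5):
    """All non-empty subsets of the chosen segmentation indices
    (illustrative sketch; defaults to all indices if none given)."""
    numbers = segmentation_numbers or tuple(range(num_segmentations))
    combos = []
    for r in range(1, len(numbers) + 1):
        combos.extend(list(c) for c in combinations(numbers, r))
    return combos

combos = get_combinations(0, 1)  # [[0], [1], [0, 1]]
```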

plot(image, figure_name='Comparison', legend_anchor=(0.7, 1))[source]#

Plots the provided image (should be a comparison of segmentations) with matplotlib.pyplot’s imshow and prints a legend with colors from the image and the color dictionary.

Parameters:
  • image (np.ndarray) – image with comparison of segmentations.

  • figure_name (str) – Figure name.

  • legend_anchor (tuple) – tuple of coordinates (x,y) in Euclidean style that determines legend anchor.

plot_overlay_segmentation(comparison_image, base_image, figure_name='Comparison', opacity=0.6, legend_anchor=(1.0, 1.0), custom_legend=None, custom_legend_text=None)[source]#

Plots a comparison image overlaid on a base image using matplotlib.

Parameters:
  • comparison_image (np.ndarray) – The image containing comparison of segmentations.

  • base_image (np.ndarray) – The base image that is to be overlaid.

  • figure_name (str) – Figure name.

  • opacity (float) – The opacity value for the comparison image.

  • legend_anchor (tuple) – tuple of coordinates (x,y) in Euclidean style that determines the legend anchor.

  • custom_legend (Optional[list[mpatches.Patch]]) – in case it is desirable to create a custom legend.

  • custom_legend_text (Optional[list[str]]) – in case it is desirable to customize legend.

darsia.multi_image_analysis.translationanalysis module#

Module containing class for translation analysis.

This is relevant e.g. for studying compaction of porous media.

class TranslationAnalysis(base, N_patches, rel_overlap, translation_estimator, mask=None)[source]#

Bases: object

Class for translation analysis.

add_translation_analysis(translation_analysis)[source]#

Add another translation analysis to the existing one. Modifies the interpolation object by redefinition.

Parameters:

translation_analysis (darsia.TranslationAnalysis) – Translation analysis holding an interpolation object.

bc_x(units)[source]#

Prescribed (boundary) conditions for the displacement in x direction.

Can be overwritten. Here, tailored to FluidFlower scenarios, fix the displacement in x-direction at the vertical boundaries of the image.

Parameters:

units (list of str) – “metric” or “pixel”

Returns:

coordinates; list of float: translation in x direction

Return type:

list of np.ndarray

bc_y(units)[source]#

Prescribed (boundary) conditions for the displacement in y direction.

Can be overwritten. Here, tailored to FluidFlower scenarios, fix the displacement in y-direction at the horizontal boundaries of the image.

Parameters:

units (list of str) – “metric” or “pixel”

Returns:

coordinates; list of float: translation in y direction

Return type:

list of np.ndarray

deduct_translation_analysis(translation_analysis)[source]#

Overwrite translation analysis by deducting from external one. (Re)defines the interpolation object.

Parameters:

translation_analysis (darsia.TranslationAnalysis) – translation analysis holding an interpolation object.

find_translation(units=['pixel', 'pixel'])[source]#

Find the translation map as a translation from the image to the baseline image such that these match as well as possible, measured on features.

The final translation map will be stored as a callable function that allows various input and output spaces (metric vs. pixel).

Parameters:
  • units (list of str) – units for input (first entry) and output (second entry) ranges of the resulting translation map; accepts either “metric” or “pixel”.

  • mask (np.ndarray, optional) – boolean mask marking all pixels to be considered; all if mask is None (default).

Returns:

translation map defined as interpolator; bool: flag indicating on which patches the routine has been successful

Return type:

Callable
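
The patchwise matching underlying such a translation analysis can be sketched as a brute-force integer shift search (a crude stand-in for the feature-based estimation actually used; all names are hypothetical):

```python
import numpy as np

def estimate_patch_shift(patch, base_patch, max_shift=2):
    """Brute-force integer shift estimate between two patches by
    minimizing the sum of squared differences over candidate shifts
    (crude stand-in for feature-based matching)."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(base_patch, dy, axis=0), dx, axis=1)
            err = np.sum((patch - shifted) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

base = np.zeros((5, 5)); base[1, 1] = 1.0
test = np.roll(base, (1, 1), axis=(0, 1))  # feature moved by (1, 1)
shift = estimate_patch_shift(test, base)
```

Repeating such an estimate per patch and interpolating the resulting vectors yields a translation map of the kind this routine returns.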

load_image(img, mask=None)[source]#

Load an image to be inspected in further analysis.

Parameters:
  • img (Image) – test image.

  • mask (Image) – mask to be considered in the analysis.

plot_translation(reverse=True, scaling=1.0, mask=None)[source]#

Translate centers of the test image and plot in terms of displacement arrows.

Parameters:
  • reverse (bool) – flag whether the translation is understood as from the test image to the baseline image, or reversed. The default is the former.

  • scaling (float) – scaling factor for visual comfort.

  • mask (Image) – mask of interest for arrows.

return_patch_translation(reverse=True, units='metric')[source]#

Translate patch centers of the test image.

Parameters:
  • reverse (bool) – flag whether the translation is understood as from the test image to the baseline image, or reversed. The default is the former.

  • units (list of str) – “metric” or “pixel”

Returns:

deformation in patch centers

Return type:

np.ndarray

translate_image(reverse=True)[source]#

Apply translation to an entire image by using piecewise perspective transformation.

Parameters:

reverse (bool) – flag whether the translation is understood as from the test image to the baseline image, or reversed. The default is the latter.

Returns:

translated image

Return type:

darsia.Image

update_base(base)[source]#

Update baseline image.

Parameters:

base (darsia.Image) – baseline image

update_base_patches()[source]#

Update patches of baseline.

update_params(N_patches=None, rel_overlap=None)[source]#

Routine to update parameters for creating patches.

If any of the parameters is changed, a new patch of the base image is created.

Parameters:
  • N_patches (list of two int) – number of patches in x and y direction

  • rel_overlap (float) – overlap relative to the patch size in each direction
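
The patching parameters can be illustrated with a sketch that derives row/column slices from N_patches and rel_overlap (a hypothetical helper, not darsia's implementation):

```python
import numpy as np

def patch_slices(shape, n_patches, rel_overlap=0.0):
    """Row/column slices for a regular patching of an image with a
    relative overlap per patch (illustrative sketch)."""
    slices = []
    for axis, n in enumerate(n_patches):
        size = shape[axis] // n
        pad = int(rel_overlap * size)  # overlap in pixels
        axis_slices = []
        for i in range(n):
            start = max(i * size - pad, 0)
            stop = min((i + 1) * size + pad, shape[axis])
            axis_slices.append(slice(start, stop))
        slices.append(axis_slices)
    return slices

rows, cols = patch_slices((100, 100), (2, 2), rel_overlap=0.1)
```

Overlapping patches smooth the transition between the per-patch translation estimates.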

N_patches#

Number of patches in matrix indexing.

have_translation#

Mask of flags storing success of finding translations for patches.

mask_base#

Mask.

rel_overlap#

Relative overlap.

translation#

Cache of current translation.

translation_estimator#

Translation estimator.