Satellite-derived nearshore bathymetry
The traditional way to determine the bathymetry of the nearshore zone is by means of in-situ sonar observations, nowadays mainly with multibeam echosounders. In this way the depth can be measured with decimeter-order accuracy. However, these measurements are labor-intensive, require a lot of ship time and are therefore expensive. This places important limitations on the frequency with which surveys can be carried out. Major parts of the world's nearshore waters have never been surveyed with in-situ sonar equipment. Remote sensing therefore offers an attractive alternative for bathymetric measurements. Various methods have been developed: methods using fixed installations on the coast and methods using airborne sensors. Remote sensing from the coast uses radar or video cameras. Depth estimates are obtained indirectly through the observable influence of water depth on wave propagation in shallow water. The use of video cameras is described in the article Argus applications and the use of radar in the article Use of X-band and HF radar in marine hydrography, both with references to other related articles. However, the nearshore zone covered by fixed observation installations is of limited size. Airborne remote sensing can cover larger areas. Accurate bathymetric measurements are possible with airborne LIDAR, see for example Data processing and output of Lidar, but only in very clear water. LIDAR observations are also expensive. Airborne remote sensing with hyperspectral sensors has wide applications in nearshore coastal waters, not only for bathymetric surveys but also for mapping submarine habitats. Examples are discussed in the article HyMap: Hyperspectral seafloor mapping and direct bathymetry calculation in littoral zones. All the methods described above require investments in equipment and means for installation and/or transportation. In contrast, a huge amount of remote sensing data is collected by satellites every day, globally and for all coastal zones. 
These remote sensing images are freely available for investigating physical, chemical and biological processes and characteristics of land and ocean, including coastal areas.
We discuss in this article three main techniques to extract nearshore bathymetry from satellite remote sensing: two so-called 'passive' techniques, based on (1) the interpretation of water color and (2) the detection of wave propagation characteristics, and (3) an 'active' technique, spaceborne LIDAR (Laser Imaging, Detection And Ranging). LIDAR does not need daylight, unlike color remote sensing. LIDAR and color remote sensing both require calm, clear water. Wave remote sensing also needs daylight, and works best with energetic waves, especially high swell waves. The three techniques are therefore complementary^{[1]}.
Satellite color-based bathymetry
Sunlight reflected from the seabed in shallow water contains information on the water depth. A simple example is a swimming pool, whose shallow and deep ends appear as different colors to the human eye.
Several satellites, such as Landsat 8, Sentinel-2, WorldView-2, QuickBird, IKONOS and SPOT, are equipped with optical sensors that measure sunlight reflected from the sea surface in different wavelength bands. Google Earth Engine is a cloud-based geospatial computing platform that offers a petabyte-scale archive of freely available optical satellite imagery, including the whole Landsat archive, the first three Sentinel missions, and the full Moderate Resolution Imaging Spectroradiometer (MODIS) data and products. Spectra recorded by these optical sensors contain information on the bathymetry of nearshore coastal waters. However, a plethora of factors affect the measured signal: the state of the atmosphere (e.g. haze, aerosols and clouds), the sea surface (e.g. sun glint, sky glint and white caps) and the water column (e.g. sedimentation, turbidity and variable optical properties). For instance, up to 90% of the top-of-atmosphere signal in the blue-to-red spectral bands can consist of scattering due to ozone and Rayleigh effects^{[2]}. The influence of these factors has to be removed by available preprocessing algorithms to obtain the remote sensing reflectance [math]R_{rs}[/math].
Several methods have been developed to extract the seabed bathymetry from the remote sensing reflectance. Most methods are based on models that require bathymetric data from other sources for calibration. One possible source is the LIDAR altimeter mounted on the ICESat-2 satellite.
Satellite-derived bathymetry is restricted to shallow coastal areas. In deep water, the light in the water column is too strongly attenuated to produce a significant reflected signal from the seabed. In clear water the maximum depth is about 25 m and in turbid water less than about 5 m^{[3]}. The accuracy of satellite-derived bathymetry is usually not much better than about 10% of the water depth.
Remote sensing reflectance
Optical remote sensing provides a measure of reflectance [math]R_{rs}(\lambda)[/math] in different wavelength bands [math]\Delta \lambda[/math]. Reflectance is defined as the ratio of upwelling radiance and downwelling irradiance, [math]R_{rs}=R_{up}/R_{in}[/math]. Upwelling radiance [math]R_{up}(\lambda)[/math] is the radiance emerging from below the water surface, i.e. the total light energy in a wavelength band [math]\Delta \lambda[/math] emitted per unit time in all directions from a unit area of the sea surface. Downwelling irradiance [math]R_{in}(\lambda)[/math] is the incoming light energy in a wavelength band [math]\Delta \lambda[/math] received directly and indirectly from the sun per unit time and unit sea surface.
Upwelling radiance is the sum of the light emitted from the water column, [math]R_{water}[/math], and from the seabed, [math]R_{bed}[/math]. It is also the difference between the total emitted sea surface radiance and the reflection of the sky radiance originating from the upper hemisphere, which includes both a direct component (sun glint) and a diffuse component (sky glint). Sun glint occurs in imagery when the water surface orientation is such that the sun is directly reflected towards the sensor; it is hence a function of sea surface state, sun position and viewing angle. The upwelling radiance depends on water constituents (e.g. suspended sediment, chlorophyll, gelbstoff), bottom reflectance and water depth.
Downwelling surface irradiance is the sum of the direct and diffuse components of sunlight attenuated by reflection at the air-sea interface. Reflectance of the direct sun beam depends on the solar zenith angle and the index of refraction of seawater. Reflectance of the diffuse irradiance is related to the roughness of the sea surface. Reflectance due to foam can be related to the wind speed, and it affects both the direct and the diffuse components. Downwelling irradiance can be estimated from analytical expressions^{[4]} or from measurements^{[5]}. A more detailed discussion of the optical properties of coastal waters can be found in Light fields and optics in coastal waters.
Retrieving water depth
Several methods have been developed to retrieve water depth from remote sensing reflectance spectra. A few popular methods are briefly described below.
Semi-analytical methods
Semi-analytical methods were developed by Lee et al. (1999)^{[6]}, who analyzed the absorption and backscatter properties of seawater (including the influence of suspended matter, phytoplankton and gelbstoff). From these properties they derived semi-empirical formulas for the contributions of absorption and backscatter to radiative transfer in seawater and a corresponding formula for the remote sensing reflectance (see Appendix A: Semi-Analytical Model). In shallow waters, the remote sensing reflectance depends not only on the absorption and scattering properties of dissolved and suspended material in the water column, but also on the bottom depth and the reflectivity of the bottom, or the bottom albedo. For the bottom depth to be retrieved, the water-column contributions to the reflectance have to be removed. The bathymetry of a shallow-water coastal field site can then be obtained by matching the semi-empirical formula to observed remote sensing reflectance values for a particular wavelength, without using any bathymetric data. The accuracy is on the order of ± 1 m for depths less than about 10 m.
Semi-empirical methods
Semi-empirical methods use relationships between reflected radiation and water depth without considering light transmission in water. In addition to depth values measured in a number of locations, semi-empirical methods require certain bands in the visible wavelength range, blue and green being the most widely used, as inputs in simple or multiple linear regressions. Two well-known methods exist to estimate bathymetry in a given area: the linear transform (Lyzenga et al. 2006^{[7]}) and the ratio transform (Stumpf et al. 2003^{[8]}), see Appendix B: Semi-Empirical Model.
The physical concept underlying the ability to estimate bathymetry from multispectral imagery is the wavelength-dependent attenuation of light in the water column. In very shallow water, the ratio transform method may perform better with the blue and red bands than with the blue and green bands, because the red band (and bands with longer wavelengths) has stronger absorption than the green band (and bands with shorter wavelengths)^{[9]}. As the water depth decreases, the wavelength band of the water-leaving reflectance that is most sensitive to depth shifts from shorter to longer wavelengths (from the green band to the red-edge band).
Empirical method
If the water depth is known at a sufficiently large number of locations, machine learning (ML) techniques can be used to predict bathymetry directly from the remote sensing reflectance [math]R_{rs}(\lambda)[/math] in different wavelength bands. Compared with the classic methods, machine learning does not require any empirical knowledge of attenuation, water quality or bottom type, and has wider applicability. The core assumption is that the bathymetry and the seabed have spectral signatures that can be differentiated within the remote sensing data. The empirical approach is easy to apply, and tools are readily available to process and analyze the data, which are major advantages. Recent developments in ML techniques improve the efficiency of processing huge in-situ datasets. The limitations are the requirement of in-situ data and the adaptation to a specific site; results usually cannot be transferred to other sites.
Machine learning techniques have also been applied to remote sensing reflectance without atmospheric correction by using the near-infrared (NIR) wavelength band. The NIR band is widely used for atmospheric correction; when training the model on remote sensing images without atmospheric correction, including the NIR band helps to improve the training accuracy^{[10]}. The bathymetric data obtained with atmospheric correction are more accurate than those obtained without, but in some cases avoiding the complex atmospheric correction process may yield acceptable results.
Popular machine learning techniques for retrieving the bathymetry directly from remote sensing reflectance are Principal Component Analysis^{[11]}, Artificial Neural Networks (ANN)^{[12]}, Random Forest Regression Trees (RF)^{[13]} and Support Vector Regression (SVR) algorithms^{[14]}.
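As an illustration of the empirical approach, the sketch below trains a minimal instance-based regressor (k-nearest neighbours, standing in here for the RF, SVR and ANN methods cited above) on synthetic reflectance spectra generated from a toy exponential attenuation model. All band coefficients and data are assumptions for illustration, not values from any of the cited studies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set: reflectance in 3 bands from a toy exponential
# attenuation model (cf. Eq. A4), depths between 0.5 and 15 m.
K = np.array([0.08, 0.12, 0.35])   # per-band attenuation [1/m] (assumed)
R_inf, R_bed = 0.02, 0.15          # deep-water and seabed reflectance (assumed)

def rrs(h):
    """Toy reflectance spectra for depths h, with small sensor noise."""
    e = np.exp(-2.0 * K * h[:, None])
    return R_inf * (1.0 - e) + R_bed * e + rng.normal(0, 1e-3, (len(h), 3))

h_train = rng.uniform(0.5, 15.0, 500)
X_train = rrs(h_train)

def knn_depth(x, X, h, k=5):
    """Depth of a pixel spectrum x = mean depth of its k nearest training spectra."""
    d = np.linalg.norm(X - x, axis=1)
    return h[np.argsort(d)[:k]].mean()

h_true = np.array([2.0, 6.0, 12.0])
X_test = rrs(h_true)
h_pred = np.array([knn_depth(x, X_train, h_train) for x in X_test])
print(np.round(h_pred, 1))  # values close to [2, 6, 12] at this noise level
```

In practice the training depths would come from sonar or ICESat-2 profiles, and the regressor would be one of the cited RF/SVR/ANN implementations rather than this minimal stand-in.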
Satellite-borne LIDAR
Airborne LIDAR bathymetry provides an efficient alternative to vessel-based echosounding techniques, particularly in transparent shallow waters. An introduction to the use of airborne LIDAR is given in the article Use of Lidar for coastal habitat mapping.
ICESat-2 ATLAS is a space-based laser altimeter launched in September 2018. It is a photon-counting LIDAR with a revisit period of 91 days. ATLAS uses a green laser (532 nm) with 10 000 pulses per second, a vertical resolution of 4 mm, a footprint of 13 m in diameter and an along-track sampling interval of 0.7 m. It has three laser beams along the track; the distance between adjacent beams is approximately 3.3 km. Each beam is divided into a strong and a weak sub-beam. The energy of the strong sub-beam is approximately four times that of the weak sub-beam, and the distance between them is 90 m. The detector is very sensitive, so the raw photon data in the ATL03 dataset are noisy, especially during the day due to solar background radiation^{[15]}.
The ICESat-2 geolocated photon data are available in the ATL03 product, which is disseminated through the National Snow and Ice Data Center (NSIDC). ICESat-2 was not originally designed for marine applications and its trajectories have a limited global distribution. However, the use of a 532-nm laser and the high accuracy of the altimetry give it great potential for nearshore bathymetry at depths up to about 40 m in optically clear waters^{[3]}. Refraction and tide corrections are essential for nearshore bathymetry when using ATLAS data. Even if the spacing between the beam pairs is too wide to generate high-resolution bathymetric results, water depth profiles derived from ICESat-2 can be used to feed the (semi-)empirical methods for satellite-derived bathymetry.
Satellite wave-based bathymetry
Depth information can also be retrieved by analyzing wave propagation characteristics detected by satellite remote sensing. Satellite wave-based bathymetry does not require in-situ data for calibration or training. The technique is based on a formula that expresses the wave propagation speed (wave celerity) in shallow water as a function of wavelength and water depth, see Shallow-water wave theory. This formula holds in principle for uniform depth, but is generally a reasonable approximation in situations where the depth varies smoothly. Values of the wave celerity and the wavelength can be derived from images of the sea surface at successive times. The water depth can then be computed from the analytical formula for the wave celerity. This technique has been commonly applied to sea surface images obtained from shore-based video cameras such as ARGUS^{[16]}^{[17]} or from images taken by drones^{[18]}^{[19]}.
Wave propagation from satellite images
The wave celerity can be derived from a time series of satellite images of the sea surface if certain requirements are met^{[20]}:
 the spatial coverage comprises at least one wavelength,
 the pixel diameter is much smaller than the wavelength (to enable wavelength estimation),
 the time lapse between two successive images is smaller than the wave period (to enable identification of individual waves),
 the ratio of pixel diameter to time lapse is smaller than the wave celerity (to enable estimating the shift in wave phase),
 the overlap of two successive images contains the same nearshore area.
These requirements restrict the use of satellite images, because the pixel diameter is large, O(10 m), and the time lapse between two images is often longer than the ratio of area width to satellite speed, meaning that successive images do not have sufficient overlap. In the best case two successive images are available, if there is a small time lapse of O(1 s) between images taken for different wavelength bands. For example, for Sentinel-2's bands with the best resolution (10 m) and a time lapse of 1 s, phase shift estimation requires about 7–8 pixels, meaning that wavelengths smaller than 70–80 m are not sufficiently resolved for phase shift estimation^{[21]}.
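These sampling constraints can be checked with a few lines; pixel size, time lapse and the number of pixels needed per wavelength below are assumed, Sentinel-2-like values:

```python
import math

# Feasibility check of the sampling requirements for a hypothetical
# Sentinel-2-like image pair (all values assumed for illustration).
g = 9.81          # gravitational acceleration [m/s^2]
pixel = 10.0      # pixel diameter [m]
dt = 1.0          # time lapse between the two band images [s]
n_pix = 7.5       # pixels needed per wavelength for phase estimation (approx.)

lam_min = n_pix * pixel                      # smallest resolvable wavelength [m]
c = math.sqrt(g * lam_min / (2 * math.pi))   # deep-water celerity at lam_min [m/s]
# The celerity must exceed the ratio of pixel diameter to time lapse:
print(lam_min, round(c, 1), c > pixel / dt)  # 75.0 10.8 True
```

So the shortest resolvable waves travel only marginally faster than the pixel-per-time-lapse threshold, which illustrates why satellite wave-based retrieval favors long, energetic swell.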
The identification of waves in the remote sensing images is based on the analysis of variations in pixel intensity, see Appendix C. A complicating factor is the irregular character of natural waves, with varying wave height, wave period and wave incidence direction. The uncertainty in the results is largely due to the uncertainty in the estimation of the wave celerity. Currently, the accuracy margins of depths obtained from satellite-based techniques are substantially larger than those of operational shore-based video cameras, and an order of magnitude larger than those of in-situ echo soundings^{[22]}^{[23]}.
Jurisdictional issues
In-situ bathymetric surveys in territorial waters typically require permission of the host country. Often, this permission comes with the request that survey findings are not made publicly available and that the host country receives a copy of the data. Many countries are reluctant to grant foreign entities access to territorial waters, due to the risk that bathymetric survey information could be used to facilitate undersea navigation by military vessels^{[24]}. Bathymetry from space that makes no use of in-situ data circumvents the need for host-country permission. Satellite-derived bathymetry can also be used to identify areas such as reefs, atolls and shoals that could be built upon to create artificial islands and lay claim to territorial waters and exclusive economic zones. Although currently hypothetical, the build-up of shallow water areas by states seeking to expand their influence is a worry in some regions of the world, and satellite-derived bathymetry could play a role in either expanding or refuting this practice^{[24]}.
Appendices
Appendix A: Semi-Analytical Model
The below-surface remote sensing reflectance [math]r_{rs}[/math] is given by^{[6]}
[math]r_{rs} = \Large\frac{r_{up}}{r_{in}}\normalsize = \Large\frac{r_{water}}{r_{in}} + \frac{r_{bed}}{r_{in}}\normalsize , \qquad (A1)[/math]
where [math] r_{in}(\lambda)[/math] is the downwelling irradiance below the sea surface. The upwelling radiance below the sea surface, [math]r_{up}(\lambda)[/math], has contributions from water column backscattering, [math]r_{water}[/math], and from seabed reflection, [math]r_{bed}[/math]. In deep water, the back radiation from the seabed is almost nil. The deep-water reflectance [math]r_{\infty}[/math] is thus equal to [math]r_{\infty} = \Large\frac{r_{deep\,water}}{r_{in}}\normalsize .[/math]
We now make two assumptions:
 The incoming light is exponentially attenuated with depth [math]z[/math] measured from the water surface, with attenuation coefficient [math]K[/math] that represents the sum of absorption and backscattering,
 The attenuation coefficient is the same in shallow water and deep water.
We then have
[math]r_{water} = K \, r_{in} R_W \int_0^h e^{-2Kz} dz = \large\frac{1}{2}\normalsize r_{in} R_W (1 - e^{-2Kh}) , \quad r_{\infty} = \large\frac{1}{2}\normalsize R_W , \quad r_{bed} = r_{in} R_B e^{-2Kh} , \qquad (A2)[/math]
where [math]h[/math] is the water depth and [math]R_W, \, R_B[/math] are backscattering and reflection coefficients for water and for the seabed. It then follows that
[math]r_{rs} = r_{\infty} \, (1 - e^{-2Kh}) + R_B e^{-2Kh} . \qquad (A3)[/math]
Considering that downwelling and upwelling attenuation can be different, the formula for the remote sensing reflectance reads^{[25]}:
[math]r_{rs} = r_{\infty} \, \big(1 - e^{-(K_D +K_U)h} \big) + R_B e^{-(K_D +K_U)h} . \qquad (A4)[/math]
Derivation of the remote sensing signal observed by satellite [math]R_{rs}[/math] from the below-surface reflectance [math]r_{rs}[/math] requires several correction factors: the refractive index of water, the water-to-air internal reflection, the radiance transmittance from below to above the surface, the irradiance transmittance from above to below the surface and the ratio of the upwelling radiance to the upwelling irradiance. An approximate expression for the relationship between [math]r_{rs}[/math] and [math]R_{rs}[/math] is^{[26]}
[math]R_{rs} \approx \Large\frac{0.5 r_{rs}}{1 - 1.5 r_{rs}}\normalsize . \qquad (A5)[/math]
If the attenuation coefficients [math]K_D, \, K_U[/math] and the seabed reflection coefficient [math]R_B[/math] are known for a specific wavelength [math]\lambda[/math], then the water depth [math]h[/math] can be derived by comparing the formulas (A4, A5) with the reflectance [math]R_{sat}[/math] recorded by the satellite sensor for the corresponding wavelength band.
Using empirical expressions for [math]K_D, \, K_U, \, R_B[/math], Lee et al. (1999)^{[6]} determined the bathymetry of Florida Bay from remote sensing reflectance data with an accuracy better than 10% for these shallow waters with a uniform, sandtype bottom.
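The inversion of Eqs. (A4) and (A5) for the depth at a single wavelength can be sketched as follows; the values of K_D, K_U, R_B and the deep-water reflectance are placeholder assumptions, in practice obtained from the empirical expressions of Lee et al. (1999):

```python
import math

# Illustrative inversion of Eqs. (A4)-(A5) for the depth h at one wavelength.
K_D, K_U = 0.10, 0.12   # attenuation coefficients [1/m] (assumed values)
R_B = 0.15              # seabed reflection coefficient (assumed)
r_inf = 0.02            # deep-water below-surface reflectance (assumed)

def depth_from_Rrs(R_rs):
    r_rs = R_rs / (0.5 + 1.5 * R_rs)     # invert (A5) for below-surface reflectance
    e = (r_inf - r_rs) / (r_inf - R_B)   # from (A4): e = exp(-(K_D + K_U) h)
    return -math.log(e) / (K_D + K_U)    # solve for h

# Forward check: synthesize the reflectance for h = 5 m, then invert it back.
h = 5.0
e = math.exp(-(K_D + K_U) * h)
r_rs = r_inf * (1 - e) + R_B * e         # Eq. (A4)
R_rs = 0.5 * r_rs / (1 - 1.5 * r_rs)     # Eq. (A5)
print(round(depth_from_Rrs(R_rs), 3))    # 5.0
```

The forward-inverse round trip recovers the depth exactly because the same model generates and inverts the reflectance; with real satellite data the accuracy is limited by how well the coefficients are known.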
Appendix B: Semi-Empirical Model
The first model is the one proposed by Lyzenga et al. (2006)^{[7]}, which assumes a linear relationship between the log-transformed wavelength bands and the depth [math]h[/math],
[math]h = h_0 + \sum_{j=1}^N h_j \, X_j , \quad X_j = \ln \big( R_{rs}(\lambda_j) - R_{\infty} (\lambda_j) \big) , \qquad (B1)[/math]
where [math]N[/math] is the number of bands considered and [math]h_0, h_1, …, h_N[/math] are coefficients that can be estimated through linear (multiple) regression from known depth values [math]\hat{h}[/math] in a number of points of the coastal area where the bathymetry is to be determined.
The second model, by Stumpf et al. (2003)^{[8]}, proposes an empirical linear relationship between the water depth [math]h[/math] and the ratio of the log-transformed green or red band ([math]\lambda_1[/math]) to the log-transformed blue band ([math]\lambda_2[/math]),
[math]h = m_0 + m_1 \Large\frac{\ln(1000 \, R_{rs}(\lambda_1))}{\ln(1000 \, R_{rs}(\lambda_2))}\normalsize . \qquad (B2)[/math]
The values of the parameters [math]m_0, \, m_1[/math] have to be adjusted by linear regression to predetermined depths in a number of points for estimating the bathymetry in other points of the coastal area^{[27]}.
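A minimal calibration of the ratio transform (B2) by linear regression can be sketched as follows; the control depths and band reflectances below are synthetic stand-ins for real calibration data (e.g. sonar or ICESat-2 control points):

```python
import numpy as np

# Least-squares fit of the ratio transform (B2):
#   h = m0 + m1 * ln(1000 R1) / ln(1000 R2)
rng = np.random.default_rng(1)
h_cal = rng.uniform(1.0, 12.0, 50)       # calibration depths [m] (synthetic)

# Toy band reflectances whose log-ratio decreases with depth (illustration only):
R1 = 0.01 * np.exp(-0.10 * h_cal)        # e.g. green band, stronger attenuation
R2 = 0.01 * np.exp(-0.02 * h_cal)        # e.g. blue band, weaker attenuation
ratio = np.log(1000 * R1) / np.log(1000 * R2)

# Linear regression for m0, m1
A = np.column_stack([np.ones_like(ratio), ratio])
m0, m1 = np.linalg.lstsq(A, h_cal, rcond=None)[0]

h_fit = m0 + m1 * ratio
rmse = np.sqrt(np.mean((h_fit - h_cal) ** 2))
print(round(rmse, 2))   # small residual: the log-ratio is nearly linear in depth
```

Once [math]m_0, \, m_1[/math] are fitted at the control points, the same expression is applied to every pixel of the scene.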
Appendix C: Analysis of wave characteristics from satellite images
According to linear wave theory, the water depth [math]h[/math] can be computed from the wave dispersion relation,
[math]h = \Large\frac{\lambda}{2 \pi }\normalsize \, \tanh^{-1} \Big( \Large\frac{2 \pi c^2}{g \lambda}\normalsize \Big), \qquad (C1)[/math]
if the wavelength [math]\lambda[/math] and the wave celerity [math]c[/math] are known. However, in deep water, [math]2 \pi h / \lambda \gt \gt 1[/math], which implies that the wave celerity does not depend on the depth and consequently the water depth cannot be computed from Eq. (C1).
In shallow water, [math]h \lt \lt \lambda / 2 \pi[/math], the relationship (C1) between water depth and wave celerity becomes [math]h \approx c^2 / g[/math]. The validity of this expression is questionable for the highly distorted bore-type waves on the upper shoreface. Applying cnoidal wave theory would give [math]c^2 \approx g (h + H)[/math], where [math]H[/math] is the wave height^{[28]}. A more accurate depth estimate can be obtained using a modified dispersion relation based on Boussinesq theory^{[29]}^{[30]}, see Nonlinear wave dispersion relations.
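Solving Eq. (C1) for the depth is a one-line computation once wavelength and celerity are known; the sketch below also checks the inversion against the forward linear dispersion relation:

```python
import math

g = 9.81  # gravitational acceleration [m/s^2]

def depth_from_waves(wavelength, celerity):
    """Water depth from Eq. (C1): h = (lambda / 2 pi) * artanh(2 pi c^2 / (g lambda))."""
    x = 2 * math.pi * celerity**2 / (g * wavelength)
    if x >= 1.0:
        # Celerity at or above the deep-water limit: depth not retrievable.
        raise ValueError("deep-water wave: depth cannot be inverted")
    return wavelength / (2 * math.pi) * math.atanh(x)

# Consistency check: forward dispersion relation at h = 10 m, lambda = 100 m,
# then invert the resulting celerity back to the depth.
lam, h = 100.0, 10.0
k = 2 * math.pi / lam
c = math.sqrt(g / k * math.tanh(k * h))  # linear dispersion: c^2 = (g/k) tanh(kh)
print(round(depth_from_waves(lam, c), 6))  # 10.0
```

The guard on the argument of artanh reflects the remark above: for deep-water waves the celerity saturates and Eq. (C1) has no solution.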
Wavelength and wave celerity can be extracted from the pixel intensity pattern [math]I(x,y)[/math], where [math]x, y[/math] are the spatial coordinates of the remote sensing image. As the pixel intensity pattern may contain a great amount of noise, a common technique to enhance linear patterns is the so-called Radon Transform (RT)^{[21]}. Even if wave fronts are not obviously apparent, they normally will appear through the Radon Transform. Linear features coinciding with wave fronts are represented by lines [math]r=x \cos \theta + y \sin \theta[/math], where [math]\theta[/math] is the wave incidence angle and [math]r[/math] the distance of the wave front to [math]x=0, \, y=0[/math] along the wave ray. The Radon Transform is then given by the integral over the image area [math]A[/math] along possible wave fronts (depending on [math]\theta[/math]),
[math]R_I (\theta, r) = \int \int_A \, I(x,y) \, \delta(r - x \cos \theta - y \sin \theta) \, dx dy , \qquad (C2)[/math]
where [math]\delta[/math] is the Dirac delta distribution ([math]\delta (x) = 0[/math] for [math]x \ne 0 , \; \int f(x) \delta (x-y) dx = f(y)[/math]). Wave fronts correspond to values of [math]\theta, r[/math] where [math]R_I[/math] is maximum.
The pixel intensity pattern is discretized ([math]r \rightarrow r_0, r_1, …, r_N[/math]) by assuming periodicity ([math]R_I(\theta, r_0) = R_I(\theta, r_N)[/math]) for successive wave fronts. The discrete complex Fourier transform of [math] R_I (\theta, r)[/math], [math]\; \tilde{ R_I }(\theta, k) = \sum_{n=0}^{N-1} R_I (\theta,r_n) \exp(-i 2 \pi k n/N), \; [/math] gives the intensity and the phase of the pixel pattern in wavenumber space [math]k = 2 \pi / \lambda[/math]. From the phase difference [math]\Delta \Phi[/math] between two successive images (separated by a time lapse [math]\Delta t[/math]) the wave celerity [math]c[/math] can be derived:
[math]\Phi = \tan^{-1} \Big( \Large\frac{\Im \tilde{ R_I }(\theta, k)}{\Re \tilde{ R_I }(\theta, k)}\normalsize \Big) , \quad c = \Large\frac{\lambda}{2 \pi}\frac{\Delta \Phi}{\Delta t}\normalsize . \qquad (C3)[/math]
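The Fourier-phase step of Eq. (C3) can be sketched on a synthetic one-dimensional transect; the Radon projection that would produce such a transect from a two-dimensional image is omitted here, and all wave parameters are assumed values:

```python
import numpy as np

# Fourier-phase estimation of wave celerity (Eq. C3) on a synthetic 1-D
# shore-normal transect sampled in two images separated by a time lapse dt.
L = 1000.0                     # transect length [m]
N = 500                        # samples (2 m spacing)
x = np.linspace(0.0, L, N, endpoint=False)
lam, c, dt = 100.0, 9.3, 1.0   # wavelength [m], celerity [m/s], time lapse [s]
k = 2 * np.pi / lam

I1 = np.cos(k * x)             # pixel-intensity transect, image 1
I2 = np.cos(k * (x - c * dt))  # same wave, one time lapse later

F1, F2 = np.fft.rfft(I1), np.fft.rfft(I2)
m = np.argmax(np.abs(F1))                  # dominant wavenumber bin
lam_est = L / m                            # wavelength from the bin index
dphi = np.angle(F1[m] / F2[m])             # phase advance between the images
c_est = lam_est / (2 * np.pi) * dphi / dt  # Eq. (C3)
print(round(lam_est, 1), round(c_est, 2))  # 100.0 9.3
```

On real imagery the phase difference is noisy and must be estimated per direction [math]\theta[/math] of the Radon transform; the retrieved [math]\lambda[/math] and [math]c[/math] then feed the dispersion relation (C1) to yield the depth.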
Other techniques can also be used for analyzing the remote sensing images, for instance wavelet analysis ^{[20]}.
Related articles
 Light fields and optics in coastal waters
 Optical remote sensing
 HyMap: Hyperspectral seafloor mapping and direct bathymetry calculation in littoral zones
 Bathymetry from remote sensing wave propagation
 Use of Lidar for coastal habitat mapping
 Data processing and output of Lidar
 Use of Xband and HF radar in marine hydrography
 Tidal flats from space
 Optical measurements in coastal waters
 Instruments for bed level detection
References
 ↑ Almar, R., Bergsma, E. W., Thoumyre, G., Baba, M. W., Cesbron, G., Daly, C., Garlan, T. and Lifermann, A. 2021. Global satellite-based coastal bathymetry from waves. Remote Sensing 13, 4628
 ↑ Mishra, D. R., Narumalani, S., Rundquist, D. and Lawson, M. 2005. Characterizing the Vertical Diffuse Attenuation Coefficient for Downwelling Irradiance in Coastal Waters: Implications for Water Penetration by High Resolution Satellite Data. ISPRS Journal of Photogrammetry and Remote Sensing 60: 48–64
 ↑ ^{3.0} ^{3.1} Cesbron, G., Melet, A., Almar, R., Lifermann, A., Tullot, D. and Crosnier, L. 2021. Pan-European Satellite-Derived Coastal Bathymetry—Review, User Needs and Future Services. Frontiers in Marine Science 8, 740830
 ↑ Gege, P. 2012. Analytic model for the direct and diffuse components of downwelling spectral irradiance in water. Applied Optics 51: 1407–1419
 ↑ Ruddick, K.G., Voss, K., Boss, E., Castagna, A., Frouin, R., Gilerson, A., Hieronymi, M., Johnson, B.C., Kuusk, J., Lee, Z., Ondrusek, M., Vabson, V. and Vendt, R. 2019. A Review of Protocols for Fiducial Reference Measurements of Downwelling Irradiance for the Validation of Satellite Remote Sensing Data over Water. Remote Sens. 11, 1742
 ↑ ^{6.0} ^{6.1} ^{6.2} Lee, Z., Carder, K. L., Mobley, C. D., Steward, R. G. and Patch, J. S. 1999. Hyperspectral remote sensing for shallow waters: 2. Deriving bottom depths and water properties by optimization. Appl. Opt. 38: 3831–3843
 ↑ ^{7.0} ^{7.1} Lyzenga, D.R., Malinas, N.P. and Tanis, F.J. 2006. Multispectral bathymetry using a simple physically based algorithm. IEEE Trans. Geosci. Remote Sens. 44: 2251–2259
 ↑ ^{8.0} ^{8.1} Stumpf, R.P., Holderied, K. and Sinclair, M. 2003. Determination of water depth with high-resolution satellite imagery over variable bottom types. Limnol. Oceanogr. 48: 547–556
 ↑ Liu, Y., Tang, D., Deng, R., Cao, B., Chen, Q., Zhang, R., Qin, Y. and Zhang, S. 2020. An adaptive blended algorithm approach for deriving bathymetry from multispectral imagery. IEEE J. Sel. Top. Appl. Earth Observ. Remote Sens. 14: 801–817
 ↑ Xie, C., Chen, P., Zhang, Z. and Pan, D. 2023. Satellite-derived bathymetry combined with Sentinel-2 and ICESat-2 datasets using machine learning. Front. Earth Sci. 11, 1111817
 ↑ Sanchez-Carnero, N., Ojeda-Zujar, J., Rodriguez-Perez, D. and Marquez-Perez, J. 2014. Assessment of different models for bathymetry calculation using SPOT multispectral images in a high-turbidity area: The mouth of the Guadiana Estuary. International Journal of Remote Sensing 35: 493–514
 ↑ Lumban-Gaol, Y. A., Ohori, K. A. and Peters, R. Y. 2021. Satellite-derived bathymetry using convolutional neural networks and multispectral Sentinel-2 images. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives 43(B3-2021): 201–207
 ↑ Mudiyanselage, S.S.J.D., Abd-Elrahman, A., Wilkinson, B. and Lecours, V. 2022. Satellite-derived bathymetry using machine learning and optimal Sentinel-2 imagery in South-West Florida coastal waters. GIScience & Remote Sensing 59: 1143–1158
 ↑ Misra, A., Vojinovic, Z., Ramakrishnan, B., Luijendijk, A. and Ranasinghe, R. 2018. Shallow water bathymetry mapping using Support Vector Machine (SVM) technique and multispectral imagery. International Journal of Remote Sensing 39: 4431–4450
 ↑ Zhang, X., Chen, Y., Le, Y., Zhang, D., Yan, Q., Dong, Y., Han, W. and Wang, L. 2022. Nearshore Bathymetry Based on ICESat-2 and Multispectral Images: Comparison Between Sentinel-2, Landsat-8, and Testing Gaofen-2. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing 15: 2449–2462
 ↑ Almar, R., Bonneton, P., Senechal, N. and Roelvink, D. 2009. Wave celerity from video imaging: A new method. In Proceedings of the 31st International Conference on Coastal Engineering. Vol. 1: 661–673
 ↑ Holman, R. A., Plant, N. and Holland, T. 2013. cBathy: A robust algorithm for estimating nearshore bathymetry. Journal of Geophysical Research: Oceans 118: 2595–2609
 ↑ Bergsma, E. W. J., Almar, R., de Almeida, L. P. M. and Sall, M. 2019. On the operational use of UAVs for video-derived bathymetry. Coastal Engineering 152, 103527
 ↑ Lange, A.M.Z., Fiedler, J.W., Merrifield, M.A. and Guza, R.T. 2023. UAV video-based estimates of nearshore bathymetry. Coastal Engineering 185, 104375
 ↑ ^{20.0} ^{20.1} Poupardin, A., Idier, D., de Michele, M. and Raucoules, D. 2016. Water depth inversion from a single SPOT-5 dataset. IEEE Trans. Geosci. Remote Sens. 119: 2329–2342
 ↑ ^{21.0} ^{21.1} Bergsma, E.W.J., Almar, R. and Maisongrande, P. 2019. Radon-augmented Sentinel-2 satellite imagery to derive wave-patterns and regional bathymetry. Remote Sens. 11, 16
 ↑ Bergsma, E.W.J., Almar, R., Rolland, A., Binet, R., Brodie, K.L. and Bak, A.S. 2021. Coastal morphology from space: A showcase of monitoring the topography-bathymetry continuum. Remote Sensing of Environment 261, 112469
 ↑ Almar, R., Bergsma, E.W.J., Thoumyre, G., Lemai-Chenevier, S., Loyer, S., Artigues, S., Salles, G., Garlan, T. and Lifermann, A. 2024. Satellite-derived bathymetry from correlation of Sentinel-2 spectral bands to derive wave kinematics: Qualification of Sentinel-2 S2Shores estimates with hydrographic standards. Coastal Engineering 189, 104458
 ↑ ^{24.0} ^{24.1} Dickens, K. and Armstrong, A. 2019. Application of Machine Learning in Satellite Derived Bathymetry and Coastline Detection. SMU Data Science Review 2, Article 4
 ↑ Albert, A. and Mobley, C.D. 2003. An analytical model for subsurface irradiance and remote sensing reflectance in deep and shallow case-2 waters. Opt. Express 11: 2873–2890
 ↑ Lee, Z., Carder, K. L., Mobley, C. D., Steward, R. G. and Patch, J. S. 1998. Hyperspectral remote sensing for shallow waters: 1. A semi-analytical model. Appl. Opt. 37: 6329–6338
 ↑ Caballero, I. and Stumpf, R.P. 2020. Towards Routine Mapping of Shallow Bathymetry in Environments with Variable Turbidity: Contribution of Sentinel-2A/B Satellites Mission. Remote Sens. 12, 451
 ↑ Thornton, E.B. and Guza, R.T. 1982. Energy saturation and phase speeds measured on a natural beach. J. Geophys. Res. 87, 9499
 ↑ Herbers, T. H. C., Elgar, S., Sarap, N. A. and Guza, R. T. 2002. Nonlinear dispersion of surface gravity waves in shallow water. Journal of Physical Oceanography 32: 1181–1193
 ↑ Martins, K., Bonneton, P., de Viron, O., Turner, I.L., Harley, M.D. and Splinter, K. 2022. New Perspectives for Nonlinear Depth-Inversion of the Nearshore Using Boussinesq Theory. Geophysical Research Letters 50, e2022GL100498
Rafaël Almar provided comments on a draft version of this article.
