54 results for physically based modeling


Relevance:

80.00%

Publisher:

Abstract:

Numerical Weather Prediction (NWP) fields are used to assist the detection of cloud in satellite imagery. Simulated observations based on NWP are used within a framework based on Bayes' theorem to calculate a physically-based probability of each pixel within an imaged scene being clear or cloudy. Different thresholds can be set on the probabilities to create application-specific cloud masks. Here, the technique is shown to be suitable for daytime applications over land and sea, using visible and near-infrared imagery in addition to thermal infrared. We use a validation dataset of difficult cloud detection targets for the Spinning Enhanced Visible and Infrared Imager (SEVIRI), achieving true skill scores of 89% and 73% for ocean and land, respectively, using the Bayesian technique, compared to 90% and 70%, respectively, for the threshold-based techniques associated with the validation dataset.
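
As an illustration of the Bayesian combination described above, the sketch below applies Bayes' theorem to per-pixel likelihoods; the likelihood values, prior, and 0.5 mask threshold are purely illustrative assumptions, not values from the study.

```python
import numpy as np

def cloud_probability(p_obs_given_clear, p_obs_given_cloudy, prior_clear=0.7):
    """Bayes' theorem: posterior probability that a pixel is clear, given
    per-pixel likelihoods of the observed radiances under the clear and
    cloudy hypotheses (in the study these come from NWP-based simulated
    observations; here they are illustrative numbers)."""
    prior_cloudy = 1.0 - prior_clear
    evidence = p_obs_given_clear * prior_clear + p_obs_given_cloudy * prior_cloudy
    return p_obs_given_clear * prior_clear / evidence

# Application-specific cloud mask: threshold the posterior probability.
p_clear = cloud_probability(np.array([0.8, 0.1]), np.array([0.3, 0.9]))
cloud_mask = p_clear < 0.5   # True where the pixel is flagged as cloudy
```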

Relevance:

80.00%

Publisher:

Abstract:

The effect of diurnal variations in sea surface temperature (SST) on the air-sea flux of CO2 over the central Atlantic Ocean and Mediterranean Sea (60° S–60° N, 60° W–45° E) is evaluated for 2005–2006. We use high-spatial-resolution hourly satellite ocean skin temperature data to determine the diurnal warming (ΔSST). The CO2 flux is then computed using three different temperature fields: a foundation temperature (Tf, measured at a depth where there is no diurnal variation), Tf plus the hourly ΔSST, and Tf plus the monthly average of the ΔSSTs. This is done in conjunction with a physically-based parameterisation for the gas transfer velocity (NOAA-COARE). The differences between the fluxes evaluated for these three different temperature fields quantify the effects of both diurnal warming and diurnal covariations. We find that including diurnal warming increases the CO2 flux out of this region of the Atlantic for 2005–2006 from 9.6 Tg C a−1 to 30.4 Tg C a−1 (hourly ΔSST) and 31.2 Tg C a−1 (monthly average of ΔSST measurements). Diurnal warming in this region therefore has a large impact on the annual net CO2 flux, but diurnal covariations are negligible. However, in this region of the Atlantic the uptake and outgassing of CO2 are approximately balanced over the annual cycle, so although we find diurnal warming has a very large effect here, the Atlantic as a whole is a very strong carbon sink (e.g. −920 Tg C a−1; Takahashi et al., 2002), making this a small contribution to the Atlantic carbon budget.
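
A minimal sketch of a bulk flux calculation with the three temperature fields described above; the bulk formula, the ~4.23 %/°C pCO2 temperature sensitivity, and all numerical values are generic illustrations, not the NOAA-COARE parameterisation or the study's data.

```python
import numpy as np

def co2_flux(k_gas, solubility, pco2_water, pco2_atm):
    """Bulk air-sea CO2 flux: F = k * s * (pCO2_water - pCO2_atm); positive
    values are fluxes out of the ocean. Units are left schematic here."""
    return k_gas * solubility * (pco2_water - pco2_atm)

def pco2_at_temperature(pco2_ref, t_ref, t_skin, sensitivity=0.0423):
    """Adjust surface-water pCO2 from a reference temperature to the skin
    temperature using the commonly quoted ~4.23 %/degC isochemical
    sensitivity (an illustrative value, not the study's scheme)."""
    return pco2_ref * np.exp(sensitivity * (t_skin - t_ref))

# Three temperature fields: foundation Tf, Tf + hourly dSST, Tf + monthly-mean dSST.
t_f = 20.0
dsst_hourly = np.array([0.0, 0.3, 0.8, 0.4])          # toy diurnal cycle (degC)
for t_surface in (t_f, t_f + dsst_hourly, t_f + dsst_hourly.mean()):
    pco2_w = pco2_at_temperature(380.0, t_f, t_surface)
    flux = co2_flux(k_gas=1e-4, solubility=3e-2, pco2_water=pco2_w, pco2_atm=390.0)
    print(np.mean(flux))                              # time-mean flux for each case
```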

Relevance:

80.00%

Publisher:

Abstract:

Urbanization, the expansion of built-up areas, is an important yet less-studied aspect of land use/land cover change in climate science. To date, most global climate models used to evaluate effects of land use/land cover change on climate do not include an urban parameterization. Here, the authors describe the formulation and evaluation of a parameterization of urban areas that is incorporated into the Community Land Model, the land surface component of the Community Climate System Model. The model is designed to be simple enough to be compatible with the structural and computational constraints of a land surface model coupled to a global climate model, yet complex enough to explore physically based processes known to be important in determining urban climatology. The city representation is based upon the “urban canyon” concept, which consists of roofs, sunlit and shaded walls, and the canyon floor. The canyon floor is divided into pervious (e.g., residential lawns, parks) and impervious (e.g., roads, parking lots, sidewalks) fractions. Trapping of longwave radiation by canyon surfaces, and absorption and reflection of solar radiation, are determined by accounting for multiple reflections. Separate energy balances and surface temperatures are determined for each canyon facet. A one-dimensional heat conduction equation is solved numerically for a 10-layer column to determine conduction fluxes into and out of canyon surfaces. Model performance is evaluated against measured fluxes and temperatures from two urban sites. Results indicate the model does a reasonable job of simulating the energy balance of cities.
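
A minimal sketch, under simplified assumptions (explicit time stepping, uniform material properties, insulated lower boundary, illustrative values), of a 10-layer one-dimensional heat conduction calculation of the kind described above.

```python
import numpy as np

def step_heat_conduction(T, dz, dt, k=1.5, rho_c=2.0e6, flux_top=100.0):
    """One explicit step of the 1-D heat conduction equation
    rho*c dT/dt = d/dz(k dT/dz) for a layered column (e.g. a roof or wall).
    T is the vector of layer temperatures (surface layer first), flux_top is
    the net heat flux into the top surface (W m-2), and the bottom boundary
    is treated as insulated; k and rho_c are illustrative material values."""
    q = -k * np.diff(T) / dz                     # conductive flux between layers
    q = np.concatenate(([flux_top], q, [0.0]))   # add top and bottom boundary fluxes
    return T - dt * np.diff(q) / (rho_c * dz)    # flux convergence warms/cools each layer

# 10-layer column, one hour of 1-second steps.
T = np.full(10, 288.0)
for _ in range(3600):
    T = step_heat_conduction(T, dz=0.05, dt=1.0)
```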

Relevance:

80.00%

Publisher:

Abstract:

The Bollène-2002 Experiment was aimed at developing the use of a radar volume-scanning strategy for conducting radar rainfall estimations in the mountainous regions of France. A developmental radar processing system, called Traitements Régionalisés et Adaptatifs de Données Radar pour l’Hydrologie (Regionalized and Adaptive Radar Data Processing for Hydrological Applications), has been built and several algorithms were specifically produced as part of this project. These algorithms include 1) a clutter identification technique based on the pulse-to-pulse variability of reflectivity Z for noncoherent radar, 2) a coupled procedure for determining a rain partition between convective and widespread rainfall R and the associated normalized vertical profiles of reflectivity, and 3) a method for calculating reflectivity at ground level from reflectivities measured aloft. Several radar processing strategies, including nonadaptive, time-adaptive, and space–time-adaptive variants, have been implemented to assess the performance of these new algorithms. Reference rainfall data were derived from a careful analysis of rain gauge datasets furnished by the Cévennes–Vivarais Mediterranean Hydrometeorological Observatory. The assessment criteria for five intense and long-lasting Mediterranean rain events have proven that good quantitative precipitation estimates can be obtained from radar data alone within 100-km range by using well-sited, well-maintained radar systems and sophisticated, physically based data-processing systems. The basic requirements entail performing accurate electronic calibration and stability verification, determining the radar detection domain, achieving efficient clutter elimination, and capturing the vertical structure(s) of reflectivity for the target event. Radar performance was shown to depend on type of rainfall, with better results obtained with deep convective rain systems (Nash coefficients of roughly 0.90 for point radar–rain gauge comparisons at the event time step), as opposed to shallow convective and frontal rain systems (Nash coefficients in the 0.6–0.8 range). In comparison with time-adaptive strategies, the space–time-adaptive strategy yields a very significant reduction in the radar–rain gauge bias while the level of scatter remains basically unchanged. Because the Z–R relationships have not been optimized in this study, results are attributed to an improved processing of spatial variations in the vertical profile of reflectivity. The two main recommendations for future work consist of adapting the rain separation method for radar network operations and documenting Z–R relationships conditional on rainfall type.
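
The point radar-rain gauge comparisons above are summarised with Nash coefficients; the sketch below shows the Nash-Sutcliffe efficiency as it is conventionally computed (the rainfall values are illustrative, not the campaign data).

```python
import numpy as np

def nash_sutcliffe(radar, gauge):
    """Nash-Sutcliffe efficiency of radar rainfall estimates against rain gauge
    references: 1 - sum((R - G)^2) / sum((G - mean(G))^2). A value of 1 is a
    perfect match; ~0.9 corresponds to the deep-convective cases above."""
    radar, gauge = np.asarray(radar, float), np.asarray(gauge, float)
    return 1.0 - np.sum((radar - gauge) ** 2) / np.sum((gauge - gauge.mean()) ** 2)

# Illustrative event-accumulation pairs (mm) at a few gauge locations.
print(nash_sutcliffe([48.0, 112.0, 73.0, 160.0], [52.0, 120.0, 70.0, 151.0]))
```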

Relevance:

80.00%

Publisher:

Abstract:

As weather and climate models move toward higher resolution, there is growing excitement about potential future improvements in the understanding and prediction of atmospheric convection and its interaction with larger-scale phenomena. A meeting was convened in January 2013 in Dartington, Devon, to address the best way to maximise these improvements, specifically in a UK context but with international relevance. Specific recommendations included increased convective-scale observations, high-resolution virtual laboratories, and a system of parameterization test beds with a range of complexities. The main recommendation was to facilitate the development of physically based convective parameterizations that are scale-aware, non-local, non-equilibrium, and stochastic.

Relevance:

80.00%

Publisher:

Abstract:

Flash floods pose a significant danger to life and property. Unfortunately, in arid and semiarid environments runoff generation shows complex non-linear behavior with strong spatial and temporal non-uniformity. As a result, the predictions made by physically-based simulations in semiarid areas are subject to great uncertainty, and a failure in the predictive behavior of existing models is common. Thus, better descriptions of physical processes at the watershed scale need to be incorporated into hydrological model structures. For example, terrain relief has been systematically considered static in flood modelling at the watershed scale. Here, we show that the integrated effect of small distributed relief variations, originating through concurrent hydrological processes within a storm event, was significant on the watershed-scale hydrograph. We model these observations by introducing dynamic formulations of two relief-related parameters at diverse scales: maximum depression storage, and roughness coefficient in channels. In the final (a posteriori) model structure these parameters are allowed to be either time-constant or time-varying. The case under study is a convective storm in a semiarid Mediterranean watershed with ephemeral channels and high agricultural pressures (the Rambla del Albujón watershed; 556 km²), which showed a complex multi-peak response. First, to obtain quasi-sensible simulations in the (a priori) model with time-constant relief-related parameters, a spatially distributed parameterization was strictly required. Second, a generalized likelihood uncertainty estimation (GLUE) inference applied to the improved model structure, and conditioned on observed nested hydrographs, showed that accounting for dynamic relief-related parameters led to improved simulations. The discussion is finally broadened by considering the use of the calibrated model both to analyze the sensitivity of the watershed to storm motion and to attempt flood forecasting of a stratiform event with highly different behavior.
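
A minimal sketch of the GLUE inference mentioned above, under generic assumptions: Monte Carlo sampling of the two relief-related parameters, an informal Nash-Sutcliffe-type likelihood, and rejection of non-behavioural parameter sets. The model function, parameter ranges and threshold are placeholders, not the study's distributed model.

```python
import numpy as np

rng = np.random.default_rng(0)

def run_model(params, forcing):
    """Placeholder rainfall-runoff model returning a simulated hydrograph;
    a stand-in for the distributed model, purely for illustration."""
    depression_storage, channel_roughness = params
    return forcing * np.exp(-depression_storage) / channel_roughness

def glue(observed, forcing, n_samples=5000, threshold=0.6):
    """GLUE in outline: Monte Carlo sampling of parameter sets, an informal
    Nash-Sutcliffe-type likelihood for each run, and rejection of
    non-behavioural sets below the threshold."""
    obs_var = np.var(observed)
    behavioural = []
    for _ in range(n_samples):
        params = rng.uniform([0.1, 0.5], [2.0, 5.0])   # illustrative parameter ranges
        sim = run_model(params, forcing)
        efficiency = 1.0 - np.mean((sim - observed) ** 2) / obs_var
        if efficiency > threshold:
            behavioural.append((efficiency, params))
    return behavioural

# Toy hydrographs: count how many sampled parameter sets are behavioural.
observed = np.array([1.0, 3.0, 2.0, 0.5])
forcing = np.array([2.0, 6.0, 4.0, 1.0])
print(len(glue(observed, forcing)))
```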

Relevance:

80.00%

Publisher:

Abstract:

Radar reflectivity measurements from three different wavelengths are used to retrieve information about the shape of aggregate snowflakes in deep stratiform ice clouds. Dual-wavelength ratios are calculated for different shape models and compared to observations at 3, 35 and 94 GHz. It is demonstrated that many scattering models, including spherical and spheroidal models, do not adequately describe the aggregate snowflakes that are observed. The observations are consistent with fractal aggregate geometries generated by a physically-based aggregation model. It is demonstrated that the fractal dimension of large aggregates can be inferred directly from the radar data. Fractal dimensions close to 2 are retrieved, consistent with previous theoretical models and in-situ observations.
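
A minimal sketch of the dual-wavelength ratio computation described above, assuming the reflectivities are already expressed in dBZ; the profile values are illustrative.

```python
import numpy as np

def dual_wavelength_ratio(z_low_dbz, z_high_dbz):
    """Dual-wavelength ratio in dB: DWR = Z(lower frequency) - Z(higher
    frequency), with both reflectivities already in dBZ. Large aggregates
    scatter outside the Rayleigh regime at the higher frequency, so the DWR
    increases with particle size."""
    return np.asarray(z_low_dbz, float) - np.asarray(z_high_dbz, float)

# Illustrative reflectivity profiles (dBZ) at 3, 35 and 94 GHz.
z3, z35, z94 = np.array([18.0, 22.0]), np.array([16.0, 18.5]), np.array([12.0, 13.0])
dwr_3_35 = dual_wavelength_ratio(z3, z35)     # compared against the shape models
dwr_35_94 = dual_wavelength_ratio(z35, z94)
print(dwr_3_35, dwr_35_94)
```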

Relevance:

80.00%

Publisher:

Abstract:

Many of the next generation of global climate models will include aerosol schemes which explicitly simulate the microphysical processes that determine the particle size distribution. These models enable aerosol optical properties and cloud condensation nuclei (CCN) concentrations to be determined by fundamental aerosol processes, which should lead to a more physically based simulation of aerosol direct and indirect radiative forcings. This study examines the global variation in particle size distribution simulated by 12 global aerosol microphysics models to quantify model diversity and to identify any common biases against observations. Evaluation against size distribution measurements from a new European network of aerosol supersites shows that the mean model agrees quite well with the observations at many sites on the annual mean, but there are some seasonal biases common to many sites. In particular, at many of these European sites, the accumulation mode number concentration is biased low during winter and Aitken mode concentrations tend to be overestimated in winter and underestimated in summer. At high northern latitudes, the models strongly underpredict Aitken and accumulation particle concentrations compared to the measurements, consistent with previous studies that have highlighted the poor performance of global aerosol models in the Arctic. In the marine boundary layer, the models capture the observed meridional variation in the size distribution, which is dominated by the Aitken mode at high latitudes, with an increasing concentration of accumulation particles with decreasing latitude. Considering vertical profiles, the models reproduce the observed peak in total particle concentrations in the upper troposphere due to new particle formation, although modelled peak concentrations tend to be biased high over Europe. Overall, the multi-model-mean data set simulates the global variation of the particle size distribution with a good degree of skill, suggesting that most of the individual global aerosol microphysics models are performing well, although the large model diversity indicates that some models are in poor agreement with the observations. Further work is required to better constrain size-resolved primary and secondary particle number sources, and an improved understanding of nucleation and growth (e.g. the role of nitrate and secondary organics) will improve the fidelity of simulated particle size distributions.

Relevance:

80.00%

Publisher:

Abstract:

An efficient data-based modeling algorithm for nonlinear system identification is introduced for radial basis function (RBF) neural networks, with the aim of maximizing generalization capability based on the concept of leave-one-out (LOO) cross validation. Each of the RBF kernels has its own kernel width parameter, and the basic idea is to optimize the multiple pairs of regularization parameters and kernel widths, each of which is associated with a kernel, one at a time within the orthogonal forward regression (OFR) procedure. Thus, each OFR step consists of one model term selection based on the LOO mean square error (LOOMSE), followed by the optimization of the associated kernel width and regularization parameter, also based on the LOOMSE. Since the same LOOMSE is adopted for model selection as in our previous state-of-the-art local regularization assisted orthogonal least squares (LROLS) algorithm, the proposed new OFR algorithm is also capable of producing a very sparse RBF model with excellent generalization performance. Unlike the LROLS algorithm, which requires an additional iterative loop to optimize the regularization parameters as well as an additional procedure to optimize the kernel width, the proposed new OFR algorithm optimizes both the kernel widths and regularization parameters within a single OFR procedure, and consequently the required computational complexity is dramatically reduced. Nonlinear system identification examples are included to demonstrate the effectiveness of this new approach in comparison with the well-known support vector machine and least absolute shrinkage and selection operator approaches, as well as the LROLS algorithm.
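
The sketch below illustrates the closed-form leave-one-out MSE that underpins the selection criterion described above, for a regularized RBF model. It uses a single shared kernel width and a simple grid search rather than the paper's term-by-term OFR procedure with per-kernel widths and regularizers, so it is a simplified stand-in, not the proposed algorithm.

```python
import numpy as np

def rbf_design(x, centres, width):
    """Gaussian RBF design matrix with one shared kernel width (the full
    algorithm tunes a separate width per selected kernel)."""
    return np.exp(-((x[:, None] - centres[None, :]) ** 2) / (2.0 * width ** 2))

def loo_mse(P, y, reg):
    """Closed-form leave-one-out MSE of the regularized least-squares model
    y ~ P w, via the hat matrix H = P (P'P + reg I)^-1 P': the LOO residual
    is e_i / (1 - H_ii), so no model has to be refitted."""
    A = P.T @ P + reg * np.eye(P.shape[1])
    H = P @ np.linalg.solve(A, P.T)
    residuals = y - H @ y
    return float(np.mean((residuals / (1.0 - np.diag(H))) ** 2))

# Toy data: choose the (width, regularizer) pair that minimizes the LOOMSE.
x = np.linspace(-3.0, 3.0, 40)
y = np.sin(x) + 0.1 * np.random.default_rng(1).standard_normal(x.size)
best = min((loo_mse(rbf_design(x, x, w), y, r), w, r)
           for w in (0.3, 0.6, 1.0) for r in (1e-4, 1e-2, 1.0))
print(best)   # (LOOMSE, kernel width, regularization parameter)
```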

Relevance:

80.00%

Publisher:

Abstract:

Predicting the evolution of ice sheets requires numerical models able to accurately track the migration of ice sheet continental margins or grounding lines. We introduce a physically based moving point approach for the flow of ice sheets based on the conservation of local masses. This allows the ice sheet margins to be tracked explicitly and the waiting time behaviours to be modelled efficiently. A finite difference moving point scheme is derived and applied in a simplified context (continental radially-symmetrical shallow ice approximation). The scheme, which is inexpensive, is validated by comparing the results with moving-margin exact solutions and steady states. In both cases the scheme is able to track the position of the ice sheet margin with high precision.
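
The sketch below illustrates the conservation-based moving-point idea in a planar one-dimensional analogue: for dH/dt + dq/dx = 0, moving each grid point with velocity v = q/H keeps the mass between any two points constant, so the margin is tracked explicitly. The shallow-ice flux law is standard, but the constants, grid, mass balance and explicit time stepping are illustrative simplifications, not the paper's radially symmetrical scheme.

```python
import numpy as np

RHO_G, A_RATE, N = 910.0 * 9.81, 1.0e-24, 3   # ice weight density (SI), Glen rate factor (SI), exponent

def sia_flux(x, H):
    """Shallow-ice mass flux on a flat bed: q = -(2A/(n+2)) (rho g)^n H^(n+2) (dH/dx)^n."""
    dHdx = np.gradient(H, x)
    return -(2.0 * A_RATE / (N + 2)) * RHO_G ** N * H ** (N + 2) * dHdx ** N

def step(x, H, dt, accumulation=0.3 / 3.15e7):
    """Advance one step: move points with v = q/H (conserving the mass between
    points), then update thickness following the points: DH/Dt = a - H dv/dx."""
    q = sia_flux(x, H)
    v = np.where(H > 1.0, q / np.maximum(H, 1.0), 0.0)
    dvdx = np.gradient(v, x)
    H_new = np.maximum(H + dt * (accumulation - H * dvdx), 0.0)
    return x + dt * v, H_new

# Toy parabolic ice sheet, 0-200 km, ~0.01-year explicit steps.
x = np.linspace(0.0, 200.0e3, 101)
H = np.maximum(3000.0 * (1.0 - (x / 150.0e3) ** 2), 0.0)
for _ in range(100):
    x, H = step(x, H, dt=0.01 * 3.15e7)
```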

Relevance:

80.00%

Publisher:

Abstract:

Ocean prediction systems are now able to analyse and predict temperature, salinity and velocity structures within the ocean by assimilating measurements of the ocean’s temperature and salinity into physically based ocean models. Data assimilation combines current estimates of state variables, such as temperature and salinity, from a computational model with measurements of the ocean and atmosphere in order to improve forecasts and reduce uncertainty in forecast accuracy. Data assimilation generally works well with ocean models away from the equator but has been found to induce vigorous and unrealistic overturning circulations near the equator. A pressure correction method was developed at the University of Reading and the Met Office to control these circulations, using ideas from control theory and an understanding of equatorial dynamics. The method has been used for the last 10 years in seasonal forecasting and ocean prediction systems at the Met Office and the European Centre for Medium-Range Weather Forecasts (ECMWF). It has been an important element in recent re-analyses of the ocean heat uptake that mitigates climate change.
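
The abstract above describes combining model estimates with observations weighted by their respective uncertainties; the sketch below shows a generic best-linear-unbiased (Kalman-type) analysis update in that spirit. It is a textbook illustration with toy numbers, not the Met Office / University of Reading pressure-correction method.

```python
import numpy as np

def analysis_update(x_b, y, H, B, R):
    """Generic best-linear-unbiased analysis: combine a model background state
    x_b with observations y, weighted by background (B) and observation (R)
    error covariances:  x_a = x_b + K (y - H x_b),  K = B H^T (H B H^T + R)^-1."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return x_b + K @ (y - H @ x_b)

# Toy example: a 3-variable state (say T, S, u at one point) with T observed.
x_b = np.array([15.0, 35.0, 0.1])
H = np.array([[1.0, 0.0, 0.0]])
B = np.diag([0.5, 0.2, 0.05])
R = np.array([[0.25]])
print(analysis_update(x_b, np.array([15.8]), H, B, R))
```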

Relevance:

80.00%

Publisher:

Abstract:

Initializing the ocean for decadal predictability studies is a challenge, as it requires reconstructing the little-observed subsurface trajectory of ocean variability. In this study we explore to what extent surface nudging using well-observed sea surface temperature (SST) can reconstruct the deeper ocean variations for the 1949–2005 period. An ensemble is made with a nudged version of the IPSLCM5A model and compared to ocean reanalyses and reconstructed datasets. The SST is restored to observations using a physically-based relaxation coefficient, in contrast to earlier studies, which use a much larger value. The assessment is restricted to the regions where the ocean reanalyses agree, i.e. in the upper 500 m of the ocean, although this can be latitude and basin dependent. Significant reconstruction of the subsurface is achieved in specific regions, namely regions of subduction in the subtropical Atlantic, below the thermocline in the equatorial Pacific and, in some cases, in the North Atlantic deep convection regions. Beyond the mean correlations, ocean integrals are used to explore the time evolution of the correlation over 20-year windows. Classical fixed-depth heat content diagnostics do not exhibit any significant reconstruction between the different existing observation-based references and can therefore not be used to assess global average time-varying correlations in the nudged simulations. Using the physically based average temperature above an isotherm (14 °C) alleviates this issue in the tropics and subtropics and shows significant reconstruction of these quantities in the nudged simulations for several decades. This skill is attributed to the wind stress reconstruction in the tropics, as already demonstrated in a perfect model study using the same model. Thus, we also show here the robustness of this result in an historical and observational context.
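
A minimal sketch of an SST relaxation (nudging) term of the kind described above, assuming the restoring strength is expressed as a surface heat-flux feedback spread over a mixed layer; the 40 W m⁻² K⁻¹ feedback and 50 m depth are illustrative assumptions, not the coefficient used in the study.

```python
import numpy as np

RHO_W, CP_W = 1025.0, 3990.0        # sea-water density (kg m-3) and heat capacity (J kg-1 K-1)

def nudging_tendency(sst_model, sst_obs, feedback=40.0, mld=50.0):
    """SST tendency (K s-1) from relaxing the model SST toward observations,
    with the restoring strength given as a heat-flux feedback (W m-2 K-1)
    applied over a mixed-layer depth (m). The equivalent relaxation timescale
    is tau = rho * cp * mld / feedback; both numbers here are illustrative."""
    tau = RHO_W * CP_W * mld / feedback
    return (sst_obs - sst_model) / tau

# One day of hourly forward-Euler steps of the nudged SST (toy values).
sst = 290.0
for _ in range(24):
    sst += 3600.0 * nudging_tendency(sst, sst_obs=291.0)
```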

Relevance:

80.00%

Publisher:

Abstract:

Predicting the evolution of ice sheets requires numerical models able to accurately track the migration of ice sheet continental margins or grounding lines. We introduce a physically based moving-point approach for the flow of ice sheets based on the conservation of local masses. This allows the ice sheet margins to be tracked explicitly. Our approach is also well suited to capture waiting-time behaviour efficiently. A finite-difference moving-point scheme is derived and applied in a simplified context (continental radially symmetrical shallow ice approximation). The scheme, which is inexpensive, is verified by comparing the results with steady states obtained from an analytic solution and with exact moving-margin transient solutions. In both cases the scheme is able to track the position of the ice sheet margin with high accuracy.

Relevance:

40.00%

Publisher:

Abstract:

It is known that germin, which is a marker of the onset of growth in germinating wheat, is an oxalate oxidase, and also that germins possess sequence similarity with legumin and vicilin seed storage proteins. These two pieces of information have been combined in order to generate a 3D model of germin based on the structure of vicilin and to examine the model with regard to a potential oxalate oxidase active site. A cluster of three histidine residues has been located within the conserved beta-barrel structure. While there is a relatively low level of overall sequence similarity between the model and the vicilin structures, the conservation of amino acids important in maintaining the scaffold of the beta-barrel lends confidence to the juxtaposition of the histidine residues. The cluster is similar structurally to those found in copper amine oxidase and other proteins, leading to the suggestion that it defines a metal-binding location within the oxalate oxidase active site. It is also proposed that the structural elements involved in intermolecular interactions in vicilins may play a role in oligomer formation in germin/oxalate oxidase.

Relevance:

40.00%

Publisher:

Abstract:

Smooth flow of production in construction is hampered by the disparity between individual trade teams' goals and the goal of stable production flow for the project as a whole. This is exacerbated by the difficulty of visualizing the flow of work in a construction project. Building information modeling (BIM) provides a powerful platform for visualizing work flow in control systems that also enable pull flow and deeper collaboration between teams on and off site. The requirements for implementation of a BIM-enabled pull flow construction management software system based on the Last Planner System™, called ‘KanBIM’, have been specified, and a set of functional mock-ups of the proposed system has been implemented and evaluated in a series of three focus group workshops. The requirements cover the areas of maintenance of work flow stability, enabling negotiation and commitment between teams, lean production planning with sophisticated pull flow control, and effective communication and visualization of flow. The evaluation results show that the system holds the potential to improve work flow and reduce waste by providing both process and product visualization at the work face.