991 results for Convection-dispersion Model


Relevance: 30.00%

Abstract:

As part of an international intercomparison project, a set of single column models (SCMs) and cloud-resolving models (CRMs) are run under the weak temperature gradient (WTG) method and the damped gravity wave (DGW) method. For each model, the implementation of the WTG or DGW method involves a simulated column which is coupled to a reference state defined with profiles obtained from the same model in radiative-convective equilibrium. The simulated column has the same surface conditions as the reference state and is initialized with profiles from the reference state. We performed a systematic comparison of the behavior of different models under a consistent implementation of the WTG and DGW methods, and of the two methods themselves across models with different physics and numerics. CRMs and SCMs produce a variety of behaviors under both the WTG and DGW methods. Some of the models reproduce the reference state, while others sustain a large-scale circulation which results in precipitation that is either substantially lower or higher than in the reference state. CRMs show a fairly linear relationship between precipitation and circulation strength. SCMs display a wider range of behaviors than CRMs, and some SCMs under the WTG method produce zero precipitation. Within an individual SCM, a DGW simulation and a corresponding WTG simulation can produce circulations of opposite sign. When initialized with a dry troposphere, DGW simulations always reach a precipitating equilibrium state. The greatest sensitivity to the initial moisture conditions occurs in those WTG simulations that have multiple stable equilibria: a dry equilibrium state when initialized dry and a precipitating equilibrium state when initialized moist. Multiple equilibria are seen in more WTG simulations at higher SST. In some models, the existence of multiple equilibria is sensitive to parameters of the WTG calculation.
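For orientation, a commonly used form of the WTG diagnosis of the parameterized large-scale vertical velocity is sketched below in LaTeX; the vertical coordinate, the boundary-layer treatment and the relaxation time scale τ differ between the participating models, so this should be read as an illustrative form rather than the intercomparison's exact specification.

\[
  w_{\mathrm{WTG}}\,\frac{\partial \bar{\theta}}{\partial z} \;=\; \frac{\theta - \theta_{\mathrm{ref}}}{\tau}
  \qquad\Longrightarrow\qquad
  w_{\mathrm{WTG}} \;=\; \frac{\theta - \theta_{\mathrm{ref}}}{\tau\,\partial \bar{\theta}/\partial z},
\]

where θ is the horizontal-mean potential temperature of the simulated column, θ_ref the reference profile and τ a relaxation time scale of a few hours; the diagnosed w_WTG supplies the large-scale advective tendencies of temperature and moisture that couple the column to the reference state.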

Relevance: 30.00%

Abstract:

With the development of convection-permitting numerical weather prediction, the efficient use of high-resolution observations in data assimilation is becoming increasingly important. The operational assimilation of these observations, such as Doppler radar radial winds, is now common, though to avoid violating the assumption of uncorrelated observation errors the observation density is severely reduced. Improving the quantity of observations used and the impact they have on the forecast will require the introduction of the full, potentially correlated, error statistics. In this work, observation error statistics are calculated for the Doppler radar radial winds that are assimilated into the Met Office high-resolution UK model, using a diagnostic that makes use of statistical averages of observation-minus-background and observation-minus-analysis residuals. This is the first in-depth study using the diagnostic to estimate both horizontal and along-beam correlated observation errors. The results show that the Doppler radar radial wind error standard deviations are similar to those used operationally and increase with observation height. Surprisingly, the estimated observation error correlation length scales are longer than the operational thinning distance. They depend both on the height of the observation and on the distance of the observation from the radar. Further tests show that the long correlations cannot be attributed to the use of superobservations or to the background error covariance matrix used in the assimilation. The large horizontal correlation length scales are, however, in part a result of using a simplified observation operator.
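The diagnostic referred to here estimates observation-error statistics from statistical averages of the two kinds of residuals; a minimal numpy sketch of such an estimator (a Desroziers-type diagnostic; the array names and the bias removal are illustrative assumptions) is given below.

```python
import numpy as np

def estimate_obs_error_covariance(omb, oma):
    """Estimate the observation-error covariance from samples of
    observation-minus-background (omb) and observation-minus-analysis (oma)
    departures, each of shape (n_samples, n_obs).

    Desroziers-type estimator: R ~ E[(y - H(xa)) (y - H(xb))^T].
    """
    omb = omb - omb.mean(axis=0)     # remove the sample-mean bias
    oma = oma - oma.mean(axis=0)
    n_samples = omb.shape[0]
    return oma.T @ omb / n_samples   # (n_obs, n_obs) covariance estimate

def covariance_to_correlation(R):
    """Split a covariance matrix into standard deviations and correlations."""
    sd = np.sqrt(np.diag(R))
    return sd, R / np.outer(sd, sd)
```

Binning the resulting correlations by observation height and by separation along the beam gives the standard deviations and correlation length scales discussed above.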

Relevance: 30.00%

Abstract:

Initializing the ocean for decadal predictability studies is a challenge, as it requires reconstructing the poorly observed subsurface trajectory of ocean variability. In this study we explore to what extent surface nudging using well-observed sea surface temperature (SST) can reconstruct the deeper ocean variations for the 1949–2005 period. An ensemble is made with a nudged version of the IPSLCM5A model and compared to ocean reanalyses and reconstructed datasets. The SST is restored to observations using a physically based relaxation coefficient, in contrast to earlier studies, which use a much larger value. The assessment is restricted to the regions where the ocean reanalyses agree, i.e. in the upper 500 m of the ocean, although this can be latitude and basin dependent. Significant reconstruction of the subsurface is achieved in specific regions, namely regions of subduction in the subtropical Atlantic, below the thermocline in the equatorial Pacific and, in some cases, in the North Atlantic deep convection regions. Beyond the mean correlations, ocean integrals are used to explore the time evolution of the correlation over 20-year windows. Classical fixed-depth heat content diagnostics do not exhibit any significant agreement among the existing observation-based references and can therefore not be used to assess global-average time-varying correlations in the nudged simulations. Using the physically based average temperature above an isotherm (14 °C) alleviates this issue in the tropics and subtropics and shows significant reconstruction of these quantities in the nudged simulations over several decades. This skill is attributed to the wind stress reconstruction in the tropics, as already demonstrated in a perfect model study using the same model. We thus also show the robustness of this result in a historical and observational context.
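To illustrate the isotherm-based diagnostic mentioned above, a minimal sketch is given below: it computes the average temperature above the depth of a chosen isotherm (14 °C here) from a discrete profile. The function name, the linear interpolation of the isotherm depth and the trapezoidal depth-weighting are assumptions for illustration, not the authors' exact implementation.

```python
import numpy as np

def avg_temp_above_isotherm(depth, temp, t_iso=14.0):
    """Average temperature above the depth of the t_iso isotherm.

    depth : 1-D array of depths (m, positive downward, increasing from the surface)
    temp  : 1-D array of temperatures (deg C) at those depths
    """
    below = np.where(temp < t_iso)[0]
    if below.size == 0:            # isotherm not reached: average the whole profile
        return np.trapz(temp, depth) / (depth[-1] - depth[0])
    k = below[0]
    if k == 0:                     # surface already colder than the isotherm
        return temp[0]
    # Linearly interpolate the depth of the isotherm between levels k-1 and k
    z_iso = np.interp(t_iso, temp[k - 1:k + 1][::-1], depth[k - 1:k + 1][::-1])
    # Depth-weighted mean temperature between the surface and the isotherm depth
    z = np.append(depth[:k], z_iso)
    t = np.append(temp[:k], t_iso)
    return np.trapz(t, z) / (z[-1] - z[0])
```

Compared with heat content above a fixed depth, this quantity follows vertical displacements of the thermocline, which is presumably why it behaves better in the tropics and subtropics.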

Relevance: 30.00%

Abstract:

Initialising the ocean internal variability for decadal predictability studies is a new area of research, and a variety of ad hoc methods are currently proposed. In this study, we explore how nudging with sea surface temperature (SST) and salinity (SSS) can reconstruct the three-dimensional variability of the ocean in a perfect model framework. This approach builds on the hypothesis that oceanic processes themselves will transport the surface information into the ocean interior, as seen in ocean-only simulations. Five nudged simulations are designed to reconstruct a 150-year “target” simulation, defined as a portion of a long control simulation. The nudged simulations differ in the variables restored (SST or SST + SSS) and in the area where the nudging is applied. The strength of the heat flux feedback is diagnosed from observations, and the restoring coefficient for SSS uses the same time scale. We find that this choice prevents spurious convection at high latitudes and near the sea-ice edge when nudging both SST and SSS. In the tropics, nudging the SST is enough to reconstruct the tropical atmospheric circulation and the associated dynamical and thermodynamical impacts on the underlying ocean. In the tropical Pacific Ocean, the temperature profiles show a significant correlation from the surface down to 2,000 m, due to dynamical adjustment of the isopycnals. At mid-to-high latitudes, SSS nudging is required to reconstruct both the temperature and the salinity below the seasonal thermocline. This is particularly true in the North Atlantic, where adding SSS nudging enables reconstruction of the deep convection regions of the target. By initiating a previously documented 20-year cycle of the model, the SST + SSS nudging is also able to reproduce most of the AMOC variations, a key source of decadal predictability. Reconstruction at depth does not significantly improve with the amount of time spent nudging; the efficiency of the surface nudging depends rather on the period and events considered. The joint SST + SSS nudging applied everywhere is the most efficient approach. It ensures that the right water masses are formed at the right surface density, with the subsequent circulation, subduction and deep convection transporting them to depth. The results of this study underline the potential key role of SSS for decadal predictability and further make the case for sustained large-scale observations of this field.
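The surface restoring used in such nudged simulations is typically applied as an additional heat (or, for salinity, freshwater) flux proportional to the model-observation misfit; a generic form is sketched below. The symbols and the order of magnitude quoted for γ are illustrative assumptions, not the study's exact settings.

\[
  Q_{\mathrm{restore}} \;=\; -\,\gamma\,\big(\mathrm{SST}_{\mathrm{model}} - \mathrm{SST}_{\mathrm{obs}}\big),
  \qquad
  \tau \;=\; \frac{\rho_{0}\,c_{p}\,h}{\gamma},
\]

where γ is the heat-flux feedback (of order a few tens of W m⁻² K⁻¹ when diagnosed from observations) and τ the corresponding relaxation time scale for a mixed layer of depth h; the SSS restoring coefficient is then chosen to give the same τ, consistent with the choice described above.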

Relevance: 30.00%

Abstract:

Substantial low-frequency rainfall fluctuations occurred in the Sahel throughout the twentieth century, causing devastating drought. Modeling these low-frequency rainfall fluctuations has remained problematic for climate models for many years. Here we show, using a combination of state-of-the-art rainfall observations and high-resolution global climate models, that changes in organized heavy rainfall events carry most of the rainfall variability in the Sahel at multiannual to decadal time scales. The ability to produce intense, organized convection allows climate models to simulate correctly the magnitude of late-twentieth-century rainfall change, underlining the importance of model resolution. Increasing model resolution allows a better coupling between large-scale circulation changes and regional rainfall processes over the Sahel. These results provide a strong basis for developing more reliable and skilful long-term predictions of rainfall (seasons to years), which could benefit many sectors in the region by allowing early adaptation to impending extremes.

Relevance: 30.00%

Abstract:

The Madden-Julian Oscillation (MJO) is the dominant mode of intraseasonal variability in the Tropics. It can be characterised as a planetary-scale coupling between the atmospheric circulation and organised deep convection that propagates east through the equatorial Indo-Pacific region. The MJO interacts with weather and climate systems on a near-global scale and is a crucial source of predictability for weather forecasts on medium to seasonal timescales. Despite its global significance, accurately representing the MJO in numerical weather prediction (NWP) and climate models remains a challenge. This thesis focuses on the representation of the MJO in the Integrated Forecasting System (IFS) at the European Centre for Medium-Range Weather Forecasts (ECMWF), a state-of-the-art NWP model. Recent modifications to the model physics in Cycle 32r3 (Cy32r3) of the IFS led to advances in the simulation of the MJO; for the first time, the observed amplitude of the MJO was maintained throughout the integration period. A set of hindcast experiments, which differ only in their formulation of convection, have been performed between May 2008 and April 2009 to assess the sensitivity of MJO simulation in the IFS to the Cy32r3 convective parameterization. Unique to this thesis is the attribution of the advances in MJO simulation in Cy32r3 to the modified convective parameterization, specifically the relative-humidity-dependent formulation for organised deep entrainment. Increasing the sensitivity of the deep convection scheme to environmental moisture is shown to modify the relationship between precipitation and moisture in the model. Through dry-air entrainment, convective plumes ascending in low-humidity environments terminate lower in the atmosphere. As a result, there is an increase in the occurrence of cumulus congestus, which acts to moisten the mid-troposphere. Owing to the modified precipitation-moisture relationship, more moisture is able to build up, which effectively preconditions the tropical atmosphere for the transition to deep convection. Results from this thesis suggest that a tropospheric moisture control on convection is key to simulating the interaction between the physics and large-scale circulation associated with the MJO.
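Schemes of the kind described here make the organised deep entrainment rate an explicit function of environmental relative humidity; an illustrative form is sketched below. The constant, the exponent and the saturation-humidity scaling are assumptions chosen to show the structure of such a formulation, not the exact IFS Cy32r3 expression.

\[
  \varepsilon_{\mathrm{org}} \;=\; \varepsilon_{0}\,\big(1.3 - \mathrm{RH}\big)\left(\frac{q_{\mathrm{sat}}(z)}{q_{\mathrm{sat}}(z_{\mathrm{base}})}\right)^{3},
\]

so that plumes rising through a dry environment (low RH) entrain more strongly, dilute faster and terminate lower, whereas a moist environment permits deeper ascent; this is the mechanism behind the congestus moistening and preconditioning described above.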

Relevance: 30.00%

Abstract:

As part of an international intercomparison project, the weak temperature gradient (WTG) and damped gravity wave (DGW) methods are used to parameterize large-scale dynamics in a set of cloud-resolving models (CRMs) and single column models (SCMs). The WTG or DGW method is implemented using a configuration that couples a model to a reference state defined with profiles obtained from the same model in radiative-convective equilibrium. We investigated the sensitivity of each model to changes in SST, given a fixed reference state, and performed a systematic comparison of the WTG and DGW methods across models, as well as of the behavior of the different models under each method. The sensitivity to SST depends on both the large-scale parameterization method and the choice of the cloud model. In general, SCMs display a wider range of behaviors than CRMs. All CRMs using either the WTG or DGW method show an increase of precipitation with SST, while SCMs show sensitivities which are not always monotonic. CRMs using either the WTG or DGW method show a similar relationship between mean precipitation rate and column relative humidity, while SCMs exhibit a much wider range of behaviors. DGW simulations produce large-scale velocity profiles which are smoother and less top-heavy than those produced by the WTG simulations. These large-scale parameterization methods provide a useful tool to identify the impact of parameterization differences on model behavior in the presence of two-way feedback between convection and the large-scale circulation.
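For comparison with the WTG expression given earlier, the DGW method diagnoses the large-scale pressure velocity from the virtual temperature anomaly through a damped gravity-wave equation; a commonly quoted form is sketched below (sign conventions, boundary conditions and the choice of wavenumber vary between implementations, so this is illustrative only).

\[
  \frac{\partial}{\partial p}\!\left(\varepsilon\,\frac{\partial \omega}{\partial p}\right)
  \;=\; \frac{k^{2} R_{d}}{p}\,\big(T_{v} - T_{v,\mathrm{ref}}\big),
\]

with ω = 0 at the surface and at the model top, where ω is the large-scale pressure velocity, k a specified horizontal wavenumber, ε a momentum-damping rate and R_d the gas constant for dry air; columns warmer than the reference state then develop ascent (ω < 0), which feeds back on the simulated convection.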

Relevance: 30.00%

Abstract:

Regional Climate Model version 3 (RegCM3) simulations of 17 summers (1988-2004) over the part of South America south of 5°S were evaluated to identify model systematic errors. Model results were compared to different rainfall data sets (Climate Research Unit (CRU), Climate Prediction Center (CPC), Global Precipitation Climatology Project (GPCP), and National Centers for Environmental Prediction (NCEP) reanalysis), including the five-summer mean (1998-2002) precipitation diurnal cycle observed by the Tropical Rainfall Measuring Mission (TRMM) Precipitation Radar (PR). In spite of regional differences, RegCM3 simulates the main observed aspects of the summer climatology associated with precipitation (the northwest-southeast band of the South Atlantic Convergence Zone (SACZ)) and air temperature (warmer air in the central part of the continent and colder air in eastern Brazil and the Andes Mountains). At the regional scale, the main RegCM3 failures are the underestimation of precipitation in the northern branch of the SACZ and some unrealistically intense precipitation around the Andes Mountains. However, the RegCM3 seasonal precipitation is closer to the fine-scale analyses (CPC, CRU, and TRMM-PR) than is the NCEP reanalysis, which presents an incorrect north-south orientation of the SACZ and an overestimation of its intensity. The precipitation diurnal cycle observed by TRMM-PR shows pronounced contrasts between Tropics and Extratropics and between land and ocean, and most of these features are simulated by RegCM3. The major similarities between simulation and observation, especially in the diurnal cycle phase, are found over the continental tropical and subtropical SACZ regions, which present an afternoon maximum (1500-1800 UTC) and a morning minimum (0900-1200 UTC). More specifically, over the core of the SACZ, the phase and amplitude of the simulated precipitation diurnal cycle are very close to the TRMM-PR observations. Although there are amplitude differences, RegCM3 simulates the observed nighttime rainfall in the eastern Andes Mountains, over the Atlantic Ocean, and also over northern Argentina. The main simulation deficiencies are found over the Atlantic Ocean and near the Andes Mountains. Over the Atlantic Ocean the convective scheme is not triggered; the rainfall therefore arises from the grid-scale scheme and differs from the TRMM-PR observations. Near the Andes, intense (nighttime and daytime) simulated precipitation could be a response to an incorrect circulation and topographic uplift. Finally, it is important to note that, unlike the bias most commonly reported for global models, RegCM3 does not trigger moist convection just after sunrise over the southern part of the Amazon.

Relevance: 30.00%

Abstract:

In this paper we deal with a Bayesian analysis for right-censored survival data suitable for populations with a cure rate. We consider a cure rate model based on the negative binomial distribution, which encompasses the promotion time cure model as a special case. The Bayesian analysis is based on Markov chain Monte Carlo (MCMC) methods. We also discuss model selection and present an illustration with a real dataset.
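For orientation, the population survival function implied by a negative binomial number of competing causes can be written as below; the parametrization (θ for the mean number of causes, η for the dispersion parameter) is one common choice and may differ from the paper's notation.

\[
  S_{\mathrm{pop}}(t) \;=\; \big(1 + \eta\,\theta\,F(t)\big)^{-1/\eta},
  \qquad
  p_{0} \;=\; \lim_{t\to\infty} S_{\mathrm{pop}}(t) \;=\; (1 + \eta\,\theta)^{-1/\eta},
\]

where F(t) is the distribution function of the time to the event for an individual cause and p₀ is the cured fraction; letting η → 0 recovers the promotion time cure model, S_pop(t) = exp{−θ F(t)} with p₀ = e^{−θ}.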

Relevance: 30.00%

Abstract:

The main goal of this paper is to investigate a cure rate model that encompasses some well-known proposals found in the literature. In our work the number of competing causes of the event of interest follows the negative binomial distribution. The model is conveniently reparametrized through the cured fraction, which is then linked to covariates by means of the logistic link. We explore the use of Markov chain Monte Carlo methods to develop a Bayesian analysis for the proposed model. The procedure is illustrated with a numerical example.
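A minimal sketch of the reparametrization through the cured fraction, with the logistic link to covariates, is given below; the exponential latent event-time distribution and all variable names are assumptions for illustration only.

```python
import numpy as np

def cured_fraction(X, beta):
    """Logistic link: cured fraction p0 for covariates X (n, p) and coefficients beta (p,)."""
    return 1.0 / (1.0 + np.exp(-X @ beta))

def pop_survival(t, X, beta, eta, rate=1.0):
    """Population survival for a negative binomial cure rate model,
    reparametrized through the cured fraction p0 (illustrative sketch).

    Uses p0 = (1 + eta*theta)**(-1/eta), so theta = (p0**(-eta) - 1) / eta,
    with an exponential latent event-time cdf F(t) = 1 - exp(-rate * t).
    """
    p0 = cured_fraction(X, beta)
    theta = (p0 ** (-eta) - 1.0) / eta
    F = 1.0 - np.exp(-rate * np.asarray(t))
    return (1.0 + eta * theta * F) ** (-1.0 / eta)
```

As t grows large, F(t) → 1 and the expression tends to p0, so the cured fraction is recovered directly from the covariates through the logistic link; in a Bayesian fit, the parameters (beta, eta, rate) would be sampled by MCMC.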

Relevance: 30.00%

Abstract:

We investigate the dielectric dispersion of water, especially in the low-frequency range, using the impedance spectroscopy technique. The frequency dependencies of the real and imaginary parts of the impedance could not be explained by means of the usual description of the dielectric properties of water as an insulating liquid containing ions. This is due to the incomplete knowledge of the parameters entering the fundamental equations describing the evolution of the system, and of the mechanisms regulating the exchange of charge between the cell and the external circuit. We propose a simple description of our experimental data based on the model of Debye, by invoking a dc conductivity of the cell related to the non-blocking character of the electrodes. A discussion of the electric circuits able to simulate the cell under investigation, based on bulk and surface elements, is also reported. We find that a simple circuit formed by a series of two parallel resistance-capacitance elements is able to reproduce the experimental data for the real and imaginary parts of the electrical impedance of the cell for frequencies larger than 1 Hz. In this description, one of the parallel elements accounts for the electrical properties of the interface between the electrode and the water, and the other for those of the bulk. For frequencies lower than 1 Hz, good agreement with the experimental data is obtained by simulating the electrical properties of the interface by means of a constant phase element.
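A minimal sketch of the equivalent circuit described above, two parallel resistance-capacitance elements in series with the interface element optionally replaced by a constant phase element at low frequency, is given below. The component values are placeholders, not the fitted values of the measurements.

```python
import numpy as np

def parallel_rc(omega, R, C):
    """Impedance of a resistor and a capacitor in parallel."""
    return R / (1.0 + 1j * omega * R * C)

def cpe(omega, Q, n):
    """Constant phase element: Z = 1 / (Q * (j*omega)**n), with 0 < n <= 1."""
    return 1.0 / (Q * (1j * omega) ** n)

def cell_impedance(omega, R_bulk, C_bulk, R_int, C_int, use_cpe=False, Q=1e-6, n=0.8):
    """Series combination of a bulk element and an interface element.

    Above ~1 Hz the interface is modelled as a parallel RC; at lower
    frequencies the abstract replaces it with a constant phase element.
    """
    z_bulk = parallel_rc(omega, R_bulk, C_bulk)
    z_int = cpe(omega, Q, n) if use_cpe else parallel_rc(omega, R_int, C_int)
    return z_bulk + z_int

# Example: real and imaginary parts over 0.01 Hz - 100 kHz (placeholder values)
f = np.logspace(-2, 5, 200)
Z = cell_impedance(2 * np.pi * f, R_bulk=1e4, C_bulk=1e-10, R_int=1e6, C_int=1e-6)
R_part, X_part = Z.real, Z.imag
```

Fitting the two branches of this circuit above 1 Hz, and switching the interface branch to the constant phase element below 1 Hz, corresponds to the two frequency regimes discussed in the abstract.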

Relevance: 30.00%

Abstract:

A previously proposed model describing the trapping site of interstitial atomic hydrogen in borate glasses is analyzed. In this model, the atomic hydrogen is stabilized by van der Waals forces at the centers of oxygen polygons belonging to B-O ring structures in the glass network. The previously reported atomic hydrogen isothermal decay experimental data are discussed in the light of this microscopic model. A system of coupled differential equations describing the observed decay kinetics was solved numerically using the Runge-Kutta method. The experimental untrapping activation energy of 0.7 × 10⁻¹⁹ J is in good agreement with the calculated results for the dispersion interaction between the stabilized atomic hydrogen and the neighboring oxygen atoms at the vertices of hexagonal ring structures.
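Since the abstract does not give the rate equations, the sketch below only illustrates the numerical approach: a hypothetical pair of coupled decay equations (trapped atomic hydrogen released at a first-order rate and recombining at a second-order rate, with placeholder rate constants) integrated with an explicit Runge-Kutta scheme.

```python
import numpy as np
from scipy.integrate import solve_ivp

def decay_kinetics(t, y, k_untrap, k_recomb):
    """Hypothetical coupled decay kinetics: trapped atomic hydrogen n_trapped
    is released at rate k_untrap and the freed atoms recombine at rate k_recomb."""
    n_trapped, n_free = y
    dn_trapped = -k_untrap * n_trapped
    dn_free = k_untrap * n_trapped - k_recomb * n_free**2
    return [dn_trapped, dn_free]

# Explicit Runge-Kutta integration (RK45) with placeholder parameters
sol = solve_ivp(decay_kinetics, t_span=(0.0, 1e4), y0=[1.0, 0.0],
                args=(1e-3, 1e-4), method="RK45", dense_output=True)
t = np.linspace(0.0, 1e4, 200)
n_trapped, n_free = sol.sol(t)
```

In practice, the temperature dependence of the untrapping rate would typically yield the activation energy that the abstract compares against the dispersion-interaction calculation.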

Relevance: 30.00%

Abstract:

We consider the issue of performing residual and local influence analyses in beta regression models with varying dispersion, which are useful for modelling random variables that assume values in the standard unit interval. In such models, both the mean and the dispersion depend upon independent variables. We derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes. An application using real data is presented and discussed.
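For reference, a beta regression with varying dispersion can be written, in one common parametrization (the paper's link functions may differ), as

\[
  y_i \sim \mathrm{Beta}\big(\mu_i \phi_i,\ (1-\mu_i)\phi_i\big),
  \qquad g_1(\mu_i) = x_i^{\top}\beta,
  \qquad g_2(\phi_i) = z_i^{\top}\gamma,
\]

where μ_i ∈ (0, 1) is the mean, φ_i > 0 the precision parameter (larger φ_i means smaller dispersion), and g₁, g₂ are link functions such as the logit and the log; local influence is then assessed from the curvature of the likelihood-displacement surface under small perturbations of the data or of the model.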

Relevance: 30.00%

Abstract:

A three-dimensional, time-dependent hydrodynamic and heat transport model of Lake Binaba, a shallow and small dam reservoir in Ghana, has been developed, emphasizing the simulation of dynamics and thermal structure. Most numerical studies of temperature dynamics in reservoirs are based on one- or two-dimensional models, which are not applicable to reservoirs characterized by complex flow patterns and unsteady heat exchange between the atmosphere and the water surface. Continuity, momentum and temperature transport equations have been solved. Proper assignment of boundary conditions, especially surface heat fluxes, has been found crucial in simulating the lake's hydrothermal dynamics. The model is based on the Reynolds-averaged Navier-Stokes equations with the Boussinesq approximation and a standard k–ε turbulence closure to solve the flow field. The thermal model includes a heat source term that accounts for short-wave radiation, as well as heat convection at the free surface, which is a function of air temperature, wind velocity and the stability conditions of the atmospheric boundary layer over the water surface. The governing equations of the model have been solved with OpenFOAM, an open-source, freely available CFD toolbox. At its core, OpenFOAM has a set of efficient C++ modules that are used to build solvers. It uses collocated, polyhedral numerics that can be applied on unstructured meshes and can be easily extended to run in parallel. A new solver has been developed to solve the hydrothermal model of the lake. The simulated temperature was compared against a 15-day field data set. Simulated and measured temperature profiles at the probe locations show reasonable agreement. The model might also be used to compute the total heat storage of water bodies in order to estimate evaporation from the water surface.
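The surface boundary condition described above combines several flux components; a generic bulk formulation is sketched below (the albedo, transfer coefficient and stability correction are assumptions illustrating the usual form, not the study's calibrated values).

\[
  Q_{\mathrm{net}} \;=\; (1-\alpha)\,Q_{\mathrm{sw}} + Q_{\mathrm{lw,net}} - Q_{\mathrm{sens}} - Q_{\mathrm{lat}},
  \qquad
  Q_{\mathrm{sens}} \;=\; \rho_a\, c_{p,a}\, C_H\, U_{10}\,\big(T_s - T_a\big),
\]

where Q_sw is the incoming short-wave radiation (partly absorbed as a depth-dependent source term within the water column), Q_lw,net the net long-wave exchange, and the sensible flux (and, analogously, the latent flux) depends on the wind speed U_10, the air-water temperature difference and a transfer coefficient C_H adjusted for the stability of the atmospheric boundary layer.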

Relevance: 30.00%

Abstract:

In price competition models, a positive consumer search cost alone does not generate an equilibrium with price dispersion. Dynamic switching-cost models, by contrast, consistently generate this phenomenon, which is well documented for retail prices. Although both literatures are vast, few models have attempted to combine the two frictions in a single framework. This work presents a dynamic model of price competition in which identical consumers face both search and switching costs. The equilibrium generates price dispersion. Moreover, since consumers are required to commit to a fixed sample of firms before prices are set, only two prices are considered before each purchase. This result is independent of the size of the consumer's individual search cost.