925 results for Modelling lifetime data
Abstract:
At the end of the 20th century, we can look back on a spectacular development of numerical weather prediction, which has continued practically uninterrupted since the middle of the century. High-resolution predictions for more than a week ahead for any part of the globe are now routinely produced, and anyone with an Internet connection can access many of these forecasts for anywhere in the world. Extended predictions for several seasons ahead are also being made; the successful prediction of the latest El Niño event in 1997/1998 is one example. This great achievement is due to a number of factors, including progress in computational technology and the establishment of global observing systems, combined with a systematic research program aimed at building comprehensive prediction systems for climate and weather. In this article I will discuss the different evolutionary steps in this development and the way new scientific ideas have contributed to the efficient exploitation of computing power and to the use of observations from new types of observing systems. Weather prediction is not an exact science, owing to unavoidable errors in initial data and in the models. Quantifying the reliability of a forecast is therefore essential, and probably more so the longer the forecast range. Ensemble prediction is thus a new and important concept in weather and climate prediction, which I believe will become a routine aspect of weather prediction in the future. The boundary between weather and climate prediction is becoming increasingly diffuse, and in the final part of this article I will outline the way I think development may proceed in the future.
Abstract:
This paper introduces the Baltex research programme and summarizes the associated numerical modelling work undertaken during the last five years. The research has broadly managed to clarify the main mechanisms determining the water and energy cycle in the Baltic region, such as the strong dependence upon the large-scale atmospheric circulation. It has further been shown that the Baltic Sea has a positive water balance, albeit with large interannual variations. The focus of the modelling studies has been the use of limited-area models at ultra-high resolution, driven by boundary conditions from global models or from reanalysis data sets. The programme has further initiated a comprehensive integration of atmospheric, land-surface and hydrological modelling, incorporating snow, sea ice and special lake models. Other aspects of the programme include process studies such as the role of deep convection, air-sea interaction and the handling of land-surface moisture. Studies have also been undertaken to investigate synoptic and sub-synoptic events over the Baltic region, thus exploring the role of transient weather systems for the hydrological cycle. A special aspect has been the strong interest and commitment of the meteorological and hydrological services, because of the potentially large societal interest in operational applications of the research. As a result of this interest, special attention has been paid to data-assimilation aspects and to the use of new types of data such as SSM/I, GPS measurements and digital radar. A series of high-resolution data sets are being produced; one of these, a 1/6-degree daily precipitation climatology for the years 1996–1999, is a unique contribution. The specific research achievements presented in this volume of Meteorology and Atmospheric Physics are the result of a cooperative venture between 11 European research groups supported under the EU Framework programmes.
Abstract:
This letter presents an effective approach for selecting appropriate terrain modeling methods when forming a digital elevation model (DEM). The approach achieves a balance between modeling accuracy and modeling speed. A terrain complexity index is defined to represent a terrain's complexity. A support vector machine (SVM) classifies terrain surfaces as either complex or moderate based on this index combined with the terrain elevation range. The classification result recommends a terrain modeling method for a given data set in accordance with its required modeling accuracy. Sample terrain data from the lunar surface are used to construct an experimental data set. The results show that the terrain complexity index properly reflects terrain complexity, and that the SVM classifier derived from both the terrain complexity index and the terrain elevation range is more effective and generic than one designed from either feature alone. The statistical results show that the average classification accuracy of the SVMs is about 84.3% ± 0.9% for the two terrain types (complex or moderate). For various ratios of complex to moderate terrain in a selected data set, the DEM modeling speed increases by up to 19.5% at a given DEM accuracy.
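As a rough illustration of the classification step described in this abstract, the sketch below trains an SVM on two features per terrain sample (a terrain complexity index and the elevation range) and uses the predicted class to recommend a modelling method. The feature values, the RBF kernel and the method names are hypothetical choices for illustration, not the configuration used in the letter.

```python
# Hypothetical sketch: classify terrain samples as "complex" or "moderate"
# from a terrain complexity index and the elevation range using an SVM.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Illustrative training data: [complexity_index, elevation_range_m]
X = np.array([
    [0.12,  80.0], [0.18, 120.0], [0.25, 150.0],    # moderate terrain
    [0.65, 900.0], [0.72, 1100.0], [0.81, 1300.0],  # complex terrain
    [0.20, 200.0], [0.70, 950.0],
])
y = np.array([0, 0, 0, 1, 1, 1, 0, 1])  # 0 = moderate, 1 = complex

# Standardise both features and fit an RBF-kernel SVM (an assumed choice).
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
scores = cross_val_score(clf, X, y, cv=4)
print("cross-validated accuracy:", scores.mean())

# Recommend a modelling method for new terrain tiles based on the class.
clf.fit(X, y)
new_tiles = np.array([[0.30, 250.0], [0.78, 1200.0]])
for tile, label in zip(new_tiles, clf.predict(new_tiles)):
    method = "high-accuracy interpolator" if label == 1 else "fast interpolator"
    print(tile, "->", "complex" if label else "moderate", "| use:", method)
```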
Abstract:
There is large uncertainty about the magnitude of warming and how rainfall patterns will change in response to any given scenario of future changes in atmospheric composition and land use. The models used for future climate projections were developed and calibrated using climate observations from the past 40 years. The geologic record of environmental responses to climate changes provides a unique opportunity to test model performance outside this limited climate range. Evaluation of model simulations against palaeodata shows that models reproduce the direction and large-scale patterns of past changes in climate, but tend to underestimate the magnitude of regional changes. As part of the effort to reduce model-related uncertainty and produce more reliable estimates of twenty-first century climate, the Palaeoclimate Modelling Intercomparison Project is systematically applying palaeoevaluation techniques to simulations of the past run with the models used to make future projections. This evaluation will provide assessments of model performance, including whether a model is sufficiently sensitive to changes in atmospheric composition, as well as providing estimates of the strength of biosphere and other feedbacks that could amplify the model response to these changes and modify the characteristics of climate variability.
Abstract:
Atmospheric CO2 concentration is hypothesized to influence vegetation distribution via tree–grass competition, with higher CO2 concentrations favouring trees. The stable carbon isotope (δ13C) signature of vegetation is influenced by the relative importance of C4 plants (including most tropical grasses) and C3 plants (including nearly all trees), and the degree of stomatal closure – a response to aridity – in C3 plants. Compound-specific δ13C analyses of leaf-wax biomarkers in sediment cores of an offshore South Atlantic transect are used here as a record of vegetation changes in subequatorial Africa. These data suggest a large increase in C3 relative to C4 plant dominance after the Last Glacial Maximum. Using a process-based biogeography model that explicitly simulates 13C discrimination, it is shown that precipitation and temperature changes cannot explain the observed shift in δ13C values. The physiological effect of increasing CO2 concentration is decisive, altering the C3/C4 balance and bringing the simulated and observed δ13C values into line. It is concluded that CO2 concentration itself was a key agent of vegetation change in tropical southern Africa during the last glacial–interglacial transition. Two additional inferences follow. First, long-term variations in terrestrial δ13C values are not simply a proxy for regional rainfall, as has sometimes been assumed. Although precipitation and temperature changes have had major effects on vegetation in many regions of the world during the period between the Last Glacial Maximum and recent times, CO2 effects must also be taken into account, especially when reconstructing changes in climate between glacial and interglacial states. Second, rising CO2 concentration today is likely to be influencing tree–grass competition in a similar way, and thus contributing to the "woody thickening" observed in savannas worldwide. This second inference points to the importance of experiments to determine how vegetation composition in savannas is likely to be influenced by the continuing rise of CO2 concentration.
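As a back-of-the-envelope illustration of how a δ13C record can be read as a C3/C4 balance, the sketch below applies a standard two-end-member mixing model. The end-member values (roughly −27‰ for C3 and −13‰ for C4 tissue) are typical literature figures assumed here for illustration; they are not the values, nor the isotope-enabled biogeography model, used in the study.

```python
# Illustrative two-end-member mixing: estimate the C4 fraction of vegetation
# from a measured bulk delta13C value, given assumed C3 and C4 end-members.
DELTA_C3 = -27.0  # permil, assumed typical C3 end-member
DELTA_C4 = -13.0  # permil, assumed typical C4 end-member

def c4_fraction(delta_measured: float) -> float:
    """Linear mixing: delta = f4*DELTA_C4 + (1 - f4)*DELTA_C3."""
    f4 = (delta_measured - DELTA_C3) / (DELTA_C4 - DELTA_C3)
    return min(max(f4, 0.0), 1.0)  # clamp to [0, 1]

# Example: a shift from -20 permil (glacial) to -24 permil (post-glacial)
# corresponds to a drop in the inferred C4 share of vegetation.
for delta in (-20.0, -24.0):
    print(f"delta13C = {delta} permil -> C4 fraction ~ {c4_fraction(delta):.2f}")
```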
Abstract:
In order to evaluate the future potential benefits of emission regulation on regional air quality, while taking into account the effects of climate change, off-line air quality projection simulations are driven using weather forcing taken from regional climate models. These regional models are themselves driven by simulations carried out with global climate models (GCMs) and economic scenarios. Uncertainties and biases in climate models introduce an additional “climate modeling” source of uncertainty that adds to all other types of uncertainty in air quality modeling for policy evaluation. In this article we evaluate the changes in air-quality-related weather variables induced by replacing reanalysis-forced regional climate simulations with GCM-forced ones. As an example we use driving data from the ERA-Interim programme and GCM simulations carried out for the CMIP5 project with the Institut Pierre-Simon Laplace climate model (IPSLcm), feeding regional simulations performed in the framework of the EURO-CORDEX programme. In summer, we found compensating deficiencies acting on photochemistry in the GCM-driven weather: a positive bias in short-wave radiation, a negative bias in wind speed, too many stagnant episodes, and a negative temperature bias. In winter, air quality is mostly driven by dispersion, and we could not identify significant differences in either wind or planetary boundary layer height statistics between GCM-driven and reanalysis-driven regional simulations. However, precipitation appears largely overestimated in GCM-driven simulations, which could significantly affect the simulation of aerosol concentrations. The identification of these biases will help in interpreting the results of future air quality simulations using these data. Despite these biases, we conclude that the identified differences should not lead to major difficulties in using GCM-driven regional climate simulations for air quality projections.
Abstract:
Relating the measurable, large-scale effects of anaesthetic agents to their molecular and cellular targets of action is necessary to better understand the principles by which they affect behavior, as well as to enable the design and evaluation of more effective agents and better clinical monitoring of existing and future drugs. Volatile and intravenous general anaesthetic agents (GAs) are now known to exert their effects on a variety of protein targets, the most important of which seem to be the neuronal ion channels. It is hence unlikely that anaesthetic effect is the result of a unitary mechanism at the single-cell level. However, by altering the behavior of ion channels, GAs are believed to change the overall dynamics of distributed networks of neurons. This disruption of regular network activity can be hypothesized to cause the hypnotic and analgesic effects of GAs, and may well present more stereotypical characteristics than its underlying microscopic causes. Nevertheless, there have been surprisingly few theories that have attempted to integrate, in a quantitative manner, the empirically well-documented alterations in neuronal ion channel behavior with the corresponding macroscopic effects. Here we outline one such approach, and show that a range of well-documented effects of anaesthetics on the electroencephalogram (EEG) may be putatively accounted for. In particular, we parameterize, on the basis of detailed empirical data, the effects of halogenated volatile ethers (a clinically widely used class of general anaesthetic agent). The resulting model is able to provisionally account for a range of anaesthetically induced EEG phenomena, including EEG slowing, biphasic changes in EEG power, and the dose-dependent appearance of anomalous ictal activity, as well as providing a basis for novel approaches to monitoring brain function in both health and disease.
Assessment of the Wind Gust Estimate Method in mesoscale modelling of storm events over West Germany
Abstract:
A physically based gust parameterisation is added to the atmospheric mesoscale model FOOT3DK to estimate wind gusts associated with storms over West Germany. The gust parameterisation follows the Wind Gust Estimate (WGE) method and its functionality is verified in this study. The method assumes that gusts occurring at the surface are induced by turbulent eddies in the planetary boundary layer, deflecting air parcels from higher levels down to the surface under suitable conditions. Model simulations are performed with horizontal resolutions of 20 km and 5 km. Ten historical storm events of different characteristics and intensities are chosen in order to include a wide range of typical storms affecting Central Europe. All simulated storms occurred between 1990 and 1998. The accuracy of the method is assessed objectively by validating the simulated wind gusts against data from 16 synoptic stations by means of “quality parameters”. Concerning these parameters, the temporal and spatial evolution of the simulated gusts is well reproduced. Simulated values for low altitude stations agree particularly well with the measured gusts. For orographically exposed locations, the gust speeds are partly underestimated. The absolute maximum gusts lie in most cases within the bounding interval given by the WGE method. Focussing on individual storms, the performance of the method is better for intense and large storms than for weaker ones. Particularly for weaker storms, the gusts are typically overestimated. The results for the sample of ten storms document that the method is generally applicable with the mesoscale model FOOT3DK for mid-latitude winter storms, even in areas with complex orography.
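The WGE method is described only qualitatively above; the following sketch is a schematic, Brasseur-style rendering of the underlying idea, in which a model level can supply a surface gust when the mean turbulent kinetic energy of the layer beneath it exceeds the buoyant energy opposing the downward deflection of a parcel. The variable names, the toy profile and the simple trapezoidal integration are assumptions for illustration, not the FOOT3DK implementation.

```python
# Schematic, Brasseur-style Wind Gust Estimate (WGE) check on one model column:
# a level z_p may contribute its wind speed as a surface gust when the mean
# turbulent kinetic energy (TKE) below z_p exceeds the buoyant energy that
# resists bringing a parcel from z_p down to the surface.
import numpy as np

G = 9.81  # gravitational acceleration (m s-2)

def _trapz(y, x):
    """Trapezoidal integration (kept local to avoid NumPy version differences)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def wge_gust(z, wind_speed, tke, theta_v):
    """z, wind_speed, tke, theta_v: 1-D arrays ordered from the surface upwards."""
    gust = wind_speed[0]  # the lowest-level wind is always available as a gust
    for p in range(1, len(z)):
        mean_tke = _trapz(tke[: p + 1], z[: p + 1]) / z[p]
        # Work against the positive buoyancy of a parcel from level p
        # descending through the (stably stratified) layer below it.
        buoyancy = G * (theta_v[p] - theta_v[: p + 1]) / theta_v[: p + 1]
        buoyant_energy = _trapz(buoyancy, z[: p + 1])
        if mean_tke >= buoyant_energy:
            gust = max(gust, wind_speed[p])
    return gust

# Toy column: strong winds and TKE aloft, weak stable stratification near the top.
z = np.array([10.0, 100.0, 300.0, 600.0, 1000.0])        # heights (m)
ws = np.array([12.0, 18.0, 24.0, 28.0, 30.0])             # wind speed (m s-1)
tke = np.array([2.0, 4.0, 5.0, 4.0, 2.0])                 # TKE (m2 s-2)
theta_v = np.array([285.0, 285.2, 285.5, 286.0, 287.0])   # virtual pot. temp. (K)
print("estimated surface gust:", wge_gust(z, ws, tke, theta_v), "m s-1")
```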
Abstract:
Anaerobic digestion (AD) technologies convert organic wastes and crops into methane-rich biogas for heating, electricity generation and vehicle fuel. Farm-based AD has proliferated in some EU countries, driven by favourable policies promoting sustainable energy generation and GHG mitigation. Despite increased state support, there are still few AD plants on UK farms, leading to a lack of normative data on the viability of AD in the whole-farm context. Farmers and lenders are therefore reluctant to fund AD projects, and policy makers are hampered in their attempts to design policies that adequately support the industry. Existing AD studies and modelling tools do not adequately capture the farm context within which AD operates. This paper demonstrates a whole-farm optimisation modelling approach to assess the viability of AD in a more holistic way, accounting for issues such as AD scale, synergies and conflicts with other farm enterprises, choice of feedstocks, digestate use and impact on farm Net Margin. This modelling approach demonstrates, for example, that AD is complementary to dairy enterprises but competes with arable enterprises for farm resources, and that reduced nutrient purchases significantly improve Net Margin on arable farms while AD scale is constrained by the capacity of farmland to absorb the nutrients in AD digestate.
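A toy linear-programming sketch of the kind of whole-farm trade-off described above: land is allocated between an arable crop and an AD feedstock crop to maximise net margin, subject to a land constraint and a limit on the digestate nutrients the farm can absorb. All coefficients are hypothetical and only illustrate the optimisation structure, not the paper's model.

```python
# Toy whole-farm optimisation: allocate land (ha) between an arable crop and
# an AD feedstock crop to maximise net margin, subject to hypothetical
# land and digestate-nutrient constraints.  All numbers are illustrative.
from scipy.optimize import linprog

# Decision variables: x0 = arable area (ha), x1 = AD feedstock area (ha).
margin = [650.0, 820.0]          # hypothetical net margin per ha (GBP)
c = [-m for m in margin]         # linprog minimises, so negate

A_ub = [
    [1.0, 1.0],   # total land:  x0 + x1 <= 200 ha
    [0.0, 4.5],   # digestate N: 4.5 t per ha of feedstock, farm absorbs <= 500 t
]
b_ub = [200.0, 500.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)],
              method="highs")
arable, feedstock = res.x
print(f"arable: {arable:.1f} ha, AD feedstock: {feedstock:.1f} ha, "
      f"net margin: {-res.fun:,.0f} GBP")
```

In this toy setup the AD enterprise is more profitable per hectare, so its area expands until the nutrient-absorption constraint binds, mirroring the abstract's point that AD scale is limited by the farmland's capacity to absorb digestate nutrients.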
Abstract:
A dynamic size-structured model is developed for phytoplankton and nutrients in the oceanic mixed layer and applied to extract phytoplankton biomass at discrete size fractions from remotely sensed ocean-colour data. General relationships between cell size and biophysical processes of phytoplankton (such as sinking, grazing, and primary production) were included in the model through a bottom–up approach. Time-dependent mixed-layer depth was used as a forcing variable, and a sequential data-assimilation scheme was implemented to derive model trajectories. From a given time series, the method produces estimates of size-structured biomass at every observation, and hence of the seasonal succession of individual phytoplankton size classes, derived here from remote sensing for the first time. From these estimates, normalized phytoplankton biomass size spectra over a period of 9 years were calculated for one location in the North Atlantic. Further analysis demonstrated that strong relationships exist between the seasonal trends of the estimated size spectra and the mixed-layer depth, nutrients, and total chlorophyll. The results contain useful information on the time-dependent biomass flux in the pelagic ecosystem.
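A small sketch of how a normalized biomass size spectrum of the kind analysed above is commonly computed from size-fractionated biomass: the biomass in each size class is divided by the class width and the result is summarised on log-log axes. The size-class edges and biomass values are made-up numbers for illustration, not output of the assimilation scheme.

```python
# Illustrative normalized biomass size spectrum (NBSS) from size-fractionated
# phytoplankton biomass.  Size-class edges and biomass values are made up.
import numpy as np

# Equivalent spherical diameter class edges (micrometres) and biomass in each
# class (mg C m-3) for a single hypothetical observation.
edges = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])
biomass = np.array([8.0, 6.5, 4.0, 2.0, 0.8])

widths = np.diff(edges)                    # size-class widths
centres = np.sqrt(edges[:-1] * edges[1:])  # geometric mid-points
nbss = biomass / widths                    # normalised spectrum

# Slope of the spectrum on log-log axes, often used as a summary statistic.
slope, intercept = np.polyfit(np.log10(centres), np.log10(nbss), 1)
print("normalized spectrum:", np.round(nbss, 3))
print(f"log-log slope: {slope:.2f}")
```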
Abstract:
Understanding how species and ecosystems respond to climate change has become a major focus of ecology and conservation biology. Modelling approaches provide important tools for making future projections, but current models of the climate-biosphere interface remain overly simplistic, undermining the credibility of projections. We identify five ways in which substantial advances could be made in the next few years: (i) improving the accessibility and efficiency of biodiversity monitoring data, (ii) quantifying the main determinants of the sensitivity of species to climate change, (iii) incorporating community dynamics into projections of biodiversity responses, (iv) accounting for the influence of evolutionary processes on the response of species to climate change, and (v) improving the biophysical rule sets that define functional groupings of species in global models.
Abstract:
It is often assumed that humans generate a 3D reconstruction of the environment, either in egocentric or world-based coordinates, but the steps involved are unknown. Here, we propose two reconstruction-based models, evaluated using data from two tasks in immersive virtual reality. We model the observer’s prediction of landmark location based on standard photogrammetric methods and then combine location predictions to compute likelihood maps of navigation behaviour. In one model, each scene point is treated independently in the reconstruction; in the other, the pertinent variable is the spatial relationship between pairs of points. Participants viewed a simple environment from one location, were transported (virtually) to another part of the scene and were asked to navigate back. Error distributions varied substantially with changes in scene layout; we compared these directly with the likelihood maps to quantify the success of the models. We also measured error distributions when participants manipulated the location of a landmark to match the preceding interval, providing a direct test of the landmark-location stage of the navigation models. Models such as these, which start with scenes and end with a probabilistic prediction of behaviour, are likely to be increasingly useful for understanding 3D vision.
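A highly simplified sketch of the step in which landmark-location predictions are combined into a likelihood map of navigation behaviour: Monte Carlo samples of noisy landmark reconstructions are mapped to predicted response locations and binned onto a grid. The Gaussian noise model, the rule for turning landmark estimates into a response (the goal remembered as an offset from each landmark) and all coordinates are assumptions for illustration only, not the photogrammetric models of the paper.

```python
# Schematic Monte Carlo likelihood map for a navigation response, assuming the
# goal is remembered as an offset from each of several landmarks and that each
# landmark's reconstructed location carries independent Gaussian noise.
import numpy as np

rng = np.random.default_rng(0)

landmarks = np.array([[0.0, 4.0], [3.0, 5.0], [-2.0, 6.0]])  # true positions (m)
goal = np.array([1.0, 1.0])                                  # true goal (m)
offsets = goal - landmarks        # goal stored relative to each landmark
sigma = 0.4                       # landmark-location noise (m), assumed

# Sample noisy landmark reconstructions and the implied response locations.
n = 20000
noisy = landmarks + rng.normal(0.0, sigma, size=(n, *landmarks.shape))
responses = (noisy + offsets).mean(axis=1)   # average over landmarks

# Bin the predicted responses onto a grid -> empirical likelihood map.
hist, xedges, yedges = np.histogram2d(
    responses[:, 0], responses[:, 1], bins=40, range=[[-1, 3], [-1, 3]]
)
likelihood = hist / hist.sum()
peak = np.unravel_index(likelihood.argmax(), likelihood.shape)
print("peak bin centre:",
      0.5 * (xedges[peak[0]] + xedges[peak[0] + 1]),
      0.5 * (yedges[peak[1]] + yedges[peak[1] + 1]))
```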
Abstract:
Communication signal processing applications often involve complex-valued (CV) functional representations for signals and systems. CV artificial neural networks have been studied theoretically and applied widely in nonlinear signal and data processing [1–11]. Note that most artificial neural networks cannot be automatically extended from the real-valued (RV) domain to the CV domain, because the resulting model would in general violate the Cauchy-Riemann conditions, which renders the training algorithms unusable. A number of analytic functions were introduced for fully CV multilayer perceptrons (MLPs) [4]. A fully CV radial basis function (RBF) network was introduced in [8] for regression and classification applications. Alternatively, the problem can be avoided by using two RV artificial neural networks, one processing the real part and the other processing the imaginary part of the CV signal/system. An even more challenging problem is the inverse of a CV
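A small sketch of the split-complex workaround mentioned above: a complex-valued mapping is approximated by two independent real-valued regressors, one fitted to the real part and one to the imaginary part of the output. The toy nonlinearity and the use of a generic MLP regressor are illustrative assumptions, not the networks of [4] or [8].

```python
# Split-complex approach: model a complex-valued nonlinearity with two
# real-valued networks, one for the real part and one for the imaginary part
# of the output.  The target mapping and network sizes are illustrative.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# Toy CV system: y = x + 0.1 * x**3 (a mild memoryless nonlinearity).
x = rng.normal(size=2000) + 1j * rng.normal(size=2000)
y = x + 0.1 * x**3

# Features are the real and imaginary parts of the input.
X = np.column_stack([x.real, x.imag])

net_re = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
net_im = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
net_re.fit(X, y.real)   # one RV network for the real part of the output
net_im.fit(X, y.imag)   # one RV network for the imaginary part of the output

# Reassemble a complex-valued prediction and check the error.
y_hat = net_re.predict(X) + 1j * net_im.predict(X)
print("mean |error|:", np.abs(y - y_hat).mean())
```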
Abstract:
To bridge the gaps between traditional mesoscale modelling and microscale modelling, the National Center for Atmospheric Research, in collaboration with other agencies and research groups, has developed an integrated urban modelling system coupled to the weather research and forecasting (WRF) model as a community tool to address urban environmental issues. The core of this WRF/urban modelling system consists of the following: (1) three methods with different degrees of freedom to parameterize urban surface processes, ranging from a simple bulk parameterization to a sophisticated multi-layer urban canopy model with an indoor–outdoor exchange sub-model that directly interacts with the atmospheric boundary layer, (2) coupling to fine-scale computational fluid dynamic Reynolds-averaged Navier–Stokes and Large-Eddy simulation models for transport and dispersion (T&D) applications, (3) procedures to incorporate high-resolution urban land use, building morphology, and anthropogenic heating data using the National Urban Database and Access Portal Tool (NUDAPT), and (4) an urbanized high-resolution land data assimilation system. This paper provides an overview of this modelling system; addresses the daunting challenges of initializing the coupled WRF/urban model and of specifying the potentially vast number of parameters required to execute the WRF/urban model; explores the model sensitivity to these urban parameters; and evaluates the ability of WRF/urban to capture urban heat islands, complex boundary-layer structures aloft, and urban plume T&D for several major metropolitan regions. Recent applications of this modelling system illustrate its promising utility, as a regional climate-modelling tool, to investigate impacts of future urbanization on regional meteorological conditions and on air quality under future climate change scenarios.
Abstract:
Urbanization-related alterations to the surface energy balance impact urban warming (‘heat islands’), the growth of the boundary layer, and many other biophysical processes. Traditionally, in situ heat flux measurements have been used to quantify such processes, but these typically represent only a small local-scale area within the heterogeneous urban environment. For this reason, remote sensing approaches are very attractive for elucidating more spatially representative information. Here we use hyperspectral imagery from a new airborne sensor, the Operative Modular Imaging Spectrometer (OMIS), along with a survey map and meteorological data, to derive the land cover information and surface parameters required to map spatial variations in turbulent sensible heat flux (QH). The results from two spatially explicit flux retrieval methods, which use contrasting approaches and, to a large degree, different input data, are compared for a central urban area of Shanghai, China: (1) the Local-scale Urban Meteorological Parameterization Scheme (LUMPS) and (2) an Aerodynamic Resistance Method (ARM). Sensible heat fluxes are determined at the full 6 m spatial resolution of the OMIS sensor, and at lower resolutions via pixel aggregation and spatial averaging. At the 6 m spatial resolution, the sensible heat flux of rooftop-dominated pixels exceeds that of roads, water and vegetated areas, with values peaking at ∼350 W m−2, whilst the storage heat flux is greatest for road-dominated pixels (peaking at around 420 W m−2). We investigate the use of both OMIS-derived land surface temperatures, obtained using a Temperature–Emissivity Separation (TES) approach, and land surface temperatures estimated from air temperature measurements. Sensible heat flux differences between the two approaches over the entire 2 × 2 km study area are less than 30 W m−2, suggesting that methods employing either strategy may be practical when operated using low spatial resolution (e.g. 1 km) data. Due to the differing methodologies, direct comparisons between results obtained with the LUMPS and ARM methods are most sensibly made at reduced spatial scales. At 30 m spatial resolution, both approaches produce similar results, with the smallest difference being less than 15 W m−2 in mean QH averaged over the entire study area. This is encouraging given the differing architecture and data requirements of the LUMPS and ARM methods. Furthermore, in terms of mean study-area QH, the results obtained by averaging the original 6 m resolution LUMPS-derived QH values to 30 and 90 m spatial resolution are within ∼5 W m−2 of those derived by averaging the original surface parameter maps prior to input into LUMPS, suggesting that the use of much lower spatial resolution spaceborne imagery, for example from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), is likely to be a practical solution for heat flux determination in urban areas.
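The Aerodynamic Resistance Method referred to above rests on a bulk transfer relation of the form QH = ρ cp (Ts − Ta) / ra. The sketch below evaluates that relation per pixel for hypothetical surface temperatures and resistances; it is not the paper's ARM configuration, whose inputs are derived from the OMIS imagery and meteorological data.

```python
# Per-pixel sensible heat flux via a bulk aerodynamic-resistance relation:
#   Q_H = rho * c_p * (T_s - T_a) / r_a
# Surface temperatures and resistances below are hypothetical values.
import numpy as np

RHO = 1.2      # air density (kg m-3)
CP = 1005.0    # specific heat of air (J kg-1 K-1)

def sensible_heat_flux(t_surface_k, t_air_k, r_aero_s_m):
    """Return Q_H (W m-2) for arrays of surface/air temperature and resistance."""
    return RHO * CP * (t_surface_k - t_air_k) / r_aero_s_m

# A tiny 2x3 "image": roof, road, and vegetation pixels (values are made up).
t_surface = np.array([[318.0, 315.0, 303.0],
                      [320.0, 314.0, 301.0]])   # surface temperature (K)
t_air = 300.0                                   # single air temperature (K)
r_aero = np.array([[60.0, 75.0, 110.0],
                   [55.0, 80.0, 120.0]])        # aerodynamic resistance (s m-1)

qh = sensible_heat_flux(t_surface, t_air, r_aero)
print(np.round(qh, 1))   # roof-like pixels give the largest Q_H, as in the study
print("mean Q_H:", round(qh.mean(), 1), "W m-2")
```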