96 results for temperature-based models
Abstract:
The contribution non-point P sources make to the total P loading on water bodies in agricultural catchments has not been fully appreciated. Using data derived from plot-scale experimental studies, and modelling approaches developed to simulate system behaviour under differing management scenarios, a fuller understanding of the processes controlling P export and transformations along non-point transport pathways can be achieved. One modelling approach which has been successfully applied to large UK catchments (50-350 km2 in area) is applied here to a small, 1.5 km2 experimental catchment. The importance of scaling is discussed in the context of how such approaches can extrapolate the results of plot-scale experimental studies to full catchment scale. However, the scope of such models is limited, since they do not at present directly simulate the processes controlling P transport and transformation dynamics. As such, they can only simulate total P export on an annual basis, and are not capable of prediction over shorter time scales. The need for the development of process-based models to help answer these questions, and for more comprehensive UK experimental studies, is highlighted as a prerequisite for the development of suitable and sustainable management strategies to reduce non-point P loading on water bodies in agricultural catchments.
Abstract:
Global flood hazard maps can be used in the assessment of flood risk in a number of different applications, including (re)insurance and large-scale flood preparedness. Such global hazard maps can be generated using large-scale physically based models of rainfall-runoff and river routing, when used in conjunction with a number of post-processing methods. In this study, the European Centre for Medium-Range Weather Forecasts (ECMWF) land surface model is coupled to ERA-Interim reanalysis meteorological forcing data, and the resultant runoff is passed to a river routing algorithm which simulates floodplains and flood flow across the global land area. The global hazard map is based on a 30 yr (1979–2010) simulation period. A Gumbel distribution is fitted to the annual maxima flows to derive a number of flood return periods. The return periods are calculated initially for a 25×25 km grid, which is then reprojected onto a 1×1 km grid to derive maps of higher resolution and estimate the flooded fractional area for the individual 25×25 km cells. Several global and regional maps of flood return periods ranging from 2 to 500 yr are presented. The results compare reasonably well with a benchmark data set of global flood hazard. The developed methodology can be applied to other datasets on a global or regional scale.
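The return-period calculation described in this abstract (a Gumbel distribution fitted to annual maximum flows, then evaluated at chosen return periods) can be sketched as follows. The record length, the parameter values and the use of a method-of-moments fit are illustrative assumptions, not the study's actual configuration:

```python
import numpy as np

def gumbel_return_levels(annual_maxima, return_periods):
    """Fit a Gumbel distribution to annual maxima by the method of
    moments, then return the flow level for each return period T,
    i.e. the quantile with annual non-exceedance probability 1 - 1/T."""
    x = np.asarray(annual_maxima, dtype=float)
    beta = np.sqrt(6.0) * x.std(ddof=1) / np.pi   # scale parameter
    mu = x.mean() - 0.5772156649 * beta           # location (Euler-Mascheroni constant)
    T = np.asarray(return_periods, dtype=float)
    # Invert the Gumbel CDF F(x) = exp(-exp(-(x - mu)/beta)) at F = 1 - 1/T
    return mu - beta * np.log(-np.log(1.0 - 1.0 / T))

# Synthetic 30-year record of annual maximum discharge (m^3/s), purely illustrative
rng = np.random.default_rng(42)
maxima = rng.gumbel(loc=500.0, scale=120.0, size=30)
levels = gumbel_return_levels(maxima, [2, 10, 100, 500])
```

In the study itself the fit is applied per 25×25 km grid cell; here a single synthetic series stands in for one cell.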
Abstract:
The societal need for reliable climate predictions and a proper assessment of their uncertainties is pressing. Uncertainties arise not only from initial conditions and forcing scenarios, but also from model formulation. Here, we identify and document three broad classes of problems, each representing what we regard to be an outstanding challenge in the area of mathematics applied to the climate system. First, there is the problem of the development and evaluation of simple physically based models of the global climate. Second, there is the problem of the development and evaluation of the components of complex models such as general circulation models. Third, there is the problem of the development and evaluation of appropriate statistical frameworks. We discuss these problems in turn, emphasizing the recent progress made by the papers presented in this Theme Issue. Many pressing challenges in climate science require closer collaboration between climate scientists, mathematicians and statisticians. We hope the papers contained in this Theme Issue will act as inspiration for such collaborations and for setting future research directions.
Abstract:
The two-way relationship between Rossby wave breaking (RWB) and the intensification of extratropical cyclones is analysed over the Euro-Atlantic sector. In particular, the timing, intensity and location of cyclone development are related to RWB occurrences. For this purpose, two potential-temperature-based indices are used to detect and classify anticyclonic and cyclonic RWB episodes from ERA-40 Re-Analysis data. Results show that explosive cyclogenesis over the North Atlantic (NA) is fostered by enhanced occurrence of RWB on days prior to the cyclone's maximum intensification. Under such conditions, the eddy-driven jet stream is accelerated over the NA, thus enhancing conditions for cyclogenesis. For explosive cyclogenesis over the eastern NA, enhanced cyclonic RWB over eastern Greenland and anticyclonic RWB over the sub-tropical NA are observed. Typically only one of these is present in any given case, with the RWB over eastern Greenland being more frequent than its southern counterpart. This leads to an intensification of the jet over the eastern NA and an enhanced probability of windstorms reaching Western Europe. Explosive cyclones evolving under simultaneous RWB on both sides of the jet feature higher mean intensities and deepening rates than cyclones preceded by a single RWB event. Explosive developments over the western NA are typically linked to a single area of enhanced cyclonic RWB over western Greenland; here, the eddy-driven jet is accelerated over the western NA. Enhanced occurrence of cyclonic RWB over southern Greenland and anticyclonic RWB over Europe is also observed after explosive cyclogenesis, potentially leading to the onset of Scandinavian blocking. However, only very intense developments have a considerable influence on the large-scale atmospheric flow. Non-explosive cyclones show no sign of enhanced RWB over the whole NA area.
We conclude that the links between RWB and cyclogenesis over the Euro-Atlantic sector are sensitive to the cyclone’s maximum intensity, deepening rate and location.
Abstract:
Quantitative simulations of the global-scale benefits of climate change mitigation are presented, using a harmonised, self-consistent approach based on a single set of climate change scenarios. The approach draws on a synthesis of output from both physically-based and economics-based models, and incorporates uncertainty analyses. Previous studies have projected global and regional climate change and its impacts over the 21st century but have generally focused on analysis of business-as-usual scenarios, with no explicit mitigation policy included. This study finds that both the economics-based and physically-based models indicate that early, stringent mitigation would avoid a large proportion of the impacts of climate change projected for the 2080s. However, it also shows that not all of the impacts can now be avoided, so adaptation would therefore also be needed to avoid some of the potential damage. Delay in mitigation substantially reduces the percentage of impacts that can be avoided, providing strong new quantitative evidence for the need for stringent and prompt global mitigation action on greenhouse gas emissions, combined with effective adaptation, if large, widespread climate change impacts are to be avoided. Energy technology models suggest that such stringent and prompt mitigation action is technologically feasible, although the estimated costs vary depending on the specific modelling approach and assumptions.
Abstract:
It is often assumed that humans generate a 3D reconstruction of the environment, either in egocentric or world-based coordinates, but the steps involved are unknown. Here, we propose two reconstruction-based models, evaluated using data from two tasks in immersive virtual reality. We model the observer's prediction of landmark location based on standard photogrammetric methods and then combine location predictions to compute likelihood maps of navigation behaviour. In one model, each scene point is treated independently in the reconstruction; in the other, the pertinent variable is the spatial relationship between pairs of points. Participants viewed a simple environment from one location, were transported (virtually) to another part of the scene and were asked to navigate back. Error distributions varied substantially with changes in scene layout; we compared these directly with the likelihood maps to quantify the success of the models. We also measured error distributions when participants manipulated the location of a landmark to match the preceding interval, providing a direct test of the landmark-location stage of the navigation models. Models such as these, which start with scenes and end with a probabilistic prediction of behaviour, are likely to be increasingly useful for understanding 3D vision.
Abstract:
During the cold period of the Last Glacial Maximum (LGM, about 21 000 years ago) atmospheric CO2 was around 190 ppm, much lower than the pre-industrial concentration of 280 ppm. The causes of this substantial drop remain partially unresolved, despite intense research. Understanding the origin of reduced atmospheric CO2 during glacial times is crucial to comprehending the evolution of the different carbon reservoirs within the Earth system (atmosphere, terrestrial biosphere and ocean). In this context, the ocean is believed to play a major role, as it can store large amounts of carbon, especially in the abyss, a carbon reservoir that is thought to have expanded during glacial times. One possible mechanism for creating this larger reservoir is to produce very dense glacial waters, thereby stratifying the deep ocean and reducing the carbon exchange between the deep and upper ocean. The existence of such very dense waters has been inferred in the LGM deep Atlantic from sediment pore water salinity and δ18O-inferred temperatures. Based on these observations, we study the impact of a brine mechanism on the glacial carbon cycle. This mechanism relies on the formation and rapid sinking of brines, the very salty water released during sea ice formation, which brings salty dense water down to the bottom of the ocean. It provides two major features: a direct link from the surface to the deep ocean, along with an efficient way of establishing a strong stratification. We show with the CLIMBER-2 carbon-climate model that such a brine mechanism can account for a significant decrease in atmospheric CO2 and contribute to the glacial-interglacial change. This mechanism can be amplified by low vertical diffusion resulting from the brine-induced stratification.
The modeled glacial distribution of oceanic δ13C as well as the deep ocean salinity are substantially improved and better agree with reconstructions from sediment cores, suggesting that such a mechanism could have played an important role during glacial times.
Abstract:
The avoidance of regular but not irregular plurals inside compounds (e.g. *rats eater vs. mice eater) has been one of the most widely studied morphological phenomena in the psycholinguistics literature. To examine whether the constraints that are responsible for this contrast have any general significance beyond compounding, we investigated derived word forms containing regular and irregular plurals in two experiments. Experiment 1 was an offline acceptability judgment task, and Experiment 2 measured eye movements during the reading of derived words containing regular and irregular plurals and uninflected base nouns. The results from both experiments show that the constraint against regular plurals inside compounds generalizes to derived words. We argue that this constraint cannot be reduced to phonological properties, but is instead morphological in nature. The eye-movement data provide detailed information on the time-course of processing derived word forms, indicating that early stages of processing are affected by a general constraint that disallows inflected words from feeding derivational processes, and that the more specific constraint against regular plurals comes in at a later stage of processing. We argue that these results are consistent with stage-based models of language processing.
Abstract:
Atmospheric CO2 concentration has varied from minima of 170-200 ppm in glacials to maxima of 280-300 ppm in the recent interglacials. Photosynthesis by C3 plants is highly sensitive to CO2 concentration variations in this range. Physiological consequences of the CO2 changes should therefore be discernible in palaeodata. Several lines of evidence support this expectation. Reduced terrestrial carbon storage during glacials, indicated by the shift in stable isotope composition of dissolved inorganic carbon in the ocean, cannot be explained by climate or sea-level changes. It is however consistent with predictions of current process-based models that propagate known physiological CO2 effects into net primary production at the ecosystem scale. Restricted forest cover during glacial periods, indicated by pollen assemblages dominated by non-arboreal taxa, cannot be reproduced accurately by palaeoclimate models unless CO2 effects on C3-C4 plant competition are also modelled. It follows that methods to reconstruct climate from palaeodata should account for CO2 concentration changes. When they do so, they yield results more consistent with palaeoclimate models. In conclusion, the palaeorecord of the Late Quaternary, interpreted with the help of climate and ecosystem models, provides evidence that CO2 effects at the ecosystem scale are neither trivial nor transient.
Abstract:
Ships and wind turbines generate noise, which can have a negative impact on marine mammal populations by scaring animals away. Effective modelling of how this affects the populations has to take account of the location and timing of disturbances. Here we construct an individual-based model of harbour porpoises in the Inner Danish Waters. Individuals have their own energy budgets, constructed using established principles of physiological ecology. Data are lacking on the spatial distribution of food, which is instead inferred from knowledge of time-varying porpoise distributions. The model produces plausible patterns of population dynamics and closely matches the age distribution of porpoises caught as by-catch. It estimates the effect of existing wind farms as a 10% reduction in population size when food recovers fast (after two days). Proposed new wind farms and ships do not result in further population declines. The population is, however, sensitive to variations in mortality resulting from by-catch and to the speed at which food recovers after being depleted. If food recovers slowly, the effect of wind turbines becomes negligible, whereas ships are estimated to have a significant negative impact on the population. Annual by-catch rates ≥10% lead to monotonically decreasing populations and to extinction, and even the estimated by-catch rate from the adjacent area (approximately 4.1%) has a strong impact on the population. This suggests that conservation efforts should focus more on reducing by-catch in commercial gillnet fisheries than on limiting the amount of anthropogenic noise. Individual-based models are unique in their ability to take account of the location and timing of disturbances and to show their likely effects on populations. The models also identify deficiencies in the existing database and can be used to set priorities for future field research.
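The by-catch sensitivity reported in this abstract can be illustrated with a deliberately minimal, non-spatial sketch (not the paper's individual-based model): a fixed annual by-catch fraction applied on top of a hypothetical intrinsic growth rate produces geometric decline whenever growth cannot compensate.

```python
def project_population(n0, growth_rate, bycatch_rate, years):
    """Project population size year by year under an intrinsic annual
    growth rate and an additional fixed annual by-catch mortality fraction.
    All parameter values here are hypothetical illustrations."""
    sizes = [float(n0)]
    for _ in range(years):
        sizes.append(sizes[-1] * (1.0 + growth_rate) * (1.0 - bycatch_rate))
    return sizes

# With an assumed 5% intrinsic growth rate, a 10% by-catch rate cannot be
# offset, so the population declines geometrically towards extinction
declining = project_population(10000, 0.05, 0.10, 50)
# A rate near the estimated 4.1% slows or reverses the decline,
# depending on the assumed growth rate
low_bycatch = project_population(10000, 0.05, 0.041, 50)
```

The real model tracks individual energy budgets in space and time; this sketch only shows why by-catch rates above the population's compensatory capacity force a monotonic decline.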
Abstract:
Population ecology is the discipline that studies changes in the number and composition (age, sex) of the individuals that form a population. Many of the mechanisms that generate these changes are associated with individual behavior, for example how individuals defend their territories, find mates or disperse. It is therefore important to model population dynamics while considering the potential influence of behavior on the modeled dynamics. This study illustrates the diversity of behaviors that influence population dynamics, describing several methods for integrating behavior into population models; these range from simple models that consider only the number of individuals to complex individual-based models that capture great levels of detail. A series of examples shows the importance of explicitly considering behavior in population modeling to avoid reaching erroneous conclusions. This integration is particularly relevant for conservation, as incorrect predictions regarding the dynamics of populations of conservation interest can lead to inadequate assessment and management. Improved predictions can favor the effective protection of species and better use of limited financial and human conservation resources.
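The contrast between models that consider only the number of individuals and models in which behavior shapes the dynamics can be sketched with a toy example. The mate-finding (Allee-type) term and all parameter values are hypothetical illustrations, not taken from the study:

```python
def logistic_step(n, r, K):
    """Unstructured model: growth depends only on population size."""
    return n + r * n * (1.0 - n / K)

def mate_finding_step(n, r, K, theta):
    """Same model with a behavioral mate-finding term: at low density,
    individuals struggle to find mates, so realised growth is reduced.
    theta is a hypothetical half-saturation density."""
    return n + r * n * (n / (n + theta)) * (1.0 - n / K)

def project(step, n0, years, **params):
    """Iterate a one-step population map for a number of years."""
    n = float(n0)
    for _ in range(years):
        n = step(n, **params)
    return n

# From the same small founding population, the two models predict very
# different sizes after 60 years (illustrative parameters throughout)
plain = project(logistic_step, 5, 60, r=0.2, K=1000)
with_behavior = project(mate_finding_step, 5, 60, r=0.2, K=1000, theta=50)
```

Ignoring the behavioral term here would substantially overestimate recovery of the small population, which is exactly the kind of erroneous conclusion the abstract warns against.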
Abstract:
Biomass burning impacts vegetation dynamics, biogeochemical cycling, atmospheric chemistry, and climate, with sometimes deleterious socio-economic impacts. Under future climate projections it is often expected that the risk of wildfires will increase. Our ability to predict the magnitude and geographic pattern of future fire impacts rests on our ability to model fire regimes, either using well-founded empirical relationships or process-based models with good predictive skill. A large variety of models exist today and it is still unclear which type of model or degree of complexity is required to model fire adequately at regional to global scales. This is the central question underpinning the creation of the Fire Model Intercomparison Project (FireMIP), an international project to compare and evaluate existing global fire models against benchmark data sets for present-day and historical conditions. In this paper we summarise the current state of the art in fire regime modelling and model evaluation, and outline what lessons may be learned from FireMIP.
Abstract:
This introduction to the Virtual Special Issue surveys the development of spatial housing economics from its roots in neo-classical theory, through more recent developments in social interactions modelling, and touching on the role of institutions, path dependence and economic history. The survey also points to some of the more promising future directions for the subject that are beginning to appear in the literature. The survey covers hedonic models, spatial econometrics, neighbourhood models, housing market areas, housing supply, models of segregation, migration, housing tenure, sub-national house price modelling including the so-called ripple effect, and agent-based models. Possible future directions are set in the context of a selection of recent papers that have appeared in Urban Studies. Nevertheless, there are still important gaps in the literature that merit further attention, arising at least partly from emerging policy problems. These include more research on housing and biodiversity, the relationship between housing and civil unrest, the effects of changing age distributions - notably housing for the elderly - and the impact of different international institutional structures. Methodologically, developments in Big Data provide an exciting framework for future work.
Abstract:
We compare future changes in global mean temperature in response to different future scenarios which, for the first time, arise from an emission-driven rather than concentration-driven perturbed-parameter ensemble of a global climate model (GCM). These new GCM simulations sample uncertainties in atmospheric feedbacks, land carbon cycle, ocean physics and aerosol sulphur cycle processes. We find broader ranges of projected temperature responses when considering emission- rather than concentration-driven simulations (with 10th–90th percentile ranges of 1.7 K for the aggressive mitigation scenario, up to 3.9 K for the high-end, business-as-usual scenario). A small minority of simulations resulting from combinations of strong atmospheric feedbacks and carbon cycle responses show temperature increases in excess of 9 K (RCP8.5), and even under aggressive mitigation (RCP2.6) temperatures in excess of 4 K. While the simulations point to much larger temperature ranges for emission-driven experiments, they do not change existing expectations (based on previous concentration-driven experiments) about the timescales over which different sources of uncertainty are important. The new simulations sample a range of future atmospheric concentrations for each emission scenario. For both SRES A1B and the Representative Concentration Pathways (RCPs), the concentration scenario used to drive GCM ensembles lies towards the lower end of our simulated distribution. This design decision (a legacy of previous assessments) is likely to lead concentration-driven experiments to under-sample strong feedback responses in future projections. Our ensemble of emission-driven simulations spans the global temperature response of the CMIP5 emission-driven simulations, except at the low end. Combinations of low climate sensitivity and low carbon cycle feedbacks lead a number of CMIP5 responses to lie below our ensemble range.
The ensemble simulates a number of high-end responses which lie above the CMIP5 carbon cycle range. These high-end simulations can be linked to sampling stronger carbon cycle feedbacks and climate sensitivities above 4.5 K. This latter aspect highlights the priority of identifying real-world climate-sensitivity constraints which, if achieved, would lead to reductions in the upper bound of projected global mean temperature change. The ensembles of simulations presented here provide a framework to explore relationships between present-day observables and future changes, while the large spread of projected future changes highlights the ongoing need for such work.
Abstract:
The retrieval (estimation) of sea surface temperatures (SSTs) from space-based infrared observations is increasingly performed using retrieval coefficients derived from radiative transfer simulations of top-of-atmosphere brightness temperatures (BTs). Typically, an estimate of SST is formed from a weighted combination of BTs at a few wavelengths, plus an offset. This paper addresses two questions about the radiative transfer modeling approach to deriving these weighting and offset coefficients. How precisely specified do the coefficients need to be in order to obtain the required SST accuracy (e.g., scatter <0.3 K in week-average SST, bias <0.1 K)? And how precisely is it actually possible to specify them using current forward models? The conclusions are that weighting coefficients can be obtained with adequate precision, while the offset coefficient will often require an empirical adjustment of the order of a few tenths of a kelvin against validation data. Thus, a rational approach to defining retrieval coefficients is one of radiative transfer modeling followed by offset adjustment. The need for this approach is illustrated from experience in defining SST retrieval schemes for operational meteorological satellites. A strategy is described for obtaining the required offset adjustment, and the paper highlights some of the subtler aspects involved with reference to the example of SST retrievals from the imager on the geostationary satellite GOES-8.
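The retrieval described in this abstract is linear in the brightness temperatures: SST ≈ a0 + Σ a_i BT_i, with the weights derived from radiative transfer simulations and the offset adjusted empirically. A two-channel (split-window) sketch of this workflow follows; the synthetic "simulations", the channel sensitivities and the +0.3 K forward-model bias are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for radiative-transfer simulations: true SSTs and brightness
# temperatures (BTs) for two channels, with water vapour depressing the
# BTs differently per channel. Sensitivities and noise are assumptions.
sst_true = rng.uniform(271.0, 303.0, size=500)        # K
wv = rng.uniform(0.0, 1.0, size=500)                  # water-vapour proxy
bt11 = sst_true - 2.0 * wv + rng.normal(0.0, 0.1, 500)
bt12 = sst_true - 4.0 * wv + rng.normal(0.0, 0.1, 500)

# Least-squares fit of SST = a0 + a1*BT11 + a2*BT12 over the simulations
A = np.column_stack([np.ones_like(bt11), bt11, bt12])
coeffs, *_ = np.linalg.lstsq(A, sst_true, rcond=None)

# Real observed BTs differ systematically from the forward model;
# emulate this with a hypothetical +0.3 K bias, then apply the empirical
# offset adjustment against validation matchups, leaving the weights untouched
bt_bias = 0.3
A_obs = np.column_stack([np.ones_like(bt11), bt11 + bt_bias, bt12 + bt_bias])
coeffs[0] -= np.mean(A_obs @ coeffs - sst_true)
retrieved = A_obs @ coeffs
```

The adjustment shifts only the offset coefficient, mirroring the paper's conclusion that the weights can be taken from radiative transfer modelling while the offset needs tuning of a few tenths of a kelvin against validation data.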