928 results for Regional analysis
Abstract:
Objective: To investigate the sociodemographic determinants of diet quality of the elderly in four EU countries. Design: Cross-sectional study. For each country, a regression was performed of a multidimensional index of dietary quality v. sociodemographic variables. Setting: In Finland, Finnish Household Budget Survey (1998 and 2006); in Sweden, SNAC-K (2001–2004); in the UK, Expenditure & Food Survey (2006–07); in Italy, Multi-purpose Survey of Daily Life (2009). Subjects: One- and two-person households of over-50s (Finland, n 2994; UK, n 4749); over-50s living alone or in two-person households (Italy, n 7564); over-60s (Sweden, n 2023). Results: Diet quality among the EU elderly is both low on average and heterogeneous across individuals. The regression models explained a small but significant part of the observed heterogeneity in diet quality. Resource availability was associated with diet quality either negatively (Finland and UK) or in a non-linear or non-statistically significant manner (Italy and Sweden), as was the preference-for-food parameter. Education, not living alone and female gender were characteristics positively associated with diet quality consistently across the four countries, unlike socio-professional status, age and seasonality. Regional differences within countries persisted even after controlling for the other sociodemographic variables. Conclusions: Poor dietary choices among the EU elderly were not caused by insufficient resources, and informational measures could be successful in promoting healthy eating for healthy ageing. On the other hand, food habits appeared largely set by the latter part of life, with age and retirement having little influence on the healthiness of dietary choices.
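The per-country regression described above can be sketched as an ordinary least-squares fit of a diet-quality index on sociodemographic covariates. The covariates, coefficients and data below are invented for illustration and are not taken from the cited surveys.

```python
# Hypothetical sketch of the per-country regression: a multidimensional
# diet-quality index regressed on sociodemographic covariates via OLS.
# All variables and parameter values here are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    np.ones(n),                 # intercept
    rng.normal(12, 3, n),       # education (years) -- illustrative covariate
    rng.integers(0, 2, n),      # lives_alone (0/1)
    rng.integers(0, 2, n),      # female (0/1)
])
beta_true = np.array([50.0, 0.8, -2.0, 3.0])
y = X @ beta_true + rng.normal(0, 5, n)      # synthetic diet-quality index

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS fit
resid = y - X @ beta_hat
r2 = 1 - resid.var() / y.var()   # small-but-significant fit, as in the study
```

As in the abstract, the covariates explain a modest share of the variance (low R²) while individual coefficients remain well estimated.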
Abstract:
The character of settlement patterns within the late Mesolithic communities of north-west Europe is a topic of substantial debate. An important case study concerns the five shell middens on the island of Oronsay, Inner Hebrides, western Scotland. Two conflicting interpretations have been proposed: the evidence from seasonality indicators and stable isotope analysis of human bones has been used to support a model of year-round settlement on this small island; alternatively, the middens have been interpreted as resulting from short-term intermittent visits to Oronsay within a regionally mobile settlement pattern. We contribute to this debate by describing Storakaig, a newly discovered site on the nearby island of Islay, undertaking a Bayesian chronological analysis and providing evidence for technological continuity between Oronsay and sites elsewhere in the region. While this new evidence remains open to alternative interpretation, we suggest that it makes regional mobility rather than year-round settlement on Oronsay a more viable interpretation for the Oronsay middens. Our analysis also confirms the likely overlap of the late Mesolithic with the earliest Neolithic within western Scotland.
Abstract:
The Water and Global Change (WATCH) project evaluation of the terrestrial water cycle involves using land surface models and general hydrological models to assess hydrologically important variables including evaporation, soil moisture, and runoff. Such models require meteorological forcing data, and this paper describes the creation of the WATCH Forcing Data for 1958–2001 based on the 40-yr ECMWF Re-Analysis (ERA-40) and for 1901–57 based on reordered reanalysis data. It also discusses and analyses model-independent estimates of reference crop evaporation. Global average annual cumulative reference crop evaporation was selected as a widely adopted measure of potential evapotranspiration. It exhibits no significant trend from 1979 to 2001 although there are significant long-term increases in global average vapor pressure deficit and concurrent significant decreases in global average net radiation and wind speed. The near-constant global average of annual reference crop evaporation in the late twentieth century masks significant decreases in some regions (e.g., the Murray–Darling basin) with significant increases in others.
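Reference crop evaporation, the measure analysed above, is commonly computed with the FAO-56 Penman-Monteith equation. The sketch below uses that standard formulation with invented single-day inputs; it is not necessarily the exact variant used in the WATCH analysis.

```python
# FAO-56 Penman-Monteith reference crop evapotranspiration (mm/day).
# Inputs are invented one-day values, chosen only for illustration.
import math

def et0_fao56(T, Rn, G, u2, ea):
    """T: air temperature (deg C); Rn, G: net radiation and soil heat flux
    (MJ m-2 day-1); u2: wind speed at 2 m (m/s); ea: actual vapour
    pressure (kPa)."""
    es = 0.6108 * math.exp(17.27 * T / (T + 237.3))   # saturation vapour pressure
    delta = 4098 * es / (T + 237.3) ** 2              # slope of the vp curve
    gamma = 0.0665                                    # psychrometric constant (kPa/degC)
    num = 0.408 * delta * (Rn - G) + gamma * 900 / (T + 273) * u2 * (es - ea)
    return num / (delta + gamma * (1 + 0.34 * u2))

et0 = et0_fao56(T=20.0, Rn=15.0, G=0.0, u2=2.0, ea=1.40)  # ~5 mm/day
```

Lower wind speed or net radiation reduces ET0, which is how the reported declines in those drivers can offset a rising vapour pressure deficit.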
Abstract:
Reliable evidence of trends in the illegal ivory trade is important for informing decision making for elephants but it is difficult to obtain due to the covert nature of the trade. The Elephant Trade Information System, a global database of reported seizures of illegal ivory, holds the only extensive information on illicit trade available. However, inherent biases in seizure data make it difficult to infer trends; countries differ in their ability to make and report seizures and these differences cannot be directly measured. We developed a new modelling framework to provide quantitative evidence on trends in the illegal ivory trade from seizures data. The framework used Bayesian hierarchical latent variable models to reduce bias in seizures data by identifying proxy variables that describe the variability in seizure and reporting rates between countries and over time. Models produced bias-adjusted smoothed estimates of relative trends in illegal ivory activity for raw and worked ivory in three weight classes. Activity is represented by two indicators describing the number of illegal ivory transactions (the Transactions Index) and the total weight of illegal ivory transactions (the Weights Index) at global, regional or national levels. Globally, activity was found to be rapidly increasing and at its highest level for 16 years, more than doubling from 2007 to 2011 and tripling from 1998 to 2011. Over 70% of the Transactions Index is from shipments of worked ivory weighing less than 10 kg and the rapid increase since 2007 is mainly due to increased consumption in China. Over 70% of the Weights Index is from shipments of raw ivory weighing at least 100 kg mainly moving from Central and East Africa to Southeast and East Asia. The results tie together recent findings on trends in poaching rates, declining populations and consumption and provide detailed evidence to inform international decision making on elephants.
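The core bias-adjustment idea can be illustrated with a toy calculation: observed seizure counts confound true activity with country-level seizure and reporting rates, so dividing by proxy-based estimates of those rates recovers a relative activity index. This is a deliberate simplification of the paper's Bayesian hierarchical model, with invented numbers.

```python
# Simplified illustration of the bias-adjustment idea (not the ETIS model):
# all values below are invented.
import numpy as np

true_activity = np.array([100, 120, 150, 200, 260])        # latent transactions/yr
seizure_rate  = np.array([0.10, 0.12, 0.09, 0.11, 0.10])   # proxy-estimated
report_rate   = np.array([0.80, 0.90, 0.70, 0.85, 0.90])   # proxy-estimated

observed = true_activity * seizure_rate * report_rate   # what the database records
adjusted = observed / (seizure_rate * report_rate)      # bias-adjusted index
relative = adjusted / adjusted[0]                       # trend relative to year 1
```

In the real framework the rates are latent and inferred jointly with the trend, with smoothing across years; here they are given, so the adjustment is exact.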
Abstract:
Using the GlobAEROSOL-AATSR dataset, estimates of the instantaneous, clear-sky, direct aerosol radiative effect and radiative forcing have been produced for the year 2006. Aerosol Robotic Network sun-photometer measurements have been used to characterise the random and systematic error in the GlobAEROSOL product for 22 regions covering the globe. Representative aerosol properties for each region were derived from the results of a wide range of literature sources and, along with the de-biased GlobAEROSOL AODs, were used to drive an offline version of the Met Office unified model radiation scheme. In addition to the mean AOD, best-estimate run of the radiation scheme, a range of additional calculations were done to propagate uncertainty estimates in the AOD, optical properties, surface albedo and errors due to the temporal and spatial averaging of the AOD fields. This analysis produced monthly, regional estimates of the clear-sky aerosol radiative effect and its uncertainty, which were combined to produce annual, global mean values of (−6.7±3.9) W m−2 at the top of atmosphere (TOA) and (−12±6) W m−2 at the surface. These results were then used to give estimates of regional, clear-sky aerosol direct radiative forcing, using modelled pre-industrial AOD fields for the year 1750 calculated for the AEROCOM PRE experiment. However, as it was not possible to quantify the uncertainty in the pre-industrial aerosol loading, these figures can only be taken as indicative and their uncertainties as lower bounds on the likely errors. Although the uncertainty on the aerosol radiative effect presented here is considerably larger than most previous estimates, the explicit inclusion of the major sources of error in the calculations suggests that these estimates better represent the true constraint achievable from similar methodologies, and points to the need for additional, improved estimates of both global aerosol loading and aerosol optical properties.
Abstract:
A realistic representation of North Atlantic tropical cyclone tracks is crucial as it allows one, for example, to explain potential changes in US landfalling systems. Here we present a tentative study examining the ability of recent climate models to represent North Atlantic tropical cyclone tracks. Tracks from two types of climate models are evaluated: explicit tracks are obtained from tropical cyclones simulated in regional or global climate models with moderate to high horizontal resolution (1° to 0.25°), and downscaled tracks are obtained using a downscaling technique with large-scale environmental fields from a subset of these models. For both configurations, tracks are objectively separated into four groups using a cluster technique, leading to a zonal and a meridional separation of the tracks. The meridional separation largely captures the separation between deep tropical and sub-tropical, hybrid or baroclinic cyclones, while the zonal separation segregates Gulf of Mexico and Cape Verde storms. The properties of the tracks’ seasonality, intensity and power dissipation index in each cluster are documented for both configurations. Our results show that, except for the seasonality, the downscaled tracks better capture the observed characteristics of the clusters. We also use three different idealized scenarios to examine the possible future changes of tropical cyclone tracks under 1) warming sea surface temperature, 2) increasing carbon dioxide, and 3) a combination of the two. The response to each scenario is highly variable depending on the simulation considered. Finally, we examine the role of each cluster in these future changes and find no preponderant contribution of any single cluster over the others.
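Track clustering is often done on low-order "mass moments" of each track (mean position and spread) followed by k-means. The sketch below uses that common simplification on synthetic tracks; it is not necessarily the specific cluster technique used in the paper, and the two synthetic groups are invented stand-ins for Gulf of Mexico and Cape Verde storms.

```python
# Illustrative clustering of synthetic cyclone tracks by mass moments,
# followed by a minimal 2-means. Not the paper's exact method.
import numpy as np

rng = np.random.default_rng(1)

def track_features(track):
    # First two mass moments: centroid (lon, lat) and along-track spread
    return np.r_[track.mean(axis=0), track.std(axis=0)]

# Two synthetic genesis regions (lon, lat), 10 tracks of 20 points each
gulf = [np.c_[rng.normal(-90, 2, 20), rng.normal(25, 2, 20)] for _ in range(10)]
cape = [np.c_[rng.normal(-30, 2, 20), rng.normal(15, 2, 20)] for _ in range(10)]
X = np.array([track_features(t) for t in gulf + cape])

# Minimal k-means (k=2) on the feature vectors
centroids = X[[0, -1]].copy()
for _ in range(10):
    labels = np.argmin(((X[:, None] - centroids[None]) ** 2).sum(-1), axis=1)
    centroids = np.array([X[labels == k].mean(axis=0) for k in range(2)])
```

With such well-separated genesis longitudes the zonal split falls out immediately; real tracks need more moments (orientation, length) and model-based clustering to separate meridional regimes.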
Abstract:
When considering adaptation measures and global climate mitigation goals, stakeholders need regional-scale climate projections, including the range of plausible warming rates. To assist these stakeholders, it is important to understand whether some locations may see disproportionately high or low warming from additional forcing above targets such as 2 K (ref. 1). There is a need to narrow uncertainty (ref. 2) in this nonlinear warming, which requires understanding how climate changes as forcings increase from medium to high levels. However, quantifying and understanding regional nonlinear processes is challenging. Here we show that regional-scale warming can be strongly superlinear across successive CO2 doublings, using five different climate models. Ensemble-mean warming is superlinear over most land locations. Further, the inter-model spread tends to be amplified at higher forcing levels as nonlinearities grow, especially when considering changes per kelvin of global warming. Regional nonlinearities in surface warming arise from nonlinearities in the global-mean radiative balance, the Atlantic meridional overturning circulation, surface snow/ice cover and evapotranspiration. For robust adaptation and mitigation advice, therefore, potentially avoidable climate change (the difference between business-as-usual and mitigation scenarios) and unavoidable climate change (change under strong mitigation scenarios) may need different analysis methods.
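The superlinearity test above reduces to simple arithmetic: if regional warming were linear in CO2 doublings, the response to two doublings would be exactly twice the response to one. The temperatures below are invented illustrative values, not results from the five models.

```python
# Minimal (super)linearity check across successive CO2 doublings.
# Invented regional warming values for illustration only.
dT_2x = 3.0   # warming after one doubling (K)
dT_4x = 6.9   # warming after two doublings (K)

# Zero if the response is linear in doublings; > 0 means superlinear.
nonlinearity = dT_4x - 2 * dT_2x
```

The paper's point is that this residual is positive over most land areas and differs between models, so it cannot be ignored when extrapolating from medium to high forcing.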
Abstract:
Land cover plays a key role in global to regional monitoring and modeling because it affects, and is affected by, climate change, and it has thus become one of the essential variables for climate change studies. National and international organizations require timely and accurate land cover information for reporting and management actions. The North American Land Change Monitoring System (NALCMS) is an international cooperation of organizations and entities of Canada, the United States, and Mexico to map land cover change of North America's changing environment. This paper presents the methodology used to derive the land cover map of Mexico for the year 2005, which was integrated into the NALCMS continental map. The classification was based on a time series of 250 m Moderate Resolution Imaging Spectroradiometer (MODIS) data and an extensive sample database; the complexity of the Mexican landscape required a specific approach to reflect land cover heterogeneity. To estimate the proportion of each land cover class for every pixel, several decision tree classifications were combined to obtain class membership maps, which were finally converted to a discrete map accompanied by a confidence estimate. The map yielded an overall accuracy of 82.5% (Kappa of 0.79) for pixels with at least 50% map confidence (71.3% of the data). An additional assessment with 780 randomly stratified samples and primary and alternative calls in the reference data to account for ambiguity indicated 83.4% overall accuracy (Kappa of 0.80). A high agreement of 83.6% for all pixels and 92.6% for pixels with a map confidence of more than 50% was found for the comparison between the land cover maps of 2005 and 2006. Further wall-to-wall comparisons to related land cover maps resulted in 56.6% agreement with the MODIS land cover product and a congruence of 49.5% with Globcover.
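The two accuracy measures quoted above, overall accuracy and Cohen's kappa, are both derived from a confusion matrix of reference versus mapped classes. The 3-class matrix below is invented; the formulas are the standard ones.

```python
# Overall accuracy and Cohen's kappa from a confusion matrix.
# The counts are invented; rows are reference classes, columns map classes.
import numpy as np

cm = np.array([[50,  5,  2],
               [ 4, 60,  6],
               [ 1,  3, 40]], dtype=float)

n = cm.sum()
po = np.trace(cm) / n                           # observed agreement (overall accuracy)
pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2   # chance agreement
kappa = (po - pe) / (1 - pe)                    # agreement beyond chance
```

Kappa discounts the agreement expected by chance from the class marginals, which is why it is always below the raw overall accuracy (0.79 vs 82.5% in the paper).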
Abstract:
About 90% of the anthropogenic increase in heat stored in the climate system is found in the oceans. It is therefore important to understand the details of ocean heat uptake. Here we present a detailed, process-based analysis of ocean heat uptake (OHU) processes in HiGEM1.2, an atmosphere-ocean general circulation model (AOGCM) with an eddy-permitting ocean component of 1/3 degree resolution. Similarly to various other models, HiGEM1.2 shows that the global heat budget is dominated by a downward advection of heat compensated by upward isopycnal diffusion. Only in the upper tropical ocean do we find the classical balance between downward diapycnal diffusion and upward advection of heat. The upward isopycnal diffusion of heat is located mostly in the Southern Ocean, which thus dominates the global heat budget. We compare the responses to a 4xCO2 forcing and an enhancement of the wind stress forcing in the Southern Ocean. This highlights the importance of regional processes for the global ocean heat uptake. These are mainly surface fluxes and convection in the high latitudes, and advection in the Southern Ocean mid-latitudes. Changes in diffusion are less important. In line with the CMIP5 models, HiGEM1.2 shows a band of strong OHU in the mid-latitude Southern Ocean in the 4xCO2 run, which is mostly advective. By contrast, in the high-latitude Southern Ocean regions it is the suppression of convection that leads to OHU. In the enhanced wind stress run, convection is strengthened at high southern latitudes, leading to heat loss, while the magnitude of the OHU in the southern mid-latitudes is very similar to the 4xCO2 results. Remarkably, there is only a very small global OHU in the enhanced wind stress run; the wind stress forcing mainly leads to a redistribution of heat. We relate the ocean changes at high southern latitudes to the effect of climate change on the Antarctic Circumpolar Current (ACC). It weakens in the 4xCO2 run and strengthens in the wind stress run.
The weakening is due to a narrowing of the ACC, caused by an expansion of the Weddell Gyre, and a flattening of the isopycnals, which are explained by a combination of the wind stress forcing and increased precipitation.
Abstract:
Fire activity has varied globally and continuously since the last glacial maximum (LGM) in response to long-term changes in global climate and shorter-term regional changes in climate, vegetation, and human land use. We have synthesized sedimentary charcoal records of biomass burning since the LGM and present global maps showing changes in fire activity for time slices during the past 21,000 years (as differences in charcoal accumulation values compared to pre-industrial). There is strong broad-scale coherence in fire activity after the LGM, but spatial heterogeneity in the signals increases thereafter. In North America, Europe and southern South America, charcoal records indicate less-than-present fire activity during the deglacial period, from 21,000 to ∼11,000 cal yr BP. In contrast, the tropical latitudes of South America and Africa show greater-than-present fire activity from ∼19,000 to ∼17,000 cal yr BP and most sites from Indochina and Australia show greater-than-present fire activity from 16,000 to ∼13,000 cal yr BP. Many sites indicate greater-than-present or near-present activity during the Holocene, with the exception of eastern North America and eastern Asia from 8,000 to ∼3,000 cal yr BP, Indonesia and Australia from 11,000 to 4,000 cal yr BP, and southern South America from 6,000 to 3,000 cal yr BP, where fire activity was less than present. Regional coherence in the patterns of change in fire activity was evident throughout the post-glacial period. These complex patterns can largely be explained in terms of large-scale climate controls modulated by local changes in vegetation and fuel load.
Abstract:
Climate controls fire regimes through its influence on the amount and types of fuel present and their dryness. CO2 concentration constrains primary production by limiting photosynthetic activity in plants. However, although fuel accumulation depends on biomass production, and hence on CO2 concentration, the quantitative relationship between atmospheric CO2 concentration and biomass burning is not well understood. Here a fire-enabled dynamic global vegetation model (the Land surface Processes and eXchanges model, LPX) is used to attribute glacial–interglacial changes in biomass burning to the increase in CO2 (which would be expected to increase primary production, and therefore fuel loads, even in the absence of climate change) versus the effects of climate change. Four general circulation models provided last glacial maximum (LGM) climate anomalies – that is, differences from the pre-industrial (PI) control climate – from the Palaeoclimate Modelling Intercomparison Project Phase 2, allowing the construction of four scenarios for LGM climate. Modelled carbon fluxes from biomass burning were corrected for the model's observed prediction biases in contemporary regional average values for biomes. With LGM climate and low CO2 (185 ppm) effects included, the modelled global flux at the LGM was in the range of 1.0–1.4 Pg C yr⁻¹, about a third less than that modelled for PI time. LGM climate with pre-industrial CO2 (280 ppm) yielded unrealistic results, with global biomass burning fluxes similar to or even greater than in the pre-industrial climate. It is inferred that a substantial part of the increase in biomass burning after the LGM must be attributed to the effect of increasing CO2 concentration on primary production and fuel load. Today, by analogy, both rising CO2 and global warming must be considered as risk factors for increasing biomass burning. Both effects need to be included in models to project future fire risks.
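The attribution logic above is a factorial experiment: running the model with LGM climate at both low and pre-industrial CO2 isolates the CO2 effect by differencing. The fluxes below are invented, chosen only to echo the reported magnitudes and sign of each effect.

```python
# Factorial attribution sketch: differencing scenario runs separates the
# CO2 effect from the climate effect. Flux values (Pg C / yr) are invented.
flux = {
    ("PI_climate",  "PI_CO2"):  1.9,
    ("LGM_climate", "low_CO2"): 1.2,   # realistic LGM run (both effects)
    ("LGM_climate", "PI_CO2"):  2.0,   # unrealistically high, as reported
}

co2_effect = flux[("LGM_climate", "PI_CO2")] - flux[("LGM_climate", "low_CO2")]
climate_effect = flux[("LGM_climate", "PI_CO2")] - flux[("PI_climate", "PI_CO2")]
```

In this toy version the CO2 term dominates, mirroring the paper's inference that fuel-load gains from rising CO2 explain much of the post-LGM increase in burning.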
Abstract:
The globalization of trade in fish has created many challenges for the developing world, specifically with regard to food safety and quality. International organisations have established a good basis for standards in international trade. Whilst these requirements are frequently embraced by the major importers (such as Japan, the EU and the USA), they often impose additional safety requirements and regularly identify batches which fail to meet their strict standards. Creating an effective national seafood control system which meets both the internal national needs as well as the requirements for the export market can be challenging. Many countries adopt a dual system where seafood products for the major export markets are subject to tight control whilst the majority of the products (whether for the local market or for more regional trade) are less tightly controlled. With regional liberalization also occurring, deciding on appropriate controls is complex. In the Sultanate of Oman, fisheries production is one of the country's chief sources of economic revenue after oil production and is a major source of the national food supply. In this paper, the structure of the fish supply chain is analysed, highlighting the different routes operating for the different markets. Although much of the fish is consumed within Oman, there is a major export trade to the local regional markets. Much smaller quantities meet the more stringent standards imposed by the major importing countries, and exports to these are limited. The paper also considers the development of the Omani fish control system, including the key legislative documents and the administrative structures that have been developed. Establishing modern controls which satisfy the demands of the major importers is possible but places additional costs on businesses. Enhanced controls such as HACCP and other management standards are required but can be difficult to justify when alternative markets do not specify these.
These enhanced controls do however provide additional consumer protection and can bring benefits to local consumers. The Omani government is attempting to upgrade the system of controls and has made tremendous progress toward the implementation of HACCP and the introduction of enhanced management systems into its industrial sector. The existence of strengthened legislative and government support, including subsidies, has encouraged some businesses to implement HACCP. The current control systems have been reviewed and a SWOT analysis approach used to identify key factors for their future development. The study shows that seafood products in the supply chain are often exposed to lengthy handling and distribution processes before reaching the consumers, a typical issue faced by many developing countries. As seafood products are often perishable, their safety is compromised if not adequately controlled. The enforcement of current food safety laws in the Sultanate of Oman is shared across various government agencies. Consequently, there is a need to harmonize all regulatory requirements, enhance domestic food protection and continue to work towards a fully risk-based approach in order to compete successfully in the global market.
Abstract:
This paper investigates the challenge of representing structural differences in river channel cross-section geometry for regional to global scale river hydraulic models and the effect this can have on simulations of wave dynamics. Classically, channel geometry is defined using data, yet at larger scales the necessary information and model structures do not exist to take this approach. We therefore propose a fundamentally different approach where the structural uncertainty in channel geometry is represented using a simple parameterization, which could then be estimated through calibration or data assimilation. This paper first outlines the development of a computationally efficient numerical scheme to represent generalised channel shapes using a single parameter, which is then validated using a simple straight channel test case and shown to predict wetted perimeter to within 2% for the channels tested. An application to the River Severn, UK is also presented, along with an analysis of model sensitivity to channel shape, depth and friction. The channel shape parameter was shown to improve model simulations of river level, particularly for more physically plausible channel roughness and depth parameter ranges. Calibrating channel Manning’s coefficient in a rectangular channel provided similar water level simulation accuracy in terms of Nash-Sutcliffe efficiency to a model where friction and shape or depth were calibrated. However, the calibrated Manning coefficient in the rectangular channel model was ~2/3 greater than the likely physically realistic value for this reach and this erroneously slowed wave propagation times through the reach by several hours. Therefore, for large scale models applied in data sparse areas, calibrating channel depth and/or shape may be preferable to assuming a rectangular geometry and calibrating friction alone.
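The single-parameter channel-shape idea above can be illustrated with a power-law family of cross-sections: a shape exponent of 1 gives a triangular (v-shaped) channel, while large exponents approach a rectangle. The parameterization and dimensions below are invented for illustration and are not the paper's numerical scheme; the wetted perimeter is integrated numerically.

```python
# One-parameter family of channel cross-sections: bank profile
# y(x) = depth * (x / half_width)**s for x in [0, half_width].
# s = 1 is triangular; s -> infinity approaches rectangular.
import numpy as np

def wetted_perimeter(depth, half_width, s, n=10_000):
    x = np.linspace(0.0, half_width, n)
    y = depth * (x / half_width) ** s
    dl = np.hypot(np.diff(x), np.diff(y))   # arc length of one bank
    return 2 * dl.sum()                     # both banks (bed width -> 0 at x = 0)

# Triangular channel, depth 2 m, half-width 1.5 m:
# exact perimeter is 2 * sqrt(2**2 + 1.5**2) = 5 m.
p_tri = wetted_perimeter(2.0, 1.5, s=1.0)
```

Since Manning's equation depends on hydraulic radius (area over wetted perimeter), a shape parameter like `s` changes conveyance directly, which is why calibrating shape can substitute for the inflated friction values the paper warns about.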
Abstract:
Debate over the late Quaternary megafaunal extinctions has focussed on whether human colonisation or climatic changes were more important drivers of extinction, with few extinctions being unambiguously attributable to either. Most analyses have been geographically or taxonomically restricted and the few quantitative global analyses have been limited by coarse temporal resolution or overly simplified climate reconstructions or proxies. We present a global analysis of the causes of these extinctions which uses high-resolution climate reconstructions and explicitly investigates the sensitivity of our results to uncertainty in the palaeological record. Our results show that human colonisation was the dominant driver of megafaunal extinction across the world but that climatic factors were also important. We identify the geographic regions where future research is likely to have the most impact, with our models reliably predicting extinctions across most of the world, with the notable exception of mainland Asia where we fail to explain the apparently low rate of extinction found in the fossil record. Our results are highly robust to uncertainties in the palaeological record, and our main conclusions are unlikely to change qualitatively following minor improvements or changes in the dates of extinctions and human colonisation.
Abstract:
We propose a geoadditive negative binomial model (Geo-NB-GAM) for regional count data that allows us to address simultaneously some important methodological issues, such as spatial clustering, nonlinearities, and overdispersion. This model is applied to the study of location determinants of inward greenfield investments that occurred during 2003–2007 in 249 European regions. After presenting the data set and showing the presence of overdispersion and spatial clustering, we review the theoretical framework that motivates the choice of the location determinants included in the empirical model, and we highlight some reasons why the relationship between some of the covariates and the dependent variable might be nonlinear. The subsequent section first describes the solutions proposed by previous literature to tackle spatial clustering, nonlinearities, and overdispersion, and then presents the Geo-NB-GAM. The empirical analysis shows the good performance of Geo-NB-GAM. Notably, the inclusion of a geoadditive component (a smooth spatial trend surface) permits us to control for spatial unobserved heterogeneity that induces spatial clustering. Allowing for nonlinearities reveals, in keeping with theoretical predictions, that the positive effect of agglomeration economies fades as the density of economic activities reaches some threshold value. However, no matter how dense the economic activity becomes, our results suggest that congestion costs never overcome positive agglomeration externalities.
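The overdispersion that motivates the negative binomial component above is easy to demonstrate: for a negative binomial, the variance is mu + mu²/theta, exceeding the Poisson variance (equal to the mean). The counts below are synthetic, generated via the standard gamma-Poisson mixture, and stand in for regional investment counts.

```python
# Overdispersion check motivating a negative binomial over a Poisson model.
# Synthetic counts via the gamma-Poisson mixture; parameters are invented.
import numpy as np

rng = np.random.default_rng(2)
mu, theta = 5.0, 2.0

lam = rng.gamma(shape=theta, scale=mu / theta, size=100_000)  # gamma-distributed rates
counts = rng.poisson(lam)                                     # NB(mu, theta) counts

mean, var = counts.mean(), counts.var()
dispersion = var / mean   # theoretical value 1 + mu/theta = 3.5 here; Poisson gives 1
```

A dispersion ratio well above 1 is the usual diagnostic for choosing a negative binomial likelihood, to which the paper adds smooth terms and a spatial trend surface for nonlinearity and clustering.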