Abstract:
A variety of methods are available to estimate future solar radiation (SR) scenarios at spatial scales appropriate for local climate change impact assessment. However, no clear guidelines are available in the literature for deciding which methodologies are most suitable for different applications. Three cases that guide the estimation of SR are discussed in this study: Case 1, where SR is measured; Case 2, where SR is measured but the records are sparse; and Case 3, where SR is not measured. In Case 1, future SR scenarios are derived using several downscaling methodologies that transfer the simulated large-scale information of global climate models to the local (measurement) scale. In Case 2, SR is first estimated at the local scale for a longer time period using the sparse measured records, and future scenarios are then derived using several downscaling methodologies. In Case 3, SR is first estimated at a regional scale for a longer time period using complete or sparse measured records of SR, from which SR at the local scale is estimated; future scenarios are then derived using several downscaling methodologies. The lack of observed SR data, especially in developing countries, has hindered various climate change impact studies. Hence, the Case 3 methodology was further demonstrated by applying it to the semi-arid Malaprabha reservoir catchment in southern India, where SR was calculated using the modified Hargreaves method. A support vector machine was used in downscaling SR. Future monthly scenarios of SR were estimated from simulations of the third-generation Canadian General Circulation Model (CGCM3) for various SRES emission scenarios (A1B, A2, B1, and COMMIT). Results indicated a projected decrease of 0.4 to 12.2 W m^-2 yr^-1 in SR during the period 2001-2100 across the 4 scenarios. The decreasing trends for the future agreed with SR simulations obtained directly from the CGCM3 model for the 4 scenarios.
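The abstract does not specify the "modified Hargreaves method" it used; a minimal sketch of the standard Hargreaves-Samani radiation formula, the usual starting point for such modifications, might look like this (the coefficient `krs` and the example inputs are illustrative assumptions, not values from the study):

```python
import math

def hargreaves_solar_radiation(tmax, tmin, ra, krs=0.16):
    """Estimate solar radiation (same units as ra) from the daily
    temperature range, after Hargreaves & Samani:
        Rs = krs * sqrt(Tmax - Tmin) * Ra
    krs is an empirical coefficient, typically ~0.16 for interior
    sites and ~0.19 for coastal sites.
    """
    return krs * math.sqrt(tmax - tmin) * ra

# Illustrative values: a warm semi-arid day with extraterrestrial
# radiation Ra = 38 MJ m^-2 d^-1 and a 14 degC diurnal range.
rs = hargreaves_solar_radiation(tmax=34.0, tmin=20.0, ra=38.0)
```

The appeal of the method for data-sparse regions is visible here: it needs only daily temperature extremes and the (computable) extraterrestrial radiation, not measured sunshine or cloud records.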
Abstract:
Developments in statistical extreme value theory, which allow non-stationary modeling of changes in the frequency and severity of extremes, are explored to analyze changes in return levels of droughts for the Colorado River. The transient future return levels (conditional quantiles) derived from regional drought projections using appropriate extreme value models are compared with those from observed naturalized streamflows. The time of detection is computed as the time at which significant differences exist between the observed and future extreme drought levels, accounting for the uncertainties in their estimates. Projections from multiple climate model-scenario combinations are considered; no uniform pattern of changes in drought quantiles is observed across the projections. While some projections indicate a shift to another stationary regime, for many projections found to be non-stationary, detection of change in the tail quantiles of droughts occurs within the 21st century, with no unanimity in the time of detection. Earlier detection is observed in drought levels of higher probability of exceedance. (C) 2014 Elsevier Ltd. All rights reserved.
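A transient return level of the kind described above can be sketched by letting the location parameter of a generalized extreme value (GEV) distribution drift in time. The quantile formula below is the standard GEV return level; all parameter values are hypothetical, not the paper's:

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """T-year return level of a GEV(mu, sigma, xi) distribution,
    z_T = mu - (sigma/xi) * (1 - y^(-xi)) with y = -ln(1 - 1/T);
    assumes xi != 0."""
    y = -math.log(1.0 - 1.0 / T)
    return mu - (sigma / xi) * (1.0 - y ** (-xi))

def nonstationary_return_level(mu0, mu1, sigma, xi, T, t):
    """Transient (conditional) return level when the location
    parameter drifts linearly in time: mu(t) = mu0 + mu1 * t."""
    return gev_return_level(mu0 + mu1 * t, sigma, xi, T)

# Hypothetical drought-severity parameters (not from the paper):
z_now = nonstationary_return_level(mu0=10.0, mu1=0.05, sigma=2.0,
                                   xi=0.1, T=100, t=0)
z_2100 = nonstationary_return_level(mu0=10.0, mu1=0.05, sigma=2.0,
                                    xi=0.1, T=100, t=100)
```

Comparing such conditional quantiles against their stationary counterparts, with confidence intervals on both, is the essence of the detection-time calculation the abstract describes.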
Abstract:
The countries and territories of the Pacific Islands face many challenges in building the three main pillars of food security: availability, access and appropriate use of nutritious food. These challenges arise from factors including rapid population growth and urbanization, shortages of arable land for farming and the availability of cheap, low-quality foods. As a result, many are now highly dependent on imported food, and the incidence of non-communicable diseases in the region is among the highest in the world. This report summarizes: 1) the projected effects of climate change on agriculture, fisheries and aquaculture in the Pacific region; 2) adaptations and supporting policies needed to reduce risks to food production; 3) gaps in knowledge that must be filled in order to implement the adaptations effectively; 4) recommendations to fill these knowledge gaps.
Abstract:
Climate change is expected to have a significant impact on the future thermal performance of buildings. Building simulation and sensitivity analysis can be employed to predict these impacts, guiding interventions to adapt buildings to future conditions. This article explores the use of simulation to study the impact of climate change on a theoretical office building in the UK, employing a probabilistic approach. The work studies (1) appropriate performance metrics and underlying modelling assumptions, (2) the sensitivity of computational results, to identify key design parameters, and (3) the impact of zonal resolution. The conclusions highlight the importance of assumptions about electricity conversion factors, proper management of internal heat gains, and the need for an appropriately detailed zonal resolution. © 2010 Elsevier B.V. All rights reserved.
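A one-at-a-time parameter screening, often the first step before the fuller probabilistic sensitivity analysis described above, can be sketched as follows. The toy energy model and all its coefficients are invented for illustration and do not come from the article:

```python
def annual_energy_use(u_value, internal_gains, conversion_factor):
    """Toy building-energy model (illustrative only): heating demand
    rises with envelope U-value and falls with internal heat gains,
    then a conversion factor maps delivered energy to primary energy."""
    heating = max(0.0, 120.0 * u_value - 0.8 * internal_gains)
    return conversion_factor * (heating + internal_gains)

base = {"u_value": 1.2, "internal_gains": 30.0, "conversion_factor": 2.5}

# Perturb each input +/-10% in turn and record the swing in the
# output; the swing ranks the design parameters by influence.
swings = {}
for name, value in base.items():
    outputs = []
    for factor in (0.9, 1.1):
        inputs = dict(base)
        inputs[name] = value * factor
        outputs.append(annual_energy_use(**inputs))
    swings[name] = max(outputs) - min(outputs)

ranked = sorted(swings, key=swings.get, reverse=True)
```

With these invented coefficients the conversion factor dominates, echoing (coincidentally) the article's emphasis on conversion-factor assumptions; a real study would replace the toy model with the building simulation itself.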
Abstract:
A balloon tethered at an altitude of 20 km could deliver a particulate cloud leading to global cooling. Tethering a balloon at this altitude poses significant problems with respect to vibration and stability, especially in regions of high wind. No one has ever proposed, let alone launched, a balloon at an altitude of 20 km tethered to the ground. Owing to wind, the tether needs to be 23 km in length and is to be fixed to a ship at sea or on land in equatorial regions. Whilst the balloon at 20 km is subject to relatively modest wind conditions, at jet stream altitudes (10 km) the tether will experience much higher wind loadings, not only because of the high wind speeds of up to 300 km/h but also because of the higher air density at that altitude. A tether of circular cross-section in these high winds will be subject to horizontal and downward drag forces that would bring the aerostat down. For this reason it is advantageous to consider a self-aligning tether of aerodynamic cross-section, whereby it is possible to reduce the drag substantially. One disadvantage of a non-circular tether is the possibility of flutter and galloping instabilities. It is reasonably straightforward to model these phenomena for short lengths of aerofoil, but the situation becomes more complex for a 20 km tensioned tether with large deflection and curvature, variable wind speed, variable air density and variable tension. Models of infinite length are used to establish stability at a local scale, where the tension, aerodynamic and geometric properties can be considered constant; dispersion curve analysis is useful here. For dynamics on a long-wavelength scale (several kilometres), however, a full non-linear analysis is required. This non-linear model can be used to establish the local values of tension appropriate for the dispersion analysis. This keynote presentation will give some insight into these issues.
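The drag argument can be made concrete with the standard per-unit-length drag formula F' = 0.5 * rho * U^2 * Cd * d. The tether width and both drag coefficients below are illustrative, textbook-style assumptions, not values from the presentation:

```python
def drag_per_metre(rho, wind_speed, cd, width):
    """Aerodynamic drag per unit length of tether [N/m]:
    F' = 0.5 * rho * U^2 * Cd * d."""
    return 0.5 * rho * wind_speed**2 * cd * width

# Jet-stream conditions from the abstract: ~300 km/h near 10 km altitude.
rho_10km = 0.41          # kg/m^3, standard-atmosphere density near 10 km
u = 300.0 / 3.6          # 300 km/h converted to m/s
d = 0.05                 # assumed 5 cm tether width (illustrative)

# Indicative drag coefficients: ~1.2 for a circular cylinder in
# subcritical flow, ~0.1 for a well-streamlined fairing (assumptions):
circular = drag_per_metre(rho_10km, u, cd=1.2, width=d)
faired = drag_per_metre(rho_10km, u, cd=0.1, width=d)
ratio = circular / faired
```

Even under these rough assumptions the circular section carries on the order of ten times the distributed load of a streamlined one, which is the motivation for the self-aligning aerodynamic tether, and in turn for the flutter and galloping analysis the streamlined shape then requires.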
Abstract:
This paper provides an exhaustive review of critical issues in the design of climate mitigation policy, pulling together key findings and controversies from diverse literatures on mitigation costs, damage valuation, policy instrument choice, technological innovation, and international climate policy. We begin with the broadest issue: how high assessments suggest the near- and medium-term price on greenhouse gases would need to be, both under cost-effective stabilization of global climate and under net-benefit maximization or Pigouvian emissions pricing. The remainder of the paper focuses on the appropriate scope of regulation, issues in policy instrument choice, complementary technology policy, and international policy architectures.
Abstract:
Satellite-derived remote-sensing reflectance (Rrs) can be used for mapping biogeochemically relevant variables, such as the chlorophyll concentration and the Inherent Optical Properties (IOPs) of the water, at global scale for use in climate-change studies. Prior to generating such products, suitable algorithms have to be selected that are appropriate for the purpose. Algorithm selection needs to account for both qualitative and quantitative requirements. In this paper we develop an objective methodology designed to rank the quantitative performance of a suite of bio-optical models. The objective classification is applied using the NASA bio-Optical Marine Algorithm Dataset (NOMAD). Using in situ Rrs as input to the models, the performance of eleven semi-analytical models, as well as five empirical chlorophyll algorithms and an empirical diffuse attenuation coefficient algorithm, is ranked for spectrally-resolved IOPs, chlorophyll concentration and the diffuse attenuation coefficient at 489 nm. The sensitivity of the objective classification and the uncertainty in the ranking are tested using a Monte Carlo approach (bootstrapping). Results indicate that the performance of the semi-analytical models varies depending on the product and wavelength of interest. For chlorophyll retrieval, empirical algorithms perform better than semi-analytical models, in general. The performance of these empirical models reflects either their immunity to scale errors or instrument noise in Rrs data, or simply that the data used for model parameterisation were not independent of NOMAD. Nonetheless, uncertainty in the classification suggests that the performance of some semi-analytical algorithms at retrieving chlorophyll is comparable with the empirical algorithms. For phytoplankton absorption at 443 nm, some semi-analytical models also perform with similar accuracy to an empirical model.
We discuss the potential biases, limitations and uncertainty in the approach, as well as additional qualitative considerations for algorithm selection for climate-change studies. Our classification has the potential to be routinely implemented, such that the performance of emerging algorithms can be compared with existing algorithms as they become available. In the long-term, such an approach will further aid algorithm development for ocean-colour studies.
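A bootstrap ranking of the kind the paper describes can be sketched as follows. The match-up data and the two toy "algorithms" are synthetic stand-ins, not NOMAD data or the actual models assessed:

```python
import random

def rmse(pred, obs):
    """Root-mean-square error between predictions and observations."""
    return (sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)) ** 0.5

def bootstrap_rank(models, obs, n_boot=1000, seed=42):
    """Rank models by RMSE on bootstrap resamples of the match-up data
    and return, per model, the fraction of resamples in which it wins.
    `models` maps a name to its predictions, co-located with `obs`."""
    rng = random.Random(seed)
    n = len(obs)
    wins = {name: 0 for name in models}
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        scores = {
            name: rmse([pred[i] for i in idx], [obs[i] for i in idx])
            for name, pred in models.items()
        }
        wins[min(scores, key=scores.get)] += 1
    return {name: w / n_boot for name, w in wins.items()}

# Synthetic chlorophyll match-ups (illustrative only):
obs = [0.1, 0.5, 1.0, 2.0, 0.3, 0.8, 1.5, 0.2]
models = {
    "empirical": [o * 1.05 for o in obs],       # small multiplicative bias
    "semi_analytical": [o + 0.3 for o in obs],  # larger additive bias
}
win_fraction = bootstrap_rank(models, obs)
```

Reporting the win fraction rather than a single ranking is what turns the classification into one with quantified uncertainty: models whose win fractions overlap cannot be confidently separated.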
Abstract:
The extent to which climate change might diminish the efficacy of protected areas is one of the most pressing conservation questions. Many projections suggest that climate-driven species distribution shifts will leave protected areas impoverished and species inadequately protected while other evidence suggests that intact ecosystems within protected areas will be resilient to change. Here, we tackle this problem empirically. We show how recent changes in distribution of 139 Tanzanian savannah bird species are linked to climate change, protected area status and land degradation. We provide the first evidence of climate-driven range shifts for an African bird community. Our results suggest that the continued maintenance of existing protected areas is an appropriate conservation response to the challenge of climate and environmental change.
Abstract:
This article takes as its starting point the potentially negative human rights implications that the effects of climate change, disasters and development practices can have on individuals and communities. It argues that key international instruments, including the post-2015 successors to the Kyoto Protocol, Hyogo Framework for Action on disaster risk reduction and the Millennium Development Goals, appear to be moving towards an express acknowledgment of the relevance of international human rights law as an important mechanism to minimise potential harms that may arise. This raises the question as to the appropriate role of the UN human rights monitoring and accountability mechanisms in identifying the relevant rights-holders and duty-bearers. The article therefore provides an examination of the linkages between climate change and international human rights law, as well as discussion of the human rights considerations and accountability mechanisms for disasters and sustainable development. The article concludes by arguing that despite differential understandings between disciplines as to the meaning of key terms such as ‘vulnerability’ and ‘resilience’, international human rights law provides a comprehensive basis for promoting international and national accountability. It follows that a greater level of coordination and coherence between the human rights approaches of the various post-2015 legal and policy frameworks is warranted as a means of promoting the dignity of those most affected by climate change, disasters and developmental activities.
Abstract:
Many parts of the UK’s rail network were constructed in the mid-19th century, long before the advent of modern construction standards. Historic levels of low investment, poor maintenance strategies and the deleterious effects of climate change have resulted in critical elements of the rail network being at significant risk of failure. The majority of failures that have occurred over recent years have been triggered by extreme weather events. Advance assessment and remediation of earthworks is, however, significantly less costly than dealing with failures reactively. It is therefore crucial that appropriate approaches for assessing the stability of earthworks are developed, so that repair work can be better targeted and failures avoided wherever possible. This extended abstract briefly discusses some preliminary results from an ongoing geophysical research project studying the impact of climatic and seasonal weather variations on the stability of a century-old railway embankment on the Gloucestershire Warwickshire Steam Railway line in southern England.
Abstract:
The UK’s transportation network is supported by critical geotechnical assets (cuttings/embankments/dams) that require sustainable, cost-effective management, while maintaining an appropriate service level to meet social, economic, and environmental needs. Recent effects of extreme weather on these geotechnical assets have highlighted their vulnerability to climate variations. We have assessed the potential of surface wave data to portray the climate-related variations in mechanical properties of a clay-filled railway embankment. Seismic data were acquired bimonthly from July 2013 to November 2014 along the crest of a heritage railway embankment in southwest England. For each acquisition, the collected data were first processed to obtain a set of Rayleigh-wave dispersion and attenuation curves, referenced to the same spatial locations. These data were then analyzed to identify a coherent trend in their spatial and temporal variability. The relevance of the observed temporal variations was also verified with respect to the experimental data uncertainties. Finally, the surface wave dispersion data sets were inverted to reconstruct a time-lapse model of S-wave velocity (VS) for the embankment structure, using a least-squares laterally constrained inversion scheme. A key step of the inversion process was the estimation of a suitable initial model and the selection of adequate levels of spatial regularization. The initial model and the strength of spatial smoothing were then kept constant throughout the processing of all available data sets, to ensure homogeneity of the procedure and comparability among the obtained VS sections. A continuous and coherent temporal pattern in the surface wave data, and consequently in the reconstructed VS models, was identified. This pattern is related to the seasonal distribution of precipitation and the soil water content measured on site.
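The regularized inversion at the heart of such a scheme can be illustrated in miniature: a least-squares problem with a lateral smoothness penalty, solved via the normal equations. The kernel, data and penalty weight below are toy values, not the paper's surface-wave problem:

```python
def solve3(A, b):
    """Gaussian elimination without pivoting; adequate for the small,
    diagonally dominant systems produced below."""
    n = len(b)
    A = [row[:] for row in A]
    b = list(b)
    for k in range(n):
        for i in range(k + 1, n):
            f = A[i][k] / A[k][k]
            for j in range(k, n):
                A[i][j] -= f * A[k][j]
            b[i] -= f * b[k]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(A[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / A[i][i]
    return x

def constrained_least_squares(d, lam):
    """Toy laterally constrained inversion: three model cells, each
    observed directly (identity kernel), with lam^2 penalising squared
    differences between neighbours. Solves the normal equations
        (I + lam^2 * L^T L) m = d,
    where L is the first-difference operator."""
    LtL = [[1.0, -1.0, 0.0],
           [-1.0, 2.0, -1.0],
           [0.0, -1.0, 1.0]]
    A = [[(1.0 if i == j else 0.0) + lam**2 * LtL[i][j] for j in range(3)]
         for i in range(3)]
    return solve3(A, list(d))

# Noisy cell "velocities" (m/s); purely illustrative values.
rough = constrained_least_squares([200.0, 260.0, 210.0], lam=0.0)
smooth = constrained_least_squares([200.0, 260.0, 210.0], lam=10.0)
```

Raising `lam` pulls neighbouring cells together without changing their mean, which is why keeping the smoothing strength fixed across all acquisitions, as the abstract describes, is essential for comparing the resulting time-lapse sections.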
Abstract:
Coastal and estuarine landforms provide a physical template that not only accommodates diverse ecosystem functions and human activities, but also mediates flood and erosion risks that are expected to increase with climate change. In this paper, we explore some of the issues associated with the conceptualisation and modelling of coastal morphological change at time and space scales relevant to managers and policy makers. Firstly, we revisit the question of how to define the most appropriate scales at which to seek quantitative predictions of landform change within an age defined by human interference with natural sediment systems and by the prospect of significant changes in climate and ocean forcing. Secondly, we consider the theoretical bases and conceptual frameworks for determining which processes are most important at a given scale of interest and the related problem of how to translate this understanding into models that are computationally feasible, retain a sound physical basis and demonstrate useful predictive skill. In particular, we explore the limitations of a primary scale approach and the extent to which these can be resolved with reference to the concept of the coastal tract and application of systems theory. Thirdly, we consider the importance of different styles of landform change and the need to resolve not only incremental evolution of morphology but also changes in the qualitative dynamics of a system and/or its gross morphological configuration. The extreme complexity and spatially distributed nature of landform systems means that quantitative prediction of future changes must necessarily be approached through mechanistic modelling of some form or another. Geomorphology has increasingly embraced so-called ‘reduced complexity’ models as a means of moving from an essentially reductionist focus on the mechanics of sediment transport towards a more synthesist view of landform evolution. 
However, there is little consensus on exactly what constitutes a reduced complexity model and the term itself is both misleading and, arguably, unhelpful. Accordingly, we synthesise a set of requirements for what might be termed ‘appropriate complexity modelling’ of quantitative coastal morphological change at scales commensurate with contemporary management and policy-making requirements: 1) The system being studied must be bounded with reference to the time and space scales at which behaviours of interest emerge and/or scientific or management problems arise; 2) model complexity and comprehensiveness must be appropriate to the problem at hand; 3) modellers should seek a priori insights into what kind of behaviours are likely to be evident at the scale of interest and the extent to which the behavioural validity of a model may be constrained by its underlying assumptions and its comprehensiveness; 4) informed by qualitative insights into likely dynamic behaviour, models should then be formulated with a view to resolving critical state changes; and 5) meso-scale modelling of coastal morphological change should reflect critically on the role of modelling and its relation to the observable world.
Abstract:
Climatic as well as non-climatic factors must be taken into account in the process of adapting agriculture to climate change and variability (CCV). This paradigm shift places the human agent at the centre of the adaptation process, which can lead to maladaptation. Following the climate change debates that attracted scientific and public attention in the 1980s and 1990s, Canadian agriculture became a focal point of several pioneering studies on CCV, a phenomenon mainly due to anthropogenic effects. In coping with CCV, not only mitigation but also adaptation is important. Where adaptation is concerned, it is climate variability that matters rather than mean temperature increases alone. The general objective of this master's thesis is to improve understanding of the processes of adaptation and of building adaptive capacity at the farm and farming-community levels through a bottom-up process, that is, using a co-construction approach (which can itself be considered an adaptation strategy), to develop management and planning tools appropriate to stakeholders and thereby increase the adaptive capacity of the farming community. To this end, a grounded theory approach is used. The results consist of five interrelated categories of expanded codes, conceptually distinct and at a higher level of abstraction. The MRC du Haut-Richelieu was chosen as the case study because of several of its agricultural dimensions, in addition to its favourable biophysical conditions. Fifteen interviews were conducted with farmers.
The results show that while some farmers recognized both the positive and negative sides of CCV, others are very optimistic about it, as if they saw only the positive side; hence the need to consider both sides of CCV. There also remains some uncertainty around CCV, stemming from farmers' misinformation and desensitization, mainly regarding the causes of CCV and the nature of climatic events. Furthermore, given that adaptation has several characteristics and types, there are many forms of adaptation involving both private actors and government. Moreover, adaptation strategies should be developed jointly by farmers in concert with other actors, starting with agronomists, who serve as an important relay between farmers and other stakeholders such as public institutions and private companies.
Abstract:
Poor adaptation to climate change is a major threat to sustainable rice production in Nigeria, yet the determinants of the climate-change adaptation strategies used by rice farmers in Southwestern Nigeria have not been fully investigated. In this study, those determinants were investigated. Data were obtained through Focus Group Discussions (FGDs) and a field survey conducted in the study areas, and were analyzed using descriptive and inferential statistical tools such as percentages and regression analysis. The major climate change adaptation strategies used by the respondents included planting improved rice varieties such as Federal Agricultural Research Oryza (FARO) (80.5 %), seeking early-warning information (80.9 %), shifting the planting date until weather conditions were favourable (99.1 %), and using chemical fertilizer to maintain soil fertility (20.5 %). The determinants of the adaptation strategies used by the farmers included access to early-warning information (β=43.04), access to fertilizer (β=5.78), farm plot size (β=–12.04) and access to a regular water supply (β=–24.79). Climate change adaptation requires the provision of incentives to farmers, training on drought and flood control, and the use of improved technology to obtain higher yields.
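The β coefficients reported above come from regression analysis; as a minimal illustration of how such a coefficient is estimated, here is an ordinary least-squares fit on invented data (a binary access indicator versus a count of strategies adopted; none of these numbers are from the survey):

```python
def ols_beta(x, y):
    """Least-squares slope (beta) and intercept of y on x:
    beta = cov(x, y) / var(x)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    beta = cov / var
    return beta, my - beta * mx

# Hypothetical data (not the survey's): x = 1 if the farmer had access
# to early-warning information, y = number of adaptation strategies used.
access = [0, 0, 0, 1, 1, 1, 1, 0]
n_strategies = [1, 2, 1, 3, 4, 3, 4, 2]
beta, intercept = ols_beta(access, n_strategies)
```

With a binary regressor the slope is simply the difference in group means, so a positive β, as reported for early-warning access, indicates that farmers with access adopt more strategies on average.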
Abstract:
Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al., 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours running on 40 processors, and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command utilities for data pre-processing and post-processing prior to job resubmission. Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, preventing it from accumulating on the remote system and allowing the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts.
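The output volumes quoted above make clear why streaming results back during the run matters; a quick check of the totals for a full experiment:

```python
# Figures quoted in the abstract: one model year -> ~20 GB of output in
# ~50,000 files, with 50-year simulations common.
gb_per_year = 20
files_per_year = 50_000
years = 50

total_gb = gb_per_year * years          # ~1 TB per 50-year experiment
total_files = files_per_year * years    # 2.5 million files

# Transferring output back during the run means the remote system only
# ever holds about one year's worth (~20 GB) at a time, rather than
# accumulating the full terabyte-scale archive.
```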
G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight, and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat, and is therefore easy for system administrators to install and maintain. G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al., 2008) and POLCOMS (Holt et al., 2008) ocean models are installed, and there are plans to install the Hadley Centre’s HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al., 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al., 2008), which aims to simulate the world’s coastal oceans. A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) The scientist prepares input files on his or her local machine. (2) Using information provided by the Grid’s Ganglia monitoring system, the scientist selects an appropriate compute resource. (3) The scientist runs the relevant workflow script on his or her local machine. This script is unmodified except that calls to run the model (e.g. with “mpirun”) are simply replaced with calls to “GRexRun”. (4) The G-Rex middleware automatically handles the uploading of input files to the remote resource and the downloading of output files back to the user, including their deletion from the remote system, during the run.
(5) The scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine. G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale Grid resources such as the UK National Grid service.