27 results for Jacobson, Jeff
Abstract:
Wide-ranging climate changes are expected in the Arctic by the end of the 21st century, but projections of the size of these changes vary widely across current global climate models. This variation represents a large source of uncertainty in our understanding of the evolution of Arctic climate. Here we systematically quantify and assess the model uncertainty in Arctic climate changes in two CO2 doubling experiments: a multimodel ensemble (CMIP3) and an ensemble constructed using a single model (HadCM3) with multiple parameter perturbations (THC-QUMP). These two ensembles allow us to assess the contribution that both structural and parameter variations across models make to the total uncertainty and to begin to attribute sources of uncertainty in projected changes. We find that parameter uncertainty is a major source of uncertainty in certain aspects of Arctic climate, but also that uncertainties in the mean climate state in the 20th century, most notably in the northward Atlantic ocean heat transport and Arctic sea ice volume, are a significant source of uncertainty for projections of future Arctic change. We suggest that better observational constraints on these quantities will lead to significant improvements in the precision of projections of future Arctic climate change.
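The spread-based uncertainty measure this kind of ensemble study relies on can be sketched in a few lines. This is a hypothetical illustration only: the model names and warming values below are invented, not taken from CMIP3 or THC-QUMP.

```python
from statistics import mean, stdev

# Hypothetical Arctic warming (K) at CO2 doubling from a small ensemble
# of models. All names and values are illustrative, not from the paper.
projections = {
    "model_a": 5.1, "model_b": 7.8, "model_c": 4.4,
    "model_d": 6.9, "model_e": 9.2,
}

def ensemble_spread(values):
    """Return (ensemble mean, standard deviation) as a simple
    measure of model uncertainty in the projected change."""
    v = list(values)
    return mean(v), stdev(v)

mu, sigma = ensemble_spread(projections.values())
print(f"ensemble mean = {mu:.2f} K, spread (1 sigma) = {sigma:.2f} K")
```

Partitioning such a spread between structural differences (different models) and parameter perturbations (one model, many parameter settings) is the comparison the two ensembles above enable.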
Abstract:
Objective: To systematically review the available evidence on whether national or international agricultural policies that directly affect the price of food influence the prevalence rates of undernutrition or nutrition-related chronic disease in children and adults. Design: Systematic review. Setting: Global. Search strategy: We systematically searched five databases for published literature (MEDLINE, EconLit, Agricola, AgEcon Search, Scopus) and systematically browsed other databases and relevant organisational websites for unpublished literature. Reference lists of included publications were hand-searched for additional relevant studies. We included studies that evaluated or simulated the effects of national or international food-price-related agricultural policies on nutrition outcomes, reporting data collected after 1990 and published in English. Primary and secondary outcomes: Prevalence rates of undernutrition (measured with anthropometry or clinical deficiencies) and overnutrition (obesity and nutrition-related chronic diseases including cancer, heart disease and diabetes). Results: We identified a total of four relevant reports: two ex post evaluations and two ex ante simulations. A study from India reported on the undernutrition rates in children, and the other three studies from Egypt, the Netherlands and the USA reported on the nutrition-related chronic disease outcomes in adults. Two of the studies assessed the impact of policies that subsidised the price of agricultural outputs and two focused on public food distribution policies. The limited evidence base provided some support for the notion that agricultural policies that change the prices of foods at a national level can have an effect on population-level nutrition and health outcomes. Conclusions: A systematic review of the available literature suggests that there is a paucity of robust direct evidence on the impact of agricultural price policies on nutrition and health.
Abstract:
Television’s long-form storytelling has the potential to allow the rippling of music across episodes and seasons in interesting ways. In the integration of narrative, music and meaning found in The O.C. (Fox, 2003-7), popular song’s allusive and referential qualities are drawn upon to particularly televisual ends: at times embracing its ‘disruptive’ presence, at others suturing popular music into narrative, at times doing both at once. With television studies largely lacking theories of music, this chapter draws on film music theory and close textual analysis to analyse some of the programme's music moments in detail. In particular it considers the series-spanning use of Jeff Buckley’s cover of ‘Hallelujah’ (and its subsequent oppressive presence across multiple televisual texts), the end-of-episode musical montage and the use of recurring song fragments as theme within single episodes. In doing so it highlights music's role in the fragmentation and flow of the television aesthetic and popular song’s structural presence in television narrative. Illustrating the multiplicity of popular song’s use in television, these moments demonstrate song’s ability to provide narrative commentary, yet also make particular use of what Ian Garwood describes as the ability of ‘a non-diegetic song to exceed the emotional range displayed by diegetic characters’ (2003: 115), to ‘speak’ for characters or to their feelings, contributing to both teen TV’s melodramatic affect and narrative expression.
Abstract:
Black carbon aerosol plays a unique and important role in Earth’s climate system. Black carbon is a type of carbonaceous material with a unique combination of physical properties. This assessment provides an evaluation of black-carbon climate forcing that is comprehensive in its inclusion of all known and relevant processes and that is quantitative in providing best estimates and uncertainties of the main forcing terms: direct solar absorption; influence on liquid, mixed phase, and ice clouds; and deposition on snow and ice. These effects are calculated with climate models, but when possible, they are evaluated with both microphysical measurements and field observations. Predominant sources are combustion related, namely, fossil fuels for transportation, solid fuels for industrial and residential uses, and open burning of biomass. Total global emissions of black carbon using bottom-up inventory methods are 7500 Gg yr⁻¹ in the year 2000 with an uncertainty range of 2000 to 29000. However, global atmospheric absorption attributable to black carbon is too low in many models and should be increased by a factor of almost 3. After this scaling, the best estimate for the industrial-era (1750 to 2005) direct radiative forcing of atmospheric black carbon is +0.71 W m⁻² with 90% uncertainty bounds of (+0.08, +1.27) W m⁻². Total direct forcing by all black carbon sources, without subtracting the preindustrial background, is estimated as +0.88 (+0.17, +1.48) W m⁻². Direct radiative forcing alone does not capture important rapid adjustment mechanisms. A framework is described and used for quantifying climate forcings, including rapid adjustments. The best estimate of industrial-era climate forcing of black carbon through all forcing mechanisms, including clouds and cryosphere forcing, is +1.1 W m⁻² with 90% uncertainty bounds of +0.17 to +2.1 W m⁻².
Thus, there is a very high probability that black carbon emissions, independent of co-emitted species, have a positive forcing and warm the climate. We estimate that black carbon, with a total climate forcing of +1.1 W m⁻², is the second most important human emission in terms of its climate forcing in the present-day atmosphere; only carbon dioxide is estimated to have a greater forcing. Sources that emit black carbon also emit other short-lived species that may either cool or warm climate. Climate forcings from co-emitted species are estimated and used in the framework described herein. When the principal effects of short-lived co-emissions, including cooling agents such as sulfur dioxide, are included in net forcing, energy-related sources (fossil fuel and biofuel) have an industrial-era climate forcing of +0.22 (−0.50 to +1.08) W m⁻² during the first year after emission. For a few of these sources, such as diesel engines and possibly residential biofuels, warming is strong enough that eliminating all short-lived emissions from these sources would reduce net climate forcing (i.e., produce cooling). When open burning emissions, which emit high levels of organic matter, are included in the total, the best estimate of net industrial-era climate forcing by all short-lived species from black-carbon-rich sources becomes slightly negative (−0.06 W m⁻² with 90% uncertainty bounds of −1.45 to +1.29 W m⁻²). The uncertainties in net climate forcing from black-carbon-rich sources are substantial, largely due to lack of knowledge about cloud interactions with both black carbon and co-emitted organic carbon. In prioritizing potential black-carbon mitigation actions, non-science factors, such as technical feasibility, costs, policy design, and implementation feasibility, play important roles. The major sources of black carbon are presently in different stages with regard to the feasibility for near-term mitigation.
This assessment, by evaluating the large number and complexity of the associated physical and radiative processes in black-carbon climate forcing, sets a baseline from which to improve future climate forcing estimates.
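As a rough illustration of how component forcings and their uncertainty ranges might be combined, the sketch below adds best estimates and combines half-widths in quadrature assuming independence. This is a simplification: the assessment's actual uncertainty propagation is more involved, and the two-component split below is invented so that the totals only loosely resemble the quoted figures.

```python
import math

# Illustrative forcing components (W m^-2): (best estimate, half-width
# of the 90% range). The decomposition is invented for this sketch.
components = {
    "direct": (0.71, 0.60),
    "clouds_and_cryosphere": (0.39, 0.80),
}

# Best estimates add linearly; independent uncertainties add in quadrature.
best = sum(b for b, _ in components.values())
half_width = math.sqrt(sum(h ** 2 for _, h in components.values()))
print(f"net forcing ~ {best:+.2f} W m^-2, "
      f"90% range ~ ({best - half_width:+.2f}, {best + half_width:+.2f})")
```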
Abstract:
There is a growing need for massive computational resources for the analysis of new astronomical datasets. To tackle this problem, we present here our first steps towards marrying two new and emerging technologies: the Virtual Observatory (e.g. AstroGrid) and the computational grid (e.g. TeraGrid, COSMOS etc.). We discuss the construction of VOTechBroker, which is a modular software tool designed to abstract the tasks of submission and management of a large number of computational jobs to a distributed computer system. The broker will also interact with the AstroGrid workflow and MySpace environments. We discuss our planned usages of the VOTechBroker in computing a huge number of n–point correlation functions from the SDSS data and massive model-fitting of millions of CMBfast models to WMAP data. We also discuss other applications including the determination of the XMM Cluster Survey selection function and the construction of new WMAP maps.
Abstract:
We outline our first steps towards marrying two new and emerging technologies: the Virtual Observatory (e.g. AstroGrid) and the computational grid. We discuss the construction of VOTechBroker, which is a modular software tool designed to abstract the tasks of submission and management of a large number of computational jobs to a distributed computer system. The broker will also interact with the AstroGrid workflow and MySpace environments. We present our planned usage of the VOTechBroker in computing a huge number of n–point correlation functions from the SDSS, as well as fitting over a million CMBfast models to the WMAP data.
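The broker abstraction described above can be sketched roughly as follows. The class and method names are invented for illustration; the real VOTechBroker interface is not given in the abstract.

```python
from dataclasses import dataclass, field

# Minimal sketch of the broker idea: hide the submission and management
# of many jobs behind a small interface, so the caller need not know
# which distributed backend runs them. Names (Job, Broker, submit,
# run_all) are hypothetical.

@dataclass
class Job:
    command: str
    status: str = "queued"

@dataclass
class Broker:
    jobs: list = field(default_factory=list)

    def submit(self, command: str) -> int:
        """Queue a job and return its id."""
        self.jobs.append(Job(command))
        return len(self.jobs) - 1

    def run_all(self):
        """Stand-in for dispatching queued jobs to a grid backend."""
        for job in self.jobs:
            job.status = "done"

broker = Broker()
ids = [broker.submit(f"cmbfast model_{i}.ini") for i in range(3)]
broker.run_all()
print([broker.jobs[i].status for i in ids])
```

The value of such a layer is that the same submission code can drive many models (here, hypothetical CMBfast parameter files) or correlation-function jobs without change when the backend changes.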
Abstract:
Anxious mothers’ parenting, particularly transfer of threat information, has been considered important in their children’s risk for social anxiety disorder (SAnxD), and maternal narratives concerning potential social threat could elucidate this contribution. Maternal narratives to their pre-school children (4-5 years old), delivered via a picture book about starting school, were assessed in socially anxious (N=73) and non-anxious (N=63) mothers. Child representations of school were assessed via Doll Play (DP). After one school term, mothers (CBCL) and teachers (TRF) reported on child internalizing problems, and child SAnxD was assessed via maternal interview. Relations between these variables, infant behavioral inhibition, and attachment were examined. Socially anxious mothers showed more negative (higher threat attribution) and less supportive (lower encouragement) narratives than controls, and their children’s DP representations, SAnxD and CBCL scores were more adverse. High narrative threat predicted child SAnxD; lower encouragement predicted negative child CBCL scores and, particularly for behaviorally inhibited children, TRF scores and DP representations. In securely attached children, CBCL scores and risk for SAnxD were affected by maternal anxiety and threat attributions, respectively. Low encouragement mediated the effects of maternal anxiety on child DP representations and CBCL scores. Maternal narratives are affected by social anxiety and contribute to adverse child outcomes.
Abstract:
The surface response to 11 year solar cycle variations is investigated by analyzing the long-term mean sea level pressure and sea surface temperature observations for the period 1870–2010. The analysis reveals a statistically significant 11 year solar signal over Europe and the North Atlantic, provided that the data are lagged by a few years. The delayed signal resembles the positive phase of the North Atlantic Oscillation (NAO) following a solar maximum. The corresponding sea surface temperature response is consistent with this. A similar analysis is performed on long-term climate simulations from a coupled ocean-atmosphere version of the Hadley Centre model that has an extended upper lid so that influences of solar variability via the stratosphere are well resolved. The model reproduces the positive NAO signal over the Atlantic/European sector, but the lag of the surface response is not well reproduced. Possible mechanisms for the lagged nature of the observed response are discussed.
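The lagged-signal analysis can be illustrated with synthetic series. This is a sketch only: the study correlates observed pressure and SST fields with solar activity, not sinusoids, and its statistical testing is far more careful than a bare correlation maximum.

```python
import math
from statistics import mean

def corr(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x)
                    * sum((b - my) ** 2 for b in y))
    return num / den

# Synthetic data: an 11-year solar cycle and a surface index (NAO proxy)
# that is the same cycle delayed by 3 years.
solar = [math.sin(2 * math.pi * t / 11) for t in range(140)]
nao = [math.sin(2 * math.pi * (t - 3) / 11) for t in range(140)]

def best_lag(driver, response, max_lag=5):
    """Correlate response(t) with driver(t - lag); return the lag that
    maximizes the correlation."""
    scores = {lag: corr(driver[:len(driver) - lag], response[lag:])
              for lag in range(max_lag + 1)}
    return max(scores, key=scores.get)

print(best_lag(solar, nao))   # recovers the imposed 3-year delay
```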
Abstract:
When considering adaptation measures and global climate mitigation goals, stakeholders need regional-scale climate projections, including the range of plausible warming rates. To assist these stakeholders, it is important to understand whether some locations may see disproportionately high or low warming from additional forcing above targets such as 2 K. There is a need to narrow uncertainty in this nonlinear warming, which requires understanding how climate changes as forcings increase from medium to high levels. However, quantifying and understanding regional nonlinear processes is challenging. Here we show that regional-scale warming can be strongly superlinear to successive CO2 doublings, using five different climate models. Ensemble-mean warming is superlinear over most land locations. Further, the inter-model spread tends to be amplified at higher forcing levels, as nonlinearities grow, especially when considering changes per kelvin of global warming. Regional nonlinearities in surface warming arise from nonlinearities in global-mean radiative balance, the Atlantic meridional overturning circulation, surface snow/ice cover and evapotranspiration. For robust adaptation and mitigation advice, therefore, potentially avoidable climate change (the difference between business-as-usual and mitigation scenarios) and unavoidable climate change (change under strong mitigation scenarios) may need different analysis methods.
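The notion of warming that is superlinear in successive CO2 doublings amounts to a simple comparison, sketched below with invented values: is the warming added by the second doubling (2x to 4x CO2) larger than the warming from the first?

```python
# Illustrative superlinearity check for a single location. Warming
# values (K, relative to preindustrial) are invented, not model output.
def is_superlinear(warming_2x, warming_4x):
    """True if the second CO2 doubling (2x -> 4x) warms more than the
    first doubling (preindustrial -> 2x)."""
    return (warming_4x - warming_2x) > warming_2x

# Hypothetical regional example: 3.0 K at 2xCO2, 7.2 K at 4xCO2, so the
# second doubling adds 4.2 K and the response is superlinear.
print(is_superlinear(warming_2x=3.0, warming_4x=7.2))
```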
Abstract:
Satellite-based (e.g., Synthetic Aperture Radar [SAR]) water level observations (WLOs) of the floodplain can be sequentially assimilated into a hydrodynamic model to decrease forecast uncertainty. This has the potential to keep the forecast on track, thereby providing an Earth Observation (EO) based flood forecast system. However, the operational applicability of such a system for floods developed over river networks requires further testing. One of the promising techniques for assimilation in this field is the family of ensemble Kalman filters (EnKF). These filters use a limited-size ensemble representation of the forecast error covariance matrix. This representation tends to develop spurious correlations as the forecast-assimilation cycle proceeds, which is a further complication for dealing with floods in either urban areas or river junctions in rural environments. Here we evaluate the assimilation of WLOs obtained from a sequence of real SAR overpasses (the X-band COSMO-SkyMed constellation) in a case study. We show that a direct application of a global Ensemble Transform Kalman Filter (ETKF) suffers from filter divergence caused by spurious correlations. However, a spatially-based filter localization provides a substantial moderation in the development of the forecast error covariance matrix, directly improving the forecast and also making it possible to further benefit from a simultaneous online inflow error estimation and correction. Additionally, we propose and evaluate a novel along-network metric for filter localization, which is physically meaningful for the flood-over-a-network problem. Using this metric, we further evaluate the simultaneous estimation of channel friction and spatially-variable channel bathymetry, for which the filter seems able to converge simultaneously to sensible values. Results also indicate that friction is a second-order effect in flood inundation models applied to gradually varied flow in large rivers.
The study is not conclusive regarding whether, in an operational situation, the simultaneous estimation of friction and bathymetry helps the current forecast. Overall, the results indicate the feasibility of stand-alone EO-based operational flood forecasting.
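The localization idea can be sketched as a distance-dependent tapering of the ensemble sample covariance (a Schur product). The exponential taper and the distances below are illustrative stand-ins: the study's actual scheme uses an along-network distance metric inside an ETKF, neither of which is shown here.

```python
import math

def sample_cov(ensemble):
    """Sample covariance from a list of equally weighted state vectors."""
    n, m = len(ensemble), len(ensemble[0])
    means = [sum(mem[j] for mem in ensemble) / n for j in range(m)]
    return [[sum((mem[i] - means[i]) * (mem[j] - means[j])
                 for mem in ensemble) / (n - 1)
             for j in range(m)] for i in range(m)]

def localize(cov, dist, length_scale):
    """Damp long-range (typically spurious) sample covariances by an
    elementwise distance-dependent taper."""
    return [[cov[i][j] * math.exp(-dist[i][j] / length_scale)
             for j in range(len(cov))] for i in range(len(cov))]

# Tiny 3-member, 3-variable ensemble and invented pairwise distances
# (e.g. km along the channel network).
ensemble = [[1.0, 2.0, 0.5], [1.2, 1.8, 0.7], [0.9, 2.1, 0.4]]
dist = [[0, 10, 200], [10, 0, 190], [200, 190, 0]]
cov = sample_cov(ensemble)
loc = localize(cov, dist, length_scale=50.0)
```

Nearby covariances (distance much less than the length scale) are nearly untouched, while distant entries, the ones most prone to sampling noise with a small ensemble, are strongly damped.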
Abstract:
Any reduction in global mean near-surface temperature due to a future decline in solar activity is likely to be a small fraction of projected anthropogenic warming. However, variability in ultraviolet solar irradiance is linked to modulation of the Arctic and North Atlantic Oscillations, suggesting the potential for larger regional surface climate effects. Here, we explore possible impacts through two experiments designed to bracket uncertainty in ultraviolet irradiance in a scenario in which future solar activity decreases to Maunder Minimum-like conditions by 2050. Both experiments show regional structure in the wintertime response, resembling the North Atlantic Oscillation, with enhanced relative cooling over northern Eurasia and the eastern United States. For a high-end decline in solar ultraviolet irradiance, the impact on winter northern European surface temperatures over the late twenty-first century could be a significant fraction of the difference in climate change between plausible AR5 scenarios of greenhouse gas concentrations.
Abstract:
The topography of many floodplains in the developed world has now been surveyed with high resolution sensors such as airborne LiDAR (Light Detection and Ranging), giving accurate Digital Elevation Models (DEMs) that facilitate accurate flood inundation modelling. This is not always the case for remote rivers in developing countries. However, the accuracy of DEMs produced for modelling studies on such rivers should be enhanced in the near future by the high resolution TanDEM-X WorldDEM. In a parallel development, increasing use is now being made of flood extents derived from high resolution Synthetic Aperture Radar (SAR) images for calibrating, validating and assimilating observations into flood inundation models in order to improve these. This paper discusses an additional use of SAR flood extents, namely to improve the accuracy of the TanDEM-X DEM in the floodplain covered by the flood extents, thereby permanently improving this DEM for future flood modelling and other studies. The method is based on the fact that for larger rivers the water elevation generally changes only slowly along a reach, so that the boundary of the flood extent (the waterline) can be regarded locally as a quasi-contour. As a result, heights of adjacent pixels along a small section of waterline can be regarded as samples with a common population mean. The height of the central pixel in the section can be replaced with the average of these heights, leading to a more accurate estimate. While this will result in a reduction in the height errors along a waterline, the waterline is a linear feature in a two-dimensional space. 
However, improvements to the DEM heights between adjacent pairs of waterlines can also be made, because DEM heights enclosed by the higher waterline of a pair must be at least no higher than the corrected heights along the higher waterline, whereas DEM heights not enclosed by the lower waterline must in general be no lower than the corrected heights along the lower waterline. In addition, DEM heights between the higher and lower waterlines can also be assigned smaller errors because of the reduced errors on the corrected waterline heights. The method was tested on a section of the TanDEM-X Intermediate DEM (IDEM) covering an 11 km reach of the Warwickshire Avon, England. Flood extents from four COSMO-SkyMed images were available at various stages of a flood in November 2012, and a LiDAR DEM was available for validation. In the area covered by the flood extents, the original IDEM heights had a mean difference from the corresponding LiDAR heights of 0.5 m with a standard deviation of 2.0 m, while the corrected heights had a mean difference of 0.3 m with standard deviation 1.2 m. These figures show that significant reductions in IDEM height bias and error can be made using the method, with the corrected error being only 60% of the original. Even if only a single SAR image obtained near the peak of the flood was used, the corrected error was only 66% of the original. The method should also be capable of improving the final TanDEM-X DEM and other DEMs, and may also be of use with data from the SWOT (Surface Water and Ocean Topography) satellite.
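The local height-averaging step behind the waterline correction can be sketched as a moving average. Window size and heights below are illustrative; the actual method operates on DEM pixels along a SAR-derived waterline and exploits the quasi-contour assumption that nearby waterline heights share a common mean.

```python
# Sketch: treat heights of adjacent waterline pixels as samples of a
# common population mean, and replace each height with a local average,
# reducing the random error roughly as 1/sqrt(window size).
def smooth_waterline(heights, half_window=2):
    """Moving average along the waterline (quasi-contour assumption)."""
    out = []
    for i in range(len(heights)):
        lo = max(0, i - half_window)
        hi = min(len(heights), i + half_window + 1)
        seg = heights[lo:hi]
        out.append(sum(seg) / len(seg))
    return out

# Invented noisy DEM heights (m) along a short waterline section:
noisy = [10.2, 9.7, 10.4, 9.9, 10.1, 9.8, 10.3]
print([f"{h:.2f}" for h in smooth_waterline(noisy)])
```

The between-waterline step of the method would then clamp interior DEM heights against these corrected waterline heights (no higher than the higher waterline, no lower than the lower one); that inequality-constraint step is not sketched here.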