Abstract:
The Representative Soil Sampling Scheme of England and Wales has recorded information on the soil of agricultural land in England and Wales since 1969. It is a valuable source of information about the soil in the context of monitoring for sustainable agricultural development. Changes in soil nutrient status and pH were examined over the period 1971-2001. Several methods of statistical analysis were applied to data from the surveys during this period. The main focus here is on the data for 1971, 1981, 1991 and 2001. Examination of change over time shows that, in general, levels of potassium in the soil have increased, those of magnesium have remained fairly constant, those of phosphorus have declined, and pH has changed little. Future sampling needs have been assessed in the context of monitoring, to determine the mean at a given level of confidence and tolerable error, and to detect change in the mean over time at these same levels over periods of 5 and 10 years. The results of a non-hierarchical multivariate classification suggest that England and Wales could be stratified to optimize future sampling and analysis. To monitor soil quality and health more generally than for agriculture, more of the country should be sampled and a wider range of properties recorded.
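The sample-size assessment described above (estimating a mean at a given confidence and tolerable error, and detecting change in the mean between surveys) can be sketched with standard normal-theory formulas. This is not the authors' calculation; the standard deviation and tolerance values below are invented for illustration only:

```python
from math import ceil
from statistics import NormalDist

def n_for_mean(sd, tol, conf=0.95):
    """Samples needed to estimate a mean to within +/- tol at the given confidence."""
    z = NormalDist().inv_cdf(1 - (1 - conf) / 2)
    return ceil((z * sd / tol) ** 2)

def n_for_change(sd, delta, conf=0.95, power=0.8):
    """Samples per survey needed to detect a change `delta` in the mean
    between two surveys (two-sample z-test, equal sample sizes)."""
    z_a = NormalDist().inv_cdf(1 - (1 - conf) / 2)
    z_b = NormalDist().inv_cdf(power)
    return ceil(2 * ((z_a + z_b) * sd / delta) ** 2)

# Illustrative numbers only (not from the survey): between-site sd of
# soil pH of 0.6 units, tolerable error of +/- 0.1 units.
print(n_for_mean(0.6, 0.1))    # 139
print(n_for_change(0.6, 0.1))  # 566
```

Detecting a change of a given size requires substantially more samples than estimating the mean to the same tolerance, which is why the monitoring horizon (5 vs. 10 years) matters in such assessments.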
Abstract:
The commonly held view of the conditions in the North Atlantic at the last glacial maximum, based on the interpretation of proxy records, is of large-scale cooling compared to today, limited deep convection, and extensive sea ice, all associated with a southward displaced and weakened overturning thermohaline circulation (THC) in the North Atlantic. Not all studies support that view; in particular, the "strength of the overturning circulation" is contentious and is a quantity that is difficult to determine even for the present day. Quasi-equilibrium simulations with coupled climate models forced by glacial boundary conditions have produced differing results, as have inferences made from proxy records. Most studies suggest a weaker circulation, some suggest little or no change, and a few suggest a stronger circulation. Here results are presented from a three-dimensional climate model, the Hadley Centre Coupled Model version 3 (HadCM3), of the coupled atmosphere - ocean - sea ice system suggesting, in a qualitative sense, that these diverging views could all be valid, with different modes of the circulation existing at different times during the last glacial period. One mode might have been characterized by an active THC associated with moderate temperatures in the North Atlantic and a modest expanse of sea ice. The other mode, perhaps forced by large inputs of meltwater from the continental ice sheets into the northern North Atlantic, might have been characterized by a sluggish THC associated with very cold conditions around the North Atlantic and a large areal cover of sea ice. The authors' model simulation of such a mode, forced by a large input of freshwater, bears several of the characteristics of the Climate: Long-range Investigation, Mapping, and Prediction (CLIMAP) Project's reconstruction of glacial sea surface temperature and sea ice extent.
Abstract:
Numerous factors are associated with poverty and underdevelopment in Africa, including climate variability. Rainfall, and climate more generally, are implicated directly in the United Nations "Millennium Development Goals" to eradicate extreme poverty and hunger, and reduce child mortality and incidence of diseases such as malaria by the target date of 2015. But Africa is not currently on target to meet these goals. We pose a number of questions from a climate science perspective aimed at understanding this background: Is there a common origin to factors that currently constrain climate science? Why, in a continent where human activity is so closely linked to interannual rainfall variability, has climate science received so little of the benefit that commercialization brought to meteorology in the developed world? What might be suggested as an effective way for the continent to approach future climate variability and change? We make the case that a route to addressing the challenges of climate change in Africa rests with the improved management of climate variability. We start by discussing the constraints on climate science and how they might be overcome. We explain why the optimal management of activities directly influenced by interannual climate variability (which include the development of scientific capacity) has the potential to serve as a forerunner to engagement in the wider issue of climate change. We show this both from the perspective of the climate system and the institutions that engage with climate issues. We end with a thought experiment that tests the benefits of linking climate variability and climate change in the setting of smallholder farmers in Limpopo Province, South Africa.
Abstract:
An update of Owens et al. (2008) shows that the relationship between the coronal mass ejection (CME) rate and the heliospheric magnetic field strength predicts a field floor of less than 4 nT at 1 AU. This implies that the record low values measured during this solar minimum do not necessarily contradict the idea that open flux is conserved. The results are consistent with the hypothesis that CMEs add flux to the heliosphere and interchange reconnection between open flux and closed CME loops subtracts flux. An existing model embracing this hypothesis, however, overestimates flux during the current minimum, even though the CME rate has been low. The discrepancy calls for reasonable changes in model assumptions.
Abstract:
In April–July 2008, intensive measurements were made of atmospheric composition and chemistry in Sabah, Malaysia, as part of the "Oxidant and particle photochemical processes above a South-East Asian tropical rainforest" (OP3) project. Fluxes and concentrations of trace gases and particles were measured from and above the rainforest canopy at the Bukit Atur Global Atmosphere Watch station and at the nearby Sabahmas oil palm plantation, using both ground-based and airborne measurements. Here, the measurement and modelling strategies used, the characteristics of the sites and an overview of data obtained are described. Composition measurements show that the rainforest site was not significantly impacted by anthropogenic pollution, and this is confirmed by satellite retrievals of NO2 and HCHO. The dominant modulators of atmospheric chemistry at the rainforest site were therefore emissions of BVOCs and soil emissions of reactive nitrogen oxides. At the observed BVOC:NOx volume mixing ratio (~100 pptv/pptv), current chemical models suggest that daytime maximum OH concentrations should be ca. 10^5 radicals cm^−3, but observed OH concentrations were an order of magnitude greater than this. We confirm, therefore, previous measurements that suggest that an unexplained source of OH must exist above tropical rainforest and we continue to interrogate the data to find explanations for this.
Abstract:
More than half the world's rainforest has been lost to agriculture since the Industrial Revolution. Among the most widespread tropical crops is oil palm (Elaeis guineensis): global production now exceeds 35 million tonnes per year. In Malaysia, for example, 13% of land area is now oil palm plantation, compared with 1% in 1974. There are enormous pressures to increase palm oil production for food, domestic products, and, especially, biofuels. Greater use of palm oil for biofuel production is predicated on the assumption that palm oil is an "environmentally friendly" fuel feedstock. Here we show, using measurements and models, that oil palm plantations in Malaysia directly emit more oxides of nitrogen and volatile organic compounds than rainforest. These compounds lead to the production of ground-level ozone (O3), an air pollutant that damages human health, plants, and materials, reduces crop productivity, and has effects on the Earth's climate. Our measurements show that, at present, O3 concentrations do not differ significantly over rainforest and adjacent oil palm plantation landscapes. However, our model calculations predict that if concentrations of oxides of nitrogen in Borneo are allowed to reach those currently seen over rural North America and Europe, ground-level O3 concentrations will reach 100 parts per billion (10^9) by volume (ppbv) and exceed levels known to be harmful to human health. Our study provides an early warning of the urgent need to develop policies that manage nitrogen emissions if the detrimental effects of palm oil production on air quality and climate are to be avoided.
Abstract:
The double triangular test was introduced twenty years ago, and the purpose of this paper is to review applications that have been made since then. In fact, take-up of the method was rather slow until the late 1990s, but in recent years several clinical trial reports have been published describing its use in a wide range of therapeutic areas. The core of this paper is a detailed account of five trials that have been published since 2000 in which the method was applied to studies of pancreatic cancer, breast cancer, myocardial infarction, epilepsy and bedsores. Before those accounts are given, the method is described and the history behind its evolution is presented. The future potential of the method for sequential case-control and equivalence trials is also discussed. Copyright © 2004 John Wiley & Sons, Ltd.
Abstract:
In clinical trials, situations often arise where more than one response from each patient is of interest, and it is required that any decision to stop the study be based upon some or all of these measures simultaneously. Theory for the design of sequential experiments with simultaneous bivariate responses is described by Jennison and Turnbull (Jennison, C., Turnbull, B. W. (1993). Group sequential tests for bivariate response: interim analyses of clinical trials with both efficacy and safety endpoints. Biometrics 49:741-752) and Cook and Farewell (Cook, R. J., Farewell, V. T. (1994). Guidelines for monitoring efficacy and toxicity responses in clinical trials. Biometrics 50:1146-1152) in the context of one efficacy and one safety response. These expositions are in terms of normally distributed data with known covariance. The methods proposed require specification of the correlation, ρ, between test statistics monitored as part of the sequential test. It can be difficult to quantify ρ, and previous authors have suggested simply taking the lowest plausible value, as this will guarantee power. This paper begins with an illustration of the effect that inappropriate specification of ρ can have on the preservation of trial error rates. It is shown that both the type I error and the power can be adversely affected. As a possible solution to this problem, formulas are provided for the calculation of correlation from data collected as part of the trial. An adaptive approach is proposed and evaluated that makes use of these formulas and an example is provided to illustrate the method. Attention is restricted to the bivariate case for ease of computation, although the formulas derived are applicable in the general multivariate case.
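The adaptive approach above depends on estimating ρ from data accumulated during the trial. The abstract does not reproduce its formulas; as a minimal sketch under simplifying assumptions (a plain plug-in Pearson estimate from simulated bivariate normal efficacy/safety pairs, with all numbers invented for illustration):

```python
import random

def pearson(xs, ys):
    """Sample Pearson correlation: a plug-in estimate of rho from interim data."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

# Simulate hypothetical bivariate (efficacy, safety) responses; rho_true
# is illustrative, not a value from the paper.
random.seed(1)
rho_true = 0.4
eff, saf = [], []
for _ in range(200):
    e = random.gauss(0, 1)
    eff.append(e)
    saf.append(rho_true * e + (1 - rho_true ** 2) ** 0.5 * random.gauss(0, 1))

rho_hat = pearson(eff, saf)  # interim estimate, rather than a guessed lower bound
```

An adaptive design in this spirit would recompute `rho_hat` at each interim analysis and use it, rather than a conservative "lowest plausible" value, when setting the stopping boundaries.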
Abstract:
A number of authors have proposed clinical trial designs involving the comparison of several experimental treatments with a control treatment in two or more stages. At the end of the first stage, the most promising experimental treatment is selected, and all other experimental treatments are dropped from the trial. Provided it is good enough, the selected experimental treatment is then compared with the control treatment in one or more subsequent stages. The analysis of data from such a trial is problematic because of the treatment selection and the possibility of stopping at interim analyses. These aspects lead to bias in the maximum-likelihood estimate of the advantage of the selected experimental treatment over the control and to inaccurate coverage for the associated confidence interval. In this paper, we evaluate the bias of the maximum-likelihood estimate and propose a bias-adjusted estimate. We also propose an approach to the construction of a confidence region for the vector of advantages of the experimental treatments over the control based on an ordering of the sample space. These regions are shown to have accurate coverage, although they are also shown to be necessarily unbounded. Confidence intervals for the advantage of the selected treatment are obtained from the confidence regions and are shown to have more accurate coverage than the standard confidence interval based upon the maximum-likelihood estimate and its asymptotic standard error.
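The selection bias described above is easy to reproduce by simulation. The following Monte Carlo sketch is not the authors' evaluation (arm count, sample size, and effect sizes are invented); it shows that the naive maximum-likelihood estimate of the selected arm's advantage is biased upward even when every experimental arm is in truth identical to control:

```python
import random

random.seed(42)
K, N, SIGMA = 3, 100, 1.0      # experimental arms, patients per arm, response sd
TRUE_ADV = [0.0] * K           # all experimental arms truly equal to control

def simulate_one():
    """One trial: pick the best-looking arm, return its naive advantage estimate."""
    control_mean = sum(random.gauss(0.0, SIGMA) for _ in range(N)) / N
    arm_means = [sum(random.gauss(adv, SIGMA) for _ in range(N)) / N
                 for adv in TRUE_ADV]
    return max(arm_means) - control_mean  # MLE for the selected arm

estimates = [simulate_one() for _ in range(2000)]
bias = sum(estimates) / len(estimates)  # positive: selection inflates the estimate
```

Because the true advantage is zero in every arm, the average of `bias` over many simulated trials directly measures the selection effect that the paper's bias-adjusted estimate is designed to remove.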