997 results for air surveillance
Abstract:
Air traffic condensation trails, or contrails, are believed to have a net atmospheric warming effect(1), although one that is currently small compared to that induced by other sources of human emissions. However, the comparably large growth rate of air traffic requires an improved understanding of the resulting impact of aircraft radiative forcing on climate(2). Contrails have an effect on the Earth's energy balance similar to that of high thin ice clouds(3). Their trapping of outgoing longwave radiation emitted by the Earth and atmosphere (positive radiative forcing) is partly compensated by their reflection of incoming solar radiation (negative radiative forcing). On average, the longwave effect dominates and the net contrail radiative forcing is believed to be positive(1,2,4). Over daily and annual timescales, varying levels of air traffic, meteorological conditions, and solar insolation influence the net forcing effect of contrails. Here we determine the factors most important for contrail climate forcing using a sophisticated radiative transfer model(5,6) for a site in southeast England, located in the entrance to the North Atlantic flight corridor. We find that night-time flights during winter (December to February) are responsible for most of the contrail radiative forcing. Night flights account for only 25 per cent of daily air traffic, but contribute 60 to 80 per cent of the contrail forcing. Further, winter flights account for only 22 per cent of annual air traffic, but contribute half of the annual mean forcing. These results suggest that flight rescheduling could help to minimize the climate impact of aviation.
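The disproportion quoted above can be made explicit with a little arithmetic. The sketch below only restates the abstract's traffic and forcing shares as a forcing-per-unit-traffic ratio; the function name and the ratio itself are illustrative devices, not quantities from the paper.

```python
# Illustrative arithmetic only: the traffic and forcing shares come from the
# abstract above; the forcing-per-unit-traffic ratio is a derived quantity used
# here to make the disproportion explicit.

def forcing_per_traffic(traffic_share, forcing_share):
    """Ratio of a flight category's share of contrail forcing to its share of traffic."""
    return forcing_share / traffic_share

# Night flights: 25% of daily traffic, 60-80% of the contrail forcing.
print(forcing_per_traffic(0.25, 0.60))  # 2.4
print(forcing_per_traffic(0.25, 0.80))  # 3.2

# Winter flights: 22% of annual traffic, roughly half of the annual mean forcing.
print(forcing_per_traffic(0.22, 0.50))  # ~2.27
```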
Abstract:
With both climate change and air quality on political and social agendas from the local to the global scale, the links between these hitherto separate fields are becoming more apparent. Black carbon, largely from combustion processes, scatters and absorbs incoming solar radiation, contributes to poor air quality and induces respiratory and cardiovascular problems. Uncertainties in the amount, location, size and shape of atmospheric black carbon cause large uncertainty in both climate change estimates and toxicology studies alike. Increased research has uncovered new effects and new areas of uncertainty. Here we draw together recent results and explore the increasing opportunities for synergistic research that will lead to improved confidence in the impact of black carbon on climate change, air quality and human health. Topics of mutual interest include better information on spatial distribution, size, mixing state, and measurement and monitoring. (c) 2006 Elsevier Ltd. All rights reserved.
Abstract:
We have previously placed the solar contribution to recent global warming in context using observations and without recourse to climate models. It was shown that all solar forcings of climate have declined since 1987. The present paper extends that analysis to include the effects of the various time constants with which the Earth’s climate system might react to solar forcing. The solar input waveform over the past 100 years is defined using observed and inferred galactic cosmic ray fluxes, valid for either a direct effect of cosmic rays on climate or an effect via their known correlation with total solar irradiance (TSI), or for a combination of the two. The implications, and the relative merits, of the various TSI composite data series are discussed and independent tests reveal that the PMOD composite used in our previous paper is the most realistic. Use of the ACRIM composite, which shows a rise in TSI over recent decades, is shown to be inconsistent with most published evidence for solar influences on pre-industrial climate. The conclusions of our previous paper, that solar forcing has declined over the past 20 years while surface air temperatures have continued to rise, are shown to apply for the full range of potential time constants for the climate response to the variations in the solar forcings.
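The "time constants with which the Earth's climate system might react" can be pictured with a first-order response model. The sketch below is a minimal illustration of that idea only: the toy 11-year sinusoidal forcing, the candidate time constants and the sensitivity value are placeholders, not the GCR or TSI composites analysed in the paper.

```python
import numpy as np

# Minimal first-order (single time constant) response sketch:
# dT/dt = (sensitivity * F(t) - T) / tau.
# The forcing series and parameter values are placeholders for illustration.

def linear_response(forcing, dt_years, tau_years, sensitivity):
    """Discrete integration of dT/dt = (sensitivity * F - T) / tau."""
    temp = np.zeros_like(forcing)
    for i in range(1, len(forcing)):
        dT = (sensitivity * forcing[i - 1] - temp[i - 1]) * dt_years / tau_years
        temp[i] = temp[i - 1] + dT
    return temp

years = np.arange(1908, 2008, dtype=float)
forcing = np.sin(2 * np.pi * (years - years[0]) / 11.0)   # toy 11-year solar cycle
for tau in (1.0, 5.0, 30.0):                              # range of candidate time constants
    response = linear_response(forcing, dt_years=1.0, tau_years=tau, sensitivity=0.1)
    print(tau, round(float(response[-1]), 4))
```

A longer time constant both damps and lags the response to the same input waveform, which is why the assumed time constant matters for attributing recent trends.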
Abstract:
A multivariate fit to the variation in global mean surface air temperature anomaly over the past half century is presented. The fit procedure allows for the effect of response time on the waveform, amplitude and lag of each radiative forcing input, and each is allowed to have its own time constant. It is shown that the contribution of solar variability to the temperature trend since 1987 is small and downward; the best estimate is -1.3% and the 2σ confidence level sets the uncertainty range of -0.7 to -1.9%. The result is the same if one quantifies the solar variation using galactic cosmic ray fluxes (for which the analysis can be extended back to 1953) or the most accurate total solar irradiance data composite. The rise in global mean surface air temperature is predominantly associated with a linear increase that represents the combined effects of changes in anthropogenic well-mixed greenhouse gases and aerosols, although, in recent decades, there is also a considerable contribution by a relative lack of major volcanic eruptions. The best estimate is that the anthropogenic factors contribute 75% of the rise since 1987, with an uncertainty range (set by the 2σ confidence level using an AR(1) noise model) of 49–160%; thus, the uncertainty is large, but we can state that at least half of the temperature trend comes from the linear term and that this term could explain the entire rise. The results are consistent with the Intergovernmental Panel on Climate Change (IPCC) estimates of the changes in radiative forcing (given for 1961–1995) and are here combined with those estimates to find the response times, equilibrium climate sensitivities and pertinent heat capacities (i.e. the depth into the oceans to which a given radiative forcing variation penetrates) of the quasi-periodic (decadal-scale) input forcing variations. As shown by previous studies, the decadal-scale variations do not penetrate as deeply into the oceans as the longer term drifts and have shorter response times. Hence, conclusions about the response to century-scale forcing changes (and hence the associated equilibrium climate sensitivity and the temperature rise commitment) cannot be made from studies of the response to shorter period forcing changes.
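A stripped-down version of such a multivariate fit is sketched below: each forcing series is passed through its own first-order response filter and regressed, together with a linear drift term, onto a temperature series. All series, time constants and coefficients here are synthetic placeholders, and the AR(1) noise model and confidence intervals of the paper are not reproduced.

```python
import numpy as np

def respond(forcing, tau):
    """First-order response filter with time constant tau (in time steps)."""
    out = np.zeros_like(forcing)
    for i in range(1, len(forcing)):
        out[i] = out[i - 1] + (forcing[i - 1] - out[i - 1]) / tau
    return out

years = np.arange(1953, 2008, dtype=float)
solar = np.sin(2 * np.pi * (years - years[0]) / 11.0)      # toy solar-cycle proxy
volcanic = -1.0 * (np.abs(np.sin(0.5 * years)) > 0.95)     # toy episodic volcanic forcing
linear = years - years[0]                                  # combined anthropogenic drift term

# Synthetic "observed" temperature built from the same ingredients plus the drift.
temperature = 0.01 * linear + 0.05 * respond(solar, 2.0) + 0.1 * respond(volcanic, 1.5)

# Design matrix: response-filtered forcings, the linear term and an intercept.
X = np.column_stack([respond(solar, 2.0), respond(volcanic, 1.5), linear, np.ones_like(years)])
coef, *_ = np.linalg.lstsq(X, temperature, rcond=None)
print(dict(zip(["solar", "volcanic", "linear", "intercept"], np.round(coef, 4))))
```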
Abstract:
Biological emergencies such as the appearance of an exotic transboundary or emerging disease can become disasters. The question that faces Veterinary Services in developing countries is how to balance resources dedicated to active insurance measures, such as border control, surveillance, working with the governments of developing countries, and investing in improving veterinary knowledge and tools, with passive measures, such as contingency funds and vaccine banks. There is strong evidence that the animal health situation in developed countries has improved and is relatively stable. In addition, through trade with other countries, developing countries are becoming part of the international animal health system, the status of which is improving, though with occasional setbacks. However, despite these improvements, the risk of a possible biological disaster still remains, and has increased in recent times because of the threat of bioterrorism. This paper suggests that a model that combines decision tree analysis with epidemiology is required to identify critical points in food chains that should be strengthened to reduce the risk of emergencies and prevent emergencies from becoming disasters.
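The decision-tree idea can be illustrated with a toy expected-cost comparison between spending on active measures up front and relying on passive measures. Every probability and cost below is invented for illustration; none comes from the paper.

```python
# Toy expected-cost comparison in the spirit of the decision-tree analysis
# suggested above.  All probabilities and costs are invented placeholders.

def expected_cost(upfront_cost, p_disaster, disaster_cost):
    """Expected cost of a strategy: fixed spend plus probability-weighted disaster loss."""
    return upfront_cost + p_disaster * disaster_cost

active = expected_cost(upfront_cost=5.0, p_disaster=0.02, disaster_cost=100.0)   # border control, surveillance
passive = expected_cost(upfront_cost=1.0, p_disaster=0.10, disaster_cost=100.0)  # contingency funds, vaccine banks

print("active measures:", active)    # 7.0
print("passive measures:", passive)  # 11.0
```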
Abstract:
In this paper, we apply one-list capture-recapture models to estimate the number of scrapie-affected holdings in Great Britain. We applied this technique to the Compulsory Scrapie Flocks Scheme dataset, which gathers cases from all the surveillance sources monitoring the presence of scrapie in Great Britain: the abattoir survey, the fallen stock survey and the statutory reporting of clinical cases. Consequently, the estimates of prevalence obtained from this scheme should be comprehensive and cover all the different presentations of the disease captured individually by the surveillance sources. Two estimators were applied under the one-list approach: the Zelterman estimator and Chao's lower bound estimator. Our results could only estimate with confidence the population of scrapie-affected holdings with clinical disease; this was around 350 holdings in Great Britain for the period under study, April 2005 to April 2006. Our models allowed stratification by surveillance source and the inclusion of covariate information (holding size and country of origin). None of the covariates appeared to inform the model significantly. Crown Copyright (C) 2008 Published by Elsevier B.V. All rights reserved.
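The two estimators named above have compact standard forms, reproduced in the sketch below from their textbook definitions. Here n is the number of holdings observed at least once and f1, f2 are the numbers observed exactly once and exactly twice; the example counts are invented, not Compulsory Scrapie Flocks Scheme data.

```python
import math

def chao_lower_bound(n, f1, f2):
    """Chao's lower bound for population size: N_hat = n + f1^2 / (2 * f2)."""
    return n + f1 ** 2 / (2.0 * f2)

def zelterman(n, f1, f2):
    """Zelterman's estimator: N_hat = n / (1 - exp(-2 * f2 / f1))."""
    lam = 2.0 * f2 / f1          # Poisson rate estimated from the singletons and doubletons
    return n / (1.0 - math.exp(-lam))

n, f1, f2 = 200, 120, 40         # illustrative counts only
print(round(chao_lower_bound(n, f1, f2)))  # 380
print(round(zelterman(n, f1, f2)))         # about 411
```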
Abstract:
The abattoir and the fallen stock surveys constitute the active surveillance component aimed at improving the detection of scrapie across the European Union. Previous studies have suggested the occurrence of significant differences in the operation of the surveys across the EU. In the present study we assessed the standardisation of the surveys over time across the EU and identified clusters of countries with similar underlying characteristics, allowing comparisons between them. In the absence of sufficient covariate information to explain the observed variability across countries, we modelled the unobserved heterogeneity by means of non-parametric distributions on the risk ratios of the fallen stock over the abattoir survey. More specifically, we used the profile likelihood method on 2003, 2004 and 2005 active surveillance data for 18 European countries on classical scrapie, and on 2004 and 2005 data for atypical scrapie separately. We extended our analyses to include the limited covariate information available, more specifically the proportion of the adult sheep population sampled by the fallen stock survey every year. Our results show that the between-country heterogeneity dropped in 2004 and 2005 relative to that of 2003 for classical scrapie. As a consequence, the number of clusters in the last two years was also reduced, indicating the gradual standardisation of surveillance efforts across the EU. The crude analyses of the atypical data grouped all the countries in one cluster and showed no significant gain in the detection of this type of scrapie by either of the two sources. The proportion of the population sampled by the fallen stock survey appeared significantly associated with our risk ratio for both types of scrapie, although in opposite directions: negative for classical and positive for atypical. The initial justification for the fallen stock survey, targeting a high-risk population to increase the likelihood of case finding, appears compromised for both types of scrapie in some countries.
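The crude per-country quantity behind that analysis is the risk ratio of the detection rate in the fallen stock survey over that in the abattoir survey. The sketch below computes only this crude ratio with a standard Wald interval on the log scale; the non-parametric mixture and profile-likelihood steps of the paper are not reproduced, and all counts are invented.

```python
import math

def risk_ratio(cases_fs, tested_fs, cases_ab, tested_ab, z=1.96):
    """Crude risk ratio (fallen stock over abattoir) with a Wald CI on the log scale."""
    rr = (cases_fs / tested_fs) / (cases_ab / tested_ab)
    se_log = math.sqrt(1 / cases_fs - 1 / tested_fs + 1 / cases_ab - 1 / tested_ab)
    lower = math.exp(math.log(rr) - z * se_log)
    upper = math.exp(math.log(rr) + z * se_log)
    return rr, lower, upper

# Example: 30 cases in 5,000 fallen stock samples vs 50 cases in 40,000 abattoir samples.
print(risk_ratio(30, 5000, 50, 40000))  # RR = 4.8, roughly (3.1, 7.5)
```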
Abstract:
This investigation deals with the question of when a particular population can be considered to be disease-free. The motivation is the case of BSE where specific birth cohorts may present distinct disease-free subpopulations. The specific objective is to develop a statistical approach suitable for documenting freedom of disease, in particular, freedom from BSE in birth cohorts. The approach is based upon a geometric waiting time distribution for the occurrence of positive surveillance results and formalizes the relationship between design prevalence, cumulative sample size and statistical power. The simple geometric waiting time model is further modified to account for the diagnostic sensitivity and specificity associated with the detection of disease. This is exemplified for BSE using two different models for the diagnostic sensitivity. The model is furthermore modified in such a way that a set of different values for the design prevalence in the surveillance streams can be accommodated (prevalence heterogeneity) and a general expression for the power function is developed. For illustration, numerical results for BSE suggest that currently (data status September 2004) a birth cohort of Danish cattle born after March 1999 is free from BSE with probability (power) of 0.8746 or 0.8509, depending on the choice of a model for the diagnostic sensitivity.
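In its simplest form, the relationship the abstract formalises can be written as: with design prevalence p, diagnostic sensitivity Se and near-perfect specificity, the probability that n samples yield at least one positive is 1 - (1 - p * Se)^n. The sketch below evaluates only this simplified expression; the paper's refinements (prevalence heterogeneity across surveillance streams, age-dependent sensitivity models) are not reproduced, and the numbers are illustrative.

```python
def power_freedom(design_prevalence, sensitivity, n_samples):
    """Probability of at least one positive result if disease is present at the design prevalence."""
    p_positive_per_sample = design_prevalence * sensitivity
    return 1.0 - (1.0 - p_positive_per_sample) ** n_samples

# Illustrative values only: design prevalence 1 in 10,000, test sensitivity 0.9, 20,000 samples.
print(round(power_freedom(1e-4, 0.9, 20000), 4))  # ~0.835
```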