900 results for Global Analysis
Abstract:
We have previously placed the solar contribution to recent global warming in context using observations and without recourse to climate models. It was shown that all solar forcings of climate have declined since 1987. The present paper extends that analysis to include the effects of the various time constants with which the Earth’s climate system might react to solar forcing. The solar input waveform over the past 100 years is defined using observed and inferred galactic cosmic ray fluxes, valid for either a direct effect of cosmic rays on climate or an effect via their known correlation with total solar irradiance (TSI), or for a combination of the two. The implications, and the relative merits, of the various TSI composite data series are discussed and independent tests reveal that the PMOD composite used in our previous paper is the most realistic. Use of the ACRIM composite, which shows a rise in TSI over recent decades, is shown to be inconsistent with most published evidence for solar influences on pre-industrial climate. The conclusions of our previous paper, that solar forcing has declined over the past 20 years while surface air temperatures have continued to rise, are shown to apply for the full range of potential time constants for the climate response to the variations in the solar forcings.
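To illustrate the idea of a climate response time constant, the sketch below convolves a toy solar-style forcing with a single-exponential response; the forcing series, time constant and sensitivity are illustrative assumptions, not values from the paper.

```python
# A minimal sketch (not the paper's method): the response of a climate-like
# system with a single time constant tau to a prescribed forcing F(t), via
# dT/dt = (lam * F(t) - T) / tau, integrated with forward Euler.
# The forcing series and parameter values are illustrative only.
import numpy as np

years = np.arange(1900, 2001)                        # annual grid, 1900-2000
F = 0.1 * np.sin(2 * np.pi * (years - 1900) / 11.0)  # toy 11-yr cycle forcing, W m^-2
tau = 10.0       # assumed response time constant, years
lam = 0.8        # assumed sensitivity, K per W m^-2

T = np.zeros_like(F)
for i in range(1, len(F)):
    T[i] = T[i - 1] + (lam * F[i - 1] - T[i - 1]) / tau   # dt = 1 year

# A long time constant damps and delays the cyclic response.
print("forcing amplitude x sensitivity (K):", round(lam * F.max(), 3))
print("steady-state response amplitude (K):", round(float(T[50:].max()), 3))
```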
Abstract:
Discrepancies between recent global Earth albedo anomaly data obtained from climate models, space observations and ground observations call for a new and better Earth reflectance measurement technique. The SALEX (Space Ashen Light Explorer) instrument is a space-based visible and IR instrument for precise estimation of the global Earth albedo; it measures the ashen light reflected off the shadowed side of the Moon from low Earth orbit. The instrument consists of a conventional 2-mirror telescope feeding a 3-mirror visible imager and an IR bolometer. The performance of this unique multi-channel optical system is sensitive to stray light contamination because of the complex optical train incorporating several reflecting and refracting elements, their associated mounts and the payload mechanical enclosure. This could be further aggravated by the very bright and extended observation target (i.e. the Moon). In this paper, we report the details of an extensive stray light analysis, including ghosts and cross-talk, leading to the optimum set of stray light precautions for the highest attainable signal-to-noise ratio.
Abstract:
Purpose - The purpose of this paper is to offer an exploratory case study comparing one Brazilian beef processor's relationships in supplying two different distribution channels: an EU importer and an EU retail chain operating in Brazil. Design/methodology/approach - The paper begins with a short review of global value chains and the recent literature on trust. It gives the background to the Brazilian beef chain and presents data obtained through in-depth interviews, annual reports and direct observation with the Brazilian beef processor, the EU importer and the retailer. The interviews were conducted with individual firms, but the analysis places them in a chain context, identifying the links and relationships between the agents of the chains and aiming to describe each distribution channel. Findings - Executive chain governance exercised by the domestic retailer stimulates technical upgrading and the transfer of best practices to local suppliers. Consequently, this kind of relationship results in more trust within the global value chain. Practical implications - There are difficulties and challenges facing this Brazilian beef processor that are partly related to the need to comply with increasingly complex and demanding food safety and food quality standards. There is still a gap between practices adopted for the export market and practices adopted locally. The strategies of transnational retailers in offering differentiated beef should be taken into account. Originality/value - The research outlines an interdisciplinary framework able to explain chain relationships and the kind of trust that emerges in relationships between an EU importer/retailer and a developing country supplier.
Abstract:
Background: The objective was to evaluate the efficacy and tolerability of donepezil (5 and 10 mg/day) compared with placebo in alleviating manifestations of mild to moderate Alzheimer's disease (AD). Method: A systematic review of individual patient data from Phase II and III double-blind, randomised, placebo-controlled studies of up to 24 weeks, completed by 20 December 1999. The main outcome measures were the ADAS-cog, the CIBIC-plus, and reports of adverse events. Results: A total of 2376 patients from ten trials were randomised to donepezil 5 mg/day (n = 821), 10 mg/day (n = 662) or placebo (n = 893). Cognitive performance was better in patients receiving donepezil than in patients receiving placebo. At 12 weeks the differences in ADAS-cog scores were -2.1 for 5 mg/day versus placebo (95% confidence interval (CI), -2.6 to -1.6; p < 0.001) and -2.5 for 10 mg/day versus placebo (-3.1 to -2.0; p < 0.001). The corresponding results at 24 weeks were -2.0 (-2.7 to -1.3; p < 0.001) and -3.1 (-3.9 to -2.4; p < 0.001). The difference between the 5 and 10 mg/day doses was significant at 24 weeks (p = 0.005). The odds ratios (OR) of improvement on the CIBIC-plus at 12 weeks were 1.8 for 5 mg/day versus placebo (1.5 to 2.1; p < 0.001) and 1.9 for 10 mg/day versus placebo (1.5 to 2.4; p < 0.001). The corresponding values at 24 weeks were 1.9 (1.5 to 2.4; p = 0.001) and 2.1 (1.6 to 2.8; p < 0.001). Donepezil was well tolerated; adverse events were cholinergic in nature and generally of mild severity and brief in duration. Conclusion: Donepezil (5 and 10 mg/day) provides meaningful benefits in alleviating deficits in cognitive and clinician-rated global function in AD patients relative to placebo, with greater improvements in cognition at the higher dose. Copyright © 2004 John Wiley & Sons, Ltd.
Abstract:
The International Citicoline Trial in acUte Stroke is a sequential phase III study of the use of the drug citicoline in the treatment of acute ischaemic stroke, which was initiated in 2006 in 56 treatment centres. The primary objective of the trial is to demonstrate improved recovery of patients randomized to citicoline relative to those randomized to placebo after 12 weeks of follow-up. The primary analysis will take the form of a global test combining the dichotomized results of assessments on three well-established scales: the Barthel Index, the modified Rankin scale and the National Institutes of Health Stroke Scale. This approach was previously used in the analysis of the influential National Institute of Neurological Disorders and Stroke trial of recombinant tissue plasminogen activator in stroke. The purpose of this paper is to describe how this trial was designed, and in particular how the simultaneous objectives of taking into account three assessment scales, performing a series of interim analyses and conducting treatment allocation and adjusting the analyses to account for prognostic factors, including more than 50 treatment centres, were addressed. Copyright (C) 2008 John Wiley & Sons, Ltd.
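As a rough illustration of combining three dichotomized scales into one global test, the sketch below applies an O'Brien-type combination of per-scale z statistics to simulated data; it is a simplified stand-in, not the GEE-based global analysis planned for the trial, and all data and success rates are invented.

```python
# Simplified sketch of a "global test" over three dichotomized outcome scales
# (an O'Brien-type combination of per-scale z statistics). Data are simulated;
# scale names follow the abstract, success rates are arbitrary.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 200                                   # patients per arm (illustrative)
scales = ["Barthel", "Rankin", "NIHSS"]

# Simulated favourable-outcome indicators, one column per scale.
placebo = rng.binomial(1, [0.25, 0.28, 0.30], size=(n, 3))
treated = rng.binomial(1, [0.32, 0.35, 0.36], size=(n, 3))

# Per-scale two-proportion z statistics.
z = np.empty(3)
for j in range(3):
    p1, p0 = treated[:, j].mean(), placebo[:, j].mean()
    p = (treated[:, j].sum() + placebo[:, j].sum()) / (2 * n)
    z[j] = (p1 - p0) / np.sqrt(p * (1 - p) * (2 / n))

# O'Brien OLS combination: sum the z's and rescale by the outcome correlation.
R = np.corrcoef(np.vstack([treated, placebo]), rowvar=False)
z_global = z.sum() / np.sqrt(np.ones(3) @ R @ np.ones(3))
print("global z =", round(z_global, 3),
      "two-sided p =", round(2 * stats.norm.sf(abs(z_global)), 4))
```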
Abstract:
Population size estimation with discrete or nonparametric mixture models is considered, and reliable ways of constructing the nonparametric mixture model estimator are reviewed and put into perspective. The maximum likelihood estimator of the mixing distribution is constructed for any number of components, up to the global nonparametric maximum likelihood bound, using the EM algorithm. In addition, the estimators of Chao and Zelterman are considered, together with some generalisations of Zelterman's estimator. All computations are done with CAMCR, special-purpose software developed for population size estimation with mixture models. Several examples and data sets are discussed and the estimators illustrated. Problems in using the mixture model-based estimators are highlighted.
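For concreteness, the snippet below computes the Chao and Zelterman estimators named in the abstract from a hypothetical table of capture frequencies; it does not use CAMCR, and the counts are invented for illustration.

```python
# Minimal sketch of two of the population-size estimators mentioned above,
# computed from frequency counts f[k] = number of units observed exactly k
# times. The frequencies are hypothetical, not from the paper.
import math

f = {1: 61, 2: 23, 3: 8, 4: 3}        # hypothetical frequency-of-frequencies
n = sum(f.values())                   # number of distinct units observed

# Chao's lower-bound estimator: N_hat = n + f1^2 / (2 * f2)
chao = n + f[1] ** 2 / (2 * f[2])

# Zelterman's estimator: lambda_hat = 2*f2/f1, N_hat = n / (1 - exp(-lambda_hat))
lam = 2 * f[2] / f[1]
zelterman = n / (1 - math.exp(-lam))

print(f"observed n = {n}, Chao = {chao:.1f}, Zelterman = {zelterman:.1f}")
```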
Abstract:
The yncE gene of Escherichia coli encodes a predicted periplasmic protein of unknown function. The gene is de-repressed under iron restriction through the action of the global iron regulator Fur. This suggests a role in iron acquisition, which is supported by the presence of the adjacent yncD gene encoding a potential TonB-dependent outer-membrane transporter. Here, the preliminary crystallographic structure of YncE is reported, revealing that it consists of a seven-bladed beta-propeller which resembles the corresponding domain of the 'surface-layer protein' of Methanosarcina mazei. A full structure determination is under way in order to provide insight into the function of this protein.
Abstract:
Assaying a large number of genetic markers from patients in clinical trials is now possible in order to tailor drugs with respect to efficacy. The statistical methodology for analysing such massive data sets is challenging. The most popular type of statistical analysis is to use a univariate test for each genetic marker, once all the data from a clinical study have been collected. This paper presents a sequential method for conducting an omnibus test for detecting gene-drug interactions across the genome, thus allowing informed decisions at the earliest opportunity and overcoming the multiple testing problems from conducting many univariate tests. We first propose an omnibus test for a fixed sample size. This test is based on combining F-statistics that test for an interaction between treatment and the individual single nucleotide polymorphism (SNP). As SNPs tend to be correlated, we use permutations to calculate a global p-value. We extend our omnibus test to the sequential case. In order to control the type I error rate, we propose a sequential method that uses permutations to obtain the stopping boundaries. The results of a simulation study show that the sequential permutation method is more powerful than alternative sequential methods that control the type I error rate, such as the inverse-normal method. The proposed method is flexible as we do not need to assume a mode of inheritance and can also adjust for confounding factors. An application to real clinical data illustrates that the method is computationally feasible for a large number of SNPs. Copyright (c) 2007 John Wiley & Sons, Ltd.
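The sketch below illustrates the fixed-sample version of such an omnibus test on simulated data: per-SNP treatment-by-SNP interaction F statistics are combined (here by summation, one possible choice) and a global p-value is obtained by permuting treatment labels; the paper's sequential stopping-boundary machinery is not reproduced.

```python
# Minimal fixed-sample sketch of a permutation-based omnibus interaction test.
# Data are simulated; this is not the paper's exact sequential procedure.
import numpy as np

rng = np.random.default_rng(1)
n, n_snp = 300, 50
treat = rng.integers(0, 2, n)                  # 0 = control, 1 = active
snps = rng.integers(0, 3, (n, n_snp))          # additive genotype coding 0/1/2
y = 0.5 * treat + 0.3 * treat * snps[:, 0] + rng.normal(size=n)  # one true interaction

def interaction_F(y, treat, g):
    """F test of the treatment-by-genotype interaction in a linear model."""
    X_full = np.column_stack([np.ones_like(y), treat, g, treat * g])
    X_red = X_full[:, :3]
    rss = lambda X: np.sum((y - X @ np.linalg.lstsq(X, y, rcond=None)[0]) ** 2)
    rss_f, rss_r = rss(X_full), rss(X_red)
    df2 = len(y) - X_full.shape[1]
    return (rss_r - rss_f) / (rss_f / df2)

def omnibus(y, treat):
    # Combine per-SNP F statistics by summation (one possible choice).
    return sum(interaction_F(y, treat, snps[:, j]) for j in range(n_snp))

obs = omnibus(y, treat)
perm = [omnibus(y, rng.permutation(treat)) for _ in range(200)]
p_global = (1 + sum(s >= obs for s in perm)) / (1 + len(perm))
print("omnibus statistic:", round(obs, 2), "permutation p:", round(p_global, 4))
```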
Abstract:
There is a concerted global effort to digitize biodiversity occurrence data from herbarium and museum collections that together offer an unparalleled archive of life on Earth over the past few centuries. The Global Biodiversity Information Facility provides the largest single gateway to these data. Since 2004 it has provided a single point of access to specimen data from databases of biological surveys and collections. Biologists now have rapid access to more than 120 million observations, for use in many biological analyses. We investigate the quality and coverage of the data digitally available, from the perspective of a biologist seeking distribution data for spatial analysis on a global scale. We present an example of automatic verification of geographic data, using distributions from the International Legume Database and Information Service to test empirically issues of geographic coverage and accuracy. There are over half a million records covering 31% of all legume species, and 84% of these records pass geographic validation. These data are not yet a global biodiversity resource for all species or all countries. A user will encounter many biases and gaps in these data, which should be understood before the data are used or analyzed. The data are notably deficient in many of the world's biodiversity hotspots. The deficiencies in data coverage can be resolved by an increased application of resources to digitize and publish data throughout these most diverse regions. But in the push to provide ever more data online, we should not forget that consistent data quality is of paramount importance if the data are to be useful in capturing a meaningful picture of life on Earth.
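A much-simplified sketch of this kind of automatic geographic validation is given below: a record passes if its coordinates fall inside a rough bounding box of a country in which the species is documented. The country boxes and the species-to-country table are hypothetical; the paper used ILDIS country-level distributions and real boundaries.

```python
# Toy geographic validation of occurrence records against documented
# country-level distributions (hypothetical data, rough bounding boxes).
country_bbox = {   # (min_lat, max_lat, min_lon, max_lon)
    "Brazil":    (-34.0,   5.3, -74.0, -34.8),
    "Australia": (-44.0, -10.0, 112.0, 154.0),
}
documented = {"Mimosa pudica": {"Brazil"}}   # hypothetical distribution table

def record_passes(species, lat, lon):
    """True if the coordinates fall inside a country where the species is documented."""
    for country in documented.get(species, ()):
        s, n, w, e = country_bbox[country]
        if s <= lat <= n and w <= lon <= e:
            return True
    return False

records = [("Mimosa pudica", -15.8, -47.9),   # Brasília: consistent
           ("Mimosa pudica", -25.3, 133.8)]   # central Australia: flagged
for sp, lat, lon in records:
    print(sp, lat, lon, "OK" if record_passes(sp, lat, lon) else "flagged")
```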
Abstract:
Presented herein is an experimental design that allows the effects of several radiative forcing factors on climate to be estimated as precisely as possible from a limited suite of atmosphere-only general circulation model (GCM) integrations. The forcings include the combined effect of observed changes in sea surface temperatures, sea ice extent, stratospheric (volcanic) aerosols, and solar output, plus the individual effects of several anthropogenic forcings. A single linear statistical model is used to estimate the forcing effects, each of which is represented by its global mean radiative forcing. The strong colinearity in time between the various anthropogenic forcings poses a technical problem that is overcome through the design of the experiment. This design uses every combination of anthropogenic forcings rather than a few highly replicated ensembles, as is more common in climate studies. Not only is this design highly efficient for a given number of integrations, but it also allows the estimation of (nonadditive) interactions between pairs of anthropogenic forcings. The simulated land surface air temperature changes since 1871 have been analyzed. The changes in natural and oceanic forcing, the latter itself containing some forcing from anthropogenic and natural influences, have the most influence. For the global mean, increasing greenhouse gases and the indirect aerosol effect had the largest anthropogenic effects. It was also found that an interaction exists between these two anthropogenic effects in the atmosphere-only GCM. This interaction is similar in magnitude to the individual effects of changing tropospheric and stratospheric ozone concentrations or to the direct (sulfate) aerosol effect. Various diagnostics are used to evaluate the fit of the statistical model. For the global mean, these show that the land temperature response is proportional to the global mean radiative forcing, reinforcing the use of radiative forcing as a measure of climate change. The diagnostic tests also show that the linear model is suitable for analyses of land surface air temperature at each GCM grid point. Therefore, the linear model provides precise estimates of the space-time signals for all forcing factors under consideration. For simulated 50-hPa temperatures, results show that tropospheric ozone increases have contributed to stratospheric cooling over the twentieth century almost as much as changes in well-mixed greenhouse gases.
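The estimation idea can be sketched as follows: with every combination of (here just two) anthropogenic forcings switched on or off, ordinary least squares recovers the main effects and their interaction. The design and responses below are synthetic and far smaller than the paper's suite of integrations.

```python
# Toy 2^2 factorial design: estimate main effects of two forcings and their
# (nonadditive) interaction by least squares on synthetic responses.
import itertools
import numpy as np

# Columns = (GHG on/off, indirect aerosol on/off), one row per model run.
design = np.array(list(itertools.product([0, 1], repeat=2)), dtype=float)

# Synthetic "simulated" temperature change for each run (K), with a small
# interaction between the two forcings; values are illustrative only.
true = {"ghg": 0.8, "aer": -0.4, "ghg_x_aer": 0.1}
rng = np.random.default_rng(2)
y = (true["ghg"] * design[:, 0] + true["aer"] * design[:, 1]
     + true["ghg_x_aer"] * design[:, 0] * design[:, 1]
     + rng.normal(0, 0.02, len(design)))

# Regression with intercept, both main effects, and the interaction term.
X = np.column_stack([np.ones(len(design)), design, design[:, 0] * design[:, 1]])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["intercept", "ghg", "aer", "ghg_x_aer"], beta.round(3))))
```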
Abstract:
A full-dimensional, ab initio-based semiglobal potential energy surface for C2H3+ is reported. The ab initio electronic energies for this molecule are calculated using the spin-restricted coupled cluster method restricted to single and double excitations with triples corrections [RCCSD(T)]. The RCCSD(T) method is used with the correlation-consistent polarized valence triple-zeta basis augmented with diffuse functions (aug-cc-pVTZ). The ab initio potential energy surface is represented by a many-body (cluster) expansion, each term of which uses functions that are fully invariant under permutations of like nuclei. The fitted potential energy surface is validated by comparing normal mode frequencies at the global minimum and the secondary minimum with previous and new direct ab initio frequencies. The potential surface is then used in vibrational analyses with the "single-reference" and "reaction-path" versions of the code MULTIMODE. (c) 2006 American Institute of Physics.
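As a generic illustration of the normal-mode validation step, the snippet below diagonalizes a mass-weighted Hessian and converts the eigenvalues to harmonic wavenumbers for a toy harmonic diatomic; it is not the C2H3+ surface and not the MULTIMODE treatment, and the force constant is an assumed value.

```python
# Generic harmonic (normal-mode) analysis at a minimum: diagonalize the
# mass-weighted Hessian and convert eigenvalues to wavenumbers.
# Toy 1-D diatomic with V = 0.5 * k * (x2 - x1)^2; k is an assumed value.
import numpy as np

k = 500.0                                        # assumed force constant, N/m
m = np.array([12.0, 1.0]) * 1.66053906660e-27    # masses of a "C" and "H" atom, kg

H = k * np.array([[ 1.0, -1.0],                  # Cartesian Hessian, J/m^2
                  [-1.0,  1.0]])

Msqrt = np.sqrt(np.outer(m, m))                  # sqrt(m_i * m_j)
eigvals = np.linalg.eigvalsh(H / Msqrt)          # rad^2 s^-2 (one ~0 translation mode)

c = 2.99792458e10                                # speed of light, cm/s
freqs_cm = np.sqrt(np.clip(eigvals, 0, None)) / (2 * np.pi * c)
print("harmonic wavenumbers (cm^-1):", np.round(freqs_cm, 1))

# Cross-check against the analytic diatomic result sqrt(k/mu)/(2*pi*c).
mu = m.prod() / m.sum()
print("analytic stretch (cm^-1):", round(np.sqrt(k / mu) / (2 * np.pi * c), 1))
```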
Abstract:
Frequency recognition is an important task in many engineering fields such as audio signal processing and telecommunications engineering, for example in applications like Dual-Tone Multi-Frequency (DTMF) detection or recognition of the carrier frequency of a Global Positioning System (GPS) signal. This paper presents results of investigations of several common Fourier transform-based frequency recognition algorithms implemented in real time on a Texas Instruments (TI) TMS320C6713 Digital Signal Processor (DSP) core. In addition, suitable metrics are evaluated in order to ascertain which of these selected algorithms is appropriate for audio signal processing.
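One common Fourier-based detector, the Goertzel algorithm, can be sketched as follows; Python stands in here for the fixed-point code that would run on the TMS320C6713, and the sample rate and test tone are illustrative.

```python
# Goertzel algorithm: evaluate the power of a single DFT bin cheaply,
# a typical building block for DTMF detection.
import numpy as np

def goertzel_power(x, fs, f_target):
    """Squared magnitude of the DFT bin nearest f_target (Goertzel recursion)."""
    n = len(x)
    k = int(round(n * f_target / fs))
    coeff = 2 * np.cos(2 * np.pi * k / n)
    s_prev = s_prev2 = 0.0
    for sample in x:
        s = sample + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev**2 + s_prev2**2 - coeff * s_prev * s_prev2

fs = 8000                                   # Hz, typical telephony rate
t = np.arange(0, 0.05, 1 / fs)              # 50 ms frame
x = np.sin(2 * np.pi * 770 * t) + np.sin(2 * np.pi * 1336 * t)  # DTMF digit "5"

for f in (697, 770, 852, 941, 1209, 1336, 1477, 1633):          # DTMF row/col tones
    print(f, "Hz ->", round(goertzel_power(x, fs, f), 1))
```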
Abstract:
This paper examines aspects of the case against global oil peaking, and in particular sets out to answer a viewpoint that the world can have abundant supplies of oil "for years to come". Arguments supporting the latter view include: past forecasts of oil shortage have proved incorrect, so current predictions should also be discounted; many modellers depend on Hubbert's analysis but this contained fundamental flaws; new oil supply will result from reserves growth and from the wider deployment of advanced extraction technology; and that the world contains large resources of unconventional oil that can come on-stream if the production of conventional oil declines. These arguments are examined in turn and shown to be incorrect, or to need setting into a broader context. The paper concludes therefore that such arguments cannot be used to rule out calculations that the resource-limited peak in the world's production of conventional oil will occur in the near term. Moreover, peaking of conventional oil is likely to impact the world's total availability of oil where the latter includes non-conventional oil and oil substitutes. (C) 2008 Elsevier Ltd. All rights reserved.
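For reference, Hubbert's analysis mentioned above models cumulative production as a logistic curve, so the production rate peaks when half the ultimately recoverable resource has been extracted; the parameter values in the sketch below are purely illustrative, not a forecast.

```python
# Hubbert-style logistic production curve: cumulative production Q(t) follows
# a logistic, so the rate dQ/dt = k*Q*(1 - Q/URR) peaks at Q = URR/2.
# Parameter values are illustrative assumptions only.
import numpy as np

URR = 2000.0      # assumed ultimately recoverable resource (Gb)
k = 0.06          # assumed growth rate (1/yr)
t_peak = 2010.0   # assumed peak year

years = np.arange(1950, 2051)
Q = URR / (1 + np.exp(-k * (years - t_peak)))      # cumulative production (Gb)
P = k * Q * (1 - Q / URR)                          # annual production (Gb/yr)
print("peak year:", years[np.argmax(P)], "peak rate (Gb/yr):", round(P.max(), 1))
```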
Abstract:
In this paper, we initiate the study of a class of Putnam-type equations of the form x_{n+1} = (A_1 x_n + A_2 x_{n-1} + A_3 x_{n-2} x_{n-3} + A_4) / (B_1 x_n x_{n-1} + B_2 x_{n-2} + B_3 x_{n-3} + B_4), n = 0, 1, 2, ..., where A_1, A_2, A_3, A_4, B_1, B_2, B_3, B_4 are positive constants with A_1 + A_2 + A_3 + A_4 = B_1 + B_2 + B_3 + B_4, and x_{-3}, x_{-2}, x_{-1}, x_0 are positive numbers. A sufficient condition is given for the global asymptotic stability of the equilibrium point c = 1 of such equations. (c) 2005 Elsevier Ltd. All rights reserved.
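A small numerical illustration: with the classical Putnam coefficients A_i = B_i = 1 (a case satisfying the equal-sums condition) and arbitrary positive starting values, the iterates approach the equilibrium c = 1.

```python
# Iterating the Putnam-type recurrence above with the classical choice
# A1 = ... = A4 = B1 = ... = B4 = 1; starting values are arbitrary positives.
A = [1.0, 1.0, 1.0, 1.0]
B = [1.0, 1.0, 1.0, 1.0]
x = [0.7, 1.9, 0.4, 2.5]          # x_{-3}, x_{-2}, x_{-1}, x_0 (all positive)

for n in range(60):
    xn, xn1, xn2, xn3 = x[-1], x[-2], x[-3], x[-4]
    num = A[0] * xn + A[1] * xn1 + A[2] * xn2 * xn3 + A[3]
    den = B[0] * xn * xn1 + B[1] * xn2 + B[2] * xn3 + B[3]
    x.append(num / den)

print("last iterates:", [round(v, 6) for v in x[-5:]])   # close to 1
```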