966 results for quantifying heteroskedasticity


Relevance:

10.00%

Publisher:

Abstract:

Understanding how species and ecosystems respond to climate change has become a major focus of ecology and conservation biology. Modelling approaches provide important tools for making future projections, but current models of the climate-biosphere interface remain overly simplistic, undermining the credibility of projections. We identify five ways in which substantial advances could be made in the next few years: (i) improving the accessibility and efficiency of biodiversity monitoring data, (ii) quantifying the main determinants of the sensitivity of species to climate change, (iii) incorporating community dynamics into projections of biodiversity responses, (iv) accounting for the influence of evolutionary processes on the response of species to climate change, and (v) improving the biophysical rule sets that define functional groupings of species in global models.

Relevance:

10.00%

Publisher:

Abstract:

A new record of sea surface temperature (SST) for climate applications is described. This record provides independent corroboration of global variations estimated from SST measurements made in situ. Infrared imagery from Along-Track Scanning Radiometers (ATSRs) is used to create a 20 year time series of SST at 0.1° latitude-longitude resolution, in the ATSR Reprocessing for Climate (ARC) project. A very high degree of independence of in situ measurements is achieved via physics-based techniques. Skin SST and SST estimated for 20 cm depth are provided, with grid cell uncertainty estimates. Comparison with in situ data sets establishes that ARC SSTs generally have bias of order 0.1 K or smaller. The precision of the ARC SSTs is 0.14 K during 2003 to 2009, from three-way error analysis. Over the period 1994 to 2010, ARC SSTs are stable, with better than 95% confidence, to within 0.005 K yr−1 (demonstrated for tropical regions). The data set appears useful for cleanly quantifying interannual variability in SST and major SST anomalies. The ARC SST global anomaly time series is compared to the in situ-based Hadley Centre SST data set version 3 (HadSST3). Within known uncertainties in bias adjustments applied to in situ measurements, the independent ARC record and HadSST3 present the same variations in global marine temperature since 1996. Since the in situ observing system evolved significantly in its mix of measurement platforms and techniques over this period, ARC SSTs provide an important corroboration that HadSST3 accurately represents recent variability and change in this essential climate variable.
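The three-way error analysis mentioned above is not spelled out in this summary; as an illustrative note, the standard triple-collocation estimator it presumably refers to recovers the error variance of each of three independent, collocated measurements of the same quantity from the covariances of pairwise differences. A minimal sketch (variable names are illustrative, not from the ARC project):

```python
import numpy as np

def triple_collocation_errors(x, y, z):
    """Error standard deviation of three collocated, independent measurements
    of the same quantity (e.g. satellite, drifting-buoy and moored-buoy SST),
    assuming mutually uncorrelated errors and a common calibration."""
    x, y, z = (np.asarray(a, dtype=float) for a in (x, y, z))
    cov = lambda a, b: np.mean(a * b) - np.mean(a) * np.mean(b)
    var_x = cov(x - y, x - z)   # Cov(x - y, x - z) = Var(error in x)
    var_y = cov(y - x, y - z)
    var_z = cov(z - x, z - y)
    return tuple(np.sqrt(max(v, 0.0)) for v in (var_x, var_y, var_z))
```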

Relevance:

10.00%

Publisher:

Abstract:

Affymetrix GeneChip® arrays are used widely to study transcriptional changes in response to developmental and environmental stimuli. GeneChip® arrays comprise multiple 25-mer oligonucleotide probes per gene and retain certain advantages over direct sequencing. For plants, there are several public GeneChip® arrays whose probes are localised primarily in 3′ exons. Plant whole-transcript (WT) GeneChip® arrays are not yet publicly available, although WT resolution is needed to study complex crop genomes such as Brassica, which are typified by segmental duplications containing paralogous genes and/or allopolyploidy. Available sequence data were sampled from the Brassica A and C genomes, and 142,997 gene models identified. The assembled gene models were then used to establish a comprehensive public WT exon array for transcriptomics studies. The Affymetrix GeneChip® Brassica Exon 1.0 ST Array is a 5 µm feature size array, containing 2.4 million 25-base oligonucleotide probes representing 135,201 gene models, with 15 probes per gene distributed among exons. Discrimination of the gene models was based on an E-value cut-off of 1E-5, with ≤98% sequence identity. The 135k Brassica Exon Array was validated by quantifying transcriptome differences between leaf and root tissue from a reference Brassica rapa line (R-o-18), and categorisation by Gene Ontologies (GO) based on gene orthology with Arabidopsis thaliana. Technical validation involved comparison of the exon array with a 60-mer array platform using the same starting RNA samples. The 135k Brassica Exon Array is a robust platform. All data relating to the array design and probe identities are available in the public domain and are curated within the BrassEnsembl genome viewer at http://www.brassica.info/BrassEnsembl/index.html.
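The discrimination criteria quoted above (E-value cut-off of 1E-5, ≤98% sequence identity) are stated without detail; one plausible reading, sketched below purely for illustration (the function and parameter names are not taken from the array design pipeline), is that two assembled sequences are collapsed into a single gene model only when their best pairwise alignment is both highly significant and more than 98% identical:

```python
def keep_as_distinct_models(best_hit_evalue, best_hit_identity,
                            evalue_cutoff=1e-5, identity_cutoff=98.0):
    """Illustrative reading of the discrimination criteria: collapse two
    assembled sequences into one gene model only when their best pairwise
    alignment is highly significant AND more than 98% identical; otherwise
    keep them as separate gene models with their own probe sets."""
    same_model = (best_hit_evalue <= evalue_cutoff
                  and best_hit_identity > identity_cutoff)
    return not same_model
```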

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: Methyl benzimidazole carbamate (MBC) fungicides are used to control the oilseed rape pathogen Pyrenopeziza brassicae. Resistance to MBCs has been reported in P. brassicae, but the molecular mechanism(s) associated with reductions in sensitivity have not been verified in this species. Elucidation of the genetic changes responsible for resistance, hypothesised to be target-site mutations in β-tubulin, will enable resistance diagnostics and thereby inform resistance management strategies. RESULTS: P. brassicae isolates were classified as sensitive, moderately resistant or resistant to MBCs. Crossing P. brassicae isolates of different MBC sensitivities indicated that resistance was conferred by a single gene. The MBC-target encoding gene β-tubulin was cloned and sequenced. Reduced MBC sensitivity of field isolates correlated with β-tubulin amino acid substitutions L240F and E198A. The highest level of MBC resistance was measured for isolates carrying E198A. Negative cross-resistance between MBCs and the fungicides diethofencarb and zoxamide was only measured in E198A isolates. PCR-RFLP was used to screen isolates for the presence of L240F and E198A. The substitutions E198G and F200Y were also detected in DNA samples from P. brassicae populations after cloning and sequencing of PCR products. The frequencies of L240F and E198A in different P. brassicae populations were quantified by pyrosequencing. There were no differences in the frequencies of these alleles between P. brassicae populations sampled from different locations or after fungicide treatment regimes. CONCLUSIONS: The molecular mechanisms affecting sensitivity to MBCs in P. brassicae have been identified. Pyrosequencing assays are a powerful tool for quantifying fungicide-resistant alleles in pathogen populations.
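As a side note on the pyrosequencing step, the resistant-allele frequency at a target codon in a pooled DNA sample is typically estimated from the relative peak intensities of the two alleles; a minimal sketch, assuming background-corrected signals and equal detection efficiency for both alleles (neither assumption is discussed above):

```python
def resistant_allele_frequency(mutant_signal, wild_type_signal):
    """Resistant-allele frequency (e.g. for E198A or L240F) in a pooled
    sample, from pyrosequencing peak intensities of the two alleles."""
    return mutant_signal / (mutant_signal + wild_type_signal)

# Example: a mutant peak of 320 units against a wild-type peak of 480 units
# gives a resistant-allele frequency of 0.4 in the pooled sample.
```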

Relevance:

10.00%

Publisher:

Abstract:

Prism is a modular classification rule generation method based on the ‘separate and conquer’ approach, an alternative to the rule induction approach using decision trees, also known as ‘divide and conquer’. Prism often achieves a similar level of classification accuracy compared with decision trees, but tends to produce a more compact, noise-tolerant set of classification rules. As with other classification rule generation methods, a principal problem arising with Prism is that of overfitting due to over-specialised rules. In addition, over-specialised rules increase the associated computational complexity. These problems can be solved by pruning methods. For the Prism method, two pruning algorithms have been introduced recently for reducing overfitting of classification rules - J-pruning and Jmax-pruning. Both algorithms are based on the J-measure, an information theoretic means for quantifying the theoretical information content of a rule. Jmax-pruning attempts to exploit the J-measure to its full potential because J-pruning does not actually achieve this and may even lead to underfitting. A series of experiments has shown that Jmax-pruning may outperform J-pruning in reducing overfitting. However, Jmax-pruning is computationally relatively expensive and may also lead to underfitting. This paper reviews the Prism method and the two existing pruning algorithms above. It also proposes a novel pruning algorithm called Jmid-pruning. The latter is based on the J-measure and it reduces overfitting to a similar level as the other two algorithms but is better in avoiding underfitting and unnecessary computational effort. The authors conduct an experimental study on the performance of the Jmid-pruning algorithm in terms of classification accuracy and computational efficiency. The algorithm is also evaluated comparatively with the J-pruning and Jmax-pruning algorithms.
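For reference, the J-measure underlying all three pruning algorithms is the probability that a rule fires multiplied by a relative-entropy term comparing the class distribution given the rule's antecedent with the prior class distribution (Smyth and Goodman's formulation). A small sketch, using illustrative probabilities rather than values from the paper:

```python
import math

def j_measure(p_antecedent, p_class, p_class_given_antecedent):
    """J-measure of a rule 'IF antecedent THEN class': the probability that
    the antecedent fires times the cross-entropy between the class
    distribution given the antecedent and the prior class distribution."""
    def term(p, q):
        # p * log2(p / q), taken as 0 when p == 0
        return 0.0 if p == 0 else p * math.log2(p / q)
    j_inner = (term(p_class_given_antecedent, p_class)
               + term(1 - p_class_given_antecedent, 1 - p_class))
    return p_antecedent * j_inner

# A rule covering 30% of instances that lifts the class probability from a
# prior of 0.5 to 0.9 carries roughly 0.16 bits of information.
print(j_measure(0.30, 0.50, 0.90))
```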

Relevance:

10.00%

Publisher:

Abstract:

Black carbon aerosol plays a unique and important role in Earth’s climate system. Black carbon is a type of carbonaceous material with a unique combination of physical properties. This assessment provides an evaluation of black-carbon climate forcing that is comprehensive in its inclusion of all known and relevant processes and that is quantitative in providing best estimates and uncertainties of the main forcing terms: direct solar absorption; influence on liquid, mixed phase, and ice clouds; and deposition on snow and ice. These effects are calculated with climate models, but when possible, they are evaluated with both microphysical measurements and field observations. Predominant sources are combustion related, namely, fossil fuels for transportation, solid fuels for industrial and residential uses, and open burning of biomass. Total global emissions of black carbon using bottom-up inventory methods are 7500 Gg yr⁻¹ in the year 2000 with an uncertainty range of 2000 to 29000. However, global atmospheric absorption attributable to black carbon is too low in many models and should be increased by a factor of almost 3. After this scaling, the best estimate for the industrial-era (1750 to 2005) direct radiative forcing of atmospheric black carbon is +0.71 W m⁻² with 90% uncertainty bounds of (+0.08, +1.27) W m⁻². Total direct forcing by all black carbon sources, without subtracting the preindustrial background, is estimated as +0.88 (+0.17, +1.48) W m⁻². Direct radiative forcing alone does not capture important rapid adjustment mechanisms. A framework is described and used for quantifying climate forcings, including rapid adjustments. The best estimate of industrial-era climate forcing of black carbon through all forcing mechanisms, including clouds and cryosphere forcing, is +1.1 W m⁻² with 90% uncertainty bounds of +0.17 to +2.1 W m⁻². Thus, there is a very high probability that black carbon emissions, independent of co-emitted species, have a positive forcing and warm the climate. We estimate that black carbon, with a total climate forcing of +1.1 W m⁻², is the second most important human emission in terms of its climate forcing in the present-day atmosphere; only carbon dioxide is estimated to have a greater forcing. Sources that emit black carbon also emit other short-lived species that may either cool or warm climate. Climate forcings from co-emitted species are estimated and used in the framework described herein. When the principal effects of short-lived co-emissions, including cooling agents such as sulfur dioxide, are included in net forcing, energy-related sources (fossil fuel and biofuel) have an industrial-era climate forcing of +0.22 (−0.50 to +1.08) W m⁻² during the first year after emission. For a few of these sources, such as diesel engines and possibly residential biofuels, warming is strong enough that eliminating all short-lived emissions from these sources would reduce net climate forcing (i.e., produce cooling). When open burning emissions, which emit high levels of organic matter, are included in the total, the best estimate of net industrial-era climate forcing by all short-lived species from black-carbon-rich sources becomes slightly negative (−0.06 W m⁻² with 90% uncertainty bounds of −1.45 to +1.29 W m⁻²). The uncertainties in net climate forcing from black-carbon-rich sources are substantial, largely due to lack of knowledge about cloud interactions with both black carbon and co-emitted organic carbon.
In prioritizing potential black-carbon mitigation actions, non-science factors, such as technical feasibility, costs, policy design, and implementation feasibility, play important roles. The major sources of black carbon are presently in different stages with regard to the feasibility for near-term mitigation. This assessment, by evaluating the large number and complexity of the associated physical and radiative processes in black-carbon climate forcing, sets a baseline from which to improve future climate forcing estimates.
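The forcing terms above are quoted as a best estimate with asymmetric 90% uncertainty bounds, e.g. +0.71 (+0.08, +1.27) W m⁻². Purely as a toy illustration (this is not the assessment's actual uncertainty framework), such a range can be represented by half-normal tails matched to the quoted percentiles and propagated by Monte Carlo:

```python
import numpy as np

def sample_asymmetric(median, p05, p95, size, rng):
    """Toy sampler for a quantity quoted as 'median (5th, 95th percentile)':
    half-normal tails below and above the median, scaled so that the quoted
    90% bounds are reproduced."""
    sigma_lo = (median - p05) / 1.645
    sigma_hi = (p95 - median) / 1.645
    z = np.abs(rng.standard_normal(size))
    below = rng.random(size) < 0.5
    return np.where(below, median - sigma_lo * z, median + sigma_hi * z)

rng = np.random.default_rng(0)
# Industrial-era direct forcing of atmospheric black carbon, as quoted above.
direct = sample_asymmetric(0.71, 0.08, 1.27, 200_000, rng)
print(np.percentile(direct, [5, 50, 95]))   # ~[0.08, 0.71, 1.27] W m^-2
# Independent terms represented this way can be combined by adding samples.
```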

Relevance:

10.00%

Publisher:

Abstract:

A Guide to Office Clerical Time Standards is an instructional performance piece based on a corporate manual from 1960. The pamphlet is focused on the time necessary for the accomplishment of minute labour procedures in the office, from the depressing and releasing of typewriter keys to the opening and closing of filing cabinet drawers. In the performance, seven costumed performers represent the different levels of management and employment while performing the actions described in the guide, accompanied by a live musical score. There has been much discussion of the changes to work in the west following the decline of post-Fordist service sector jobs. These increasingly emphasise the specificity of employees’ knowledge and cognitive skill. However, this greater flexibility and creativity at work has been accompanied by an opposite trajectory. The proletarisation of white collar work has given rise to more bureaucracy, target assessment and control for workers in previously looser creative professions, from academia to the arts. The midcentury office is the meeting point of these cultures, where the assembly line efficiency management of the factory meets the quantifying control of the knowledge economy. A Guide to Office Clerical Time Standards explores the survival of one regime into its successor following the lines of combined and uneven development that have turned the emancipatory promise of immaterial labour into the perma-temp hell of the cognitariat. The movement is accompanied by a score of guitar, bass and drums, the components of the rock ‘n’ roll music that rose from the car factories of the motor city and the cotton fields of the southern states to represent the same junction of expression and control.

Relevance:

10.00%

Publisher:

Abstract:

This paper demonstrates the impracticality of a comprehensive mathematical definition of the term ‘drought’ which formalises the general qualitative definition that drought is ‘a deficit of water relative to normal conditions’. Starting from the local water balance, it is shown that a universal description of drought requires reference to water supply, demand and management. The influence of human intervention through water management is shown to be intrinsic to the definition of drought in the universal sense and can only be eliminated in the case of purely meteorological drought. The state of ‘drought’ is shown to be predicated on the existence of climatological norms for a multitude of process-specific terms. In general these norms are either difficult to obtain or even non-existent in the non-stationary context of climate change. Such climatological considerations, in conjunction with the difficulty of quantifying human influence, lead to the conclusion that we cannot reasonably expect the existence of any workable generalised objective definition of drought.
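For concreteness, the ‘local water balance’ referred to above is conventionally written as below (a standard statement rather than a formula given in this summary; the symbol A for the net management/abstraction term is introduced here for illustration):

```latex
\frac{\mathrm{d}S}{\mathrm{d}t} = P - E - Q - A
```

where S is stored water, P precipitation, E evapotranspiration, Q net natural outflow, and A the net anthropogenic abstraction that, the paper argues, cannot in general be separated from the definition of drought.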

Relevance:

10.00%

Publisher:

Abstract:

In 2007, futures contracts based upon the listed real estate market in Europe were introduced. Following their launch they have received increasing attention from property investors; however, few studies have considered the impact their introduction has had. This study considers two key elements. Firstly, a traditional Generalized Autoregressive Conditional Heteroskedasticity (GARCH) model, the approach of Bessembinder & Seguin (1992) and Gray’s (1996) Markov-switching GARCH model are used to examine the impact of futures trading on the European real estate securities market. The results show that futures trading did not destabilize the underlying listed market. Importantly, the results also reveal that the introduction of a futures market has improved the speed and quality of information flowing to the spot market. Secondly, we assess the hedging effectiveness of the contracts using two alternative strategies (naïve and Ordinary Least Squares models). The empirical results also show that the contracts are effective hedging instruments, leading to a reduction in risk of 64%.
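The Ordinary Least Squares strategy mentioned above is conventionally the minimum-variance hedge: regress spot returns on futures returns, take the slope as the hedge ratio, and measure effectiveness as the proportional variance reduction of the hedged position (the naïve strategy simply sets the ratio to one). A minimal sketch of that calculation; the 64% figure corresponds to this effectiveness measure on the paper's data, which are not reproduced here:

```python
import numpy as np

def ols_hedge_ratio(spot_returns, futures_returns):
    """Minimum-variance hedge ratio h* = Cov(spot, futures) / Var(futures),
    i.e. the slope of an OLS regression of spot returns on futures returns."""
    s = np.asarray(spot_returns, dtype=float)
    f = np.asarray(futures_returns, dtype=float)
    return np.cov(s, f, ddof=1)[0, 1] / np.var(f, ddof=1)

def hedging_effectiveness(spot_returns, futures_returns, h):
    """Proportional reduction in variance of the hedged position s - h*f
    relative to the unhedged spot position."""
    s = np.asarray(spot_returns, dtype=float)
    f = np.asarray(futures_returns, dtype=float)
    return 1.0 - np.var(s - h * f, ddof=1) / np.var(s, ddof=1)
```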

Relevance:

10.00%

Publisher:

Abstract:

Flood simulation models and hazard maps are only as good as the underlying data against which they are calibrated and tested. However, extreme flood events are by definition rare, so the observational data of flood inundation extent are limited in both quality and quantity. The relative importance of these observational uncertainties has increased now that computing power and accurate lidar scans make it possible to run high-resolution 2D models to simulate floods in urban areas. However, the value of these simulations is limited by the uncertainty in the true extent of the flood. This paper addresses that challenge by analyzing a point dataset of maximum water extent from a flood event on the River Eden at Carlisle, United Kingdom, in January 2005. The observation dataset is based on a collection of wrack and water marks from two post-event surveys. A smoothing algorithm for identifying, quantifying, and reducing localized inconsistencies in the dataset is proposed and evaluated, showing positive results. The proposed smoothing algorithm can be applied in order to improve flood inundation modeling assessment and the determination of risk zones on the floodplain.
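The smoothing algorithm itself is not described in this summary; the sketch below therefore only illustrates the general idea of a local-consistency filter for surveyed wrack and water marks, and is not the algorithm from the paper (the radius and tolerance values are invented):

```python
import numpy as np

def smooth_water_marks(xy, elev, radius=100.0, tol=0.25):
    """Illustrative local-consistency filter: any mark whose water-surface
    elevation departs from the median of its neighbours within `radius`
    metres by more than `tol` metres is replaced by that local median."""
    xy = np.asarray(xy, dtype=float)        # (n, 2) easting/northing in metres
    elev = np.asarray(elev, dtype=float)    # (n,) surveyed elevations in metres
    smoothed = elev.copy()
    for i, p in enumerate(xy):
        d = np.hypot(*(xy - p).T)
        neighbours = (d <= radius) & (d > 0)
        if neighbours.any():
            local_median = np.median(elev[neighbours])
            if abs(elev[i] - local_median) > tol:
                smoothed[i] = local_median
    return smoothed
```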

Relevance:

10.00%

Publisher:

Abstract:

On-going human population growth and changing patterns of resource consumption are increasing global demand for ecosystem services, many of which are provided by soils. Some of these ecosystem services are linearly related to the surface area of pervious soil, whereas others show non-linear relationships, making ecosystem service optimization a complex task. As limited land availability creates conflicting demands among various types of land use, a central challenge is how to weigh these conflicting interests and how to achieve the best solutions possible from a perspective of sustainable societal development. These conflicting interests become most apparent in soils that are the most heavily used by humans for specific purposes: urban soils used for green spaces, housing, and other infrastructure, and agricultural soils used for producing food, fibres and biofuels. We argue that, despite their seemingly divergent uses of land, agricultural and urban soils share common features with regard to interactions between ecosystem services, and that the trade-offs associated with decision-making, while scale- and context-dependent, can be surprisingly similar between the two systems. We propose that the trade-offs within land use types and their soil-related ecosystem services are often disproportional, and that quantifying these will enable ecologists and soil scientists to help policy makers optimize management decisions when confronted with demands for multiple services under limited land availability.

Relevance:

10.00%

Publisher:

Abstract:

We analyse the widely used international/Zürich sunspot number record, R, with a view to quantifying a suspected calibration discontinuity around 1945 (which has been termed the “Waldmeier discontinuity” [Svalgaard, 2011]). We compare R against the composite sunspot group data from the Royal Greenwich Observatory (RGO) network and the Solar Optical Observing Network (SOON), using both the number of sunspot groups, N_G, and the total area of the sunspots, A_G. In addition, we compare R with the recently developed interdiurnal variability geomagnetic indices IDV and IDV(1d). In all four cases, linearity of the relationship with R is not assumed and care is taken to ensure that the relationship of each with R is the same before and after the putative calibration change. It is shown that the probability that a correction is not needed is of order 10⁻⁸ and that R is indeed too low before 1945. The optimum correction to R for values before 1945 is found to be 11.6%, 11.7%, 10.3% and 7.9% using A_G, N_G, IDV, and IDV(1d), respectively. The optimum value obtained by combining the sunspot group data is 11.6% with an uncertainty range 8.1-14.8% at the 2σ level. The geomagnetic indices provide an independent yet less stringent test but do give values that fall within the 2σ uncertainty band, with optimum values slightly lower than from the sunspot group data. The probability of the correction needed being as large as 20%, as advocated by Svalgaard [2011], is shown to be 1.6 × 10⁻⁵.
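Read as a multiplicative calibration change, the optimum correction quoted above amounts to scaling pre-1945 values of R up by 11.6% (8.1-14.8% at the 2σ level). A trivial sketch of applying such a correction, under that multiplicative reading:

```python
def correct_sunspot_number(year, R, correction=0.116, break_year=1945):
    """Scale the international/Zurich sunspot number R up by the optimum
    'Waldmeier discontinuity' correction for values before the break year."""
    return R * (1.0 + correction) if year < break_year else R
```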

Relevance:

10.00%

Publisher:

Abstract:

Advanced forecasting of space weather requires simulation of the whole Sun-to-Earth system, which necessitates driving magnetospheric models with the outputs from solar wind models. This presents a fundamental difficulty, as the magnetosphere is sensitive to both large-scale solar wind structures, which can be captured by solar wind models, and small-scale solar wind “noise,” which is far below typical solar wind model resolution and results primarily from stochastic processes. Following similar approaches in terrestrial climate modeling, we propose statistical “downscaling” of solar wind model results prior to their use as input to a magnetospheric model. As magnetospheric response can be highly nonlinear, this is preferable to downscaling the results of magnetospheric modeling. To demonstrate the benefit of this approach, we first approximate solar wind model output by smoothing solar wind observations with an 8 h filter, then add small-scale structure back in through the addition of random noise with the observed spectral characteristics. Here we use a very simple parameterization of noise based upon the observed probability distribution functions of solar wind parameters, but more sophisticated methods will be developed in the future. An ensemble of results from the simple downscaling scheme is tested using a model-independent method and shown to add value to the magnetospheric forecast, both improving the best estimate and quantifying the uncertainty. We suggest a number of features desirable in an operational solar wind downscaling scheme.
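A toy version of the downscaling experiment described above, purely for illustration (the paper parameterizes the added noise from observed probability distribution functions and spectral characteristics, whereas this sketch simply resamples the removed residuals):

```python
import numpy as np

def downscale_solar_wind(obs, dt_hours=1.0, window_hours=8.0,
                         n_ensemble=20, rng=None):
    """Smooth an observed solar wind series with an ~8 h running mean
    (standing in for solar wind model output), then build an ensemble by
    adding back random noise drawn from the removed small-scale residuals."""
    rng = np.random.default_rng(rng)
    obs = np.asarray(obs, dtype=float)
    w = max(int(round(window_hours / dt_hours)), 1)
    smooth = np.convolve(obs, np.ones(w) / w, mode="same")
    residuals = obs - smooth
    ensemble = np.array([smooth + rng.choice(residuals, size=obs.size)
                         for _ in range(n_ensemble)])
    return smooth, ensemble
```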

Relevance:

10.00%

Publisher:

Abstract:

Using sunspot observations from Greenwich and Mount Wilson, we show that the latitudinal spread of sunspot groups has increased since 1874, in a manner that closely mirrors the long-term (~100 year) changes in the coronal source flux, F_s, as inferred from geomagnetic activity. This latitude spread is shown to be well correlated with the flux emergence rate required by the model of the coronal source flux variation by Solanki et al. [2000]. The time constant for the decay of this open flux is found to be 3.6 ± 0.8 years. Using this value, and quantifying the photospheric flux emergence rate using the latitudinal spread of sunspot groups, the model reproduces the observed coronal source flux variation. The ratio of the 100-year drift to the solar cycle amplitude for the flux emergence rate is found to be half of the same ratio for F_s.
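The open flux model referred to above takes the form of a continuity equation in which the coronal source flux is built up by an emergence source term and decays with the quoted time constant (the notation S(t) for the source term is introduced here for illustration):

```latex
\frac{\mathrm{d}F_{\mathrm{s}}}{\mathrm{d}t} = S(t) - \frac{F_{\mathrm{s}}}{\tau},
\qquad \tau = 3.6 \pm 0.8\ \mathrm{yr}
```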

Relevance:

10.00%

Publisher:

Abstract:

A method for quantifying diffusive flows of O+ ions in the topside ionosphere from satellite soundings is described. A departure from diffusive equilibrium alters the shape of the plasma scale-height profile near the F2-peak, where ion-neutral frictional drag is large. The effect enables the evaluation of the field-aligned flux of O+ ions relative to the neutral oxygen atom gas, using MSIS model values for the neutral thermospheric densities and temperature. Upward flow values are accurate to within about 10%, the largest sources of error being the MSIS prediction for the concentration of oxygen atoms and the plasma temperature gradient deduced from the sounding. Downward flux values are only determined to within 20%. From 60,000 topside soundings, taken at the minimum and rising phase of the solar cycle, a total of 1098 mean scale-height profiles are identified for which no storm sudden commencement had occurred in the previous 12 days and for which Kp was less than 2o, each mean profile being an average of about six soundings. A statistical study of the fluxes deduced from these profiles shows the diurnal cycle of O+ flow in the quiet, topside ionosphere at mid-latitudes and its seasonal variations. The differences between these fluxes and ion flux observations from incoherent scatter radars are considered using the meridional thermospheric winds predicted by a global, three-dimensional model. The mean interhemispheric flow from summer to winter is compared with predictions by a numerical model of the protonospheric coupling of conjugate ionospheres for up to 6 days following a geomagnetic storm. The observed mean flux (of order 3 × 10¹⁶ ions day⁻¹ along a flux tube of area 1 m² at 1000 km) is larger than predicted for day 6 and the suggested explanation is a decrease in upward flows from the winter, daytime ionosphere between the sixth and twelfth days.
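For reference, the diffusive-equilibrium baseline against which the scale-height departure is measured is the standard plasma scale height for an O+-dominated topside ionosphere (a textbook expression, not one quoted in this summary):

```latex
H_{\mathrm{p}} = \frac{k\,(T_{\mathrm{e}} + T_{\mathrm{i}})}{m_{\mathrm{O^{+}}}\,g}
```

A field-aligned O+ flux through the collisional region near the F2-peak forces the observed profile away from this form, and it is that departure which the sounding-based method exploits.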