825 results for robust estimators of location
Abstract:
Varroa destructor is a parasitic mite of the Eastern honeybee Apis cerana. Fifty years ago, two distinct evolutionary lineages (Korean and Japanese) invaded the Western honeybee Apis mellifera. This haplo-diploid parasite species reproduces mainly through brother-sister matings, a system which largely favors the fixation of new mutations. In a worldwide sample of 225 individuals from 21 locations collected on Western honeybees and analyzed at 19 microsatellite loci, a series of de novo mutations was observed. Using historical data concerning the invasion, this original biological system has been exploited to compare three mutation models with allele size constraints for microsatellite markers: stepwise (SMM) and generalized (GSM) mutation models, and a model with mutation rate increasing exponentially with microsatellite length (ESM). Posterior probabilities of the three models have been estimated for each locus individually using reversible jump Markov Chain Monte Carlo. The relative support of each model varies widely among loci, but the GSM is the only model that always receives at least 9% support, whatever the locus. The analysis also provides robust estimates of mutation parameters for each locus and of the divergence time of the two invasive lineages (67,000 generations with a 90% credibility interval of 35,000-174,000). With an average of 10 generations per year, this divergence time fits with the last post-glacial Korea-Japan land separation. (c) 2005 Elsevier Inc. All rights reserved.
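To illustrate the three mutation models compared in this abstract, the following minimal Python sketch simulates one generation of microsatellite mutation under SMM, GSM, and ESM dynamics with allele size constraints. All parameter values (mutation rate, geometric parameter, exponential coefficient, length bounds) are hypothetical placeholders, not the estimates reported in the study, and the reversible jump MCMC model comparison itself is not reproduced here.

```python
import math
import random

def mutate(length, model="SMM", base_rate=1e-4, p_geom=0.5, alpha=0.05,
           min_len=5, max_len=50):
    """One generation of microsatellite mutation under three models.

    SMM: one-repeat (+/-1) steps at a constant rate.
    GSM: step size drawn from a geometric distribution (parameter p_geom).
    ESM: one-repeat steps at a rate growing exponentially with length.
    Allele size constraints are imposed by clamping to [min_len, max_len].
    """
    rate = base_rate * math.exp(alpha * length) if model == "ESM" else base_rate
    if random.random() >= min(rate, 1.0):
        return length                       # no mutation this generation
    step = 1
    if model == "GSM":                      # geometric step, mean 1/p_geom
        while random.random() > p_geom:
            step += 1
    new = length + random.choice([-1, 1]) * step
    return max(min_len, min(max_len, new))
```

Under the SMM and ESM an allele moves by exactly one repeat when it mutates; the GSM allows larger jumps, which is one reason it can fit loci that the strict stepwise model cannot.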
Abstract:
1. Wildlife managers often require estimates of abundance. Direct methods of estimation are often impractical, especially in closed-forest environments, so indirect methods such as dung or nest surveys are increasingly popular. 2. Dung and nest surveys typically have three elements: surveys to estimate abundance of the dung or nests; experiments to estimate the production (defecation or nest construction) rate; and experiments to estimate the decay or disappearance rate. The last of these is usually the most problematic, and was the subject of this study. 3. The design of experiments to allow robust estimation of mean time to decay was addressed. In most studies to date, dung or nests have been monitored until they disappear. Instead, we advocate that fresh dung or nests are located, with a single follow-up visit to establish whether the dung or nest is still present or has decayed. 4. Logistic regression was used to estimate probability of decay as a function of time, and possibly of other covariates. Mean time to decay was estimated from this function. 5. Synthesis and applications. Effective management of mammal populations usually requires reliable abundance estimates. The difficulty in estimating abundance of mammals in forest environments has increasingly led to the use of indirect survey methods, in which abundance of sign, usually dung (e.g. deer, antelope and elephants) or nests (e.g. apes), is estimated. Given estimated rates of sign production and decay, sign abundance estimates can be converted to estimates of animal abundance. Decay rates typically vary according to season, weather, habitat, diet and many other factors, making reliable estimation of mean time to decay of signs present at the time of the survey problematic. We emphasize the need for retrospective rather than prospective rates, propose a strategy for survey design, and provide analysis methods for estimating retrospective rates.
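The logistic-regression approach described above admits a closed-form mean time to decay. The sketch below uses hypothetical coefficients rather than fitted values from any real dung or nest dataset: it computes the mean from the fitted intercept and slope and cross-checks it against direct numerical integration of the survival curve.

```python
import numpy as np

def mean_time_to_decay(a, b):
    """Mean time to decay implied by a logistic decay model.

    P(decayed by time t) = 1 / (1 + exp(-(a + b*t))), with b > 0, so the
    survival curve is S(t) = 1 - P(t) and the mean time to decay,
    the integral of S(t) over t >= 0, is log(1 + exp(-a)) / b.
    """
    return np.log1p(np.exp(-a)) / b

# Hypothetical coefficients (negative intercept: fresh sign is unlikely to
# have decayed already; positive slope: decay odds rise with elapsed days):
a, b = -3.0, 0.05
mt = mean_time_to_decay(a, b)          # about 61 days

# Cross-check by numerically integrating the survival curve (trapezoid rule):
t = np.linspace(0.0, 2000.0, 200_001)
survival = 1.0 - 1.0 / (1.0 + np.exp(-(a + b * t)))
mt_numeric = np.sum(0.5 * (survival[1:] + survival[:-1])) * (t[1] - t[0])
```

In practice the intercept and slope would come from a logistic regression on the follow-up-visit data (decayed or not, with elapsed time and any covariates as predictors), which is exactly the design the abstract advocates.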
Abstract:
Ever since man invented writing he has used text to store and distribute his thoughts. With the advent of computers and the Internet, the delivery of these messages has become almost instant: textual conversations can now be held regardless of location or distance. Advances in computational power for 3D graphics are enabling Virtual Environments (VEs) within which users can become increasingly immersed. By opening these environments to other users, initially by sharing these text conversation channels, we aim to extend the immersive experience into an online virtual community. This paper examines work that brings textual communications into the VE, enabling interaction between the real and virtual worlds.
Abstract:
Population size estimation with discrete or nonparametric mixture models is considered, and reliable ways of construction of the nonparametric mixture model estimator are reviewed and set into perspective. Construction of the maximum likelihood estimator of the mixing distribution is done for any number of components up to the global nonparametric maximum likelihood bound using the EM algorithm. In addition, the estimators of Chao and Zelterman are considered with some generalisations of Zelterman's estimator. All computations are done with CAMCR, specialised software developed for population size estimation with mixture models. Several examples and data sets are discussed and the estimators illustrated. Problems using the mixture model-based estimators are highlighted.
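The two classical estimators mentioned above have simple closed forms based on the frequencies of frequencies f1 (units observed exactly once) and f2 (units observed exactly twice). This sketch implements both on illustrative counts; it does not reproduce the CAMCR software or the nonparametric mixture machinery.

```python
import math

def chao_estimate(freqs):
    """Chao's lower-bound population size estimator.

    freqs[k] = number of units observed exactly k+1 times, so
    N_hat = (observed units) + f1^2 / (2 * f2).
    """
    f1, f2 = freqs[0], freqs[1]
    n_obs = sum(freqs)
    return n_obs + f1 * f1 / (2.0 * f2)

def zelterman_estimate(freqs):
    """Zelterman's estimator: Horvitz-Thompson correction with the
    Poisson rate estimated as lambda_hat = 2 * f2 / f1."""
    f1, f2 = freqs[0], freqs[1]
    n_obs = sum(freqs)
    lam = 2.0 * f2 / f1
    return n_obs / (1.0 - math.exp(-lam))

freqs = [50, 20, 10, 5]          # hypothetical capture frequencies f1..f4
n_chao = chao_estimate(freqs)    # 85 + 50^2/(2*20) = 147.5
n_zelt = zelterman_estimate(freqs)
```

Both estimators correct the observed count (85 distinct units here) upward for the units never captured; Zelterman's version relies only on f1 and f2, which makes it robust to misfit in the higher counts.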
Abstract:
This paper investigates the effect of voluntary eco-certification on the rental and sale prices of US commercial office properties. Hedonic and logistic regressions are used to test whether there are rental and sale price premiums for LEED and Energy Star certified buildings. The results of the hedonic analysis suggest that there is a rental premium of approximately 6% for LEED and Energy Star certification. A sale price premium of approximately 35% was found for 127 price observations involving LEED rated buildings and 31% for 662 price observations involving Energy Star rated buildings. When compared to samples of similar buildings identified by a binomial logistic regression for LEED-certified buildings, the existence of a rent and sales price premium is confirmed, albeit with differences regarding the magnitude of the premium. Overall, the results of this study confirm that LEED and Energy Star buildings exhibit higher rental rates and sales prices per square foot, controlling for a large number of location- and property-specific factors.
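In a log-linear hedonic regression of the kind described above, the coefficient on a certification dummy converts to a percentage premium as exp(beta) - 1. The following sketch demonstrates this conversion on synthetic data with a 6% premium built in; the variable names, controls, and data are illustrative stand-ins, not the study's sample or specification.

```python
import numpy as np

# Synthetic data standing in for a hedonic sample:
rng = np.random.default_rng(0)
n = 500
certified = rng.integers(0, 2, n).astype(float)   # eco-certification dummy
log_size = rng.normal(10.0, 1.0, n)               # one stand-in control
log_rent = 1.0 + 0.06 * certified + 0.3 * log_size + rng.normal(0, 0.1, n)

# Ordinary least squares fit of the log-rent equation:
X = np.column_stack([np.ones(n), certified, log_size])
beta, *_ = np.linalg.lstsq(X, log_rent, rcond=None)

# In a log-linear model the dummy coefficient converts to a percentage
# premium as exp(beta) - 1 (~6% here by construction):
premium = np.exp(beta[1]) - 1.0
```

For small coefficients the premium is close to the coefficient itself, but reporting exp(beta) - 1 keeps larger premiums (such as the sale price premiums above) on a correct percentage scale.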
Abstract:
The Geostationary Earth Radiation Budget Intercomparison of Longwave and Shortwave radiation (GERBILS) was an observational field experiment over North Africa during June 2007. The campaign involved 10 flights by the FAAM BAe-146 research aircraft over southwestern parts of the Sahara Desert and coastal stretches of the Atlantic Ocean. Objectives of the GERBILS campaign included characterisation of mineral dust geographic distribution and physical and optical properties, assessment of the impact upon radiation, validation of satellite remote sensing retrievals, and validation of numerical weather prediction model forecasts of aerosol optical depths (AODs) and size distributions. We provide the motivation behind GERBILS and the experimental design and report the progress made in each of the objectives. We show that mineral dust in the region is relatively non-absorbing (mean single scattering albedo at 550 nm of 0.97) owing to the relatively small fraction of iron oxides present (1–3%), and that detailed spectral radiances are most accurately modelled using irregularly shaped particles. Satellite retrievals over bright desert surfaces are challenging owing to the lack of spectral contrast between the dust and the underlying surface. However, new techniques have been developed which are shown to be in relatively good agreement with AERONET estimates of AOD and with each other. This encouraging result enables relatively robust validation of numerical models which treat the production, transport, and deposition of mineral dust. The dust models themselves are able to represent large-scale synoptically driven dust events to a reasonable degree, but some deficiencies remain both in the Sahara and over the Sahelian region, where cold pool outflow from convective cells associated with the intertropical convergence zone can lead to significant dust production.
Abstract:
Satellite cells represent the stem cell population of adult skeletal muscle. The molecular mechanisms that control the proliferation of satellite cells are not well understood. In this study, we show that in response to injury, myofibres activate Wnt ligand transcription and activate a reporter cell line that is sensitive to the canonical Wnt-signalling pathway. Activated satellite cells on isolated cultured myofibres show robust expression of activated-β-catenin (Act-β-Cat), a key downstream transcriptional coactivator of canonical Wnt signalling. We provide evidence that the Wnt family of secreted glycoproteins act on satellite cells in a ligand-specific manner. Overexpression of Wnt1, Wnt3a or Wnt5a protein causes a dramatic increase in satellite-cell proliferation. By contrast, exposure of satellite cells to Wnt4 or Wnt6 diminishes this process. Moreover, we show that the prolonged satellite-cell quiescence induced by inhibitory Wnt is reversible and exposing inhibited satellite cells to stimulatory Wnt signalling restores their proliferation rate. Stimulatory Wnt proteins induce premature satellite cell BrdU incorporation as well as nuclear translocation of Act-β-Cat. Finally, we provide evidence that the Act-β-Cat translocation observed in single fibres during in vitro culture also occurs in cases of acute and chronic skeletal muscle regeneration in rodents and humans. We propose that Wnt proteins may be key factors that regulate the rate of satellite-cell proliferation on adult muscle fibres during the wound-healing response.
Abstract:
The issue of diversification in direct real estate investment portfolios has been widely studied in academic and practitioner literature. Most work, however, has been done using either partially aggregated data or data for small samples of individual properties. This paper reports results from tests of both risk reduction and diversification that use the records of 10,000+ UK properties tracked by Investment Property Databank. It provides, for the first time, robust estimates of the diversification gains attainable given the returns, risks and cross‐correlations across the individual properties available to fund managers. The results quantify the number of assets and amount of money needed to construct both ‘balanced’ and ‘specialist’ property portfolios by direct investment. Target numbers will vary according to the objectives of investors and the degree to which tracking error is tolerated. The top‐level results are consistent with previous work, showing that a large measure of risk reduction can be achieved with portfolios of 30–50 properties, but full diversification of specific risk can only be achieved in very large portfolios. However, the paper extends previous work by demonstrating on a single, large dataset the implications of different methods of calculating risk reduction, and also by showing more disaggregated results relevant to the construction of specialist, sector‐focussed funds.
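The risk-reduction pattern described above follows from the standard result that, for an equally weighted portfolio of n assets with common volatility and pairwise correlation, specific risk decays as 1/n toward a systematic floor. A minimal sketch, with hypothetical volatility and correlation values rather than IPD estimates:

```python
def portfolio_risk(n, sigma=0.10, rho=0.3):
    """Standard deviation of an equally weighted portfolio of n assets,
    each with volatility sigma and pairwise return correlation rho.

    Variance = sigma^2/n + (1 - 1/n) * rho * sigma^2: the first (specific)
    term vanishes as n grows; the second (systematic) term does not.
    """
    variance = sigma**2 / n + (1.0 - 1.0 / n) * rho * sigma**2
    return variance**0.5

# Risk falls steeply up to a few dozen assets, then flattens towards the
# undiversifiable floor sqrt(rho) * sigma:
curve = {n: round(portfolio_risk(n), 4) for n in (1, 10, 30, 50, 500)}
```

This stylised curve mirrors the paper's finding: most attainable risk reduction arrives by 30-50 properties, while the residual specific risk shrinks only slowly thereafter, so full diversification requires very large portfolios.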
Abstract:
Bowen and colleagues' methods and conclusions raise concerns.[1] At best, the trial evaluates the variability in current practice; in no way is it a robust test of treatment. Two communication impairments (aphasia and dysarthria) were included. In the post-acute stage spontaneous recovery is highly unpredictable, and changes in the profile of impairment during this time are common.[2] Both impairments manifest in different forms,[3] which may be more or less responsive to treatment. A third kind of impairment, apraxia of speech, was not excluded but was not targeted in therapy. All three impairments can and do co-occur. Whether randomised controlled trial designs can effectively cope with such complex disorders has been discussed elsewhere.[4] Treatment was defined within terms of current practice but was unconstrained. Therefore, the treatment group would have received a variety of therapeutic approaches and protocols, some of which may indeed be ineffective. Only 53% of the contact time with a speech and language therapist was direct (one to one); the rest was impairment based therapy. In contrast, all of the visitors' time was direct contact, usually in conversation. In both groups, the frequency and length of contact time varied. We already know that the transfer from impairment based therapy to functional communication can be limited and varies across individuals.[5] However, it is not possible to conclude from this trial that one to one impairment based therapy should be replaced. For that, a well defined impairment therapy protocol must be directly compared with a similarly well defined functional communication therapy, with an attention control.
Abstract:
We present a description of the theoretical framework and "best practice" for using the paleo-climate model component of the Coupled Model Intercomparison Project (Phase 5) (CMIP5) to constrain future projections of climate using the same models. The constraints arise from measures of skill in hindcasting paleo-climate changes from the present over 3 periods: the Last Glacial Maximum (LGM) (21 thousand years before present, ka), the mid-Holocene (MH) (6 ka) and the Last Millennium (LM) (850–1850 CE). The skill measures may be used to validate robust patterns of climate change across scenarios or to distinguish between models that have differing outcomes in future scenarios. We find that the multi-model ensemble of paleo-simulations is adequate for addressing at least some of these issues. For example, selected benchmarks for the LGM and MH are correlated to the rank of future projections of precipitation/temperature or sea ice extent to indicate that models that produce the best agreement with paleoclimate information give demonstrably different future results than the rest of the models. We also find that some comparisons, for instance associated with model variability, are strongly dependent on uncertain forcing timeseries, or show time dependent behaviour, making direct inferences for the future problematic. Overall, we demonstrate that there is a strong potential for the paleo-climate simulations to help inform the future projections and urge all the modeling groups to complete this subset of the CMIP5 runs.
Abstract:
Multiple equilibria in a coupled ocean–atmosphere–sea ice general circulation model (GCM) of an aquaplanet with many degrees of freedom are studied. Three different stable states are found for exactly the same set of parameters and external forcings: a cold state in which a polar sea ice cap extends into the midlatitudes; a warm state, which is ice free; and a completely sea ice–covered “snowball” state. Although low-order energy balance models of the climate are known to exhibit intransitivity (i.e., more than one climate state for a given set of governing equations), the results reported here are the first to demonstrate that this is a property of a complex coupled climate model with a consistent set of equations representing the 3D dynamics of the ocean and atmosphere. The coupled model notably includes atmospheric synoptic systems, large-scale circulation of the ocean, a fully active hydrological cycle, sea ice, and a seasonal cycle. There are no flux adjustments, with the system being solely forced by incoming solar radiation at the top of the atmosphere. It is demonstrated that the multiple equilibria owe their existence to the presence of meridional structure in ocean heat transport: namely, a large heat transport out of the tropics and a relatively weak high-latitude transport. The associated large midlatitude convergence of ocean heat transport leads to a preferred latitude at which the sea ice edge can rest. The mechanism operates in two very different ocean circulation regimes, suggesting that the stabilization of the large ice cap could be a robust feature of the climate system. Finally, the role of ocean heat convergence in permitting multiple equilibria is further explored in simpler models: an atmospheric GCM coupled to a slab mixed layer ocean and an energy balance model.
Abstract:
While changes in land precipitation during the last 50 years have been attributed in part to human influences, results vary by season, are affected by data uncertainty and do not account for changes over ocean. One of the more physically robust responses of the water cycle to warming is the expected amplification of existing patterns of precipitation minus evaporation. Here, precipitation changes in wet and dry regions are analyzed from satellite data for 1988–2010, covering land and ocean. We derive fingerprints for the expected change from climate model simulations that separately track changes in wet and dry regions. The simulations used are driven with anthropogenic and natural forcings combined, and greenhouse gas forcing or natural forcing only. Results of detection and attribution analysis show that the fingerprint of combined external forcing is detectable in observations and that this intensification of the water cycle is partly attributable to greenhouse gas forcing.
Abstract:
Biomization provides an objective and robust method of assigning pollen spectra to biomes so that pollen data can be mapped and compared directly with the output of biogeographic models. We have tested the applicability of this procedure, originally developed for Europe, to assign modern surface samples from China to biomes. The procedure successfully delineated the major vegetation types of China. When the same procedure was applied to fossil pollen samples for 6000 years ago, the reconstructions showed systematic differences from present, consistent with previous interpretations of vegetation changes since the mid-Holocene. In eastern China, the forest zones were systematically shifted northwards, such that cool mixed forests displaced taiga in northeastern China, while broad-leaved evergreen forest extended c. 300 km and temperate deciduous forest c. 500–600 km beyond their present northern limits. In northwestern China, the area of desert and steppe vegetation was reduced compared to present. On the Tibetan Plateau, forest vegetation extended to higher elevations than today and the area of tundra was reduced. These shifts in biome distributions imply significant changes in climate since 6000 years ago that can be interpreted qualitatively as a response to orbital forcing and its secondary effects on the Asian monsoon.
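Biomization assigns each pollen spectrum to the biome with the highest affinity score, computed by summing (typically square-root-transformed) pollen percentages over the taxa characteristic of that biome. The sketch below uses made-up taxon lists and percentages purely to illustrate the scoring step; the published procedure works through plant functional types, which this simplification skips.

```python
import math

# Hypothetical biome-to-taxon lists and a made-up pollen spectrum:
biome_taxa = {
    "steppe": {"Artemisia", "Chenopodiaceae", "Poaceae"},
    "taiga":  {"Picea", "Larix", "Betula"},
}

def assign_biome(spectrum, biome_taxa):
    """Score each biome as the sum of sqrt(pollen %) over its taxa and
    assign the spectrum to the highest-scoring biome."""
    scores = {biome: sum(math.sqrt(spectrum.get(taxon, 0.0)) for taxon in taxa)
              for biome, taxa in biome_taxa.items()}
    best = max(scores, key=scores.get)
    return best, scores

spectrum = {"Artemisia": 40.0, "Poaceae": 25.0, "Picea": 5.0}
best, scores = assign_biome(spectrum, biome_taxa)   # assigns "steppe"
```

The square-root transform damps the influence of over-represented wind-pollinated taxa, so a spectrum is classified by the breadth of its characteristic taxa rather than by a single dominant pollen type.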
Abstract:
Global hydrographic and air–sea freshwater flux datasets are used to investigate ocean salinity changes over 1950–2010 in relation to surface freshwater flux. On multi-decadal timescales, surface salinity increases (decreases) in evaporation (precipitation) dominated regions, the Atlantic–Pacific salinity contrast increases, and the upper thermocline salinity maximum increases while the salinity minimum of intermediate waters decreases. Potential trends in E–P are examined for 1950–2010 (using two reanalyses) and 1979–2010 (using four reanalyses and two blended products). Large differences in the 1950–2010 E–P trend patterns are evident in several regions, particularly the North Atlantic. For 1979–2010 some coherency in the spatial change patterns is evident but there is still a large spread in trend magnitude and sign between the six E–P products. However, a robust pattern of increased E–P in the southern hemisphere subtropical gyres is seen in all products. There is also some evidence in the tropical Pacific for a link between the spatial change patterns of salinity and E–P associated with ENSO. The water cycle amplification rate over specific regions is subsequently inferred from the observed 3-D salinity change field using a salt conservation equation in variable isopycnal volumes, implicitly accounting for the migration of isopycnal surfaces. Inferred global changes of E–P over 1950–2010 amount to an increase of 1 ± 0.6 % in net evaporation across the subtropics and an increase of 4.2 ± 2 % in net precipitation across subpolar latitudes. Amplification rates are approximately doubled over 1979–2010, consistent with accelerated broad-scale warming but also coincident with much improved salinity sampling over the latter period.
Abstract:
Seasonal-to-interannual predictions of Arctic sea ice may be important for Arctic communities and industries alike. Previous studies have suggested that Arctic sea ice is potentially predictable but that the skill of predictions of the September extent minimum, initialized in early summer, may be low. The authors demonstrate that a melt season “predictability barrier” and two predictability reemergence mechanisms, suggested by a previous study, are robust features of five global climate models. Analysis of idealized predictions with one of these models [Hadley Centre Global Environment Model, version 1.2 (HadGEM1.2)], initialized in January, May and July, demonstrates that this predictability barrier exists in initialized forecasts as well. As a result, the skill of sea ice extent and volume forecasts are strongly start date dependent and those that are initialized in May lose skill much faster than those initialized in January or July. Thus, in an operational setting, initializing predictions of extent and volume in July has strong advantages for the prediction of the September minimum when compared to predictions initialized in May. Furthermore, a regional analysis of sea ice predictability indicates that extent is predictable for longer in the seasonal ice zones of the North Atlantic and North Pacific than in the regions dominated by perennial ice in the central Arctic and marginal seas. In a number of the Eurasian shelf seas, which are important for Arctic shipping, only the forecasts initialized in July have continuous skill during the first summer. In contrast, predictability of ice volume persists for over 2 yr in the central Arctic but less in other regions.