897 results for Unbiased estimating functions
Abstract:
The composition of species communities is changing rapidly through drivers such as habitat loss and climate change, with potentially serious consequences for the resilience of ecosystem functions on which humans depend. To assess such changes in resilience, we analyse trends in the frequency of species in Great Britain that provide key ecosystem functions: specifically decomposition, carbon sequestration, pollination, pest control and cultural values. For 4,424 species over four decades, there have been significant net declines among animal species that provide pollination, pest control and cultural values. Groups providing decomposition and carbon sequestration remain relatively stable, as fewer species are in decline and these are offset by large numbers of new arrivals into Great Britain. While there is general concern about degradation of a wide range of ecosystem functions, our results suggest actions should focus on particular functions for which there is evidence of substantial erosion of their resilience.
Abstract:
Models for which the likelihood function can be evaluated only up to a parameter-dependent unknown normalizing constant, such as Markov random field models, are used widely in computer science, statistical physics, spatial statistics, and network analysis. However, Bayesian analysis of these models using standard Monte Carlo methods is not possible due to the intractability of their likelihood functions. Several methods that permit exact, or close to exact, simulation from the posterior distribution have recently been developed. However, estimating the evidence and Bayes factors for these models remains challenging in general. This paper describes new random weight importance sampling and sequential Monte Carlo methods for estimating Bayes factors that use simulation to circumvent the evaluation of the intractable likelihood, and compares them to existing methods. In some cases we observe an advantage in the use of biased weight estimates. An initial investigation into the theoretical and empirical properties of this class of methods is presented. Some support for the use of biased estimates emerges, but we advocate caution in the use of such estimates.
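The core idea can be sketched in a toy setting: replace each exact importance weight by an unbiased noisy estimate and average as usual. This is a minimal sketch, not the paper's algorithm — a conjugate Gaussian model is used so the true evidence is known, and the mean-one lognormal noise level is an assumption for illustration.

```python
# Random-weight importance sampling for the model evidence Z = ∫ p(y|θ)p(θ)dθ.
# Toy model: y | theta ~ N(theta, 1), prior theta ~ N(0, 1), so the exact
# evidence is the N(0, 2) density evaluated at y.
import numpy as np

rng = np.random.default_rng(0)
y = 1.3
true_Z = np.exp(-y**2 / 4) / np.sqrt(4 * np.pi)   # exact evidence

n = 200_000
theta = rng.normal(0.0, 2.0, n)                    # proposal q = N(0, 4)
q = np.exp(-theta**2 / 8) / np.sqrt(8 * np.pi)
prior = np.exp(-theta**2 / 2) / np.sqrt(2 * np.pi)
lik = np.exp(-(y - theta)**2 / 2) / np.sqrt(2 * np.pi)

# replace each exact likelihood by an unbiased noisy estimate (mean-one
# lognormal noise); the importance-sampling average remains unbiased for Z
s = 0.3
noisy_lik = lik * rng.lognormal(-s**2 / 2, s, n)
Z_hat = np.mean(noisy_lik * prior / q)
print(Z_hat, true_Z)   # the two agree closely despite the noisy weights
```

A Bayes factor estimate then follows as the ratio of two such evidence estimates, one per model.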
Abstract:
Human observers exhibit large systematic distance-dependent biases when estimating the three-dimensional (3D) shape of objects defined by binocular image disparities. This has led some to question the utility of disparity as a cue to 3D shape and whether accurate estimation of 3D shape is at all possible. Others have argued that accurate perception is possible, but only with large continuous perspective transformations of an object. Using a stimulus that is known to elicit large distance-dependent perceptual bias (random dot stereograms of elliptical cylinders), we show that, contrary to these findings, the simple adoption of a more naturalistic viewing angle completely eliminates this bias. Using behavioural psychophysics, coupled with a novel surface-based reverse correlation methodology, we show that it is binocular edge and contour information that allows for accurate and precise perception, and that observers actively exploit and sample this information when it is available.
Abstract:
Background: Accurate dietary assessment is key to understanding nutrition-related outcomes and is essential for estimating dietary change in nutrition-based interventions. Objective: The objective of this study was to assess the pan-European reproducibility of the Food4Me food-frequency questionnaire (FFQ) in assessing the habitual diet of adults. Methods: Participants from the Food4Me study, a 6-mo, Internet-based, randomized controlled trial of personalized nutrition conducted in the United Kingdom, Ireland, Spain, Netherlands, Germany, Greece, and Poland, were included. Screening and baseline data (both collected before commencement of the intervention) were used in the present analyses, and participants were included only if they completed FFQs at screening and at baseline within a 1-mo time frame before the commencement of the intervention. Sociodemographic (e.g., sex and country) and lifestyle [e.g., body mass index (BMI, in kg/m²) and physical activity] characteristics were collected. Linear regression, correlation coefficients, concordance (percentage) in quartile classification, and Bland-Altman plots for daily intakes were used to assess reproducibility. Results: In total, 567 participants (59% female), with a mean ± SD age of 38.7 ± 13.4 y and BMI of 25.4 ± 4.8, completed both FFQs within 1 mo (mean ± SD: 19.2 ± 6.2 d). Exact plus adjacent classification of total energy intake in participants was highest in Ireland (94%) and lowest in Poland (81%). Spearman correlation coefficients (r) in total energy intake between FFQs ranged from 0.50 for obese participants to 0.68 and 0.60 in normal-weight and overweight participants, respectively. Bland-Altman plots showed a mean difference between FFQs of −10 kcal/d, with the agreement deteriorating as energy intakes increased. There was little variation in reproducibility of total energy intakes between sex and age groups.
Conclusions: The online Food4Me FFQ was shown to be reproducible across 7 European countries when administered within a 1-mo period to a large number of participants. The results support the utility of the online Food4Me FFQ as a reproducible tool across multiple European populations. This trial was registered at clinicaltrials.gov as NCT01530139.
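The reproducibility statistics described above can be sketched with a few lines of numpy: Spearman correlation between the two FFQ administrations, plus the Bland-Altman bias and limits of agreement. The synthetic intakes below stand in for the Food4Me data; all numbers are illustrative assumptions.

```python
# Reproducibility checks for two administrations of a dietary questionnaire.
import numpy as np

rng = np.random.default_rng(1)
ffq1 = ffq_screening = rng.normal(2100, 450, 500)     # kcal/d (synthetic)
ffq2 = ffq1 + rng.normal(-10, 180, 500)               # baseline, small bias

def spearman(a, b):
    """Spearman rho = Pearson correlation of the ranks (no ties assumed)."""
    ra = np.argsort(np.argsort(a))
    rb = np.argsort(np.argsort(b))
    return np.corrcoef(ra, rb)[0, 1]

rho = spearman(ffq1, ffq2)
mean_diff = np.mean(ffq2 - ffq1)                      # Bland-Altman bias
loa = 1.96 * np.std(ffq2 - ffq1, ddof=1)              # limits of agreement
print(f"rho={rho:.2f}, bias={mean_diff:.0f} kcal/d, LoA=+/-{loa:.0f}")
```

A full Bland-Altman analysis would also plot the per-participant difference against the per-participant mean to reveal the intake-dependent agreement the abstract mentions.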
Abstract:
This study determined the sensory shelf life of a commercial brand of chocolate and carrot cupcakes, aiming to extend the current 120-day shelf life to 180 days. Appearance, texture, flavor and overall quality of cakes stored for six different storage times were evaluated by 102 consumers. The data were analyzed by analysis of variance and linear regression. For both flavors, texture showed the greatest loss in acceptance during the storage period, with an acceptance mean close to indifference on the hedonic scale at 120 days. Nevertheless, appearance, flavor and overall quality stayed acceptable up to 150 days. The end of shelf life was estimated at about 161 days for chocolate cakes and 150 days for carrot cakes. This study showed that the current 120 days of shelf life can be extended to 150 days for carrot cake and to 160 days for chocolate cake. However, the 180 days of shelf life desired by the company were not achieved. PRACTICAL APPLICATIONS: This research shows the adequacy of using sensory acceptance tests to determine the shelf life of two food products (chocolate and carrot cupcakes). This practical application is useful because the precise determination of the shelf life of a food product is of vital importance for its commercial success. The maximum storage time should always be evaluated during the development or reformulation of products and after changes in packaging or storage conditions. Once the physical-chemical and microbiological stability of a product is guaranteed, sensory changes that could affect consumer acceptance will determine the end of the shelf life of a food product. Thus, the use of sensitive and reliable methods to estimate the sensory shelf life of a product is very important. The findings show the importance of determining the shelf life of each product separately and of avoiding the use of the shelf life estimated for a specific product on other, similar products.
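The shelf-life estimation step can be sketched as a linear regression of mean acceptance on storage time, solved for the day the fitted line crosses an acceptability cutoff. The hedonic scores and the cutoff of 6.0 below are invented for illustration, not the study's data.

```python
# Sensory shelf life via linear regression: fit score = a*days + b, then
# solve for the storage time at which the fit reaches the cutoff.
import numpy as np

days = np.array([0, 30, 60, 90, 120, 150])          # storage times tested
score = np.array([7.6, 7.4, 7.1, 6.8, 6.4, 6.1])    # mean acceptance (toy)
slope, intercept = np.polyfit(days, score, 1)        # linear fit

cutoff = 6.0                                         # assumed acceptability limit
shelf_life = (cutoff - intercept) / slope            # day the fit hits the cutoff
print(f"estimated shelf life: {shelf_life:.0f} days")
```

In practice the cutoff would be anchored to the hedonic-scale indifference point or a consumer-rejection criterion, and confidence bands on the fit would bound the estimate.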
Abstract:
Cosmic shear requires high precision measurement of galaxy shapes in the presence of the observational point spread function (PSF) that smears out the image. The PSF must therefore be known for each galaxy to a high accuracy. However, for several reasons, the PSF is usually wavelength dependent; therefore, the differences between the spectral energy distribution of the observed objects introduce further complexity. In this paper, we investigate the effect of the wavelength dependence of the PSF, focusing on instruments in which the PSF size is dominated by the diffraction limit of the telescope and which use broad-band filters for shape measurement. We first calculate biases on cosmological parameter estimation from cosmic shear when the stellar PSF is used uncorrected. Using realistic galaxy and star spectral energy distributions and populations and a simple three-component circular PSF, we find that the colour dependence must be taken into account for the next generation of telescopes. We then consider two different methods for removing the effect: (i) the use of stars of the same colour as the galaxies and (ii) estimation of the galaxy spectral energy distribution using multiple colours and using a telescope model for the PSF. We find that both of these methods correct the effect to levels below the tolerances required for per cent level measurements of dark energy parameters. Comparison of the two methods favours the template-fitting method because its efficiency is less dependent on galaxy redshift than the broad-band colour method and takes full advantage of deeper photometry.
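The wavelength dependence at the heart of this effect follows from the diffraction limit: for a circular aperture the PSF angular size scales as roughly 1.22λ/D, so it grows linearly with wavelength. The aperture and band wavelengths below are assumed values for illustration, not those of any particular instrument.

```python
# Diffraction-limited PSF size vs wavelength for a circular aperture.
import numpy as np

D = 1.2                                    # aperture diameter in m (assumed)
lam = np.array([550e-9, 700e-9, 900e-9])   # sample band wavelengths in m
theta = 1.22 * lam / D                     # first-Airy-minimum radius, rad
arcsec = np.degrees(theta) * 3600
for l, a in zip(lam, arcsec):
    print(f"{l * 1e9:.0f} nm -> {a:.3f} arcsec")
```

A red galaxy observed through a broad band therefore sees a measurably broader effective PSF than a blue star, which is why using the stellar PSF uncorrected biases galaxy shape measurements.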
Abstract:
We estimate crustal structure and thickness of South America north of roughly 40 degrees S. To this end, we analyzed receiver functions from 20 relatively new temporary broadband seismic stations deployed across eastern Brazil. In the analysis we include teleseismic and some regional events, particularly for stations that recorded few suitable earthquakes. We first estimate crustal thickness and average Poisson's ratio using two different stacking methods. We then combine the new crustal constraints with results from previous receiver function studies. To interpolate the crustal thickness between the station locations, we jointly invert these Moho point constraints, Rayleigh wave group velocities, and regional S and Rayleigh waveforms for a continuous map of Moho depth. The new tomographic Moho map suggests that Moho depth and Moho relief vary slightly with age within the Precambrian crust. Whether or not a positive correlation between crustal thickness and geologic age is derived from the pre-interpolation point constraints depends strongly on the selected subset of receiver functions. This implies that using only pre-interpolation point constraints (receiver functions) inadequately samples the spatial variation in geologic age. The new Moho map also reveals an anomalously deep Moho beneath the oldest core of the Amazonian Craton.
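The first step, converting a Ps-P delay in a receiver function to crustal thickness, can be sketched as follows; the crustal velocities, ray parameter, and delay time are assumed values for illustration, not results from the study.

```python
# Moho depth from a receiver-function Ps-P delay:
#   H = t_Ps / (sqrt(1/Vs^2 - p^2) - sqrt(1/Vp^2 - p^2))
import numpy as np

vp, vs = 6.5, 3.7    # km/s, average crustal velocities (assumed)
p = 0.06             # s/km, teleseismic ray parameter (assumed)
t_ps = 5.0           # s, observed Ps-P delay (assumed)

eta_s = np.sqrt(vs**-2 - p**2)   # vertical S slowness in the crust
eta_p = np.sqrt(vp**-2 - p**2)   # vertical P slowness in the crust
H = t_ps / (eta_s - eta_p)       # crustal thickness, km
print(f"Moho depth ~ {H:.1f} km")
```

H-kappa stacking extends this by also using the PpPs and PsPs+PpSs multiples, which breaks the trade-off between thickness and the Vp/Vs (hence Poisson's) ratio.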
Abstract:
The deep crustal structure of the Parana Basin of southern Brazil is investigated by analyzing P- and PP-wave receiver functions at 17 Brazilian Lithosphere Seismic Project stations within the basin. The study area can be described as a typical Paleozoic intracratonic basin that hosts one of the largest Large Igneous Provinces in the world, making it a unique setting for investigating models of basin subsidence and their interaction with mantle plumes. Our study consists of (1) an analysis of the Moho interaction phases in the receiver functions to obtain the thickness and bulk Vp/Vs ratio of the basin's underlying crust and (2) a joint inversion with Rayleigh-wave dispersion velocities from an independent tomographic study to delineate the detailed S-wave velocity variation with depth. The results of our analysis reveal that Moho depths and bulk Vp/Vs ratios (including sediments) vary between 41 and 48 km and between 1.70 and 1.76, respectively, with the largest values roughly coinciding with the basin's axis, and that S-wave velocities in the lower crust are generally below 3.8 km/s. Select sites within the basin, however, show lower crustal S-wave velocities slightly above 3.9 km/s, suggestive of underplated mafic material. We show that these observations are consistent with a fragmented cratonic root under the Parana Basin that defined a zone of weakness for the initial Paleozoic subsidence of the basin and which allowed localized mafic underplating of the crust along the suture zones by Cenozoic magmatism.
Abstract:
Receiver functions from small local earthquakes were used to determine sediment thicknesses in the Porto dos Gauchos seismic zone (PGSZ), Parecis basin, Amazonian craton, Brazil. The high velocity contrast between basement and sediments (P-wave velocities of 6.1 and 3.0 km/s, respectively) favors the generation of clear P-to-S converted phases (Ps) seen in the radial component, and also S-to-P conversions (Sp) seen in the vertical component. A reference 1D velocity model determined with a shallow refraction experiment in the PGSZ helped to convert Ps-P time differences to basement depths at 15 stations deployed for aftershock studies. The receiver function results, integrated with the shallow refraction data, reveal that basement depths in the PGSZ increase from the basin border in the north to about 600 m in the south. The basement topography, however, does not vary smoothly, and a basement high with steep topography was detected near the epicentral area. A 400 m elevation difference within 1.7 km distance suggests a possible border fault of a buried graben. This feature seems to be oriented roughly WSW-ENE and could indicate basement structures related to the seismicity of the Porto dos Gauchos seismic zone.
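At near-vertical incidence the conversion from a Ps-P time difference to sediment thickness reduces to h = dt / (1/Vs - 1/Vp) for the sediment layer. The sediment Vp of 3.0 km/s is quoted in the abstract; the sediment Vs and the delay time below are assumptions for illustration.

```python
# Sediment thickness from a Ps-P delay, near-vertical incidence.
vp_sed = 3.0    # km/s, sediment P velocity (from the abstract)
vs_sed = 1.5    # km/s, sediment S velocity (assumed; Vp/Vs ~ 2 is typical)
dt = 0.2        # s, Ps-P delay at one station (illustrative)

# Ps travels the layer at Vs, P at Vp; their vertical travel-time
# difference over a layer of thickness h is h*(1/Vs - 1/Vp)
h = dt / (1.0 / vs_sed - 1.0 / vp_sed)   # km
print(f"basement depth ~ {h * 1000:.0f} m")
```

With a nonzero ray parameter the vertical slownesses sqrt(1/V^2 - p^2) replace 1/V, as in standard receiver-function depth conversion.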
Abstract:
Information to guide decision making is especially urgent in human-dominated landscapes in the tropics, where urban and agricultural frontiers are still expanding in an unplanned manner. Nevertheless, most studies that have investigated the influence of landscape structure on species distribution have not considered the heterogeneity of the altered habitats of the matrix, which is usually high in human-dominated landscapes. Using the distribution of small mammals in forest remnants and in the four main altered habitats in an Atlantic forest landscape, we investigated 1) how the explanatory power of models describing species distribution in forest remnants varies between landscape structure variables that do or do not incorporate matrix quality and 2) the importance of spatial scale for analyzing the influence of landscape structure. We used standardized sampling in remnants and altered habitats to generate two indices of habitat quality, corresponding to the abundance and to the occurrence of small mammals. For each remnant, we calculated habitat quantity and connectivity at different spatial scales, either considering or not considering the quality of surrounding habitats. The incorporation of matrix quality increased model explanatory power across all spatial scales for half the species that occurred in the matrix, but only when taking into account the distance between habitat patches (connectivity). These connectivity models were also less affected by spatial scale than habitat quantity models. The few consistent responses to variation in spatial scale indicate that, despite their small size, small mammals perceive landscape features at large spatial scales. The matrix quality index based on species occurrence performed similarly to or better than the index based on species abundance.
Results indicate the importance of the matrix for the dynamics of fragmented landscapes and suggest that relatively simple indices can improve our understanding of species distribution, and could be applied in modeling, monitoring and managing complex tropical landscapes.
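The kind of index discussed above can be sketched as a distance-weighted sum over surrounding patches, with an optional per-habitat quality weight. This is a generic Hanski-style connectivity index, not necessarily the authors' exact formula, and the patch distances, areas, and quality scores are invented for illustration.

```python
# Connectivity of a focal forest remnant, with and without matrix quality.
import numpy as np

# surrounding patches: distance to focal remnant (km), area (ha), quality (0-1)
dist = np.array([0.2, 0.5, 1.1, 2.0])
area = np.array([12.0, 5.0, 30.0, 8.0])
quality = np.array([1.0, 0.4, 0.7, 0.1])   # 1.0 = forest, <1 = matrix habitat

def connectivity(alpha, use_quality=True):
    """Hanski-style index: sum of exp(-alpha * d) * area [* quality]."""
    w = np.exp(-alpha * dist) * area       # distance-decayed patch weights
    return float(np.sum(w * quality)) if use_quality else float(np.sum(w))

print(connectivity(alpha=1.0, use_quality=False))   # quality ignored
print(connectivity(alpha=1.0, use_quality=True))    # matrix quality weighted
```

Varying alpha (the inverse of the decay distance) reproduces the study's comparison across spatial scales, since a small alpha lets distant patches contribute.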
Abstract:
Only a small fraction of spectra acquired in LC-MS/MS runs matches peptides from target proteins upon database searches. The remaining, operationally termed background, spectra originate from a variety of poorly controlled sources and affect the throughput and confidence of database searches. Here, we report an algorithm and its software implementation that rapidly removes background spectra, regardless of their precise origin. The method estimates the dissimilarity distance between screened MS/MS spectra and unannotated spectra from a partially redundant background library compiled from several control and blank runs. Filtering MS/MS queries enhanced the protein identification capacity when searches lacked spectrum to sequence matching specificity. In sequence-similarity searches it reduced by, on average, 30-fold the number of orphan hits, which were not explicitly related to background protein contaminants and required manual validation. Removing high quality background MS/MS spectra, while preserving in the data set the genuine spectra from target proteins, decreased the false positive rate of stringent database searches and improved the identification of low-abundance proteins.
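The filtering idea can be sketched with a simple dissimilarity measure: score each query spectrum by its distance to the nearest background-library spectrum and remove queries that sit too close to the background. The 1-minus-cosine measure, the toy spectra, and the threshold below are illustrative assumptions, not the paper's exact metric.

```python
# Background-spectrum filtering by dissimilarity to a library.
import numpy as np

def cosine_dissim(a, b):
    """1 - cosine similarity of two binned MS/MS intensity vectors."""
    return 1.0 - np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

background = [np.array([5.0, 0.0, 3.0, 1.0, 0.0]),    # toy library spectra
              np.array([0.0, 4.0, 0.0, 2.0, 1.0])]
queries = {"bg_like": np.array([5.0, 0.0, 3.0, 1.0, 0.1]),
           "genuine": np.array([0.0, 1.0, 0.0, 0.0, 6.0])}

threshold = 0.2   # assumed cutoff on distance to the nearest library spectrum
results = {}
for name, q in queries.items():
    d = min(cosine_dissim(q, b) for b in background)
    results[name] = "remove" if d < threshold else "keep"
    print(name, round(d, 3), results[name])
```

Real implementations bin spectra onto a common m/z grid and tune the threshold so that genuine target-protein spectra are preserved.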
Abstract:
Most techniques used for estimating the age of Sotalia guianensis (van Bénéden, 1864) (Cetacea; Delphinidae) are very expensive and require sophisticated equipment for preparing histological sections of teeth. The objective of this study was to test a more affordable and much simpler method, involving manual wear of teeth followed by decalcification and observation under a stereomicroscope. This technique has been employed successfully with larger species of Odontoceti. Twenty-six specimens were selected, and one tooth of each specimen was worn and demineralized for reading of growth layers. Growth layers were evident in all specimens; however, in 4 of the 26 teeth, not all the layers could be clearly observed. In these teeth, there was a significant decrease in growth layer group thickness, thus hindering the counting of layers. The juxtaposition of layers hindered the reading of larger numbers of layers by the wear-and-decalcification technique. Analysis of more than 17 layers in a single tooth proved inconclusive. The method applied here proved efficient in estimating the age of Sotalia guianensis individuals younger than 18 years. This method could simplify the study of the age structure of the overall population, and allows the use of the more expensive methodologies to be confined to more specific studies of older specimens. It also enables the classification of the calf, young and adult classes, which is important for general population studies.