895 results for error estimate


Relevance: 20.00%

Abstract:

Tests for bioaccessibility are useful in human health risk assessment. No studies aimed at determining bioaccessible arsenic (As) in areas affected by gold mining and smelting activities have been published in Brazil so far. Samples were collected from four areas: a private natural land reserve in the Cerrado; mine tailings; overburden; and refuse from gold smelting of a mining company in Paracatu, Minas Gerais. The total, bioaccessible and Mehlich-1-extractable As levels were determined. Based on the reproducibility and the accuracy/precision of the in vitro gastrointestinal (IVG) method for determining bioaccessible As in the reference material NIST 2710, it was concluded that this procedure is adequate for determining bioaccessible As in soil and tailing samples from gold mining areas in Brazil. All samples from the studied mining area contained low percentages of bioaccessible As.

Relevance: 20.00%

Abstract:

Studies on water retention and availability are scarce for subtropical or humid temperate climate regions of the southern hemisphere. The aims of this study were to evaluate the relations of the soil physical, chemical, and mineralogical properties with water retention and availability for the generation and validation of continuous point pedotransfer functions (PTFs) for soils of the State of Santa Catarina (SC) in the South of Brazil. Horizons of 44 profiles were sampled in areas under different cover crops and regions of SC, to determine: field capacity (FC, 10 kPa), permanent wilting point (PWP, 1,500 kPa), available water content (AW, by difference), saturated hydraulic conductivity, bulk density, aggregate stability, particle size distribution (seven classes), organic matter content, and particle density. Chemical and mineralogical properties were obtained from the literature. Spearman's rank correlation analysis and path analysis were used in the statistical analyses. The point PTFs for estimation of FC, PWP and AW were generated for the soil surface and subsurface through multiple regression analysis, followed by robust regression analysis, using two sets of predictive variables. Soils with finer texture and/or greater organic matter content retain more moisture, and organic matter is the property that mainly controls the water availability to plants in soil surface horizons. Path analysis was useful in understanding the relationships between soil properties for FC, PWP and AW. The predictive power of the generated PTFs to estimate FC and PWP was good for all horizons, while AW was best estimated by more complex models with better prediction for the surface horizons of soils in Santa Catarina.
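
As a rough illustration of how a continuous point PTF of this kind can be generated, the sketch below fits a multiple linear regression of field capacity on particle-size fractions and organic matter content. The variable names, the synthetic data and the use of scikit-learn are assumptions for illustration only; they do not reproduce the authors' predictor sets or the robust-regression step.

    # Minimal sketch of a point pedotransfer function (PTF) fitted by multiple
    # regression, assuming hypothetical predictors (clay, silt, organic matter).
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    n = 44                                    # number of sampled horizons (illustrative)
    clay = rng.uniform(5, 60, n)              # %
    silt = rng.uniform(5, 50, n)              # %
    om = rng.uniform(0.5, 8, n)               # organic matter, %
    # Synthetic "measured" field capacity at 10 kPa (volumetric %), with noise.
    fc = 5 + 0.45 * clay + 0.20 * silt + 1.5 * om + rng.normal(0, 2, n)

    X = np.column_stack([clay, silt, om])
    ptf = LinearRegression().fit(X, fc)       # FC = b0 + b1*clay + b2*silt + b3*OM
    print("coefficients:", ptf.coef_, "intercept:", ptf.intercept_)
    print("R^2 on the fitting data:", ptf.score(X, fc))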

Relevance: 20.00%

Abstract:

We study the sensitivity limits of a broadband gravitational-wave detector based on dual resonators such as nested spheres. We determine both the thermal and back-action noises when the resonators' displacements are read out with an optomechanical sensor. We analyze the contributions of all mechanical modes, using a new method to deal with the force-displacement transfer functions in the intermediate frequency domain between the two gravitational-wave-sensitive modes associated with each resonator. This method gives an accurate estimate of the mechanical response, together with an evaluation of the error of this estimate. We show that very high sensitivities can be reached over a wide frequency band for realistic parameters in the case of a dual-sphere detector.
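
The method concerns summing modal contributions to the mechanical response; a generic and much simplified way to picture this is the usual modal expansion of a force-to-displacement transfer function. The mode frequencies, effective masses and quality factors in the sketch below are made-up numbers, not the dual-sphere parameters of the paper.

    # Illustrative modal expansion of a force-displacement transfer function:
    # H(w) = sum_n (1/m_n) / (w_n^2 - w^2 + i*w_n*w/Q_n), with invented parameters.
    import numpy as np

    omega = 2 * np.pi * np.linspace(500, 5000, 2000)      # analysis band [rad/s]
    modes = [                                              # (f_n [Hz], m_n [kg], Q_n)
        (1000.0, 1000.0, 1e6),
        (1800.0, 1200.0, 1e6),
        (3200.0, 900.0, 1e6),
    ]

    H = np.zeros_like(omega, dtype=complex)
    for f_n, m_n, q_n in modes:
        w_n = 2 * np.pi * f_n
        H += (1.0 / m_n) / (w_n**2 - omega**2 + 1j * w_n * omega / q_n)

    # |H| shows the resonance peaks and the quieter band in between them.
    print("max |H| =", np.abs(H).max(), "m/N")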

Relevance: 20.00%

Abstract:

Many of the most interesting questions ecologists ask lead to analyses of spatial data. Yet, perhaps confused by the large number of statistical models and fitting methods available, many ecologists seem to believe this is best left to specialists. Here, we describe the issues that need consideration when analysing spatial data and illustrate them using simulation studies. Our comparative analysis uses methods including generalized least squares, spatial filters, wavelet-revised models, conditional autoregressive models and generalized additive mixed models to estimate regression coefficients from synthetic but realistic data sets, including some that violate standard regression assumptions. We assess the performance of each method using two measures and using statistical error rates for model selection. Methods that performed well included the generalized least squares family of models and a Bayesian implementation of the conditional autoregressive model. Ordinary least squares also performed adequately in the absence of model selection, but had poorly controlled Type I error rates and therefore did not show the improvements in performance under model selection seen with the methods above. Removing large-scale spatial trends in the response led to poor performance. These are empirical results; hence extrapolation of these findings to other situations should be done cautiously. Nevertheless, our simulation-based approach provides much stronger evidence for comparative analysis than assessments based on single or small numbers of data sets, and should be considered a necessary foundation for statements of this type in the future.
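
As a toy version of the kind of comparison described here, the sketch below simulates a response with spatially autocorrelated errors (exponential covariance) and compares OLS with GLS supplied with the true error covariance. Everything in it (grid size, range parameter, use of statsmodels) is an assumption for illustration; the paper's simulations and the CAR, wavelet and GAMM fits are considerably richer.

    # Toy comparison of OLS vs GLS on synthetic spatially autocorrelated data.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n_side = 15
    xx, yy = np.meshgrid(np.arange(n_side), np.arange(n_side))
    coords = np.column_stack([xx.ravel(), yy.ravel()])          # 225 locations

    # Exponential spatial covariance for the errors (range parameter is invented).
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sigma = np.exp(-d / 3.0)

    x = rng.normal(size=len(coords))                            # one covariate
    beta_true = 0.5
    eps = rng.multivariate_normal(np.zeros(len(coords)), sigma)
    y = beta_true * x + eps

    X = sm.add_constant(x)
    ols = sm.OLS(y, X).fit()
    gls = sm.GLS(y, X, sigma=sigma).fit()                       # true covariance supplied
    print("OLS beta, SE:", ols.params[1], ols.bse[1])
    print("GLS beta, SE:", gls.params[1], gls.bse[1])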

Relevance: 20.00%

Abstract:

Groundwater management depends on knowledge of recharge rates and water fluxes within aquifers. Recharge is one of the water-cycle components most difficult to estimate. As a result, regardless of the chosen method, the estimates are subject to uncertainties that can be identified by comparison with other approaches. In this study, groundwater recharge estimates based on the water balance in the unsaturated zone are assessed. First, the approach is evaluated by comparing its results with those of another method. Then, the estimates are used as inputs to a transient groundwater flow model in order to assess how the water table would respond to the obtained recharge rates compared with measured levels. The results suggest good performance of the adopted approach and, despite some inherent limitations, it has advantages over other methods since the required data are easier to obtain.
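
A very condensed picture of a recharge estimate from the unsaturated-zone water balance is the bucket model below: storage is updated with precipitation minus evapotranspiration, and whatever exceeds the soil's storage capacity is counted as recharge. The capacity, the monthly series and the neglect of runoff are all illustrative assumptions, not the authors' formulation.

    # Minimal soil-water "bucket" balance: recharge = excess over storage capacity.
    # Monthly P and ET series and the storage capacity are invented for illustration.
    precip = [180, 150, 120, 60, 30, 10, 5, 10, 40, 90, 140, 170]   # mm/month
    et     = [90, 85, 80, 70, 55, 40, 35, 45, 60, 75, 85, 90]        # mm/month
    capacity = 100.0                                                  # mm of usable storage

    storage, recharge = 50.0, []
    for p, e in zip(precip, et):
        storage = max(storage + p - e, 0.0)       # no storage below zero
        r = max(storage - capacity, 0.0)          # excess drains below the root zone
        storage -= r
        recharge.append(r)

    print("annual recharge estimate:", sum(recharge), "mm")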

Relevance: 20.00%

Abstract:

We present a heuristic method for learning error-correcting output code (ECOC) matrices based on a hierarchical partition of the class space that maximizes a discriminative criterion. To achieve this goal, optimal codeword separation is sacrificed in favor of maximum class discrimination in the partitions. The hierarchical partition set is created using a binary tree. As a result, a compact matrix with high discrimination power is obtained. Our method is validated on the UCI database and applied to a real problem, the classification of traffic sign images.
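
To make the idea of turning a hierarchical class partition into a compact code matrix concrete, the sketch below recursively splits a class set with a binary tree and writes one ECOC column (+1/-1/0) per internal node. The function names and the plain alphabetical halving rule are placeholders standing in for the discriminative criterion of the paper.

    # Sketch: build an ECOC matrix from a binary tree partition of the class set.
    # Each internal node of the tree contributes one column: +1 for classes sent
    # left, -1 for classes sent right, 0 for classes not involved at that node.
    import numpy as np

    def tree_ecoc(classes, split):
        """classes: list of labels; split: function returning (left, right) lists."""
        columns = []

        def recurse(group):
            if len(group) < 2:
                return
            left, right = split(group)
            col = {c: 0 for c in classes}
            col.update({c: +1 for c in left})
            col.update({c: -1 for c in right})
            columns.append(col)
            recurse(left)
            recurse(right)

        recurse(list(classes))
        return np.array([[col[c] for col in columns] for c in classes])

    # Stand-in splitting rule: halve the group (the paper uses a discriminative one).
    halve = lambda g: (g[: len(g) // 2], g[len(g) // 2 :])
    M = tree_ecoc(["A", "B", "C", "D", "E"], halve)
    print(M)          # rows = codewords, columns = binary problems (N-1 of them)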

Relevance: 20.00%

Abstract:

A common way to model multiclass classification problems is by means of Error-Correcting Output Codes (ECOC). Given a multiclass problem, the ECOC technique designs a codeword for each class, where each position of the code identifies the membership of the class in a given binary problem. A classification decision is obtained by assigning the label of the class with the closest codeword. One of the main requirements of the ECOC design is that the base classifier be capable of splitting each subgroup of classes in each binary problem. However, we cannot guarantee that a linear classifier can model convex regions, and nonlinear classifiers also fail to handle some types of surfaces. In this paper, we present a novel strategy for modeling multiclass classification problems using subclass information in the ECOC framework. Complex problems are solved by splitting the original set of classes into subclasses and embedding the binary problems in a problem-dependent ECOC design. Experimental results show that the proposed splitting procedure yields better performance when the class overlap or the distribution of the training objects conceals the decision boundaries for the base classifier. The results are even more significant when the training set is sufficiently large.
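
The decoding step mentioned here, assigning the label of the class whose codeword is closest to the vector of binary-classifier outputs, can be sketched in a few lines. The codeword matrix and the predicted output vector below are invented, and Hamming distance stands in for whatever decoding measure is used in practice.

    # Sketch of ECOC decoding: pick the class whose codeword is closest (Hamming
    # distance) to the vector of base-classifier outputs.
    import numpy as np

    codewords = np.array([                  # invented 4-class, 5-dichotomy ECOC matrix
        [+1, +1, -1, -1, +1],
        [+1, -1, +1, -1, -1],
        [-1, +1, +1, +1, -1],
        [-1, -1, -1, +1, +1],
    ])
    labels = ["class0", "class1", "class2", "class3"]

    outputs = np.array([+1, -1, +1, +1, -1])          # predictions of the 5 binary classifiers
    dist = np.sum(codewords != outputs, axis=1)       # Hamming distance to each codeword
    print("decoded label:", labels[int(np.argmin(dist))], "distances:", dist)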

Relevance: 20.00%

Abstract:

This paper provides a practical illustration of three tools that allow the actuary to define tariff groups and estimate risk premiums in the class ratemaking process for non-life insurance. The first is segmentation analysis (CHAID and XAID), first used in 1997 by UNESPA on its common car portfolio. The second is a stepwise selection process based on the distance-based regression model. The third uses the well-known generalized linear model, which represents the most modern technique in the actuarial literature. With the latter, combining different link functions and error distributions yields the classical additive and multiplicative models.
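
The last point, that different link functions and error distributions of a GLM recover the classical additive and multiplicative tariff structures, can be illustrated with a couple of statsmodels calls using the default links of each family. The rating factors and claim data below are fabricated, and the Gaussian/identity vs Poisson/log pairing is just one common choice.

    # GLM sketch: identity link ~ additive tariff structure, log link ~ multiplicative.
    # Data (two rating factors and claim counts) are fabricated for illustration.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    df = pd.DataFrame({
        "age_group": rng.choice(["young", "middle", "senior"], size=500),
        "zone": rng.choice(["urban", "rural"], size=500),
    })
    rate = 0.3 * (df.age_group == "young") + 0.1 * (df.zone == "urban") + 0.2
    df["claims"] = rng.poisson(rate)

    additive = smf.glm("claims ~ age_group + zone", df,
                       family=sm.families.Gaussian()).fit()       # identity link by default
    multiplicative = smf.glm("claims ~ age_group + zone", df,
                             family=sm.families.Poisson()).fit()  # log link by default
    print(additive.params)        # effects add on the premium scale
    print(multiplicative.params)  # exp(coefficients) multiply on the premium scale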

Relevance: 20.00%

Abstract:

State general fund revenue estimates are generated by the Iowa Revenue Estimating Conference (REC). The REC comprises the Governor or their designee, the Director of the Legislative Services Agency, and a third person agreed upon by the other two members. The REC meets periodically, generally in October, December, and March/April. The Governor and the Legislature are required to use the REC estimates in preparing the state budget.

Relevance: 20.00%

Abstract:

Macroscopic features such as volume, surface estimate, thickness and caudorostral length of the human primary visual cortex (Brodmann's area 17) of 46 human brains between midgestation and 93 years of age were studied by means of camera lucida drawings from serial frontal sections. Individual values were best fitted by a logistic function from midgestation to adulthood and by a regression line between adulthood and old age. Allometric functions were calculated to study developmental relationships among all the features. The three-dimensional shape of area 17 was also reconstructed from the serial sections in 15 cases and correlated with the sequence of morphological events. The sulcal pattern of area 17 begins to develop around 21 weeks of gestation but remains rather simple until birth, becoming more convoluted, particularly in the caudal part, during the postnatal period. Until birth, a large increase in cortical thickness (about 83% of its mean adult value) and caudorostral length (69%) produces a moderate increase in cortical volume (31%) and surface estimate (40%) of area 17. After birth, the cortical volume and surface undergo their maximum growth rate, in spite of a rather small increase in cortical thickness and caudorostral length. This is due to the development of the pattern of gyrification within and around the calcarine fissure. All macroscopic features have reached the mean adult value by the end of the first postnatal year. With aging, the only features to undergo significant regression are the cortical surface estimate and the caudorostral length. The total number of neurons in area 17 shows great interindividual variability at all ages. No decrease in the postnatal period or in aging could be demonstrated.
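
The growth model used here, individual values fitted by a logistic function from midgestation to adulthood, can be sketched with a standard curve fit of the form V(t) = K / (1 + exp(-r*(t - t0))). The ages, volumes and starting values below are invented numbers that only show the shape of the fit, not the study's data.

    # Sketch of fitting a logistic growth curve V(t) = K / (1 + exp(-r*(t - t0))).
    # Ages (years, prenatal ages negative) and volumes are invented numbers.
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(t, K, r, t0):
        return K / (1.0 + np.exp(-r * (t - t0)))

    age = np.array([-0.4, -0.2, 0.0, 0.25, 0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
    vol = np.array([0.8, 1.5, 2.6, 4.8, 6.5, 7.8, 8.1, 8.3, 8.4, 8.4])   # cm^3, fictitious

    params, _ = curve_fit(logistic, age, vol, p0=[8.0, 3.0, 0.0])
    K, r, t0 = params
    print(f"asymptote K = {K:.2f} cm^3, growth rate r = {r:.2f}, midpoint t0 = {t0:.2f} y")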

Relevance: 20.00%

Abstract:

Robust estimators for accelerated failure time models with asymmetric (or symmetric) error distributions and censored observations are proposed. It is assumed that the error model belongs to a log-location-scale family of distributions and that the mean response is the parameter of interest. Since scale is a main component of the mean, scale is not treated as a nuisance parameter. A three-step procedure is proposed. In the first step, an initial high-breakdown-point S estimate is computed. In the second step, observations that are unlikely under the estimated model are rejected or down-weighted. Finally, a weighted maximum likelihood estimate is computed. To define the estimates, functions of censored residuals are replaced by their estimated conditional expectation given that the response is larger than the observed censored value. The rejection rule in the second step is based on an adaptive cut-off that, asymptotically, does not reject any observation when the data are generated according to the model. Therefore, the final estimate attains full efficiency at the model, with respect to the maximum likelihood estimate, while maintaining the breakdown point of the initial estimator. Asymptotic results are provided. The new procedure is evaluated with the help of Monte Carlo simulations. Two examples with real data are discussed.
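
One concrete instance of the replacement described here, where a function of a censored residual is replaced by its conditional expectation given that the response exceeds the observed censored value, is the normal-error case. The identity below is standard and is given only as an illustration, since the paper works with a general log-location-scale family rather than this particular choice.

    % Assuming a standard normal error Z and an observation censored at c
    % (we only know that Z > c), the conditional mean of the residual is
    \[
      \mathbb{E}[\,Z \mid Z > c\,] \;=\; \frac{\phi(c)}{1 - \Phi(c)},
    \]
    % where \phi and \Phi are the standard normal density and distribution
    % function. Quantities of this form stand in for the unobserved residuals
    % of censored responses before the weighted maximum likelihood step.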

Relevance: 20.00%

Abstract:

The physical disector is a method of choice for estimating unbiased neuron numbers; nevertheless, calibration is needed to evaluate each counting method. The validity of this method can be assessed by comparing the estimated cell number with the true number determined by a direct counting method in serial sections. We reconstructed one fifth of rat lumbar dorsal root ganglia taken from two experimental conditions. From each ganglion, images of 200 adjacent semi-thin sections were used to reconstruct a volumetric dataset (stack of voxels). On these stacks the number of sensory neurons was estimated and counted, respectively, by the physical disector and direct counting methods. In addition, using the coordinates of nuclei from the direct counting, we simulated, with a Matlab program, disector pairs separated by increasing distances in a ganglion model. The comparison between the results of these approaches clearly demonstrates that the physical disector method provides a valid and reliable estimate of the number of sensory neurons only when the distance between consecutive disector pairs is 60 µm or smaller. Under these conditions the error between the results of the physical disector and direct counting does not exceed 6%. In contrast, when the distance between two pairs is larger than 60 µm (70-200 µm) the error increases rapidly to 27%. We conclude that the physical disector method provides a reliable estimate of the number of rat sensory neurons only when the separating distance between consecutive disector pairs is no larger than 60 µm.
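
A stripped-down analogue of the simulation described here, with disector pairs placed at increasing separations along a stack of sections, is sketched below. The nucleus positions, section thickness and counts are invented, and the sketch reproduces only the counting rule (a nucleus is counted when it appears in the reference section but not in the look-up section), not the full volumetric reconstruction or the clustering of real ganglia.

    # Stripped-down simulation of physical disector counting along a section stack.
    # Nuclei are reduced to their "tops"; a nucleus is counted in a disector pair
    # when its top lies in the reference section (present there, absent in look-up).
    import numpy as np

    rng = np.random.default_rng(3)
    thickness = 4.0                       # section thickness, invented units
    n_sections = 200
    length = thickness * n_sections
    tops = rng.uniform(0, length, size=5000)          # true number of nuclei = 5000

    def disector_estimate(spacing_sections):
        refs = np.arange(0, n_sections - 1, spacing_sections)   # reference section indices
        q_minus = sum(((tops >= r * thickness) & (tops < (r + 1) * thickness)).sum()
                      for r in refs)
        return q_minus * (n_sections / len(refs))               # scale by sampling fraction

    for spacing in (2, 5, 15, 50):
        est = disector_estimate(spacing)
        print(f"spacing {spacing:>2} sections: estimate {est:.0f} (true 5000)")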

Relevance: 20.00%

Abstract:

A new method is used to estimate the volumes of sediments in glacial valleys. This method is based on the concept of the sloping local base level and requires only a digital terrain model and the limits of the alluvial valleys as input data. The bedrock surface of the glacial valley is estimated by a progressive excavation of the digital elevation model (DEM) of the filled valley area. This is performed using an iterative routine that replaces the altitude of a point of the DEM by the mean value of its neighbors minus a fixed value. The result is a curved surface, quadratic in 2D. The bedrock surface of the Rhone Valley in Switzerland was estimated by this method using the free Shuttle Radar Topography Mission (SRTM) digital terrain model (~92 m resolution). The results obtained are in good agreement with previous estimations based on seismic profiles and gravimetric modeling, with the exception of some particular locations. The results from the present method and those from the seismic interpretation differ slightly from the results of the gravimetric data. This discrepancy may result from the presence of large buried landslides at the bottom of the Rhone Valley.
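
The iterative routine is described explicitly enough to sketch: inside the valley mask, each cell of the DEM is repeatedly replaced by the mean of its neighbours minus a fixed decrement, which carves out a smooth concave bedrock surface. The grid values, decrement and iteration count below are placeholders, and scipy's uniform filter stands in for whatever neighbourhood definition the original routine used.

    # Sketch of the iterative "excavation" of a DEM inside a valley mask:
    # each masked cell is replaced by the mean of its neighbours minus a fixed value.
    import numpy as np
    from scipy.ndimage import uniform_filter

    def excavate(dem, valley_mask, decrement=1.0, iterations=500):
        """Estimate a bedrock surface below the valley fill (all values invented)."""
        bedrock = dem.astype(float).copy()
        for _ in range(iterations):
            neighbour_mean = uniform_filter(bedrock, size=3)      # 3x3 moving average
            updated = neighbour_mean - decrement
            # Only excavate inside the valley, and never rise above the filled surface.
            bedrock[valley_mask] = np.minimum(updated[valley_mask], dem[valley_mask])
        return bedrock

    # Tiny synthetic example: a flat valley floor at 400 m between 600 m slopes.
    dem = np.full((50, 50), 600.0)
    dem[:, 15:35] = 400.0
    mask = dem == 400.0
    bedrock = excavate(dem, mask)
    print("deepest estimated bedrock:", bedrock.min(), "m")    # curved, deepest mid-valley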

Relevance: 20.00%

Abstract:

Surveys of economic climate collect the opinions of managers about the short-term future evolution of their business. Interviews are carried out on a regular basis and responses measure optimistic, neutral or pessimistic views about the economic perspectives. We propose a method to evaluate the sampling error of the average opinion derived from a particular type of survey data. Our variance estimate is useful for interpreting historical trends and for deciding whether changes in the index from one period to another are due to a structural change or whether ups and downs can be attributed to sampling randomness. An illustration using real data from a survey of business managers' opinions is discussed.
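
A standard way to formalise the sampling error of such an average opinion is to code the answers numerically and compute the variance of the resulting balance indicator. The coding and the simple-random-sampling assumption below are illustrative; the estimator proposed in the paper is tailored to its particular survey design and may differ.

    % Code each answer as X = +1 (optimistic), 0 (neutral) or -1 (pessimistic),
    % with probabilities p_+, p_0, p_-. The balance indicator is B = p_+ - p_-,
    % and for a simple random sample of size n its estimate \hat{B} satisfies
    \[
      \operatorname{Var}(\hat{B}) \;=\; \frac{(p_+ + p_-) - (p_+ - p_-)^2}{n},
    \]
    % since E[X^2] = p_+ + p_- and E[X] = p_+ - p_-.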