933 results for Statistical factor analysis


Relevance: 40.00%

Abstract:

The main goal of this work was to evaluate thermodynamic parameters of the soybean oil extraction process using ethanol as solvent. The experimental treatments were as follows: aqueous solvents with water contents varying from 0 to 13% (mass basis) and extraction temperatures varying from 50 to 100 degrees C. The distribution coefficients of oil at equilibrium were used to calculate enthalpy, entropy and free energy changes. The results indicate that the oil extraction process with ethanol is feasible and spontaneous, mainly at higher temperatures. The influences of water content in the solvent and of temperature were also analysed using response surface methodology (RSM). The extraction yield was highly affected by both independent variables. A joint analysis of the thermodynamic results and the RSM indicates the optimal levels of solvent hydration and temperature for performing the extraction process.
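As a rough illustration of how such parameters follow from distribution coefficients, the sketch below applies the van't Hoff relation to two hypothetical equilibrium points (the numbers are illustrative, not the paper's data):

```python
import math

R = 8.314  # J/(mol*K), universal gas constant

# Hypothetical equilibrium distribution coefficients K at two temperatures;
# the actual values must come from the extraction experiments.
data = {323.15: 0.85, 373.15: 1.60}  # {T in K: K}

(T1, K1), (T2, K2) = sorted(data.items())

# van't Hoff: ln K = -dH/(R T) + dS/R, so the slope of ln K vs 1/T gives dH.
dH = -R * (math.log(K2) - math.log(K1)) / (1 / T2 - 1 / T1)  # J/mol
dG1 = -R * T1 * math.log(K1)                                 # J/mol at T1
dS = (dH - dG1) / T1                                         # J/(mol*K)

print(f"dH = {dH/1000:.1f} kJ/mol, dG(T1) = {dG1:.0f} J/mol, dS = {dS:.1f} J/(mol K)")
```

With K rising with temperature, dH and dS come out positive and dG turns negative at the higher temperature, matching the qualitative conclusion that the extraction becomes spontaneous as temperature increases.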

Relevance: 40.00%

Abstract:

The use of inter-laboratory test comparisons to determine the performance of individual laboratories for specific tests (or for calibration) [ISO/IEC Guide 43-1, 1997. Proficiency testing by interlaboratory comparisons - Part 1: Development and operation of proficiency testing schemes] is called Proficiency Testing (PT). In this paper we propose the use of the generalized likelihood ratio test to compare the performance of a group of laboratories for specific tests relative to the assigned value, and illustrate the procedure with actual data from a PT program in the area of volume. The proposed test extends the criteria in use, allowing one to test the consistency of the group of laboratories. Moreover, the class of elliptical distributions is considered for the obtained measurements. (C) 2008 Elsevier B.V. All rights reserved.
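Under the common simplifying assumption of normal measurements with known standard uncertainties, the generalized likelihood ratio statistic for testing the whole group against the assigned value reduces to a sum of squared standardized deviations; a minimal sketch with hypothetical data (the paper's elliptical-distribution setting is more general):

```python
# Hypothetical PT data: each lab reports a mean and a standard uncertainty;
# mu0 is the assigned (reference) value of the measurand.
labs = [(10.02, 0.03), (9.95, 0.04), (10.10, 0.05)]
mu0 = 10.00

# Under normality with known uncertainties, the generalized likelihood
# ratio statistic for H0: "every lab mean equals mu0" reduces to a sum
# of squared standardized deviations, chi-square with len(labs) dof.
glr = sum(((x - mu0) / u) ** 2 for x, u in labs)

crit = 7.815  # chi-square 95% critical value for 3 degrees of freedom
consistent = glr <= crit
print(f"-2 ln(Lambda) = {glr:.2f}, group consistent with assigned value: {consistent}")
```

The point of the group-level statistic is that it can flag inconsistency of the laboratories as a whole even when no single lab fails its individual criterion.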

Relevance: 40.00%

Abstract:

We report a statistical analysis of Doppler broadening coincidence data of electron-positron annihilation radiation in silicon using a (22)Na source. The Doppler broadening coincidence spectrum was fitted using a model function that included positron annihilation at rest with 1s, 2s, 2p, and valence band electrons. In-flight positron annihilation was also fitted. The response functions of the detectors accounted for backscattering, combinations of Compton effects, pileup, ballistic deficit, and pulse-shaping problems. The procedure allows the quantitative determination of the intensities of positron annihilation with core and valence electrons, as well as their standard deviations, directly from the experimental spectrum. The intensities obtained for core and valence band electron annihilation were 2.56(9)% and 97.44(9)%, respectively. These intensities are consistent with published experimental data treated by conventional analysis methods. This new procedure has the advantage of allowing one to distinguish additional effects from those associated with the detection system response function. (C) 2009 Elsevier B.V. All rights reserved.
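A small sketch of the final bookkeeping step, computing intensity fractions and their standard deviations from hypothetical fitted component areas under a simple Poisson (sqrt-N) assumption; in the actual procedure the uncertainties come from the full covariance of the spectrum fit:

```python
import math

# Hypothetical fitted component areas (counts) from a coincidence spectrum;
# real values come from the full model fit described in the paper.
core_counts = 25_600
valence_counts = 974_400
total = core_counts + valence_counts

# Intensity fraction f = a/(a+b) and first-order error propagation with
# Poisson variances var(a)=a, var(b)=b:
#   sigma_f = sqrt(b^2*a + a^2*b) / (a+b)^2
f_core = core_counts / total
sigma_f = math.sqrt(valence_counts**2 * core_counts
                    + core_counts**2 * valence_counts) / total**2

print(f"core intensity = {100*f_core:.2f}% +/- {100*sigma_f:.3f}%")
```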

Relevance: 40.00%

Abstract:

MCNP has stood so far as one of the main Monte Carlo radiation transport codes. Its use, as with any other Monte Carlo based code, has increased over time as computers perform calculations faster and become more affordable. However, using the Monte Carlo method to tally events in volumes that represent a small fraction of the whole system may prove unfeasible if a straight analogue transport procedure (no variance reduction techniques) is employed and precise results are demanded. Calculation of reaction rates in activation foils placed in critical systems is one such case. The present work takes advantage of the fixed source representation in MCNP to perform the above mentioned task with more effective sampling (characterizing the neutron population in the vicinity of the tallying region and using it in a geometrically reduced coupled simulation). An extended analysis of source-dependent parameters is carried out in order to understand their influence on simulation performance and on the validity of results. Although discrepant results have been observed for small enveloping regions, the procedure presents itself as very efficient, giving adequate and precise results in shorter times than the standard analogue procedure. (C) 2007 Elsevier Ltd. All rights reserved.
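The scaling problem the paper addresses can be seen in a toy analogue tally: for a Bernoulli-scored rare event, the relative error grows as the event probability shrinks, so small tally regions need enormous numbers of histories (a sketch, not MCNP itself):

```python
import random

# Analogue Monte Carlo estimate of a rare event probability, illustrating
# why straight analogue sampling becomes unfeasible for small tally regions:
# the relative error R ~ sqrt((1-p)/(p*N)) blows up as p shrinks.
random.seed(42)

def analogue_tally(p_true, n_histories):
    hits = sum(1 for _ in range(n_histories) if random.random() < p_true)
    p_hat = hits / n_histories
    # sample relative error of the mean for a Bernoulli score
    rel_err = ((1 - p_hat) / (p_hat * n_histories)) ** 0.5 if hits else float("inf")
    return p_hat, rel_err

for p in (1e-1, 1e-3, 1e-5):
    p_hat, r = analogue_tally(p, 100_000)
    print(f"p={p:.0e}: estimate={p_hat:.2e}, relative error={r:.2f}")
```

For the smallest probability, 100,000 histories typically score only a handful of events (or none), which is exactly the regime where a source-biased or geometrically reduced coupled simulation pays off.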

Relevance: 40.00%

Abstract:

This paper uses an output-oriented Data Envelopment Analysis (DEA) measure of technical efficiency to assess the technical efficiencies of the Brazilian banking system. Four approaches to estimation are compared in order to assess the significance of factors affecting inefficiency: nonparametric Analysis of Covariance, maximum likelihood using a family of exponential distributions, maximum likelihood using a family of truncated normal distributions, and the normal Tobit model. The sole focus of the paper is on a combined measure of output, and the data analyzed refer to the year 2001. The factors of interest in the analysis, and likely to affect efficiency, are bank nature (multiple and commercial), bank type (credit, business, bursary and retail), bank size (large, medium, small and micro), bank control (private and public), bank origin (domestic and foreign), and non-performing loans. The latter is a measure of bank risk. All quantitative variables, including non-performing loans, are measured on a per-employee basis. The best fits to the data are provided by the exponential family and the nonparametric Analysis of Covariance. The significance of a factor, however, varies according to the model fitted, although there is some agreement between the best models. A highly significant association in all fitted models is observed only for non-performing loans. The nonparametric Analysis of Covariance is more consistent with the inefficiency median responses observed for the qualitative factors. The findings reinforce the significant association of the level of bank inefficiency, measured by DEA residuals, with the risk of bank failure.
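An output-oriented CCR (constant returns to scale) envelopment LP is one standard way to compute such DEA scores; a minimal sketch using SciPy with a hypothetical one-input, one-output dataset (the paper's banking data involve many more variables):

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical single-input, single-output data for illustration.
X = np.array([[1.0, 1.0]])   # inputs, shape (m, n_dmus)
Y = np.array([[2.0, 1.0]])   # outputs, shape (s, n_dmus)

def output_oriented_ccr(X, Y, j0):
    """Output-oriented CCR envelopment LP: maximise phi for DMU j0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(n + 1)
    c[0] = -1.0                                 # linprog minimises, so -phi
    # input constraints: sum_j lam_j x_ij <= x_i,j0
    A_in = np.hstack([np.zeros((m, 1)), X])
    b_in = X[:, j0]
    # output constraints: phi*y_r,j0 - sum_j lam_j y_rj <= 0
    A_out = np.hstack([Y[:, [j0]], -Y])
    b_out = np.zeros(s)
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]  # phi >= 1; technical efficiency = 1/phi

for j in range(2):
    phi = output_oriented_ccr(X, Y, j)
    print(f"DMU {j}: phi = {phi:.3f}, efficiency = {1/phi:.3f}")
```

DMU 0 produces twice the output from the same input, so it is efficient (phi = 1), while DMU 1 could proportionally expand output by a factor of 2 (efficiency 0.5). The resulting scores (or residuals) are what the four second-stage models then relate to the bank-level factors.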

Relevance: 40.00%

Abstract:

Statistical methods of multiple regression analysis, trend surface analysis and principal components analysis were applied to seismographic data recorded during production blasting at a diabase quarry in the urban area of Campinas (SP), Brazil. The purpose of these analyses was to determine the influence of the following variables on the variation of the peak particle velocity (PPV): distance (D), charge weight per delay (W), and scaled distance (SD), associated with properties of the rock body (orientation, frequency and angle of geological discontinuities; depth of bedrock and thickness of the soil overburden). This approach identified the variables with the largest influences (loads) on the variation of ground vibration, as well as the behavior and spatial trend of this variation. The results showed the best relationship between PPV and D, with D being the most important factor in the attenuation of the ground vibrations. The geological joints and the depth to bedrock have a larger influence than the explosive charges on the variation of the vibration levels, but frequencies appear to be more influenced by the amount of soil overburden.
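A common way to relate PPV to the scaled distance is the attenuation law PPV = K * SD^(-beta), fitted by least squares on log-transformed data; a sketch with hypothetical monitoring records (not the quarry's measurements):

```python
import math

# Hypothetical blast monitoring records: (distance m, charge kg, PPV mm/s).
records = [(50, 20, 18.0), (100, 20, 5.5), (150, 45, 4.2), (200, 45, 2.4)]

# Attenuation law: PPV = K * SD^(-beta), with SD = D / sqrt(W).
# Taking logs gives a straight line, fitted here by ordinary least squares.
xs = [math.log(d / math.sqrt(w)) for d, w, _ in records]
ys = [math.log(v) for _, _, v in records]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
beta, K = -slope, math.exp(my - slope * mx)
print(f"PPV ~= {K:.0f} * SD^(-{beta:.2f})")
```

The fitted exponent beta summarizes the attenuation with scaled distance; site factors such as jointing and overburden then explain the scatter around this regression line.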

Relevance: 40.00%

Abstract:

Given that the total amount of losses in a distribution system is known, and with a reliable methodology for calculating the technical losses, the non-technical losses can be obtained by subtraction. A usual method for calculating technical losses in electric utilities uses two important factors: the load factor and the loss factor. The load factor is usually obtained from energy and demand measurements, whereas computing the loss factor requires knowledge of demand and energy losses, which are not, in general, amenable to direct measurement. In this work, a statistical analysis of this relationship is presented, using the curves of a sample of consumers from a specific company. These curves are summarized into different bands of the coefficient k, making it possible to determine where each consumer group has its greatest concentration of points. ©2008 IEEE.
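The textbook relationship between the two factors is often written LsF = k * LF + (1 - k) * LF^2, with k an empirical coefficient; a minimal sketch (the k values below are illustrative, not the company's measured band):

```python
# Empirical load-factor / loss-factor relationship used by utilities:
#   LsF = k * LF + (1 - k) * LF**2
# where k is an empirical coefficient; the values below are illustrative.

def loss_factor(load_factor: float, k: float) -> float:
    """Estimate the loss factor from the load factor and coefficient k."""
    return k * load_factor + (1 - k) * load_factor ** 2

for k in (0.15, 0.30):
    print(f"k={k}: LF=0.60 -> LsF={loss_factor(0.60, k):.3f}")
```

Note the limiting behavior: a perfectly flat demand curve (LF = 1) gives LsF = 1 regardless of k, which is why the statistical work concentrates on characterizing k for real, non-flat consumer curves.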

Relevance: 40.00%

Abstract:

Background: The purpose of this study is to analyze the stress distribution in the bone tissue around implants with different angulations (0 degrees, 17 degrees, and 30 degrees) and connections (external hexagon and tapered) through three-dimensional finite element and statistical analyses. Methods: Twelve configurations of three-dimensional finite element models, combining three implant inclinations (0 degrees, 17 degrees, and 30 degrees), two connections (external hexagon and tapered), and two load applications (axial and oblique), were simulated. The maximum principal stress values for cortical bone were measured at the mesial, distal, buccal, and lingual regions around the implant for each situation analyzed, totaling 48 groups. Loads of 200 and 100 N were applied at the occlusal surface in the axial and oblique directions, respectively. Maximum principal stress values were measured at the bone crest and statistically analyzed using analysis of variance. Stress patterns in the bone tissue around the implant were analyzed qualitatively. Results: The results demonstrated that under oblique loading, the external hexagon connection showed significantly higher stress concentrations in the bone tissue (P < 0.05) than the tapered connection. Moreover, the buccal and mesial regions of the cortical bone concentrated significantly higher stress (P < 0.005) for the external hexagon implant type. Under oblique loading, increased external hexagon implant angulation induced a significantly higher stress concentration (P = 0.045). Conclusions: The study results show that: 1) the oblique load was more damaging to bone tissue, mainly when associated with external hexagon implants; and 2) there was a higher stress concentration in the buccal region than in all other regions under oblique load.



Relevance: 40.00%

Abstract:

The study of short implants is relevant to the biomechanics of dental implants, and research on crown height increase has implications for daily clinical practice. The aim of this study was to analyze the biomechanical interactions of a single implant-supported prosthesis with different crown heights under vertical and oblique forces, using the 3-D finite element method. Six 3-D models were designed with Invesalius 3.0, Rhinoceros 3D 4.0, and Solidworks 2010 software. Each model consisted of a mandibular bone block segment including an implant supporting a screwed metal-ceramic crown. The crown height was set at 10, 12.5, or 15 mm. The applied force was 200 N (axial) and 100 N (oblique). ANOVA and Tukey tests were performed; p < 0.05 was considered statistically significant. The increase in crown height did not influence the stress distribution on the prosthetic screw (p > 0.05) under axial load. However, crown heights of 12.5 and 15 mm significantly worsened the stress distribution on the screws and the cortical bone (p < 0.001) under oblique load. A high crown-to-implant (C/I) ratio worsened the microstrain distribution in bone tissue under both axial and oblique loads (p < 0.001). Crown height increase was a potentially deleterious factor for the screws and for the different regions of bone tissue. (C) 2014 Elsevier Ltd. All rights reserved.
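The statistical step described above can be sketched with a one-way ANOVA on hypothetical peak-stress readings for the three crown heights (the numbers are invented for illustration, not taken from the finite element models):

```python
from scipy.stats import f_oneway

# Hypothetical peak-stress readings (MPa) for three crown heights under
# oblique load; the real values come from the finite element models.
h10  = [38.1, 39.4, 37.8, 38.9]
h125 = [45.2, 46.8, 44.9, 46.1]
h150 = [52.3, 53.9, 51.8, 53.0]

f_stat, p_value = f_oneway(h10, h125, h150)
print(f"one-way ANOVA: F = {f_stat:.1f}, p = {p_value:.2g}")
# p < 0.05 -> at least one crown height differs; pairwise Tukey HSD
# comparisons (e.g. statsmodels' pairwise_tukeyhsd) would identify which.
```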


Relevance: 40.00%

Abstract:

The identification of tree species is a key step in sustainable management plans for forest resources, as well as in several other applications based on such surveys. However, currently available techniques depend on the presence of tree structures, such as flowers, fruits, and leaves, limiting the identification process to certain periods of the year. Therefore, this article introduces a study on the application of statistical parameters to the texture classification of tree trunk images. For that, 540 samples from five Brazilian native deciduous species were acquired, and measures of entropy, uniformity, smoothness, asymmetry (third moment), mean, and standard deviation were obtained from the textures presented. Using a decision tree, a biometric species identification system was constructed, resulting in an average precision of 0.84 for species classification, with 0.83 accuracy and 0.79 agreement. Thus, the use of the textures present in trunk images can represent an important advance in tree identification, since the limitations of the current techniques can be overcome.
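The first-order statistical texture measures listed above can all be computed from the gray-level histogram of an image patch; a minimal sketch (the sample patch and the normalization choices are invented for illustration):

```python
import math
from collections import Counter

# First-order statistical texture descriptors from a gray-level histogram:
# the same family of measures used in the article (entropy, uniformity,
# smoothness, third moment, mean, standard deviation).
def texture_descriptors(pixels, levels=256):
    n = len(pixels)
    p = {z: c / n for z, c in Counter(pixels).items()}  # histogram -> prob.
    mean = sum(z * pz for z, pz in p.items())
    var = sum((z - mean) ** 2 * pz for z, pz in p.items())
    return {
        "mean": mean,
        "std": math.sqrt(var),
        "smoothness": 1 - 1 / (1 + var / (levels - 1) ** 2),  # normalized var
        "third_moment": sum((z - mean) ** 3 * pz for z, pz in p.items()),
        "uniformity": sum(pz ** 2 for pz in p.values()),
        "entropy": -sum(pz * math.log2(pz) for pz in p.values()),
    }

# Tiny hypothetical 4x4 bark patch, flattened to gray levels 0..255.
patch = [12, 14, 12, 200, 13, 12, 201, 199, 12, 198, 14, 13, 202, 12, 13, 12]
d = texture_descriptors(patch)
print({k: round(v, 4) for k, v in d.items()})
```

Feature vectors of this kind are then fed to a decision-tree classifier (e.g. sklearn's DecisionTreeClassifier) trained on labeled trunk images.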

Relevance: 40.00%

Abstract:

Analyses of ecological data should account for the uncertainty in the process(es) that generated the data. However, accounting for these uncertainties is a difficult task, since ecology is known for its complexity. Measurement and/or process errors are often the only sources of uncertainty modeled when addressing complex ecological problems, yet analyses should also account for uncertainty in sampling design, in model specification, in parameters governing the specified model, and in initial and boundary conditions. Only then can we be confident in the scientific inferences and forecasts made from an analysis. Probability and statistics provide a framework that accounts for multiple sources of uncertainty. Given the complexities of ecological studies, the hierarchical statistical model is an invaluable tool. This approach is not new in ecology, and there are many examples (both Bayesian and non-Bayesian) in the literature illustrating the benefits of this approach. In this article, we provide a baseline for concepts, notation, and methods, from which discussion on hierarchical statistical modeling in ecology can proceed. We have also planted some seeds for discussion and tried to show where the practical difficulties lie. Our thesis is that hierarchical statistical modeling is a powerful way of approaching ecological analysis in the presence of inevitable but quantifiable uncertainties, even if practical issues sometimes require pragmatic compromises.
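A minimal two-stage simulation illustrates how process and measurement uncertainty combine in a hierarchical model (the parameter values are illustrative; a real analysis would also model priors, sampling design, and initial and boundary conditions):

```python
import random
import statistics

# Minimal two-stage hierarchical (data model | process model) simulation:
#   theta_i ~ Normal(mu, sigma_proc)     process uncertainty across sites
#   y_ij    ~ Normal(theta_i, sigma_obs) measurement uncertainty
random.seed(7)
mu, sigma_proc, sigma_obs = 50.0, 5.0, 2.0
n_sites, n_obs = 20, 10

site_means = []
for _ in range(n_sites):
    theta = random.gauss(mu, sigma_proc)            # latent process state
    y = [random.gauss(theta, sigma_obs) for _ in range(n_obs)]
    site_means.append(statistics.mean(y))

# The spread of the site means mixes BOTH levels of uncertainty:
#   Var(ybar_i) = sigma_proc^2 + sigma_obs^2 / n_obs
print(f"grand mean ~= {statistics.mean(site_means):.1f}")
print(f"sd of site means ~= {statistics.stdev(site_means):.1f} "
      f"(theory: {(sigma_proc**2 + sigma_obs**2 / n_obs) ** 0.5:.1f})")
```

Ignoring the process level and treating all variation as measurement error would badly misstate the uncertainty here, which is the practical argument for the hierarchical decomposition.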