948 results for Hierarchical partitioning analysis


Relevance: 80.00%

Abstract:

The quality of virgin olive oil and the assignment of the product to a specific commercial category (extra virgin, virgin, lampante) can be determined through organoleptic assessment by means of the Panel test. The latter is carried out by a group of expert tasters (the panel), led by a panel leader, and aims to identify and quantify the main sensory attributes (positive and negative), establishing the commercial category of the product on the basis of the results. The aim of this thesis work was to evaluate the applicability of an instrumental method that, through a rapid screening approach, could support sensory analysis by discriminating the analysed oils according to their quality (commercial category). To this end, a set of 42 virgin olive oils from Spain and Croatia, classified into the three commercial categories on the basis of the Panel test, was analysed with a waveguide, which made it possible to examine the waveforms of both gain and phase over the 1.6-2.7 GHz frequency range. The results show that, for several frequency intervals, the gain spectra appear to be influenced by the commercial category. Moreover, principal component analysis (PCA) carried out on this spectral information allowed, in general terms, the discrimination between extra virgin, virgin and lampante oils. Finally, the subsequent Hierarchical Cluster Analysis identified distinct clusters for the lampante samples on the one hand and the extra virgin and virgin samples on the other.
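
As a hedged illustration of the chemometric pipeline described above (PCA on the gain spectra followed by hierarchical clustering), here is a minimal Python sketch. The spectra matrix, the number of retained components and the Ward linkage are stand-in assumptions; the thesis abstract does not specify these choices.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical gain spectra: 42 oils x 200 frequency points (1.6-2.7 GHz).
rng = np.random.default_rng(0)
spectra = rng.normal(size=(42, 200))

# Standardize each frequency channel, then project onto a few principal components.
scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(spectra))

# Agglomerative (hierarchical) clustering on the PCA scores; cutting the
# dendrogram at two clusters would separate, e.g., lampante from (extra) virgin.
Z = linkage(scores, method="ward")
clusters = fcluster(Z, t=2, criterion="maxclust")
print(clusters)
```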

Relevance: 40.00%

Abstract:

This work shows the application of the analytic hierarchy process (AHP) to full cost accounting (FCA) within the integrated resource planning (IRP) process. For this purpose, a pioneering case was developed in which different supply- and demand-side energy solutions for a metropolitan airport (Congonhas) were considered [Moreira, E.M., 2005. Modelamento energético para o desenvolvimento limpo de aeroporto metropolitano baseado na filosofia do PIR - O caso da metrópole de São Paulo. Dissertação de mestrado, GEPEA/USP]. These solutions were compared and analyzed using the software "Decision Lens", which implements the AHP. The final part of this work presents a classification of the resources that can be considered initial targets as energy resources, thus facilitating the constraints of the airport's IRP and setting parameters aimed at sustainable development. (C) 2007 Elsevier Ltd. All rights reserved.
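
To make the AHP computation concrete, the following sketch derives priority weights from a pairwise comparison matrix via its principal eigenvector and checks consistency; this is the core calculation behind AHP tools such as Decision Lens. The 3x3 matrix of hypothetical energy resources is illustrative, not taken from the paper.

```python
import numpy as np

# Hypothetical pairwise comparison matrix (Saaty's 1-9 scale) for three
# candidate energy resources; A[i, j] records how strongly resource i is
# preferred over resource j. Values are illustrative only.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Priorities = principal right eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency ratio: CI = (lambda_max - n) / (n - 1), divided by the
# random index RI (0.58 for n = 3); CR < 0.1 is the usual acceptance rule.
n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)
CR = CI / 0.58
print("weights:", w, "CR:", CR)
```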

Relevance: 40.00%

Abstract:

Evidence of a sport-specific hierarchy of protective factors against doping would be a powerful aid in adapting information and prevention campaigns to the characteristics of specific athlete groups, especially those athletes most vulnerable to doping. The contents of phone calls to a free and anonymous national anti-doping service called 'ecoute dopage' were analysed (192 bodybuilders, 124 cyclists and 44 footballers). The results showed that the protective factors that emerged from the analysis could be categorised into two groups. The first comprised 'Health concerns', 'Respect for the law' and 'Doping controls from the environment'; the second comprised 'Doubts about the effectiveness of illicit products', 'Thinking skills' and 'Doubts about doctors'. The ranking of the factors for the cyclists differed from that of the other athletes: 1) respect for the law, 2) doping controls from the environment, 3) health concerns, 4) doubts about doctors, and 5) doubts about the effectiveness of illicit products. The results are analysed in terms of the ranking in each athlete group and the consequences for the athletes' experience of and relationship to doping. Specific prevention campaigns are proposed to limit doping behaviour in general and in each sport.

Relevance: 40.00%

Abstract:

Analysis of variance is commonly used in morphometry to ascertain differences in parameters between several populations. Failure to detect significant differences between populations (type II error) may be due to suboptimal sampling and lead to erroneous conclusions; the concept of statistical power allows one to avoid such failures by means of adequate sampling. Several examples from the morphometry of the nervous system show the use of the power of a hierarchical analysis of variance test for choosing appropriate sample and subsample sizes. In the first case, neuronal densities in the human visual cortex, we find the number of observations to have little effect. For dendritic spine densities in the visual cortex of mice and humans, the effect is somewhat larger. A substantial effect is shown in our last example, dendritic segmental lengths in the monkey lateral geniculate nucleus. It is in the nature of the hierarchical model that sample size is always more important than subsample size. The relative weight to be given to subsample size thus depends on the magnitude of the between-observations variance relative to the between-individuals variance.
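
The sample-size versus subsample-size trade-off can be shown numerically. Below is a minimal sketch assuming a balanced two-population design with nested subsampling and normal errors; the variance components, means and sizes are invented for illustration, not taken from the paper.

```python
import numpy as np
from scipy.stats import f as f_dist, ncf

def nested_anova_power(group_means, n_ind, m_obs, var_ind, var_obs, alpha=0.05):
    """Power to detect group differences when each of n_ind individuals per
    group contributes m_obs observations (balanced nested design). The
    effective variance of an individual mean is var_ind + var_obs / m_obs."""
    g = len(group_means)
    grand = np.mean(group_means)
    sigma2 = var_ind + var_obs / m_obs
    # Noncentrality of the F test carried out on the individual means.
    nc = n_ind * np.sum((np.asarray(group_means) - grand) ** 2) / sigma2
    dfn, dfd = g - 1, g * (n_ind - 1)
    crit = f_dist.ppf(1 - alpha, dfn, dfd)
    return ncf.sf(crit, dfn, dfd, nc)

# More individuals with fewer subsamples beats the reverse,
# even with far fewer total observations (250 vs. 50 here).
print(nested_anova_power([10.0, 11.0], n_ind=5,  m_obs=50, var_ind=4.0, var_obs=8.0))
print(nested_anova_power([10.0, 11.0], n_ind=25, m_obs=2,  var_ind=4.0, var_obs=8.0))
```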

Relevance: 40.00%

Abstract:

Behavioural researchers commonly use single-subject designs to evaluate the effects of a given treatment. Several different methods of data analysis are used, each with its own methodological strengths and limitations. Visual inspection is commonly used to analyse data by assessing the variability, level, and trend both within and between conditions (Cooper, Heron, & Heward, 2007). In an attempt to quantify treatment outcomes, researchers developed two methods for analysing data: Percentage of Non-overlapping Data points (PND) and Percentage of data points Exceeding the Median (PEM). The purpose of the present study is to compare and contrast the use of Hierarchical Linear Modelling (HLM), PND and PEM in single-subject research. The present study used 39 behaviours across 17 participants to compare the treatment outcomes of a group cognitive behavioural therapy program, using PND, PEM, and HLM, on three response classes of obsessive-compulsive behaviour in children with Autism Spectrum Disorder. Findings suggest that PEM and HLM complement each other and both add invaluable information to the overall treatment results. Future research should consider using both PEM and HLM when analysing single-subject designs, particularly grouped data with variability.
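
For readers unfamiliar with the two overlap metrics, here is a minimal sketch of PND and PEM for a behaviour that treatment should decrease; the baseline and treatment series are invented, and the HLM component would be fitted separately with a mixed-model package.

```python
import numpy as np

def pnd(baseline, treatment, decrease=True):
    """Percentage of treatment points not overlapping the most extreme
    baseline point (the minimum, when the goal is to decrease behaviour)."""
    ref = np.min(baseline) if decrease else np.max(baseline)
    hits = treatment < ref if decrease else treatment > ref
    return 100.0 * np.mean(hits)

def pem(baseline, treatment, decrease=True):
    """Percentage of treatment points exceeding the baseline median
    in the therapeutic direction."""
    med = np.median(baseline)
    hits = treatment < med if decrease else treatment > med
    return 100.0 * np.mean(hits)

baseline  = np.array([8, 7, 9, 8, 10])   # hypothetical behaviour frequencies
treatment = np.array([6, 4, 5, 7, 3, 2])
print(pnd(baseline, treatment), pem(baseline, treatment))
```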

Relevance: 40.00%

Abstract:

The variogram is essential for local estimation and mapping of any variable by kriging. The variogram itself must usually be estimated from sample data. The sampling density is a compromise between precision and cost, but it must be sufficiently dense to encompass the principal spatial sources of variance. A nested, multi-stage sampling scheme with separating distances increasing in geometric progression from stage to stage will do that. The data may then be analyzed by a hierarchical analysis of variance to estimate the components of variance for every stage, and hence lag. By accumulating the components, starting from the shortest lag, one obtains a rough variogram for modest effort. For balanced designs the analysis of variance is optimal; for unbalanced ones, however, these estimators are not necessarily the best, and analysis by residual maximum likelihood (REML) will usually be preferable. The paper summarizes the underlying theory and illustrates its application with data from three surveys: one in which the design had four stages and was balanced, and two implemented with unbalanced designs to economize when there were more stages. A Fortran program is available for the analysis of variance, and code for the REML analysis is listed in the paper. (c) 2005 Elsevier Ltd. All rights reserved.
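
A small numerical sketch of the accumulation step described above: given variance components estimated for each stage of a balanced nested survey (invented values here), the rough variogram at each lag is the running sum of the components for that lag and all shorter ones.

```python
import numpy as np

# Hypothetical separating distances (m) for a four-stage nested design,
# in geometric progression, and the variance component estimated for
# each stage by the hierarchical ANOVA (invented values).
lags       = np.array([10.0, 50.0, 250.0, 1250.0])
components = np.array([0.8, 0.5, 0.9, 0.3])

# Accumulate components from the shortest lag upward: the semivariance
# at a given lag is the sum of all components at that lag and below.
rough_variogram = np.cumsum(components)
for h, g in zip(lags, rough_variogram):
    print(f"lag {h:7.1f} m  gamma ~ {g:.2f}")
```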

Relevance: 40.00%

Abstract:

An unbalanced nested sampling design was used to investigate the spatial scale of soil and herbicide interactions at the field scale. A hierarchical analysis of variance based on residual maximum likelihood (REML) was used to analyse the data and provide a first estimate of the variogram. Soil samples were taken at 108 locations at a range of separating distances in a 9 ha field to explore small- and medium-scale spatial variation. Soil organic matter content, pH, particle size distribution, microbial biomass, and the degradation and sorption of the herbicide isoproturon were determined for each soil sample. A large proportion of the spatial variation in isoproturon degradation and sorption occurred at sampling intervals of less than 60 m; however, the sampling design did not resolve the variation present at larger scales. A sampling interval of 20-25 m should ensure that the main spatial structures are identified for isoproturon degradation rate and sorption without too great a loss of information in this field.
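
A REML fit of nested random effects for an unbalanced design can be sketched with statsmodels' MixedLM; the two-level nesting, column names and simulated values below are assumptions for illustration, not details from the paper.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical unbalanced nested data: coarse-stage 'blocks' containing
# varying numbers of finer-stage 'sites', with one or more soil
# measurements y per site. Column names are illustrative.
rng = np.random.default_rng(1)
rows = []
for b in range(12):
    u_b = rng.normal(scale=1.0)              # block-level effect
    for s in range(int(rng.integers(2, 6))):
        u_s = rng.normal(scale=0.7)          # site-within-block effect
        for _ in range(int(rng.integers(1, 4))):
            rows.append({"block": b, "site": f"{b}-{s}",
                         "y": 5.0 + u_b + u_s + rng.normal(scale=0.5)})
df = pd.DataFrame(rows)

# Random intercepts for blocks, plus a variance component for sites
# nested within blocks; fit(reml=True) gives the REML estimates.
model = smf.mixedlm("y ~ 1", df, groups="block",
                    re_formula="1", vc_formula={"site": "0 + C(site)"})
result = model.fit(reml=True)
print(result.summary())
```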

Relevance: 40.00%

Abstract:

Growth patterns and cropping were evaluated over the season for the everbearing strawberry 'Everest' at a range of temperatures (15-27 °C) in two light environments (ambient and 50% shade). The highest yield was recorded for unshaded plants grown at 23 °C, but the optimum temperature for vegetative growth was 15 °C. With increasing temperature fruit number increased, but fruit weight decreased. Fruit weight was also significantly reduced by shade, and although 'Everest' showed a degree of shade tolerance in vegetative growth, yield was consistently reduced by shade. Shade also reduced the number of crowns developed by the plants over the course of the season, emphasising that crown number was ultimately the limiting factor for yield potential. We conclude that, in contrast to Junebearers, which partition more assimilates to fruit at temperatures around 15 °C (Le Miere et al., 1998), optimised cropping in the everbearer 'Everest' is achieved at the significantly higher temperature of 23 °C. These findings have significance for commercial production, in which protection tends to reduce light levels but increase average temperature throughout the season.

Relevance: 40.00%

Abstract:

In designing modern office buildings, building spaces are frequently zoned by introducing internal partitioning, which may have a significant influence on the room air environment. This internal partitioning was studied by means of model tests, numerical simulation and, as the final stage, statistical analysis. In this paper, the results produced from the statistical analysis are summarized and presented.

Relevance: 40.00%

Abstract:

K-Means is a popular clustering algorithm that adopts an iterative refinement procedure to determine data partitions and to compute their associated centres of mass, called centroids. The straightforward implementation of the algorithm is often referred to as 'brute force', since it computes a proximity measure from each data point to each centroid at every iteration of the K-Means process. Efficient implementations of the K-Means algorithm have been predominantly based on multi-dimensional binary search trees (KD-Trees): a combination of an efficient data structure and geometrical constraints reduces the number of distance computations required at each iteration. In this work we present a general space partitioning approach for improving the efficiency and the scalability of the K-Means algorithm. We propose to adopt approximate hierarchical clustering methods to generate binary space partitioning trees, in contrast to KD-Trees. In the experimental analysis, we have tested the performance of the proposed Binary Space Partitioning K-Means (BSP-KM) when a divisive clustering algorithm is used. We have carried out extensive experimental tests comparing the proposed approach to the one based on KD-Trees (KD-KM) over a wide range of the parameter space. BSP-KM is more scalable than KD-KM, while keeping the deterministic nature of the 'brute force' algorithm. In particular, the proposed space partitioning approach has been shown to overcome the well-known limitation of KD-Trees in high-dimensional spaces and can also be adopted to improve the efficiency of other algorithms in which KD-Trees have been used.
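
As a sketch of the tree-building half of this approach, the following Python constructs a binary space partitioning tree by recursive 2-means splits (a divisive clustering, as in BSP-KM); the pruning rules that exploit the tree inside the K-Means iterations are omitted, and the leaf size and data are illustrative.

```python
import numpy as np
from sklearn.cluster import KMeans

class BSPNode:
    """Binary space partitioning node: a centre of mass, the indices of the
    points it covers, and (for internal nodes) two children from a 2-means split."""
    def __init__(self, idx, X):
        self.idx = idx
        self.centroid = X[idx].mean(axis=0)
        self.left = self.right = None

def build_bsp(X, idx=None, leaf_size=32):
    node = BSPNode(np.arange(len(X)) if idx is None else idx, X)
    if len(node.idx) > leaf_size:
        # Divisive step: split this node's points with 2-means.
        labels = KMeans(n_clusters=2, n_init=5).fit_predict(X[node.idx])
        l, r = node.idx[labels == 0], node.idx[labels == 1]
        if len(l) and len(r):
            node.left = build_bsp(X, l, leaf_size)
            node.right = build_bsp(X, r, leaf_size)
    return node

def leaves(node):
    if node.left is None:
        return [node]
    return leaves(node.left) + leaves(node.right)

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 16))
root = build_bsp(X)
print(len(leaves(root)), "leaves")
```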

Relevance: 40.00%

Abstract:

We present, pedagogically, the Bayesian approach to composed error models under alternative, hierarchical characterizations; demonstrate, briefly, the Bayesian approach to model comparison using recent advances in Markov chain Monte Carlo (MCMC) methods; and illustrate, empirically, the value of these techniques to natural resource economics and, in particular, to coastal fisheries management. The Bayesian approach to fisheries efficiency analysis is interesting for at least three reasons. First, it is a robust and highly flexible alternative to the commonly applied frequentist procedures that dominate the literature. Second, the Bayesian approach is extremely simple to implement, requiring only a modest addition to most natural-resource economists' tool-kits. Third, despite its attractions, applications of Bayesian methodology in coastal fisheries management are few.
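
A minimal sketch of a Bayesian composed error model of the kind discussed: a normal-exponential stochastic frontier fitted by random-walk Metropolis. The production relationship, priors and tuning are invented for illustration and are not the paper's specification.

```python
import numpy as np
from scipy.stats import norm

# Simulate a composed-error model: y = b0 + b1*x + v - u, with two-sided
# noise v ~ N(0, sv^2) and one-sided inefficiency u ~ Exp(lam). Invented values.
rng = np.random.default_rng(0)
n, b0, b1, sv, lam = 200, 1.0, 0.6, 0.3, 2.0
x = rng.uniform(0, 2, n)
y = b0 + b1 * x + rng.normal(0, sv, n) - rng.exponential(1 / lam, n)

def loglik(theta):
    """Normal-exponential composed-error log likelihood (closed form)."""
    b0, b1, log_sv, log_lam = theta
    sv, lam = np.exp(log_sv), np.exp(log_lam)
    e = y - b0 - b1 * x
    return np.sum(np.log(lam) + lam * e + 0.5 * (lam * sv) ** 2
                  + norm.logcdf(-e / sv - lam * sv))

# Random-walk Metropolis with flat priors on the transformed parameters.
theta = np.array([0.0, 0.0, np.log(0.5), np.log(1.0)])
lp = loglik(theta)
draws = []
for _ in range(20000):
    prop = theta + rng.normal(0, 0.05, 4)
    lp_prop = loglik(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    draws.append(theta)
draws = np.array(draws)[5000:]          # discard burn-in
print(draws.mean(axis=0))               # posterior means: b0, b1, log sv, log lam
```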

Relevance: 40.00%

Abstract:

Analyses of ecological data should account for the uncertainty in the process(es) that generated the data. However, accounting for these uncertainties is a difficult task, since ecology is known for its complexity. Measurement and/or process errors are often the only sources of uncertainty modeled when addressing complex ecological problems, yet analyses should also account for uncertainty in sampling design, in model specification, in parameters governing the specified model, and in initial and boundary conditions. Only then can we be confident in the scientific inferences and forecasts made from an analysis. Probability and statistics provide a framework that accounts for multiple sources of uncertainty. Given the complexities of ecological studies, the hierarchical statistical model is an invaluable tool. This approach is not new in ecology, and there are many examples (both Bayesian and non-Bayesian) in the literature illustrating the benefits of this approach. In this article, we provide a baseline for concepts, notation, and methods, from which discussion on hierarchical statistical modeling in ecology can proceed. We have also planted some seeds for discussion and tried to show where the practical difficulties lie. Our thesis is that hierarchical statistical modeling is a powerful way of approaching ecological analysis in the presence of inevitable but quantifiable uncertainties, even if practical issues sometimes require pragmatic compromises.
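
As a toy illustration of the hierarchical decomposition (data model, process model, parameter model), here is a sketch of a normal-normal hierarchy with a two-step Gibbs sampler; all sizes and variances are invented, and the variance components are held fixed to keep the sketch short.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hierarchy: parameter model mu ~ N(0, 10^2); process model
# theta_j ~ N(mu, tau^2) for J sites; data model y_ij ~ N(theta_j, s^2).
J, n, tau, s, mu_true = 8, 20, 1.0, 2.0, 3.0
theta_true = rng.normal(mu_true, tau, J)
y = rng.normal(theta_true[:, None], s, (J, n))
ybar = y.mean(axis=1)

# Gibbs sampler, alternating theta_j | mu and mu | theta.
mu = 0.0
keep = []
for _ in range(5000):
    prec = n / s**2 + 1 / tau**2                      # conditional precision
    m = (n * ybar / s**2 + mu / tau**2) / prec
    theta = rng.normal(m, np.sqrt(1 / prec))
    prec_mu = J / tau**2 + 1 / 10**2
    mu = rng.normal(theta.sum() / tau**2 / prec_mu, np.sqrt(1 / prec_mu))
    keep.append(mu)
print(np.mean(keep[1000:]))   # posterior mean of mu, close to mu_true
```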

Relevance: 40.00%

Abstract:

Objective: To assess the risk factors for delayed diagnosis of uterine cervical lesions. Materials and Methods: This is a case-control study that recruited 178 women at 2 Brazilian hospitals. The cases (n = 74) were women with a late diagnosis of a lesion in the uterine cervix (invasive carcinoma at any stage). The controls (n = 104) were women with cervical lesions diagnosed early (low- or high-grade intraepithelial lesions). The analysis was performed by means of a logistic regression model using a hierarchical model. The socioeconomic and demographic variables were included at level I (distal). Level II (intermediate) included personal and family antecedents and knowledge about the Papanicolaou test and human papillomavirus. Level III (proximal) encompassed the variables relating to individuals' care for their own health, gynecologic symptoms, and access to the health care system. Results: The risk factors for late diagnosis of uterine cervical lesions were age older than 40 years (odds ratio [OR] = 10.4; 95% confidence interval [CI], 2.3-48.4), not knowing the difference between the Papanicolaou test and gynecological pelvic examinations (OR = 2.5; 95% CI, 1.3-4.9), not thinking that the Papanicolaou test was important (OR = 4.2; 95% CI, 1.3-13.4), and abnormal vaginal bleeding (OR = 15.0; 95% CI, 6.5-35.0). Previous treatment for a sexually transmissible disease was a protective factor (OR = 0.3; 95% CI, 0.1-0.8) against delayed diagnosis. Conclusions: Deficiencies in cervical cancer prevention programs in developing countries are not simply a matter of better provision and coverage of Papanicolaou tests. The misconception about the Papanicolaou test is a serious educational problem, as demonstrated by the present study.
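
The staged (hierarchical) entry of variable blocks can be sketched with statsmodels: fit a logistic model with the distal block, then add the intermediate and proximal blocks, reporting odds ratios as exponentiated coefficients. The variable names and simulated data are placeholders, not the study's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical case-control data; late = 1 marks a late diagnosis (case).
rng = np.random.default_rng(0)
n = 178
df = pd.DataFrame({
    "late":       rng.integers(0, 2, n),
    "age_over40": rng.integers(0, 2, n),   # level I  (distal)
    "knows_pap":  rng.integers(0, 2, n),   # level II (intermediate)
    "abn_bleed":  rng.integers(0, 2, n),   # level III (proximal)
})

blocks = [["age_over40"], ["knows_pap"], ["abn_bleed"]]
cols = []
for block in blocks:                # enter blocks level by level
    cols += block
    X = sm.add_constant(df[cols])
    res = sm.Logit(df["late"], X).fit(disp=0)
    print(cols, "OR:", np.exp(res.params.drop("const")).round(2).to_dict())
```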