961 results for multiple analysis of variance (MANOVA)


Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

We have previously reported the use of a novel mini-sequencing protocol for detection of the factor V Leiden variant, termed first nucleotide change (FNC) technology. This technology is based on a single-nucleotide extension of a primer, which is hybridized immediately adjacent to the site of mutation. The extended nucleotide, which carries a reporter molecule (fluorescein), has the power to discriminate the genotype at the site of mutation. More recently, the prothrombin 20210 and thermolabile methylene tetrahydrofolate reductase (MTHFR) 677 variants have been identified as possible risk factors associated with thrombophilia. This study describes the use of FNC technology in a combined assay to detect the factor V, prothrombin and MTHFR variants in a population of Australian blood donors, and describes the objective numerical methodology used to determine genotype cut-off values for each genetic variation. Using FNC to test 500 normal blood donors, the incidence of factor V Leiden was 3.6% (all heterozygous), that of prothrombin 20210 was 2.8% (all heterozygous) and that of MTHFR was 10% (homozygous). The combined FNC technology offers a simple, rapid, automatable DNA-based test for the detection of these three important mutations associated with familial thrombophilia. (C) 2000 Lippincott Williams and Wilkins.
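As a quick illustration of what such genotype frequencies imply, the minor-allele frequencies can be back-calculated under a Hardy-Weinberg equilibrium assumption. This is a sketch, not part of the paper's method; the function names and the equilibrium assumption are mine.

```python
import math

def allele_freq_from_heterozygotes(het_frac):
    """Minor-allele frequency q solving 2*q*(1 - q) = het_frac (Hardy-Weinberg)."""
    # 2q^2 - 2q + h = 0  ->  q = (1 - sqrt(1 - 2h)) / 2, taking the minor root
    return (1.0 - math.sqrt(1.0 - 2.0 * het_frac)) / 2.0

def allele_freq_from_homozygotes(hom_frac):
    """Minor-allele frequency q solving q^2 = hom_frac (Hardy-Weinberg)."""
    return math.sqrt(hom_frac)

q_fvl = allele_freq_from_heterozygotes(0.036)   # factor V Leiden: 3.6% heterozygous
q_pt = allele_freq_from_heterozygotes(0.028)    # prothrombin 20210: 2.8% heterozygous
q_mthfr = allele_freq_from_homozygotes(0.10)    # MTHFR 677: 10% homozygous
```

Under these assumptions the reported 10% homozygote rate for MTHFR corresponds to an allele frequency of about 0.32, an order of magnitude higher than for the two heterozygous-only variants.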

Longline hook rates of bigeye and yellowfin tunas in the eastern Pacific Ocean were standardized by maximum depth of fishing, area, and season, using generalized linear models (GLMs). The annual trends of the standardized hook rates differ from the unstandardized, and are more likely to represent the changes in abundance of tunas in the age groups most vulnerable to longliners in the fishing grounds. For both species, all of the interactions in the GLMs involving years, depths of fishing, areas, and seasons were significant. This means that the annual trends in hook rates depend on which depths, areas, and seasons are being considered. The overall average hook rates for each species were estimated by weighting each 5-degree quadrangle equally and each season by the number of months in it. Since the annual trends in hook rates for each fishing depth category are roughly the same for bigeye, total average annual hook rate estimates are possible with the GLM. For yellowfin, the situation is less clear because of a preponderance of empty cells in the model. The full models explained 55% of the variation in bigeye hook rate and 33% of that of yellowfin. (PDF contains 19 pages.)
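The weighting scheme described above (each 5-degree quadrangle weighted equally, each season weighted by its number of months) can be sketched as follows. The hook-rate values and cell labels are invented for illustration.

```python
# (area_cell, season) -> catch per 100 hooks; numbers are made up
hook_rate = {
    ("A", "Q1"): 1.2, ("A", "Q2"): 0.9,
    ("B", "Q1"): 2.0, ("B", "Q2"): 1.4,
}
months_in_season = {"Q1": 3, "Q2": 3}

def weighted_mean_rate(rates, season_months):
    """Overall average hook rate: equal weight per area cell,
    seasons weighted by their length in months."""
    areas = {area for area, _ in rates}
    total, total_w = 0.0, 0.0
    for (area, season), r in rates.items():
        w = (1.0 / len(areas)) * season_months[season]
        total += w * r
        total_w += w
    return total / total_w
```

With equal-length seasons, as here, the weighting reduces to a plain mean; it matters when seasons span different numbers of months or cells are unevenly sampled.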

A brief description is given of a program to carry out two-way classification analysis of variance on the MICRO 2200, for use in fishery data processing.
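The two-way classification referred to above can be sketched in a few lines. This is the generic textbook computation (one observation per cell), not the MICRO 2200 program itself.

```python
def two_way_anova(table):
    """Two-way classification ANOVA without replication.
    table: list of rows, each a list of observations (one per column)."""
    r, c = len(table), len(table[0])
    grand = sum(sum(row) for row in table) / (r * c)
    row_means = [sum(row) / c for row in table]
    col_means = [sum(table[i][j] for i in range(r)) / r for j in range(c)]
    ss_rows = c * sum((m - grand) ** 2 for m in row_means)
    ss_cols = r * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for row in table for x in row)
    ss_error = ss_total - ss_rows - ss_cols        # residual by subtraction
    ms_error = ss_error / ((r - 1) * (c - 1))
    return {"F_rows": (ss_rows / (r - 1)) / ms_error,
            "F_cols": (ss_cols / (c - 1)) / ms_error}
```

The returned F ratios are compared against F distributions with (r-1, (r-1)(c-1)) and (c-1, (r-1)(c-1)) degrees of freedom respectively.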

To bring out the relative efficiency of various types of fishing gear in the analysis of catch data, a combination of Tukey's test, a consequent transformation and graphical analysis for outlier elimination has been introduced, which can be used to advantage when applying ANOVA techniques. Application of these procedures to actual sets of data showed that non-additivity in the data was caused by the presence of outliers, the absence of a suitable transformation, or both. As a corollary, the concurrent model X_ij = μ + α_i + β_j + λ α_i β_j + E_ij adequately fits the data.
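A minimal sketch of Tukey's one-degree-of-freedom test for non-additivity, which underlies the λ α_i β_j term in the concurrent model above. This follows the standard textbook form of the test, not the authors' exact implementation.

```python
def tukey_nonadditivity(table):
    """Tukey's 1-df test for non-additivity in a two-way table (no replication).
    Returns the F statistic for the multiplicative term lambda * a_i * b_j."""
    r, c = len(table), len(table[0])
    grand = sum(map(sum, table)) / (r * c)
    a = [sum(row) / c - grand for row in table]                              # row effects
    b = [sum(table[i][j] for i in range(r)) / r - grand for j in range(c)]  # column effects
    num = sum(table[i][j] * a[i] * b[j] for i in range(r) for j in range(c))
    den = sum(ai * ai for ai in a) * sum(bj * bj for bj in b)
    ss_nonadd = num * num / den                    # 1-df sum of squares
    ss_total = sum((x - grand) ** 2 for row in table for x in row)
    ss_rows = c * sum(ai * ai for ai in a)
    ss_cols = r * sum(bj * bj for bj in b)
    ss_resid = ss_total - ss_rows - ss_cols - ss_nonadd
    df_resid = (r - 1) * (c - 1) - 1
    return ss_nonadd / (ss_resid / df_resid)
```

A large F indicates non-additivity, suggesting either outliers or the need for a variance-stabilizing transformation, as the abstract describes.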

The variogram is essential for local estimation and mapping of any variable by kriging. The variogram itself must usually be estimated from sample data. The sampling density is a compromise between precision and cost, but it must be sufficiently dense to encompass the principal spatial sources of variance. A nested, multi-stage sampling scheme, with separating distances increasing in geometric progression from stage to stage, will do that. The data may then be analyzed by a hierarchical analysis of variance to estimate the components of variance for every stage, and hence lag. By accumulating the components starting from the shortest lag, one obtains a rough variogram for modest effort. For balanced designs the analysis of variance is optimal; for unbalanced ones, however, these estimators are not necessarily the best, and analysis by residual maximum likelihood (REML) will usually be preferable. The paper summarizes the underlying theory and illustrates its application with data from three surveys: one in which the design had four stages and was balanced, and two implemented with unbalanced designs to economize when there were more stages. A Fortran program is available for the analysis of variance, and code for the REML analysis is listed in the paper. (c) 2005 Elsevier Ltd. All rights reserved.
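The accumulation step described above, summing stage variance components from the shortest lag upward to form a first-approximation variogram, can be sketched as follows. The lag distances and component values are invented for illustration.

```python
def rough_variogram(lags, components):
    """Accumulate stage variance components from the shortest lag upward:
    the semivariance at lag h_k is the sum of components for stages with lag <= h_k.
    Returns a list of (lag, semivariance) pairs in increasing-lag order."""
    gamma, total = [], 0.0
    for h, comp in sorted(zip(lags, components)):
        total += comp
        gamma.append((h, total))
    return gamma

# stage lags in metres (geometric progression) and their variance components
first_estimate = rough_variogram([1000, 10, 100], [0.2, 0.5, 0.3])
```

The resulting points can then guide a denser, conventional variogram survey over the lag range where most of the variance accrues.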

An unbalanced nested sampling design was used to investigate the spatial scale of soil and herbicide interactions at the field scale. A hierarchical analysis of variance based on residual maximum likelihood (REML) was used to analyse the data and provide a first estimate of the variogram. Soil samples were taken at 108 locations at a range of separating distances in a 9 ha field to explore small and medium scale spatial variation. Soil organic matter content, pH, particle size distribution, microbial biomass and the degradation and sorption of the herbicide, isoproturon, were determined for each soil sample. A large proportion of the spatial variation in isoproturon degradation and sorption occurred at sampling intervals of less than 60 m; however, the sampling design did not resolve the variation present at scales greater than this. A sampling interval of 20-25 m should ensure that the main spatial structures are identified for isoproturon degradation rate and sorption without too great a loss of information in this field.

Despite claims in the trade literature that a number of recommended practices have been proved to lead to IT outsourcing success, few of these practices have been subject to disconfirmatory research. Even fewer have been tested statistically to determine whether they generalize to wider populations, or to determine the magnitude of their effect. In this paper, several recommended outsourcing practices associated with service level agreements (SLAs) and benchmarking are investigated. These practices are recommended extensively on the basis of case study research, yet they do have downsides, and they add substantially to the transaction costs of outsourcing. Based on a large survey of organizations engaged in IT outsourcing, this paper established that developing detailed SLAs did improve cost and service outcomes, and that clients who met with vendors more frequently to renegotiate service levels reported greater outsourcing success. The research also established that benchmarking both before outsourcing commences, and once the outsourcing contract is in place, led to improvements in cost and service outcomes. Benchmarking during the outsourcing contract had the greatest effect, accounting for 10% of the variance in a success vector that included strategic, technical, cost-related and service outcomes plus an overall evaluation of satisfaction and value.

Purpose: To analyse the effects of two interventions on the cognition and balance of institutionalized elderly people with mixed dementia. Methods: Fifty-four participants were allocated to three groups. Group 1 was assisted by an interdisciplinary programme comprising physiotherapy, occupational therapy and physical education. A physiotherapist alone carried out the intervention in group 2. Group 3 served as control. Assessors were blinded to guarantee the absence of bias. Cognitive functions were analysed with the Mini-Mental State Examination and the Brief Cognitive Screening Battery. Balance was assessed with the Berg Balance Scale and the Timed Get-Up-and-Go Test. Multivariate analysis of variance (MANOVA) was used to test possible main effects of the interventions. Results: The results showed benefits on the balance of subjects in both group 1 (F = 3.9, P < 0.05) and group 2 (F = 3.1, P < 0.05), compared with group 3. MANOVA did not indicate benefits on cognitive functions between groups 1 and 3 (F = 1.1, P > 0.05) or groups 2 and 3 (F = 1.6, P > 0.05). However, univariate analysis indicated some benefits of the interdisciplinary intervention on two specific domains measured by the Brief Cognitive Screening Battery (F = 26.5, P < 0.05; F = 4.4, P < 0.05). Conclusion: Six months of multidisciplinary or physiotherapeutic intervention improved participants' balance. Although global cognition did not improve through treatment, when the intervention was carried out on a multidisciplinary basis we observed an attenuation of the decline in two specific cognitive domains. Exercises applied in different contexts may have positive outcomes for people with dementia.
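As a sketch of the multivariate comparison underlying a MANOVA, Wilks' lambda for two outcome variables (say, a balance score and a cognition score) across several groups can be computed from the pooled within-group and total scatter matrices. The data layout and function here are illustrative only, not the study's analysis, which would also involve F approximations and more outcomes.

```python
def wilks_lambda(groups):
    """groups: list of groups, each a list of [y1, y2] observation pairs.
    Returns Wilks' lambda = det(W) / det(T) for two outcome variables."""
    allobs = [obs for g in groups for obs in g]
    n = len(allobs)
    gm = [sum(o[k] for o in allobs) / n for k in (0, 1)]   # grand mean

    def scatter(obs_list, center):
        s = [[0.0, 0.0], [0.0, 0.0]]
        for o in obs_list:
            d = [o[0] - center[0], o[1] - center[1]]
            for i in (0, 1):
                for j in (0, 1):
                    s[i][j] += d[i] * d[j]
        return s

    T = scatter(allobs, gm)                    # total scatter about the grand mean
    W = [[0.0, 0.0], [0.0, 0.0]]               # pooled within-group scatter
    for g in groups:
        m = [sum(o[k] for o in g) / len(g) for k in (0, 1)]
        s = scatter(g, m)
        for i in (0, 1):
            for j in (0, 1):
                W[i][j] += s[i][j]

    det = lambda M: M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return det(W) / det(T)
```

Lambda near 1 means group means barely differ on the outcome vector; small values indicate separation, which is then converted to an approximate F for significance testing.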

Identifying and comparing different steady states is an important task in clinical decision making. Data from disparate sources, comprising diverse patient-status information, have to be interpreted, and an expressive representation is key to comparing results. In this contribution we suggest a criterion for calculating a context-sensitive value based on variance analysis, and discuss its advantages and limitations with reference to a clinical data example obtained during anesthesia. Different drug plasma target levels of the anesthetic propofol were preset to reach and maintain clinically desirable steady-state conditions with target-controlled infusion (TCI). At the same time, systolic blood pressure was monitored, depth of anesthesia was recorded using the bispectral index (BIS), and propofol plasma concentrations were determined in venous blood samples. The presented analysis of variance (ANOVA) is used to quantify how accurately steady states can be monitored and compared using the three methods of measurement.
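A minimal one-way ANOVA F statistic of the kind used to compare measurements across preset steady states (one group of readings per plasma target level). The grouping and data shapes here are invented; the study's actual criterion is context-sensitive beyond this plain F.

```python
def one_way_anova_F(groups):
    """F statistic for comparing group means, e.g. one group of
    blood-pressure or BIS readings per propofol target level."""
    k = len(groups)                              # number of steady states
    n = sum(len(g) for g in groups)              # total observations
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

A large F here means the monitored variable separates the preset steady states well relative to its within-state scatter, which is one way to read "how accurately steady states can be monitored and compared".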
