974 results for variance-ratio tests


Relevance: 30.00%

Abstract:

Objective: The purpose of this study was to examine the effectiveness of a new analysis method of mfVEP objective perimetry in the early detection of glaucomatous visual field defects, compared with the gold-standard technique. Methods and patients: Three groups were tested in this study: normal controls (38 eyes), glaucoma patients (36 eyes), and glaucoma suspect patients (38 eyes). All subjects underwent two tests in one session: a standard 24-2 visual field test with the Humphrey Field Analyzer and a single mfVEP test. Analysis of the mfVEP results was carried out using the new analysis protocol: the hemifield sector analysis protocol. Results: Analysis of the mfVEP showed that the signal-to-noise ratio (SNR) difference between superior and inferior hemifields was statistically significant between the three groups (analysis of variance, P<0.001; 95% confidence intervals: 2.82-2.89 for the normal group, 2.25-2.29 for the glaucoma suspect group, and 1.67-1.73 for the glaucoma group). The difference between superior and inferior hemifield sectors and hemi-rings was statistically significant in 11/11 pairs of sectors and hemi-rings in the glaucoma group (t-test, P<0.001), in 5/11 pairs in the glaucoma suspect group (t-test, P<0.01), and in only 1/11 pairs in the normal group (t-test, P<0.9). The sensitivity and specificity of the hemifield sector analysis protocol were 97% and 86%, respectively, in detecting glaucoma, and 89% and 79% in glaucoma suspects. These results showed that the new analysis protocol was able to confirm existing visual field defects detected by standard perimetry, to differentiate between the three study groups with a clear distinction between normal subjects and those with suspected glaucoma, and to detect early visual field changes not detected by standard perimetry. In addition, the distinction between normal and glaucoma patients was especially clear and significant using this analysis. Conclusion: The new hemifield sector analysis protocol used in mfVEP testing can be used to detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. The protocol provides information about focal visual field differences across the horizontal midline, which can be used to differentiate between glaucoma and normal subjects. The sensitivity and specificity of the mfVEP test were very promising and correlated with other anatomical changes in glaucomatous visual field loss. The intersector analysis protocol can detect early field changes not detected by the standard Humphrey Field Analyzer test. © 2013 Mousa et al, publisher and licensee Dove Medical Press Ltd.
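
The protocol's core comparison is a paired test of superior versus inferior hemifield SNRs for each of the 11 sector/hemi-ring pairs. A minimal sketch follows (not the authors' code): the group size matches the abstract, but the SNR arrays, their distributions, and all variable names are placeholders.

```python
# Illustrative paired t-tests across hemifield sector pairs;
# SNR values below are synthetic placeholders, not study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_eyes, n_pairs = 36, 11                                 # e.g. the glaucoma group
snr_superior = rng.normal(1.9, 0.4, (n_eyes, n_pairs))   # hypothetical SNRs
snr_inferior = rng.normal(1.5, 0.4, (n_eyes, n_pairs))

for k in range(n_pairs):
    t, p = stats.ttest_rel(snr_superior[:, k], snr_inferior[:, k])
    print(f"pair {k + 1:2d}: t = {t:5.2f}, p = {p:.4f}")
```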

Relevance: 30.00%

Abstract:

Objective: To assess the accuracy and acceptability of polymerase chain reaction (PCR) and optical immunoassay (OIA) tests for the detection of maternal group B streptococcus (GBS) colonisation during labour, comparing their performance with the current UK policy of risk factor-based screening. Design: Diagnostic test accuracy study. Setting and population: Fourteen hundred women in labour at two large UK maternity units provided vaginal and rectal swabs for testing. Methods: The PCR and OIA index tests were compared with the reference standard of selective enriched culture, assessed blind to the index tests. Factors influencing neonatal GBS colonisation were assessed using multiple logistic regression, adjusting for antibiotic use. The acceptability of testing to participants was evaluated through a structured questionnaire administered after delivery. Main outcome measures: The sensitivity and specificity of PCR, OIA and risk factor-based screening. Results: Maternal GBS colonisation was 21% (19-24%) by combined vaginal and rectal swab enriched culture. PCR testing of either vaginal or rectal swabs was more sensitive (84% [79-88%] versus 72% [65-77%]) and more specific (87% [85-89%] versus 57% [53-60%]) than OIA (P<0.001), and far more sensitive (84% versus 30% [25-35%]) and specific (87% versus 80% [77-82%]) than risk factor-based screening (P<0.001). Maternal antibiotics (odds ratio, 0.22 [0.07-0.62]; P = 0.004) and a positive PCR test (odds ratio, 29.4 [15.8-54.8]; P<0.001) were strongly related to neonatal GBS colonisation, whereas risk factors were not (odds ratio, 1.44 [0.80-2.62]; P = 0.2). Conclusion: Intrapartum PCR screening is a more accurate predictor of maternal and neonatal GBS colonisation than OIA or risk factor-based screening, and is acceptable to women. © RCOG 2010 BJOG An International Journal of Obstetrics and Gynaecology.
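
Sensitivity and specificity in a study like this come straight from 2x2 counts against the reference standard. The sketch below uses hypothetical counts chosen to reproduce the reported 84%/87% for PCR at 21% prevalence in 1400 women; they are not the study's actual cell counts.

```python
# Minimal sensitivity/specificity computation from 2x2 confusion counts.
def sens_spec(tp, fn, tn, fp):
    """Return (sensitivity, specificity) given true/false pos/neg counts."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: 294 culture-positive and 1106 culture-negative women.
tp, fn = 247, 47
tn, fp = 962, 144
se, sp = sens_spec(tp, fn, tn, fp)
print(f"sensitivity = {se:.0%}, specificity = {sp:.0%}")  # ~84%, ~87%
```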

Relevance: 30.00%

Abstract:

Using a simulation analysis we show that non-trading can cause an overstatement of the observed illiquidity ratio. Our paper shows how this overstatement can be eliminated with a very simple adjustment to the Amihud illiquidity ratio. We find that the adjustment improves the relationship between the illiquidity ratio and measures of illiquidity calculated from transaction data. Asset pricing tests show that without the adjustment, illiquidity premia estimates can be understated by more than 17% for NYSE securities and by more than 24% for NASDAQ securities.
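
The unadjusted Amihud ratio averages |return|/volume over days with trading. The abstract does not spell out the paper's adjustment formula, so the correction shown below (damping the ratio by the fraction of days on which the security traded) is only an illustrative stand-in with the same flavour, not the authors' method.

```python
# Sketch of the Amihud illiquidity ratio plus an *illustrative*
# non-trading adjustment (assumed form, not the paper's formula).
import numpy as np

def amihud(returns, dollar_volume):
    """Mean |return| / dollar volume, computed over days with trading."""
    traded = dollar_volume > 0
    ratio = np.abs(returns[traded]) / dollar_volume[traded]
    return ratio.mean()

def amihud_adjusted(returns, dollar_volume):
    """Damp the ratio by the fraction of days the security traded,
    so securities with many zero-volume days are not overstated."""
    frac_traded = np.mean(dollar_volume > 0)
    return amihud(returns, dollar_volume) * frac_traded
```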

Relevance: 30.00%

Abstract:

The purpose of this study was to explore the impact of the Florida state-mandated Basic Skills Exit Tests (BSET) on the effectiveness of remedial instruction programs in serving the academically underprepared student population. The primary research question concerned whether the introduction of the BSET has resulted in remedial completers who are better prepared for college-level coursework.

This study used an ex post facto research design to examine the impact of the BSET on student readiness for subsequent college-level coursework at Miami-Dade Community College. Two-way analysis of variance was used to compare the performance of remedial and college-ready students before and after the introduction of the BSET requirement. Chi-square analysis was used to explore changes in the proportion of students completing and passing remedial courses. Finally, correlation analysis was used to explore the utility of the BSET in predicting subsequent college-level course performance. Differences based on subject area and race/ethnicity were explored.

The introduction of the BSET did not improve the performance of remedial completers in subsequent college-level courses in any of the subject areas. The BSET did have a negative impact on the success rate of students in remedial reading and mathematics courses. There was a significant decrease in minority students' likelihood of passing remedial reading and mathematics courses after the BSET was introduced. The reliability of the BSET is unacceptably low for all subject areas, based on estimates derived from administrations at M-DCC. Nevertheless, there was a significant positive relationship between BSET score and grade point average in subsequent college-level courses. This relationship varied by subject area and ethnicity, with the BSET reading score having no relationship with subsequent course performance for Black non-Hispanic students.

The BSET had no discernible positive effect on remedial student performance in subsequent college-level courses. In other words, the BSET has not enhanced the effectiveness of the remedial programs in preparing students for later coursework at M-DCC. The BSET had a negative impact on the progress and success of students in remedial reading and mathematics.
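
The core design is a two-way ANOVA crossing student group (remedial completer vs. college-ready) with era (pre- vs. post-BSET). A minimal sketch of that layout follows; the column names and GPA values are hypothetical, not the M-DCC records.

```python
# Illustrative two-way ANOVA: group x BSET era, with interaction.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.DataFrame({
    "gpa":   [2.4, 2.8, 2.1, 2.9, 2.5, 3.0, 2.2, 2.7],   # placeholder GPAs
    "group": ["remedial", "college_ready"] * 4,
    "era":   ["pre_bset"] * 4 + ["post_bset"] * 4,
})
model = ols("gpa ~ C(group) * C(era)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))   # main effects + interaction
```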

Relevance: 30.00%

Abstract:

We present new Holocene century- to millennial-scale proxies for the well-dated piston core MD99-2269 from Húnaflóadjúp on the North Iceland Shelf. The core is located in 365 m water depth and lies close to the fluctuating boundary between Atlantic and Arctic/Polar waters. The proxies are alkenone- and Mg/Ca-based SST (°C) estimates and stable δ13C and δ18O values on planktonic and benthic foraminifera. The data were converted to 60-yr equi-spaced time series. Significant trends in the data were extracted using singular spectrum analysis, and these accounted for between 50% and 70% of the variance. A comparison between these data and previously published climate proxies from MD99-2269 was carried out on a 14-variable data set covering the interval 400-9200 cal yr BP at 100-yr time steps. This analysis indicated that the first two principal component axes accounted for 57% of the variability, with high loadings clustering primarily into "nutrient" and "temperature" proxies. Clustering on the 100-yr time series indicated major changes in environment at ~6350 and ~3450 cal yr BP, which define early, mid- and late Holocene climatic intervals. We argue that a pervasive freshwater cap during the early Holocene resulted in warm SSTs, a stratified water column, and a depleted nutrient supply. The loss of the freshwater layer in the mid-Holocene resulted in high carbonate production, and the late Holocene/neoglacial interval was marked by significantly more variable sea surface conditions.
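
Singular spectrum analysis extracts trends by embedding the series in a lagged "trajectory" matrix, truncating its SVD, and diagonal-averaging back to a series. A self-contained sketch is below; the window length and number of retained components are illustrative choices, not the study's settings.

```python
# Minimal SSA trend extraction for one equi-spaced proxy series.
import numpy as np

def ssa_trend(x, window=40, n_components=2):
    n = len(x)
    k = n - window + 1
    # Embed the series into a trajectory (Hankel) matrix.
    traj = np.column_stack([x[i:i + window] for i in range(k)])
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    # Keep the leading components, then diagonal-average back to a series.
    approx = (u[:, :n_components] * s[:n_components]) @ vt[:n_components]
    trend = np.zeros(n)
    counts = np.zeros(n)
    for i in range(window):
        for j in range(k):
            trend[i + j] += approx[i, j]
            counts[i + j] += 1
    return trend / counts

# Synthetic demo: ~60-yr steps over a Holocene-length record.
t = np.linspace(400, 9200, 147)
x = np.sin(t / 1500) + 0.3 * np.random.default_rng(3).standard_normal(147)
smooth = ssa_trend(x)
```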

Relevance: 30.00%

Abstract:

Concentrations of Cd, Pb, Zn, Cu, Co, Ni, Fe, and Al2O3, water content, the amount of organic carbon, the 13C/12C ratio, and the 14C activity of the organic fraction were determined as a function of sediment depth in a 34 cm long box core from the Bornholm Basin (Baltic Sea). The average sedimentation rate was 2.4 mm/yr, so the core spans roughly the last 140 years. The upper portion of the core contained increasing amounts of 14C-inactive organic carbon and, above 3 cm depth, man-made 14C from atomic bomb tests. The concentrations of the heavy metals Cd, Pb, Zn, and Cu increase strongly towards the surface, while other metals, such as Fe, Ni, and Co, remain almost unchanged. This phenomenon is attributed to anthropogenic influences. A comparison of the Kieler Bucht, the Bornholm Basin, and the Gotland Basin shows that today the anthropogenic addition of Zn is about 100 mg/m² per year in all three basins. The onset of this Zn excess, however, is delayed by about 20 years in the Bornholm Basin and by about 40 years in the Gotland Basin. It is suggested that SW-NE transport of these anthropogenically mobilized metals may be related to periodic bottom-water renewal in the Baltic Sea sedimentary basins.

Relevance: 30.00%

Abstract:

Ocean acidification will likely have negative impacts on invertebrates producing skeletons composed of calcium carbonate. Skeletal solubility is partly controlled by the incorporation of "foreign" ions (e.g. magnesium) into the crystal lattice of these skeletal structures, a process that is sensitive to a variety of biological and environmental factors. Here we explore effects of life stage, oceanographic region of origin, and changes in the partial pressure of carbon dioxide in seawater (pCO2) on trace elemental composition in the purple sea urchin (Strongylocentrotus purpuratus). We show that, similar to other urchin taxa, adult purple sea urchins have the ability to precipitate skeleton composed of a range of biominerals spanning low- to high-Mg calcites. Mg/Ca and Sr/Ca ratios were substantially lower in adult spines than in adult tests. On the other hand, trace elemental composition was invariant among adults collected from four oceanographically distinct regions spanning a range of carbonate chemistry conditions (Oregon, Northern California, Central California, and Southern California). Skeletons of newly settled juvenile urchins that originated from adults from the four regions exhibited Mg/Ca and Sr/Ca intermediate between the adult spine and test endmembers, indicating that skeleton precipitated during early life stages is more soluble than adult spines and less soluble than adult tests. Mean skeletal Mg/Ca or Sr/Ca of juvenile skeleton did not vary with source region when larvae were reared under present-day, global-average seawater carbonate conditions (400 µatm; pHT = 8.02 ± 0.03 (1 SD); Omega calcite = 3.3 ± 0.2 (1 SD)). However, when reared under elevated pCO2 (900 µatm; pHT = 7.73 ± 0.03; Omega calcite = 1.8 ± 0.1), skeletal Sr/Ca in juveniles exhibited increased variance across the four regions. Although larvae from the northern populations (Oregon, Northern California, Central California) did not exhibit differences in Mg or Sr incorporation under elevated pCO2 (Sr/Ca = 2.10 ± 0.06 mmol/mol; Mg/Ca = 67.4 ± 3.9 mmol/mol), juveniles of Southern California origin partitioned ~8% more Sr into their skeletons when exposed to higher pCO2 (Sr/Ca = 2.26 ± 0.08 vs. 2.09 ± 0.005 mmol/mol, 1 SD). Together these results suggest that the diversity of carbonate mineralogies present across different skeletal structures and life stages in purple sea urchins does not translate into an equivalent geochemical plasticity of response associated with geographic variation or temporal shifts in seawater properties. Rather, the composition of S. purpuratus skeleton precipitated during both early and adult life history stages appears relatively robust to spatial gradients and predicted future changes in carbonate chemistry. An exception to this trend may arise during early life stages, where certain populations of purple sea urchins may alter skeletal mineral precipitation rates and composition beyond a given pCO2 threshold. This potential for geochemical plasticity during early development, in contrast to adult-stage geochemical resilience, adds to the growing body of evidence that ocean acidification can have differing effects across organismal life stages.
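
The claim of "increased variance" under elevated pCO2 is the kind of statement a homogeneity-of-variance test makes precise. A hedged sketch using Levene's test is below; the Sr/Ca values are placeholders, not the study's measurements, and the abstract does not state which test the authors used.

```python
# Illustrative Levene test: does Sr/Ca variance across regions differ
# between ambient and elevated pCO2 treatments? (Placeholder values.)
from scipy import stats

srca_400 = [2.09, 2.10, 2.08, 2.11]   # per-region values, ambient pCO2
srca_900 = [2.10, 2.04, 2.13, 2.26]   # per-region values, elevated pCO2
stat, p = stats.levene(srca_400, srca_900)
print(f"Levene W = {stat:.2f}, p = {p:.3f}")
```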

Relevance: 30.00%

Abstract:

The Development Permit System has been introduced with minimal directives for establishing a decision-making process. This stands in opposition to the long-established process for minor variances and suggests that the Development Permit System does not necessarily incorporate all of Ontario's fundamental planning principles. From this starting point, the study aimed to identify how minor variances are incorporated into the Development Permit System. The research was based around the following questions:

• How are 'minor variance' applications processed within the DPS?
• To what extent do the four tests of a minor variance influence the outcomes of lower-level applications in the DPS approval process?

A case study approach was used for this research. The single-case design employed both qualitative and quantitative research methods, including a review of academic literature, court cases, and official documents, as well as a content analysis of Class 1, 1A, and 2 Development Permit application files from the Town of Carleton Place that were decided between 2011 and 2015. The content analysis found that minor variance issues were most commonly assigned to Class 1 applications. Planning staff generally met approval timelines and embraced their delegated approval authority, readily attaching conditions to applications in order to mitigate off-site impacts. While staff met the regulatory requirements of the DPS, 'minor variance' applications were largely decided on impact alone, demonstrating that the principles established by the four tests, the defining quality of the minor variance approval process, had not transferred to the Development Permit System. At the same time, there was some evidence that the development community has not fully adjusted to the requirements of the new approvals process, as some applications were supported using a rationale containing the four tests. A set of four recommendations is offered, reflecting the main themes established by the findings. The first two are directed to the Province, the third to municipalities, and the fourth to developers and planning consultants:

1) Amend Ontario Regulation 608/06 so that provisions under Section 4(3)(e) fall under Section 4(2).
2) Change the rhetoric from "combining elements of minor variances" to "replacing minor variances".
3) Establish clear evaluation criteria.
4) Understand the evaluative criteria of the municipality in which you are working.

Relevance: 30.00%

Abstract:

There are many ways to protect a bond portfolio against an interest rate movement that could adversely affect its market value. One of them is to sell bond futures contracts so that changes in the portfolio's market value are offset by gains (or losses) in the futures market. The success of such an operation depends on the estimation of the hedge ratio, since it determines how many futures contracts must be sold to protect the portfolio. The objective of this study is to determine which of five hedge-ratio estimation methods (one naive and four theoretical) minimizes the variance of the hedged portfolio's return while sacrificing as little return as possible. To that end, we used nine portfolios of Government of Canada bonds with very different characteristics (coupon, maturity), hedged using the Government of Canada bond futures contract traded on the Montreal Exchange. The analysis of the results led us to conclude that the naive method produces better results than the theoretical methods when the portfolio to be hedged has characteristics similar to the instrument used as a hedge. In all other cases (where the portfolio to be hedged has characteristics very different from the futures contract used as a hedge), the performance of the naive method is rather poor, but no other method is consistently superior to the rest.
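
The textbook minimum-variance hedge ratio is h* = Cov(dP, dF) / Var(dF), estimated from historical changes in portfolio value (dP) and futures price (dF). The abstract does not name its four theoretical methods, so the sketch below shows this standard estimator only as a representative candidate; the data are synthetic.

```python
# Minimum-variance hedge ratio from historical value changes.
import numpy as np

def min_variance_hedge_ratio(dP, dF):
    """h* = Cov(dP, dF) / Var(dF)."""
    cov = np.cov(dP, dF, ddof=1)
    return cov[0, 1] / cov[1, 1]

# Hypothetical weekly changes over two years (the naive method would
# instead hedge one-for-one regardless of these estimates).
rng = np.random.default_rng(1)
dF = rng.normal(0.0, 1.0, 104)
dP = 0.8 * dF + rng.normal(0.0, 0.3, 104)
print(f"h* = {min_variance_hedge_ratio(dP, dF):.2f}")  # ~0.8
```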

Relevance: 30.00%

Abstract:

Increasingly, the main goals in industry are low-cost production, maximum quality, and the shortest possible manufacturing time. To reach these goals, industry frequently turns to computer numerical control (CNC) machine tools, since this technology achieves high precision and shorter processing times. CNC machine tools can be applied to different machining processes, such as turning, milling, and drilling, among others. Of all these processes, milling is the most widely used because of its versatility; it is normally used to machine metallic materials such as steel and cast iron. This work analyses the effects of varying four milling parameters (cutting speed, feed rate, radial depth of cut, and axial depth of cut), individually and through interactions between some of them, on the surface roughness of a hardened steel (steel 12738). Two optimization methods are used for this analysis: the Taguchi method and response surface methodology. The Taguchi method was used to reduce the number of possible combinations and, consequently, the number of trials to be run. Response surface methodology (RSM) was used to compare results with those obtained by the Taguchi method; according to some works in the specialized literature, RSM converges more quickly to an optimum. The Taguchi method is well known in industry, where it is used for quality control. It introduces useful concepts, such as robustness and quality loss, and is very helpful for identifying variation in the production system during the industrial process, quantifying that variation and making it possible to eliminate undesirable factors. With this method, an L16 orthogonal array was built; two levels were defined for each parameter and sixteen trials were performed. After each trial, the surface roughness of the part was measured. Based on the roughness measurements, the data were treated statistically through analysis of variance (ANOVA) to determine the influence of each parameter on surface roughness. The minimum measured roughness was 1.05 µm. The study also determined the contribution of each machining parameter and of their interactions. Analysis of the F-ratio values (ANOVA) reveals that the most important factors for minimizing surface roughness are the radial depth of cut and the interaction between radial and axial depth of cut, with contributions of about 30% and 24%, respectively. In a second stage, the same study was carried out using response surface methodology, in order to compare the results of the two methods and determine which optimization method better minimizes roughness. Response surface methodology is based on a set of mathematical and statistical techniques useful for modelling and analysing problems in which the response of interest is influenced by several variables and the goal is to optimize that response. Only five trials were run for this method, unlike with Taguchi, since in just five trials it reached roughness values lower than the average roughness of the Taguchi trials. The lowest value obtained with this method was 1.03 µm. It is therefore concluded that, for the trials performed, RSM is a more suitable optimization method than Taguchi: better results were obtained in a smaller number of trials, which implies less tool wear, shorter processing time, and a significant reduction in the material used.
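
Percent contributions in a Taguchi study are each factor's sum of squares as a share of the total. The sketch below runs an ANOVA on a 16-run, four-factor, two-level design in the spirit of the L16 array; the factor names (vc, f, ae, ap for cutting speed, feed rate, radial and axial depth of cut) and the Ra values are made up for illustration.

```python
# Illustrative ANOVA with percent contributions on a 2-level, 16-run design.
from itertools import product
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

levels = list(product([0, 1], repeat=4))            # 16 runs, 4 factors
df = pd.DataFrame(levels, columns=["vc", "f", "ae", "ap"])
df["Ra"] = [1.8, 1.6, 1.3, 1.2, 1.7, 1.5, 1.2, 1.1,
            1.9, 1.7, 1.4, 1.3, 1.8, 1.6, 1.3, 1.05]  # placeholder, in um

model = ols("Ra ~ vc + f + ae + ap + ae:ap", data=df).fit()
table = sm.stats.anova_lm(model, typ=2)
table["contrib_%"] = 100 * table["sum_sq"] / table["sum_sq"].sum()
print(table[["sum_sq", "F", "contrib_%"]])
```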

Relevance: 30.00%

Abstract:

Detecting change points in epidemic models has been studied by many scholars. Yao (1993) summarized five existing test statistics in the literature, among which the likelihood ratio statistic stood out for its power. However, all of the existing test statistics rest on the assumption that the population variance is known, which is unrealistic in practice. To avoid assuming a known population variance, a new test statistic for detecting epidemic change points is studied in this thesis. The new test statistic is parameter-free and more powerful than the existing test statistics. Different sample sizes and lengths of epidemic duration are used for the power comparison. Monte Carlo simulation is used to find the critical values of the new test statistic and to perform the power comparison. Based on the Monte Carlo simulation results, it can be concluded that the sample size and the length of the duration have some effect on the power of the tests. The new test statistic studied in this thesis has higher power than the existing test statistics in all cases.
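
The Monte Carlo step works the same way regardless of the statistic: simulate the no-change null many times, compute the statistic each time, and take an upper quantile as the critical value. The sketch below uses a generic studentized max-type epidemic statistic as a stand-in; it is not the thesis's statistic.

```python
# Monte Carlo critical values for a generic epidemic-change statistic.
import numpy as np

def epidemic_stat(x):
    """Max standardized partial-sum contrast over candidate intervals,
    studentized by the sample SD (so no known variance is assumed)."""
    n = len(x)
    s = np.cumsum(x - x.mean())
    best = 0.0
    for i in range(n - 1):
        for j in range(i + 1, n):
            m = j - i
            best = max(best, abs(s[j] - s[i]) / np.sqrt(m * (1 - m / n)))
    return best / x.std(ddof=1)

rng = np.random.default_rng(42)
n, reps = 50, 500
null = np.array([epidemic_stat(rng.standard_normal(n)) for _ in range(reps)])
print("5% critical value:", np.quantile(null, 0.95))
```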

Relevance: 30.00%

Abstract:

Antiphospholipid syndrome is an autoimmune disorder characterized by hypercoagulability that requires anticoagulant therapy as its cornerstone, with warfarin being the treatment of choice in cases requiring long-term management. However, patients who are positive for lupus anticoagulant represent a challenge because they are at higher risk of thrombotic events, and monitoring with the International Normalized Ratio (INR) is unreliable, since these antibodies interfere with phospholipid-based laboratory tests such as the prothrombin time (PT), yielding a prolonged baseline INR even before anticoagulant therapy is started. We therefore present the case of a patient with primary antiphospholipid syndrome and positive lupus anticoagulant who has had multiple thrombotic episodes despite receiving anticoagulant therapy. We also review the available literature and propose new INR targets for these patients, different from those currently recommended.

Relevance: 30.00%

Abstract:

Background: Indices predictive of central obesity include waist circumference (WC) and waist-to-height ratio (WHtR). The aims of this study were 1) to establish smoothed centile charts and LMS tables for WC and WHtR for Colombian youth, and 2) to evaluate the utility of these parameters as predictors of overweight and obesity. Method: A cross-sectional study of 7954 healthy Colombian schoolchildren [boys n=3460 and girls n=4494; mean (standard deviation) age 12.8 (2.3) years]. Weight, height, body mass index (BMI), WC, and WHtR were measured and their percentiles calculated. Appropriate cut-off points of WC and WHtR for overweight and obesity, as defined by the International Obesity Task Force (IOTF) definitions, were selected using receiver operating characteristic (ROC) analysis. The discriminating power of WC and WHtR was expressed as the area under the curve (AUC). Results: Reference values for WC and WHtR are presented. Mean WC increased and WHtR decreased with age for both genders. We found a moderate positive correlation between WC and BMI (r = 0.756, P < 0.01) and between WHtR and BMI (r = 0.604, P < 0.01). The ROC analysis showed high discriminating power in the identification of overweight and obesity for both measures in our sample population. Overall, WHtR was a slightly better predictor of overweight/obesity (AUC 95% CI 0.868-0.916) than WC (AUC 95% CI 0.862-0.904). Conclusion: This paper presents the first sex- and age-specific WC and WHtR percentiles among Colombian children and adolescents aged 9-17.9 years. By providing LMS tables for Latin American populations based on Colombian reference data, we hope to provide quantitative tools for the study of obesity and its comorbidities.
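
ROC-based cut-off selection typically scans all candidate thresholds and picks the one maximizing a criterion such as the Youden index (sensitivity + specificity - 1). A hedged sketch follows; the reference labels and WHtR values are synthetic, and the abstract does not state which criterion the authors used.

```python
# Illustrative ROC analysis and Youden-index cut-off selection.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(7)
overweight = rng.integers(0, 2, 500)                   # synthetic IOTF status
whtr = 0.46 + 0.05 * overweight + rng.normal(0, 0.03, 500)

fpr, tpr, thresholds = roc_curve(overweight, whtr)
youden = tpr - fpr
cut = thresholds[np.argmax(youden)]                    # optimal WHtR cut-off
print(f"AUC = {roc_auc_score(overweight, whtr):.3f}, cut-off = {cut:.3f}")
```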