985 results for GENERAL CORRELATION
Abstract:
Educational institutions of all levels invest large amounts of time and resources into instructional technology, with the goal of enhancing the educational effectiveness of the learning environment. The decisions made by instructors and institutions regarding the implementation of technology are guided by perceptions of usefulness held by those who are in control. The primary objective of this mixed methods study was to examine student and faculty perceptions of the technology being used in general education courses at a community college. This study builds upon and challenges the assertions of writers such as Prensky (2001a, 2001b) and Tapscott (1998), who claim that a vast difference in technology perception exists between generational groups, resulting in a diminished usefulness of technology in instruction. In this study, data were gathered through surveys and interviews of both students and faculty. The analysis used Kendall's Tau test for correlation between various student and faculty variables in various groupings, as well as typological analysis of the transcribed interview data. The analysis of the quantitative data revealed no relationship between age and perception of technology's usefulness. A positive relationship was found between the perceived frequency of technology use and the perceived effectiveness of technology, suggesting that both faculty members and students believed that the more technology is used, the more useful it is in instruction. The analysis of the qualitative data revealed that both faculty and students perceive technology to be useful, and that the most significant barriers to technology's usefulness include faulty hardware and software systems, lack of user support, and lack of training for faculty. The results of the study suggest that the differences in perception of technology between generations proposed by Prensky may not exist when comparing adults from the younger generation with adults from the older generation. Further, the study suggests that institutions continue to invest in instructional technology, with a focus on high levels of support and training for faculty, and more universal availability of specific technologies, including web access, in-class video, and presentation software. Adviser: Ronald Joekel
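For readers unfamiliar with the test, a minimal sketch of a Kendall's Tau correlation analysis of the kind described, using SciPy; the variable names and values are illustrative, not the study's survey data:

    # Kendall's Tau between age and a perceived-usefulness rating
    # (illustrative data, not the study's responses).
    from scipy.stats import kendalltau

    age = [19, 22, 25, 31, 38, 44, 52, 60]
    usefulness = [4, 5, 3, 4, 2, 5, 3, 4]   # e.g. Likert-scale ratings

    tau, p_value = kendalltau(age, usefulness)
    print(f"tau = {tau:.3f}, p = {p_value:.3f}")  # tau near 0: no monotonic association

A tau close to zero with a large p-value is the pattern consistent with the study's finding of no relationship between age and perceived usefulness.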
Abstract:
The Kellogg Shale of northern California has traditionally been considered late Eocene in age on the basis of benthic foraminifer, radiolarian, and diatom correlations. The 30-m-thick Kellogg section exposed west of Byron, California, however, contains middle Eocene planktonic foraminifers (Zone P12), coccoliths (Subzones CP13c and CP14a), silicoflagellates (Dictyocha hexacantha Zone), and diatoms. Quantitative studies of the silicoflagellates and diatoms show a general cooling trend through the section, consistent with paleoclimatic trends for this part of the middle Eocene (ca. 42-45 Ma) from elsewhere in the world. Seven new silicoflagellate taxa (Corbisema angularis, C. exilis, C. hastata miranda, C. inermis ballantina, C. regina, Dictyocha byronalis, Naviculopsis americana) and one new coccolithophorid species (Helicosphaera neolophota) are described.
Abstract:
Centronuclear myopathy (CNM) is a genetically heterogeneous disorder associated with general skeletal muscle weakness, type I fiber predominance and atrophy, and abnormally centralized nuclei. Autosomal dominant CNM is due to mutations in the large GTPase dynamin 2 (DNM2), a mechanochemical enzyme regulating cytoskeleton and membrane trafficking in cells. To date, 40 families with CNM-related DNM2 mutations have been described, and here we report 60 additional families encompassing a broad genotypic and phenotypic spectrum. In total, 18 different mutations are reported in 100 families; our cohort harbors nine known and four new mutations, including the first splice-site mutation. Genotype-phenotype correlation hypotheses are drawn from the published and new data, and allow an efficient screening strategy for molecular diagnosis. In addition to CNM, distinct DNM2 mutations are associated with Charcot-Marie-Tooth (CMT) peripheral neuropathy (CMTD1B and CMT2M), suggesting a tissue-specific impact of the mutations. In this study, we discuss the possible clinical overlap of CNM and CMT, and the biological significance of the respective mutations based on the known functions of dynamin 2 and its protein structure. Defects in membrane trafficking due to DNM2 mutations potentially represent a common pathological mechanism in CNM and CMT. Hum Mutat 33: 949-959, 2012.
Abstract:
A newly developed global atmospheric chemistry and circulation model (ECHAM5/MESSy1) was used to investigate the chemistry and transport of ozone precursors, with a focus on non-methane hydrocarbons. To this end, the model was extensively evaluated by comparing its results with measurements from various sources. The analysis shows that the model predicts the distribution of ozone realistically, in both magnitude and seasonal cycle. At the tropopause, the model reproduces the exchange between stratosphere and troposphere correctly without prescribed fluxes or concentrations. The model simulates the ozone precursors with varying degrees of fidelity relative to the measurements. Although the alkanes are reproduced well, some deviations arise for the alkenes. Of the oxidized species, formaldehyde (HCHO) is reproduced correctly, whereas the correlations between observations and model results for methanol (CH3OH) and acetone (CH3COCH3) are considerably worse. Several sensitivity studies were performed to improve the model's performance for oxidized species. These species are influenced by emission from and deposition to the ocean, and knowledge of gas exchange with the ocean carries large uncertainties. To improve the results of ECHAM5/MESSy1, the new submodel AIRSEA was developed and integrated into the MESSy structure. This submodel accounts for the gas exchange between ocean and atmosphere, including that of the oxidized species. AIRSEA, which requires information on the liquid-phase concentration of a gas in the ocean's surface water, was tested extensively. The application of the new submodel slightly improves the model results for acetone and methanol, although the use of a prescribed liquid-phase concentration strongly limits the success of the method, since measurements are not available in sufficient quantity. This work provides new insights into organic species and highlights the importance of ocean-atmosphere coupling for the budgets of many gases.
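For context, air-sea exchange submodels of this kind are typically built on the standard bulk (two-film) flux formulation; the exact AIRSEA parameterization may differ:

\[ F = k_w \left( C_w - \frac{C_a}{H} \right) \]

where F is the net gas flux, k_w the transfer velocity, C_w the liquid-phase concentration in surface water (the input AIRSEA requires), C_a the atmospheric concentration, and H the dimensionless Henry's law constant.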
Abstract:
The aim of the thesis is to propose a Bayesian estimation through Markov chain Monte Carlo of multidimensional item response theory models for graded responses with complex structures and correlated traits. In particular, this work focuses on the multiunidimensional and the additive underlying latent structures, considering that the first one is widely used and represents a classical approach in multidimensional item response analysis, while the second one is able to reflect the complexity of real interactions between items and respondents. A simulation study is conducted to evaluate the parameter recovery for the proposed models under different conditions (sample size, test and subtest length, number of response categories, and correlation structure). The results show that the parameter recovery is particularly sensitive to the sample size, due to the model complexity and the high number of parameters to be estimated. For a sufficiently large sample size the parameters of the multiunidimensional and additive graded response models are well reproduced. The results are also affected by the trade-off between the number of items constituting the test and the number of item categories. An application of the proposed models on response data collected to investigate Romagna and San Marino residents' perceptions and attitudes towards the tourism industry is also presented.
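For reference, the graded response model at the core of such analyses (Samejima's formulation; the multiunidimensional and additive structures extend the latent trait part) gives the probability that person i responds to item j in category k or higher as

\[ P(X_{ij} \ge k \mid \theta_i) = \frac{1}{1 + \exp\left[-a_j(\theta_i - b_{jk})\right]} \]

with discrimination a_j and ordered category thresholds b_jk; individual category probabilities follow as differences of adjacent cumulative probabilities.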
Abstract:
Qualitative assessment of spontaneous motor activity in early infancy is widely used in clinical practice. It enables the description of maturational changes of motor behavior in both healthy infants and infants who are at risk for later neurological impairment. These assessments are, however, time-consuming and are dependent upon professional experience. Therefore, a simple physiological method that describes the complex behavior of spontaneous movements (SMs) in infants would be helpful. In this methodological study, we aimed to determine whether time series of motor acceleration measurements at 40-44 weeks and 50-55 weeks gestational age in healthy infants exhibit fractal-like properties and if this self-affinity of the acceleration signal is sensitive to maturation. Healthy motor state was ensured by General Movement assessment. We assessed statistical persistence in the acceleration time series by calculating the scaling exponent α via detrended fluctuation analysis of the time series. In hand trajectories of SMs in infants we found a mean α value of 1.198 (95 % CI 1.167-1.230) at 40-44 weeks. Alpha changed significantly (p = 0.001) at 50-55 weeks to a mean of 1.102 (1.055-1.149). Complementary multilevel regression analysis confirmed a decreasing trend of α with increasing age. Statistical persistence of fluctuation in hand trajectories of SMs is sensitive to neurological maturation and can be characterized by a simple parameter α in an automated and observer-independent fashion. Future studies including children at risk for neurological impairment should evaluate whether this method could be used as an early clinical screening tool for later neurological compromise.
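For illustration, a minimal detrended fluctuation analysis that estimates the scaling exponent α as described; this is a generic DFA sketch, not the study's implementation:

    # Estimate the DFA scaling exponent alpha of a time series:
    # alpha is the slope of log F(n) against log n.
    import numpy as np

    def dfa_alpha(x, scales=(16, 32, 64, 128, 256)):
        y = np.cumsum(x - np.mean(x))               # integrated profile
        fluct = []
        for n in scales:
            f2 = []
            for i in range(len(y) // n):            # non-overlapping windows
                seg = y[i*n:(i+1)*n]
                t = np.arange(n)
                trend = np.polyval(np.polyfit(t, seg, 1), t)
                f2.append(np.mean((seg - trend)**2))
            fluct.append(np.sqrt(np.mean(f2)))
        return np.polyfit(np.log(scales), np.log(fluct), 1)[0]

    rng = np.random.default_rng(0)
    print(dfa_alpha(rng.standard_normal(4096)))     # white noise: alpha ~ 0.5

Values of α above 1, as reported for the infants' hand trajectories, indicate strong statistical persistence in the acceleration signal.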
Abstract:
Multislice computed tomography (MSCT) and magnetic resonance imaging (MRI) are increasingly used for forensic purposes. Based on broad experience in clinical neuroimaging, post-mortem MSCT and MRI were performed in 57 forensic cases with the goal of evaluating the radiological methods for their usability in forensic head and brain examination. An experienced clinical radiologist evaluated the imaging data. The results were compared to the autopsy findings, which served as the gold standard for common forensic neurotrauma findings such as skull fractures, soft tissue lesions of the scalp, various forms of intracranial hemorrhage, and signs of increased brain pressure. The sensitivity of the imaging methods ranged from 100% (e.g., heat-induced alterations, intracranial gas) to zero (e.g., mediobasal impression marks as a sign of increased brain pressure, plaques jaunes). The agreement between MRI and CT was 69%. The radiological methods most often failed in detecting lesions smaller than 3 mm, whereas they were generally satisfactory in the evaluation of intracranial hemorrhage. Due to its advanced 2D and 3D post-processing possibilities, CT in particular possessed certain advantages over autopsy with regard to forensic reconstruction. In several cases, MRI showed forensically relevant findings not seen during autopsy. The partly limited sensitivity of imaging observed in this retrospective study was based on several factors: besides general technical limitations, it became apparent that clinical radiologists require a sound basic forensic background in order to detect specific signs. Focused teaching sessions will be essential to improve the outcome of future examinations. On the other hand, autopsy protocols should be further standardized to allow an exact comparison of imaging and autopsy data. In consideration of these facts, MRI and CT have the power to play an important role in future forensic neuropathological examination.
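For clarity, sensitivity here is computed against autopsy as the gold standard,

\[ \text{sensitivity} = \frac{TP}{TP + FN} \]

where TP counts autopsy-confirmed findings that imaging detected and FN those it missed.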
Abstract:
Four papers, written in collaboration with the author's graduate school advisor, are presented. In the first paper, uniform and non-uniform Berry-Esseen (BE) bounds on the convergence to normality of a general class of nonlinear statistics are provided; novel applications to specific statistics, including the non-central Student's, Pearson's, and non-central Hotelling's statistics, are also stated. In the second paper, a BE bound on the rate of convergence of the F-statistic used in testing hypotheses from a general linear model is given. The third paper considers the asymptotic relative efficiency (ARE) between the Pearson, Spearman, and Kendall correlation statistics; conditions sufficient to ensure that the Spearman and Kendall statistics are equally (asymptotically) efficient are provided, and several models illustrating the use of such conditions are considered. Lastly, the fourth paper proves that, in the bivariate normal model, the ARE between any of these correlation statistics possesses certain monotonicity properties; quadratic lower and upper bounds on the ARE are stated as direct applications of such monotonicity patterns.
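For context, the classical bivariate normal identities that drive such ARE comparisons relate Kendall's tau and Spearman's rho_S to the Pearson correlation rho:

\[ \tau = \frac{2}{\pi}\arcsin(\rho), \qquad \rho_S = \frac{6}{\pi}\arcsin\!\left(\frac{\rho}{2}\right) \]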
Abstract:
In this article, the authors evaluate a merit function for 2D/3D registration called stochastic rank correlation (SRC). SRC is characterized by the fact that differences in image intensity do not influence the registration result; it therefore combines the numerical advantages of cross correlation (CC)-type merit functions with the flexibility of mutual-information-type merit functions. The basic idea is that registration is achieved on a random subset of the image, which allows for an efficient computation of Spearman's rank correlation coefficient. This measure is, by nature, invariant to monotonic intensity transforms in the images under comparison, which renders it an ideal solution for intramodal images acquired at different energy levels as encountered in intrafractional kV imaging in image-guided radiotherapy. Initial evaluation was undertaken using a 2D/3D registration reference image dataset of a cadaver spine. Even with no radiometric calibration, SRC shows a significant improvement in robustness and stability compared to CC. Pattern intensity, another merit function that was evaluated for comparison, gave rather poor results due to its limited convergence range. The time required for SRC with 5% image content compares well to the other merit functions; increasing the image content does not significantly influence the algorithm accuracy. The authors conclude that SRC is a promising measure for 2D/3D registration in IGRT and image-guided therapy in general.
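An illustrative sketch of the core SRC computation as described above; the sampling scheme and names are assumptions, not the authors' implementation:

    # Spearman's rank correlation on a random subset of pixels, so the
    # merit function is invariant to monotonic intensity transforms.
    import numpy as np
    from scipy.stats import spearmanr

    def src_merit(drr, xray, fraction=0.05, rng=None):
        if rng is None:
            rng = np.random.default_rng()
        idx = rng.choice(drr.size, int(fraction * drr.size), replace=False)
        rho, _ = spearmanr(drr.ravel()[idx], xray.ravel()[idx])
        return rho

    rng = np.random.default_rng(1)
    img = rng.random((256, 256))
    print(src_merit(img, np.sqrt(img), rng=rng))    # monotonic transform: rho = 1

Because ranks are unchanged by any monotonic intensity mapping, the measure tolerates the energy-level differences mentioned above without radiometric calibration.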
Abstract:
The close association between psychometric intelligence and general discrimination ability (GDA), conceptualized as a latent variable derived from performance on different sensory discrimination tasks, is empirically well-established but theoretically widely unclear. The present study contrasted two alternative explanations for this association. The first explanation is based on what Spearman (1904) referred to as a central function underlying this relationship, in the sense of the g factor of intelligence becoming most evident in GDA. In this case, correlations between different aspects of cognitive abilities, such as working memory (WM) capacity, and psychometric intelligence should be mediated by GDA if their correlation is caused by g. Alternatively, the second explanation for the relationship between psychometric intelligence and GDA proceeds from fMRI studies which emphasize the role of WM functioning for sensory discrimination. Given the well-known relationship between WM and psychometric intelligence, the relationship between GDA and psychometric intelligence might be attributed to WM. The present study investigated these two alternative explanations at the level of latent variables. In 197 young adults, a model in which WM mediated the relationship between GDA and psychometric intelligence described the data better than a model in which GDA mediated the relationship between WM and psychometric intelligence. Moreover, GDA failed to explain portions of variance of psychometric intelligence above and beyond WM. These findings clearly support the view that the association between psychometric intelligence and GDA must be understood in terms of WM functioning.
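In path-analytic terms, the two competing accounts amount to different mediation decompositions of the total effect on psychometric intelligence, c = c' + ab, where ab is the indirect effect through the mediator (GDA in the first model, WM in the second) and c' the remaining direct effect; the data favored the model routing the indirect effect through WM.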
Abstract:
In this paper, we extend the debate concerning Credit Default Swap valuation to include time-varying correlation and covariances. Traditional multivariate techniques treat the correlations between covariates as constant over time; however, this view is not supported by the data. Secondly, since financial data do not follow a normal distribution because of their heavy tails, modeling the data using a Generalized Linear Model (GLM) incorporating copulas emerges as a more robust technique than traditional approaches. This paper also includes an empirical analysis of the regime-switching dynamics of credit risk in the presence of liquidity, following the general practice of assuming that credit and market risk follow a Markov process. The study was based on Credit Default Swap data obtained from Bloomberg spanning the period January 1, 2004 to August 8, 2006. The empirical examination of the regime-switching tendencies provided quantitative support to the anecdotal view that liquidity decreases as credit quality deteriorates. The analysis also examined the joint probability distribution of the credit risk determinants across credit quality through the use of a copula function, which disaggregates the behavior embedded in the marginal gamma distributions so as to isolate the level of dependence captured in the copula function. The results suggest that the time-varying joint correlation matrix performed far better than the constant correlation matrix, the centerpiece of linear regression models.
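A hedged sketch of the copula construction described, with the dependence carried by a Gaussian copula and the heavy-tailed behavior by gamma marginals; all parameter values are illustrative:

    # Sample from a Gaussian copula with gamma marginals.
    import numpy as np
    from scipy.stats import norm, gamma

    rho = 0.6                                  # copula correlation (time-varying in the paper)
    cov = [[1.0, rho], [rho, 1.0]]
    rng = np.random.default_rng(0)

    z = rng.multivariate_normal([0.0, 0.0], cov, size=10_000)
    u = norm.cdf(z)                            # uniforms carrying the dependence
    spread = gamma.ppf(u[:, 0], a=2.0, scale=50.0)    # e.g. CDS spread, bps
    liquidity = gamma.ppf(u[:, 1], a=3.0, scale=10.0) # e.g. liquidity proxy

    print(np.corrcoef(spread, liquidity)[0, 1])

Disaggregating the joint distribution this way separates the dependence (the copula) from the marginal gamma behavior, which is exactly the separation the analysis exploits.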
Abstract:
With most clinical trials, missing data present a statistical problem in evaluating a treatment's efficacy. There are many methods commonly used to assess missing data; however, these methods leave room for bias to enter the study. This thesis was a secondary analysis of data taken from TIME, a phase 2 randomized clinical trial conducted to evaluate the safety and effect of the administration timing of bone marrow mononuclear cells (BMMNC) for subjects with acute myocardial infarction (AMI). We evaluated the effect of missing data by comparing the variance inflation factor (VIF) of the effect of therapy between all subjects and only subjects with complete data. Through the general linear model, an unbiased solution was found for the VIF of the treatment's efficacy using the weighted least squares method to incorporate missing data. Two groups were identified from the TIME data: 1) all subjects and 2) subjects with complete data (baseline and follow-up measurements). After the general solution was found for the VIF, it was migrated to Excel 2010 to evaluate data from TIME. The resulting numerical values from the two groups were compared to assess the effect of missing data. The VIF values from the TIME study were considerably lower in the group with missing data. By design, we varied the correlation factor in order to evaluate the VIFs of both groups. As the correlation factor increased, the VIF values increased at a faster rate in the group with only complete data. Furthermore, while varying the correlation factor, the number of subjects with missing data was also varied to see how missing data affect the VIF. When the number of subjects with only baseline data was increased, we saw a significant rate increase in VIF values in the group with only complete data, while the group with missing data saw a steady and consistent increase in the VIF. The same was seen when we varied the group with follow-up-only data. This essentially showed that the VIFs increased steadily when missing data were not ignored; when missing data were ignored, as with our comparison group, the VIF values increased sharply as correlation increased.
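For reference, the weighted least squares machinery invoked above is the standard one: with weight matrix W,

\[ \hat{\beta} = (X^{\top} W X)^{-1} X^{\top} W y, \qquad \operatorname{Var}(\hat{\beta}) = \sigma^{2} (X^{\top} W X)^{-1} \]

when W is proportional to the inverse error covariance; the VIF comparisons above track how this variance term behaves as missing baseline or follow-up measurements enter.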
Abstract:
The effects of five technological procedures and of the contents of total anthocyanins and condensed tannins on 19 fermentation-related aroma compounds of young red Mencia wines were studied. Multifactor ANOVA revealed that levels of those volatiles changed significantly over the length of storage in bottles and, to a lesser extent, due to the other technological factors considered; total anthocyanins and condensed tannins also changed significantly as a result of the five practices assayed. Five aroma compounds possessed an odour activity value >1 in all wines, and another four in some wines. Linear correlation between volatile compounds and general phenolic composition revealed that total anthocyanins were highly related to 14 different aroma compounds. Multifactor ANOVA, considering the content of total anthocyanins as a sixth random factor, revealed that this parameter significantly affected the contents of ethyl lactate, ethyl isovalerate, 1-pentanol and ethyl octanoate. Thus, the aroma of young red Mencia wines may be affected by levels of total anthocyanins.
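For reference, the odour activity value used above is the ratio of a compound's concentration to its odour threshold,

\[ \mathrm{OAV}_i = \frac{c_i}{\mathrm{OT}_i} \]

so compounds with OAV > 1 are taken to contribute perceptibly to the wine's aroma.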
Abstract:
Piotr Omenzetter and Simon Hoell’s work within the Lloyd’s Register Foundation Centre for Safety and Reliability Engineering at the University of Aberdeen is supported by Lloyd’s Register Foundation. The Foundation helps to protect life and property by supporting engineering-related education, public engagement and the application of research.