941 results for Multiple Hypothesis Testing


Relevance:

30.00%

Publisher:

Abstract:

Background For reliable assessment of ventilation inhomogeneity, multiple-breath washout (MBW) systems should be realistically validated. We describe a new lung model for in vitro validation under physiological conditions and the assessment of a new nitrogen (N2) MBW system. Methods The N2MBW setup indirectly measures the N2 fraction (FN2) from main-stream carbon dioxide (CO2) and side-stream oxygen (O2) signals: FN2 = 1 − FO2 − FCO2 − FArgon. For in vitro N2MBW, a double-chamber plastic lung model was filled with water, heated to 37°C, and ventilated at various lung volumes, respiratory rates, and FCO2. In vivo N2MBW was undertaken in triplicate on two occasions in 30 healthy adults. The primary N2MBW outcome was functional residual capacity (FRC). We assessed the in vitro error (√[(measured − model)²], i.e., the absolute difference) between measured and model FRC (100–4174 mL), and the between-test error of in vivo FRC, lung clearance index (LCI), and normalized phase III slope indices (Sacin and Scond). Results The model generated 145 FRCs under BTPS conditions and various breathing patterns. Mean (SD) error was 2.3 (1.7)%. For FRCs of 500 to 4174 mL, 121 (98%) were within 5%; for FRCs of 100 to 400 mL, the error was within 7%. In vivo between-test FRC error was 10.1 (8.2)%. LCI was the most reproducible ventilation inhomogeneity index. Conclusion The lung model generates lung volumes under the conditions encountered during clinical MBW testing and enables realistic validation of MBW systems. The new N2MBW system reliably measures lung volumes and delivers reproducible LCI values.
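
The indirect N2 computation and the error metric above can be made concrete in a few lines. The sketch below is a minimal illustration, not the validation system's software: the function names are invented, gas fractions are assumed to be on a 0-1 scale, the default argon fraction (0.93% of dry air) is a standard value rather than one stated in the abstract, and expressing the error as a percentage of the model volume is inferred from the reported percent errors.

```python
# Illustrative sketch (not the authors' code): indirect N2 fraction and the
# in vitro FRC error described in the abstract. Names are hypothetical;
# gas fractions are on a 0-1 scale, volumes in mL.

def n2_fraction(f_o2: float, f_co2: float, f_argon: float = 0.0093) -> float:
    """FN2 = 1 - FO2 - FCO2 - FArgon (default argon fraction is assumed, not from the source)."""
    return 1.0 - f_o2 - f_co2 - f_argon

def frc_error_percent(frc_measured: float, frc_model: float) -> float:
    """Error as sqrt((measured - model)^2), i.e. the absolute difference, in % of the model FRC."""
    return ((frc_measured - frc_model) ** 2) ** 0.5 / frc_model * 100.0

# Example: a measured FRC of 3100 mL against a 3000 mL model volume -> 3.33% error.
print(round(frc_error_percent(3100.0, 3000.0), 2))
```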

Relevance:

30.00%

Publisher:

Abstract:

Since the late eighties, economists have been regarding the transition from command to market economies in Central and Eastern Europe with intense interest. In addition to studying the transition per se, they have begun using the region as a testing ground on which to investigate the validity of certain classic economic propositions. In his research, comprising three articles written in English and totalling 40 pages, Mr. Hanousek uses the so-called "Czech national experiment" (voucher privatisation scheme) to test the permanent income hypothesis (PIH). He took as his inspiration Kreinin's recommendation: "Since data concerning the behaviour of windfall income recipients is relatively scanty, and since such data can constitute an important test of the permanent income hypothesis, it is of interest to bring to bear on the hypothesis whatever information is available". Mr. Hanousek argues that, since the transfer of property to Czech citizens from 1992 to 1994 through the voucher scheme was not anticipated, it can be regarded as windfall income. The average size of the windfall was more than three months' salary, and over 60 percent of the Czech population received this unexpected income. Furthermore, there are other reasons for conducting such an analysis in the Czech Republic. Firstly, the privatisation process took place quickly. Secondly, both the economy and consumer behaviour have been very stable. Thirdly, out of a total population of 10 million Czech citizens, an astonishing 6 million, that is, virtually every household, participated in the scheme. Thus Czech voucher privatisation provides a sample for testing the PIH almost equivalent to a full population, thereby avoiding problems with the distribution of windfalls. Compare this, for instance, with the fact that only 4% of the Israeli urban population received personal restitution from Germany, while the veterans who received the National Service Life Insurance Dividends amounted to less than 9% of the US population and were concentrated in certain age groups. But to begin with, Mr. Hanousek considers the question of whether the public perceives the transfer from the state to the individual as an increase in net wealth. It can be argued that the state is only divesting itself of assets that would otherwise provide a future source of transfers. According to this argument, assigning these assets to individuals creates an offsetting change in the present value of potential future transfers, so that individuals are no better off after the transfer. Mr. Hanousek disagrees with this approach. He points out that a change in the ownership of inefficient state-owned enterprises should lead to higher efficiency, which alone increases the value of enterprises and creates a windfall increase in citizens' portfolios. More importantly, the state and individuals had very different preferences during the transition. Despite government propaganda, it is doubtful that citizens of former communist countries viewed government-owned enterprises as being operated in the citizens' best interest. Moreover, it is unlikely that the public fully comprehended the sophisticated links between the state budget, state-owned enterprises, and transfers to individuals. Finally, the transfers were not equal across the population. Mr. Hanousek conducted a survey of 1263 individuals, dividing them into four monthly earnings categories.
After determining whether the respondent had participated in the voucher process, he asked those who had participated how much of what they received from voucher privatisation had been (a) spent on goods and services, (b) invested elsewhere, (c) transferred to newly emerging pension funds, (d) given to a family member, and (e) retained in their original form as an investment. Both the mean and the variance of the windfall rise with income. He obtained similar results with respect to education, where the mean (median) windfall for those with a basic school education was 13,600 Czech Crowns (CZK), a figure that increased to 15,000 CZK for those with a high school education without exams, 19,900 CZK for high school graduates with exams, and 24,600 CZK for university graduates. Mr. Hanousek concludes that it can be argued that higher income (and better educated) groups allocated their vouchers or timed the disposition of their shares better. He turns next to an analysis of how respondents reported using their windfalls. The key result is that only a relatively small number of individuals reported spending on goods. Overall, the results provide strong support for the permanent income hypothesis, the only apparent deviation being the fact that both men and women aged 26 to 35 apparently consume more than they should if the windfall were annuitised. This finding is still fully consistent with the PIH, however, if this group is at a stage in their life-cycle where, without the windfall, they would be borrowing to finance consumption associated with family formation etc. Indeed, the PIH predicts that individuals who would otherwise borrow to finance consumption would consume the windfall up to the level equal to the annuitised fraction of the increase in lifetime income plus the full amount of the previously planned borrowing for consumption. Greater consumption would then be financed, not from investing the windfall, but from avoidance of future repayment obligations for debts that would have been incurred without the windfall.
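
The annuitisation argument can be illustrated with a small worked example; the interest rate, horizon, and windfall size below are hypothetical and serve only to show why the PIH predicts that a one-time windfall raises annual consumption by a comparatively small amount.

```python
# Hypothetical illustration of the PIH prediction: a windfall W raises annual
# consumption only by its annuity value over the remaining horizon T at rate r.

def annuitised_consumption(windfall: float, r: float, years: int) -> float:
    """Constant annual payment whose present value at rate r over `years` equals the windfall."""
    return windfall * r / (1.0 - (1.0 + r) ** -years)

# E.g. a 15,000 CZK windfall, a 5% real rate, and a 30-year horizon give roughly
# 976 CZK per year, so the PIH predicts only a small rise in annual spending.
print(round(annuitised_consumption(15_000, 0.05, 30)))
```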

Relevance:

30.00%

Publisher:

Abstract:

The last few years have seen the advent of high-throughput technologies to analyze various properties of the transcriptome and proteome of several organisms. The congruency of these different data sources, or lack thereof, can shed light on the mechanisms that govern cellular function. A central challenge for bioinformatics research is to develop a unified framework for combining the multiple sources of functional genomics information and testing associations between them, thus obtaining a robust and integrated view of the underlying biology. We present a graph theoretic approach to test the significance of the association between multiple disparate sources of functional genomics data by proposing two statistical tests, namely edge permutation and node label permutation tests. We demonstrate the use of the proposed tests by finding significant association between a Gene Ontology-derived "predictome" and data obtained from mRNA expression and phenotypic experiments for Saccharomyces cerevisiae. Moreover, we employ the graph theoretic framework to recast a surprising discrepancy presented in Giaever et al. (2002) between gene expression and knockout phenotype, using expression data from a different set of experiments.
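
A node label permutation test of the kind described can be sketched compactly. The sketch below is a generic illustration, not the authors' implementation: the toy graph, labels, and association statistic (the number of edges joining equally labeled nodes) are hypothetical stand-ins for the predictome and the expression/phenotype data.

```python
# Hypothetical sketch of a node label permutation test: the observed statistic is
# the number of edges connecting nodes with the same functional label; the null
# distribution is obtained by shuffling labels over nodes while keeping the graph fixed.
import random

def same_label_edges(edges, labels):
    return sum(labels[u] == labels[v] for u, v in edges)

def node_label_permutation_test(edges, labels, n_perm=10_000, seed=0):
    rng = random.Random(seed)
    observed = same_label_edges(edges, labels)
    nodes = list(labels)
    hits = 0
    for _ in range(n_perm):
        shuffled = list(labels.values())
        rng.shuffle(shuffled)
        perm = dict(zip(nodes, shuffled))
        if same_label_edges(edges, perm) >= observed:
            hits += 1
    return observed, (hits + 1) / (n_perm + 1)  # one-sided empirical p-value

# Toy example: 4 genes, 2 functional labels, 3 edges.
edges = [("g1", "g2"), ("g2", "g3"), ("g3", "g4")]
labels = {"g1": "A", "g2": "A", "g3": "B", "g4": "B"}
print(node_label_permutation_test(edges, labels))
```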

Relevance:

30.00%

Publisher:

Abstract:

There has been a continuous evolutionary process in asphalt pavement design. In the beginning it was crude and based on past experience. Through research, empirical methods were developed based on materials' response to specific loading at the AASHO Road Test. Today, pavement design has progressed to a mechanistic-empirical method. This methodology takes into account the mechanical properties of the individual layers and uses empirical relationships to relate them to performance. The mechanical tests that are used as part of this methodology include dynamic modulus and flow number, which have been shown to correlate with field pavement performance. This thesis was based on a portion of a research project being conducted at Michigan Technological University (MTU) for the Wisconsin Department of Transportation (WisDOT). The global scope of this project dealt with the development of a library of values for the mechanical properties of the asphalt pavement mixtures paved in Wisconsin. Additionally, a comparison of the current pavement design with that of the new AASHTO Design Guide was conducted. This thesis describes the development of the current pavement design methodology as well as the associated tests as part of a literature review. This report also details the materials that were sampled from field operations around the state of Wisconsin and their testing preparation and procedures. Testing was conducted on available round robin and three Wisconsin mixtures, and the main results of the research were:
- The test history of the Superpave SPT (fatigue and permanent deformation dynamic modulus) does not affect the mean response for either dynamic modulus or flow number, but does increase the variability in the flow number test results.
- The method of specimen preparation, compacting to test geometry versus sawing/coring to test geometry, does not appear to statistically affect the intermediate and high temperature dynamic modulus and flow number test results.
- The 2002 AASHTO Design Guide simulations support the findings of the statistical analyses that the method of specimen preparation did not impact the performance of the HMA as a structural layer as predicted by the Design Guide software.
- The methodologies for determining the temperature-viscosity relationship as stipulated by Witczak are sensitive to the viscosity test temperatures employed (illustrated in the sketch following this abstract).
- An increase in asphalt binder content of 0.3% was found to increase the dynamic modulus at the intermediate and high test temperatures as well as the flow number. This result was based on the testing that was conducted and contradicts previous research and the hypothesis put forth for this thesis; it should be used with caution and requires further review.
- Based on the limited results presented herein, the asphalt binder grade appears to have a greater impact on performance in the Superpave SPT than aggregate angularity.
- Dynamic modulus and flow number were shown to increase with traffic level (requiring an increase in aggregate angularity) and with a decrease in air voids, confirming the hypotheses regarding these two factors.
- Accumulated micro-strain at flow number, as opposed to the flow number itself, appeared to be a promising measure for comparing the quality of specimens within a specific mixture.
- At the current time, the Design Guide and its associated software need further improvement prior to implementation by owner/agencies.
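
The temperature-viscosity relationship referred to above is commonly written in the A-VTS form, log10(log10 η) = A + VTS · log10(T_R), with η in centipoise and T_R in degrees Rankine. The sketch below fits A and VTS by least squares from hypothetical binder data and only illustrates why the fitted parameters depend on which test temperatures are included; it is not part of the thesis work.

```python
# Illustrative least-squares fit of the A-VTS temperature-viscosity relationship,
#   log10(log10(eta)) = A + VTS * log10(T_R),
# with eta in centipoise and T_R in degrees Rankine. Input values are hypothetical.
import math

def fit_a_vts(temps_f, viscosities_cp):
    """Return (A, VTS) from paired test temperatures (deg F) and viscosities (cP)."""
    xs = [math.log10(t + 459.67) for t in temps_f]            # convert deg F to Rankine
    ys = [math.log10(math.log10(eta)) for eta in viscosities_cp]
    n = len(xs)
    x_bar, y_bar = sum(xs) / n, sum(ys) / n
    vts = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
          sum((x - x_bar) ** 2 for x in xs)
    return y_bar - vts * x_bar, vts

# Hypothetical binder data at three test temperatures; dropping or adding a
# temperature changes the fitted slope, which is the sensitivity noted above.
print(fit_a_vts([140.0, 275.0, 347.0], [2.0e5, 500.0, 80.0]))
```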

Relevance:

30.00%

Publisher:

Abstract:

With the development of genotyping and next-generation sequencing technologies, multi-marker testing in genome-wide association studies and rare variant association studies has become an active research area in statistical genetics. This dissertation presents three methodologies for association studies that exploit different features of genetic data and demonstrates how to use them to test genetic association hypotheses. The methods address three scenarios: 1) multi-marker testing for regions in strong linkage disequilibrium, 2) multi-marker testing for family-based association studies, and 3) multi-marker testing for rare variant association studies. I also discuss the advantages of these methods and demonstrate their power through simulation studies and applications to real genetic data.
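
As a minimal illustration of the rare variant scenario, a burden-style test collapses the rare-allele counts in a region into one score per subject and compares that score between cases and controls. The sketch below uses toy data and a permutation p-value; it is a generic example, not one of the dissertation's methods.

```python
# Hypothetical burden-style rare variant test: collapse rare variant counts per
# subject into one burden score, then compare scores between cases and controls
# with a permutation p-value.
import random

def burden_test(genotypes, is_case, n_perm=10_000, seed=0):
    """genotypes: list of per-subject lists of 0/1/2 rare-allele counts."""
    rng = random.Random(seed)
    burden = [sum(g) for g in genotypes]

    def mean_diff(labels):
        cases = [b for b, c in zip(burden, labels) if c]
        ctrls = [b for b, c in zip(burden, labels) if not c]
        return sum(cases) / len(cases) - sum(ctrls) / len(ctrls)

    observed = mean_diff(is_case)
    labels = list(is_case)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(labels)
        if abs(mean_diff(labels)) >= abs(observed):
            hits += 1
    return observed, (hits + 1) / (n_perm + 1)

# Toy data: 4 cases and 4 controls, 3 rare variants each.
genos = [[1, 0, 1], [0, 1, 1], [1, 1, 0], [0, 0, 1],   # cases
         [0, 0, 0], [1, 0, 0], [0, 0, 0], [0, 1, 0]]   # controls
print(burden_test(genos, [True] * 4 + [False] * 4))
```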

Relevance:

30.00%

Publisher:

Abstract:

This report shares my efforts in developing a solid unit of instruction that has a clear focus on student outcomes. I have been a teacher for 20 years and have been writing and revising curricula for much of that time. However, most has been developed without the benefit of current research on how students learn and did not focus on what and how students are learning. My journey as a teacher has involved a lot of trial and error. My traditional method of teaching is to look at the benchmarks (now content expectations) to see what needs to be covered. My unit consists of having students read the appropriate sections in the textbook, complete work sheets, watch a video, and take some notes. I try to include at least one hands-on activity, one or more quizzes, and the traditional end-of-unit test consisting mostly of multiple choice questions I find in the textbook. I try to be engaging, make the lessons fun, and hope that at the end of the unit my students get whatever concepts I've presented so that we can move on to the next topic. I want to increase students' understanding of science concepts and their ability to connect understanding to the real world. However, sometimes I feel that my lessons are missing something. For a long time I have wanted to develop a unit of instruction that I know is an effective tool for the teaching and learning of science. In this report, I describe my efforts to reform my curricula using the "Understanding by Design" process. I want to see if this style of curriculum design will help me be a more effective teacher and if it will lead to an increase in student learning. My hypothesis is that this new (for me) approach to teaching will lead to increased understanding of science concepts among students because it is based on purposefully thinking about learning targets based on "big ideas" in science. For my reformed curricula I incorporate lessons from several outstanding programs I've been involved with, including EpiCenter (Purdue University), Incorporated Research Institutions for Seismology (IRIS), the Master of Science Program in Applied Science Education at Michigan Technological University, and the Michigan Association for Computer Users in Learning (MACUL). In this report, I present the methodology on how I developed a new unit of instruction based on the Understanding by Design process. I present several lessons and learning plans I've developed for the unit that follow the 5E Learning Cycle as appendices at the end of this report. I also include the results of pilot testing of one of the lessons. Although the lesson I pilot-tested was not as successful in increasing student learning outcomes as I had anticipated, the development process I followed was helpful in that it required me to focus on important concepts. Conducting the pilot test was also helpful to me because it led me to identify ways in which I could improve upon the lesson in the future.

Relevance:

30.00%

Publisher:

Abstract:

Hypothesis: Early recognition of coagulopathy may improve the care of patients with multiple injuries. Rapid thrombelastography (RapidTEG) is a new variant of thrombelastography (TEG) in which coagulation is initiated by the addition of protein tissue factor. The kinetics of coagulation and the times to measurement were compared for two variants of TEG: RapidTEG and conventional TEG, in which coagulation was initiated with kaolin. The measurements were performed on blood samples from 20 patients with multiple injuries. The RapidTEG results were also compared with conventional measurements of blood coagulation. The mean time to perform the RapidTEG test was 19.2 ± 3.1 minutes (mean ± SD), in comparison with 29.9 ± 4.3 minutes for kaolin TEG and 34.1 ± 14.5 minutes for conventional coagulation tests. Measured from admission of the patients to the resuscitation bay until the results were available, the mean time was 30.8 ± 5.72 minutes for RapidTEG, in comparison with 41.5 ± 5.66 minutes for kaolin TEG and 64.9 ± 18.8 minutes for conventional coagulation tests. There were significant correlations between the RapidTEG results and those from kaolin TEG and conventional coagulation tests. RapidTEG is the most rapid available test for providing reliable information on coagulopathy in patients with multiple injuries. This has implications for improving patient care.

Relevance:

30.00%

Publisher:

Abstract:

Our study evaluates the dimensionality and equivalence of social trust across cultural contexts, using new data from Switzerland and the World Values Survey 2005–2008. Whereas some scholars assert that trust should be regarded as a coherent concept, others claim that trust is better conceived of as a multidimensional concept. In contrast to the conventional dichotomy of the forms of social trust, we identify three distinct forms of trust, namely, particularized, generalized, and identity-based trust. Moreover, we dispute the view that respondents understand the wording of survey questions regarding social trust differently across cultural contexts, which would imply that comparative research on trust is a pointless endeavor. Applying multiple-group confirmatory factor analysis to the various constructs of social trust, we conclude that one may study relationships among the three forms of trust and other theoretical constructs as well as compare latent means across cultural contexts. Our analyses therefore provide an optimistic outlook for future comparative analyses that investigate forms of social trust across cultural contexts.
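
For readers unfamiliar with the machinery, the comparison of latent means in multiple-group confirmatory factor analysis rests on a measurement model of roughly the following textbook form; this is a generic formulation with conventional notation, not the authors' exact specification.

```latex
% Generic multiple-group CFA measurement model (textbook form, not the authors'
% exact specification). x_{ig}: observed trust items for respondent i in group g;
% xi_{ig}: the three latent trust factors (particularized, generalized, identity-based).
\[
\mathbf{x}_{ig} = \boldsymbol{\tau}_g + \boldsymbol{\Lambda}_g \, \boldsymbol{\xi}_{ig} + \boldsymbol{\delta}_{ig},
\qquad
\boldsymbol{\xi}_{ig} \sim \mathcal{N}(\boldsymbol{\kappa}_g, \boldsymbol{\Phi}_g),
\qquad
\boldsymbol{\delta}_{ig} \sim \mathcal{N}(\mathbf{0}, \boldsymbol{\Theta}_g)
\]
% Metric invariance constrains the loadings to be equal across the G groups,
% \Lambda_1 = \dots = \Lambda_G; scalar invariance additionally constrains the
% intercepts, \tau_1 = \dots = \tau_G. Only under (at least partial) scalar
% invariance can the latent means \kappa_g be meaningfully compared across groups.
```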

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND Nitrogen multiple-breath washout (N2 MBW) using 100% oxygen (O2) has regained interest to assess the efficiency of tracer gas clearance in, for example, children with cystic fibrosis (CF). However, the influence of hyperoxia on infants' respiratory control is unclear. We assessed the safety and impact on breathing pattern of hyperoxia, and whether exposure to 40% O2 first induces tolerance to subsequent 100% O2 for N2 MBW. METHODS We prospectively enrolled 39 infants aged 3-57 weeks: 15 infants with CF (8 sedated for testing) and 24 healthy controls. Infants were consecutively allocated to protocols consisting of 100% O2 or 40/100% O2 administered for 30 breaths. Lung function was measured using an ultrasonic flowmeter setup. The primary outcome was tidal volume (VT). RESULTS None of the infants experienced apnea, desaturation, or bradycardia. Both protocols initially induced hypoventilation. VT temporarily declined in 33/39 infants across 10-25 breaths. Hypoventilation occurred independent of age, disease, and sedation. In the new 40/100% O2 protocol, VT returned to baseline during 40% O2 and remained stable during 100% O2 exposure. End-tidal carbon dioxide monitored online did not change. CONCLUSION The classical N2 MBW protocol with 100% O2 may change infants' breathing patterns. The new protocol with 40% O2 induces tolerance to hyperoxia and does not lead to changes in breathing patterns during the later N2 washout using 100% O2. Both protocols are safe, and the new protocol seems an attractive option for N2 MBW in infants.

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: (1) To describe the ultrasonographic appearance of multiple congenital ocular anomalies (MCOA) in the eyes of horses with the PMEL17 (Silver) mutant gene. (2) To compare the accuracy of B-mode ocular ultrasound to conventional direct ophthalmoscopy. ANIMALS STUDIED: Sixty-seven Comtois and 18 Rocky Mountain horses were included in the study. PROCEDURES: Horses were classified as being carriers or noncarriers of the PMEL17 mutant allele based on coat color or genetic testing. Direct ophthalmoscopy followed by standardized ultrasonographic examination was performed in all horses. RESULTS: Seventy-five of 85 horses (88.24%) carried at least one copy of the Silver mutant allele. Cornea globosa, severe iridal hypoplasia, uveal cysts, cataracts, and retinal detachment could be appreciated with ultrasound. Carrier horses had statistically significantly increased anterior chamber depth and decreased thickness of anterior uvea compared with noncarriers (P < 0.05). Uveal cysts had a wide range of location and ultrasonographic appearances. In 51/73 (69.86%) carrier horses, ultrasound detected ciliary cysts that were missed with direct ophthalmoscopy. CONCLUSIONS: In this study, ultrasonography was useful to identify uveal cysts in PMEL17 mutant carriers and to assess anterior chamber depth.

Relevance:

30.00%

Publisher:

Abstract:

A case of pulmonary tuberculosis caused by Mycobacterium tuberculosis was diagnosed in a horse. Clinical evaluation performed prior to euthanasia did not suggest tuberculosis, but postmortem examination provided pathological and bacteriological evidence of mycobacteriosis. In the lungs, multiple tuberculoid granulomas communicating with the bronchiolar lumen, pleural effusion, and a granulomatous lymphadenitis involving mediastinal and tracheobronchial lymph nodes were found. Serologic response to M. tuberculosis antigens was detected in the infected horse, but not in the group of 42 potentially exposed animals (18 horses, 14 alpacas, 6 donkeys, and 4 dogs) which showed no signs of disease. Diagnosis of tuberculosis in live horses remains extremely difficult. Four of 20 animal handlers at the farm were positive for tuberculous infection upon follow-up testing by interferon-gamma release assay, indicating a possibility of interspecies transmission of M. tuberculosis.

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE Texture analysis is an alternative method to quantitatively assess MR images. In this study, we introduce dynamic texture parameter analysis (DTPA), a novel technique to investigate the temporal evolution of texture parameters using dynamic susceptibility contrast-enhanced (DSCE) imaging. Here, we aim to introduce the method and its application to enhancing lesions (EL), non-enhancing lesions (NEL), and normal-appearing white matter (NAWM) in multiple sclerosis (MS). METHODS We investigated 18 patients with MS and clinically isolated syndrome (CIS), according to the 2010 McDonald criteria, using DSCE imaging at different field strengths (1.5 and 3 Tesla). Tissues of interest (TOIs) were defined within 27 EL, 29 NEL, and 37 NAWM areas after normalization, and eight histogram-based texture parameter maps (TPMs) were computed. TPMs quantify the heterogeneity of the TOI. For every TOI, the average, variance, skewness, kurtosis, and variance-of-the-variance statistical parameters were calculated. These TOI parameters were further analyzed using one-way ANOVA followed by Wilcoxon rank-sum tests corrected for multiple comparisons. RESULTS Tissue- and time-dependent differences were observed in the dynamics of the computed texture parameters. Sixteen parameters discriminated between EL, NEL, and NAWM (pAVG = 0.0005). Significant differences in the DTPA texture maps were found during inflow (52 parameters), outflow (40 parameters), and reperfusion (62 parameters). The strongest discriminators among the TPMs were the variance-related parameters, while the skewness and kurtosis TPMs were in general less sensitive to differences between the tissues. CONCLUSION DTPA of DSCE image time series revealed characteristic time responses for ELs, NELs, and NAWM. This may be further used for a refined quantitative grading of MS lesions during their evolution from the acute to the chronic state. DTPA discriminates lesions beyond features of enhancement or T2 hypersignal, on a numeric scale that allows a more subtle grading of MS lesions.
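
The histogram-based statistics named above (average, variance, skewness, kurtosis) can be computed per tissue of interest as in the generic sketch below; the array names and toy data are hypothetical, and this is not the DTPA implementation.

```python
# Generic histogram-based texture statistics for one tissue of interest (TOI) at one
# time point of a DSCE series; illustrative only, not the DTPA implementation.
import numpy as np

def toi_texture_stats(image: np.ndarray, mask: np.ndarray) -> dict:
    """image: 2-D signal map; mask: boolean TOI mask of the same shape."""
    vals = image[mask].astype(float)
    mean = vals.mean()
    var = vals.var()
    std = vals.std()
    skew = ((vals - mean) ** 3).mean() / std**3 if std > 0 else 0.0
    kurt = ((vals - mean) ** 4).mean() / std**4 if std > 0 else 0.0
    return {"average": mean, "variance": var, "skewness": skew, "kurtosis": kurt}

# Tracking these statistics frame by frame through inflow, outflow, and reperfusion
# gives per-TOI time courses that can then be compared between EL, NEL, and NAWM.
rng = np.random.default_rng(0)
frame = rng.normal(100.0, 10.0, size=(64, 64))     # toy signal map
toi = np.zeros((64, 64), dtype=bool)
toi[20:40, 20:40] = True                           # toy TOI mask
print(toi_texture_stats(frame, toi))
```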

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: To review our clinical experience and determine whether there are appropriate signs and symptoms that should prompt POLG sequencing prior to valproic acid (VPA) dosing in patients with seizures. METHODS: Four patients who developed VPA-induced hepatotoxicity were examined for POLG sequence variations. A subsequent chart review was used to describe the clinical course prior to and after VPA dosing. RESULTS: Four patients of different ethnicities, aged 3-18 years, developed VPA-induced hepatotoxicity. All were given VPA due to intractable partial seizures. Three of the patients had developed epilepsia partialis continua. The time from VPA exposure to liver failure was between 2 and 3 months. Liver failure was reversible in one patient. Molecular studies revealed homozygous p.R597W or p.A467T mutations in two patients. The other two patients showed compound heterozygous mutations, p.A467T/p.Q68X and p.L83P/p.G888S. Clinical findings and POLG mutations were diagnostic of Alpers-Huttenlocher syndrome. CONCLUSION: Our cases underscore several important findings: POLG mutations have been observed in every ethnic group studied to date; early predominance of epileptiform discharges over the occipital region is common in POLG-induced epilepsy; EEG and MRI findings vary between patients and stages of the disease; and VPA dosing at any stage of Alpers-Huttenlocher syndrome can precipitate liver failure. Our data support an emerging proposal that POLG gene testing should be considered in any child or adolescent who presents with or develops intractable seizures, with or without status epilepticus or epilepsia partialis continua, particularly when there is a history of psychomotor regression.

Relevance:

30.00%

Publisher:

Abstract:

Multiple sclerosis (MS) is the most common demyelinating disease affecting the central nervous system. There is no cure for MS and current therapies have limited efficacy. While the majority of individuals with MS develop significant clinical disability, a subset experiences a disease course with minimal impairment even in the presence of significant apparent tissue damage on magnetic resonance imaging (MRI). The current studies combined functional MRI and diffusion tensor imaging (DTI) to elucidate brain mechanisms associated with lack of clinical disability in patients with MS. Recent evidence has implicated cortical reorganization as a mechanism to limit the clinical manifestation of the disease. Functional MRI was used to test the hypothesis that non-disabled MS patients (Expanded Disability Status Scale ≤ 1.5) show increased recruitment of cognitive control regions (dorsolateral prefrontal and anterior cingulate cortex) while performing sensory, motor and cognitive tasks. Compared to matched healthy controls, patients increased activation of cognitive control brain regions when performing non-dominant hand movements and the 2-back working memory task. Using dynamic causal modeling, we tested whether increased cognitive control recruitment is associated with alterations in connectivity in the working memory functional network. Patients exhibited network connectivity similar to that of control subjects when performing working memory tasks. We subsequently investigated the integrity of major white matter tracts to assess structural connectivity and its relation to activation and functional integration of the cognitive control system. Patients showed substantial alterations in callosal, inferior and posterior white matter tracts and less pronounced involvement of the corticospinal tracts and superior longitudinal fasciculi (SLF). Decreased structural integrity within the right SLF in patients was associated with decreased performance, and decreased activation and connectivity of the cognitive control system when performing working memory tasks. These studies suggest that patients with MS without clinical disability increase cognitive control system recruitment across functional domains and rely on preserved functional and structural connectivity of the brain regions associated with this network. Moreover, the current studies show the usefulness of combining brain activation data from functional MRI and structural connectivity data from DTI to improve our understanding of brain adaptation mechanisms to neurological disease.

Relevance:

30.00%

Publisher:

Abstract:

A graphing method was developed and tested to estimate gestational ages pre- and postnatally in a consistent manner, for epidemiological research and clinical purposes, on the fetuses/infants of women with few consistent prenatal estimators of gestational age. Each patient's available data were plotted on a single-page graph to give a comprehensive overview of that patient. A hierarchical classification of gestational age determination was then applied in a systematic manner, and reasonable gestational age estimates were produced. The method was tested for validity and reliability on 50 women who had known dates for their last menstrual period or dates of conception, multiple ultrasound examinations, and other gestational age estimating measures. The feasibility of the procedure was then tested on 1223 low-income women with few gestational age estimators. The graphing method proved to have high inter- and intrarater reliability. It was quick, easy to use, inexpensive, and did not require special equipment. The graphing method estimate of gestational age for each infant was tested against the last menstrual period gestational age estimate using paired t-tests, F tests, and the Kolmogorov-Smirnov test of similar populations, producing a 98 percent probability or better that the means and data populations were the same. Less than 5 percent of the infants' gestational ages were misclassified using the graphing method, much lower than the amount of misclassification produced by ultrasound or neonatal examination estimates.
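
The validity comparison described above amounts to standard paired and distributional tests. The sketch below runs those tests on simulated gestational age data (in weeks) and is only an illustration of the procedure, not the study's analysis or data.

```python
# Hypothetical illustration of the validity tests named in the abstract: paired t-test,
# F test for equality of variances, and two-sample Kolmogorov-Smirnov test comparing
# graphing-method estimates with last-menstrual-period (LMP) estimates.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
lmp = rng.normal(39.0, 1.5, size=50)           # simulated LMP-based estimates (weeks)
graph = lmp + rng.normal(0.0, 0.3, size=50)    # simulated graphing-method estimates

t_stat, t_p = stats.ttest_rel(graph, lmp)      # paired t-test on the means
f_stat = graph.var(ddof=1) / lmp.var(ddof=1)   # F statistic for equality of variances
f_p = 2 * min(stats.f.cdf(f_stat, 49, 49), stats.f.sf(f_stat, 49, 49))
ks_stat, ks_p = stats.ks_2samp(graph, lmp)     # same-distribution check

print(f"paired t: p={t_p:.3f}  F: p={f_p:.3f}  KS: p={ks_p:.3f}")
```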