947 results for genetics, statistical genetics, variable models
Resumo:
According to Bell's theorem, a large class of hidden-variable models obeying Bell's notion of local causality (LC) conflicts with the predictions of quantum mechanics. Recently, a Bell-type theorem has been proven using a weaker notion of LC, yet assuming the existence of perfectly correlated event types. Here we present a similar Bell-type theorem without this latter assumption. The derived inequality differs from the Clauser-Horne inequality by some small correction terms, which render it less constraining.
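The gap between local hidden-variable bounds and quantum predictions that drives such theorems can be illustrated numerically. A minimal sketch of the standard CHSH setting (a close relative of the Clauser-Horne inequality; the angles and the singlet correlation function are textbook assumptions, not this paper's weakened-LC derivation):

```python
import numpy as np

# Singlet-state correlation for spin measurements along angles a and b:
# E(a, b) = -cos(a - b)  (standard quantum prediction).
def E(a, b):
    return -np.cos(a - b)

# Measurement angles that maximally violate the CHSH inequality.
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4

# CHSH combination; any local hidden-variable model obeys |S| <= 2.
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

print(abs(S))   # 2*sqrt(2) ≈ 2.828, the Tsirelson bound
```

The quantum value exceeds the classical bound of 2, which is the conflict Bell-type theorems formalise.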
Resumo:
Characterizing the spatial scaling and dynamics of convective precipitation in mountainous terrain, and developing downscaling methods to transfer precipitation fields from one scale to another, is the overall motivation for this research. Substantial progress has been made on characterizing the space-time organization of Midwestern convective systems and tropical rainfall, which has led to the development of statistical/dynamical downscaling models. Space-time analysis and downscaling of orographic precipitation has received less attention due to the complexities of topographic influences. This study uses multiscale statistical analysis to investigate the spatial scaling of organized thunderstorms that produce heavy rainfall and flooding in mountainous regions. Focus is placed on the eastern and western slopes of the Appalachian region and the Front Range of the Rocky Mountains. Parameter estimates are analyzed over time, and attention is given to linking changes in the multiscale parameters with meteorological forcings and orographic influences on the rainfall. Influences of geographic region and predominant orographic controls on trends in multiscale properties of precipitation are investigated. Spatial resolutions from 1 km to 50 km are considered. This range of spatial scales is needed to bridge typical scale gaps between distributed hydrologic models and numerical weather prediction (NWP) forecasts, and to address the open research problem of scaling organized thunderstorms and convection in mountainous terrain down to 1-4 km scales.
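The multiscale analysis described above can be sketched by block-averaging a rain field across scales and examining how its moments change with scale. A minimal illustration on a synthetic lognormal field (real analyses would use radar-derived rainfall and richer multifractal estimators; all values here are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic positive "rain rate" field on a 256 x 256 grid (1 unit ≈ 1 km).
# A lognormal field is a common stand-in for skewed rainfall intensities.
field = rng.lognormal(mean=0.0, sigma=1.0, size=(256, 256))

def coarse_grain(f, s):
    """Block-average the field over non-overlapping s x s boxes."""
    n = f.shape[0]
    return f.reshape(n // s, s, n // s, s).mean(axis=(1, 3))

scales = [1, 2, 4, 8, 16, 32]
# Second moment of the aggregated field at each scale.
moments = [np.mean(coarse_grain(field, s) ** 2) for s in scales]

# Scaling exponent: slope of log(moment) vs log(scale).
slope = np.polyfit(np.log(scales), np.log(moments), 1)[0]
print(slope)
```

In a full multifractal analysis this slope would be estimated for a range of moment orders q, giving the scaling function that downscaling models parameterize.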
Resumo:
The factorial validity of the SF-36 was evaluated using confirmatory factor analysis (CFA), structural equation modeling (SEM), and multigroup structural equation modeling (MSEM). First, the measurement and structural model of the hypothesized SF-36 was explicated. Second, the model was tested for the validity of a second-order factorial structure; upon evidence of model misfit, the best-fitting model was determined and its validity was tested on a second random sample from the same population. Third, the best-fitting model was tested for invariance of the factorial structure across race, age, and educational subgroups using MSEM. The findings support the second-order factorial structure of the SF-36 as proposed by Ware and Sherbourne (1992). However, the results suggest that: (a) Mental Health and Physical Health covary; (b) general mental health cross-loads onto Physical Health; (c) general health perception loads onto Mental Health instead of Physical Health; (d) many of the error terms are correlated; and (e) the physical function scale is not reliable across these two samples. This hierarchical factor pattern was replicated across both samples of health care workers, suggesting that the post hoc model fitting was not data specific. Subgroup analysis suggests that the physical function scale is not reliable across the "age" or "education" subgroups and that the general mental health scale path from Mental Health is not reliable across the "white/nonwhite" or "education" subgroups. The importance of this study lies in the use of SEM and MSEM in evaluating sample data from the SF-36. These methods are uniquely suited to the analysis of latent variable structures and are widely used in other fields. The use of latent variable models for self-reported outcome measures has become widespread and should now be applied to medical outcomes research.
Invariance testing is superior to mean scores or summary scores when evaluating differences between groups. From a practical as well as a psychometric perspective, it seems imperative that construct validity research related to the SF-36 establish whether this same hierarchical structure and invariance holds for other populations. This project is presented as three articles to be submitted for publication.
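The second-order structure under test can be made concrete by writing down the model-implied covariance matrix, Sigma = Lambda (Gamma Phi Gamma' + Psi) Lambda' + Theta, which CFA/SEM compares against the sample covariance. A numpy sketch with hypothetical loadings (the subscale assignments and numeric values are illustrative assumptions, not the fitted SF-36 estimates):

```python
import numpy as np

# Hypothetical loadings of 8 SF-36 subscales on two first-order factors,
# Physical Health (PH) and Mental Health (MH).  Values are illustrative only.
Lambda = np.array([
    [0.8, 0.0],   # physical functioning -> PH
    [0.7, 0.0],   # role-physical        -> PH
    [0.6, 0.0],   # bodily pain          -> PH
    [0.5, 0.3],   # general health       -> PH (cross-loading on MH)
    [0.0, 0.7],   # vitality             -> MH
    [0.0, 0.8],   # social functioning   -> MH
    [0.0, 0.7],   # role-emotional       -> MH
    [0.0, 0.8],   # mental health        -> MH
])

# Second-order structure: PH and MH both load on a general health factor.
gamma = np.array([[0.9], [0.8]])      # second-order loadings
phi = np.array([[1.0]])               # variance of the general factor
psi = np.diag([0.2, 0.2])             # first-order disturbance variances
theta = np.diag(np.full(8, 0.3))      # unique (error) variances

# Model-implied covariance of the 8 observed subscale scores.
factor_cov = gamma @ phi @ gamma.T + psi
sigma = Lambda @ factor_cov @ Lambda.T + theta
```

Fitting the model amounts to choosing the free parameters so that `sigma` reproduces the sample covariance as closely as possible under an ML or GLS discrepancy function; invariance testing then constrains parameters to be equal across groups.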
Resumo:
The most influential theoretical account in time psychophysics assumes the existence of a unitary internal clock based on neural counting. The distinct timing hypothesis, on the other hand, suggests an automatic timing mechanism for the processing of durations in the sub-second range and a cognitively controlled timing mechanism for the processing of durations in the range of seconds. Although several psychophysical approaches can be applied to identify the internal structure of interval timing in the second and sub-second range, the existing data provide a puzzling picture of rather inconsistent results. In the present chapter, we introduce confirmatory factor analysis (CFA) to further elucidate the internal structure of interval timing performance in the sub-second and second range. More specifically, we investigated whether CFA supports the notion of a unitary timing mechanism or that of distinct timing mechanisms underlying interval timing in the sub-second and second range, respectively. The assumption of two distinct timing mechanisms that are completely independent of each other was not supported by our data. The model assuming a unitary timing mechanism underlying interval timing in both the sub-second and second range fitted the empirical data much better. Finally, we tested a third model assuming two distinct but functionally related mechanisms. The correlation between the two latent variables representing the hypothesized timing mechanisms was rather high, and a comparison of fit indices indicated that the assumption of two associated timing mechanisms described the observed data better than a single latent variable. The models are discussed in the light of the existing psychophysical and neurophysiological data.
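A high correlation between latent factors, as reported here, is conceptually related to Spearman's correction for attenuation, which estimates a latent correlation from an observed one and the reliabilities of the two measures. A minimal sketch with hypothetical numbers (not the chapter's estimates):

```python
import math

# Hypothetical observed correlation between sub-second and second-range
# timing scores, and the reliabilities of the two composite measures.
r_observed = 0.55
rel_subsecond = 0.75
rel_second = 0.70

# Spearman's correction for attenuation estimates the correlation
# between the underlying latent variables.
r_latent = r_observed / math.sqrt(rel_subsecond * rel_second)
print(round(r_latent, 3))   # 0.759
```

A modest observed correlation can thus correspond to a substantially higher latent one, which is why latent-variable models such as CFA are better suited than raw correlations for deciding between unitary and distinct-mechanism accounts.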
Resumo:
The objective of this dissertation was to design and implement strategies for assessment of exposures to organic chemicals used in the production of a styrene-butadiene polymer at the Texas Plastics Company (TPC). Linear statistical retrospective exposure models, univariate and multivariate, were developed based on the validation of historical industrial hygiene monitoring data collected by industrial hygienists at TPC, together with additional current industrial hygiene monitoring data collected for the purposes of this study. The current monitoring data served several purposes. First, they provided unbiased estimates of current mean exposure to organic chemicals for each job title included. Second, they provided information on homogeneity of exposure within each job title, through a carefully designed sampling scheme that addressed variability of exposure both between and within job titles. Third, they permitted investigation of how well current exposure data can serve as an evaluation tool for retrospective exposure estimation. Finally, this dissertation investigated the simultaneous evaluation of exposure to several chemicals, as well as the use of values below detection limits in a multivariate linear statistical model of exposures.
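Handling values below detection limits, as mentioned above, is commonly done either by simple substitution (e.g. LOD/2) or by maximum likelihood for left-censored lognormal data. A sketch of the MLE route on synthetic data (the exposure distribution, sample size, and detection limit are assumptions for illustration):

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)

# Hypothetical exposure measurements (ppm), lognormally distributed,
# with values below a detection limit reported only as "< LOD".
true_mu, true_sigma, lod = 0.0, 1.0, 0.5
x = rng.lognormal(true_mu, true_sigma, size=200)
detects = x[x >= lod]
n_nondetects = np.sum(x < lod)

def neg_loglik(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    # Detects contribute the log-density of log(x); non-detects contribute
    # the log-probability of falling below the limit (left-censoring).
    ll = np.sum(stats.norm.logpdf(np.log(detects), mu, sigma))
    ll += n_nondetects * stats.norm.logcdf(np.log(lod), mu, sigma)
    return -ll

res = optimize.minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
```

Unlike substitution, the censored likelihood uses the information that a non-detect lies somewhere below the limit, which avoids the bias substitution introduces when the censored fraction is large.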
Resumo:
The extreme runup, or 2% exceedance runup, is a key parameter in coastal engineering, since it allows beach interventions to be undertaken under criteria of economic and socio-environmental sustainability. These interventions range from the design of structures at the back of the beach to urban planning measures on the coast, such as the proper delimitation of the public domain. The adequate design of these interventions is all the more relevant today because of the new threats highlighted by climate change, which in the specific case of the coast materialize as floods causing economic losses. Previous studies have carried out field or physical-model experiments to determine the extreme runup on beaches. When these formulations are compared, the scatter is high, which means the runup cannot be predicted with sufficient accuracy. This scatter is explained by the wide spectrum of existing beach types and the high variability of the maritime climate. The problem becomes more pressing given the preventive or corrective actions to be undertaken against climate change under a sustainability criterion: in the current context of a probable increase in coastal flooding, the computed magnitudes should be neither oversized, with the consequent consumption of resources and impact on economic activities, nor underestimated, putting at risk the stability and/or functionality of the interventions over their design period. The main objective of this thesis is to propose a formulation for obtaining the extreme runup that satisfies both the safety criteria for the service and functionality of the works and the economic and socio-environmental sustainability criteria demanded today; that is, a formula that does not oversize this value but covers, with sufficient confidence, the range of situations occurring on the different beach typologies. Complementarily, the application of these formulations to real cases is exemplified so as to reduce the uncertainty and ambiguity in obtaining their independent variables. To achieve these objectives, a state-of-the-art review is carried out covering both the statistical studies for obtaining this parameter and the numerical models proposed for it, in order to identify the most promising research line for the purpose of this thesis. This review concludes that the statistical approach is the most suitable, and a physical model with a sand bed is designed, in contrast to previous physical models with a fixed impermeable bed. The results of this model have been compared with the preceding formulations, and the most suitable formulas are proposed for obtaining the extreme runup. Complementing the proposed formulations, a methodology is developed for applying them to cases on the Spanish coast, exemplifying their use for an adequate prediction of this value on beaches.
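For context, extreme-runup formulations of the kind compared in this thesis typically combine a wave-setup term and a swash term. A sketch of one widely cited empirical formula, that of Stockdon et al. (2006), shown only as an example of the genre, not as the formulation proposed here:

```python
import math

def stockdon_r2(h0, t0, beta_f, g=9.81):
    """2% exceedance runup (m) from Stockdon et al. (2006).

    h0:     deep-water significant wave height (m)
    t0:     peak wave period (s)
    beta_f: foreshore beach slope (tangent of the slope angle)
    """
    l0 = g * t0**2 / (2 * math.pi)          # deep-water wavelength
    setup = 0.35 * beta_f * math.sqrt(h0 * l0)
    swash = math.sqrt(h0 * l0 * (0.563 * beta_f**2 + 0.004)) / 2
    return 1.1 * (setup + swash)

# Example: 2 m waves, 10 s period, foreshore slope 0.05.
print(round(stockdon_r2(2.0, 10.0, 0.05), 2))   # 1.05 (m)
```

The independent variables (deep-water wave height and period, foreshore slope) are exactly the inputs whose ambiguity the thesis's application methodology seeks to reduce.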
Resumo:
Latent variable models represent the probability density of data in a space of several dimensions in terms of a smaller number of latent, or hidden, variables. A familiar example is factor analysis, which is based on a linear transformation between the latent space and the data space. In this paper we introduce a form of non-linear latent variable model called the Generative Topographic Mapping (GTM), for which the parameters of the model can be determined using the EM algorithm. GTM provides a principled alternative to the widely used Self-Organizing Map (SOM) of Kohonen (1982) and overcomes most of the significant limitations of the SOM. We demonstrate the performance of the GTM algorithm on a toy problem and on simulated data from flow diagnostics for a multi-phase oil pipeline.
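The EM fit mentioned above treats GTM as a constrained Gaussian mixture: a grid of latent points is mapped through fixed basis functions into data space, responsibilities are computed in the E-step, and the weight matrix is solved in closed form in the M-step. A one-iteration sketch following the standard formulation (grid size, RBF width, and regularisation are illustrative assumptions, not the paper's settings):

```python
import numpy as np

rng = np.random.default_rng(0)

# Data: N points in D=2 dimensions (stand-in for real observations).
X = rng.normal(size=(100, 2))
N, D = X.shape

# Latent space: K grid points on [-1, 1]; M radial basis functions.
K, M = 20, 5
z = np.linspace(-1, 1, K)[:, None]                   # K x 1 latent grid
centers = np.linspace(-1, 1, M)[:, None]             # M x 1 RBF centres
Phi = np.exp(-(z - centers.T) ** 2 / (2 * 0.3**2))   # K x M basis matrix

W = rng.normal(scale=0.1, size=(M, D))               # mapping weights
beta = 1.0                                           # inverse noise variance

# --- E-step: responsibilities of each latent point for each data point ---
Y = Phi @ W                                          # K x D projected grid
d2 = ((X[None, :, :] - Y[:, None, :]) ** 2).sum(-1)  # K x N squared dists
log_p = -0.5 * beta * d2
R = np.exp(log_p - log_p.max(axis=0))
R /= R.sum(axis=0)                                   # columns sum to 1

# --- M-step: solve for W (regularised least squares), then update beta ---
G = np.diag(R.sum(axis=1))
lam = 1e-3
W = np.linalg.solve(Phi.T @ G @ Phi + lam * np.eye(M), Phi.T @ R @ X)
Y = Phi @ W                                          # re-project with new W
d2 = ((X[None, :, :] - Y[:, None, :]) ** 2).sum(-1)
beta = N * D / (R * d2).sum()
```

Because the mixture centres all lie on the image of a smooth mapping of the latent grid, topographic ordering is preserved automatically, which is what the SOM achieves only heuristically.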
Resumo:
Shipboard power systems have different characteristics from utility power systems. In a shipboard power system it is crucial that the systems and equipment work at their peak performance levels. One of the most demanding aspects of simulating shipboard power systems is connecting the device under test to a real-time simulated dynamic equivalent in an environment with actual hardware in the loop (HIL). Real-time simulation can be achieved using a multi-distributed modeling concept, in which the global system model is distributed over several processors through a communication link. The advantage of this approach is that it permits a gradual transition from pure simulation to actual application. In order to perform system studies in such an environment, physical phase-variable models of different components of the shipboard power system were developed using operational parameters obtained from finite element (FE) analysis. These models were developed for two types of studies: low- and high-frequency studies. Low-frequency studies are used to examine the behavior of shipboard power systems under load switching and faults. High-frequency studies were used to predict abnormal conditions due to overvoltage, as well as the harmonic behavior of components. Different experiments were conducted to validate the developed models; the simulation and experimental results show excellent agreement. The behavior of shipboard power system components under internal faults was investigated using FE analysis. This technique is crucial for fault detection in shipboard power systems, given the lack of comprehensive fault test databases. A wavelet-based methodology for feature extraction from shipboard power system current signals was developed for harmonic and fault diagnosis studies.
This modeling methodology can be used to evaluate and predict the future behavior of the NPS components at the design stage, which will reduce development cycles, cut overall cost, prevent failures, and allow each subsystem to be tested exhaustively before integrating it into the system.
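The wavelet-based feature extraction mentioned above can be sketched with an orthonormal Haar decomposition, using per-level energies as features. The waveform, sampling rate, and harmonic content below are hypothetical stand-ins for real shipboard current signals:

```python
import numpy as np

def haar_features(signal, levels=3):
    """Energy per Haar wavelet level: a simple feature vector for
    harmonic/fault signatures in a current waveform."""
    a = np.asarray(signal, dtype=float)
    features = []
    for _ in range(levels):
        # Orthonormal Haar step: pairwise averages and differences.
        approx = (a[0::2] + a[1::2]) / np.sqrt(2)
        detail = (a[0::2] - a[1::2]) / np.sqrt(2)
        features.append(np.sum(detail ** 2))   # detail energy at this level
        a = approx
    features.append(np.sum(a ** 2))            # remaining approximation energy
    return np.array(features)

# Hypothetical 60 Hz current with a 5th-harmonic distortion component.
t = np.arange(1024) / 3840.0                   # 3.84 kHz sampling (assumed)
i_t = np.sin(2 * np.pi * 60 * t) + 0.2 * np.sin(2 * np.pi * 300 * t)
feats = haar_features(i_t)
```

Because the transform is orthonormal, the feature vector partitions the signal energy across scales; shifts of energy between levels are the kind of signature used to flag harmonics or faults.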
Resumo:
The adverse health effects of long-term exposure to lead are well established, with major uptake into the human body occurring mainly through oral ingestion by young children. Lead-based paint was frequently used in homes built before 1978, particularly in inner-city areas. Minority populations experience the effects of lead poisoning disproportionately. Lead-based paint abatement is costly. In the United States, residents of about 400,000 homes, occupied by 900,000 young children, lack the means to correct lead-based paint hazards. The magnitude of this problem demands research on affordable methods of hazard control. One method is encapsulation, defined as any covering or coating that acts as a permanent barrier between the lead-based paint surface and the environment. Two encapsulants were tested for reliability and effective life span through an accelerated lifetime experiment that applied stresses exceeding those encountered under normal use conditions. The resulting time-to-failure data were used to extrapolate the failure time under conditions of normal use. Statistical analysis and models of the test data allow forecasting of long-term reliability relative to the 20-year encapsulation requirement. Typical housing material specimens simulating walls and doors coated with lead-based paint were overstressed before encapsulation. A second, unaged set was also tested. Specimens were monitored after the stress test with a surface chemical testing pad to identify the presence of lead breaking through the encapsulant. Graphical analysis proposed by Shapiro and Meeker and the general log-linear model developed by Cox were used to obtain results. Findings for the 80% reliability time to failure varied, with close to 21 years of life under normal use conditions for encapsulant A. The application of product A on the aged gypsum and aged wood substrates yielded slightly lower times. Encapsulant B had an 80% reliable life of 19.78 years.
This study reveals that encapsulation technologies can offer safe and effective control of lead-based paint hazards and may be less expensive than other options. The U.S. Department of Health and Human Services and the CDC are committed to eliminating childhood lead poisoning by 2010. This ambitious target is feasible, provided there is an efficient application of innovative technology, a goal to which this study aims to contribute.
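The "80% reliable life" figures quoted above are the times at which 80% of units are predicted to survive. If the fitted lifetime distribution were Weibull (the study itself used Shapiro-Meeker graphical analysis and Cox's log-linear model), that time follows directly from the survival function; the parameters below are hypothetical:

```python
import math

def weibull_reliable_life(eta, beta, reliability):
    """Time at which the given fraction of units still survives, for a
    Weibull(eta=scale, beta=shape) lifetime distribution:
    R(t) = exp(-(t/eta)**beta)  =>  t = eta * (-ln R)**(1/beta)."""
    return eta * (-math.log(reliability)) ** (1.0 / beta)

# Hypothetical parameters chosen so the 80% reliable life lands near 20 years.
eta, beta = 40.0, 2.2
t80 = weibull_reliable_life(eta, beta, 0.80)
print(round(t80, 2))   # 20.23
```

Comparing such a quantile against the 20-year encapsulation requirement is exactly the reliability question the accelerated lifetime experiment was designed to answer.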
Resumo:
We review and compare four broad categories of spatially-explicit modelling approaches currently used to understand and project changes in the distribution and productivity of living marine resources including: 1) statistical species distribution models, 2) physiology-based, biophysical models of single life stages or the whole life cycle of species, 3) food web models, and 4) end-to-end models. Single pressures are rare and, in the future, models must be able to examine multiple factors affecting living marine resources such as interactions between: i) climate-driven changes in temperature regimes and acidification, ii) reductions in water quality due to eutrophication, iii) the introduction of alien invasive species, and/or iv) (over-)exploitation by fisheries. Statistical (correlative) approaches can be used to detect historical patterns which may not be relevant in the future. Advancing predictive capacity of changes in distribution and productivity of living marine resources requires explicit modelling of biological and physical mechanisms. New formulations are needed which (depending on the question) will need to strive for more realism in ecophysiology and behaviour of individuals, life history strategies of species, as well as trophodynamic interactions occurring at different spatial scales. Coupling existing models (e.g. physical, biological, economic) is one avenue that has proven successful. However, fundamental advancements are needed to address key issues such as the adaptive capacity of species/groups and ecosystems. The continued development of end-to-end models (e.g., physics to fish to human sectors) will be critical if we hope to assess how multiple pressures may interact to cause changes in living marine resources including the ecological and economic costs and trade-offs of different spatial management strategies. 
Given the strengths and weaknesses of the various types of models reviewed here, confidence in projections of changes in the distribution and productivity of living marine resources will be increased by assessing model structural uncertainty through biological ensemble modelling.
Resumo:
Introduction: The objectives of this thesis are to: (1) examine how ambulatory blood pressure monitoring (ABPM) refines office blood pressure (BP) measurement; (2) determine whether absolute ambulatory BP or dipping status is better associated with target organ damage (TOD); (3) explore the association of isolated nocturnal hypertension (INH) with TOD; and (4) investigate the association of night-time BP with ultrasound markers of cardiovascular damage. Methods: Data from the Mitchelstown Cohort Study were analysed to deliver objectives 1 and 2. Objective 3 was addressed by a systematic review and analysis of data from the Mitchelstown Study. A sample of participants from the Mitchelstown Study underwent an echocardiogram for speckle tracking analysis and carotid ultrasound to achieve objective 4. Results: ABPM reclassifies hypertension status in approximately a quarter of individuals, with white coat and masked hypertension prevalence rates of 11% and 13% respectively. Night-time systolic BP is better associated with TOD than daytime systolic BP and dipping level. In multivariable models the odds ratio (OR) for LVH was 1.4 (95% CI 1.1-1.8) and for albumin:creatinine ratio ≥ 1.1 mg/mmol was 1.5 (95% CI 1.2-1.8) for each 10 mmHg rise in night-time systolic BP. The evidence for the association of INH with TOD is inconclusive. Night-time systolic BP is significantly associated with global longitudinal strain (GLS) (beta coefficient 0.85 for every 10 mmHg rise, 95% CI 0.3-1.4) and carotid plaques (OR 1.9 for every 10 mmHg rise, 95% CI 1.1-3.2) in univariable analysis. The findings persist for GLS in sex- and age-adjusted models but not in multivariable models. Discussion: Hypertension cannot be effectively managed without using ABPM. Night-time systolic BP is better associated with TOD than daytime systolic BP and dipping level, and may therefore be a better therapeutic target in future studies.
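Odds ratios reported "per 10 mmHg rise" follow from rescaling a logistic-regression coefficient estimated per mmHg. A minimal sketch with a hypothetical coefficient (not the thesis's fitted value):

```python
import math

# Hypothetical logistic-regression coefficient for night-time systolic BP
# in mmHg, with left-ventricular hypertrophy (LVH) as the outcome.
b_per_mmHg = 0.0336

# Odds ratio for a 10 mmHg rise: exp(10 * b).
or_per_10 = math.exp(10 * b_per_mmHg)
print(round(or_per_10, 2))   # 1.4, the order of the reported OR for LVH
```

The confidence interval scales the same way: exponentiating 10 times the coefficient's interval endpoints gives the interval for the per-10 mmHg odds ratio.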
Resumo:
Entangled quantum states can be given a separable decomposition if we relax the restriction that the local operators be quantum states. Motivated by the construction of classical simulations and local hidden variable models, we construct 'smallest' local sets of operators that achieve this. In other words, given an arbitrary bipartite quantum state we construct convex sets of local operators that allow for a separable decomposition, but that cannot be made smaller while continuing to do so. We then consider two further variants of the problem where the local state spaces are required to contain the local quantum states, and obtain solutions for a variety of cases including a region of pure states around the maximally entangled state. The methods involve calculating certain forms of cross norm. Two of the variants of the problem have a strong relationship to theorems on ensemble decompositions of positive operators, and our results thereby give those theorems an added interpretation. The results generalise those obtained in our previous work on this topic [New J. Phys. 17, 093047 (2015)].