913 results for multiple simultaneous equation models


Relevance:

30.00%

Publisher:

Abstract:

Geophysical tomography captures the spatial distribution of the underlying geophysical property at relatively high resolution, but the tomographic images tend to be blurred representations of reality and generally fail to reproduce sharp interfaces. Such models may introduce significant bias when taken as a basis for predictive flow and transport modeling, and they are unsuitable for uncertainty assessment. We present a methodology in which tomograms are used to condition multiple-point statistics (MPS) simulations. A large set of geologically reasonable facies realizations and their corresponding synthetically calculated cross-hole radar tomograms serve as a training image. The training image is scanned with a direct sampling algorithm for patterns matching the conditioning tomogram, while accounting for the spatially varying resolution of the tomograms. In a post-processing step, only those conditional simulations that predict the radar traveltimes within the expected data error levels are accepted. The methodology is demonstrated on a two-facies example featuring channels and on an aquifer analog of alluvial sedimentary structures with five facies. In both cases, the MPS simulations exhibit the sharp interfaces and the geological patterns found in the training image. Compared with unconditioned MPS simulations, the uncertainty in transport predictions is markedly decreased for simulations conditioned to tomograms. As an improvement over approaches relying on classical smoothness-constrained geophysical tomography, the proposed method allows for (1) reproduction of sharp interfaces, (2) incorporation of realistic geological constraints, and (3) generation of multiple realizations, enabling uncertainty assessment.
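The direct-sampling scan described above can be illustrated in a few lines. The sketch below is a hypothetical toy (single-cell fill, 3×3 neighbourhood, Hamming distance between facies codes), not the authors' implementation:

```python
import numpy as np

def direct_sample(ti, grid, cell, n_tries=200, rng=None):
    """Fill one unknown cell by scanning the training image `ti` for a
    location whose neighbourhood best matches the known neighbours of `cell`.
    Unknown cells in `grid` are NaN."""
    rng = rng or np.random.default_rng(0)
    i, j = cell
    offs = [(di, dj) for di in (-1, 0, 1) for dj in (-1, 0, 1)
            if not (di == 0 and dj == 0)]
    # collect the already-informed neighbours (the conditioning pattern)
    known = [(di, dj, grid[i + di, j + dj]) for di, dj in offs
             if 0 <= i + di < grid.shape[0] and 0 <= j + dj < grid.shape[1]
             and not np.isnan(grid[i + di, j + dj])]
    best, best_dist = None, np.inf
    for _ in range(n_tries):
        ti_i = rng.integers(1, ti.shape[0] - 1)
        ti_j = rng.integers(1, ti.shape[1] - 1)
        # Hamming distance between the TI pattern and the conditioning pattern
        dist = sum((ti[ti_i + di, ti_j + dj] != v) for di, dj, v in known)
        if dist < best_dist:
            best, best_dist = ti[ti_i, ti_j], dist
        if best_dist == 0:          # exact pattern match: stop scanning early
            break
    return best
```

In the full method this scan runs cell by cell along a random path, and the distance would also weight tomogram values by their local resolution.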

Relevance:

30.00%

Publisher:

Abstract:

Galician literature of recent years has been profoundly transformed by the emergence of women writers and a «violet grammar», a literature conceived in the feminine that has also affected writing by male authors. Their proposals are therefore not situated at the margins of the literary field, even though they are numerically fewer. In narrative, they have launched ambitious, well-received projects of gender subversion. In poetry, they have become literary models through their capacity for experimentation and a thorough renewal of writing, consolidating a violet grammar. And in the virtual space, turned into a public laboratory of language, they experiment with writing and multiple identity.

Relevance:

30.00%

Publisher:

Abstract:

INTRODUCTION: Local microstructural pathology in multiple sclerosis (MS) patients might influence their clinical performance. This study applied multicontrast MRI to quantify inflammation and neurodegeneration in MS lesions, and explored the impact of MRI-based lesion pathology on cognition and disability. METHODS: 36 relapsing-remitting MS patients and 18 healthy controls underwent neurological, cognitive, and behavioural examinations and 3 T MRI including (i) fluid-attenuated inversion recovery, double inversion recovery, and magnetization-prepared gradient echo for lesion count; (ii) T1, T2, and T2* relaxometry and magnetization transfer imaging for lesion tissue characterization. Lesions were classified according to the extent of inflammation/neurodegeneration. A generalized linear model assessed the contribution of lesion groups to clinical performance. RESULTS: Four lesion groups were identified, characterized by (1) absence of significant alterations, (2) prevalent inflammation, (3) concomitant inflammation and microdegeneration, and (4) prevalent tissue loss. Groups 1, 3, and 4 correlated with general disability (adj-R² = 0.6; P = 0.0005), executive function (adj-R² = 0.5; P = 0.004), verbal memory (adj-R² = 0.4; P = 0.02), and attention (adj-R² = 0.5; P = 0.002). CONCLUSION: Multicontrast MRI provides a new approach to infer the in vivo histopathology of plaques. Our results support evidence that neurodegeneration is the major determinant of patients' disability and cognitive dysfunction.
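The adjusted R² values reported above come from fitting a linear model and penalizing it for the number of predictors. A minimal, generic sketch (ordinary least squares on hypothetical predictors, not the study's analysis code):

```python
import numpy as np

def adjusted_r2(X, y):
    """Fit y ~ X by least squares and return the adjusted R^2:
    1 - (1 - R^2) * (n - 1) / (n - p), with p counting the intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])     # add intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    ss_res = float(resid @ resid)
    ss_tot = float(((y - y.mean()) ** 2).sum())
    n, p = X1.shape
    r2 = 1 - ss_res / ss_tot
    return 1 - (1 - r2) * (n - 1) / (n - p)
```

With lesion-group counts as the columns of `X` and a disability score as `y`, this returns the kind of adj-R² statistic quoted per clinical outcome.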

Relevance:

30.00%

Publisher:

Abstract:

Our consumption of groundwater, in particular as drinking water and for irrigation, has considerably increased over the years, and groundwater is becoming an increasingly scarce and endangered resource. Nowadays we face many problems, ranging from water prospection to sustainable management and the remediation of polluted aquifers. Independently of the hydrogeological problem considered, the main challenge remains dealing with our incomplete knowledge of the underground properties. Stochastic approaches represent this uncertainty by considering multiple geological scenarios and generating a large number of geostatistical realizations. The main limitation of these approaches is the computational cost of performing complex flow simulations for each realization.
In the first part of the thesis, this issue is explored in the context of uncertainty propagation, where an ensemble of geostatistical realizations is identified as representative of the subsurface uncertainty. To propagate this lack of knowledge to the quantity of interest (e.g., the concentration of pollutant in extracted water), the flow response of each realization must be evaluated. Due to computational constraints, state-of-the-art methods use approximate flow simulations to identify a subset of realizations that represents the variability of the ensemble. The complex and computationally heavy flow model is then run only for this subset, and inference is based on these exact responses. Our objective is to improve the performance of this approach by using all the available information, not solely the subset of exact responses. For the subset identified by a classical approach (here the distance kernel method), both the approximate and the exact responses are known. This information is used to construct an error model, which then corrects the remaining approximate responses and predicts the expected responses of the exact model. The proposed methodology exploits all the available information without perceptible additional computational cost, making uncertainty propagation more accurate and more robust.
The strategy explored in the first chapter consists in learning, from a subset of realizations, the relationship between the approximate and exact flow models. In the second part of the thesis, this strategy is formalized in a rigorous mathematical framework by defining a regression model between functional responses. As this problem is ill-posed, its dimensionality must be reduced. The novelty of the work lies in the use of functional principal component analysis (FPCA), which not only performs the dimensionality reduction while maximizing the retained information, but also allows diagnosing the quality of the error model in the functional space. The proposed methodology is applied to a pollution problem involving a non-aqueous phase liquid; the error model strongly reduces the computational cost while providing a good estimate of the uncertainty. Moreover, the individual correction of each approximate response by the error model yields an excellent prediction of the exact response, opening the door to many applications.
The concept of a functional error model is thus useful not only for uncertainty propagation but also, and perhaps even more so, for Bayesian inference. Markov chain Monte Carlo (MCMC) algorithms are the most common choice to generate geostatistical realizations in accordance with the observations. However, these methods suffer from a very low acceptance rate in high-dimensional problems, resulting in a large number of wasted flow simulations. Two-stage MCMC was introduced to avoid unnecessary simulations of the exact model through a preliminary evaluation of each proposal. In the third part of the thesis, the approximate flow model coupled with an error model serves as the preliminary evaluation for two-stage MCMC, and we demonstrate an increase in acceptance rate by a factor of 1.5 to 3 compared with a classical one-stage MCMC implementation. An open question remains: how to choose the size of the training set and how to identify the realizations that optimize the construction of the error model. This requires an iterative strategy so that, as new flow simulations are performed, the error model is improved by incorporating the new information. This is developed in the fourth part of the thesis, where the methodology is applied to a saline-intrusion problem in a coastal aquifer.
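The two-stage MCMC scheme described above can be sketched on a toy 1-D target: a cheap proxy likelihood screens each proposal, and the expensive exact likelihood is evaluated only for proposals that survive the first stage, with its Metropolis ratio corrected by the proxy ratio so the chain still targets the exact posterior. Everything below (target, proxy, step size) is an invented toy, not the thesis code:

```python
import math
import random

def two_stage_mcmc(exact_loglike, proxy_loglike, x0, n_steps, step=0.5, seed=0):
    """Two-stage Metropolis: a proposal must first pass an accept test based
    on the cheap proxy likelihood; only then is the expensive exact model run.
    Returns the chain and the number of exact-model evaluations."""
    rng = random.Random(seed)
    x, lp_exact, lp_proxy = x0, exact_loglike(x0), proxy_loglike(x0)
    chain, exact_calls = [x0], 1
    for _ in range(n_steps):
        y = x + rng.gauss(0.0, step)
        lq_proxy = proxy_loglike(y)
        # stage 1: screen with the proxy (rejects cost no exact simulation)
        if math.log(rng.random()) < lq_proxy - lp_proxy:
            lq_exact = exact_loglike(y)
            exact_calls += 1
            # stage 2: accept/reject with the exact model, proxy-corrected
            if math.log(rng.random()) < (lq_exact - lp_exact) - (lq_proxy - lp_proxy):
                x, lp_exact, lp_proxy = y, lq_exact, lq_proxy
        chain.append(x)
    return chain, exact_calls
```

In the thesis setting, `proxy_loglike` would be the approximate flow model corrected by the functional error model; the better the proxy, the fewer exact simulations are wasted on doomed proposals.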

Relevance:

30.00%

Publisher:

Abstract:

The enhanced functional sensitivity offered by ultra-high field imaging may significantly benefit simultaneous EEG-fMRI studies, but the concurrent increases in artifact contamination can strongly compromise EEG data quality. In the present study, we focus on EEG artifacts created by head motion in the static B0 field. A novel approach for motion artifact detection is proposed, based on a simple modification of a commercial EEG cap, in which four electrodes are non-permanently adapted to record only magnetic induction effects. Simultaneous EEG-fMRI data were acquired with this setup, at 7T, from healthy volunteers undergoing a reversing-checkerboard visual stimulation paradigm. Data analysis assisted by the motion sensors revealed that, after gradient artifact correction, EEG signal variance was largely dominated by pulse artifacts (81-93%), but contributions from spontaneous motion (4-13%) were still comparable to or even larger than those of actual neuronal activity (3-9%). Multiple approaches were tested to determine the most effective procedure for denoising EEG data incorporating motion sensor information. Optimal results were obtained by applying an initial pulse artifact correction step (AAS-based), followed by motion artifact correction (based on the motion sensors) and ICA denoising. On average, motion artifact correction (after AAS) yielded a 61% reduction in signal power and a 62% increase in VEP trial-by-trial consistency. Combined with ICA, these improvements rose to a 74% power reduction and an 86% increase in trial consistency. Overall, the improvements achieved were well appreciable at single-subject and single-trial levels, and set an encouraging quality mark for simultaneous EEG-fMRI at ultra-high field.
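One simple way to exploit such dedicated motion-sensor channels is to regress them out of every EEG channel by least squares. This is a simplified stand-in for the correction pipeline used in the study (which also involves AAS and ICA), shown here on invented toy signals:

```python
import numpy as np

def regress_out_motion(eeg, motion):
    """Remove the component of each EEG channel explained by the
    motion-sensor channels, via a least-squares projection.

    eeg:    (n_channels, n_samples) array
    motion: (n_sensors,  n_samples) array
    """
    M = motion - motion.mean(axis=1, keepdims=True)      # demean regressors
    beta, *_ = np.linalg.lstsq(M.T, eeg.T, rcond=None)   # fit eeg ~ motion
    return eeg - (M.T @ beta).T                          # subtract the fit
```

Because the motion electrodes record only magnetic-induction effects, whatever they share with the EEG channels is artifact by construction, which is what makes this projection safe for the neuronal signal.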

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: To develop predictive models for early triage of burn patients based on hypersusceptibility to repeated infections. BACKGROUND: Infection remains a major cause of mortality and morbidity after severe trauma, demanding new strategies to combat infections; models for infection prediction are lacking. METHODS: Secondary analysis of 459 burn patients (≥16 years old) with 20% or more total body surface area burns, recruited from 6 US burn centers. We compared the blood transcriptomes, with a 180-hour cutoff on the injury-to-transcriptome interval, of 47 patients (≤1 infection episode) to those of 66 hypersusceptible patients with multiple (≥2) infection episodes (MIE). We used LASSO regression to select biomarkers and multivariate logistic regression to build models, whose accuracy was assessed by the area under the receiver operating characteristic curve (AUROC) and cross-validation. RESULTS: Three predictive models were developed using covariates of (1) clinical characteristics; (2) expression profiles of 14 genomic probes; (3) the combination of (1) and (2). The genomic and clinical models were highly predictive of MIE status [AUROC(genomic) = 0.946 (95% CI: 0.906-0.986); AUROC(clinical) = 0.864 (CI: 0.794-0.933); genomic vs. clinical P = 0.044]. The combined model had a higher AUROC of 0.967 (CI: 0.940-0.993) than the individual models (combined vs. clinical P = 0.0069). Hypersusceptible patients showed early alterations in immune-related signaling pathways, epigenetic modulation, and chromatin remodeling. CONCLUSIONS: Early triage of burn patients more susceptible to infections can be made using clinical characteristics and/or genomic signatures. The genomic signature suggests new insights into the pathophysiology of hypersusceptibility to infection that may lead to novel therapeutic or prophylactic targets.
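The AUROC used to assess these models has a simple probabilistic reading: the chance that a randomly chosen MIE patient receives a higher risk score than a randomly chosen non-MIE patient, counting ties as one half. A minimal, generic implementation (not the study's code):

```python
def auroc(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) formulation:
    P(score of a random positive > score of a random negative),
    with ties counted as 0.5."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

An AUROC of 0.946, as reported for the genomic model, thus means the model ranks a hypersusceptible patient above a non-hypersusceptible one about 95% of the time.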

Relevance:

30.00%

Publisher:

Abstract:

The use of private funding and management in airports is an increasing trend. The literature has not paid enough attention to mixed management models in this industry, although many European airports take the form of mixed public-private companies, in which ownership is shared between the public and private sectors. We examine the determinants of the degree of private participation in the European airport sector. Drawing on a sample of the 100 largest European airports, we estimate a multivariate equation to determine the role of airport characteristics, fiscal variables, and political factors in the extent of private involvement. Our results confirm the alignment between public and private interests in partially privatized airports. Fiscal constraints and market attractiveness promote private participation, whereas integrated governance models and a high share of network carriers hinder private ownership. The degree of private participation appears to be driven by pragmatic rather than ideological considerations.

Relevance:

30.00%

Publisher:

Abstract:

Horizontal gene transfer is central to microbial evolution, because it enables genetic regions to spread horizontally through diverse communities. However, how gene transfer exerts such a strong effect is not understood. Here we develop an eco-evolutionary model and show how genetic transfer, even when rare, can transform the evolution and ecology of microbes. We recapitulate existing models, which suggest that asexual reproduction will overpower horizontal transfer and greatly limit its effects. We then show that allowing immigration completely changes these predictions. With migration, the rates and impacts of horizontal transfer are greatly increased, and transfer is most frequent for loci under positive natural selection. Our analysis explains how ecologically important loci can sweep through competing strains and species. In this way, microbial genomes can evolve to become ecologically diverse where different genomic regions encode for partially overlapping, but distinct, ecologies. Under these conditions ecological species do not exist, because genes, not species, inhabit niches.
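The interplay the model describes — selection, rare horizontal transfer, and immigration — can be caricatured with deterministic frequency dynamics for a single transferable gene. The update rules and parameter values below are an invented toy, not the paper's model; they illustrate how immigration of gene-free cells and horizontal transfer jointly set the gene's equilibrium frequency:

```python
def gene_frequency_dynamics(s=0.05, h=0.01, m=0.02, p0=0.01, generations=500):
    """Per-generation frequency dynamics of a transferable gene:
    selection (advantage s for carriers), horizontal transfer at rate h from
    carriers to non-carriers, and immigration at rate m of gene-free migrants."""
    p = p0
    traj = [p]
    for _ in range(generations):
        p = p * (1 + s) / (1 + s * p)   # one round of selection
        p = p + h * p * (1 - p)         # horizontal transfer to non-carriers
        p = (1 - m) * p                 # gene-free immigrants dilute carriers
        traj.append(p)
    return traj
```

With immigration constantly reintroducing gene-free competitors, even a small transfer rate `h` visibly raises the gene's steady-state frequency, in line with the paper's claim that migration amplifies the impact of horizontal transfer on loci under positive selection.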

Relevance:

30.00%

Publisher:

Abstract:

The use of private funding and management in airports is an increasing trend. The literature has not paid enough attention to mixed management models in this industry, although many European airports take the form of mixed firms or institutional PPPs, in which ownership is shared between the public and private sectors. We examine the determinants of the degree of private participation in the European airport sector. Drawing on a sample of the 100 largest European airports, we estimate a multivariate equation to determine the role of airport characteristics, fiscal variables, and political factors in the extent of private involvement. Our results confirm the alignment between public and private interests in PPPs. Fiscal constraints and market attractiveness promote private participation, whereas integrated governance models and a high share of network carriers hinder private ownership. The degree of private participation appears to be driven by pragmatic rather than ideological considerations.

Relevance:

30.00%

Publisher:

Abstract:

This paper reviews and extends our previous work to enable fast axonal diameter mapping from diffusion MRI data in the presence of multiple fibre populations within a voxel. Most existing microstructure imaging techniques use non-linear algorithms to fit their data models and are consequently computationally expensive and slow. Moreover, most of them assume a single axon orientation, while numerous regions of the brain actually present more complex configurations, e.g., fibre crossings. We present a flexible framework, based on convex optimisation, that enables fast and accurate reconstruction of the microstructure organisation, not limited to areas where the white matter is coherently oriented. We show through numerical simulations the ability of our method to correctly estimate the microstructure features (mean axon diameter and intra-cellular volume fraction) in crossing regions.
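The convex formulation can be illustrated by its core sub-problem: fitting non-negative weights of a fixed dictionary of candidate signal profiles (one column per candidate diameter/orientation) to the measured signal. The projected-gradient solver below is a generic sketch with a toy dictionary, not the authors' framework:

```python
import numpy as np

def nnls_pg(A, y, n_iter=2000):
    """Non-negative least squares by projected gradient descent:
    minimise ||A w - y||^2 subject to w >= 0, the convex sub-problem at the
    heart of dictionary-based microstructure fitting."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    w = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ w - y)           # gradient of 0.5 * ||A w - y||^2
        w = np.maximum(w - grad / L, 0.0)  # gradient step, then project onto w >= 0
    return w
```

Because the problem is convex, this converges to the global optimum regardless of initialisation, which is what makes such fits fast and reliable compared with the non-linear alternatives mentioned above.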

Relevance:

30.00%

Publisher:

Abstract:

While the overall incidence of myocardial infarction (MI) has been decreasing since 2000 [1], an increasing number of younger patients present with MI [2]. Few studies have focused on MI in very young patients, aged 35 years or less, as they account for only a minority of all patients with MI [3]. MI differs in presentation, treatment, and outcome across age categories, as illustrated in table 1. Echocardiography is considered mandatory by scientific guidelines for the management and diagnosis of MI [4,5,6]. However, newer imaging techniques such as cardiac magnetic resonance (CMR) and computed tomography (CT) are increasingly performed and enable further refinement of the diagnosis of MI; in particular, they allow precise localization and quantification of the infarct. In this case, the MI was located in the septum, which is an unusual presentation. The incidence of pulmonary embolism (PE) has also increased in young patients over the past years [7]. Since the symptoms and signs of PE may be non-specific, establishing its diagnosis remains a challenge [8], and PE is therefore one of the most frequently missed diagnoses in clinical medicine. Because of the widespread use of CT and its improved visualization of the pulmonary arteries, PE may be discovered incidentally [9]. In the absence of a congenital disorder, multiple and/or simultaneous disease presentation is uncommon in the young. We report the rare case of a 35-year-old man with isolated septal MI and simultaneous PE. The diagnosis of this rare clinical entity was only possible by means of newer imaging techniques.

Relevance:

30.00%

Publisher:

Abstract:

Simultaneous localization and mapping (SLAM) is a central problem in mobile robotics. Many solutions have been proposed during the last two decades; nevertheless, few studies have considered the use of multiple sensors simultaneously. Our solution combines several data sources with the aid of an Extended Kalman Filter (EKF). Two approaches are proposed. The first runs the ordinary EKF SLAM algorithm for each data source separately, in parallel, and fuses the results into one solution at the end of each step. The second uses multiple data sources simultaneously in a single filter. A comparison of the computational complexity of the two methods is also presented: the first method is almost four times faster than the second.
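The single-filter approach can be sketched as sequential measurement updates within one time step: each sensor applies its own Kalman update to the shared state. The linear update below is a generic sketch (a full EKF would linearise a nonlinear measurement model around the current estimate); the 1-D numbers in the test are toy values:

```python
import numpy as np

def kf_update(x, P, z, H, R):
    """One Kalman measurement update (the linear core of an EKF step).
    x: state mean, P: state covariance, z: measurement,
    H: measurement matrix, R: measurement noise covariance."""
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ (z - H @ x)             # corrected state
    P = (np.eye(len(x)) - K @ H) @ P    # corrected covariance
    return x, P
```

Fusing two sensors in a single filter then amounts to calling `kf_update` once per sensor between prediction steps, whereas the parallel-filters approach would maintain a separate `(x, P)` pair per sensor and merge them afterwards.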

Relevance:

30.00%

Publisher:

Abstract:

A genetic algorithm was used for variable selection in the simultaneous determination of mixtures of glucose, maltose, and fructose by mid-infrared spectroscopy. Different models, using partial least squares (PLS) and multiple linear regression (MLR), with and without data pre-processing, were compared. The results show that a simpler model (multiple linear regression with variable selection by genetic algorithm) produces results comparable to those of more complex methods (partial least squares). The relative error obtained with the best model was around 3% for the sugar determinations, which is acceptable for this kind of analysis.
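The GA-MLR combination can be sketched as a search over variable-selection bitmasks whose fitness is the regression error. The code below is a generic toy version (synthetic data, one-point crossover, bit-flip mutation, simple elitism), not the chemometric pipeline used in the study:

```python
import numpy as np

def fit_rmse(X, y, mask):
    """Training RMSE of a multiple linear regression on the selected columns."""
    cols = np.flatnonzero(mask)
    if cols.size == 0:
        return float(np.sqrt(np.mean((y - y.mean()) ** 2)))
    A = np.column_stack([np.ones(len(y)), X[:, cols]])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ beta
    return float(np.sqrt(np.mean(r ** 2)))

def ga_select(X, y, pop=30, gens=40, seed=0):
    """Genetic algorithm over variable-selection bitmasks, minimising the
    MLR fitting error (a toy sketch of GA-based variable selection)."""
    rng = np.random.default_rng(seed)
    n_vars = X.shape[1]
    P = rng.integers(0, 2, size=(pop, n_vars))          # random initial masks
    for _ in range(gens):
        fit = np.array([fit_rmse(X, y, m) for m in P])
        elite = P[np.argsort(fit)[: pop // 2]]          # keep the better half
        children = []
        for _ in range(pop - len(elite)):
            a, b = elite[rng.integers(len(elite), size=2)]
            cut = rng.integers(1, n_vars)
            child = np.concatenate([a[:cut], b[cut:]])  # one-point crossover
            flip = rng.random(n_vars) < 0.05            # bit-flip mutation
            children.append(np.where(flip, 1 - child, child))
        P = np.vstack([elite, children])
    fit = np.array([fit_rmse(X, y, m) for m in P])
    return P[np.argmin(fit)]
```

In a real chemometric application the fitness would be a cross-validated prediction error on the spectra rather than the training RMSE, to keep the selection from overfitting.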

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: This study examined potential predictors of remission among patients treated for major depressive disorder (MDD) in a naturalistic clinical setting, mostly in the Middle East, East Asia, and Mexico. METHODS: Data for this post hoc analysis were taken from a 6-month prospective, noninterventional, observational study that involved 1,549 MDD patients without sexual dysfunction at baseline in 12 countries worldwide. Depression severity was measured using the Clinical Global Impression of Severity (CGI-S) and the 16-item Quick Inventory of Depressive Symptomatology Self-Report (QIDS-SR16). Depression-related pain was measured using the pain-related items of the Somatic Symptom Inventory. Remission was defined as a QIDS-SR16 score ≤5. Generalized estimating equation regression models were used to examine baseline factors associated with remission during follow-up. RESULTS: Being from East Asia (odds ratio [OR] 0.48 versus Mexico; P<0.001), higher depression severity at baseline (OR 0.77, P=0.003, for CGI-S; OR 0.92, P<0.001, for QIDS-SR16), more previous MDD episodes (OR 0.92, P=0.007), previous treatments/therapies for depression (OR 0.78, P=0.030), and any significant psychiatric or medical comorbidity at baseline (OR 0.60, P<0.001) were negatively associated with remission, whereas being male (OR 1.29, P=0.026) and treatment with duloxetine (OR 2.38 versus selective serotonin reuptake inhibitors [SSRIs], P<0.001) were positively associated with remission. However, the association between Somatic Symptom Inventory pain scores and remission was no longer significant in this multiple regression (P=0.580; P=0.008 in the descriptive analysis), although it remained significant in the subgroup of patients treated with SSRIs (OR 0.97, P=0.023) but not in those treated with duloxetine (P=0.182).
CONCLUSION: These findings are largely consistent with previous reports from the USA and Europe. They also highlight the potential mediating role of treatment with duloxetine on the negative relationship between depression-related pain and outcomes of depression.