177 results for Scale density


Relevance: 20.00%

Abstract:

Synthesis report. Introduction: The Glasgow Coma Score (GCS) is a recognized tool for the assessment of patients after traumatic brain injury. It is known for its simplicity and reproducibility, allowing caregivers an appropriate and continuous evaluation of the patients' neurological status. The GCS comprises three categories assessing the eye, verbal and motor responses. In Switzerland, prehospital care of patients with severe traumatic brain injury is provided by physicians, mainly on board medicalized helicopters. Before the general anaesthesia these patients require, an evaluation of the GCS is essential to indicate to the hospital staff the severity of the brain lesions. To assess knowledge of the GCS among physicians working on board medicalized helicopters in Switzerland, we designed a questionnaire containing, in a first part, questions on general knowledge of the GCS, followed by a clinical case. Objective: To evaluate the practical and theoretical knowledge of the GCS among physicians working on board medicalized helicopters in Switzerland. Method: Prospective, anonymized observational study using a questionnaire. Evaluation of general knowledge of the GCS and of its clinical use through the presentation of a case. Results: 16 of the 18 Swiss medicalized helicopter bases participated in our study. 130 questionnaires were sent and the response rate was 79.2%. Theoretical knowledge of the GCS was comparable for all physicians regardless of their level of training. Errors in the assessment of the clinical case were present in 36.9% of participants: 27.2% made errors in the motor score and 18.5% in the verbal score. Errors were most frequent among residents (47.5%, p=0.09), followed by chief residents (31.6%, p=0.67) and physicians in private practice (18.4%, p=1.00). Senior staff physicians made significantly fewer errors than the other participants (0%, p<0.05). No significant difference was observed between the different specialties (anaesthesia, internal medicine, general medicine and "other"). Conclusion: Even though theoretical knowledge of the GCS is adequate among physicians working on board medicalized helicopters, errors in its clinical application are present in more than one third of cases. The physicians with the least professional experience make the most errors. Given the importance of a correct assessment of the initial Glasgow score, an improvement of this knowledge is indispensable.
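For reference, the GCS total reduces to a sum of three sub-scores (eye 1-4, verbal 1-5, motor 1-6, total 3-15); a minimal sketch of that arithmetic, separate from the study's questionnaire itself:

```python
# Minimal sketch of Glasgow Coma Score arithmetic (not part of the study's
# questionnaire); the sub-score ranges are the standard ones: eye 1-4,
# verbal 1-5, motor 1-6, giving a total of 3-15.
def glasgow_coma_score(eye: int, verbal: int, motor: int) -> int:
    if not (1 <= eye <= 4 and 1 <= verbal <= 5 and 1 <= motor <= 6):
        raise ValueError("sub-score out of range")
    return eye + verbal + motor

# Example: eye opening to voice (3), confused speech (4), localizes pain (5) -> GCS 12.
print(glasgow_coma_score(eye=3, verbal=4, motor=5))  # 12
```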

Relevance: 20.00%

Abstract:

High-throughput technologies are now used to generate more than one type of data from the same biological samples. To properly integrate such data, we propose using co-modules, which describe coherent patterns across paired data sets, and we conceive several modular methods for their identification. We first test these methods using in silico data, demonstrating that the integrative scheme of our Ping-Pong Algorithm uncovers drug-gene associations more accurately when considering noisy or complex data. Second, we provide an extensive comparative study using the gene-expression and drug-response data from the NCI-60 cell lines. Using information from the DrugBank and Connectivity Map databases, we show that the Ping-Pong Algorithm predicts drug-gene associations significantly better than other methods. Co-modules provide insights into possible mechanisms of action for a wide range of drugs and suggest new targets for therapy.
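As a rough illustration of the alternating idea behind such co-module searches (this is not the published Ping-Pong Algorithm; the matrices, thresholds and update rule below are hypothetical assumptions), one refinement step between two data sets sharing the same cell lines could look like:

```python
import numpy as np

def zthresh(v, t):
    z = (v - v.mean()) / v.std()
    return np.where(np.abs(z) > t, z, 0.0)            # keep only strong entries

def ping_pong_step(E, D, gene_scores, t_cell=1.5, t_drug=1.5, t_gene=1.5):
    """One alternating refinement step between an expression matrix E
    (genes x cells) and a drug-response matrix D (drugs x cells) that share
    the same cell lines. Thresholds and update rule are illustrative only."""
    cell_scores = zthresh(gene_scores @ E, t_cell)          # genes -> cells ("ping")
    drug_scores = zthresh(D @ cell_scores, t_drug)          # cells -> drugs
    gene_scores = zthresh(E @ (D.T @ drug_scores), t_gene)  # drugs -> cells -> genes ("pong")
    return gene_scores, drug_scores, cell_scores

rng = np.random.default_rng(0)
E = rng.normal(size=(1000, 60))        # genes x cell lines (made-up data)
D = rng.normal(size=(120, 60))         # drugs x cell lines (made-up data)
genes = rng.normal(size=1000)          # random seed signature
for _ in range(20):                    # iterate towards a fixed point (a co-module)
    genes, drugs, cells = ping_pong_step(E, D, genes)
```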

Relevance: 20.00%

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields. The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity as well as the larger-scale trend of the prevailing hydraulic conductivity field.
The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems in order to allow for a more accurate and realistic quantification of the uncertainty associated with the thus inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigated two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution. The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach was proven to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
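The gradual-deformation proposal mentioned above can be sketched for standard Gaussian fields: blending the current model with an independent prior realization using weights cos(θ) and sin(θ) preserves the Gaussian prior, while θ tunes the perturbation strength. A minimal sketch, assuming placeholder field generation and likelihood rather than the thesis implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: unconditional draws from a Gaussian prior over the
# parameter field, and a likelihood for the inverse problem (placeholder misfit).
def prior_realization(n):
    return rng.standard_normal(n)

def log_likelihood(model):
    return -0.5 * np.sum((model - 1.0) ** 2)

def gradual_deformation_mcmc(n_params=100, n_iter=5000, theta=0.1):
    """Metropolis sampler whose proposal blends the current model with a fresh
    prior realization; since cos(theta)**2 + sin(theta)**2 = 1, the proposal
    preserves the Gaussian prior and the acceptance test uses only the likelihood."""
    m = prior_realization(n_params)
    ll = log_likelihood(m)
    chain = []
    for _ in range(n_iter):
        m_prop = np.cos(theta) * m + np.sin(theta) * prior_realization(n_params)
        ll_prop = log_likelihood(m_prop)
        if np.log(rng.uniform()) < ll_prop - ll:   # accept/reject
            m, ll = m_prop, ll_prop
        chain.append(m.copy())
    return np.array(chain)

samples = gradual_deformation_mcmc()
```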

Relevance: 20.00%

Abstract:

Introduction: Osteoporosis (OP) is a systemic skeletal disease characterized by low bone mineral density (BMD) and micro-architectural (MA) deterioration. Clinical risk factors (CRF) are often used as an approximation of MA. MA can nevertheless be evaluated in daily practice with the Trabecular Bone Score (TBS). TBS is a novel grey-level texture measurement reflecting bone micro-architecture, based on experimental variograms of 2D projection images. TBS is very simple to obtain by reanalyzing a lumbar DXA scan. TBS has proven diagnostic and prognostic value, partially independent of CRF and BMD. The aim of the OsteoLaus cohort is to combine, in daily practice, the CRF and the information given by DXA (BMD, TBS and vertebral fracture assessment (VFA)) to better identify women at high fracture risk. Method: The OsteoLaus cohort (1,400 women aged 50 to 80 years living in Lausanne, Switzerland) started in 2010. This study is derived from the COLAUS cohort, which started in Lausanne in 2003 and whose main goal is to obtain information on the epidemiology and genetic determinants of cardiovascular risk in 6,700 men and women. CRF for OP, bone ultrasound of the heel, lumbar spine and hip BMD, VFA by DXA and MA evaluation by TBS are recorded in OsteoLaus. Preliminary results are reported. Results: We included 631 women: mean age 67.4±6.7 years, BMI 26.1±4.6, mean lumbar spine BMD 0.943±0.168 (T-score -1.4 SD), TBS 1.271±0.103. As expected, the correlation between BMD and site-matched TBS is low (r2=0.16). The prevalence of VFx grade 2/3, major OP Fx and all OP Fx is 8.4%, 17.0% and 26.0% respectively. Age- and BMI-adjusted ORs (per SD decrease) are 1.8 (1.2-2.5), 1.6 (1.2-2.1) and 1.3 (1.1-1.6) for BMD for the different categories of fractures, and 2.0 (1.4-3.0), 1.9 (1.4-2.5) and 1.4 (1.1-1.7) for TBS respectively. Only 32 to 37% of women with OP Fx have a BMD < -2.5 SD or a TBS < 1.200 when each criterion is considered alone. If the two criteria are combined (BMD < -2.5 SD or TBS < 1.200), 54 to 60% of women with an osteoporotic Fx are identified. Conclusion: As in the studies already published, these preliminary results confirm the partial independence between BMD and TBS. More importantly, adding TBS to BMD significantly increases the identification of women with prevalent OP Fx who would have been misclassified by BMD alone. For the first time we are able to obtain complementary information about fracture (VFA), density (BMD), and micro- and macro-architecture (TBS & HAS) from a single simple, cheap device with low ionizing radiation: DXA. Such complementary information is very useful for the patient in daily practice and, moreover, will likely have an impact on cost-effectiveness analyses.
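The combined screening rule quoted above (BMD T-score < -2.5 SD or TBS < 1.200) is simple enough to state directly; a small sketch with made-up records, not the OsteoLaus data:

```python
# Sketch of the combined rule discussed above: flag a woman when either the
# BMD T-score is below -2.5 SD or the TBS is below 1.200. Thresholds come
# from the abstract; the records below are hypothetical.
def flagged(bmd_t_score: float, tbs: float,
            bmd_cutoff: float = -2.5, tbs_cutoff: float = 1.200) -> bool:
    return bmd_t_score < bmd_cutoff or tbs < tbs_cutoff

women = [
    {"bmd_t": -2.8, "tbs": 1.31},   # caught by BMD alone
    {"bmd_t": -1.9, "tbs": 1.15},   # caught only when TBS is added
    {"bmd_t": -1.2, "tbs": 1.28},   # not flagged
]
n_flagged = sum(flagged(w["bmd_t"], w["tbs"]) for w in women)
print(f"{n_flagged}/{len(women)} flagged by the combined rule")
```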

Relevance: 20.00%

Abstract:

OBJECTIVE: Prospective studies have shown that quantitative ultrasound (QUS) techniques predict the risk of fracture of the proximal femur with standardised risk ratios similar to those of dual-energy x-ray absorptiometry (DXA). Few studies have investigated these devices for the prediction of vertebral fractures. The Basel Osteoporosis Study (BOS) is a population-based prospective study assessing the performance of QUS devices and DXA in predicting incident vertebral fractures. METHODS: 432 women aged 60-80 years were followed up for 3 years. Incident vertebral fractures were assessed radiologically. Bone measurements using DXA (spine and hip) and QUS measurements (calcaneus and proximal phalanges) were performed. Measurements were assessed for their value in predicting incident vertebral fractures using logistic regression. RESULTS: QUS measurements at the calcaneus and DXA measurements discriminated between women with and without incident vertebral fracture (defined by a 20% height reduction). The relative risks (RRs) for vertebral fracture, adjusted for age, were 2.3 for the Stiffness Index (SI) and 2.8 for the Quantitative Ultrasound Index (QUI) at the calcaneus, and 2.0 for bone mineral density at the lumbar spine. The predictive value (AUC (95% CI)) of QUS measurements at the calcaneus remained highly significant (0.70 for the SI, 0.72 for the QUI, and 0.67 for DXA at the lumbar spine) even after adjustment for other confounding variables. CONCLUSIONS: QUS of the calcaneus and bone mineral density measurements were shown to be significant predictors of incident vertebral fracture. The RRs for QUS measurements at the calcaneus are of similar magnitude to those for DXA measurements.
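Risk estimates per SD decrease of the kind reported above typically come from an age-adjusted logistic regression on a standardized predictor; a hedged sketch on simulated, hypothetical data (not the BOS data set), with illustrative column names:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated, hypothetical cohort only used to exercise the model form.
rng = np.random.default_rng(1)
n = 432
df = pd.DataFrame({
    "age": rng.uniform(60, 80, n),
    "stiffness_index": rng.normal(85, 15, n),
})
risk = 1 / (1 + np.exp(-(0.05 * (df["age"] - 70) - 0.04 * (df["stiffness_index"] - 85) - 2.2)))
df["incident_vfx"] = rng.binomial(1, risk)

# Standardize so the coefficient is "per SD"; the sign flip gives "per SD decrease".
df["si_z"] = (df["stiffness_index"] - df["stiffness_index"].mean()) / df["stiffness_index"].std()
fit = smf.logit("incident_vfx ~ age + si_z", data=df).fit(disp=False)
or_per_sd_decrease = np.exp(-fit.params["si_z"])
print(f"age-adjusted OR per SD decrease in SI: {or_per_sd_decrease:.2f}")
```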

Relevance: 20.00%

Abstract:

Aim: People suffering from mental illness are exposed to stigma. However, only few tools are available to assess stigmatization as perceived from the patient's perspective. The aim of this study is to adapt and validate a French version of the Stigma Scale (King, 2007). This self-report questionnaire has a three-factor structure: discrimination, disclosure and positive aspects of mental illness. The discrimination subscale refers to perceived negative reactions by others. The disclosure subscale refers mainly to managing disclosure to avoid discrimination, and the positive aspects subscale taps into how patients become more accepting of and more understanding toward their illness. Method: In the first step, internal consistency, convergent validity and test-retest reliability of the French adaptation of the 28-item scale were assessed on a sample of 183 patients. Results of confirmatory factor analyses (CFA) did not confirm the hypothesized structure. In light of the failed attempts to validate the original version, an alternative 9-item short-form version of the Stigma Scale, maintaining the integrity of the original model, was developed based on results of exploratory factor analyses in the first sample and cross-validated in a new sample of 234 patients. Results: Results of CFA did not confirm that the data fitted the three-factor model of the 28-item Stigma Scale well (χ²/df = 2.02, GFI = 0.77, AGFI = 0.73, RMSEA = 0.07, CFI = 0.77 and NNFI = 0.75). Cronbach's α is excellent for the discrimination (0.84) and disclosure (0.83) subscales but poor for potential positive aspects (0.46). External validity is satisfactory: the overall Stigma Scale total score is negatively correlated with the score on Rosenberg's Self-Esteem Scale (r = -0.49), and each subscale is significantly correlated with a visual analogue scale referring to the specific aspect of stigma (0.43 < |r| < 0.60). Intraclass correlation coefficients between 0.68 and 0.89 indicate good test-retest reliability. Results of CFA demonstrate that the items chosen for the short version of the Stigma Scale have the expected fit properties (χ²/df = 1.02, GFI = 0.98, AGFI = 0.98, RMSEA = 0.01, CFI = 1.0 and NNFI = 1.0). Considering the small number of items (3) in each subscale of the short version of the Stigma Scale, the α coefficients for the discrimination (0.57), disclosure (0.80) and potential positive aspects (0.62) subscales are considered good. Conclusion: Our results suggest that the 9-item French short version of the Stigma Scale is a useful, reliable and valid self-report questionnaire to assess perceived stigmatization in people suffering from mental illness. Completion time is very short, and the questions are well understood and accepted by patients.
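The internal-consistency figures quoted above are Cronbach's α; the standard formula, applied here to a hypothetical 3-item response matrix rather than the study data, is:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x k_items) response matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical responses to a 3-item subscale (0-3 ratings), generated only to
# exercise the formula -- not data from the validation study.
rng = np.random.default_rng(2)
latent = rng.normal(size=(234, 1))
responses = np.clip(np.round(1.5 + latent + 0.7 * rng.normal(size=(234, 3))), 0, 3)
print(f"alpha = {cronbach_alpha(responses):.2f}")
```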

Relevance: 20.00%

Abstract:

OBJECTIVE: To determine the psychometric properties of an adapted version of the Falls Efficacy Scale (FES) in older rehabilitation patients. DESIGN: Cross-sectional survey. SETTING: Postacute rehabilitation facility in Switzerland. PARTICIPANTS: Seventy elderly persons aged 65 years and older receiving postacute, inpatient rehabilitation. INTERVENTIONS: Not applicable. MAIN OUTCOME MEASURES: FES questions asked about subjects' confidence (range, 0 [none]-10 [full]) in performing 12 activities of daily living (ADLs) without falling. Construct validity was assessed using correlation with measures of physical (basic ADLs [BADLs]), cognitive (Mini-Mental State Examination [MMSE]), affective (15-item Geriatric Depression Scale [GDS]), and mobility (Performance Oriented Mobility Assessment [POMA]) performance. Predictive validity was assessed using the length of rehabilitation stay as the outcome. To determine test-retest reliability, FES administration was repeated in a random subsample (n=20) within 72 hours. RESULTS: FES scores ranged from 10 to 120 (mean, 88.7+/-26.5). Internal consistency was optimal (Cronbach alpha=.90), and item-to-total correlations were all significant, ranging from .56 (toilet use) to .82 (reaching into closets). Test-retest reliability was high (intraclass correlation coefficient, .97; 95% confidence interval, .95-.99; P<.001). Subjects reporting a fall in the previous year had lower FES scores than nonfallers (85.0+/-25.2 vs 94.4+/-27.9, P=.054). The FES correlated with POMA (Spearman rho=.40, P<.001), MMSE (rho=.37, P=.001), BADL (rho=.43, P<.001), and GDS (rho=-.53, P<.001) scores. These relationships remained significant in multivariable analysis for BADLs and GDS, confirming FES construct validity. There was a significant inverse relationship between FES score and the length of rehabilitation stay, independent of sociodemographic, functional, cognitive, and fall status. CONCLUSIONS: This adapted FES is reliable and valid in older patients undergoing postacute rehabilitation. The independent association between poor falls efficacy and increased length of stay has not been previously described and needs further investigation.
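Construct-validity checks like those above are plain Spearman correlations between the FES total score and the comparison measures; a short sketch on simulated, hypothetical scores (not the study data):

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical FES totals and comparison measures, generated only to show the
# correlation step; signs mimic the expected directions (POMA positive, GDS negative).
rng = np.random.default_rng(3)
n = 70
fes = rng.uniform(10, 120, n)
poma = 0.2 * fes + rng.normal(0, 8, n)     # mobility score
gds = -0.05 * fes + rng.normal(0, 2, n)    # depression score

for name, scores in [("POMA", poma), ("GDS", gds)]:
    rho, p = spearmanr(fes, scores)
    print(f"FES vs {name}: rho = {rho:.2f}, p = {p:.3f}")
```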

Relevance: 20.00%

Abstract:

Objective. Vibration training (VT) is a new exercise method with good acceptance among sedentary subjects, due to its passive principle: the machine moves the subject, not the other way round. We hypothesize that untrained subjects can benefit from greater cardiovascular and metabolic stimulation than trained athletes, resembling classical aerobic-type activity, in addition to the strength gains shown in various studies. Methods. Three groups of male subjects, inactive (SED), endurance-trained athletes (END) and strength-trained athletes (STR), underwent fitness (VO2max) and lower-body isokinetic strength tests. Subjects then performed a session of oscillating VT composed of 3 exercises (isometric half-squat, dynamic squat, dynamic squat with added load), each of 3 minutes' duration and repeated at 3 frequencies. VO2, heart rate and the Borg scale were monitored. Results. 27 healthy subjects (10 SED, 9 END and 8 STR), mean age 24.5 (SED), 25.0 (STR) and 29.8 (END) years, were included. VO2max differed significantly, as expected (47.9 vs. 52.9 vs. 63.9 ml/kg/min for SED, STR and END respectively). Isokinetic dominant-leg extensor strength was higher in STR (3.32 Nm/kg vs. 2.60 and 2.74 in SED and END). During VT, peak oxygen consumption (% of VO2max) reached 59.3 in SED, 50.8 in STR and 48.0 in END (P<0.001 between SED and the other subjects). Peak heart rate (% of maximal heart rate) was 82.7 in SED, 80.4 in STR and 72.4 in END. In SED, dynamic exercises without extra load elicited 51.0% of VO2max and 72.1% of maximal heart rate, and perceived effort reached 15.1/20. Conclusions. VT is an unconventional type of exercise, which has been shown to enhance strength, bone density, balance and flexibility. Users are attracted by its relative passivity. In SED, we show that VT elicits a sufficient cardiovascular response to benefit overall fitness in addition to the known strength effects. VT's higher acceptance as an exercise in sedentary people, compared to jogging or cycling for example, can lead to better adherence to physical activity. Although data on the long-term health effects of VT are not yet available, we believe this combination of aerobic- and resistance-type exercise can be beneficial for multiple health parameters, especially cardiovascular health.
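The intensity figures above are peak values expressed relative to each subject's individual maximum; a trivial sketch of that normalization with made-up numbers, not the study's raw measurements:

```python
# Relative intensity as reported above: peak value during vibration training
# divided by the individual maximum. All numbers here are hypothetical.
def percent_of_max(peak: float, maximum: float) -> float:
    return 100.0 * peak / maximum

vo2_rel = percent_of_max(peak=28.4, maximum=47.9)   # ml/kg/min -> ~59% of VO2max
hr_rel = percent_of_max(peak=162, maximum=196)      # bpm -> ~83% of maximal heart rate
print(f"{vo2_rel:.0f}% of VO2max, {hr_rel:.0f}% of HRmax")
```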

Relevance: 20.00%

Abstract:

We have previously reported (Dobreva, I., Waeber, G., Mooser, V., James, R. W., and Widmann, C. (2003) J. Lipid Res. 44, 2382-2390) that low density lipoproteins (LDLs) induce activation of the p38 MAPK pathway, resulting in fibroblast spreading and lamellipodia formation. Here, we show that LDL-stimulated fibroblast spreading and wound sealing are due to secretion of a soluble factor. Using an antibody-based human protein array, interleukin-8 (IL-8) was identified as the main cytokine whose concentration was increased in supernatants from LDL-stimulated cells. Incubation of supernatants from LDL-treated cells with an anti-IL-8 blocking antibody completely abolished their ability to induce cell spreading and mediate wound closure. In addition, fibroblasts treated with recombinant IL-8 spread to the same extent as cells incubated with LDL or supernatants from LDL-treated cells. The ability of LDL and IL-8 to induce fibroblast spreading was mediated by the IL-8 receptor type II (CXCR-2). Furthermore, LDL-induced IL-8 production and subsequent wound closure required the activation of the p38 MAPK pathway, because both processes were abrogated by a specific p38 inhibitor. Therefore, the capacity of LDLs to induce fibroblast spreading and accelerate wound closure relies on their ability to stimulate IL-8 secretion in a p38 MAPK-dependent manner. Regulation of fibroblast shape and migration by lipoproteins may be relevant to atherosclerosis that is characterized by increased LDL cholesterol levels, IL-8 production, and extensive remodeling of the vessel wall.

Relevance: 20.00%

Abstract:

BACKGROUND: Accurate catalogs of structural variants (SVs) in mammalian genomes are necessary to elucidate the potential mechanisms that drive SV formation and to assess their functional impact. Next-generation sequencing methods for SV detection are an advance on array-based methods, but are almost exclusively limited to four basic types: deletions, insertions, inversions and copy number gains. RESULTS: By visual inspection of 100 Mbp of genome to which next-generation sequence data from 17 inbred mouse strains had been aligned, we identify and interpret 21 paired-end mapping patterns, which we validate by PCR. These paired-end mapping patterns reveal a greater diversity and complexity in SVs than previously recognized. In addition, Sanger-based sequence analysis of 4,176 breakpoints at 261 SV sites reveals additional complexity at approximately a quarter of the structural variants analyzed. We find micro-deletions and micro-insertions at SV breakpoints, ranging from 1 to 107 bp, and SNPs that extend breakpoint micro-homology and may catalyze SV formation. CONCLUSIONS: An integrative approach using experimental analyses to train computational SV calling is essential for the accurate resolution of the architecture of SVs. We find considerable complexity in SV formation; about a quarter of SVs in the mouse are composed of a complex mixture of deletion, insertion, inversion and copy number gain. Computational methods can be adapted to identify most paired-end mapping patterns.
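Paired-end mapping patterns of the kind catalogued in this study are read off from read-pair orientation and insert size; a deliberately simplified sketch using pysam (assumed available), covering only the basic deletion/insertion/inversion signatures rather than the 21 patterns the authors describe (copy-number gains, usually inferred from read depth or everted pairs, are omitted):

```python
import pysam  # assumed available; expects an aligned BAM file with mapped mates

# Simplified classification of one read pair into a basic SV signature.
# Real callers, and the patterns described in the study, are far more nuanced.
def classify_pair(read, expected_insert=350, tolerance=150):
    if read.mate_is_unmapped or read.reference_name != read.next_reference_name:
        return "unresolved"
    if read.is_reverse == read.mate_is_reverse:
        return "inversion"                       # mates on the same strand
    tlen = abs(read.template_length)
    if tlen > expected_insert + tolerance:
        return "deletion"                        # mates map too far apart
    if tlen < expected_insert - tolerance:
        return "insertion"                       # mates map too close together
    return "concordant"

def tally(bam_path):
    counts = {}
    with pysam.AlignmentFile(bam_path, "rb") as bam:
        for read in bam:
            if read.is_paired and read.is_read1 and not read.is_unmapped:
                label = classify_pair(read)
                counts[label] = counts.get(label, 0) + 1
    return counts
```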

Relevance: 20.00%

Abstract:

Tractography is a class of algorithms aiming to map, in vivo, the major neuronal pathways in the white matter from diffusion magnetic resonance imaging (MRI) data. These techniques offer a powerful tool to noninvasively investigate the architecture of the neuronal connections of the brain at the macroscopic scale. Unfortunately, however, the reconstructions recovered with existing tractography algorithms are not truly quantitative, even though diffusion MRI is a quantitative modality by nature. As a matter of fact, several techniques have been proposed in recent years to estimate, at the voxel level, intrinsic microstructural features of the tissue, such as axonal density and diameter, by using multicompartment models. In this paper, we present a novel framework to re-establish the link between tractography and tissue microstructure. Starting from an input set of candidate fiber tracts, which are estimated from the data using standard fiber-tracking techniques, we model the diffusion MRI signal in each voxel of the image as a linear combination of the restricted and hindered contributions generated in every location of the brain by these candidate tracts. Then, we seek the global weight of each of them, i.e., the effective contribution or volume, such that they globally fit the measured signal as well as possible. We demonstrate that these weights can be easily recovered by solving a global convex optimization problem with efficient algorithms. The effectiveness of our approach has been evaluated both on a realistic phantom with known ground truth and on in vivo brain data. The results clearly demonstrate the benefits of the proposed formulation, opening new perspectives for a more quantitative and biologically plausible assessment of the structural connectivity of the brain.
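The global fit described above boils down to a non-negative least-squares problem over the candidate-tract weights; a toy sketch with scipy, where the dictionary A and signal y are random stand-ins rather than an actual restricted/hindered forward model:

```python
import numpy as np
from scipy.optimize import nnls

# Toy version of the convex fit: each column of A holds the signal contribution
# one candidate tract would generate across all voxels/measurements, y is the
# measured dMRI signal, and the non-negative weights x are the effective
# contributions. A and y are random stand-ins, not a real forward model.
rng = np.random.default_rng(4)
n_measurements, n_candidates = 500, 60
A = np.abs(rng.normal(size=(n_measurements, n_candidates)))
x_true = np.zeros(n_candidates)
x_true[rng.choice(n_candidates, size=10, replace=False)] = rng.uniform(0.5, 2.0, 10)
y = A @ x_true + 0.01 * rng.normal(size=n_measurements)

x_hat, residual = nnls(A, y)   # argmin_x ||A x - y||_2  subject to  x >= 0
print(f"non-zero weights recovered: {(x_hat > 1e-3).sum()}, residual = {residual:.3f}")
```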

Relevance: 20.00%

Publisher: