1000 results for "Méthode de quantification"


Relevance: 20.00%

Abstract:

In parallel with the advent of intelligence-led policing models, crime analysis and forensic intelligence methods have undergone important developments in recent years. Applications have been proposed in various domains of forensic science in order to exploit and manage different types of material traces in a systematic and more effective way. In this respect, the field of false identity documents has received little attention, even though it is a serious form of crime in which organised crime is involved. The present study seeks to fill this gap by proposing a simple and generalisable profiling method for false identity documents that aims to uncover existing links on the basis of material characteristics that can be analysed visually. These characteristics are considered to constitute the forger's particular trademark and can thus be exploited to infer links between false identity documents originating from the same source. A collection of more than 200 false identity documents, comprising three types of forged documents, was gathered from the police forces of nine Swiss cantons and integrated into an ad hoc database. The links detected systematically and automatically by this database were exploited and analysed in order to produce strategic and operational intelligence useful in the fight against document fraud. The profiling and intelligence processes set up for the three types of false identity documents studied proved effective, with a high percentage of the documents turning out to be linked (from 30% to 50%). Document fraud appears to be a structured, interregional form of crime in which the links established between false identity documents can support investigations and strategic decisions. The results suggest developing both preventive and repressive approaches to combat document fraud.
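As a minimal illustration of the kind of link detection the abstract describes, the sketch below compares documents by their shared visually analysable characteristics and flags pairs whose profiles largely agree. The feature names, values, and the 75% threshold are hypothetical, not taken from the thesis or its database.

```python
from itertools import combinations

# Hypothetical visual/material characteristics of seized counterfeit documents.
documents = {
    "doc_A": {"print": "inkjet", "font": "Arial-like", "hologram": "foil-stamp", "lamination": "pouch"},
    "doc_B": {"print": "inkjet", "font": "Arial-like", "hologram": "foil-stamp", "lamination": "heat-seal"},
    "doc_C": {"print": "laser",  "font": "OCR-B",      "hologram": "none",       "lamination": "pouch"},
}

def similarity(profile_a, profile_b):
    """Fraction of shared characteristics (simple matching coefficient)."""
    keys = profile_a.keys() & profile_b.keys()
    matches = sum(profile_a[k] == profile_b[k] for k in keys)
    return matches / len(keys)

# Link any pair of documents whose profiles agree on at least 75% of the features.
links = [
    (a, b, round(similarity(documents[a], documents[b]), 2))
    for a, b in combinations(documents, 2)
    if similarity(documents[a], documents[b]) >= 0.75
]
print(links)  # e.g. [('doc_A', 'doc_B', 0.75)] -> candidate common source
```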

Relevance: 20.00%

Abstract:

Metabolic labeling techniques have recently become popular tools for the quantitative profiling of proteomes. Classical stable isotope labeling with amino acids in cell culture (SILAC) uses pairs of heavy/light isotopic forms of amino acids to introduce predictable mass differences in protein samples to be compared. After proteolysis, pairs of cognate precursor peptides can be correlated, and their intensities can be used for mass spectrometry-based relative protein quantification. We present an alternative SILAC approach by which two cell cultures are grown in media containing isobaric forms of amino acids, labeled either with 13C on the carbonyl (C-1) carbon or 15N on backbone nitrogen. Labeled peptides from both samples have the same nominal mass and nearly identical MS/MS spectra but, upon fragmentation, generate distinct immonium ions separated by 1 amu. When labeled protein samples are mixed, the intensities of these immonium ions can be used for the relative quantification of the parent proteins. We validated the labeling of cellular proteins with valine, isoleucine, and leucine with coverage of 97% of all tryptic peptides. We improved the sensitivity for the detection of the quantification ions on a pulsing instrument by using a specific fast scan event. The analysis of a protein mixture with a known heavy/light ratio showed reliable quantification. Finally, the application of the technique to the analysis of two melanoma cell lines yielded quantitative data consistent with those obtained by a classical two-dimensional DIGE analysis of the same samples. Our method combines the features of the SILAC technique with the advantages of isobaric labeling schemes like iTRAQ. We discuss advantages and disadvantages of isobaric SILAC with immonium ion splitting as well as possible ways to improve it.
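Because the two isobaric labels only become distinguishable as immonium ions one mass unit apart in the MS/MS spectrum, relative quantification essentially reduces to comparing the intensities of those two fragment ions per peptide and aggregating over a protein. The sketch below illustrates that ratio calculation; the peptide identifiers and intensities are invented, and the median aggregation is a common robust choice rather than the method prescribed in the paper.

```python
import statistics

# Hypothetical immonium-ion intensities extracted from MS/MS spectra of cognate peptide pairs.
peptide_spectra = {
    "LVNELTEFAK": {"immonium_light": 8.2e4, "immonium_heavy": 4.1e4},
    "VATVSLPR":   {"immonium_light": 1.9e5, "immonium_heavy": 2.1e5},
}

def heavy_to_light_ratio(spectrum):
    """Relative abundance of the two labeled samples for one peptide."""
    return spectrum["immonium_heavy"] / spectrum["immonium_light"]

# Protein-level ratio taken as the median of its peptide ratios.
ratios = [heavy_to_light_ratio(s) for s in peptide_spectra.values()]
print(f"protein heavy/light ratio ~ {statistics.median(ratios):.2f}")
```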

Relevance: 20.00%

Abstract:

During the past 20 years, therapeutic and rehabilitative modalities in the field of psychosocial rehabilitation have diversified and become more specific. It is now possible to offer individualized rehabilitation programs, both for general socio-professional goals and in the clinical field, according to the patients' needs and personal assets. The content of these programs combines various forms of specialized medical and paramedical services. The indications are established through a careful assessment. The rehabilitation unit of the University Department of Psychiatry in Lausanne has developed a multidisciplinary assessment method based on the bio-psychosocial integrative model and the vulnerability-stress model, integrating Wood's levels of experience for the analysis of psychosocial functioning. This results in a structured assessment program, which leads to a comprehensive multidisciplinary assessment (difficulties versus adaptive resources).

Relevance: 20.00%

Abstract:

Coronary artery calcification (CAC) is quantified based on a computed tomography (CT) scan image. A calcified region is identified. Modified expectation maximization (MEM) of a statistical model for the calcified and background material is used to estimate the partial calcium content of the voxels. The algorithm limits the region over which MEM is performed. By using MEM, the statistical properties of the model are iteratively updated based on the calculated resultant calcium distribution from the previous iteration. The estimated statistical properties are used to generate a map of the partial calcium content in the calcified region. The volume of calcium in the calcified region is determined based on the map. The experimental results on a cardiac phantom, scanned 90 times using 15 different protocols, demonstrate that the proposed method is less sensitive to the partial volume effect and noise, with an average error of 9.5% (standard deviation (SD) of 5-7 mm³) compared with 67% (SD of 3-20 mm³) for conventional techniques. The high reproducibility of the proposed method for 35 patients, scanned twice using the same protocol at a minimum interval of 10 min, shows that the method provides 2-3 times lower interscan variation than conventional techniques.
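The core idea is an EM-style loop: treat each voxel in the calcified region as a mixture of calcium and background, estimate the per-voxel calcium fraction from the current intensity model, then re-estimate the model statistics from those fractions and repeat. The sketch below is a deliberately simplified two-class partial-volume estimator along those lines; the initialisation, iteration count, and synthetic data are illustrative assumptions, not the published MEM algorithm.

```python
import numpy as np

def estimate_calcium_volume(intensities, voxel_volume_mm3, n_iter=50):
    """Simplified EM-like partial-volume estimate of calcium content.

    intensities: 1D array of HU values inside the identified calcified region.
    Assumes each voxel mixes a 'background' and a 'calcium' class (class means
    only, for brevity); returns the estimated calcium volume in mm^3.
    """
    mu_bg, mu_ca = np.percentile(intensities, [10, 90])  # crude initialisation
    for _ in range(n_iter):
        # E-step: per-voxel calcium fraction under the current linear mixture model
        frac = np.clip((intensities - mu_bg) / (mu_ca - mu_bg + 1e-9), 0.0, 1.0)
        # M-step: update the class means from the fraction-weighted intensities
        mu_ca = np.average(intensities, weights=frac + 1e-9)
        mu_bg = np.average(intensities, weights=(1.0 - frac) + 1e-9)
    return frac.sum() * voxel_volume_mm3

# Illustrative use on synthetic data (150 background voxels, 20 partly calcified ones)
rng = np.random.default_rng(0)
voxels = np.concatenate([rng.normal(60, 15, 150), rng.normal(400, 40, 20)])
print(f"estimated calcium volume ~ {estimate_calcium_volume(voxels, 0.5):.1f} mm^3")
```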

Relevance: 20.00%

Abstract:

OBJECTIVE: To assess the impact of nonuniform dose distribution within lesions and tumor-involved organs of patients receiving Zevalin, and to discuss possible implications of equivalent uniform biological effective doses (EU-BED) on treatment efficacy and toxicity. MATLAB-based software for voxel-based dosimetry was adopted for this purpose. METHODS: Eleven lesions from seven patients with either indolent or aggressive non-Hodgkin lymphoma were analyzed, along with four organs with disease. Absorbed doses were estimated by a direct integration of single-voxel kinetic data from serial tomographic images. After proper corrections, differential BED distributions and surviving cell fractions were estimated, allowing for the calculation of EU-BED. To quantify dose uniformity in each target area, a heterogeneity index was defined. RESULTS: Average doses were below those prescribed by conventional radiotherapy to eradicate lymphoma lesions. Dose heterogeneity and effect on tumor control varied among lesions, with no apparent relation to tumor mass. Although radiation doses to involved organs were safe, unexpected liver toxicity occurred in one patient who presented with a pattern of diffuse infiltration. CONCLUSION: Voxel-based dosimetry and radiobiologic modeling can be successfully applied to lesions and tumor-involved organs, representing a methodological advance over estimation of mean absorbed doses. However, effects on tumor control and organ toxicity still cannot be easily predicted.
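The voxel-level chain of calculations (absorbed dose, then BED, then surviving fraction, then the equivalent uniform BED) follows standard radiobiological formulas. The sketch below shows one common formulation; the radiobiological parameters, the dose-protraction factor, the coefficient-of-variation heterogeneity measure, and the voxel doses are generic illustrative choices, not the values or definitions used in the paper.

```python
import numpy as np

# Generic radiobiological parameters (alpha in Gy^-1, alpha/beta in Gy, and a
# dose-protraction factor); illustrative assumptions only.
ALPHA, ALPHA_BETA, G_FACTOR = 0.3, 10.0, 0.2

def voxel_bed(dose_gy):
    """Per-voxel biological effective dose for a given absorbed-dose map."""
    d = np.asarray(dose_gy, dtype=float)
    return d * (1.0 + G_FACTOR * d / ALPHA_BETA)

def eu_bed(dose_gy):
    """Equivalent uniform BED: the uniform BED giving the same mean cell survival."""
    sf = np.exp(-ALPHA * voxel_bed(dose_gy))       # per-voxel surviving fraction
    return -np.log(sf.mean()) / ALPHA

def heterogeneity_index(dose_gy):
    """One plausible heterogeneity measure: coefficient of variation of the BED."""
    bed = voxel_bed(dose_gy)
    return bed.std() / bed.mean()

doses = np.array([8.0, 12.0, 15.0, 30.0, 5.0])     # invented voxel doses in Gy
print(f"EU-BED ~ {eu_bed(doses):.1f} Gy, HI ~ {heterogeneity_index(doses):.2f}")
```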

Relevance: 20.00%

Abstract:

An enzyme-linked immunosorbent assay was standardized for the detection of cryptococcal antigen in serum and cerebrospinal fluid. The system was evaluated in clinical samples from patients infected with human immunodeficiency virus, with and without a previous cryptococcosis diagnosis. The evaluated system is highly sensitive and specific, and when it was compared with latex agglutination there were no significant differences. A standard curve with purified Cryptococcus neoformans antigen was established for antigen quantification in positive samples.
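Quantification against a standard curve amounts to fitting absorbance as a function of the known antigen concentrations and inverting the fit for unknown samples. A minimal sketch using a four-parameter logistic fit, a common choice for ELISA calibration, is shown below; the concentrations and optical densities are invented and not data from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

# Invented calibration points: purified antigen concentration (ng/mL) vs optical density
conc = np.array([0.5, 1, 2, 4, 8, 16, 32])
od = np.array([0.08, 0.15, 0.28, 0.52, 0.95, 1.55, 2.05])

def four_pl(x, a, b, c, d):
    """Four-parameter logistic, a usual ELISA calibration model."""
    return d + (a - d) / (1.0 + (x / c) ** b)

params, _ = curve_fit(four_pl, conc, od, p0=[0.05, 1.0, 5.0, 2.5], maxfev=10000)

def od_to_concentration(y, a, b, c, d):
    """Invert the 4PL curve to read a concentration from a measured OD."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

print(f"sample with OD 0.70 ~ {od_to_concentration(0.70, *params):.1f} ng/mL")
```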

Relevance: 20.00%

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields. The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity and the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges.
In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems, in order to allow for a more accurate and realistic quantification of the uncertainty associated with the thus inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigate two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution. The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach proves highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
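The gradual-deformation proposal mentioned at the end combines the current model realization with an independent fresh realization, so that a single angle controls the perturbation strength while Gaussian prior statistics are preserved. The sketch below shows such a proposal step inside a Metropolis-type loop; it uses uncorrelated standard-normal fields, a placeholder forward model, and invented data, so it is only a schematic of the idea rather than the thesis' actual inversion code.

```python
import numpy as np

rng = np.random.default_rng(42)

def gradual_deformation_proposal(current, theta):
    """Blend the current field with a fresh realization; theta sets the perturbation strength.

    For standard-normal fields, the cos/sin weights preserve the prior covariance,
    which is the core property of gradual deformation. Fields here are i.i.d.
    normals for brevity (no spatial correlation model).
    """
    fresh = rng.standard_normal(current.shape)
    return np.cos(theta) * current + np.sin(theta) * fresh

def log_likelihood(model, datum, sigma=0.1):
    """Hypothetical Gaussian misfit; 'datum' stands in for, e.g., observed traveltimes."""
    predicted = model.mean()          # placeholder forward model
    return -0.5 * ((predicted - datum) / sigma) ** 2

model = rng.standard_normal(100)      # current realization of the parameter field
datum = 0.3                           # invented observation
theta = 0.15                          # small angle -> gentle perturbation
for _ in range(2000):
    proposal = gradual_deformation_proposal(model, theta)
    # Prior is preserved by the proposal, so the acceptance test uses the likelihood ratio only.
    if np.log(rng.uniform()) < log_likelihood(proposal, datum) - log_likelihood(model, datum):
        model = proposal
print(f"posterior mean of the field average ~ {model.mean():.2f}")
```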

Relevance: 20.00%

Abstract:

Evidence has emerged that the initiation and growth of gliomas is sustained by a subpopulation of cancer-initiating cells (CICs). Because of the difficulty of using markers to tag CICs in gliomas, we have previously exploited more robust phenotypic characteristics, including a specific morphology and intrinsic autofluorescence, to identify and isolate a subpopulation of glioma CICs, called FL1(+). The objective of this study was to further validate our method in a large cohort of human gliomas and a mouse model of glioma. Seventy-four human gliomas of all grades and the GFAP-V(12)HA-ras B8 mouse model were analyzed for in vitro self-renewal capacity and their content of FL1(+) cells. Nonneoplastic brain tissue and embryonic mouse brain were used as controls. Genetic traceability along passages was assessed with microsatellite analysis. We found that FL1(+) cells from low-grade gliomas and from control nonneoplastic brain tissue show a lower level of autofluorescence and undergo a restricted number of cell divisions before dying in culture. In contrast, we found that FL1(+) cells derived from many but not all high-grade gliomas acquire high levels of autofluorescence and can be propagated in long-term cultures. Moreover, FL1(+) cells show a remarkable traceability over time in vitro and in vivo. Our results show that FL1(+) cells can be found in all specimens of a large cohort of human gliomas of different grades and in a model of genetically induced mouse glioma, as well as in nonneoplastic brain. However, their self-renewal capacity is variable and seems to be dependent on the tumor grade.

Relevance: 20.00%

Abstract:

Introduction: Lesion detection in multiple sclerosis (MS) is an essential part of its clinical diagnosis. In addition, radiological characterisation of MS lesions is an important research field that aims at distinguishing different MS types, monitoring drug response and prognosis. To date, various MR protocols have been proposed to obtain optimal lesion contrast for early and comprehensive diagnosis of MS. In this study, we compare the sensitivity of five different MR contrasts for lesion detection: (i) the DIR (Double Inversion Recovery) sequence [4], (ii) the dark-fluid SPACE acquisition scheme, a 3D variant of the 2D FLAIR sequence [1], (iii) the MP2RAGE [2], an MP-RAGE variant that provides homogeneous T1 contrast and quantitative T1 values, and the sequences currently used for clinical MS diagnosis (2D FLAIR, MP-RAGE). Furthermore, we investigate the T1 relaxation times of cortical and sub-cortical regions in the brain hemispheres and the cerebellum at 3T.

Methods: Ten early-stage female MS patients (age: 31.6 ± 4.7 y; disease duration: 3.8 ± 1.9 y; disability score, EDSS: 1.8 ± 0.4) and 10 healthy controls (age- and gender-matched: 31.2 ± 5.8 y) were included in the study after obtaining informed written consent according to the local ethics protocol. All experiments were performed at 3T (Magnetom Trio, a Tim System, Siemens, Germany) using a 32-channel head coil [5]. The imaging protocol included the following sequences (all except the axial 2D FLAIR with 1 × 1 × 1.2 mm³ voxels and a 256 × 256 × 160 matrix): DIR (TI1/TI2/TR XX/3652/10000 ms, iPAT = 2, TA 12:02 min); MP-RAGE (TI/TR 900/2300 ms, iPAT = 3, TA 3:47 min); MP2RAGE (TI1/TI2/TR 700/2500/5000 ms, iPAT = 3, TA 8:22 min, cf. [2]); 3D FLAIR SPACE (only for patients 4-6, TI/TR 1800/5000 ms, iPAT = 2, TA 5:52 min, cf. [1]); axial FLAIR (0.9 × 0.9 × 2.5 mm³, 256 × 256 × 44 matrix, TI/TR 2500/9000 ms, iPAT = 2, TA 4:05 min). Lesions were identified by two experienced raters (a neurologist and a radiologist), manually contoured and assigned to regional locations (see Table 1). Regional lesion masks (RLM) from each contrast were compared for the number and volume of lesions. In addition, the RLM were merged into a single "master" mask, which represented the combined lesions of all contrasts. T1 values were derived for each location from this mask for patients 5-10 (the 3D FLAIR contrast was missing for patients 1-4).

Results & Discussion: The DIR sequence appears the most sensitive for total lesion count, followed by the MP2RAGE (Table 1). The 3D FLAIR SPACE sequence turns out to be more sensitive than the 2D FLAIR, presumably due to reduced partial volume effects. For sub-cortical hemispheric lesions, the DIR contrast appears to be as sensitive as the MP2RAGE and SPACE, but it is the most sensitive for cerebellar MS plaques. The DIR sequence is also the one that reveals cortical hemispheric lesions best. T1 relaxation times at 3T in the WM and GM of the hemispheres and the cerebellum, as obtained with the MP2RAGE sequence, are shown in Table 2. Extending previous studies, we confirm overall longer T1 values and higher standard deviations in lesion tissue compared with non-lesion tissue and with control tissue in healthy controls. We hypothesize a biological origin (different degrees of axonal loss and demyelination) rather than a technical one.

Conclusion: In this study, we applied five MR contrasts, including two novel sequences, to investigate which contrast has the highest sensitivity for early MS diagnosis. In addition, we characterized for the first time the T1 relaxation times in cortical and sub-cortical regions of the hemispheres and the cerebellum. The results are in agreement with previous publications and allow a meaningful biological interpretation of the data.
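Merging the per-contrast regional lesion masks into a single "master" mask and reading quantitative T1 values from it is a simple voxelwise operation, essentially a logical OR of binary masks followed by masked statistics. The sketch below illustrates this with small random arrays standing in for the real DIR/MP2RAGE/FLAIR masks and the MP2RAGE T1 map; none of the numbers correspond to the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)
shape = (64, 64, 32)

# Stand-ins for binary lesion masks drawn on the different contrasts
mask_dir     = rng.random(shape) > 0.995
mask_mp2rage = rng.random(shape) > 0.996
mask_flair   = rng.random(shape) > 0.997

# "Master" mask: union of the lesions seen on any contrast
master_mask = mask_dir | mask_mp2rage | mask_flair

# Stand-in quantitative T1 map (ms), as would be delivered by an MP2RAGE reconstruction
t1_map = rng.normal(850, 60, shape)

lesion_t1 = t1_map[master_mask]
print(f"{master_mask.sum()} lesion voxels, "
      f"T1 = {lesion_t1.mean():.0f} +/- {lesion_t1.std():.0f} ms")
```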

Relevance: 20.00%

Abstract:

OBJECTIVES: The purpose of this study was to compare a novel compressed sensing (CS)-based single-breath-hold multislice magnetic resonance cine technique with the standard multi-breath-hold technique for the assessment of left ventricular (LV) volumes and function. BACKGROUND: Cardiac magnetic resonance is generally accepted as the gold standard for LV volume and function assessment. LV function is 1 of the most important cardiac parameters for diagnosis and the monitoring of treatment effects. Recently, CS techniques have emerged as a means to accelerate data acquisition. METHODS: The prototype CS cine sequence acquires 3 long-axis and 4 short-axis cine loops in 1 single breath-hold (temporal/spatial resolution: 30 ms/1.5 × 1.5 mm²; acceleration factor 11.0) to measure left ventricular ejection fraction (LVEF_CS) as well as LV volumes and LV mass using LV model-based 4D software. For comparison, a conventional stack of multi-breath-hold cine images was acquired (temporal/spatial resolution 40 ms/1.2 × 1.6 mm²). As a reference for the left ventricular stroke volume (LVSV), aortic flow was measured by phase-contrast acquisition. RESULTS: In 94% of the 33 participants (12 volunteers: mean age 33 ± 7 years; 21 patients: mean age 63 ± 13 years with different LV pathologies), the image quality of the CS acquisitions was excellent. LVEF_CS and LVEF_standard were similar (48.5 ± 15.9% vs. 49.8 ± 15.8%; p = 0.11; r = 0.96; slope 0.97; p < 0.00001). Agreement of LVSV_CS with aortic flow was superior to that of LVSV_standard (overestimation vs. aortic flow: 5.6 ± 6.5 ml vs. 16.2 ± 11.7 ml, respectively; p = 0.012) with less variability (r = 0.91; p < 0.00001 for the CS technique vs. r = 0.71; p < 0.01 for the standard technique). The intraobserver and interobserver agreement for all CS parameters was good (slopes 0.93 to 1.06; r = 0.90 to 0.99). CONCLUSIONS: The results demonstrated the feasibility of applying the CS strategy to evaluate LV function and volumes with high accuracy in patients. The single-breath-hold CS strategy has the potential to replace the multi-breath-hold standard cardiac magnetic resonance technique.
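The agreement statistics reported here (mean overestimation of the stroke volume against the aortic-flow reference, its variability, correlation, and regression slope) are straightforward to compute on paired measurements. The sketch below does so for invented paired values; it is only a schematic of the kind of analysis described, not the study's analysis code.

```python
import numpy as np

# Invented paired measurements (mL): stroke volume from CS cine vs aortic-flow reference
lvsv_cs   = np.array([72, 85, 60, 95, 78, 66, 88])
aortic_sv = np.array([68, 80, 57, 88, 73, 61, 83])

bias  = (lvsv_cs - aortic_sv).mean()            # mean overestimation vs the reference
sd    = (lvsv_cs - aortic_sv).std(ddof=1)       # variability of the differences
r     = np.corrcoef(lvsv_cs, aortic_sv)[0, 1]   # Pearson correlation
slope = np.polyfit(aortic_sv, lvsv_cs, 1)[0]    # regression slope

print(f"bias {bias:.1f} +/- {sd:.1f} mL, r = {r:.2f}, slope = {slope:.2f}")
```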

Relevance: 20.00%

Abstract:

Reliable quantification of the macromolecule signals in short echo-time 1H MRS spectra is particularly important at high magnetic fields for an accurate quantification of metabolite concentrations (the neurochemical profile), owing to the effectively increased spectral resolution of the macromolecule components. The purpose of the present study was to assess two approaches to quantification that take the contribution of macromolecules into account in the quantification step. 1H spectra were acquired on a 14.1 T/26 cm horizontal scanner in five rats using the ultra-short echo-time SPECIAL (spin echo full intensity acquired localization) spectroscopy sequence. Metabolite concentrations were estimated using LCModel, combined with a simulated basis set of metabolites using published spectral parameters and either the spectrum of macromolecules measured in vivo, using an inversion recovery technique, or a baseline simulated by the built-in spline function. The fitted spline function resulted in a smooth approximation of the in vivo macromolecules but, in accordance with previous studies using Subtract-QUEST, could not completely reproduce all features of the in vivo macromolecule spectrum at 14.1 T. As a consequence, the measured macromolecular 'baseline' led to a more accurate and reliable quantification at higher field strengths.
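Conceptually, LCModel-style quantification expresses the measured spectrum as a linear combination of metabolite basis spectra plus a macromolecule contribution, either measured in vivo or represented by a smooth spline. The toy least-squares sketch below illustrates that decomposition with entirely synthetic basis vectors and a synthetic spectrum; the real fit additionally handles lineshape, phase, baseline smoothness, and regularisation, so this is only a schematic of the principle.

```python
import numpy as np

rng = np.random.default_rng(3)
n_points = 512

def gaussian(center, width):
    """Simple Gaussian profile used as a stand-in basis spectrum."""
    x = np.arange(n_points)
    return np.exp(-0.5 * ((x - center) / width) ** 2)

# Synthetic "basis set": a few narrow metabolite profiles plus one broad macromolecule component
basis = np.column_stack([
    gaussian(120, 6),    # stand-in for NAA
    gaussian(200, 6),    # stand-in for creatine
    gaussian(260, 6),    # stand-in for choline
    gaussian(180, 80),   # broad, measured-style macromolecule component
])

true_conc = np.array([10.0, 8.0, 2.5, 6.0])
spectrum = basis @ true_conc + rng.normal(0, 0.2, n_points)   # synthetic "in vivo" spectrum

# Ordinary least-squares estimate of the component amplitudes ("concentrations")
conc_hat, *_ = np.linalg.lstsq(basis, spectrum, rcond=None)
print(np.round(conc_hat, 2))   # should recover roughly [10, 8, 2.5, 6]
```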