Abstract:
Near infrared spectroscopy (NIRS) is a non-invasive method of estimating the haemoglobin concentration changes in certain tissues. It is frequently used to monitor oxygenation of the brain in neonates. At present it is not clear whether near infrared spectroscopy of other organs (e.g. the liver as a corresponding site in the splanchnic region, which reacts very sensitively to haemodynamic instability) provides reliable values on their tissue oxygenation. The aim of the study was to test near infrared spectroscopy by measuring known physiologic changes in tissue oxygenation of the liver in newborn infants during and after feeding via a naso-gastric tube. The test-retest variability of such measurements was also determined. On 28 occasions in 25 infants we measured the tissue oxygenation index (TOI) of the liver and the brain continuously before, during and 30 minutes after feeding via a gastric tube. Simultaneously we measured arterial oxygen saturation (SaO2), heart rate (HR) and mean arterial blood pressure (MAP). In 10 other newborn infants we performed a test-retest analysis of the liver tissue oxygenation index to estimate the variability in repeated intra-individual measurements. The tissue oxygenation index of the liver increased significantly from 56.7 ± 7.5% before to 60.3 ± 5.6% after feeding (p < 0.005), and remained unchanged for the next 30 minutes. The tissue oxygenation index of the brain (62.1 ± 9.7%), SaO2 (94.4 ± 7.1%), heart rate (145 ± 17.3 min⁻¹) and mean arterial blood pressure (52.8 ± 10.2 mm Hg) did not change significantly. The test-retest variability for intra-individual measurements was 2.7 ± 2.1%. After bolus feeding the tissue oxygenation index of the liver increased as expected. This indicates that near infrared spectroscopy is suitable for monitoring changes in tissue oxygenation of the liver in newborn infants.
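The tissue oxygenation index used above is, in spatially resolved NIRS, the fraction of oxygenated haemoglobin over total haemoglobin. A minimal sketch of that ratio (the haemoglobin values below are illustrative, not from the study):

```python
def tissue_oxygenation_index(o2hb, hhb):
    """TOI (%) = oxygenated haemoglobin / (oxygenated + deoxygenated) * 100.

    o2hb and hhb are the oxy- and deoxyhaemoglobin concentrations in the
    same (arbitrary) units; only their ratio matters.
    """
    return 100.0 * o2hb / (o2hb + hhb)

# Illustrative concentrations chosen to land near the study's pre-feed liver TOI:
print(round(tissue_oxygenation_index(34.0, 26.0), 1))  # 56.7
```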
Abstract:
High-throughput technologies are now used to generate more than one type of data from the same biological samples. To properly integrate such data, we propose using co-modules, which describe coherent patterns across paired data sets, and conceive several modular methods for their identification. We first test these methods using in silico data, demonstrating that the integrative scheme of our Ping-Pong Algorithm uncovers drug-gene associations more accurately when considering noisy or complex data. Second, we provide an extensive comparative study using the gene-expression and drug-response data from the NCI-60 cell lines. Using information from the DrugBank and the Connectivity Map databases we show that the Ping-Pong Algorithm predicts drug-gene associations significantly better than other methods. Co-modules provide insights into possible mechanisms of action for a wide range of drugs and suggest new targets for therapy.
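The integrative idea behind the Ping-Pong Algorithm — alternately projecting scores between the two paired data sets until a coherent gene/drug co-module emerges — can be sketched as a toy iteration. This is an illustration under assumed z-scoring and thresholding rules, not the published algorithm; all names and thresholds are illustrative:

```python
import numpy as np

def ping_pong(E, D, genes0, thresh=1.0, n_iter=20):
    """Toy ping-pong iteration over paired data sets.

    E: genes x samples expression matrix; D: drugs x samples response matrix
    (same sample dimension). A gene score vector is projected to samples,
    then to drugs ("ping"), thresholded, and projected back ("pong").
    """
    g = genes0.astype(float)
    for _ in range(n_iter):
        s = E.T @ g                          # gene scores -> sample scores
        s = (s - s.mean()) / (s.std() + 1e-12)
        d = D @ s                            # sample scores -> drug scores
        d = (d - d.mean()) / (d.std() + 1e-12)
        d = d * (np.abs(d) > thresh)         # keep strongly associated drugs
        s_back = D.T @ d                     # drug scores -> sample scores
        g = E @ s_back                       # sample scores -> gene scores
        norm = np.linalg.norm(g)
        if norm > 0:
            g = g / norm
    return g, d
```

On data with a planted co-module (a block of genes and drugs active in the same samples), the iteration concentrates the gene and drug scores on that block.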
Abstract:
Monitoring of the cannabis situation in Switzerland comprises a set of studies that allows the situation to be followed at the national level and is carried out by a consortium of institutes. This monitoring includes the study presented in this report, the sentinel study, which examines how the cannabis situation is evolving and how it is being managed at the local level. To this end, observations made by professionals in the field across several domains (health/social work, school/vocational training, police/justice) and in four Swiss cantons (St Gallen, Ticino, Vaud, Zurich), referred to as "sentinels", are collected and analysed annually.
Abstract:
The study of natural T cell responses against pathogens or tumors, as well as the assessment of new immunotherapy strategies aimed at boosting these responses, requires increasingly precise ex vivo analysis of blood samples. For practical reasons, studies are often performed using purified PBMC samples, usually cryopreserved. Here, we report on FACS analyses of peripheral blood T cells performed by direct antibody staining of non-purified total blood. For comparison, fresh PBMC purified by Ficoll were analysed. Our results show that Ficoll purification can introduce a bias in subpopulation distribution, in particular of CD8+ T cells, and can sometimes lead to inaccurate measurement of antigen-specific CD8+ T cell responses. Direct analysis of total blood can be applied to longitudinal immuno-monitoring of T cell-based therapy. While the need to purify and cryopreserve PBMC for subsequent studies is obvious, the use of whole blood has the advantage of providing unbiased results while requiring only small amounts of blood.
Abstract:
Introduction: Falls efficacy, defined as confidence in performing activities without falling, is a measure of fear of falling associated with gait impairment, falls and functional decline in frail older people. This relationship has not been well studied in high-functioning older people. Objective: To evaluate the relationship between falls efficacy and gait performance in a cohort of high-functioning older people. Methods: Subjects (N = 864) were a subsample of community-dwelling older people aged 65 to 70 years, enrolled in the "Lc65+" cohort, who completed gait assessment at baseline. Data were collected on demographics and functional, cognitive, affective, and health status. Falls efficacy was assessed using the Falls Efficacy Scale-International (FES-I), which measures confidence in performing 16 activities of daily life (ADL) without falling (score from 16 to 64; a higher score indicates lower confidence). Gait parameters were measured over a 20 m walk at preferred gait speed using Physilog, an ambulatory gait monitoring system. Results: Participants (mean age 68.0 ± 1.4 years, 55.0% women) had excellent physical (92.2% independent in basic ADL, mean gait speed 1.13 ± 0.16 m/sec) and cognitive (98.0% with MMSE ≥24) performance. Nevertheless, 22.1% reported depressive symptoms and 16.1% reported one or more falls in the previous year. Mean FES-I score was 18.8 ± 4.1. Falls efficacy was associated with gait speed (Spearman rho -0.23, P < .001) and gait variability (Spearman rho 0.10, P = .006), measured by the coefficient of variation of stride velocity. These associations remained in multivariate analysis for both gait speed (adjusted β coefficient: -0.008, 95% CI -0.010 to -0.005, P < .001) and gait variability (adjusted β coefficient: 0.024, 95% CI 0.003 to 0.045, P = .023), independent of gender, falls, and functional, affective, cognitive, and frailty (Fried's criteria) status.
On average, compared to subjects with poor confidence in performing one ADL without falling, those with full confidence had a 0.02 m/sec (2%) faster gait speed and a 2% decrease in gait variability. Conclusion: Even in high-functioning older people, poor falls efficacy is associated with reduced gait speed and stability, independent of health, functional, and frailty status. The direction of this relationship needs to be investigated prospectively to determine causality and design interventions to improve gait performance, reduce fall risk, and prevent functional decline.
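The two measures this analysis relies on are both simple to compute: the FES-I total is the sum of 16 items scored 1 to 4, and gait variability is the coefficient of variation of stride velocity. A minimal sketch (the item scores and stride velocities one would pass in are illustrative, not study data):

```python
def fes_i_total(item_scores):
    """FES-I: 16 items scored 1-4; total ranges 16-64, higher = lower confidence."""
    assert len(item_scores) == 16 and all(1 <= s <= 4 for s in item_scores)
    return sum(item_scores)

def cv_percent(values):
    """Coefficient of variation (%) — e.g. of stride velocities over a walk."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    return 100.0 * var ** 0.5 / mean
```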
Abstract:
The root-colonizing bacterium Pseudomonas fluorescens CHA0 was used to construct an oxygen-responsive biosensor. An anaerobically inducible promoter of Pseudomonas aeruginosa, which depends on the FNR (fumarate and nitrate reductase regulation)-like transcriptional regulator ANR (anaerobic regulation of arginine deiminase and nitrate reductase pathways), was fused to the structural lacZ gene of Escherichia coli. By inserting the reporter fusion into the chromosomal attTn7 site of P. fluorescens CHA0 by using a mini-Tn7 transposon, the reporter strain, CHA900, was obtained. Grown in glutamate-yeast extract medium in an oxystat at defined oxygen levels, the biosensor CHA900 responded to a decrease in oxygen concentration from 210 × 10² Pa to 2 × 10² Pa of O₂ by a nearly 100-fold increase in beta-galactosidase activity. Half-maximal induction of the reporter occurred at about 5 × 10² Pa. This dose response closely resembles that found for E. coli promoters which are activated by the FNR protein. In a carbon-free buffer or in bulk soil, the biosensor CHA900 still responded to a decrease in oxygen concentration, although here induction was about 10 times lower and the low oxygen response was gradually lost within 3 days. Introduced into a barley-soil microcosm, the biosensor could report decreasing oxygen concentrations in the rhizosphere for a 6-day period. When the water content in the microcosm was raised from 60% to 85% of field capacity, expression of the reporter gene was elevated about twofold above a basal level after 2 days of incubation, suggesting that a water content of 85% caused mild anoxia. Increased compaction of the soil was shown to have a faster and more dramatic effect on the expression of the oxygen reporter than soil water content alone, indicating that factors other than the water-filled pore space influenced the oxygen status of the soil. 
These experiments illustrate the utility of the biosensor for detecting low oxygen concentrations in the rhizosphere and other soil habitats.
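As a rough illustration of the reported dose response (induction rising as oxygen falls, with half-maximal induction near 5 × 10² Pa), one can write an inhibitory Hill-type curve. The functional form, amplitudes, and Hill coefficient here are assumptions for illustration only, not the authors' model of the ANR-dependent promoter:

```python
def reporter_activity(pO2_pa, a_min=1.0, a_max=100.0, k_pa=5e2, n=1.0):
    """Illustrative beta-galactosidase activity (arbitrary units) vs O2
    partial pressure (Pa): high at low oxygen, repressed at high oxygen.
    a_min, a_max, k_pa (half-maximal point) and n are assumed constants."""
    return a_min + (a_max - a_min) / (1.0 + (pO2_pa / k_pa) ** n)
```

With these assumed constants, activity at 2 × 10² Pa comes out well above activity at 210 × 10² Pa, qualitatively matching the reported induction on de-oxygenation.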
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields. 
The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity as well as the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems in order to allow for a more accurate and realistic quantification of the uncertainty associated with the thus inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigated two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution. 
The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach was proven to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
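The gradual-deformation perturbation is, in its standard form, a trigonometric combination of the current model with an independent realization, m(θ) = m₁ cos θ + m₂ sin θ; because cos²θ + sin²θ = 1, the proposal preserves the Gaussian prior covariance, and θ tunes the perturbation strength. A minimal sketch, assuming this standard formulation (the thesis' exact parameterization may differ):

```python
import numpy as np

def gradual_deformation(m_current, m_independent, theta):
    """Combine two mean-zero Gaussian realizations into a new one.

    For any theta the result has the same covariance (cos^2 + sin^2 = 1);
    small theta yields a small perturbation of m_current, theta = pi/2
    returns the independent realization unchanged.
    """
    return np.cos(theta) * m_current + np.sin(theta) * m_independent
```

In an MCMC chain, one would draw a fresh independent realization each step and accept or reject the combined model with the usual Metropolis rule, using θ to trade acceptance rate against step size.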
Abstract:
In addition to the monographs which were published last year by the working group "Drug Monitoring" of the Swiss Society of Clinical Chemistry (SSCC) [1], new monographs have been written. The aim of these monographs is to give an overview of the most important information necessary for ordering a drug analysis or interpreting the results. The target readership therefore comprises laboratory health professionals and all recipients of laboratory reports. Information is provided on the indication for therapeutic drug monitoring, protein binding, metabolic pathways and enzymes involved, elimination half-life and elimination routes, and on therapeutic or toxic concentrations. Preanalytical considerations are of particular importance for therapeutic drug monitoring. Therefore, information is provided regarding a reasonable timing for the determination of drug concentrations as well as steady-state concentrations after changing the dose. Furthermore, the stability of the drug and its metabolite(s) after blood sampling is described. For readers with a specific interest in drug analysis, references to important publications are given. The collection of monographs will be continuously expanded. The updated files are presented on the homepage of the SSCC (www.sscc.ch).
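Two of the monograph entries — elimination half-life and steady-state timing after a dose change — translate directly into simple arithmetic. A generic sketch of first-order elimination (the numbers are illustrative, not taken from any monograph):

```python
def concentration(c0, t_hours, half_life_hours):
    """Plasma concentration after t hours of first-order elimination:
    the concentration halves once per half-life."""
    return c0 * 0.5 ** (t_hours / half_life_hours)

def time_to_steady_state(half_life_hours, n_half_lives=5):
    """Common rule of thumb: roughly 5 half-lives to approach steady state
    after starting or changing a dose."""
    return n_half_lives * half_life_hours
```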
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches to the scale of a field site represents a major, and as-of-yet largely unresolved, challenge. To address this problem, we have developed a downscaling procedure based on a non-linear Bayesian sequential simulation approach. The main objective of this algorithm is to estimate the value of the sparsely sampled hydraulic conductivity at non-sampled locations based on its relation to the electrical conductivity logged at collocated wells and to surface resistivity measurements, which are available throughout the studied site. The in situ relationship between the hydraulic and electrical conductivities is described through a non-parametric multivariate kernel density function. A stochastic integration of low-resolution, large-scale electrical resistivity tomography (ERT) data in combination with high-resolution, local-scale downhole measurements of the hydraulic and electrical conductivities is then applied. The overall viability of this downscaling approach is tested and validated by comparing flow and transport simulations through the original and the upscaled hydraulic conductivity fields. Our results indicate that the proposed procedure yields remarkably faithful estimates of the regional-scale hydraulic conductivity structure and correspondingly reliable predictions of the transport characteristics over relatively long distances.
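The non-parametric multivariate kernel density linking the two conductivities can be illustrated with a plain Gaussian product kernel over collocated (hydraulic, electrical) sample pairs. This is a toy stand-in with assumed bandwidths, not the paper's estimator:

```python
import math

def kde2(samples, x, y, hx=0.5, hy=0.5):
    """Bivariate Gaussian-kernel density estimate at (x, y).

    samples: collocated pairs, e.g. (log hydraulic conductivity,
    log electrical conductivity); hx, hy: assumed bandwidths.
    """
    total = 0.0
    for sx, sy in samples:
        total += math.exp(-0.5 * (((x - sx) / hx) ** 2 + ((y - sy) / hy) ** 2))
    return total / (len(samples) * 2.0 * math.pi * hx * hy)
```

Given such a density, conditioning on a downscaled electrical conductivity value yields a distribution over the hydraulic conductivity at that location, which is the role the kernel density plays in the simulation chain described above.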
Abstract:
BACKGROUND: The impact of osmotic therapies on brain oxygen has not been extensively studied in humans. We examined the effects on brain tissue oxygen tension (PbtO2) of mannitol and hypertonic saline (HTS) in patients with severe traumatic brain injury (TBI) and refractory intracranial hypertension. METHODS: 12 consecutive patients with severe TBI who underwent intracranial pressure (ICP) and PbtO2 monitoring were studied. Patients were treated with mannitol (25%, 0.75 g/kg) for episodes of elevated ICP (>20 mm Hg) or HTS (7.5%, 250 ml) if ICP was not controlled with mannitol. PbtO2, ICP, mean arterial pressure, cerebral perfusion pressure (CPP), central venous pressure and cardiac output were monitored continuously. RESULTS: 42 episodes of intracranial hypertension, treated with mannitol (n = 28 boluses) or HTS (n = 14 boluses), were analysed. HTS treatment was associated with an increase in PbtO2 (from baseline 28.3 (13.8) mm Hg to 34.9 (18.2) mm Hg at 30 min, 37.0 (17.6) mm Hg at 60 min and 41.4 (17.7) mm Hg at 120 min; all p<0.01) while mannitol did not affect PbtO2 (baseline 30.4 (11.4) vs 28.7 (13.5) vs 28.4 (10.6) vs 27.5 (9.9) mm Hg; all p>0.1). Compared with mannitol, HTS was associated with lower ICP and higher CPP and cardiac output. CONCLUSIONS: In patients with severe TBI and elevated ICP refractory to previous mannitol treatment, 7.5% hypertonic saline administered as second-tier therapy is associated with a significant increase in brain oxygenation, and improved cerebral and systemic haemodynamics.
Abstract:
Aim: People suffering from mental illness are exposed to stigma. However, few tools are available to assess stigmatization as perceived from the patient's perspective. The aim of this study is to adapt and validate a French version of the Stigma Scale (King, 2007). This self-report questionnaire has a three-factor structure: discrimination, disclosure and positive aspects of mental illness. The discrimination subscale refers to perceived negative reactions by others. 
The disclosure subscale refers mainly to managing disclosure to avoid discrimination, and the positive aspects subscale taps into how patients become more accepting of, and more understanding toward, their illness. Method: In a first step, the internal consistency, convergent validity and test-retest reliability of the French adaptation of the 28-item scale were assessed in a sample of 183 patients. Results of confirmatory factor analyses (CFA) did not confirm the hypothesized structure. In light of the failed attempts to validate the original version, an alternative 9-item short-form version of the Stigma Scale, maintaining the integrity of the original model, was developed based on results of exploratory factor analyses in the first sample and cross-validated in a new sample of 234 patients. Results: CFA results did not confirm that the data fitted the three-factor model of the 28-item Stigma Scale well (χ²/df = 2.02, GFI = 0.77, AGFI = 0.73, RMSEA = 0.07, CFI = 0.77 and NNFI = 0.75). Cronbach's α is excellent for the discrimination (0.84) and disclosure (0.83) subscales but poor for potential positive aspects (0.46). External validity is satisfactory. The overall Stigma Scale total score is negatively correlated with the score on Rosenberg's Self-Esteem Scale (r = -0.49), and each subscale is significantly correlated with a visual analogue scale referring to the specific aspect of stigma (0.43 < |r| < 0.60). Intraclass correlation coefficients between 0.68 and 0.89 indicate good test-retest reliability. CFA results demonstrate that the items chosen for the short version of the Stigma Scale have the expected fit properties (χ²/df = 1.02, GFI = 0.98, AGFI = 0.98, RMSEA = 0.01, CFI = 1.0 and NNFI = 1.0). Considering the small number of items (3) in each subscale of the short version of the Stigma Scale, the α coefficients for the discrimination (0.57), disclosure (0.80) and potential positive aspects (0.62) subscales are considered good. 
Conclusion: Our results suggest that the 9-item French short version of the Stigma Scale is a useful, reliable and valid self-report questionnaire for assessing perceived stigmatization in people suffering from mental illness. Completion time is short, and the questions are well understood and accepted by patients.
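The Cronbach's α values quoted above follow the standard formula α = k/(k−1) · (1 − Σ item variances / variance of the total score). A minimal sketch (the item data one would pass in are illustrative, not the study's):

```python
def cronbach_alpha(items):
    """Cronbach's alpha from per-item score lists.

    items: one list per item, all over the same respondents, e.g. the
    3 items of a subscale. Uses population variances throughout.
    """
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1.0 - sum(var(it) for it in items) / var(totals))
```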