872 results for "integration of care"
Abstract:
Hypoglycemia, if recurrent, may have severe consequences for the cognitive and psychomotor development of neonates. Therefore, screening for hypoglycemia is a daily routine in every facility caring for newborn infants. Point-of-care testing (POCT) devices are attractive for neonatal use, as they are easy to handle, measurements can be performed at the bedside, the required blood volume is small, and results are readily available. However, such whole-blood measurements are challenged by the wide variation of hematocrit in neonates and by the fact that normal glucose concentrations lie at the lower end of the test range. We conducted a prospective trial to assess the precision and accuracy of the POCT devices best suited for neonatal use from three leading companies in Europe. Of the three devices tested (Precision Xceed, Abbott; Elite XL, Bayer; Aviva Nano, Roche), the Aviva Nano exhibited the best precision. None completely fulfilled the ISO 15197 accuracy criteria (2003 or 2011 edition). The Aviva Nano fulfilled these criteria in 92% of cases, while the others remained <87%. The Precision Xceed reached the 95% limit of the 2003 ISO criteria for values ≤4.2 mmol/L, but not for the higher range (71%). Although validated for adults, new POCT devices need to be specifically evaluated in newborn infants before their routine use in neonatology is adopted.
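As a rough illustration of how such accuracy percentages are obtained, the sketch below checks paired meter/reference readings against the ISO 15197:2003 limits as commonly stated (within ±0.83 mmol/L for reference values <4.2 mmol/L, within ±20% above that). The threshold and tolerances here are quoted from general knowledge of the standard, not from the abstract itself:

```python
def meets_iso15197_2003(meter, reference):
    """Return the fraction of paired meter readings that fall within the
    ISO 15197:2003 accuracy limits (all values in mmol/L).

    Criterion (2003 edition, as commonly summarized): for reference
    values < 4.2 mmol/L the meter must read within +/-0.83 mmol/L;
    at >= 4.2 mmol/L, within +/-20% of the reference value.
    """
    within = 0
    for m, r in zip(meter, reference):
        if r < 4.2:
            ok = abs(m - r) <= 0.83
        else:
            ok = abs(m - r) <= 0.20 * r
        within += ok
    return within / len(meter)

# The standard requires >= 95% of paired readings to satisfy the limits.
reference = [2.0, 3.5, 4.5, 6.0, 10.0]  # illustrative values only
meter     = [2.5, 3.3, 5.2, 6.5,  9.1]
print(meets_iso15197_2003(meter, reference))  # → 1.0
```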
Abstract:
Integration of kDNA sequences within the genome of the host cell, shown by PCR amplification with primers to the conserved Trypanosoma cruzi kDNA minicircle sequence, was confirmed by Southern hybridization with specific probes. The cells containing the integrated kDNA sequences were then perpetuated as transfected macrophage subclonal lines. The kDNA-transfected macrophages expressed membrane antigens that were recognized by antibodies in a panel of sera from ten patients with chronic Chagas disease. These antigens, barely expressed in the membrane of uninfected control macrophage clonal lines, were recognized neither by antibodies in sera from non-chagasic control subjects nor by those in the chagasic sera. This finding suggests the presence of an autoimmune antibody in the chagasic sera that recognizes auto-antigens in the membrane of T. cruzi kDNA-transfected macrophage subclonal lines.
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields.
The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity as well as the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems in order to allow for a more accurate and realistic quantification of the uncertainty associated with the thus inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigate two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution.
The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach was proven to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
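The gradual-deformation move described above can be sketched for a simple Gaussian random field. The version below is the classic one-parameter deformation (a trigonometric combination of the current realization and an independent one, which preserves the Gaussian statistics); it is a generic illustration, not necessarily the exact scheme developed in the thesis:

```python
import numpy as np

def gradual_deformation_proposal(current, rng, theta=0.3):
    """Propose a new Gaussian realization by gradually deforming `current`.

    m_new = m * cos(theta) + z * sin(theta), with z an independent
    standard-normal realization. Because cos^2 + sin^2 = 1, the proposal
    preserves the zero mean and unit variance of the field, and `theta`
    tunes the perturbation strength (theta -> 0: tiny moves;
    theta = pi/2: independent resampling).
    """
    z = rng.standard_normal(current.shape)
    return current * np.cos(theta) + z * np.sin(theta)

rng = np.random.default_rng(0)
m = rng.standard_normal((50, 50))          # current model realization
m_new = gradual_deformation_proposal(m, rng, theta=0.2)
# A small theta keeps the proposal strongly correlated with the current
# model (correlation close to cos(theta) ~ 0.98 here), which is what makes
# the move gentle enough for efficient MCMC acceptance rates.
print(np.corrcoef(m.ravel(), m_new.ravel())[0, 1])
```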
Abstract:
The MIGCLIM R package is a function library for the open-source R software that enables the incorporation of species-specific dispersal constraints into projections of species distribution models under environmental change and/or landscape fragmentation scenarios. The model is based on a cellular automaton in which the basic modeling unit is a cell that is either inhabited or not. Model parameters include dispersal distance and kernel, long-distance dispersal, barriers to dispersal, propagule production potential, and habitat invasibility. The MIGCLIM R package has been designed to be highly flexible in the parameter values it accepts and to offer good compatibility with existing species distribution modeling software. Possible applications include the projection of future species distributions under environmental change conditions and modeling the spread of invasive species.
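A minimal sketch of one generation of such a cellular automaton is given below. The grid names, neighborhood rule, and colonization probability are illustrative simplifications of the general idea (occupied/empty cells, dispersal distance, habitat suitability), not the MIGCLIM implementation itself:

```python
import numpy as np

def dispersal_step(occupied, suitable, dispersal_dist=1, p_colonize=0.5, rng=None):
    """One generation of a simple cellular-automaton dispersal model.

    occupied : boolean grid of currently inhabited cells
    suitable : boolean grid of habitat deemed suitable (invasible)
    Each suitable, empty cell within `dispersal_dist` (Chebyshev distance)
    of an occupied cell is colonized with probability `p_colonize`.
    Unsuitable cells act as barriers to establishment.
    """
    if rng is None:
        rng = np.random.default_rng()
    rows, cols = occupied.shape
    new = occupied.copy()
    for i in range(rows):
        for j in range(cols):
            if occupied[i, j] or not suitable[i, j]:
                continue  # already inhabited, or habitat not invasible
            i0, i1 = max(0, i - dispersal_dist), min(rows, i + dispersal_dist + 1)
            j0, j1 = max(0, j - dispersal_dist), min(cols, j + dispersal_dist + 1)
            if occupied[i0:i1, j0:j1].any() and rng.random() < p_colonize:
                new[i, j] = True
    return new

occupied = np.zeros((5, 5), dtype=bool)
occupied[2, 2] = True                      # a single founder cell
suitable = np.ones((5, 5), dtype=bool)
grown = dispersal_step(occupied, suitable, p_colonize=1.0)
print(int(grown.sum()))  # → 9  (founder plus its 8 neighbors)
```

Iterating this step under a changing `suitable` grid is what turns a static habitat-suitability projection into a dispersal-constrained range projection.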
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale for the purpose of improving predictions of groundwater flow and solute transport. However, extending corresponding approaches to the regional scale still represents one of the major challenges in the domain of hydrogeophysics. To address this problem, we have developed a regional-scale data integration methodology based on a two-step Bayesian sequential simulation approach. Our objective is to generate high-resolution stochastic realizations of the regional-scale hydraulic conductivity field in the common case where there exist spatially exhaustive but poorly resolved measurements of a related geophysical parameter, as well as highly resolved but spatially sparse collocated measurements of this geophysical parameter and the hydraulic conductivity. To integrate this multi-scale, multi-parameter database, we first link the low- and high-resolution geophysical data via a stochastic downscaling procedure. This is followed by relating the downscaled geophysical data to the high-resolution hydraulic conductivity distribution. After outlining the general methodology of the approach, we demonstrate its application to a realistic synthetic example where we consider as data high-resolution measurements of the hydraulic and electrical conductivities at a small number of borehole locations, as well as spatially exhaustive, low-resolution estimates of the electrical conductivity obtained from surface-based electrical resistivity tomography. The different stochastic realizations of the hydraulic conductivity field obtained using our procedure are validated by comparing their solute transport behaviour with that of the underlying 'true' hydraulic conductivity field.
We find that, even in the presence of strong subsurface heterogeneity, our proposed procedure allows for the generation of faithful representations of the regional-scale hydraulic conductivity structure and reliable predictions of solute transport over long, regional-scale distances.
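The two-step logic (downscale the geophysics, then map it to hydraulic conductivity) can be caricatured in a few lines. Everything below, from the grids to the petrophysical coefficients, is synthetic and merely stands in for the conditional stochastic simulations used in the actual method:

```python
import numpy as np

rng = np.random.default_rng(1)

# --- Synthetic stand-in data (all values illustrative, not from the paper) ---
fine_x = np.linspace(0.0, 100.0, 200)           # fine-scale simulation grid
sigma_true = 1.0 + 0.3 * np.sin(fine_x / 8.0)   # "true" electrical conductivity
coarse_x = fine_x[::20]                         # ERT resolves only a coarse grid
sigma_coarse = np.interp(coarse_x, fine_x, sigma_true)

# Step 1: downscale the coarse geophysical data. Here a deterministic
# interpolation plus a stochastic residual stands in for the Bayesian
# sequential simulation used in the actual methodology.
sigma_fine = np.interp(fine_x, coarse_x, sigma_coarse)
sigma_fine += 0.02 * rng.standard_normal(fine_x.size)

# Step 2: relate the downscaled geophysics to hydraulic conductivity via a
# petrophysical regression that would be calibrated on collocated borehole
# data (assumed log-linear link; the coefficients below are made up).
a, b, noise_sd = 1.5, -6.0, 0.1
log10_K = a * sigma_fine + b + noise_sd * rng.standard_normal(fine_x.size)
K = 10.0 ** log10_K                             # one stochastic K realization

print(K.shape, bool((K > 0).all()))
```

Repeating the two stochastic steps yields an ensemble of hydraulic conductivity realizations whose transport behaviour can then be compared against the reference field, as done in the paper.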
Abstract:
BACKGROUND: Glioblastoma, the most common adult primary malignant brain tumor, confers a poor prognosis (median survival of 15 months) notwithstanding aggressive treatment. Combination chemotherapy including carmustine (BCNU) or temozolomide (TMZ) with the MGMT inhibitor O6-benzylguanine (O6BG) has been used, but has been associated with dose-limiting hematopoietic toxicity. OBJECTIVE: To assess the safety and efficacy of a retroviral vector encoding the O6BG-resistant MGMTP140K gene for transduction and autologous transplantation of hematopoietic stem cells (HSCs) in MGMT-unmethylated, newly diagnosed glioblastoma patients, in an attempt to chemoprotect bone marrow during combination O6BG/TMZ therapy. METHODS: Three patients have been enrolled in the first cohort. Patients underwent standard radiation therapy without TMZ, followed by G-CSF mobilization, apheresis, and conditioning with 600 mg/m2 BCNU prior to infusion of gene-modified cells. Posttransplant, patients were treated with 28-day cycles of single-dose TMZ (472 mg/m2) with 48-hour intravenous O6BG (120 mg/m2 bolus, then 30 mg/m2/d). RESULTS: The BCNU dose was nonmyeloablative, with an ANC <500/µL for ≤3 d and nadir thrombocytopenia of 28,000/µL. Gene marking in pre-infusion colony-forming units (CFUs) was 70.6%, 79.0%, and 74.0% in Patients 1, 2, and 3, respectively, by CFU-PCR. Following engraftment, gene marking in white blood cells and sorted granulocytes ranged between 0.37-0.84 and 0.33-0.83 provirus copies, respectively, by real-time PCR. Posttransplant gene marking in CFUs from CD34-selected cells ranged from 28.5% to 47.4%. Patients have received 4, 3, and 2 cycles of O6BG/TMZ, respectively, with evidence for selection of gene-modified cells. One patient has received a single dose-escalated cycle at 590 mg/m2 TMZ.
No additional extra-hematopoietic toxicity has been observed thus far, and all three patients exhibit stable disease at 7-8 months since diagnosis. CONCLUSIONS: We believe that these data demonstrate the feasibility of achieving significant engraftment of MGMTP140K-modified cells with a well-tolerated dose of BCNU. Further follow-up will determine whether this approach will allow for further dose escalation of TMZ and improved survival.
Abstract:
The overarching purpose of these guidelines is to ensure the safety and protection of patients, staff and visitors by ensuring that dangerous items and hazardous or potentially hazardous substances (including illicit substances, prescribed or over-the-counter medications, dangerous items and alcohol) are not brought into the in-patient setting.
Abstract:
To guarantee the success of a virtual library, it is essential that all users can access all of the library's resources regardless of their location.
Abstract:
An object's motion relative to an observer can convey ethologically meaningful information. Approaching or looming stimuli can signal threats/collisions to be avoided or prey to be confronted, whereas receding stimuli can signal successful escape or failed pursuit. Using movement detection and subjective ratings, we investigated the multisensory integration of looming and receding auditory and visual information by humans. While prior research has demonstrated a perceptual bias for unisensory and, more recently, multisensory looming stimuli, none has investigated whether looming signals are integrated between modalities. Our findings reveal selective integration of multisensory looming stimuli. Performance was significantly enhanced for looming stimuli over all other multisensory conditions. Contrasts with static multisensory conditions indicate that only multisensory looming stimuli resulted in facilitation beyond that induced by the sheer presence of auditory-visual stimuli. Controlling for variation in physical energy replicated the advantage for multisensory looming stimuli. Finally, only looming stimuli exhibited a negative linear relationship between enhancement indices for detection speed and for subjective ratings. Maximal detection speed was attained when motion perception was already robust under unisensory conditions. The preferential integration of multisensory looming stimuli highlights that complex ethologically salient stimuli likely require synergistic cooperation between existing principles of multisensory integration. A new conceptualization of the neurophysiologic mechanisms mediating real-world multisensory perception and action is therefore supported.
Abstract:
BACKGROUND: Measurement of HbA1c is the most important parameter for assessing glycemic control in diabetic patients. Different point-of-care devices for HbA1c are available. The aim of this study was to evaluate two point-of-care testing (POCT) analyzers (DCA Vantage from Siemens and Afinion from Axis-Shield). We studied bias and precision as well as interference from carbamylated hemoglobin. METHODS: Bias of the POCT analyzers was obtained by measuring 53 blood samples from diabetic patients with a wide range of HbA1c, 4%-14% (20-130 mmol/mol), and comparing the results with those obtained by the laboratory method (HPLC HA 8160, Menarini). Precision was assessed by 20 successive determinations of two samples with low, 4.2% (22 mmol/mol), and high, 9.5% (80 mmol/mol), HbA1c values. The possible interference from carbamylated hemoglobin was studied using 25 samples from patients with chronic renal failure. RESULTS: The means of the differences between measurements performed by each POCT analyzer and the laboratory method (95% confidence interval) were 0.28% (p<0.005) (0.10-0.44) for DCA and 0.27% (p<0.001) (0.19-0.35) for Afinion. Correlation coefficients were r=0.973 for DCA and r=0.991 for Afinion. The mean bias observed using samples from chronic renal failure patients was 0.2 (range -0.4 to 0.4) for DCA and 0.2 (-0.2 to 0.5) for Afinion. Imprecision results were CV=3.1% (high HbA1c) and 2.97% (low HbA1c) for DCA, and CV=1.95% (high HbA1c) and 2.66% (low HbA1c) for Afinion. CONCLUSIONS: Both POCT analyzers for HbA1c show good correlation with the laboratory method and acceptable precision.
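The bias and imprecision statistics reported above reduce to simple formulas. The sketch below computes a mean bias from paired POCT/laboratory results and a coefficient of variation (CV%) from replicate measurements; the HbA1c values are made up for illustration:

```python
import statistics

def mean_bias(poct, lab):
    """Mean difference between paired POCT and laboratory HbA1c results
    (in %-units of HbA1c). A positive value means the POCT device reads high."""
    return statistics.mean(p - l for p, l in zip(poct, lab))

def cv_percent(replicates):
    """Imprecision of repeated measurements of one sample, expressed as the
    coefficient of variation: 100 * sample standard deviation / mean."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Illustrative values only (HbA1c in %):
lab  = [5.0, 6.5, 8.0, 9.5]
poct = [5.2, 6.8, 8.3, 9.7]
print(round(mean_bias(poct, lab), 2))                    # → 0.25
print(round(cv_percent([9.5, 9.4, 9.7, 9.6, 9.3]), 2))   # → 1.66
```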