22 results for Point interpolation method


Relevance:

80.00%

Publisher:

Abstract:

The aim of the present study was to retrospectively estimate the absorbed dose to the kidneys in 17 patients treated in clinical practice with 90Y-ibritumomab tiuxetan for non-Hodgkin's lymphoma, using the appropriate dosimetric approaches available. METHODS: The single-view effective point source method, including background subtraction, is used for planar quantification of renal activity. Since the high uptake in the liver affects the activity estimate for the right kidney, the dose to the left kidney serves as a surrogate for the dose to both kidneys. Calculation of absorbed dose is based on the Medical Internal Radiation Dose methodology with adjustment for patient kidney mass. RESULTS: The median dose to the kidneys, based on the left kidney only, is 2.1 mGy/MBq (range, 0.92-4.4), whereas a value of 2.5 mGy/MBq (range, 1.5-4.7) is obtained when considering the activity in both kidneys. CONCLUSIONS: Irrespective of the method, the kidney doses obtained in the present study were about 10 times higher than the median dose of 0.22 mGy/MBq (range, 0.00-0.95) originally reported from the study leading to Food and Drug Administration approval. Our results are in good agreement with kidney-dose estimates recently reported from high-dose myeloablative therapy with 90Y-ibritumomab tiuxetan.
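In MIRD-style dosimetry, the mass adjustment mentioned above reduces to rescaling a reference-phantom S value by the reference-to-patient kidney mass ratio and multiplying by the activity-time integral. A minimal sketch of that arithmetic; the function name and all numeric values are hypothetical illustrations, not values from the study:

```python
def kidney_dose_per_activity(residence_time_h, s_value, m_ref_g, m_patient_g):
    """Absorbed dose per unit administered activity (mGy/MBq).

    residence_time_h : kidney residence time (h) from planar quantification
    s_value          : reference-phantom S value (mGy / (MBq * h))
    m_ref_g, m_patient_g : reference and patient kidney masses (g)

    For a non-penetrating beta emitter such as 90Y, the self-dose S value
    scales approximately inversely with organ mass.
    """
    return residence_time_h * s_value * (m_ref_g / m_patient_g)

# Illustrative numbers only (not taken from the study):
print(kidney_dose_per_activity(residence_time_h=1.5, s_value=0.45,
                               m_ref_g=299.0, m_patient_g=250.0))
```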

Relevance:

80.00%

Publisher:

Abstract:

INTRODUCTION. Reduced cerebral perfusion pressure (CPP) may worsen secondary damage and outcome after severe traumatic brain injury (TBI); however, the optimal management of CPP is still debated. STUDY HYPOTHESIS: We hypothesized that the impact of CPP on outcome is related to the brain tissue oxygen tension (PbtO2) level and that reduced CPP may worsen TBI prognosis when it is associated with brain hypoxia. DESIGN. Retrospective analysis of a prospective database. METHODS. We analyzed 103 patients with severe TBI who underwent continuous PbtO2 and CPP monitoring for an average of 5 days. For each patient, the duration of reduced CPP (<60 mm Hg) and of brain hypoxia (PbtO2 <15 mm Hg for >30 min [1]) was calculated with a linear interpolation method, and the relationship between CPP and PbtO2 was analyzed with Pearson's linear correlation coefficient. Outcome at 30 days was assessed with the Glasgow Outcome Score (GOS), dichotomized as good (GOS 4-5) versus poor (GOS 1-3). Multivariable associations with outcome were analyzed with stepwise forward logistic regression. RESULTS. Reduced CPP (n=790 episodes; mean duration 10.2 ± 12.3 h) was observed in 75 (74%) patients and was frequently associated with brain hypoxia (46/75; 61%). Time during which reduced CPP was associated with normal brain oxygen did not differ significantly between patients with poor and those with good outcome (8.2 ± 8.3 vs. 6.5 ± 9.7 h; P=0.35). In contrast, time during which reduced CPP occurred simultaneously with brain hypoxia was longer in patients with poor than in those with good outcome (3.3 ± 7.4 vs. 0.8 ± 2.3 h; P=0.02). Outcome was significantly worse in patients who had both reduced CPP and brain hypoxia (61% had GOS 1-3 vs. 17% in those with reduced CPP but no brain hypoxia; P<0.01). Patients in whom a positive CPP-PbtO2 correlation (r>0.3) was found were also more likely to have a poor outcome (69 vs. 31% in patients with no CPP-PbtO2 correlation; P<0.01). Brain hypoxia was an independent risk factor for poor prognosis (odds ratio for favorable outcome of 0.89 [95% CI 0.79-1.00] per hour spent with a PbtO2 <15 mm Hg; P=0.05, adjusted for CPP, age, GCS, Marshall CT score and APACHE II). CONCLUSIONS. Low CPP may significantly worsen outcome after severe TBI when it is associated with brain tissue hypoxia. PbtO2-targeted management of CPP may optimize TBI therapy and improve the outcome of head-injured patients.
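Computing the time a monitored signal spends below a threshold by linear interpolation amounts to locating each threshold crossing between consecutive samples and summing the sub-threshold intervals. A minimal sketch with a hypothetical function name and made-up sample values:

```python
import numpy as np

def hours_below(t_h, values, threshold):
    """Total time (h) a monitored signal spends below `threshold`, with
    threshold crossings located by linear interpolation between
    consecutive samples."""
    total = 0.0
    for i in range(len(t_h) - 1):
        t0, t1 = t_h[i], t_h[i + 1]
        v0, v1 = values[i], values[i + 1]
        if v0 < threshold and v1 < threshold:        # whole interval below
            total += t1 - t0
        elif (v0 < threshold) != (v1 < threshold):   # one crossing inside
            tc = t0 + (threshold - v0) / (v1 - v0) * (t1 - t0)
            total += (tc - t0) if v0 < threshold else (t1 - tc)
    return total

# Example: hourly CPP samples (mm Hg) against the 60 mm Hg threshold
t = np.arange(6.0)
cpp = np.array([72.0, 65.0, 55.0, 58.0, 63.0, 70.0])
print(hours_below(t, cpp, 60.0))   # 1.9 h below threshold
```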

Relevance:

40.00%

Publisher:

Abstract:

OBJECTIVE: To assess the iodine status of Swiss population groups and to evaluate the influence of iodized salt as a vector for iodine fortification. DESIGN: The relationship between 24 h urinary iodine and Na excretion was assessed in the general population after correcting for confounders. Single-day intakes were estimated assuming that 92 % of dietary iodine is excreted in 24 h urine. Usual intake distributions were derived for the male and female population groups after adjustment for within-subject variability. The estimated average requirement (EAR) cut-point method was applied as guidance to assess the inadequacy of the iodine supply. SETTING: Public health strategies to reduce dietary salt intake in the general population may affect its iodine supply. SUBJECTS: The study population (1481 volunteers, aged ≥15 years) was randomly selected from three different linguistic regions of Switzerland. RESULTS: The 24 h urine samples from 1420 participants were determined to be properly collected. Mean iodine intakes obtained for men (n 705) and women (n 715) were 179 (sd 68.1) µg/d and 138 (sd 57.8) µg/d, respectively. Urinary Na and Ca excretion and BMI were significantly and positively associated with iodine intake, as were male sex and non-smoking status. Fifty-four per cent of the total iodine intake originated from iodized salt. The prevalence of inadequate iodine intake as estimated by the EAR cut-point method was 2 % for men and 14 % for women. CONCLUSIONS: The estimated prevalence of inadequate iodine intake was within the optimal target range of 2-3 % for men, but not for women.
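Under the EAR cut-point method, the prevalence of inadequate intake is simply the share of the adjusted usual-intake distribution that falls below the estimated average requirement. A minimal sketch; the EAR value and the stand-in intake distribution are assumptions for illustration, not the survey data:

```python
import numpy as np

rng = np.random.default_rng(0)
EAR_UG_PER_DAY = 95.0   # assumed adult EAR for iodine (ug/d)

# stand-in for an adjusted usual-intake distribution (ug/d); the adjusted
# SD is smaller than the single-day SD reported in the abstract
usual_intake = rng.normal(loc=138.0, scale=40.0, size=10_000)

# EAR cut-point: prevalence of inadequacy = fraction of intakes below EAR
prevalence_inadequate = np.mean(usual_intake < EAR_UG_PER_DAY)
print(f"Estimated prevalence of inadequacy: {prevalence_inadequate:.1%}")
```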

Relevance:

30.00%

Publisher:

Abstract:

For many drugs, finding the balance between efficacy and toxicity requires monitoring their concentrations in the patient's blood. Quantifying drug levels at the bedside or at home would have advantages in terms of therapeutic outcome and convenience, but current techniques require the facilities of a diagnostic laboratory. We have developed semisynthetic bioluminescent sensors that permit precise measurements of drug concentrations in patient samples by spotting minimal volumes on paper and recording the signal with a simple point-and-shoot camera. Our sensors have a modular design consisting of a protein-based part and a synthetic part, and they can be engineered to selectively recognize a wide range of drugs, including immunosuppressants, antiepileptics, anticancer agents and antiarrhythmics. This low-cost point-of-care method could make therapies safer, increase convenience for doctors and patients, and make therapeutic drug monitoring available in regions with poor infrastructure.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Iron deficiency is a common and undertreated problem in inflammatory bowel disease (IBD). AIM: To develop an online tool to support treatment choice at the patient-specific level. METHODS: Using the RAND/UCLA Appropriateness Method (RUAM), a European expert panel assessed the appropriateness of treatment regimens for a variety of clinical scenarios in patients with non-anaemic iron deficiency (NAID) and iron deficiency anaemia (IDA). Treatment options included adjustment of IBD medication only, oral iron supplementation, high- or low-dose intravenous (IV) regimens, IV iron plus an erythropoiesis-stimulating agent (ESA), and blood transfusion. The panel process consisted of two individual rating rounds (1148 treatment indications; 9-point scale) and three plenary discussion meetings. RESULTS: The panel reached agreement on 71% of treatment indications. 'No treatment' was never considered appropriate, and repeat treatment after previous failure was generally discouraged. For 98% of scenarios, at least one treatment was appropriate. Adjustment of IBD medication was deemed appropriate in all patients with active disease. Use of oral iron was mainly considered an option in NAID and in mildly anaemic patients without disease activity. IV regimens were often judged appropriate, with high-dose IV iron being the preferred option in 77% of IDA scenarios. Blood transfusion and IV iron plus ESA were indicated in exceptional cases only. CONCLUSIONS: The RUAM revealed high agreement amongst experts on the management of iron deficiency in patients with IBD. High-dose IV iron was more often considered appropriate than other options. To facilitate dissemination of the recommendations, panel outcomes were embedded in an online tool, accessible via http://ferroscope.com/.
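As a rough illustration of how 9-point RUAM panel ratings are turned into an appropriateness category, the sketch below applies the usual tertiles (median 1-3 inappropriate, 4-6 uncertain, 7-9 appropriate) with a simplified disagreement rule; the abstract does not give the panel's exact disagreement definition, so the rule below is an assumption:

```python
import numpy as np

def ruam_category(ratings):
    """Classify a treatment indication from panel ratings on the 9-point
    RAND/UCLA scale. 'Disagreement' (at least a third of ratings in each
    extreme tertile) forces 'uncertain'; this is a simplified rule."""
    r = np.asarray(ratings)
    med = np.median(r)
    disagree = np.mean(r <= 3) >= 1 / 3 and np.mean(r >= 7) >= 1 / 3
    if disagree:
        return "uncertain"
    if med <= 3:
        return "inappropriate"
    return "uncertain" if med <= 6 else "appropriate"

# nine hypothetical panelists rating one scenario
print(ruam_category([8, 9, 7, 8, 6, 9, 8, 7, 9]))   # -> appropriate
```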

Relevance:

30.00%

Publisher:

Abstract:

The use of Geographic Information Systems has revolutionized the handling and visualization of geo-referenced data and has underlined the critical role of spatial analysis. The usual tools for this purpose are geostatistical methods, which are widely used in the Earth sciences. Geostatistics, however, rests on several hypotheses that are not always verified in practice. Artificial Neural Networks (ANNs), on the other hand, can a priori be used without special assumptions and are known to be flexible. This paper discusses the application of ANNs to the interpolation of a geo-referenced variable.
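One common ANN for this kind of interpolation is the General Regression Neural Network, a one-pass kernel network in which each prediction is a Gaussian-weighted average of the training values. The sketch below is a minimal numpy implementation on synthetic data; the kernel width sigma and the toy variable are assumptions, not the paper's setup:

```python
import numpy as np

def grnn_predict(xy_train, z_train, xy_query, sigma):
    """General Regression Neural Network: predictions are Gaussian-kernel
    weighted averages of the training values."""
    d2 = ((xy_query[:, None, :] - xy_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ z_train) / w.sum(axis=1)

# toy geo-referenced variable sampled at scattered points
rng = np.random.default_rng(1)
xy = rng.uniform(0, 10, size=(200, 2))
z = np.sin(xy[:, 0]) + 0.1 * xy[:, 1] + rng.normal(0, 0.05, 200)

grid = np.array([[2.0, 3.0], [7.5, 8.0]])
print(grnn_predict(xy, z, grid, sigma=0.5))
```

The only free parameter is the kernel width, which plays a role comparable to the variogram range in geostatistics.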

Relevance:

30.00%

Publisher:

Abstract:

Coltop3D is a software tool that performs structural analysis using digital elevation models (DEMs) and 3D point clouds acquired with terrestrial laser scanners. A color representation merging slope aspect and slope angle is used to obtain a unique color code for each orientation of a local slope; a continuous planar structure thus appears in a single color. Several tools are included to create stereonets, draw traces of discontinuities, or automatically compute density stereonets. Examples are shown to demonstrate the efficiency of the method.
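The core idea, one unique color per local slope orientation, can be sketched by driving hue with aspect and saturation with slope angle. Coltop3D's actual scheme is based on a Hue-Saturation-Intensity wheel; the particular HSV mapping below is an assumption for illustration only:

```python
import numpy as np
import colorsys

def orientation_color(aspect_deg, slope_deg):
    """Map a local slope orientation to a unique RGB color:
    aspect (0-360 deg) drives hue, slope angle (0-90 deg) drives
    saturation, so flat areas come out white."""
    h = (aspect_deg % 360.0) / 360.0
    s = float(np.clip(slope_deg / 90.0, 0.0, 1.0))
    return colorsys.hsv_to_rgb(h, s, 1.0)

# a 60-degree slope facing southeast
print(orientation_color(aspect_deg=135.0, slope_deg=60.0))
```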

Relevance:

30.00%

Publisher:

Abstract:

In the forensic examination of DNA mixtures, the question of how to set the total number of contributors (N) is a topic of ongoing interest. Part of the discussion gravitates around issues of bias, in particular when assessments of the number of contributors are not made prior to considering the genotypic configuration of potential donors. Further complication may stem from the observation that, in some cases, there may be numbers of contributors that are incompatible with the set of alleles seen in the profile of a mixed crime stain, given the genotype of a potential contributor. In such situations, procedures that take a single, fixed number of contributors as their output can lead to inferential impasses. Assessing the number of contributors within a probabilistic framework can help avoid such complications. Using elements of decision theory, this paper analyses two strategies for inference on the number of contributors. One procedure is deterministic and focuses on the minimum number of contributors required to 'explain' an observed set of alleles. The other procedure is probabilistic, uses Bayes' theorem, and provides a probability distribution over a set of candidate numbers of contributors, based on the set of observed alleles as well as their respective rates of occurrence. The discussion concentrates on mixed stains of varying quality (i.e., different numbers of loci for which genotyping information is available). A so-called qualitative interpretation is pursued, since quantitative information such as peak area and height data is not taken into account. The competing procedures are compared using a standard scoring rule that penalizes the divergence between the agreed value for N (the number of contributors) and the value actually taken by N. Using only modest assumptions, and with reference to a casework example, this paper reports on analyses using simulation techniques and graphical models (i.e., Bayesian networks) to show that setting the number of contributors to a mixed crime stain in probabilistic terms is, under the conditions assumed in this study, preferable to a decision policy that relies on categorical assumptions about N.
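A toy version of the probabilistic procedure follows directly from Bayes' theorem: for each candidate N, estimate the probability of the observed alleles (here reduced to the number of distinct alleles at a single locus) and combine it with a prior. The allele frequencies, the uniform prior and the single-locus summary below are all illustrative assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(2)

def likelihood_distinct(k_obs, n, freqs, sims=20_000):
    """Monte Carlo estimate of P(k_obs distinct alleles | n contributors),
    assuming unrelated contributors and drop-out-free qualitative data."""
    draws = rng.choice(len(freqs), size=(sims, 2 * n), p=freqs)
    k = np.array([len(set(row)) for row in draws])
    return np.mean(k == k_obs)

freqs = np.array([0.4, 0.3, 0.2, 0.1])   # assumed allele frequencies, one locus
k_obs = 4                                # distinct alleles seen in the stain
candidates = [1, 2, 3, 4]
prior = np.full(len(candidates), 1 / len(candidates))   # uniform prior on N

post = np.array([likelihood_distinct(k_obs, n, freqs) for n in candidates]) * prior
post /= post.sum()
for n, p in zip(candidates, post):
    print(f"P(N={n} | alleles) = {p:.3f}")
```

With these numbers, N=1 receives zero posterior mass because two allelic draws cannot show four distinct alleles, mirroring the compatibility constraint discussed above.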

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this paper is to describe the development and to test the reliability of a new method, called INTERMED, for health service needs assessment. The INTERMED integrates the biopsychosocial aspects of disease and the relationship between patient and health care system into a comprehensive scheme, and it reflects an operationalized conceptual approach to case mix or case complexity. The method was developed to enhance interdisciplinary communication between (para)medical specialists and to provide a way of describing case complexity for clinical, scientific, and educational purposes. First, a feasibility study (N = 21 patients) was conducted, which included double scoring and discussion of the results. This led to a version of the instrument on which two interrater reliability studies were performed. In study 1, the INTERMED was double-scored for 14 patients admitted to an internal medicine ward by a psychiatrist and an internist, on the basis of a joint interview conducted by both. In study 2, two clinicians independently scored the INTERMED from the medical charts of 16 patients referred to the outpatient psychiatric consultation service. Averaged over both studies, in 94.2% of all ratings there was no important difference (more than 1 point) between the raters. As a research interview, the INTERMED takes about 20 minutes; as part of the whole process of history taking, it takes about 15 minutes. In both studies, the results suggested improvements. Analyses of study 1 revealed considerable agreement on most items; some items were improved. Also, the reference point for the prognoses was changed so that it reflects both short- and long-term prognoses. Analyses of study 2 showed that less agreement between the raters was obtained in this setting, because the raters were less experienced and the scoring procedure was more susceptible to differences. Some improvements, mainly of the anchor points, were specified, which may further enhance interrater reliability. The INTERMED proves to be a reliable method for classifying patients' care needs, especially when used by experienced raters scoring by patient interview. It can be a useful tool in assessing patients' care needs, as well as the level of needed adjustment between general and mental health service delivery. The INTERMED is easily applicable in the clinical setting at low time cost.
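The reported agreement figure is a simple proportion: the share of ratings on which the two raters differ by no more than one point. A minimal sketch with made-up ratings:

```python
import numpy as np

def within_one_point_agreement(rater_a, rater_b):
    """Share of items on which two raters differ by at most 1 point,
    the agreement criterion reported above (illustrative data only)."""
    a, b = np.asarray(rater_a), np.asarray(rater_b)
    return np.mean(np.abs(a - b) <= 1)

# two hypothetical raters scoring five INTERMED items (0-3 scale)
print(within_one_point_agreement([0, 1, 2, 3, 2], [0, 2, 2, 3, 0]))  # 0.8
```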

Relevance:

30.00%

Publisher:

Abstract:

The application of the Fry method to measuring strain in deformed porphyritic granites is discussed. This method requires that the distribution of markers satisfy at least two conditions: it must be homogeneous and isotropic. Homogeneity can easily be tested with statistics on the point distribution, using a Morishita diagram; isotropy can be checked with a cumulative histogram of the angles between points. Application of these tests to an undeformed (Mte Capanne granite, Elba) and a deformed (Randa orthogneiss, Alps of Switzerland) porphyritic granite reveals that their K-feldspar phenocrysts satisfy both conditions and can be used as strain markers with the Fry method. Other problems are also examined, such as the possible localization of deformation on discrete shear bands. Provided these tests are met, we conclude that the Fry method can be used to estimate strain in deformed porphyritic granites.
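Homogeneity testing with a Morishita diagram rests on the Morisita index of dispersion, computed over quadrat grids of varying size; values near 1 indicate a homogeneous (Poisson-like) pattern. A minimal sketch for a single grid size, on synthetic phenocryst centres (the grid size and data are illustrative):

```python
import numpy as np

def morisita_index(points, n_cells):
    """Morisita index of dispersion on an n_cells x n_cells quadrat grid:
    ~1 for a homogeneous (Poisson) pattern, >1 for a clustered one."""
    q = n_cells * n_cells
    ix = np.clip((points[:, 0] - points[:, 0].min()) / np.ptp(points[:, 0])
                 * n_cells, 0, n_cells - 1e-9).astype(int)
    iy = np.clip((points[:, 1] - points[:, 1].min()) / np.ptp(points[:, 1])
                 * n_cells, 0, n_cells - 1e-9).astype(int)
    counts = np.bincount(ix * n_cells + iy, minlength=q)
    n = counts.sum()
    return q * np.sum(counts * (counts - 1)) / (n * (n - 1))

# synthetic phenocryst centres placed at random: index close to 1
rng = np.random.default_rng(3)
print(morisita_index(rng.uniform(0, 1, size=(500, 2)), n_cells=5))
```

A Morishita diagram simply plots this index against decreasing quadrat size; a flat curve near 1 supports homogeneity.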

Relevance:

30.00%

Publisher:

Abstract:

Objectives: The aim of this study was to compare the specificity and sensitivity of different biological markers that can be used in the forensic field to identify potentially dangerous drivers because of their alcohol habits. Methods: We studied 280 Swiss drivers after driving under the influence of alcohol. 33 were excluded for not having CDT N results; 247 were included (218 men (88%) and 29 women (12%)). Mean age was 42.4 years (SD: 12; min: 20; max: 76). The evaluation of alcohol consumption concerned the month before the CDT test and was categorized after the interview as follows: heavy drinkers (>3 drinks per day): 60 (32.7%); moderate (<3 drinks per day): 127 (51.4%) 114 (46.5%); abstinent: 60 (24.3%) 51 (21%). Alcohol intake was monitored by structured interviews, self-reported drinking habits and the C-Audit questionnaire, as well as information provided by the family and general practitioner. Consumption was quantified in terms of standard drinks, which contain approximately 10 grams of pure alcohol (ref. WHO). Results (comparison between moderate (≤3 drinks per day) and excessive drinkers (>3 drinks per day)):

Marker              ROC area   95% CI        Cut-off   Sens.   Spec.   LR+    LR-
CDT TIA             0.852      0.786-0.917   2.6*      0.93    0.35    1.43   0.192
CDT N latex         0.875      0.821-0.930   2.5*      0.66    0.90    6.93   0.369
Asialo+disialo-tf   0.881      0.826-0.936   1.2*      0.78    0.80    4.07   0.268
                                             1.7°      0.66    0.93    8.9    0.360
GGT                 0.659      0.580-0.737   85*       0.37    0.83    2.14   0.764

* cut-off point suggested by the manufacturer
° cut-off point suggested by our laboratory

Conclusion: With the cut-off point established by the manufacturer, CDT TIA performed poorly in terms of specificity. N latex CDT and CZE CDT performed better, especially if a 1.7 cut-off is used with CZE.
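The likelihood ratios in the table follow directly from sensitivity and specificity; a short sketch makes the relationship explicit (the values below reproduce the CDT TIA row up to rounding in the underlying data):

```python
def likelihood_ratios(sensitivity, specificity):
    """Positive and negative likelihood ratios of a diagnostic marker."""
    lr_pos = sensitivity / (1.0 - specificity)
    lr_neg = (1.0 - sensitivity) / specificity
    return lr_pos, lr_neg

# CDT TIA at the manufacturer's 2.6 cut-off (values from the table above)
print(likelihood_ratios(0.93, 0.35))   # ~ (1.43, 0.20)
```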

Relevance:

30.00%

Publisher:

Abstract:

The present research deals with an important public health threat: the pollution created by radon gas accumulation inside dwellings. The spatial modeling of indoor radon in Switzerland is particularly complex and challenging because of the many influencing factors that must be taken into account. Indoor radon data analysis must be addressed from both a statistical and a spatial point of view. As a multivariate process, it was important at first to define the influence of each factor. In particular, it was important to define the influence of geology, as it is closely associated with indoor radon. This association was indeed observed for the Swiss data, but it was not proven to be the sole determinant for the spatial modeling. The statistical analysis of the data, at both the univariate and multivariate levels, was followed by an exploratory spatial analysis. Many tools proposed in the literature were tested and adapted, including fractality, declustering and moving-window methods. The use of the Quantité Morisita Index (QMI) as a procedure to evaluate data clustering as a function of the radon level was proposed. The existing declustering methods were revised and applied in an attempt to approach the global histogram parameters. The exploratory phase comes along with the definition of multiple scales of interest for indoor radon mapping in Switzerland. The analysis was done with a top-down resolution approach, from regional to local levels, in order to find the appropriate scales for modeling. In this sense, data partitioning was optimized in order to cope with the stationarity conditions of geostatistical models. Common methods of spatial modeling such as K Nearest Neighbors (KNN), variography and General Regression Neural Networks (GRNN) were proposed as exploratory tools. In the following section, different spatial interpolation methods were applied to a particular dataset. A bottom-to-top method-complexity approach was adopted, and the results were analyzed together in order to find common definitions of continuity and neighborhood parameters. Additionally, a data filter based on cross-validation (the CVMF) was tested with the purpose of reducing noise at the local scale. At the end of the chapter, a series of tests of data consistency and method robustness was performed. This led to conclusions about the importance of data splitting and the limitations of generalization methods for reproducing statistical distributions. The last section was dedicated to modeling methods with probabilistic interpretations. Data transformation and simulations thus allowed the use of multi-Gaussian models and helped take the uncertainty of the indoor radon pollution data into consideration. The categorization transform was presented as a solution for modeling extreme values through classification. Simulation scenarios were proposed, including an alternative proposal for the reproduction of the global histogram based on the sampling domain. Sequential Gaussian simulation (SGS) was presented as the method giving the most complete information, while classification performed in a more robust way. An error measure was defined in relation to the decision function for hardening the data classification. Among the classification methods, probabilistic neural networks (PNN) proved better adapted for modeling high-threshold categorization and for automation, whereas support vector machines (SVM) performed well under balanced category conditions.
In general, it was concluded that no particular prediction or estimation method is better under all conditions of scale and neighborhood definition. Simulations should be the basis, while other methods can provide complementary information to support efficient decision making on indoor radon.
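Of the exploratory tools listed above, K Nearest Neighbors is the simplest to sketch: each unsampled location is predicted from an inverse-distance-weighted average of its k nearest measurements. The data, distance metric and k below are illustrative assumptions, not the thesis dataset:

```python
import numpy as np

def knn_interpolate(xy_train, z_train, xy_query, k=5):
    """KNN spatial prediction: inverse-distance-weighted average of the
    k nearest measurements around each query location."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_train[None, :, :], axis=-1)
    idx = np.argsort(d, axis=1)[:, :k]
    dk = np.take_along_axis(d, idx, axis=1)
    w = 1.0 / np.maximum(dk, 1e-12)          # guard against zero distance
    return (w * z_train[idx]).sum(axis=1) / w.sum(axis=1)

# illustrative stand-ins: measurement locations (km) and radon levels (Bq/m3)
rng = np.random.default_rng(4)
xy = rng.uniform(0, 100, size=(300, 2))
z = rng.lognormal(mean=4.0, sigma=0.5, size=300)
print(knn_interpolate(xy, z, np.array([[50.0, 50.0]]), k=10))
```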

Relevance:

30.00%

Publisher:

Abstract:

A new method is used to estimate the sediment volumes of glacial valleys. This method is based on the concept of the sloping local base level and requires only a digital terrain model and the limits of the alluvial valleys as input data. The bedrock surface of the glacial valley is estimated by a progressive excavation of the digital elevation model (DEM) of the filled valley area. This is performed using an iterative routine that replaces the altitude of a point of the DEM by the mean value of its neighbors minus a fixed value. The result is a curved surface, quadratic in 2D. The bedrock surface of the Rhone Valley in Switzerland was estimated by this method using the free Shuttle Radar Topography Mission (SRTM) digital terrain model (~92 m resolution). The results obtained are in good agreement with previous estimations based on seismic profiles and gravimetric modeling, with the exception of some particular locations. The results from the present method and those from the seismic interpretation differ slightly from the results of the gravimetric data. This discrepancy may result from the presence of large buried landslides at the bottom of the Rhone Valley.
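The iterative routine described above can be sketched in a few lines: inside the mapped valley fill, each cell is repeatedly replaced by the mean of its four neighbours minus a fixed tolerance, never rising above the current surface. Grid size, tolerance and the toy valley below are assumptions for illustration, not the Rhone Valley setup:

```python
import numpy as np

def slbl_excavate(dem, mask, tol, n_iter=500):
    """Progressively excavate the DEM inside `mask` (the valley fill):
    each masked cell is lowered to the mean of its 4 neighbours minus
    `tol` whenever that is below its current altitude. The converged
    surface approximates the bedrock (curved, quadratic-like in 2D)."""
    z = dem.astype(float).copy()
    for _ in range(n_iter):
        p = np.pad(z, 1, mode="edge")
        neigh = (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]) / 4.0
        z[mask] = np.minimum(z[mask], neigh[mask] - tol)
    return z

# toy valley: flat fill at 400 m between 1000 m rock flanks
dem = np.full((50, 50), 1000.0)
mask = np.zeros(dem.shape, dtype=bool)
mask[:, 15:35] = True
dem[mask] = 400.0

bedrock = slbl_excavate(dem, mask, tol=0.5)
cell_area_m2 = 92.0 ** 2                      # ~SRTM cell size
print((dem - bedrock).sum() * cell_area_m2)   # sediment volume (m^3)
```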

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a new non-parametric atlas registration framework, derived from the optical flow model and active contour theory, applied to automatic subthalamic nucleus (STN) targeting in deep brain stimulation (DBS) surgery. In a previous work, we demonstrated that the STN position can be predicted from the position of surrounding visible structures, namely the lateral and third ventricles. A STN targeting process can thus be obtained by registering these structures of interest between a brain atlas and the patient image. Here we aim to improve on the results of state-of-the-art targeting methods and, at the same time, to reduce the computational time. Our simultaneous segmentation and registration model shows mean STN localization errors statistically similar to those of the best-performing registration algorithms tested so far and to the targeting experts' variability. Moreover, the computational time of our registration method is much lower, which is a worthwhile improvement from a clinical point of view.
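The optical flow component underlying such non-parametric registration can be illustrated with the classic Horn-Schunck scheme, which estimates a smooth displacement field between two images. This plain sketch omits the active contour coupling described in the paper; alpha, the image sizes and the data are illustrative:

```python
import numpy as np

def horn_schunck(I1, I2, alpha=10.0, n_iter=300):
    """Classic Horn-Schunck optical flow: iteratively solve for a smooth
    displacement field (u, v) minimizing brightness-constancy error plus
    an alpha-weighted smoothness term."""
    Ix, Iy = np.gradient(I1)            # spatial gradients (rows, cols)
    It = I2 - I1                        # temporal difference
    u = np.zeros_like(I1)
    v = np.zeros_like(I1)
    avg = lambda f: (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
                     np.roll(f, 1, 1) + np.roll(f, -1, 1)) / 4.0
    for _ in range(n_iter):
        ub, vb = avg(u), avg(v)
        t = (Ix * ub + Iy * vb + It) / (alpha ** 2 + Ix ** 2 + Iy ** 2)
        u = ub - Ix * t
        v = vb - Iy * t
    return u, v

# recover the flow of a square shifted by one pixel to the right
img = np.zeros((64, 64)); img[24:40, 24:40] = 1.0
u, v = horn_schunck(img, np.roll(img, 1, axis=1))
print(v[24:40, 24:40].mean())   # predominantly positive column-wise flow
```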