974 results for INTERIOR POINT METHOD


Relevance:

30.00%

Publisher:

Abstract:

Catheter-related bloodstream infection (CR-BSI) diagnosis usually involves catheter withdrawal. An alternative method for CR-BSI diagnosis is the differential time to positivity (DTP) between peripheral and catheter hub blood cultures. This study aims to validate the DTP method in short-term catheters. The results show a low prevalence of CR-BSI in the sample (8.4%). The DTP method is a valid alternative for CR-BSI diagnosis in cases with monomicrobial cultures (80% sensitivity, 99% specificity, 92% positive predictive value, and 98% negative predictive value), and a cut-off point of 17.7 hours for positivity of the hub blood culture may assist in CR-BSI diagnosis.
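The decision rule described above can be sketched as follows. Note the assumptions: the 2-hour DTP threshold is the criterion commonly cited in the wider literature (it is not stated in this abstract), and the function and parameter names are illustrative, not the study's.

```python
def dtp_hours(hub_time_to_pos_h, peripheral_time_to_pos_h):
    # differential time to positivity: peripheral culture minus hub culture
    return peripheral_time_to_pos_h - hub_time_to_pos_h

def suggests_crbsi(hub_time_to_pos_h, peripheral_time_to_pos_h,
                   dtp_cutoff_h=2.0, hub_cutoff_h=17.7):
    # CR-BSI is suggested when the hub culture turns positive well before
    # the peripheral one (DTP at or above the assumed 2 h threshold) and
    # within the 17.7 h hub time-to-positivity cut-off reported here
    dtp = dtp_hours(hub_time_to_pos_h, peripheral_time_to_pos_h)
    return dtp >= dtp_cutoff_h and hub_time_to_pos_h <= hub_cutoff_h
```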

Relevance:

30.00%

Publisher:

Abstract:

In the forensic examination of DNA mixtures, the question of how to set the total number of contributors (N) presents a topic of ongoing interest. Part of the discussion gravitates around issues of bias, in particular when assessments of the number of contributors are not made prior to considering the genotypic configuration of potential donors. Further complication may stem from the observation that, in some cases, there may be numbers of contributors that are incompatible with the set of alleles seen in the profile of a mixed crime stain, given the genotype of a potential contributor. In such situations, procedures that take a single, fixed number of contributors as their output can lead to inferential impasses. Assessing the number of contributors within a probabilistic framework can help avoid such complications. Using elements of decision theory, this paper analyses two strategies for inference on the number of contributors. One procedure is deterministic and focuses on the minimum number of contributors required to 'explain' an observed set of alleles. The other procedure is probabilistic, using Bayes' theorem, and provides a probability distribution over a set of numbers of contributors, based on the set of observed alleles as well as their respective rates of occurrence. The discussion concentrates on mixed stains of varying quality (i.e., different numbers of loci for which genotyping information is available). A so-called qualitative interpretation is pursued, since quantitative information such as peak area and height data is not taken into account. The competing procedures are compared using a standard scoring rule that penalizes the degree of divergence between a given agreed value for N, that is, the number of contributors, and the actual value taken by N.
Using only modest assumptions and a discussion with reference to a casework example, this paper reports on analyses using simulation techniques and graphical models (i.e., Bayesian networks) to point out that setting the number of contributors to a mixed crime stain in probabilistic terms is, for the conditions assumed in this study, preferable to a decision policy that uses categorical assumptions about N.
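A minimal simulation sketch of the probabilistic idea, for a single locus with known allele frequencies. This does not reproduce the paper's Bayesian networks or scoring rule; the allele labels, frequencies, uniform prior, and function name are illustrative assumptions.

```python
import random

def posterior_num_contributors(observed, freqs, max_n=4, sims=20000, seed=1):
    # observed: set of distinct alleles seen at one locus
    # freqs: allele -> population frequency
    random.seed(seed)
    alleles, probs = zip(*freqs.items())
    target = frozenset(observed)
    likelihood = {}
    for n in range(1, max_n + 1):
        # estimate P(observed allele set | N = n) by sampling 2n alleles
        hits = sum(frozenset(random.choices(alleles, probs, k=2 * n)) == target
                   for _ in range(sims))
        likelihood[n] = hits / sims
    # with a uniform prior over 1..max_n, the posterior is the
    # normalized likelihood (Bayes' theorem)
    z = sum(likelihood.values())
    return {n: lk / z for n, lk in likelihood.items()}
```

With two equally frequent alleles both observed, larger N explains the observation more often, so the posterior shifts away from N = 1.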

Relevance:

30.00%

Publisher:

Abstract:

We construct and analyze non-overlapping Schwarz methods for a preconditioned weakly over-penalized symmetric interior penalty (WOPSIP) method for elliptic problems.

Relevance:

30.00%

Publisher:

Abstract:

The objective of traffic engineering is to optimize network resource utilization. Although several works have been published about minimizing network resource utilization, few have focused on LSR (label switched router) label space. This paper proposes an algorithm that takes advantage of the MPLS label stack features in order to reduce the number of labels used in LSPs. Some tunnelling methods and their MPLS implementation drawbacks are also discussed. The described algorithm sets up NHLFE (next hop label forwarding entry) tables in each LSR, creating asymmetric tunnels when possible. Experimental results show that the described algorithm achieves a large reduction factor in the label space. The presented work applies to both connection types: P2MP (point-to-multipoint) and P2P (point-to-point).
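As a toy illustration of why label stacking saves label space (this is a deliberate simplification, not the paper's algorithm): without stacking, every LSP consumes its own label on each hop of a shared path segment, while a tunnel over that segment needs only one outer label per hop plus one inner label per LSP.

```python
def labels_without_stacking(num_lsps: int, shared_hops: int) -> int:
    # each LSP needs its own label on every hop of the shared segment
    return num_lsps * shared_hops

def labels_with_stacking(num_lsps: int, shared_hops: int) -> int:
    # one outer (tunnel) label per hop, plus one inner label per LSP
    return shared_hops + num_lsps
```

For 100 LSPs sharing a 10-hop segment, the toy model drops the label count from 1000 to 110; the real gain depends on topology and on where asymmetric tunnels can be established.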

Relevance:

30.00%

Publisher:

Abstract:

As stated in Aitchison (1986), a proper study of relative variation in a compositional data set should be based on logratios, and dealing with logratios excludes dealing with zeros. Nevertheless, it is clear that zero observations might be present in real data sets, either because the corresponding part is completely absent (essential zeros) or because it is below the detection limit (rounded zeros). Because the second kind of zeros is usually understood as "a trace too small to measure", it seems reasonable to replace them by a suitable small value, and this has been the traditional approach. As stated, e.g., by Tauber (1999) and by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000), the principal problem in compositional data analysis is related to rounded zeros. One should be careful to use a replacement strategy that does not seriously distort the general structure of the data. In particular, the covariance structure of the involved parts, and thus the metric properties, should be preserved, as otherwise further analysis on subpopulations could be misleading. Following this point of view, a non-parametric imputation method is introduced in Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000). This method is analyzed in depth by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2003), where it is shown that the theoretical drawbacks of the additive zero replacement method proposed in Aitchison (1986) can be overcome using a new multiplicative approach on the non-zero parts of a composition. The new approach has reasonable properties from a compositional point of view. In particular, it is "natural" in the sense that it recovers the "true" composition if replacement values are identical to the missing values, and it is coherent with the basic operations on the simplex. This coherence implies that the covariance structure of subcompositions with no zeros is preserved.
As a generalization of the multiplicative replacement, in the same paper a substitution method for missing values in compositional data sets is introduced.
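The multiplicative replacement can be sketched as follows, assuming a composition closed to 1 and a single imputation value δ for all rounded zeros (the function name and default δ are ours):

```python
import numpy as np

def multiplicative_replacement(x, delta=0.005):
    # x: composition summing to 1; zeros are replaced by delta, and the
    # non-zero parts are rescaled multiplicatively so the total stays 1
    x = np.asarray(x, dtype=float)
    zero = x == 0
    return np.where(zero, delta, x * (1 - delta * zero.sum()))
```

Because every non-zero part is multiplied by the same factor, ratios among the non-zero parts are unchanged, which is the key property: the covariance structure of zero-free subcompositions is preserved.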

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND Measurement of HbA1c is the most important parameter to assess glycemic control in diabetic patients. Different point-of-care devices for HbA1c are available. The aim of this study was to evaluate two point-of-care testing (POCT) analyzers (DCA Vantage from Siemens and Afinion from Axis-Shield). We studied bias and precision as well as interference from carbamylated hemoglobin. METHODS Bias of the POCT analyzers was obtained by measuring 53 blood samples from diabetic patients with a wide range of HbA1c, 4%-14% (20-130 mmol/mol), and comparing the results with those obtained by the laboratory method (HPLC HA 8160, Menarini). Precision was assessed by 20 successive determinations of two samples with low, 4.2% (22 mmol/mol), and high, 9.5% (80 mmol/mol), HbA1c values. The possible interference from carbamylated hemoglobin was studied using 25 samples from patients with chronic renal failure. RESULTS The means of the differences between measurements performed by each POCT analyzer and the laboratory method (95% confidence interval) were 0.28% (0.10-0.44; p<0.005) for DCA and 0.27% (0.19-0.35; p<0.001) for Afinion. Correlation coefficients were r=0.973 for DCA and r=0.991 for Afinion. The mean bias observed using samples from chronic renal failure patients was 0.2% (range -0.4 to 0.4) for DCA and 0.2% (range -0.2 to 0.5) for Afinion. Imprecision results were CV=3.1% (high HbA1c) and 2.97% (low HbA1c) for DCA, and CV=1.95% (high HbA1c) and 2.66% (low HbA1c) for Afinion. CONCLUSIONS Both POCT analyzers for HbA1c show good correlation with the laboratory method and acceptable precision.
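The two performance measures reported above are standard and can be computed as follows (a sketch with hypothetical helper names, not the study's code):

```python
import statistics

def mean_bias(poct_values, lab_values):
    # mean of paired differences (POCT result minus laboratory result)
    return statistics.mean(p - l for p, l in zip(poct_values, lab_values))

def cv_percent(repeated_measurements):
    # imprecision as coefficient of variation: 100 * SD / mean
    return (100 * statistics.stdev(repeated_measurements)
            / statistics.mean(repeated_measurements))
```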

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this paper is to describe the development and to test the reliability of a new method called INTERMED for health service needs assessment. The INTERMED integrates the biopsychosocial aspects of disease and the relationship between patient and health care system in a comprehensive scheme and reflects an operationalized conceptual approach to case mix or case complexity. The method was developed to enhance interdisciplinary communication between (para)medical specialists and to provide a method to describe case complexity for clinical, scientific, and educational purposes. First, a feasibility study (N = 21 patients) was conducted, which included double scoring and discussion of the results. This led to a version of the instrument on which two interrater reliability studies were performed. In study 1, the INTERMED was double scored for 14 patients admitted to an internal ward by a psychiatrist and an internist on the basis of a joint interview conducted by both. In study 2, on the basis of medical charts, two clinicians separately double scored the INTERMED in 16 patients referred to the outpatient psychiatric consultation service. Averaged over both studies, in 94.2% of all ratings there was no important difference between the raters (defined as more than a 1-point difference). As a research interview, it takes about 20 minutes; as part of the whole process of history taking, it takes about 15 minutes. In both studies, improvements were suggested by the results. Analyses of study 1 revealed that on most items there was considerable agreement; some items were improved. Also, the reference point for the prognoses was changed so that it reflected both short- and long-term prognoses. Analyses of study 2 showed that in this setting less agreement between the raters was obtained, because the raters were less experienced and the scoring procedure was more susceptible to differences.
Some improvements, mainly of the anchor points, were specified which may further enhance interrater reliability. The INTERMED proves to be a reliable method for classifying patients' care needs, especially when used by experienced raters scoring by patient interview. It can be a useful tool in assessing patients' care needs, as well as the level of needed adjustment between general and mental health service delivery. The INTERMED is easily applicable in the clinical setting at a low time cost.
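The reported interrater statistic (the percentage of paired ratings differing by at most one point) can be sketched as follows; this is an illustrative helper, not part of INTERMED itself:

```python
def agreement_within(rater_a, rater_b, tolerance=1):
    # percentage of paired item ratings whose absolute difference
    # does not exceed the tolerance (here: 1 point)
    ok = sum(abs(a - b) <= tolerance for a, b in zip(rater_a, rater_b))
    return 100 * ok / len(rater_a)
```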

Relevance:

30.00%

Publisher:

Abstract:

The application of the Fry method to measure strain in deformed porphyritic granites is discussed. The method requires the distribution of markers to satisfy at least two conditions: it must be homogeneous and isotropic. Homogeneity can easily be tested with statistics on the point distribution using a Morishita diagram. Isotropy can be checked with a cumulative histogram of the angles between points. Application of these tests to an undeformed (Mte Capanne granite, Elba) and a deformed (Randa orthogneiss, Alps of Switzerland) porphyritic granite reveals that their K-feldspar phenocrysts satisfy both conditions and can be used as strain markers with the Fry method. Other problems are also examined, such as the possible distribution of deformation on discrete shear bands. Provided these tests are met, we conclude that the Fry method can be used to estimate strain in deformed porphyritic granites. (c) 2006 Elsevier Ltd. All rights reserved.
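The homogeneity check can be sketched with Morishita's dispersion index over q quadrats, I = q Σ nᵢ(nᵢ − 1) / (N(N − 1)) with N = Σ nᵢ. This is the standard formulation, assumed here since the abstract only names the diagram; values near 1 indicate a random pattern, below 1 a more even (homogeneous) one, above 1 clustering.

```python
def morishita_index(quadrat_counts):
    # quadrat_counts: number of marker points falling in each quadrat
    q = len(quadrat_counts)
    n_total = sum(quadrat_counts)
    pairs = sum(n * (n - 1) for n in quadrat_counts)  # co-occurring pairs
    return q * pairs / (n_total * (n_total - 1))
```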

Relevance:

30.00%

Publisher:

Abstract:

Objectives: The aim of this study was to compare the specificity and sensitivity of different biological markers that can be used in the forensic field to identify drivers who are potentially dangerous because of their alcohol habits.
Methods: We studied 280 Swiss drivers stopped for driving under the influence of alcohol. 33 were excluded for not having CDT N results; 247 were included (218 men (88%) and 29 women (12%)). Mean age was 42.4 (SD: 12, min: 20, max: 76). The evaluation of alcohol consumption concerned the month before the CDT test and was classified as follows after the interview: heavy drinkers (>3 drinks per day): 60 (32.7%); moderate (<3 drinks per day): 127 (51.4%) 114 (46.5%); abstinent: 60 (24.3%) 51 (21%). Alcohol intake was monitored by structured interviews, self-reported drinking habits, and the C-Audit questionnaire, as well as information provided by the drivers' families and general practitioners. Consumption was quantified in terms of standard drinks, which contain approximately 10 grams of pure alcohol (ref. WHO).
Results (comparison between moderate drinkers, at most 3 drinks per day, and excessive drinkers, more than 3 per day):

Marker             ROC area   95% CI        Cut-off   Sens.   LR+    Spec.   LR-
CDT TIA            0.852      0.786-0.917   2.6*      0.93    1.43   0.35    0.192
CDT N latex        0.875      0.821-0.930   2.5*      0.66    6.93   0.90    0.369
Asialo+disialo-tf  0.881      0.826-0.936   1.2*      0.78    4.07   0.80    0.268
                                            1.7°      0.66    8.90   0.93    0.360
GGT                0.659      0.580-0.737   85*       0.37    2.14   0.83    0.764

* cut-off point suggested by the manufacturer
° cut-off point suggested by our laboratory

Conclusion: With the cut-off point established by the manufacturer, CDT TIA performed poorly in terms of specificity. N latex CDT and CZE CDT performed better, especially if a 1.7 cut-off is used with CZE.
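The likelihood ratios in the table follow directly from sensitivity and specificity via the standard definitions (the test values below reproduce the CDT TIA row; small discrepancies with the table reflect rounding):

```python
def likelihood_ratios(sensitivity, specificity):
    # LR+ = sens / (1 - spec): how much a positive result raises the odds
    # LR- = (1 - sens) / spec: how much a negative result lowers the odds
    return sensitivity / (1 - specificity), (1 - sensitivity) / specificity
```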

Relevance:

30.00%

Publisher:

Abstract:

One of the most important statistical tools for monitoring and analyzing the short-term evolution of economic activity is the availability of estimates of the quarterly evolution of the components of GDP, on both the supply side and the demand side. The need for this information with a short time lag makes it essential to use temporal disaggregation methods that break annual information down to quarterly frequency. The most widely applied method, since it solves this problem very elegantly under a statistical optimal-estimator approach, is the Chow-Lin method. But this method does not guarantee that the quarterly GDP estimates obtained from the supply side and from the demand side coincide, making the subsequent application of some reconciliation method necessary. This paper develops a multivariate extension of the Chow-Lin method that solves the problem of estimating the quarterly values optimally, subject to a set of constraints. One of the potential applications of this method, which we have called the restricted Chow-Lin method, is precisely the joint estimation of quarterly values for each of the components of GDP on both the demand and the supply side, conditional on the two quarterly GDP estimates being equal, thus avoiding the need to subsequently apply reconciliation methods.
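A minimal sketch of unrestricted Chow-Lin-style disaggregation, simplified to white-noise residuals (the full method uses AR(1) GLS, and the restricted multivariate extension described above is not reproduced here). Quarterly values are fitted from indicators, and the annual residual is distributed across quarters so the aggregation constraint holds exactly:

```python
import numpy as np

def chow_lin_white_noise(y_annual, X_quarterly):
    # y_annual: (n,) annual series; X_quarterly: (4n, k) quarterly indicators
    n = len(y_annual)
    C = np.kron(np.eye(n), np.ones((1, 4)))   # annual aggregation matrix
    CX = C @ X_quarterly
    # regression on the aggregated indicators (OLS = GLS when residuals
    # are white noise)
    beta, *_ = np.linalg.lstsq(CX, y_annual, rcond=None)
    resid = y_annual - CX @ beta              # annual residuals
    # distribute residuals so that C @ y_q equals y_annual exactly
    return X_quarterly @ beta + C.T @ np.linalg.solve(C @ C.T, resid)
```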


Relevance:

30.00%

Publisher:

Abstract:

Iowa has the same problem that confronts most states in the United States: many bridges constructed more than 20 years ago either have deteriorated to the point that they are inadequate for original design loads or have been rendered inadequate by changes in design/maintenance standards or design loads. Inadequate bridges require either strengthening or posting for reduced loads. A sizeable number of single-span, composite concrete deck - steel I-beam bridges in Iowa currently cannot be rated to carry today's design loads. Various methods for strengthening the unsafe bridges have been proposed and some methods have been tried. No method appears to be as economical and promising as strengthening by post-tensioning of the steel beams. At the time this research study was begun, the feasibility of post-tensioning existing composite bridges was unknown. As one would expect, the design of a bridge-strengthening scheme utilizing post-tensioning is quite complex. The design involves composite construction stressed in an abnormal manner (possible tension in the deck slab), consideration of different sizes of exterior and interior beams, cover-plated beams already designed for maximum moment at midspan and at plate cut-off points, complex live load distribution, and distribution of post-tensioning forces and moments among the bridge beams. Although information is available on many of these topics, there is minimal information on several of them and no information available on the total design problem. This study, therefore, is an effort to gather some of the missing information, primarily through testing a half-size bridge model, and thus to determine the feasibility of strengthening composite bridges by post-tensioning. Based on the results of this study, the authors anticipate that a second phase of the study will be undertaken and directed toward strengthening of one or more prototype bridges in Iowa.

Relevance:

30.00%

Publisher:

Abstract:

A new method is used to estimate the volumes of sediments of glacial valleys. This method is based on the concept of the sloping local base level and requires only a digital terrain model and the limits of the alluvial valleys as input data. The bedrock surface of the glacial valley is estimated by a progressive excavation of the digital elevation model (DEM) of the filled valley area. This is performed using an iterative routine that replaces the altitude of a point of the DEM by the mean value of its neighbors minus a fixed value. The result is a curved surface, quadratic in 2D. The bedrock surface of the Rhone Valley in Switzerland was estimated by this method using the free Shuttle Radar Topography Mission (SRTM) digital terrain model (~92 m resolution). The results obtained are in good agreement with the previous estimations based on seismic profiles and gravimetric modeling, with the exception of some particular locations. The results from the present method and those from the seismic interpretation differ slightly from the results of the gravimetric data. This discrepancy may result from the presence of large buried landslides at the bottom of the Rhone Valley.
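The iterative excavation routine can be sketched as follows; this is a toy version on a regular grid, and the neighborhood, tolerance, and iteration count are illustrative assumptions:

```python
import numpy as np

def slbl_bedrock(dem, fill_mask, tol=1.0, iterations=200):
    # dem: 2D elevation grid; fill_mask: True over the alluvial valley area
    z = dem.astype(float).copy()
    for _ in range(iterations):
        # mean of the four direct neighbours, lowered by a fixed tolerance
        m = (np.roll(z, 1, 0) + np.roll(z, -1, 0) +
             np.roll(z, 1, 1) + np.roll(z, -1, 1)) / 4.0 - tol
        # only lower points inside the valley, never raise them
        z[fill_mask] = np.minimum(z[fill_mask], m[fill_mask])
    return z  # estimated bedrock; fill volume = (dem - z).sum() * cell_area
```

Cells outside the mask anchor the surface at the valley limits, so repeated averaging carves a smooth, bowl-shaped bedrock estimate under the fill.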

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a new non-parametric atlas registration framework, derived from the optical flow model and active contour theory, applied to automatic subthalamic nucleus (STN) targeting in deep brain stimulation (DBS) surgery. In a previous work, we demonstrated that the STN position can be predicted based on the position of surrounding visible structures, namely the lateral and third ventricles. An STN targeting process can thus be obtained by registering these structures of interest between a brain atlas and the patient image. Here we aim to improve on the results of state-of-the-art targeting methods and at the same time reduce the computational time. Our simultaneous segmentation and registration model shows mean STN localization errors statistically similar to those of the best-performing registration algorithms tested so far and to the targeting expert's variability. Moreover, the computational time of our registration method is much lower, which is a worthwhile improvement from a clinical point of view.

Relevance:

30.00%

Publisher:

Abstract:

Global positioning systems (GPS) offer a cost-effective and efficient method to input and update transportation data. The spatial location of objects provided by GPS is easily integrated into geographic information systems (GIS). The storage, manipulation, and analysis of spatial data are also relatively simple in a GIS. However, many data storage and reporting methods at transportation agencies rely on linear referencing methods (LRMs); consequently, GPS data must be able to link with linear referencing. Unfortunately, the two systems are fundamentally incompatible in the way data are collected, integrated, and manipulated. In order for the spatial data collected using GPS to be integrated into a linear referencing system or shared among LRMs, a number of issues need to be addressed. This report documents and evaluates several of those issues and offers recommendations. In order to evaluate the issues associated with integrating GPS data with an LRM, a pilot study was created. To perform the pilot study, point features, a linear datum, and a spatial representation of an LRM were created for six test roadway segments located within the boundaries of the pilot study conducted by the Iowa Department of Transportation linear referencing system project team. Various issues in integrating point features with an LRM or between LRMs are discussed and recommendations provided. The accuracy of GPS is discussed, including issues such as point features mapping to the wrong segment. Another topic is the loss of spatial information that occurs when a three-dimensional or two-dimensional spatial point feature is converted to a one-dimensional representation on an LRM. Recommendations such as storing point features as spatial objects if necessary, or preserving information such as coordinates and elevation, are suggested. The lack of spatial accuracy characteristic of most cartography, on which LRMs are often based, is another topic discussed.
The associated issues include linear and horizontal offset error. The final topic discussed is some of the issues in transferring point feature data between LRMs.
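Converting a GPS point to a linear reference amounts to projecting it onto the route geometry and recording the distance along the route (the measure) plus the perpendicular offset. A planar sketch (coordinate handling and names are illustrative, not the report's system):

```python
import math

def locate_on_route(polyline, point):
    # polyline: list of (x, y) vertices of the route; point: GPS (x, y)
    # returns (measure along route, perpendicular offset) of nearest location
    px, py = point
    best_measure, best_offset = 0.0, float('inf')
    travelled = 0.0
    for (x1, y1), (x2, y2) in zip(polyline, polyline[1:]):
        dx, dy = x2 - x1, y2 - y1
        seg = math.hypot(dx, dy)
        if seg == 0.0:
            continue
        # clamp the projection parameter so it stays on the segment
        t = max(0.0, min(1.0, ((px - x1) * dx + (py - y1) * dy) / seg ** 2))
        cx, cy = x1 + t * dx, y1 + t * dy
        off = math.hypot(px - cx, py - cy)
        if off < best_offset:
            best_measure, best_offset = travelled + t * seg, off
        travelled += seg
    return best_measure, best_offset
```

Note that the result is one-dimensional: elevation and the original coordinates are discarded, which is exactly the information loss the report recommends guarding against by storing them alongside the linear reference.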