967 results for Point interpolation method


Relevance:

30.00%

Publisher:

Abstract:

Coltop3D is a software application that performs structural analysis using digital elevation models (DEMs) and 3D point clouds acquired with terrestrial laser scanners. A color representation merging slope aspect and slope angle is used to obtain a unique color code for each orientation of a local slope; a continuous planar structure therefore appears in a single color. Several tools are included to create stereonets, draw traces of discontinuities, or automatically compute density stereonets. Examples are shown to demonstrate the efficiency of the method.
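The aspect-plus-angle color coding described above can be sketched as follows. The HSV mapping and the function name are illustrative assumptions, not Coltop3D's actual scheme: aspect drives hue and slope angle drives saturation, so every orientation gets its own color.

```python
import math
import colorsys

def slope_color(dzdx, dzdy):
    """Map a local slope's aspect and angle to a unique RGB color.

    Aspect (compass direction of steepest gradient) drives the hue;
    slope angle drives the saturation. A flat cell comes out white, and
    every (aspect, angle) pair gets its own color, so a continuous
    planar structure shows up in a single uniform color.
    Illustrative scheme only, not Coltop3D's exact mapping.
    """
    angle = math.atan(math.hypot(dzdx, dzdy))          # slope angle, 0..pi/2
    aspect = math.atan2(dzdy, -dzdx) % (2 * math.pi)   # aspect, 0..2*pi
    hue = aspect / (2 * math.pi)                       # normalize to 0..1
    sat = angle / (math.pi / 2)                        # steeper -> more saturated
    return colorsys.hsv_to_rgb(hue, sat, 1.0)

# Two DEM cells lying on the same plane get the same color:
c1 = slope_color(0.3, 0.4)
c2 = slope_color(0.3, 0.4)
```

Because the mapping is a function of orientation alone, rendering each DEM cell with its `slope_color` makes planar discontinuity surfaces visually pop out as constant-color patches.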

Relevance:

30.00%

Publisher:

Abstract:

Research project carried out during a stay at the University of Calgary, Canada, between December 2007 and February 2008. The project consisted of analyzing data from a study in the psychology of music, specifically on how music influences attention through the pathways of a person's emotional and energetic states. The research used video recordings of the sessions, yielding visual and auditory data to complement the quantitative data from the results of attention tests administered to participants. The analysis followed qualitative methods and techniques learned during the stay. The project also deepened the understanding of the qualitative paradigm as a valid paradigm, genuinely complementary to the quantitative one. It focused especially on conversation analysis from an interpretive standpoint, as well as on the analysis of body and facial language from video observation, formulating descriptors and subdescriptors of the behavior related to the hypothesis. Some descriptors had been formulated prior to the analysis, based on other studies and the researcher's background; others emerged during the analysis. The behavioral descriptors and subdescriptors relate to the intensity of the emotional and energetic states of the different participants. The analysis was conducted as a case study, examining each person exhaustively with the aim of finding intrapersonal and interpersonal reaction patterns. The observed patterns will be contrasted with the quantitative information, triangulating the data to find possible mutual support or contradictions.
Preliminary results indicate a relationship between the type of music and behavior: music with negative emotionality is associated with the person's withdrawal, whereas with energetic music participants become activated (as observed behaviorally) and smile if the music is positive.

Relevance:

30.00%

Publisher:

Abstract:

Catheter-related bloodstream infection (CR-BSI) diagnosis usually involves catheter withdrawal. An alternative method for CR-BSI diagnosis is the differential time to positivity (DTP) between peripheral and catheter-hub blood cultures. This study aims to validate the DTP method in short-term catheters. The results show a low prevalence of CR-BSI in the sample (8.4%). The DTP method is a valid alternative for CR-BSI diagnosis in cases with monomicrobial cultures (80% sensitivity, 99% specificity, 92% positive predictive value, and 98% negative predictive value), and a cut-off point of 17.7 hours for positivity of the hub blood culture may assist in CR-BSI diagnosis.
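As a minimal sketch, the DTP decision rule can be expressed as a predicate. The 2-hour DTP threshold and the AND-combination of the two criteria are assumptions for illustration (the 2-hour cut-off is the commonly cited one, not stated in the abstract); 17.7 h is the hub time-to-positivity cut-off reported above. This is not a clinical tool.

```python
def crbsi_suspected(hub_ttp_h, peripheral_ttp_h,
                    dtp_cutoff_h=2.0, hub_ttp_cutoff_h=17.7):
    """Flag a possible catheter-related bloodstream infection (CR-BSI).

    DTP rule: infection is suspected when the catheter-hub culture
    turns positive sufficiently earlier than the peripheral culture.
    The 2-hour DTP cut-off is an assumed (commonly cited) value; the
    17.7-hour hub time-to-positivity cut-off is the one reported in
    the study. Both criteria are combined here for illustration only.
    """
    dtp = peripheral_ttp_h - hub_ttp_h          # hub earlier -> positive DTP
    return dtp >= dtp_cutoff_h and hub_ttp_h <= hub_ttp_cutoff_h

print(crbsi_suspected(hub_ttp_h=12.0, peripheral_ttp_h=16.5))  # True
print(crbsi_suspected(hub_ttp_h=20.0, peripheral_ttp_h=21.0))  # False
```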

Relevance:

30.00%

Publisher:

Abstract:

In the forensic examination of DNA mixtures, the question of how to set the total number of contributors (N) is a topic of ongoing interest. Part of the discussion gravitates around issues of bias, in particular when assessments of the number of contributors are not made prior to considering the genotypic configuration of potential donors. A further complication may stem from the observation that, in some cases, certain numbers of contributors are incompatible with the set of alleles seen in the profile of a mixed crime stain, given the genotype of a potential contributor. In such situations, procedures that take a single, fixed number of contributors as their output can lead to inferential impasses. Assessing the number of contributors within a probabilistic framework can help avoid such complications. Using elements of decision theory, this paper analyses two strategies for inference on the number of contributors. One procedure is deterministic and focuses on the minimum number of contributors required to 'explain' an observed set of alleles. The other procedure is probabilistic, using Bayes' theorem, and provides a probability distribution over a set of numbers of contributors, based on the set of observed alleles as well as their respective rates of occurrence. The discussion concentrates on mixed stains of varying quality (i.e., different numbers of loci for which genotyping information is available). A so-called qualitative interpretation is pursued, since quantitative information such as peak area and height data is not taken into account. The competing procedures are compared using a standard scoring rule that penalizes the degree of divergence between a given agreed value for N, the number of contributors, and the actual value taken by N.
Using only modest assumptions and a discussion with reference to a casework example, this paper reports on analyses using simulation techniques and graphical models (i.e., Bayesian networks) to point out that setting the number of contributors to a mixed crime stain in probabilistic terms is, for the conditions assumed in this study, preferable to a decision policy that uses categorical assumptions about N.
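A toy version of the probabilistic strategy can be sketched for a single locus. The uniform prior, the allele frequencies, and the exact-coverage likelihood below are illustrative assumptions (Hardy-Weinberg, independent draws), not the paper's model; note how N = 1 gets posterior probability zero because two allele draws cannot cover three observed alleles, which is exactly the kind of incompatibility a fixed-N procedure stumbles on.

```python
from itertools import combinations

def lik_exact_alleles(freqs, observed, n_contributors):
    """P(the 2N sampled alleles are exactly the observed set | N).

    Assumes one locus, Hardy-Weinberg, and 2N independent allele draws
    from the population frequencies `freqs` (dict allele -> frequency).
    Uses inclusion-exclusion over subsets of the observed allele set.
    Simplified qualitative model in the spirit of the paper.
    """
    n_draws = 2 * n_contributors
    obs = list(observed)
    total = 0.0
    for k in range(len(obs) + 1):
        for sub in combinations(obs, k):
            p = sum(freqs[a] for a in sub)
            total += (-1) ** (len(obs) - k) * p ** n_draws
    return max(total, 0.0)   # clamp tiny negative rounding error

def posterior_n(freqs, observed, prior):
    """Posterior over N via Bayes' theorem; `prior` is dict N -> P(N)."""
    unnorm = {n: prior[n] * lik_exact_alleles(freqs, observed, n)
              for n in prior}
    z = sum(unnorm.values())
    return {n: v / z for n, v in unnorm.items()}

# Hypothetical allele frequencies and a three-allele mixed profile:
freqs = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}
post = posterior_n(freqs, {"a", "b", "c"}, prior={1: 1/3, 2: 1/3, 3: 1/3})
```

The deterministic alternative discussed in the paper would simply return the minimum compatible N (here 2), discarding the relative support for larger values that the posterior retains.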

Relevance:

30.00%

Publisher:

Abstract:

The objective of traffic engineering is to optimize network resource utilization. Although several works have been published on minimizing network resource utilization, few have focused on the LSR (label switched router) label space. This paper proposes an algorithm that takes advantage of the MPLS label stack features to reduce the number of labels used in LSPs. Some tunnelling methods and the drawbacks of their MPLS implementations are also discussed. The described algorithm sets up NHLFE (next hop label forwarding entry) tables in each LSR, creating asymmetric tunnels when possible. Experimental results show that the described algorithm achieves a large reduction factor in the label space. The presented work applies to both connection types: P2MP (point-to-multipoint) and P2P (point-to-point).

Relevance:

30.00%

Publisher:

Abstract:

As stated in Aitchison (1986), a proper study of relative variation in a compositional data set should be based on logratios, and dealing with logratios excludes dealing with zeros. Nevertheless, it is clear that zero observations might be present in real data sets, either because the corresponding part is completely absent (essential zeros) or because it is below the detection limit (rounded zeros). Because the second kind of zeros is usually understood as "a trace too small to measure", it seems reasonable to replace them by a suitable small value, and this has been the traditional approach. As stated, e.g., by Tauber (1999) and by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000), the principal problem in compositional data analysis is related to rounded zeros. One should be careful to use a replacement strategy that does not seriously distort the general structure of the data. In particular, the covariance structure of the involved parts (and thus the metric properties) should be preserved, as otherwise further analysis on subpopulations could be misleading. Following this point of view, a non-parametric imputation method is introduced in Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000). This method is analyzed in depth by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2003), where it is shown that the theoretical drawbacks of the additive zero replacement method proposed in Aitchison (1986) can be overcome using a new multiplicative approach on the non-zero parts of a composition. The new approach has reasonable properties from a compositional point of view. In particular, it is "natural" in the sense that it recovers the "true" composition if replacement values are identical to the missing values, and it is coherent with the basic operations on the simplex. This coherence implies that the covariance structure of subcompositions with no zeros is preserved.
As a generalization of the multiplicative replacement, in the same paper a substitution method for missing values in compositional data sets is introduced.
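The multiplicative replacement has a compact closed form: each rounded zero becomes a small imputed value delta, and every non-zero part is shrunk by the same factor so the closure total is preserved. A minimal sketch (the delta value in the example is an illustrative choice, e.g. a fraction of the detection limit):

```python
def multiplicative_replacement(x, delta, total=1.0):
    """Multiplicative rounded-zero replacement (Martín-Fernández et al., 2003).

    Each zero part is imputed with `delta`, and the non-zero parts are
    rescaled multiplicatively so the composition still sums to `total`.
    Unlike the additive replacement, ratios between non-zero parts are
    preserved, so the covariance structure of zero-free subcompositions
    is untouched.
    """
    n_zeros = sum(1 for v in x if v == 0)
    shrink = 1.0 - n_zeros * delta / total   # common factor for non-zero parts
    return [delta if v == 0 else v * shrink for v in x]

x = [0.6, 0.3, 0.0, 0.1]                 # closed to 1, one rounded zero
r = multiplicative_replacement(x, delta=0.005)
```

After replacement the composition still sums to 1, and the ratio between any two non-zero parts (e.g. 0.6/0.3 = 2) is exactly preserved, which is the property the abstract highlights.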

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND Measurement of HbA1c is the most important parameter for assessing glycemic control in diabetic patients. Different point-of-care devices for HbA1c are available. The aim of this study was to evaluate two point-of-care testing (POCT) analyzers (DCA Vantage from Siemens and Afinion from Axis-Shield). We studied bias and precision as well as interference from carbamylated hemoglobin. METHODS Bias of the POCT analyzers was obtained by measuring 53 blood samples from diabetic patients with a wide range of HbA1c, 4%-14% (20-130 mmol/mol), and comparing the results with those obtained by the laboratory method (HPLC HA 8160, Menarini). Precision was assessed by 20 successive determinations of two samples with low, 4.2% (22 mmol/mol), and high, 9.5% (80 mmol/mol), HbA1c values. The possible interference from carbamylated hemoglobin was studied using 25 samples from patients with chronic renal failure. RESULTS The means of the differences between measurements performed by each POCT analyzer and the laboratory method (95% confidence interval) were 0.28% (p<0.005) (0.10-0.44) for DCA and 0.27% (p<0.001) (0.19-0.35) for Afinion. Correlation coefficients were r=0.973 for DCA and r=0.991 for Afinion. The mean bias observed using samples from chronic renal failure patients was 0.2 (range -0.4 to 0.4) for DCA and 0.2 (-0.2 to 0.5) for Afinion. Imprecision results were CV=3.1% (high HbA1c) and 2.97% (low HbA1c) for DCA, and CV=1.95% (high HbA1c) and 2.66% (low HbA1c) for Afinion. CONCLUSIONS Both POCT analyzers for HbA1c show good correlation with the laboratory method and acceptable precision.
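The two figures of merit used above, bias against the laboratory method and imprecision as a coefficient of variation, are straightforward to compute. A minimal sketch; the numeric values in the example are made up for illustration, not the study's data.

```python
from statistics import mean, stdev

def bias(poct, lab):
    """Mean paired difference, POCT result minus laboratory result."""
    return mean(p - l for p, l in zip(poct, lab))

def cv_percent(replicates):
    """Imprecision as coefficient of variation: 100 * SD / mean."""
    return 100.0 * stdev(replicates) / mean(replicates)

# Hypothetical paired HbA1c results (%) for three samples:
poct = [5.1, 6.4, 7.2]
lab = [5.0, 6.1, 7.0]
print(round(bias(poct, lab), 2))   # 0.2
```

In the study the bias is reported with a confidence interval and a paired test; the sketch covers only the point estimates.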

Relevance:

30.00%

Publisher:

Abstract:

Realistic rendering of animation is known to be an expensive processing task when physically based global illumination methods are used to improve illumination details. This paper presents an acceleration technique for computing animations in radiosity environments. The technique is based on an interpolation approach that exploits temporal coherence in radiosity. A fast global Monte Carlo pre-processing step is added to the computation of the animated sequence to select important frames; these are fully computed and used as a base for interpolating the whole sequence. The approach is completely view-independent: once the illumination is computed, it can be visualized by any animated camera. Results show significant speed-ups, indicating that the technique could be an interesting alternative to deterministic methods for computing non-interactive radiosity animations for moderately complex scenarios.
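The interpolation step between fully computed key frames can be sketched per patch. The linear blend and the function name are illustrative assumptions (the paper's interpolation scheme may differ); the key-frame selection itself is driven by the global Monte Carlo pre-pass, which is not reproduced here.

```python
def lerp_frames(b0, b1, t0, t1, t):
    """Per-patch linear interpolation of radiosity between key frames.

    b0, b1: per-patch radiosity at key times t0 and t1 (the fully
    computed frames); t: an in-between time. Because radiosity is
    view-independent, the interpolated solution can afterwards be
    rendered from any animated camera. Illustrative blend only.
    """
    w = (t - t0) / (t1 - t0)                  # 0 at t0, 1 at t1
    return [(1 - w) * a + w * b for a, b in zip(b0, b1)]

# Two patches, halfway between key frames:
mid = lerp_frames([1.0, 2.0], [3.0, 6.0], t0=0.0, t1=1.0, t=0.5)  # [2.0, 4.0]
```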

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this paper is to describe the development, and to test the reliability, of a new method for health service needs assessment called INTERMED. The INTERMED integrates the biopsychosocial aspects of disease and the relationship between patient and health care system in a comprehensive scheme, and reflects an operationalized conceptual approach to case mix or case complexity. The method was developed to enhance interdisciplinary communication between (para)medical specialists and to provide a way to describe case complexity for clinical, scientific, and educational purposes. First, a feasibility study (N = 21 patients) was conducted, which included double scoring and discussion of the results. This led to a version of the instrument on which two interrater reliability studies were performed. In study 1, the INTERMED was double scored for 14 patients admitted to an internal ward by a psychiatrist and an internist on the basis of a joint interview conducted by both. In study 2, on the basis of medical charts, two clinicians separately double scored the INTERMED for 16 patients referred to the outpatient psychiatric consultation service. Averaged over both studies, in 94.2% of all ratings there was no important difference (more than 1 point) between the raters. As a research interview, the INTERMED takes about 20 minutes; as part of the whole process of history taking, about 15 minutes. In both studies, improvements were suggested by the results. Analyses of study 1 revealed considerable agreement on most items; some items were improved. Also, the reference point for the prognoses was changed so that it reflects both short- and long-term prognoses. Analyses of study 2 showed that in this setting less agreement between the raters was obtained, because the raters were less experienced and the scoring procedure was more susceptible to differences.
Some improvements, mainly of the anchor points, were specified, which may further enhance interrater reliability. The INTERMED proves to be a reliable method for classifying patients' care needs, especially when used by experienced raters scoring by patient interview. It can be a useful tool for assessing patients' care needs, as well as the level of needed adjustment between general and mental health service delivery. The INTERMED is easily applicable in the clinical setting at low time cost.

Relevance:

30.00%

Publisher:

Abstract:

The application of the Fry method to measuring strain in deformed porphyritic granites is discussed. The method requires that the distribution of markers satisfy at least two conditions: it must be homogeneous and isotropic. Homogeneity can easily be tested with statistics on the point distribution, using a Morishita diagram; isotropy can be checked with a cumulative histogram of angles between points. Application of these tests to an undeformed (Mte Capanne granite, Elba) and a deformed (Randa orthogneiss, Alps of Switzerland) porphyritic granite reveals that their K-feldspar phenocrysts satisfy both conditions and can be used as strain markers with the Fry method. Other problems are also examined, such as the possible localization of deformation on discrete shear bands. Provided several tests are met, we conclude that the Fry method can be used to estimate strain in deformed porphyritic granites. (c) 2006 Elsevier Ltd. All rights reserved.
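Homogeneity testing with a Morishita diagram amounts to computing the Morishita index of dispersion over a range of quadrat sizes. A minimal 2-D sketch; the grid indexing and the toy point patterns are illustrative assumptions:

```python
def morishita_index(points, bbox, q):
    """Morishita index of dispersion for a 2-D point pattern.

    Splits the bounding box into q x q quadrats and computes
    I = Q * sum n_i (n_i - 1) / (N (N - 1)), with Q = q * q.
    I stays near 1 across quadrat sizes for a homogeneous
    (Poisson-like) pattern, while I > 1 signals clustering; the
    Morishita diagram plots I against quadrat size. Sketch only.
    """
    xmin, ymin, xmax, ymax = bbox
    counts = [0] * (q * q)
    for x, y in points:
        i = min(int((x - xmin) / (xmax - xmin) * q), q - 1)
        j = min(int((y - ymin) / (ymax - ymin) * q), q - 1)
        counts[j * q + i] += 1
    n_total = len(points)
    s = sum(n * (n - 1) for n in counts)
    return q * q * s / (n_total * (n_total - 1))

# A regular 4x4 grid of markers: one point per quadrat, I = 0 (dispersed).
pts = [(i + 0.5, j + 0.5) for i in range(4) for j in range(4)]
print(morishita_index(pts, (0, 0, 4, 4), q=4))  # 0.0
```

Recomputing the index for several values of `q` and plotting it against quadrat size reproduces the Morishita diagram used to screen phenocryst distributions before applying the Fry method.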

Relevance:

30.00%

Publisher:

Abstract:

Objectives: The aim of this study was to compare the specificity and sensitivity of different biological markers that can be used in a forensic setting to identify drivers who are potentially dangerous because of their alcohol habits. Methods: We studied 280 Swiss drivers stopped after driving under the influence of alcohol. 33 were excluded for not having CDT N results; 247 were included (218 men (88%) and 29 women (12%)). Mean age was 42.4 (SD: 12, min: 20, max: 76). The evaluation of alcohol consumption concerned the month before the CDT test and was classified after the interview as: heavy drinkers (>3 drinks per day): 60 (32.7%); moderate (<3 drinks per day): 127 (51.4%) 114 (46.5%); abstinent: 60 (24.3%) 51 (21%). Alcohol intake was monitored by structured interviews, self-reported drinking habits, and the C-Audit questionnaire, as well as information provided by the driver's family and general practitioner. Consumption was quantified in standard drinks, which contain approximately 10 grams of pure alcohol (ref. WHO). Results (comparison between moderate, up to 3 drinks per day, and excessive, more than 3 drinks per day, drinkers):

Marker             ROC area  95% CI       Cut-off  Sens  LR+   Spec  LR-
CDT TIA            0.852     0.786-0.917  2.6*     0.93  1.43  0.35  0.192
CDT N latex        0.875     0.821-0.930  2.5*     0.66  6.93  0.90  0.369
Asialo+disialo-tf  0.881     0.826-0.936  1.2*     0.78  4.07  0.80  0.268
                                          1.7°     0.66  8.9   0.93  0.360
GGT                0.659     0.580-0.737  85*      0.37  2.14  0.83  0.764

* cut-off point suggested by the manufacturer
° cut-off point suggested by our laboratory

Conclusion: With the cut-off point established by the manufacturer, CDT TIA performed poorly in terms of specificity. N latex CDT and CZE CDT performed better, especially if a 1.7 cut-off is used with CZE.
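Given per-group marker values and a cut-off, the quantities in the table (sensitivity, specificity, and the likelihood ratios) follow directly. A sketch; the CDT values below are hypothetical, not the study's data.

```python
def sens_spec(values_pos, values_neg, cutoff):
    """Sensitivity, specificity, and likelihood ratios at a cut-off.

    values_pos: marker values in true excessive drinkers;
    values_neg: values in moderate/abstinent drinkers.
    A value >= cutoff counts as test-positive.
    LR+ = sens / (1 - spec); LR- = (1 - sens) / spec.
    """
    sens = sum(v >= cutoff for v in values_pos) / len(values_pos)
    spec = sum(v < cutoff for v in values_neg) / len(values_neg)
    lr_pos = sens / (1 - spec) if spec < 1 else float("inf")
    lr_neg = (1 - sens) / spec if spec > 0 else float("inf")
    return sens, spec, lr_pos, lr_neg

pos = [1.9, 2.4, 3.1, 1.6]   # hypothetical CDT values, heavy drinkers
neg = [0.8, 1.1, 1.5, 2.0]   # hypothetical values, moderate drinkers
print(sens_spec(pos, neg, cutoff=1.7))
```

Sweeping the cut-off and plotting sensitivity against (1 - specificity) yields the ROC curve whose area is reported in the table.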

Relevance:

30.00%

Publisher:

Abstract:

This letter discusses the detection and correction of residual motion errors that appear in airborne synthetic aperture radar (SAR) interferograms due to the lack of precision in the navigation system. As shown, the effect of this lack of precision is twofold: azimuth registration errors and azimuth phase undulations. Up to now, the former was corrected by estimating the registration error and interpolating, while the latter was corrected by estimating the azimuth phase undulations to compensate the phase of the computed interferogram. In this letter, a new correction method is proposed that avoids the interpolation step and corrects the azimuth phase undulations at the same time. Additionally, the spectral diversity technique, used to estimate registration errors, is critically analyzed. Airborne L-band repeat-pass interferometric data from the German Aerospace Center (DLR) experimental airborne SAR is used to validate the method.

Relevance:

30.00%

Publisher:

Abstract:

The graphical representation of spatial soil properties in a digital environment is complex because it requires converting data collected in discrete form onto a continuous surface. The objective of this study was to apply three-dimensional interpolation and visualization techniques to soil texture and fertility properties and to establish relationships with pedogenetic factors and processes in a slope area. The GRASS geographic information system was used to generate three-dimensional models, and the ParaView software to visualize soil volumes. Samples of the A, AB, BA, and B horizons were collected in a regular 122-point grid in a 13 ha area in Pinhais, PR, southern Brazil. Geoprocessing and graphic computing techniques were effective in identifying and delimiting soil volumes of distinct ranges of fertility properties confined within the soil matrix. Both the three-dimensional interpolation and the visualization tool facilitated interpretation, in a continuous space (volumes), of the cause-effect relationships between soil texture and fertility properties and pedological factors and processes, such as higher clay contents following the drainage lines of the area. The flattest part, with more weathered soils (Oxisols), had the highest pH values and lower Al3+ concentrations. These techniques of data interpolation and visualization have great potential for use in diverse areas of soil science, such as identification of soil volumes occurring side by side but exhibiting different physical, chemical, and mineralogical conditions for plant root growth, and monitoring of plumes of organic and inorganic pollutants in soils and sediments, among other applications. The methodological details for interpolation and a three-dimensional view of soil data are presented here.
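The discrete-to-continuous step can be illustrated with a simple 3-D inverse-distance-weighted interpolator. This is a stand-in sketch: the study itself used GRASS GIS tooling, and IDW is only one possible choice of interpolation method.

```python
def idw3d(samples, query, power=2.0):
    """Inverse-distance-weighted interpolation of a soil property in 3-D.

    samples: list of ((x, y, z), value) pairs from the sampling grid;
    query: an (x, y, z) point inside the soil volume. Each sample is
    weighted by 1 / distance**power, so nearby observations dominate.
    Illustrative stand-in for the study's interpolation step.
    """
    num = den = 0.0
    for (x, y, z), v in samples:
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2 + (z - query[2]) ** 2
        if d2 == 0.0:
            return v                       # query coincides with a sample
        w = 1.0 / d2 ** (power / 2)
        num += w * v
        den += w
    return num / den

# Two samples of a soil property (e.g. clay content) along a line:
samples = [((0, 0, 0), 10.0), ((1, 0, 0), 20.0)]
print(idw3d(samples, (0.5, 0, 0)))  # 15.0
```

Evaluating the interpolator on a regular 3-D grid of query points produces the continuous volume that tools such as ParaView can then render.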

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we study the dual space and reiteration theorems for the real method of interpolation for infinite families of Banach spaces introduced in [2]. We also give examples of interpolation spaces constructed with this method.
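For orientation, the classical two-space real method, which the family construction of [2] generalizes, is built from the Peetre K-functional (a standard definition, sketched here only as background):

K(t, x; A_0, A_1) = \inf \{ \|x_0\|_{A_0} + t \|x_1\|_{A_1} : x = x_0 + x_1,\ x_i \in A_i \},

and the interpolation space (A_0, A_1)_{\theta, q} consists of those x for which

\|x\|_{\theta, q} = \left( \int_0^\infty \left( t^{-\theta} K(t, x; A_0, A_1) \right)^q \frac{dt}{t} \right)^{1/q}

is finite, with 0 < \theta < 1 and 1 \le q \le \infty. The method studied in the paper replaces the pair (A_0, A_1) by an infinite family of Banach spaces.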

Relevance:

30.00%

Publisher:

Abstract:

The present research deals with an important public health threat: the pollution created by radon gas accumulation inside dwellings. The spatial modeling of indoor radon in Switzerland is particularly complex and challenging because of the many influencing factors that should be taken into account. Indoor radon data analysis must be addressed from both a statistical and a spatial point of view. As a multivariate process, it was important at first to define the influence of each factor, and in particular of geology, as it is closely associated with indoor radon. This association was indeed observed for the Swiss data, but was not proved to be the sole determinant for the spatial modeling. The statistical analysis of the data, at both univariate and multivariate level, was followed by an exploratory spatial analysis. Many tools proposed in the literature were tested and adapted, including fractality, declustering, and moving-window methods. The use of the Quantité Morisita Index (QMI) was proposed as a procedure to evaluate data clustering as a function of the radon level. The existing declustering methods were revised and applied in an attempt to approach the global histogram parameters. The exploratory phase comes along with the definition of multiple scales of interest for indoor radon mapping in Switzerland. The analysis was done with a top-down resolution approach, from regional to local levels, in order to find the appropriate scales for modeling. In this sense, data partition was optimized to cope with the stationarity conditions of geostatistical models. Common spatial modeling methods such as K Nearest Neighbors (KNN), variography, and General Regression Neural Networks (GRNN) were proposed as exploratory tools. In the following section, different spatial interpolation methods were applied to a particular dataset.
A bottom-to-top method-complexity approach was adopted, and the results were analyzed together in order to find common definitions of continuity and neighborhood parameters. Additionally, a data filter based on cross-validation (the CVMF) was tested with the purpose of reducing noise at local scale. At the end of the chapter, a series of tests of data consistency and method robustness was performed. This led to conclusions about the importance of data splitting and the limitations of generalization methods for reproducing statistical distributions. The last section was dedicated to modeling methods with probabilistic interpretations. Data transformation and simulations allowed the use of multi-Gaussian models and helped take the uncertainty of the indoor radon pollution data into consideration. The categorization transform was presented as a solution for modeling extreme values through classification. Simulation scenarios were proposed, including an alternative proposal for reproducing the global histogram based on the sampling domain. Sequential Gaussian simulation (SGS) was presented as the method giving the most complete information, while classification performed more robustly. An error measure was defined in relation to the decision function for hardening the data classification. Among the classification methods, probabilistic neural networks (PNN) proved better adapted to modeling high-threshold categorization and to automation. Support vector machines (SVM), on the contrary, performed well under balanced category conditions. In general, it was concluded that no single prediction or estimation method is better under all conditions of scale and neighborhood definition. Simulations should be the basis, while other methods can provide complementary information for efficient indoor radon decision making.
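Of the exploratory tools mentioned above, cell declustering is the easiest to sketch: points in densely sampled cells are down-weighted so the weighted histogram better approximates the global one. A minimal 2-D version of the standard geostatistical recipe; the cell size and toy points are illustrative.

```python
from collections import Counter

def cell_declustering_weights(points, cell_size, origin=(0.0, 0.0)):
    """Cell-declustering weights for a preferentially sampled 2-D dataset.

    Each point gets weight 1 / (n_occupied_cells * n_points_in_its_cell),
    so clustered points share their cell's weight and the weights sum
    to 1. Weighted statistics (mean, histogram) then counteract the
    preferential sampling. Standard recipe, shown as a sketch.
    """
    cell_of = [(int((x - origin[0]) // cell_size),
                int((y - origin[1]) // cell_size)) for x, y in points]
    counts = Counter(cell_of)
    n_cells = len(counts)
    return [1.0 / (n_cells * counts[c]) for c in cell_of]

pts = [(0.1, 0.1), (0.2, 0.2), (0.3, 0.1), (5.0, 5.0)]   # 3 clustered + 1 lone
w = cell_declustering_weights(pts, cell_size=1.0)
```

Here the three clustered points each receive weight 1/6 while the isolated point receives 1/2, so the declustered mean `sum(wi * zi)` is no longer dominated by the oversampled cluster. In practice the cell size is varied and chosen where the declustered mean stabilizes.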