859 results for Fuzzy c-means algorithm


Relevance:

30.00%

Publisher:

Abstract:

Postpartum hemorrhage (PPH) is one of the main causes of maternal death even in industrialized countries. It represents an emergency that necessitates a rapid decision and, in particular, an exact diagnosis and root cause analysis in order to initiate the correct therapeutic measures in interdisciplinary cooperation. In addition to established guidelines, the benefits of standardized therapy algorithms have been demonstrated. A therapy algorithm for the obstetric emergency of postpartum hemorrhage was, however, not yet available in German. The establishment of an international (Germany, Austria and Switzerland, D-A-CH) "treatment algorithm for postpartum hemorrhage" was an interdisciplinary project based on the guidelines of the corresponding specialist societies (anesthesia and intensive care medicine, and obstetrics) in the three countries, as well as on comparable international algorithms for the therapy of PPH. Obstetrics and anesthesiology personnel must possess sufficient expertise for emergency situations despite low case numbers. The rarity with which the individual practitioner encounters PPH and the life-threatening nature of the situation necessitate a structured approach following predetermined treatment algorithms, which the established algorithm provides. Furthermore, the algorithm presents an opportunity to train for emergency situations in an interdisciplinary team.

Relevance:

30.00%

Publisher:

Abstract:

The International Surface Temperature Initiative (ISTI) is striving towards substantively improving our ability to robustly understand historical land surface air temperature change at all scales. A key recently completed first step has been collating all available records into a comprehensive open access, traceable and version-controlled databank. The crucial next step is to maximise the value of the collated data through a robust international framework of benchmarking and assessment for product intercomparison and uncertainty estimation. We focus on uncertainties arising from the presence of inhomogeneities in monthly mean land surface temperature data and the varied methodological choices made by various groups in building homogeneous temperature products. The central facet of the benchmarking process is the creation of global-scale synthetic analogues to the real-world database where both the "true" series and inhomogeneities are known (a luxury the real-world data do not afford us). Hence, algorithmic strengths and weaknesses can be meaningfully quantified and conditional inferences made about the real-world climate system. Here we discuss the necessary framework for developing an international homogenisation benchmarking system on the global scale for monthly mean temperatures. The value of this framework is critically dependent upon the number of groups taking part and so we strongly advocate involvement in the benchmarking exercise from as many data analyst groups as possible to make the best use of this substantial effort.
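The benchmarking idea above can be sketched in miniature: build a synthetic series whose "true" inhomogeneity is known, run a detector over it, and score the result, which is exactly what real-world data do not allow. This is an illustrative toy with invented parameter values, not ISTI benchmark code, and the detector is a deliberately naive mean-shift locator:

```python
# Toy sketch of homogenisation benchmarking (invented parameters, not ISTI code):
# a synthetic monthly-mean series with one known step inhomogeneity, a naive
# change-point detector, and a quantified detection error.
import numpy as np

rng = np.random.default_rng(0)

def make_synthetic(n_months=600, break_at=350, step=-0.8):
    """Homogeneous noise series plus one known step inhomogeneity."""
    clean = 0.3 * rng.standard_normal(n_months)
    observed = clean.copy()
    observed[break_at:] += step            # the known "true" inhomogeneity
    return observed, clean

def detect_break(series, margin=24):
    """Naive detector: the split point maximizing the two-sided mean difference."""
    n = len(series)
    scores = [abs(series[:k].mean() - series[k:].mean())
              for k in range(margin, n - margin)]
    return margin + int(np.argmax(scores))

observed, truth = make_synthetic()          # `truth` allows scoring any method
error_months = abs(detect_break(observed) - 350)
```

Because the truth is known by construction, `error_months` is a meaningful skill score; the real benchmark applies the same logic at global scale across many analogue worlds.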

Relevance:

30.00%

Publisher:

Abstract:

N. Bostrom’s simulation argument and two additional assumptions imply that we are likely to live in a computer simulation. The argument is based upon the following assumption about the workings of realistic brain simulations: The hardware of a computer on which a brain simulation is run bears a close analogy to the brain itself. To inquire whether this is so, I analyze how computer simulations trace processes in their targets. I describe simulations as fictional, mathematical, pictorial, and material models. Even though the computer hardware does provide a material model of the target, this does not suffice to underwrite the simulation argument because the ways in which parts of the computer hardware interact during simulations do not resemble the ways in which neurons interact in the brain. Further, there are computer simulations of all kinds of systems, and it would be unreasonable to infer that some computers display consciousness just because they simulate brains rather than, say, galaxies.

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE To systematically evaluate the dependence of intravoxel incoherent motion (IVIM) parameters on the b-value threshold separating the perfusion and diffusion compartments, and to implement and test an algorithm for the standardized computation of this threshold. METHODS Diffusion-weighted images of the upper abdomen were acquired at 3 Tesla in eleven healthy male volunteers with 10 different b-values and in two healthy male volunteers with 16 different b-values. Region-of-interest IVIM analysis was applied to the abdominal organs and skeletal muscle with a systematic increase of the b-value threshold for computing the pseudodiffusion D*, the perfusion fraction Fp, the diffusion coefficient D, and the sum of squared residuals of the biexponential IVIM fit. RESULTS IVIM parameters strongly depended on the choice of the b-value threshold. The proposed algorithm successfully provided optimal b-value thresholds with the smallest residuals for all evaluated organs [s/mm^2]: e.g., right liver lobe 20, spleen 20, right renal cortex 150, skeletal muscle 150. Mean D* [10^-3 mm^2/s], Fp [%], and D [10^-3 mm^2/s] values (± standard deviation) were: right liver lobe: 88.7 ± 42.5, 22.6 ± 7.4, 0.73 ± 0.12; right renal cortex: 11.5 ± 1.8, 18.3 ± 2.9, 1.68 ± 0.05; spleen: 41.9 ± 57.9, 8.2 ± 3.4, 0.69 ± 0.07; skeletal muscle: 21.7 ± 19.0, 7.4 ± 3.0, 1.36 ± 0.04. CONCLUSION IVIM parameters strongly depend upon the choice of the b-value threshold used for computation. The proposed algorithm may be used as a robust approach for IVIM analysis without organ-specific adaptation. Magn Reson Med, 2014. © 2014 Wiley Periodicals, Inc.
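The threshold sweep can be sketched as follows. This is a toy simulation with invented parameter values, not the authors' implementation: a noiseless biexponential signal is generated, D is fitted mono-exponentially above each candidate threshold, Fp follows from the intercept, and the threshold with the smallest residuals wins (for brevity D* is held at its simulated value rather than fitted, as the lead comment notes):

```python
# Hedged sketch of the b-value-threshold sweep (toy data, not the paper's code):
# S(b)/S0 = Fp*exp(-b*D*) + (1-Fp)*exp(-b*D); for each candidate threshold fit D
# from the high-b points, derive Fp from the intercept, keep the smallest SSR.
import numpy as np

b = np.array([0, 10, 20, 40, 60, 100, 150, 200, 400, 800], float)  # s/mm^2
D_true, Dstar_true, Fp_true = 1.0e-3, 50e-3, 0.2                   # invented
signal = Fp_true * np.exp(-b * Dstar_true) + (1 - Fp_true) * np.exp(-b * D_true)

best = None
for thr in [20, 40, 60, 100, 150, 200]:
    hi = b >= thr
    # log-linear fit of the diffusion compartment above the threshold
    slope, intercept = np.polyfit(b[hi], np.log(signal[hi]), 1)
    D = -slope
    Fp = 1.0 - np.exp(intercept)
    # residuals of the segmented model against all data points
    # (D* fixed at its simulated value here for brevity; the paper fits it too)
    model = Fp * np.exp(-b * Dstar_true) + (1 - Fp) * np.exp(-b * D)
    ssr = float(np.sum((signal - model) ** 2))
    if best is None or ssr < best[0]:
        best = (ssr, thr, D, Fp)

ssr, thr, D, Fp = best
```

With these toy values the sweep settles on a high threshold, where the perfusion signal has fully decayed and D and Fp are recovered accurately; low thresholds contaminate the diffusion fit and inflate the residuals, which is the behavior the abstract reports.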

Relevance:

30.00%

Publisher:

Abstract:

A series of oligonucleotides containing (5′S)-5′-C-butyl- and (5′S)-5′-C-isopentyl-substituted 2′-deoxyribonucleosides were designed, prepared, and characterized with the intention to explore alkyl-zipper formation between opposing alkyl chains across the minor groove of oligonucleotide duplexes as a means to modulate DNA-duplex stability. Of the four possible arrangements of the alkyl groups that differ in the density of packing of the alkyl chains across the minor groove, three (duplex types I–III, Fig. 2) could be realized experimentally and their duplex-forming properties analyzed by UV melting curves, CD spectroscopy, and isothermal titration calorimetry (ITC), as well as by molecular modeling. The results show that all arrangements of alkyl residues within the minor groove of DNA are thermally destabilizing by 1.5–3° per modification in Tm. We found that, within the proposed duplexes with more loosely packed alkyl groups (type-III duplexes), accommodation of alkyl residues without extended distortion of the helical parameters of B-DNA is possible but does not lead to higher thermodynamic stability. The more densely packed and more unevenly distributed arrangement (type-II duplexes) seems to suffer from ecliptic positioning of opposite alkyl groups, which might account for a systematic negative contribution to stability due to steric interactions. The decreased stability in the type-III duplexes described here may be due either to missing hydrophobic interactions of the alkyl groups (not bulky enough to make close contacts) or to an overcompensation of favorable alkyl-zipper formation, presumably by loss of structured H2O in the minor groove.

Relevance:

30.00%

Publisher:

Abstract:

Cataloging geocentric objects can be put in the framework of Multiple Target Tracking (MTT). Current work tends to focus on the S = 2 MTT problem because of its favorable computational complexity of O(n²). The MTT problem becomes NP-hard for dimensions S ≥ 3. The challenge is to find an approximation to the solution within a reasonable computation time. To efficiently approximate this solution, a genetic algorithm is used. The algorithm is applied to a simulated test case. These results represent the first steps towards a method that can treat the S ≥ 3 problem efficiently and with minimal manual intervention.
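The genetic-algorithm idea can be illustrated in miniature. This is a toy sketch, not the paper's implementation: the "chromosome" is a permutation associating observations in one fence with observations in another, the cost matrix is random stand-in data rather than real astrometric residuals, and selection is simple elitist truncation with swap mutation:

```python
# Toy GA for an association problem (random stand-in costs, not the paper's code):
# evolve permutations that assign observation i in fence A to perm[i] in fence B,
# minimizing the total association cost.
import itertools
import numpy as np

rng = np.random.default_rng(1)
n = 6
cost = rng.random((n, n))

def fitness(perm):
    # total cost of the association encoded by the permutation
    return cost[np.arange(n), perm].sum()

def mutate(perm):
    # swap two positions, keeping the chromosome a valid permutation
    child = perm.copy()
    i, j = rng.choice(n, size=2, replace=False)
    child[i], child[j] = child[j], child[i]
    return child

pop = [rng.permutation(n) for _ in range(30)]
for _ in range(200):
    pop.sort(key=fitness)
    survivors = pop[:10]                     # elitist truncation selection
    pop = survivors + [mutate(p) for p in survivors for _ in range(2)]

ga_cost = fitness(min(pop, key=fitness))

# exact optimum by exhaustive search, feasible only for tiny n
opt_cost = min(fitness(np.array(p)) for p in itertools.permutations(range(n)))
```

For n this small the GA typically matches the exhaustive optimum; the point of the method is that it keeps returning good associations when n grows past what exhaustive search (or exact assignment solvers, for S ≥ 3) can handle.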

Relevance:

30.00%

Publisher:

Abstract:

Currently several thousands of objects are being tracked in the MEO and GEO regions through optical means. The problem faced in this framework is that of Multiple Target Tracking (MTT). In this context, both the correct associations among the observations and the orbits of the objects have to be determined. The complexity of the MTT problem is defined by its dimension S, which corresponds to the number of fences involved in the problem. Each fence consists of a set of observations in which each observation belongs to a different object. The S ≥ 3 MTT problem is an NP-hard combinatorial optimization problem. There are two general ways to solve it. One is to seek the optimum solution, which can be achieved by applying a branch-and-bound algorithm; when using such algorithms, the problem has to be greatly simplified to keep the computational cost at a reasonable level. The other option is to approximate the solution with meta-heuristic methods, which aim to explore the different possible combinations efficiently so that a reasonable result can be obtained with a reasonable computational effort. To this end, several population-based meta-heuristic methods are implemented and tested on simulated optical measurements. With the advent of improved sensors and a heightened interest in the problem of space debris, the number of tracked objects is expected to grow by an order of magnitude in the near future. This research aims to provide a method that treats the correlation and orbit determination problems simultaneously and is able to efficiently process large data sets with minimal manual intervention.
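Why exact solution does not scale for S ≥ 3 can be shown with a tiny example. This toy (random stand-in costs, not real astrometric data) solves a three-fence association exhaustively, which stands in for the heavily simplified exact option mentioned above; the candidate count (n!)² is the quantity that forces either simplification or meta-heuristics:

```python
# Toy illustration of S = 3 hardness (random stand-in costs): exhaustively
# solving a three-fence association requires (n!)^2 candidate solutions,
# which explodes with n even though each candidate is cheap to evaluate.
import itertools
import math
import numpy as np

rng = np.random.default_rng(5)
n = 4
cost = rng.random((n, n, n))     # cost[i, j, k]: associate obs i, j, k

best_cost = float("inf")
for p in itertools.permutations(range(n)):       # fence-2 assignment
    for q in itertools.permutations(range(n)):   # fence-3 assignment
        total = sum(cost[i, p[i], q[i]] for i in range(n))
        best_cost = min(best_cost, total)

n_candidates = math.factorial(n) ** 2            # 576 even for n = 4
```

At n = 4 there are already 576 candidates; at n = 12 there are roughly 2.3 × 10¹⁷, which is why population-based meta-heuristics that sample this space are attractive.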

Relevance:

30.00%

Publisher:

Abstract:

Near-infrared spectroscopy (NIRS) enables the non-invasive measurement of changes in hemodynamics and oxygenation in tissue. Changes in light coupling due to movement of the subject can cause movement artifacts (MAs) in the recorded signals. Several methods have been developed that facilitate the detection and reduction of MAs in the data. However, due to fixed parameter values (e.g., a global threshold), none of these methods is well suited to long-term (i.e., hours) recordings, or they are not time-effective when applied to large datasets. We aimed to overcome these limitations through automation, i.e., data-adaptive thresholding specifically designed for long-term measurements, and by introducing a stable long-term signal reconstruction. Our new technique ("acceleration-based movement artifact reduction algorithm", AMARA) combines two methods: the "movement artifact reduction algorithm" (MARA; Scholkmann et al., Phys. Meas. 2010, 31, 649–662) and the "accelerometer-based motion artifact removal" (ABAMAR; Virtanen et al., J. Biomed. Opt. 2011, 16, 087005). We describe AMARA in detail and report the successful validation of the algorithm using empirical NIRS data measured over the prefrontal cortex in adolescents during sleep. In addition, we compared the performance of AMARA to that of MARA and ABAMAR based on the validation data.
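The two ingredients the abstract combines, data-adaptive thresholding and signal reconstruction, can be sketched conceptually. All parameter values here are invented, the detector is a simple moving standard deviation rather than AMARA's accelerometer-informed statistic, and the reconstruction is plain linear interpolation instead of MARA-style correction:

```python
# Conceptual sketch only (invented parameters, not AMARA itself):
# (1) flag artifacts where a moving SD exceeds a data-adaptive threshold
#     (scaled median of the statistic, not a fixed global value);
# (2) reconstruct the flagged samples by interpolation.
import numpy as np

rng = np.random.default_rng(2)
fs = 10                                     # Hz, assumed sampling rate
t = np.arange(0, 60, 1 / fs)
signal = np.sin(2 * np.pi * 0.1 * t) + 0.05 * rng.standard_normal(t.size)
signal[300:320] += 3.0 * rng.standard_normal(20)   # injected movement artifact

win = 10
mov_sd = np.array([signal[max(0, i - win):i + win].std()
                   for i in range(t.size)])
threshold = 3.0 * np.median(mov_sd)         # adapts to the recording itself
artifact = mov_sd > threshold

clean = signal.copy()
good = ~artifact
clean[artifact] = np.interp(t[artifact], t[good], signal[good])
```

Because the threshold is derived from the recording's own moving-SD distribution, the same code works across hours-long measurements with different baseline noise, which is the limitation of fixed global thresholds the abstract targets.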

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND AND AIMS The Barcelona Clinic Liver Cancer (BCLC) staging system is the algorithm most widely used to manage patients with hepatocellular carcinoma (HCC). We aimed to investigate the extent to which the BCLC recommendations effectively guide clinical practice and assess the reasons for any deviation from the recommendations. MATERIAL AND METHODS The first-line treatments assigned to patients included in the prospective Bern HCC cohort were analyzed. RESULTS Among 223 patients included in the cohort, 116 were not treated according to the BCLC algorithm. Eighty percent of the patients in BCLC stage 0 (very early HCC) and 60% of the patients in BCLC stage A (early HCC) received recommended curative treatment. Only 29% of the BCLC stage B patients (intermediate HCC) and 33% of the BCLC stage C patients (advanced HCC) were treated according to the algorithm. Eighty-nine percent of the BCLC stage D patients (terminal HCC) were treated with best supportive care, as recommended. In 98 patients (44%) the performance status was disregarded in the stage assignment. CONCLUSION The management of HCC in clinical practice frequently deviates from the BCLC recommendations. Most of the curative therapy options, which have well-defined selection criteria, were allocated according to the recommendations, while the majority of the palliative therapy options were assigned to patients with tumor stages not aligned with the recommendations. The only parameter which is subjective in the algorithm, the performance status, is also the least respected.

Relevance:

30.00%

Publisher:

Abstract:

A limiting factor in the accuracy and precision of U/Pb zircon dates is accurate correction for initial disequilibrium in the 238U and 235U decay chains. The longest-lived, and therefore most abundant, intermediate daughter product in the 235U decay chain is 231Pa (T1/2 = 32.71 ka), and the partitioning behavior of Pa in zircon is not well constrained. Here we report high-precision thermal ionization mass spectrometry (TIMS) U-Pb zircon data from two samples from Ocean Drilling Program (ODP) Hole 735B, which show evidence for incorporation of excess 231Pa during zircon crystallization. The most precise analyses from the two samples have consistent Th-corrected 206Pb/238U dates with weighted means of 11.9325 ± 0.0039 Ma (n = 9) and 11.920 ± 0.011 Ma (n = 4), but distinctly older 207Pb/235U dates that vary from 12.330 ± 0.048 Ma to 12.140 ± 0.044 Ma and from 12.03 ± 0.24 Ma to 12.40 ± 0.27 Ma, respectively. If the excess 207Pb is due to variable initial excess 231Pa, the calculated initial (231Pa)/(235U) activity ratios for the two samples range from 5.6 ± 1.0 to 9.6 ± 1.1 and from 3.5 ± 5.2 to 11.4 ± 5.8. The data from the more precisely dated sample yield estimated zircon partition coefficient ratios DPa/DU of 2.2–3.8 and 5.6–9.6, assuming a (231Pa)/(235U) of the melt equal to the global average of recently erupted mid-ocean ridge basaltic glasses or to secular equilibrium, respectively. High-precision ID-TIMS analyses from nine additional samples from Hole 735B and nearby Hole 1105A suggest similar partitioning. The lower range of DPa/DU is consistent with ion microprobe measurements of 231Pa in zircons from Holocene and Pleistocene rhyolitic eruptions (Schmitt (2007; doi:10.2138/am.2007.2449) and Schmitt (2011; doi:10.1146/annurev-earth-040610-133330)). The data suggest that 231Pa is preferentially incorporated during zircon crystallization over a range of magmatic compositions, and excess initial 231Pa may be more common in zircons than acknowledged.
The degree of initial disequilibrium in the 235U decay chain suggested by the data from this study and other recent high-precision datasets leads to resolvable discordance in high-precision dates of Cenozoic to Mesozoic zircons. Minor discordance in zircons of this age may therefore reflect initial excess 231Pa and does not require either inheritance or Pb loss.
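As a back-of-envelope consistency check (my arithmetic, not a computation from the paper): every excess 231Pa atom eventually decays to 207Pb, so an initial activity ratio A0 = (231Pa)/(235U) > 1 shifts a young 207Pb/235U date older by roughly (A0 − 1) times the 231Pa mean life:

```python
# Back-of-envelope check using only numbers quoted above: the 207Pb/235U date
# excess for initial activity ratio A0 is approximately (A0 - 1) * tau, where
# tau = T1/2 / ln(2) is the 231Pa mean life.
import math

T_HALF_PA231_KA = 32.71
tau_ka = T_HALF_PA231_KA / math.log(2)       # mean life, about 47.2 ka

def age_offset_ma(a0):
    """Approximate excess in the 207Pb/235U date (Ma) for activity ratio a0."""
    return (a0 - 1.0) * tau_ka / 1000.0

offset = age_offset_ma(9.6)   # upper activity ratio quoted for the first sample
```

For A0 = 9.6 the predicted offset is about 0.41 Ma, the same order as the observed gap between the oldest 207Pb/235U date (12.330 Ma) and the 206Pb/238U weighted mean (11.9325 Ma) of that sample.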

Relevance:

30.00%

Publisher:

Abstract:

We have performed quantitative X-ray diffraction (qXRD) analysis of 157 grab or core-top samples from the western Nordic Seas (WNS) between ~57° and 75°N and 5° and 45°W. The RockJock Vs6 analysis includes non-clay (20) and clay (10) mineral species in the <2 mm size fraction that sum to 100 weight percent. The data matrix was reduced to 9 and 6 variables, respectively, by excluding minerals with low weight percent and by grouping minerals into larger groups, such as the alkali and plagioclase feldspars. Because of its potential dual origins, calcite was placed outside of the sum. We initially hypothesized that a combination of regional bedrock outcrops and transport associated with drift ice, meltwater plumes, and bottom currents would result in 6 clusters defined by "similar" mineral compositions. The hypothesis was tested with a fuzzy k-means clustering algorithm, and key minerals were identified by stepwise Discriminant Function Analysis. Key minerals in defining the clusters include quartz, pyroxene, muscovite, and amphibole. With 5 clusters, 87.5% of the observations are correctly classified. The geographic distributions of the five k-means clusters compare reasonably well with the original hypothesis. The close spatial relationship between bedrock geology and discrete cluster membership stresses the importance of this variable both at the WNS scale and at a more local scale in NE Greenland.
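The fuzzy clustering step can be sketched with the standard fuzzy c-means (Bezdek) updates. This is a minimal illustration on invented two-dimensional stand-in "compositions", not the authors' RockJock-based pipeline, which clusters the full mineral weight-percent matrix:

```python
# Minimal fuzzy c-means sketch (standard Bezdek alternating updates) on
# stand-in 2-D data; the real analysis clusters mineral weight-percent vectors.
import numpy as np

rng = np.random.default_rng(3)
# two well-separated blobs of 30 samples each, standing in for compositions
X = np.vstack([rng.normal(0.0, 0.3, (30, 2)), rng.normal(3.0, 0.3, (30, 2))])

def fuzzy_cmeans(X, c=2, m=2.0, iters=50):
    """Alternate fuzzy-membership and center updates for `iters` rounds."""
    n = len(X)
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)           # memberships sum to 1
    for _ in range(iters):
        W = U ** m                              # fuzzified weights
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None] - centers[None], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U

centers, U = fuzzy_cmeans(X)
hard = U.argmax(axis=1)                         # defuzzified cluster labels
```

Unlike hard k-means, each sample carries a graded membership in every cluster; the "correctly classified" percentage quoted above corresponds to defuzzifying memberships as in the last line.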

Relevance:

30.00%

Publisher:

Abstract:

The CMCC Global Ocean Physical Reanalysis System (C-GLORS) is used to simulate the state of the ocean over the last decades. It consists of a variational data assimilation system (OceanVar), capable of assimilating all in-situ observations along with altimetry data, and a forecast step performed by the ocean model NEMO coupled with the LIM2 sea-ice model.

Key strengths:
- Data are available for a large number of ocean parameters.
- An extensive validation has been conducted and is freely available.
- The reanalysis is performed at high resolution (1/4 degree) and spans the last 30 years.

Key limitations:
- Quality may be discontinuous and depends on observation coverage.
- Uncertainty estimates are derived only through verification skill scores.

Relevance:

30.00%

Publisher:

Abstract:

This work provides and evaluates a fusion algorithm for remotely sensed images, i.e., the fusion of a high-spatial-resolution panchromatic (PAN) image with a multispectral (MS) image (also known as pansharpening), using the dual-tree complex wavelet transform (DT-CWT), an effective approach for conducting an analytic and oversampled wavelet transform that reduces aliasing and, in turn, the shift dependence of the wavelet transform. The proposed scheme includes the definition of a model that establishes how information is extracted from the PAN band and how that information is injected into the MS bands of lower spatial resolution. The approach was applied to SPOT 5 images, in which some bands fall outside the spectral range of the PAN band. We propose an optional step in the quality evaluation protocol, which is to study the quality of the fusion by regions, where each region represents a specific feature of the image. The results show that the DT-CWT-based approach offers good spatial quality while retaining the spectral information of the original SPOT 5 images. The additional step facilitates the identification of the regions most affected by the fusion process.
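The extract-and-inject model can be sketched schematically. The paper uses the DT-CWT for detail extraction; in this toy a crude box-blur high-pass stands in for the wavelet detail so the sketch needs only NumPy, and the images are random stand-in arrays:

```python
# Schematic of the injection model only (toy data; a box-blur high-pass stands
# in for the DT-CWT detail extraction used in the actual work).
import numpy as np

def box_blur(img, k=4):
    """Crude low-pass: block-average by k x k, then upsample back (stand-in)."""
    h, w = img.shape
    small = img[:h - h % k, :w - w % k].reshape(h // k, k, w // k, k).mean((1, 3))
    return np.kron(small, np.ones((k, k)))

def pansharpen(pan, ms_band, gain=1.0):
    """Inject PAN detail (PAN minus its low-pass) into an upsampled MS band."""
    detail = pan - box_blur(pan)        # extraction step (DT-CWT in the paper)
    return ms_band + gain * detail      # injection step, per-band gain

rng = np.random.default_rng(4)
pan = rng.random((16, 16))
ms = box_blur(pan)                      # pretend the MS band is a blurred PAN
fused = pansharpen(pan, ms)
```

The injection model (here a single per-band `gain`) is exactly the part the paper tunes per MS band, which matters for SPOT 5 bands lying outside the PAN spectrum.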

Relevance:

30.00%

Publisher:

Abstract:

Concepts are defined and Valverde's theorem is applied to formulate an algorithm that computes bases of similarities. This paper develops some theory and methods, based on Valverde's representation theorem, to build the basis of a similarity from the bases of its subsimilarities, providing an alternative recursive method to compute the basis of a similarity.
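As illustrative background (not the paper's basis algorithm): a similarity in this setting is a reflexive, symmetric, max-min transitive fuzzy relation, and any reflexive symmetric relation can be completed into one by taking its max-min transitive closure, sketched here with plain NumPy on an invented 3 × 3 relation:

```python
# Illustrative companion only: max-min transitive closure of a reflexive,
# symmetric fuzzy relation, yielding a similarity. Toy 3x3 relation.
import numpy as np

def maxmin_compose(R, S):
    # (R o S)[i, j] = max_k min(R[i, k], S[k, j])
    return np.maximum.reduce(np.minimum(R[:, :, None], S[None, :, :]), axis=1)

def transitive_closure(R):
    """Iterate T <- T v (T o T) until the relation stabilizes."""
    T = R.copy()
    while True:
        T2 = np.maximum(T, maxmin_compose(T, T))
        if np.array_equal(T2, T):
            return T
        T = T2

R = np.array([[1.0, 0.8, 0.1],
              [0.8, 1.0, 0.6],
              [0.1, 0.6, 1.0]])
S = transitive_closure(R)
```

In the toy relation, the closure raises the (0, 2) entry from 0.1 to 0.6 because elements 0 and 2 are both similar to element 1 at levels 0.8 and 0.6.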