36 results for Fuzzy c-means algorithm

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

100.00%

Publisher:

Abstract:

Multi-parametric and quantitative magnetic resonance imaging (MRI) techniques have come into the focus of interest, both as a research and a diagnostic modality for the evaluation of patients suffering from mild cognitive decline and overt dementia. In this study we address the question of whether disease-related quantitative magnetization transfer (qMT) effects within the intra- and extracellular matrices of the hippocampus may aid in the differentiation between clinically diagnosed patients with Alzheimer disease (AD), patients with mild cognitive impairment (MCI), and healthy controls. We evaluated 22 patients with AD (n=12) and MCI (n=10) and 22 healthy elderly (n=12) and younger (n=10) controls with multi-parametric MRI. Neuropsychological testing was performed in patients and elderly controls (n=34). In order to quantify the qMT effects, the absorption spectrum was sampled at relevant off-resonance frequencies. The qMT parameters were calculated according to a two-pool spin-bath model including the T1 and T2 relaxation parameters of the free pool, determined in separate experiments. Histograms (fixed bin size) of the normalized qMT parameter values (z-scores) within the anterior and posterior hippocampus (hippocampal head and body) were subjected to a fuzzy c-means classification algorithm with a downstream PCA projection. The within-cluster sums of point-to-centroid distances were used to examine the effects of qMT and diffusion anisotropy parameters on the discrimination of healthy volunteers, patients with Alzheimer disease, and patients with MCI. The qMT parameters T2r (T2 of the restricted pool) and F (fractional pool size) differentiated between the three groups (control, MCI, and AD) in the anterior hippocampus. In our cohort, the MT ratio, as proposed in previous reports, did not differentiate between MCI and AD or between healthy controls and MCI, but it did differentiate between healthy controls and AD.
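For illustration, the clustering step above can be mimicked with a plain fuzzy c-means routine; the NumPy sketch below is not the authors' implementation, and the feature matrix X (one z-score histogram per subject), the fuzzifier m, and all other parameter values are illustrative assumptions. The returned membership-weighted within-cluster sums of point-to-centroid distances play the role of the discrimination measure mentioned above.

import numpy as np

def fuzzy_c_means(X, c, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Cluster the rows of X into c fuzzy clusters (fuzzifier m > 1)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)            # memberships sum to 1 per sample
    for _ in range(max_iter):
        Um = U ** m
        centroids = (Um.T @ X) / Um.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        dist = np.fmax(dist, 1e-12)              # avoid division by zero
        inv = dist ** (-2.0 / (m - 1.0))
        U_new = inv / inv.sum(axis=1, keepdims=True)
        if np.linalg.norm(U_new - U) < tol:
            U = U_new
            break
        U = U_new
    # within-cluster sums of membership-weighted point-to-centroid distances
    wcd = (U ** m * dist).sum(axis=0)
    return centroids, U, wcd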

Relevance:

40.00%

Publisher:

Abstract:

Traditionally, ontologies describe knowledge representation in a denotational, formalized, and deductive way. In this paper, we propose, in addition, a semiotic, inductive, and approximate approach to ontology creation. We define a conceptual framework, a semantics extraction algorithm, and a first proof of concept applying the algorithm to a small set of Wikipedia documents. Intended as an extension to the prevailing top-down ontologies, we introduce an inductive fuzzy grassroots ontology, which organizes itself organically from existing natural language Web content. Using inductive and approximate reasoning to reflect the natural way in which knowledge is processed, the ontology's bottom-up build process creates emergent semantics learned from the Web. By this means, the ontology acts as a hub for computing with words described in natural language. For Web users, the structural semantics are visualized as inductive fuzzy cognitive maps, allowing an initial form of intelligence amplification. Finally, we present an implementation of our inductive fuzzy grassroots ontology. Thus, this paper contributes an algorithm for the extraction of fuzzy grassroots ontologies from Web data by inductive fuzzy classification.
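As a rough illustration of what inductive fuzzy classification can mean in practice, the sketch below induces graded term-to-concept memberships from a small labelled document collection. The function name, the choice of conditional relative frequency as the membership estimator, and the per-term rescaling are assumptions made here for illustration, not the paper's algorithm.

from collections import Counter
from typing import Dict, List, Tuple

def induce_memberships(documents: List[Tuple[str, List[str]]]) -> Dict[str, Dict[str, float]]:
    """Induce fuzzy term-to-concept memberships from labelled documents.

    documents: list of (concept_label, tokens). The membership of a term in a
    concept is its conditional relative frequency P(concept | term), rescaled
    per term so that the strongest association gets degree 1.0.
    """
    term_concept = Counter()   # (term, concept) -> count
    term_total = Counter()     # term -> count
    for concept, tokens in documents:
        for t in tokens:
            term_concept[(t, concept)] += 1
            term_total[t] += 1

    memberships: Dict[str, Dict[str, float]] = {}
    for (term, concept), n in term_concept.items():
        memberships.setdefault(term, {})[concept] = n / term_total[term]
    # rescale each term's memberships so its maximum degree equals 1
    for term, degs in memberships.items():
        top = max(degs.values())
        memberships[term] = {c: d / top for c, d in degs.items()}
    return memberships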

Relevance:

40.00%

Publisher:

Abstract:

The Social Web offers increasingly simple ways to publish and disseminate personal or opinionated information, which can rapidly exert a disastrous influence on the online reputation of organizations. This study describes the building of an ontology from social Web data on the basis of fuzzy sets. After recurring harvesting of folksonomies by Web agents, the aggregated tags are purified, linked, and transformed into a so-called fuzzy grassroots ontology by means of a fuzzy clustering algorithm. This self-updating ontology is used for online reputation analysis, a crucial task of reputation management, with the goal of following the online conversation around an organization in order to discover and monitor its reputation. In addition, an application of the Fuzzy Online Reputation Analysis (FORA) framework, lessons learned, and potential extensions are discussed in this article.
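A highly simplified version of the tag-to-ontology transformation could look as follows; it reuses the fuzzy_c_means sketch given earlier in this list, and the normalisation, number of concepts, and membership threshold are illustrative assumptions rather than details of the FORA framework.

import numpy as np

def tags_to_fuzzy_ontology(tag_doc_matrix, tag_names, n_concepts=5, keep=0.3):
    """Turn a tag-by-document count matrix into weighted tag-concept links.

    tag_doc_matrix: array of shape (n_tags, n_documents) with tag usage counts.
    Returns (tag, concept_index, membership) triples with membership >= keep;
    the threshold is an illustrative choice.
    """
    X = np.asarray(tag_doc_matrix, dtype=float)
    X /= np.maximum(X.sum(axis=1, keepdims=True), 1e-12)   # normalise tag profiles
    _, U, _ = fuzzy_c_means(X, c=n_concepts)                # sketch defined above
    links = []
    for i, tag in enumerate(tag_names):
        for j in range(n_concepts):
            if U[i, j] >= keep:
                links.append((tag, j, float(U[i, j])))
    return links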

Relevance:

40.00%

Publisher:

Abstract:

Currently several thousand objects are being tracked in the MEO and GEO regions through optical means. The problem faced in this framework is that of Multiple Target Tracking (MTT). In this context both the correct associations among the observations and the orbits of the objects have to be determined. The complexity of the MTT problem is defined by its dimension S, where S stands for the number of 'fences' used in the problem; each fence consists of a set of observations that all originate from different targets. For a dimension of S > 2 the MTT problem becomes NP-hard. As of now no algorithm exists that can solve an NP-hard problem in an optimal manner within a reasonable (polynomial) computation time. However, there are algorithms that can approximate the solution with a realistic computational effort. To this end an Elitist Genetic Algorithm is implemented to approximately solve the S > 2 MTT problem in an efficient manner. Its complexity is studied and it is found that an approximate solution can be obtained in polynomial time. With the advent of improved sensors and a heightened interest in the problem of space debris, it is expected that the number of tracked objects will grow by an order of magnitude in the near future. This research aims to provide a method that can treat the correlation and orbit determination problems simultaneously, and is able to efficiently process large data sets with minimal manual intervention.
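The abstract does not specify the chromosome encoding or the fitness function, so the sketch below is only a generic elitist genetic algorithm skeleton: the n_elite best individuals survive unchanged, and the rest of the next generation is produced by tournament selection, uniform crossover, and mutation. The integer encoding, placeholder fitness, and parameter values are assumptions for illustration.

import random

def elitist_ga(fitness, chrom_len, n_genes, pop_size=60, n_gen=200,
               n_elite=2, p_mut=0.02, seed=0):
    """Generic elitist genetic algorithm over integer chromosomes.

    fitness: callable mapping a chromosome (list of ints) to a score to maximise.
    n_genes: each gene takes a value in range(n_genes).
    The n_elite best individuals are copied unchanged into the next generation.
    """
    rng = random.Random(seed)
    pop = [[rng.randrange(n_genes) for _ in range(chrom_len)] for _ in range(pop_size)]
    for _ in range(n_gen):
        scored = sorted(pop, key=fitness, reverse=True)
        next_pop = [c[:] for c in scored[:n_elite]]          # elitism
        while len(next_pop) < pop_size:
            # tournament selection of two parents
            p1 = max(rng.sample(scored, 3), key=fitness)
            p2 = max(rng.sample(scored, 3), key=fitness)
            # uniform crossover
            child = [a if rng.random() < 0.5 else b for a, b in zip(p1, p2)]
            # mutation
            child = [rng.randrange(n_genes) if rng.random() < p_mut else g
                     for g in child]
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

# toy usage: maximise the number of zeros in a chromosome of length 20
# best = elitist_ga(lambda c: c.count(0), chrom_len=20, n_genes=4)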

Relevance:

30.00%

Publisher:

Abstract:

The objective of this study was to assess a pharmacokinetic algorithm to predict ketamine plasma concentration and drive a target-controlled infusion (TCI) in ponies. First, the algorithm was used to simulate the course of ketamine enantiomer plasma concentrations after administration of an intravenous bolus in six ponies, based on individual pharmacokinetic parameters obtained from a previous experiment. Using the same pharmacokinetic parameters, a TCI of S-ketamine was then performed over 120 min to maintain a concentration of 1 microg/mL in plasma. The actual plasma concentrations of S-ketamine were measured from arterial samples using capillary electrophoresis. The performance of the simulation for the administration of a single bolus was very good. During the TCI, the S-ketamine plasma concentrations were maintained within the limit of acceptance (wobble and divergence <20%) at a median of 79% (IQR, 71-90) of the peak concentration reached after the initial bolus. However, in three ponies the steady-state concentrations were significantly higher than targeted. It is hypothesized that an inaccurate estimation of the volume of the central compartment is partly responsible for this difference. The algorithm allowed good predictions for the single bolus administration and an appropriate maintenance of constant plasma concentrations.
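As a sketch of how a compartmental pharmacokinetic model can both predict the bolus response and drive a simple target-controlled infusion, the following two-compartment Euler simulation may help; the rate constants, the central volume, and the naive proportional controller are illustrative placeholders and do not correspond to the ponies' estimated parameters or to the study's TCI algorithm.

import numpy as np

def simulate_2cpt(dose_bolus, target=None, V1=10.0, k10=0.1, k12=0.05,
                  k21=0.03, dt=0.01, t_end=120.0):
    """Euler simulation of a two-compartment model (drug amounts A1, A2 in mg).

    dose_bolus: initial IV bolus (mg). If target (mg/L, i.e. microg/mL) is given,
    a naive proportional controller adjusts the infusion rate to hold the
    central-compartment concentration near the target. All parameter values
    are illustrative placeholders.
    """
    n = int(t_end / dt)
    A1, A2 = dose_bolus, 0.0
    conc = np.empty(n)
    for i in range(n):
        c1 = A1 / V1                                    # plasma concentration
        rate = max(0.0, 5.0 * (target - c1)) if target is not None else 0.0
        dA1 = rate - (k10 + k12) * A1 + k21 * A2
        dA2 = k12 * A1 - k21 * A2
        A1 += dA1 * dt
        A2 += dA2 * dt
        conc[i] = c1
    return conc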

Relevance:

30.00%

Publisher:

Abstract:

To assess the diagnostic accuracy, image quality, and radiation dose of an iterative reconstruction algorithm compared with a filtered back projection (FBP) algorithm for abdominal computed tomography (CT) at different tube voltages.

Relevance:

30.00%

Publisher:

Abstract:

To investigate whether an adaptive statistical iterative reconstruction (ASIR) algorithm improves the image quality at low-tube-voltage (80-kVp), high-tube-current (675-mA) multidetector abdominal computed tomography (CT) during the late hepatic arterial phase.

Relevance:

30.00%

Publisher:

Abstract:

Successful treatment of prosthetic hip joint infection (PI) means elimination of infection and restored hip function. However, functional outcome is rarely studied. We analyzed the outcome of the strict use of a treatment algorithm for PI.

Relevance:

30.00%

Publisher:

Abstract:

The terminal homologation by CH2 insertion into the peptides mentioned in the title is described. This involves replacement of the N-terminal amino acid residue by a β2- and of the C-terminal amino acid residue by a β3-homo-amino acid moiety (β2hXaa and β3hXaa, resp.; Fig. 1). In this way, the structure of the peptide chain from the N-terminal to the C-terminal stereogenic center is identical, and the modified peptide is protected against cleavage by exopeptidases (Figs. 2 and 3). Neurotensin (NT; 1) and its C-terminal fragment NT(8-13) are ligands of the G-protein-coupled receptors (GPCRs) NT1, NT2, and NT3, and NT analogs are promising tools to be used in cancer diagnostics and therapy. The affinities of the homologated NT analogs 2b-2e for the NT1 and NT2 receptors were determined using cell homogenates and tumor tissues (Table 1); in the latter experiments, the affinities for the NT1 receptor are more or less the same as those of NT (0.5-1.3 vs. 0.6 nM). At the same time, one of the homologated NT analogs, 2c, survives in human plasma for 7 days at 37 °C (Fig. 6). An NMR analysis of NT(8-13) (Tables 2 and 4, and Fig. 8) reveals that this NT fragment folds to a turn in CD3OH. In the case of the human analgesic opiorphin (3a), a pentapeptide, and of the HIV-derived B27-KK10 (4a), a decapeptide, terminal homologation (→3b and 4b, resp.) led to a 7- and 70-fold half-life increase in plasma (Fig. 9). With the N-terminally homologated NPY, 5c, we were not able to determine serum stability; the peptide, consisting of 36 amino acid residues, is subject to cleavage by endopeptidases. Three of the homologated compounds, 2b, 2c, and 5c, were shown to be agonists (Figs. 7 and 11). A comparison of terminal homologation with other stability-increasing terminal modifications of peptides is performed (Fig. 5), and possible applications of the neurotensin analogs described herein are discussed.

Relevance:

30.00%

Publisher:

Abstract:

Generation of coherent short-wavelength radiation across a plasma column is dramatically improved under traveling-wave excitation (TWE). The latter is optimized when its propagation speed is close to the speed of light, which implies small-angle target irradiation. Yet, short-wavelength lasing needs large irradiation angles in order to increase the optical penetration of the pump into the plasma core. Pulse-front back-tilt is considered as a means to overcome this trade-off. In fact, the TWE speed depends on the pulse-front slope (envelope of amplitude), whereas the optical penetration depth depends on the wave-front slope (envelope of phase). Pulse-front tilt by means of compressor misalignment was found to be effective only if coupled with a high-magnification front-end imaging/focusing component. It is concluded that speed matching should be accomplished with minimal compressor misalignment and maximal imaging magnification.

Relevance:

30.00%

Publisher:

Abstract:

Novel means to locate and treat lower gastrointestinal bleeding (LGB) make it possible to reduce the rate of required surgical interventions and help to limit the extent of resection. The risk stratification of patients with LGB is the primary step of our recommended treatment algorithm. Accordingly, risk-stratification instruments, which are only partly validated to date, are gaining significance in LGB. Whereas gastroduodenoscopy and colonoscopy prior to angiography or scintigraphy are established diagnostic tools, capsule enteroscopy offers a novel approach for hemodynamically stable patients with LGB that is difficult to localize. With its ever-increasing sensitivity, computed tomography angiography is likely to replace scintigraphy and diagnostic angiography in the very near future. In addition, recent advances in superselective microembolization have the potential to render surgical intervention unnecessary in the majority of patients with acute LGB. The extent of the required surgical resection depends largely on whether prior diagnostics have succeeded in localizing the bleeding source. Only if the source is identified should a limited segmental resection be performed. Should surgery be required, we suggest continuing the effort to localize the bleeding, either by prior laparoscopy and/or by intraoperative entero-colonoscopy. Finally, if the source of bleeding remains unclear, total colectomy with ileorectal anastomosis represents the procedure of choice in patients with acute LGB.

Relevance:

30.00%

Publisher:

Abstract:

GOALS OF WORK: To investigate the self-reported symptoms related to endocrine therapy in women with early or advanced breast cancer and the impact of these symptoms on quality-of-life (QL) indicators. MATERIALS AND METHODS: Symptom occurrence was assessed with the Checklist for Patients on Endocrine Therapy (C-PET), and symptom intensity was assessed with linear analogue self-assessment (LASA) indicators. Patients also responded to global LASA indicators for physical well-being, mood, coping effort, and treatment burden. Associations between symptoms and these indicators were analysed with linear regression models. MAIN RESULTS: Among 373 women, the distribution of symptom intensity showed considerable variation among patients reporting a symptom as present. Even when patients had recorded a symptom as absent, some reported having experienced that symptom when rating its intensity, as seen for decreased sex drive, tiredness, and vaginal dryness. Six of 13 symptoms, as well as lower age, had a detrimental impact on the global indicators, particularly tiredness and irritability. CONCLUSIONS: Patients' experience of endocrine symptoms needs to be considered in both patient care and research when interpreting the association between symptoms and QL.

Relevance:

30.00%

Publisher:

Abstract:

Currently, photon Monte Carlo treatment planning (MCTP) for a patient stored in the patient database of a treatment planning system (TPS) can usually only be performed using a cumbersome multi-step procedure requiring many user interactions. This means that automation is needed for use in clinical routine. In addition, because of the long computing time in MCTP, optimization of the MC calculations is essential. For these purposes a new graphical user interface (GUI)-based photon MC environment has been developed, resulting in a very flexible framework. By this means, appropriate MC transport methods are assigned to different geometric regions while still benefiting from the features included in the TPS. In order to provide a flexible MC environment, the MC particle transport has been divided into different parts: the source, the beam modifiers, and the patient. The source part includes the phase-space source, source models, and full MC transport through the treatment head. The beam modifier part consists of one module for each beam modifier. To simulate the radiation transport through each individual beam modifier, one of three full MC transport codes can be selected independently. Additionally, for each beam modifier a simple or an exact geometry can be chosen. Thereby, different complexity levels of radiation transport are applied during the simulation. For the patient dose calculation, two different MC codes are available. A special plug-in in Eclipse, providing all necessary information by means of DICOM streams, was used to start the developed MC GUI. The implementation of this framework separates the MC transport from the geometry, and the modules pass the particles in memory; hence, no files are used as the interface. The implementation is realized for the 6 and 15 MV beams of a Varian Clinac 2300 C/D. Several applications demonstrate the usefulness of the framework. Apart from applications dealing with the beam modifiers, two patient cases are shown, in which MC-calculated dose distributions are compared with those calculated by a pencil-beam or the AAA algorithm. Interfacing this flexible and efficient MC environment with Eclipse allows widespread use for all kinds of investigations, from timing and benchmarking studies to clinical patient studies. Additionally, it is possible to add modules, keeping the system highly flexible and efficient.
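The "modules pass particles in memory" design can be pictured with a few Python classes; this is an illustrative sketch, not the framework itself, and the class names, the Particle fields, and the crude weight-scaling beam modifier are assumptions.

from dataclasses import dataclass
from typing import Iterable, List

@dataclass
class Particle:
    energy: float          # MeV
    position: tuple        # (x, y, z) in cm
    direction: tuple       # unit vector
    weight: float = 1.0

class Module:
    """Base class: every stage consumes and yields particles in memory."""
    def transport(self, particles: Iterable[Particle]) -> Iterable[Particle]:
        raise NotImplementedError

class PhaseSpaceSource(Module):
    def __init__(self, stored: List[Particle]):
        self.stored = stored
    def transport(self, particles):
        # ignores the input stream and emits the stored phase-space particles
        yield from self.stored

class BeamModifier(Module):
    def __init__(self, attenuation: float):
        self.attenuation = attenuation
    def transport(self, particles):
        for p in particles:
            p.weight *= self.attenuation   # crude stand-in for a full MC code
            yield p

def run_chain(modules: List[Module]) -> List[Particle]:
    stream: Iterable[Particle] = []
    for m in modules:
        stream = m.transport(stream)       # particles handed over in memory
    return list(stream)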

Relevance:

30.00%

Publisher:

Abstract:

The GLAaS algorithm for pretreatment intensity-modulated radiation therapy absolute dose verification based on the use of amorphous silicon detectors, as described in Nicolini et al. [G. Nicolini, A. Fogliata, E. Vanetti, A. Clivio, and L. Cozzi, Med. Phys. 33, 2839-2851 (2006)], was tested under a variety of experimental conditions to investigate its robustness, the possibility of using it in different clinics, and its performance. GLAaS was therefore tested on a low-energy Varian Clinac (6 MV) equipped with an amorphous silicon Portal Vision PV-aS500 with electronic readout IAS2 and on a high-energy Clinac (6 and 15 MV) equipped with a PV-aS1000 and IAS3 electronics. Tests were performed for three calibration conditions: (A) adding buildup on top of the cassette such that SDD - SSD = dmax and comparing measurements with corresponding doses computed at dmax; (B) without adding any buildup on top of the cassette and considering only the intrinsic water-equivalent thickness of the electronic portal imaging device (0.8 cm); and (C) without adding any buildup on top of the cassette but comparing measurements against doses computed at dmax. The latter procedure is similar to that usually applied when in vivo dosimetry is performed with solid-state diodes without sufficient buildup material. Quantitatively, the gamma index (gamma), as described by Low et al. [D. A. Low, W. B. Harms, S. Mutic, and J. A. Purdy, Med. Phys. 25, 656-660 (1998)], was assessed. The gamma index was computed for a distance to agreement (DTA) of 3 mm. The dose difference deltaD was considered as 2%, 3%, and 4%. As a measure of the quality of the results, the fraction of the field area with gamma larger than 1 (%FA) was scored. Results over a set of 50 test samples (including fields from head and neck, breast, prostate, anal canal, and brain cases) and from long-term routine usage demonstrated the robustness and stability of GLAaS. In general, the mean values of %FA remain below 3% for deltaD equal to or larger than 3%, while they are slightly larger for deltaD = 2%, with %FA in the range from 3% to 8%. Since its introduction into routine practice, 1453 fields have been verified with GLAaS at the authors' institute (6 MV beam). Using a DTA of 3 mm and a deltaD of 4%, the authors obtained %FA = 0.9 +/- 1.1 for the entire data set while, stratifying according to the dose calculation algorithm, they observed %FA = 0.7 +/- 0.9 for fields computed with the analytical anisotropic algorithm and %FA = 2.4 +/- 1.3 for pencil-beam-based fields, with a statistically significant difference between the two groups. If the data are stratified according to field splitting, they observed %FA = 0.8 +/- 1.0 for split fields and 1.0 +/- 1.2 for nonsplit fields, without any significant difference.
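For reference, a brute-force version of the gamma evaluation described above can be written in a few lines; the grid spacing, search radius, and global dose normalisation below are illustrative choices, not the GLAaS implementation.

import numpy as np

def gamma_index_2d(dose_ref, dose_eval, pixel_mm, dta_mm=3.0, dd_frac=0.03,
                   search_mm=9.0):
    """Brute-force 2D gamma analysis with a global dose-difference criterion.

    dose_ref, dose_eval: 2D dose arrays on the same grid with spacing pixel_mm.
    Returns the gamma map; %FA is the fraction of points with gamma > 1.
    """
    dd_abs = dd_frac * dose_ref.max()              # global normalisation
    r = int(round(search_mm / pixel_mm))           # search radius in pixels
    ny, nx = dose_ref.shape
    gamma = np.full_like(dose_ref, np.inf, dtype=float)
    for j in range(ny):
        for i in range(nx):
            best = np.inf
            for dj in range(-r, r + 1):
                for di in range(-r, r + 1):
                    jj, ii = j + dj, i + di
                    if 0 <= jj < ny and 0 <= ii < nx:
                        dist2 = (dj**2 + di**2) * pixel_mm**2 / dta_mm**2
                        diff2 = (dose_eval[jj, ii] - dose_ref[j, i])**2 / dd_abs**2
                        best = min(best, dist2 + diff2)
            gamma[j, i] = np.sqrt(best)
    return gamma

# %FA: fraction of the field area failing the criterion (ref, ev are hypothetical
# dose arrays on a 1 mm grid):
# fa_percent = 100.0 * np.mean(gamma_index_2d(ref, ev, pixel_mm=1.0) > 1.0)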