845 results for HLRF-BASED ALGORITHMS


Relevance: 80.00%

Abstract:

Aim - To use Monte Carlo (MC) simulation together with voxel phantoms to analyze the effect of tissue heterogeneity on the dose distributions and the equivalent uniform dose (EUD) for (125)I prostate implants. Background - Dose distribution calculations in low dose-rate brachytherapy are based on the dose deposition around a single source in a water phantom. This formalism does not take into account tissue heterogeneities, interseed attenuation, or finite patient dimensions. Tissue composition is especially important because of the photoelectric effect. Materials and Methods - Computed tomography (CT) scans of two patients with prostate cancer were used to create voxel phantoms for the MC simulations. An elemental composition and a density were assigned to each structure. The densities of the prostate, vesicles, rectum and bladder were determined from the CT electron densities of 100 patients. The same simulations were repeated treating the same phantom as pure water. Results were compared via dose-volume histograms and the EUD for the prostate and rectum. Results - The mean absorbed doses showed deviations of 3.3-4.0% for the prostate and of 2.3-4.9% for the rectum when comparing the calculations in water with those in the heterogeneous phantom. In the calculations in water, the prostate D90 was overestimated by 2.8-3.9% and the rectum D0.1cc showed dose differences of 6-8%. The EUD was overestimated by 3.5-3.7% for the prostate and by 7.7-8.3% for the rectum. Conclusions - The deposited dose was consistently overestimated in the water simulations. To increase the accuracy of dose distribution calculations, especially around the rectum, the introduction of model-based algorithms is recommended.
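
The EUD comparison above is straightforward to reproduce from any dose-volume histogram. Below is a minimal sketch of the generalized EUD of Niemierko, EUD = (sum_i v_i D_i^a)^(1/a), which is the quantity usually meant in such studies; the DVH bins and the parameter a are illustrative, not values from the paper.

```python
import numpy as np

def generalized_eud(doses_gy, volume_fractions, a):
    """Generalized EUD (Niemierko): EUD = (sum_i v_i * D_i**a)**(1/a).

    doses_gy         -- dose in each differential DVH bin (Gy)
    volume_fractions -- fractional volume per bin (sums to 1)
    a                -- tissue-specific parameter (large positive for
                        serial organs such as the rectum, negative for
                        tumours)
    """
    d = np.asarray(doses_gy, dtype=float)
    v = np.asarray(volume_fractions, dtype=float)
    return float((v * d**a).sum() ** (1.0 / a))

# Toy DVH: a uniform ~3.5% overestimation of every bin, as in the
# water-based calculation, propagates almost directly into the EUD.
doses = [140.0, 150.0, 160.0, 170.0]   # Gy
vols = [0.1, 0.3, 0.4, 0.2]
print(generalized_eud(doses, vols, a=-10.0))
```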

Relevance: 80.00%

Abstract:

Scheduling is one of the most important decisions in the operation of a production line. In this dissertation the scheduling problem is described and several methods for the optimization of scheduling problems are identified. The single machine case was studied by testing several instances with the objective of minimizing the total weighted tardiness, applying a meta-heuristic based on Local Search (LS) and two SB-based algorithms. The results obtained show that the SB-based algorithms produced solutions closer to the optimum than the LS-based algorithm. The results support the hypothesis that there are no algorithms specific to scheduling problems: the best way to find a good-quality solution in useful time is to experiment with different algorithms and compare the performance of the solutions obtained.
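
For concreteness, the objective being minimized and the kind of move a local-search meta-heuristic iterates over can be sketched as follows. This is a generic first-improvement adjacent-swap search on illustrative data, not the dissertation's specific algorithms.

```python
def weighted_tardiness(sequence, p, d, w):
    """Total weighted tardiness of a job sequence on a single machine.

    p, d, w -- processing times, due dates and weights per job index.
    """
    t, total = 0, 0
    for j in sequence:
        t += p[j]                        # completion time of job j
        total += w[j] * max(0, t - d[j])
    return total

def local_search(seq, p, d, w):
    """First-improvement local search over adjacent swaps."""
    best = weighted_tardiness(seq, p, d, w)
    improved = True
    while improved:
        improved = False
        for i in range(len(seq) - 1):
            seq[i], seq[i + 1] = seq[i + 1], seq[i]
            cost = weighted_tardiness(seq, p, d, w)
            if cost < best:
                best, improved = cost, True
            else:                        # undo a non-improving swap
                seq[i], seq[i + 1] = seq[i + 1], seq[i]
    return seq, best

p = [4, 2, 6, 3]; d = [5, 3, 11, 8]; w = [2, 1, 3, 1]
print(local_search([0, 1, 2, 3], p, d, w))
```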

Relevance: 80.00%

Abstract:

Debris flow hazard modelling at medium (regional) scale has been the subject of various studies in recent years. In this study, hazard zonation was carried out incorporating information about debris flow initiation probability (spatial and temporal) and the delimitation of the potential runout areas. Debris flow hazard zonation was carried out in the area of the Consortium of Mountain Municipalities of Valtellina di Tirano (Central Alps, Italy). The complexity of the phenomenon, the scale of the study, the variability of local conditioning factors, and a lack of data limited the use of process-based models for the runout zone delimitation. Firstly, a map of hazard initiation probabilities was prepared for the study area, based on the available susceptibility zoning information and on the analysis of two sets of aerial photographs for the temporal probability estimation. Afterwards, the hazard initiation map was used as one of the inputs to an empirical GIS-based model (Flow-R), developed at the University of Lausanne (Switzerland). An estimation of the debris flow magnitude was omitted, as the main aim of the analysis was to prepare a debris flow hazard map at medium scale. A digital elevation model with a 10 m resolution was used, together with land use, geology and debris flow hazard initiation maps, as input to the Flow-R model to restrict potential areas within each hazard initiation probability class to locations where debris flows are most likely to initiate. Afterwards, runout areas were calculated using multiple flow direction and energy-based algorithms. Maximum probable runout zones were calibrated using documented past events and aerial photographs. Finally, two debris flow hazard maps were prepared. The first simply delimits five hazard zones, while the second incorporates information about debris flow spreading direction probabilities, showing the areas more likely to be affected by future debris flows. The limitations of the modelling arise mainly from the models applied and the analysis scale, which neglect local controlling factors of debris flow hazard. The presented approach to debris flow hazard analysis, associating automatic detection of the source areas with a simple assessment of the debris flow spreading, provided results suitable for subsequent hazard and risk studies. However, more testing is needed to validate the approach and to transfer the parameters and results to other study areas.
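
For illustration, one spreading function of the kind used in Flow-R's multiple-flow-direction step is the Holmgren (1994) weighting, sketched below; the exponent x and the neighbour elevation drops are assumptions made up for the example, not calibrated values from the study.

```python
import numpy as np

def holmgren_mfd_weights(dz, dist, x=4.0):
    """Fraction of flow passed to each of the 8 neighbours under the
    Holmgren (1994) multiple-flow-direction weighting:
    w_i = tan(beta_i)**x / sum_j tan(beta_j)**x over downslope cells.

    dz   -- elevation drops to the neighbours (positive = downslope)
    dist -- horizontal distances to the neighbours
    x    -- exponent; x=1 spreads widely, large x approaches single flow
    """
    tan_beta = np.maximum(np.asarray(dz, float) / np.asarray(dist, float), 0.0)
    weights = tan_beta ** x
    s = weights.sum()
    return weights / s if s > 0 else weights  # pit cell: no outflow

# 10 m grid: cardinal neighbours at 10 m, diagonals at ~14.14 m.
dz = [1.0, 0.5, 0.0, -0.2, 0.8, 0.0, 0.3, 0.1]
dist = [10.0, 14.14, 10.0, 14.14, 10.0, 14.14, 10.0, 14.14]
print(holmgren_mfd_weights(dz, dist))
```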

Relevance: 80.00%

Abstract:

This paper proposes a high-level reinforcement learning (RL) control system for solving the action selection problem of an autonomous robot. Although the dominant approach when using RL has been to apply value-function-based algorithms, the system detailed here is characterized by the use of direct policy search methods. Rather than approximating a value function, these methods approximate a policy using an independent function approximator with its own parameters, trying to maximize the future expected reward. The policy-based algorithm presented in this paper is used for learning the internal state/action mapping of a behavior. In this preliminary work, we demonstrate its feasibility with simulated experiments using the underwater robot GARBI in a target-reaching task.
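
As a concrete instance of direct policy search, the sketch below implements a generic REINFORCE (likelihood-ratio) policy-gradient update for a softmax policy: the policy parameters are adjusted directly to maximize expected return, with no value function. This is a textbook construction, not the paper's exact controller; the feature dimensions and episode data are dummies.

```python
import numpy as np

def policy_probs(theta, state):
    """Softmax policy over discrete actions, linear in state features."""
    logits = theta @ state
    e = np.exp(logits - logits.max())
    return e / e.sum()

def reinforce_update(theta, episode, alpha=0.01, gamma=0.99):
    """One REINFORCE update: theta += alpha * G_t * grad log pi(a_t|s_t)."""
    g = 0.0
    for state, action, reward in reversed(episode):
        g = reward + gamma * g                  # discounted return G_t
        grad_log = -np.outer(policy_probs(theta, state), state)
        grad_log[action] += state               # d log softmax / d theta
        theta = theta + alpha * g * grad_log
    return theta

# Two state features, three actions; one dummy episode of (s, a, r).
theta = np.zeros((3, 2))
episode = [(np.array([1.0, 0.2]), 1, 0.0),
           (np.array([0.8, 0.5]), 2, 1.0)]
theta = reinforce_update(theta, episode)
print(policy_probs(theta, np.array([1.0, 0.2])))
```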

Relevance: 80.00%

Abstract:

There is increasing evidence to suggest that the presence of mesoscopic heterogeneities constitutes the predominant attenuation mechanism at seismic frequencies. As a consequence, centimeter-scale perturbations of the subsurface physical properties should be taken into account in seismic modeling whenever detailed and accurate responses of the target structures are desired. This is, however, computationally prohibitive, since extremely small grid spacings would be necessary. A convenient way to circumvent this problem is to use an upscaling procedure that replaces the heterogeneous porous media by equivalent visco-elastic solids. In this work, we solve Biot's equations of motion to perform numerical simulations of seismic wave propagation through porous media containing mesoscopic heterogeneities. We then use an upscaling procedure to replace the heterogeneous poro-elastic regions by homogeneous equivalent visco-elastic solids and repeat the simulations using the visco-elastic equations of motion. We find that, despite the equivalent attenuation behavior of the heterogeneous poro-elastic medium and the equivalent visco-elastic solid, the seismograms may differ due to diverging boundary conditions at fluid-solid interfaces, where additional options exist for the poro-elastic case. In particular, we observe that the seismograms agree for closed-pore boundary conditions but differ significantly for open-pore boundary conditions. This is an interesting result with potentially important implications for wave-equation-based algorithms in exploration geophysics involving fluid-solid interfaces, such as wave field decomposition.
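
For reference, the low-frequency Biot equations of motion solved in such simulations can be written, in one standard notation, as

```latex
\rho\,\partial_t^2 \mathbf{u} + \rho_f\,\partial_t^2 \mathbf{w}
  = \nabla \cdot \boldsymbol{\sigma},
\qquad
\rho_f\,\partial_t^2 \mathbf{u} + m\,\partial_t^2 \mathbf{w}
  + \frac{\eta}{\kappa}\,\partial_t \mathbf{w} = -\nabla p ,
```

where u is the solid displacement, w the relative fluid displacement, sigma the total stress, p the pore pressure, rho and rho_f the bulk and fluid densities, m = T rho_f / phi an effective fluid mass density (T tortuosity, phi porosity), eta the fluid viscosity and kappa the permeability. The upscaling replaces this system by visco-elastic equations whose complex, frequency-dependent moduli reproduce the attenuation and dispersion of the heterogeneous poro-elastic medium.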

Relevance: 80.00%

Abstract:

There is increasing evidence to suggest that the presence of mesoscopic heterogeneities constitutes an important seismic attenuation mechanism in porous rocks. As a consequence, centimetre-scale perturbations of the rock physical properties should be taken into account in seismic modelling whenever detailed and accurate responses of specific target structures are desired; this is, however, computationally prohibitive. A convenient way to circumvent this problem is to use an upscaling procedure that replaces each of the heterogeneous porous media composing the geological model by a corresponding equivalent visco-elastic solid and to solve the visco-elastic equations of motion for the resulting equivalent model. While the overall qualitative validity of this procedure is well established, there are as yet no quantitative analyses regarding the equivalence of the seismograms resulting from the original poro-elastic and the corresponding upscaled visco-elastic models. To address this issue, we compare poro-elastic and visco-elastic solutions for a range of marine-type models of increasing complexity. We find that, despite the identical dispersion and attenuation behaviour of the heterogeneous poro-elastic and the equivalent visco-elastic media, the seismograms may differ substantially due to diverging boundary conditions, for which additional options exist in the poro-elastic case. In particular, we observe that at the fluid/porous-solid interface the poro- and visco-elastic seismograms agree for closed-pore boundary conditions but differ significantly for open-pore boundary conditions. This is an important result with potentially far-reaching implications for wave-equation-based algorithms in exploration geophysics involving fluid/porous-solid interfaces, such as wavefield decomposition.
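
The open- and closed-pore cases referred to above are the two limiting Deresiewicz-Skalak conditions at a fluid/porous-solid interface; in the usual textbook form (stated here as an assumed standard formulation, not quoted from the paper) they read

```latex
\text{open pore:}\quad p_{\mathrm{fluid}} = p_{\mathrm{pore}},
\qquad
\text{closed (sealed) pore:}\quad \dot{\mathbf{w}} \cdot \mathbf{n} = 0 ,
```

in both cases together with continuity of the normal displacement and of the normal traction across the interface. The visco-elastic equivalent medium has no pore pressure degree of freedom, which is why the two poro-elastic options cannot be distinguished after upscaling.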

Relevance: 80.00%

Abstract:

Genotype-based algorithms are valuable tools for the identification of patients eligible for CCR5 inhibitor administration in clinical practice. Among the available methods, geno2pheno[coreceptor] (G2P) is the most widely used online tool for tropism prediction. This study was conceived to assess whether combining the G2P prediction with the V3 peptide net charge (NC) value could improve the accuracy of tropism prediction. A total of 172 V3 bulk sequences from 143 patients were analyzed by G2P and NC values. A phenotypic assay was performed by cloning the complete env gene, and tropism was determined on U87_CCR5(+)/CXCR4(+) cells. Sequences were stratified according to the agreement between NC values and G2P results. Of the sequences predicted as X4 by G2P, 61% had NC values higher than 5; similarly, 76% of the sequences predicted as R5 by G2P had NC values below 4. Sequences with NC values between 4 and 5 were associated with divergent G2P predictions: 65% of these samples were predicted as R5-tropic and 35% as X4-tropic. Sequences identified as X4 by NC value had at least one positive residue at positions known to be involved in tropism prediction and a positive residue at position 32. These data support the hypothesis that NC values between 4 and 5 could be associated with the presence of dual/mixed-tropic (DM) variants. The phenotypic assay performed on a subset of sequences confirmed the tropism prediction for concordant sequences and showed that NC values between 4 and 5 are associated with DM tropism. These results suggest that the combination of G2P and NC could increase the accuracy of tropism prediction. A more reliable identification of X4 variants would be useful for better selecting candidates for Maraviroc (MVC) administration, and also as a predictive marker of coreceptor switching, which is strongly associated with the phase of infection.
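
A sketch of the NC computation and of the combined decision rule described above follows. The unit-charge convention (Arg/Lys counted +1, Asp/Glu counted -1) is a common assumption for V3 net charge, and the example sequence is an illustrative consensus-B-like V3, not one from the study.

```python
def v3_net_charge(v3_sequence):
    """Net charge of a V3 amino-acid sequence: Arg/Lys count +1 and
    Asp/Glu count -1 (a common convention; His is sometimes assigned
    a partial positive charge instead of 0)."""
    positive = sum(v3_sequence.count(aa) for aa in "RK")
    negative = sum(v3_sequence.count(aa) for aa in "DE")
    return positive - negative

def classify(nc, g2p_call):
    """Combine NC with a G2P call using the cut-offs discussed above:
    NC > 5 supports X4, NC < 4 supports R5, and 4 <= NC <= 5 flags
    possible dual/mixed (DM) tropism."""
    if nc > 5:
        return "X4-leaning"
    if nc < 4:
        return "R5-leaning"
    return f"possible D/M (defer to phenotype; G2P says {g2p_call})"

seq = "CTRPNNNTRKSIHIGPGRAFYTTGEIIGDIRQAHC"  # hypothetical V3 example
print(v3_net_charge(seq), classify(v3_net_charge(seq), "R5"))
```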

Relevance: 80.00%

Abstract:

Background: Care for patients with colon and rectal cancer has improved in the last twenty years; however, considerable variation still exists in cancer management and outcome between European countries. Therefore EURECCA, the acronym of European Registration of Cancer Care, aims at defining core treatment strategies and developing a European audit structure in order to improve the quality of care for all patients with colon and rectal cancer. In December 2012 the first multidisciplinary consensus conference on colon and rectal cancer was held. The expert panel consisted of representatives of European scientific organisations involved in cancer care of patients with colon and rectal cancer and representatives of national colorectal registries. Methods: The expert panel had delegates of the European Society of Surgical Oncology (ESSO), European Society for Radiotherapy & Oncology (ESTRO), European Society of Pathology (ESP), European Society for Medical Oncology (ESMO), European Society of Radiology (ESR), European Society of Coloproctology (ESCP), European CanCer Organisation (ECCO), European Oncology Nursing Society (EONS) and the European Colorectal Cancer Patient Organisation (EuropaColon), as well as delegates from national registries or audits. Experts commented and voted in two web-based online voting rounds before the meeting (4th-25th October and 20th November-3rd December 2012) as well as one online round after the meeting (4th-20th March 2013), and were invited to lecture on the subjects during the meeting (13th-15th December 2012). The sentences in the consensus document were available during the meeting, and a televoting round by all participants was performed during the conference. All sentences that were voted on are available on the EURECCA website www.canceraudit.eu. The consensus document was divided into sections describing evidence-based algorithms for diagnostics, pathology, surgery, medical oncology, radiotherapy, and follow-up, where applicable, separately for colon cancer, rectal cancer and stage IV disease. Consensus was achieved using the Delphi method. Results: The total number of voted sentences was 465. All chapters were voted on by at least 75% of the experts. Of the 465 sentences, 84% achieved large consensus, 6% achieved moderate consensus, and 7% resulted in minimum consensus. Only 3% were rejected by more than 50% of the members. Conclusions: It is feasible to achieve European consensus on key diagnostic and treatment issues using the Delphi method. This consensus embodies the expertise of professionals from all disciplines involved in the care of patients with colon and rectal cancer. Diagnostic and treatment algorithms were developed to implement the current evidence and to define core treatment guidance for multidisciplinary team management of colon and rectal cancer throughout Europe.

Relevance: 80.00%

Abstract:

The Wigner higher order moment spectra (WHOS) are defined as extensions of the Wigner-Ville distribution (WD) to higher order moment spectra domains. A general class of time-frequency higher order moment spectra is also defined in terms of arbitrary higher order moments of the signal, as generalizations of Cohen's general class of time-frequency representations. The properties of the general class of time-frequency higher order moment spectra can be related to the properties of WHOS, which are, in fact, extensions of the properties of the WD. Discrete time and frequency Wigner higher order moment spectra (DTF-WHOS) distributions are introduced for signal processing applications and are shown to be implemented with two FFT-based algorithms. One application is presented in which the Wigner bispectrum (WB), a WHOS in the third-order moment domain, is utilized for the detection of transient signals embedded in noise. The WB is compared with the WD in terms of simulation examples and analysis of real sonar data. It is shown that better detection schemes can be derived, at low signal-to-noise ratio, when the WB is applied.
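
As a concrete illustration of the FFT-based implementation idea, the sketch below computes a discrete (pseudo) Wigner-Ville distribution by taking, for each time sample, one FFT of the instantaneous autocorrelation kernel. This is the generic second-order construction; the paper's DTF-WHOS algorithms extend the same scheme to higher-order kernels.

```python
import numpy as np

def wigner_ville(x):
    """Discrete (pseudo) Wigner-Ville distribution via one FFT per
    time sample: W[n, k] = FFT_m{ x[n+m] * conj(x[n-m]) }.

    Returns an (N, N) real array (the kernel is Hermitian in the lag
    variable, so each FFT is real for an analytic input signal)."""
    x = np.asarray(x, dtype=complex)
    n_samp = len(x)
    wvd = np.zeros((n_samp, n_samp))
    for n in range(n_samp):
        # largest symmetric lag window that stays inside the signal
        lmax = min(n, n_samp - 1 - n)
        kernel = np.zeros(n_samp, dtype=complex)
        for m in range(-lmax, lmax + 1):
            kernel[m % n_samp] = x[n + m] * np.conj(x[n - m])
        wvd[n] = np.fft.fft(kernel).real
    return wvd

# Linear chirp test signal: the WD concentrates its energy along the
# instantaneous-frequency line.
t = np.arange(256) / 256.0
sig = np.exp(2j * np.pi * (20 * t + 40 * t**2))
print(wigner_ville(sig).shape)
```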

Relevance: 80.00%

Abstract:

BACKGROUND: Analyses of brain responses to external stimuli are typically based on means computed across conditions. However, in many cognitive and clinical applications, taking their variability across trials into account has turned out to be statistically more sensitive than comparing their means. NEW METHOD: In this study we present a novel implementation of single-trial topographic analysis (STTA) for discriminating auditory evoked potentials at predefined time windows. This analysis was previously introduced for extracting spatio-temporal features at the level of the whole neural response; adapting the STTA to specific time windows is an essential step for comparing its performance to other time-window-based algorithms. RESULTS: We analyzed responses to standard vs. deviant sounds and showed that the new implementation of the STTA gives above-chance decoding results in all subjects (compared to 7 out of 11 with the original method). In comatose patients, the improvement in decoding performance was even more pronounced than in healthy controls and doubled the number of significant results. COMPARISON WITH EXISTING METHOD(S): We compared the results obtained with the new STTA to those based on logistic regression in healthy controls and in patients. Logistic regression performed better in healthy controls; however, only the new STTA provided significant results in comatose patients at the group level. CONCLUSIONS: Our results provide quantitative evidence that a systematic investigation of the accuracy of established methods in normal and clinical populations is an essential step for optimizing decoding performance.
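
For reference, the time-window logistic-regression baseline against which the STTA is compared can be set up along the following lines; the array shapes, window indices and injected effect are illustrative, and the STTA itself is not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def window_decoding(epochs, labels, t_start, t_stop):
    """Single-trial decoding of standard vs. deviant responses in a
    predefined time window with a logistic-regression classifier.

    epochs -- (n_trials, n_channels, n_times) array
    labels -- (n_trials,) binary array (0 = standard, 1 = deviant)
    """
    # average over the window, keep the spatial (topographic) pattern
    feats = epochs[:, :, t_start:t_stop].mean(axis=2)
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, feats, labels, cv=5).mean()

rng = np.random.default_rng(0)
epochs = rng.standard_normal((80, 32, 200))
labels = rng.integers(0, 2, 80)
epochs[labels == 1, :, 100:140] += 0.4   # injected "deviant" effect
print(window_decoding(epochs, labels, 100, 140))
```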


Relevance: 80.00%

Abstract:

We tested and compared the performance of the Roach formula, the Partin tables, and three decision-tree-based machine learning (ML) algorithms in identifying node-positive (N+) prostate cancer (PC). 1,555 cN0 and 50 cN+ PC cases were analyzed. Results were also verified on an independent population of 204 operated cN0 patients with known pN status (187 pN0, 17 pN1). ML performed better, also when tested on the surgical population, with accuracy, specificity, and sensitivity ranging between 48-86%, 35-91%, and 17-79%, respectively. ML potentially allows better prediction of the nodal status of PC, and hence better tailoring of pelvic irradiation.
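
Of the baselines mentioned, the Roach formula is a simple closed-form estimate and easy to sketch (the Partin tables are lookup tables and the ML models are learned from data); the version below is the commonly cited form of the formula, included for illustration.

```python
def roach_ln_risk(psa_ng_ml, gleason_score):
    """Roach formula estimate of lymph-node involvement risk (%):
    risk = 2/3 * PSA + (Gleason - 6) * 10, clipped to [0, 100].
    One of the baseline predictors the ML models were compared with."""
    risk = (2.0 / 3.0) * psa_ng_ml + (gleason_score - 6) * 10.0
    return min(max(risk, 0.0), 100.0)

# e.g. PSA 15 ng/ml, Gleason 7 -> 20% estimated N+ risk
print(roach_ln_risk(15.0, 7))
```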

Relevance: 80.00%

Abstract:

While a red-green-blue (RGB) image of the retina carries quite limited information, retinal multispectral images provide both spatial and spectral information, which can enhance the capability of exploring eye-related problems in their early stages. In this thesis, two learning-based algorithms for reconstructing spectral retinal images from RGB images are developed in a two-step manner. First, related previous techniques are reviewed and studied. Then, the most suitable methods are enhanced and combined into new algorithms for the reconstruction of spectral retinal images. The proposed approaches are based on a radial basis function network that learns a mapping from the tristimulus colour space to the multispectral space. The resemblance between the reproduced spectral images and the original images is estimated using the spectral distance metrics spectral angle mapper, spectral correlation mapper, and spectral information divergence, which show promising results for the suggested algorithms.
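
Of the three similarity metrics named, the spectral angle mapper is the simplest to state; a minimal per-pixel sketch with toy spectra follows.

```python
import numpy as np

def spectral_angle_mapper(s_ref, s_rec):
    """Spectral angle (radians) between a reference and a reconstructed
    spectrum; 0 means identical spectral shape (scale-invariant)."""
    s_ref = np.asarray(s_ref, dtype=float)
    s_rec = np.asarray(s_rec, dtype=float)
    cos = np.dot(s_ref, s_rec) / (np.linalg.norm(s_ref) * np.linalg.norm(s_rec))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

# Two toy 5-band spectra differing only by a scale factor -> SAM = 0
print(spectral_angle_mapper([1, 2, 3, 4, 5], [2, 4, 6, 8, 10]))
```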

Relevance: 80.00%

Abstract:

Experimental Extended X-ray Absorption Fine Structure (EXAFS) spectra carry information about the chemical structure of metal protein complexes. However, predicting the structure of such complexes from EXAFS spectra is not a simple task. Currently, methods such as Monte Carlo optimization or simulated annealing are used in structure refinement of EXAFS. These methods have proven somewhat successful in structure refinement but have not been successful in finding the global minimum. Multiple population-based algorithms, including a genetic algorithm, a restarting genetic algorithm, differential evolution, and particle swarm optimization, are studied for their effectiveness in structure refinement of EXAFS. The oxygen-evolving complex in the S1 state is used as a benchmark for comparing the algorithms. These algorithms were successful in finding new atomic structures whose calculated EXAFS spectra improve on those of atomic structures previously found.
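
As a sketch of how one of the studied population-based optimizers plugs into such a refinement, the snippet below runs SciPy's differential evolution on a stand-in misfit function. The real objective would compare a calculated EXAFS spectrum (e.g. from an ab initio scattering code) against experiment; the geometry, bounds, and target distance here are invented for illustration.

```python
import numpy as np
from scipy.optimize import differential_evolution

def exafs_misfit(coords_flat):
    """Hypothetical stand-in for the real objective: misfit between the
    EXAFS spectrum calculated from trial atomic coordinates and the
    experimental spectrum. Here we merely penalize deviation of the
    inter-atomic distances from an idealized 2.7 A shell."""
    coords = coords_flat.reshape(-1, 3)
    dists = np.linalg.norm(coords[1:] - coords[0], axis=1)
    return float(np.sum((dists - 2.7) ** 2))

# Refine three atoms inside a 4 A box around the origin.
bounds = [(-4.0, 4.0)] * 9
result = differential_evolution(exafs_misfit, bounds, seed=1, tol=1e-8)
print(result.x.reshape(-1, 3), result.fun)
```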

Relevance: 80.00%

Abstract:

Over the last decade we have seen an incredible transformation of the world of music, which has moved from cassettes and compact discs to online digital music. With the explosion of digital music, we need music recommendation systems to choose, from these huge online or personal databases, the songs likely to be appreciated. Currently, most music recommendation systems use collaborative filtering or content-based filtering algorithms. In this thesis, we propose an original hybrid algorithm that combines collaborative filtering with tag-based filtering, enhanced by a usage-context-based filtering technique, in order to produce better recommendations. Our approach assumes that the user's preferences change with the context of use. For example, a user listens to one kind of music while driving to work, another while travelling with the family on vacation, another during a romantic evening or at parties. Moreover, if the selection is generated for more than one user (family trip, party), the system will propose songs according to the preferences of all these users. The main objective of our system is to recommend to the user music from his personal collection or from the system's collection, as well as new releases and upcoming concerts. Another objective of our system is to collect data from external sources, relying on crawling techniques and RSS feeds, to offer music-related information such as new releases, upcoming concerts, lyrics, and similar artists. We will try to unify data sets freely available on the Web, such as the listening habits from Last.fm, the MusicBrainz music database, and MusicStrands tags, in order to obtain unique identifiers for songs, albums, and artists.
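
A toy illustration of the kind of context-dependent blending the thesis proposes is sketched below; the contexts, weights and scores are hypothetical, not the thesis's tuned values.

```python
def hybrid_score(cf_score, tag_score, context_weights, context):
    """Hypothetical combination rule for the hybrid recommender: a
    context-dependent weighted sum of the collaborative-filtering
    score and the tag-based score for a candidate song."""
    w_cf, w_tag = context_weights.get(context, (0.5, 0.5))
    return w_cf * cf_score + w_tag * tag_score

weights = {
    "driving":  (0.7, 0.3),   # lean on the user's listening history
    "party":    (0.3, 0.7),   # lean on tags shared by all attendees
    "romantic": (0.5, 0.5),
}
print(hybrid_score(0.8, 0.4, weights, "party"))
```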