805 results for Concurrent computing
Abstract:
We consider the numerical treatment of the optical flow problem by evaluating the performance of the trust region method versus the line search method. To the best of our knowledge, the trust region method is studied here for the first time for variational optical flow computation. Four different optical flow models are used to test the performance of the proposed algorithm, combining linear and nonlinear data terms with quadratic and TV regularization. We show that trust region often performs better than line search, especially in the presence of non-linearity and non-convexity in the model.
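The robustness advantage described above can be illustrated with a minimal 1-D trust-region Newton iteration (a generic sketch of the technique only; the paper's actual solver, flow models, and objectives are not given here). Where the objective is locally non-convex (f''(x) ≤ 0), the raw Newton direction points uphill and a line search along it has nothing to minimise, while the trust region simply steps to the boundary of the region in which the quadratic model is trusted:

```python
def trust_region_newton_1d(f, fp, fpp, x, delta=1.0, delta_max=10.0,
                           tol=1e-10, max_iter=100):
    """Minimal 1-D trust-region Newton sketch (illustrative only).

    The local quadratic model m(p) = f(x) + f'(x) p + 0.5 f''(x) p^2 is
    minimised over |p| <= delta; the radius grows or shrinks according to
    how well the model predicted the actual decrease.
    """
    for _ in range(max_iter):
        g, h = fp(x), fpp(x)
        if abs(g) < tol:
            break
        # Model minimiser, clipped to the trust region; when h <= 0 the
        # model is unbounded below, so step to the downhill boundary.
        if h > 0 and abs(g / h) <= delta:
            p = -g / h
        else:
            p = -delta if g > 0 else delta
        # Predicted reduction is strictly positive whenever g != 0.
        predicted = -(g * p + 0.5 * h * p * p)
        actual = f(x) - f(x + p)
        rho = actual / predicted
        if rho < 0.25:
            delta *= 0.25                        # poor model: shrink region
        elif rho > 0.75 and abs(p) >= delta - 1e-12:
            delta = min(2 * delta, delta_max)    # good full step: grow region
        if rho > 0.1:                            # accept only decreasing steps
            x = x + p
    return x
```

On f(x) = x⁴ − 3x² + x started at x = 0.1, where f''(x) < 0 and the Newton direction is an ascent direction, the iteration still settles at the global minimiser near x ≈ −1.30.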
Abstract:
The goal of this study was to investigate the impact of computing parameters and of the location of volumes of interest (VOI) on the calculation of the 3D noise power spectrum (NPS), in order to determine an optimal set of computing parameters and propose a robust method for evaluating the noise properties of imaging systems. Noise stationarity in noise volumes acquired with a water phantom on a 128-MDCT and a 320-MDCT scanner was analyzed in the spatial domain in order to define locally stationary VOIs. The influence of the computing parameters on the 3D NPS measurement, namely the sampling distances bx,y,z, the VOI lengths Lx,y,z, the number of VOIs NVOI, and the structured noise, was investigated to minimize measurement errors. The effect of the VOI locations on the NPS was also investigated. Results showed that the noise (standard deviation) varies more in the r-direction (phantom radius) than in the z-direction. A 25 × 25 × 40 mm³ VOI associated with DFOV = 200 mm (Lx,y,z = 64, bx,y = 0.391 mm with a 512 × 512 matrix) and a first-order detrending method to reduce structured noise led to an accurate NPS estimation. NPS estimated from off-centered small VOIs had a directional dependency, contrary to NPS obtained from large VOIs located in the center of the volume or from small VOIs located on a concentric circle. This showed that the VOI size and location play a major role in the determination of the NPS when images are not stationary. This study emphasizes the need for consistent measurement methods to assess and compare image quality in CT.
Abstract:
Eighty-Sixth General Assembly Joint Rules Governing Lobbyists (House Concurrent Resolution 7) House Adopted 2-3-2015, Senate Adopted 2-4-2015
Abstract:
BACKGROUND: Letrozole radiosensitises breast cancer cells in vitro. In clinical settings, no data exist for the combination of letrozole and radiotherapy. We assessed concurrent and sequential radiotherapy and letrozole in the adjuvant setting. METHODS: This phase 2 randomised trial was undertaken in two centres in France and one in Switzerland between Jan 12, 2005, and Feb 21, 2007. 150 postmenopausal women with early-stage breast cancer were randomly assigned after conserving surgery to either concurrent radiotherapy and letrozole (n=75) or sequential radiotherapy and letrozole (n=75). Randomisation was open label with a minimisation technique, stratified by investigational centres, chemotherapy (yes vs no), radiation boost (yes vs no), and value of radiation-induced lymphocyte apoptosis (≤16% vs >16%). Whole breast was irradiated to a total dose of 50 Gy in 25 fractions over 5 weeks. In the case of supraclavicular and internal mammary node irradiation, the dose was 44-50 Gy. Letrozole was administered orally once daily at a dose of 2.5 mg for 5 years (beginning 3 weeks pre-radiotherapy in the concomitant group, and 3 weeks post-radiotherapy in the sequential group). The primary endpoint was the occurrence of acute (during and within 6 weeks of radiotherapy) and late (within 2 years) radiation-induced grade 2 or worse toxic effects of the skin. Analyses were by intention to treat. This study is registered with ClinicalTrials.gov, number NCT00208273. FINDINGS: All patients were analysed apart from one in the concurrent group who withdrew consent before any treatment. During radiotherapy and within the first 12 weeks after radiotherapy, 31 patients in the concurrent group and 31 in the sequential group had any grade 2 or worse skin-related toxicity. The most common skin-related adverse event was dermatitis: four patients in the concurrent group and six in the sequential group had grade 3 acute skin dermatitis during radiotherapy. 
At a median follow-up of 26 months (range 3-40), two patients in each group had grade 2 or worse late effects (both radiation-induced subcutaneous fibrosis). INTERPRETATION: Letrozole can be safely delivered shortly after surgery and concomitantly with radiotherapy. Long-term follow-up is needed to investigate cardiac side-effects and cancer-specific outcomes. FUNDING: Novartis Oncology France.
Abstract:
Purpose/Objective(s): Letrozole radiosensitizes breast cancer cells in vitro. In clinical settings, no data exist for the combination of letrozole and radiotherapy. We assessed concurrent and sequential radiotherapy and letrozole in the adjuvant setting. Materials/Methods: The present study is registered with ClinicalTrials.gov, number NCT00208273. This Phase 2 randomized trial was undertaken in two centers in France and one in Switzerland between January 12, 2005, and February 21, 2007. One hundred fifty postmenopausal women with early-stage breast cancer were randomly assigned after conserving surgery to either concurrent radiotherapy and letrozole (n = 75) or sequential radiotherapy and letrozole (n = 75). Randomization was open label with a minimization technique, stratified by investigational centers, chemotherapy (yes vs. no), radiation boost (yes vs. no), and value of radiation-induced lymphocyte apoptosis (≤16% vs. >16%). The whole breast was irradiated to a total dose of 50 Gy in 25 fractions over 5 weeks. In the case of supraclavicular and internal mammary node irradiation, the dose was 44-50 Gy. Letrozole was administered orally once daily at a dose of 2.5 mg for 5 years (beginning 3 weeks pre-radiotherapy in the concomitant group, and 3 weeks post-radiotherapy in the sequential group). The primary endpoint was the occurrence of acute (during and within 6 weeks of radiotherapy) and late (within 2 years) radiation-induced Grade 2 or worse toxic effects of the skin and lung (functional pulmonary test and lung CT scan). Analyses were by intention-to-treat. 
The long-term follow-up after 2 years was only performed in Montpellier (n = 121) and evaluated skin toxicity (clinical examination every 6 months), lung fibrosis (one CT scan yearly), and cosmetic outcome. Results: All patients were analyzed apart from 1 in the concurrent group who withdrew consent before any treatment. Within the first 2 years (n = 149), no lung toxicity was identified by CT scan and no modification from baseline was noted by the lung diffusion capacity test. Two patients in each group had Grade 2 or worse late effects (both radiation-induced subcutaneous fibrosis [RISF]). After 2 years (n = 121), and with a median follow-up of 50 months (38-62), 2 patients (1 in each arm) presented a Grade 3 RISF. No lung toxicity was identified by CT scan. Cosmetic results (photographs) and quality of life were good to excellent. All patients who had Grade 3 subcutaneous fibrosis had an RILA value of 16% or less, irrespective of the sequence with letrozole. Conclusions: With long-term follow-up, letrozole can be safely delivered shortly after surgery and concomitantly with radiotherapy.
Abstract:
Performance measurements of computer system components and software yield information that can be used to improve performance and to support hardware procurement decisions. This thesis introduces performance measurement and measurement programs, so-called benchmarks. Freely available benchmark programs of various types, suitable for analysing the performance of a Linux computing cluster, were sought and evaluated. The benchmarks were grouped and assessed by testing their features on a Linux cluster. The thesis also discusses the challenges of carrying out measurements and of parallel computing. Benchmarks were found for many purposes, and they proved to vary in quality and scope. They have also been collected into software suites in order to give a broader picture of hardware performance than a single program can provide. It is essential to understand the rate at which data can be transferred to the processor from main memory, from disk systems, and from other compute nodes. A typical benchmark program contains a computationally intensive mathematical algorithm of the kind used in scientific software. Depending on the benchmark, interpreting and exploiting the results can be challenging.
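The point about data-transfer rates can be illustrated with a toy, STREAM-style triad kernel. This is a hypothetical sketch in plain Python, not one of the suites evaluated in the thesis; real memory-bandwidth benchmarks such as STREAM are compiled programs and report far more meaningful figures.

```python
import array
import time

def triad(b, c, s):
    """STREAM-style triad a[i] = b[i] + s * c[i].

    At large sizes the measured rate is dominated by memory traffic
    rather than arithmetic (toy sketch; interpreter overhead dominates
    in pure Python)."""
    return array.array("d", (bi + s * ci for bi, ci in zip(b, c)))

n = 1_000_000
b = array.array("d", range(n))
c = array.array("d", range(n))
t0 = time.perf_counter()
a = triad(b, c, 2.0)
elapsed = time.perf_counter() - t0
# Three arrays of 8-byte doubles move through memory each pass.
mb_moved = 3 * n * 8 / 1e6
print(f"{mb_moved / elapsed:.0f} MB/s effective")
```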
Abstract:
Gaia is the most ambitious space astrometry mission currently envisaged and is a technological challenge in all its aspects. We describe a proposal for the payload data handling system of Gaia, as an example of a high-performance, real-time, concurrent, and pipelined data system. This proposal includes the front-end systems for the instrumentation, the data acquisition and management modules, the star data processing modules, and the payload data handling unit. We also review other payload and service module elements and we illustrate a data flux proposal.
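The kind of pipelined, concurrent data flow described above can be sketched with threads and bounded queues (a generic illustration only, in no way Gaia's actual payload architecture): each stage runs concurrently with its neighbours, and the bounded queues provide back-pressure when a downstream stage falls behind.

```python
import queue
import threading

SENTINEL = None  # marks end of the data stream

def stage(inbox, outbox, work):
    """One pipeline stage: consume, transform, forward, propagate the end marker."""
    while True:
        item = inbox.get()
        if item is SENTINEL:
            outbox.put(SENTINEL)
            return
        outbox.put(work(item))

def run_pipeline(samples, stages):
    """Feed samples through a chain of worker threads linked by bounded queues."""
    qs = [queue.Queue(maxsize=8) for _ in range(len(stages) + 1)]
    threads = [threading.Thread(target=stage, args=(qs[i], qs[i + 1], w))
               for i, w in enumerate(stages)]
    for t in threads:
        t.start()
    for s in samples:
        qs[0].put(s)
    qs[0].put(SENTINEL)
    out = []
    while (item := qs[-1].get()) is not SENTINEL:
        out.append(item)   # FIFO queues and one thread per stage preserve order
    for t in threads:
        t.join()
    return out
```

For example, `run_pipeline(range(5), [lambda x: x * 2, lambda x: x + 1])` doubles then increments each sample while both stages overlap in time.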
Abstract:
Our efforts are directed towards understanding the coscheduling mechanism in a NOW system when a parallel job is executed jointly with local workloads, balancing parallel performance against local interactive response. Explicit and implicit coscheduling techniques have been implemented in a PVM-Linux NOW (or cluster). Furthermore, dynamic coscheduling remains an open question when parallel jobs are executed in a non-dedicated cluster. A basic model for dynamic coscheduling in cluster systems is presented in this paper, and one dynamic coscheduling algorithm for this model is proposed. The applicability of this algorithm has been proved and its performance analyzed by simulation. Finally, a new tool (named Monito) for monitoring the different message queues in such environments is presented. The main aim of implementing this facility is to provide a means of capturing the bottlenecks and overheads of the communication system in a PVM-Linux cluster.
Abstract:
A Fundamentals of Computing Theory course involves different topics that are core to the Computer Science curricula and whose level of abstraction makes them difficult both to teach and to learn. Such difficulty stems from the complexity of the abstract notions involved and the required mathematical background. Surveys conducted among our students showed that many of them were applying some theoretical concepts mechanically rather than developing significant learning. This paper presents a number of didactic strategies that we introduced in the Fundamentals of Computing Theory curriculum to cope with this problem. The proposed strategies were based on a stronger use of technology and a constructivist approach. The final goal was to promote more significant learning of the course topics.
Abstract:
In metallurgical plants, high-quality metal production is always required. Nowadays, soft computing applications are increasingly used for the automation of manufacturing processes and quality control instead of purely mechanistic techniques. This thesis presents an overview of soft computing methods. As an example of a soft computing application, an effective fuzzy expert system model for the automated quality control of a steel degassing process was developed. The purpose of this work is to describe the fuzzy relations as quality hypersurfaces by varying the number of linguistic variables and fuzzy sets.
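A minimal Mamdani-style sketch shows the ingredients named above: linguistic variables, fuzzy sets as membership functions, and a rule base whose output surface changes as the sets and variables are varied. All names, ranges, and rules here are invented for illustration; the thesis's actual rule base for the degassing process is not reproduced.

```python
def tri(a, b, c):
    """Triangular membership function on break points a <= b <= c."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

# Hypothetical linguistic variables for a degassing process (illustrative only).
temp_low, temp_high = tri(1500, 1550, 1600), tri(1550, 1600, 1650)
qual_poor, qual_good = tri(0, 25, 50), tri(50, 75, 100)

def infer(temp, grid=range(0, 101)):
    """Mamdani min-max inference with two toy rules:
       IF temp is low  THEN quality is poor
       IF temp is high THEN quality is good"""
    w_poor, w_good = temp_low(temp), temp_high(temp)
    num = den = 0.0
    for q in grid:   # centroid defuzzification over the output universe
        mu = max(min(w_poor, qual_poor(q)), min(w_good, qual_good(q)))
        num += mu * q
        den += mu
    return num / den if den else 0.0
```

Varying the number of fuzzy sets per variable reshapes the resulting quality surface, which is the kind of hypersurface the thesis describes.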
Abstract:
This master's thesis aims to study and present, based on the literature, how evolutionary algorithms are used to solve different search and optimisation problems in the area of software engineering. Evolutionary algorithms are methods that imitate the process of natural evolution. An artificial evolution process evaluates the fitness of each individual, the individuals being candidate solutions. The next population of candidate solutions is formed from the good properties of the current population by applying different mutation and crossover operations. The literature was searched for different kinds of evolutionary algorithm applications related to software engineering; the applications found were classified and presented, together with the necessary basics of evolutionary algorithms. It was concluded that the majority of evolutionary algorithm applications related to software engineering concern software design or testing. For example, there were applications for classifying software production data, project scheduling, static task scheduling related to parallel computing, allocating modules to subsystems, N-version programming, test data generation, and generating an integration test order. Many applications were experimental rather than ready for real production use. There were also some Computer Aided Software Engineering tools based on evolutionary algorithms.
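The loop summarised above (evaluate fitness, select on good properties, recombine, mutate) can be sketched in a few lines. This is the generic textbook genetic algorithm on the OneMax toy problem, not any of the surveyed software engineering tools:

```python
import random

def evolve(fitness, length, pop_size=40, generations=100, p_mut=0.02, seed=1):
    """Minimal generational GA: tournament selection, one-point crossover,
    bit-flip mutation (illustrative sketch only)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():   # size-2 tournament: fitter of two random individuals
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, length)           # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [g ^ 1 if rng.random() < p_mut else g
                     for g in child]                  # bit-flip mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve(sum, length=32)   # OneMax: fitness = number of 1-bits
```

In the software engineering uses listed above, only the genome encoding and the fitness function change, e.g. a genome encoding a task-to-processor assignment with makespan as (negated) fitness.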
Abstract:
The study examined how scenario analysis can be used to investigate a new technology. It was found that the suitability of scenario analysis is affected above all by the level of technological change and the nature of the available information. The scenario method is well suited to the study of new technologies, particularly in the case of radical innovations. The reason is the great uncertainty, the complexity, and the shift of the prevailing paradigm associated with them, which make many other futures-research methods unusable in that situation. In the empirical part of the work, the future of grid computing technology was studied by means of scenario analysis. Grids were seen as a potential disruptive technology which, as a radical innovation, may shift computing from the current product-based purchasing of computing capacity towards a service-based model. This would have a great impact on the whole current ICT industry, especially through the exploitation of on-demand computing. The study examined developments up to 2010. Based on theory and existing knowledge, and drawing on strong expert knowledge, four possible environmental scenarios for grid computing were formed. The scenarios showed that the commercial success of the technology still faces many challenges. In particular, trust and the creation of added value emerged as the most important factors driving the future of grids.