936 results for Thread safe parallel run-time


Relevance:

30.00%

Publisher:

Abstract:

The aim of this thesis is to examine the factors that affect the price of gold in the short and long term. Second, the thesis investigates the different investment vehicles available for investing in gold. The data consist of monthly observations of US and world price indices, US and world inflation and inflation volatility, gold's beta, the gold lease rate, credit risk, and US and world exchange-rate indices from December 1972 to August 2006. Cointegration regression techniques were used to build a model for studying the main factors that affect the price of gold. A literature review was used to establish how one can invest in gold. The empirical results are consistent with previous studies. Support was found for gold being a long-run hedge against inflation and for gold and US inflation moving together in the long run. However, short-run factors affect the price of gold more than long-run factors. Gold is also an easy asset for investors, since it is readily available on the market and numerous instruments exist.
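For readers who want to reproduce this kind of long-run analysis, the sketch below runs an Engle-Granger cointegration test with statsmodels on synthetic stand-ins for the gold-price and US price-index series; the variable names and data are illustrative assumptions, not the thesis data.

```python
# Minimal sketch of an Engle-Granger cointegration test, as used to check
# whether gold prices and US inflation share a long-run relationship.
# The series here are synthetic placeholders for the monthly 1972-2006 data.
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(0)
n = 405                                                # months, Dec 1972 to Aug 2006
us_cpi = np.cumsum(rng.normal(0.4, 1.0, n))            # hypothetical price index
gold_price = 2.0 * us_cpi + rng.normal(0.0, 5.0, n)    # long-run link plus noise

t_stat, p_value, crit_values = coint(gold_price, us_cpi)
print(f"Engle-Granger t-statistic: {t_stat:.2f}, p-value: {p_value:.3f}")
# A small p-value supports cointegration, i.e. a long-run hedge relationship.
```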

Relevance:

30.00%

Publisher:

Abstract:

The past few decades have seen a considerable increase in the number of parallel and distributed systems. With the development of more complex applications, the need for more powerful systems has emerged, and various parallel and distributed environments have been designed and implemented. Each of these environments, including hardware and software, has unique strengths and weaknesses. There is no single parallel environment that can be identified as the best environment for all applications with respect to hardware and software properties. The main goal of this thesis is to provide a novel way of performing data-parallel computation in parallel and distributed environments by utilizing the best characteristics of different aspects of parallel computing. For the purpose of this thesis, three aspects of parallel computing were identified and studied. First, three parallel environments (shared memory, distributed memory, and a network of workstations) are evaluated to quantify their suitability for different parallel applications. Due to the parallel and distributed nature of the environments, the networks connecting the processors in these environments were investigated with respect to their performance characteristics. Second, scheduling algorithms are studied in order to make them more efficient and effective. A concept of application-specific information scheduling is introduced. The application-specific information is data about the workload extracted from an application, which is provided to a scheduling algorithm. Three scheduling algorithms are enhanced to utilize the application-specific information to further refine their scheduling properties. A more accurate description of the workload is especially important in cases where the work units are heterogeneous and the parallel environment is heterogeneous and/or non-dedicated. The results obtained show that the additional information regarding the workload has a positive impact on the performance of applications. Third, a programming paradigm for networks of symmetric multiprocessor (SMP) workstations is introduced. The MPIT programming paradigm combines the Message Passing Interface (MPI) with threads to provide a methodology for writing parallel applications that efficiently utilize the available resources and minimize overhead. MPIT allows communication and computation to overlap by deploying a dedicated thread for communication. Furthermore, the programming paradigm implements an application-specific scheduling algorithm. The scheduling algorithm is executed by the communication thread, so scheduling does not affect the execution of the parallel application. Performance results show that MPIT achieves considerable improvements over conventional MPI applications.
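The overlap idea behind MPIT can be sketched with mpi4py and a dedicated communication thread. This is an illustrative toy under assumed names and workload, not the MPIT runtime itself, and it requires an MPI library built with MPI_THREAD_MULTIPLE support.

```python
# Sketch of overlapping communication and computation with a dedicated
# communication thread, in the spirit of the MPIT paradigm (MPI + threads).
# Illustrative only; requires MPI_THREAD_MULTIPLE (mpi4py requests it by default).
import threading
import queue
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
outbox = queue.Queue()

def communicator():
    """Dedicated thread: ships finished results while the main thread keeps computing."""
    while True:
        item = outbox.get()
        if item is None:                 # sentinel -> shut down
            break
        dest, payload = item
        comm.send(payload, dest=dest, tag=0)

comm_thread = threading.Thread(target=communicator)
comm_thread.start()

n_units = 4
for unit in range(n_units):
    result = sum(i * i for i in range(100000))       # stand-in for a real work unit
    if size > 1:
        outbox.put(((rank + 1) % size, result))      # hand off to the communication thread

outbox.put(None)

if size > 1:
    received = [comm.recv(source=(rank - 1) % size, tag=0) for _ in range(n_units)]

comm_thread.join()
```

Here the blocking sends happen off the critical path, which is the overlap the thesis exploits; MPIT additionally runs its scheduling algorithm inside that communication thread.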

Relevance:

30.00%

Publisher:

Abstract:

Background: Parallel T-Coffee (PTC) was the first parallel implementation of the T-Coffee multiple sequence alignment tool. It is based on MPI and RMA mechanisms. Its purpose is to reduce the execution time of large-scale sequence alignments. It can be run on distributed-memory clusters, allowing users to align data sets consisting of hundreds of proteins within a reasonable time. However, most of the potential users of this tool are not familiar with the use of grids or supercomputers. Results: In this paper we show how PTC can be easily deployed and controlled on a supercomputer architecture using a web portal developed with Rapid. Rapid is a tool for efficiently generating standardized portlets for a wide range of applications, and the approach described here is generic enough to be applied to other applications or to deploy PTC on different HPC environments. Conclusions: The PTC portal allows users to upload a large number of sequences to be aligned by the parallel version of T-Coffee that could not be aligned on a single machine due to memory and execution-time constraints. The web portal provides a user-friendly solution.

Relevance:

30.00%

Publisher:

Abstract:

We present an algorithm for the computation of reducible invariant tori of discrete dynamical systems that is suitable for tori of dimension larger than 1. It is based on a quadratically convergent scheme that approximates, at the same time, the Fourier series of the torus, its Floquet transformation, and its Floquet matrix. The Floquet matrix describes the linearization of the dynamics around the torus and, hence, its linear stability. The algorithm presents a high degree of parallelism, and the computational effort grows linearly with the number of Fourier modes needed to represent the solution. For these reasons it is a very good option for computing quasi-periodic solutions with several basic frequencies. The paper includes some examples (flows) to show the efficiency of the method on a parallel computer. In these flows we compute invariant tori of dimensions up to 5 by taking suitable sections.
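In the standard formulation of such methods (generic notation; the paper's own notation may differ), the scheme solves simultaneously for the torus parametrization, the Floquet transformation and the constant Floquet matrix, which satisfy:

```latex
% Invariance and reducibility equations for an invariant torus of a map F
% with rotation vector \omega (generic formulation; the paper's notation may differ).
\begin{align}
  F\bigl(\psi(\theta)\bigr) &= \psi(\theta + \omega), \qquad \theta \in \mathbb{T}^{d},\\
  DF\bigl(\psi(\theta)\bigr)\,P(\theta) &= P(\theta + \omega)\,\Lambda .
\end{align}
% \psi: torus parametrization (approximated by a truncated Fourier series),
% P: Floquet transformation, \Lambda: constant Floquet matrix whose eigenvalues
% give the linear stability of the torus.
```

A Newton-type iteration on these two equations, with the Fourier coefficients of the parametrization and of the Floquet transformation as unknowns, is the usual route to the quadratic convergence mentioned above.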

Relevance:

30.00%

Publisher:

Abstract:

The effects of footwear and inclination on running biomechanics over short intervals are well documented. Although it is recognized that exercise duration can affect running biomechanics, it remains unclear how biomechanics change over time when running in minimalist shoes and on slopes. Our aims were to describe these biomechanical changes during a 50-minute run and compare them to those observed in standard shoes. Thirteen trained recreational male runners ran for 50 minutes at 65% of their maximal aerobic velocity on a treadmill, once in minimalist shoes and once in standard shoes, 1 week apart in random order. The 50-minute trial was divided into 5-minute segments of running at 0%, +5%, and -5% treadmill incline, performed sequentially. Data were collected using photocells, high-speed video cameras, and plantar-pressure insoles. At 0% incline, runners exhibited reduced leg stiffness and plantar-flexion angles at foot strike and lower plantar pressure at the forefoot and toes in minimalist shoes from minute 34 of the protocol onward, whereas only reduced plantar pressure at the toes was observed in standard shoes. Overall, similar biomechanical changes with increased exercise time were observed on the uphill and downhill inclines. These results might be due to the subjects' unfamiliarity with running in minimalist shoes.

Relevance:

30.00%

Publisher:

Abstract:

INTRODUCTION: Hyperglycemia is a metabolic alteration in major burn patients that is associated with complications. The study aimed at evaluating the safety of general ICU glucose control protocols applied to major burn patients receiving prolonged ICU treatment. METHODS: 15-year retrospective analysis of consecutive adult burn patients admitted to a single specialized centre. Exclusion criteria: death or length of stay <10 days, age <16 years. Variables: demographic variables, burned surface area (TBSA), severity scores, infections, ICU stay, and outcome. Metabolic variables: total energy, carbohydrate and insulin delivery per 24 h, arterial blood glucose, and CRP values. Four periods were analyzed: 1, before the protocol; 2, tight doctor-driven; 3, tight nurse-driven; 4, moderate nurse-driven. RESULTS: 229 patients, aged 45±20 years (mean±SD) and burned over 32±20% TBSA, were analyzed. SAPS II was 35±13. TBSA, Ryan and ABSI scores remained stable, while inhalation injury increased. A total of 28,690 blood glucose samples were analyzed: the median value remained unchanged, with a narrower distribution over time. After protocol initiation, the proportion of normoglycemic values increased from 34.7% to 65.9%, with a reduction in hypoglycaemic events (no extreme hypoglycemia in period 4). Severe hyperglycemia persisted throughout, with a decrease to 9.25% in period 4. Energy and glucose deliveries decreased in periods 3 and 4 (p<0.0001). Infectious complications increased during the last 2 periods (p=0.01). CONCLUSION: A standardized ICU glucose control protocol improved glycemic control in adult burn patients, reducing glucose variability. Moderate glycemic control in burns was safe, specifically with respect to hypoglycemia, reducing the incidence of hypoglycaemic events compared with the period before the protocol. Hyperglycemia persisted, but at a lower level.

Relevance:

30.00%

Publisher:

Abstract:

Cardiovascular disease is the leading cause of death worldwide. Within this subset, coronary artery disease (CAD) is the most prevalent. Magnetic resonance angiography (MRA) is an emerging technique that provides a safe, non-invasive way of assessing CAD progression. To generate contrast between tissues, MR images are weighted according to the magnetic properties of those tissues. In cardiac MRI, T2 contrast, which is governed by the rate of transverse signal loss, is often created through the use of a T2-Preparation module. T2-Preparation, or T2-Prep, is a magnetization preparation scheme used to improve blood/myocardium contrast in cardiac MRI. T2-Prep methods generally use a non-selective +90°, 180°, 180°, -90° train of radiofrequency (RF) pulses (or a variant thereof) to tip magnetization into the transverse plane, allow it to evolve, and then restore it to the longitudinal plane. A key feature of this process is the combination of a +90° and a -90° RF pulse. By changing either one of these, a mismatch occurs between signal excitation and restoration. This feature can be exploited to provide additional spectral or spatial selectivity. In this work, both of these possibilities are explored. The first - spectral selectivity - has been examined as a method of improving fat saturation in coronary MRA. The second - spatial selectivity - has been examined as a means of reducing imaging time by decreasing the field of view, and as a method of reducing artefacts originating from the tissues surrounding the heart. Two additional applications, parallel imaging and self-navigation, are also presented. This thesis is thus composed of four sections. The first, "A Fat Signal Suppression for Coronary MRA at 3T using a Water-Selective Adiabatic T2-Preparation Technique", was originally published in the journal Magnetic Resonance in Medicine (MRM) with co-authors Ruud B. van Heeswijk and Matthias Stuber. The second, "Combined T2-Preparation and 2D Pencil Beam Inner Volume Selection", again with co-authors Ruud van Heeswijk and Matthias Stuber, was also published in the journal MRM. The third, "A cylindrical, inner volume selecting 2D-T2-Prep improves GRAPPA-accelerated image quality in MRA of the right coronary artery", written with co-authors Jerome Yerly and Matthias Stuber, has been submitted to the "Journal of Cardiovascular Magnetic Resonance", and the fourth, "Combined respiratory self-navigation and 'pencil-beam' 2D-T2-Prep for free-breathing, whole-heart coronary MRA", with co-authors Jerome Chaptinel, Giulia Ginami, Gabriele Bonanno, Simone Coppo, Ruud van Heeswijk, Davide Piccini, and Matthias Stuber, is undergoing internal review prior to submission to the journal MRM.
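As a rough, back-of-the-envelope illustration of why an ideal T2-Prep module produces blood/myocardium contrast, the sketch below weights the stored longitudinal magnetization by exp(-TE/T2) for assumed, typical T2 values; the numbers are illustrative and not taken from the thesis.

```python
# Back-of-the-envelope illustration of T2-Prep contrast: after an ideal
# +90, 180, 180, -90 module of total duration TE, on-resonance longitudinal
# magnetization is weighted by exp(-TE/T2). The T2 values below are assumed,
# typical literature values at 3T, not numbers from this thesis.
import math

TE_prep_ms = 40.0                      # assumed preparation duration
tissues = {"arterial blood": 250.0,    # assumed T2 in ms
           "myocardium": 40.0}

for tissue, t2 in tissues.items():
    mz = math.exp(-TE_prep_ms / t2)
    print(f"{tissue:15s}: Mz/M0 = {mz:.2f} after T2-Prep")
# Blood retains most of its signal while myocardium is attenuated,
# which is the desired blood/myocardium contrast.
```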

Relevance:

30.00%

Publisher:

Abstract:

Nowadays robotics is one of the most important pillars of industry, and good news for engineers concerns sales: in 2013 about 179,000 industrial robots were sold worldwide, again an all-time high and 12% more than in 2012, according to data from the IFR (International Federation of Robotics). Alongside this, collaborative robotics comes into play when robots and humans must share the workplace without humans being displaced by machines; the aim is for robots to improve the quality of work by taking over the dangerous, tedious and dirty tasks that are not possible or safe for humans. Another important and directly related concept, very much in vogue and heard of only relatively recently, is the "Factory of the Future", in which operators and robots find harmony in the work environment and robots are regarded as collaborative rather than substitutive machinery; it is considered one of the great production niches in full expansion. Leaving aside these technical concepts, which we should never forget if our professional career is focused on this industrial field, the central topic of this project is, as could not be otherwise, robotics, which together with computer vision has produced a robotic manipulator endowed with a certain "intelligence". A simple but realistic production process has been devised that is able to store pieces of different shapes and colours autonomously, guided only by the image captured with a webcam integrated into the equipment. The system consists of a supporting structure delimiting a work area over which purpose-designed pieces are placed, which must be stored in their corresponding place by the robotic manipulator. This parallel-kinematics manipulator is based on cable technology, driven by four motors that give it three degrees of freedom (±X, ±Y, ±Z); the end effector is suspended over the work area and moves so that it can identify the position, colour and shape of the pieces in order to store them in an orderly fashion according to a set of initial premises.
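The basic kinematic step of such a cable-driven manipulator is converting a desired effector position into the four cable lengths the motors must realize; the sketch below does this for assumed anchor coordinates, which are illustrative and not the project's actual dimensions.

```python
# Sketch of the inverse kinematics of a 4-cable suspended manipulator:
# each cable length is the distance from its fixed anchor (at the top of the
# support structure) to the effector position. Anchor coordinates and the
# workspace size are assumptions for illustration, not the project's values.
import numpy as np

# Assumed anchor points of the four cables (metres), at the frame corners.
anchors = np.array([[0.0, 0.0, 1.0],
                    [1.0, 0.0, 1.0],
                    [1.0, 1.0, 1.0],
                    [0.0, 1.0, 1.0]])

def cable_lengths(effector_xyz):
    """Return the four cable lengths needed to place the effector at (x, y, z)."""
    p = np.asarray(effector_xyz, dtype=float)
    return np.linalg.norm(anchors - p, axis=1)

# Example: move the effector above the centre of the work area.
print(cable_lengths([0.5, 0.5, 0.3]))
```

In the real system, the pixel coordinates extracted from the webcam image would first be mapped into this workspace frame before the cable lengths are computed.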

Relevance:

30.00%

Publisher:

Abstract:

Using event-related brain potentials, the time course of error detection and correction was studied in healthy human subjects. A feedforward model of error correction was used to predict the timing properties of the error and corrective movements. Analysis of the multichannel recordings focused on (1) the error-related negativity (ERN) seen immediately after errors in response- and stimulus-locked averages and (2) the lateralized readiness potential (LRP) reflecting motor preparation. Comparison of the onset and time course of the ERN and LRP components showed that signs of corrective activity preceded the ERN. Thus, error correction was implemented before, or at least in parallel with, the appearance of the ERN component. In addition, the amplitude of the ERN component was increased for errors followed by fast corrective movements. The results are compatible with recent views that consider the ERN component the output of an evaluative system engaged in monitoring motor conflict.
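The response-locked averaging underlying an ERN analysis can be sketched as follows; the sampling rate, epoch window and synthetic data are assumptions, and this is not the study's actual processing pipeline.

```python
# Sketch of response-locked epoch averaging, the operation underlying the
# ERN analysis described above. EEG data, sampling rate and window are
# synthetic/assumed; this is not the study's actual processing pipeline.
import numpy as np

fs = 250                                                 # assumed sampling rate (Hz)
eeg = np.random.randn(60 * fs)                           # one minute of synthetic single-channel EEG
response_samples = np.arange(2 * fs, 58 * fs, 2 * fs)    # simulated button-press times

pre, post = int(0.2 * fs), int(0.6 * fs)                 # 200 ms before to 600 ms after the response
epochs = np.stack([eeg[r - pre:r + post] for r in response_samples])

# Baseline-correct each epoch and average to obtain the response-locked ERP.
epochs -= epochs[:, :pre].mean(axis=1, keepdims=True)
response_locked_erp = epochs.mean(axis=0)
print(response_locked_erp.shape)                         # (pre + post,) samples around the response
```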

Relevance:

30.00%

Publisher:

Abstract:

Among the copper sulphides, chalcopyrite (CuFeS2), covellite (CuS) and chalcocite (Cu2S) are the most important mineral sources for the copper mining industry. Knowledge of the behaviour of these sulphides in the bacterial leaching process is essential for optimizing such procedures. Despite its importance, covellite has not attracted much interest from researchers in this respect. In this work, the oxidation of covellite by the chemolithotrophic bacterium Thiobacillus ferrooxidans was studied using electrochemical techniques such as open circuit potential measurements over time and cyclic voltammetry. The experiments were carried out in acid medium (pH 1.8), with or without Fe2+ as an additional energy source, and over different incubation periods; chemical controls were run in parallel. The results showed that a sulphur layer forms spontaneously due to acid attack, covering the sulphide in the initial phase of incubation and blocking its oxidation. However, the bacterium was able to oxidize this sulphur layer. In the presence of Fe2+ as a supplemental energy source, the corrosion process was facilitated, because indirect oxidation of covellite by Fe3+ occurred, the Fe3+ being produced by T. ferrooxidans oxidation of the Fe2+ added to the medium.

Relevance:

30.00%

Publisher:

Abstract:

Background: Since barrier protection measures to avoid contact with allergens are being increasingly developed, we assessed the clinical efficacy and tolerability of a topical nasal microemulsion made of glycerol esters in patients with allergic rhinitis. Methods: Randomized, controlled, double-blind, parallel-group, multicentre, multinational clinical trial in which adult patients with allergic rhinitis or rhinoconjunctivitis due to sensitization to birch, grass or olive tree pollens received treatment with the topical microemulsion or placebo during the pollen seasons. Efficacy variables included scores on the mini-RQLQ questionnaire, number and severity of nasal, ocular and lung signs and symptoms, need for symptomatic medications and patients' satisfaction with treatment. Adverse events were also recorded. Results: Demographic characteristics were homogeneous between groups and mini-RQLQ scores did not differ significantly at baseline (visit 1). From symptoms recorded in the diary cards, the microemulsion (ME) group showed statistically significantly better scores for nasal congestion (0.72 vs. 1.01; p = 0.017) and mean total nasal symptoms (0.7 vs. 0.9; p = 0.045). At visit 2 (pollen season), lower values were observed in the mini-RQLQ in the ME group, although there were no statistically significant differences between groups in either the full analysis set (FAS) or the population of patients completing treatment (PPS). The results obtained in the nasal symptoms domain of the mini-RQLQ at visit 2 showed the largest difference (-0.43; 95% CI: -0.88 to 0.02) for the ME group in the FAS population. The topical microemulsion was safe and well tolerated and no major discomforts were observed. Satisfaction ratings with the treatment were similar between the groups. Conclusions: Topical application of the microemulsion is a feasible and safe therapy for the prevention of allergic symptoms, particularly nasal congestion.

Relevance:

30.00%

Publisher:

Abstract:

In this thesis, concurrent communication event handling is implemented using a thread-pool approach. Concurrent events are handled with the Reactor design pattern, and multithreading is implemented using the Leader/Followers design pattern. The main focus is to evaluate the behaviour of the implemented model for different numbers of concurrent connections and different numbers of threads. Furthermore, the feasibility of the model in the PeerHood middleware is evaluated. The implemented model is evaluated with a purpose-built test environment that enables concurrent message sending from multiple connections to the system under test. Message round-trip times are measured in the tester application. In the evaluation, processing delay in the system is simulated and the influence of the delay on the average round-trip time is analysed.
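A much-simplified sketch of the setup being evaluated (a pool of threads servicing concurrent connections plus a tester that measures round-trip times) is shown below; the port, message format and pool size are assumptions, and the simple dispatcher stands in for the Reactor and Leader/Followers machinery of the actual implementation.

```python
# Simplified sketch of a thread-pool echo server plus a round-trip-time tester,
# illustrating the kind of measurement described above. It dispatches each
# connection to a worker thread rather than implementing the full Reactor /
# Leader-Followers design; the port and sizes are assumptions.
import socket
import threading
import time
from concurrent.futures import ThreadPoolExecutor

HOST, PORT = "127.0.0.1", 5050               # assumed test endpoint

def handle(conn):
    with conn:
        while data := conn.recv(1024):       # echo until the client disconnects
            conn.sendall(data)

def server(pool_size=4):
    with socket.socket() as srv, ThreadPoolExecutor(pool_size) as pool:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen()
        while True:
            conn, _ = srv.accept()
            pool.submit(handle, conn)        # hand the connection to a pool thread

threading.Thread(target=server, daemon=True).start()
time.sleep(0.2)                              # give the server time to start

# Tester: send messages and record round-trip times.
with socket.create_connection((HOST, PORT)) as cli:
    rtts = []
    for _ in range(100):
        t0 = time.perf_counter()
        cli.sendall(b"ping")
        cli.recv(1024)
        rtts.append(time.perf_counter() - t0)
print(f"average RTT: {sum(rtts) / len(rtts) * 1e6:.1f} us")
```

In the Leader/Followers pattern proper, the pool threads take turns waiting on the event source themselves instead of having a separate acceptor hand work to them, which avoids the hand-off shown here.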

Relevance:

30.00%

Publisher:

Abstract:

This thesis considers the modeling and analysis of noise and interconnects in on-chip communication. Besides transistor count and speed, the capabilities of a modern design are often limited by on-chip communication links. These links typically consist of multiple interconnects that run parallel to each other for long distances between functional or memory blocks. Due to technology scaling, the interconnects have considerable electrical parasitics that affect their performance, power dissipation and signal integrity. Furthermore, because of electromagnetic coupling, the interconnects in a link need to be considered as an interacting group rather than as isolated signal paths. There is a need for accurate and computationally efficient models in the early stages of the chip design process to assess or optimize issues affecting these interconnects. For this purpose, a set of analytical models is developed for on-chip data links in this thesis. First, a model is proposed for crosstalk and intersymbol interference. The model takes into account the effects of inductance, initial states and bit sequences. Intersymbol interference is shown to affect crosstalk voltage and propagation delay depending on bus throughput and the amount of inductance. Next, a model is proposed for the switching current of a coupled bus. The model is combined with an existing model to evaluate power supply noise. The model is then applied to reduce both functional crosstalk and power supply noise caused by a bus, as a trade-off with time. The proposed reduction method is shown to be effective in reducing long-range crosstalk noise. The effects of process variation on encoded signaling are then modeled. In encoded signaling, the input signals to a bus are encoded using additional signaling circuitry. The proposed model includes variation in both the signaling circuitry and the wires to calculate the total delay variation of a bus. The model is applied to study level-encoded dual-rail and 1-of-4 signaling. In addition to regular voltage-mode and encoded voltage-mode signaling, current-mode signaling is a promising technique for global communication. A model for energy dissipation in RLC current-mode signaling is proposed in the thesis. The energy is derived separately for the driver, the wire and the receiver termination.
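To make the coupling mechanism concrete, the sketch below integrates a lumped two-line RC model with a coupling capacitance and reports the peak crosstalk voltage induced on the quiet line; all element values are illustrative assumptions, and this lumped model omits the inductive effects treated in the thesis.

```python
# Lumped two-wire RC crosstalk sketch: an aggressor line switches 0 -> 1 V
# while the victim is held low through its driver resistance; the coupling
# capacitance Cc injects a noise pulse on the victim. Element values are
# illustrative assumptions; inductive effects treated in the thesis are omitted.
import numpy as np

R = 1e3            # driver resistance of each line (ohm), assumed
Cg = 100e-15       # ground capacitance of each line (F), assumed
Cc = 50e-15        # coupling capacitance between the lines (F), assumed
Vdd = 1.0

# Node equations:  C * dV/dt = (Vin - V) / R,  with the capacitance matrix C below.
C = np.array([[Cg + Cc, -Cc],
              [-Cc, Cg + Cc]])
Vin = np.array([Vdd, 0.0])            # aggressor driven high, victim held low

V = np.zeros(2)
dt, steps = 1e-13, 20000              # 2 ns of simulated time
peak = 0.0
for _ in range(steps):
    currents = (Vin - V) / R          # each driver charges its own line
    dVdt = np.linalg.solve(C, currents)
    V += dt * dVdt
    peak = max(peak, V[1])

print(f"peak crosstalk on the victim line: {peak * 1e3:.1f} mV")
```

A distributed RLC treatment, as used in the thesis, would add the inductive terms omitted in this lumped sketch.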

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this Master's thesis is to optimize the computation of customers' electricity bills by means of distributed computing. As smart, remotely read energy meters arrive in every household, energy companies are obliged to calculate customers' electricity bills on the basis of hourly metering data. The growing amount of data also increases the number of computation tasks required. The thesis evaluates alternatives for implementing distributed computation and takes a closer look at the possibilities offered by cloud computing. In addition, simulations were run to assess the differences between parallel and sequential computation. To support the correct calculation of electricity bills, a measurement-tree algorithm was developed.
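As a simple illustration of the parallel-versus-sequential comparison, the sketch below prices synthetic hourly readings for many customers with a process pool; the tariff and data are placeholders, and this is not the measurement-tree algorithm developed in the thesis.

```python
# Toy comparison of sequential vs. parallel computation of hourly-based
# electricity bills. Tariff, customer count and readings are synthetic
# placeholders; this is not the measurement-tree algorithm of the thesis.
import time
import numpy as np
from multiprocessing import Pool

HOURS_PER_MONTH = 30 * 24
# Assumed two-rate tariff: cheaper night hours (0-6), more expensive day hours.
PRICE_PER_KWH = np.where(np.arange(HOURS_PER_MONTH) % 24 < 7, 0.08, 0.15)

def bill(hourly_kwh):
    """Price one customer's month from hourly consumption."""
    return float(np.dot(hourly_kwh, PRICE_PER_KWH))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    customers = [rng.uniform(0.1, 2.0, HOURS_PER_MONTH) for _ in range(2000)]

    t0 = time.perf_counter()
    sequential = [bill(c) for c in customers]
    t_seq = time.perf_counter() - t0

    t0 = time.perf_counter()
    with Pool() as pool:
        parallel = pool.map(bill, customers, chunksize=100)
    t_par = time.perf_counter() - t0

    assert np.allclose(sequential, parallel)
    print(f"sequential: {t_seq:.3f} s, parallel: {t_par:.3f} s")
```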