937 results for "passing"


Relevance:

10.00%

Publisher:

Abstract:

Introduction: Advances in biotechnology have shed light on many biological processes. In biological networks, nodes are used to represent the function of individual entities within a system and have historically been studied in isolation. Network structure adds edges that enable communication between nodes. An emerging field combines node function and network structure to yield network function. One of the most complex networks known in biology is the neural network within the brain. Modeling neural function will require an understanding of networks, dynamics, and neurophysiology. This work develops modeling techniques for this complex intersection. Methods: Spatial game theory was developed by Nowak in the context of modeling evolutionary dynamics, the way in which species evolve over time. Spatial game theory offers a two-dimensional view in which each node analyzes the state of its neighbors and updates based on its surroundings. Our work builds on this foundation by studying evolutionary game theory networks with respect to neural networks. The novel concept is that neurons may adopt a particular strategy that allows propagation of information; the strategy may therefore act as the mechanism for gating. Furthermore, the strategy of a neuron, as in a real brain, is impacted by the strategies of its neighbors. The techniques of spatial game theory established by Nowak are reproduced to explain two basic cases and to validate the implementation of the code. Two novel modifications, introduced in Chapters 3 and 4, build on this network and may reflect neural networks. Results: In large parametric studies, the two novel modifications, mutation and rewiring, produced dynamics with an intermediate number of nodes firing at any given time. Further, even small mutation rates result in different dynamics more representative of the hypothesized ideal state.
Conclusions: In both modifications to Nowak's model, the results demonstrate that the network does not become locked into a particular global state of passing all information or blocking all information. It is hypothesized that normal brain function occurs within this intermediate range and that a number of diseases are the result of moving outside of it.
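The Nowak-style spatial game that the Methods section builds on can be sketched in a few lines. This is a minimal illustration of the classic cooperator/defector grid update, not the author's neural-gating model; the payoff parameter b, the synchronous best-neighbor update, self-interaction, and periodic boundaries are all assumptions taken from the standard Nowak-May formulation.

```python
import numpy as np

def spatial_game_step(grid, b=1.8):
    """One synchronous update of a Nowak-May style spatial game.

    grid: 2D array of 0 (defector) / 1 (cooperator), periodic boundaries.
    b:    defector's temptation payoff. A cooperator earns 1 per cooperating
          neighbor; a defector earns b per cooperating neighbor. Each site
          then adopts the strategy of the strictly highest-scoring site in
          its 3x3 neighborhood (self-interaction included, as in Nowak-May).
    """
    shifts = [(di, dj) for di in (-1, 0, 1) for dj in (-1, 0, 1)]
    payoff = np.zeros(grid.shape, dtype=float)
    for di, dj in shifts:
        neigh = np.roll(np.roll(grid, di, axis=0), dj, axis=1)
        payoff += np.where(grid == 1, neigh * 1.0, neigh * b)
    new, best = grid.copy(), payoff.copy()
    for di, dj in shifts:
        if di == 0 and dj == 0:
            continue
        npay = np.roll(np.roll(payoff, di, axis=0), dj, axis=1)
        nstrat = np.roll(np.roll(grid, di, axis=0), dj, axis=1)
        better = npay > best          # strict improvement only
        new = np.where(better, nstrat, new)
        best = np.where(better, npay, best)
    return new
```

Iterating this step from a mixed initial state yields the spatially extended dynamics (frozen states, oscillators, chaotic fronts) on top of which mutation and rewiring rules can be layered.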

Relevance:

10.00%

Publisher:

Abstract:

In writing “Not in the Legends”, one of the images and concepts which constantly returned was that of pilgrimage. I began to write these poems while studying abroad in London, after having passed the previous semester in France and travelling around Europe. There was something in the repetition of sightseeing— walking six miles in Luxembourg to see the grave of General Patton, taking photographs of the apartment where Sylvia Plath ended her life, bowing before the bones of saints, searching through Père Lachaise for the grave of Théodore Gericault— which struck me as numinous and morbid. At the same time, I came to love living abroad and I grew discontent with both remaining and returning. I wanted the opportunity to live everywhere all the time and not have to choose between home and away. Returning from abroad, I turned my attention to the landscape of my native country. I found in the New England pilgrims a narrative of people who had left their home in search of growth and freedom. In these journeys I began to appreciate the significance of place and tried to understand what it meant to move from one place to another, how one chose a home, and why people searched for meaning in specific locations. The processes of moving from student to worker and from childhood to adulthood have weighed on me. I began to see these transitions towards maturity as travels to a different land. Memory and nostalgia are their own types of pilgrimage in their attempts to return to lost places, as is the reading of literature. These pilgrimages, real and metaphorical, form the thematic core of the collection. I read the work of many poets who came before me, returning to the places where the Canon was forged. Those poets have a large presence in the work I produced. I wondered how I, as a young poet, could earn my own place in the tradition and sought models in much the same way a painter studies the brushstrokes of a master. 
In the process, I have tried to uncover what it means to be a poet. Is it something like being a saint? Is it something like being a colonist? Or is it to be the one who goes in search of saints and colonists? In trying to measure my own life and work based on the precedent, I have questioned what role era and generation play in the formation of identity. I focused my reading heavily on the early years of English poetry, trying to find the essence of the time when the language first achieved the transcendence of verse. In following the development of English poetry through Coleridge, John Berryman, and Allison Titus, I have explored the progression of those basic virtues in changing contexts. Those bearings, applied to my modern context, helped to shape the poetry I produced. Many of the poems in “Not in the Legends” are based on my own personal experience. In my recollections I have tried to interrogate nostalgia rather than falling into mere reminiscence. Rather than allowing myself poems of love and longing, I have tried to find the meaning of those emotions. A dominant conflict exists between adventure and comfort which mirrors the central engagement with the nature of being “here” or “there”. It is found in scenes of domesticity and wilderness as I attempt to understand my own simultaneous desire for both. For example, in “Canned Mangoes…” the intrusion of nature, even in a context as innocuous as a poem by Sir Walter Raleigh, unravels the ordinary comforts of the domestic sphere. The character of “The Boy” from Samuel Beckett’s Waiting for Godot proved such an interesting subject for me because he is one who can transcend the normal boundaries of time and place. The title suggests connections to both place and time. “Legends” features the dual meaning of both myths and the keys to maps. To propose something “Not in the Legends” is to find something which has no precedent in our histories and our geographies, something beyond our field of knowledge and wholly new. 
One possible interpretation I devised was that each new generation lives a novel existence, the future being the true locus of that which is beyond our understanding. The title comes from Keats’ “Hyperion, a Fragment”, which details the aftermath of the Titanomachy. The Titans, having fallen to the Olympians, are a representation of the passing of one generation for the next. Their dejection is expressed by Saturn, who laments: “Not in my own sad breast, / Which is its own great judge and searcher out, / Can I find reason why ye should be thus: / Not in the legends of the first of days…” (129-132). The emotions of the conquered Titans are unique and without antecedent. They are experiencing feelings which surpass all others in history. In this, they are the equivalent of the poet who feels that his or her own sufferings are special. In contrast are Whitman’s lines from “Song of Myself” which serve as an epigraph to this collection. He contends for a sense of continuity across time, a realization that youth, age, pleasure, and suffering have always existed and will always exist. Whitman finds consolation in this unity, accepting that kinship with past generations is more important than his own individuality. These opposing views offer two methods of presenting the self in history. The instinct of poetry suggests election. The poet writes because he feels his experiences are special, or because he believes he can serve as a synecdoche for everyone. I have fought this instinct by trying to contextualize myself in history. These poems serve as an attempt at prosopography with my own narrative a piece of the whole. Because the earth abides forever, our new stories get printed over the locations of the old and every place becomes a palimpsest of lives and acts. In this collection I have tried to untangle some of those layers, especially my own, to better understand the sprawling legend of history.

Relevance:

10.00%

Publisher:

Abstract:

The separation of small molecules by capillary electrophoresis is governed by a complex interplay among several physical effects. Until recently, a systematic understanding of how the influence of all of these effects is observed experimentally has remained unclear. The work presented in this thesis involves the use of transient isotachophoretic stacking (tITP) and computer simulation to improve and better understand an in-capillary chemical assay for creatinine. This assay involves the use of electrophoretically mediated micro-analysis (EMMA) to carry out the Jaffé reaction inside a capillary tube. The primary contribution of this work is the elucidation of the role of the length and concentration of the hydroxide plug used to achieve tITP stacking of the product formed by the in-capillary EMMA/Jaffé method. Computer simulation using SIMUL 5.0 predicts that a 3- to 4-fold gain in sensitivity can be realized by timing the tITP stacking event such that the Jaffé product peak is at its maximum height as that peak electrophoreses past the detection window. Overall, the length of the hydroxide plug alters the timing of the stacking event, and lower concentration plugs of hydroxide lead to more rapidly occurring tITP stacking events. Also, the inclusion of intentional tITP stacking in the EMMA/Jaffé method improves the sensitivity of the assay, extending it to creatinine concentrations within the normal biological range. Ultimately, improvement in assay sensitivity can be rationally designed by using the length and concentration of the hydroxide plug to engineer the timing of the tITP stacking event such that stacking occurs as the Jaffé product is passing the detection window.

Relevance:

10.00%

Publisher:

Abstract:

Theoretical studies of the problems of the securities markets in the Russian Federation incline to one or other of the two traditional approaches. The first consists of comparing the definition of "valuable paper" set forth in the current legislation of the Russian Federation, with the theoretical model of "Wertpapiere" elaborated by German scholars more than 90 years ago. The problem with this approach is, in Mr. Pentsov's opinion, that any new features of the definition of "security" that do not coincide with the theoretical model of "Wertpapiere" (such as valuable papers existing in non-material, electronic form) are claimed to be incorrect and removed from the current legislation of the Russian Federation. The second approach works on the basis of the differentiation between the Common Law concept of "security" and the Civil Law concept of "valuable paper". Mr. Pentsov's research, presented in an article written in English, uses both methodological tools and involves, firstly, a historical study of the origin and development of certain legal phenomena (securities) as they evolved in different countries, and secondly, a comparative, synchronic study of equivalent legal phenomena as they exist in different countries today. Employing the first method, Mr. Pentsov divided the historical development of the conception of "valuable paper" in Russia into five major stages. He found that, despite the existence of a relatively wide circulation of valuable papers, especially in the second half of the 19th century, Russian legislation before 1917 (the first stage) did not have a unified definition of valuable paper. The term was used, in both theoretical studies and legislation, but it covered a broad range of financial instruments such as stocks, bonds, government bonds, promissory notes, bills of exchange, etc. During the second stage, also, the legislation of the USSR did not have a unified definition of "valuable paper". 
After the end of the "new economic policy" (1922-1930), the stock exchanges and the securities markets in the USSR were, with very few exceptions, abolished. Thus during the third stage (up to 1985), the use of valuable papers in practice was reduced to foreign economic relations (bills of exchange, stocks in enterprises outside the USSR) and to state bonds. Not surprisingly, there was still no unified definition of "valuable paper". After the beginning of Gorbachev's perestroika, a securities market began to re-appear in the USSR. However, the successful development of securities markets in the USSR was retarded by the absence of an appropriate regulatory framework. The first effort to improve the situation was the adoption of the Regulations on Valuable Papers, approved by resolution No. 590 of the Council of Ministers of the USSR, dated June 19, 1990. Section 1 of the Regulations contained the first statutory definition of "valuable paper" in the history of Russia. At the very beginning of the period of transition to a market economy, a number of acts contained different definitions of "valuable paper". This diversity clearly undermined the stability of the Russian securities market and did not achieve the goal of protecting the investor. The lack of unified criteria for the consideration of such non-standard financial instruments as "valuable papers" significantly contributed to the appearance of numerous fraudulent "pyramid" schemes that were outside the regulatory scheme of Russian legislation. The situation was substantially improved by the adoption of the new Civil Code of the Russian Federation. According to Section 1 of Article 142 of the Civil Code, a valuable paper is a document that confirms, in compliance with an established form and mandatory requisites, certain material rights whose realisation or transfer are possible only in the process of its presentation. Finally, the recent Federal Law No. 39-FZ "On the Valuable Papers Market", dated April 22, 1996, has also introduced the term "emission valuable papers". According to Article 2 of this Law, an "emission valuable paper" is any valuable paper, including non-documentary ones, that simultaneously has the following features: it fixes the composition of material and non-material rights that are subject to confirmation, cession and unconditional realisation in compliance with the form and procedure established by this federal law; it is placed by issues; and it has an equal amount and time of realisation of rights within the same issue, regardless of when the valuable paper was purchased. Thus the introduction of the conception of "emission valuable paper" became the starting point in the Russian Federation's legislation for the differentiation between the legal regimes of "commercial papers" and "investment papers", similar to the Common Law approach. Moving now to the synchronic, comparative method of research, Mr. Pentsov notes that there are currently three major conceptions of "security" and, correspondingly, three approaches to its legal definition: the Common Law concept, the Continental Law concept, and the concept employed by Japanese Law. Mr. Pentsov proceeds to analyse the differences and similarities of all three, concluding that though the concept of "security" in the Common Law system substantially differs from that of "valuable paper" in the Continental Law system, nevertheless the two concepts are developing in similar directions. He predicts that in the foreseeable future the existing differences between these two concepts will become less and less significant. On the basis of his research, Mr. Pentsov arrived at the conclusion that the concept of "security" (and its equivalents) is not a static one. On the contrary, it is in a process of permanent evolution that reflects the introduction of new financial instruments onto the capital markets.
He believes that the scope of the statutory definition of "security" plays an extremely important role in the protection of investors. In passing the Securities Act of 1933, the United States Congress determined that the best way to achieve the goal of protecting investors was to define the term "security" in sufficiently broad and general terms so as to include within the definition the many types of instruments that in the commercial world fall within the ordinary concept of "security" and to cover the countless and various devices used by those who seek to use the money of others on the promise of profits. On the other hand, the very limited scope of the current definition of "emission valuable paper" in the Federal Law of the Russian Federation entitled "On the Valuable Papers Market" does not allow the anti-fraud provisions of this law to be implemented in an efficient way. Consequently, there is no basis for the protection of investors. Mr. Pentsov proposes amendments which he believes would enable the Russian markets to become more efficient and attractive for both foreign and domestic investors.

Relevance:

10.00%

Publisher:

Abstract:

One of the major challenges for a mission to the Jovian system is the radiation tolerance of the spacecraft (S/C) and the payload. Moreover, being able to achieve science observations with high signal to noise ratios (SNR), while passing through the high flux radiation zones, requires additional ingenuity on the part of the instrument provider. Consequently, the radiation mitigation is closely intertwined with the payload, spacecraft and trajectory design, and requires a systems-level approach. This paper presents a design for the Io Volcano Observer (IVO), a Discovery mission concept that makes multiple close encounters with Io while orbiting Jupiter. The mission aims to answer key outstanding questions about Io, especially the nature of its intense active volcanism and the internal processes that drive it. The payload includes narrow-angle and wide-angle cameras (NAC and WAC), dual fluxgate magnetometers (FGM), a thermal mapper (ThM), dual ion and neutral mass spectrometers (INMS), and dual plasma ion analyzers (PIA). The radiation mitigation is implemented by drawing upon experiences from designs and studies for missions such as the Radiation Belt Storm Probes (RBSP) and Jupiter Europa Orbiter (JEO). At the core of the radiation mitigation is IVO's inclined and highly elliptical orbit, which leads to rapid passes through the most intense radiation near Io, minimizing the total ionizing dose (177 krads behind 100 mils of Aluminum with radiation design margin (RDM) of 2 after 7 encounters). The payload and the spacecraft are designed specifically to accommodate the fast flyby velocities (e.g. the spacecraft is radioisotope powered, remaining small and agile without any flexible appendages). The science instruments, which collect the majority of the high-priority data when close to Io and thus near the peak flux, also have to mitigate transient noise in their detectors. 
The cameras use a combination of shielding and CMOS detectors with extremely fast readout to minimize noise. The INMS microchannel plate detectors and the PIA channel electron multipliers require additional shielding. The FGM is not sensitive to noise induced by energetic particles, and the ThM microbolometer detector is nearly insensitive. Detailed SNR calculations are presented. To facilitate targeting agility, all of the spacecraft components are shielded separately, since this approach is more mass efficient than using a radiation vault. IVO uses proven radiation-hardened parts (rated at 100 krad behind equivalent shielding of 280 mils of Aluminum with RDM of 2) and is expected to have ample mass margin to increase shielding if needed.

Relevance:

10.00%

Publisher:

Abstract:

The aim of this study is to develop a new simple method for analyzing one-dimensional transcranial magnetic stimulation (TMS) mapping studies in humans. Motor evoked potentials (MEP) were recorded from the abductor pollicis brevis (APB) muscle during stimulation at nine different positions on the scalp along a line passing through the APB hot spot and the vertex. Non-linear curve fitting according to the Levenberg-Marquardt algorithm was performed on the averaged amplitude values obtained at all points to find the best-fitting symmetrical and asymmetrical peak functions. Several peak functions could be fitted to the experimental data. Across all subjects, a symmetric, bell-shaped curve, the complementary error function (erfc), gave the best results. This function is characterized by three parameters giving its amplitude, position, and width. None of the mathematical functions tested with fewer or more than three parameters fitted better. The amplitude and position parameters of the erfc were highly correlated with the amplitude at the hot spot and with the location of the center of gravity of the TMS curve. In conclusion, non-linear curve fitting is an accurate method for the mathematical characterization of one-dimensional TMS curves. This is the first method that provides information on amplitude, position and width simultaneously.
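The fitting procedure described above can be sketched in Python. This is a hypothetical reimplementation, not the authors' software: it assumes a symmetric peak model amp*erfc(|x - pos|/width) with the three parameters (amplitude, position, width) the abstract names, and uses a minimal Levenberg-Marquardt loop with a forward-difference Jacobian.

```python
import numpy as np
from math import erfc as _erfc

erfc = np.vectorize(_erfc)  # elementwise complementary error function

def model(x, amp, pos, width):
    # symmetric, bell-shaped erfc curve: peak of height `amp` at `pos`
    return amp * erfc(np.abs(x - pos) / width)

def lm_fit(x, y, p0, n_iter=200, lam=1e-3):
    """Minimal Levenberg-Marquardt: damped Gauss-Newton steps with a
    numerically estimated Jacobian; damping adapts to step acceptance."""
    p = np.asarray(p0, dtype=float)
    for _ in range(n_iter):
        r = y - model(x, *p)
        J = np.empty((x.size, p.size))
        for j in range(p.size):
            dp = np.zeros_like(p)
            dp[j] = 1e-6 * max(1.0, abs(p[j]))
            J[:, j] = (model(x, *(p + dp)) - model(x, *p)) / dp[j]
        step = np.linalg.solve(J.T @ J + lam * np.eye(p.size), J.T @ r)
        if np.sum((y - model(x, *(p + step))) ** 2) < np.sum(r ** 2):
            p, lam = p + step, lam * 0.7   # accept step, relax damping
        else:
            lam *= 2.0                     # reject step, increase damping
    return p
```

In practice one would fit the averaged MEP amplitudes at the nine scalp positions and read amplitude, position, and width directly from the returned parameter vector.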

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: Estimation of respiratory deadspace is often based on the CO2 expirogram; however, the presence of the CO2 sensor increases equipment deadspace, which in turn influences breathing pattern and calculation of lung volume. In addition, it is necessary to correct for the delay between the sensor and flow signals. We propose a new method for estimation of effective deadspace using the molar mass (MM) signal from an ultrasonic flowmeter device, which does not require delay correction. We hypothesize that this estimate is correlated with that calculated from the CO2 signal using the Fowler method. METHODS: Breath-by-breath CO2, MM and flow measurements were made in a group of 77 term-born healthy infants. Fowler deadspace (Vd,Fowler) was calculated after correcting for the flow-dependent delay in the CO2 signal. Deadspace estimated from the MM signal (Vd,MM) was defined as the volume passing through the flowhead between the start of expiration and the 10% rise point in MM. RESULTS: Correlation (r = 0.456, P < 0.0001) was found between Vd,MM and Vd,Fowler averaged over all measurements, with a mean difference of -1.4% (95% CI -4.1 to 1.3%). Vd,MM ranged from 6.6 to 11.4 ml between subjects, while Vd,Fowler ranged from 5.9 to 12.0 ml. Mean intra-measurement CV over 5-10 breaths was 7.8 +/- 5.6% for Vd,MM and 7.8 +/- 3.7% for Vd,Fowler. Mean intra-subject CV was 6.0 +/- 4.5% for Vd,MM and 8.3 +/- 5.9% for Vd,Fowler. Correcting for the CO2 signal delay resulted in a 12% difference (P = 0.022) in Vd,Fowler. Vd,MM could be obtained more frequently than Vd,Fowler in infants with CLD, albeit with high variability. CONCLUSIONS: Use of the MM signal provides a feasible estimate of Fowler deadspace without introducing additional equipment deadspace. The simple calculation without need for delay correction makes individual adjustment for deadspace in FRC measurements possible. 
This is especially important given the relatively large range of deadspace seen in this homogeneous group of infants.
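The Vd,MM definition above (expired volume from the start of expiration to the 10% rise point in MM) reduces to a short computation. A minimal sketch, assuming uniformly sampled, delay-free flow and MM signals over a single expiration:

```python
import numpy as np

def vd_mm(flow, mm, dt):
    """Effective deadspace from the molar mass (MM) signal: the volume
    expired between the start of expiration and the 10% rise point of MM.

    flow: expiratory flow samples in ml/s
    mm:   simultaneous molar mass samples
    dt:   sampling interval in s
    """
    thresh = mm[0] + 0.10 * (mm[-1] - mm[0])  # 10% rise point of MM
    idx = int(np.argmax(mm >= thresh))        # first sample at/above it
    # expired volume up to that sample (rectangle-rule integration)
    return float(np.sum(flow[:idx]) * dt)
```

Because flow and MM come from the same ultrasonic flowmeter, no sensor-to-flow delay correction is needed before integrating.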

Relevance:

10.00%

Publisher:

Abstract:

An important problem in computational biology is finding the longest common subsequence (LCS) of two nucleotide sequences. This paper examines the correctness and performance of a recently proposed parallel LCS algorithm that uses successor tables and pruning rules to construct a list of sets from which an LCS can be easily reconstructed. Counterexamples are given for two of the pruning rules presented with the original algorithm. Because of these errors, the performance measurements originally reported cannot be validated. The work presented here shows that speedup can be reliably achieved by an implementation in Unified Parallel C that runs on an InfiniBand cluster. This performance is partly facilitated by exploiting the software cache of the MuPC runtime system. In addition, this implementation achieved speedup without bulk memory copy operations and the associated programming complexity of message passing.
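For reference, the sequential dynamic-programming LCS that such parallel algorithms accelerate can be written in a few lines. This is the standard textbook O(m*n) formulation, not the successor-table algorithm under study:

```python
def lcs(a, b):
    """Longest common subsequence of strings a and b by dynamic programming."""
    m, n = len(a), len(b)
    # dp[i][j] = LCS length of prefixes a[:i] and b[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    # backtrack through the table to reconstruct one LCS
    out, i, j = [], m, n
    while i and j:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1]); i -= 1; j -= 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return "".join(reversed(out))
```

The quadratic table is what makes long nucleotide sequences expensive and motivates both the pruning rules and the parallel decomposition examined in the paper.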

Relevance:

10.00%

Publisher:

Abstract:

The prevalence of Ventilated Improved Pit (VIP) latrines in Ghana suggests that the design must have a high user acceptance. The two key factors attributed to user acceptance of a VIP latrine over an alternative latrine design, such as the basic pit latrine, are its ability to remove foul odors and maintain low fly populations; both of which are a direct result of an adequate ventilation flow rate. Adequate ventilation for odorless conditions in a VIP latrine has been defined by the United Nations Development Program (UNDP) and the World Bank as an air flow rate equivalent to 6 air changes per hour (6 ACH) of the superstructure’s air volume. Additionally, the UNDP determined that the three primary factors that affect ventilation are: 1) wind passing over the mouth of the vent pipe, 2) wind passing into the superstructure, and 3) solar radiation on to the vent pipe. Previous studies also indicate that vent pipes with larger diameters increase flow rates, and the application of carbonaceous materials to the pit sludge reduces odor and insect prevalence. Furthermore, proper design and construction is critical for the correct functioning of VIP latrines. Under-designing could cause problems with odor and insect control; over-designing would increase costs unnecessarily, thereby making it potentially unaffordable for beneficiaries to independently construct, repair or replace a VIP latrine. The present study evaluated the design of VIP latrines used by rural communities in the Upper West Region of Ghana with the focus of assessing adequate ventilation for odor removal and insect control. Thirty VIP latrines from six communities in the Upper West Region of Ghana were sampled. Each VIP latrine’s ventilation flow rate and micro-environment was measured using a hot-wire anemometer probe and portable weather station for a minimum of four hours. 
To capture any temporal or seasonal variations in ventilation, ten of the latrines were sampled monthly over the course of three months for a minimum of 12 hours. A latrine usage survey and a cost analysis were also conducted to further assess the VIP latrine as an appropriate technology for sustainable development in the Upper West Region. It was found that the average air flow rate over the entire sample set was 11.3 m3/hr. The minimum and maximum air flow rates were 0.0 m3/hr and 48.0 m3/hr, respectively. Only 1 of the 30 VIP latrines (3%) was found to have an air flow rate greater than the UNDP-defined odorless condition of 6 ACH. Furthermore, 19 VIP latrines (63%) were found to have an average air flow rate of less than half the flow rate required to achieve 6 ACH. The dominant factors affecting ventilation flow rate were wind passing over the mouth of the vent pipe and air buoyancy forces, which were the effect of differences in temperature between the substructure and the ambient environment. Of 76 usable VIP latrines found in one community, 68.4% were in actual use. The cost of a VIP latrine was found to be equivalent to approximately 12% of the mean annual household income for Upper West Region inhabitants.
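The 6 ACH criterion translates directly into a required flow rate: the vent must move six superstructure air volumes per hour. A short sketch, using hypothetical superstructure dimensions for illustration:

```python
def required_flow_m3hr(width_m, depth_m, height_m, ach=6.0):
    """Flow rate (m3/hr) needed to achieve `ach` air changes per hour
    for a superstructure with the given internal dimensions (meters)."""
    volume_m3 = width_m * depth_m * height_m
    return ach * volume_m3

# Hypothetical 1.2 m x 1.5 m x 2.2 m superstructure (3.96 m3):
# 6 ACH requires 6 * 3.96 = 23.76 m3/hr, roughly double the 11.3 m3/hr
# mean flow rate measured across the sampled latrines.
flow = required_flow_m3hr(1.2, 1.5, 2.2)
```

Comparing each latrine's measured flow rate against this volume-dependent threshold is what yields the study's finding that only 1 of 30 latrines met the 6 ACH condition.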

Relevance:

10.00%

Publisher:

Abstract:

This technical report discusses the application of the Lattice Boltzmann Method (LBM) and Cellular Automata (CA) simulation to fluid flow and particle deposition. The current work focuses on the simulation of incompressible flow past cylinders, in which we incorporate the LBM D2Q9 and CA techniques to simulate the fluid flow and particle loading, respectively. For the LBM part, the theories of boundary conditions are studied and verified using the Poiseuille flow test. For the CA part, several models for simulating particles are explained, and a new Digital Differential Analyzer (DDA) algorithm is introduced to simulate particle motion in the Boolean model. The numerical results are compared with a previous probability velocity model by Masselot [Masselot 2000] and show satisfactory agreement.
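A Digital Differential Analyzer steps a particle along its displacement in fixed increments and records each lattice cell it enters. A minimal 2D sketch of the idea (the report's actual DDA for the Boolean model may differ; unit cells and non-negative coordinates are assumed here):

```python
def dda_cells(x0, y0, x1, y1):
    """Grid cells visited by a particle moving from (x0, y0) to (x1, y1)
    on a unit lattice. The DDA chooses the dominant axis, advances in
    equal sub-steps, and records each new cell the particle occupies."""
    dx, dy = x1 - x0, y1 - y0
    steps = max(int(max(abs(dx), abs(dy))), 1)
    sx, sy = dx / steps, dy / steps
    cells, x, y = [], x0, y0
    for _ in range(steps + 1):
        cell = (int(x), int(y))          # cell containing the current point
        if not cells or cells[-1] != cell:
            cells.append(cell)
        x, y = x + sx, y + sy
    return cells
```

In a Boolean deposition model, each visited cell would then be tested for an existing solid obstacle to decide whether the particle deposits or continues.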

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND AND PURPOSE: The major goal of acute ischemic stroke treatment is fast and sufficient recanalization. Percutaneous transluminal balloon angioplasty (PTA) and/or placement of a stent might achieve both by compressing the thrombus at the occlusion site. This study assesses the feasibility, recanalization rate, and complications of the 2 techniques in an animal model. MATERIALS AND METHODS: Thirty cranial vessels of 7 swine were occluded by injection of radiopaque thrombi. Fifteen vessel occlusions were treated by PTA alone and 15 by placement of a stent and postdilation. Recanalization was documented immediately after treatment and after 1, 2, and 3 hours. Thromboembolic events and dissections were documented. RESULTS: PTA was significantly faster to perform (mean, 16.6 minutes versus 33.0 minutes for stent placement; P < .001), but the mean recanalization rate after 1 hour was significantly better after stent placement compared with PTA alone (67.5% versus 14.6%, P < .001). Due to the self-expanding force of the stent, vessel diameter further increased with time, whereas the recanalization result after PTA was prone to reocclusion. Aside from thromboembolic events related to the passing maneuvers at the occlusion site, no thrombus fragmentation or embolization occurred during balloon inflation or stent deployment. Flow to side branches could also be restored at the occlusion site owing to directed thrombus compression. CONCLUSIONS: Stent placement and postdilation proved to be much more efficient in terms of acute and short-term vessel recanalization compared with PTA alone.

Relevance:

10.00%

Publisher:

Abstract:

The characteristics of moving sound sources have strong implications for the listener's distance perception and the estimation of velocity. Modifications of typical sound emissions, as are currently occurring due to the tendency towards electromobility, have an impact on pedestrians' safety in road traffic. Thus, investigations of the relevant cues for velocity and distance perception of moving sound sources are not only of interest for the psychoacoustic community, but also for several applications, such as virtual reality, noise pollution and safety aspects of road traffic. This article describes a series of psychoacoustic experiments in this field. Dichotic and diotic stimuli from a set of real-life recordings of a passing passenger car and a motorcycle were presented to test subjects, who were asked to determine the velocity of the object and its minimal distance from the listener. The results of these psychoacoustic experiments show that the estimated velocity is strongly linked to the object's distance. Furthermore, it could be shown that binaural cues contribute significantly to the perception of velocity. In a further experiment, it was shown that, independently of the type of the vehicle, the main parameter for distance determination is the maximum sound pressure level at the listener's position. The article suggests a system architecture for the adequate consideration of moving sound sources in virtual auditory environments. Virtual environments can thus be used to investigate the influence of new vehicle powertrain concepts and the related sound emissions of these vehicles on pedestrians' ability to estimate the distance and velocity of moving objects.

Relevance:

10.00%

Publisher:

Abstract:

We present in this paper several contributions to collision detection optimization centered on hardware performance. We focus on the broad phase, the first step of the collision detection process, and propose three new ways of parallelizing the well-known Sweep and Prune algorithm. We first developed a multi-core model that takes into account the number of available cores. Multi-core architecture enables us to distribute geometric computations with the use of multi-threading. Critical writing sections and thread idling have been minimized by introducing new data structures for each thread. Programming with directives, such as OpenMP, appears to be a good compromise for code portability. We then propose a new GPU-based algorithm, also based on Sweep and Prune, that has been adapted to multi-GPU architectures. Our technique is based on a spatial subdivision method used to distribute computations among GPUs. Results show that significant speed-up can be obtained when passing from 1 to 4 GPUs in a large-scale environment.
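The serial single-axis pass of Sweep and Prune, which is the baseline these parallelizations distribute across cores and GPUs, can be sketched as follows. The event-sort formulation here is one common variant, not the authors' implementation:

```python
def sweep_and_prune(boxes):
    """Broad-phase pass of Sweep and Prune along one axis.

    boxes: list of (min_x, max_x) intervals, one per object's AABB.
    Returns sorted index pairs whose intervals overlap on this axis;
    only these candidate pairs proceed to the narrow-phase test.
    """
    # endpoint events: (coordinate, is_end, box index); sorting places a
    # start (0) before an end (1) at equal coordinates, so touching
    # intervals count as overlapping
    events = []
    for i, (lo, hi) in enumerate(boxes):
        events.append((lo, 0, i))
        events.append((hi, 1, i))
    events.sort()
    active, pairs = set(), []
    for _, is_end, i in events:
        if is_end:
            active.discard(i)          # interval closes: leave active set
        else:
            for j in active:           # interval opens: overlaps all active
                pairs.append(tuple(sorted((i, j))))
            active.add(i)
    return sorted(pairs)
```

In a full 3D broad phase the same sweep runs per axis (or the axis results are intersected), and the parallel variants partition either the sorted endpoint list or the spatial subdivision cells across processing units.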

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND & AIMS Development of strictures is a major concern for patients with eosinophilic esophagitis (EoE). At diagnosis, EoE can present with an inflammatory phenotype (characterized by whitish exudates, furrows, and edema), a stricturing phenotype (characterized by rings and stenosis), or a combination of these. Little is known about the progression of stricture formation; we evaluated stricture development over time in the absence of treatment and investigated risk factors for stricture formation. METHODS We performed a retrospective study using the Swiss EoE Database, collecting data on 200 patients with symptomatic EoE (153 men; mean age at diagnosis, 39 ± 15 years). Stricture severity was graded based on the degree of difficulty associated with passing the standard adult endoscope. RESULTS The median delay in diagnosis of EoE was 6 years (interquartile range, 2-12 years). With increasing duration of delay in diagnosis, the prevalence of fibrotic features of EoE, based on endoscopy, increased from 46.5% (diagnostic delay, 0-2 years) to 87.5% (diagnostic delay, >20 years; P = .020). Similarly, the prevalence of esophageal strictures increased with duration of diagnostic delay, from 17.2% (diagnostic delay, 0-2 years) to 70.8% (diagnostic delay, >20 years; P < .001). Diagnostic delay was the only risk factor for strictures at the time of EoE diagnosis (odds ratio = 1.08; 95% confidence interval: 1.040-1.122; P < .001). CONCLUSIONS The prevalence of esophageal strictures correlates with the duration of untreated disease. These findings indicate the need to minimize delay in diagnosis of EoE.

Relevance:

10.00%

Publisher:

Abstract:

The transverse broadening of an energetic jet passing through a non-Abelian plasma is believed to be described by the thermal expectation value of a light-cone Wilson loop. In this exploratory study, we measure the light-cone Wilson loop with classical lattice gauge theory simulations. We observe, as suggested by previous studies, that there are strong interactions already at short transverse distances, which may lead to more efficient jet quenching than in leading-order perturbation theory. We also verify that the asymptotics of the Wilson loop do not change qualitatively when crossing the light cone, which supports arguments in the literature that infrared contributions to jet quenching can be studied with dimensionally reduced simulations in the space-like domain. Finally we speculate on possibilities for full four-dimensional lattice studies of the same observable, perhaps by employing shifted boundary conditions in order to simulate ensembles boosted by an imaginary velocity.