872 results for High Performance Computing
Abstract:
The definition and programming of distributed applications has become a major research issue due to the increasing availability of (large-scale) distributed platforms and the requirements posed by economic globalization. However, such a task requires a huge effort due to the complexity of distributed environments: large numbers of users may communicate and share information across different authority domains; moreover, the "execution environment" or "computations" are dynamic, since the number of users and the computational infrastructure change over time. Grid environments, in particular, promise to be an answer to such complexity by providing high-performance execution support to large numbers of users and resource sharing across different organizations. Nevertheless, programming in Grid environments is still a difficult task: there is a lack of high-level programming paradigms and support tools that may guide the application developer and allow reuse of state-of-the-art solutions. The main goal of the work presented in this thesis is to contribute to simplifying the development cycle of applications for Grid environments by bringing structure and flexibility to three stages of that cycle through a common model. The stages are: the design phase, the execution phase, and the reconfiguration phase. The common model is based on the manipulation of patterns through pattern operators, and on the division of both patterns and operators into two categories, namely structural and behavioural. Moreover, both structural and behavioural patterns are first-class entities at each of the aforesaid stages. At the design phase, patterns can be manipulated like other first-class entities such as components, allowing a more structured way to build applications by reusing and composing state-of-the-art patterns. At the execution phase, patterns are units of execution control: it is possible, for example, to start, stop, and resume the execution of a pattern as a single entity. At the reconfiguration phase, patterns can also be manipulated as single entities, with the additional advantage that it is possible to perform a structural reconfiguration while keeping some of the behavioural constraints, and vice versa: for example, a behavioural pattern that was applied to some structural pattern can be replaced with another behavioural pattern. Besides proposing this methodology for distributed application development, this thesis defines a relevant set of pattern operators. The methodology and the expressivity of the pattern operators were assessed through the development of several representative distributed applications. To support this validation, a prototype was designed and implemented, encompassing some relevant patterns and a significant part of the pattern operators defined. This prototype was based on the Triana environment, which supports the development and deployment of distributed applications on the Grid through a dataflow-based programming model. Additionally, this thesis presents an analysis of a mapping of some operators for execution control onto the Distributed Resource Management Application API (DRMAA). This assessment confirmed the suitability of the proposed model, as well as the generality and flexibility of the defined pattern operators.
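The abstract treats patterns as first-class, composable units of execution control. The following minimal C sketch illustrates that general idea; all names (Pattern, pattern_embed, pattern_set_state) are hypothetical and do not reflect the thesis's actual Triana-based prototype.

/* Hypothetical sketch: a pattern composed of sub-patterns that can be
 * started and paused as a single entity. Illustrative only. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

typedef enum { STOPPED, RUNNING, PAUSED } ExecState;

typedef struct Pattern {
    char name[32];
    ExecState state;
    struct Pattern *children[8];  /* structural composition */
    int n_children;
} Pattern;

Pattern *pattern_new(const char *name) {
    Pattern *p = calloc(1, sizeof(Pattern));
    strncpy(p->name, name, sizeof(p->name) - 1);
    p->state = STOPPED;
    return p;
}

/* Structural operator: embed one pattern inside another. */
void pattern_embed(Pattern *parent, Pattern *child) {
    if (parent->n_children < 8)
        parent->children[parent->n_children++] = child;
}

/* Behavioural control: drive a whole pattern (and its sub-patterns)
 * as a single unit of execution control. */
void pattern_set_state(Pattern *p, ExecState s) {
    p->state = s;
    for (int i = 0; i < p->n_children; i++)
        pattern_set_state(p->children[i], s);
}

int main(void) {
    Pattern *pipeline = pattern_new("pipeline");
    pattern_embed(pipeline, pattern_new("producer"));
    pattern_embed(pipeline, pattern_new("consumer"));
    pattern_set_state(pipeline, RUNNING);  /* start the pattern as one entity */
    pattern_set_state(pipeline, PAUSED);   /* ...and pause it as one entity   */
    printf("%s has %d sub-patterns\n", pipeline->name, pipeline->n_children);
    return 0;
}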
Abstract:
Wireless Body Area Networks (WBANs) have emerged as a promising technology for medical and non-medical applications. WBANs consist of a number of miniaturized, portable, and autonomous sensor nodes that are used for long-term health monitoring of patients. These sensor nodes continuously collect patient data, which are used for ubiquitous health monitoring. In addition, WBANs may be used for managing catastrophic events and for increasing the effectiveness and performance of rescue forces. The huge amount of data collected by WBAN nodes demands a scalable, on-demand, powerful, and secure storage and processing infrastructure. Cloud computing is expected to play a significant role in achieving these objectives. The cloud computing environment links devices ranging from miniaturized sensor nodes to high-performance supercomputers to deliver people-centric and context-centric services to individuals and industries. The possible integration of WBANs with cloud computing (WBAN-cloud) would introduce a viable hybrid platform able to process the huge amount of data collected from multiple WBANs. This WBAN-cloud would enable users (including physicians and nurses) to globally access processing and storage infrastructure at competitive costs. Because WBANs forward life-critical information to the cloud, which may operate in distributed and hostile environments, novel security mechanisms are required to prevent malicious interactions with the storage infrastructure. Both cloud providers and users must take strong security measures to protect the storage infrastructure.
Abstract:
Dissertation for obtaining the degree of Master in Informatics Engineering
Abstract:
Poster presented in the Work in Progress Session of the 28th GI/ITG International Conference on Architecture of Computing Systems (ARCS 2015), 24-26 March 2015, Porto, Portugal.
Abstract:
Premature degradation of ordinary Portland cement (OPC) concrete infrastructure is a current and serious problem, with overwhelming costs amounting to several trillion dollars. Treating concrete surfaces with waterproofing materials to prevent the ingress of aggressive substances is an important way of enhancing concrete durability. The most common surface treatments use polymeric resins based on epoxy, silicone (siloxane), acrylics, polyurethanes, or polymethacrylate. However, epoxy resins have low resistance to ultraviolet radiation, while polyurethanes are sensitive to highly alkaline environments. Geopolymers constitute a group of materials with high resistance to chemical attack that could also be used for coating concrete infrastructure exposed to harsh chemical environments. This article presents the results of an experimental investigation of the resistance to chemical attack (by sulfuric and nitric acid) of several materials: OPC concrete, high-performance concrete (HPC), an epoxy resin, an acrylic paint, and a fly-ash-based geopolymeric mortar. Each acid was used at high concentrations of 10%, 20%, and 30% to simulate long-term degradation by chemical attack. The results show that the epoxy resin had the best resistance to chemical attack, irrespective of acid type and concentration.
Abstract:
Integrated Master's dissertation in Civil Engineering
Abstract:
An appropriate assessment of end-to-end network performance presumes highly efficient time tracking and measurement, with precise control of the stopping and resuming of program operation. In this paper, a novel approach to the problems of highly efficient and precise time measurement on PC platforms and ARM architectures is proposed. A new unified High Performance Timer and a corresponding software library offer a unified interface to the known time counters and automatically identify the fastest and most reliable time source available in the user space of a computing system. The research focuses on acquiring time directly from the hardware, thereby substituting the common way of obtaining time values through Linux system calls. The presented approach obtains time values with nanosecond precision much faster than conventional means. Moreover, it can handle sequential time values, precise sleep functions, and process resuming. This reduces the computing resources wasted during the execution of a sleeping process from 100% (busy-wait) to 1-1.5%, while the benefits of very accurate process resume times on long waits are maintained.
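As a rough illustration of the general approach the paper describes (reading time directly from hardware rather than only through a system call), the following C sketch compares clock_gettime() with an inline read of the x86 Time Stamp Counter. It is a minimal example that assumes an x86 CPU with GCC-style inline assembly; it is not the paper's actual High Performance Timer library.

/* Sketch: measure the cost of one clock_gettime() call in TSC cycles.
 * Assumes x86 and a GCC-compatible compiler; illustrative only. */
#include <stdio.h>
#include <stdint.h>
#include <time.h>

static inline uint64_t rdtsc(void) {
    uint32_t lo, hi;
    __asm__ __volatile__("rdtsc" : "=a"(lo), "=d"(hi));
    return ((uint64_t)hi << 32) | lo;
}

int main(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);  /* warm up the conventional path */
    uint64_t c0 = rdtsc();                /* direct hardware read */
    clock_gettime(CLOCK_MONOTONIC, &ts);  /* one timed system call */
    uint64_t c1 = rdtsc();
    printf("clock_gettime cost: ~%llu TSC cycles\n",
           (unsigned long long)(c1 - c0));
    printf("monotonic time: %ld.%09ld s\n", (long)ts.tv_sec, ts.tv_nsec);
    return 0;
}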
Abstract:
Objectives and Methods: Self-report studies have shown an association between music performance anxiety (MPA) and hyperventilation complaints. However, hyperventilation has never been assessed physiologically in MPA. This study investigated the self-reported affective experience, self-reported physiological symptoms, and cardiorespiratory variables, including the partial pressure of end-tidal CO2 (PetCO2), an indicator of hyperventilation, in 67 music students before a private and a public performance. The response coherence between these response domains was also investigated.
Results: From the private to the public session, the intensity of all self-report variables increased (all p values < .001). As predicted, the higher the musician's usual MPA level, the larger these increases were (p values < .10). With the exception of PetCO2, the main cardiorespiratory variables also increased from the private to the public session (p values < .05). These increases were not modulated by the usual MPA level (p values > .10). PetCO2 showed a unique response pattern, reflected by an MPA-by-session interaction (p < .01): it increased from the private to the public session for musicians with low MPA levels and decreased for musicians with high MPA levels. Self-reported physiological symptoms were related to the self-reported affective experience (p values < .05) rather than to physiological measures (p values > .17).
Conclusions: These findings show for the first time how respiration is stimulated before a public performance in music students with different MPA levels. The hypothesis of a hyperventilation tendency in highly performance-anxious musicians is supported. The response coherence between physiological symptoms and physiological activation is weak.
Abstract:
Performing publicly has become increasingly important in a variety of professions and is associated with performance anxiety in almost all performers. Whereas some performers successfully cope with this anxiety, for others it represents a major problem and can even threaten their career. Musicians, and especially music students, have been shown to be particularly affected by performance anxiety.
The goal of this PhD thesis was therefore to gain a better understanding of performance anxiety in university music students. More precisely, the first part of the thesis aimed at increasing knowledge on the occurrence, the experience, and the management of performance anxiety (Article 1). The second part aimed at investigating the hypothesis that there is an underlying hyperventilation problem in musicians with a high level of anxiety before a performance. This hypothesis was addressed in two ways: firstly, by investigating the association between the negative affective dimension of music performance anxiety (MPA) and self-perceived physiological symptoms that are known to co-occur with hyperventilation (Article 2), and secondly, by analyzing this association on the physiological level before a private (audience-free) and a public performance (Article 3). Article 4 places some key variables of Article 3 in a larger context by jointly analyzing the phases before, during, and after performing.
The main results of the self-report data show (a) that stage fright is experienced as a problem by one third of the surveyed students, (b) that the students express a considerable need for more help to better cope with it, and (c) that there is a positive association between negative feelings of MPA and self-reported hyperventilation complaints before performing. This latter finding was confirmed on the physiological level by a tendency of particularly performance-anxious musicians to hyperventilate. Furthermore, psycho-physiological activation increased from the private to the public performance and was higher during the performances than before or after them. The physiological activation was mainly independent of the MPA score. Finally, there was a low response coherence between the actual physiological activation and the self-reports on instantaneous anxiety, tension, and perceived physiological activation.
Given the high proportion of music students who consider stage fright a problem, and given their need for more help in coping with it, a better understanding of this phenomenon and its inclusion in the educational process are fundamental to prevent future occupational problems. On the physiological level, breathing exercises might be a good means to decrease, but also to increase, the arousal associated with a public performance, in order to meet the optimal level of arousal needed for a good performance.
Abstract:
This work presents the technologies and processes used in building the aforementioned online store and its management system. The objective is to create software that is flexible, portable, reusable, high-performance, and easy to understand.
Abstract:
Technological limitations and power constraints are resulting in high-performance parallel computing architectures based on large numbers of high-core-count processors. Commercially available processors now have 8 and 16 cores, and experimental platforms, such as the many-core Intel Single-chip Cloud Computer (SCC), provide much higher core counts. These trends present new challenges to HPC applications, including programming complexity and the need for extreme energy efficiency.
In this work, we first investigate the power behavior of scientific PGAS application kernels on the SCC platform and explore opportunities and challenges for power management within the PGAS framework. Results obtained via empirical evaluation of Unified Parallel C (UPC) applications on the SCC platform under different constraints show that, for specific operations, the potential for energy savings in PGAS is large, and that power/performance trade-offs can be effectively managed using a cross-layer approach. We investigate cross-layer power management using PGAS language extensions and runtime mechanisms that manipulate power/performance trade-offs. Specifically, we present the design, implementation, and evaluation of such a middleware for application-aware cross-layer power management of UPC applications on the SCC platform. Finally, based on our observations, we provide a set of recommendations and insights that can be used to support similar power management for PGAS applications on other many-core platforms.
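The cross-layer idea can be pictured as the application signalling phase changes down to a power-aware runtime. The C sketch below stubs a hypothetical set_power_hint() call around a compute-bound and a communication-bound phase; the actual UPC language extensions and SCC power controls evaluated in the paper are not reproduced here.

/* Hedged sketch of application-aware power management. set_power_hint()
 * is a hypothetical stand-in for a runtime power interface. */
#include <stdio.h>

typedef enum { POWER_LOW, POWER_HIGH } PowerHint;

/* Stub: a real runtime would adjust frequency/voltage here. */
static void set_power_hint(PowerHint h) {
    printf("runtime: power hint = %s\n", h == POWER_LOW ? "LOW" : "HIGH");
}

static void compute_phase(double *a, int n) {
    for (int i = 0; i < n; i++) a[i] = a[i] * 1.0001 + 0.5;
}

static void communication_phase(void) {
    /* In real UPC code this would be e.g. upc_barrier or a bulk
     * upc_memget; communication-bound phases tolerate lower frequency. */
}

int main(void) {
    double a[1024] = {0};
    set_power_hint(POWER_HIGH);   /* CPU-bound phase: full frequency */
    compute_phase(a, 1024);
    set_power_hint(POWER_LOW);    /* communication-bound phase: save power */
    communication_phase();
    return 0;
}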
Abstract:
High-dose cefepime therapy is recommended for febrile neutropenia. Safety issues have been raised in a recent meta-analysis reporting an increased risk of mortality during cefepime therapy. Cefepime-related neurological toxicity has been associated with overdosing due to severe renal dysfunction. This study aimed to investigate the association between cefepime plasma concentrations and neurological toxicity in febrile neutropenic patients. Cefepime trough concentrations (measured by high-performance liquid chromatography) were retrospectively analyzed for 30 adult febrile neutropenic patients receiving the recommended high-dose regimen (6 g/day for a glomerular filtration rate [GFR] of >50 ml/min). The adjustment of dose to renal function was evaluated by the ratio of the cefepime daily dose per 100 ml/min of glomerular filtration. The association between cefepime plasma concentrations and neurological toxicity was assessed on the basis of consistent neurological symptoms and/or signs (according to NCI criteria). The median cefepime concentration was 8.7 mg/liter (range, 2.1 to 38 mg/liter) at a median of 4 days (range, 2 to 15 days) after the start of therapy. Neurological toxicity (altered mental status, hallucinations, or myoclonia) was attributed to cefepime in 6/30 (20%) patients (median GFR, 45 ml/min; range, 41 to 65 ml/min) receiving a median dose of 13.2 g/day per 100 ml/min GFR (range, 9.2 to 14.3). Cefepime discontinuation resulted in complete neurological recovery for five patients and improvement for one patient. A multivariate logistic regression model confirmed high cefepime concentrations as an independent predictor of neurological toxicity, with a 50% probability threshold at ≥22 mg/liter (P = 0.05). High cefepime plasma concentrations are thus associated with neurological toxicity in febrile neutropenic patients with mild renal dysfunction. Careful adherence to dosing normalized per 100 ml/min GFR is crucial, and monitoring of plasma concentrations may help prevent neurological toxicity of high-dose therapy for this life-threatening condition.
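The dose normalization used in the study is simple arithmetic: the daily dose divided by the GFR expressed in units of 100 ml/min. The short C example below reproduces the abstract's own numbers: the recommended 6 g/day given at the toxicity group's median GFR of 45 ml/min corresponds to about 13.3 g/day per 100 ml/min GFR, close to the reported median of 13.2.

/* Worked example of the dose normalization described in the abstract. */
#include <stdio.h>

/* Returns grams/day per 100 ml/min of glomerular filtration. */
static double normalized_dose(double daily_dose_g, double gfr_ml_min) {
    return daily_dose_g / (gfr_ml_min / 100.0);
}

int main(void) {
    /* 6 g/day at the study's median GFR of 45 ml/min -> ~13.3 g/day
     * per 100 ml/min GFR, in the range associated with toxicity. */
    printf("%.1f g/day per 100 ml/min GFR\n", normalized_dose(6.0, 45.0));
    return 0;
}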
Abstract:
Earth System Models (ESMs) have been developed successfully over the past few years and are currently being used for simulating present-day climate and for seasonal-to-interannual predictions of climate change. Supercomputer performance plays an important role in climate modelling, since one of the challenging issues for climate modellers is to couple Earth system components efficiently and accurately on present-day computer architectures. At the Barcelona Supercomputing Center (BSC), we work with the EC-Earth System Model. EC-Earth is an ESM which currently consists of an atmosphere (IFS) and an ocean (NEMO) model that communicate with each other through the OASIS coupler. Additional modules (e.g., for chemistry and vegetation) are under development. The EC-Earth ESM has been ported successfully to different high-performance computing platforms (e.g., IBM P6 AIX, Cray XT5, Intel-based Linux clusters, SGI Altix) at different sites in Europe (e.g., KNMI, ICHEC, ECMWF). The objective of the first phase of the project was to identify and document the issues related to the portability and performance of EC-Earth on the MareNostrum supercomputer, a system based on IBM PowerPC 970MP processors running a SUSE Linux distribution. EC-Earth was successfully ported to MareNostrum, and a compilation incompatibility was solved by a two-step compilation approach using the XLF version 10.1 and 12.1 compilers. In addition, EC-Earth performance was analyzed with respect to scalability, and traces were analyzed with the Paraver software. This analysis showed that running EC-Earth with a larger number of IFS CPUs (>128) is not feasible at the moment, since some issues exist with the IFS-NEMO load balance and MPI communications.
Abstract:
BACKGROUND: The diagnosis of hypertension in children is difficult because of the multiple sex-, age-, and height-specific thresholds used to define elevated blood pressure (BP). The blood pressure-to-height ratio (BPHR) has been proposed to facilitate the identification of elevated BP in children. OBJECTIVE: We assessed the performance of BPHR at a single screening visit for identifying children with hypertension, that is, sustained elevated BP. METHODS: In a school-based study conducted in Switzerland, BP was measured at up to three visits in 5207 children. Children were considered hypertensive if BP was elevated at all three visits. Sensitivity, specificity, negative predictive value (NPV), and positive predictive value (PPV) for the identification of hypertension were assessed for different BPHR thresholds. The ability of BPHR at a single screening visit to discriminate between children with and without hypertension was evaluated with receiver operating characteristic (ROC) curve analyses. RESULTS: The prevalence of systolic/diastolic hypertension was 2.2%. Systolic BPHR identified hypertension better than diastolic BPHR (area under the ROC curve: 0.95 vs. 0.84). The best performance was obtained with a systolic BPHR threshold of 0.80 mmHg/cm (sensitivity: 98%; specificity: 85%; PPV: 12%; NPV: 100%) and a diastolic BPHR threshold of 0.45 mmHg/cm (sensitivity: 79%; specificity: 70%; PPV: 5%; NPV: 99%). The PPV was higher among tall or overweight children. CONCLUSION: BPHR at a single screening visit identified hypertension in children with high accuracy, although the low prevalence of hypertension led to a low PPV.
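BPHR itself is simply blood pressure divided by height, compared against the thresholds reported above. The C snippet below screens a hypothetical child (systolic BP 120 mmHg, height 140 cm; illustrative values, not from the study) against the reported 0.80 mmHg/cm systolic threshold.

/* Worked example of BPHR screening using the abstract's threshold. */
#include <stdio.h>

static double bphr(double bp_mmHg, double height_cm) {
    return bp_mmHg / height_cm;
}

int main(void) {
    /* Hypothetical child: systolic BP 120 mmHg, height 140 cm. */
    double r = bphr(120.0, 140.0);
    /* Systolic threshold reported in the study: 0.80 mmHg/cm. */
    printf("systolic BPHR = %.2f mmHg/cm -> %s\n", r,
           r >= 0.80 ? "refer for repeat measurement" : "below threshold");
    return 0;
}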
Abstract:
CodeML (part of the PAML package) implements a maximum-likelihood-based approach to detect positive selection on a specific branch of a given phylogenetic tree. While CodeML is widely used, it is very compute-intensive. We present SlimCodeML, an optimized version of CodeML for the branch-site model. Our performance analysis shows that SlimCodeML substantially outperforms CodeML (up to 9.38 times faster), especially for large-scale genomic analyses.