901 results for Many-core systems


Relevance: 30.00%

Abstract:

The electric utility business is an inherently dangerous field in which employees are exposed to many potential hazards daily. One such hazard is an arc flash: a rapid release of energy, referred to as incident energy, caused by an electric arc. Because arc flashes occur randomly, one can only prepare for such a violent event and minimize the harm to oneself and other employees and the damage to equipment. Effective January 1, 2009, the National Electrical Safety Code (NESC) requires companies whose employees work on or near energized equipment to perform an arc-flash assessment to determine the potential exposure to an electric arc. To comply with this requirement, Minnesota Power's (MP's) existing short-circuit and relay-coordination software package, ASPEN OneLiner™, one of the first packages to implement an arc-flash module, is used to conduct an arc-flash hazard analysis. The package is also benchmarked against the equations provided in IEEE Std 1584-2002 and ultimately used to determine incident energy levels on the MP transmission system. This report covers the history of arc-flash hazards, analysis methods (both software-based and empirically derived equations), issues of concern with the calculation methods, and the work conducted at MP. The work also produced two offline software products for conducting and verifying arc-flash hazard analyses.
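
The IEEE Std 1584-2002 equations that the software package is benchmarked against can be illustrated with a short calculation. The sketch below implements the empirically derived arcing-current and incident-energy relations for systems above 1 kV as commonly presented in the standard (whose empirical model covers systems up to 15 kV); the example parameters (gap, clearing time, working distance, distance exponent) are assumed purely for illustration and are not taken from the MP study.

```python
import math

def arcing_current_hv(i_bf_ka):
    """Arcing current (kA) for systems above 1 kV, per the IEEE 1584-2002
    empirical relation lg(Ia) = 0.00402 + 0.983*lg(Ibf)."""
    return 10 ** (0.00402 + 0.983 * math.log10(i_bf_ka))

def incident_energy(i_a_ka, gap_mm, t_s, dist_mm, x, k1=-0.555, k2=0.0, cf=1.0):
    """Incident energy (cal/cm^2) at the working distance.

    The energy is first normalized to 0.2 s and 610 mm,
        lg(En) = K1 + K2 + 1.081*lg(Ia) + 0.0011*G,
    then scaled to the actual clearing time and distance,
        E = Cf * En * (t / 0.2) * (610 / D)^x.
    K1 = -0.555 for equipment in a box, K2 = 0 for ungrounded or
    high-resistance grounded systems, Cf = 1.0 above 1 kV.
    """
    en = 10 ** (k1 + k2 + 1.081 * math.log10(i_a_ka) + 0.0011 * gap_mm)
    return cf * en * (t_s / 0.2) * (610.0 / dist_mm) ** x

# Illustrative (assumed) values: 15 kA bolted fault, 152 mm gap,
# 0.5 s clearing time, 910 mm working distance, distance exponent 0.973.
ia = arcing_current_hv(15.0)
e = incident_energy(ia, gap_mm=152, t_s=0.5, dist_mm=910, x=0.973)
print(f"Arcing current: {ia:.1f} kA, incident energy: {e:.1f} cal/cm^2")
```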

Relevance: 30.00%

Abstract:

Worldwide, rural populations are far less likely to have access to clean drinking water than are urban ones. In many developing countries, the current approach to rural water supply uses a model of demand-driven, community-managed water systems. In Suriname, South America, rural populations have limited access to improved water supplies; community-managed water supply systems have been installed in several rural communities by nongovernmental organizations as part of the solution. To date, there has been no review of the performance of these water supply systems. This report presents the results of an investigation of three rural water supply systems constructed in Saramaka villages in the interior of Suriname. The investigation used a combination of qualitative and quantitative methods, coupled with ethnographic information, to construct a comprehensive overview of these water systems. This overview includes the water use of the communities, the current status of the water supply systems, the histories and sustainability of the water supply projects, technical reviews, and community perceptions. From this overview, factors important to the sustainability of these water systems were identified. Community water supply systems are engineered solutions that operate through social cooperation. The results from this investigation show that technical adequacy is the first and most critical factor for the long-term sustainability of a water system. They also show that technical adequacy depends on the appropriateness of the engineering design for the social, cultural, and natural setting in which it is implemented. The complex relationships between technical adequacy, community support, and the involvement of women play important roles in the success of water supply projects. Addressing these factors during the project process and taking advantage of alternative water resources may increase the supply of improved drinking water to rural communities.

Relevance: 30.00%

Abstract:

OBJECTIVE: To describe the electronic medical databases used in antiretroviral therapy (ART) programmes in lower-income countries and assess the measures such programmes employ to maintain and improve data quality and reduce the loss of patients to follow-up. METHODS: In 15 countries of Africa, South America and Asia, a survey was conducted from December 2006 to February 2007 on the use of electronic medical record systems in ART programmes. Patients enrolled in the sites at the time of the survey but not seen during the previous 12 months were considered lost to follow-up. The quality of the data was assessed by computing the percentage of missing key variables (age, sex, clinical stage of HIV infection, CD4+ lymphocyte count and year of ART initiation). Associations between site characteristics (such as number of staff members dedicated to data management), measures to reduce loss to follow-up (such as the presence of staff dedicated to tracing patients) and data quality and loss to follow-up were analysed using multivariate logit models. FINDINGS: Twenty-one sites that together provided ART to 50 060 patients were included (median number of patients per site: 1000; interquartile range, IQR: 72-19 320). Eighteen sites (86%) used an electronic database for medical record-keeping; 15 (83%) such sites relied on software intended for personal or small business use. The median percentage of missing data for key variables per site was 10.9% (IQR: 2.0-18.9%) and declined with training in data management (odds ratio, OR: 0.58; 95% confidence interval, CI: 0.37-0.90) and weekly hours spent by a clerk on the database per 100 patients on ART (OR: 0.95; 95% CI: 0.90-0.99). About 10 weekly hours per 100 patients on ART were required to reduce missing data for key variables to below 10%. The median percentage of patients lost to follow-up 1 year after starting ART was 8.5% (IQR: 4.2-19.7%). Strategies to reduce loss to follow-up included outreach teams, community-based organizations and checking death registry data. Implementation of all three strategies substantially reduced losses to follow-up (OR: 0.17; 95% CI: 0.15-0.20). CONCLUSION: The quality of the data collected and the retention of patients in ART treatment programmes are unsatisfactory for many sites involved in the scale-up of ART in resource-limited settings, mainly because of insufficient staff trained to manage data and trace patients lost to follow-up.
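
As a rough illustration of the two quantities at the heart of this analysis, the percentage of missing key variables per site and the multivariate logit models relating data quality to site characteristics, the following sketch uses invented records and a single site characteristic; the field names, values, and use of pandas/statsmodels are assumptions for illustration and are not the survey's actual analysis code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative patient-level records. The key variables checked for
# completeness follow the survey (age, sex, clinical stage, CD4+ count,
# year of ART initiation); the values and site characteristics are invented.
records = pd.DataFrame({
    "site":        ["A"] * 4 + ["B"] * 4,
    "age":         [34, None, 29, 41, None, None, 52, 38],
    "sex":         ["F", "M", None, "F", "M", None, "F", "M"],
    "stage":       [3, 2, None, 4, None, 1, None, 2],
    "cd4":         [180, None, 220, 95, None, 310, None, 150],
    "art_year":    [2005, 2006, 2006, None, 2005, None, 2006, 2006],
    "clerk_hours": [12] * 4 + [3] * 4,  # weekly clerk hours per 100 patients on ART
})
key_vars = ["age", "sex", "stage", "cd4", "art_year"]

# Per-site percentage of missing key variables (the data-quality measure).
pct_missing = records.groupby("site")[key_vars].apply(
    lambda g: 100 * g.isna().to_numpy().mean())
print(pct_missing)

# Logit model of "a key value is missing" against a site characteristic,
# in the spirit of the multivariate logit models described above.
long = records.melt(id_vars=["site", "clerk_hours"], value_vars=key_vars)
long["missing"] = long["value"].isna().astype(int)
fit = smf.logit("missing ~ clerk_hours", data=long).fit(disp=0)
print(np.exp(fit.params))  # odds ratios per additional weekly clerk hour
```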

Relevance: 30.00%

Abstract:

Nanoparticles are fascinating because their physical and optical properties depend on size. Highly controllable synthesis and assembly methods are essential [6] for innovative technological applications. Among nanoparticles, nonhomogeneous core-shell nanoparticles (CSnp) exhibit new properties that arise when the relative dimensions of the core and the shell are varied. This CSnp structure enables various optical resonances and engineered energy barriers, in addition to a high charge-to-surface ratio. Assembly of homogeneous nanoparticles into functional structures has become ubiquitous in biosensors (i.e. optical labeling) [7, 8], nanocoatings [9-13], and electrical circuits [14, 15], whereas assembly of nonhomogeneous nanoparticles has been explored only to a limited extent. Many conventional nanoparticle assembly methods exist, but this work explores dielectrophoresis (DEP) as a new method. DEP is the polarization of particles suspended in conductive fluids by non-uniform electric fields. Most prior DEP efforts involve microscale particles. Prior work on core-shell nanoparticle assemblies and, separately, on nanoparticle characterization by dielectrophoresis and electrorotation [2-5] did not systematically explore particle size, dielectric properties (permittivity and electrical conductivity), shell thickness, particle concentration, medium conductivity, and frequency. This work is, to the best of our knowledge, the first to systematically examine these dielectrophoretic properties for core-shell nanoparticles. Further, we conduct a parametric fitting to traditional core-shell models. These biocompatible core-shell nanoparticles were studied to fill a knowledge gap in the DEP field. Experimental results (chapter 5) first examine the medium-conductivity, size, and shell-material dependencies of the dielectrophoretic assembly of spherical CSnp into 2D and 3D particle assemblies. Chitosan (amino sugar) and poly-L-lysine (amino acid, PLL) CSnp shell materials were custom synthesized around a hollow (gas) core by using a phospholipid micelle around a volatile fluid as a template for the shell material; this approach proves to be novel and distinct from conventional core-shell models in which a conductive core is coated with an insulative shell. Experiments were conducted in a 100 nl chamber housing 100 µm wide Ti/Au quadrupole electrodes spaced 25 µm apart. Frequencies from 100 kHz to 80 MHz at a fixed local field of 5 Vpp were tested with 10⁻⁵ and 10⁻³ S/m medium conductivities for 25 seconds. Dielectrophoretic responses of ~220 and ~340 (or ~400) nm chitosan or PLL CSnp were compiled as a function of medium conductivity, size, and shell material.
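
The "traditional core-shell models" used for the parametric fitting are typically variants of the single-shell Clausius-Mossotti factor: core and shell are lumped into an effective complex permittivity whose contrast with the suspending medium determines positive or negative DEP. The sketch below is a generic single-shell calculation; the material parameters for the gas core, polymer shell, and medium are invented for illustration and are not the fitted values from this work.

```python
import numpy as np

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def eps_complex(eps_r, sigma, omega):
    """Complex permittivity eps* = eps - j*sigma/omega."""
    return eps_r * EPS0 - 1j * sigma / omega

def single_shell_cm(omega, r_core, d_shell,
                    eps_core, sig_core, eps_shell, sig_shell,
                    eps_med, sig_med):
    """Clausius-Mossotti factor of a core-shell sphere (single-shell model).

    The core (1) and shell (2) are first combined into an effective
    particle permittivity, which is then contrasted with the medium (m):
        K12    = (e1* - e2*) / (e1* + 2 e2*)
        e_eff* = e2* * (g^3 + 2 K12) / (g^3 - K12),  g = (r_core + d) / r_core
        K_CM   = (e_eff* - e_m*) / (e_eff* + 2 e_m*)
    Re[K_CM] > 0 means positive DEP (collection at high-field regions).
    """
    e1 = eps_complex(eps_core, sig_core, omega)
    e2 = eps_complex(eps_shell, sig_shell, omega)
    em = eps_complex(eps_med, sig_med, omega)
    g = (r_core + d_shell) / r_core
    k12 = (e1 - e2) / (e1 + 2 * e2)
    e_eff = e2 * (g**3 + 2 * k12) / (g**3 - k12)
    return (e_eff - em) / (e_eff + 2 * em)

# Assumed illustrative parameters: ~220 nm particle with a gas core and a
# thin polymer (chitosan-like) shell in a 1e-3 S/m aqueous medium.
freqs = np.logspace(5, np.log10(80e6), 200)  # 100 kHz .. 80 MHz
re_kcm = np.array([single_shell_cm(2 * np.pi * f,
                                   r_core=95e-9, d_shell=15e-9,
                                   eps_core=1.0,   sig_core=1e-6,   # gas core
                                   eps_shell=60.0, sig_shell=5e-4,  # hydrated polymer shell
                                   eps_med=78.5,   sig_med=1e-3).real
                   for f in freqs])
crossovers = freqs[np.where(np.diff(np.sign(re_kcm)))[0]]
print("Re[K_CM] at 100 kHz:", round(float(re_kcm[0]), 3))
print("Re[K_CM] at 80 MHz: ", round(float(re_kcm[-1]), 3))
print("DEP crossover frequencies (Hz):", crossovers)
```

The sign of Re[K_CM] indicates whether particles collect at field maxima (positive DEP) or field minima (negative DEP); in the standard dipole approximation the time-averaged DEP force scales as 2π ε_m r³ Re[K_CM] ∇|E_rms|².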

Relevance: 30.00%

Abstract:

Many reverse engineering approaches have been developed to analyze software systems written in different languages like C/C++ or Java. These approaches typically rely on a meta-model that is either specific to the language at hand or language independent (e.g. UML). However, one language that has hardly been addressed is Lisp. While at first sight it can be accommodated by current language-independent meta-models, Lisp has some unique features (e.g. macros, CLOS entities) that are crucial for reverse engineering Lisp systems. In this paper we propose a suite of new visualizations that reveal the special traits of the Lisp language and thus help in understanding complex Lisp systems. To validate our approach we apply these visualizations to several large Lisp case studies and summarize our experience in terms of a series of recurring visual patterns that we have detected.

Relevance: 30.00%

Abstract:

Software metrics offer us the promise of distilling useful information from vast amounts of software in order to track development progress, to gain insights into the nature of the software, and to identify potential problems. Unfortunately, however, many software metrics exhibit highly skewed, non-Gaussian distributions. As a consequence, usual ways of interpreting these metrics --- for example, in terms of "average" values --- can be highly misleading. Many metrics, it turns out, are distributed like wealth --- with high concentrations of values in selected locations. We propose to analyze software metrics using the Gini coefficient, a higher-order statistic widely used in economics to study the distribution of wealth. Our approach allows us not only to observe changes in software systems efficiently, but also to assess project risks and monitor the development process itself. We apply the Gini coefficient to numerous metrics over a range of software projects, and we show that many metrics not only display remarkably high Gini values, but that these values are remarkably consistent as a project evolves over time.
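
A minimal sketch of the proposed measurement, assuming a simple per-entity metric (hypothetical lines-of-code values per class): the Gini coefficient is computed from the sorted metric values, with values near 0 indicating an even distribution and values near 1 indicating that a few entities concentrate most of the metric mass.

```python
import numpy as np

def gini(values):
    """Gini coefficient of a non-negative metric distribution.

    Uses the sorted-values formula
        G = (2 * sum_i i * x_(i)) / (n * sum_i x_i) - (n + 1) / n
    with 1-based ranks i over the ascending-sorted values x_(i).
    """
    x = np.sort(np.asarray(values, dtype=float))
    n = x.size
    if n == 0 or x.sum() == 0:
        return 0.0
    ranks = np.arange(1, n + 1)
    return (2.0 * np.sum(ranks * x)) / (n * x.sum()) - (n + 1.0) / n

# Hypothetical "lines of code per class" for two snapshots of a project:
# a fairly even distribution versus one dominated by a few god classes.
print(gini([120, 140, 90, 110, 130]))         # low inequality
print(gini([40, 35, 50, 45, 30, 2100, 980]))  # high inequality
```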

Relevance: 30.00%

Abstract:

The many different proxy records from the European Project for Ice Coring in Antarctica (EPICA) Dome C ice core allow for the first time a comparison of nine glacial terminations in great detail. Despite the fact that all terminations cover the transition from a glacial maximum into an interglacial, there are large differences between single terminations. For some terminations, Antarctic temperature increased only moderately, while for others, the amplitude of change at the termination was much larger. For the different terminations, the rate of change in temperature is more similar than the magnitude or duration of change. These temperature changes were accompanied by vast changes in dust and sea salt deposition all over Antarctica. Here we investigate the phasing between a South American dust proxy (non-sea-salt calcium flux, nssCa²⁺), a sea ice proxy (sea salt sodium flux, ssNa⁺) and a proxy for Antarctic temperature (deuterium, δD). In particular, we look into whether a similar sequence of events applies to all terminations, despite their different characteristics. All proxies are derived from the EPICA Dome C ice core, resulting in a relative dating uncertainty between the proxies of less than 20 years. At the start of the terminations, the temperature (δD) increase and dust (nssCa²⁺ flux) decrease start synchronously. The sea ice proxy (ssNa⁺ flux), however, only changes once the temperature has reached a particular threshold, approximately 5°C below present-day temperatures (corresponding to a δD value of −420‰). This reflects to a large extent the limited sensitivity of the sea ice proxy during very cold periods with large sea ice extent. At terminations where this threshold is not reached (TVI, TVIII), ssNa⁺ flux shows no changes. Above this threshold, the sea ice proxy is closely coupled to the Antarctic temperature, and interglacial levels are reached at the same time for both ssNa⁺ and δD. On the other hand, once another threshold at approximately 2°C below present-day temperature is passed (corresponding to a δD value of −402‰), nssCa²⁺ flux has reached interglacial levels and does not change any more, despite further warming. This threshold behaviour most likely results from a combination of changes to the threshold friction velocity for dust entrainment and to the distribution of surface wind speeds in the dust source region.

Relevance: 30.00%

Abstract:

Telescopic systems of structural members with clearance are found in many applications, e.g., mobile cranes, rack feeders, fork lifters, and stacker cranes (see Figure 1). When these machines are operated, undesirable vibrations may reduce performance and increase safety problems. This contribution therefore aims to reduce these harmful vibrations. For a better understanding, the dynamic behaviour of these constructions is analysed. The main interest is the overlapping area between each pair of sections of the systems described above (see the markings in Figure 1), which is investigated by measurements and by computations. A test rig is constructed to determine the dynamic behaviour by measuring fundamental vibrations and higher-frequency oscillations, damping coefficients, and other characteristic phenomena. For an appropriate physical model, the governing boundary value problem is derived by applying Hamilton's principle, and a classical discretisation procedure is used to generate a coupled system of nonlinear ordinary differential equations as the corresponding truncated mathematical model. On the basis of this model, a controller concept for preventing harmful vibrations is developed.
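
For illustration only (this is not the model derived in the paper), the sketch below integrates a minimal two-degree-of-freedom analogue of two telescoping sections coupled through a clearance: the contact stiffness acts only once the relative displacement exceeds the gap, which is one common way such a clearance nonlinearity enters a truncated system of ordinary differential equations. All parameter values are assumed.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Assumed illustrative parameters for two telescoping sections (SI units).
m1, m2 = 50.0, 20.0      # modal masses of outer and inner section
k1, k2 = 4.0e4, 1.5e4    # modal stiffnesses
c1, c2 = 40.0, 15.0      # modal damping coefficients
kc, gap = 2.0e5, 1.0e-3  # contact stiffness and radial clearance

def contact_force(rel):
    """Piecewise-linear clearance law: no force inside the gap,
    stiff restoring force once the sections touch."""
    if rel > gap:
        return kc * (rel - gap)
    if rel < -gap:
        return kc * (rel + gap)
    return 0.0

def rhs(t, y):
    x1, v1, x2, v2 = y
    fc = contact_force(x2 - x1)
    a1 = (-k1 * x1 - c1 * v1 + fc) / m1
    a2 = (-k2 * x2 - c2 * v2 - fc) / m2
    return [v1, a1, v2, a2]

# Release the inner section from an offset larger than the clearance.
sol = solve_ivp(rhs, (0.0, 2.0), [0.0, 0.0, 5e-3, 0.0], max_step=1e-3)
print("peak outer-section response:", float(np.max(np.abs(sol.y[0]))))
```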

Relevance: 30.00%

Abstract:

Multivariate methods are an essential set of tools for data analysis in ecology. They are widely used in ecological research and have long been part of the teaching in the Department of Geobotany at the University of Freiburg. In recent years, the program R was introduced there as the tool for this work. R is a freely available, command-line-oriented statistics program offered for a range of operating systems (R Development Core Team 2007). The program is developing rapidly (currently version 2.10) and is increasingly used by ecologists. To date there is no German-language textbook on applying multivariate methods with R. MultiStaR attempts to close this gap and to provide students with learning materials that include exercises using the actual analysis tool.

Relevance: 30.00%

Abstract:

Earth observations (EO) represent a growing and valuable resource for many scientific, research and practical applications carried out by users around the world. Access to EO data for some applications or activities, like climate change research or emergency response activities, becomes indispensable for their success. However, EO data, or products made from them, are often (or are claimed to be) subject to intellectual property law protection and are licensed under specific conditions regarding access and use. Restrictive conditions on data use can be prohibitive for further work with the data. The Global Earth Observation System of Systems (GEOSS) is an initiative led by the Group on Earth Observations (GEO) with the aim of providing coordinated, comprehensive, and sustained EO and information for making informed decisions in various areas beneficial to societies, their functioning and development. It seeks to share data with users world-wide with the fewest possible restrictions on their use by implementing the GEOSS Data Sharing Principles adopted by GEO. The Principles proclaim full and open exchange of data shared within GEOSS, while recognising relevant international instruments and national policies and legislation through which restrictions on the use of data may be imposed. The paper focuses on the issue of the legal interoperability of data that are shared with varying restrictions on use, with the aim of exploring options for making such data interoperable. The main question it addresses is whether the public domain or its equivalents represent the best mechanism to ensure legal interoperability of data. To this end, the paper analyses legal protection regimes and their norms applicable to EO data. Based on the findings, it highlights the existing public law statutory, regulatory, and policy approaches, as well as private law instruments, such as waivers, licenses and contracts, that may be used to place datasets in the public domain or otherwise make them publicly available for use and re-use without restrictions. It uses GEOSS and its particular characteristics as a system to identify ways to reconcile the vast possibilities it provides through sharing of data from various sources and jurisdictions on the one hand, and the restrictions on the use of the shared resources on the other. On a more general level, the paper seeks to draw attention to the obstacles and potential regulatory solutions for sharing factual or research data for purposes that go beyond research and education.

Relevance: 30.00%

Abstract:

Accumulation and δ¹⁸O data from Alpine ice cores provide information on past temperature and precipitation. However, their correlation with seasonal or annual mean temperature and precipitation at nearby sites is often low. This is partly due to the irregular sampling of the atmosphere by the ice core (i.e. ice cores almost only record precipitation events and not dry periods) and the possible incongruity between annual layers and calendar years. Using daily meteorological data from a nearby station and reanalyses, we replicate the ice core from the Grenzgletscher (Switzerland, 4200 m a.s.l.) on a sample-by-sample basis by calculating precipitation-weighted temperature (PWT) over short intervals. Over the last 15 yr of the ice core record, accumulation and δ¹⁸O variations can be well reproduced on a sub-seasonal scale. This allows a wiggle-matching approach for defining quasi-annual layers, resulting in high correlations between measured quasi-annual δ¹⁸O and PWT. Further back in time, the agreement deteriorates. Nevertheless, we find significant correlations over the entire length of the record (1938-1993) of ice core δ¹⁸O with PWT, but not with annual mean temperature. This is due to the low correlations between PWT and annual mean temperature, a characteristic which in ERA-Interim reanalysis is also found for many other continental mid-to-high-latitude regions. The fact that meteorologically very different years can lead to similar combinations of PWT and accumulation poses limitations to the use of δ¹⁸O from Alpine ice cores for temperature reconstructions. Rather than for reconstructing annual mean temperature, δ¹⁸O from Alpine ice cores should be used to reconstruct PWT over quasi-annual periods. This variable is reproducible in reanalysis or climate model data and could thus be assimilated into conventional climate models.
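
Precipitation-weighted temperature is simply the mean temperature with each day weighted by the precipitation that fell on it, so dry periods, which the ice core does not record, contribute nothing; a minimal sketch with invented daily values:

```python
import numpy as np

def pwt(temps_c, precip_mm):
    """Precipitation-weighted temperature over one (quasi-annual) interval:
    PWT = sum_i(p_i * T_i) / sum_i(p_i). Days without precipitation get
    zero weight, mimicking what the ice core actually records."""
    t = np.asarray(temps_c, dtype=float)
    p = np.asarray(precip_mm, dtype=float)
    return np.sum(p * t) / np.sum(p)

# Invented example: a cold dry spell barely affects PWT, while the same
# days pull down the plain annual mean considerably.
temps  = [-25, -24, -2, -1, 0, -18, -20]
precip = [0.0, 0.1, 6.0, 8.0, 5.0, 0.0, 0.2]
print("PWT:       ", round(pwt(temps, precip), 1))
print("Plain mean:", round(float(np.mean(temps)), 1))
```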

Relevance: 30.00%

Abstract:

BACKGROUND Neuronavigation has become an intrinsic part of preoperative surgical planning and surgical procedures. However, many surgeons have the impression that accuracy decreases during surgery. OBJECTIVE To quantify the decrease of neuronavigation accuracy and identify possible origins, we performed a retrospective quality-control study. METHODS Between April and July 2011, a neuronavigation system was used in conjunction with a specially prepared head holder in 55 consecutive patients. Two different neuronavigation systems were investigated separately. Coregistration was performed with laser-surface matching, paired-point matching using skin fiducials, anatomic landmarks, or bone screws. The initial target registration error (TRE₁) was measured using the nasion as the anatomic landmark. Then, after draping and during surgery, the accuracy was checked at predefined procedural landmark steps (Mayfield measurement point and bone measurement point), and deviations were recorded. RESULTS After initial coregistration, the mean (SD) TRE₁ was 2.9 (3.3) mm. The TRE₁ was significantly dependent on patient positioning, lesion localization, type of neuroimaging, and coregistration method. The following procedures decreased neuronavigation accuracy: attachment of surgical drapes (ΔTRE₂ = 2.7 [1.7] mm), skin retractor attachment (ΔTRE₃ = 1.2 [1.0] mm), craniotomy (ΔTRE₃ = 1.0 [1.4] mm), and Halo ring installation (ΔTRE₃ = 0.5 [0.5] mm). Surgery duration was also a significant factor; the overall ΔTRE was 1.3 [1.5] mm after 30 minutes and increased to 4.4 [1.8] mm after 5.5 hours of surgery. CONCLUSION After registration, there is an ongoing loss of neuronavigation accuracy. The major factors were draping, attachment of skin retractors, and duration of surgery. Surgeons should be aware of this silent loss of accuracy when using neuronavigation.
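
The target registration error reported here is essentially the Euclidean distance between where the navigation system locates a landmark (e.g. the nasion) and where it physically lies; a minimal sketch of how the deviation at each procedural checkpoint would be computed, using made-up coordinates:

```python
import numpy as np

def target_registration_error(navigated_xyz, physical_xyz):
    """Euclidean distance (mm) between the navigated and the true
    position of a landmark such as the nasion."""
    return float(np.linalg.norm(np.asarray(navigated_xyz, dtype=float) -
                                np.asarray(physical_xyz, dtype=float)))

# Made-up checkpoint measurements (mm) for one patient: the true landmark
# position versus where the navigation system places it at each step.
checkpoints = {
    "after coregistration": ([10.0, 42.0, 7.0], [11.8, 43.5, 8.9]),
    "after draping":        ([10.0, 42.0, 7.0], [13.1, 44.9, 9.6]),
    "after craniotomy":     ([10.0, 42.0, 7.0], [13.9, 45.6, 10.1]),
}
for step, (physical, navigated) in checkpoints.items():
    print(f"{step}: {target_registration_error(navigated, physical):.1f} mm")
```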

Relevance: 30.00%

Abstract:

The goal of this roadmap paper is to summarize the state-of-the-art and identify research challenges when developing, deploying and managing self-adaptive software systems. Instead of dealing with a wide range of topics associated with the field, we focus on four essential topics of self-adaptation: design space for self-adaptive solutions, software engineering processes for self-adaptive systems, from centralized to decentralized control, and practical run-time verification & validation for self-adaptive systems. For each topic, we present an overview, suggest future directions, and focus on selected challenges. This paper complements and extends a previous roadmap on software engineering for self-adaptive systems published in 2009 covering a different set of topics, and reflecting in part on the previous paper. This roadmap is one of the many results of the Dagstuhl Seminar 10431 on Software Engineering for Self-Adaptive Systems, which took place in October 2010.

Relevance: 30.00%

Abstract:

Glacier highstands since the Last Glacial Maximum are well documented for many regions, but little is known about glacier fluctuations and lowstands during the Holocene. This is because the traces of minimum extents are difficult to identify and at many places are still ice covered, limiting the access to sample material. Here we report a new approach to assess minimal glacier extent, using a 72-m long surface-to-bedrock ice core drilled on Khukh Nuru Uul, a glacier in the Tsambagarav mountain range of the Mongolian Altai (4130 m asl, 48°39.338′N, 90°50.826′E). The small ice cap has low ice temperatures and flat bedrock topography at the drill site. This indicates minimal lateral glacier flow and thereby preserved climate signals. The upper two-thirds of the ice core contain 200 years of climate information with annual resolution, whereas the lower third is subject to strong thinning of the annual layers, with a basal ice age of approximately 6000 years before present (BP). We interpret the basal ice age as indicative of ice-free conditions in the Tsambagarav mountain range at 4100 m asl prior to 6000 years BP. This age marks the onset of the Neoglaciation and the end of the Holocene Climate Optimum. The ice-free conditions allow for adjusting the Equilibrium Line Altitude (ELA) and deriving the glacier extent in the Mongolian Altai during the Holocene Climate Optimum. Based on the ELA shift, we conclude that most of the glaciers are not remnants of the Last Glacial Maximum but were formed during the second part of the Holocene. The ice-core-derived accumulation reconstruction suggests important changes in the precipitation pattern over the last 6000 years. During the formation of the glacier, conditions more humid than at present prevailed, followed by a long dry period from 5000 years BP until 250 years ago. Present conditions are more humid than during the past millennia. This is consistent with the precipitation evolution derived from lake sediment studies in the Altai.