908 results for Large detector-systems performance


Relevance: 100.00%

Abstract:

In this paper, a new algebraic-graph method for the identification of islanding in power system grids is proposed. The proposed method identifies all possible cases of islanding due to the loss of a piece of equipment by means of a factorization of the bus-branch incidence matrix. The main features of this new method include: (i) simple implementation, (ii) high speed, (iii) real-time adaptability, (iv) identification of all islanding cases and (v) identification of the buses that compose each island in case of island formation. The method was successfully tested on large-scale systems such as the reduced south Brazilian system (45 buses/72 branches) and the south-southeast Brazilian system (810 buses/1340 branches). (C) 2011 Elsevier Ltd. All rights reserved.
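Topologically, the islands that form after an equipment outage are the connected components of the bus-branch graph. The sketch below illustrates that idea with a union-find traversal rather than the paper's incidence-matrix factorization; the 4-bus system is a made-up example.

```python
# Hypothetical sketch: detect islands after a branch outage by finding the
# connected components of the bus-branch graph. The paper's method works on a
# factorization of the bus-branch incidence matrix; the union-find traversal
# below recovers the same island-membership information.

def find_islands(n_buses, branches, outage=None):
    """Return the sets of buses forming each island after removing `outage`."""
    parent = list(range(n_buses))

    def root(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for br in branches:
        if br == outage:
            continue  # simulate the loss of this piece of equipment
        a, b = br
        parent[root(a)] = root(b)

    islands = {}
    for bus in range(n_buses):
        islands.setdefault(root(bus), set()).add(bus)
    return sorted(islands.values(), key=min)

# 4-bus example: losing branch (1, 2) splits the grid into two islands.
branches = [(0, 1), (1, 2), (2, 3)]
print(find_islands(4, branches, outage=(1, 2)))  # [{0, 1}, {2, 3}]
```

The return value also identifies which buses compose each island, matching feature (v) of the abstract.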

Relevance: 100.00%

Abstract:

The aim of the work presented here is the characterization of the structure and dynamics of different types of supramolecular systems by advanced NMR spectroscopy. One of the characteristic features of NMR spectroscopy is its high selectivity. It is therefore desirable to exploit this technique for studying the structure and dynamics of large supramolecular systems without isotopic enrichment. The observed resonance frequencies are not only isotope-specific but are also influenced by local fields, in particular by the distribution of electron density around the investigated nucleus. Barbituric acids are well known for forming strongly hydrogen-bonded complexes with a variety of adenine derivatives. The prototropic tautomerism of this material facilitates an adjustment to complementary bases containing a DDA (D = hydrogen bond donor site, A = hydrogen bond acceptor site) or ADA sequence, thereby yielding strongly hydrogen-bonded complexes. In this contribution, the solid-state structures of the enolizable chromophore 1-n-butyl-5-(4-nitrophenyl)-barbituric acid, which features adjustable hydrogen-bonding properties, and its molecular assemblies with three bases of different strength (proton sponge, the adenine mimetic 2,6-diaminopyridine (DAP), and 2,6-diacetamidopyridine (DAC)) are studied. Diffusion NMR spectroscopy gives information about such interactions and has become the method of choice for measuring the diffusion coefficient, which reflects the effective size and shape of a molecular species. In this work, supramolecular aggregates in the solution state are investigated by means of DOSY NMR techniques. The underlying principles of the DOSY NMR experiment are discussed briefly, and two applications demonstrating the potential of this method are presented in detail. Calix[n]arenes have gained a rather prominent position, both as host materials and as platforms for designing specific receptors.
In this respect, several different capsular contents of tetraurea calix[4]arenes (benzene, benzene-d6, 1-fluorobenzene, 1-fluorobenzene-d5, 1,4-difluorobenzene, and cobaltocenium) are studied by solid-state NMR spectroscopy. In the solid state, the study of the interaction between tetraurea calix[4]arenes and guests is simplified by the fact that the guest molecule remains complexed and positioned within the cavity, thus allowing a more direct investigation of the host-guest interactions.
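The link between a DOSY-measured diffusion coefficient and the effective size of a species is commonly made through the Stokes-Einstein relation. A minimal sketch, with illustrative values that are not taken from the study:

```python
import math

# Illustrative sketch: converting a DOSY-measured diffusion coefficient into a
# hydrodynamic radius via the Stokes-Einstein relation, r_H = kT / (6*pi*eta*D).
# The relation assumes a spherical species; the numbers below are generic
# examples, not values from this work.

K_B = 1.380649e-23  # Boltzmann constant, J/K

def hydrodynamic_radius(D, T=298.15, eta=0.89e-3):
    """r_H in metres for diffusion coefficient D (m^2/s), viscosity eta (Pa*s)."""
    return K_B * T / (6 * math.pi * eta * D)

# A small molecule diffusing at 1.0e-9 m^2/s in water at 25 C:
r = hydrodynamic_radius(1.0e-9)
print(f"{r * 1e9:.2f} nm")  # roughly 0.25 nm
```

A supramolecular aggregate diffuses more slowly than its free components, so a drop in the measured D signals a larger effective size, which is how DOSY reports on complex formation.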

Relevance: 100.00%

Abstract:

OBJECTIVES: To analyse the frequency of, and identify risk factors for, patient-reported medical errors in Switzerland. The joint effect of risk factors on error-reporting probability was modelled for hypothetical patients. METHODS: A representative population sample of Swiss citizens (n = 1306) was surveyed as part of the Commonwealth Fund's 2010 International Survey of the General Public's Views of their Health Care System's Performance in Eleven Countries. Data on personal background, utilisation of health care, coordination-of-care problems and reported errors were assessed. Logistic regression analysis was conducted to identify risk factors for patients' reports of medical mistakes and medication errors. RESULTS: 11.4% of participants reported at least one error in their care in the previous two years (8% medical errors, 5.3% medication errors). Poor care coordination experiences were frequent: 7.8% reported that test results or medical records were not available, 17.2% received conflicting information from care providers, and 11.5% reported that tests were ordered although they had already been done. Age (OR = 0.98, p = 0.014), poor health (OR = 2.95, p = 0.007), utilisation of emergency care (OR = 2.45, p = 0.003), inpatient stay (OR = 2.31, p = 0.010) and poor care coordination (OR = 5.43, p < 0.001) are important predictors of error reporting. For high utilisers of care who combine multiple risk factors, the predicted probability of reporting an error rises to p = 0.8. CONCLUSIONS: Patient safety remains a major challenge for the Swiss health care system. Beyond the associated health-related and economic burden, the widespread experience of medical error in some subpopulations also has the potential to erode trust in the health care system as a whole.
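How the reported odds ratios combine into a high predicted probability for high utilisers can be illustrated with a small sketch: odds ratios from a logistic regression multiply on the odds scale. The baseline probability (the model intercept) is not reported in the abstract, so the value used below is a hypothetical assumption chosen purely for illustration.

```python
# Sketch of how odds ratios from a logistic regression combine multiplicatively
# on the odds scale. The ORs are taken from the abstract; the baseline
# probability (the model intercept) is NOT reported there, so the 0.04 used
# below is a hypothetical value chosen for illustration only.

def joint_probability(p_baseline, odds_ratios):
    """Predicted probability for a patient carrying all listed risk factors."""
    odds = p_baseline / (1 - p_baseline)
    for or_ in odds_ratios:
        odds *= or_
    return odds / (1 + odds)

ors = [2.95, 2.45, 2.31, 5.43]  # poor health, emergency care, inpatient stay,
                                # poor care coordination (from the abstract)
p = joint_probability(0.04, ors)  # 0.04 baseline is an assumption
print(f"{p:.2f}")  # around 0.8 for this hypothetical baseline
```

The point of the sketch is the mechanics: each additional risk factor scales the odds, so a patient who unifies several factors quickly reaches a high predicted reporting probability.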

Relevance: 100.00%

Abstract:

Understanding large software systems is a challenging task, and many approaches have been developed to support it. Often, the results of these approaches categorize existing entities into new groups or associate them with mutually exclusive properties. In this paper we present the Distribution Map as a generic technique to visualize and analyze this type of result. Our technique is based on the notion of focus, which shows whether a property is well-encapsulated or cross-cutting, and the notion of spread, which shows whether the property is present in several parts of the system. We present a basic visualization and complement it with measurements that quantify focus and spread. To validate our technique we show evidence of applying it to the result sets of different analysis approaches. In conclusion, we propose that the Distribution Map technique should belong to any reverse engineering toolkit.
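The two notions can be sketched in code. The formulas below are a simplified, plausible quantification of spread and focus, not necessarily the exact measurements defined in the paper.

```python
# Hedged sketch of the two notions from the abstract: "spread" (in how many
# parts of the system a property occurs) and "focus" (how well-encapsulated it
# is). These formulas are a simplified plausible quantification, not
# necessarily the paper's exact definitions.

def spread(parts, prop):
    """Number of parts that contain at least one entity with the property."""
    return sum(1 for part in parts if part & prop)

def focus(parts, prop):
    """Weighted average, over touched parts, of the fraction of the part that
    carries the property: 1.0 means perfectly encapsulated."""
    touched = [part for part in parts if part & prop]
    total = sum(len(part & prop) for part in touched)
    return sum((len(part & prop) / len(part)) * (len(part & prop) / total)
               for part in touched)

# A property confined to one package is focused; one smeared over every
# package is cross-cutting.
packages = [{"a", "b", "c"}, {"d", "e"}, {"f", "g", "h"}]
encapsulated = {"a", "b", "c"}
crosscutting = {"a", "d", "f"}
print(spread(packages, encapsulated), focus(packages, encapsulated))  # 1 1.0
print(spread(packages, crosscutting))  # 3
```

A Distribution Map then colors each entity by property and lays entities out by part, so high-focus properties appear as solid blocks and cross-cutting ones as scattered dots.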

Relevance: 100.00%

Abstract:

It is an important and difficult challenge to protect modern interconnected power systems from blackouts. Applying advanced power system protection techniques and increasing power system stability are ways to improve the reliability and security of power systems. Phasor-domain software packages such as the Power System Simulator for Engineers (PSS/E) can be used to study large power systems but cannot be used for transient analysis. In order to observe both power system stability and the transient behavior of the system during disturbances, modeling has to be done in the time domain. This work focuses on the modeling of power systems and various control systems in the Alternative Transients Program (ATP). ATP is time-domain power system modeling software in which all power system components can be modeled in detail. Models are implemented with attention to component representation and parameters. The synchronous machine model includes the saturation characteristics and a control interface. The Transient Analysis of Control Systems (TACS) feature is used to model the excitation control system, the power system stabilizer and the turbine governor system of the synchronous machine. Several base cases of a single-machine system are modeled and benchmarked against PSS/E. A two-area system is modeled and inter-area and intra-area oscillations are observed. The two-area system is reduced to a two-machine system using reduced dynamic equivalencing. The original and the reduced systems are benchmarked against PSS/E. This work also includes the simulation of single-pole tripping using one of the base case models. The advantages of single-pole tripping and a comparison of system behavior against three-pole tripping are studied. Results indicate that the built-in control system models in PSS/E can be effectively reproduced in ATP. The benchmarked models correctly simulate the power system dynamics.
The successful implementation of a dynamically reduced system in ATP shows promise for studying a small sub-system of a large system without losing the dynamic behaviors. Other aspects such as relaying can be investigated using the benchmarked models. It is expected that this work will provide guidance in modeling different control systems for the synchronous machine and in representing dynamic equivalents of large power systems.
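The time-domain machine dynamics that such models resolve can be illustrated with a minimal sketch: the classical swing equation for a single machine against an infinite bus, with a temporary fault applied. All parameters are illustrative and not taken from the benchmarked models.

```python
import math

# Minimal time-domain sketch of the kind of dynamics resolved by tools like
# ATP: the classical swing equation for one machine against an infinite bus,
#   M * delta'' = Pm - Pe - D * delta',  with  Pe = Pmax * sin(delta),
# integrated with semi-implicit Euler. All parameters are illustrative.

def simulate_swing(Pm=0.8, Pmax=1.8, M=0.1, D=0.1, fault=(0.1, 0.2),
                   dt=1e-3, t_end=5.0):
    """Return the rotor-angle trace; during the `fault` window the machine
    transfers no electrical power."""
    delta = math.asin(Pm / Pmax)  # pre-fault equilibrium angle
    omega = 0.0                   # rotor speed deviation
    trace = []
    t = 0.0
    while t < t_end:
        pe = 0.0 if fault[0] <= t < fault[1] else Pmax * math.sin(delta)
        omega += (Pm - pe - D * omega) / M * dt
        delta += omega * dt
        trace.append(delta)
        t += dt
    return trace

trace = simulate_swing()
# The rotor accelerates during the fault, swings, and resettles (stable case).
print(round(max(trace), 3), round(trace[-1], 3))
```

A phasor-domain tool would only report the quasi-steady operating points; resolving the swing itself is exactly why the time-domain modeling described above is needed.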

Relevance: 100.00%

Abstract:

Navigating large software systems is difficult, as the various artifacts are distributed in a huge space, while the relationships between different artifacts often remain hidden and obscure. As a consequence, developers using a modern integrated development environment (IDE) are forced to open views on numerous source artifacts to reveal these hidden relationships, leading to a crowded workspace with many open windows or tabs. Developers often lose the overview in such a cluttered workspace, as IDEs provide little support for getting rid of unused windows. AutumnLeaves automatically selects windows that are unlikely to be used in the future and closes or grays them out, while important ones are displayed more prominently. This reduces the number of windows open at a time and adds structure to the developer's workspace. We validate AutumnLeaves with a benchmark evaluation using recorded navigation data from various developers to determine the prediction quality of the employed algorithms.
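The idea can be sketched as a ranking problem over navigation events. The recency/frequency scoring below is a hypothetical stand-in; the tool's actual prediction algorithms may differ.

```python
# Hypothetical sketch of the idea behind AutumnLeaves: rank open windows by a
# simple recency/frequency score and mark the lowest-ranked ones for closing
# or graying out. The exponential-decay scoring is an assumption; the tool's
# actual algorithms may differ.

def rank_windows(events, now, half_life=300.0):
    """events: list of (window, timestamp) navigation records.
    Returns windows ordered from most to least likely to be used again."""
    scores = {}
    for window, ts in events:
        age = now - ts
        scores[window] = scores.get(window, 0.0) + 0.5 ** (age / half_life)
    return sorted(scores, key=scores.get, reverse=True)

def windows_to_close(events, now, keep=2):
    ranked = rank_windows(events, now)
    return ranked[keep:]  # everything past the `keep` most relevant windows

events = [("Parser.java", 100), ("Lexer.java", 500), ("Parser.java", 550),
          ("README", 10)]
print(rank_windows(events, now=600))
print(windows_to_close(events, now=600, keep=2))  # ['README']
```

Recorded navigation data of the kind used in the paper's benchmark would serve as the `events` stream, and prediction quality can be measured by checking whether closed windows are reopened later.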

Relevance: 100.00%

Abstract:

The high complexity of cellular intralogistics systems and their control architecture suggests the use of modern simulation and visualization techniques, so that statements about the performance and future viability of a planned system can be made in advance. In this work, a concept for a simulation system for the VR-based verification of the control of cellular intralogistics systems is presented. The creation of a simulation model for a real, existing facility is described, and an overview of the components of the simulation is given, in particular the connection to the controller of the real agent-based system.

Relevance: 100.00%

Abstract:

An experiment was conducted using Angus-cross steer calves of three frame sizes (small, medium, and large) to compare the performance of two different growing/finishing feeding programs. Half of the cattle in each frame-size group were fed a high-energy ration through the growing period, similar to calves going directly into the feedlot. The other half were fed a low-energy ration, similar to a backgrounding diet, for a period prior to the finishing phase. All cattle were fed a high-energy ration through the finishing period. The data showed that the cattle fed the low-energy growing diet experienced some compensatory gains, as shown by ultrasound backfat measurements and average daily gains, coupled with intake increases greater than those seen in the high-energy treatment. Carcass data and overall performance data showed no ill effects from the low-energy growing ration.

Relevance: 100.00%

Abstract:

This study uses the widths, the spacing and the grain-size pattern of Oligo/Miocene alluvial fan conglomerates in the central segment of the Swiss Alpine foreland to reconstruct the topographic development of the Alps. These data are analysed with models of longitudinal stream profile development to propose that the Alpine topography evolved from an early transient state, where streams adjusted to rock uplift by headward retreat, to a mature phase, where any changes in rock uplift were accommodated by vertical incision. The first stage comprises the time interval between ca 31 Ma and 22 Ma, when the Alpine streams deposited many small fans with a lateral spacing of <30 km in the north Alpine foreland. As the range evolved, the streams joined and the fans coalesced into a few large depositional systems with a lateral spacing of ca 80 to 100 km at 22 Ma. The models used here suggest that the overall elevation of the Alps increased rapidly, within <5 Myr. The variability in pebble size increased due either to variations in sediment supply or enhanced orographic effects, or, more plausibly, due to a change towards a stormier palaeoclimate. By 22 Ma, only two large rivers carried material into the foreland fans, suggesting that the major Alpine streams had established themselves. This second phase of a stable drainage network was maintained until ca 5 Ma, when the uplift and erosion of the Molasse started and streams were redirected both in the Alps and in the foreland. This study illustrates that the sedimentological archives of foreland basins can be used to reconstruct the chronology of the topographic development of mountain belts. It is suggested that the finite elevation of mountainous landscapes is reached early during orogeny and can be maintained for millions of years, provided that erosion is efficient.
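Models of longitudinal stream profile development of the kind invoked above are often built on the detachment-limited stream-power law. The sketch below is a generic textbook-style illustration, not the authors' model, and all parameter values are assumptions.

```python
# Illustrative sketch (not the authors' model): the detachment-limited
# stream-power law, dz/dt = U - K * A^m * S^n, is a standard model of
# longitudinal stream profile development. All parameter values below are
# generic assumptions, not calibrated to the Alps.

def evolve_profile(z, area, U=1e-3, K=1e-5, m=0.5, n=1.0,
                   dx=1000.0, dt=100.0, steps=2000):
    """z: node elevations from the outlet (index 0, held at baselevel) to the
    headwater; area: drainage area per node (m^2). Returns the evolved profile."""
    z = list(z)
    for _ in range(steps):
        new_z = [z[0]]  # baselevel fixed at the outlet
        for i in range(1, len(z)):
            slope = max((z[i] - z[i - 1]) / dx, 0.0)
            erosion = K * area[i] ** m * slope ** n
            new_z.append(z[i] + (U - erosion) * dt)
        z = new_z
    return z

# At steady state the slope satisfies U = K * A^m * S^n, so the profile is
# steeper upstream where drainage area is small (a concave-up profile).
area = [1e8, 8e7, 6e7, 4e7, 2e7, 1e7]  # drainage area decreasing upstream
profile = evolve_profile([0.0] * 6, area)
```

In such models, a transient adjusting by headward retreat and a mature phase accommodating uplift by vertical incision correspond to different stages of the profile approaching this steady-state balance.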

Relevance: 100.00%

Abstract:

Software developers often ask questions about software systems and software ecosystems that entail exploration and navigation, such as "Who uses this component?" and "Where is this feature implemented?". Software visualisation can be a great aid to understanding and exploring the answers to such questions, but visualisations require expertise to implement effectively, and they do not always scale well to large systems. We propose to automatically generate software visualisations based on software models derived from open source software corpora and from an analysis of the properties of typical developer queries and commonly used visualisations. The key challenges we see are (1) understanding how to match queries to suitable visualisations, and (2) scaling visualisations effectively to very large software systems and corpora. In the paper we motivate the idea of automatic software visualisation, enumerate the challenges and our proposals to address them, and describe some very initial results in our attempts to develop scalable visualisations of open source software corpora.

Relevance: 100.00%

Abstract:

This chapter presents an evaluation and initial testing of a meta-application (meta-app) for enhanced communication and improved interaction (e.g., appointment scheduling) between stakeholders (e.g., citizens) in cognitive cities. The underlying theoretical models as well as the paper prototype are presented to ensure the comprehensibility of the user interface. This paper prototype of the meta-app was evaluated through interviews with experts in different fields (e.g., a strategic consultant, the co-founder of a small and medium-sized enterprise in the field of online marketing, an IT project leader, and an innovation manager). The results of the evaluation show that the idea behind the meta-app has the potential to improve the living standards of citizens and to lead to a next step in the realization and maturity of the meta-app. The meta-app helps citizens manage their time and organize their personal schedules more effectively. It thus allows them more leisure time, and to take full advantage of it, ensuring a good work-life balance and enabling them to be efficient and productive during their working time.

Relevance: 100.00%

Abstract:

The new computing paradigm known as cognitive computing attempts to imitate the human capabilities of learning, problem solving, and considering things in context. To do so, an application (a cognitive system) must learn from its environment (e.g., by interacting with various interfaces). These interfaces can run the gamut from sensors to humans to databases. Accessing data through such interfaces allows the system to conduct cognitive tasks that can support humans in decision-making or problem-solving processes. Cognitive systems can be integrated into various domains (e.g., medicine or insurance). For example, a cognitive system in a city can collect data, learn from various data sources and then attempt to connect these sources to provide real-time optimization of subsystems within the city (e.g., the transportation system). In this study, we provide a methodology for integrating a cognitive system that allows data to be verbalized, making the causalities and hypotheses generated by the cognitive system more understandable to humans. We abstract a city subsystem, passenger flow for a taxi company, by applying fuzzy cognitive maps (FCMs). FCMs can be used as a mathematical tool for modeling complex systems as directed graphs with concepts (e.g., policies, events, and/or domains) as nodes and causalities as edges. As a verbalization technique we introduce the restriction-centered theory of reasoning (RCT). RCT addresses the imprecision inherent in language by introducing restrictions. Using this underlying combinatorial design, our approach can handle large data sets from complex systems and make the output understandable to humans.
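The FCM mechanics can be sketched compactly: concept activations propagate along weighted causal edges and are squashed by a sigmoid until the map settles. The concepts and weights below are hypothetical, loosely inspired by the taxi passenger-flow example; the update rule is a common FCM variant that keeps each concept's previous activation as a memory term.

```python
import math

# Minimal sketch of a fuzzy cognitive map (FCM): concepts are nodes whose
# activations are updated through weighted causal edges and squashed by a
# sigmoid. Concepts and weights are hypothetical; the self-memory update rule
# is one common FCM variant, not necessarily the one used in the study.

def sigmoid(x, lam=1.0):
    return 1.0 / (1.0 + math.exp(-lam * x))

def fcm_step(state, weights):
    """state: activation per concept; weights[i][j]: causal influence of
    concept i on concept j."""
    n = len(state)
    return [sigmoid(state[j] + sum(state[i] * weights[i][j] for i in range(n)))
            for j in range(n)]

def fcm_run(state, weights, steps=50):
    for _ in range(steps):
        state = fcm_step(state, weights)
    return state

# Concepts: 0 = rain, 1 = taxi demand, 2 = waiting time
W = [[0.0, 0.7, 0.0],   # rain increases taxi demand
     [0.0, 0.0, 0.8],   # demand increases waiting time
     [0.0, 0.0, 0.0]]
final = fcm_run([1.0, 0.0, 0.0], W)
print([round(v, 2) for v in final])
```

The settled activations are what a verbalization layer such as RCT would then translate into restriction-based statements (e.g., "waiting time is high when it rains").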

Relevance: 100.00%

Abstract:

The population is constantly growing, and thus the concept of smart and cognitive cities is becoming more important. Developed countries are aware of and working towards needed changes in city management; however, emerging countries require the optimization of their own city management as well. This chapter illustrates, based on a use case, how a city in an emerging country can progress quickly using the concept of smart and cognitive cities. Nairobi, the capital of Kenya, is chosen as the test case. More than half of the population of Nairobi lives in slums with poor sanitation, and many slum inhabitants often share a single toilet, so the proper functioning and reliable maintenance of toilets are crucial. For this purpose, an approach for processing text messages based on cognitive computing (using soft computing methods) is introduced. Slum inhabitants can inform the responsible center via text message when toilets are not functioning properly. Through cognitive computer systems, the responsible center can fix the problem quickly and efficiently by sending repair workers to the area. Focusing on the slum of Kibera, an approach that is easy for slum inhabitants to handle is presented, which can make the city more efficient, sustainable and resilient (i.e., cognitive).

Relevance: 100.00%

Abstract:

Essential biological processes are governed by organized, dynamic interactions between multiple biomolecular systems. Complexes are thus formed to enable the biological function and are disassembled as the process is completed. Examples of such processes include the translation of messenger RNA into protein by the ribosome, the folding of proteins by chaperonins, and the entry of viruses into host cells. Understanding these fundamental processes by characterizing the molecular mechanisms that enable them would allow the (better) design of therapies and drugs. Such molecular mechanisms may be revealed through the structural elucidation of the biomolecular assemblies at the core of these processes. Various experimental techniques may be applied to investigate the molecular architecture of biomolecular assemblies. High-resolution techniques, such as X-ray crystallography, may solve the atomic structure of the system, but are typically constrained to biomolecules of reduced flexibility and dimensions. In particular, X-ray crystallography requires the sample to form a three-dimensional (3D) crystal lattice, which is technically difficult, if not impossible, to obtain, especially for large, dynamic systems. Often these techniques solve the structure of the different constituent components within the assembly, but encounter difficulties when investigating the entire system. On the other hand, imaging techniques, such as cryo-electron microscopy (cryo-EM), are able to depict large systems in a near-native environment, without requiring the formation of crystals. The structures solved by cryo-EM cover a wide range of resolutions, from a very low level of detail where only the overall shape of the system is visible, to high resolutions that approach, but do not yet reach, atomic level of detail.
In this dissertation, several modeling methods are introduced to either integrate cryo-EM datasets with structural data from X-ray crystallography, or to directly interpret the cryo-EM reconstruction. These computational techniques were developed with the goal of creating an atomic model for the cryo-EM data. Low-resolution reconstructions lack the level of detail needed to permit a direct atomic interpretation, i.e. one cannot reliably locate the atoms or amino-acid residues within the structure obtained by cryo-EM. One therefore needs to consider additional information, for example structural data from other sources such as X-ray crystallography, in order to enable such a high-resolution interpretation. Modeling techniques are thus developed to integrate the structural data from the different biophysical sources; examples include the work described in manuscripts I and II of this dissertation. At intermediate and high resolution, cryo-EM reconstructions depict consistent 3D folds, such as tubular features that in general correspond to alpha-helices. Such features can be annotated and later used to build the atomic model of the system; see manuscript III. Three manuscripts are presented as part of this PhD dissertation, each introducing a computational technique that facilitates the interpretation of cryo-EM reconstructions. The first manuscript is an application paper that describes a heuristic to generate the atomic model for the protein envelope of the Rift Valley fever virus. The second manuscript introduces evolutionary tabu search strategies to enable the integration of multiple component atomic structures with the cryo-EM map of their assembly. Finally, the third manuscript develops the latter technique further and applies it to annotate consistent 3D patterns in intermediate-resolution cryo-EM reconstructions.
The first manuscript, titled "An assembly model for Rift Valley fever virus", was submitted for publication in the Journal of Molecular Biology. The cryo-EM structure of the Rift Valley fever virus was previously solved at 27 Å resolution by Dr. Freiberg and collaborators. This reconstruction shows the overall shape of the virus envelope, yet the reduced level of detail prevents a direct atomic interpretation. High-resolution structures are not yet available for the entire virus, nor for the two different component glycoproteins that form its envelope. However, homology models may be generated for these glycoproteins based on similar structures that are available at atomic resolution. The manuscript presents the steps required to identify an atomic model of the entire virus envelope, based on the low-resolution cryo-EM map of the envelope and the homology models of the two glycoproteins. Starting with the results of an exhaustive search to place the two glycoproteins, the model is built iteratively by running multiple multi-body refinements to hierarchically generate models for the different regions of the envelope. The generated atomic model is supported by prior knowledge of virus biology and contains valuable information about the molecular architecture of the system. It provides the basis for further investigations seeking to reveal the different processes in which the virus is involved, such as assembly or fusion. The second manuscript was recently published in the Journal of Structural Biology (doi:10.1016/j.jsb.2009.12.028) under the title "Evolutionary tabu search strategies for the simultaneous registration of multiple atomic structures in cryo-EM reconstructions". This manuscript introduces the evolutionary tabu search strategies applied to enable a multi-body registration. This technique is a hybrid approach that combines a genetic algorithm with a tabu search strategy to promote the proper exploration of the high-dimensional search space.
Similar to the Rift Valley fever virus, it is common that the structure of a large multi-component assembly is available at low resolution from cryo-EM, while high-resolution structures are solved for the different components but are lacking for the entire system. Evolutionary tabu search strategies enable the building of an atomic model for the entire system by considering the different components simultaneously. Such a registration indirectly introduces spatial constraints, as all components need to be placed within the assembly, enabling their proper docking in the low-resolution map of the entire assembly. Along with the method description, the manuscript covers the validation, presenting the benefit of the technique in both synthetic and experimental test cases. The approach successfully docked multiple components at resolutions up to 40 Å. The third manuscript is entitled "Evolutionary Bidirectional Expansion for the Annotation of Alpha Helices in Electron Cryo-Microscopy Reconstructions" and was submitted for publication in the Journal of Structural Biology. The modeling approach described in this manuscript applies the evolutionary tabu search strategies in combination with a bidirectional expansion to annotate secondary structure elements in intermediate-resolution cryo-EM reconstructions. In particular, secondary structure elements such as alpha helices show consistent patterns in cryo-EM data, and are visible as rod-like patterns of high density. The evolutionary tabu search strategy is applied to identify the placement of the different alpha helices, while the bidirectional expansion characterizes their length and curvature. The manuscript presents the validation of the approach at resolutions ranging between 6 and 14 Å, a level of detail where alpha helices are visible. Up to a resolution of 12 Å, the method measures sensitivities between 70 and 100% as estimated in experimental test cases, i.e.
70-100% of the alpha-helices were correctly predicted in an automatic manner in the experimental data. The three manuscripts presented in this PhD dissertation cover different computational methods for the integration and interpretation of cryo-EM reconstructions. The methods were developed in the molecular modeling software Sculptor (http://sculptor.biomachina.org) and are available to the scientific community interested in the multi-resolution modeling of cryo-EM data. The work spans a wide range of resolutions, covering multi-body refinement and registration at low resolution along with the annotation of consistent patterns at high resolution. Such methods are essential for the modeling of cryo-EM data, and may be applied in other fields where similar spatial problems are encountered, such as medical imaging.
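The scoring at the heart of such registration problems can be sketched simply: a candidate placement of a component is evaluated by the cross-correlation between the experimental map and the component's density at that position. A tabu search, as used in the dissertation, would explore placements while forbidding recently visited ones; in the toy sketch below, a tiny exhaustive 1-D search stands in for the optimizer, and all densities are made-up numbers.

```python
# Hedged sketch of placement scoring for map registration: the fit of a
# component at a given offset is measured by normalized cross-correlation
# with the experimental density. A real implementation works on 3-D maps and
# uses an optimizer (e.g. tabu search); this 1-D exhaustive search is a toy.

def cross_correlation(map_density, component, offset):
    """Normalized overlap of `component` placed at `offset` inside the map."""
    num = sum(map_density[offset + i] * component[i] for i in range(len(component)))
    a = sum(v * v for v in component) ** 0.5
    b = sum(map_density[offset + i] ** 2 for i in range(len(component))) ** 0.5
    return num / (a * b) if a and b else 0.0

def best_placement(map_density, component):
    offsets = range(len(map_density) - len(component) + 1)
    return max(offsets, key=lambda o: cross_correlation(map_density, component, o))

# A toy 1-D "map" with a density peak at index 4; the component matches it.
em_map = [0.0, 0.1, 0.2, 0.8, 1.0, 0.8, 0.2, 0.1, 0.0]
component = [0.8, 1.0, 0.8]
print(best_placement(em_map, component))  # 3 (component centred on the peak)
```

Simultaneous multi-body registration extends this by scoring all components at once, which adds the spatial constraint that placements must not collide, exactly the property the evolutionary tabu search exploits.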