20 results for Many fermion systems
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Few real software systems are built completely from scratch nowadays. Instead, systems are built iteratively and incrementally, while integrating and interacting with components from many other systems. Adaptation, reconfiguration and evolution are normal, ongoing processes throughout the lifecycle of a software system. Nevertheless, the platforms, tools and environments we use to develop software are still largely based on an outmoded model that presupposes that software systems are closed and will not significantly evolve after deployment. We claim that in order to enable effective and graceful evolution of modern software systems, we must make these systems more amenable to change by (i) providing explicit, first-class models of software artifacts, change, and history at the level of the platform, (ii) continuously analysing static and dynamic evolution to track emergent properties, and (iii) closing the gap between the domain model and the developers' view of the evolving system. We outline our vision of dynamic, evolving software systems and identify the research challenges to realizing this vision.
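To illustrate what a first-class model of change and history (point (i) above) might look like, the following Python sketch reifies individual changes as objects that can be stored and queried. The class and field names are hypothetical illustrations, not taken from the paper:

    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import List

    @dataclass
    class Change:
        """A single, first-class change to a software artifact."""
        artifact: str      # e.g. fully qualified class or method name
        operation: str     # e.g. "add-method", "rename", "modify-body"
        timestamp: datetime
        author: str

    @dataclass
    class History:
        """Explicit model of a system's evolution as a sequence of changes."""
        changes: List[Change] = field(default_factory=list)

        def record(self, change: Change) -> None:
            self.changes.append(change)

        def changes_to(self, artifact: str) -> List[Change]:
            """Query the evolution of a single artifact."""
            return [c for c in self.changes if c.artifact == artifact]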
Abstract:
The N-H···π hydrogen bond is an important intermolecular interaction in many biological systems. We have investigated the infrared (IR) and ultraviolet (UV) spectra of the supersonic-jet cooled complex of pyrrole with benzene and benzene-d₆ (Pyr·Bz, Pyr·Bz-d₆). DFT-D density functional, SCS-MP2 and SCS-CC2 calculations predict a T-shaped and (almost) Cₛ-symmetric structure with an N-H···π hydrogen bond to the benzene ring. The pyrrole is tipped by ω(S₀) = ±13° relative to the surface normal of Bz. The N···ring distance is 3.13 Å. In the S₁ excited state, SCS-CC2 calculations predict an increased tipping angle ω(S₁) = ±21°. The IR depletion spectra support the T-shaped geometry: the NH stretch is red-shifted by -59 cm⁻¹, relative to the "free" NH stretch of pyrrole at 3531 cm⁻¹, indicating a moderately strong N-H···π interaction. The interaction is weaker than in the (Pyr)₂ dimer, where the NH donor shift is -87 cm⁻¹ [Dauster et al., Phys. Chem. Chem. Phys., 2008, 10, 2827]. The IR C-H stretch frequencies and intensities of the Bz subunit are very similar to those of the acceptor in the (Bz)₂ dimer, confirming that Bz acts as the acceptor. While the S₁ ← S₀ electronic origin of Bz is forbidden and is not observable in the gas phase, the UV spectrum of Pyr·Bz in the same region exhibits a weak 0₀⁰ band that is red-shifted by 58 cm⁻¹ relative to that of Bz (38 086 cm⁻¹). The origin appears due to symmetry breaking of the π-electron system of Bz by the asymmetric pyrrole NH···π hydrogen bond. This contrasts with (Bz)₂, which does not exhibit a 0₀⁰ band. The Bz moiety in Pyr·Bz exhibits a 6a₀¹ band at 0₀⁰ + 518 cm⁻¹ that is about 20× more intense than the origin band. The symmetry breaking by the NH···π hydrogen bond splits the degeneracy of the ν₆(e₂g) vibration, giving rise to 6a' and 6b' sub-bands that are spaced by ~6 cm⁻¹. Both the 0₀⁰ and 6₀¹ bands of Pyr·Bz carry a progression in the low-frequency (10 cm⁻¹) excited-state tipping vibration ω', in agreement with the change of the ω tipping angle predicted by SCS-MP2 and SCS-CC2 calculations.
Abstract:
Antimicrobial peptides are intrinsic to the innate immune system in many organ systems, but little is known about their expression in the central nervous system. We examined cerebrospinal fluid (CSF) and serum from patients with active bacterial meningitis to assess antimicrobial peptides and possible bactericidal properties of the CSF. We found antimicrobial peptides (human cathelicidin LL-37) in the CSF of patients with bacterial meningitis but not in control CSF. We next characterized the expression, secretion, and bactericidal properties of rat cathelin-related antimicrobial peptide, the homologue of the human LL-37, in rat astrocytes and microglia after incubation with different bacterial components. Using real-time polymerase chain reaction and Western blotting, we determined that supernatants from both astrocytes and microglia incubated with bacterial component supernatants had antimicrobial activity. The expression of rat cathelin-related antimicrobial peptide in rat glial cells involved different signal transduction pathways and was induced by the inflammatory cytokines interleukin-1β and tumor necrosis factor. In an experimental model of meningitis, infant rats were intracisternally infected with Streptococcus pneumoniae, and rat cathelin-related antimicrobial peptide was localized in glia, choroid plexus, and ependymal cells by immunohistochemistry. Together, these results suggest that cathelicidins produced by glia and other cells play an important part in the innate immune response against pathogens in central nervous system bacterial infections.
Abstract:
While the analysis and interpretation of structural epileptogenic lesions is an essential task for the neuroradiologist in clinical practice, a substantial body of epilepsy research has shown that focal lesions influence brain areas beyond the epileptogenic lesion, across ensembles of functionally and anatomically connected brain areas. In this review article, we aim to provide an overview of altered network composition in epilepsy, as measured with current advanced multimodal noninvasive neuroimaging techniques that characterize the initiation and spread of epileptic activity in the brain. We focus on resting-state functional magnetic resonance imaging (MRI) and simultaneous electroencephalography/fMRI, and contrast the findings in idiopathic generalized versus focal epilepsies. These data indicate that circumscribed epileptogenic lesions can have extended effects on many brain systems. Although epileptic seizures may involve various brain areas, seizure activity does not spread diffusely throughout the brain but propagates along specific anatomic pathways that characterize the underlying epilepsy syndrome. Such a functionally oriented approach may help to better understand a range of clinical phenomena such as the type of cognitive impairment, the development of pharmacoresistance, the propagation pathways of seizures, or the success of epilepsy surgery.
Abstract:
Intussusceptive capillary growth represents a new principle for microvascular growth, as described in the lungs of growing rats. According to this concept, the capillary network expands by the formation of slender transcapillary tissue pillars, which give rise to new vascular meshes. The process was first observed in Mercox casts of the lung microvasculature, which revealed the existence of multiple tiny holes with diameters around 1.5 microns. Consecutive transmission electron microscopic investigation of serial sections demonstrated that the holes corresponded to slender tissue pillars (Burri and Tarek, 1990). The corrosion cast technique thus appears to be an adequate screening method for intussusceptive growth. In the present investigation, Mercox casts of various vascular systems, namely those of the eye, submandibular gland, heart, liver, stomach, small and large intestine, trachea, kidney, uterus and ovary, were prepared from rats aged between 4 and 9 weeks in order to screen them for the existence of the typical tiny holes representing tissue pillars. In all organs investigated, these structures were observed in various locations to a variable degree. They were mainly encountered within dilated vascular segments or at triple or quadruple branching points of the circulation. These pillars could be detected even in capillary networks with a three-dimensional arrangement. Intussusception thus appears to be a principle of growth common to many vascular systems.
Abstract:
Background. No comprehensive systematic review has been published since 1998 about the frequency with which cancer patients use complementary and alternative medicine (CAM). Methods. MEDLINE, AMED, and Embase databases were searched for surveys published until January 2009. Surveys conducted in Australia, Canada, Europe, New Zealand, and the United States with at least 100 adult cancer patients were included. Detailed information on methods and results was independently extracted by 2 reviewers. Methodological quality was assessed using a criteria list developed according to the STROBE guideline. Exploratory random-effects meta-analysis and meta-regression were applied. Results. A total of 152 studies from 18 countries (>65 000 cancer patients) were included. Heterogeneity of CAM use was high and to some extent explained by differences in survey methods. The combined prevalence for “current use” of CAM across all studies was 40%. The highest prevalence was in the United States and the lowest in Italy and the Netherlands. Meta-analysis suggested an increase in CAM use from an estimated 25% in the 1970s and 1980s to more than 32% in the 1990s and to 49% after 2000. Conclusions. The overall prevalence of CAM use found was lower than often claimed. However, there was some evidence that use has increased considerably over the past years. Therefore, health care systems ought to implement clear strategies for dealing with this. To improve the validity and reporting of future surveys, the authors suggest criteria for methodological quality that should be fulfilled and reporting standards that should be required.
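For readers unfamiliar with the pooling step, the sketch below shows one common way to combine prevalence estimates under a random-effects model (DerSimonian-Laird, applied to logit-transformed proportions). It is a generic illustration with invented example numbers, not the authors' actual analysis code:

    import math

    def pool_prevalence(studies):
        """Random-effects (DerSimonian-Laird) pooling of prevalences.
        studies: list of (events, sample_size) tuples."""
        # logit-transform each prevalence and compute its variance
        y, v = [], []
        for events, n in studies:
            p = events / n
            y.append(math.log(p / (1 - p)))
            v.append(1 / events + 1 / (n - events))
        k = len(y)
        w = [1 / vi for vi in v]                        # fixed-effect weights
        y_fe = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
        q = sum(wi * (yi - y_fe) ** 2 for wi, yi in zip(w, y))
        # between-study variance, truncated at zero
        c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
        tau2 = max(0.0, (q - (k - 1)) / c)
        w_re = [1 / (vi + tau2) for vi in v]            # random-effects weights
        y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
        return 1 / (1 + math.exp(-y_re))                # back-transform to a proportion

    # invented example data: (CAM users, patients surveyed) per study
    print(pool_prevalence([(120, 300), (90, 250), (200, 640)]))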
Abstract:
OBJECTIVE: To describe the electronic medical databases used in antiretroviral therapy (ART) programmes in lower-income countries and assess the measures such programmes employ to maintain and improve data quality and reduce the loss of patients to follow-up. METHODS: In 15 countries of Africa, South America and Asia, a survey was conducted from December 2006 to February 2007 on the use of electronic medical record systems in ART programmes. Patients enrolled in the sites at the time of the survey but not seen during the previous 12 months were considered lost to follow-up. The quality of the data was assessed by computing the percentage of missing key variables (age, sex, clinical stage of HIV infection, CD4+ lymphocyte count and year of ART initiation). Associations between site characteristics (such as number of staff members dedicated to data management), measures to reduce loss to follow-up (such as the presence of staff dedicated to tracing patients) and data quality and loss to follow-up were analysed using multivariate logit models. FINDINGS: Twenty-one sites that together provided ART to 50 060 patients were included (median number of patients per site: 1000; interquartile range, IQR: 72-19 320). Eighteen sites (86%) used an electronic database for medical record-keeping; 15 (83%) such sites relied on software intended for personal or small business use. The median percentage of missing data for key variables per site was 10.9% (IQR: 2.0-18.9%) and declined with training in data management (odds ratio, OR: 0.58; 95% confidence interval, CI: 0.37-0.90) and weekly hours spent by a clerk on the database per 100 patients on ART (OR: 0.95; 95% CI: 0.90-0.99). About 10 weekly hours per 100 patients on ART were required to reduce missing data for key variables to below 10%. The median percentage of patients lost to follow-up 1 year after starting ART was 8.5% (IQR: 4.2-19.7%). Strategies to reduce loss to follow-up included outreach teams, community-based organizations and checking death registry data. Implementation of all three strategies substantially reduced losses to follow-up (OR: 0.17; 95% CI: 0.15-0.20). CONCLUSION: The quality of the data collected and the retention of patients in ART treatment programmes are unsatisfactory for many sites involved in the scale-up of ART in resource-limited settings, mainly because of insufficient staff trained to manage data and trace patients lost to follow-up.
Abstract:
Many reverse engineering approaches have been developed to analyze software systems written in different languages like C/C++ or Java. These approaches typically rely on a meta-model that is either specific to the language at hand or language-independent (e.g. UML). However, one language that has hardly been addressed is Lisp. While at first sight it can be accommodated by current language-independent meta-models, Lisp has some unique features (e.g. macros, CLOS entities) that are crucial for reverse engineering Lisp systems. In this paper we propose a suite of new visualizations that reveal the special traits of the Lisp language and thus help in understanding complex Lisp systems. To validate our approach, we apply these visualizations to several large Lisp case studies and summarize our experience as a series of recurring visual patterns that we have detected.
Abstract:
Software metrics offer us the promise of distilling useful information from vast amounts of software in order to track development progress, to gain insights into the nature of the software, and to identify potential problems. Unfortunately, however, many software metrics exhibit highly skewed, non-Gaussian distributions. As a consequence, usual ways of interpreting these metrics --- for example, in terms of "average" values --- can be highly misleading. Many metrics, it turns out, are distributed like wealth --- with high concentrations of values in selected locations. We propose to analyze software metrics using the Gini coefficient, a higher-order statistic widely used in economics to study the distribution of wealth. Our approach allows us not only to observe changes in software systems efficiently, but also to assess project risks and monitor the development process itself. We apply the Gini coefficient to numerous metrics over a range of software projects, and we show that many metrics not only display remarkably high Gini values, but that these values are remarkably consistent as a project evolves over time.
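As a concrete illustration of the proposed measure, the Gini coefficient of a set of non-negative metric values (for instance, lines of code per class) can be computed as follows. This is the standard textbook formulation, not the authors' tooling, and the example data are invented:

    def gini(values):
        """Gini coefficient of non-negative metric values
        (0 = perfectly even distribution, near 1 = highly concentrated)."""
        xs = sorted(values)
        n = len(xs)
        total = sum(xs)
        if n == 0 or total == 0:
            return 0.0
        # standard formula based on the cumulative share of sorted values
        cum = sum((i + 1) * x for i, x in enumerate(xs))
        return (2 * cum) / (n * total) - (n + 1) / n

    # e.g. lines of code per class in a hypothetical system
    print(gini([10, 12, 15, 20, 800]))   # high value: size is concentrated in one class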
Abstract:
BACKGROUND Neuronavigation has become an intrinsic part of preoperative surgical planning and surgical procedures. However, many surgeons have the impression that accuracy decreases during surgery. OBJECTIVE To quantify the decrease of neuronavigation accuracy and identify possible origins, we performed a retrospective quality-control study. METHODS Between April and July 2011, a neuronavigation system was used in conjunction with a specially prepared head holder in 55 consecutive patients. Two different neuronavigation systems were investigated separately. Coregistration was performed with laser-surface matching, paired-point matching using skin fiducials, anatomic landmarks, or bone screws. The initial target registration error (TRE1) was measured using the nasion as the anatomic landmark. Then, after draping and during surgery, the accuracy was checked at predefined procedural landmark steps (Mayfield measurement point and bone measurement point), and deviations were recorded. RESULTS After initial coregistration, the mean (SD) TRE1 was 2.9 (3.3) mm. The TRE1 was significantly dependent on patient positioning, lesion localization, type of neuroimaging, and coregistration method. The following procedures decreased neuronavigation accuracy: attachment of surgical drapes (ΔTRE2 = 2.7 [1.7] mm), skin retractor attachment (ΔTRE3 = 1.2 [1.0] mm), craniotomy (ΔTRE3 = 1.0 [1.4] mm), and Halo ring installation (ΔTRE3 = 0.5 [0.5] mm). Surgery duration was also a significant factor; the overall ΔTRE was 1.3 [1.5] mm after 30 minutes and increased to 4.4 [1.8] mm after 5.5 hours of surgery. CONCLUSION After registration, there is an ongoing loss of neuronavigation accuracy. The major factors were draping, attachment of skin retractors, and duration of surgery. Surgeons should be aware of this silent loss of accuracy when using neuronavigation.
Abstract:
Temporal data are a core element of a reservation. In this paper we formulate 10 requirements and 14 sub-requirements for handling temporal data in online hotel reservation systems (OHRS) from a usability viewpoint. We test the fulfillment of these requirements for city and resort hotels in Austria and Switzerland. Some of the requirements are widely met; however, many requirements are fulfilled only by a surprisingly small number of hotels. In particular, numerous systems offer options for selecting dates that lead to error messages in the next step. A few screenshots illustrate flaws of the systems. We also draw conclusions on the state of applying software engineering principles in the development of Web pages.
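One recurring flaw mentioned above, date selections that only fail in a later step, can be avoided by validating temporal data as soon as it is entered. The following Python sketch is a hypothetical illustration of such a check, not code from any of the surveyed systems:

    from datetime import date
    from typing import List, Optional

    def validate_stay(check_in: date, check_out: date,
                      today: Optional[date] = None) -> List[str]:
        """Return a list of problems with a requested stay; empty if the dates are valid."""
        today = today or date.today()
        problems = []
        if check_in < today:
            problems.append("check-in date lies in the past")
        if check_out <= check_in:
            problems.append("check-out date must be after the check-in date")
        return problems

    # reject impossible selections immediately instead of in the next step
    print(validate_stay(date(2030, 7, 10), date(2030, 7, 9)))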
Abstract:
The goal of this roadmap paper is to summarize the state-of-the-art and identify research challenges when developing, deploying and managing self-adaptive software systems. Instead of dealing with a wide range of topics associated with the field, we focus on four essential topics of self-adaptation: design space for self-adaptive solutions, software engineering processes for self-adaptive systems, from centralized to decentralized control, and practical run-time verification & validation for self-adaptive systems. For each topic, we present an overview, suggest future directions, and focus on selected challenges. This paper complements and extends a previous roadmap on software engineering for self-adaptive systems published in 2009 covering a different set of topics, and reflecting in part on the previous paper. This roadmap is one of the many results of the Dagstuhl Seminar 10431 on Software Engineering for Self-Adaptive Systems, which took place in October 2010.
Abstract:
Abelian and non-Abelian gauge theories are of central importance in many areas of physics. In condensed matter physics, Abelian U(1) lattice gauge theories arise in the description of certain quantum spin liquids. In quantum information theory, Kitaev’s toric code is a Z(2) lattice gauge theory. In particle physics, Quantum Chromodynamics (QCD), the non-Abelian SU(3) gauge theory of the strong interactions between quarks and gluons, is nonperturbatively regularized on a lattice. Quantum link models extend the concept of lattice gauge theories beyond the Wilson formulation, and are well suited for both digital and analog quantum simulation using ultracold atomic gases in optical lattices. Since quantum simulators do not suffer from the notorious sign problem, they open the door to studies of the real-time evolution of strongly coupled quantum systems, which are impossible with classical simulation methods. A plethora of interesting lattice gauge theories suggests itself for quantum simulation, which should allow us to address very challenging problems, ranging from confinement and deconfinement, or chiral symmetry breaking and its restoration at finite baryon density, to color superconductivity and the real-time evolution of heavy-ion collisions, first in simpler model gauge theories and ultimately in QCD.
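For concreteness, the Z(2) example mentioned above can be written down explicitly. In the standard textbook formulation (not reproduced from this paper), the toric code Hamiltonian consists of commuting vertex operators A_v and plaquette operators B_p built from Pauli matrices acting on spin-1/2 degrees of freedom on the links l of a square lattice:

    H \;=\; -\sum_{v} A_v \;-\; \sum_{p} B_p,
    \qquad
    A_v \;=\; \prod_{\ell \in v} \sigma^x_\ell,
    \qquad
    B_p \;=\; \prod_{\ell \in \partial p} \sigma^z_\ell .

The vertex operators generate the local Z(2) gauge transformations, and every ground state satisfies A_v = B_p = +1.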
Abstract:
Ore-forming and geoenvironmental systems commonly involve coupled fluid flow and chemical reaction processes. Advanced numerical methods and computational modeling have become indispensable tools for simulating such processes in recent years. This enables many hitherto unsolvable geoscience problems to be addressed using numerical methods and computational modeling approaches. For example, computational modeling has been successfully used to solve ore-forming and mine-site contamination/remediation problems, in which fluid flow and geochemical processes play important roles in the controlling dynamic mechanisms. The main purpose of this paper is to present a generalized overview of: (1) the various classes and models associated with fluid flow/chemically reacting systems, in order to highlight possible opportunities and developments for the future; (2) some more general issues that need attention in the development of computational models and codes for simulating ore-forming and geoenvironmental systems; (3) the related progress achieved in geochemical modeling over the past 50 years or so; (4) the general methodology for modeling of ore-forming and geoenvironmental systems; and (5) the future development directions associated with modeling of ore-forming and geoenvironmental systems.
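To make the notion of a coupled fluid-flow/chemical-reaction model concrete, the sketch below advances a single dissolved species through explicit time steps of a one-dimensional advection-reaction equation (dc/dt + u dc/dx = -k c). It is a deliberately minimal, hypothetical example, far simpler than the reactive-transport codes surveyed in the paper:

    def advect_react_step(c, u, k, dx, dt):
        """One explicit upwind step of dc/dt + u*dc/dx = -k*c (u > 0 assumed).

        c  : list of concentrations on a 1-D grid
        u  : flow velocity, k : first-order reaction rate
        dx : grid spacing,  dt : time step (must satisfy u*dt/dx <= 1)
        """
        new_c = c[:]
        for i in range(1, len(c)):
            advection = -u * (c[i] - c[i - 1]) / dx   # upwind difference
            reaction = -k * c[i]
            new_c[i] = c[i] + dt * (advection + reaction)
        return new_c

    # hypothetical setup: solute entering a 1 m column at a fixed inlet concentration
    c = [1.0] + [0.0] * 99          # concentration profile, 100 cells
    for _ in range(200):
        c = advect_react_step(c, u=0.01, k=0.05, dx=0.01, dt=0.5)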
Abstract:
A Hennessy-Milner property, relating modal equivalence and bisimulations, is defined for many-valued modal logics that combine a local semantics based on a complete MTL-chain (a linearly ordered commutative integral residuated lattice) with crisp Kripke frames. A necessary and sufficient algebraic condition is then provided for the class of image-finite models of these logics to admit the Hennessy-Milner property. Complete characterizations are obtained in the case of many-valued modal logics based on BL-chains (divisible MTL-chains) that are finite or have universe [0,1], including crisp Łukasiewicz, Gödel, and product modal logics.
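As background, the classical (two-valued) Hennessy-Milner theorem that this property generalizes states that, over image-finite Kripke models, modal equivalence and bisimilarity coincide. In standard notation (not the paper's), it reads:

    \text{for image-finite } \mathfrak{M}, \mathfrak{N}:\qquad
    (\mathfrak{M}, w) \equiv_{\mathrm{ML}} (\mathfrak{N}, v)
    \;\Longleftrightarrow\;
    (\mathfrak{M}, w) \sim_{\mathrm{bis}} (\mathfrak{N}, v).

The many-valued version studied in the paper replaces two-valued modal equivalence with equality of truth values taken from the underlying MTL-chain.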