933 results for non-trivial data structures
Abstract:
In this thesis we determine necessary and sufficient conditions for the existence of an equitably ℓ-colourable balanced incomplete block design for any positive integer ℓ > 2. In particular, we present a method for constructing non-trivial equitably ℓ-colourable BIBDs and prove that these designs are the only non-trivial equitably ℓ-colourable BIBDs that exist. We also observe that every equitable ℓ-colouring of a BIBD yields both an equalised ℓ-colouring and a proper 2-colouring of the same BIBD. Finally, we discuss generalisations of these concepts, including open questions for further research. The main results presented in this thesis also appear in [7].
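For orientation, the colouring notions involved admit the following standard formulation (notation assumed here, not taken from the thesis): for a (v, k, λ)-BIBD (V, 𝔅) and a colouring c : V → {1, …, ℓ},

    \[
    \begin{aligned}
    \text{equitable:}\quad & \bigl|\,|c^{-1}(i)\cap B| - |c^{-1}(j)\cap B|\,\bigr| \le 1 \quad \text{for all } B\in\mathcal{B} \text{ and all } i,j \in \{1,\dots,\ell\},\\
    \text{equalised:}\quad & \bigl|\,|c^{-1}(i)| - |c^{-1}(j)|\,\bigr| \le 1 \quad \text{for all } i,j,\\
    \text{proper:}\quad & \text{no block of } \mathcal{B} \text{ is monochromatic.}
    \end{aligned}
    \]

In particular, for block size k ≥ 2, an equitable colouring with ℓ ≥ 2 leaves no block monochromatic: a monochromatic block would have colour counts k and 0, violating the bound.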
Abstract:
As a device, the laser is an elegant conglomerate of elementary physical theories and state-of-the-art techniques, drawing on quantum mechanics, thermal and statistical physics, material growth and non-linear mathematics. The laser has been a commercial success in medicine and telecommunications, driving the development of highly optimised devices specifically designed for a plethora of uses. Due to their low cost and large-scale predictability, many aspects of modern life would not function without lasers. However, the laser is also a window into a system that is closely emulated by non-linear mathematical models; it has been an exceptional apparatus in the development of non-linear dynamics and is often used in the teaching of non-trivial mathematics. While single-mode semiconductor lasers have been well studied, a unified comparison of single- and two-mode lasers is still needed, both to extend the knowledge of semiconductor lasers and to test the limits of current models; this is the first aim of this work. Secondly, this work aims to utilise the optically injected semiconductor laser as a tool to study non-linear phenomena arising in other fields of study, namely 'Rogue waves', which have previously been witnessed in oceanography and are suspected of having non-linear origins. The first half of this thesis presents a reliable and fast technique to categorise the dynamical state of optically injected two-mode and single-mode lasers. Analysis of the experimentally obtained time traces revealed regions of various dynamics and allowed the automatic identification of their respective stability. The method is also extended to the detection of regions containing bi-stabilities. The second half of the thesis presents an investigation into the origins of Rogue Waves in single-mode lasers. After confirming their existence in single-mode lasers, their distribution in time and sudden appearance in the time series are studied to justify their name. An examination is also performed into the existence of paths that make Rogue Waves possible, and the impact of noise on their distribution is also studied.
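The categorisation technique itself is not described in the abstract; one common, minimal heuristic for such intensity time traces is to count the distinct local-maximum levels: none suggests a stable locked state, one suggests a period-1 limit cycle, a few suggest low-period dynamics, and many suggest chaos. A hedged Python sketch (function name, tolerance and labels are illustrative, not the thesis's method):

    def classify_trace(intensity, tol=1e-2):
        """Crudely label the dynamical state of an intensity time trace."""
        # Collect local maxima of the trace.
        peaks = [intensity[i] for i in range(1, len(intensity) - 1)
                 if intensity[i - 1] < intensity[i] > intensity[i + 1]]
        if not peaks:
            return "stable (injection-locked)"
        # Cluster peak heights that agree to within tol.
        levels = []
        for p in sorted(peaks):
            if not levels or p - levels[-1] > tol:
                levels.append(p)
        if len(levels) == 1:
            return "period-1 limit cycle"
        if len(levels) <= 4:
            return "low-period oscillation (%d peak levels)" % len(levels)
        return "chaotic or quasi-periodic"

    # Example: a noiseless sinusoidal intensity oscillation
    import math
    trace = [1 + 0.5 * math.sin(0.3 * n) for n in range(500)]
    print(classify_trace(trace))  # -> period-1 limit cycle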
Abstract:
Uncertainty in decision-making about patients' risk of re-admission arises from non-uniform data and a lack of knowledge of health-system variables. Knowledge of the impact of risk factors will give clinicians better decision-making and help reduce the number of patients admitted to the hospital. Traditional approaches are not capable of accounting for the uncertain nature of the risk of hospital re-admission, and further problems arise from the large amount of uncertain information. Patients can be at high, medium or low risk of re-admission, and these strata have ill-defined boundaries. We believe that our model, which adapts the fuzzy regression method, will initiate a novel approach to handling uncertain data and uncertain relationships between health-system variables and the risk of re-admission. Because the boundaries of the risk bands are ill-defined, this approach allows clinicians to target individuals at the boundaries; targeting these individuals and providing them with proper care may provide some ability to move patients from the high-risk to the low-risk band. In developing this algorithm, we aimed to help potential users assess patients at various risk-score thresholds and avoid the re-admission of high-risk patients through proper interventions. A model for predicting patients at high risk of re-admission will enable interventions to be targeted before costs have been incurred and health status has deteriorated. A risk-score cut-off level would flag patients and result in net savings where intervention costs per patient are much higher. Preventing hospital re-admissions is important for patients, and our algorithm may also impact hospital income.
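As a flavour of the fuzzy risk-band idea: overlapping membership functions let a patient near a boundary belong partially to two bands at once, rather than being forced across a crisp cut-off. A minimal sketch with invented band shapes (the study's actual membership functions and score scale are not given in the abstract):

    def tri(x, a, b, c):
        """Triangular membership function peaking at b, zero outside [a, c]."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    # Hypothetical risk bands on a 0-1 re-admission score.
    bands = {
        "low":    lambda s: tri(s, -0.01, 0.0, 0.45),
        "medium": lambda s: tri(s, 0.30, 0.50, 0.70),
        "high":   lambda s: tri(s, 0.55, 1.0, 1.01),
    }

    score = 0.62  # a patient close to the medium/high boundary
    memberships = {band: round(mu(score), 2) for band, mu in bands.items()}
    print(memberships)  # -> {'low': 0.0, 'medium': 0.4, 'high': 0.16}

A crisp threshold at 0.6 would call this patient simply "high risk"; the fuzzy memberships instead flag them as a boundary case suitable for targeted intervention.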
Abstract:
This is the final report to the Iowa DOT Offices of Construction and the Highway Division for the calendar year 1999 research project entitled "Continuation of Benchmarking Project: Phase IV". This project continues efforts started in 1995 with the development of a performance measurement system. The performance measurements were used to identify areas that required improvement, and process improvement teams (PITs) were launched to make recommendations for improvement. This report provides a brief historical background, documents Benchmark Steering Team activities, and describes measurement activities including the employee survey and the collection of non-survey data. A retrospective of past PIT activities is then given, which sets the stage for the substantial increase in PIT activity that occurred during the winter of 1998/99. Finally, the report closes with suggestions for future directions in benchmarking activity.
Abstract:
Scientific workflows orchestrate the execution of complex experiments, frequently using distributed computing platforms. Meta-workflows represent an emerging type of such workflows: they aim to reuse existing workflows, potentially from different workflow systems, to achieve more complex experimentation while minimizing workflow design and testing efforts. Workflow interoperability plays a profound role in achieving this objective. This paper focuses on fostering interoperability across meta-workflows that combine workflows of different workflow systems from diverse scientific domains. This is achieved by formalizing definitions of the meta-workflow and its different types, in order to standardize the data structures used to describe workflows to be published and shared via public repositories. The paper also includes a thorough formalization of two workflow interoperability approaches based on this formal description: the coarse-grained and the fine-grained workflow interoperability approach. The paper presents a case study from astrophysics which successfully demonstrates the use of the concepts of meta-workflows and workflow interoperability within a scientific simulation platform.
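The coarse-grained/fine-grained distinction can be pictured with a toy data structure: a coarse-grained node embeds a native workflow as a black box executed by its home system, whereas a fine-grained node exposes the foreign workflow's individual tasks to the meta-workflow engine. An illustrative Python sketch (names and fields are assumptions, not the paper's formal definitions):

    from dataclasses import dataclass, field
    from typing import List, Tuple, Union

    @dataclass
    class Task:
        name: str

    @dataclass
    class NativeWorkflow:
        system: str                      # owning workflow system (illustrative)
        workflow_id: str
        tasks: List[Task] = field(default_factory=list)

    @dataclass
    class CoarseGrainedNode:
        # Opaque: the meta-workflow submits the whole embedded workflow to
        # its home system and only sees its inputs and outputs.
        embedded: NativeWorkflow

    @dataclass
    class FineGrainedNode:
        # Decomposed: the foreign workflow's tasks become first-class nodes
        # that the meta-workflow engine schedules directly.
        tasks: List[Task]

    Node = Union[CoarseGrainedNode, FineGrainedNode]

    @dataclass
    class MetaWorkflow:
        nodes: List[Node]
        edges: List[Tuple[int, int]]     # (producer, consumer) data links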
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
Starting with an evaluator for a language, an abstract machine for the same language can be mechanically derived using successive program transformations. This has relevance to studying both the space and time properties of programs because these can be estimated by counting transitions of the abstract machine and measuring the size of the additional data structures needed, such as environments and stacks. In this article we use this process to derive a function that accurately counts the number of steps required to evaluate expressions in a simple language.
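To give the flavour of the end product, here is a hedged Python sketch of a tiny eval/continue abstract machine for arithmetic expressions that counts its own transitions; the article derives such a counter systematically from the evaluator by program transformation, whereas this machine is written directly for illustration:

    def run(expr):
        """Evaluate nested ('+'|'*', left, right) tuples or ints,
        returning (value, number of machine transitions taken)."""
        steps, stack, control = 0, [], ("eval", expr)
        while True:
            steps += 1
            if control[0] == "eval":
                e = control[1]
                if isinstance(e, int):
                    control = ("ret", e)               # literals return immediately
                else:
                    op, left, right = e
                    stack.append(("rand", op, right))  # remember the right operand
                    control = ("eval", left)
            else:                                      # control = ("ret", value)
                v = control[1]
                if not stack:
                    return v, steps
                frame = stack.pop()
                if frame[0] == "rand":                 # left done; evaluate right
                    _, op, right = frame
                    stack.append(("app", op, v))
                    control = ("eval", right)
                else:                                  # both done; apply operator
                    _, op, lv = frame
                    control = ("ret", lv + v if op == "+" else lv * v)

    print(run(("+", 1, ("*", 2, 3))))  # -> (7, 10)

The stack plays the role of the additional data structure mentioned above: measuring its maximum depth during a run would estimate space, while the returned step count estimates time.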
Abstract:
Target space duality is one of the most profound properties of string theory. However, performing it consistently customarily requires that the background fields satisfy certain invariance conditions; for instance, the vector fields along the directions in which T-duality is performed have to generate isometries. In the present paper we examine in detail the possibility of performing T-duality along non-isometric directions. In particular, based on a recent work of Kotov and Strobl, we study gauged 2D sigma models where gauge invariance for an extended set of gauge transformations imposes weaker constraints than in the standard case; notably, the corresponding vector fields are not Killing. This formulation enables us to follow a procedure analogous to the derivation of the Buscher rules and to obtain two dual models, by integrating out once the Lagrange multipliers and once the gauge fields. We show that this construction indeed works in non-trivial cases by examining an explicit class of examples based on step 2 nilmanifolds.
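For contrast with the standard case being generalised: for a single isometric direction θ with background E = g + B independent of θ, the Buscher procedure gauges the shift symmetry and adds a Lagrange multiplier θ̃ enforcing a flat gauge field (conventions vary; this is one common light-cone form):

    \[
    S_{\text{gauged}} = \int d^2\sigma \Big( E_{00}\, D_+\theta\, D_-\theta + E_{0i}\, D_+\theta\, \partial_- x^i + E_{i0}\, \partial_+ x^i\, D_-\theta + E_{ij}\, \partial_+ x^i \partial_- x^j + \tilde\theta\, F_{+-} \Big), \qquad D_\pm\theta = \partial_\pm\theta + A_\pm .
    \]

Integrating out θ̃ forces F_{+-} = 0, so A is pure gauge and the original model is recovered; integrating out A_± instead yields the dual background (the Buscher rules)

    \[
    \tilde E_{00} = \frac{1}{E_{00}}, \qquad \tilde E_{0i} = \frac{E_{0i}}{E_{00}}, \qquad \tilde E_{i0} = -\frac{E_{i0}}{E_{00}}, \qquad \tilde E_{ij} = E_{ij} - \frac{E_{i0} E_{0j}}{E_{00}} .
    \]

Both steps use ∂_θ E = 0, i.e. a Killing direction; the paper's extended gauge transformations are designed to relax exactly this assumption.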
Abstract:
Coinduction is a proof rule, dual to induction. It allows reasoning about non-well-founded structures such as lazy lists or streams and is of particular use for reasoning about equivalences. A central difficulty in the automation of coinductive proof is the choice of a relation (called a bisimulation). We present an automation of coinductive theorem proving based on the idea of proof planning. Proof planning constructs the higher-level steps in a proof, using knowledge of the general structure of a family of proofs and exploiting this knowledge to control the proof search. Part of proof planning involves the use of failure information to modify the plan through a proof critic, which exploits the information gained from the failed proof attempt. Our approach was to develop a strategy that makes an initial simple guess at a bisimulation and then uses generalisation techniques, motivated by a critic, to refine this guess, so that a larger class of coinductive problems can be automatically verified. The implementation of this strategy has focused on the use of coinduction to prove the equivalence of programs in a small lazy functional language similar to Haskell. We have developed a proof plan for coinduction and a critic associated with this proof plan, and implemented them in CoClam, an extended version of Clam, with encouraging results: the planner has been successfully tested on a number of theorems.
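A classic instance of the kind of equivalence targeted here is map f (iterate f x) = iterate f (f x) over infinite streams; a candidate bisimulation relates exactly such pairs for every x, and is closed under taking tails since both sides emit f(x) and continue from f(x). A Python rendering with generators (illustrative; the thesis's object language is a small lazy functional language, not Python):

    from itertools import islice

    def iterate(f, x):
        # The infinite stream x, f(x), f(f(x)), ...
        while True:
            yield x
            x = f(x)

    def map_stream(f, xs):
        for x in xs:
            yield f(x)

    # No finite test proves the coinductive theorem, but we can observe
    # agreement on any finite prefix:
    f = lambda n: n + 1
    lhs = map_stream(f, iterate(f, 0))   # map f (iterate f x)
    rhs = iterate(f, f(0))               # iterate f (f x)
    assert list(islice(lhs, 8)) == list(islice(rhs, 8))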
Abstract:
Coinduction is a method of growing importance in reasoning about functional languages, due to the increasing prominence of lazy data structures. Through the use of bisimulations, and of proofs that bisimilarity is a congruence in various domains, it can be used to prove the equivalence of two processes. A coinductive proof requires a relation to be chosen which can then be proved to be a bisimulation. We use proof planning to develop a heuristic method which automatically constructs a candidate relation. If this relation does not allow the proof to go through, a proof critic analyses the reasons why it failed and modifies the relation accordingly. Several proof tools have been developed to aid coinductive proofs, but all require user interaction; crucially, they require the user to supply an appropriate relation which the system can then prove to be a bisimulation.
Abstract:
We theoretically explore atomic Bose-Einstein condensates (BECs) subject to position-dependent spin-orbit coupling (SOC). This SOC can be produced by cyclically laser-coupling four internal atomic ground (or metastable) states in an environment where the detuning from resonance depends on position. The resulting spin-orbit-coupled BEC (SOBEC) phase-separates into domains, each of which contains density modulations (stripes) aligned along either the x or the y direction. In each domain, the stripe orientation is determined by the sign of the local detuning. When these stripes have mismatched spatial periods along domain boundaries, non-trivial topological spin textures form at the interface, including skyrmion-like spin vortices and anti-vortices. In contrast to the vortices present in conventional rotating BECs, these spin vortices are stable topological defects that are not present in the corresponding homogeneous stripe-phase SOBECs.
Abstract:
As rural communities experience rapid economic, demographic, and political change, program interventions that focus on the development of community leadership capacity could be valuable. Community leadership development programs have been deployed in rural U.S. communities for the past 30 years by university extension units, chambers of commerce, and other nonprofit foundations. Prior research on program outcomes has largely focused on trainees’ self-reported change in individual leadership knowledge, skills, and attitudes. However, postindustrial leadership theories suggest that leadership in the community relies not on individuals but on social relationships that develop across groups akin to social bridging. The purpose of this study is to extend and strengthen prior evaluative research on community leadership development programs by examining program effects on opportunities to develop bridging social capital using more rigorous methods. Data from a quasi-experimental study of rural community leaders (n = 768) in six states are used to isolate unique program effects on individual changes in both cognitive and behavioral community leadership outcomes. Regression modeling shows that participation in community leadership development programs is associated with increased leadership development in knowledge, skills, attitudes, and behaviors that are a catalyst for social bridging. The community capitals framework is used to show that program participants are significantly more likely to broaden their span of involvement across community capital asset areas over time compared to non-participants. Data on specific program structure elements show that skills training may be important for cognitive outcomes while community development learning and group projects are important for changes in organizational behavior. Suggestions for community leadership program practitioners are presented.
Abstract:
The BL Lac object 1ES 1011+496 was discovered at Very High Energy (VHE, E > 100 GeV) γ-rays by MAGIC in spring 2007. Before that, the source had been little studied at other wavelengths, and so a multi-wavelength (MWL) campaign was organized in spring 2008. Alongside MAGIC, the MWL campaign included the Metsähovi radio observatory, the Bell and KVA optical telescopes, and the Swift and AGILE satellites. MAGIC observations span March to May 2008, for a total of 27.9 hours, of which 19.4 hours remained after quality cuts. The light curve showed no significant variability, yielding an integral flux above 200 GeV of (1.3 ± 0.3) × 10^−11 photons cm^−2 s^−1. The differential VHE spectrum could be described by a power-law function with a spectral index of 3.3 ± 0.4. Both results are similar to those obtained during the discovery. Swift XRT observations revealed an X-ray flare, characterized by a harder-when-brighter trend, as is typical for high-synchrotron-peak BL Lac objects (HBLs). Strong optical variability was found during the campaign, but no conclusion on the connection between the optical and VHE γ-ray bands could be drawn. The contemporaneous SED shows a synchrotron-dominated source, unlike what was concluded in previous work based on non-simultaneous data, and is well described by a standard one-zone synchrotron self-Compton model. We also performed a study of the source classification: while the optical and X-ray data taken during our campaign show typical characteristics of an HBL, we suggest, based on archival data, that 1ES 1011+496 is actually a borderline case between intermediate- and high-synchrotron-peak BL Lac objects.
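Written out, the quoted fit has the usual power-law form (the normalisation f_0 and pivot energy E_0 are not stated in the abstract):

    \[
    \frac{dN}{dE} = f_0 \left(\frac{E}{E_0}\right)^{-\Gamma}, \qquad \Gamma = 3.3 \pm 0.4, \qquad F(>200\,\mathrm{GeV}) = \int_{200\,\mathrm{GeV}}^{\infty} \frac{dN}{dE}\, dE = (1.3 \pm 0.3) \times 10^{-11}\ \mathrm{ph\ cm^{-2}\ s^{-1}} .
    \]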
Abstract:
In this study, the Schwarz Information Criterion (SIC) is applied to detect change-points in time series of surface water quality variables. The change-point analysis detected change-points in both the mean and the variance of the series under study. Time variations in environmental data are complex, and they can hinder the identification of the so-called change-points when traditional models are applied to this type of problem. The assumptions of normality and of no serial correlation do not hold for some of the time series, and so a simulation study is carried out to evaluate the methodology's performance when applied to non-normal and/or time-correlated data.
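A minimal sketch of a SIC-based single change-point scan for Gaussian data, assuming the standard form SIC = −2 ln L̂ + p ln n and a change in both mean and variance (the study's actual models and handling of multiple change-points are not reproduced here):

    import math

    def gauss_loglik(xs):
        """Gaussian log-likelihood at the MLE (mean and biased variance)."""
        n = len(xs)
        mu = sum(xs) / n
        var = sum((x - mu) ** 2 for x in xs) / n   # assumes a non-degenerate segment
        return -0.5 * n * (math.log(2 * math.pi * var) + 1)

    def sic_changepoint(xs):
        """Return (k, sic) for the best split, or (None, sic) if no change wins."""
        n = len(xs)
        sic_null = -2 * gauss_loglik(xs) + 2 * math.log(n)    # p = 2: (mu, var)
        best_k, best_sic = None, float("inf")
        for k in range(2, n - 1):                             # >= 2 points per side
            ll = gauss_loglik(xs[:k]) + gauss_loglik(xs[k:])
            sic_k = -2 * ll + 4 * math.log(n)                 # p = 4: two (mu, var) pairs
            if sic_k < best_sic:
                best_k, best_sic = k, sic_k
        return (best_k, best_sic) if best_sic < sic_null else (None, sic_null)

    # Example: a mean shift halfway through a toy series
    xs = [0.1, -0.2, 0.0, 0.3, -0.1, 5.2, 4.9, 5.1, 5.3, 4.8]
    print(sic_changepoint(xs))   # expected split at k = 5

A change-point is declared where the split model's SIC beats the no-change model's; on real series the scan would be applied recursively or extended with the correlation-aware variants evaluated in the simulation study.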
Abstract:
In many instances of holographic correspondences between a d-dimensional boundary theory and a (d+1)-dimensional bulk, a direct argument in the boundary theory implies that there must exist a simple and precise relation between the Euclidean on-shell action of a (d−1)-brane probing the bulk geometry and the Euclidean gravitational bulk action. This relation is crucial for the consistency of holography, yet it is non-trivial from the bulk perspective. In particular, we show that it relies on a nice isoperimetric inequality that must be satisfied in a large class of Poincaré-Einstein spaces. Remarkably, this inequality follows from theorems by Lee and Wang.