831 results for Formal Methods. Component-Based Development. Competition. Model Checking
Abstract:
Heuristics, simulation, artificial intelligence techniques and combinations thereof have all been employed in the attempt to make computer systems adaptive, context-aware, reconfigurable and self-managing. This paper complements such efforts by exploring the possibility of achieving runtime adaptiveness using mathematically based techniques from the area of formal methods. It is argued that formal methods @ runtime represents a feasible approach, and promising preliminary results are summarised to support this viewpoint. The survey of existing approaches to employing formal methods at runtime is accompanied by a discussion of their challenges and of the future research required to overcome them. © 2011 Springer-Verlag.
Abstract:
The Ocean Model Intercomparison Project (OMIP) is an endorsed project in the Coupled Model Intercomparison Project Phase 6 (CMIP6). OMIP addresses CMIP6 science questions, investigating the origins and consequences of systematic model biases. It does so by providing a framework for evaluating (including assessment of systematic biases), understanding, and improving ocean, sea-ice, tracer, and biogeochemical components of climate and earth system models contributing to CMIP6. Among the WCRP Grand Challenges in climate science (GCs), OMIP primarily contributes to the regional sea level change and near-term (climate/decadal) prediction GCs. OMIP provides (a) an experimental protocol for global ocean/sea-ice models run with a prescribed atmospheric forcing; and (b) a protocol for ocean diagnostics to be saved as part of CMIP6. We focus here on the physical component of OMIP, with a companion paper (Orr et al., 2016) detailing methods for the inert chemistry and interactive biogeochemistry. The physical portion of the OMIP experimental protocol follows the interannual Coordinated Ocean-ice Reference Experiments (CORE-II). Since 2009, CORE-I (Normal Year Forcing) and CORE-II (Interannual Forcing) have become the standard methods to evaluate global ocean/sea-ice simulations and to examine mechanisms for forced ocean climate variability. The OMIP diagnostic protocol is relevant for any ocean model component of CMIP6, including the DECK (Diagnostic, Evaluation and Characterization of Klima experiments), historical simulations, FAFMIP (Flux Anomaly Forced MIP), C4MIP (Coupled Carbon Cycle Climate MIP), DAMIP (Detection and Attribution MIP), DCPP (Decadal Climate Prediction Project), ScenarioMIP, HighResMIP (High Resolution MIP), as well as the ocean/sea-ice OMIP simulations.
Abstract:
The evaluation and identification of habitats that function as nurseries for marine species has the potential to improve conservation and management. A key step in assessing nursery habitat is estimating individual growth. However, the discrete growth of crustaceans makes it challenging for many traditional in situ techniques to accurately estimate growth over short temporal scales. To evaluate the use of nucleic acid ratios (R:D) for juvenile blue crab (Callinectes sapidus), I developed and validated an R:D-based index of growth in the laboratory. R:D-based growth estimates of crabs collected in the Patuxent River, MD, indicated that growth ranged from 0.8 to 25.9 mg·g⁻¹·d⁻¹. Overall, there was no effect of size on growth, whereas there was a weak but significant effect of date. These data provide insight into patterns of habitat-specific growth. The results highlight the complexity of the biological and physical factors that regulate growth of juvenile blue crabs in the field.
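For readers unfamiliar with such indices, the calibration step typically amounts to regressing laboratory-measured growth on the nucleic acid ratio and then applying the fitted relation to field samples. Below is a minimal sketch in Python; the data values, the linear form, and the function names are hypothetical illustrations, not the thesis's actual calibration:

    import numpy as np

    # Hypothetical laboratory calibration data: measured growth rate
    # (mg·g⁻¹·d⁻¹) for juvenile crabs with known R:D (RNA:DNA) ratios.
    rd_lab     = np.array([1.5, 2.0, 2.8, 3.5, 4.1, 5.0])
    growth_lab = np.array([2.1, 5.5, 10.2, 15.8, 20.3, 25.5])

    # Least-squares linear calibration: growth = slope * R:D + intercept
    slope, intercept = np.polyfit(rd_lab, growth_lab, 1)

    def predict_growth(rd_field):
        """Predict growth of field-caught crabs from their R:D ratios."""
        return slope * rd_field + intercept

    print(predict_growth(np.array([2.2, 3.9])))  # hypothetical field samples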
Abstract:
Alexander’s Ecological Dominance and Social Competition (EDSC) model currently provides the most comprehensive overview of human traits in the development of a theory of human evolution and sociality (Alexander, 1990; Flinn, Geary & Ward, 2005; Irons, 2005). His model provides a basis for explaining the evolution of human socio-cognitive abilities. Our paper examines the extension of Alexander’s model to incorporate the human trait of information behavior, in synergy with ecological dominance and social competition, as a human socio-cognitive competence. We discuss the various interdisciplinary perspectives exploring how evolution has shaped information behavior and why information behavior is emerging as an important human socio-cognitive competence, and outline these issues, including the extension of Spink and Currier’s (2006a,b) evolution of information behavior model towards a more integrated understanding of how information behaviors have evolved (Spink & Cole, 2006).
Abstract:
Adults diagnosed with primary brain tumours often experience physical, cognitive and neuropsychiatric impairments and decline in quality of life. Although disease- and treatment-related information is commonly provided to cancer patients and carers, newly diagnosed brain tumour patients and their carers report unmet information needs. Few interventions have been designed or proven to address these information needs. Accordingly, a three-study research program that incorporated both qualitative and quantitative research methods was designed to: 1) identify and select an intervention to improve the provision of information and meet the needs of patients with a brain tumour; 2) use an evidence-based approach to establish the content, language and format for the intervention; and 3) assess the acceptability of the intervention, and the feasibility of evaluation, with newly diagnosed brain tumour patients.

Study 1: Structured concept mapping techniques were undertaken with 30 health professionals, who identified strategies or items for improving care and rated each of 42 items for importance, feasibility, and the extent to which such care was provided. Participants also provided data on the relationships between items, which were translated into ‘maps’ of relationships between information and other aspects of health care using multidimensional scaling and hierarchical cluster analysis (a sketch of this analysis follows the abstract). Results were discussed by participants in small groups and individual interviews to understand the ratings, and the facilitators and barriers to implementation. A care coordinator was rated as the most important strategy by health professionals. Two items directly related to information provision were also seen as highly important: "information to enable the patient or carer to ask questions" and "for doctors to encourage patients to ask questions". Qualitative analyses revealed that information provision was individualised, depending on patients’ information needs and preferences, demographic variables and distress, the characteristics of the health professionals who provide information, and the relationship between the individual patient and health professional, and was influenced by the fragmented nature of the health care system. Based on the quantitative and qualitative findings, a brain tumour specific question prompt list (QPL) was chosen for development and feasibility testing. A QPL consists of a list of questions that patients and carers may want to ask their doctors. It is designed to encourage the asking of questions in the medical consultation, allowing patients to control the content and amount of information provided by health professionals.

Study 2: The initial structure and content of the brain tumour specific QPL were based upon thematic analyses of 1) patient materials for brain tumour patients, 2) QPLs designed for other patient populations, and 3) clinical practice guidelines for the psychosocial care of glioma patients. An iterative process of review and refinement of content was undertaken via telephone interviews with a convenience sample of 18 patients and/or carers. Successive drafts of the QPL were sent to patients and carers and changes made until no new topics or suggestions arose in four successive interviews (saturation). Once the QPL content was established, readability analyses and redrafting were conducted to achieve a sixth-grade reading level. The draft QPL was also reviewed by eight health professionals, and shortened and modified based on their feedback.
Professional design of the QPL was conducted and the result sent to patients and carers for further review. The final QPL contained questions in seven colour-coded sections: 1) diagnosis; 2) prognosis; 3) symptoms and problems; 4) treatment; 5) support; 6) after treatment finishes; and 7) the health professional team.

Study 3: A feasibility study was conducted to determine the acceptability of the QPL and the appropriateness of methods, to inform a potential future randomised trial to evaluate its effectiveness. A pre-test/post-test design was used with a nonrandomised control group. The control group was provided with ‘standard information’, the intervention group with ‘standard information’ plus the QPL. The primary outcome measure was acceptability of the QPL to participants. Twenty patients from four hospitals were recruited a median of 1 month (range 0-46 months) after diagnosis, and 17 completed baseline and follow-up interviews. Six participants would have preferred to receive the information booklet (standard information or QPL) at a different time, most commonly at diagnosis. Seven participants reported on the acceptability of the QPL: all said that the QPL was helpful and that it contained questions that were useful to them; six said it made it easier to ask questions. Compared with control group participants’ ratings of ‘standard information’, QPL group participants’ views of the QPL were more positive: the QPL had been read more times, was less likely to be reported as ‘overwhelming’ to read, and was more likely to prompt participants to ask questions of their health professionals.

The results from the three studies of this research program add to the body of literature on information provision for brain tumour patients. Together, these studies suggest that a QPL may be appropriate for the neuro-oncology setting and acceptable to patients. The QPL aims to assist patients to express their information needs, enabling health professionals to better provide the type and amount of information that patients need to prepare for treatment and the future. This may help health professionals meet the challenge of giving patients sufficient information without providing ‘too much’ or ‘unnecessary’ information, or taking away hope. Future studies with rigorous designs are now needed to determine the effectiveness of the QPL.
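The concept-mapping analysis named in Study 1 can be sketched in Python as follows; the similarity data here are random placeholders standing in for the sorting/rating data collected from the 30 health professionals over the 42 items, and the parameter choices (two dimensions, six clusters) are illustrative assumptions:

    import numpy as np
    from sklearn.manifold import MDS
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import squareform

    rng = np.random.default_rng(0)
    n_items = 42

    # Placeholder item-by-item similarity matrix (symmetric, 1.0 on diagonal)
    sim = rng.random((n_items, n_items))
    sim = (sim + sim.T) / 2
    np.fill_diagonal(sim, 1.0)
    dist = 1.0 - sim  # convert similarities to dissimilarities

    # Multidimensional scaling: place the 42 items on a 2-D 'map' so that
    # items rated as similar sit close together.
    coords = MDS(n_components=2, dissimilarity='precomputed',
                 random_state=0).fit_transform(dist)

    # Hierarchical cluster analysis: group items into clusters of related
    # care strategies (six clusters chosen arbitrarily for illustration).
    clusters = fcluster(linkage(squareform(dist, checks=False), method='ward'),
                        t=6, criterion='maxclust')
    print(coords.shape, np.bincount(clusters)[1:])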
Abstract:
Responding to the global and unprecedented challenge of capacity building for twenty-first century life, this book is a practical guide for tertiary education institutions to quickly and effectively renew the curriculum towards education for sustainable development. The book begins by exploring why curriculum change has been so slow. It then describes a model for rapid curriculum renewal, highlighting the important roles of setting timeframes, formal and informal leadership, and key components and action strategies. The second part of the book provides detailed coverage of six core elements that have been trialled and peer reviewed by institutions around the world:
- raising awareness among staff and students
- mapping graduate attributes
- auditing the curriculum
- developing niche degrees, flagship courses and fully integrated programs
- engaging and catalysing community and student markets
- integrating curriculum with green campus operations.
With input from more than seventy academics and grounded in engineering education experiences, this book will provide academic staff with tools and insights to rapidly align program offerings with the needs of present and future generations of students.
Abstract:
Modern robots are increasingly expected to function in uncertain and dynamically challenging environments, often in proximity to humans. In addition, wide-scale adoption of robots requires on-the-fly adaptability of software for diverse applications. These requirements strongly suggest the need to adopt formal representations of high-level goals and safety specifications, especially as temporal logic formulas. This approach allows the use of formal verification techniques for controller synthesis that can give guarantees on safety and performance. Robots operating in unstructured environments also face limited sensing capability, so correctly inferring a robot's progress toward a high-level goal can be challenging.
This thesis develops new algorithms for synthesizing discrete controllers in partially known environments under specifications represented as linear temporal logic (LTL) formulas. It is inspired by recent developments in finite abstraction techniques for hybrid systems and motion planning problems. The robot and its environment are assumed to admit a finite abstraction as a Partially Observable Markov Decision Process (POMDP), a powerful model class capable of representing a wide variety of problems. However, synthesizing controllers that satisfy LTL goals over POMDPs is a challenging problem that has received only limited attention.
This thesis proposes tractable, approximate algorithms for the control synthesis problem using Finite State Controllers (FSCs). Using FSCs to control finite POMDPs allows the closed system to be analyzed as a finite global Markov chain. The thesis explicitly shows how the transient and steady-state behavior of the global Markov chain can be related to two different criteria for the satisfaction of LTL formulas. First, maximization of the probability of LTL satisfaction is related to an optimization problem over a parametrization of the FSC. Analytic computation of the gradients is derived, which allows the use of first-order optimization techniques.
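To make the construction concrete, the following minimal sketch (a randomly generated toy model; all sizes, kernels, and variable names are illustrative assumptions, not taken from the thesis) shows how closing a finite POMDP with an FSC yields a finite global Markov chain whose steady-state behavior can then be analyzed:

    import numpy as np

    # Toy sizes: |S| POMDP states, |A| actions, |O| observations,
    # |Q| FSC memory states (all hypothetical).
    nS, nA, nO, nQ = 4, 2, 2, 3
    rng = np.random.default_rng(0)

    def normalize(x):
        return x / x.sum(axis=-1, keepdims=True)

    T     = normalize(rng.random((nS, nA, nS)))  # T[s, a, s'] state transitions
    Z     = normalize(rng.random((nS, nO)))      # Z[s', o] observation kernel
    psi   = normalize(rng.random((nQ, nA)))      # psi[q, a] FSC action rule
    omega = normalize(rng.random((nQ, nO, nQ)))  # omega[q, o, q'] memory update

    # Global Markov chain over joint states (s, q):
    # P[(s,q) -> (s',q')] = sum_{a,o} psi[q,a] T[s,a,s'] Z[s',o] omega[q,o,q']
    P = np.einsum('qa,sak,ko,qoj->sqkj', psi, T, Z, omega).reshape(nS*nQ, nS*nQ)
    assert np.allclose(P.sum(axis=1), 1.0)       # each row is a distribution

    # Steady-state (stationary) distribution of the closed system
    evals, evecs = np.linalg.eig(P.T)
    pi = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
    pi /= pi.sum()
    print(pi.reshape(nS, nQ))

In the thesis's setting, LTL satisfaction is analyzed on a chain of this kind (typically after further composition with an automaton for the formula), and the FSC parameters behind psi and omega are what the derived gradients optimize.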
The second criterion encourages rapid and frequent visits to a restricted set of states over infinite executions. It is formulated as a constrained optimization problem with a discounted long-term reward objective through a novel use of a fundamental equation for Markov chains, the Poisson equation. A new constrained policy iteration technique is proposed to solve the resulting dynamic program, which also provides a way to escape local maxima.
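For reference, the Poisson equation for a finite ergodic Markov chain (presumably the fundamental equation the abstract refers to; the notation below is the standard one, not taken from the thesis) relates the per-state reward r, the gain g, and the bias h:

    h + g\,\mathbf{1} = r + P\,h, \qquad g = \pi^{\top} r,

where P is the transition matrix, \pi its stationary distribution, and h is unique up to an additive constant. Embedding such an identity as a constraint is one standard way a long-run objective can be handled by policy iteration, consistent with the approach the abstract describes.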
The algorithms proposed in the thesis are applied to the task planning and execution challenges faced during the DARPA Autonomous Robotic Manipulation - Software challenge.
Abstract:
The intent of this study is to provide formal apparatus which facilitates the investigation of problems in the methodology of science. The introduction contains several examples of such problems and motivates the subsequent formalism.
A general definition of a formal language is presented, and this definition is used to characterize an individual’s view of the world around him. A notion of empirical observation is developed which is independent of language. The interplay of formal language and observation is taken as the central theme. The process of science is conceived as the finding of that formal language that best expresses the available experimental evidence.
To characterize the manner in which a formal language imposes structure on its universe of discourse, the fundamental concepts of elements and states of a formal language are introduced. Using these, the notion of a basis for a formal language is developed as a collection of minimal states distinguishable within the language. The relation of these concepts to those of model theory is discussed.
An a priori probability defined on sets of observations is postulated as a reflection of an individual’s ontology. This probability, in conjunction with a formal language and a basis for that language, induces a subjective probability describing an individual’s conceptual view of admissible configurations of the universe. As a function of this subjective probability, and consequently of language, a measure of the informativeness of empirical observations is introduced and is shown to be intuitively plausible – particularly in the case of scientific experimentation.
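One standard way to make such a measure concrete (an illustrative assumption here, not necessarily the thesis's definition) is Shannon surprisal with respect to the induced subjective probability \mu:

    I(o) = -\log \mu(o),

so that observations assigned low subjective probability carry the most information, matching the intuition that surprising experimental outcomes are the most informative.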
The developed formalism is then systematically applied to the general problems presented in the introduction. The relationship of scientific theories to empirical observations is discussed, and certain tacit, unstatable knowledge is shown to be necessary to fully comprehend the meaning of realistic theories. The idea that many common concepts can be specified only by drawing on knowledge obtained from an infinite number of observations is presented, and the problems of reductionism are examined in this context.
A definition of when one formal language can be considered to be more expressive than another is presented, and the change in the informativeness of an observation as language changes is investigated. In this regard it is shown that the information inherent in an observation may decrease for a more expressive language.
The general problem of induction and its relation to the scientific method are discussed. Two hypotheses concerning an individual’s selection of an optimal language for a particular domain of discourse are presented and specific examples from the introduction are examined.
Abstract:
Engineering changes (ECs) are essential in complex product development, and their management is a crucial discipline for engineering industries. Numerous methods have been developed to support EC management (ECM), of which the change prediction method (CPM) is one of the most established. This article contributes a requirements-based benchmarking approach to assess and improve existing methods; the CPM is selected for improvement. First, based on a comprehensive literature survey and insights from industrial case studies, a set of 25 requirements for change management methods is developed. Second, these requirements are used as benchmarking criteria to assess the CPM in comparison with seven other promising methods. Third, the best-in-class solutions for each requirement are investigated to draw improvement suggestions for the CPM. Finally, an enhanced ECM method which implements these improvements is presented. © 2013 The Author(s). Published by Taylor & Francis.
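The benchmarking step lends itself to a simple worked illustration. The sketch below (method names, requirement names, and scores are all hypothetical; the article works with 25 requirements and eight methods) shows how best-in-class solutions per requirement expose a method's improvement gaps:

    import numpy as np

    methods = ['CPM', 'MethodB', 'MethodC']
    requirements = ['R1 traceability', 'R2 scalability', 'R3 tool support']

    # scores[m, r]: how well method m satisfies requirement r (0-4, assumed)
    scores = np.array([[3, 1, 2],
                       [2, 4, 1],
                       [1, 2, 4]])

    # Best-in-class method for each requirement
    best = scores.argmax(axis=0)
    for req, m in zip(requirements, best):
        print(f'{req}: best-in-class = {methods[m]}')

    # Requirements where the CPM trails the best-in-class solution suggest
    # where improvements should be drawn from.
    cpm = methods.index('CPM')
    gaps = scores.max(axis=0) - scores[cpm]
    print('CPM gaps:', dict(zip(requirements, gaps.tolist())))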
Abstract:
When implementing autonomic management of multiple non-functional concerns, a trade-off must be found between the ability to develop management of the individual concerns independently (following the separation-of-concerns principle) and the detection and resolution of conflicts that may arise when the independently developed management code is combined. Here we discuss strategies to establish this trade-off and introduce a model-checking-based methodology aimed at simplifying the discovery and handling of conflicts arising from the deployment, within the same parallel application, of independently developed management policies. Preliminary results are shown demonstrating the feasibility of the approach.
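A toy illustration of the kind of conflict such a methodology must surface (the states, policies, and knobs below are invented for illustration and are not the paper's model): two independently developed managers, one for performance and one for power, are combined, and an exhaustive exploration of the finite state space, a simple form of model checking, reports states where they prescribe opposing actions:

    from collections import deque

    # Hypothetical state: (parallelism degree, remaining power budget)
    def perf_policy(state):            # performance manager
        deg, cap = state
        return 'add_worker' if deg < 4 else 'noop'

    def power_policy(state):           # power manager
        deg, cap = state
        return 'remove_worker' if cap < 2 and deg > 1 else 'noop'

    def apply(state, action):
        deg, cap = state
        if action == 'add_worker':
            return (deg + 1, cap - 1)
        if action == 'remove_worker':
            return (deg - 1, cap + 1)
        return state

    conflicts, seen, frontier = [], set(), deque([(1, 1)])
    while frontier:
        s = frontier.popleft()
        if s in seen:
            continue
        seen.add(s)
        a1, a2 = perf_policy(s), power_policy(s)
        if {a1, a2} == {'add_worker', 'remove_worker'}:
            conflicts.append(s)        # managers pull in opposite directions
        for a in (a1, a2):
            t = apply(s, a)
            if 0 < t[0] <= 8 and 0 <= t[1] <= 8:
                frontier.append(t)
    print('conflicting states:', conflicts)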
Abstract:
Stakeholder analysis plays a critical role in business analysis. However, the majority of stakeholder identification and analysis methods focus on activities and processes and ignore the artefacts being processed by human beings. By focusing on the outputs of the organisation, an artefact-centric view helps create a network of artefacts and a component-based structure of the organisation and its supply chain participants. Because the relationships are grounded in these components, the interdependency between stakeholders and the focal organisation can be measured once the stakeholders are identified. Each stakeholder is associated with two types of dependency, namely the stakeholder’s dependency on the focal organisation and the focal organisation’s dependency on the stakeholder. We identify three factors for each type of dependency and propose equations that calculate the dependency indexes. Once both dependency indexes are calculated, each stakeholder can be placed and categorised into one of four groups: critical stakeholder, mutual benefits stakeholder, replaceable stakeholder, and easy care stakeholder. The mutual dependency grid and the dependency gap analysis, which further investigates the priority of each stakeholder by calculating the weighted dependency gap between the focal organisation and the stakeholder, subsequently help the focal organisation to better understand and manage its stakeholder relationships.
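The quantitative core of the approach can be sketched as follows; the factor names, the equal weighting, the 0.5 threshold, and the quadrant assignment are all illustrative assumptions here, since the article proposes its own equations and grid:

    def dependency_index(f1, f2, f3, weights=(1/3, 1/3, 1/3)):
        """Aggregate three factor scores in [0, 1] into one dependency index."""
        return weights[0]*f1 + weights[1]*f2 + weights[2]*f3

    def categorise(s_on_org, org_on_s, t=0.5):
        """Place a stakeholder into one of the four groups on the grid.
        The mapping of quadrants to group names is assumed for illustration."""
        if org_on_s >= t and s_on_org >= t:
            return 'mutual benefits stakeholder'
        if org_on_s >= t:
            return 'critical stakeholder'    # org depends, stakeholder less so
        if s_on_org >= t:
            return 'easy care stakeholder'   # stakeholder depends, org less so
        return 'replaceable stakeholder'

    def weighted_dependency_gap(s_on_org, org_on_s, weight=1.0):
        """Dependency gap analysis: weighted gap between the two indexes."""
        return weight * (org_on_s - s_on_org)

    # Hypothetical factor scores (e.g. criticality, scarcity, switching cost)
    org_on_s = dependency_index(0.9, 0.7, 0.8)
    s_on_org = dependency_index(0.3, 0.4, 0.2)
    print(categorise(s_on_org, org_on_s),
          round(weighted_dependency_gap(s_on_org, org_on_s), 2))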
Abstract:
Component-based software engineering has recently emerged as a promising solution to the development of system-level software. Unfortunately, current approaches are limited to specific platforms and domains. This lack of generality is particularly problematic as it prevents knowledge sharing and generally drives development costs up. In the past, we have developed a generic approach to component-based software engineering for system-level software called OpenCom. In this paper, we present OpenComL, an instantiation of OpenCom for Linux environments, and show how it can be profiled to meet the needs of a range of system-level software. To demonstrate this, we apply it to constructing a programmable router platform and a middleware for parallel environments.
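The component-model concepts involved can be illustrated with a minimal sketch (in Python for brevity; the names below are generic stand-ins, not the actual OpenCom/OpenComL API, which targets system-level code):

    class Component:
        """A deployable unit exposing interfaces and declaring receptacles."""
        def __init__(self, name):
            self.name = name
            self.interfaces = {}    # services this component provides
            self.receptacles = {}   # dependencies to be satisfied by binding

        def provide(self, iface, impl):
            self.interfaces[iface] = impl

        def require(self, iface):
            self.receptacles[iface] = None

    def bind(client, provider, iface):
        """Third-party binding: connect a receptacle to a matching interface."""
        client.receptacles[iface] = provider.interfaces[iface]

    # e.g. a programmable-router-style composition
    forwarder, classifier = Component('forwarder'), Component('classifier')
    classifier.provide('IClassify',
                       lambda pkt: 'high' if pkt.get('urgent') else 'low')
    forwarder.require('IClassify')
    bind(forwarder, classifier, 'IClassify')
    print(forwarder.receptacles['IClassify']({'urgent': True}))  # -> 'high'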
Abstract:
In this work, a new method is proposed for the separate estimation of the parameters of an ARMA spectral model, based on the modified Yule-Walker equations and on the least squares method. The proposed method performs AR filtering of the generated random process, obtaining a new process from which the ARMA model parameters are re-estimated, yielding a better spectrum estimate. Numerical examples are presented to illustrate the performance of the proposed method, which is evaluated by the relative error and the average coefficient of variation.
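A minimal sketch of the estimation pipeline in Python (the simulated model, the orders, and the number of extra equations are illustrative assumptions; only the AR stage is shown, and the abstract's re-estimation step would operate on the filtered series y):

    import numpy as np
    from scipy.signal import lfilter

    rng = np.random.default_rng(1)

    # Simulate a toy ARMA(2,1) process:
    # x[t] = 0.75 x[t-1] - 0.5 x[t-2] + e[t] + 0.4 e[t-1]
    n = 20000
    e = rng.standard_normal(n)
    x = lfilter([1.0, 0.4], [1.0, -0.75, 0.5], e)

    p, q, extra = 2, 1, 8   # orders; 'extra' rows make the system overdetermined

    def acov(x, maxlag):
        """Biased sample autocovariance at lags 0..maxlag."""
        x = x - x.mean()
        m = len(x)
        return np.array([np.dot(x[:m-k], x[k:]) / m for k in range(maxlag + 1)])

    r = acov(x, q + p + extra)

    # Modified Yule-Walker equations: r[k] = sum_i a_i r[k-i] for k > q,
    # solved here in the least-squares sense.
    rows = range(q + 1, q + p + extra + 1)
    R = np.array([[r[abs(k - i)] for i in range(1, p + 1)] for k in rows])
    rhs = np.array([r[k] for k in rows])
    a_hat, *_ = np.linalg.lstsq(R, rhs, rcond=None)
    print('AR estimates:', a_hat)   # expect roughly [0.75, -0.5]

    # AR-filter the series: y is then approximately a pure MA(q) process,
    # from which the MA parameters (and the spectrum) can be re-estimated.
    y = lfilter(np.concatenate(([1.0], -a_hat)), [1.0], x)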
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Boron Neutron Capture Therapy (BNCT) is an indirect form of radiotherapy that destroys tumour cells through the targeted release of densely ionising radiation. The released ions are the fragments of a nuclear reaction in which the isotope 10B captures a low-energy (thermal) neutron. The 10B is accumulated in the tumour cells by means of a special boron compound, which is itself not radioactive.

At the Johannes Gutenberg-Universität Mainz, research towards the application of a clinical treatment protocol was prompted by two individual curative treatment attempts on patients with colorectal liver metastases at the University of Pavia, Italy, in which the liver was irradiated outside the body at a research reactor. As a first step, a clinical study was initiated in cooperation between several university institutes to determine clinically relevant parameters, such as the boron distribution in different tissues and the pharmacokinetic uptake behaviour of the boron compound.

The boron concentration in the tissue samples was determined with respect to its spatial distribution across different cell areas, in order to learn more about the cells' uptake behaviour for the BPA in relation to their biological characteristics. The boron determination was carried out by Quantitative Neutron Capture Radiography, Prompt Gamma Activation Analysis and Inductively Coupled Plasma Mass Spectroscopy, in parallel with the histological analysis of the tissue. It could be shown that samples from tumour tissue and from tumour-free tissue with different morphological properties exhibit a very heterogeneous boron distribution. The results of the blood samples are being used to construct a pharmacokinetic model and are in agreement with existing pharmacokinetic models. In addition, the boron determination methods were compared with one another using specially produced reference standards; good agreement between the results was found, and standard analysis protocols were established for all biological samples.

The results of the clinical study obtained so far are promising, but do not yet permit definitive conclusions regarding the efficacy of BNCT for malignant liver disease.