901 results for Formal Methods. Component-Based Development. Competition. Model Checking
Abstract:
Tropical forests are carbon-dense and highly productive ecosystems. Consequently, they play an important role in the global carbon cycle. In the present study we used an individual-based forest model (FORMIND) to analyze the carbon balances of a tropical forest. The main processes of this model are tree growth, mortality, regeneration, and competition. Model parameters were calibrated using forest inventory data from a tropical forest at Mt. Kilimanjaro. The simulation results showed that the model successfully reproduces important characteristics of tropical forests (aboveground biomass, stem size distribution and leaf area index). The estimated aboveground biomass (385 t/ha) is comparable to biomass values in the Amazon and other tropical forests in Africa. The simulated forest reveals a gross primary production of 24 t C ha⁻¹ yr⁻¹. Modeling above- and belowground carbon stocks, we analyzed the carbon balance of the investigated tropical forest. The simulated carbon balance of this old-growth forest is zero on average. This study provides an example of how forest models can be used in combination with forest inventory data to investigate forest structure and local carbon balances.
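FORMIND's tree-level growth, mortality, regeneration and competition routines are far richer than an abstract can show; the snippet below is only a minimal sketch of the stand-level carbon bookkeeping the last sentences refer to, in which only the quoted gross primary production comes from the abstract and the respiration split is assumed for illustration.

    # Minimal sketch of stand-level carbon bookkeeping (not FORMIND itself).
    # Only the GPP value (24 t C ha^-1 yr^-1) is taken from the abstract;
    # the respiration split is an assumption for illustration.
    gpp = 24.0                    # gross primary production, t C ha^-1 yr^-1
    autotrophic_resp = 12.0       # respiration of living trees (assumed)
    heterotrophic_resp = 12.0     # decomposition of dead wood and soil (assumed)

    npp = gpp - autotrophic_resp              # net primary production
    nee = npp - heterotrophic_resp            # net ecosystem exchange
    print(f"NPP = {npp:.1f}, NEE = {nee:.1f} t C ha^-1 yr^-1")
    # An old-growth stand in equilibrium, as simulated here, has NEE close to zero.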
Abstract:
We propose a way to incorporate non-tariff barriers (NTBs) for the four workhorse models of the modern trade literature in computable general equilibrium (CGE) models. CGE models feature intermediate linkages and thus allow us to study global value chains (GVCs). We show that the Ethier-Krugman monopolistic competition model, the Melitz firm heterogeneity model and the Eaton and Kortum model can be defined as an Armington model with generalized marginal costs, generalized trade costs and a demand externality. As is already known in the literature, in both the Ethier-Krugman model and the Melitz model generalized marginal costs are a function of the amount of factor input bundles; in the Melitz model they are also a function of the price of the factor input bundles. Lower factor prices raise the number of firms that can enter the market profitably (the extensive margin), reducing the generalized marginal costs of a representative firm. For the same reason, the Melitz model features a demand externality: in a larger market more firms can enter. We implement the different models in a CGE setting with multiple sectors, intermediate linkages, non-homothetic preferences and detailed data on trade costs. We find the largest welfare effects from trade cost reductions in the Melitz model. We also employ the Melitz model to mimic changes in NTBs with a fixed-cost character by analysing the effect of changes in fixed trade costs. While we work here with a model calibrated to the GTAP database, the methods developed can also be applied to CGE models based on the WIOD database.
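The paper's generalized-marginal-cost expressions are not reproduced in this abstract; the snippet below is only a textbook Dixit-Stiglitz illustration of the love-of-variety mechanism it alludes to (more entrants lower the effective cost of the composite good), with an invented elasticity and prices, not the calibrated CGE model.

    # Textbook love-of-variety illustration (not the paper's calibrated model).
    # With N symmetric varieties each priced at p, the CES price index is
    # P = N**(1/(1 - sigma)) * p, so more entrants reduce the effective
    # (generalized) cost of the composite good.
    sigma = 5.0   # elasticity of substitution (assumed)
    p = 1.0       # price of an individual variety (assumed)

    for n_firms in (10, 20, 40):
        price_index = n_firms ** (1.0 / (1.0 - sigma)) * p
        print(f"N = {n_firms:3d}  composite price index = {price_index:.3f}")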
Abstract:
Prominent challenges facing nurse leaders are the growing shortage of nurses and the increasingly complex care required by acutely ill patients. In organizations, that shortage is exacerbated by turnover and intent to leave. Unsatisfactory working conditions are cited by nurses when they leave their current jobs. Disengagement from the job leads to plateaued performance, decreased organizational commitment, and increased turnover. Solutions to these challenges include methods both to retain nurses and to increase the effectiveness of each nurse. The specific aim of this study was to examine the relationships among organizational structures thought to foster the clinical development of the nurse, indicators of the development of clinical expertise, and the resulting outcomes of positive job attitudes and effectiveness. Causal loop modeling is incorporated as a systems tool to examine developmental cycles both for an organization and for an individual nurse, looking beyond singular events to investigate deeper patterns that emerge over time. The setting is an academic specialty-care institution, and the sample in this cross-sectional study consists of paired data from 225 RNs and their nurse managers. Two panels of survey instruments were created based on the model's theoretical variables, one completed by RNs and the other by their nurse managers. The RN survey panel examined the variables of structural empowerment, magnet essentials, knowledge as identified by the Benner developmental stage, psychological empowerment, job stage, engagement, intent to leave, job satisfaction and the early recognition of patient complications. The nurse manager survey panel examined the Benner developmental stage, job stage, and overall level of nursing performance. Four regression models were created based on the outcome variables. Each model identified significant organizational and individual characteristics that predicted higher job satisfaction, decreased intent to leave, greater effectiveness as measured by early recognition of and action upon subtle patient complications, and better job performance. Implications for improving job attitudes and effectiveness focus on ways that nursing leaders can foster a more empowering and healthy work environment.
Abstract:
Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as the Internet of Things (IoT) and the Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices, with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers applying this Internet-oriented approach need a solid understanding of specific platforms and web technologies. To ease this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at the hardware level. The feasibility of ROOD is demonstrated by building an adaptive health monitoring service for a Smart Gym.
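ROOD's modeling tools and code generators are not described in detail in this abstract; the snippet below is only a hypothetical sketch of the resource-oriented, model-driven idea, turning a toy smart-space description into REST-style resource stubs. The device names, paths and generator are invented and are not part of the ROOD toolchain.

    # Hypothetical sketch: a toy smart-space model (the kind of information an
    # ontology would capture) and a trivial generator emitting resource stubs.
    smart_gym_model = {
        "heart_rate_sensor": {"kind": "sensor",   "unit": "bpm"},
        "treadmill_display": {"kind": "actuator", "accepts": "text"},
    }

    def generate_resource_stubs(model):
        stubs = []
        for name, desc in model.items():
            verb = "GET" if desc["kind"] == "sensor" else "PUT"
            stubs.append(f"{verb} /things/{name}   # {desc}")
        return stubs

    for line in generate_resource_stubs(smart_gym_model):
        print(line)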
Abstract:
In the past few years IT outsourcing has gained a lot of importance in the market and, for example, the IT services outsourcing market is still growing every year. Now more than ever, organizations are increasingly becoming acquirers of needed capabilities by obtaining products and services from suppliers and developing fewer and fewer of these capabilities in-house. IT supplier selection is a complex and opaque decision problem. Managers facing a decision about IT supplier selection have difficulty framing what needs to be thought about further in their discussions. According to a study from the SEI (Software Engineering Institute) [40], 20 to 25 percent of large information technology (IT) acquisition projects fail within two years and 50 percent fail within five years. Mismanagement, poor requirements definition, lack of the comprehensive evaluations that could be used to identify the best candidates for outsourcing, inadequate supplier selection and contracting processes, insufficient technology selection procedures, and uncontrolled requirements changes are factors that contribute to project failure. The majority of project failures could be avoided if the acquirer learned to understand the decision problem, to perform better decision analysis, and to exercise good judgment. The main objective of this work is the development of a decision model for IT supplier selection that aims to reduce the number of failures seen in client-supplier relationships; most of these failures are caused by poor supplier selection on the client's side. Beyond the problems described above, the motivation for this work is the absence of any decision model based on a multi-model approach (a mixture of acquisition models and decision methods) for the problem of IT supplier selection. In the case study, nine Spanish companies were analyzed according to the IT supplier selection decision model developed in this work. Two software products were used in the case study: Expert Choice and D-Sight.
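Expert Choice is generally associated with the Analytic Hierarchy Process (AHP) and D-Sight with PROMETHEE-style outranking; the decision model itself is not reproduced in this abstract, so the following is only a minimal AHP sketch with invented criteria and judgments, not the weights used in the case study.

    import numpy as np

    # Hypothetical pairwise comparison matrix for three supplier-selection
    # criteria (say cost, technical capability, past performance) on Saaty's
    # 1-9 scale.  The judgments are invented for illustration.
    A = np.array([
        [1.0, 3.0, 5.0],
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ])

    # Priority weights: principal right eigenvector, normalised to sum to 1.
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()

    # Consistency ratio (CR below 0.10 is the usual acceptance threshold).
    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)
    cr = ci / 0.58                      # 0.58 = Saaty's random index for n = 3
    print("criterion weights:", np.round(weights, 3))
    print("consistency ratio:", round(cr, 3))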
Abstract:
There is currently great expectation about the introduction of new tools and methods for software product development that, in the near future, will allow an engineering approach to the software production process. The new methodologies now emerging imply an integral approach to the problem, covering all stages of the production scheme. However, the degree of automation achieved in the system construction process is very low and is concentrated on the last phases of the software life cycle, so the resulting cost reduction is not significant and, more importantly, the quality of the resulting software products is not guaranteed. This thesis defines a structured software development methodology that can be automated, that is, a CASE methodology. The methodology presented follows the CASE development cycle model, which consists of analysis, design and testing phases; its field of application is information systems. First, the basic principles on which the CASE methodology rests are established. Second, since the methodology starts from fixing the objectives of the company requesting an information system, techniques are employed for gathering and validating information which at the same time provide an easy communication language between end users and developers. These same techniques also detail all the system requirements completely, consistently and unambiguously. Likewise, a set of techniques and algorithms is presented so that, from the system requirements specification, both the logical design of the Process Model and that of the Data Model can be generated automatically, with both models validated against the prior requirements specification. Finally, formal procedures are defined that indicate the set of activities to be carried out in the construction process and how to carry them out, thereby achieving integrity across the different stages of the development process.
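The thesis's derivation algorithms are not given in the abstract; purely as a hypothetical illustration of the general idea of generating a logical Data Model from a requirements-level description, the snippet below maps a toy entity/relationship fragment onto relational table skeletons. The entity names and the propagation rule are invented and are not the thesis's notation or algorithm.

    # Hypothetical requirements fragment: entities with attributes and a
    # one-to-many relationship.
    entities = {
        "Customer": ["customer_id", "name", "address"],
        "Order":    ["order_id", "date", "total"],
    }
    relationships = [("Customer", "Order")]   # one Customer has many Orders

    def derive_tables(entities, relationships):
        """One table per entity; the key of the 'one' side is propagated to
        the 'many' side as a foreign key."""
        tables = {name: {"columns": list(attrs), "primary_key": attrs[0],
                         "foreign_keys": []}
                  for name, attrs in entities.items()}
        for one, many in relationships:
            fk = tables[one]["primary_key"]
            tables[many]["columns"].append(fk)
            tables[many]["foreign_keys"].append((fk, one))
        return tables

    for name, table in derive_tables(entities, relationships).items():
        print(name, table)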
Abstract:
The atrium incorporated into buildings has been a spatial resource that spread globally early on, being adopted by local architectures in much of the world. Its widespread adoption was favored first by the rapid evolution of steel and glass technology from the nineteenth century onwards, and later by the development of reinforced concrete. Another aspect that explains its acceptance in contemporary architecture is social: the striking cavity of the space spans large dimensions and thereby supports a multiplicity of interior uses that were previously unthinkable. Inside the atrium, daylight is key to the many activities it houses and is perhaps the most valued environmental condition, since it provides a sense of well-being by connecting us visually with the natural environment. For this reason, following the hypothetico-deductive method, the effects of the geometric configuration, the roof and the orientation on daylighting performance at the ground floor were evaluated, using a model extracted from the inventory of atrium buildings constructed in Santiago de Chile over the last 30 years, which is presented in Chapter 2. The quantitative analysis of the inventoried buildings is developed in Chapter 3, considering the dimensions of the atria; the construction aspects, materials and characteristics of the interior environment of each building were classified at the same time. At this stage the study variables for the geometric proportions of the atrium cavity were also identified: the plan aspect ratio (PAR), the section aspect ratio (SAR), and the cavity indices well index (WI), aspect ratio (AR) and room index (RI). The test model was extracted from the analysis of all these parameters. Chapter 4 focuses on daylighting: the relevant concepts and the behavior of light in the atrium were reviewed using a physical scale model built to record illuminance under the city's sunny sky. A model was then built in a virtual environment, relating the variables determined by the geometry of the cavity and the upper enclosure; different transparencies and opening proportions were examined, in effect evaluating a progressive closing of the openings and verifying the entry of daylight and its availability at floor level, with the aim of providing useful guidelines for the first stage of architectural design. For the daylighting analysis, different calculation methods were reviewed in order to evaluate illuminance levels on a horizontal plane inside the atrium. The first of these was the Daylight Factor (DF), the ratio of the illuminance at an interior evaluation point to the simultaneous exterior illuminance under an overcast sky; the results revealed the high luminosity of the city's overcast sky. The recent dynamic metrics were also evaluated, which, drawing on the city's extensive meteorological records, give the percentage of hours in which the required illuminance standard is met, known as Daylight Autonomy (DA), or, better still, in which the interior of the atrium remains within a range of visual comfort, the Useful Daylight Illuminance (UDI). Chapter 5 presents the criteria applied to the study model and each of the analysis variants, and details the background and sources of the climate records used in the simulations carried out with the Daysim program running on Radiance, which made it possible to evaluate the daylighting performance and the accuracy of each result and to check the availability of daylight through a matrix. The results were then discussed by comparing the data obtained with each of the simulation methodologies applied. Finally, the conclusions and future lines of work are presented: the former concern the predominance of the four-sided atrium, the influence of the control of the roof enclosure and its relationship with height, and indicate specifically that the illuminance measurements under the sunny summer sky helped to clarify the use of the simulation tool and the climate-based method which, given its recent development, points to future work deepening the dynamic evaluation of illuminance contrasted with the monitoring of built cases.
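The Daysim/Radiance runs and the thesis's evaluation matrix are not reproduced here; as a minimal sketch of how the three metrics named above are typically computed from an hourly illuminance series (the thresholds are common defaults, not necessarily those adopted in the thesis):

    # Minimal sketch of the daylight metrics named above, for one sensor point.
    def daylight_factor(e_interior_lux, e_exterior_lux):
        """DF (%): interior over simultaneous exterior horizontal illuminance,
        both under an overcast sky."""
        return 100.0 * e_interior_lux / e_exterior_lux

    def daylight_autonomy(hourly_lux, occupied, threshold=300.0):
        """DA (%): share of occupied hours at or above the target illuminance."""
        hours = [e for e, occ in zip(hourly_lux, occupied) if occ]
        return 100.0 * sum(e >= threshold for e in hours) / len(hours)

    def useful_daylight_illuminance(hourly_lux, occupied, low=100.0, high=2000.0):
        """UDI (%): share of occupied hours inside a visual-comfort band."""
        hours = [e for e, occ in zip(hourly_lux, occupied) if occ]
        return 100.0 * sum(low <= e <= high for e in hours) / len(hours)

    # Toy example: four occupied hours of simulated floor-level illuminance.
    lux = [50.0, 450.0, 1200.0, 2600.0]
    occ = [True, True, True, True]
    print(daylight_factor(420.0, 12000.0))        # 3.5 %
    print(daylight_autonomy(lux, occ))            # 75.0 %
    print(useful_daylight_illuminance(lux, occ))  # 50.0 %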
Abstract:
Dictyostelium discoideum cells initiate development when nutrients are depleted. DNA synthesis decreases rapidly thereafter but resumes during late aggregation, only in prespore cells. This observation has been previously interpreted as indicating progression of prespore cells through the cell cycle during development. We show that developmental DNA replication occurs only in mitochondria and not in nuclei. We also show that the prestalk morphogen known as differentiation-inducing factor 1 can inhibit mitochondrial respiration. A model is proposed for cell type divergence, based on competition to become prespores, that involves mitochondrial replication in prespore cells and reduction of mitochondrial activity in prestalk cells.
Abstract:
Introduction: Among the teaching and learning strategies used in pedagogical practice, Problem-Based Learning (PBL) has been in use since the 1960s, especially in medical courses. Although it is a valuable strategy, one of its obstacles is students' limited experience with self-directed activities, research and collective knowledge construction. Objective: To trace the constituent elements of PBL through data collected from articles retrieved from scientific publication sites, and to assess, in the selected studies, the positive and negative aspects related to the PBL methodology as applied to medical education in Brazil. Methodology: A bibliographic study of 13 texts using a deconstruction model called Discursive Textual Analysis (ATD), which consists of: breaking the articles into smaller units; textual analysis; identification of convergent and divergent patterns regarding PBL; and organization and synthesis of the data, culminating in the elaboration of an adaptive PBL strategy for the medical course. Results: 116 citations were found converging on positive references to the PBL methodology and 40 citations diverging from those positive points. Positive aspects such as the development of attitudes and skills, the development of competencies prior to the course, and positive effects after graduation, such as study autonomy and the articulation between curriculum and professional practice, represent points to be reinforced in class. By contrast, among the negative aspects, failure to understand the teacher's role as tutor, students' need for traditional formal content, and the expectation that the teacher will resolve their doubts are points to be avoided. Conclusions: The PBL methodology should serve as an active methodology to make the most of the skills students already have, enhancing learning in classroom medical education. Keywords: PBL; medical course; active methodology; medical education.
Abstract:
Aim To develop an appropriate dosing strategy for continuous intravenous infusions (CII) of enoxaparin by minimizing the percentage of steady-state anti-Xa concentration (C-ss) outside the therapeutic range of 0.5-1.2 IU ml⁻¹. Methods A nonlinear mixed effects model was developed with NONMEM for 48 adult patients who received CII of enoxaparin with infusion durations that ranged from 8 to 894 h at rates between 100 and 1600 IU h⁻¹. Three hundred and sixty-three anti-Xa concentration measurements were available from patients who received CII. These were combined with 309 anti-Xa concentrations from 35 patients who received subcutaneous enoxaparin. The effects of age, body size, height, sex, creatinine clearance (CrCL) and patient location [intensive care unit (ICU) or general medical unit] on pharmacokinetic (PK) parameters were evaluated. Monte Carlo simulations were used to (i) evaluate covariate effects on C-ss and (ii) compare the impact of different infusion rates on predicted C-ss. The best dose was selected based on the highest probability that the C-ss achieved would lie within the therapeutic range. Results A two-compartment linear model with additive and proportional residual error for general medical unit patients and only a proportional error for patients in ICU provided the best description of the data. CrCL and weight were found to significantly affect clearance and the volume of distribution of the central compartment, respectively. Simulations suggested that the best doses for patients in the ICU setting were 50 IU kg⁻¹ per 12 h (4.2 IU kg⁻¹ h⁻¹) if CrCL < 30 ml min⁻¹; 60 IU kg⁻¹ per 12 h (5.0 IU kg⁻¹ h⁻¹) if CrCL was 30-50 ml min⁻¹; and 70 IU kg⁻¹ per 12 h (5.8 IU kg⁻¹ h⁻¹) if CrCL > 50 ml min⁻¹. The best doses for patients in the general medical unit were 60 IU kg⁻¹ per 12 h (5.0 IU kg⁻¹ h⁻¹) if CrCL < 30 ml min⁻¹; 70 IU kg⁻¹ per 12 h (5.8 IU kg⁻¹ h⁻¹) if CrCL was 30-50 ml min⁻¹; and 100 IU kg⁻¹ per 12 h (8.3 IU kg⁻¹ h⁻¹) if CrCL > 50 ml min⁻¹. These best doses were selected based on providing the lowest equal probability of either being above or below the therapeutic range and the highest probability that the C-ss achieved would lie within the therapeutic range. Conclusion The dose of enoxaparin should be individualized to the patient's renal function and weight. There is some evidence to support slightly lower doses of CII enoxaparin in patients in the ICU setting.
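The NONMEM model and its covariate equations are not given in the abstract; purely as a back-of-the-envelope sketch, the dose-rate conversion quoted above and the linear steady-state relationship behind a continuous infusion look like this (the clearance value is invented for illustration, not an estimate from the study):

    # Back-of-the-envelope sketch: dose-rate conversion and the linear
    # steady-state relation for a continuous infusion.
    def rate_per_hour(dose_iu_per_kg, interval_h=12.0):
        """Convert an 'IU per kg per interval' dose into IU per kg per hour."""
        return dose_iu_per_kg / interval_h

    def steady_state_css(infusion_rate_iu_per_h, clearance_l_per_h):
        """For a linear PK model, Css = infusion rate / clearance (IU per L)."""
        return infusion_rate_iu_per_h / clearance_l_per_h

    print(round(rate_per_hour(50.0), 1))    # 4.2 IU kg^-1 h^-1, as quoted above
    print(round(rate_per_hour(100.0), 1))   # 8.3 IU kg^-1 h^-1

    # Example: a 70 kg patient on 5.0 IU kg^-1 h^-1 with an assumed anti-Xa
    # clearance of 0.6 L h^-1 (illustrative value only):
    rate = 5.0 * 70.0                       # 350 IU h^-1
    print(round(steady_state_css(rate, 0.6) / 1000.0, 2), "IU ml^-1")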
Abstract:
It is not surprising that students are unconvinced about the benefits of formal methods if we do not show them how these methods can be integrated with other activities in the software lifecycle. In this paper, we describe an approach to integrating formal specification with more traditional verification and validation techniques in a course that teaches formal specification and specification-based testing. This is accomplished through a series of assignments on a single software component that involves specifying the component in Object-Z, validating that specification using inspection and a specification animation tool, and then testing an implementation of the specification using test cases derived from the formal specification.
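The Object-Z specification of the course's component is not included in the abstract; as a loose, hypothetical illustration of the same idea, the snippet below gives a pre/postcondition-style "specification" of a bounded buffer and one test case derived directly from it. The component, its operations and the test are invented and are not the coursework artefacts.

    # Hypothetical illustration of specification-based testing: a tiny
    # specification (invariant + pre/postconditions) and a derived test case.
    CAPACITY = 3

    class BoundedBuffer:
        def __init__(self):
            self.items = []                     # state: a sequence, len <= CAPACITY

        def invariant(self):
            return len(self.items) <= CAPACITY

        def put(self, x):
            assert len(self.items) < CAPACITY   # precondition of Put
            before = list(self.items)
            self.items.append(x)
            assert self.items == before + [x]   # postcondition of Put
            assert self.invariant()

    # Test derived from the specification: Put on a full buffer must be rejected.
    buf = BoundedBuffer()
    for i in range(CAPACITY):
        buf.put(i)
    try:
        buf.put(99)
        print("FAIL: Put accepted on a full buffer, violating the specification")
    except AssertionError:
        print("PASS: Put rejected when the buffer is full, as specified")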
Abstract:
In this paper, we present a formal hardware verification framework linking ASM with MDG. ASM (Abstract State Machine) is a state-based language for describing transition systems. MDG (Multiway Decision Graphs) provides a symbolic representation of transition systems with support for abstract sorts and functions. We implemented a transformation tool that automatically generates MDG models from ASM specifications; formal verification techniques provided by the MDG tool, such as model checking or equivalence checking, can then be applied to the generated models. We support this work with a case study of an Island Tunnel Controller, whose behavior and structure were specified in ASM and then, using our ASM-MDG tool, successfully verified within the MDG tool.
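The ASM-MDG transformation is not shown in the abstract; as a rough, hypothetical sketch of the kind of flattening involved, the snippet below fires guarded-update rules, written in the spirit of ASM rules for a simple light controller, in every state of a small state space, producing an explicit transition relation. The state space, names and rules are invented and are not the Island Tunnel Controller model.

    from itertools import product

    # Hypothetical guarded-update rules in the spirit of an ASM specification
    # for a simple tunnel-light controller (invented for illustration).
    STATES = {"light": ["green", "red"], "car_waiting": [False, True]}

    def rules(state):
        """ASM-style rules: return the updates that fire in this state."""
        updates = {}
        if state["light"] == "green" and state["car_waiting"]:
            updates["light"] = "red"        # a waiting car turns the light red
        if state["light"] == "red" and not state["car_waiting"]:
            updates["light"] = "green"      # no waiting car: back to green
        return updates

    # Flattening: fire the rules once in every state, yielding an explicit
    # transition relation (the kind of intermediate model a translator emits).
    transitions = []
    for light, waiting in product(STATES["light"], STATES["car_waiting"]):
        s = {"light": light, "car_waiting": waiting}
        t = {**s, **rules(s)}
        transitions.append((tuple(s.values()), tuple(t.values())))

    for src, dst in transitions:
        print(src, "->", dst)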
Abstract:
In this paper we describe an approach to interface Abstract State Machines (ASM) with Multiway Decision Graphs (MDG) to enable tool support for the formal verification of ASM descriptions. ASM is a specification method for software and hardware providing a powerful means of modeling various kinds of systems. MDGs are decision diagrams based on abstract representation of data and are used primarily for modeling hardware systems. The notions of ASM and MDG are hence closely related to each other, making it appealing to link these two concepts. The proposed interface between ASM and MDG uses two steps: first, the ASM model is transformed into a flat, simple transition system as an intermediate model. Second, this intermediate model is transformed into the syntax of the input language of the MDG tool, MDG-HDL. We have successfully applied this transformation scheme on a case study, the Island Tunnel Controller, where we automatically generated the corresponding MDG-HDL models from ASM specifications.
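The MDG-HDL output and the MDG tool's algorithms are not reproduced here; as a self-contained toy sketch of what checking a safety property over a flat transition system of this kind amounts to, the snippet below runs an explicit-state reachability check. The states, transitions and property are invented and are not the Island Tunnel Controller.

    from collections import deque

    # Toy flat transition system (invented): states are (island_light, mainland_light).
    transitions = {
        ("red", "red"):   [("green", "red"), ("red", "green")],
        ("green", "red"): [("red", "red")],
        ("red", "green"): [("red", "red")],
    }
    initial = ("red", "red")

    def violates(state):
        """Safety property: the two lights are never green at the same time."""
        return state == ("green", "green")

    def check():
        """Explicit-state reachability: the essence of safety model checking."""
        seen, queue = {initial}, deque([initial])
        while queue:
            s = queue.popleft()
            if violates(s):
                return False
            for t in transitions.get(s, []):
                if t not in seen:
                    seen.add(t)
                    queue.append(t)
        return True

    print("property holds on all reachable states:", check())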
Abstract:
The following topics are dealt with: requirements engineering; components; design; formal specification analysis; education; model checking; human-computer interaction; software design and architecture; formal methods and components; software maintenance; software process; formal methods and design; server-based applications; review and testing; measurement; documentation; management and knowledge-based approaches.
Abstract:
There is an increasing emphasis on the use of software to control safety-critical plants for a wide range of applications. The importance of ensuring the correct operation of such potentially hazardous systems points to an emphasis on the verification of the system relative to a suitably secure specification. However, the process of verification is often made more complex by the concurrency and real-time considerations which are inherent in many applications. A response to this is the use of formal methods for the specification and verification of safety-critical control systems. These provide a mathematical representation of a system which permits reasoning about its properties. This thesis investigates the use of the formal method Communicating Sequential Processes (CSP) for the verification of a safety-critical control application. CSP is a discrete event based process algebra which has a compositional axiomatic semantics that supports verification by formal proof. The application is an industrial case study which concerns the concurrent control of a real-time high-speed mechanism. It is seen from the case study that the axiomatic verification method employed is complex. It requires the user to have a relatively comprehensive understanding of the nature of the proof system and the application. By making a series of observations the thesis notes that CSP possesses the scope to support a more procedural approach to verification in the form of testing. This thesis investigates the technique of testing and proposes the method of Ideal Test Sets. By exploiting the underlying structure of the CSP semantic model it is shown that for certain processes and specifications the obligation of verification can be reduced to that of testing the specification over a finite subset of the behaviours of the process.
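The Ideal Test Sets construction is not spelled out in the abstract; as a toy sketch of the underlying idea in the spirit of CSP's traces model, the snippet below checks a finite set of observed implementation traces, and all of their prefixes, against a specification. The processes and events are invented for illustration and are not the thesis's case study.

    # Toy sketch of trace-based testing in the spirit of CSP's traces model:
    # the specification is the set of traces it allows, and testing checks a
    # finite subset of the implementation's behaviours against it.
    def prefixes(trace):
        return [tuple(trace[:i]) for i in range(len(trace) + 1)]

    # SPEC = start -> stop -> SPEC   (start and stop must strictly alternate)
    def spec_allows(trace):
        expected = ["start", "stop"]
        return all(ev == expected[i % 2] for i, ev in enumerate(trace))

    # Finite test set: traces observed from the implementation under test.
    observed_traces = [
        ("start",),
        ("start", "stop"),
        ("start", "stop", "start"),
        ("start", "start"),             # faulty behaviour: two starts in a row
    ]

    for tr in observed_traces:
        ok = all(spec_allows(p) for p in prefixes(tr))
        print(tr, "allowed by SPEC" if ok else "VIOLATES SPEC")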