991 results for formal model
Abstract:
Identification and characterization of the problem. One of the most important problems associated with the construction of software is its correctness. In the quest to provide guarantees of correct software behaviour, a variety of development techniques with solid mathematical and logical foundations, known as formal methods, have emerged. Due to their nature, applying formal methods demands considerable experience and expertise, especially in mathematics and logic, which makes their application costly in practice. As a consequence, their use has been confined mainly to critical systems, i.e., systems whose malfunction can cause severe damage, even though the benefits these techniques provide are relevant to every kind of software. Carrying the benefits of formal methods over to software development contexts broader than critical systems would have a high impact on productivity in those contexts. Hypothesis. The availability of automated analysis tools is of great importance. Examples of this are several powerful analysis tools based on formal methods whose application targets source code directly. For the vast majority of these tools, the gap between the notions developers are accustomed to and those required to apply these formal analysis tools remains too wide. Many tools use assertion languages that lie outside developers' usual knowledge and habits. Moreover, in many cases the output produced by the analysis tool requires some command of the underlying formal method. This problem can be alleviated by producing suitable tools. Another problem intrinsic to automated analysis techniques is how they behave as the size and complexity of the artifacts under analysis grow (scalability). This limitation is widely known and is considered critical for the practical applicability of formal analysis methods. One way to attack this problem is to exploit information and characteristics of specific application domains. Objectives. This project aims at building formal analysis tools that contribute to the quality, in terms of functional correctness, of specifications, models and code in the context of software development. More precisely, we seek, on the one hand, to identify specific settings in which certain automated analysis techniques, such as SMT- or SAT-solving-based analysis or model checking, can be taken to scalability levels beyond those known for these techniques in general settings. We will attempt to implement the adaptations of the chosen techniques in tools that allow their use by developers familiar with the application context but not necessarily versed in the underlying methods or techniques. Materials and methods. The materials employed will be bibliography relevant to the area and computing equipment. Methods: those of discrete mathematics, logic, and software engineering. Expected results. One of the expected results of the project is the identification of specific application domains for formal analysis methods.
As a further result of the project, we expect analysis tools whose level of usability is adequate for application by developers without specific training in the underlying formal methods. Importance of the project. The main impact of this project will be its contribution to the practical application of formal analysis techniques at different stages of software development, with the aim of increasing software quality and reliability. A crucial factor for software quality is correctness. Traditionally, formal approaches to software development concentrate on functional correctness, and tackle this problem by relying on well-defined notations founded on solid mathematical grounds. This makes formal methods better suited for analysis, due to their precise semantics, but they are usually more complex, and require familiarity and experience with the manipulation of mathematical definitions. Their acceptance by software engineers is therefore rather restricted, and formal methods applications have been confined to critical systems. Nevertheless, the advantages that formal methods provide apply to any kind of software system. It is accepted that appropriate tool support for formal analysis is essential if one seeks to support software development based on formal methods. Indeed, some of the relatively recent successes of formal methods are accompanied by good-quality tools that automate powerful analysis mechanisms and are even integrated into widely used development environments. Still, most of these tools concentrate on code analysis, and in many cases they are far from being simple enough to be employed by software engineers without experience in formal methods. Another important obstacle to the adoption of tool support for formal methods is scalability. Automated software analysis is intrinsically complex, and thus techniques do not scale well in the general case. In this project, we will attempt to identify particular modelling, design, specification or coding activities in software development processes where automated formal analysis techniques can be applied. By focusing on very specific application domains, we expect to find characteristics that can be exploited to increase the scalability of the corresponding analyses, compared to the general case.
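As an illustration of the kind of SMT-based code analysis the project envisions, the following minimal sketch uses the Z3 solver's Python bindings (the choice of Z3 and the property being checked are assumptions of this illustration, not part of the project): it searches for a counterexample to a simple arithmetic property of a code fragment.

    # Minimal sketch of SMT-based analysis (illustrative; the project
    # text does not prescribe Z3 or this property).  pip install z3-solver
    from z3 import Int, Solver, Not, Implies, And, sat

    x, y = Int('x'), Int('y')

    # Claimed property of a code fragment: if 0 <= x < y, the integer
    # midpoint (x + y) / 2 stays within [x, y).
    mid = (x + y) / 2          # integer division in Z3's integer theory
    prop = Implies(And(x >= 0, x < y), And(mid >= x, mid < y))

    s = Solver()
    s.add(Not(prop))           # ask the solver for a counterexample
    if s.check() == sat:
        print("counterexample:", s.model())
    else:
        print("property holds for all integers x, y")

If the solver reports the negated property unsatisfiable, the property holds for all inputs, which is the kind of guarantee beyond testing that the abstract refers to.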
Abstract:
In this paper, I provide a formal justification for a well-established coattail effect, whereby a popular candidate for one branch of government attracts votes to candidates from the same political party for other branches of government. A political agency framework with moral hazard is applied to analyze coattails in simultaneous presidential and congressional elections. I show that coattail voting is a natural outcome of the optimal reelection scheme adopted by a representative voter to motivate politicians' efforts in a retrospective voting environment. I assume that an office-motivated politician (executive or congressman) prefers her counterpart to be affiliated with the same political party. This correlation of incentives leads the voter to adopt a joint performance evaluation rule, which is conditioned on the politicians belonging to the same party or different parties. Two-sided coattail effects then arise. On the one hand, the executive's success/failure props up/drags down her partisan ally in the congressional election, which implies presidential coattails. On the other hand, the executive's reelection itself is affected by the congressman's performance, which results in reverse coattails. JEL classification: D72, D86. Keywords: Coattail voting; Presidential coattails; Reverse coattails; Simultaneous elections; Political agency; Retrospective voting.
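As a schematic rendering of the joint performance evaluation rule described above (the notation is illustrative, not the paper's): with performance outputs y_E for the executive and y_C for the congressman, and both incumbents from the same party, a single cutoff rule over aggregate performance produces both effects at once.

    % Illustrative cutoff rule; not the paper's actual notation.
    \[
    r(y_E, y_C) =
    \begin{cases}
    \text{reelect both} & \text{if } y_E + y_C \ge k,\\
    \text{replace both} & \text{otherwise.}
    \end{cases}
    \]

Under such a rule, a high y_E lifts the partisan ally over the threshold (presidential coattails), while a high y_C can secure the executive's reelection (reverse coattails).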
Abstract:
Roughly fifteen years ago, the Church of Jesus Christ of Latter-day Saints published a new proposed standard file format, which they call GEDCOM. It was designed to allow different genealogy programs to exchange data. Five years later, in May 2000, the GENTECH Data Modeling Project appeared, with the support of the Federation of Genealogical Societies (FGS) and other American genealogical societies. It attempted to define a genealogical logical data model to facilitate data exchange between different genealogy programs. Although genealogists deal with an enormous variety of data sources, one of the central concepts of this data model was that all genealogical data could be broken down into a series of short, formal genealogical statements. This was something more versatile than merely exporting and importing data records with predefined fields. The project was finally absorbed in 2004 by the National Genealogical Society (NGS). Despite being a genealogical reference for many applications, these models have serious drawbacks when adapting to different cultural and social environments. At present we have no formal proposal for a recognized standard to represent the family domain. Here we propose an alternative conceptual model, largely inherited from the aforementioned models. The design is intended to overcome their limitations. However, its major innovation lies in applying the ontological paradigm when modeling statements and entities.
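A minimal sketch, in Python, of what such short, formal genealogical statements might look like when entities and assertions are modelled ontologically; the class and identifiers below are invented for illustration and are not part of GEDCOM, GENTECH, or the proposed model.

    # Sketch of genealogical data as short formal statements
    # (names are illustrative; not the GENTECH or proposed schema).
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Statement:
        subject: str    # entity identifier, e.g. a person
        predicate: str  # relation or attribute
        obj: str        # value or another entity
        source: str     # the record the statement was extracted from

    # One census entry decomposed into independent statements:
    statements = [
        Statement("person/p1", "name", "Maria Pujol", "census/1900/bcn#42"),
        Statement("person/p1", "born-in", "place/barcelona", "census/1900/bcn#42"),
        Statement("person/p1", "child-of", "person/p2", "census/1900/bcn#42"),
    ]

    # Because each statement carries its source, assertions from different
    # records about the same entity can be compared or merged without
    # forcing them into fixed record fields.
    for s in statements:
        print(f"{s.subject} --{s.predicate}--> {s.obj}  [{s.source}]")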
Abstract:
Education is, today, an important activity for museum institutions. The educational work of museums can be considered, above all, non-formal education. ICTs put a multitude of tools within museums' reach. The use made of them varies from one institution to another; we will see this through case studies of three museum institutions. Museums themselves, working groups, authors, administrations and others have identified the need to analyze the use museum institutions make of ICTs for educational purposes, and growing importance is given to having tools and methodologies that are as homogeneous as possible for analyzing this use. The goal of this work goes in that direction: to evaluate the use and also, indispensably, the actual educational outcomes. I have chosen to structure the information in a computer program. The justification is twofold: to simplify the processes of acquiring, managing and evaluating the information, and, on the other hand, to ensure, through the implementation, that the conceptual structure is robust. This has been reasonably achieved, in the awareness, however, that more important than the program itself (which is why its internal workings are not explored in depth) is the data structure used; to reach this result it was necessary to reflect on certain concepts, and part of these reflections has been included because they are the foundations of the data model used.
Abstract:
This paper analyses the effect of unmet formal care needs on informal caregiving hours in Spain using the two waves of the Informal Support Survey (1994, 2004). Testing for double sample selection arising from formal care receipt and the emergence of unmet needs provides evidence that omitting either variable would cause underestimation of the number of informal caregiving hours. After controlling for these two factors, the number of hours of care increases with both the degree of dependency and unmet needs. More importantly, in the presence of unmet needs, the number of informal caregiving hours increases when some formal care is received. This result refutes the substitution model and supports complementarity or task specificity between the two types of care. For a given combination of formal care and unmet needs, informal caregiving hours increased between 1994 and 2004. Finally, in the model for 2004, the selection term associated with the unmet needs equation is larger than that of the formal care equation, suggesting that using the number of formal care recipients as a quality indicator may be misleading unless this information is complemented with other quality indicators.
Abstract:
The remarkable growth of the older population has moved long-term care to the front ranks of the social policy agenda. Understanding the factors that determine the type and amount of formal care is important for predicting use in the future and developing long-term policy. In this context, we jointly analyze the choice of care (formal, informal, both together, or none) as well as the number of hours of care received. Given that the number of hours of care is not independent of the type of care received, we estimate, for the first time in this area of research, a sample selection model with the particularity that the first step is a multinomial logit model. With regard to the debate about complementarity or substitutability between formal and informal care, our results indicate that formal care acts as a reinforcement of family care in certain cases: for very old care recipients, when the individual has multiple disabilities, when many care hours are provided, and in cases of mental illness and/or dementia. There are substantial differences between the long-term care provided to younger and to older dependent people, and dependent women are at risk of becoming more vulnerable to the shortage of informal caregivers in the future. Finally, we document great disparities in the availability of public social care across regions.
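A minimal sketch of such a two-step estimator on synthetic data; the Dubin–McFadden-style correction terms and all numbers are assumptions of this illustration, since the abstract does not state which correction the authors use.

    # Sketch: sample selection with a multinomial logit first step
    # (Dubin–McFadden-style correction terms; the study may use a
    # different correction). Synthetic data for illustration only.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 3000
    z = rng.normal(size=(n, 2))                   # observed covariates
    Z = sm.add_constant(z)

    B = np.array([[0.0, 0.0, 0.0],                # type 0: no care (base)
                  [0.2, 0.8, -0.4],               # type 1: informal care
                  [-0.1, -0.3, 0.9]])             # type 2: formal care
    u = Z @ B.T + rng.gumbel(size=(n, 3))         # latent utilities
    choice = u.argmax(axis=1)
    hours = 5.0 + 1.5 * z[:, 0] + rng.normal(size=n)

    # Step 1: multinomial logit over care types.
    mnl = sm.MNLogit(choice, Z).fit(disp=0)
    P = np.asarray(mnl.predict(Z))                # (n, 3) probabilities

    # Step 2: hours regression on the formal-care subsample, augmented
    # with Dubin–McFadden selection terms for the non-chosen types.
    j = 2
    sel = choice == j
    corr = np.column_stack([
        P[sel, k] * np.log(P[sel, k]) / (1.0 - P[sel, k]) + np.log(P[sel, j])
        for k in range(3) if k != j
    ])
    X2 = np.column_stack([Z[sel], corr])
    print(sm.OLS(hours[sel], X2).fit().params)

The significance of the correction coefficients signals whether the unobservables driving the choice of care type also drive hours, i.e., whether selection matters for the hours equation.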
Abstract:
The aim of the paper is to describe some of the challenges faced by schools, or by formal education in general, as a consequence of today's mobile-centric society (henceforth MCS), the term we will use to denote the new, networked learning ecology that has arisen from the massive penetration of digital media into everyday life. After revisiting some of the ideas of McLuhan and Vygotsky in the light of this new technological scenario, we describe five traits of the MCS and, illustrated through educational practices, the challenges we believe schools will face if they wish to preserve their function of individualization and socialization. We believe that despite the emergence of the MCS, the main function of the school is still to provide the "box of tools" (a set of psychological instruments, such as reading, writing, mathematical notation, digital literacy, etc.) that enables people to develop their learning skills and life projects and to become part of communities and groups. However, the complexity and mobility of the new learning environments mean that the position held by schools needs to be reevaluated in the face of the informal learning paths and experiences, both online and offline, to which learners now have access. We also need to reevaluate the meaning of the school itself as an institution and the model of learner it should be training.
Abstract:
The design methods and languages targeted at modern System-on-Chip designs face tremendous pressure from ever-increasing complexity, power, and speed requirements. To estimate any of these three metrics, there is a trade-off between accuracy and the level of abstraction at which the system under design is analyzed. The more detailed the description, the more accurate the simulation will be, but, on the other hand, the more time-consuming it will be. Moreover, a designer wants to make decisions as early as possible in the design flow to avoid costly design backtracking. To answer the challenges posed by System-on-Chip designs, this thesis introduces a formal, power-aware framework, its development methods, and methods to constrain and analyze the power consumption of the system under design. The thesis discusses power analysis of synchronous and asynchronous systems, not forgetting the communication aspects of these systems. The presented framework is built upon the Timed Action System formalism, which offers an environment to analyze and constrain the functional and temporal behavior of a system at a high abstraction level. Furthermore, due to the complexity of System-on-Chip designs, the possibility to abstract away unnecessary implementation details at higher abstraction levels is an essential part of the introduced design framework. The encapsulation and abstraction techniques, combined with procedure-based communication, allow a designer to use the presented power-aware framework to model these large-scale systems. The introduced techniques also enable one to subdivide the development of communication and computation into separate tasks. This property is taken into account in the power analysis part as well. Furthermore, the presented framework is developed in such a way that it can be used throughout a design project. In other words, a designer is able to model and analyze systems from an abstract specification down to an implementable specification.
Abstract:
Contemporary logic is confined to a few paradigmatic attitudes such as belief, knowledge, desire and intention. My purpose is to present a general model-theoretic semantics of propositional attitudes of any cognitive or volitive mode. In my view, one can recursively define the set of all psychological modes of attitudes. As Descartes anticipated, the two primitive modes are those of belief and desire. Complex modes are obtained by adding to primitive modes special cognitive and volitive ways, special propositional content conditions, or preparatory conditions. According to the standard logic of attitudes (Hintikka), human agents are either perfectly rational or totally irrational. I will proceed to a finer analysis of propositional attitudes that accounts for our imperfect but minimal rationality. For that purpose I will use a non-standard predicative logic according to which propositions with the same truth conditions can have different cognitive values, and I will explicate subjective in addition to objective possibilities. Next I will enumerate valid laws of my general logic of propositional attitudes. At the end I will state principles according to which minimally rational agents dynamically revise attitudes of any mode.
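For reference, the standard Hintikka-style truth clause that the paper refines reads as follows; under it, belief is closed under logical consequence, which is precisely the idealization of perfect rationality at issue.

    % Standard Hintikka-style clause for belief (the view being refined):
    \[
    M, w \models B_a \varphi
    \quad\Longleftrightarrow\quad
    \forall v \,(w R_a v \Rightarrow M, v \models \varphi)
    \]

A predicative logic in which propositions with the same truth conditions may differ in cognitive value blocks the inference from B_a \varphi to B_a \psi for merely logically equivalent \psi, yielding minimally rather than perfectly rational agents.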
Abstract:
Prerequisites and effects of proactive and preventive psycho-social student welfare activities in Finnish preschool and elementary school were the focus of the present thesis. So far, Finnish student welfare work has mainly focused on interventions and individuals, and the many possibilities to enhance the well-being of all students as a part of everyday school work have not been fully exploited. Consequently, three goals were set for this thesis: (1) to present concrete examples of proactive and preventive psycho-social student welfare activities in Finnish basic education; (2) to investigate measurable positive effects of proactive and preventive activities; and (3) to investigate the implementation of proactive and preventive activities in ecological contexts. Two prominent phenomena in the preschool and elementary school years—transition to formal schooling and school bullying—were chosen as examples of critical situations that are appropriate targets for proactive and preventive psycho-social student welfare activities. Until lately, the procedures concerning both school transitions and school bullying have been rather problem-focused and reactive in nature. Theoretically, we lean on the bioecological model of development by Bronfenbrenner and Morris, with its concentric micro-, meso-, exo- and macrosystems. Data were drawn from two large-scale research projects: the longitudinal First Steps Study: Interactive Learning in the Child–Parent–Teacher Triangle, and the Evaluation Study of the National Antibullying Program KiVa. In Study I, we found that the academic skills of children from preschool–elementary school pairs that implemented several supportive activities during the preschool year developed more quickly from preschool to Grade 1 compared with the skills of children from pairs that used fewer practices. In Study II, we focused on possible effects of proactive and preventive actions on teachers and found that participation in the KiVa antibullying program influenced teachers' self-evaluated competence to tackle bullying. In Studies III and IV, we investigated factors that affect the implementation rate of these proactive and preventive actions. In Study III, we found that the principal's commitment and support for antibullying work has a clear-cut positive effect on implementation adherence of the student lessons of the KiVa antibullying program. The more teachers experience support for and commitment to antibullying work from their principal, the more they report having covered KiVa student lessons and topics. In Study IV, we wanted to find out why some schools implement several useful and inexpensive transition practices, whereas other schools use only a few of them. We were interested in broadening the scope and looking at local-level (exosystem) qualities, and, in fact, local-level activities and guidelines, along with the teacher-reported importance of the transition practices, were the only factors significantly associated with the implementation rate of transition practices between elementary schools and partner preschools. The teacher- and school-level factors available in this study turned out to be mostly non-significant. To summarize, the results confirm that school-based promotion and prevention activities may have beneficial effects not only on students but also on teachers. Second, various top-down processes, such as engagement at the level of elementary school principals or local administration, may enhance the implementation of these beneficial activities.
The main message is that when aiming to support the lives of children, the primary focus should be on adults. In the future, the promotion of psychosocial well-being and the intrinsic value of inter- and intrapersonal skills need to be strengthened in the Finnish educational systems. Future research efforts in student welfare and school psychology, as well as focused training for psychologists in educational contexts, should be encouraged in the departments of psychology and education in Finnish universities. Moreover, a specific research centre for school health and well-being should be established.
Abstract:
The purpose of this doctoral thesis is to widen and develop our theoretical frameworks for the discussion and analysis of feedback practices in management accounting, particularly shedding light on their formal and informal aspects. The concept of feedback in management accounting has conventionally been analyzed within cybernetic control theory, in which feedback flows as a diagnostic or comparative loop between measurable outputs and pre-set goals (see e.g. Flamholtz et al. 1985; Flamholtz 1996, 1983), i.e. as a formal feedback loop. However, everyday feedback practices in organizations are combinations of formal and informal elements. In addition to technique-driven feedback approaches (like budgets, measurement, and reward systems), we can also identify social feedback practices that managers see as relevant and effective in the pursuit of organizational control. While cybernetics and control theories successfully capture the rational and measured aspects of organizational performance and offer a broad organizational context for the analysis, many individual and informal aspects remain vague and isolated. In order to discuss and make sense of the heterogeneous field of interpretations of formal and informal feedback, both in theory and practice, dichotomous approaches seem insufficient. Therefore, I suggest an analytical framework of formal and informal feedback with three dimensions (3Ds): source, time, and rule. Based on an abductive analysis of the theoretical and empirical findings from an interpretive case study around a business unit called Division Steelco, the 3D framework and formal and informal feedback practices are further elaborated vis-à-vis the four thematic layers of the organizational control model by Flamholtz et al. (1985; Flamholtz 1996, 1983): the core control system, organizational structure, organizational culture, and external environment. The various personal and cultural meanings given to formal and informal feedback practices ("feedback as something") create multidimensional interpretative contexts. Multidimensional frameworks aim to capture and better understand both the variety of interpretations and their implications for the functionality of feedback practices, which is important in interpretive research.
Abstract:
Today's networked systems are becoming increasingly complex and diverse. Current simulation and runtime verification techniques do not support developing such systems efficiently; moreover, the reliability of the simulated or verified systems is not thoroughly ensured. To address these challenges, the use of formal techniques to reason about networked system development is growing; at the same time, the mathematical background necessary for using formal techniques is a barrier that keeps network designers from employing them efficiently. Thus, these techniques are not widely used for developing networked systems. The objective of this thesis is to propose formal approaches for the development of reliable networked systems while taking efficiency into account. With respect to reliability, we propose the architectural development of correct-by-construction networked system models. With respect to efficiency, we propose reusable network architectures as well as reusable development steps. At the core of our development methodology, we employ abstraction and refinement techniques for the development and analysis of networked systems. We evaluate our proposal by applying the proposed architectures to a pervasive class of dynamic networks, namely wireless sensor network architectures, as well as to a pervasive class of static networks, namely network-on-chip architectures. The ultimate goal of our research is to put forward the idea of building libraries of pre-proved rules for the efficient modelling, development, and analysis of networked systems. We take into account both qualitative and quantitative analysis of networks via varied formal tool support, using a theorem prover (the Rodin platform) and a statistical model checker (Uppaal SMC).
Abstract:
Resilience is the property of a system to remain trustworthy despite changes. Changes of different natures, whether due to failures of system components or varying operational conditions, significantly increase the complexity of system development. Therefore, advanced development technologies are required to build robust and flexible system architectures capable of adapting to such changes. Moreover, powerful quantitative techniques are needed to assess the impact of these changes on various system characteristics. Architectural flexibility is achieved by embedding into the system design the mechanisms for identifying changes and reacting to them. Hence a resilient system should have both advanced monitoring and error detection capabilities to recognise changes, as well as sophisticated reconfiguration mechanisms to adapt to them. The aim of such reconfiguration is to ensure that the system stays operational, i.e., remains capable of achieving its goals. Design, verification and assessment of system reconfiguration mechanisms is a challenging and error-prone engineering task. In this thesis, we propose and validate a formal framework for the development and assessment of resilient systems. Such a framework provides us with the means to specify and verify complex component interactions, model their cooperative behaviour in achieving system goals, and analyse the chosen reconfiguration strategies. Due to the variety of properties to be analysed, such a framework should have an integrated nature. To ensure the system's functional correctness, it should rely on formal modelling and verification, while, to assess the impact of changes on such properties as performance and reliability, it should be combined with quantitative analysis. To ensure the scalability of the proposed framework, we choose Event-B as the basis for reasoning about functional correctness. Event-B is a state-based formal approach that promotes the correct-by-construction development paradigm and formal verification by theorem proving. Event-B has mature, industrial-strength tool support: the Rodin platform. Proof-based verification, as well as the reliance on abstraction and decomposition adopted in Event-B, provides designers with powerful support for the development of complex systems. Moreover, top-down system development by refinement allows developers to explicitly express and verify critical system-level properties. Besides ensuring functional correctness, to achieve resilience we also need to analyse a number of non-functional characteristics, such as reliability and performance. Therefore, in this thesis we also demonstrate how formal development in Event-B can be combined with quantitative analysis. Namely, we experiment with the integration of such techniques as probabilistic model checking in PRISM and discrete-event simulation in SimPy with formal development in Event-B. Such an integration allows us to assess how changes and different reconfiguration strategies affect the overall system resilience. The approach proposed in this thesis is validated by a number of case studies from areas such as robotics, space, healthcare and cloud computing.
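As a flavour of the quantitative side mentioned above, here is a minimal discrete-event simulation sketch in SimPy (the parameters are invented for illustration and are not from the thesis) that estimates the availability of a component subject to failures and a reconfiguration delay.

    # Minimal SimPy sketch: availability under failures and
    # reconfiguration (illustrative parameters, not from the thesis).
    # pip install simpy
    import random
    import simpy

    MTBF, RECONF = 100.0, 5.0   # mean time between failures, repair time
    HORIZON = 100_000

    downtime = 0.0

    def component(env):
        global downtime
        while True:
            yield env.timeout(random.expovariate(1 / MTBF))  # run until failure
            start = env.now
            yield env.timeout(RECONF)                        # reconfigure
            downtime += env.now - start

    random.seed(1)
    env = simpy.Environment()
    env.process(component(env))
    env.run(until=HORIZON)
    print(f"estimated availability: {1 - downtime / HORIZON:.4f}")

With these numbers the estimate approaches the analytic value MTBF / (MTBF + RECONF), roughly 0.952; in an integrated framework, such estimates complement the functional guarantees obtained by proof in Event-B.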
Abstract:
About ten years ago, triadic contexts were presented by Lehmann and Wille as an extension of Formal Concept Analysis. However, they have rarely been used up to now, which may be due to the rather complex structure of the resulting diagrams. In this paper, we go one step back and discuss how traditional line diagrams of standard (dyadic) concept lattices can be used for exploring and navigating triadic data. Our approach is inspired by the slice & dice paradigm of On-Line Analytical Processing (OLAP). We recall the basic ideas of OLAP and show how they may be transferred to triadic contexts. For modeling the navigation patterns a user might follow, we use the formalism of finite state machines. In order to demonstrate the benefits of our model, we show how it can be used for navigating the IT Baseline Protection Manual of the German Federal Office for Information Security.
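A minimal sketch of the dyadic machinery the paper builds on: the two derivation operators of Formal Concept Analysis applied to a dyadic context obtained by slicing a small triadic context at one condition, in the OLAP spirit described above (all object, attribute, and condition names are invented).

    # Sketch: FCA derivation operators on a dyadic context obtained by
    # slicing a triadic context at one condition (names are invented).
    # Triadic incidence: (object, attribute, condition)
    triadic = {
        ("srv1", "firewall", "baseline"),
        ("srv1", "backup",   "baseline"),
        ("srv2", "firewall", "baseline"),
        ("srv2", "backup",   "extended"),
    }

    def slice_at(condition):
        """Fix one condition -> dyadic context (the OLAP-style 'slice')."""
        return {(g, m) for (g, m, c) in triadic if c == condition}

    I = slice_at("baseline")
    objects = {g for g, _ in I}
    attributes = {m for _, m in I}

    def intent(objs):   # attributes shared by all objects in objs
        return {m for m in attributes if all((g, m) in I for g in objs)}

    def extent(attrs):  # objects having all attributes in attrs
        return {g for g in objects if all((g, m) in I for m in attrs)}

    A = {"srv1", "srv2"}
    print(intent(A), extent(intent(A)))

The pair (extent(intent(A)), intent(A)) is always a formal concept, and the line diagrams of the lattice of all such pairs for each slice are what the paper proposes for navigating triadic data.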
Abstract:
This document was produced with the aim of first understanding the global and national view of the importance of ICT, and then situating it in a particular setting, essentially the textile and apparel sector of Bogotá. Developing this knowledge makes it possible to understand the software and hardware integration projects that companies, in this case the SMEs of the textile sector, carry out to improve certain aspects of the areas of their organizations. Over the course of this work, the existing models and processes in this field are presented, so as to enable an Information and Communication Technologies acquisition project to be carried out in an SME of the textile sector.