899 results for Calculus, Operational
Abstract:
The real-time refinement calculus is an extension of the standard refinement calculus in which programs are developed from a precondition plus postcondition style of specification. In addition to adapting standard refinement rules to be valid in the real-time context, specific rules are required for the timing constructs, such as delays and deadlines. Because many real-time programs may be nonterminating, a further extension is to allow nonterminating repetitions. A real-time specification constrains not only what values should be output, but when they should be output. Hence, for a program to implement such a specification, it must guarantee to output values by the specified times. With standard programming languages such guarantees cannot be made without taking into account the timing characteristics of the implementation of the program on a particular machine. To avoid having to consider such details during the refinement process, we have extended our real-time programming language with a deadline command. The deadline command takes no time to execute and always guarantees to meet the specified time; if the deadline has already passed, the deadline command is infeasible (miraculous in Dijkstra's terminology). When such a real-time program is compiled for a particular machine, one needs to ensure that all execution paths leading to a deadline are guaranteed to reach it by the specified time. We consider this checking as part of an extended compilation phase. The addition of the deadline command restores for the real-time language the advantage of machine independence enjoyed by non-real-time programming languages.
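The deadline command described in the abstract admits a compact weakest-precondition reading. The following is a hedged sketch, not the authors' exact formulation: assume `τ` denotes the current time, and recall that an infeasible (miraculous) command satisfies any postcondition.

```latex
% Sketch of a wp semantics for the deadline command, under the
% assumption that \tau is the current time at the point of execution.
% deadline D takes no time: if \tau \le D it behaves as skip;
% if \tau > D it is infeasible, hence establishes any postcondition.
wp(\mathbf{deadline}\ D,\; Q) \;\equiv\; (\tau \le D \;\Rightarrow\; Q)
```

On this reading, the extended compilation phase discharges the proviso `τ ≤ D` by bounding the execution time of every path leading to the deadline.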
Abstract:
We define a language and a predicative semantics to model concurrent real-time programs. We consider different communication paradigms between the concurrent components of a program: communication via shared variables and asynchronous message passing (for different models of channels). The semantics is the basis for a refinement calculus to derive machine-independent concurrent real-time programs from specifications. We give some examples of refinement laws that deal with concurrency.
Abstract:
Purpose - The purpose of this study is to develop a performance measurement model for service operations using the analytic hierarchy process approach. Design/methodology/approach - The study reviews current relevant literature on performance measurement and develops a model for performance measurement. The model is then applied to the intensive care units (ICUs) of three different hospitals in developing nations. Six focus group discussions were undertaken, involving experts from the specific area under investigation, in order to develop an understandable performance measurement model that was both quantitative and hierarchical. Findings - A combination of outcome-, structure- and process-based factors was used as a foundation for the model. The analyses of the links between them were used to reveal the relative importance of each and their associated sub-factors. Stakeholders considered it an effective quantitative tool. Research limitations/implications - This research only applies the model to ICUs in healthcare services. Practical implications - Performance measurement is an important area within the operations management field. Although numerous models are routinely being deployed both in practice and research, there is always room for improvement. The present study proposes a hierarchical quantitative approach, which considers both subjective and objective performance criteria. Originality/value - This paper develops a hierarchical quantitative model for service performance measurement. It considers success factors with respect to outcomes, structure and processes with the involvement of the concerned stakeholders, based upon the analytic hierarchy process approach. The model is applied to the ICUs of hospitals in order to demonstrate its effectiveness, and the application provides a comparative international study of service performance measurement in the ICUs of hospitals in three different countries. © Emerald Group Publishing Limited.
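The core computation in the analytic hierarchy process is deriving priority weights from a pairwise-comparison matrix via its principal eigenvector. The sketch below illustrates this with power iteration; the criteria names and comparison values are hypothetical, not taken from the study.

```python
# Minimal AHP sketch: derive priority weights from a positive reciprocal
# pairwise-comparison matrix using power iteration (principal eigenvector).
# The matrix entries below are illustrative, not data from the study.

def ahp_weights(matrix, iterations=100):
    """Approximate the principal eigenvector of a positive reciprocal
    matrix and normalise it so the weights sum to 1."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iterations):
        # One power-iteration step: multiply matrix by current estimate.
        w_new = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w_new)
        w = [x / total for x in w_new]
    return w

# Hypothetical example: three criteria (outcome, structure, process)
# compared pairwise on Saaty's 1-9 scale.
pairwise = [
    [1.0,   3.0, 5.0],
    [1/3.0, 1.0, 2.0],
    [1/5.0, 0.5, 1.0],
]
weights = ahp_weights(pairwise)
```

With these illustrative judgements, the outcome criterion receives the largest weight; in a full AHP study one would also compute a consistency ratio before accepting the judgements.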
Abstract:
The International Cooperation Agency (identified in this article as IDEA) working in Colombia is one of the most important in Colombian society, with programs that support gender rights, human rights, justice and peace, scholarships, the aboriginal population, youth, the afro-descendant population, economic development in communities, and environmental development. The identified problem stems from the diversified offer of services, collaboration and social intervention, which requires diverse groups of people with multiple agendas, ways of supporting their mandates, disciplines, and professional competences. Knowledge creation and the growth and sustainability of the organization can be endangered by a silo culture and the resulting reduced leverage of the separate groups' capabilities. Organizational memory is generally formed by the tacit knowledge of the organization's members, given the value of the accumulated experience that this kind of social work implies. Its loss is therefore a strategic and operational risk, since most problem interventions rely on direct work in the socio-economic field and on living real experiences with communities. The knowledge management solution presented in this article starts, first, with the identification of the people and groups concerned and the creation of a knowledge map as a means to strengthen the ties between organizational members; second, with the introduction of a content management system designed to support the documentation and knowledge sharing processes; and third, with a methodology for the adaptation of a Balanced Scorecard based on the knowledge management processes. These three main steps lead to a knowledge management "solution" that has been implemented in the organization, comprising three components: a knowledge management system, training support and promotion of cultural change.
Abstract:
The traditional waterfall software life cycle model has several weaknesses. One problem is that a working version of a system is unavailable until a late stage in the development; any omissions and mistakes in the specification undetected until that stage can be costly to maintain. The operational approach which emphasises the construction of executable specifications can help to remedy this problem. An operational specification may be exercised to generate the behaviours of the specified system, thereby serving as a prototype to facilitate early validation of the system's functional requirements. Recent ideas have centred on using an existing operational method such as JSD in the specification phase of object-oriented development. An explicit transformation phase following specification is necessary in this approach because differences in abstractions between the two domains need to be bridged. This research explores an alternative approach of developing an operational specification method specifically for object-oriented development. By incorporating object-oriented concepts in operational specifications, the specifications have the advantage of directly facilitating implementation in an object-oriented language without requiring further significant transformations. In addition, object-oriented concepts can help the developer manage the complexity of the problem domain specification, whilst providing the user with a specification that closely reflects the real world and so the specification and its execution can be readily understood and validated. A graphical notation has been developed for the specification method which can capture the dynamic properties of an object-oriented system. A tool has also been implemented comprising an editor to facilitate the input of specifications, and an interpreter which can execute the specifications and graphically animate the behaviours of the specified systems.
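The thesis's specification method uses its own graphical notation, but the idea of an executable, object-oriented operational specification can be sketched by rough analogy: objects whose methods are guarded state transitions, which can be "executed" to generate a behaviour trace for early validation. The class and states below are invented for illustration.

```python
# Hedged analogy (not the thesis's notation): an operational specification
# of a lift door, written as a class whose methods are guarded transitions.
# Exercising the specification produces a behaviour trace that a user can
# inspect (or a tool could animate) to validate requirements early.

class DoorSpec:
    """Operational specification of a door; illegal transitions are rejected."""

    def __init__(self):
        self.state = "closed"
        self.trace = ["closed"]   # recorded behaviour of this execution

    def open(self):
        # Guard: the transition is only enabled in the "closed" state.
        assert self.state == "closed", "can only open a closed door"
        self.state = "open"
        self.trace.append("open")

    def close(self):
        assert self.state == "open", "can only close an open door"
        self.state = "closed"
        self.trace.append("closed")

# Executing the specification generates one possible behaviour.
door = DoorSpec()
door.open()
door.close()
```

Because the specification is already expressed in object-oriented terms, the step to an implementation in an object-oriented language is direct, which is the advantage the abstract claims over transforming a JSD-style specification.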
Abstract:
This thesis presents a number of methodological developments that were raised by a real-life application to measuring the efficiency of bank branches. The advent of internet banking and phone banking is changing the role of bank branches from a predominantly transaction-based one to a sales-oriented role. This fact requires the development of new forms of assessing and comparing the branches of a bank. In addition, performance assessment models must also take into account the fact that bank branches are service and for-profit organisations for which providing adequate service quality as well as being profitable are crucial objectives. This study analyses bank branches' performance in their new roles in three different areas: their effectiveness in fostering the use of new transaction channels such as the internet and the telephone (transactional efficiency); their effectiveness in increasing sales and their customer base (operational efficiency); and their effectiveness in generating profits without compromising the quality of service (profit efficiency). The chosen methodology for the overall analysis is Data Envelopment Analysis (DEA). The application attempted here required some adaptations to existing DEA models, and indeed some new models, so that some special features of our data could be handled. These concern the development of models that can account for negative data, the development of models to measure profit efficiency, and the development of models that yield production units with targets that are nearer to their observed levels than the targets yielded by traditional DEA models. The application of the developed models to a sample of Portuguese bank branches allowed their classification according to the three performance dimensions (transactional, operational and profit efficiency). It also provided useful insights to bank managers regarding how bank branches compare with one another in terms of their performance, and how, in general, the three performance dimensions are interrelated.
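General DEA models are solved as linear programs, but in the special case of a single input and a single output the CCR (constant returns to scale) efficiency of a unit reduces to its output/input ratio divided by the best ratio in the sample. The sketch below uses that special case; the branch figures are invented for illustration, not data from the thesis.

```python
# Hedged DEA sketch for the single-input, single-output special case:
# CCR efficiency of unit j = (output_j / input_j) / max_k (output_k / input_k).
# Multi-input/multi-output DEA instead solves one linear program per unit.

def dea_efficiency_single(inputs, outputs):
    """Return CCR efficiency scores (in (0, 1]) for single-input,
    single-output decision-making units."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Four hypothetical branches: staff hours consumed and sales achieved.
staff_hours = [100.0, 80.0, 120.0, 90.0]
sales       = [ 50.0, 48.0,  54.0, 36.0]
scores = dea_efficiency_single(staff_hours, sales)
```

The branch with the best sales-per-hour ratio scores 1.0 (efficient) and the others are measured against it; handling negative data, as the thesis requires, needs modified models because such ratios are no longer meaningful.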