951 results for Specifications
Abstract:
The personal computer revolution has resulted in the widespread availability of low-cost image analysis hardware. At the same time, new graphic file formats have made it possible to handle and display images at resolutions beyond the capability of the human eye. Consequently, there has been a significant research effort in recent years aimed at making use of these hardware and software technologies for flotation plant monitoring. Computer-based vision technology is now moving out of the research laboratory and into the plant to become a useful means of monitoring and controlling flotation performance at the cell level. This paper discusses the metallurgical parameters that influence surface froth appearance and examines the progress that has been made in image analysis of flotation froths. The texture spectrum and pixel tracing techniques developed at the Julius Kruttschnitt Mineral Research Centre are described in detail. The commercial implementation, JKFrothCam, is one of a number of froth image analysis systems now reaching maturity. In plants where it is installed, JKFrothCam has shown a number of performance benefits. Flotation runs more consistently, meeting product specifications while maintaining high recoveries. The system has also shown secondary benefits in that reagent costs have been significantly reduced as a result of improved flotation control. (C) 2002 Elsevier Science B.V. All rights reserved.
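The abstract names but does not describe the texture spectrum technique. As a hedged illustration only, the sketch below follows the classical texture-unit formulation (each pixel's 3x3 neighbourhood reduced to a ternary code, with the image characterized by the histogram of those codes); the neighbour ordering and the synthetic `froth` image are assumptions, not JKFrothCam's actual implementation.

```python
import numpy as np

def texture_spectrum(img: np.ndarray) -> np.ndarray:
    """Histogram of 3x3 texture-unit numbers (classical texture-unit form).

    Each of the 8 neighbours is compared with the centre pixel and coded
    0 (less), 1 (equal) or 2 (greater); the ternary digits combine into a
    texture-unit number in [0, 3**8), and the spectrum is the normalized
    histogram of those numbers over the whole image.
    """
    img = img.astype(np.int32)
    centre = img[1:-1, 1:-1]
    # Offsets of the 8 neighbours, in a fixed (arbitrary) order.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    unit = np.zeros_like(centre)
    for k, (dy, dx) in enumerate(offsets):
        neigh = img[1 + dy: img.shape[0] - 1 + dy,
                    1 + dx: img.shape[1] - 1 + dx]
        code = np.where(neigh < centre, 0, np.where(neigh == centre, 1, 2))
        unit += code * 3 ** k
    spectrum, _ = np.histogram(unit, bins=3 ** 8, range=(0, 3 ** 8))
    return spectrum / spectrum.sum()

# Hypothetical usage on a synthetic grey-level "froth" image.
rng = np.random.default_rng(0)
froth = rng.integers(0, 256, size=(64, 64))
s = texture_spectrum(froth)
print(s.shape, s.sum())  # (6561,) 1.0 (up to float rounding)
```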
Abstract:
Existing refinement calculi provide frameworks for the stepwise development of imperative programs from specifications. This paper presents a refinement calculus for deriving logic programs. The calculus contains a wide-spectrum logic programming language, including executable constructs such as sequential conjunction, disjunction, and existential quantification, as well as specification constructs such as general predicates, assumptions and universal quantification. A declarative semantics is defined for this wide-spectrum language based on executions. Executions are partial functions from states to states, where a state is represented as a set of bindings. The semantics is used to define the meaning of programs and specifications, including parameters and recursion. To complete the calculus, a notion of correctness-preserving refinement over programs in the wide-spectrum language is defined and refinement laws for developing programs are introduced. The refinement calculus is illustrated using example derivations and prototype tool support is discussed.
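The abstract gives no concrete derivation; purely as a schematic illustration of the style of reasoning (and not the paper's exact syntax), a non-executable specification of the maximum of two values might be refined into the calculus's executable disjunction roughly as follows.

```latex
% Illustrative only -- schematic notation, not the paper's exact syntax.
% Angle brackets enclose a specification predicate; \sqsubseteq reads
% "is refined by"; \lor is the calculus's executable disjunction.
\begin{align*}
\langle\, z = \max(x,y) \,\rangle
  &\sqsubseteq \langle\, (x \ge y \wedge z = x) \lor (x < y \wedge z = y) \,\rangle
     && \text{(case split)}\\
  &\sqsubseteq (x \ge y \wedge z = x) \lor (x < y \wedge z = y)
     && \text{(both disjuncts executable)}
\end{align*}
```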
Abstract:
This review discusses the issues to be considered in establishing new or extending existing high dependency unit (HDU) services. A defined high dependency service becomes cost-effective when patient care requires more than one nurse for three patients. Professional guidelines for HDUs vary and there are no national accreditation criteria. Casemix and service delivery specifications for the HDU need to be defined and agreed upon within the institution. Establishing a new HDU service requires changes to care delivery. Many potential HDU patients are currently managed in general wards or in the intensive care unit. The service should be discussed widely and marketed within the institution, and the development of defined working relationships with the ICU and primary care teams on the wards is mandatory.
Abstract:
We compare Bayesian methodology utilizing the freeware BUGS (Bayesian Inference Using Gibbs Sampling) with the traditional structural equation modelling approach based on another freeware package, Mx. Dichotomous and ordinal (three-category) twin data were simulated according to different additive genetic and common environment models for phenotypic variation. Practical issues are discussed in using Gibbs sampling as implemented by BUGS to fit subject-specific Bayesian generalized linear models, where the components of variation may be estimated directly. The simulation study (based on 2000 twin pairs) indicated that there is a consistent advantage in using the Bayesian method to detect a correct model under certain specifications of additive genetic and common environmental effects. For binary data, both methods had difficulty in detecting the correct model when the additive genetic effect was low (between 10 and 20%) or of moderate range (between 20 and 40%). Furthermore, neither method could adequately detect a correct model that included a modest common environmental effect (20%) even when the additive genetic effect was large (50%). Power was significantly improved with ordinal data for most scenarios, except for the case of low heritability under a true ACE model. We illustrate and compare both methods using data from 1239 twin pairs over the age of 50 years, who were registered with the Australian National Health and Medical Research Council Twin Registry (ATR) and presented symptoms associated with osteoarthritis occurring in joints of the hand.
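As a hedged sketch of the kind of simulation described (not the paper's actual code), the snippet below generates dichotomous MZ/DZ twin data under an ACE liability model: additive genetic effects correlate 1.0 in MZ pairs and 0.5 in DZ pairs, the common environment is fully shared, and a liability threshold dichotomizes the phenotype. The variance proportions and 50% prevalence threshold are illustrative choices.

```python
import numpy as np

def simulate_twins(n_pairs, a2, c2, mz=True, seed=0):
    """Simulate binary twin phenotypes under an ACE liability model.

    a2, c2: proportions of liability variance due to additive genes (A)
    and common environment (C); the remainder e2 = 1 - a2 - c2 is unique
    environment (E). MZ twins share A fully; DZ twins share half of A.
    """
    rng = np.random.default_rng(seed)
    e2 = 1.0 - a2 - c2
    shared_a = rng.normal(size=n_pairs)   # genetic effect shared by the pair
    c = rng.normal(size=n_pairs)          # common environment, fully shared
    liabilities = []
    for _ in range(2):                    # twin 1 and twin 2
        own_a = rng.normal(size=n_pairs)  # pair-specific genetic deviation
        # MZ: identical A; DZ: correlation 0.5 via an equal mixture.
        a = shared_a if mz else np.sqrt(0.5) * shared_a + np.sqrt(0.5) * own_a
        e = rng.normal(size=n_pairs)      # unique environment
        liabilities.append(np.sqrt(a2) * a + np.sqrt(c2) * c + np.sqrt(e2) * e)
    # Dichotomize at a liability threshold (here: 50% prevalence).
    return [(liab > 0).astype(int) for liab in liabilities]

mz1, mz2 = simulate_twins(2000, a2=0.5, c2=0.2, mz=True)
dz1, dz2 = simulate_twins(2000, a2=0.5, c2=0.2, mz=False)
print(np.corrcoef(mz1, mz2)[0, 1], np.corrcoef(dz1, dz2)[0, 1])  # MZ > DZ
```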
Abstract:
The refinement calculus is a well-established theory for deriving program code from specifications. Recent research has extended the theory to handle timing requirements, as well as functional ones, and we have developed an interactive programming tool based on these extensions. Through a number of case studies completed using the tool, this paper explains how the tool helps the programmer by supporting the many forms of variables needed in the theory. These include simple state variables as in the untimed calculus, trace variables that model the evolution of properties over time, auxiliary variables that exist only to support formal reasoning, subroutine parameters, and variables shared between parallel processes.
Abstract:
1. Two broiler experiments and a layer experiment were conducted on Kunitz trypsin inhibitor (Kti) soybeans (SB) of low trypsin inhibitor (TI) activity to determine their nutritive value when included as mash in least-cost poultry diets. 2. Experiment 1 compared chick performance on the Kti or raw SB with a commercial full-fat SB meal (FFSBM) and a solvent-extracted SB meal (SBM) as controls during a 20 d experimental period. Broiler experiment 2 compared Kti and raw SB, non-steamed or steam-pelleted, with and without DL-methionine supplementation added to every treatment containing 170 g SB/kg. For each broiler experiment the levels of each SB were 70, 120 and 170 g/kg, with the control birds fed only 170 g SB/kg. 3. The layer experiment compared steam-pelleted Kti and raw SB against non-steamed Kti and raw SB, each fed at two levels (70 and 110 g/kg) with 30 replicates, from 29 weeks of age for 19 weeks in a completely randomised design. Production parameters were measured when diets were formulated to contain minimum required specifications and calculated apparent metabolisable energy (AME). At the completion of each trial, 2 broiler birds from each cage and 5 layer birds per treatment were killed, weighed, and their liver and pancreas weighed. 4. Both broiler experiments indicated that production parameters on the Kti SB treatments were significantly lower (P < 0.05) than on the two commercial control SB treatments. However, the Kti treatments were superior to the raw SB treatments. 5. Pancreas weight increased with increasing inclusion of both raw and Kti SB, suggesting that a TI was causing the depression in performance. The AME of the Kti SB was similar to that of commercial FFSB meal. After steam conditioning, the AME value of the raw SB meal was improved from 9.5 MJ/kg dry matter (DM) to 14.1 MJ/kg DM by the reduced TI activity, but this AME improvement, plus the DL-methionine supplementation of birds fed the raw SB, had no effect (P > 0.05) on any parameter evaluated in experiment 2. 6. The layer experiment showed that hens on the Kti SB treatments had significantly greater live weight gain (LWG), egg weight and daily egg mass than birds given raw SB. A reduced food intake (FI) was observed in the Kti treatments, but egg mass was generally similar to that on the FFSB control diet, indicating that Kti SB supported excellent egg production at an inclusion of 110 g/kg. The depressed performance observed for broiler chicks suggests that younger birds are more susceptible to the effects of SB TI.
Abstract:
This paper is concerned with methods for refinement of specifications written using a combination of Object-Z and CSP. Such a combination has proved to be a suitable vehicle for specifying complex systems which involve state and behaviour, and several proposals exist for integrating these two languages. The basis of the integration in this paper is a semantics that treats Object-Z classes as CSP processes. This allows classes specified in Object-Z to be combined using CSP operators. It has been shown that this semantic model allows state-based refinement relations to be used on the Object-Z components in an integrated Object-Z/CSP specification. However, the current refinement methodology does not allow the structure of a specification to be changed in a refinement, whereas a full methodology would, for example, allow concurrency to be introduced during the development life-cycle. In this paper, we tackle these concerns and discuss refinements of specifications written using Object-Z and CSP where we change the structure of the specification when performing the refinement. In particular, we develop a set of structural simulation rules which allow single components to be refined to more complex specifications involving CSP operators. The soundness of these rules is verified against the common semantic model and they are illustrated via a number of examples.
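The abstract does not reproduce the structural rules; as a hedged, schematic example of the idea only (not the paper's actual rule set), a single component might be refined into a parallel composition that introduces concurrency while hiding the internal coordination from the environment:

```latex
% Schematic only: C, C_1, C_2 are Object-Z classes viewed as CSP
% processes; A is a synchronization alphabet and H a set of internal
% coordination events hidden from the environment.
C \;\sqsubseteq\; \bigl( C_1 \;\|_{A}\; C_2 \bigr) \setminus H
```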
Abstract:
Over the last decade, software architecture has emerged as a critical issue in Software Engineering. This encompassed a shift from traditional programming towards software development based on the deployment and assembly of independent components. The specification of both the overall system's structure and the interaction patterns between its components became a major concern for the working developer. Although a number of formalisms are available to express behaviour and to supply the indispensable calculational power to reason about designs, the task of deriving architectural designs on top of popular component platforms has remained largely informal. This paper introduces a systematic approach to derive, from CCS behavioural specifications, the corresponding architectural skeletons in the Microsoft .NET framework, in the form of executable C# and Cω code. The prototyping process is fully supported by a specific tool developed in Haskell.
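The paper's Haskell tool and its CCS-to-C#/Cω translation are not shown in the abstract; the sketch below illustrates, in Python rather than C#, the general flavour of such a mapping under our own assumptions: each CCS parallel component becomes a thread of the skeleton, and a restricted synchronization action becomes a message exchange on a shared channel. The process term and all names are hypothetical.

```python
import threading, queue

# Hypothetical CCS specification: (a.b.0 | 'a.c.0) \ {a}
# Two parallel components synchronize on the restricted action `a`.
a = queue.Queue(maxsize=1)   # one-slot channel standing in for the handshake
                             # (the receiver blocks until the message arrives)

def component_one():
    a.put("sync")            # output on a
    print("b")               # then the visible action b

def component_two():
    a.get()                  # matching input on a
    print("c")               # then the visible action c

# Each parallel component of the CCS term becomes one thread.
threads = [threading.Thread(target=component_one),
           threading.Thread(target=component_two)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```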
Abstract:
More and more current software systems rely on non-trivial coordination logic to combine autonomous services, typically running on different platforms and often owned by different organizations. Often, however, coordination data is deeply entangled in the code and, therefore, difficult to isolate and analyse separately. COORDINSPECTOR is a software tool which combines slicing and program analysis techniques to isolate all coordination elements from the source code of an existing application. Such a reverse engineering process provides a clear view of the actually invoked services as well as of the orchestration patterns which bind them together. The tool analyses Common Intermediate Language (CIL) code, the native language of the Microsoft .NET Framework. Therefore, the scope of application of COORDINSPECTOR is quite large: potentially any piece of code developed in any of the programming languages which compile to the .NET Framework. The tool generates graphical representations of the coordination layer and identifies the underlying business process orchestrations, rendering them as Orc specifications.
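COORDINSPECTOR's actual analysis is far richer than this, but as a loose, hypothetical illustration of the first step of such a reverse-engineering pass, the sketch below scans a simplified stream of CIL call instructions and keeps only those targeting a coordination-related namespace. The instruction format, the namespaces, and the example stream are all assumptions.

```python
# Hypothetical, simplified CIL instruction stream: (opcode, operand) pairs.
cil = [
    ("ldarg.0", None),
    ("call", "System.ServiceModel.ClientBase::Open"),
    ("ldstr", "order-42"),
    ("callvirt", "Shop.OrderService::PlaceOrder"),
    ("call", "System.Console::WriteLine"),
]

# Namespaces treated as "coordination" APIs -- an assumption for this sketch.
COORDINATION_PREFIXES = ("System.ServiceModel.", "Shop.")

def coordination_slice(instructions):
    """Keep only call sites that target a coordination-related type."""
    return [target for op, target in instructions
            if op in ("call", "callvirt")
            and target is not None
            and target.startswith(COORDINATION_PREFIXES)]

print(coordination_slice(cil))
# ['System.ServiceModel.ClientBase::Open', 'Shop.OrderService::PlaceOrder']
```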
Abstract:
The interoperability of IP video equipment is a critical problem for surveillance systems and other video application developers. ONVIF is one of the two specifications addressing the standardization of networked device interfaces, and it is based on SOAP. This paper describes the development of an ONVIF library for building video camera clients. We discuss the choice of a web services toolkit, and how to use the selected toolkit to develop a basic library. From that, we discuss the implementation of features that ...
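The abstract does not name the toolkit chosen; as one hedged possibility, a minimal client can be assembled with the generic SOAP library zeep against the publicly hosted ONVIF Device Management WSDL. The camera address and credentials below are placeholders.

```python
# A minimal sketch, assuming the `zeep` SOAP toolkit (pip install zeep).
from zeep import Client
from zeep.wsse.username import UsernameToken

WSDL = "https://www.onvif.org/ver10/device/wsdl/devicemgmt.wsdl"  # public WSDL
CAMERA_XADDR = "http://192.0.2.10/onvif/device_service"           # placeholder

client = Client(WSDL, wsse=UsernameToken("admin", "password", use_digest=True))
# Point the service binding at the actual camera endpoint.
service = client.create_service(
    "{http://www.onvif.org/ver10/device/wsdl}DeviceBinding", CAMERA_XADDR)

info = service.GetDeviceInformation()
print(info.Manufacturer, info.Model, info.FirmwareVersion)
```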
Abstract:
Software architecture is currently recognized as one of the most critical design steps in Software Engineering. The specification of the overall system structure, on the one hand, and of the interaction patterns between its components, on the other, became a major concern for the working developer. Although a number of formalisms are available to express behaviour and supply the indispensable calculational power to reason about designs, the task of deriving architectural designs on top of popular component platforms has remained largely informal. This paper introduces a systematic approach to derive, from behavioural specifications written in Cω, the corresponding architectural skeletons in the Microsoft .NET framework in the form of executable code.
Abstract:
Nowadays, despite improvements in usability and intuitiveness, users have to adapt to the systems offered to them in order to satisfy their needs. For instance, they must learn how to achieve tasks, how to interact with the system, and how to fulfill the system's specifications. This paper proposes an approach to improve this situation by enabling graphical user interface redefinition through virtualization and computer vision, with the aim of increasing the system's usability. To achieve this goal, the approach is based on enriched task models, virtualization and picture-driven computing.
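The paper's enriched task models are not reproduced here; as a hedged sketch of the picture-driven side only, the snippet below uses OpenCV template matching to locate a widget's image inside a screenshot, the kind of primitive such GUI virtualization can build on. File names and the confidence threshold are assumptions.

```python
# Picture-driven widget lookup, assuming OpenCV (pip install opencv-python).
import cv2

screenshot = cv2.imread("screenshot.png", cv2.IMREAD_GRAYSCALE)  # placeholders
widget = cv2.imread("ok_button.png", cv2.IMREAD_GRAYSCALE)

# Normalized cross-correlation: a score near 1.0 means a near-exact match.
scores = cv2.matchTemplate(screenshot, widget, cv2.TM_CCOEFF_NORMED)
_, best_score, _, best_top_left = cv2.minMaxLoc(scores)

if best_score > 0.9:  # assumed confidence threshold
    h, w = widget.shape
    centre = (best_top_left[0] + w // 2, best_top_left[1] + h // 2)
    print("widget found at", centre)  # e.g. where a redirected click would go
else:
    print("widget not found")
```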
Abstract:
Based on the assumption that earnings persistence has implications for both financial analysis and compensation contracts, the aim of this paper is to investigate the role of earnings persistence, assuming that (i) more persistent earnings are likely to be a better input to valuation models and (ii) more persistent earnings are likely to serve as a proxy for long-term market and managerial orientation. The analysis is based on Brazilian listed firms from 1995 to 2013, and while we document strong support for the relevance of earnings persistence in financial analysis and valuation, we fail to document a significant relationship between earnings persistence and long-term value orientation. These results are sensitive to different specifications, and additional results suggest that firms' idiosyncratic risk (total risk) is relevant to explaining the focus on short-term outcomes (short-termism) across firms. The main contribution of this paper is to offer empirical evidence for the relevance of accounting numbers in both valuation and contractual theories in an emerging market.
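The paper's exact specifications are not given in the abstract; persistence is conventionally estimated as the slope of a first-order autoregression of earnings, and the hedged sketch below fits that standard specification with statsmodels on simulated data (not the paper's sample).

```python
# Standard earnings-persistence specification: E_{t+1} = a + b * E_t + u.
# Illustrative only; the series is simulated, not from the paper.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
true_persistence = 0.7
earnings = [rng.normal()]
for _ in range(199):
    earnings.append(true_persistence * earnings[-1] + rng.normal(scale=0.5))
earnings = np.array(earnings)

X = sm.add_constant(earnings[:-1])       # E_t plus an intercept
model = sm.OLS(earnings[1:], X).fit()    # regress E_{t+1} on E_t
print(model.params[1])                   # slope b, the persistence estimate
```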
Abstract:
This project, prepared as the Final Master's Work for the degree of Master in Civil Engineering, concerns the design of a pedestrian footbridge with a steel structure and reinforced concrete foundations. The document includes almost all the elements required for the execution design of the structure. The design of the footbridge proceeded by quantifying the actions and then verifying the safety of all structural elements against the criteria and technical specifications laid down in the European standards for structural design (the structural Eurocodes). Since the footbridge is intended for pedestrian traffic and its steel structure has a certain degree of flexibility due to the slenderness of the structural elements, it may be subject to periodic dynamic actions induced by people walking across it; these can produce vibration levels that are of little relevance to structural safety but excessive from the point of view of human comfort. A dynamic study was therefore carried out to characterize the dynamic response of the structure under periodic loading, such as pedestrian action, so as to ensure that the structure in use remains within acceptable comfort parameters. The modelling of the structure and its general discretization were performed with the finite element program SAP2000, version 14.0.0. The design of the connections is another fundamental aspect of this steel structure project.
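As a hedged, single-degree-of-freedom illustration of the comfort check described (not the project's SAP2000 model), the sketch below estimates the steady-state acceleration of one footbridge mode under a resonant harmonic pedestrian load and compares it with an assumed comfort limit. All numerical values are placeholders.

```python
import math

# Assumed modal properties of the footbridge (placeholders, not project values).
modal_mass = 20_000.0  # kg
f_n = 2.0              # natural frequency, Hz (near walking pace)
zeta = 0.005           # damping ratio
F0 = 280.0             # harmonic pedestrian force amplitude, N

# Steady-state acceleration amplitude of an SDOF system under F0*sin(w*t),
# with frequency ratio r = f/f_n; at resonance (r = 1) it reduces to
# F0 / (2 * zeta * modal_mass).
def accel_amplitude(r):
    dyn = 1.0 / math.sqrt((1 - r**2) ** 2 + (2 * zeta * r) ** 2)
    x0 = (F0 / (modal_mass * (2 * math.pi * f_n) ** 2)) * dyn  # displacement
    return (2 * math.pi * r * f_n) ** 2 * x0                   # acceleration

a_res = accel_amplitude(1.0)
comfort_limit = 0.7    # m/s^2, an assumed vertical comfort threshold
print(f"peak acceleration {a_res:.2f} m/s^2 -> "
      f"{'OK' if a_res <= comfort_limit else 'exceeds comfort limit'}")
```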
Abstract:
A new approach is proposed, based on a tool-assisted methodology for creating new products in the automobile industry from previously defined processes and experiences, inspired by a set of best practices or principles: it is based on high-level models or specifications; it is centred on a component-based architecture; and it relies on generative programming techniques. This approach follows in essence the MDA (Model Driven Architecture) philosophy, with some specific characteristics. We propose a repository that keeps related information, such as models, applications, design information, generated artifacts and even information concerning the development process itself (e.g., generation steps, tests and integration milestones). Generically, this methodology receives the users' requirements for a new product (e.g., functional, non-functional, product specification) as its main inputs and produces a set of artifacts (e.g., design parts, process validation output) as its main output, which will be integrated into the engineer's design tool (e.g., a CAD system), facilitating the work.
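None of the methodology's concrete artifacts appear in the abstract; as a loose, hypothetical illustration of the generative-programming step, the sketch below turns a small high-level product specification into a code artifact through a text template. Field names and the template are assumptions.

```python
# Hypothetical generative step: high-level product spec -> code artifact.
from string import Template

# A made-up high-level specification for a new automobile component.
spec = {
    "component": "RearViewMirror",
    "interface": "IAdjustable",
    "parameters": ["tilt", "pan"],
}

skeleton = Template(
    "class $component($interface):\n"
    "$body"
)
body = "".join(f"    def set_{p}(self, value): ...\n" for p in spec["parameters"])

artifact = skeleton.substitute(component=spec["component"],
                               interface=spec["interface"],
                               body=body)
print(artifact)  # generated skeleton, to be stored in the repository
```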