998 results for Software specification
Abstract:
In this and a preceding paper, we provide an introduction to the Fujitsu VPP range of vector-parallel supercomputers and to some of the computational chemistry software available for the VPP. Here, we consider the implementation and performance of seven popular chemistry application packages. The codes discussed range from classical molecular dynamics to semiempirical and ab initio quantum chemistry. All have evolved from sequential codes, and have typically been parallelised using a replicated data approach. As such they are well suited to the large-memory/fast-processor architecture of the VPP. For one code, CASTEP, a distributed-memory data-driven parallelisation scheme is presented. (C) 2000 Published by Elsevier Science B.V. All rights reserved.
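As a rough illustration of the replicated-data approach mentioned above (a sketch of the general technique, not code from the paper or any of the packages discussed), each process can hold a full copy of the data, compute a disjoint slice of the work, and combine partial results with a global reduction. The example assumes mpi4py and NumPy are available.

```python
# Minimal replicated-data parallel sketch (illustrative only).
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Every rank replicates the full set of particle coordinates.
rng = np.random.default_rng(0)            # same seed -> identical copies
coords = rng.random((1000, 3))

# Each rank evaluates pair interactions only for its own slice of particles ...
local_energy = 0.0
for i in range(rank, len(coords), size):
    r = np.linalg.norm(coords[i + 1:] - coords[i], axis=1)
    local_energy += np.sum(1.0 / r)        # toy Coulomb-like pair term

# ... and the partial sums are combined on every rank.
total_energy = comm.allreduce(local_energy, op=MPI.SUM)
if rank == 0:
    print("total pair energy:", total_energy)
```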
Abstract:
In this paper we present a model of specification-based testing of interactive systems. This model provides the basis for a framework to guide such testing. Interactive systems are traditionally decomposed into a functionality component and a user interface component; this distinction is termed dialogue separation and is the underlying basis for conceptual and architectural models of such systems. Correctness involves both proper behaviour of the user interface and proper computation by the underlying functionality. Specification-based testing is one method used to increase confidence in correctness, but it has had limited application to interactive system development to date.
Abstract:
L-studio/cpfg is a plant modeling software system designed for Windows 95/98/NT platforms. Its key components are the L-system-based plant simulator cpfg and the modeling environment called L-studio. We overview version 1.0 of this system from the user's perspective.
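cpfg is built around parametric L-systems; purely as an illustration of the underlying rewriting idea (a plain, non-parametric toy, not cpfg's actual modeling language), the following applies production rules to an axiom for a few derivation steps.

```python
# Toy L-system rewriting sketch (illustrative only; cpfg uses its own,
# far richer parametric L-system language).
def derive(axiom: str, rules: dict[str, str], steps: int) -> str:
    """Apply the production rules to every symbol, `steps` times."""
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Classic bracketed L-system for a simple branching plant:
# F = grow forward, +/- = turn, [ ] = push/pop turtle state.
rules = {"X": "F[+X][-X]FX", "F": "FF"}
print(derive("X", rules, 3))
```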
Abstract:
This paper uses a fully operational inter-regional computable general equilibrium (CGE) model implemented for the Brazilian economy, based on previous work by Haddad and Hewings, in order to assess the likely economic effects of road transportation policy changes in Brazil. Among the features embedded in this framework, modelling of external scale economies and transportation costs provides an innovative way of dealing explicitly with theoretical issues related to integrated regional systems. The model is calibrated for 109 regions. The explicit modelling of transportation costs built into the inter-regional CGE model, based on origin-destination flows that take into account the spatial structure of the Brazilian economy, makes it possible to integrate the inter-regional CGE model with a geo-coded transportation network model, enhancing the potential of the framework for understanding the role of infrastructure in regional development. The transportation model used is the Highway Development and Management model developed by the World Bank, implemented using the TransCAD software. Further extensions of the current model specification for integrating other features of transport planning in a continental industrialising country like Brazil are discussed, with the goal of building a bridge between conventional transport planning practices and the innovative use of CGE models. In order to illustrate the analytical power of the integrated system, the authors present a set of simulations which evaluate the ex ante economic impacts of physical/qualitative changes in the Brazilian road network (for example, a highway improvement), in accordance with recent policy developments in Brazil. Rather than providing a critical evaluation of this debate, they intend to emphasise the likely structural impacts of such policies. They expect that the results will reinforce the need to better specify spatial interactions in inter-regional CGE models.
Abstract:
Dherte PM, Negrao MPG, Mori Neto S, Holzhacker R, Shimada V, Taberner P, Carmona MJC - Smart Alerts: Development of a Software to Optimize Data Monitoring. Background and objectives: Monitoring is useful for vital sign follow-up and for the prevention, diagnosis, and treatment of several events in anesthesia. Although alarms can be useful in monitoring, they can also cause dangerous user desensitization. The objective of this study was to describe the development of specific software to integrate intraoperative monitoring parameters, generating "smart alerts" that can help decision making, besides indicating possible diagnoses and treatment. Methods: A system was designed that allowed flexibility in the definition of alerts, combining the individual alarms of the monitored parameters to generate a more elaborate alert system. After investigating a set of smart alerts considered relevant in the surgical environment, a prototype was designed and evaluated, and additional suggestions were implemented in the final product. To verify the occurrence of smart alerts, the system was tested with data previously obtained during intraoperative monitoring of 64 patients. The system allows continuous analysis of monitored parameters, verifying the occurrence of the smart alerts defined in the user interface. Results: With this system, a potential 92% reduction in alarms was observed. We observed that, in most situations that did not generate alerts, the individual alarms did not represent risk to the patient. Conclusions: Implementation of the software can allow integration of the monitored data and generate information, such as possible diagnoses or interventions. A significant potential reduction in the number of alarms during surgery was observed. The information displayed by the system can often be more useful than analysis of isolated parameters.
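The paper does not publish its rule definitions; the sketch below is a purely hypothetical illustration of the core idea of combining individual parameter alarms into a single composite "smart alert". The thresholds, rule names, and diagnostic labels are invented for illustration and are not taken from the study.

```python
# Hypothetical composite ("smart") alert: a rule fires only when a clinically
# meaningful combination of parameters occurs, rather than one alarm per
# threshold crossing. All values and rules here are illustrative inventions.
from dataclasses import dataclass

@dataclass
class Vitals:
    heart_rate: float        # beats per minute
    systolic_bp: float       # mmHg
    spo2: float              # %

def hypotension_with_tachycardia(v: Vitals) -> bool:
    """Composite rule: fires only when both conditions hold together."""
    return v.systolic_bp < 90 and v.heart_rate > 110

def desaturation_with_bradycardia(v: Vitals) -> bool:
    return v.spo2 < 90 and v.heart_rate < 50

RULES = {
    "possible hypovolemia/shock": hypotension_with_tachycardia,
    "possible severe hypoxemia": desaturation_with_bradycardia,
}

def smart_alerts(v: Vitals) -> list[str]:
    """Return the names of all composite rules triggered by this reading."""
    return [name for name, rule in RULES.items() if rule(v)]

print(smart_alerts(Vitals(heart_rate=120, systolic_bp=85, spo2=97)))
# -> ['possible hypovolemia/shock']
```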
Abstract:
Background: Meta-analysis is increasingly being employed as a screening procedure in large-scale association studies to select promising variants for follow-up studies. However, standard methods for meta-analysis require the assumption of an underlying genetic model, which is typically unknown a priori. This drawback can lead to model misspecification, causing power to be suboptimal, or to the evaluation of multiple genetic models, which inflates the number of false-positive associations, ultimately wasting resources on fruitless replication studies. We used simulated meta-analyses of large genetic association studies to investigate naive strategies of genetic model specification to optimize screenings of genome-wide meta-analysis signals for further replication. Methods: Different methods, meta-analytical models and strategies were compared in terms of power and type-I error. Simulations were carried out for a binary trait in a wide range of true genetic models, genome-wide thresholds, minor allele frequencies (MAFs), odds ratios and between-study heterogeneity (τ²). Results: Among the investigated strategies, a simple Bonferroni-corrected approach that fits both multiplicative and recessive models was found to be optimal in most examined scenarios, reducing the likelihood of false discoveries and enhancing power in scenarios with small MAFs, either in the presence or in the absence of heterogeneity. Nonetheless, this strategy is sensitive to τ² whenever the susceptibility allele is common (MAF ≥ 30%), resulting in an increased number of false-positive associations compared with an analysis that considers only the multiplicative model. Conclusion: Invoking a simple Bonferroni adjustment and testing for both multiplicative and recessive models is a fast and optimal strategy in large meta-analysis-based screenings. However, care must be taken when the examined variants are common, where specification of a multiplicative model alone may be preferable.
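As a small worked illustration of the "fit both models, Bonferroni-correct for two tests" screening strategy described above (a sketch only, not the authors' simulation code; the variant IDs and p-values are invented):

```python
# Two-model screening sketch: test each variant under a multiplicative
# (per-allele) and a recessive model, and flag it for replication if the
# smaller p-value survives a Bonferroni correction for the two tests.
ALPHA = 5e-8               # genome-wide significance threshold
N_MODELS = 2               # multiplicative and recessive

variants = {
    # rsID: (p_multiplicative, p_recessive)  -- hypothetical numbers
    "rs0001": (3e-9, 4e-2),
    "rs0002": (6e-8, 1e-8),
    "rs0003": (2e-7, 9e-7),
}

for rsid, (p_mult, p_rec) in variants.items():
    p_min = min(p_mult, p_rec)
    significant = p_min < ALPHA / N_MODELS   # Bonferroni for two tests
    print(f"{rsid}: min p = {p_min:.1e}, follow up = {significant}")
```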
Abstract:
At present, there is a variety of formalisms for modeling and analyzing the communication behavior of components. Owing to the tremendous increase in the size and complexity of embedded systems, accompanied by shorter time-to-market cycles and cost reduction, so-called behavioral type systems are becoming more and more important. This chapter presents an overview and a taxonomy of behavioral types. The intention of this taxonomy is to provide guidance for software engineers and to form the basis for future research.
Abstract:
Functional brain imaging techniques such as functional MRI (fMRI) that allow the in vivo investigation of the human brain have been increasingly employed to address the neurophysiological substrates of emotional processing. Despite the growing number of fMRI studies in the field, when taken separately these individual imaging studies demonstrate contrasting findings and variable pictures, and are unable to definitively characterize the neural networks underlying each specific emotional condition. Different imaging packages, as well as the statistical approaches used for image processing and analysis, probably play a detrimental role by increasing the heterogeneity of findings. In particular, it is unclear to what extent the observed neurofunctional response of the brain cortex during emotional processing depends on the fMRI package used in the analysis. In this pilot study, we performed a double analysis of an fMRI dataset using emotional faces. The Statistical Parametric Mapping (SPM) version 2.6 (Wellcome Department of Cognitive Neurology, London, UK) and the XBAM 3.4 (Brain Imaging Analysis Unit, Institute of Psychiatry, King's College London, UK) programs, which use parametric and non-parametric analysis, respectively, were used to assess our results. Both packages revealed that processing of emotional faces was associated with increased activation in the brain's visual areas (occipital, fusiform and lingual gyri), in the cerebellum, in the parietal cortex, in the cingulate cortex (anterior and posterior cingulate), and in the dorsolateral and ventrolateral prefrontal cortex. However, a blood oxygenation level-dependent (BOLD) response in the temporal regions, insula and putamen was evident in the XBAM analysis but not in the SPM analysis. Overall, SPM and XBAM analyses revealed comparable whole-group brain responses. Further studies are needed to explore the between-group compatibility of the different imaging packages in other cognitive and emotional processing domains. (C) 2009 Elsevier Ltd. All rights reserved.
Abstract:
This paper presents a systematic approach to proving temporal properties of arbitrary Z specifications. The approach involves (i) transforming the Z specification to an abstract temporal structure (or state transition system), (ii) applying a model checker to the temporal structure, (iii) determining whether the temporal structure is too abstract based on the model checking result and (iv) refining the temporal structure where necessary. The approach is based on existing work from the model checking literature, adapting it to Z.
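As a loose, language-agnostic sketch of steps (i)-(iii) above (not the authors' tooling, and with an invented toy transition system in place of one derived from a Z specification), a finite abstract state transition system can be explored exhaustively to check an invariant, returning a counterexample trace when the check fails; such a trace is then what must be inspected to decide whether the abstraction is too coarse.

```python
# Minimal sketch of checking an invariant over a finite state transition
# system. The transition system and property here are invented; a real
# Z-to-model-checker translation is far richer.
from collections import deque

def check_invariant(initial, successors, invariant):
    """Breadth-first search; returns None if the invariant holds on all
    reachable states, otherwise a counterexample trace."""
    frontier = deque([(s, [s]) for s in initial])
    seen = set(initial)
    while frontier:
        state, trace = frontier.popleft()
        if not invariant(state):
            return trace                       # property violated
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, trace + [nxt]))
    return None                                # invariant holds everywhere

# Toy abstraction of a bounded counter: state = counter value.
succ = lambda n: [n + 1] if n < 5 else []
print(check_invariant({0}, succ, lambda n: n <= 4))   # -> [0, 1, 2, 3, 4, 5]
```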
Abstract:
Objective. The aim of the present study was to evaluate the radiopacity of 5 root end filling materials (white MTA Angelus, MTA Bio, light-cured MTA, Sealepox RP, and Portland cement clinker with bismuth oxide and calcium sulfate). Method. Five specimens of each material, 10 mm in diameter and 1 mm in thickness according to ISO 6876:2001, were fabricated and radiographed using Insight occlusal films placed alongside a graduated aluminum step-wedge (2 to 16 mm in thickness). Radiographs were digitized and compared with the aluminum step-wedge. The radiographic density data were converted into millimeters of aluminum (mm Al) using the Digora 1.51 software. Results were evaluated statistically using analysis of variance (ANOVA) followed by the Tukey test. The level of significance was set at 5% (P < .05). Results. Radiopacity values ranged from 1.21 mm Al (light-cured MTA) to 6.45 mm Al (MTA Angelus). Comparison between materials showed significant differences (P < .05) between MTA Angelus and all other materials, between Sealepox RP and MTA Bio, and between light-cured MTA and Portland cement clinker. Light-cured MTA was significantly less radiopaque than all other materials. No significant difference (P > .05) was found between MTA Bio and Portland cement clinker. Conclusions. All retrograde filling materials evaluated showed greater radiopacity than dentin. All the materials except light-cured MTA met the minimum radiopacity standard of 3 mm Al recognized by ISO 6876:2001 and ADA specification no. 57. (Oral Surg Oral Med Oral Pathol Oral Radiol Endod 2009; 108: e35-e38)
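The density-to-mm-Al conversion in the study was performed with the Digora software; purely as an illustration of the underlying idea, a specimen's mean grey value can be interpolated against the grey values measured on the graduated step-wedge. The grey values below are hypothetical.

```python
# Illustrative conversion of mean grey values to equivalent aluminum
# thickness (mm Al) by interpolating against a graduated step-wedge.
# All grey values are hypothetical; the study used the Digora 1.51 software.
import numpy as np

# Step-wedge calibration: aluminum thickness (mm) vs. measured mean grey value.
step_mm   = np.array([2, 4, 6, 8, 10, 12, 14, 16], dtype=float)
step_grey = np.array([60, 95, 125, 150, 172, 190, 205, 218], dtype=float)

def grey_to_mm_al(grey: float) -> float:
    """Interpolate a specimen's mean grey value to millimeters of aluminum."""
    return float(np.interp(grey, step_grey, step_mm))

print(round(grey_to_mm_al(140.0), 2))   # specimen grey value -> equivalent mm Al
```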
Abstract:
Computer-assisted learning has an important role in the teaching of pharmacokinetics to health sciences students because it transfers the emphasis from the purely mathematical domain to an 'experiential' domain in which graphical and symbolic representations of actions and their consequences form the major focus for learning. Basic pharmacokinetic concepts can be taught by experimenting with the interplay of dose and dosage interval with drug absorption (e.g. absorption rate, bioavailability), drug distribution (e.g. volume of distribution, protein binding) and drug elimination (e.g. clearance), and observing the effects on drug concentrations, using library ('canned') pharmacokinetic models. Such 'what if' approaches are found in calculator-simulators such as PharmaCalc, Practical Pharmacokinetics and PK Solutions. Others, such as SAAM II, ModelMaker, and Stella, represent the 'systems dynamics' genre, which requires the user to conceptualise a problem and formulate the model on-screen using symbols, icons, and directional arrows. The choice of software should be determined by the aims of the subject/course, the experience and background of the students in pharmacokinetics, and institutional factors including the price and networking capabilities of the package(s). Enhanced learning may result if the computer teaching of pharmacokinetics is supported by tutorials, especially where the techniques are applied to solving problems in which the link with healthcare practices is clearly established.
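As a minimal example of the kind of 'what if' simulation such packages support (a one-compartment model with first-order absorption and repeated dosing; the equation is standard, but the parameter values are illustrative only and the code is not tied to any of the packages named above):

```python
# One-compartment model with first-order absorption, repeated oral dosing
# handled by superposition. Parameter values are illustrative only.
import math

def concentration(t, dose, tau, ka, CL, V, F=1.0):
    """Plasma concentration (mg/L) at time t (h), dosing every tau h."""
    ke = CL / V                                   # elimination rate constant
    n_doses = int(t // tau) + 1
    c = 0.0
    for i in range(n_doses):
        td = t - i * tau                          # time since the i-th dose
        c += (F * dose * ka) / (V * (ka - ke)) * (
            math.exp(-ke * td) - math.exp(-ka * td))
    return c

# Example: 500 mg every 12 h, ka = 1.0 /h, CL = 5 L/h, V = 40 L.
for t in (2, 12, 24, 48):
    print(f"t = {t:2d} h  C = {concentration(t, 500, 12, 1.0, 5, 40):.2f} mg/L")
```

Changing the dose, dosing interval, absorption rate constant, clearance or volume of distribution and re-plotting the curve is exactly the interplay the 'canned' models let students explore.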