935 results for Computer Model


Relevance:

30.00%

Publisher:

Abstract:

Management are keen to maximize the life span of an information system because of the high cost, organizational disruption, and risk of failure associated with re-developing or replacing it. This research investigates how various factors affect an information system's life span by examining how they affect its stability. The research builds on a previously developed two-stage model of information system change, whereby an information system is either in a stable state of evolution, in which its functionality is evolving, or in a state of revolution, in which it is being replaced because it no longer provides the functionality its users expect. A case study surveyed a number of systems within one organization. The aim was to test whether a relationship existed between the base value of the volatility index (a measure of the stability of an information system) and certain system characteristics. Data relating to some 3000 user change requests covering 40 systems over a 10-year period were obtained. The following factors were hypothesized to have significant associations with the base value of the volatility index: language level (the generation of the language of construction), system size, system age, and the timing of changes applied to a system. Significant associations were found in the hypothesized directions, except that the timing of user changes was not associated with any change in the value of the volatility index. Copyright (C) 2002 John Wiley & Sons, Ltd.
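
As a hedged illustration of the kind of association test involved, the sketch below correlates system size with a volatility measure. The definition of the index here (user change requests per year of service) is an invented stand-in, not the paper's actual formula, and the data are toy values:

```python
# Hypothetical sketch: testing for an association between system size and
# the base value of a volatility index. The index definition (requests per
# year) is assumed for illustration, not taken from the paper.
from scipy.stats import spearmanr

# (system_size_kloc, years_observed, user_change_requests) per system -- toy data
systems = [(120, 10, 410), (45, 8, 150), (300, 10, 900), (15, 5, 30)]

sizes = [size for size, _, _ in systems]
volatility = [reqs / years for _, years, reqs in systems]  # assumed index

rho, p = spearmanr(sizes, volatility)
print(f"Spearman rho={rho:.2f}, p={p:.3f}")  # hypothesised: positive association
```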

Relevance:

30.00%

Publisher:

Abstract:

Program compilation can be formally defined as a sequence of equivalence-preserving transformations, or refinements, from high-level language programs to assembler code. Recent models also incorporate timing properties, but the resulting formalisms are intimidatingly complex. Here we take advantage of a new, simple model of real-time refinement, based on predicate transformer semantics, to present a straightforward compilation formalism that incorporates real-time constraints. (C) 2002 Elsevier Science B.V. All rights reserved.
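
For a flavour of the underlying machinery, here is a minimal pair of weakest-precondition rules with an explicit time variable τ, in the spirit of (but not identical to) the real-time refinement model the paper builds on:

```latex
% Ordinary assignment (idealised here as instantaneous): the classical rule.
\[ wp(x := e,\; Q) \;=\; Q[e/x] \]
% A delay statement advances the current time \tau by d, so the
% postcondition must hold at the later time.
\[ wp(\mathbf{delay}\ d,\; Q) \;=\; Q[(\tau + d)/\tau] \]
```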

Relevance:

30.00%

Publisher:

Abstract:

This paper outlines research on the processes taking place within coal mineral matter at high temperatures and the development of a relationship between ash fusion temperatures (AFT) and the phase equilibria of coal ash slags. A new thermodynamic database for the Al-Ca-Fe-O-Si system developed by the author was used in conjunction with the thermodynamic computer package F*A*C*T for these purposes. In addition, high-temperature experimental studies were undertaken that involved heat treatment and quenching of the ash cones, followed by analyses using a range of techniques. The study provided new information on the processes taking place during the AFT test and demonstrated the validity of AFT predictions with F*A*C*T. Examples of practical applications of the AFT prediction method are given in the paper. The results of this study are important not only for AFT prediction but also, more generally, for the application of phase equilibrium science to the characterisation of coal mineral matter interactions at high temperature. (C) 2002 Elsevier Science Ltd. All rights reserved.
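
A hedged sketch of the prediction idea: given liquid-phase proportions computed by a phase-equilibrium package at a series of temperatures (F*A*C*T in the paper; toy numbers here), the temperature at which the slag reaches a chosen liquid fraction can be read off by interpolation. The 90% threshold is an assumption for illustration only:

```python
# Illustrative only: toy liquid-fraction data standing in for phase-equilibrium
# output; the threshold linking liquid fraction to an AFT point is assumed.
import numpy as np

temps_c = np.array([1200, 1300, 1400, 1500, 1600])       # heat-treatment temps
liquid_frac = np.array([0.35, 0.55, 0.78, 0.93, 1.00])   # computed proportions

threshold = 0.90                      # assumed correlate of an AFT point
aft_estimate = np.interp(threshold, liquid_frac, temps_c)
print(f"Estimated fusion temperature: {aft_estimate:.0f} C")
```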

Relevance:

30.00%

Publisher:

Abstract:

The Agricultural Production Systems sIMulator, APSIM, is a cropping system modelling environment that simulates the dynamics of soil-plant-management interactions within a single crop or a cropping system. Adaptation of previously developed crop models has resulted in multiple crop modules in APSIM, which have low scientific transparency and code efficiency. A generic crop model template (GCROP) has been developed to capture unifying physiological principles across crops (plant types) and to provide modular and efficient code for crop modelling. It comprises a standard crop interface to the APSIM engine, a generic crop model structure, a crop process library, and well-structured crop parameter files. The process library contains the major science underpinning the crop models and incorporates generic routines, based on physiological principles, for growth and development processes that are common across crops. It allows APSIM to simulate different crops using the same set of computer code. The generic model structure and parameter files provide an easy way to test, modify, exchange and compare modelling approaches at the process level without necessitating changes to the code. The standard interface generalises the model inputs and outputs, and uses a standard protocol to communicate with other APSIM modules through the APSIM engine. The crop template serves as a convenient means to test new insights and compare approaches to component modelling, while maintaining a focus on predictive capability. This paper describes and discusses the scientific basis, design, implementation and future development of the crop template in APSIM. On this basis, we argue that combining good software engineering with sound crop science can enhance the rate of advance in crop modelling. Crown Copyright (C) 2002 Published by Elsevier Science B.V. All rights reserved.
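
A minimal sketch of the template idea, with one shared development routine and crop differences confined to parameter sets (all names and values are illustrative, not APSIM's actual identifiers):

```python
# Sketch: shared process code, crop-specific parameter files (invented names).
from dataclasses import dataclass

@dataclass
class CropParams:                      # stands in for a GCROP parameter file
    name: str
    base_temp: float                   # deg C, below which no development
    thermal_time_to_flower: float      # deg C days

def daily_thermal_time(t_mean: float, p: CropParams) -> float:
    """Generic development routine shared by all crops (process library)."""
    return max(0.0, t_mean - p.base_temp)

wheat = CropParams("wheat", base_temp=0.0, thermal_time_to_flower=900.0)
sorghum = CropParams("sorghum", base_temp=11.0, thermal_time_to_flower=700.0)

# Same code, different parameter sets -- the essence of the crop template.
for crop in (wheat, sorghum):
    tt = sum(daily_thermal_time(t, crop) for t in [14.0, 18.0, 22.0])
    print(crop.name, round(tt, 1))
```

Even at this scale the design point is visible: adding a crop means adding a parameter set, not new process code.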

Relevance:

30.00%

Publisher:

Abstract:

The Agricultural Production Systems Simulator (APSIM) is a modular modelling framework developed by the Agricultural Production Systems Research Unit in Australia. APSIM was developed to simulate biophysical processes in farming systems, in particular where there is interest in the economic and ecological outcomes of management practice in the face of climatic risk. The paper outlines APSIM's structure and provides details of the concepts behind the different plant, soil and management modules. These modules include a diverse range of crops, pastures and trees; soil processes including water balance, N and P transformations, soil pH and erosion; and a full range of management controls. Reports of APSIM testing in a diverse range of systems and environments are summarised. An example of model performance in a long-term cropping systems trial is provided. APSIM has been used in a broad range of applications, including support for on-farm decision making, farming systems design for production or resource management objectives, assessment of the value of seasonal climate forecasting, analysis of supply chain issues in agribusiness activities, development of waste management guidelines, risk assessment for government policy making, and as a guide to research and education activity. An extensive citation list for these model testing and application studies is provided. Crown Copyright (C) 2002 Published by Elsevier Science B.V. All rights reserved.
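
The modular structure can be sketched as modules that interact only through the engine's shared variable pool. Class and variable names below are invented for illustration; APSIM's actual communication protocol differs in detail:

```python
# Sketch: independent modules exchange named variables via the engine only.
class Engine:
    def __init__(self):
        self.vars = {}
        self.modules = []

    def register(self, module):
        self.modules.append(module)

    def step(self):
        for m in self.modules:
            m.update(self)             # each module reads/writes shared vars

class SoilWater:
    def update(self, engine):
        current = engine.vars.get("soil_water_mm", 100.0)
        engine.vars["soil_water_mm"] = current - 2.0   # daily drawdown (toy)

class Crop:
    def update(self, engine):
        water = engine.vars.get("soil_water_mm", 0.0)
        engine.vars["growth"] = 1.0 if water > 50 else 0.3  # water-limited

engine = Engine()
engine.register(SoilWater())
engine.register(Crop())
for _ in range(3):
    engine.step()
print(engine.vars)
```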

Relevance:

30.00%

Publisher:

Abstract:

In microarray studies, clustering techniques are often applied to derive meaningful insights into the data. In the past, hierarchical methods have been the primary clustering tool employed for this task, and they have mainly been applied heuristically. A major limitation of these methods is their inability to determine the number of clusters, so there is a need for a model-based approach to these clustering problems. To this end, McLachlan et al. [7] developed a mixture model-based algorithm (EMMIX-GENE) for the clustering of tissue samples. To further investigate the EMMIX-GENE procedure as a model-based approach, we present a case study involving the application of EMMIX-GENE to the breast cancer data studied recently in van 't Veer et al. [10]. Our analysis considers the problem of clustering the tissue samples on the basis of the genes, which is a non-standard problem because the number of genes greatly exceeds the number of tissue samples. We demonstrate how EMMIX-GENE can be useful in reducing the initial set of genes down to a more computationally manageable size. The results from this analysis also emphasise the difficulty associated with the task of separating two tissue groups on the basis of a particular subset of genes. These results also shed light on why supervised methods have such a high misallocation error rate for the breast cancer data.
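
A simplified stand-in for the two-step EMMIX-GENE workflow, using ordinary normal mixtures from scikit-learn rather than the t-mixtures of the original: genes are first grouped to reduce dimensionality, and the tissues are then clustered on the group means (toy data throughout):

```python
# Simplified normal-mixture stand-in for EMMIX-GENE (illustrative only).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 20))   # rows: genes, columns: tissue samples (toy)

# Step 1: group the genes (points in R^20) into clusters to reduce the
# problem to a manageable size, as EMMIX-GENE does with its gene groups.
gene_labels = GaussianMixture(n_components=10, random_state=0).fit_predict(X)
metagenes = np.vstack([X[gene_labels == k].mean(axis=0)
                       for k in range(10) if np.any(gene_labels == k)])

# Step 2: cluster the 20 tissue samples on the group means (samples x groups).
tissue_labels = GaussianMixture(n_components=2, random_state=0).fit_predict(metagenes.T)
print(tissue_labels)
```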

Relevance:

30.00%

Publisher:

Abstract:

An equivalent unit cell waveguide approach (WGA) is described to obtain reflection coefficient phase curves for designing a microstrip patch reflectarray supported by a ground plane with periodic apertures or slots. Based on the presented theory, a computer algorithm for determining the reflection coefficient of a plane wave normally incident on a multi-layer structure of patches and apertures is developed. The validity of the developed algorithm is verified by comparing the obtained results with those published in the literature and the ones generated by Agilent High Frequency Structure Simulator (HFSS). A good agreement in all the presented examples is obtained, proving that the developed theory and computer algorithm can be an effective tool for designing multi-layer microstrip reflectarrays with a periodically perforated ground plane. (C) 2003 Wiley Periodicals, Inc.
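
As a hedged illustration of the multi-layer part of such a computation, the sketch below uses the standard transmission-line analogue for a normally incident plane wave: each dielectric layer transforms the load impedance, and the reflection coefficient follows from the cascaded input impedance. It deliberately ignores the patch and aperture periodicity that the WGA handles; the layer values are arbitrary:

```python
# Transmission-line analogue of normal-incidence reflection from a layered
# dielectric stack (illustrative; not the paper's WGA formulation).
import numpy as np

ETA0 = 376.73  # free-space wave impedance, ohms

def input_impedance(z_load, eta, beta_l):
    """Impedance seen through one layer (standard line-transformation formula)."""
    t = np.tan(beta_l)
    return eta * (z_load + 1j * eta * t) / (eta + 1j * z_load * t)

# Two layers backed by free space: (relative permittivity, thickness in metres)
layers = [(2.2, 1.6e-3), (4.4, 0.8e-3)]
freq = 10e9
k0 = 2 * np.pi * freq / 3e8

z = ETA0                       # impedance of the region behind the stack
for eps_r, d in reversed(layers):
    eta = ETA0 / np.sqrt(eps_r)
    z = input_impedance(z, eta, k0 * np.sqrt(eps_r) * d)

gamma = (z - ETA0) / (z + ETA0)
print(f"|Gamma| = {abs(gamma):.3f}, phase = {np.degrees(np.angle(gamma)):.1f} deg")
```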

Relevance:

30.00%

Publisher:

Abstract:

This paper deals with an n-fold Weibull competing risk model. A characterisation of the WPP plot is given along with estimation of model parameters when modelling a given data set. These are illustrated through two examples. A study of the different possible shapes for the density and failure rate functions is also presented. (C) 2003 Elsevier Ltd. All rights reserved.
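
For reference, the structure of such a model under its usual parameterisation (scale α_i, shape β_i per sub-population; the paper's notation may differ), together with the WPP coordinates on which the plot characterisation is based:

```latex
% n competing failure modes: the system fails at the first failure time,
% so the survivor functions multiply.
\[
  R(t) \;=\; \prod_{i=1}^{n} R_i(t)
  \;=\; \prod_{i=1}^{n} \exp\!\left[-\left(\frac{t}{\alpha_i}\right)^{\beta_i}\right]
\]
% Weibull probability paper (WPP) coordinates used for the plot:
\[
  x = \ln t, \qquad
  y = \ln\bigl[-\ln R(t)\bigr]
    = \ln\Bigl[\sum_{i=1}^{n} (t/\alpha_i)^{\beta_i}\Bigr]
\]
```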

Relevance:

30.00%

Publisher:

Abstract:

Purpose: Precise needle puncture of the renal collecting system is an essential but challenging step for successful percutaneous nephrolithotomy. We evaluated the efficiency of a new real-time electromagnetic tracking system for in vivo kidney puncture. Materials and Methods: Six anesthetized female pigs underwent ureterorenoscopy to place a catheter with an electromagnetic tracking sensor into the desired puncture site and ascertain puncture success. A tracked needle with a similar electromagnetic tracking sensor was subsequently navigated into the sensor in the catheter. Four punctures were performed by each of 2 surgeons in each pig, including 1 each in the kidney, middle ureter, and right and left sides. Outcome measurements were the number of attempts and the time needed to evaluate the virtual trajectory and perform percutaneous puncture. Results: A total of 24 punctures were easily performed without complication. Surgeons required more time to evaluate the trajectory during ureteral than kidney puncture (median 15 seconds, range 14 to 18 vs 13, range 11 to 16, p = 0.1). Median renal and ureteral puncture time was 19 (range 14 to 45) and 51 seconds (range 45 to 67), respectively (p = 0.003). Two attempts were needed to achieve a successful ureteral puncture. The technique requires the presence of a renal stone for testing. Conclusions: The proposed electromagnetic tracking solution for renal collecting system puncture proved to be highly accurate, simple and quick. This method might represent a paradigm shift in percutaneous kidney access techniques.
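
At its core, the navigation task reduces to driving the distance between the two tracked sensors to zero; a minimal sketch (invented coordinates and field layout, not a real tracker API):

```python
# Sketch: both the needle tip and the target (the sensor in the
# ureteroscopically placed catheter) report 3-D positions, and guidance
# amounts to minimising their separation.
import numpy as np

def tip_to_target_mm(needle_pos, target_pos):
    return float(np.linalg.norm(np.asarray(needle_pos) - np.asarray(target_pos)))

# One tracker sample (toy coordinates, millimetres)
print(tip_to_target_mm((10.0, 42.5, -3.0), (11.2, 40.1, -2.5)))
```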

Relevance:

30.00%

Publisher:

Abstract:

Dental implant recognition in patients without available records is a time-consuming task that is far from straightforward. The traditional method is an entirely user-dependent process in which an expert compares a 2D X-ray image of the dental implant with a generic database. Given the high number of implant models available and the similarity between them, automatic or semi-automatic frameworks to aid implant model detection are essential. In this study, a novel computer-aided framework for dental implant recognition is proposed. The method relies on image processing concepts, namely: (i) a segmentation strategy for semi-automatic implant delineation; and (ii) a machine learning approach for implant model recognition. Although the segmentation technique is the main focus of the current study, preliminary details of the machine learning approach are also reported. Two different scenarios are used to validate the framework: (1) comparison of the semi-automatic contours against manual implant contours in 125 X-ray images; and (2) classification of 11 known implants using a large reference database of 601 implants. In experiment 1, a Dice metric of 0.97±0.01, a mean absolute distance of 2.24±0.85 pixels, and a Hausdorff distance of 11.12±6 pixels were obtained. In experiment 2, 91% of the implants were successfully recognized while reducing the reference database to 5% of its original size. Overall, the segmentation technique achieved accurate implant contours. Although the preliminary classification results support the concept of the current work, more features and an extended database should be used in future work.
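
The three contour-agreement measures reported in experiment 1 can be computed in a few lines; a minimal numpy-only sketch with toy inputs:

```python
# Dice on binary masks; mean absolute and Hausdorff distances on point sets.
import numpy as np

def dice(a, b):
    """Dice overlap of two boolean masks: 2|A and B| / (|A| + |B|)."""
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def _directed(a_pts, b_pts):
    """Distance from every point of a to its nearest point of b."""
    d = np.linalg.norm(a_pts[:, None, :] - b_pts[None, :, :], axis=2)
    return d.min(axis=1)

def mean_absolute_distance(a_pts, b_pts):
    return 0.5 * (_directed(a_pts, b_pts).mean() + _directed(b_pts, a_pts).mean())

def hausdorff(a_pts, b_pts):
    return max(_directed(a_pts, b_pts).max(), _directed(b_pts, a_pts).max())

# Toy example: a small mask pair and two short contours (pixel coordinates).
mask_a = np.zeros((4, 4), bool); mask_a[1:3, 1:3] = True
mask_b = np.zeros((4, 4), bool); mask_b[1:3, 1:4] = True
auto = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0]])
manual = np.array([[0.1, 0.0], [1.0, 0.2], [0.9, 1.1]])
print(dice(mask_a, mask_b),
      mean_absolute_distance(auto, manual),
      hausdorff(auto, manual))
```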

Relevance:

30.00%

Publisher:

Abstract:

Involving groups in important management processes such as decision making has several advantages. By discussing and combining ideas, counter-ideas, critical opinions, identified constraints, and alternatives, a group of individuals can test potentially better solutions, sometimes in the form of new products, services, and plans. In the past few decades, operations research, AI, and computer science have had tremendous success creating software systems that can achieve optimal solutions, even for complex problems. The only drawback is that people don't always agree with these solutions. Sometimes this dissatisfaction is due to an incorrect parameterization of the problem. Nevertheless, the reasons people don't like a solution might not be quantifiable, because those reasons are often based on aspects such as emotion, mood, and personality. At the same time, monolithic individual decision-support systems centered on optimizing solutions are being replaced by collaborative systems and group decision-support systems (GDSSs) that focus more on establishing connections between people in organizations. These systems follow a kind of social paradigm. Combining both optimization- and social-centered approaches is a topic of current research. However, even if such a hybrid approach can be developed, it will still miss an essential point: the emotional nature of group participants in decision-making tasks. We've developed a context-aware, emotion-based model to design intelligent agents for group decision-making processes. To evaluate this model, we've incorporated it in an agent-based simulator called ABS4GD (Agent-Based Simulation for Group Decision), which we developed. This multiagent simulator considers emotion- and argument-based factors while supporting group decision-making processes. Experiments show that agents endowed with emotional awareness reach agreements more quickly than those without such awareness. Hence, participant agents that integrate emotional factors in their judgments can be more successful because, in exchanging arguments with other agents, they consider the emotional nature of group decision making.
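
A toy sketch of the central idea: an agent's acceptance of a proposal depends not only on the proposal's utility but also on the agent's emotional state. The weighting and attributes below are invented placeholders, not the ABS4GD model itself:

```python
# Illustrative emotional-agent acceptance rule (all numbers assumed).
from dataclasses import dataclass

@dataclass
class Agent:
    name: str
    mood: float          # -1 (negative) .. +1 (positive)
    threshold: float     # minimum score needed to accept a proposal

    def accepts(self, utility: float) -> bool:
        # A positive mood makes the agent more receptive (assumed weighting).
        score = utility + 0.2 * self.mood
        return score >= self.threshold

group = [Agent("a1", mood=0.6, threshold=0.5),
         Agent("a2", mood=-0.4, threshold=0.5)]
proposal_utility = 0.45
print({agent.name: agent.accepts(proposal_utility) for agent in group})
```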

Relevance:

30.00%

Publisher:

Abstract:

The presentation of information does not always take into account the preferences and characteristics of users, or the context surrounding the user. To overcome this gap, we propose an emotional context-aware model for adapting information content to users and groups. The proposed model is based on the OCC and Big Five models to handle emotion and personality, respectively. The idea is to adapt the representation of the information in order to maximize positive emotional valences and minimize negative ones. To evaluate the proposed model, a prototype was developed for adapting RSS news to users and groups of users.
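
A hedged sketch of the adaptation rule: among candidate renderings of the same item, pick the one with the highest predicted emotional valence for the user. The scoring function is an invented placeholder for the OCC/Big Five machinery:

```python
# Illustrative valence-maximising selection (scoring function assumed).
def predicted_valence(variant, user):
    base = variant["tone"]                      # -1 .. +1 tone of the rendering
    return base * (1.0 + 0.5 * user["extraversion"])

user = {"extraversion": 0.8}
variants = [
    {"id": "headline_only", "tone": 0.1},
    {"id": "with_image", "tone": 0.6},
]
best = max(variants, key=lambda v: predicted_valence(v, user))
print(best["id"])
```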

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a new architecture targeting real-time and reliable Distributed Computer-Controlled Systems (DCCS). This architecture provides a structured approach for integrating soft and/or hard real-time applications with Commercial Off-The-Shelf (COTS) components. The Timely Computing Base model is used as the reference model to deal with the heterogeneity of system components with respect to guaranteeing the timeliness of applications. The reliability and availability requirements of hard real-time applications are guaranteed by a software-based fault-tolerance approach.
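
One service a Timely Computing Base offers is the ability to measure timed actions and detect timing failures; a minimal sketch of that idea (function names are illustrative, not the TCB's actual interface):

```python
# Sketch: run an action against a deadline and flag a timing failure.
import time

def run_with_deadline(action, deadline_s):
    start = time.monotonic()
    result = action()
    elapsed = time.monotonic() - start
    return result, elapsed, elapsed <= deadline_s   # timely?

def sample_task():
    time.sleep(0.01)
    return "done"

result, elapsed, timely = run_with_deadline(sample_task, deadline_s=0.05)
print(result, f"{elapsed * 1000:.1f} ms", "timely" if timely else "timing failure")
```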

Relevance:

30.00%

Publisher:

Abstract:

Over the last three decades, computer architects have been able to increase the performance of single processors by, e.g., increasing clock speed, introducing cache memories and using instruction-level parallelism. However, because of power consumption and heat dissipation constraints, this trend is coming to an end. Hardware engineers have instead moved to new chip architectures with multiple processor cores on a single chip. With multi-core processors, applications can complete more total work than with one core alone. To take advantage of multi-core processors, parallel programming models have been proposed as promising solutions for using them more effectively. This paper discusses some existing models and frameworks for parallel programming, leading to the outline of a draft parallel programming model for Ada.
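
The paper's target language is Ada, but the work-sharing idea common to such models can be shown in a language-neutral way; here is the pattern in Python, dividing a data-parallel job across the available cores:

```python
# Language-neutral illustration of a work-sharing parallel model
# (the paper itself targets Ada, not Python).
from multiprocessing import Pool, cpu_count

def square(x: int) -> int:
    return x * x

if __name__ == "__main__":
    with Pool(processes=cpu_count()) as pool:   # one worker per core
        results = pool.map(square, range(10))   # work divided among workers
    print(results)
```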