957 results for model-based reasoning processes


Relevance:

100.00%

Publisher:

Abstract:

Reviews of the state of professional practice in Requirements Engineering (RE) stress that the RE process is both complex and hard to describe, and suggest there is a significant difference between competent and "approved" practice. "Approved" practice is reflected in (and, in all likelihood, has its genesis in) RE education, so that the knowledge and skills taught to students do not match the knowledge and skills required and applied by competent practitioners.

A new understanding of the RE process has emerged from our recent study. RE is revealed as inherently creative, involving cycles of building and major reconstruction of the models developed, significantly different from the systematic and smoothly incremental process generally described in the literature. The process is better characterised as highly creative, opportunistic and insight-driven. This mismatch between approved and actual practice presents a challenge to RE education: RE requires insight and creativity as well as technical knowledge. Traditional learning models applied to RE focus, however, on notation and prescribed processes acquired through repetition. We argue that traditional learning models fail to support the learning required for RE, and we propose both a new model based on cognitive flexibility and a framework for RE education to support this model.

Relevance:

100.00%

Publisher:

Abstract:

Adult peregrine falcons (Falco peregrinus macropus) have monotypic plumage and display strong reversed sexual dimorphism, with females significantly larger than males. Reversed sexual dimorphism is measurable among nestlings in the latter stages of their development and can therefore be used to differentiate between the sexes. In the early stages of development, however, nestlings cannot be sexed with any degree of certainty because morphological differentiation between the sexes is not well developed. During this study we developed a model for sexing younger nestlings based on genetic analysis and morphometric data collected as part of a long-term banding study of this species. A discriminant function model based on morphological characteristics was developed for determining the sex of nestlings (n = 150) in the field and was shown to be 96.0% accurate. This predictive model was further tested against an independent morphometric dataset from a second group of nestlings (n = 131), of which it correctly sexed 96.2%. Using this model, sex can be determined reliably (98.6% accuracy) for nestlings with a wing length of at least 9 cm. Application of the model therefore allows the banding of younger nestlings and, as such, significantly extends the period over which banding can occur. Another important implication is that nestlings banded earlier are less likely to jump from the nest, reducing the risk of injury to both the brood and the bander.
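A minimal sketch of a discriminant-function sexing model of the kind described in this abstract, using scikit-learn. The morphometric features (wing length, mass, footpad length), the simulated measurements and the effect sizes are hypothetical stand-ins, not the study's actual variables or data.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Simulated nestlings: females (1) larger than males (0) on average.
n = 150
sex = rng.integers(0, 2, n)
wing_cm = rng.normal(14 + 3 * sex, 2.0)      # wing length (cm)
mass_g = rng.normal(450 + 150 * sex, 60.0)   # body mass (g)
footpad_mm = rng.normal(55 + 8 * sex, 4.0)   # footpad length (mm)
X = np.column_stack([wing_cm, mass_g, footpad_mm])

model = LinearDiscriminantAnalysis().fit(X, sex)

# Second simulated group standing in for the independent validation set (n = 131).
sex2 = rng.integers(0, 2, 131)
X2 = np.column_stack([
    rng.normal(14 + 3 * sex2, 2.0),
    rng.normal(450 + 150 * sex2, 60.0),
    rng.normal(55 + 8 * sex2, 4.0),
])
print("validation accuracy:", model.score(X2, sex2))
```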

Relevance:

100.00%

Publisher:

Abstract:

This paper develops a model of exchange rate determination within an error correction framework. The intention is to identify both long- and short-term determinants that can be used to forecast the AUD/US exchange rate. The paper identifies a set of significant variables associated with exchange rate movements over a twenty-year period from 1984 to 2004. Specifically, the overnight interest rate differential, Australia's foreign trade-weighted exposure to commodity prices and exchange rate volatility are identified as variables able to explain movements in the AUD/US dollar relationship. An error correction model is subsequently constructed that incorporates an equilibrium correction term, a short-term interest rate differential variable, a commodity price variable and a proxy for exchange rate volatility. The model is then used to forecast out of sample and is found to dominate a naïve random walk model on three different metrics.
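A minimal two-step (Engle-Granger style) error correction sketch of the kind of model the abstract describes, using statsmodels. The simulated series and variable names are placeholders for the paper's actual regressors (the overnight interest rate differential, a trade-weighted commodity price measure and a volatility proxy).

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
T = 240  # monthly observations, roughly twenty years

commodity = np.cumsum(rng.normal(0, 1, T))       # I(1) commodity price index
fx = 0.5 * commodity + rng.normal(0, 0.5, T)     # cointegrated AUD/US rate
rate_diff = rng.normal(0, 0.3, T)                # overnight interest differential
vol = np.abs(rng.normal(0, 0.2, T))              # exchange rate volatility proxy

# Step 1: long-run (cointegrating) regression; its residual is the
# equilibrium-correction term.
long_run = sm.OLS(fx, sm.add_constant(commodity)).fit()
ecm_term = long_run.resid

# Step 2: short-run dynamics with the lagged equilibrium-correction term.
d_fx = np.diff(fx)
X = np.column_stack([ecm_term[:-1], rate_diff[1:], np.diff(commodity), vol[1:]])
short_run = sm.OLS(d_fx, sm.add_constant(X)).fit()
print(short_run.params)  # a negative ECM coefficient implies reversion to equilibrium
```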

Relevance:

100.00%

Publisher:

Abstract:

The objective of the present work is to search for a correlation between the carbon content of steels and the parameters of the rheological models used to describe material behavior during hot plastic deformation. Such a correlation can be expected in internal variable models, which are based on the physical phenomena occurring in the material. A model of this kind, based on dislocation density as the internal variable, is investigated in this work. Experiments including hot torsion tests are used for the analysis.

The procedure is composed of three parts. Plastometric tests were performed for steels with various carbon contents. Optimization techniques were then applied to determine the coefficients of the internal variable rheological model for these steels. Two versions of the model are considered: one is based on the average dislocation density, and the second accounts for the distribution of dislocation densities. Evaluating the correlation between carbon content and model coefficients such as the activation energy for self-diffusion, the activation energy for recrystallization, the grain boundary mobility and the recovery coefficient was the main objective of the work. As a result, a model that may be used to simulate hot forming processes for steels with various chemical compositions is proposed.
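A minimal sketch of an internal-variable model of this kind, with mean dislocation density as the state variable. The hardening/recovery evolution law used here (the standard Kocks-Mecking form) and all coefficient values are illustrative assumptions, not the optimized, carbon-dependent coefficients identified in the work.

```python
import numpy as np

alpha, G, b = 0.5, 8.1e10, 2.5e-10   # Taylor constant, shear modulus (Pa), Burgers vector (m)
k1, k2 = 2.0e8, 20.0                 # storage and recovery coefficients (illustrative)

def flow_stress(rho):
    """Taylor relation: flow stress from the mean dislocation density."""
    return alpha * G * b * np.sqrt(rho)

rho = 1.0e12                         # initial dislocation density (m^-2)
d_eps = 1.0e-3                       # strain increment
for step in range(1000):             # integrate the evolution law to strain 1.0
    rho += (k1 * np.sqrt(rho) - k2 * rho) * d_eps  # storage minus recovery

print(f"flow stress at strain 1.0: {flow_stress(rho) / 1e6:.0f} MPa")
```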

Relevance:

100.00%

Publisher:

Abstract:

Investigation of the role of hypothesis formation in complex (business) problem solving has resulted in a new approach to hypothesis generation. A prototypical hypothesis generation paradigm for management intelligence has been developed, reflecting a widespread need to support management in such areas as fraud detection and intelligent decision analysis. This dissertation presents this new paradigm and its application to goal-directed problem solving methodologies, including case-based reasoning. The hypothesis generation model, which is supported by a dynamic hypothesis space, consists of three components: Anomaly Detection, Abductive Reasoning, and Conflict Resolution models. Anomaly detection activates the hypothesis generation model by scanning for anomalous data and relations in its working environment. The respective heuristics are activated by initial indications of anomalous behaviour based on evidence from historical patterns, linkages with other cases, inconsistencies, and so on.

Abductive reasoning, as implemented in this paradigm, is based on joining conceptual graphs, and provides an inference process that can incorporate a new observation into a world model by determining what assumptions should be added to the world so that it can explain new observations. Abductive inference is a weak mechanism for generating explanations and hypotheses: although a practical conclusion cannot be guaranteed, the cues provided by the inference are very beneficial. Conflict resolution is crucial for the evaluation of explanations, especially those generated by a weak (abductive) mechanism. The measurements developed in this research for explanations and hypotheses provide an indirect way of estimating the 'quality' of an explanation for given evidence. Such methods are realistic for complex domains such as fraud detection, where the prevailing hypothesis may not always be relevant to the new evidence. In order to survive in rapidly changing environments, it is necessary to bridge the gap that exists between the system's view of the world and reality.

Our research has demonstrated the value of Case-Based Interaction, which utilises a hypothesis structure for the representation of relevant planning and strategic knowledge. Under the guidance of case-based interaction, users are active agents empowered by system knowledge, and the system acquires its auxiliary information/knowledge from this external source. Case studies using the new paradigm, drawn from the insurance industry, have attracted wide interest. A prototypical fraud detection system for motor vehicle insurance based on a hypothesis-guided problem solving mechanism is now under commercial development. The initial feedback from claims managers is promising.
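A toy sketch of the abductive step outlined in this abstract: given a world model mapping candidate hypotheses to the observations they would explain, propose the assumptions that must be added to account for new, anomalous evidence. The claims-fraud rules and scoring here are invented for illustration and are not taken from the dissertation's conceptual-graph implementation.

```python
# Hypothetical world model: hypothesis -> observations it would explain.
WORLD_MODEL = {
    "staged_accident": {"late_night_claim", "no_witnesses", "recent_policy"},
    "inflated_repair_bill": {"repair_cost_outlier", "single_garage_quote"},
    "genuine_claim": {"police_report", "witnesses"},
}

def abduce(observations):
    """Rank hypotheses by how much of the anomalous evidence they explain."""
    scored = []
    for hypothesis, explains in WORLD_MODEL.items():
        covered = observations & explains
        if covered:
            # coverage is a crude proxy for explanation 'quality'
            scored.append((len(covered) / len(observations), hypothesis, covered))
    return sorted(scored, reverse=True)

anomaly = {"late_night_claim", "no_witnesses", "repair_cost_outlier"}
for score, hypothesis, covered in abduce(anomaly):
    print(f"{hypothesis}: explains {sorted(covered)} (coverage {score:.2f})")
```

As the abstract notes, such abductive scoring is a weak mechanism: competing hypotheses with partial coverage still require a conflict resolution step.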

Relevance:

100.00%

Publisher:

Abstract:

The development of fault-tolerant computing systems is a very difficult task. Two reasons contribute to this difficulty. The first is that, in normal practice, fault-tolerant computing policies and mechanisms are deeply embedded in most application programs, so these programs cannot cope with changes in environments, policies and mechanisms. These factors may change frequently in a distributed environment, especially a heterogeneous one. Therefore, in order to develop better fault-tolerant systems that can cope with constant changes in environments and user requirements, it is essential to separate the fault-tolerant computing policies and mechanisms from application programs. The second is that, although a number of techniques have been proposed for the construction of reliable and fault-tolerant computing systems, and many computer systems are being developed to tolerate various hardware and software failures, most of these systems are intended for specific application areas; it remains extremely difficult to develop systems for general-purpose fault-tolerant computing. The motivation of this thesis stems from these two aspects. The focus of the thesis is on developing a model based on reactive system concepts for building better fault-tolerant computing applications. Reactive system concepts are an attractive paradigm for system design, development and maintenance because they separate policies from mechanisms. The emphasis of the model is on providing a flexible system architecture for general-purpose fault-tolerant application development, and the model can be applied in many specific applications. With this reactive system model, we can separate fault-tolerant computing policies and mechanisms in applications, so that the development and maintenance of fault-tolerant computing systems can be made easier.
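A minimal sketch of the policy/mechanism separation the reactive-system model advocates: fault events arrive at a reactive entry point, an interchangeable policy object decides on a response, and mechanisms carry it out. Class and method names are illustrative, not the thesis's actual architecture.

```python
class RetryPolicy:
    def decide(self, fault):
        return "retry" if fault["transient"] else "failover"

class FailoverPolicy:
    def decide(self, fault):
        return "failover"

class FaultTolerantService:
    """Application code stays unchanged when the policy object is swapped."""
    def __init__(self, policy):
        self.policy = policy
        self.mechanisms = {"retry": self._retry, "failover": self._failover}

    def on_fault(self, fault):              # reactive entry point (sensor event)
        action = self.policy.decide(fault)  # policy: what to do
        self.mechanisms[action](fault)      # mechanism: how to do it

    def _retry(self, fault):
        print(f"retrying {fault['op']}")

    def _failover(self, fault):
        print(f"failing {fault['op']} over to a replica")

service = FaultTolerantService(RetryPolicy())
service.on_fault({"op": "write", "transient": True})
service.policy = FailoverPolicy()           # environment changed: swap policy only
service.on_fault({"op": "write", "transient": True})
```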

Relevance:

100.00%

Publisher:

Abstract:

Drawing upon one research project, Home-School-Community Partnerships for Enhancing Children's Numeracy Development, we critically examine some problems entailed in conceptualizing the subjects and objects of inquiry, conducting fieldwork with subjects (as knowing agents), and interpreting and disseminating the knowledge gained. Addressing these issues in practice has entailed necessary consideration of fundamental tensions centred on the professional power-knowledge of teachers and a dominant cultural discourse that situates numeracy learning in the school.

A theoretical model (based upon Engeström's Activity Theory) was used to specify and analyse various types of partnerships within a network of mutually interconnected activities to support children's learning (Bloome et al., 2000; Engeström, 1999). By decentring the school within this model, we have been led to a closer analysis of the concept of 'partnership' and of the social construction of parental and community involvement in children's numeracy development. One of the most problematic aspects of partnerships evident in our research is the way in which the term 'numeracy' is understood by different stakeholders. Awareness of this has shaped the conduct and dissemination of our research and ultimately enabled us to identify critical issues for further inquiry.

Relevance:

100.00%

Publisher:

Abstract:

This paper establishes the Full Potential Management (FPM) Model, based upon the social model of disability coupled with principles of diversity management and disability-oriented human resource management. Although the concept of management was once envisioned as having 'value to society' by improving the quality of life through efficient practices (Rimler, 1976), the management literature has narrowly defined management as a means to gain increased productivity and achieve organizational goals, thus overlooking social formation and implementation design for a better life (Diener & Seligman, 2004; Small, 2004; Whitley, 1989). Based upon the diversity literature, we propose that socially oriented diversity management principles and practices are the key to transforming management concepts from achieving organizational potential to achieving social aims that maximize the potential and quality of life of each person.

Relevance:

100.00%

Publisher:

Abstract:

Each year, large amounts of money and labor are spent on patching the vulnerabilities in operating systems and popular software to prevent exploitation by worms. Modeling the propagation process can help us devise effective strategies against the spread of such worms. This paper presents a microcosmic analysis of worm propagation procedures. Our proposed model differs from traditional methods: it examines the propagation procedure deep inside the network, among individual nodes, by concentrating on the propagation probability and time delay described by a complex matrix. Moreover, since the analysis gives a microcosmic insight into a worm's propagation, the proposed model can avoid errors that are usually concealed in traditional macroscopic analytical models. The objectives of this paper are to address three practical aspects of preventing worm propagation: (i) where do we patch? (ii) how many nodes do we need to patch? (iii) when do we patch? We implement a series of experiments to evaluate the effects of each major component in our microcosmic model. Based on the experimental results, for high-risk vulnerabilities it is critical that networks reduce the number of vulnerable nodes to below 80%. We believe our microcosmic model can benefit the security industry by enabling significant savings in the deployment of security patching schemes.
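A small simulation in the spirit of the microcosmic model described above: node-to-node spread is governed by a propagation-probability matrix P and a time-delay matrix D, and patching removes nodes from the vulnerable set. The network size, probabilities and delays are invented to illustrate the "how many nodes do we patch?" question, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100
P = rng.uniform(0, 0.05, (n, n))          # P[i, j]: prob. node i infects j per step
D = rng.integers(1, 4, (n, n))            # D[i, j]: propagation delay (steps) on i -> j

def simulate(patched_fraction, steps=50):
    vulnerable = rng.random(n) >= patched_fraction
    infected_at = np.full(n, -1)          # -1: not infected; otherwise infection time
    infected_at[0] = 0                    # patient zero seeds the outbreak
    for t in range(steps):
        for i in np.flatnonzero(infected_at >= 0):
            hits = rng.random(n) < P[i]   # attempted infections this step
            ok = hits & vulnerable & (infected_at < 0) & (t >= infected_at[i] + D[i])
            infected_at[ok] = t
    return (infected_at >= 0).sum() / n

for frac in (0.0, 0.2, 0.5):
    print(f"patched {frac:.0%}: {simulate(frac):.0%} of nodes infected")
```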

Relevance:

100.00%

Publisher:

Abstract:

This article reports on a study that investigates the possibilities of developing a professional learning model based on action research that could lead to sustained improvements in teaching and learning in schools in remote areas of Papua New Guinea. The issues related to the implementation of this model are discussed using a critical lens that questions the use of ‘western’ constructs about ‘successful’ professional learning and ‘quality’ education in Papua New Guinea. In the article, we discuss the notion of ‘professional learning’ and how action research can be conceived as a model for professional learning. Then, we discuss some of the issues and difficulties that are arising during the implementation of our study. The article concludes with a discussion of implications for future developments of professional learning for teachers in countries such as Papua New Guinea.

Relevance:

100.00%

Publisher:

Abstract:

A hybrid neural network model, based on the fusion of fuzzy adaptive resonance theory (FA ART) and the general regression neural network (GRNN), is proposed in this paper. Both FA and the GRNN are incremental learning systems and are very fast in network training. The proposed hybrid model, denoted GRNNFA, retains these advantages and, at the same time, reduces the computational requirements of calculating and storing kernel information. A clustering version of the GRNN is designed, with data compression by FA for noise removal. An adaptive gradient-based kernel width optimization algorithm has also been devised; convergence of the gradient descent algorithm can be accelerated by geometric incremental growth of the updating factor. A series of experiments with four benchmark datasets was conducted to assess and compare the effectiveness of GRNNFA with that of other approaches. The GRNNFA model is also employed in a novel application: predicting the evacuation time of patrons at typical karaoke centers in Hong Kong in the event of a fire. The results positively demonstrate the applicability of GRNNFA to noisy data regression problems.
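The GRNN half of GRNNFA is essentially Nadaraya-Watson kernel regression; this minimal sketch shows that core estimator only. The FA ART clustering/compression stage and the adaptive kernel-width optimization described in the abstract are omitted, and the kernel width sigma is fixed by hand rather than learned.

```python
import numpy as np

def grnn_predict(X_train, y_train, x, sigma=0.5):
    """GRNN output: a kernel-weighted average of training targets."""
    d2 = np.sum((X_train - x) ** 2, axis=1)     # squared distances to x
    w = np.exp(-d2 / (2 * sigma ** 2))          # Gaussian kernel weights
    return np.dot(w, y_train) / np.sum(w)

rng = np.random.default_rng(3)
X = rng.uniform(-3, 3, (200, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.1, 200)   # noisy regression problem

x_new = np.array([1.0])
print("prediction at x=1:", grnn_predict(X, y, x_new))
print("true value:", np.sin(1.0))
```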

Relevance:

100.00%

Publisher:

Abstract:

Data and information quality is a well-established research topic that is gradually appearing on decision-makers' lists of top concerns. Many studies have investigated generic data/information quality issues and factors by providing a high-level abstract framework or model. Building on these previous studies, this study discusses the actual data quality issues that emerged with operational and middle-level managers during emergency department data collection and reporting processes. By mapping data quality issues to business processes, possible data quality issues are summarised under the well-known TOP model, and recommendations for data quality improvement are suggested.

Relevance:

100.00%

Publisher:

Abstract:

This work demonstrates a novel Bayesian learning approach for model-based analysis of functional Magnetic Resonance Imaging (fMRI) data. We use a physiologically inspired hemodynamic model and investigate a method to simultaneously infer the neural activity together with the hidden states and the physiological parameters of the model. This joint estimation problem is still an open topic. In our work we use a Particle Filter accompanied by a kernel smoothing approach to address this problem within a general filtering framework. Simulation results show that the proposed method is consistent and has good potential to be extended for further fMRI data analysis.
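A toy sketch of the joint state/parameter filtering scheme described above: a bootstrap particle filter in which each particle carries both a hidden state and a model parameter, with Liu-West style kernel smoothing (shrinkage plus jitter) applied to the parameter particles. A scalar AR(1) system stands in for the far richer hemodynamic model, and the Liu-West variant is an assumption consistent with "Particle Filter with kernel smoothing", not necessarily the paper's exact scheme.

```python
import numpy as np

rng = np.random.default_rng(4)
T, n_particles, a_true = 100, 1000, 0.8

# Simulate data from x_t = a * x_{t-1} + w_t, y_t = x_t + v_t.
x, ys = 0.0, []
for _ in range(T):
    x = a_true * x + rng.normal(0, 0.5)
    ys.append(x + rng.normal(0, 0.5))

states = rng.normal(0, 1, n_particles)
params = rng.uniform(0, 1, n_particles)          # unknown AR coefficient
delta = 0.98
shrink = (3 * delta - 1) / (2 * delta)           # Liu-West shrinkage factor

for y in ys:
    # Kernel-smooth the parameter particles: shrink toward the mean, add jitter.
    m, v = params.mean(), params.var()
    jitter = rng.normal(0, np.sqrt((1 - shrink ** 2) * v), n_particles)
    params = shrink * params + (1 - shrink) * m + jitter
    # Propagate states and weight by the observation likelihood.
    states = params * states + rng.normal(0, 0.5, n_particles)
    w = np.exp(-0.5 * ((y - states) / 0.5) ** 2)
    w /= w.sum()
    idx = rng.choice(n_particles, n_particles, p=w)  # resample
    states, params = states[idx], params[idx]

print(f"estimated AR coefficient: {params.mean():.2f} (true {a_true})")
```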

Relevance:

100.00%

Publisher:

Abstract:

Previous experience and research indicate that the Pareto Principle (the 80/20 principle) has been widely used in many industries to achieve more with less. The study described in this paper concurs that this principle can be applied to improve estimating accuracy and efficiency, especially in the design development stage of projects. Establishing an effective cost estimating model to improve accuracy and efficiency in the design development stage has attracted much research attention over several decades, and for almost 40 years studies have indicated that using the 80/20 principle is one possible approach. However, most of these studies were built on assumptions, theoretical analysis or questionnaire surveys. The objective of this research is to explore a logical and systematic method of establishing a cost estimating model based on the Pareto Principle. This paper includes an extensive literature review on cost estimating accuracy and efficiency in the construction industry, which points out the current gaps in knowledge and understanding of the topic. The review helps develop the direction for the research and explores a potential methodology for using the Pareto Principle in the new cost estimating model. The findings suggest that combining the Pareto Principle with statistical analysis could improve the accuracy and efficiency of current estimating methods in the design development stage.
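A minimal sketch of the Pareto screening idea behind such an estimating model: rank cost items by value and find the small subset carrying roughly 80% of total cost, which would then be estimated in detail. The cost items and amounts are invented for illustration.

```python
import numpy as np

# Hypothetical elemental cost plan (thousands of dollars).
items = {"structure": 420, "services": 310, "finishes": 150, "facade": 90,
         "fittings": 60, "siteworks": 45, "preliminaries": 40, "misc": 25}

names = sorted(items, key=items.get, reverse=True)           # rank by cost
costs = np.array([items[k] for k in names], dtype=float)
cum_share = np.cumsum(costs) / costs.sum()                   # cumulative cost share

cutoff = np.searchsorted(cum_share, 0.8) + 1                 # items covering ~80%
print(f"{cutoff} of {len(names)} items carry "
      f"{cum_share[cutoff - 1]:.0%} of total cost: {names[:cutoff]}")
```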