65 results for Computer-simulation


Relevance:

60.00%

Publisher:

Abstract:

This paper presents a new method for blind source separation by exploiting the phase and frequency redundancy of cyclostationary signals in a complementary way. It requires a weaker separation condition than those methods which exploit only the phase diversity or the frequency diversity of the source signals. The separation criterion is to diagonalize a polynomial matrix whose coefficient matrices consist of the correlation and cyclic correlation matrices, at time delay τ = 0, of multiple measurements. An algorithm is proposed to perform the blind source separation. Computer simulation results illustrate the performance of the new algorithm in comparison with existing ones.
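The paper's cyclic-correlation criterion is not reproduced here, but the underlying idea of separating sources by diagonalizing second-order statistics can be sketched with a simpler classical relative: an AMUSE-style procedure that whitens the mixtures and then diagonalizes a single lagged correlation matrix. Everything below (sources, mixing matrix, lag) is invented for illustration:

```python
import numpy as np

# Two toy sources with different temporal structure (invented for illustration).
n = 20000
t = np.arange(n)
s = np.vstack([np.sin(0.05 * t), np.sign(np.sin(0.3 * t))])
A = np.array([[1.0, 0.6], [0.4, 1.0]])   # "unknown" mixing matrix
x = A @ s                                 # observed mixtures

# Step 1: whiten the mixtures using the zero-lag covariance.
C0 = (x @ x.T) / n
d, E = np.linalg.eigh(C0)
W = E @ np.diag(d ** -0.5) @ E.T
z = W @ x

# Step 2: diagonalize a symmetrized lagged correlation matrix of the
# whitened data; its eigenvectors supply the remaining rotation.
tau = 5
C_tau = (z[:, :-tau] @ z[:, tau:].T) / (n - tau)
C_sym = 0.5 * (C_tau + C_tau.T)
_, V = np.linalg.eigh(C_sym)
y = V.T @ z   # recovered sources, up to order, sign and scale
```

Separation succeeds here because the two sources have distinct autocorrelations at the chosen lag; the cyclostationary methods in the abstract relax exactly this kind of condition.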

Relevance:

60.00%

Publisher:

Abstract:

This paper presents a new algorithm for blind source separation (BSS) by exploiting the phase and frequency redundancy of cyclostationary signals in a complementary way. The separation criterion is to diagonalize a polynomial matrix whose coefficient matrices consist of the correlation and cyclic correlation matrices of multiple measurements. Computer simulation results illustrate the performance of the new algorithm in comparison with some existing algorithms.

Relevance:

60.00%

Publisher:

Abstract:

This paper addresses the problem of the design of a precoder for multiple transmit antenna communication systems with spatially and temporally correlated fading channels. Using the theories of matrix differential calculus, the paper derives a precoder for unitary space-time codes that can exploit the spatio-temporal correlation in the time-varying fading channels. The design criterion is based on minimizing the mean square error of the channel estimates. Computer simulation results show that a significant performance gain can be achieved by using the designed precoder.
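As a rough illustration of the estimation problem behind this design, the sketch below compares a linear MMSE channel estimate, which exploits an assumed spatial correlation matrix, against plain least squares, which ignores it. The correlation model, pilot matrix, and noise level are all invented; the paper's actual precoder optimization is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(1)

nt, npilot, trials = 4, 8, 200
rho = 0.9                                        # assumed antenna correlation
idx = np.arange(nt)
R = rho ** np.abs(idx[:, None] - idx[None, :])   # exponential correlation model
Lc = np.linalg.cholesky(R)
P = rng.standard_normal((npilot, nt))            # pilot matrix (invented)
sigma2 = 0.5                                     # noise variance

err_mmse = err_ls = 0.0
for _ in range(trials):
    h = Lc @ rng.standard_normal(nt)             # spatially correlated channel
    y = P @ h + np.sqrt(sigma2) * rng.standard_normal(npilot)
    # LMMSE estimate: the prior correlation R shrinks the noisy observation.
    h_mmse = R @ P.T @ np.linalg.solve(P @ R @ P.T + sigma2 * np.eye(npilot), y)
    # Least-squares estimate: no use of the correlation structure.
    h_ls = np.linalg.lstsq(P, y, rcond=None)[0]
    err_mmse += np.sum((h - h_mmse) ** 2)
    err_ls += np.sum((h - h_ls) ** 2)
```

The gap between the two error totals is the kind of gain a correlation-aware design can buy.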

Relevance:

60.00%

Publisher:

Abstract:

This paper analyses the convergence behaviour of the parallel interference cancellation (PIC) detector in code division multiple access (CDMA) systems. Using the results from previous stability analysis of an iterated-map neural network, the paper derives a general condition from which the sufficient condition for convergence of the PIC detector with tentative decision functions that are monotonically increasing at a sublinear rate can be calculated. As examples, the paper derives the sufficient conditions for convergence of the PIC detector with the clip decision and the hyperbolic tangent decision functions. The paper also examines the convergence behaviour of the PIC detector with hyperbolic tangent decision function via computer simulation and compares it with the analytical results.
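A minimal sketch of such a PIC iteration, assuming a toy synchronous CDMA model with random spreading codes and no noise (all parameters invented); the gain alpha is chosen small enough that the iterated map is a contraction, in the spirit of the sufficient conditions discussed:

```python
import numpy as np

rng = np.random.default_rng(2)

K, N = 5, 64                                            # users, spreading gain
S = rng.choice([-1.0, 1.0], size=(N, K)) / np.sqrt(N)   # random spreading codes
b_true = rng.choice([-1.0, 1.0], size=K)                # transmitted bits
r = S @ b_true                                          # received chips (noise omitted)

R = S.T @ S                        # user correlation matrix (unit diagonal)
y = S.T @ r                        # matched-filter outputs
off = R - np.diag(np.diag(R))      # multiple-access interference terms

# Iterate b_{t+1} = tanh(alpha * (y - off @ b_t)); a sufficient condition
# for convergence bounds alpha via the eigenvalues of `off`.
alpha = 1.5
b = np.zeros(K)
converged = False
for _ in range(200):
    b_new = np.tanh(alpha * (y - off @ b))
    if np.max(np.abs(b_new - b)) < 1e-8:
        converged = True
        break
    b = b_new
```

Hard decisions are then `np.sign(b)`; with a contractive map the iteration settles to a unique fixed point rather than oscillating.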

Relevance:

60.00%

Publisher:

Abstract:

Predicting the acoustic outcome and acceptance of a concert performance hall by its users can be a difficult and onerous task, even for an acoustician. This paper discusses how previous research findings from expert authorities have been assembled into a method of evaluating the acoustic performance of halls. Several parameters of acoustic quality and quantitative measures have been identified in the literature; these relate to Beranek’s acoustic variables of performance. Famous existing concert halls which have previously been evaluated and rated are now studied in terms of their results from a computer simulation. The research findings suggest that the use of a simulation program can be extremely accurate in predicting the acoustic performance of new, as-yet-unbuilt concert halls.
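Reverberation time is one of the standard quantitative measures associated with Beranek's variables; as a flavour of the kind of calculation a simulation program automates, here is Sabine's classical formula applied to a hypothetical shoebox hall (all dimensions and absorption coefficients invented):

```python
# Sabine's formula: RT60 = 0.161 * V / A, where V is the room volume (m^3)
# and A the total sound absorption (m^2 sabins).
def rt60_sabine(volume_m3, surfaces):
    """surfaces: list of (area_m2, absorption_coefficient) pairs."""
    absorption = sum(area * alpha for area, alpha in surfaces)
    return 0.161 * volume_m3 / absorption

# Hypothetical shoebox hall: 40 m x 25 m x 15 m.
volume = 40 * 25 * 15
surfaces = [
    (40 * 25, 0.05),                     # floor (coefficients invented)
    (40 * 25, 0.30),                     # ceiling
    (2 * 40 * 15 + 2 * 25 * 15, 0.10),   # walls
    (1200.0, 0.55),                      # occupied audience seating
]
rt = rt60_sabine(volume, surfaces)       # roughly 2 s for these numbers
```

Real hall-acoustics software models much more than this single statistic (early reflections, clarity, spatial impression), which is why simulation rather than a single formula is needed.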

Relevance:

60.00%

Publisher:

Abstract:

Experiential simulations have been used effectively for teaching business, medicine and engineering. Many are supported by computer systems that create artificial virtual spaces so learners can safely practise intricate professional skills. Surprisingly few attempts have been made to utilise such approaches in teaching IT/IS principles, and requirements engineering (RE) in particular. This paper reports on FAB ATM, one of the few learning environments which rely on computer simulation and which have been designed specifically to train IS professionals and, in particular, to develop their RE skills. In its framework, FAB ATM combines and balances elements of video-based computer simulation with activities such as classroom instruction. This paper explains the principles of the FAB ATM design, its coverage of RE activities, and the anecdotal experiences of students and staff who have used this environment in practice.

Relevance:

60.00%

Publisher:

Abstract:

The IS education field has made increasing use of computerised experiential simulations, but few attempts have been made to create an authentic learning environment that combines and balances elements of video-based computer simulation with real-life learning activities. This paper explores the design principles used to develop a CD-ROM simulation where learners use interviewing skills to elicit system requirements from simulated employees in an authentic context. The employees are videoed actors who converse with each other and with learners within a dynamic interaction model. The paper also describes how we combined this simulation with other teaching approaches such as in-class discussions, student teamwork and formal presentations.

Relevance:

60.00%

Publisher:

Abstract:

During large scale wildfires, suppression activities are carried out under the direction of an Incident Management Team (IMT). The aim of the research was to increase understanding of decision processes potentially related to IMT effectiveness. An IMT comprises four major functions: Command, Operations, Planning, and Logistics. Four methodologies were used to study IMT processes: computer simulation experiments; analyses of wildfire reports; interviews with IMT members; and cognitive ethnographic studies of IMTs. Three processes were important determinants of IMT effectiveness: information management and cognitive overload; matching component function goals to overall goals; and team metacognition to detect and counter task-disruptive developments. These processes appear to be complex multi-person analogues of individual Incident Command processes identified previously. The findings have implications for issues such as: creating IMTs; training IMTs; managing IMTs; and providing decision support to IMTs.

Relevance:

60.00%

Publisher:

Abstract:

Large scale bushfire (or wildfire) suppression activities are conducted under the control of an Incident Management Team (IMT) comprising four major functions: Command, Operations, Planning, and Logistics. Four methodologies were used to investigate processes determining the effectiveness of IMT decision making activities: (a) laboratory experiments using the Networked Fire Chief computer simulation program; (b) analyses of reports of significant fires; (c) structured interviews with experienced IMT staff; and, (d) cognitive ethnographic studies of IMTs. Three classes of team processes were found to be important determinants of IMT effectiveness: information sharing and management; matching of the four component function goals to overall IMT goals; and monitoring of the overall IMT situation to detect and correct task disruptive processes. Several non-rational processes with the potential for hindering IMT effectiveness were noted. Team metacognition emerged as a key process for understanding effective IMT decision making.

Relevance:

60.00%

Publisher:

Abstract:

Viral marketing is a form of peer-to-peer communication in which individuals are encouraged to pass on promotional messages within their social networks. Conventional wisdom holds that the viral marketing process is both random and unmanageable. In this paper, we deconstruct the process and investigate the formation of the activated digital network as distinct from the underlying social network. We then consider the impact of the social structure of digital networks (random, scale-free, and small-world) and of the transmission behavior of individuals on campaign performance. Specifically, we identify alternative social network models to understand the mediating effects of the social structures of these models on viral marketing campaigns. Next, we analyse an actual viral marketing campaign and use the empirical data to develop and validate a computer simulation model for viral marketing. Finally, we conduct a number of simulation experiments to predict the spread of a viral message within different types of social network structures under different assumptions and scenarios. Our findings confirm that the social structure of digital networks plays a critical role in the spread of a viral message. Managers seeking to optimize campaign performance should give consideration to these findings before designing and implementing viral marketing campaigns. We also demonstrate how a simulation model is used to quantify the impact of campaign management inputs and how these insights can support managerial decision making.
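A toy version of such a simulation experiment, assuming an independent-cascade transmission model on an Erdős–Rényi random graph (the paper also considers scale-free and small-world structures, which are not generated here); network size, seeding, and forwarding probabilities are all invented:

```python
import random

random.seed(3)

def random_graph(n, p):
    """Erdős–Rényi random graph stored as adjacency sets."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if random.random() < p:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def cascade(adj, seeds, forward_prob):
    """Independent-cascade spread: each newly activated node gets one
    chance to forward the message to each neighbour."""
    active = set(seeds)
    frontier = list(seeds)
    while frontier:
        nxt = []
        for node in frontier:
            for nb in adj[node]:
                if nb not in active and random.random() < forward_prob:
                    active.add(nb)
                    nxt.append(nb)
        frontier = nxt
    return active

g = random_graph(500, 0.02)                 # hypothetical digital network
seeds = [0, 1, 2, 3, 4]                     # initial campaign recipients
reach_low = len(cascade(g, seeds, 0.05))    # reluctant forwarding: fizzles
reach_high = len(cascade(g, seeds, 0.50))   # enthusiastic forwarding: epidemic
```

Re-running the experiment over different graph generators is how the structural effects described above would be quantified.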

Relevance:

60.00%

Publisher:

Abstract:

The purpose of the present research study was to produce a global, cumulative model of number concept development for children between the ages of two and eight years old. The theoretical and methodological orientation of this study was greatly influenced by Richard Young's production system analysis of seriation by young children (Young, 1971, 1976) and by Newell's (1973) seminal paper, ‘You can't play twenty questions with nature and win’. The methodology used in this investigation was as follows. A series of complex number tasks encompassing many aspects of the concept of number were developed. Five children aged between three and seven years were then videotaped while performing some of these complex number tasks. From a detailed protocol analysis of the video-recordings, computer simulation models written in the production system language PSS3 (Ohlsson, 1979) were produced. Specific production system models were produced for each of the following aspects of the children's number knowledge: (i) sharing of discrete quantities; (ii) comparison of shares; and (iii) conservation/addition/subtraction of number. These domain-specific models were based on the converging experimental evidence obtained from each of the children’s responses to variants of the complex number tasks. Each child thus received a different set of problems, chosen systematically in order to clarify particular features of the child's abilities. After a production system model for each child had been produced within a domain, these models were compared and contrasted. From this analysis, developmental trends within the domain were identified and discussed, along with their research and educational implications. In the concluding parts of this study, the children's domain-specific production system models were cumulated into global, comprehensive models which accurately represented their behaviour in a variety of number tasks. These comprehensive models were compared and contrasted, and general developmental trends in young children's number knowledge were identified and discussed.
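The PSS3 models themselves are not reproduced here, but the flavour of a production system, condition-action rules firing against a working memory until quiescence, can be sketched with a toy model of sharing discrete quantities between two children (the rules below are invented, not the thesis's actual productions):

```python
# A toy production-system interpreter: on each cycle, fire the first rule
# whose condition matches working memory; stop when no rule matches.
def run(rules, memory):
    while True:
        for condition, action in rules:
            if condition(memory):
                action(memory)
                break
        else:
            return memory

# Productions for fair sharing of discrete items between children A and B.
rules = [
    # If items remain and it is A's turn, give one to A and pass the turn.
    (lambda m: m["items"] > 0 and m["turn"] == "A",
     lambda m: m.update(items=m["items"] - 1, A=m["A"] + 1, turn="B")),
    # If items remain and it is B's turn, give one to B and pass the turn.
    (lambda m: m["items"] > 0 and m["turn"] == "B",
     lambda m: m.update(items=m["items"] - 1, B=m["B"] + 1, turn="A")),
]

memory = run(rules, {"items": 10, "A": 0, "B": 0, "turn": "A"})
```

A child who shares by dealing items out one at a time is modelled by exactly this kind of alternation rule; differences between children appear as different rule sets.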

Relevance:

60.00%

Publisher:

Abstract:

Investigation of the role of hypothesis formation in complex (business) problem solving has resulted in a new approach to hypothesis generation. A prototypical hypothesis generation paradigm for management intelligence has been developed, reflecting a widespread need to support management in areas such as fraud detection and intelligent decision analysis. This dissertation presents this new paradigm and its application to goal-directed problem solving methodologies, including case-based reasoning. The hypothesis generation model, which is supported by a dynamic hypothesis space, consists of three components: Anomaly Detection, Abductive Reasoning, and Conflict Resolution. Anomaly detection activates the hypothesis generation model by scanning for anomalous data and relations in its working environment. The respective heuristics are activated by initial indications of anomalous behaviour based on evidence from historical patterns, linkages with other cases, inconsistencies, etc. Abductive reasoning, as implemented in this paradigm, is based on joining conceptual graphs, and provides an inference process that can incorporate a new observation into a world model by determining what assumptions should be added to the world so that it can explain new observations. Abductive inference is a weak mechanism for generating explanations and hypotheses: although a practical conclusion cannot be guaranteed, the cues provided by the inference are very beneficial. Conflict resolution is crucial for the evaluation of explanations, especially those generated by a weak (abductive) mechanism. The measurements developed in this research for explanations and hypotheses provide an indirect way of estimating the ‘quality’ of an explanation for given evidence. Such methods are realistic for complex domains such as fraud detection, where the prevailing hypothesis may not always be relevant to the new evidence. In order to survive in rapidly changing environments, it is necessary to bridge the gap that exists between the system’s view of the world and reality. Our research has demonstrated the value of Case-Based Interaction, which utilises a hypothesis structure for the representation of relevant planning and strategic knowledge. Under the guidance of case-based interaction, users are active agents empowered by system knowledge, and the system acquires its auxiliary information and knowledge from this external source. Case studies using the new paradigm and drawn from the insurance industry have attracted wide interest. A prototypical system of fraud detection for motor vehicle insurance, based on a hypothesis-guided problem solving mechanism, is now under commercial development. The initial feedback from claims managers is promising.
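The conceptual-graph machinery is far richer than can be shown here, but the abductive step, scoring candidate hypotheses by how well their predicted effects match the observed evidence, can be caricatured in a few lines; the fraud-domain rules, evidence, and scoring weights below are all invented:

```python
# Toy abduction: given causal rules (cause -> expected effects) and an
# anomalous observation set, hypothesize the cause whose predictions best
# cover the evidence, penalising predictions that were not observed.
rules = {
    "staged_accident": {"late_night_claim", "no_witnesses", "prior_claims"},
    "genuine_accident": {"police_report", "witnesses"},
    "inflated_claim": {"prior_claims", "high_repair_quote"},
}

evidence = {"late_night_claim", "no_witnesses", "high_repair_quote"}

def abduce(rules, evidence):
    scored = []
    for cause, effects in rules.items():
        explained = len(effects & evidence)     # evidence this cause accounts for
        unsupported = len(effects - evidence)   # predictions not (yet) observed
        scored.append((explained - 0.5 * unsupported, cause))
    return max(scored)[1]

best = abduce(rules, evidence)
```

This mirrors the "weak mechanism" caveat above: the top-scoring hypothesis is only a cue, and conflict resolution must still weigh it against the alternatives.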

Relevance:

60.00%

Publisher:

Abstract:

It has been recognised that formal methods are useful as a modelling tool in requirements engineering. Specification languages such as Z permit the precise and unambiguous modelling of system properties and behaviour. However some system problems, particularly those drawn from the information systems problem domain, may be difficult to model in crisp or precise terms. It may also be desirable that formal modelling should commence as early as possible, even when our understanding of parts of the problem domain is only approximate. This thesis suggests fuzzy set theory as a possible representation scheme for this imprecision or approximation. A fuzzy logic toolkit that defines the operators, measures and modifiers necessary for the manipulation of fuzzy sets and relations is developed. The toolkit contains a detailed set of laws that demonstrate the properties of the definitions when applied to partial set membership. It also provides a set of laws that establishes an isomorphism between the toolkit notation and that of conventional Z when applied to boolean sets and relations. The thesis also illustrates how the fuzzy logic toolkit can be applied in the problem domains of interest. Several examples are presented and discussed including the representation of imprecise concepts as fuzzy sets and relations, system requirements as a series of linguistically quantified propositions, the modelling of conflict and agreement in terms of fuzzy sets and the partial specification of a fuzzy expert system. The thesis concludes with a consideration of potential areas for future research arising from the work presented here.
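The toolkit itself is expressed in Z, but the standard fuzzy-set operators it formalises (max for union, min for intersection, concentration for the linguistic modifier "very") are easy to state executably; the example sets below are invented:

```python
# Fuzzy sets as dicts mapping elements to membership grades in [0, 1].
def f_union(a, b):          # membership of the union: pointwise max
    return {x: max(a.get(x, 0.0), b.get(x, 0.0)) for x in a.keys() | b.keys()}

def f_intersection(a, b):   # membership of the intersection: pointwise min
    return {x: min(a.get(x, 0.0), b.get(x, 0.0)) for x in a.keys() | b.keys()}

def f_complement(a):        # membership of the complement: 1 - grade
    return {x: 1.0 - m for x, m in a.items()}

def very(a):                # linguistic modifier 'very': concentration (square)
    return {x: m ** 2 for x, m in a.items()}

tall = {"ann": 0.9, "bob": 0.5, "cy": 0.2}
fast = {"ann": 0.3, "bob": 0.8}

u = f_union(tall, fast)
very_tall = very(tall)
```

When every grade is 0 or 1 these operators collapse to ordinary set union, intersection and complement, which is the isomorphism with conventional Z that the toolkit's laws establish.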

Relevance:

60.00%

Publisher:

Abstract:

In the last 30 to 40 years, many researchers have combined to build the knowledge base of theory and solution techniques that can be applied to the case of differential equations which include the effects of noise. This class of "noisy" differential equations is now known as stochastic differential equations (SDEs). Markov diffusion processes are included within the field of SDEs through the drift and diffusion components of the Itô form of an SDE. When these drift and diffusion components are moderately smooth functions, the processes' transition probability densities satisfy the Fokker-Planck-Kolmogorov (FPK) equation, a partial differential equation (PDE). Thus there is a mathematical inter-relationship that allows solutions of SDEs to be determined from the solution of a noise-free differential equation of a type that has been extensively studied since the 1920s. The main numerical solution technique employed to solve the FPK equation is the classical Finite Element Method (FEM). The FEM is of particular importance to engineers when used to solve FPK systems that describe noisy oscillators. The FEM is a powerful tool, but it is cumbersome when applied to multidimensional systems and can lead to large and complex matrix systems with their inherent solution and storage problems. I show in this thesis that the stochastic Taylor series (TS) based time discretisation approach to the solution of SDEs is an efficient and accurate technique that provides transition and steady-state solutions to the associated FPK equation. The TS approach to the solution of SDEs has certain advantages over the classical techniques, including the ability to tackle stiff systems effectively, simplicity of derivation, and ease of implementation and re-use. Unlike the FEM approach, which is difficult to apply in even two dimensions, the simplicity of the TS approach is independent of the dimension of the system under investigation. Its main disadvantage, the large number of simulations required and the associated CPU demands, is countered by its underlying structure, which makes it perfectly suited to the now prevalent parallel and distributed processing systems. In summary, I will compare the TS solution of SDEs with the solution of the associated FPK equations using the classical FEM technique. One-, two- and three-dimensional FPK systems that describe noisy oscillators have been chosen for the analysis. As higher-dimensional FPK systems are rarely mentioned in the literature, the TS approach will be extended to essentially infinite-dimensional systems through the solution of stochastic PDEs. In making these comparisons, the advantages of modern computing tools such as computer algebra systems and simulation software, when used as an adjunct to the solution of SDEs or their associated FPK equations, are demonstrated.
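As a small illustration of the TS approach at its lowest order, here is the Euler-Maruyama scheme (the order-0.5 stochastic Taylor method) applied to an Ornstein-Uhlenbeck SDE, whose stationary density, the solution of the associated FPK equation, is known in closed form and can be checked against the simulated paths (all parameters invented):

```python
import numpy as np

rng = np.random.default_rng(4)

# Ornstein-Uhlenbeck SDE: dX = -theta * X dt + sigma dW.
# Its FPK equation has the stationary solution N(0, sigma^2 / (2 * theta)).
theta, sigma = 1.0, 0.5
dt, n_steps, n_paths = 0.01, 5000, 2000

# Euler-Maruyama: X_{k+1} = X_k + drift * dt + sigma * sqrt(dt) * Z_k.
x = np.zeros(n_paths)
for _ in range(n_steps):
    dW = np.sqrt(dt) * rng.standard_normal(n_paths)
    x = x + (-theta * x) * dt + sigma * dW

empirical_var = x.var()                  # variance across simulated paths
analytic_var = sigma ** 2 / (2 * theta)  # stationary variance from the FPK solution
```

The agreement between the two variances is the pointwise analogue of the thesis's comparison between TS-simulated densities and FEM solutions of the FPK equation; higher-order TS schemes reduce the discretisation bias at the cost of more involved derivatives.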