33 results for Time-sharing computer systems

in University of Queensland eSpace - Australia


Relevance: 100.00%

Abstract:

Three experiments explored the effectiveness of continuous auditory displays, or sonifications, for conveying information about a simulated anesthetized patient's respiration. Experiment 1 established an effective respiratory sonification. Experiment 2 showed an effect of expertise in the use of respiratory sonification and revealed that some apparent differences in sonification effectiveness could be accounted for by response bias. Experiment 3 showed that sonification helps anesthesiologists to maintain high levels of awareness of the simulated patient's state while performing other tasks more effectively than when relying upon visual monitoring of the simulated patient state. Overall, sonification of patient physiology beyond traditional pulse oximetry appears to be a viable and useful adjunct to visual monitors. Actual and potential applications of this research include monitoring in a wide variety of busy critical care contexts.
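To make the idea of a continuous respiratory sonification concrete, the sketch below maps two simulated respiratory parameters (rate and depth) onto the pitch and loudness of a continuous tone. The mapping, function name, and all constants are invented for illustration; the actual sound design used in the experiments is not specified here.

```python
import numpy as np

def respiratory_sonification(resp_rate_bpm, tidal_depth, duration_s=4.0, fs=8000):
    """Render a continuous tone whose pitch and loudness track breathing.

    Hypothetical mapping (not the one used in the experiments):
    - the respiratory cycle modulates pitch around a 220 Hz base tone
    - tidal depth scales the size of the pitch excursion
    - loudness rises towards the peaks of the cycle
    """
    t = np.arange(int(duration_s * fs)) / fs
    breath = np.sin(2 * np.pi * (resp_rate_bpm / 60.0) * t)   # -1..1 breathing cycle
    freq = 220.0 * (1.0 + 0.25 * tidal_depth * breath)        # instantaneous pitch
    phase = 2 * np.pi * np.cumsum(freq) / fs                  # integrate frequency
    loudness = 0.5 + 0.5 * np.abs(breath)
    return loudness * np.sin(phase)

signal = respiratory_sonification(resp_rate_bpm=12, tidal_depth=1.0)
```

A change in simulated respiratory rate or depth alters the temporal pattern of the tone, which is what allows a listener to monitor the patient without looking at a screen.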

Relevance: 100.00%

Abstract:

The anisotropic norm of a linear discrete-time-invariant system measures system output sensitivity to stationary Gaussian input disturbances of bounded mean anisotropy. Mean anisotropy characterizes the degree of predictability (or colouredness) and spatial non-roundness of the noise. The anisotropic norm falls between the H-2 and H-infinity norms and accommodates their loss of performance when the probability structure of input disturbances is not exactly known. This paper develops a method for numerical computation of the anisotropic norm which involves linked Riccati and Lyapunov equations and an associated algebraic equation of a special type.
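The H-2 end of the range is the easiest piece to make concrete. The sketch below computes the H-2 norm of a discrete-time state-space system from a discrete Lyapunov equation; the anisotropic norm itself additionally requires the linked Riccati equations the abstract describes, which are not reproduced here. The example system is invented.

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

def h2_norm_discrete(A, B, C):
    """H-2 norm of x[k+1] = A x[k] + B w[k], y[k] = C x[k], with A stable.

    The controllability Gramian P solves the discrete Lyapunov equation
    A P A^T - P + B B^T = 0, and ||G||_2 = sqrt(trace(C P C^T)).
    """
    P = solve_discrete_lyapunov(A, B @ B.T)
    return float(np.sqrt(np.trace(C @ P @ C.T)))

# Scalar example: G(z) = 1 / (z - 0.5), whose H-2 norm is sqrt(4/3).
A = np.array([[0.5]])
B = np.array([[1.0]])
C = np.array([[1.0]])
norm = h2_norm_discrete(A, B, C)
```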

Relevance: 100.00%

Abstract:

Purpose: The aim of this project was to design and evaluate a system that would produce tailored information for stroke patients and their carers, customised according to their informational needs, and facilitate communication between the patient and health professional. Method: A human factors development approach was used to develop a computer system, which dynamically compiles stroke education booklets for patients and carers. Patients and carers are able to select the topics about which they wish to receive information, the amount of information they want, and the font size of the printed booklet. The system is designed so that the health professional interacts with it, thereby providing opportunities for communication between the health professional and patient/carer at a number of points in time. Results: Preliminary evaluation of the system by health professionals, patients and carers was positive. A randomised controlled trial that examines the effect of the system on patient and carer outcomes is underway. (C) 2004 Elsevier Ireland Ltd. All rights reserved.
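The compile step of such a system can be sketched in a few lines. The topic names, content strings, and function names below are invented placeholders, not the actual system's content library.

```python
# Hypothetical content library: two topics, each at two levels of detail.
CONTENT = {
    "what is a stroke": {
        "brief": "A stroke occurs when the blood supply to part of the brain is cut off.",
        "detailed": "A stroke occurs when the blood supply to part of the brain is cut "
                    "off, either by a clot (ischaemic stroke) or a bleed (haemorrhagic "
                    "stroke). Brain cells in the affected area are damaged.",
    },
    "recovery": {
        "brief": "Recovery varies from person to person; rehabilitation helps.",
        "detailed": "Recovery varies from person to person. Rehabilitation, ideally "
                    "starting early, helps most people regain function over time.",
    },
}

def compile_booklet(selected_topics, amount="brief", font_size=12):
    """Assemble a booklet from the topics a patient or carer selected."""
    sections = [f"{topic.title()}\n{CONTENT[topic][amount]}"
                for topic in selected_topics if topic in CONTENT]
    header = f"[font size: {font_size}pt]"
    return header + "\n\n" + "\n\n".join(sections)

booklet = compile_booklet(["what is a stroke", "recovery"], amount="brief", font_size=14)
```

The three patient/carer choices the abstract mentions (topics, amount of information, font size) map directly onto the three parameters of `compile_booklet`.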

Relevance: 100.00%

Abstract:

Granulation is one of the fundamental operations in particulate processing and has a very ancient history and widespread use. Much fundamental particle science has occurred in the last two decades to help understand the underlying phenomena. Yet, until recently the development of granulation systems was mostly based on popular practice. The use of process systems approaches to the integrated understanding of these operations is providing improved insight into the complex nature of the processes. Improved mathematical representations, new solution techniques and the application of the models to industrial processes are yielding better designs, improved optimisation and tighter control of these systems. The parallel development of advanced instrumentation and the use of inferential approaches provide real-time access to system parameters necessary for improvements in operation. The use of advanced models to help develop real-time plant diagnostic systems provides further evidence of the utility of process system approaches to granulation processes. This paper highlights some of those aspects of granulation. (c) 2005 Elsevier Ltd. All rights reserved.
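One standard example of the mathematical representations referred to above is a population balance model of aggregation. The sketch below integrates a discrete Smoluchowski coagulation equation with a constant kernel by forward Euler; it is a generic textbook form chosen for illustration, not one of the specific models discussed in the paper.

```python
import numpy as np

def coagulation_step(n, beta, dt):
    """One forward-Euler step of the discrete Smoluchowski equation.

    n[i] is the number density of granules made of i+1 primary particles;
    beta is a constant aggregation kernel.
    """
    k = len(n)
    dn = np.zeros(k)
    total = n.sum()
    for i in range(k):
        # birth: collisions of sizes (j+1) and (i-j) forming size i+1
        birth = 0.5 * beta * sum(n[j] * n[i - 1 - j] for j in range(i))
        # death: granules of size i+1 aggregating with anything
        death = beta * n[i] * total
        dn[i] = birth - death
    return n + dt * dn

n = np.zeros(30)
n[0] = 1.0                                   # start from monomers only
for _ in range(100):
    n = coagulation_step(n, beta=1.0, dt=0.01)

total_mass = float(np.sum((np.arange(30) + 1) * n))   # conserved by aggregation
```

For a constant kernel the total number density decays analytically as N0/(1 + beta*N0*t/2), which gives a quick correctness check, while total mass stays (numerically) constant.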

Relevance: 100.00%

Abstract:

The bispectrum and third-order moment can be viewed as equivalent tools for testing for the presence of nonlinearity in stationary time series. This is because the bispectrum is the Fourier transform of the third-order moment. An advantage of the bispectrum is that its estimator comprises terms that are asymptotically independent at distinct bifrequencies under the null hypothesis of linearity. An advantage of the third-order moment is that its values in any subset of joint lags can be used in the test, whereas when using the bispectrum the entire (or truncated) third-order moment is required to construct the Fourier transform. In this paper, we propose a test for nonlinearity based upon the estimated third-order moment. We use the phase scrambling bootstrap method to give a nonparametric estimate of the variance of our test statistic under the null hypothesis. Using a simulation study, we demonstrate that the test obtains its target significance level, with large power, when compared to an existing standard parametric test that uses the bispectrum. Further, we show how the proposed test can be used to identify the source of nonlinearity due to interactions at specific frequencies. We also investigate implications for heuristic diagnosis of nonstationarity.
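The two ingredients of such a test — a third-order moment estimator and a phase-scrambling surrogate generator — can be sketched as follows. This is a schematic reimplementation from the description above, not the authors' code; the lag choices and sample sizes are arbitrary.

```python
import numpy as np

def third_order_moment(x, lag1, lag2):
    """Sample estimate of C3(lag1, lag2) = E[x(t) x(t+lag1) x(t+lag2)]."""
    x = x - x.mean()
    n = len(x) - max(lag1, lag2)
    return float(np.mean(x[:n] * x[lag1:lag1 + n] * x[lag2:lag2 + n]))

def phase_scramble(x, rng):
    """Surrogate with the same power spectrum but randomized Fourier phases,
    i.e. a draw consistent with the linear null hypothesis."""
    X = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(X))
    phases[0] = 0.0                       # keep the DC bin real
    if len(x) % 2 == 0:
        phases[-1] = 0.0                  # keep the Nyquist bin real
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=len(x))

rng = np.random.default_rng(0)
x = rng.standard_normal(2048)             # a linear series: C3 should be near zero
stat = third_order_moment(x, 1, 2)
surrogates = [third_order_moment(phase_scramble(x, rng), 1, 2) for _ in range(200)]
# Reject linearity if |stat| is extreme relative to the surrogate distribution.
```

Because phase scrambling preserves the power spectrum exactly, the surrogate distribution estimates the variance of the statistic under the linear null, as the abstract describes.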

Relevance: 100.00%

Abstract:

These notes follow on from the material that you studied in CSSE1000 Introduction to Computer Systems. There you studied details of logic gates, binary numbers and instruction set architectures using the Atmel AVR microcontroller family as an example. In your present course (METR2800 Team Project I), you need to get on to designing and building an application which will include such a microcontroller. These notes focus on programming an AVR microcontroller in C and provide a number of example programs to illustrate the use of some of the AVR peripheral devices.

Relevance: 100.00%

Abstract:

Algorithms for explicit integration of structural dynamics problems with multiple time steps (subcycling) are investigated. Only one such algorithm, due to Smolinski and Sleith, has proved to be stable in a classical sense. A simplified version of this algorithm that retains its stability is presented. However, as with the original version, it can be shown to sacrifice accuracy to achieve stability. Another algorithm in use is shown to be only statistically stable, in that a probability of stability can be assigned if appropriate time step limits are observed. This probability improves rapidly with the number of degrees of freedom in a finite element model. The stability problems are shown to be a property of the central difference method itself, which is modified to give the subcycling algorithm. A related problem is shown to arise when a constraint equation in time is introduced into a time-continuous space-time finite element model. (C) 1998 Elsevier Science S.A.
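The conditional stability at the root of these subcycling issues is easy to exhibit for the plain (single-time-step) central difference method. The sketch below applies it to an undamped oscillator m*u'' + k*u = 0, where the response stays bounded only when dt < 2/omega. This is illustrative background, not the subcycling algorithm itself.

```python
import numpy as np

def central_difference(m, k, u0, v0, dt, n_steps):
    """Explicit central difference integration of m u'' + k u = 0."""
    # Fictitious previous step from a Taylor expansion at t = 0.
    u_prev = u0 - dt * v0 + 0.5 * dt**2 * (-k * u0 / m)
    u = u0
    history = [u0]
    for _ in range(n_steps):
        u_next = 2.0 * u - u_prev + dt**2 * (-k * u / m)
        u_prev, u = u, u_next
        history.append(u)
    return np.array(history)

omega = 2.0                                    # natural frequency sqrt(k/m)
# dt = 0.1 is well inside the stability limit 2/omega = 1.0: bounded response.
u_stable = central_difference(m=1.0, k=omega**2, u0=1.0, v0=0.0, dt=0.1, n_steps=200)
# dt = 1.5 violates the limit: the response grows without bound.
u_unstable = central_difference(m=1.0, k=omega**2, u0=1.0, v0=0.0, dt=1.5, n_steps=200)
```

In a multi-time-step scheme, different parts of the mesh are integrated with different dt, which is why modifying the central difference method in this way complicates the stability analysis.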

Relevance: 100.00%

Abstract:

Training-needs analysis is critical for defining and procuring effective training systems. However, traditional approaches to training-needs analysis are not suitable for capturing the demands of highly automated and computerized work domains. In this article, we propose that work domain analysis can identify the functional structure of a work domain that must be captured in a training system, so that workers can be trained to deal with unpredictable contingencies that cannot be handled by computer systems. To illustrate this argument, we outline a work domain analysis of a fighter aircraft that defines its functional structure in terms of its training objectives, measures of performance, basic training functions, physical functionality, and physical context. The functional structure or training needs identified by work domain analysis can then be used as a basis for developing functional specifications for training systems, specifically its design objectives, data collection capabilities, scenario generation capabilities, physical functionality, and physical attributes. Finally, work domain analysis also provides a useful framework for evaluating whether a tendered solution fulfills the training needs of a work domain.

Relevance: 100.00%

Abstract:

Three different, well established systems for e-referral were examined. They ranged from a system in a single country handling a large number of cases (60,000 per year) to a global system covering many countries which handled fewer cases (150 per year). Nonetheless, there appeared to be a number of common features. Whether the purpose is e-transfer or e-consultation, the underlying model of the e-referral process is: the referrer initiates an e-request; the organization managing the process receives it and allocates it for reply; the responder replies to the initiator. Various things can go wrong and the organization managing the e-referral process needs to be able to track requests through the system; this requires various performance metrics. E-referral can be conducted using email, or as messages passed either directly between computer systems or via a Web-link to a server. The experience of the three systems studied shows that significant changes in work practice are needed to launch an e-referral service successfully. The use of e-referral between primary and secondary care improves access to services and can be shown to be cost-effective.
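The common process model described above (initiate, receive, allocate, reply) is essentially a small state machine, and the tracking requirement amounts to logging each transition. The sketch below is a generic illustration with invented state and field names, not any of the three systems studied.

```python
from dataclasses import dataclass, field
from enum import Enum

class State(Enum):
    INITIATED = "initiated"   # referrer sends the e-request
    RECEIVED = "received"     # managing organization receives it
    ALLOCATED = "allocated"   # allocated to a responder for reply
    REPLIED = "replied"       # responder replies to the initiator

@dataclass
class Referral:
    referrer: str
    state: State = State.INITIATED
    log: list = field(default_factory=list)   # audit trail for tracking/metrics

    def advance(self, new_state: State) -> None:
        order = list(State)
        if order.index(new_state) != order.index(self.state) + 1:
            raise ValueError(f"cannot go from {self.state.value} to {new_state.value}")
        self.state = new_state
        self.log.append(new_state.value)

referral = Referral(referrer="gp-clinic-12")
for s in (State.RECEIVED, State.ALLOCATED, State.REPLIED):
    referral.advance(s)
```

Timestamping each log entry would yield the performance metrics the text mentions, such as time from receipt to allocation.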

Relevance: 100.00%

Abstract:

An appreciation of the physical mechanisms which cause observed seismicity complexity is fundamental to the understanding of the temporal behaviour of faults and single slip events. Numerical simulation of fault slip can provide insights into fault processes by allowing exploration of the parameter spaces which influence the microscopic and macroscopic physics of these processes. Particle-based models such as the Lattice Solid Model have been used previously for the simulation of stick-slip dynamics of faults, although mainly in two dimensions. Recent increases in the power of computers and the ability to use the power of parallel computer systems have made it possible to extend particle-based fault simulations to three dimensions. In this paper a particle-based numerical model of a rough planar fault embedded between two elastic blocks in three dimensions is presented. A very simple friction law without any rate dependency and no spatial heterogeneity in the intrinsic coefficient of friction is used in the model. To simulate earthquake dynamics the model is sheared in a direction parallel to the fault plane with a constant velocity at the driving edges. Spontaneous slip occurs on the fault when the shear stress is large enough to overcome the frictional forces on the fault. Slip events with a wide range of event sizes are observed. Investigation of the temporal evolution and spatial distribution of slip during each event shows a high degree of variability between the events. In some of the larger events highly complex slip patterns are observed.
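A drastically reduced analogue of such a fault model — one dimension, quasi-static loading, and nearest-neighbour stress transfer in the style of Olami-Feder-Christensen spring-block models, rather than a full 3D particle simulation — already reproduces the qualitative observation above: a homogeneous failure threshold still yields slip events with a wide range of sizes. All parameter values below are arbitrary illustration choices.

```python
import numpy as np

rng = np.random.default_rng(42)
n_blocks = 200
threshold = 1.0          # homogeneous failure stress (no spatial heterogeneity)
alpha = 0.4              # fraction of a slipping block's stress sent to each neighbour

def run_event(stress):
    """Relax all over-threshold blocks; return the event size (number of slips)."""
    size = 0
    while True:
        over = np.flatnonzero(stress >= threshold)
        if over.size == 0:
            return size
        size += over.size
        for i in over:
            s = stress[i]
            stress[i] = 0.0                       # block slips, drops its stress
            if i > 0:
                stress[i - 1] += alpha * s        # stress transfer to neighbours
            if i < n_blocks - 1:
                stress[i + 1] += alpha * s

stress = rng.uniform(0.0, threshold, n_blocks)
event_sizes = []
for _ in range(3000):
    stress += threshold - stress.max()            # load until the weakest point fails
    event_sizes.append(run_event(stress))
```

Most events involve a single block, but stress transfer occasionally triggers cascades spanning many blocks, mirroring the variability in event size and slip pattern reported for the 3D model.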