555 results for MIXED MODELS
Abstract:
This paper presents an approach for identifying a reduced model of coherent areas in power systems, using phasor measurement units to represent the system's inter-area oscillations. Generators that remain coherent over a wide range of operating conditions form the areas, and the reduced model is obtained by representing each area with an equivalent machine. The reduced nonlinear model is then identified from the data obtained from the measurement units. Simulations on three test systems show that the identification process achieves high accuracy.
Abstract:
OBJECTIVE: To synthesise the available evidence and estimate the comparative efficacy of control strategies to prevent total hip replacement (THR)-related surgical site infections (SSIs) using a mixed treatment comparison. DESIGN: Systematic review and mixed treatment comparison. SETTING: Hospital and other healthcare settings. PARTICIPANTS: Patients undergoing THR. PRIMARY AND SECONDARY OUTCOME MEASURES: The number of THR-related SSIs occurring following the surgical operation. RESULTS: 12 studies involving 123 788 THRs and 9 infection control strategies were identified. The strategy of 'systemic antibiotics+antibiotic-impregnated cement+conventional ventilation' significantly reduced the risk of THR-related SSI compared with the referent strategy (no systemic antibiotics+plain cement+conventional ventilation), OR 0.13 (95% credible interval (CrI) 0.03-0.35), and had the highest probability (47-64%) and highest median rank of being the most effective strategy. There was some evidence to suggest that 'systemic antibiotics+antibiotic-impregnated cement+laminar airflow' could potentially increase infection risk compared with 'systemic antibiotics+antibiotic-impregnated cement+conventional ventilation', 1.96 (95% CrI 0.52-5.37). There was no high-quality evidence that antibiotic-impregnated cement without systemic antibiotic prophylaxis was effective in reducing infection compared with plain cement with systemic antibiotics, 1.28 (95% CrI 0.38-3.38). CONCLUSIONS: We found no convincing evidence in favour of the use of laminar airflow over conventional ventilation for prevention of THR-related SSIs, yet laminar airflow is costly and widely used. Antibiotic-impregnated cement without systemic antibiotics may not be effective in reducing THR-related SSIs. The combination with the highest confidence for reducing SSIs was 'systemic antibiotics+antibiotic-impregnated cement+conventional ventilation'. 
Our evidence synthesis underscores the need to review current guidelines based on the available evidence, and to conduct further high-quality double-blind randomised controlled trials to better inform the current clinical guidelines and practice for prevention of THR-related SSIs.
Abstract:
Exact solutions of partial differential equation models describing the transport and decay of single and coupled multispecies problems can provide insight into the fate and transport of solutes in saturated aquifers. Most previous analytical solutions are based on integral transform techniques, meaning that the initial condition is restricted in the sense that the choice of initial condition has an important impact on whether or not the inverse transform can be calculated exactly. In this work we describe and implement a technique that produces exact solutions for single and multispecies reactive transport problems with more general, smooth initial conditions. We achieve this by using a different method to invert a Laplace transform which produces a power series solution. To demonstrate the utility of this technique, we apply it to two example problems with initial conditions that cannot be solved exactly using traditional transform techniques.
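The power-series route to Laplace inversion can be sketched in a few lines: for a linear spatial operator L, term-by-term inversion yields the time series c(x,t) = Σ_n (t^n/n!) L^n[c0](x). The following is a minimal illustration, not the paper's implementation: it assumes a simple advection-decay operator and a Gaussian initial condition, chosen because the exact solution is known for comparison.

```python
import sympy as sp

x, t = sp.symbols('x t')
v, k = 1, sp.Rational(1, 2)      # advection speed and decay rate (illustrative values)
c0 = sp.exp(-x**2)               # smooth initial condition

# Spatial operator L[c] = -v*c_x - k*c for the advection-decay equation c_t = L[c].
def L(c):
    return -v * sp.diff(c, x) - k * c

# Power series in time produced by term-by-term Laplace inversion:
# c(x, t) = sum_n t^n / n! * L^n[c0](x), truncated after N terms.
N = 12
term, series = c0, c0
for n in range(1, N):
    term = L(term)                               # term now holds L^n[c0]
    series += t**n / sp.factorial(n) * term

# For this operator the exact solution is a decayed, translated initial profile.
exact = sp.exp(-k * t) * c0.subs(x, x - v * t)
err = abs(float((series - exact).subs({x: 0.3, t: 0.2})))
```

For small times the truncated series matches the closed form to high accuracy; for coupled multispecies problems the same recursion would apply with a matrix-valued operator.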
Abstract:
Many model-based investigation techniques, such as sensitivity analysis, optimization, and statistical inference, require a large number of model evaluations to be performed at different input and/or parameter values. This limits the application of these techniques to models that can be implemented in computationally efficient computer codes. Emulators, by providing efficient interpolation between outputs of deterministic simulation models, can considerably extend the field of applicability of such computationally demanding techniques. So far, the dominant techniques for developing emulators have been priors in the form of Gaussian stochastic processes (GASP) that were conditioned with a design data set of inputs and corresponding model outputs. In the context of dynamic models, this approach has two essential disadvantages: (i) these emulators do not consider our knowledge of the structure of the model, and (ii) they run into numerical difficulties if there are a large number of closely spaced input points as is often the case in the time dimension of dynamic models. To address both of these problems, a new concept of developing emulators for dynamic models is proposed. This concept is based on a prior that combines a simplified linear state space model of the temporal evolution of the dynamic model with Gaussian stochastic processes for the innovation terms as functions of model parameters and/or inputs. These innovation terms are intended to correct the error of the linear model at each output step. Conditioning this prior to the design data set is done by Kalman smoothing. This leads to an efficient emulator that, due to the consideration of our knowledge about dominant mechanisms built into the simulation model, can be expected to outperform purely statistical emulators at least in cases in which the design data set is small. The feasibility and potential difficulties of the proposed approach are demonstrated by the application to a simple hydrological model.
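Conditioning a linear state-space prior on data by Kalman smoothing follows the standard filter-forward, smooth-backward recursion. Below is a minimal one-dimensional sketch with assumed scalar dynamics and noise levels; it is not the emulator or hydrological model of the paper, only the smoothing machinery it relies on.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D linear state-space model: x_t = a*x_{t-1} + w_t, y_t = x_t + v_t,
# standing in for the simplified linear model that the prior is built on.
a, q, r = 0.9, 0.1, 0.5          # transition coefficient, process var, observation var
T = 200
x = np.zeros(T)
for s in range(1, T):
    x[s] = a * x[s - 1] + rng.normal(0, np.sqrt(q))
y = x + rng.normal(0, np.sqrt(r), T)

# Forward Kalman filter.
m = np.zeros(T); P = np.zeros(T)     # filtered mean and variance
mp = np.zeros(T); Pp = np.zeros(T)   # one-step predictions
m[0], P[0] = y[0], r
for s in range(1, T):
    mp[s], Pp[s] = a * m[s - 1], a * a * P[s - 1] + q   # predict
    K = Pp[s] / (Pp[s] + r)                             # Kalman gain
    m[s] = mp[s] + K * (y[s] - mp[s])                   # update
    P[s] = (1 - K) * Pp[s]

# Backward Rauch-Tung-Striebel smoother.
ms = m.copy(); Ps = P.copy()
for s in range(T - 2, -1, -1):
    G = P[s] * a / Pp[s + 1]
    ms[s] = m[s] + G * (ms[s + 1] - mp[s + 1])
    Ps[s] = P[s] + G * G * (Ps[s + 1] - Pp[s + 1])
```

In the proposed emulator the same smoothing pass conditions the combined linear-model-plus-GASP prior on the design data set, with the innovation terms supplying the state-dependent corrections.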
Abstract:
A predictive model of terrorist activity is developed by examining the daily number of terrorist attacks in Indonesia from 1994 through 2007. The dynamic model employs a shot noise process to capture the self-exciting nature of terrorist activity, estimating the probability of future attacks as a function of the times since past attacks. In addition, the excess of non-attack days, coupled with the presence of multiple coordinated attacks on the same day, motivated the use of hurdle models to jointly model the probability of an attack day and the corresponding number of attacks. A power law distribution with a shot-noise-driven parameter best modelled the number of attacks on an attack day. Interpretation of the model parameters is discussed and the predictive performance of the models is evaluated.
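The self-exciting mechanism can be sketched as a shot noise intensity in which each past attack day adds an exponentially decaying pulse, feeding the first (binary) stage of a hurdle model. The parameter values and the intensity-to-probability mapping below are illustrative assumptions, not the paper's fitted model.

```python
import numpy as np

# Shot noise intensity: baseline mu plus decaying pulses from past attack days.
# mu, kappa, tau are hypothetical parameters chosen for the example.
mu, kappa, tau = 0.05, 0.3, 10.0

def intensity(t, attack_days):
    past = np.asarray([s for s in attack_days if s < t], dtype=float)
    return mu + kappa * np.exp(-(t - past) / tau).sum()

def p_attack_day(t, attack_days):
    # Hurdle stage 1: probability that day t is an attack day,
    # mapped from the intensity via 1 - exp(-lambda).
    return 1.0 - np.exp(-intensity(t, attack_days))
```

Stage 2 of the hurdle (the count of attacks given an attack day) would then be drawn from a power law whose parameter is itself driven by the shot noise process.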
Abstract:
Childhood autism falls under the umbrella of autism spectrum disorders and is generally identified in children over two years of age. There are, of course, variations in severity and clinical manifestations, but the most common features are disinterest in social interaction and engagement in ritualistic and repetitive behaviours. In Singapore the incidence of autism is on the rise as parents become more aware of the early signs of autism and seek healthcare programmes to ensure their child's quality of life is optimised. Two such programmes, Applied Behaviour Analysis and the Floortime approach, have proven successful in alleviating some of the behavioural and social skills problems associated with autism. Using positive behaviour reinforcement, both Applied Behaviour Analysis and the Floortime approach reward behaviour associated with positive social responses.
Abstract:
Background Less invasive methods of determining cardiac output are now readily available. The indicator dilution technique, for example, has made it easier to measure cardiac output continuously because it uses the existing intra-arterial line. Gone, therefore, is the need for a pulmonary artery flotation catheter, and with it the ability to measure left atrial and left ventricular work indices, as well as the ability to monitor and measure mixed venous saturation (SvO2). Purpose The aim of this paper is to put forward the notion that SvO2 provides valuable information about oxygen consumption and venous reserve: important measures in the critically ill for ensuring that oxygen supply meets cellular demand. To illustrate this, a simplified example of the septic patient is offered to highlight the changing pathophysiological sequelae of the inflammatory process and the importance of monitoring SvO2. Relevance to clinical practice SvO2 monitoring, it could be argued, provides the gold standard for assessing arterial and venous oxygen indices in the critically ill. For the bedside ICU nurse, the wealth of information inherent in SvO2 monitoring could provide important data to help avert potential problems with oxygen delivery and consumption. It has been suggested, however, that central venous saturation (ScvO2) might be an attractive alternative to SvO2 because it is less invasive and a sample is easier to obtain for analysis. There are problems with this approach, relating to where the catheter tip is sited and the nature of the venous admixture at that site. Studies have shown that ScvO2 is less accurate than SvO2 and should not be used as the sole guiding variable for decision-making. These studies have demonstrated an unacceptably wide variance between ScvO2 and SvO2, dependent on the presenting disease; in some cases SvO2 will be significantly lower than ScvO2.
Conclusion Whilst newer technologies have been developed to measure cardiac output continuously, SvO2 monitoring remains an important adjunct to clinical decision-making in the ICU. Given the information it provides, seeking alternatives such as ScvO2, or blood samples obtained from femorally placed central venous lines, can lead to inappropriate treatment being given or withheld. Instead, when using ScvO2, trending of this variable should provide clinical determinants that are usable by the bedside ICU nurse, remembering that in most conditions SvO2 will be approximately 16% lower.
Abstract:
Quantum-inspired models have recently attracted increasing attention in Information Retrieval. An intriguing characteristic of the mathematical framework of quantum theory is the presence of complex numbers; however, it is unclear what such numbers could or would actually represent or mean in Information Retrieval. The goal of this paper is to discuss the role of complex numbers within the context of Information Retrieval. First, we introduce how complex numbers are used in quantum probability theory. Then, we examine van Rijsbergen's proposal of evoking complex-valued representations of information objects. We empirically show that such a representation is unlikely to be effective in practice (refuting its usefulness in Information Retrieval). We then explore alternative proposals which may be more successful at realising the power of complex numbers.
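The distinctive role of complex numbers in quantum probability can be seen in miniature through the Born rule: probabilities are squared magnitudes of complex amplitudes, so the relative phase between components of a superposition changes observable outcomes even when component magnitudes are fixed. A toy numpy illustration, not tied to any particular retrieval model:

```python
import numpy as np

# Orthonormal basis states and a fixed measurement direction |+> = (|0> + |1>)/sqrt(2).
e0 = np.array([1, 0], dtype=complex)
e1 = np.array([0, 1], dtype=complex)
plus = (e0 + e1) / np.sqrt(2)

def prob(phase):
    # Superposition with a relative phase; Born rule: P = |<plus|psi>|^2.
    psi = (e0 + np.exp(1j * phase) * e1) / np.sqrt(2)
    amp = np.vdot(plus, psi)        # vdot conjugates the first argument: <plus|psi>
    return abs(amp) ** 2
```

Varying only the phase sweeps the outcome probability from 1 (constructive interference) to 0 (destructive interference); a purely real-valued representation cannot express this degree of freedom.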
Abstract:
This undergraduate student paper explores the use of mixed reality techniques as support tools for conceptual design. A proof-of-concept was developed to illustrate the principle, and using it as an example, a small group of designers was interviewed to determine their views on the use of this technology. These interviews are the main contribution of this paper. Several interesting applications were identified, suggesting possible use in a wide range of domains. Paper-based sketching, mixed reality and sketch augmentation techniques complement each other, and the combination results in a highly intuitive interface.
Abstract:
In today's intensely competitive business environment, quality in manufacturing and in providing products and services can be considered a means by which companies seek excellence and success. With the entry into the era of e-commerce and the emergence of new production systems and new organizational structures, traditional management and quality assurance systems have been challenged. Consequently, the quality information system has gained a special place as one of the new tools of quality management. This paper studies the quality information system through a review of its literature, and investigates the role and position of the Quality Information System (QIS) among an organization's other information systems. Existing QIS models are analyzed and assessed, and on this basis a conceptual, hierarchical model of the quality information system is proposed and studied. As a case study, the hierarchical model is developed by evaluating hierarchical models presented in the QIS field, based on Shetabkar Co.
Abstract:
The Crew Scheduling Problem (CSP) in transportation systems can be too complex for all of its details to be captured; the models designed for it usually ignore or simplify features that are difficult to formulate. This paper proposes an alternative formulation of the problem using a Mixed Integer Programming (MIP) approach. The optimisation model integrates the two phases of pairing generation and pairing optimisation by simultaneously sequencing trips into feasible duties and minimising the total elapsed time of any duty. Crew scheduling constraints requiring the crew to return to their home depot at the end of the shift are included in the model. The flexibility of this model lies in its inclusion of a time interval of relief opportunities, allowing the crew to be relieved during a finite time window. This enhances the robustness of the schedule and provides a better representation of real-world conditions.
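The pairing idea can be illustrated in a much-reduced form by enumerating duty sequences that start and end at the home depot and picking the one with minimum elapsed time. The trip data below are invented for the sketch, and brute-force enumeration stands in for what the paper formulates as a MIP and hands to a solver:

```python
from itertools import permutations

# trip: (origin, destination, departure time, arrival time) -- made-up data
trips = {
    'A': ('depot', 'X', 0, 2),
    'B': ('X', 'depot', 3, 5),
    'C': ('depot', 'X', 1, 4),
}

def feasible(seq):
    # A duty must start and end at the home depot, and consecutive trips
    # must connect in both space and time.
    if trips[seq[0]][0] != 'depot' or trips[seq[-1]][1] != 'depot':
        return False
    return all(trips[a][1] == trips[b][0] and trips[a][3] <= trips[b][2]
               for a, b in zip(seq, seq[1:]))

def elapsed(seq):
    # Total elapsed time of the duty: last arrival minus first departure.
    return trips[seq[-1]][3] - trips[seq[0]][2]

best = min((s for s in permutations(trips, 2) if feasible(s)), key=elapsed)
```

In the full model, binary sequencing variables replace this enumeration, and the relief-opportunity window relaxes the fixed connection times used here.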
Abstract:
Software to create individualised finite element (FE) models of the osseoligamentous spine using pre-operative computed tomography (CT) data-sets for spinal surgery patients has recently been developed. This study presents a geometric sensitivity analysis of this software to assess the effect of intra-observer variability in user-selected anatomical landmarks. User-selected landmarks on the osseous anatomy were defined from CT data-sets for three scoliosis patients and these landmarks were used to reconstruct patient-specific anatomy of the spine and ribcage using parametric descriptions. The intra-observer errors in landmark co-ordinates for these anatomical landmarks were calculated. FE models of the spine and ribcage were created using the reconstructed anatomy for each patient and these models were analysed for a loadcase simulating clinical flexibility assessment. The intra-observer error in the anatomical measurements was low in comparison to the initial dimensions, with the exception of the angular measurements for disc wedge and zygapophyseal joint (z-joint) orientation and disc height. This variability suggested that CT resolution may influence such angular measurements, particularly for small anatomical features, such as the z-joints, and may also affect disc height. The results of the FE analysis showed low variation in the model predictions for spinal curvature with the mean intra-observer variability substantially less than the accepted error in clinical measurement. These findings demonstrate that intra-observer variability in landmark point selection has minimal effect on the subsequent FE predictions for a clinical loadcase.
Abstract:
Autonomous navigation and picture compilation tasks require robust feature descriptions or models. Given the non-Gaussian nature of sensor observations, it will be shown that Gaussian mixture models provide a general probabilistic representation allowing analytical solutions to the update and prediction operations in the general Bayesian filtering problem. Each operation in the Bayesian filter for Gaussian mixture models multiplicatively increases the number of parameters in the representation, leading to the need for a re-parameterisation step. A computationally efficient re-parameterisation step will be demonstrated, resulting in a compact and accurate estimate of the true distribution.
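A common building block for such a re-parameterisation step is merging similar mixture components by moment matching, which preserves the mixture's overall mean and variance while reducing the parameter count. The one-dimensional sketch below uses illustrative values and is a generic reduction rule, not the specific algorithm of the paper:

```python
# Merge two 1-D Gaussian mixture components (weight, mean, variance) into one
# component by moment matching: the merged component keeps the pair's total
# weight, weighted mean, and weighted variance including the spread of means.
def merge(w1, m1, v1, w2, m2, v2):
    w = w1 + w2
    m = (w1 * m1 + w2 * m2) / w
    v = (w1 * (v1 + (m1 - m) ** 2) + w2 * (v2 + (m2 - m) ** 2)) / w
    return w, m, v

w, m, v = merge(0.5, 0.0, 1.0, 0.5, 2.0, 1.0)
```

Applying this rule to the closest pairs of components after each filter step caps the multiplicative growth in parameters while keeping the estimate close to the true posterior.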
Abstract:
Automated process discovery techniques aim at extracting process models from information system logs. Existing techniques in this space are effective when applied to relatively small or regular logs, but generate spaghetti-like and sometimes inaccurate models when confronted with logs with high variability. In previous work, trace clustering has been applied in an attempt to reduce the size and complexity of automatically discovered process models. The idea is to split the log into clusters and to discover one model per cluster. This leads to a collection of process models, each one representing a variant of the business process, as opposed to an all-encompassing model. Still, models produced in this way may exhibit unacceptably high complexity and low fitness. In this setting, this paper presents a two-way divide-and-conquer process discovery technique, wherein the discovered process models are split on the one hand by variants and on the other hand hierarchically using subprocess extraction. Splitting is performed in a controlled manner in order to achieve user-defined complexity or fitness thresholds. Experiments on real-life logs show that the technique produces collections of models substantially smaller than those extracted by applying existing trace clustering techniques, while allowing the user to control the fitness of the resulting models.
Abstract:
Techniques for evaluating and selecting multivariate volatility forecasts are not yet understood as well as their univariate counterparts. This paper considers the ability of different loss functions to discriminate between a set of competing forecasting models which are subsequently applied in a portfolio allocation context. It is found that a likelihood-based loss function outperforms its competitors, including those based on the given portfolio application. This result indicates that considering the particular application of forecasts is not necessarily the most effective basis on which to select models.