894 results for Inter Session Variability Modelling


Relevance:

20.00%

Publisher:

Abstract:

The purpose of this paper is to conduct a qualitative review of randomised controlled trials on the treatment of adults with co-occurring mental health and substance use disorders (MH/SUD). In particular, integrated approaches are compared with non-integrated approaches to treatment. Ten articles were identified for inclusion in the review. The findings are equivocal with regard to the superior efficacy of integrated approaches to treatment, although the many limitations of the studies need to be considered in interpreting this finding. Clearly, this is an extremely challenging client group to engage and maintain in intervention research, and the complexity and variability of the problems render control particularly difficult. The lack of available evidence to support the superiority of integration is discussed in relation to these challenges. Much remains to be investigated with regard to integrated management and care for people with co-occurring MH/SUD, particularly for specific combinations of dual diagnosis and for varying levels of inter-relatedness between the disorders.

Relevance:

20.00%

Publisher:

Abstract:

The equations governing saltwater intrusion in coastal aquifers are complex. Backward Euler time-stepping approaches are often used to advance the solution of these equations in time, but they typically require small time steps to ensure that an accurate solution is obtained. We show that a method of lines approach incorporating variable-order backward differentiation formulas can greatly improve the efficiency of the time-stepping process.
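The gain comes from letting a stiff ODE integrator pick both the step size and the BDF order adaptively. The sketch below is a minimal illustration of the idea, not the paper's density-coupled flow and transport solver: a one-dimensional diffusion equation stands in for salt transport, and SciPy's variable-order BDF integrator advances the semi-discretised system in time.

```python
# A minimal method-of-lines sketch, assuming a 1D diffusion equation as a
# stand-in for the (far more complex) saltwater intrusion equations.
import numpy as np
from scipy.integrate import solve_ivp

nx = 101                       # spatial grid points
dx = 1.0 / (nx - 1)            # grid spacing on a unit domain
D = 0.01                       # diffusion coefficient

def rhs(t, c):
    """Semi-discrete diffusion: dc/dt = D * d2c/dx2, boundary values held fixed."""
    dcdt = np.zeros_like(c)
    dcdt[1:-1] = D * (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx**2
    return dcdt

c0 = np.zeros(nx)
c0[0] = 1.0                    # salt concentration imposed at the coastal boundary

# BDF chooses its order (1-5) and step size adaptively, so accuracy no longer
# forces the uniformly small steps that fixed-step backward Euler would need.
sol = solve_ivp(rhs, (0.0, 50.0), c0, method="BDF", rtol=1e-6, atol=1e-9)
print(f"{sol.t.size} accepted time steps")
```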

Relevance:

20.00%

Publisher:

Abstract:

In the field of the semantic grid, QoS-based Web service scheduling for workflow optimization is an important problem. However, in a semantic- and service-rich environment like the semantic grid, context constraints on Web services are common, so scheduling must consider not only the quality properties of Web services but also the inter-service dependencies that these context constraints create. In this paper, we present a repair genetic algorithm, namely a minimal-conflict hill-climbing repair genetic algorithm, to address scheduling optimization problems in workflow applications in the presence of domain constraints and inter-service dependencies. Experimental results demonstrate the scalability and effectiveness of the genetic algorithm.
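A rough sketch of this style of algorithm appears below, on an illustrative problem instance; the QOS scores, DEPS constraints and all parameter values are made up, and this is not the paper's implementation. A genetic algorithm assigns one candidate service to each workflow task, and a minimal-conflict hill-climbing repair restores feasibility after crossover and mutation.

```python
import random

N_TASKS, CANDIDATES = 6, 4
QOS = [[random.random() for _ in range(CANDIDATES)] for _ in range(N_TASKS)]
# Dependency constraint: if task i uses service a, task j must use service b.
DEPS = [(0, 1, 2, 3), (4, 0, 5, 1)]

def conflicts(ch):
    return [(i, j) for i, a, j, b in DEPS if ch[i] == a and ch[j] != b]

def fitness(ch):
    return sum(QOS[t][s] for t, s in enumerate(ch))

def repair(ch):
    """Min-conflicts hill climbing: take the single-gene move that most
    reduces conflicts (QoS as tie-breaker) until the chromosome is feasible."""
    while conflicts(ch):
        best_key, best_ch = None, None
        for t in range(N_TASKS):
            for s in range(CANDIDATES):
                trial = ch[:]
                trial[t] = s
                key = (len(conflicts(trial)), -fitness(trial))
                if best_key is None or key < best_key:
                    best_key, best_ch = key, trial
        if best_ch == ch:                          # plateau: random perturbation
            ch[random.randrange(N_TASKS)] = random.randrange(CANDIDATES)
        else:
            ch = best_ch
    return ch

def evolve(pop_size=30, generations=50):
    pop = [repair([random.randrange(CANDIDATES) for _ in range(N_TASKS)])
           for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1 = max(random.sample(pop, 3), key=fitness)   # tournament selection
            p2 = max(random.sample(pop, 3), key=fitness)
            cut = random.randrange(1, N_TASKS)
            child = p1[:cut] + p2[cut:]                    # one-point crossover
            if random.random() < 0.2:                      # mutation
                child[random.randrange(N_TASKS)] = random.randrange(CANDIDATES)
            nxt.append(repair(child))                      # keep population feasible
        pop = nxt
    return max(pop, key=fitness)

best = evolve()
print(best, f"fitness={fitness(best):.3f}", "conflicts:", conflicts(best))
```

Repairing every offspring keeps the whole population feasible, so the fitness function never needs penalty terms for violated dependencies.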

Relevance:

20.00%

Publisher:

Abstract:

This study directly measured the load acting on the abutment of the osseointegrated implant system of transfemoral amputees during level walking, and studied the variability of the load within and among amputees. Twelve active transfemoral amputees (age: 54±12 years, mass: 84.3±16.3 kg, height: 1.78±0.10 m) fitted with an osseointegrated implant for over 1 year participated in the study. The load applied on the abutment was measured during unimpeded, level walking in a straight line using a commercial six-channel transducer mounted between the abutment and the prosthetic knee. The pattern and the magnitude of the three-dimensional forces and moments were revealed. Results showed a low step-to-step variability for each subject, but a high subject-to-subject variability in the local extrema of body-weight-normalized forces and moments and in impulse data. The high subject-to-subject variability suggests that the mechanical design of the implant system should be customized for each individual, or that a fit-all design should take into consideration the highest loads within a broad range of amputees. It also suggests that a subject-specific loading regime is necessary in rehabilitation training. Thus, the loading magnitudes and variability demonstrated here should be useful in designing an osseointegrated implant system better able to resist mechanical failure and in refining the rehabilitation protocol.
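The two variability measures contrasted above can be made concrete with a short sketch, assuming synthetic data rather than the study's measurements: the step-to-step coefficient of variation (CV) within each amputee, and the subject-to-subject CV of the per-subject mean peak load.

```python
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_steps = 12, 20
# Hypothetical peak loads (% body weight): each subject walks at a stable
# personal level (small within-subject spread, large between-subject spread).
subject_means = rng.normal(100.0, 15.0, n_subjects)
loads = subject_means[:, None] + rng.normal(0.0, 3.0, (n_subjects, n_steps))

within_cv = loads.std(axis=1, ddof=1) / loads.mean(axis=1) * 100
between_cv = subject_means.std(ddof=1) / subject_means.mean() * 100

print(f"mean step-to-step CV:  {within_cv.mean():.1f}%")   # low
print(f"subject-to-subject CV: {between_cv:.1f}%")         # high
```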

Relevance:

20.00%

Publisher:

Abstract:

Heart disease is the leading cause of death in the world. Although this could be alleviated by heart transplantation, there is a chronic shortage of donor hearts and so mechanical solutions are being considered. Currently, many Ventricular Assist Devices (VADs) are being developed worldwide in an effort to increase life expectancy and quality of life for end-stage heart failure patients. Current pre-clinical testing methods for VADs involve laboratory testing using Mock Circulation Loops (MCLs), and in vivo testing in animal models. The research and development of highly accurate MCLs is vital to the continuous improvement of VAD performance. The first objective of this study was to develop and validate a mathematical model of an MCL. This model could then be used in the design and construction of a variable compliance chamber to improve the performance of an existing MCL, as well as to form the basis for a new miniaturised MCL. An extensive review of the literature was carried out on MCLs and mathematical modelling of their function. A mathematical model of an MCL was then created in the MATLAB/SIMULINK environment. This model included variable features such as resistance, fluid inertia and volumes (resulting from the pipe lengths and diameters); compliance of Windkessel chambers, atria and ventricles; density of both fluid and compressed air applied to the system; gravitational effects on vertical columns of fluid; and accurately modelled actuators controlling the ventricle contraction. This model was then validated using the physical properties and pressure and flow traces produced from a previously developed MCL. A variable compliance chamber was designed to reproduce parameters determined by the mathematical model. The variability was achieved by controlling the transmural pressure across a diaphragm to alter the compliance of the system. An initial prototype was tested in a previously developed MCL, and a variable level of arterial compliance was successfully produced; however, the complete range of compliance values required for accurate physiological representation could not be produced with this initial design. The mathematical model was then used to design a smaller physical mock circulation loop, with the tubing sizes adjusted to produce accurate pressure and flow traces whilst having an appropriate frequency response characteristic. The development of the mathematical model greatly assisted the general design of an in vitro cardiovascular device test rig, while the variable compliance chamber allowed simple, real-time manipulation of MCL compliance, enabling accurate transitions between a variety of physiological conditions. The newly developed MCL provided an accurate mechanical representation of the human circulatory system for in vitro cardiovascular device testing and education purposes. The continued improvement of VAD test rigs is essential if VAD design is to improve, and hence quality of life and life expectancy for heart failure patients.
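The thesis model in MATLAB/SIMULINK is far richer, but one of its building blocks can be sketched minimally (here in Python, with illustrative parameter values): a two-element Windkessel chamber, where compliance C stores volume, resistance R drains it, and a pulsatile pump flow Q(t) drives pressure P. Varying C is what the variable compliance chamber achieves in hardware by controlling the transmural pressure across a diaphragm.

```python
import numpy as np
from scipy.integrate import solve_ivp

R = 1.0     # peripheral resistance, mmHg*s/mL (illustrative)
C = 1.2     # arterial compliance, mL/mmHg (illustrative)

def pump_flow(t, hr=60.0, stroke_volume=70.0):
    """Half-sinusoid ejection during systole, zero flow in diastole (mL/s)."""
    period = 60.0 / hr
    systole = 0.35 * period
    tau = t % period
    if tau >= systole:
        return 0.0
    return (np.pi * stroke_volume / (2.0 * systole)) * np.sin(np.pi * tau / systole)

def dPdt(t, P):
    return [(pump_flow(t) - P[0] / R) / C]     # C dP/dt = Q_in - P/R

t_eval = np.linspace(9.0, 10.0, 1000)          # sample the final beat
sol = solve_ivp(dPdt, (0.0, 10.0), [80.0], t_eval=t_eval, max_step=1e-3)
print(f"pressure swing: {sol.y[0].min():.0f}-{sol.y[0].max():.0f} mmHg")
```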

Relevance:

20.00%

Publisher:

Abstract:

Objective: In the majority of exercise intervention studies, the aggregate reported weight loss is often small, and the efficacy of exercise as a weight-loss tool remains in question. The aim of the present study was to investigate the variability in appetite and body weight when participants engaged in a supervised and monitored exercise programme.

Design: Fifty-eight obese men and women (BMI = 31·8 ± 4·5 kg/m²) were prescribed exercise to expend approximately 2092 kJ (500 kcal) per session, five times a week, at an intensity of 70% maximum heart rate for 12 weeks under supervised conditions in the research unit. Body weight and composition, total daily energy intake and various health markers were measured at weeks 0, 4, 8 and 12.

Results: The mean reduction in body weight (3·2 ± 1·98 kg) was significant (P < 0·001); however, there was large individual variability (−14·7 to +2·7 kg), largely attributable to differences in energy intake over the 12-week intervention. Those participants who failed to lose meaningful weight increased their food intake and reduced their intake of fruits and vegetables.

Conclusion: These data demonstrate that even when exercise energy expenditure is high, a healthy diet is still required for weight loss to occur in many people.

Relevance:

20.00%

Publisher:

Abstract:

Ticagrelor is an orally active ADP P2Y12 receptor antagonist in development by AstraZeneca plc for the reduction of recurrent ischemic events in patients with acute coronary syndromes (ACS). Prior to the development of ticagrelor, thienopyridine compounds, such as clopidogrel, were the focus of research into therapies for ACS. Although the thienopyridines are effective platelet aggregation inhibitors, they are prodrugs and, consequently, exhibit a slow onset of action. In addition, the variability in inter-individual metabolism of thienopyridine prodrugs has been associated with reduced efficacy in some patients. Ticagrelor is not a prodrug and exhibits a more rapid onset of action than the thienopyridine prodrugs. In clinical trials conducted to date, ticagrelor was a potent inhibitor of ADP-induced platelet aggregation and demonstrated effects comparable to those of clopidogrel. In a phase II, short-term trial, the bleeding profile of participants treated with ticagrelor was similar to that obtained with clopidogrel; however, an increased incidence of dyspnea was observed, an effect that has not been reported with the thienopyridines. Considering the occurrence of dyspnea, and the apparent non-superiority of ticagrelor to clopidogrel, it is difficult to justify a clear benefit to the continued development of ticagrelor. Outcomes from an ongoing phase III trial comparing ticagrelor with clopidogrel in 18,000 patients with ACS are likely to impact on the future development of ticagrelor.

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this study was to verify within- and between-day repeatability and variability in children's oxygen uptake (VO₂), gross economy (GE; VO₂ divided by speed) and heart rate (HR) during treadmill walking based on self-selected speed (SS). Fourteen children (10.1 ± 1.4 years) undertook three testing sessions over 2 days in which four walking speeds, including SS, were tested. Within- and between-day repeatability were assessed using the Bland and Altman method, and coefficients of variability (CV) were determined for each child across exercise bouts and averaged to obtain a mean group CV value for VO₂, GE, and HR per speed. Repeated measures analysis of variance showed no statistically significant differences in within- or between-day CV for VO₂, GE, or HR at any speed. Repeatability within- and between-day for VO₂, GE, and HR for all speeds was verified. These results suggest that submaximal VO₂ during treadmill walking is stable and reproducible at a range of speeds based on children's SS.
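The Bland and Altman repeatability analysis named above can be sketched briefly, assuming synthetic VO₂ values rather than the study's data: for a repeated bout of the same measure, compute the mean bias, the 95% limits of agreement, and the mean within-child coefficient of variation (CV).

```python
import numpy as np

rng = np.random.default_rng(1)
day1 = rng.normal(12.0, 1.5, 14)           # VO2 (mL/kg/min), bout on day 1
day2 = day1 + rng.normal(0.0, 0.4, 14)     # repeat bout on day 2

diff = day2 - day1
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)              # half-width of 95% limits of agreement
pairs = np.vstack([day1, day2])
cv = (pairs.std(axis=0, ddof=1) / pairs.mean(axis=0)).mean() * 100

print(f"bias {bias:+.2f}, limits of agreement {bias - loa:.2f} to {bias + loa:.2f}")
print(f"mean within-child CV: {cv:.1f}%")
```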

Relevance:

20.00%

Publisher:

Abstract:

Traditionally, conceptual modelling of business processes involves the use of visual grammars for the representation of, amongst other things, activities, choices and events. These grammars, while very useful for experts, are difficult for naive stakeholders to understand. Annotations of such process models have been developed to assist in understanding aspects of these grammars via map-based approaches, and further work has looked at forms of 3D conceptual models. However, no one has sought to embed the conceptual models into a fully featured 3D world, using the spatial annotations to explicate the underlying model clearly. In this paper, we present an approach to conceptual process model visualisation that enhances a 3D virtual world with annotations representing process constructs, facilitating insight into the developed model. We then present a prototype implementation of a 3D Virtual BPMN Editor that embeds BPMN process models into a 3D world. We show how this gives extra support for tasks performed by the conceptual modeller, providing better process model communication to stakeholders.

Relevance:

20.00%

Publisher:

Abstract:

The roles of weather variability and sunspots in the occurrence of cyanobacteria blooms were investigated using cyanobacteria cell data collected from the Fred Haigh Dam, Queensland, Australia. A time-series generalized linear model and a classification and regression tree (CART) model were used in the analysis. Data on notified cell numbers of cyanobacteria and weather variables over the period 2001 to 2005 were provided by the Australian Department of Natural Resources and Water and the Australian Bureau of Meteorology, respectively. The results indicate that monthly minimum temperature (relative risk [RR]: 1.13, 95% confidence interval [CI]: 1.02-1.25) and rainfall (RR: 1.11; 95% CI: 1.03-1.20) had a positive association, but relative humidity (RR: 0.94; 95% CI: 0.91-0.98) and wind speed (RR: 0.90; 95% CI: 0.82-0.98) were negatively associated with the cyanobacterial numbers, after adjustment for seasonality and auto-correlation. The CART model showed that the cyanobacteria numbers were best described by an interaction between minimum temperature, relative humidity, and sunspot numbers. When minimum temperature exceeded 18 °C and relative humidity was under 66%, the number of cyanobacterial cells rose 2.15-fold. We conclude that weather variability and sunspot activity may affect cyanobacterial blooms in dams.
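The CART step can be sketched minimally, assuming synthetic weather and cell data constructed to mimic the reported interaction (high counts when minimum temperature exceeds 18 °C and relative humidity is under 66%); this is not the study's data set.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(42)
n = 300
min_temp = rng.uniform(5, 30, n)           # monthly minimum temperature, deg C
rel_hum = rng.uniform(40, 95, n)           # relative humidity, %
sunspots = rng.uniform(0, 150, n)          # monthly sunspot number
cells = (np.where((min_temp > 18) & (rel_hum < 66), 2.15, 1.0)
         * rng.lognormal(8, 0.3, n))       # hypothetical cyanobacteria counts

X = np.column_stack([min_temp, rel_hum, sunspots])
tree = DecisionTreeRegressor(max_depth=3).fit(X, cells)
print(export_text(tree, feature_names=["min_temp", "rel_hum", "sunspots"]))
```

The printed tree should recover splits near the 18 °C and 66% thresholds planted in the synthetic data, which is exactly the kind of interaction structure CART exposes.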

Relevance:

20.00%

Publisher:

Abstract:

The central aim for the research undertaken in this PhD thesis is the development of a model for simulating water droplet movement on a leaf surface and to compare the model behavior with experimental observations. A series of five papers has been presented to explain systematically the way in which this droplet modelling work has been realised. Knowing the path of the droplet on the leaf surface is important for understanding how a droplet of water, pesticide, or nutrient will be absorbed through the leaf surface. An important aspect of the research is the generation of a leaf surface representation that acts as the foundation of the droplet model. Initially a laser scanner is used to capture the surface characteristics for two types of leaves in the form of a large scattered data set. After the identification of the leaf surface boundary, a set of internal points is chosen over which a triangulation of the surface is constructed. We present a novel hybrid approach for leaf surface fitting on this triangulation that combines Clough-Tocher (CT) and radial basis function (RBF) methods to achieve a surface with a continuously turning normal. The accuracy of the hybrid technique is assessed using numerical experimentation. The hybrid CT-RBF method is shown to give good representations of Frangipani and Anthurium leaves. Such leaf models facilitate an understanding of plant development and permit the modelling of the interaction of plants with their environment. The motion of a droplet traversing this virtual leaf surface is affected by various forces including gravity, friction and resistance between the surface and the droplet. The innovation of our model is the use of thin-film theory in the context of droplet movement to determine the thickness of the droplet as it moves on the surface. Experimental verification shows that the droplet model captures reality quite well and produces realistic droplet motion on the leaf surface. Most importantly, we observed that the simulated droplet motion follows the contours of the surface and spreads as a thin film. In the future, the model may be applied to determine the path of a droplet of pesticide along a leaf surface before it falls from or comes to a standstill on the surface. It will also be used to study the paths of many droplets of water or pesticide moving and colliding on the surface.
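The two ingredients of the hybrid surface fit can be illustrated with SciPy's stock interpolators, which is a sketch of the components rather than the thesis's CT-RBF scheme itself: scattered leaf-surface heights are fitted with a Clough-Tocher piecewise-cubic interpolant over a Delaunay triangulation and with a smooth radial basis function interpolant. The height function below is synthetic.

```python
import numpy as np
from scipy.interpolate import CloughTocher2DInterpolator, RBFInterpolator

rng = np.random.default_rng(7)
pts = rng.uniform(-1.0, 1.0, (400, 2))              # scanned (x, y) samples
z = np.exp(-3.0 * (pts**2).sum(axis=1))             # hypothetical leaf height

ct = CloughTocher2DInterpolator(pts, z)             # C1 over the triangulation
rbf = RBFInterpolator(pts, z, kernel="thin_plate_spline")

query = np.array([[0.1, -0.2], [0.5, 0.5]])
print("Clough-Tocher:", ct(query))
print("RBF:          ", rbf(query))
```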

Relevance:

20.00%

Publisher:

Abstract:

Financial processes may possess long memory and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects the jumps in the sample paths. On the other hand, the presence of long memory, which contradicts the efficient market hypothesis, is still an issue for debate. These difficulties present challenges for memory detection and for modelling the co-presence of long memory and heavy tails. This PhD project aims to respond to these challenges. The first part aims to detect memory in a large number of financial time series on stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We will take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA) that can systematically eliminate trends of different orders. This method is based on the identification of scaling of the q-th-order moments and is a generalisation of the standard detrended fluctuation analysis (DFA), which uses only the second moment; that is, q = 2. We also consider the rescaled-range (R/S) analysis and the periodogram method to detect memory in financial time series and compare their results with the MF-DFA. An interesting finding is that short memory is detected for stock prices of the American Stock Exchange (AMEX) and long memory is found present in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series of the five states of Australia are also found to possess long memory. For these electricity price series, heavy tails are also pronounced in their probability densities. The second part of the thesis develops models to represent short-memory and long-memory financial processes as detected in Part I. These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory in the dynamics of the solution will result. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short-memory case. Parameter estimation of this type of model is performed via least squares, and the models are applied to the stock prices in the AMEX, which have been established in Part I to possess short memory. By selecting the kernel in the continuous-time AR(∞)-type equations to have the form of a Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. This type of equation is used to represent financial processes with long memory, whose dynamics are described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely via a continuous-time version of the Gauss-Whittle method. The models are applied to the exchange rates and the electricity prices of Part I with the aim of confirming their possible long-range dependence established by MF-DFA. The third part of the thesis provides an application of the results established in Parts I and II to characterise and classify financial markets. We will pay attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ) and the Toronto Stock Exchange (TSX).
The parameters from MF-DFA and those of the short-memory AR(∞)-type models will be employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in the two- and three-dimensional spaces of the data sets and then use cross-validation to verify discriminant accuracies. This classification is useful for understanding and predicting the behaviour of different processes within the same market. The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes which may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices. The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using the empirical densities and MF-DFA will be provided to estimate all the parameters of the model and to simulate sample paths of the equation. The method is then applied to analyse daily spot prices for five states of Australia. Comparisons with the results obtained from the R/S analysis, the periodogram method and MF-DFA are provided. The results from the fractional SDEs agree with those from MF-DFA, which are based on multifractal scaling, while those from the periodograms, which are based only on the second-order moment, seem to underestimate the long-memory dynamics of the process. This highlights the need for, and usefulness of, fractal methods in modelling non-Gaussian financial processes with long memory.
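The detrended fluctuation analysis underlying MF-DFA can be sketched compactly for the standard q = 2 case on synthetic data; MF-DFA repeats the same fluctuation computation over a range of q values and detrending orders.

```python
import numpy as np

def dfa(x, scales, order=1):
    """Return the fluctuation function F(s) for each window size s (q = 2)."""
    profile = np.cumsum(x - np.mean(x))            # integrated series
    F = []
    for s in scales:
        n = len(profile) // s
        segments = profile[: n * s].reshape(n, s)
        t = np.arange(s)
        residuals = []
        for seg in segments:                       # polynomial detrend per window
            coeff = np.polyfit(t, seg, order)
            residuals.append(np.mean((seg - np.polyval(coeff, t)) ** 2))
        F.append(np.sqrt(np.mean(residuals)))
    return np.asarray(F)

rng = np.random.default_rng(3)
x = rng.standard_normal(10_000)                    # white noise: exponent ~0.5
scales = np.unique(np.logspace(1, 3, 20).astype(int))
F = dfa(x, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(f"estimated scaling exponent: {alpha:.2f}")  # ~0.5 indicates no long memory
```

A scaling exponent near 0.5 indicates an uncorrelated or short-memory series, while values well above 0.5 indicate long-range dependence, which is the distinction drawn between the AMEX stock prices and the exchange rate and electricity price series above.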

Relevance:

20.00%

Publisher:

Abstract:

These National Guidelines and Case Studies for Digital Modelling are the outcomes from one of a number of Building Information Modelling (BIM)-related projects undertaken by the CRC for Construction Innovation. Since the CRC opened its doors in 2001, the industry has seen a rapid increase in interest in BIM, and widening adoption. These guidelines and case studies are thus very timely, as the industry moves to model-based working and starts to share models in a new context called integrated practice. Governments, both federal and state, and in New Zealand, are starting to outline the role they might take, so that, in contrast to the adoption of 2D CAD in the early 90s, we ensure that a national, industry-wide benefit results from this new paradigm of working. Section 1 of the guidelines gives an overview of BIM: how it affects our current mode of working, and what we need to do to move to fully collaborative model-based facility development. The role of open standards such as IFC is described as a mechanism to support new processes and to make the extensive design and construction information available to asset operators and managers. Digital collaboration modes, types of models, levels of detail, object properties and model management complete this section. It will be relevant for owners, managers and project leaders as well as direct users of BIM. Section 2 provides recommendations and guides for key areas of model creation and development, and the move to simulation and performance measurement. These are the more practical parts of the guidelines, developed for design professionals, BIM managers, technical staff and ‘in the field’ workers. The guidelines are supported by six case studies, including a summary of lessons learnt about implementing BIM in Australian building projects. A key aspect of these publications is the identification of a number of important industry actions: the need for BIM-compatible product information and a national context for classifying product data; the need for an industry agreement and setting process for process definition; and finally, the need to ensure a national standard for sharing data between all of the participants in the facility-development process.
