959 results for Robust Stochastic Optimization
Improved speech recognition using adaptive audio-visual fusion via a stochastic secondary classifier
Abstract:
I am suspicious of tools without a purpose - tools that are not developed in response to a clearly defined problem. Of course, tools without a purpose can still be useful. However, the development of first-generation CAD was seriously impeded because the solution came before the problem. We are in danger of repeating this mistake if we do not clarify the nature of the problem that we are trying to solve with the next generation of tools. Back in the 1980s I used to add a postscript slide at the end of CAD conference presentations, and the applause would invariably turn to concern. The slide simply asked: can anyone remember what it was about design that needed aiding before we had computer aided design?
Abstract:
This paper proposes a new approach for delay-dependent robust H-infinity stability analysis and control synthesis of uncertain systems with time-varying delay. The key features of the approach include the introduction of a new Lyapunov–Krasovskii functional, the construction of an augmented matrix with uncorrelated terms, and the employment of a tighter bounding technique. As a result, significant performance improvement is achieved in system analysis and synthesis without using either free weighting matrices or model transformation. Examples are given to demonstrate the effectiveness of the proposed approach.
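For orientation, delay-dependent results of this kind are typically derived from a Lyapunov-Krasovskii functional; a generic form for a system with state x(t) and time-varying delay 0 ≤ d(t) ≤ h is sketched below. This is only an illustrative template, not the paper's augmented functional, which differs in its specific terms.

```latex
V(x_t) = x^{\top}(t)\,P\,x(t)
       + \int_{t-d(t)}^{t} x^{\top}(s)\,Q\,x(s)\,\mathrm{d}s
       + \int_{-h}^{0}\!\int_{t+\theta}^{t} \dot{x}^{\top}(s)\,R\,\dot{x}(s)\,\mathrm{d}s\,\mathrm{d}\theta,
\qquad P, Q, R \succ 0.
```

Delay-dependent stability conditions follow by requiring the derivative of V along trajectories to be negative; how tightly the derivative of the double-integral term is bounded largely determines the conservatism of the resulting conditions, which is why the tighter bounding technique matters here.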
Abstract:
Reliable budget/cost estimates for road maintenance and rehabilitation are subject to uncertainties and variability in road asset condition and in the characteristics of road users. The CRC CI research project 2003-029-C ‘Maintenance Cost Prediction for Road’ developed a method for assessing variation and reliability in budget/cost estimates for road maintenance and rehabilitation. The method is based on probability-based reliability theory and statistical methods. The next stage of the current project is to apply the developed method to predict maintenance/rehabilitation budgets/costs of large networks for strategic investment. The first task is to assess the variability of road data. This report presents initial results of the analysis in assessing the variability of road data. A case study of the analysis for dry non-reactive soil is presented to demonstrate the concept of analysing the variability of road data for large road networks. In assessing the variability of road data, large road networks were grouped into categories with common characteristics according to soil and climatic conditions, pavement conditions, pavement types, surface types and annual average daily traffic. The probability distributions, statistical means and standard deviations of asset conditions and annual average daily traffic for each category were quantified. The probability distributions and the statistical information obtained in this analysis will be used to assess the variation and reliability in budget/cost estimates at a later stage. Conventionally, mean values of the asset data in each category are used as input values for investment analysis, and the variability of asset data within each category is not taken into account. This analysis demonstrated that the method can be used in practical applications that take the variability of road data into account when analysing large road networks for maintenance/rehabilitation investment analysis.
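As a rough illustration of the categorisation step described above (not the project's actual tooling), the Python sketch below groups hypothetical road-segment records by shared characteristics and quantifies within-category variability; the file name and all column names are assumptions.

```python
import pandas as pd

# Hypothetical segment records: one row per road segment, with the
# categorisation attributes and the condition/traffic measures named
# in the report (column names are assumed for illustration).
segments = pd.read_csv("network_segments.csv")

keys = ["soil_type", "climate", "pavement_type", "surface_type"]

# Within each category, quantify the variability of asset condition
# and annual average daily traffic rather than keeping only the mean.
summary = segments.groupby(keys)[["condition_index", "aadt"]].agg(
    ["mean", "std", "count"]
)
print(summary)
```

Feeding the per-category distributions, rather than the means alone, into the investment analysis is what allows the method to account for the variability the abstract highlights.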
Abstract:
This paper presents a multi-objective optimization strategy for heavy truck suspension systems based on modified skyhook damping (MSD) control, which improves ride comfort and road-friendliness simultaneously. A four-axle heavy truck-road coupling system model was established using functional virtual prototype technology; the model was then validated through a ride comfort test. As the mechanical properties and time lag of the dampers were taken into account, MSD control of active and semi-active dampers was implemented using Matlab/Simulink. Through co-simulations with Adams and Matlab, the effects of passive, semi-active MSD control, and active MSD control were analyzed and compared; thus, the control parameters which afforded the best integrated performance were chosen. Simulation results indicated that MSD control improves a truck's ride comfort and road-friendliness, while the semi-active MSD control damper achieves road-friendliness comparable to that of the active MSD control damper.
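For context, the classical two-state skyhook law on which MSD control builds can be sketched in a few lines of Python. This is the textbook on-off logic only, not the paper's modified MSD controller (which additionally accounts for damper mechanical properties and time lag); all names and coefficients are illustrative.

```python
def skyhook_damper_force(v_body, v_rel, c_sky=8000.0, c_min=300.0):
    """Two-state semi-active skyhook law (illustrative values, N*s/m).

    v_body : absolute vertical velocity of the sprung mass (m/s)
    v_rel  : relative velocity across the damper, body minus axle (m/s)
    """
    if v_body * v_rel > 0:
        # The damper can generate a force opposing body motion, so
        # emulate an ideal damper "hooked to the sky".
        return -c_sky * v_body
    # Otherwise the damper cannot track the skyhook force; apply
    # only light passive damping.
    return -c_min * v_rel
```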
Abstract:
Insufficient availability of osteogenic cells limits bone regeneration through cell-based therapies. This study investigated the potential of amniotic fluid-derived stem (AFS) cells to synthesize mineralized extracellular matrix within porous medical-grade poly-ε-caprolactone (mPCL) scaffolds. The AFS cells were initially differentiated in two-dimensional (2D) culture to determine appropriate osteogenic culture conditions and verify physiologic mineral production by the AFS cells. The AFS cells were then cultured on 3D mPCL scaffolds (6-mm diameter × 9-mm height) and analyzed for their ability to differentiate to osteoblastic cells in this environment. The amount and distribution of mineralized matrix production was quantified throughout the mPCL scaffold using nondestructive micro-computed tomography (microCT) analysis and confirmed through biochemical assays. Sterile microCT scanning provided longitudinal analysis of long-term cultured mPCL constructs to determine the rate and distribution of mineralized matrix within the scaffolds. The AFS cells deposited mineralized matrix throughout the mPCL scaffolds and remained viable after 15 weeks of 3D culture. The effect of predifferentiation of the AFS cells on subsequent bone formation in vivo was determined in a rat subcutaneous model. Cells that were predifferentiated for 28 days in vitro produced seven times more mineralized matrix when implanted subcutaneously in vivo. This study demonstrated the potential of AFS cells to produce 3D mineralized bioengineered constructs in vitro and in vivo and suggests that AFS cells may be an effective cell source for functional repair of large bone defects.
Abstract:
In sport and exercise biomechanics, forward dynamics analyses or simulations have frequently been used in attempts to establish optimal techniques for performance of a wide range of motor activities. However, the accuracy and validity of these simulations are largely dependent on the complexity of the mathematical model used to represent the neuromusculoskeletal system. It could be argued that complex mathematical models are superior to simple mathematical models, as they enable basic mechanical insights to be made and individual-specific optimal movement solutions to be identified. Contrary to some claims in the literature, however, we suggest that it is currently not possible to identify the complete optimal solution for a given motor activity. For a complete optimization of human motion, dynamical systems theory implies that mathematical models must incorporate a much wider range of organismic, environmental and task constraints. These ideas encapsulate why sports medicine specialists need to adopt more individualized clinical assessment procedures when interpreting why performers' movement patterns may differ.
Abstract:
Traditionally, the acquisition of skills and sport movements has been characterised by numerous repetitions of a presumed model movement pattern to be acquired by learners. This approach has been questioned by research identifying the presence of individualised movement patterns and the low probability of occurrence of two identical movements within and between individuals. In contrast, the differential learning approach claims an advantage for inducing variability in the learning process by adding stochastic perturbations during practice. These ideas are exemplified by data from a high jump experiment which compared the effectiveness of a classical and a differential training approach with a pre-post test design. Results showed clear advantages for the group with additional stochastic perturbation during the acquisition phase in comparison to classically trained athletes. Analogies to similar phenomenological effects in the neurobiological literature are discussed.
Abstract:
In this study, the authors propose a novel video stabilisation algorithm for mobile platforms with moving objects in the scene. The quality of videos obtained from mobile platforms, such as unmanned airborne vehicles, suffers from jitter caused by several factors. In order to remove this undesired jitter, accurate estimation of the global motion is essential. However, it is difficult to estimate global motion accurately from mobile platforms due to increased estimation errors and noise. Additionally, large moving objects in the video scenes contribute to the estimation errors. Currently, only very few motion estimation algorithms have been developed for video scenes collected from mobile platforms, and this paper shows that these algorithms fail when there are large moving objects in the scene. In this study, a theoretical proof is provided which demonstrates that the use of delta optical flow can improve the robustness of video stabilisation in the presence of large moving objects in the scene. The authors also propose the use of sorted arrays of local motions and the selection of feature points to separate outliers from inliers. The proposed algorithm is tested on six video sequences, collected from one fixed platform, four mobile platforms and one synthetic video, of which three contain large moving objects. Experiments show that the proposed algorithm performs well on all of these video sequences.
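As a simplified illustration of the outlier-rejection idea (not the authors' algorithm), the Python/OpenCV sketch below tracks sparse feature points between frames and takes the median of the local motions: once the per-component motions are sorted, vectors contributed by a large moving object fall in the tails and are discarded.

```python
import cv2
import numpy as np

def robust_global_translation(prev_gray, curr_gray):
    """Estimate global inter-frame translation from sparse feature tracks,
    suppressing local motions caused by large moving objects."""
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=10)
    nxt, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                 pts, None)
    ok = status.ravel() == 1
    flow = (nxt[ok] - pts[ok]).reshape(-1, 2)  # local motion vectors
    # The median of the sorted motion components rejects the tails,
    # where outliers from moving objects concentrate.
    return np.median(flow, axis=0)
```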
Abstract:
The ICU is an integral part of any hospital and is under great load from patient arrivals as well as resource limitations. Scheduling of patients in the ICU is complicated by the two general types of admission: elective surgery and emergency arrivals. This complicated situation is handled by creating a tentative initial schedule and then reacting to uncertain arrivals as they occur. For most hospitals there is little or no flexibility in the number of beds that are available for use now or in the future. We propose an integer programming model to handle a parallel-machine reactive scheduling system for scheduled and unscheduled arrivals.
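As a toy illustration of a parallel-machine admission model (not the authors' formulation), the Python/PuLP sketch below schedules admissions so that a fixed pool of identical ICU beds is never over-booked; all data and names are hypothetical.

```python
import pulp

patients = ["elective_1", "elective_2", "emergency_1"]
days = range(7)
n_beds = 3
stay = {"elective_1": 2, "elective_2": 3, "emergency_1": 1}  # days in ICU

model = pulp.LpProblem("icu_schedule", pulp.LpMinimize)
# x[p][d] = 1 if patient p is admitted on day d
x = pulp.LpVariable.dicts("admit", (patients, days), cat="Binary")

# Each patient is admitted exactly once.
for p in patients:
    model += pulp.lpSum(x[p][d] for d in days) == 1

# Bed capacity: patients in the ICU on day t must fit the parallel beds.
for t in days:
    model += pulp.lpSum(x[p][d] for p in patients for d in days
                        if d <= t < d + stay[p]) <= n_beds

# Stand-in objective: admit everyone as early as possible.
model += pulp.lpSum(d * x[p][d] for p in patients for d in days)
model.solve()
```

Reactive scheduling would re-solve a model of this kind, with elective admissions fixed or penalised for moving, each time an unscheduled emergency arrival occurs.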
Abstract:
Surveillance for invasive non-indigenous species (NIS) is an integral part of a quarantine system. Estimating the efficiency of a surveillance strategy relies on many uncertain parameters estimated by experts, such as the efficiency of its components in the face of the specific NIS, the ability of the NIS to inhabit different environments, and so on. Due to the importance of detecting an invasive NIS within a critical period of time, it is crucial that these uncertainties be accounted for in the design of the surveillance system. We formulate a detection model that takes into account, in addition to structured sampling for incursive NIS, incidental detection by untrained workers. We use info-gap theory for satisficing (not maximizing) the probability of detection, while at the same time maximizing the robustness to uncertainty. We demonstrate the trade-off between robustness to uncertainty and an increase in the required probability of detection. An empirical example based on the detection of Pheidole megacephala on Barrow Island demonstrates the use of info-gap analysis to select a surveillance strategy.
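For orientation, the info-gap robustness function used for this kind of satisficing analysis can be written generically as follows (our notation, not necessarily the paper's):

```latex
\hat{\alpha}(q, P_c) \;=\; \max\Big\{ \alpha \ge 0 \;:\;
    \min_{u \,\in\, \mathcal{U}(\alpha,\tilde{u})} P_d(q, u) \;\ge\; P_c \Big\},
```

where q is the surveillance design, \tilde{u} collects the experts' nominal parameter estimates, \mathcal{U}(\alpha,\tilde{u}) is the set of parameter values within uncertainty horizon \alpha of the nominal, P_d is the probability of detection and P_c is the required (satisficed) detection level. Raising P_c shrinks the robustness \hat{\alpha}, which is precisely the trade-off the abstract describes.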
Abstract:
Financial processes may possess long memory and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects the jumps in the sample paths. On the other hand, the presence of long memory, which contradicts the efficient market hypothesis, is still an issue for further debate. These difficulties present challenges for memory detection and for modelling the co-presence of long memory and heavy tails. This PhD project aims to respond to these challenges. The first part aims to detect memory in a large number of financial time series on stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We will take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA) that can systematically eliminate trends of different orders. This method is based on the identification of the scaling of the qth-order moments and is a generalisation of the standard detrended fluctuation analysis (DFA), which uses only the second moment, that is, q = 2 (the scaling relation is sketched after this abstract). We also consider the rescaled range (R/S) analysis and the periodogram method to detect memory in financial time series and compare their results with those of MF-DFA. An interesting finding is that short memory is detected for stock prices of the American Stock Exchange (AMEX) and long memory is found in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series of the five states of Australia are also found to possess long memory. For these electricity price series, heavy tails are also pronounced in their probability densities.

The second part of the thesis develops models to represent the short-memory and long-memory financial processes detected in Part I. These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory in the dynamics of the solution will result. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short-memory case. Parameter estimation for this type of model is performed via least squares, and the models are applied to the stock prices in the AMEX, which were established in Part I to possess short memory. By selecting the kernel in the continuous-time AR(∞)-type equations to have the form of the Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. This type of equation is used to represent financial processes with long memory, whose dynamics is described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely via a continuous-time version of the Gauss-Whittle method. The models are applied to the exchange rates and the electricity prices of Part I with the aim of confirming their possible long-range dependence established by MF-DFA.

The third part of the thesis provides an application of the results established in Parts I and II to characterise and classify financial markets. We will pay attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ) and the Toronto Stock Exchange (TSX). The parameters from MF-DFA and those of the short-memory AR(∞)-type models will be employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in the two- and three-dimensional spaces of the data sets and then use cross-validation to verify discriminant accuracy. This classification is useful for understanding and predicting the behaviour of different processes within the same market.

The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes which may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices (a generic form is sketched after this abstract). The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using the empirical densities and MF-DFA will be provided to estimate all the parameters of the model and to simulate sample paths of the equation. The method is then applied to analyse daily spot prices for the five states of Australia. Comparisons with the results obtained from R/S analysis, the periodogram method and MF-DFA are provided. The results from the fractional SDEs agree with those from MF-DFA, which are based on multifractal scaling, while those from the periodograms, which are based on the second moment, seem to underestimate the long-memory dynamics of the process. This highlights the need for, and the usefulness of, fractal methods in modelling non-Gaussian financial processes with long memory.
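The MF-DFA scaling relation referred to in the first part can be summarised as follows (standard notation from the MF-DFA literature):

```latex
F_q(s) \;=\; \left\{ \frac{1}{2N_s} \sum_{\nu=1}^{2N_s}
    \big[ F^2(\nu, s) \big]^{q/2} \right\}^{1/q} \;\sim\; s^{h(q)},
```

where F^2(\nu, s) is the mean squared detrended fluctuation in segment \nu at scale s, N_s is the number of segments, and h(q) is the generalised Hurst exponent. Setting q = 2 recovers standard DFA, and h(2) > 0.5 indicates long memory.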
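As a generic illustration of the model class in the fourth part (the thesis's exact equation may differ), a fractional stochastic differential equation driven by stable noise can take the form:

```latex
D_t^{\beta} X(t) \;=\; -\lambda\, X(t) \;+\; \sigma\, \dot{L}_{\alpha}(t),
\qquad 0 < \beta < 1,\quad 0 < \alpha \le 2,
```

where D_t^{\beta} is the Riemann-Liouville fractional derivative that governs the memory of the process, and L_{\alpha}(t) is an \alpha-stable Lévy process whose heavy-tailed increments capture the non-Gaussian spikes; \alpha = 2 recovers the Brownian-motion-driven equations of the second part.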