Abstract:
In Australia, railway systems play a vital role in transporting the sugarcane crop from farms to mills. The sugarcane transport system is very complex and uses daily schedules, consisting of a set of locomotive runs, to satisfy the requirements of the mill and harvesters. The total cost of sugarcane transport operations is very high; over 35% of the total cost of sugarcane production in Australia is incurred in cane transport. Efficient schedules for sugarcane transport can reduce the cost and limit the negative effects that this system can have on the raw sugar production system. There are several benefits to formulating the train scheduling problem as a blocking parallel-machine job shop scheduling (BPMJSS) problem, namely: to prevent two trains passing in one section at the same time; to keep the train activities (operations) in sequence during each run (trip) by applying precedence constraints; to pass the trains through one section in the correct order (priorities of passing trains) by applying disjunctive constraints; and to ease passing trains by resolving rail conflicts through blocking constraints and parallel machine scheduling. The sugarcane rail operations are therefore formulated as a BPMJSS problem. Mixed integer programming and constraint programming approaches are used to describe the BPMJSS problem, and the model is solved by integrating constraint programming, mixed integer programming and search techniques. Optimality performance is tested with the Optimization Programming Language (OPL) and CPLEX software on small and large instances based on specific criteria, and a real-life problem is used to verify and validate the approach. Constructive heuristics and new metaheuristics, including simulated annealing and tabu search, are proposed to solve this complex, NP-hard scheduling problem and produce a more efficient scheduling system. Innovative hybrid and hyper metaheuristic techniques are developed and coded in C# to improve solution quality and CPU time. Hybrid techniques integrate heuristic and metaheuristic techniques consecutively, while hyper techniques are a complete integration of different metaheuristic techniques, heuristic techniques, or both.
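As a rough illustration of the metaheuristic component, the following is a minimal simulated annealing sketch over a permutation of locomotive runs, written in Python rather than the C# of the thesis; the cost and neighbourhood functions are hypothetical placeholders, not the thesis's actual operators.

    import math
    import random

    def simulated_annealing(schedule, cost, neighbour, t0=100.0, cooling=0.995, steps=20000):
        # Generic simulated annealing over a sequence of locomotive runs.
        current, best = schedule[:], schedule[:]
        c_cur, c_best = cost(current), cost(best)
        t = t0
        for _ in range(steps):
            cand = neighbour(current)
            c_cand = cost(cand)
            # Always accept improvements; accept worse moves with Boltzmann probability.
            if c_cand < c_cur or random.random() < math.exp((c_cur - c_cand) / t):
                current, c_cur = cand, c_cand
                if c_cur < c_best:
                    best, c_best = current[:], c_cur
            t *= cooling  # geometric cooling schedule
        return best, c_best

    def swap_neighbour(seq):
        # Hypothetical neighbourhood move: swap two randomly chosen runs.
        i, j = random.sample(range(len(seq)), 2)
        out = seq[:]
        out[i], out[j] = out[j], out[i]
        return out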
Abstract:
In this paper, the goal of identifying disease subgroups based on differences in observed symptom profiles is considered. Commonly referred to as phenotype identification, solutions to this task often involve the application of unsupervised clustering techniques. In this paper, we investigate the application of a Dirichlet Process mixture (DPM) model for this task. This model is defined by placing a Dirichlet Process (DP) prior on the unknown components of a mixture model, allowing for the expression of uncertainty about the partitioning of observed data into homogeneous subgroups. To exemplify this approach, an application to phenotype identification in Parkinson’s disease (PD) is considered, with symptom profiles collected using the Unified Parkinson’s Disease Rating Scale (UPDRS). Keywords: clustering, Dirichlet Process mixture, Parkinson’s disease, UPDRS.
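As a hedged sketch of this kind of analysis: scikit-learn's BayesianGaussianMixture can fit a truncated Dirichlet Process mixture by variational inference, which approximates (but is not identical to) the DPM inference used in the paper; the UPDRS data matrix below is a random placeholder.

    import numpy as np
    from sklearn.mixture import BayesianGaussianMixture

    # Hypothetical UPDRS symptom-profile matrix: one row per patient.
    X = np.random.rand(200, 8)  # placeholder data

    # Truncated Dirichlet Process mixture fitted by variational inference;
    # surplus components are shrunk away rather than fixed in advance.
    dpm = BayesianGaussianMixture(
        n_components=20,  # truncation level, an upper bound on clusters
        weight_concentration_prior_type="dirichlet_process",
        random_state=0,
    )
    labels = dpm.fit_predict(X)
    print(np.unique(labels))  # phenotype subgroups actually used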
Abstract:
Quality-oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers’ expectations by supplying reliable, good-quality products and services is the key factor for an organization, and even for government. During recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and the continuous improvement philosophy, and they are now applied widely to improve the quality of products and services in the industrial and business sectors. Recently, SQC tools, in particular quality control charts, have been used in healthcare surveillance. In some cases, these tools have been modified and developed to better suit the characteristics and needs of the health sector. Some of the work in the healthcare area appears to have evolved independently of the development of industrial statistical process control methods. Analysing and comparing paradigms and the characteristics of quality control charts and techniques across the different sectors therefore presents opportunities for transferring knowledge and for future development in each sector. Meanwhile, the capabilities of the Bayesian approach, particularly Bayesian hierarchical models and computational techniques in which all uncertainty is expressed as a structure of probability, facilitate decision making and cost-effectiveness analyses. This research therefore investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two respects. A framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in available datasets and data flow; then a data-capturing algorithm using Bayesian decision-making methods is developed to determine an economical sample size for statistical analyses within the quality improvement cycle. Having ensured clinical data quality, some characteristics of control charts in the health context, including the necessity of monitoring attribute data and correlated quality characteristics, are considered. To this end, multivariate control charts from an industrial context are adapted to monitor radiation delivered to patients undergoing diagnostic coronary angiograms, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time. Meanwhile, adoption of a Bayesian approach is proposed as a new framework for estimating the change point following a control chart’s signal. This estimate aims to facilitate root-cause analysis in the quality improvement cycle, since it narrows the search for the potential causes of detected changes to a tighter time-frame prior to the signal. This approach yields highly informative estimates of the change point parameters, since the results are based on full probability distributions. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and magnitude of various change scenarios, including step changes, linear trends and multiple changes in a Poisson process, are developed and investigated.
The benefits of change point investigation are revisited and promoted in monitoring hospital outcomes, where the developed Bayesian estimator reports the true time of shifts, compared with a priori known causes, detected by control charts monitoring the rate of excess usage of blood products and major adverse events during and after cardiac surgery in a local hospital. The development of the Bayesian change point estimators is then extended to healthcare surveillance of processes in which pre-intervention characteristics of patients affect the outcomes. In this setting, the Bayesian estimator is first extended to capture the patient mix (covariates) through the risk models underlying risk-adjusted control charts. Variations of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. Secondly, the Bayesian estimator is extended to identify the time of a shift in mean survival time after a clinical intervention that is being monitored by risk-adjusted survival time control charts. In this context, the survival time after a clinical intervention is also affected by patient mix, and the survival function is constructed using a survival prediction model. The simulation studies undertaken in each research component, and the results obtained, strongly recommend the developed Bayesian estimators as an alternative for change point estimation within the quality improvement cycle in healthcare surveillance as well as in industrial and business contexts. The superiority of the proposed Bayesian framework and estimators is enhanced when the probability quantification, flexibility and generalizability of the developed models are also considered. The advantages of the Bayesian approach seen in the health context may also extend to the industrial and business domains where quality monitoring was initially developed.
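As a simplified illustration of Bayesian change point estimation in a Poisson process (a grid posterior with known pre- and post-change rates and a uniform prior on the change time, rather than the hierarchical MCMC treatment developed in the thesis):

    import numpy as np
    from scipy.stats import poisson

    def change_point_posterior(y, lam0, lam1):
        # Posterior over the step-change time tau in a Poisson count series,
        # assuming known pre- and post-change rates and a uniform prior on tau.
        n = len(y)
        log_post = np.empty(n - 1)
        for tau in range(1, n):
            log_post[tau - 1] = (poisson.logpmf(y[:tau], lam0).sum()
                                 + poisson.logpmf(y[tau:], lam1).sum())
        log_post -= log_post.max()  # stabilise before exponentiating
        post = np.exp(log_post)
        return post / post.sum()

    # Simulated counts with a step change at t = 30.
    rng = np.random.default_rng(1)
    y = np.concatenate([rng.poisson(4.0, 30), rng.poisson(7.0, 20)])
    post = change_point_posterior(y, 4.0, 7.0)
    print(1 + post.argmax())  # MAP estimate of the change point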
Abstract:
A vertex-centred finite volume method (FVM) for the Cahn-Hilliard (CH) and recently proposed Cahn-Hilliard-reaction (CHR) equations is presented. Information at control volume faces is computed using a high-order least-squares approach based on Taylor series approximations. This least-squares problem explicitly includes the variational boundary condition (VBC), ensuring that the discrete equations satisfy all of the boundary conditions. We use this approach to solve the CH and CHR equations in one and two dimensions and show that our scheme satisfies the VBC to at least second order. For the CH equation we show evidence of conservative, gradient-stable solutions; for the CHR equation, however, strict gradient stability is more challenging to achieve.
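For reference, a standard form of the CH equation (notation assumed here, not taken from the paper) is

    \frac{\partial c}{\partial t} = \nabla \cdot \left( M \nabla \mu \right), \qquad \mu = f'(c) - \gamma \nabla^2 c,

with the no-flux condition M \nabla \mu \cdot \mathbf{n} = 0 and the variational boundary condition \gamma \nabla c \cdot \mathbf{n} = 0 on the boundary.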
Abstract:
We examine the solution of the two-dimensional Cahn-Hilliard-reaction (CHR) equation in the xy plane as a model of Li+ intercalation into LiFePO4 material. We validate our numerical solution against the solution of the depth-averaged equation, which has been used to model intercalation in the limit of highly orthotropic diffusivity and gradient penalty tensors. We then examine the phase-change behaviour in the full CHR system as these parameters become more isotropic, and find that as the Li+ diffusivity is increased in the x direction, phase separation persists at high currents, even in small crystals with averaged coherency strain included. The resulting voltage curves decrease monotonically, which has previously been considered a hallmark of crystals that fill homogeneously.
Abstract:
Objective: To assess the relationship between Bayesian MUNE and histological motor neuron counts in wild-type mice and in an animal model of ALS. Methods: We performed Bayesian MUNE paired with histological counts of motor neurons in the lumbar spinal cord of wild-type mice and transgenic SOD1 G93A mice that show progressive weakness over time. We evaluated the number of acetylcholine endplates that were innervated by a presynaptic nerve. Results: In wild-type mice, the motor unit number in the gastrocnemius muscle estimated by Bayesian MUNE was approximately half the number of motor neurons in the region of the spinal cord that contains the cell bodies of the motor neurons supplying the hindlimb crural flexor muscles. In SOD1 G93A mice, motor neuron numbers declined over time. This was associated with motor endplate denervation at the end-stage of disease. Conclusion: The number of motor neurons in the spinal cord of wild-type mice is proportional to the number of motor units estimated by Bayesian MUNE. In SOD1 G93A mice, there is a lower number of estimated motor units compared to the number of spinal cord motor neurons at the end-stage of disease, and this is associated with disruption of the neuromuscular junction. Significance: Our finding that the Bayesian MUNE method gives estimates of motor unit numbers that are proportional to the numbers of motor neurons in the spinal cord supports the clinical use of Bayesian MUNE in monitoring motor unit loss in ALS patients. © 2012 International Federation of Clinical Neurophysiology.
Abstract:
Objective: To use our Bayesian method of motor unit number estimation (MUNE) to evaluate lower motor neuron degeneration in ALS. Methods: In subjects with ALS we performed serial MUNE studies. We examined the repeatability of the test and then determined whether the loss of MUs was better fitted by an exponential or a Weibull distribution. Results: The decline in motor unit (MU) numbers was well fitted by an exponential decay curve. We calculated the half-life of MUs in the abductor digiti minimi (ADM), abductor pollicis brevis (APB) and/or extensor digitorum brevis (EDB) muscles. The mean half-life of the MUs of the ADM muscle was greater than those of the APB or EDB muscles. The half-life of MUs was shorter in the ADM muscle of subjects with upper limb onset than in those with lower limb onset. Conclusions: The rate of loss of lower motor neurons in ALS is exponential, the motor units of the APB decay more quickly than those of the ADM muscle, and the rate of loss of motor units is greater at the site of onset of disease. Significance: This shows that the Bayesian MUNE method is useful in following the course and exploring the clinical features of ALS. © 2012 International Federation of Clinical Neurophysiology.
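For reference, the exponential decay fit implies the standard relation between the decay rate and the half-life (symbols assumed):

    N(t) = N_0 e^{-\lambda t}, \qquad t_{1/2} = \frac{\ln 2}{\lambda},

where N_0 is the initial motor unit count and \lambda the estimated decay rate.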
Abstract:
Characterization of mass transfer properties was achieved in the longitudinal, radial, and tangential directions for four Australian hardwood species: spotted gum, blackbutt, jarrah, and messmate. Measurement of mass transfer properties for these species was necessary to complement current vacuum drying modeling research. Water-vapour diffusivity was determined in steady state using a specific vapometer. Permeability was determined using a specialized device developed to measure over a wide range of permeability values. Permeability values of some species and material directions were extremely low and undetectable by the mass flow meter device. Hence, a custom system based on volume evolution was conceived to determine very low, previously unpublished, wood permeability values. Mass diffusivity and permeability were lowest for spotted gum and highest for messmate. Except for messmate in the radial direction, the four species measured were less permeable in all directions than the lowest published figures, demonstrating the high impermeability of Australian hardwoods and partly accounting for their relatively slow drying rates. The permeability, water-vapour diffusivity, and associated anisotropic ratio data obtained for messmate were extreme or did not follow typical trends; messmate is consequently the most difficult of the four woods to dry in terms of collapse and checking degradation. © The State of Queensland, Department of Agriculture, Fisheries and Forestry, 2012.
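For context, steady-state permeability measurements of this kind are conventionally reduced via Darcy's law (standard form; symbols assumed, not taken from the paper):

    k = \frac{Q \mu L}{A \, \Delta P},

where Q is the volumetric flow rate, \mu the fluid viscosity, L the sample length, A the cross-sectional area and \Delta P the pressure drop across the sample.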
Abstract:
The Balanced method was introduced as a class of quasi-implicit methods, based upon the Euler-Maruyama scheme, for solving stiff stochastic differential equations. We extend the Balanced method to introduce a class of stable numerical schemes of strong order 1.0 for solving stochastic ordinary differential equations, and we derive convergence results for this class of schemes. The asymptotic stability of the class is illustrated and compared with contemporary schemes of strong order 1.0. We present some evidence on parameter selection with respect to minimising the error convergence terms. Furthermore, we provide a convergence result for general Balanced-style schemes of higher orders.
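For illustration, a minimal Python sketch of the underlying Balanced (quasi-implicit) Euler-Maruyama scheme for a scalar SDE follows; the strong order 1.0 extensions developed in the paper add further correction terms not shown here, and the control functions c0 and c1 below are example choices, not the paper's.

    import numpy as np

    def balanced_em(a, b, c0, c1, x0, T, n, rng):
        # Balanced (quasi-implicit) Euler-Maruyama for the scalar SDE
        # dX = a(X) dt + b(X) dW, after Milstein, Platen and Schurz (1998).
        h = T / n
        x = np.empty(n + 1)
        x[0] = x0
        for k in range(n):
            dw = rng.normal(0.0, np.sqrt(h))
            # Nonnegative control term C damps the step for stiff problems.
            c = c0(x[k]) * h + c1(x[k]) * abs(dw)
            x[k + 1] = (x[k] * (1.0 + c) + a(x[k]) * h + b(x[k]) * dw) / (1.0 + c)
        return x

    # Example: a stiff linear test problem with multiplicative noise.
    rng = np.random.default_rng(0)
    path = balanced_em(a=lambda x: -50.0 * x, b=lambda x: 0.5 * x,
                       c0=lambda x: 50.0, c1=lambda x: 0.5,
                       x0=1.0, T=1.0, n=1000, rng=rng)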
Abstract:
In this paper we present a methodology for designing experiments for efficiently estimating the parameters of models with computationally intractable likelihoods. The approach combines a commonly used methodology for robust experimental design, based on Markov chain Monte Carlo sampling, with approximate Bayesian computation (ABC) to ensure that no likelihood evaluations are required. The utility function considered for precise parameter estimation is based upon the precision of the ABC posterior distribution, which we form efficiently via the ABC rejection algorithm based on pre-computed model simulations. Our focus is on stochastic models and, in particular, we investigate the methodology for Markov process models of epidemics and macroparasite population evolution. The macroparasite example involves a multivariate process and we assess the loss of information from not observing all variables.
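A minimal sketch of the ABC rejection kernel at the heart of this approach follows (toy Poisson example; the paper embeds such a kernel, with pre-computed model simulations, inside an MCMC search over designs):

    import numpy as np

    def abc_rejection(observed, simulate, prior_sample, distance, n_sims, eps):
        # ABC rejection: keep prior draws whose simulated summaries
        # fall within eps of the observed summaries.
        accepted = []
        for _ in range(n_sims):
            theta = prior_sample()
            y = simulate(theta)
            if distance(y, observed) < eps:
                accepted.append(theta)
        return np.array(accepted)

    # Toy example: infer the rate of a Poisson count, a stand-in for an
    # epidemic-model summary statistic.
    rng = np.random.default_rng(2)
    obs = rng.poisson(3.0, 50).mean()
    post = abc_rejection(
        observed=obs,
        simulate=lambda t: rng.poisson(t, 50).mean(),
        prior_sample=lambda: rng.uniform(0.0, 10.0),
        distance=lambda a, b: abs(a - b),
        n_sims=20000, eps=0.1,
    )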
Abstract:
A new dual-scale modelling approach is presented for simulating the drying of a wet hygroscopic porous material that couples the porous medium (macroscale) with the underlying pore structure (microscale). The proposed model is applied to the convective drying of wood at low temperatures and is valid in the so-called hygroscopic range, where hygroscopically held liquid water is present in the solid phase and water exists only as vapour in the pores. Coupling between scales is achieved by imposing the macroscopic gradients of moisture content and temperature on the microscopic field using suitably defined periodic boundary conditions, which allows the macroscopic mass and thermal fluxes to be defined as averages of the microscopic fluxes over the unit cell. This novel formulation accounts for the intricate coupling of heat and mass transfer at the microscopic scale but reduces to a classical homogenisation approach if a linear relationship is assumed between the microscopic gradient and flux. Simulation results for a sample of spruce wood highlight the potential and flexibility of the new dual-scale approach. In particular, for a given unit cell configuration it is not necessary to propose the form of the macroscopic fluxes prior to the simulations, because these are determined as a direct result of the dual-scale formulation.
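In outline, the coupling computes each macroscopic flux as a volume average of the corresponding microscopic flux over the periodic unit cell V (notation assumed):

    \mathbf{J}_{\text{macro}} = \frac{1}{|V|} \int_{V} \mathbf{j}_{\text{micro}} \, dV.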
Abstract:
A Jacobian-free variable-stepsize method is developed for the numerical integration of the large, stiff systems of differential equations encountered when simulating transport in heterogeneous porous media. Our method utilises the exponential Rosenbrock-Euler method, which is explicit in nature and requires a matrix-vector product involving the exponential of the Jacobian matrix at each step of the integration process. These products can be approximated using Krylov subspace methods, which permit a large integration stepsize to be utilised without having to precondition the iterations. This means that our method is truly "Jacobian-free": the Jacobian need never be formed or factored during the simulation. We assess the performance of the new algorithm by simulating the drying of softwood. Numerical experiments conducted for both low and high temperature drying demonstrate that the new approach outperforms (in terms of accuracy and efficiency) existing simulation codes that utilise the backward Euler method via a preconditioned Newton-Krylov strategy.
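A minimal Python sketch of one exponential Rosenbrock-Euler step follows, computing \varphi_1(hJ) g(u) via the standard augmented-matrix identity; SciPy's expm_multiply is used here as a stand-in for the Krylov subspace approximation described in the paper.

    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import expm_multiply

    def phi1_matvec(J, v, h):
        # phi_1(hJ) v with phi_1(z) = (e^z - 1)/z, via the augmented-matrix
        # identity: exp([[hJ, v], [0, 0]]) applied to the last unit vector
        # yields [phi_1(hJ) v; 1].
        m = J.shape[0]
        aug = sp.bmat([[h * J, sp.csr_matrix(v.reshape(-1, 1))],
                       [sp.csr_matrix((1, m)), sp.csr_matrix((1, 1))]],
                      format="csc")
        e = np.zeros(m + 1)
        e[-1] = 1.0
        return expm_multiply(aug, e)[:m]

    def rosenbrock_euler_step(u, g, jac, h):
        # One exponential Rosenbrock-Euler step: u + h * phi_1(h J(u)) g(u).
        return u + h * phi1_matvec(jac(u), g(u), h)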
Abstract:
This paper studies time integration methods for large stiff systems of ordinary differential equations (ODEs) of the form u'(t) = g(u(t)). For such problems, implicit methods generally outperform explicit methods, since the time step is usually less restricted by stability constraints. Recently, however, explicit so-called exponential integrators have become popular for stiff problems due to their favourable stability properties. These methods use matrix-vector products involving exponential-like functions of the Jacobian matrix, which can be approximated using Krylov subspace methods that require only matrix-vector products with the Jacobian. In this paper, we implement exponential integrators of second, third and fourth order and demonstrate that they are competitive with well-established approaches based on the backward differentiation formulas and a preconditioned Newton-Krylov solution strategy.
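For reference, the exponential-like functions referred to above are the \varphi-functions, defined by the standard recurrence (notation assumed):

    \varphi_0(z) = e^z, \qquad \varphi_{k+1}(z) = \frac{\varphi_k(z) - \varphi_k(0)}{z}, \qquad \varphi_k(0) = \frac{1}{k!},

so that, for example, \varphi_1(z) = (e^z - 1)/z.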
Abstract:
Motor unit number estimation (MUNE) is a method which aims to provide a quantitative indicator of progression of diseases that lead to loss of motor units, such as motor neurone disease. However, the development of a reliable, repeatable and fast real-time MUNE method has hitherto proved elusive. Ridall et al. (2007) implemented a reversible jump Markov chain Monte Carlo (RJMCMC) algorithm to produce a posterior distribution for the number of motor units, using a Bayesian hierarchical model that takes into account biological information about motor unit activation. However, we find that the approach can be unreliable for some datasets, since it can suffer from poor cross-dimensional mixing. Here we focus on improved inference by marginalising over the latent variables to create the likelihood. In particular, we explore how this can improve the RJMCMC mixing, and we investigate alternative approaches that utilise the likelihood (e.g. DIC (Spiegelhalter et al., 2002)). For this model, the marginalisation is over latent variables which, for a larger number of motor units, involves an intractable summation over all combinations of a set of latent binary variables whose joint sample space increases exponentially with the number of motor units. We provide a tractable and accurate approximation for this quantity and also investigate simulation approaches incorporated into RJMCMC using the results of Andrieu and Roberts (2009).
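In symbols (notation assumed), the marginal likelihood referred to above is the sum over all configurations of the binary firing indicators s for N motor units,

    p(y \mid \theta) = \sum_{s \in \{0,1\}^N} p(y \mid s, \theta) \, p(s \mid \theta),

which contains 2^N terms and is the quantity approximated in this work.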
Abstract:
Detailed representations of complex flow datasets are often difficult to generate using traditional vector visualisation techniques such as arrow plots and streamlines. This is particularly true when the flow regime changes in time. Texture-based techniques, which are based on the advection of dense textures, are a novel means of visualising such flows. We review two popular texture-based techniques and their application to flow datasets sourced from active research projects: line integral convolution (LIC) [1] and image-based flow visualisation (IBFV) [18]. We evaluate these techniques and report on their effectiveness from a visualisation perspective, as well as on their ease of implementation and computational overheads.
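As a hedged illustration of the first technique, a very small (and deliberately unoptimised) LIC sketch in Python: it averages a white-noise texture along short streamlines of a vector field, which is the core idea of LIC rather than a faithful reimplementation of [1].

    import numpy as np

    def lic(vx, vy, noise, length=20):
        # Tiny line integral convolution: at each pixel, average the noise
        # texture along a short streamline traced in both directions.
        ny, nx = noise.shape
        out = np.zeros_like(noise)
        for j in range(ny):
            for i in range(nx):
                acc, cnt = 0.0, 0
                for sgn in (1.0, -1.0):  # trace forwards and backwards
                    x, y = float(i), float(j)
                    for _ in range(length // 2):
                        u = vx[int(y) % ny, int(x) % nx]
                        v = vy[int(y) % ny, int(x) % nx]
                        n = np.hypot(u, v) or 1.0  # avoid division by zero
                        x += sgn * u / n
                        y += sgn * v / n
                        acc += noise[int(y) % ny, int(x) % nx]
                        cnt += 1
                out[j, i] = acc / cnt
        return out

    # Example: circular flow over a white-noise texture.
    ny, nx = 64, 64
    yy, xx = np.mgrid[0:ny, 0:nx]
    img = lic(-(yy - ny / 2), xx - nx / 2, np.random.rand(ny, nx))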