990 results for 01 Mathematical Sciences
Abstract:
Three types of shop scheduling problems, the flow shop, the job shop and the open shop scheduling problems, have been widely studied in the literature. However, very few articles address the group shop scheduling problem, introduced in 1997, which is a general formulation that covers the three above-mentioned shop scheduling problems as well as the mixed shop scheduling problem. In this paper, we apply tabu search to the group shop scheduling problem and evaluate the performance of the algorithm on a set of benchmark problems. The computational results show that our tabu search algorithm is typically more efficient and faster than the other methods proposed in the literature. Furthermore, the proposed tabu search method has found new best solutions for several of the benchmark instances.
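To make the approach concrete, the sketch below shows a generic tabu search skeleton over a swap neighborhood of a job permutation with a toy objective; the neighborhood, tabu tenure, aspiration rule and objective are illustrative assumptions, not the algorithm tuned for the group shop problem in the paper.

```python
import random
from itertools import combinations

def tabu_search(initial, objective, n_iter=200, tenure=7):
    """Generic tabu search over swap moves on a job permutation.
    `objective` is any callable scoring a permutation (lower is better);
    the swap neighborhood, fixed tenure and aspiration rule are
    illustrative choices."""
    current = list(initial)
    best, best_cost = list(current), objective(current)
    tabu = {}  # move (i, j) -> iteration until which it stays forbidden

    for it in range(n_iter):
        candidates = []
        for i, j in combinations(range(len(current)), 2):
            neighbor = current[:]
            neighbor[i], neighbor[j] = neighbor[j], neighbor[i]
            cost = objective(neighbor)
            forbidden = tabu.get((i, j), -1) >= it
            # aspiration: accept a tabu move if it beats the best solution found
            if not forbidden or cost < best_cost:
                candidates.append((cost, (i, j), neighbor))
        cost, move, neighbor = min(candidates, key=lambda c: c[0])
        current = neighbor
        tabu[move] = it + tenure
        if cost < best_cost:
            best, best_cost = list(current), cost
    return best, best_cost

# toy single-machine objective: total weighted completion time of a job sequence
durations = [4, 2, 7, 3, 5, 1]
weights = [1, 3, 2, 2, 1, 4]

def weighted_completion(perm):
    t = total = 0
    for job in perm:
        t += durations[job]
        total += weights[job] * t
    return total

random.seed(0)
start = list(range(len(durations)))
random.shuffle(start)
print(tabu_search(start, weighted_completion))
```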
Abstract:
In this paper, three metaheuristics are proposed for solving a class of job shop, open shop and mixed shop scheduling problems. We evaluate the performance of the proposed algorithms on a set of Lawrence's benchmark instances for the job shop problem, a set of randomly generated instances for the open shop problem, and a combined job shop and open shop data set for the mixed shop problem. The computational results show that the proposed algorithms perform extremely well on all three types of shop scheduling problem. The results also reveal that the mixed shop problem is easier to solve than the job shop problem, because the inclusion of more open shop jobs in the mixed shop makes the scheduling procedure more flexible.
Abstract:
In this paper, we propose three meta-heuristic algorithms for the permutation flowshop (PFS) and the general flowshop (GFS) problems. Two different neighborhood structures are used for these two types of flowshop problem: an insertion neighborhood for the PFS problem and a critical-path neighborhood for the GFS problem. To evaluate the performance of the proposed algorithms, they are tested on two sets of problem instances for both types of flowshop problem. The computational results show that the proposed meta-heuristic algorithms with the insertion neighborhood for the PFS problem perform slightly better than the corresponding algorithms with the critical-path neighborhood for the GFS problem, but in terms of computation time the GFS algorithms are faster than the corresponding PFS algorithms.
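As an illustration of the insertion neighborhood used for the PFS problem, the sketch below enumerates all insertion moves of a job permutation and evaluates them with a straightforward permutation flow shop makespan recursion; the data and function names are illustrative assumptions, and the paper's implementation may evaluate moves differently.

```python
def insertion_neighbors(perm):
    """All permutations reachable by removing one job and re-inserting it
    at a different position (the insertion neighborhood)."""
    n = len(perm)
    for i in range(n):
        job = perm[i]
        rest = perm[:i] + perm[i + 1:]
        for j in range(n):
            if j == i:
                continue  # re-inserting at the original position gives perm itself
            yield rest[:j] + [job] + rest[j:]

def pfs_makespan(perm, p):
    """Makespan of a permutation flow shop; p[j][k] is the processing
    time of job j on machine k."""
    m = len(p[0])
    completion = [0] * m  # completion time of the last scheduled job on each machine
    for job in perm:
        for k in range(m):
            start = max(completion[k], completion[k - 1] if k > 0 else 0)
            completion[k] = start + p[job][k]
    return completion[-1]

# 4 jobs on 3 machines (illustrative data)
p = [[3, 2, 4], [2, 5, 1], [4, 1, 3], [2, 3, 2]]
perm = [0, 1, 2, 3]
best = min(insertion_neighbors(perm), key=lambda q: pfs_makespan(q, p))
print(best, pfs_makespan(best, p))
```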
Abstract:
A hospital consists of a number of wards, units and departments that provide a variety of medical services and interact on a day-to-day basis. Nearly every department within a hospital schedules patients for the operating theatre (OT), and most wards receive patients from the OT following post-operative recovery. Because of the interrelationships between units, disruptions and cancellations within the OT can have a flow-on effect on the rest of the hospital. This often results in dissatisfied patients, nurses and doctors, escalating waiting lists, inefficient resource usage and undesirable waiting times. The objective of this study is to use Operational Research methodologies to enhance the performance of the operating theatre by improving elective patient planning through robust scheduling, and by improving the overall responsiveness to emergency patients through solving the disruption management and rescheduling problem. OT scheduling considers two types of patients: elective and emergency. Elective patients are selected from a waiting list and scheduled in advance based on resource availability and a set of objectives. This type of scheduling is referred to as ‘offline scheduling’. Disruptions to this schedule can occur for various reasons, including variations in length of treatment, equipment restrictions or breakdown, unforeseen delays and the arrival of emergency patients, which may compete for resources. Emergency patients consist of acute patients requiring surgical intervention or in-patients whose conditions have deteriorated. These may or may not be urgent and are triaged accordingly. Most hospitals reserve theatres for emergency cases, but when these or other resources are unavailable, disruptions to the elective schedule result, such as delays in surgery start time, elective surgery cancellations or transfers to another institution. Scheduling of emergency patients and the handling of schedule disruptions is an ‘online’ process typically handled by OT staff. This means that decisions are made ‘on the spot’ in a ‘real-time’ environment. There are three key stages to this study: (1) Analyse the performance of the operating theatre department using simulation. Simulation is used as a decision support tool and involves changing system parameters and elective scheduling policies and observing the effect on the system’s performance measures; (2) Improve the viability of elective schedules by making offline schedules more robust to differences between expected and actual treatment times, using robust scheduling techniques. This will improve access to care and the responsiveness to emergency patients; (3) Address the disruption management and rescheduling problem (which incorporates emergency arrivals) using innovative robust reactive scheduling techniques. The robust schedule will form the baseline schedule for the online robust reactive scheduling model.
Abstract:
Discrete Markov random field models provide a natural framework for representing images or spatial datasets. They model the spatial association present while providing a convenient Markovian dependency structure and strong edge-preservation properties. However, parameter estimation for discrete Markov random field models is difficult due to the complex form of the associated normalizing constant for the likelihood function. For large lattices, the reduced dependence approximation to the normalizing constant is based on performing computationally efficient and feasible forward recursions on smaller sublattices, which are then suitably combined to estimate the constant for the whole lattice. We present an efficient computational extension of the forward recursion approach for the autologistic model to lattices that have an irregularly shaped boundary and which may contain regions with no data; such lattices are typical in applications. Consequently, we also extend the reduced dependence approximation to these scenarios, enabling us to implement a practical and efficient non-simulation-based approach for spatial data analysis within the variational Bayesian framework. The methodology is illustrated through application to simulated data and example images. The supplemental materials include our C++ source code for computing the approximate normalizing constant and simulation studies.
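To give a concrete sense of the forward recursion underlying the reduced dependence approximation, the sketch below computes the exact normalizing constant of a small two-state autologistic lattice column by column; the parameterisation and variable names are assumptions for illustration, and in the reduced dependence approximation this recursion would instead be applied to small sublattices whose results are then combined.

```python
import itertools
import numpy as np

def column_configs(rows):
    """All binary configurations of a single lattice column."""
    return list(itertools.product((0, 1), repeat=rows))

def within(col, alpha, beta):
    # field term plus vertical neighbor interactions inside one column
    return alpha * sum(col) + beta * sum(col[i] * col[i + 1] for i in range(len(col) - 1))

def between(prev, col, beta):
    # horizontal interactions between two adjacent columns
    return beta * sum(a * b for a, b in zip(prev, col))

def log_normalizing_constant(rows, cols, alpha, beta):
    """Exact log normalizing constant of a small autologistic lattice,
    obtained by a forward recursion over columns."""
    configs = column_configs(rows)
    f = np.array([np.exp(within(c, alpha, beta)) for c in configs])
    for _ in range(cols - 1):
        g = np.zeros_like(f)
        for j, c in enumerate(configs):
            g[j] = np.exp(within(c, alpha, beta)) * sum(
                f[k] * np.exp(between(prev, c, beta)) for k, prev in enumerate(configs)
            )
        f = g
    return np.log(f.sum())

print(log_normalizing_constant(rows=4, cols=6, alpha=-0.2, beta=0.4))
```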
Abstract:
A distinct calcium profile is strongly implicated in regulating the multi-layered structure of the epidermis. However, the mechanisms that govern the regulation of this calcium profile are currently unclear. It clearly depends on the relatively impermeable barrier of the stratum corneum (passive regulation) but may also depend on calcium exchanges between keratinocytes and extracellular fluid (active regulation). Using a mathematical model that treats the viable sublayers of unwounded human and murine epidermis as porous media and assumes that their calcium profiles are passively regulated, we demonstrate that these profiles are also actively regulated. To obtain this result, we found that diffusion governs extracellular calcium motion in the viable epidermis and hence intracellular calcium is the main source of the epidermal calcium profile. Then, by comparison with experimental calcium profiles and combination with a hypothesised cell velocity distribution in the viable epidermis, we found that the net influx of calcium ions into keratinocytes from extracellular fluid may be constant and positive throughout the stratum basale and stratum spinosum, and that there is a net outflux of these ions in the stratum granulosum. Hence the calcium exchange between keratinocytes and extracellular fluid differs distinctly between the stratum granulosum and the underlying sublayers, and these differences actively regulate the epidermal calcium profile. Our results also indicate that plasma membrane dysfunction may be an early event during keratinocyte disintegration in the stratum granulosum.
Abstract:
The present study examined experimentally the phenological responses of a range of plant species to rises in temperature. We used the climate-change field protocol of the International Tundra Experiment (ITEX), which measures plant responses to warming of 1 to 2°C inside small open-topped chambers. The field study was established on the Bogong High Plains, Australia, in subalpine open heathlands, the most common treeless plant community on the Bogong High Plains. The study included areas burnt by fire in 2003, and therefore considers the interactive effects of warming and fire, which have rarely been studied in high mountain environments. From November 2003 to March 2006, various phenological phases were monitored inside and outside chambers during the snow-free periods. Warming resulted in earlier occurrence of key phenological events in 7 of the 14 species studied. Burning altered phenology in 9 of 10 species studied, with both earlier and later phenological changes depending on the species. There were no common phenological responses to warming or burning among species of the same family, growth form or flowering type (i.e. early- or late-flowering species) when all phenological events were examined. The proportion of plants that formed flower buds was influenced by fire in half of the species studied. The findings support previous results from ITEX and other warming experiments; that is, species respond individualistically to experimental warming. The inter-year variation in phenological response, the idiosyncratic nature of the responses to experimental warming among species, and an inherent resilience to fire may result in community resilience to short-term climate change. In the first 3 years of experimental warming, phenological responses do not appear to be driving community-level change. Our findings emphasise the value of examining multiple species in climate-change studies.
Abstract:
1. The likely phenological responses of plants to climate warming can be measured through experimental manipulation of field sites, but results are rarely validated against year-to-year changes in climate. Here, we describe the effect of 1-5 years of experimental warming on the phenology (budding, flowering and seed maturation) of six common subalpine plant species in the Australian Alps, using the International Tundra Experiment (ITEX) protocol.
2. Phenological changes in some species (particularly the forb Craspedia jamesii) were detected in experimental plots within a year of warming, whereas changes in most other species (the forb Erigeron bellidioides, the shrub Asterolasia trymalioides and the graminoids Carex breviculmis and Poa hiemata) did not develop until after 2-4 years; thus, there appears to be a cumulative effect of warming for some species across multiple years.
3. There was evidence of a change in the length of the period between flowering and seed maturity in one species (P. hiemata) that led to a similar timing of seed maturation, suggesting compensation.
4. Year-to-year variation in phenology was greater than variation between warmed and control plots and could be related to differences in thawing degree days (particularly for E. bellidioides) due to earlier timing of budding and other events under warmer conditions. However, in Carex breviculmis, there was no association between phenology and temperature changes across years.
5. These findings indicate that, although phenological changes occurred earlier in response to warming in all six species, some species showed buffered rather than immediate responses.
6. Synthesis. Warming in ITEX open-top chambers in the Australian Alps produced earlier budding, flowering and seed set in several alpine species. Species also altered the timing of these events, particularly budding, in response to year-to-year temperature variation. Some species responded immediately, whereas in others the cumulative effects of warming across several years were required before a response was detected.
Abstract:
PySSM is a Python package that has been developed for the analysis of time series using linear Gaussian state space models (SSMs). PySSM is easy to use; models can be set up quickly and efficiently, and a variety of different settings are available to the user. It also takes advantage of the scientific libraries NumPy and SciPy and other high-level features of the Python language. PySSM also serves as a platform for interfacing with optimised and parallelised Fortran routines. These Fortran routines make heavy use of Basic Linear Algebra Subprograms (BLAS) and Linear Algebra Package (LAPACK) functions for maximum performance. PySSM contains classes for filtering, classical smoothing and simulation smoothing.
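The model class that PySSM targets is the linear Gaussian state space model; as a point of reference, the sketch below implements a plain NumPy Kalman filter for such a model on a local level example. This illustrates the model class only and is not PySSM's own API, which should be taken from the package documentation.

```python
import numpy as np

def kalman_filter(y, T, Z, Q, H, a0, P0):
    """Kalman filter for the linear Gaussian state space model
        y_t     = Z a_t + e_t,    e_t ~ N(0, H)
        a_{t+1} = T a_t + u_t,    u_t ~ N(0, Q)
    and returns the filtered state means and covariances."""
    n, m = len(y), len(a0)
    a, P = a0.copy(), P0.copy()
    means, covs = np.zeros((n, m)), np.zeros((n, m, m))
    for t in range(n):
        v = y[t] - Z @ a                      # prediction error
        F = Z @ P @ Z.T + H                   # prediction error variance
        K = P @ Z.T @ np.linalg.inv(F)        # Kalman gain
        a, P = a + K @ v, P - K @ Z @ P       # filtering update
        means[t], covs[t] = a, P
        a, P = T @ a, T @ P @ T.T + Q         # one-step-ahead prediction
    return means, covs

# local level model: a random walk state observed with noise
rng = np.random.default_rng(1)
state = np.cumsum(rng.normal(scale=0.3, size=100))
y = (state + rng.normal(scale=1.0, size=100)).reshape(-1, 1)
filtered_means, _ = kalman_filter(
    y, T=np.eye(1), Z=np.eye(1), Q=np.array([[0.09]]),
    H=np.array([[1.0]]), a0=np.zeros(1), P0=np.eye(1) * 10,
)
```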
Abstract:
Increasing resistance of rabbits to myxomatosis in Australia has led to the exploration of Rabbit Haemorrhagic Disease, also called Rabbit Calicivirus Disease (RCD), as a possible control agent. While the initial spread of RCD in Australia resulted in widespread rabbit mortality in affected areas, the possible population dynamic effects of RCD and myxomatosis operating within the same system have not been properly explored. Here we present early mathematical modelling examining the interaction between the two diseases. In this study we use a deterministic compartment model, based on the classical SIR model in infectious disease modelling. We consider here only a single strain each of myxomatosis and RCD and neglect latent periods. We also include logistic population growth, with the inclusion of seasonal birth rates. We assume there is no cross-immunity due to either disease. The mathematical model allows for the possibility of both diseases being simultaneously present in an individual, although results are also presented for the case where co-infection is not possible, since co-infection is thought to be rare and questions exist as to whether it can occur. The simulation results of this investigation show that this is a crucial issue and should be part of future field studies. A single simultaneous outbreak of RCD and myxomatosis was simulated while ignoring natural births and deaths, which is appropriate for a short timescale of 20 days; simultaneous outbreaks may be more common in Queensland. For the case where co-infection is not possible, we find that the simultaneous presence of myxomatosis in the population suppresses the prevalence of RCD, compared to an outbreak of RCD with no outbreak of myxomatosis, and thus leads to a less effective control of the population. The reason for this is that infection with myxomatosis removes potentially susceptible rabbits from the possibility of infection with RCD (like a vaccination effect). We found that, for an initial myxomatosis prevalence of 20%, the reduction in the maximum prevalence of RCD was approximately 30% in the case where there was no simultaneous outbreak of myxomatosis, while the peak prevalence was only 15% when there was a simultaneous outbreak of myxomatosis. However, this maximum reduction will depend on the other parameter values chosen. When co-infection is allowed, this suppression effect still occurs but to a lesser degree, because rabbits infected with both diseases reduce the prevalence of myxomatosis. We also simulated multiple outbreaks over a longer timescale of 10 years, including natural population growth with seasonal birth rates and density-dependent (logistic) death rates. This shows how both diseases interact with each other and with population growth. Here we obtain sustained outbreaks occurring approximately every two years for the case of a simultaneous outbreak of both diseases but without simultaneous co-infection, with the prevalence varying from 0.1 to 0.5. Without myxomatosis present, the simulation predicts that RCD dies out quickly without further introduction from elsewhere. With the possibility of simultaneous co-infection of rabbits, sustained outbreaks are still possible, but the outbreaks are less severe and more frequent (approximately yearly). While further model development is needed, our work to date suggests that: 1) the diseases are likely to interact via their impacts on rabbit abundance levels, and 2) introduction of RCD can suppress myxomatosis prevalence.
We recommend that further modelling, in conjunction with field studies, be carried out to investigate how these two diseases interact in the population.
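As a minimal sketch of the kind of deterministic compartment model described above, the code below integrates an SIR-style system for the no-co-infection case over a single 20-day outbreak with births and natural deaths ignored; the compartment structure, parameter values and initial prevalences are illustrative assumptions, not the calibrated model used in the study.

```python
import numpy as np
from scipy.integrate import solve_ivp

def two_disease(t, x, beta_m, beta_r, gamma_m, gamma_r):
    """S = susceptible, Im / Ir = infected with myxomatosis / RCD,
    Rm / Rr = recovered or removed. No co-infection, so infection with
    one disease removes a rabbit from the susceptible pool of the other."""
    S, Im, Ir, Rm, Rr = x
    dS = -(beta_m * Im + beta_r * Ir) * S
    dIm = beta_m * S * Im - gamma_m * Im
    dIr = beta_r * S * Ir - gamma_r * Ir
    return [dS, dIm, dIr, gamma_m * Im, gamma_r * Ir]

# single 20-day outbreak with births and natural deaths ignored;
# 20% initial myxomatosis prevalence and 5% initial RCD prevalence (illustrative)
x0 = [0.75, 0.20, 0.05, 0.0, 0.0]
sol = solve_ivp(two_disease, (0, 20), x0, args=(0.9, 1.2, 0.15, 0.25),
                dense_output=True)
t = np.linspace(0, 20, 200)
peak_rcd = sol.sol(t)[2].max()
print(f"peak RCD prevalence: {peak_rcd:.2f}")
```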
Abstract:
Here we present a sequential Monte Carlo (SMC) algorithm that can be used for any one-at-a-time Bayesian sequential design problem in the presence of model uncertainty where discrete data are encountered. Our focus is on adaptive design for model discrimination, but the methodology is applicable if one has a different design objective, such as parameter estimation or prediction. An SMC algorithm is run in parallel for each model, and the algorithm relies on a convenient estimator of the evidence of each model which is essentially a function of importance sampling weights. Other methods for this task, such as quadrature, which is often used in design, suffer from the curse of dimensionality. Approximating posterior model probabilities in this way allows us to use model discrimination utility functions derived from information theory that were previously difficult to compute except for conjugate models. A major benefit of the algorithm is that it requires very little problem-specific tuning. We demonstrate the methodology on three applications, including discriminating between models for the decline in motor neuron numbers in patients suffering from neurological diseases such as Motor Neuron disease.
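The evidence estimator referred to above can be sketched as follows: observations are introduced one at a time, and the estimate of the log evidence accumulates the weighted average of the incremental likelihoods, i.e. a function of the importance sampling weights. The resample-move steps of a full SMC sampler are omitted from this sketch, and the Bernoulli/Beta example is purely a toy check, not one of the paper's applications.

```python
import numpy as np
from math import lgamma

def smc_log_evidence(y, prior_sampler, loglik_one, n_particles=5000, seed=0):
    """Estimate log p(y_1:n) by introducing observations one at a time and
    accumulating the weighted average of the incremental likelihoods.
    Resample/move steps of a full SMC sampler are omitted here."""
    rng = np.random.default_rng(seed)
    theta = prior_sampler(rng, n_particles)
    logw = np.zeros(n_particles)
    log_evidence = 0.0
    for yt in y:
        incr = loglik_one(yt, theta)            # log p(y_t | theta_i)
        w = np.exp(logw - logw.max())
        w /= w.sum()                            # normalised weights W_i
        log_evidence += np.log(np.sum(w * np.exp(incr)))
        logw = logw + incr                      # reweight the particles
    return log_evidence

# toy check: Bernoulli data with a Beta(1, 1) prior has a known exact evidence
y = np.array([1, 0, 1, 1, 0, 1, 1, 1])
est = smc_log_evidence(
    y,
    prior_sampler=lambda rng, n: rng.beta(1.0, 1.0, size=n),
    loglik_one=lambda yt, th: np.where(yt == 1, np.log(th), np.log(1 - th)),
)
k, n = int(y.sum()), len(y)
exact = lgamma(k + 1) + lgamma(n - k + 1) - lgamma(n + 2)  # log Beta(k + 1, n - k + 1)
print(est, exact)
```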
Abstract:
Precise identification of the time when a change in a hospital outcome has occurred enables clinical experts to search for a potential special cause more effectively. In this paper, we develop change point estimation methods for the survival time of a clinical procedure in the presence of patient mix within a Bayesian framework. We apply Bayesian hierarchical models to formulate the change point where there exists a step change in the mean survival time of patients who underwent cardiac surgery. The data are right censored since the monitoring is conducted over a limited follow-up period. We capture the effect of risk factors prior to the surgery using a Weibull accelerated failure time regression model. Markov chain Monte Carlo is used to obtain posterior distributions of the change point parameters, including the location and magnitude of changes, as well as the corresponding probabilistic intervals and inferences. The performance of the Bayesian estimator is investigated through simulation, and the results show that precise estimates can be obtained when the estimators are used in conjunction with risk-adjusted survival time CUSUM control charts for different magnitude scenarios. The proposed estimator performs better when a longer follow-up period (censoring time) is applied. In comparison with the alternative built-in CUSUM estimator, the Bayesian estimator yields more accurate and precise estimates. These superiorities are enhanced when the probability quantification, flexibility and generalizability of the Bayesian change point detection model are also considered.
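A minimal sketch of the kind of likelihood underlying such a model is given below: a Weibull accelerated failure time model with right censoring and a step change in the scale from an unknown change point onwards. The covariate structure, priors and the MCMC step are omitted, and all names and values are illustrative assumptions rather than the paper's formulation.

```python
import numpy as np

def weibull_cp_loglik(t, event, X, tau, beta, delta, shape):
    """Log-likelihood of right-censored survival times under a Weibull AFT
    model whose scale is shifted by exp(delta) from patient index tau onwards.

    t     : observed time (event or censoring time) for each patient, in sequence
    event : 1 if the event was observed, 0 if the time is right-censored
    X     : patient risk factors, including an intercept column (n x p)
    tau   : index of the first patient affected by the step change"""
    n = len(t)
    step = (np.arange(n) >= tau).astype(float)   # 0 before the change, 1 after
    scale = np.exp(X @ beta + delta * step)      # accelerated failure time scale
    z = (t / scale) ** shape
    log_f = np.log(shape) - shape * np.log(scale) + (shape - 1) * np.log(t) - z
    log_S = -z                                   # log survival for censored times
    return np.sum(event * log_f + (1 - event) * log_S)

# toy data: 200 patients, one standardised risk factor, change at patient 120
rng = np.random.default_rng(2)
n, tau_true = 200, 120
X = np.column_stack([np.ones(n), rng.normal(size=n)])
true_scale = np.exp(3.0 + 0.4 * X[:, 1] - 0.8 * (np.arange(n) >= tau_true))
t_event = true_scale * rng.weibull(1.5, size=n)
censor_time = 30.0                               # limited follow-up period
t = np.minimum(t_event, censor_time)
event = (t_event <= censor_time).astype(float)
print(weibull_cp_loglik(t, event, X, tau_true, np.array([3.0, 0.4]), -0.8, 1.5))
```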
Abstract:
Radial Hele-Shaw flows are treated analytically using conformal mapping techniques. The geometry of interest has a doubly-connected annular region of viscous fluid surrounding an inviscid bubble that is either expanding or contracting due to a pressure difference caused by injection or suction of the inviscid fluid. The zero-surface-tension problem is ill-posed for both bubble expansion and contraction, as both scenarios involve viscous fluid displacing inviscid fluid. Exact solutions are derived by tracking the location of singularities and critical points in the analytic continuation of the mapping function. We show that by treating the critical points, it is easy to observe finite-time blow-up, and the evolution equations may be written in exact form using complex residues. We present solutions that start with cusps on one interface and end with cusps on the other, as well as solutions that have the bubble contracting to a point. For the latter solutions, the bubble approaches an ellipse in shape at extinction.
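A schematic of the conformal-mapping setup may help fix ideas; the notation below is assumed for illustration and is not reproduced from the paper.

```latex
% Schematic of the conformal-mapping formulation (notation assumed here for
% illustration): the annulus of viscous fluid is the image of a reference
% annulus under a time-dependent univalent map
%     z = f(\zeta, t),  \rho(t) < |\zeta| < 1 ,
% and a zero-surface-tension cusp forms on an interface when a critical point
% \zeta_c of the analytic continuation of f reaches that boundary circle:
\frac{\partial f}{\partial \zeta}\bigl(\zeta_c(t), t\bigr) = 0,
\qquad |\zeta_c(t^{*})| \to 1 \quad \text{or} \quad |\zeta_c(t^{*})| \to \rho(t^{*}).
```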
Abstract:
The crosstalk between fibroblasts and keratinocytes is a vital component of the wound healing process, and involves the activity of a number of growth factors and cytokines. In this work, we develop a mathematical model of this crosstalk in order to elucidate the effects of these interactions on the regeneration of collagen in a wound that heals by second intention. We consider the role of four components that strongly affect this process: transforming growth factor-beta, platelet-derived growth factor, interleukin-1 and keratinocyte growth factor. The impact of this network of interactions on the degradation of an initial fibrin clot, as well as its subsequent replacement by a matrix that is mainly comprised of collagen, is described through an eight-component system of nonlinear partial differential equations. Numerical results, obtained in a two-dimensional domain, highlight key aspects of this multifarious process such as reepithelialisation. The model is shown to reproduce many of the important features of normal wound healing. In addition, we use the model to simulate the treatment of two pathological cases: chronic hypoxia, which can lead to chronic wounds; and prolonged inflammation, which has been shown to lead to hypertrophic scarring. We find that our model predictions are qualitatively in agreement with previously reported observations, and provide an alternative pathway for gaining insight into this complex biological process.
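In schematic form, the eight-component system has the structure of a coupled reaction-transport system; the generic form below is illustrative only, since the paper specifies its own transport and kinetic terms for each species.

```latex
% Generic structure assumed for illustration: u_1, ..., u_8 denote the eight
% interacting species (cell densities, matrix components, growth factors and
% cytokines), each with its own diffusivity D_i and nonlinear kinetics R_i.
\frac{\partial u_i}{\partial t}
  = \nabla \cdot \bigl( D_i \nabla u_i \bigr) + R_i(u_1, \dots, u_8),
\qquad i = 1, \dots, 8 .
```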
Abstract:
A Multimodal Seaport Container Terminal (MSCT) is a complex system which requires careful planning and control in order to operate efficiently. It consists of a number of subsystems that require optimisation of the operations within them, as well as synchronisation of machines and containers between the various subsystems. Inefficiency in the terminal can delay ships from their scheduled timetables, as well as cause delays in delivering containers to their inland destinations, both of which can be very costly to their operators. The purpose of this PhD thesis is to use Operations Research methodologies to optimise and synchronise these subsystems as an integrated application. An initial model is developed for the overall MSCT; however, due to the large number of assumptions that had to be made, as well as other issues, it is found to be too inaccurate and infeasible for practical use. Instead, a method is proposed in which models are developed for each subsystem and then integrated with each other. Mathematical models are developed for the Storage Area System (SAS) and Intra-terminal Transportation System (ITTS). The SAS deals with the movement and assignment of containers to stacks within the storage area, both when they arrive and when they are rehandled to retrieve containers below them. The ITTS deals with scheduling the movement of containers and machines between the storage areas and other sections of the terminal, such as the berth and road/rail terminals. Various constructive heuristics are explored and compared for these models to produce good initial solutions for large-sized problems, which are otherwise impractical to solve by exact methods. These initial solutions are further improved through the use of an innovative hyper-heuristic algorithm that integrates the SAS and ITTS solutions and optimises them through meta-heuristic techniques. The method by which the two models can interact with each other as an integrated system is discussed, as well as how this method can be extended to the other subsystems of the MSCT.
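To give a flavour of what a constructive heuristic for the SAS might look like, the toy sketch below greedily assigns arriving containers to stacks using their retrieval times; this is an illustrative assumption only and does not reflect the thesis's actual models or constraints.

```python
def greedy_stack_assignment(containers, n_stacks, max_height):
    """Toy constructive heuristic for a storage area: assign each arriving
    container (represented by its retrieval time) to a stack whose current top
    container is retrieved no earlier than it, to reduce future rehandles;
    otherwise fall back to the lowest non-full stack."""
    stacks = [[] for _ in range(n_stacks)]
    for retrieval in containers:
        open_stacks = [s for s in stacks if len(s) < max_height]
        # prefer stacks whose top container leaves no earlier than this one
        good = [s for s in open_stacks if not s or s[-1] >= retrieval]
        target = min(good if good else open_stacks, key=len)
        target.append(retrieval)
    return stacks

# containers arrive in this order; each value is the container's retrieval time
print(greedy_stack_assignment([5, 3, 8, 1, 7, 2, 6, 4], n_stacks=3, max_height=4))
```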