356 results for RM (rate monotonic) algorithm
Abstract:
In cloud computing, resource allocation and scheduling of multiple composite web services is an important and challenging problem. This is especially so in a hybrid cloud where there may be some low-cost resources available from private clouds and some high-cost resources from public clouds. Meeting this challenge involves two classical computational problems: one is assigning resources to each of the tasks in the composite web services; the other is scheduling the allocated resources when each resource may be used by multiple tasks at different points of time. In addition, Quality-of-Service (QoS) issues, such as execution time and running costs, must be considered in the resource allocation and scheduling problem. Here we present a Cooperative Coevolutionary Genetic Algorithm (CCGA) to solve the deadline-constrained resource allocation and scheduling problem for multiple composite web services. Experimental results show that our CCGA is both efficient and scalable.
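The cooperative coevolutionary idea behind the CCGA can be sketched generically: two subpopulations each evolve part of a solution and are evaluated against the other population's best collaborator. The sketch below uses a toy squared-distance objective, not the paper's deadline-constrained scheduling fitness; all parameters (population size, mutation scale) are illustrative.

```python
import random

def evaluate(xs, ys):
    # Toy separable objective standing in for the real cost/deadline
    # fitness: squared distance from the origin (lower is better).
    return sum(v * v for v in xs + ys)

def ccga(dim=4, pop_size=20, gens=60, seed=1):
    rng = random.Random(seed)
    new = lambda: [rng.uniform(-5, 5) for _ in range(dim)]
    pops = [[new() for _ in range(pop_size)] for _ in (0, 1)]
    best = [pops[0][0], pops[1][0]]
    for _ in range(gens):
        for s in (0, 1):
            other = best[1 - s]  # best collaborator from the other subpopulation
            key = (lambda ind: evaluate(ind, other)) if s == 0 else \
                  (lambda ind: evaluate(other, ind))
            pops[s].sort(key=key)
            best[s] = pops[s][0]
            # Elitism: keep the top half, refill with mutated elite copies.
            half = pop_size // 2
            pops[s] = pops[s][:half] + [
                [g + rng.gauss(0, 0.3) for g in rng.choice(pops[s][:half])]
                for _ in range(pop_size - half)
            ]
    return best, evaluate(best[0], best[1])

best, score = ccga()
```

In the paper's setting, one subpopulation would encode resource assignments and the other scheduling decisions; the decomposition, not the toy objective, is the point.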
Abstract:
In a context where over-indebtedness and financial exclusion have been recognised as problems in Australia, it is undesirable that those who can least afford it pay a high cost for short-term consumer credit. Evidence points to an increase in consumer debt in Australia and consequential over-indebtedness, which has been shown to lead to a wide range of social problems. There is also evidence of financial exclusion, where consumers suffer a lack of access to mainstream financial services, and in Australia this is particularly the case with regard to access to safe and affordable credit. Financial exclusion can only exacerbate over-indebtedness, given that financially excluded, predominantly low-income consumers have been shown to turn to high-cost credit to meet their short-term credit needs. This is a problem that has been explored most recently in the Victorian Consumer Credit Review...
Abstract:
Single particle analysis (SPA) coupled with high-resolution electron cryo-microscopy is emerging as a powerful technique for the structure determination of membrane protein complexes and soluble macromolecular assemblies. Current estimates suggest that ∼10⁴–10⁵ particle projections are required to attain a 3 Å resolution 3D reconstruction (symmetry dependent). Selecting this number of molecular projections differing in size, shape and symmetry is a rate-limiting step for the automation of 3D image reconstruction. Here, we present SwarmPS, a feature-rich, GUI-based software package to manage large-scale, semi-automated particle picking projects. The software provides cross-correlation and edge-detection algorithms. Algorithm-specific parameters are transparently and automatically determined through user interaction with the image, rather than by trial and error. Other features include multiple image handling (∼10²), local and global particle selection options, interactive image freezing, automatic particle centering, and full manual override to correct false positives and negatives. SwarmPS is user friendly, flexible, extensible, fast, and capable of exporting boxed-out projection images, or particle coordinates, compatible with downstream image processing suites.
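Cross-correlation is one of the two picking algorithms the abstract names. A minimal normalized cross-correlation sketch (not SwarmPS code; the image and template below are tiny illustrative arrays) shows how a template match produces a score map whose peak marks a particle location:

```python
def ncc(image, template):
    """Normalized cross-correlation map of template over image (valid region).
    Scores lie in [-1, 1]; 1.0 means an exact (up to scale/offset) match."""
    th, tw = len(template), len(template[0])
    tvals = [v for row in template for v in row]
    tmean = sum(tvals) / len(tvals)
    tnorm = sum((v - tmean) ** 2 for v in tvals) ** 0.5
    out = []
    for i in range(len(image) - th + 1):
        row = []
        for j in range(len(image[0]) - tw + 1):
            patch = [image[i + di][j + dj] for di in range(th) for dj in range(tw)]
            pmean = sum(patch) / len(patch)
            pnorm = sum((v - pmean) ** 2 for v in patch) ** 0.5
            num = sum((p - pmean) * (t - tmean) for p, t in zip(patch, tvals))
            row.append(num / (pnorm * tnorm) if pnorm and tnorm else 0.0)
        out.append(row)
    return out

# Plant the template in an otherwise empty image and locate it.
image = [[0.0] * 6 for _ in range(6)]
template = [[5.0, 1.0], [1.0, 5.0]]
for di in range(2):
    for dj in range(2):
        image[2 + di][3 + dj] = template[di][dj]
scores = ncc(image, template)
```

Real packages compute this in the Fourier domain for speed; the brute-force form above only makes the scoring explicit.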
Abstract:
In previous research (Chung et al., 2009), the potential of the continuous risk profile (CRP) to proactively detect the systematic deterioration of freeway safety levels was presented. In this paper, this potential is investigated further, and an algorithm is proposed for proactively detecting sites where the collision rate is not sufficiently high to be classified as a high collision concentration location but where a systematic deterioration of safety level is observed. The approach proposed compares the weighted CRP across different years and uses the cumulative sum (CUSUM) algorithm to detect the sites where changes in collision rate are observed. The CRPs of the detected sites are then compared for reproducibility. When high reproducibility is observed, a growth factor is used for sequential hypothesis testing to determine if the collision profiles are increasing over time. Findings from applying the proposed method using empirical data are documented in the paper together with a detailed description of the method.
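The CUSUM step the abstract relies on can be sketched in a few lines: accumulate deviations above a reference level and raise an alarm when the sum crosses a threshold. The slack `k` and threshold `h` below are illustrative tuning parameters, and the rate series is synthetic, not the paper's empirical data:

```python
def cusum(series, target, k=0.5, h=4.0):
    """One-sided upward CUSUM: return indices where the cumulative
    deviation above (target + k) exceeds the threshold h."""
    s, alarms = 0.0, []
    for i, x in enumerate(series):
        s = max(0.0, s + (x - target - k))
        if s > h:
            alarms.append(i)
            s = 0.0  # restart accumulation after an alarm
    return alarms

# Synthetic yearly collision rates with a shift upward from index 5.
rates = [2, 3, 2, 2, 3, 6, 7, 6, 7, 6]
alarms = cusum(rates, target=2.5)
```

The appeal for safety monitoring is exactly what the abstract exploits: CUSUM reacts to a sustained small shift in rate long before any single year looks extreme.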
Abstract:
Recently the application of the quasi-steady-state approximation (QSSA) to the stochastic simulation algorithm (SSA) was suggested for the purpose of speeding up stochastic simulations of chemical systems that involve both relatively fast and slow chemical reactions [Rao and Arkin, J. Chem. Phys. 118, 4999 (2003)] and further work has led to the nested and slow-scale SSA. Improved numerical efficiency is obtained by respecting the vastly different time scales characterizing the system and then by advancing only the slow reactions exactly, based on a suitable approximation to the fast reactions. We considerably extend these works by applying the QSSA to numerical methods for the direct solution of the chemical master equation (CME) and, in particular, to the finite state projection algorithm [Munsky and Khammash, J. Chem. Phys. 124, 044104 (2006)], in conjunction with Krylov methods. In addition, we point out some important connections to the literature on the (deterministic) total QSSA (tQSSA) and place the stochastic analogue of the QSSA within the more general framework of aggregation of Markov processes. We demonstrate the new methods on four examples: Michaelis–Menten enzyme kinetics, double phosphorylation, the Goldbeter–Koshland switch, and the mitogen activated protein kinase cascade. Overall, we report dramatic improvements by applying the tQSSA to the CME solver.
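The SSA that this line of work accelerates is Gillespie's direct method. A minimal sketch, using the abstract's Michaelis–Menten example with illustrative rate constants and molecule counts (not taken from the paper):

```python
import random

def ssa(x0, stoich, propensity, t_end, seed=3):
    """Gillespie direct-method SSA: simulate one exact trajectory of a
    discrete-state chemical system up to time t_end."""
    rng = random.Random(seed)
    t, x = 0.0, list(x0)
    while True:
        a = [p(x) for p in propensity]
        a0 = sum(a)
        if a0 == 0.0:
            break                      # no reaction can fire
        t += rng.expovariate(a0)       # exponential waiting time
        if t >= t_end:
            break
        r = rng.random() * a0          # pick a reaction proportional to a_j
        for j, aj in enumerate(a):
            r -= aj
            if r <= 0.0:
                for k, v in enumerate(stoich[j]):
                    x[k] += v
                break
    return x

# Michaelis-Menten: S + E -> C, C -> S + E, C -> P + E
# species order: [S, E, C, P]; rates are illustrative.
stoich = [(-1, -1, 1, 0), (1, 1, -1, 0), (0, 1, -1, 1)]
props = [lambda x: 0.01 * x[0] * x[1],
         lambda x: 0.1 * x[2],
         lambda x: 0.1 * x[2]]
final = ssa([100, 20, 0, 0], stoich, props, t_end=200.0)
```

The stiffness the QSSA addresses is visible here: the fast binding/unbinding reactions dominate `a0`, so most simulated events advance the slow product-forming dynamics not at all.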
Abstract:
This thesis investigates profiling and differentiating customers through the use of statistical data mining techniques. The business application of our work centres on examining individuals’ seldom-studied yet critical consumption behaviour over an extensive time period within the context of the wireless telecommunication industry; consumption behaviour (as opposed to purchasing behaviour) is behaviour that has been performed so frequently that it has become habitual and involves minimal intention or decision making. Key variables investigated are the activity initiation timestamp and cell tower location, as well as the activity type and usage quantity (e.g., voice call with duration in seconds); the research focuses on customers’ spatial and temporal usage behaviour. The main methodological emphasis is on the development of clustering models based on Gaussian mixture models (GMMs), which are fitted using the recently developed variational Bayesian (VB) method. VB is an efficient deterministic alternative to the popular but computationally demanding Markov chain Monte Carlo (MCMC) methods. The standard VB-GMM algorithm is extended by allowing component splitting, so that it is robust to initial parameter choices and can automatically and efficiently determine the number of components. The new algorithm we propose allows more effective modelling of individuals’ highly heterogeneous and spiky spatial usage behaviour, or more generally human mobility patterns; the term spiky describes data patterns with large areas of low probability mixed with small areas of high probability. Customers are then characterised and segmented based on the fitted GMM, which corresponds to how each of them uses the products/services spatially in their daily lives; this essentially captures their likely lifestyle and occupational traits.
Other significant research contributions include fitting GMMs using VB to circular data (i.e., the temporal usage behaviour), and developing clustering algorithms suitable for high-dimensional data based on the use of VB-GMM.
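The GMM fitting at the heart of the thesis can be illustrated with plain maximum-likelihood EM, a simpler stand-in for the VB method used there (VB additionally places priors over the mixture parameters and infers the component count). The sketch fits a two-component 1-D mixture to synthetic data; all values are illustrative:

```python
import math
import random

def em_gmm_1d(data, iters=60):
    """EM for a two-component 1-D Gaussian mixture (maximum likelihood);
    a simplified stand-in for the thesis's variational Bayesian fitting."""
    mu = [min(data), max(data)]   # deterministic spread-out initialization
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point.
        resp = []
        for x in data:
            ps = [w[j] / math.sqrt(2 * math.pi * var[j]) *
                  math.exp(-(x - mu[j]) ** 2 / (2 * var[j])) for j in (0, 1)]
            s = sum(ps) or 1e-300
            resp.append([p / s for p in ps])
        # M-step: re-estimate weights, means, and variances.
        for j in (0, 1):
            nj = sum(r[j] for r in resp) or 1e-300
            w[j] = nj / len(data)
            mu[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
            var[j] = max(1e-6, sum(r[j] * (x - mu[j]) ** 2
                                   for r, x in zip(resp, data)) / nj)
    return w, mu, var

rng = random.Random(7)
data = ([rng.gauss(0, 1) for _ in range(300)] +
        [rng.gauss(8, 1) for _ in range(300)])
w, mu, var = em_gmm_1d(data)
```

Unlike this sketch, the thesis's split-based VB-GMM does not need the number of components fixed in advance, which is what makes it suitable for spiky mobility data.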
Abstract:
Biochemical reactions underlying genetic regulation are often modelled as a continuous-time, discrete-state Markov process, and the evolution of the associated probability density is described by the so-called chemical master equation (CME). However, the CME is typically difficult to solve, since the state space involved can be very large or even countably infinite. Recently a finite state projection (FSP) method that truncates the state space was suggested and shown to be effective on a model of the Pap-pili epigenetic switch. However, in this example both the model and the final time at which the solution was computed were relatively small. Presented here is a Krylov FSP algorithm based on a combination of state-space truncation and inexact matrix-vector product routines. This allows larger-scale models to be studied and solutions for larger final times to be computed in a realistic execution time. Additionally, the new method computes the solution at intermediate times at virtually no extra cost, since it is derived from Krylov-type methods for computing matrix exponentials. For the purpose of comparison, the new algorithm is applied to the model of the Pap-pili epigenetic switch, where the original FSP was first demonstrated. The method is also applied to a more sophisticated model of regulated transcription. Numerical results indicate that the new approach is significantly faster and extendable to larger biological models.
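The core computation, propagating a truncated CME through a matrix exponential, can be sketched on a small immigration-death process. The substepped Taylor series below is a naive stand-in for the paper's Krylov routines, and the model, truncation size, and rates are illustrative:

```python
def mat_vec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def expm_times_v(A, v, t, substeps=200, terms=20):
    """Approximate exp(A*t) @ v by a truncated Taylor series applied over
    many small substeps (a simple stand-in for Krylov exponential methods)."""
    h = t / substeps
    for _ in range(substeps):
        term, out = list(v), list(v)
        for k in range(1, terms):
            term = [h / k * y for y in mat_vec(A, term)]
            out = [o + y for o, y in zip(out, term)]
        v = out
    return v

# FSP-truncated CME generator for an immigration-death process on
# states 0..N: births at rate b, deaths at rate d*n.  dp/dt = A p.
N, b, d = 30, 5.0, 1.0
A = [[0.0] * (N + 1) for _ in range(N + 1)]
for n in range(N + 1):
    A[n][n] -= b + d * n          # probability flowing out of state n
    if n < N:
        A[n + 1][n] += b          # birth n -> n+1 (lost past N: FSP leakage)
    if n > 0:
        A[n - 1][n] += d * n      # death n -> n-1

p0 = [0.0] * (N + 1)
p0[0] = 1.0                       # start with zero molecules
p = expm_times_v(A, p0, t=5.0)
fsp_error = 1.0 - sum(p)          # mass absorbed outside the truncation
```

The FSP guarantee appears as `fsp_error`: because leaked probability is never returned, one minus the retained mass bounds the truncation error, and here the mean of `p` matches the analytic value (b/d)(1 - e^(-dt)).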
Abstract:
A healthy human would be expected to show periodic blinks, making a brief closure of the eyelids. Most blinks are spontaneous, occurring regularly with no external stimulus. However, a reflex blink can occur in response to external stimuli such as a bright light, a sudden loud noise, or an object approaching the eyes. A voluntary or forced blink is another type of blink, in which the person deliberately closes the eyes and the lower eyelid rises to meet the upper eyelid. A complete blink, in which the upper eyelid touches the lower eyelid, contributes to the health of the ocular surface by providing a fresh layer of tears, as well as maintaining optical integrity by providing a smooth tear film over the cornea. The rate of blinking and its completeness vary depending on the task undertaken during blink assessment, the direction of gaze, the emotional state of the subjects and the method by which the blink was measured. It is also well known that wearing contact lenses (both rigid and soft lenses) can induce significant changes in blink rate and completeness. It has been established that efficient blinking plays an important role in ocular surface health during contact lens wear and in improving contact lens performance and comfort. Inefficient blinking during contact lens wear may be related to a low blink rate or incomplete blinking and can often be a reason for dry eye symptoms or ocular surface staining. It has previously been shown that upward gaze can affect blink rate, causing it to become faster. In the first experiment, it was decided to expand on previous studies in this area by examining the effect of various gaze directions (i.e. upward gaze, primary gaze, downward gaze and lateral gaze) as well as head angle (recumbent position) on normal subjects’ blink rate and completeness through the use of filming with a high-speed camera. 
The results of this experiment showed that as the open palpebral aperture (and exposed ocular surface area) increased from downward gaze to upward gaze, the number of blinks significantly increased (p<0.04). Also, the size of the closed palpebral aperture significantly increased from downward gaze to upward gaze (p<0.005). A weak positive correlation (R² = 0.18) between the blink rate and ocular surface area was found in this study. It was also found that the subjects showed 81% complete blinks, 19% incomplete blinks and 2% twitch blinks in primary gaze, consistent with previous studies. The difference in the percentage of incomplete blinks between upward gaze and downward gaze was significant (p<0.004), with more incomplete blinks in upward gaze. The findings of this experiment suggest that while blink rate becomes slower in downward gaze, the completeness of blinking is typically better, thereby potentially reducing the risk of tear instability. On the other hand, while the completeness of blinking becomes worse in upward gaze, this is potentially offset by increased blink frequency. In addition, blink rate and completeness were not affected by lateral gaze or head angle, possibly because these conditions produce an open palpebral aperture similar in size to that in primary gaze. In the second experiment, an investigation into the changes in blink rate and completeness was carried out in primary gaze and downward gaze with soft and rigid contact lenses in unadapted wearers. Not surprisingly, rigid lens wear caused a significant increase in the blink rate in both primary (p<0.001) and downward gaze (p<0.02). After fitting rigid contact lenses, the closed palpebral aperture (blink completeness) did not show any changes, but the open palpebral aperture showed a significant narrowing (p<0.04). This might result from the subjects’ attempt to avoid interaction between the upper eyelid and the edge of the lens to minimize discomfort. 
After applying topical anaesthetic eye drops in the eye fitted with rigid lenses, the increased blink rate dropped to values similar to those before lens insertion and the open palpebral aperture returned to baseline values, suggesting that corneal and/or lid margin sensitivity was mediating the increased blink rate and narrowed palpebral aperture. We also investigated the changes in blink rate and completeness with soft contact lenses, including a soft sphere, a double slab-off toric design and a periballast toric design. Soft contact lenses did not cause any significant changes in the blink rate, closed palpebral aperture, open palpebral aperture or the percentage of incomplete blinks in either primary gaze or downward gaze. After applying anaesthetic eye drops, the blink rate reduced in both primary gaze and downward gaze; however, this difference was not statistically significant. The size of the closed palpebral aperture and open palpebral aperture did not show any significant changes after applying anaesthetic eye drops. However, it should be noted that the effects of rigid and soft contact lenses that we observed in these studies were only the immediate reaction to contact lenses; in the longer term, it is likely that these responses will vary as the eye adapts to the presence of the lenses.
Abstract:
This paper investigates a field programmable gate array (FPGA) approach for multi-objective and multi-disciplinary design optimisation (MDO) problems. One class of optimisation method that has been well studied and established for large and complex problems, such as those inherent in MDO, is multi-objective evolutionary algorithms (MOEAs). The MOEA nondominated sorting genetic algorithm II (NSGA-II) is implemented in hardware on an FPGA chip. Applying the FPGA-based NSGA-II to multi-objective test problem suites verified the effectiveness of the implementation. Results show that NSGA-II on FPGA performs three orders of magnitude better than its PC-based counterpart.
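The routine that gives NSGA-II its name is fast non-dominated sorting, which partitions candidate solutions into successive Pareto fronts. A minimal software sketch (the paper's contribution is mapping this into FPGA hardware; the objective vectors below are illustrative, minimizing both coordinates):

```python
def fast_nondominated_sort(points):
    """NSGA-II fast non-dominated sort: partition objective vectors
    (minimization) into Pareto fronts, best front first."""
    n = len(points)
    dominated = [[] for _ in range(n)]   # indices each solution dominates
    count = [0] * n                      # how many solutions dominate i

    def dominates(a, b):
        return (all(x <= y for x, y in zip(a, b)) and
                any(x < y for x, y in zip(a, b)))

    for i in range(n):
        for j in range(n):
            if i != j and dominates(points[i], points[j]):
                dominated[i].append(j)
            elif i != j and dominates(points[j], points[i]):
                count[i] += 1
    fronts = [[i for i in range(n) if count[i] == 0]]
    while fronts[-1]:
        nxt = []
        for i in fronts[-1]:
            for j in dominated[i]:       # peel off the current front
                count[j] -= 1
                if count[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
    return fronts[:-1]                   # drop the trailing empty front

pts = [(1, 5), (2, 3), (4, 1), (3, 4), (5, 5)]
fronts = fast_nondominated_sort(pts)
```

The pairwise-comparison structure of this loop is what makes it attractive for hardware: the comparisons are independent and can be evaluated in parallel on an FPGA.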