151 results for Optimal control problems
Abstract:
Mosquito-borne diseases pose some of the greatest challenges in public health, especially in tropical and sub-tropical regions of the world. Efforts to control these diseases have been underpinned by a theoretical framework developed for malaria by Ross and Macdonald, including models, metrics for measuring transmission, and theory of control that identifies key vulnerabilities in the transmission cycle. That framework, especially Macdonald’s formula for R0 and its entomological derivative, vectorial capacity, are now used to study dynamics and design interventions for many mosquito-borne diseases. A systematic review of 388 models published between 1970 and 2010 found that the vast majority adopted the Ross–Macdonald assumption of homogeneous transmission in a well-mixed population. Studies comparing models and data question these assumptions and point to the capacity to model heterogeneous, focal transmission as the most important but relatively unexplored component in current theory. Fine-scale heterogeneity causes transmission dynamics to be nonlinear, and poses problems for modeling, epidemiology and measurement. Novel mathematical approaches show how heterogeneity arises from the biology and the landscape on which the processes of mosquito biting and pathogen transmission unfold. Emerging theory focuses attention on the ecological and social context for mosquito blood feeding, the movement of both hosts and mosquitoes, and the relevant spatial scales for measuring transmission and for modeling dynamics and control.
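For reference, a commonly cited textbook form of the vectorial capacity and of Macdonald's R0 (not reproduced in this abstract, and written here in standard notation; exact conventions vary between sources) is

$$ V \;=\; \frac{m a^{2} p^{n}}{-\ln p}, \qquad R_{0} \;=\; \frac{m a^{2} b c\, p^{n}}{r\,(-\ln p)}, $$

where $m$ is the mosquito density per human, $a$ the human-biting rate per mosquito, $p$ the daily mosquito survival probability, $n$ the extrinsic incubation period in days, $b$ and $c$ the mosquito-to-human and human-to-mosquito transmission efficiencies, and $r$ the human recovery rate.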
Abstract:
A control allocation system implements a function that maps the desired control forces generated by the vehicle motion controller into commands for the individual actuators. This article presents a survey of control allocation methods for over-actuated underwater vehicles, with a focus on the mathematical representation and solvability of thruster allocation problems; the methods are applicable to both surface vessels and underwater vehicles. The paper is useful for university students and engineers who want an overview of state-of-the-art control allocation methods, as well as advanced methods for solving more complex problems.
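As an illustration of the allocation problem the survey addresses, here is a minimal sketch of unconstrained least-squares thruster allocation via the Moore-Penrose pseudoinverse; the effectiveness matrix B and its values are hypothetical and not taken from the paper.

```python
import numpy as np

# Hypothetical thruster effectiveness matrix B: maps four actuator commands u
# to three generalized forces tau = B @ u (surge, sway, yaw).
B = np.array([
    [1.0,  1.0, 0.0,  0.0],   # surge contributions
    [0.0,  0.0, 1.0,  1.0],   # sway contributions
    [0.3, -0.3, 0.8, -0.8],   # yaw moment contributions
])

def allocate(tau_desired):
    """Minimum-norm (unconstrained least-squares) allocation: u = B^+ tau."""
    return np.linalg.pinv(B) @ tau_desired

tau = np.array([10.0, 2.0, 0.5])   # desired surge force, sway force, yaw moment
u = allocate(tau)
print("thruster commands:", u)
print("achieved forces:  ", B @ u)  # reproduces tau since B has full row rank
```

Constrained and fault-tolerant variants (actuator saturation, rate limits) are typically posed as quadratic programs, which is the kind of formulation such surveys cover.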
Interaction of psychosocial risk factors explain increased neck problems among female office workers
Abstract:
This study investigated the relationship between psychosocial risk factors and (1) neck symptoms and (2) neck pain and disability as measured by the neck disability index (NDI). Female office workers employed in local private and public organizations were invited to participate, with 333 completing a questionnaire. Data were collected on various risk factors including age, negative affectivity, history of previous neck trauma, physical work environment, and task demands. Sixty-one percent of the sample reported neck symptoms lasting more than 8 days in the last 12 months. The mean NDI of the sample was 15.5 out of 100, indicating mild neck pain and disability. In a hierarchical multivariate logistic regression, low supervisor support was the only psychosocial risk factor associated with the presence of neck symptoms. Similarly, low supervisor support was the only factor associated with the score on the NDI. These associations remained after adjustment for the potential confounders of age, negative affectivity, and physical risk factors. The interaction of job demands, decision authority, and supervisor support was significantly associated with the NDI in the final model, and this association strengthened when those with previous trauma were excluded. Interestingly, and somewhat contrary to initial expectations, as job demands increased, high decision authority had an increasing effect on the NDI when supervisor support was low.
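A schematic of the kind of hierarchical logistic model with a three-way interaction described above; the data are synthetic and the variable names (job_demands, decision_authority, supervisor_support) are illustrative stand-ins for the study's measures.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic data only: n matches the questionnaire sample size quoted above.
rng = np.random.default_rng(0)
n = 333
df = pd.DataFrame({
    "age": rng.integers(20, 60, n),
    "neg_affect": rng.normal(size=n),
    "job_demands": rng.normal(size=n),
    "decision_authority": rng.normal(size=n),
    "supervisor_support": rng.normal(size=n),
})
# Simulated binary outcome: presence of neck symptoms.
lin = -0.5 - 0.4 * df["supervisor_support"] + 0.2 * df["job_demands"]
df["neck_symptoms"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-lin)))

# Confounders plus the psychosocial factors and their three-way interaction.
model = smf.logit(
    "neck_symptoms ~ age + neg_affect"
    " + job_demands * decision_authority * supervisor_support",
    data=df,
).fit(disp=0)
print(model.params)
```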
Abstract:
Electrification of vehicular systems has gained increased momentum in recent years, with particular attention to constant power loads (CPLs). Since a CPL potentially threatens system stability, stability analysis of hybrid electric vehicles with CPLs becomes necessary. A new power buffer configuration with a battery is introduced to mitigate the instability caused by CPLs. Model predictive control (MPC) is applied to regulate the power buffer so as to decouple source and load dynamics. Moreover, MPC provides an optimal trade-off between modification of the load impedance, variation of the dc-link voltage, and battery current ripple. This is particularly important during transients or at the onset of system faults, since the battery response is not very fast. The optimal trade-off becomes even more significant when considering a low-cost power buffer without a battery. This paper analyzes system models for both voltage swell and voltage dip faults. Furthermore, a dual-mode MPC algorithm is implemented in real time, offering improved stability. A comprehensive set of experimental results is included to verify the efficacy of the proposed power buffer.
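A generic, unconstrained model predictive control sketch for a small linear system, to show the receding-horizon structure mentioned above; the two-state model, horizon, and weights are placeholders, not the paper's power-buffer model or its dual-mode algorithm.

```python
import numpy as np

# Placeholder discrete-time model (e.g. dc-link voltage and buffer current deviations).
A = np.array([[0.95, 0.10],
              [-0.05, 0.90]])
B = np.array([[0.0],
              [0.10]])
N = 10                       # prediction horizon
Q = np.diag([10.0, 1.0])     # state deviation weight
R = np.array([[0.1]])        # control effort weight

def mpc_step(x0):
    """Condensed unconstrained MPC: minimize sum_k x_k'Q x_k + u_k'R u_k."""
    nx, nu = A.shape[0], B.shape[1]
    Sx = np.vstack([np.linalg.matrix_power(A, k + 1) for k in range(N)])
    Su = np.zeros((N * nx, N * nu))
    for k in range(N):
        for j in range(k + 1):
            Su[k*nx:(k+1)*nx, j*nu:(j+1)*nu] = np.linalg.matrix_power(A, k - j) @ B
    Qbar, Rbar = np.kron(np.eye(N), Q), np.kron(np.eye(N), R)
    H = Su.T @ Qbar @ Su + Rbar
    f = Su.T @ Qbar @ Sx @ x0
    U = np.linalg.solve(H, -f)     # optimal open-loop input sequence
    return U[:nu]                  # apply only the first move (receding horizon)

x = np.array([1.0, 0.0])           # initial deviation from the operating point
print("first control move:", mpc_step(x))
```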
Abstract:
This thesis investigates the use of building information models for access control and security applications in critical infrastructures and complex building environments. It examines current problems in security management for physical and logical access control and proposes novel solutions that exploit the detailed information available in building information models. The project was carried out as part of the Airports of the Future Project, and the research was modelled on real-world problems identified in collaboration with our industry partners in the project.
Abstract:
This paper addresses the problem of determining optimal designs for biological process models with intractable likelihoods, with the goal of parameter inference. The Bayesian approach is to choose a design that maximises the mean of a utility, where the utility is a function of the posterior distribution; its estimation therefore requires likelihood evaluations. However, many problems in experimental design involve models with intractable likelihoods, that is, likelihoods that can be neither evaluated analytically nor computed in a reasonable amount of time. We propose a novel solution using indirect inference (II), a well-established method in the literature, and the Markov chain Monte Carlo (MCMC) algorithm of Müller et al. (2004). Indirect inference employs an auxiliary model with a tractable likelihood in conjunction with the generative model, the assumed true model of interest, which has an intractable likelihood. Our approach is to estimate a map between the parameters of the generative and auxiliary models, using simulations from the generative model. An II posterior distribution is formed to expedite utility estimation. We also present a modification to the utility that allows the Müller algorithm to sample from a substantially sharpened utility surface, with little computational effort. Unlike competing methods, the II approach can handle complex design problems for models with intractable likelihoods on a continuous design space, with possible extension to many observations. The methodology is demonstrated using two stochastic models: a simple, tractable death process used to validate the approach, and a motivating stochastic model for the population evolution of macroparasites.
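A minimal sketch of the parameter-mapping step that indirect inference relies on, using a pure death process similar to the validating example mentioned above; the auxiliary summary (a fitted log-linear slope) and all settings are illustrative, not the paper's choices.

```python
import numpy as np

rng = np.random.default_rng(1)
t_obs = np.linspace(0.5, 5.0, 10)   # observation times
n0 = 50                              # initial population size

def simulate_death_process(rate):
    """Generative model: pure death process with per-capita death `rate`;
    survivors at time t are Binomial(n0, exp(-rate * t))."""
    return rng.binomial(n0, np.exp(-rate * t_obs))

def fit_auxiliary(counts):
    """Auxiliary model with a tractable fit: least-squares slope of log counts."""
    y = np.log(np.maximum(counts, 1))
    return np.polyfit(t_obs, y, 1)[0]

# Estimate the map between the generative parameter (death rate) and the
# auxiliary parameter (fitted slope) from simulations of the generative model.
thetas = rng.uniform(0.1, 1.0, 200)
phis = np.array([fit_auxiliary(simulate_death_process(th)) for th in thetas])
map_coeffs = np.polyfit(thetas, phis, deg=2)   # simple polynomial map theta -> phi
print("estimated map coefficients:", map_coeffs)

# In the II approach, a map like this lets an auxiliary likelihood stand in for
# the intractable one when estimating design utilities.
```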
Abstract:
Bayesian experimental design is a fast-growing area of research with many real-world applications. As computational power has increased over the years, so has the development of simulation-based design methods, which involve a number of algorithms, such as Markov chain Monte Carlo, sequential Monte Carlo and approximate Bayes methods, enabling more complex design problems to be solved. The Bayesian framework provides a unified approach for incorporating prior information and/or uncertainties regarding the statistical model with a utility function that describes the experimental aims. In this paper, we provide a general overview of the concepts involved in Bayesian experimental design, and focus on describing some of the more commonly used Bayesian utility functions and methods for their estimation, as well as a number of algorithms that are used to search over the design space to find the Bayesian optimal design. We also discuss other computational strategies for further research in Bayesian optimal design.
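A toy Monte Carlo estimate of an expected utility over a one-dimensional design space, to make the "utility function plus search over designs" idea concrete; the Poisson-Gamma model, the negative-posterior-variance utility, and the observation cost are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy design problem: choose an observation window d for a count
# y ~ Poisson(theta * d), with conjugate prior theta ~ Gamma(a, b).
a, b = 2.0, 1.0

def expected_utility(d, n_mc=5000):
    """Monte Carlo estimate of U(d) = E_{theta, y}[u(d, y)], with
    u = -(posterior variance) minus a small cost per unit observation time."""
    theta = rng.gamma(a, 1.0 / b, size=n_mc)    # draws from the prior
    y = rng.poisson(theta * d)                  # draws from the model
    post_var = (a + y) / (b + d) ** 2           # conjugate posterior variance
    return -(post_var.mean() + 0.02 * d)

designs = np.linspace(0.5, 10.0, 20)
utilities = [expected_utility(d) for d in designs]
print("approximately optimal design:", designs[int(np.argmax(utilities))])
```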
Abstract:
A system requiring a waste management license from an enforcement agency has been introduced in many countries. A license system is usually coupled with fines, a manifest, and a disposal tax. However, these policy devices have not been integrated into an optimal policy. In this paper we derive an optimal waste management policy using those policy devices. Waste management policies face three difficult problems: asymmetric information, the heterogeneity of waste management firms, and non-compliance by waste management firms and waste disposers. The optimal policy in this paper overcomes all three problems.
Abstract:
Maintenance decisions for large-scale asset systems are often beyond an asset manager's capacity to handle. The presence of a number of possibly conflicting decision criteria, the large number of possible maintenance policies, and the reality of budget constraints often produce complex problems, where the underlying trade-offs are not apparent to the asset manager. This paper presents the decision support tool "JOB" (Justification and Optimisation of Budgets), which has been designed to help asset managers of large systems assess, select, interpret and optimise the effects of their maintenance policies in the presence of limited budgets. This decision support capability is realized through an efficient, scalable backtracking-based algorithm for the optimisation of maintenance policies, while enabling the user to view a number of solutions near this optimum and explore trade-offs with other decision criteria. To assist the asset manager in selecting between various policies, JOB also provides Multiple Criteria Decision Making capability. In this paper, the JOB tool is presented and its applicability is demonstrated for the maintenance of a complex power plant system.
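A small backtracking sketch of budget-constrained policy selection, in the spirit of the optimisation problem described above; the assets, costs, and benefit scores are invented, and this is not the JOB tool's algorithm.

```python
# Each asset has candidate maintenance policies: (name, cost, benefit score).
assets = [
    [("run-to-failure", 0, 1.0), ("annual service", 5, 3.0), ("monthly service", 12, 4.0)],
    [("run-to-failure", 0, 0.5), ("annual service", 4, 2.5)],
    [("run-to-failure", 0, 2.0), ("overhaul", 20, 6.0)],
]
BUDGET = 25
best = {"benefit": -1.0, "choice": None}

def search(i, cost, benefit, choice):
    """Depth-first backtracking: abandon any branch that exceeds the budget."""
    if cost > BUDGET:
        return
    if i == len(assets):
        if benefit > best["benefit"]:
            best["benefit"], best["choice"] = benefit, list(choice)
        return
    for name, c, gain in assets[i]:
        choice.append(name)
        search(i + 1, cost + c, benefit + gain, choice)
        choice.pop()

search(0, 0, 0.0, [])
print("best policy mix:", best["choice"], "with benefit", best["benefit"])
```

Near-optimal alternatives and additional criteria (as a tool like JOB exposes to the user) could be collected by keeping a ranked list instead of a single best record.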
Abstract:
This paper translates the concepts of sustainable production into three dimensions of economic, environmental and ecological sustainability to analyze optimal production scales by solving optimization problems. Economic optimization seeks input-output combinations that maximize profits. Environmental optimization searches for input-output combinations that minimize the polluting effects of the materials balance on the surrounding environment. Ecological optimization looks for input-output combinations that minimize the cumulative destruction of the entire ecosystem. Using an aggregate space, the framework illustrates that these optimal scales are often not identical because markets fail to account for all negative externalities. Profit-maximizing firms normally operate at scales larger than the optimal scales from the viewpoints of environmental and ecological sustainability; hence, policy interventions are favoured. The framework offers a useful tool for efficiency studies and policy implication analysis. The paper provides an empirical investigation using a data set of rice farms in South Korea.
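A toy numerical illustration of the central claim, that the privately optimal (profit-maximizing) scale exceeds the socially optimal one once external damages are counted; the functional forms and numbers are invented, not taken from the paper.

```python
# Private profit ignores the external damage term; the social objective includes it.
p = 10.0   # output price
c = 0.5    # curvature of private production cost
d = 0.3    # curvature of external (pollution) damage, ignored by the firm

q_private = p / (2 * c)          # argmax of p*q - c*q**2
q_social = p / (2 * (c + d))     # argmax of p*q - c*q**2 - d*q**2
print("profit-maximizing scale:", q_private)   # 10.0
print("socially optimal scale: ", q_social)    # 6.25, smaller as argued above
```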
Abstract:
This paper investigates demodulation of differentially phase modulated signals (DPMS) using optimal HMM filters. The optimal HMM filter presented in the paper is of computational order N^3 per time instant, where N is the number of message symbols. Previously, optimal HMM filters have been of computational order N^4 per time instant. Also, suboptimal HMM filters of computational order N^2 per time instant have been proposed. The approach presented in this paper uses two coupled HMM filters and exploits knowledge of ...
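For orientation, a textbook HMM forward (conditional-mean) filter for a discrete symbol sequence is sketched below; it is not the paper's coupled two-filter construction, and the phase observation model and noise level are invented.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 4                                    # number of message symbols
A = np.full((N, N), 1.0 / N)             # symbol transition probabilities (i.i.d.)
phases = 2 * np.pi * np.arange(N) / N    # nominal symbol phases
sigma = 0.5                              # phase-noise standard deviation (toy)

def likelihood(y, k):
    """Toy observation density of a noisy phase y given symbol k."""
    return np.exp(-0.5 * ((y - phases[k]) / sigma) ** 2)

def forward_step(prior, y):
    """One step of the HMM forward filter: time update, then measurement update."""
    predicted = A.T @ prior
    posterior = predicted * np.array([likelihood(y, k) for k in range(N)])
    return posterior / posterior.sum()

belief = np.full(N, 1.0 / N)
for y in phases[2] + sigma * rng.normal(size=20):   # noisy observations of symbol 2
    belief = forward_step(belief, y)
print("filtered symbol probabilities:", np.round(belief, 3))
```

Per step this generic recursion costs O(N^2); the orders quoted in the abstract refer to the paper's more elaborate coupled-filter structure.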
Abstract:
In this paper conditional hidden Markov model (HMM) filters and conditional Kalman filters (KF) are coupled together to improve demodulation of differentially encoded signals in noisy fading channels. We present an indicator matrix representation for differentially encoded signals and the optimal HMM filter for demodulation. The filter requires O(N^3) calculations per time iteration, where N is the number of message symbols. Decision feedback equalisation is investigated by coupling the optimal HMM filter for estimating the message, conditioned on estimates of the channel parameters, with a KF for estimating the channel states, conditioned on soft-information message estimates. The particular differential encoding scheme examined in this paper is differential phase shift keying. However, the techniques developed can be extended to other forms of differential modulation. The channel model we use allows for multiplicative channel distortions and additive white Gaussian noise. Simulation studies are also presented.
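A scalar Kalman filter sketch for tracking a slowly varying channel gain given (assumed-known) symbols, which is roughly the role the KF plays in the coupled structure described above; the AR(1) channel model and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
a, q, r = 0.99, 1e-3, 0.1        # AR(1) coefficient, process noise, measurement noise
T = 200
h = np.empty(T); h[0] = 1.0
for t in range(1, T):            # simulate the fading channel gain
    h[t] = a * h[t - 1] + np.sqrt(q) * rng.normal()
s = rng.choice([-1.0, 1.0], size=T)           # BPSK-like symbols (treated as known here)
y = s * h + np.sqrt(r) * rng.normal(size=T)   # received samples

h_hat, P = 0.0, 1.0
for t in range(T):
    # Predict
    h_hat, P = a * h_hat, a * a * P + q
    # Update with the time-varying observation "matrix" s[t]
    K = P * s[t] / (s[t] ** 2 * P + r)
    h_hat += K * (y[t] - s[t] * h_hat)
    P *= (1 - K * s[t])
print("final channel estimate vs truth:", round(h_hat, 3), round(h[-1], 3))
```

In the decision-feedback arrangement described above, s[t] would be replaced by soft symbol estimates from the HMM filter rather than known symbols.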
Abstract:
Evaluating agency theory and optimal contracting theory views of corporate philanthropy, we find that as corporate giving increases, shareholders reduce their valuation of firm cash holdings. Dividend increases following the 2003 Tax Reform Act are associated with reduced corporate giving. Using a natural experiment, we find that corporate giving is positively (negatively) associated with CEO charity preferences (CEO shareholdings and corporate governance quality). Evidence from CEO-affiliated charity donations, market reactions to insider-affiliated donations, the relation of giving to CEO compensation, and firm contributions to director-affiliated charities indicates that corporate donations advance CEO interests and suggests misuses of corporate resources that reduce firm value.
Abstract:
This paper presents an efficient algorithm for optimizing the operation of battery storage in a low-voltage distribution network with a high penetration of PV generation. A predictive control solution is presented that uses wavelet neural networks to predict the load and PV generation at hourly intervals for twelve hours into the future. The load and generation forecast, together with the previous twelve hours of load and generation history, is used to assemble a load profile. A diurnal charging profile can be compactly represented by a vector of Fourier coefficients, allowing a direct-search optimization algorithm to be applied. The optimal profile is updated hourly, allowing the state-of-charge profile to respond to changing load forecasts.
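A compact sketch of the profile parameterization described above: a diurnal battery charging profile expressed by a handful of Fourier coefficients and tuned by a direct-search optimizer; the synthetic load/PV curves and the flattening objective are stand-ins for the paper's forecasts and cost function.

```python
import numpy as np
from scipy.optimize import minimize

hours = np.arange(24)
load = 3 + 1.5 * np.sin(2 * np.pi * (hours - 18) / 24)        # kW, synthetic demand
pv = np.clip(4 * np.sin(np.pi * (hours - 6) / 12), 0, None)   # kW, synthetic PV

def battery_profile(c):
    """Charging power (kW) as a truncated Fourier series: mean + two harmonics."""
    return (c[0]
            + c[1] * np.cos(2 * np.pi * hours / 24) + c[2] * np.sin(2 * np.pi * hours / 24)
            + c[3] * np.cos(4 * np.pi * hours / 24) + c[4] * np.sin(4 * np.pi * hours / 24))

def objective(c):
    """Flatten the net feeder demand; penalize non-zero net daily battery energy."""
    charge = battery_profile(c)
    net = load - pv + charge          # charging adds to demand, discharging offsets it
    return np.var(net) + 0.1 * charge.sum() ** 2

res = minimize(objective, x0=np.zeros(5), method="Nelder-Mead")   # direct search
print("optimized Fourier coefficients:", np.round(res.x, 3))
```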
Abstract:
A new mesh adaptivity algorithm that combines a posteriori error estimation with a bubble-type local mesh generation (BLMG) strategy for elliptic differential equations is proposed. The size function used in the BLMG is defined at each vertex during the adaptive process, based on the obtained error estimator. To avoid excessive coarsening and refining in each iterative step, two factor thresholds are introduced into the size function. The advantages of the BLMG-based adaptive finite element method, compared with other known methods, are as follows: refining and coarsening are handled seamlessly within the same framework; the local a posteriori error estimation is easy to implement through the adjacency list of the BLMG method; and at all levels of refinement, the updated triangles remain very well shaped, even if the mesh size at any particular refinement level varies by several orders of magnitude. Several numerical examples of elliptic problems with singularities, in which explicit error estimators are used, verify the efficiency of the algorithm. The analysis of the parameters introduced in the size function shows that the algorithm has good flexibility.
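A sketch of a vertex-wise size function driven by an a posteriori error estimator, with two factor thresholds limiting how much the mesh may be refined or coarsened per adaptive step; the exponent, threshold values, and data are illustrative, not the paper's definitions.

```python
import numpy as np

def size_function(h_old, eta, tol, shrink_min=0.5, grow_max=2.0, p=1.0):
    """Target sizes: h_new = h_old * clamp((tol / eta)**(1/p), shrink_min, grow_max)."""
    factor = (tol / np.maximum(eta, 1e-14)) ** (1.0 / p)
    factor = np.clip(factor, shrink_min, grow_max)   # the two factor thresholds
    return h_old * factor

h = np.full(6, 0.1)                                   # current sizes at mesh vertices
eta = np.array([1e-3, 5e-2, 2e-1, 1e-4, 3e-2, 8e-1])  # local error estimates
print("new target sizes:", size_function(h, eta, tol=5e-2))
```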