58 results for Logic-based optimization algorithm


Relevance: 100.00%

Abstract:

Due to the variable and stochastic nature of wind power systems, accurate wind power forecasting plays an important role in developing reliable and economic power system operation and control strategies. Because wind variability is stochastic, Gaussian Process regression has recently been introduced to capture the randomness of wind energy. However, the disadvantages of Gaussian Process regression include its computational complexity and its inability to adapt to time-varying time-series systems. A variant Gaussian Process for time series forecasting is introduced in this study to address these issues. This new method is shown to reduce computational complexity and increase prediction accuracy. It is further proved that the forecasting result converges as the number of available data points approaches infinity. In addition, a teaching-learning-based optimization (TLBO) method is used to train the model and to accelerate the learning rate. The proposed modelling and optimization method is applied to forecast both the wind power generation of Ireland and that of a single wind farm, demonstrating the effectiveness of the proposed method.
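The abstract names TLBO but does not restate the algorithm. As a reference point, here is a minimal sketch of the standard teaching-learning-based optimization loop (teacher phase followed by learner phase) for minimizing a generic objective; the `sphere` objective and all settings are illustrative assumptions, not the authors' configuration.

```python
import numpy as np

def tlbo_minimize(f, bounds, pop_size=20, iters=100, seed=0):
    """Minimize f over a box using teaching-learning-based optimization (TLBO)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    dim = lo.size
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fit = np.apply_along_axis(f, 1, pop)

    for _ in range(iters):
        # Teacher phase: move everyone towards the best solution, away from the mean.
        teacher = pop[fit.argmin()]
        mean = pop.mean(axis=0)
        tf = rng.integers(1, 3)  # teaching factor, 1 or 2
        cand = np.clip(pop + rng.random((pop_size, dim)) * (teacher - tf * mean), lo, hi)
        cand_fit = np.apply_along_axis(f, 1, cand)
        better = cand_fit < fit
        pop[better], fit[better] = cand[better], cand_fit[better]

        # Learner phase: each learner moves relative to a randomly chosen peer.
        for i in range(pop_size):
            j = rng.choice([k for k in range(pop_size) if k != i])
            direction = pop[i] - pop[j] if fit[i] < fit[j] else pop[j] - pop[i]
            cand_i = np.clip(pop[i] + rng.random(dim) * direction, lo, hi)
            cf = f(cand_i)
            if cf < fit[i]:
                pop[i], fit[i] = cand_i, cf

    best = fit.argmin()
    return pop[best], fit[best]

# Illustrative use: minimize a sphere function in 5 dimensions.
sphere = lambda x: float(np.sum(x ** 2))
x_best, f_best = tlbo_minimize(sphere, (np.full(5, -5.0), np.full(5, 5.0)))
```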

Relevance: 100.00%

Abstract:

The goal of this work is to present an efficient CAD-based adjoint process chain for calculating parametric sensitivities (derivatives of the objective function with respect to the CAD parameters) in timescales acceptable for industrial design processes. The approach links parametric design velocities (geometric sensitivities computed from the CAD model) with adjoint surface sensitivities. A CAD-based design velocity computation method has been implemented based on distances between discrete representations of perturbed geometries. This approach differs from other methods in that it works with existing commercial CAD packages (unlike most analytical approaches) and can cope with changes in CAD model topology and face labeling. The proposed method allows parametric sensitivities to be computed using adjoint data at a computational cost which scales with the number of objective functions being considered, while being essentially independent of the number of design variables. The gradient computation is demonstrated on test cases for a Nozzle Guide Vane (NGV) model and a Turbine Rotor Blade model. The results are validated against finite-difference values, and good agreement is shown. This gradient information can be passed to an optimization algorithm, which will use it to update the CAD model parameters.
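The core of such a chain is a discrete surface integral linking the two sensitivity fields: the derivative of an objective J with respect to a CAD parameter is approximated by integrating the adjoint surface sensitivity against the normal component of the design velocity. The sketch below shows that inner product over a discretized surface; the array names and the finite-difference velocity helper are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def parametric_sensitivity(adjoint_sens, design_velocity, face_areas):
    """dJ/d_alpha ~ sum over surface faces of
    (adjoint sensitivity) * (normal design velocity) * (face area).

    adjoint_sens    : (n_faces,) dJ per unit normal surface displacement (adjoint solver)
    design_velocity : (n_faces,) normal surface movement per unit change of the CAD parameter
    face_areas      : (n_faces,) areas of the discrete surface faces
    """
    return float(np.sum(adjoint_sens * design_velocity * face_areas))

def design_velocity_fd(signed_distances, d_alpha):
    """Finite-difference design velocity: signed distances between the baseline and the
    perturbed CAD surface, evaluated at baseline face centroids, per unit parameter step."""
    return signed_distances / d_alpha
```

Because a single adjoint solve supplies `adjoint_sens` for every parameter, looping over parameters touches only the cheap geometric term, which matches the cost scaling described in the abstract.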

Relevance: 100.00%

Abstract:

The VLT-FLAMES Tarantula Survey (VFTS) has secured mid-resolution spectra of over 300 O-type stars in the 30 Doradus region of the Large Magellanic Cloud. A homogeneous analysis of such a large sample requires automated techniques, an approach that will also be needed for the upcoming analysis of the Gaia surveys of the Northern and Southern Hemispheres that will supplement the Gaia measurements. We point out the importance of Gaia for the study of O stars, summarize the O-star science case of VFTS, and present a test of the automated modelling technique using synthetically generated data. This method employs a genetic-algorithm-based optimization technique in combination with FASTWIND model atmospheres. The method is found to be robust and able to recover the main photospheric parameters accurately. Precise wind parameters can be obtained as well, although, as expected, the rate of acceleration of the flow is poorly constrained for dwarf stars.
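For orientation, a genetic-algorithm spectral fit of this kind minimizes a chi-squared misfit between observed and synthetic spectra. The sketch below is a minimal, generic version: the single-Gaussian `model_spectrum`, the chi-squared objective, and all GA settings are illustrative assumptions (the actual method couples the GA to FASTWIND atmosphere models, not to a toy line profile).

```python
import numpy as np

rng = np.random.default_rng(1)

def model_spectrum(params, wave):
    """Toy stand-in for a synthetic spectrum: one Gaussian absorption line
    whose depth and width are the two fit parameters."""
    depth, width = params
    return 1.0 - depth * np.exp(-0.5 * ((wave - 4686.0) / width) ** 2)

def chi2(params, wave, obs, sigma):
    return float(np.sum(((model_spectrum(params, wave) - obs) / sigma) ** 2))

def ga_fit(wave, obs, sigma, lo, hi, pop_size=50, gens=100):
    pop = rng.uniform(lo, hi, size=(pop_size, lo.size))
    for _ in range(gens):
        fit = np.array([chi2(p, wave, obs, sigma) for p in pop])
        def pick():
            # Tournament selection: the better of two random individuals is a parent.
            i, j = rng.integers(0, pop_size, 2)
            return pop[i] if fit[i] < fit[j] else pop[j]
        children = []
        for _ in range(pop_size):
            w = rng.random(lo.size)
            child = w * pick() + (1 - w) * pick()              # blend crossover
            child += rng.normal(0, 0.02, lo.size) * (hi - lo)  # mutation
            children.append(np.clip(child, lo, hi))
        children[0] = pop[fit.argmin()]  # elitism: keep the current best
        pop = np.array(children)
    fit = np.array([chi2(p, wave, obs, sigma) for p in pop])
    return pop[fit.argmin()]

# Illustrative use: recover line depth and width from noisy data.
wave = np.linspace(4680.0, 4692.0, 200)
obs = model_spectrum((0.4, 1.5), wave) + rng.normal(0, 0.01, wave.size)
best = ga_fit(wave, obs, 0.01 * np.ones_like(wave),
              np.array([0.0, 0.5]), np.array([1.0, 5.0]))
```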

Relevance: 100.00%

Abstract:

This study investigates topology optimization of energy-absorbing structures in which material damage is accounted for in the optimization process. The optimization objective is to design the lightest structures that are able to absorb the required mechanical energy. A structural continuity constraint check is introduced that detects when no feasible load path remains in the finite element model, usually as a result of large-scale fracture. This ensures that designs do not fail when loaded under the conditions prescribed in the design requirements. The continuity check is automated and requires no intervention from the analyst once the optimization process is initiated. Consequently, the optimization algorithm evolves an energy-absorbing structure with the minimum structural mass that is not susceptible to global structural failure. A method is also introduced to determine when the optimization process should halt. It identifies when the optimization has plateaued and is no longer likely to produce improved designs in further iterations. This gives the designer a rational way to decide how long to run the optimization, avoiding computational resources wasted on unnecessary iterations. A case study is presented to demonstrate the use of this method.
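The abstract does not detail how the continuity check is implemented. One common way to test whether a feasible load path survives large-scale fracture is a connectivity search over intact elements, from the loaded elements to the supported ones; the following is a minimal sketch under that assumption, not the paper's code.

```python
from collections import deque

def load_path_exists(intact, loaded, supported, neighbours):
    """Breadth-first search over intact finite elements: returns True if any chain
    of intact, adjacent elements connects a loaded element to a supported one.

    intact     : set of element ids that have not failed (e.g. by fracture)
    loaded     : set of element ids where the load is applied
    supported  : set of element ids attached to boundary supports
    neighbours : dict mapping element id -> iterable of adjacent element ids
    """
    frontier = deque(e for e in loaded if e in intact)
    seen = set(frontier)
    while frontier:
        e = frontier.popleft()
        if e in supported:
            return True
        for n in neighbours[e]:
            if n in intact and n not in seen:
                seen.add(n)
                frontier.append(n)
    return False
```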

Relevance: 100.00%

Abstract:

In previous papers, we have presented a logic-based framework based on fusion rules for merging structured news reports. Structured news reports are XML documents in which the text entries are restricted to individual words or simple phrases, such as names and domain-specific terminology, together with numbers and units. We assume structured news reports do not require natural language processing. Fusion rules are a form of scripting language that defines how structured news reports should be merged. The antecedent of a fusion rule is a call to investigate the information in the structured news reports and the background knowledge, and the consequent of a fusion rule is a formula specifying an action to be undertaken to form a merged report. It is expected that a set of fusion rules is defined for any given application. In this paper we extend the approach to handle probability values, degrees of belief, or necessity measures associated with text entries in the news reports. We present the formal definition for each of these types of uncertainty and explain how they can be handled using fusion rules. We also discuss methods for detecting inconsistencies among sources.
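As a loose illustration of the antecedent/consequent idea (not the authors' fusion-rule language), the sketch below merges two toy XML reports with one hard-coded rule: the antecedent checks that the reports concern the same city, and the consequent builds a merged report that resolves a conflicting numeric entry. The report structure and the resolution action are invented for illustration.

```python
import xml.etree.ElementTree as ET

report1 = ET.fromstring("<report><city>Athens</city><magnitude>6.0</magnitude></report>")
report2 = ET.fromstring("<report><city>Athens</city><magnitude>6.2</magnitude></report>")

def same_city(r1, r2):
    """Antecedent: both reports concern the same city."""
    return r1.findtext("city") == r2.findtext("city")

def merge_max_magnitude(r1, r2):
    """Consequent: form a merged report, resolving conflicting magnitudes
    by keeping the larger value (one possible resolution action)."""
    merged = ET.Element("report")
    ET.SubElement(merged, "city").text = r1.findtext("city")
    mag = max(float(r1.findtext("magnitude")), float(r2.findtext("magnitude")))
    ET.SubElement(merged, "magnitude").text = str(mag)
    return merged

if same_city(report1, report2):
    print(ET.tostring(merge_max_magnitude(report1, report2), encoding="unicode"))
```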

Relevance: 100.00%

Abstract:

With the rapid growth in the quantity and complexity of scientific knowledge available to scientists and allied professionals, the problems associated with harnessing this knowledge are well recognized. Some of these problems result from the uncertainties and inconsistencies that arise in this knowledge; others arise from its heterogeneous and informal formats. To address these problems, developments in the application of knowledge representation and reasoning technologies allow scientific knowledge to be captured in logic-based formalisms. Using such formalisms, we can reason with the uncertainty and inconsistency, so that automated techniques can be used for querying and combining scientific knowledge. Furthermore, by harnessing background knowledge, the querying and combining tasks can be carried out more intelligently. In this paper, we review some of the significant proposals for formalisms for representing and reasoning with scientific knowledge.

Relevance: 100.00%

Abstract:

G protein-coupled receptors (GPCRs) represent a major focus in functional genomics programs and drug development research, but their important potential as drug targets contrasts with the still limited data available concerning their activation mechanism. Here, we investigated the activation mechanism of the cholecystokinin-2 receptor (CCK2R). The three-dimensional structure of inactive CCK2R was homology-modeled on the basis of crystal coordinates of inactive rhodopsin. Starting from the inactive CCK2R modeled structure, active CCK2R (namely cholecystokinin-occupied CCK2R) was modeled by means of steered molecular dynamics in a lipid bilayer and by using available data from other GPCRs, including rhodopsin. By comparing the modeled structures of the inactive and active CCK2R, we identified changes in the relative position of helices and networks of interacting residues, which were expected to stabilize either the active or inactive states of CCK2R. Using targeted molecular dynamics simulations capable of converting CCK2R from the inactive to the active state, we delineated structural changes at the atomic level. The activation mechanism involved significant movements of helices VI and V, a slight movement of helices IV and VII, and changes in the position of critical residues within or near the binding site. The mutation of key amino acids yielded inactive or constitutively active CCK2R mutants, supporting this proposed mechanism. Such progress in the refinement of the CCK2R binding site structure and in knowledge of CCK2R activation mechanisms will enable target-based optimization of nonpeptide ligands.

Relevance: 100.00%

Abstract:

Belief revision characterizes the process of revising an agent's beliefs when receiving new evidence. In the field of artificial intelligence, revision strategies have been extensively studied in the context of logic-based formalisms and probability kinematics. However, so far there is not much literature on this topic in evidence theory. The combination rules proposed so far in the theory of evidence, especially Dempster's rule, are symmetric: they rely on the basic assumption that the pieces of evidence being combined are on a par, i.e. play the same role. When one source of evidence is less reliable than another, it is possible to discount it, after which a symmetric combination operation is still used. In the case of revision, by contrast, the idea is to let the prior knowledge of an agent be altered by some input information, so the change problem is intrinsically asymmetric. Assuming the input information is reliable, it should be retained, while the prior information should be changed minimally to that effect. To deal with this issue, this paper defines the notion of revision for the theory of evidence in such a way as to bring together probabilistic and logical views. Several previously proposed revision rules are reviewed, and we advocate one of them as better corresponding to the idea of revision. It is extended to cope with inconsistency between prior and input information. It reduces to Dempster's rule of combination, just as revision in the sense of Alchourrón, Gärdenfors, and Makinson (AGM) reduces to expansion, when the input is strongly consistent with the prior belief function. Properties of this revision rule are also investigated, and it is shown to generalize Jeffrey's rule of updating, Dempster's rule of conditioning, and a form of AGM revision.
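For reference, Dempster's rule of combination (the limiting case mentioned above) conjunctively pools two mass functions and renormalizes by the conflict. A minimal sketch over a small frame of discernment follows, with mass functions represented as dicts keyed by frozensets; the two example sources are invented.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: m(A) = sum over B ∩ C = A of m1(B) * m2(C) / (1 - K),
    where K is the total mass assigned to empty intersections (the conflict)."""
    combined, conflict = {}, 0.0
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        a = b & c
        if a:
            combined[a] = combined.get(a, 0.0) + mb * mc
        else:
            conflict += mb * mc
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence: rule undefined")
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

# Two sources over the frame {rain, sun}:
m1 = {frozenset({"rain"}): 0.7, frozenset({"rain", "sun"}): 0.3}
m2 = {frozenset({"sun"}): 0.4, frozenset({"rain", "sun"}): 0.6}
print(dempster_combine(m1, m2))
```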

Relevance: 100.00%

Abstract:

This paper presents an approach which enables new parameters to be added to a CAD model for optimization purposes. It aims to remove a common roadblock to CAD-based optimization, where the parameterization of the model does not offer the shape sufficient flexibility for a truly optimized shape to be created. A technique has been developed which uses adjoint-based sensitivity maps to predict the sensitivity of performance to the addition of four different feature types to a model, allowing the feature providing the greatest benefit to be selected. The optimum position at which to add the feature is also discussed. It is anticipated that the approach could be used to iteratively add features to a model, providing greater flexibility in its shape and allowing the newly added parameters to be used as design variables in a subsequent shape optimization.
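A minimal sketch of the selection step is given below, assuming the sensitivity map and each candidate feature's normal displacement field are available as per-face arrays; the names and the first-order benefit estimate are assumptions for illustration, not the paper's code.

```python
import numpy as np

def predicted_benefit(sens_map, feature_velocity, face_areas):
    """First-order estimate of the objective change from adding a feature:
    surface integral of the adjoint sensitivity map against the feature's
    normal displacement field."""
    return float(np.sum(sens_map * feature_velocity * face_areas))

def best_feature(sens_map, candidate_velocities, face_areas):
    """Rank candidate feature types (e.g. the four discussed above) by predicted
    benefit and return the index of the most beneficial one."""
    gains = [predicted_benefit(sens_map, v, face_areas) for v in candidate_velocities]
    return int(np.argmin(gains))  # most negative = largest predicted improvement
```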

Relevance: 100.00%

Abstract:

As a class of defects in a software requirements specification, inconsistency has been widely studied in both requirements engineering and software engineering. It has been increasingly recognized that maintaining consistency alone often results in other types of non-canonical requirements, including incompleteness of a requirements specification, vague requirements statements, and redundant requirements statements. It is therefore desirable for inconsistency handling to take the related non-canonical requirements into account. To address this issue, we propose an intuitive generalization of the logical techniques for handling inconsistency to techniques suitable for managing non-canonical requirements, i.e. ones that deal with incompleteness and redundancy in addition to inconsistency. We first argue that measuring non-canonical requirements plays a crucial role in handling them effectively. We then present a measure-driven logic framework for managing non-canonical requirements. The framework consists of five main parts: identifying non-canonical requirements, measuring them, generating candidate proposals for handling them, choosing commonly acceptable proposals, and revising the requirements according to the chosen proposals. This generalization can be considered an attempt to handle non-canonical requirements alongside logic-based inconsistency handling in requirements engineering.
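As one concrete instance of such logical techniques, a requirement expressed in propositional logic is redundant when it is entailed by the rest of the specification. The sketch below checks this with sympy; the three example requirements are invented, and this is only an illustration of the logical idea, not the framework's measures.

```python
from sympy import symbols
from sympy.logic.boolalg import And, Implies, Not
from sympy.logic.inference import satisfiable

door_open, alarm, logged_in = symbols("door_open alarm logged_in")

# A toy specification: each formula is one requirement statement.
spec = [
    Implies(door_open, alarm),                        # R1: opening the door raises the alarm
    Implies(Not(logged_in), Not(door_open)),          # R2: only logged-in users may open the door
    Implies(And(door_open, Not(logged_in)), alarm),   # R3: entailed by R1, hence redundant
]

def is_redundant(spec, i):
    """R_i is redundant iff the other requirements entail it,
    i.e. (others AND NOT R_i) is unsatisfiable."""
    others = And(*(r for j, r in enumerate(spec) if j != i))
    return not satisfiable(And(others, Not(spec[i])))

for i in range(len(spec)):
    print(f"R{i + 1} redundant: {bool(is_redundant(spec, i))}")
```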

Relevance: 100.00%

Abstract:

When multiple sources provide information about the same unknown quantity, their fusion into a synthetic, interpretable message is often a tedious problem, especially when the sources are conflicting. In this paper, we propose to use possibility theory and the notion of maximal coherent subsets, often used in logic-based representations, to build a fuzzy belief structure that is instrumental both for extracting useful information about various features of the information conveyed by the sources and for compressing this information into a unique possibility distribution. Extensions and properties of the basic fusion rule are also studied.
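For intuition, consider interval-valued sources: a maximal coherent subset is then a maximal group of sources whose intervals share a common point. The sketch below enumerates these groups and the consensus interval of each; interval inputs and the endpoint sweep are deliberate simplifications of the possibilistic setting described above.

```python
def maximal_coherent_subsets(intervals):
    """Each source i asserts the value lies in intervals[i] = (a_i, b_i).
    A coherent subset has a non-empty common intersection; by the Helly property
    of intervals, every maximal coherent subset is exactly the set of intervals
    covering some left endpoint, so sweeping left endpoints finds them all."""
    candidates = []
    for a, _ in intervals:
        group = frozenset(i for i, (lo, hi) in enumerate(intervals) if lo <= a <= hi)
        if group:
            candidates.append(group)
    # Keep only groups not strictly contained in another candidate group.
    maximal = [g for g in set(candidates) if not any(g < h for h in candidates)]
    return [(sorted(g),
             (max(intervals[i][0] for i in g), min(intervals[i][1] for i in g)))
            for g in maximal]

# Three sources, the third conflicting with the first:
print(maximal_coherent_subsets([(0.0, 2.0), (1.0, 3.0), (2.5, 4.0)]))
# -> groups {0,1} with consensus (1.0, 2.0) and {1,2} with consensus (2.5, 3.0)
```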

Relevance: 100.00%

Abstract:

The optimization of full-scale biogas plant operation is of great importance in making biomass a competitive source of renewable energy. The implementation of innovative control and optimization algorithms, such as Nonlinear Model Predictive Control, requires an online estimation of the operating states of biogas plants. This state estimation allows for optimal control and operating decisions according to the actual state of a plant. In this paper such a state estimator is developed using a calibrated simulation model of a full-scale biogas plant, which is based on the Anaerobic Digestion Model No. 1. The use of advanced pattern recognition methods shows that model states can be predicted from basic online measurements such as biogas production, CH4 and CO2 content in the biogas, pH value, and substrate feed volume of known substrates. The machine learning methods used are trained and evaluated using synthetic data created with the biogas plant model, simulated over a wide range of possible plant operating regions. Results show that the operating state vector of the modelled anaerobic digestion process can be predicted with an overall accuracy of about 90%. This facilitates the application of state-based optimization and control algorithms on full-scale biogas plants and therefore fosters the production of eco-friendly energy from biomass.
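Cast as supervised learning, such a state estimator maps the online measurements to the model's state vector. The sketch below uses a random forest on placeholder data standing in for the ADM1 simulations; the feature order follows the measurements listed above, but the data-generating formulas and model choice are assumptions, not the paper's setup.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder for ADM1 simulation output: each row is one operating point.
# Inputs: biogas flow, CH4 %, CO2 %, pH, substrate feed volume.
n = 2000
X = rng.uniform([500, 45, 25, 6.5, 10], [3000, 65, 45, 8.0, 100], size=(n, 5))
# Placeholder 2-component state vector; in the paper the targets come from
# the calibrated ADM1 model, not from a closed-form expression.
Y = np.column_stack([0.002 * X[:, 0] - 0.5 * X[:, 3],
                     0.01 * X[:, 4] + 0.1 * X[:, 2]]) + rng.normal(0, 0.1, (n, 2))

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.25, random_state=0)
est = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, Y_tr)
print("R^2 on held-out operating points:", est.score(X_te, Y_te))
```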

Relevance: 100.00%

Abstract:

This paper presents an Invariant Information Local Sub-map Filter (IILSF) as a technique for consistent Simultaneous Localisation and Mapping (SLAM) in a large environment. It harnesses the benefits of the sub-map technique to improve the consistency and efficiency of Extended Kalman Filter (EKF) based SLAM. Instead of incorporating every observation directly into the global map, the IILSF makes use of invariant information obtained from the estimated locations of features in independent sub-maps, and the global map is updated at regular intervals. Applying this technique to the EKF-based SLAM algorithm: (a) reduces the computational complexity of maintaining the global map estimates and (b) simplifies the transformation complexities and data association ambiguities usually experienced in fusing sub-maps together. Simulation results show that the method is able to accurately fuse local map observations into an efficient and consistent global map, while significantly reducing computational cost and data association ambiguities.
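The periodic global update can be pictured as information-form fusion of independent sub-map estimates of the same feature; a minimal sketch for one 2-D landmark follows (the example covariances are invented, and this is a generic Gaussian fusion step, not the IILSF's full machinery).

```python
import numpy as np

def fuse_estimates(estimates):
    """Fuse independent Gaussian estimates (x_i, P_i) of the same feature in
    information form: P = (sum P_i^-1)^-1,  x = P * sum(P_i^-1 @ x_i)."""
    info = sum(np.linalg.inv(P) for _, P in estimates)
    info_vec = sum(np.linalg.inv(P) @ x for x, P in estimates)
    P = np.linalg.inv(info)
    return P @ info_vec, P

# Two sub-maps observed the same landmark with different uncertainty:
sub_a = (np.array([10.2, 5.1]), np.diag([0.10, 0.40]))
sub_b = (np.array([10.0, 5.3]), np.diag([0.30, 0.10]))
x, P = fuse_estimates([sub_a, sub_b])
print("fused position:", x)
```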

Relevance: 100.00%

Abstract:

As is now well established, a first-order expansion of the Hohenberg-Kohn total energy density functional about a trial input density, namely the Harris-Foulkes functional, can be used to rationalize a non-self-consistent tight binding model. If the expansion is taken to second order, then the energy and electron density matrix need to be calculated self-consistently, and from this functional one can derive a charge self-consistent tight binding theory. In this paper we have used this to describe a polarizable ion tight binding model which has the benefit of treating charge transfer in point multipoles. This admits a ready description of ionic polarizability and crystal field splitting. In constructing such a model it is necessary to find a number of parameters that mimic their more exact counterparts in the density functional theory. We describe in detail how this is done using a combination of intuition, exact analytical fitting, and a genetic optimization algorithm. Having obtained the model parameters, we show that this constitutes a transferable scheme that can be applied rather universally to small and medium-sized organic molecules. We show that the model gives a good account of the static structural and dynamic vibrational properties of a library of molecules, and finally we demonstrate its capability with a real-time simulation of an enolization reaction in aqueous solution. In two subsequent papers, we show that the model is a great deal more general, in that it describes solvents and solid substrates, and that we have therefore created a self-consistent quantum mechanical scheme that may be applied to simulations in heterogeneous catalysis.

Relevance: 100.00%

Abstract:

AgentSpeak is a logic-based programming language, based on the Belief-Desire-Intention (BDI) paradigm, suitable for building complex agent-based systems. To limit the computational complexity, agents in AgentSpeak rely on a plan library to reduce the planning problem to the much simpler problem of plan selection. However, such a plan library is often inadequate when an agent is situated in an uncertain environment. In this paper, we propose the AgentSpeak+ framework, which extends AgentSpeak with a mechanism for probabilistic planning. The beliefs of an AgentSpeak+ agent are represented using epistemic states to allow an agent to reason about its uncertain observations and the uncertain effects of its actions. Each epistemic state consists of a POMDP, used to encode the agent’s knowledge of the environment, and its associated probability distribution (or belief state). In addition, the POMDP is used to select the optimal actions for achieving a given goal, even when facing uncertainty.
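The probability distribution attached to each epistemic state is the standard POMDP belief state; after taking action a and observing o, it is updated by the usual Bayes filter, b'(s') ∝ O(o | s', a) Σ_s T(s' | s, a) b(s). A minimal sketch of that update follows; the tiny transition and observation tables are invented, and this shows only the generic POMDP step, not the AgentSpeak+ integration.

```python
import numpy as np

def belief_update(b, T, O, a, o):
    """Bayes filter over a discrete POMDP.
    b : (n_states,) current belief state
    T : (n_actions, n_states, n_states) transition probabilities T[a, s, s']
    O : (n_actions, n_states, n_obs)    observation probabilities O[a, s', o]
    """
    predicted = b @ T[a]             # sum_s T(s'|s,a) b(s)
    unnorm = O[a, :, o] * predicted  # weight by the observation likelihood
    return unnorm / unnorm.sum()     # renormalize

# Toy 2-state, 2-action, 2-observation POMDP:
T = np.array([[[0.9, 0.1], [0.2, 0.8]],
              [[0.5, 0.5], [0.5, 0.5]]])
O = np.array([[[0.8, 0.2], [0.3, 0.7]],
              [[0.5, 0.5], [0.5, 0.5]]])
b = np.array([0.5, 0.5])
print(belief_update(b, T, O, a=0, o=1))
```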