959 results for optimization-based similarity reasoning


Relevance: 30.00%

Abstract:

Composite materials are finding increasing use in primary aerostructures to meet demanding performance targets while reducing environmental impact. This paper presents a finite-element-based preliminary optimization methodology for postbuckling stiffened panels that takes into account the damage mechanisms leading to delamination and subsequent failure by stiffener debonding. A global-local modeling approach is adopted in which the boundary conditions on the local model are extracted directly from the global model. The optimization procedure is based on a genetic algorithm that maximizes damage resistance within the postbuckling regime; this routine is linked to a finite element package and the iterative procedure is automated. For a given loading condition, the procedure optimized the stacking sequence of several areas of the panel, leading to an evolved panel with superior damage resistance compared with nonoptimized designs.
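
To make the optimization loop concrete, here is a minimal sketch of a genetic algorithm over stacking sequences, in the spirit of the procedure described above. Everything here is illustrative: the ply-angle set, region count, GA parameters, and especially the `fitness` stand-in, which in the actual methodology would invoke the global-local finite element analysis.

```python
import random

# Illustrative candidate ply angles and problem sizes (assumptions).
PLY_ANGLES = [0, 45, -45, 90]
N_REGIONS, N_PLIES, POP, GENS = 4, 8, 20, 50

def fitness(layup):
    """Stand-in for the global-local FE analysis: in the paper's routine
    this call would run the finite element package and return a measure
    of damage resistance in the postbuckling regime."""
    return -sum(abs(a) for region in layup for a in region)  # dummy score

def random_layup():
    return [[random.choice(PLY_ANGLES) for _ in range(N_PLIES)]
            for _ in range(N_REGIONS)]

def crossover(a, b):
    cut = random.randrange(1, N_REGIONS)          # swap whole regions
    return a[:cut] + b[cut:]

def mutate(layup, rate=0.1):
    return [[random.choice(PLY_ANGLES) if random.random() < rate else ply
             for ply in region] for region in layup]

population = [random_layup() for _ in range(POP)]
for _ in range(GENS):
    population.sort(key=fitness, reverse=True)
    parents = population[:POP // 2]               # truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]
    population = parents + children

best = max(population, key=fitness)
```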

Relevance: 30.00%

Abstract:

The optimization of cutouts in composite plates was investigated by implementing a procedure known as Evolutionary Structural Optimization. Perforations were introduced into a finite element mesh of the plate, from which one or more cutouts of a predetermined size were evolved. In the examples presented, elements were rejected from around each evolving cutout based on a predefined rejection criterion: the limiting ply within each plate element around the cutout was determined using the Tsai-Hill failure criterion, and elements with values below the product of the average Tsai-Hill number and a rejection criterion were removed. This process was iterated until a steady state was reached; the rejection criterion was then incremented by an evolutionary rate and the above steps repeated until the desired cutout area was achieved. Various plates with differing lay-up and loading parameters were investigated to demonstrate the generality and robustness of the optimization procedure.
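
The rejection-and-evolution cycle lends itself to a compact loop. The sketch below is one reading of the procedure described above, not the authors' code; the callables `analyse` (re-runs the FE model and returns the limiting-ply Tsai-Hill index per element) and `boundary` (returns the active elements adjacent to the evolving cutout) are hypothetical placeholders.

```python
import numpy as np

def eso_cutout(analyse, boundary, n_elements, target_removed,
               rr0=0.01, er=0.01):
    """Sketch of the ESO loop: remove boundary elements whose limiting-ply
    Tsai-Hill index falls below RR times the mean index, iterate to a
    steady state, then increment RR by the evolutionary rate ER."""
    active = np.ones(n_elements, dtype=bool)
    rr = rr0
    while active.sum() > n_elements - target_removed:
        removed_this_pass = True
        while removed_this_pass:                  # iterate to steady state
            th = analyse(active)                  # per-element Tsai-Hill index
            threshold = rr * th[active].mean()    # RR x average Tsai-Hill
            candidates = [e for e in boundary(active) if th[e] < threshold]
            removed_this_pass = bool(candidates)
            active[candidates] = False
        rr += er                                  # increment rejection ratio
    return active
```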

Relevance: 30.00%

Abstract:

This paper presents validated results for the optimization of cutouts in laminated carbon-fibre composite panels, obtained by adapting a recently developed optimization procedure known as Evolutionary Structural Optimization (ESO). An initial small cutout was introduced into each finite element model and elements were removed from around this cutout based on a predefined rejection criterion. In the examples presented, the limiting ply within each plate element around the cutout was determined using the Tsai-Hill failure index, and elements with values below the product of the average Tsai-Hill number and a rejection ratio (RR) were removed. This process was iterated until a steady state was reached; the RR was then incremented by an evolutionary rate (ER) and the above steps repeated until a cutout of the desired area was achieved.
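
For reference, the rejection criterion rests on the classical Tsai-Hill failure index for a ply under in-plane stress, which can be computed directly. The sketch below uses standard notation and is not tied to the paper's implementation.

```python
def tsai_hill_index(s1, s2, t12, X, Y, S):
    """Tsai-Hill failure index for a single ply under in-plane stress:
    s1, s2 are the stresses along and transverse to the fibres, t12 the
    in-plane shear stress; X, Y, S are the corresponding strengths.
    Failure is predicted when the index reaches 1; the ESO scheme above
    removes elements whose limiting-ply index sits far below the mean."""
    return (s1 / X) ** 2 - (s1 * s2) / X ** 2 + (s2 / Y) ** 2 + (t12 / S) ** 2
```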

Relevance: 30.00%

Abstract:

Experimental and numerical studies have shown that the occurrence of abrupt secondary instabilities, or mode-jumps, in a postbuckling stiffened composite panel may initiate structural failure. This study presents an optimisation methodology, using a genetic algorithm and finite element analysis, for the lay-up optimisation of postbuckling composite plates to delay the onset of mode-jump instabilities. A simple and novel approach for detecting mode-jumps is proposed, based on the RMS value of out-of-plane pseudo-velocities at a number of locations distributed over the postbuckling structure.
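
A minimal version of the proposed detection signal might look as follows, assuming out-of-plane displacements at the monitoring points are available at each load step. The finite-difference pseudo-velocity and the threshold choice are assumptions, not the paper's exact formulation.

```python
import numpy as np

def detect_mode_jump(w, d_load, threshold):
    """w: (n_steps, n_points) out-of-plane displacements at monitoring
    points; pseudo-velocity is taken as displacement change per unit
    load increment. The threshold is problem-dependent and assumed to
    be calibrated by the user."""
    pseudo_vel = np.diff(w, axis=0) / d_load       # finite-difference rate
    rms = np.sqrt((pseudo_vel ** 2).mean(axis=1))  # RMS over all points
    jumps = np.flatnonzero(rms > threshold)        # load steps with spikes
    return rms, jumps
```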

Relevance: 30.00%

Abstract:

Decision making is an important element throughout the life-cycle of large-scale projects. Decisions are critical: they have a direct impact on the success and outcome of a project and are affected by many factors, including the certainty and precision of the available information. In this paper we present an evidential reasoning framework which applies Dempster-Shafer Theory and its variant, Dezert-Smarandache Theory, to aid decision makers where the available knowledge may be imprecise, conflicting and uncertain. The framework is novel in that natural-language-based information extraction techniques are used to extract and estimate beliefs from diverse textual information sources, rather than assuming these estimates are already given. Furthermore, we describe an algorithm that determines a set of maximal consistent subsets before fusion occurs in the reasoning framework. This is important because inconsistencies between subsets may produce incorrect or adverse results in the decision-making process. The proposed framework can be applied to problems involving material selection, and a use case from the engineering domain is presented to illustrate the approach.
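
At the core of any such framework sits Dempster's rule of combination. The sketch below implements the classical rule over a small frame; the paper's contribution additionally restricts fusion to maximal consistent subsets and extracts the input masses from text, neither of which is shown here. The material-selection example values are invented for illustration.

```python
def dempster_combine(m1, m2):
    """Dempster's rule for two mass functions over the same frame, each
    a dict mapping frozenset -> mass. Conflicting (empty-intersection)
    mass is discarded and the remainder renormalized."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb               # mass assigned to empty set
    k = 1.0 - conflict                            # normalisation factor
    return {s: v / k for s, v in combined.items()}

# Two sources rating candidate materials {steel, alloy, composite}:
m1 = {frozenset({"steel"}): 0.6, frozenset({"steel", "alloy"}): 0.4}
m2 = {frozenset({"alloy"}): 0.3, frozenset({"steel", "alloy", "composite"}): 0.7}
fused = dempster_combine(m1, m2)
```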

Relevance: 30.00%

Abstract:

This paper presents a novel method that leverages reasoning capabilities in a computer vision system dedicated to human action recognition. The proposed methodology is decomposed into two stages. First, a machine learning algorithm, known as bag of words, gives a first estimate of the action class from video sequences by performing an image feature analysis. These results are then passed to a common-sense reasoning system, which analyses, selects and corrects the initial estimate yielded by the machine learning algorithm. This second stage draws on the knowledge implicit in the rationality that motivates human behaviour. Experiments are performed in realistic conditions, where poor recognition rates from the machine learning techniques are significantly improved by the second stage, demonstrating the value of integrating common-sense capabilities into a computer vision pipeline.
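
A toy rendering of the two-stage pipeline: a bag-of-words scorer followed by a rule-based common-sense correction. The visual-word histograms, the context rules, and the down-weighting factor are all hypothetical stand-ins for the paper's components.

```python
from collections import Counter

def bow_scores(features, prototypes):
    """First stage: a toy bag-of-words classifier. `prototypes` maps each
    action label to a Counter of visual-word frequencies; the score is a
    dot product with the observed feature histogram."""
    hist = Counter(features)
    return {label: sum(hist[w] * proto[w] for w in hist)
            for label, proto in prototypes.items()}

def commonsense_rerank(scores, context, rules):
    """Second stage: `rules` maps a context cue (e.g. an object detected
    in the scene) to actions it makes implausible; those actions are
    penalised before the final decision."""
    for ctx in context:
        for action in rules.get(ctx, []):
            scores[action] *= 0.1                 # down-weight implausible actions
    return max(scores, key=scores.get)
```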

Relevance: 30.00%

Abstract:

In this paper, we propose a multi-camera application capable of processing high-resolution images and extracting features based on color patterns on graphics processing units (GPUs). The goal is to work in real time in the uncontrolled environment of a sporting event such as a football match. Since football players exhibit diverse and complex color patterns, Gaussian Mixture Models (GMMs) are applied as the segmentation paradigm for analyzing live sport images and video. Optimization techniques were also applied to the C++ implementation using profiling tools focused on high performance. Time-consuming tasks were implemented on NVIDIA's CUDA platform, then restructured and enhanced, speeding up the whole process significantly. The resulting code is around 4-11 times faster on a low-cost GPU than a highly optimized C++ version on a central processing unit (CPU) over the same data. Real-time performance has been achieved, processing up to 64 frames per second. An important conclusion of the study is that the application scales with the number of cores on the GPU.
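
As a CPU-side illustration of the segmentation stage, the sketch below fits a Gaussian mixture to the pixel colors of a frame using scikit-learn; the paper's real-time version implements this on CUDA, which is not reproduced here, and the component count is an assumption.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def segment_players(frame, n_colors=5):
    """Assign each pixel of an (H, W, 3) RGB frame to one of `n_colors`
    Gaussian color modes, giving a coarse color segmentation of the
    scene (players, pitch, crowd, ...)."""
    h, w, _ = frame.shape
    pixels = frame.reshape(-1, 3).astype(np.float64)
    gmm = GaussianMixture(n_components=n_colors, covariance_type="full",
                          random_state=0).fit(pixels)
    labels = gmm.predict(pixels)                  # per-pixel color mode
    return labels.reshape(h, w)
```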

Relevance: 30.00%

Abstract:

This study investigated whether the electronic portal imaging (EPI) acquisition process could be optimized and, as a result, whether tolerance and action levels could be set for the PIPSPro QC-3V phantom image quality assessment. The aim of the optimization was to reduce the dose delivered to the patient while maintaining clinically acceptable image quality. This is of interest when images are acquired in addition to the planned patient treatment, rather than using the treatment field during treatment. A series of phantoms was used to assess image quality for different acquisition settings relative to the baseline values obtained at acceptance testing. Eight Varian aS500 EPID systems on four matched Varian 600C/D linacs and four matched Varian 2100C/D linacs were compared for consistency of performance, and images were acquired at the four main orthogonal gantry angles. Images were acquired using a 6 MV beam operating at 100 MU min(-1) in the low-dose acquisition mode. Doses used in the comparison were measured with a Farmer ionization chamber placed at d(max) in solid water. The results demonstrated that the number of reset frames had no influence on image contrast, but the number of frame averages did: the expected increase in noise, with a corresponding decrease in contrast, was observed when the number of frame averages was reduced. The optimal settings for the low-dose acquisition mode with respect to image quality and dose were one reset frame and three frame averages. All patients at the Northern Ireland Cancer Centre are now imaged using one reset frame and three frame averages in the 6 MV 100 MU min(-1) low-dose acquisition mode. Routine EPID QC contrast tolerance (+/-10) and action (+/-20) levels using the PIPSPro phantom, based around expected values of 190 (Varian 600C/D) and 225 (Varian 2100C/D), have been introduced. The dose at d(max) from electronic portal imaging has been reduced by approximately 28%; while image quality has been reduced, the images produced are still clinically acceptable.
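
The underlying trade-off is ordinary averaging statistics: assuming uncorrelated noise between frames, the noise in an averaged portal image falls roughly as 1/sqrt(N) with the number of frame averages, while the imaging dose grows with the number of frames acquired. A one-line sketch of that relation (the single-frame noise value is arbitrary):

```python
import math

def expected_noise(sigma_single_frame, n_averages):
    # Assuming uncorrelated noise between frames, averaging N frames
    # reduces the noise by a factor of sqrt(N).
    return sigma_single_frame / math.sqrt(n_averages)

# e.g. three frame averages versus five, for a single-frame noise of 10:
print(expected_noise(10.0, 3), expected_noise(10.0, 5))
```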

Relevance: 30.00%

Abstract:

Based on the Dempster-Shafer (D-S) theory of evidence and G. Yen's (1989) extension of that theory, the authors propose approaches to representing heuristic knowledge by evidential mapping and to pooling the mass distribution in a complex frame by partitioning that frame using Shafer's partition technique. The authors generalize Yen's model from Bayesian probability theory to the D-S theory of evidence. Based on this generalized model, an extended framework for evidential reasoning systems is briefly specified, in which a semi-graph method is used to describe the heuristic knowledge. The advantage of this method is that it avoids the complexity of graphs without losing their explicitness. The extended framework can be widely used to build expert systems.
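
One simplified reading of pooling a mass distribution over a partitioned frame is shown below: each focal element is mapped to the union of the partition blocks it intersects, and masses with the same image are summed. This is an illustration of the coarsening idea only, not the authors' exact construction.

```python
def pool_over_partition(mass, partition):
    """mass: dict mapping frozenset -> mass on the fine frame;
    partition: list of disjoint frozensets covering the frame.
    Returns the pooled mass distribution on the coarsened frame."""
    pooled = {}
    for focal, m in mass.items():
        # Image of the focal element: union of blocks it intersects.
        image = frozenset().union(*(b for b in partition if b & focal))
        pooled[image] = pooled.get(image, 0.0) + m
    return pooled
```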

Relevance: 30.00%

Abstract:

Computational models of meaning trained on naturally occurring text successfully model human performance on tasks involving simple similarity measures, but they characterize meaning in terms of undifferentiated bags of words or topical dimensions. This has led some to question their psychological plausibility (Murphy, 2002; Schunn, 1999). We present here a fully automatic method for extracting a structured and comprehensive set of concept descriptions directly from an English part-of-speech-tagged corpus. Concepts are characterized by weighted properties, enriched with concept-property types that approximate classical relations such as hypernymy and function. Our model outperforms comparable algorithms in cognitive tasks pertaining not only to concept-internal structures (discovering properties of concepts, grouping properties by property type) but also to inter-concept relations (clustering into superordinates), suggesting the empirical validity of the property-based approach.
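
In the same spirit, a toy pattern-based extractor over a POS-tagged corpus might count an adjective immediately preceding a noun as a candidate (concept, property) pair; the real model uses a far richer battery of patterns and typed relations.

```python
from collections import Counter

def extract_properties(tagged_sentences):
    """Count adjective-noun bigrams as candidate (concept, property)
    pairs from a list of POS-tagged sentences [(word, tag), ...]."""
    pairs = Counter()
    for sent in tagged_sentences:
        for (w1, t1), (w2, t2) in zip(sent, sent[1:]):
            if t1.startswith("JJ") and t2.startswith("NN"):
                pairs[(w2.lower(), w1.lower())] += 1  # (concept, property)
    return pairs

corpus = [[("the", "DT"), ("ripe", "JJ"), ("banana", "NN")],
          [("a", "DT"), ("yellow", "JJ"), ("banana", "NN")]]
print(extract_properties(corpus))
# Counter({('banana', 'ripe'): 1, ('banana', 'yellow'): 1})
```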

Relevance: 30.00%

Abstract:

CCTV systems are broadly deployed in the modern world, yet their impact on anti-social and criminal behaviour has been minimal. Subject reacquisition is a fundamental task for ensuring timely reaction in intelligent surveillance. However, traditional reacquisition based on face recognition alone is not scalable; in this paper we therefore use reasoning techniques that exploit the time-of-flight information between zones of interest, such as airport security corridors, to reduce the computational effort. To improve the accuracy of reacquisition, we also introduce the idea of revision as a method of post-processing. We demonstrate the significance and usefulness of our framework with an experiment that shows much less computational effort and better accuracy.
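
The time-of-flight pruning can be pictured as a simple feasibility window: a subject seen leaving one zone can only reappear in the next within a bounded interval, so expensive face matching is attempted only against sightings inside that window. The field names, travel-time model, and slack factor below are assumptions for illustration.

```python
def feasible_candidates(sightings, zone_b, t_detect, travel_time, slack=0.2):
    """Return only the sightings in zone_b whose timestamps fall inside
    the physically feasible reappearance window for a subject detected
    at time t_detect in the preceding zone. travel_time and slack are
    assumed to be calibrated per zone pair."""
    lo = t_detect + travel_time * (1 - slack)
    hi = t_detect + travel_time * (1 + slack)
    return [s for s in sightings
            if s["zone"] == zone_b and lo <= s["time"] <= hi]
```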

Relevance: 30.00%

Abstract:

A phasor measurement unit (PMU) based wide-area monitoring system (WAMS) is to be placed on a weakly coupled section of a distribution grid with high levels of distributed generation. In anticipation of PMU data, a Siemens PSS/E model of the electrical environment has been used to generate data similar to that expected from the WAMS. These data are then used to create a metric that reflects optimization, control and protection in the region. System states are iterated through, with the most desirable state returning the lowest optimization metric; this state is assessed against the one returned by PSS/E under normal circumstances. This paper investigates the circumstances that trigger the special protection scheme (SPS) in the region, by varying generation between 0 and 110% and compromising the network through line loss under summer-minimum and winter-maximum conditions. It is found that the optimized state can generally tolerate an additional 2 MW of generation (3% of the total) before encroaching on the same thresholds, and in one instance moves the triggering point to 100% of generation output.
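
The state search reduces to simulating each candidate state and keeping the one with the lowest combined metric. The sketch below treats the PSS/E run as an opaque callable; the metric's components and weights are assumptions, as the paper does not specify this exact composition.

```python
def best_state(states, run_psse, weights=(1.0, 1.0, 1.0)):
    """Score each candidate system state by a combined optimization /
    control / protection metric and return the lowest-metric (most
    desirable) state. `run_psse` stands in for the Siemens PSS/E model
    and is assumed to return the named result fields."""
    def metric(result):
        w_opt, w_ctl, w_prot = weights
        return (w_opt * result["losses_mw"]
                + w_ctl * result["voltage_deviation"]
                + w_prot * result["sps_margin_violation"])
    return min(states, key=lambda s: metric(run_psse(s)))
```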

Relevance: 30.00%

Abstract:

A framework supporting fast prototyping as well as tuning of distributed applications is presented. The approach is based on the adoption of a formal model that is used to describe the orchestration of distributed applications. The formal model (Orc, by Misra and Cook) can be used to support semi-formal reasoning about the applications at hand. The paper describes how the framework can be used to derive and evaluate alternative orchestrations of a well-known parallel/distributed computation pattern, and shows how the same formal model can be used to support the generation of prototype distributed application skeletons directly from the application description.
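
As an informal analogue of the kind of orchestration Orc describes, here is an asyncio sketch of a task-farm pattern: a shared queue feeds several concurrent workers, loosely mirroring Orc's parallel composition. This illustrates the pattern only; it is not an implementation of Orc's semantics.

```python
import asyncio

async def farm(tasks, worker, n_workers=4):
    """Run `worker` over `tasks` with a pool of concurrent workers fed
    from a shared queue; running the workers concurrently loosely
    corresponds to Orc's parallel composition, and awaiting each task
    to its sequential combinator."""
    queue = asyncio.Queue()
    for t in tasks:
        queue.put_nowait(t)
    results = []

    async def run_worker():
        while True:
            try:
                item = queue.get_nowait()         # grab the next task, if any
            except asyncio.QueueEmpty:
                return                            # queue drained: worker exits
            results.append(await worker(item))

    await asyncio.gather(*(run_worker() for _ in range(n_workers)))
    return results

# usage: asyncio.run(farm(range(10), some_async_worker))
```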

Relevance: 30.00%

Abstract:

As a comparatively recently invented parallel kinematic machine (PKM) with over-constraints in its kinematic chains, the Exechon has attracted extensive attention from the research community. In contrast to its well-understood kinematics, the stiffness characteristics of the Exechon remain a challenge to analyse because of its structural complexity. To achieve a thorough understanding of these characteristics, this paper proposes an analytical kinetostatic model based on the substructure synthesis technique. The whole PKM system is decomposed into a moving platform subsystem, three limb subsystems and a fixed base subsystem, connected to one another sequentially through the corresponding joints. Each limb body is modeled as a spatial beam of uniform cross-section constrained by two sets of lumped springs. The equilibrium equation of each individual limb assemblage is derived through a finite element formulation and combined with that of the moving platform, derived by the Newtonian method, to construct the governing kinetostatic equations of the system once the deformation compatibility conditions between the moving platform and the limbs have been introduced. By extracting the 6 x 6 block matrix from the inverse of the governing compliance matrix, the stiffness of the moving platform is formulated. The stiffness of the Exechon PKM at a typical configuration, and throughout the workspace, is computed quickly with a piece-by-piece partition algorithm. The numerical simulations reveal a strong position-dependency of the PKM's stiffness, which is symmetric about a work plane owing to the structural features. Finally, the effects of design variables such as structural, dimensional and stiffness parameters on system rigidity are investigated, with the aim of providing useful information for the structural optimization and performance enhancement of the Exechon PKM. It is worth mentioning that the stiffness modeling methodology proposed here can also be applied to other over-constrained PKMs and, with minor revisions, can efficiently evaluate global rigidity over the workspace.
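
The final extraction step can be written down compactly. Assuming the governing compliance matrix of the assembled system has already been built, the sketch below inverts it and reads off the 6 x 6 block on the moving platform's generalized coordinates; the paper's piece-by-piece partition algorithm avoids the dense inverse shown here.

```python
import numpy as np

def platform_stiffness(C_gov, idx):
    """Extract the moving platform's 6x6 stiffness matrix from the
    inverse of the governing compliance matrix C_gov, where `idx` lists
    the six row/column indices of the platform's generalized
    coordinates. Dense inversion is used here for clarity only."""
    K_gov = np.linalg.inv(C_gov)                  # global stiffness matrix
    return K_gov[np.ix_(idx, idx)]                # 6x6 platform block
```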