884 results for Problem analysis


Relevance: 30.00%

Abstract:

In this paper, the optimal reactive power planning problem under risk is presented. The classical mixed-integer nonlinear model for reactive power planning is expanded into a two-stage stochastic model that accounts for risk. This new model considers uncertainty in the load demand. Risk is quantified by a factor introduced into the objective function, identified as the variance of the random variables. Finally, numerical results illustrate the performance of the proposed model, which is applied to the IEEE 30-bus test system to determine the optimal amount and location of reactive power expansion.
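A risk factor based on the variance of scenario costs can be sketched as a mean-variance objective. The function and scenario numbers below are illustrative assumptions, not the paper's exact formulation:

```python
# Sketch of a mean-variance objective for two-stage stochastic planning.
# The names (scenario_costs, probs) and the risk weight `beta` are
# hypothetical; only the idea (expected cost + beta * variance) is from
# the abstract.

def risk_adjusted_cost(scenario_costs, probs, beta):
    """Expected cost plus beta times the variance of the scenario costs."""
    mean = sum(p * c for p, c in zip(probs, scenario_costs))
    var = sum(p * (c - mean) ** 2 for p, c in zip(probs, scenario_costs))
    return mean + beta * var

# Two equally likely demand scenarios:
costs = [100.0, 140.0]
probs = [0.5, 0.5]
print(risk_adjusted_cost(costs, probs, beta=0.0))  # risk-neutral: 120.0
print(risk_adjusted_cost(costs, probs, beta=0.1))  # risk-averse: 160.0
```

A larger `beta` penalizes plans whose cost varies strongly across demand scenarios, even when their expected cost is the same.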

Relevance: 30.00%

Abstract:

Topological optimization problems based on stress criteria are solved using two techniques in this paper. The first is conventional Evolutionary Structural Optimization (ESO), known as "hard kill" because material is removed discretely; that is, elements under low stress that are being used inefficiently have their constitutive matrices abruptly reduced. The second, proposed in a previous paper, is a variant of the ESO procedure called Smooth ESO (SESO). It is based on the philosophy that if an element is not really necessary to the structure, its contribution to the structural stiffness should diminish gradually until it no longer influences the structure; its removal is thus performed smoothly. This procedure is known as "soft kill"; that is, not all of the elements flagged for removal by the ESO criterion are discarded. The elements returned to the structure must yield a well-conditioned system to be solved in the next iteration, and they are considered important to the optimization process. The elasticity problems are evaluated numerically by finite element analysis; however, instead of conventional quadrilateral finite elements, a plane-stress triangular finite element with high-order modes was implemented for solving complex geometric problems. A number of typical examples demonstrate that the proposed approach is effective for solving problems of two-dimensional elasticity. (C) 2014 Elsevier Ltd. All rights reserved.
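The contrast between hard-kill and soft-kill element updates can be sketched in a few lines. The stress values, threshold, and linear weakening ramp below are illustrative assumptions, not the SESO regularization itself:

```python
import numpy as np

# Toy contrast between hard-kill (ESO) and soft-kill (SESO-style) updates.
# The threshold and the linear ramp are hypothetical illustrations.

def hard_kill(stiffness, stresses, threshold):
    """ESO: elements below the stress threshold are removed outright."""
    factors = np.where(stresses >= threshold, 1.0, 0.0)
    return stiffness * factors

def soft_kill(stiffness, stresses, threshold):
    """SESO-style: low-stress elements are weakened in proportion to how
    far they fall below the threshold, so removal is gradual."""
    ramp = np.clip(stresses / threshold, 0.0, 1.0)
    return stiffness * ramp

k = np.ones(4)
s = np.array([0.2, 0.5, 0.9, 1.5])
print(hard_kill(k, s, threshold=1.0))  # [0. 0. 0. 1.]
print(soft_kill(k, s, threshold=1.0))  # [0.2 0.5 0.9 1. ]
```

Under soft kill, marginal elements keep a small stiffness contribution and can return to the structure in later iterations instead of being discarded at once.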

Relevance: 30.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 30.00%

Abstract:

This paper presents a new methodology for analyzing aeroelastic stability over a continuous range of the flight envelope, with velocity and altitude as varying parameters. The focus of the paper is to demonstrate that linear matrix inequalities can be used to evaluate aeroelastic stability over a region of the flight envelope instead of at a single point, as in classical methods. The proposed methodology can also be used to study whether a system remains stable during an arbitrary motion from one point of the flight envelope to another, i.e., when the problem becomes time-variant. The main idea is to represent the system as a polytopic differential inclusion, using rational function approximation to write the model in the time domain. The theory is outlined and simulations are carried out on the benchmark AGARD 445.6 wing to demonstrate the method. The classical pk-method is used for comparing results and validating the approach. It is shown that this method is efficient at identifying stability regions in the flight envelope. (C) 2014 Elsevier Ltd. All rights reserved.
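The polytopic idea can be illustrated by checking stability of convex combinations of vertex system matrices. The 2x2 matrices below are made up for illustration, and sampling combinations is only a heuristic check; the paper's LMI approach certifies the whole polytope at once:

```python
import numpy as np

# Illustration of a polytopic stability check: sample convex combinations
# of two hypothetical vertex matrices and verify each is Hurwitz (all
# eigenvalues have negative real part). A real analysis would use an LMI
# solver to certify the entire polytope, not pointwise sampling.

def is_hurwitz(A):
    return np.all(np.linalg.eigvals(A).real < 0)

A1 = np.array([[-1.0, 2.0], [0.0, -3.0]])  # vertex 1 of the polytope
A2 = np.array([[-2.0, 0.5], [1.0, -1.0]])  # vertex 2

stable = all(is_hurwitz(a * A1 + (1 - a) * A2)
             for a in np.linspace(0.0, 1.0, 11))
print(stable)  # True for these two vertices
```

Note that stability of the vertices alone does not imply stability of the interior in general, which is exactly why a common Lyapunov certificate via LMIs is valuable.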

Relevance: 30.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 30.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 30.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 30.00%

Abstract:

Electric power distribution systems, particularly those with overhead circuits, operate radially; but since the topology of these systems is meshed, a set of circuits must be disconnected. In this context, the problem of optimal reconfiguration of a distribution system is formulated with the goal of finding a radial topology for the operation of the system. This paper uses experimental tests and preliminary theoretical analysis to show that a radial topology is one of the worst choices if the goal is to minimize power losses in a distribution system. For this reason, it is important to open a theoretical and practical discussion on whether it is worthwhile to operate a distribution system radially at all. This topic is becoming increasingly important in the modern operation of electrical systems, which are required to operate as efficiently as possible, using all available resources to improve and optimize the operation of electric power systems. Experimental tests demonstrate the importance of this issue. (C) 2014 Elsevier Ltd. All rights reserved.
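A toy calculation shows why meshed operation can reduce losses: splitting a load current over parallel branches lowers the total I²R dissipation. The numbers below are illustrative, not from the paper:

```python
# Toy illustration of why closing a tie can reduce ohmic losses: a load
# current I through one branch of resistance R dissipates I^2*R, while
# two equal parallel branches each carry I/2 and together dissipate only
# half as much. The values are made up for illustration.

def ohmic_loss(currents, resistance):
    """Total I^2*R loss over a set of branch currents (equal R)."""
    return sum(i ** 2 * resistance for i in currents)

I, R = 10.0, 0.5
radial = ohmic_loss([I], R)             # one path carries everything
meshed = ohmic_loss([I / 2, I / 2], R)  # two equal parallel paths
print(radial, meshed)  # 50.0 25.0
```

This is the core of the paper's argument: forcing a radial topology forgoes the loss reduction that current sharing over a mesh provides.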

Relevance: 30.00%

Abstract:

The need for renewable energy sources in the face of climate change has led to growing investment in solar collectors. Research in this field has accompanied this expansion, and the evacuated-tube solar collector stands out as an important focus of study. Several works have been published on representing the stratification of the fluid inside the tubes and the reservoir, as well as on analytical modeling of the heat-flow problem. Building on recent publications, this paper proposes a study of solar water heating with evacuated tubes, their operating characteristics and operating parameters. A computational tool will be used to develop this work, namely computational fluid dynamics (CFD) software. With the implemented model in hand, a numerical simulation will be performed to evaluate the behavior of the fluid within this solar collector and possible improvements to be applied to the model.

Relevance: 30.00%

Abstract:

Sparse traffic grooming is a practical problem in heterogeneous, multi-vendor optical WDM networks in which only some of the optical cross-connects (OXCs) have grooming capabilities. Such a network is called a sparse grooming network. The sparse grooming problem under dynamic traffic in optical WDM mesh networks is relatively unexplored. In this work, we propose the maximize-lightpath-sharing multi-hop (MLS-MH) grooming algorithm to support dynamic traffic grooming in sparse grooming networks. We also present an analytical model to evaluate the blocking performance of the MLS-MH algorithm. Simulation results show that MLS-MH outperforms an existing grooming algorithm, the shortest-path single-hop (SPSH) algorithm. The numerical results from the analysis match the simulation closely. The effect of the number of grooming nodes in the network on blocking performance is also analyzed.
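The multi-hop grooming idea — carrying a new request over a chain of existing lightpaths that still have spare capacity, so lightpath sharing is favored over setting up new lightpaths — can be sketched as a search over the virtual topology. The data structures and names below are hypothetical illustrations, not the MLS-MH algorithm itself:

```python
from collections import deque

# Hypothetical sketch of multi-hop grooming on a virtual topology: route a
# request over existing lightpaths with free capacity >= demand, using BFS
# over the lightpath graph. Illustrative only; not the paper's algorithm.

def groom_multi_hop(lightpaths, src, dst, demand):
    """Return a sequence of existing lightpaths carrying `demand` from
    src to dst, or None if the request cannot be groomed."""
    queue = deque([(src, [])])
    visited = {src}
    while queue:
        node, path = queue.popleft()
        if node == dst:
            return path
        for (a, b), free in lightpaths.items():
            if a == node and b not in visited and free >= demand:
                visited.add(b)
                queue.append((b, path + [(a, b)]))
    return None

# Existing lightpaths with spare capacity (in traffic units):
lps = {("A", "B"): 3, ("B", "C"): 2, ("A", "C"): 0}
print(groom_multi_hop(lps, "A", "C", demand=2))  # [('A', 'B'), ('B', 'C')]
```

Here the direct lightpath A-C is full, so the request is groomed over two hops through B; in a sparse grooming network, such multi-hop chaining is only possible at grooming-capable OXCs.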

Relevance: 30.00%

Abstract:

Not long ago, most software was written by professional programmers, who could be presumed to have an interest in software engineering methodologies and in tools and techniques for improving software dependability. Today, however, a great deal of software is written not by professionals but by end users, who create applications such as multimedia simulations, dynamic web pages, and spreadsheets. Applications such as these are often used to guide important decisions or aid in important tasks, and it is important that they be sufficiently dependable, but evidence shows that they frequently are not. For example, studies have shown that a large percentage of the spreadsheets created by end users contain faults. Despite such evidence, until recently relatively little research had been done to help end users create more dependable software. We have been working to address this problem by finding ways to bring at least some of the benefits of formal software engineering techniques to end-user programmers. In this talk, focusing on the spreadsheet application paradigm, I present several of our approaches, emphasizing methodologies that use source-code-analysis techniques to help end users build more dependable spreadsheets. Behind the scenes, our methodologies use static analyses, such as dataflow analysis and slicing, together with dynamic analyses, such as execution monitoring, to support user tasks such as validation and fault localization. I show how, to accommodate the user base of spreadsheet languages, an interface to these methodologies can be provided in a manner that does not require an understanding of the theory behind the analyses, yet supports the interactive, incremental process by which spreadsheets are created. Finally, I present empirical results gathered in the use of our methodologies that highlight several cost-benefit trade-offs and many opportunities for future work.
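The slicing analysis mentioned above can be sketched on a spreadsheet dependency graph: each cell maps to the cells its formula reads, and a backward slice collects everything that can influence a given cell. The cell names and dependencies below are made up for illustration:

```python
# Minimal sketch of backward slicing over a spreadsheet dependency graph.
# `deps` maps each cell to the cells its formula references; the example
# spreadsheet is hypothetical.

def backward_slice(deps, cell):
    """All cells that can transitively influence `cell`."""
    seen = set()
    stack = [cell]
    while stack:
        c = stack.pop()
        for d in deps.get(c, []):
            if d not in seen:
                seen.add(d)
                stack.append(d)
    return seen

deps = {"TOTAL": ["B1", "B2"], "B2": ["A1"], "B1": [], "A1": []}
print(sorted(backward_slice(deps, "TOTAL")))  # ['A1', 'B1', 'B2']
```

For fault localization, a slice like this narrows the set of cells a user must inspect when an output cell such as TOTAL looks wrong.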

Relevance: 30.00%

Abstract:

The Carr-Purcell-Meiboom-Gill (CPMG) pulse sequence has been used in many applications of magnetic resonance imaging (MRI) and low-resolution NMR (LRNMR) spectroscopy. Recently, CPMG was used in online LRNMR measurements that employ long RF pulse trains, causing an increase in probe temperature and, therefore, tuning and matching maladjustments. To minimize this problem, the use of a low-power CPMG sequence based on low refocusing-pulse flip angles (LRFA) was studied experimentally and theoretically. This approach has been used in several MRI protocols to reduce the incident RF power and meet specific absorption rate limits. The results for CPMG with LRFA of 3π/4 (CPMG135), π/2 (CPMG90) and π/4 (CPMG45) were compared with conventional CPMG with π refocusing pulses. For a homogeneous field, with linewidth Δν = 15 Hz, the refocusing flip angles can be as low as π/4 while keeping the error in the measured transverse relaxation time (T2) below 5%. For a less homogeneous magnetic field, Δν = 100 Hz, the choice of the LRFA has to take into account the reduction in the intensity of the CPMG signal and the increase in the time constant of the CPMG decay, which also becomes dependent on the longitudinal relaxation time (T1). We compared the T2 values measured by conventional CPMG and CPMG90 for 30 oilseed species and obtained a good correlation coefficient, r = 0.98. Therefore, for oilseeds, T2 measurements performed with π/2 refocusing pulses (CPMG90), with the same pulse width as conventional CPMG, use only 25% of the RF power. This reduces the heating problem in the probe and the power deposited in the samples. (C) 2011 Elsevier B.V. All rights reserved.
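The 25% figure follows from a simple scaling argument: at fixed pulse width, the flip angle is proportional to the RF field amplitude B1, and pulse power scales as B1², so power relative to a π pulse is (θ/π)². A quick check:

```python
import math

# For a fixed pulse width, flip angle ∝ B1 and RF power ∝ B1^2, so the
# power relative to a conventional π refocusing pulse is (theta/pi)^2.

def relative_power(theta):
    return (theta / math.pi) ** 2

print(relative_power(math.pi))          # 1.0    (conventional CPMG)
print(relative_power(3 * math.pi / 4))  # 0.5625 (CPMG135)
print(relative_power(math.pi / 2))      # 0.25   (CPMG90: 25% of the power)
print(relative_power(math.pi / 4))      # 0.0625 (CPMG45)
```

So CPMG90 deposits a quarter of the RF power of conventional CPMG per pulse, consistent with the abstract's claim.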

Relevance: 30.00%

Abstract:

Background: The evaluation of associations between genotypes and diseases in a case-control framework plays an important role in genetic epidemiology. This paper focuses on evaluating the homogeneity of both genotypic and allelic frequencies. The traditional test used to check allelic homogeneity is known to be valid only under Hardy-Weinberg equilibrium, a property that may not hold in practice. Results: We first describe the flaws of the traditional (chi-squared) tests for both allelic and genotypic homogeneity. Besides the known problem with the allelic procedure, we show that whenever these tests are used, an incoherence may arise: sometimes the genotypic homogeneity hypothesis is not rejected while the allelic hypothesis is. As we argue, this is logically impossible. Some recently proposed methods implicitly rely on the assumption that this does not happen. In an attempt to correct this incoherence, we describe an alternative frequentist approach that is appropriate even when Hardy-Weinberg equilibrium does not hold. It is then shown that the problem remains and is intrinsic to frequentist procedures. Finally, we introduce the Full Bayesian Significance Test for both hypotheses and prove that the incoherence cannot happen with these new tests. To illustrate this, all five tests are applied to real and simulated datasets. Using power analysis, we show that the Bayesian method is comparable to the frequentist one and has the advantage of being coherent. Conclusions: Contrary to more traditional approaches, the Full Bayesian Significance Test for association studies provides a simple, coherent and powerful tool for detecting associations.
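The two classical tables can be sketched as follows: the genotypic test works on the 2x3 case-control table of genotype counts, while the allelic test collapses it into a 2x2 allele table (each AA individual contributes two A alleles, each Aa one of each). The counts below are made up for illustration:

```python
# Sketch of the genotypic vs. allelic homogeneity tables. Counts are
# hypothetical; only the table construction is illustrated here.

def chi2_stat(table):
    """Pearson chi-squared statistic for a contingency table."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    n = sum(rows)
    return sum((table[i][j] - rows[i] * cols[j] / n) ** 2
               / (rows[i] * cols[j] / n)
               for i in range(len(table)) for j in range(len(cols)))

def allele_table(genotypic):
    """Collapse genotype counts (AA, Aa, aa) into allele counts (A, a)."""
    return [[2 * aa + ab, 2 * bb + ab] for aa, ab, bb in genotypic]

genotypes = [[40, 45, 15],   # cases:    AA, Aa, aa
             [30, 50, 20]]   # controls: AA, Aa, aa

print(allele_table(genotypes))  # [[125, 75], [110, 90]]
print(round(chi2_stat(genotypes), 3))               # genotypic statistic
print(round(chi2_stat(allele_table(genotypes)), 3)) # allelic statistic
```

Because the two statistics are referred to different degrees of freedom (2 vs. 1), the two tests can disagree, which is the incoherence the paper analyzes.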

Relevance: 30.00%

Abstract:

In this paper we focus on the application of two alternative mathematical tasks to the teaching and learning of functions with high school students. The tasks were elaborated according to the following methodological approaches: (i) problem solving and/or mathematical investigation, and (ii) a pedagogical proposal which holds that mathematical knowledge develops through a balance between logic and intuition. We employed a qualitative research approach (characterized as a case study) aimed at analyzing the didactic-pedagogical potential of this type of methodology in high school. We found that tasks such as those presented and discussed in this paper provide more meaningful learning for the students, allowing a better conceptual understanding, and become even more powerful when the students' socio-cultural context is considered.