20 results for UNCONSTRAINED MINIMIZATION

at Universidade Federal do Rio Grande do Norte (UFRN)


Relevance:

10.00%

Publisher:

Abstract:

The use of flexibility in façade design makes façades adaptable to adverse weather conditions, resulting both in the minimization of environmental discomfort and in improved energy efficiency. The present study highlights the potential of flexible façades as a resource to reduce the rigidity and formal repetition usually found in condominiums of standardized houses; as such, the work presented herein contributes to the study of architectural design strategies for adapting and integrating buildings into the local climate context. Two façade options were designed using bionics and kinetics, and their applications in architectural construction, as references. This resulted in two lightweight and dynamic structures that meet comfort requirements through combinations of movements which control the impact of solar radiation and of cooling on the environment. The efficacy and technical functionality of the façades were tested with comfort-analysis and graphic-computation software, as well as with physical models. Thus, the current research contributes to the improvement of architectural solutions based on passive energy strategies, offering both better quality for users and greater sustainability for the planet.

Relevance:

10.00%

Publisher:

Abstract:

An unfolding method for linear intercept distributions and section area distributions was implemented for structures with spherical grains. Although the unfolding routine depends on grain shape, structures with spheroidal grains can also be treated by this routine; grains of non-spheroidal shape can be treated only approximately. A software package was developed in two parts: the first calculates the probability matrix, and the second uses this matrix and minimizes the chi-square. The results are presented with any number of size classes, as required. The probability matrix was determined by means of linear intercept and section area distributions created by computer simulation; using curve fitting, the probability matrix for spheres of any size could be determined. Two kinds of tests were carried out to prove the efficiency of the technique. The theoretical tests represent ideal cases, in which the software was able to recover the proposed grain size distribution exactly. In the second test, a structure was simulated in the computer, and images of its slices were used to produce the corresponding linear intercept and section area distributions, which were then unfolded. This test better simulates reality, and the results show deviations from the real size distribution caused by statistical fluctuation. The unfolding of the linear intercept distribution works perfectly, but the unfolding of the section area distribution does not, due to a failure in the chi-square minimization: the minimization method uses a matrix inversion routine, and the matrix generated by this procedure cannot be inverted. Another minimization method must be used.
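One way around a non-invertible matrix in the chi-square step is an iterative solver that never forms an inverse. The sketch below is a generic projected-gradient non-negative least-squares routine, not the software described above; the 3×3 probability matrix and the size distribution are made up for illustration. It minimizes ||Px − b||² subject to x ≥ 0:

```python
import numpy as np

def nnls_projected_gradient(P, b, iters=5000):
    """Minimize ||P x - b||^2 subject to x >= 0 without inverting P."""
    # step size from the largest singular value of P (Lipschitz constant)
    step = 1.0 / np.linalg.norm(P, 2) ** 2
    x = np.zeros(P.shape[1])
    for _ in range(iters):
        grad = P.T @ (P @ x - b)              # gradient of the quadratic
        x = np.maximum(0.0, x - step * grad)  # project onto x >= 0
    return x

# hypothetical 3-class example: recover a known size distribution
P = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])
x_true = np.array([1.0, 0.5, 2.0])
x_est = nnls_projected_gradient(P, P @ x_true)
```

Because the iteration only uses products with P and its transpose, it remains applicable when P is too ill-conditioned for direct inversion.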

Relevance:

10.00%

Publisher:

Abstract:

The fast pyrolysis of lignocellulosic biomass is a thermochemical conversion process for energy production that has become very attractive due to the energetic use of its products: gas (CO, CO2, H2, CH4, etc.), liquid (bio-oil) and charcoal. Bio-oil is the main product of fast pyrolysis, and its final composition and characteristics are intrinsically related to the quality of the biomass (ash content, moisture, content of cellulose, hemicellulose and lignin) and to the efficiency of removal of oxygenated compounds, which cause undesirable features such as increased viscosity, instability, corrosiveness and low calorific value. The oxygenates originate in the conventional biomass pyrolysis process, and the use of solid catalysts allows the minimization of these products, improving bio-oil quality. The present study aims to evaluate the products of the catalytic pyrolysis of elephant grass (Pennisetum purpureum Schum) using solid catalysts such as tungsten oxides, supported or not on mesoporous materials such as MCM-41 and silica derived from rice husk ash, in order to reduce the oxygenates produced in pyrolysis. Treatment of the biomass by washing with heated water (CEL) or with an acid solution (CELix), and the application of tungsten catalysts to the vapors from the pyrolysis process, were designed to improve the quality of the pyrolysis products. Conventional and catalytic pyrolysis of the biomass was performed in a micro-pyrolyzer, Py-5200, coupled to GC/MS. The synthesized catalysts were characterized by X-ray diffraction, infrared spectroscopy, X-ray fluorescence, temperature-programmed reduction and thermogravimetric analysis. Kinetic studies applying the Flynn and Wall model were performed in order to evaluate the apparent activation energy of holocellulose thermal decomposition in the elephant grass samples (CE, CEL and CELix). The results show the effectiveness of the treatment process in reducing the ash content, and a decrease in the apparent activation energy of these samples was also observed. The catalytic pyrolysis process converted most of the oxygenated compounds into aromatics such as benzene, toluene, ethylbenzene, etc.
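The Flynn and Wall model mentioned above is an isoconversional method: at a fixed conversion, the base-10 log of the heating rate is approximately linear in 1/T with slope −0.457·Ea/R. The sketch below uses made-up values for the intercept and the activation energy to generate synthetic heating-rate data, then recovers Ea from the fitted slope:

```python
import numpy as np

R = 8.314          # J/(mol K), gas constant
Ea_true = 150e3    # J/mol, assumed apparent activation energy
C = 10.0           # hypothetical intercept of the Flynn-Wall-Ozawa line

# heating rates (K/min) and the reciprocal temperatures at fixed conversion
# that a material with Ea_true would show under the linearized relation
betas = np.array([5.0, 10.0, 20.0, 40.0])
inv_T = (C - np.log10(betas)) * R / (0.457 * Ea_true)

# isoconversional fit: slope of log10(beta) versus 1/T gives Ea
slope, _ = np.polyfit(inv_T, np.log10(betas), 1)
Ea_est = -slope * R / 0.457
```

Repeating the fit at several conversion levels, as is usual with TGA data, gives Ea as a function of conversion.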

Relevance:

10.00%

Publisher:

Abstract:

This work aims to analyze the cost efficiency of producers in the Baixo-Açu irrigation project and to identify the determinants of this efficiency. To achieve these goals, a cost frontier was estimated in a first stage by the non-parametric method of Data Envelopment Analysis (DEA), and the producers' efficiency scores were measured. In the second stage, a Tobit regression model was used to estimate a cost-inefficiency function, and the factors associated with the waste of resources were identified. Among the results, a high level of resource waste was observed, representing more than 54% of the effective cost. Among the inputs with the highest waste are energy, herbicides, pesticides and chemical fertilizers. In general, the producers showed a low efficiency level, and only two of the seventy-five surveyed reached the cost-minimization frontier. These results reveal, to some extent, that the irrigated fruit growers in the Baixo-Açu project do not seek to minimize production costs. It was also observed that the reduction of resource waste, and thus of cost inefficiency, is associated with the farmer's education, experience in agriculture, and access to technical assistance and credit.
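In the special case of a single input and a single output under constant returns to scale, the input-oriented DEA efficiency reduces to each producer's output/input ratio normalized by the best observed ratio. The sketch below illustrates only this degenerate case with made-up producers; the study itself uses multiple inputs and a full DEA cost frontier:

```python
import numpy as np

def dea_ccr_single(inputs, outputs):
    """Input-oriented CCR efficiency for the single-input, single-output
    case, where DEA reduces to normalizing each producer's output/input
    ratio by the best ratio observed (the frontier)."""
    ratio = np.asarray(outputs, float) / np.asarray(inputs, float)
    return ratio / ratio.max()

# hypothetical producers: effective cost (input) and production value (output)
cost = np.array([100.0, 80.0, 120.0, 90.0])
value = np.array([50.0, 48.0, 45.0, 27.0])
eff = dea_ccr_single(cost, value)   # a score of 1.0 marks a frontier producer
```

A score below 1.0 is read as the proportion of cost the producer would need to keep (at the same output) to reach the frontier, the same interpretation used for the efficiency scores above.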

Relevance:

10.00%

Publisher:

Abstract:

Remote communities; the absence of artifacts and the minimization of the exacerbated consumption of modernity; desires that spread beyond what reality can provide. Expressions like these run through this paper, which focuses on the social representations of school built by residents who live along the Môa and Azul rivers, in Mâncio Lima, Acre State. To do so, we used semi-structured interviews, observation of the place as a natural inhabitant of the region, and the analysis of photos of the local reality. A key feature of the riverside homes is the paper glued to the walls of the houses, forming panels of portraits, pictures, letters and numbers for all to appreciate. Whether or not the residents can read, there is admiration for the color of the images, the layout of the letters and the things of the city, awakening the desire to obtain school knowledge. The residents of this Amazon region maintain a close relationship between thinking, acting and feeling, living in harmony with the nature that connects them to an ideal landscape, revisited through the graphic material that draws them to wonder what exists beyond the shores of the river, beyond the horizon of green forests. It is a life thoroughly shaped by the imaginary, in which a framed landscape merges and confuses the real and the supernatural, where men and gods walk together through the forest, sail the rivers and seek a possible aesthetic between the real and the ideal. The Theory of Social Representations, as developed by Serge Moscovici (2005) and Jodelet (2001), guided our understanding of what the school is and what it represents to the riverside dwellers, as well as of the relation they maintain between the knowledge spread by mystification and the knowledge practiced daily. Based on Bardin's thematic analysis (2004), we sought to raise such contents, combining them into five analysis categories.

Relevance:

10.00%

Publisher:

Abstract:

Skin cancer is the most common of all cancers, and the increase in its incidence is due, in part, to people's behavior regarding sun exposure. In Brazil, non-melanoma skin cancer is the most frequent in the majority of regions. Dermatoscopy and videodermatoscopy are the main types of examination for the diagnosis of dermatological skin diseases. The field involving the use of computational tools to support medical diagnosis of dermatological lesions is very recent, and some methods have been proposed for the automatic classification of skin pathologies from images. The present work presents a new intelligent methodology for the analysis and classification of skin cancer images, based on digital image processing techniques for the extraction of color, shape and texture features, using the Wavelet Packet Transform (WPT) and the learning technique called Support Vector Machine (SVM). The Wavelet Packet Transform is applied for the extraction of texture features from the images. The WPT consists of a set of basis functions that represent the image in different frequency bands, each with a distinct resolution corresponding to each scale. Moreover, the color characteristics of the lesion, which depend on a visual context influenced by the surrounding colors, are also computed, as are shape attributes through Fourier descriptors. The Support Vector Machine, which is based on the structural risk minimization principle from statistical learning theory, is used for the classification task. The SVM constructs optimal hyperplanes that represent the separation between classes; the generated hyperplane is determined by a subset of the training samples, called support vectors. For the database used in this work, the results revealed good performance, with a global accuracy of 92.73% for melanoma and 86% for non-melanoma and benign lesions. The extracted descriptors and the SVM classifier constitute a method capable of recognizing and classifying the analyzed skin lesions.
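As an illustration of the texture-feature step, the sketch below implements a two-level wavelet-packet style decomposition with the orthonormal Haar basis (the simplest wavelet; the abstract does not specify which wavelet family the thesis used) and collects subband energies as features. The 16×16 random patch is a stand-in for a dermoscopy image:

```python
import numpy as np

def haar2d(img):
    """One-level orthonormal 2D Haar transform: returns the four
    subbands LL, LH, HL, HH of an image with even dimensions."""
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    ll = (a + b + c + d) / 2.0
    lh = (a - b + c - d) / 2.0
    hl = (a + b - c - d) / 2.0
    hh = (a - b - c + d) / 2.0
    return ll, lh, hl, hh

def packet_energies(img, levels=2):
    """Wavelet-packet style texture features: recursively split every
    subband (not only LL, as a plain wavelet transform would) and
    collect the energy of each node."""
    bands = [img.astype(float)]
    feats = []
    for _ in range(levels):
        bands = [sb for band in bands for sb in haar2d(band)]
        feats.extend(float(np.sum(sb ** 2)) for sb in bands)
    return np.array(feats)

rng = np.random.default_rng(42)
lesion = rng.random((16, 16))          # stand-in for a dermoscopy patch
feats = packet_energies(lesion)        # 4 + 16 = 20 energy features
```

Because this Haar transform is orthonormal, the subband energies at each level sum to the energy of the original patch, which is a convenient sanity check; the feature vector would then be fed to the SVM together with the color and Fourier shape descriptors.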

Relevance:

10.00%

Publisher:

Abstract:

The idea of considering imprecision in probabilities is old, beginning with the work of George Boole, who in 1854 sought to reconcile classical logic, which allows the modeling of complete ignorance, with probabilities. In 1921, John Maynard Keynes, in his book, made explicit use of intervals to represent imprecision in probabilities. But it was only with the work of Walley in 1991 that principles were established that should be respected by a probability theory dealing with imprecision. With the emergence of fuzzy set theory by Lotfi Zadeh in 1965, another way of dealing with uncertainty and the imprecision of concepts appeared. Soon, several ways of incorporating Zadeh's ideas into probabilities were proposed in order to deal with imprecision, either in the events associated with the probabilities or in the probability values themselves. In particular, from 2003, James Buckley began to develop a probability theory in which the probability values are fuzzy numbers. This fuzzy probability follows principles analogous to Walley's imprecise probabilities. On the other hand, the use of real numbers between 0 and 1 as truth degrees, as originally proposed by Zadeh, has the drawback of using very precise values to deal with uncertainty (how can one distinguish an element that satisfies a property to degree 0.423 from one that satisfies it to degree 0.424?). This motivated the development of several extensions of fuzzy set theory that incorporate some kind of imprecision. This work considers the extension proposed by Krassimir Atanassov in 1983, which adds an extra degree of uncertainty to model the hesitation at the moment of assigning the membership degree: one value indicates the degree to which the object belongs to the set, while the other indicates the degree to which it does not belong. In Zadeh's fuzzy set theory, this non-membership degree is, by default, the complement of the membership degree. In Atanassov's approach, the non-membership degree is somewhat independent of the membership degree, and the difference between the non-membership degree and the complement of the membership degree reveals the hesitation in assigning a membership degree. This extension is today called Atanassov's intuitionistic fuzzy set theory; it is worth noting that the term "intuitionistic" here has no relation to the term as known in the context of intuitionistic logic. In this work, two notions of interval probability are developed, the restricted and the unrestricted interval probability; two notions of fuzzy probability are introduced, the constrained and the unconstrained fuzzy probability; and, finally, two notions of intuitionistic fuzzy probability are introduced, the restricted and the unrestricted intuitionistic fuzzy probability.
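A minimal sketch of the interval case, under the standard assumption (not necessarily the thesis's exact definition of "restricted") that an interval assignment over a finite partition is acceptable when some point probability inside every interval sums to one, which for intervals reduces to a pair of inequalities:

```python
def is_coherent(intervals):
    """Check an interval probability assignment over a finite partition:
    there exists p(i) in each [lower, upper] with sum(p) = 1 if and only
    if every interval is valid and sum(lowers) <= 1 <= sum(uppers)."""
    if any(lo > hi or lo < 0.0 or hi > 1.0 for lo, hi in intervals):
        return False
    lowers = sum(lo for lo, _ in intervals)
    uppers = sum(hi for _, hi in intervals)
    return lowers <= 1.0 <= uppers

# three mutually exclusive events with imprecise probabilities
coherent = is_coherent([(0.2, 0.4), (0.3, 0.5), (0.2, 0.3)])
incoherent = is_coherent([(0.5, 0.6), (0.5, 0.6), (0.2, 0.3)])  # lowers sum to 1.2
```

The second assignment fails because even the most pessimistic point probabilities already exceed one, the kind of incoherence Walley-style principles rule out.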

Relevance:

10.00%

Publisher:

Abstract:

The objective of this research is to discuss the need to implement new alternatives in metrological control: in the findings of initial and subsequent measurements, and in the procedures for controlling measurement uncertainty applied to the assessment of losses or surpluses found in bulk liquid handling operations, when turbine meters are used in the fiscal measurement of Petrobras business, given the current national and international environment of legal and scientific metrology. With these alternatives we aim to standardize the minimization of random and systematic errors and the estimation of residual errors, as well as the management control of metrological calibration procedures and of measurement uncertainty, and to contribute to a change in the practice of legal and scientific metrology by disseminating new information for the management of metrological control, objectively focused on supervision aspects in implementing these activities in the control of measurement uncertainty in the Petrobras fiscal measurement system. Results, information and comments are presented on the influence of measurement uncertainty on current fiscal and custody transfer results. This emphasizes the need, among other things, to improve and expand the monitored metrological control, so as to better meet the demand for calibration of Petrobras equipment and measuring instruments. Finally, we seek to establish the need to improve the method of evaluating meter data applied to the current management control of measurement uncertainty, by proposing a methodology to address the problem and highlighting the expected results.
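The control of measurement uncertainty discussed above is commonly formalized by the GUM-style law of propagation of uncertainty. The sketch below, with made-up meter-factor numbers, combines uncorrelated standard uncertainties through sensitivity coefficients and applies a coverage factor k = 2 for the expanded uncertainty:

```python
import math

def combined_uncertainty(sensitivities, uncertainties):
    """GUM-style combined standard uncertainty for uncorrelated inputs:
    u_c = sqrt(sum((c_i * u_i)^2)), with c_i the sensitivity coefficients."""
    return math.sqrt(sum((c * u) ** 2
                         for c, u in zip(sensitivities, uncertainties)))

# hypothetical turbine-meter volume correction V = V_raw * K:
# sensitivities are dV/dV_raw = K and dV/dK = V_raw
V_raw, K = 1000.0, 1.002           # m^3 and meter factor (made-up values)
u_Vraw, u_K = 0.5, 0.0004          # standard uncertainties (made-up)
u_c = combined_uncertainty([K, V_raw], [u_Vraw, u_K])
U = 2.0 * u_c                      # expanded uncertainty, coverage k = 2
```

Comparing U with the apparent loss or surplus of a custody-transfer batch is one way to decide whether a discrepancy is metrologically significant or within the measurement's own uncertainty.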

Relevance:

10.00%

Publisher:

Abstract:

This work presents a study of Generalized Predictive Controllers with constraints and their implementation in physical plants. Three types of constraints are discussed: constraints on the rate of change of the control signal, constraints on the amplitude of the control signal, and constraints on the amplitude of the output signal (the plant response). In predictive control, the control law is obtained by the minimization of an objective function. To take the constraints into account, this minimization is carried out by a method for solving constrained optimization problems; the chosen method was the Rosen algorithm, based on gradient projection. The physical plants in this study are two didactic water-level control systems: a first-order one (a single tank) and a second-order one formed by two tanks connected in cascade. The code is implemented in the C++ language, and the communication with the system is done through a data acquisition board supplied by the system manufacturer.
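For the amplitude constraints above, the feasible set is a box, and in that special case gradient projection reduces to clipping each gradient step back into the bounds (Rosen's algorithm handles general linear constraints; the thesis implementation is in C++). A minimal sketch with a made-up quadratic cost whose unconstrained minimizer violates the control limit:

```python
import numpy as np

def projected_gradient(grad, u0, lo, hi, step=0.1, iters=200):
    """Gradient projection for box constraints: after each gradient
    step the iterate is projected (clipped) back onto [lo, hi]."""
    u = np.clip(np.asarray(u0, float), lo, hi)
    for _ in range(iters):
        u = np.clip(u - step * grad(u), lo, hi)
    return u

# toy GPC-like cost, quadratic in the control move: unconstrained
# minimizer at u = 2, but control amplitude limited to [-1, 1]
grad = lambda u: 2.0 * (u - 2.0)
u_opt = projected_gradient(grad, u0=np.array([0.0]), lo=-1.0, hi=1.0)
```

The iterate settles on the boundary u = 1, the constrained optimum, which is exactly the behavior a constrained GPC needs when the unconstrained control law would saturate the actuator.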

Relevance:

10.00%

Publisher:

Abstract:

This paper analyzes the performance of a parallel implementation of Coupled Simulated Annealing (CSA) for the unconstrained optimization of continuous-variable problems. Parallel processing is an efficient form of information processing that emphasizes the exploitation of simultaneous events in the execution of software. It arises primarily from the demand for high computational performance and the difficulty of increasing the speed of a single processing core. Although multicore processors are easily found nowadays, several algorithms are still not suitable for running on parallel architectures. The CSA algorithm is characterized by a group of Simulated Annealing (SA) optimizers working together to refine the solution, each SA optimizer running in its own thread executed by a different processor. In the analysis of parallel performance and scalability, the following metrics were investigated: the execution time; the speedup of the algorithm with respect to the increasing number of processors; and the efficiency in the use of processing elements with respect to the increasing size of the treated problem. Furthermore, the quality of the final solution was verified. For the study, this paper proposes a parallel version of CSA and an equivalent serial version. Both algorithms were analyzed on 14 benchmark functions; for each function, the CSA was evaluated using 2 to 24 optimizers. The results obtained are shown and discussed in light of these metrics. The conclusions characterize CSA as a good parallel algorithm, both in the quality of the solutions and in its parallel scalability and efficiency.
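A minimal serial sketch of the coupling idea, assuming the CSA variant in which the optimizers share an acceptance probability built from all of their current energies; the sphere objective and every schedule parameter below are made up for illustration, not taken from the paper:

```python
import numpy as np

def sphere(x):
    return float(np.sum(x ** 2))

def coupled_sa(f, dim=2, n_opt=4, iters=500, t_gen=1.0, t_ac=1.0, seed=0):
    """Minimal Coupled Simulated Annealing sketch: several SA optimizers
    share an acceptance term (the coupling) built from their energies."""
    rng = np.random.default_rng(seed)
    xs = rng.uniform(-5.0, 5.0, size=(n_opt, dim))
    energies = np.array([f(x) for x in xs])
    best_x = xs[np.argmin(energies)].copy()
    best_e = float(energies.min())
    for k in range(iters):
        t = t_gen / (1.0 + k)                    # generation schedule
        emax = energies.max()                    # shift for stability
        gamma = np.sum(np.exp((energies - emax) / t_ac))
        accept = np.exp((energies - emax) / t_ac) / gamma
        for i in range(n_opt):
            probe = xs[i] + rng.normal(scale=t, size=dim)
            e_probe = f(probe)
            # downhill moves always; uphill with the coupled probability
            if e_probe < energies[i] or rng.random() < accept[i]:
                xs[i], energies[i] = probe, e_probe
                if e_probe < best_e:
                    best_x, best_e = probe.copy(), e_probe
    return best_x, best_e

best_x, best_e = coupled_sa(sphere)
```

In the parallel version studied above, each iteration of the inner loop runs in its own thread, with only the shared coupling term requiring synchronization between optimizers.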

Relevance:

10.00%

Publisher:

Abstract:

This work proposes a formulation for the layout optimization of 2D structures subjected to mechanical and thermal loads, and applies an h-adaptive filtering process that leads to low computational cost and high-resolution structural layouts. The main goal of the formulation is to minimize the mass of the structure subjected to an effective von Mises stress state, with stability and lateral constraint variants. A global measure criterion was used to impose a parametric condition on the stress fields, and a relaxation of the stress constraint was considered to avoid singularity problems. The optimization uses a material approach in which the homogenized constitutive equation is a function of the relative density of the material; the effective properties at intermediate densities are represented by a SIMP-type artificial model. The problem is discretized by the Galerkin finite element method using triangles with a linear Lagrangian basis. The optimization problem is solved by the augmented Lagrangian method, which consists of solving a sequence of minimization problems with box constraints by a second-order projection method that uses a memoryless quasi-Newton method during the solution process. This process reduces the computational cost, proving to be more effective and robust. The results materialize in more refined layouts, with more accurate definition of the topology and shape of the structure. On the other hand, the mass-minimization formulation with a global stress criterion can yield structural layouts that violate the criterion of homogeneously distributed stress.
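The SIMP-type artificial model mentioned above interpolates the effective stiffness between void and solid as a power law of the relative density. A minimal sketch (the modulus value is an arbitrary steel-like number, not from the study):

```python
import numpy as np

def simp_modulus(rho, E0=210e9, Emin=1e-9, p=3.0):
    """SIMP interpolation: effective Young's modulus as a function of
    relative density, E(rho) = Emin + rho**p * (E0 - Emin). The small
    Emin keeps the stiffness matrix non-singular in void regions."""
    rho = np.asarray(rho, float)
    return Emin + rho ** p * (E0 - Emin)

rho = np.linspace(0.0, 1.0, 5)
E = simp_modulus(rho)   # intermediate densities give disproportionately low stiffness
```

With the penalty exponent p > 1, a half-dense element carries only 1/8 of the solid stiffness at p = 3, so intermediate densities are structurally inefficient and the optimizer is pushed toward 0/1 (void/solid) layouts.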

Relevance:

10.00%

Publisher:

Abstract:

The investigation of the viability of containers for Natural Gas Vehicle (NGV) storage with geometries different from the commercial standards stems from the need to join the environmental, financial and technological benefits offered by gas combustion with the convenience of not modifying the original design of the automobile. The use of the current cylindrical models for storage in converted vehicles is justified by the excellent behavior of this geometry with respect to the stresses imposed by the high pressure to which these reservoirs are subjected. However, recent research on the application of adsorbent materials in natural gas reservoirs has demonstrated a substantial reduction of pressure and, consequently, a relief of the stresses in the reservoirs. This study therefore considers alternative geometries for NGV reservoirs, seeking the minimization of dimensions and weight while retaining the capacity to resist the stresses imposed by the new pressure condition. The parameters of the proposed reservoirs are calculated through a mathematical study of the internal pressure according to the Brazilian standards (NBR) for pressure vessels. Finally, simulations of the behavior of the new geometries are carried out using the commercially available Finite Element Method (FEM) software package ALGOR® to verify the efficiency of the reservoirs under the gas pressure load.
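For a first estimate of how the adsorption-induced pressure reduction relieves the wall stresses, the thin-wall membrane formulas are often used before any FEM run. A sketch with made-up dimensions and pressures; the actual NBR design rules used in the study include factors (joint efficiency, corrosion allowance) not modeled here:

```python
def thin_wall_stresses(p, r, t):
    """Membrane stresses in a thin-walled cylindrical vessel under
    internal pressure p: hoop stress p*r/t and axial stress p*r/(2t)."""
    return p * r / t, p * r / (2.0 * t)

def required_thickness(p, r, sigma_allow):
    """Minimum wall thickness keeping the hoop stress at or below the
    allowable stress (thin-wall approximation)."""
    return p * r / sigma_allow

# made-up numbers: adsorbed-gas storage at 4 MPa versus 20 MPa CNG,
# 0.15 m radius, 6 mm wall
hoop_cng, _ = thin_wall_stresses(20e6, 0.15, 0.006)
hoop_ads, _ = thin_wall_stresses(4e6, 0.15, 0.006)
t_req = required_thickness(4e6, 0.15, 250e6)   # 250 MPa allowable, made up
```

At one fifth of the pressure, the hoop stress drops by the same factor, which is what opens the door to lighter walls and non-cylindrical geometries.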

Relevance:

10.00%

Publisher:

Abstract:

This work proposes a computational methodology to solve optimization problems in structural design. The application develops, implements and integrates methods for structural analysis, geometric modeling, design sensitivity analysis and optimization. The optimum design problem is particularized to the plane stress case, with the objective of minimizing the structural mass subject to a stress criterion. Note that these constraints must be evaluated at a series of discrete points, whose distribution should be dense enough to minimize the chance of any significant constraint violation between specified points. Therefore, the local stress constraints are transformed into a global stress measure, reducing the computational cost of deriving the optimal shape design. The problem is approximated by the finite element method using Lagrangian triangular elements with six nodes and an automatic mesh generator with a geometric element-quality criterion. In the geometric modeling, the contour is defined by parametric curves of B-spline type; these curves have characteristics suitable for the shape optimization method, which uses their key points as design variables to determine the solution of the minimization problem. A reliable tool for design sensitivity analysis is a prerequisite for performing interactive structural design, synthesis and optimization; general expressions for design sensitivity analysis are derived with respect to the key points of the B-splines, using the adjoint approach and the analytical method. The formulation of the optimization problem applies the augmented Lagrangian method, which converts a constrained optimization problem into an unconstrained one, whose solution requires the sensitivity analysis. Therefore, the optimization problem reduces to the solution of a sequence of problems with bound constraints, solved by the memoryless quasi-Newton method. It is demonstrated by several examples that this new approach, integrating analytical design sensitivity analysis into shape optimization with a global stress criterion, is computationally efficient.
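The key points used as design variables enter the geometry through the Cox–de Boor recursion for B-spline basis functions. A minimal sketch with a clamped cubic curve and made-up control points:

```python
def bspline_basis(i, p, t, knots):
    """Cox-de Boor recursion for the i-th B-spline basis function of
    degree p evaluated at parameter t (0/0 terms treated as zero)."""
    if p == 0:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + p] != knots[i]:
        left = ((t - knots[i]) / (knots[i + p] - knots[i])
                * bspline_basis(i, p - 1, t, knots))
    if knots[i + p + 1] != knots[i + 1]:
        right = ((knots[i + p + 1] - t) / (knots[i + p + 1] - knots[i + 1])
                 * bspline_basis(i + 1, p - 1, t, knots))
    return left + right

def bspline_point(ctrl, degree, t, knots):
    """Point on the curve: basis-weighted sum of the control (key) points."""
    return [sum(bspline_basis(i, degree, t, knots) * ctrl[i][d]
                for i in range(len(ctrl))) for d in range(len(ctrl[0]))]

# clamped cubic B-spline through five hypothetical key points
ctrl = [(0.0, 0.0), (1.0, 2.0), (2.0, -1.0), (3.0, 2.0), (4.0, 0.0)]
knots = [0, 0, 0, 0, 1, 2, 2, 2, 2]
pt = bspline_point(ctrl, 3, 0.5, knots)
```

At any parameter inside the knot range the basis functions sum to one, so the curve stays in the convex hull of the key points; moving one key point deforms the contour only locally, which is what makes them well-behaved shape design variables.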

Relevance:

10.00%

Publisher:

Abstract:

Mortar is a type of adhesive product used on a large scale in construction, as a function of its variety and ease of application. Although it is an industrialized product whose manufacture involves considerable technology, pathologies are very frequent, causing recurrent damage and losses in the construction industry. Given this market situation, technical and scientific studies of the effects of the addition of diatomite on the rheological and mechanical behavior of adhesive mortars are needed. This work proposes the use of diatomite as a mineral additive in adhesive mortar formulations, partially replacing cellulose-based additives. The choice of this mineral is based on physical, chemical and rheological properties that justify its use in this product line; it is also a raw material abundant in our region and can thus contribute positively to the minimization of the direct costs of cellulose-based additives. The industrial adhesive mortar used for comparison was of type AC1. The adhesive mortar formulations with diatomite kept constant the dosed quantities of sand, cement and the water/cement (w/c) ratio; that is, formulations were developed with 10, 20, 30 and 40% of diatomite substituting part of the cellulose-based additives. These mortars were subjected to tests that define and evaluate the rheological and mechanical behavior of this type of mortar. The results attest to the best performance of the AC1-type adhesive mortar with partial replacement of 30% of the cellulose-based additive by diatomite.

Relevance:

10.00%

Publisher:

Abstract:

The topology optimization problem characterizes and determines the optimum distribution of material in the domain: after the definition of the boundary conditions in a pre-established domain, the problem is how to distribute the material so as to solve the minimization problem. The objective of this work is to propose a competitive formulation for the determination of optimum structural topologies in 3D problems, able to provide high-resolution layouts. The procedure combines the Galerkin finite element method with the optimization method, seeking the best material distribution over the fixed design domain. The layout topology optimization method is based on the material approach proposed by Bendsøe & Kikuchi (1988) and considers a homogenized constitutive equation that depends only on the relative density of the material. The finite element used is a four-node tetrahedron with a selective integration scheme, which interpolates not only the components of the displacement field but also the relative density field. The proposed procedure consists of solving a sequence of layout optimization problems applied to compliance minimization and to mass minimization under local stress constraints. The microstructure used was SIMP (Solid Isotropic Material with Penalization). The approach considerably reduces the computational cost, showing itself to be efficient and robust. The results provided well-defined structural layouts, with a sharp distribution of the material and clear boundary definition. The layout quality was proportional to the average element size, and a considerable reduction of the design variables was observed due to the tetrahedral element.
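The density update in SIMP compliance-minimization procedures of this kind is typically the optimality-criteria scheme: scale each element density by a power of its scaled sensitivity, clamp by a move limit, and bisect on the volume-constraint multiplier. A sketch with a made-up one-dimensional sensitivity field (the thesis itself solves 3D problems on tetrahedral meshes):

```python
import numpy as np

def oc_update(rho, dc, volfrac, move=0.2, eta=0.5):
    """Optimality-criteria update for SIMP topology optimization:
    rho_new = rho * (-dc / lambda)**eta, clamped by the move limit and
    the [0, 1] bounds, with lambda found by bisection so that the mean
    density meets the prescribed volume fraction."""
    lo, hi = 1e-9, 1e9
    while (hi - lo) / (lo + hi) > 1e-6:
        lam = 0.5 * (lo + hi)
        rho_new = rho * (np.maximum(-dc, 0.0) / lam) ** eta
        rho_new = np.clip(rho_new,
                          np.maximum(rho - move, 0.0),
                          np.minimum(rho + move, 1.0))
        if rho_new.mean() > volfrac:
            lo = lam          # too much material: raise the multiplier
        else:
            hi = lam
    return rho_new

# hypothetical compliance sensitivities on a tiny 1D "mesh" of 8 elements
rho = np.full(8, 0.5)
dc = -np.linspace(1.0, 2.0, 8)   # elements that work harder get more material
rho_next = oc_update(rho, dc, volfrac=0.5)
```

Iterating this update together with a finite element analysis and the SIMP stiffness interpolation reproduces the classic compliance-minimization loop; the bisection keeps the volume constraint active at every step.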