78 results for "critério"


Relevance:

10.00%

Publisher:

Abstract:

In this work, Markov chains are the tool used to model and analyze the convergence of the genetic algorithm, both in its standard version and in the variants the algorithm admits. In addition, we compare the performance of the standard version with a fuzzy version, on the premise that the fuzzy version gives the genetic algorithm the strong ability to find a global optimum that characterizes global optimization algorithms. The choice of this algorithm is due to the fact that, over the past thirty years, it has become one of the most important tools for solving optimization problems. This is owed to its effectiveness in finding good-quality solutions: a good-quality solution is acceptable given that, for many of these problems, no other algorithm may be able to obtain the optimal one. The algorithm can be configured in different ways, since it depends not only on how the problem is represented but also on how some of its operators are defined, ranging from the standard version, in which the parameters are kept fixed, to versions with variable parameters. To achieve good performance, the algorithm therefore needs an adequate criterion for choosing its parameters, especially the mutation rate and the crossover rate, or even the population size. It is important to note that in implementations where the parameters are kept fixed throughout the execution, modeling the algorithm by a Markov chain yields a homogeneous chain, whereas when the parameters are allowed to vary during the execution, the modeling chain becomes non-homogeneous. Hence, in an attempt to improve the algorithm's performance, some studies have tuned the parameters through strategies that capture intrinsic characteristics of the problem. These characteristics are extracted from the current state of the execution, in order to identify and preserve patterns related to good-quality solutions while discarding low-quality ones. Feature-extraction strategies can use either crisp or fuzzy techniques, in the latter case by means of a fuzzy controller. A Markov chain is used for the modeling and convergence analysis of the algorithm in both versions. To evaluate the performance of the non-homogeneous algorithm, tests are applied comparing the standard genetic algorithm with the fuzzy genetic algorithm, whose mutation rate is adjusted by a fuzzy controller. For that purpose, optimization problems whose number of solutions varies exponentially with the number of variables are chosen
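To make the fuzzy-controlled variant concrete, the sketch below implements a bit-string genetic algorithm whose mutation rate is set each generation from population diversity through two triangular membership functions. It is a minimal illustration under invented parameter values, not the thesis's controller; because the mutation rate varies across generations, the chain that models this run is non-homogeneous.

```python
import random

def diversity(pop):
    """Mean pairwise Hamming distance between bit strings, normalized to [0, 1]."""
    n, L = len(pop), len(pop[0])
    total = sum(sum(a != b for a, b in zip(x, y))
                for i, x in enumerate(pop) for y in pop[i + 1:])
    return total / (L * n * (n - 1) / 2)

def fuzzy_mutation_rate(div, lo=0.01, hi=0.20):
    """Two triangular membership sets over diversity ('low', 'high'),
    defuzzified by weighted average: low diversity -> high mutation rate."""
    mu_low, mu_high = 1.0 - div, div
    return (mu_low * hi + mu_high * lo) / (mu_low + mu_high)

def evolve(fitness, L=20, pop_size=30, gens=100):
    pop = [[random.randint(0, 1) for _ in range(L)] for _ in range(pop_size)]
    for _ in range(gens):
        # Parameter varies with the execution state, so the modeling
        # Markov chain is non-homogeneous.
        pm = fuzzy_mutation_rate(diversity(pop))
        elite = sorted(pop, key=fitness, reverse=True)[:pop_size // 2]
        pop = []
        while len(pop) < pop_size:
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, L)             # one-point crossover
            child = [g ^ (random.random() < pm)      # bit-flip mutation
                     for g in a[:cut] + b[cut:]]
            pop.append(child)
    return max(pop, key=fitness)

# OneMax: the number of candidate solutions grows as 2^L with the
# number of variables L, as in the test problems mentioned above.
print(sum(evolve(fitness=sum)))
```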

Relevance:

10.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior

Relevance:

10.00%

Publisher:

Abstract:

In this work, the SOM (Self-Organizing Map) algorithm, or Kohonen neural network, is implemented in the form of hierarchical structures and applied to image compression. The main objective of this approach is to develop a hierarchical SOM algorithm with a static structure and another with a dynamic structure to generate codebooks in the image Vector Quantization (VQ) process, reducing processing time and achieving a good image compression rate with minimal degradation of quality relative to the original image. The two self-organizing neural networks developed here were named HSOM, for the static case, and DHSOM, for the dynamic case. In the first, the hierarchical structure is defined beforehand; in the latter, the structure grows automatically according to heuristic rules that explore the training data without the use of external parameters. For this network, the heuristic rules determine the growth dynamics, the branch-pruning criteria, the flexibility, and the size of the child maps. The LBG (Linde-Buzo-Gray) algorithm, or K-means, one of the algorithms most used to build codebooks for Vector Quantization, was applied together with Kohonen's algorithm in its basic, non-hierarchical form, as a reference for comparing the performance of the algorithms proposed here. A performance analysis between the two hierarchical structures is also carried out in this work. The efficiency of the proposed processing is verified by the reduction in computational complexity relative to the traditional algorithms, as well as by the quantitative analysis of the reconstructed images in terms of the peak signal-to-noise ratio (PSNR) and the mean squared error (MSE)
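As a concrete reference point, the sketch below implements the non-hierarchical baseline named above, LBG/K-means codebook generation for vector quantization, together with the MSE/PSNR figures of merit. Block size, codebook size, and the random test image are illustrative assumptions; the hierarchical HSOM/DHSOM structures themselves are not reproduced here.

```python
import numpy as np

def lbg_codebook(blocks, size=32, iters=20, eps=1e-3):
    """LBG codebook for VQ: grow by splitting codewords, then refine
    each stage with Lloyd (K-means) iterations. A minimal sketch."""
    codebook = blocks.mean(axis=0, keepdims=True)      # start from global mean
    while len(codebook) < size:
        codebook = np.vstack([codebook * (1 + eps),    # split every codeword
                              codebook * (1 - eps)])
        for _ in range(iters):                         # Lloyd refinement
            d = ((blocks[:, None, :] - codebook[None]) ** 2).sum(-1)
            nearest = d.argmin(1)
            for j in range(len(codebook)):
                members = blocks[nearest == j]
                if len(members):
                    codebook[j] = members.mean(0)
    return codebook

def quantize(blocks, codebook):
    d = ((blocks[:, None, :] - codebook[None]) ** 2).sum(-1)
    return d.argmin(1)                                 # indices are the code sent

# Usage: 4x4 blocks of a grayscale image flattened to 16-dim vectors.
img = np.random.rand(64, 64)
blocks = img.reshape(16, 4, 16, 4).transpose(0, 2, 1, 3).reshape(-1, 16)
cb = lbg_codebook(blocks)
idx = quantize(blocks, cb)
mse = ((blocks - cb[idx]) ** 2).mean()
psnr = 10 * np.log10(1.0 / mse)                        # pixel range [0, 1]
print(f"MSE = {mse:.5f}  PSNR = {psnr:.2f} dB")
```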

Relevance:

10.00%

Publisher:

Abstract:

In this work, a new online algorithm for solving the k-Server Problem (PKS) is proposed. Its performance is compared with that of other algorithms in the literature, namely the Harmonic and Work Function algorithms, which have been shown to be competitive, making them meaningful baselines. An algorithm that performs efficiently relative to them tends to be competitive as well, although that would obviously have to be proved; such a proof, however, is beyond the scope of this work. The algorithm presented for solving the PKS is based on reinforcement learning techniques. To that end, the problem was modeled as a multi-stage decision process, to which the Q-Learning algorithm, one of the most popular solution methods for establishing optimal policies in this kind of decision problem, is applied. It should be noted, however, that the size of the storage structure used by reinforcement learning to obtain the optimal policy grows with the number of states and actions, which in turn is proportional to the number n of nodes and k of servers. Analysis shows that this growth is exponential, limiting the application of the method to smaller problems, in which the numbers of nodes and servers are small. This problem, called the curse of dimensionality, was introduced by Bellman and implies the impossibility of executing an algorithm for certain instances of a problem because the computational resources needed to obtain its output are exhausted. So that the proposed solution, based exclusively on reinforcement learning, is not restricted to small applications, an alternative solution is proposed for more realistic problems involving larger numbers of nodes and servers. This alternative solution is hierarchical and uses two methods for solving the PKS: reinforcement learning, applied to a reduced number of nodes obtained through an aggregation process, and a greedy method, applied to the subsets of nodes resulting from the aggregation, in which the criterion for scheduling the servers is the smallest distance to the demand location
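The sketch below illustrates the tabular Q-Learning formulation described above on a toy instance with n = 6 nodes and k = 2 servers. The state encoding (server configuration plus demand node), the negative-distance reward, and all parameter values are assumptions for illustration, not the thesis's exact design; the size of the resulting table hints at the curse of dimensionality.

```python
import random
import numpy as np

n, k = 6, 2                                  # toy instance: 6 nodes, 2 servers
rng = np.random.default_rng(0)
pts = rng.random((n, 2))
dist = np.linalg.norm(pts[:, None] - pts[None], axis=-1)  # metric distances

Q = {}                                       # Q[(config, demand)] -> value per server
alpha, gamma, eps = 0.2, 0.9, 0.1

def q(s):
    return Q.setdefault(s, np.zeros(k))

config = tuple(range(k))                     # initial server positions (node ids)
for _ in range(50_000):
    demand = random.randrange(n)
    if demand in config:                     # a server is already there: zero cost
        continue
    s = (config, demand)
    a = random.randrange(k) if random.random() < eps else int(q(s).argmax())
    cost = dist[config[a], demand]           # distance moved by the chosen server
    config2 = tuple(sorted(config[:a] + (demand,) + config[a + 1:]))
    s2 = (config2, random.randrange(n))      # sample the next demand
    q(s)[a] += alpha * (-cost + gamma * q(s2).max() - q(s)[a])
    config = config2

# States number about C(n, k) * n, which explodes as n and k grow --
# the motivation for the hierarchical aggregation described above.
print(f"{len(Q)} state rows learned")
```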

Relevance:

10.00%

Publisher:

Abstract:

This work proposes a formulation for the optimization of 2D structural layouts subjected to mechanical and thermal loads, applying an h-adaptive filter process that leads to low computational cost and high-resolution structural layouts. The main goal of the formulation is to minimize the mass of a structure subjected to an effective von Mises stress state, with stability and side-constraint variants. A global measure criterion was used to impose a parametric condition on the stress field. To avoid singularity problems, a relaxation of the stress constraint was considered. The optimization uses a material approach in which the homogenized constitutive equation is a function of the relative density of the material. The effective properties at intermediate densities are represented by a SIMP-type artificial model. The problem was discretized by the Galerkin finite element method using triangles with a linear Lagrangian basis. The optimization problem was solved with the augmented Lagrangian method, which consists of solving a sequence of minimization problems with box constraints by a second-order projection method that uses a memoryless quasi-Newton method during the solution process. This process reduces computational cost and proves to be more effective and robust. The results yield more refined layouts, with accurate definition of the topology and shape of the structure. On the other hand, the mass minimization formulation with a global stress criterion delivers structural layouts ready for modeling, at the cost of violating the criterion of homogeneously distributed stress
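The outer loop described above can be sketched as a sequence of box-constrained minimizations of an augmented Lagrangian, here solved with SciPy's L-BFGS-B (a limited-memory quasi-Newton method) as an illustration. The toy objective stands in for mass and the toy constraint for the relaxed global stress measure; everything in the block is an illustrative assumption, not the thesis's solver.

```python
import numpy as np
from scipy.optimize import minimize

def f(x):                    # toy "mass": sum of design densities
    return x.sum()

def g(x):                    # toy "global stress" constraint, g(x) <= 0
    return 1.0 - x @ x

def aug_lagrangian(x, lam, r):
    # Standard augmented Lagrangian for an inequality constraint.
    psi = max(g(x), -lam / r)
    return f(x) + lam * psi + 0.5 * r * psi ** 2

lam, r = 0.0, 10.0
x = np.full(5, 0.5)
for outer in range(10):
    # Box-constrained subproblem solved by a quasi-Newton method.
    res = minimize(aug_lagrangian, x, args=(lam, r),
                   method="L-BFGS-B", bounds=[(1e-3, 1.0)] * 5)
    x = res.x
    lam = max(0.0, lam + r * g(x))   # multiplier update
    r *= 2.0                          # penalty growth
print(x, g(x))
```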

Relevance:

10.00%

Publisher:

Abstract:

The manufacture of prostheses for lower-limb amputees (transfemoral and transtibial) requires the preparation of a socket with a proper, custom fit to the profile of each patient. The traditional process offered to patients, mainly in public hospitals in Brazil, begins with the completion of a form in which the types of equipment, accessories, measurements, levels of amputation, etc. are recorded. Currently, this work is carried out manually, using a common measuring tape and wooden calipers to take the measurements of the stump, which makes the process very rudimentary and leaves a high degree of uncertainty in the geometry of the final product. To address this problem, it was necessary to act in two simultaneous and correlated directions. First, an integrated 3D CAD viewing tool for transfemoral and transtibial prostheses, called OrtoCAD I, was developed. At the same time, it was necessary to design and build a mechanical reader (a kind of simplified three-dimensional scanner) able to obtain, automatically and accurately, the geometric information of either the stump or the healthy leg. The methodology includes the application of reverse engineering concepts to computationally generate the representation of the stump and/or the mirrored image of the healthy limb. The materials used in the manufacture of prostheses do not always obey technical-scientific criteria: while they may meet strength requirements, they bring serious problems, mainly due to excess weight, causing the user various disorders arising from the poor fit. That problem was addressed with the creation of a hybrid composite material for the manufacture of prosthetic sockets. Using the mechanical reader and OrtoCAD, the new composite material, which combines the mechanical properties of strength and stiffness with important parameters such as low weight and low cost, can be specified in the best way. Moreover, it reduces steps in the current manufacturing processes, or even makes new industrial processes feasible for obtaining the prostheses. In this sense, hybridizing the composite by combining natural and synthetic fibers can be a viable solution to the challenges described above

Relevance:

10.00%

Publisher:

Abstract:

This work was motivated by the importance of conducting a study of vehicle emissions in captive diesel-engine fleets, coupled with a predictive maintenance plan. This type of maintenance includes techniques designed to meet the growing market demand to reduce maintenance costs by increasing the reliability of diagnoses, which has increased the interest in automated predictive maintenance of diesel engines, preventing routine problems from evolving into serious situations solvable only with complex and costly repairs. Reliability-Centered Maintenance is the methodology adopted to reach this goal, in addition to keeping the vehicles regulated with respect to fuel consumption and emissions. To that end, technical improvements capable of penetrating the automotive market were estimated, expressing the captive fleet's emission rates through the opacity of the vehicles' exhaust, which is directly related to the condition of the lubricating oil, thereby reducing maintenance costs while contributing significantly to lowering pollutant emissions and improving the air in large cities. This criterion was adopted and implemented in 241 buses and produced a diagnosis of possible failures through the correlation between the characterization of used lubricating oils and the opacity analysis, with the objective of aiding the detection and resolution of failures in the maintenance of subsystems according to design criteria. For this, a deductive methodology for determining potential causes of failures was automated in order to implement a predictive maintenance system. The study used a mobile unit equipped with an opacimeter and a kit for collecting and analyzing lubricating oil; for the construction of the diagnostic network, a computer program on the Microsoft Office Access 2007 platform, an indispensable tool for creating the database, was used. This method is in use and has been successfully implemented in seven (7) bus companies in the city of Natal (RN), Brazil
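A hypothetical sketch of the kind of rule-based diagnosis described above: opacity measurements are joined with used-oil analysis, and buses whose combined readings suggest a failure mode are flagged. All field names, values, and thresholds are invented for illustration; in the study this role was played by an Access 2007 database.

```python
import pandas as pd

# Invented readings: opacity in m^-1 (k factor) and used-oil properties.
opacity = pd.DataFrame({"bus": [101, 102, 103],
                        "opacity_m1": [0.9, 2.4, 1.1]})
oil = pd.DataFrame({"bus": [101, 102, 103],
                    "viscosity_cSt": [14.1, 10.2, 13.8],   # at 100 C
                    "iron_ppm": [40, 180, 55]})

fleet = opacity.merge(oil, on="bus")
OPACITY_LIMIT = 1.7        # assumed threshold, not the study's value
# Flag buses with high opacity AND degraded oil (thinned, or high wear metal).
fleet["flag"] = ((fleet.opacity_m1 > OPACITY_LIMIT) &
                 ((fleet.viscosity_cSt < 12.0) | (fleet.iron_ppm > 100)))
print(fleet[fleet.flag])   # candidates for predictive intervention
```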

Relevance:

10.00%

Publisher:

Abstract:

The topology optimization problem consists of characterizing and determining the optimal distribution of material within a domain: after the boundary conditions are defined on a pre-established domain, the problem is how to distribute the material so as to solve the minimization problem. The objective of this work is to propose a competitive formulation for determining optimal structural topologies in 3D problems, able to provide high-resolution layouts. The procedure combines the Galerkin finite element method with an optimization method, seeking the best material distribution over the fixed design domain. The layout topology optimization method is based on the material approach proposed by Bendsoe & Kikuchi (1988) and considers a homogenized constitutive equation that depends only on the relative density of the material. The finite element used is a four-node tetrahedron with a selective integration scheme, which interpolates not only the components of the displacement field but also the relative density field. The proposed procedure consists of solving a sequence of layout optimization problems applied to compliance minimization and to mass minimization under local stress constraints. The microstructure used in this procedure was SIMP (Solid Isotropic Material with Penalization). The approach considerably reduces the computational cost, showing itself to be efficient and robust. The results provide well-defined structural layouts, with a sharp distribution of material and well-resolved boundaries. The layout quality was proportional to the mean element size, and a considerable reduction in the number of design variables was observed thanks to the tetrahedral element
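The core SIMP ingredients named above can be sketched briefly: the penalized stiffness interpolation, the compliance sensitivity it induces, and a density update. The update shown is the classic optimality-criteria scheme, a standard alternative used here purely for illustration (the thesis solves a sequence of optimization problems instead), and the element energies are placeholders for what a Galerkin tetrahedral solve would supply.

```python
import numpy as np

p, E0, Emin = 3.0, 1.0, 1e-9

def simp_stiffness(rho):
    """Homogenized constitutive scaling as a function of relative density."""
    return Emin + rho**p * (E0 - Emin)

def oc_update(rho, dc, dv, volfrac, move=0.2):
    """Optimality-criteria density update under a volume constraint,
    with bisection on the Lagrange multiplier."""
    lo, hi = 1e-9, 1e9
    while (hi - lo) / (hi + lo) > 1e-4:
        lam = 0.5 * (lo + hi)
        new = rho * np.sqrt(np.maximum(-dc, 0) / (lam * dv))
        new = np.clip(new, np.maximum(rho - move, 1e-3),
                           np.minimum(rho + move, 1.0))
        if new.mean() > volfrac:
            lo = lam            # too much material: raise the multiplier
        else:
            hi = lam
    return new

# Element strain energies u_e^T k0 u_e would come from the FE solve;
# placeholders here show the sensitivity dc = -p rho^(p-1) (E0-Emin) e.
rho = np.full(100, 0.4)
elem_energy = np.random.rand(100)
dc = -p * rho**(p - 1) * (E0 - Emin) * elem_energy   # d(compliance)/d(rho)
dv = np.ones_like(rho)                               # d(volume)/d(rho)
rho = oc_update(rho, dc, dv, volfrac=0.4)
```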

Relevance:

10.00%

Publisher:

Abstract:

The present work contributes to the study of heat transfer models for foods submitted to experimental tests in the solar oven considered here. The best model for the chicken burger under study was evaluated by comparing results obtained when treating this food as a semi-infinite solid (the first model considered) and then as a plane plate in transient regime under two distinct conditions: ignoring and, in a second model, including the contribution of the generation term through the Pomerantsev criterion. The Sun, besides being the source of life, is the origin of all the forms of energy that mankind has used throughout its history, and it can be the answer to the question of energy supply in the future, once we learn to rationally harness the light this star constantly pours onto our planet. Having shone for more than 5 billion years, the Sun is estimated to endure for another 6 billion; that is, it is only halfway through its existence and, in this year alone, will deliver to the Earth 4000 times more energy than we will consume. Faced with this reality, it would be irrational not to seek, by every technically possible means, to harness this clean, ecological, and free energy source. This dissertation evaluates the performance of a box-type solar cooker. A model of a box-type solar stove was built by the group at the Solar Energy Laboratory (LES) of the Universidade Federal do Rio Grande do Norte (UFRN) and its technical viability was tested, modeling the foods baked in the solar oven. The cooker's main characteristics are ease of manufacture and assembly, low cost (materials accessible to low-income communities were used), and a simple mechanism for moving the prototype to follow the direct incidence of sunlight. Models were proposed for calculating the minimum baking time of foods, considering the following transient heat transfer models: semi-infinite solid, plane plate, and a sphere model to study the temperature required for baking bread (considering spherical geometry). After evaluating the heat transfer models for the foods submitted to baking, the times obtained from the models were compared with the experimental baking times in the solar oven, indicating which model best portrays the experimental results
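As an illustration of the plane-plate transient model, the sketch below evaluates the one-term series solution for a slab heated by convection and solves it for the time at which the center reaches a target temperature. All property values and temperatures are placeholders, not the dissertation's measured data.

```python
import numpy as np
from scipy.optimize import brentq

# Illustrative properties: convection coefficient (W/m2K), conductivity
# (W/mK), thermal diffusivity (m2/s), and slab half-thickness (m).
h, k, alpha, L = 25.0, 0.5, 1.4e-7, 0.01
Bi = h * L / k

# First eigenvalue of zeta * tan(zeta) = Bi for the plane wall.
zeta1 = brentq(lambda z: z * np.tan(z) - Bi, 1e-6, np.pi / 2 - 1e-6)
C1 = 4 * np.sin(zeta1) / (2 * zeta1 + np.sin(2 * zeta1))

T_inf, T_i, T_target = 120.0, 25.0, 80.0   # oven, initial, target center (C)
theta = (T_target - T_inf) / (T_i - T_inf)  # dimensionless center temperature
Fo = -np.log(theta / C1) / zeta1**2         # Fourier number, one-term model
t = Fo * L**2 / alpha                       # valid for Fo > 0.2
print(f"Bi = {Bi:.2f}  Fo = {Fo:.2f}  t = {t / 60:.1f} min")
```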

Relevance:

10.00%

Publisher:

Abstract:

This work studies the manufacture of spaghetti by a high-temperature process using wheat flour blended with whole-wheat flour and flaxseed meal, with the aim of evaluating the quality of the final product and estimating the production cost. Moisture, ash, protein, wet gluten, gluten index, falling number, and granularity were determined for the flour and the test blends, to assess their possible use in pasta manufacturing and their compliance, by technological criteria, with current legislation. Spaghetti-type noodles were manufactured with additions of 10% and 20% whole-wheat flour and of 10% and 20% flaxseed meal, and the physicochemical, sensory, and rheological properties of the products were evaluated. Product acceptance analysis and production cost estimation were then performed in order to support the introduction, by the food industry, of the products with the greatest acceptance and economic viability in the market. The rheological evaluation consisted of the pasta cooking test, recording the volume increase, the cooking time, and the percentage of solid loss. In the sensory evaluation, a triangular product differentiation test was carried out with 50 trained judges, along with acceptance testing on a hedonic scale covering color, taste, smell, and texture. The sensory profile of the product was established by QDA (quantitative descriptive analysis) with 9 judges recruited and trained at the factory, using a 9 cm unstructured scale to assess wheat flavor, flax flavor, consistency, raw pasta texture, raw pasta color, and cooked pasta color. The pasta with 20% whole-wheat flour had good quality and the greatest acceptance, followed by the 10% whole-wheat product; the 10% and 20% flaxseed pastas were characterized as of average quality by the solid-loss criterion, together with a commercial whole-wheat pasta used for comparison. In the estimated production cost assessment, the two most technologically feasible and acceptable products (20% whole-wheat and 10% flaxseed) were evaluated for the high-temperature process, with total costs of R$ 4,872.5/1,000 kg and R$ 5,354.9/1,000 kg respectively; the difference was related to the smaller addition of inputs and to the higher market value of whole-wheat flour and flaxseed meal. The comparative analysis confirmed the reduction in production time (10 h) and a more uniform product for the high-temperature drying process compared with the conventional one

Relevance:

10.00%

Publisher:

Abstract:

Because of shortcomings in the development of public policies to address sanitation in the municipalities, companies known as "limpa-fossa" (septic tank cleaners) appeared, proposing to solve the collection and management of wastewater produced in single-family or multi-family residences, businesses, hospitals, etc. Since this is an activity in which there is little concern for the fate of the sewage, doubts emerged about the degree of health and environmental safety of these companies. Traditionally, most of them dispose of the collected waste on depleted soil or wetlands, in the open, usually on the outskirts of cities (MENESES, 2001). In turn, the sludge from emptied septic tanks, disposed of without technical criteria in the soil, in rivers, or as agricultural fertilizer, puts the health of the population and environmental quality at risk. This work was part of research network 5 of the call for the Research Program in Sanitation (PROSAB-5), under the theme "Characterization and study of alternative ways of treating sludge from septic tanks in the city of Natal, RN", and proposed to evaluate the performance of stabilization ponds as a system for treating waste from emptied septic tanks. The pond series studied belongs to one of the largest septic-tank-cleaning companies of Natal and consists of two anaerobic ponds, one facultative pond, one maturation pond, and a disinfection tank, with the effluent discharged into the Potengi River. Samples were collected between October 2007 and October 2008 at six points defined beforehand as the most appropriate for the proposed study. The field and laboratory results showed removals of up to 88.93% for COD, 94.87% for total suspended solids, 66.87% for organic nitrogen, and 99.88% for thermotolerant coliforms. Some results did not reach expectations because the system under study had operating problems that undermined the efficiency of the reactors
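For reference, the removal efficiencies quoted above are computed from influent and effluent concentrations as sketched below; the concentrations shown are illustrative placeholders, not the monitoring data of the study.

```python
def removal_efficiency(c_in: float, c_out: float) -> float:
    """Percent removal across the pond series."""
    return 100.0 * (c_in - c_out) / c_in

# Hypothetical COD concentrations in mg/L.
cod_in, cod_out = 2500.0, 277.0
print(f"COD removal: {removal_efficiency(cod_in, cod_out):.2f}%")  # 88.92%
```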

Relevance:

10.00%

Publisher:

Abstract:

The rainfall regime of the semiarid region of northeastern Brazil is highly variable. The climate processes associated with rainfall are complex, and their effects may represent extreme situations of drought or floods, which can have adverse effects on society and the environment. The regional economy has a significant agricultural component, which is strongly influenced by weather conditions. Maximum precipitation analysis is traditionally performed using the intensity-duration-frequency (IDF) probabilistic approach, whose results are typically used in engineering projects involving hydraulic structures such as drainage network systems and road structures. Precipitation data analysis, in turn, may require the adoption of some event identification criterion, of which the minimum inter-event time (IMEE) is one of the most used. This study aims to analyze the effect of the IMEE on the properties of the identified rain events. For this purpose, a nine-year precipitation time series (2002-2011) was used, obtained from an automatic rain gauge station installed in an environmentally protected area, the Seridó Ecological Station. The results showed that the adopted IMEE value has an important effect on the number of events, their duration, event depth, mean rainfall rate, and mean inter-event time. Furthermore, a higher occurrence of extreme events was observed for small IMEE values. Most events showed a mean rainfall intensity higher than 2 mm/h regardless of the IMEE. The storm advance coefficient was, in most cases, within the first quartile of the event, regardless of the IMEE value. Time series analysis using partial duration series made it possible to adjust the IDF equations to local characteristics
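A minimal sketch of event separation by minimum inter-event time: two consecutive records belong to the same event when the dry interval between them is shorter than the IMEE. The timestamps below are invented, but they show how a larger IMEE merges events, which is the effect analyzed in the study.

```python
from datetime import datetime, timedelta

def split_events(times, imee=timedelta(hours=6)):
    """Group rainfall record timestamps into events separated by >= IMEE."""
    events, current = [], [times[0]]
    for t in times[1:]:
        if t - current[-1] >= imee:
            events.append(current)     # dry spell long enough: close the event
            current = []
        current.append(t)
    events.append(current)
    return events

records = [datetime(2010, 3, 1, h, m) for h, m in
           [(2, 0), (2, 10), (3, 50), (14, 0), (14, 30)]]
for e in split_events(records):
    print(e[0], "->", e[-1], f"({len(e)} records)")
# With IMEE = 6 h, the 10 h 10 min gap from 03:50 to 14:00 starts a new
# event; with IMEE = 12 h, the same records would form a single event.
```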

Relevance:

10.00%

Publisher:

Abstract:

This thesis studies the use of argumentation as a discursive element in digital media, particularly blogs. We analyzed the blog "Fatos e Dados" [Facts and Data], created by Petrobras in the context of allegations of corruption that culminated in the installation of a Parliamentary Commission of Inquiry in the National Congress to investigate the company. We intend to understand the influence that the discursive elements triggered by argumentation exert on blogs and on the agenda-setting of themes. To this end, we work with notions of argumentation in dialogue with questions of language and discourse drawn from Charaudeau (2006), Citelli (2007), Perelman & Olbrechts-Tyteca (2005), Foucault (2007, 2008a), Bakhtin (2006), and Breton (2003). We also approach our subject from the perspective of social representations, seeking to clarify concepts such as public image and the use of representations as argumentative elements, based on the work of Moscovici (2007). We further consider reflections on hypertext and the context of cyberculture, through authors such as Levy (1993, 1999, 2003), Castells (2003), and Chartier (1999, 2002), and questions of discourse analysis, especially in Orlandi (1988, 1989, 1996, 2001) and Foucault (2008b). We surveyed the 118 posts published in the first 30 days of existence of the blog "Fatos e Dados" (between 2 June and 1 July 2009) and analyzed the top ten in detail. A corporate blog aims to defend the points of view and the public image of the organization and therefore uses elements of social representations to build its arguments. The blog adopts as its main news criterion, including in the posts we reviewed, the credibility of Petrobras as the source of the information; in the posts analyzed, the news values of novelty and relevance also arise. The controversy between the blog and the press resulted from the inadequacy and lack of preparation of the media in dealing with a corporate blog that was able to exploit the liberation of the emission pole characteristic of cyberculture. The blog is a discursive manifestation in a concrete historical situation, whose understanding and attribution of meaning take place through the social relations between subjects that, most of the time, place themselves in discursive and ideological dispute with one another; this dispute also affects the movements of reading and the production of readings. We conclude that the intersubjective relationships occurring in blogs change, through the argumentative techniques used, the notions of news criteria, interfering with the agenda-setting of news and the organization of information in digital media outlets. The influence that the discursive elements triggered by argumentation exert on digital media is also clear, resizing and reframing the frames of reality it conveys to subject-readers. Blogs became part of the information scenario with the emergence of the Internet and are able to interfere more effectively in the agenda-setting of the media through the conscious use of argumentative elements in their posts

Relevance:

10.00%

Publisher:

Abstract:

The following work interprets and analyzes the problem of induction from a standpoint founded on set theory and probability theory, as a basis for resolving its negative philosophical implications for systems of inductive logic in general. Given the importance of the problem and the relatively recent development of these fields of knowledge (early 20th century), as well as the visible relations between them and the process of inductive inference, a relatively unexplored and promising field of possibilities has been opened. The key point of the study consists in modeling the information acquisition process using concepts of set theory, followed by a treatment using probability theory. Throughout the study, two major obstacles to the probabilistic justification were identified: the problem of defining the concept of probability and that of defining rationality, as well as the subtle connection between the two. This finding called for greater care in choosing the criterion of rationality to be considered, so as to make the problem tractable through specific situations without losing its original characteristics, so that the conclusions can be extended to classic cases such as the question of the continuity of the sunrise
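The closing example admits the classic probabilistic treatment known as Laplace's rule of succession, sketched here purely as an illustration of the probabilistic machinery involved (the thesis's own formalization may differ):

```latex
% With a uniform prior on the unknown probability p of a sunrise, after
% n observed sunrises and no failures, the posterior predictive is
\[
  P(\text{sunrise on day } n+1 \mid n \text{ sunrises})
  = \frac{\int_0^1 p \cdot p^{\,n}\, dp}{\int_0^1 p^{\,n}\, dp}
  = \frac{n+1}{n+2}.
\]
```

This probability approaches 1 as n grows yet never reaches it, which is one way of framing the inductive question about the sunrise.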

Relevance:

10.00%

Publisher:

Abstract:

Introduction: The SPPB provides information about physical function and is a predictor of adverse events in the elderly. Frailty is a multidimensional syndrome that increases susceptibility to disease and disability; it may, however, be possible to prevent or postpone frailty if it is identified early. Our objective is to analyze the SPPB's ability to screen for frailty in community-dwelling young elderly people from cities with distinct socioeconomic conditions. Methods: Data came from community-dwelling adults aged 65-74 in Canada (Saint Bruno; n = 60) and Brazil (Santa Cruz; n = 64). The SPPB was used to assess physical performance. Frailty was defined as the presence of ≥ 3 of the following criteria: weight loss, exhaustion, weakness, mobility limitation, and low physical activity. One point was given for each criterion met, yielding a frailty score ranging from 0 to 5. Linear regression and receiver operating characteristic (ROC) analyses were performed to evaluate the SPPB's screening ability. Results: The mean age was 69.48 years; 10.0% of the Saint Bruno sample and 28.1% of the Santa Cruz sample were frail (p = 0.001), with mean SPPB scores of 9.6 and 8.5 respectively (p = 0.01). The SPPB correlated with the frailty score (R2 = 0.33), with better results for Saint Bruno. A cutoff of 9 on the SPPB had good sensitivity and specificity in discriminating frail from non-frail participants in Saint Bruno (AUC = 0.81) but fair results in Santa Cruz (AUC = 0.61). Conclusion: The SPPB has moderate ability to predict frailty in older adult populations and is a useful test for identifying people with good functionality and low frailty when SPPB scores are ≥ 9
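The ROC analysis reported above can be sketched as follows, with simulated scores in place of the study's data. Since a low SPPB indicates frailty, the score is negated before the curve is computed, and the cutoff here is chosen by Youden's J statistic (an assumption; the paper does not state its criterion).

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(1)
# Simulated SPPB scores: frail subjects tend to score lower (illustrative).
sppb = np.concatenate([rng.integers(4, 10, 18),     # frail
                       rng.integers(8, 13, 106)])   # non-frail
frail = np.concatenate([np.ones(18), np.zeros(106)])

auc = roc_auc_score(frail, -sppb)                   # negate: low SPPB = frail
fpr, tpr, thr = roc_curve(frail, -sppb)
best = (tpr - fpr).argmax()                         # Youden's J criterion
print(f"AUC = {auc:.2f}; best cutoff: SPPB <= {-thr[best]:.0f}")
```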