62 results for Optimization. Markov chain. Genetic algorithm. Fuzzy controller
Resumo:
In this work, the quantitative analysis of glucose, triglycerides and cholesterol (total and HDL) in both rat and human blood plasma was performed without any kind of sample pretreatment, using near infrared spectroscopy (NIR) combined with multivariate methods. For this purpose, different techniques and algorithms used to pre-process data, select variables and build multivariate regression models were compared with each other, such as partial least squares regression (PLS), nonlinear regression by artificial neural networks (ANN), interval partial least squares regression (iPLS), genetic algorithm (GA) and the successive projections algorithm (SPA), amongst others. For the determinations in rat blood plasma samples, the variable selection algorithms showed satisfactory results both for the correlation coefficients (R²) and for the values of root mean square error of prediction (RMSEP) for the three analytes, especially for triglycerides and cholesterol-HDL. The RMSEP values for glucose, triglycerides and cholesterol-HDL obtained with the best PLS model were 6.08, 16.07 and 2.03 mg dL-1, respectively. In the other case, for the determinations in human blood plasma, the predictions obtained by the PLS models were unsatisfactory, showing a nonlinear tendency and the presence of bias. ANN regression was then applied as an alternative to PLS, given its ability to model data from nonlinear systems. The root mean square errors of monitoring (RMSEM) for glucose, triglycerides and total cholesterol, for the best ANN models, were 13.20, 10.31 and 12.35 mg dL-1, respectively. Statistical tests (F and t) suggest that NIR spectroscopy combined with multivariate regression methods (PLS and ANN) is capable of quantifying these analytes (glucose, triglycerides and cholesterol) even when they are present in highly complex biological fluids such as blood plasma.
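As a point of reference for the kind of multivariate calibration described above, the sketch below fits a PLS model to placeholder spectra and computes RMSEP and R² on a prediction set. It is a minimal, assumption-laden example using scikit-learn; the arrays X and y stand in for pre-processed NIR spectra and reference concentrations and are not the thesis data.

# Minimal PLS calibration sketch (assumes scikit-learn and NumPy; X and y are placeholders).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 300))                              # placeholder spectra (120 samples x 300 wavelengths)
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=120)   # placeholder reference values

X_cal, X_pred, y_cal, y_ref = train_test_split(X, y, test_size=0.3, random_state=0)
pls = PLSRegression(n_components=5)      # number of latent variables is chosen by validation in practice
pls.fit(X_cal, y_cal)
y_hat = pls.predict(X_pred).ravel()

rmsep = np.sqrt(np.mean((y_hat - y_ref) ** 2))   # root mean square error of prediction
r2 = np.corrcoef(y_hat, y_ref)[0, 1] ** 2        # squared correlation coefficient
print(f"RMSEP = {rmsep:.3f}, R2 = {r2:.3f}")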
Resumo:
The aim of this study was to evaluate the potential of near-infrared reflectance spectroscopy (NIRS) as a rapid and non-destructive method to determine the soluble solids content (SSC), pH and titratable acidity of intact plums. Samples of plum with a total solids content ranging from 5.7 to 15%, pH from 2.72 to 3.84 and titratable acidity from 0.88 to 3.6% were collected from supermarkets in Natal, Brazil, and NIR spectra were acquired in the 714-2500 nm range. A comparison of several multivariate calibration techniques was performed with respect to several data pre-processing methods and variable selection algorithms, such as interval Partial Least Squares (iPLS), genetic algorithm (GA), successive projections algorithm (SPA) and ordered predictors selection (OPS). Validation models for SSC, pH and titratable acidity had correlation coefficients (R) of 0.95, 0.90 and 0.80, as well as root mean square errors of prediction (RMSEP) of 0.45 ºBrix, 0.07 and 0.40%, respectively. From these results, it can be concluded that NIR spectroscopy can be used as a non-destructive alternative for measuring the SSC, pH and titratable acidity of plums.
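The interval selection step can be pictured with the rough sketch below, written in the spirit of iPLS rather than as the exact algorithm used in the study: the spectrum is split into equal-width intervals, a PLS model is cross-validated on each, and the interval with the lowest error is kept. X and y are assumed placeholders for spectra and reference values.

# Interval-based variable selection sketch in the spirit of iPLS (illustrative only).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

def best_interval(X, y, n_intervals=10, n_components=3):
    """Return the (start, stop) column range whose local PLS model has the lowest CV RMSE."""
    edges = np.linspace(0, X.shape[1], n_intervals + 1, dtype=int)
    results = []
    for a, b in zip(edges[:-1], edges[1:]):
        pls = PLSRegression(n_components=min(n_components, b - a))
        mse = -cross_val_score(pls, X[:, a:b], y, cv=5,
                               scoring="neg_mean_squared_error").mean()
        results.append((np.sqrt(mse), (a, b)))
    return min(results)[1]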
Resumo:
Multi-classifier systems, also known as ensembles, have been widely used to solve several problems, because they often present better performance than the individual classifiers that form them. To achieve this, however, the base classifiers must be both accurate and diverse among themselves, a requirement known as the diversity/accuracy dilemma. Given its importance, some works have investigated ensemble behaviour in the context of this dilemma. However, the majority of them address homogeneous ensembles, i.e., ensembles composed of only one type of classifier. Motivated by this limitation, this thesis uses genetic algorithms to perform a detailed study of the diversity/accuracy dilemma for heterogeneous ensembles.
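One common way to make the diversity/accuracy dilemma operational, sketched below under assumed choices that are not necessarily the thesis's, is to score a candidate ensemble by a weighted combination of its majority-vote accuracy and a pairwise disagreement measure; such a score can then serve as the fitness of a genetic algorithm that searches over heterogeneous ensemble compositions.

# Illustrative diversity/accuracy fitness for a candidate ensemble (assumed formulation).
import numpy as np

def disagreement(preds):
    """Mean pairwise disagreement between classifier predictions (rows = classifiers)."""
    preds = np.asarray(preds)
    k = len(preds)
    pairs = [(i, j) for i in range(k) for j in range(i + 1, k)]
    return float(np.mean([np.mean(preds[i] != preds[j]) for i, j in pairs]))

def fitness(preds, y_true, alpha=0.7):
    """Weighted combination of majority-vote accuracy and diversity (labels are non-negative ints)."""
    preds = np.asarray(preds)
    votes = np.apply_along_axis(lambda c: np.bincount(c).argmax(), 0, preds)
    accuracy = float(np.mean(votes == y_true))
    return alpha * accuracy + (1 - alpha) * disagreement(preds)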
Resumo:
Classifier ensembles are systems composed of a set of individual classifiers and a combination module, which is responsible for providing the final output of the system. In the design of these systems, diversity is considered one of the main aspects to be taken into account, since there is no gain in combining identical classification methods. The ideal situation is a set of individual classifiers with uncorrelated errors; in other words, the individual classifiers should be diverse among themselves. One way of increasing diversity is to provide different datasets (patterns and/or attributes) to the individual classifiers. Diversity is increased because the individual classifiers perform the same task (classification of the same input patterns) but are built using different subsets of patterns and/or attributes. The majority of papers using feature selection for ensembles address homogeneous ensemble structures, i.e., ensembles composed of only one type of classifier. In this investigation, two genetic algorithm approaches (single- and multi-objective) are used to guide the distribution of features among the classifiers in the context of homogeneous and heterogeneous ensembles. The experiments are divided into two phases that use a filter approach to feature selection guided by a genetic algorithm.
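A feature-distribution chromosome of the kind a genetic algorithm could evolve here might be encoded as in the sketch below; the encoding is hypothetical and only illustrates how a flat bit string can assign subsets of attributes to each base classifier.

# Hypothetical binary chromosome assigning features to base classifiers.
import numpy as np

def decode(chromosome, n_classifiers, n_features):
    """Bit (c, f) == 1 means classifier c receives feature f."""
    mask = np.asarray(chromosome).reshape(n_classifiers, n_features).astype(bool)
    return [np.flatnonzero(row) for row in mask]   # list of feature-index arrays, one per classifier

rng = np.random.default_rng(1)
chrom = rng.integers(0, 2, size=3 * 8)             # 3 classifiers, 8 features
print(decode(chrom, 3, 8))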
Resumo:
In this work we study a new risk model, proposed by Yang (2003), for a firm that is sensitive to its credit quality. Recursive equations are obtained for the finite-time ruin probability and the distribution of the ruin time, as well as systems of Volterra-type integral equations for the ultimate ruin probability, the severity of ruin and the distribution of the surplus before and after ruin.
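Yang's credit-quality-sensitive model is not reproduced here; purely as a point of reference for the quantities involved, the sketch below estimates a finite-time ruin probability by simulation in the classical compound Poisson (Cramér-Lundberg) surplus process with exponential claims, with all parameters chosen for illustration.

# Monte Carlo estimate of a finite-time ruin probability in the classical
# Cramer-Lundberg model (illustrative reference, not Yang's credit-sensitive model).
import numpy as np

def ruin_probability(u, c, lam, claim_mean, T, n_sim=10_000, seed=0):
    """Estimate P(u + c*t - S(t) < 0 for some t <= T), Poisson arrivals, exponential claims."""
    rng = np.random.default_rng(seed)
    ruined = 0
    for _ in range(n_sim):
        t, claims = 0.0, 0.0
        while True:
            t += rng.exponential(1.0 / lam)        # next claim arrival
            if t > T:
                break
            claims += rng.exponential(claim_mean)  # claim size
            if u + c * t - claims < 0:             # ruin can only occur at claim instants
                ruined += 1
                break
    return ruined / n_sim

print(ruin_probability(u=10.0, c=1.2, lam=1.0, claim_mean=1.0, T=50.0))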
Resumo:
The on-line process control for attributes consists of inspecting a single item after every m produced items. If the examined item is conforming, production continues; otherwise, the process is stopped for adjustment. However, in many practical situations, the interest lies in monitoring the number of non-conformities among the examined items. In this case, if the number of non-conformities is higher than an upper control limit, the process needs to be stopped and adjusted. The contribution of this paper is to propose a control system based on the number of non-conformities of the inspected item. Employing properties of an ergodic Markov chain, an expression for the expected cost per item of the control system was obtained, and it is minimized with respect to two parameters: the sampling interval and the upper control limit on the number of non-conformities of the examined item. Numerical examples illustrate the proposed procedure.
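The abstract does not spell out the chain or the cost structure; the sketch below shows only the generic machinery involved, under assumed numbers: the stationary distribution of an ergodic chain is computed and used to weight per-state costs, which is the kind of expression one would then minimize over the sampling interval and the upper control limit.

# Generic sketch: stationary distribution of an ergodic Markov chain and a
# stationary-weighted expected cost per item (hypothetical chain and costs).
import numpy as np

def stationary(P):
    """Left eigenvector of P associated with eigenvalue 1, normalized to sum to 1."""
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    return pi / pi.sum()

P = np.array([[0.9, 0.1],                 # hypothetical two-state chain (in control / out of control)
              [0.3, 0.7]])
cost_per_state = np.array([0.05, 1.00])   # hypothetical per-item costs in each state
pi = stationary(P)
print("expected cost per item:", pi @ cost_per_state)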
Resumo:
In production lines, the entire process is subject to unexpected events that may degrade production quality and therefore cause losses to the manufacturer. Identifying such causes and removing them is the task of process management. The on-line control system consists of the periodic inspection of every m-th produced item. Once any of these items is qualified as nonconforming, it is assumed that a change in the conforming fraction of the process has occurred, and the process is then stopped for adjustment. This work is an extension of Quinino & Ho (2010) and has as its main objective the monitoring of a process through on-line quality control based on the number of non-conformities of the inspected item. The decision strategy used to verify whether the process is under control is directly associated with the limits of the control chart for the non-conformities of the process. A policy of preventive adjustments is incorporated in order to increase the conforming fraction of the process. With the help of the R software, a sensitivity analysis of the proposed model is carried out, showing in which situations it is most advantageous to execute the preventive adjustment.
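The thesis performs its sensitivity analysis in R; a rough Python counterpart, under entirely assumed parameters and a deliberately simplified shift model, is sketched below just to illustrate how the effect of the preventive adjustment policy on the conforming fraction could be probed by simulation.

# Toy simulation (assumed parameters and shift model) of inspecting every m-th item,
# with corrective and preventive adjustments; not the model proposed in the thesis.
import numpy as np

def simulate(m=20, ucl=3, lam_in=0.5, lam_out=3.0, shift_prob=0.01,
             preventive_every=10, n_items=100_000, seed=0):
    rng = np.random.default_rng(seed)
    in_control, inspections, items_in_control = True, 0, 0
    for i in range(1, n_items + 1):
        if in_control and rng.random() < shift_prob:    # random shift out of control
            in_control = False
        items_in_control += in_control
        if i % m == 0:                                  # inspect every m-th item
            inspections += 1
            count = rng.poisson(lam_in if in_control else lam_out)
            if count > ucl:                             # corrective adjustment
                in_control = True
            elif preventive_every and inspections % preventive_every == 0:
                in_control = True                       # preventive adjustment
    return items_in_control / n_items

print("fraction of items produced in control:", simulate())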
Resumo:
The use of behavioural indicators of suffering and welfare in captive animals has produced ambiguous results. In comparisons between groups, those in worse condition tend to exhibit an increased overall rate of Behaviours Potentially Indicative of Stress (BPIS), but when comparing within groups, individuals differ in their stress-coping strategies. This dissertation presents analyses to unravel the behavioural profile of a sample of 26 captive capuchin monkeys of three different species (Sapajus libidinosus, S. flavius and S. xanthosternos), kept in different enclosure types. In total, 147.17 hours of data were collected. We explored four types of analysis: activity budgets, diversity indexes, Markov chains and sequence analyses, and social network analyses, resulting in nine indexes of behavioural occurrence and organization. In Chapter One we explore group differences. The results support predictions of minor sex and species differences and major differences in behavioural profile due to enclosure type: i. individuals in less enriched enclosures exhibited a more diverse BPIS repertoire and a decreased probability of sequences of six Genus Normative Behaviours; ii. the number of most probable behavioural transitions including at least one BPIS was higher in less enriched enclosures; iii. prominence indexes indicate that BPIS function as dead ends of behavioural sequences, and the prominence of three BPIS (pacing, self-directed, active I) was higher in less enriched enclosures. Overall, these data do not support BPIS as a repetitive pattern with a mantra-like calming effect. Rather, the picture that emerges is more supportive of BPIS as activities that disrupt the organization of behaviour, introducing "noise" that compromises an optimal activity budget. In Chapter Two we explored individual differences in stress-coping strategies. We classified individuals along six axes of exploratory behaviour. These were only weakly correlated, indicating low correlation among behavioural indicators of syndromes. Nevertheless, the results are suggestive of two broad stress-coping strategies, similar to the bold/proactive and shy/reactive pattern: more exploratory capuchin monkeys exhibited increased prominence values for pacing, aberrant sexual display and active I BPIS, while less active animals exhibited an increased probability of significant sequences involving at least one BPIS and increased prominence of their own stereotypy. Capuchin monkeys are known for their cognitive capacities and behavioural flexibility; therefore, the search for a consistent set of behavioural indicators of welfare and individual differences requires further studies and larger data sets. With this work we aim to contribute to the design of scientifically grounded and statistically sound protocols for the collection of behavioural data that permit comparability of results and meta-analyses, whatever theoretical perspective their interpretation may receive.
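To make the Markov-chain step of the analysis concrete, the sketch below estimates a first-order transition matrix from a coded behaviour sequence; the behaviour codes and the example sequence are hypothetical and unrelated to the actual data set.

# First-order transition matrix from a coded behaviour sequence (hypothetical codes).
import numpy as np

def transition_matrix(sequence, states):
    idx = {s: i for i, s in enumerate(states)}
    counts = np.zeros((len(states), len(states)))
    for a, b in zip(sequence[:-1], sequence[1:]):
        counts[idx[a], idx[b]] += 1
    rows = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

states = ["forage", "rest", "social", "pacing"]        # hypothetical behaviour codes
seq = ["forage", "forage", "rest", "pacing", "pacing", "social", "forage"]
print(transition_matrix(seq, states))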
Resumo:
Oil exploration at great depths requires the use of mobile robots to perform various operations such as maintenance, assembly, etc. In this context, the study of trajectory planning and navigation for these robots is relevant, as the great challenge is to navigate in an environment that is not fully known. The main objective is to develop a navigation algorithm to plan the path of a mobile robot that is in a given position (
Resumo:
This work presents a hybrid approach to the supplier selection problem in Supply Chain Management. We combined the decision-making perspectives of business-school researchers and engineering researchers in order to address the problem more comprehensively. We used traditional multicriteria decision-making methods, such as AHP and TOPSIS, to evaluate alternatives according to the decision maker's preferences. Both techniques were modelled using definitions from Fuzzy Set Theory to deal with imprecise data. Additionally, we proposed a multi-objective GRASP algorithm to perform an order allocation procedure among the pre-selected alternatives. These alternatives must be pre-qualified on the basis of the AHP and TOPSIS methods before entering the restricted candidate list (LCR). Our allocation procedure presented low CPU times for five pseudorandom instances containing up to 1000 alternatives, as well as good values for all considered objectives. We therefore consider the proposed model appropriate for solving the supplier selection problem in the SCM context. It can be used to help decision makers reduce lead times, costs and risks in their supply chains. According to decision makers, the proposed model can also improve the firm's efficiency with respect to business strategies, even when a large number of alternatives must be considered, unlike classical models in the purchasing literature.
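The order-allocation step builds on GRASP; the sketch below is a generic single-objective GRASP skeleton (greedy randomized construction with a restricted candidate list followed by a swap-based local search) for picking k alternatives under a placeholder score, and does not reproduce the multi-objective procedure or the fuzzy AHP/TOPSIS scores of the thesis.

# Generic single-objective GRASP skeleton for selecting k alternatives
# (placeholder scores; the thesis's multi-objective allocation is not reproduced).
import random

def grasp_select(candidates, cost, k, iters=200, alpha=0.3, seed=0):
    random.seed(seed)
    best, best_cost = None, float("inf")
    for _ in range(iters):
        # Greedy randomized construction using a restricted candidate list (RCL).
        remaining, solution = sorted(candidates, key=cost), []
        while len(solution) < k:
            c_min, c_max = cost(remaining[0]), cost(remaining[-1])
            rcl = [c for c in remaining if cost(c) <= c_min + alpha * (c_max - c_min)]
            pick = random.choice(rcl)
            solution.append(pick)
            remaining.remove(pick)
        # Local search: swap a selected alternative for a cheaper unselected one.
        improved = True
        while improved:
            improved = False
            for i, s in enumerate(solution):
                for r in remaining:
                    if cost(r) < cost(s):
                        solution[i], remaining[remaining.index(r)] = r, s
                        improved = True
                        break
                if improved:
                    break
        total = sum(map(cost, solution))
        if total < best_cost:
            best, best_cost = list(solution), total
    return best, best_cost

suppliers = {"A": 3.2, "B": 1.5, "C": 2.7, "D": 4.1, "E": 2.0}   # hypothetical scores (lower is better)
print(grasp_select(list(suppliers), suppliers.get, k=3))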
Resumo:
This work develops a methodology for defining the maximum active power that can be injected at predefined nodes of the studied distribution networks, considering the possibility of multiple accesses of generating units. These maximum values are obtained from an optimization study in which the resulting losses must not exceed those of the base case, i.e., without the presence of distributed generation, while respecting the constraints on branch loading and system voltages. To tackle the problem, an algorithm based on the numerical method known as particle swarm optimization is proposed and applied to conventional AC load flow and optimal load flow studies for maximizing the penetration of distributed generation. Alternatively, the Newton-Raphson method was incorporated for the solution of the load flow. The computer program was implemented in the SCILAB software. The proposed algorithm is tested with data from the 14-node IEEE network and from a 25-node, 69 kV network from the state of Rio Grande do Norte. The algorithm defines allowed values of nominal active power of distributed generation, in percentage terms relative to the demand of the network, from reference values.
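A minimal particle swarm optimization loop of the kind the algorithm builds on is sketched below, with a placeholder penalized objective; the actual load-flow evaluation and the constraints on losses, branch loading and voltages are not reproduced here.

# Minimal PSO sketch (placeholder objective; the load-flow based evaluation is not reproduced).
import numpy as np

def pso(objective, dim, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_particles, dim))        # particle positions
    v = np.zeros_like(x)                                    # particle velocities
    pbest, pbest_val = x.copy(), np.apply_along_axis(objective, 1, x)
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.apply_along_axis(objective, 1, x)
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, objective(gbest)

# Hypothetical usage: maximize total injected power (minimize its negative) with a
# soft penalty when the total exceeds an assumed limit standing in for the loss constraint.
limit = 5.0
obj = lambda p: -p.sum() + 1000.0 * max(0.0, p.sum() - limit)
print(pso(obj, dim=3, bounds=(0.0, 4.0)))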
Resumo:
This work proposes and evaluates a modification to Ant Colony Optimization based on the results of experiments performed on the Selective Ride Robot problem (PRS), a new problem also proposed in this work. Four metaheuristics are implemented (GRASP, VNS and two versions of Ant Colony Optimization), and their results are analyzed by running the algorithms on 32 instances created during this work. The metaheuristics also have their results compared to an exact approach. The results show that the algorithm implemented with the GRASP metaheuristic performs well. The multi-colony version of the ant colony algorithm, proposed and evaluated in this work, shows the best results.
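For reference, a bare-bones single-colony ant colony routine on a small symmetric tour problem is sketched below; the multi-colony variant proposed in the work and the PRS formulation itself are not reproduced, and the distance matrix is a placeholder.

# Bare-bones single-colony ACO on a small symmetric tour problem (illustrative only;
# the multi-colony variant and the PRS formulation are not reproduced).
import numpy as np

def aco_tour(dist, n_ants=10, iters=100, alpha=1.0, beta=2.0, rho=0.5, seed=0):
    rng = np.random.default_rng(seed)
    n = len(dist)
    tau = np.ones((n, n))                          # pheromone trails
    eta = 1.0 / (dist + np.eye(n))                 # heuristic visibility (diagonal padded to avoid /0)
    best_tour, best_len = None, np.inf
    for _ in range(iters):
        tours = []
        for _ in range(n_ants):
            tour = [int(rng.integers(n))]
            while len(tour) < n:
                i = tour[-1]
                mask = np.ones(n, dtype=bool)
                mask[tour] = False
                p = (tau[i] ** alpha) * (eta[i] ** beta) * mask
                tour.append(int(rng.choice(n, p=p / p.sum())))
            length = sum(dist[a, b] for a, b in zip(tour, tour[1:] + tour[:1]))
            tours.append((length, tour))
            if length < best_len:
                best_len, best_tour = length, tour
        tau *= (1 - rho)                           # evaporation
        for length, tour in tours:                 # deposit proportional to tour quality
            for a, b in zip(tour, tour[1:] + tour[:1]):
                tau[a, b] += 1.0 / length
    return best_tour, best_len

D = np.array([[0, 2, 9, 10], [2, 0, 6, 4], [9, 6, 0, 3], [10, 4, 3, 0]], dtype=float)
print(aco_tour(D))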
Resumo:
This work addresses the optimization problem in high dose rate brachytherapy for the treatment of cancer patients, aiming at the definition of the set of dwell times. The solution technique adopted was Computational Transgenetics supported by the L-BFGS method. The developed algorithm was employed to generate non-dominated solutions whose dose distributions were capable of eliminating the cancer while at the same time preserving the normal regions.
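The transgenetic algorithm itself is not reproduced here; the sketch below only illustrates the L-BFGS refinement step, using the bounded variant L-BFGS-B from SciPy so that dwell times stay non-negative, with a placeholder dose-rate matrix and prescription in place of the real treatment-planning data.

# Refining dwell times with L-BFGS-B against a simple quadratic dose-deviation objective
# (placeholder dose-rate matrix A and prescription; not the thesis's objective).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
A = rng.uniform(0.0, 1.0, size=(50, 8))     # dose contribution of 8 dwell positions at 50 points
d_presc = np.full(50, 10.0)                 # prescribed dose at each calculation point

def objective(t):
    r = A @ t - d_presc                     # dose deviation
    return 0.5 * float(r @ r)

def gradient(t):
    return A.T @ (A @ t - d_presc)

t0 = np.ones(8)                             # e.g., a solution produced by the transgenetic search
res = minimize(objective, t0, jac=gradient, method="L-BFGS-B",
               bounds=[(0.0, None)] * 8)
print("refined dwell times:", np.round(res.x, 3))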