984 results for "métodos formais" (formal methods)


Relevance:

20.00%

Publisher:

Abstract:

In this work we developed a spline-based method for solving initial value problems involving ordinary differential equations, with emphasis on linear equations. The method can be seen as an alternative to traditional solvers such as Runge-Kutta, and avoids root calculations in the linear time-invariant case. The method is then applied to a central problem of control theory, namely the step response problem for linear ODEs with possibly varying coefficients, where root calculations do not apply. We implemented an efficient algorithm that uses exclusively matrix-vector operations. The working interval (up to the settling time) was determined by computing the least stable mode with a modified power method. Several variants of the method were compared by simulation. For general linear problems on a fine grid, the proposed method compares favorably with the Euler method. In the time-invariant case, where the alternative is root calculation, we have indications that the proposed method is competitive for equations of sufficiently high order.
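The least-stable-mode computation mentioned in the abstract can be illustrated with a small sketch. The thesis's "modified power method" is not specified here, so this is an assumption on my part: plain power iteration on a forward-Euler one-step map, from which the slowest decay rate, and hence a settling-time estimate, is recovered.

```python
import numpy as np

def settling_time_estimate(A, h=1e-3, iters=2000):
    """Estimate the 2% settling time of a stable LTI system x' = A x.
    Power iteration on the forward-Euler one-step map M = I + h*A finds
    its dominant eigenvalue, which corresponds to the least stable
    (slowest-decaying) mode of A."""
    M = np.eye(A.shape[0]) + h * A
    v = np.ones(A.shape[0])
    mu = 1.0
    for _ in range(iters):
        w = M @ v
        mu = np.linalg.norm(w) / np.linalg.norm(v)  # |dominant eigenvalue| of M
        v = w / np.linalg.norm(w)
    re_lambda = (mu - 1.0) / h   # mu ~ |1 + h*lambda| for a slow real mode
    return 4.0 / abs(re_lambda)  # 2% settling time ~ 4 / |Re(lambda)|
```

For A = diag(-1, -10), the slowest mode decays as e^(-t), so the estimate lands near 4 time units.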

Relevance:

20.00%

Publisher:

Abstract:

Oral and facial bone defects can compromise the appearance, psychosocial well-being and stomatognathic function of patients. Over the years, several strategies for bone defect regeneration have arisen to treat these pathologies, among them the use of frozen and irradiated bone allografts. The manipulation of bone grafts has not yet been standardized, and several osteotomy alternatives can be observed. The present work evaluated microscopically the bone fragments obtained by different osteotomy and irrigation methods on ring and block allografts frozen at −80 °C and irradiated, in a rabbit model. The study is experimental, in vitro, and its sample was an adult male New Zealand rabbit. The animal was sacrificed to obtain long bones, which were frozen at −80 °C and irradiated with cobalt-60. The long bones were then sectioned into 24 bone pieces, divided into 4 groups: in G1 (n = 6), osteotomy was performed with a No. 6 bur, forming 5 mm thick rings, using a high-speed handpiece with manual irrigation; in G2 (n = 6), osteotomy was performed with a No. 6 bur, forming 5 mm thick rings, using a surgical motor at 1500 rpm with manual irrigation; in GA (n = 6), osteotomy was performed with a trephine using manual saline irrigation; and in GB (n = 6), osteotomy was performed with a trephine using saline delivered by the peristaltic pump of the surgical motor. Five bone pieces from each group were prepared for analysis by light microscopy (LM) and one by scanning electron microscopy (SEM). In the SEM analysis, the edge surfaces and the presence of microcracks and smear layer were evaluated. SEM analysis of the osteotomy techniques showed an increased presence of microcracks when cutting with the high-speed handpiece, and an increased presence of areas covered by smear layer when cutting with the implant motor. SEM analysis of irrigation showed that the presence of microcracks does not depend on the type of irrigation, and that with manual irrigation there was greater discrepancy between the cutting lines.
The descriptive analysis of the osteotomy and irrigation process by LM showed bony margins with a clearly altered tissue layer, composed of blackened tissue of charred appearance near the cortical bone; on the edges of the bone pieces, bone fragments displaced during the cut and bone irregularities were observed. After analysis of the results, we can conclude that the bone cut was more regular with the high-speed handpiece than with the implant motor; that the trephine cut with saline delivered by the peristaltic pump of the surgical motor showed greater homogeneity than the cut with manual irrigation; and that charred tissue was found in all bone samples, with no statistically significant difference in the proportion of carbonization between the two techniques.

Relevance:

20.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

20.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

20.00%

Publisher:

Abstract:

With rapid technological growth and the need for improvement, ceramics have gained wide use in mechanical engineering, since they offer significant physico-chemical advantages and mechanical properties over steel. However, their machining is a difficult and delicate process that still demands considerable study. Grinding is one of the methods that has shown good results, but a major problem with this process is the excessive use of cutting fluids, which has become a worldwide concern, since these fluids cause serious social and environmental problems; moreover, cutting fluid accounts for a large share of the final cost of the process. This has motivated research into alternative methods that reduce consumption and improve the characteristics of the cutting fluid used. This work compares two lubri-cooling techniques: the conventional method and optimized lubrication. The optimized method is an alternative for reducing the volume of fluid used, since its principle is the locally applied delivery of a smaller quantity of cutting fluid at high velocity; this reduction yields environmental and socioeconomic benefits. The analysis is based on output variables of the surface grinding process, such as surface roughness and grinding wheel wear, since these allow the process to be evaluated in terms of part quality versus cost. With these analyses, we intend to assess whether the optimized technique is a viable replacement for conventional cooling in the surface grinding of ceramics.
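Surface roughness, one of the output variables of the grinding process mentioned above, is commonly summarized by the arithmetic mean roughness Ra. A minimal sketch (the specific roughness metric and instrument used in the study are not stated in the abstract):

```python
def roughness_ra(profile):
    """Arithmetic mean roughness Ra: the mean absolute deviation of the
    measured surface profile (heights, e.g. in micrometres) from its
    mean line."""
    mean = sum(profile) / len(profile)
    return sum(abs(z - mean) for z in profile) / len(profile)
```

A perfectly flat profile gives Ra = 0; larger Ra indicates a rougher ground surface.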

Relevance:

20.00%

Publisher:

Abstract:

We analyzed the quality of raw milk from eight dairy farms in Rio Grande do Norte stored in cooling tanks, in order to evaluate methods for determining somatic cell counts (SCC). The Somaticell® kit and a portable Direct Cell Counter (DCC) were compared with each other and with the MilkoScan™ FT+ (FOSS, Denmark), which uses Fourier-transform infrared (FTIR) spectroscopy. Counts were converted to somatic cell scores (log-transformed somatic cell counts) and analyzed with the SAS® statistical package (Statistical Analysis System; SAS Institute, 1998). Comparison of means and correlation of somatic cell scores were conducted using Pearson's correlation coefficient and the Tukey test at 1%. No significant difference was observed in the comparison of means. The correlations between somatic cell scores were significant: 0.907 and 0.876 between the MilkoScan™ FT+ and, respectively, the Somaticell® kit and the DCC, and 0.943 between the Somaticell® kit and the DCC. The methods can be recommended for monitoring the quality of raw milk kept in cooling tanks at the production unit.
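The two computations named above, the log-transformed somatic cell score and Pearson's correlation between score series, can be sketched as follows. The exact log transform used in the study is not given in the abstract, so the base-10 logarithm here is an assumption:

```python
import math

def somatic_cell_score(scc):
    """Log-transform a somatic cell count (cells/mL) into a somatic cell
    score. Assumption: plain log10; the study's exact transform is not
    stated in the abstract."""
    return math.log10(scc)

def pearson(x, y):
    """Pearson correlation coefficient between two lists of scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / (sx * sy)
```

Log-transforming before correlating, as the study does, tames the strong right skew of raw SCC values.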

Relevance:

20.00%

Publisher:

Abstract:

This work investigates the use of process FMEA (Failure Mode and Effect Analysis), exposing irregularities in its application. The AHP (Analytic Hierarchy Process) method and fuzzy sets are applied to study current FMEA practices. AHP is used to prioritize the irregularities according to the severity of their occurrence. Fuzzy sets are used to evaluate the performance of FMEA use in several companies in the automotive sector. As a result, eight of the eleven FMEA forms examined were accepted and three were not.
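The AHP prioritization step works by extracting a priority vector from a pairwise-comparison matrix. A minimal sketch using power iteration for the principal eigenvector (the matrix values below are illustrative, not the study's judgments):

```python
import numpy as np

def ahp_priorities(pairwise, iters=100):
    """Priority vector of an AHP pairwise-comparison matrix: the principal
    eigenvector, computed by power iteration and normalized to sum to 1."""
    w = np.ones(pairwise.shape[0])
    for _ in range(iters):
        w = pairwise @ w
        w /= w.sum()
    return w
```

For a perfectly consistent matrix (entry i,j equals w_i/w_j), the recovered priorities match the underlying weights exactly.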

Relevance:

20.00%

Publisher:

Abstract:

In this work, the quantitative analysis of glucose, triglycerides and cholesterol (total and HDL) in both rat and human blood plasma was performed without any pretreatment of the samples, using near-infrared (NIR) spectroscopy combined with multivariate methods. For this purpose, different techniques and algorithms for pre-processing data, selecting variables and building multivariate regression models were compared, such as partial least squares regression (PLS), non-linear regression by artificial neural networks (ANN), interval partial least squares regression (iPLS), the genetic algorithm (GA) and the successive projections algorithm (SPA), among others. For the rat blood plasma samples, the variable selection algorithms showed satisfactory results both for the correlation coefficients (R²) and for the root mean square error of prediction (RMSEP) for the three analytes, especially for triglycerides and cholesterol-HDL. The RMSEP values for glucose, triglycerides and cholesterol-HDL obtained with the best PLS model were 6.08, 16.07 and 2.03 mg dL⁻¹, respectively. For the determinations in human blood plasma, the predictions obtained by the PLS models were unsatisfactory, with a non-linear tendency and the presence of bias. ANN regression was then applied as an alternative to PLS, given its ability to model data from non-linear systems. The root mean square errors of monitoring (RMSEM) for glucose, triglycerides and total cholesterol with the best ANN models were 13.20, 10.31 and 12.35 mg dL⁻¹, respectively. Statistical tests (F and t) suggest that NIR spectroscopy combined with multivariate regression methods (PLS and ANN) can quantify these analytes even in highly complex biological fluids such as blood plasma.
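The RMSEP figure of merit used above to compare calibration models is simply the root mean square of the prediction residuals on an independent prediction set:

```python
import math

def rmsep(y_true, y_pred):
    """Root mean square error of prediction: sqrt(mean((y_true - y_pred)^2)),
    reported in the same units as the analyte concentration (here mg/dL)."""
    residuals = [(t - p) ** 2 for t, p in zip(y_true, y_pred)]
    return math.sqrt(sum(residuals) / len(residuals))
```

The RMSEM reported for the ANN models is the same formula evaluated on the monitoring (validation) subset.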

Relevance:

20.00%

Publisher:

Abstract:

Soil contamination by pesticides is an environmental problem that needs to be monitored and avoided. However, the lack of fast, accurate and low-cost analytical methods for detecting pesticide residues in complex matrices such as soil remains unresolved, and must be solved before the quality of environmental samples can be assessed. The intensive use of pesticides has increased since the 1960s because of the dependence on their use, causing biological imbalances and promoting resistance and the resurgence of high populations of pests and pathogens, which has contributed to the appearance of new pests previously under natural control. Developing analytical methods capable of quantifying pesticide residues in complex environments is still a challenge for many laboratories. The integration of two analytical methods, one ecotoxicological and one chemical, demonstrates their potential for the environmental analysis of methamidophos. The aim of this study was to evaluate an ecotoxicological method as an analytical screening for methamidophos in soil and to confirm the concentration of the analyte in the samples by a chemical LC-MS/MS method. Two soils were tested, one clayey and one sandy; for both, the sorption kinetics of methamidophos followed a pseudo-second-order model. The clay soil showed higher sorption of methamidophos and followed the Freundlich model, while the sandy soil followed the Langmuir model. The LC-MS/MS chemical method was validated satisfactorily, with adequate linearity, range, precision, accuracy and sensitivity. In chronic ecotoxicological tests with C. dubia, the NOEC was 4.93 and 3.24 ng L⁻¹ of methamidophos for elutriate assays of the sandy and clay soils, respectively. The ecotoxicological method was more sensitive than LC-MS/MS for detecting methamidophos in both clayey and sandy soils.
However, by decreasing the concentration of the analytical standard for methamidophos and adjusting the validation conditions, the chemical method achieves a limit of quantification (LOQ) in ng L⁻¹, consistent with the ecotoxicological test. The methods described can be used as analytical tools for methamidophos in soil, with the ecotoxicological analysis serving as screening and LC-MS/MS as confirmatory analysis of the analyte, fulfilling the objectives of this work.
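The three sorption models named above have standard closed forms; a sketch for reference (the parameter values used in the tests are illustrative, not the study's fitted values):

```python
def pseudo_second_order(t, qe, k):
    """Pseudo-second-order sorption kinetics: q(t) = k*qe^2*t / (1 + k*qe*t).
    The sorbed amount q approaches the equilibrium uptake qe as t grows."""
    return k * qe ** 2 * t / (1.0 + k * qe * t)

def freundlich(c, kf, n):
    """Freundlich isotherm (followed by the clay soil): q = Kf * C**(1/n)."""
    return kf * c ** (1.0 / n)

def langmuir(c, qmax, kl):
    """Langmuir isotherm (followed by the sandy soil):
    q = qmax * Kl * C / (1 + Kl * C), saturating at qmax."""
    return qmax * kl * c / (1.0 + kl * c)
```

In practice the kinetic and isotherm parameters are obtained by fitting these forms (often via their linearized versions) to batch sorption data.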

Relevance:

20.00%

Publisher:

Abstract:

The use of intelligent agents in multi-classifier systems arose in order to turn the centralized decision process of a multi-classifier system into a distributed, flexible and incremental one. Based on this, the NeurAge (Neural Agents) system (Abreu et al. 2004) was proposed. This system outperforms some combination-centered methods (Abreu, Canuto, and Santana 2005). Negotiation is important for multi-agent system performance, but most negotiations are defined informally. One way to formalize the negotiation process is through an ontology. In the context of classification tasks, an ontology provides a way to formalize the concepts and the rules that govern the relations between these concepts. This work aims at using ontologies to give a formal description of the negotiation methods of a multi-agent system for classification tasks, more specifically the NeurAge system. Through ontologies, we intend to make the NeurAge system more formal and open, allowing new agents to join the system during negotiation. To this end, the NeurAge system is first studied with respect to its functioning and, mainly, its negotiation methods. After that, some negotiation ontologies found in the literature are studied, and those chosen for this work are adapted to the negotiation methods used in NeurAge.
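As a toy illustration only (the actual NeurAge negotiation methods are precisely what the ontologies are meant to formalize, and their details are not given in the abstract), one simple negotiation round among classifier agents is confidence-based: each agent proposes a class label with a confidence, and the group adopts the most confident proposal.

```python
def negotiate(proposals):
    """One confidence-based negotiation round: each agent submits a
    (class_label, confidence) pair; the group adopts the label proposed
    by the most confident agent. A simplified stand-in, not the NeurAge
    protocol itself."""
    label, _confidence = max(proposals, key=lambda p: p[1])
    return label
```

An ontology would make explicit the concepts used here implicitly: agent, proposal, confidence, and the winner-selection rule.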

Relevance:

20.00%

Publisher:

Abstract:

The objective of research in artificial intelligence is to enable the computer to execute functions that are performed by humans using knowledge and reasoning. This work was developed in the area of machine learning, the branch of artificial intelligence concerned with the design and development of algorithms and techniques that allow computational learning. The objective of this work is to analyze a feature selection method for ensemble systems. The proposed method belongs to the filter approach to feature selection: it uses variance and Spearman correlation to rank the features, and reward and punishment strategies to measure each feature's importance for identifying the classes. For each ensemble, several different configurations were used, varying from hybrid (homogeneous) to non-hybrid (heterogeneous) ensemble structures. They were submitted to five combination methods (voting, sum, weighted sum, multilayer perceptron and naïve Bayes), applied to six distinct databases (real and artificial). The classifiers applied in the experiments were k-nearest neighbors, multilayer perceptron, naïve Bayes and decision tree. Finally, the performance of the ensembles was analyzed comparatively: with no feature selection method, with an original filter-approach feature selection method, and with the proposed method. For this comparison, a statistical test was applied, which demonstrated a significant improvement in the precision of the ensembles.
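The first of the two ranking criteria named above, variance, can be sketched as a simple filter step (the combination with Spearman correlation and the reward/punishment weighting are the thesis's contribution and are not reproduced here):

```python
import statistics

def rank_features_by_variance(columns):
    """Rank feature indices by sample variance, highest first.
    `columns` is a list of feature columns (one list of values per
    feature); low-variance features carry little class information
    and are ranked last."""
    variances = [statistics.variance(col) for col in columns]
    return sorted(range(len(columns)),
                  key=lambda i: variances[i], reverse=True)
```

A constant feature gets variance 0 and falls to the bottom of the ranking, which is exactly what a filter method should do before any classifier is trained.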

Relevance:

20.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

20.00%

Publisher:

Abstract:

The activity of requirements engineering is seen in agile methods as a bureaucratic activity that makes the process less agile. However, the lack of documentation in agile development environments is identified as one of the main challenges of the methodology. Thus, there is a contradiction between what the agile methodology claims and what occurs in real environments. For example, in agile methods user stories are widely used to describe requirements, but this way of describing requirements is still not enough, because a user story is too narrow an artifact to represent and detail requirements. Activities such as verifying the software context and the dependencies between stories are also limited when only this artifact is used. In requirements engineering there are goal-oriented approaches that benefit requirements documentation, including completeness of requirements, analysis of alternatives, and support for the rationalization of requirements. Among these approaches, the i* modeling technique stands out, providing a graphical view of the actors involved in the system and their dependencies. This work proposes an additional resource that aims to reduce this lack of documentation in agile methods. The objective is to provide a graphical view of software requirements and their relationships through i* models, thus enriching the requirements in agile methods. To do so, we propose a set of heuristics for mapping requirements expressed as user stories into i* models. These models can then be used as a form of documentation in agile environments, because through the mapping the requirements are viewed more broadly, with their proper relationships according to the business environment they will serve.
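One heuristic in the spirit of the mapping described above can be sketched as follows. This is a simplification of my own, not the thesis's heuristic set: it parses only the canonical user-story template, taking the role as an i* actor, the wish as a goal, and the rationale as a softgoal.

```python
import re

# Canonical template: "As a <role>, I want <wish> so that <rationale>."
STORY = re.compile(
    r"As an? (?P<actor>.+?), I want (?P<goal>.+?)"
    r"(?: so that (?P<why>.+?))?\.?$")

def story_to_istar(story):
    """Map a templated user story onto i* elements (simplified heuristic):
    role -> actor, wish -> goal, rationale -> softgoal (or None).
    Returns None if the story does not follow the template."""
    m = STORY.match(story.strip())
    if m is None:
        return None
    return {"actor": m.group("actor"),
            "goal": m.group("goal"),
            "softgoal": m.group("why")}
```

Stories that do not fit the template fall through to manual modeling, which is where the richer heuristics of the thesis would apply.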

Relevance:

20.00%

Publisher:

Abstract:

Data clustering is applied in various fields, such as data mining, image processing and pattern recognition. A clustering algorithm splits a data set into clusters such that elements within the same cluster have a high degree of similarity, while elements belonging to different clusters have a high degree of dissimilarity. Fuzzy C-Means (FCM) is the fuzzy clustering algorithm most used and discussed in the literature. The performance of FCM is strongly affected by the selection of the initial cluster centers, so the choice of a good set of initial centers is very important; in FCM, however, the initial centers are chosen randomly, making it difficult to find a good set. This work proposes three new methods for obtaining initial cluster centers deterministically for the FCM algorithm; they can also be used in variants of FCM, and here they were applied to the ckMeans variant. With the proposed methods, we intend to obtain a set of initial centers close to the real cluster centers, thereby reducing the number of iterations needed for these algorithms to converge, and the processing time, without degrading the quality of the clustering (in some cases even improving it). Accordingly, cluster validation indices were used to measure the quality of the clusters obtained by the modified FCM and ckMeans algorithms with the proposed initialization methods when applied to various data sets.
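The FCM loop that the proposed deterministic initializations feed can be illustrated by its standard membership-update step (shown here for illustration; the three initialization methods themselves are the thesis's contribution and are not reproduced):

```python
import numpy as np

def fcm_memberships(X, centers, m=2.0):
    """Fuzzy C-Means membership update:
    u_ik = 1 / sum_j (d_ik / d_ij)**(2/(m-1)),
    where d_ik is the distance from sample i to center k. Given a set of
    (deterministically chosen) initial centers, this is the first step
    of the FCM iteration."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)  # (n, c)
    d = np.fmax(d, 1e-12)                  # guard against zero distances
    ratio = d[:, :, None] / d[:, None, :]  # ratio[i, k, j] = d_ik / d_ij
    return 1.0 / (ratio ** (2.0 / (m - 1.0))).sum(axis=2)
```

Each row of the returned matrix sums to 1; good initial centers make these memberships sharp from the first iteration, which is how the proposed methods cut the iteration count.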

Relevance:

20.00%

Publisher:

Abstract:

Clinical evaluation of the lower limbs in venous insufficiency does not by itself identify the systems involved or the anatomical levels, so complementary exams are necessary. These exams can be invasive or non-invasive. The invasive ones, such as phlebography and ambulatory venous pressure measurement, despite their good accuracy, cause discomfort and complications. The main non-invasive exams are continuous-wave Doppler ultrasound, photoplethysmography, air plethysmography and duplex scanning. Doppler ultrasound evaluates blood flow velocity indirectly. Photoplethysmography evaluates the venous refill time, providing an objective parameter for quantifying venous reflux. Air plethysmography quantifies any reduction in capacitance, the reflux, and the performance of the calf muscle pump. Duplex scanning is considered the gold standard among the non-invasive exams, because it allows quantitative and qualitative evaluation, providing anatomical and functional information and giving the most complete and detailed assessment of the deep and superficial venous systems.