898 results for MÉTODOS DE TRABALHO
Abstract:
Oral and facial bone defects can compromise the appearance, psychosocial well-being and stomatognathic function of affected patients. Over the years, several strategies for bone defect regeneration have arisen to treat these pathologies, among them the use of frozen and irradiated bone allografts. The manipulation of bone grafts has not yet been standardized, and several osteotomy alternatives can be observed. The present work evaluated, under the microscope, the bone fragments obtained by different osteotomy and irrigation methods on ring and block allografts irradiated and frozen at -80 °C in a rabbit model. The study is experimental, in vitro, and its sample was an adult male New Zealand rabbit. The animal was sacrificed to obtain the long bones, which were frozen at -80 °C and irradiated with Cobalt-60. The long bones were then sectioned into 24 bone pieces, divided into 4 groups: G1 (n=06), osteotomy performed with a No. 6 bur forming 5 mm thick rings with a high-speed handpiece under manual irrigation; G2 (n=06), osteotomy performed with a No. 6 bur forming 5 mm thick rings with a surgical motor at 1500 rpm under manual irrigation; GA (n=06), osteotomy with a trephine using manual irrigation with saline; and GB (n=06), osteotomy with a trephine using saline delivered by the peristaltic pump of the surgical motor. Five bone pieces from each group were prepared for analysis by light microscopy (LM) and one by scanning electron microscopy (SEM). In the SEM analysis, the edge surfaces, the presence of microcracks and the smear layer were evaluated. Analyzing the osteotomy techniques by SEM, an increased presence of microcracks was observed when cutting with the high-speed handpiece, and an increased presence of areas covered by smear layer when cutting with the implant motor. In the SEM analysis of irrigation, it was observed that the presence of microcracks does not depend on the type of irrigation, and that with manual irrigation there was greater discrepancy between the cutting lines. The descriptive analysis of the osteotomy and irrigation process by LM showed bony margins with a clearly altered tissue layer, composed of blackened tissue of charred appearance near the cortical bone; at the edges of the bone pieces, bone fragments displaced during the cut and bone irregularities were observed. After analysis of the results we can conclude that the bone cut was more regular with the high-speed handpiece than with the implant motor; that the trephine cut with saline delivered by the peristaltic pump of the surgical motor showed greater homogeneity than the cut with manual irrigation; and that charred tissue was found in all bone samples obtained, with no statistically significant difference in the proportion of carbonization between the two analyzed techniques.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
With rapid technological growth and the need for improvement, one of the materials that has gained wide use in mechanical engineering is ceramics, since it offers significant physicochemical advantages and mechanical properties over steel. However, its machining is a difficult and delicate process that still requires considerable attention in research. The grinding process is one of the methods that has shown good results, but a major problem with this process is the excessive use of cutting fluids, which has become a worldwide concern, since these fluids cause serious social and environmental problems; in addition, the cutting fluid accounts for a large share of the final cost of the process. This has motivated great interest in research into alternative methods that reduce fluid consumption and improve the characteristics of the cutting fluid used. This work aims to compare two lubri-cooling techniques: the conventional method and Optimized Lubrication. The optimized method is an alternative for reducing the volume of fluid used, since its principle is the locally applied delivery of a smaller quantity of cutting fluid at high velocity; with this reduction, environmental and socioeconomic benefits are obtained. The analysis will be based on the evaluation of output variables of the surface grinding process, such as surface roughness behavior and grinding wheel wear, since these allow the process to be assessed in terms of part quality versus cost. With these analyses, we intend to evaluate whether the optimized technique is a viable replacement for conventional cooling in the surface grinding of ceramics.
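For reference only, a standard metrology definition of the arithmetic mean roughness Ra, one of the output variables mentioned in the abstract above; this is the usual textbook formula, not an expression taken from the thesis.

```latex
% Arithmetic mean roughness Ra over an evaluation length l (standard definition):
R_a \;=\; \frac{1}{l}\int_0^{l} \lvert z(x)\rvert \, dx
\;\approx\; \frac{1}{n}\sum_{i=1}^{n} \lvert z_i\rvert ,
% where z(x) is the profile deviation from the mean line and z_i are sampled points.
```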
Abstract:
This work presents an investigation into the use of Process FMEA (Failure Mode and Effect Analysis), exposing irregularities in its application. The AHP (Analytic Hierarchy Process) method and Fuzzy Sets are applied to the study of current FMEA practices. AHP is applied to prioritize the irregularities according to the severity of their occurrence. Fuzzy Sets are applied to evaluate the performance of FMEA usage in some companies of the automotive sector. As a result, eight of the eleven FMEA forms examined were accepted and three were not.
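As an illustrative sketch (not taken from the thesis above), the snippet below shows the usual AHP priority computation via the principal eigenvector of a pairwise comparison matrix, the kind of step used to prioritize irregularities by severity; the matrix values are invented for demonstration.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three FMEA "irregularities"
# (values are illustrative only, not the thesis's data).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Principal-eigenvector method: the normalized eigenvector of the largest
# eigenvalue gives the priority (severity) weights.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()

# Consistency ratio using Saaty's random index for n = 3 (RI = 0.58).
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
cr = ci / 0.58
print("priority weights:", w.round(3), "consistency ratio:", round(cr, 3))
```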
Abstract:
In this work, the quantitative analysis of glucose, triglycerides and cholesterol (total and HDL) in both rat and human blood plasma was performed without any kind of sample pretreatment, using near infrared (NIR) spectroscopy combined with multivariate methods. For this purpose, different techniques and algorithms used to pre-process data, to select variables and to build multivariate regression models were compared with one another, such as partial least squares regression (PLS), nonlinear regression by artificial neural networks (ANN), interval partial least squares regression (iPLS), genetic algorithm (GA) and the successive projections algorithm (SPA), amongst others. For the determinations in rat blood plasma samples, the variable selection algorithms showed satisfactory results both for the correlation coefficients (R²) and for the values of root mean square error of prediction (RMSEP) for the three analytes, especially for triglycerides and HDL cholesterol. The RMSEP values for glucose, triglycerides and HDL cholesterol obtained with the best PLS model were 6.08, 16.07 and 2.03 mg dL-1, respectively. For the determinations in human blood plasma, on the other hand, the predictions obtained by the PLS models were unsatisfactory, with a nonlinear tendency and the presence of bias. ANN regression was then applied as an alternative to PLS, considering its ability to model data from nonlinear systems. The root mean square errors of monitoring (RMSEM) for glucose, triglycerides and total cholesterol, for the best ANN models, were 13.20, 10.31 and 12.35 mg dL-1, respectively. Statistical tests (F and t) suggest that NIR spectroscopy combined with multivariate regression methods (PLS and ANN) is capable of quantifying the analytes (glucose, triglycerides and cholesterol) even when they are present in highly complex biological fluids, such as blood plasma.
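As an illustrative sketch of the general workflow described above (PLS calibration followed by RMSEP evaluation), the snippet below uses synthetic stand-ins for NIR spectra and reference concentrations; it is not the thesis's model or data.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-ins: 120 "spectra" with 200 "wavelengths" and a hypothetical
# reference analyte concentration (e.g., glucose in mg/dL).
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 200))
y = X[:, :10].sum(axis=1) + rng.normal(scale=0.5, size=120)

X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

pls = PLSRegression(n_components=5)        # number of latent variables
pls.fit(X_cal, y_cal)
y_pred = pls.predict(X_val).ravel()

# Root mean square error of prediction (RMSEP) and squared correlation (R^2).
rmsep = np.sqrt(np.mean((y_val - y_pred) ** 2))
r2 = np.corrcoef(y_val, y_pred)[0, 1] ** 2
print(f"RMSEP = {rmsep:.2f}, R^2 = {r2:.3f}")
```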
Abstract:
Soil contamination by pesticides is an environmental problem that needs to be monitored and avoided. However, the lack of fast, accurate and low-cost analytical methods for detecting residual pesticides in complex matrices, such as soil, is a problem still unresolved, and it needs to be solved before we are able to assess the quality of environmental samples. The intensive use of pesticides has increased since the 1960s because of the dependence on their use, causing biological imbalances and promoting resistance and the recurrence of high populations of pests and pathogens (resurgence). This has contributed to the appearance of new pests that were previously under natural control. Developing analytical methods able to quantify pesticide residues in complex environmental matrices is still a challenge for many laboratories. The integration of two analytical methods, one ecotoxicological and one chemical, demonstrates the potential for the environmental analysis of methamidophos. The aim of this study was to evaluate an ecotoxicological method as an analytical "screening" tool for methamidophos in soil and to confirm the analyte concentration in the samples by a chemical LC-MS/MS method. In this work two soils were tested, one clayey and one sandy; for both, the sorption kinetics of methamidophos followed a pseudo-second-order model. The clayey soil showed higher sorption of methamidophos and followed the Freundlich model, while the sandy soil followed the Langmuir model. The chemical LC-MS/MS method was validated satisfactorily, with adequate linearity, range, precision, accuracy and sensitivity. In the chronic ecotoxicological tests with C. dubia, the NOEC was 4.93 and 3.24 ng L-1 of methamidophos for the elutriate assays of the sandy and clayey soils, respectively. The ecotoxicological method was more sensitive than LC-MS/MS for the detection of methamidophos in the clayey and sandy soils. However, by decreasing the concentration of the analytical standard of methamidophos and adjusting the validation conditions, the chemical method achieves a limit of quantification (LOQ) at the ng L-1 level, consistent with the ecotoxicological test. The methods described can be used as an analytical tool for methamidophos in soil, with the ecotoxicological analysis serving as a "screening" step and LC-MS/MS as confirmatory analysis of the analyte molecule, fulfilling the objectives of this work.
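As an illustrative sketch of the isotherm fitting mentioned above (Freundlich and Langmuir models), the snippet below fits both models to invented sorption data; the data points and units are hypothetical, not the thesis's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical sorption data: Ce = equilibrium concentration, qe = sorbed amount.
Ce = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])   # e.g., mg L-1
qe = np.array([1.2, 1.9, 2.9, 4.3, 6.1, 8.5])    # e.g., mg kg-1

def freundlich(C, Kf, n):
    # q = Kf * C^(1/n)
    return Kf * C ** (1.0 / n)

def langmuir(C, qmax, KL):
    # q = qmax * KL * C / (1 + KL * C)
    return qmax * KL * C / (1.0 + KL * C)

pf, _ = curve_fit(freundlich, Ce, qe)
pl, _ = curve_fit(langmuir, Ce, qe, p0=[10.0, 0.1])
print("Freundlich Kf, n:", pf.round(3))
print("Langmuir qmax, KL:", pl.round(3))
```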
Abstract:
The use of intelligent agents in multi-classifier systems emerged in order to turn the centralized decision process of a multi-classifier system into a distributed, flexible and incremental one. Based on this, the NeurAge (Neural Agents) system (Abreu et al. 2004) was proposed. This system has shown superior performance compared to some combination-centered methods (Abreu, Canuto, and Santana 2005). Negotiation is important to the performance of a multi-agent system, but most negotiations are defined informally. One way to formalize the negotiation process is to use an ontology. In the context of classification tasks, an ontology provides an approach to formalize the concepts and the rules that govern the relations between these concepts. This work aims at using ontologies to provide a formal description of the negotiation methods of a multi-agent system for classification tasks, more specifically the NeurAge system. Through ontologies, we intend to make the NeurAge system more formal and open, allowing new agents to join the system during the negotiation. In this sense, the NeurAge system will be studied in terms of its functioning and scope, focusing mainly on the negotiation methods it uses. After that, some negotiation ontologies found in the literature will be studied, and those chosen for this work will then be adapted to the negotiation methods used in NeurAge.
Abstract:
The objective of research in artificial intelligence is to enable the computer to execute functions that are performed by humans using knowledge and reasoning. This work was developed in the area of machine learning, a branch of artificial intelligence concerned with the design and development of algorithms and techniques capable of allowing computational learning. The objective of this work is to analyze a feature selection method for ensemble systems. The proposed method falls within the filter approach to feature selection; it uses variance and the Spearman correlation to rank the features, and uses reward and punishment strategies to measure the importance of each feature for the identification of the classes. For each ensemble, several different configurations were used, ranging from non-hybrid (homogeneous) to hybrid (heterogeneous) ensemble structures. They were submitted to five combination methods (voting, sum, weighted sum, multilayer perceptron and naïve Bayes), which were applied to six distinct databases (real and artificial). The classifiers applied during the experiments were k-nearest neighbors, multilayer perceptron, naïve Bayes and decision tree. Finally, the performance of the ensembles was analyzed comparatively using no feature selection method, using a filter-approach (original) feature selection method, and using the proposed method. For this comparison, a statistical test was applied, which demonstrated that there was a significant improvement in the precision of the ensembles.
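As an illustrative sketch of a filter-style ranking loosely in the spirit of the variance/Spearman ranking described above (this is not the thesis's exact reward-and-punishment scheme), the snippet below ranks features by the absolute Spearman correlation between each feature and the class label on a public dataset.

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.datasets import load_iris

# Rank each feature by |Spearman correlation| with the class label.
X, y = load_iris(return_X_y=True)

scores = []
for j in range(X.shape[1]):
    rho, _ = spearmanr(X[:, j], y)
    scores.append(abs(rho))

ranking = np.argsort(scores)[::-1]          # best-ranked features first
print("feature ranking (indices):", ranking)
print("scores:", np.round(np.array(scores)[ranking], 3))
```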
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The activity of requirements engineering is seen in agile methods as a bureaucratic activity that makes the process less agile. However, the lack of documentation in agile development environments is identified as one of the main challenges of the methodology. Thus, there is a contradiction between what the agile methodology claims and the result observed in real environments. For example, in agile methods user stories are widely used to describe requirements. However, this way of describing requirements is still not enough, because user stories are too narrow an artifact to represent and detail requirements. Activities such as verifying the software context and the dependencies between stories are also limited when only this artifact is used. In the context of requirements engineering, there are goal-oriented approaches that bring benefits to requirements documentation, including completeness of requirements, analysis of alternatives and support for the rationalization of requirements. Among these approaches, the i* modeling technique stands out, providing a graphical view of the actors involved in the system and their dependencies. This work proposes an additional resource that aims to reduce this lack of documentation in agile methods. The objective of this work is therefore to provide a graphical view of the software requirements and their relationships through i* models, thus enriching the requirements in agile methods. To do so, we propose a set of heuristics to map requirements expressed as user stories into i* models. These models can be used as a form of documentation in the agile environment, because through the mapping to i* models the requirements can be viewed more broadly and with their proper relationships, according to the business environment they will serve.
Abstract:
Data clustering is applied in various fields, such as data mining, image processing and pattern recognition. Clustering algorithms split a data set into clusters such that elements within the same cluster have a high degree of similarity, while elements belonging to different clusters have a high degree of dissimilarity. The Fuzzy C-Means (FCM) algorithm is the fuzzy clustering algorithm most used and discussed in the literature. The performance of FCM is strongly affected by the selection of the initial cluster centers; therefore, the choice of a good set of initial centers is very important for the performance of the algorithm. However, in FCM the choice of initial centers is made randomly, which makes it difficult to find a good set. This work proposes three new methods to obtain initial cluster centers deterministically for the FCM algorithm, which can also be used in variants of FCM; in this work, these initialization methods were also applied to the ckMeans variant. With the proposed methods, we intend to obtain a set of initial centers that are close to the real cluster centers. With these new initialization approaches, the aim is to reduce the number of iterations these algorithms need to converge and the processing time, without affecting the quality of the clusters, or even improving that quality in some cases. Accordingly, cluster validation indices were used to measure the quality of the clusters obtained by the modified FCM and ckMeans algorithms with the proposed initialization methods when applied to various data sets.
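As an illustrative sketch (not the thesis's implementation), the snippet below shows a minimal Fuzzy C-Means loop that accepts an explicit set of initial centers, which is exactly where a deterministic initialization scheme such as those proposed above would plug in; the stand-in initialization used here simply picks two data points.

```python
import numpy as np

def fcm(X, centers, m=2.0, n_iter=100, tol=1e-5):
    """Minimal Fuzzy C-Means starting from the given initial centers."""
    for _ in range(n_iter):
        # Distances from every point to every center (eps avoids division by zero).
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        # Membership update: u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1)).
        u = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1)), axis=2)
        # Center update: weighted mean of the points with weights u^m.
        new_centers = (u.T ** m) @ X / np.sum(u.T ** m, axis=1, keepdims=True)
        if np.linalg.norm(new_centers - centers) < tol:
            centers = new_centers
            break
        centers = new_centers
    return centers, u

X = np.random.default_rng(1).normal(size=(200, 2))
init = X[[0, 100], :].copy()      # stand-in for a deterministic initialization
centers, memberships = fcm(X, init)
print(centers)
```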
Abstract:
The main objective of this work is to find mathematical models, based on linear parametric estimation techniques, applied to the problem of calculating gas flow in oil wells. In particular, we focus on obtaining flow models for wells that produce by the plunger-lift technique, in which case there are high peaks in the flow values that hinder their direct measurement by instruments. For this, we developed estimators based on recursive least squares and carried out an analysis of statistical measures such as the autocorrelation, the cross-correlation, the variogram and the cumulative periodogram, which are calculated recursively as data are obtained in real time from the plant in operation; the values obtained for these measures indicate how accurate the current model is and how it can be changed to better fit the measured values. The models were tested in a pilot plant that emulates the gas production process in oil wells.
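As an illustrative sketch of the estimation technique named above, the snippet below implements a standard recursive least squares (RLS) update with a forgetting factor and runs it on synthetic data; the parameter values and the AR-style model are invented for demonstration and are not the thesis's plant model.

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=0.99):
    """One recursive least squares step with forgetting factor lam."""
    k = P @ phi / (lam + phi @ P @ phi)       # gain vector
    theta = theta + k * (y - phi @ theta)     # parameter update
    P = (P - np.outer(k, phi @ P)) / lam      # covariance update
    return theta, P

rng = np.random.default_rng(0)
true_params = np.array([0.8, -0.2])           # hypothetical AR(2) parameters
theta = np.zeros(2)
P = 1e3 * np.eye(2)
y_prev, y_prev2 = 0.0, 0.0
for t in range(500):
    phi = np.array([y_prev, y_prev2])         # regressors: past outputs
    y = phi @ true_params + rng.normal(scale=0.05)
    theta, P = rls_update(theta, P, phi, y)
    y_prev2, y_prev = y_prev, y
print("estimated parameters:", theta.round(3))
```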
Abstract:
This work studies the asymptotic behavior of Pearson's (1900) statistic, which is the theoretical basis of the well-known chi-square test, also usually denoted the χ² test. We first study the behavior of the distribution of Pearson's (1900) chi-square statistic for a sample {X1, X2, ..., Xn} when n → ∞ and pi = pi0 for all n. We then detail the arguments used in Billingsley (1960), which prove the convergence in distribution of a statistic, similar to Pearson's, based on a sample from a stationary, ergodic Markov chain with a finite state space S.
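For reference, a standard statement of the classical result studied above, written here in the usual textbook form rather than copied from the thesis:

```latex
% Pearson's statistic for a sample of size n over k categories with observed
% counts N_i and hypothesized probabilities p_{i0}:
X^2_n \;=\; \sum_{i=1}^{k} \frac{\bigl(N_i - n\,p_{i0}\bigr)^2}{n\,p_{i0}}
\;\xrightarrow[n\to\infty]{\;d\;}\; \chi^2_{k-1},
% i.e., under H_0 : p_i = p_{i0} the statistic converges in distribution to a
% chi-square law with k - 1 degrees of freedom.
```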
Abstract:
OBJECTIVES: To apprehend nurses' conceptions of and experiences with quality of life and quality of working life in Primary Health Care. METHODS: Descriptive study with a qualitative approach, carried out in the interior of the state of São Paulo, Brazil, with eight nurses whose statements were submitted to thematic content analysis. RESULTS: The nurses presented broad conceptions of quality of life and quality of working life and were, in general, satisfied with both. However, they pointed out obstacles that compromise the professionals' quality of life in the studied context, determined mainly by the lack or inadequacy of material, human and environmental resources, as well as by the established work process. CONCLUSION: Although satisfaction with working in Primary Health Care is acknowledged, the problems identified reveal the importance of mobilizing greater attention from professionals and managers to this topic.
Abstract:
Objective: to study the validity of the trial of labor (TOL) in pregnant women with one previous cesarean section. Methods: retrospective cohort study including 438 pregnant women with one cesarean section prior to the studied delivery and their 450 newborns (NB), divided into two groups, with and without TOL. The minimum sample size was 121 women per group. TOL was considered the independent variable, and the dependent variables were related to the occurrence of vaginal delivery and to the frequency of maternal and perinatal complications. Univariate and multivariate analyses were performed, respectively. The comparison between frequencies (%) was analyzed by the chi-square (χ²) test at a 5% significance level and by logistic regression with calculation of the odds ratio (OR) and the 95% confidence interval (95% CI). Results: TOL was associated with a 59.2% rate of vaginal delivery. It was less often indicated in women over 40 years of age (2.7% vs 6.5%) and in those with associated diseases and pregnancy complications: hypertensive syndromes (7.0%) and third-trimester hemorrhage (0.3%). TOL was not related to maternal or perinatal complications. Women who delivered by cesarean section, regardless of TOL, had a higher risk of puerperal complications (OR = 3.53; 95% CI = 1.57-7.93). The perinatal mortality rate depended on the newborn's weight and on fetal malformations and was not related to TOL. In contrast, respiratory complications were more frequent in newborns of mothers not submitted to TOL (OR = 1.92; 95% CI = 1.20-3.07). Conclusions: the results showed that TOL in pregnant women with one previous cesarean section is a safe strategy, favoring vaginal delivery in 59.2% of cases without interfering with maternal and perinatal morbidity and mortality. It is therefore a resource that should be encouraged.
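For reference, the standard definitions behind the OR and 95% CI figures reported above, written from the usual epidemiological formulas (Woolf-type interval), not from the thesis itself:

```latex
% Odds ratio from a 2x2 table with cell counts a, b, c, d, and its
% Woolf-type 95% confidence interval:
\mathrm{OR} = \frac{a\,d}{b\,c}, \qquad
\mathrm{CI}_{95\%} = \exp\!\Bigl(\ln \mathrm{OR} \;\pm\; 1.96\,
\sqrt{\tfrac{1}{a} + \tfrac{1}{b} + \tfrac{1}{c} + \tfrac{1}{d}}\Bigr).
```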