102 results for Estudo de métodos


Relevance:

30.00%

Publisher:

Abstract:

Enzymatic synthesis of peptides using proteases has attracted a great deal of attention in recent years. One key challenge in peptide synthesis is to find supports for protease immobilization capable of working in aqueous medium at high performance, producing water-soluble oligopeptides. At present, few reports have described this strategy. Therefore, the aim of this thesis was to immobilize proteases by different methods (covalent bonding, entrapment in polymeric PVA gels, and immobilization on glycidyl methacrylate magnetic nanoparticles) in order to produce water-soluble oligopeptides derived from lysine. Three different proteases were used: trypsin, α-chymotrypsin and bromelain. Considering the immobilization strategies and the type of protease employed, the trypsin-resin systems showed the best performance in terms of hydrolytic activity and oligopeptide synthesis. Hydrolytic activities of the free and immobilized enzymes were determined spectrophotometrically from the absorbance change at 660 nm at 25 °C (casein method). Oligolysine yield and average degree of polymerization (DPavg) were monitored by 1H-NMR analysis. Trypsin was covalently immobilized onto four different resins (Amberzyme, Eupergit C, Eupergit CM and Grace 192), with maximum bound-protein yields of 92, 82 and 60 mg of protein per gram of support. The effectiveness of these trypsin-resin systems was evaluated by hydrolysis of casein and synthesis of water-soluble oligolysine. Most systems were capable of catalyzing oligopeptide synthesis in aqueous medium, albeit at different efficiencies relative to the free enzyme: 40, 37 and 35% for Amberzyme, Eupergit C and Eupergit CM, respectively. These systems produced oligomers in only 1 hour, with DPavg higher than that of the free enzyme. Among these systems, Eupergit C-trypsin showed the greatest efficiency in terms of hydrolytic activity and thermal stability; however, this did not hold for oligolysine synthesis. Trypsin-Amberzyme proved more successful in oligopeptide synthesis and exhibited excellent reusability, retaining 90% of its initial hydrolytic and synthetic activity after 7 reuses. Hydrophobic interactions between trypsin and the Amberzyme support protect the enzyme against strong conformational changes in the medium, and the high concentration of oxirane groups on the support surface promotes multi-covalent attachment, preventing the immobilized enzyme from leaching. These results suggest that trypsin immobilized on the supports evaluated can be efficiently used for oligopeptide synthesis in aqueous media.
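As a rough illustration of the mass-balance arithmetic behind a figure such as "92 mg of bound protein per g of support", the sketch below computes an immobilization yield from the protein offered, the protein left in the supernatant, and the mass of support. The function name and all input values are hypothetical, not data from the thesis.

```python
# Sketch of the usual mass-balance behind a "mg of bound protein per g of
# support" figure: protein offered minus protein remaining unbound, divided by
# the mass of support. All numbers are hypothetical; only the 92 mg/g order of
# magnitude mirrors the abstract.
def bound_protein_per_gram(protein_offered_mg, protein_unbound_mg, support_mass_g):
    """Immobilization yield in mg of protein per g of support."""
    return (protein_offered_mg - protein_unbound_mg) / support_mass_g

# Hypothetical immobilization run
offered, unbound, support = 50.0, 4.0, 0.5      # mg, mg, g
print(f"bound protein: {bound_protein_per_gram(offered, unbound, support):.0f} mg/g")
```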

Relevance:

30.00%

Publisher:

Abstract:

Biodiesel is a fuel composed of mono-alkyl esters of long-chain fatty acids, derived from vegetable oils or animal fat. It can be used in compression-ignition engines for automotive propulsion or power generation, as a partial or total substitute for fossil diesel. Biodiesel can be obtained through different processes; transesterification is the most common, in which an ester reacts with an alcohol to form a new ester and a new alcohol, a reaction normally catalyzed by the addition of an acid or a base. Initially, the physicochemical properties of sunflower, castor and soybean oils were determined according to standard test methods, to evaluate whether they were suitable as raw materials for the transesterification reaction. Sunflower, castor and soybean biodiesel were obtained by the methylic transesterification route in the presence of KOH, with yields above 93% m/m. Sunflower/castor and soybean/castor blends were studied with the aim of evaluating the thermal and oxidative stability of the biofuels. The biodiesel samples and blends were characterized by acid value, iodine value, density, flash point, sulfur content, and methanol and ester content by gas chromatography (GC). Thermal and oxidative stability were also studied by thermogravimetry (TG), pressurized differential scanning calorimetry (P-DSC) in the dynamic exothermic mode, and the Rancimat method. Sunflower and soybean biodiesel met the specifications established by ANP Resolution No. 7/2008. Biodiesel from castor oil, as expected, showed high density and kinematic viscosity. For the blends studied, increasing the castor biodiesel concentration increased the density, kinematic viscosity and flash point. The addition of castor biodiesel as an antioxidant in sunflower and soybean biodiesel is promising, since it significantly improves resistance to autoxidation and therefore oxidative stability. Blends containing 20-40% castor biodiesel complied with the ANP requirements and may thus be used as a partial substitute for fossil diesel.
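As a small illustration of the mass-based yield arithmetic behind a figure such as "above 93% m/m", the sketch below divides the mass of methyl esters recovered by the mass of oil fed to the reaction. The masses and the function name are hypothetical, not values from the work.

```python
# Sketch of a simple mass-based transesterification yield: mass of purified
# methyl esters recovered per mass of oil fed, expressed in % m/m.
# All masses below are hypothetical.
def yield_percent_mm(esters_mass_g, oil_mass_g):
    """Transesterification yield in % m/m (mass of esters per mass of oil)."""
    return 100.0 * esters_mass_g / oil_mass_g

oil_in, esters_out = 200.0, 187.0               # g (hypothetical)
print(f"yield: {yield_percent_mm(esters_out, oil_in):.1f}% m/m")
```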

Relevance:

30.00%

Publisher:

Abstract:

Natural gas (NG) is a clean energy source found underground in porous rocks, associated or not with oil. Its basic composition includes methane, ethane and propane, along with other components such as carbon dioxide, nitrogen, hydrogen sulphide and water. H2S is one of the natural pollutants of natural gas and is considered critical with respect to corrosion. Its presence depends on the origin of the gas as well as on the treatment process used, and it can cause problems in tubing materials and in the final applications of NG. The Agência Nacional do Petróleo establishes that natural gas commercialized in Brazil, whether domestic or imported, must contain at most 10-15 mg/cm3 of H2S. Natural gas processing units use different methods to remove H2S, for instance adsorption towers filled with activated carbon, zeolites or Sulfatreat (a dry, granular, iron oxide-based solid). In this work, ion exchange resins were used as adsorbent materials. The resins were characterized by thermogravimetric analysis, infrared spectroscopy and scanning electron microscopy. The adsorption tests were performed in a system coupled to a gas chromatograph, and the H2S present at the system outlet was monitored with a pulsed flame photometric detector. Electron microscopy showed that the topography and morphology of the resins favor the adsorption process, including their overall (macro) appearance, particles of variable size, spherical geometry, and the absence of visible pores on the surface. The infrared spectra presented the main vibrational frequencies associated with the amine functional groups and the polymeric matrices. When compared with Sulfatreat under the same experimental conditions, the resins showed similar performance in retention times and adsorption capacities, making them competitive for the natural gas desulfurization process.
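The abstract mentions adsorption capacities measured in a fixed-bed system coupled to a chromatograph. One simplified way such a capacity is often estimated is from the mass of H2S fed up to breakthrough per mass of adsorbent; the sketch below illustrates that arithmetic with hypothetical operating values and assumes negligible outlet H2S before breakthrough. It is not the thesis' experimental procedure.

```python
# Sketch of a simplified fixed-bed adsorption-capacity estimate: mass of H2S
# fed up to breakthrough divided by the mass of adsorbent, assuming the outlet
# concentration is negligible before breakthrough. All values are hypothetical.
def capacity_mg_per_g(flow_L_min, inlet_conc_mg_L, breakthrough_min, adsorbent_g):
    """Approximate adsorption capacity in mg of H2S per g of adsorbent."""
    h2s_fed_mg = flow_L_min * inlet_conc_mg_L * breakthrough_min
    return h2s_fed_mg / adsorbent_g

print(f"capacity: {capacity_mg_per_g(0.1, 0.5, 120, 2.0):.1f} mg/g")   # hypothetical run
```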

Relevance:

30.00%

Publisher:

Abstract:

The regeneration of bone defects with loss of substance remains a therapeutic challenge in medicine. There are basically four types of grafts: autologous, allogeneic, xenogeneic and isogenic. It is a consensus that autologous bone is the most suitable material for this purpose, but there are limitations to its use, especially the insufficient amount available from the donor. Studies show that the components of the extracellular matrix (ECM) are generally conserved among different species and are well tolerated even by xenogeneic recipients. Thus, several studies have sought a substitute for autogenous bone by producing scaffolds through decellularization. To obtain these scaffolds, the tissue must undergo a cell-removal process that causes minimal adverse effects on the composition, biological activity and mechanical integrity of the remaining extracellular matrix. There is, however, no consensus among researchers on the best decellularization protocol, since each treatment interferes differently with the biochemical composition, ultrastructure and mechanical properties of the extracellular matrix, affecting the type of immune response to the material. The limited body of research on decellularization of bone tissue is a further obstacle to reaching a consensus protocol. The present study aimed to evaluate the influence of decellularization methods on the production of biological scaffolds from skeletal organs of mice for use in grafting. This was a laboratory study carried out in two distinct stages. In the first stage, 12 mouse hemi-calvariae were evaluated, divided into three groups (n = 4) and submitted to three different decellularization protocols (SDS [Group I], trypsin [Group II], Triton X-100 [Group III]). We sought to identify the protocol that promotes the most efficient cell removal together with the best structural preservation of the bone extracellular matrix. To this end, we performed a quantitative analysis of the number of remaining cells and a descriptive analysis of the scaffolds by microscopy. In the second stage, we evaluated the in vitro adhesion of mouse bone marrow mesenchymal cells cultured on the previously decellularized scaffolds. Manual cell counting on the scaffolds showed complete cell removal in Group II, practically complete removal in Group I, and remaining cells in Group III; a significant difference was observed only between Groups II and III (p = 0.042). Better preservation of the collagen structure was obtained with Triton X-100, whereas decellularization with trypsin caused the greatest structural changes in the scaffolds. After culture, adhesion of mesenchymal cells was observed only on specimens decellularized with trypsin. Owing to its complete cell removal and its ability to allow cell adhesion, the trypsin-based protocol (Group II) was considered the most suitable for future experiments involving grafting of decellularized bone scaffolds.
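The between-group comparison (p = 0.042) is a small-sample test of remaining-cell counts; the abstract does not state which test was used. The sketch below shows one common choice for n = 4 per group, a Mann-Whitney U test, with invented counts, purely as an illustration of the kind of analysis involved.

```python
# Sketch of a small-sample comparison of remaining-cell counts between two
# decellularization groups. The study's actual test is not stated in the
# abstract; Mann-Whitney U is one common choice for n = 4 per group.
# The counts below are hypothetical.
from scipy.stats import mannwhitneyu

group_II  = [0, 0, 0, 0]        # trypsin: complete cell removal
group_III = [12, 9, 15, 11]     # Triton X-100: remaining cells (hypothetical)

stat, p_value = mannwhitneyu(group_II, group_III, alternative="two-sided")
print(f"U = {stat}, p = {p_value:.3f}")
```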

Relevance:

30.00%

Publisher:

Abstract:

This work presents an algorithmic study of the optimization of a conformal radiotherapy treatment plan. We begin with an overview of cancer, radiotherapy and the physics of the interaction of ionizing radiation with matter. A proposal for the optimization of a radiotherapy treatment plan is then developed in a systematic way. We present the multicriteria optimization paradigm and the concepts of Pareto optimality and Pareto dominance, and propose a generic optimization model for radiotherapy treatment. We construct the input of the model, estimate the dose delivered by the radiation using the dose matrix, and define the objective function of the model. Optimization models in radiotherapy treatment are typically NP-hard, which justifies the use of heuristic methods. We propose three distinct methods: MOGA, MOSA and MOTS. The design of these three metaheuristic procedures is presented; for each one we give a brief motivation, the algorithm itself and the method used to tune its parameters. The three methods are applied to a concrete case and their performances are compared. Finally, for each method we analyze the quality of the Pareto sets, some of the solutions and the respective Pareto curves.
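As a concrete illustration of the Pareto dominance concept used by the three metaheuristics, the sketch below checks dominance between objective vectors and extracts the non-dominated set for a minimization problem. The toy objective values are hypothetical, and this is not the thesis' MOGA/MOSA/MOTS code.

```python
# Minimal sketch of Pareto dominance and Pareto-set extraction for a
# minimization problem; illustration only, not the thesis' metaheuristics.
from typing import List, Sequence

def dominates(a: Sequence[float], b: Sequence[float]) -> bool:
    """True if objective vector `a` Pareto-dominates `b` (minimization):
    `a` is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_set(points: List[Sequence[float]]) -> List[Sequence[float]]:
    """Return the non-dominated (Pareto-optimal) points."""
    return [p for p in points if not any(dominates(q, p) for q in points if q is not p)]

if __name__ == "__main__":
    # Hypothetical objective vectors, e.g. (tumour underdose, healthy-tissue overdose)
    plans = [(0.2, 0.9), (0.4, 0.4), (0.3, 0.5), (0.8, 0.1), (0.5, 0.5)]
    print(pareto_set(plans))   # (0.5, 0.5) is dominated by (0.4, 0.4) and is excluded
```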

Relevance:

30.00%

Publisher:

Abstract:

The use of clustering methods for the discovery of cancer subtypes has drawn a great deal of attention in the scientific community. While bioinformaticians have proposed new clustering methods that take advantage of characteristics of the gene expression data, the medical community has a preference for using classic clustering methods. There have been no studies thus far performing a large-scale evaluation of different clustering methods in this context. This work presents the first large-scale analysis of seven different clustering methods and four proximity measures for the analysis of 35 cancer gene expression data sets. Results reveal that the finite mixture of Gaussians, followed closely by k-means, exhibited the best performance in terms of recovering the true structure of the data sets. These methods also exhibited, on average, the smallest difference between the actual number of classes in the data sets and the best number of clusters as indicated by our validation criteria. Furthermore, hierarchical methods, which have been widely used by the medical community, exhibited a poorer recovery performance than that of the other methods evaluated. Moreover, as a stable basis for the assessment and comparison of different clustering methods for cancer gene expression data, this study provides a common group of data sets (benchmark data sets) to be shared among researchers and used for comparisons with new methods.
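A minimal sketch of the kind of comparison reported above, contrasting a finite mixture of Gaussians with k-means and scoring label recovery with the adjusted Rand index; it uses scikit-learn on synthetic Gaussian data, not the 35 benchmark data sets or the validation criteria of the study.

```python
# Minimal sketch contrasting a finite Gaussian mixture with k-means for
# recovering known classes, scored by the adjusted Rand index (ARI).
# Illustration only: synthetic Gaussian blobs stand in for gene expression data.
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture
from sklearn.metrics import adjusted_rand_score

X, y_true = make_blobs(n_samples=300, centers=4, cluster_std=1.5, random_state=0)

km_labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
gmm_labels = GaussianMixture(n_components=4, random_state=0).fit(X).predict(X)

print("k-means ARI:", adjusted_rand_score(y_true, km_labels))
print("GMM ARI:    ", adjusted_rand_score(y_true, gmm_labels))
```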

Relevance:

30.00%

Publisher:

Abstract:

The objective of research in artificial intelligence is to enable computers to perform functions that humans carry out using knowledge and reasoning. This work was developed in the area of machine learning, the branch of artificial intelligence concerned with the design and development of algorithms and techniques that enable computational learning. The objective of this work is to analyze a feature selection method for ensemble systems. The proposed method follows the filter approach to feature selection: it uses variance and the Spearman correlation to rank the features, and reward and punishment strategies to measure each feature's importance for identifying the classes. For each ensemble, several different configurations were used, ranging from homogeneous (non-hybrid) to heterogeneous (hybrid) ensemble structures. They were submitted to five combination methods (voting, sum, weighted sum, multilayer perceptron and naïve Bayes) and applied to six distinct databases (real and artificial). The classifiers applied during the experiments were k-nearest neighbors, multilayer perceptron, naïve Bayes and decision tree. Finally, the performance of the ensembles was analyzed comparatively with no feature selection, with the original filter-based feature selection method, and with the proposed method. For this comparison a statistical test was applied, which demonstrated a significant improvement in the precision of the ensembles.
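A minimal sketch of a filter-style ranking in the spirit described above, scoring features by variance and by the absolute Spearman correlation with the class label. The equal-weight combination and all names are our assumptions; the reward/punishment strategy of the proposed method is not reproduced.

```python
# Minimal sketch of a filter-approach feature ranking: variance plus absolute
# Spearman correlation with the class label, combined with equal weights
# (an assumption for illustration, not the thesis' scoring scheme).
import numpy as np
from scipy.stats import spearmanr

def rank_features(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Return feature indices ordered from most to least relevant."""
    variances = X.var(axis=0)
    correlations = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        rho, _ = spearmanr(X[:, j], y)
        correlations[j] = abs(rho)
    score = 0.5 * variances / variances.max() + 0.5 * correlations
    return np.argsort(score)[::-1]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    y = rng.integers(0, 2, size=200)
    X = np.column_stack([y + 0.3 * rng.normal(size=200),   # informative feature
                         rng.normal(size=200),             # noise
                         rng.normal(size=200)])            # noise
    print(rank_features(X, y))   # the informative feature (index 0) should rank first
```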

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: Musculoskeletal disorders are the most prevalent occupational conditions among dental surgeons. Our purposes were: 1) to investigate the knowledge and clinical application of ergonomic principles among students and faculty engaged in clinical activities at a public university; 2) to survey the incidence of painful symptoms in the neck, shoulders, upper and lower back, elbows, hips, thighs, knees, ankles and feet among all students in clinical internships; and 3) to encourage discussion of ergonomic norms and guidelines at the university. METHODS: This study surveyed all students enrolled in clinical disciplines (148) and their professors (30) in the dentistry course of the Universidade Federal do Rio Grande do Norte, Natal-RN, regarding the ergonomic principles used in clinical routine. In parallel, the incidence of painful symptoms among the students was investigated with the Nordic questionnaire, and from the results a symptom severity index was computed for the students. The Nordic Musculoskeletal Questionnaire (NMQ) is a diagnostic instrument proposed to standardize the measurement of reported musculoskeletal symptoms. Data were analyzed with the SPSS (Statistical Package for the Social Sciences) software, version 17.0, both analytically and descriptively, with calculation of means and standard deviations for quantitative variables, simple and relative frequencies for categorical variables, between-group comparisons (t-test), and Pearson's chi-square test of association between variables at a 5% significance level. Answers to the open questions were coded, converted into frequencies and then described. RESULTS: The application of ergonomic measures in the university clinics was not evidenced by the students and faculty surveyed. Regarding reported musculoskeletal symptoms, females were the most affected at every academic level. The anatomical regions with the highest symptom severity were the neck, lower back, wrists, hands and shoulders, with statistical significance p < 0.001. CONCLUSION: Based on these findings, the authors present a clinical intervention protocol based on the ergonomic determinants of the International Ergonomics Association (IEA) as a measure to protect the occupational health of future dental surgeons still in training in university dental clinics.
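As an illustration of the chi-square association analysis mentioned in the methods, the sketch below runs a chi-square test on a hypothetical sex-by-symptom contingency table at the 5% significance level; the counts are invented, not data from the study.

```python
# Sketch of the kind of chi-square association test described above,
# e.g. sex vs. report of neck pain. The counts are hypothetical.
from scipy.stats import chi2_contingency

#            pain   no pain
table = [[45, 30],    # female
         [25, 48]]    # male

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}, dof = {dof}")
if p_value < 0.05:                      # 5% significance level, as in the study
    print("Association between sex and symptom report is significant.")
```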

Relevance:

30.00%

Publisher:

Abstract:

This work studies the asymptotic behavior of Pearson's (1900) statistic, the theoretical apparatus behind the well-known chi-square test, also commonly denoted the χ² test. We first study the behavior of the distribution of Pearson's (1900) chi-square statistic for a sample {X1, X2, ..., Xn} as n → ∞ with pi = pi0 for all n. We then detail the arguments used in Billingsley (1960), which demonstrate the convergence in distribution of a Pearson-like statistic based on a sample from a stationary, ergodic Markov chain with finite state space S.
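For reference, the usual statement of Pearson's statistic and its classical limit in the i.i.d. case discussed above, with N_i the observed count of category i in a sample of size n:

```latex
% Pearson's (1900) statistic for k categories with hypothesized probabilities
% p_{i0}; under the null hypothesis it converges in distribution to a
% chi-square law with k-1 degrees of freedom.
\[
  X^2_n \;=\; \sum_{i=1}^{k} \frac{\left(N_i - n\,p_{i0}\right)^2}{n\,p_{i0}}
  \;\xrightarrow{\;d\;}\; \chi^2_{k-1}, \qquad n \to \infty .
\]
```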

Relevance:

30.00%

Publisher:

Abstract:

In general, an inverse problem consists in finding an element x in a suitable vector space, given a vector y that, in some sense, measures it. When the problem is discretized, it usually boils down to solving an equation system f(x) = y, where f : U ⊂ Rm → Rn represents the forward (direct) operator on some domain U of the appropriate Rm. As a general rule, we arrive at an ill-posed problem. The resolution of inverse problems has been widely researched over the last decades, because many problems in science and industry consist in determining unknowns whose effects we observe only through certain indirect measurements. The general subject of this dissertation is the choice of the Tikhonov regularization parameter for an ill-conditioned linear problem, discussed in Chapter 1, focusing on the three most popular methods in the current literature of the area. The more specific focus is the set of simulations reported in Chapter 2, which compare the performance of the three methods in the recovery of images measured with the Radon transform and perturbed by additive i.i.d. Gaussian noise. We chose a difference operator as the regularizer of the problem. The contribution of this dissertation consists mainly in the discussion of the numerical simulations carried out, as presented in Chapter 2. We understand that the value of this dissertation lies much more in the questions it raises than in saying anything definitive about the subject, partly because it is based on numerical experiments with no new mathematical results attached, and partly because those experiments were made with a single operator. On the other hand, the simulations yielded some observations that seemed interesting to us in light of the literature of the area. In particular, we highlight observations, summarized in the conclusion of this work, about the different vocations of methods such as GCV and the L-curve, and about the tendency of the optimal parameters found by the L-curve method to cluster in a narrow interval, strongly correlated with the behavior of the generalized singular value decomposition curve of the operators involved, under reasonably broad regularity conditions on the images to be recovered.
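A minimal sketch of Tikhonov regularization with a difference operator as regularizer, on a toy ill-conditioned smoothing operator; it only illustrates the role of the regularization parameter and does not implement the Radon-transform setting or the GCV, L-curve and other parameter-choice rules compared in the dissertation.

```python
# Minimal sketch of Tikhonov regularization with a difference operator L:
#   x_lambda = argmin ||A x - y||^2 + lambda^2 ||L x||^2,
# solved through the regularized normal equations. Toy ill-conditioned
# smoothing operator and noise; not the Radon-transform setup of the text.
import numpy as np

def tikhonov_solve(A, y, L, lam):
    """Solve (A^T A + lam^2 L^T L) x = A^T y."""
    return np.linalg.solve(A.T @ A + lam**2 * (L.T @ L), A.T @ y)

rng = np.random.default_rng(0)
n = 50
t = np.linspace(0, 1, n)
A = np.exp(-30.0 * (t[:, None] - t[None, :])**2)      # smoothing kernel (ill-conditioned)
x_true = np.sin(2 * np.pi * t)
y = A @ x_true + 1e-3 * rng.normal(size=n)            # noisy data

L = (np.eye(n) - np.eye(n, k=1))[:-1]                 # (n-1) x n first-difference operator

for lam in (1e-6, 1e-3, 1e-1):
    x = tikhonov_solve(A, y, L, lam)
    print(f"lambda = {lam:.0e}  reconstruction error = {np.linalg.norm(x - x_true):.3f}")
```

Scanning several values of lambda, as in the loop above, is the raw material on which parameter-choice rules such as GCV, the L-curve and the discrepancy principle operate.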

Relevance:

30.00%

Publisher:

Abstract:

Seismic wave dispersion and attenuation studies have become an important tool for lithology and fluid discrimination in hydrocarbon reservoirs. The processes associated with attenuation are complex and are encapsulated in a single quantitative description called the quality factor (Q). The present dissertation aims to compare different approaches to Q determination and is divided into two parts. First, we performed performance and robustness tests of three frequency-domain approaches to Q determination: peak shift, centroid shift and spectral ratio. All tests were carried out on a three-layered model, varying the thickness, Q and dip of the layers for propagating pulses with central frequencies of 30, 40 and 60 Hz. We found that the centroid shift method produces robust results over the entire suite of tests. Second, we inverted for Q values with the peak shift and centroid shift methods using a sequential grid search algorithm. In this case, the centroid shift method also produced more robust results than the peak shift method, despite its slower convergence.
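A minimal sketch of the centroid frequency-shift idea: compute the spectral centroids of a source spectrum and of its attenuated version, then apply the Gaussian-spectrum relation Q = π t σs² / (fs − fr), a Quan-and-Harris-type estimate that we assume here since the abstract gives no formulas. It is a toy illustration, not the dissertation's layered-model tests or its sequential grid-search inversion.

```python
# Minimal sketch of the centroid frequency-shift Q estimate, assuming a
# Gaussian source amplitude spectrum; toy synthetic spectra, not field data.
import numpy as np

def centroid(freqs, spectrum):
    """Centroid frequency of an amplitude spectrum."""
    return np.sum(freqs * spectrum) / np.sum(spectrum)

# Synthetic amplitude spectra (assumed Gaussian source) ---------------------
freqs = np.linspace(0.0, 200.0, 2001)           # Hz
f_src, sigma = 30.0, 10.0                        # source centroid and spread (Hz)
t_travel, Q_true = 0.5, 50.0                     # travel time (s) and true Q

src = np.exp(-(freqs - f_src)**2 / (2 * sigma**2))
rec = src * np.exp(-np.pi * freqs * t_travel / Q_true)   # attenuated spectrum

# Centroid-shift estimate of Q ---------------------------------------------
f_s, f_r = centroid(freqs, src), centroid(freqs, rec)
sigma2_s = np.sum((freqs - f_s)**2 * src) / np.sum(src)   # spectral variance of source
Q_est = np.pi * t_travel * sigma2_s / (f_s - f_r)
print(f"true Q = {Q_true}, estimated Q = {Q_est:.1f}")
```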