91 results for Memória : Bioquímica : Processamento


Relevância:

20.00%

Publicador:

Resumo:

The existence of inequalities among the Brazilian regions is an established fact throughout the country's history. Faced with this reality, the constitutional legislator included in the Federal Constitution of 1988, as an objective of the Federative Republic of Brazil, the reduction of regional inequalities. Development has also been included as an objective of the State, because it is directly related to the reduction of regional inequalities; in both cases, what is sought is the improvement of people's living conditions. In pursuit of this goal, the State must implement public policies, and, for this to happen, it needs revenue in the public coffers and the support of economic agents; hence the importance of the constitutionalization of economic policy. The 1988 Constitution adopted a rational capitalism regime in keeping with current legal and social conceptions, and therefore allowed State intervention in the economy to correct so-called market failures or to fulfill the established objectives. As to the latter, intervention may occur by induction, through the adoption of regulatory standards that encourage or discourage economic activity. Among the possible inductive instruments are tax measures aimed at steering the behavior of economic agents, given the finding that development does not occur with the same intensity in all regions of the country. In this context stand the Export Processing Zones (EPZs), special areas subject to a differentiated customs regime that grants benefits to the companies installed there. EPZs have been used by several countries to develop particular regions, and economic indicators show that they have promoted economic and social change where they were installed, especially because, by attracting companies, they generate jobs, industrialization and increased exports.
In Brazil, EPZs can contribute decisively to overcoming the major obstacles that hinder the attraction of economic agents and the economic development of the country. Since this is an instrument known to be effective in achieving the goals established by the Constitution, it is the duty of the Executive to ensure that the law governing this customs regime is effectively applied. If the Executive does not fulfill this duty, it incurs an unjustifiable omission, subject to correction by the Judiciary, whose mission is to prevent acts or omissions contrary to the constitutional order.

Relevância:

20.00%

Publicador:

Resumo:

In the late 1970s, the semi-arid region of Rio Grande do Norte was the setting of the Projeto Baixo-Açu, whose highlight was the building of the Eng. Armando Ribeiro Gonçalves dam, designed to hold 2.4 billion cubic meters of water. Presumably, such an initiative would bring economic and social development to thousands of potiguares who suffered the hardships of drought. However, the reservoir would reach several cities in the region, going so far as to submerge one of them: São Rafael. As a result, in the early years of the 1980s a new town was built nearby by DNOCS. This thesis discusses how the population of São Rafael recalls this fact and, three decades later, reconstructs its history by speaking, writing and computing. Adopting the Morinian perspective as a methodological strategy, visits were made to the city of São Rafael, and open interviews (individual and collective) were conducted with two groups of subjects: one composed of those who lived in the old town, and another of young people born in the new city. Besides the reports of these subjects, the visual narratives presented by images, mostly photographic, available on a profile created for the city on the orkut social network were observed. The dialogues between rafaelenses accessing that profile were also considered as sources for this study. Taking as its central observation Edgar Morin's remark that "what does not regenerate, degenerates", the central argument of this study is that orkut has performed a dual and interdependent role: it is a tool that promotes a collective intelligence through cooperation, the exchange of ideas and the reconstitution of visual and written narratives. Far from a conception frozen in a historical perspective, the thesis defended is that orkut has regenerated, repaired, reproduced, restored, reorganized and renewed the memory and history of a city that succumbed to the immensity of the waters of a dam almost thirty years ago.

Relevância:

20.00%

Publicador:

Resumo:

This work provides a mathematical foundation for digital signal processing from the point of view of interval mathematics. It addresses the open problem of precision and representation of data in digital systems through an interval version of signal representation. Signal processing is a rich and complex area, so this work restricts its focus to linear time-invariant systems. A vast literature exists in the area, but some concepts of interval mathematics need to be redefined or elaborated in order to build a solid theory of interval signal processing. We construct the basic foundations of signal processing in an interval version: basic properties such as linearity, stability and causality, and an interval version of linear systems and their properties. Interval versions of the convolution and of the Z-transform are presented. Convergence of systems is analyzed using the interval Z-transform, an essentially interval distance, interval complex numbers, and an application to an interval filter.
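The interval versions of the basic operations and of the convolution can be made concrete. Below is a minimal sketch (not the thesis's own formalism) that represents an interval as a `(lo, hi)` tuple and convolves two finite interval signals; degenerate intervals `[v, v]` recover ordinary point signals.

```python
# Minimal sketch of interval-valued discrete convolution, assuming signals
# are finite sequences of intervals represented as (lo, hi) tuples.

def iadd(a, b):
    """Interval addition: [a1,a2] + [b1,b2] = [a1+b1, a2+b2]."""
    return (a[0] + b[0], a[1] + b[1])

def imul(a, b):
    """Interval multiplication takes the min/max over endpoint products."""
    p = (a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1])
    return (min(p), max(p))

def iconv(x, h):
    """Interval convolution y[n] = sum_k x[k] * h[n-k]."""
    y = [(0.0, 0.0)] * (len(x) + len(h) - 1)
    for n in range(len(y)):
        acc = (0.0, 0.0)
        for k in range(len(x)):
            if 0 <= n - k < len(h):
                acc = iadd(acc, imul(x[k], h[n - k]))
        y[n] = acc
    return y

# Interval widths model the data imprecision the abstract refers to.
x = [(1.0, 1.1), (2.0, 2.0)]
h = [(1.0, 1.0), (0.5, 0.5)]
print(iconv(x, h))
```

Because interval arithmetic is inclusion-monotonic, each output interval is guaranteed to contain every exact result obtainable from point values inside the input intervals.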

Relevância:

20.00%

Publicador:

Resumo:

The usual programs for load flow calculation were, in general, developed to simulate electric energy transmission, subtransmission and distribution systems. However, the mathematical methods and algorithms used in their formulations were based, for the most part, only on the characteristics of transmission systems, which were the main concern of engineers and researchers. Yet the physical characteristics of these systems are quite different from those of distribution systems. In transmission systems, the voltage levels are high and the lines are generally very long. As a result, the capacitive and inductive effects that appear in the system have a considerable influence on the quantities of interest and must be taken into consideration. Also in transmission systems, the loads have a macro nature, such as cities, neighborhoods or big industries. These loads are generally close to balanced, which reduces the need for a three-phase methodology in load flow calculation. Distribution systems, on the other hand, present different characteristics: the voltage levels are low compared to those of transmission, which almost annuls the capacitive effects of the lines. The loads, in this case, are transformers whose secondaries supply small consumers, often single-phase ones, so the probability of finding an unbalanced circuit is high. The use of three-phase methodologies thus assumes an important dimension. Besides, equipment such as voltage regulators, which use the concepts of phase and line voltage simultaneously in their operation, requires a three-phase methodology in order to simulate its real behavior. For these reasons, a method for three-phase load flow calculation was initially developed in this work to simulate the steady-state behavior of distribution systems.
To achieve this goal, the Power Summation Algorithm was used as the basis for developing the three-phase method. This algorithm has been widely tested and approved by researchers and engineers for the simulation of radial electric energy distribution systems, mainly in single-phase representation. In our formulation, lines are modeled as three-phase circuits, considering the magnetic coupling between the phases, and the earth effect is accounted for through the Carson reduction. It is important to point out that, although loads are normally connected to the transformer secondaries, the hypothesis of star- or delta-connected loads on the primary circuit was also considered. To simulate voltage regulators, a new model was used, allowing various types of configurations to be simulated according to their real operation. Finally, the representation of switches with current measurement at various points of the feeder was considered. The loads are adjusted during the iterative process so that the current in each switch converges to the measured value specified in the input data. In a second stage of the work, sensitivity parameters were derived from the described load flow, with the objective of supporting further optimization processes. These parameters are found by calculating the partial derivatives of one variable with respect to another: in general, voltages, losses and reactive powers. After describing the calculation of the sensitivity parameters, the Gradient Method is presented, using these parameters to optimize an objective function defined for each type of study. The first study concerns the reduction of technical losses in a medium-voltage feeder through the installation of capacitor banks; the second concerns the correction of the voltage profile through the installation of capacitor banks or voltage regulators.
For loss reduction, the objective function is the sum of the losses in all parts of the system. For the correction of the voltage profile, the objective function is the sum of the squared voltage deviations at each node with respect to the rated voltage. At the end of the work, results of the application of the described methods to some feeders are presented, to give insight into their performance and accuracy.
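The core of the Power Summation Algorithm is a backward/forward sweep on the radial feeder. The hedged sketch below is a simplified single-phase version on a chain feeder (the thesis develops the full three-phase formulation with mutual coupling); impedances and loads are illustrative per-unit numbers, not values from the original work.

```python
# Simplified single-phase power-summation sweep on a radial chain feeder.
# z[i] and s_load[i] refer to the branch feeding node i+1 and its load.

def power_summation(z, s_load, v0=1.0 + 0j, iters=20):
    n = len(z)
    v = [v0] * (n + 1)                  # node voltages; node 0 is the slack bus
    for _ in range(iters):
        # Backward sweep: accumulate downstream loads plus branch losses.
        s_branch = [0j] * n
        for i in range(n - 1, -1, -1):
            s_branch[i] = s_load[i]
            if i + 1 < n:
                # Losses in the downstream branch: |I|^2 * Z, I = |S| / |V|.
                loss = z[i + 1] * abs(s_branch[i + 1] / v[i + 2]) ** 2
                s_branch[i] += s_branch[i + 1] + loss
        # Forward sweep: update voltages from the slack bus outward.
        for i in range(n):
            current = (s_branch[i] / v[i + 1]).conjugate()
            v[i + 1] = v[i] - z[i] * current
    return v

z = [0.01 + 0.02j, 0.015 + 0.03j]       # branch impedances (pu, hypothetical)
s = [0.5 + 0.2j, 0.3 + 0.1j]            # loads at nodes 1 and 2 (pu, hypothetical)
volts = power_summation(z, s)
print([abs(x) for x in volts])
```

The voltage magnitude decreases along the feeder, as expected for a loaded radial line; the three-phase method repeats this sweep per phase with coupled impedance matrices.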

Relevância:

20.00%

Publicador:

Resumo:

We revisit the visibility problem: determining the set of primitives potentially visible in geometry data represented by a data structure such as a mesh of polygons or triangles. We propose a solution for speeding up three-dimensional visualization in applications, introducing a "lean" structure, in the sense of data abstraction and reduction, which can be used in online and interactive applications. The visibility problem is especially important in the 3D visualization of scenes represented by large volumes of data, when it is not worthwhile to keep all polygons of the scene in memory: doing so implies more time spent in rendering, or is simply impossible for huge volumes of data. In these cases, given a viewing position and direction, the main objective is to determine and load a minimal set of primitives (polygons) of the scene, to accelerate the rendering step. For this purpose, our algorithm culls primitives using a hybrid paradigm based on three known techniques. The scene is divided into a grid of cells; each cell is associated with the primitives that belong to it; and finally the set of potentially visible primitives is determined. The novelty is the use of the Ja1 triangulation to create the subdivision grid. We chose this structure because of its relevant characteristics of adaptivity and algebrism (ease of calculation). The results show a substantial improvement over the traditional methods applied separately. The method introduced in this work can be used in devices with low or no dedicated CPU processing power, and also to view data via the Internet, as in virtual museum applications.
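The cell-grid culling idea can be sketched in a few lines. The sketch below is hedged: it uses a plain uniform 2D grid instead of the Ja1 triangulation the thesis proposes, and a simple view cone instead of a full frustum, purely to illustrate the assign-to-cells / query-visible-cells pipeline.

```python
# Sketch of grid-based culling: bin triangles into cells, then collect the
# primitives of cells whose center falls inside the view cone.

import math

def build_grid(triangles, cell_size):
    """Map each triangle to the grid cell containing its centroid."""
    grid = {}
    for tri in triangles:
        cx = sum(p[0] for p in tri) / 3.0
        cy = sum(p[1] for p in tri) / 3.0
        key = (int(cx // cell_size), int(cy // cell_size))
        grid.setdefault(key, []).append(tri)
    return grid

def potentially_visible(grid, cell_size, eye, view_dir, half_angle):
    """Collect primitives in cells whose center lies inside the view cone."""
    cos_limit = math.cos(half_angle)
    norm = math.hypot(view_dir[0], view_dir[1])
    d = (view_dir[0] / norm, view_dir[1] / norm)
    visible = []
    for (i, j), tris in grid.items():
        center = ((i + 0.5) * cell_size, (j + 0.5) * cell_size)
        to_cell = (center[0] - eye[0], center[1] - eye[1])
        dist = math.hypot(to_cell[0], to_cell[1]) or 1.0
        if (to_cell[0] * d[0] + to_cell[1] * d[1]) / dist >= cos_limit:
            visible.extend(tris)
    return visible

tri_front = [(9, 0), (10, 1), (10, -1)]     # ahead of the camera
tri_back = [(-9, 0), (-10, 1), (-10, -1)]   # behind the camera
grid = build_grid([tri_front, tri_back], cell_size=4.0)
pvs = potentially_visible(grid, 4.0, eye=(0, 0), view_dir=(1, 0),
                          half_angle=math.pi / 4)
```

Only the cells intersecting the view cone contribute primitives, so the renderer never touches the geometry behind the camera; an adaptive subdivision such as the Ja1 triangulation refines the cells where geometry is dense.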

Relevância:

20.00%

Publicador:

Resumo:

In this work, a parallel cooperative genetic algorithm with different evolution behaviors was developed to train and to define architectures for Multilayer Perceptron neural networks. Multilayer Perceptron neural networks are very powerful tools whose use has been extended vastly due to their ability to provide great results for a broad range of applications. The combination of genetic algorithms and parallel processing can be very powerful when applied to the learning process of the neural network, as well as to the definition of its architecture, since this procedure can be very slow, usually requiring a lot of computational time. Research combining evolutionary computation with the design of neural networks is also very useful, since most of the learning algorithms developed to train neural networks only adjust the synaptic weights, without considering the design of the network architecture. Furthermore, the use of cooperation in the genetic algorithm allows the interaction of different populations, avoiding local minima and helping the search for a promising solution, accelerating the evolutionary process. Finally, the individuals and the evolution behavior can be exclusive to each copy of the genetic algorithm running in each task, enhancing the diversity of the populations.
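The cooperation scheme the abstract describes can be illustrated with a toy sketch: several populations, each with its own "evolution behavior" (here just a distinct mutation rate), periodically exchanging their best individual. This is a hedged, sequential simplification; the thesis runs the copies as parallel tasks and also evolves network architectures, which is omitted here.

```python
# Toy cooperative GA: populations with different mutation rates minimize
# f(x) = x^2 and periodically share the global best individual.

import random

def evolve(pop, fitness, rate):
    """One generation: binary tournament selection plus Gaussian mutation."""
    nxt = []
    for _ in range(len(pop)):
        a, b = random.sample(pop, 2)
        parent = a if fitness(a) < fitness(b) else b
        nxt.append(parent + random.gauss(0, rate))
    return nxt

def cooperative_ga(fitness, n_pops=3, size=20, gens=60, period=10):
    random.seed(1)
    # Each population gets its own behavior: a distinct mutation rate.
    rates = [0.5 * (i + 1) for i in range(n_pops)]
    pops = [[random.uniform(-10, 10) for _ in range(size)]
            for _ in range(n_pops)]
    for g in range(gens):
        pops = [evolve(p, fitness, r) for p, r in zip(pops, rates)]
        if g % period == 0:
            # Cooperation: broadcast the global best into every population.
            best = min((min(p, key=fitness) for p in pops), key=fitness)
            for p in pops:
                p[0] = best
    return min((min(p, key=fitness) for p in pops), key=fitness)

best = cooperative_ga(lambda x: x * x)
```

The migration step is what distinguishes the cooperative scheme from independent restarts: a population trapped near a local minimum is periodically reseeded with the best solution found anywhere.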

Relevância:

20.00%

Publicador:

Resumo:

Embedded systems are widespread nowadays. An example is the Digital Signal Processor (DSP), a device with high processing power. This work's contribution consists of a DSP implementation of the system logic for detecting pipeline leaks in real time. Among the various leak detection methods available today, this work uses a technique based on pipe pressure analysis, employing the Wavelet Transform and Neural Networks. In this context, the DSP, in addition to performing the digital processing of the pressure signal, also communicates with a Global Positioning System (GPS), which helps locate the leak, and with a SCADA system, sharing information. To ensure robustness and reliability in the communication between the DSP and the SCADA, the Modbus protocol is used. As this is a real-time application, special attention is given to the response time of each of the tasks performed by the DSP. Tests and leak simulations were performed using the structure of the Laboratory of Evaluation of Measurement in Oil (LAMP) at the Federal University of Rio Grande do Norte (UFRN).
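The pressure-analysis idea rests on the fact that wavelet detail coefficients highlight abrupt transients. The hedged sketch below uses a single-level Haar transform on a synthetic pressure trace to flag the sharp drop a leak produces; the real system adds a neural network classifier, GPS positioning and Modbus/SCADA communication, none of which is reproduced here.

```python
# One level of the Haar wavelet on a pressure trace: pairwise differences
# are near zero on steady flow and spike at an abrupt pressure drop.

def haar_detail(signal):
    """Detail coefficients of one Haar level (pairwise half-differences)."""
    return [(signal[2 * i] - signal[2 * i + 1]) / 2.0
            for i in range(len(signal) // 2)]

def detect_leak(pressure, threshold):
    """Return the index of the first pair whose detail exceeds the threshold."""
    for i, d in enumerate(haar_detail(pressure)):
        if abs(d) > threshold:
            return i
    return None

# Synthetic trace: steady 50 bar with a sharp drop at sample 7.
trace = [50.0] * 7 + [44.0] * 9
alarm = detect_leak(trace, threshold=1.0)
```

In the real system the alarm index maps back to a time stamp, which, combined with the GPS data and the pressure-wave propagation speed, helps situate the leak along the pipe.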

Relevância:

20.00%

Publicador:

Resumo:

Nowadays, when market competition demands products of better quality and a constant search for cost savings and better use of raw materials, research into more efficient control strategies becomes vital. In Natural Gas Processing Units (NGPUs), as in most chemical processes, quality control is accomplished through product composition. However, chemical composition analysis has a long measurement time, even when performed by instruments such as gas chromatographs. This fact hinders the development of control strategies that could provide a better process yield. Natural gas processing is one of the most important activities in the petroleum industry, and the main economic product of an NGPU is liquefied petroleum gas (LPG). LPG is ideally composed of propane and butane; in practice, however, its composition includes contaminants such as ethane and pentane. This work proposes an inferential system using neural networks to estimate the ethane and pentane mole fractions in the LPG and the propane mole fraction in the residual gas. The goal is to provide the values of these estimated variables every minute using a single multilayer neural network, making it possible to apply inferential control techniques in order to monitor the LPG quality and to reduce the propane loss in the process. To develop this work, an NGPU composed of two distillation columns, a deethanizer and a debutanizer, was simulated in the HYSYS software. The inference is performed from the process variables of the PID controllers present in the instrumentation of these columns. To reduce the complexity of the inferential neural network, the statistical technique of principal component analysis is used to decrease the number of network inputs, thus forming a hybrid inferential system. This work also proposes a simple strategy to correct the inferential system in real time, based on measurements from any chromatographs that may exist in the process under study.
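The hybrid PCA-plus-network idea can be sketched with numpy on synthetic data. Everything below is hypothetical: the correlated "process variables", the target "mole fraction", and the tiny network (a random tanh hidden layer with a least-squares output fit, used here only so the sketch trains deterministically) stand in for the HYSYS-trained MLP of the thesis.

```python
# PCA compresses correlated controller variables; a small network then maps
# the principal components to the estimated composition (synthetic data).

import numpy as np

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))                  # hidden process state
X = latent @ rng.normal(size=(2, 10)) \
    + 0.01 * rng.normal(size=(200, 10))             # 10 measured variables
y = 0.3 * latent[:, 0] - 0.2 * latent[:, 1]         # synthetic mole fraction

# PCA via SVD on centered data: keep the leading principal components.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:2].T          # 10 measured variables -> 2 network inputs

# One tanh hidden layer; for brevity only the output weights are fitted.
W1 = 0.1 * rng.normal(size=(2, 16))
b1 = rng.normal(size=16)
H = np.tanh(Z @ W1 + b1)
A = np.hstack([H, np.ones((200, 1))])
W2, *_ = np.linalg.lstsq(A, y, rcond=None)
mse = float(np.mean((A @ W2 - y) ** 2))
```

The dimensionality reduction is what keeps the inferential network small: the ten correlated controller signals collapse to two informative inputs, which is the essence of the hybrid system.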

Relevância:

20.00%

Publicador:

Resumo:

Artificial neural networks are usually applied to solve complex problems. For more complex problems, greater functional efficiency can be achieved by increasing the number of layers and neurons; nevertheless, this leads to greater computational effort. Response time is an important factor in the decision to use neural networks in some systems. Many argue that the computational cost is higher during the training period; however, this phase is held only once. Once the network is trained, the existing computational resources must be used efficiently. In the multicore era, the problem boils down to the efficient use of all available processing cores, while taking the overhead of parallel computing into account. In this sense, this work proposes a modular structure that proved to be more suitable for parallel implementations. We parallelize the feedforward process of an MLP-type artificial neural network, implemented with OpenMP on a shared-memory computer architecture. The research consists of testing and analyzing execution times; speedup, efficiency and parallel scalability are analyzed. In the proposed approach, reducing the number of connections between remote neurons decreases the response time of the network and, consequently, the total execution time. The time required for communication and synchronization is directly linked to the number of remote neurons in the network, so it is necessary to investigate the best distribution of remote connections.
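The decomposition behind the OpenMP implementation is to partition each layer's neurons among workers, since every neuron of a layer can be evaluated independently. The sketch below is a hedged Python analogue using threads purely to illustrate the partitioning; the thesis's implementation is OpenMP on shared memory, and its modular structure for reducing remote connections is not reproduced here.

```python
# Feedforward of one layer with its neurons split across worker threads;
# each worker computes the activations of its own slice.

import math
from concurrent.futures import ThreadPoolExecutor

def neuron_slice(inputs, weights, bias):
    """Activations for one partition of the layer's neurons."""
    return [math.tanh(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, bias)]

def parallel_layer(inputs, weights, bias, n_workers=4):
    """Split the layer's neurons into n_workers chunks and evaluate them."""
    chunk = max(1, -(-len(weights) // n_workers))   # ceiling division
    parts = [(weights[i:i + chunk], bias[i:i + chunk])
             for i in range(0, len(weights), chunk)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        results = pool.map(lambda p: neuron_slice(inputs, *p), parts)
    return [a for part in results for a in part]

x = [0.5, -1.0]
W = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]]
b = [0.0, 0.0, 0.5, 0.0]
out = parallel_layer(x, W, b)
```

The parallel result is identical to the serial one; what changes is only who computes each slice, which is why the layer boundary (where all activations must be gathered) is the synchronization point the abstract's communication cost refers to.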

Relevância:

20.00%

Publicador:

Resumo:

The scarce information about the basic chemical, mineralogical and mechanical characteristics of the raw materials used in the manufacture of ceramic products in the economic region of Cariri, specifically in the city of Crato, Ceará state, motivated the development of this work, since in the region's existing economic context these products appear as important links in the production chains. Twenty-five soil samples were collected for test specimens, and the study was performed to differentiate the raw materials and the processing variables of the factories that produce ceramics by extrusion and pressing. The results were obtained after the following analyses: grain size, plasticity index, X-ray fluorescence, X-ray diffraction, thermal analyses and technological properties. Through gresification curves, a comparison was made between linear shrinkage, water absorption, porosity and bulk density. The results show a particle-size distribution and characteristics acceptable for processing, with a dark red fired color, requiring, however, the admixture of a less plastic, coarse-grained clay to act as a plasticity reducer. In spite of the different forming processes, pressing and extrusion, the water absorption and flexural rupture strength were within ABNT standards.

Relevância:

20.00%

Publicador:

Resumo:

The cashew, a fruit from the Brazilian Northeast, is used to produce juice because of its flavor and richness in vitamin C. However, its acceptance is limited by its astringency. Cajuína is a derived product appreciated for its characteristic flavor, freshness and lack of astringency, due to tannin removal. Cajuína is a light yellow beverage made from clarified cashew juice and sterilized after bottling. It differs from the integral and concentrated juices in the clarification and thermal treatment steps; many problems, such as haze and excessive browning, can appear if these steps are not controlled. The objective of this work was divided into two stages, with the aim of supplying process information to obtain a good-quality product with uniform sensory and nutritional characteristics. Polyphenol-protein interaction was studied at the clarification step, which is an empirical process, to provide values for the amount of clarifying solution (gelatin) that must be added to achieve complete juice clarification. Clarification assays were performed with juice dilutions of 1:2 and 1:10, and the effect of metabisulfite and tannic acid addition was evaluated. It was not possible to establish a clarification point. Metabisulfite did not influence the clarification process; tannic acid addition, however, displaced the clarification point, showing the difficulty of visually monitoring the process. Thermal treatment of the clarified juice was studied at 88, 100, 111 and 121 °C. To evaluate non-enzymatic browning, the variations of vitamin C, 5-hydroxymethylfurfural (5-HMF) and sugars were correlated with color parameters (reflectance spectra, color difference and CIELAB). Kinetic models were obtained for the reflectance spectra, ascorbic acid and 5-HMF. It was observed that 5-HMF formation followed first-order kinetics at the beginning of the thermal treatment and zero-order kinetics at later process stages.
An inverse correlation was observed between the absorbance at 420 nm and ascorbic acid degradation, which indicates that ascorbic acid may be the principal factor in cajuína non-enzymatic browning. The constant sugar concentration showed that this parameter did not contribute directly to the non-enzymatic browning. Optimization techniques showed that, to obtain a high vitamin C and a low 5-HMF content, the process must be carried out at 120 °C. With the water-bath thermal treatment, the 90 °C temperature promoted lower ascorbic acid degradation at the expense of a higher 5-HMF level.
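The first-order kinetics referred to above follows C(t) = C0·exp(-k·t). The sketch below estimates a rate constant from two concentration measurements and predicts the remaining content at a later time; the concentrations and times are illustrative, not values measured in the thesis.

```python
# First-order degradation kinetics: C(t) = C0 * exp(-k * t).

import math

def rate_constant(c0, c1, t1):
    """Estimate k from the concentration c1 remaining after time t1."""
    return math.log(c0 / c1) / t1

def concentration(c0, k, t):
    """Concentration remaining after time t under first-order decay."""
    return c0 * math.exp(-k * t)

# Hypothetical ascorbic acid data: 200 mg/L drops to 150 mg/L in 10 min.
k = rate_constant(200.0, 150.0, 10.0)
half_life = math.log(2) / k            # time for the content to halve
remaining = concentration(200.0, k, 30.0)
```

Because the decay is exponential, 30 minutes corresponds to three 10-minute periods, so the remaining content is 200·(150/200)³ = 84.375 mg/L; a zero-order model, by contrast, would predict a constant loss per unit time.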

Relevância:

20.00%

Publicador:

Resumo:

During the salt production process, the first salt crystals formed are disposed of as industrial waste. This waste is formed basically by gypsum, composed of calcium sulfate dihydrate (CaSO4·2H2O), known as "carago cru" or "malacacheta". After being submitted to a calcination process to produce hemihydrate gypsum (CaSO4·0.5H2O), its application in the cement industry becomes possible. This work aims to optimize the time and temperature of the calcination process of the gypsum (carago) to obtain beta plaster according to the specifications of the civil construction norms. The experiments involved the chemical and mineralogical characterization of the gypsum (carago) from the crystallizers, and of the plaster produced in the salt industry located in Mossoró, through the following techniques: X-ray diffraction (XRD), X-ray fluorescence (XRF), thermogravimetric analysis (TG/DTG) and scanning electron microscopy (SEM) with EDS. For the optimization of the time and temperature of the calcination process, a three-level factorial design was used, with response surfaces for the compressive strength tests and setting time, according to norm NBR-13207 (Plasters for civil construction), together with X-ray diffraction of the beta plasters (carago) obtained in the calcination. The STATISTICA 7.0 software was used to fit the experimental data to a statistical model.
The optimization of the calcination of the gypsum (carago) covered the temperature range from 120 °C to 160 °C and times from 90 to 210 minutes in an oven at atmospheric pressure. It was found that, at the temperature of 160 °C and the calcination time of 210 minutes, the compressive strength tests gave values above 10 MPa, conforming to the required standard (> 8.40 MPa), and the X-ray diffractograms showed the predominance of the beta hemihydrate phase, yielding a beta plaster of good quality in accordance with the norms in force and giving this by-product of the salt industry employability in civil construction.
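The factorial-design workflow can be sketched numerically: fit a quadratic response surface to the 3 × 3 grid of (temperature, time) runs and locate the best factor combination. The strength values below are made up for illustration; the thesis reports its own measured data and uses STATISTICA rather than the hand-rolled least-squares fit shown here.

```python
# Quadratic response surface fitted to a two-factor, three-level design:
# y = b0 + b1*T + b2*t + b3*T^2 + b4*t^2 + b5*T*t.

import numpy as np

temps = [120.0, 140.0, 160.0]          # calcination temperature (deg C)
times = [90.0, 150.0, 210.0]           # calcination time (min)
# Hypothetical compressive strength (MPa), rows indexed by temperature.
strength = np.array([[6.5, 7.2, 8.1],
                     [7.4, 8.3, 9.2],
                     [8.2, 9.4, 10.6]])

rows, y = [], []
for i, T in enumerate(temps):
    for j, t in enumerate(times):
        rows.append([1.0, T, t, T * T, t * t, T * t])
        y.append(strength[i, j])
coef, *_ = np.linalg.lstsq(np.array(rows), np.array(y), rcond=None)

def predict(T, t):
    """Evaluate the fitted response surface at a factor combination."""
    return coef @ np.array([1.0, T, t, T * T, t * t, T * t])

best = max(((T, t) for T in temps for t in times),
           key=lambda p: predict(*p))
```

With these illustrative data the fitted surface peaks at the highest temperature and longest time, mirroring the thesis's finding that 160 °C and 210 minutes give strengths above the 8.40 MPa requirement.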

Relevância:

20.00%

Publicador:

Resumo:

This study assesses the potential for industrial reuse of textile wastewater, after a physical-chemical pretreatment, in the denim washing wet processing operations of an industrial textile laundry, with no need for complementary treatments or dilutions. The methodology and the evaluation of the proposed tests were based on the production techniques used in the company and adapted for the experiments tested. The characterization of the treated effluent by 16 selected parameters, and the development of a monitoring scheme able to bring the treated effluent into accordance with current legislation for final disposal, were essential for the initiation of the reuse tests. The parameters color, turbidity, suspended solids and pH were satisfactory as control variables and have simple determination methods. The denim quality variables considered were color, odor, appearance and soft handle. The tests were started on a pilot scale, following complexity factors attributed to the processes, on denim fabric and jeans, and demonstrated the possibility of reuse, since there was no interference with the processes or with the quality of the tested product. Industrial-scale tests were initiated by a control step that confirmed the efficiency of the applied methodology in identifying the possibility of reuse through tests preceding each recipe to be processed. In total, 556 replicates were performed at production scale for 47 different denim washing recipes. The percentage of water reuse was 100% for all processes and repetitions performed after the initial adjustment testing phase. All the jeans were rated at the highest quality level by internal control and marketed, being accepted by contractors. The full-scale use of treated wastewater, supported by the monitoring, evaluation and control methodology suggested in this study, proved to be valid in textile production, causing no negative impact on the quality of the jeans produced under the presented conditions.
It is believed that this methodology can be extrapolated to other laundries to determine the possibility of reuse in denim washing wet processing, with the necessary modifications for each company.

Relevância:

20.00%

Publicador:

Resumo:

This study seeks to establish relations concerning the importance of the sociocultural phenomenon that emerges from Doña Militana for potiguar culture. To that end, we take her recollections of the romances as part of a social context, related to time and space, affecting the material and moral life of her social group. We therefore highlight the phenomenon of individual memory in its relation to collective memory. In this sense, we propose that the maintenance and permanence of these romances in the memory of the romancera reveal a dynamic of her social group in the formation of its identity. As a theoretical framework, we draw on the studies of Maurice Halbwachs regarding collective memory, in parallel with the studies of Paul Zumthor on the functions of orality in the formation of identity. Of fundamental importance to this work, of course, is Doña Militana's own life account, confronted with the cultural symbolism contained in the romances, with the aim of capturing the (co)incidences that mark her identity ties with the cultural universe in which she is inserted. As a result, the objects of analysis ranged from the testimony presented in the interviews to the romances in their poetic, linguistic and mythological aspects, including the meanings revealed by the romancera's performance. We therefore aim at a dialogical understanding of the relation between individual memory (the case of Doña Militana) and collective memory, on the basis of a hypothetical concept underlying the apparent singularity of this phenomenon, to some extent an isolated fact: an intrinsic and complex reason that reveals itself as the tip of an iceberg toward which unconscious historical motives of a cultural heritage converge.