38 results for Macadamia nut - Processing
Abstract:
The cashew, a fruit from the Brazilian Northeast, is used to produce juice because of its flavor and richness in vitamin C. However, its acceptance is limited by its astringency. Cajuína is a derived product appreciated for its characteristic flavor, freshness and lack of astringency, due to tannin removal. It is a light yellow beverage made from clarified cashew juice and sterilized after bottling; it differs from the whole and concentrated juices by the clarification and thermal treatment steps. Problems such as haze and excessive browning can appear if these steps are not controlled. The objective of this work was divided into two stages, with the aim of supplying process information to obtain a good quality product with uniform sensory and nutritional characteristics. Polyphenol-protein interaction was studied at the clarification step, which is an empirical process, to provide values for the amount of clarifying solution (gelatin) that must be added to achieve complete juice clarification. Clarification assays were performed with juice dilutions of 1:2 and 1:10, and the effect of metabisulfite and tannic acid addition was evaluated. It was not possible to establish a clarification point. Metabisulfite did not influence the clarification process; however, tannic acid addition displaced the clarification point, showing the difficulty of monitoring the process visually. Thermal treatment of the clarified juice was studied at 88, 100, 111 and 121 °C. To evaluate non-enzymatic browning, vitamin C, 5-hydroxymethylfurfural (5-HMF) and sugar variation were correlated with color parameters (reflectance spectra, color difference and CIELAB). Kinetic models were obtained for reflectance spectra, ascorbic acid and 5-HMF. It was observed that 5-HMF formation followed first-order kinetics at the beginning of the thermal treatment and zero-order kinetics at later process stages. An inverse correlation was observed between absorbance at 420 nm and ascorbic acid degradation, which indicates that ascorbic acid might be the principal factor in cajuína non-enzymatic browning. The constant sugar concentration showed that this parameter did not contribute directly to non-enzymatic browning. Optimization techniques showed that, to obtain a high vitamin C and a low 5-HMF content, the process must be carried out at 120 °C. With the water-bath thermal treatment, the 90 °C temperature promoted lower ascorbic acid degradation at the expense of a higher 5-HMF level.
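The kind of kinetic modelling mentioned above can be illustrated with a short fitting sketch, assuming, for illustration only, a first-order model for ascorbic acid loss at a fixed temperature; the concentrations and rate constant below are hypothetical placeholders, not the thesis data.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical ascorbic acid concentrations (mg/100 mL) during heating at a
# fixed temperature; illustrative values only, not the thesis measurements.
t = np.array([0, 5, 10, 15, 20, 30])          # minutes
C = np.array([180, 162, 146, 131, 118, 96])   # mg/100 mL

# First-order model: C(t) = C0 * exp(-k * t)
def first_order(t, C0, k):
    return C0 * np.exp(-k * t)

(C0_fit, k_fit), _ = curve_fit(first_order, t, C, p0=(C.max(), 0.01))
print(f"C0 = {C0_fit:.1f} mg/100 mL, k = {k_fit:.4f} 1/min")

# Half-life of ascorbic acid under these (hypothetical) conditions
print(f"t1/2 = {np.log(2) / k_fit:.1f} min")
```

Fitting the rate constant k at several temperatures in this way is what allows the trade-off between vitamin C retention and 5-HMF formation to be optimized.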
Abstract:
During salt production, the first salt crystals formed are disposed of as industrial waste. This waste is formed basically of gypsum, composed of calcium sulfate dihydrate (CaSO4·2H2O), known as carago cru or malacacheta. After being submitted to calcination to produce hemihydrate plaster (CaSO4·0.5H2O), its application in the cement industry becomes possible. This work aims to optimize the time and temperature of the calcination of the gypsum (carago) in order to obtain beta plaster according to the specifications of civil construction standards. The experiments involved the chemical and mineralogical characterization of the gypsum (carago) from the crystallizers, and of the plaster produced in a salt industry located in Mossoró, through the following techniques: X-ray diffraction (XRD), X-ray fluorescence (XRF), thermogravimetric analysis (TG/DTG) and scanning electron microscopy (SEM) with EDS. For the optimization of calcination time and temperature, a three-level factorial design was used, with response surfaces for the compressive strength and setting time tests, according to standard NBR-13207 (Plasters for civil construction), together with X-ray diffraction of the beta plasters (carago) obtained by calcination. STATISTICA 7.0 software was used to fit the experimental data to a statistical model. The optimization of the calcination of the gypsum (carago) covered the temperature range from 120 °C to 160 °C and times from 90 to 210 minutes in an oven at atmospheric pressure. It was found that, at 160 °C and a calcination time of 210 minutes, the compressive strength tests gave values above 10 MPa, which conform to the required standard (> 8.40 MPa), and that the X-ray diffractograms showed the predominance of the beta hemihydrate phase. The result is a beta plaster of good quality that complies with the standards in force, giving this by-product of the salt industry employability in civil construction.
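The response-surface step of such a factorial design can be sketched as fitting a second-order polynomial in temperature and time to the measured compressive strength; the design points and strength values below are hypothetical placeholders, not the thesis measurements.

```python
import numpy as np

# Hypothetical design points: calcination temperature (°C), time (min) and
# measured compressive strength (MPa). Illustrative values only.
T    = np.array([120, 120, 140, 140, 160, 160, 140, 160, 120])
time = np.array([ 90, 210,  90, 210,  90, 210, 150, 150, 150])
y    = np.array([6.5, 7.8, 7.9, 9.4, 9.0, 10.6, 8.7, 9.9, 7.3])  # MPa

# Second-order response surface: y = b0 + b1*T + b2*t + b3*T*t + b4*T^2 + b5*t^2
X = np.column_stack([np.ones_like(T), T, time, T * time, T**2, time**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def predict(Tq, tq):
    return coef @ np.array([1.0, Tq, tq, Tq * tq, Tq**2, tq**2])

# Predicted strength at the most severe condition of this (hypothetical) design
print(f"Predicted strength at 160 °C / 210 min: {predict(160, 210):.1f} MPa")
```

The fitted surface can then be inspected (or maximized) to locate the time-temperature region where the predicted strength exceeds the 8.40 MPa requirement of NBR-13207.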
Abstract:
This study aims to assess the potential for industrial reuse of textile wastewater, after a physical-chemical pretreatment, in the denim-washing wet processing operations of an industrial textile laundry, with no need for complementary treatments or dilutions. The methodology and evaluation of the proposed tests were based on the production techniques used in the company and adapted for the experiments tested. The characterization of the treated effluent for 16 selected parameters, and the development of a monitoring scheme able to qualify the treated effluent for final disposal in accordance with current legislation, were essential for starting the reuse tests. The parameters color, turbidity, suspended solids (SS) and pH proved satisfactory as control variables and have simple determination methods. The denim quality variables considered were color, odor, appearance and softness of handle. The tests started on a pilot scale, following complexity factors attributed to the processes, on denim fabric and jeans, and demonstrated the possibility of reuse, since there was no interference in the processes or in the quality of the tested product. Industrial-scale tests began with a control step that confirmed the efficiency of the methodology applied to identify the possibility of reuse through tests preceding each recipe to be processed. A total of 556 replicates were performed at production scale for 47 different denim-washing recipes. The percentage of water reuse was 100% for all processes and repetitions performed after the initial adjustment testing phase. All the jeans were classified at the highest quality level by internal control and were marketed, being accepted by contractors. The full-scale use of treated wastewater, supported by the monitoring, evaluation and control methodology suggested in this study, proved valid in textile production, causing no negative impact on the quality of the jeans produced under the presented conditions. It is believed that this methodology can be extrapolated to other laundries to determine the possibility of reuse in denim-washing wet processing, with the necessary modifications for each company.
Abstract:
Hebb postulated that memory could be stored thanks to the synchronous activity of many neurons, forming a neural assembly. Given the importance of the hippocampus in the formation of new explicit memories, we used electrophysiological recordings of multiple neurons to assess the relevance of rate coding, based on neural firing rates, in comparison with the temporal coding of neural assembly activity in the consolidation of an aversive memory in rats. Animals were trained in a discriminative avoidance task using a modified elevated plus-maze. During the experimental sessions, slow-wave sleep (SWS) periods were recorded. Our results show an increase in the activity of the identified neural assemblies during post-training SWS, but not in the neural firing rates. In summary, we demonstrate that, for this particular task, the information needed for proper memory consolidation lies in the temporal patterns of synchronized neural activity, not in the firing rate.
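The contrast between rate coding and assembly coding can be made concrete with a small sketch on binned spike counts; the PCA-based extraction of co-activation patterns used here is an assumption for illustration (not necessarily the detection method of the thesis), and the spike data are random surrogates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Surrogate binned spike counts: n_neurons x n_bins (e.g., 50 ms bins during SWS).
# Random data for illustration only.
spikes = rng.poisson(lam=1.0, size=(20, 2000)).astype(float)

# Rate coding: mean firing rate of each neuron across the epoch.
mean_rates = spikes.mean(axis=1)

# Temporal/assembly coding: z-score each neuron, then take the leading
# principal component of the correlation structure as a putative assembly.
z = (spikes - spikes.mean(axis=1, keepdims=True)) / spikes.std(axis=1, keepdims=True)
corr = np.corrcoef(z)
eigvals, eigvecs = np.linalg.eigh(corr)
assembly_pattern = eigvecs[:, -1]            # weights of the strongest pattern

# Time-resolved activation strength of that pattern.
activation = (assembly_pattern @ z) ** 2
print("mean firing rate per bin:", mean_rates.mean().round(2))
print("mean assembly activation:", activation.mean().round(2))
```

Comparing pre- versus post-training epochs on the activation trace, rather than on the mean rates, is what distinguishes the temporal-coding result reported above.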
Abstract:
We have recently verified that the monoamine-depleting drug reserpine, at doses that do not modify motor function, impairs memory in a rodent model of aversive discrimination. In this study, the effects of reserpine (0.1-0.5 mg/kg) on the performance of rats in object recognition, spatial working memory (spontaneous alternation) and emotional memory (contextual fear conditioning) tasks were investigated. While object recognition and spontaneous alternation behavior were not affected by reserpine treatment, contextual fear conditioning was impaired. Together with previous studies, these results suggest that mild monoamine depletion preferentially induces deficits in tasks involving emotional contexts. Possible relationships with the cognitive and emotional processing deficits of Parkinson's disease are discussed.
Abstract:
The auditory system is composed of a series of relays from the outer ear to the cerebral cortex. In mammals, the central auditory system comprises the cochlear nuclei, the superior olivary complex, the inferior colliculus and the medial geniculate body. In this study, the rhombencephalic auditory centers, the cochlear nuclear complex and the superior olivary complex, were evaluated in their cytoarchitectural and neurochemical aspects, through Nissl staining and immunohistochemical techniques to reveal neuron-specific nuclear protein (NeuN), glutamate (Glu), glutamic acid decarboxylase (GAD), enkephalin (ENK), serotonin (5-HT), choline acetyltransferase (ChAT) and the calcium-binding proteins calbindin (CB), calretinin (CR) and parvalbumin (PV). The common marmoset (Callithrix jacchus), a small primate native to the Brazilian Atlantic Forest, was used as the experimental animal. The cochlear nuclear complex was found to be composed of the anteroventral, posteroventral and dorsal nuclei, and the superior olivary complex of the lateral and medial superior olivary nuclei and the nucleus of the trapezoid body. Glu-, GAD-, ENK-, ChAT-, CB-, CR- and PV-immunoreactive cells, fibers and terminals, besides 5-HT-immunoreactive terminals only, were found heterogeneously distributed in all nuclei of both complexes. The emerging data are discussed in a comparative and functional context and represent an important contribution to the knowledge of the central auditory pathways in the common marmoset, and hence in primates.
Abstract:
Salting and sun-drying has been used to preserve meat since the beginning of civilization, and there is evidence that this preservation technique arose in Egypt between 4,000 and 5,000 years ago. In Brazil, according to the literature, jerked beef was the first industrialized meat product. About 70% to 75% of beef muscle is water, and this falls to around 45% in the final product; RIISPOA Article 432 establishes that jerky should contain no more than this amount of moisture in the muscular portion, nor more than 15% total ash, with a tolerance of up to 5% variation. In addition to moisture, proteins, lipids, ash and minerals were analyzed in samples before and after the manufacturing process to determine the content of these nutrients, since they are considered important for product quality. For the forequarter meat (CD and CHD, before and after manufacture, respectively), the values were: moisture 75.28% and 47.38%; protein 14.17 and 22.20 g/100 g; lipids 6.360 and 4.251 g/100 g; ash 0.974 and 9.144 g/100 g; calcium 4.074 and 30.06 ppm; sodium 0.055 and 5.401 g/L; sodium chloride 0.139 and 13.74 g/L; potassium 237.5 and 166.8 ppm; iron 1.721 and 3.295 ppm; phosphorus 0.143 and 0.135 ppm; zinc 4.690 and 6.905 ppm; magnesium 14.63 and 13.75 ppm; manganese 0.017 and 0.007 ppm; copper 0.057 and 0.039 ppm. For the needle-type meat (CPA and CHPA), the values were: moisture 68.04% and 44.17%; protein 13.72 and 24.42 g/100 g; ash 1.137 and 12.68 g/100 g; calcium 17.11 and 12.89 ppm; sodium 0.123 and 4.871 g/L; sodium chloride 0.312 and 12.39 g/L; potassium 305.3 and 182.1 ppm; iron 1.817 and 1.513 ppm; phosphorus 0.273 and 0.139 ppm; zinc 6.305 and 4.783 ppm; magnesium 27.95 and 15.85 ppm; manganese 0.025 and 0.011 ppm; copper 0.057 and 0.143 ppm; chromium 0.014 and 0.068 ppm.
Abstract:
Nowadays, several electronic devices support digital video; examples include cell phones, digital cameras, video cameras and digital televisions. However, raw video, represented just as it was captured, amounts to a huge volume of data, millions of bits. Storing it in this primary form would require a huge amount of disk space, and transmitting it would require a huge bandwidth. Video compression therefore becomes essential to make storage and transmission of this information feasible. Motion estimation is a technique used in the video encoder that exploits the temporal redundancy present in video sequences to reduce the amount of data necessary to represent the information. This work presents a hardware architecture of a motion estimation module for high-resolution video according to the H.264/AVC standard. H.264/AVC is the most advanced video coding standard, with several new features that allow it to achieve high compression rates. The architecture presented in this work was developed to provide high data reuse; the data-reuse scheme adopted reduces the bandwidth required to execute motion estimation. Motion estimation is the task responsible for the largest share of the gains obtained with the H.264/AVC standard, so this module is essential for the final encoder performance. This work is part of the Rede H.264 project, which aims to develop Brazilian technology for the Brazilian Digital Television System.
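The core operation of motion estimation, which the thesis implements in dedicated hardware, can be illustrated in software as full-search block matching with the sum of absolute differences (SAD) criterion; the frames, block size and search range below are arbitrary, and this Python sketch ignores the data-reuse scheme that the hardware architecture exploits.

```python
import numpy as np

def full_search_sad(ref, cur, bx, by, block=16, search=8):
    """Return the motion vector (dy, dx) for the current block at (by, bx),
    found by exhaustive search over +/- `search` pixels in the reference frame."""
    cur_block = cur[by:by + block, bx:bx + block].astype(np.int32)
    best = (0, 0)
    best_sad = np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = by + dy, bx + dx
            if y < 0 or x < 0 or y + block > ref.shape[0] or x + block > ref.shape[1]:
                continue
            cand = ref[y:y + block, x:x + block].astype(np.int32)
            sad = np.abs(cur_block - cand).sum()
            if sad < best_sad:
                best_sad, best = sad, (dy, dx)
    return best, best_sad

# Two 64x64 "frames" as placeholders for luma samples.
rng = np.random.default_rng(1)
ref = rng.integers(0, 256, (64, 64), dtype=np.uint8)
cur = np.roll(ref, shift=(2, 3), axis=(0, 1))   # current frame = reference shifted
print(full_search_sad(ref, cur, bx=16, by=16))  # expect a vector of (-2, -3), SAD 0
```

Because this exhaustive search touches many overlapping candidate blocks, reusing reference samples already fetched is what cuts the memory bandwidth, which is the point of the data-reuse scheme described above.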
Abstract:
In the Fazenda Belém oil field (Potiguar Basin, Ceará State, Brazil), sinkholes and sudden terrain collapses frequently occur, associated with an unconsolidated sedimentary cap covering the Jandaíra karst. This research was carried out in order to understand the mechanisms that generate these collapses. The main tool used was Ground Penetrating Radar (GPR). The work was developed along two lines: one concerns methodological improvements in GPR data processing, while the other concerns the geological study of the Jandaíra karst. This second aspect was strongly supported both by the analysis of outcropping karst structures (in other regions of the Potiguar Basin) and by the interpretation of radargrams of the subsurface karst at Fazenda Belém. An adequate flow to process GPR data was designed and tested, adapted from a usual seismic processing flow. The changes were introduced to take into account important differences between GPR and reflection seismics, in particular: poor coupling between source and ground, mixed phase of the wavelet, low signal-to-noise ratio, single-channel acquisition, and strong influence of wave propagation effects, notably dispersion. High-frequency components of the GPR pulse suffer more pronounced attenuation than low-frequency components, resulting in resolution losses in the radargrams. At Fazenda Belém, a suitable GPR processing flow is all the more necessary because of both the very high level of aerial events and the complexity of the imaged subsurface karst structures. The key point of the processing flow was an improvement in the correction of attenuation effects on the GPR pulse, based on their influence on the amplitude and phase spectra of the GPR signals. In low- and moderate-loss dielectric media, the propagated signal suffers significant changes only in its amplitude spectrum; that is, the phase spectrum of the propagated signal remains practically unaltered over the usual travel-time ranges. Based on this fact, it is shown using real data that the judicious application of the well-known tools of time gain and spectral balancing can efficiently correct the attenuation effects. The proposed approach can be applied to heterogeneous media and does not require precise knowledge of the attenuation parameters of the media. As an additional benefit, the judicious application of spectral balancing promotes a partial deconvolution of the data without changing its phase; in other words, spectral balancing acts in a way similar to a zero-phase deconvolution. In GPR data, the resolution increase obtained with spectral balancing is greater than that obtained with spiking or predictive deconvolution. The evolution of the Jandaíra karst in the Potiguar Basin is associated with at least three events of subaerial exposure of the carbonate platform during the Turonian, Santonian and Campanian. In the Fazenda Belém region, during the middle Miocene, the Jandaíra karst was covered by continental siliciclastic sediments. These sediments partially filled the void space associated with the dissolution structures and fractures; therefore, the development of the karst in this region was attenuated in comparison with other places in the Potiguar Basin where the karst is exposed.
At Fazenda Belém, the generation of sinkholes and terrain collapses is controlled mainly by: (i) the presence of an unconsolidated sedimentary cap which is thick enough to cover the karst completely, but whose sediment volume is lower than the available space associated with the dissolution structures in the karst; (ii) the existence of important SW-NE and NW-SE structural alignments, which promote a localized increase in hydraulic connectivity, allowing the channeling of groundwater and thus facilitating carbonate dissolution; and (iii) the existence of a hydraulic barrier to groundwater flow, associated with the Açu-4 Unit. The terrain collapse mechanisms at Fazenda Belém follow this temporal evolution: meteoric water infiltrates through the unconsolidated sedimentary cap and promotes its remobilization into the void space associated with the dissolution structures in the Jandaíra Formation. This remobilization starts at the base of the sedimentary cap, where abrasion increases due to the change from laminar to turbulent flow regime as the groundwater flow reaches the open karst structures. The remobilized sediments progressively fill the karst voids from bottom to top, so the void space continuously migrates upwards, ultimately reaching the surface and causing the sudden terrain collapses observed. This phenomenon is particularly active during the rainy season, when the water table, normally located in the karst, may be temporarily located in the unconsolidated sedimentary cap.
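The attenuation-correction idea described above, adjusting the amplitude spectrum while leaving the phase untouched, can be sketched as a zero-phase spectral balancing of a single trace; the synthetic trace and smoothing length below are arbitrary assumptions, and an actual flow would also include the time-gain step and typically operate window by window along the trace.

```python
import numpy as np

def spectral_balance(trace, smooth=21, eps=1e-3):
    """Zero-phase spectral balancing: flatten the amplitude spectrum of a trace
    while preserving its phase spectrum."""
    spec = np.fft.rfft(trace)
    amp = np.abs(spec)
    # Smooth the amplitude spectrum to estimate its slowly varying envelope.
    kernel = np.ones(smooth) / smooth
    env = np.convolve(amp, kernel, mode="same")
    # Divide by the envelope: amplitudes are whitened, phase is unchanged.
    balanced = spec / (env + eps * env.max())
    return np.fft.irfft(balanced, n=len(trace))

# Synthetic "radargram trace": a broadband signal whose high frequencies decay
# faster with time, mimicking attenuation. Illustrative only.
t = np.linspace(0, 1, 1024)
trace = (np.sin(2 * np.pi * 40 * t) * np.exp(-3 * t)
         + 0.3 * np.sin(2 * np.pi * 120 * t) * np.exp(-8 * t))

out = spectral_balance(trace)
print("peak amplitude before/after:", trace.max().round(3), out.max().round(3))
```

Because only the amplitude spectrum is modified, the operation behaves like the zero-phase partial deconvolution mentioned in the abstract: high-frequency content lost to attenuation is boosted without distorting arrival times.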
Abstract:
On the modern continental shelf north of Rio Grande do Norte state (NE Brazil) lies a paleo-valley, submerged during the last glacial sea-level lowstand, that marks the continuation of the most important river of this area, the Açu River. Despite the high level of oil-industry exploration activity, there is little information about the shallow stratigraphy. Aiming to fill this gap, focused on the Neogene, this work comprised a marine seismic investigation, the development of a processing flow for high-resolution seismic data, and the recognition of the main morphological feature of the study area: the incised valley of the Açu River. The acquisition of shallow seismic data was undertaken in conjunction with the Laboratory of Marine Geology/Geophysics and Environmental Monitoring (GGEMMA) of the Federal University of Rio Grande do Norte (UFRN), within the SISPLAT project, in which the geomorphological structure of the Açu paleo-valley was the target of the survey. Geophysical data were acquired along longitudinal and transverse sections and subsequently submitted to a processing flow, hitherto little used and/or seldom addressed in the literature, which provided a much higher quality result than the raw data. The proposed flow was developed and applied to the X-Star (acoustic sensor) data using the resources available in the ReflexW 4.5 program. The surface fluvial architecture was reconstructed from bathymetric data and remote sensing images, fused and draped over digital elevation models to create three-dimensional (3D) perspective views, which were used to analyze the 3D geometry of the geological features and provide morphologically defined mapping. The results are expressed in the analysis of seismic sections that extend over the continental shelf and upper slope, from the mouth of the Açu River to the shelf edge, providing the identification and quantification of geometric features such as depth, thickness, horizons and seismic-stratigraphic units of the area. Emphasis was placed on the palaeoenvironmental interpretation of the bounding unconformity and of the sedimentary fill of the incised valley, controlled by structural elements and marked by the influence of sea-level changes. The interpretation of the evolution of this river can yield information enabling more precise descriptions and interpretations of the palaeoenvironmental controls on incised-valley evolution and preservation, providing a better understanding of this reservoir-analog system.
Abstract:
The increasing use of high-resolution shallow seismic methods for the investigation of geological, environmental and industrial problems has driven the development of techniques, flows and computational algorithms. Until recently, processing techniques were not applied to these data, and interpretation was performed on the data as acquired. In order to facilitate and contribute to the improvement of the practices adopted, a free and open-source graphical application called OpenSeismic was developed, based on the free software Seismic Un*x, which is widely used in the treatment of conventional seismic data for the exploration of hydrocarbon reservoirs. The data used to validate the initiative were high-resolution marine seismic data acquired by the Laboratory of Marine Geology/Geophysics and Environmental Monitoring (GGEMMA) of the Federal University of Rio Grande do Norte (UFRN) for the SISPLAT project, in the region of the Açu River paleo-valley. These data were submitted to the processing flow developed by Gomes (2009), using the free software developed in this work, OpenSeismic, as well as other software: the free Seismic Un*x and the commercial ProMAX, which, despite their peculiarities, presented similar results.
Abstract:
In recent decades, changes in the surface properties of materials have been used to improve their tribological characteristics. This improvement, however, depends on the process, the treatment time and, primarily, the thickness of the surface film layer. Physical vapor deposition (PVD) of titanium nitride (TiN) has been used to increase the surface hardness of metallic materials. Thus, the aim of the present study was to propose a numerical-experimental method to assess the thickness (l) of TiN films deposited by PVD. To reach this objective, experimental results of hardness (H) assays were combined with a numerical simulation to study the behavior of this property as a function of the maximum penetration depth of the indenter (hmax) into the film/substrate conjugate. Two methodologies were adopted to determine film thickness. The first compares the numerical H x hmax curve with the experimental curve obtained by the instrumented indentation test; this methodology was used successfully on a TiN-coated titanium (Ti) conjugate. The second strategy combines the numerical Hv x hmax curve with experimental Vickers hardness data (Hv); this methodology was applied to a TiN-coated M2 tool steel conjugate. The mechanical properties of the materials studied were also determined. The thickness results obtained for the two conjugates were compatible with the experimental data.
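The curve-matching idea of the first methodology can be sketched as follows: given simulated H x hmax curves for a set of candidate film thicknesses, the estimated thickness is the one whose curve best fits the experimental indentation data in a least-squares sense. All curves and values below are hypothetical placeholders standing in for the numerical results of the thesis.

```python
import numpy as np

# Hypothetical experimental hardness vs. maximum indentation depth (GPa, nm).
h_exp = np.array([50, 100, 200, 400, 800])
H_exp = np.array([22.0, 20.5, 17.0, 12.5, 8.0])

# Placeholder "simulated" curves H(hmax) for candidate TiN thicknesses (nm),
# standing in for the numerical simulation results of the study.
def simulated_curve(h, thickness, H_film=24.0, H_sub=6.0):
    # Simple film/substrate mixing law used only to generate placeholder curves.
    w = np.exp(-h / (2.0 * thickness))
    return H_sub + (H_film - H_sub) * w

candidates = np.array([100, 200, 300, 400, 500])  # nm

# Pick the thickness whose simulated curve minimises the RMS misfit.
misfit = [np.sqrt(np.mean((simulated_curve(h_exp, l) - H_exp) ** 2)) for l in candidates]
best = candidates[int(np.argmin(misfit))]
print(f"estimated film thickness: {best} nm (RMS misfit {min(misfit):.2f} GPa)")
```

The second methodology follows the same matching logic, with discrete Vickers hardness points taking the place of the continuous instrumented-indentation curve.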
Abstract:
Digital images are used in many areas to solve day-to-day problems. In medicine, the use of computer systems has improved diagnosis and medical interpretation. Dentistry is no different: computer-assisted procedures increasingly support dentists in their tasks. In this context, the area of dentistry known as public oral health is responsible for the diagnosis and oral health treatment of a population. To this end, visual oral inspections are held in order to obtain information on the oral health status of a given population. From this collection of information, also known as an epidemiological survey, the dentist can plan and evaluate the actions taken for the different problems identified. This procedure has limiting factors, such as the limited number of qualified professionals to perform these tasks and differing interpretations of diagnoses, among others. In this context came the idea of using intelligent systems techniques to support these tasks. Thus, this work proposes the development of an intelligent system able to segment, count and classify teeth in occlusal intraoral digital photographic images. The proposed system makes combined use of machine learning techniques and digital image processing. We first carried out a color-based segmentation of the regions of interest, teeth and non-teeth, in the images through the use of a Support Vector Machine (SVM). After identifying these regions, techniques based on morphological operators, such as erosion and the watershed transform, were used for counting and for detecting the boundaries of the teeth, respectively. With the detected tooth borders it was possible to calculate the Fourier descriptors of their shape and the position descriptors. The teeth were then classified according to their types using the SVM with the one-against-all method for the multiclass problem. The multiclass classification problem was approached in two different ways: in the first approach we considered three class types (molar, premolar and non-tooth), while in the second approach five class types were considered (molar, premolar, canine, incisor and non-tooth). The system presented satisfactory performance in segmenting, counting and classifying the teeth present in the images.
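The classification stage described above can be sketched with scikit-learn, training a one-against-all SVM on shape/position feature vectors; the feature values and class labels below are random placeholders standing in for the Fourier and position descriptors extracted from the images, and the kernel and C parameter are assumptions.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.multiclass import OneVsRestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Placeholder feature matrix: each row stands in for the Fourier shape
# descriptors plus position descriptors of one segmented region.
X = rng.normal(size=(300, 12))
# Placeholder labels for the five-class formulation described in the abstract:
# 0 = molar, 1 = premolar, 2 = canine, 3 = incisor, 4 = non-tooth.
y = rng.integers(0, 5, size=300)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# One-against-all SVM for the multiclass problem.
clf = OneVsRestClassifier(SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_train, y_train)
print("accuracy on held-out placeholders:", clf.score(X_test, y_test))
```

The three-class formulation (molar, premolar, non-tooth) is obtained simply by relabeling y; everything else in the one-against-all scheme stays the same.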