930 results for Sol-gel processing


Relevance: 30.00%

Abstract:

In this work we present a simulation of a recognition process that uses the perimeter characterization of simple plant leaves as the sole discriminating parameter. Data coding that makes the representation independent of leaf size and orientation may penalize recognition performance for some varieties. Border description sequences are then used to characterize the leaves. Independent Component Analysis (ICA) is applied in order to determine the best number of components to consider for the classification task, which is implemented by means of an Artificial Neural Network (ANN). The results obtained with ICA as a pre-processing tool are satisfactory and, compared with some reference approaches, our system improves the recognition success rate to up to 80.8%, depending on the number of independent components considered.
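As an illustration of this kind of pipeline (ICA pre-processing of border-description features followed by an ANN classifier), a minimal Python sketch using scikit-learn is given below; the synthetic data, feature length, component counts and network size are hypothetical and not taken from the paper.

# Minimal sketch: ICA pre-processing followed by ANN classification of leaf border features.
# Shapes, component counts and network size are illustrative only.
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 64))      # 300 leaves, 64-point border-description vectors (synthetic)
y = rng.integers(0, 5, size=300)    # 5 hypothetical leaf varieties

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Vary the number of independent components, as the study does, and compare accuracy.
for n_components in (8, 16, 32):
    model = make_pipeline(
        FastICA(n_components=n_components, random_state=0),
        MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0),
    )
    model.fit(X_train, y_train)
    print(n_components, "components -> test accuracy:", model.score(X_test, y_test))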

Relevance: 30.00%

Abstract:

We performed a number of tests with the aim of developing an effective extraction method for the analysis of carotenoid content in maize seed. Mixtures of methanol–ethyl acetate (6:4, v/v) and methanol–tetrahydrofuran (1:1, v/v) were the most effective solvent systems for carotenoid extraction from maize endosperm under the conditions assayed. We also addressed sample preparation prior to the analysis of carotenoids by liquid chromatography (LC). The LC response of extracted carotenoids and standards in several solvents was evaluated, and the results were related to the degree of solubility of these pigments. Three key factors were found to be important when selecting a suitable injection solvent: compatibility between the mobile phase and the injection solvent, carotenoid polarity, and carotenoid content in the matrix.

Relevance: 30.00%

Abstract:

This paper analyzes applications of cumulant analysis in speech processing. Special focus is placed on different second-order statistics. A dominant role is played by an integral representation of cumulants in terms of integrals involving cyclic products of kernels.
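For reference only (standard definitions, not results from this paper), the low-order cumulants of a zero-mean stationary process x(t) are built from joint moments; the second-order cumulant reduces to the autocovariance:

\[
c_2(\tau) = \mathrm{E}[x(t)\,x(t+\tau)], \qquad
c_3(\tau_1,\tau_2) = \mathrm{E}[x(t)\,x(t+\tau_1)\,x(t+\tau_2)],
\]
\[
c_4(\tau_1,\tau_2,\tau_3) = \mathrm{E}[x(t)\,x(t+\tau_1)\,x(t+\tau_2)\,x(t+\tau_3)]
- c_2(\tau_1)\,c_2(\tau_2-\tau_3) - c_2(\tau_2)\,c_2(\tau_3-\tau_1) - c_2(\tau_3)\,c_2(\tau_1-\tau_2).
\]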

Relevance: 30.00%

Abstract:

Neural signal processing is a discipline within neuroengineering. This interdisciplinary approach combines principles from machine learning, signal processing theory, and computational neuroscience, applied to problems in basic and clinical neuroscience. The ultimate goal of neuroengineering is a technological revolution in which machines would interact with the brain in real time. Machines and brains could interface, enabling normal function in cases of injury or disease, brain monitoring, and/or medical rehabilitation of brain disorders. Much current research in neuroengineering focuses on understanding the coding and processing of information in the sensory and motor systems, quantifying how this processing is altered in the pathological state, and determining how it can be manipulated through interactions with artificial devices, including brain–computer interfaces and neuroprosthetics.

Relevance: 30.00%

Abstract:

Multispectral satellite images, particularly those with high spatial resolution (finer than 30 m on the ground), are an invaluable source of information for decision-making in various fields related to natural resource management, environmental preservation, and the planning and management of urban centres. Study scales can range from local (resolutions finer than 5 m) to regional (resolutions coarser than 5 m). These images characterize the variation of object reflectance across the spectrum, which is the key information for a large number of applications of these data. However, satellite sensor measurements are also affected by "parasitic" factors related to illumination and viewing conditions, the atmosphere, topography, and sensor properties. Two questions concerned us in this research. What is the best approach for retrieving ground reflectances from the digital numbers recorded by the sensors, taking these parasitic factors into account? Is this retrieval the sine qua non condition for extracting reliable information from the images, given the issues specific to the various application domains (land mapping, environmental monitoring, landscape change detection, resource inventories, etc.)? Research conducted over the past 30 years has produced a series of techniques for correcting the data for the effects of parasitic factors, some of which make it possible to retrieve ground reflectances. Several questions nevertheless remain open, and others require further work in order, on the one hand, to improve the accuracy of the results and, on the other, to make these techniques more versatile by adapting them to a wider range of data acquisition conditions. A few of them can be mentioned:
- How can atmospheric characteristics (in particular aerosol particles) adapted to local and regional conditions be taken into account, rather than relying on default models that reflect long-term spatio-temporal trends but fit poorly to instantaneous, spatially restricted observations?
- How can the "contamination" of the signal coming from the object viewed by the sensor by signals coming from surrounding objects (the adjacency effect) be accounted for? This phenomenon becomes very important for images with resolutions finer than 5 m.
- What are the effects of off-nadir viewing angles, which are increasingly common since they offer better temporal resolution and the possibility of obtaining stereoscopic image pairs?
- How can the efficiency of automatic processing and analysis techniques for multispectral images be improved over rugged and mountainous terrain, taking into account the multiple effects of topographic relief on the remotely sensed signal?
Moreover, despite the many demonstrations by researchers that the information extracted from satellite images can be degraded by all these parasitic factors, radiometric corrections are still rarely applied on a routine basis, unlike geometric corrections, for which commercial remote sensing software offers versatile, powerful algorithms within the reach of users.
Radiometric correction algorithms, when they are offered, remain inflexible black boxes that usually require expert users. The objectives of this research were the following: 1) to develop ground reflectance retrieval software that addresses the questions raised above; this software had to be modular enough to be enhanced, improved, and adapted to various satellite image application problems; and 2) to apply this software in different contexts (urban, agricultural, forest) and analyze the results in order to assess the gain in accuracy of the information extracted from satellite images converted into ground reflectance images, and hence whether this step is necessary regardless of the application. Through this research we thus developed a ground reflectance retrieval tool (the new version of the REFLECT software). This software is based on the formulation (and routines) of the 6S code (Second Simulation of the Satellite Signal in the Solar Spectrum) and on the dark target method for estimating the aerosol optical depth (AOD), which is the most difficult factor to correct. Substantial improvements were made to the existing models. These improvements essentially concern the aerosol properties (integration of a more recent model, improved dark target search for AOD estimation), accounting for the adjacency effect using a specular reflection model, support for most of the high-resolution multispectral sensors currently in use (Landsat TM and ETM+, all the SPOT 1 to 5 HR sensors, EO-1 ALI and ASTER) and very high resolution sensors (QuickBird and Ikonos), and correction of topographic effects with a model that separates the direct and diffuse components of solar radiation and also adapts to the forest canopy. Validation work showed that REFLECT retrieves ground reflectance with an accuracy of about ±0.01 reflectance units (for the visible, NIR and MIR spectral bands), even for surfaces with variable topography. Through simulations of apparent reflectances, the software also showed the extent to which the parasitic factors affecting the digital numbers of the images can alter the useful signal, i.e. the ground reflectance (errors of 10 to more than 50%). REFLECT was further used to assess the importance of using ground reflectances rather than raw digital numbers for various common remote sensing applications in classification, change detection, agriculture, and forestry. In most applications (multi-date change detection, use of vegetation indices, estimation of biophysical parameters, etc.), image correction is a crucial step in obtaining reliable results.
From a software standpoint, REFLECT consists of a series of easy-to-use menus corresponding to the successive steps of entering the scene inputs, computing gaseous transmittances, estimating the AOD by the dark target method and, finally, applying the radiometric corrections to the image, notably via a fast option that processes a 5000 by 5000 pixel image in about 15 minutes. This research opens up several avenues for further improvement of the models and methods related to radiometric correction, in particular the integration of the BRDF (bidirectional reflectance distribution function) into the formulation, the handling of translucent clouds through modelling of non-selective scattering, and the automation of the equivalent-slopes method proposed for topographic corrections.
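As background for the kind of correction discussed (and not as the REFLECT implementation itself, which is not reproduced here), the short Python sketch below shows the standard conversion of raw digital numbers to top-of-atmosphere reflectance, the step that precedes atmospheric, adjacency and topographic corrections; the calibration gain/offset, solar irradiance and geometry values are placeholders.

# Standard DN -> TOA reflectance conversion (illustrative values only).
import numpy as np

def dn_to_toa_reflectance(dn, gain, offset, esun, sun_elevation_deg, earth_sun_dist_au):
    """Convert digital numbers (DN) to top-of-atmosphere (TOA) reflectance.

    Two standard steps: DN -> at-sensor radiance via the sensor calibration
    (gain, offset), then radiance -> TOA reflectance using the band solar
    irradiance (esun), the Earth-Sun distance and the solar elevation angle.
    """
    radiance = gain * dn.astype(float) + offset                     # W m-2 sr-1 um-1
    sun_zenith = np.deg2rad(90.0 - sun_elevation_deg)
    return (np.pi * radiance * earth_sun_dist_au ** 2) / (esun * np.cos(sun_zenith))

# Placeholder values for a single band of a hypothetical scene.
dn = np.array([[120, 130], [110, 125]], dtype=np.uint16)
rho_toa = dn_to_toa_reflectance(dn, gain=0.78, offset=-6.98, esun=1536.0,
                                sun_elevation_deg=45.0, earth_sun_dist_au=1.0)
print(rho_toa)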

Relevance: 30.00%

Abstract:

Cephalopods are an important food item in various countries because they are considered a delicacy when consumed raw; Sepia and Loligo in particular are eaten raw mainly in Japan and Russia. The freshness of the product is therefore very important when it is consumed raw. The major species that dominate our squid catch are Loligo duvaucelii and Doryteuthis sibogae, and there is a noticeable difference in quality between the two species. The needle squid (Doryteuthis sibogae) contributes about 35% of the total squid landing. Owing to its fast deterioration, a major portion of the needle squid caught during the first few hauls is thrown back to sea, and only the catch from the last hauls is taken to the landing centres. At present the needle squid is processed as blanched rings, and the desired quality is not obtained if it is processed whole, whole cleaned, or as tubes. In this study an attempt is made to investigate the biochemical characteristics of both species of squid in relation to their quality and the process control measures to be adopted. The effect of various treatments on their quality and the changes in proteolytic and lysosomal enzymes under various processing conditions are also studied in detail. This study can thus provide the seafood industry with relevant suggestions and solutions for the effective utilization of both species of squid, with emphasis on the needle squid.

Relevance: 30.00%

Abstract:

The period, known to UK farmers and processors as the "spring flush", when the cows' diet changes from dry feed to spring pasture, has long been established as a time of change in milk properties and processing characteristics. Although it is believed to be a time when problems in processing are most likely to occur (e.g. milk that does not form clots or forms weak gels during cheesemaking), there is little evidence in the literature of detailed changes in milk composition and their impact on product manufacture. In this study, a range of physicochemical properties were analysed in milk collected from five commercial dairy herds before, during and after the spring flush period of 2006. In particular, total and ionic calcium contents of milk were studied in relation to other parameters including rennet clotting, acid gel properties, heat coagulation, alcohol stability, micelle size and zeta potential. Total divalent cations were significantly reduced from 35.4 to 33.4 mmol.L-1 during the study, while ionic calcium was reduced from 1.48 to 1.40 mmol.L-1 over the same period. Many parameters varied significantly between the sample dates. However, there was no evidence to suggest that any of the milk samples would have been unsuitable for processing - e.g. there were no samples that did not form clots with chymosin within a reasonable time or formed especially weak rennet or acid gels. A number of statistically significant correlations were found within the data, including ionic calcium concentration and pH; rennet clotting time (RCT) and micelle diameter; and RCT and ethanol stability. Overall, while there were clear variations in milk composition and properties over this period, there was no evidence to support the view that serious processing problems are likely during the change from dry feed to spring pasture.

Relevance: 30.00%

Abstract:

"Yor" is a traditional sausage like product widely consumed in Thailand. Its textures are usually set by steaming, in this experiment ultra-high pressure was used to modify the product. Three types of hydrocolloid; carboxymethylcellulose (CMC), locust bean gum (LBG) and xanthan gum, were added to minced ostrich meat batter at concentration of 0-1% and subjected to high pressure 600 Mpa, 50 degrees C, 40 min. The treated samples were analysed for storage (G) and loss (G '') moduli by dynamic oscillatory testing as well as creep compliance for control stress measurement. Their microstructures using confocal microscopy were also examined. Hydrocolloid addition caused a significant (P < 0.05) decrease in both the G' and G '' moduli. However the loss tangent of all samples remained unchanged. Addition of hydrocolloids led to decreases in the gel network formation but appears to function as surfactant materials during the initial mixing stage as shown by the microstructure. Confocal microscopy suggested that the size of the fat droplets decreased with gum addition. The fat droplets were smallest on the addition of xanthan gum and increased in the order CMC, LBG and no added gum, respectively. Creep parameters of ostrich yors with four levels of xanthan gum addition (0.50%, 0.75%, 1.00% and 1.25%) showed an increase in the instantaneous compliance (J(0)), the retarded compliance (J(1)) and retardation time (lambda(1)) but a decrease in the viscosity (eta(0)) with increasing levels of addition. The results also suggested that the larger deformations used during creep testing might be more helpful in assessing the mechanical properties of the product than the small deformations used in oscillatory rheology. (c) 2007 Elsevier Ltd. All rights reserved.

Relevance: 30.00%

Abstract:

The potential of visible-near infrared spectra obtained using a light backscatter sensor, in conjunction with chemometrics, to predict curd moisture and whey fat content in a cheese vat was examined. A three-factor (renneting temperature, calcium chloride, cutting time) central composite design was carried out in triplicate. Spectra (300–1,100 nm) of the product in the cheese vat were captured during syneresis using a prototype light backscatter sensor. Stirring followed cutting of the gel, and samples of curd and whey were removed at 10 min intervals and analyzed for curd moisture and whey fat content. The spectral data were used to develop models for predicting curd moisture and whey fat contents using partial least squares regression. Subjecting the spectral data set to jack-knifing improved the accuracy of the models. The whey fat models (R = 0.91, 0.95) and curd moisture models (R = 0.86, 0.89) provided good and approximate predictions, respectively. Visible-near infrared spectroscopy was found to have potential for the prediction of important syneresis indices in stirred cheese vats.
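As an illustration of the chemometric step described (partial least squares regression of syneresis indices on backscatter spectra), a minimal Python sketch using scikit-learn follows; the wavelength grid, number of latent variables and synthetic data are hypothetical and not taken from the study.

# Minimal sketch: PLS regression of a syneresis index on visible-NIR spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(1)
wavelengths = np.arange(300, 1101, 10)            # 300-1100 nm grid (illustrative)
X = rng.normal(size=(60, wavelengths.size))       # 60 spectra captured during syneresis (synthetic)
y = 0.5 * X[:, 20] - 0.3 * X[:, 45] + rng.normal(scale=0.1, size=60)  # synthetic whey fat content

pls = PLSRegression(n_components=5)               # number of latent variables is illustrative
y_pred = cross_val_predict(pls, X, y, cv=5).ravel()
r = np.corrcoef(y, y_pred)[0, 1]                  # correlation coefficient, analogous to the reported R
print(f"cross-validated R = {r:.2f}")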

Relevance: 30.00%

Abstract:

Dried beef is a food traditionally eaten by people of the Brazilian Northeast and is widely traded in the city of Natal-RN. It is usually produced in an empirical manner, without any standardization of production. It is characterized as a partially dehydrated meat product whose water activity is not low enough to prevent microbial growth, degradation, or the production of microbial toxins. Guaranteeing the quality of marketed dried beef means providing a product that is hygienically, microbiologically, physicochemically and sensorially stable and adequate for the safety and satisfaction of consumers, who are increasingly attracted to foods with natural preservatives. Thus, meat industries are replacing current seasonings and preservatives with natural alternatives without affecting the shelf life of the products, and lactic acid has been used to meet these requirements. In this context, this study aimed to evaluate the effect of lactic acid on the physicochemical, microbiological and sensory characteristics of dried meat, as well as to profile its consumers in the city of Natal-RN. The results showed that the use of lactic acid at concentrations of 1% and 2% during the processing of dried meat had a statistically significant effect (p < 0.05) on the physicochemical parameters (pH and water activity) and consequently reduced the microbial counts, without altering the taste of the newly developed product. Regarding the consumer profile, the majority of respondents (71.75%) did not check for the stamp of the Federal Inspection Service (SIF) when buying this meat product, while 81.55% of consumers checked the hygiene conditions of the premises and the handlers; however, a large proportion of respondents were not concerned with the guaranteed origin of typical regional products, which represents a food safety hazard for consumers in the city of Natal-RN.

Relevance: 30.00%

Abstract:

The nanostructural characteristics of acid-catalyzed sonogels are studied along the aging process at 60 degrees C in saturated conditions and after CO2 supercritical extraction (aerogel). The structural evolution was studied by means of small-angle X-ray scattering (SAXS) and UV-visible absorption techniques. The sonogel exhibits a mass fractal structure on a length scale between ξ ~ 1/q0 ~ 5.3 nm and a1 ~ 1/qm ~ 0.22 nm, the length scale probed by SAXS. The apparent mass fractal dimension increases slightly from 2.0 for the fresh gel to 2.2 after 14 days of aging in wet conditions. The UV absorption also increases with aging time in wet conditions. Both observations are consistent with the syneresis process accompanying the polycondensation progress during aging in saturated conditions. For long aging times, the wet sonogels show a slight transition from a mass to a surface fractal in a very small interval of the length scale, developing an extremely rough surface with fractal dimension DS ~ 2.9. The fractal characteristics of the sonogels practically do not change with the alcohol exchange. With the CO2 supercritical extraction (aerogel), the interval in the length scale in which the surface fractal is defined increases, while the surface fractal dimension diminishes to DS ~ 2.5. The mass fractal characteristics are less apparent in the aerogels. (C) 2001 Elsevier B.V. All rights reserved.
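For reference (standard small-angle scattering relations, not taken from the paper), the mass and surface fractal dimensions quoted above are usually read off the power-law decay of the scattered intensity in the fractal regime:

\[
I(q) \propto q^{-D} \quad (\text{mass fractal}, \; 1 < D < 3), \qquad
I(q) \propto q^{-(6 - D_S)} \quad (\text{surface fractal}, \; 2 \le D_S < 3),
\]

so slopes near -2.0 to -2.2 correspond to the mass fractal regime reported here, while slopes between -3 and -4 indicate surface fractal scattering.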

Relevance: 30.00%

Abstract:

The effect of milk processing on the microstructure of probiotic low-fat yogurt was studied. Skim milk fortified with skim milk powder was subjected to three treatments prior to inoculation: thermal treatment at 85 degrees C for 30 min, high hydrostatic pressure (HHP) at 676 MPa for 5 min, and combined treatments of HHP and heat. The processed milk was then fermented using two different starter cultures containing Streptococcus thermophilus, Lactobacillus delbrueckii ssp. bulgaricus, Lactobacillus acidophilus, and Bifidobacterium longum. The microstructure of heat-treated milk yogurt had fewer interconnected chains of irregularly shaped casein micelles, forming a network that enclosed the void spaces. On the other hand, the microstructure of HHP yogurt had more interconnected clusters of densely aggregated protein of reduced particle size, with a more spherical appearance, exhibiting a smoother, more regular surface and a more uniform size distribution. The combined HHP and heat milk treatments led to compact yogurt gels with increasingly larger casein micelle clusters interspaced by void spaces, and exhibited a high degree of cross-linking. The rounded micelles tended to fuse and form small irregular aggregates in association with clumps of dense amorphous material, which resulted in improved gel texture and viscosity. (C) 2007 Elsevier Ltd. All rights reserved.

Relevance: 30.00%

Abstract:

The CMS Collaboration conducted a month-long data taking exercise, the Cosmic Run At Four Tesla, during October-November 2008, with the goal of commissioning the experiment for extended operation. With all installed detector systems participating, CMS recorded 270 million cosmic ray events with the solenoid at a magnetic field strength of 3.8 T. This paper describes the data flow from the detector through the various online and offline computing systems, as well as the workflows used for recording the data, for aligning and calibrating the detector, and for analysis of the data. © 2010 IOP Publishing Ltd and SISSA.

Relevance: 30.00%

Abstract:

Listeria monocytogenes is a pathogen capable of adhering to many surfaces and forming biofilms, which may explain its persistence in food processing environments. This study aimed to genetically characterise L. monocytogenes isolates obtained from bovine carcasses and beef processing facilities and to evaluate their adhesion abilities. DNA from 29 L. monocytogenes isolates was subjected to restriction enzyme digestion (AscI and ApaI), and two clusters were identified for serotypes 4b and 1/2a, with similarities of 48% and 68%, respectively. The adhesion ability of the isolates was tested considering inoculum concentration, culture media, carbohydrate source, NaCl concentration, incubation temperature, and pH. Each isolate was tested at 10^8 CFU mL^-1 and classified according to its adhesion ability as weak (8 isolates), moderate (17) or strong (4). The isolates showed higher adhesion capability in non-diluted culture media, media at pH 7.0, incubation at 25 degrees C and 37 degrees C, and media with NaCl at 5% and 7%. No relevant differences were observed in adhesion ability with respect to the carbohydrate source. The results indicated a wide diversity of PFGE profiles among persistent L. monocytogenes isolates, without relation to their adhesion characteristics. It was also observed that stress conditions did not enhance the adhesion profile of the isolates. (C) 2012 Elsevier Ltd. All rights reserved.