Abstract:
Lipolysis and oxidation of lipids in foods are the major biochemical and chemical processes that cause food quality deterioration, leading to the characteristic, unpalatable odour and flavour called rancidity. Besides unpalatability, rancidity may give rise to toxic levels of certain compounds, such as aldehydes, hydroperoxides, epoxides and cholesterol oxidation products. In this PhD study, chromatographic and spectroscopic techniques were employed to determine the degree of rancidity in different animal products and its relationship with technological parameters such as dietary fat sources, packaging, processing and storage conditions. To achieve this goal, capillary gas chromatography (CGC) was employed not only to determine the fatty acid profile but also, after solid phase extraction, the amounts of free fatty acids (FFA), diglycerides (DG), sterols (cholesterol and phytosterols) and cholesterol oxidation products (COPs). UV/VIS absorbance spectroscopy was applied to determine hydroperoxides, the primary products of oxidation, and to quantify secondary oxidation products. Most of the foods analysed in this study were meat products. Indeed, lipid oxidation is a major deterioration reaction in meat and meat products, resulting in adverse changes in colour, flavour and texture, and the development of rancidity has long been recognized as a serious problem during meat handling, storage and processing. For a dairy-type product, a vegetal cream, the lipid fraction and the development of rancidity during storage were studied to evaluate its shelf-life and some nutritional features, like the saturated/unsaturated fatty acid ratio and the phytosterol content. Finally, in view of the growing interest in functional foods in recent years, a new electrophoretic method was optimized and compared with HPLC to check the quality of a beehive product, royal jelly.
This manuscript reports the main results obtained in the five activities briefly summarized as follows: 1) comparison between HPLC and a new electrophoretic method in the evaluation of authenticity of royal jelly; 2) study of the lipid fraction of a vegetal cream under different storage conditions; 3) study of lipid oxidation in minced beef during storage under a modified atmosphere packaging, before and after cooking; 4) evaluation of the influence of dietary fat and processing on the lipid fraction of chicken patties; 5) study of the lipid fraction of typical Italian and Spanish pork dry sausages and cured hams.
Abstract:
Research in art conservation has developed since the early 1950s, making a significant contribution to the conservation and restoration of cultural heritage artefacts. In fact, only through profound knowledge of the nature and condition of the constituent materials can suitable decisions on conservation and restoration measures be adopted and preservation practices enhanced. The study of ancient artworks is particularly challenging, as they can be considered heterogeneous, multilayered systems in which numerous interactions between the different components, as well as degradation and ageing phenomena, take place. However, the difficulty of physically separating the different layers, owing to their thickness (1-200 µm), can result in the inaccurate attribution of identified compounds to a specific layer. Such details can therefore only be analysed when the sample preparation method leaves the layer structure intact, for example by embedding cross sections in synthetic resins. Hence, spatially resolved analytical techniques are required not only to characterize exactly the nature of the compounds but also to obtain precise chemical and physical information about ongoing changes. This thesis focuses on the application of FTIR microspectroscopic techniques to cultural heritage materials. The first section introduces the use of FTIR microscopy in conservation science, with particular attention to sampling criteria and sample preparation methods. The second section evaluates and validates the use of different FTIR microscopic analytical methods applied to different art conservation issues that may be encountered when dealing with cultural heritage artefacts: the characterisation of the artistic execution technique (chapter II-1), studies on degradation phenomena (chapter II-2) and, finally, the evaluation of protective treatments (chapter II-3).
The third and last section is divided into three chapters which describe recent developments in FTIR spectroscopy for the characterisation of paint cross sections and, in particular, of thin organic layers: a newly developed preparation method based on embedding systems in infrared-transparent salts (chapter III-1), the new opportunities offered by macro-ATR imaging spectroscopy (chapter III-2) and the possibilities offered by the different FTIR microspectroscopic techniques available today (chapter III-3). In chapter II-1, FTIR microspectroscopy, as a molecular analysis technique, is presented in an integrated approach with other analytical techniques. The proposed sequence is optimized according to the limited quantity of sample available, and this methodology permits identification of the painting materials and characterisation of the adopted execution technique and state of conservation. Chapter II-2 describes the characterisation of degradation products with FTIR microscopy, since the investigation of the ageing processes encountered in old artefacts represents one of the most important issues in conservation research. Metal carboxylates resulting from the interaction between pigments and binding media are characterized using synthesised metal palmitates, and their production is detected on copper-, zinc-, manganese- and lead-based pigments (the latter associated with lead carbonate) dispersed either in oil or in egg tempera. Moreover, significant effects are observed with iron and cobalt (acceleration of triglyceride hydrolysis), and, for the first time, manganese carboxylates are observed on sienna and umber paints. Finally, in chapter II-3, FTIR microscopy is combined with further elemental analyses to characterise and assess the performance and stability of newly developed protective treatments, which should better address conservation-restoration problems.
In the second part, in chapter III-1, an innovative embedding system in potassium bromide is reported, focusing on the characterisation and localisation of organic substances in cross sections. Not only the identification but also the distribution of proteinaceous, lipidic or resinaceous materials is evidenced directly on different paint cross sections, especially in thin layers of the order of 10 µm. Chapter III-2 describes the use of a conventional diamond ATR accessory coupled with a focal plane array to obtain chemical images of multi-layered paint cross sections. A rapid and simple identification of the different compounds is achieved without the use of any infrared microscope objectives. Finally, the latest FTIR techniques available are compared in chapter III-3 for the characterisation of paint cross sections. Results are presented in terms of spatial resolution, data quality and chemical information obtained; in particular, a new FTIR microscope equipped with a linear array detector, which reduces the spatial resolution limit to approximately 5 µm, provides very promising results and may represent a good alternative to both mapping and imaging systems.
Electrostatic supramolecular assembly of charged dendritic polymers and their biological application
Abstract:
The aim of this study was the development of functional multilayer films through electrostatic layer-by-layer (LbL) assembly of dendritic macromolecules, the investigation of the fundamental properties of these multilayered films and the study of their biological applications.
The synthesis of anionic hyperbranched polyglycerols (hbPG) and the preparation of hbPG/phosphorus dendrimer multilayers are reported, together with the influence of the deposition conditions on the multilayers. The thickness of the multilayer films increases with decreasing molecular weight of the anionic hbPGs. Multilayer films fabricated from low molecular weight hbPGs grow less regularly, because fewer charged carboxylic acid groups provide weaker electrostatic forces for deposition. The film thickness is reduced with increasing pH and decreasing NaCl concentration. The observed changes in multilayer thickness and surface morphology could be interpreted with the aid of theories on the charge density and conformation of the anionic hbPG chains in solution.
Besides the study of the fundamental properties of hbPG/phosphorus multilayer films, antifouling thin films derived from hbPG layers were developed. The antifouling properties of the hbPG layers were found to correlate with the molecular weight of the anionic hbPG and with the film thickness. It was demonstrated that a single layer of the anionic hbPG with the highest molecular weight reduces non-specific protein adsorption more efficiently than single layers of lower molecular weight, and that all hbPG bilayers possess excellent antifouling properties.
Phosphorus dendrimer multilayers were successfully prepared as platforms to detect DNA immobilization and hybridization. The effect of NaCl concentration on the multilayer film thickness was evaluated to obtain the optimal film thickness.
Using the multilayer deposited under the optimized conditions as a substrate, a high loading of DNA probes was achieved through covalent coupling of the probe DNA with the as-formed multilayer films. The hybridization of target DNA with the immobilized probe DNA was then carried out and studied by SPFS. The limit of detection upon hybridization was estimated on various dendrimer multilayer platforms; the minimum detectable concentration for DNA hybridization is of the same order of magnitude as for other neutral phosphorus dendrimer systems. Furthermore, the LbL deposition of phosphorus dendrimer multilayers provided a mild and simple way to prepare platforms for DNA microarrays.
Building on the phosphorus dendrimer multilayer systems, dendritic star polymers were employed, which have more reactive groups than the phosphorus dendrimers. The as-assembled dendritic star polymer multilayer films exhibited distinct morphological characteristics, undergoing extensive structural reorganization upon post-treatment under different pH conditions. The binding kinetics of probe DNA molecules on the outermost negatively charged dendritic surface were also studied by SPR. The binding capacities of probe DNA on multilayer surfaces fabricated from the first and second generations of dendritic star polymers were compared; an improved binding capacity was achieved with the second-generation multilayer films owing to their larger number of reactive groups. The DNA hybridization reaction on the dendritic multilayer films was investigated by SPFS, and similar hybridization behaviors were found on both multilayer surfaces. The hybridization kinetics and affinities were also compared with those of the phosphorus dendrimer multilayer surfaces, showing improved detection sensitivity over the phosphorus dendrimer multilayer films.
Abstract:
In the domain of safety-critical embedded systems, the design process for applications is highly complex. For a given hardware architecture, electronic control units can be upgraded so that all existing processes and signals execute on time. The timing requirements are strict and must be met in every periodic recurrence of the processes, since guaranteeing concurrent execution is of the utmost importance. Existing approaches can compute design alternatives quickly, but they do not guarantee that the cost of the necessary hardware changes is minimal. We present an approach that computes cost-minimal solutions to the problem which satisfy all timing constraints. Our algorithm uses linear programming with column generation, embedded in a tree structure, to provide lower and upper bounds during the optimization process. Through a decomposition of the master problem, the complex constraints guaranteeing periodic execution are shifted into independent subproblems formulated as integer linear programs. Both the analyses of process execution and the methods of signal transmission are examined, and linearized representations are given. Furthermore, we present a new formulation for fixed-priority execution that additionally computes worst-case process response times, which are required for scenarios in which timing constraints are imposed on subsets of processes and signals. We demonstrate the applicability of our methods by analysing instances containing process structures from real applications. Our results show that lower bounds can be computed quickly in order to prove the optimality of heuristic solutions.
When we deliver optimal solutions with response times, our new formulation compares favourably with other approaches in terms of runtime. The best results are obtained with a hybrid approach that combines heuristic initial solutions, preprocessing, and a heuristic phase followed by a short exact computation phase.
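The cost-minimization idea above can be illustrated with a deliberately tiny sketch: pick the cheapest hardware upgrade under which a periodic task set remains provably schedulable. This is not the thesis's column-generation algorithm; the task set, the upgrade options and the use of the classical Liu-Layland utilization bound as the schedulability test are all illustrative assumptions.

```python
# Hypothetical periodic tasks: (worst-case execution time at base speed, period), in ms.
TASKS = [(3.0, 10.0), (6.0, 20.0), (12.0, 40.0)]

# Hypothetical ECU upgrade options: (speed factor, cost in arbitrary units).
UPGRADES = [(1.0, 0), (1.5, 40), (2.0, 90)]

def schedulable(tasks, speed):
    """Sufficient rate-monotonic utilization-bound test (Liu & Layland):
    sum(C_i / T_i) <= n * (2**(1/n) - 1), with WCETs scaled by ECU speed."""
    n = len(tasks)
    utilization = sum((c / speed) / t for c, t in tasks)
    return utilization <= n * (2 ** (1 / n) - 1)

def cheapest_upgrade(tasks, upgrades):
    """Enumerate the (tiny) design space and return (cost, speed) of the
    cheapest upgrade under which every periodic deadline is provably met."""
    feasible = [(cost, speed) for speed, cost in upgrades if schedulable(tasks, speed)]
    return min(feasible) if feasible else None

cost, speed = cheapest_upgrade(TASKS, UPGRADES)  # the base speed is infeasible here
```

The real problem replaces this brute-force enumeration with lower/upper bounds from LP column generation and the sufficient utilization test with exact response-time constraints, but the feasibility-versus-cost trade-off is the same.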
Abstract:
OBJECTIVES: The disease alveolar echinococcosis (AE), caused by the larval stage of the cestode Echinococcus multilocularis, is fatal if treatment is unsuccessful. Current treatment options are, at best, parasitostatic and involve lifelong administration of benzimidazoles (albendazole, mebendazole). In conjunction with the recent development of optimized procedures for E. multilocularis metacestode cultivation, we aimed to develop a rapid and reliable drug screening test enabling efficient screening of a large number of compounds in a relatively short time frame. METHODS: Metacestodes were treated in vitro with albendazole, the nitro-thiazole nitazoxanide and 29 nitazoxanide derivatives. The resulting leakage of phosphoglucose isomerase (PGI) activity into the medium supernatant was measured and provided an indication of compound efficacy. RESULTS: We show that upon in vitro culture of E. multilocularis metacestodes in the presence of active drugs such as albendazole, nitazoxanide and the nitazoxanide derivatives, the activity of PGI in culture supernatants increased. The increase in PGI activity correlated with the progressive degeneration and destruction of metacestode tissue in a time- and concentration-dependent manner, which allowed us to perform a structure-activity relationship analysis on the thiazolide compounds used in this study. CONCLUSIONS: The assay presented here is inexpensive, rapid, can be used in 24- and 96-well formats and will serve as an ideal tool for first-round in vitro tests of the efficacy of large numbers of antiparasitic compounds.
Abstract:
Cigarettes may contain up to 10% by weight of additives intended to make them more attractive. A fast and rugged method for screening cigarettes for additives of medium volatility was developed, using automated headspace solid-phase microextraction (HS-SPME) with a 65 µm carbowax-divinylbenzene fiber and gas chromatography-mass spectrometry (GC-MS) with standard electron impact ionisation. In three runs, each cigarette sample was extracted in closed headspace vials in basic, acidic and neutral medium containing 0.5 g NaCl or Na2SO4. The method was further optimized for the quantitative determination of 17 frequently occurring additives, and its practical applicability was demonstrated on cigarettes from 32 brands.
Abstract:
This paper describes a method for DRR generation, as well as for the projection of volume gradients, using hardware-accelerated 2D texture mapping and accumulation buffering, and demonstrates its application in 2D-3D registration of X-ray fluoroscopy to CT images. The robustness of the registration scheme is ensured by coarse-to-fine processing of the volume/image pyramids based on cubic B-splines. A human cadaveric spine specimen together with its ground truth was used to compare the present scheme with a purely software-based scheme in three respects: accuracy, speed and capture range. Our experiments revealed equivalent accuracy and capture ranges but a much shorter registration time with the present scheme. More specifically, the results showed a 0.8 mm average target registration error, a 55 s average execution time per registration, and capture ranges of 10 mm and 10° for the present scheme when tested on a 3.0 GHz Pentium 4 computer.
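The core of DRR generation is accumulating attenuation along rays through the CT volume, which is what the paper's GPU accumulation buffer does slice by slice. A minimal CPU sketch, assuming a synthetic volume, a parallel-beam geometry and an arbitrary attenuation coefficient (the paper's method additionally handles perspective projection and volume gradients on graphics hardware):

```python
import numpy as np

# Synthetic CT volume (depth, height, width); voxel values stand in for
# attenuation coefficients. The dense inner block plays the role of bone.
vol = np.zeros((32, 64, 64), dtype=np.float32)
vol[:, 24:40, 24:40] = 1.0

def drr_parallel(volume, mu=0.1):
    """Parallel-beam DRR: accumulate attenuation along the viewing axis
    (axis 0) and apply Beer-Lambert attenuation. The summation is the CPU
    analogue of compositing textured slices into an accumulation buffer."""
    path = volume.sum(axis=0)          # line integrals through the volume
    return 1.0 - np.exp(-mu * path)    # fraction of X-ray intensity absorbed

drr = drr_parallel(vol)  # 64x64 simulated radiograph
```

In a 2D-3D registration loop, such a DRR would be regenerated for each candidate pose and compared against the fluoroscopy image by a similarity measure.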
Abstract:
OBJECT: Fat-suppressed 3D steady-state free precession (SSFP) sequences are of special interest for cartilage imaging due to their short repetition time combined with a high signal-to-noise ratio. At low-to-high fields (1.5-3.0 T), spectral-spatial (spsp) radio frequency (RF) pulses perform better than conventional saturation of the fat signal (FATSAT pulses). Ultra-high fields (7.0 T and above), however, may offer alternative fat suppression techniques as a result of the increased chemical shift. MATERIALS AND METHODS: Application of a single, frequency-selective RF pulse is compared with spsp excitation for water- (or fat-) selective imaging at 7.0 T. RESULTS: For SSFP, application of a single frequency-selective RF pulse for selective water or fat excitation performs better than the commonly applied spsp RF pulses. In addition to the overall improved fat suppression, the use of single RF pulses leads to decreased power deposition, which remains one of the major restrictions in the design and application of many pulse sequences at ultra-high fields. CONCLUSION: The ease of application and implementation of single frequency-selective RF pulses at ultra-high fields might be of great benefit for the many applications where fat suppression is desirable or fat-water separation is needed for quantification purposes.
Abstract:
The purpose of this study is to design, develop and integrate a Compressed Natural Gas (CNG) tank with a conformable shape for efficient storage in a light-duty pick-up truck. The CNG tank will be a simple rectangular box geometry to demonstrate the capability of non-cylindrical shapes. Using CAD drawings of the truck, a conformable tank will be designed to fit under the pick-up bed. The intent of the non-cylindrical CNG tank is to demonstrate an improvement in size over the current solution, which is a large cylinder in the box of a pick-up truck. The geometry of the tank's features is critical to its size and strength. The optimized tank design will be simulated with Finite Element Analysis (FEA) to determine critical stress regions, and appropriate design changes will be made to reduce stress concentrations. Following the American National Standards Institute (ANSI) guide, the design will be optimized over different aluminum alloys to obtain the best possible result for the CNG tank.
Abstract:
This paper presents a comparison of principal component (PC) regression and regularized expectation maximization (RegEM) for reconstructing European summer and winter surface air temperature over the past millennium. Reconstruction is performed within a surrogate climate using the National Center for Atmospheric Research (NCAR) Climate System Model (CSM) 1.4 and the climate model ECHO-G 4, assuming different white and red noise scenarios to define the distortion of the pseudoproxy series. We show how sensitivity tests yield valuable "a priori" information that provides a basis for improving real-world proxy reconstructions. Our results emphasize the need to carefully test and evaluate reconstruction techniques with respect to the temporal resolution and the spatial scale to which they are applied. Furthermore, we demonstrate that uncertainties inherent in the predictand and predictor data have to be taken into account more rigorously. The comparison of the two statistical techniques, in the specific experimental setting presented here, indicates that more skilful results are achieved with RegEM, as low-frequency variability is better preserved. We further detect seasonal differences in reconstruction skill at the continental scale; e.g. the target temperature average is reconstructed more adequately for summer than for winter. For the specific predictor network given in this paper, both techniques underestimate the target temperature variations to an increasing extent as more noise is added to the signal, although less so with RegEM than with PC regression. We conclude that climate field reconstruction techniques can be improved and need to be further optimized in future applications.
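A pseudoproxy experiment of the kind described above can be sketched in a few lines: build a known "target" series, distort it with white noise to form proxies, calibrate PC regression over a short recent window, and score the reconstruction against the withheld truth. Everything here (series length, noise level, number of proxies, the 100-year calibration window) is an illustrative assumption, not the paper's setup, and only the PC-regression branch is shown:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical target "temperature" over 1000 years and 5 pseudoproxies,
# each being the target plus white noise (one of the paper's noise scenarios).
years = 1000
signal = np.sin(np.linspace(0, 20, years)) + 0.1 * rng.standard_normal(years)
proxies = signal[:, None] + 0.5 * rng.standard_normal((years, 5))

calib = slice(900, 1000)  # the "instrumental" calibration period

# PC regression: leading principal component of the proxies in the
# calibration window, then a linear fit of the target onto that PC.
mean = proxies[calib].mean(axis=0)
_, _, vt = np.linalg.svd(proxies[calib] - mean, full_matrices=False)
pc1 = (proxies - mean) @ vt[0]              # project all years onto PC1
a, b = np.polyfit(pc1[calib], signal[calib], 1)
recon = a * pc1 + b                         # reconstruction back in time

# Verification skill over the withheld pre-calibration period.
skill = np.corrcoef(recon[:900], signal[:900])[0, 1]
```

Because the truth is known everywhere in the surrogate climate, the effect of heavier noise or fewer proxies on `skill` can be measured directly, which is exactly the "a priori" information the sensitivity tests provide.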
Abstract:
To make use of the isotope ratio of nonexchangeable hydrogen (δ2Hn (nonexchangeable)) of bulk soil organic matter (SOM), the mineral matrix (containing structural water of clay minerals) must be separated from SOM and samples need to be analyzed after H isotope equilibration. We present a novel technique for demineralization of soil samples with HF and dilute HCl and recovery of the SOM fraction solubilized in the HF demineralization solution via solid-phase extraction. Compared with existing techniques, organic C (Corg) and organic N (Norg) recovery of demineralized SOM concentrates was significantly increased (Corg recovery using existing techniques vs new demineralization method: 58% vs 78%; Norg recovery: 60% vs 78%). Chemicals used for the demineralization treatment did not affect δ2Hn values as revealed by spiking with deuterated water. The new demineralization method minimized organic matter losses and thus artificial H isotope fractionation, opening up the opportunity to use δ2Hn analyses of SOM as a new tool in paleoclimatology or geospatial forensics.
Abstract:
An accurate and coherent chronological framework is essential for the interpretation of climatic and environmental records obtained from deep polar ice cores. Until now, one common ice core age scale had been developed based on an inverse dating method (Datice), combining glaciological modelling with absolute and stratigraphic markers between 4 ice cores covering the last 50 ka (thousands of years before present) (Lemieux-Dudon et al., 2010). In this paper, together with the companion paper of Veres et al. (2013), we present an extension of this work back to 800 ka for the NGRIP, TALDICE, EDML, Vostok and EDC ice cores using an improved version of the Datice tool. The AICC2012 (Antarctic Ice Core Chronology 2012) chronology includes numerous new gas and ice stratigraphic links as well as improved evaluation of background and associated variance scenarios. This paper concentrates on the long timescales between 120–800 ka. In this framework, new measurements of δ18Oatm over Marine Isotope Stage (MIS) 11–12 on EDC and a complete δ18Oatm record of the TALDICE ice cores permit us to derive additional orbital gas age constraints. The coherency of the different orbitally deduced ages (from δ18Oatm, δO2/N2 and air content) has been verified before implementation in AICC2012. The new chronology is now independent of other archives and shows only small differences, most of the time within the original uncertainty range calculated by Datice, when compared with the previous ice core reference age scale EDC3, the Dome F chronology, or using a comparison between speleothems and methane. For instance, the largest deviation between AICC2012 and EDC3 (5.4 ka) is obtained around MIS 12. Despite significant modifications of the chronological constraints around MIS 5, now independent of speleothem records in AICC2012, the date of Termination II is very close to the EDC3 one.
Abstract:
Long-term electrocardiography (ECG) with adequate atrial and ventricular signal quality is highly desirable. Routinely used surface leads are limited in atrial signal sensitivity and recording capability, impeding complete ECG delineation, e.g. in the presence of supraventricular arrhythmias. Long-term esophageal ECG might overcome these limitations but requires a dedicated lead system and recorder design. To this end, we analysed multiple-lead esophageal ECGs with respect to signal quality, describing the ECG waves as a function of insertion level, interelectrode distance, electrode shape and the amplifier's input range. The results, derived from clinical data, show that two bipolar esophageal leads, an atrial lead with a short (15 mm) interelectrode distance and a ventricular lead with a long (80 mm) interelectrode distance, provide non-inferior ventricular signal strength and superior atrial signal strength compared with standard surface lead II. A high atrial signal slope in particular is observed with the atrial esophageal lead. The proposed esophageal lead system, in combination with an increased recorder input range of ±20 mV, minimizes signal loss due to the excessive electrode motion typically observed in esophageal ECGs. The design proposal might help to standardize long-term esophageal ECG registrations and facilitate novel ECG classification systems based on the independent detection of ventricular and atrial electrical activity.
Abstract:
Computer vision-based food recognition could be used to estimate a meal's carbohydrate content for diabetic patients. This study proposes a methodology for automatic food recognition based on the Bag of Features (BoF) model. An extensive technical investigation was conducted to identify and optimize the best-performing components of the BoF architecture and to estimate the corresponding parameters. For the design and evaluation of the prototype system, a visual dataset with nearly 5,000 food images was created and organized into 11 classes. The optimized system computes dense local features using the scale-invariant feature transform on the HSV color space, builds a visual dictionary of 10,000 visual words by hierarchical k-means clustering, and finally classifies the food images with a linear support vector machine classifier. The system achieved a classification accuracy of the order of 78%, demonstrating the feasibility of the proposed approach on a very challenging image dataset.
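The BoF pipeline described above (local features → visual dictionary via k-means → word histograms → classifier) can be sketched end-to-end on toy data. This is a structural illustration only: the 2-D "descriptors", two classes, two visual words and the nearest-class-mean classifier are stand-ins for the paper's dense SIFT features, 11 food classes, 10,000-word hierarchical dictionary and linear SVM.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for feature extraction: each "image" yields 50 local
# descriptors (2-D here instead of SIFT's 128-D), drawn from a
# class-specific distribution.
def make_image(cls):
    center = np.array([0.0, 0.0]) if cls == 0 else np.array([3.0, 3.0])
    return center + rng.standard_normal((50, 2))

train = [(make_image(c), c) for c in [0, 1] * 10]

# Build the visual dictionary by k-means over all training descriptors.
descs = np.vstack([d for d, _ in train])
k = 2
centers = descs[[0, 50]].copy()  # seed with one descriptor from each class
for _ in range(10):
    labels = np.argmin(((descs[:, None] - centers) ** 2).sum(-1), axis=1)
    centers = np.array([descs[labels == j].mean(axis=0) for j in range(k)])

def bof_histogram(desc):
    """Quantize each descriptor to its nearest visual word and return the
    normalized word histogram - the image's Bag-of-Features representation."""
    words = np.argmin(((desc[:, None] - centers) ** 2).sum(-1), axis=1)
    return np.bincount(words, minlength=k) / len(words)

# Nearest-class-mean classifier on the histograms (the paper uses a linear SVM).
hists = np.array([bof_histogram(d) for d, _ in train])
ys = np.array([c for _, c in train])
means = np.array([hists[ys == c].mean(axis=0) for c in (0, 1)])

def predict(img):
    h = bof_histogram(img)
    return int(np.argmin(((means - h) ** 2).sum(-1)))
```

Swapping the toy pieces for dense SIFT, hierarchical k-means with 10,000 words and a linear SVM recovers the architecture the paper optimizes.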