802 results for LIQUID INTERFACES
Abstract:
Extension of shelf life and preservation of products are both very important for the food industry. At the same time, as with other processes, speed and higher manufacturing throughput are also beneficial. Although microwave heating is used in a number of industrial processes, there are many unanswered questions about its effects on foods. Here we analyze whether the effects of continuous-flow microwave heating are equivalent to those of traditional heat transfer methods. We studied the effects of conventional and continuous-flow microwave heating on liquid foods and, among other properties, compared the stability of the liquid foods between the two heat treatments. Our goal was to determine whether continuous-flow microwave heating and conventional heating have the same effects on liquid foods and, therefore, whether microwave heat treatment can effectively replace conventional heat treatments. We compared the colour and phase-separation behaviour of the samples treated by the different methods. For milk, we also monitored the total viable cell count; for orange juice, the vitamin C content and, by sensory analysis, the taste of the product. The majority of the results indicate that the circulating-coil microwave method used here is equivalent to the conventional heating method based on thermal conduction and convection. However, some results from the analysis of the milk samples show clear differences between heat transfer methods. According to our results, the colour parameters (lightness, red-green and blue-yellow values) of the microwave-treated samples differed not only from the untreated control but also from the traditionally heat-treated samples. The differences are not detectable visually; however, they become evident through spectrophotometric measurement. This finding suggests that, besides thermal effects, microwave-based food treatment can alter product properties in other ways as well.
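As an illustration of how such spectrophotometric colour differences are typically quantified (not taken from the study itself), the sketch below computes the CIE76 total colour difference ΔE*ab between a control and a treated sample; all L*, a*, b* values are hypothetical.

```python
import math

def delta_e_cie76(lab_ref, lab_sample):
    """Total colour difference (CIE76) between two CIELAB triplets."""
    dL = lab_sample[0] - lab_ref[0]   # lightness difference
    da = lab_sample[1] - lab_ref[1]   # red-green difference
    db = lab_sample[2] - lab_ref[2]   # blue-yellow difference
    return math.sqrt(dL**2 + da**2 + db**2)

# Hypothetical spectrophotometer readings (L*, a*, b*)
control    = (81.2, -2.1, 9.8)
microwaved = (80.5, -1.8, 10.6)

print(f"dE*ab = {delta_e_cie76(control, microwaved):.2f}")
# Differences of only a few dE units are typically hard to see by eye,
# which is consistent with changes that show up only instrumentally.
```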
Abstract:
We analyze the behavior of complex information in the Fresnel domain, taking into account the limited capability of liquid crystal devices to display complex values when they are used as holographic displays. To perform this analysis we study the reconstruction of Fresnel holograms at several distances using the different parts of the complex distribution. We also use the information adjusted with a method that combines two configurations of the devices in an adding architecture. The error analysis shows different behavior for the reconstructions obtained with the different methods. Simulated and experimental results are presented.
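For readers unfamiliar with the numerical side, a minimal sketch of Fresnel-domain reconstruction is given below, using the standard transfer-function propagation method in NumPy. It only illustrates the general idea of reconstructing from the full complex field versus its amplitude-only or phase-only parts; it does not reproduce the authors' adding architecture or device constraints, and all parameters are hypothetical.

```python
import numpy as np

def fresnel_propagate(u0, wavelength, z, dx):
    """Propagate a complex field u0 a distance z with the
    Fresnel transfer-function (frequency-domain) method."""
    n = u0.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    k = 2 * np.pi / wavelength
    H = np.exp(1j * k * z) * np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(u0) * H)

# Hypothetical parameters: 8 um pixels (typical LC display), 633 nm laser
n, dx, wl = 512, 8e-6, 633e-9
x = (np.arange(n) - n // 2) * dx
X, Y = np.meshgrid(x, x)
target = (np.sqrt(X**2 + Y**2) < 0.5e-3).astype(complex)   # simple aperture

hologram = fresnel_propagate(target, wl, -0.05, dx)         # back-propagate 5 cm

# Reconstruct using only parts of the complex distribution
full  = fresnel_propagate(hologram, wl, 0.05, dx)
amp   = fresnel_propagate(np.abs(hologram).astype(complex), wl, 0.05, dx)
phase = fresnel_propagate(np.exp(1j * np.angle(hologram)), wl, 0.05, dx)

for name, rec in [("full", full), ("amplitude-only", amp), ("phase-only", phase)]:
    err = np.mean((np.abs(rec) - np.abs(target))**2)
    print(f"{name:15s} mean squared amplitude error: {err:.4e}")
```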
Abstract:
The determination of gross alpha, gross beta and 226Ra activity in natural waters is useful in a wide range of environmental studies. Furthermore, the gross alpha and gross beta parameters are included in international legislation on the quality of drinking water [Council Directive 98/83/EC]. In this work, a low-background liquid scintillation counter (Wallac, Quantulus 1220) was used to simultaneously determine gross alpha, gross beta and 226Ra activity in natural water samples. Sample preparation involved evaporation to remove 222Rn and its short-lived decay daughters; the evaporation step concentrated the sample ten-fold. Afterwards, an 8 mL sample aliquot was mixed with 12 mL of Ultima Gold AB scintillation cocktail in low-diffusion vials. A theoretical mathematical model based on secular equilibrium between 226Ra and its short-lived decay daughters is presented. The proposed model makes it possible to determine 226Ra activity from two measurements, which also allow gross alpha and gross beta to be determined simultaneously. To validate the proposed model, spiked samples with different activity levels for each parameter were analysed. Additionally, to evaluate the model's applicability to natural water, eight natural water samples from different parts of Spain were analysed. These samples were also characterised by alpha spectrometry for the naturally occurring isotopes of uranium (234U, 235U and 238U), radium (224Ra and 226Ra), 210Po and 232Th. The results for gross alpha and 226Ra activity were compared with the alpha spectrometry characterization, and acceptable concordance was obtained.
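A simplified sketch of the kind of two-measurement calculation described (not the authors' exact model) is shown below: after radon removal, the alpha-window count rate grows as 222Rn and its short-lived alpha-emitting progeny grow back towards secular equilibrium with 226Ra, so two measurements at different delays give a solvable 2x2 linear system. The count rates, delays and progeny multiplicity are illustrative assumptions.

```python
import numpy as np

LAMBDA_RN = np.log(2) / 3.82  # 222Rn decay constant, 1/day (half-life ~3.82 d)

def rn_ingrowth(t_days):
    """Fraction of the equilibrium 222Rn activity regrown in a closed vial
    t_days after radon was removed by evaporation."""
    return 1.0 - np.exp(-LAMBDA_RN * t_days)

def solve_ra226(r1, r2, t1, t2, alphas_per_rn_chain=3.0):
    """Illustrative 2x2 system: the alpha-window rate at delay t is a constant
    term (other alpha emitters plus the 226Ra alpha itself) plus an ingrowing
    term from 222Rn and its short-lived alpha-emitting progeny:
        r(t) = A_const + alphas_per_rn_chain * rn_ingrowth(t) * A_Ra
    Two measurements at t1 and t2 allow both unknowns to be solved."""
    M = np.array([[1.0, alphas_per_rn_chain * rn_ingrowth(t1)],
                  [1.0, alphas_per_rn_chain * rn_ingrowth(t2)]])
    a_const, a_ra = np.linalg.solve(M, np.array([r1, r2]))
    return a_const, a_ra

# Hypothetical efficiency-corrected alpha count rates (Bq) at 2 and 30 days
print(solve_ra226(0.95, 2.60, 2.0, 30.0))
```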
Abstract:
Current-day web search engines (e.g., Google) do not crawl and index a significant portion of the Web and, hence, web users relying on search engines alone are unable to discover and access a large amount of information in the non-indexable part of the Web. Specifically, dynamic pages generated from parameters provided by a user via web search forms (or search interfaces) are not indexed by search engines and cannot be found in search results. Such search interfaces provide web users with online access to myriads of databases on the Web. In order to obtain information from a web database of interest, a user issues a query by specifying query terms in a search form and receives the query results: a set of dynamic pages that embed the required information from the database. At the same time, issuing a query via an arbitrary search interface is an extremely complex task for any kind of automatic agent, including web crawlers, which, at least up to the present day, do not even attempt to pass through web forms on a large scale. In this thesis, our primary object of study is a huge portion of the Web (hereafter referred to as the deep Web) hidden behind web search interfaces. We concentrate on three classes of problems around the deep Web: characterization of the deep Web, finding and classifying deep web resources, and querying web databases. Characterizing the deep Web: Though the term deep Web was coined in 2000, which is a long time ago for any web-related concept or technology, we still do not know many important characteristics of the deep Web. Another matter of concern is that the surveys of the deep Web carried out so far are predominantly based on the study of deep web sites in English. One can therefore expect that the findings of these surveys may be biased, especially given the steady increase in non-English web content. In this way, surveying national segments of the deep Web is of interest not only to national communities but to the whole web community as well. In this thesis, we propose two new methods for estimating the main parameters of the deep Web. We use the suggested methods to estimate the scale of one specific national segment of the Web and report our findings. We also build and make publicly available a dataset describing more than 200 web databases from the national segment of the Web. Finding deep web resources: The deep Web has been growing at a very fast pace. It has been estimated that there are hundreds of thousands of deep web sites. Due to the huge volume of information in the deep Web, there has been significant interest in approaches that allow users and computer applications to leverage this information. Most approaches assume that search interfaces to the web databases of interest have already been discovered and are known to query systems. However, such assumptions often do not hold, mostly because of the large scale of the deep Web: for any given domain of interest there are too many web databases with relevant content. Thus, the ability to locate search interfaces to web databases becomes a key requirement for any application accessing the deep Web. In this thesis, we describe the architecture of the I-Crawler, a system for finding and classifying search interfaces. Specifically, the I-Crawler is intentionally designed to be used in deep Web characterization studies and for constructing directories of deep web resources.
Unlike almost all other approaches to the deep Web proposed so far, the I-Crawler is able to recognize and analyze JavaScript-rich and non-HTML searchable forms. Querying web databases: Retrieving information by filling out web search forms is a typical task for a web user. This is all the more so as the interfaces of conventional search engines are also web forms. At present, a user needs to manually provide input values to search interfaces and then extract the required data from the result pages. Manually filling out forms is cumbersome and not feasible for complex queries, yet such queries are essential for many web searches, especially in the area of e-commerce. Thus, the automation of querying and retrieving data behind search interfaces is desirable and essential for tasks such as building domain-independent deep web crawlers and automated web agents, searching for domain-specific information (vertical search engines), and extracting and integrating information from various deep web resources. We present a data model for representing search interfaces and discuss techniques for extracting field labels, client-side scripts and structured data from HTML pages. We also describe a representation of result pages and discuss how to extract and store the results of form queries. In addition, we present a user-friendly and expressive form query language that allows one to retrieve information behind search interfaces and extract useful data from the result pages based on specified conditions. We implement a prototype system for querying web databases and describe its architecture and component design.
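As a toy illustration of what automated form querying involves (this is neither the I-Crawler nor the thesis' query language), the sketch below locates a search form on a page, fills its first text field, submits it and collects links from the result page; the URL and the single-field handling are hypothetical simplifications.

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

def query_search_form(page_url, query_terms):
    """Find the first HTML search form on a page, fill its first text input
    with the query terms, submit it, and return links from the result page.
    Real deep-web crawling must also handle JavaScript-rich and multi-field
    interfaces, which this toy example ignores."""
    page = requests.get(page_url, timeout=10)
    soup = BeautifulSoup(page.text, "html.parser")

    form = soup.find("form")
    if form is None:
        return []

    action = urljoin(page_url, form.get("action") or page_url)
    method = (form.get("method") or "get").lower()

    # Keep hidden defaults, put the query terms into the first text field
    data = {}
    for inp in form.find_all("input"):
        name = inp.get("name")
        if not name:
            continue
        if inp.get("type", "text").lower() in ("text", "search") and query_terms:
            data[name] = query_terms
            query_terms = None          # only fill the first text field
        else:
            data[name] = inp.get("value", "")

    if method == "post":
        result = requests.post(action, data=data, timeout=10)
    else:
        result = requests.get(action, params=data, timeout=10)

    result_soup = BeautifulSoup(result.text, "html.parser")
    return [urljoin(action, a["href"]) for a in result_soup.find_all("a", href=True)]

# Hypothetical usage:
# print(query_search_form("https://example.org/library", "liquid interfaces"))
```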
Abstract:
This review presents the evolution of steroid analytical techniques, including gas chromatography coupled to mass spectrometry (GC-MS), immunoassay (IA) and targeted liquid chromatography coupled to mass spectrometry (LC-MS), and it evaluates the potential of extended steroid profiles by a metabolomics-based approach, namely steroidomics. Steroids regulate essential biological functions including growth and reproduction, and perturbations of steroid homeostasis can generate serious physiological issues; therefore, specific and sensitive methods have been developed to measure steroid concentrations. GC-MS, which measures several steroids simultaneously, was historically the first standard method of analysis. Steroids were then quantified by immunoassay, allowing a higher throughput; however, major drawbacks included the measurement of a single compound at a time instead of a panel, and cross-reactivity. Targeted LC-MS methods with selected reaction monitoring (SRM) were then introduced for quantifying a small steroid subset without the problems of cross-reactivity. The next step was the integration of metabolomic approaches in the context of steroid analyses. As metabolomics aims to identify and quantify all the metabolites (i.e., the metabolome) in a specific system, appropriate strategies were proposed for discovering new biomarkers. Steroidomics, defined as the untargeted analysis of the steroid content in a sample, was implemented in several fields, including doping analysis, clinical studies, and in vivo or in vitro toxicology assays. This review discusses the current analytical methods for assessing steroid changes and compares them to steroidomics. Steroids, their pathways, their implications in diseases and the biological matrices in which they are analysed are first described. Then, the different analytical strategies are presented with a focus on their ability to obtain relevant information on the steroid pattern. The future technical requirements for improving steroid analysis are also presented.
Abstract:
The present work describes the development of a fast and robust analytical method for the determination of 53 antibiotic residues, covering various chemical groups and some of their metabolites, in environmental matrices that are considered important sources of antibiotic pollution, namely hospital and urban wastewaters, as well as in river waters. The method is based on automated off-line solid phase extraction (SPE) followed by ultra-high-performance liquid chromatography coupled to quadrupole linear ion trap tandem mass spectrometry (UHPLC–QqLIT). For unequivocal identification and confirmation, and in order to fulfill EU guidelines, two selected reaction monitoring (SRM) transitions per compound are monitored (the most intense one is used for quantification and the second one for confirmation). Quantification of the target antibiotics is performed by the internal standard approach, using one isotopically labeled compound for each chemical group in order to correct for matrix effects. The main advantages of the method are the automation and speed-up of sample preparation through the reduction of extraction volumes for all matrices, the fast separation of a wide spectrum of antibiotics by ultra-high-performance liquid chromatography, its sensitivity (limits of detection in the low ng/L range) and its selectivity (due to the use of tandem mass spectrometry). The inclusion of β-lactam antibiotics (penicillins and cephalosporins), which are difficult to analyze in multi-residue methods due to their instability in water matrices, and of some antibiotic metabolites are other important benefits of the method developed. As part of the validation procedure, the method was applied to the analysis of antibiotic residues in hospital wastewaters, urban influent and effluent wastewaters, and river water samples.
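A minimal sketch of the internal-standard quantification step described above follows; the peak areas, calibration slope and internal-standard concentration are hypothetical, and a real method would use a matrix-matched calibration curve per compound.

```python
def quantify_with_internal_standard(analyte_area, is_area, is_conc,
                                    slope, intercept=0.0):
    """Internal-standard quantification: the analyte/IS peak-area ratio is
    converted to an analyte/IS concentration ratio through a calibration line
    (response ratio = slope * concentration ratio + intercept), then scaled
    by the known concentration of the isotopically labeled IS."""
    response_ratio = analyte_area / is_area
    conc_ratio = (response_ratio - intercept) / slope
    return conc_ratio * is_conc

# Hypothetical SRM peak areas and a 100 ng/L labeled internal standard
print(quantify_with_internal_standard(analyte_area=15400, is_area=80200,
                                      is_conc=100.0, slope=1.05))
```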
Abstract:
This work describes the formation of transformation products (TPs) during the enzymatic degradation, at laboratory scale, of two highly consumed antibiotics: tetracycline (Tc) and erythromycin (ERY). The analysis of the samples was carried out by a fast and simple method based on a novel configuration of an on-line turbulent flow system coupled to a hybrid linear ion trap – high resolution mass spectrometer. The method was optimized and validated for the complete analysis of ERY, Tc and their transformation products within 10 min without any further sample manipulation. Furthermore, the applicability of the on-line procedure was evaluated for 25 additional antibiotics, covering a wide range of chemical classes, in different environmental waters with satisfactory quality parameters. Degradation rates obtained for Tc by the laccase enzyme and for ERY by the EreB esterase enzyme, without the presence of mediators, were ∼78% and ∼50%, respectively. Concerning the identification of TPs, three suspected compounds for Tc and five for ERY have been proposed. In the case of Tc, tentative molecular formulas with mass errors within 2 ppm have been assigned on the hypothesis of dehydroxylation, (bi)demethylation and oxidation of rings A and C as the major reactions. In contrast, the major TP detected for ERY has been identified as the "dehydration ERY-A", with the same molecular formula as its parent compound. In addition, evaluation of the antibiotic activity of the samples along the enzymatic treatments showed a decrease of around 100% in both cases.
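The 2 ppm criterion mentioned above corresponds to the standard accurate-mass error calculation sketched below; the m/z values are hypothetical.

```python
def ppm_error(measured_mz, theoretical_mz):
    """Mass error in parts per million between a measured m/z and the exact
    m/z computed from a candidate molecular formula."""
    return (measured_mz - theoretical_mz) / theoretical_mz * 1e6

# Hypothetical example: a measured accurate mass checked against a candidate
# formula; TPs in the study were accepted when the error stayed within 2 ppm.
measured = 445.1601
candidate = 445.1605
print(f"{ppm_error(measured, candidate):+.2f} ppm")
```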
Abstract:
Reversed phase liquid chromatography (RPLC) coupled to mass spectrometry (MS) is the gold standard technique in bioanalysis. However, hydrophilic interaction chromatography (HILIC) could represent a viable alternative to RPLC for the analysis of polar and/or ionizable compounds, as it often provides higher MS sensitivity and alternative selectivity. Nevertheless, this technique can also be prone to matrix effects (ME). ME are one of the major issues in quantitative LC-MS bioanalysis. To ensure acceptable method performance (i.e., trueness and precision), careful evaluation and minimization of ME are required. In the present study, the incidence of ME in HILIC-MS/MS and RPLC-MS/MS was compared for plasma and urine samples using two representative sets of 38 pharmaceutical compounds and 40 doping agents, respectively. The optimal generic chromatographic conditions in terms of selectivity with respect to interfering compounds were established in both chromatographic modes by testing three different stationary phases in each mode with different mobile phase pH values. A second step involved the assessment of ME in RPLC and HILIC under the best generic conditions, using the post-extraction addition method. Biological samples were prepared using two different sample pre-treatments: a non-selective sample clean-up procedure (protein precipitation and simple dilution for plasma and urine samples, respectively) and a selective sample preparation, i.e., solid phase extraction for both matrices. The non-selective pretreatments led to significantly less ME in RPLC than in HILIC regardless of the matrix. On the contrary, HILIC appeared as a valuable alternative to RPLC for plasma and urine samples treated by a selective sample preparation. Indeed, in the case of selective sample preparation, the compounds influenced by ME were different in HILIC and RPLC, and lower and similar ME occurrence was generally observed in RPLC vs. HILIC for urine and plasma samples, respectively. The complementarity of the two chromatographic modes was also demonstrated, as ME was only rarely observed for urine and plasma samples when the most appropriate chromatographic mode was selected.
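For reference, the post-extraction addition method quantifies ME essentially as the ratio of the response in a post-extraction spiked matrix to that in a neat standard; a minimal sketch with hypothetical peak areas follows.

```python
def matrix_effect_percent(area_post_extraction_spike, area_neat_standard):
    """Matrix effect by the post-extraction addition approach:
    ME(%) = (area in spiked post-extraction matrix / area in neat solvent
    standard) * 100 - 100. Negative values indicate ion suppression,
    positive values ion enhancement; values near 0 mean negligible ME."""
    return (area_post_extraction_spike / area_neat_standard) * 100.0 - 100.0

# Hypothetical peak areas for one compound in plasma extract vs. neat solvent
print(f"ME = {matrix_effect_percent(72300, 101500):.1f} %")   # ion suppression
```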
Abstract:
The presence of residues of antibiotics, metabolites, and thermal transformation products (TPs), produced during thermal treatment to eliminate pathogenic microorganisms in milk, could represent a risk for people. Cow's milk samples spiked with enrofloxacin (ENR), ciprofloxacin (CIP), difloxacin (DIF), and sarafloxacin (SAR) and milk samples from cows medicated with ENR were submitted to several thermal treatments. The milk samples were analyzed by liquid chromatography-mass spectrometry (LC-MS) to find and identify TPs and metabolites. In this work, 27 TPs of 4 quinolones and 24 metabolites of ENR were found. Some of these compounds had been reported previously, but others were characterized for the first time, including lactose-conjugated CIP, the formamidation reaction for CIP and SAR, and hydroxylation or ketone formation to produce three different isomers for all quinolones studied.
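To illustrate how such TPs are typically recognised from accurate-mass differences, the sketch below matches the parent/TP mass shift against a small table of common transformations; the mass-shift values are standard monoisotopic figures and the m/z examples are hypothetical, not data from the study.

```python
# Illustrative exact mass shifts for the transformations discussed
# (standard monoisotopic values, not taken from the paper).
MASS_SHIFTS = {
    "hydroxylation (+O)":          +15.99491,
    "demethylation (-CH2)":        -14.01565,
    "bidemethylation (-2 CH2)":    -28.03130,
    "ketone formation (+O, -2H)":  +13.97926,
    "formamidation (+CHO, -H)":    +27.99491,
}

def candidate_transformations(parent_mz, tp_mz, tol_mz=0.005):
    """Match the mass difference between a parent drug and a suspected TP
    against a small table of common transformation mass shifts."""
    delta = tp_mz - parent_mz
    return [name for name, shift in MASS_SHIFTS.items()
            if abs(delta - shift) <= tol_mz]

# Hypothetical [M+H]+ values for ciprofloxacin and a hydroxylated TP
print(candidate_transformations(332.1405, 348.1354))
```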
Abstract:
This thesis focuses on fibre coalescers whose efficiency is based on their surface properties: they have the ability to preferentially wet or interact with one or more of the fluids to be separated. The interfacial phenomena governing the separation efficiency of the coalescers are therefore investigated as a function of physical factors such as flow rates, phase ratios and coalescer packing density. Process equipment to produce and separate the emulsions was designed, and experiments were carried out to test the separation efficiency of the coalescing media, namely fibreglass, polyester I and polyester II. The performance of the coalescing media was assessed via droplet size information. In conclusion, the objectives (design of process equipment and experimentation) were achieved. Fibreglass was the best coalescing medium, followed by polyester I and finally polyester II. Droplet sizes increased with decreasing flow rate and increasing packing density of the coalescer. The phase ratio affected the droplet sizes of the feed but had no effect on the coalescence of the feed droplets.
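As one common way of summarising droplet size information when comparing coalescer performance (not necessarily the metric used in the thesis), the Sauter mean diameter can be computed as sketched below with hypothetical diameters.

```python
def sauter_mean_diameter(diameters):
    """Sauter mean diameter d32 = sum(d^3) / sum(d^2), a standard way to
    summarise a droplet-size distribution when comparing coalescer outlets."""
    num = sum(d**3 for d in diameters)
    den = sum(d**2 for d in diameters)
    return num / den

# Hypothetical droplet diameters (micrometres) upstream and downstream
feed_drops   = [12, 15, 18, 20, 22, 25]
outlet_drops = [60, 85, 110, 140, 90, 120]
print(sauter_mean_diameter(feed_drops), sauter_mean_diameter(outlet_drops))
```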
Abstract:
This article reports the phase behavior determination of a system forming reverse liquid crystals and the formation of novel disperse systems in the two-phase region. The studied system is formed by water, cyclohexane, and Pluronic L-121, an amphiphilic block copolymer considered of special interest due to its aggregation and structural properties. This system forms reverse cubic (I2) and reverse hexagonal (H2) phases at high polymer concentrations. These reverse phases are of particular interest since, in the two-phase region, stable high-internal-phase reverse emulsions can be formed. The characterization of the I2 and H2 phases and of the derived gel emulsions was performed with small-angle X-ray scattering (SAXS) and rheometry, and the influence of temperature and water content was studied. The H2 phase underwent a thermal transition to an I2 phase, with an Fd3m structure, when the temperature was increased. All samples showed strong shear-thinning behavior from low shear rates. The elastic modulus (G′) in the I2 phase was around 1 order of magnitude higher than in the H2 phase. G′ was predominantly higher than the viscous modulus (G″). In the gel emulsions, G′ was nearly frequency-independent, indicating their gel-type nature. Contrary to normal water-in-oil (W/O) emulsions, in W/I2 and W/H2 gel emulsions G′, the complex viscosity (|η*|) and the yield stress (τ0) decreased with increasing water content, since the highly viscous microstructure of the continuous phase, rather than the volume fraction of dispersed phase and the droplet size, was responsible for the high viscosity and elastic behavior of the emulsions. A rheological analysis, in which the cooperative flow theory, the soft glass rheology model and the slip plane model were analyzed and compared, was performed to obtain a single model that could describe the non-Maxwellian behavior of both reverse phases and highly concentrated emulsions and to characterize their microstructure through the rheological properties.
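The strong shear thinning reported is often summarised, in a simpler way than the models compared in the article, by fitting a power-law (Ostwald-de Waele) model to the flow curve; a sketch with hypothetical data follows.

```python
import numpy as np

def fit_power_law(shear_rate, viscosity):
    """Fit the Ostwald-de Waele model eta = K * gamma_dot**(n - 1) to
    steady-shear data by linear regression in log-log space.
    n < 1 indicates shear-thinning behaviour."""
    slope, intercept = np.polyfit(np.log(shear_rate), np.log(viscosity), 1)
    n = slope + 1.0          # slope of log(eta) vs log(gamma_dot) is n - 1
    K = np.exp(intercept)    # consistency index, Pa.s^n
    return K, n

# Hypothetical flow-curve data for a shear-thinning gel emulsion
gamma_dot = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])     # 1/s
eta       = np.array([850, 310, 120, 45, 17, 6.5])          # Pa.s
K, n = fit_power_law(gamma_dot, eta)
print(f"K = {K:.1f} Pa.s^n, n = {n:.2f}  (n << 1 -> strongly shear thinning)")
```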
Abstract:
A rapid and sensitive method is described for the determination of clofentezine residues in apple, papaya, mango and orange. The procedure is based on extraction of the sample with a hexane:ethyl acetate mixture (1:1, v/v) and liquid chromatographic analysis with UV detection. Mean recoveries from 4 replicates of fortified fruit samples ranged from 81% to 96%, with coefficients of variation from 8.9% to 12.5%. The detection and quantification limits of the method were 0.05 and 0.1 mg kg-1, respectively.
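For context, the reported recovery and coefficient-of-variation figures follow from the standard calculation sketched below, shown here with hypothetical replicate results.

```python
import statistics

def recovery_stats(measured, fortified_level):
    """Mean recovery (%) and coefficient of variation (%) from replicate
    analyses of a sample fortified at a known level."""
    recoveries = [100.0 * m / fortified_level for m in measured]
    mean_rec = statistics.mean(recoveries)
    cv = 100.0 * statistics.stdev(recoveries) / mean_rec
    return mean_rec, cv

# Hypothetical replicate results (mg/kg) for apple fortified at 0.5 mg/kg
print(recovery_stats([0.43, 0.47, 0.41, 0.45], fortified_level=0.5))
```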
Abstract:
The use of the quartz crystal microbalance (QCM) technique, electrochemical impedance spectroscopy (EIS) and surface plasmon resonance (SPR) for characterizing thin films and monitoring interfaces is presented. The theoretical aspects of QCM, EIS and SPR are introduced and the main application areas are outlined. Future prospects of the combined application of QCM, EIS and SPR methods in studies of interfacial processes at surfaces are also discussed.
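The theoretical treatment of the QCM typically centres on the Sauerbrey relation between the resonance frequency shift and the deposited mass; it is reproduced here for reference (valid for thin, rigid, uniformly deposited films) rather than taken from the article:

```latex
% Sauerbrey relation linking the QCM frequency shift to the areal mass change:
\Delta f = -\frac{2 f_0^{2}}{A\,\sqrt{\rho_q \mu_q}}\,\Delta m
% f_0: fundamental resonance frequency, A: electrode area,
% \rho_q, \mu_q: density and shear modulus of quartz, \Delta m: added mass.
```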
Abstract:
Two high performance liquid chromatography (HPLC) methods for the quantitative determination of indinavir sulfate were tested, validated and statistically compared. Assays were carried out using, as mobile phases, mixtures of dibutylammonium phosphate buffer (pH 6.5) and acetonitrile (55:45) or citrate buffer (pH 5) and acetonitrile (60:40), both at 1 mL/min, with an octylsilane column (RP-8) and UV spectrophotometric detection at 260 nm. Both methods showed good sensitivity, linearity, precision and accuracy. Statistical analysis using Student's t-test for the determination of indinavir sulfate in raw material and capsules indicated no statistically significant difference between the two methods.
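A minimal sketch of the statistical comparison described (a two-sample Student's t-test) is given below using SciPy; the assay values are hypothetical.

```python
from scipy import stats

# Hypothetical indinavir sulfate assay results (% of label claim) obtained
# with the two mobile-phase systems on the same capsule batch.
method_phosphate = [99.1, 100.4, 99.8, 100.9, 99.5, 100.2]
method_citrate   = [99.6, 100.1, 99.3, 100.7, 100.0, 99.8]

# Two-sample Student's t-test; a p-value above 0.05 supports the conclusion
# that the two HPLC methods give statistically equivalent results.
t_stat, p_value = stats.ttest_ind(method_phosphate, method_citrate)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
```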