913 results for Agglutination Tests


Relevance:

20.00%

Publisher:

Abstract:

Electrospinning is used to produce fibers in the nanometer range by stretching a polymeric jet with high-magnitude electric fields. Chitosan is an abundant natural polymer that can be used to obtain biocompatible nanostructured membranes. The objectives of this work were to obtain nanostructured membranes based on blends of chitosan and polyoxyethylene (PEO) and to evaluate their thermal and morphological properties, as well as their in vitro biocompatibility by agar diffusion cytotoxicity tests on three different cell lines. A nanostructured fibrous membrane with fiber diameters on the order of 200 nm was obtained, presenting a rough surface and a thickness ranging from one to two millimeters. The cytotoxicity tests showed that the chitosan/PEO membranes are non-toxic to the cells studied in this work. The electrospinning technique was therefore effective in obtaining nanostructured chitosan/PEO membranes, which showed biocompatibility in preliminary in vitro tests with the cell lines.

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: The frequent occurrence of inconclusive serology in blood banks and the absence of a gold-standard test for Chagas disease led us to examine the efficacy of the blood culture test and of five commercial tests (ELISA, IIF, HAI, c-ELISA, rec-ELISA) used in screening blood donors for Chagas disease, and to investigate the prevalence of Trypanosoma cruzi infection among donors with inconclusive screening serology with respect to some epidemiological variables. METHODS: To obtain the estimates of interest we used a Bayesian latent class model with covariates included through the logit link. RESULTS: Better performance was observed for some categories of the epidemiological variables. In addition, all pairs of tests (excluding the blood culture test) proved to be good alternatives both for screening (sensitivity > 99.96% in parallel testing) and for confirmation (specificity > 99.93% in serial testing) of Chagas disease. The prevalence of 13.30% observed in the stratum of donors with inconclusive serology means that most of these donors probably have non-reactive serology. Moreover, depending on the level of specific epidemiological variables, the absence of infection can be predicted with a probability of 100% in this group using pairs of tests in parallel. CONCLUSION: The epidemiological variables can improve test results and thus help clarify inconclusive screening serology. Moreover, all pair combinations of the five commercial tests are good alternatives for confirming results.
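For reference, the screening/confirmation figures quoted above follow the standard way of combining two tests: in parallel testing a donor is called positive if either test is positive, in serial testing only if both are. A minimal sketch, assuming conditional independence of the tests and using hypothetical sensitivity/specificity values (not the study's Bayesian estimates):

```python
def parallel(se1, sp1, se2, sp2):
    """Parallel testing: positive if either test is positive.
    Sensitivity rises, specificity falls (independence assumed)."""
    se = 1 - (1 - se1) * (1 - se2)
    sp = sp1 * sp2
    return se, sp

def serial(se1, sp1, se2, sp2):
    """Serial testing: positive only if both tests are positive.
    Specificity rises, sensitivity falls (independence assumed)."""
    se = se1 * se2
    sp = 1 - (1 - sp1) * (1 - sp2)
    return se, sp

# Hypothetical per-test values, for illustration only
se_elisa, sp_elisa = 0.985, 0.975
se_iif,   sp_iif   = 0.980, 0.970

print(parallel(se_elisa, sp_elisa, se_iif, sp_iif))  # screening: high combined sensitivity
print(serial(se_elisa, sp_elisa, se_iif, sp_iif))    # confirmation: high combined specificity
```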

Relevance:

20.00%

Publisher:

Abstract:

An environmental impact study was conducted to determine the water quality of Piracicamirim creek and to assess the influence of effluents from a sugar industry on this water body. Toxicity tests were performed with water samples collected upstream and downstream of the industry, using the microcrustaceans Daphnia magna, Ceriodaphnia dubia and Ceriodaphnia silvestrii as test organisms, together with physical and chemical analyses of the water. Results showed that the physical and chemical parameters did not change during the sampling period, except for dissolved oxygen. No toxicity to D. magna and no effects on the reproduction of C. dubia and C. silvestrii were observed at either sampling point. Thus, the industry was not negatively impacting the quality of this water body.

Relevance:

20.00%

Publisher:

Abstract:

In this thesis the application of biotechnological processes based on the microbial metabolic degradation of halogenated compounds has been investigated. Several studies have shown that most of these pollutants can be biodegraded by single bacterial strains or mixed microbial populations via aerobic direct metabolism, or via cometabolism using aromatic or aliphatic hydrocarbons as growth substrates. The enhancement of two specific processes was studied, each in its own scenario: 1) the bioremediation, via aerobic cometabolism, of soil contaminated by a highly chlorinated compound using a mixed microbial population, together with the selection and isolation of a consortium specific for the compound; 2) the implementation of a treatment technology based on the direct metabolism of two pure strains at the exact point source of emission, preventing the dilution and contamination of large volumes of waste fluids polluted by several halogenated compounds and minimizing the environmental impact. In order to verify the effectiveness of these two biotechnological applications in removing halogenated compounds, and to propose them as more efficient alternatives, continuous and batch tests were set up in the experimental part of this thesis. The results obtained from the continuous tests in the second scenario were supported by microbial analysis via Fluorescence In Situ Hybridisation (FISH) and by a mathematical model of the system. The results showed that both processes, each in its respective scenario, offer effective solutions for the biological treatment of pollution by chlorinated compounds.
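The abstract does not describe the mathematical model used to support the continuous tests; purely as an illustration of the kind of mass-balance description often used for continuous biodegradation systems, the sketch below integrates a simple chemostat-type model with Monod kinetics, with all parameter values hypothetical:

```python
# Illustrative chemostat-style model with Monod kinetics (hypothetical parameters,
# not the model developed in the thesis).
D      = 0.05   # dilution rate, 1/h
S_in   = 50.0   # inlet halogenated substrate concentration, mg/L
mu_max = 0.20   # maximum specific growth rate, 1/h
Ks     = 5.0    # half-saturation constant, mg/L
Y      = 0.4    # biomass yield, mg biomass / mg substrate

def step(S, X, dt):
    mu = mu_max * S / (Ks + S)          # Monod specific growth rate
    dS = D * (S_in - S) - mu * X / Y    # substrate mass balance
    dX = (mu - D) * X                   # biomass mass balance
    return S + dS * dt, X + dX * dt

S, X = S_in, 1.0
for _ in range(int(500 / 0.01)):        # 500 h simulated with explicit Euler steps
    S, X = step(S, X, 0.01)
print(f"steady-state substrate ~ {S:.2f} mg/L, biomass ~ {X:.2f} mg/L")
```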

Relevance:

20.00%

Publisher:

Abstract:

Bread dough, and particularly wheat dough, is, due to its viscoelastic behaviour, probably the most dynamic and complicated rheological system, and its characteristics are very important since they strongly affect the textural and sensorial properties of the final products. The study of dough rheology has been a very challenging task for many researchers, since it can provide a great deal of information about dough formulation, structure and processing; this explains why dough rheology has been a matter of investigation for several decades. In this research, the rheological assessment of doughs and breads was performed using empirical and fundamental methods at both small and large deformation, in order to characterize different types of dough and final products such as bread. In order to study the structural aspects of the food products, image analysis techniques were used to integrate the information coming from the empirical and fundamental rheological measurements. Evaluation of dough properties was carried out by texture profile analysis (TPA), dough stickiness (Chen and Hoseney cell) and uniaxial extensibility determination (Kieffer test) using a Texture Analyser; small-deformation rheological measurements were performed on a controlled stress-strain rheometer; the structure of the different doughs was also observed using image analysis, while bread characteristics were studied by texture profile analysis (TPA) and image analysis. The objective of this research was to understand whether the different rheological measurements were able to characterize and differentiate the samples analysed, in order to investigate the effect of different formulations and processing conditions on dough and final product from a structural point of view. For this aim the following materials were prepared and analysed:
- frozen dough prepared without yeast;
- frozen dough and bread made with frozen dough;
- doughs obtained using different fermentation methods;
- doughs made with Kamut® flour;
- dough and bread prepared with the addition of ginger powder;
- final products coming from different bakeries.
The influence of sub-zero storage time on the viscoelastic performance of non-fermented and fermented dough and on the final product (bread) was evaluated using small-deformation and large-deformation methods. In general, the longer the sub-zero storage time, the lower the positive viscoelastic attributes. The effects of fermentation time and of different types of fermentation (straight-dough method, sponge-and-dough procedure and poolish method) on the rheological properties of doughs were investigated using empirical and fundamental analysis, and image analysis was used to integrate this information through the evaluation of the dough's structure. The results of the fundamental rheological tests showed that the incorporation of sourdough (poolish method) provoked changes different from those seen with the other types of fermentation. The positive action of some ingredients (extra-virgin olive oil and a liposomic lecithin emulsifier) in improving the rheological characteristics of Kamut® dough was confirmed also when the dough was subjected to low temperatures (24 hours and 48 hours at 4°C).
Small-deformation oscillatory measurements and large-deformation mechanical tests provided useful information on the rheological properties of samples prepared with different amounts of ginger powder, showing that the sample with the highest amount of ginger powder (6%) had worse rheological characteristics than the other samples. Moisture content, specific volume, texture and crumb grain characteristics are the major quality attributes of bread products. The different samples analysed, "Coppia Ferrarese", "Pane Comune Romagnolo" and "Filone Terra di San Marino", showed a decrease in crumb moisture and an increase in hardness over the storage time. Parameters such as cohesiveness and springiness, which are evaluated by TPA and are indicators of fresh-bread quality, decreased during storage. Using empirical rheological tests we found several differences among the samples, due to the different ingredients used in the formulations and the different processes adopted to prepare the samples; but since these products are handmade, the differences can be regarded as added value. In conclusion, small-deformation (in fundamental units) and large-deformation methods played a significant role in monitoring the influence of the different ingredients used in the formulations and of the different processing and storage conditions on dough viscoelastic performance and on the final product. Finally, knowledge of the formulation, processing and storage conditions, together with the evaluation of structural and rheological characteristics, is fundamental for the study of complex matrices such as bakery products, where numerous variables can influence the final quality (e.g. raw materials, bread-making procedure, time and temperature of fermentation and baking).
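For readers unfamiliar with TPA, the parameters mentioned above (hardness, cohesiveness, springiness) are derived from a double-compression force curve; a minimal sketch with synthetic data, following the usual textbook definitions rather than the Texture Analyser's own software:

```python
# Minimal sketch of how standard TPA parameters are derived from a
# double-compression force-time curve (synthetic curve, for illustration only).
import numpy as np

t = np.linspace(0, 10, 1001)                                      # time, s
force = np.where(t < 4, np.sin(np.pi * t / 4), 0.0)               # first compression peak
force += np.where((t >= 5) & (t < 9),
                  0.8 * np.sin(np.pi * (t - 5) / 4), 0.0)         # second compression peak
force = np.clip(force, 0, None)

first, second = force[t < 5], force[t >= 5]
hardness = first.max()                            # peak force of the first compression
cohesiveness = second.sum() / first.sum()         # ratio of areas under the two peaks

width1 = t[t < 5][first > 0.01]                   # duration of the first compression
width2 = t[t >= 5][second > 0.01]                 # duration of the second compression
springiness = (width2.max() - width2.min()) / (width1.max() - width1.min())

print(hardness, round(cohesiveness, 2), round(springiness, 2))
```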

Relevance:

20.00%

Publisher:

Abstract:

The interactions between outdoor bronzes and the environment, which lead to bronze corrosion, require a better understanding in order to design effective conservation strategies in the Cultural Heritage field. In the present work, investigations on real patinas of the outdoor monument to Vittorio Bottego (Parma, Italy) and laboratory studies on accelerated corrosion testing of inhibited (by silane-based films, with and without ceria nanoparticles) and non-inhibited quaternary bronzes are reported and discussed. In particular, a wet & dry ageing method was used both for testing the efficiency of the inhibitor and for patinating bronze coupons before applying the inhibitor. A wide range of spectroscopic techniques was used for characterizing the core metal (SEM+EDS, XRF, AAS), the corroded surfaces (SEM+EDS, portable XRF, micro-Raman, ATR-IR, Py-GC-MS) and the ageing solutions (AAS). The main conclusions were:
1. The investigations on the Bottego monument confirmed the differentiation of the corrosion products as a function of exposure geometry, already observed in previous works, further highlighting the need to take into account the different surface features when selecting conservation procedures such as the application of inhibitors (i.e. the relative Sn enrichment in unsheltered areas requires inhibitors that interact effectively not only with Cu but also with Sn).
2. The ageing (pre-patination) cycle on coupons was able to reproduce the relative Sn enrichment that actually occurs on real patinated surfaces, making the bronze specimens representative of the real support for bronze inhibitors.
3. The non-toxic silane-based inhibitors display a good protective efficiency towards pre-patinated surfaces, unlike other widely used inhibitors such as benzotriazole (BTA) and its derivatives.
4. 3-mercapto-propyl-trimethoxy-silane (PropS-SH) with added CeO2 nanoparticles generally offered better corrosion protection than PropS-SH alone.
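The protective-efficiency comparison in points 3-4 is commonly quantified from the metal release into the ageing solutions (here measured by AAS); a minimal sketch with purely illustrative numbers, not the study's data:

```python
def protective_efficiency(release_uninhibited, release_inhibited):
    """Protective efficiency (%) from the total metal released into the
    ageing solution, with and without the inhibitor film."""
    return 100.0 * (1 - release_inhibited / release_uninhibited)

# Hypothetical Cu release values (mg/L) measured by AAS, for illustration only
cu_bare        = 12.0   # non-inhibited coupon
cu_props_sh    = 3.5    # PropS-SH film
cu_props_ceria = 2.1    # PropS-SH film with CeO2 nanoparticles

print(protective_efficiency(cu_bare, cu_props_sh))     # PropS-SH
print(protective_efficiency(cu_bare, cu_props_ceria))  # PropS-SH + CeO2
```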

Relevance:

20.00%

Publisher:

Abstract:

An Adaptive Optics (AO) system is a fundamental requirement of 8m-class telescopes. In order to reach the maximum resolution allowed by these telescopes we need to correct for atmospheric turbulence; thanks to adaptive optics systems we are able to exploit the full potential of these instruments and extract as much information as possible from the sources observed. An AO system has two main components: the wavefront sensor (WFS), which measures the aberrations of the wavefront entering the telescope, and the deformable mirror (DM), which assumes a shape opposite to the one measured by the sensor. The two subsystems are connected by the reconstructor (REC). To do this, the REC requires a "common language" between the two main AO components, i.e. a mapping between sensor space and mirror space, called the interaction matrix (IM). Therefore, in order to operate correctly, an AO system has a key requirement: the measurement of an IM to calibrate the whole system. The IM measurement is a milestone for an AO system and must be done regardless of the telescope size or class. Usually, this calibration step is carried out by adding an auxiliary artificial light source (i.e. a fiber) to the telescope that illuminates both the deformable mirror and the sensor, allowing the calibration of the AO system. For large telescopes (more than 8 m, like the Extremely Large Telescopes, ELTs) the fiber-based IM measurement requires challenging optical setups that in some cases are also impractical to build. In these cases, new techniques to measure the IM are needed. In this PhD work we want to investigate the possibility of a different calibration method that can be applied directly on sky, at the telescope, without any auxiliary source; such a technique could be used to calibrate an AO system on a telescope of any size. We want to test the new calibration technique, called the "sinusoidal modulation technique", on the Large Binocular Telescope (LBT) AO system, which is already a complete AO system with the two main components: a secondary deformable mirror with 672 actuators, and a pyramid wavefront sensor. The first phase of my PhD work was helping to implement the WFS board (containing the pyramid sensor and all the auxiliary optical components), working on both the optical alignment and the tests of some optical components. Thanks to the "solar tower" facility of the Astrophysical Observatory of Arcetri (Firenze), we were able to reproduce an environment very similar to that of the telescope, testing the main LBT AO components: the pyramid sensor and the secondary deformable mirror. This enabled the second phase of my PhD thesis: the measurement of the IM applying the sinusoidal modulation technique. At first we measured the IM using an auxiliary fiber source to calibrate the system, without any disturbance injected. After that, we tried to use this calibration technique to measure the IM directly "on sky", i.e. adding an atmospheric disturbance to the AO system. The results obtained in this PhD work by measuring the IM directly in the Arcetri solar tower system are crucial for future developments: the possibility of acquiring the IM directly on sky means that we can calibrate an AO system also for the extremely large telescope class, where classic IM measurement techniques are problematic and sometimes impossible.
Finally, we should not forget why we need all this: the main aim is to observe the universe. Thanks to this new class of large telescopes, and only by using their full capabilities, we will be able to increase our knowledge of the objects observed, because we will be able to resolve more detailed characteristics, discovering, analysing and understanding the behaviour of the components of the universe.
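To make the roles of the IM and the REC concrete, the toy sketch below builds a reconstructor as the least-squares (pseudo-)inverse of a simulated interaction matrix; the matrix is random and the dimensions are placeholders (the real LBT system has 672 actuators and a pyramid sensor with many slope measurements), so this only illustrates the calibration/correction chain, not the sinusoidal modulation technique itself:

```python
# Toy illustration of the calibration/reconstruction chain described above.
# In the real system each column of the IM is the sensor response to poking
# (or sinusoidally modulating) one actuator; here the IM is simply random.
import numpy as np

rng = np.random.default_rng(0)
n_act, n_slopes = 672, 2000                   # actuators / sensor measurements (placeholder sizes)

IM = rng.standard_normal((n_slopes, n_act))   # sensor response per unit actuator command
REC = np.linalg.pinv(IM)                      # reconstructor: pseudo-inverse of the IM

true_cmd = rng.standard_normal(n_act)         # a wavefront expressed in mirror space
slopes = IM @ true_cmd + 0.01 * rng.standard_normal(n_slopes)  # noisy sensor measurement
est_cmd = REC @ slopes                        # reconstructed mirror command (the DM applies its opposite)

print(np.allclose(est_cmd, true_cmd, atol=0.05))  # True: the loop would correct the aberration
```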

Relevance:

20.00%

Publisher:

Abstract:

1) Background: The most common method to evaluate clarithromycin resistance is the E-test, but it is time consuming. Resistance of Hp to clarithromycin is due to point mutations in the 23S rRNA. Eight different point mutations have been related to clarithromycin resistance, but the large majority of clarithromycin resistance depends on three point mutations (A2142C, A2142G and A2143G). Novel PCR-based clarithromycin resistance assays, applicable even to paraffin-embedded biopsy specimens, have been proposed. Aims: to assess clarithromycin resistance by detecting these point mutations (with the E-test as the reference method) and, secondly, to investigate the relation with MIC values. Methods: Paraffin-embedded biopsies of Hp-positive patients were retrieved. The A2142C, A2142G and A2143G point mutations were detected by molecular analysis after DNA extraction, using a TaqMan real-time PCR. Results: The study enrolled 86 patients: 46 resistant and 40 susceptible to clarithromycin. Hp status was evaluated at endoscopy by rapid urease test (RUT), histology and Hp culture. According to real-time PCR, 37 specimens were susceptible to clarithromycin (wild-type DNA) whilst the remaining 49 specimens (57%) were resistant. A2143G was the most frequent mutation. A2142C always expresses a resistant phenotype, and A2142G leads to a resistant phenotype only if homozygous.
2) Background: The colonoscopy workload for endoscopy services is increasing due to colorectal cancer prevention. We tested a combination of faecal tests to improve accuracy and prioritize access to colonoscopy. Methods: we tested a combination of faecal tests (FOBT, M2-PK and calprotectin) in a group of 280 patients requiring colonoscopy. Results: 47 patients had CRC and 85 had advanced adenoma/s at colonoscopy/histology. Among the single tests, FOBT had the highest specificity and PPV for CRC detection, while M2-PK had the highest sensitivity and NPV. Test combinations were more interesting in terms of PPV, and the best combination was i-FOBT + M2-PK.
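The genotype-to-phenotype rule reported in the first study can be written out explicitly; a minimal sketch with a hypothetical record format, where treating A2143G as resistant when present is an assumption, since the abstract only notes that it is the most frequent mutation:

```python
# Minimal sketch of the resistance call described above (hypothetical record format).
def predict_phenotype(mutations):
    """mutations: dict mapping mutation name -> 'homozygous', 'heterozygous' or None."""
    if mutations.get("A2142C"):                   # A2142C always expresses resistance
        return "resistant"
    if mutations.get("A2142G") == "homozygous":   # A2142G resistant only if homozygous
        return "resistant"
    if mutations.get("A2143G"):                   # assumed resistant when present (assumption)
        return "resistant"
    return "susceptible"                          # wild-type 23S rRNA

print(predict_phenotype({"A2142G": "heterozygous"}))  # susceptible
print(predict_phenotype({"A2143G": "homozygous"}))    # resistant
```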

Relevance:

20.00%

Publisher:

Abstract:

Electrical impedance tomography (EIT) is intended to serve as a low-cost, side-effect-free tomographic method in medical diagnostics, for example in mammography. With EIT, cancerous tissue can be distinguished from healthy tissue, since it exhibits a significantly increased conductivity; EIT can therefore complement the classical diagnostic methods. In young women with denser fatty tissue, for instance, identifying a mammary carcinoma by X-ray tomography is not always possible. The aim of this work was to develop a prototype for impedance tomography and to test possible applications. The tomograph was built in collaboration with Dr. K. H. Georgi. It allows low-impedance alternating currents to be injected at electrodes on the body surface, with the potentials at these electrodes set programmably; further high-impedance electrodes are used for potential measurement. To bridge the skin resistance, alternating-current frequencies of 20-100 kHz are used. By measuring current and potential on different electrodes, the problem of the only imprecisely known skin resistance can be avoided. In principle, the Mainz EIT system can perform 100 measurements per second. On the basis of data obtained with the Mainz EIT, different reconstruction algorithms were to be tested and further developed. In the past, various reconstruction algorithms for the mathematically ill-posed EIT problem have been considered; they are essentially based on two strategies: linearization with iterative solution of the problem, and region-detection methods. I modified the iterative methods so that conductivity increases and conductivity decreases can be treated on an equal footing. For the modified algorithm, two different reconstruction schemes were programmed and tested with synthetic data: reconstruction via the approximate inverse, and reconstruction with a discretization. Specifically for the reconstruction by discretization, a method was developed that allows additional information to be taken into account, which improves the reconstruction; the region-detection algorithm can supply this additional information. In this work, a more recent region-detection method was modified so that reconstruction became possible even with separate current and voltage electrodes. Excellent reconstructions can be achieved with difference data. For medical applications, however, absolute measurements are needed, i.e. without a reference (empty) measurement. The expected effect of a conductivity inhomogeneity is very small and, as the difference of two large numbers, very difficult to determine. The developed algorithms also cope well with absolute data.
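As an illustration of the linearize-and-iterate strategy mentioned above (not the specific approximate-inverse or discretization algorithms developed in this thesis), a single regularized Gauss-Newton update on a synthetic linear forward model looks like this; the Jacobian, the dimensions and the data are stand-ins:

```python
# Illustrative linearized EIT update (Gauss-Newton step with Tikhonov
# regularization). The Jacobian and the measurements are synthetic stand-ins
# for a real forward model; conductivity increases and decreases are treated
# symmetrically, as in the modified iterative scheme described above.
import numpy as np

rng = np.random.default_rng(1)
n_meas, n_pix = 208, 400                  # electrode measurements / conductivity pixels (placeholders)

J = rng.standard_normal((n_meas, n_pix))  # Jacobian: d(voltages)/d(conductivity) at the current estimate
sigma = np.ones(n_pix)                    # current conductivity estimate
delta_true = np.zeros(n_pix)
delta_true[180:190] = 0.5                 # small conductive inclusion (e.g. a lesion)
v_meas = J @ (sigma + delta_true)         # "measured" voltages (linear toy forward model)
v_sim = J @ sigma                         # simulated voltages for the current estimate

lam = 1e-2                                # Tikhonov regularization parameter
lhs = J.T @ J + lam * np.eye(n_pix)
update = np.linalg.solve(lhs, J.T @ (v_meas - v_sim))
sigma += update                           # one regularized Gauss-Newton iteration

mask = np.zeros(n_pix, bool); mask[180:190] = True
print(update[mask].mean(), update[~mask].mean())  # the inclusion stands out from the background
```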