853 results for COINTEGRATION TESTS
Abstract:
Using the indirect hemagglutination (IH), indirect immunofluorescence (IIF) and enzyme-linked immunosorbent assay (ELISA) tests for the diagnosis of Chagas disease, 4000 serum samples were examined. The study served different purposes: clinical interest, research support, and parasitological monitoring of patients with Chagas disease who had undergone heart transplantation. The tests were performed without patient selection, in accordance with the medical requests. The results showed discrepancies among the three methods when considered together, raising several questions. These findings are cause for concern, and we suggest the adoption of measures aimed at avoiding such mismatches in the context of this disease.
Abstract:
The literature argues that Brazil, although still the world's largest exporter of green coffee, has been losing power in this market, since the competition (rivalry and probability of entry) imposed by countries such as Colombia and Vietnam is strong enough to make the market highly competitive. This article therefore evaluates the recent pattern of competition in the world green coffee market using an econometric methodology more commonly employed in antitrust analysis. To assess consumer behaviour, the price elasticities of world demand for green coffee were estimated, by coffee type, using the antitrust multinomial logit demand model. To assess market equilibrium behaviour, quantity-share instability tests were performed by means of panel cointegration analysis. The results point to increased competition facing the Brazilian coffee variety on the demand side, and to the maintenance of quantity shares as the market's equilibrium configuration.
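As background to the demand-side method, a multinomial logit demand model of this kind can be estimated via the Berry (1994) share inversion, from which own-price elasticities follow directly. Below is a minimal sketch in Python; the shares and prices are toy numbers (constructed so the price coefficient comes out near -1), not the article's data, and in practice prices would be instrumented rather than estimated by plain OLS.

```python
# Minimal sketch of the multinomial logit demand model used in antitrust
# analysis, via the Berry (1994) inversion. All numbers are illustrative.
import numpy as np
import statsmodels.api as sm

# Toy data: inside-good shares and prices for three coffee varieties
# (hypothetical), plus the implied outside-good share s0.
s = np.array([0.213, 0.129, 0.078])
s0 = 1.0 - s.sum()
p = np.array([2.0, 2.5, 3.0])

# Berry inversion: ln(s_j) - ln(s_0) = const + alpha * p_j + xi_j
y = np.log(s) - np.log(s0)
X = sm.add_constant(p)
alpha = sm.OLS(y, X).fit().params[1]   # price coefficient (~ -1 with this toy data)

# Logit own-price elasticity: eta_jj = alpha * p_j * (1 - s_j)
own_elasticity = alpha * p * (1 - s)
print(own_elasticity)                  # negative, as demand theory requires
```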
Abstract:
Electrospinning is used to produce fibers in the nanometer range by stretching a polymeric jet with high-magnitude electric fields. Chitosan is an abundant natural polymer that can be used to obtain biocompatible nanostructured membranes. The objectives of this work were to obtain nanostructured membranes based on blends of chitosan and polyoxyethylene (PEO), and to evaluate their thermal and morphological properties, as well as their in vitro biocompatibility, by agar diffusion cytotoxicity tests with three different cell lines. A nanostructured fibrous membrane with fiber diameters on the order of 200 nm was obtained, presenting a rough surface and a thickness ranging from one to two millimeters. The cytotoxicity tests showed that the chitosan/PEO membranes are non-toxic to the cells studied in this work. The electrospinning technique was thus effective in obtaining nanostructured chitosan/PEO membranes, which showed biocompatibility in preliminary in vitro tests with these cell lines.
Abstract:
OBJECTIVE: The frequent occurrence of inconclusive serology in blood banks and the absence of a gold-standard test for Chagas disease led us to examine the efficacy of the blood culture test and of five commercial tests (ELISA, IIF, HAI, c-ELISA, rec-ELISA) used in screening blood donors for Chagas disease, and to investigate the prevalence of Trypanosoma cruzi infection among donors with inconclusive screening serology with respect to some epidemiological variables. METHODS: To obtain the estimates of interest we used a Bayesian latent class model with covariates included through a logit link. RESULTS: Better performance was observed for some categories of the epidemiological variables. In addition, all pairs of tests (excluding the blood culture test) proved to be good alternatives both for screening (sensitivity > 99.96% in parallel testing) and for confirmation (specificity > 99.93% in serial testing) of Chagas disease. The prevalence of 13.30% observed in the stratum of donors with inconclusive serology means that most of these donors are probably not infected. Moreover, depending on the levels of specific epidemiological variables, the absence of infection in this group can be predicted with a probability of 100% from pairs of tests interpreted in parallel. CONCLUSION: The epidemiological variables can improve test results and thus help clarify inconclusive serology screening results. Moreover, all pairwise combinations of the five commercial tests are good alternatives for confirming results.
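To see why pairs of tests perform so well, note how sensitivities and specificities combine under parallel reading (positive if either test is positive) and serial reading (positive only if both are positive). The sketch below assumes conditional independence of the tests given true infection status; the input values are hypothetical, not the study's estimates.

```python
# How two tests combine under parallel vs. serial interpretation,
# assuming conditional independence given true infection status.

def parallel(se1, sp1, se2, sp2):
    # Parallel: classify positive if EITHER test is positive.
    se = 1 - (1 - se1) * (1 - se2)   # misses only if both tests miss
    sp = sp1 * sp2                   # negative only if both tests are negative
    return se, sp

def serial(se1, sp1, se2, sp2):
    # Serial: classify positive only if BOTH tests are positive.
    se = se1 * se2                   # detects only if both tests detect
    sp = 1 - (1 - sp1) * (1 - sp2)   # false positive only if both tests err
    return se, sp

# Hypothetical single-test performances around 98-99%:
print(parallel(0.98, 0.98, 0.99, 0.99))  # sensitivity rises toward 1 -> screening
print(serial(0.98, 0.98, 0.99, 0.99))    # specificity rises toward 1 -> confirmation
```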
Abstract:
An environmental impact study was conducted to determine the water quality of Piracicamirim creek, in order to assess the influence of effluents from a sugar industry on this water body. Toxicity tests were performed on water samples collected upstream and downstream of the industry, using the microcrustaceans Daphnia magna, Ceriodaphnia dubia and Ceriodaphnia silvestrii as test organisms, together with physical and chemical analyses of the water. The physical and chemical parameters did not change during the sampling period, except for dissolved oxygen. No toxicity to D. magna and no effects on the reproduction of C. dubia and C. silvestrii were observed at either sampling point. Thus, the industry was not negatively impacting the quality of this water body.
Abstract:
Study I: Real Wage Determination in the Swedish Engineering Industry. This study uses the monopoly union model to examine the determination of real wages, and in particular the effects of active labour market programmes (ALMPs) on real wages, in the engineering industry. Quarterly data for the period 1970:1 to 1996:4 are used in a cointegration framework, utilising Johansen's maximum likelihood procedure. On the basis of the Johansen (trace) test results, vector error correction (VEC) models are created in order to model the determination of real wages in the engineering industry. The estimation results support the presence of a long-run wage-raising effect of rises in labour productivity, in the tax wedge, in the alternative real consumer wage and in real UI benefits. The estimation results also support the presence of a long-run wage-raising effect of positive changes in the participation rates in ALMPs, relief jobs and labour market training. This could be interpreted as meaning that the possibility of participating in an ALMP increases the utility for workers of not being employed in the industry, which in turn could increase real wages in the industry in the long run. Finally, the estimation results show evidence of a long-run wage-reducing effect of positive changes in the unemployment rate. Study II: Intersectoral Wage Linkages in Sweden. The purpose of this study is to investigate whether wage-setting in certain sectors of the Swedish economy affects wage-setting in other sectors. The theoretical background is the Scandinavian model of inflation, which states that wage-setting in the sectors exposed to international competition affects wage-setting in the sheltered sectors of the economy. The Johansen maximum likelihood cointegration approach is applied to quarterly data on Swedish sector wages for the period 1980:1–2002:2. Different vector error correction (VEC) models are created, based on assumptions as to which sectors are exposed to international competition and which are not. The adaptability of wages between sectors is then tested by imposing restrictions on the estimated VEC models. Finally, Granger causality tests are performed in the different restricted/unrestricted VEC models to test for sector wage leadership. The empirical results indicate considerable adaptability of wages between manufacturing, construction, the wholesale and retail trade, the central government sector, and the municipalities and county councils sector. This is consistent with the assumptions of the Scandinavian model. Further, the empirical results indicate a low level of wage adaptability between the financial sector and manufacturing, and between the financial sector and the two public sectors. The Granger causality tests provide strong evidence for the presence of intersectoral wage causality, but no evidence of a wage-leading role, in line with the assumptions of the Scandinavian model, for any of the sectors. Study III: Wage and Price Determination in the Private Sector in Sweden. The purpose of this study is to analyse wage and price determination in the private sector in Sweden during the period 1980–2003. The theoretical background is a variant of the imperfect-competition model of inflation, which assumes imperfect competition in the labour and product markets. According to the model, wages and prices are determined as the result of a "battle of mark-ups" between trade unions and firms.
The Johansen maximum likelihood cointegration approach is applied to quarterly Swedish data on consumer prices, import prices, private-sector nominal wages, private-sector labour productivity and the total unemployment rate for the period 1980:1–2003:3. The chosen cointegration rank of the estimated vector error correction (VEC) model is two; thus two cointegration relations are assumed, one for private-sector nominal wage determination and one for consumer price determination. The estimation results indicate that an increase in consumer prices of one per cent lifts private-sector nominal wages by 0.8 per cent. Furthermore, an increase in private-sector nominal wages of one per cent increases consumer prices by one per cent. An increase of one percentage point in the total unemployment rate reduces private-sector nominal wages by about 4.5 per cent. The long-run effects of private-sector labour productivity and import prices on consumer prices are about –1.2 and 0.3 per cent, respectively. The Rehnberg agreement of 1991–92 and the monetary policy shift in 1993 affected the determination of private-sector nominal wages, private-sector labour productivity, import prices and the total unemployment rate. The "offensive" devaluation of the Swedish krona by 16 per cent in 1982:4, as well as the move to a floating krona and its substantial depreciation at that time, affected the determination of import prices.
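For reference, the Johansen trace test and VECM estimation used throughout these studies are available in statsmodels. The sketch below assumes a hypothetical CSV of the quarterly Swedish series named in the abstract; the file name, column names and lag order are illustrative stand-ins, not the studies' actual specification.

```python
# Minimal sketch of a Johansen trace test followed by a rank-2 VECM fit,
# mirroring the setup described in Study III. Data file is hypothetical.
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

data = pd.read_csv("swedish_quarterly.csv",   # hypothetical file
                   usecols=["cpi", "import_prices", "wages",
                            "productivity", "unemployment"])

# Johansen trace test: det_order=0 includes a constant; k_ar_diff is the
# number of lagged differences (illustrative choice for quarterly data).
jres = coint_johansen(data, det_order=0, k_ar_diff=4)
print(jres.trace_stat)             # trace statistics for rank r = 0, 1, ...
print(jres.trace_stat_crit_vals)   # 90/95/99% critical values

# Fit a VECM with the cointegration rank chosen from the trace test
# (rank 2 in Study III: one wage relation, one price relation).
vecm = VECM(data, k_ar_diff=4, coint_rank=2, deterministic="co").fit()
print(vecm.beta)   # cointegrating vectors (long-run relations)
print(vecm.alpha)  # adjustment (loading) coefficients
```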
Abstract:
In this thesis the application of biotechnological processes based on the microbial metabolic degradation of halogenated compounds has been investigated. Several studies have shown that most of these pollutants can be biodegraded by single bacterial strains or mixed microbial populations via aerobic direct metabolism, or via cometabolism using aromatic or aliphatic hydrocarbons as growth substrates. The enhancement of two specific processes was studied, each in its own scenario: first, the bioremediation, via aerobic cometabolism, of soil contaminated by a highly chlorinated compound, using a mixed microbial population, together with the selection and isolation of a consortium specific for that compound; second, the implementation of a treatment technology based on the direct metabolism of two pure strains at the exact point source of emission, preventing the dilution and contamination of large volumes of waste fluids polluted by several halogenated compounds and thereby minimizing the environmental impact. To verify the ability of these two new biotechnological applications to remove halogenated compounds, and to propose them as more efficient alternatives, continuous and batch tests were set up in the experimental part of this thesis. The results obtained from the continuous tests in the second scenario were supported by microbial analysis via fluorescence in situ hybridisation (FISH) and by a mathematical model of the system. The results showed that both processes, each in its respective scenario, offer effective solutions for the biological treatment of chlorinated-compound pollution.
Abstract:
Bread dough, and particularly wheat dough, is probably the most dynamic and complicated rheological system owing to its viscoelastic behaviour, and its characteristics are very important since they strongly affect the textural and sensorial properties of the final products. The study of dough rheology has been a very challenging task for many researchers, since it can provide a wealth of information about dough formulation, structure and processing; this explains why dough rheology has been a matter of investigation for several decades. In this research, the rheological assessment of doughs and breads was performed using empirical and fundamental methods at both small and large deformation, in order to characterize different types of dough and final products such as bread. To study the structural aspects of the food products, image analysis techniques were used to integrate the information coming from the empirical and fundamental rheological measurements. Dough properties were evaluated by texture profile analysis (TPA), dough stickiness (Chen and Hoseney cell) and uniaxial extensibility determination (Kieffer test) using a texture analyser; small-deformation rheological measurements were performed on a controlled stress–strain rheometer; the structure of the different doughs was observed using image analysis; and bread characteristics were studied using texture profile analysis (TPA) and image analysis. The objective of this research was to establish whether the different rheological measurements were able to characterize and differentiate the samples analysed, in order to investigate the effect of different formulations and processing conditions on dough and final product from a structural point of view. To this end, the following materials were prepared and analysed: frozen dough made without yeast; frozen dough and bread made from frozen dough; doughs obtained using different fermentation methods; doughs made with Kamut® flour; dough and bread made with the addition of ginger powder; and final products from different bakeries. The influence of sub-zero storage time on the viscoelastic performance of non-fermented and fermented dough, and on the final product (bread), was evaluated using small-deformation and large-deformation methods. In general, the longer the sub-zero storage time, the poorer the positive viscoelastic attributes. The effects of fermentation time and of different types of fermentation (straight-dough method, sponge-and-dough procedure and poolish method) on the rheological properties of doughs were investigated using empirical and fundamental analyses, and image analysis was used to integrate this information through the evaluation of the dough's structure. The results of the fundamental rheological tests showed that the incorporation of sourdough (poolish method) provoked changes different from those seen with the other types of fermentation. The beneficial effect of some ingredients (extra-virgin olive oil and a liposomic lecithin emulsifier) in improving the rheological characteristics of Kamut® dough was confirmed even when the dough was subjected to low temperatures (24 hours and 48 hours at 4 °C).
The small-deformation oscillatory measurements and large-deformation mechanical tests provided useful information on the rheological properties of samples made with different amounts of ginger powder, showing that the sample with the highest amount of ginger powder (6%) had worse rheological characteristics than the other samples. Moisture content, specific volume, texture and crumb grain are the major quality attributes of bread products. The samples analysed, "Coppia Ferrarese", "Pane Comune Romagnolo" and "Filone Terra di San Marino", showed a decrease in crumb moisture and an increase in hardness over the storage time. Parameters such as cohesiveness and springiness, evaluated by TPA and indicators of fresh-bread quality, decreased during storage. The empirical rheological tests revealed several differences among the samples, due to the different ingredients used in the formulations and the different processes adopted to prepare them; but since these products are handmade, the differences can be regarded as added value. In conclusion, small-deformation (in fundamental units) and large-deformation methods played a significant role in monitoring the influence of the different ingredients, processing and storage conditions on dough viscoelastic performance and on the final product. Finally, knowledge of the formulation, processing and storage conditions, together with the evaluation of structural and rheological characteristics, is fundamental for the study of complex matrices like bakery products, where numerous variables can influence final quality (e.g. raw materials, bread-making procedure, and the time and temperature of fermentation and baking).
Abstract:
The interactions between outdoor bronzes and the environment, which lead to bronze corrosion, require a better understanding in order to design effective conservation strategies in the Cultural Heritage field. In the present work, investigations of real patinas of the outdoor monument to Vittorio Bottego (Parma, Italy) and laboratory studies on the accelerated corrosion testing of inhibited (by silane-based films, with and without ceria nanoparticles) and non-inhibited quaternary bronzes are reported and discussed. In particular, a wet & dry ageing method was used both for testing the efficiency of the inhibitor and for patinating bronze coupons before applying the inhibitor. A wide range of spectroscopic techniques was used to characterize the core metal (SEM+EDS, XRF, AAS), the corroded surfaces (SEM+EDS, portable XRF, micro-Raman, ATR-IR, Py-GC-MS) and the ageing solutions (AAS). The main conclusions were: 1. The investigations of the Bottego monument confirmed the differentiation of the corrosion products as a function of exposure geometry, already observed in previous works, further highlighting the need to take the different surface features into account when selecting conservation procedures such as the application of inhibitors (i.e. the relative Sn enrichment in unsheltered areas requires inhibitors that interact effectively not only with Cu but also with Sn). 2. The ageing (pre-patination) cycle on coupons was able to reproduce the relative Sn enrichment that actually occurs on real patinated surfaces, making the bronze specimens representative of the real support for bronze inhibitors. 3. The non-toxic silane-based inhibitors display good protective efficiency towards pre-patinated surfaces, unlike other widely used inhibitors such as benzotriazole (BTA) and its derivatives. 4. 3-mercapto-propyl-trimethoxy-silane (PropS-SH) additivated with CeO2 nanoparticles generally offered better corrosion protection than PropS-SH alone.
Abstract:
An Adaptive Optics (AO) system is a fundamental requirement of 8m-class telescopes: to reach the maximum resolution these telescopes allow, the atmospheric turbulence must be corrected. Thanks to adaptive optics systems we can use the full effective potential of these instruments, extracting as much information as possible from the sources observed. An AO system has two main components: the wavefront sensor (WFS), which measures the aberrations of the wavefront entering the telescope, and the deformable mirror (DM), which assumes a shape opposite to the one measured by the sensor. The two subsystems are connected by the reconstructor (REC). The REC requires a "common language" between these two main AO components, that is, a mapping between sensor-space and mirror-space, called the interaction matrix (IM). Therefore, in order to operate correctly, an AO system has one main requirement: the measurement of an IM, which calibrates the whole AO system. The IM measurement is a milestone for any AO system and must be done regardless of telescope size or class. Usually this calibration step is done by adding an auxiliary artificial light source (i.e. a fiber) to the telescope that illuminates both the deformable mirror and the sensor, permitting the calibration of the AO system. For larger telescopes (more than 8 m, like the Extremely Large Telescopes, ELTs), fiber-based IM measurement requires challenging optical setups that in some cases are impractical to build; in these cases, new techniques to measure the IM are needed. In this PhD work we investigate a different calibration method that can be applied directly on sky, at the telescope, without any auxiliary source; such a technique could be used to calibrate the AO system of a telescope of any size. We test this new calibration technique, called the "sinusoidal modulation technique", on the AO system of the Large Binocular Telescope (LBT), which is already a complete AO system with the two main components: a secondary deformable mirror with 672 actuators, and a pyramid wavefront sensor. The first phase of my PhD work was helping to implement the WFS board (containing the pyramid sensor and all the auxiliary optical components), performing optical alignments and testing some of the optical components. Thanks to the "solar tower" facility of the Arcetri Astrophysical Observatory (Firenze), we were able to reproduce an environment very similar to the telescope's, testing the main LBT AO components: the pyramid sensor and the secondary deformable mirror. This enabled the second phase of my PhD thesis: measuring the IM with the sinusoidal modulation technique. At first we measured the IM using an auxiliary fiber source to calibrate the system, without any injected disturbance. We then used the technique to measure the IM as if "on sky", adding an atmospheric disturbance to the AO system. The results obtained in this PhD work, measuring the IM directly in the Arcetri solar tower system, are crucial for future developments: being able to acquire the IM directly on sky means we can calibrate AO systems even for the extremely large telescope class, where classic IM measurement techniques are problematic and sometimes impossible.
Finally, we must not forget the reason why we need all this: the main aim is to observe the universe. Thanks to this new class of large telescopes, and only by using their full capabilities, we will be able to increase our knowledge of the objects we observe, since we will be able to resolve finer details and thus discover, analyse and understand the behaviour of the universe's components.
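For context, the classic fiber-based calibration that the sinusoidal modulation technique aims to replace amounts to poking each actuator, recording the sensor response, and inverting the stacked responses. Below is a minimal sketch assuming a linear WFS; the 672 actuators come from the abstract, while the slope count and the wfs_response() helper are hypothetical stand-ins for the real hardware.

```python
# Sketch of classic interaction-matrix (IM) calibration and least-squares
# reconstructor (REC) computation. Dimensions and the sensor model are
# illustrative, not LBT's actual setup.
import numpy as np

n_act = 672       # deformable-mirror actuators (mirror-space, per the abstract)
n_slopes = 1600   # pyramid-WFS slope measurements (sensor-space, hypothetical)

rng = np.random.default_rng(0)
A = rng.standard_normal((n_slopes, n_act))  # unknown "truth" the calibration recovers

def wfs_response(command):
    # Hypothetical stand-in for the WFS slopes measured when the DM
    # applies `command`; here a fixed random linear system.
    return A @ command

# Build the IM column by column: poke each actuator, record the slopes.
IM = np.zeros((n_slopes, n_act))
for j in range(n_act):
    cmd = np.zeros(n_act)
    cmd[j] = 1.0                  # unit poke of actuator j
    IM[:, j] = wfs_response(cmd)

# Least-squares reconstructor: maps measured slopes back to DM commands.
REC = np.linalg.pinv(IM)
```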