915 results for Dynamic Headspace Analysis
Abstract:
RpfG is a paradigm for a class of widespread bacterial two-component regulators with a CheY-like receiver domain attached to a histidine-aspartic acid-glycine-tyrosine-proline (HD-GYP) cyclic di-GMP phosphodiesterase domain. In the plant pathogen Xanthomonas campestris pv. campestris (Xcc), a two-component system comprising RpfG and the complex sensor kinase RpfC is implicated in sensing and responding to the diffusible signaling factor (DSF), which is essential for cell-cell signaling. RpfF is involved in synthesizing DSF, and mutations of rpfF, rpfG, or rpfC lead to a coordinate reduction in the synthesis of virulence factors such as extracellular enzymes, biofilm structure, and motility. Using yeast two-hybrid analysis and fluorescence resonance energy transfer experiments in Xcc, we show that the physical interaction of RpfG with two proteins with diguanylate cyclase (GGDEF) domains controls a subset of RpfG-regulated virulence functions. RpfG interactions were abolished by alanine substitutions of the three residues of the conserved GYP motif in the HD-GYP domain. Changing the GYP motif or deletion of the two GGDEF-domain proteins reduced Xcc motility but not the synthesis of extracellular enzymes or biofilm formation. RpfG-GGDEF interactions are dynamic and depend on DSF signaling, being reduced in the rpfF mutant but restored by DSF addition. The results are consistent with a model in which DSF signal transduction controlling motility depends on a highly regulated, dynamic interaction of proteins that influence the localized expression of cyclic di-GMP.
Abstract:
A dynamic atmosphere generator with a naphthalene emission source was constructed and used to develop and evaluate a bioluminescence sensor based on the bacterium Pseudomonas fluorescens HK44 immobilized in 2% agar gel (101 cells mL^-1) placed in sampling tubes. A steady naphthalene emission rate (around 7.3 nmol min^-1 at 27 °C and 7.4 mL min^-1 of purified air) was obtained by covering the diffusion unit containing solid naphthalene with a PTFE filter membrane. The time elapsed from gelation of the agar matrix to analyte exposure ("maturation time") was found to be relevant for the bioluminescence assays, being most favorable between 1.5 and 3 h. The maximum light emission, observed after 80 min, depends on the analyte concentration and the exposure time (evaluated between 5 and 20 min), but not on the flow rate of naphthalene in the sampling tube over the range of 1.8-7.4 nmol min^-1. A good linear response was obtained between 50 and 260 nmol L^-1, with a limit of detection estimated at 20 nmol L^-1, far below the recommended threshold limit value for naphthalene in air. (c) 2008 Elsevier B.V. All rights reserved.
Abstract:
Flow injection analysis (FIA) with amperometric detection at a carbon film sensor was explored for ambroxol analysis in pharmaceutical formulations. The flow cell, specially designed in the lab, generated sharp and reproducible current peaks, with a wide linear dynamic range from 5 × 10^-7 to 3.5 × 10^-4 mol L^-1 in 0.1 mol L^-1 sulfuric acid electrolyte, as well as high sensitivity, 0.110 A mol^-1 L cm^-2, at the optimized flow rate. A detection limit of 7.6 × 10^-8 mol L^-1 and a sampling frequency of 50 determinations per hour were achieved, employing injected volumes of 100 µL and a flow rate of 2.0 mL min^-1. The repeatability, expressed as R.S.D. for successive and alternated injections of 6.0 × 10^-6 and 6.0 × 10^-5 mol L^-1 ambroxol solutions, was 3.0 and 1.5%, respectively, without any noticeable memory effect between injections. The proposed method was applied to the analysis of ambroxol in pharmaceutical samples, and the results were compared with UV spectrophotometric and acid-base titrimetric methods. Good agreement between the results of the three methods and the labeled values was achieved, corroborating the good performance of the proposed electrochemical methodology for ambroxol analysis.
Abstract:
This paper describes the development and evaluation of a sequential injection method to automate the determination of methyl parathion by square-wave adsorptive cathodic stripping voltammetry, exploiting the concept of monosegmented flow analysis to perform in-line sample conditioning and standard addition. Accumulation and stripping steps are carried out in the sample medium conditioned with 40 mmol L^-1 Britton-Robinson buffer (pH 10) in 0.25 mol L^-1 NaNO3. The homogenized mixture is injected at a flow rate of 10 µL s^-1 toward the flow cell, which is adapted to the capillary of a hanging mercury drop electrode. After a suitable deposition time, the flow is stopped and the potential is scanned from -0.3 to -1.0 V versus Ag/AgCl at a frequency of 250 Hz and a pulse height of 25 mV. A linear dynamic range is observed for methyl parathion concentrations between 0.010 and 0.50 mg L^-1, with detection and quantification limits of 2 and 7 µg L^-1, respectively. The sampling throughput is 25 h^-1 if the in-line standard addition and sample conditioning protocols are followed, but this frequency can be increased to 61 h^-1 if the sample is conditioned off-line and quantified using an external calibration curve. The method was applied to the determination of methyl parathion in spiked water samples, and the accuracy was evaluated both by comparison with high-performance liquid chromatography with UV detection and by recovery percentages. Although no evidence of statistically significant differences between the expected and obtained concentrations was observed, the method is susceptible to interference from other pesticides (e.g., parathion, dichlorvos) and natural organic matter (e.g., fulvic and humic acids), so isolation of the analyte may be required for more complex sample matrices.
Abstract:
This work describes the partial oxypropylation of filter paper cellulose fibers, employing two different basic catalysts, viz., potassium hydroxide and 1,4-diazabicyclo[2.2.2]octane, to activate the hydroxyl groups of the polysaccharide and thus provide the anionic initiation sites for the "grafting-from" polymerization of propylene oxide. The success of this chemical modification was assessed by FTIR spectroscopy, X-ray diffraction, scanning electron microscopy, differential scanning calorimetry, thermogravimetric analysis, and contact angle measurements. The study of the role of the catalyst in the extent of the modification and in the mechanical properties of the ensuing composites, after hot pressing, showed that both the Brønsted and the Lewis base gave satisfactory results, without any marked difference.
Abstract:
A variety of substrates have been used for the fabrication of microchips for DNA extraction, PCR amplification, and DNA fragment separation, including the more conventional glass and silicon as well as alternative polymer-based materials. Polyester represents one such polymer, and laser-printing of toner onto polyester films has been shown to be effective for generating polyester-toner (PeT) microfluidic devices with channel depths on the order of tens of micrometers. Here, we describe a novel and simple process for producing multilayer, high-aspect-ratio PeT microdevices with substantially larger channel depths. This process uses a CO2 laser to create the microchannel in polyester sheets containing a uniform layer of printed toner; multilayer devices can easily be constructed by sandwiching the channel layer between uncoated polyester cover sheets containing precut access holes. The process allows the fabrication of deep channels, approximately 270 µm, and we demonstrate the effectiveness of multilayer PeT microchips for dynamic solid-phase extraction (dSPE) and PCR amplification. With the former, we found that (i) more than 65% of DNA from 0.6 µL of blood was recovered, (ii) the resultant DNA was concentrated to greater than 3 ng/µL (better than other chip-based extraction methods), and (iii) the recovered DNA was compatible with downstream microchip-based PCR amplification. Illustrating the compatibility of PeT microchips with the PCR process, the successful amplification of a 520 bp fragment of lambda-phage DNA in a conventional thermocycler is shown. The ability to handle the diverse chemistries associated with DNA purification and extraction is a testimony to the potential utility of PeT microchips beyond separations and presents a promising new disposable platform for genetic analysis that is low cost and easy to fabricate.
Abstract:
The purpose of this paper is to present a quantitative and qualitative analysis of foreign citizens who may participate in the Swedish labor market (referred to in the text as 'immigrants'). The research covers the period 1973-2005 and provides predictions of the immigrant population, its age and gender structure, and educational attainment in 2010. To handle data on immigrants from different countries, the population was divided into six groups. The main chapter is divided into two parts. The first part specifies the division of immigrants into groups by country of origin according to geographical, ethnic, economic, and historical criteria. Brief characteristics, together with descriptions of geographic position, dynamics, and structure, are given for each group; a historical review explains rapid changes in the immigrant population. Statistical models for describing and estimating the future population are given. The second part specifies the education and qualification levels of the immigrants according to international and Swedish standards. Models for estimating the age and gender structure, level of education, and professional orientation of immigrants in the different groups are given. Inferences are made regarding the ethnic, gender, and education structure of immigrants, and the distribution of immigrants among Swedish counties is given. The discussion presents the results of the research, offers perspectives for the future, and briefly evaluates the role of immigrants in the Swedish labor market.
Abstract:
This paper is concerned with the modern theory of social cost-benefit analysis in a dynamic economy. The theory emphasizes the role of a comprehensive, forward-looking, dynamic welfare index within the period of the project rather than that of a project's long-term consequences. However, what constitutes such a welfare index remains controversial in the recent literature. In this paper, we attempt to shed light on the issue by deriving three equivalent cost-benefit rules for evaluating a small project. In particular, we show that the direct change in net national product (NNP) qualifies as a convenient welfare index without involving any other induced side effects. The project evaluation criterion thus becomes the present discounted value of the direct changes in NNP over the project period. We also illustrate the application of this theory in a few stylized examples.
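The project evaluation criterion described above, the present discounted value of the direct changes in NNP over the project period, can be computed directly as PV = Σ_t ΔNNP_t / (1 + r)^t. A minimal sketch with an illustrative ΔNNP stream and discount rate (both assumptions, not from the paper):

```python
# Hedged sketch: present discounted value of the direct changes in NNP over
# the project period, at a constant discount rate r. Numbers are illustrative.
def pv_of_nnp_changes(delta_nnp, r):
    """PV = sum over t of delta_nnp[t] / (1 + r)^t, for t = 1..T."""
    return sum(d / (1 + r) ** t for t, d in enumerate(delta_nnp, start=1))

# A project costing 100 in year 1 that raises NNP by 40 in years 2-4:
stream = [-100.0, 40.0, 40.0, 40.0]
pv = pv_of_nnp_changes(stream, r=0.05)
print(f"PV of direct NNP changes = {pv:.2f}")  # positive, so the project passes
```

Under the criterion in the abstract, a positive PV of the direct NNP changes is what qualifies the small project for acceptance; induced side effects need not be tracked separately.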
Abstract:
The narrative of the United States is that of a "nation of immigrants," in which the language shift patterns of earlier ethnolinguistic groups have tended toward linguistic assimilation through English. In recent years, however, changes in the demographic landscape and language maintenance by non-English-speaking immigrants, particularly Hispanics, have been perceived as threats and have led to calls for an official English language policy. This thesis aims to contribute to the study of language policy making from a societal security perspective, as expressed in attitudes regarding language and identity originating in the daily interaction between language groups. The focus is on the role of language and American identity in relation to immigration. The study takes an interdisciplinary approach combining language policy studies, security theory, and critical discourse analysis. The material consists of articles collected from four newspapers, namely USA Today, The New York Times, Los Angeles Times, and San Francisco Chronicle, between April 2006 and December 2007. Two discourse types are evident from the analysis, namely Loyalty and Efficiency. The former is mainly marked by concerns of national identity and contains speech acts of security related to language shift, language choice, and English for unity. Immigrants are represented as dehumanised and harmful; immigration is framed as sovereignty-related, racial, and as war. The Efficiency discourse type is mainly instrumental and contains speech acts of security related to cost, provision of services, health and safety, and social mobility. Immigrants are further represented as a labour resource. These discourse types reflect how the construction of the linguistic 'we' is expected to be maintained. Loyalty is triggered by arguments that the collective identity is threatened and is itself used in reproducing the collective 'we' through hegemonic expressions of monolingualism in the public and semi-public space.
The denigration of immigrants is used as a tool for enhancing societal security through solidarity and as a possible justification for the denial of minority rights. Although language acquisition patterns still follow the historical trend of language shift, factors indicating cultural separateness, such as the appearance of speech communities or the use of minority languages in the public and semi-public space, have led to manifestations of intolerance. Examples of discrimination and prejudice towards minority groups indicate that the perceived worth of a shared language differs from the actual worth of dominant-language acquisition for integration purposes. The study further indicates that the efficient working of the free market, using minority languages to sell services or buy labour, is perceived as conflicting with nation-building notions, since it may create separately functioning sub-communities with a new cultural capital recognised as legitimate competence. The discourse types mainly represent securitising moves constructing existential threats. The perception of threat and ideas of national belonging are primarily based on a zero-sum notion favouring monolingualism. Further, the identity of the immigrant individual is seen as dynamic and adaptable to assimilationist measures, whereas the identity of the state and its members is perceived as static. The study also shows that debates concerning language status are linked to extra-linguistic matters. To conclude, for proposed language intervention measures to be successful, policy makers in the US need to consider the relationship between four factors: societal security based on collective identity, individual/human security, human rights, and a changing linguistic demography.
Abstract:
Existing distributed hydrologic models are complex and computationally demanding, making them impractical as rapid-forecasting policy-decision tools or even as classroom educational tools. In addition, platform dependence, specific input/output data structures, and the lack of dynamic data interaction with pluggable software components inside existing proprietary frameworks restrict these models to specialized user groups. RWater is a web-based hydrologic analysis and modeling framework that utilizes the widely used R software within the HUBzero cyberinfrastructure of Purdue University. RWater is designed as an integrated framework for distributed hydrologic simulation, along with subsequent parameter optimization and visualization. RWater provides a platform-independent web-based interface, flexible data integration capacity, grid-based simulations, and user extensibility. RWater uses RStudio to simulate hydrologic processes on raster-based data obtained through conventional GIS pre-processing. The program integrates the Shuffled Complex Evolution (SCE) algorithm for parameter optimization. Moreover, RWater enables users to produce descriptive statistics and visualizations of the outputs at different temporal resolutions. The applicability of RWater is demonstrated by application to two watersheds in Indiana for multiple rainfall events.
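The calibration loop that such a framework automates can be illustrated with a toy model: a single linear reservoir whose recession parameter k is fitted to "observed" flows by minimizing squared error. Everything below is an illustrative assumption (the model, the data, and a plain random search standing in for the Shuffled Complex Evolution algorithm, which is a more sophisticated population-based optimizer):

```python
# Hedged sketch of rainfall-runoff parameter calibration. A simple seeded
# random search stands in for SCE; model and data are invented.
import random

def simulate(rain, k, storage=0.0):
    """Linear reservoir: each step, outflow q = k * storage."""
    flows = []
    for p in rain:
        storage += p          # rainfall enters storage
        q = k * storage       # outflow proportional to storage
        storage -= q
        flows.append(q)
    return flows

def sse(a, b):
    """Sum of squared errors between two flow series."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

rain = [5.0, 0.0, 10.0, 2.0, 0.0, 0.0]
observed = simulate(rain, 0.3)            # synthetic "truth" with k = 0.3

random.seed(0)
best_k = min((random.uniform(0.01, 0.99) for _ in range(2000)),
             key=lambda k: sse(simulate(rain, k), observed))
print(f"calibrated k = {best_k:.3f}")
```

A real SCE implementation partitions candidate parameter sets into complexes that evolve and shuffle, but the objective-function plumbing, simulate, compare, update, is the same as in this sketch.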
Abstract:
This manuscript empirically assesses the effects of political institutions on economic growth. It analyzes how political institutions affect economic growth at different stages of democratization and economic development by means of dynamic panel estimation with interaction terms. The new empirical results show that political institutions work as a substitute for democracy in promoting economic growth. In other words, political institutions are important for increasing economic growth mainly when democracy is not consolidated. Moreover, political institutions are extremely relevant to economic outcomes in periods of transition to democracy and in poor countries with high ethnic fractionalization.
Abstract:
The purpose of this work is to provide a brief overview of the literature on the optimal design of unemployment insurance systems, analyzing some of the most influential articles published on the subject over the last three decades, and to extend the main results to an environment with multiple aggregate shocks. The properties of optimal contracts are discussed in light of the key assumptions commonly made in theoretical publications in the area, and the implications of relaxing each of these hypotheses are considered as well. The analysis of models with a single unemployment spell starts from the seminal work of Shavell and Weiss (1979). In a simple and common setting, unemployment benefit policies, wage taxes, and search effort assignments are covered. Further, the idea that the UI distortion of the relative price of leisure and consumption is the only explanation for the marginal incentives to search for a job is discussed, putting into question the reduction in labor supply caused by social insurance, usually interpreted solely as evidence of dynamic moral hazard caused by a substitution effect. In addition, the paper presents a characterization of optimal unemployment insurance contracts in environments in which workers experience multiple unemployment spells. Finally, an extension to an environment with multiple aggregate shocks is considered. The paper ends with a numerical analysis of the implications of i.i.d. shocks for the optimal unemployment insurance mechanism.
Abstract:
Over the last few decades, the analysis of the transmission of international financial events has become the subject of many academic studies focused on multivariate volatility models. The goal of this study is to evaluate financial contagion between stock market returns. The econometric approach employed was originally presented by Pelletier (2006) under the name Regime Switching Dynamic Correlation (RSDC). This methodology combines the Constant Conditional Correlation (CCC) model proposed by Bollerslev (1990) with the Markov regime switching model suggested by Hamilton and Susmel (1994). A modification was made to the original RSDC model: the GJR-GARCH model formulated by Glosten, Jagannathan, and Runkle (1993) was introduced in the equation of the conditional univariate variances, to allow asymmetric effects in volatility to be captured. The database was built from the daily closing stock market indices of the United States (S&P 500), United Kingdom (FTSE 100), Brazil (IBOVESPA), and South Korea (KOSPI) for the period from 02/01/2003 to 09/20/2012. Throughout the work the methodology was compared with others more widespread in the literature, and the RSDC model with two regimes was found to be the most appropriate for the selected sample. The set of results provides evidence for the existence of financial contagion between the markets of the four countries under the World Bank's "very restrictive" definition of financial contagion. Such a conclusion should be evaluated carefully, considering the wide diversity of definitions of contagion in the literature.
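The asymmetric variance equation introduced into the RSDC model above can be sketched as a GJR-GARCH(1,1) recursion, in which the leverage coefficient gamma adds to the ARCH term only after negative returns, so bad news raises next-period volatility more than good news of the same size. Parameters and returns below are illustrative, not estimates from the study:

```python
# Hedged sketch of the GJR-GARCH(1,1) conditional-variance recursion:
#   h_t = omega + (alpha + gamma * 1[r_{t-1} < 0]) * r_{t-1}^2 + beta * h_{t-1}
# All parameter values and returns are illustrative assumptions.
def gjr_garch_variance(returns, omega, alpha, gamma, beta, h0):
    """Return the conditional-variance series h_1..h_T given returns r_1..r_T."""
    h = [h0]
    for r in returns[:-1]:
        leverage = gamma if r < 0 else 0.0   # asymmetry: active only after bad news
        h.append(omega + (alpha + leverage) * r * r + beta * h[-1])
    return h

returns = [0.01, -0.02, 0.005, -0.01]
h = gjr_garch_variance(returns, omega=1e-6, alpha=0.05, gamma=0.10, beta=0.90, h0=1e-4)
for t, ht in enumerate(h, start=1):
    print(f"h_{t} = {ht:.3e}")
```

With gamma = 0 the recursion collapses to a plain GARCH(1,1), which is why the gamma coefficient is read directly as the size of the asymmetric effect.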
Abstract:
After a decade of rapid economic growth in the first decade of the 21st century, Brazil and Turkey were considered two of the most dynamic and promising emerging economies. However, several signs of economic difficulty and political tension have recently reappeared, simultaneously, in both countries. We believe that these signs, and their simultaneity, can be better understood with a retrospective look at the economic history of the two countries, which turns out to be surprisingly parallel. In the first part, we undertake a comprehensive comparison of Brazilian and Turkish economic history to show the many similarities between the economic policy challenges the two countries faced, as well as between the responses they gave to them, from the onset of the Great Depression to the first decade of the 21st century. These common economic policy choices shape a remarkably analogous development trajectory, characterized first by the adoption of the import substitution industrialization (ISI) model in the context of the world recession of the 1930s; then by the intensification and final crisis of that model in the 1980s; and finally by two decades of stabilization and transition to a more liberal economic model. In the second part, the development of economic and political institutions, as well as of the underlying political economy of the two countries, is analyzed comparatively in order to provide some elements of explanation for the parallel observed in the first part. We argue that the institutional framework established in the two countries during this period also has several fundamental characteristics in common and helps to explain the comparable economic policy choices and economic performance detailed in the first part. This study addresses elements of the historical context useful for understanding the current economic and political situation in both countries. It also constitutes an attempt to consider emerging economies in a broader historical and comparative perspective, in order to better understand their institutional weaknesses and to adopt a more balanced view of their economic potential.