889 results for Smoke screens


Relevance:

10.00%

Publisher:

Abstract:

Cytokine-driven signalling shapes immune homeostasis and guides inflammatory responses mainly through induction of specific gene expression programmes both within and outside the immune cell compartment. These transcriptional outputs are often amplified via cytokine synergy, which sets a stimulatory threshold that safeguards against exacerbated inflammation and immunopathology. In this study, we investigated the molecular mechanisms underpinning synergy between two pivotal Th1 cytokines, IFN-γ and TNF-α, in human intestinal epithelial cells. These two proinflammatory mediators induce a unique state of signalling and transcriptional synergy implicated in processes such as antiviral and antitumour immunity, intestinal barrier and pancreatic β-cell dysfunction. Since its discovery more than 30 years ago, this biological phenomenon remains, however, only partially defined. Here, using a functional genomics approach including RNAi perturbation screens and small-molecule inhibitors, we identified two new regulators of IFN-γ/TNF-α-induced chemokine and antiviral gene and protein expression, the Bcl-2 family protein BCL-G and the histone demethylase UTX. We also discovered that IFN-γ/TNF-α synergise to trigger a coordinated shutdown of expression of major receptor tyrosine kinases in colon cancer cells. Together, these findings extend our current understanding of how IFN-γ/TNF-α synergy elicits qualitatively and quantitatively distinct outputs in the intestinal epithelium. Given the well-documented role of this synergistic state in the immunopathology of various disorders, our results may help to inform the identification of high-quality, biologically relevant druggable targets for diseases characterised by an IFN-γ/TNF-α-high immune signature.
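
As an aside for readers unfamiliar with how such more-than-additive induction is commonly quantified, a minimal Python sketch follows; the fold-change values and the simple additive-expectation formula are illustrative assumptions, not data or methods from the study above.

import numpy as np

# Hypothetical fold-change measurements for one cytokine-induced gene,
# relative to untreated cells; values are invented for illustration.
fc_ifng = 4.0      # IFN-gamma alone
fc_tnfa = 3.0      # TNF-alpha alone
fc_combo = 35.0    # IFN-gamma + TNF-alpha together

# A simple synergy index: observed combined induction divided by the
# additive expectation from the single treatments.
additive_expectation = fc_ifng + fc_tnfa - 1.0  # subtract the shared 1x baseline
synergy_index = fc_combo / additive_expectation

print(f"additive expectation: {additive_expectation:.1f}-fold")
print(f"observed combination: {fc_combo:.1f}-fold")
print(f"synergy index: {synergy_index:.1f} (>1 indicates more-than-additive induction)")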

Relevance:

10.00%

Publisher:

Abstract:

While there are many reasons to continue to smoke in spite of its consequences for health, the concern that many smoke because they misperceive the risks of smoking remains a focus of public discussion and motivates tobacco control policies and litigation. In this paper we investigate the relative accuracy of mature smokers' risk perceptions about future survival and a range of morbidities and disabilities. Using data from the Survey on Smoking (SOS) conducted for this research, we compare subjective beliefs elicited from the SOS with corresponding individual-specific objective probabilities estimated from the Health and Retirement Study. Overall, consumers in the age group studied, 50-70, are not overly optimistic in their perceptions of health risk. If anything, smokers tend to be relatively pessimistic about these risks. The finding that smokers are either well informed or pessimistic regarding a broad range of health risks suggests that these beliefs are not pivotal in the decision to continue smoking. Although statements by the tobacco companies may have been misleading and thus encouraged some to start smoking, we find no evidence that systematic misinformation about the health consequences of smoking inhibits quitting.
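
To make the comparison of subjective and objective risks concrete, here is a minimal Python sketch; the paired probabilities and the simple bias measure are invented for illustration and do not reproduce the SOS or Health and Retirement Study analysis.

import numpy as np

# Illustrative paired beliefs for a handful of respondents: each row is
# (subjective survival probability, model-based objective probability).
smokers = np.array([
    [0.55, 0.62],
    [0.40, 0.58],
    [0.60, 0.65],
])
nonsmokers = np.array([
    [0.80, 0.78],
    [0.75, 0.74],
    [0.85, 0.80],
])

def mean_bias(pairs):
    """Mean of (subjective - objective); negative values indicate pessimism."""
    return float(np.mean(pairs[:, 0] - pairs[:, 1]))

print("smokers bias:", round(mean_bias(smokers), 3))      # negative -> pessimistic
print("nonsmokers bias:", round(mean_bias(nonsmokers), 3))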

Relevance:

10.00%

Publisher:

Abstract:

Smoking is an expensive habit. Smoking households spend, on average, more than US$1000 annually on cigarettes. When a family member quits, in addition to the former smoker's improved long-term health, families benefit because savings from reduced cigarette expenditures can be allocated to other goods. For households in which some members continue to smoke, smoking expenditures crowd out other purchases, which may affect other household members as well as the smoker. We empirically analyse how expenditures on tobacco crowd out consumption of other goods, estimating the patterns of substitution and complementarity between tobacco products and other categories of household expenditure. We use the Consumer Expenditure Survey data for the years 1995-2001, which we complement with regional price data and state cigarette prices. We estimate a consumer demand system that includes several main expenditure categories (cigarettes, food, alcohol, housing, apparel, transportation, medical care) and controls for socioeconomic variables and other sources of observable heterogeneity. Descriptive data indicate that smokers spend less on housing than nonsmokers. Results from the demand system indicate that as the price of cigarettes rises, households increase the quantity of food purchased and, in some samples, reduce the quantity of apparel and housing purchased.
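
A stylized single-equation sketch of the kind of cross-price relationship estimated in a demand system is shown below; the simulated data, variable names and specification are assumptions for illustration and are far simpler than the multi-category system with socioeconomic controls described above.

import numpy as np

rng = np.random.default_rng(0)

# Simulated household data (illustrative only): log cigarette price, log total
# expenditure, and the budget share spent on food.
n = 500
log_cig_price = rng.normal(1.5, 0.2, n)
log_total_exp = rng.normal(10.0, 0.5, n)
food_share = 0.20 + 0.05 * log_cig_price - 0.01 * log_total_exp + rng.normal(0, 0.02, n)

# Stylized share equation in the spirit of a demand system:
# w_food = a + b * ln(p_cig) + c * ln(total expenditure) + error.
# A positive b means food purchases rise as cigarettes become more expensive.
X = np.column_stack([np.ones(n), log_cig_price, log_total_exp])
coef, *_ = np.linalg.lstsq(X, food_share, rcond=None)
print(f"estimated cross-price coefficient on ln(cigarette price): {coef[1]:.3f}")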

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: There is considerable interest in the development of methods to efficiently identify all coding variants present in large sample sets of humans. Three approaches are possible: whole-genome sequencing, whole-exome sequencing using exon capture methods, and RNA-Seq. While whole-genome sequencing is the most complete, it remains sufficiently expensive that cost-effective alternatives are important. RESULTS: Here we provide a systematic exploration of how well RNA-Seq can identify human coding variants by comparing variants identified through high-coverage whole-genome sequencing to those identified by high-coverage RNA-Seq in the same individual. This comparison allowed us to directly evaluate the sensitivity and specificity of RNA-Seq in identifying coding variants, and to evaluate how key parameters such as the degree of coverage and the expression levels of genes interact to influence performance. We find that although only 40% of exonic variants identified by whole-genome sequencing were captured using RNA-Seq, this number rose to 81% when concentrating on genes known to be well expressed in the source tissue. We also find that a high false positive rate can be problematic when working with RNA-Seq data, especially at higher levels of coverage. CONCLUSIONS: We conclude that as long as a tissue relevant to the trait under study is available and suitable quality control screens are implemented, RNA-Seq is a fast and inexpensive alternative approach for finding coding variants in genes with sufficiently high expression levels.
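
The core sensitivity comparison described above can be illustrated with a short Python sketch; the variant keys and calls below are placeholders, not the study's data.

# Minimal sketch, assuming variants are keyed by (chromosome, position, alt allele).
wgs_variants = {("chr1", 1000, "A"), ("chr1", 2500, "T"), ("chr2", 300, "G"),
                ("chr3", 42, "C")}
rnaseq_variants = {("chr1", 1000, "A"), ("chr2", 300, "G"), ("chr2", 999, "T")}

true_positives = wgs_variants & rnaseq_variants            # found by both
false_negatives = wgs_variants - rnaseq_variants           # missed by RNA-Seq
putative_false_positives = rnaseq_variants - wgs_variants  # RNA-Seq only

sensitivity = len(true_positives) / len(wgs_variants)
print(f"sensitivity vs WGS: {sensitivity:.0%}")
print(f"RNA-Seq-only calls (candidate false positives): {len(putative_false_positives)}")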

Relevance:

10.00%

Publisher:

Abstract:

OBJECTIVES: To develop a rat model of sleep hypoxia (SH) in emphysema (SHE) and to explore whether SHE results in more severe hepatic inflammation than emphysema alone, and whether this inflammation changes levels of coagulant/anticoagulant factors synthesized in the liver. METHODS: Seventy-five rats were divided into 5 groups: SH control (SHCtrl), treated with sham smoke exposure (16 weeks) and SH exposure (12.5% O2, 3 h/d, during the latter 8 weeks); emphysema control (ECtrl), smoke exposure and sham SH exposure (21% O2); short SHE (SHEShort), smoke exposure and short SH exposure (1.5 h/d); mild SHE (SHEMild), smoke exposure and mild SH exposure (15% O2); and standard SHE (SHEStand), smoke exposure and SH exposure. The ECtrl, SHEShort, SHEMild and SHEStand groups thus constituted the emphysematous groups. Arterial blood gas (ABG) data were obtained during preliminary tests. After exposure, hepatic inflammation (interleukin-6 [IL-6] mRNA and protein, tumor necrosis factor α [TNFα] mRNA and protein) and liver coagulant/anticoagulant factors (antithrombin [AT], fibrinogen [FIB] and Factor VIII [FVIII]) were evaluated. SPSS 11.5 software was used for statistical analysis. RESULTS: Characteristics of emphysema were evident in the emphysematous groups, and ABGs met SH criteria during hypoxia exposure. Hepatic inflammation parameters and coagulant factors were lowest in SHCtrl and highest in SHEStand, whereas AT was highest in SHCtrl and lowest in SHEStand. Hepatic inflammatory cytokines correlated positively with coagulant factors and negatively with AT. CONCLUSIONS: When SH is combined with emphysema, hepatic inflammation and coagulability enhance each other synergistically, producing a more pronounced liver-derived inflammatory and prothrombotic status.
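
The reported correlations between hepatic cytokines and coagulant/anticoagulant factors amount to simple pairwise tests; a minimal Python sketch follows, using SciPy rather than the SPSS 11.5 mentioned above and invented per-rat values.

import numpy as np
from scipy import stats

# Illustrative per-rat values (not the study's data): hepatic IL-6 protein,
# fibrinogen (FIB) and antithrombin (AT).
il6 = np.array([1.2, 1.5, 2.1, 2.8, 3.4, 3.9])
fib = np.array([2.0, 2.2, 2.6, 3.1, 3.5, 3.8])
at = np.array([95, 90, 84, 77, 70, 66])

r_fib, p_fib = stats.pearsonr(il6, fib)  # expected positive
r_at, p_at = stats.pearsonr(il6, at)     # expected negative

print(f"IL-6 vs FIB: r = {r_fib:.2f}, p = {p_fib:.3f}")
print(f"IL-6 vs AT:  r = {r_at:.2f}, p = {p_at:.3f}")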

Relevance:

10.00%

Publisher:

Abstract:

This is a crucial transition time for human genetics in general, and for HIV host genetics in particular. After years of equivocal results from candidate gene analyses, several genome-wide association studies have been published that looked at plasma viral load or disease progression. Results from other studies that used various large-scale approaches (siRNA screens, transcriptome or proteome analysis, comparative genomics) have also shed new light on retroviral pathogenesis. However, most of the inter-individual variability in response to HIV-1 infection remains to be explained: genome resequencing and systems biology approaches are now required to progress toward a better understanding of the complex interactions between HIV-1 and its human host.

Relevance:

10.00%

Publisher:

Abstract:

We investigated perceptions among overweight and obese state employees about changes to health insurance that were designed to reduce the scope of health benefits for employees who are obese or who smoke. Before implementation of health benefit plan changes, 658 state employees who were overweight (i.e., those with a body mass index [BMI] of 25-29.9) or obese (i.e., those with a BMI ≥ 30) enrolled in a weight-loss intervention study were asked about their attitudes and beliefs concerning the new benefit plan changes. Thirty-one percent of employees with a measured BMI of 40 or greater self-reported a BMI of less than 40, suggesting they were unaware that their current BMI would place them in a higher-risk benefit plan. More than half of all respondents reported that the new benefit changes would motivate them to make behavioral changes, but fewer than half felt confident in their ability to make changes. Respondents with a BMI of 40 or greater were more likely than respondents in lower BMI categories to oppose the new changes focused on obesity (P < .001). Current smokers were more likely than former smokers and nonsmokers to oppose the new benefit changes focused on tobacco use (P < .01). Participants represented a sample of employees enrolled in a weight-loss study, limiting generalizability to the larger population of state employees. Benefit plan changes that require employees who are obese or smoke to pay more for health care may motivate some, but not all, individuals to change their behaviors. Since confidence to lose weight was lowest among individuals in the highest BMI categories, more intensive intervention options may be needed to achieve desired health behavior changes.
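
Since the misclassification described above hinges on the BMI formula and its cut-offs, a small Python sketch is included below; the weights, height and the "higher-risk plan" label are hypothetical.

def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def bmi_category(value: float) -> str:
    # Categories used in the study's framing: overweight 25-29.9, obese >= 30,
    # with BMI >= 40 placed in a higher-risk benefit plan.
    if value >= 40:
        return "BMI >= 40 (higher-risk plan)"
    if value >= 30:
        return "obese"
    if value >= 25:
        return "overweight"
    return "normal or underweight"

# Hypothetical respondent: measured vs self-reported weight differ enough to
# change the plan assignment, illustrating the misclassification above.
measured = bmi(122.0, 1.70)       # about 42.2
self_reported = bmi(112.0, 1.70)  # about 38.8
print(bmi_category(measured), "vs self-report:", bmi_category(self_reported))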

Relevance:

10.00%

Publisher:

Abstract:

Thermal-optical analysis is a conventional method for classifying carbonaceous aerosols as organic carbon (OC) and elemental carbon (EC). This article examines the effects of three different temperature protocols on the measured EC. For analyses of parallel punches from the same ambient sample, the protocol with the highest peak helium-mode temperature (870°C) gives the smallest amount of EC, while the protocol with the lowest peak helium-mode temperature (550°C) gives the largest amount of EC. These differences are observed whether sample transmission or reflectance is used to define the OC/EC split. An important issue is the effect of the peak helium-mode temperature on the relative rate at which different types of carbon with different optical properties evolve from the filter. Analyses of solvent-extracted samples are used to demonstrate that high temperatures (870°C) lead to premature EC evolution in the helium mode. For samples collected in Pittsburgh, this causes the measured EC to be biased low because the attenuation coefficient of pyrolyzed carbon is consistently higher than that of EC. While this problem can be avoided by lowering the peak helium-mode temperature, analyses of wood-smoke-dominated ambient samples and levoglucosan-spiked filters indicate that an overly low helium-mode peak temperature (550°C) allows non-light-absorbing carbon to slip into the oxidizing mode of the analysis. If this carbon evolves after the OC/EC split, it biases the EC measurements high. Given the complexity of ambient aerosols, there is unlikely to be a single peak helium-mode temperature at which both of these biases can be avoided. Copyright © American Association for Aerosol Research.
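
For readers unfamiliar with how the OC/EC split is defined, a toy Python sketch follows; the thermogram values are invented, and the split rule (the optical signal returning to its initial value) is a common convention rather than necessarily the exact implementation used in this work.

import numpy as np

# Toy thermogram (illustrative): carbon evolved per time step and the laser
# transmission signal. Carbon before the split counts as OC, after it as EC.
carbon = np.array([5.0, 8.0, 6.0, 4.0, 3.0, 7.0, 9.0, 2.0])   # ug C per step
transmission = np.array([100, 92, 80, 70, 74, 88, 100, 104])  # arbitrary units

initial_t = transmission[0]
# First index (after the initial darkening) where transmission recovers to its
# starting value defines the split point.
split = next(i for i in range(1, len(transmission)) if transmission[i] >= initial_t)

oc = carbon[:split].sum()
ec = carbon[split:].sum()
print(f"split index: {split}, OC = {oc:.1f} ug, EC = {ec:.1f} ug")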

Relevance:

10.00%

Publisher:

Abstract:

X-ray crystallography is the predominant method for obtaining atomic-scale information about biological macromolecules. Despite the success of the technique, obtaining well-diffracting crystals remains a critical bottleneck between protein and structure. In practice, the crystallization process proceeds through knowledge-informed empiricism. Better physico-chemical understanding remains elusive because of the large number of variables involved, hence little guidance is available to systematically identify solution conditions that promote crystallization. To help determine relationships between macromolecular properties and their crystallization propensity, we have trained statistical models on samples for 182 proteins supplied by the Northeast Structural Genomics consortium. Gaussian processes, which capture trends beyond the reach of linear statistical models, distinguish between two main physico-chemical mechanisms driving crystallization. One is characterized by low levels of side chain entropy and has been extensively reported in the literature. The other identifies specific electrostatic interactions not previously described in the crystallization context. Because evidence for two distinct mechanisms can be gleaned both from crystal contacts and from solution conditions leading to successful crystallization, the model offers future avenues for optimizing crystallization screens based on partial structural information. The availability of crystallization data coupled with structural outcomes analyzed through state-of-the-art statistical models may thus guide macromolecular crystallization toward a more rational basis.
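
As an illustration of the modelling approach referred to above, a minimal Gaussian process classification sketch (using scikit-learn on synthetic features) is given below; the features, labels and kernel choice are assumptions, not the consortium data or the authors' model.

import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Synthetic stand-in for per-protein features (e.g. a side-chain entropy score
# and a surface-charge descriptor) with a binary crystallization outcome.
X = rng.normal(size=(182, 2))
y = ((X[:, 0] < 0.0) | (X[:, 1] > 1.0)).astype(int)  # toy, non-linear rule

# A Gaussian process with an RBF kernel can capture such non-linear trends.
gp = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0), random_state=0)
gp.fit(X, y)
print("training accuracy:", round(gp.score(X, y), 2))
print("P(crystallizes) for a low-entropy example:",
      gp.predict_proba([[-1.5, 0.0]])[0, 1].round(2))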

Relevance:

10.00%

Publisher:

Abstract:

Nowadays, multi-touch devices (MTD) can be found in all kinds of contexts. In the learning context, MTD availability leads many teachers to use them in their classrooms, to support students' use of the devices, or to assume that they will enhance learning processes. Despite the rising interest in MTD, few studies have examined their impact in terms of performance or the suitability of the technology for the learning context. However, even if using touch-sensitive screens rather than a mouse and keyboard seems to be the easiest and fastest way to carry out common learning tasks (for instance, web surfing), the use of MTD may lead to less favourable outcomes. The complexity of generating accurate finger gestures and the split attention this requires (a multi-tasking effect) make interacting with a touch-sensitive screen more difficult than traditional laptop use. More precisely, it is hypothesized that efficacy and efficiency decrease, as do the available cognitive resources, making task engagement more difficult for users. Furthermore, the present study takes into account the moderating effect of previous experience with MTD; two key factors from technology adoption theories were included in the study: familiarity and self-efficacy with the technology. Sixty university students, invited to a usability lab, were asked to perform information search tasks on an online encyclopaedia. The tasks were designed to require the most commonly used mouse actions (e.g. right click, left click, scrolling, zooming, key word encoding…). Two conditions were created: (1) MTD use and (2) laptop use (with keyboard and mouse). The cognitive load, self-efficacy, familiarity and task engagement scales were adapted to the MTD context, and eye-tracking measurements offer additional information about user behaviour and cognitive load. Our study aims to clarify important aspects of MTD usage and its added value compared to a laptop in a student learning context. More precisely, the outcomes will shed light on the suitability of MTD for the processes at stake, the role of previous knowledge in the adoption process, and the user experience with such devices.
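
A minimal sketch of the kind of two-condition comparison such a study might report is shown below; the completion times and the choice of Welch's t-test are illustrative assumptions, not the study's data or analysis plan.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Simulated task-completion times in seconds for the two conditions described
# above (here treated as a between-subjects comparison); values are invented.
laptop = rng.normal(loc=95, scale=15, size=30)
mtd = rng.normal(loc=110, scale=18, size=30)

t, p = stats.ttest_ind(mtd, laptop, equal_var=False)  # Welch's t-test
print(f"mean MTD = {mtd.mean():.0f}s, mean laptop = {laptop.mean():.0f}s")
print(f"Welch t = {t:.2f}, p = {p:.3f}")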

Relevance:

10.00%

Publisher:

Abstract:

In this review, we discuss recent work by the ENIGMA Consortium (http://enigma.ini.usc.edu) - a global alliance of over 500 scientists spread across 200 institutions in 35 countries collectively analyzing brain imaging, clinical, and genetic data. Initially formed to detect genetic influences on brain measures, ENIGMA has grown to over 30 working groups studying 12 major brain diseases by pooling and comparing brain data. In some of the largest neuroimaging studies to date - of schizophrenia and major depression - ENIGMA has found replicable disease effects on the brain that are consistent worldwide, as well as factors that modulate disease effects. In partnership with other consortia including ADNI, CHARGE, IMAGEN and others(1), ENIGMA's genomic screens - now numbering over 30,000 MRI scans - have revealed at least 8 genetic loci that affect brain volumes. Downstream of gene findings, ENIGMA has revealed how these individual variants - and genetic variants in general - may affect both the brain and risk for a range of diseases. The ENIGMA consortium is discovering factors that consistently affect brain structure and function that will serve as future predictors linking individual brain scans and genomic data. It is generating vast pools of normative data on brain measures - from tens of thousands of people - that may help detect deviations from normal development or aging in specific groups of subjects. We discuss challenges and opportunities in applying these predictors to individual subjects and new cohorts, as well as lessons we have learned in ENIGMA's efforts so far.
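
As background on how cohort-level results can be pooled in consortium analyses of this kind, a minimal inverse-variance sketch follows; the effect sizes are invented and this is not ENIGMA's actual pipeline.

import numpy as np

# Fixed-effect, inverse-variance pooling of per-site effect sizes
# (e.g. case-control differences in a regional brain volume).
effects = np.array([-0.12, -0.08, -0.15, -0.05])
std_errors = np.array([0.04, 0.05, 0.06, 0.03])

weights = 1.0 / std_errors ** 2
pooled = np.sum(weights * effects) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))
z = pooled / pooled_se

print(f"pooled effect = {pooled:.3f} (SE {pooled_se:.3f}), z = {z:.2f}")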

Relevance:

10.00%

Publisher:

Abstract:

Over the last decade, multi-touch devices (MTD) have spread across a range of contexts. In the learning context, MTD accessibility leads more and more teachers to use them in their classrooms, assuming that doing so will improve learning activities. Despite this growing interest, only a few studies have focused on the impact of MTD use in terms of performance and suitability in a learning context. However, even if using touch-sensitive screens rather than a mouse and keyboard seems to be the easiest and fastest way to carry out common learning tasks (for instance, web surfing), the use of MTD may lead to less favourable outcomes. More precisely, tasks that require users to generate complex and/or less common gestures may increase extrinsic cognitive load and impair performance, especially for intrinsically complex tasks. It is hypothesized that task and gesture complexity will tax users' cognitive resources and decrease task efficacy and efficiency. Because MTD are supposed to be more appealing, it is also assumed that they will affect cognitive absorption. The present study additionally takes into account users' prior knowledge concerning MTD use and gestures, using experience with MTD as a moderator. Sixty university students were asked to perform information search tasks on an online encyclopedia. Tasks were set up so that users had to generate the most commonly used mouse actions (e.g. left/right click, scrolling, zooming, text encoding…). Two conditions were created, MTD use and laptop use (with mouse and keyboard), in order to compare the two devices. An eye-tracking device was used to measure users' attention and cognitive load. Our study sheds light on some important aspects of MTD use and its added value compared to a laptop in a student learning context.

Relevance:

10.00%

Publisher:

Abstract:

While the number of traditional laptops and computers sold has dipped slightly year over year, manufacturers have developed new hybrid laptops with touch screens to build on the tactile trend. This market is moving quickly to make touch the rule rather than the exception, and sales of these devices have tripled since the launch of Windows 8 in 2012, reaching more than sixty million units sold in 2015. Unlike tablets, which benefit from easy-to-use applications specially designed for tactile interaction, hybrid laptops are intended to be used with regular user interfaces. Hence, one could ask whether tactile interactions are suited to every task and activity performed with such interfaces. Since hybrid laptops are increasingly used in educational settings, this study focuses on information search tasks, which are commonly performed for learning purposes. It is hypothesized that tasks requiring complex and/or less common gestures will increase users' cognitive load and impair task performance in terms of efficacy and efficiency. A study was carried out in a usability laboratory with 30 participants whose prior experience with tactile devices was controlled. They were asked to perform information search tasks on an online encyclopaedia using only the touch screen of a hybrid laptop. Tasks were selected with respect to their level of cognitive demand (the amount of information that had to be maintained in working memory) and the complexity of the gestures needed (left and/or right clicks, zoom, text selection and/or input), and grouped into 4 sets accordingly. Task performance was measured by the number of tasks completed successfully (efficacy) and the time spent on each task (efficiency). Perceived cognitive load was assessed with a questionnaire given after each set of tasks. An eye-tracking device was used to monitor users' attention allocation and to provide objective cognitive load measures based on pupil dilation and the Index of Cognitive Activity. Each experimental run took approximately one hour. The results of this within-subjects design indicate that tasks involving complex gestures led to lower efficacy, especially when the tasks were cognitively demanding. Regarding efficiency, there were no significant differences between sets of tasks except for tasks with low cognitive demand and complex gestures, which required more time to complete. Surprisingly, users who reported the most experience with tactile devices spent more time than less frequent users. Cognitive load measures indicate that participants reported devoting more mental effort to the interaction when they had to use complex gestures.

Relevance:

10.00%

Publisher:

Abstract:

CD4+ T cells are prominent effector cells in controlling Mycobacterium tuberculosis (Mtb) infection but may also contribute to immunopathology. Studies probing the CD4+ T cell response of individuals latently infected with Mtb or patients with active tuberculosis, using either small or proteome-wide antigen screens, have so far revealed a multi-antigenic yet mostly invariable repertoire of immunogenic Mtb proteins. Recent developments in mass spectrometry-based proteomics have highlighted the occurrence of numerous types of post-translational modifications (PTMs) in the proteomes of prokaryotes, including Mtb. Well-known PTMs in Mtb include glycosylation, lipidation and phosphorylation, known regulators of protein function or compartmentalization. Other PTMs include methylation, acetylation and pupylation, involved in protein stability. While all PTMs add variability to the Mtb proteome, relatively little is understood about their role in anti-Mtb immune responses. Here, we review Mtb protein PTMs and methods to assess their role in protective immunity against Mtb. © 2014 van Els, Corbière, Smits, van Gaans-van den Brink, Poelen, Mascart, Meiring and Locht.

Relevance:

10.00%

Publisher:

Abstract:

The purpose of this paper is to demonstrate the potential of the EXODUS evacuation model in building environments. The latest PC/workstation version of EXODUS is described and is also applied to a large hypothetical supermarket/restaurant complex measuring 50 m x 40 m. A range of scenarios is presented where population characteristics (such as size, individual travel speeds, and individual response times), and enclosure configuration characteristics (such as number of exits, size of exits, and opening times of exits) are varied. The results demonstrate a wide range of occupant behavior including overtaking, queuing, redirection, and conflict avoidance. Evacuation performance is measured by a number of model predicted parameters including individual exit flow rates, overall evacuation flow rates, total evacuation time, average evacuation time per occupant, average travel distance, and average wait time. The simulations highlight the profound impact that variations in individual travel speeds and occupant response times have in determining the overall evacuation performance.
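
To illustrate the evacuation performance measures listed above, a minimal Python sketch follows; the per-occupant egress records are invented and this is not EXODUS output.

import statistics

# Hypothetical per-occupant egress records: (exit used, time out in seconds).
records = [("EXIT_A", 35.2), ("EXIT_A", 41.0), ("EXIT_B", 38.5),
           ("EXIT_B", 55.1), ("EXIT_A", 60.3), ("EXIT_B", 72.8)]

total_evac_time = max(t for _, t in records)          # last occupant out
avg_evac_time = statistics.mean(t for _, t in records)

# Average flow rate per exit: occupants through the exit divided by the time
# the exit was in use (first to last occupant through it).
for exit_id in sorted({e for e, _ in records}):
    times = [t for e, t in records if e == exit_id]
    duration = max(times) - min(times) or 1.0  # avoid division by zero
    print(f"{exit_id}: {len(times)} occupants, {len(times)/duration:.2f} persons/s")

print(f"total evacuation time: {total_evac_time:.1f} s, "
      f"average per occupant: {avg_evac_time:.1f} s")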