Abstract:
Thin film nanostructured gas sensors typically operate at temperatures above 400°C, but lower temperature operation is highly desirable, especially for remote area field sensing, as it significantly reduces power consumption. We have investigated a range of sensor materials based on both pure and doped tungsten oxide (mainly focusing on Fe-doping), deposited using both thermal evaporation and electron-beam evaporation, and subjected to a variety of post-deposition annealing treatments. The films show excellent sensitivity for detection of NO2 at operating temperatures as low as 150°C. There is a definite relationship between the sensitivity and the crystallinity and nanostructure obtained through the deposition and heat treatment processes, as well as variations in the conductivity caused by both doping and heat treatment. The ultimate goal of this work is to control the sensing properties, including selectivity to specific gases, through engineering of the electronic properties and the nanostructure of the films.
Abstract:
Quantum-like models can be fruitfully used to model attitude change in a social context. Next steps require data and higher-dimensional models. Here, we discuss an exploratory study that demonstrates an order effect when three question sets about Climate Beliefs, Political Affiliation and Attitudes Towards Science are presented in different orders within a larger study of n=533 subjects. A quantum-like model seems possible, and we propose a new experiment which could be used to test between three possible models for this scenario.
Abstract:
As the cost of mineral fertilisers increases globally, organic soil amendments (OAs) from agricultural sources are increasingly being used as substitutes for nitrogen. However, the impact of OAs on the production of greenhouse gases (CO2 and N2O) is not well understood. A 60-day laboratory incubation experiment was conducted to investigate the impacts of applying OAs (equivalent to 296 kg N ha−1 on average) on N2O and CO2 emissions and soil properties of clay and sandy loam soils from sugar cane production. The experiment included six treatments: an un-amended (UN) control and five OAs, namely raw mill mud (MM), composted mill mud (CM), high N compost (HC), rice husk biochar (RB), and raw mill mud plus rice husk biochar (MB). These OAs were incubated at 60, 75 and 90% water-filled pore space (WFPS) at 25°C, with urea (equivalent to 200 kg N ha−1) added to the soils thirty days after the incubation commenced. Results showed WFPS did not influence CO2 emissions over the 60 days, but the magnitude of emissions as a proportion of C applied was RB < CM < MB < HC
Abstract:
The Escherichia coli umu operon was subcloned into a pKK233-2 vector containing rat glutathione S-transferase (GST) 5-5 cDNA, and the plasmid thus obtained was introduced into Salmonella typhimurium TA1535. The newly developed strain, S. typhimurium NM5004, was found to have 52-fold greater GST activity than the original umu strain, S. typhimurium TA1535/pSK1002. We compared the sensitivities of these two tester strains, NM5004 and TA1535/pSK1002, for induction of umuC gene expression with several dihaloalkanes which are activated or inactivated by GST 5-5 activity. The induction of umuC gene expression by these chemicals was monitored by measuring the cellular β-galactosidase activity produced by the umuC'-lacZ fusion gene in these two tester strains. Ethylene dibromide, 1-bromo-2-chloroethane, 1,2-dichloroethane, and methylene dichloride induced umuC gene expression more strongly in the NM5004 strain than in the original strain. 4-Nitroquinoline 1-oxide and N-methyl-N'-nitro-N-nitrosoguanidine were found to induce umuC gene expression to similar extents in both strains. In the case of 1-nitropyrene and 2-nitrofluorene, however, the NM5004 strain showed weaker umuC gene expression responses than the original TA1535/pSK1002 strain. 1,2-Epoxy-3-(4'-nitrophenoxy)propane, a known substrate for GST 5-5, was found to inhibit umuC induction caused by 1-bromo-2-chloroethane. These results indicate that this new NM5004 tester strain expressing a mammalian GST theta class enzyme may be useful for studies of environmental chemicals proposed to be activated or inactivated by GST activity.
Abstract:
In the 1972 documentary The Computer Generation, by John Musilli, artist Stan Vanderbeek talks about the possibility of the computer as an artist's tool. My aim in drawing on this documentary is to compare the current state of transmedia with previous significant changes in media history, and thereby to illustrate how diverse the current state of transmedia is.
Abstract:
Background: To determine whether changes in appetite and energy intake (EI) can be detected and play a role in the effectiveness of interventions, it is necessary to identify their variability under normal conditions. We assessed the reproducibility of subjective appetite ratings and ad libitum test meal EI after a standardised pre-load in overweight and obese males. Methods: Fifteen overweight and obese males (BMI 30.3 ± 4.9 kg/m2, aged 34.9 ± 10.6 years) completed two identical test days, 7 days apart. Participants were provided with a standardised fixed breakfast (1676 kJ) and, 5 h later, an ad libitum pasta lunch. An electronic appetite rating system was used to assess subjective ratings before and after the fixed breakfast, and periodically during the postprandial period. EI was assessed at the ad libitum lunch meal. Sample size estimates for paired design studies were calculated. Results: Appetite ratings demonstrated a consistent oscillating pattern between test days, and were more reproducible for mean postprandial than fasting ratings. The correlation between ad libitum EI on the two test days was r = 0.78 (P < 0.01). Using a paired design and a power of 0.8, a minimum of 12 participants would be needed to detect a 10 mm change in 5 h postprandial mean ratings, and 17 to detect a 500 kJ difference in ad libitum EI. Conclusion: Intra-individual variability of appetite and ad libitum test meal EI in overweight and obese males is comparable to previous reports in normal weight adults. Sample size requirements for studies vary depending on the parameter of interest and the sensitivity needed.
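The paired-design sample size estimates described above follow from the standard normal-approximation formula n = ((z_{1-α/2} + z_{1-β})·σ_d/δ)², where σ_d is the standard deviation of within-subject differences and δ the change to detect. A minimal sketch of that calculation is below; the σ_d values in the usage lines are hypothetical, chosen only to illustrate the arithmetic, and are not taken from the study:

```python
from math import ceil

def paired_sample_size(sd_diff: float, delta: float) -> int:
    """Approximate n for a paired design to detect a mean within-subject
    difference `delta`, given the SD of paired differences `sd_diff`,
    at two-sided alpha = 0.05 and power = 0.80 (normal approximation)."""
    z_alpha = 1.959964  # two-sided 5% critical value
    z_beta = 0.841621   # z-value for 80% power
    n = ((z_alpha + z_beta) * sd_diff / delta) ** 2
    return ceil(n)

# Hypothetical SDs for illustration only:
print(paired_sample_size(12.0, 10.0))   # 10 mm change in appetite rating -> 12
print(paired_sample_size(720.0, 500.0)) # 500 kJ difference in EI -> 17
```

In practice the normal approximation slightly underestimates n for small samples; a t-based calculation (e.g. iterating on the noncentral t distribution) is preferred when n is expected to be below about 30.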
Abstract:
There is an increasing desire and emphasis to integrate assessment tools into the everyday training environment of athletes. These tools are intended to fine-tune athlete development, enhance performance and aid in the development of individualised programmes for athletes. The areas of workload monitoring, skill development and injury assessment are expected to benefit from such tools. This paper describes the development of an instrumented leg press and its application to testing leg dominance with a cohort of athletes. The developed instrumented leg press is a 45° reclining sled-type leg press with dual force plates, a displacement sensor and a CCD camera. A custom software client was developed using C#. The software client enabled near-real-time display of forces beneath each limb together with displacement of the quad track roller system and video feedback of the exercise. In recording mode, the collection of athlete particulars is prompted at the start of the exercise, and pre-set thresholds are used subsequently to separate the data into epochs from each exercise repetition. The leg press was evaluated in a controlled study of a cohort of physically active adults who performed a series of leg press exercises. The leg press exercises were undertaken at a set cadence with nominal applied loads of 50%, 100% and 150% of body weight without feedback. A significant asymmetry in loading of the limbs was observed in healthy adults during both the eccentric and concentric phases of the leg press exercise (P < .05). Mean forces were significantly higher beneath the non-dominant limb (4–10%) and during the concentric phase of the muscle action (5%). 
Given that symmetrical loading is often emphasized during strength training and remains a common goal in sports rehabilitation, these findings highlight the clinical potential for this instrumented leg press system to monitor symmetry in lower-limb loading during progressive strength training and sports rehabilitation protocols.
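The epoch-separation step described above (pre-set thresholds applied to the force traces to isolate each exercise repetition) can be sketched as simple threshold-crossing segmentation. The original client was written in C#; this is a hypothetical Python illustration of the idea, not the authors' implementation, and the threshold and minimum-length values are assumptions:

```python
def segment_repetitions(force, threshold, min_len=5):
    """Split a sampled force trace into repetition epochs.

    A repetition starts when force rises above `threshold` and ends when
    it falls back to or below it; epochs shorter than `min_len` samples
    are discarded as noise. A repetition still in progress at the end of
    the trace is not returned. Returns (start, end) index pairs.
    """
    epochs, start = [], None
    for i, f in enumerate(force):
        if start is None and f > threshold:
            start = i                      # rising crossing: epoch begins
        elif start is not None and f <= threshold:
            if i - start >= min_len:       # keep only plausible repetitions
                epochs.append((start, i))
            start = None                   # falling crossing: epoch ends

    return epochs

# Two repetitions in a toy force trace (threshold and trace are illustrative):
trace = [0, 0, 10, 12, 11, 10, 9, 0, 0, 10, 11, 12, 10, 9, 0]
print(segment_repetitions(trace, threshold=5, min_len=3))  # [(2, 7), (9, 14)]
```

With epochs in hand, per-limb mean forces over each repetition (and hence the eccentric/concentric asymmetry the study reports) can be computed by averaging the left- and right-plate signals within each (start, end) window.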
Abstract:
Few would disagree that the upstream oil & gas industry has become more technology-intensive over the years. But how does innovation happen in the industry? Specifically, what ideas and inputs flow from which parts of the sector's value network, and where do these inputs go? And how do firms and organizations from different countries contribute differently to this process? This paper puts forward the results of a survey designed to shed light on these questions. Carried out in collaboration with the Society of Petroleum Engineers (SPE), the survey was sent to 469 executives and senior managers who played a significant role with regard to R&D and/or technology deployment in their respective business units. A total of 199 responses were received from a broad range of organizations and countries around the world. Several interesting themes and trends emerge from the results, including: (1) service companies tend to file considerably more patents per innovation than other types of organization; (2) over 63% of the deployed innovations reported in the survey originated in service companies; (3) neither universities nor government-led research organizations were considered to be valuable sources of new information and knowledge in the industry's R&D initiatives; and (4) despite the increasing degree of globalization in the marketplace, the USA still plays an extremely dominant role in the industry's overall R&D and technology deployment activities. By providing a detailed and objective snapshot of how innovation happens in the upstream oil & gas sector, this paper provides a valuable foundation for future investigations and discussions aimed at improving how R&D and technology deployment are managed within the industry. The methodology did result in a coverage bias within the survey, however, and the limitations arising from this are explored.
Abstract:
The upstream oil & gas industry has been contending with massive data sets and monolithic files for many years, but "Big Data"—that is, the ability to apply more sophisticated types of analytical tools to information in a way that extracts new insights or creates new forms of value—is a relatively new concept that has the potential to significantly re-shape the industry. Despite the impressive amount of value being realized by Big Data technologies in other parts of the marketplace, however, much of the data collected within the oil & gas sector tends to be discarded, ignored, or analyzed in a very cursory way. This paper examines existing data management practices in the upstream oil & gas industry and compares them to practices and philosophies that have emerged in organizations leading the Big Data revolution. The comparison shows that, in companies leading the Big Data revolution, data is regarded as a valuable asset in its own right. The presented evidence also shows, however, that this is usually not true within the oil & gas industry, where data is frequently regarded as descriptive information about a physical asset rather than as something valuable in and of itself. The paper then discusses how upstream oil & gas companies could potentially extract more value from data, and concludes with a series of specific technical and management-related recommendations to this end.