40 results for Power-to-Gas (P2G)
in University of Queensland eSpace - Australia
Abstract:
An experimental study has been carried out of gas-liquid two-phase flow in a packed bed simulating the conditions of the gas and liquid flows in the lower part of a blast furnace. The localised liquid flow in the presence of gas cross flow, which usually occurs around the cohesive zone and raceway of a blast furnace, was investigated in detail. Such liquid flow is characterised in terms of a liquid shift distance or liquid shift angle that can be measured effectively in the present experiments. The liquid shift angle is found not to vary significantly with packing depth. This finding supports the force balance model, in which a vectorial relationship exists between the acting forces (gas drag, gravity and solid-liquid friction) and the liquid shift angle. The liquid shift angle is inversely proportional to particle size and liquid density, proportional to the square of the gas superficial velocity, and almost independent of liquid flowrate and liquid viscosity. The gas-liquid drag coefficient, an important quantity for characterising the interaction between gas and liquid flows, was conceptually modified to account for the discrete nature of liquid flow through a packed bed and evaluated by combined theoretical and experimental investigation. The measurements suggest that the gas-liquid drag coefficient is approximately constant (C'_DG = 5.4 ± 1.0) and independent of liquid properties, gas velocity and packing structure. The result agrees well with previous experimental data and with predictions of the existing liquid flow model.
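The proportionalities above follow naturally from the force-balance picture. A minimal Python sketch of that relationship is given below; the scaling of the drag and gravity terms, the prefactors and the example values are illustrative assumptions rather than the authors' correlation.

```python
import numpy as np

def liquid_shift_angle(u_g, d_p, rho_l, rho_g=1.2, c_dg=5.4, g=9.81):
    """Illustrative force-balance estimate of the liquid shift angle (degrees).

    Assumes the gas drag force per unit liquid volume scales as
    c_dg * rho_g * u_g**2 / d_p and gravity as rho_l * g, so that
    tan(theta) = drag / gravity. Only the trends matter here: theta grows
    with u_g**2 and falls with particle size d_p and liquid density rho_l,
    as reported in the abstract.
    """
    f_drag = c_dg * rho_g * u_g**2 / d_p   # gas drag term (assumed scaling)
    f_grav = rho_l * g                     # gravitational term
    return np.degrees(np.arctan2(f_drag, f_grav))

# Example: 2 m/s superficial gas velocity, 10 mm packing, slag-like liquid density
print(liquid_shift_angle(u_g=2.0, d_p=0.01, rho_l=2600.0))
```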
Abstract:
In Ruddock and Others v Vadarlis and Others the Federal Court had to balance two fundamental but competing rights: the right of the state to secure its frontiers and the right of individuals not to be subjected to unlawful detention. The Court's task was hampered by intense public debate over the illegal refugee crisis. In the wake of 11 September 2001 and the Tampa crisis, the Federal Government rushed through several amendments to migration laws and border protection legislation.
Abstract:
A review is given of fundamental studies of gas-carbon reactions using electronic structure methods over the last several decades. The three classes of electronic structure methods, namely semi-empirical, ab initio and density functional theory methods, are briefly introduced first, followed by studies of carbon reactions with hydrogen and oxygen-containing gases (non-catalysed and catalysed). Problems yet to be solved and promising directions for future work are discussed.
Abstract:
A narrow absorption feature in an atomic or molecular gas (such as iodine or methane) is used as the frequency reference in many stabilized lasers. As part of the stabilization scheme, an optical frequency dither is applied to the laser. In optical heterodyne experiments this dither is transferred to the RF beat signal, reducing the spectral power density and hence the signal-to-noise ratio relative to that in the absence of dither. We removed the dither by mixing the raw beat signal with a dithered local oscillator signal. When the dither waveform is matched to that of the reference laser, the output signal from the mixer is rendered dither free. Applying this method to a Winters iodine-stabilized helium-neon laser reduced the bandwidth of the beat signal from 6 MHz to 390 kHz, thereby lowering the detection threshold from 5 pW of laser power to 3 pW. In addition, a simple signal detection model is developed which predicts similar threshold reductions.
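A minimal numpy sketch of the mixing step described above, assuming an ideal multiplying mixer and illustrative frequencies; because the local oscillator carries the same dither waveform as the beat signal, the dither cancels in the difference-frequency product.

```python
import numpy as np

fs = 50e6                      # sample rate (Hz), illustrative
t = np.arange(0, 2e-3, 1 / fs)
f_beat, f_lo = 5e6, 1e6        # nominal beat and local-oscillator frequencies (Hz)
f_dither, dev = 1e3, 3e6       # dither rate and peak frequency deviation (Hz)

# Common dither phase applied to both the reference laser beat and the LO
dither_phase = (dev / f_dither) * np.sin(2 * np.pi * f_dither * t)

beat = np.cos(2 * np.pi * f_beat * t + dither_phase)   # raw, dithered beat signal
lo = np.cos(2 * np.pi * f_lo * t + dither_phase)       # matched, dithered LO

# Multiplying mixer: the difference-frequency product at f_beat - f_lo is
# dither free (the common phase cancels); the sum-frequency product, which
# carries twice the dither, is removed by low-pass filtering in practice.
mixed = beat * lo
```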
Abstract:
The University of Queensland, Australia has developed Fez, a world-leading user interface and management system for Fedora-based institutional repositories, which bridges the gap between a repository and its users. Christiaan Kortekaas, Andrew Bennett and Keith Webster will review this open source software, which gives institutions the power to create a comprehensive repository solution without the hassle.
Abstract:
The aim of this study was to explore the feasibility of an exercise scientist (ES) working in general practice to promote physical activity (PA) to 55 to 70 year old adults. Participants were randomised into one of three groups: brief verbal and written advice from a general practitioner (GP) (G1, N=9), or individualised counselling and follow-up telephone calls from an ES, either without a pedometer (G2, N=11) or with one (G3, N=8). PA levels were assessed at week 1, after the 12-wk intervention and again at 24 weeks. After the 12-wk intervention, the average increase in PA was 116 (SD=237) min/wk; N=28, p < 0.001. Although there were no statistically significant between-group differences, the average increases in PA among G2 and G3 participants were 195 (SD=207) and 138 (SD=315) min/wk respectively, compared with no change (0.36, SD=157) in G1. After 24 weeks, average PA levels remained 56 (SD=129) min/wk higher than in week 1. The small number of participants in this feasibility study limits the power to detect significant differences between groups, but it would appear that individualised counselling and follow-up contact from an ES, with or without a pedometer, can result in substantial changes in PA levels. A larger study is now planned to confirm these findings.
Abstract:
We investigate the size and power properties of the AH test of evolutionary change. This involves examining whether the size results are sensitive to both the number of individual frequencies estimated and the spectral shape adopted under the null hypothesis. The power tests examine whether the test has good power to detect shifts in both spectral position (variance) and spectral shape (autocovariance structure).
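As a generic illustration of how empirical size and power figures of this kind are obtained (this is a Monte Carlo stand-in, not the AH test itself), the sketch below estimates rejection rates under the null and under a shift in variance using an ordinary F-test for equality of variances.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, reps, alpha = 128, 2000, 0.05

def reject(x, y):
    # Two-sided F-test for equal variances (a simple stand-in statistic)
    f = np.var(x, ddof=1) / np.var(y, ddof=1)
    p = 2 * min(stats.f.cdf(f, n - 1, n - 1), stats.f.sf(f, n - 1, n - 1))
    return p < alpha

# Empirical size: both series drawn from the same (null) process
size = np.mean([reject(rng.normal(0, 1, n), rng.normal(0, 1, n))
                for _ in range(reps)])

# Empirical power: the second series has a shifted variance (spectral position)
power = np.mean([reject(rng.normal(0, 1, n), rng.normal(0, 1.5, n))
                 for _ in range(reps)])

print(f"size ~ {size:.3f}, power ~ {power:.3f}")
```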
Abstract:
Rheumatic fever (RF)/rheumatic heart disease (RHD) and post-streptococcal glomerulonephritis are thought to be autoimmune diseases that follow group A streptococcal (GAS) infection. Different GAS M types have been associated with rheumatogenicity or nephritogenicity and categorized into one of two distinct classes (I or II) based on amino acid sequences present within the repeat region ('C' repeats) of the M protein. Sera from acute RF (ARF) patients have previously been shown to contain elevated levels of antibodies to the class I-specific epitope and to myosin, with the class I-specific antibodies also cross-reacting with myosin, suggesting a disease association. This study shows that immunoreactivity to the class I-specific peptide and myosin does not differ between controls and ARF/RHD patients in populations that are highly endemic for GAS, raising the possibility that the association reflects GAS exposure rather than the presence of ARF/RHD. Peptide inhibition studies suggest that the class I epitope may be conformational and that residue 10 of the peptide is critical for antibody binding. We demonstrate that the correlation of antibody levels between the class I and class II epitopes is due to class II-specific antibodies recognizing an epitope shared with class I, contained within the sequence RDL-ASRE. Our results suggest that antibody prevalence to the class I and II epitopes and to myosin is associated with GAS exposure, and that antibodies to these epitopes are neither an indicator of disease nor a pathogenic factor in endemic populations.
Abstract:
Using the work and ideas of the French theorist Michel Foucault, the writer examines s 3LA of the Crimes Act, which gives law enforcement officers the power to compel a person to reveal their private encryption keys and other personal information, and concludes that the section creates fear, redirects the flow of power between law enforcement agencies and citizens, and generates resistance.
Abstract:
Genetic assignment methods use genotype likelihoods to draw inference about where individuals were or were not born, potentially allowing direct, real-time estimates of dispersal. We used simulated data sets to test the power and accuracy of Monte Carlo resampling methods in generating statistical thresholds for identifying F0 immigrants in populations with ongoing gene flow, and hence for providing direct, real-time estimates of migration rates. The identification of accurate critical values required that resampling methods preserved the linkage disequilibrium deriving from recent generations of immigrants and reflected the sampling variance present in the data set being analysed. A novel Monte Carlo resampling method taking into account these aspects was proposed and its efficiency was evaluated. Power and error were relatively insensitive to the frequency assumed for missing alleles. Power to identify F0 immigrants was improved by using large sample size (up to about 50 individuals) and by sampling all populations from which migrants may have originated. A combination of plotting genotype likelihoods and calculating mean genotype likelihood ratios (D_LR) appeared to be an effective way to predict whether F0 immigrants could be identified for a particular pair of populations using a given set of markers.
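A minimal sketch of the thresholding idea, assuming known allele frequencies at unlinked loci and a simple product-of-frequencies genotype likelihood; the resampling here is plain parametric simulation from the home population, not the authors' modified method that preserves immigrant-derived linkage disequilibrium.

```python
import numpy as np

rng = np.random.default_rng(1)
n_loci, n_sim = 10, 5000

# Hypothetical allele frequencies (allele "A") at each locus in the home population
p_home = rng.uniform(0.2, 0.8, n_loci)

def simulate_genotypes(p, n):
    # Genotypes coded as counts of allele "A" (0, 1, 2) under Hardy-Weinberg
    return rng.binomial(2, p, size=(n, len(p)))

def log_likelihood(geno, p):
    # log P(genotype | population allele frequencies), product over loci
    probs = np.where(geno == 2, p**2,
             np.where(geno == 1, 2 * p * (1 - p), (1 - p)**2))
    return np.log(probs).sum(axis=1)

# Null distribution of the likelihood for true residents, built by resampling
null_ll = log_likelihood(simulate_genotypes(p_home, n_sim), p_home)
critical = np.quantile(null_ll, 0.01)   # 1% exclusion threshold

# An individual whose genotype likelihood falls below the threshold is
# flagged as a putative F0 immigrant
candidate = simulate_genotypes(rng.uniform(0.2, 0.8, n_loci), 1)  # from another population
print(log_likelihood(candidate, p_home)[0] < critical)
```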
Abstract:
There is a general form of an argument which I call the 'argument from vagueness' which attempts to show that objects persist by perduring, via the claim that vagueness is never ontological in nature and thus that composition is unrestricted. I argue that even if we grant that vagueness is always the result of semantic indeterminacy rather than ontological vagueness, and thus also grant that composition is unrestricted, it does not follow that objects persist by perduring. Unrestricted mereological composition lacks the power to ensure that there exist instantaneous objects that wholly overlap persisting objects at times, and thus lacks the power to ensure that there exists anything that could be called a temporal part. Even if we grant that such instantaneous objects exist, however, I argue that it does not follow that objects perdure. To show this I briefly outline a coherent version of three dimensionalism that grants just such an assumption. Thus considerations pertaining to the nature of vagueness need not lead us inevitably to accept perdurantism.
Abstract:
Background: The identification and characterization of genes that influence the risk of common, complex multifactorial disease primarily through interactions with other genes and environmental factors remains a statistical and computational challenge in genetic epidemiology. We have previously introduced a genetic programming optimized neural network (GPNN) as a method for optimizing the architecture of a neural network to improve the identification of gene combinations associated with disease risk. The goal of this study was to evaluate the power of GPNN for identifying high-order gene-gene interactions. We were also interested in applying GPNN to a real data analysis in Parkinson's disease. Results: We show that GPNN has high power to detect even relatively small genetic effects (2-3% heritability) in simulated data models involving two and three locus interactions. The limits of detection were reached under conditions with very small heritability (
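To make the power figures concrete, the sketch below estimates detection power for a simulated two-locus epistatic model using a plain chi-square test on the joint genotype table as the detector; this is only a hedged stand-in for the GPNN analysis, and the penetrance table, allele frequency and sample sizes are illustrative assumptions.

```python
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(2)
n_cases = n_controls = 200
reps, alpha = 200, 0.05

# Illustrative epistatic penetrance table: disease risk depends only on the
# joint genotype at two loci (rows/cols = 0, 1, 2 copies of the minor allele)
penetrance = np.array([[0.05, 0.05, 0.10],
                       [0.05, 0.10, 0.05],
                       [0.10, 0.05, 0.05]])

def draw_sample(n, affected):
    """Rejection-sample two-locus genotypes for n individuals of given status."""
    out = []
    while len(out) < n:
        g = rng.binomial(2, 0.3, size=2)      # two unlinked loci, MAF = 0.3
        risk = penetrance[g[0], g[1]]
        if rng.random() < (risk if affected else 1 - risk):
            out.append(g)
    return np.array(out)

hits = 0
for _ in range(reps):
    cases, controls = draw_sample(n_cases, True), draw_sample(n_controls, False)
    # 2 x 9 contingency table of case/control counts for each joint genotype
    table = np.zeros((2, 9))
    for row, sample in enumerate((cases, controls)):
        idx = sample[:, 0] * 3 + sample[:, 1]
        table[row] = np.bincount(idx, minlength=9)
    table = table[:, table.sum(axis=0) > 0]   # drop genotype classes never observed
    _, p, _, _ = chi2_contingency(table)
    hits += p < alpha

print(f"empirical power ~ {hits / reps:.2f}")
```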
Abstract:
Although coal may be viewed as a dirty fuel because of its high greenhouse emissions when combusted, a strong case can be made for coal to be a major world source of clean H2 energy. Apart from the fact that resources of coal will outlast oil and natural gas by centuries, there is a shift towards developing environmentally benign coal technologies, which can deliver high energy conversion efficiencies and low air pollution emissions compared with conventional coal fired power generation plants. There are currently several world research and industrial development projects in the areas of Integrated Gasification Combined Cycle (IGCC) and Integrated Gasification Fuel Cell (IGFC) systems. In such systems there is a need to integrate complex unit operations including gasifiers, gas separation and cleaning units, water gas shift reactors, turbines, heat exchangers, steam generators and fuel cells. IGFC systems tested in the USA, Europe and Japan employing gasifiers (Texaco, Lurgi and Eagle) and fuel cells have achieved energy conversion at an efficiency of 47.5% (HHV), much higher than the 30-35% efficiency of conventional coal fired power generation. Solid oxide fuel cells (SOFC) and molten carbonate fuel cells (MCFC) are the front runners in energy production from coal gases. These fuel cells can operate at high temperatures and are robust to gas poisoning impurities. IGCC and IGFC technologies are expensive and currently economically uncompetitive compared with established and mature power generation technology. However, further efficiency and technology improvements, coupled with world pressure to limit greenhouse gases and other gaseous pollutants, could make IGCC/IGFC technically and economically viable for hydrogen production and utilisation in clean and environmentally benign energy systems.
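The efficiency gap quoted above translates directly into fuel (and hence CO2) savings per unit of electricity; a small worked comparison, assuming an illustrative coal heating value, is sketched below.

```python
# Fuel required per MWh of electricity at the quoted conversion efficiencies.
# The coal heating value (HHV) below is an illustrative assumption.
MJ_PER_MWH = 3600.0
coal_hhv = 27.0   # MJ/kg, assumed typical bituminous coal (illustrative)

def coal_per_mwh(efficiency):
    """kg of coal burned per MWh of electricity at a given HHV efficiency."""
    return MJ_PER_MWH / (efficiency * coal_hhv)

igfc = coal_per_mwh(0.475)          # IGFC system, 47.5% (HHV)
conventional = coal_per_mwh(0.325)  # midpoint of the quoted 30-35% range

print(f"IGFC:          {igfc:.0f} kg coal / MWh")
print(f"Conventional:  {conventional:.0f} kg coal / MWh")
print(f"Reduction:     {1 - igfc / conventional:.0%}")  # roughly a third less coal (and CO2) per MWh
```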