Abstract:
In this study, we investigated the relationship between European Union carbon dioxide (CO2) allowance (EUA) prices and oil prices by employing a VAR analysis, the Granger causality test and the impulse response function. If the oil price continues to increase, companies will reduce their dependency on fossil fuels because of rising energy costs. Therefore, the price of EUAs may be affected by variations in oil prices if the greenhouse gases discharged by the consumption of alternative energy are less than those of fossil fuels. No previous studies have investigated these relationships. We analyzed eight types of EUAs (EUA05 to EUA12) using a daily time series data set for 2005-2007 collected from the European Climate Exchange. These eight types differ in their redemption periods. We used the New York Mercantile Exchange light sweet crude price as the oil price. From our examination, we found that only the EUA06 and EUA07 types of EUAs Granger-cause oil prices and vice versa; the other six types of EUAs do not Granger-cause the oil price. These results imply that the types of EUAs with earlier redemption periods are more sensitive to the oil price. Employing the impulse response function, the results showed that a shock to the oil price has a slightly positive effect on all types of EUAs for a very short period. On the other hand, a shock to the EUA price has a slightly negative effect on the oil price, following a positive effect, for the EUA06 and EUA07 types only. Therefore, these results imply that fluctuations in EUA prices and oil prices have little effect on each other. Lastly, we did not consider substitute energy prices in this study, so we plan to include the prices of coal and natural gas in future analyses.
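To make the testing procedure above concrete, here is a minimal sketch of a bivariate Granger-causality test in Python using statsmodels; the file name, column names (eua06, oil) and the lag order of 5 are illustrative assumptions, not values from the study.

```python
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

# Hypothetical daily price file with 'date', 'eua06' and 'oil' columns.
prices = pd.read_csv("eua_oil_daily.csv", parse_dates=["date"], index_col="date")
returns = prices[["eua06", "oil"]].pct_change().dropna()  # difference to (near-)stationarity

# H0: the series in the SECOND column does not Granger-cause the FIRST.
grangercausalitytests(returns[["oil", "eua06"]], maxlag=5)   # EUA06 -> oil?
grangercausalitytests(returns[["eua06", "oil"]], maxlag=5)   # oil -> EUA06?
```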
Abstract:
Enterprise Resource Planning (ERP) software is the dominant strategic platform for supporting enterprise-wide business processes. However, it has been criticised for being inflexible and for not meeting specific organisation and industry requirements. An alternative, Best of Breed (BoB), integrates components of standard package and/or custom software. The objective is to develop enterprise systems that are more closely aligned with the business processes of an organisation. A case study of a BoB implementation facilitates a comparative analysis of the issues associated with this strategy and the single-vendor ERP alternative. The paper illustrates the differences in complexity of implementation, levels of functionality, business process alignment potential, and associated maintenance.
Abstract:
Aim A new method of penumbral analysis is implemented which allows an unambiguous determination of field size and of penumbra size and quality for small fields and other non-standard fields. Both source occlusion and lateral electronic disequilibrium affect the size and shape of cross-axis profile penumbrae; each is examined in detail. Method A new method of penumbral analysis is implemented in which the square of the derivative of the cross-axis profile is plotted. The resultant graph displays two peaks in place of the two penumbrae. This allows a strong visualisation of the quality of a field penumbra, as well as a mathematically consistent method of determining field size (the distance between the two peaks' maxima) and penumbra (the full-width-tenth-maximum of each peak). Cross-axis profiles were simulated in a water phantom at a depth of 5 cm using Monte Carlo modelling, for field sizes between 5 and 30 mm. The field size and penumbra size of each field were calculated using the method above, as well as the traditional definitions set out in IEC 976. The effects of source occlusion and lateral electronic disequilibrium on the penumbrae were isolated by repeating the simulations with electron transport removed and with an electron spot size of 0 mm, respectively. Results All field sizes calculated using the traditional and proposed methods agreed within 0.2 mm. The penumbra size measured using the proposed method was systematically 1.8 mm larger than that of the traditional method at all field sizes. The size of the source had a larger effect on the size of the penumbra than did lateral electronic disequilibrium, particularly at very small field sizes. Conclusion Traditional methods of calculating field size and penumbra proved to be mathematically adequate for small fields. However, the field size definition proposed in this study would be more robust for other non-standard fields, such as flattening-filter-free beams. Source occlusion plays a bigger role than lateral electronic disequilibrium in small-field penumbra size.
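The derivative-squared analysis lends itself to a short implementation. The sketch below, assuming a densely sampled 1D profile, locates the two peaks of (dD/dx)^2, takes the field size as the distance between their maxima and the penumbra as each peak's full-width-tenth-maximum; it illustrates the method as described and is not the authors' code.

```python
import numpy as np

def analyse_profile(x, dose):
    """x: positions in mm; dose: relative cross-axis profile values."""
    d2 = np.gradient(dose, x) ** 2            # squared derivative: one peak per penumbra
    mid = len(x) // 2
    i_left = int(np.argmax(d2[:mid]))         # left-penumbra peak
    i_right = mid + int(np.argmax(d2[mid:]))  # right-penumbra peak
    field_size = x[i_right] - x[i_left]       # distance between the peak maxima

    def fwtm(i_peak):                         # full-width-tenth-maximum of one peak
        thresh = d2[i_peak] / 10.0
        lo = hi = i_peak
        while lo > 0 and d2[lo - 1] >= thresh:
            lo -= 1
        while hi < len(d2) - 1 and d2[hi + 1] >= thresh:
            hi += 1
        return x[hi] - x[lo]

    return field_size, fwtm(i_left), fwtm(i_right)
```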
Abstract:
Optimisation of organic Rankine cycles (ORCs) for binary-cycle applications could play a major role in determining the competitiveness of low- to moderate-temperature renewable sources. An important aspect of the optimisation is to maximise the turbine output power for a given resource. This requires careful attention to the turbine design, notably through numerical simulations. Challenges in the numerical modelling of radial-inflow turbines using high-density working fluids still need to be addressed in order to improve the turbine design and better optimise ORCs. This paper presents preliminary 3D numerical simulations of a high-density radial-inflow ORC turbine under sensible geothermal conditions. Following extensive investigation of the operating conditions and a thermodynamic cycle analysis, the refrigerant R143a is chosen as the high-density working fluid. The 1D design of the candidate radial-inflow turbine is presented in detail. Furthermore, the commercially available software Ansys-CFX is used to perform preliminary steady-state 3D CFD simulations of the candidate R143a radial-inflow turbine for a number of operating conditions, including off-design conditions. The real-gas properties are obtained using the Peng–Robinson equation of state. The thermodynamic ORC cycle is presented. The preliminary design created using the dedicated radial-inflow turbine software Concepts-Rital is discussed, and the 3D CFD results are presented and compared against the meanline analysis.
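For readers unfamiliar with the real-gas model mentioned above, the following is a minimal Peng–Robinson pressure evaluation for R143a; the critical constants and acentric factor are approximate literature values (assumptions here), and a production design tool would use a full property library instead.

```python
import math

R = 8.314462618                           # universal gas constant, J/(mol K)
TC, PC, OMEGA = 345.86, 3.761e6, 0.2615   # R143a: Tc [K], Pc [Pa], acentric factor (approx.)

def pr_pressure(T, v):
    """Peng-Robinson pressure [Pa] at temperature T [K] and molar volume v [m^3/mol]."""
    a = 0.45724 * R**2 * TC**2 / PC
    b = 0.07780 * R * TC / PC
    kappa = 0.37464 + 1.54226 * OMEGA - 0.26992 * OMEGA**2
    alpha = (1.0 + kappa * (1.0 - math.sqrt(T / TC))) ** 2
    return R * T / (v - b) - a * alpha / (v * (v + b) + b * (v - b))

print(pr_pressure(330.0, 5e-4))           # example state point
```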
Abstract:
Increasing concern over the sustainability credentials of food and fiber crops requires that farmers and their supply chain partners have access to appropriate and industry-friendly tools to measure and improve outcomes. This article focuses on one sustainability indicator, namely greenhouse gas (GHG) emissions: nine internationally accredited carbon footprint calculators were identified and compared on an outcomes basis against the same cropping data from a case-study cotton farm. The purpose of this article is to identify the most "appropriate" methodology to be applied by cotton suppliers in this regard. From the analysis of the results, we subsequently propose a new integrated model as the basis for an internationally accredited carbon footprint tool for cotton and show how the model can be applied to evaluate the emission outcomes of different farming practices.
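By way of illustration, calculators of this kind ultimately reduce to multiplying activity data by emission factors and summing to CO2-equivalents; the sketch below shows that arithmetic with entirely hypothetical factors and field names, not values from any accredited tool.

```python
# kg CO2-e per unit of activity -- illustrative placeholders only
EMISSION_FACTORS = {
    "diesel_l": 2.7,          # per litre of diesel burned
    "urea_kg": 4.9,           # per kg of urea fertiliser applied
    "electricity_kwh": 0.9,   # per kWh of grid electricity
}

def farm_footprint(activity):
    """Sum kg CO2-e over activity data, e.g. {'diesel_l': 12000, ...}."""
    return sum(EMISSION_FACTORS[k] * v for k, v in activity.items())

print(farm_footprint({"diesel_l": 12000, "urea_kg": 8000, "electricity_kwh": 30000}))
```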
Abstract:
'Approximate Bayesian Computation' (ABC) represents a powerful methodology for the analysis of complex stochastic systems for which the likelihood of the observed data under an arbitrary set of input parameters may be entirely intractable – the latter condition rendering useless the standard machinery of tractable likelihood-based, Bayesian statistical inference [e.g. conventional Markov chain Monte Carlo (MCMC) simulation]. In this paper, we demonstrate the potential of ABC for astronomical model analysis by application to a case study in the morphological transformation of high-redshift galaxies. To this end, we develop, first, a stochastic model for the competing processes of merging and secular evolution in the early Universe, and secondly, through an ABC-based comparison against the observed demographics of massive (Mgal > 10^11 M⊙) galaxies (at 1.5 < z < 3) in the Cosmic Assembly Near-IR Deep Extragalactic Legacy Survey (CANDELS)/Extended Groth Strip (EGS) data set, we derive posterior probability densities for the key parameters of this model. The 'Sequential Monte Carlo' implementation of ABC exhibited herein, featuring both a self-generating target sequence and a self-refining MCMC kernel, is amongst the most efficient of contemporary approaches to this important statistical algorithm. We highlight as well, through our chosen case study, the value of careful summary statistic selection, and demonstrate two modern strategies for assessment and optimization in this regard. Ultimately, our ABC analysis of the high-redshift morphological mix returns tight constraints on the evolving merger rate in the early Universe and favours major merging (with disc survival or rapid reformation) over secular evolution as the mechanism most responsible for building up the first generation of bulges in early-type discs.
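The Sequential Monte Carlo machinery in the paper builds on the elementary ABC rejection step, which is easy to show in isolation. The sketch below uses a toy one-parameter simulator, summary statistic and tolerance chosen purely for illustration; it is not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta):                            # hypothetical forward model
    return np.array([rng.poisson(theta * 100)])  # one summary statistic

observed = np.array([42])                       # stand-in for the observed summary
tolerance, accepted = 5.0, []

while len(accepted) < 1000:
    theta = rng.uniform(0.0, 1.0)               # draw a candidate from the prior
    # keep candidates whose simulated summaries land near the data
    if np.linalg.norm(simulate(theta) - observed) <= tolerance:
        accepted.append(theta)

print(np.mean(accepted))                        # crude approximate posterior mean
```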
Abstract:
Occupational standards concerning the allowable concentrations of chemical compounds in the ambient air of workplaces have been established in several countries at the national level. With the integration of the European Union, a need exists for establishing harmonized Occupational Exposure Limits. For analytical developments, it is apparent that methods for speciation or fractionation of carcinogenic metal compounds will be of increasing practical importance for standard setting. Criteria of applicability under field conditions, cost-effectiveness, and robustness are practical driving forces for new developments. When the European Union issued a list of 62 chemical substances with Occupational Exposure Limits in 2000, 25 substances received a 'skin' notation. The latter indicates that toxicologically significant amounts may be taken up via the skin. Similar notations exist at national levels. For such substances, monitoring concentrations in ambient air will not be sufficient; biological monitoring strategies will gain further importance in the medical surveillance of workers who are exposed to such compounds. Progress in establishing legal frameworks for the biological monitoring of chemical exposures within Europe is paralleled by scientific advances in this field. A new aspect is the possibility of differential adduct monitoring, using blood proteins of different half-life or lifespan. This technique allows differentiation between long-term mean exposure to reactive chemicals and short-term episodes, for example, by accidental overexposure. For further analytical developments, the following issues have been identified as particularly important: new dose monitoring strategies, sensitive and reliable methods for the detection of DNA adducts, cytogenetic parameters in biological monitoring, methods to monitor exposure to sensitizing chemicals, and parameters for individual susceptibilities to chemical toxicants.
Abstract:
The type of contract model may have a significant influence on achieving project objectives, including environmental and climate change goals. This research investigates non-standard contract models impacting greenhouse gas (GHG) emissions in transport infrastructure construction in Australia. The research is based on the analysis of two case studies: an Early Contractor Involvement (ECI) contract and a Design and Construct (D&C) contract with GHG reduction requirements embedded in the contractor selection. The main findings support the use of ECIs for better integrating decisions made during the planning phase with construction activities, improving environmental outcomes while achieving financial and time savings. Keywords: greenhouse gas reduction; road construction; contracting; ECI; D&C
Abstract:
Managing spinal deformities in young children is challenging, particularly early onset scoliosis (EOS). Surgical intervention is often required if EOS has been unresponsive to conservative treatment, particularly with rapidly progressive curves. An emerging treatment option for EOS is fusionless scoliosis surgery. Similar to bracing, this surgical option potentially harnesses the growth, motion and function of the spine while correcting the spinal deformity. Dual growing rods are one such fusionless treatment, which aims to modulate the growth of the vertebrae. The aim of this study was to ascertain the extent to which semi-constrained growing rods (Medtronic Sofamor Danek, Memphis, TN, USA) with a telescopic sleeve component reduce rotational constraint on the spine compared with standard rigid rods, and hence potentially provide a more physiological mechanical environment for the growing spine. This study found that semi-constrained growing rods would be expected to allow growth via the telescopic rod components while maintaining the axial flexibility of the spine and the improved capacity for final correction.
Abstract:
Meta-analysis is a method to obtain a weighted average of results from various studies. In addition to pooling effect sizes, meta-analysis can also be used to estimate disease frequencies, such as incidence and prevalence. In this article we present methods for the meta-analysis of prevalence. We discuss the logit and double arcsine transformations to stabilise the variance. We note the special situation of multiple category prevalence, and propose solutions to the problems that arise. We describe the implementation of these methods in the MetaXL software, and present a simulation study and the example of multiple sclerosis from the Global Burden of Disease 2010 project. We conclude that the double arcsine transformation is preferred over the logit, and that the MetaXL implementation of multiple category prevalence is an improvement in the methodology of the meta-analysis of prevalence.
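Both transformations are a few lines each; the sketch below follows the standard logit and Freeman–Tukey double-arcsine definitions with their usual variance approximations, and is only an illustration of the quantities discussed, not the MetaXL implementation.

```python
import math

def logit_prev(cases, n):
    """Logit-transformed prevalence and its approximate variance."""
    p = cases / n
    return math.log(p / (1 - p)), 1.0 / cases + 1.0 / (n - cases)

def double_arcsine_prev(cases, n):
    """Freeman-Tukey double-arcsine transform; variance is (almost) independent of p."""
    y = math.asin(math.sqrt(cases / (n + 1))) + math.asin(math.sqrt((cases + 1) / (n + 1)))
    return y, 1.0 / (n + 0.5)

print(logit_prev(30, 200))
print(double_arcsine_prev(30, 200))
```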
Abstract:
This paper proposes a highly reliable fault diagnosis approach for low-speed bearings. The proposed approach first extracts wavelet-based fault features that represent diverse symptoms of multiple low-speed bearing defects. The most useful fault features for diagnosis are then selected by utilizing a genetic algorithm (GA)-based kernel discriminative feature analysis cooperating with one-against-all multicategory support vector machines (OAA MCSVMs). Finally, each support vector machine is individually trained with its own feature vector that includes the most discriminative fault features, offering the highest classification performance. In this study, the effectiveness of the proposed GA-based kernel discriminative feature analysis and the classification ability of the individually trained OAA MCSVMs are addressed in terms of average classification accuracy. In addition, the proposed GA-based kernel discriminative feature analysis is compared with four other state-of-the-art feature analysis approaches. Experimental results indicate that the proposed approach is superior to the other feature analysis methodologies, yielding average classification accuracies of 98.06% and 94.49% under rotational speeds of 50 revolutions per minute (RPM) and 80 RPM, respectively. Furthermore, the individually trained MCSVMs with their own optimal fault features based on the proposed GA-based kernel discriminative feature analysis outperform the standard OAA MCSVMs, showing average accuracies of 98.66% and 95.01% for bearings under rotational speeds of 50 RPM and 80 RPM, respectively.
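As a rough illustration of the selection-plus-classification loop, the sketch below wraps a one-against-all SVM in a feature-subset search; a random search stands in for the GA, and the feature matrix and labels are random placeholders rather than the paper's wavelet features.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))       # placeholder for wavelet-based fault features
y = rng.integers(0, 4, size=200)     # placeholder defect-class labels

def fitness(mask):
    """Cross-validated accuracy of a one-against-all SVM on the selected features."""
    clf = OneVsRestClassifier(SVC(kernel="rbf", gamma="scale"))
    return cross_val_score(clf, X[:, mask], y, cv=5).mean()

best_mask, best_score = None, -1.0
for _ in range(50):                  # random search standing in for the GA loop
    mask = rng.random(X.shape[1]) < 0.5
    if mask.any():
        score = fitness(mask)
        if score > best_score:
            best_mask, best_score = mask, score

print(f"{best_score:.3f} with {int(best_mask.sum())} features selected")
```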
Abstract:
Background Child sexual abuse is considered a modifiable risk factor for mental disorders across the life course. However, the long-term consequences of other forms of child maltreatment have not yet been systematically examined. The aim of this study was to summarise the evidence relating to the possible relationship between child physical abuse, emotional abuse, and neglect, and subsequent mental and physical health outcomes. Methods and Findings A systematic review was conducted using the Medline, EMBASE, and PsycINFO electronic databases up to 26 June 2012. Published cohort, cross-sectional, and case-control studies that examined non-sexual child maltreatment as a risk factor for loss of health were included. All meta-analyses were based on quality-effects models. Out of 285 articles assessed for eligibility, 124 studies satisfied the pre-determined inclusion criteria for meta-analysis. Statistically significant associations were observed between physical abuse, emotional abuse, and neglect and depressive disorders (physical abuse [odds ratio (OR) = 1.54; 95% CI 1.16–2.04], emotional abuse [OR = 3.06; 95% CI 2.43–3.85], and neglect [OR = 2.11; 95% CI 1.61–2.77]); drug use (physical abuse [OR = 1.92; 95% CI 1.67–2.20], emotional abuse [OR = 1.41; 95% CI 1.11–1.79], and neglect [OR = 1.36; 95% CI 1.21–1.54]); suicide attempts (physical abuse [OR = 3.40; 95% CI 2.17–5.32], emotional abuse [OR = 3.37; 95% CI 2.44–4.67], and neglect [OR = 1.95; 95% CI 1.13–3.37]); and sexually transmitted infections and risky sexual behaviour (physical abuse [OR = 1.78; 95% CI 1.50–2.10], emotional abuse [OR = 1.75; 95% CI 1.49–2.04], and neglect [OR = 1.57; 95% CI 1.39–1.78]). Evidence for causality was assessed using the Bradford Hill criteria. While suggestive evidence exists for a relationship between maltreatment and chronic diseases and lifestyle risk factors, more research is required to confirm these relationships. Conclusions This overview of the evidence suggests a causal relationship between non-sexual child maltreatment and a range of mental disorders, drug use, suicide attempts, sexually transmitted infections, and risky sexual behaviour. All forms of child maltreatment should be considered important risks to health, with a sizeable impact on major contributors to the burden of disease in all parts of the world. Awareness of the serious long-term consequences of child maltreatment should encourage better identification of those at risk and the development of effective interventions to protect children from violence.
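For readers unfamiliar with how such pooled odds ratios arise, the sketch below performs plain inverse-variance pooling on the log scale; the quality-effects models used in the paper additionally weight by study quality, and the input numbers here are made up.

```python
import math

def pool_odds_ratios(studies):
    """studies: list of (OR, 95% CI lower, 95% CI upper) triples."""
    num = den = 0.0
    for or_, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE of log(OR) from the CI width
        w = 1.0 / se**2                                   # inverse-variance weight
        num += w * math.log(or_)
        den += w
    return math.exp(num / den)                            # pooled OR

print(pool_odds_ratios([(1.5, 1.1, 2.0), (2.1, 1.4, 3.2), (1.8, 1.2, 2.7)]))
```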
Abstract:
Background Up-to-date evidence on levels and trends for age-sex-specific all-cause and cause-specific mortality is essential for the formation of global, regional, and national health policies. In the Global Burden of Disease Study 2013 (GBD 2013) we estimated yearly deaths for 188 countries between 1990 and 2013. We used the results to assess whether there is epidemiological convergence across countries. Methods We estimated age-sex-specific all-cause mortality using the GBD 2010 methods, with some refinements to improve accuracy, applied to an updated database of vital registration, survey, and census data. We generally estimated cause of death as in the GBD 2010. Key improvements included the addition of more recent vital registration data for 72 countries, an updated verbal autopsy literature review, two new and detailed data systems for China, and more detail for Mexico, UK, Turkey, and Russia. We improved statistical models for garbage code redistribution. We used six different modelling strategies across the 240 causes; cause of death ensemble modelling (CODEm) was the dominant strategy for causes with sufficient information. Trends for Alzheimer's disease and other dementias were informed by meta-regression of prevalence studies. For pathogen-specific causes of diarrhoea and lower respiratory infections we used a counterfactual approach. We computed two measures of convergence (inequality) across countries: the average relative difference across all pairs of countries (Gini coefficient) and the average absolute difference across countries. To summarise broad findings, we used multiple decrement life-tables to decompose probabilities of death from birth to exact age 15 years, from exact age 15 years to exact age 50 years, and from exact age 50 years to exact age 75 years, and life expectancy at birth into major causes. For all quantities reported, we computed 95% uncertainty intervals (UIs). We constrained cause-specific fractions within each age-sex-country-year group to sum to all-cause mortality based on draws from the uncertainty distributions. Findings Global life expectancy for both sexes increased from 65·3 years (UI 65·0–65·6) in 1990 to 71·5 years (UI 71·0–71·9) in 2013, while the number of deaths increased from 47·5 million (UI 46·8–48·2) to 54·9 million (UI 53·6–56·3) over the same interval. Global progress masked variation by age and sex: for children, average absolute differences between countries decreased but relative differences increased. For women aged 25–39 years and older than 75 years, and for men aged 20–49 years and 65 years and older, both absolute and relative differences increased. Decomposition of global and regional life expectancy showed the prominent role of reductions in age-standardised death rates for cardiovascular diseases and cancers in high-income regions, and reductions in child deaths from diarrhoea, lower respiratory infections, and neonatal causes in low-income regions. HIV/AIDS reduced life expectancy in southern sub-Saharan Africa. For most communicable causes of death both numbers of deaths and age-standardised death rates fell, whereas for most non-communicable causes, demographic shifts have increased numbers of deaths but decreased age-standardised death rates. Global deaths from injury increased by 10·7%, from 4·3 million deaths in 1990 to 4·8 million in 2013, but age-standardised rates declined over the same period by 21%.
For some causes with more than 100 000 deaths per year in 2013, age-standardised death rates increased between 1990 and 2013, including HIV/AIDS, pancreatic cancer, atrial fibrillation and flutter, drug use disorders, diabetes, chronic kidney disease, and sickle-cell anaemias. Diarrhoeal diseases, lower respiratory infections, neonatal causes, and malaria are still among the top five causes of death in children younger than 5 years. The most important pathogens are rotavirus for diarrhoea and pneumococcus for lower respiratory infections. Country-specific probabilities of death over the three phases of life varied substantially between and within regions. Interpretation For most countries, the general pattern of reductions in age-sex-specific mortality has been associated with a progressive shift towards a larger share of the remaining deaths caused by non-communicable diseases and injuries. Assessing epidemiological convergence across countries depends on whether an absolute or relative measure of inequality is used. Nevertheless, age-standardised death rates for seven substantial causes are increasing, suggesting the potential for reversals in some countries. Important gaps exist in the empirical data for cause of death estimates for some countries; for example, no national data for India are available for the past decade.
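The two convergence measures used above are straightforward to compute from a vector of country-level rates; in the sketch below the rates are made-up placeholders, and the Gini coefficient is written in its mean-pairwise-difference form.

```python
import numpy as np

def convergence_measures(rates):
    """Gini coefficient and average absolute pairwise difference."""
    rates = np.asarray(rates, dtype=float)
    n = len(rates)
    diffs = np.abs(rates[:, None] - rates[None, :])   # all pairwise |x_i - x_j|
    mean_abs = diffs.sum() / (n * (n - 1))            # average absolute difference
    gini = diffs.mean() / (2.0 * rates.mean())        # average relative difference
    return gini, mean_abs

print(convergence_measures([3.1, 4.7, 2.2, 8.9, 5.5]))
```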
Abstract:
At CRYPTO 2006, Halevi and Krawczyk proposed two randomized hash function modes and analyzed the security of digital signature algorithms based on these constructions. They showed that the security of signature schemes based on the two randomized hash function modes relies on properties similar to second preimage resistance rather than on the collision resistance property of the hash functions. One of the randomized hash function modes was named the RMX hash function mode and was recommended for practical purposes. The National Institute of Standards and Technology (NIST), USA, standardized a variant of the RMX hash function mode and published this standard in Special Publication (SP) 800-106. In this article, we first discuss the generic online birthday existential forgery attack of Dang and Perlner on the RMX-hash-then-sign schemes. We show that a variant of this attack can be applied to forge the other randomize-hash-then-sign schemes. We point out practical limitations of the generic forgery attack on the RMX-hash-then-sign schemes. We then show that these limitations can be overcome for the RMX-hash-then-sign schemes if it is easy to find fixed points for the underlying compression functions, as is the case for the Davies-Meyer construction used in popular hash functions such as MD5, designed by Rivest, and the SHA family of hash functions, designed by the National Security Agency (NSA), USA, and published by NIST in the Federal Information Processing Standards (FIPS). We show an online birthday forgery attack on this class of signatures by using a variant of Dean's method of finding fixed-point expandable messages for hash functions based on the Davies-Meyer construction. This forgery attack is also applicable to signature schemes based on the variant of RMX standardized by NIST in SP 800-106. We discuss some important applications of our attacks and their applicability to signature schemes based on hash functions with 'built-in' randomization. Finally, we compare our attacks on randomize-hash-then-sign schemes with the generic forgery attacks on the standard hash-based message authentication code (HMAC).
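To fix ideas, randomized hashing in the RMX style prepends a fresh salt and mixes it into every message block before hashing, so the signature covers a salted digest. The sketch below is a simplified illustration of that idea with SHA-256; it omits the exact padding and block-formatting rules of the RMX specification and of SP 800-106.

```python
import hashlib, os

def rmx_style_digest(message: bytes, r: bytes) -> bytes:
    """Hash r || (each block of message XOR r) -- simplified RMX idea."""
    block = len(r)
    message += b"\x00" * (-len(message) % block)      # pad to a block multiple
    randomized = b"".join(
        bytes(a ^ b for a, b in zip(message[i:i + block], r))
        for i in range(0, len(message), block)
    )
    return hashlib.sha256(r + randomized).digest()

r = os.urandom(64)                    # fresh per-signature salt
digest = rmx_style_digest(b"message to be signed", r)
# The signer signs `digest`; the verifier needs r alongside the signature.
```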
Abstract:
NIST's forthcoming Advanced Hash Standard (AHS) competition to select the SHA-3 hash function requires that each candidate hash function submission have at least one construction to support the FIPS 198 HMAC application. As part of its evaluation, NIST is aiming to select either a candidate hash function which is more resistant to known side channel attacks (SCA) when plugged into HMAC, or one that has an alternative MAC mode which is more resistant to known SCA than the other submitted alternatives. In response to this, we perform differential power analysis (DPA) on possible smart card implementations of some of the recently proposed MAC alternatives to the NMAC (a fully analyzed variant of HMAC) and HMAC algorithms, and of NMAC/HMAC versions of some recently proposed hash and compression function modes. We show that the recently proposed BNMAC and KMDP MAC schemes are even weaker than NMAC/HMAC against DPA attacks, whereas multi-lane NMAC, EMD MAC and the keyed wide-pipe hash have security similar to NMAC against DPA attacks. Our DPA attacks do not work on the NMAC settings of the MDC-2, Grindahl and MAME compression functions.
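For reference, the HMAC construction targeted by these DPA attacks consists of two nested keyed hash calls. The sketch below writes HMAC-SHA-256 out with hashlib primitives (and cross-checks against the standard library) to make the inner and outer keyed hashes that NMAC/HMAC share explicit.

```python
import hashlib
import hmac as stdlib_hmac

def hmac_sha256(key: bytes, message: bytes) -> bytes:
    block = 64                                    # SHA-256 block size in bytes
    if len(key) > block:
        key = hashlib.sha256(key).digest()        # long keys are hashed first
    key = key.ljust(block, b"\x00")               # then zero-padded to one block
    ipad = bytes(b ^ 0x36 for b in key)
    opad = bytes(b ^ 0x5C for b in key)
    inner = hashlib.sha256(ipad + message).digest()   # inner keyed hash
    return hashlib.sha256(opad + inner).digest()      # outer keyed hash

# Sanity check against the standard library implementation.
assert hmac_sha256(b"k", b"msg") == stdlib_hmac.new(b"k", b"msg", hashlib.sha256).digest()
```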