95 results for batch reactors
at Queensland University of Technology - ePrints Archive
Abstract:
Bid opening in e-auctions is efficient when a homomorphic secret sharing function is employed to seal the bids and homomorphic secret reconstruction is employed to open them. However, this high efficiency rests on an assumption: that the bids are valid (e.g., within a specified range). An undetected invalid bid can compromise the correctness and fairness of the auction. Unfortunately, validity verification of the bids is ignored in auction schemes employing homomorphic secret sharing (called homomorphic auctions in this paper). In this paper, an attack against homomorphic auctions in the absence of a bid validity check is presented, and a necessary bid validity check mechanism is proposed. A batch cryptographic technique is then introduced and applied to improve the efficiency of the bid validity check.
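The sealing-and-opening mechanism described above can be sketched with additive secret sharing; the modulus, number of shares, and bid values below are illustrative only (real schemes use verifiable sharing such as Shamir's), but the homomorphic reconstruction and the silent acceptance of an out-of-range bid are the points at issue:

```python
import random

P = 2_147_483_647  # illustrative prime modulus

def share(bid, n):
    """Split a bid into n additive shares mod P (the 'sealed' bid)."""
    shares = [random.randrange(P) for _ in range(n - 1)]
    shares.append((bid - sum(shares)) % P)
    return shares

def reconstruct(shares):
    """Homomorphic secret reconstruction: sum of shares mod P."""
    return sum(shares) % P

# Homomorphic property: combining share vectors componentwise
# reconstructs the sum of the underlying bids.
a = share(120, 3)
b = share(250, 3)
combined = [(x + y) % P for x, y in zip(a, b)]
assert reconstruct(combined) == 370

# An invalid bid outside the agreed range (here a negative value,
# encoded as a huge residue) is sealed and reconstructed without any
# error -- exactly the gap the proposed validity check closes.
bad = share(-5 % P, 3)
```

Because reconstruction never inspects individual bids, nothing flags the invalid share vector; the paper's contribution is a batch-verified validity proof performed before opening.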
Abstract:
The material presented in this thesis comprises two key parts: the first concerns batch cryptography specifically, whilst the second deals with how this form of cryptography may be applied to security-related applications, such as electronic cash, to improve the efficiency of their protocols. The objective of batch cryptography is to devise more efficient primitive cryptographic protocols. In general, these primitives exploit some property, such as homomorphism, to perform a computationally expensive operation on a collective input set. The idea is to amortise an expensive operation, such as modular exponentiation, over the whole input. Most research in this field has concentrated on batch verification of digital signatures. It is shown that several new attacks may be launched against the published schemes, exposing some weaknesses. Another common use of batch cryptography is the simultaneous generation of digital signatures. There is significantly less previous work in this area, and the existing schemes have limited use in practical applications. Several new batch signature schemes are introduced that improve upon the existing techniques, and some practical uses are illustrated. Electronic cash is a technology that demands complex protocols in order to furnish several security properties. These typically include anonymity, traceability of a double spender, and off-line payment features. Presently, the most efficient schemes make use of coin divisibility to withdraw one large financial amount that may be progressively spent with one or more merchants. Several new cash schemes are introduced here that use batch cryptography to improve the withdrawal, payment, and deposit of electronic coins. The devised schemes apply both the batch signature and verification techniques introduced, demonstrating improved performance over the contemporary divisibility-based structures.
The solutions also provide an alternative paradigm for the construction of electronic cash systems. Whilst electronic cash is used as the vehicle for demonstrating the relevance of batch cryptography to security-related applications, the applicability of the techniques introduced extends well beyond this.
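As a concrete instance of amortising modular exponentiation over a collective input, the small-exponent batch test below verifies several RSA-style signatures with a single e-th power instead of one per signature. The parameters are toy-sized and insecure, and this is a generic illustration of batch verification rather than any specific scheme from the thesis:

```python
import math
import random

# Toy RSA parameters (illustrative only -- far too small to be secure).
p, q, e = 1009, 1013, 65537
N = p * q
d = pow(e, -1, (p - 1) * (q - 1))    # signing exponent

msgs = [123, 456, 789, 1011]
sigs = [pow(m, d, N) for m in msgs]  # textbook RSA signatures

def batch_verify(msgs, sigs, checks=1):
    """Small-exponent batch test: check
    (prod sigs[i]^r[i])^e == prod msgs[i]^r[i]  (mod N)
    for random exponents r, amortising one e-th power over all
    signatures. A bad signature passes only with small probability
    per check."""
    for _ in range(checks):
        r = [random.randrange(1, 1 << 16) for _ in msgs]
        lhs = pow(math.prod(pow(s, ri, N) for s, ri in zip(sigs, r)) % N, e, N)
        rhs = math.prod(pow(m, ri, N) for m, ri in zip(msgs, r)) % N
        if lhs != rhs:
            return False
    return True

assert batch_verify(msgs, sigs)
```

The random exponents r are what defeat naive cancellation of a forged signature against a weakened one; the attacks mentioned in the abstract target published batch verifiers of this general flavour.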
Abstract:
Series reactors are used in distribution grids to reduce the short-circuit fault level. Among the disadvantages of these devices are the voltage drop produced across the reactor and the steep rise of the transient recovery voltage (TRV), which generally exceeds the rating of the associated circuit breaker. Simulations were performed to compare the characteristics of a saturated-core High-Temperature Superconducting Fault Current Limiter (HTS FCL) and a series reactor. The design of the HTS FCL was optimized using an evolutionary algorithm, and the resulting Pareto frontier of optimal solutions is presented in this paper. The results show that the steady-state impedance of an HTS FCL is significantly lower than that of a series reactor for the same level of fault current limiting. Tests performed on a prototype 11 kV HTS FCL confirm the theoretical results. The respective TRVs of the HTS FCL and an air-core reactor of comparable fault current limiting capability are also determined. The results show that the saturated-core HTS FCL has a significantly lower effect on the rate of rise of the circuit breaker TRV than the air-core reactor. The simulation results are validated against short-circuit test results.
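The practical weight of the lower steady-state impedance can be shown with back-of-the-envelope arithmetic; the load current and impedance values below are hypothetical, not taken from the tests reported here:

```python
# Hypothetical figures (not from the paper) illustrating why a low
# steady-state impedance matters: the in-service voltage drop across
# a limiting device is roughly I_load * Z_steady per phase.
I_load = 400.0    # assumed feeder load current, A
Z_reactor = 1.5   # assumed air-core series reactor impedance, ohm
Z_fcl = 0.1       # assumed HTS FCL steady-state impedance, ohm

drop_reactor = I_load * Z_reactor   # ~600 V per phase, paid continuously
drop_fcl = I_load * Z_fcl           # ~40 V per phase
```

The series reactor presents its full impedance at all times, whereas the saturated-core FCL inserts significant impedance only when fault current drives the core out of saturation, so its steady-state drop stays small.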
Abstract:
The use of immobilised TiO2 for the purification of polluted water streams makes it necessary to evaluate the effect of mechanisms such as the transport of pollutants from the bulk of the liquid to the catalyst surface and the transport phenomena inside the porous film. Experimental results on the effect of film thickness on the observed reaction rate, for both liquid-side and support-side illumination, are compared here with the predictions of a one-dimensional mathematical model of the porous photocatalytic slab. Good agreement was observed between the experimentally obtained photodegradation of phenol and its by-products and the corresponding model predictions. The results confirm that an optimal catalyst thickness exists and, for the films employed here, is 5 μm. Furthermore, the modelling results highlight that porosity, together with the intrinsic reaction kinetics, are the parameters controlling the photocatalytic activity of the film: the former influences the transport phenomena and light absorption characteristics, while the latter dictates the rate of reaction.
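The competition that produces an optimal thickness can be sketched with a toy model of support-side illumination: light is absorbed on its way through the film while pollutant penetrates from the opposite, liquid side. The absorption coefficient and penetration depth below are invented, so the resulting optimum only qualitatively echoes the ~5 μm finding:

```python
import math

def observed_rate(L, alpha=0.5, delta=10.0):
    """Toy observed rate for a film of thickness L (micrometres),
    support-side illumination. Light enters at z = L and decays as
    exp(-alpha*(L-z)); pollutant enters at z = 0 and decays as
    exp(-z/delta). Local rate ~ I(z)*C(z); integrating analytically:
      R(L) = (exp(-L/delta) - exp(-alpha*L)) / (alpha - 1/delta)."""
    k = alpha - 1.0 / delta
    return (math.exp(-L / delta) - math.exp(-alpha * L)) / k

thicknesses = [0.5 * i for i in range(1, 61)]   # 0.5 to 30 um
rates = [observed_rate(L) for L in thicknesses]
L_opt = thicknesses[rates.index(max(rates))]    # interior maximum
```

With these made-up parameters the rate peaks at an interior thickness (about 4 μm on this grid): too thin and little light is used, too thick and the illuminated region sees no pollutant, which is the qualitative behaviour the one-dimensional model captures.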
Abstract:
The effects of oxygen availability and induction culture biomass upon production of an industrially important monoamine oxidase (MAO) were investigated in fed-batch cultures of a recombinant E. coli. For each induction cell biomass, two different oxygenation methods were used: aeration and oxygen-enriched air. Induction at higher biomass levels increased the culture's demand for oxygen, leading to fermentative metabolism and accumulation of high levels of acetate in the aerated cultures. Paradoxically, when induction wet cell weight (WCW) rose from 100% to 137.5%, MAO specific activity in these aerated processes showed a threefold increase, despite an almost eightfold increase in acetate accumulation to levels widely reported to be highly detrimental to protein production. By contrast, for oxygenated cultures induced at WCWs of 100% and 137.5%, specific activity levels were broadly similar, but fell rapidly after the maxima were reached. Induction at high biomass levels (WCW 175%) led to very low levels of specific MAO activity relative to induction at lower WCWs in both aerated and oxygenated cultures. Oxygen enrichment of these cultures was a useful strategy for boosting specific growth rates, but did not have positive effects upon specific enzyme activity. Based upon our findings, the amino acid composition of MAO, and previous studies on related enzymes, we propose that this effect is due to oxidative damage to the MAO enzyme itself during these highly aerobic processes. Thus, the optimal process for MAO production is aerated, not oxygenated, and induced at moderate cell density; it clearly represents a compromise between oxygen supply effects on specific growth rate and induction cell density, acetate accumulation, and high specific MAO activity.
This work shows that the negative effects of oxygen previously reported in free enzyme preparations are not limited to these acellular environments but are also discernible in the sheltered environment of the cytosol of E. coli cells.
Abstract:
A dynamic accumulator is an algorithm that gathers a large set of elements into a constant-size value such that, for each accumulated element, there is a witness confirming that the element was indeed included in the value. Elements can be dynamically added to and deleted from the original set, with the cost of an addition or deletion operation independent of the number of accumulated elements. Although the first accumulator was presented ten years ago, there is still no standard formal definition of accumulators. In this paper, we generalize the formal definitions of accumulators, formulate a security game for dynamic accumulators called the Chosen Element Attack (CEA), and propose a new dynamic accumulator for batch updates based on the Paillier cryptosystem. Our construction performs a batch of update operations at unit cost. We prove its security under the extended strong RSA (es-RSA) assumption.
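The constant-size-value-plus-witness mechanics can be sketched with the classic RSA-style accumulator. This is not the paper's Paillier-based construction, and the toy modulus below has a known factorisation, which a real deployment must avoid:

```python
# Toy RSA-style accumulator: a constant-size value with per-element
# witnesses. NOT the paper's Paillier scheme; parameters are insecure.
N = 1009 * 1013            # modulus (factorisation must be secret in practice)
g = 3                      # public base

def accumulate(elements):
    acc = g
    for x in elements:     # order is irrelevant: exponents multiply
        acc = pow(acc, x, N)
    return acc

def witness(elements, x):
    """Witness for x: the accumulation of every *other* element."""
    return accumulate([e for e in elements if e != x])

def verify(acc, x, w):
    """x was accumulated iff w^x recreates the accumulator value."""
    return pow(w, x, N) == acc

elems = [3, 5, 7, 11]      # real schemes require distinct primes here
acc = accumulate(elems)
assert verify(acc, 7, witness(elems, 7))
# Adding an element costs one exponentiation: acc_new = pow(acc, x, N).
# Deletion, and the paper's unit-cost *batch* of updates, need extra
# machinery (a trapdoor, or the Paillier-based technique proposed).
```

The point of the paper's construction is that a whole batch of additions and deletions updates every witness at unit cost, rather than one exponentiation per accumulated element as in this sketch.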
Abstract:
Switchgrass was treated with 1% (w/w) H₂SO₄ in batch tube reactors at temperatures ranging from 140–220°C for up to 60 minutes. In this study, the release patterns of glucose, 5-hydroxymethylfurfural (5-HMF), and levulinic acid from switchgrass cellulose were investigated through a mechanistic kinetic model. The predictions were consistent with the measured products of interest when new parameters were included to reflect reaction limitations such as cellulose crystallinity and species that cannot be quantitatively analyzed (the acid-soluble lignin–glucose complex (ASL–glucose) and humins). The mechanistic kinetic model incorporating these parameters simulated the experimental data with R² above 0.97. Results showed that glucose yield was most sensitive to variations in the cellulose crystallinity parameter at low temperatures (140–180°C), while the impact of crystallinity on the glucose yield became imperceptible at elevated temperatures (200–220°C). Conversely, the parameters related to the undesired products (ASL–glucose and humins) affected the levulinic acid yield more strongly than the rate constants and other additional parameters at elevated temperatures (200–220°C), while their impact was negligible at 140–180°C. These findings provide a more rational explanation for the kinetic changes in dilute acid pretreatment performance and suggest that cellulose crystallinity and undesired products, including ASL–glucose and humins, play key roles in determining the generation of glucose, 5-HMF, and levulinic acid from biomass-derived cellulose.
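The backbone of such a model is the sequential pathway cellulose → glucose → 5-HMF → levulinic acid. A minimal sketch with first-order steps and forward-Euler integration is below; the rate constants are hypothetical, and the paper's model additionally carries crystallinity, ASL–glucose, and humin terms with temperature-dependent parameters:

```python
# Sequential first-order kinetics sketch:
#   cellulose (C) -> glucose (G) -> 5-HMF (H) -> levulinic acid (LA)
# Rate constants are illustrative, not the paper's fitted values.
k1, k2, k3 = 0.10, 0.05, 0.08          # 1/min (hypothetical)
C, G, H, LA = 1.0, 0.0, 0.0, 0.0       # normalised concentrations
dt, steps = 0.01, 6000                 # 60 min of reaction time
glucose_history = []
for _ in range(steps):
    dC = -k1 * C                       # cellulose hydrolysis
    dG = k1 * C - k2 * G               # glucose formed, then dehydrated
    dH = k2 * G - k3 * H               # 5-HMF formed, then rehydrated
    dLA = k3 * H                       # levulinic acid accumulates
    C += dC * dt
    G += dG * dt
    H += dH * dt
    LA += dLA * dt
    glucose_history.append(G)

G_peak = max(glucose_history)
```

Glucose passes through a maximum (for these constants near t = ln(k1/k2)/(k1 − k2) ≈ 13.9 min) before the downstream steps consume it; that transient peak is the shape the mechanistic model fits, with the additional parameters shifting its height and timing.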
Abstract:
Erroneous genotypes that pass standard quality control (QC) can have a severe impact on genome-wide association studies, genotype imputation, and the estimation of heritability and prediction of genetic risk based on single nucleotide polymorphisms (SNPs). To detect such genotyping errors, a simple two-locus QC method was developed and applied, based on the difference in the test statistic of association between single SNPs and pairs of SNPs. In real data, the proposed approach detected many problematic SNPs with statistical significance even when standard single-SNP QC analyses failed to detect them. Depending on the data set used, the number of erroneous SNPs that were not filtered out by standard single-SNP QC but were detected by the proposed approach varied from a few hundred to thousands. Using simulated data, it was shown that the proposed method was powerful and performed better than the other methods tested. The power of the proposed approach to detect erroneous genotypes was approximately 80% for a 3% error rate per SNP. This novel QC approach is easy to implement and computationally efficient, and can lead to better-quality genotypes for subsequent genotype-phenotype investigations.
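The flavour of the idea can be illustrated with a crude proxy: two SNPs in high linkage disequilibrium should give similar association statistics, so a large gap between an error-affected SNP and its correlated neighbour is a QC signal. The trend statistic, LD model, and error model below are deliberate simplifications, not the paper's actual two-locus statistic:

```python
import random

def trend_stat(genos, status):
    """Crude 1-df trend statistic: squared standardised difference of
    mean genotype (coded 0/1/2) between cases and controls."""
    cases = [g for g, s in zip(genos, status) if s]
    ctrls = [g for g, s in zip(genos, status) if not s]
    mc = sum(cases) / len(cases)
    mu = sum(ctrls) / len(ctrls)
    allg = cases + ctrls
    m = sum(allg) / len(allg)
    var = sum((g - m) ** 2 for g in allg) / len(allg)
    se2 = var * (1 / len(cases) + 1 / len(ctrls))
    return (mc - mu) ** 2 / se2

n = 2000
status = [i < n // 2 for i in range(n)]               # 1000 cases, 1000 controls
snp_a = [random.choice([0, 1, 2]) for _ in range(n)]  # error-free null SNP
snp_b = [g if random.random() < 0.95 else random.choice([0, 1, 2])
         for g in snp_a]                              # high-LD neighbour of A
# Differential genotyping error: ~30% of case genotypes at SNP B miscalled up.
snp_b = [min(2, g + 1) if s and random.random() < 0.3 else g
         for g, s in zip(snp_b, status)]

t_a = trend_stat(snp_a, status)
t_b = trend_stat(snp_b, status)
# Under high LD, t_a and t_b should be of similar size; the large gap
# produced by the differential error flags SNP B for inspection.
```

Neither SNP is truly associated with the phenotype, so the inflated statistic at SNP B relative to its neighbour is attributable to the genotyping error, which is the kind of inconsistency the two-locus method tests for formally.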
Abstract:
A new solution to the millionaire problem is designed on the basis of two new techniques: a zero test and a batch equation. The zero test is a technique used to test whether one or more ciphertexts contain a zero without revealing other information. The batch equation is a technique used to test the equality of multiple integers. Combining these two techniques produces the only known solution to the millionaire problem that is simultaneously correct, private, publicly verifiable, and efficient.
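A minimal sketch of the zero-test idea over an additively homomorphic cipher is below. The Paillier parameters are toy-sized and insecure, and the "batch" step is compressed into a random linear combination of differences, which only approximates the paper's batch equation technique:

```python
import math
import random

# Toy Paillier cryptosystem (tiny primes; illustrative only).
p, q = 1009, 1013
N, N2 = p * q, (p * q) ** 2
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, N)

def enc(m):
    r = random.randrange(1, N)
    while math.gcd(r, N) != 1:
        r = random.randrange(1, N)
    return (pow(1 + N, m, N2) * pow(r, N, N2)) % N2

def dec(c):
    return (pow(c, lam, N2) - 1) // N * mu % N

def zero_test(c):
    """Reveal only whether c encrypts 0: raise c to a random exponent,
    so a non-zero plaintext decrypts to a blinded, uninformative value."""
    return dec(pow(c, random.randrange(1, N), N2)) == 0

def eq_test(a, b):
    """Equality without revealing a or b: Enc(a)*Enc(b)^-1 encrypts a-b."""
    return zero_test(enc(a) * pow(enc(b), -1, N2) % N2)

def batch_eq(pairs):
    """Batch-equation flavour: fold several equalities into ONE zero test
    via random coefficients (false positive probability ~1/N)."""
    c = 1
    for a, b in pairs:
        diff = enc(a) * pow(enc(b), -1, N2) % N2
        c = c * pow(diff, random.randrange(1, N), N2) % N2
    return zero_test(c)
```

The blinding exponent in `zero_test` is what keeps the test from leaking anything beyond "zero or not"; the batch version amortises many equality checks into a single decryption, at the cost of a negligible false-positive probability.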
Abstract:
SRI has examined the organosolv (organic solvent) pulping of Australian bagasse using technology supplied by Ecopulp. In the process, bagasse is reacted with aqueous ethanol in a digester at elevated temperatures (between 150°C and 200°C). The products from the digester are separated using proprietary technology before further processing into a range of saleable products. Test trials were undertaken using two batch digesters: the first capable of pulping about 25 g of wet depithed bagasse, the second larger samples of about 1.5 kg. In this study, the unbleached pulp produced from fresh bagasse did not have very good strength properties for the production of corrugated medium for cartons or of bleached pulp. In particular, the lignin content of the unbleached pulps, as indicated by the Kappa number, is high for making bleached pulp. However, in spite of the high lignin content, it is possible to bleach the pulp to acceptable levels of brightness, up to 86.6% ISO. The economics were assessed for three-tier pricing (low, medium, and high). The economic return for a plant that produces 100 air-dry t/d of brownstock pulp is satisfactory at both the high and medium pricing levels. The outcomes of the project justify continuing the work through to either a pilot plant or an upgraded laboratory facility.