908 results for Static-order-trade-off


Relevance: 100.00%

Abstract:

This study examines Interim Financial Reporting disclosure compliance and associated factors for listed firms in Asia-Pacific countries: Australia, Hong Kong, Malaysia, Singapore, the Philippines, Thailand, and Vietnam. Employing disclosure theory (in the context of information economics), whose central premise is that managers trade off the costs and benefits of disclosure, the factors influencing variation in interim reporting disclosure compliance are examined. Using researcher-constructed disclosure indices and regression modelling, the results reveal significant cross-country variation in interim reporting disclosure compliance, with higher compliance associated with IFRS adoption, audit review, quarterly reporting (rather than six-monthly) and shorter reporting lags.

Relevance: 100.00%

Abstract:

The increase in data-center-dependent services has made energy optimization of data centers one of the most exigent challenges in today's Information Age. Green and energy-efficient measures are essential for reducing carbon footprints and exorbitant energy costs. However, inefficient application management of data centers results in high energy consumption and low resource utilization efficiency. Unfortunately, in most cases, deploying an energy-efficient application management solution inevitably degrades the resource utilization efficiency of the data centers. To address this problem, a Penalty-based Genetic Algorithm (GA) is presented in this paper to solve a defined profile-based application assignment problem whilst maintaining a trade-off between power consumption and resource utilization performance. Case studies show that the penalty-based GA is highly scalable and provides 16% to 32% better solutions than a greedy algorithm.
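
A minimal sketch of the penalty idea (application demands, power figures, penalty weight and GA settings are all invented for illustration, not the paper's implementation):

```python
import random

N_APPS, N_SERVERS = 20, 5
# Hypothetical CPU demand profile per application (fraction of one server).
demand = [random.uniform(0.05, 0.4) for _ in range(N_APPS)]
CAPACITY = 1.0                 # normalised capacity per server
IDLE, PEAK = 100.0, 200.0      # idle/full-load power per server (illustrative)
PENALTY = 500.0                # weight applied to capacity violations

def fitness(assign):
    """Lower is better: total power plus a penalty for overloaded servers."""
    load = [0.0] * N_SERVERS
    for app, srv in enumerate(assign):
        load[srv] += demand[app]
    power = sum(IDLE + (PEAK - IDLE) * min(l, CAPACITY) for l in load if l > 0)
    violation = sum(max(0.0, l - CAPACITY) for l in load)
    return power + PENALTY * violation

def evolve(pop_size=60, gens=200, p_mut=0.1):
    pop = [[random.randrange(N_SERVERS) for _ in range(N_APPS)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        parents = pop[:pop_size // 2]         # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_APPS)
            child = a[:cut] + b[cut:]         # one-point crossover
            if random.random() < p_mut:
                child[random.randrange(N_APPS)] = random.randrange(N_SERVERS)
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

print("best assignment fitness:", round(fitness(evolve()), 1))
```

The penalty term lets a single scalar fitness steer the search toward low-power assignments without simply packing every server past its capacity.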

Relevance: 100.00%

Abstract:

This project was a step forward in introducing suitable cooperative diversity transmission techniques for vehicle-to-vehicle communications. The contributions are intended to aid the successful implementation of future vehicular safety and autonomous control systems. Several protocols were introduced for vehicles to communicate effectively without losing connectivity. This study investigated novel protocols in terms of the diversity-multiplexing trade-off and outage for a range of potential vehicular safety and infotainment applications.

Relevance: 100.00%

Abstract:

Metabolic imaging using positron emission tomography (PET) has found increasing clinical use for the management of infiltrating tumours such as glioma. However, the heterogeneous biological nature of tumours and intrinsic treatment resistance in some regions mean that knowledge of multiple biological factors is needed for effective treatment planning. For example, 18F-FDOPA can be used to identify infiltrative tumour and 18F-FMISO to localize hypoxic regions. Performing multiple PET acquisitions is impractical in many clinical settings, but previous studies suggest multiplexed PET imaging could be viable. The fidelity of the two signals is affected by the injection interval, scan timing and injected dose. The contribution of this work is to propose a framework for explicitly trading off signal fidelity against logistical constraints when designing the imaging protocol. The particular case of estimating 18F-FMISO from a single frame prior to injection of 18F-FDOPA is considered. Theoretical experiments using simulations for typical biological scenarios in humans demonstrate that results comparable to a pair of single-tracer acquisitions can be obtained provided protocol timings are carefully selected. These results were validated using a pre-clinical data set that was synthetically multiplexed. The results indicate that the dual acquisition of 18F-FMISO and 18F-FDOPA could be feasible in the clinical setting. The proposed framework could also be used to design protocols for other tracers.
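
A toy illustration of the protocol-design trade-off (the decay-based contamination model and the logistics weight are invented; the paper's framework models full tracer kinetics):

```python
import math

HALF_LIFE = 109.8                  # minutes, physical half-life of fluorine-18
LAMBDA = math.log(2) / HALF_LIFE

def protocol_cost(interval_min, w_logistics=0.005):
    """Longer injection intervals reduce the residual first-tracer signal
    contaminating the second tracer's frames (better fidelity) but stretch
    the protocol (worse logistics). Both terms are illustrative stand-ins."""
    contamination = math.exp(-LAMBDA * interval_min)   # residual fraction
    return contamination + w_logistics * interval_min  # fidelity + time cost

best = min(range(10, 241, 10), key=protocol_cost)
print("least-cost injection interval:", best, "minutes")
```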

Relevance: 100.00%

Abstract:

We investigate the terminating concept of BKZ reduction first introduced by Hanrot et al. [Crypto'11] and conduct extensive experiments to predict the number of tours necessary to obtain the best possible trade-off between reduction time and quality. Then, we improve Buchmann and Lindner's result [Indocrypt'09] to find sub-lattice collisions in SWIFFT. We illustrate that further improvement in time is possible through a special setting of the SWIFFT parameters and through the adaptive combination of different reduction parameters. Our contributions also include a probabilistic simulation approach that tops up the deterministic simulation described by Chen and Nguyen [Asiacrypt'11] and can predict the Gram-Schmidt norms more accurately for large block sizes.
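
A toy model of the terminating idea (the quality curve and its constants are invented, not measured BKZ behaviour):

```python
def root_hermite(tours, start=1.015, limit=1.011, decay=0.7):
    """Toy reduction-quality curve: each tour improves the root Hermite
    factor, with geometrically diminishing returns."""
    return limit + (start - limit) * decay ** tours

def best_tour_count(max_tours=50, gain_threshold=1e-5):
    """Since run time grows roughly linearly in the number of tours,
    terminate once an extra tour buys almost no further quality."""
    for t in range(1, max_tours + 1):
        if root_hermite(t - 1) - root_hermite(t) < gain_threshold:
            return t - 1
    return max_tours

print("stop after", best_tour_count(), "tours")
```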

Relevance: 100.00%

Abstract:

In this paper we analyse two variants of the SIMON family of lightweight block ciphers against variants of linear cryptanalysis and present the best linear cryptanalytic results on these variants of reduced-round SIMON to date. We propose a time-memory trade-off method that finds differential/linear trails for any permutation allowing low-Hamming-weight differential/linear trails. Our method combines low-Hamming-weight trails found via the correlation matrix representing the target permutation with heavy-Hamming-weight trails found using a Mixed Integer Programming model representing the target differential/linear trail. Our method enables us to find a 17-round linear approximation for SIMON-48, which is the best current linear approximation for SIMON-48. Using only the correlation matrix method, we are able to find a 14-round linear approximation for SIMON-32, which is also the current best linear approximation for SIMON-32. The presented linear approximations allow us to mount a 23-round key recovery attack on SIMON-32 and a 24-round key recovery attack on SIMON-48/96, which are the current best results on SIMON-32 and SIMON-48. In addition, we present an attack on 24 rounds of SIMON-32 with marginal complexity.
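
A small illustration of the correlation-matrix ingredient (SIMON has no S-box, so a toy 4-bit S-box stands in here purely to show the computation):

```python
# Entry (a, b) of the correlation matrix is the correlation of the linear
# approximation <a, x> = <b, S(x)> over all 16 inputs. Strong entries with
# low-Hamming-weight masks are the starting points for linear trails.

S = [0xE, 4, 0xD, 1, 2, 0xF, 0xB, 8, 3, 0xA, 6, 0xC, 5, 9, 0, 7]  # toy S-box

def parity(x):
    return bin(x).count("1") & 1

def correlation(a, b):
    agree = sum(parity(a & x) == parity(b & S[x]) for x in range(16))
    return (2 * agree - 16) / 16       # in [-1, 1]; 0 means no linear bias

best = max(((a, b, correlation(a, b))
            for a in range(1, 16) for b in range(1, 16)),
           key=lambda t: abs(t[2]))
print("strongest mask pair (a, b, corr):", best)
```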

Relevance: 100.00%

Abstract:

The appropriate frequency and precision for surveys of wildlife populations represent a trade-off between survey cost and the risk of making suboptimal management decisions because of poor survey data. The commercial harvest of kangaroos is primarily regulated through annual quotas set as proportions of absolute estimates of population size. Stochastic models were used to explore the effects of varying precision, survey frequency and harvest rate on the risk of quasiextinction for an arid-zone and a more mesic-zone kangaroo population. Quasiextinction probability increases in a sigmoidal fashion as survey frequency is reduced. The risk is greater in more arid regions and is highly sensitive to harvest rate. An appropriate management regime involves regular surveys in the major harvest areas, where the harvest rate can be set close to the maximum sustained yield. Outside these areas, survey frequency can be reduced in relatively mesic regions, and in arid regions when combined with lowered harvest rates. Relative to other factors, quasiextinction risk is only affected by survey precision (standard error/mean × 100) when it is >50%, partly reflecting the safety of the strategy of harvesting a proportion of a population estimate.
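
A minimal Monte Carlo sketch of this kind of model (growth, noise and threshold parameters are invented, not the paper's):

```python
import random

def quasiextinction_risk(survey_interval, cv, harvest_rate, years=50,
                         runs=2000, n0=10000, threshold=1000):
    """Estimated probability that a harvested population falls below a
    quasiextinction threshold when quotas are set from imprecise surveys."""
    hits = 0
    for _ in range(runs):
        n, quota = n0, harvest_rate * n0
        for year in range(years):
            if year % survey_interval == 0:            # survey year
                estimate = max(n * random.gauss(1.0, cv), 0.0)
                quota = harvest_rate * estimate        # quota from estimate
            growth = random.gauss(0.10, 0.30)          # environmental noise
            n = max(n * (1.0 + growth) - quota, 0.0)
            if n < threshold:
                hits += 1
                break
    return hits / runs

for interval in (1, 2, 5):                             # years between surveys
    print(interval, quasiextinction_risk(interval, cv=0.2, harvest_rate=0.15))
```

Because the quota is a proportion of the last estimate rather than a fixed offtake, even a stale or noisy estimate scales the harvest down as the population shrinks, which is the safety property noted above.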

Relevance: 100.00%

Abstract:

To remain competitive, many agricultural systems are now being run along business lines. Systems methodologies are being incorporated, and here evolutionary computation is a valuable tool for identifying more profitable or sustainable solutions. However, agricultural models typically pose some of the more challenging problems for optimisation. This chapter outlines these problems and then presents a series of three case studies demonstrating how they can be overcome in practice. Firstly, increasingly complex models of Australian livestock enterprises show that evolutionary computation is the only viable optimisation method for these large and difficult problems. Ongoing research is taking a notably efficient and robust variant, differential evolution, out into real-world systems. Next, models of cropping systems in Australia demonstrate the challenge of dealing with competing objectives, namely maximising farm profit whilst minimising resource degradation. Pareto methods are used to illustrate this trade-off, and these results have proved most useful for farm managers in this industry. Finally, land-use planning in the Netherlands demonstrates the size and spatial complexity of real-world problems. Here, GIS-based optimisation techniques are integrated with Pareto methods, producing better solutions that were acceptable to the competing organisations. These three studies all show that evolutionary computation remains the only feasible method for the optimisation of large, complex agricultural problems. An extra benefit is that the resultant population of candidate solutions illustrates trade-offs, and this leads to more informed discussions and better education of the industry decision-makers.
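
A minimal sketch of the Pareto filtering step (the candidate strategies and their objective values are invented):

```python
def dominates(a, b):
    """a dominates b: no worse on both objectives and strictly better on one.
    Objective 1 (profit) is maximised; objective 2 (degradation) minimised."""
    return a[0] >= b[0] and a[1] <= b[1] and (a[0] > b[0] or a[1] < b[1])

def pareto_front(candidates):
    """Keep only the non-dominated candidates: the trade-off curve."""
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o != c)]

# (profit $k/yr, resource degradation index) for five made-up strategies:
strategies = [(120, 8.0), (150, 12.0), (150, 9.5), (90, 3.0), (110, 3.5)]
print(pareto_front(strategies))
# (150, 12.0) is dominated by (150, 9.5); the other four form the
# trade-off curve that would be shown to decision-makers.
```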

Relevance: 100.00%

Abstract:

The notion of being sure that you have completely eradicated an invasive species is fanciful because of imperfect detection and persistent seed banks. Eradication is commonly declared on an ad hoc basis, on notions of seed-bank longevity, or by setting arbitrary thresholds of 1% or 5% confidence that the species is not present. Rather than declaring eradication at some arbitrary level of confidence, we take an economic approach in which we stop looking when the expected costs outweigh the expected benefits. We develop theory that determines the number of years of absent surveys required to minimize the net expected cost. Given that detection of a species is imperfect, the optimal stopping time is a trade-off between the cost of continued surveying and the cost of escape and damage if eradication is declared too soon. A simple rule of thumb compares well to the exact optimal solution using stochastic dynamic programming. Application of the approach to the eradication programme of Helenium amarum reveals that the actual stopping time was a precautionary one given the ranges for each parameter.
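
A minimal sketch of the stopping logic under a simple Bayesian detection model (all costs and probabilities are invented, and this rule of thumb is not the paper's exact dynamic-programming solution):

```python
def posterior_present(prior, p_detect, n_absent):
    """P(weed still present | n_absent consecutive surveys found nothing)."""
    miss = (1.0 - p_detect) ** n_absent
    return prior * miss / (prior * miss + (1.0 - prior))

def stopping_time(prior=0.5, p_detect=0.6, survey_cost=1.0, damage_cost=100.0):
    """Keep surveying while the expected damage from stopping now (chance the
    weed is still there times the damage it would cause) exceeds the cost of
    one more year of surveys."""
    n = 0
    while posterior_present(prior, p_detect, n) * damage_cost > survey_cost:
        n += 1
    return n

print("declare eradication after", stopping_time(), "absent surveys")
```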

Relevance: 100.00%

Abstract:

Data-flow analysis is an integral part of any aggressive optimizing compiler. We propose a framework for improving the precision of data-flow analysis in the presence of complex control flow. We initially perform data-flow analysis to determine those control-flow merges which cause the loss in data-flow analysis precision. The control-flow graph of the program is then restructured such that performing data-flow analysis on the resulting restructured graph gives more precise results. The proposed framework is both simple, involving the familiar notion of product automata, and general, since it is applicable to any forward data-flow analysis. Apart from proving that our restructuring process is correct, we also show that restructuring is effective in that it necessarily leads to more optimization opportunities. Furthermore, the framework handles the trade-off between the increase in data-flow precision and the code-size increase inherent in the restructuring. We show that determining an optimal restructuring is NP-hard, and propose and evaluate a greedy strategy. The framework has been implemented in the Scale research compiler and instantiated for the specific problem of constant propagation. On the SPECINT 2000 benchmark suite we observe an average speedup of 4% in running times over the Wegman-Zadeck conditional constant propagation algorithm and 2% over a purely path-profile-guided approach.
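
A toy illustration of the precision loss at a merge that motivates the restructuring (this is not the Scale implementation):

```python
# Constant-propagation lattice for one variable: TOP (no information yet),
# a concrete constant, or BOTTOM (conflicting values).

TOP, BOTTOM = "top", "not-a-constant"

def meet(a, b):
    """Combine the values flowing into a control-flow merge."""
    if a == TOP:
        return b
    if b == TOP:
        return a
    return a if a == b else BOTTOM

# Two predecessors reach the merge with x = 1 and x = 2 respectively:
print(meet(1, 2))   # not-a-constant: both facts are lost at the join

# Restructuring duplicates the code below the merge so each copy has a
# single predecessor; within each copy x remains a known constant (1 or 2),
# at the price of a larger control-flow graph -- exactly the precision /
# code-size trade-off the framework manages.
```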

Relevance: 100.00%

Abstract:

Using an analysis-by-synthesis (AbS) approach, we develop a soft-decision-based switched vector quantization (VQ) method for high-quality and low-complexity coding of wideband speech line spectral frequency (LSF) parameters. For each switching region, a low-complexity transform domain split VQ (TrSVQ) is designed. The overall rate-distortion (R/D) performance optimality of the new switched quantizer is addressed in the Gaussian mixture model (GMM) based parametric framework. In the AbS approach, the reduction of quantization complexity is achieved through the use of nearest neighbor (NN) TrSVQs and by splitting the transform domain vector into a higher number of subvectors. Compared to current LSF quantization methods, the new method is shown to provide a competitive or better trade-off between R/D performance and complexity.
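
A minimal sketch of the split-VQ nearest-neighbour search (random codebooks stand in for trained, transform-domain ones):

```python
import random

random.seed(1)
DIM, SPLITS, CB_SIZE = 16, 4, 8
SUB = DIM // SPLITS
# Stand-in codebooks; a real coder would train them (e.g. with LBG).
codebooks = [[[random.random() for _ in range(SUB)] for _ in range(CB_SIZE)]
             for _ in range(SPLITS)]

def quantize(vec):
    """Cut the vector into subvectors and return the nearest codeword index
    for each part. More splits mean smaller per-part searches, which is the
    complexity reduction mentioned above."""
    indices = []
    for s in range(SPLITS):
        sub = vec[s * SUB:(s + 1) * SUB]
        best = min(range(CB_SIZE),
                   key=lambda i: sum((a - b) ** 2
                                     for a, b in zip(sub, codebooks[s][i])))
        indices.append(best)
    return indices

x = [random.random() for _ in range(DIM)]
print(quantize(x))   # 4 indices of 3 bits each = 12 bits for the vector
```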

Relevance: 100.00%

Abstract:

In irrigated cropping, as in any other industry, profit and risk are interdependent. An increase in profit would normally coincide with an increase in risk, which means that risk can be traded for profit. It is desirable to manage a farm so that it achieves the maximum possible profit for the desired level of risk. This paper identifies risk-efficient cropping strategies that allocate land and water between crop enterprises for a case study of an irrigated farm in southern Queensland, Australia. This is achieved by applying stochastic frontier analysis to the output of a simulation experiment. The simulation experiment involved changing the levels of business risk by systematically varying the crop sowing rules in a bioeconomic model of the case-study farm. This model utilises the multi-field capability of the process-based Agricultural Production Systems Simulator (APSIM) and is parameterised using data collected from interviews with a collaborating farmer. We found that sowing rules that increased the farm area sown to cotton caused the greatest increase in risk-efficiency. Increasing maize area also improved risk-efficiency, but to a lesser extent than cotton. Sowing rules that increased the area sown to wheat reduced the risk-efficiency of the farm business. Sowing rules were identified that had the potential to improve expected farm profit by ca. $50,000 annually without significantly increasing risk. The concept of the shadow price of risk is discussed, and an expression is derived from the estimated frontier equation that quantifies the trade-off between profit and risk.
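
A sketch of how a shadow price falls out of a fitted frontier (the quadratic form and its coefficients are invented, not the paper's estimated equation):

```python
# Suppose the fitted frontier is profit = a + b*sd - c*sd**2, where sd is
# the standard deviation of profit. The shadow price of risk is the
# derivative: the extra expected profit gained per unit of extra risk.

a, b, c = 50_000, 900, 0.4         # hypothetical frontier coefficients

def frontier(sd):                   # expected profit on the frontier, $
    return a + b * sd - c * sd ** 2

def shadow_price(sd):               # d(profit)/d(risk) at a given risk level
    return b - 2 * c * sd

for sd in (200, 500, 1000):         # illustrative risk levels ($ sd of profit)
    print(sd, round(frontier(sd)), round(shadow_price(sd), 1))
```

The declining shadow price captures the economics in the abstract: each extra dollar of risk buys less and less additional expected profit as the farm moves along the frontier.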

Relevance: 100.00%

Abstract:

Post-rainy sorghum (Sorghum bicolor (L.) Moench) production underpins the livelihood of millions in the semiarid tropics, where the crop is affected by drought. Drought scenarios have been classified and quantified using crop simulation. In this report, variation in traits that hypothetically contribute to drought adaptation (plant growth dynamics, canopy and root water-conducting capacity, drought stress responses) was virtually introgressed into the most common post-rainy sorghum genotype, and the influence of these traits on plant growth, development, and grain and stover yield was simulated across different scenarios. Limited transpiration rates under high vapour pressure deficit had the highest positive effect on production, especially when combined with enhanced water extraction capacity at the root level. Variability in leaf development (smaller canopy size, later plant vigour or increased leaf appearance rate) also increased grain yield under severe drought, although it caused a stover yield trade-off under milder stress. Although the leaf development response to soil drying varied, this trait had only a modest benefit on crop production across all stress scenarios. Closer dissection of the model outputs showed that under water limitation, grain yield was largely determined by the amount of water available after anthesis, and this relationship became closer with stress severity. All traits investigated increased water availability after anthesis, delayed leaf senescence and led to a 'stay-green' phenotype. In conclusion, we showed that breeding success remained highly probabilistic; maximum resilience and economic benefits depended on drought frequency. Maximum potential could be explored by specific combinations of traits.
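
A minimal sketch of the limited-transpiration trait (slope and breakpoint values are invented, not the simulation's parameters):

```python
def transpiration(vpd_kpa, slope=1.2, breakpoint=2.0):
    """Transpiration (arbitrary units) vs vapour pressure deficit.
    The unlimited genotype tracks VPD linearly; the limited-transpiration
    genotype caps water use above the breakpoint, saving soil water for
    the yield-critical post-anthesis period."""
    unlimited = slope * vpd_kpa
    limited = slope * min(vpd_kpa, breakpoint)
    return unlimited, limited

for vpd in (1.0, 2.0, 3.0, 4.0):
    print(f"VPD {vpd} kPa -> unlimited/limited: {transpiration(vpd)}")
```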

Relevance: 100.00%

Abstract:

For accurate calculation of reductions in greenhouse-gas (GHG) emissions, methodologies under the Australian Government's Carbon Farming Initiative (CFI) depend on a valid assessment of the baseline and project emissions. Life-cycle assessments (LCAs) clearly show that enteric methane emitted from the rumen of cattle and sheep is the major source of GHG emissions from livestock enterprises. Where a historic baseline for a CFI methodology for livestock is required, the use of simulated data for cow-calf enterprises at six sites in southern Australia demonstrated that a 5-year rolling emission average provides an acceptable trade-off between accuracy and stability, but this is a much shorter time period than is typically used for LCA. For many CFI livestock methodologies, comparative or pair-wise baselines are potentially more appropriate than historic baselines. A case study of lipid supplementation of beef cows over winter is presented. The case study of a control herd of 250 cows used a comparative baseline derived from simple data on livestock numbers and class of livestock to quantify the emission abatement. Compared with the control herd, lipid supplementation to cows over winter increased livestock productivity and total livestock production, and raised enterprise GHG emissions from 990 t CO2-e to 1022 t CO2-e. Energy embodied in the supplement and the extra diesel used in transporting the supplement diminished the enteric-methane abatement benefit of lipid supplementation. Reducing the herd to 238 cows maintained the level of livestock production of the control herd and reduced enterprise emissions to 938 t CO2-e, but was not cost-effective under the assumptions of this case study.
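
A minimal sketch of the rolling baseline (the annual figures are illustrative, not the simulated data):

```python
def rolling_baseline(emissions, window=5):
    """Rolling mean over the trailing `window` years; smooths year-to-year
    variation while still tracking the enterprise. Needs >= window values."""
    return [sum(emissions[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(emissions))]

# Illustrative annual enterprise emissions, t CO2-e:
annual = [950, 1010, 980, 1040, 990, 1022, 938]
print([round(b) for b in rolling_baseline(annual)])   # [994, 1008, 994]
```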

Relevance: 100.00%

Abstract:

The built environment is a major contributor to the world's carbon dioxide emissions, with a considerable amount of energy consumed in buildings for heating, ventilation and air-conditioning, space illumination, the use of electrical appliances, etc., to facilitate various anthropogenic activities. The development of sustainable buildings seeks to ameliorate this situation mainly by reducing energy consumption. Sustainable building design, however, is a complicated process involving a large number of design variables, each with a range of feasible values. There are also multiple, often conflicting, objectives involved, such as life-cycle costs and occupant satisfaction. One approach to dealing with this is through the use of optimization models. In this paper, a new multi-objective optimization model is developed for sustainable building design by considering the design objectives of cost and energy consumption minimization and occupant comfort level maximization. In a case study demonstration, it is shown that the model can derive a set of suitable design solutions in terms of life-cycle cost, energy consumption and indoor environmental quality, helping the client and design team gain a better understanding of the design space and the trade-off patterns between different design objectives. The model can be very useful in the conceptual design stages for determining appropriate operational settings that achieve optimal building performance in terms of minimizing energy consumption and maximizing occupant comfort.
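
A toy weighted-sum scan over two assumed design variables, showing how different weights trace out different points on the trade-off (the variables and objective functions are invented, not the paper's model):

```python
import itertools

glazing = [0.2, 0.4, 0.6]          # window-to-wall ratio (assumed variable)
setpoint = [22.0, 24.0, 26.0]      # cooling setpoint, deg C (assumed variable)

def cost(g, t):
    """Toy life-cycle cost proxy: more glazing and colder setpoints cost more."""
    return 100 + 80 * g + 15 * (26 - t)

def discomfort(g, t):
    """Toy discomfort proxy: occupants prefer more daylight and cooler rooms."""
    return 5 * (t - 22) + 20 * (0.6 - g)

for w in (0.2, 0.5, 0.8):          # weight on cost vs comfort
    design = min(itertools.product(glazing, setpoint),
                 key=lambda d: w * cost(*d) + (1 - w) * discomfort(*d))
    print(f"w={w}: glazing={design[0]}, setpoint={design[1]}")
```

Sweeping the weight and re-optimizing is one simple way to expose the trade-off patterns between design objectives that the abstract describes.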