912 results for Bias-Variance Trade-off
Abstract:
One of the most discussed topics in labour and demographic studies, population ageing and stability, is closely related to fertility choices. This thesis explores recent developments in the fertility literature in the context of Australia. We investigate individual preferences for childbearing, the determinants of fertility decisions and the effectiveness of government policies aimed at improving total fertility. The first study highlights the impact of monetary incentives on the decision to bear children in light of potentially differential responses across native and immigrant populations. The second study analyses the role of unemployment and job stability in the fertility choices of women. The final study examines whether the quality-quantity trade-off exists for Australian families and explores the impact of siblings on a child's health and educational outcomes.
Abstract:
This study examines Interim Financial Reporting disclosure compliance and associated factors for listed firms in Asia-Pacific countries: Australia, Hong Kong, Malaysia, Singapore, the Philippines, Thailand, and Vietnam. Employing disclosure theory (in the context of information economics), whose central premise is that managers trade off the costs and benefits of disclosure, the factors influencing the variation in interim reporting disclosure compliance are examined. Using researcher-constructed disclosure indices and regression modelling, the results reveal significant cross-country variation in interim reporting disclosure compliance, with higher compliance associated with IFRS adoption, audit review, quarterly reporting (rather than six-monthly) and shorter reporting lags.
Abstract:
The increase in data-center-dependent services has made energy optimization of data centers one of the most exigent challenges in today's Information Age. Green, energy-efficient measures are essential for reducing carbon footprints and exorbitant energy costs. However, inefficient application management of data centers results in high energy consumption and low resource utilization efficiency. Unfortunately, in most cases, deploying an energy-efficient application management solution inevitably degrades the resource utilization efficiency of the data centers. To address this problem, a Penalty-based Genetic Algorithm (GA) is presented in this paper to solve a defined profile-based application assignment problem while maintaining a trade-off between power consumption performance and resource utilization performance. Case studies show that the penalty-based GA is highly scalable and provides 16% to 32% better solutions than a greedy algorithm.
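To make the penalty idea concrete, below is a minimal, self-contained Python sketch of a penalty-based GA for an application-assignment problem of this general shape. The power model, capacities, weights and penalty value are illustrative assumptions, not the paper's actual formulation.

import random

CPU = [0.4, 0.6, 0.3, 0.8, 0.2]   # CPU demand of each application profile (toy data)
N_SERVERS = 3
CAPACITY = 1.0                     # per-server CPU capacity
PENALTY = 1000.0                   # fitness penalty per unit of overload

def fitness(assign):
    """Lower is better: power drawn minus a utilization reward, plus penalty."""
    load = [0.0] * N_SERVERS
    for app, srv in enumerate(assign):
        load[srv] += CPU[app]
    active = [l for l in load if l > 0]
    power = sum(100.0 + 150.0 * l for l in active)    # idle + load-proportional power
    util = sum(active) / len(active)                   # mean utilization of active servers
    overload = sum(max(0.0, l - CAPACITY) for l in load)
    return power - 50.0 * util + PENALTY * overload

def evolve(pop_size=40, gens=200, p_mut=0.2):
    pop = [[random.randrange(N_SERVERS) for _ in CPU] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]                 # elitist truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(CPU))
            child = a[:cut] + b[cut:]                  # one-point crossover
            if random.random() < p_mut:                # mutation: reassign one app
                child[random.randrange(len(CPU))] = random.randrange(N_SERVERS)
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
print(best, round(fitness(best), 1))

The penalty term steers the search away from over-committed servers without discarding infeasible intermediate solutions outright, which is the usual motivation for penalty-based fitness functions.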
Abstract:
This project was a step forward in introducing suitable cooperative diversity transmission techniques for vehicle-to-vehicle communications. The contributions are intended to aid in the successful implementation of future vehicular safety and autonomous control systems. Several protocols were introduced for vehicles to communicate effectively without losing connectivity. This study investigated novel protocols in terms of the diversity-multiplexing trade-off and outage performance for a range of potential vehicular safety and infotainment applications.
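For reference, the diversity-multiplexing trade-off used in analyses of this kind is the standard notion of Zheng and Tse: for a scheme operating at multiplexing gain r, i.e. with rate R(SNR) = r log SNR, the diversity gain is

\[
d(r) \;=\; -\lim_{\mathrm{SNR}\to\infty}\frac{\log P_{\mathrm{out}}(r\log\mathrm{SNR})}{\log\mathrm{SNR}},
\]

so a protocol achieving a larger d(r) at a given multiplexing gain drives its outage probability to zero faster as the SNR grows.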
Abstract:
Metabolic imaging using positron emission tomography (PET) has found increasing clinical use for the management of infiltrating tumours such as glioma. However, the heterogeneous biological nature of tumours and intrinsic treatment resistance in some regions means that knowledge of multiple biological factors is needed for effective treatment planning: for example, 18F-FDOPA can be used to identify infiltrative tumour and 18F-FMISO to localize hypoxic regions. Performing multiple PET acquisitions is impractical in many clinical settings, but previous studies suggest multiplexed PET imaging could be viable. The fidelity of the two signals is affected by the injection interval, scan timing and injected dose. The contribution of this work is a framework that explicitly trades off signal fidelity against logistical constraints when designing the imaging protocol. The particular case of estimating 18F-FMISO from a single frame prior to injection of 18F-FDOPA is considered. Theoretical experiments using simulations for typical biological scenarios in humans demonstrate that results comparable to a pair of single-tracer acquisitions can be obtained provided protocol timings are carefully selected. These results were validated using a pre-clinical data set that was synthetically multiplexed. The results indicate that the dual acquisition of 18F-FMISO and 18F-FDOPA could be feasible in the clinical setting. The proposed framework could also be used to design protocols for other tracers.
Abstract:
We investigate the terminating BKZ concept first introduced by Hanrot et al. [Crypto'11] and run extensive experiments to predict the number of tours necessary to obtain the best possible trade-off between reduction time and quality. Then, we improve Buchmann and Lindner's result [Indocrypt'09] to find sub-lattice collisions in SWIFFT. We illustrate that further improvement in time is possible through special settings of the SWIFFT parameters and through adaptively combining different reduction parameters. Our contributions also include a probabilistic simulation approach, built on top of the deterministic simulation described by Chen and Nguyen [Asiacrypt'11], that can predict the Gram-Schmidt norms more accurately for large block sizes.
Abstract:
In this paper we analyse two variants of the SIMON family of lightweight block ciphers against variants of linear cryptanalysis and present the best linear cryptanalytic results on these reduced-round SIMON variants to date. We propose a time-memory trade-off method that finds differential/linear trails for any permutation that allows low-Hamming-weight differential/linear trails. Our method combines low-Hamming-weight trails found via the correlation matrix representing the target permutation with heavy-Hamming-weight trails found using a Mixed Integer Programming model representing the target differential/linear trail. Our method enables us to find a 17-round linear approximation for SIMON-48, which is the best current linear approximation for SIMON-48. Using only the correlation matrix method, we are able to find a 14-round linear approximation for SIMON-32, which is also the current best linear approximation for SIMON-32. The presented linear approximations allow us to mount a 23-round key-recovery attack on SIMON-32 and a 24-round key-recovery attack on SIMON-48/96, which are the current best results on SIMON-32 and SIMON-48. In addition, we present an attack on 24 rounds of SIMON-32 with marginal complexity.
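For context, the permutation analysed here is built from the SIMON round function, which per the published SIMON specification can be sketched in a few lines of Python (key schedule omitted; SIMON-32 operates on two 16-bit words per block):

MASK = 0xFFFF  # 16-bit words (SIMON-32)

def rol(x, r):
    """Rotate a 16-bit word x left by r bits."""
    return ((x << r) | (x >> (16 - r))) & MASK

def simon_round(x, y, k):
    """One Feistel round: (x, y) -> (y ^ f(x) ^ k, x), with f(x) = (x<<<1 & x<<<8) ^ x<<<2."""
    fx = (rol(x, 1) & rol(x, 8)) ^ rol(x, 2)
    return (y ^ fx ^ k) & MASK, x

def encrypt(x, y, round_keys):
    for k in round_keys:
        x, y = simon_round(x, y, k)
    return x, y

print(encrypt(0x1234, 0x5678, [0x0001] * 32))  # toy round keys for illustration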
Abstract:
The appropriate frequency and precision for surveys of wildlife populations represent a trade-off between survey cost and the risk of making suboptimal management decisions because of poor survey data. The commercial harvest of kangaroos is primarily regulated through annual quotas set as proportions of absolute estimates of population size. Stochastic models were used to explore the effects of varying precision, survey frequency and harvest rate on the risk of quasiextinction for an arid-zone and a more mesic-zone kangaroo population. Quasiextinction probability increases in a sigmoidal fashion as survey frequency is reduced. The risk is greater in more arid regions and is highly sensitive to harvest rate. An appropriate management regime involves regular surveys in the major harvest areas, where harvest rate can be set close to the maximum sustained yield. Outside these areas, survey frequency can be reduced in relatively mesic areas, and it can also be reduced in arid regions when combined with lowered harvest rates. Relative to other factors, quasiextinction risk is only affected by survey precision (standard error/mean × 100) when it exceeds 50%, partly reflecting the safety of the strategy of harvesting a proportion of a population estimate.
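The qualitative pattern can be reproduced with a toy Monte Carlo model of this kind. The sketch below (all parameter values are illustrative assumptions, not the study's) simulates logistic growth with environmental noise, a proportional quota set from a noisy survey estimate, and quasiextinction below a fixed threshold:

import random

def quasiextinction_risk(harvest_rate, survey_interval, cv,
                         n_years=50, n_runs=2000,
                         r=0.3, K=1.0, sigma_env=0.3, threshold=0.1):
    """Fraction of runs in which the population falls below threshold*K."""
    extinct = 0
    for _ in range(n_runs):
        n, quota = K / 2, 0.0
        for year in range(n_years):
            if year % survey_interval == 0:             # survey year
                estimate = max(0.0, random.gauss(n, cv * n))
                quota = harvest_rate * estimate          # quota fixed until next survey
            n = max(0.0, n - quota)                      # harvest
            growth = r * n * (1 - n / K)                 # logistic growth
            n = max(0.0, n + growth + random.gauss(0, sigma_env) * n)
            if n < threshold * K:
                extinct += 1
                break
    return extinct / n_runs

print(quasiextinction_risk(harvest_rate=0.15, survey_interval=1, cv=0.2))
print(quasiextinction_risk(harvest_rate=0.15, survey_interval=5, cv=0.2))

Comparing the two calls shows how lengthening the survey interval raises quasiextinction risk at a fixed harvest rate.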
Abstract:
To remain competitive, many agricultural systems are now being run along business lines. Systems methodologies are being incorporated, and here evolutionary computation is a valuable tool for identifying more profitable or sustainable solutions. However, agricultural models typically pose some of the more challenging problems for optimisation. This chapter outlines these problems and then presents a series of three case studies demonstrating how they can be overcome in practice. Firstly, increasingly complex models of Australian livestock enterprises show that evolutionary computation is the only viable optimisation method for these large and difficult problems. Ongoing research is taking a notably efficient and robust variant, differential evolution, out into real-world systems. Next, models of cropping systems in Australia demonstrate the challenge of dealing with competing objectives, namely maximising farm profit whilst minimising resource degradation. Pareto methods are used to illustrate this trade-off, and these results have proved to be most useful for farm managers in this industry. Finally, land-use planning in the Netherlands demonstrates the size and spatial complexity of real-world problems. Here, GIS-based optimisation techniques are integrated with Pareto methods, producing better solutions that were acceptable to the competing organisations. These three studies all show that evolutionary computation remains the only feasible method for the optimisation of large, complex agricultural problems. An extra benefit is that the resultant population of candidate solutions illustrates trade-offs, and this leads to more informed discussions and better education of the industry decision-makers.
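As a reference point for the variant mentioned above, here is a minimal sketch of classic differential evolution (DE/rand/1/bin). The objective is a stand-in; a real application would call the farm-system model instead.

import random

def sphere(x):                       # placeholder objective (minimize)
    return sum(v * v for v in x)

def differential_evolution(f, dim=5, pop_size=30, F=0.8, CR=0.9, gens=200,
                           lo=-5.0, hi=5.0):
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
            j_rand = random.randrange(dim)
            trial = [a[j] + F * (b[j] - c[j])          # mutation: rand/1
                     if (random.random() < CR or j == j_rand) else pop[i][j]
                     for j in range(dim)]              # binomial crossover
            if f(trial) <= f(pop[i]):                  # greedy one-to-one selection
                pop[i] = trial
    return min(pop, key=f)

best = differential_evolution(sphere)
print(best, sphere(best))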
Abstract:
The notion of being sure that you have completely eradicated an invasive species is fanciful because of imperfect detection and persistent seed banks. Eradication is commonly declared either on an ad hoc basis, on notions of seed bank longevity, or by setting arbitrary thresholds of 1% or 5% confidence that the species is not present. Rather than declaring eradication at some arbitrary level of confidence, we take an economic approach in which we stop looking when the expected costs outweigh the expected benefits. We develop theory that determines the number of years of surveys without detection required to minimize the net expected cost. Given that detection of a species is imperfect, the optimal stopping time is a trade-off between the cost of continued surveying and the cost of escape and damage if eradication is declared too soon. A simple rule of thumb compares well to the exact optimal solution obtained by stochastic dynamic programming. Application of the approach to the eradication programme for Helenium amarum reveals that the actual stopping time was a precautionary one given the ranges for each parameter.
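A back-of-the-envelope version of this stopping rule is easy to write down. The sketch below (the Bayesian update and all parameter values are illustrative assumptions, not the paper's model) trades cumulative survey cost against the expected damage of declaring eradication while the species persists:

def posterior_present(t, p0=0.5, detect=0.6):
    """P(species still present | t consecutive surveys with no detection)."""
    miss = (1 - detect) ** t
    return p0 * miss / (p0 * miss + (1 - p0))

def net_expected_cost(t, survey_cost=2_000.0, damage=500_000.0):
    """Total survey spend plus expected escape damage if we stop at year t."""
    return t * survey_cost + posterior_present(t) * damage

best_t = min(range(0, 30), key=net_expected_cost)
print(best_t, round(net_expected_cost(best_t)))

The optimum lengthens as the damage term grows or surveys get cheaper, matching the qualitative trade-off described above.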
Abstract:
Data-flow analysis is an integral part of any aggressive optimizing compiler. We propose a framework for improving the precision of data-flow analysis in the presence of complex control flow. We initially perform data-flow analysis to determine those control-flow merges which cause the loss in data-flow analysis precision. The control-flow graph of the program is then restructured such that performing data-flow analysis on the resulting restructured graph gives more precise results. The proposed framework is both simple, involving the familiar notion of product automata, and general, since it is applicable to any forward data-flow analysis. Apart from proving that our restructuring process is correct, we also show that restructuring is effective in that it necessarily leads to more optimization opportunities. Furthermore, the framework handles the trade-off between the increase in data-flow precision and the code-size increase inherent in the restructuring. We show that determining an optimal restructuring is NP-hard, and propose and evaluate a greedy strategy. The framework has been implemented in the Scale research compiler and instantiated for the specific problem of constant propagation. On the SPECINT 2000 benchmark suite we observe an average speedup of 4% in running times over the Wegman-Zadeck conditional constant propagation algorithm and 2% over a purely path-profile-guided approach.
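A toy example of the precision loss that such restructuring removes, using the standard constant-propagation lattice (the encoding below is an assumption for illustration):

TOP, BOTTOM = "T", "B"   # T: no information yet; B: not a constant

def meet(a, b):
    """Meet of two constant-propagation lattice values at a control-flow merge."""
    if a == TOP:
        return b
    if b == TOP:
        return a
    return a if a == b else BOTTOM

# At a merge of two paths where x = 1 on one path and x = 2 on the other:
print(meet(1, 2))                    # B -> x is no longer a known constant
# After duplicating the merge block per incoming path, each copy keeps its constant:
print(meet(TOP, 1), meet(TOP, 2))    # 1 2

Duplicating the merge block is exactly the code-size cost the framework weighs against the recovered precision.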
Abstract:
Various intrusion detection systems (IDSs) reported in the literature have shown distinct preferences for detecting a certain class of attack with improved accuracy, while performing moderately on the other classes. In view of the enormous computing power available in present-day processors, deploying multiple IDSs in the same network to obtain best-of-breed solutions has been attempted earlier. The paper presented here addresses the problem of optimizing the performance of IDSs using sensor fusion with multiple sensors. The trade-off between the detection rate and false alarms with multiple sensors is highlighted. It is illustrated that the performance of the detector is better when the fusion threshold is determined according to the Chebyshev inequality. In the proposed data-dependent decision (DD) fusion method, the performance optimization of individual IDSs is first addressed. A neural network supervised learner has been designed to determine the weights of individual IDSs depending on their reliability in detecting a certain attack. The final stage of this DD fusion architecture is a sensor fusion unit which performs the weighted aggregation in order to make an appropriate decision. This paper theoretically models the fusion of IDSs for the purpose of demonstrating the improvement in performance, supplemented with empirical evaluation.
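The Chebyshev-based threshold can be sketched directly: for any score distribution with benign-traffic mean mu and standard deviation sigma, P(score >= mu + k*sigma) <= 1/k^2, so choosing k = 1/sqrt(alpha) bounds the false-alarm rate by alpha regardless of the distribution. The names and alert logic below are illustrative assumptions, not the paper's architecture:

import math

def chebyshev_threshold(mu, sigma, false_alarm_bound):
    """Smallest threshold guaranteeing the given distribution-free false-alarm bound."""
    k = 1.0 / math.sqrt(false_alarm_bound)
    return mu + k * sigma

def fuse(scores, weights):
    """Weighted aggregation of per-IDS scores (weights would come from the learner)."""
    return sum(w * s for w, s in zip(weights, scores))

threshold = chebyshev_threshold(mu=0.2, sigma=0.1, false_alarm_bound=0.01)
alert = fuse([0.9, 0.7, 0.4], [0.5, 0.3, 0.2]) >= threshold
print(round(threshold, 2), alert)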
Abstract:
Using an analysis-by-synthesis (AbS) approach, we develop a soft-decision-based switched vector quantization (VQ) method for high-quality and low-complexity coding of wideband speech line spectral frequency (LSF) parameters. For each switching region, a low-complexity transform domain split VQ (TrSVQ) is designed. The overall rate-distortion (R/D) performance optimality of the new switched quantizer is addressed in the Gaussian mixture model (GMM) based parametric framework. In the AbS approach, the reduction of quantization complexity is achieved through the use of nearest neighbor (NN) TrSVQs and splitting the transform domain vector into a higher number of subvectors. Compared to current LSF quantization methods, the new method is shown to provide a competitive or better trade-off between R/D performance and complexity.
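A minimal sketch of the nearest-neighbour split-VQ step (toy codebooks; the transform and the soft-decision switching are omitted): each subvector is quantized independently, so search cost scales with the small subvector codebooks rather than one full-vector codebook.

def quantize_split(vec, codebooks):
    """Quantize each subvector to its nearest codeword under squared error."""
    out, i = [], 0
    for cb in codebooks:
        dim = len(cb[0])
        sub = vec[i:i + dim]
        best = min(cb, key=lambda cw: sum((a - b) ** 2 for a, b in zip(sub, cw)))
        out.extend(best)
        i += dim
    return out

codebooks = [
    [(0.0, 0.0), (1.0, 1.0)],   # codebook for subvector 1 (2-dimensional)
    [(0.5,), (2.0,)],           # codebook for subvector 2 (1-dimensional)
]
print(quantize_split([0.9, 1.2, 0.4], codebooks))   # -> [1.0, 1.0, 0.5]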
Abstract:
Reef-building corals are an example of plastic photosynthetic organisms that occupy environments of high spatiotemporal variation in incident irradiance. Many phototrophs use a range of photoacclimatory mechanisms to optimize light levels reaching the photosynthetic units within the cells. In this study, we set out to determine whether phenotypic plasticity in branching corals across light habitats optimizes potential light utilization and photosynthesis. To do this, we mapped incident light levels across coral surfaces in branching corals and measured the photosynthetic capacity across various within-colony surfaces. Based on the field data and the modelled frequency distribution of within-colony surface light levels, our results show that branching corals are substantially self-shaded at both 5 and 18 m, and the modal light level for the within-colony surface is 50 µmol photons m⁻² s⁻¹. Light profiles across different locations showed that the lowest attenuation at both depths was found on the inner surface of the outermost branches, while the most self-shaded surface was the bottom side of these branches. In contrast, vertically extended branches in the central part of the colony showed no differences between the sides of branches. The photosynthetic activity at these coral surfaces confirmed that the outermost branches had the greatest change in sun- and shade-adapted surfaces; the inner surfaces had a 50% greater relative maximum electron transport rate compared to the outer side of the outermost branches. This was further confirmed by sensitivity analysis, showing that branch position was the most influential parameter in estimating whole-colony relative electron transport rate (rETR). As a whole, shallow colonies have double the photosynthetic capacity of deep colonies. In terms of phenotypic plasticity potentially optimizing photosynthetic capacity, we found that at 18 m the present colony morphology increased the whole-colony rETR, while at 5 m the colony morphology decreased potential light utilization and photosynthetic output. This underutilization of potential energy acquisition in shallow, highly lit waters may represent a trade-off between optimizing light capture and reducing light damage, as the shallow-water morphology can perhaps decrease the long-term costs and effects of photoinhibition. This may be an important strategy, as opposed to adopting a morphology that yields an overall higher energy acquisition. Conversely, it could also be that maximizing light utilization and potential photosynthetic output is more important in low-light habitats for Acropora humilis.
Abstract:
In irrigated cropping, as with any other industry, profit and risk are inter-dependent. An increase in profit would normally coincide with an increase in risk, and this means that risk can be traded for profit. It is desirable to manage a farm so that it achieves the maximum possible profit for the desired level of risk. This paper identifies risk-efficient cropping strategies that allocate land and water between crop enterprises for a case study of an irrigated farm in Southern Queensland, Australia. This is achieved by applying stochastic frontier analysis to the output of a simulation experiment. The simulation experiment involved changes to the levels of business risk by systematically varying the crop sowing rules in a bioeconomic model of the case study farm. This model utilises the multi-field capability of the process-based Agricultural Production Systems Simulator (APSIM) and is parameterised using data collected from interviews with a collaborating farmer. We found that sowing rules which increased the farm area sown to cotton caused the greatest increase in risk-efficiency. Increasing maize area also improved risk-efficiency, but to a lesser extent than cotton. Sowing rules that increased the area sown to wheat reduced the risk-efficiency of the farm business. Sowing rules were identified that had the potential to improve expected farm profit by ca. $50,000 annually without significantly increasing risk. The concept of the shadow price of risk is discussed, and an expression quantifying the trade-off between profit and risk is derived from the estimated frontier equation.
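To illustrate the kind of expression involved: if the estimated frontier relating expected profit to the level of risk sigma were quadratic (an assumption here for illustration, not necessarily the paper's functional form), the shadow price of risk would be its slope:

\[
E[\pi] \;=\; \beta_0 + \beta_1\sigma + \beta_2\sigma^2
\quad\Longrightarrow\quad
\frac{\partial E[\pi]}{\partial \sigma} \;=\; \beta_1 + 2\beta_2\sigma ,
\]

i.e. the marginal expected profit gained per unit of additional risk accepted at the current operating point.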