266 results for standard batch algorithms


Relevance:

20.00%

Abstract:

High-speed broadband internet access is widely recognised as a catalyst for social and economic development, with a significant impact on the global economy. Rural Australia’s population, dispersed over a large geographical area, makes the delivery of efficient, well-maintained and cost-effective internet access a challenging task. The novel and highly efficient Multi-User-Single-Antenna for MIMO (MUSA-MIMO) broadband wireless communication technology can effectively be used to deliver wireless broadband access to rural areas. This research aims to develop, for the first time, an efficient and accurate algorithm for tracking and predicting Channel State Information (CSI) at the transmitter, by characterising the effects of the wireless channel’s time variation on the performance of MUSA-MIMO technology. The technology is particularly suited to rural communities, where it can improve quality of life and economic prosperity.
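The abstract does not specify the tracking algorithm, but a scalar Kalman filter over a first-order Gauss-Markov fading model is a common baseline for CSI tracking and prediction. The sketch below is a minimal illustration under that assumed model; the state gain, noise variances and pilot model are illustrative values, not parameters from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed first-order Gauss-Markov model for a single complex channel tap:
#   h[k+1] = a*h[k] + w[k], with stationary unit power when q = 1 - a^2.
# 'a' (set by the Doppler rate), the noise variances and the pilot model
# are assumptions for this sketch.
a = 0.98
q = 1 - a**2        # process noise variance
r = 0.05            # pilot observation noise variance

def cn(scale):
    """Zero-mean circularly symmetric complex Gaussian sample."""
    return np.sqrt(scale / 2) * (rng.standard_normal() + 1j * rng.standard_normal())

h = cn(1.0)                  # true channel tap
h_hat, p = 0.0 + 0.0j, 1.0   # Kalman estimate and its error variance

for k in range(200):
    h = a * h + cn(q)        # channel evolves
    y = h + cn(r)            # noisy pilot observation

    h_pred, p_pred = a * h_hat, a**2 * p + q   # one-step CSI prediction
    gain = p_pred / (p_pred + r)
    h_hat = h_pred + gain * (y - h_pred)       # correct with the new pilot
    p = (1 - gain) * p_pred

print(f"final tracking error |h - h_hat| = {abs(h - h_hat):.3f}")
```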

Relevance:

20.00%

Abstract:

This research explores the empirical association between takeover bid premium and acquired (purchased) goodwill, and tests whether the strength of the association changed after the passage of approved accounting standard AASB 1013 in Australia in 1988. AASB 1013 mandated the capitalization of acquired goodwill and its amortization to the income statement over a maximum period of 20 years. We use regressions to assess how the association between bid premium and acquired goodwill varies between the pre-AASB 1013 and post-AASB 1013 periods after controlling for confounding factors. Our results show that reducing the variety of accounting policy options available to bidder management after an acquisition results in a systematic reduction in the strength of the association between premium and goodwill.
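As a rough illustration of the kind of pre/post regression design described (the abstract does not give the exact specification), an interaction model could look like the following sketch. The variable names, the synthetic data and the control variable are hypothetical, not the study's.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400

# Synthetic stand-in data; variable names are hypothetical.
df = pd.DataFrame({
    "goodwill": rng.gamma(2.0, 1.0, n),
    "post_aasb": rng.integers(0, 2, n),    # 1 = deal after AASB 1013 (1988)
    "bidder_size": rng.normal(10, 2, n),   # stand-in control variable
})
df["premium"] = (0.8 * df.goodwill
                 - 0.4 * df.goodwill * df.post_aasb   # weaker association post-AASB
                 + 0.1 * df.bidder_size
                 + rng.normal(0, 1, n))

# The goodwill:post_aasb interaction captures the change in the
# premium-goodwill association between the two regimes.
model = smf.ols("premium ~ goodwill * post_aasb + bidder_size", data=df).fit()
print(model.params)
```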

Relevance:

20.00%

Abstract:

This approach to sustainable design explores the possibility of creating an architectural design process that can iteratively produce optimised and sustainable design solutions. Driven by an evolutionary process based on genetic algorithms, the system allows the designer to “design the building design generator” rather than to “design the building”. The design concept is abstracted into a digital design schema, which allows transfer of the human creative vision into the rational language of a computer. The schema is then elaborated through genetic algorithms to evolve innovative, performative and sustainable design solutions. The prioritisation of the project’s constraints and the design solutions subsequently synthesised during design generation are expected to resolve most of the major conflicts in the evaluation and optimisation phases. Mosques are used as the example building typology to ground the research activity. The spatial organisations of various mosque typologies are represented graphically by adjacency constraints between spaces. Each configuration is represented by a planar graph, which is then translated into a non-orthogonal dual graph and fed into the genetic algorithm system, with fixed constraints and expected performance criteria set to govern evolution. The resultant Hierarchical Evolutionary Algorithmic Design System is developed by linking the evaluation process with environmental assessment tools to rank the candidate designs. The proposed system generates the concept, the seed, and the schema, with environmental performance as one of the main criteria driving optimisation.
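A minimal sketch of the evolutionary loop described above, assuming a toy encoding in which rooms are placed on a grid and fitness counts satisfied adjacency constraints. The room names, grid size and GA settings are illustrative; the real system's planar/dual-graph encoding and environmental assessment tools are not modelled here.

```python
import random

random.seed(1)

# Toy encoding: each room occupies one cell of a 3x3 grid.
ROOMS = ["prayer_hall", "courtyard", "entrance", "ablution"]
REQUIRED_ADJACENT = [("entrance", "courtyard"),
                     ("courtyard", "prayer_hall"),
                     ("ablution", "entrance")]

def fitness(layout):
    """Number of required adjacencies realised as orthogonally touching cells."""
    score = 0
    for a, b in REQUIRED_ADJACENT:
        (ax, ay), (bx, by) = layout[a], layout[b]
        score += (abs(ax - bx) + abs(ay - by) == 1)
    return score

def random_layout():
    cells = random.sample([(x, y) for x in range(3) for y in range(3)], len(ROOMS))
    return dict(zip(ROOMS, cells))

def mutate(layout):
    """Move one room to a random cell (collisions ignored in this toy version)."""
    child = dict(layout)
    child[random.choice(ROOMS)] = (random.randrange(3), random.randrange(3))
    return child

population = [random_layout() for _ in range(30)]
for generation in range(100):
    population.sort(key=fitness, reverse=True)
    elites = population[:10]                     # keep the best candidates
    population = elites + [mutate(random.choice(elites)) for _ in range(20)]

best = max(population, key=fitness)
print(fitness(best), best)
```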

Relevance:

20.00%

Abstract:

Following the High Court decision in Cook v Cook (1986) 162 CLR 376, a person who voluntarily undertook to instruct a learner driver of a motor vehicle was owed a lower standard of care than that owed to other road users. The standard of care was still expressed to be objective; however, it took into account the inexperience of the learner driver. A person instructing a learner driver was therefore owed a duty of care whose standard was that of a reasonable learner driver. This ‘special relationship’ was said to exist because of the passenger’s knowledge of the driver’s inexperience and lack of skill. On 28 August 2008 the High Court handed down its decision in Imbree v McNeilly [2008] HCA 40, overruling Cook v Cook.

Relevance:

20.00%

Abstract:

This thesis addresses computational challenges arising from Bayesian analysis of complex real-world problems. Many of the models and algorithms designed for such analysis are ‘hybrid’ in nature: they are compositions of components whose individual properties may be easily described, but whose performance as a whole is less well understood. The aim of this research project is to offer a better understanding of the performance of hybrid models and algorithms; the goal of this thesis is to analyse their computational aspects in the Bayesian context.

The first objective of the research focuses on computational aspects of hybrid models, notably a continuous finite mixture of t-distributions. In the mixture model, an inference of interest is the number of components, as this may relate to both the quality of model fit to the data and the computational workload. The analysis of t-mixtures using Markov chain Monte Carlo (MCMC) is described, and the model is compared to the Normal case on goodness of fit. Through simulation studies, it is demonstrated that the t-mixture model can be more flexible and more parsimonious in terms of the number of components, particularly for skewed and heavy-tailed data. The study also reveals important computational issues associated with the use of t-mixtures, which have not been adequately considered in the literature.

The second objective of the research focuses on computational aspects of hybrid algorithms for Bayesian analysis. Two approaches are considered: a formal comparison of the performance of a range of hybrid algorithms, and a theoretical investigation of the performance of one of these algorithms in high dimensions. For the first approach, the delayed rejection algorithm, the pinball sampler, the Metropolis-adjusted Langevin algorithm, and the hybrid version of the population Monte Carlo (PMC) algorithm are selected as examples of hybrid algorithms. The statistical literature often treats statistical efficiency as the only criterion for an efficient algorithm; in this thesis the algorithms are also considered and compared from a more practical perspective. This extends to the study of how individual components contribute to the overall efficiency of a hybrid algorithm, and highlights weaknesses that may be introduced when these components are combined in a single algorithm.

The second approach involves an investigation of the performance of PMC in high dimensions. It is well known that as a model becomes more complex, computation may become increasingly difficult in real time; in particular, importance-sampling-based algorithms, including PMC, are known to be unstable in high dimensions. This thesis examines the PMC algorithm in a simplified setting, a single step of the general sampling scheme, and explores a fundamental problem that occurs when applying importance sampling to a high-dimensional problem. The precision of the computed estimate in this simplified setting is measured by its asymptotic variance under conditions on the importance function. The exponential growth of the asymptotic variance with dimension is demonstrated, and we illustrate that the optimal covariance matrix for the importance function can be estimated in a special case.
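The high-dimensional instability of importance sampling that the last part of the thesis investigates can be demonstrated in a few lines: for a Gaussian target and a slightly mismatched Gaussian proposal, the variance of the importance weights grows exponentially with dimension. The sketch below is a generic illustration of this phenomenon, not the thesis's experiment; the proposal scale and sample sizes are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def weight_variance(d, sigma=1.2, n=200_000):
    """Empirical variance of importance weights for a N(0, I_d) target
    sampled with a N(0, sigma^2 I_d) proposal (E[w] = 1 by construction)."""
    x = sigma * rng.standard_normal((n, d))
    s = (x ** 2).sum(axis=1)
    log_target = -0.5 * s                                # (2*pi) factors cancel
    log_proposal = -0.5 * s / sigma**2 - d * np.log(sigma)
    w = np.exp(log_target - log_proposal)
    return w.var()

# Variance of the weights (hence of the IS estimate) blows up with dimension.
for d in (1, 5, 10, 20, 40):
    print(f"d={d:3d}  var(w)={weight_variance(d):.3f}")
```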

Relevance:

20.00%

Abstract:

This paper presents a simple and intuitive approach to determining the kinematic parameters of a serial-link robot in Denavit–Hartenberg (DH) notation. Once a manipulator’s kinematics is parameterized in this form, a large body of standard algorithms and code implementations for kinematics, dynamics, motion planning, and simulation is available. The proposed method has two parts. The first is the “walk through”, a simple procedure that creates a string of elementary translations and rotations from the user-defined base coordinate frame to the end-effector. The second is an algebraic procedure that manipulates this string into a form that can be factorized as link transforms, which can be represented in standard or modified DH notation. The method allows for an arbitrary base and end-effector coordinate system as well as an arbitrary zero-joint-angle pose. The algebraic procedure is amenable to computer algebra manipulation, and a Java program is available as supplementary downloadable material.
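The factorization at the heart of the method can be checked numerically: a string of elementary rotations and translations collapses into the standard DH link transform A = Rz(theta) Tz(d) Tx(a) Rx(alpha). The sketch below verifies this identity for one link with numpy; it illustrates the underlying algebra only, not the paper's string-manipulation procedure.

```python
import numpy as np

def rotz(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1.0]])

def rotx(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[1, 0, 0, 0], [0, c, -s, 0], [0, s, c, 0], [0, 0, 0, 1.0]])

def transl(x, y, z):
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

def dh_standard(theta, d, a, alpha):
    """Closed-form standard DH link transform."""
    ct, st, ca, sa = np.cos(theta), np.sin(theta), np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [ 0,       sa,       ca,      d],
                     [ 0,        0,        0,    1.0]])

theta, d, a, alpha = 0.3, 0.1, 0.5, np.pi / 4
elementary_string = rotz(theta) @ transl(0, 0, d) @ transl(a, 0, 0) @ rotx(alpha)
assert np.allclose(elementary_string, dh_standard(theta, d, a, alpha))
print("elementary-transform string factorizes as the DH link transform")
```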

Relevance:

20.00%

Abstract:

In this paper, the optimal design of an active flow control device, the Shock Control Bump (SCB), on the suction and pressure sides of a transonic aerofoil is investigated with the aim of reducing total transonic drag. Two optimisation test cases are conducted using different advanced Evolutionary Algorithms (EAs). The first optimiser is the Hierarchical Asynchronous Parallel Evolutionary Algorithm (HAPMOEA), based on canonical Evolutionary Strategies (ES); the second is HAPMOEA hybridised with a well-known game strategy, the Nash game. Numerical results show that the SCB reduces drag by 30% when compared to the baseline design. In addition, the use of a Nash-game strategy as a pre-conditioner of global control saves up to 90% of the computational cost when compared to the first optimiser, HAPMOEA.
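As an illustration of the Nash-game idea of splitting the design variables between players, the toy sketch below alternates best responses on a simple quadratic objective until the players reach a Nash equilibrium. The objective and the variable split are invented for illustration and bear no relation to the SCB parameterisation or to HAPMOEA itself.

```python
import numpy as np

# Toy Nash decomposition: two "players" split the design vector and each
# minimises the shared objective over its own variable while the other's
# current value is held fixed.
def f(x):
    return (x[0] - 1) ** 2 + (x[1] + 2) ** 2 + 0.5 * x[0] * x[1]

x = np.zeros(2)
for _ in range(50):
    # Best responses in closed form (quadratic objective):
    x[0] = 1.0 - 0.25 * x[1]     # player 1: df/dx0 = 0 given x1
    x[1] = -2.0 - 0.25 * x[0]    # player 2: df/dx1 = 0 given x0
print("Nash equilibrium:", x, "objective:", f(x))
```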

Relevance:

20.00%

Abstract:

Minimizing the complexity of group key exchange (GKE) protocols is an important milestone towards their practical deployment. An interesting approach to this goal is to simplify the design of GKE protocols by using generic building blocks. In this paper we investigate the possibility of founding GKE protocols on a primitive called a multi key encapsulation mechanism (mKEM), and describe the advantages and limitations of this approach. In particular, we show how to design a one-round GKE protocol which satisfies the classical requirement of authenticated key exchange (AKE) security, yet without forward secrecy. As a result, we obtain the first one-round GKE protocol secure in the standard model. We also conduct our analysis using recent formal models that take into account both outsider and insider attacks, as well as the notion of key compromise impersonation resilience (KCIR). In contrast to previous models, we show how to model both outsider and insider KCIR within the definition of mutual authentication. Our analysis additionally implies that the insider security compiler by Katz and Shin from ACM CCS 2005 can be used to achieve more than what is shown in the original work, namely both outsider and insider KCIR.
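The interface shape of an mKEM, one encapsulation yielding a single session key plus one ciphertext per recipient, can be sketched as follows. The toy instantiation (X25519 Diffie-Hellman with an HKDF-derived XOR key wrap) is for illustration only; it is not a secure mKEM and is not the construction used in the paper.

```python
import os
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def kdf(secret):
    """Derive a 32-byte wrapping key from a DH shared secret."""
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"toy-mkem").derive(secret)

def m_encapsulate(recipient_pks):
    """One encapsulation: a single session key, one ciphertext per recipient."""
    session_key = os.urandom(32)
    eph = X25519PrivateKey.generate()
    cts = [bytes(a ^ b for a, b in zip(session_key, kdf(eph.exchange(pk))))
           for pk in recipient_pks]
    return eph.public_key(), cts, session_key

def m_decapsulate(sk, eph_pk, ct):
    """Recover the session key from the recipient's ciphertext."""
    return bytes(a ^ b for a, b in zip(ct, kdf(sk.exchange(eph_pk))))

# Three group members; in a one-round GKE each member would broadcast one
# such encapsulation and combine the recovered keys into the group key.
sks = [X25519PrivateKey.generate() for _ in range(3)]
eph_pk, cts, key = m_encapsulate([sk.public_key() for sk in sks])
assert all(m_decapsulate(sk, eph_pk, ct) == key for sk, ct in zip(sks, cts))
```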

Relevance:

20.00%

Abstract:

We give a direct construction of a certificateless key encapsulation mechanism (KEM) in the standard model that is more efficient than the generic constructions proposed before by Huang and Wong \cite{DBLP:conf/acisp/HuangW07}. We use a direct construction from Kiltz and Galindo's KEM scheme \cite{DBLP:conf/acisp/KiltzG06} to obtain a certificateless KEM in the standard model; our construction is roughly twice as efficient as the generic construction. We also address the security flaw discovered by Selvi et al. \cite{cryptoeprint:2009:462}.

Relevance:

20.00%

Abstract:

We show how to construct a certificateless key agreement protocol from the certificateless key encapsulation mechanism introduced by \cite{lippold-ICISC_2009} in ICISC 2009, using the \cite{DBLP:conf/acisp/BoydCNP08} protocol from ACISP 2008. We introduce the Canetti-Krawczyk (CK) model for certificateless cryptography, give security notions for Type I and Type II adversaries in the CK model, and highlight the differences from the existing e$^2$CK model discussed by \cite{DBLP:conf/pairing/LippoldBN09}. The resulting CK model is more relaxed than the original CK model, thus giving more power to the adversary.

Relevance:

20.00%

Abstract:

Aims: To develop clinical protocols for acquiring PET images, performing CT-PET registration and defining tumour volumes from the PET image data for radiotherapy of lung cancer patients, and then to test these protocols for accuracy and reproducibility.

Method: A phantom-based quality assurance study of the processes associated with using registered CT and PET scans for tumour volume definition was conducted to: (1) investigate image acquisition and manipulation techniques for registering and contouring CT and PET images in a radiotherapy treatment planning system, and (2) determine technology-based errors in the registration and contouring processes. The outcomes of this phantom-based study were used to determine clinical protocols. Protocols were developed for (1) acquiring patient PET image data for incorporation into the 3DCRT process, particularly for ensuring that the patient is positioned in their treatment position; (2) CT-PET image registration techniques; and (3) GTV definition using the PET image data. The developed clinical protocols were then tested in retrospective clinical trials to assess the levels of inter-user variability attributable to their use. A Siemens Somatom Open Sensation 20-slice CT scanner and a Philips Allegro stand-alone PET scanner were used to acquire the images for this research. The Philips Pinnacle3 treatment planning system was used to perform the image registration and contouring of the CT and PET images.

Results: Both the attenuation-corrected and transmission images obtained from standard whole-body PET staging clinical scanning protocols were acquired and imported into the treatment planning system for the phantom-based quality assurance study. Protocols for manipulating the PET images in the treatment planning system, particularly for quantifying uptake in volumes of interest and setting window levels for accurate geometric visualisation, were determined. The automatic registration algorithms were found to have sub-voxel levels of accuracy, with transmission scan-based CT-PET registration more accurate than emission scan-based registration of the phantom images. Respiration-induced image artifacts were not found to influence registration accuracy, while inadequate pre-registration overlap of the CT and PET images was found to result in large registration errors. A threshold value based on a percentage of the maximum uptake within a volume of interest was found to accurately contour the different features of the phantom despite the lower spatial resolution of the PET images. Appropriate selection of the threshold value is dependent on target-to-background ratios and the presence of respiratory motion. The results from the phantom-based study were used to design, implement and test clinical CT-PET fusion protocols. The patient PET image acquisition protocols enabled patients to be successfully identified and positioned in their radiotherapy treatment position during the acquisition of their whole-body PET staging scan. While automatic registration techniques were found to reduce inter-user variation compared to manual techniques, there was no significant difference in the registration outcomes for transmission or emission scan-based registration of the patient images using the protocol. Tumour volumes contoured on registered patient CT-PET images, using the tested threshold values and viewing windows determined from the phantom study, demonstrated less inter-user variation for the primary tumour volume contours than those contoured using only the patient’s planning CT scans.

Conclusions: The developed clinical protocols allow a patient’s whole-body PET staging scan to be incorporated, manipulated and quantified in the treatment planning process to improve the accuracy of gross tumour volume localisation in 3D conformal radiotherapy for lung cancer. Image registration protocols which account for potential software-based errors, combined with adequate user training, are recommended to increase the accuracy and reproducibility of registration outcomes. A semi-automated adaptive threshold contouring technique incorporating a PET windowing protocol accurately defines the geometric edge of a tumour volume using PET image data from a stand-alone PET scanner, including 4D target volumes.
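A percentage-of-maximum threshold contour of the kind described can be expressed in a few lines of numpy. The 40% fraction below is a commonly quoted illustrative starting point, not the validated value from this study, which, as the conclusions note, must be adapted to target-to-background ratio and respiratory motion; the synthetic sphere stands in for real PET data.

```python
import numpy as np

def threshold_contour(pet, roi, fraction=0.40):
    """Binary tumour mask: voxels inside the ROI whose uptake exceeds a
    fixed fraction of the maximum uptake within that ROI."""
    roi_max = pet[roi].max()
    return roi & (pet >= fraction * roi_max)

# Synthetic example: a bright Gaussian "sphere" on a dim background.
z, y, x = np.ogrid[:32, :32, :32]
pet = np.exp(-((x - 16) ** 2 + (y - 16) ** 2 + (z - 16) ** 2) / 30.0) + 0.05
roi = np.ones(pet.shape, dtype=bool)

mask = threshold_contour(pet, roi)
print(mask.sum(), "voxels above 40% of max uptake")
```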

Relevance:

20.00%

Abstract:

This paper examines the role of powerful entities and coalitions in shaping international accounting standards. Specifically, the focus is on the process by which the International Accounting Standards Board (IASB) developed IFRS 6, Exploration for and Evaluation of Mineral Resources. In its Issues Paper, the IASB recommended that the successful efforts method be mandated for pre-production costs, eliminating the choice previously available between the full cost and successful efforts methods. In spite of the endorsement of this view by a majority of the constituents who responded to the Issues Paper, the final outcome changed nothing, with choice being retained. A compelling explanation of this disparity between the visible inputs and outputs of the standard-setting process is the existence of a “black box”, in which powerful extractive industry entities and coalitions covertly influenced the IASB to secure their own ends and ensure that the status quo was maintained.