377 results for Derivation principle


Relevance: 10.00%

Abstract:

The purpose of this paper is to emphasise the significance of public asset management in Indonesia by identifying the opportunities and challenges facing Indonesian local governments in adopting current practice in public asset management systems. A case study of the South Sulawesi provincial government was used to achieve the research objective. The case study involved two data collection techniques: interviews, followed by document analysis. The results indicate that there are significant opportunities and challenges that Indonesian local governments may face in adopting current public asset management practice. The opportunities can lead to more effective and efficient local government, accountable and auditable local government organisations, an enlarged local government portfolio, and improved quality of public services. The challenges include the lack of a clear institutional and legal framework to support asset management, the non-profit character of public assets, cross-jurisdictional issues in public asset management, the complexity of local government objectives, and the unavailability of data for managing public property. The study covers only the condition of South Sulawesi Province, which may not represent the condition of all local governments in Indonesia. The findings provide useful input for policy makers, scholars and asset management practitioners seeking to establish a public asset management framework suitable for Indonesia.

Relevance: 10.00%

Abstract:

This paper describes an application of decoupled probabilistic world modeling to achieve team planning. The research is based on the principle that the action selection mechanism of a member of a robot team can select an effective action if a global world model is available to all team members. In the real world, sensors are imprecise and are individual to each robot, giving each robot a partial and unique view of the environment. We address this problem by creating a probabilistic global view on each agent by combining the perceptual information from every robot. This probabilistic view forms the basis for selecting actions to achieve the team goal in a dynamic environment. Experiments have been carried out to investigate the effectiveness of this principle using custom-built robots for real-world performance, in addition to extensive simulation results. The results show an improvement in team effectiveness when using probabilistic world modeling based on perception sharing for team planning.
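
The paper's fusion mechanism is not specified in this abstract; as a minimal sketch of the general idea, assuming each robot reports an independent Gaussian estimate of the same quantity, a shared belief can be formed by the standard product-of-Gaussians (inverse-variance weighting):

```python
# Illustrative sketch (our own, not the paper's algorithm): fusing two robots'
# noisy Gaussian estimates of an object's position into one shared belief.

def fuse(mean1, var1, mean2, var2):
    """Combine two independent Gaussian estimates of the same quantity."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    var = 1.0 / (w1 + w2)            # fused variance is always smaller
    mean = var * (w1 * mean1 + w2 * mean2)
    return mean, var

# Robot A sees the ball at x=2.0 (var 1.0); robot B sees it at x=4.0 (var 1.0).
m, v = fuse(2.0, 1.0, 4.0, 1.0)      # → (3.0, 0.5)
```

The fused variance is strictly smaller than either input variance, which is why sharing perceptions improves the global view each agent plans against.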

Relevance: 10.00%

Abstract:

The doctrine of 'prosecution history estoppel' (PH estoppel) as developed in the United States has strong intuitive appeal, especially when applied to counterbalance a related patent law principle, the doctrine of equivalents. The doctrines are receiving increasing attention in US patent decisions, to the point where one patent litigator recently compared them to "two cars that keep bumping fenders. They are frequently returned to the shop for repairs". Could PH estoppel find its way into UK patent law? This article briefly examines the doctrine, its evolution in the US and the problems associated with importing it into the UK. As the European legislation stands, Article 69 and the Protocol to the European Patent Convention (EPC) pose serious obstacles to using the doctrine directly in claim construction. However, there appears to be some scope to apply the doctrine as a limited form of defence in infringement actions.

Relevance: 10.00%

Abstract:

The global impact of an ever-increasing population base combined with dangerously depleted natural resources highlights the urgent need for changes in human lifestyles and land-use patterns. To achieve more equitable and sustainable land use, it is imperative that populations live within the carrying capacity of their natural assets, in a manner more accountable to and ethically responsible for the land which sustains them. Our society's very survival may well depend on worldwide acceptance of the carrying capacity imperative as a principle of personal, political, economic, educational and planning responsibility. This theoretically focused research identifies, examines and compares a range of methodological approaches to carrying capacity assessment and considers their relevance to future spatial planning. It also addresses gaps in current methodologies and suggests avenues for improvement. A set of eleven key criteria is employed to compare existing carrying capacity assessment models; these criteria include whole-systems analysis, dynamic responses, levels of impact and risk, systemic constraints, applicability to future planning, and the consideration of regional and local boundary delineation. This research finds that while some existing methodologies offer significant insights into the assessment of population carrying capacities, a comprehensive model is yet to be developed. However, it is suggested that by combining successful components from various authors, and collecting a range of interconnected data, a practical and workable systems-based model may be achievable in the future.

Relevance: 10.00%

Abstract:

This thesis is about the derivation of the addition law on an arbitrary elliptic curve and the efficient addition of points on such a curve using the derived addition law. The outcomes of this research guarantee practical speedups in higher-level operations which depend on point additions. In particular, the contributions immediately find applications in cryptology. First mastered by 19th-century mathematicians, the theory of elliptic curves has been studied actively for decades. Elliptic curves over finite fields made their way into public key cryptography in the late 1980s with independent proposals by Miller [Mil86] and Koblitz [Kob87]. Elliptic Curve Cryptography (ECC), following Miller's and Koblitz's proposals, employs the group of rational points on an elliptic curve in building discrete-logarithm-based public key cryptosystems. Starting from the late 1990s, the emergence of the ECC market has boosted research into the computational aspects of elliptic curves. This thesis falls into the same area of research, where the main aim is to speed up the addition of rational points on an arbitrary elliptic curve (over a field of large characteristic). The outcomes of this work can be used to speed up applications which are based on elliptic curves, including cryptographic applications in ECC.

The aforementioned goals are achieved in five main steps. As the first step, this thesis brings together several algebraic tools in order to derive the unique group law of an elliptic curve; this step also includes an investigation of recent computer algebra packages and their capabilities. Although the group law is unique, its evaluation can be performed using abundant (in fact infinitely many) formulae. As the second step, this thesis identifies the best formulae for efficient addition of points. In the third step, the group law is stated explicitly by handling all possible summands. The fourth step presents the algorithms to be used for efficient point additions. In the fifth and final step, optimised software implementations of the proposed algorithms are presented in order to show that the theoretical speedups of step four can be obtained in practice.

In each of the five steps, this thesis focuses on five forms of elliptic curves over finite fields of large characteristic, listed here with their defining equations:
(a) Short Weierstrass form: y² = x³ + ax + b,
(b) Extended Jacobi quartic form: y² = dx⁴ + 2ax² + 1,
(c) Twisted Hessian form: ax³ + y³ + 1 = dxy,
(d) Twisted Edwards form: ax² + y² = 1 + dx²y²,
(e) Twisted Jacobi intersection form: bs² + c² = 1, as² + d² = 1.
These forms are the most promising candidates for efficient computation and are thus considered in this work. Nevertheless, the methods employed in this thesis are capable of handling arbitrary elliptic curves.

From a high-level point of view, the following outcomes are achieved in this thesis.
- Related literature results are brought together and revisited. In most cases, several missed formulae, algorithms, and efficient point representations are discovered.
- Analogies are made among all studied forms. For instance, it is shown that two sets of affine addition formulae are sufficient to cover all possible affine inputs, as long as the output is also an affine point, in any of these forms. In the literature, many special cases, especially interactions with points at infinity, were omitted from discussion; this thesis handles all of the possibilities.
- Several new point doubling/addition formulae and algorithms are introduced which are more efficient than the existing alternatives in the literature. Most notably, the speeds of the extended Jacobi quartic, twisted Edwards, and Jacobi intersection forms are improved. New unified addition formulae are proposed for the short Weierstrass form, and new coordinate systems are studied for the first time.
- An optimised implementation is developed using a combination of generic x86-64 assembly instructions and plain C. The practical advantages of the proposed algorithms are supported by computer experiments.
- All formulae presented in the body of this thesis are checked for correctness using computer algebra scripts, together with details on register allocations.
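
For orientation, the affine group law on the short Weierstrass form can be sketched with the classical chord-and-tangent formulae. The toy Python below is our own illustration over a small prime field, not the thesis's optimised formulae:

```python
# Toy illustration: affine point addition on a short Weierstrass curve
# y^2 = x^3 + ax + b over GF(p), using the classical chord-and-tangent
# formulae. None represents the point at infinity (the group identity).

def ec_add(P, Q, a, p):
    if P is None:
        return Q
    if Q is None:
        return P
    x1, y1 = P
    x2, y2 = Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None  # P + (-P) = infinity (also covers doubling 2-torsion)
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p  # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
    x3 = (lam * lam - x1 - x2) % p
    y3 = (lam * (x1 - x3) - y1) % p
    return (x3, y3)

# Example on y^2 = x^3 + 2x + 3 over GF(97); P = (3, 6) lies on the curve
# since 6^2 = 36 = 3^3 + 2*3 + 3 (mod 97).
P = (3, 6)
Q = ec_add(P, P, 2, 97)  # doubling → (80, 10)
```

Each addition above costs one modular inversion, which is exactly what the projective coordinate systems studied in the thesis are designed to avoid.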

Relevance: 10.00%

Abstract:

Wide-angle images exhibit significant distortion for which existing scale-space detectors such as the scale-invariant feature transform (SIFT) are inappropriate. The required scale-space images for feature detection are correctly obtained through the convolution of the image, mapped to the sphere, with the spherical Gaussian. A new visual key-point detector, based on this principle, is developed and several computational approaches to the convolution are investigated in both the spatial and frequency domain. In particular, a close approximation is developed that has comparable computation time to conventional SIFT but with improved matching performance. Results are presented for monocular wide-angle outdoor image sequences obtained using fisheye and equiangular catadioptric cameras. We evaluate the overall matching performance (recall versus 1-precision) of these methods compared to conventional SIFT. We also demonstrate the use of the technique for variable frame-rate visual odometry and its application to place recognition.
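
The recall versus 1-precision figures used in the evaluation can be computed directly from match counts. A minimal sketch, assuming a ground-truth set of correspondences is available:

```python
# Minimal sketch (our own, not the paper's evaluation code): recall and
# 1-precision for a key-point matcher, from counts of correct matches,
# false matches, and the total number of ground-truth correspondences.

def recall_vs_1_precision(correct_matches, false_matches, total_correspondences):
    recall = correct_matches / total_correspondences
    one_minus_precision = false_matches / (correct_matches + false_matches)
    return recall, one_minus_precision

# 80 correct and 20 false matches out of 200 true correspondences:
r, fp = recall_vs_1_precision(80, 20, 200)  # → (0.4, 0.2)
```

Sweeping the matcher's acceptance threshold and plotting these two numbers against each other yields the curves used to compare detectors.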

Relevance: 10.00%

Abstract:

In this paper we describe the Large Margin Vector Quantization algorithm (LMVQ), which uses gradient ascent to maximise the margin of a radial basis function classifier. We present a derivation of the algorithm, which proceeds from an estimate of the class-conditional probability densities. We show that the key behaviours of Kohonen's well-known LVQ2 and LVQ3 algorithms emerge as natural consequences of our formulation. We compare the performance of LMVQ with that of Kohonen's LVQ algorithms on an artificial classification problem and several well-known benchmark classification tasks. We find that the classifiers produced by LMVQ attain a level of accuracy that compares well with that obtained via LVQ1, LVQ2 and LVQ3, with reduced storage complexity. We indicate future directions of enquiry based on the large margin approach to Learning Vector Quantization.
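
For context, Kohonen's LVQ2 window update, whose behaviour the paper shows emerging from the LMVQ formulation, can be sketched as follows. This is our own toy implementation of classical LVQ2, not the paper's LMVQ algorithm, and `window=0.3` is an illustrative choice:

```python
import math

def lvq2_update(protos, labels, x, y, lr=0.1, window=0.3):
    """Move the nearest correct prototype toward x and the nearest
    incorrect one away, but only when x falls inside a window around
    the decision border between the two nearest prototypes."""
    dists = [math.dist(p, x) for p in protos]
    order = sorted(range(len(protos)), key=dists.__getitem__)
    i, j = order[0], order[1]                 # two nearest prototypes
    if labels[i] == labels[j] or y not in (labels[i], labels[j]):
        return
    s = (1 - window) / (1 + window)
    if min(dists[i] / dists[j], dists[j] / dists[i]) < s:
        return                                # outside the border window
    correct, wrong = (i, j) if labels[i] == y else (j, i)
    for k in range(len(x)):
        protos[correct][k] += lr * (x[k] - protos[correct][k])
        protos[wrong][k] -= lr * (x[k] - protos[wrong][k])

# Two 1-D prototypes; a training point at 0.6 carrying prototype 1's label:
protos, labels = [[0.0], [1.0]], [0, 1]
lvq2_update(protos, labels, x=[0.6], y=1)
# prototype 1 is pulled toward 0.6, prototype 0 is pushed away
```

The window test is what distinguishes LVQ2 from LVQ1: only points near the decision border, where the two nearest prototypes disagree, trigger an update.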

Relevance: 10.00%

Abstract:

This thesis is devoted to the study of linear relationships in symmetric block ciphers. A block cipher is designed so that the ciphertext is produced as a nonlinear function of the plaintext and secret master key. However, linear relationships within the cipher can still exist if the texts and components of the cipher are manipulated in a number of ways, as shown in this thesis. There are four main contributions.

The first contribution is the extension of the applicability of integral attacks from word-based to bit-based block ciphers. Integral attacks exploit linear relationships between texts at intermediate stages of encryption; these relationships can be used to recover subkey bits in a key recovery attack. In principle, integral attacks can be applied to bit-based block ciphers, but specific tools to define the attack on these ciphers were not available. This problem is addressed by introducing a refined set of notations to describe the attack. The bit-pattern-based integral attack is successfully demonstrated on reduced-round variants of the block ciphers Noekeon, Present and Serpent.

The second contribution is the discovery of a very small system of equations that describes the LEX-AES stream cipher. LEX-AES is based heavily on the 128-bit-key (16-byte) Advanced Encryption Standard (AES) block cipher. In one instance, the system contains 21 equations and 17 unknown bytes, very close to the upper limit for an exhaustive key search, which is 16 bytes, and only 36 bytes of keystream are needed to generate the equations. The security of this cipher therefore depends on the difficulty of solving this small system of equations.

The third contribution is the proposal of an alternative method to measure diffusion in the linear transformation of Substitution-Permutation-Network (SPN) block ciphers. Currently, the branch number is widely used for this purpose. It is useful for estimating the possible success of differential and linear attacks on a particular SPN cipher, but it gives no information on the number of input bits that are left unchanged by the transformation when producing the output bits. The new measure introduced in this thesis is intended to complement the branch number. It is based on fixed points and simple linear relationships between the input and output words of the linear transformation, and represents the average fraction of input words to a linear diffusion transformation that are not effectively changed by the transformation. The measure is applied to the block ciphers AES, ARIA, Serpent and Present. It is shown that, except for Serpent, the linear transformations used in these block ciphers do not behave as expected for a random linear transformation.

The fourth contribution is the identification of linear paths in the nonlinear round function of the SMS4 block cipher. SMS4 is used as a standard in the Chinese WLAN Authentication and Privacy Infrastructure (WAPI), and hence its round function should exhibit a high level of nonlinearity. However, the findings in this thesis on the existence of linear relationships show that this is not the case: in some exceptional cases, the first four rounds of SMS4 are effectively linear, reducing the effective number of rounds from 32 to 28. The findings raise questions about the security provided by SMS4, and might provide clues on the existence of a flaw in the design of the cipher.
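
The thesis's exact measure is not reproduced here, but its core idea, the average fraction of input words a word-oriented linear layer leaves unchanged, can be sketched empirically on a toy transformation of our own construction (not one of the ciphers studied):

```python
# Toy sketch: estimate, by random sampling, the average fraction of input
# words that pass through a word-oriented linear transformation unchanged.
# A strong diffusion layer should leave almost no word fixed.

import random

def unchanged_fraction(transform, n_words, trials=2000, seed=0):
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        x = [rng.getrandbits(8) for _ in range(n_words)]   # random byte words
        y = transform(x)
        total += sum(1 for a, b in zip(x, y) if a == b)
    return total / (trials * n_words)

# A deliberately weak transformation that leaves word 0 untouched:
def weak_mix(x):
    return [x[0], x[0] ^ x[1], x[1] ^ x[2], x[2] ^ x[3]]

frac = unchanged_fraction(weak_mix, 4)  # about 0.25: word 0 is always fixed
```

For a random transformation on byte words the expected fraction is about 1/256 per word, so values far above that, as here, flag the kind of structural weakness the measure is designed to expose.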

Relevance: 10.00%

Abstract:

Scientific discoveries, developments in medicine and health issues are the constant focus of media attention, and the principles surrounding the creation of so-called 'saviour siblings' are no exception. Developments in the field of reproductive techniques have provided the ability to genetically analyse embryos created in the laboratory, enabling parents to implant selected embryos to create a tissue-matched child who may be able to cure an existing sick child. The research undertaken in this thesis examines the regulatory frameworks overseeing the delivery of assisted reproductive technologies (ART) in Australia and the United Kingdom and considers how those frameworks impact on the accessibility of in vitro fertilisation (IVF) procedures for the creation of 'saviour siblings'. In some jurisdictions, the accessibility of such techniques is limited by statutory requirements. The limitations and restrictions imposed by the state in relation to the technology are analysed in order to establish whether such restrictions are justified. The analysis is conducted on the basis of a harm framework, which seeks to establish whether those affected by the use of the technology (including the child who will be created) are harmed. In order to undertake this evaluation, the concept of harm is considered under the scope of John Stuart Mill's liberal theory, and the Harm Principle is used as a normative tool to judge whether the level of harm that may result justifies state intervention in, or restriction of, the reproductive decision-making of parents in this context. The harm analysis conducted in this thesis seeks to determine an appropriate regulatory response to the use of pre-implantation tissue-typing for the creation of 'saviour siblings'. The proposals outlined in the last part of this thesis seek to address the concern that harm may result from the practice of pre-implantation tissue-typing.
The current regulatory frameworks in place are also analysed on the basis of the harm framework established in this thesis. The material referred to in this thesis reflects the law and policy in place in Australia and the UK at the time the thesis was submitted for examination (December 2009).

Relevance: 10.00%

Abstract:

Stereo vision is a method of depth perception in which depth information is inferred from two (or more) images of a scene taken from different perspectives. Practical applications for stereo vision include aerial photogrammetry, autonomous vehicle guidance, robotics and industrial automation. The initial motivation behind this work was to produce a stereo vision sensor for mining automation applications, for which the input stereo images would consist of close-range scenes of rocks. A fundamental problem faced by matching algorithms is the matching or correspondence problem: locating corresponding points or features in two images. For this application, speed, reliability, and the ability to produce a dense depth map are of foremost importance. This work implemented a number of area-based matching algorithms to assess their suitability for this application. Area-based techniques were investigated because of their potential to yield dense depth maps, their amenability to fast hardware implementation, and their suitability to textured scenes such as rocks. In addition, two non-parametric transforms, the rank and census, were compared. Both transforms were found to improve the reliability of matching in the presence of radiometric distortion; this is significant, since radiometric distortion commonly arises in practice. They also have low computational complexity, making them amenable to fast hardware implementation. Matching algorithms using these transforms were therefore made the subject of the remainder of the thesis.

An analytic expression for the process of matching using the rank transform was derived from first principles. This work resulted in a number of important contributions. Firstly, the derivation process yielded a constraint which must be satisfied for a correct match, termed the rank constraint. The theoretical derivation of this constraint is in contrast to existing matching constraints, which have little theoretical basis. Experimental work with actual and contrived stereo pairs has shown that the new constraint is capable of resolving ambiguous matches, thereby improving match reliability. Secondly, a novel matching algorithm incorporating the rank constraint has been proposed. This algorithm was tested using a number of stereo pairs; in all cases, the modified algorithm consistently resulted in an increased proportion of correct matches. Finally, the rank constraint was used to devise a new method for identifying regions of an image where the rank transform, and hence matching, are more susceptible to noise. The rank constraint was also incorporated into a new hybrid matching algorithm, where it was combined with a number of other ideas, including the use of an image pyramid for match prediction and a method of edge localisation to improve match accuracy in the vicinity of edges. Experimental results obtained from the new algorithm showed that it is able to remove a large proportion of invalid matches and improve match accuracy.
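
The rank and census transforms themselves are simple to state. The following sketch is our own illustration, not the thesis code: both transforms of a flattened 3x3 window, plus the Hamming-distance cost typically paired with the census transform. Because both depend only on the ordering of intensities relative to the centre pixel, they are unaffected by a uniform brightness offset, which is the robustness to radiometric distortion noted above:

```python
# Rank and census transforms of a flattened square window (centre pixel at
# the middle index), as used in area-based stereo matching.

def rank_transform(window):
    """Number of pixels in the window less than the centre pixel."""
    centre = window[len(window) // 2]
    return sum(1 for v in window if v < centre)

def census_transform(window):
    """Bit string with a 1 wherever a neighbour is less than the centre."""
    centre = window[len(window) // 2]
    bits = 0
    for i, v in enumerate(window):
        if i == len(window) // 2:
            continue                      # skip the centre pixel itself
        bits = (bits << 1) | (1 if v < centre else 0)
    return bits

def hamming(a, b):
    """Matching cost between two census vectors: differing bit count."""
    return bin(a ^ b).count("1")

w = [1, 7, 3,
     9, 5, 2,
     6, 4, 8]               # flattened 3x3 window, centre value 5
r = rank_transform(w)       # → 4: the values 1, 3, 2 and 4 lie below 5
c = census_transform(w)     # → 0b10101010
```

Adding a constant to every pixel leaves both outputs unchanged, since only the comparisons against the centre matter.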

Relevance: 10.00%

Abstract:

The concept of radar was developed for the estimation of the distance (range) and velocity of a target from a receiver. The distance measurement is obtained by measuring the time taken for the transmitted signal to propagate to the target and return to the receiver. The target's velocity is determined by measuring the Doppler-induced frequency shift of the returned signal, caused by the rate of change of the time-delay from the target. As researchers further developed conventional radar systems, it became apparent that additional information was contained in the backscattered signal and that this information could in fact be used to describe the shape of the target itself. This is because a target can be considered to be a collection of individual point scatterers, each of which has its own velocity and time-delay. Delay-Doppler parameter estimation of each of these point scatterers thus corresponds to a mapping of the target's range and cross-range, producing an image of the target. Much research has been done in this area since the early radar imaging work of the 1960s. At present, radar imaging falls into two main categories. The first is the case where the backscattered signal is considered to be deterministic; the second is the case where the backscattered signal is of a stochastic nature. In both cases, the information which describes the target's scattering function is extracted by the use of the ambiguity function, a function which correlates the backscattered signal in time and frequency with the transmitted signal. In practical situations, it is often necessary to have the transmitter and the receiver of the radar system sited at different locations. The problem in these situations is that a reference signal must then be present in order to calculate the ambiguity function.

This causes an additional problem in that detailed phase information about the transmitted signal is then required at the receiver. It is this latter problem which has led to the investigation of radar imaging using time-frequency distributions. As will be shown in this thesis, the phase information about the transmitted signal can be extracted from the backscattered signal using time-frequency distributions. The principal aim of this thesis was the development, and subsequent discussion, of the theory of radar imaging using time-frequency distributions. Consideration is first given to the case where the target is diffuse, i.e. where the backscattered signal has temporal stationarity and a spatially white power spectral density. The complementary situation is also investigated, i.e. where the target is no longer diffuse, but some degree of correlation exists between the time-frequency points. Computer simulations are presented to demonstrate the concepts and theories developed in the thesis. For the proposed radar system to be practically realisable, both the time-frequency distributions and the associated algorithms developed must be able to be implemented in a timely manner. For this reason, an optical architecture is proposed. This architecture is specifically designed to obtain the required time and frequency resolution when using laser radar imaging. The complex light amplitude distributions produced by this architecture have been computer-simulated using an optical compiler.
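
The discrete narrowband ambiguity function referred to above can be sketched directly from its definition. This is our own minimal illustration (a chirp test signal, not the thesis's simulations): the magnitude surface |A(tau, nu)| peaks at zero delay and zero Doppler, where it equals the signal energy:

```python
# Discrete narrowband ambiguity function of a signal s:
#   A(tau, nu) = sum_t s[t] * conj(s[t - tau]) * exp(-2j*pi*nu*t),
# correlating the signal with itself in both delay (tau, in samples)
# and Doppler (nu, in cycles per sample).

import cmath

def ambiguity(s, tau, nu):
    n = len(s)
    total = 0j
    for t in range(n):
        if 0 <= t - tau < n:                      # overlap region only
            total += s[t] * s[t - tau].conjugate() * cmath.exp(-2j * cmath.pi * nu * t)
    return total

# A unit-magnitude linear-FM (chirp) pulse of 32 samples:
s = [cmath.exp(1j * cmath.pi * 0.05 * t * t) for t in range(32)]
peak = abs(ambiguity(s, 0, 0.0))  # the signal energy: 32 unit-magnitude samples
```

Away from the origin the magnitude falls off, and the shape of that fall-off (the ambiguity surface) is what determines the radar's joint range-Doppler resolution.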

Relevance: 10.00%

Abstract:

One approach to reducing the yield losses caused by banana viral diseases is the use of genetic engineering and pathogen-derived resistance strategies to generate resistant cultivars. The development of transgenic virus resistance requires an efficient banana transformation method, particularly for commercially important 'Cavendish'-type cultivars such as 'Grand Nain'. Prior to this study, only two examples of the stable transformation of banana had been reported, both of which demonstrated the principle of transformation but did not characterise transgenic plants in terms of the efficiency at which individual transgenic lines were generated, the relative activities of promoters in stably transformed plants, or the stability of transgene expression. The aim of this study was to develop more efficient transformation methods for banana, assess the activity of some commonly used and also novel promoters in stably transformed plants, and transform banana with genes that could potentially confer resistance to banana bunchy top nanovirus (BBTV) and banana bract mosaic potyvirus (BBrMV). A regeneration system using immature male flowers as the explant was established. The frequency of somatic embryogenesis in male flower explants was influenced by the season in which the inflorescences were harvested; further, the media requirements of various banana cultivars with respect to the 2,4-D concentration in the initiation media also differed. Following the optimisation of these and other parameters, embryogenic cell suspensions of several banana (Musa spp.) cultivars, including 'Grand Nain' (AAA), 'Williams' (AAA), 'SH-3362' (AA), 'Goldfinger' (AAAB) and 'Bluggoe' (ABB), were successfully generated. Highly efficient transformation methods were developed for both 'Bluggoe' and 'Grand Nain'; this is the first report of microprojectile bombardment transformation of the commercially important 'Grand Nain' cultivar.

Following bombardment of embryogenic suspension cells, regeneration was monitored from single transformed cells to whole plants using a reporter gene encoding the green fluorescent protein (gfp). Selection with kanamycin enabled the regeneration of a greater number of plants than with geneticin, while still preventing the regeneration of non-transformed plants. Southern hybridisation confirmed that the neomycin phosphotransferase gene (nptII) was stably integrated into the banana genome and that multiple transgenic lines were derived from single bombardments. The activity, stability and tissue specificity of the cauliflower mosaic virus 35S (CaMV 35S) and maize polyubiquitin-1 (Ubi-1) promoters were examined. In stably transformed banana, the Ubi-1 promoter provided approximately six-fold higher β-glucuronidase (GUS) activity than the CaMV 35S promoter, and both promoters remained active in glasshouse-grown plants for the six months they were observed. The intergenic regions of BBTV DNA-1 to -6 were isolated and fused to either the uidA (GUS) or gfp reporter genes to assess their promoter activities. BBTV promoter activity was detected in banana embryogenic cells using the gfp reporter gene. Promoters derived from BBTV DNA-4 and -5 generated the highest levels of transient activity, greater than that generated by the maize Ubi-1 promoter. In transgenic banana plants, the activity of the BBTV DNA-6 promoter (BT6.1) was restricted to the phloem of leaves and roots, stomata and root meristems. The activity of the BT6.1 promoter was enhanced by the inclusion of intron-containing fragments derived from the maize Ubi-1, rice Act-1, and sugarcane rbcS 5' untranslated regions in GUS reporter gene constructs. In transient assays in banana, the rice Act-1 and maize Ubi-1 introns provided the most significant enhancement, increasing expression levels 300-fold and 100-fold, respectively; the sugarcane rbcS intron increased expression about 10-fold.
In stably transformed banana plants, the maize Ubi-1 intron enhanced BT6.1 promoter activity to levels similar to that of the CaMV 35S promoter, but did not appear to alter the tissue specificity of the promoter. Both 'Grand Nain' and 'Bluggoe' were transformed with constructs that could potentially confer resistance to BBTV and BBrMV, including constructs containing BBTV DNA-1 major and internal genes, BBTV DNA-5 gene, and the BBrMV coat protein-coding region all under the control of the Ubi-1 promoter, while the BT6 promoter was used to drive the npt II selectable marker gene. At least 30 transgenic lines containing each construct were identified and replicates of each line are currently being generated by micropropagation in preparation for virus challenge.

Relevance: 10.00%

Abstract:

Fables of sovereignty / Wayne Hudson
Sovereignty discourse and practice: past and future / Joseph Camilleri
Guises of sovereignty / Gerry Simpson
Westphalian and Islamic concepts of sovereignty in the Middle East / Amin Saikal
Whither sovereignty in Southeast Asia today? / See Seng Tan
Ambivalent sovereignty: China and re-imagining the Westphalian ideal / Yongjin Zhang
Confronting terrorism: dilemmas of principle and practice regarding sovereignty / Brian L. Job
Sovereignty in the 21st century: security, immigration, and refugees / Howard Adelman
State sovereignty and international refugee protection / Robyn Lui
Do no harm: towards a Hippocratic standard for international civilisation / Neil Arya
Sovereignty and the global politics of the environment: beyond Westphalia? / Lorraine Elliott
Westphalian sovereignty in the shadow of international justice? A fresh coat of paint for a tainted concept / Jackson Nyamuya Maogoto
Development assistance and the hollow sovereignty of the weak / Roland Rich
Corruption and transparency in governance and development: reinventing sovereignty for promoting good governance / C. Raj Kumar
Re-envisioning economic sovereignty: developing countries and the International Monetary Fund / Ross P. Buckley
Trust, legitimacy, and the sharing of sovereignty / William Maley
Sovereignty as indirect rule / Barry Hindess
Indigenous sovereignty / Paul Keal
Civil society in a post-statist circumstance / Jan Aart Scholte