221 results for area-preserving maps
Abstract:
The refractive error of a human eye varies across the pupil and therefore may be treated as a random variable. The probability distribution of this random variable provides a means for assessing the main refractive properties of the eye without the necessity of a traditional functional representation of wavefront aberrations. To demonstrate this approach, the statistical properties of refractive error maps are investigated. Closed-form expressions are derived for the probability density function (PDF) and its statistical moments for the general case of rotationally symmetric aberrations. A closed-form expression for the PDF of a general non-rotationally symmetric wavefront aberration is difficult to derive. However, for specific cases, such as astigmatism, a closed-form expression of the PDF can be obtained. Further, an interpretation of the distribution of the refractive error map, as well as its moments, is provided for a range of wavefront aberrations measured in real eyes. These are evaluated using kernel density and sample moment estimators. It is concluded that the refractive error domain allows non-functional analysis of wavefront aberrations based on simple statistics in the form of sample moments. Clinicians may find this approach to wavefront analysis easier to interpret due to the clinical familiarity and intuitive appeal of refractive error maps.
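As an illustration of the approach described above, the sketch below treats per-pupil refractive error samples as draws of a random variable, estimates the PDF with a kernel density estimator, and computes sample moments. The data are simulated, and scipy's generic Gaussian KDE stands in for whatever estimator the study actually used.

```python
# Minimal sketch: estimate the PDF of sampled refractive error values and
# compute the sample moments discussed in the abstract. Data are simulated;
# in practice the samples would come from an aberrometer-derived map.
import numpy as np
from scipy.stats import gaussian_kde, skew, kurtosis

rng = np.random.default_rng(0)
refractive_error = rng.normal(loc=-0.25, scale=0.5, size=2000)  # dioptres (simulated)

# Kernel density estimate of the refractive error PDF
kde = gaussian_kde(refractive_error)
grid = np.linspace(refractive_error.min(), refractive_error.max(), 200)
pdf = kde(grid)
print("PDF evaluated at 0 D:", kde(0.0))

# Sample moments: mean and variance summarise the overall refractive state;
# skewness and kurtosis reflect the influence of higher-order aberrations.
print("mean    (D):", np.mean(refractive_error))
print("variance   :", np.var(refractive_error, ddof=1))
print("skewness   :", skew(refractive_error))
print("kurtosis   :", kurtosis(refractive_error))
```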
Abstract:
A common optometric problem is to specify the eye’s ocular aberrations in terms of Zernike coefficients and to reduce that specification to a prescription for the optimum sphero-cylindrical correcting lens. The typical approach is first to reconstruct wavefront phase errors from measurements of wavefront slopes obtained by a wavefront aberrometer. This paper applies a new method to this clinical problem that does not require wavefront reconstruction. Instead, we base our analysis on axial wavefront vergence as inferred directly from wavefront slopes. The result is a wavefront vergence map that is similar to the axial power maps in corneal topography and hence has the potential to be favoured by clinicians. We use our new set of orthogonal Zernike slope polynomials to systematically analyse details of the vergence map, analogous to Zernike analysis of wavefront maps. The result is a vector of slope coefficients that describe fundamental aberration components. Three different methods for reducing slope coefficients to a sphero-cylindrical prescription in power vector form are compared and contrasted. When the original wavefront contains only second-order aberrations, the vergence map is a function of meridian only and the power vectors from all three methods are identical. The differences between the methods begin to appear as higher-order aberrations are included, in which case the wavefront vergence map is more complicated. Finally, we discuss the advantages and limitations of the vergence map representation of ocular aberrations.
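For reference, the sketch below shows the standard wavefront-based conversion from second-order Zernike coefficients to a power vector (M, J0, J45). It is not one of the three slope-based methods compared in the paper, and the coefficient values are illustrative.

```python
# Minimal sketch: convert second-order Zernike coefficients (OSA/ANSI
# convention, in microns) into a power vector (M, J0, J45) in dioptres.
# This is the conventional wavefront-based conversion, shown for context;
# the paper's slope-based alternatives are not reproduced here.
import math

def power_vector(c20, c22, c2m2, pupil_radius_mm):
    """Return (M, J0, J45) from defocus and astigmatism coefficients."""
    r2 = pupil_radius_mm ** 2
    M   = -4 * math.sqrt(3) * c20  / r2   # spherical equivalent
    J0  = -2 * math.sqrt(6) * c22  / r2   # with/against-the-rule astigmatism
    J45 = -2 * math.sqrt(6) * c2m2 / r2   # oblique astigmatism
    return M, J0, J45

# Example: a 6 mm pupil (radius 3 mm), illustrative coefficients
print(power_vector(c20=0.8, c22=-0.2, c2m2=0.05, pupil_radius_mm=3.0))
```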
Abstract:
Manufacturing managers have a measurable mindset (or frame) that structures their response to the manufacturing environment. Most importantly, this frame represents a set of assumptions about the relative prominence of concepts in the manufacturing domains, about the nature of people, and about the sensemaking processes required to understand the nature of the manufacturing environment as seen through the eyes of manufacturing managers. This paper uses work in the area of text analysis and extends the scope of a methodology that has been approached from two different directions by Carley (Journal of Organizational Behavior, 18(51), 533-558, 1997) and Gephart (Journal of Organizational Behavior, 18(51), 583-622, 1997). This methodology is termed collocate analysis. Based on the analysis of transcripts of interviews with Australian manufacturing managers, mind maps of the concepts used by these managers have been constructed. From an analysis of these mind maps, it is argued that strategy plays a minor role in the managers' thinking, second only to the improvement domain, whereas design and related concepts play a dominant role in their day-to-day thinking.
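For readers unfamiliar with the method, the sketch below shows the windowed co-occurrence counting that underlies collocate analysis. The transcript fragment, node word, and window size are illustrative only; the full methodology (building concept networks and mind maps from such counts) is not reproduced.

```python
# Minimal sketch of collocate counting: for a chosen node word, count the
# words occurring within a +/- 4-word window across a transcript.
from collections import Counter

def collocates(tokens, node, window=4):
    """Count tokens co-occurring with `node` within the given window."""
    counts = Counter()
    for i, tok in enumerate(tokens):
        if tok == node:
            lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
            counts.update(t for t in tokens[lo:hi] if t != node)
    return counts

# Illustrative fragment, not from the study's interview transcripts
transcript = ("we need a strategy for improvement before we change the design "
              "the design drives improvement on the shop floor").split()
print(collocates(transcript, "design").most_common(5))
```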
Abstract:
This paper discusses how internet services can be brought one step closer to dispersed rural communities by improving wireless broadband communications in those areas. To accomplish this objective, we describe the use of an innovative Multi-User-Single-Antenna for MIMO (MUSA-MIMO) technology using the spectrum currently allocated to analogue TV. MUSA-MIMO technology can be considered a special case of MIMO technology, which is beneficial when provisioning reliable and high-speed communication channels. This paper describes channel modelling techniques to characterise the MUSA-MIMO system, allowing an effective deployment of this technology. In particular, it describes the development of a novel MUSA-MIMO channel model that takes into account temporal variations in the rural wireless environment. This can be considered a novel approach, tailor-made for rural Australia, to provisioning efficient wireless broadband communications.
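By way of illustration, a first-order autoregressive (Gauss-Markov) process is one common way to give a Rayleigh-fading MIMO channel the kind of slow temporal variation seen on fixed rural links. The sketch below uses that generic model with illustrative dimensions and correlation coefficient; it is not the MUSA-MIMO channel model developed in the paper.

```python
# Minimal sketch: a first-order autoregressive (Gauss-Markov) model of a
# time-varying Rayleigh-fading MIMO channel. rho close to 1 gives the slow
# variation typical of fixed rural links; values here are illustrative.
import numpy as np

def evolve_channel(n_rx, n_tx, n_steps, rho=0.995, rng=None):
    """Yield H[t] with i.i.d. CN(0,1) entries and per-entry correlation rho."""
    rng = rng or np.random.default_rng(0)
    scale = np.sqrt(0.5)
    H = scale * (rng.standard_normal((n_rx, n_tx))
                 + 1j * rng.standard_normal((n_rx, n_tx)))
    for _ in range(n_steps):
        yield H
        innovation = scale * (rng.standard_normal((n_rx, n_tx))
                              + 1j * rng.standard_normal((n_rx, n_tx)))
        H = rho * H + np.sqrt(1 - rho**2) * innovation

# Example: 4 single-antenna users to a 4-antenna base station
for t, H in enumerate(evolve_channel(n_rx=4, n_tx=4, n_steps=3)):
    print(f"t={t}, |H|_F = {np.linalg.norm(H):.3f}")
```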
Abstract:
The concept of "fair basing" is widely acknowledged as a difficult area of patent law. This article maps the development of fair basing law to demonstrate how some of the difficulties have arisen. Part I of the article traces the development of the branches of patent law that were swept under the nomenclature of "fair basing" by British legislation in 1949. It looks at the early courts' approach to patent construction, and examines the early origins of fair basing and what it was intended to achieve. Part II of the article considers the modern interpretation of fair basing, which provides a striking contrast to its historical context. Without any consistent judicial approach to construction, the doctrine has developed inappropriately, giving rise to both over-strict and over-generous approaches.
Abstract:
This thesis is about the derivation of the addition law on an arbitrary elliptic curve and efficiently adding points on this elliptic curve using the derived addition law. The outcomes of this research guarantee practical speedups in higher-level operations which depend on point additions. In particular, the contributions immediately find applications in cryptology.

Mastered by 19th-century mathematicians, the study of the theory of elliptic curves has been active for decades. Elliptic curves over finite fields made their way into public key cryptography in the late 1980s with independent proposals by Miller [Mil86] and Koblitz [Kob87]. Elliptic Curve Cryptography (ECC), following Miller’s and Koblitz’s proposals, employs the group of rational points on an elliptic curve in building discrete logarithm based public key cryptosystems. Starting from the late 1990s, the emergence of the ECC market has boosted research into the computational aspects of elliptic curves. This thesis falls into this same area of research, where the main aim is to speed up the addition of rational points on an arbitrary elliptic curve (over a field of large characteristic). The outcomes of this work can be used to speed up applications which are based on elliptic curves, including cryptographic applications in ECC.

The aforementioned goals of this thesis are achieved in five main steps. As the first step, this thesis brings together several algebraic tools in order to derive the unique group law of an elliptic curve. This step also includes an investigation of recent computer algebra packages relating to their capabilities. Although the group law is unique, its evaluation can be performed using abundant (in fact, infinitely many) formulae. As the second step, this thesis proceeds to find the best formulae for efficient addition of points. In the third step, the group law is stated explicitly by handling all possible summands. The fourth step presents the algorithms to be used for efficient point additions. In the fifth and final step, optimized software implementations of the proposed algorithms are presented in order to show that the theoretical speedups of step four can be practically obtained.

In each of the five steps, this thesis focuses on five forms of elliptic curves over finite fields of large characteristic. These forms and their defining equations are as follows:
(a) Short Weierstrass form, y² = x³ + ax + b,
(b) Extended Jacobi quartic form, y² = dx⁴ + 2ax² + 1,
(c) Twisted Hessian form, ax³ + y³ + 1 = dxy,
(d) Twisted Edwards form, ax² + y² = 1 + dx²y²,
(e) Twisted Jacobi intersection form, bs² + c² = 1, as² + d² = 1.
These forms are the most promising candidates for efficient computations and are thus considered in this work. Nevertheless, the methods employed in this thesis are capable of handling arbitrary elliptic curves.

From a high-level point of view, the following outcomes are achieved in this thesis.
- Related literature results are brought together and further revisited. For most of the cases, several missed formulae, algorithms, and efficient point representations are discovered.
- Analogies are made among all studied forms. For instance, it is shown that two sets of affine addition formulae are sufficient to cover all possible affine inputs, as long as the output is also an affine point, in any of these forms. In the literature, many special cases, especially interactions with points at infinity, were omitted from discussion. This thesis handles all of the possibilities.
- Several new point doubling/addition formulae and algorithms are introduced, which are more efficient than the existing alternatives in the literature. Most notably, the speeds of the extended Jacobi quartic, twisted Edwards, and Jacobi intersection forms are improved. New unified addition formulae are proposed for the short Weierstrass form. New coordinate systems are studied for the first time.
- An optimized implementation is developed using a combination of generic x86-64 assembly instructions and the plain C language. The practical advantages of the proposed algorithms are supported by computer experiments.
- All formulae presented in the body of this thesis are checked for correctness using computer algebra scripts, together with details on register allocations.
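For orientation, the sketch below shows the textbook affine group law on a short Weierstrass curve over a prime field, including the point-at-infinity and doubling cases. The curve parameters are a toy example; none of the optimized formulae or coordinate systems contributed by the thesis are reproduced here.

```python
# Minimal sketch: the affine group law on y^2 = x^3 + ax + b over F_p,
# with the point at infinity represented as None. This is the classical
# chord-and-tangent rule, not an optimized formula from the thesis.
def ec_add(P, Q, a, p):
    if P is None:
        return Q
    if Q is None:
        return P
    x1, y1 = P
    x2, y2 = Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                        # P + (-P) = infinity
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
    x3 = (lam * lam - x1 - x2) % p
    y3 = (lam * (x1 - x3) - y1) % p
    return (x3, y3)

# Toy example on y^2 = x^3 + 2x + 3 over F_97: double the point (3, 6)
P = (3, 6)
print(ec_add(P, P, a=2, p=97))   # -> (80, 10), also on the curve
```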
Abstract:
This article focuses on the social interactions of several boys aged 3-5 years in the block area of a preschool classroom in a childcare setting. Using transcripts of video segments showing these boys engaged in daily play and interactions, the article analyses two episodes that occurred in the first weeks of the school year. At first glance, both episodes appear chaotic, with little appearance of order among the players. A closer analysis reveals finely organized play taking place, with older boys teaching the newcomers important lessons about how to be masculine in the block area. These episodes illustrate that masculinity is not a fixed character trait but is determined through practice and participation in the activities of masculinity. Play and conflict are the avenues through which this occurs.
Abstract:
At the turn of the millennium, the Earth’s human population has reached unprecedented levels and its natural resources are being pushed to the limit. Cities have therefore focused on sustainable development and have begun to develop new strategies for improving the built environment. Sustainable development provides the best outcomes for the human and natural environments by improving quality of life in a way that protects and balances ecological, social and economic values. This brings us to the main point: to build a sustainable built environment, cities need to redesign many of their technologies and planning policies within the context of ecological principles. ASSURE, an environmental sustainability index model, was developed to investigate the present environmental situation of an urban area by assessing the impacts of development pressure on natural resources. It is an innovative approach to keeping the resilience and function of urban ecosystems secure against environmental degradation, now and in the future. This paper aims to underline the importance of the ASSURE model in preserving biodiversity and natural ecosystems in the built environment and to investigate its role in delivering long-term urban planning policies.
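The abstract does not specify ASSURE's indicator set, normalization, or aggregation rule, but composite environmental sustainability indices generally share the shape sketched below: normalize each indicator against worst/best bounds, then aggregate with weights. All indicator names, bounds, and weights here are placeholders, not ASSURE's.

```python
# Minimal sketch of a composite environmental-sustainability index:
# normalize each indicator to [0, 1] against worst/best bounds, then take
# a weighted average. Indicators, bounds, and weights are placeholders.
def composite_index(indicators, weights, bounds):
    """indicators/weights/bounds keyed by name; bounds = (worst, best)."""
    total, wsum = 0.0, 0.0
    for name, value in indicators.items():
        worst, best = bounds[name]
        score = (value - worst) / (best - worst)        # 0 = worst, 1 = best
        total += weights[name] * min(max(score, 0.0), 1.0)
        wsum += weights[name]
    return total / wsum

indicators = {"green_cover_pct": 28.0, "impervious_pct": 61.0, "air_quality": 72.0}
bounds = {"green_cover_pct": (0, 60), "impervious_pct": (100, 0), "air_quality": (0, 100)}
weights = {"green_cover_pct": 0.4, "impervious_pct": 0.3, "air_quality": 0.3}
print(f"index = {composite_index(indicators, weights, bounds):.2f}")
```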
Abstract:
Objective. To provide a preliminary test of a Theory of Planned Behavior (TPB) belief-based intervention to increase adolescents’ sun-protective behaviors in a high-risk area, Queensland, Australia. Methods. In the periods October-November 2007 and May-June 2008, 80 adolescents (14.53 ± 0.69 years) were recruited from two secondary schools (one government and one private) in Queensland after obtaining student, parental, and school informed consent. Adolescents were allocated to either a control or an intervention condition based on the class they attended. The intervention comprised three one-hour in-school sessions facilitated by Cancer Council Queensland employees, with sessions covering the belief basis of the TPB (i.e., behavioral, normative, and control [barrier and motivator] sun-safe beliefs). Participants completed questionnaires assessing sun-safety beliefs, intentions, and behavior pre- and post-intervention. A repeated-measures multivariate analysis of variance was used to test the effect of the intervention on these constructs across time. Results. Students completing the intervention reported stronger sun-safe normative and motivator beliefs and intentions, and performed more sun-safe behaviors across time, than those in the control condition. Conclusion. Strengthening beliefs about the approval of others and about motivators for sun protection may encourage sun-safe cognitions and actions among adolescents.
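The design described above is a 2 (condition) x 2 (time) mixed design. As a sketch, the code below runs the analogous univariate mixed ANOVA on simulated scores using the pingouin library; the paper's actual analysis was a repeated-measures MANOVA over several constructs, and all variable names and effect sizes here are illustrative.

```python
# Minimal sketch: a 2 (condition: between) x 2 (time: within) mixed ANOVA
# on simulated pre/post scores. The condition-by-time interaction is the
# test of whether the intervention changed scores. Data are simulated.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(1)
n = 40                                    # participants per condition
rows = []
for cond, post_gain in (("control", 0.0), ("intervention", 0.8)):
    for s in range(n):
        base = rng.normal(4.0, 1.0)       # pre-intervention score (simulated)
        rows.append({"subject": f"{cond}{s}", "condition": cond,
                     "time": "pre", "score": base})
        rows.append({"subject": f"{cond}{s}", "condition": cond,
                     "time": "post", "score": base + post_gain + rng.normal(0, 0.5)})
df = pd.DataFrame(rows)

print(pg.mixed_anova(data=df, dv="score", within="time",
                     between="condition", subject="subject"))
```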
Abstract:
Stereo vision is a method of depth perception in which depth information is inferred from two (or more) images of a scene, taken from different perspectives. Practical applications for stereo vision include aerial photogrammetry, autonomous vehicle guidance, robotics and industrial automation. The initial motivation behind this work was to produce a stereo vision sensor for mining automation applications. For such applications, the input stereo images would consist of close-range scenes of rocks. A fundamental problem faced by matching algorithms is the matching or correspondence problem: locating corresponding points or features in two images. For this application, speed, reliability, and the ability to produce a dense depth map are of foremost importance.

This work implemented a number of area-based matching algorithms to assess their suitability for this application. Area-based techniques were investigated because of their potential to yield dense depth maps, their amenability to fast hardware implementation, and their suitability to textured scenes such as rocks. In addition, two non-parametric transforms, the rank and census transforms, were compared. Both were found to improve the reliability of matching in the presence of radiometric distortion, which is significant since radiometric distortion commonly arises in practice. In addition, they have low computational complexity, making them amenable to fast hardware implementation. Therefore, it was decided that matching algorithms using these transforms would be the subject of the remainder of the thesis.

An analytic expression for the process of matching using the rank transform was derived from first principles. This work resulted in a number of important contributions. Firstly, the derivation process yielded a constraint which must be satisfied for a correct match, termed the rank constraint. The theoretical derivation of this constraint is in contrast to existing matching constraints, which have little theoretical basis. Experimental work with actual and contrived stereo pairs has shown that the new constraint is capable of resolving ambiguous matches, thereby improving match reliability. Secondly, a novel matching algorithm incorporating the rank constraint has been proposed. This algorithm was tested using a number of stereo pairs; in all cases, the modified algorithm consistently resulted in an increased proportion of correct matches. Finally, the rank constraint was used to devise a new method for identifying regions of an image where the rank transform, and hence matching, are more susceptible to noise.

The rank constraint was also incorporated into a new hybrid matching algorithm, where it was combined with a number of other ideas. These included the use of an image pyramid for match prediction, and a method of edge localisation to improve match accuracy in the vicinity of edges. Experimental results obtained from the new algorithm showed that it is able to remove a large proportion of invalid matches and improve match accuracy.
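For concreteness, the sketch below computes the two non-parametric transforms compared in the thesis for a single 3x3 neighbourhood: the rank transform (the count of neighbours darker than the centre pixel) and the census transform (the same comparisons packed into a bit string). Matching on the transformed images (e.g. SAD on rank values, Hamming distance on census codes) is not shown, and the rank constraint derived in the thesis is not reproduced.

```python
# Minimal sketch of the rank and census transforms for one pixel window.
import numpy as np

def rank_transform(window):
    """Count of pixels in the window strictly less than the centre pixel."""
    centre = window[window.shape[0] // 2, window.shape[1] // 2]
    return int((window < centre).sum())

def census_transform(window):
    """Bit string with 1 wherever a neighbour is less than the centre."""
    c = window.shape[0] // 2
    centre = window[c, c]
    bits = (window < centre).astype(int).flatten()
    bits = np.delete(bits, len(bits) // 2)   # drop the centre's self-comparison
    return int("".join(map(str, bits)), 2)

w = np.array([[12, 90, 33],
              [70, 50, 10],
              [51, 49, 48]])
print(rank_transform(w))            # 5 neighbours are darker than the centre (50)
print(bin(census_transform(w)))     # the same comparisons as a census code
```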