15 results for Infinity

in Queensland University of Technology - ePrints Archive


Relevance:

20.00%

Abstract:

This paper proposes a new approach for delay-dependent robust H-infinity stability analysis and control synthesis of uncertain systems with time-varying delay. The key features of the approach include the introduction of a new Lyapunov–Krasovskii functional, the construction of an augmented matrix with uncorrelated terms, and the employment of a tighter bounding technique. As a result, significant performance improvement is achieved in system analysis and synthesis without using either free weighting matrices or model transformation. Examples are given to demonstrate the effectiveness of the proposed approach.
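The feasibility test behind conditions of this kind is a linear matrix inequality. As a minimal, delay-free sketch of the underlying idea (not the paper's delay-dependent conditions, and with an illustrative system matrix), the classical Lyapunov test A^T P + P A = -Q can be solved directly via a Kronecker-product linear system:

```python
import numpy as np

def lyapunov_stable(A, Q=None):
    """Check asymptotic stability of x' = A x by solving A^T P + P A = -Q."""
    n = A.shape[0]
    Q = np.eye(n) if Q is None else Q
    I = np.eye(n)
    # vec(A^T P + P A) = (I kron A^T + A^T kron I) vec(P), column-major vec
    M = np.kron(I, A.T) + np.kron(A.T, I)
    P = np.linalg.solve(M, -Q.flatten(order="F")).reshape(n, n, order="F")
    # A is Hurwitz iff the unique solution P is symmetric positive definite
    return bool(np.all(np.linalg.eigvalsh((P + P.T) / 2) > 0)), P

stable, P = lyapunov_stable(np.array([[-2.0, 1.0], [0.0, -1.0]]))
```

In the delay-dependent setting, the Lyapunov–Krasovskii functional adds integral terms over the delay interval, and the resulting LMI is checked with a semidefinite-programming solver rather than a direct linear solve.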

Relevance:

20.00%

Abstract:

Since the celebrated linear minimum mean square (MMS) Kalman filter in an integrated GPS/INS system cannot guarantee robust performance, an H(infinity) filter with respect to polytopic uncertainty is designed. The purpose of this paper is to illustrate this application and contrast it with the traditional Kalman filter. A game-theoretic H(infinity) filter is first reviewed; next, we utilize the linear matrix inequality (LMI) approach to design the robust H(infinity) filter. For the specific INS/GPS model, the unstable-model case is considered. We explain Kalman filter divergence under an uncertain dynamic system and simultaneously investigate the relationship between the H(infinity) filter and the Kalman filter. A loosely coupled INS/GPS simulation system is used to verify this application. Results show that the robust H(infinity) filter performs better when the system suffers uncertainty, and it is more robust than the conventional Kalman filter.
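For reference, the MMSE baseline being compared against is the standard linear Kalman recursion. A minimal sketch with generic matrices (not the paper's INS/GPS model or its H-infinity design):

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of the linear (MMSE-optimal) Kalman filter."""
    # Predict: propagate state estimate and covariance through the model
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: correct the prediction with the measurement z
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Illustrative scalar example: constant state observed with unit-variance noise
x, P = np.zeros(1), np.eye(1) * 10.0
F = H = np.eye(1)
Qn, R = np.eye(1) * 0.01, np.eye(1)
for _ in range(50):
    x, P = kalman_step(x, P, np.array([5.0]), F, H, Qn, R)
```

The H-infinity filter replaces the MMSE criterion with a worst-case disturbance-attenuation bound, which is what buys robustness under model uncertainty.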

Relevance:

10.00%

Abstract:

This collaborative event was organised to coincide with international celebrations by the International Council of Societies of Industrial Design (ICSID). The panel discussion involved industrial designers from a variety of backgrounds, including academics, theorists and practitioners. Each panel member was given time to voice their opinion on the theme of WIDD2010, "Industrial Design: Humane Solutions for a Resilient World". The discussion was then extended to the audience through an active question-and-answer session. The panel included:
* Professor Vesna Popovic FDIA - Queensland University of Technology
* Adam Doyle, Studio Manager - Infinity Design Development
* Scott Cox MDIA, Creative Director - Formwerx
* Alexander Lotersztain, Director - Derlot
* Philip Whiting FDIA, Design Convenor - QCA
* Professor Tony Fry, Director Team D/E/S & QCA
Afterwards, Gary Hustwit's documentary "Objectified" was screened (75 min).

Relevance:

10.00%

Abstract:

This thesis is about the derivation of the addition law on an arbitrary elliptic curve and efficiently adding points on this elliptic curve using the derived addition law. The outcomes of this research guarantee practical speedups in higher-level operations which depend on point additions. In particular, the contributions immediately find applications in cryptology. The study of the theory of elliptic curves, mastered by 19th-century mathematicians, has been active for decades. Elliptic curves over finite fields made their way into public key cryptography in the late 1980s with independent proposals by Miller [Mil86] and Koblitz [Kob87]. Elliptic Curve Cryptography (ECC), following Miller's and Koblitz's proposals, employs the group of rational points on an elliptic curve in building discrete logarithm based public key cryptosystems. Starting from the late 1990s, the emergence of the ECC market has boosted research in computational aspects of elliptic curves. This thesis falls into this same area of research, where the main aim is to speed up the addition of rational points on an arbitrary elliptic curve (over a field of large characteristic). The outcomes of this work can be used to speed up applications which are based on elliptic curves, including cryptographic applications in ECC. The aforementioned goals of this thesis are achieved in five main steps. As the first step, this thesis brings together several algebraic tools in order to derive the unique group law of an elliptic curve. This step also includes an investigation of recent computer algebra packages relating to their capabilities. Although the group law is unique, its evaluation can be performed using abundant (in fact infinitely many) formulae. As the second step, this thesis advances the search for the best formulae for efficient addition of points. In the third step, the group law is stated explicitly by handling all possible summands.
The fourth step presents the algorithms to be used for efficient point additions. In the fifth and final step, optimized software implementations of the proposed algorithms are presented in order to show that the theoretical speedups of step four can be practically obtained. In each of the five steps, this thesis focuses on five forms of elliptic curves over finite fields of large characteristic. A list of these forms and their defining equations is given as follows:
(a) Short Weierstrass form, y^2 = x^3 + ax + b,
(b) Extended Jacobi quartic form, y^2 = dx^4 + 2ax^2 + 1,
(c) Twisted Hessian form, ax^3 + y^3 + 1 = dxy,
(d) Twisted Edwards form, ax^2 + y^2 = 1 + dx^2y^2,
(e) Twisted Jacobi intersection form, bs^2 + c^2 = 1, as^2 + d^2 = 1.
These forms are the most promising candidates for efficient computations and thus are considered in this work. Nevertheless, the methods employed in this thesis are capable of handling arbitrary elliptic curves. From a high-level point of view, the following outcomes are achieved in this thesis.
- Related literature results are brought together and further revisited. For most of the cases, several missed formulae, algorithms, and efficient point representations are discovered.
- Analogies are made among all studied forms. For instance, it is shown that two sets of affine addition formulae are sufficient to cover all possible affine inputs as long as the output is also an affine point in any of these forms. In the literature, many special cases, especially interactions with points at infinity, were omitted from discussion. This thesis handles all of the possibilities.
- Several new point doubling/addition formulae and algorithms are introduced, which are more efficient than the existing alternatives in the literature. Most notably, the speed of the extended Jacobi quartic, twisted Edwards, and Jacobi intersection forms is improved. New unified addition formulae are proposed for the short Weierstrass form. New coordinate systems are studied for the first time.
- An optimized implementation is developed using a combination of generic x86-64 assembly instructions and the plain C language. The practical advantages of the proposed algorithms are supported by computer experiments.
- All formulae presented in the body of this thesis are checked for correctness using computer algebra scripts, together with details on register allocations.
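As a textbook illustration of the affine group law on the short Weierstrass form (a small-prime-field sketch with the point at infinity as identity, not the thesis's optimized formulae or coordinate systems):

```python
def ec_add(P, Q, a, p):
    """Affine addition on y^2 = x^3 + a*x + b over GF(p).
    Points are (x, y) tuples assumed to lie on the curve; None is infinity."""
    if P is None:
        return Q
    if Q is None:
        return P
    x1, y1 = P
    x2, y2 = Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                        # P + (-P) = infinity
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)
```

The two slope cases (chord and tangent), plus the infinity checks, are exactly the case analysis the thesis refers to when it speaks of handling all possible summands.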

Relevance:

10.00%

Abstract:

Freeways are divided roadways designed to facilitate the uninterrupted movement of motor vehicles. However, many freeways now experience demand flows in excess of capacity, leading to recurrent congestion. The Highway Capacity Manual (TRB, 1994) uses empirical macroscopic relationships between speed, flow and density to quantify freeway operations and performance. Capacity may be predicted as the maximum uncongested flow achievable. Although they are effective tools for design and analysis, macroscopic models lack an understanding of the nature of the processes taking place in the system. Szwed and Smith (1972, 1974) and Makigami and Matsuo (1990) have shown that microscopic modelling is also applicable to freeway operations. Such models facilitate an understanding of the processes whilst providing for the assessment of performance, through measures of capacity and delay. However, these models are limited to only a few circumstances. The aim of this study was to produce more comprehensive and practical microscopic models. These models were required to accurately portray the mechanisms of freeway operations at the specific locations under consideration. The models needed to be calibrated using data acquired at these locations, and their output validated with data acquired at the same sites, so that the outputs would be truly descriptive of the performance of the facility. A theoretical basis needed to underlie the form of these models, rather than the empiricism of the macroscopic models currently used. Finally, the models needed to be adaptable to variable operating conditions, so that they might be applied, where possible, to other similar systems and facilities. It was not possible to produce a stand-alone model applicable to all facilities and locations in this single study; however, the scene has been set for the application of the models to a much broader range of operating conditions.
Opportunities for further development of the models were identified, and procedures provided for the calibration and validation of the models to a wide range of conditions. The models developed do, however, have limitations in their applicability. Only uncongested operations were studied and represented. Driver behaviour in Brisbane was applied to the models; different mechanisms are likely in other locations due to variability in road rules and driving cultures. Not all manoeuvres evident were modelled; some unusual manoeuvres were considered unwarranted to model. However, the models developed contain the principal processes of freeway operations: merging and lane changing. Gap acceptance theory was applied to these critical operations to assess freeway performance. Gap acceptance theory was found to be applicable to merging; however, the major stream, the kerb lane traffic, exercises only a limited priority over the minor stream, the on-ramp traffic. Theory was established to account for this activity. Kerb lane drivers were also found to change to the median lane where possible, to assist coincident mergers. The net limited priority model accounts for this by predicting a reduced major stream flow rate, which excludes lane changers. Cowan's M3 model was calibrated for both streams. On-ramp and total upstream flow are required as input. Relationships between the proportion of headways greater than 1 s and flow differed between on-ramps fed by signalised intersections and those fed by unsignalised intersections. Constant-departure on-ramp metering was also modelled. Minimum follow-on times of 1 to 1.2 s were calibrated. Critical gaps were shown to lie between the minimum follow-on time and the sum of the minimum follow-on time and the 1 s minimum headway. Limited priority capacity and other boundary relationships were established by Troutbeck (1995).
The minimum average minor stream delay and the corresponding proportion of drivers delayed were quantified theoretically in this study. A simulation model was constructed to predict intermediate minor and major stream delays across all minor and major stream flows. Pseudo-empirical relationships were established to predict average delays. Major stream average delays are limited to 0.5 s, insignificant compared with minor stream delay, which reaches infinity at capacity. Minor stream delays were shown to be less when unsignalised intersections are located upstream of on-ramps than when signalised intersections are, and less still when ramp metering is installed. Smaller delays correspond to improved merge area performance. A more tangible performance measure, the distribution of distances required to merge, was established by including design speeds. This distribution can be measured to validate the model. Merging probabilities can be predicted for given taper lengths, a most useful performance measure. This model was also shown to be applicable to lane changing. Tolerable limits to merging probabilities require calibration; from these, practical capacities can be estimated. Further calibration is required of the traffic inputs, critical gap and minimum follow-on time, for both merging and lane changing. A general relationship to predict the proportion of drivers delayed requires development. These models can then be used to complement existing macroscopic models in assessing performance, and to provide further insight into the nature of operations.
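The M3 headway and gap-acceptance machinery can be sketched as follows. The flow, proportion-free, critical-gap and follow-on values below are illustrative stand-ins, not the calibrated Brisbane values:

```python
import random

def m3_headways(n, q, alpha, delta=1.0, seed=1):
    """Sample n major-stream headways from Cowan's M3 model: a proportion
    alpha of vehicles are free (shifted-exponential headway > delta);
    the remainder travel bunched at the minimum headway delta."""
    rng = random.Random(seed)
    lam = alpha / (1.0 / q - delta)   # decay rate so the mean headway is 1/q
    hs = []
    for _ in range(n):
        if rng.random() < alpha:
            hs.append(delta + rng.expovariate(lam))  # free vehicle
        else:
            hs.append(delta)                         # bunched vehicle
    return hs

def vehicles_absorbed(headways, t_c, t_f):
    """Minor-stream vehicles that can merge: one per gap >= the critical gap
    t_c, plus one more per additional follow-on time t_f the gap holds."""
    total = 0
    for h in headways:
        if h >= t_c:
            total += 1 + int((h - t_c) // t_f)
    return total

# Illustrative parameters: 900 veh/h major flow, 70% free vehicles,
# critical gap 2.0 s, follow-on time 1.1 s (within the 1-1.2 s range above)
hs = m3_headways(20000, q=0.25, alpha=0.7)
capacity_count = vehicles_absorbed(hs, t_c=2.0, t_f=1.1)
```

Summing absorbed vehicles over the sampled headways and dividing by the simulated time gives a simulation-based minor-stream capacity estimate of the kind the thesis compares against analytical limited-priority results.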

Relevance:

10.00%

Abstract:

This paper proposes a nonlinear H-infinity controller for stabilization of the velocities, attitudes and angular rates of a fixed-wing unmanned aerial vehicle (UAV) in a windy environment. The suggested controller aims to achieve a steady-state flight condition in the presence of wind gusts such that the host UAV can be maneuvered to avoid collision with other UAVs during cruise flight with safety guarantees. This paper begins with building a proper model capturing the flight aerodynamics of UAVs. Then a nonlinear controller is developed with gust attenuation and rapid response properties. Simulations are conducted for the Shadow UAV to verify the performance of the proposed controller. Comparative studies with proportional-integral-derivative (PID) controllers demonstrate that the proposed controller exhibits great performance improvement in a gusty environment, making it suitable for integration into the design of flight control systems for cruise flight of UAVs.
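The PID baseline used in such comparisons can be sketched generically. This is a discrete PID loop on a hypothetical first-order plant with illustrative gains, not the UAV flight dynamics or the paper's tuning:

```python
class PID:
    """Minimal discrete PID controller (the comparison baseline)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measurement):
        err = setpoint - measurement
        self.integral += err * self.dt            # accumulate error
        deriv = (err - self.prev_err) / self.dt   # finite-difference slope
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Closed loop on a hypothetical first-order plant x' = -x + u
dt = 0.01
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=dt)
x = 0.0
for _ in range(4000):            # 40 s of simulated time
    u = pid.step(1.0, x)
    x += dt * (-x + u)
```

The H-infinity design differs in that its gains come from solving a worst-case disturbance-attenuation problem rather than from loop-by-loop tuning, which is the source of the gust-rejection advantage reported above.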

Relevance:

10.00%

Abstract:

This paper presents a disturbance attenuation controller for horizontal position stabilization for hover and automatic landings of a rotary-wing unmanned aerial vehicle (RUAV) operating in rough seas. Based on a helicopter model representing the aerodynamics during the landing phase, a nonlinear state feedback H-infinity controller is designed to achieve rapid horizontal position tracking in a gusty environment. The resultant control variables are further treated in consideration of practical constraints (flapping dynamics, servo dynamics and time lag effects) for implementation purposes. A high-fidelity closed-loop simulation using parameters of the Vario helicopter verifies the performance of the proposed position controller. It not only increases the disturbance attenuation capability of the RUAV, but also enables rapid position response when gusts occur. Comparative studies show that the H-infinity controller exhibits great performance improvement and can be applied to ship/RUAV landing systems.

Relevance:

10.00%

Abstract:

This study presents a disturbance attenuation controller for horizontal position stabilisation for hover and automatic landings of a rotary-wing unmanned aerial vehicle (RUAV) operating close to the landing deck in rough seas. Based on a helicopter model representing the aerodynamics during the landing phase, a non-linear state feedback H∞ controller is designed to achieve rapid horizontal position tracking in a gusty environment. Practical constraints, including flapping dynamics, servo dynamics and time lag effects, are considered. A high-fidelity closed-loop simulation using parameters of the Vario XLC gas-turbine helicopter verifies the performance of the proposed horizontal position controller. The proposed controller not only increases the disturbance attenuation capability of the RUAV, but also enables rapid position response when gusts occur. Comparative studies show that the H∞ controller exhibits performance improvement and can be applied to ship/RUAV landing systems.

Relevance:

10.00%

Abstract:

Purpose: To assess intrasessional and intersessional repeatability of two commercial partial coherence interferometry instruments for measuring peripheral eye lengths and to investigate the agreement between the two instruments. Methods: Central and peripheral eye lengths were determined with the IOLMaster (Carl-Zeiss Meditec AG, Jena, Germany) and the Lenstar (Haag Streit, Bern, Switzerland) in seven adults. Measurements were performed out to 35° and 30° from fixation for horizontal and vertical visual fields, respectively, in 5° intervals. An external fixation target at optical infinity was used. At least four measurements were taken at each location for each instrument, and measurements were taken at two sessions. Results: The mean intrasessional SDs for the IOLMaster along both the horizontal and vertical visual fields were 0.04 ± 0.04 mm; corresponding results for the Lenstar were 0.02 ± 0.02 mm along both fields. The intersessional SDs for the IOLMaster for the horizontal and vertical visual fields were ±0.11 and ±0.08 mm, respectively; corresponding limits for the Lenstar were ±0.05 and ±0.04 mm. The intrasessional and intersessional variability increased away from fixation. The mean differences between the two instruments were 0.01 ± 0.07 mm and 0.02 ± 0.07 mm in the horizontal and vertical visual fields, but the lengths with the Lenstar became greater than those with the IOLMaster as axial length increased (rate of approximately 0.016 mm/mm). Conclusions: Both the IOLMaster and the Lenstar demonstrated good intrasessional and intersessional repeatability for peripheral eye length measurements, with the Lenstar showing better repeatability. The Lenstar would be expected to give a slightly greater range of eye lengths than the IOLMaster across the visual field.

Relevance:

10.00%

Abstract:

This Article is about legal scholarly publication in a time of plenitude. It is an attempt to explain why the most pressing questions in legal scholarly publishing are about how we ensure access to an infinity of content. It explains why standard assumptions about resource scarcity in publication are wrong in general, and how the changes in the modality of publication affect legal scholarship. It talks about the economics of open access to legal material, and how this connects to a future where there is infinite content. And because student-edited law reviews fit this future better than their commercially-produced, peer-refereed cousins, this Article is, in part, a defense of the crazy-beautiful institution that is the American law review.

Relevance:

10.00%

Abstract:

The Fabens method is commonly used to estimate the growth parameters k and L-infinity in the von Bertalanffy model from tag-recapture data. However, the Fabens method of estimation has an inherent bias when individual growth is variable. This paper presents an asymptotically unbiased method using a maximum likelihood approach that takes account of individual variability in both maximum length and age-at-tagging. It is assumed that each individual's growth follows a von Bertalanffy curve with its own maximum length and age-at-tagging. The parameter k is assumed to be a constant to ensure that the mean growth follows a von Bertalanffy curve and to avoid overparameterization. Our method also makes more efficient use of the measurements at tagging and recapture and includes diagnostic techniques for checking distributional assumptions. The method is reasonably robust and performs better than the Fabens method when individual growth differs from the von Bertalanffy relationship. When measurement error is negligible, the estimation involves maximizing the profile likelihood of one parameter only. The method is applied to tag-recapture data for the grooved tiger prawn (Penaeus semisulcatus) from the Gulf of Carpentaria, Australia.
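The Fabens setup models the expected growth increment as dL = (Linf - l1) * (1 - exp(-k * dt)) and fits (Linf, k) by least squares. A grid-search sketch on synthetic, noiseless data (all values illustrative, not the prawn data; the maximum-likelihood method above replaces this with a likelihood that admits individual variability):

```python
import math

def fabens_predict(l1, dt, linf, k):
    """Expected von Bertalanffy growth increment over time dt (Fabens form)."""
    return (linf - l1) * (1.0 - math.exp(-k * dt))

def fabens_fit(data, linf_grid, k_grid):
    """Least-squares Fabens fit by grid search over candidate parameters.
    data: (length at tagging, time at liberty, observed increment) triples."""
    best = None
    for linf in linf_grid:
        for k in k_grid:
            sse = sum((dl - fabens_predict(l1, dt, linf, k)) ** 2
                      for l1, dt, dl in data)
            if best is None or sse < best[0]:
                best = (sse, linf, k)
    return best[1], best[2]
```

On noiseless data generated from a single curve the fit recovers the parameters exactly; the bias discussed above appears precisely when each individual has its own Linf.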

Relevance:

10.00%

Abstract:

Estimation of von Bertalanffy growth parameters has received considerable attention in fisheries research. Since Sainsbury (1980, Can. J. Fish. Aquat. Sci. 37: 241-247) much of this research effort has centered on accounting for individual variability in the growth parameters. In this paper we demonstrate that, in analysis of tagging data, Sainsbury's method and its derivatives do not, in general, satisfactorily account for individual variability in growth, leading to inconsistent parameter estimates (the bias does not tend to zero as sample size increases to infinity). The bias arises because these methods do not use appropriate conditional expectations as a basis for estimation. This bias is found to be similar to that of the Fabens method. Such methods would be appropriate only under the assumption that the individual growth parameters that generate the growth increment were independent of the growth parameters that generated the initial length. However, such an assumption would be unrealistic. The results are derived analytically, and illustrated with a simulation study. Until techniques that take full account of the appropriate conditioning have been developed, the effect of individual variability on growth has yet to be fully understood.

Relevance:

10.00%

Abstract:

We consider estimation of mortality rates and growth parameters from length-frequency data of a fish stock and derive the underlying length distribution of the population and the catch when there is individual variability in the von Bertalanffy growth parameter L-infinity. The model is flexible enough to accommodate 1) any recruitment pattern as a function of both time and length, 2) length-specific selectivity, and 3) varying fishing effort over time. The maximum likelihood method gives consistent estimates, provided the underlying distribution for individual variation in growth is correctly specified. Simulation results indicate that our method is reasonably robust to violations in the assumptions. The method is applied to tiger prawn data (Penaeus semisulcatus) to obtain estimates of natural and fishing mortality.
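The modelling choice of letting only L-infinity vary between individuals (with k shared) preserves a von Bertalanffy mean curve, because the growth factor (1 - exp(-k*t)) is common to everyone and the expectation passes through it. A quick Monte Carlo check with illustrative parameters (not the paper's likelihood or the prawn data):

```python
import math
import random

def mean_length_at_age(t, k, linf_mean, linf_sd, n=20000, seed=7):
    """Monte Carlo mean length at age t when only L-infinity varies between
    individuals and k is shared: L_i(t) = Linf_i * (1 - exp(-k * t))."""
    rng = random.Random(seed)
    growth = 1.0 - math.exp(-k * t)      # growth factor shared by all fish
    total = sum(rng.gauss(linf_mean, linf_sd) * growth for _ in range(n))
    return total / n
```

The sample mean should match mean(Linf) * (1 - exp(-k*t)) up to sampling noise, i.e. the population mean still follows a von Bertalanffy curve with the mean L-infinity.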

Relevance:

10.00%

Abstract:

We propose an iterative estimating equations procedure for analysis of longitudinal data. We show that, under very mild conditions, the probability that the procedure converges at an exponential rate tends to one as the sample size increases to infinity. Furthermore, we show that the limiting estimator is consistent and asymptotically efficient, as expected. The method applies to semiparametric regression models with unspecified covariances among the observations. In the special case of linear models, the procedure reduces to iterative reweighted least squares. Finite sample performance of the procedure is studied by simulations, and compared with other methods. A numerical example from a medical study is considered to illustrate the application of the method.
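In the linear-model special case the procedure reduces to iteratively reweighted least squares. A minimal sketch with an assumed working variance function (variance proportional to the squared fitted mean, an illustrative choice, not the paper's semiparametric covariance model):

```python
import numpy as np

def irls(X, y, n_iter=20):
    """Iteratively reweighted least squares for a linear mean model.
    Each pass re-estimates observation weights from a working variance
    model evaluated at the current fitted means, then re-solves."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]      # ordinary LS start
    for _ in range(n_iter):
        mu = X @ beta                                # current fitted means
        w = 1.0 / np.maximum(mu ** 2, 1e-8)          # assumed var ~ mu^2
        WX = X * w[:, None]
        beta = np.linalg.solve(X.T @ WX, WX.T @ y)   # weighted LS step
    return beta
```

Each iteration solves the estimating equation X^T W (y - X beta) = 0 at the current weights; the paper's result is that iterating this kind of update converges at an exponential rate with probability tending to one.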

Relevance:

10.00%

Abstract:

We consider estimation of mortality rates and growth parameters from length-frequency data of a fish stock when there is individual variability in the von Bertalanffy growth parameter L-infinity and investigate the possible bias in the estimates when the individual variability is ignored. Three methods are examined: (i) the regression method based on Beverton and Holt's (1956, Rapp. P.-V. Reun. Cons. Int. Explor. Mer, 140: 67-83) equation; (ii) the moment method of Powell (1979, Rapp. P.-V. Reun. Cons. Int. Explor. Mer, 175: 167-169); and (iii) a generalization of Powell's method that estimates the individual variability to be incorporated into the estimation. It is found that the biases in the estimates from the existing methods are, in general, substantial, even when individual variability in growth is small and recruitment is uniform; the generalized method performs better in terms of bias but is subject to larger variation. There is a need to develop robust and flexible methods to deal with individual variability in the analysis of length-frequency data.