295 results for order-statistics


Relevance: 20.00%

Abstract:

In Burrage and Burrage [1] it was shown that by introducing a very general formulation for stochastic Runge-Kutta methods, the previous strong order barrier of order one could be broken without having to use higher derivative terms. In particular, methods of strong order 1.5 were developed in which a Stratonovich integral of order one and one of order two were present in the formulation. In the present paper, general order results are proven about the maximum attainable strong order of these stochastic Runge-Kutta methods (SRKs) in terms of the order of the Stratonovich integrals appearing in the Runge-Kutta formulation. In particular, it will be shown that if an s-stage SRK contains Stratonovich integrals up to order p, then the strong order of the SRK cannot exceed min{(p + 1)/2, (s − 1)/2} for p ≥ 2 and s ≥ 3, or 1 if p = 1.
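
As a quick illustration of the bound (a minimal sketch, not from the paper; the function name is ours):

```python
# Upper bound on the attainable strong order stated above: for an
# s-stage SRK containing Stratonovich integrals up to order p, the
# strong order cannot exceed min((p + 1)/2, (s - 1)/2) for p >= 2 and
# s >= 3, and is at most 1 when p = 1.

def max_strong_order(s: int, p: int) -> float:
    if p == 1:
        return 1.0
    if p < 1 or s < 3:
        raise ValueError("bound stated for p >= 2 and s >= 3 (or p == 1)")
    return min((p + 1) / 2, (s - 1) / 2)

# e.g. a 4-stage method with integrals up to order 2 is capped at
# min(1.5, 1.5) = 1.5, matching the strong order 1.5 methods cited above.
print(max_strong_order(4, 2))  # 1.5
```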

Relevance: 20.00%

Abstract:

This work investigates the accuracy and efficiency tradeoffs between centralized and collective (distributed) algorithms for (i) sampling, and (ii) n-way data analysis techniques in multidimensional stream data, such as Internet chatroom communications. Its contributions are threefold. First, we use the Kolmogorov-Smirnov goodness-of-fit test to show that statistical differences between real data obtained by collective sampling in the time dimension from multiple servers and that obtained from a single server are insignificant. Second, we show using the real data that collective data analysis of 3-way data arrays (users x keywords x time), known as high-order tensors, is more efficient than centralized algorithms with respect to both space and computational cost. Furthermore, we show that this gain is obtained without loss of accuracy. Third, we examine the sensitivity of collective construction and analysis of high-order data tensors to the choice of server selection and sampling window size. We construct 4-way tensors (users x keywords x time x servers) and analyze them to show the impact of server and window size selections on the results.
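
The single-server versus collective comparison can be illustrated with SciPy's two-sample Kolmogorov-Smirnov test; a hedged sketch with simulated stand-in data (the array names and the exponential inter-arrival model are assumptions, not the paper's data):

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
single_server = rng.exponential(scale=1.0, size=5000)  # one server's sample
collective = rng.exponential(scale=1.0, size=5000)     # pooled multi-server sample

stat, p_value = ks_2samp(single_server, collective)
# A large p-value means we cannot reject the hypothesis that both samples
# come from the same distribution, i.e. the statistical difference between
# the two sampling strategies is insignificant.
print(f"KS statistic = {stat:.4f}, p = {p_value:.4f}")
```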

Relevance: 20.00%

Abstract:

In Virgtel Ltd v Zabusky [2009] QCA 92 the Queensland Court of Appeal considered the scope of an order “as to costs only” within the meaning of s 253 of the Supreme Court Act 1995 (Qld) (‘the Act’). The Court also declined to accept submissions from one of the parties after the oral hearing, and made some useful comments which serve as a reminder to practitioners of their obligations in that regard.

Relevance: 20.00%

Abstract:

The decision of Applegarth J in Heartwood Architectural & Joinery Pty Ltd v Redchip Lawyers [2009] QSC 195 (27 July 2009) involved a costs order against solicitors personally. This decision is but one of several recent decisions in which the court has been persuaded that the circumstances justified costs orders against legal practitioners on the indemnity basis. These decisions serve as a reminder to practitioners of their disclosure obligations when seeking any interlocutory relief in an ex parte application. These obligations are now clearly set out in r 14.4 of the Legal Profession (Solicitors) Rule 2007 and r 25 of the 2007 Barristers Rule. Inexperience or ignorance will not excuse breaches of the duties owed to the court.

Relevance: 20.00%

Abstract:

The conventional mechanical properties of articular cartilage, such as compressive stiffness, have been demonstrated to be limited in their capacity to distinguish intact (visually normal) from degraded cartilage samples. In this paper, we explore the correlation between a new mechanical parameter, namely the reswelling of articular cartilage following unloading from a given compressive load, and the near infrared (NIR) spectrum. The capacity to distinguish mechanically intact from proteoglycan-depleted tissue relative to the "reswelling" characteristic was first established, and the result was subsequently correlated with the NIR spectral data of the respective tissue samples. To achieve this, normal intact and enzymatically degraded samples were subjected to both NIR probing and mechanical compression based on a load-unload-reswelling protocol. The parameter δr, characteristic of the osmotic "reswelling" of the matrix after unloading to a constant small load of the order of the osmotic pressure of cartilage, was obtained for the different sample types. Multivariate statistical analysis was employed to determine the degree of correlation between δr and the NIR absorption spectrum of relevant specimens using Partial Least Squares (PLS) regression. The results show a strong relationship (R² = 95.89%, p < 0.0001) between the spectral data and δr. This correlation of δr with NIR spectral data suggests the potential for determining the reswelling characteristics non-destructively. It was also observed that δr values bear a significant relationship with cartilage matrix integrity, indicated by its proteoglycan content, and can therefore differentiate between normal and artificially degraded proteoglycan-depleted cartilage samples. It is therefore argued that the reswelling of cartilage, which is both biochemical (osmotic) and mechanical (hydrostatic pressure) in origin, could be a strong candidate for characterizing the tissue, especially in regions surrounding focal cartilage defects in joints.
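
A minimal sketch of the PLS step, assuming a samples-by-wavelengths spectra matrix and a per-sample δr vector (names, shapes and the synthetic data are illustrative, not the paper's data):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
spectra = rng.normal(size=(40, 600))   # 40 samples x 600 NIR wavelengths
delta_r = spectra[:, 100] * 0.5 + rng.normal(scale=0.1, size=40)

pls = PLSRegression(n_components=5)    # component count would normally be
pls.fit(spectra, delta_r)              # chosen by cross-validation
r2 = r2_score(delta_r, pls.predict(spectra).ravel())
print(f"R^2 = {100 * r2:.2f}%")        # the paper reports R^2 = 95.89%
```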

Relevance: 20.00%

Abstract:

Purpose: To use a large wavefront database of a clinical population to investigate relationships between refractions and higher-order aberrations, and between aberrations of right and left eyes. Methods: Third- and fourth-order aberration coefficients and higher-order root-mean-square aberrations (HO RMS), scaled to a pupil size of 4.5 mm diameter, were analysed in a population of about 24,000 patients from Carl Zeiss Vision's European wavefront database. Correlations were determined between the aberrations and the variables of refraction, near addition and cylinder. Results: Most aberration coefficients were significantly dependent upon these variables, but the proportions of aberrations that could be explained by these factors were less than 2% except for spherical aberration (12%), horizontal coma (9%) and HO RMS (7%). Near addition was the major contributor for horizontal coma (8.5% out of 9.5%) and spherical equivalent was the major contributor for spherical aberration (7.7% out of 11.6%). Interocular correlations were highly significant for all aberration coefficients, varying between 0.16 and 0.81. Anisometropia was a variable of significance for three aberrations (vertical coma, secondary astigmatism and tetrafoil), but little importance can be placed on this because of the small proportions of aberrations that can be explained by refraction (all less than 1.0%). Conclusions: Most third- and fourth-order aberration coefficients were significantly dependent upon spherical equivalent, near addition and cylinder, but only horizontal coma (9%) and spherical aberration (12%) showed dependencies of greater than 2%. Interocular correlations were highly significant for all aberration coefficients, but anisometropia had little influence on aberration coefficients.
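
A hedged sketch of the two analyses described, variance explained by the refraction variables and interocular correlation, using simulated stand-in arrays (all names and the toy data are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 24000
sph_eq = rng.normal(size=n)              # spherical equivalent
near_add = rng.uniform(0, 3, size=n)     # near addition
cyl = rng.normal(size=n)                 # cylinder
sa_right = 0.3 * sph_eq + rng.normal(size=n)   # spherical aberration, right eye
sa_left = 0.3 * sph_eq + rng.normal(size=n)    # spherical aberration, left eye

# Proportion of variance in spherical aberration explained by the
# refraction variables (ordinary least squares via lstsq).
X = np.column_stack([np.ones(n), sph_eq, near_add, cyl])
coef, *_ = np.linalg.lstsq(X, sa_right, rcond=None)
resid = sa_right - X @ coef
r2 = 1 - resid.var() / sa_right.var()
print(f"variance explained: {100 * r2:.1f}%")   # paper: ~12% for SA

# Interocular correlation of the coefficient (paper: 0.16 to 0.81).
print(f"interocular r = {np.corrcoef(sa_right, sa_left)[0, 1]:.2f}")
```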

Relevance: 20.00%

Abstract:

This article uses critical discourse analysis to analyse material shifts in the political economy of communications. It examines texts of major corporations to describe four key changes in political economy: (1) the separation of ownership from control; (2) the separation of business from industry; (3) the separation of accountability from responsibility; and (4) the subjugation of ‘going concerns’ by overriding concerns. The authors argue that this amounts to a political economic shift from traditional concepts of ‘capitalism’ to a new ‘corporatism’ in which the relationships between public and private, state and individual interests have become redefined and obscured through new discourse strategies. They conclude that the present financial and regulatory ‘crisis’ cannot be adequately resolved without a new analytic framework for examining the relationships between corporation, discourse and political economy.

Relevance: 20.00%

Abstract:

This paper describes an analysis of construction project bids to determine (a) the global distribution and (b) factors influencing the distribution of bids. The global distribution of bids was found, by using a battery of 11 test statistics, to be approximated by a three-parameter lognormal distribution. No global spread parameter was found. A multivariate analysis revealed the year of tender to be the major influencing factor. Consideration of the construction order, tender price and output indices led to the conclusion that distributional spread reflected the degree of difference in pricing policies between bidders and that the skewness of the distributions reflected the degree of competition. The paper concludes with a tentative model of the causal relationships between the factors and distributional characteristics involved.
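
The distributional fit can be sketched with SciPy, whose lognorm.fit returns shape, location and scale, i.e. the three-parameter form; the simulated bids below are purely illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical bids for one contract: a lognormal shifted by a
# location parameter (the "three-parameter" form).
bids = 1_000_000 + rng.lognormal(mean=12.0, sigma=0.2, size=30)

shape, loc, scale = stats.lognorm.fit(bids)        # 3-parameter ML fit
stat, p = stats.kstest(bids, "lognorm", args=(shape, loc, scale))
print(f"shape={shape:.3f} loc={loc:,.0f} scale={scale:,.0f}  KS p={p:.3f}")
```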

Relevance: 20.00%

Abstract:

Rayleigh–Stokes problems have in recent years received much attention due to their importance in physics. In this article, we focus on the variable-order Rayleigh–Stokes problem for a heated generalized second grade fluid with fractional derivative. Implicit and explicit numerical methods are developed to solve the problem. The convergence and stability of the numerical methods, as well as the solvability of the implicit numerical method, are discussed via Fourier analysis. Moreover, a numerical example is given, and the results support the effectiveness of the theoretical analysis.
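
For orientation (a sketch, not the paper's scheme): discretizations of fractional derivatives of this kind commonly use Grünwald–Letnikov weights, which with a variable order α(x, t) are simply recomputed wherever the order changes:

```python
import numpy as np

def gl_weights(alpha: float, n: int) -> np.ndarray:
    """First n Grunwald-Letnikov weights for order alpha, via the
    standard recursion g_0 = 1, g_k = g_{k-1} * (1 - (alpha + 1)/k)."""
    g = np.empty(n)
    g[0] = 1.0
    for k in range(1, n):
        g[k] = g[k - 1] * (1.0 - (alpha + 1.0) / k)
    return g

# With a variable order, the weights are recomputed at each grid point
# where alpha(x, t) takes a different value:
for alpha in (0.4, 0.6, 0.8):
    print(alpha, gl_weights(alpha, 5))
```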

Relevance: 20.00%

Abstract:

Fractional reaction–subdiffusion equations have been widely used in recent years to simulate physical phenomena. In this paper, we consider a variable-order nonlinear reaction–subdiffusion equation. A numerical approximation method is proposed to solve the equation, and its convergence and stability are analyzed by Fourier analysis. By means of a technique for improving temporal accuracy, we also propose an improved numerical approximation. Finally, the effectiveness of the theoretical results is demonstrated by numerical examples.
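
A toy explicit sketch of such a scheme (our construction under stated assumptions, not the paper's method): the fractional memory term is approximated with Grünwald–Letnikov weights and the solution is stepped forward explicitly; the reaction term f and the variable order γ(x, t) below are hypothetical:

```python
# Toy scheme for du/dt = K * D_t^{1-gamma(x,t)} (d^2u/dx^2) + f(u)
# on [0, L] with Dirichlet boundaries. Explicit stepping restricts dt;
# the paper analyses stability via Fourier methods.
import numpy as np

K, L, nx, nt, dt = 0.25, 1.0, 21, 200, 1e-4
dx = L / (nx - 1)
f = lambda u: u * (1.0 - u)                 # hypothetical nonlinear reaction
gamma = lambda x, t: 0.7 + 0.2 * x * t      # hypothetical variable order

u = np.sin(np.pi * np.linspace(0, L, nx))   # initial condition, u = 0 at ends
lap_hist = []                               # stored Laplacians (the memory)

for n in range(nt):
    t = n * dt
    lap = np.zeros(nx)
    lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
    lap_hist.append(lap)
    frac = np.zeros(nx)
    for i in range(1, nx - 1):
        g_val = gamma(i * dx, t)
        w, memo = 1.0, 0.0                  # GL weights built on the fly
        for k in range(len(lap_hist)):
            memo += w * lap_hist[-1 - k][i]
            w *= 1.0 - (2.0 - g_val) / (k + 1)   # fractional order is 1 - gamma
        frac[i] = dt ** (g_val - 1.0) * memo
    u = u + dt * (K * frac + f(u))
    u[0] = u[-1] = 0.0                      # Dirichlet boundaries

print(u.round(4))
```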

Relevance: 20.00%

Abstract:

In this paper we present a new simulation methodology for obtaining exact or approximate Bayesian inference for models for low-valued count time series data that have computationally demanding likelihood functions. The algorithm fits within the framework of particle Markov chain Monte Carlo (PMCMC) methods. The particle filter requires only model simulations and, in this regard, our approach has connections with approximate Bayesian computation (ABC). However, an advantage of using the PMCMC approach in this setting is that simulated data can be matched with the observed data one at a time, rather than attempting to match on the full dataset simultaneously or on a low-dimensional non-sufficient summary statistic, which is common practice in ABC. For low-valued count time series data we find that it is often computationally feasible to match simulated data with observed data exactly. Our particle filter maintains N particles by repeating the simulation until N + 1 exact matches are obtained. Our algorithm creates an unbiased estimate of the likelihood, resulting in exact posterior inferences when included in an MCMC algorithm. In cases where exact matching is computationally prohibitive, a tolerance is introduced as per ABC. A novel aspect of our approach is that we introduce auxiliary variables into our particle filter so that partially observed and/or non-Markovian models can be accommodated. We demonstrate that Bayesian model choice problems can be easily handled in this framework.
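
A hedged sketch of the exact-matching idea for a toy Poisson count model (the model, names and the N/(trials − 1) bookkeeping follow the generic "alive" particle filter construction; nothing here is the paper's exact algorithm):

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_step(state):
    """Toy latent AR(1) state with a Poisson count observation."""
    new_state = 0.8 * state + rng.normal(scale=0.3)
    return new_state, rng.poisson(np.exp(new_state))

def alive_filter(y_obs, N=100):
    log_lik = 0.0
    states = np.zeros(N)
    for y_t in y_obs:
        matched, trials = [], 0
        while len(matched) < N + 1:          # stop at N + 1 exact matches
            s, y_sim = simulate_step(states[rng.integers(N)])
            trials += 1
            if y_sim == y_t:
                matched.append(s)
        log_lik += np.log(N / (trials - 1))  # unbiased likelihood factor
        states = np.array(matched[:N])       # keep N matched particles
    return log_lik

print(alive_filter([1, 2, 1, 0, 3]))
```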

Relevance: 20.00%

Abstract:

Given global demand for new infrastructure, governments face substantial challenges in funding new infrastructure and delivering Value for Money (VfM). Against this background, a critique is given of current practice in selecting the approach used to procure major public sector infrastructure in Australia, which is akin to the Multi-Attribute Utility Approach (MAUA). To address the key weaknesses of MAUA, a new first-order procurement decision-making model is presented. The model addresses the make-or-buy decision (risk allocation), the bundling decision (property rights incentives) and the exchange relationship decision (relational to arm's-length exchange) in a novel approach to articulating a procurement strategy designed to yield superior VfM across the whole life of the asset. The aim of this paper is to report on the development of this decision-making model in terms of the procedural tasks to be followed and the method being used to test the model. The planned approach to testing the model uses a sample of 87 Australian major infrastructure projects with a combined value of AUD 32 billion, and deploys the number of expressions of interest received, an indicator of competition, as a key proxy for VfM.

Relevance: 20.00%

Abstract:

Technological growth in the 21st century is exponential, while understanding of the associated risk, uncertainty and user acceptance remains scattered. This calls for appropriate study of people accepting controversial technology (PACT). The Internet and the services around it, such as the World Wide Web, e-mail, instant messaging and social networking, are becoming increasingly important in many aspects of our lives. Sharing medical and personal health information over the Internet is controversial and demands validity, usability and acceptance. While the literature suggests that the Internet enhances positive interactions between patients and physicians, some studies establish the opposite, in particular the associated risk. In recent years the Internet has attracted considerable attention as a means to improve health and health care delivery. However, it is not clear how widespread the use of the Internet for health care really is, or what impact it has on health care utilisation. Estimates of the impact of Internet usage vary widely across locations, both locally and globally. As a result, an estimate (or prediction) of Internet use and its effects in Medical Informatics related decision-making is impractical. This opens up research issues on validating and accepting Internet usage when designing and developing appropriate policy and process activities for Medical Informatics, Health Informatics and/or e-Health related protocols. Access to data on Internet usage for Medical Informatics related activities is limited. This paper presents a trend analysis of the growth of Internet usage in medical informatics related activities. To perform the analysis, data were extracted from publications in ERA (Excellence in Research for Australia) ranked "A" and "A*" journals and from reports in the authenticated public domain. The study is limited to analyses of Internet usage trends in the United States, Italy, France and Japan. Projected trends and their influence on the field of medical informatics are reviewed and discussed. The study clearly indicates a trend of patients becoming active consumers of health information rather than passive recipients.

Relevance: 20.00%

Abstract:

Higher-order thinking has featured persistently in the reform agenda for science education. The intended curriculum in various countries sets out aspirational statements for the levels of higher-order thinking to be attained by students. This study reports the extent to which chemistry examinations from four Australian states align with and facilitate the higher-order thinking skills stipulated in curriculum documents. Through content analysis, the curriculum goals were identified for each state and compared to the nature of question items in the corresponding examinations. Categories of higher-order thinking were adapted from the OECD's PISA Science test to analyze question items. There was considerable variation in the extent to which the examinations from the states supported the curriculum intent of developing and assessing higher-order thinking. Generally, examinations that used a marks-based system tended to emphasize lower-order thinking, with a greater distribution of marks allocated to lower-order thinking questions. Examinations associated with criterion-referenced assessment tended to award greater credit for higher-order thinking questions. The level of complexity of the chemistry content was another factor that limited the extent to which examination questions supported higher-order thinking. Implications from these findings are drawn for the authorities responsible for designing curriculum and assessment procedures, and for teachers.

Relevance: 20.00%

Abstract:

Robust hashing is an emerging field that can be used to hash certain data types in applications unsuitable for traditional cryptographic hashing methods. Traditional hashing functions have been used extensively for data/message integrity, data/message authentication, efficient file identification and password verification. These applications are possible because the hashing process is compressive, allowing for efficient comparisons in the hash domain, but non-invertible, meaning hashes can be used without revealing the original data. These techniques were developed with deterministic (non-changing) inputs such as files and passwords. For such data types a 1-bit or one-character change can be significant, so the hashing process is sensitive to any change in the input. Unfortunately, there are certain applications where input data are not perfectly deterministic and minor changes cannot be avoided. Digital images and biometric features are two types of data where such changes exist but do not alter the meaning or appearance of the input. For such data types cryptographic hash functions cannot be usefully applied.

In light of this, robust hashing has been developed as an alternative to cryptographic hashing and is designed to be robust to minor changes in the input. Although similar in name, robust hashing is fundamentally different from cryptographic hashing. Current robust hashing techniques are based not on cryptographic methods but on pattern recognition techniques. Modern robust hashing algorithms consist of feature extraction followed by a randomization stage that introduces non-invertibility and compression, followed by quantization and binary encoding to produce a binary hash output. To preserve the robustness of the extracted features, most randomization methods are linear, and this is detrimental to the security aspects required of hash functions. Furthermore, the quantization and encoding stages used to binarize real-valued features require the learning of appropriate quantization thresholds. How these thresholds are learnt has an important effect on hashing accuracy, and the mere presence of such thresholds is a source of information leakage that can reduce hashing security.

This dissertation outlines a systematic investigation of the quantization and encoding stages of robust hash functions. While existing literature has focused on the importance of the quantization scheme, this research is the first to emphasise the importance of quantizer training for both hashing accuracy and hashing security. The quantizer training process is presented in a statistical framework which allows a theoretical analysis of the effects of quantizer training on hashing performance. This is experimentally verified using a number of baseline robust image hashing algorithms over a large database of real-world images.

The dissertation also proposes a new randomization method for robust image hashing based on Higher Order Spectra (HOS) and Radon projections. The method is non-linear, an essential requirement for non-invertibility, and is designed to produce features better suited to quantization and encoding. The system can operate without the need for quantizer training, is more easily encoded and displays improved hashing performance when compared to existing robust image hashing algorithms. The dissertation also shows how the HOS method can be adapted to work with biometric features obtained from 2D and 3D face images.
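
A simplified sketch of the generic robust-hashing pipeline described above, feature projection, threshold quantization and binary encoding, with a key-dependent linear projection standing in for the randomization stage (all names and data are illustrative; the dissertation's HOS/Radon method is non-linear):

```python
import numpy as np

rng = np.random.default_rng(42)
KEY_PROJ = rng.normal(size=(32, 256))        # key-dependent random projection

def robust_hash(features: np.ndarray, thresholds: np.ndarray) -> np.ndarray:
    """Project a 256-dim feature vector and binarize into a 32-bit hash."""
    projected = KEY_PROJ @ features
    return (projected > thresholds).astype(np.uint8)

def hamming(h1: np.ndarray, h2: np.ndarray) -> int:
    return int(np.sum(h1 != h2))

# Thresholds are learnt from training data (here: medians of projected
# hypothetical training features) -- the step whose accuracy and security
# impact the dissertation analyses.
train = rng.normal(size=(1000, 256))
thresholds = np.median(train @ KEY_PROJ.T, axis=0)

img = rng.normal(size=256)                      # stand-in feature vector
noisy = img + rng.normal(scale=0.05, size=256)  # minor content change
print(hamming(robust_hash(img, thresholds), robust_hash(noisy, thresholds)))
```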