998 results for order statistic


Relevance: 20.00%

Abstract:

Rayleigh–Stokes problems have received much attention in recent years due to their importance in physics. In this article, we focus on the variable-order Rayleigh–Stokes problem for a heated generalized second grade fluid with a fractional derivative. Implicit and explicit numerical methods are developed to solve the problem. The convergence and stability of the numerical methods, as well as the solvability of the implicit method, are discussed via Fourier analysis. Moreover, a numerical example is given, and the results support the effectiveness of the theoretical analysis.
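For reference, the variable-order fractional derivative appearing in such problems is commonly taken in the Riemann–Liouville sense; a representative form (an illustrative definition, not necessarily the exact operator used in the article) is:

```latex
% Variable-order Riemann–Liouville fractional derivative of order gamma(x,t)
% (illustrative form; the article's exact operator may differ)
{}_{0}D_{t}^{\gamma(x,t)} u(x,t)
  = \frac{1}{\Gamma\!\left(1-\gamma(x,t)\right)}
    \frac{\partial}{\partial t}
    \int_{0}^{t} \frac{u(x,s)}{(t-s)^{\gamma(x,t)}}\,\mathrm{d}s,
  \qquad 0 < \gamma(x,t) < 1 .
```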

Relevance: 20.00%

Abstract:

Fractional reaction–subdiffusion equations have been widely used in recent years to simulate physical phenomena. In this paper, we consider a variable-order nonlinear reaction–subdiffusion equation. A numerical approximation method is proposed to solve the equation, and its convergence and stability are analyzed by Fourier analysis. By means of a technique for improving temporal accuracy, we also propose an improved numerical approximation. Finally, the effectiveness of the theoretical results is demonstrated by numerical examples.
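The abstract does not spell out the temporal-accuracy technique; one common option is Richardson extrapolation in time, sketched below under the assumption of a first-order one-step scheme (`step_scheme` is a hypothetical interface, not the authors' code).

```python
import numpy as np

def richardson_in_time(step_scheme, u0, T, n_steps):
    """Richardson extrapolation in time: a common way to raise temporal
    accuracy, not necessarily the authors' technique.

    step_scheme(u, dt) advances a NumPy array u by one step of size dt and
    is assumed first-order accurate in time.
    """
    def run(n):
        dt = T / n
        u = u0.copy()
        for _ in range(n):
            u = step_scheme(u, dt)
        return u

    u_h = run(n_steps)        # solution computed with step dt
    u_h2 = run(2 * n_steps)   # solution computed with step dt/2
    # For a first-order scheme, 2*u(dt/2) - u(dt) cancels the leading error term.
    return 2.0 * u_h2 - u_h

# Example with a hypothetical first-order step for du/dt = -u:
u_T = richardson_in_time(lambda u, dt: u - dt * u, np.array([1.0]), T=1.0, n_steps=50)
```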

Relevance: 20.00%

Abstract:

In this paper we present a new simulation methodology for obtaining exact or approximate Bayesian inference for models of low-valued count time series data that have computationally demanding likelihood functions. The algorithm fits within the framework of particle Markov chain Monte Carlo (PMCMC) methods. The particle filter requires only model simulations and, in this regard, our approach has connections with approximate Bayesian computation (ABC). However, an advantage of using the PMCMC approach in this setting is that simulated data can be matched with the observed data one at a time, rather than attempting to match the full dataset simultaneously or a low-dimensional non-sufficient summary statistic, which is common practice in ABC. For low-valued count time series data we find that it is often computationally feasible to match simulated data with observed data exactly. Our particle filter maintains $N$ particles by repeating the simulation until $N+1$ exact matches are obtained. Our algorithm yields an unbiased estimate of the likelihood, resulting in exact posterior inference when embedded in an MCMC algorithm. In cases where exact matching is computationally prohibitive, a tolerance is introduced as in ABC. A novel aspect of our approach is that we introduce auxiliary variables into the particle filter so that partially observed and/or non-Markovian models can be accommodated. We demonstrate that Bayesian model choice problems can be easily handled in this framework.
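As a rough illustration of the matching step described above (a hypothetical interface, simplified to a single observation and ignoring particle bookkeeping), one might write:

```python
def exact_match_likelihood(simulate_obs, y_obs, n_particles):
    """Hedged sketch of the exact-matching idea in the abstract: simulate until
    N + 1 exact matches with the observed count are obtained, then use the
    classical inverse-binomial estimator N / (K - 1), where K is the total
    number of simulations, which is unbiased for the per-simulation match
    probability.

    simulate_obs() is a hypothetical stand-in for: pick a particle ancestor,
    propagate it one step, and generate a simulated observation.
    """
    n_matches, total_sims = 0, 0
    while n_matches < n_particles + 1:
        total_sims += 1
        if simulate_obs() == y_obs:   # exact match for a low-valued count
            n_matches += 1
    return n_particles / (total_sims - 1)
```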

Relevance: 20.00%

Abstract:

Given global demand for new infrastructure, governments face substantial challenges in funding new infrastructure and delivering Value for Money (VfM). As background to this challenge, a critique is given of current practice in selecting the approach used to procure major public sector infrastructure in Australia, which is akin to the Multi-Attribute Utility Approach (MAUA). To help address the key weaknesses of MAUA, a new first-order procurement decision-making model is presented. The model addresses the make-or-buy decision (risk allocation), the bundling decision (property rights incentives), and the exchange relationship decision (relational to arm's-length exchange) in a novel approach to articulating a procurement strategy designed to yield superior VfM across the whole life of the asset. The aim of this paper is to report on the development of this decision-making model in terms of the procedural tasks to be followed and the method being used to test the model. The planned approach to testing the model uses a sample of 87 Australian major infrastructure projects with a combined value of AUD 32 billion and deploys expressions of interest, an indicator of competition, as a key proxy for VfM.

Relevance: 20.00%

Abstract:

Higher-order thinking has featured persistently in the reform agenda for science education. The intended curriculum in various countries sets out aspirational statements for the levels of higher-order thinking to be attained by students. This study reports the extent to which chemistry examinations from four Australian states align with and facilitate the higher-order thinking skills stipulated in curriculum documents. Through content analysis, the curriculum goals were identified for each state and compared to the nature of question items in the corresponding examinations. Categories of higher-order thinking were adapted from the OECD’s PISA Science test to analyze question items. There was considerable variation in the extent to which the examinations from the states supported the curriculum intent of developing and assessing higher-order thinking. Generally, examinations that used a marks-based system tended to emphasize lower-order thinking, with a greater share of marks allocated to lower-order thinking questions, whereas criterion-referenced examinations tended to award greater credit for higher-order thinking questions. The complexity of the chemistry content was another factor that limited the extent to which examination questions supported higher-order thinking. Implications of these findings are drawn for the authorities responsible for designing curriculum and assessment procedures and for teachers.

Relevance: 20.00%

Abstract:

Robust hashing is an emerging field that can be used to hash certain data types in applications unsuitable for traditional cryptographic hashing methods. Traditional hash functions have been used extensively for data/message integrity, data/message authentication, efficient file identification and password verification. These applications are possible because the hashing process is compressive, allowing for efficient comparisons in the hash domain, but non-invertible, meaning hashes can be used without revealing the original data. These techniques were developed for deterministic (non-changing) inputs such as files and passwords. For such data types a 1-bit or one-character change can be significant; as a result, the hashing process is sensitive to any change in the input. Unfortunately, there are applications where input data are not perfectly deterministic and minor changes cannot be avoided. Digital images and biometric features are two types of data where such changes exist but do not alter the meaning or appearance of the input. For such data types cryptographic hash functions cannot be usefully applied. In light of this, robust hashing has been developed as an alternative to cryptographic hashing and is designed to be robust to minor changes in the input. Although similar in name, robust hashing is fundamentally different from cryptographic hashing. Current robust hashing techniques are based not on cryptographic methods but on pattern recognition techniques. Modern robust hashing algorithms consist of feature extraction, followed by a randomization stage that introduces non-invertibility and compression, followed by quantization and binary encoding to produce a binary hash output. In order to preserve the robustness of the extracted features, most randomization methods are linear, which is detrimental to the security properties required of hash functions. Furthermore, the quantization and encoding stages used to binarize real-valued features require the learning of appropriate quantization thresholds. How these thresholds are learnt has an important effect on hashing accuracy, and the mere presence of such thresholds is a source of information leakage that can reduce hashing security. This dissertation outlines a systematic investigation of the quantization and encoding stages of robust hash functions. While the existing literature has focused on the importance of the quantization scheme, this research is the first to emphasise the importance of quantizer training for both hashing accuracy and hashing security. The quantizer training process is presented in a statistical framework which allows a theoretical analysis of the effects of quantizer training on hashing performance. This is experimentally verified using a number of baseline robust image hashing algorithms over a large database of real-world images. This dissertation also proposes a new randomization method for robust image hashing based on Higher Order Spectra (HOS) and Radon projections. The method is non-linear, which is an essential requirement for non-invertibility, and is designed to produce features more suited to quantization and encoding. The system can operate without the need for quantizer training, is more easily encoded and displays improved hashing performance when compared to existing robust image hashing algorithms. The dissertation also shows how the HOS method can be adapted to work with biometric features obtained from 2D and 3D face images.
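A minimal sketch of the quantization and encoding stage discussed above, using per-dimension median thresholds learnt from training data (an illustrative choice, not the dissertation's specific scheme):

```python
import numpy as np

def learn_thresholds(training_features):
    """Learn per-dimension quantization thresholds; here simply the medians of
    a training set (illustrative, not the dissertation's training procedure)."""
    return np.median(training_features, axis=0)

def binarize(features, thresholds):
    """Encode real-valued features as a binary hash: 1 where a feature exceeds
    its learned threshold, 0 otherwise."""
    return (features > thresholds).astype(np.uint8)

# Example with random placeholder "features" standing in for the randomized
# feature vector produced by the earlier stages of a robust hash.
rng = np.random.default_rng(0)
train = rng.normal(size=(1000, 64))       # hypothetical training features
thr = learn_thresholds(train)
hash_bits = binarize(rng.normal(size=64), thr)
```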

Relevance: 20.00%

Abstract:

Purpose/aim: Myopia incidence is increasing around the world. Myopisation is considered to be caused by a variety of factors; one consideration is whether higher-order aberrations (HOA) influence myopisation. More knowledge of the optics of anisometropic eyes might give further insight into the development of refractive error. Materials and methods: To analyse the possible influence of HOA on refractive error development, we compared HOA between anisometropes and isometropes. We analysed HOA up to the 4th order for both eyes of 20 anisometropes (mean age: 43 ± 17 years) and 20 isometropes (mean age: 33 ± 17 years). HOA were measured with the Shack-Hartmann i.Profiler (Carl Zeiss, Germany) and were recalculated for a 4 mm pupil. Mean spherical equivalent (MSE) was based on the subjective refraction. Anisometropia was defined as an interocular difference in MSE of ≥ 1 D. The mean absolute interocular differences in spherical equivalent were 0.28 ± 0.21 D in the isometropic group and 2.81 ± 2.04 D in the anisometropic group. Interocular differences in HOA were compared with the interocular difference in MSE using correlations. Results: For isometropes, oblique trefoil, vertical coma, horizontal coma and spherical aberration showed significant correlations between the two eyes. In anisometropes, all analysed higher-order aberrations correlated significantly between the two eyes except oblique secondary astigmatism and secondary astigmatism. When analysing anisometropes and isometropes separately, no significant correlations were found between interocular differences of higher-order aberrations and MSE. For isometropes and anisometropes combined, tetrafoil correlated significantly with MSE in left eyes. Conclusions: The present study could not show that interocular differences in higher-order aberrations increase with increasing interocular difference in MSE.
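A hedged sketch of the correlation analysis described above, with random placeholder data in place of the study measurements:

```python
import numpy as np
from scipy.stats import pearsonr

# Correlate the interocular difference (right minus left) of one higher-order
# aberration term with the interocular difference in mean spherical
# equivalent (MSE). The arrays below are random placeholders, not study data.
rng = np.random.default_rng(1)
n = 20                                      # subjects per group in the abstract
coma_right, coma_left = rng.normal(size=n), rng.normal(size=n)
mse_right, mse_left = rng.normal(size=n), rng.normal(size=n)

r, p = pearsonr(coma_right - coma_left, mse_right - mse_left)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```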

Relevance: 20.00%

Abstract:

Interpreting acoustic recordings of the natural environment is an increasingly important technique for ecologists wishing to monitor terrestrial ecosystems. Technological advances make it possible to accumulate many more recordings than can be listened to or interpreted, thereby necessitating automated assistance to identify elements in the soundscape. In this paper we examine the problem of estimating avian species richness by sampling from very long acoustic recordings. We work with data recorded under natural conditions, with all the attendant problems of undefined and unconstrained acoustic content (such as wind, rain, traffic, etc.) which can mask the content of interest (in our case, bird calls). We describe 14 acoustic indices calculated at one-minute resolution for the duration of a 24-hour recording. An acoustic index is a statistic that summarizes some aspect of the structure and distribution of acoustic energy and information in a recording. Some of the indices we calculate are standard (e.g. signal-to-noise ratio), some have been reported to be useful for the detection of bioacoustic activity (e.g. temporal and spectral entropies) and some are directed at avian sources (spectral persistence of whistles). We rank the one-minute segments of a 24-hour recording in descending order according to an "acoustic richness" score derived from a single index or a weighted combination of two or more. We describe combinations of indices which lead to more efficient estimates of species richness than random sampling from the same recording, where efficiency is defined as the total number of species identified for a given listening effort. Using random sampling, we achieve a 53% increase in species recognized over traditional field surveys, and an increase of 87% using combinations of indices to direct the sampling. We also demonstrate how combinations of the same indices can be used to detect long-duration acoustic events (such as heavy rain and cicada chorus) and to construct long-duration (24 h) spectrograms.
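To illustrate the idea of ranking segments by an acoustic index, here is a minimal sketch using a single index (normalized spectral entropy) and a hypothetical single-index richness score; the paper itself combines up to 14 indices, including weighted combinations:

```python
import numpy as np

def spectral_entropy(segment, eps=1e-12):
    """Normalized spectral entropy of one audio segment (one illustrative
    acoustic index; the paper's other 13 indices are not reproduced here)."""
    spectrum = np.abs(np.fft.rfft(segment)) ** 2
    p = spectrum / (spectrum.sum() + eps)
    return -np.sum(p * np.log2(p + eps)) / np.log2(len(p))

def rank_segments(segments):
    """Rank segments by a hypothetical 'acoustic richness' score, here simply
    1 - spectral entropy, in descending order of score."""
    scores = [1.0 - spectral_entropy(s) for s in segments]
    return np.argsort(scores)[::-1]

# Example with random placeholder clips (not one-minute field recordings).
rng = np.random.default_rng(0)
segments = [rng.normal(size=22050 * 5) for _ in range(5)]
print(rank_segments(segments))
```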

Relevance: 20.00%

Abstract:

The article examines the evidence of endemic financial crime in the global financial crisis (GFC), the legal impunity surrounding these crimes and the popular revolt against these abuses in the financial, political and legal systems. This is set against a consideration of the development since the 1970s of a conservative politics championing deregulation, unfettered markets, welfare cuts and harsh law and order policies. On the one hand, this led to massively increased inequality and concentrations of wealth and political power in the hands of the super-rich, effectively placing them above the law, as the GFC revealed. On the other, a greatly enlarged, more punitive criminal justice system was directed at poor and minority communities. Explanations in terms of the rise of penal populism help to account for these developments, but it is argued that they adopt a limited and reductionist view of populism, failing to see the prospects for a progressive populist politics to redirect political attention to issues of inequality and corporate and white-collar criminality.

Relevance: 20.00%

Abstract:

In our rejoinder to Don Weatherburn's paper, "Law and Order Blues", we do not take issue with his advocacy of the need to take crime seriously and to foster a more rational approach to the problems it poses. Where differences do emerge is (1) with his claim that he is willing to do so whilst we (in our different ways) are not, and (2) on the question of what this involves. Of particular concern is the way in which his argument proceeds by a combination of simple misrepresentation of the positions it seeks to disparage and silence concerning issues of real substance where intellectual debate and exchange would be welcome and useful. Our paper challenges, in turn, the misrepresentation of Indermaur's analysis of trends in violent crime, the misrepresentation of Hogg and Brown's Rethinking Law and Order, the misrepresentation of the findings of some of the research into the effectiveness of punitive policies, and the silence on sexual assault in "Law and Order Blues". We suggest that this silence on sexual assault reflects a more widespread unwillingness to acknowledge the methodological problems that arise in the measurement of crime, because such problems severely limit the extent to which confident assertions can be made about prevalence and trends.

Relevance: 20.00%

Abstract:

Diagnostics of rotating machinery has developed significantly in recent decades, and industrial applications are spreading across different sectors. Most applications are characterized by varying shaft velocities, and in many cases transients are the most critical conditions to monitor. Under these variable speed conditions, fault symptoms are clearer in the angular/order domain than in the common time/frequency one. In the past, this issue was often solved by sampling data synchronously, by means of phase-locked circuits governing the acquisition; however, thanks to the spread of cheap and powerful microprocessors, this procedure is nowadays rarer: sampling is usually performed at constant time intervals, and the conversion to the order domain is made by means of digital signal processing techniques. In recent decades, different algorithms have been proposed for the extraction of an order spectrum from a signal sampled asynchronously with respect to the shaft rotational velocity; many of them (the so-called computed order tracking family) use interpolation techniques to resample the signal at constant angular increments, followed by a common discrete Fourier transform to shift from the angular domain to the order domain. A less exploited family of techniques shifts directly from the time domain to the order spectrum by means of modified Fourier transforms. This paper proposes a new transform, named the velocity synchronous discrete Fourier transform, which takes advantage of the instantaneous velocity to improve the quality of its result, reaching a performance that can challenge computed order tracking.
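For context, a minimal sketch of the computed order tracking family mentioned above (angular resampling followed by a DFT); this is not the velocity synchronous discrete Fourier transform proposed in the paper, and the inputs are hypothetical:

```python
import numpy as np

def computed_order_tracking(signal, shaft_angle, orders_per_rev=64):
    """Computed order tracking sketch: resample a time-sampled vibration
    signal at constant angular increments, then take a DFT over the angle
    axis to obtain an order spectrum.

    signal      : vibration samples taken at constant time intervals
    shaft_angle : shaft angle (rad) at those sample times, e.g. integrated
                  from a tachometer signal (assumed monotonically increasing)
    """
    n_rev = (shaft_angle[-1] - shaft_angle[0]) / (2 * np.pi)
    theta = np.linspace(shaft_angle[0], shaft_angle[-1],
                        int(n_rev * orders_per_rev), endpoint=False)
    resampled = np.interp(theta, shaft_angle, signal)   # angle-domain resampling
    spectrum = np.abs(np.fft.rfft(resampled)) / len(resampled)
    orders = np.fft.rfftfreq(len(resampled), d=1.0 / orders_per_rev)
    return orders, spectrum
```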

Relevance: 20.00%

Abstract:

Cyclostationary models for the diagnostic signals measured on faulty rotating machines have proved successful in many laboratory tests and industrial applications. The squared envelope spectrum has been pointed out as the most efficient indicator for the assessment of second-order cyclostationary symptoms of damage, which are typical, for instance, of rolling element bearing faults. In an attempt to foster the spread of rotating machinery diagnostics, the current trend in the field is to reach higher levels of automation of condition monitoring systems. For this purpose, statistical tests for the presence of cyclostationarity have been proposed in recent years. The statistical thresholds proposed in the past for the identification of cyclostationary components were obtained under the hypothesis that the signal is white noise when the component is healthy. This assumption, coupled with the non-white nature of real signals, implies the need to pre-whiten or filter the signal in optimal narrow bands, increasing the complexity of the algorithm and the risk of losing diagnostic information or introducing biases in the result. In this paper, the authors introduce an original analytical derivation of the statistical tests for cyclostationarity in the squared envelope spectrum, dropping the hypothesis of white noise from the beginning. The effect of first-order and second-order cyclostationary components on the distribution of the squared envelope spectrum is quantified, and the effectiveness of the newly proposed threshold is verified, providing a sound theoretical basis and a practical starting point for efficient automated diagnostics of machine components such as rolling element bearings. The analytical results are verified by means of numerical simulations and by using experimental vibration data from rolling element bearings.
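A minimal sketch of the squared envelope spectrum itself (the statistical thresholds derived in the paper are not reproduced here):

```python
import numpy as np
from scipy.signal import hilbert

def squared_envelope_spectrum(x, fs):
    """Squared envelope spectrum of a vibration signal: squared magnitude of
    the analytic signal, DC removed, followed by a DFT. Returns the cyclic
    frequency axis (Hz) and the spectrum magnitude."""
    env2 = np.abs(hilbert(x)) ** 2          # squared envelope
    env2 -= env2.mean()                     # drop the DC component
    ses = np.abs(np.fft.rfft(env2)) / len(x)
    cyclic_freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return cyclic_freqs, ses
```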

Relevance: 20.00%

Abstract:

The transmission path from the excitation to the vibration measured on the surface of a mechanical system introduces a distortion in both amplitude and phase. Moreover, in variable speed conditions, the amplification/attenuation and the phase shift due to the transfer function of the mechanical system vary in time. This phenomenon reduces the effectiveness of traditional tachometer-based order tracking, compromising the results of a discrete-random separation performed by synchronous averaging. In this paper, for the first time, the extent of the distortion is identified both in the time domain and in the order spectrum of the signal, highlighting the consequences for the diagnostics of rotating machinery. A particular focus is given to gears, providing some indications on how to take advantage of the quantification of the disturbance to better tune the techniques developed for compensating the distortion. The full theoretical analysis is presented and the results are applied to an experimental case.
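For reference, a minimal sketch of the synchronous averaging step mentioned above, assuming the signal has already been resampled to a constant angular step (the paper's compensation of the transmission-path distortion is not reproduced here):

```python
import numpy as np

def synchronous_average(signal_angle_domain, samples_per_rev):
    """Synchronous averaging for discrete-random separation: average an
    angle-domain signal (NumPy array) over complete revolutions so that
    components synchronous with the shaft remain while the rest averages out."""
    n_rev = len(signal_angle_domain) // samples_per_rev
    revs = signal_angle_domain[:n_rev * samples_per_rev].reshape(n_rev, samples_per_rev)
    return revs.mean(axis=0)
```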

Relevance: 20.00%

Abstract:

Shanghai, once referred to as the "Paris of the East", possesses an apt legacy. Municipal aspirations for Shanghai to assume a position among the great fashion cities of the world have been integrated into the recent re-shaping of this modern city into a role model for Chinese creative enterprise, yet China is still known primarily as a centre of clothing production. Increasingly, however, "Made in China" is being replaced by "Created in China", drawing attention to two distinct consumer markets for Chinese designers. Fashion designers who have entered the global fashion system for education or by showing their collections have generally adopted a design aesthetic that aligns with Western markets, allowing little competitive advantage. In contrast, Chinese designers who rest their attention on the domestic Chinese market find a disparate, highly competitive marketplace. The pillars of authenticity that, for foreign fashion brands, extend far into their cultural and creative histories, often for many decades in the case of Louis Vuitton, Hermes and Christian Dior, do not yet exist in China in this era of rapid globalisation. Here, the cultural bedrock allows these same pillars to extend only thirty years or so into the past, reaching the moments when Deng Xiaoping granted China’s creative entrepreneurs passage. To this end, interviews with fashion designers in Shanghai have been undertaken during the last twelve months for a PhD dissertation. Production of culture theory has been used to identify the working methods, practices of production and the social and cultural milieu necessary for designers to achieve viability.