399 results for default probability


Relevance:

10.00%

Publisher:

Abstract:

The performance of an adaptive filter may be studied through the behaviour of the optimal and adaptive coefficients in a given environment. This thesis investigates the performance of finite impulse response adaptive lattice filters for two classes of input signals: (a) frequency modulated signals with polynomial phases of order p in complex Gaussian white noise (as nonstationary signals), and (b) impulsive autoregressive processes with alpha-stable distributions (as non-Gaussian signals). An overview of linear prediction and adaptive filtering is given first. The convergence and tracking properties of stochastic gradient algorithms are discussed for stationary and nonstationary input signals. It is explained that the stochastic gradient lattice algorithm has many advantages over the least-mean square algorithm, including a modular structure, easily guaranteed stability, lower sensitivity to the eigenvalue spread of the input autocorrelation matrix, and easy quantization of the filter coefficients (normally called reflection coefficients). We then characterize the performance of the stochastic gradient lattice algorithm for frequency modulated signals through the optimal and adaptive lattice reflection coefficients. This is a difficult task due to the nonlinear dependence of the adaptive reflection coefficients on the preceding stages and the input signal. To ease the derivations, we assume that the reflection coefficients of each stage are independent of the inputs to that stage. The optimal lattice filter is then derived for frequency modulated signals by computing the optimal values of the residual errors, reflection coefficients, and recovery errors. Next, we show the tracking behaviour of the adaptive reflection coefficients for frequency modulated signals by deriving the average tracking model of these coefficients for the stochastic gradient lattice algorithm. The second-order convergence of the adaptive coefficients is investigated by modeling the theoretical asymptotic variance of the gradient noise at each stage. The accuracy of the analytical results is verified by computer simulations. Using these analytical results, we show a new property, the polynomial order reducing property of adaptive lattice filters, which may be used to reduce the order of the polynomial phase of input frequency modulated signals. Two examples show how this property may be used in processing frequency modulated signals. In the first example, a detection procedure is carried out on a frequency modulated signal with a second-order polynomial phase in complex Gaussian white noise. We show that, using this technique, a better probability of detection is obtained for the reduced-order phase signals than with the traditional energy detector. It is also shown empirically that the distribution of the gradient noise in the first adaptive reflection coefficients approximates the Gaussian law. In the second example, the instantaneous frequency of the same observed signal is estimated. We show that, using this technique, a lower mean square error is achieved for the estimated frequencies at high signal-to-noise ratios than with the adaptive line enhancer. The performance of adaptive lattice filters is then investigated for the second class of input signals, i.e., impulsive autoregressive processes with alpha-stable distributions.
The concept of alpha-stable distributions is first introduced. We show that the stochastic gradient algorithm, which performs well for finite-variance input signals (such as frequency modulated signals in noise), does not converge quickly for infinite-variance stable processes, because it relies on the minimum mean-square error criterion. To deal with such problems, the minimum dispersion criterion, fractional lower order moments, and recently developed algorithms for stable processes are introduced. We then study the possibility of using the lattice structure for impulsive stable processes. Accordingly, two new algorithms, the least-mean P-norm lattice algorithm and its normalized version, are proposed for lattice filters based on fractional lower order moments. Simulation results show that, with the proposed algorithms, faster convergence is achieved for parameter estimation of autoregressive stable processes with low to moderate degrees of impulsiveness than with many other algorithms. We also discuss the effect of the impulsiveness of stable processes on the misalignment between the estimated parameters and their true values. Because stable processes have infinite variance, the performance of the proposed algorithms is investigated using extensive computer simulations only.
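
As a rough illustration of the kind of update this abstract describes, the sketch below implements a gradient adaptive lattice predictor for real-valued signals. Setting p = 2 gives the usual least-mean-square lattice update, while 1 < p < 2 gives a least-mean p-norm style update based on fractional lower order moments. The function name, step size mu, stage count, and the clipping of the reflection coefficients are illustrative choices, not the thesis's notation; constant factors in the gradient are absorbed into mu.

```python
import numpy as np

def sgl_filter(x, num_stages=4, mu=0.01, p=2.0):
    """Stochastic gradient lattice predictor (sketch, real-valued input).

    p = 2: least-mean-square lattice update.
    1 < p < 2: least-mean p-norm style update (fractional lower order moments).
    """
    n = len(x)
    k = np.zeros(num_stages)            # reflection coefficients
    b_prev = np.zeros(num_stages + 1)   # backward errors delayed by one sample
    k_hist = np.zeros((n, num_stages))

    for t in range(n):
        f = np.zeros(num_stages + 1)
        b = np.zeros(num_stages + 1)
        f[0] = b[0] = x[t]
        for m in range(1, num_stages + 1):
            # lattice order-update recursions
            f[m] = f[m - 1] + k[m - 1] * b_prev[m - 1]
            b[m] = b_prev[m - 1] + k[m - 1] * f[m - 1]
            # stochastic gradient of |f_m|^p + |b_m|^p w.r.t. k_{m-1}
            gf = np.abs(f[m]) ** (p - 1) * np.sign(f[m])
            gb = np.abs(b[m]) ** (p - 1) * np.sign(b[m])
            k[m - 1] -= mu * (gf * b_prev[m - 1] + gb * f[m - 1])
            k[m - 1] = np.clip(k[m - 1], -0.999, 0.999)  # keep each stage stable
        b_prev = b.copy()
        k_hist[t] = k
    return k_hist

# example: track reflection coefficients for a white-noise input
trajectory = sgl_filter(np.random.default_rng(0).standard_normal(2000))
```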

Relevance:

10.00%

Publisher:

Abstract:

Physical infrastructure assets are important components of our society and our economy. They are usually designed to last for many years, are expected to be heavily used during their lifetime, carry considerable load, and are exposed to the natural environment. They are also normally major structures, and therefore represent a heavy investment, requiring constant management over their life cycle to ensure that they perform as required by their owners and users. Given a complex and varied infrastructure life cycle, constraints on available resources, and continuing requirements for effectiveness and efficiency, good management of infrastructure is important. While there is often no one best management approach, the choice of options is improved by better identification and analysis of the issues, by the ability to prioritise objectives, and by a scientific approach to the analysis process. The abilities to better understand the effect of inputs in the infrastructure life cycle on results, to minimise uncertainty, and to better evaluate the effect of decisions in a complex environment are important in allocating scarce resources and making sound decisions. Through the development of an infrastructure management modelling and analysis methodology, this thesis provides a process that assists the infrastructure manager in the analysis, prioritisation and decision-making process. This is achieved through the use of practical, relatively simple tools, integrated in a modular, flexible framework that aims to provide an understanding of the interactions and issues in the infrastructure management process. The methodology uses a combination of flowcharting and analysis techniques. It first charts the infrastructure management process and its underlying infrastructure life cycle through the time interaction diagram, a graphical flowcharting methodology that is an extension of methodologies for modelling data flows in information systems. This divides the infrastructure management process over time into self-contained modules, each based on a particular set of activities, with the information flows between them defined by their interfaces and relationships. The modular approach also permits more detailed analysis, or aggregation, as the case may be. It also forms the basis of extending the infrastructure modelling and analysis process to infrastructure networks, by using individual infrastructure assets and their related projects as the basis of the network analysis process. It is recognised that the infrastructure manager is required to meet, and balance, a number of different objectives, and therefore a number of high-level outcome goals for the infrastructure management process have been developed, based on common purpose or measurement scales. These goals form the basis of classifying the larger set of multiple objectives for analysis purposes. A two-stage approach that rationalises and then weights objectives, using a paired comparison process, ensures that the objectives required to be met are both kept to the minimum number required and fairly weighted. Qualitative variables are incorporated into the weighting and scoring process, with utility functions proposed where risk or a trade-off situation applies. Variability is considered important in the infrastructure life cycle; the approach used is based on analytical principles but incorporates randomness in variables where required.
The modular design of the process permits alternative processes to be used within particular modules, if this is considered a more appropriate way of analysis, provided that the boundary conditions and the requirements for linkages to other modules are met. Development and use of the methodology has highlighted a number of infrastructure life cycle issues, including data and information aspects and the consequences of change over the life cycle, as well as variability and the other matters discussed above. It has also highlighted the need to apply judgement where required, and for organisations that own and manage infrastructure to retain intellectual knowledge regarding that infrastructure. It is considered that the methodology discussed in this thesis, which to the author's knowledge has not been developed elsewhere, may be used for the analysis of alternatives, planning, prioritisation of a number of projects, and identification of the principal issues in the infrastructure life cycle.
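
The abstract mentions a paired comparison process for weighting objectives without spelling it out. Purely as an illustration of one common way such weights are derived, the sketch below normalises the geometric means of the rows of a reciprocal paired comparison matrix (an AHP-style calculation); the objectives and preference values are hypothetical and not taken from the thesis.

```python
import numpy as np

def paired_comparison_weights(labels, pref):
    """Derive objective weights from a reciprocal paired comparison matrix.

    pref[i][j] > 1 means objective i is preferred to objective j, and the
    matrix is assumed reciprocal (pref[j][i] = 1 / pref[i][j]). Weights are
    the normalised geometric means of the rows.
    """
    pref = np.asarray(pref, dtype=float)
    gm = np.prod(pref, axis=1) ** (1.0 / pref.shape[0])
    weights = gm / gm.sum()
    return dict(zip(labels, weights))

# illustrative objectives and preferences only
objectives = ["cost", "service level", "safety"]
pref = [[1,     3,   1/2],
        [1/3,   1,   1/4],
        [2,     4,   1  ]]
print(paired_comparison_weights(objectives, pref))
```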

Relevance:

10.00%

Publisher:

Abstract:

Fables of sovereignty / Wayne Hudson
Sovereignty discourse and practice : past and future / Joseph Camilleri
Guises of sovereignty / Gerry Simpson
Westphalian and Islamic concepts of sovereignty in the Middle East / Amin Saikal
Wither sovereignty in Southeast Asia today? / See Seng Tan
Ambivalent sovereignty : China and re-imagining the Westphalian ideal / Yongjin Zhang
Confronting terrorism : dilemmas of principle and practice regarding sovereignty / Brian L. Job
Sovereignty in the 21st century : security, immigration, and refugees / Howard Adelman
State sovereignty and international refugee protection / Robyn Lui
Do no harm : towards a Hippocratic standard for international civilisation / Neil Arya
Sovereignty and the global politics of the environment : beyond Westphalia? / Lorraine Elliott
Westphalian sovereignty in the shadow of international justice? a fresh coat of paint for a tainted concept / Jackson Nyamuya Maogoto
Development assistance and the hollow sovereignty of the weak / Roland Rich
Corruption and transparency in governance and development : reinventing sovereignty for promoting good governance / C. Raj Kumar
Re-envisioning economic sovereignty : developing countries and the International Monetary Fund / Ross P. Buckley
Trust, legitimacy, and the sharing of sovereignty / William Maley
Sovereignty as indirect rule / Barry Hindess
Indigenous sovereignty / Paul Keal
Civil society in a post-statist circumstance / Jan Aart Scholte

Relevance:

10.00%

Publisher:

Abstract:

Most statistical methods use hypothesis testing. Analysis of variance, regression, discrete choice models, contingency tables, and other analysis methods commonly used in transportation research share hypothesis testing as the means of making inferences about the population of interest. Although hypothesis testing has been a cornerstone of empirical research for many years, various aspects of hypothesis tests are commonly misapplied, misinterpreted, and ignored by novices and expert researchers alike. At first glance, hypothesis testing appears straightforward: develop the null and alternative hypotheses, compute the test statistic to compare to a standard distribution, estimate the probability of rejecting the null hypothesis, and then make claims about the importance of the finding. This is an oversimplification of the process of hypothesis testing. Hypothesis testing as applied in empirical research is examined here. The reader is assumed to have a basic knowledge of the role of hypothesis testing in various statistical methods. Through the use of an example, the mechanics of hypothesis testing are first reviewed. Then, five precautions surrounding the use and interpretation of hypothesis tests are developed; examples of each are provided to demonstrate how errors are made, and solutions are identified so similar errors can be avoided. Remedies are provided for common errors, and conclusions are drawn on how to use the results of this paper to improve the conduct of empirical research in transportation.
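
The paper's own worked example is not reproduced in the abstract. As a generic illustration of the mechanics it refers to (state the hypotheses, compute a test statistic, compare the p-value to a significance level chosen in advance), here is a small sketch with hypothetical travel-time data; the scenario, sample sizes, and numbers are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# hypothetical travel times (minutes) before and after a signal retiming
before = rng.normal(loc=22.0, scale=4.0, size=60)
after = rng.normal(loc=20.5, scale=4.0, size=60)

# H0: mean travel time unchanged; H1: means differ (two-sided Welch t-test)
t_stat, p_value = stats.ttest_ind(before, after, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# Reject H0 only if p < alpha chosen before looking at the data. A small
# p-value measures evidence against H0, not the size or practical
# importance of the effect.
alpha = 0.05
print("reject H0" if p_value < alpha else "fail to reject H0")
```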

Relevance:

10.00%

Publisher:

Abstract:

Assessment for Learning (AfL) practices observed in case studies in a North Queensland school were analysed from a sociocultural theoretical perspective. The AfL practices of feedback, dialogue and peer assessment were viewed as an opportunity for students to learn the social expectations about being an autonomous learner, or central participant, within the classroom community of practice. This process of becoming more expert and belonging within the community of practice involved students negotiating identities of participation that included knowing both academic skills and social expectations within the classroom. This paper argues that when AfL practices are viewed as ways of enhancing participation, there is potential for learners to negotiate identities as autonomous learners. AfL practices within the daily classroom interactions and pedagogy that enabled students to develop a shared repertoire, joint enterprise and mutual engagement in the classroom communities of practice are described. The challenges for teachers in shifting their gaze to patterns of participation are also briefly discussed.

Relevance:

10.00%

Publisher:

Abstract:

This paper puts forward a proposal for reviewing the role and purpose of standards in the context of national curriculum and assessment reform more generally. It seeks to commence the much-needed conversation about standards in the work of teachers as distinct from large-scale testing companies and the policy personnel responsible for reporting. Four key conditions that relate to the effective use of standards to measure improvement and support learning are analysed: clarity about purpose and function; understanding of the representation of standards; moderation practice; and the assessment community. The Queensland experience of the use of standards, teacher judgement and moderation is offered to identify what is educationally preferable in terms of their use and their relationships to curriculum, improvement and accountability. The article illustrates how these practices have recently been challenged by emerging political constraints related to the Australian Government’s implementation of national testing and national partnership funding arrangements tied to the performance of students at or below minimum standards.

Relevance:

10.00%

Publisher:

Abstract:

Bearing damage in modern inverter-fed AC drive systems is more common than in motors operating from a 50 or 60 Hz supply. Fast switching transients and the common mode voltage generated by a PWM inverter cause unwanted shaft voltage and resultant bearing currents. Parasitic capacitive coupling creates a path for discharge currents through the rotor and bearings. In order to analyze bearing current discharges and their effect on bearing damage under different conditions, calculation of the capacitive coupling between the outer and inner races is needed. During motor operation, changes in the distances between the balls and races alter the capacitance values. Because the thickness and spatial distribution of the lubricating grease change, this capacitance is not constant and is known to vary with speed and load. Thus, the resultant electric field between the races and balls varies with motor speed. The lubricating grease in the ball bearing cannot withstand high voltages, and electrical breakdown through the grease can occur. At low speeds, gravity may shift the balls and the shaft downwards, making the system (ball positions and shaft) asymmetric. In this study, two different asymmetric cases (asymmetric ball position, asymmetric shaft position) are analyzed and the results are compared with the symmetric case. The objective of this paper is to calculate the capacitive coupling and electric fields between the outer and inner races and the balls at different motor speeds in symmetrical and asymmetrical shaft and ball positions. The analysis is carried out using finite element simulations to determine the conditions which will increase the probability of high rates of bearing failure due to current discharges through the balls and races.
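
The paper itself uses finite element simulations. For orientation only, the sketch below shows a much cruder lumped estimate: each ball-to-race gap is treated as a parallel-plate capacitor across the grease film, the inner and outer gaps of a ball are combined in series, and the balls are combined in parallel. All geometry, permittivity, and film values are hypothetical placeholders, not results from the paper.

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def ball_race_capacitance(contact_area_m2, film_thickness_m, eps_r_grease=2.2):
    """Parallel-plate estimate of a single ball-to-race capacitance."""
    return eps_r_grease * EPS0 * contact_area_m2 / film_thickness_m

def bearing_capacitance(n_balls, c_ball_inner, c_ball_outer):
    """Inner and outer gaps of each ball are in series; the ball paths are in parallel."""
    c_series = 1.0 / (1.0 / c_ball_inner + 1.0 / c_ball_outer)
    return n_balls * c_series

# illustrative numbers only: ~0.5 mm^2 contact patch, sub-micron grease film
c_inner = ball_race_capacitance(0.5e-6, 0.5e-6)
c_outer = ball_race_capacitance(0.6e-6, 0.4e-6)
c_bearing = bearing_capacitance(8, c_inner, c_outer)
print(f"estimated bearing capacitance: {c_bearing * 1e12:.1f} pF")
```

A lumped model like this cannot capture the asymmetric ball and shaft positions studied in the paper; that is precisely what the finite element analysis is for.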

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: The presence of insects in stored grains is a significant problem for grain farmers, bulk grain handlers and distributors worldwide. Inspection of bulk grain commodities is essential to detect pests and therefore to reduce the risk of their presence in exported goods. It has been well documented that insect pests cluster in response to factors such as microclimatic conditions within bulk grain. Statistical sampling methodologies for grains, however, have typically considered pests and pathogens to be homogeneously distributed throughout grain commodities. In this paper we demonstrate a sampling methodology that accounts for the heterogeneous distribution of insects in bulk grains. RESULTS: We show that failure to account for the heterogeneous distribution of pests may lead to overestimates of the capacity for a sampling program to detect insects in bulk grains. Our results indicate the importance of the proportion of grain that is infested in addition to the density of pests within the infested grain. We also demonstrate that the probability of detecting pests in bulk grains increases as the number of sub-samples increases, even when the total volume or mass of grain sampled remains constant. CONCLUSION: This study demonstrates the importance of considering an appropriate biological model when developing sampling methodologies for insect pests. Accounting for a heterogeneous distribution of pests leads to a considerable improvement in the detection of pests over traditional sampling models.
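
The following toy calculation illustrates the two qualitative claims in the abstract; it is not the paper's statistical methodology. Assume a proportion theta of the grain is infested at Poisson density lam insects per unit volume, the rest is clean, and each sub-sample lands in infested grain independently with probability theta. The parameter values are invented.

```python
import numpy as np

def p_detect_heterogeneous(n_subsamples, total_volume, theta, lam):
    """P(detect at least one insect) when infestation is clustered.

    Each sub-sample of volume total_volume / n_subsamples lands in infested
    grain with probability theta and then contains >= 1 insect with
    probability 1 - exp(-lam * v) (Poisson counts within infested grain).
    """
    v = total_volume / n_subsamples
    p_one = theta * (1.0 - np.exp(-lam * v))
    return 1.0 - (1.0 - p_one) ** n_subsamples

def p_detect_homogeneous(total_volume, theta, lam):
    """Same overall mean density theta * lam, but spread evenly through the bulk."""
    return 1.0 - np.exp(-theta * lam * total_volume)

theta, lam, volume = 0.05, 2.0, 10.0  # illustrative values only
for n in (1, 5, 20, 100):
    print(n,
          round(p_detect_heterogeneous(n, volume, theta, lam), 3),
          round(p_detect_homogeneous(volume, theta, lam), 3))
```

With these numbers the homogeneous model reports a higher detection probability than the clustered model (an overestimate), and the clustered detection probability rises with the number of sub-samples even though the total sampled volume is fixed, mirroring the results described above.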