948 results for Random matrix theory


Relevance: 20.00%

Abstract:

Fusion techniques have received considerable attention for achieving performance improvement with biometrics. While a multi-sample fusion architecture reduces false rejects, it also increases false accepts, and the impact on performance also depends on the nature of subsequent attempts, i.e., random or adaptive. Expressions for error rates are presented and experimentally evaluated in this work by considering the multi-sample fusion architecture for text-dependent speaker verification using HMM-based digit-dependent speaker models. Analysis incorporating correlation modeling demonstrates that the use of adaptive samples improves overall fusion performance compared to randomly repeated samples. For a text-dependent speaker verification system using digit strings, sequential decision fusion of seven instances with three random samples is shown to reduce the overall error of the verification system by 26%, which can be further reduced by 6% for adaptive samples. This analysis, novel in its treatment of random and adaptive multiple presentations within a sequential fused decision architecture, is also applicable to other biometric modalities such as fingerprints and handwriting samples.
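As an illustration of the trade-off the abstract quantifies, here is a minimal sketch of multi-sample fusion error rates under an independence assumption (the paper's analysis additionally models correlation between attempts, which this toy omits):

```python
# Toy model of multi-sample decision fusion: a claimant is accepted if
# ANY of k attempts passes. Under the independence assumption:
#   fused FRR = frr ** k            (all k genuine attempts must fail)
#   fused FAR = 1 - (1 - far) ** k  (a single impostor success suffices)

def fused_error_rates(frr: float, far: float, k: int) -> tuple[float, float]:
    """Fused false-reject and false-accept rates for k independent attempts."""
    return frr ** k, 1.0 - (1.0 - far) ** k

# Example: per-attempt FRR = 5%, FAR = 1%, three attempts allowed.
frr3, far3 = fused_error_rates(0.05, 0.01, 3)
```

Allowing repeated attempts drives the fused FRR down geometrically while the fused FAR grows roughly linearly in k, which is why the nature of the repeated samples (random vs. adaptive) matters.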

Relevance: 20.00%

Abstract:

An extended theory of planned behavior (TPB) was used to predict young people's intentions to donate money to charities in the future. Students (N = 210; 18-24 years) completed a questionnaire assessing their attitude, subjective norm, perceived behavioral control (PBC), moral obligation, past behavior, and intentions toward donating money. Regression analyses revealed that the extended TPB explained 61% of the variance in intentions to donate money. Attitude, PBC, moral norm, and past behavior predicted intentions, representing future targets for charitable giving interventions.

Relevance: 20.00%

Abstract:

Unmanned Aircraft Systems (UAS) describe a diverse range of aircraft that are operated without a human pilot on board. Unmanned aircraft range from small rotorcraft, which can fit in the palm of your hand, through to fixed-wing aircraft comparable in size to a commercial passenger jet. The absence of a pilot on board allows these aircraft to be developed with unique performance capabilities, facilitating a wide range of applications in surveillance, environmental management, agriculture, defence, and search and rescue. However, regulations relating to the safe design and operation of UAS first need to be developed before the many potential benefits from these applications can be realised. According to the International Civil Aviation Organization (ICAO), a Risk Management Process (RMP) should support all civil aviation policy and rulemaking activities (ICAO, 2009). The RMP is described in the international standard ISO 31000:2009 (ISO, 2009a). This standard is intentionally generic and high-level, providing limited guidance on how it can be effectively applied to complex socio-technical decision problems such as the development of regulations for UAS. Through the application of principles and tools drawn from systems philosophy and systems engineering, this thesis explores how the RMP can be effectively applied to support the development of safety regulations for UAS. A sound systems-theoretic foundation for the RMP is presented. Using the case-study scenario of a UAS operation over an inhabited area, and through the novel application of principles drawn from general systems modelling philosophy, a consolidated framework of the definitions of the concepts of safe, risk and hazard is developed.
The framework is novel in that it facilitates the representation of broader subjective factors in an assessment of the safety of a system; describes the issues associated with the specification of a system boundary; makes explicit the hierarchical nature of the relationship between the concepts and the subsequent constraints that exist between them; and can be evaluated using a range of analytic or deliberative modelling techniques. Following the general sequence of the RMP, the thesis then explores the issues associated with the quantified specification of safety criteria for UAS. A novel risk analysis tool is presented. In contrast to existing risk tools, the analysis tool presented in this thesis quantifiably characterises both the societal and individual risk of UAS operations as a function of the flight path of the aircraft. A novel structuring of the risk evaluation and risk treatment decision processes is then proposed. The structuring is achieved through the application of the Decision Support Problem Technique, a modelling approach that has previously been used to model complex engineering design processes and to support decision-making in relation to airspace design. The final contribution made by this thesis is the development of an airworthiness regulatory framework for civil UAS. A novel "airworthiness certification matrix" is proposed as a basis for the definition of UAS "Part 21" regulations. The resulting airworthiness certification matrix provides a flexible, systematic and justifiable method for promulgating airworthiness regulations for UAS. In addition, an approach for deriving "Part 1309" regulations for UAS is presented. In contrast to existing approaches, the approach presented in this thesis facilitates a traceable and objective tailoring of system-level reliability requirements across the diverse range of UAS operations.
The significance of the research contained in this thesis is clearly demonstrated by its practical, real-world outcomes. Industry regulatory development groups and the Civil Aviation Safety Authority have endorsed the proposed airworthiness certification matrix. The risk models have also been used to support research undertaken by the Australian Department of Defence. Ultimately, it is hoped that the outcomes from this research will play a significant part in shaping regulations for civil UAS, in Australia and around the world.
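To illustrate the kind of quantified ground-risk characterisation such a tool builds on, here is a crude first-order casualty-expectation sketch; the model and every parameter are invented stand-ins, far simpler than the thesis's flight-path-dependent analysis:

```python
# First-order ground-risk sketch: expected annual casualties under a UAS
# overflight, computed as crashes per year times people exposed per crash.
# All parameter values below are illustrative only.

def expected_casualties(failures_per_hour: float, hours_per_year: float,
                        pop_density_per_m2: float, lethal_area_m2: float,
                        shelter_factor: float) -> float:
    """Expected ground casualties per year for a single operating area."""
    crashes_per_year = failures_per_hour * hours_per_year
    exposed_per_crash = pop_density_per_m2 * lethal_area_m2 * (1.0 - shelter_factor)
    return crashes_per_year * exposed_per_crash

# 1e-4 failures/flight-hour, 1000 flight hours/year, 0.001 people/m^2,
# 10 m^2 lethal area, 50% of people sheltered.
risk = expected_casualties(1e-4, 1000.0, 0.001, 10.0, 0.5)
```

A flight-path-dependent model, as the thesis describes, would evaluate terms like these along the trajectory rather than for a single homogeneous area.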

Relevance: 20.00%

Abstract:

The Poisson distribution has often been used for count data such as accident counts, and the Negative Binomial (NB) distribution has been adopted to handle the over-dispersion problem. However, Poisson and NB distributions are incapable of accounting for unobserved heterogeneity arising from spatial and temporal effects in accident data. To overcome this problem, Random Effect models have been developed. A further challenge with existing traffic accident prediction models is the presence of excess zero accident observations in some accident data. Although the Zero-Inflated Poisson (ZIP) model is capable of handling the dual-state system in accident data with excess zero observations, it does not accommodate the within-location and between-location correlation heterogeneities that are the basic motivation for Random Effect models. This paper proposes an effective way of fitting a ZIP model with location-specific random effects, and Bayesian analysis is recommended for model calibration and assessment.
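A minimal sketch of the zero-inflated Poisson likelihood at the heart of such a model (pure-Python illustration; the paper's formulation adds location-specific random effects and Bayesian calibration, which this omits):

```python
import math

def zip_pmf(y: int, lam: float, pi: float) -> float:
    """Zero-Inflated Poisson: a point mass pi at zero mixed with Poisson(lam)."""
    poisson = math.exp(-lam) * lam ** y / math.factorial(y)
    return pi + (1.0 - pi) * poisson if y == 0 else (1.0 - pi) * poisson

def zip_loglik(counts, lam, pi) -> float:
    """Log-likelihood of observed counts under ZIP(lam, pi)."""
    return sum(math.log(zip_pmf(y, lam, pi)) for y in counts)

# Accident counts with excess zeros: a ZIP fit beats a plain Poisson (pi = 0)
# whose rate is set to the sample mean.
counts = [0, 0, 0, 0, 0, 0, 0, 1, 2, 3]
ll_zip = zip_loglik(counts, lam=1.5, pi=0.5)
ll_poisson = zip_loglik(counts, lam=0.6, pi=0.0)  # lam = sample mean
```

The dual-state interpretation is visible in the pmf: a "safe" state contributing only zeros with probability pi, and an "unsafe" Poisson state generating the remaining counts.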

Relevance: 20.00%

Abstract:

Tissue-specific extracellular matrix (ECM) is known to be an ideal bioscaffold to inspire the future of regenerative medicine. It holds the secret of how nature has developed such an organization of molecules into a unique functional complexity. This work exploited an innovative image processing algorithm and high-resolution microscopy, combined with mechanical analysis, to establish a correlation between the gradient organization of cartilaginous ECM and its anisotropic biomechanical response. This correlation was hypothesized to be a reliable determinant of how microarchitecture interrelates with biomechanical properties. Hough-Radon transforms of ECM cross-section images revealed the conformational variation from the tangential interface down to the subchondral region. As the orientation varied layer by layer, the anisotropic mechanical response varied accordingly. Although the results were in good agreement (Kendall's tau-b > 90%), there was evidence suggesting that the alignment of the fibrous network, specifically in the middle zone, is not as random as previously thought.
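For illustration, a toy orientation estimate using a gradient structure tensor — a deliberately simpler stand-in for the Hough-Radon analysis the abstract describes, run on a synthetic stripe image rather than cartilage data:

```python
import numpy as np

def dominant_gradient_angle(img: np.ndarray) -> float:
    """Dominant gradient orientation (degrees from the x-axis) via the
    image structure tensor. Fibres run perpendicular to this direction."""
    gy, gx = np.gradient(img.astype(float))
    jxx, jyy, jxy = (gx * gx).sum(), (gy * gy).sum(), (gx * gy).sum()
    return 0.5 * np.degrees(np.arctan2(2.0 * jxy, jxx - jyy))

# Synthetic "fibres": horizontal stripes, so gradients point vertically
# (90 degrees) and the fibre direction is the perpendicular (0 degrees).
img = np.zeros((64, 64))
img[::4, :] = 1.0
angle = dominant_gradient_angle(img)
```

Applied slice by slice through an ECM cross-section, an orientation estimator of this kind is what lets layer-by-layer conformational variation be quantified.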

Relevance: 20.00%

Abstract:

Communication between cultures that do not share similar norms, values, beliefs, experiences, attitudes and practices has long proven to be a difficult exercise (Balsmeier & Heck, 1994). These difficulties can have serious consequences when the miscommunication happens in the justice system: the innocent can be convicted and witnesses undermined. Much work has been carried out on the need for better communication in the courtroom (Eades, 1993; Lauchs, 2010; Supreme Court of Queensland, 2010; Supreme Court of Western Australia, 2008) but far less on language and interactions between police and Indigenous Australians (Powell, 2000). It is ethically necessary that officers of the law be made aware of linguistic issues to ensure they conduct their investigations in a fair and effective manner. Despite years of awareness-raising, clashes between police and Indigenous peoples remain prevalent (Heath, 2012; Remeikis, 2012). This paper attempts to explain the reason for this discrepancy and, in doing so, suggests some solutions to the problem. It draws on cultural schema theory to determine whether cultural differences in language could be negatively affecting communication between Aboriginal people and the police of South East Queensland. Findings from this research are significant in determining whether miscommunication is adding to the already unequal standing of Aboriginal people within the criminal justice system and aggravating the already volatile relationship between Aboriginal people and police.

Relevance: 20.00%

Abstract:

Designing practical rules for controlling invasive species is a challenging task for managers, particularly when species are long-lived, have complex life cycles and have high dispersal capacities. Previous findings derived from plant matrix population analyses suggest that effective control of long-lived invaders may be achieved by focusing on killing adult plants. However, the cost-effectiveness of managing different life stages has not been evaluated. We illustrate the benefits of integrating matrix population models with decision theory to undertake this evaluation, using empirical data from the largest infestation of mesquite (Leguminosae: Prosopis spp.) within Australia. We include in our model the mesquite life cycle, different dispersal rates, and control actions that target individuals at different life stages with varying costs, depending on the intensity of control effort. We then use stochastic dynamic programming to derive cost-effective control strategies that minimize the cost of controlling the core infestation locally below a density threshold and the future cost of control arising from infestation of adjacent areas via seed dispersal. Through sensitivity analysis, we show that four robust management rules guide the allocation of resources between mesquite life stages for this infestation: (i) when there is no seed dispersal, no action is required until the density of adults exceeds the control threshold, and then only control of adults is needed; (ii) when there is seed dispersal, the control strategy depends only on knowledge of the density of adults and large juveniles (LJ) and on broad categories of dispersal rates; (iii) if the density of adults is higher than the density of LJ, controlling adults is most cost-effective; (iv) alternatively, if the density of LJ is equal to or higher than the density of adults, management efforts should be spread between adults, large juveniles and, to a lesser extent, small juveniles, but never saplings.
Synthesis and applications. In this study, we show that simple rules can be found for managing invasive plants with complex life cycles and high dispersal rates when population models are combined with decision theory. In the case of our mesquite population, focussing effort on controlling adults is not always the most cost-effective way to meet our management objective.
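As an illustration of the stochastic-dynamic-programming approach, a toy value-iteration sketch; the state space, costs, and transition probabilities are all invented and far simpler than the paper's stage-structured mesquite model:

```python
# Toy stochastic dynamic program: adult-density classes 0..4, where damage
# accrues once density exceeds a threshold (classes 3 and 4). The action
# "control" (fixed cost) knocks density back to class 0; otherwise density
# grows one class with probability GROW_P per year. All numbers illustrative.

GROW_P = 0.8
CONTROL_COST = 3.0
DAMAGE = [0.0, 0.0, 0.0, 10.0, 20.0]
HORIZON = 20

def optimal_policy():
    """Finite-horizon value iteration; returns 1 (control) or 0 (wait) per state."""
    n = len(DAMAGE)
    values = [0.0] * n
    policy = [0] * n
    for _ in range(HORIZON):
        new_values, policy = [], [0] * n
        for s in range(n):
            grown = min(s + 1, n - 1)
            wait = DAMAGE[s] + GROW_P * values[grown] + (1 - GROW_P) * values[s]
            act = DAMAGE[s] + CONTROL_COST + values[0]
            if act < wait:
                policy[s] = 1
            new_values.append(min(wait, act))
        values = new_values
    return policy

policy = optimal_policy()
```

Even this toy recovers a density-threshold rule of the kind the paper derives: wait at low densities, act once density (and imminent damage) is high enough.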

Relevance: 20.00%

Abstract:

Concerns regarding students' learning and reasoning in chemistry classrooms are well documented. Students' reasoning in chemistry should be characterized by conscious consideration of chemical phenomena from laboratory work at macroscopic, molecular/sub-micro and symbolic levels. Further, students should develop metacognition in relation to such ways of reasoning about chemistry phenomena. Classroom change eliciting metacognitive experiences and metacognitive reflection is necessary to shift entrenched views of teaching and learning in students. In this study, Activity Theory is used as the framework for interpreting changes to the rules/customs and tools of the activity systems of two different classes of students taught by the same teacher, Frances, who was teaching chemical equilibrium to those classes in consecutive years. An interpretive methodology involving multiple data sources was employed. Frances explicitly changed her pedagogy in the second year to direct students' attention toward considering chemical phenomena at the molecular/sub-micro level. Additionally, she asked students not to use the textbook until toward the end of the equilibrium unit and sought to engage them in using their prior knowledge of chemistry to understand their observations from experiments. Frances' changed pedagogy elicited metacognitive experiences and reflection in students and challenged them to reconsider their metacognitive beliefs about learning chemistry and how it might be achieved. While teacher change is essential for science education reform, students are not passive players in the change efforts and they need to be convinced of the viability of teacher pedagogical change in the context of their goals, intentions, and beliefs.

Relevance: 20.00%

Abstract:

We present a formalism for the analysis of sensitivity of nuclear magnetic resonance pulse sequences to variations of pulse sequence parameters, such as radiofrequency pulses, gradient pulses or evolution delays. The formalism enables the calculation of compact, analytic expressions for the derivatives of the density matrix and the observed signal with respect to the parameters varied. The analysis is based on two constructs computed in the course of modified density-matrix simulations: the error interrogation operators and error commutators. The approach presented is consequently named the Error Commutator Formalism (ECF). It is used to evaluate the sensitivity of the density matrix to parameter variation based on the simulations carried out for the ideal parameters, obviating the need for finite-difference calculations of signal errors. The ECF analysis therefore carries a computational cost comparable to a single density-matrix or product-operator simulation. Its application is illustrated using a number of examples from basic NMR spectroscopy. We show that the strength of the ECF is its ability to provide analytic insights into the propagation of errors through pulse sequences and the behaviour of signal errors under phase cycling. Furthermore, the approach is algorithmic and easily amenable to implementation in the form of a programming code. It is envisaged that it could be incorporated into standard NMR product-operator simulation packages.
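To make the idea concrete, a toy sketch (not the paper's ECF itself) of replacing a finite-difference sensitivity with an analytic commutator, for a density matrix evolving under a single pulse parameter:

```python
import numpy as np

# For rho(theta) = U rho0 U†, with U = exp(-i * theta * H), the sensitivity
# of the density matrix to the parameter theta is the commutator
#     d rho / d theta = -i [H, rho(theta)],
# so one analytic evaluation replaces a finite-difference calculation.

def evolve(rho0: np.ndarray, H: np.ndarray, theta: float) -> np.ndarray:
    """Propagate rho0 under U = exp(-i * theta * H) for Hermitian H."""
    w, V = np.linalg.eigh(H)                      # diagonalize H once
    U = V @ np.diag(np.exp(-1j * theta * w)) @ V.conj().T
    return U @ rho0 @ U.conj().T

H = np.array([[0.0, 1.0], [1.0, 0.0]])            # toy Hamiltonian (Pauli-x)
rho0 = np.array([[1.0, 0.0], [0.0, 0.0]])         # initial state |0><0|
theta = 0.3

rho = evolve(rho0, H, theta)
analytic = -1j * (H @ rho - rho @ H)              # commutator sensitivity

eps = 1e-6                                        # finite-difference check
numeric = (evolve(rho0, H, theta + eps) - evolve(rho0, H, theta - eps)) / (2 * eps)
```

The analytic route costs one extra matrix commutator on top of the ideal-parameter simulation, which is the computational advantage the abstract highlights over finite differencing.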

Relevance: 20.00%

Abstract:

This book sensitizes the reader to the fact that there is substantial disagreement within the academic community, and among policymakers and the general public, over what behaviors, conditions (e.g., physical attributes), and people should be designated as deviant or criminal. Normative conceptions, the societal reaction/labeling approach, and the critical approach are offered as frameworks within which to study these definitions. A comprehensive explanation of theory and social policy on deviance is constructed.

Relevance: 20.00%

Abstract:

Advancing Critical Criminology constitutes a timely addition to the growing body of knowledge on critical criminology scholarship. DeKeseredy and Perry have assembled a volume that provides scholars with an in-depth review of the extant literature on several major branches of criminology as well as examples of how critical criminologists apply their theoretical perspectives to substantive topics, such as drugs, interpersonal violence, and rural crime. Accordingly, this work is divided into two main sections: overviews of theories and applications. Each chapter provides a summary of work in a specific area, along with suggestions for moving the field forward. This reader is unique in its choice of topics, which have often been overlooked in the past. An expert collection of international scholars, Advancing Critical Criminology is certain to stimulate lively debates and generate further critical social scientific work in this field.

Relevance: 20.00%

Abstract:

Left realists contend that people lacking legitimate means of solving the problem of relative deprivation may come into contact with other frustrated disenfranchised people and form subcultures, which in turn, encourage criminal behaviors. Absent from this theory is an attempt to address how, today, subcultural development in North America and elsewhere is heavily shaped simultaneously by the recent destructive consequences of right-wing Friedman or Chicago School economic policies and marginalized men's attempts to live up to the principles of hegemonic masculinity. The purpose of this paper, then, is to offer a new left realist theory that emphasizes the contribution of these two key determinants.

Relevance: 20.00%

Abstract:

"There once was a man who aspired to be the author of the general theory of holes. When asked ‘What kind of hole—holes dug by children in the sand for amusement, holes dug by gardeners to plant lettuce seedlings, tank traps, holes made by road makers?’ he would reply indignantly that he wished for a general theory that would explain all of these. He rejected ab initio the—as he saw it—pathetically common-sense view that of the digging of different kinds of holes there are quite different kinds of explanations to be given; why then he would ask do we have the concept of a hole? Lacking the explanations to which he originally aspired, he then fell to discovering statistically significant correlations; he found for example that there is a correlation between the aggregate hole-digging achievement of a society as measured, or at least one day to be measured, by econometric techniques, and its degree of techno- logical development. The United States surpasses both Paraguay and Upper Volta in hole-digging; there are more holes in Vietnam than there were. These observations, he would always insist, were neutral and value-free. This man’s achievement has passed totally unnoticed except by me. Had he however turned his talents to political science, had he concerned himself not with holes, but with modernization, urbanization or violence, I find it difficult to believe that he might not have achieved high office in the APSA." (MacIntyre 1971, 260)