7 results for STATISTICAL-METHODS

in Greenwich Academic Literature Archive - UK


Relevance:

60.00%

Publisher:

Abstract:

In judicial decision making, the doctrine of chances explicitly takes the odds into account. There is more to forensic statistics, as well as various probabilistic approaches, which taken together form the object of an enduring controversy in the scholarship of legal evidence. In this paper, we reconsider the circumstances of the Jama murder and inquiry (dealt with in Part I of this paper: "The Jama Model. On Legal Narratives and Interpretation Patterns") to illustrate yet another kind of probability or improbability. What is improbable about the Jama story is actually a given, which contributes in terms of dramatic underlining. In literary theory, concepts of narratives being probable or improbable date back to the eighteenth century, when both prescientific and scientific probability were infiltrating several domains, including law. An understanding of such a backdrop throughout the history of ideas is, I claim, necessary for AI researchers who may be tempted to apply statistical methods to legal evidence. The debate for or against probability (and especially Bayesian probability) in accounts of evidence has been flourishing among legal scholars. Nowadays both the Bayesians (e.g. Peter Tillers) and the Bayesio-skeptics (e.g. Ron Allen) among those legal scholars who are involved in the controversy are willing to give AI research a chance to prove itself and strive towards models of plausibility that would go beyond probability as narrowly meant. This debate within law, in turn, has illustrious precedents: take Voltaire, who was critical of the application of probability even to litigation in civil cases; take Boole, who was a starry-eyed believer in probability applications to judicial decision making (Rosoni 1995). Not unlike Boole, the founding father of computing, computer scientists approaching the field nowadays may happen to do so without full awareness of the pitfalls. Hence the usefulness of the conceptual landscape I sketch here.
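
The update at the heart of this debate can be stated compactly in the odds form of Bayes' rule: posterior odds equal the likelihood ratio times the prior odds. The sketch below is purely illustrative; the prior odds and likelihood ratio are invented numbers, not figures from the Jama case or from any of the scholars cited.

```python
def update_odds(prior_odds: float, likelihood_ratio: float) -> float:
    """Odds form of Bayes' rule: posterior odds = likelihood ratio * prior odds."""
    return likelihood_ratio * prior_odds

def odds_to_probability(odds: float) -> float:
    """Convert odds p / (1 - p) back into a probability p."""
    return odds / (1.0 + odds)

# Invented numbers for illustration only: prior odds of guilt of 1:1000,
# and an item of evidence 500 times more likely under guilt than innocence.
posterior_odds = update_odds(prior_odds=1 / 1000, likelihood_ratio=500.0)
print(f"posterior odds:        {posterior_odds:.3f}")                        # 0.500
print(f"posterior probability: {odds_to_probability(posterior_odds):.3f}")   # 0.333
```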

Relevance:

60.00%

Publisher:

Abstract:

In judicial decision making, the doctrine of chances takes explicitly into account the odds. There is more to forensic statistics, as well as various probabilistic approaches, which taken together form the object of an enduring controversy in the scholarship of legal evidence. In this paper, I reconsider the circumstances of the Jama murder and inquiry (dealt with in Part I of this paper: 'The JAMA Model and Narrative Interpretation Patterns'), to illustrate yet another kind of probability or improbability. What is improbable about the Jama story is actually a given, which contributes in terms of dramatic underlining. In literary theory, concepts of narratives being probable or improbable date back to the eighteenth century, when both prescientific and scientific probability were infiltrating several domains, including law. An understanding of such a backdrop throughout the history of ideas is, I claim, necessary for Artificial Intelligence (AI) researchers who may be tempted to apply statistical methods to legal evidence. The debate for or against probability (and especially Bayesian probability) in accounts of evidence has been flourishing among legal scholars; nowadays both the Bayesians (e.g. Peter Tillers) and the Bayesio-skeptics (e.g. Ron Allen), among those legal scholars who are involved in the controversy, are willing to give AI research a chance to prove itself and strive towards models of plausibility that would go beyond probability as narrowly meant. This debate within law, in turn, has illustrious precedents: take Voltaire, who was critical of the application of probability even to litigation in civil cases; take Boole, who was a starry-eyed believer in probability applications to judicial decision making. Not unlike Boole, the founding father of computing, computer scientists approaching the field nowadays may happen to do so without full awareness of the pitfalls. Hence the usefulness of the conceptual landscape I sketch here.

Relevance:

60.00%

Publisher:

Abstract:

Reliability of electronic parts is a major concern for many manufacturers, since early failures in the field can cost an enormous amount to repair, in many cases far more than the original cost of the product. A great deal of effort is expended by manufacturers to determine the failure rates for a process, or the fraction of parts that will fail in a period of time. It is widely recognized that the traditional approach to reliability prediction for electronic systems is not suitable for today's products. This approach, based on statistical methods only, does not address the physics governing the failure mechanisms in electronic systems. This paper discusses virtual prototyping technologies which can predict the physics taking place and relate this to the appropriate failure mechanisms. Simulation results illustrate the effect of temperature on the assembly process of an electronic package and on the lifetime of a flip-chip package.
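
To make the contrast concrete, a physics-of-failure calculation ties lifetime to the loading rather than to a tabulated failure rate. The paper does not name its failure models, so the sketch below uses the standard Coffin-Manson relation for thermal-cycling fatigue as a stand-in; the constants are illustrative assumptions, not fitted values.

```python
# Physics-of-failure sketch: estimate cycles to failure from the thermal
# cycle range via the Coffin-Manson relation N_f = C * dT**(-n).
# C and n below are illustrative assumptions, not values from the paper.

def coffin_manson_cycles(delta_T: float, C: float = 1.0e8, n: float = 2.0) -> float:
    """Cycles to failure for a thermal cycle of range delta_T (kelvin)."""
    return C * delta_T ** (-n)

for delta_T in (40.0, 60.0, 80.0):
    print(f"dT = {delta_T:4.0f} K -> ~{coffin_manson_cycles(delta_T):8.0f} cycles to failure")
```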

Relevance:

60.00%

Publisher:

Abstract:

A computational modelling approach, integrated with optimisation and statistical methods, that can aid the development of reliable and robust electronic packages and systems is presented. The design for reliability methodology is demonstrated for the design of a System-in-Package (SiP) structure. In this study the focus is on the procedure for representing the uncertainties in the package design parameters, their impact on the reliability and robustness of the package design, and how these can be included in the design optimisation modelling framework. The analysis of the thermo-mechanical behaviour of the package is conducted using non-linear transient finite element simulations. Key system responses of interest, the fatigue lifetime of the lead-free solder interconnects and the warpage of the package, are predicted and used subsequently for design purposes. The design tasks are to identify optimal SiP designs by varying several package input parameters so that the reliability and the robustness of the package are improved while, at the same time, specified performance criteria are also satisfied.
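
One common way to carry out the uncertainty-representation step described above is Monte Carlo propagation: sample the uncertain design parameters, evaluate the responses, and summarise reliability and robustness statistics. The sketch below follows that pattern under loud assumptions: a toy closed-form surrogate stands in for the non-linear transient finite element simulations, and all parameter names, distributions, and coefficients are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000

# Assumed uncertain design parameters (mean, std), both in millimetres.
joint_height = rng.normal(0.50, 0.02, N)    # solder joint stand-off height
die_thickness = rng.normal(0.30, 0.01, N)   # die thickness

# Toy surrogates for the two key responses named in the abstract; a real
# study would evaluate the finite element model (or a ROM of it) here.
fatigue_life = 2000.0 * joint_height / die_thickness   # cycles (illustrative)
warpage = 80.0 * die_thickness / joint_height          # micrometres (illustrative)

# Robustness summary: means, spreads, and the chance of violating a limit.
print(f"fatigue life: mean {fatigue_life.mean():.0f} cycles, std {fatigue_life.std():.0f}")
print(f"warpage:      mean {warpage.mean():.1f} um, std {warpage.std():.1f}")
print(f"P(warpage > 52 um) = {(warpage > 52.0).mean():.3f}")
```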

Relevance:

60.00%

Publisher:

Abstract:

A design methodology based on numerical modelling, integrated with optimisation techniques and statistical methods, to aid the process control of micro- and nano-electronics based manufacturing processes is presented in this paper. The design methodology is demonstrated for a micro-machining process called Focused Ion Beam (FIB). This process has been modelled to help understand how a pre-defined geometry of micro- and nano-structures can be achieved using this technology. The process performance is characterised on the basis of Reduced Order Models (ROMs), which are generated using results from a mathematical model of the Focused Ion Beam and Design of Experiments (DoE) methods. Two ion beam sources, Argon and Gallium ions, have been used to compare and quantify the process variable uncertainties that can be observed during the milling process. The evaluations of the process performance take into account the uncertainties and variations of the process variables and are used to identify their impact on the reliability and quality of the fabricated structure. An optimisation-based design task is to identify the optimal process conditions, by varying the process variables, so that certain quality objectives and requirements are achieved and the imposed constraints are satisfied. The software tools used and developed to demonstrate the design methodology are also presented.
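
The ROM-plus-DoE step can be pictured as fitting a cheap response surface to a small designed set of runs of the expensive process model. In the sketch below, a toy function stands in for the mathematical FIB model, the design is a simple full factorial, and the variable names and units are assumptions made for illustration.

```python
import numpy as np

def process_model(current, dwell):
    """Toy stand-in for the mathematical FIB model: milled depth (nm)."""
    return 1.2 * current * dwell + 0.05 * current ** 2

# Full-factorial DoE over two assumed process variables.
currents = np.array([10.0, 20.0, 30.0])   # beam current, pA (assumed)
dwells = np.array([1.0, 2.0, 3.0])        # dwell time, us (assumed)
C, D = np.meshgrid(currents, dwells)
c, d = C.ravel(), D.ravel()
depth = process_model(c, d)

# Quadratic response surface as the ROM:
# depth ~ b0 + b1*c + b2*d + b3*c^2 + b4*d^2 + b5*c*d
X = np.column_stack([np.ones_like(c), c, d, c ** 2, d ** 2, c * d])
beta, *_ = np.linalg.lstsq(X, depth, rcond=None)

# The fitted ROM is then cheap to evaluate at unseen process conditions.
x_new = np.array([1.0, 15.0, 2.5, 15.0 ** 2, 2.5 ** 2, 15.0 * 2.5])
print(f"ROM depth at (15 pA, 2.5 us): {x_new @ beta:.2f} nm "
      f"(direct model: {process_model(15.0, 2.5):.2f} nm)")
```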

Relevance:

60.00%

Publisher:

Abstract:

This paper presents a design methodology based on numerical modelling, integrated with optimisation techniques and statistical methods, to aid the development of new advanced technologies in the area of micro and nano systems. The design methodology is demonstrated for a micro-machining process called Focused Ion Beam (FIB). This process has been modelled to provide knowledge of how a pre-defined geometry can be achieved through this direct milling. The geometry characterisation is obtained using a Reduced Order Model (ROM), generated from the results of a mathematical model of the Focused Ion Beam and Design of Experiments (DoE) methods. In this work, the focus is on the design flow methodology, which includes an approach for including process parameter uncertainties in the process optimisation modelling framework. A discussion of the impact of the process parameters, and their variations, on the quality and performance of the fabricated structure is also presented. The design task is to identify the optimal process conditions, by altering the process parameters, so that the required reliability and confidence of the application are achieved and the imposed constraints are satisfied. The software tools used and developed to demonstrate the design methodology are also presented.
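
One way to fold the process parameter uncertainties into the optimisation step is a robust formulation: choose the settings whose mean response hits the target while the spread induced by parameter scatter stays small. The sketch below implements that idea over an assumed toy ROM, with invented scatter levels, target, and bounds; it illustrates the formulation, not the paper's actual framework.

```python
import numpy as np
from scipy.optimize import minimize

def rom_depth(current, dwell):
    """Assumed toy reduced order model of milled depth (nm)."""
    return 1.2 * current * dwell + 0.05 * current ** 2

# Fixed relative perturbations (common random numbers keep the objective
# smooth and deterministic for the gradient-based optimiser).
rng = np.random.default_rng(1)
eps_c = rng.normal(0.0, 0.02, 500)   # assumed 2% scatter on beam current
eps_d = rng.normal(0.0, 0.02, 500)   # assumed 2% scatter on dwell time

def robust_objective(x, target=60.0, weight=1.0):
    """Penalise deviation of the mean from the target plus response variance."""
    current, dwell = x
    depths = rom_depth(current * (1.0 + eps_c), dwell * (1.0 + eps_d))
    return (depths.mean() - target) ** 2 + weight * depths.var()

result = minimize(robust_objective, x0=[15.0, 2.0],
                  bounds=[(5.0, 30.0), (0.5, 4.0)], method="L-BFGS-B")
print("robust optimal (current, dwell):", result.x)
```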

Relevance:

30.00%

Publisher:

Abstract:

This paper looks at the application of some assessment methods in practice, with a view to enhancing students' learning in mathematics and statistics. It explores the effective application of assessment methods, highlights the issues or problems related to some of the common methods of assessing mathematical and statistical learning, and describes ways of avoiding them. Some observations made by the author on good assessment practice, and useful approaches employed at his institution in designing and applying assessment methods, are discussed. Successful strategies for implementing assessment methods at different levels are described.