64 results for Wheeler, Ephraim.
Abstract:
Guest editorial
Abstract:
Logic-based models are thriving within artificial intelligence. A great number of new logics have been defined, and their theory investigated. Epistemic logics introduce modal operators for knowledge or belief; deontic logics are about norms, and introduce operators of deontic necessity and possibility (i.e., obligation or prohibition). Then there is a much-investigated class, temporal logics, to whose applications in engineering this special issue is devoted. This kind of formalism deserves wider recognition and application in engineering, a domain where other kinds of temporal models (e.g., Petri nets) are by now a fairly standard part of the modelling toolbox.
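To make the temporal operators concrete, here is a minimal sketch of checking two standard temporal-logic operators, "eventually" (F) and "globally" (G), over a finite execution trace. The trace encoding and property names (`req`, `ack`) are illustrative assumptions, not taken from any paper in this issue.

```python
# Hypothetical sketch: evaluating temporal-logic operators on a finite trace,
# where a trace is a list of states and a state is a dict of propositions.

def eventually(trace, prop):
    """F prop: prop holds at some state of the trace."""
    return any(prop(s) for s in trace)

def globally(trace, prop):
    """G prop: prop holds at every state of the trace."""
    return all(prop(s) for s in trace)

# Illustrative two-state trace: a request is raised, then acknowledged.
trace = [{"req": True, "ack": False}, {"req": True, "ack": True}]
print(eventually(trace, lambda s: s["ack"]))  # the request is eventually acknowledged
print(globally(trace, lambda s: s["req"]))    # the request stays asserted throughout
```

Model checkers for engineering applications evaluate such formulas over all behaviours of a system model rather than a single trace; the sketch only shows the operator semantics.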
Abstract:
The attachment of electronic components to printed circuit boards using solder material is a complex process. This paper presents a novel modeling methodology which integrates the governing physics taking place. Multiphysics modeling technology, embedded in the simulation tool PHYSICA, is used to simulate fluid flow, heat transfer, solidification, and stress evolution in an integrated manner. Results using this code are presented, detailing the mechanical response of two solder materials as they cool, solidify, and then deform. The shape that a solder joint takes upon melting is predicted using the SURFACE EVOLVER code. Details are given on how these predictions can be used in the PHYSICA code to provide a modeling route by which the shape, solidification history, and resulting stress profiles can be predicted.
Abstract:
Surface-tension-induced flow is implemented into a numerical modelling framework and validated for a number of test cases. Finite volume unstructured mesh techniques are used to discretize the mass, momentum, and energy conservation equations in three dimensions. An explicit approach is used to include the effect of surface tension forces on the flow profile and final shape of a liquid domain. Validation of this approach is performed against both analytical and experimental data. Finally, the method is used to model the wetting balance test for solder alloy material, where model predictions are used to gain a greater insight into this process. Copyright © 2000 John Wiley & Sons, Ltd.
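One common way to include surface tension explicitly in a volume-of-fluid setting is the continuum-surface-force (CSF) idea, where the force is F = sigma * kappa * grad(alpha) for a liquid volume fraction alpha. The 1-D sketch below is an illustrative assumption, not the paper's actual PHYSICA implementation; all function and variable names are hypothetical.

```python
import numpy as np

def csf_force_1d(alpha, sigma, dx):
    """Explicit surface-tension body force on a 1-D grid (CSF-style sketch).

    alpha : liquid volume fraction per cell (0 = gas, 1 = liquid)
    sigma : surface tension coefficient
    dx    : cell spacing
    """
    grad_a = np.gradient(alpha, dx)       # nonzero only near the interface
    n_hat = grad_a / (np.abs(grad_a) + 1e-12)  # unit normal (meaningful only near interface)
    kappa = -np.gradient(n_hat, dx)       # discrete curvature estimate, -div(n_hat)
    return sigma * kappa * grad_a         # force concentrated at the interface

# Sharp liquid/gas interface halfway along a 10-cell grid.
alpha = np.where(np.arange(10) < 5, 1.0, 0.0)
force = csf_force_1d(alpha, sigma=0.5, dx=0.1)
```

In an explicit scheme, such a force would be added as a source term to the momentum equation at the current time step, which is simple but imposes a stability limit on the time step.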
Abstract:
This paper describes modelling technology and its use in providing data governing the assembly of flip-chip components. Details are given on the reflow and curing stages as well as the prediction of solder joint shapes. The reflow process involves the attachment of a die to a board via solder joints. After the reflow process, underfill material is placed between the die and the substrate, where it is heated and cured. Upon cooling, the thermal mismatch between the die, underfill, solder bumps, and substrate will result in a nonuniform deformation profile across the assembly, and hence stress. Shape predictions, followed by thermal, solidification, and stress predictions, are undertaken for solder joints during the reflow process. Both thermal and stress calculations are undertaken to predict phenomena occurring during the curing of the underfill material. These stresses may result in delamination between the underfill and its surrounding materials, leading to a subsequent reduction in component performance and lifetime. Comparisons between simulations and experiments for die curvature are given for the reflow and curing processes.
Abstract:
Sometimes, technological solutions to practical problems are devised that conspicuously take into account the constraints to which a given culture subjects the particular task or the manner in which it is carried out. The culture may be a professional culture (e.g., the practice of law), or an ethnic-cum-professional culture (e.g., dance in given ethnic cultures from South-East Asia), or, again, a denominational culture prescribing an orthopraxy that impinges on everyday life through, for example, prescribed abstinence from given categories of workday activities, or dietary laws. Massimo Negrotti's Theory of the artificial is a convenient framework for discussing some of these techniques. We discuss a few examples, but focus on the contrast of two that are taken from the same cultural background, namely, technological applications in compliance with Jewish Law orthopraxy.
• Soya-, mycoprotein-, or otherwise derived meat surrogates are an example of a naturoid: they emulate the flavours and olfactory properties, as well as the texture and the outer and inner appearance, of the meat product (its kind, cut, form) they set out to emulate (including amenability to cooking in the usual manner for the model), while satisfying cultural dietary prohibitions.
• In contrast, the Sabbath Notebook, a writing surrogate we describe in this paper, is a technoid: it emulates a technique (writing to store alphanumeric information), while satisfying the prohibition of writing at particular times of the liturgical calendar (the Sabbath and the major holidays).
Abstract:
Editorial
Abstract:
In judicial decision making, the doctrine of chances explicitly takes the odds into account. There is more to forensic statistics, as well as various probabilistic approaches, which taken together form the object of an enduring controversy in the scholarship of legal evidence. In this paper, we reconsider the circumstances of the Jama murder and inquiry (dealt with in Part I of this paper: "The Jama Model. On Legal Narratives and Interpretation Patterns") to illustrate yet another kind of probability or improbability. What is improbable about the Jama story is actually a given, which contributes in terms of dramatic underlining. In literary theory, concepts of narratives being probable or improbable date back to the eighteenth century, when both prescientific and scientific probability was infiltrating several domains, including law. An understanding of such a backdrop throughout the history of ideas is, I claim, necessary for AI researchers who may be tempted to apply statistical methods to legal evidence. The debate for or against probability (and especially Bayesian probability) in accounts of evidence has been flourishing among legal scholars. Nowadays both the Bayesians (e.g., Peter Tillers) and the Bayesioskeptics (e.g., Ron Allen) among those legal scholars who are involved in the controversy are willing to give AI research a chance to prove itself and strive towards models of plausibility that would go beyond probability as narrowly meant. This debate within law has illustrious precedents: Voltaire was critical of the application of probability even to litigation in civil cases, whereas Boole was a starry-eyed believer in probability applications to judicial decision making (Rosoni 1995). Not unlike Boole, a founding father of computing, computer scientists approaching the field nowadays may happen to do so without full awareness of the pitfalls. Hence the usefulness of the conceptual landscape I sketch here.
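The odds-based reasoning that the doctrine of chances appeals to can be sketched in the standard Bayesian odds form: posterior odds = prior odds × likelihood ratio. The numbers below are purely illustrative assumptions, not drawn from the Jama case or any real proceeding; the sketch only shows the arithmetic the Bayesians and Bayesioskeptics are arguing about.

```python
# Hedged sketch of Bayesian updating in odds form, with illustrative numbers.

def posterior_odds(prior_odds, likelihood_ratio):
    """Posterior odds = prior odds * likelihood ratio (Bayes' theorem in odds form)."""
    return prior_odds * likelihood_ratio

def odds_to_probability(odds):
    """Convert odds o : 1 into a probability o / (1 + o)."""
    return odds / (1.0 + odds)

# Illustrative: prior odds of guilt 1:4 (0.25); a piece of evidence is judged
# 10 times more likely under guilt than under innocence (likelihood ratio 10).
post = posterior_odds(0.25, 10.0)       # 2.5, i.e. odds of 2.5 : 1
p = odds_to_probability(post)           # 2.5 / 3.5, roughly 0.71
```

The Bayesioskeptic objection is precisely that neither the prior odds nor the likelihood ratio can be meaningfully quantified for narrative evidence, which is why models of plausibility beyond narrow probability are sought.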