891 results for Problem of evil
Abstract:
The simple quasi-steady analysis of the combustion of a liquid fuel droplet in an oxidising atmosphere provides unsatisfactory explanations for several experimental observations. Its predictions of the burning constant ($K$), the flame-to-droplet diameter ratio and the flame temperature ($T_f$) have been found to be ambiguous, if not completely inaccurate. A critical survey of the literature has led us to a detailed examination of the effects of unsteadiness and variable properties. The work published to date indicates that the gas-phase unsteadiness is relatively short-lived and therefore quite insignificant. A new theoretical analysis based on heat transfer within the droplet is presented here. It shows that the condensed-phase unsteadiness lasts for about 20–25% of the total burning time. It is concluded that the discrepancies between experimental observations and the predictions of the constant-property quasi-steady analysis cannot be attributed either to gas-phase or to condensed-phase unsteadiness. An analytical model of quasi-steady droplet combustion with variable thermodynamic and transport properties and non-unity Lewis numbers is then examined. The findings reveal a significant improvement in the prediction of combustion parameters, particularly of $K$, when consideration is given to the variations of $c_p$ and $\lambda$ with temperature and with the concentrations of the several species. $T_f$ is accurately predicted when the required conditions of incomplete combustion or low ( ) at the flame are met. Further refinement through realistic Lewis numbers yields meaningful predictions of the flame-to-droplet diameter ratio.
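For orientation, the burning constant $K$ above is the slope in the classical d²-law of quasi-steady droplet combustion; the following minimal statement of that law is standard in the droplet-combustion literature and is not taken from this abstract:

% The d^2-law: the droplet diameter squared decreases linearly in
% time, with slope K, the burning (evaporation) constant.
\[
  d^2(t) = d_0^2 - K t,
  \qquad
  K = \frac{8\lambda_g}{\rho_l c_{p,g}} \ln(1 + B),
\]
% d_0: initial droplet diameter; \rho_l: liquid density; \lambda_g and
% c_{p,g}: gas-phase thermal conductivity and specific heat; B: the
% Spalding transfer number. Variable-property analyses of the kind
% described above amount to choosing better reference values for
% \lambda_g and c_{p,g}.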
Abstract:
Non-government actors such as think-tanks are playing an important role in Australian policy work. As governments increasingly outsource policy work previously done by education departments and academics to these new policy actors, more think-tanks have emerged, representing a wide range of political views and ideological positions. This paper looks at the emergence of the Grattan Institute as one significant player in Australian education policy, with a particular emphasis on Grattan’s report ‘Turning around low-performing schools’. Grattan exemplifies many of the facets of Barber’s ‘deliverology’, as it produces reports designed to be easily digested and simply actioned, and to provide reassurance that there is an answer, often by focusing on ‘what works’ recipes. ‘Turning around low-performing schools’ is a perfect example of this deliverology. However, a close analysis of the Report suggests that it contains four major problems which seriously limit its usefulness for schools and policymakers: it ignores data that may be more important in explaining the turn-around of schools; it is overly reliant on NAPLAN data; there are reasons to be suspicious about the evidence assembled; and, finally, it falls into a classic trap of logic, the post hoc fallacy.
Abstract:
Evidence-based policy is a means of ensuring that policy is informed by more than ideology or expedience. However, what constitutes robust evidence is highly contested. In this paper, we argue that policy must draw on both quantitative and qualitative data. We do this in relation to a long-entrenched problem in Australian early childhood education and care (ECEC) workforce policy. A critical shortage of qualified staff threatens the attainment of broader child and family policy objectives linked to the provision of ECEC, and it has not been successfully addressed by initiatives to date. We establish some of the limitations of existing quantitative data sets and consider the potential of qualitative studies to inform ECEC workforce policy. The adoption of both quantitative and qualitative methods is needed to illuminate the complex nature of the work undertaken by early childhood educators, as well as the environmental factors that sustain job satisfaction in a demanding and poorly understood working environment.
Abstract:
A Finite Element Method (FEM) based forward solver is developed for the forward problem of 2D electrical impedance tomography (EIT). The method of weighted residuals with a Galerkin approach is used for the FEM formulation of the EIT forward problem. The algorithm is written in MATLAB 7.0, and the forward problem is studied with a practical biological phantom. The EIT governing equation is solved numerically to calculate the surface potentials at the phantom boundary for a uniform conductivity. An EIT phantom is developed with an array of 16 electrodes placed on the inner surface of a phantom tank filled with KCl solution. A sinusoidal current is injected through the current electrodes, and the differential potentials across the voltage electrodes are measured. The measured data are compared with the differential potentials calculated for the known current and solution conductivity, and from this comparison an attempt is made to identify the sources of error, so as to improve data quality for better image reconstruction.
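For context, the governing equation referred to in this abstract is the standard low-frequency EIT model, a generalized Laplace equation; the form below is the textbook statement rather than a quotation from the paper:

% Low-frequency EIT forward model on the phantom domain \Omega:
% \sigma is the conductivity distribution, \phi the electric potential.
\[
  \nabla \cdot \big( \sigma \nabla \phi \big) = 0 \quad \text{in } \Omega,
  \qquad
  \sigma \frac{\partial \phi}{\partial n} = J \quad \text{on } \partial\Omega,
\]
% J is the current density injected through the boundary electrodes.
% The forward problem computes the boundary potentials for a known
% \sigma; the Galerkin FEM discretizes the weak form of this problem.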
Abstract:
According to certain arguments, computation is observer-relative either in the sense that many physical systems implement many computations (Hilary Putnam), or in the sense that almost all physical systems implement all computations (John Searle). If sound, these arguments have a potentially devastating consequence for the computational theory of mind: if arbitrary physical systems can be seen to implement arbitrary computations, the notion of computation seems to lose all explanatory power as far as brains and minds are concerned. David Chalmers and B. Jack Copeland have attempted to counter these relativist arguments by placing certain constraints on the definition of implementation. In this thesis, I examine their proposals and find both wanting in some respects. In the course of this examination, I give a formal definition of the class of combinatorial-state automata, upon which Chalmers's account of implementation is based. I show that this definition implies two theorems (one an observation due to Curtis Brown) concerning the computational power of combinatorial-state automata, theorems which speak against founding the theory of implementation upon this formalism. Toward the end of the thesis, I sketch a definition of the implementation of Turing machines in dynamical systems, and offer this as an alternative to Chalmers's and Copeland's accounts of implementation. I demonstrate that the definition does not imply Searle's claim of the universal implementation of computations. However, the definition may support claims that are weaker than Searle's, yet still troubling to the computationalist. There remains a kernel of relativity in implementation in any case, since the interpretation of physical systems seems itself to be an observer-relative matter, at least to some degree. This observation helps clarify the role the notion of computation can play in cognitive science. Specifically, I argue that the notion should be conceived as an instrumental rather than as a fundamental or foundational one.
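As rough orientation for readers unfamiliar with the formalism, a combinatorial-state automaton in Chalmers's sense can be sketched as follows; this is a paraphrase of the standard definition, not the thesis's own formal statement:

% A combinatorial-state automaton (CSA) is a finite-state automaton
% whose total states are vectors of substates rather than monadic states.
\[
  S_t = [s^1_t, s^2_t, \ldots, s^n_t],
  \qquad
  s^i_{t+1} = f^i\big(s^1_t, \ldots, s^n_t, I_t\big),
\]
% Each next substate s^i_{t+1} is fixed by a transition rule f^i applied
% to the current substate vector and the input I_t. Implementation then
% requires a physical system with parts whose states map onto the
% substates so that the causal structure mirrors the transition rules.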
Abstract:
Ingarden (1962, 1964) postulates that artworks exist in an “objective purely intentional” way. According to this view, objectivity and subjectivity are opposed forms of existence, parallel to the opposition between realism and idealism. Using arguments from cognitive science, experimental psychology, and semiotics, this lecture proposes that, particularly in aesthetic phenomena, realism and idealism are not pure opposites; rather, they are aspects of a single process of cognition operating in different strata. Furthermore, the concept of realism can be conceived as an empirical extreme of idealism, and the concept of idealism as a pre-operative extreme of realism. Both kinds of systems of knowledge are mutually associated by a synecdoche, performing major tasks of mental ordering and categorisation. This contribution suggests that the supposed opposition between objectivity and subjectivity raises, first of all, a problem of translatability rather than a problem of existential categories. Synecdoche seems to be a very basic transaction of the mind, establishing ontologies (in the more Ingardenian sense of the term). Wegrzecki (1994, 220) defines ontology as “the central domain of philosophy to which other its parts directly or indirectly refer”. Thus, ontology operates within philosophy as synecdoche does within language, pointing the sense of the general into the particular and/or vice versa. The many affinities and similarities between different sign systems, like those found across the interrelationships of the arts, are embedded in a transversal, synecdochic intersemiosis. An important question, from this view, is whether Ingarden's pure objectivities rest basically on the impossibility of translation, and are therefore absolute self-referential constructions. In such a case, it would be impossible to translate pure intentionality into something else, such as acts or products.
Abstract:
We still know little of why strategy processes often involve participation problems. In this paper, we argue that this crucial issue is linked to fundamental assumptions about the nature of strategy work. Hence, we need to examine how strategy processes are typically made sense of and what roles are assigned to specific organizational members. For this purpose, we adopt a critical discursive perspective that allows us to discover how specific conceptions of strategy work are reproduced and legitimized in organizational strategizing. Our empirical analysis is based on an extensive research project on strategy work in 12 organizations. As a result of our analysis, we identify three central discourses that seem to be systematically associated with nonparticipatory approaches to strategy work: “mystification,” “disciplining,” and “technologization.” However, we also distinguish three strategy discourses that promote participation: “self-actualization,” “dialogization,” and “concretization.” Our analysis shows that strategy as practice involves alternative and even competing discourses that have fundamentally different kinds of implications for participation in strategy work. We argue from a critical perspective that it is important to be aware of the inherent problems associated with dominant discourses as well as to actively advance the use of alternative ones.
Abstract:
Tanner graph representation of linear block codes is widely used by iterative decoding algorithms for recovering data transmitted across a noisy communication channel from the errors and erasures introduced by the channel. The stopping distance of a Tanner graph T for a binary linear block code C determines the number of erasures correctable using iterative decoding on T when data is transmitted across a binary erasure channel using the code C. We show that the problem of finding the stopping distance of a Tanner graph is hard to approximate within any positive constant approximation ratio in polynomial time unless $\mathrm{P} = \mathrm{NP}$. It is also shown, as a consequence, that there can be no approximation algorithm for the problem achieving an approximation ratio of $2^{(\log n)^{1-\epsilon}}$ for any $\epsilon > 0$ unless $\mathrm{NP} \subseteq \mathrm{DTIME}(n^{\mathrm{poly}(\log n)})$.
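For reference, the stopping distance figuring in this result has the following standard definition (a textbook statement, not quoted from the paper):

% A stopping set S is a subset of the variable nodes of a Tanner graph T
% such that every check node adjacent to S has at least two neighbours
% in S. The stopping distance s(T) is the size of the smallest nonempty
% stopping set:
\[
  s(T) = \min \{\, |S| : S \neq \emptyset, \ S \text{ is a stopping set of } T \,\}.
\]
% On the binary erasure channel, iterative decoding on T recovers an
% erasure pattern precisely when the erased positions contain no
% nonempty stopping set.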
Abstract:
An explicit representation of an analytical solution to the problem of the decay of a plane shock wave of arbitrary strength is proposed. The solution satisfies the basic equations exactly; the approximation lies in the (approximate) satisfaction of two of the Rankine-Hugoniot conditions. The error incurred is shown to be very small even for strong shocks. The solution analyses the interaction of a shock of arbitrary strength with a centred simple wave overtaking it, and describes the complete history of decay with remarkable accuracy even for strong shocks. For a weak shock, the limiting law of motion obtained from the solution is shown to be in complete agreement with the Friedrichs theory. The propagation law of the non-uniform shock wave is determined, and the equations for the shock and particle paths in the (x, t)-plane are obtained. The analytic solution presented here is uniformly valid for the entire flow field behind the decaying shock wave.
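For context, the Rankine-Hugoniot conditions mentioned above are the usual jump conditions across a shock; they are stated here in textbook form, not in the paper's notation:

% Conservation of mass, momentum and energy across a shock moving with
% speed U; subscripts 1 and 2 denote the states ahead of and behind the
% shock, u is particle velocity, p pressure, \rho density, h specific
% enthalpy.
\begin{align*}
  \rho_1 (u_1 - U) &= \rho_2 (u_2 - U), \\
  p_1 + \rho_1 (u_1 - U)^2 &= p_2 + \rho_2 (u_2 - U)^2, \\
  h_1 + \tfrac{1}{2} (u_1 - U)^2 &= h_2 + \tfrac{1}{2} (u_2 - U)^2.
\end{align*}
% The analytical solution described above satisfies two of these
% conditions only approximately.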
Abstract:
In this thesis the current status and some open problems of noncommutative quantum field theory are reviewed. The introduction aims to put these theories in their proper context as a part of the larger program to model the properties of quantized space-time. Throughout the thesis, special focus is put on the role of noncommutative time and how its nonlocal nature presents us with problems. Applications in scalar field theories as well as in gauge field theories are presented. The infinite nonlocality of space-time introduced by the noncommutative coordinate operators leads to interesting structure and new physics. High energy and low energy scales are mixed, causality and unitarity are threatened and in gauge theory the tools for model building are drastically reduced. As a case study in noncommutative gauge theory, the Dirac quantization condition of magnetic monopoles is examined with the conclusion that, at least in perturbation theory, it cannot be fulfilled in noncommutative space.
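For orientation, the noncommutative coordinate operators and the Dirac quantization condition discussed in the thesis take the following standard forms; the conventions (canonical noncommutativity, natural units) are assumptions of this sketch:

% Canonical noncommutative space-time: the coordinate operators fail to
% commute, with \theta^{\mu\nu} a constant antisymmetric matrix. Taking
% \theta^{0i} \neq 0 is what introduces noncommutative time.
\[
  [\hat{x}^{\mu}, \hat{x}^{\nu}] = i \theta^{\mu\nu}.
\]
% Dirac quantization condition for electric charge e and magnetic
% charge g in natural units (\hbar = c = 1):
\[
  e g = 2\pi n, \qquad n \in \mathbb{Z}.
\]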
Abstract:
Habitat destruction and hunting for dissection specimens have taken their toll. But there may be other, subtler factors causing the loss of amphibian populations.
Abstract:
A general direct technique for solving a mixed boundary value problem in the theory of diffraction by a semi-infinite plane is presented. Taking account of the correct edge conditions, the unique solution of the problem is derived, by means of Jones's method in the theory of the Wiener-Hopf technique, for the case of an incident plane wave. The solution of the half-plane problem is obtained in exact form. (The far field is derived by the method of steepest descent.) It is observed that it is not the Wiener-Hopf technique itself that needs modification; rather, a new technique is certainly required to handle the peculiar type of coupled integral equations to which the Wiener-Hopf technique leads.
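As background, the Wiener-Hopf technique referred to here rests on factorizing a kernel function in a strip of the complex plane; the following is a generic statement of the method, not the paper's specific kernel:

% Generic Wiener-Hopf functional equation, holding in a strip
% \tau_- < \mathrm{Im}\,\alpha < \tau_+ of the complex \alpha-plane:
\[
  K(\alpha) \Phi_{+}(\alpha) + \Psi_{-}(\alpha) = F(\alpha),
\]
% \Phi_+ is analytic in an upper half-plane, \Psi_- in a lower one.
% One factorizes K(\alpha) = K_+(\alpha) K_-(\alpha), with K_+ analytic
% and nonzero in the upper half-plane and K_- in the lower, rearranges
% so that each side of the equation is analytic in one half-plane, and
% invokes Liouville's theorem to identify both sides with an entire
% function (typically a polynomial fixed by the edge conditions).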