927 results for Precautionary Principle
Abstract:
The Standard Model of particle physics consists of quantum electrodynamics (QED) and the weak and strong nuclear interactions. QED is the basis for molecular properties, and thus it defines much of the world we see. The weak nuclear interaction is responsible for decays of nuclei, among other things, and in principle it should also have effects at the molecular scale. The strong nuclear interaction is hidden in interactions inside nuclei. From high-energy and atomic experiments it is known that the weak interaction does not conserve parity. Consequently, the weak interaction, and specifically the exchange of the Z^0 boson between a nucleon and an electron, induces small energy shifts of opposite sign for mirror-image molecules. This in turn makes one enantiomer of a molecule energetically more favorable than the other, and also shifts the spectral lines of the mirror-image pair of molecules in opposite directions, creating a splitting. Parity violation (PV) in molecules, however, has not been observed. The topic of this thesis is how the weak interaction affects certain molecular magnetic properties, namely certain parameters of nuclear magnetic resonance (NMR) and electron spin resonance (ESR) spectroscopies. The thesis consists of numerical estimates of NMR and ESR spectral parameters and investigations of how different aspects of quantum chemical computation affect them. PV contributions to the NMR shielding and spin-spin coupling constants are investigated from the computational point of view. All aspects of the quantum chemical electronic structure computations are found to be very important, which makes accurate computations challenging. Effects of molecular geometry are also investigated using a model system of polysilylene chains. The PV contribution to the NMR shielding constant is found to saturate once the chain reaches a certain length, but the effects of local geometry can be large. Rigorous vibrational averaging is also performed for a relatively small and rigid molecule. Vibrational corrections to the PV contribution are found to be only a couple of per cent. PV contributions to the ESR g-tensor are also evaluated using a series of molecules. Unfortunately, all the estimates are below the experimental limits, but PV in some of the heavier molecules comes close to present-day experimental resolution.
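As a hedged illustration of the splitting mechanism described above (generic NMR notation, not taken from the thesis): if the parity-violating term enters the nuclear shielding of the two enantiomers with opposite sign, the resonance frequencies of the mirror-image pair separate by an amount proportional to that term,

\[
\sigma_{R/S} = \sigma_0 \pm \sigma_{\mathrm{PV}}, \qquad
\nu_{R/S} = \frac{\gamma B_0}{2\pi}\left(1 - \sigma_{R/S}\right), \qquad
\Delta\nu = \nu_S - \nu_R = \frac{\gamma B_0}{\pi}\,\sigma_{\mathrm{PV}},
\]

so observing the effect requires resolving a relative frequency difference of about 2|sigma_PV| between the two enantiomers.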
Abstract:
Atomic layer deposition (ALD) is a method for thin film deposition which has been extensively studied for binary oxide thin film growth. Studies on multicomponent oxide growth by ALD remain relatively few owing to the increased number of factors that come into play when more than one metal is employed. More metal precursors are required, and the surface may change significantly during successive stages of the growth. Multicomponent oxide thin films can be prepared in a well-controlled way as long as the same principle that makes binary oxide ALD work so well is followed for each constituent element: in short, the film growth has to be self-limiting. ALD of various multicomponent oxides was studied. SrTiO3, BaTiO3, Ba(1-x)SrxTiO3 (BST), SrTa2O6, Bi4Ti3O12, BiTaO4 and SrBi2Ta2O9 (SBT) thin films were prepared, many of them for the first time by ALD. Chemistries of the binary oxides are shown to influence the processing of their multicomponent counterparts. The compatibility of precursor volatilities, thermal stabilities and reactivities is essential for multicomponent oxide ALD, but it should be noted that the main reactive species, the growing film itself, must also be compatible with self-limiting growth chemistry. In the cases of BaO and Bi2O3 the growth of the binary oxide was very difficult, but the presence of Ti or Ta in the growing film made self-limiting growth possible. The application of the deposited films as dielectric and ferroelectric materials was studied. Post-deposition annealing treatments in different atmospheres were used to achieve the desired crystalline phase or, more generally, to improve electrical properties. Electrode materials strongly influenced the leakage current densities in the prepared metal-insulator-metal (MIM) capacitors. Film permittivities above 100 and leakage current densities below 1×10^-7 A/cm2 were achieved with several of the materials.
Abstract:
Purpose – This paper aims to go beyond a bookkeeping approach to evolutionary analysis, whereby surviving firms are deemed better adapted and extinct firms less adapted. From a discussion of the preliminary findings of research into the Hobart pizza industry, evidence is presented of the need to adopt a more traditional approach to applying evolutionary theories within organizational research. Design/methodology/approach – After a brief review of the relevant literature, the preliminary findings of research into the Hobart pizza industry are presented. Then, several evolutionary concepts that are commonplace in ecological research are introduced to help explain the emergent findings. The paper concludes with consideration given to advancing a more consistent approach to employing evolutionary theories within organizational research. Findings – The paper finds that the process of selection cannot be assumed to occur evenly across time and/or space. Within geographically small markets, different forms of selection operate in different ways and to different degrees, requiring the use of more traditional evolutionary theories to highlight the causal processes associated with population change. Research limitations/implications – The paper concludes by highlighting Geoffrey Hodgson’s Principle of Consistency. It is demonstrated that a failure to truly understand how and why a theory is used in one domain will likely result in its misuse in another domain. At present, too few evolutionary concepts are employed in organisational research to ensure an appreciation of the underlying causal processes through which social change occurs. Originality/value – The concepts introduced throughout this paper, whilst not new, provide new entry points for organizational researchers intent on employing an evolutionary approach to understand the process of social change.
Abstract:
In the case of pipe trifurcation, previous observations report negative energy losses in the centre branch. This is anomalous, because by the principle of conservation of energy there should not be any negative energy loss. Earlier investigators have suggested, without experimental evidence, that this may be due to the non-inclusion of the kinetic energy correction coefficient (α) in the computation of energy losses. In the present work, energy loss coefficients have been evaluated from experimentally determined velocity profiles. It has been found that, with α included in the computation of energy loss, there is no negative energy loss in the centre branch.
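For reference, a minimal sketch of the standard definitions assumed here (generic symbols, not necessarily the paper's notation): the kinetic energy correction coefficient and the head loss between the inlet (1) and a branch (2) of the trifurcation are

\[
\alpha = \frac{1}{A\,\bar{V}^{3}}\int_{A} u^{3}\,\mathrm{d}A, \qquad
h_{L} = \left(\frac{p_{1}}{\rho g} + \alpha_{1}\frac{\bar{V}_{1}^{2}}{2g} + z_{1}\right)
      - \left(\frac{p_{2}}{\rho g} + \alpha_{2}\frac{\bar{V}_{2}^{2}}{2g} + z_{2}\right),
\]

so assuming α = 1 for a strongly non-uniform velocity profile misrepresents the kinetic energy terms and can make the computed h_L come out negative.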
Abstract:
This paper explores how whiteness scholarship can support deep engagement with both historical and contemporary forms of whiteness and racism in early childhood education. To this point, the uptake of whiteness scholarship in the field of early childhood has focused predominantly on autobiographical narratives. These narratives recount white educators’ stories of ‘becoming aware’ of or ‘unmasking’ their whiteness. In colonising contexts including Australia, New Zealand and Canada, understanding how whiteness operates in different ways, and what this means for educational research and practice, can support researchers and educators to identify and describe more fully the impacts of subtle forms of racism in their everyday practices. In this paper, whiteness is explored in a broader sense: as a form of property; as an organising principle for institutional behaviours and practices; and as a fluid identity or subject position. These three intersecting elements of whiteness are drawn on to analyse data from a doctoral study about embedding Aboriginal and Torres Strait Islander perspectives in early childhood education curricula in two Australian urban childcare settings. Analysis is focused on how whiteness operated within the research site and research processes, along with the actions, inaction and talk of two educators engaged in embedding work. Findings show that both the researcher and the educators reinforced, rather than reduced, the impacts of whiteness and racism, despite the best of intentions.
Abstract:
In this Thesis, we develop theory and methods for computational data analysis. The problems in data analysis are approached from three perspectives: statistical learning theory, the Bayesian framework, and the information-theoretic minimum description length (MDL) principle. Contributions in statistical learning theory address the possibility of generalization to unseen cases, and regression analysis with partially observed data, with an application to mobile device positioning. In the second part of the Thesis, we discuss so-called Bayesian network classifiers, and show that they are closely related to logistic regression models. In the final part, we apply the MDL principle to tracing the history of old manuscripts, and to noise reduction in digital signals.
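As a hedged illustration of the kind of relationship meant (the standard naive Bayes case, not necessarily the thesis's exact argument): for a binary class Y and discrete features x = (x_1, ..., x_d), the naive Bayes posterior log-odds are

\[
\log\frac{P(Y=1\mid x)}{P(Y=0\mid x)}
= \log\frac{P(Y=1)}{P(Y=0)}
+ \sum_{i=1}^{d}\log\frac{P(x_i\mid Y=1)}{P(x_i\mid Y=0)},
\]

which is linear in indicator features of the x_i, so the class posterior takes the logistic regression form P(Y=1 | x) = 1/(1 + exp(-(w^T φ(x) + b))).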
Abstract:
This paper presents a method of designing a minimax filter in the presence of large plant uncertainties and constraints on the mean squared values of the estimates. The minimax filtering problem is reformulated in the framework of a deterministic optimal control problem, and the method of solution invokes the matrix Minimum Principle. The constrained linear filter and its relation to singular control problems are illustrated. For the class of problems considered here it is shown that the filter can be constrained separately after carrying out the minimaximization. Numerical examples are presented to illustrate the results.
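As a hedged sketch of the problem class only (generic notation; the paper's exact formulation may differ), a constrained minimax filtering problem can be written as

\[
\min_{F}\;\max_{\theta\in\Theta}\;
\mathbb{E}_{\theta}\!\left[(x-\hat{x}_{F})^{\top}(x-\hat{x}_{F})\right]
\quad\text{subject to}\quad
\mathbb{E}\!\left[\hat{x}_{F}^{\top} W\,\hat{x}_{F}\right]\le c,
\]

where Θ is the set of admissible plant parameters, F is the filter to be designed, and W and c specify the mean-square constraint on the estimates.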
Abstract:
Minimum Description Length (MDL) is an information-theoretic principle that can be used for model selection and other statistical inference tasks. There are various ways to use the principle in practice. One theoretically valid way is to use the normalized maximum likelihood (NML) criterion. Due to computational difficulties, this approach has not been used very often. This thesis presents efficient floating-point algorithms that make it possible to compute the NML for multinomial, Naive Bayes and Bayesian forest models. None of the presented algorithms rely on asymptotic analysis, and for the first two model classes we also discuss how to compute exact rational-number solutions.
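As a purely illustrative sketch (function names are ours, and this is the brute-force definition rather than the efficient algorithms of the thesis), the multinomial NML normalizer can be evaluated directly for tiny n and K by summing the maximized likelihoods over all sequences, grouped by their count vectors:

from math import comb, log2, prod

def multinomial_complexity(K, n):
    """Sum of maximized likelihoods over all length-n sequences with K categories."""
    def compositions(parts, remaining):
        # all K-tuples of non-negative counts summing to n
        if parts == 1:
            yield (remaining,)
            return
        for first in range(remaining + 1):
            for rest in compositions(parts - 1, remaining - first):
                yield (first,) + rest
    total = 0.0
    for counts in compositions(K, n):
        # number of sequences with these counts (multinomial coefficient)
        seqs, left = 1, n
        for c in counts:
            seqs *= comb(left, c)
            left -= c
        # maximized likelihood of any one such sequence
        ml = prod((c / n) ** c for c in counts if c > 0)
        total += seqs * ml
    return total

def stochastic_complexity(counts):
    """NML code length (in bits) of data with the given category counts."""
    n, K = sum(counts), len(counts)
    max_loglik = sum(c * log2(c / n) for c in counts if c > 0)
    return -max_loglik + log2(multinomial_complexity(K, n))

print(stochastic_complexity((3, 1, 1)))  # e.g. K = 3 categories, n = 5 observations

The brute-force enumeration grows exponentially with n and K; it is useful only as a correctness check for small cases, which is exactly the cost that the efficient algorithms are designed to avoid.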
Abstract:
The Minimum Description Length (MDL) principle is a general, well-founded theoretical formalization of statistical modeling. The most important notion of MDL is the stochastic complexity, which can be interpreted as the shortest description length of a given sample of data relative to a model class. The exact definition of the stochastic complexity has gone through several evolutionary steps. The latest instantiation is based on the so-called Normalized Maximum Likelihood (NML) distribution, which has been shown to possess several important theoretical properties. However, applications of this modern version of MDL have been quite rare because of computational complexity problems: for discrete data, the definition of NML involves an exponential sum, and in the case of continuous data, a multi-dimensional integral that is usually infeasible to evaluate or even approximate accurately. In this doctoral dissertation, we present mathematical techniques for computing NML efficiently for some model families involving discrete data. We also show how these techniques can be used to apply MDL in two practical applications: histogram density estimation and clustering of multi-dimensional data.
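For reference, a standard statement of the quantities discussed above (generic notation): given a model class M = {P(· | θ)} and data x^n, the NML distribution and the corresponding stochastic complexity are

\[
P_{\mathrm{NML}}(x^{n}) = \frac{P\!\left(x^{n}\mid\hat{\theta}(x^{n})\right)}
{\sum_{y^{n}} P\!\left(y^{n}\mid\hat{\theta}(y^{n})\right)},
\qquad
\mathrm{SC}(x^{n}) = -\log P\!\left(x^{n}\mid\hat{\theta}(x^{n})\right)
+ \log \sum_{y^{n}} P\!\left(y^{n}\mid\hat{\theta}(y^{n})\right),
\]

where θ̂ denotes the maximum likelihood estimator; for continuous data the sum over y^n becomes an integral over the sample space, and the normalizing term is exactly the exponential sum referred to above.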
Abstract:
A simple technique for determining the energy sensitivities for the thermographic recording of laser beams is described. The principle behind this technique is that, if a laser beam with a known spatial distribution such as a Gaussian profile is used for imaging, the radius of the thermal image formed depends uniquely on the intensity of the impinging beam. Thus, by measuring the radii of the images produced for different incident beam intensities, the minimum intensity necessary (that is, the threshold) for thermographic imaging is found. The diameter of the laser beam can also be found from this measurement. A simple analysis based on the temperature distribution in the laser-heated material shows that the energy fluence required, both at threshold and for the subsequent increase in the size of the recording, has an inverse square-root dependence on the pulse duration or period of exposure. It has also been shown that, except for low-intensity, long-duration exposures on very low-conductivity materials, heat losses are not very significant.
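A hedged sketch of the underlying relation (generic Gaussian-beam notation, not the paper's symbols): for an incident profile I(r) = I_0 exp(-2r^2/w^2) and a recording threshold I_th, the recorded image extends to the radius r_i at which the local intensity falls to the threshold,

\[
I_{0}\,e^{-2 r_{i}^{2}/w^{2}} = I_{\mathrm{th}}
\;\;\Longrightarrow\;\;
r_{i}^{2} = \frac{w^{2}}{2}\,\ln\frac{I_{0}}{I_{\mathrm{th}}},
\]

so a plot of r_i^2 against ln I_0 is a straight line whose slope gives the beam radius w and whose extrapolation to r_i = 0 gives the threshold intensity I_th.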
Abstract:
The National Energy Efficient Building Project (NEEBP) Phase One report, published in December 2014, investigated “process issues and systemic failures” in the administration of the energy performance requirements in the National Construction Code. It found that most stakeholders believed that under-compliance with these requirements is widespread across Australia, with similar issues being reported in all states and territories. The report found that many different factors were contributing to this outcome and, as a result, many recommendations were offered that together would be expected to remedy the systemic issues reported. To follow up on this Phase 1 report, three additional projects were commissioned as part of Phase 2 of the overall NEEBP project. This report deals with the development and piloting of an Electronic Building Passport (EBP) tool – a project undertaken jointly by pitt&sherry and a team at the Queensland University of Technology (QUT) led by Dr Wendy Miller. The other Phase 2 projects cover audits of Class 1 buildings and issues relating to building alterations and additions. The passport concept aims to provide all stakeholders with (controlled) access to the key documentation and information that they need to verify the energy performance of buildings. This trial project deals with residential buildings, but in principle the approach could apply to any building type. Nine councils were recruited to help develop and test a pilot electronic building passport tool. The participation of these councils – across all states – enabled an assessment of the extent to which they currently utilise documentation to track the compliance of residential buildings with the energy performance requirements in the National Construction Code (NCC). Overall, we found that none of the participating councils are currently compiling all of the energy performance-related documentation that would demonstrate code compliance. The key reasons for this include: a major lack of clarity on precisely what documentation should be collected; cost and budget pressures; low public/stakeholder demand for the documentation; and a pragmatic judgement that non-compliance with any regulated documentation requirements represents a relatively low risk for them. Some councils reported producing documentation, such as certificates of final completion, only on demand, for example. Only three of the nine council participants reported regularly conducting compliance assessments or audits utilising this documentation and/or inspections. Overall, we formed the view that documentation and information tracking processes operating within the building standards and compliance system are not working to assure compliance with the Code’s energy performance requirements. In other words, the Code, and its implementation under state and territory regulatory processes, is falling short as a ‘quality assurance’ system for consumers. As a result it is likely that the new housing stock is under-performing relative to policy expectations, consuming unnecessary amounts of energy, imposing unnecessarily high energy bills on occupants, and generating unnecessary greenhouse gas emissions. At the same time, councils noted that the demand for documentation relating to building energy performance was low. All the participant councils in the EBP pilot agreed that documentation and information processes need to work more effectively if the potential regulatory and market drivers towards energy efficient homes are to be harnessed.
These findings are fully consistent with the Phase 1 NEEBP report. It was also agreed that an EBP system could potentially play an important role in improving documentation and information processes. However, only one of the participant councils indicated that they might adopt such a system on a voluntary basis. The majority felt that such a system would only be taken up if it were: - A nationally agreed system, imposed as a mandatory requirement under state or national regulation; - Capable of being used by multiple parties including councils, private certifiers, building regulators, builders and energy assessors in particular; and - Fully integrated into their existing document management systems, or at least seamlessly compatible rather than a separate, unlinked tool. Further, we note that the value of an EBP in capturing statistical information relating to the energy performance of buildings would be much greater if an EBP were adopted on a nationally consistent basis. Councils were clear that a key impediment to the take up of an EBP system is that they are facing very considerable budget and staffing challenges. They report that they are often unable to meet all community demands from the resources available to them. Therefore they are unlikely to provide resources to support the roll out of an EBP system on a voluntary basis. Overall, we conclude from this pilot that the public good would be well served if the Australian, state and territory governments continued to develop and implement an Electronic Building Passport system in a cost-efficient and effective manner. This development should occur with detailed input from building regulators, the Australian Building Codes Board (ABCB), councils and private certifiers in the first instance. This report provides a suite of recommendations (Section 7.2) designed to advance the development and guide the implementation of a national EBP system.
Abstract:
One of the most tangled fields of research is the field of defining and modeling affective concepts, i.e. concepts regarding emotions and feelings. The subject can be approached from many disciplines. The main problem is the lack of generally approved definitions. However, linguists, for example, have recently started to check the consistency of their theories with the help of computer simulations. Definitions of affective concepts are needed for performing similar simulations in the behavioral sciences. In this thesis, preliminary computational definitions of affects for a simple utility-maximizing agent are given. The definitions have been produced by synthesizing ideas from theories in several fields of research. The class of affects is defined as a superclass of emotions and feelings. An affect is defined as a process in which a change in an agent's expected utility causes a bodily change. If the process is currently under the attention of the agent (i.e. the agent is conscious of it), the process is a feeling. If it is not, but can in principle be taken into attention (i.e. it is preconscious), the process is an emotion. Thus, affects do not presuppose consciousness, but emotions and feelings do. Affects directed at unexpected materialized (i.e. past) events are delight and fright: delight is the consequence of an unexpected positive event and fright is the consequence of an unexpected negative event. Affects directed at expected materialized (i.e. past) events are happiness (an expected positive event materialized), disappointment (an expected positive event did not materialize), sadness (an expected negative event materialized) and relief (an expected negative event did not materialize). Affects directed at expected unrealized (i.e. future) events are fear and hope. Some other affects can be defined as directed towards the originators of the events. The affect classification has also been implemented as a computer program, the purpose of which is to ensure the coherence of the definitions and also to illustrate the capabilities of the model. The exact content of the bodily changes associated with specific affects is not considered relevant from the point of view of the logical structure of affective phenomena. The utility function also need not be defined, since the target of examination is only its dynamics.
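The taxonomy above can be made concrete with a small sketch (our own illustrative code, not the program implemented in the thesis; names and signatures are hypothetical):

from typing import Optional

def classify_affect(expected: bool, positive: bool, in_past: bool,
                    materialized: Optional[bool] = None) -> str:
    """Map an appraised event to an affect label, following the taxonomy in the abstract."""
    if not in_past:                      # expected, unrealized (future) events
        return "hope" if positive else "fear"
    if not expected:                     # unexpected, materialized (past) events
        return "delight" if positive else "fright"
    # expected past events: the label depends on whether the event materialized
    if positive:
        return "happiness" if materialized else "disappointment"
    return "sadness" if materialized else "relief"

def affect_kind(attended: bool, preconscious: bool) -> str:
    """A feeling if currently attended to, an emotion if merely attendable."""
    if attended:
        return "feeling"
    return "emotion" if preconscious else "affect (neither emotion nor feeling)"

print(classify_affect(expected=True, positive=False, in_past=True, materialized=False))  # relief
print(classify_affect(expected=False, positive=True, in_past=True, materialized=True))   # delight

The point of such a program, as the abstract notes, is only to check that the definitions are coherent; the content of the bodily changes and the utility function itself are deliberately left unspecified.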
Abstract:
The aim of this paper is to present the evolution of the Francovich doctrine within the European legal order. The first part deals with the gradual development of the ECJ's case law on State liability in damages for breach of EC law. Starting from the seminal Francovich and Brasserie du Pêcheur, the clarification of the criteria set by the Court is attempted with reference to subsequent case law, and issues concerning the extent and form of the compensation owed are also mentioned. The second part concerns one of the more recent developments in the field, namely State liability for breaches of Community law attributed to the national judiciary. The Court's ruling in Köbler is examined in connection with two other recent judgments, namely Commission v. Italy of 2003 and Kühne & Heitz, as an attempt of the ECJ to reframe its relationship with national supreme courts and appropriate for itself the position of the Supreme Court in the European legal order. The implications of the ruling in Commission v. France of 1997 for State liability claims constitute the theme of the third part, where it is submitted that Member States can also be held liable for disregard of Community law by private individuals within their respective territories. To this extent, Schmidberger is viewed as a manifestation of this opinion, with fundamental rights acquiring a new dimension, being invoked by the States, contra the individuals, as a shield to liability claims. Finally, the last part examines the relationship between the Francovich doctrine and the principle of legal certainty and concludes that the solutions employed by the ECJ have been both predictable and acceptable to the national legal orders. Keywords: State liability, damages, Francovich, Köbler, Schmidberger
The Mediated Immediacy: João Batista Libanio and the Question of Latin American Liberation Theology
Abstract:
This study is a systematic analysis of mediated immediacy in the production of the Brazilian professor of theology João Batista Libanio. He stresses both ethical mediation and the immediate character of the faith. Libanio has sought an answer to the problem of science and faith. He makes use of the neo-scholastic distinction between matter and form. According to St. Thomas Aquinas, God cannot be known as a scientific object, but it is possible to predicate a formal theological content of other subject matter with the help of revelation. This viewpoint was emphasized in neo-Thomism and supported by the liberation theologians. For them, the material starting point was social science. It becomes a theologizable or revealable (revelabile) reality. This social science has its roots in Latin American Marxism, which was influenced by the school of Louis Althusser and considered Marxism a 'science of history'. The synthesis of Thomism and Marxism is a challenge Libanio faced, especially in his Teologia da libertação from 1987. He emphasized the need for a genuinely spiritual and ethical discernment, and was particularly critical of the ethical implications of class struggle. Libanio's thinking has a strong hermeneutic flavor. It is more important to understand than to explain. He does not deny the need for social scientific data, but holds that they cannot be the exclusive starting point of theology. There are different readings of the world, both scientific and theological. A holistic understanding of the nature of religious experience is needed. Libanio follows the interpretation given by H. C. de Lima Vaz, according to whom the Hegelian dialectic is a rational circulation between the totality and its parts. He also recalls Oscar Cullmann's idea of God's Kingdom that is 'already and not yet'. In other words, there is a continuous mediation of grace into the natural world. This dialectic is reflected in ethics. Faith must be verified in good works. Libanio uses the Thomist fides caritate formata principle and the modern orthopraxis thinking represented by Edward Schillebeeckx. One needs both the ortho of good faith and the praxis of the right action. The mediation of praxis is the mediation of human and divine love. Libanio's theology has strong roots in the Jesuit spirituality that places the emphasis on contemplation in action.
Abstract:
Can war be justified? Expressions of opinions by the general assemblies of the World Council of Churches on the question of war as a method of settling conflicts. The purpose of this study is to describe and analyse the expressions of opinions recorded in the documents of the general assemblies of the WCC during the Cold War period from 1948 to 1983 on the use of war as a method of settling international and national conflicts. The main sources are the official reports of the WCC's assemblies during the years 1948 to 1983. This study divides the discussions into three periods. The first period (1949-1968) is dominated by the pressures arising from the Second World War. Experiences of the war led the assemblies of the WCC to the conclusion that modern warfare as a method of settling conflicts should be rejected. Modern war was contrary to God's purposes and the whole meaning of creation, said the assembly. Although the WCC rejected modern war, it left open the possibility of conflict where principles of just war may be practised. The question of war was also linked to the state and its function, which led to the need to create a politically neutral doctrine for the socio-ethical thinking of churches and of the WCC itself. The doctrine was formulated using the words "responsible society". The question of war and socio-ethical thinking were on the WCC's agenda throughout the first period. Another issue that had an influence on the first period was the increasing role of Third World countries. This new dimension also brought new aspects to the question of war and violence. The second period (1968-1975) presented greater challenges to the WCC, especially in traditional western countries. The Third World, political activity in the socialist world and ideas of revolution were discussed. The WCC's fourth Assembly in Uppsala was challenged by these new ideas of revolution. The old doctrine of "responsible society" was seen by many participants as unsuitable in the modern world, especially for Third World countries. The situation of a world governed by armaments, causing social and economic disruption, was felt by churches to be problematic. The peace movement gathered pace and attention. There was pressure to see armed forces as an option on the way to a new world order. The idea of a just war was challenged by that of just revolution. These ideas of revolution did not receive support from the Uppsala Assembly, but they pressured the WCC to reconsider its socio-ethical thinking. Revolution was seen as a possibility, but only when it could be peaceful. In the Nairobi Assembly the theme of a just, participatory and sustainable society provided yet another viewpoint, dealing with the life of the world and its problems as a whole. The third period (1975-1983) introduced a new, alternative doctrine, the "JPIC Process" (justice, peace and the integrity of creation), for social thinking in the WCC. The WCC no longer wanted to discuss war or poverty as separate questions, but wanted to combine all aspects of life to see the impact of an arms-governed world on humankind. Thus, during the last period, discussions focused on socio-ethical questions, where war and violence were only parts of a larger problem. Through the new JPIC Process, the WCC's Assembly in Vancouver looked for a new world, one without violence, in all aspects of life. Despite differing opinions in socio-ethical thinking, the churches in the WCC agreed that modern warfare cannot be regarded as acceptable or just. The old idea of a "just war" still had a place, but it was not seen by all as a valid principle. As a result, the WCC viewed war as a last resort, to be employed only when all other methods had failed. Such a war would have to secure peace and justice for all. In the discussions there was a strong political east-west divide and, during the last two decades, a north-south divide as well. The effect of the Cold War was obvious. In the background to the theological positions were two main concepts: the idea of God's activity in man's history through the so-called regiments, and the concept of the Kingdom of God on Earth.