16 results for Formulation de projets
in University of Queensland eSpace - Australia
Abstract:
Mental disorders are a major and rising cause of disease burden in all countries. Even when resources are available, many countries do not have the policy and planning frameworks in place to identify and deliver effective interventions. The World Health Organization (WHO) and the World Bank have emphasized the need for ready access to the basic tools for mental health policy formulation, implementation and sustained development. The Analytical Studies on Mental Health Policy and Service Project, undertaken in 1999-2001 by the International Consortium for Mental Health Services and funded by the Global Forum for Health Research, aims to address this need through the development of a template for mental health policy formulation. A mental health policy template has been developed based on an inventory of the key elements of a successful mental health policy. These elements have been validated against a review of international literature, a study of existing mental health policies and the results of extensive consultations with experts in the six WHO regions of the world. The Mental Health Policy Template has been revised and its applicability will be tested in a number of developing countries during 2001-2002. The Mental Health Policy Template and the work of the Consortium for Mental Health Services will be presented, and the future role of the template in mental health policy development and reform in developing countries will be discussed.
Abstract:
Clinically healthy mixed-breed dogs (n = 20) were used to determine if a Tris (tromethamine)-buffered test solution, Otinide® (trademark of Dermcare-Vet Pty Ltd, Australia), containing disodium ethylenediamine tetraacetic acid (EDTA; 1.21 g/L) and polyhexamethylene biguanide (PHMB; 0.22 g/L), caused ototoxicity or vestibular dysfunction. The dogs were randomly assigned to either a control group (group A, n = 10) receiving saline, or a treatment group (group B, n = 10) receiving the test solution. Phase 1 of the study consisted of applying 5.0 mL of saline to both ears of the control group (group A) and 5.0 mL of the test solution to both ears of the test group (group B), for 21 days. A bilateral myringotomy was then performed on each dog under deep sedation. Phase 2 of the study then consisted of applying 2.0 mL of the saline to both ears of the control group (group A) and 2.0 mL of the test solution to both ears of the test group (group B), for 14 days. Throughout the study, dogs were examined for clinical health, and underwent otoscopic, vestibular and auditory examinations. The auditory examinations included brainstem auditory evoked potential (BAEP) threshold and supra-threshold assessments using both click and 8 kHz tone burst stimuli. The absence of vestibular signs and of effects on the BAEP attributable to the test solution suggested the test solution could be applied safely to dogs, including those with a damaged tympanic membrane.
Abstract:
This paper defines the 3D reconstruction problem as the process of reconstructing a 3D scene from numerous 2D visual images of that scene. It is well known that this problem is ill-posed, and numerous constraints and assumptions are used in 3D reconstruction algorithms in order to reduce the solution space. Unfortunately, most constraints only work in a certain range of situations, and often constraints are built into the most fundamental methods (e.g. Area Based Matching assumes that all the pixels in the window belong to the same object). This paper presents a novel formulation of the 3D reconstruction problem, using a voxel framework and first-order logic equations, which does not contain any additional constraints or assumptions. Solving this formulation for a set of input images gives all the possible solutions for that set, rather than picking a solution that is deemed most likely. Using this formulation, this paper studies the problem of uniqueness in 3D reconstruction and how the solution space changes for different configurations of input images. It is found that it is not possible to guarantee a unique solution, no matter how many images are taken of the scene, their orientation or even how much color variation is in the scene itself. Results of using the formulation to reconstruct a few small voxel spaces are also presented. They show that the number of solutions is extremely large for even very small voxel spaces (a 5 x 5 voxel space gives 10 to 10^7 solutions). This shows the need for constraints to reduce the solution space to a reasonable size. Finally, it is noted that because of the discrete nature of the formulation, the solution space size can be easily calculated, making the formulation a useful tool to numerically evaluate the usefulness of any constraints that are added.
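The combinatorial blow-up described in this abstract can be illustrated with a brute-force sketch. The following is a hypothetical toy, not the paper's voxel/first-order-logic formulation: a 1D "scene" of coloured voxels observed by two single-pixel cameras, with every colouring consistent with the observations enumerated exhaustively. All names and the projection model are illustrative assumptions.

```python
from itertools import product

COLOURS = ["empty", "red", "blue"]

def project_left(scene):
    # The left camera sees the first non-empty voxel (or the background).
    for v in scene:
        if v != "empty":
            return v
    return "background"

def project_right(scene):
    # The right camera sees the last non-empty voxel (or the background).
    for v in reversed(scene):
        if v != "empty":
            return v
    return "background"

def consistent_scenes(n, left_obs, right_obs):
    """Enumerate every n-voxel scene matching both observations."""
    return [s for s in product(COLOURS, repeat=n)
            if project_left(s) == left_obs and project_right(s) == right_obs]

solutions = consistent_scenes(4, "red", "blue")
print(len(solutions))  # 18 distinct 4-voxel scenes match two 1-pixel images
```

Even this trivial setup admits 18 solutions from just four voxels and two observations, mirroring the paper's point that the unconstrained solution space grows far too quickly to pick a unique reconstruction.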
Abstract:
Numerical solutions of the sediment conservation law are reviewed in terms of their application to bed update schemes in coastal morphological models. It is demonstrated that inadequately formulated numerical techniques lead to the introduction of diffusion, dispersion and the bed elevation oscillations previously reported in the literature. Four different bed update schemes are then reviewed and tested against benchmark analytical solutions. These include a first order upwind scheme, two Lax-Wendroff schemes and a non-oscillating centred scheme (NOCS) recently applied to morphological modelling by Saint-Cast [Saint-Cast, F., 2002. Modelisation de la morphodynamique des corps sableux en milieu littoral (Modelling of coastal sand banks morphodynamics), University Bordeaux 1, Bordeaux, 245 pp.]. It is shown that NOCS limits and controls numerical errors while including all the sediment flux gradients that control morphological change. Further, no post solution filtering is required, which avoids difficulties with selecting filter strength. Finally, NOCS is compared to a recent Lax-Wendroff scheme with post-solution filtering for a longer term simulation of the morphological evolution around a trained river entrance.
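As a concrete illustration of the simplest of the four schemes reviewed, here is a minimal first-order upwind bed update for the 1D sediment conservation (Exner) equation, dz/dt + (1/(1 - p)) dq/dx = 0. This is a generic textbook sketch under assumed conditions (periodic boundary, a toy flux law q = 0.05 z, porosity 0.4), not code from the study; it reproduces the diffusive smearing of bed features that the abstract attributes to inadequately formulated schemes.

```python
import numpy as np

def upwind_bed_update(z, q, dx, dt, porosity=0.4):
    """One explicit first-order upwind step for the Exner equation.

    Assumes sediment transport q is directed in +x, so the upwind
    difference uses the cell behind (backward difference).
    """
    dqdx = (q - np.roll(q, 1)) / dx  # periodic backward difference
    return z - dt / (1.0 - porosity) * dqdx

x = np.linspace(0.0, 100.0, 101)
z = np.exp(-((x - 50.0) / 10.0) ** 2)  # Gaussian sand hump, peak 1.0

for _ in range(100):
    q = 0.05 * z  # toy flux law: transport proportional to bed height
    z = upwind_bed_update(z, q, dx=1.0, dt=1.0)

# The hump advects downstream, and its peak erodes below 1.0: the
# numerical diffusion introduced by the first-order scheme, even though
# sediment mass is conserved exactly (periodic domain).
print(z.max())
```

The effective Courant number here is (0.05 / 0.6) x dt/dx, approximately 0.083, so the step is stable; the visible peak loss is purely the scheme's truncation-error diffusion, which is exactly the kind of error the higher-order and NOCS schemes in the paper are designed to limit.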
Abstract:
In the absence of an external frame of reference (i.e., in background-independent theories such as general relativity), physical degrees of freedom must describe relations between systems. Using a simple model, we investigate how such a relational quantum theory naturally arises by promoting reference systems to the status of dynamical entities. Our goal is twofold. First, we demonstrate using elementary quantum theory how any quantum mechanical experiment admits a purely relational description at a fundamental level. Second, we describe how the original non-relational theory approximately emerges from the fully relational theory when reference systems become semi-classical. Our technique is motivated by a Bayesian approach to quantum mechanics, and relies on the noiseless subsystem method of quantum information science used to protect quantum states against undesired noise. The relational theory naturally predicts a fundamental decoherence mechanism, so an arrow of time emerges from a time-symmetric theory. Moreover, our model circumvents the problem of the collapse of the wave packet, as the probability interpretation is only ever applied to diagonal density operators. Finally, the physical states of the relational theory can be described in terms of spin networks introduced by Penrose as a combinatorial description of geometry, and widely studied in the loop formulation of quantum gravity. Thus, our simple bottom-up approach (starting from the semiclassical limit to derive the fully relational quantum theory) may offer interesting insights into the low energy limit of quantum gravity.
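The remark that the probability interpretation is only ever applied to diagonal density operators can be illustrated with a standard textbook toy, separate from the paper's specific model: entangling a system qubit with a reference qubit and then tracing the reference out leaves a diagonal (decohered) reduced state, from which probabilities can be read directly.

```python
import numpy as np

# Toy decoherence sketch (generic quantum-information example, not the
# paper's relational model). A system qubit in the superposition
# (|0> + |1>)/sqrt(2) is correlated with a reference qubit, producing
# the joint state (|00> + |11>)/sqrt(2); discarding the reference
# (partial trace) kills the off-diagonal coherences.

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# Joint state in the basis |system, reference>, via Kronecker products.
psi = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2.0)
rho = np.outer(psi, psi.conj())  # 4x4 joint density matrix

# Partial trace over the reference qubit: reshape to (s, r, s', r')
# and sum the diagonal in the reference indices.
rho_sys = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

print(rho_sys)  # diagonal matrix diag(0.5, 0.5): coherences are gone
```

The reduced state is the maximally mixed diag(0.5, 0.5), so only classical probabilities remain on the diagonal, which is the structural point the abstract makes about avoiding an explicit collapse postulate.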