7 results for Value analysis (Cost control)
Abstract:
ABSTRACT - Introduction: The absence of an analytical accounting plan for Primary Health Care is an obstacle to internal accounting, which is fundamental for the management of any health institution. Without guidelines for standardising the criteria for allocating and distributing costs/revenues, it becomes difficult to obtain the analytical data needed for more effective management control, which would allow resources to be used efficiently and rationally and improve the quality of care provided to patients. Objective: The main objective of this research project is to determine the cost per patient in Primary Health Care. Methodology: A cost-determination methodology was built based on the Time-Driven Activity-Based Costing method. Cost was allocated to each patient using the following cost drivers: consultation time and output delivered, for the allocation of medical staff costs; output delivered, for the allocation of other staff costs and variable indirect costs; total number of registered patients, for the allocation of fixed indirect costs. Results: The total cost determined was €2,980,745.10. The average number of consultations is 3.17 per registered patient and 4.72 per patient who actually used services. The average cost per patient is €195.76. The average cost per female patient is €232.41. The average cost per male patient is €154.80. The items with the greatest weight in the total cost per patient are medicines (40.32%), medical staff costs (22.87%), and complementary diagnostic and therapeutic procedures (MCDT, 17.18%). Conclusion: When implementing a per-patient cost-determination system, it is crucial to have efficient information systems that record the care provided to each patient across the various levels of care. It is also important that management does not use the results merely as a cost-control tool; their use should be leveraged to create value for the patient.
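As a concrete reading of the methodology, the sketch below shows how the three cost drivers described in the abstract combine into a per-patient cost. It is a minimal illustration in Python; the function name and all unit rates are invented for illustration and are not the study's data.

```python
# Minimal sketch of the per-patient allocation described above
# (Time-Driven Activity-Based Costing). All rates are illustrative.

def patient_cost(minutes_per_consult, n_consults,
                 medical_rate_per_minute,
                 other_staff_per_consult,
                 variable_indirect_per_consult,
                 fixed_indirect_per_registered):
    """Allocate cost to one patient using the three drivers in the abstract."""
    # Driver 1: consultation time x output delivered -> medical staff cost
    medical = minutes_per_consult * n_consults * medical_rate_per_minute
    # Driver 2: output delivered -> other staff costs + variable indirect costs
    production_based = n_consults * (other_staff_per_consult
                                     + variable_indirect_per_consult)
    # Driver 3: equal share of fixed indirect costs per registered patient
    return medical + production_based + fixed_indirect_per_registered

# Hypothetical unit rates, e.g. derived from total cost / practical capacity
print(patient_cost(minutes_per_consult=20, n_consults=3,
                   medical_rate_per_minute=0.75,
                   other_staff_per_consult=18.0,
                   variable_indirect_per_consult=12.0,
                   fixed_indirect_per_registered=35.0))
```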
Abstract:
Dissertation to obtain the degree of Doctor in Electrical and Computer Engineering, specialization in Collaborative Networks
Abstract:
The aim of this contribution is to extend the techniques of composite materials design to non-linear material behaviour and apply them to the design of new materials for passive vibration control. As a first step, a computational tool was developed that determines optimized macroscopic one-dimensional isolator behaviour. Voigt, Maxwell, standard, and more complex material models can be implemented. The objective function considers minimization of the initial reaction and/or displacement peak, as well as minimization of the steady-state amplitude of reaction and/or displacement. The complex stiffness approach is used to formulate the governing equations in an efficient way. Material stiffness parameters are assumed to be non-linear functions of the displacement. The numerical solution is performed in the complex space. The steady-state solution in the complex space is obtained by an iterative process based on the shooting method, which imposes conditions of periodicity with respect to the known value of the period. An extension of the shooting method to the complex space is presented and verified. The non-linear behaviour of the material parameters is then optimized by a generic probabilistic meta-algorithm, simulated annealing. The dependence of the global optimum on several combinations of the leading parameters of the simulated annealing procedure, such as the neighbourhood definition and the annealing schedule, is also studied and analyzed. The procedure is programmed in the MATLAB environment.
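To make the periodicity-imposing shooting step concrete, here is a minimal sketch for a simplified, real-valued single-degree-of-freedom isolator with displacement-dependent stiffness. The thesis works in the complex space and in MATLAB; this Python version, its stiffness law, and all parameter values are assumptions for illustration only.

```python
# Shooting method for a periodic steady state with known period T:
# find an initial state whose trajectory returns to itself after one period.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import fsolve

omega = 1.2                  # forcing frequency (illustrative)
T = 2 * np.pi / omega        # known period of the steady-state solution

def rhs(t, y):
    x, v = y
    k = 1.0 + 0.5 * x**2     # displacement-dependent stiffness (assumed form)
    c = 0.1                  # damping (assumed)
    return [v, -c * v - k * x + np.cos(omega * t)]

def residual(y0):
    # Integrate over one period; the periodicity defect is y(T) - y(0)
    sol = solve_ivp(rhs, (0.0, T), y0, rtol=1e-9, atol=1e-9)
    return sol.y[:, -1] - y0

y_star = fsolve(residual, [0.0, 0.0])   # initial state of the periodic orbit
print("periodic initial conditions:", y_star)
```

The same structure carries over to the complex-space formulation: only the state vector, the right-hand side, and the residual change, while the fixed-point view of the shooting step stays the same.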
Abstract:
Thesis presented as a partial requirement for obtaining the degree of Doctor in Information Management
Abstract:
The Keystone XL pipeline plays a major role in transporting Canadian oil to the USA. The pipeline is intended to decrease the American oil industry's dependency on other countries and to help limit external debt. The proposed pipeline seeks the most suitable route, one that avoids damage to agricultural land and natural water resources such as the Ogallala Aquifer. Using Geographic Information System (GIS) techniques, the path suggested in this study achieved highly accurate results, which can support the use of least-cost analysis in similar future studies. The route analysis comprises several weighted overlay surfaces, each influenced by various criteria (slope, geology, population, and land use). The resulting least-cost path routes for each weighted overlay surface were compared with the originally proposed pipeline, and each surface yielded a route more effective than the proposed Keystone XL pipeline.
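Below is a minimal sketch of the weighted-overlay and least-cost-path workflow the abstract describes, assuming random stand-in rasters. In a real analysis the slope, geology, population, and land-use layers would come from GIS data on a common grid, and the weights would follow the study's weighting schemes.

```python
# Weighted overlay of criterion rasters, then a least-cost path across
# the combined cost surface. All rasters, weights, and endpoints are
# invented for illustration.
import numpy as np
from skimage.graph import route_through_array

rng = np.random.default_rng(0)
shape = (100, 100)

# Hypothetical normalized criterion rasters in [0, 1] (higher = costlier)
slope = rng.random(shape)
geology = rng.random(shape)
population = rng.random(shape)
land_use = rng.random(shape)

# Weighted overlay: one cost surface per weighting scheme
weights = {"slope": 0.4, "geology": 0.2, "population": 0.2, "land_use": 0.2}
cost = (weights["slope"] * slope + weights["geology"] * geology
        + weights["population"] * population + weights["land_use"] * land_use)

# Least-cost path between two illustrative endpoints on the surface
path, total_cost = route_through_array(cost, (0, 0), (99, 99),
                                       fully_connected=True, geometric=True)
print(f"path length: {len(path)} cells, accumulated cost: {total_cost:.2f}")
```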
Abstract:
This work models the competitive behaviour of individuals who maximize their own utility by managing their network of connections with other individuals. Utility is taken as a synonym of reputation in this model. Each agent decides on two variables: the quality of connections and the number of connections. Hence, the reputation of an individual is a function of the number and quality of connections within the network. On the other hand, individuals incur a cost when they improve their network of contacts. The initial quality and number of connections of each individual are distributed according to a given initial distribution. The competition occurs over continuous time and among a continuum of agents. A mean field game approach is adopted to solve the model, leading to an optimal trajectory for the number and quality of connections for each individual.
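A full mean field game couples a Hamilton-Jacobi-Bellman equation for each individual's optimization with a transport equation for the population distribution. The toy loop below only mimics that feedback, using a finite population and a myopic gradient adjustment instead of the full optimal control; the utility and cost forms and every parameter are assumptions for illustration.

```python
# Toy discretization of the feedback between individual adjustment and the
# population average ("mean field"). Not the paper's model: functional
# forms and parameters are invented.
import numpy as np

rng = np.random.default_rng(1)
N = 1000                      # finite population approximating the continuum
n = rng.uniform(1, 10, N)     # initial number of connections (given distribution)
q = rng.uniform(0, 1, N)      # initial quality of connections

lr, cost_n, cost_q = 0.05, 0.1, 0.5

for step in range(200):
    mean_rep = np.mean(n * q)                 # mean field seen by every agent
    # Marginal gain in reputation relative to the population average,
    # net of quadratic improvement costs (assumed forms)
    grad_n = q / (1 + mean_rep) - cost_n * n
    grad_q = n / (1 + mean_rep) - cost_q * q
    n = np.maximum(n + lr * grad_n, 0.0)      # myopic gradient adjustment
    q = np.clip(q + lr * grad_q, 0.0, 1.0)

print(f"steady state: mean n = {n.mean():.2f}, mean q = {q.mean():.2f}")
```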
Abstract:
Information systems are widespread and used by anyone with a computing device, as well as by corporations and governments. It is often the case that security leaks are introduced during the development of an application. The reasons for these security bugs are multiple, but among them one can easily identify that it is very hard to define and enforce relevant security policies in modern software. This is because modern applications often rely on container sharing and multi-tenancy where, for instance, data can be stored in the same physical space but is logically mapped into different security compartments or data structures. In turn, these security compartments, into which data is classified by security policies, can also be dynamic and depend on runtime data. In this thesis we introduce and develop the novel notion of dependent information flow types, and focus on the problem of ensuring data confidentiality in data-centric software. Dependent information flow types fit within the standard framework of dependent type theory but, unlike usual dependent types, crucially allow the security level of a type, rather than just the structural data type itself, to depend on runtime values. Our dependent function and dependent sum information flow types provide a direct, natural, and elegant way to express and enforce fine-grained security policies on programs, namely programs that manipulate structured data types in which the security level of a structure field may depend on values dynamically stored in other fields. The main contribution of this work is an efficient analysis that allows programmers to verify, during the development phase, whether programs have information leaks; that is, it verifies whether programs protect the confidentiality of the information they manipulate. As such, we also implemented a prototype typechecker that can be found at http://ctp.di.fct.unl.pt/DIFTprototype/.
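To convey the core idea of a security level that depends on a runtime value, here is a toy runtime model in Python. The actual system enforces such policies statically, through dependent function and sum types in a typechecker; the record shape, two-point lattice, and checks below are invented for illustration only.

```python
# Toy runtime model: the security label of a record's `body` field depends
# on the runtime value of its `visibility` field.
from dataclasses import dataclass

LATTICE = {"public": 0, "confidential": 1}   # two-point security lattice

@dataclass
class Record:
    owner: str
    visibility: str   # "public" or "confidential" -- a runtime value
    body: str

def label_of(rec: Record) -> str:
    # The security level of `body` is determined by the runtime field
    return rec.visibility

def flow_allowed(src_label: str, dst_label: str) -> bool:
    # Information may only flow upward in the lattice (no leaks downward)
    return LATTICE[src_label] <= LATTICE[dst_label]

def copy_body(src: Record, dst: Record) -> None:
    if not flow_allowed(label_of(src), label_of(dst)):
        raise PermissionError("illegal downward information flow")
    dst.body = src.body

secret = Record("alice", "confidential", "diagnosis")
public = Record("bob", "public", "")
copy_body(public, secret)        # ok: public -> confidential
# copy_body(secret, public)      # would raise: confidential -> public
```

The benefit of the static discipline described in the abstract is that such violations are rejected at development time, with no runtime checks or failures.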