31 results for Accelerated failure time model


Relevance:

30.00%

Publisher:

Abstract:

A Work Project, presented as part of the requirements for the Award of a Master's Degree in Management from the NOVA – School of Business and Economics

Relevance:

30.00%

Publisher:

Abstract:

ABSTRACT - It is the purpose of the present thesis to emphasize, through a series of examples, the need for and value of appropriate pre-analysis of the impact of health care regulation. Specifically, the thesis presents three papers on the theme of regulation in different aspects of health care provision and financing. The first two consist of economic analyses of the impact of health care regulation, and the third comprises the creation of an instrument for supporting economic analysis of health care regulation, namely in the field of evaluation of health care programs.

The first paper develops a model of health plan competition and pricing in order to understand the dynamics of health plan entry and exit in the presence of switching costs and alternative health premium payment systems. We build an explicit model of death spirals, in which profit-maximizing competing health plans find it optimal to adopt a pattern of increasing relative prices culminating in health plan exit. We find the steady-state numerical solution for the price sequence and the plan's optimal length of life through simulation and perform some comparative statics. This allows us to show that using risk-adjusted premiums and imposing price floors are effective at reducing death spirals and switching costs, while having employees pay a fixed share of the premium enhances death spirals and increases switching costs.

Price regulation of pharmaceuticals is one of the cost control measures adopted by the Portuguese government, as in many European countries. When such regulation decreases the products' real price over time, it may create an incentive for product turnover. Using panel data for the period of 1997 through 2003 on drug packages sold in Portuguese pharmacies, the second paper addresses the question of whether price control policies create an incentive for product withdrawal. Our work builds on the product survival literature by accounting for unobservable product characteristics and heterogeneity among consumers when constructing quality, price control and competition indexes. These indexes are then used as covariates in a Cox proportional hazards model. We find that, indeed, price control measures increase the probability of exit, and that this effect is not observed in the OTC market, where no such price regulation measures exist. We also find quality to have a significant positive impact on product survival.

In the third paper, we develop a microsimulation discrete events model (MSDEM) for cost-effectiveness analysis of Human Immunodeficiency Virus treatment, simulating individual paths from antiretroviral therapy (ART) initiation to death. Four driving forces determine the course of events: CD4+ cell count, viral load, resistance and adherence. A novel feature of the model with respect to previous MSDEMs is that distributions of time to event depend on individuals' characteristics and past history. Time to event was modeled using parametric survival analysis. Events modeled include: viral suppression, regimen switch due to virological failure, regimen switch due to other reasons, resistance development, hospitalization, AIDS events, and death. Disease progression is structured according to therapy lines, and the model is parameterized with Portuguese observational cohort data. An application of the model is presented comparing the cost-effectiveness of ART initiation with two nucleoside analogue reverse transcriptase inhibitors (NRTI) plus one non-nucleoside reverse transcriptase inhibitor (NNRTI) to two NRTI plus a boosted protease inhibitor (PI/r) in HIV-1 infected individuals. We find 2NRTI+NNRTI to be a dominant strategy. Results predicted by the model reproduce those of the data used for parameterization and are in line with those published in the literature.
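Both the second and third papers, like the page's query term, rest on survival analysis. As an illustration only (toy data and the simplest parametric model, not the thesis's Cox or microsimulation machinery), the exponential survival model has a closed-form maximum-likelihood fit under right-censoring:

```python
import math

def exponential_survival_mle(times, events):
    """MLE for an exponential survival model with right-censoring:
    rate = observed events / total exposure time.
    `times` are follow-up times; `events` are 1 (event) or 0 (censored)."""
    rate = sum(events) / sum(times)
    median = math.log(2) / rate          # median time to event
    return rate, median

def survival_prob(rate, t):
    """S(t) = exp(-rate * t) for the exponential model."""
    return math.exp(-rate * t)

# toy data: 4 subjects, two events and two censored observations
times = [2.0, 5.0, 3.0, 6.0]
events = [1, 1, 0, 0]
rate, median = exponential_survival_mle(times, events)
```

With 2 events over 16 time units of exposure this gives a rate of 0.125; richer parametric families (Weibull, log-normal) and the Cox model replace the constant hazard but keep the same likelihood structure for censored data.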

Relevance:

30.00%

Publisher:

Abstract:

A Work Project, presented as part of the requirements for the Award of a Master's Degree in Finance from the NOVA – School of Business and Economics

Relevance:

30.00%

Publisher:

Abstract:

Dissertation presented to obtain the Master's Degree in Molecular Genetics and Biomedicine from Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia

Relevance:

30.00%

Publisher:

Abstract:

Dissertation to obtain a PhD in Industrial Engineering

Relevance:

30.00%

Publisher:

Abstract:

Dissertation to obtain the Master's Degree in Industrial Engineering and Management

Relevance:

30.00%

Publisher:

Abstract:

A Work Project, presented as part of the requirements for the Award of a Master's Degree in Management from the NOVA – School of Business and Economics

Relevance:

30.00%

Publisher:

Abstract:

This paper incorporates egocentric comparisons into a human capital accumulation model and studies the evolution of positive self-image over time. The paper shows that the process of human capital accumulation, together with egocentric comparisons, implies that the positive self-image of a cohort first increases and then decreases over time. Additionally, the paper finds that positive self-image: (1) peaks earlier in activities where skill depreciation is higher, (2) is smaller in activities where the distribution of income is more dispersed, (3) is not a stable characteristic of an individual, and (4) is higher for more patient individuals.

Relevance:

30.00%

Publisher:

Abstract:

Dissertation to obtain the Doctoral Degree in Informatics Engineering

Relevance:

30.00%

Publisher:

Abstract:

Conventionally, the problem of finding the best path in a network is treated as the shortest path problem. However, for the vast majority of present-day networks this solution has limitations that directly affect their proper functioning and lead to inefficient use of their potential. Problems at the level of large networks, where graphs of high complexity are common, as well as the emergence of new services and their respective requirements, are intrinsically related to the inadequacy of this solution. In order to meet the needs of these networks, a new approach to the best path problem must be explored. One solution that has aroused particular interest in the scientific community considers the use of multiple paths between two network nodes, where all of them may be regarded as best paths between those nodes. Routing is therefore no longer performed by minimizing a single metric, with only one path chosen between nodes, but by selecting one of many paths, thereby allowing a greater diversity of the available paths to be used (provided, of course, the network allows it). Establishing multi-path routing in a given network has several advantages for its operation. It can improve the distribution of network traffic, improve recovery time after failures, and offer the administrator greater control over the network. These factors are even more relevant when networks are large and highly complex, such as the Internet, where multiple networks managed by different entities are interconnected. A large part of the growing need for multi-path protocols is associated with policy-based routing, whereby paths with different characteristics can be considered with an equal level of preference and thus form part of the solution to the best path problem.

Performing multi-path routing with protocols based only on the destination address has some limitations, but it is possible. Concepts from graph theory and algebraic structures can be used to describe how routes are calculated and classified, making it possible to model the routing problem. This thesis studies and analyzes multi-path routing protocols from the known literature and derives a new algebraic condition which allows the correct operation of these protocols without any network restriction. It also develops a range of software tools that allow the planning and respective verification/validation of new protocol models according to the study made.
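The idea of treating several equal-cost paths as jointly "best" can be illustrated with a standard Dijkstra variant that records every predecessor achieving the minimal distance (a generic sketch of equal-cost multipath, not the algebraic framework derived in the thesis):

```python
import heapq

def all_shortest_paths(graph, src, dst):
    """Dijkstra variant keeping every predecessor that achieves the
    minimal distance, then enumerating all equal-cost paths.
    `graph` maps node -> {neighbour: edge weight}."""
    dist = {src: 0}
    preds = {src: []}
    heap = [(0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                     # stale heap entry
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                preds[v] = [u]
                heapq.heappush(heap, (nd, v))
            elif nd == dist.get(v):
                preds[v].append(u)       # equal-cost alternative
    def unwind(node):
        if node == src:
            return [[src]]
        return [p + [node] for u in preds[node] for p in unwind(u)]
    return unwind(dst) if dst in dist else []

# diamond graph: two equal-cost routes from a to d
g = {"a": {"b": 1, "c": 1}, "b": {"d": 1}, "c": {"d": 1}, "d": {}}
paths = all_shortest_paths(g, "a", "d")
```

Here both `a-b-d` and `a-c-d` are returned, so a router could spread traffic over them or keep one as an instant failover.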

Relevance:

30.00%

Publisher:

Abstract:

This study analyses financial data using the result characterization of a self-organized neural network model. The goal was to prototype a tool that may help an economist or a market analyst to analyse stock market series. To reach this goal, the tool shows economic dependencies and statistical measures over stock market series. The self-organizing map (SOM) neural network model was used to extract behavioural patterns from the data analysed. Based on this model, an application was developed to analyse financial data, using a portfolio of correlated or inverse-correlated markets as input. After the analysis with SOM, the result is represented by micro-clusters organized by their behaviour tendency. During the study, the need arose for a better analysis of the SOM algorithm's results. This problem was solved with a clustering technique, which groups the micro-clusters from SOM U-Matrix analyses. The study showed that correlated and inverse-correlated markets project multiple clusters of data. These clusters represent multiple trend states that may be useful for technical professionals.
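The SOM training loop itself is compact enough to sketch. Below is a minimal 1-D map on toy two-cluster data, for illustration only; the study's actual application, portfolio data, and U-Matrix post-processing are not reproduced:

```python
import math, random

def train_som(data, n_units=4, epochs=200, lr0=0.5, sigma0=1.0, seed=0):
    """Minimal 1-D self-organizing map: each unit holds a weight vector;
    the best-matching unit (BMU) and its grid neighbours move toward
    each sample, with learning rate and neighbourhood width decaying."""
    rng = random.Random(seed)
    dim = len(data[0])
    units = [[rng.random() for _ in range(dim)] for _ in range(n_units)]
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)
        sigma = max(sigma0 * (1 - epoch / epochs), 1e-3)
        for x in data:
            bmu = min(range(n_units),
                      key=lambda i: sum((a - b) ** 2 for a, b in zip(units[i], x)))
            for i in range(n_units):
                h = math.exp(-((i - bmu) ** 2) / (2 * sigma ** 2))  # neighbourhood
                units[i] = [w + lr * h * (xv - w) for w, xv in zip(units[i], x)]
    return units

def bmu_index(units, x):
    """Index of the unit whose weights are closest to sample x."""
    return min(range(len(units)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(units[i], x)))

# two well-separated clusters of 2-D points
data = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (1.0, 1.0), (0.9, 1.0), (1.0, 0.9)]
units = train_som(data)
```

After training, samples from the two clusters map to different units; in the study's terms, each unit (or group of neighbouring units) plays the role of a micro-cluster capturing one behaviour tendency.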

Relevance:

30.00%

Publisher:

Abstract:

Enhanced biological phosphorus removal (EBPR) is the most economic and sustainable option used in wastewater treatment plants (WWTPs) for phosphorus removal. In this process it is important to control the competition between polyphosphate accumulating organisms (PAOs) and glycogen accumulating organisms (GAOs), since EBPR deterioration or failure can be related to the proliferation of GAOs over PAOs. This thesis focuses on the effect of operational conditions (volatile fatty acid (VFA) composition, dissolved oxygen (DO) concentration and organic carbon loading) on PAO and GAO metabolism. Knowledge about the effect of these operational conditions on EBPR metabolism is very important, since they represent key factors that impact WWTP performance and sustainability. Substrate competition between the anaerobic uptake of acetate and propionate (the main VFAs present in WWTPs) was shown in this work to be a relevant factor affecting PAO metabolism, and a metabolic model was developed that successfully describes this effect. Interestingly, the aerobic metabolism of PAOs was not affected by different VFA compositions, since the aerobic kinetic parameters for phosphorus uptake, polyhydroxyalkanoate (PHA) degradation and glycogen production were relatively independent of acetate or propionate concentration. This is very relevant for WWTPs, since it simplifies the calibration procedure for metabolic models, facilitating their use in full-scale systems. The DO concentration and aerobic hydraulic retention time (HRT) affected the PAO-GAO competition, with low DO levels or a lower aerobic HRT being more favourable for PAOs than for GAOs. Indeed, the oxygen affinity coefficient was significantly higher for GAOs than for PAOs, showing that PAOs were far superior at scavenging the often limited oxygen levels in WWTPs.

The operation of WWTPs with low aeration is of high importance for full-scale systems, since it decreases energy costs and can potentially improve WWTP sustainability. Extended periods of low organic carbon load, which are the most common conditions in full-scale WWTPs, also had an impact on PAO and GAO activity. GAOs exhibited a substantially higher biomass decay rate than PAOs under these conditions, revealing a higher survival capacity for PAOs and representing an advantage for them in EBPR processes. This superior survival capacity of PAOs under conditions more closely resembling a full-scale environment was linked with their ability to maintain a residual level of PHA reserves for longer than GAOs, providing them with an effective energy source for aerobic maintenance processes. Overall, this work shows that each of these key operational conditions plays an important role in the PAO-GAO competition and should be considered in WWTP models in order to improve EBPR processes.
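The oxygen-affinity argument can be illustrated with standard Monod kinetics. The parameter values below are hypothetical, chosen only to show why an organism with a lower half-saturation coefficient outcompetes at low DO; they are not the coefficients measured in the thesis:

```python
def monod_rate(mu_max, Ks, S):
    """Monod growth kinetics: the rate rises with substrate S and
    saturates at mu_max; Ks is the half-saturation (affinity)
    coefficient, so a lower Ks means a higher affinity."""
    return mu_max * S / (Ks + S)

# hypothetical parameters, for illustration only: same maximum rate,
# but a lower (better) oxygen half-saturation for PAOs than for GAOs
mu_max = 0.3                      # 1/h
Ks_pao, Ks_gao = 0.1, 0.5         # mg O2 / L

low_do = 0.2                      # dissolved oxygen, mg O2 / L
pao_low = monod_rate(mu_max, Ks_pao, low_do)
gao_low = monod_rate(mu_max, Ks_gao, low_do)
```

At this low DO the PAO-like organism grows more than twice as fast as the GAO-like one, whereas at saturating DO the two rates converge toward the same `mu_max`, matching the thesis's qualitative finding that limited aeration favours PAOs.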

Relevance:

30.00%

Publisher:

Abstract:

Ontologies formalized by means of Description Logics (DLs) and rules in the form of Logic Programs (LPs) are two prominent formalisms in the field of Knowledge Representation and Reasoning. While DLs adhere to the Open World Assumption and are suited for taxonomic reasoning, LPs implement reasoning under the Closed World Assumption, so that default knowledge can be expressed. However, for many applications it is useful to have a means that allows reasoning over an open domain and expressing rules with exceptions at the same time. Hybrid MKNF knowledge bases make such a means available by formalizing DLs and LPs in a common logic, the Logic of Minimal Knowledge and Negation as Failure (MKNF). Since rules and ontologies are used in open environments such as the Semantic Web, inconsistencies cannot always be avoided. This poses a problem due to the Principle of Explosion, which holds in classical logics. Paraconsistent logics offer a solution to this issue by assigning meaningful models even to contradictory sets of formulas. Consequently, paraconsistent semantics for DLs and LPs have been investigated intensively. Our goal is to apply the paraconsistent approach to the combination of DLs and LPs in hybrid MKNF knowledge bases.

In this thesis, a new six-valued semantics for hybrid MKNF knowledge bases is introduced, extending the three-valued approach by Knorr et al., which is based on the well-founded semantics for logic programs. Additionally, a procedural way of computing paraconsistent well-founded models for hybrid MKNF knowledge bases by means of an alternating fixpoint construction is presented, and it is proven that the algorithm is sound and complete w.r.t. the model-theoretic characterization of the semantics. Moreover, it is shown that the new semantics is faithful w.r.t. well-studied paraconsistent semantics for DLs and LPs, respectively, and maintains the efficiency of the approach it extends.
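The alternating fixpoint construction can be sketched for plain propositional logic programs, where it computes the well-founded semantics that the thesis's approach builds on (the six-valued MKNF construction itself is considerably more involved and is not reproduced here):

```python
def gamma(rules, interp):
    """Γ operator: least model of the reduct of the program w.r.t.
    `interp`, the set of atoms currently assumed true. Rules are
    (head, positive_body, negative_body) with sets of atoms."""
    # reduct: drop rules whose negative body meets interp, erase negation
    reduct = [(h, pos) for (h, pos, neg) in rules if not (neg & interp)]
    model, changed = set(), True
    while changed:                      # least model of positive rules
        changed = False
        for h, pos in reduct:
            if pos <= model and h not in model:
                model.add(h)
                changed = True
    return model

def well_founded(rules, atoms):
    """Alternating fixpoint: iterate T = Γ(Γ(T)) from the empty set.
    Atoms in the final T are true, atoms outside Γ(T) are false,
    the rest are undefined."""
    true = set()
    while True:
        possible = gamma(rules, true)        # atoms not yet falsified
        new_true = gamma(rules, possible)
        if new_true == true:
            break
        true = new_true
    return true, atoms - possible

# program: p <- not q.   q <- not p.   r.   s <- not r.
rules = [("p", set(), {"q"}), ("q", set(), {"p"}),
         ("r", set(), set()), ("s", set(), {"r"})]
atoms = {"p", "q", "r", "s"}
true, false = well_founded(rules, atoms)
```

On this program the construction makes `r` true, `s` false, and leaves the mutually negating pair `p`/`q` undefined, which is exactly the three-valued behaviour the paraconsistent semantics refines further.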

Relevance:

30.00%

Publisher:

Abstract:

Real-time collaborative editing systems are common nowadays, and their advantages are widely recognized. Examples of such systems include Google Docs and ShareLaTeX, among others. This thesis aims to adopt this paradigm in a software development environment. The OutSystems visual language lends itself well to this kind of collaboration, since the visual code enables a natural flow of knowledge between developers regarding the developed code; furthermore, communication and coordination are simplified. This proposal explores collaboration over a very structured and rigid model, where collaboration has traditionally followed the copy-modify-merge paradigm, in which a developer gets a private copy from the shared repository, modifies it in isolation, and later uploads the changes to be merged with modifications concurrently produced by other developers. To this end, we designed and implemented an extension to the OutSystems Platform to enable real-time collaborative editing. The solution guarantees consistency among the artefacts distributed across several developers working on the same project. We believe that it is possible to achieve much more intense collaboration over the same models with a low negative impact on the individual productivity of each developer.
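The convergence problem at the heart of real-time editing is classically handled by operational transformation, the technique popularized by systems such as Google Docs. A minimal sketch for concurrent character insertions follows; it is illustrative only, and the OutSystems extension's actual consistency mechanism over visual models is not reproduced here:

```python
def transform(pos, other_pos, other_wins_tie):
    """Classic OT transform for inserts: shift an insertion position
    past a concurrent insertion that lands at or before it. Ties are
    broken deterministically (e.g. by site id) so both sites agree."""
    if other_pos < pos or (other_pos == pos and other_wins_tie):
        return pos + 1
    return pos

def apply_insert(text, pos, ch):
    """Insert character `ch` at index `pos`."""
    return text[:pos] + ch + text[pos:]

# both sites start from "ac"; site 1 inserts "b" at 1, site 2 inserts "d" at 2
base = "ac"
# site 1: apply own op, then site 2's op transformed against it
s1 = apply_insert(base, 1, "b")                      # local edit
s1 = apply_insert(s1, transform(2, 1, False), "d")   # remote op shifts right
# site 2: apply own op, then site 1's op transformed against it
s2 = apply_insert(base, 2, "d")                      # local edit
s2 = apply_insert(s2, transform(1, 2, True), "b")    # remote op unaffected
```

Both sites converge to the same text despite applying the two operations in opposite orders; this order-independence is what lets real-time editors skip the explicit merge step of copy-modify-merge.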