946 results for incremental computation
Abstract:
Dissertation for obtaining the Master's Degree in Informatics Engineering
Abstract:
Dissertation for obtaining the Master's Degree in Informatics Engineering
Abstract:
We present a calibrated model of the UK mobile telephony market with four mobile networks; calls to and from the fixed network; network-based price discrimination; and call externalities. Our results show that reducing mobile termination rates broadly in line with the recent European Commission Recommendation, to either pure long-run incremental cost, reciprocal termination charges with fixed networks, or Bill & Keep (i.e. zero termination rates), increases social welfare, consumer surplus and networks' profits. Depending on the strength of call externalities, social welfare may increase by as much as £990 million to £4.5 billion per year, with Bill & Keep leading to the highest increase in welfare. We also apply the model to estimate the welfare effects of the 2010 merger between Orange and T-Mobile under different scenarios concerning MTRs, and predict that consumer surplus decreases strongly.
Abstract:
Cardiac Resynchronization Therapy (CRT) provides significant benefits in functional class, ventricular function, hospitalization and mortality. It is a high-cost technique and, with current patient selection methods, the non-responder rate is around 30%. Objective: To understand whether including mechanical dyssynchrony (MD) in patient selection for CRT contributes to its cost-effectiveness, from the perspective of the Portuguese National Health Service. Methods: Prospective study based on 12-month historical cohorts of two groups who underwent CRT with a defibrillator: an intervention group of patients selected with the inclusion of MD (n=133) and a control group selected exclusively according to international guidelines (n=71). Clinical and cost data were collected over the 12 months following implantation to compute the incremental cost-effectiveness ratio (ICER). Results: At 12 months, survival was 91% in the intervention group and 93% in the control group (p=0.335). The intervention group had 60 readmissions for any cause at 12 months and the control group had 46 (p=0.032), with ICER=€6,886.09 per readmission avoided. The intervention group had 19 readmissions for decompensated Heart Failure (HF) at 12 months and the control group had 31 (p<0.001), with ICER=€2,686.26 per decompensated-HF readmission avoided. Regarding improvement in functional class and ejection fraction, no associations with costs could be established (p>0.05). Conclusion: It is safe to state that recommending selection with the inclusion of MD, in hospitals with the installed capacity, is a positive step toward reducing readmission costs for CRT patients.
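The cost-effectiveness figures above come down to a single ratio: the incremental cost divided by the incremental effect between the two selection strategies. As a minimal sketch of that calculation in Python, with the incremental cost figure being a hypothetical placeholder (the abstract reports only the resulting ratios):

```python
# Minimal sketch of an incremental cost-effectiveness ratio (ICER) calculation.
# ICER = (cost_intervention - cost_control) / (effect_intervention - effect_control)
# The cost figure is a hypothetical placeholder; the abstract reports only the
# resulting ratio (~2,686 EUR per decompensated-HF readmission avoided).

def icer(delta_cost: float, delta_effect: float) -> float:
    """Incremental cost per unit of incremental effect."""
    if delta_effect == 0:
        raise ValueError("No incremental effect: ICER is undefined.")
    return delta_cost / delta_effect

# Effect: decompensated-HF readmissions avoided over 12 months
# (31 in the control group vs. 19 in the intervention group).
delta_effect = 31 - 19          # 12 readmissions avoided
delta_cost = 32_235.12          # hypothetical incremental cost in EUR

print(f"ICER = {icer(delta_cost, delta_effect):.2f} EUR per readmission avoided")
```

The same pattern applies to the all-cause readmission ratio, substituting the corresponding counts and costs.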
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master's Degree in Economics from the NOVA – School of Business and Economics
Abstract:
Dissertation for obtaining the Degree of Doctor in Informatics
Abstract:
Dissertation for obtaining the Master's Degree in Civil Engineering - Structures Profile
Abstract:
Diffusion Kurtosis Imaging (DKI) is a fairly new magnetic resonance imaging (MRI) technique that tackles the non-Gaussian motion of water in biological tissues by taking into account the restrictions imposed by tissue microstructure, which are not considered in Diffusion Tensor Imaging (DTI), where water diffusion is assumed to be purely Gaussian. As a result, DKI provides more accurate information on biological structures and is able to detect important abnormalities that are not visible in standard DTI analysis. This work concerns the development of a tool for DKI computation to be implemented as an OsiriX plugin. Since OsiriX runs under Mac OS X, the program is written in Objective-C and also makes use of Apple's Cocoa framework. The whole program is developed in the Xcode integrated development environment (IDE). The plugin implements a fast heuristic constrained linear least squares algorithm (CLLS-H) for estimating the diffusion and kurtosis tensors, and offers the user the possibility to choose which maps are to be generated, covering not only standard DTI quantities such as Mean Diffusion (MD), Radial Diffusion (RD), Axial Diffusion (AD) and Fractional Anisotropy (FA), but also the DKI metrics Mean Kurtosis (MK), Radial Kurtosis (RK) and Axial Kurtosis (AK). The plugin was subjected to both a qualitative and a semi-quantitative analysis, which yielded convincing results. A more accurate validation process is still being developed, after which, and with a few minor adjustments, the plugin shall become a valid option for DKI computation.
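The tensor estimation step rests on the standard DKI signal model, ln S(b) = ln S0 − b·D_app + (1/6)·b²·D_app²·K_app along each gradient direction. The sketch below is not the plugin's CLLS-H algorithm or its Objective-C code; it is a minimal, unconstrained least-squares fit of that model along a single direction in Python/NumPy, with hypothetical b-values and signals, to show where the apparent diffusion and kurtosis estimates come from:

```python
import numpy as np

# Minimal sketch (not the plugin's CLLS-H algorithm): unconstrained linear
# least-squares fit of the DKI signal model along one gradient direction,
#   ln S(b) = ln S0 - b*D_app + (1/6) * b^2 * D_app^2 * K_app
# The b-values and signals below are hypothetical.

b = np.array([0.0, 500.0, 1000.0, 1500.0, 2000.0, 2500.0])   # s/mm^2
S = np.array([1.00, 0.62, 0.40, 0.27, 0.19, 0.14])           # normalized signal

# Design matrix for the unknowns [ln S0, D_app, (1/6) * D_app^2 * K_app]
A = np.column_stack([np.ones_like(b), -b, b ** 2])
coef, *_ = np.linalg.lstsq(A, np.log(S), rcond=None)

ln_S0, D_app, quad = coef
K_app = 6.0 * quad / D_app ** 2

print(f"D_app = {D_app:.4e} mm^2/s, K_app = {K_app:.3f}")
```

Fitting all gradient directions jointly, with the constraints that give CLLS-H its name, yields the full diffusion and kurtosis tensors from which maps such as MD, FA and MK are derived.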
Abstract:
Throughout the brain, patterns of activity in postsynaptic neurons influence the properties of synaptic inputs. Such feedback regulation is central to neural network stability that underlies proper information processing and feature representation in the central nervous system. At the cellular level, tight coupling of presynaptic and postsynaptic function is fundamental to neural computation and synaptic plasticity. The cohort of protein complexes at the pre- and postsynaptic membranes allows for tight synapse-specific segregation and integration of diverse molecular and electrical signals. (...)
Abstract:
Rupture of aortic aneurysms (AA) is a major cause of death in the Western world. Currently, the clinical decision on surgical intervention is based on the diameter of the aneurysm. However, this method is not fully adequate. Noninvasive assessment of the elastic properties of the arterial wall can be a better predictor of AA growth and rupture risk. The purpose of this study is to estimate mechanical properties of the aortic wall using in vitro inflation testing and 2D ultrasound (US) elastography, and to investigate the performance of the proposed methodology under physiological conditions. Two different inflation experiments were performed on twelve porcine aortas: 1) a static experiment over a large pressure range (0 – 140 mmHg); 2) a dynamic experiment closely mimicking the in vivo hemodynamics at physiological pressures (70 – 130 mmHg). 2D raw radiofrequency (RF) US datasets were acquired for one longitudinal and two cross-sectional imaging planes, for both experiments. The RF data were manually segmented and a 2D vessel wall displacement tracking algorithm was applied to obtain the aortic diameter–time behavior. The shear modulus G was estimated assuming a Neo-Hookean material model. In addition, an incremental study based on the static data was performed to: 1) investigate the changes in G for increasing mean arterial pressure (MAP), for a given pressure difference (30, 40, 50 and 60 mmHg); 2) compare the results with those from the dynamic experiment, for the same pressure range. The resulting shear modulus G was 94 ± 16 kPa for the static experiment, which is in agreement with the literature. A linear dependency of G on MAP was found, yet the effect of the pressure difference was negligible. The dynamic data revealed a G of 250 ± 20 kPa. For the same pressure range, the incremental shear modulus (Ginc) was 240 ± 39 kPa, which is in agreement with the former. In general, for all experiments, no significant differences in the values of G were found between different image planes. This study shows that 2D US elastography of aortas during inflation testing is feasible under controlled and physiological circumstances. In future studies, the in vivo dynamic experiment should be repeated for a range of MAPs and pathological vessels should be examined. Furthermore, the use of more complex material models needs to be considered to describe the non-linear behavior of the vascular tissue.
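For context on how an incremental stiffness can be derived from pressure-diameter pairs, the sketch below uses a simplified thin-walled, incompressible approximation (circumferential stress P·r/h, G = E/3); this is not the Neo-Hookean fitting procedure used in the study, and all geometry and pressure values are hypothetical:

```python
# Minimal sketch: incremental stiffness from pressure-diameter data using a
# simplified thin-walled, incompressible approximation. This is NOT the
# Neo-Hookean fitting procedure of the study; geometry and pressures below
# are hypothetical.

MMHG_TO_PA = 133.322

def incremental_shear_modulus(p1_mmhg, p2_mmhg, d1_mm, d2_mm, h_mm):
    """Incremental shear modulus G_inc = E_inc / 3 (incompressible material).

    Thin-wall approximation: circumferential stress  sigma = P * r / h,
    circumferential strain increment                 eps   = (r2 - r1) / r1.
    """
    r1, r2 = d1_mm / 2.0, d2_mm / 2.0
    delta_sigma = (p2_mmhg - p1_mmhg) * MMHG_TO_PA * r1 / h_mm  # Pa (mm cancels)
    delta_eps = (r2 - r1) / r1
    e_inc = delta_sigma / delta_eps
    return e_inc / 3.0

# Hypothetical example: 70 -> 130 mmHg, diameter 20.0 -> 21.5 mm, wall 2.0 mm
g = incremental_shear_modulus(70, 130, 20.0, 21.5, 2.0)
print(f"G_inc ~ {g / 1e3:.0f} kPa")
```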
Abstract:
Computational power is increasing day by day. Despite that, some tasks are still difficult or even impossible for a computer to perform. For example, while identifying a facial expression is easy for a human, for a computer it is an area still under development. To tackle this and similar issues, crowdsourcing has grown as a way to use human computation on a large scale. Crowdsourcing is a novel approach to collect labels in a fast and cheap manner, by sourcing the labels from the crowds. However, these labels lack reliability, since annotators are not guaranteed to have any expertise in the field. This fact has led to a new research area in which annotation models must be created or adapted to handle such weakly labeled data. Current techniques explore the annotators' expertise and the task difficulty as variables that influence label correctness. Other specific aspects are also considered by noisy-label analysis techniques. The main contribution of this thesis is the process to collect reliable crowdsourcing labels for a facial expressions dataset. This process consists of two steps: first, we design our crowdsourcing tasks to collect annotators' labels; next, we infer the true label from the collected labels by applying state-of-the-art crowdsourcing algorithms. At the same time, a facial expression dataset is created, containing 40,000 images and their respective labels. Finally, we publish the resulting dataset.
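As an illustration of the "infer the true label" step, the sketch below shows the simplest possible aggregation baseline, majority voting over the labels collected per image; the state-of-the-art algorithms applied in the thesis additionally model annotator expertise and task difficulty, and the data here are hypothetical:

```python
from collections import Counter

# Minimal sketch of the simplest aggregation baseline: majority voting over the
# labels collected per image. The thesis applies more elaborate state-of-the-art
# crowdsourcing algorithms that also model annotator expertise and task
# difficulty; this baseline only illustrates the label-inference step.
# The data below are hypothetical.

def majority_vote(labels_per_item):
    """Map item id -> most frequent label among its crowd annotations."""
    return {item: Counter(labels).most_common(1)[0][0]
            for item, labels in labels_per_item.items()}

collected = {
    "img_001": ["happy", "happy", "neutral"],
    "img_002": ["sad", "sad", "sad", "angry"],
    "img_003": ["surprised", "happy", "surprised"],
}

print(majority_vote(collected))
# {'img_001': 'happy', 'img_002': 'sad', 'img_003': 'surprised'}
```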
Abstract:
Since the invention of photography, humans have been using images to capture, store and analyse the events they are interested in. With the developments in this field, assisted by better computers, it is possible to use image processing technology as an accurate method of analysis and measurement. Image processing's principal qualities are flexibility, adaptability and the ability to easily and quickly process a large amount of information. Successful examples of applications can be seen in several areas of human life, such as biomedicine, industry, surveillance, the military and mapping; indeed, several Nobel Prizes are related to imaging. The accurate measurement of deformations, displacements, strain fields and surface defects is challenging in many material tests in Civil Engineering, because traditionally these measurements require complex and expensive equipment, plus time-consuming calibration. Image processing can be an inexpensive and effective tool for load-displacement measurements. Using an adequate image acquisition system and taking advantage of the computation power of modern computers, it is possible to accurately measure very small displacements with high precision. Several commercial software packages are already available on the market, but at a high cost. In this work, block-matching algorithms are used to compare the results from image processing with the data obtained with physical transducers during laboratory load tests. In order to test the proposed solutions, several load tests were carried out in partnership with researchers from the Civil Engineering Department at Universidade Nova de Lisboa (UNL).
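To illustrate the idea behind block matching, the sketch below locates a reference block from one frame inside a search window of the next frame by minimising the sum of absolute differences (SAD); it is a generic illustration rather than the specific algorithms used in this work, and the frames and parameters are hypothetical:

```python
import numpy as np

# Minimal sketch of block matching for displacement measurement: a reference
# block from frame 1 is searched for in frame 2 by minimising the sum of
# absolute differences (SAD) over a search window. Generic illustration only;
# frames and parameters are hypothetical.

def block_match(frame1, frame2, top_left, block=16, search=8):
    """Return the (dy, dx) displacement of one block between two grayscale frames."""
    y, x = top_left
    ref = frame1[y:y + block, x:x + block].astype(np.float64)
    best, best_disp = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            if yy < 0 or xx < 0 or yy + block > frame2.shape[0] or xx + block > frame2.shape[1]:
                continue
            cand = frame2[yy:yy + block, xx:xx + block].astype(np.float64)
            sad = np.abs(ref - cand).sum()
            if sad < best:
                best, best_disp = sad, (dy, dx)
    return best_disp

# Hypothetical usage: a synthetic frame shifted by (2, 3) pixels
rng = np.random.default_rng(0)
f1 = rng.integers(0, 256, (64, 64))
f2 = np.roll(f1, shift=(2, 3), axis=(0, 1))
print(block_match(f1, f2, top_left=(20, 20)))   # -> (2, 3)
```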
Abstract:
The purpose of this study is to contribute to the changing innovation management literature by providing an overview of different innovation types and organizational complexity factors. Aiming at a better understanding of effective innovation management, the study relates innovation and complexity to the formulation of an innovation strategy and further explores the interaction between different innovation types. The chosen approach is to review the existing literature on different innovation types and organizational complexity factors in order to design a survey which allows for statistical measurement of their interactions and relationships to innovation strategy formulation. The findings demonstrate interactions between individual innovation types. Additionally, organizational complexity factors and different innovation types are significantly related to innovation strategy formulation. In particular, more closed innovation and incremental innovation positively influence the likelihood of innovation strategy formulation. Organizational complexity factors have an overall negative influence on innovation strategy formulation. In order to define best practices for innovation management and to guide managerial decision making, organizations need to be aware of the co-existence of different innovation types and formulate an innovation strategy to more closely align their innovation objectives.
Abstract:
Following the European Commission’s 2009 Recommendation on the Regulatory Treatment of Fixed and Mobile Termination Rates in the EU, the Portuguese regulatory authority (ANACOM) decided to reduce termination prices in mobile networks to their long-run incremental cost (LRIC). Nevertheless, no serious quantitative assessment of the potential effects of this decision was carried out. In this paper, we adapt and calibrate the Harbord and Hoernig (2014) model of the UK mobile telephony market to the Portuguese reality, and simulate the likely impact on consumer surplus, profits and welfare of four different regulatory approaches: pure LRIC, reciprocal termination charges with fixed networks, “bill & keep”, and asymmetric termination rates. Our results show that reducing MTRs does increase social welfare, profits and consumer surplus in the fixed market, but mobile subscribers are seriously harmed by this decision.
Abstract:
Digital businesses have become a major driver of economic growth and have seen an explosion of new startups. At the same time, the sector also includes mature enterprises that have become global giants in a relatively short period of time. Digital businesses have unique characteristics that make running and managing a digital business very different from running traditional offline businesses. Digital businesses respond to online users who are highly interconnected and networked. This enables a rapid flow of word of mouth, at a pace far greater than ever envisioned for traditional products and services. The relatively low cost of adding an incremental user has led to a variety of innovations in the pricing of digital products, including various forms of free and freemium pricing models. This thesis explores the unique characteristics and complexities of digital businesses and their implications for the design of digital business models and revenue models. The thesis proposes an Agent-Based Modeling Framework that can be used to develop simulation models of the complex dynamics of digital businesses and the interactions between users of a digital product. Such simulation models can be used for a variety of purposes, such as simple forecasting, analysing the impact of market disturbances, analysing the impact of changes in pricing models, and optimising pricing for maximum revenue generation or for a balance between growth in usage and revenue generation. These models can be developed for a mature enterprise with a long historical record of user growth as well as for early-stage enterprises without much historical data. Through three case studies, the thesis demonstrates the applicability of the Framework and its potential applications.
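As an illustration of the kind of simulation such a framework supports, the sketch below is a toy agent-based model of word-of-mouth driven adoption of a freemium product, with a conversion step from free to paid users; it is not the framework proposed in the thesis, and every parameter is hypothetical:

```python
import random

# Minimal sketch of an agent-based simulation of word-of-mouth driven adoption of
# a freemium digital product. This is an illustrative toy model, not the framework
# proposed in the thesis; every parameter below is hypothetical.

random.seed(42)

P_INNOVATE = 0.002    # spontaneous adoption probability per step
P_WOM = 0.00005       # extra adoption probability per current user ("word of mouth")
P_CONVERT = 0.01      # probability a free user upgrades to paid per step
PRICE = 5.0           # revenue per paid user per step

class Agent:
    def __init__(self):
        self.state = "non_user"   # -> "free" -> "paid"

agents = [Agent() for _ in range(10_000)]

for step in range(1, 25):
    users = sum(a.state != "non_user" for a in agents)
    p_adopt = P_INNOVATE + P_WOM * users
    for a in agents:
        if a.state == "non_user" and random.random() < p_adopt:
            a.state = "free"
        elif a.state == "free" and random.random() < P_CONVERT:
            a.state = "paid"
    paid = sum(a.state == "paid" for a in agents)
    if step % 6 == 0:
        print(f"step {step:2d}: users={users:5d} paid={paid:4d} revenue={paid * PRICE:8.2f}")
```

Varying the word-of-mouth, conversion and price parameters in such a model is one way to explore pricing and growth trade-offs before committing to a revenue model.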