975 results for profitability calculation
Abstract:
Myocardial perfusion gated single photon emission computed tomography (Gated-SPECT) imaging is used for the combined evaluation of myocardial perfusion and left ventricular (LV) function. However, standard Gated-SPECT protocols require long acquisition times for each study, so it is important to reduce the total duration of image acquisition as much as possible. It is known, though, that this reduction decreases the count statistics per projection and raises doubts about the validity of the functional parameters determined by Gated-SPECT. Such an analysis is difficult to carry out in real patients; for ethical, logistical and economic reasons, simulated studies are required. Objective: To evaluate the influence of the total number of counts acquired from the myocardium on the calculation of myocardial functional parameters (LVEF - left ventricular ejection fraction, EDV - end-diastolic volume, ESV - end-systolic volume) using routine software procedures.
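The relationship between the three functional parameters above is fixed by definition: LVEF is the fraction of the end-diastolic volume ejected per beat. A minimal sketch, not tied to any particular Gated-SPECT software package, with illustrative volumes:

```python
def ejection_fraction(edv_ml: float, esv_ml: float) -> float:
    """Left ventricular ejection fraction (%) from end-diastolic (EDV)
    and end-systolic (ESV) volumes: LVEF = (EDV - ESV) / EDV * 100."""
    if edv_ml <= 0 or esv_ml < 0 or esv_ml > edv_ml:
        raise ValueError("volumes must satisfy 0 <= ESV <= EDV, EDV > 0")
    return (edv_ml - esv_ml) / edv_ml * 100.0

# Hypothetical volumes, not taken from the study:
print(ejection_fraction(120.0, 48.0))  # 60.0
```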
Abstract:
ABSTRACT: Objectives – To determine the sensitivity and specificity of Diffusion-weighted imaging (DWI) and T2 Fluid-Attenuated Inversion Recovery (FLAIR) sequences in the assessment of white matter (WM) lesions, and to establish the extent to which they complement each other, in order to build a set of good practices for routine brain MRI. Methodology – Using a quantitative methodology, a retrospective analysis was carried out from which 30 patients were selected, 10 without pathology and 20 with pathology (2 with MS, 7 with leukoencephalopathy, 6 with microangiopathic disease and 5 with undefined white matter pathology). A sample of 60 images was obtained: 30 DWI-weighted images and 30 T2 FLAIR images. Using the Viewdex® software, three observers evaluated the images according to seven criteria: visibility, detection, homogeneity, location, lesion margins and dimensions, and diagnostic capability. From the results obtained, sensitivity and specificity were calculated using ROC curves, together with statistical analysis, namely the t-test, the Kappa agreement index and Pearson's correlation coefficient between the variables under study. Results – The sensitivity and specificity values obtained for the T2 FLAIR sequence were higher (0.915 and 0.038, respectively) than those for DWI (0.08 and 0.100, respectively). No significant population variances were found. A high linear correlation between the variables was obtained, with an r value between 0.8 and 0.99. Considerable inter-observer variability was also found. Conclusions – Given the low sensitivity and specificity values obtained for DWI, it is suggested that it should be included in the routine brain protocol as an aid to the differential diagnosis with other pathologies.
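The sensitivity and specificity figures reported above are the standard confusion-matrix ratios; a minimal sketch with hypothetical counts, not the study's data:

```python
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical reading of 30 images: 20 with lesions, 10 without.
sens, spec = sensitivity_specificity(tp=18, fn=2, tn=9, fp=1)
print(round(sens, 2), round(spec, 2))  # 0.9 0.9
```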
Abstract:
Dissertation submitted for the degree of Master in Electrical Engineering - Energy branch
Abstract:
Master's programme in Radiation Applied to Health Technologies.
Residential property loans and performance during property price booms: evidence from European banks
Abstract:
Understanding the performance of banks is of the utmost relevance because of the impact of this sector on economic growth and financial stability. Of all the assets that make up a bank's portfolio, residential mortgage loans constitute one of the main components. Using the dynamic panel data method, we analyse the influence of residential mortgage loans on bank profitability and risk, using a sample of 555 banks in the European Union (EU-15) over the period from 1995 to 2008. We find that banks with larger weights of residential mortgage loans show lower credit risk in good times. This result explains why banks rush to lend on property during booms, given the positive effect this has on credit risk. The results further show that credit risk and profitability are lower during the upturn in the residential property price cycle. They also reveal the existence of a non-linear relationship (U-shaped marginal effect), as a function of the bank's risk, between profitability and residential mortgage loan exposure. For banks with high credit risk, a large exposure to residential mortgage loans is associated with higher risk-adjusted profitability, through lower risk. For banks with moderate or low credit risk, the effects of higher residential mortgage loan exposure on risk-adjusted profitability are also positive or marginally positive.
Abstract:
Master's programme in Nuclear Medicine - Specialisation: Positron Emission Tomography.
Abstract:
Understanding the performance of banks is of the utmost importance due to the impact the sector may have on economic growth and financial stability. Residential mortgage loans constitute a large proportion of the portfolio of many banks and are one of the key assets in the determination of performance. Using a dynamic panel model, we analyse the impact of residential mortgage loans on bank profitability and risk, based on a sample of 555 banks in the European Union (EU-15), over the period from 1995 to 2008. We find that banks with larger weights in residential mortgage loans display lower credit risk in good market conditions. This result may explain why banks rush to lend on property during booms due to the positive effect it has on credit risk. The results also show that credit risk and profitability are lower during the upturn in the residential property cycle. Furthermore, the results reveal the existence of a non-linear relationship (U-shaped marginal effect), as a function of the bank's risk, between profitability and residential mortgage exposure. For those banks that have higher credit risk, a large exposure to residential loans is associated with increased risk-adjusted profitability, through a reduction in risk. For banks with a moderate to low credit risk, the impact of higher exposure on risk-adjusted profitability is also positive.
Abstract:
A replicate evaluation of increased micronucleus (MN) frequencies in peripheral lymphocytes of workers occupationally exposed to formaldehyde (FA) was undertaken to verify the observed effect and to determine scoring variability. May–Grünwald–Giemsa-stained slides were obtained from a previously performed cytokinesis-block micronucleus test (CBMNT) with 56 workers in anatomy and pathology laboratories and 85 controls. The first evaluation by one scorer (scorer 1) had led to a highly significant difference between workers and controls (3.96 vs 0.81 MN per 1000 cells). The slides were coded before re-evaluation and the code was broken after the complete re-evaluation of the study. A total of 1000 binucleated cells (BNC) were analysed per subject and the frequency of MN (in ‰) was determined. Slides were distributed equally and randomly between two scorers, so that the scorers had no knowledge of the exposure status. Scorer 2 (32 exposed, 36 controls) measured increased MN frequencies in exposed workers (9.88 vs 6.81). Statistical analysis with the two-sample Wilcoxon test indicated that this difference was not significant (p = 0.17). Scorer 3 (20 exposed, 46 controls) obtained a similar result, but slightly higher values for the comparison of exposed and controls (19.0 vs 12.89; p = 0.089). Combining the results of the two scorers (13.38 vs 10.22), a significant difference between exposed and controls (p = 0.028) was obtained when the stratified Wilcoxon test with the scorers as strata was applied. Interestingly, the re-evaluation of the slides led to clearly higher MN frequencies for exposed and controls compared with the first evaluation. Bland–Altman plots indicated that the agreement between the measurements of the different scorers was very poor, as shown by mean differences of 5.9 between scorer 1 and scorer 2 and 13.0 between scorer 1 and scorer 3. 
Calculation of the intra-class correlation coefficient (ICC) revealed that all scorer comparisons in this study were far from acceptable for the reliability of this assay. Possible implications for the use of the CBMNT in human biomonitoring studies are discussed.
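The Bland–Altman bias figures quoted above (mean differences of 5.9 and 13.0 between scorers) are the mean of the paired per-subject differences, usually reported together with 95% limits of agreement. A stdlib-only sketch with hypothetical scorer data, not the study's measurements:

```python
import math

def bland_altman(a, b):
    """Mean difference (bias) and 95% limits of agreement between two
    scorers' paired measurements on the same subjects."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical MN frequencies (per 1000 BNC) from two scorers:
scorer_a = [4.0, 6.0, 10.0, 12.0, 8.0]
scorer_b = [9.0, 13.0, 15.0, 20.0, 14.0]
bias, lo, hi = bland_altman(scorer_a, scorer_b)
print(round(bias, 1))  # -6.2
```

A bias far from zero with wide limits of agreement, as seen in the study, indicates systematic scorer differences on top of random disagreement.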
Abstract:
As is well known, competitive electricity markets require new computing tools for power companies operating in retail markets, in order to enhance the management of their energy resources. In recent years there has been an increase in renewable penetration in micro-generation, which is beginning to co-exist with the other existing power generation, giving rise to a new type of consumer. This paper develops a methodology to be applied to the management of aggregators. The aggregator establishes bilateral contracts with its clients in which the energy purchase and selling conditions are negotiated, not only in terms of prices but also of other conditions that allow more flexibility in the way generation and consumption are addressed. The aggregator agent needs a tool to support decision making in order to compose and select its customers' portfolio in an optimal way, for a given level of profitability and risk.
Abstract:
In real optimization problems, the analytical expression of the objective function is usually not known, nor are its derivatives, or they are too complex to use. In these cases it becomes essential to employ optimization methods where the calculation of the derivatives, or the verification of their existence, is not necessary: direct search methods, also called derivative-free methods, are one solution. When the problem has constraints, penalty functions are often used. Unfortunately, the choice of the penalty parameters is frequently very difficult, because most strategies for choosing them are heuristic. Filter methods appeared as an alternative to penalty functions. A filter algorithm introduces a function that aggregates the constraint violations and constructs a biobjective problem, in which a step is accepted if it reduces either the objective function or the constraint violation. This makes filter methods less parameter-dependent than penalty functions. In this work, we present a new direct search method, based on simplex methods, for general constrained optimization that combines the features of the simplex method and filter methods. This method does not compute or approximate any derivatives, penalty constants or Lagrange multipliers. The basic idea of the simplex filter algorithm is to construct an initial simplex and use it to drive the search. We illustrate the behaviour of our algorithm through some examples. The proposed methods were implemented in Java.
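The filter acceptance rule described above (a trial point is accepted only if no stored pair dominates it in both objective value and aggregated constraint violation) can be sketched as follows. This is an illustrative Python fragment, not the authors' Java implementation; the function names are hypothetical:

```python
def dominates(p, q):
    """p = (f, h) dominates q if p is no worse in both the objective f
    and the aggregated constraint violation h."""
    return p[0] <= q[0] and p[1] <= q[1]

def filter_accept(filter_entries, candidate):
    """Accept a trial point unless some filter entry dominates it.
    On acceptance, drop any entries the candidate now dominates and
    add the candidate to the filter."""
    if any(dominates(entry, candidate) for entry in filter_entries):
        return False, filter_entries
    kept = [e for e in filter_entries if not dominates(candidate, e)]
    return True, kept + [candidate]

# (f, h) pairs: objective value and aggregated constraint violation.
ok, flt = filter_accept([(5.0, 0.2), (3.0, 1.0)], (4.0, 0.5))
print(ok)  # True: the candidate trades objective against feasibility
```

The candidate is accepted here because it improves the objective relative to one entry and the violation relative to the other, which is exactly the biobjective trade-off the filter encodes.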
Abstract:
The filter method is a technique for solving nonlinear programming problems. A filter algorithm has two phases in each iteration: the first reduces a measure of infeasibility, while the second reduces the objective function value. In real optimization problems, the objective function is usually not differentiable or its derivatives are unknown. In these cases it becomes essential to use optimization methods where the calculation of the derivatives, or the verification of their existence, is not necessary: direct search methods, or derivative-free methods, are examples of such techniques. In this work we present a new direct search method, based on simplex methods, for general constrained optimization that combines the features of simplex and filter methods. This method neither computes nor approximates derivatives, penalty constants or Lagrange multipliers.
Abstract:
Purpose: We evaluated the association between risk of obesity in the Portuguese population and two obesity-related single-nucleotide gene polymorphisms: fat-mass and obesity-associated (FTO) rs9939609 and peroxisome proliferator-activated receptor gamma (PPARG) rs1801282. Patients and methods: A total of 194 Portuguese premenopausal female Caucasians aged between 18 and 50 years (95 with body mass index [BMI] ≥30 kg/m2, 99 controls with BMI 18.5–24.9 kg/m2) participated in this study. The association of the single-nucleotide polymorphisms with obesity was determined by odds ratio calculation with 95% confidence intervals. Results: Significant differences in allelic expression of FTO rs9939609 (P<0.05) were found between control and case groups, indicating a 2.5-fold higher risk for obesity in the presence of both risk alleles when comparing the control group with the entire obese group. A fourfold higher risk was found for subjects with class III obesity compared to those with classes I and II. No significant differences in BMI were found between the control and case groups for PPARG rs1801282 (P>0.05). Conclusion: For the first time, a study involving an adult Portuguese population shows that individuals harboring both risk alleles in the FTO gene locus are at higher risk for obesity, which is in agreement with what has been reported for other European populations.
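The odds ratios with 95% confidence intervals mentioned in the methods are conventionally computed from a 2×2 genotype-by-group table. A sketch using the Woolf (log) method and hypothetical counts, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table:
         a = exposed cases,   b = exposed controls,
         c = unexposed cases, d = unexposed controls.
    95% CI via the standard error of log(OR) (Woolf method)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical risk-allele carrier counts in cases vs controls:
or_, lo, hi = odds_ratio_ci(a=40, b=20, c=55, d=79)
print(round(or_, 2))  # 2.87
```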
Abstract:
Objective: To determine whether electrolipolysis can be an adjuvant method in Cardiac Rehabilitation (CR) for reducing abdominal fat in individuals with Cardiovascular Disease (CVD). Methods: Six individuals of both sexes from the CR class of the Centro Clínico de Gaia (CCG) were randomly assigned to two groups, a control group (CG) (n=3) and an experimental group (EG) (n=3). The EG underwent 8 sessions (twice a week) of microcurrent electrolipolysis 30 minutes before the CR class. Assessment tools included perimetry, skinfold measurement, bioimpedance, ultrasonography and blood analyses, as well as the Hospital Anxiety and Depression Scale (HADS), the International Physical Activity Questionnaire (IPAQ) and a Semi-Quantitative Food Frequency Questionnaire (QSQFA). Results: Based on the minimal detectable change (MDC), a reduction in circumference was found at the navel (3.7 cm and 1.5 cm) and belly (4.2 cm and 2.83 cm) in two EG individuals, as well as a reduction in the ultrasonography measurements (from 1.3 mm to 8.4 mm) in the EG. Conclusion: In this sample there was a tendency towards a reduction of localised abdominal fat, as shown by the perimetry and ultrasonography data.
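The minimal detectable change (MDC) used as the evaluation criterion above is commonly derived from the standard error of measurement (SEM); whether the study used exactly this MDC95 variant is an assumption. A sketch with hypothetical reliability figures:

```python
import math

def minimal_detectable_change(sd: float, icc: float) -> float:
    """MDC95 = 1.96 * sqrt(2) * SEM, with SEM = SD * sqrt(1 - ICC).
    A common formulation; sd and icc here are illustrative inputs, not
    the study's reliability data."""
    sem = sd * math.sqrt(1.0 - icc)
    return 1.96 * math.sqrt(2.0) * sem

# Hypothetical test-retest figures for a circumference measurement (cm):
print(round(minimal_detectable_change(sd=2.0, icc=0.9), 2))
```

A change larger than this threshold is then interpreted as a real change rather than measurement noise.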
Abstract:
Master's programme in Electrical and Computer Engineering
Abstract:
Value creation in the healthcare market as a differentiating factor for price negotiation and competitiveness in the context of a global crisis. At a time when the health sector is singled out as a critical cost area, it is increasingly difficult to steer healthcare contracting towards value for patients, that is, towards the outcomes achieved rather than the volume of care provided. The aim was to study value creation in the healthcare market as a differentiating factor for price negotiation and competitiveness in the context of an economic crisis. The operating results of a company providing home oxygen therapy services were compared under two different strategies: a direct price reduction, or price maintenance with value creation for the client. The proposals were then presented for online evaluation and voting by a group of 8 hospital managers. The value-based proposal (No. 2) shows better operating results (41%), although with higher costs. As for the voting on the proposals, and given the scenario presented, half of the managers chose proposal No. 1 (N=4) and the other half proposal No. 2 (N=4). However, most managers (N=7) considered proposal No. 1 the more competitive in a setting with multiple competing suppliers. It is concluded that, in the negotiation of healthcare contracts, a value-based proposal can ensure that prices are maintained. Nevertheless, if the economic recession persists and in a competitive scenario with several suppliers, this type of proposal may not be chosen, since it apparently does not bring immediate gains to the contracting institution.