961 results for Linear Models in Temporal Series
Abstract:
Dissertation presented to obtain the Degree of Doctor in Electrical and Computer Engineering – Digital and Perceptual Systems, at Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia
Abstract:
Binary operations on commutative Jordan algebras (CJAs) can be used to study interactions between sets of factors belonging to a pair of models in which one nests the other. Notably, from two CJAs these binary operations build a new CJA. So when we nest the treatments of one model in each treatment of another model, we can study the interactions between the sets of factors of the first and the second models.
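The abstract does not spell out the binary operations. One construction common in this literature (an assumption here, not a detail taken from the abstract) builds the principal basis of the nested model's CJA from Kronecker products of the principal bases of the two component algebras. The numpy sketch below checks that the products again satisfy the CJA axioms; the one-way bases and all names are illustrative.

```python
import numpy as np

def one_way_basis(n):
    """Principal basis of the CJA for a one-way model on n levels:
    the averaging projector and its orthogonal complement."""
    J = np.ones((n, n)) / n          # projects onto the overall mean
    return [J, np.eye(n) - J]        # symmetric, idempotent, orthogonal pair

def kronecker_basis(basis_a, basis_b):
    """Assumed binary operation: Kronecker products of the two principal
    bases, giving a candidate principal basis for the nested model's CJA."""
    return [np.kron(Qa, Pb) for Qa in basis_a for Pb in basis_b]

def is_principal_basis(mats, tol=1e-10):
    """Verify symmetry, idempotency, pairwise orthogonality, and that the
    basis sums to the identity."""
    n = mats[0].shape[0]
    ok = all(np.allclose(M, M.T, atol=tol) and np.allclose(M @ M, M, atol=tol)
             for M in mats)
    ok = ok and all(np.allclose(mats[i] @ mats[j], 0, atol=tol)
                    for i in range(len(mats))
                    for j in range(len(mats)) if i != j)
    return ok and np.allclose(sum(mats), np.eye(n), atol=tol)

A = one_way_basis(3)   # first model: 3 treatments
B = one_way_basis(4)   # second model: 4 treatments nested in each of the first
C = kronecker_basis(A, B)
print(is_principal_basis(A), is_principal_basis(B), is_principal_basis(C))
# -> True True True: two CJAs combine into a new CJA for the nested model
```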
Abstract:
Thesis submitted in fulfillment of the requirements for the Degree of Master in Biomedical Engineering
Abstract:
The liberalization of the electricity sector in mainland Portugal followed a methodology similar to that of most European countries, with the market being opened progressively. In monitoring the national electricity sector, it is therefore of particular interest to characterize the most recent evolution of the liberalized market, namely with respect to the price of electricity. Electricity price forecasting is a very important issue for all participants in the electricity market; given its importance, it has been the subject of several studies and a variety of methodologies have been proposed. This dissertation addresses the question using forecasting techniques, namely methods based on the history of the variable under study. According to some specialists, forecasts are one of the essential inputs managers develop to support decision making: virtually every relevant operational decision depends on a forecast. The electricity price forecasting model was built with Autoregressive Integrated Moving Average (ARIMA) models, which generate forecasts from the information contained in the time series itself. Since the aim is to evaluate the structure of the electricity price in the energy market, it is important to identify, from the set of candidate variables, those most closely related to the price. To this end, an exploratory analysis is carried out in parallel, based on the correlation between the electricity price and the other study variables, measured by the Pearson correlation coefficient, a measure of the degree and direction of the linear relation between two quantitative variables. The model was applied to the electricity price history since the start of the liberalized market, so as to obtain daily, monthly and annual price forecasts. The methodology proved efficient at obtaining solutions and fast enough to forecast the electricity price in a few seconds, serving as decision support in a market environment.
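A minimal sketch of the two techniques this abstract combines: Pearson correlations between the price and candidate explanatory variables, and an ARIMA fit on the price history for short-horizon forecasts. The data are synthetic, and the column names, the (1, 1, 1) order and the 7-day horizon are placeholders, not the dissertation's actual choices.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Assumed layout: a daily price history plus candidate explanatory variables.
rng = np.random.default_rng(0)
idx = pd.date_range("2015-01-01", periods=730, freq="D")
df = pd.DataFrame({
    "price": 50 + np.cumsum(rng.normal(0, 1, len(idx))),  # EUR/MWh (synthetic)
    "demand": rng.normal(8000, 500, len(idx)),             # MW (synthetic)
    "wind": rng.normal(2000, 400, len(idx)),               # MW (synthetic)
}, index=idx)

# Exploratory step: Pearson correlation between the price and the other
# variables, to identify which are most linearly related to it.
print(df.corr(method="pearson")["price"])

# Forecasting step: an ARIMA(p, d, q) fitted to the price series alone; the
# (1, 1, 1) order is a placeholder, normally chosen via ACF/PACF or AIC.
res = ARIMA(df["price"], order=(1, 1, 1)).fit()
print(res.forecast(steps=7))   # forecasts for the next week
```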
Abstract:
Dissertation to obtain the Degree of Master in Mathematics and Applications, Specialization in Actuarial Science, Statistics and Operational Research
Abstract:
Dissertation to obtain the Degree of Master in Chemical and Biochemical Engineering
Abstract:
Dissertation to obtain the Degree of Master in Chemical and Biochemical Engineering
Abstract:
In health-related research it is common to have multiple outcomes of interest in a single study. These outcomes are often analysed separately, ignoring the correlation between them. One would expect a multivariate approach to be a more efficient alternative to individual analyses of each outcome. Surprisingly, this is not always the case. In this article we discuss different settings of linear models and compare the multivariate and univariate approaches. We show that for linear regression models, the estimates of the regression parameters associated with covariates that are shared across the outcomes are the same for the multivariate and univariate models, while for outcome-specific covariates the multivariate model performs better in terms of efficiency.
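A small simulation of the first claim (the data-generating process and all names are assumptions for illustration): when every outcome shares the same design matrix, a joint multivariate least-squares fit reproduces the per-outcome univariate estimates exactly, even with correlated errors.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # intercept + shared covariate
B = np.array([[1.0, 2.0], [0.5, -1.0]])                 # true coefficients, 2 outcomes
E = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=n)  # correlated errors
Y = X @ B + E

# Multivariate least squares: one solve for both outcomes at once.
B_multi = np.linalg.lstsq(X, Y, rcond=None)[0]

# Univariate least squares: each outcome fitted separately.
B_uni = np.column_stack([np.linalg.lstsq(X, Y[:, j], rcond=None)[0]
                         for j in range(Y.shape[1])])

print(np.allclose(B_multi, B_uni))  # True: identical estimates for shared covariates
```

The efficiency gain the abstract reports appears only when an outcome has covariates the others lack; there a joint fit in the style of seemingly unrelated regressions can exploit the error correlation, whereas equation-by-equation least squares cannot.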
Abstract:
In this work, kriging with covariates is used to model and map the spatial distribution of salinity measurements gathered by an autonomous underwater vehicle in a sea outfall monitoring campaign, aiming to distinguish the effluent plume from the receiving waters and characterize its spatial variability in the vicinity of the discharge. Four different geostatistical linear models for salinity were assumed, where the distance to diffuser, the west-east positioning, and the south-north positioning were used as covariates. Sample variograms were fitted with Matérn models using weighted least squares and maximum likelihood estimation methods as a way to detect eventual discrepancies. The maximum likelihood method typically estimated very short ranges, which limited the kriging process. So, at least for these data sets, weighted least squares proved to be the most appropriate estimation method for variogram fitting. The kriged maps clearly show the spatial variation of salinity, and it is possible to identify the effluent plume in the area studied. The results obtained offer some guidelines for sewage monitoring when a geostatistical analysis of the data is intended. It is important to handle anomalous values properly and to adopt a sampling strategy that includes transects parallel and perpendicular to the effluent dispersion.
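A sketch of the variogram-fitting step only (the kriging itself is omitted): an empirical semivariogram of salinity residuals is fitted with a Matérn model by weighted least squares, weighting each lag bin by its pair count. The coordinates and residuals are synthetic, and the fixed smoothness, starting values and parameterization (geoR-style) are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import kv, gamma as gamma_fn

def matern_semivariogram(h, nugget, psill, phi, kappa=1.5):
    """Matérn semivariogram with fixed smoothness kappa (assumed here)."""
    h = np.asarray(h, float)
    g = np.full_like(h, nugget + psill)
    m = h > 0
    u = h[m] / phi
    corr = (u ** kappa) * kv(kappa, u) / (2 ** (kappa - 1) * gamma_fn(kappa))
    g[m] = nugget + psill * (1.0 - corr)
    return g

def empirical_semivariogram(coords, z, n_bins=12):
    """Bin half squared differences of residuals by separation distance."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    i, j = np.triu_indices(len(z), k=1)
    h, sq = d[i, j], 0.5 * (z[i] - z[j]) ** 2
    edges = np.linspace(0, h.max() / 2, n_bins + 1)
    which = np.digitize(h, edges) - 1
    keep = (which >= 0) & (which < n_bins)
    h, sq, which = h[keep], sq[keep], which[keep]
    lag = np.array([h[which == b].mean() for b in range(n_bins)])
    gam = np.array([sq[which == b].mean() for b in range(n_bins)])
    npairs = np.array([(which == b).sum() for b in range(n_bins)])
    return lag, gam, npairs

# Synthetic stand-in for salinity residuals after regressing on the covariates:
# a spatially correlated field simulated by a Cholesky factorization.
rng = np.random.default_rng(2)
coords = rng.uniform(0, 1000, size=(150, 2))                     # metres
D = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
C = 0.05 * np.eye(len(coords)) + 0.2 * np.exp(-D / 200.0)        # nugget + structure
z = np.linalg.cholesky(C) @ rng.standard_normal(len(coords))

lag, gam, npairs = empirical_semivariogram(coords, z)
# Weighted least squares: bins with more pairs get smaller sigma, larger weight.
p0 = [0.05, 0.2, 200.0]                                          # nugget, psill, range
params, _ = curve_fit(matern_semivariogram, lag, gam, p0=p0,
                      sigma=1.0 / np.sqrt(npairs), bounds=(0, np.inf))
print(dict(zip(["nugget", "psill", "phi"], params)))
```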
Abstract:
Human Bartonellosis has an acute phase characterized by fever and hemolytic anemia, and a chronic phase with bacillary angiomatosis-like lesions. This cross-sectional pilot study evaluated immunological patterns using pre- and post-treatment samples from patients with Human Bartonellosis. Patients between five and 60 years of age, from endemic areas in Peru, in the acute or chronic phase were included. In patients in the acute phase of Bartonellosis, a state of peripheral immune tolerance must be established for the infection to persist. We found that elevation of the anti-inflammatory cytokine IL-10 and numeric abnormalities of CD4+ and CD8+ T-lymphocyte counts correlated significantly with an unfavorable immune state. During the chronic phase, the elevated levels of IFN-γ and IL-4 observed in our series are consistent with previous findings of endothelial invasion by B. henselae in animal models.
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master's Degree in Finance from the NOVA – School of Business and Economics
Abstract:
Dissertation submitted to the Faculty of Sciences and Technology of the New University of Lisbon for the Master's Degree in Molecular Genetics and Biomedicine
Abstract:
Natural disasters are events that cause general and widespread destruction of the built environment and are becoming increasingly recurrent. They are a product of vulnerability and community exposure to natural hazards, generating a multitude of social, economic and cultural issues, of which the loss of housing and the subsequent need for shelter is one of the major consequences. Nowadays, numerous factors contribute to increased vulnerability and exposure to natural disasters, such as climate change, whose impacts are felt across the globe and which is currently seen as a worldwide threat to the built environment. The abandonment of disaster-affected areas can also push populations towards regions where natural hazards are felt more severely. Although several actors in the post-disaster scenario provide for shelter needs and recovery programs, housing is often inadequate and unable to resist the effects of future natural hazards. Resilient housing is commonly not addressed because of the urgency of sheltering affected populations; yet by neglecting exposure risks in construction, houses become vulnerable and are likely to be damaged or destroyed in future natural hazard events. It is therefore fundamental to include resilience criteria in housing, so that new houses can withstand the passage of time and natural disasters as safely as possible. This master thesis provides guiding principles for housing recovery after natural disasters, particularly in the form of flood-resilient construction, since floods are responsible for the largest number of natural disasters. To this purpose, the main structures that house affected populations were identified and analyzed in depth. After assessing the risks and damage that flood events can cause to housing, a methodology for flood-resilient housing models was proposed, identifying key criteria that housing should meet. The methodology is based on the US Federal Emergency Management Agency requirements and recommendations for specific flood zones. Finally, a case study in the Maldives, one of the countries most vulnerable to sea level rise resulting from climate change, was analyzed in light of housing recovery in a post-disaster scenario. The analysis applied the proposed methodology to assess the flood resilience of the housing newly built in the aftermath of the 2004 Indian Ocean Tsunami.
Abstract:
Introduction: This study aimed to analyze the relationship between the incidence of severe dengue during the 2008 epidemic in Rio de Janeiro, Brazil, and socioeconomic indicators, as well as indicators of health service availability and previous circulation of dengue virus serotype 3 (DENV-3). Methods: In this ecological study, the units of analysis were the districts of Rio de Janeiro. The data were incorporated into generalized linear models, with the incidence of severe dengue in each district as the outcome variable. Results: Districts with more cases of dengue fever in the 2001 epidemic and a higher percentage of residents who declared their skin color or race as black had higher incidence rates of severe dengue in the 2008 epidemic [incidence rate ratio (IRR) = 1.21; 95% confidence interval (95% CI) = 1.05-1.40 and IRR = 1.34; 95% CI = 1.16-1.54, respectively]. In contrast, districts with Family Health Strategy (FHS) clinics were more likely to have lower incidence rates of severe dengue in the 2008 epidemic (IRR = 0.81; 95% CI = 0.70-0.93). Conclusions: At the ecological level, our findings suggest the persistence of health inequalities in this region of Brazil, possibly due to greater social vulnerability among the self-declared black population. Additionally, the protective effect of FHS clinics may be due to easier access to other levels of care in the health system or to a reduced vulnerability to dengue transmission afforded by local health-promotion practices.
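A sketch of the modelling step. The article says only "generalized linear models"; the Poisson family, the log-population offset, the variable names and the synthetic data below are all assumptions. Exponentiated coefficients of such a model are read as incidence rate ratios, the quantity the article reports.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic district-level data standing in for the Rio de Janeiro districts.
rng = np.random.default_rng(3)
n = 160
df = pd.DataFrame({
    "population": rng.integers(5_000, 200_000, n),
    "dengue_2001_rate": rng.gamma(2.0, 1.0, n),   # cases in the 2001 epidemic
    "pct_black": rng.uniform(5, 60, n),           # self-declared skin color or race
    "has_fhs": rng.integers(0, 2, n),             # Family Health Strategy clinic
})
lam = np.exp(-8 + 0.15 * df.dengue_2001_rate + 0.02 * df.pct_black
             - 0.2 * df.has_fhs) * df.population
df["severe_cases"] = rng.poisson(lam)

# Poisson GLM with a log link and log-population offset: each exponentiated
# coefficient is an incidence rate ratio (IRR) with its confidence interval.
res = smf.glm("severe_cases ~ dengue_2001_rate + pct_black + has_fhs",
              data=df, family=sm.families.Poisson(),
              offset=np.log(df["population"])).fit()
irr = np.exp(res.params)
ci = np.exp(res.conf_int())
print(pd.concat([irr.rename("IRR"),
                 ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```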
Abstract:
Economics is a social science and, as such, focuses on people and on the decisions they make, be it in an individual context or in group situations. It studies human choices in the face of needs to be fulfilled and a limited amount of resources with which to fulfill them. For a long time, there was a convergence between the normative and positive views of human behavior, in that the ideal and the predicted decisions of agents in economic models were entangled in one single concept. That is, it was assumed that the best that could be done in each situation was exactly the choice that would prevail, or at least that the facts economics needed to explain could be understood in the light of models in which individual agents act as if they were able to make ideal decisions. In the last decades, however, the complexity of the environment in which economic decisions are made, and the limits on the ability of agents to deal with it, have been recognized and incorporated into models of decision making, in what came to be known as the bounded rationality paradigm. This was triggered by the incapacity of the unbounded rationality paradigm to explain observed phenomena and behavior. This thesis contributes to the literature in three different ways.

Chapter 1 is a survey of bounded rationality, which gathers and organizes the contributions to the field since Simon (1955) first recognized the necessity of accounting for the limits on human rationality. The survey focuses on theoretical work rather than on the experimental literature, which presents evidence of actual behavior that differs from what classic rationality predicts. The general framework is as follows: given a set of exogenous variables, the economic agent needs to choose an element from the choice set available to him, in order to optimize the expected value of an objective function (assuming his preferences are representable by such a function). If this problem is too complex for the agent to deal with, one or more of its elements is simplified. Each bounded rationality theory is categorized according to the most relevant element it simplifies.

Chapter 2 proposes a novel theory of bounded rationality. Much in the same fashion as Conlisk (1980) and Gabaix (2014), we assume that thinking is costly, in the sense that agents have to pay a cost for performing mental operations. In our model, if they choose not to think, that cost is avoided, but they are left with a single alternative, labeled the default choice. We exemplify the idea with a very simple model of consumer choice and identify the concept of isofin curves, i.e., sets of default choices which generate the same utility net of thinking cost. We then apply the idea to a linear symmetric Cournot duopoly, in which the default choice can be interpreted as the most natural quantity to be produced in the market. We find that, as the thinking cost increases, the number of firms thinking in equilibrium decreases. More interestingly, for intermediate levels of thinking cost, there exists an equilibrium in which one of the firms chooses the default quantity and the other best responds to it, generating asymmetric choices in a symmetric model. Our model is able to explain well-known regularities identified in the Cournot experimental literature, such as the adoption of different strategies by players (Huck et al., 1999), the intertemporal rigidity of choices (Bosch-Domènech & Vriend, 2003) and the dispersion of quantities in the context of difficult decision making (Bosch-Domènech & Vriend, 2003).

Chapter 3 applies a model of bounded rationality in a game-theoretic setting to the well-known turnout paradox: in large elections, pivotal probabilities vanish very quickly and no one should vote, in sharp contrast with the observed high levels of turnout. Inspired by the concept of rhizomatic thinking, introduced by Bravo-Furtado & Côrte-Real (2009a), we assume that each person is self-delusional in the sense that, when making a decision, she believes that a fraction of the people who support the same party decides alike, even if no communication is established between them. This kind of belief simplifies the decision of the agent, as it reduces the number of players he believes to be playing against; it is thus a bounded rationality approach. Studying a two-party first-past-the-post election with a continuum of self-delusional agents, we show that the turnout rate is positive in all the possible equilibria, and that it can be as high as 100%. The game displays multiple equilibria, at least one of which entails a victory of the bigger party. The smaller party may also win, provided its relative size is not too small; more self-delusional voters in the minority party decrease this threshold size. Our model is able to explain some empirical facts, such as the possibility that a close election leads to low turnout (Geys, 2006), a lower margin of victory when turnout is higher (Geys, 2006) and high turnout rates favoring the minority (Bernhagen & Marsh, 1997).
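The Cournot application in Chapter 2 lends itself to a numerical illustration. The sketch below assumes a specification the abstract does not fully pin down: inverse demand P = a - b(q1 + q2), zero production cost, a common default quantity q0, and a thinking cost c paid only by a firm that optimizes. For a grid of c values it checks which configurations (both think, one thinks, neither thinks) are Nash equilibria; all parameter values are illustrative.

```python
import numpy as np

a, b, q0 = 1.0, 1.0, 0.4   # demand intercept/slope and default quantity (assumed)

def profit(qi, qj):
    """Cournot profit with inverse demand P = a - b(qi + qj) and zero cost."""
    return max(a - b * (qi + qj), 0.0) * qi

def best_response(qj):
    return max((a - b * qj) / (2 * b), 0.0)

def equilibria(c):
    """Which thinking configurations are Nash equilibria at thinking cost c."""
    out = []
    # Both think: standard Cournot quantities q* = a / (3b).
    qc = a / (3 * b)
    if profit(qc, qc) - c >= profit(q0, qc):            # no gain from defaulting
        out.append("both think")
    # One thinks (best responds to q0) while the other defaults.
    qt = best_response(q0)
    thinker_ok = profit(qt, q0) - c >= profit(q0, q0)
    defaulter_ok = profit(q0, qt) >= profit(best_response(qt), qt) - c
    if thinker_ok and defaulter_ok:
        out.append("one thinks")
    # Neither thinks: both produce the default quantity q0.
    if profit(q0, q0) >= profit(best_response(q0), q0) - c:
        out.append("neither thinks")
    return out

for c in [0.0, 0.002, 0.005, 0.02, 0.05]:
    print(f"c = {c:5.3f}: {equilibria(c)}")
# As c rises the equilibria shift from both firms thinking, through the
# asymmetric one-thinker outcome, to neither firm thinking, matching the
# abstract's claim that higher thinking costs reduce equilibrium thinking.
```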