958 results for empirical models
Abstract:
INTRODUCTION: Malaria is a serious problem in the Brazilian Amazon region, and the detection of possible risk factors could be of great interest for public health authorities. The objective of this article was to investigate the association between environmental variables and the yearly registers of malaria in the Amazon region using Bayesian spatiotemporal methods. METHODS: We used Poisson spatiotemporal regression models to analyze the Brazilian Amazon forest malaria counts for the period from 1999 to 2008. In this study, we included some covariates that could be important in the yearly prediction of malaria, such as deforestation rate. We obtained the inferences using a Bayesian approach and Markov Chain Monte Carlo (MCMC) methods to simulate samples from the joint posterior distribution of interest. The discrimination of different models was also discussed. RESULTS: The model proposed here suggests that deforestation rate, the number of inhabitants per km², and the human development index (HDI) are important in the prediction of malaria cases. CONCLUSIONS: It is possible to conclude that human development, population growth, deforestation, and their associated ecological alterations are conducive to increasing malaria risk. We conclude that the use of Poisson regression models that capture the spatial and temporal effects under the Bayesian paradigm is a good strategy for modeling malaria counts.
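The Bayesian Poisson regression step can be sketched in miniature. The Python example below uses synthetic data, not the Amazon malaria registers, and fits a plain (non-spatial) Poisson regression with a random-walk Metropolis sampler; the covariates, priors and tuning constants are illustrative assumptions, not the authors' specification.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic yearly counts driven by two covariates (stand-ins for
# e.g. deforestation rate and inhabitants per km^2).
n = 400
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
beta_true = np.array([1.0, 0.6, -0.4])
y = rng.poisson(np.exp(X @ beta_true))

def log_post(beta):
    """Log posterior: Poisson log-likelihood plus N(0, 10^2) priors."""
    eta = X @ beta
    return np.sum(y * eta - np.exp(eta)) - np.sum(beta**2) / (2 * 10**2)

# Random-walk Metropolis sampling of the joint posterior
beta = np.zeros(3)
lp = log_post(beta)
draws = []
for it in range(6000):
    prop = beta + 0.05 * rng.normal(size=3)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
        beta, lp = prop, lp_prop
    if it >= 1000:                             # discard burn-in
        draws.append(beta.copy())

draws = np.array(draws)
print(draws.mean(axis=0))   # posterior means, near beta_true
```

A full spatiotemporal version would add structured random effects per municipality and year; this sketch only shows the MCMC mechanics.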
Abstract:
The paper studies what drives firms to voluntarily delist from capital markets and how firms’ behavior and fundamentals differ between public-to-private (PTP) transactions and M&A deals with listed corporations. Moreover, I study the relationship between the ownership percentage in controlling shareholders’ hands and cumulative returns around the delisting public announcement. I perform my tests for both the Italian and the US markets and compare the findings to better understand how the phenomenon works in these different institutional environments. Consistent with my expectations, I find that the likelihood of delisting is mainly related to size, underperformance and undervaluation, while shareholders are rewarded more when their companies are involved in PTP transactions than in M&As with public firms.
Abstract:
This paper analyses the boundaries of the simplified models used to represent the behavior of wind turbines in power system stability studies. Based on experimental measurements, the response of recent simplified (also known as generic) wind turbine models, currently being developed under International Standard IEC 61400-27, is compared with that of the complex detailed models developed by wind turbine manufacturers. This International Standard, whose Technical Committee was convened in October 2009, is focused on defining generic simulation models for both wind turbines (Part 1) and wind farms (Part 2). The results of this work provide an improved understanding of the usability of generic models for conducting power system simulations.
Abstract:
The development of human cell models that recapitulate hepatic functionality allows the study of metabolic pathways involved in toxicity and disease. The increased biological relevance, cost-effectiveness and high throughput of cell models can help increase the efficiency of drug development in the pharmaceutical industry. Recapitulation of liver functionality in vitro requires the development of advanced culture strategies to mimic in vivo complexity, such as 3D culture, co-cultures or biomaterials. However, complex 3D models are typically associated with poor robustness, limited scalability and limited compatibility with screening methods. In this work, several strategies were used to develop highly functional and reproducible spheroid-based in vitro models of human hepatocytes and HepaRG cells using stirred culture systems. In chapter 2, the isolation of human hepatocytes from resected liver tissue was implemented and a liver tissue perfusion method was optimized to improve hepatocyte isolation and aggregation efficiency, resulting in an isolation protocol compatible with 3D culture. In chapter 3, human hepatocytes were co-cultivated with mesenchymal stem cells (MSC) and the phenotype of both cell types was characterized, showing that MSC acquire a supportive stromal function and that hepatocytes retain differentiated hepatic functions, stability of drug metabolism enzymes and higher viability in co-cultures. In chapter 4, a 3D alginate microencapsulation strategy for the differentiation of HepaRG cells was evaluated and compared with the standard 2D DMSO-dependent differentiation, yielding higher differentiation efficiency, comparable levels of drug metabolism activity and significantly improved biosynthetic activity. The work developed in this thesis provides novel strategies for 3D culture of human hepatic cell models that are reproducible, scalable and compatible with screening platforms.
The phenotypic and functional characterization of the in vitro systems performed contributes to the state of the art of human hepatic cell models and can be applied to improve the efficiency of pre-clinical drug development, to model disease and, ultimately, to develop cell-based therapeutic strategies for liver failure.
Abstract:
The aim of this work project is to find a model able to accurately forecast the daily Value-at-Risk for the PSI-20 Index, independently of market conditions, in order to expand the empirical literature on the Portuguese stock market. Hence, two subsamples, representing more and less volatile periods, were modeled through unconditional and conditional volatility models, since volatility is the key driver of Value-at-Risk. All models were evaluated through Kupiec’s and Christoffersen’s tests, by comparing forecasts with actual results. Using an out-of-sample period of 204 observations, it was found that a GARCH(1,1) is an accurate model for our purposes.
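As a sketch of this approach, the following Python example simulates a GARCH(1,1) return series with assumed parameters (stand-ins for fitted PSI-20 values, which are not given here), forecasts one-day-ahead 99% VaR from the conditional volatility, and computes Kupiec's unconditional coverage statistic; Christoffersen's independence test is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a GARCH(1,1) return series with assumed parameters.
omega, alpha, beta = 0.05, 0.10, 0.85
T = 1000
r = np.empty(T)
sigma2 = np.empty(T)
sigma2[0] = omega / (1 - alpha - beta)        # unconditional variance
r[0] = np.sqrt(sigma2[0]) * rng.normal()
for t in range(1, T):
    sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]
    r[t] = np.sqrt(sigma2[t]) * rng.normal()

# One-day-ahead 99% VaR from the volatility forecast (sigma2[t] uses
# only information up to t-1, so it is a legitimate forecast).
z99 = 2.326                                    # N(0,1) 99% quantile
var99 = -z99 * np.sqrt(sigma2)                 # loss threshold on returns
violations = int(np.sum(r < var99))

# Kupiec unconditional coverage test: H0 is a 1% violation rate.
p, x = 0.01, violations
pi_hat = x / T
lr_uc = -2 * (((T - x) * np.log(1 - p) + x * np.log(p))
              - ((T - x) * np.log(1 - pi_hat) + x * np.log(pi_hat)))
print(violations, round(lr_uc, 3))  # reject H0 if lr_uc > 3.841 (chi2, 1 df)
```

In the paper's setting the GARCH parameters would be estimated by maximum likelihood on each subsample rather than assumed.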
Abstract:
This paper develops the model of Bicego, Grosso, and Otranto (2008) and applies Hidden Markov Models to predict market direction. The paper draws an analogy between financial markets and speech recognition, seeking inspiration from the latter to solve common issues in quantitative investing. Whereas previous works focus mostly on very complex modifications of the original Hidden Markov Model algorithm, this paper provides an innovative methodology by drawing on thoroughly tested, yet simple, speech recognition techniques. By grouping returns into sequences, Hidden Markov Models can predict market direction the same way they are used to identify phonemes in speech recognition. The model proves highly successful in identifying market direction but fails to consistently identify whether a trend is in place. All in all, the paper seeks to bridge the gap between speech recognition and quantitative finance and, even though the model is not fully successful, several refinements are suggested and the room for improvement is significant.
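The sequence-scoring idea borrowed from speech recognition can be sketched as follows: each market regime gets its own HMM, and a sequence of daily return signs is assigned to the regime whose model gives it the higher forward-algorithm likelihood, just as competing phoneme models score an acoustic frame sequence. The parameters below are hand-set illustrations, not the paper's trained models.

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the scaled forward algorithm."""
    alpha = pi * B[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        loglik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return loglik

# Two hand-set 2-state HMMs over the symbols {0: down day, 1: up day}.
pi = np.array([0.5, 0.5])
A = np.array([[0.8, 0.2],
              [0.2, 0.8]])
B_up = np.array([[0.4, 0.6],      # "up-market" model favors up days
                 [0.2, 0.8]])
B_down = np.array([[0.6, 0.4],    # mirrored "down-market" model
                   [0.8, 0.2]])

seq = [1, 1, 0, 1, 1, 1, 0, 1]    # a mostly-up run of daily return signs
score_up = forward_loglik(seq, pi, A, B_up)
score_down = forward_loglik(seq, pi, A, B_down)
print("up" if score_up > score_down else "down")
```

In practice the emission and transition matrices would be estimated with Baum-Welch on historical return sequences, one model per market regime.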
Abstract:
The difference between the statutory and effective tax rate for listed groups is a complex variable influenced by a variety of factors. This paper aims to analyze whether this difference exists for listed groups in the German market and tests which factors have an impact on it. Thus the sample consists of 130 corporations listed in the three major German stock indices. The findings suggest that the companies that pay less than the statutory rate clearly outweigh the ones that pay more, and that the income earned from associated companies has a significant impact on this difference.
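A toy calculation illustrates how largely tax-exempt income from associated companies opens a gap between the statutory and effective rates; the figures are invented, not drawn from the paper's sample of 130 corporations.

```python
# Hypothetical German group: statutory rate of 30%, pre-tax income of
# EUR 1000m of which EUR 200m is equity-method income from associates,
# assumed here to be fully tax-exempt for simplicity.
statutory_rate = 0.30
pretax_income = 1000.0
tax_expense = 0.30 * (1000.0 - 200.0)    # tax only on the taxable portion

effective_rate = tax_expense / pretax_income
gap = statutory_rate - effective_rate
print(round(effective_rate, 4), round(gap, 4))   # 0.24 0.06
```

The group's effective rate falls 6 percentage points below the statutory rate purely because of the associate income, which is the kind of driver the paper tests for.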
Abstract:
The life of humans and most living beings depends on sensation and perception for the best assessment of the surrounding world. Sensorial organs acquire a variety of stimuli that are interpreted and integrated in our brain for immediate use or stored in memory for later recall. Among other reasoning tasks, a person has to decide what to do with available information. Emotions classify collected information, assigning a personal meaning to objects, events and individuals, and form part of our own identity. Emotions play a decisive role in cognitive processes such as reasoning, decision-making and memory by assigning relevance to collected information. Access to pervasive computing devices, empowered by the ability to sense and perceive the world, provides new forms of acquiring and integrating information. But before data can be assessed for its usefulness, systems must capture it and ensure that it is properly managed for diverse possible goals. Portable and wearable devices are now able to gather and store information from the environment and from our body, using cloud-based services and Internet connections. The limitations of such systems in handling sensorial data, compared with our own sensorial capabilities, constitute one identified problem. Another is the lack of interoperability between humans and devices, as devices do not properly understand human emotional states and needs. Addressing these problems is the motivation for the present research work. The mission hereby assumed is to include sensorial and physiological data in a Framework able to manage collected data in support of human cognitive functions, supported by a new data model.
By learning from selected human functional and behavioural models and reasoning over collected data, the Framework aims to evaluate a person’s emotional state, empowering human-centric applications, along with the capability of storing episodic information on a person’s life, with physiological indicators of emotional states, to be used by new-generation applications.
Abstract:
Natural disasters are events that cause general and widespread destruction of the built environment and are becoming increasingly recurrent. They are a product of vulnerability and community exposure to natural hazards, generating a multitude of social, economic and cultural issues, of which the loss of housing and the subsequent need for shelter is one of the major consequences. Nowadays, numerous factors contribute to increased vulnerability and exposure to natural disasters, such as climate change, whose impacts are felt across the globe and which is currently seen as a worldwide threat to the built environment. The abandonment of disaster-affected areas can also push populations to regions where natural hazards are felt more severely. Although several actors in the post-disaster scenario provide for shelter needs and recovery programs, housing is often inadequate and unable to resist the effects of future natural hazards. Resilient housing is commonly not addressed because of the urgency of sheltering affected populations. However, by neglecting exposure risks in construction, houses become vulnerable and are likely to be damaged or destroyed in future natural hazard events. It therefore becomes fundamental to include resilience criteria in housing, which in turn will allow new houses to better withstand the passage of time and natural disasters in the safest way possible. This master's thesis is intended to provide guiding principles for housing recovery after natural disasters, particularly in the form of flood-resilient construction, considering that floods are responsible for the largest number of natural disasters. To this purpose, the main structures that house affected populations were identified and analyzed in depth. After assessing the risks and damages that flood events can cause to housing, a methodology for flood-resilient housing models was proposed, identifying key criteria that housing should meet.
The methodology is based on the US Federal Emergency Management Agency requirements and recommendations for specific flood zones. Finally, a case study in the Maldives, one of the countries most vulnerable to sea level rise resulting from climate change, was analyzed in light of housing recovery in a post-disaster scenario. The analysis applied the proposed methodology to assess the flood resilience of the housing newly built in the aftermath of the 2004 Indian Ocean Tsunami.
Abstract:
Enterprise Risk Management (ERM) is gaining relevance among financial and non-financial companies, but its benefits remain uncertain. This paper investigates the relationship between ERM adoption and firm performance based on a sample of 1130 non-financial companies belonging to the STOXX® index. A content analysis of individual accounts is performed to distinguish adopters, and a regression analysis explores the effect of ERM adoption on firm performance, proxied by Tobin’s Q. The findings suggest a statistically significant positive effect of ERM adoption on firm performance, meaning that firms are benefiting from the implementation of this process.
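The regression step can be sketched as follows; the data are synthetic (not the STOXX® sample), and the ERM dummy, size control and coefficients are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic firm-level data: Tobin's Q regressed on an ERM-adoption
# dummy (from the content analysis) plus one size control.
n = 1130
erm = rng.integers(0, 2, n)              # 1 = ERM adopter
size = rng.normal(8, 1, n)               # e.g. log total assets
q = 1.0 + 0.15 * erm - 0.05 * size + rng.normal(0, 0.3, n)

# OLS fit and classical standard errors
X = np.column_stack([np.ones(n), erm, size])
beta, *_ = np.linalg.lstsq(X, q, rcond=None)
resid = q - X @ beta
s2 = resid @ resid / (n - X.shape[1])
se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))
t_erm = beta[1] / se[1]
print(round(beta[1], 3), round(t_erm, 2))   # ERM premium and t-statistic
```

A significant positive coefficient on the dummy is what the paper reports as the ERM performance effect; a real specification would add further controls and industry or country fixed effects.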
Abstract:
In this research we conducted a mixed-methods study, using qualitative and quantitative analysis, of the relationship between mobile advertising and mobile app user acquisition and of the conclusions companies can derive from it. Data was gathered from the management of mobile advertising campaigns for a portfolio of three different mobile apps. We found that a number of implications can be extracted from this intersection, namely for product development, internationalisation and the management of marketing budgets. We propose further research on alternative app user sources, on the impact of revenue on apps, and on the exploitation of two product segments: wearable technology and the Internet of Things.
Abstract:
ABSTRACT: We have witnessed impressive development in clinical analysis laboratories, which must provide excellent service at increasingly competitive costs. Quality management systems play a significant role in this evolution, mainly through the pursuit of continuous improvement, which occurs not only at the level of processes and techniques but also in the qualification of the various stakeholders. A key problem in managing a laboratory is the elimination of waste and errors while creating benefits, a core concept of the Lean Thinking philosophy, so it is essential to be able to monitor critical tasks systematically. In a laboratory increasingly focused on the user, this monitoring can be accomplished through information systems and technologies, which make it possible to track the number of clients, peak times, average waiting-room stay, average turnaround time for test results, results delivered after the expected date, and other data that support decision-making. Complaints and user satisfaction should also be analyzed, both through the feedback relayed by employees and through satisfaction questionnaires. Two models were mainly used: one proposed by the European Customer Satisfaction Index (ECSI), directed at the client, and the Common Assessment Framework (CAF), used in both the client and employee surveys.
Two questionnaires were introduced in digital format, one at a collection center, through an electronic kiosk, and another on the laboratory's web page, both as alternatives to the existing paper questionnaire; the results were analyzed and conclusions drawn. A questionnaire for employees was also proposed and developed, intended to provide useful data for decision support, given the importance of employees in customer interaction and in quality assurance throughout the whole clinical process. The results were evaluated globally, although internal company policy prevented their publication, and some benefits of this questionnaire were discussed empirically. The main goals of this study were to implement electronic satisfaction questionnaires and analyze the results, comparing them with the ECSI study, in order to emphasize the importance of analyzing professional motivation and customer satisfaction simultaneously, with the aim of improving decision support systems.
Abstract:
This research is titled “The Future of Airline Business Models: Which Will Win?” and is part of the requirements for the award of a Master’s in Management from NOVA SBE and another from Luiss Guido Carli University. The purpose is to elaborate a complete market analysis of the European air transportation industry in order to predict which airlines, strategies and business models may be successful in the coming years. First, an extensive literature review of the business model concept was conducted. Then, a detailed overview of the main European airlines and the strategies they have been implementing so far was developed. Finally, the research is illustrated with three case studies.
Abstract:
This project estimates the direct rebound effect for electricity demand in Portugal. While we find evidence of such an effect, the estimates also reflect the institutional arrangement that has characterized the electricity market in the country. Issues related to the promotion of energy efficiency are also addressed more generally, putting the case study into context.
Abstract:
The aim of this research is to investigate whether a celebrity can act as a mediator between two brands, so that a negative event affecting one brand can spill over to a completely unrelated brand that shares only the celebrity endorser with the first. Even though celebrity endorsement is a popular marketing strategy and celebrities often endorse multiple brands, so far there has been no systematic study on this topic. Drawing on Associative Network Theory and the Meaning Transfer Model as its theoretical framework, this research finds that negative publicity about a brand can spill over and thereby hurt consumers’ attitudes not only toward the celebrity endorser but also toward a second brand endorsed by the same celebrity. An unexpected finding is that celebrities can act as a protective shield for brands by weakening the direct impact of negative publicity.