876 results for Markov Model Estimation
Abstract:
This paper analyses, through a dynamic panel data model, the impact of the Financial Crisis and the European Debt Crisis on the equity returns of the banking system. The model is also extended to specifically investigate the impact on countries that received rescue packages. The sample under analysis covers eleven countries from January 2006 to June 2013. The main conclusion is that there was in fact a structural change in banks’ excess returns due to the outbreak of the European Debt Crisis, when stock markets were still recovering from the Financial Crisis of 2008.
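As a hedged illustration of the structural-break idea in this abstract, the sketch below fits a pooled OLS with a post-break dummy on synthetic bank excess returns. The data, break date, and shift size are all invented for the example, and the paper's actual estimator is a dynamic panel model, not this simplification.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical monthly excess returns for N banks over T months,
# with a level shift after an assumed debt-crisis break date.
N, T, break_t = 11, 90, 48
shift = -0.8                          # assumed post-break drop in mean excess return
y = rng.normal(0.3, 2.0, size=(N, T))
y[:, break_t:] += shift

# Pooled OLS of returns on an intercept and a post-break dummy:
# y_it = a + b * D_t + e_it, where D_t = 1 after the break.
D = (np.arange(T) >= break_t).astype(float)
X = np.column_stack([np.ones(N * T), np.tile(D, N)])
beta, *_ = np.linalg.lstsq(X, y.ravel(), rcond=None)
a_hat, b_hat = beta
print(f"pre-break mean: {a_hat:.2f}, structural shift: {b_hat:.2f}")
```

A significant negative `b_hat` is what "structural change in banks' excess returns" would look like in this stripped-down setting.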
Abstract:
Hand gestures are a powerful means of human communication, with many potential applications in the area of human-computer interaction. Vision-based hand gesture recognition techniques have many proven advantages compared with traditional devices, giving users a simpler and more natural way to communicate with electronic devices. This work proposes a generic system architecture based on computer vision and machine learning, able to be used with any interface for human-computer interaction. The proposed solution is mainly composed of three modules: a pre-processing and hand segmentation module, a static gesture interface module and a dynamic gesture interface module. The experiments showed that the core of vision-based interaction systems could be the same for all applications and thus facilitate the implementation. For hand posture recognition, an SVM (Support Vector Machine) model was trained and used, achieving a final accuracy of 99.4%. For dynamic gestures, an HMM (Hidden Markov Model) was trained for each gesture that the system could recognize, with a final average accuracy of 93.7%. The proposed solution has the advantage of being generic, with the trained models able to work in real time, allowing its application in a wide range of human-machine applications. To validate the proposed framework, two applications were implemented. The first one is a real-time system able to interpret the Portuguese Sign Language. The second one is an online system able to help a robotic soccer game referee judge a game in real time.
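The per-gesture HMM scheme described above can be sketched with the scaled forward algorithm: one HMM per gesture, and the model with the highest log-likelihood wins. All parameters, the two gesture names, and the three-symbol observation alphabet below are toy values, not the paper's trained models.

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM
    (pi: initial probs, A: transitions, B: emissions) via the scaled
    forward algorithm."""
    alpha = pi * B[:, obs[0]]
    ll = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        s = alpha.sum()
        ll += np.log(s)
        alpha /= s
    return ll

# Two toy 2-state HMMs standing in for two trained gesture models.
pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1], [0.1, 0.9]])
B_left  = np.array([[0.8, 0.1, 0.1], [0.1, 0.8, 0.1]])   # "swipe left" model
B_right = np.array([[0.1, 0.1, 0.8], [0.1, 0.8, 0.1]])   # "swipe right" model

seq = [0, 0, 1, 1, 1]  # a quantized feature sequence for one observed gesture
scores = {"left": forward_loglik(seq, pi, A, B_left),
          "right": forward_loglik(seq, pi, A, B_right)}
print(max(scores, key=scores.get))
```

In a real system the quantized sequence would come from the hand segmentation module, and each HMM's parameters from Baum-Welch training.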
Abstract:
Background: Statins have proven efficacy in the reduction of cardiovascular events, but the financial impact of their widespread use can be substantial. Objective: To conduct a cost-effectiveness analysis of three statin dosing schemes from the perspective of the Brazilian Unified National Health System (SUS). Methods: We developed a Markov model to evaluate the incremental cost-effectiveness ratios (ICERs) of low, intermediate and high intensity dose regimens in secondary prevention and four primary prevention scenarios (5%, 10%, 15% and 20% ten-year risk) of cardiovascular events. Regimens with expected low-density lipoprotein cholesterol reduction below 30% (e.g. simvastatin 10mg) were considered low dose; between 30-40% (atorvastatin 10mg, simvastatin 40mg), intermediate dose; and above 40% (atorvastatin 20-80mg, rosuvastatin 20mg), high-dose statins. Effectiveness data were obtained from a systematic review with 136,000 patients. National data were used to estimate utilities and costs (expressed as International Dollars - Int$). A willingness-to-pay (WTP) threshold equal to the Brazilian gross domestic product per capita (circa Int$11,770) was applied. Results: Low dose was excluded by extended dominance in the primary prevention scenarios. In the five scenarios, the ICER of intermediate dose was below Int$10,000 per QALY. The ICER of high versus intermediate dose was above Int$27,000 per QALY in all scenarios. In the cost-effectiveness acceptability curves, intermediate dose had a probability above 50% of being cost-effective at thresholds between Int$9,000-20,000 per QALY in all scenarios. Conclusions: Considering a reasonable WTP threshold, intermediate dose statin therapy is economically attractive, and should be a priority intervention in the prevention of cardiovascular events in Brazil.
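The Markov cohort mechanics behind an ICER like the one above can be sketched in a few lines. Every transition probability, cost, and utility below is an invented placeholder, not a Brazilian input from the paper; the point is only the structure: run two strategies through the same state model, accumulate discounted costs and QALYs, and divide the increments.

```python
import numpy as np

# Three-state annual-cycle Markov cohort: well / post-event / dead.
# All numbers here are illustrative placeholders.
P_no_statin = np.array([[0.96, 0.03, 0.01],
                        [0.00, 0.93, 0.07],
                        [0.00, 0.00, 1.00]])
P_statin    = np.array([[0.975, 0.016, 0.009],
                        [0.00,  0.94,  0.06],
                        [0.00,  0.00,  1.00]])
cost    = {"no_statin": np.array([0.0, 2000.0, 0.0]),
           "statin":    np.array([150.0, 2150.0, 0.0])}   # drug cost added per cycle
utility = np.array([0.85, 0.70, 0.0])                     # QALY weight per state

def run(P, c, cycles=30, disc=0.05):
    x = np.array([1.0, 0.0, 0.0])        # cohort starts in "well"
    tot_cost = tot_qaly = 0.0
    for t in range(cycles):
        d = (1 + disc) ** -t             # discount factor for cycle t
        tot_cost += d * x @ c
        tot_qaly += d * x @ utility
        x = x @ P                        # advance one annual cycle
    return tot_cost, tot_qaly

c0, q0 = run(P_no_statin, cost["no_statin"])
c1, q1 = run(P_statin, cost["statin"])
icer = (c1 - c0) / (q1 - q0)             # incremental cost per QALY gained
print(f"ICER: {icer:.0f} per QALY")
```

The published model is of course richer (more states, age-dependent risks, scenario-specific inputs), but the ICER arithmetic is the same.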
Abstract:
My thesis consists of three essays in which I consider equilibrium asset prices and investment strategies when the market is likely to experience crashes and possibly sharp windfalls. Although each part is written as an independent and self-contained article, the papers share a common behavioral approach in representing investors’ preferences regarding extremal returns. Investors’ utility is defined over their relative performance rather than over their final wealth position, a method first proposed by Markowitz (1952b) and by Kahneman and Tversky (1979), which I extend to incorporate preferences over extremal outcomes. With the failure of the traditional expected utility models in reproducing the observed stylized features of financial markets, the Prospect theory of Kahneman and Tversky (1979) offered the first significant alternative to the expected utility paradigm by considering that people focus on gains and losses rather than on final positions. Under this setting, Barberis, Huang, and Santos (2000) and McQueen and Vorkink (2004) were able to build a representative agent optimization model whose solution reproduced some of the observed risk premium and excess volatility. The research in behavioral finance is relatively new and its potential is still largely unexplored. The three essays composing my thesis propose to use and extend this setting to study investors’ behavior and investment strategies in a market where crashes and sharp windfalls are likely to occur. In the first paper, the preferences of a representative agent, relative to time-varying positive and negative extremal thresholds, are modelled and estimated. A new utility function that reconciles expected utility maximization with tail-related performance measures is proposed. The model estimation shows that the representative agent’s preferences reveal a significant level of crash aversion and lottery-pursuit.
Assuming a single risky asset economy, the proposed specification is able to reproduce some of the distributional features exhibited by financial return series. The second part proposes and illustrates a preference-based asset allocation model taking into account investors’ crash aversion. Using the skewed t distribution, optimal allocations are characterized as a tradeoff between the distribution’s four moments. The specification highlights the preference for odd moments and the aversion to even moments. Optimal portfolios are analyzed qualitatively in terms of firm characteristics, and in a setting that reflects real-time asset allocation, a systematic over-performance is obtained compared to the aggregate stock market. Finally, in my third article, dynamic option-based investment strategies are derived and illustrated for investors exhibiting downside loss aversion. The problem is solved in closed form when the stock market exhibits stochastic volatility and jumps. The specification of downside loss averse utility functions allows the corresponding terminal wealth profiles to be expressed as options on the stochastic discount factor, contingent on the loss aversion level. Therefore, the dynamic strategies reduce to a replicating portfolio built from exchange-traded, well-selected options and the risky stock.
Abstract:
While much of the literature on cross-section dependence has focused mainly on estimation of the regression coefficients in the underlying model, estimation and inference on the magnitude and strength of spill-overs and interactions have been largely ignored. At the same time, such inferences are important in many applications, not least because they carry structural interpretations of the strength of any interactions. In this paper we propose GMM methods designed to uncover underlying (hidden) interactions in social networks and committees. Special attention is paid to the interval-censored regression model. Our methods are applied to a study of committee decision making within the Bank of England’s monetary policy committee.
Abstract:
This paper uses an infinite hidden Markov model (IHMM) to analyze U.S. inflation dynamics with a particular focus on the persistence of inflation. The IHMM is a Bayesian nonparametric approach to modeling structural breaks. It allows for an unknown number of breakpoints and is a flexible and attractive alternative to existing methods. We find a clear structural break during the recent financial crisis. Prior to that, inflation persistence was high and fairly constant.
Abstract:
In this paper we present a novel structure from motion (SfM) approach able to infer 3D deformable models from uncalibrated stereo images. Using a stereo setup dramatically improves the 3D model estimation when the observed 3D shape is mostly deforming without undergoing strong rigid motion. Our approach first calibrates the stereo system automatically and then computes a single metric rigid structure for each frame. Afterwards, these 3D shapes are aligned to a reference view using a RANSAC method in order to compute the mean shape of the object and to select the subset of points on the object which have remained rigid throughout the sequence without deforming. The selected rigid points are then used to compute frame-wise shape registration and to extract the motion parameters robustly from frame to frame. Finally, all this information is used in a global optimization stage with bundle adjustment, which allows us to refine the frame-wise initial solution and also to recover the non-rigid 3D model. We show results on synthetic and real data that demonstrate the performance of the proposed method even when there is no rigid motion in the original sequence.
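The RANSAC alignment step above, selecting the points that stayed rigid between two 3D shapes, can be sketched with a minimal-sample Kabsch fit. The synthetic point sets, the rotation, the noise level, and the inlier threshold below are all invented for the illustration; the paper's full pipeline (calibration, registration, bundle adjustment) is not reproduced here.

```python
import numpy as np

def kabsch(P, Q):
    """Best-fit rotation R and translation t mapping P onto Q (least squares)."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1, 1, d]) @ U.T   # guard against reflections
    return R, cQ - R @ cP

def ransac_rigid(P, Q, n_iter=200, thresh=0.05, seed=0):
    """Return a boolean mask of the points related by a single rigid motion."""
    rng = np.random.default_rng(seed)
    best = np.zeros(len(P), dtype=bool)
    for _ in range(n_iter):
        idx = rng.choice(len(P), 3, replace=False)   # minimal sample
        R, t = kabsch(P[idx], Q[idx])
        inliers = np.linalg.norm((P @ R.T + t) - Q, axis=1) < thresh
        if inliers.sum() > best.sum():
            best = inliers
    return best

# Synthetic frame pair: 30 rigid points plus 10 deforming ones.
rng = np.random.default_rng(1)
P = rng.normal(size=(40, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
Q = P @ R_true.T + np.array([0.5, -0.2, 0.1])
Q[30:] += rng.normal(scale=0.3, size=(10, 3))   # non-rigid deformation
mask = ransac_rigid(P, Q)
print(mask[:30].sum(), "rigid points kept")
```

The recovered mask is exactly the "subset of points which have remained rigid" that the abstract feeds into motion estimation.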
Abstract:
BACKGROUND: Nicotine dependence is the major obstacle for smokers who want to quit. Guidelines have identified five effective first-line therapies: four nicotine replacement therapies (NRTs), namely gum, patch, nasal spray and inhaler, and bupropion. Studying the extent to which these various treatments are cost-effective requires additional research. OBJECTIVES: To determine cost-effectiveness (CE) ratios of pharmacotherapies for nicotine dependence provided by general practitioners (GPs) during routine visits as an adjunct to cessation counselling. METHODS: We used a Markov model to generate two cohorts of one-pack-a-day smokers: (1) the reference cohort received only cessation counselling from a GP during routine office visits; (2) the second cohort received the same counselling plus an offer to use a pharmacological treatment to help them quit smoking. The effectiveness of adjunctive therapy was expressed in terms of the resultant differential in mortality rate between the two cohorts. Data on the effectiveness of therapies came from meta-analyses, and we used the odds ratio for quitting as the measure of effectiveness. The costs of pharmacotherapies were based on the cost of the additional time spent by GPs offering, prescribing and following up treatment, and on the retail prices of the therapies. We used the third-party-payer perspective. Results are expressed as the incremental cost per life-year saved. RESULTS: The cost per life-year saved for counselling alone ranged from Euro 385 to Euro 622 for men and from Euro 468 to Euro 796 for women. The CE ratios for the five pharmacological treatments varied from Euro 1768 to Euro 6879 for men, and from Euro 2146 to Euro 8799 for women. Significant variations in CE ratios among the five treatments were primarily due to differences in retail prices. The most cost-effective treatments were bupropion and the patch, followed, in descending order, by the spray, the inhaler and, lastly, gum.
Differences in CE between men and women across treatments were due to the shape of their respective mortality curves. The lowest CE ratio was in the 45- to 49-year-old group for men and in the 50- to 54-year-old group for women. Sensitivity analysis showed that changes in treatment efficacy produced effects only for the less well proven treatments (spray, inhaler, and bupropion) and revealed a strong influence of the discount rate and natural quit rate on the CE of pharmacological treatments. CONCLUSION: The CE of first-line treatments for nicotine dependence varied widely with age and sex and was sensitive to the assumption for the natural quit rate. Bupropion and the nicotine patch were the two most cost-effective treatments.
Abstract:
BACKGROUND/AIMS: While several risk factors for the histological progression of chronic hepatitis C have been identified, the contribution of HCV genotypes to liver fibrosis evolution remains controversial. The aim of this study was to assess independent predictors for fibrosis progression. METHODS: We identified 1189 patients from the Swiss Hepatitis C Cohort database with at least one biopsy prior to antiviral treatment and assessable date of infection. Stage-constant fibrosis progression rate was assessed using the ratio of fibrosis Metavir score to duration of infection. Stage-specific fibrosis progression rates were obtained using a Markov model. Risk factors were assessed by univariate and multivariate regression models. RESULTS: Independent risk factors for accelerated stage-constant fibrosis progression (>0.083 fibrosis units/year) included male sex (OR=1.60, [95% CI 1.21-2.12], P<0.001), age at infection (OR=1.08, [1.06-1.09], P<0.001), histological activity (OR=2.03, [1.54-2.68], P<0.001) and genotype 3 (OR=1.89, [1.37-2.61], P<0.001). Slower progression rates were observed in patients infected by blood transfusion (P=0.02) and invasive procedures or needle stick (P=0.03), compared to those infected by intravenous drug use. Maximum likelihood estimates (95% CI) of stage-specific progression rates (fibrosis units/year) for genotype 3 versus the other genotypes were: F0→F1: 0.126 (0.106-0.145) versus 0.091 (0.083-0.100), F1→F2: 0.099 (0.080-0.117) versus 0.065 (0.058-0.073), F2→F3: 0.077 (0.058-0.096) versus 0.068 (0.057-0.080) and F3→F4: 0.171 (0.106-0.236) versus 0.112 (0.083-0.142, overall P<0.001). CONCLUSIONS: This study shows a significant association of genotype 3 with accelerated fibrosis using both stage-constant and stage-specific estimates of fibrosis progression rates. This observation may have important consequences for the management of patients infected with this genotype.
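To see what the stage-specific rates reported above imply, one can read each rate as a constant hazard, so the mean sojourn time in a stage is 1/rate, and total the expected years from F0 to cirrhosis (F4). This constant-hazard reading is a simplifying assumption of this sketch, not a calculation made in the abstract; the rates themselves are the paper's point estimates.

```python
# Expected years from F0 to cirrhosis (F4), treating each reported
# stage-specific progression rate as a constant hazard, so the mean
# sojourn time per stage is 1/rate (a simplifying assumption).
rates = {
    "genotype 3": [0.126, 0.099, 0.077, 0.171],        # F0->F1 ... F3->F4
    "other genotypes": [0.091, 0.065, 0.068, 0.112],
}
for genotype, stage_rates in rates.items():
    years = sum(1.0 / r for r in stage_rates)
    print(f"{genotype}: ~{years:.0f} years to F4")
```

Under this reading, genotype 3 reaches F4 in roughly 37 years versus roughly 50 for the other genotypes, which is the "accelerated fibrosis" contrast in concrete terms.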
Abstract:
We propose and validate a multivariate classification algorithm for characterizing changes in human intracranial electroencephalographic data (iEEG) after learning motor sequences. The algorithm is based on a Hidden Markov Model (HMM) that captures spatio-temporal properties of the iEEG at the level of single trials. Continuous iEEG was acquired during two sessions (one before and one after a night of sleep) in two patients with depth electrodes implanted in several brain areas. They performed a visuomotor sequence (serial reaction time task, SRTT) using the fingers of their non-dominant hand. Our results show that the decoding algorithm correctly classified single iEEG trials from the trained sequence as belonging to either the initial training phase (day 1, before sleep) or a later consolidated phase (day 2, after sleep), whereas it failed to do so for trials belonging to a control condition (pseudo-random sequence). Accurate single-trial classification was achieved by taking advantage of the distributed pattern of neural activity. However, across all the contacts, the hippocampus contributed most significantly to the classification accuracy in both patients, as did one fronto-striatal contact in one patient. Together, these human intracranial findings demonstrate that a multivariate decoding approach can detect learning-related changes at the level of single-trial iEEG. Because it allows an unbiased identification of brain sites contributing to a behavioral effect (or experimental condition) at the level of a single subject, this approach could be usefully applied to assess the neural correlates of other complex cognitive functions in patients implanted with multiple electrodes.
Abstract:
GeneID is a program to predict genes in anonymous genomic sequences, designed with a hierarchical structure. In the first step, splice sites and start and stop codons are predicted and scored along the sequence using position weight matrices (PWMs). In the second step, exons are built from the sites. Exons are scored as the sum of the scores of the defining sites, plus the log-likelihood ratio of a Markov model for coding DNA. In the last step, from the set of predicted exons, the gene structure is assembled, maximizing the sum of the scores of the assembled exons. In this paper we describe the derivation of PWMs for sites, and the Markov model of coding DNA in Drosophila melanogaster. We also compare other models of coding DNA with the Markov model. Finally, we present and discuss the results obtained when GeneID is used to predict genes in the Adh region. These results show that the accuracy of GeneID predictions is currently comparable with that of other existing tools, but that GeneID is likely to be more efficient in terms of speed and memory usage.
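The "log-likelihood ratio of a Markov model for coding DNA" used in exon scoring can be sketched as follows: score each dinucleotide transition under a coding model versus a background model and sum the log ratios. The transition probabilities below are toy values with a mild GC preference in the coding model, not GeneID's actual Drosophila tables.

```python
import math

# Log-likelihood ratio of a sequence under a coding vs. background
# first-order Markov model; toy transition probabilities, not GeneID's.
BASES = "ACGT"
coding = {a: {"A": 0.2, "C": 0.3, "G": 0.3, "T": 0.2} for a in BASES}
background = {a: {b: 0.25 for b in BASES} for a in BASES}

def markov_llr(seq):
    llr = 0.0
    for a, b in zip(seq, seq[1:]):          # sum over dinucleotide transitions
        llr += math.log(coding[a][b] / background[a][b])
    return llr

print(markov_llr("GCGCGC"))   # GC-rich: positive under this toy model
print(markov_llr("ATATAT"))   # AT-rich: negative
```

In GeneID this quantity is added to the PWM site scores of a candidate exon; higher-order Markov models work the same way with longer contexts as keys.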
Abstract:
We study the minimum mean square error (MMSE) and the multiuser efficiency η of large dynamic multiple access communication systems in which optimal multiuser detection is performed at the receiver, as the number and the identities of active users are allowed to change at each transmission time. The system dynamics are ruled by a Markov model describing the evolution of the channel occupancy, and a large-system analysis is performed as the number of observations grows large. Starting from the equivalent scalar channel and the fixed-point equation tying multiuser efficiency to the MMSE, we extend the analysis to the case of a dynamic channel, and derive lower and upper bounds for the MMSE (and, thus, for η as well) holding true in the limit of large signal-to-noise ratios and increasingly large observation time T.
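As background to the fixed-point equation the abstract builds on, the static equal-power Gaussian-input baseline is commonly written 1/η = 1 + β·snr·mmse(η·snr), with the scalar Gaussian MMSE mmse(x) = 1/(1+x). The sketch below, an assumption-laden illustration rather than the paper's dynamic-channel analysis, solves this fixed point by simple iteration; the load β and snr values are arbitrary.

```python
# Fixed-point iteration for the multiuser efficiency eta in the static
# large-system limit, assuming equal-power Gaussian inputs, for which
# mmse(x) = 1/(1 + x) and the fixed point reads
#     1/eta = 1 + beta * snr * mmse(eta * snr).
# This is the static baseline only, not the paper's dynamic bounds.
def multiuser_efficiency(beta, snr, tol=1e-12):
    eta = 1.0
    while True:
        new = 1.0 / (1.0 + beta * snr / (1.0 + eta * snr))
        if abs(new - eta) < tol:
            return new
        eta = new

eta = multiuser_efficiency(beta=1.5, snr=10.0)
print(eta, 1.0 / (1.0 + eta * 10.0))   # eta and the corresponding scalar MMSE
```

For these values the fixed point also solves the quadratic 10η² + 6η − 1 = 0, which gives a quick check on the iteration.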
Abstract:
We investigate the determinants of regional development using a newly constructed database of 1569 sub-national regions from 110 countries covering 74 percent of the world’s surface and 97 percent of its GDP. We combine the cross-regional analysis of geographic, institutional, cultural, and human capital determinants of regional development with an examination of productivity in several thousand establishments located in these regions. To organize the discussion, we present a new model of regional development that introduces into a standard migration framework elements of both the Lucas (1978) model of the allocation of talent between entrepreneurship and work, and the Lucas (1988) model of human capital externalities. The evidence points to the paramount importance of human capital in accounting for regional differences in development, but also suggests, from model estimation and calibration, that entrepreneurial inputs and possibly human capital externalities help to explain the data.
Abstract:
OBJECTIVES: To assess the incremental cost-effectiveness ratio (ICER) and incremental cost-utility ratio (ICUR) of risedronate compared to no intervention in postmenopausal osteoporotic women from a Swiss perspective. METHODS: A previously validated Markov model was populated with epidemiological and cost data specific to Switzerland and published utility values, and run on a population of 1,000 women of 70 years with established osteoporosis and previous vertebral fracture, treated over 5 years with risedronate 35 mg weekly or no intervention (base case), and on five cohorts (according to age at therapy start) with eight risk factor distributions and three lengths of residual effect. RESULTS: In the base case population, the ICER of averting a hip fracture and the ICUR per quality-adjusted life year gained were both dominant. In the presence of a previous vertebral fracture, the ICUR was below €45,000 (£30,000) in all the scenarios. For all osteoporotic women ≥70 years of age with at least one risk factor, the ICUR was below €45,000, or the intervention may even be cost saving. Age at the start of therapy and the fracture risk profile had a significant impact on results. CONCLUSION: Assuming a 2-year residual effect, the ICUR of risedronate in women with postmenopausal osteoporosis is below accepted thresholds from the age of 65, and the therapy is even cost saving above the age of 70 with at least one risk factor.
Abstract:
BACKGROUND: Cleavage of messenger RNA (mRNA) precursors is an essential step in mRNA maturation. The signal recognized by the cleavage enzyme complex has been characterized as an A-rich region upstream of the cleavage site containing a motif with consensus AAUAAA, followed by a U- or UG-rich region downstream of the cleavage site. RESULTS: We studied these signals using exhaustive databases of cleavage sites obtained by aligning raw expressed sequence tag (EST) sequences to genomic sequences in Homo sapiens and Drosophila melanogaster. These data show that the polyadenylation signal is highly conserved in human and fly. In addition, de novo motif searches generated a refined description of the U-rich downstream sequence element (DSE), which shows more divergence between the two species. These refined motifs are applied, within a Hidden Markov Model (HMM) framework, to predict mRNA cleavage sites. CONCLUSION: We demonstrate that the DSE is a specific motif in both human and Drosophila. These findings shed light on the sequence correlates of a highly conserved biological process, and improve in silico prediction of 3' mRNA cleavage and polyadenylation sites.
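The signal layout the abstract describes, an AAUAAA hexamer upstream of the cleavage site with a U-rich stretch downstream, can be illustrated with a naive scan. The window sizes, U-content cutoff, and toy RNA string below are invented for the example; the paper's actual predictor is an HMM over refined motifs, not this rule.

```python
# Toy scan for the polyadenylation signal layout described above:
# an AAUAAA hexamer in an upstream window of a candidate cleavage
# site, plus a U-rich stretch downstream.  Window sizes and the
# U-content cutoff are illustrative choices, not the paper's HMM.
def candidate_cleavage_sites(rna, up=40, down=30, u_frac=0.5):
    sites = []
    for i in range(up, len(rna) - down):
        upstream = rna[i - up:i]
        downstream = rna[i:i + down]
        if "AAUAAA" in upstream and downstream.count("U") / down >= u_frac:
            sites.append(i)
    return sites

rna = ("G" * 20 + "AAUAAA" + "CAGC" * 4
       + "UUGUUUUCUUUUUGUUUUUUUGUUGUUUUU" + "G" * 20)
print(candidate_cleavage_sites(rna))
```

Several adjacent positions pass the rule (overlapping windows all satisfy it), which is one reason a probabilistic model like an HMM gives sharper site calls than a fixed-window scan.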