Abstract:
Over the last two decades social vulnerability has emerged as a major area of study, with increasing attention to the study of vulnerable populations. Generally, the elderly are among the most vulnerable members of any society, and widespread population aging has led to greater focus on elderly vulnerability. However, the absence of a valid and practical measure constrains the ability of policy-makers to address this issue in a comprehensive way. This study developed a composite indicator, the Elderly Social Vulnerability Index (ESVI), and used it to undertake a comparative analysis of the availability of support for elderly Jamaicans based on their access to human, material and social resources. The results of the ESVI indicated that while the elderly are more vulnerable overall, certain segments of the population appear to be at greater risk. Females had consistently lower scores than males, and the oldest-old had the highest scores of all groups of older persons. Vulnerability scores also varied according to place of residence, with more rural parishes having higher scores than their urban counterparts. These findings support the political economy framework which locates disadvantage in old age within political and ideological structures. The findings also point to the pervasiveness and persistence of gender inequality as argued by feminist theories of aging. Based on the results of the study it is clear that there is a need for policies that target specific population segments, in addition to universal policies that could make the experience of old age less challenging for the majority of older persons. Overall, the ESVI has displayed usefulness as a tool for theoretical analysis and demonstrated its potential as a policy instrument to assist decision-makers in determining where to target their efforts as they seek to address the issue of social vulnerability in old age.
Data for this study came from the 2001 population and housing census of Jamaica, with multiple imputation for missing data. The index was derived from the linear aggregation of three equally weighted domains, comprised of eleven unweighted indicators which were normalized using z-scores. Indicators were selected based on theoretical relevance and data availability.
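The index construction described above (z-score normalization, unweighted indicators averaged within equally weighted domains, linear aggregation) can be sketched as follows. This is a minimal illustration: the function name and the 4/4/3 assignment of the eleven indicators to three domains are assumptions, not the study's actual layout.

```python
import numpy as np

def esvi(indicators: np.ndarray, domain_of: list) -> np.ndarray:
    """Composite index: z-score each indicator column, average the
    unweighted indicators within each domain, then linearly aggregate
    the equally weighted domain scores into one index per person."""
    z = (indicators - indicators.mean(axis=0)) / indicators.std(axis=0)
    domains = sorted(set(domain_of))
    scores = np.column_stack([
        z[:, [i for i, d in enumerate(domain_of) if d == dom]].mean(axis=1)
        for dom in domains
    ])
    return scores.mean(axis=1)  # equal weights across domains

# Illustrative use: 100 people, 11 indicators split 4/4/3 across 3 domains
rng = np.random.default_rng(0)
index = esvi(rng.normal(size=(100, 11)), [0]*4 + [1]*4 + [2]*3)
```

Because each indicator is standardized before aggregation, the resulting index is centred at zero, so scores are interpretable only relative to the sample.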
Abstract:
Hypertrophic cardiomyopathy (HCM) is a cardiovascular disease where the heart muscle is partially thickened and blood flow is - potentially fatally - obstructed. It is one of the leading causes of sudden cardiac death in young people. Electrocardiography (ECG) and Echocardiography (Echo) are the standard tests for identifying HCM and other cardiac abnormalities. The American Heart Association has recommended using a pre-participation questionnaire for young athletes instead of ECG or Echo tests due to considerations of cost and time involved in interpreting the results of these tests by an expert cardiologist. Initially we set out to develop a classifier for automated prediction of young athletes’ heart conditions based on the answers to the questionnaire. Classification results and further in-depth analysis using computational and statistical methods indicated significant shortcomings of the questionnaire in predicting cardiac abnormalities. Automated methods for analyzing ECG signals can help reduce cost and save time in the pre-participation screening process by detecting HCM and other cardiac abnormalities. Therefore, the main goal of this dissertation work is to identify HCM through computational analysis of 12-lead ECG. ECG signals recorded on one or two leads have been analyzed in the past for classifying individual heartbeats into different types of arrhythmia as annotated primarily in the MIT-BIH database. In contrast, we classify complete sequences of 12-lead ECGs to assign patients into two groups: HCM vs. non-HCM. The challenges and issues we address include missing ECG waves in one or more leads and the dimensionality of a large feature-set. We address these by proposing imputation and feature-selection methods. We develop heartbeat-classifiers by employing Random Forests and Support Vector Machines, and propose a method to classify full 12-lead ECGs based on the proportion of heartbeats classified as HCM. 
The results from our experiments show that the classifiers developed using our methods perform well in identifying HCM. Thus the two contributions of this thesis are the utilization of computational and statistical methods for discovering shortcomings in a current screening procedure and the development of methods to identify HCM through computational analysis of 12-lead ECG signals.
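The recording-level decision rule described above, assigning a full 12-lead ECG to the HCM group based on the proportion of its heartbeats that a per-beat classifier flags as HCM, reduces to a thresholded vote. A minimal sketch, where the 0.5 threshold is an assumed default rather than the thesis's tuned value:

```python
def classify_recording(beat_labels, threshold=0.5):
    """Label a full ECG recording from per-beat predictions.

    beat_labels: per-heartbeat predictions from an upstream classifier
    (1 = HCM, 0 = non-HCM). The recording is assigned to the HCM group
    if the proportion of HCM-labelled beats meets the threshold."""
    if not beat_labels:
        raise ValueError("no heartbeats to aggregate")
    proportion = sum(beat_labels) / len(beat_labels)
    return ("HCM" if proportion >= threshold else "non-HCM", proportion)

# e.g. classify_recording([1, 1, 1, 0]) -> ("HCM", 0.75)
```

Returning the proportion alongside the label keeps the aggregation auditable: borderline recordings can be flagged for expert review rather than silently dichotomized.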
Abstract:
BACKGROUND: Moderate-to-vigorous physical activity (MVPA) is an important determinant of children’s physical health, and is commonly measured using accelerometers. A major limitation of accelerometers is non-wear time, which is the time the participant did not wear their device. Given that non-wear time is traditionally discarded from the dataset prior to estimating MVPA, final estimates of MVPA may be biased. Therefore, alternate approaches should be explored. OBJECTIVES: The objectives of this thesis were to 1) develop and describe an imputation approach that uses the socio-demographic, time, health, and behavioural data from participants to replace non-wear time accelerometer data, 2) determine the extent to which imputation of non-wear time data influences estimates of MVPA, and 3) determine if imputation of non-wear time data influences the associations between MVPA, body mass index (BMI), and systolic blood pressure (SBP). METHODS: Seven days of accelerometer data were collected using Actical accelerometers from 332 children aged 10-13. Three methods for handling missing accelerometer data were compared: 1) the “non-imputed” method wherein non-wear time was deleted from the dataset, 2) imputation dataset I, wherein the imputation of MVPA during non-wear time was based upon socio-demographic factors of the participant (e.g., age), health information (e.g., BMI), and time characteristics of the non-wear period (e.g., season), and 3) imputation dataset II wherein the imputation of MVPA was based upon the same variables as imputation dataset I, plus organized sport information. Associations between MVPA and health outcomes in each method were assessed using linear regression. RESULTS: Non-wear time accounted for 7.5% of epochs during waking hours. The average minutes/day of MVPA was 56.8 (95% CI: 54.2, 59.5) in the non-imputed dataset, 58.4 (95% CI: 55.8, 61.0) in imputed dataset I, and 59.0 (95% CI: 56.3, 61.5) in imputed dataset II. 
Estimates between datasets were not significantly different. The strength of the relationships of MVPA with BMI and SBP was comparable across all three datasets. CONCLUSION: These findings suggest that studies that achieve high accelerometer compliance with unsystematic patterns of missing data can use the traditional approach of deleting non-wear time from the dataset to obtain MVPA measures without substantial bias.
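A covariate-based imputation of the kind described can be sketched as a linear prediction of MVPA during non-wear epochs from participant and time characteristics. The thesis does not specify its model form here, so the ordinary-least-squares regression below is an illustrative assumption:

```python
import numpy as np

def impute_nonwear_mvpa(X_worn, y_worn, X_nonwear):
    """Fit a linear model of MVPA on covariates (e.g. age, BMI, season
    dummies) over worn epochs, then predict MVPA for non-wear epochs.
    Predictions are clipped at zero, since MVPA minutes cannot be
    negative."""
    A = np.column_stack([np.ones(len(X_worn)), X_worn])       # intercept
    beta, *_ = np.linalg.lstsq(A, y_worn, rcond=None)
    B = np.column_stack([np.ones(len(X_nonwear)), X_nonwear])
    return np.clip(B @ beta, 0.0, None)
```

Imputation dataset II in the study would correspond to appending organized-sport indicators as extra columns of the covariate matrix.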
Abstract:
Estimates of HIV prevalence are important for policy in order to establish the health status of a country's population and to evaluate the effectiveness of population-based interventions and campaigns. However, participation rates in testing for surveillance conducted as part of household surveys, on which many of these estimates are based, can be low. HIV positive individuals may be less likely to participate because they fear disclosure, in which case estimates obtained using conventional approaches to deal with missing data, such as imputation-based methods, will be biased. We develop a Heckman-type simultaneous equation approach which accounts for non-ignorable selection, but unlike previous implementations, allows for spatial dependence and does not impose a homogeneous selection process on all respondents. In addition, our framework addresses the issue of separation, where for instance some factors are severely unbalanced and highly predictive of the response, which would ordinarily prevent model convergence. Estimation is carried out within a penalized likelihood framework where smoothing is achieved using a parametrization of the smoothing criterion which makes estimation more stable and efficient. We provide the software for straightforward implementation of the proposed approach, and apply our methodology to estimating national and sub-national HIV prevalence in Swaziland, Zimbabwe and Zambia.
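The Heckman-type correction at the core of this approach can be illustrated in its simplest textbook two-step form: a first-stage selection index feeds an inverse Mills ratio into the outcome regression. This sketch is not the paper's penalized, spatially dependent implementation; the first-stage probit index z is assumed to be supplied by the caller.

```python
import math
import numpy as np

def inverse_mills(z):
    """lambda(z) = phi(z) / Phi(z): the Heckman selection-correction
    term built from the standard normal density and CDF."""
    z = np.asarray(z, dtype=float)
    phi = np.exp(-0.5 * z**2) / math.sqrt(2 * math.pi)
    Phi = 0.5 * (1 + np.vectorize(math.erf)(z / math.sqrt(2)))
    return phi / Phi

def heckman_second_stage(X, y, z):
    """OLS of the observed outcome on covariates plus the inverse
    Mills ratio computed from the first-stage selection index z.
    A significant IMR coefficient signals non-ignorable selection."""
    A = np.column_stack([np.ones(len(X)), X, inverse_mills(z)])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta  # [intercept, slopes..., IMR coefficient]
```

In the HIV-testing setting, selection corresponds to consenting to the test, and the paper's contribution is precisely to relax this homogeneous, non-spatial formulation.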
Abstract:
We present the market practice for constructing interest rate yield curves and pricing interest rate derivatives. We then give a brief description of the Vasicek and Hull-White models, with an example of calibration to market data. We generalize the classical Black-Scholes-Merton pricing formulas to more general cases, such as perfect or partial collateral, derivatives on a dividend-paying asset subject to repo funding, and multiple currencies. Finally, we derive generic pricing formulas for different combinations of cash flow and collateral currencies, apply the results to the pricing of FX swaps and cross-currency swaps (CCS), and discuss curve bootstrapping.
Abstract:
[Excerpt] REITs are attractive to investors, particularly institutional investors, due to their high dividend payouts and ability to provide more liquidity to the underlying market for direct real estate investment. This chapter analyzes the performance of real estate investment trusts (REITs). It compares the returns on REITs with those on more traditional asset classes, specifically bonds and mid-cap equities, and surveys the academic literature dealing with the diverse issues related to valuation. The chapter also examines the linkages between REIT performance and the behavior of the underlying real estate market. Because the chapter takes the perspective of a U.S.-based investor, it does not directly address the broader issues of global REITs.
Abstract:
Background: Primary total knee replacement is a common operation that is performed to provide pain relief and restore functional ability. Inpatient physiotherapy is routinely provided after surgery to enhance recovery prior to hospital discharge. However, international variation exists in the provision of outpatient physiotherapy after hospital discharge. While evidence indicates that outpatient physiotherapy can improve short-term function, the longer term benefits are unknown. The aim of this randomised controlled trial is to evaluate the long-term clinical effectiveness and cost-effectiveness of a 6-week group-based outpatient physiotherapy intervention following knee replacement. Methods/design: Two hundred and fifty-six patients waiting for knee replacement because of osteoarthritis will be recruited from two orthopaedic centres. Participants randomised to the usual-care group (n = 128) will be given a booklet about exercise and referred for physiotherapy if deemed appropriate by the clinical care team. The intervention group (n = 128) will receive the same usual care and additionally be invited to attend a group-based outpatient physiotherapy class starting 6 weeks after surgery. The 1-hour class will be run on a weekly basis over 6 weeks and will involve task-orientated and individualised exercises. The primary outcome will be the Lower Extremity Functional Scale at 12 months post-operative. Secondary outcomes include: quality of life, knee pain and function, depression, anxiety and satisfaction. Data collection will be by questionnaire prior to surgery and 3, 6 and 12 months after surgery and will include a resource-use questionnaire to enable a trial-based economic evaluation. Trial participation and satisfaction with the classes will be evaluated through structured telephone interviews. The primary statistical and economic analyses will be conducted on an intention-to-treat basis with and without imputation of missing data. 
The primary economic result will estimate the incremental cost per quality-adjusted life year gained from this intervention from a National Health Service (NHS) and personal social services perspective. Discussion: This research aims to benefit patients and the NHS by providing evidence on the long-term effectiveness and cost-effectiveness of outpatient physiotherapy after knee replacement. If the intervention is found to be effective and cost-effective, implementation into clinical practice could lead to improvement in patients’ outcomes and improved health care resource efficiency.
Abstract:
The widespread efforts to incorporate the economic values of oceans into national income accounts have reached a stage where coordination of national efforts is desirable. A symposium held in 2015 began this process by bringing together representatives from ten countries. The symposium concluded that a definition of core ocean industries was possible, but beyond that core the definition of ocean industries is in flux. Better coordination of ocean income accounts will require addressing issues of aggregation, geography, partial ocean industries, confidentiality, and imputation. Beyond the standard national income accounts, a need was identified to incorporate environmental resource and ecosystem service values to gain a complete picture of the economic role of the oceans. The U.N. System of Environmental and Economic Accounts and the Experimental Ecosystem Service Accounts provide frameworks for this expansion. This will require the development of physical accounts of environmental assets linked to the economic accounts, as well as the adaptation of transaction- and welfare-based economic valuation methods to environmental resources and ecosystem services. The future development of ocean economic data is most likely to require cooperative efforts to develop metadata standards and to use the multiple platforms of opportunity created by policy analysis, economic development, and conservation projects, both to collect new economic data and to sustain ocean economy data collection into the future by building capacity in economic data collection and use.
Abstract:
Finance is one of the fastest growing areas of modern applied mathematics, with real-world applications. The interest of this branch of applied mathematics is best described by an example involving shares. Shareholders of a company receive dividends which come from the profit made by the company. The proceeds of the company, once it is taken over or wound up, will also be distributed to shareholders. Therefore shares have a value that reflects the views of investors about the likely dividend payments and capital growth of the company. Such value is quantified by the share price on stock exchanges. Financial modelling therefore serves to understand the correlations between asset prices and buy/sell movements in order to reduce risk. Such activities depend on financial analysis tools being available to traders, with which they can make rapid and systematic evaluations of buy/sell contracts. There are other financial activities, but it is not the intention of this paper to discuss all of them. The main concern of this paper is to propose a parallel algorithm for the numerical solution of a European option. The paper is organised as follows. First, a brief introduction is given to a simple mathematical model for European options and possible numerical schemes for solving it. Second, a Laplace transform is applied to the model, leading to a set of parametric equations whose solutions for different parameter values may be found concurrently. Numerical inversion of the Laplace transform is performed by means of the inversion algorithm developed by Stehfest. The scalability of the algorithm in a distributed environment is demonstrated. Third, the performance of the present algorithm is compared with that of a spatial domain decomposition developed particularly for the time-dependent heat equation. Finally, a number of issues are discussed and future work is suggested.
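The Stehfest inversion step can be sketched directly. The code below is the standard Gaver-Stehfest scheme; the choice N = 12 is an assumed default (even values around 10-16 are typical in double precision), not a parameter stated in the paper.

```python
import math

def stehfest_weights(N):
    """Gaver-Stehfest coefficients V_k for an even number of terms N."""
    V = []
    for k in range(1, N + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, N // 2) + 1):
            s += (j ** (N // 2) * math.factorial(2 * j)
                  / (math.factorial(N // 2 - j) * math.factorial(j)
                     * math.factorial(j - 1) * math.factorial(k - j)
                     * math.factorial(2 * j - k)))
        V.append((-1) ** (k + N // 2) * s)
    return V

def invert_laplace(F, t, N=12):
    """Approximate f(t) from its Laplace transform F(s) via Stehfest:
    f(t) ~ (ln 2 / t) * sum_k V_k * F(k ln 2 / t)."""
    ln2 = math.log(2.0)
    V = stehfest_weights(N)
    return ln2 / t * sum(V[k - 1] * F(k * ln2 / t) for k in range(1, N + 1))

# e.g. invert F(s) = 1/(s+1), whose inverse is exp(-t)
approx = invert_laplace(lambda s: 1.0 / (s + 1.0), 1.0)
```

Each evaluation F(k ln 2 / t) is independent of the others, which is exactly the property that lets the parametric equations be solved concurrently across processors, as the paper describes.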
Abstract:
Because the results obtained on the relation between the use of capitalization and a firm's debt level are difficult to generalize, they do not support the conclusion that such a relation exists. Yet it has been shown in the accounting literature that, in the absence of standards, indebted firms favour the capitalization method. This therefore suggests that the criteria set out by the ICCA limit recourse to capitalization. The results obtained on the proportion of development costs capitalized suggest that no relation exists between the proportion capitalized and the firm's debt level. This suggests that the stated criteria limit the amount of development costs capitalized. Moreover, a negative association was observed between the use of expensing (imputation) and firm size. This result is surprising, since large firms are presumably successful and have a good chance of satisfying the criteria set out by the ICCA. It suggests that large firms evade the obligation to capitalize their development costs and that this is apparently tolerated by the ICCA.
Abstract:
Research on penny stocks is very limited and has focused mainly on penny-stock IPOs. This thesis examines Finnish publicly listed penny stocks and their performance over the ten-year period 2006-2015. The aim was to determine whether investing in Finnish publicly listed penny stocks is a profitable activity and what kinds of returns can be expected from investing in them. The data consisted of penny stocks, as defined in the thesis, and of other stocks in the Small Cap index, referred to as non-penny stocks. Returns were calculated from the stocks' daily total return indices and compared over short, medium and long horizons. To support the return analysis, the following performance measures were calculated for penny stocks and non-penny stocks: the Sharpe ratio, the Treynor index and Jensen's alpha. Finally, the following financial ratios were compared: ROE (%), E/P ratio, P/B ratio, dividend yield (%) and debt-to-equity ratio (%). Based on the results, Finnish publicly listed penny stocks are profitable investments in the short term, but the longer the examination period, the worse they performed. In addition, penny stocks underperformed non-penny stocks over all periods examined. The largest positive returns were nevertheless achieved by individual penny stocks. Penny stocks were found to involve many risks, such as high volatility, large negative returns and the possibility of bankruptcy. All performance measures and financial ratios also indicated that penny stocks are worse investments than non-penny stocks. Investors must be particularly careful with penny stocks, since investing in them is largely comparable to gambling.
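The three performance measures used in the thesis have standard definitions, sketched below. The risk-free rate and the return frequency convention are left to the caller; annualization is deliberately omitted as an assumption-free simplification.

```python
import numpy as np

def performance_measures(r, m, rf=0.0):
    """Sharpe ratio, Treynor index and Jensen's alpha for a return
    series r against market returns m, at per-period risk-free rate rf.

    Sharpe  = (mean excess return) / (return standard deviation)
    Treynor = (mean excess return) / beta
    Jensen  = mean excess return - beta * (mean market excess return)"""
    r, m = np.asarray(r, float), np.asarray(m, float)
    beta = np.cov(r, m, ddof=1)[0, 1] / np.var(m, ddof=1)
    excess = r.mean() - rf
    sharpe = excess / r.std(ddof=1)
    treynor = excess / beta
    jensen = excess - beta * (m.mean() - rf)
    return {"sharpe": sharpe, "treynor": treynor,
            "jensen_alpha": jensen, "beta": beta}
```

As a sanity check, a portfolio identical to the market has beta = 1 and Jensen's alpha = 0, which is why underperformance of penny stocks shows up as negative alphas relative to the index.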
Abstract:
This thesis studies the field of asset price bubbles. It is comprised of three independent chapters. Each of these chapters either directly or indirectly analyses the existence or implications of asset price bubbles. The type of bubbles assumed in each of these chapters is consistent with rational expectations; the price bubbles investigated here are thus known in the literature as rational bubbles. The following describes the three chapters. Chapter 1: This chapter attempts to explain the recent US housing price bubble by developing a heterogeneous agent endowment economy asset pricing model with risky housing, endogenous collateral and defaults. Investment in housing is subject to an idiosyncratic risk and some mortgages are defaulted on in equilibrium. We analytically derive the leverage, or endogenous loan-to-value ratio. This variable comes from a limited participation constraint in a one-period mortgage contract with monitoring costs. Our results show that low values of housing investment risk produce a credit easing effect, encouraging excess leverage and generating credit-driven rational price bubbles in the housing good. Conversely, high values of housing investment risk produce a credit crunch characterized by tight borrowing constraints, low leverage and low house prices. Furthermore, the leverage ratio was found to be procyclical and the rate of defaults countercyclical, consistent with empirical evidence. Chapter 2: It is widely believed that financial assets have considerable persistence and are susceptible to bubbles. However, identification of this persistence and of potential bubbles is not straightforward. This chapter tests for price bubbles in the United States housing market, accounting for long memory and structural breaks. The intuition is that the presence of long memory negates price bubbles, while the presence of breaks could artificially induce bubble behaviour.
Hence, we use semi-parametric Whittle and parametric ARFIMA procedures, which are consistent under a variety of residual biases, to estimate the value of the long memory parameter, d, of the log rent-price ratio. We find that the semi-parametric estimation procedures, robust to non-normality and heteroskedastic errors, found far more bubble regions than the parametric ones. A structural break was identified in the mean and trend of all the series which, when accounted for, removed bubble behaviour in a number of regions. Importantly, the United States housing market showed evidence of rational bubbles at both the aggregate and regional levels. In the third and final chapter, we attempt to answer the following question: to what extent should individuals participate in the stock market and hold risky assets over their lifecycle? We answer this question by employing a lifecycle consumption-portfolio choice model with housing, labour income and time-varying predictable returns, where the agents are constrained in the level of their borrowing. We first analytically characterize and then numerically solve for the optimal allocation to the risky asset, comparing the return-predictability case with that of IID returns. We successfully resolve the puzzles and find equity holdings and participation rates close to the data. We also find that return predictability substantially alters both the level of risky portfolio allocation and the rate of stock market participation. High realizations of the factor (the dividend-price ratio) and high persistence of the factor process, indicative of stock market bubbles, raise the amount of wealth invested in risky assets and the level of stock market participation, respectively. Conversely, rare disasters were found to bring down these rates, the change being most severe for investors in the later years of the life-cycle.
Furthermore, investors facing time-varying (predictable) returns hedged background risks significantly better than those facing IID returns.
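As a rough illustration of semi-parametric long-memory estimation of the kind used in Chapter 2, here is a Geweke-Porter-Hudak (GPH) log-periodogram sketch of the parameter d. The chapter's local-Whittle and ARFIMA procedures are more involved; the bandwidth m = sqrt(n) below is an assumed rule of thumb, not the thesis's choice.

```python
import numpy as np

def gph_estimate(x, m=None):
    """GPH log-periodogram estimate of the long-memory parameter d:
    regress log I(lambda_j) on -2*log(2*sin(lambda_j/2)) over the
    first m Fourier frequencies; the slope estimates d."""
    x = np.asarray(x, float)
    n = len(x)
    if m is None:
        m = int(n ** 0.5)                       # rule-of-thumb bandwidth
    lam = 2 * np.pi * np.arange(1, m + 1) / n   # Fourier frequencies
    fx = np.fft.fft(x - x.mean())
    I = (np.abs(fx[1:m + 1]) ** 2) / (2 * np.pi * n)   # periodogram
    reg = -2 * np.log(2 * np.sin(lam / 2))
    reg_c = reg - reg.mean()
    logI_c = np.log(I) - np.log(I).mean()
    return float(reg_c @ logI_c / (reg_c @ reg_c))      # OLS slope = d
```

For a short-memory series d is near 0, for stationary long memory 0 < d < 0.5, and d at or above 1 for the log rent-price ratio is what would leave room for explosive (bubble-like) behaviour.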
Abstract:
The purpose of this advisory opinion is to update SC Revenue Ruling #91-15 concerning interest exempt from South Carolina income taxes. This advisory opinion provides a discussion of the types of interest exempt from South Carolina income taxes, the taxability of exempt interest when distributed as a dividend from a mutual fund, and Section 265 of the Internal Revenue Code, which disallows a deduction for expenses allocable to tax-exempt income. This document also provides examples of tax-exempt obligations and of obligations which are not tax-exempt for South Carolina income tax purposes.
Abstract:
Snapper (Pagrus auratus) is widely distributed throughout subtropical and temperate southern oceans and forms a significant recreational and commercial fishery in Queensland, Australia. Using data from government reports, media sources, popular publications and a government fisheries survey carried out in 1910, we compiled information on individual snapper fishing trips that took place prior to the commencement of fishery-wide organized data collection, from 1871 to 1939. In addition to extracting all available quantitative data, we translated qualitative information into bounded estimates and used multiple imputation to handle missing values, forming 287 records for which catch rate (snapper fisher⁻¹ h⁻¹) could be derived. Uncertainty was handled through a parametric maximum likelihood framework (a transformed trivariate Gaussian), which facilitated statistical comparisons between data sources. No statistically significant differences in catch rates were found among media sources and the government fisheries survey. Catch rates remained stable throughout the time series, averaging 3.75 snapper fisher⁻¹ h⁻¹ (95% confidence interval, 3.42–4.09) as the fishery expanded into new grounds. In comparison, a contemporary (1993–2002) south-east Queensland charter fishery produced an average catch rate of 0.4 snapper fisher⁻¹ h⁻¹ (95% confidence interval, 0.31–0.58). These data illustrate the productivity of a fishery during its earliest years of development and represent the earliest catch rate data globally for this species. By adopting a formalized approach to address issues common to many historical records – missing data, a lack of quantitative information and reporting bias – our analysis demonstrates the potential for historical narratives to contribute to contemporary fisheries management.
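The bounded-estimate imputation described above can be sketched as repeated draws within each record's bounds, pooled across completed datasets. The uniform draw below is a simplifying assumption standing in for the study's transformed trivariate Gaussian likelihood, and the pooling follows the usual multiple-imputation (Rubin) logic.

```python
import numpy as np

def impute_bounded(lows, highs, observed, n_imputations=20, seed=1):
    """Multiple imputation for catch rates recorded only as bounded
    estimates: draw each missing value uniformly within its bounds,
    compute the mean of each completed dataset, then pool the means
    and report the between-imputation variance."""
    rng = np.random.default_rng(seed)
    means = []
    for _ in range(n_imputations):
        draws = rng.uniform(lows, highs)                 # one completed dataset
        means.append(np.concatenate([observed, draws]).mean())
    pooled = float(np.mean(means))
    between = float(np.var(means, ddof=1))               # between-imputation variance
    return pooled, between
```

The between-imputation variance is what propagates the uncertainty of the qualitative-to-quantitative translation into the final catch-rate confidence intervals.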