23 results for Users of Financial Statements

in Helda - Digital Repository of the University of Helsinki


Relevance:

100.00%

Publisher:

Abstract:

Frictions are factors that hinder the trading of securities in financial markets. Typical frictions include limited market depth, transaction costs, lack of infinite divisibility of securities, and taxes. Conventional models used in mathematical finance often gloss over these issues, which affect almost all financial markets, by arguing that the impact of frictions is negligible and, consequently, that frictionless models are valid approximations. This dissertation consists of three research papers that study the validity of such approximations in two distinct modeling problems. Models of price dynamics based on diffusion processes, i.e., continuous strong Markov processes, are widely used in the frictionless scenario. The first paper establishes that diffusion models can indeed be understood as approximations of price dynamics in markets with frictions. This is achieved by introducing an agent-based model of a financial market where finitely many agents trade a financial security, the price of which evolves according to the price impacts generated by trades. It is shown that, if the number of agents is large, then under certain assumptions the price process of the security, which is a pure-jump process, can be approximated by a one-dimensional diffusion process. In a slightly extended model, in which agents may exhibit herd behavior, the approximating diffusion model turns out to be a stochastic volatility model. Finally, it is shown that when the agents' tendency to herd is strong, logarithmic returns in the approximating stochastic volatility model are heavy-tailed. The remaining papers are related to no-arbitrage criteria and superhedging in continuous-time option pricing models under small-transaction-cost asymptotics.
Guasoni, Rásonyi, and Schachermayer have recently shown that, in such a setting, any financial security admits no arbitrage opportunities and there exist no feasible superhedging strategies for European call and put options written on it, as long as its price process is continuous and has the so-called conditional full support (CFS) property. Motivated by this result, CFS is established for certain stochastic integrals and a subclass of Brownian semistationary processes in the two papers. As a consequence, a wide range of possibly non-Markovian local and stochastic volatility models have the CFS property.
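
The approximation result described above can be illustrated with a toy simulation (a minimal sketch under simplifying assumptions, not the dissertation's actual agent-based model; the square-root price-impact scaling below is an illustrative choice): when many agents generate small price-impact jumps, the pure-jump price path starts to resemble a diffusion.

```python
import math
import random

def simulate_jump_price(n_agents, n_trades, p0=100.0, impact=1.0, seed=1):
    """Toy pure-jump price process: at each step one randomly chosen agent
    buys or sells one unit, and the trade moves the price by a jump of
    +/- impact / sqrt(n_agents).  As n_agents grows the jumps shrink while
    their number grows, which is the regime of the diffusion approximation."""
    rng = random.Random(seed)
    price = p0
    path = [price]
    for _ in range(n_trades):
        side = rng.choice([-1, 1])                     # sell (-1) or buy (+1)
        price += side * impact / math.sqrt(n_agents)   # small price-impact jump
        path.append(price)
    return path
```

With herd behavior (trade directions depending on recent order flow) the limiting object becomes a stochastic volatility model, as the abstract notes; that extension is omitted here.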

Relevance:

100.00%

Publisher:

Abstract:

Online content services can greatly benefit from personalisation features that enable delivery of content suited to each user's specific interests. This thesis presents a system that applies text analysis and user modeling techniques in an online news service for the purpose of personalisation and user interest analysis. The system creates a detailed thematic profile for each content item and observes users' actions towards content items to learn their preferences. A handcrafted taxonomy of concepts, or ontology, is used in profile formation to extract relevant concepts from the text. User preference learning is automatic, with no need for explicit preference settings or ratings from the user. Learned user profiles are segmented into interest groups using clustering techniques, with the objective of providing a source of information for the service provider. Some theoretical background for the chosen techniques is presented, while the main focus is on finding practical solutions to current information needs that are not optimally served by traditional techniques.
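
The pipeline sketched in the abstract (ontology-based concept extraction plus implicit preference learning from reading behaviour) could look roughly like the following; the tiny taxonomy, its keywords and the class design are hypothetical illustrations, not the thesis's actual system:

```python
from collections import Counter

# Toy hand-crafted taxonomy: concept -> keywords (purely illustrative)
TAXONOMY = {
    "economy": {"inflation", "interest", "market"},
    "sports": {"match", "goal", "league"},
    "technology": {"software", "mobile", "internet"},
}

def extract_concepts(text):
    """Thematic profile of a content item: the taxonomy concepts whose
    keywords occur in the text."""
    words = set(text.lower().split())
    return {concept for concept, keys in TAXONOMY.items() if words & keys}

class UserProfile:
    """Implicit preference learning: every article the user reads bumps
    the weight of its concepts; no explicit ratings are needed."""

    def __init__(self):
        self.weights = Counter()

    def observe_read(self, article_text):
        self.weights.update(extract_concepts(article_text))

    def top_interests(self, n=2):
        return [concept for concept, _ in self.weights.most_common(n)]
```

The learned weight vectors can then be segmented into interest groups with a standard clustering algorithm such as k-means, which is the role clustering plays in the abstract.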

Relevance:

100.00%

Publisher:

Relevance:

100.00%

Publisher:

Abstract:

This study contributes to our knowledge of how information contained in financial statements is interpreted and priced by the stock market in two respects. First, the empirical findings indicate that investors interpret some of the information contained in new financial statements in the context of the information in prior financial statements. Second, two central hypotheses offered in the earlier literature to explain the significant connection between publicly available financial statement information and future abnormal returns, namely that the signals proxy for risk and that the information is priced with a delay, are evaluated using a new methodology. It is found that for some financial statement signals this connection can be explained by the signals proxying for risk, and for other signals by the information contained in them being priced with a delay.

Relevance:

100.00%

Publisher:

Abstract:

Pitfalls in the treatment of persons with dementia

Persons with dementia require high-quality health care, rehabilitation and sufficient social services to support their autonomy and to postpone permanent institutionalization. This study sought to investigate possible pitfalls in the care of patients with dementia: hip fracture rehabilitation, the use of inappropriate or antipsychotic medication, and the social and medico-legal services offered to dementia caregiving families. Three different Finnish samples from the years 1999-2005 were used, with mean ages ranging from 78 to 86 years. After hip fracture surgery, a weight-bearing restriction, especially in the group of patients with dementia, was associated with a longer rehabilitation period (73.5 vs. 45.5 days, p=0.03) and with the inability to learn to walk after six weeks (p<0.001). Almost half (44%) of the pre-surgery home-dwellers with dementia in our sample required permanent hospitalization after hip fracture. Potentially inappropriate drugs (PIDs) were used by 36.2% of nursing home and hospital patients. The most common PIDs in Finland were temazepam over 15 mg/day, oxybutynin, and dipyridamole. However, PID use failed to predict mortality or the use of health services. Nearly half (48.4%) of the nursing home and hospital patients with dementia used antipsychotic medication. Two-year mortality did not differ among users of conventional antipsychotics, users of atypical antipsychotics, and non-users (45.3% vs. 32.1% vs. 49.6%, p=0.195). The mean number of hospital admissions was highest among non-users (p=0.029). A high number of medications (HR 1.12, p<0.001) and the use of physical restraints (HR 1.72, p=0.034) predicted higher mortality at two years, while the use of atypical antipsychotics (HR 0.49, p=0.047) showed a protective effect, if any.
The services most often offered to caregiving families of persons with Alzheimer's disease (AD) included financial support from the community (36%), technical devices (33%), physiotherapy (32%), and respite care in nursing homes (31%). The services most often needed included physiotherapy for the spouse with dementia (56%), financial support (50%), house cleaning (41%), and home respite (40%). Only a third of the caregivers were satisfied with these services, and 69% felt unable to influence the range of services offered. The use of legal guardians was quite rare (only 4.3%), while the use of financial powers of attorney stood at 37.8%. Almost half (47.9%) of the couples expressed an unmet need for discussion with their doctor about medico-legal issues, while only 9.9% stated that their doctor had informed them of such matters. Although we already have many practical methods to develop the medical and social care of persons with AD, these patients and their families require better planning and tailoring of such services. In this way, society could offer these elderly persons a better quality of life while economizing on its financial resources. This study was supported by the Social Insurance Institution of Finland, and part of it was made in cooperation with The Central Union for the Welfare of the Aged, Finland.

Relevance:

100.00%

Publisher:

Abstract:

A growing body of empirical research examines the structure and effectiveness of corporate governance systems around the world. An important insight from this literature is that corporate governance mechanisms address the excessive use of managerial discretionary powers to extract private benefits by expropriating shareholder value. One possible way of expropriation is to reduce the quality of disclosed earnings by manipulating the financial statements. This lower quality of earnings should then be reflected in the firm's stock price according to the value relevance theorem. Hence, instead of testing the direct effect of corporate governance on the firm's market value, it is important to understand the causes of the lower quality of accounting earnings. This thesis contributes to the literature by increasing knowledge about the extent of earnings management - measured as the extent of discretionary accruals in total disclosed earnings - and its determinants across transitional European countries. The thesis comprises three essays of empirical analysis, of which the first two utilize data on Russian listed firms, whereas the third essay uses data from 10 European economies. More specifically, the first essay adds to existing research connecting earnings management to corporate governance. It tests the impact of the Russian corporate governance reforms of 2002 on the quality of disclosed earnings in all publicly listed firms. The essay provides empirical evidence that the desired impact of the reforms is not fully realized in Russia without proper enforcement. Instead, firm-level factors such as long-term capital investments and compliance with International Financial Reporting Standards (IFRS) determine the quality of the earnings. The results presented in the essay support the notion proposed by Leuz et al. (2003) that reforms aimed at bringing transparency do not produce the desired results in economies where investor protection is low and legal enforcement is weak. The second essay focuses on the relationship between internal-control mechanisms, such as the types and levels of ownership, and the quality of disclosed earnings in Russia. The empirical analysis shows that controlling shareholders in Russia use their powers to manipulate reported performance in order to obtain private benefits of control. Comparatively, firms owned by the State have significantly better quality of disclosed earnings than firms held by other controllers such as oligarchs and foreign corporations. Interestingly, the market performance of firms controlled by either the State or oligarchs is better than that of widely held firms. The third essay provides evidence that both ownership structures and economic characteristics are important factors in determining the quality of disclosed earnings in three groups of European countries. The evidence suggests that ownership structure is the more important determinant in developed and transparent countries, while economic characteristics matter more in developing and transitional countries.
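
Discretionary accruals, the earnings-quality proxy used throughout the thesis, are conventionally estimated as the residuals of a Jones (1991)-type regression. A minimal cross-sectional sketch on synthetic data (the essays' exact specifications may differ):

```python
import numpy as np

def jones_discretionary_accruals(total_accruals, lagged_assets, delta_rev, ppe):
    """Jones (1991) model: regress total accruals (scaled by lagged assets)
    on 1/assets, the scaled change in revenues and scaled gross PPE.
    The residuals are the 'discretionary' accruals used as an
    earnings-management proxy."""
    y = total_accruals / lagged_assets
    X = np.column_stack([
        1.0 / lagged_assets,
        delta_rev / lagged_assets,
        ppe / lagged_assets,
    ])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ coef   # residuals = discretionary accruals
```

On data generated exactly from the model, the residuals (and hence the estimated discretionary accruals) are zero; real samples deviate from zero, and the magnitude of the deviation is the earnings-management signal.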

Relevance:

100.00%

Publisher:

Abstract:

The aim of the study was to analyze and facilitate collaborative design in a virtual learning environment (VLE). Discussions of virtual design in design education have typically focused on technological or communication issues, not on pedagogical issues. Yet in order to facilitate collaborative design, it is also necessary to address the pedagogical issues related to the virtual design process. In this study, the progressive inquiry model of collaborative designing was used to give a structural level of facilitation to students working in the VLE. According to this model, all aspects of inquiry, such as creating the design context, constructing a design idea, evaluating the idea, and searching for new information, can be shared in a design community. The study consists of three design projects: 1) designing clothes for premature babies, 2) designing conference bags for an international conference, and 3) designing tactile books for visually impaired children. These design projects constituted a continuum of design experiments, each of which highlighted certain perspectives on collaborative designing. The design experiments were organized so that the participants worked in design teams, both face-to-face and virtually. The first design experiment focused on peer collaboration among textile teacher students in the VLE. The second design experiment took end-users' needs into consideration by using a participatory design approach. The third design experiment intensified computer-supported collaboration between students and domain experts. The virtual learning environments in these design experiments were designed to support knowledge-building pedagogy and progressive inquiry learning. These environments enabled detailed recording of all computer-mediated interactions and data related to virtual designing. The data analysis was based on qualitative content analysis of design statements in the VLE.
This study indicated four crucial issues concerning collaborative design in the VLE in craft and design education. Firstly, using the collaborative design process in craft and design education gives rise to special challenges of building learning communities, creating appropriate design tasks for them, and providing tools for collaborative activities. Secondly, the progressive inquiry model of collaborative designing can be used as a scaffold for design thinking and for reflection on the design process. Thirdly, participation and distributed expertise can be facilitated by considering the key stakeholders related to the design task or design context and getting them to participate in virtual designing. Fourthly, in the collaborative design process, it is important that team members create and improve visual and technical ideas together, not just agree or disagree about proposed ideas. Therefore, viewing the VLE as a medium for the collaborative construction of design objects appears crucial in order to understand and facilitate the complex processes in collaborative designing.

Relevance:

100.00%

Publisher:

Abstract:

Taking the appropriation of objects as a theoretical starting point, this study makes a distinction between a conceptual and a practical level of adopting new objects and products in everyday life. The study applies the concept of appropriation in social food research and examines consumers' appropriation of functional foods, i.e., foods developed to improve health and well-being or reduce the risk of disease beyond the usual nutritional effects of foods. The study uses the concept of appropriation to understand the adoption of functional foods and the process of making them 'our own'. First, the study focuses on conceptual appropriation by analysing consumers' interpretations and opinions of functional foods. Second, it analyses the use of functional foods and examines the role of sociodemographic and food- and health-related background factors in their use. Both quantitative and qualitative data were used in the study. Altogether 1210 Finns, representative of the population, took part in a survey carried out in 2002 as computer-assisted telephone interviews (CATI). The survey examined the acceptability and use of functional foods in Finland. In 2004, eight focus group discussions were organised for 45 users and non-users of cholesterol-lowering spreads. The qualitative study focused on consumers' interpretative perspectives on healthy eating and functional foods. The findings are reported in four original articles and a summary article. The results show that the appropriation of functional foods is a multifaceted phenomenon. Conceptual appropriation is related to consumers' interpretations of functional foods in the context of healthy foods and healthy eating; their trust in the products, their manufacturers, research and control; and the relationship of functional foods to the ideal of natural foods.
The analysis of the practical appropriation of four different types of foods marketed as functional showed that there are sociodemographic differences between users and non-users of the products but, more importantly, that the differences are related to consumers' food- and health-related views and practices. Consumers' ways of appropriating functional foods in the conceptual and practical sense take shape in a complex web of ideas and everyday practices concerning food, health and eating as a whole. The results also indicate that conceptual and practical appropriation are not necessarily uniform or coherent processes. Consumers interpret healthy eating and functional foods from a variety of perspectives, and there is a multiplicity of rationales for using functional foods. Appropriation embraces many opposing dimensions simultaneously: good experiences and doubts, approval and criticism, expectations and things taken for granted.

Relevance:

100.00%

Publisher:

Abstract:

This thesis addresses the modeling of financial time series, especially stock market returns and daily price ranges. Modeling data of this kind can be approached with so-called multiplicative error models (MEM). These models nest several well-known time series models such as the GARCH, ACD and CARR models. They are able to capture many well-established features of financial time series, including volatility clustering and leptokurtosis. In contrast to these phenomena, different kinds of asymmetries have received relatively little attention in the existing literature. In this thesis asymmetries arise from various sources. They are observed in both conditional and unconditional distributions, for variables with non-negative values and for variables that take values on the real line. In the multivariate context, asymmetries can be observed in the marginal distributions as well as in the relationships between the variables modeled. New methods for all these cases are proposed. Chapter 2 considers GARCH models and the modeling of returns of two stock market indices. The chapter introduces the so-called generalized hyperbolic (GH) GARCH model to account for asymmetries in both the conditional and unconditional distribution. In particular, two special cases of the GARCH-GH model which describe the data most accurately are proposed. They are found to improve the fit of the model when compared to symmetric GARCH models. The advantages of accounting for asymmetries are also observed through Value-at-Risk applications. Both theoretical and empirical contributions are provided in Chapter 3 of the thesis. In this chapter the so-called mixture conditional autoregressive range (MCARR) model is introduced, examined and applied to daily price ranges of the Hang Seng Index. The conditions for the strict and weak stationarity of the model, as well as an expression for the autocorrelation function, are obtained by writing the MCARR model as a first-order autoregressive process with random coefficients.
The chapter also introduces the inverse gamma (IG) distribution to CARR models. The advantages of the CARR-IG and MCARR-IG specifications over conventional CARR models are found in the empirical application both in- and out-of-sample. Chapter 4 discusses the simultaneous modeling of absolute returns and daily price ranges. In this part of the thesis a vector multiplicative error model (VMEM) with an asymmetric Gumbel copula is found to provide substantial benefits over the existing VMEM models based on elliptical copulas. The proposed specification is able to capture the highly asymmetric dependence of the modeled variables, thereby improving the performance of the model considerably. The economic significance of the results is established by examining the information content of the derived volatility forecasts.
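
The common structure of the MEM family discussed above (GARCH, ACD and CARR are special cases) is a non-negative variable driven by a multiplicative innovation with an autoregressive conditional mean. A minimal simulation sketch, with illustrative parameters and unit-mean exponential innovations (a standard baseline choice, not the distributions proposed in the thesis):

```python
import random

def simulate_mem(n, omega=0.1, alpha=0.2, beta=0.7, seed=42):
    """Multiplicative error model x_t = mu_t * eps_t with a CARR/ACD-style
    conditional-mean recursion mu_t = omega + alpha*x_{t-1} + beta*mu_{t-1}.
    With alpha + beta < 1 the unconditional mean is omega / (1-alpha-beta)."""
    rng = random.Random(seed)
    mu = omega / (1 - alpha - beta)   # start at the unconditional mean
    xs = []
    for _ in range(n):
        eps = rng.expovariate(1.0)    # unit-mean innovation, E[eps] = 1
        x = mu * eps                  # observed non-negative variable
        xs.append(x)
        mu = omega + alpha * x + beta * mu
    return xs
```

Interpreting x_t as a daily price range gives a CARR model and interpreting it as a trade duration gives an ACD model; squaring returns recovers the GARCH connection.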

Relevance:

100.00%

Publisher:

Abstract:

Mobile RFID services for the Internet of Things can be created by using RFID as an enabling technology in mobile devices. Humans, devices, and things are the content providers and users of these services. Mobile RFID services can either be provided on mobile devices as stand-alone services or be combined with end-to-end systems. When different service solution scenarios are considered, there is more than one possible architectural solution in the network, mobile, and back-end server areas. By combining the solutions wisely, applying software architecture and engineering principles, a combined solution can be formulated for certain application-specific use cases. This thesis illustrates these ideas and shows how the solutions can generally be used in real-world use-case scenarios. A case study is used to add further evidence.

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this study is to analyse the development and understanding of the idea of consensus in bilateral dialogues among Anglicans, Lutherans and Roman Catholics. The source material consists of representative dialogue documents from the international, regional and national dialogues from the 1960s until 2006. In general, the dialogue documents argue for agreement/consensus based on commonality or compatibility. Each of the three dialogue processes has specific characteristics and formulates its argument in a unique way. The Lutheran-Roman Catholic dialogue has a particular interest in hermeneutical questions. In the early phases, the documents endeavoured to describe the interpretative principles that would allow the churches to proclaim the Gospel together and to identify the foundation on which agreement in the church is based. This investigation ended up proposing a notion of 'basic consensus', which later developed into a form of consensus that seeks to embrace, not to dismiss, differences (so-called 'differentiated consensus'). The Lutheran-Roman Catholic agreement is based on a perspectival understanding of doctrine. The Anglican-Roman Catholic dialogue emphasises the correctness of interpretations. The documents consciously look towards a 'common future', not the separated past. The dialogue's primary interpretative concept is koinonia. The texts develop a hermeneutics of authoritative teaching that has been described as the 'rule of communion'. The Anglican-Lutheran dialogue is characterised by an instrumental understanding of doctrine. Doctrinal agreement is facilitated by the ideas of coherence, continuity and substantial emphasis in doctrine. The Anglican-Lutheran dialogue proposes a form of 'sufficient consensus' that considers a wide set of doctrinal statements and liturgical practices to determine whether an agreement has been reached to a degree that, although not complete, is sufficient for concrete steps towards unity.
Chapter V discusses the current challenges of consensus as an ecumenically viable concept. In this part, I argue that the acceptability of consensus as an ecumenical goal is based not only on the understanding of the church but, more importantly, on the understanding of the nature and function of doctrine. The understanding of doctrine has undergone significant changes during the time of the ecumenical dialogues. The major shift has been from a modern paradigm towards a postmodern paradigm. I conclude with proposals towards a form of consensus that would survive philosophical criticism and be theologically valid and ecumenically acceptable.

Relevance:

100.00%

Publisher:

Abstract:

Volatility is central to options pricing and risk management. It reflects the uncertainty of investors and the inherent instability of the economy. Time series methods are among the most widely applied scientific methods to analyze and predict volatility. Very frequently sampled data contain much valuable information about the different elements of volatility and may ultimately reveal the reasons for time-varying volatility. The use of such ultra-high-frequency data is common to all three essays of the dissertation, which belongs to the field of financial econometrics. The first essay uses wavelet methods to study the time-varying behavior of scaling laws and long memory in the five-minute volatility series of Nokia on the Helsinki Stock Exchange around the burst of the IT bubble. The essay is motivated by earlier findings which suggest that different scaling laws may apply to intraday time-scales and to larger time-scales, implying that the so-called annualized volatility depends on the data sampling frequency. The empirical results confirm the appearance of time-varying long memory and different scaling laws that, for a significant part, can be attributed to investor irrationality and to an intraday volatility periodicity called the New York effect. The findings have potentially important consequences for options pricing and risk management, which commonly assume constant memory and scaling. The second essay investigates modelling the duration between trades in stock markets. Durations convey information about investor intentions and provide an alternative view of volatility. Generalizations of standard autoregressive conditional duration (ACD) models are developed to meet needs observed in previous applications of the standard models.
According to the empirical results, based on data on actively traded stocks on the New York Stock Exchange and the Helsinki Stock Exchange, the proposed generalization clearly outperforms the standard models and also performs well in comparison to another recently proposed alternative to the standard models. The distribution used to derive the generalization may also prove valuable in other areas of risk management. The third essay studies empirically the effect of decimalization on volatility and market microstructure noise. Decimalization refers to the change from fractional to decimal pricing; it was carried out on the New York Stock Exchange in January 2001. The methods used here are more accurate than those in earlier studies and put more weight on market microstructure. The main result is that decimalization decreased observed volatility by reducing noise variance, especially for the highly active stocks. The results are useful for risk management and market mechanism design.
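
The ultra-high-frequency volatility measurements running through all three essays are typically built from realized volatility, i.e. summed squared intraday log returns. A minimal sketch; note that the naive square-root-of-time annualization shown here is exactly the kind of constant-scaling assumption the first essay calls into question:

```python
import math

def realized_volatility(prices):
    """Realized volatility from intraday prices: the square root of the
    sum of squared log returns over the sampled interval."""
    rets = [math.log(p1 / p0) for p0, p1 in zip(prices, prices[1:])]
    return math.sqrt(sum(r * r for r in rets))

def annualized(vol_daily, trading_days=252):
    """Naive annualization assuming constant scaling across horizons;
    under time-varying scaling laws this assumption can fail."""
    return vol_daily * math.sqrt(trading_days)
```

With five-minute sampling of a trading day, `prices` would hold roughly a hundred observations; changing the sampling frequency changes the estimate whenever microstructure noise or non-trivial scaling is present.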

Relevance:

100.00%

Publisher:

Abstract:

The World Wide Web provides the opportunity for a radically changed and much more efficient communication process for scientific results. A survey in the closely related domains of construction information technology and construction management was conducted in February 2000, aimed at measuring to what extent these opportunities are already changing scientific information exchange and how researchers feel about the changes. The paper presents the results based on 236 replies to an extensive Web-based questionnaire. 65% of the respondents stated their primary research interest as IT in A/E/C and 20% as construction management and economics. The questions dealt with how researchers find, access and read different sources; how much and what publications they read; how often and to which conferences they travel; how much they publish; and what the criteria are for where they eventually decide to publish. Some of the questions contrasted traditional and electronic publishing, with one final section dedicated to opinions about electronic publishing. According to the survey, researchers already download half of the material that they read digitally from the Web. The most popular method for retrieving an interesting publication is downloading it for free from the author's or publisher's website. Researchers are not particularly willing to pay for electronic scientific publications. There is much support for a scenario of electronic journals available totally freely on the Web, where the costs could be covered by, for instance, professional societies or the publishing university. The shift that the Web is causing seems to be towards 'just in time' reading of literature. Also, frequent users of the Web rely less on scientific publications and tend to read fewer articles. If available with little effort, papers published in traditional journals are preferred; if not, the papers should be on the Web.
In these circumstances, the role of paper-based journals published by established publishers is shifting from the core "information exchange" to the building of authors' prestige. The respondents feel they should build up their reputations by publishing in journals and relevant conferences, but then make their work freely available on the Web.

Relevance:

100.00%

Publisher:

Abstract:

Detecting Earnings Management Using Neural Networks

Trying to balance between relevant and reliable accounting data, generally accepted accounting principles (GAAP) allow, to some extent, the company management to use their judgment and to make subjective assessments when preparing financial statements. The opportunistic use of this discretion in financial reporting is called earnings management. A considerable number of methods have been suggested for detecting accrual-based earnings management. The majority of these methods are based on linear regression. The problem with using linear regression is that a linear relationship between the dependent variable and the independent variables must be assumed. However, previous research has shown that the relationship between accruals and some of the explanatory variables, such as company performance, is non-linear. An alternative to linear regression that can handle non-linear relationships is neural networks. The type of neural network used in this study is the feed-forward back-propagation neural network. Three neural network-based models are compared with four commonly used linear regression-based earnings management detection models. All seven models are based on the earnings management detection model presented by Jones (1991). The performance of the models is assessed in three steps. First, a random data set of companies is used. Second, the discretionary accruals from the random data set are ranked according to six different variables. The discretionary accruals in the highest and lowest quartiles for these six variables are then compared. Third, a data set containing simulated earnings management is used. Both expense and revenue manipulation ranging between -5% and 5% of lagged total assets is simulated. Furthermore, two neural network-based models and two linear regression-based models are used with a data set containing financial statement data from 110 failed companies.
Overall, the results show that the linear regression-based models, except for the model using a piecewise linear approach, produce biased estimates of discretionary accruals. The neural network-based model with the original Jones model variables and the neural network-based model augmented with ROA as an independent variable, however, perform well in all three steps. Especially in the second step, where the highest and lowest quartiles of ranked discretionary accruals are examined, the neural network-based model augmented with ROA as an independent variable outperforms the other models.
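
The contrast the study draws (a linear regression versus a feed-forward back-propagation network that can capture non-linear relationships such as the accruals-performance link) can be sketched on synthetic data; the tiny architecture and the quadratic toy relationship below are illustrative assumptions, not the study's actual models or data:

```python
import numpy as np

def fit_linear(X, y):
    """Ordinary least squares with an intercept; returns a predictor."""
    Xb = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return lambda Z: np.column_stack([np.ones(len(Z)), Z]) @ coef

def fit_mlp(X, y, hidden=8, lr=0.1, epochs=4000, seed=0):
    """Tiny one-hidden-layer feed-forward network trained with batch
    back-propagation (tanh hidden units, linear output); returns a predictor."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0, 0.5, (X.shape[1], hidden))
    b1 = rng.normal(0, 0.5, hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1))
    b2 = np.zeros(1)
    Y = y.reshape(-1, 1)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)          # forward pass: hidden layer
        err = (H @ W2 + b2) - Y           # output error
        gW2 = H.T @ err / len(X)          # back-propagate to output weights
        gb2 = err.mean(0)
        dH = (err @ W2.T) * (1 - H ** 2)  # back-propagate through tanh
        gW1 = X.T @ dH / len(X)
        gb1 = dH.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2
    return lambda Z: (np.tanh(Z @ W1 + b1) @ W2 + b2).ravel()
```

On a non-linear relationship the linear fit is systematically biased while the network can track the curvature, which mirrors the study's motivation for augmenting Jones-type models with neural networks.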