8 results for Hazard-Based Models

in Helda - Digital Repository of the University of Helsinki


Relevance: 90.00%

Abstract:

Multi- and intralake datasets of fossil midge assemblages in surface sediments of small shallow lakes in Finland were studied to determine the most important environmental factors explaining trends in midge distribution and abundance. The aim was to develop palaeoenvironmental calibration models for the most important environmental variables for the purpose of reconstructing past environmental conditions. The developed models were applied to three high-resolution fossil midge stratigraphies from southern and eastern Finland to interpret environmental variability over the past 2000 years, with special focus on the Medieval Climate Anomaly (MCA), the Little Ice Age (LIA) and recent anthropogenic changes. The midge-based results were compared with physical properties of the sediment, historical evidence and environmental reconstructions based on diatoms (Bacillariophyta), cladocerans (Crustacea: Cladocera) and tree rings. The results showed that the most important environmental factor controlling midge distribution and abundance along a latitudinal gradient in Finland was the mean July air temperature (TJul). However, when the dataset was environmentally screened to include only pristine lakes, water depth at the sampling site became more important. Furthermore, when the dataset was geographically scaled to southern Finland, hypolimnetic oxygen conditions became the dominant environmental factor. The results from an intralake dataset from eastern Finland showed that the most important environmental factors controlling midge distribution within a lake basin were river contribution, water depth and submerged vegetation patterns. In addition, the results of the intralake dataset showed that the fossil midge assemblages represent fauna that lived in close proximity to the sampling sites, thus enabling the exploration of within-lake gradients in midge assemblages. 
Importantly, this within-lake heterogeneity in midge assemblages may have effects on midge-based temperature estimations, because samples taken from the deepest point of a lake basin may infer considerably colder temperatures than expected, as shown by the present test results. Therefore, it is suggested here that samples in fossil midge studies involving shallow boreal lakes should be taken from the sublittoral, where the assemblages are most representative of the whole-lake fauna. Transfer functions between midge assemblages and the environmental forcing factors that were significantly related to the assemblages, including mean air TJul, water depth, hypolimnetic oxygen, stream flow and distance to littoral vegetation, were developed using weighted averaging (WA) and weighted averaging-partial least squares (WA-PLS) techniques, which outperformed all the other tested numerical approaches. Application of the models in downcore studies showed mostly consistent trends. Based on the present results, which agreed with previous studies and historical evidence, the Medieval Climate Anomaly between ca. 800 and 1300 AD in eastern Finland was characterized by warm temperature conditions and dry summers, but probably humid winters. The Little Ice Age (LIA) prevailed in southern Finland from ca. 1550 to 1850 AD, with the coldest conditions occurring at ca. 1700 AD, whereas in eastern Finland the cold conditions prevailed over a longer period, from ca. 1300 until 1900 AD. The recent climatic warming was clearly represented in all of the temperature reconstructions. In terms of long-term climatology, the present results support the concept that the North Atlantic Oscillation (NAO) index correlates positively with winter precipitation and annual temperature and negatively with summer precipitation in eastern Finland.
In general, the results indicate a relatively warm climate with dry summers but snowy winters during the MCA, and a cool climate with rainy summers and dry winters during the LIA. The results of the present reconstructions, and forthcoming applications of the models, can be used in assessments of long-term environmental dynamics to refine the understanding of the past environmental reference conditions and natural variability that environmental scientists, ecologists and policy makers require when making decisions concerning ongoing global, regional and local changes. The midge-based models for temperature, hypolimnetic oxygen, water depth, littoral vegetation shift and stream flow developed in this thesis are available for scientific use on request.
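The weighted averaging (WA) approach behind the transfer functions can be sketched in a few lines: each taxon's temperature optimum is the abundance-weighted mean of the variable over the training lakes, and a sample's reconstruction is the abundance-weighted mean of those optima. The toy abundances and temperatures below are invented for illustration; real applications (WA-PLS, deshrinking, cross-validation) are considerably more involved.

```python
import numpy as np

def wa_fit(abundances, env):
    """Estimate each taxon's optimum as the abundance-weighted mean
    of the environmental variable over the training lakes."""
    # abundances: (n_lakes, n_taxa), env: (n_lakes,)
    return (abundances * env[:, None]).sum(axis=0) / abundances.sum(axis=0)

def wa_predict(abundances, optima):
    """Reconstruct the variable for a sample as the
    abundance-weighted mean of the taxon optima."""
    return (abundances * optima).sum(axis=1) / abundances.sum(axis=1)

# Toy training set: 4 lakes, 3 midge taxa, mean July air temperature (degrees C)
temps = np.array([10.0, 12.0, 14.0, 16.0])
abund = np.array([
    [5, 1, 0],   # cold-water taxon dominates
    [3, 3, 1],
    [1, 3, 3],
    [0, 1, 5],   # warm-water taxon dominates
], dtype=float)

optima = wa_fit(abund, temps)
reconstructed = wa_predict(abund, optima)  # inferred TJul per sample
```

Applied to a fossil stratigraphy, `wa_predict` would be called on each downcore sample's assemblage to produce a temperature reconstruction through time.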

Relevance: 80.00%

Abstract:

Intention-based models have been one of the main theoretical orientations in research on the implementation of information and communication technology (ICT). According to these models, actual behavior can be predicted from the intention towards the behavior: if the level of intention to use technology is high, the probability of actual ICT usage increases. The purpose of this study was to find out which factors explain vocational teachers' intention to use ICT in their teaching. In addition, teachers of media and information sciences and teachers of welfare and health were compared. The study also explored how regularly teachers applied ICT and how strong their intention to apply the technology was. This Master's thesis is a quantitative study, and the data was collected using an e-mail survey and an e-form. The instruments were based on the decomposed theory of planned behavior. The research group consisted of 22 schools of media and information sciences and 20 schools of welfare and health. The data consisted of 231 vocational teachers: 57 taught media and information sciences and 174 welfare and health. The data was analyzed using the Mann-Whitney U-test, factor analysis and regression analysis. In addition, the categorized results were compared with a previous study. In this study, the intention to use ICT in teaching was explained by the teachers' attitudes and skills and by the attitudes of their work community. However, the environment in which ICT was used, i.e., the technical environment, economic resources and time, did not explain the intention. The results did not directly support any of the intention-based models, but they could be interpreted as congruent with the technology acceptance model. The majority of the teachers used ICT at least weekly, and they had a strong intention to continue doing so in the future.
The study also revealed that teachers with a critical attitude towards ICT were more common among the teachers of welfare and health. According to the results, however, it is not possible to state that ICT is unsuitable for any particular profession, because every group that contained teachers with a critical attitude towards ICT also contained teachers with a positive attitude.
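The Mann-Whitney U-test used for the group comparisons above can be illustrated with SciPy; the Likert-style attitude scores below are fabricated purely for illustration and are not the study's data.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical 5-point attitude scores for the two teacher groups
media_scores = np.array([4, 5, 4, 3, 5, 4, 4, 5])
welfare_scores = np.array([3, 2, 4, 3, 2, 3, 4, 2])

# Two-sided rank-based test: do the two groups' attitude
# distributions differ?  No normality assumption is needed,
# which suits ordinal survey data.
stat, p_value = mannwhitneyu(media_scores, welfare_scores,
                             alternative="two-sided")
```

A small p-value would indicate a difference in attitude distributions between the groups, analogous to the comparison reported in the study.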

Relevance: 80.00%

Abstract:

Detecting Earnings Management Using Neural Networks. In trying to balance relevant and reliable accounting data, generally accepted accounting principles (GAAP) allow the company management, to some extent, to use their judgment and make subjective assessments when preparing financial statements. The opportunistic use of this discretion in financial reporting is called earnings management. A considerable number of methods have been suggested for detecting accrual-based earnings management, and a majority of them are based on linear regression. The problem with linear regression is that a linear relationship between the dependent variable and the independent variables must be assumed. However, previous research has shown that the relationship between accruals and some of the explanatory variables, such as company performance, is non-linear. An alternative to linear regression that can handle non-linear relationships is neural networks. The type of neural network used in this study is the feed-forward back-propagation neural network. Three neural network-based models are compared with four commonly used linear regression-based earnings management detection models. All seven models are based on the earnings management detection model presented by Jones (1991). The performance of the models is assessed in three steps. First, a random data set of companies is used. Second, the discretionary accruals from the random data set are ranked according to six different variables, and the discretionary accruals in the highest and lowest quartiles for these six variables are compared. Third, a data set containing simulated earnings management is used, with both expense and revenue manipulation ranging between -5% and 5% of lagged total assets. Furthermore, two neural network-based models and two linear regression-based models are applied to a data set containing financial statement data from 110 failed companies.
Overall, the results show that the linear regression-based models, except for the model using a piecewise linear approach, produce biased estimates of discretionary accruals. The neural network-based model with the original Jones model variables and the neural network-based model augmented with ROA as an independent variable, however, perform well in all three steps. Especially in the second step, where the highest and lowest quartiles of ranked discretionary accruals are examined, the neural network-based model augmented with ROA as an independent variable outperforms the other models.
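The Jones (1991) regression on which all seven models build can be sketched as follows: total accruals (scaled by lagged total assets) are regressed on the scaled inverse of lagged assets, the change in revenues, and gross property, plant and equipment, and the residuals are taken as discretionary accruals. The synthetic data below is an assumption for illustration; the original model is estimated per firm or cross-sectionally.

```python
import numpy as np

def jones_discretionary_accruals(total_accruals, lagged_assets, delta_rev, ppe):
    """Discretionary accruals = residuals of the Jones (1991) regression.
    All regressors are scaled by lagged total assets; the 1/assets
    column plays the role of the intercept in the original model."""
    ta = total_accruals / lagged_assets
    X = np.column_stack([
        1.0 / lagged_assets,        # scaled intercept term
        delta_rev / lagged_assets,  # change in revenues
        ppe / lagged_assets,        # gross property, plant and equipment
    ])
    coefs, *_ = np.linalg.lstsq(X, ta, rcond=None)
    return ta - X @ coefs  # residual = discretionary component

# Synthetic check: accruals generated exactly by the linear model
# should yield (near-)zero discretionary accruals.
rng = np.random.default_rng(0)
assets = rng.uniform(100.0, 1000.0, 50)
drev = rng.normal(0.0, 50.0, 50)
ppe = rng.uniform(50.0, 500.0, 50)
total_accruals = (0.5 / assets + 0.002 * drev / assets
                  - 0.05 * ppe / assets) * assets
da = jones_discretionary_accruals(total_accruals, assets, drev, ppe)
```

A neural-network variant, as studied in the thesis, would replace the linear map `X @ coefs` with a fitted feed-forward network on the same inputs, which is what allows the non-linear accrual-performance relationship noted above to be captured.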

Relevance: 80.00%

Abstract:

The increased availability of high-frequency data sets has led to important new insights into the understanding of financial markets. The use of high-frequency data is interesting and persuasive, since it can reveal information that cannot be seen at lower levels of data aggregation. This dissertation explores some of the many important issues connected with the use, analysis and application of high-frequency data. These include the effects of intraday seasonality, the behaviour of time-varying volatility, the information content of various market data, and the issue of inter-market linkages, utilizing high-frequency 5-minute observations from major European and U.S. stock indices, namely Germany's DAX30, France's CAC40, Switzerland's SMI, the UK's FTSE100 and the U.S. S&P500. The first essay in the dissertation shows that there are remarkable similarities in the intraday behaviour of conditional volatility across European equity markets. Moreover, U.S. macroeconomic news announcements have a significant cross-border effect on both European equity returns and volatilities. The second essay reports substantial intraday return and volatility linkages across the stock indices of the UK and Germany. This relationship appears virtually unchanged by the presence or absence of the U.S. stock market. However, the return correlation between the UK and German markets rises significantly following the U.S. stock market opening, which could largely be described as a contemporaneous effect. The third essay sheds light on market microstructure issues in which traders and market makers learn from watching market data, and it is this learning process that leads to price adjustments. This study concludes that trading volume plays an important role in explaining international return and volatility transmissions. The examination of asymmetry reveals that positive volume changes have a larger impact on foreign stock market volatility than negative changes.
The fourth and final essay documents a number of regularities in the patterns of intraday return volatility, trading volume and bid-ask spreads. This study also reports a contemporaneous and positive relationship between intraday return volatility, the bid-ask spread and unexpected trading volume. These results verify the role of trading volume and bid-ask quotes as proxies for information arrival in producing contemporaneous and subsequent intraday return volatility. Moreover, the asymmetric effect of trading volume on conditional volatility is also confirmed. Overall, this dissertation explores the role of information in explaining intraday return and volatility dynamics in international stock markets. The process through which information is incorporated into stock prices is central to all information-based models. Intraday data facilitate the investigation of how information is incorporated into security prices as a result of the trading behavior of informed and uninformed traders. High-frequency data thus appear critical in enhancing our understanding of the intraday behavior of various stock market variables, as they have important implications for market participants, regulators and academic researchers.
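One common way to characterise the intraday seasonality studied in the first essay is to average absolute 5-minute returns across days within each intraday interval. The sketch below uses simulated returns with an assumed U-shaped volatility profile (high at the open and close, low at midday), purely for illustration.

```python
import numpy as np

def intraday_seasonal_pattern(returns):
    """returns: (n_days, n_bins) matrix of 5-minute log returns.
    Returns the average absolute return per intraday bin,
    normalised so the pattern has mean 1."""
    pattern = np.abs(returns).mean(axis=0)
    return pattern / pattern.mean()

# Simulate 250 trading days x 102 five-minute bins with a
# U-shaped volatility profile across the trading day.
rng = np.random.default_rng(42)
bins = np.linspace(-1.0, 1.0, 102)
true_vol = 0.001 * (1.0 + bins**2)   # highest at open and close
returns = rng.normal(0.0, true_vol, size=(250, 102))

pattern = intraday_seasonal_pattern(returns)
```

In empirical work this pattern is typically estimated first and used to deseasonalise the returns before fitting time-varying volatility models, so that the intraday seasonal does not contaminate the volatility dynamics.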

Relevance: 40.00%

Abstract:

This thesis studies quantile residuals and uses different methodologies to develop test statistics that are applicable in evaluating linear and nonlinear time series models based on continuous distributions. Models based on mixtures of distributions are of special interest because it turns out that for those models traditional residuals, often referred to as Pearson residuals, are not appropriate. As such models have become more and more popular in practice, especially with financial time series data, there is a need for reliable diagnostic tools that can be used to evaluate them. The aim of the thesis is to show how such diagnostic tools can be obtained and used in model evaluation. The quantile residuals considered here are defined in such a way that, when the model is correctly specified and its parameters are consistently estimated, they are approximately independent with a standard normal distribution. All the tests derived in the thesis are pure significance tests and are theoretically sound in that they properly take into account the uncertainty caused by parameter estimation.

In Chapter 2, a general framework based on the likelihood function and smooth functions of univariate quantile residuals is derived that can be used to obtain misspecification tests for various purposes. Three easy-to-use tests aimed at detecting non-normality, autocorrelation, and conditional heteroscedasticity in quantile residuals are formulated. It also turns out that these tests can be interpreted as Lagrange multiplier or score tests, so that they are asymptotically optimal against local alternatives. Chapter 3 extends the concept of quantile residuals to multivariate models. The framework of Chapter 2 is generalized, and tests aimed at detecting non-normality, serial correlation, and conditional heteroscedasticity in multivariate quantile residuals are derived based on it.
Score test interpretations are obtained for the serial correlation and conditional heteroscedasticity tests and, in a rather restricted special case, for the normality test. In Chapter 4, the tests are constructed using the empirical distribution function of quantile residuals. The so-called Khmaladze martingale transformation is applied in order to eliminate the uncertainty caused by parameter estimation. Various test statistics are considered so that critical bounds are obtained for histogram-type plots as well as for Quantile-Quantile and Probability-Probability plots of quantile residuals. Chapters 2, 3, and 4 contain simulations and empirical examples that illustrate the finite-sample size and power properties of the derived tests and show how the tests and related residual-based graphical tools are applied in practice.
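The quantile residuals described above are standardly computed as r_t = Φ⁻¹(F(y_t)), where F is the model's (conditional) distribution function and Φ⁻¹ the standard normal quantile function; under a correctly specified model they are approximately iid N(0,1). A minimal sketch for a two-component normal mixture, the model class the thesis highlights (the mixture parameters below are invented for illustration):

```python
import numpy as np
from scipy.stats import norm

def quantile_residuals(y, cdf):
    """r_t = Phi^{-1}(F(y_t)); approximately iid N(0,1)
    when the model, and hence F, is correctly specified."""
    u = np.clip(cdf(y), 1e-12, 1.0 - 1e-12)  # guard against 0 and 1
    return norm.ppf(u)

def mixture_cdf(y, w=0.3, mu=(0.0, 3.0), sigma=(1.0, 0.5)):
    """cdf of a two-component normal mixture (illustrative parameters)."""
    return (w * norm.cdf(y, mu[0], sigma[0])
            + (1 - w) * norm.cdf(y, mu[1], sigma[1]))

# Data simulated from the same mixture: the quantile residuals
# should then look like draws from a standard normal.
rng = np.random.default_rng(1)
comp = rng.random(5000) < 0.3
y = np.where(comp, rng.normal(0.0, 1.0, 5000), rng.normal(3.0, 0.5, 5000))
r = quantile_residuals(y, mixture_cdf)
```

Pearson residuals (y minus the mixture mean, standardised) would be bimodal here and thus misleading, which is exactly why the thesis builds its diagnostic tests on quantile residuals instead.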