12 results for "additive interpolation error expansion"

in Helda - Digital Repository of University of Helsinki


Relevance: 20.00%

Abstract:

Digital elevation models (DEMs) have been an important topic in geography and the surveying sciences for decades, owing to their geomorphological importance as the reference surface for gravitation-driven material flow and to their wide range of uses and applications. When a DEM is used in terrain analysis, for example in automatic drainage basin delineation, errors of the model accumulate in the analysis results. Investigation of this phenomenon is known as error propagation analysis, which has a direct influence on decision-making based on interpretations and applications of terrain analysis, and may indirectly influence data acquisition and DEM generation. The focus of the thesis was on fine toposcale DEMs, which are typically represented in a 5-50 m grid and used at application scales of 1:10 000 to 1:50 000. The thesis presents a three-step framework for investigating error propagation in DEM-based terrain analysis. The framework includes methods for visualising the morphological gross errors of DEMs, exploring the statistical and spatial characteristics of the DEM error, performing analytical and simulation-based error propagation analyses, and interpreting the results. The DEM error model was built using geostatistical methods. The results show that appropriate and exhaustive reporting of the various aspects of fine toposcale DEM error is a complex task. This is due to the high number of outliers in the error distribution and to morphological gross errors, which are detectable with the presented visualisation methods. In addition, a global characterisation of DEM error is a gross generalisation of reality, because the areas within which the assumption of stationarity is not violated are small in extent. This was shown using an exhaustive high-quality reference DEM based on airborne laser scanning together with local semivariogram analysis. The error propagation analysis revealed that, as expected, an increase in the DEM vertical error increases the error in surface derivatives. However, contrary to expectations, the spatial autocorrelation of the model appears to have varying effects on the error propagation analysis depending on the application. A spatially uncorrelated DEM error model has been considered a 'worst-case scenario', but this view is now challenged, because none of the DEM derivatives investigated in the study had maximum variation under spatially uncorrelated random error. A significant performance improvement was achieved in simulation-based error propagation analysis by applying process convolution to generate realisations of the DEM error model. In addition, a typology of uncertainty in drainage basin delineations is presented.
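
The simulation-based error propagation analysis and the process-convolution error model lend themselves to a short illustration. The following is a minimal sketch, not the thesis's implementation: a spatially autocorrelated error field is generated by smoothing white noise with a Gaussian kernel (a simple form of process convolution), added to a DEM, and propagated through a slope derivative by Monte Carlo. The grid size, vertical error and correlation range are hypothetical parameters.

```python
# Monte Carlo error propagation for a DEM derivative (slope).
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(42)
dem = rng.random((200, 200)) * 50.0   # stand-in for a real fine toposcale DEM
cell = 10.0                           # grid resolution (m), assumed
sigma_z = 1.0                         # DEM vertical error std (m), assumed
corr_range_cells = 5                  # spatial autocorrelation range, assumed

def slope_deg(z, cell):
    """Slope in degrees from a DEM via finite differences."""
    dzdy, dzdx = np.gradient(z, cell)
    return np.degrees(np.arctan(np.hypot(dzdx, dzdy)))

def error_realisation(shape, sigma_z, corr_cells, rng):
    """Process convolution: smooth white noise, then rescale to sigma_z."""
    noise = rng.standard_normal(shape)
    field = gaussian_filter(noise, corr_cells)
    return field * (sigma_z / field.std())

# Propagate the error model through the slope derivative 100 times.
realisations = np.stack([
    slope_deg(dem + error_realisation(dem.shape, sigma_z, corr_range_cells, rng), cell)
    for _ in range(100)
])
print("per-cell slope std (deg), mean over grid:", realisations.std(axis=0).mean())
```

Setting `corr_range_cells` to zero recovers the spatially uncorrelated 'worst-case' model the abstract discusses, so the effect of autocorrelation on the propagated error can be compared directly.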

Relevance: 20.00%

Abstract:

This thesis addresses the modeling of financial time series, especially stock market returns and daily price ranges. Data of this kind can be approached with so-called multiplicative error models (MEM). These models nest several well-known time series models, such as GARCH, ACD and CARR models, and are able to capture many well-established features of financial time series, including volatility clustering and leptokurtosis. In contrast to these phenomena, different kinds of asymmetries have received relatively little attention in the existing literature. In this thesis asymmetries arise from various sources: they are observed in both conditional and unconditional distributions, for variables with non-negative values and for variables with values on the real line. In the multivariate context asymmetries can be observed in the marginal distributions as well as in the relationships between the modeled variables. New methods for all these cases are proposed. Chapter 2 considers GARCH models and the modeling of returns of two stock market indices. The chapter introduces the generalized hyperbolic (GH) GARCH model to account for asymmetries in both the conditional and the unconditional distribution. In particular, two special cases of the GARCH-GH model which describe the data most accurately are proposed. They are found to improve the fit of the model compared to symmetric GARCH models, and the advantages of accounting for asymmetries are also observed in Value-at-Risk applications. Both theoretical and empirical contributions are provided in Chapter 3. In this chapter the mixture conditional autoregressive range (MCARR) model is introduced, examined and applied to daily price ranges of the Hang Seng Index. The conditions for the strict and weak stationarity of the model, as well as an expression for the autocorrelation function, are obtained by writing the MCARR model as a first-order autoregressive process with random coefficients. The chapter also introduces the inverse gamma (IG) distribution to CARR models. The advantages of the CARR-IG and MCARR-IG specifications over conventional CARR models are found in the empirical application both in- and out-of-sample. Chapter 4 discusses the simultaneous modeling of absolute returns and daily price ranges. In this part of the thesis a vector multiplicative error model (VMEM) with an asymmetric Gumbel copula is found to provide substantial benefits over the existing VMEM models based on elliptical copulas. The proposed specification is able to capture the highly asymmetric dependence of the modeled variables, thereby improving the performance of the model considerably. The economic significance of the results is established by examining the information content of the derived volatility forecasts.
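
To make the MEM structure concrete, here is a minimal sketch of the multiplicative error recursion that GARCH, ACD and CARR models share, in CARR(1,1)-type form: the observed non-negative variable is the product of a conditional mean and a unit-mean innovation. The parameter values and the exponential innovation are illustrative assumptions, not estimates from the thesis.

```python
# Simulate a CARR(1,1)-type multiplicative error model: x_t = mu_t * eps_t,
# mu_t = omega + alpha * x_{t-1} + beta * mu_{t-1}, with E[eps_t] = 1.
import numpy as np

rng = np.random.default_rng(0)
T = 1000
omega, alpha, beta = 0.05, 0.15, 0.80   # hypothetical CARR(1,1) parameters

x = np.empty(T)
mu = np.empty(T)
mu[0] = omega / (1.0 - alpha - beta)    # unconditional mean as starting value
x[0] = mu[0] * rng.exponential(1.0)
for t in range(1, T):
    mu[t] = omega + alpha * x[t - 1] + beta * mu[t - 1]   # conditional mean
    x[t] = mu[t] * rng.exponential(1.0)                   # multiplicative error

print("sample mean:", x.mean(), "implied unconditional mean:", omega / (1 - alpha - beta))
```

Swapping the exponential innovation for an asymmetric unit-mean distribution (or the inverse gamma the thesis introduces) changes only the last factor, which is what makes the MEM family a convenient frame for studying asymmetries.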

Relevance: 20.00%

Abstract:

Climate change will influence the living conditions of all life on Earth. For some species the change in environmental conditions that has occurred so far has already increased the risk of extinction, and the extinction risk is predicted to increase for large numbers of species in the future. Some species may have time to adapt to the changing conditions, but for many the rate and magnitude of the change are too great to allow survival via evolutionary change. Species responses to climate change have been documented for some decades. Some groups of species, like many insects, respond readily to changes in temperature conditions and have shifted their distributions northwards to new climatically suitable regions. Such range shifts have been well documented, especially in the temperate zones. In this context, butterflies have been studied more than any other group of species, partly because their past geographical ranges are well documented, which facilitates species-climate modelling and other analyses. The aim of such modelling studies is to examine to what extent shifts in species distributions can be explained by climatic and other factors; models can also be used to predict the future distributions of species. In this thesis, I have studied the response to climate change of one butterfly species within one geographically restricted area. The study species, the European map butterfly (Araschnia levana), has expanded rapidly northwards in Finland during the last two decades. I used statistical and dynamic modelling approaches in combination with field studies to analyse the effects of climate warming and landscape structure on the expansion. I also studied the possible role of molecular variation in phosphoglucose isomerase (PGI), a glycolytic enzyme affecting flight metabolism and thereby flight performance, in the observed expansion of the map butterfly at two separate expansion fronts in Finland. The expansion rate of the map butterfly was shown to be correlated with the frequency of warmer-than-average summers during the study period. This result is in line with the greater probability of occurrence of the second generation during warm summers and with previous results showing greater mobility of second- than first-generation individuals; a field study in this thesis likewise indicated low mobility of the first-generation butterflies. Climatic variables alone were not sufficient to explain the observed expansion in Finland, and there are also problems in transferring the climate model to regions other than those from which the data used to construct it came: the climate model predicted a wider distribution in the south-western part of Finland than has been observed. Dynamic modelling of the expansion in response to landscape structure suggested that habitat and landscape structure influence the rate of expansion; in southern Finland the landscape structure may have slowed it down. The results on PGI suggested that allelic variation in this enzyme may influence flight performance and thereby the rate of expansion. Genetic differences between the populations at the two expansion fronts may explain, at least partly, the observed differences in the rate of expansion: individuals with the genotype associated with a high flight metabolic rate were most frequent in eastern Finland, where the rate of range expansion has been highest.

Relevance: 20.00%

Abstract:

Visual acuities at the time of referral and on the day before surgery were compared in 124 patients operated on for cataract in Vaasa Central Hospital, Finland. Preoperative visual acuity and the occurrence of ocular and general disease were compared in samples of consecutive cataract extractions performed in 1982, 1985, 1990, 1995 and 2000 in two hospitals in the Vaasa region in Finland. The repeatability and the standard deviation of random measurement error in visual acuity and refractive error determination in a clinical environment were estimated in cataractous, pseudophakic and healthy eyes by re-examining the visual acuity and refractive error of patients referred for cataract surgery or consultation by ophthalmic professionals. Altogether 99 eyes of 99 persons (41 cataractous, 36 pseudophakic and 22 healthy eyes) with a visual acuity range of Snellen 0.3 to 1.3 (0.52 to -0.11 logMAR) were examined. During an average waiting time of 13 months, visual acuity in the study eye declined from 0.68 logMAR to 0.96 logMAR (from 0.2 to 0.1 in Snellen decimal values), an average decrease of 0.27 logMAR per year. In the fastest quartile, the visual acuity change per year was 0.75 logMAR, and in the second fastest 0.29 logMAR; the third and fourth quartiles were virtually unaffected. From 1982 to 2000, the incidence of cataract surgery increased from 1.0 to 7.2 operations per 1000 inhabitants per year in the Vaasa region. Over this period the average preoperative visual acuity improved by 0.85 logMAR in the operated eye (in decimal values from 0.03 to 0.2) and by 0.27 logMAR in the better eye (in decimal values from 0.23 to 0.43). The proportion of patients profoundly visually handicapped before the operation (VA in the better eye <0.1) fell from 15% to 4%, and that of patients less profoundly visually handicapped (VA in the better eye 0.1 to <0.3) from 47% to 15%. The repeatability of visual acuity measurement, estimated as a coefficient of repeatability, was ±0.18 logMAR for all 99 eyes, and the standard deviation of measurement error was 0.06 logMAR. Eyes with the lowest visual acuity (0.3-0.45) had the largest variability, with a coefficient of repeatability of ±0.24 logMAR, while eyes with a visual acuity of 0.7 or better had the smallest, ±0.12 logMAR. The repeatability of refractive error measurement was studied in the same patients. Differences between measurements 1 and 2 were calculated as three-dimensional vector values and as spherical equivalents, and expressed as coefficients of repeatability. The coefficients of repeatability for all eyes for the vertical, torsional and horizontal vectors were ±0.74 D, ±0.34 D and ±0.93 D, respectively, and for the spherical equivalent ±0.74 D. Eyes with lower visual acuity (0.3-0.45) had larger variability in vector and spherical equivalent values (±1.14 D), but the difference between visual acuity groups was not statistically significant. The difference in the mean defocus equivalent between measurements 1 and 2 was, however, significantly greater in the lower visual acuity group. If a change of ±0.5 D (measured in defocus equivalents) is accepted as a basis for a change of spectacles for eyes with good vision, the corresponding basis for eyes in the visual acuity range 0.3-0.65 would be ±1 D. Differences in repeated visual acuity measurements are partly explained by errors in refractive error measurements.
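
Since the abstract switches between Snellen decimal and logMAR notation and leans on the coefficient of repeatability, a short sketch of both computations may help. The conversion is logMAR = -log10(Snellen decimal), and the coefficient of repeatability is taken here Bland-Altman style as 1.96 times the standard deviation of test-retest differences; the measurement values below are made up for illustration, not study data.

```python
# logMAR conversion and coefficient of repeatability for test-retest data.
import numpy as np

def snellen_to_logmar(decimal_acuity):
    """logMAR = -log10(decimal Snellen acuity); e.g. 0.1 -> 1.0, 1.0 -> 0.0."""
    return -np.log10(decimal_acuity)

# Hypothetical repeated logMAR measurements of the same eyes (visit 1, visit 2).
visit1 = np.array([0.50, 0.30, 0.10, 0.40, 0.00, 0.20])
visit2 = np.array([0.40, 0.36, 0.14, 0.30, 0.04, 0.26])

diff = visit1 - visit2
# Coefficient of repeatability: 1.96 * SD of the differences, i.e. the
# interval expected to contain 95% of test-retest differences.
cor = 1.96 * diff.std(ddof=1)
print(f"Snellen 0.2 = {snellen_to_logmar(0.2):.2f} logMAR")
print(f"coefficient of repeatability: ±{cor:.2f} logMAR")
```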

Relevance: 20.00%

Abstract:

Cosmological observations of light from type Ia supernovae, the cosmic microwave background and the galaxy distribution seem to indicate that the expansion of the universe has accelerated during the latter half of its age. Within standard cosmology, this is ascribed to dark energy, a uniform fluid with large negative pressure that gives rise to repulsive gravity but also entails serious theoretical problems. Understanding the physical origin of the perceived accelerated expansion has been described as one of the greatest challenges in theoretical physics today. In this thesis, we discuss the possibility that, instead of dark energy, the acceleration is caused by an effect of nonlinear structure formation on light that is ignored in standard cosmology. A physical interpretation of the effect goes as follows: as the initially smooth matter clusters with time into filaments of opaque galaxies, the regions through which the detectable light travels become emptier and emptier relative to the average. Since a developing void expands the faster the lower its matter density becomes, the expansion can accelerate along our line of sight without local acceleration anywhere, potentially obviating the need for the mysterious dark energy. In addition to offering a natural physical interpretation of the acceleration, we have shown that an inhomogeneous model is able to match the main cosmological observations without dark energy, resulting in a concordant picture of the universe with 90% dark matter, 10% baryonic matter and 15 billion years as the age of the universe. The model also provides a natural solution to the coincidence problem: if induced by the voids, the onset of the perceived acceleration coincides with the formation of the voids. Future tests include quantitative predictions for angular deviations, and a theoretical derivation of the model would reduce the required phenomenology. A spin-off of the research is a physical classification of cosmic inhomogeneities according to how they could induce accelerated expansion along our line of sight. We have identified three physically distinct mechanisms: global acceleration due to spatial variations in the expansion rate, a faster local expansion rate due to a large local void, and biased light propagation through voids that expand faster than the average. A general conclusion is that the physical properties crucial for accounting for the perceived acceleration are the growth of the inhomogeneities and the inhomogeneities in the expansion rate. The existence of these properties in the real universe is supported by both observational data and theoretical calculations. However, better data and more sophisticated theoretical models are required to vindicate or disprove the conjecture that the inhomogeneities are responsible for the acceleration.

Relevance: 20.00%

Abstract:

The indigenous cloud forests of the Taita Hills have suffered substantial degradation over several centuries due to agricultural expansion; currently, only 1% of the original forested area remains preserved. Furthermore, climate change poses an imminent threat to the local economy and environmental sustainability. In such circumstances, developing tools to reconcile socioeconomic growth with the conservation of natural resources is an enormous challenge. This dissertation tackles essential aspects of understanding the ongoing agricultural activities in the Taita Hills and their potential environmental consequences. First, alternative methods were designed to improve our understanding of the ongoing agricultural activities: methods for agricultural survey planning and for estimating evapotranspiration were evaluated, taking into account a number of limitations on data and resource availability. Next, the dissertation evaluates how upcoming agricultural expansion, together with climate change, will affect the natural resources of the Taita Hills up to the year 2030. The driving forces of agricultural expansion in the region were identified in order to delineate future landscape scenarios and to evaluate potential impacts from the soil and water conservation point of view. To investigate these issues and answer the research questions, the dissertation combined state-of-the-art modelling tools with established statistical methods. The results indicate that, if current trends persist, agricultural areas will occupy roughly 60% of the study area by 2030. Although the simulated land use changes will certainly increase soil erosion, new croplands are likely to emerge predominantly in the lowlands, which have lower soil erosion potential. By 2030, rainfall erosivity is likely to increase during April and November due to climate change. Finally, the thesis addressed the potential impacts of agricultural expansion and climate change on irrigation water requirements (IWR), another major issue in the relations between land use and climate. Although the simulations indicate that climate change will likely increase annual rainfall volumes during the following decades, IWR will continue to grow due to agricultural expansion: by 2030, new cropland areas may increase the annual volume of water needed for irrigation by approximately 40%.
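
For a flavour of the low-data evapotranspiration and irrigation estimates the dissertation deals with, the following is a minimal sketch using the Hargreaves reference evapotranspiration equation and a simple water-balance irrigation requirement. All inputs, the crop coefficient and the monthly aggregation are hypothetical assumptions; the dissertation's actual methods may differ.

```python
# Hargreaves reference ET and a simple irrigation water requirement (IWR).
import math

def hargreaves_et0(tmin_c, tmax_c, ra_mm_day):
    """Reference ET (mm/day): ET0 = 0.0023 * Ra * (Tmean + 17.8) * sqrt(Tmax - Tmin)."""
    tmean = (tmin_c + tmax_c) / 2.0
    return 0.0023 * ra_mm_day * (tmean + 17.8) * math.sqrt(tmax_c - tmin_c)

def irrigation_requirement(et_crop_mm, effective_rain_mm):
    """Crop water need minus effective rainfall, floored at zero."""
    return max(0.0, et_crop_mm - effective_rain_mm)

# Hypothetical daily values for a lowland plot (Ra = extraterrestrial
# radiation expressed in mm/day of evaporation equivalent).
et0 = hargreaves_et0(tmin_c=18.0, tmax_c=30.0, ra_mm_day=14.5)
kc = 1.05                                  # illustrative crop coefficient
iwr = irrigation_requirement(kc * et0 * 30, effective_rain_mm=60.0)  # monthly
print(f"ET0 ≈ {et0:.2f} mm/day, monthly IWR ≈ {iwr:.1f} mm")
```

The Hargreaves equation is a standard choice when only temperature data are available, which is the kind of data-scarcity constraint the abstract mentions.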

Relevance: 20.00%

Abstract:

This paper is concerned with using the bootstrap to obtain improved critical values for the error correction model (ECM) cointegration test in dynamic models. In the paper we investigate the effects of dynamic specification on the size and power of the ECM cointegration test with bootstrap critical values. The results from a Monte Carlo study show that the size of the bootstrap ECM cointegration test is close to the nominal significance level. We find that overspecification of the lag length results in a loss of power. Underspecification of the lag length results in size distortion. The performance of the bootstrap ECM cointegration test deteriorates if the correct lag length is not used in the ECM. The bootstrap ECM cointegration test is therefore not robust to model misspecification.
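
The bootstrap procedure at the heart of the paper can be sketched compactly: estimate the ECM under the null of no cointegration, resample the restricted residuals to rebuild the dependent variable, and take quantiles of the re-estimated error-correction t-statistics as critical values. The following minimal sketch uses a single fixed lag-zero specification on simulated random walks; the paper's Monte Carlo design, lag handling and data-generating processes are naturally richer.

```python
# Residual bootstrap of critical values for an ECM cointegration t-test.
import numpy as np

rng = np.random.default_rng(1)

def ols_tstat(y, X, col):
    """OLS t-statistic on coefficient `col` of regressing y on X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = resid @ resid / (len(y) - X.shape[1])
    cov = s2 * np.linalg.inv(X.T @ X)
    return beta[col] / np.sqrt(cov[col, col])

# Simulated data satisfying the null: two independent random walks.
T = 200
x = np.cumsum(rng.standard_normal(T))
y = np.cumsum(rng.standard_normal(T))
dy, dx = np.diff(y), np.diff(x)

# Unrestricted ECM: dy_t = a + g*y_{t-1} + d*x_{t-1} + f*dx_t + u_t.
# The cointegration test statistic is the t-ratio on g (column 1).
X_u = np.column_stack([np.ones(T - 1), y[:-1], x[:-1], dx])
t_obs = ols_tstat(dy, X_u, col=1)

# Restricted model under the null of no cointegration (no level terms).
X_r = np.column_stack([np.ones(T - 1), dx])
beta_r, *_ = np.linalg.lstsq(X_r, dy, rcond=None)
u_hat = dy - X_r @ beta_r

# Bootstrap: rebuild y under the null, re-estimate the t-ratio each time.
t_boot = []
for _ in range(499):
    dy_b = X_r @ beta_r + rng.choice(u_hat, size=T - 1, replace=True)
    y_b = y[0] + np.concatenate([[0.0], np.cumsum(dy_b)])
    X_b = np.column_stack([np.ones(T - 1), y_b[:-1], x[:-1], dx])
    t_boot.append(ols_tstat(np.diff(y_b), X_b, col=1))

print(f"observed t = {t_obs:.2f}, bootstrap 5% critical value = {np.percentile(t_boot, 5):.2f}")
```

The lag-length sensitivity the paper reports would enter here through extra lagged-difference columns in both regressions: too many columns cost power, too few distort the size of the test.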

Relevance: 20.00%

Abstract:

A local algorithm with local horizon r is a distributed algorithm that runs in r synchronous communication rounds, where r is a constant that does not depend on the size of the network. As a consequence, the output of a node in a local algorithm depends only on the input within r hops of the node. We give tight bounds on the local horizon for a class of local algorithms for combinatorial problems on unit-disk graphs (UDGs). Most of our bounds are due to a refined analysis of existing approaches, while others are obtained from new algorithms. The algorithms we consider are based on network decompositions guided by a rectangular tiling of the plane, and are applied to matching, independent set, graph colouring, vertex cover and dominating set. We also study local algorithms on quasi-UDGs, a popular generalisation of UDGs aimed at more realistic modelling of communication between network nodes. Analysing local algorithms on quasi-UDGs allows one to assume that the nodes know their coordinates only approximately, up to an additive error. Despite this localisation error, the quality of the solutions on quasi-UDGs remains the same as on UDGs with perfect location awareness; we analyse the increase in the local horizon that comes with moving from UDGs to quasi-UDGs.
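
The rectangular-tiling decomposition can be illustrated with a small, centralized simulation: partition the plane into unit squares, give each tile the class (tx mod 3, ty mod 3) so that distinct same-class tiles are at least two tile-widths apart in some axis and hence contain no mutually adjacent UDG nodes, and process the classes one by one. The sketch below computes a maximal independent set this way; it illustrates the tiling idea under these assumptions, not the thesis's distributed algorithms or bounds.

```python
# Tile-scheduled greedy maximal independent set on a unit-disk graph.
import math
from itertools import combinations

def udg_edges(points, radius=1.0):
    """Unit-disk graph: an edge joins nodes at Euclidean distance <= radius."""
    return {(i, j) for (i, p), (j, q) in combinations(enumerate(points), 2)
            if math.dist(p, q) <= radius}

def tile_of(p, side=1.0):
    return (int(p[0] // side), int(p[1] // side))

def tiled_mis(points, radius=1.0):
    adj = {i: set() for i in range(len(points))}
    for i, j in udg_edges(points, radius):
        adj[i].add(j); adj[j].add(i)
    selected = set()
    # Same-class tiles never share a UDG edge, so in a distributed setting
    # all tiles of one class could run "in parallel" within a constant horizon.
    for cls in [(a, b) for a in range(3) for b in range(3)]:
        for i, p in enumerate(points):
            tx, ty = tile_of(p)
            if (tx % 3, ty % 3) == cls and not (adj[i] & selected):
                selected.add(i)   # greedy: take node if no chosen neighbour
    return selected

pts = [(0.2, 0.3), (0.9, 0.4), (2.5, 0.5), (2.6, 1.4), (4.1, 0.2), (0.5, 2.8)]
print("independent set:", sorted(tiled_mis(pts)))
```

For quasi-UDGs with an additive localisation error, the same scheme works after enlarging the tiles or the class spacing so that the separation argument still holds, which is one intuition for why only the local horizon grows.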

Relevance: 20.00%

Abstract:

Trafficking in human beings has become one of the most talked-about criminal concerns of the 21st century. But this is not all that it has become: trafficking has also been declared one of the most pressing human rights issues of our time, and in this sense it has become part of the expansion of the human rights phenomenon. Although it is easy to see that the crime of trafficking violates several of the human rights of its victims, it is still, in essence, a fairly conventional, although particularly heinous and often transnational, crime, consisting of acts between private actors and therefore lacking the vertical effect traditionally associated with human rights violations. This thesis asks, then, why, and how, the anti-trafficking campaign has been translated into human rights language. And even more fundamentally: in light of the critical theoretical studies of the expansion of the human rights phenomenon, especially those of Costas Douzinas, who has declared that we have come to the end of human rights as a consequence of the expansion and bureaucratization of the phenomenon, can human rights actually bring salvation to the victims of trafficking? The thesis demonstrates that translating the anti-trafficking campaign into human rights language has been a complicated process involving various actors, including scholars, feminist NGOs, local activists and global human rights NGOs. It has also been driven by a complicated web of interests, the most prevalent of them, the sincere will to help the victims, having become entangled with other aims, such as political, economic and structural goals. As a consequence of this fragmented background, the human rights approach to trafficking is still seeking its final form and consists of several different claims. After assessing these claims from a legal perspective, the thesis concludes that the approach is most relevant to the mistreatment of victims of trafficking at the hands of state authorities. It seems to be quite common that authorities have trouble identifying the victims of trafficking, which means that the rights granted to them in international and national documents are not realized in practice; instead, victims of trafficking are systematically deported as illegal immigrants. It is argued that in order to understand the measures of the authorities, and to assess the usefulness of human rights, it is necessary to adopt a Foucauldian perspective and to view these measures as biopolitical defence mechanisms. From a biopolitical perspective, the victims of trafficking can be seen as a threat to the population, a threat that must be eliminated either by assimilating them into the main population with the help of disciplinary techniques or by excluding them completely from society. This biopolitical aim is accomplished through an impenetrable net of seemingly insignificant practices and discourses of which not even the participants are aware. As a result of these practices and discourses, trafficking victims, only very few of whom fit the myth of the perfect victim produced by biopolitical discourses, become invisible and therefore subject to deportation as (risky) illegal immigrants, turning them into bare life in the Agambenian sense, represented by the homo sacer, who cannot be sacrificed yet does not enjoy the protection of society and its laws. It is argued, following Jacques Rancière and Slavoj Žižek, that human rights can, through their universality and formal equality, provide bare life with the tools to formulate political claims, and thus to use the politicization produced by its very exclusion to return to the sphere of power and politics. Even though human rights have inevitably become entangled with biopolitical practices, they are still perhaps the most efficient way to challenge biopower. Human rights have not, therefore, become useless for the victims of trafficking, but they must be conceived of as a universal tool for formulating political claims and challenging power. In the case of trafficking this means that human rights must be used to constantly renegotiate the borders of the problematic concept of the victim of trafficking created by international instruments, policies and discourses, including those sincerely aimed at helping the victims.

Relevance: 20.00%

Abstract:

In this paper we present simple methods for the construction and evaluation of finite-state spell-checking tools using an existing finite-state lexical automaton, freely available finite-state tools, and Internet corpora acquired from projects such as Wikipedia. As an example, we use a freely available open-source implementation of Finnish morphology, built with traditional finite-state morphology tools, and demonstrate the rapid building of Northern Sámi and English spell checkers from tools and resources available on the Internet.
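
As a rough illustration of the same corpus-to-spell-checker pipeline, here is a deliberately non-finite-state sketch: a word-frequency lexicon is read off a raw corpus (standing in for Wikipedia text), and suggestions are generated Norvig-style from edit-distance-1 candidates ranked by frequency. The paper's actual tools encode the lexicon and error model as finite-state transducers; the function names and toy corpus below are illustrative.

```python
# Toy spell checker: corpus -> frequency lexicon -> edit-distance-1 suggestions.
import re
from collections import Counter

def build_lexicon(corpus_text):
    """Word-frequency lexicon from raw corpus text (e.g. a Wikipedia dump)."""
    return Counter(re.findall(r"[a-zåäöšž]+", corpus_text.lower()))

ALPHABET = "abcdefghijklmnopqrstuvwxyzåäöšž"

def edits1(word):
    """All strings at Levenshtein distance 1 from `word`."""
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = [a + b[1:] for a, b in splits if b]
    swaps = [a + b[1] + b[0] + b[2:] for a, b in splits if len(b) > 1]
    replaces = [a + c + b[1:] for a, b in splits if b for c in ALPHABET]
    inserts = [a + c + b for a, b in splits for c in ALPHABET]
    return set(deletes + swaps + replaces + inserts)

def suggest(word, lexicon):
    if word in lexicon:
        return [word]                      # already correct
    candidates = edits1(word) & lexicon.keys()
    return sorted(candidates, key=lexicon.get, reverse=True) or [word]

lexicon = build_lexicon("the quick brown fox jumps over the lazy dog the fox")
print(suggest("foz", lexicon))   # -> ['fox']
```

In the finite-state setting, `build_lexicon` corresponds to compiling the corpus or morphological analyser into a lexical automaton, and `edits1` to composing the input with an error-model transducer, which scales far better than enumerating candidate strings.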