10 results for spatial trend analysis

in Helda - Digital Repository of University of Helsinki


Relevance:

90.00%

Abstract:

Palaeoenvironments of the latter half of the Weichselian ice age and the transition to the Holocene, from ca. 52 to 4 ka, were investigated using isotopic analysis of oxygen, carbon and strontium in mammal skeletal apatite. The study material consisted predominantly of subfossil bones and teeth of the woolly mammoth (Mammuthus primigenius Blumenbach), collected from Europe and Wrangel Island, northeastern Siberia. All samples have been radiocarbon dated, and their ages range from >52 ka to 4 ka. Altogether, 100 specimens were sampled for the isotopic work. In Europe, the studies focused on the glacial palaeoclimate and habitat palaeoecology. To minimise the influence of possible diagenetic effects, the palaeoclimatological and ecological reconstructions were based on the enamel samples only. The results of the oxygen isotope analysis of mammoth enamel phosphate from Finland and adjacent northwestern Russia, Estonia, Latvia, Lithuania, Poland, Denmark and Sweden provide the first estimate of oxygen isotope values in glacial precipitation in northern Europe. The glacial precipitation oxygen isotope values range from ca. -9.2±1.5 ‰ in western Denmark to -15.3 ‰ in Kirillov, northwestern Russia. These values are 0.6-4.1 ‰ lower than those in present-day precipitation, with the largest changes recorded in the currently marine-influenced southern Sweden and the Baltic region. The new enamel-derived oxygen isotope data from this study, combined with oxygen isotope records from earlier investigations on mammoth tooth enamel and palaeogroundwaters, facilitate a reconstruction of the spatial patterns of the oxygen isotope values of precipitation and palaeotemperatures over much of Europe. The reconstructed geographic pattern of oxygen isotope levels in precipitation during 52-24 ka reflects the progressive isotopic depletion of air masses moving northeast, consistent with a westerly source of moisture for the entire region and a circulation pattern similar to that of the present day. The application of regionally varied δ/T-slopes, estimated from palaeogroundwater data and modern spatial correlations, yields reasonable estimates of glacial surface temperatures in Europe and implies 2-9°C lower long-term mean annual surface temperatures during the glacial period. The isotopic composition of carbon in the enamel samples indicates a pure C3 diet for the European mammoths, in agreement with previous investigations of mammoth ecology. A faint geographical gradient in the carbon isotope values of enamel is discernible, with more negative values in the northeast. The spatial trend is consistent with the climatic implications of the enamel oxygen isotope data, but may also suggest regional differences in habitat openness. The palaeogeographical changes caused by the eustatic rise of global sea level at the end of the Weichselian ice age were investigated on Wrangel Island, using the strontium isotope (Sr-87/Sr-86) ratios in the skeletal apatite of the local mammoth fauna. The diagenetic evaluations suggest good preservation of the original Sr isotope ratios, even in the bone specimens included in the study material. To estimate present-day environmental Sr isotope values on Wrangel Island, bioapatite samples from modern reindeer and muskoxen, as well as surface waters from rivers and ice wedges, were analysed. A significant shift towards more radiogenic bioapatite Sr isotope ratios, from 0.71218 ± 0.00103 to 0.71491 ± 0.00138, marks the beginning of the Holocene.
This implies a change in the migration patterns of the mammals, ultimately reflecting the inundation of the mainland connection and the isolation of the population. The bioapatite Sr isotope data support published coastline reconstructions placing the separation from the mainland at ca. 10-10.5 ka ago. The shift towards more radiogenic Sr isotope values in mid-Holocene subfossil remains after 8 ka ago reflects the rapid rise of sea level from 10 to 8 ka, resulting in a considerable reduction of the accessible range area on early Wrangel Island.
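As a rough illustration of the δ/T-slope reasoning described above, the sketch below converts a depletion in the oxygen isotope value of precipitation into an implied temperature change under an assumed linear δ18O-temperature relation; the slope value and example numbers are illustrative and are not taken from the thesis.

```python
# Illustrative only: linear relation delta18O = a*T + b, so dT = d(delta18O) / a.
def delta_t_from_delta18o(d18o_glacial, d18o_modern, slope_permil_per_degc=0.6):
    """Implied change in long-term mean annual temperature (glacial minus modern), in degC."""
    return (d18o_glacial - d18o_modern) / slope_permil_per_degc

# Example: a 2.5 permil depletion relative to modern precipitation
print(delta_t_from_delta18o(-13.0, -10.5))  # approximately -4.2 degC
```

With depletions of the magnitude reported above and slopes of this order, a cooling of several degrees follows, broadly in line with the 2-9°C estimate obtained with regionally varied slopes.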

Relevance:

80.00%

Abstract:

The United States is the world's single biggest market area, where the demand for graphic papers has increased by 80% during the last three decades. However, during the last two decades there have been major, unpredictable changes in the graphic paper markets. For example, the consumption of newsprint started to decline in the late 1980s, which was surprising in light of historical consumption and projections, and it has declined ever since. The aim of this study was to examine how magazine paper consumption will develop in the United States up to 2030. The long-term consumption projection was made using two main methods. The first was trend analysis, to see how and whether consumption has changed since 1980. The second was a qualitative estimate. These estimates were then compared with the so-called classical model projections that are commonly used in the forestry literature. The purpose of the qualitative analysis was to study magazine paper end-use purposes and to analyse how, and with what intensity, changes in society will affect magazine paper consumption in the long term. The framework of this study covers theories such as technology adaptation, electronic substitution, electronic publishing and Porter's threat of substitution. Because this study deals with markets that have shown signs of structural change, a substantial part of it covers recent developments and the most recent available studies and statistics. The following were among the key findings of this study. Different end uses face very different futures. Electronic substitution is very likely in some end-use purposes, but not in all. Young people, i.e. future consumers, have very different manners, habits and technological opportunities than their parents did, and these will have substantial effects on magazine paper consumption in the long term. The study concludes that the change in magazine paper consumption is more likely to be gradual (evolutionary) than a sudden collapse (revolutionary). It is also probable that the years of fast-growing magazine paper consumption are over. Beyond the decelerated growth, the consumption of magazine papers will decline slowly in the long term, and the further into the future the projection is extended, the faster the decline.
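As a minimal sketch of the trend-analysis component described above, the snippet below fits a linear trend to yearly consumption figures and extrapolates it to 2030; the data here are synthetic placeholders, not the thesis's statistics.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1980, 2011)
# Synthetic placeholder series; the real consumption figures are not reproduced here.
consumption = 5.0 + 0.08 * (years - 1980) + rng.normal(0.0, 0.1, years.size)

slope, intercept = np.polyfit(years, consumption, 1)   # linear trend since 1980
projection_2030 = slope * 2030 + intercept              # naive extrapolation to 2030
print(f"trend: {slope:.3f} units/year, 2030 projection: {projection_2030:.2f}")
```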

Relevance:

30.00%

Abstract:

Digital elevation models (DEMs) have been an important topic in geography and the surveying sciences for decades, due to their geomorphological importance as the reference surface for gravitation-driven material flow as well as their wide range of uses and applications. When a DEM is used in terrain analysis, for example in automatic drainage basin delineation, errors of the model accumulate in the analysis results. Investigation of this phenomenon is known as error propagation analysis, which has a direct influence on decision-making based on interpretations and applications of terrain analysis, and may also have an indirect influence on data acquisition and DEM generation. The focus of the thesis was on fine toposcale DEMs, which are typically represented on a 5-50 m grid and used at application scales of 1:10 000-1:50 000. The thesis presents a three-step framework for investigating error propagation in DEM-based terrain analysis. The framework includes methods for visualising the morphological gross errors of DEMs, exploring the statistical and spatial characteristics of the DEM error, performing analytical and simulation-based error propagation analysis, and interpreting the error propagation analysis results. The DEM error model was built using geostatistical methods. The results show that appropriate and exhaustive reporting of the various aspects of fine toposcale DEM error is a complex task. This is due to the high number of outliers in the error distribution and to morphological gross errors, which are detectable with the presented visualisation methods. In addition, global characterisation of DEM error is a gross generalisation of reality, because the areas within which the assumption of stationarity is not violated are small in extent. This was shown using an exhaustive high-quality reference DEM based on airborne laser scanning and local semivariogram analysis. The error propagation analysis revealed that, as expected, an increase in the DEM vertical error increases the error in surface derivatives. However, contrary to expectations, the spatial autocorrelation of the model appears to have varying effects on the error propagation analysis depending on the application. The use of a spatially uncorrelated DEM error model has been considered a 'worst-case scenario', but this view is now challenged, because none of the DEM derivatives investigated in the study had maximum variation with spatially uncorrelated random error. A significant performance improvement was achieved in simulation-based error propagation analysis by applying process convolution in generating realisations of the DEM error model. In addition, a typology of uncertainty in drainage basin delineations is presented.
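The simulation-based error propagation described above can be sketched as a Monte Carlo loop in which spatially autocorrelated error realisations are added to the DEM and a surface derivative is recomputed each time. The sketch below uses Gaussian-kernel smoothing of white noise as a simple stand-in for the process-convolution error model; the terrain, cell size and error parameters are invented for illustration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def slope_degrees(dem, cellsize):
    """Slope of the surface in degrees, a typical DEM derivative."""
    gy, gx = np.gradient(dem, cellsize)
    return np.degrees(np.arctan(np.hypot(gx, gy)))

rng = np.random.default_rng(42)
dem = gaussian_filter(rng.normal(100.0, 10.0, (200, 200)), sigma=15)  # synthetic terrain
cellsize, error_sd, corr_sigma, n_sim = 10.0, 1.0, 5.0, 100           # illustrative values

slopes = []
for _ in range(n_sim):
    noise = rng.normal(0.0, 1.0, dem.shape)
    error = gaussian_filter(noise, sigma=corr_sigma)   # spatially correlated error field
    error *= error_sd / error.std()                    # rescale to the target error SD
    slopes.append(slope_degrees(dem + error, cellsize))

slope_sd = np.std(slopes, axis=0)   # per-cell uncertainty of the slope derivative
print(slope_sd.mean())
```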

Relevance:

30.00%

Abstract:

This thesis presents novel modelling applications for environmental geospatial data using remote sensing, GIS and statistical modelling techniques. The studied topics fall into four main themes: (i) to develop advanced geospatial databases. Paper (I) demonstrates the creation of a geospatial database for the Glanville fritillary butterfly (Melitaea cinxia) in the Åland Islands, south-western Finland; (ii) to analyse species diversity and distribution using GIS techniques. Paper (II) presents a diversity and geographical distribution analysis for Scopulini moths at a world-wide scale; (iii) to study spatiotemporal forest cover change. Paper (III) presents a study of exotic and indigenous tree cover change detection in the Taita Hills, Kenya, using airborne imagery and GIS analysis techniques; (iv) to explore predictive modelling techniques using geospatial data. In Paper (IV) human population occurrence and abundance in the Taita Hills highlands were predicted using the generalized additive modelling (GAM) technique. Paper (V) presents techniques to enhance fire prediction and burned area estimation at a regional scale in East Caprivi, Namibia. Paper (VI) compares eight state-of-the-art predictive modelling methods to improve fire prediction, burned area estimation and fire risk mapping in East Caprivi, Namibia. The results in Paper (I) showed that geospatial data can be managed effectively using advanced relational database management systems. Metapopulation data for the Melitaea cinxia butterfly were successfully combined with GPS-delimited habitat patch information and climatic data. Using the geospatial database, spatial analyses were successfully conducted at the habitat patch level or at coarser analysis scales. Moreover, this study showed that, at a large scale, spatially correlated weather conditions appear to be one of the primary causes of spatially correlated changes in Melitaea cinxia population sizes. In Paper (II) the spatiotemporal characteristics of Scopulini moth descriptions, diversity and distribution were analysed at a world-wide scale, and GIS techniques were used for the first time in a geographical distribution analysis of Scopulini moths. This study revealed that Scopulini moths have a cosmopolitan distribution. The majority of the species have been described from the low latitudes, with sub-Saharan Africa being the hot spot of species diversity. However, the taxonomic effort has been uneven among biogeographical regions. Paper (III) showed that forest cover change can be analysed in great detail using modern airborne imagery techniques and historical aerial photographs. However, when spatiotemporal forest cover change is studied using historical black-and-white aerial photography, care has to be taken with co-registration and image interpretation. In Paper (IV) human population distribution and abundance could be modelled with fairly good results using geospatial predictors and non-Gaussian predictive modelling techniques. Moreover, a land cover layer is not necessarily needed as a predictor, because first- and second-order image texture measurements derived from satellite imagery had more power to explain the variation in dwelling unit occurrence and abundance. Paper (V) showed that the generalized linear model (GLM) is a suitable technique for fire occurrence prediction and for burned area estimation. GLM-based burned area estimates were found to be superior to the existing MODIS burned area product (MCD45A1).
However, spatial autocorrelation of fires has to be taken into account when using the GLM technique for fire occurrence prediction. Paper (VI) showed that novel statistical predictive modelling techniques can be used to improve fire prediction, burned area estimation and fire risk mapping at a regional scale. However, there was noticeable variation between the different predictive modelling techniques in fire occurrence prediction and burned area estimation.
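A minimal sketch of fitting a binomial GLM for fire occurrence, in the spirit of the approach evaluated in Paper (V), is given below; the predictor names and data are hypothetical, and the model is not the thesis's exact specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "ndvi": rng.uniform(0.1, 0.8, n),          # hypothetical vegetation predictor
    "dist_to_road_km": rng.exponential(5, n),  # hypothetical human-access predictor
})
# Synthetic fire occurrence labels generated from an assumed logistic relation.
logit = -1.0 + 3.0 * df["ndvi"] - 0.2 * df["dist_to_road_km"]
df["fire"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = sm.add_constant(df[["ndvi", "dist_to_road_km"]])
model = sm.GLM(df["fire"], X, family=sm.families.Binomial()).fit()
print(model.summary())
df["p_fire"] = model.predict(X)   # predicted fire occurrence probability per observation
# Note: as stated above, spatial autocorrelation of fires would also need to be handled.
```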

Relevance:

30.00%

Abstract:

Multiple sclerosis (MS) is an immune-mediated demyelinating disorder of the central nervous system (CNS) affecting 0.1-0.2% of the population of Northern European descent. MS is considered a multifactorial disease; both environment and genetics play a role in its pathogenesis. Despite several decades of intense research, the etiological and pathogenic mechanisms underlying MS remain largely unknown and no curative treatment exists. The genetic architecture underlying MS is complex, with multiple genes involved. The strongest and best characterized predisposing genetic factors for MS are located, as in other immune-mediated diseases, in the major histocompatibility complex (MHC) on chromosome 6. In humans the MHC is called the human leukocyte antigen (HLA) region. Alleles of the HLA locus have been found to associate strongly with MS and for many years remained the only consistently replicable genetic associations. Recently, however, other genes located outside the MHC region have been proposed in several studies as strong candidates for susceptibility to MS. In this thesis a new genetic locus located on chromosome 7q32, interferon regulatory factor 5 (IRF5), was identified as contributing to susceptibility to MS. In particular, we found that common variation of the gene was associated with the disease in three different populations: Spanish, Swedish and Finnish. We also suggested a possible functional role for one of the risk alleles, with an impact on the expression of the IRF5 locus. Previous studies have pointed out a possible role for chromosome 2q33 in susceptibility to MS and other autoimmune disorders. The work described here also investigated the involvement of this chromosomal region in MS predisposition. After the detection of genetic association with 2q33 (article-1), we extended our analysis through fine-scale single nucleotide polymorphism (SNP) mapping to further define the contribution of this genomic area to disease pathogenesis (article-4). We found a trend (p=0.04) for association with MS for an intronic SNP located in the inducible T-cell co-stimulator (ICOS) gene, an important player in the co-stimulatory pathway of the immune system. Expression analysis of ICOS revealed a novel, previously uncharacterized, alternatively spliced isoform lacking the extracellular domain that is needed for ligand binding. The stability of the newly identified transcript variant and its subcellular localization were analyzed. These studies indicated that the novel isoform is stable and shows a different subcellular localization compared to full-length ICOS. The novel isoform might have a regulatory function, but further studies are required to elucidate it. Chromosome 19q13 has previously been suggested as one of the genomic areas involved in MS predisposition, and suggestive linkage signals between MS predisposition and 19q13 have been obtained in several populations. Here, we analysed the role of allelic variation in 19q13 by family-based association analysis in 782 MS families collected from Finland. In this dataset we were not able to detect any statistically significant associations, although several previously suggested markers were included in the analysis. Replication of previous findings on the basis of linkage disequilibrium between a marker allele and the disease risk allele is notoriously difficult because of limitations such as allelic heterogeneity. Re-sequencing-based approaches may be required to elucidate the role of chromosome 19q13 in MS.
This thesis has resulted in the identification of a new MS susceptibility locus (IRF5), previously associated with other inflammatory or autoimmune disorders such as SLE. IRF5 is one of the mediators of the biological functions of interferons. In addition to providing new insight into the possible pathogenetic pathways of the disease, this finding suggests that there might be common mechanisms between different immune-mediated disorders. Furthermore, the work presented here has uncovered a novel isoform of ICOS, which may play a role in the regulatory mechanisms of ICOS, an important mediator of lymphocyte activation. Further work is required to uncover its functions and the possible involvement of the ICOS locus in MS susceptibility.

Relevance:

30.00%

Abstract:

This thesis covers three subject areas concerning particulate matter in urban air quality: 1) analysis of measured particulate matter mass concentrations in the Helsinki Metropolitan Area (HMA) at different locations in relation to traffic sources, and at different times of year and day; 2) the evolution of the number concentrations and sizes of traffic-exhaust-derived particulate matter at the local street scale, studied with a combination of a dispersion model and an aerosol process model; 3) analysis of the meteorological origins of selected high particulate matter concentration situations, especially temperature inversions, in the HMA and three other European cities. The prediction of the occurrence of meteorological conditions conducive to elevated particulate matter concentrations in the studied cities is examined, and the performance of current numerical weather forecasting models in air pollution episode situations is considered. The study of the ambient measurements revealed a clear diurnal variation of the PM10 concentrations at the HMA measurement sites, irrespective of the year and the season. The diurnal variation of local vehicular traffic flows showed no substantial correlation with the PM2.5 concentrations, indicating that the PM10 concentrations originated mainly from local vehicular traffic (direct emissions and suspension), while the PM2.5 concentrations were mostly of regional and long-range transported origin. The modelling study of traffic exhaust dispersion and transformation showed that the number concentrations of particles originating from street traffic exhaust undergo a substantial change during the first tens of seconds after being emitted from the vehicle tailpipe. The dilution process was shown to dominate the total number concentrations, while condensation and coagulation had only a minimal effect on the Aitken mode number concentrations. The air pollution episodes included were chosen on the basis of occurring in either winter or spring and having an at least partly local origin. In the HMA, air pollution episodes were shown to be linked to predominantly stable atmospheric conditions with high atmospheric pressure and low wind speeds, in conjunction with relatively low ambient temperatures. For the other European cities studied, the best meteorological predictors of elevated PM10 concentrations were shown to be the temporal (hourly) evolution of temperature inversions, atmospheric stability and, in some cases, wind speed. Concerning weather prediction during particulate matter related air pollution episodes, the studied models were found to overpredict pollutant dispersion, leading to underprediction of pollutant concentration levels.
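As a deliberately simplified illustration of the dilution process that the modelling study found to dominate, the sketch below lets the total particle number concentration relax exponentially towards the background level during the first minute after emission; the time scale and concentrations are assumed values, not output of the coupled dispersion and aerosol process models used in the thesis.

```python
import numpy as np

def number_concentration(t_seconds, n_tailpipe, n_background, tau_dilution=3.0):
    """Exponential relaxation of total particle number towards the background level."""
    return n_background + (n_tailpipe - n_background) * np.exp(-t_seconds / tau_dilution)

t = np.arange(0.0, 60.0, 5.0)  # first minute after emission, in seconds
print(number_concentration(t, n_tailpipe=1e6, n_background=1e4))  # particles per cm^3
```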

Relevance:

30.00%

Abstract:

Topics in Spatial Econometrics — With Applications to House Prices. Spatial effects occur in data when the geographical closeness of observations influences the relation between them. When two points on a map are close to each other, the observed values of a variable at those points tend to be similar, and the further apart the two points are, the less similar the observed values tend to be. Recent technical developments, such as geographical information systems (GIS) and global positioning systems (GPS), have brought about a renewed interest in spatial matters. For instance, it is possible to observe the exact location of an observation and combine it with other characteristics. Spatial econometrics integrates spatial aspects into econometric models and analysis. The thesis concentrates mainly on methodological issues, but the findings are illustrated by empirical studies on house price data. The thesis consists of an introductory chapter and four essays. The introductory chapter presents an overview of topics and problems in spatial econometrics. It discusses spatial effects, spatial weights matrices (especially k-nearest neighbours weights matrices) and various spatial econometric models, as well as estimation methods and inference. Further, the problem of omitted variables, a few computational and empirical aspects, the bootstrap procedure and the spatial J-test are presented. In addition, a discussion of hedonic house price models is included. In the first essay a comparison is made between spatial econometrics and time series analysis. By restricting attention to unilateral spatial autoregressive processes, it is shown that a unilateral spatial autoregression can be defined which enjoys properties similar to those of an autoregression in time series. Through an empirical study on house price data, the second essay shows that it is possible to form coordinate-based, spatially autoregressive variables that are, at least to some extent, able to replace the spatial structure in a spatial econometric model. In the third essay a strategy for specifying a k-nearest neighbours weights matrix by applying the spatial J-test is suggested, studied and demonstrated. In the fourth and final essay the properties of the asymptotic spatial J-test are examined further. A simulation study shows that the spatial J-test can be used to distinguish between general spatial models with different k-nearest neighbours weights matrices. A bootstrap spatial J-test is suggested to correct the size of the asymptotic test in small samples.
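A k-nearest neighbours spatial weights matrix of the kind discussed in the essays can be sketched as follows; the coordinates are synthetic and k = 5 is an arbitrary illustrative choice.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)
coords = rng.uniform(0, 100, size=(200, 2))   # e.g. house locations (x, y)
k = 5

tree = cKDTree(coords)
_, idx = tree.query(coords, k=k + 1)          # nearest neighbour of each point is itself

n = coords.shape[0]
W = np.zeros((n, n))
for i, neighbours in enumerate(idx[:, 1:]):   # drop the self-neighbour
    W[i, neighbours] = 1.0 / k                # row-standardised weights

# W can then enter a spatial autoregressive model, e.g. y = rho * W @ y + X @ beta + eps.
```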

Relevance:

30.00%

Abstract:

Wealthy individuals - business angels who invest a share of their net worth in entrepreneurial ventures - form an essential part of the informal venture capital market that can secure funding for entrepreneurial ventures. In Finland, business angels represent an untapped pool of capital that can contribute to fostering entrepreneurial development. In addition, business angels can bridge knowledge gaps in new business ventures by making their human capital available. This study has two objectives. The first is to gain an understanding of the characteristics and investment behaviour of Finnish business angels, with the strongest focus on due diligence procedures and post-investment involvement. The second objective is to assess whether agency theory and incomplete contracting theory are useful theoretical lenses in the arena of business angels. To achieve the second objective, this study investigates i) how risk is mitigated in the investment process, ii) how uncertainty influences the comprehensiveness of due diligence, and iii) how control is allocated post-investment. Research hypotheses are derived from assumptions underlying agency theory and incomplete contracting theory. The data for this study comprise interviews with 53 business angels, which in terms of sample size is the largest dataset on Finnish business angels. The research hypotheses are tested using regression analysis. This study suggests that the Finnish informal venture capital market appears to comprise a limited number of business angels whose style of investing closely resembles that of their formal counterparts. Much focus is placed on managing risks prior to making the investment, through strong selectiveness and relatively comprehensive due diligence. Involvement is rarely on a day-to-day basis, and many business angels seem to see board membership as a more suitable alternative to involvement in the operations of an entrepreneurial venture. The uncertainty involved does not seem to drive an increase in due diligence; on the contrary, due diligence appears to be more rigorous in safer, later-stage investments and when the business angels have considerable previous experience as investors. Finnish business angels' involvement post-investment is best explained by their degree of ownership in the entrepreneurial venture. It seems that when investors feel they are sufficiently rewarded, in terms of an adequate equity stake, they are willing to involve themselves actively in their investments. The lack of support for a relationship between increased uncertainty and the comprehensiveness of due diligence may partly be explained by an increasing trend towards portfolio diversification, triggered by a taxation system that favours investments through investment companies rather than direct investments. Many business angels appear to have replaced a specialization strategy that builds on reducing uncertainty with a diversification strategy that builds on reducing firm-specific (idiosyncratic) risk by holding shares in ventures whose returns are not expected to exhibit a strong positive correlation.

Relevance:

30.00%

Abstract:

In meteorology, observations and forecasts of a wide range of phenomena, for example snow, clouds, hail, fog, and tornadoes, can be categorical, that is, they can take only discrete values (e.g., "snow" and "no snow"). Concentrating on satellite-based snow and cloud analyses, this thesis explores methods that have been developed for the evaluation of categorical products and analyses. Different algorithms for satellite products generate different results; sometimes the differences are subtle, sometimes all too visible. In addition to differences between algorithms, the satellite products are influenced by physical processes and conditions, such as diurnal and seasonal variation in solar radiation, topography, and land use. The analysis of satellite-based snow cover analyses from NOAA, NASA, and EUMETSAT, and of snow analyses for numerical weather prediction models from FMI and ECMWF, was complicated by the fact that the true snow extent was not known, and we were forced simply to measure the agreement between different products. Sammon mapping, a multidimensional scaling method, was then used to visualize the differences between the products. The trustworthiness of the results for cloud analyses [the EUMETSAT Meteorological Products Extraction Facility cloud mask (MPEF), together with the Nowcasting Satellite Application Facility (SAFNWC) cloud masks provided by Météo-France (SAFNWC/MSG) and the Swedish Meteorological and Hydrological Institute (SAFNWC/PPS)] compared with ceilometers of the Helsinki Testbed was estimated by constructing confidence intervals (CIs). Bootstrapping, a statistical resampling method, was used to construct the CIs, especially in the presence of spatial and temporal correlation. Reference data for validation are constantly in short supply. In general, the needs of a particular project drive the requirements for evaluation, for example the accuracy and timeliness of the particular data and methods. In this vein, we discuss tentatively how data provided by the general public, e.g. photos shared on the Internet photo-sharing service Flickr, can be used as a new source for validation. Results show that such data are of reasonable quality, and their use for case studies can be warmly recommended. Last, the use of cluster analysis on meteorological in-situ measurements was explored: the Autoclass algorithm was used to construct compact representations of the synoptic conditions of fog at Finnish airports.
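The bootstrap confidence intervals mentioned above can be sketched for a simple categorical agreement score as follows; the labels are synthetic, and the block length used to respect temporal correlation is an arbitrary illustrative choice rather than a value from the thesis.

```python
import numpy as np

rng = np.random.default_rng(7)
satellite = rng.integers(0, 2, 1000)  # synthetic cloud-mask labels (0 = clear, 1 = cloud)
ceilometer = np.where(rng.random(1000) < 0.85, satellite, 1 - satellite)

def agreement(a, b):
    """Fraction of matching categorical labels."""
    return np.mean(a == b)

def block_bootstrap_ci(a, b, block=24, n_boot=2000, alpha=0.05):
    """Percentile CI from a moving-block bootstrap, to respect temporal correlation."""
    n = a.size
    starts = np.arange(0, n - block + 1)
    stats = []
    for _ in range(n_boot):
        picks = rng.choice(starts, size=n // block, replace=True)
        sel = np.concatenate([np.arange(s, s + block) for s in picks])
        stats.append(agreement(a[sel], b[sel]))
    return np.quantile(stats, [alpha / 2, 1 - alpha / 2])

print(agreement(satellite, ceilometer), block_bootstrap_ci(satellite, ceilometer))
```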

Relevance:

30.00%

Abstract:

The study of soil microbiota and their activities is central to the understanding of many ecosystem processes such as decomposition and nutrient cycling. The collection of microbiological data from soils generally involves several sequential steps of sampling, pretreatment and laboratory measurements, and the reliability of the results depends on reliable methods at every step. The aim of this thesis was to critically evaluate some central methods and procedures used in soil microbiological studies in order to increase our understanding of the factors that affect measurement results and to provide guidance and new approaches for the design of experiments. The thesis focuses on four major themes: 1) soil microbiological heterogeneity and sampling, 2) storage of soil samples, 3) DNA extraction from soil, and 4) quantification of specific microbial groups by the most-probable-number (MPN) procedure. Soil heterogeneity and sampling are discussed as a single theme because knowledge of spatial (horizontal and vertical) and temporal variation is crucial when designing sampling procedures. Comparison of adjacent forest, meadow and cropped field plots showed that land use has a strong impact on the degree of horizontal variation of soil enzyme activities and bacterial community structure. However, regardless of land use, the variation of microbiological characteristics appeared to have no predictable spatial structure at 0.5-10 m. Temporal and depth-related patterns were studied in relation to plant growth in cropped soil. The results showed that most enzyme activities and the microbial biomass have a clear decreasing trend in the top 40 cm of the soil profile and a temporal pattern during the growing season. A new procedure for the sampling of soil microbiological characteristics, based on stratified sampling and pre-characterisation of samples, was developed. A practical example demonstrated the potential of the new procedure to reduce the analysis effort involved in laborious microbiological measurements without loss of precision. The investigation of the storage of soil samples revealed that freezing (-20 °C) of small sample aliquots retains the activity of hydrolytic enzymes and the structure of the bacterial community in different soil matrices relatively well, whereas air-drying cannot be recommended as a storage method for soil microbiological properties due to large reductions in activity. Freezing below -70 °C was the preferred method of storage for samples with a high organic matter content. Comparison of different direct DNA extraction methods showed that the cell lysis treatment has a strong impact on the molecular size of the DNA obtained and on the bacterial community structure detected. An improved MPN method for the enumeration of soil naphthalene degraders was introduced as an alternative to more complex MPN protocols or to the DNA-based quantification approach. The main advantages of the new method are the simple protocol and the possibility of analysing a large number of samples and replicates simultaneously.
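The most-probable-number idea underlying the MPN procedure can be illustrated with a compact maximum-likelihood calculation: the estimate is the concentration that maximises the likelihood of the observed pattern of positive and negative tubes. The dilution scheme and counts below are illustrative, and this generic calculation is not the thesis's specific improved protocol.

```python
import numpy as np
from scipy.optimize import minimize_scalar

volumes = np.array([1.0, 0.1, 0.01])   # sample volume per tube at each dilution (illustrative)
n_tubes = np.array([5, 5, 5])          # tubes inoculated per dilution
positives = np.array([5, 3, 1])        # tubes scored positive

def neg_log_likelihood(log_lam):
    """Negative log-likelihood of the tube pattern for concentration exp(log_lam)."""
    lam = np.exp(log_lam)
    p_pos = 1.0 - np.exp(-lam * volumes)            # P(tube positive) at each dilution
    return -np.sum(positives * np.log(p_pos) - lam * volumes * (n_tubes - positives))

res = minimize_scalar(neg_log_likelihood, bounds=(-5, 10), method="bounded")
print(f"MPN estimate: {np.exp(res.x):.2f} organisms per unit volume")
```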