45 results for Blog datasets
Abstract:
Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Workshop at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Context: BL Lacs, a subclass of blazars, are the most numerous extragalactic objects detected in the Very High Energy (VHE) gamma-ray band. Large-amplitude flux variability, sometimes on very short time scales, is a common characteristic of them, and significant optical polarization is another. BL Lac spectra show a continuous and featureless Spectral Energy Distribution (SED) with two peaks. Among the 1442 BL Lacs in the Roma-BZB catalogue, only 51 have been detected in the VHE gamma-ray band. BL Lacs are also the most numerous class (more than 50% of 514 objects) among the sources detected above 10 GeV by FERMI-LAT. Many more BL Lacs are therefore expected to be discovered in the VHE gamma-ray band. However, due to the limitations of current and near-future Imaging Air Cherenkov Telescope technology, astronomers are forced to predict whether an object emits VHE gamma-rays or not. Some VHE gamma-ray prediction methods have already been introduced but remain unconfirmed. Cross-band correlations are the building blocks for introducing such prediction methods. Aims: We attempt to investigate cross-band correlations between flux energy density, luminosity and spectral index within the sample. We also check whether the recently discovered MAGIC J2001+435 is a typical BL Lac. Methods: We select a sample of 42 TeV BL Lacs and collect 20 of their properties in five energy bands from the literature and the Tuorla blazar monitoring program database. All of the data are synchronized to be comparable to each other. Finally, we choose 55 pairs of datasets for cross-band correlation analysis and investigate whether there is a correlation within each pair. For MAGIC J2001+435 we analyze the publicly available SWIFT-XRT data, and use still-unpublished VHE gamma-ray data from the MAGIC collaboration. The results are compared to the other sources in the sample.
Results: The low-state luminosity of sources detected multiple times in VHE gamma-rays is strongly correlated with the luminosities in all other bands. However, the high state does not show such strong correlations. Sources detected only once in VHE gamma-rays behave similarly to the low state of the multiply detected ones. Finally, MAGIC J2001+435 is a typical TeV BL Lac, although for some properties this source lies at the edge of the whole sample (e.g. in terms of X-ray flux). Keywords: BL Lac(s), Population study, Correlation analysis, Multi-wavelength analysis, VHE gamma-rays, gamma-rays, X-rays, Optical, Radio
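The abstract does not specify which correlation statistic was used; as a rough sketch, a cross-band correlation between, say, X-ray and VHE gamma-ray log-luminosities can be quantified with a Pearson coefficient. The band names and numbers below are purely illustrative, not values from the study:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return sxy / math.sqrt(sxx * syy)

# Hypothetical log-luminosities of five sources in two bands
log_L_xray = [44.1, 44.8, 45.3, 45.9, 46.2]
log_L_vhe = [43.0, 43.9, 44.2, 45.1, 45.4]
r = pearson(log_L_xray, log_L_vhe)  # close to +1 for a tight correlation
```

In a population study like this, one such coefficient (often with a significance test) would be computed for each of the dataset pairs under comparison.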
Abstract:
This thesis researches automatic traffic sign inventory and condition analysis using machine vision and pattern recognition methods. Automatic traffic sign inventory and condition analysis can be used to make road maintenance more efficient, improve maintenance processes, and enable intelligent driving systems. Automatic traffic sign detection and classification has previously been researched from the viewpoint of self-driving vehicles, driver assistance systems, and the use of signs in mapping services. Machine vision based inventory of traffic signs consists of detection, classification, localization, and condition analysis of traffic signs. The performance of the produced machine vision system is evaluated with three datasets, two of which have been collected for this thesis. Based on the experiments, almost all traffic signs can be detected, classified, and located, and their condition analysed. In the future, the inventory system's performance has to be verified in challenging conditions and the system has to be pilot tested.
Abstract:
The aims were to find out 1) whether schools’ oral health practices were associated with pupils’ oral health behaviour, and whether 2) the national sweet-selling recommendation and 3) the distribution of oral health education material (OHEM) affected schools as oral health promoters. Three independently collected datasets from Finnish upper comprehensive schools (N=988) were used: longitudinal oral health practices data (n=258) with a three-year follow-up (2007 n=480, 2008 n=508, 2009 n=593) from principals’ online questionnaires, oral health behaviour data from pupils participating in the national School Health Promotion Study (n=970 schools), and oral health education data from health education teachers’ online questionnaires (2008 n=563, 2009 n=477 teachers). Oral health practices data and oral health behaviour data were combined (n=414) to answer aim 1. For aims 2 and 3, oral health practices data and oral health education data were used independently. School sweet selling and an open campus policy were associated with pupils’ use of sweet products and tobacco products during school time. The National Recommendation was quite an effective way to reduce the number of sweet-selling schools, but there were large regional differences and a lack of a clear oral health policy in the schools. OHEM did not increase the proportion of teachers teaching oral health, but teachers started to cover oral health topics more frequently. Women started to use OHEM more often than men did. Schools’ oral health policy should include prohibiting the selling of sweet products in school by legislative actions, enabling healthy alternatives instead, and setting a closed campus policy to protect pupils from consuming sweets and smoking during school time.
Abstract:
Acid sulfate (a.s.) soils constitute a major environmental issue. Severe ecological damage results from the considerable amounts of acidity and metals leached by these soils into the recipient watercourses. As even small hot spots may affect large areas of coastal waters, mapping represents a fundamental step in the management and mitigation of a.s. soil environmental risks (i.e. to target strategic areas). Traditional mapping in the field is time-consuming and therefore expensive. Additional, more cost-effective techniques thus have to be developed in order to narrow down and define in detail the areas of interest. The primary aim of this thesis was to assess different spatial modeling techniques for a.s. soil mapping, and the characterization of soil properties relevant for a.s. soil environmental risk management, using all available data: soil and water samples, as well as datalayers (e.g. geological and geophysical). Different spatial modeling techniques were applied at catchment or regional scale. Two artificial neural networks were assessed on the Sirppujoki River catchment (c. 440 km2) located in southwestern Finland, while fuzzy logic was assessed on several areas along the Finnish coast. Quaternary geology, aerogeophysics and slope data (derived from a digital elevation model) were utilized as evidential datalayers. The methods also required the use of point datasets (i.e. soil profiles corresponding to known a.s. or non-a.s. soil occurrences) for training and/or validation within the modeling processes. Applying these methods, various maps were generated: probability maps for a.s. soil occurrence, as well as predictive maps for different soil properties (sulfur content, organic matter content and critical sulfide depth). The two assessed artificial neural networks (ANNs) demonstrated good classification abilities for a.s. soil probability mapping at catchment scale. 
Slightly better results were achieved using a Radial Basis Function (RBF) based ANN than a Radial Basis Functional Link Net (RBFLN) method, narrowing down more accurately the most probable areas for a.s. soil occurrence and defining more properly the least probable areas. The RBF-based ANN also demonstrated promising results for the characterization of different soil properties in the most probable a.s. soil areas at catchment scale. Since a.s. soil areas constitute highly productive lands for agricultural purposes, the combination of a probability map with more specific soil property predictive maps offers a valuable toolset to more precisely target strategic areas for subsequent environmental risk management. Notably, the use of laser scanning (i.e. Light Detection And Ranging, LiDAR) data enabled a more precise definition of a.s. soil probability areas, as well as of the soil property modeling classes for sulfur content and the critical sulfide depth. Given suitable training/validation points, ANNs can be trained to yield a more precise modeling of the occurrence of a.s. soils and their properties. By contrast, fuzzy logic represents a simple, fast and objective alternative for carrying out preliminary surveys, at catchment or regional scale, in areas offering a limited amount of data. This method enables delimiting and prioritizing the most probable areas for a.s. soil occurrence, which can be particularly useful in the field. Being easily transferable from area to area, fuzzy logic modeling can be carried out at regional scale. Mapping at this scale would be extremely time-consuming through manual assessment. The use of spatial modeling techniques enables the creation of valid and comparable maps, which represents an important development within the a.s. soil mapping process. The a.s. soil mapping was also assessed using water chemistry data for 24 different catchments along the Finnish coast (in all, covering c. 
21,300 km2) which were mapped with different methods (i.e. conventional mapping, fuzzy logic and an artificial neural network). Two a.s. soil related indicators measured in the river water (sulfate content and sulfate/chloride ratio) were compared to the extent of the most probable areas for a.s. soils in the surveyed catchments. High sulfate contents and sulfate/chloride ratios measured in most of the rivers demonstrated the presence of a.s. soils in the corresponding catchments. The calculated extent of the most probable a.s. soil areas is supported by independent data on water chemistry, suggesting that the a.s. soil probability maps created with different methods are reliable and comparable.
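The fuzzy logic modeling itself is not detailed in the abstract. A minimal sketch of the general approach, mapping each evidential datalayer to a fuzzy membership and combining the layers with a fuzzy gamma operator, might look as follows (the layer names, thresholds and gamma value are assumptions for illustration, not values from the thesis):

```python
def linear_membership(value, low, high, increasing=True):
    """Piecewise-linear fuzzy membership: 0 below `low`, 1 above `high`
    (reversed when `increasing` is False)."""
    t = (value - low) / (high - low)
    t = min(max(t, 0.0), 1.0)
    return t if increasing else 1.0 - t

def fuzzy_gamma(memberships, gamma=0.9):
    """Fuzzy gamma operator: a compromise between the restrictive fuzzy
    product and the permissive fuzzy algebraic sum."""
    prod = 1.0
    comp = 1.0
    for m in memberships:
        prod *= m
        comp *= 1.0 - m
    fuzzy_sum = 1.0 - comp
    return (fuzzy_sum ** gamma) * (prod ** (1.0 - gamma))

# Hypothetical evidential values for one grid cell: a fine-sediment score
# from Quaternary geology, aerogeophysical conductivity, and slope.
memberships = [
    linear_membership(0.7, 0.0, 1.0),                    # geology suitability
    linear_membership(35.0, 10.0, 50.0),                 # conductivity (mS/m)
    linear_membership(4.0, 0.0, 8.0, increasing=False),  # slope: flat favors a.s. soils
]
probability = fuzzy_gamma(memberships)  # a.s. soil probability in [0, 1]
```

Applied cell by cell over the standardized datalayers, this yields a probability surface of the kind described above; the gamma value tunes how strictly all evidence must agree.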
Abstract:
The aim of the study is to identify the characteristics of digital content that influence whether consumers share, like and comment on it in social media. The goal is to help companies better understand virality, so that they can produce and publish better content on their websites and in social media that consumers will share more. The study is carried out by forming hypotheses about possible characteristics on the basis of the literature and testing them with regression analyses in the empirical part. The results reveal nine traits that increase virality: interestingness, neutrality, surprisingness, entertainment value, impracticality, the length of the article and of the Facebook post, the use of different content tactics (blogs and images in particular increase virality), and the sharing of the content by an opinion leader or celebrity.
Abstract:
Advancements in information technology have made it possible for organizations to gather and store vast amounts of data about their customers. Information stored in databases can be highly valuable for organizations. However, analyzing large databases has proven to be difficult in practice. For companies in the retail industry, customer intelligence can be used to identify profitable customers, their characteristics, and behavior. By clustering customers into homogeneous groups, companies can more effectively manage their customer base and target profitable customer segments. This thesis studies the use of the self-organizing map (SOM) as a method for analyzing large customer datasets, clustering customers, and discovering information about customer behavior. The aim of the thesis is to find out whether the SOM could be a practical tool for retail companies to analyze their customer data.
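To make the SOM idea concrete, the following is a minimal, self-contained sketch of the algorithm: a small grid of nodes is trained so that nearby nodes respond to similar customers, after which each customer can be assigned to its best-matching node (cluster). The grid size, learning schedule and the two-feature customer vectors are illustrative assumptions, not details from the thesis:

```python
import math
import random

def train_som(data, rows=4, cols=4, epochs=50, lr0=0.5, seed=0):
    """Train a small self-organizing map on a list of feature vectors."""
    rng = random.Random(seed)
    dim = len(data[0])
    # Each map node starts with a random weight vector
    weights = {(r, c): [rng.random() for _ in range(dim)]
               for r in range(rows) for c in range(cols)}
    sigma0 = max(rows, cols) / 2.0
    for epoch in range(epochs):
        lr = lr0 * (1.0 - epoch / epochs)                # decaying learning rate
        sigma = max(sigma0 * (1.0 - epoch / epochs), 0.5)  # shrinking neighborhood
        for x in data:
            # Best-matching unit: node whose weights are closest to the input
            bmu = min(weights, key=lambda n: sum(
                (w - v) ** 2 for w, v in zip(weights[n], x)))
            for node, w in weights.items():
                d2 = (node[0] - bmu[0]) ** 2 + (node[1] - bmu[1]) ** 2
                h = math.exp(-d2 / (2 * sigma * sigma))  # neighborhood kernel
                for i in range(dim):
                    w[i] += lr * h * (x[i] - w[i])
    return weights

def map_customer(weights, x):
    """Assign a feature vector to its best-matching map node (cluster)."""
    return min(weights, key=lambda n: sum(
        (w - v) ** 2 for w, v in zip(weights[n], x)))

# Hypothetical customers described by two normalized features,
# e.g. (spend, visit frequency); two clearly separated groups.
customers = [[0.1, 0.1]] * 5 + [[0.9, 0.9]] * 5
som = train_som(customers)
```

After training, low-spend and high-spend customers land on different regions of the map, which is what makes the SOM usable both for clustering and for visual inspection of the customer base.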
Abstract:
There are more than 7000 languages in the world, and many of these have emerged through linguistic divergence. While questions related to the drivers of linguistic diversity have been studied before, including studies with quantitative methods, there is no consensus as to which factors drive linguistic divergence, and how. In the thesis, I have studied linguistic divergence with a multidisciplinary approach, applying the framework and quantitative methods of evolutionary biology to language data. With quantitative methods, large datasets may be analyzed objectively, while approaches from evolutionary biology make it possible to revisit old questions (related to, for example, the shape of the phylogeny) with new methods, and adopt novel perspectives to pose novel questions. My chief focus was on the effects exerted on the speakers of a language by environmental and cultural factors. My approach was thus an ecological one, in the sense that I was interested in how the local environment affects humans and whether this human-environment connection plays a possible role in the divergence process. I studied this question in relation to the Uralic language family and to the dialects of Finnish, thus covering two different levels of divergence. However, as the Uralic languages have not previously been studied using quantitative phylogenetic methods, nor have population genetic methods been previously applied to any dialect data, I first evaluated the applicability of these biological methods to language data. I found the biological methodology to be applicable to language data, as my results were rather similar to traditional views as to both the shape of the Uralic phylogeny and the division of Finnish dialects. I also found environmental conditions, or changes in them, to be plausible inducers of linguistic divergence: whether in the first steps in the divergence process, i.e. dialect divergence, or on a large scale with the entire language family. 
My findings concerning Finnish dialects led me to conclude that the functional connection between linguistic divergence and environmental conditions may arise through human cultural adaptation to varying environmental conditions. This is also one possible explanation on the scale of the Uralic language family as a whole. The results of the thesis bring insights on several different issues in both a local and a global context. First, they shed light on the emergence of the Finnish dialects. If the approach used in the thesis is applied to the dialects of other languages, broader generalizations may be drawn as to the inducers of linguistic divergence. This again brings us closer to understanding the global patterns of linguistic diversity. Secondly, the quantitative phylogeny of the Uralic languages, with estimated times of language divergences, yields another hypothesis as to the shape and age of the language family tree. In addition, the Uralic languages can now be added to the growing list of language families studied with quantitative methods. This will allow broader inferences as to global patterns of language evolution, and more language families can be included in constructing the tree of the world’s languages. Studying history through language, however, is only one way to illuminate the human past. Therefore, thirdly, the findings of the thesis, when combined with studies of other language families, and those for example in genetics and archaeology, bring us again closer to an understanding of human history.
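The quantitative methods themselves are not spelled out in the abstract. One elementary building block shared by phylogenetic and population-genetic analyses of language data is a pairwise distance matrix computed from binary trait (e.g. cognate presence/absence) vectors; a minimal sketch, with toy data for three hypothetical varieties, could look like this:

```python
def hamming_distance(a, b):
    """Proportion of differing traits between two equal-length binary vectors."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

def distance_matrix(traits):
    """Pairwise lexical distances from binary trait data, as a dict
    keyed by (variety, variety) pairs."""
    langs = sorted(traits)
    return {(p, q): hamming_distance(traits[p], traits[q])
            for p in langs for q in langs}

# Toy cognate-class presence/absence data (purely illustrative)
traits = {
    "variety_A": [1, 1, 0, 0, 1, 0],
    "variety_B": [1, 1, 0, 1, 1, 0],
    "variety_C": [0, 0, 1, 1, 0, 1],
}
dm = distance_matrix(traits)
```

Tree-building (e.g. neighbor joining) and clustering methods then operate on such a matrix; the real analyses in the thesis of course use far richer data and models than this sketch.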
Abstract:
Optical microscopy is experiencing a renaissance. The diffraction limit, although still physically real, plays a minor role in the achievable resolution in far-field fluorescence microscopy. Super-resolution techniques enable fluorescence microscopy at nearly molecular resolution. Modern (super-resolution) microscopy methods rely strongly on software. Software tools are needed all the way from data acquisition, data storage, image reconstruction, restoration and alignment, to quantitative image analysis and image visualization. These tools play a key role in all aspects of microscopy today – and their importance is certain to increase in the coming years, as microscopy little by little transitions from single cells into more complex and even living model systems. In this thesis, a series of bioimage informatics software tools are introduced for STED super-resolution microscopy. Tomographic reconstruction software, coupled with a novel image acquisition method STED<, is shown to enable axial (3D) super-resolution imaging in a standard 2D-STED microscope. Software tools are introduced for STED super-resolution correlative imaging with transmission electron microscopes or atomic force microscopes. A novel method for automatically ranking image quality within microscope image datasets is introduced, and it is utilized, for example, to select the best images in a STED microscope image dataset.
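The abstract does not state which quality measure the ranking method uses; a common simple proxy for image quality in microscopy is a sharpness score such as the variance of a discrete Laplacian, sketched here on plain 2D lists of pixel intensities (the metric choice is an assumption for illustration, not the thesis's method):

```python
def laplacian_variance(img):
    """Sharpness score: variance of a 4-neighbor discrete Laplacian
    over the interior pixels of a grayscale image."""
    h, w = len(img), len(img[0])
    vals = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (img[y - 1][x] + img[y + 1][x]
                   + img[y][x - 1] + img[y][x + 1]
                   - 4 * img[y][x])
            vals.append(lap)
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

def rank_images(images):
    """Return image indices ordered from sharpest to blurriest."""
    scores = [(laplacian_variance(img), i) for i, img in enumerate(images)]
    return [i for _, i in sorted(scores, reverse=True)]
```

Given a dataset of candidate images, `rank_images` puts high-contrast, in-focus frames first, which is the kind of automatic "best image" selection the abstract describes.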
Abstract:
This paper explores behavioral patterns of web users on an online magazine website. The goal of the study is first to find and visualize user paths within the data generated during collection, and then to identify some generic typologies of user behavior. To form a theoretical foundation for processing data and identifying behavioral archetypes, the study relies on established consumer behavior literature to propose typologies of behavior. For data processing, the study utilizes methodologies of applied cluster analysis and sequential path analysis. The study utilizes a dataset of clickstream data generated from the real-life clicks of 250 randomly selected website visitors over a period of six weeks. Based on the data collected, an exploratory method is followed in order to find and visualize generally occurring paths of users on the website. Six distinct behavioral typologies were recognized, with the dominant user consuming mainly blog content, as opposed to editorial content. Most importantly, it was observed that approximately 80% of clicks were of the blog content category, meaning that the majority of web traffic on the site takes place in content other than the desired editorial content pages. The outcome of the study is a set of managerial recommendations for each identified behavioral archetype.
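The paper's clustering and path-analysis pipeline is not reproduced here, but its two core ingredients can be sketched simply: counting common page-to-page transitions within sessions, and labeling a visitor by the dominant content category of their clicks. The category names, sessions and 50% threshold below are illustrative assumptions:

```python
from collections import Counter

def top_paths(sessions, length=2):
    """Count the most common consecutive page-category transitions
    (n-grams of the given length) across all sessions."""
    counts = Counter()
    for session in sessions:
        for i in range(len(session) - length + 1):
            counts[tuple(session[i:i + length])] += 1
    return counts.most_common()

def classify_user(session, threshold=0.5):
    """Label a visitor 'blog reader' if at least `threshold` of their
    clicks hit blog content, else 'editorial reader'."""
    blog_share = sum(1 for page in session if page == "blog") / len(session)
    return "blog reader" if blog_share >= threshold else "editorial reader"

# Hypothetical sessions as sequences of page categories
sessions = [
    ["blog", "blog", "editorial"],
    ["blog", "editorial"],
]
```

Aggregating `top_paths` output over all visitors gives the frequently occurring paths to visualize, while per-session labels of this kind are the starting point for grouping users into behavioral archetypes.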
Abstract:
This thesis discusses the dynamism of bilateral relations between Finland and Russia and their interconnection with wider EU-Russia relations in light of the recent conflict in Ukraine. In particular, the incorporation of Crimea into the territory of Russia in March 2014 is believed to have triggered a series of disputes between the European Union and Russia and thus to have impacted the course of bilateral Finnish-Russian relations. The study leans on the premise that there are historical traditions and regularities in the Finnish foreign policy course towards Russia which make bilateral Finnish-Russian relations special. These traditions are distinguished and described in the book “Russia Forever? Towards Pragmatism in Finnish/Russian relations” (2008), edited by H. Rytövuori-Apunen. Assuming that the featured traditions persist in modern relations between Finland and Russia, the aim of the thesis is to find out how these traditions reappeared during the year shaped by the events in Ukraine. To do so, the author follows the timeline of events around the Ukraine crisis, starting with Crimea’s referendum on independence, and examines the way these events were commented on and evaluated by the key government officials and political institutions of Finland and Russia. The main focus is on the Finnish official discourse on Russia during the study period. The data, consisting mostly of primary sources (ministerial press releases and comments, statements, speeches and blog posts of individual policy makers), are processed using thematic analysis supported by content analysis. The study reveals that the consequences of the Ukraine crisis have brought, among other things, complications to the economic cooperation between Finland and Russia, and have drawn the increased attention of Finnish decision makers to the country’s security questions. 
As a result, the character and importance of some historical regularities of the Finnish foreign policies on Russia, like the Continental Dilemma, have taken new shape.
Abstract:
The increasing performance of computers has made it possible to algorithmically solve problems for which manual and possibly inaccurate methods have previously been used. Nevertheless, one must still pay attention to the performance of an algorithm if huge datasets are used or if the problem is computationally difficult. Two geographic problems are studied in the articles included in this thesis. In the first problem the goal is to determine distances from points, called study points, to shorelines in predefined directions. Together with other information, mainly related to wind, these distances can be used to estimate wave exposure in different areas. In the second problem the input consists of a set of sites where water quality observations have been made and of the results of the measurements at the different sites. The goal is to select a subset of the observational sites in such a manner that water quality is still measured with sufficient accuracy when monitoring at the other sites is stopped to reduce economic costs. Most of the thesis concentrates on the first problem, known as the fetch length problem. The main challenge is that the two-dimensional map is represented as a set of polygons with millions of vertices in total, and the distances may also be computed for millions of study points in several directions. Efficient algorithms are developed for the problem, one of them approximate and the others exact except for rounding errors. The solutions also differ in that three of them are targeted for serial operation or for a small number of CPU cores, whereas one, together with its further developments, is suitable also for parallel machines such as GPUs.
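The geometric core of the fetch length problem can be sketched as ray casting: from a study point, follow a fixed compass direction and find the nearest intersection with any shoreline segment. This is only a naive O(number of segments) illustration of the problem statement, not one of the thesis's efficient algorithms, and the coordinates below are made up:

```python
import math

def fetch_length(point, angle_deg, shoreline_segments, max_dist=1e9):
    """Distance from a study point to the nearest shoreline segment
    along a fixed direction, by ray/segment intersection."""
    px, py = point
    a = math.radians(angle_deg)
    dx, dy = math.cos(a), math.sin(a)
    best = max_dist
    for (x1, y1), (x2, y2) in shoreline_segments:
        ex, ey = x2 - x1, y2 - y1
        denom = dx * ey - dy * ex          # 2D cross product of ray and segment
        if abs(denom) < 1e-12:             # ray parallel to segment: no hit
            continue
        # Solve point + t*(dx,dy) = (x1,y1) + u*(ex,ey) by Cramer's rule
        t = ((x1 - px) * ey - (y1 - py) * ex) / denom  # distance along ray
        u = ((x1 - px) * dy - (y1 - py) * dx) / denom  # position along segment
        if t > 0 and 0.0 <= u <= 1.0:
            best = min(best, t)
    return best

# A study point at the origin, looking due east (0 degrees) at a
# vertical shoreline segment 5 units away
d = fetch_length((0.0, 0.0), 0.0, [((5.0, -1.0), (5.0, 1.0))])
```

The actual algorithms in the thesis avoid testing every segment for every point and direction; the sketch only fixes what a single fetch query computes.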
Abstract:
Wind power is a rapidly developing, low-emission form of energy production. In Finland, the official objective is to increase wind power capacity from the current 1 005 MW up to 3 500–4 000 MW by 2025. By the end of April 2015, the total capacity of all wind power projects being planned in Finland had surpassed 11 000 MW. As the number of projects in Finland is at a record high, an increasing amount of infrastructure is also being planned and constructed. Traditionally, these planning operations are conducted using manual and labor-intensive work methods that are prone to subjectivity. This study introduces a GIS-based methodology for determining optimal paths to support the planning of onshore wind park infrastructure alignment in the Nordanå-Lövböle wind park located on the island of Kemiönsaari in Southwest Finland. The presented methodology utilizes a least-cost path (LCP) algorithm to search for optimal paths within a high-resolution real-world terrain dataset derived from airborne lidar scanning. In addition, planning data is used to provide a realistic planning framework for the analysis. In order to produce realistic results, the physiographic and planning datasets are standardized and weighted according to qualitative suitability assessments by utilizing methods and practices offered by multi-criteria evaluation (MCE). The results are presented as scenarios corresponding to various planning objectives. Finally, the methodology is documented by using tools of Business Process Management (BPM). The results show that the presented methodology can be effectively used to search for and identify extensive, 20 to 35 kilometers long networks of paths that correspond to certain optimization objectives in the study area. The utilization of high-resolution terrain data produces a more objective and more detailed path alignment plan. 
This study demonstrates that the presented methodology can be practically applied to support a wind power infrastructure alignment planning process. The six-phase structure of the methodology allows straightforward incorporation of different optimization objectives. The methodology responds well to combining quantitative and qualitative data. Additionally, the careful documentation presents an example of how the methodology can be evaluated and developed as a business process. This thesis also shows that more emphasis on the research of algorithm-based, more objective methods for the planning of infrastructure alignment is desirable, as technological development has only recently started to realize the potential of these computational methods.
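A least-cost path search of the kind the methodology builds on can be sketched as Dijkstra's algorithm over a cost raster, where each cell's value is the (MCE-weighted) cost of routing infrastructure through it. The grid, costs and 4-connectivity below are illustrative assumptions, not details of the study's implementation:

```python
import heapq

def least_cost_path(cost, start, goal):
    """Dijkstra search over a cost raster (4-connected grid).
    Entering a cell adds that cell's cost; returns (total_cost, path)."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    prev = {}
    pq = [(dist[start], start)]
    while pq:
        d, cell = heapq.heappop(pq)
        if cell == goal:
            # Reconstruct the path by walking predecessors back to start
            path = [cell]
            while cell in prev:
                cell = prev[cell]
                path.append(cell)
            return d, path[::-1]
        if d > dist.get(cell, float("inf")):
            continue  # stale queue entry
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = cell
                    heapq.heappush(pq, (nd, (nr, nc)))
    return float("inf"), []

# Toy raster: a high-cost ridge (9s) forces the path to detour around it
grid = [[1, 1, 1],
        [9, 9, 1],
        [1, 1, 1]]
total, path = least_cost_path(grid, (0, 0), (2, 0))
```

Production GIS tools typically add diagonal moves and distance-weighted costs, but the optimization principle, accumulating weighted cell costs and keeping the cheapest route, is the same.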