915 results for "High quality"

Relevance:

60.00%

Publisher:

Abstract:

The superconducting (or cryogenic) gravimeter (SG) is based on the levitation of a superconducting sphere in a stable magnetic field created by currents in superconducting coils. Depending on frequency, it is capable of detecting gravity variations as small as 10^-11 m s^-2. For a single event the detection threshold is higher, conservatively about 10^-9 m s^-2. Owing to its high sensitivity and low drift rate, the SG is eminently suitable for studying geodynamical phenomena through their gravity signatures. I present investigations of Earth dynamics with the superconducting gravimeter GWR T020 at Metsähovi from 1994 to 2005. The history and key technical details of the installation are given, and the data processing methods and the development of the local tidal model at Metsähovi are presented. The T020 is part of the worldwide GGP (Global Geodynamics Project) network, which consists of some 20 operating stations. The data of the T020 and of the other participating SGs are available to the scientific community. The SG T020 has been used as a long-period seismometer to study microseismicity and the Earth's free oscillations. The annual variation, spectral distribution, amplitude and sources of microseisms at Metsähovi are presented. Free oscillations excited by three large earthquakes were analysed: the spectra, attenuation and rotational splitting of the modes. The lowest modes of all the oscillation types are studied, i.e. the radial mode 0S0, the "football mode" 0S2, and the toroidal mode 0T2. The very low level (0.01 nm s^-1) incessant excitation of the Earth's free oscillations was detected with the T020. The recovery of global and regional gravity variations with the SG requires the modelling of local gravity effects, the most important of which is hydrology. The variation in the groundwater level at Metsähovi, measured in a borehole in the fractured bedrock, correlates significantly (0.79) with gravity.
The influence of local precipitation, soil moisture and snow cover is detectable in the gravity record. The gravity effects of the variation in atmospheric mass and of the non-tidal loading by the Baltic Sea were investigated together, as sea level and air pressure are correlated. Using Green's functions it was calculated that a 1-metre uniform layer of water in the Baltic Sea increases gravity at Metsähovi by 31 nm s^-2, with a vertical deformation of -11 mm. The regression coefficient for sea level is 27 nm s^-2 m^-1, which is 87% of the uniform model. These studies were combined with temporal height variations from the GPS data of the Metsähovi permanent station. The long time series at Metsähovi demonstrate the high quality of the data and the correctness of the offset and drift corrections. The superconducting gravimeter T020 has proved to be an eminent and versatile tool in studies of Earth dynamics.
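The 27 nm s^-2 m^-1 regression coefficient quoted above is the slope of gravity residuals against Baltic sea level. A minimal ordinary-least-squares sketch of such an admittance estimate, using synthetic numbers rather than the Metsähovi series:

```python
# Least-squares estimate of the gravity/sea-level admittance.
# All numbers below are synthetic illustrations, not Metsähovi data.

def admittance(sea_level_m, gravity_nms2):
    """Ordinary least-squares slope of gravity (nm/s^2) on sea level (m)."""
    n = len(sea_level_m)
    mx = sum(sea_level_m) / n
    my = sum(gravity_nms2) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(sea_level_m, gravity_nms2))
    sxx = sum((x - mx) ** 2 for x in sea_level_m)
    return sxy / sxx

# Synthetic series with a built-in slope of 27 nm/s^2 per metre:
levels = [-0.4, -0.1, 0.0, 0.2, 0.5]
gravity = [27.0 * x for x in levels]
print(round(admittance(levels, gravity), 1))  # 27.0
```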


Bottleneck operations in a complex coal rail system cost mining companies millions of dollars. To address this issue, this paper investigates a real-world coal rail system and aims to optimise coal railing operations under constraints of limited resources (e.g., a limited number of locomotives and wagons). In the literature, most studies have considered the train scheduling problem on a single-track railway network to be strongly NP-hard and have therefore developed metaheuristics as the main solution methods. In this paper, a new mathematical programming model is formulated and coded in an optimisation programming language based on a constraint programming (CP) approach. A new depth-first-search technique is developed and embedded in the CP model to obtain an optimised coal railing timetable efficiently. Computational experiments demonstrate that high-quality solutions are obtainable in industry-scale applications. To support decision making, a sensitivity analysis is conducted for different scenarios and specific criteria.

Keywords: Train scheduling · Rail transportation · Coal mining · Constraint programming
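The paper's exact CP formulation is not reproduced here, but the backtracking idea behind a depth-first search for a feasible timetable can be sketched on a toy single-track instance (the train count, horizon and headway below are hypothetical, not the authors' model):

```python
# Toy depth-first search for a single-track timetable: assign each train an
# integer departure slot so that all departures respect a minimum headway.
# A stand-in for the paper's CP-embedded DFS, not the real model.

def dfs_timetable(n_trains, horizon, headway):
    """Return the first feasible departure-time assignment found by DFS."""
    assignment = []

    def feasible(t):
        return all(abs(t - s) >= headway for s in assignment)

    def search():
        if len(assignment) == n_trains:
            return True
        for t in range(horizon):           # try slots in increasing order
            if feasible(t):
                assignment.append(t)
                if search():
                    return True
                assignment.pop()           # backtrack on dead ends
        return False

    return assignment if search() else None

print(dfs_timetable(3, 10, headway=3))  # [0, 3, 6]
```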


The magnetic field of the Earth is 99% of internal origin and is generated in the liquid outer core by the dynamo principle. In the 19th century, Carl Friedrich Gauss showed that the field can be described by a sum of spherical harmonic terms. This theory is still the basis of, e.g., the IGRF (International Geomagnetic Reference Field) models, which are the most accurate description available of the geomagnetic field. On average, the dipole forms 3/4 and the non-dipolar terms 1/4 of the instantaneous field, but the temporal mean of the field is assumed to be a purely geocentric axial dipolar field. The validity of this GAD (Geocentric Axial Dipole) hypothesis has been assessed using several methods. In this work, the testing rests on the dependence of the frequency distribution of inclination on latitude. Each combination of dipole (GAD), quadrupole (G2) and octupole (G3) produces a distinct inclination distribution. These theoretical distributions have been compared with those calculated from empirical observations from different continents and, finally, from the entire globe. Only data from Precambrian rocks (over 542 million years old) have been used in this work. The basic assumption is that over the long-term course of continental drift, the globe is sampled adequately. There were 2823 observations altogether in the paleomagnetic database of the University of Helsinki. The effects of the quality of the observations, as well as of age and rock type, have been tested. For the comparison between theoretical and empirical distributions, chi-square testing was applied. In addition, spatiotemporal binning was used to effectively remove errors caused by multiple observations. Modelling based on igneous rock data shows that the average magnetic field of the Earth is best described by a combination of a geocentric dipole and a very weak octupole (less than 10% of GAD).
Filtering and binning gave the distributions a more GAD-like appearance, but the deviation from GAD increased with the age of the rocks. The distribution calculated from the so-called key poles, the most reliable determinations, behaves almost like GAD, having a zero quadrupole and an octupole of 1% of GAD. In no earlier study have rocks older than 400 Ma given a result so close to GAD; low inclinations have been prominent especially in the sedimentary data. Despite these results, a larger amount of high-quality data and a proof of the long-term randomness of the Earth's continental motions are needed to confirm that the dipole model holds true.
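The inclination test rests on the standard dipole relation tan I = 2 tan(latitude), which fixes the GAD curve that the empirical distributions are compared against. A small sketch of that relation:

```python
import math

# Under the GAD hypothesis, the magnetic inclination I at geographic
# latitude lam obeys tan(I) = 2 * tan(lam).

def gad_inclination(lat_deg):
    """Theoretical GAD inclination (degrees) at a given latitude (degrees)."""
    return math.degrees(math.atan(2.0 * math.tan(math.radians(lat_deg))))

for lat in (0, 30, 45, 60):
    print(lat, round(gad_inclination(lat), 1))
# 0 -> 0.0, 30 -> 49.1, 45 -> 63.4, 60 -> 73.9
```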


Physics teachers are in a key position to form the attitudes and conceptions of future generations toward science and technology, as well as to educate future generations of scientists. Therefore, good teacher education is one of the key areas of a physics department's education programme. This dissertation is a contribution to the research-based development of high-quality physics teacher education, designed to meet three central challenges of good teaching. The first challenge relates to the organisation of physics content knowledge. The second challenge, connected to the first, is to understand the role of experiments and models in (re)constructing the content knowledge of physics for the purposes of teaching. The third challenge is to provide pre-service physics teachers with opportunities and resources for reflecting on and assessing their knowledge and experience of physics and physics education. This dissertation demonstrates how these challenges can be met when the content knowledge of physics, the relevant epistemological aspects of physics and the pedagogical knowledge of teaching and learning physics are combined. The theoretical part of this dissertation is concerned with designing two didactical reconstructions for the purposes of physics teacher education: the didactical reconstruction of processes (DRoP) and the didactical reconstruction of structures (DRoS). This part starts by taking into account the required professional competencies of physics teachers, the pedagogical aspects of teaching and learning, and the benefits of graphical ways of representing knowledge. It then continues with a conceptual and philosophical analysis of physics, especially an analysis of the role of experiments and models in constructing knowledge. This analysis is condensed in the form of an epistemological reconstruction of knowledge justification. Finally, these two parts are combined in the design and production of the DRoP and DRoS.
The DRoP captures the formation of knowledge about physical concepts and laws in a concise and simplified form while still retaining authenticity with respect to the processes by which the concepts were formed. The DRoS is used for representing the structural knowledge of physics, that is, the connections between physical concepts, quantities and laws, to varying extents. Both DRoP and DRoS are represented graphically by means of flow charts consisting of nodes and directed links connecting the nodes. The empirical part discusses two case studies that show how the three challenges are met through the use of DRoP and DRoS and how the outcomes of teaching solutions based on them are evaluated. The research approach is qualitative; it aims at an in-depth evaluation and understanding of the usefulness of the didactical reconstructions. The data, which were collected from the advanced course for prospective physics teachers during 2001-2006, consisted of DRoP and DRoS flow charts made by the students and of student interviews. The first case study discusses how student teachers used DRoP flow charts to understand the process of forming knowledge about the law of electromagnetic induction. The second case study discusses how student teachers learned to understand the development of physical quantities related to the temperature concept by using DRoS flow charts. In both studies, attention is focused on the use of DRoP and DRoS to organise knowledge and on the role of experiments and models in this organisation process. The results show that the students' understanding of physics knowledge production improved and their knowledge became more organised and coherent. It is shown that the flow charts and the didactical reconstructions behind them played an important role in attaining these positive learning results.
On the basis of the results reported here, the designed learning tools have been adopted as a standard part of the teaching solutions used in the physics teacher education courses in the Department of Physics, University of Helsinki.
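The DRoP and DRoS flow charts described above are directed graphs of nodes and links. As a rough illustration only (the node names are invented, not taken from the dissertation), such a chart can be represented and traversed like this:

```python
# A DRoS-style flow chart is a directed graph of concepts; here it is stored
# as an adjacency list. Node names are illustrative, not from the thesis.

dros = {
    "thermal equilibrium": ["temperature"],
    "temperature": ["thermometer scale", "ideal-gas law"],
    "thermometer scale": [],
    "ideal-gas law": [],
}

def reachable(graph, start):
    """All concepts reachable from a starting node (depth-first traversal)."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(graph[node])
    return seen

print(sorted(reachable(dros, "thermal equilibrium")))
```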


Cascaded multilevel inverters synthesize a medium-voltage output from a series connection of power cells that use standard low-voltage component configurations. This characteristic allows high-quality output voltages and input currents to be achieved, as well as outstanding availability owing to the intrinsic component redundancy. Due to these features, the cascaded multilevel inverter has been recognized as an important alternative in the medium-voltage inverter market. This paper presents a survey of the different topologies, control strategies and modulation techniques used by these inverters. Regenerative and advanced topologies are also discussed, applications where the mentioned features play a key role are shown, and, finally, future developments are addressed.
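The voltage-quality and redundancy claims rest on the series connection: each H-bridge cell contributes -V, 0 or +V to the phase output, so m cascaded cells synthesize 2m + 1 distinct voltage levels. A short sketch enumerating the levels:

```python
from itertools import product

# Each H-bridge cell of a cascaded inverter outputs -V, 0 or +V; the phase
# voltage is the sum of the cell outputs, so m series cells give 2*m + 1
# distinct levels.

def output_levels(n_cells, v_cell=1):
    """All distinct phase-voltage levels of n_cells series-connected cells."""
    states = product((-v_cell, 0, v_cell), repeat=n_cells)
    return sorted({sum(s) for s in states})

print(output_levels(3))  # [-3, -2, -1, 0, 1, 2, 3] -> 7 levels
```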


The successful interaction between leaders and their followers is central to the overall functioning of a company. The increasingly multinational nature of modern business and the resulting multicultural and increasingly heterogeneous workforce impose specific challenges on the development of high-quality work relationships. The Western multinational companies that have started operations in China are facing these challenges. This study examines the quality of leader-follower relationships between Western expatriate leaders and their Chinese followers, as well as between Chinese leaders and their Chinese followers, in Western-owned subsidiaries in China. The focus is on the influence of personal, interpersonal and behavioural factors (personality, values, cultural knowledge, perceived and actual similarity, interactional justice, and follower performance) and on the work-related implications of these relationships (job attitudes and organisational citizenship behaviour). One interesting finding of this study is that Chinese followers perceive their Western leaders more favourably than their Chinese leaders. The results also indicate that Chinese and Western leaders' perceptions of their followers are not influenced favourably by the same follower characteristics. In a similar vein, Chinese followers value different traits in Western versus Chinese leaders. These results, as well as the numerous more specific findings of the study, have practical implications for international human resource management and areas such as selection, placement and training. Owing to the differing effects of personal and interpersonal factors across groups, it is difficult to achieve the "perfect match" between leader and follower characteristics that would simultaneously contribute to high-quality relationships for Chinese and Western leaders as well as for followers.
However, the results indicate that the ability of organisations to enhance the quality of leader-follower relations by selecting and matching people with suitable characteristics may provide an effective means for organisations to increase positive job attitudes and hence influence work-related outcomes.


The growth of strongly oriented or epitaxial thin films of metal oxides generally requires relatively high growth temperatures or the infusion of energy into the growth surface by means such as ion bombardment. We have grown high-quality epitaxial thin films of Co3O4 on different substrates at a temperature as low as 400 °C by low-pressure metalorganic chemical vapour deposition (MOCVD) using cobalt(II) acetylacetonate as the precursor. With oxygen as the reactant gas, polycrystalline Co3O4 films are formed on glass and Si (100) in the temperature range 400-550 °C. Under similar growth conditions, highly oriented films of Co3O4 are formed on SrTiO3 (100) and LaAlO3 (100). The activation energy for the growth of polycrystalline films on glass is significantly higher than that for epitaxial growth on SrTiO3 (100). The film on LaAlO3 (100) grown at 450 °C shows a rocking-curve FWHM of 1.61 degrees, which reduces to 1.32 degrees when it is annealed in oxygen at 725 °C. The film on SrTiO3 (100) has a FWHM of 0.33 degrees (as deposited) and 0.29 degrees (after annealing at 725 °C). The phi-scan analysis shows cube-on-cube epitaxy on both of these substrates. The quality of epitaxy on SrTiO3 (100) is comparable to the best of the perovskite-based oxide thin films grown at significantly higher temperatures. A plausible mechanism is proposed for the observed low-temperature epitaxy. (C) 2001 Published by Elsevier Science B.V.
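The activation-energy comparison follows from the Arrhenius relation r = A exp(-Ea/RT) evaluated at two growth temperatures. A sketch with synthetic rates (the real growth rates and activation energies are not given here):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def activation_energy(r1, T1, r2, T2):
    """Arrhenius activation energy (J/mol) from growth rates r at temperatures T (K)."""
    return R * math.log(r2 / r1) / (1.0 / T1 - 1.0 / T2)

# Synthetic check: rates generated with Ea = 100 kJ/mol are recovered.
Ea_true = 100e3
T1, T2 = 673.0, 823.0                      # 400 and 550 degC in kelvin
r1 = math.exp(-Ea_true / (R * T1))
r2 = math.exp(-Ea_true / (R * T2))
print(round(activation_energy(r1, T1, r2, T2)))  # 100000
```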


The first-line medication for mild to moderate Alzheimer's disease (AD) is based on cholinesterase inhibitors, which prolong the effect of the neurotransmitter acetylcholine in cholinergic nerve synapses and thereby relieve the symptoms of the disease. Implications of the involvement of cholinesterases in disease-modifying processes have increased interest in this research area. Drug discovery and development is a long and expensive process that takes on average 13.5 years and costs approximately 0.9 billion US dollars. Attrition of drug candidates in the clinical phases is common for several reasons, e.g., poor bioavailability leading to low efficacy, or toxic effects. Thus, improvements in the early drug discovery process are needed to create highly potent, non-toxic compounds with predicted drug-like properties. Nature has been a good source for the discovery of new medicines, accounting for around half of the new drugs approved for the market during the last three decades. These compounds are direct isolates from nature, their synthetic derivatives, or natural mimics. Synthetic chemistry is an alternative way to produce compounds for drug discovery purposes; both sources have pros and cons. The screening of new bioactive compounds in vitro is based on assaying compound libraries against targets. The assay set-up has to be adapted and validated for each screen to produce high-quality data. Depending on the size of the library, miniaturisation and automation are often required to reduce solvent and compound consumption and to speed up the process. In this contribution, natural-extract, natural pure compound and synthetic compound libraries were assessed as sources of new bioactive compounds. The libraries were screened primarily for acetylcholinesterase inhibitory effects and secondarily for butyrylcholinesterase inhibitory effects.
To enable screening of the libraries, two assays were evaluated as screening tools and adapted to be compatible with the special features of each library. The assays were validated to ensure high-quality data. Cholinesterase inhibitors with various potencies and selectivities were found in the natural product and synthetic compound libraries, which indicates that the two sources complement each other. It is acknowledged that natural compounds differ structurally from the compounds in synthetic compound libraries, which further supports the view of complementarity, especially if high structural diversity is the criterion for selecting the compounds in a library.
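Cholinesterase screening assays of this kind are typically read out as percent inhibition against an uninhibited control. A minimal blank-corrected calculation, with hypothetical absorbance values:

```python
# Percent inhibition relative to an uninhibited control, the usual readout
# of enzyme-inhibition screens. The absorbance values below are hypothetical.

def percent_inhibition(sample, control, blank=0.0):
    """100 * (1 - blank-corrected sample signal / blank-corrected control signal)."""
    return 100.0 * (1.0 - (sample - blank) / (control - blank))

print(round(percent_inhibition(sample=0.25, control=1.05, blank=0.05), 1))  # 80.0
```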


The development of high-quality tin monosulphide (SnS) layers is one of the crucial tasks in the fabrication of efficient SnS-based optoelectronic devices. Reducing the strain between the film and the substrate by using an appropriate lattice-matched (LM) substrate is a new approach to the growth of high-quality layers. In this view, SnS films were deposited on an LM Al substrate using the thermal evaporation technique with a low rate of evaporation. The as-grown SnS films were characterized using appropriate techniques, and the results obtained are discussed by comparing them with the properties of SnS films grown on an amorphous substrate under the same conditions. Structural analysis shows that the SnS films deposited on the amorphous substrate have crystallites oriented along different directions, whereas the SnS crystallites grown on the Al substrate exhibited epitaxial growth along the [101] direction. Photoluminescence (PL) and Raman studies reveal that the films grown on the Al substrate have better optical properties than the films grown on amorphous substrates. (C) 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.


Stroke is a major cause of death and disability, incurs significant costs to healthcare systems, and inflicts a severe burden on society as a whole. Stroke care in Finland was described in several population-based studies between 1967 and 1998, but not since. In the PERFECT Stroke study presented here, a system for monitoring the Performance, Effectiveness, and Costs of Treatment episodes in Stroke was developed in Finland. Existing nationwide administrative registries were linked at the individual patient level with personal identification numbers to depict whole episodes of care, from acute stroke, through rehabilitation, until the patients went home, were admitted to permanent institutional care, or died. For comparisons over time and between providers, patient case-mix was adjusted for. The PERFECT Stroke database includes 104 899 first-ever stroke patients over the years 1999 to 2008, of whom 79% had ischemic stroke (IS), 14% intracerebral hemorrhage (ICH), and 7% subarachnoid hemorrhage (SAH). An 18% decrease in the age- and sex-adjusted incidence of stroke was observed over the study period, an improvement of 1.8% annually. The all-cause 1-year case-fatality rate improved from 28.6% to 24.6%, or 0.5% annually. The expected median lifetime after stroke increased by 2 years for IS patients, to 7 years and 7 months, and by 1 year for ICH patients, to 4 years and 5 months. No change could be seen in median SAH patient survival, >10 years. Stroke prevalence was 82 000, or 1.5% of the total population of Finland, in 2008. Modern stroke center care was shown to be associated with a decrease in both death and the risk of institutional care of stroke patients. The number needed to treat to prevent these poor outcomes at one year from stroke was 32 (95% confidence interval 26 to 42). Despite improvements over the study period, more than a third of Finnish stroke patients did not have access to stroke center care.
The mean first-year healthcare cost of a stroke patient was ~€20 000, and among survivors ~€10 000 annually thereafter. Only part of this cost was incurred by stroke itself, as the same patients cost ~€5000 over the year prior to stroke. Total lifetime costs after a first-ever stroke were ~€85 000. A total of €1.1 billion, 7% of all healthcare expenditure, is used in the treatment of stroke patients annually. Despite a rapidly aging population, the number of new stroke patients is decreasing, and the patients are more likely to survive. This is explained in part by stroke center care, which is effective and should be made available to all stroke patients. It is possible, in a suitable setting with high-quality administrative registries and a common identifier, to avoid the huge workload and associated costs of setting up a conventional stroke registry, and still acquire a fairly comprehensive dataset on stroke care and outcome.
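The number needed to treat reported above is the reciprocal of the absolute risk reduction, so an NNT of 32 corresponds to roughly a 3-percentage-point reduction in poor outcomes. A sketch with illustrative event rates (not the study's exact figures):

```python
# Number needed to treat: the reciprocal of the absolute risk reduction.
# The event rates below are illustrative, not the study's actual figures.

def nnt(rate_control, rate_treated):
    """Patients that must receive the treatment to prevent one poor outcome."""
    arr = rate_control - rate_treated      # absolute risk reduction
    return 1.0 / arr

# An NNT of 32 corresponds to an absolute risk reduction of ~3.1 points:
print(round(nnt(0.281, 0.250), 1))  # 32.3
```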


Polycrystalline diamond coatings are grown on Si (100) substrates by the hot-filament CVD technique. We investigate here the effect of substrate roughening on the substrate temperature and methane concentration required to maintain high quality, a high growth rate and faceted morphology of the diamond coatings. It is shown that as the substrate roughness is increased from 0.05 µm to 0.91 µm (centre-line average, CLA), the quality of the deposited film is enhanced (the Raman peak intensity ratio of sp3 to non-sp3 content increases from 1.65 to 7.13) and the substrate temperature can be brought down to 640 °C without any additional substrate heating. The coatings grown under conditions adverse to sp3 deposition have a cauliflower morphology with nanocrystalline grains, whereas coatings grown under favourable sp3 conditions give clearly faceted grains.


Paraserianthes falcataria is a very fast-growing, light-wood tree species that has recently gained wide interest in Indonesia for industrial wood processing. At the moment, the P. falcataria plantations managed by smallholders lack predefined management programmes for commercial wood production. The general objective of this study was to model the growth and yield of Paraserianthes falcataria stands managed by smallholders in Ciamis, West Java, Indonesia, and to develop management scenarios for different production objectives. In total, 106 circular sample plots with over 2300 P. falcataria trees were assessed in a smallholder plantation inventory. In addition, information on the market prices of P. falcataria wood was collected through rapid appraisals among industries. A tree growth model based on the Chapman-Richards function was developed for three different site qualities, and stand management scenarios were developed under three management objectives: (1) low initial stand density with low-intensity stand management, (2) high initial stand density with a medium intensity of intervention, and (3) high initial stand density with strong silvicultural interventions, repeated more than once. In general, the nine recommended scenarios have rotation ages varying from 4 to 12 years, planting densities from 4x4 metres (625 trees ha-1) to 3x2 metres (1666 trees ha-1), and thinnings at intensities removing 30 to 60% of the standing trees. The highest annual income would be generated on high-quality sites with a scenario combining an initial planting density of 3x2 m (1666 trees ha-1), one thinning removing 55% of the standing trees at the age of 2 years, and clear-cutting at the age of 4 years.
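The Chapman-Richards function underlying the growth model has the form y(t) = A(1 - e^(-kt))^p, where A is the asymptote. A sketch with illustrative parameter values (not the coefficients fitted for Ciamis):

```python
import math

# Chapman-Richards growth curve y(t) = A * (1 - exp(-k*t)) ** p.
# The parameter values are illustrative, not the fitted Ciamis values.

def chapman_richards(t, A=30.0, k=0.35, p=1.4):
    """Stand growth proxy at age t (years); A is the asymptotic maximum."""
    return A * (1.0 - math.exp(-k * t)) ** p

for age in (2, 4, 8, 12):
    print(age, round(chapman_richards(age), 1))
```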


Potato virus Y (PVY) currently causes the greatest yield and quality losses of potato (Solanum tuberosum L.) worldwide. Although the yield loss caused by PVY alone is difficult to measure, it has been estimated at 20-80%. The most important route of spread of the virus is infected seed potato. High-quality seed potato is a prerequisite for the production of table, food-industry and starch potatoes. Visual inspection of the crop generally underestimates the incidence of PVY. Laboratory testing provides more accurate information on the degree of infection of the harvested crop. A problem in testing for PVY is that it is not detected as reliably in samples taken from dormant tubers as in samples from potatoes that have already passed dormancy. Various methods, from chemicals (Rindite, bromoethane) to plant hormones (e.g., gibberellic acid) and changes in storage conditions (cold and heat treatment), have been tried for breaking potato dormancy, but the results have been variable. In this thesis, an oxygen-carbon dioxide treatment (40% O2 and 20% CO2) with treatment periods of different lengths was used to break potato dormancy. The aim was to determine whether the treatment affects the sprouting of the potatoes and breaks dormancy earlier than it would break naturally, or affects the detection of PVY. A further aim was to determine whether PVY can be assayed by ELISA (enzyme-linked immunosorbent assay) as reliably from other parts of the plant (tuber, sprout) as from the potato leaf, which is currently in general use. The effects of the sprouting treatment on dormancy release were variable and could hardly be generalized. The treatment was also not observed to affect the detection of PVY infection when different sample materials were tested. When the performance of the different plant parts in the assay was compared, tuber material was found to underestimate PVY infection in all experiments.
Sprout material also generally underestimated PVY infection in the ELISA determinations. The most reliable test material was the potato leaf.
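ELISA readings such as these are scored against a cut-off derived from negative controls; a common rule, assumed here rather than taken from the thesis, is the negative-control mean plus three standard deviations:

```python
import statistics

# A common ELISA decision rule: a sample is scored positive when its
# absorbance exceeds mean(negative controls) + 3 * SD. The exact rule used
# in the thesis is not stated; this threshold is an assumption, and the
# absorbance values are hypothetical.

def is_positive(sample_od, negative_controls, k=3.0):
    """True if the sample absorbance exceeds the negative-control cut-off."""
    mean = statistics.mean(negative_controls)
    sd = statistics.stdev(negative_controls)
    return sample_od > mean + k * sd

negatives = [0.08, 0.10, 0.09, 0.11]
print(is_positive(0.45, negatives))  # True
print(is_positive(0.12, negatives))  # False
```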


Listeria monocytogenes is the causative agent of the severe foodborne infection listeriosis. The number of listeriosis cases has increased in recent years in many European countries, including Finland. Contamination with the pathogen needs to be minimised, and its growth to high numbers in foods prevented, in order to reduce the incidence of human cases. The aim of this study was to evaluate the contamination routes of L. monocytogenes in the food chain and to investigate methods for controlling the pathogen in food processing. L. monocytogenes was commonly found in wild birds, in the pig production chain and in pork production plants. It was found most frequently in birds feeding at a landfill site, on organic farms, in tonsil samples, and at sites associated with brining. L. monocytogenes isolates from birds, farms, food processing plants and foods did not form distinct genetic groups; rather, the populations overlapped. The majority of the genotypes recovered from birds were also detected in foods, food processing environments and other animal species, so birds may disseminate L. monocytogenes into the food chain. Similar genotypes were found in different pigs on the same farm, as well as in pigs on farms and later in the slaughterhouse. L. monocytogenes contamination thus spreads at the farm level and may be a source of contamination into slaughterhouses and further into meat. Incoming raw pork at the processing plant was frequently contaminated with L. monocytogenes, and genotypes found in raw meat were also found in the processing environment and in RTE products. Thus, raw material appears to be a considerable source of contamination in processing facilities. In the pork processing plant, the prevalence of L. monocytogenes increased in the brining area, showing that brining was an important contamination site. Recovery of the inoculated L. monocytogenes strains showed strain-specific differences in the ability to survive in lettuce and dry sausage. The ability of some L. monocytogenes strains to survive well in food production poses a challenge for industry, because these strains can be especially difficult to remove from products, and it creates a need for an appropriate hurdle concept to control the most resistant strains. Control of L. monocytogenes can be implemented throughout the food chain. Farm-specific factors affected the prevalence of L. monocytogenes, and good farm-level practices can therefore be utilised to reduce the prevalence of this pathogen on the farm and possibly further along the food chain. Well-separated areas in a pork production plant had low prevalences of L. monocytogenes, showing that compartmentalisation controls the pathogen in the processing line. The food processing plant, especially the brining area, should be subjected to disassembly, extensive cleaning and disinfection to eliminate persistent contamination by L. monocytogenes, and replacing brining with dry-salting should be considered. All of the evaluated washing solutions decreased the populations of L. monocytogenes on precut lettuce but did not eliminate the pathogen. Thus, the safety of fresh-cut produce cannot rely on washing with disinfectants, and high-quality raw material and good manufacturing practices remain important. L. monocytogenes was detected at higher levels in sausages without the protective culture than in sausages with the protective strain, although the numbers of L. monocytogenes had decreased to below 100 MPN/g in all sausages by the end of ripening. Protective starter cultures provide an appealing hurdle in dry sausage processing and assist in the control of L. monocytogenes.


The world of mapping has changed. Earlier, only professional experts were responsible for map production, but today ordinary people without any training or experience can become map-makers. The number of online mapping sites and the number of volunteer mappers have increased significantly. The development of technology, such as satellite navigation systems, Web 2.0, broadband Internet connections, and smartphones, has played a key role in enabling the rise of volunteered geographic information (VGI). As the opening of governmental data to the public is a current topic in many countries, the opening of high-quality geographical data has a central role in this study. The aim of this study is to investigate the quality of spatial data produced by volunteers by comparing it with map data produced by public authorities, to follow what occurs when spatial data are opened to users, and to become acquainted with the user profile of these volunteer mappers. A central part of this study is the OpenStreetMap project (OSM), whose aim is to create a map of the entire world through volunteer effort. Anyone can become an OpenStreetMap contributor, and the data created by the volunteers are free for anyone to use, without restrictive copyrights or licence charges. In this study, OpenStreetMap is investigated from two viewpoints. In the first part of the study, the aim was to investigate the quality of volunteered geographic information. A pilot project was implemented to follow what occurs when high-resolution aerial imagery is released freely to OpenStreetMap contributors. The quality of VGI was investigated by comparing the OSM datasets with the map data of the National Land Survey of Finland (NLS). The quality of the OpenStreetMap data was investigated by inspecting the positional accuracy and completeness of the road datasets, as well as the differences in the attribute datasets between the studied datasets.
The OSM community was also analysed, and the development of the OpenStreetMap map data was investigated by visual analysis. The aim of the second part of the study was to analyse the user profile of OpenStreetMap contributors and to investigate how the contributors act when collecting data and editing OpenStreetMap. A further aim was to investigate what motivates users to map and how the quality of volunteered geographic information is perceived. The second part of the study was implemented by conducting a web survey of OpenStreetMap contributors. The results of the study show that the quality of OpenStreetMap data, compared with the data of the National Land Survey of Finland, can be described as good. OpenStreetMap differs from the map of the National Land Survey especially in its degree of uncertainty: for example, the completeness and uniformity of the map are not known. The results reveal that opening spatial data notably increased the amount of data in the study area, and both the positional accuracy and the completeness improved significantly. The study confirms earlier findings that a small minority of contributors have created the majority of the data in OpenStreetMap. The survey of OpenStreetMap users revealed that data are most often collected on foot or by bicycle using a GPS device, or by editing the map with the help of aerial imagery. According to the responses, the users take part in the OpenStreetMap project because they want to make maps better and to produce maps containing up-to-date information that cannot be found on any other maps. Almost all of the users exploit the maps themselves, the most popular methods being downloading the map onto a navigator or a mobile device. The users regard the quality of OpenStreetMap as good, especially because of the up-to-dateness and accuracy of the map.
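Positional accuracy comparisons such as the OSM-versus-NLS one above are often summarized as a root-mean-square error over matched point pairs. A haversine-based sketch with made-up coordinates (not data from the study):

```python
import math

# Root-mean-square positional error between matched OSM and reference
# points, using the haversine distance. The coordinates are made-up examples.

R_EARTH = 6371000.0  # mean Earth radius, metres

def haversine(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (lat, lon) points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R_EARTH * math.asin(math.sqrt(a))

def rmse(pairs):
    """pairs: list of ((lat, lon) OSM point, (lat, lon) reference point)."""
    sq = [haversine(*a, *b) ** 2 for a, b in pairs]
    return math.sqrt(sum(sq) / len(sq))

matched = [((60.1700, 24.9400), (60.17001, 24.94003)),
           ((60.2000, 24.9600), (60.20004, 24.95998))]
print(round(rmse(matched), 1))  # metres
```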