6 results for integrated navigation systems
in Helda - Digital Repository of the University of Helsinki
Abstract:
The world of mapping has changed. Earlier, only professional experts were responsible for map production, but today ordinary people without any training or experience can become map-makers. The number of online mapping sites and the number of volunteer mappers have increased significantly. The development of technology, such as satellite navigation systems, Web 2.0, broadband Internet connections, and smartphones, has played a key role in enabling the rise of volunteered geographic information (VGI). As opening governmental data to the public is a current topic in many countries, the opening of high-quality geographical data has a central role in this study. The aims of this study are to investigate the quality of spatial data produced by volunteers by comparing it with map data produced by public authorities, to follow what occurs when spatial data are opened to users, and to become acquainted with the user profile of these volunteer mappers. A central part of this study is the OpenStreetMap (OSM) project, whose aim is to create a map of the entire world through volunteer effort. Anyone can become an OpenStreetMap contributor, and the data created by the volunteers are free for anyone to use, without restrictive copyrights or license charges. In this study OpenStreetMap is investigated from two viewpoints. In the first part of the study, the aim was to investigate the quality of volunteered geographic information. A pilot project was implemented by following what occurred when high-resolution aerial imagery was released freely to OpenStreetMap contributors. The quality of VGI was investigated by comparing the OSM datasets with the map data of the National Land Survey of Finland (NLS), inspecting the positional accuracy and the completeness of the road datasets, as well as the differences in the attribute data between the studied datasets.
The OSM community was also analysed, and the development of the OpenStreetMap map data was investigated by visual analysis. The aim of the second part of the study was to analyse the user profile of OpenStreetMap contributors and to investigate how the contributors act when collecting data and editing OpenStreetMap. A further aim was to investigate what motivates users to map and how they perceive the quality of volunteered geographic information. The second part of the study was implemented by conducting a web survey of OpenStreetMap contributors. The results show that the quality of OpenStreetMap data, compared with the data of the National Land Survey of Finland, can be considered good. OpenStreetMap differs from the NLS map especially in its degree of uncertainty; for example, the completeness and uniformity of the map are not known. The results also reveal that opening spatial data notably increased the amount of data in the study area, and both the positional accuracy and the completeness improved significantly. The study confirms earlier findings that only a few contributors have created the majority of the data in OpenStreetMap. The survey of OpenStreetMap users revealed that data are most often collected on foot or by bicycle using a GPS device, or by editing the map with the help of aerial imagery. According to the responses, users take part in the OpenStreetMap project because they want to make maps better and to produce maps containing up-to-date information that cannot be found on any other map. Almost all of the users make use of the maps themselves, the most popular methods being downloading the map to a navigator or a mobile device. The users regard the quality of OpenStreetMap as good, especially because of the currency and the accuracy of the map.
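The positional-accuracy comparison of road datasets described above can be illustrated with a minimal sketch: a buffer-based check in which sampled OSM road points are tested against a reference centreline. The coordinates, buffer width, and function names below are hypothetical illustrations, not the thesis's actual method or data.

```python
import math

def point_segment_dist(p, a, b):
    # Euclidean distance from point p to the line segment a-b.
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def within_buffer_share(osm_points, ref_polyline, buffer_m):
    """Share of sampled OSM road points lying within buffer_m of the reference line."""
    segs = list(zip(ref_polyline, ref_polyline[1:]))
    hits = sum(
        1 for p in osm_points
        if min(point_segment_dist(p, a, b) for a, b in segs) <= buffer_m
    )
    return hits / len(osm_points)

# Hypothetical coordinates in metres: a reference road and OSM samples.
ref = [(0, 0), (100, 0), (200, 10)]
osm = [(10, 2), (50, -3), (150, 4), (190, 40)]  # last point is a clear outlier
print(within_buffer_share(osm, ref, buffer_m=10))  # 0.75
```

A completeness comparison would work analogously, relating the total road length in the OSM dataset to that of the reference dataset per study-area cell.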
Abstract:
It has been suggested that semantic information processing is modularized according to the input form (e.g., visual, verbal, non-verbal sound). A great deal of research has concentrated on detecting a separate verbal module. It has also traditionally been assumed in linguistics that the meaning of a single clause is computed before integration into a wider context. Recent research has called these views into question. The present study explored whether it is reasonable to assume separate verbal and nonverbal semantic systems in the light of evidence from event-related potentials (ERPs). The study also provided information on whether the context influences the processing of a single clause before its local meaning is computed. The focus was on an ERP component called the N400, whose amplitude is assumed to reflect the effort required to integrate an item into the preceding context. For instance, if a word is anomalous in its context, it will elicit a larger N400. The N400 has been observed in experiments using both verbal and nonverbal stimuli. The contents of a single sentence alone were not hypothesized to influence the N400 amplitude; only the combined contents of the sentence and the picture were. The subjects (n = 17) viewed pictures on a computer screen while hearing sentences through headphones. Their task was to judge the congruency of the picture and the sentence. There were four conditions: 1) the picture and the sentence were congruent and sensible, 2) the picture and the sentence were congruent, but the sentence ended anomalously, 3) the picture and the sentence were incongruent but sensible, and 4) the picture and the sentence were incongruent and anomalous. Stimuli from the four conditions were presented in a semi-randomized sequence while the subjects' electroencephalogram was recorded, and ERPs were computed for each of the four conditions. The amplitude of the N400 effect was largest for the incongruent sentence–picture pairs.
The anomalously ending sentences did not elicit a larger N400 than the sensible sentences. The results suggest that there is no separate verbal semantic system, and that the meaning of a single clause is not processed independently of its context.
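The computation of condition-averaged ERPs and an N400 amplitude can be sketched minimally as below. The sampling rate, the 300–500 ms window, and the toy voltages are common conventions assumed for illustration, not parameters reported in the study.

```python
# Minimal sketch: averaging single-trial epochs into an ERP per condition
# and taking the mean voltage in an assumed 300-500 ms N400 window.

FS = 100  # sampling rate in Hz (hypothetical)

def average_erp(epochs):
    """Point-by-point average of single-trial epochs (lists of microvolts)."""
    n = len(epochs)
    return [sum(samples) / n for samples in zip(*epochs)]

def n400_amplitude(erp, start_ms=300, end_ms=500):
    """Mean voltage of the averaged waveform in the N400 time window."""
    i0, i1 = start_ms * FS // 1000, end_ms * FS // 1000
    window = erp[i0:i1]
    return sum(window) / len(window)

# Two toy 600 ms trials with a negative deflection in the N400 window.
trial_a = [0.0] * 30 + [-4.0] * 20 + [0.0] * 10
trial_b = [0.0] * 30 + [-6.0] * 20 + [0.0] * 10
erp = average_erp([trial_a, trial_b])
print(n400_amplitude(erp))  # -5.0
```

Comparing such mean amplitudes across the four conditions is what supports statements like "the N400 effect was largest for the incongruent sentence–picture pairs".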
Abstract:
Forest management is facing new challenges under climate change. By adjusting thinning regimes, conventional forest management can be adapted to various objectives of forest resource utilization, such as wood quality, forest bioenergy, and carbon sequestration. This thesis aims to develop and apply a simulation-optimization system as a tool for an interdisciplinary understanding of the interactions between wood science, forest ecology, and forest economics. For this purpose, the OptiFor software was developed for forest resources management. The OptiFor simulation-optimization system integrates the process-based growth model PipeQual, wood quality models, biomass production and carbon emission models, as well as energy wood and commercial logging models, into a single optimization model. Osyczka's direct and random search algorithm was employed to identify optimal values for a set of decision variables. The numerical studies in this thesis broadened current knowledge and understanding of the relationships between wood science, forest ecology, and forest economics. The results for timber production show that optimal thinning regimes depend on site quality and initial stand characteristics. When wood properties were taken into account, the results show that increasing the intensity of thinning resulted in lower wood density and shorter fibers. The addition of nutrients accelerated volume growth but lowered wood quality for Norway spruce. Integrating energy wood harvesting into conventional forest management showed that conventional forest management without energy wood harvesting was still superior in sparse stands of Scots pine, whereas energy wood harvesting from pre-commercial thinning turned out to be optimal for dense stands. When the carbon balance is taken into account, the results show that changing the carbon assessment method leads to very different optimal thinning regimes and average carbon stocks.
Raising the carbon price resulted in longer rotations and a higher mean annual increment, as well as a significantly higher average carbon stock over the rotation.
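The random-search component of a direct-and-random-search optimizer over thinning decision variables can be sketched as follows. The objective function here is a stand-in toy surface with an interior optimum; the real OptiFor objective is evaluated by running the PipeQual-based simulator, and the variable names and bounds below are hypothetical.

```python
import random

def net_present_value(thinning_intensity, rotation_years):
    """Toy stand-in objective (maximum at intensity 0.3, rotation 80 years)."""
    return -(thinning_intensity - 0.3) ** 2 - ((rotation_years - 80.0) / 100.0) ** 2

def random_search(objective, bounds, n_iter=2000, seed=1):
    """Sample decision vectors uniformly within bounds; keep the best."""
    rng = random.Random(seed)
    best_x, best_f = None, float("-inf")
    for _ in range(n_iter):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        f = objective(*x)
        if f > best_f:
            best_x, best_f = x, f
    return best_x, best_f

# Hypothetical decision variables: thinning intensity and rotation length.
bounds = [(0.0, 0.6), (40.0, 120.0)]
x_best, f_best = random_search(net_present_value, bounds)
```

In Osyczka's scheme, such random sampling is interleaved with direct (local) search steps around the incumbent; the sketch keeps only the random half for brevity.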
Abstract:
In smaller countries, where the key players in construction IT development tend to know each other personally and where public R&D funding is concentrated in a few channels, IT roadmaps and strategies would seem to have a better chance of influencing development than in the bigger industrial countries. In this paper, Finland and the RATAS project are presented as a historical case illustrating such impact. RATAS was initiated as a construction IT roadmap project in 1985, involving many of the key organisations and companies active in construction sector development. Several of the individuals who took an active part in the project have played an important role in later developments, both in Finland and on the international scene. The central result of RATAS was the identification of what is nowadays called Building Information Modelling (BIM) technology as the central issue in putting IT to efficient use in the construction sector. BIM, earlier referred to as building product modelling, has been a key ingredient in many roadmaps since, and the subject of international standardisation efforts such as STEP and the IAI/IFCs. In hindsight, the RATAS project can be seen as a forerunner whose impact also transcended national borders.
Abstract:
The question of what a business-to-business (B2B) collaboration setup and enactment application system should look like remains open. An important element of such collaboration is the controlled inter-organizational disclosure of business-process details, so that the participating parties may protect their business secrets. For that purpose, eSourcing [37] was developed as a general business-process collaboration concept in the framework of the EU research project CrossWork. The eSourcing characteristics guide the design and evaluation of an eSourcing Reference Architecture (eSRA) that serves as a starting point for developers of B2B-collaboration software. In this paper we present the results of a scenario-based evaluation method applied to the earlier specified eSourcing Architecture (eSA), which yields the risk, sensitivity, and tradeoff points that must be attended to if eSA is implemented. Additionally, the evaluation method detects shortcomings of eSA in terms of the integrated components required for electronic B2B collaboration. The evaluation results are used for the specification of eSRA, which comprises, on three refinement levels, all extensions for incorporating the results of the scenario-based evaluation.
Abstract:
The present study evaluates the feasibility of undelimbed Scots pine (Pinus sylvestris L.) for the integrated production of pulp and energy in a kraft pulp mill from the technical, economic, and environmental points of view, focusing on the potential of bundle harvesting. The feasibility of tree sections for pulp production was tested by conducting an industrial wood-handling experiment and laboratory cooking and bleaching trials, using conventional small-diameter Scots pine pulpwood as a reference. These trials showed that undelimbed Scots pine sections can, under favourable conditions, be processed as a blend with conventional small-diameter pulpwood without reducing pulp quality. However, fibre losses at various phases of the process may increase when undelimbed material is used. In the economic evaluation, both pulp production and wood procurement costs were considered, using the relative wood paying capability of a kraft pulp mill as the determinant. The calculations were made for three Scots pine first-thinning stands, with the breast-height diameter of the removal (6–12 cm) as the main distinguishing factor. The supply chains included in the comparison were based on cut-to-length harvesting, whole-tree harvesting, and bundle harvesting (whole-tree bundling). At the current ratio of pulp and energy prices, the wood paying capability declines as the proportion of the energy fraction in the raw material increases. The supply system based on the cut-to-length method was the most efficient option, resulting in the highest residual value at stump in most cases. A decline in the pulp price and an increase in the energy price improved the competitiveness of the whole-tree systems. With short truck transportation distances and low pulp prices, however, harvesting loose whole trees can result in a higher residual value at stump in small-diameter stands.
While savings in transportation costs did not compensate for the high cutting and compaction costs of the second prototype of the bundle harvester, an increase in transportation distances improved its competitiveness. Since harvesting undelimbed assortments increases nutrient export from the site, which can affect soil productivity, the whole-tree alternatives included in the present study cannot be recommended on infertile peatlands and mineral soils. Harvesting loose or bundled whole trees implies a reduction in protective logging residues and an increase in site traffic or payloads. These factors increase the risk of soil damage, especially on peat soils with poor bearing capacity. Within the wood procurement parameters examined, the CO2 emissions of the supply systems varied from 13 to 27 kg per m3. Compaction of whole trees into bundles reduced emissions from transportation by 30–39%, but these reductions were insufficient to compensate for the increased emissions from cutting and compaction.
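The economic comparison above rests on a simple identity: the residual value at stump is the mill's wood paying capability minus all supply-chain costs. A toy sketch of that comparison follows; every EUR/m3 figure is a wholly hypothetical illustration, not a value from the study.

```python
# Toy sketch of the residual-value-at-stump comparison between supply chains.
# All cost and paying-capability figures are invented for illustration.

def residual_value_at_stump(paying_capability, supply_chain_costs):
    """Mill's wood paying capability minus total procurement costs (EUR/m3)."""
    return paying_capability - sum(supply_chain_costs.values())

# Hypothetical cost structures (EUR/m3) for two of the compared systems.
cut_to_length = {"cutting": 12.0, "forwarding": 5.0, "transport": 8.0}
whole_tree_bundling = {"cutting_and_compaction": 18.0, "forwarding": 4.0, "transport": 5.5}

# A larger energy fraction lowers the paying capability of the raw material.
print(residual_value_at_stump(40.0, cut_to_length))        # 15.0
print(residual_value_at_stump(38.0, whole_tree_bundling))  # 10.5
```

The sketch mirrors the study's qualitative finding: bundling trims the transport cost per cubic metre, but higher cutting-and-compaction costs and a lower paying capability can still leave the cut-to-length chain with the higher residual value.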