944 results for GIS data and services


Relevance: 100.00%

Abstract:

The aim of this work was to collect dependability data on the flue gas line of two Finnish pulp mills, from their commissioning up to the present day. Dependability data consists of reliability data and maintenance data. The collected data makes it possible to describe the plant's dependability accurately with the following indicators: the number of unplanned failures and their repair times, equipment downtime, failure probability, and corrective maintenance costs relative to the total corrective maintenance costs of the flue gas line. The method used to collect the dependability data is presented. The method used to identify the critical equipment of the flue gas line combines a questionnaire survey with a modified failure mode, effects and criticality analysis. The criteria for selecting equipment for the final criticality analysis were decided on the basis of the dependability data and the questionnaire survey. The purpose of identifying the critical equipment is to find those devices in the flue gas line whose unexpected failure causes the most severe consequences for the reliability, production, safety, emissions, and costs of the flue gas line. With this knowledge, limited maintenance resources can be allocated correctly. The criticality analysis shows that the three most critical devices in the flue gas line are the same for both pulp mills: flue gas fans, drag conveyors, and chain conveyors. The dependability data shows that equipment reliability is mill-specific, but broadly the same main trends can be seen in the plots of unplanned failure probability. The costs, expressed as the ratio of a device's unplanned maintenance costs to the total costs of the flue gas line, follow quite closely the reliability curve calculated as the ratio of the device's downtime to its operating hours. Dependability data collection combined with criticality analysis makes it possible to target and schedule preventive maintenance correctly over the equipment's lifetime so that reliability and cost-efficiency requirements are met.
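The abstract lists the indicators but not their formulas. As a rough sketch with invented figures and common textbook definitions (not necessarily the thesis's exact formulas), they could be computed like this:

```python
# Dependability indicators for one device; all figures are invented and the
# definitions are common textbook ones, not necessarily those of the thesis.
failures = 12                                   # unplanned failures in the follow-up period
repair_hours = [4, 8, 3, 6, 5, 7, 2, 9, 4, 6, 5, 3]   # repair time per failure
operating_hours = 8000                          # hours the device was in use

mttr = sum(repair_hours) / failures             # mean time to repair
downtime = sum(repair_hours)                    # total downtime caused by failures
downtime_ratio = downtime / operating_hours     # downtime relative to operating hours
failure_rate = failures / operating_hours       # unplanned failures per operating hour

device_cost = 15_000.0                          # corrective maintenance cost of the device
line_cost_total = 120_000.0                     # corrective maintenance cost of the whole line
cost_share = device_cost / line_cost_total      # the cost indicator described above

print(f"MTTR {mttr:.1f} h, downtime ratio {downtime_ratio:.4f}, "
      f"failure rate {failure_rate:.5f} 1/h, cost share {cost_share:.1%}")
```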

Relevance: 100.00%

Abstract:

Probabilistic inversion methods based on Markov chain Monte Carlo (MCMC) simulation are well suited to quantify parameter and model uncertainty of nonlinear inverse problems. Yet, application of such methods to CPU-intensive forward models can be a daunting task, particularly if the parameter space is high dimensional. Here, we present a 2-D pixel-based MCMC inversion of plane-wave electromagnetic (EM) data. Using synthetic data, we investigate how model parameter uncertainty depends on model structure constraints using different norms of the likelihood function and the model constraints, and study the added benefits of joint inversion of EM and electrical resistivity tomography (ERT) data. Our results demonstrate that model structure constraints are necessary to stabilize the MCMC inversion results of a highly discretized model. These constraints decrease model parameter uncertainty and facilitate model interpretation. A drawback is that these constraints may lead to posterior distributions that do not fully include the true underlying model, because some of its features exhibit a low sensitivity to the EM data, and hence are difficult to resolve. This problem can be partly mitigated if the plane-wave EM data is augmented with ERT observations. The hierarchical Bayesian inverse formulation introduced and used herein is able to successfully recover the probabilistic properties of the measurement data errors and a model regularization weight. Application of the proposed inversion methodology to field data from an aquifer demonstrates that the posterior mean model realization is very similar to that derived from a deterministic inversion with similar model constraints.
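As an illustration of the approach, the core of a random-walk Metropolis sampler with a simple model-structure (smoothness) prior can be sketched as below; the forward model, noise level, and regularization weight are placeholders standing in for the paper's plane-wave EM (and ERT) solvers and its hierarchical treatment of data errors:

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(m):
    # Placeholder forward model; the paper's version would be a 2-D
    # plane-wave EM (and optionally ERT) solver acting on the pixel model m.
    return np.cumsum(m)

def log_likelihood(m, d_obs, sigma):
    r = forward(m) - d_obs
    return -0.5 * np.sum((r / sigma) ** 2)

def log_prior(m, reg_weight):
    # Model structure constraint: penalize jumps between neighboring pixels.
    return -reg_weight * np.sum(np.abs(np.diff(m)))

# Synthetic truth and noisy data
m_true = np.array([1.0, 1.0, 2.0, 2.0, 1.5])
d_obs = forward(m_true) + rng.normal(0.0, 0.05, m_true.size)

m = np.ones_like(m_true)
logp = log_likelihood(m, d_obs, 0.05) + log_prior(m, 1.0)
samples = []
for _ in range(20000):
    prop = m + rng.normal(0.0, 0.05, m.size)        # random-walk proposal
    logp_prop = log_likelihood(prop, d_obs, 0.05) + log_prior(prop, 1.0)
    if np.log(rng.random()) < logp_prop - logp:     # Metropolis accept/reject
        m, logp = prop, logp_prop
    samples.append(m.copy())

posterior = np.array(samples[5000:])                # discard burn-in
print(posterior.mean(axis=0), posterior.std(axis=0))  # parameter uncertainty
```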

Relevance: 100.00%

Abstract:

Market segmentation first emerged in the 1950s and has since been one of the fundamental concepts of marketing. Most segmentation research, however, has focused on consumer market segmentation, while business and industrial market segmentation has received less attention. The aim of this study is to create a segmentation model for industrial markets from the perspective of a provider of IT products and services. The purpose is to determine whether the case company's current customer databases enable effective segmentation, to identify suitable segmentation criteria, and to assess whether and how the databases should be developed to enable more effective segmentation. The intention is to create a single model shared by the different business units; the objectives of each unit must therefore be taken into account to avoid conflicts of interest. The research methodology is a case study. Both secondary sources and primary sources, such as the case company's own databases and interviews, were used. The starting point of the study was the research problem: can database-driven segmentation be used for profitable customer relationship management in the SME sector? The goal is to create a segmentation model that exploits the information in the databases without compromising the conditions for effective and profitable segmentation. The theoretical part examines segmentation in general, with an emphasis on industrial market segmentation, aiming to give a clear picture of the various approaches to the topic and to deepen the view of the most important theories. The database analysis revealed clear deficiencies in the customer data: basic contact information exists, but information usable for segmentation is very limited. Data acquisition from resellers and wholesalers should be improved in order to obtain end-customer information. Segmentation based on the current data relies mainly on secondary information such as industry and company size, and even this information is not available for all companies in the database.
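As a purely hypothetical illustration of segmentation on such secondary criteria (industry and company size), with made-up company records and class limits:

```python
import pandas as pd

# Hypothetical customer database extract; in the study only secondary data
# such as industry and company size were reliably available for segmentation.
customers = pd.DataFrame({
    "company": ["A Oy", "B Oy", "C Oy", "D Oy"],
    "industry": ["manufacturing", "retail", "manufacturing", None],
    "employees": [12, 140, 45, 8],
})

bins = [0, 49, 249, float("inf")]
labels = ["micro/small", "medium", "large"]
customers["size_class"] = pd.cut(customers["employees"], bins=bins, labels=labels)

# Two-dimensional segments from the available secondary criteria; a record with
# missing industry data (a deficiency noted in the analysis) drops out of the grouping
segments = customers.groupby(["industry", "size_class"], observed=True).size()
print(segments)
```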

Relevance: 100.00%

Abstract:

The article discusses the development of WEBDATANET, established in 2011 as a COST Action (IS 1004) to create a multidisciplinary network of web-based data collection experts in Europe: (web) survey methodologists, psychologists, sociologists, linguists, economists, Internet scientists, and media and public opinion researchers. Topics include the network's 190 experts in 30 European countries and abroad, the establishment of web-based teaching and discussion platforms, working groups and task forces, and the scope of the research carried out by WEBDATANET. In light of the growing importance of web-based data in the social and behavioral sciences, the aim was to accumulate and synthesize knowledge regarding methodological issues of web-based data collection (surveys, experiments, tests, non-reactive data, and mobile Internet research) and to foster its scientific usage in a broader community.

Relevance: 100.00%

Abstract:

In any discipline where uncertainty and variability are present, it is important to have principles which are accepted as inviolate and which should therefore drive statistical modelling, statistical analysis of data and any inferences from such an analysis. Despite the fact that two such principles have existed over the last two decades, and from these a sensible, meaningful methodology has been developed for the statistical analysis of compositional data, the application of inappropriate and/or meaningless methods persists in many areas of application. This paper identifies at least ten common fallacies and confusions in compositional data analysis with illustrative examples, and provides readers with necessary, and hopefully sufficient, arguments to persuade the culprits why and how they should amend their ways.
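The abstract does not name the methodology, but the log-ratio approach commonly associated with compositional data analysis can be illustrated briefly (the composition values below are made up):

```python
import numpy as np

def closure(x):
    # Rescale positive parts so each composition sums to 1.
    x = np.asarray(x, dtype=float)
    return x / x.sum(axis=-1, keepdims=True)

def clr(x):
    # Centered log-ratio transform: log of each part relative to the
    # geometric mean of its composition; avoids the spurious-correlation
    # pitfalls of analyzing raw percentages directly.
    x = closure(x)
    g = np.exp(np.log(x).mean(axis=-1, keepdims=True))
    return np.log(x / g)

# Two 3-part compositions (e.g., mineral percentages; invented values)
comp = np.array([[60.0, 30.0, 10.0],
                 [20.0, 50.0, 30.0]])
print(clr(comp))  # coordinates on which standard multivariate statistics are valid
```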

Relevance: 100.00%

Abstract:

During the last part of the 1990s the chance of surviving breast cancer increased. Changes in survival functions reflect a mixture of effects: both the introduction of adjuvant treatments and early screening with mammography played a role in the decline in mortality. Evaluating the contribution of these interventions using mathematical models requires survival functions before and after their introduction. Furthermore, the required survival functions may differ by age group and are related to disease stage at diagnosis. Sometimes detailed information is not available, as was the case for the region of Catalonia (Spain); one may then derive the functions using information from other geographical areas. This work presents the methodology used to estimate age- and stage-specific Catalan breast cancer survival functions from scarce Catalan survival data by adapting the age- and stage-specific US functions. Methods: Cubic splines were used to smooth data and obtain continuous hazard rate functions. Afterwards, we fitted a Poisson model to derive hazard ratios; the model included time as a covariate. The hazard ratios were then applied to US survival functions detailed by age and stage to obtain Catalan estimations. Results: We first estimated the hazard ratios for Catalonia versus the USA before and after the introduction of screening. The hazard ratios were then multiplied by the age- and stage-specific breast cancer hazard rates from the USA to obtain the Catalan hazard rates. We also compared breast cancer survival in Catalonia and the USA in two time periods, before cancer control interventions (USA 1975–79, Catalonia 1980–89) and after (USA and Catalonia 1990–2001). Survival in Catalonia in the 1980–89 period was worse than in the USA during 1975–79, but the differences disappeared in 1990–2001. Conclusion: Our results suggest that access to better treatments and quality of care contributed to large improvements in survival in Catalonia. In addition, we obtained detailed breast cancer survival functions that will be used for modeling the effect of screening and adjuvant treatments in Catalonia.
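A minimal sketch of the three steps described above (spline smoothing of the raw hazard, Poisson regression with a region indicator, and scaling of the US hazard), using invented life-table counts rather than the actual US or Catalan data:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline
import statsmodels.api as sm

# Invented life-table inputs: deaths and person-years per year since diagnosis
t = np.arange(1, 11, dtype=float)
deaths_cat = np.array([30, 26, 21, 17, 14, 12, 10, 9, 8, 7], dtype=float)
pyears_cat = np.array([1000, 940, 880, 830, 790, 760, 730, 705, 685, 670], dtype=float)
deaths_us = np.array([24, 20, 16, 13, 11, 9, 8, 7, 6, 6], dtype=float)
pyears_us = np.array([1000, 950, 905, 865, 830, 800, 775, 755, 740, 725], dtype=float)

# Step 1: smooth the raw hazard with a cubic spline to get a continuous hazard curve
hazard_cat = UnivariateSpline(t, deaths_cat / pyears_cat, k=3, s=1e-4)(t)

# Step 2: Poisson regression with a log person-years offset, time as a covariate
# and a region indicator; exp(coef) of the indicator is the hazard ratio
y = np.concatenate([deaths_cat, deaths_us])
X = sm.add_constant(np.column_stack([
    np.concatenate([t, t]),                              # time since diagnosis
    np.concatenate([np.ones_like(t), np.zeros_like(t)]), # 1 = Catalonia, 0 = USA
]))
offset = np.log(np.concatenate([pyears_cat, pyears_us]))
fit = sm.GLM(y, X, family=sm.families.Poisson(), offset=offset).fit()
hr = np.exp(fit.params[2])                               # Catalonia-vs-USA hazard ratio

# Step 3: apply the ratio to the (age- and stage-specific) US hazard and
# convert the cumulative hazard into a survival function
hazard_cat_est = hr * (deaths_us / pyears_us)
survival_cat = np.exp(-np.cumsum(hazard_cat_est))

print("smoothed Catalan hazard:", np.round(hazard_cat, 4))
print("hazard ratio Catalonia vs USA:", round(hr, 2))
print("estimated Catalan survival:", np.round(survival_cat, 3))
```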

Relevance: 100.00%

Abstract:

We present a participant study that compares biological data exploration tasks using volume renderings of laser confocal microscopy data across three environments that vary in level of immersion: a desktop, a fishtank, and a cave system. For the tasks, data, and visualization approach used in our study, we found that subjects qualitatively preferred and quantitatively performed better in the cave compared with the fishtank and desktop. Subjects performed real-world biological data analysis tasks that emphasized understanding spatial relationships, including characterizing the general features in a volume, identifying colocated features, and reporting geometric relationships such as whether clusters of cells were coplanar. After analyzing data in each environment, subjects were asked to choose which environment they wanted to analyze additional data sets in; subjects uniformly selected the cave environment.

Relevance: 100.00%

Abstract:

In general, laboratory activities are costly in terms of time, space, and money. As such, the ability to provide realistically simulated laboratory data that enables students to practice data analysis techniques as a complementary activity would be expected to reduce these costs while opening up very interesting possibilities. In the present work, a novel methodology is presented for the design of analytical chemistry instrumental analysis exercises that can be automatically personalized for each student and evaluated immediately. The proposed system provides each student with a different set of experimental data, generated randomly while satisfying a set of constraints, rather than using data obtained from actual laboratory work. This allows the instructor to provide students with a set of practical problems to complement their regular laboratory work, along with the corresponding feedback provided by the system's automatic evaluation process. To this end, the Goodle Grading Management System (GMS), an innovative web-based educational tool for automating the collection and assessment of practical exercises for engineering and scientific courses, was developed. The proposed methodology takes full advantage of the Goodle GMS fusion code architecture. The design of a particular exercise is provided ad hoc by the instructor and requires basic Matlab knowledge. The system has been employed with satisfactory results in several university courses. To demonstrate the automatic evaluation process, three exercises are presented in detail. The first exercise involves a linear regression analysis of data and the calculation of the quality parameters of an instrumental analysis method. The second and third exercises address two different comparison tests: a comparison of means and a paired t-test.
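The Goodle GMS exercises themselves are written against a Matlab-based fusion code architecture; as a language-neutral sketch, generating a personalized calibration data set under constraints and auto-grading a student's regression answer could look like this (all names, ranges, and tolerances below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(42)  # in practice, seed per student for personalization

def make_calibration_data(n=6, slope_range=(0.8, 1.2), noise_sd=0.02):
    # Random linear calibration data satisfying simple constraints
    # (positive slope, near-zero intercept), mimicking an instrumental response.
    slope = rng.uniform(*slope_range)
    intercept = rng.uniform(-0.01, 0.01)
    conc = np.linspace(0.0, 10.0, n)                  # standard concentrations
    signal = intercept + slope * conc + rng.normal(0.0, noise_sd, n)
    return conc, signal

def grade(student_slope, student_intercept, conc, signal, tol=0.05):
    # Automatic evaluation: compare the student's answer with the
    # least-squares reference fit computed on the same personalized data.
    ref_slope, ref_intercept = np.polyfit(conc, signal, 1)
    ok = (abs(student_slope - ref_slope) <= tol * abs(ref_slope)
          and abs(student_intercept - ref_intercept) <= tol)
    return ok, ref_slope, ref_intercept

conc, signal = make_calibration_data()
print(grade(1.0, 0.0, conc, signal))   # passes only if close to the reference fit
```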

Relevance: 100.00%

Abstract:

Broadcasting systems are networks where the transmission is received by several terminals. Broadcast receivers are generally passive devices in the network, meaning that they do not interact with the transmitter. Providing a certain Quality of Service (QoS) for receivers in a heterogeneous reception environment with no feedback is not an easy task. Forward error control coding can be used for protection against transmission errors to enhance the QoS of broadcast services. For good performance in terrestrial wireless networks, diversity should be utilized; it is obtained by applying interleaving together with forward error correction codes. In this dissertation, the design and analysis of forward error control and control signaling for providing QoS in wireless broadcasting systems are studied. Control signaling is used in broadcasting networks to give the receiver the information necessary to connect to the network itself and to receive the services being transmitted. Control signaling is usually transmitted through a dedicated path in the system, so the relationship between the signaling and service data paths should be considered early in the design phase. Modeling and simulations are used in the case studies of this dissertation to study this relationship. The dissertation begins with a survey of the broadcasting environment and mechanisms for providing QoS therein. Case studies then present the analysis and design of such mechanisms in real systems. As the first case study, the mechanisms for providing QoS at the DVB-H link layer are analyzed, considering the signaling and service data paths and their relationship; in particular, the performance of different service data decoding mechanisms and optimal signaling transmission parameter selection are presented. The second case study investigates the design of the signaling and service data paths for the more modern DVB-T2 physical layer. Furthermore, by comparing the performance of the signaling and service data paths through simulations, configuration guidelines for DVB-T2 physical layer signaling are given; these guidelines can prove useful when configuring DVB-T2 transmission networks. Finally, recommendations for the design of the data and signaling paths are given based on the findings of the case studies. The requirements for the signaling design should be derived from the requirements for the main services, and should generally be more demanding, as signaling is the enabler for service reception.
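As a small illustration of the diversity mechanism mentioned above: a block interleaver spreads a burst of channel errors across many codewords so that the forward error correction code sees only scattered errors (a generic sketch, not the actual DVB-H or DVB-T2 interleaver):

```python
import numpy as np

def block_interleave(symbols, rows, cols):
    # Write row-wise, read column-wise: consecutive channel symbols end up
    # far apart, so an error burst is spread across many codewords.
    assert len(symbols) == rows * cols
    return np.asarray(symbols).reshape(rows, cols).T.flatten()

def block_deinterleave(symbols, rows, cols):
    # Inverse operation at the receiver.
    return np.asarray(symbols).reshape(cols, rows).T.flatten()

data = np.arange(12)                      # e.g. 3 codewords of 4 symbols each
tx = block_interleave(data, 3, 4)
rx = block_deinterleave(tx, 3, 4)
assert np.array_equal(rx, data)
print(tx)  # a burst in tx hits isolated symbols of different codewords
```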

Relevance: 100.00%

Abstract:

This report presents the key results of the Kainuu mire survey project (Kainuun suoselvitys) and the conclusions drawn from them. The purpose of the project, carried out in 2010–2012, was to improve the knowledge base on the mires of Kainuu, thereby creating the preconditions for reconciling different forms of mire use and at the same time highlighting the opportunities associated with the mires of Kainuu. The project took a broad view of mire use, covering not only various forms of economic exploitation but also the importance of mires for human recreation and well-being, for biodiversity, and for many regulating functions. The background to the project is the national mire strategy drawn up at the same time. According to the strategy, activities that alter the natural state of mires, i.e. agriculture, forestry and peat production, should in future be directed to mires that have already been drained or otherwise significantly altered. This leaves more room for forms of mire use and ecosystem services that benefit from mires remaining in their natural state. Kainuu is the third most mire-rich region in Finland. Mires are a central part of its nature and landscape and also a versatile natural resource. The majority of the mire area in Kainuu has been drained. New information was produced in the project using GIS methods and targeted field surveys. The topics of the report include the use of mires in Kainuu, the remaining undrained mires, the hydrological integrity of protected mires, drained mires being withdrawn from forestry use, the importance of mires for recreation and tourism, and the assessment of 143 field-surveyed mires with respect to peat extraction and biodiversity. The report argues that the national mire strategy's goal of sustainable and responsible use and protection of mires can be achieved through cooperation between the different actors and through land use guidance. This requires commitment to the strategy's policies as well as an improved knowledge base on mires and on the impacts of different activities. The Kainuu mire survey project has contributed to producing this information and has also developed methods for acquiring it.

Relevance: 100.00%

Abstract:

Rapid changes in biodiversity are occurring globally as a consequence of anthropogenic disturbance. This has raised concerns, since biodiversity is known to contribute significantly to ecosystem functions and services. Marine benthic communities participate in numerous functions provided by soft-sedimentary ecosystems. Eutrophication-induced oxygen deficiency is a growing threat to infaunal communities, both in open sea areas and in coastal zones. There is thus a need to understand how such disturbance affects benthic communities, and what is lost in terms of ecosystem functioning if benthic communities are harmed. In this thesis, the status of benthic biodiversity was assessed for the open Baltic Sea, a system severely affected by broad-scale hypoxia. Long-term monitoring data made it possible to establish quantitative biodiversity baselines against which change could be compared. The findings show that benthic biodiversity is currently severely impaired in large areas of the open Baltic Sea, from the Bornholm Basin to the Gulf of Finland. The observed reduction in biodiversity indicates that benthic communities are structurally and functionally impoverished in several of the sub-basins due to hypoxic stress. A more detailed examination of disturbance impacts (through field studies and experiments) on benthic communities in coastal areas showed that changes in benthic community structure and function took place well before species were lost from the system. The degradation of benthic community structure and function was directed by the type of disturbance and its specific temporal and spatial characteristics. The observed shifts in benthic trait composition were primarily the result of reductions in species' abundances, or of changes in demographic characteristics such as the loss of large, adult bivalves. The reduction in community functions was expressed as declines in benthic bioturbation potential and in secondary biomass production. The benthic communities and their degradation accounted for a substantial proportion of the changes observed in ecosystem multifunctionality. Individual ecosystem functions (i.e. measures of sediment ecosystem metabolism, elemental cycling, biomass production, organic matter transformation and physical structuring) were observed to differ in their response to increasing hypoxic disturbance. Interestingly, the results suggest that impairment of ecosystem functioning can be detected at an earlier stage if multiple functions are considered. Importantly, the findings indicate that even small-scale hypoxic disturbance can reduce the buffering capacity of the sedimentary ecosystem and increase the susceptibility of the system to further stress. Although the results of the individual papers are context-dependent, their combined outcome implies that healthy benthic communities are important for sustaining overall ecosystem functioning as well as ecosystem resilience in the Baltic Sea.

Relevance: 100.00%

Abstract:

The importance of efficient supply chain management has increased due to globalization and the blurring of organizational boundaries. Various supply chain management technologies have been identified as drivers of organizational profitability and financial performance. Organizations have historically concentrated heavily on the flow of goods and services, while less attention has been dedicated to the flow of money. As supply chains become more transparent and automated, new opportunities for financial supply chain management have emerged through information technology solutions and comprehensive financial supply chain management strategies. This research concentrates on the end part of the purchasing process: the handling of invoices. Efficient invoice processing can have an impact on an organization's working capital management and thus give companies better readiness to face the challenges of cash management. Leveraging a process mining solution, the aim of this research was to examine the automated invoice handling processes of four different organizations. The invoice data was collected from each organization's invoice processing system; the sample included all the invoices the organizations had processed during the year 2012. The main objective was to find out whether e-invoices are faster to process in an automated invoice processing solution than scanned invoices (after entry into the invoice processing solution). Other objectives included examining the longest lead times between process steps, the impact of manual process steps on cycle time, and the processing of invoices from maverick purchases. Based on the results of the research and previous literature on the subject, suggestions for improving the process were proposed. The results indicate that scanned invoices were processed faster than e-invoices, mostly due to the more complex processing of e-invoices; it should be noted, however, that the manual tasks related to turning a paper invoice into electronic format through scanning are ignored in this research. The transitions with the longest lead times in the invoice handling process included both pre-automated steps and manual steps performed by humans. When the most common manual steps were examined in more detail, it was clear that these steps had a prolonging impact on the process. Regarding invoices from maverick purchases, the evidence shows that these invoices were slower to process than invoices from purchases conducted through e-procurement systems and from preferred suppliers. Suggestions for improving the process included increasing invoice matching, reducing manual steps, and leveraging different value-added services such as an invoice validation service, mobile solutions, and supply chain financing services. For companies that have already reaped all the process efficiencies, the next step is to engage in collaborative financial supply chain management strategies that can benefit the whole supply chain.
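As a rough sketch of the kind of lead-time analysis described above, with a made-up event log in place of the case organizations' 2012 invoice data:

```python
import pandas as pd

# Hypothetical event log: one row per process step per invoice
log = pd.DataFrame({
    "invoice_id": [1, 1, 1, 2, 2, 2],
    "step": ["received", "approved", "paid"] * 2,
    "timestamp": pd.to_datetime([
        "2012-03-01 09:00", "2012-03-03 14:00", "2012-03-10 10:00",
        "2012-04-02 08:30", "2012-04-02 16:45", "2012-04-05 11:00",
    ]),
}).sort_values(["invoice_id", "timestamp"])

# Lead time between consecutive steps within each invoice
log["lead_time"] = log.groupby("invoice_id")["timestamp"].diff()
# Transition label such as "received -> approved"
log["transition"] = log.groupby("invoice_id")["step"].shift() + " -> " + log["step"]

# Mean lead time per transition exposes the slowest hand-offs in the process
print(log.dropna().groupby("transition")["lead_time"].mean())

# Total cycle time per invoice, first event to last
print(log.groupby("invoice_id")["timestamp"].agg(lambda s: s.max() - s.min()))
```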

Relevance: 100.00%

Abstract:

The significance of services as business and human activities has increased dramatically throughout the world in the last three decades. Becoming an ever more competitive and efficient service provider while still being able to provide unique value opportunities for customers requires new knowledge and ideas. Part of this knowledge is created and utilized in daily activities in every service organization, but not all of it, and therefore an emerging phenomenon in the service context is information awareness. Terms like big data and the Internet of things are not only modern buzzwords; they also describe urgent requirements for a new type of competences and solutions. When the amount of information increases and the systems processing information become more efficient and intelligent, it is the human understanding and objectives that may get separated from the automated processes and technological innovations. This is an important challenge and the core driver for this dissertation: what kind of information is created, possessed and utilized in the service context, and even more importantly, what information exists but is not acknowledged or used? In this dissertation the focus is on the relationship between service design and service operations. Reframing this relationship refers to viewing the service system from the architectural perspective. The selected perspective allows analysing the relationship between design activities and operational activities as an information system, while maintaining a tight connection to existing service research contributions and approaches. This innovative approach is supported by a research methodology that relies on design science theory. The methodological process supports the construction of a new design artifact based on existing theoretical knowledge, the creation of new innovations, and the testing of the design artifact's components in real service contexts. The relationship between design and operations is analysed in health care and social care service systems. Existing contributions in service research tend to abstract services and service systems as value creation, working or interactive systems; this dissertation adds an important information processing system perspective to the research. The main contribution focuses on the following argument: only part of the service information system is automated and computerized, whereas a significant part of information processing is embedded in human activities, communication and ad hoc reactions. The results indicate that the relationship between service design and service operations is more complex and dynamic than existing scientific and managerial models tend to assume. Both activities create, utilize, mix and share information, making service information management a necessary but relatively unknown managerial task. On the architectural level, service system specific elements seem to disappear, but access to more general information elements and processes can be found. While this dissertation focuses on conceptual-level design artifact construction, the results also provide very practical implications for service providers. Personal, visual and hidden activities of service, and more importantly all changes that take place in any service system, also have an information dimension. Making this information dimension visible and prioritizing the processed information based on service dimensions is likely to provide new opportunities to increase activities and provide a new type of service potential for customers.

Relevance: 100.00%

Abstract:

This monograph dissertation looks into the field of ICT-mediated health and well-being services. Through six chapters that extend the work done in the reviewed and published articles, the dissertation focuses on new and emerging technologies and on the impact of their use on the beneficiary: the individual who eventually derives advantage from the services. As the field is currently going through major changes, particularly in the OECD countries, the focus is on short-term developments, and the analysis of long-term developments is cursory by nature. The dissertation includes theoretical and empirical elements. Most of the empirical elements are linked to product development and conceptualization performed in the national MyWellbeing project that ended in 2010. In the project, the emphasis was on the conceptualization of a personal aid that the beneficiary could use for managing information and services in the field of health and well-being services. This work continued the theme of developing individual-centric solutions for the field, work that started in the InnoElli Senior program in 2006. This thesis is foremost a conceptual elaboration based on a literature review, illustrated by empirical work performed in different projects. As a theoretical contribution, this dissertation elaborates the role of a mediator, i.e. an intermediary, and uses it as an overarching theme. The role acts as a ‘lens’ through which a number of technology-related phenomena are examined, pinned down and addressed to a degree. This includes the introduction of solutions ranging from anthropomorphic artefacts to decision support systems that may change the way individuals experience clinical encounters in the near future. Due to the complex and multiform nature of the field, it is impractical and effectively impossible to cover all aspects related to mediation in a single work. Issues such as legislation, financing and privacy are all of equal importance, but their consideration is beyond the scope of this dissertation and their investigation is left to other work. It follows that the investigation of the role is not intended to be inclusive. The role of the mediator is also used to highlight some of the ethical issues related to personal health information management and to mediating health and well-being related issues on behalf of another individual, such as an elderly relative or a fellow member of a small unit in the armed forces. The dissertation concludes with a summary of the use and functions of the mediator, describing some potential avenues for implementing such support mechanisms in the changing field of ICT-mediated health and well-being services. The conclusions also describe some of the limitations of this dissertation, including remarks on methodology and content.

Relevance: 100.00%

Abstract:

Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014