31 results for Keys to Database Searching
Abstract:
The main objective of this thesis was to analyze the usability of the registers and indexes of electronic marketplaces. The work focuses on UDDI-based electronic marketplaces, which are standardized by the W3C. UDDI registries are usable in intranets, extranets and on the Internet. Using UDDI registries, Web services can be searched in many ways, including alphabetical and domain-specific searches. Both humans and machines can use the features of UDDI registries. The thesis deals with the design principles, architectures and specifications of UDDI registries. In addition, the thesis includes the design and specifications of an electronic marketplace developed to support electronic logistics services.
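As a rough illustration of the kind of lookup such a registry supports, the sketch below posts a UDDI find_business inquiry over SOAP. The endpoint URL is a placeholder and the namespace/version details are assumptions rather than anything taken from the thesis.

```python
# Minimal sketch of a UDDI v2 find_business inquiry. The endpoint URL is
# hypothetical and the namespace/version details are assumptions.
import requests

INQUIRY_URL = "http://uddi.example.com/inquiry"  # hypothetical registry endpoint

def find_businesses(name_pattern: str) -> str:
    """Search a UDDI registry for businesses whose name matches a pattern."""
    envelope = f"""<?xml version="1.0" encoding="UTF-8"?>
<Envelope xmlns="http://schemas.xmlsoap.org/soap/envelope/">
  <Body>
    <find_business generic="2.0" xmlns="urn:uddi-org:api_v2">
      <name>{name_pattern}</name>
    </find_business>
  </Body>
</Envelope>"""
    response = requests.post(
        INQUIRY_URL,
        data=envelope.encode("utf-8"),
        headers={"Content-Type": "text/xml; charset=utf-8", "SOAPAction": ""},
        timeout=10,
    )
    response.raise_for_status()
    return response.text  # businessList XML, to be parsed by the caller

if __name__ == "__main__":
    # "%" acts as a wildcard in UDDI name matching, so this would list
    # businesses whose names start with "Logistics".
    print(find_businesses("Logistics%"))
```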
Abstract:
The Nokia Push To Talk system offers a new communication method alongside the ordinary phone call. One of the most important features of the new system is the speed of call setup. In addition, the system must comply with the general principles of telecommunication systems and be as stable and scalable as possible, so that it is as fault-tolerant and extensible as possible. The main objective of this thesis is to present the design and testing of C++ database libraries. First, the problem space of database systems is examined, starting from the selection of a database system and paying particular attention to speed criteria. Then, two technical implementations of two C++ database libraries are presented and some alternative implementation approaches are discussed.
Abstract:
Database marketing can be merely a tool for carrying out marketing activities, but it can also be seen as an essential part of customer relationship management. From the customer relationship management perspective, database marketing aims at customer satisfaction and loyalty as well as at the productivity and profitability of the customer relationship, which can be achieved through effective information management. This enables tailored marketing activities and improves targeting and segmentation, also on the basis of customer profitability. A normative case study, conducted in the Netherlands in a European IT value-added reseller channel, shows that database marketing, especially from the customer relationship management perspective, would be a suitable means of increasing customer satisfaction and profitability. It would also make internal information flows and marketing activities more effective, for example marketing communication, campaign management and sales processes in business-to-business trade.
Abstract:
The objective of this master's thesis was to examine the effect of customer orientation on customer satisfaction and how customer satisfaction and customer retention contribute to firm profitability. Besides customer orientation, other antecedents of customer satisfaction, i.e. service quality, flexibility, trust and commitment, were investigated as control variables. The literature review revealed several research gaps concerning the key concepts, and these research calls were also answered. The empirical study focused on one case company, a telecommunications expert. The data for the empirical part were collected with a web-based questionnaire from the case company's business customers in January-February 2008. The sample (N=95) produced 59 answers, giving a survey response rate of 62.1%. The data were analyzed using the statistical analysis program SPSS. As a conclusion, the results indicate that customer orientation does not affect customer satisfaction directly, but rather through service quality, flexibility and trust. Moreover, customer satisfaction has a positive impact on commitment and on intentions to stay as a customer in the future, but not on profitability. In the present study, only past purchase behavior, measured with a customer database measure, is positively related to firm profitability.
Abstract:
This Master's thesis examines transport companies in Southern and Eastern Finland from the perspectives of profitability, solvency and liquidity, by transport type. The aim of the thesis is to shed light on the competitive environment of the provinces of Southern and Eastern Finland on the basis of numerical key figures and financial statement items. The main objective is to show whether there are profitability differences between transport types. The thesis consists of a theoretical part and an empirical part, whose data consist of the financial statements and key figures of transport companies in Southern and Eastern Finland. The empirical data were collected through a search of the Amadeus database and a postal survey sent to the companies returned by the search. The data were analyzed with quantitative methods using SPSS. The results show that the transport type is related to profitability as measured by ROCE. The components of profitability, such as revenues, costs and capital employed, were analyzed separately, but no explanation for the profitability differences was found.
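For reference, the profitability measure used here relates operating profit to the capital tied up in the business. A minimal sketch of the ROCE calculation with invented figures (not taken from the thesis data):

```python
# Minimal sketch of the ROCE measure; figures are illustrative, not from the
# actual financial statement data analyzed in the thesis.
def roce(operating_profit: float, total_assets: float,
         current_liabilities: float) -> float:
    """Return on capital employed = operating profit / capital employed,
    where capital employed = total assets - current liabilities."""
    capital_employed = total_assets - current_liabilities
    return operating_profit / capital_employed

if __name__ == "__main__":
    # Example: EUR 120k operating profit on EUR 1.5M total assets with
    # EUR 500k current liabilities gives ROCE = 12 %.
    print(f"ROCE: {roce(120_000, 1_500_000, 500_000):.1%}")
```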
Abstract:
Current-day web search engines (e.g., Google) do not crawl and index a significant portion of the Web and, hence, web users relying on search engines alone are unable to discover and access a large amount of information in the non-indexable part of the Web. Specifically, dynamic pages generated from parameters provided by a user via web search forms (or search interfaces) are not indexed by search engines and cannot be found in searchers' results. Such search interfaces provide web users with online access to myriads of databases on the Web. In order to obtain information from a web database of interest, a user issues a query by specifying query terms in a search form and receives the query results, a set of dynamic pages that embed the required information from the database. At the same time, issuing a query via an arbitrary search interface is an extremely complex task for any kind of automatic agent, including web crawlers, which, at least up to the present day, do not even attempt to pass through web forms on a large scale. In this thesis, our primary object of study is the huge portion of the Web (hereafter referred to as the deep Web) hidden behind web search interfaces. We concentrate on three classes of problems around the deep Web: characterization of the deep Web, finding and classifying deep web resources, and querying web databases.
Characterizing the deep Web: Though the term deep Web was coined in 2000, which is a long time ago for any web-related concept or technology, we still do not know many important characteristics of the deep Web. Another matter of concern is that the surveys of the deep Web existing so far are predominantly based on the study of deep web sites in English. One can then expect that findings from these surveys may be biased, especially owing to a steady increase in non-English web content. In this way, surveying national segments of the deep Web is of interest not only to national communities but to the whole web community as well. In this thesis, we propose two new methods for estimating the main parameters of the deep Web. We use the suggested methods to estimate the scale of one specific national segment of the Web and report our findings. We also build and make publicly available a dataset describing more than 200 web databases from the national segment of the Web.
Finding deep web resources: The deep Web has been growing at a very fast pace. It has been estimated that there are hundreds of thousands of deep web sites. Due to the huge volume of information in the deep Web, there has been significant interest in approaches that allow users and computer applications to leverage this information. Most approaches assume that search interfaces to the web databases of interest have already been discovered and are known to query systems. However, such assumptions do not hold true, mostly because of the large scale of the deep Web: for any given domain of interest there are too many web databases with relevant content. Thus, the ability to locate search interfaces to web databases becomes a key requirement for any application accessing the deep Web. In this thesis, we describe the architecture of the I-Crawler, a system for finding and classifying search interfaces. Specifically, the I-Crawler is intentionally designed to be used in deep Web characterization studies and for constructing directories of deep web resources. Unlike almost all other approaches to the deep Web so far, the I-Crawler is able to recognize and analyze JavaScript-rich and non-HTML searchable forms.
Querying web databases: Retrieving information by filling out web search forms is a typical task for a web user, all the more so as the interfaces of conventional search engines are also web forms. At present, a user needs to manually provide input values to search interfaces and then extract the required data from the result pages. Manually filling out forms is cumbersome and not feasible for complex queries, yet such queries are essential for many web searches, especially in the area of e-commerce. In this way, automating the querying and retrieval of data behind search interfaces is desirable and essential for tasks such as building domain-independent deep web crawlers and automated web agents, searching for domain-specific information (vertical search engines), and extracting and integrating information from various deep web resources. We present a data model for representing search interfaces and discuss techniques for extracting field labels, client-side scripts and structured data from HTML pages. We also describe a representation of result pages and discuss how to extract and store the results of form queries. Besides, we present a user-friendly and expressive form query language that allows one to retrieve information behind search interfaces and extract useful data from the result pages based on specified conditions. We implement a prototype system for querying web databases and describe its architecture and component design.
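To make the querying step concrete, the sketch below fills out a hypothetical web search form programmatically and pulls table cells from the result page. The URL, form field names and result-page structure are assumptions for illustration only; they do not describe the I-Crawler or the prototype system of the thesis.

```python
# Minimal sketch of querying a web database hidden behind a search form.
# The URL, field names, and result-page markup are assumptions.
import requests
from html.parser import HTMLParser

class RowExtractor(HTMLParser):
    """Collect the text content of table cells from a result page."""
    def __init__(self):
        super().__init__()
        self.in_cell = False
        self.cells = []

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self.in_cell = True

    def handle_endtag(self, tag):
        if tag == "td":
            self.in_cell = False

    def handle_data(self, data):
        if self.in_cell and data.strip():
            self.cells.append(data.strip())

def query_web_database(term: str):
    # Submit the search form the way a browser would (GET with query parameters).
    page = requests.get(
        "http://books.example.com/search",   # hypothetical search interface
        params={"title": term, "max_results": 20},
        timeout=10,
    )
    page.raise_for_status()
    extractor = RowExtractor()
    extractor.feed(page.text)
    return extractor.cells

if __name__ == "__main__":
    print(query_web_database("deep web"))
```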
Abstract:
In modern organizations there is an increasing number of IT devices such as computers, mobile phones and printers. These devices can be located and maintained using specialized IT management applications. The costs related to a single device accumulate from various sources and are normally categorized as direct costs, such as hardware costs, and indirect costs, such as labor costs. These costs can be stored in a configuration management database and presented to users with web-based development tools such as ASP.NET. The overall costs of IT devices over their lifecycle can be ten times higher than the actual purchase price of the product, and the ability to identify and reduce these costs can save organizations a noticeable amount of money. This Master's Thesis introduces the research field of IT management and defines a custom framework model, based on Information Technology Infrastructure Library (ITIL) best practices, which is designed to be implemented as part of an existing IT management application for defining and presenting IT costs.
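As a toy illustration of how lifecycle costs outgrow the purchase price, the sketch below sums direct and indirect cost records for a single device. The record structure and figures are hypothetical, not the ITIL-based framework or the cost data of the thesis.

```python
# Toy illustration of lifecycle cost accumulation for a single IT device.
# The record structure and figures are hypothetical.
from dataclasses import dataclass

@dataclass
class CostItem:
    description: str
    amount_eur: float
    category: str  # "direct" (e.g. hardware) or "indirect" (e.g. labor)

def total_cost_of_ownership(items):
    direct = sum(i.amount_eur for i in items if i.category == "direct")
    indirect = sum(i.amount_eur for i in items if i.category == "indirect")
    return direct, indirect, direct + indirect

if __name__ == "__main__":
    laptop_costs = [
        CostItem("purchase price", 900.0, "direct"),
        CostItem("software licenses", 300.0, "direct"),
        CostItem("help desk and maintenance labor", 2400.0, "indirect"),
        CostItem("downtime", 600.0, "indirect"),
    ]
    direct, indirect, total = total_cost_of_ownership(laptop_costs)
    print(f"direct {direct:.0f} EUR, indirect {indirect:.0f} EUR, total {total:.0f} EUR")
```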
Abstract:
The purpose of the thesis is to study innovativeness in the context of the construction industry, especially the front-end of the innovation process. The construction industry is often considered an old-fashioned manufacturing industry, and innovations and innovativeness are rarely linked to it. The construction industry, like other industries in Finland, is facing challenges such as productivity, climate change and internationalization. The importance of innovations is greater than ever in continuously changing markets, for standing out from competitors or increasing competitiveness. Traditional production methods, tight building regulations, unique buildings, one-of-a-kind project organizations and the emphasis on the cheapest price in building contracts are particular challenges in the construction industry. The research questions of the thesis were:
- What kind of factors shift the existing company culture towards innovativeness?
- What are the phases of the front-end of the innovation process?
- What kind of tools and methods enable managing the front-end of the innovation process?
The theoretical part of the thesis is based on a literature review. The research methodology of the empirical part was action research with a qualitative approach. Empirical data were collected through theme interviews in three companies. The results were practical methods and experiences from the innovation activities of the companies. The results of the thesis can be summarized as follows: enhancing innovation activities requires the support and commitment of top management, an innovative culture and an innovation strategy. Innovativeness can be promoted by systematic methods, for example collecting ideas from employees. Controlling and managing the front-end phase is essential to success. Although managing the front-end is the most challenging part of the innovation process, its development and management save companies money and resources and prevent useless investments. Further studies are needed to find additional functional tools and methods for managing innovations and implementing them in company culture.
Abstract:
The objective of this study is to find out how email marketing is conducted towards existing customers in Company X. The first chapter of the study focuses on the theoretical literature on direct marketing, especially on solicited and unsolicited email marketing, and on relationship marketing. The following areas of relationship marketing are described: database marketing, customer retention, trust and commitment, loyalty, engagement and satisfaction, together with the possibilities of using email marketing within each of them. The empirical second part of the study revealed that email marketing tactics for relationship marketing are little used in Company X and that there is potential for significant improvements in relationship marketing, especially with marketing automation tools.
Abstract:
Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
The pressure to develop cost-effective emission reduction strategies in the Baltic Sea has grown. The forthcoming stringent regulations of the International Maritime Organization for reducing the harmful emissions of shipping in the Baltic Sea are causing increasing expenses for operators. A market-based approach to pricing economic incentives could be seen as a new way to achieve additional reduction of nitrogen oxide (NOx) emissions. The aim of this study is to understand the phenomenon of environmentally differentiated port fees and its effects on shipping companies' emission reduction investments. The goal is to examine empirically the real-life effects of a possible environmentally differentiated port fee system and the effect of such fees on NOx reduction investments in the Baltic Sea. The research approach of this study is nomothetic. The research questions are answered by analyzing a broad database of the Baltic Sea fleet. The theoretical framework is also confirmed and plays an important role in analyzing the research problem. The existing investment costs of NOx emission reduction technology for ship owners are estimated and compared to the investment costs with the granted discounts added to the cash flows. The statistical analysis in this study is descriptive. The major statistical examination of this study is the calculation of the net present values of the investments under different port fee scenarios. This is done to investigate whether the NOx technology investments could be economically reasonable. Based on the calculations, it is clear that the effect of environmentally differentiated port fees is not adequate to compensate for the total investment costs of NOx reduction. If the investment decision is made only on profitability grounds, sources will prefer emission abatement only as long as the income from the given subsidy exceeds their abatement costs. Despite these results, evidence was found that shipping companies are nevertheless willing to invest in voluntary emission abatement technology. In that case, the investment decision could be made with criteria such as a sustainability strategy or brand image. A combined fairway and port fee system, or governmental regulations and recommendations, could also function as additional incentives to compensate for the investment costs. The results also imply that NPV is not necessarily the best method for evaluating environmental investments; if the calculations were done with more environmentally oriented methods, the results would probably be different.
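A minimal sketch of the net present value calculation at the core of the statistical examination, with annual port fee rebates as the only cash inflow; the investment cost, discount rate and rebate figures are invented for illustration, not the study's data.

```python
# Minimal NPV sketch for a NOx abatement investment where yearly port fee
# rebates are the only cash inflow. All figures are invented.
def npv(rate: float, initial_cost: float, annual_inflows) -> float:
    """Net present value: -initial cost + discounted sum of yearly inflows."""
    return -initial_cost + sum(
        cash / (1 + rate) ** year
        for year, cash in enumerate(annual_inflows, start=1)
    )

if __name__ == "__main__":
    investment = 1_500_000          # e.g. SCR retrofit cost in EUR (assumed)
    fee_rebates = [120_000] * 10    # assumed yearly environmental port fee discount
    result = npv(0.08, investment, fee_rebates)
    print(f"NPV: {result:,.0f} EUR")  # negative => rebates alone do not pay back
```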
Abstract:
Large-headed total hip arthroplasty (THA) and hip resurfacing arthroplasty (HRA) with metal-on-metal (MoM) bearings became popular during the last decade. Recently, it has become evident that large-head MoM hip implants are associated with increased revision rates despite their theoretical advantages. The purpose of this study was to evaluate the early results of primary MoM hip replacements and of acetabular revisions. I retrospectively analyzed the results of four MoM implant designs and the survival rate of acetabular revisions with impaction bone grafting, as documented in the Turku University Hospital database. Further, I evaluated the correlation between femoral head size and dislocation rate, and used the Finnish Arthroplasty Register data to compare the survival of three large-head MoM THAs to analogous HRAs. The early results for the Magnum M2A–ReCap THA were good. A larger head size decreased the risk of dislocation. Articular surface replacement (ASR) THA yielded inferior results compared to the analogous HRA; for two other designs the results were similar. The R3–Synergy THA yielded inferior results compared to the reference implants. The survival of acetabular reconstructions with impaction bone grafting was inferior compared to previous reports. In conclusion, the early results of the Biomet ReCap–Magnum design were promising, and large head sizes decreased the dislocation rate. The survival of different MoM hip implant designs varied, and the survival of new designs and techniques may be inferior to that reported by the clinics where the implants were developed. An important caveat is that the early promising results of new devices may rapidly worsen. New implants need to be introduced to the market in a controlled fashion; here, arthroplasty registers are a valuable tool that needs to be used.
Abstract:
Data management consists of collecting, storing and processing data into a format that provides value-adding information for the decision-making process. The development of data management has enabled the design of increasingly effective database management systems to support business needs. Therefore, in addition to advanced systems designed for reporting purposes, operational systems also allow reporting and data analysis. The research method used in the theoretical part is qualitative research, and the research type in the empirical part is a case study. The objective of this thesis is to examine database management system requirements from the perspectives of reporting management and data management. In the theoretical part these requirements are identified and the appropriateness of the relational data model is evaluated. In addition, key performance indicators applied to the operational monitoring of production are studied. The study revealed that appropriate operational key performance indicators of production take into account time, quality, flexibility and cost aspects; manufacturing efficiency in particular has been highlighted. In this thesis, reporting management is defined as continuous monitoring of given performance measures. According to the literature review, a data management tool should cover performance, usability, reliability, scalability and data privacy aspects in order to fulfill the demands of reporting management. A framework is created for the system development phase based on these requirements and is used in the empirical part of the thesis, where such a system is designed and created for reporting management purposes for a company operating in the manufacturing industry. Relational data modeling and database architectures are utilized when the system is built on a relational database platform.
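A minimal sketch of the kind of relational model such a reporting tool might sit on, here using SQLite in memory; the table, column and measure names are assumptions, not the schema actually built in the thesis.

```python
# Minimal sketch of a relational table for operational production KPIs.
# Table and column names are assumptions, not the thesis schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE production_run (
        run_id      INTEGER PRIMARY KEY,
        line        TEXT NOT NULL,
        run_date    TEXT NOT NULL,
        good_units  INTEGER NOT NULL,
        scrap_units INTEGER NOT NULL,
        run_minutes INTEGER NOT NULL
    )
""")
conn.executemany(
    "INSERT INTO production_run VALUES (?, ?, ?, ?, ?, ?)",
    [(1, "L1", "2024-01-10", 950, 50, 480),
     (2, "L1", "2024-01-11", 990, 10, 470)],
)

# Simple time/quality KPIs per production line: throughput and scrap share.
for row in conn.execute("""
    SELECT line,
           SUM(good_units) * 1.0 / SUM(run_minutes)               AS units_per_min,
           SUM(scrap_units) * 1.0 / SUM(good_units + scrap_units) AS scrap_share
    FROM production_run
    GROUP BY line
"""):
    print(row)
```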
Abstract:
In the past decade customer loyalty programs have become very popular, and almost every retail chain seems to have one. Through loyalty programs companies are able to collect information about customer behavior and use this information in business and marketing management to guide decision making and resource allocation. The benefits for the loyalty program member are often monetary, which affects the profitability of the loyalty program. Not all loyalty program members are equally profitable: some purchase products at the recommended retail price and some buy only discounted products. If the company spends a similar amount of resources on all members, the customer margin is lower for customers who buy only discounted products. It is vital for a company to measure the profitability of its members in order to be able to calculate customer value. Several different customer value metrics can be used for this; during recent years customer lifetime value in particular has received a lot of attention and is seen as superior to other customer value metrics. In this master's thesis, customer lifetime value is implemented on the case company's customer loyalty program. The data were collected from the customer loyalty program's database and represent the year 2012 on the Finnish market. The data were not complete enough to take full advantage of customer lifetime value, and as a conclusion it can be stated that a new key performance indicator, customer margin, should be acquired in order to run the business of the customer loyalty program profitably. Through the customer margin the company would be able to compute the customer lifetime value on a regular basis, enabling efficient resource allocation in marketing.
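For reference, a common simplified form of the customer lifetime value calculation from a per-period customer margin and a retention rate; the figures, horizon and discount rate below are assumptions, not the loyalty program's 2012 data.

```python
# Simplified customer lifetime value sketch: discounted yearly customer margins
# weighted by the probability that the member is still retained.
# All figures are illustrative, not from the loyalty program's data.
def customer_lifetime_value(margin_per_year: float, retention_rate: float,
                            discount_rate: float, horizon_years: int) -> float:
    return sum(
        margin_per_year * retention_rate ** t / (1 + discount_rate) ** t
        for t in range(1, horizon_years + 1)
    )

if __name__ == "__main__":
    clv = customer_lifetime_value(
        margin_per_year=80.0,   # assumed yearly customer margin in EUR
        retention_rate=0.75,    # assumed share of members retained each year
        discount_rate=0.10,
        horizon_years=5,
    )
    print(f"CLV over 5 years: {clv:.2f} EUR")
```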
Abstract:
OBJECTIVES: The purpose of this thesis is to examine the liquidity levels of different industries between 2007 and 2013. It also reviews the literature on cash management and liquidity, various key figures describing liquidity, and factors that affect liquidity. In addition, it examines the information and communication sector in more detail. DATA: The data were collected from the Orbis database. The industry-specific averages were either calculated with the formulas presented in Chapter 2 or retrieved directly from the database. The scatter plots were made with Excel, and the correlation matrix and regression analyses with SAS EG. RESULTS: This study presents industry-specific averages for the liquidity ratio, solvency ratio and gearing, as well as for many other key figures that describe or affect liquidity. The study shows that, on average, liquidity and solvency have remained fairly stable, but industry-specific changes are strong. In the IC sector, liquidity is affected by the contribution margin, the number of employees, turnover, balance sheet total and payment period.
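As a pointer to the kind of key figures discussed, a small sketch of typical liquidity and leverage ratios computed from balance sheet items; the formulas follow common textbook definitions rather than reproducing the exact Chapter 2 formulas, and the figures are invented.

```python
# Common liquidity and leverage ratios from balance sheet items. These follow
# standard definitions, not necessarily the exact formulas of Chapter 2.
def current_ratio(current_assets: float, current_liabilities: float) -> float:
    return current_assets / current_liabilities

def quick_ratio(current_assets: float, inventories: float,
                current_liabilities: float) -> float:
    return (current_assets - inventories) / current_liabilities

def gearing(interest_bearing_debt: float, cash: float, equity: float) -> float:
    return (interest_bearing_debt - cash) / equity

if __name__ == "__main__":
    # Invented balance sheet figures in EUR.
    print(f"current ratio: {current_ratio(500_000, 250_000):.2f}")          # 2.00
    print(f"quick ratio:   {quick_ratio(500_000, 150_000, 250_000):.2f}")   # 1.40
    print(f"gearing:       {gearing(400_000, 100_000, 600_000):.2f}")       # 0.50
```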