903 results for Search and retrieval
Abstract:
The thesis examines the free zone concept as part of companies' international supply chains. The aim is to find means by which the attractiveness of a free zone can be increased from the companies' point of view, and to establish what kind of business companies can conduct in a free zone. The thesis looks for factors that contribute to the success of a free zone and that could be applicable to the border area between South-East Finland and Russia, taking the prevailing conditions and legislation into account as constraining factors. Success factors and business models are sought by briefly studying and analysing a few existing and functioning free zones. The harmonisation of EU customs law and the liberalisation of international trade are reducing the traditional significance of the free zone as a duty-free area. Instead, free zones increasingly operate as logistics centres in international trade and offer services with which companies can improve their logistical competitiveness. Networking, satellite solutions and co-operation are means by which the various logistics service providers in the South-East Finland region can improve their performance and flexibility in the international supply chain.
Abstract:
This master's thesis studies and presents, on the basis of the literature, how evolutionary algorithms are used to solve various search and optimisation problems in the area of software engineering. Evolutionary algorithms are methods that imitate the natural evolution process. An artificial evolution process evaluates the fitness of each individual, where the individuals are candidate solutions. The next population of candidate solutions is formed from the good properties of the current population by applying different mutation and crossover operations. Different kinds of evolutionary algorithm applications related to software engineering were sought in the literature, then classified and presented, together with the necessary basics of evolutionary algorithms. It was concluded that the majority of evolutionary algorithm applications related to software engineering concern software design or testing. For example, there were applications for classifying software production data, project scheduling, static task scheduling for parallel computing, allocating modules to subsystems, N-version programming, test data generation, and generating an integration test order. Many applications were experimental rather than ready for real production use. There were also some Computer Aided Software Engineering tools based on evolutionary algorithms.
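The evolutionary loop the abstract describes (fitness evaluation, selection of good individuals, crossover and mutation) can be sketched in a few lines. The bit-string genome, tournament selection and "OneMax" fitness below are illustrative choices, not taken from the thesis:

```python
import random

random.seed(0)  # reproducible toy run

def evolve(fitness, pop_size=30, genome_len=8, generations=50,
           mutation_rate=0.1):
    """Minimal genetic algorithm: bit-string genomes, tournament
    selection, one-point crossover and bit-flip mutation."""
    pop = [[random.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def select():
            # Tournament selection: keep the fitter of two random picks.
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        next_pop = []
        while len(next_pop) < pop_size:
            p1, p2 = select(), select()
            cut = random.randrange(1, genome_len)     # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [bit ^ (random.random() < mutation_rate)
                     for bit in child]                # bit-flip mutation
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

# Toy "OneMax" fitness: count of 1-bits; the optimum is all ones.
best = evolve(sum)
```

The same skeleton carries over to the software engineering uses listed in the abstract by swapping in a domain fitness function, e.g. coverage achieved by generated test data or the cost of a project schedule.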
Abstract:
The majority of the Swiss population uses the internet to seek information about health. The objective is to be better informed, before or after the consultation. Doctors can advise their information-seeking patients about high quality websites, be it medical portals or websites dedicated to a specific pathology. Doctors should not see the internet as a threat but rather as an opportunity to strengthen the doctor-patient relationship.
Abstract:
OBJECTIVE: The aim of this study is to review highly cited articles that focus on non-publication of studies, and to develop a consistent and comprehensive approach to defining (non-) dissemination of research findings. SETTING: We performed a scoping review of definitions of the term 'publication bias' in highly cited publications. PARTICIPANTS: Ideas and experiences of a core group of authors were collected in a draft document, which was complemented by the findings from our literature search. INTERVENTIONS: The draft document including findings from the literature search was circulated to an international group of experts and revised until no additional ideas emerged and consensus was reached. PRIMARY OUTCOMES: We propose a new approach to the comprehensive conceptualisation of (non-) dissemination of research. SECONDARY OUTCOMES: Our 'What, Who and Why?' approach includes issues that need to be considered when disseminating research findings (What?), the different players who should assume responsibility during the various stages of conducting a clinical trial and disseminating clinical trial documents (Who?), and motivations that might lead the various players to disseminate findings selectively, thereby introducing bias in the dissemination process (Why?). CONCLUSIONS: Our comprehensive framework of (non-) dissemination of research findings, based on the results of a scoping literature search and expert consensus, will facilitate the development of future policies and guidelines regarding the multifaceted issue of selective publication, historically referred to as 'publication bias'.
Abstract:
The Catalan reception of the 1966 manifestos by Robert Venturi and Aldo Rossi marks the scene of a breakup: while North America debates architectural form as a linguistic structure, Italy sinks its roots into the Modern Movement tradition as the origin of a new temporal and ideological architectural dimension. The first contacts between Rossi and Spain confirm this search and allow the Italian to build shared itineraries with some architects from Barcelona. From these exchanges the 2C group was born, adopting typical avant-garde mechanisms: it published the magazine 2C. The construction of the city (1972-1985), attended the XV Triennale di Milano in 1973 with the Torres Clavé Plan (1971) and the Aldo Rossi + 21 Spanish architects exhibition (1975), while Rossi organised the three editions of the Seminarios Internacionales de Arquitectura Contemporánea (S.I.A.C.), held in Santiago, Sevilla and Barcelona between 1976 and 1980. Alongside these developments, the American contacts of Federico Correa, Oriol Bohigas, Lluís Domènech and the PER studio, together with the teaching of Rafael Moneo in Barcelona from 1971, traced parallel itineraries: the founding of the magazine Arquitecturas Bis (1974-1985), the meetings between international publications such as Lotus and Oppositions in Cadaqués (1975) and New York (1977), and exchanges with members of the Five Architects. In 1976 these parallel paths turned the initial ideological affinities between Rossi and the 2C group into an irreconcilable distancing. Tracing the journey that the Italian makes from the Italian resistance toward the American retreat is part of the aim of this article.
Abstract:
The purpose of this study was to develop co-operation between business units of a company operating in the graphic industry. The development was done by searching for synergy opportunities between these business units. The final aim was to form a business model based on the co-operation of these business units. The literature review of this thesis examines synergies, especially the process of searching for and implementing synergies, as well as the concept of a business model and its components. The research used a qualitative research method; the main data-gathering method for the empirical part was theme interviews, and the data were analysed using thematisation and content analysis. The results of the study include seven identified possible synergies and a business model based on the co-operation of the business units. The synergy opportunities are evaluated and an implementation order for the synergies is suggested. The presented synergies form the basis of the proposed business model.
Abstract:
Speaker diarization is the process of sorting speech according to the speaker. Diarization helps to search for and retrieve what a certain speaker uttered in a meeting. Applications of diarization systems extend to domains other than meetings, for example lectures, telephone, television, and radio. Diarization also enhances the performance of several speech technologies such as speaker recognition, automatic transcription, and speaker tracking. Methodologies previously used in developing diarization systems are discussed, and prior results and techniques are studied and compared. Methods such as Hidden Markov Models and Gaussian Mixture Models, which are used in speaker recognition and other speech technologies, are also used in speaker diarization. The objective of this thesis is to develop a speaker diarization system for the meeting domain. The experimental part of this work indicates that the zero-crossing rate can be used effectively to break the audio stream into segments, and that adaptive Gaussian models fit short audio segments adequately. Results show that 35 Gaussian models and an average segment length of one second are optimal values for building a diarization system for the tested data. Segments uttered by the same speaker are united in a bottom-up clustering with a new approach of categorizing the mixture weights.
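The segmentation cue named above, the zero-crossing rate, is straightforward to compute per frame. The frame length, hop size and the 50 Hz test tone below are illustrative assumptions, not the thesis settings:

```python
import math

def zero_crossing_rate(signal, frame_len=400, hop=200):
    """Per-frame fraction of adjacent sample pairs whose signs differ."""
    rates = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len]
        crossings = sum(1 for a, b in zip(frame, frame[1:])
                        if (a >= 0) != (b >= 0))
        rates.append(crossings / (frame_len - 1))
    return rates

# A 50 Hz tone sampled at 8 kHz crosses zero once every 80 samples,
# so each 400-sample frame contains about 5 crossings.
tone = [math.sin(2 * math.pi * 50 * n / 8000) for n in range(8000)]
zcr = zero_crossing_rate(tone)
```

In a segmenter, abrupt changes in this per-frame rate would mark candidate segment boundaries in the audio stream.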
Abstract:
Long-term independent budget travel to faraway countries has become increasingly common over the last few decades, and backpacking has now entered the tourism mainstream. Backpackers are today a very important segment of the global travel market. Backpacking is a type of tourism that involves a great deal of information-search activity. The Internet has become a major source of information as well as a platform for tourism business transactions. It allows travelers to gain information effortlessly and to learn about tourist destinations and products directly from other travelers in the form of electronic word-of-mouth (eWOM). Social media has penetrated and changed the backpacker market: modern travelers can stay connected to people at home, read online recommendations, and organize and book their trips very independently. To create a wider understanding of modern-day backpackers and their information search and share behavior in the Web 2.0 era, this thesis examined contemporary backpackers and their use of social media as an information and communication platform. To achieve this goal, three sub-objectives were identified: 1. to describe contemporary backpacker tourism; 2. to examine contemporary backpackers' travel information search and share behavior; 3. to explore the impacts of new information and communications technologies and Web 2.0 on backpacker tourism. The empirical data was gathered with an online survey; the method of analysis was therefore mainly quantitative, with a qualitative method used for a brief analysis of open questions. The research included both descriptive and analytical approaches, as the goal was to describe modern-day backpackers and to examine possible interdependencies between information search and share behavior and background variables. The interdependencies were tested for statistical significance with the help of five research hypotheses.
The results suggested that backpackers no longer fall under the original backpacker definitions described some decades ago. They are now mainly short-term travelers whose trips resemble those of mainstream tourists. They use communication technologies, particularly social media, very actively. Traditional information sources, mainly guidebooks and recommendations from friends, remain of great importance to them, but eWOM sources are also widely used in travel decision making. The use of each source varies according to the stage of the trip. All in all, Web 2.0 and new ICTs have transformed the backpacker tourism industry in many ways. Although the experience has become less authentic in some travelers' eyes, the backpacker culture is still recognizable.
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
In this article, we will review some behavioral, pharmacological and neurochemical studies from our laboratory on mice, which might contribute to our understanding of the complex processes of memory consolidation and reconsolidation. We discuss the post-training (memory consolidation) and post-reactivation (memory reconsolidation) effects of icv infusions of hemicholinium, a central inhibitor of acetylcholine synthesis, of intraperitoneal administration of L-NAME, a non-specific inhibitor of nitric oxide synthase, of intrahippocampal injections of an inhibitor of the transcription factor NF-κB, and the exposure of mice to a new learning situation on retention performance of an inhibitory avoidance response. All treatments impair long-term memory consolidation and retrieval-induced memory processes different from extinction, probably in accordance with the "reconsolidation hypothesis".
Abstract:
This study examines the efficiency of search engine advertising strategies employed by firms. The research setting is the online retailing industry, which is characterized by extensive use of Web technologies and high competition for market share and profitability. For Internet retailers, search engines increasingly serve as an information gateway for many decision-making tasks. In particular, search engine advertising (SEA) has opened a new marketing channel for retailers to attract new customers and improve their performance. In addition to natural (organic) search marketing strategies, search engine advertisers compete for top advertisement slots provided by search brokers such as Google and Yahoo! through keyword auctions. The rationale is that greater visibility on a search engine during a keyword search will capture customers' interest in a business and its product or service offerings. Search engines account for most online activities today. Compared with the slow growth of traditional marketing channels, online search volumes continue to grow at a steady rate. According to the Search Engine Marketing Professional Organization, spending on search engine marketing by North American firms in 2008 was estimated at $13.5 billion. Despite the significant role SEA plays in Web retailing, scholarly research on the topic is limited. Prior studies in SEA have focused on search engine auction mechanism design. In contrast, research on the business value of SEA has been limited by the lack of empirical data on search advertising practices. Recent advances in search and retail technologies have created data-rich environments that enable new research opportunities at the interface of marketing and information technology.
This research uses extensive data from Web retailing and Google-based search advertising and evaluates Web retailers' use of resources, search advertising techniques, and other relevant factors that contribute to business performance across different metrics. The methods used include Data Envelopment Analysis (DEA), data mining, and multivariate statistics. This research contributes to empirical research by analyzing several Web retail firms in different industry sectors and product categories. One of the key findings is that the dynamics of sponsored search advertising vary between multi-channel and Web-only retailers. While the key performance metrics for multi-channel retailers include measures such as online sales, conversion rate (CR), click-through rate (CTR), and impressions, the key performance metrics for Web-only retailers focus on organic and sponsored ad ranks. These results provide a useful contribution to our organizational-level understanding of search engine advertising strategies, both for multi-channel and Web-only retailers. They also contribute to current knowledge of technology-driven marketing strategies and provide managers with a better understanding of sponsored search advertising and its impact on various performance metrics in Web retailing.
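The performance metrics named above (CTR, CR and related sponsored-search figures) are simple ratios of raw campaign counts. A minimal sketch, with purely hypothetical campaign figures:

```python
def sea_metrics(impressions, clicks, orders, revenue, cost):
    """Standard sponsored-search ratios computed from raw campaign counts."""
    return {
        "CTR": clicks / impressions,   # click-through rate
        "CR": orders / clicks,         # conversion rate
        "CPC": cost / clicks,          # cost per click
        "ROAS": revenue / cost,        # return on ad spend
    }

# Hypothetical campaign: 50,000 impressions yielding 1,000 clicks and
# 40 orders worth $4,000, at an ad spend of $800.
m = sea_metrics(impressions=50_000, clicks=1_000, orders=40,
                revenue=4_000.0, cost=800.0)
```

Metrics like these would be the inputs and outputs of an efficiency analysis such as DEA, where each retailer is scored on how well it converts ad spend into clicks and sales.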
Abstract:
Optimised bibliographic search filters aim to facilitate the retrieval of information from bibliographic databases, which are almost always the most abundant source of scientific evidence, and thereby help to support evidence-based decision making. The majority of filters available in the literature are methodological filters; to realise their full potential, however, they must be combined with filters that retrieve studies covering a particular subject. In the field of patient safety, it has been shown that deficient information retrieval can have tragic consequences, so optimised search filters covering the field could prove very useful. The present study aims to propose optimised bibliographic search filters for the field of patient safety, to evaluate their validity, and to propose a guide for the development of search filters. We propose optimised filters for retrieving articles on patient safety in healthcare organisations in the Medline, Embase and CINAHL databases. These filters perform very well and are specifically built for articles whose content is explicitly linked to the field of patient safety by their authors. The extent to which their use can be generalised to other contexts depends on how the boundaries of the patient safety field are defined.
Abstract:
This thesis studies models of high-dimensional sequences based on recurrent neural networks (RNNs) and their application to music and speech. Although in principle RNNs can represent the long-term dependencies and complex temporal dynamics of sequences of interest such as video, audio and natural language, they have not been used to their full potential since their introduction by Rumelhart et al. (1986a), because of the difficulty of training them efficiently by gradient descent. Recently, the successful application of Hessian-free optimisation and other advanced training techniques has led to a resurgence of their use in several state-of-the-art systems. The work of this thesis is part of that development. The central idea is to exploit the flexibility of RNNs to learn a probabilistic description of sequences of symbols, i.e. high-level information associated with the observed signals, which in turn can serve as a prior to improve the accuracy of information retrieval. For example, by modelling the evolution of groups of notes in polyphonic music, of chords in a harmonic progression, of phonemes in a spoken utterance, or of individual sources in an audio mixture, we can significantly improve methods for polyphonic transcription, chord recognition, speech recognition and audio source separation, respectively. The practical application of our models to these tasks is detailed in the last four articles presented in this thesis. In the first article, we replace the output layer of an RNN with conditional restricted Boltzmann machines to describe much richer multimodal output distributions. In the second article, we evaluate and propose advanced methods for training RNNs.
In the last four articles, we examine different ways of combining our symbolic models with deep networks and with non-negative matrix factorisation, notably via products of experts, input/output architectures, and generative frameworks generalising hidden Markov models. We also propose and analyse efficient inference methods for these models, such as greedy chronological search, high-dimensional beam search, pruned beam search and gradient descent. Finally, we address the issues of label bias, teacher forcing, temporal smoothing, regularisation and pre-training.
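Of the inference methods listed, beam search is easy to sketch generically: at each time step, keep only the highest-scoring partial sequences. The toy scoring function below merely stands in for the RNN sequence models of the thesis and is purely illustrative:

```python
import math

def beam_search(step_logprob, vocab, beam_width=3, length=4):
    """Keep the `beam_width` best-scoring partial sequences per step.
    `step_logprob(seq, sym)` returns the log-probability of emitting
    `sym` after the prefix `seq`; any sequence model can supply it."""
    beams = [((), 0.0)]               # (sequence, cumulative log-prob)
    for _ in range(length):
        candidates = [(seq + (sym,), score + step_logprob(seq, sym))
                      for seq, score in beams for sym in vocab]
        beams = sorted(candidates, key=lambda c: c[1],
                       reverse=True)[:beam_width]
    return beams[0]

# Toy model: slightly prefers 'a' as the first symbol, then prefers
# repeating the previous symbol.
def toy(seq, sym):
    if not seq:
        return math.log(0.6 if sym == 'a' else 0.4)
    return math.log(0.7 if sym == seq[-1] else 0.3)

best_seq, best_score = beam_search(toy, vocab=('a', 'b'))
```

Setting `beam_width=1` recovers greedy chronological search; pruning rules on `candidates` would give the pruned variant mentioned above.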
Abstract:
Queueing systems in which arriving customers who find all servers and waiting positions (if any) occupied may retry for service after a period of time are called retrial queues, or queues with repeated attempts. This study has two objectives. The first is to introduce orbital search into retrial queueing models, which makes it possible to minimise the idle time of the server; if holding costs and the cost of searching for customers are introduced, the results obtained can be used for optimal tuning of the parameters of the search mechanism. The second is to provide insight into the link between the corresponding retrial queue and the classical queue. We observe that when the search probability Pj = 1 for all j, the model reduces to the classical queue, and when Pj = 0 for all j, the model becomes the retrial queue. The study discusses the performance evaluation of a single-server retrial queue with Poisson arrivals. It then discusses the structure of the busy period and its analysis in terms of Laplace transforms, and provides a direct method of evaluating the first and second moments of the busy period. It goes on to discuss the M/PH/1 retrial queue with disasters to the unit in service and orbital search, and a multi-server retrial queueing model (MAP/M/c) with search of customers from the orbit; the MAP is a convenient tool for modelling both renewal and non-renewal arrivals. Finally, the model deals with back-and-forth movement between the classical queue and the retrial queue: when the orbit size increases, the retrial rate increases correspondingly, thereby reducing the idle time of the server between services.
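The limiting behaviour noted above (search probability 1 recovering the classical queue, 0 the pure retrial queue) can be checked by simulation. The sketch below is a minimal event-driven M/M/1 retrial queue with orbital search; the chosen rates and the simplification that retrials are suspended while the server is busy are assumptions for illustration, not the models analysed in the study:

```python
import random

def simulate(lam, mu, theta, p, horizon=100_000, seed=1):
    """Event-driven M/M/1 retrial queue with orbital search: after a
    service completion the server searches the orbit and, with
    probability p, immediately serves the next orbiting customer.
    Simplification: retrials are suspended while the server is busy."""
    rng = random.Random(seed)
    t, busy, orbit, busy_time = 0.0, False, 0, 0.0
    while t < horizon:
        arr = lam                                  # new arrival
        dep = mu if busy else 0.0                  # service completion
        ret = theta * orbit if not busy else 0.0   # successful retrial
        total = arr + dep + ret
        dt = rng.expovariate(total)
        if busy:
            busy_time += min(dt, horizon - t)
        t += dt
        if t >= horizon:
            break
        u = rng.random() * total
        if u < arr:
            if busy:
                orbit += 1           # blocked arrival joins the orbit
            else:
                busy = True
        elif u < arr + dep:
            busy = False
            if orbit and rng.random() < p:
                orbit -= 1           # orbital search succeeds
                busy = True
        else:
            orbit -= 1               # retrial from the orbit
            busy = True
    return busy_time / horizon       # long-run server utilisation

# In any stable variant the utilisation approaches lam/mu regardless
# of the search probability p; search affects idle time, not load.
util = simulate(lam=0.5, mu=1.0, theta=0.2, p=1.0)
```

Running the same simulation with p between 0 and 1 illustrates the back-and-forth movement between the retrial and classical regimes described in the abstract.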
Abstract:
A remote data acquisition and analysis system developed for fisheries and related environmental studies is reported. It consists of three units. The first, a multichannel remote data acquisition system, is installed at the remote site and powered by a rechargeable battery; it acquires and stores 16-channel environmental data in a battery-backed RAM. The second unit, the field data analyser, is used for in-situ display and analysis of the data stored in the backed-up RAM. The third unit, the laboratory data analyser, is an IBM-compatible PC-based unit for detailed analysis and interpretation of the data after the RAM unit is brought to the laboratory. The data collected using the system have been analysed and presented in the form of graphs. The system timer, which operates at negligibly low current, switches on power to the entire remote system at a prefixed time interval of 2 hours. Data storage at the remote site in low-power battery-backed RAM, with retrieval and analysis of the data on a PC, is the speciality of the system. The remote system takes about 7 seconds, including a 5-second stabilisation time, to acquire and store data, and is well suited to remote operation on a rechargeable battery. The system can store 16-channel data scanned at 2-hour intervals for 10 days in 2K of backed-up RAM, with a memory expansion facility for 8K RAM.
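The stated capacity (16 channels scanned every 2 hours for 10 days in 2K of RAM) can be checked with simple arithmetic, assuming one byte per channel per scan; the abstract does not state the sample width, so that byte count is an assumption:

```python
def storage_needed(channels=16, interval_hours=2, days=10,
                   bytes_per_sample=1):
    """Bytes of RAM needed to log every channel at each scan."""
    scans_per_day = 24 // interval_hours   # 12 scans per day
    return scans_per_day * days * channels * bytes_per_sample

needed = storage_needed()   # 12 * 10 * 16 = 1920 bytes
assert needed <= 2048       # fits in the 2K battery-backed RAM
```

The same arithmetic shows that the 8K expansion would extend the logging window to roughly 40 days under the one-byte-per-sample assumption.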