959 results for Search Engine
Abstract:
The central objective of research in Information Retrieval (IR) is to discover new techniques for retrieving relevant information in order to satisfy an Information Need. The Information Need is satisfied when relevant information can be provided to the user. In IR, relevance is a fundamental concept that has changed over time, from popular to personal: what was once considered relevant was information for the whole population, whereas what is considered relevant now is specific information for each user. Hence there is a need to connect the behavior of the system to the condition of a particular person and their social context; from this need an interdisciplinary field called Human-Centered Computing was born. For the modern search engine, the information extracted for the individual user is crucial. According to Personalized Search (PS), two different techniques are necessary to personalize a search: contextualization (the interconnected conditions that occur in an activity) and individualization (the characteristics that distinguish an individual). This shift of focus toward the individual's needs undermines the rigid linearity of the classical model, which has been overtaken by the ``berry picking'' model: search terms evolve thanks to the informational feedback received during the search activity. The development of Information Foraging theory, which observed correlations between animal foraging and human information foraging, also contributed to this transformation through attempts to optimize the cost-benefit ratio. This thesis arose from the need to satisfy human individuality when searching for information, and it develops a synergistic collaboration between the frontiers of technological innovation and recent advances in IR. The search method developed exploits what is relevant for the user by radically changing the way in which an Information Need is expressed: it is now expressed through the generation of the query together with its own context. Indeed, the method was conceived to improve the quality of search by rewriting the query based on contexts automatically generated from a local knowledge base. Furthermore, the goal of optimizing any IR system led to developing the method as a middleware between the user and the IR system. The system therefore has just two possible actions: rewriting the query and reordering the results. Similar actions are described in the PS literature, which generally exploits information derived from the analysis of user behavior, whereas the proposed approach exploits knowledge provided by the user. The thesis goes further and develops a novel assessment procedure, in line with the "Cranfield paradigm", for evaluating this type of IR system. The results achieved are interesting considering both the effectiveness attained and the innovative approach undertaken, together with the several applications that a local knowledge base inspires.
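The two middleware actions named above (query rewriting and result reordering) lend themselves to a compact illustration. Below is a minimal sketch, not the thesis implementation: the knowledge-base format, the `backend` callable, and all names are assumptions made for the example.

```python
# Hypothetical middleware with exactly two actions: rewrite the query
# using a local knowledge base, then reorder the backend's results.

def rewrite_query(query: str, kb: dict[str, list[str]]) -> str:
    """Expand the query with context terms drawn from the local knowledge base."""
    expansions = []
    for term in query.split():
        expansions.extend(kb.get(term.lower(), []))
    return " ".join([query, *expansions])

def reorder_results(results: list[dict], kb: dict[str, list[str]]) -> list[dict]:
    """Promote results whose titles overlap the user-provided knowledge."""
    vocab = set(kb) | {t for terms in kb.values() for t in terms}
    def overlap(result: dict) -> int:
        return len(vocab & set(result["title"].lower().split()))
    return sorted(results, key=overlap, reverse=True)

def personalized_search(query, kb, backend):
    """The whole middleware contract: rewrite, delegate, reorder."""
    return reorder_results(backend(rewrite_query(query, kb)), kb)

# Stub backend standing in for any IR system:
kb = {"jaguar": ["car", "engine"]}  # the user means the car, not the animal
backend = lambda q: [{"title": "Jaguar habitat"}, {"title": "Jaguar engine specs"}]
print(personalized_search("jaguar", kb, backend))
```

Because both actions wrap an otherwise untouched backend, the same middleware can, as the abstract notes, sit in front of any IR system.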
Abstract:
An introduction to semantic web techniques and the development of an approach able to recreate the familiar environment of an ordinary search engine, enriched with semantic-lexical functionality and with the ability to extract, from the search results, the key concepts and terms that form the corresponding collection groups for documents sharing common topics.
Abstract:
This thesis focuses on the English localization of several sections of the new website of the Pinacoteca di Brera. The localization project is contextualized within the literature on museum communication on the one hand and on web communication on the other, in order to put forward improvement proposals informed by research in the field of SEO (Search Engine Optimization). The study of museum communication was enriched by a period of documentation at the University of Leicester (UK). The thesis aims to lay the foundations for producing museum content suited to reading on the web, so as to offer a translation that is not only linguistically and culturally sound, but also easy for an online user to consume and easy to find through search engines. The work is intended to offer Italian museums some food for thought on possible improvements to their online platforms through localization and an in-depth analysis of web content according to principles of usability and visibility. Chapter 1 introduces the literature on museum studies, paying particular attention to communication. Chapter 2 provides a general overview of the web: it suggests good web-writing practices, analyzes SEO strategies for improving site visibility, and outlines the main features of the localization process. Chapter 3 brings together the two worlds explored separately so far, museums and the web, focusing on museums' online communication and concluding with an evaluation framework for museum websites. Chapter 4 applies the strategies discussed above to the specific case of the Pinacoteca di Brera, focusing on the evaluation of the site, the localization of selected sections, and the proposal of SEO strategies. Finally, chapter 5 draws the threads of the whole work together, highlighting the main results obtained.
Abstract:
Building a search engine for a specific document domain involves many choices. This document sets out the problems encountered and the solutions found during the development of a search engine for cooking recipes. The dissertation illustrates the problem from both an architectural and an implementation point of view; in particular, it covers both the MVC design pattern, used as the foundation of the project, and stemming and ranking algorithms.
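As a rough sketch of the two algorithm families mentioned above, the following toy index pairs a deliberately naive suffix-stripping stemmer with TF-IDF ranking; a real system would use a Porter or Snowball stemmer and a tuned ranking function, so treat every detail here as an assumption.

```python
# Toy stemming + TF-IDF ranking; illustrative only.
import math
from collections import Counter

def stem(word: str) -> str:
    """Naive suffix stripper standing in for a real stemmer."""
    for suffix in ("ing", "es", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def tokenize(text: str) -> list[str]:
    return [stem(w) for w in text.lower().split()]

def rank(query: str, docs: list[str]) -> list[tuple[float, str]]:
    """Score every document against the query with TF-IDF; highest first."""
    n = len(docs)
    tokenized = [tokenize(d) for d in docs]
    df = Counter(t for toks in tokenized for t in set(toks))  # document frequency
    scores = []
    for doc, toks in zip(docs, tokenized):
        tf = Counter(toks)  # term frequency within this document
        score = sum(tf[t] * math.log(n / df[t]) for t in tokenize(query) if t in df)
        scores.append((score, doc))
    return sorted(scores, reverse=True)

print(rank("baking recipes", ["cake baking tips", "pasta recipes", "chocolate cake"]))
```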
Abstract:
OBJECTIVE: Neurologically normal term infants sometimes present with repetitive, rhythmic myoclonic jerks that occur during sleep. The condition, which traditionally resolves by 3 months of age with no sequelae, is termed benign neonatal sleep myoclonus. The goal of this review was to synthesize the published literature on benign neonatal sleep myoclonus. METHODS: The US National Library of Medicine database and the Web-based search engine Google, through June 2009, were used as data sources. All articles published after the seminal description in 1982 as full-length articles or letters were collected. Reports published in languages other than English, French, German, Italian, Portuguese, or Spanish were not considered. RESULTS: We included 24 reports in which 164 term-born (96%) or near-term-born (4%) infants were described. Neonatal sleep myoclonus occurred in all sleep stages, disappeared after arousal, and was induced by rocking the infant or by repetitive sound stimuli. Furthermore, in affected infants, the jerks stopped, or even worsened, when the limbs were held or when antiepileptic drugs were given. Finally, benign neonatal sleep myoclonus did not resolve by 3 months of age in one-third of the infants. CONCLUSIONS: This review provides new insights into the clinical features and natural course of benign neonatal sleep myoclonus. The most significant limitation of the review is the small number of reported cases.
Abstract:
Reengineering and integrated development platforms typically do not list search results in a particularly useful order. PageRank is the algorithm prominently used by the Google internet search engine to rank the relative importance of elements in a set of hyperlinked documents. To determine the relevance of objects, classes, attributes, and methods, we propose to apply PageRank to software artifacts and their relationships (reference, inheritance, access, and invocation). This paper presents various experiments that demonstrate the usefulness of the ranking algorithm in software (re)engineering.
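As an illustration of the proposal (a sketch, not the paper's code), PageRank can be computed by power iteration over a small, hypothetical graph of software artifacts whose edges stand for the reference, inheritance, access, and invocation relationships mentioned above.

```python
# Power-iteration PageRank over a made-up artifact graph.

def pagerank(edges: dict[str, list[str]], damping: float = 0.85,
             iterations: int = 50) -> dict[str, float]:
    """Basic PageRank; rank flowing out of dangling nodes is dropped for brevity."""
    nodes = set(edges) | {t for targets in edges.values() for t in targets}
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        new = {n: (1.0 - damping) / len(nodes) for n in nodes}
        for src, targets in edges.items():
            if targets:  # spread src's rank evenly over its outgoing edges
                share = damping * rank[src] / len(targets)
                for t in targets:
                    new[t] += share
        rank = new
    return rank

# Hypothetical graph: an edge means "source references, inherits from,
# accesses, or invokes target".
graph = {
    "Compiler": ["Parser", "AstNode"],
    "Parser": ["Lexer", "AstNode"],
    "Lexer": ["Token"],
    "AstNode": ["Token"],
}
print(sorted(pagerank(graph).items(), key=lambda kv: -kv[1]))
```

Heavily referenced artifacts such as `Token` accumulate rank, which is exactly the signal used to order search results.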
Abstract:
Neuroimaging and electrophysiological investigations have demonstrated numerous differences in brain morphology and function of chronic schizophrenia patients compared to healthy controls. Studying patients at the beginning of their disease without the confounding effects of chronicity, medication, and institutionalization may provide a better understanding of schizophrenia. Recently, at many institutions around the world, special projects have been launched for specialized treatment and research of this interesting patient group. Using the PubMed search engine in this update, the authors summarize recent investigations between January 2002 and September 2006 that focus on whether signs of disconnectivity already exist early in the disease process. They discuss gray and white matter changes, their impact on symptomatology, electroencephalogram-based studies on connectivity, and possible influences of medication.
Abstract:
BACKGROUND: Lodox-Statscan is a whole-body, skeletal and soft-tissue, low-dose X-ray scanner. Anterior-posterior and lateral thoraco-abdominal studies are obtained in 3-5 minutes with only about one-third of the radiation required for conventional radiography. Since its approval by the Food and Drug Administration (FDA) in the USA, several trauma centers have incorporated this technology into their Advanced Trauma Life Support protocols. This review provides a brief overview of the system and describes the authors' own experience with it. METHODS: We performed a PubMed search to retrieve all references with 'Lodox' and 'Stat-scan' used as search terms. We furthermore used the Google search engine to identify existing alternatives. To the best of our knowledge, this is the only FDA-approved device of its kind currently used in trauma. RESULTS AND CONCLUSION: The intention of our review is to make readers aware that such alternative devices exist. The key message is that low-dose full-body radiography may be an alternative to conventional resuscitation-room radiography, which is usually a prelude to CT scanning (ATLS algorithm). The combination of the two is radiation intensive, and we therefore consider any reduction of radiation a success. Only the future will show whether Lodox-Statscan will survive in the face of low-dose CT scanners and magnetic resonance imaging devices that may eventually replace conventional radiography completely.
Abstract:
The three-step test is central to the regulation of copyright limitations at the international level. Delineating the room for exemptions with abstract criteria, the three-step test is by far the most important and comprehensive basis for the introduction of national use privileges. It is an essential, flexible element in the international limitation infrastructure that allows national lawmakers to satisfy domestic social, cultural, and economic needs. Given the universal field of application that follows from the test's open-ended wording, the provision creates much more breathing space than the more specific exceptions recognized in international copyright law. EC copyright legislation, however, fails to take advantage of the flexibility inherent in the three-step test. Instead of using the international provision as a means to open up the closed EC catalogue of permissible exceptions, offer sufficient breathing space for social, cultural, and economic needs, and enable EC copyright law to keep pace with the rapid development of the Internet, the Copyright Directive 2001/29/EC encourages the application of the three-step test to further restrict statutory exceptions that are often defined narrowly in national legislation anyway. In the current online environment, however, enhanced flexibility in the field of copyright limitations is indispensable. From a social and cultural perspective, the web 2.0 promotes and enhances freedom of expression and information with its advanced search engine services, interactive platforms, and various forms of user-generated content. From an economic perspective, it creates a parallel universe of traditional content providers relying on copyright protection, and emerging Internet industries whose further development depends on robust copyright limitations. In particular, the newcomers in the online market – social networking sites, video forums, and virtual worlds – promise a remarkable potential for economic growth that has already attracted the attention of the OECD. Against this background, the time is ripe to debate the introduction of an EC fair use doctrine on the basis of the three-step test. Otherwise, EC copyright law is likely to frustrate important opportunities for cultural, social, and economic development. To lay the groundwork for the debate, the differences between the continental European and the Anglo-American approach to copyright limitations (section 1), and the specific merits of these two distinct approaches (section 2), will be discussed first. An analysis of current problems that have arisen under the present dysfunctional EC system (section 3) will then serve as a starting point for proposing an EC fair use doctrine based on the three-step test (section 4). Drawing conclusions, the international dimension of this fair use proposal will be considered (section 5).
Abstract:
The long-awaited verdict by the German Federal Court of Justice on Google image search has drawn much attention to the problem of copyright infringement by search engines on the Internet. In recent years the question has arisen whether a listing itself in a search engine like Google can constitute an infringement of copyright. The decision is widely seen as one of the most important of recent years. With considerable effort, the German Federal Court tried to balance the interests of the right holders with those of the digital reality.
Abstract:
Web 2.0 and social networks provided the first impulses for new forms of online teaching that make lasting use of the comprehensive networking of objects and users on the Internet. However, the diversity of the various systems makes it difficult to use them holistically in a comprehensive learning scenario that meets the demands of the modern information society. This paper presents a connectivism-based platform for online teaching called "Wiki-Learnia", which covers all essential stages of lifelong learning. Using contemporary technologies, it not only connects users with one another but also links users with dedicated content and, where applicable, with the associated authors and/or tutors. The former relies on various Web 2.0 communication tools (social networks, chats, forums, etc.); the latter rests on the so-called "Learning-Hub" approach, which is instrumented with Web 3.0 mechanisms, in particular a semantic meta search engine. To demonstrate the practical relevance of the approach, the paper presents the media-supported Juniorstudium of the University of Rostock, a project that prepares upper secondary school students for university study. The specific requirements of this project demonstrate the broad functionality and great flexibility of Wiki-Learnia.
Abstract:
AIMS AND BACKGROUND: Tumor progression due to seeding of tumor cells after definitive treatment for squamous cell carcinomas of the head and neck is an uncommon condition that can considerably worsen the outcome of patients with head and neck cancer. METHODS AND STUDY DESIGN: We report two cases of recurrence due to neoplastic seeding from oropharyngeal and oral cancer, respectively. We performed a literature review with MEDLINE as the main search engine. RESULTS: Seeding was found to occur most often in tracheotomy scars and gastrostomy sites. The oral cavity, hypopharynx, and oropharynx were the primary sites in most cases, and advanced tumor stage seemed to be a risk factor for seeding. Treatment options include salvage surgery, which requires thorough resections, radiotherapy when possible, and palliative management. The prognosis of such events is poor. CONCLUSION: Although neoplastic seeding is a well-known phenomenon in cancer surgery, many questions remain unanswered, especially regarding preventive measures and management strategies.
Abstract:
This paper presents fuzzy clustering algorithms to establish a grassroots ontology – a machine-generated weak ontology – based on folksonomies. Furthermore, it describes a search engine that finds vaguely associated terms and aggregates them into several meaningful cluster categories, based on the introduced weak grassroots ontology. A potential application of this ontology, weblog extraction, is illustrated using a simple example. Added value and possible future studies are discussed in the conclusion.
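A compact sketch of the kind of fuzzy clustering involved, here fuzzy c-means over toy tag co-occurrence vectors; the data, tag names, and parameter choices are invented for illustration and are not from the paper.

```python
# Fuzzy c-means over invented folksonomy tag vectors.
import numpy as np

def fuzzy_cmeans(X: np.ndarray, c: int, m: float = 2.0,
                 iters: int = 100, seed: int = 0):
    """Return (memberships U of shape [n, c], centroids of shape [c, d])."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))      # each row sums to 1
    for _ in range(iters):
        W = U ** m                                  # fuzzified memberships
        centroids = (W.T @ X) / W.sum(axis=0)[:, None]
        # distances of every point to every centroid (n x c); eps avoids /0
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2) + 1e-9
        inv = d ** (-2.0 / (m - 1))
        U = inv / inv.sum(axis=1, keepdims=True)    # standard FCM update
    return U, centroids

# Each tag is described by co-occurrence counts with three reference tags.
tags = ["jazz", "blues", "python", "java"]
X = np.array([[9.0, 1, 0], [8, 2, 0], [0, 1, 9], [0, 2, 8]])
U, _ = fuzzy_cmeans(X, c=2)
for tag, membership in zip(tags, U):
    print(tag, membership.round(2))  # soft membership in each cluster
```

The soft memberships are what make the ontology "weak": a tag can belong to several cluster categories at once, with graded strength.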
Abstract:
This paper uses folksonomies and fuzzy clustering algorithms to establish relationships among related terms. It proposes a meta search engine able to search for vaguely associated terms and aggregate them into several meaningful cluster categories. The potential of fuzzy weblog extraction is illustrated using a simple example; added value and possible future studies are discussed in the conclusion.