923 results for Search Engine Optimization Methods
Abstract:
The central objective of research in Information Retrieval (IR) is to discover new techniques for retrieving relevant information in order to satisfy an Information Need. The Information Need is satisfied when relevant information can be provided to the user. In IR, relevance is a fundamental concept that has changed over time, from popular to personal: what was once considered relevant was information for the whole population, whereas what is considered relevant now is specific information for each user. Hence, there is a need to connect the behavior of the system to the condition of a particular person and their social context; from this need the interdisciplinary field of Human-Centered Computing was born. For the modern search engine, the information extracted for the individual user is crucial. According to Personalized Search (PS), two different techniques are necessary to personalize a search: contextualization (the interconnected conditions that occur in an activity) and individualization (the characteristics that distinguish an individual). This shift of focus toward the individual's need undermines the rigid linearity of the classical model, which has been superseded by the "berry picking" model: search terms evolve thanks to the informational feedback received from the search activity. The development of Information Foraging theory, which observed the correlations between animal foraging and human information foraging, also contributed to this transformation through attempts to optimize the cost-benefit ratio. This thesis arose from the need to satisfy human individuality when searching for information, and it develops a synergistic collaboration between the frontiers of technological innovation and recent advances in IR. The search method developed exploits what is relevant for the user by radically changing the way in which an Information Need is expressed: it is now expressed through the generation of the query together with its context. The method was conceived to improve the quality of search by rewriting the query on the basis of contexts automatically generated from a local knowledge base. Furthermore, the goal of optimizing any IR system led to developing the method as a middleware between the user and the IR system, so the system has just two possible actions: rewriting the query and reordering the results. Similar actions are described in the PS literature, which generally exploits information derived from the analysis of user behavior, whereas the proposed approach exploits knowledge provided by the user. The thesis goes further and introduces a novel assessment procedure, in line with the "Cranfield paradigm", for evaluating this type of IR system. The results achieved are promising, considering both the effectiveness obtained and the innovative approach undertaken, together with the several applications inspired by the use of a local knowledge base.
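A minimal sketch of such a middleware, assuming a toy knowledge base (a dict mapping user terms to context terms) and an arbitrary search backend passed in as a callable; both are illustrative stand-ins for the thesis's context-generation machinery:

```python
# Minimal sketch of the two-action middleware (assumptions: "kb" is a toy
# user-provided knowledge base mapping terms to context terms, and "backend"
# is any callable search engine; both stand in for the thesis's components).

def rewrite_query(query, kb):
    """Action 1: expand the query with context terms from the knowledge base."""
    terms = query.lower().split()
    expansion = [ctx for t in terms for ctx in kb.get(t, [])]
    return " ".join(terms + expansion)

def reorder_results(results, kb):
    """Action 2: rank results by overlap with the user's context vocabulary."""
    vocab = set(kb) | {t for ctxs in kb.values() for t in ctxs}
    score = lambda doc: sum(w in vocab for w in doc.lower().split())
    return sorted(results, key=score, reverse=True)

def personalized_search(query, backend, kb):
    # the middleware sits between user and IR system: rewrite, search, reorder
    return reorder_results(backend(rewrite_query(query, kb)), kb)

# Toy usage with a stub backend
kb = {"jaguar": ["car", "engine"]}
backend = lambda q: ["jaguar habitat and diet", "jaguar engine specs"]
print(personalized_search("jaguar", backend, kb))  # engine page ranked first
```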
Abstract:
For several decades now, sports science has been supported in its work by computer-based methods. With the steady advancement of technology, sports practice has in recent years also been able to benefit increasingly from their use. Mathematical and computational models as well as algorithms are used for performance optimization in both team and individual sports. In the present work, the metamodel PerPot, developed by Prof. Perl in 2000, is adapted to endurance-oriented running. The changes concern both the internal model structure and the way the model parameters are determined. So that the model can be used in sports practice, a calibration test was developed with which the specific model parameters are individually fitted to the respective athlete. With the adapted model, it is possible to reproduce the corresponding heart-rate curves from given velocity profiles. With the model tuned to the athlete, simulations of runs can then be carried out by entering velocity profiles. These simulations can be used in practice to optimize training and competition. Training can be optimally controlled by determining a simulation-based individual anaerobic threshold heart rate. The statistical evaluation of the PerPot threshold shows significant agreement with the invasively determined lactate thresholds customary in sports practice. Competitions can be supported by determining an optimal velocity profile through various simulation-based optimization procedures. With the newest method, the athlete even receives up-to-date predictions during the race, based on the velocity and heart-rate data measured during the competition. The competition target times optimized with PerPot show high predictive quality compared with the target times actually achieved.
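As a purely illustrative sketch of the antagonistic idea behind PerPot (load feeds both a delaying strain potential and a delaying response potential, whose opposing flows jointly drive the output), the following toy model maps a velocity profile to a heart-rate-like curve; all parameter names and the read-out are assumptions, not the calibrated model from the thesis:

```python
# Purely illustrative PerPot-style antagonistic update (assumptions: discrete
# time steps, speed as load, and delay parameters fitted per athlete by a
# calibration test; the thesis's actual structure and calibration are more
# refined than this toy).

def simulate_hr(speeds, delay_strain=4.0, delay_response=2.5, hr_base=70.0):
    """Map a velocity profile (one value per time step) to a heart-rate-like curve."""
    strain = response = level = 0.0
    hr = []
    for v in speeds:
        strain += v                          # load fills the strain potential ...
        response += v                        # ... and the response potential
        flow_s = strain / delay_strain       # delayed negative flow
        flow_r = response / delay_response   # delayed positive flow
        strain -= flow_s
        response -= flow_r
        level += flow_r - flow_s             # net effect on the output potential
        hr.append(hr_base + level)           # read out as heart rate
    return hr

# Example: 10 minutes at 3 m/s, then 10 minutes at 4 m/s (1-minute steps);
# the simulated heart rate rises toward a plateau after each speed change.
print(simulate_hr([3.0] * 10 + [4.0] * 10))
```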
Abstract:
Building a search engine for a specific document domain involves many choices. This document aims to explain the problems encountered and the solutions obtained during the development of a search engine for culinary recipes. The dissertation illustrates the problem from both an architectural and an implementation point of view; in particular, it covers both the MVC design pattern, used as the basis of the project, and stemming and ranking algorithms.
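The thesis's concrete stemming and ranking choices are not spelled out in this abstract; the sketch below, assuming a crude suffix-stripping stemmer and TF-IDF scoring (a Porter or Snowball stemmer would replace the toy `stem` in practice), shows how the two algorithms fit together in such an engine:

```python
# Sketch of a stemming + TF-IDF ranking core (assumptions: a crude
# suffix-stripping stemmer and a simple smoothed TF-IDF score; the thesis's
# actual algorithms are not specified in the abstract).
import math
from collections import Counter

def stem(token):
    # crude suffix stripping; a real engine would use a Porter/Snowball stemmer
    for suffix in ("ing", "es", "s"):
        if token.endswith(suffix) and len(token) > len(suffix) + 2:
            return token[: -len(suffix)]
    return token

def tokenize(text):
    return [stem(t) for t in text.lower().split()]

def rank(query, documents):
    """Return documents sorted by TF-IDF relevance to the query."""
    docs = [Counter(tokenize(d)) for d in documents]
    n = len(docs)
    def idf(term):
        df = sum(1 for d in docs if term in d)
        return math.log((n + 1) / (df + 1)) + 1.0   # smoothed IDF
    q_terms = tokenize(query)
    scored = [(sum(d[t] * idf(t) for t in q_terms), raw)
              for raw, d in zip(documents, docs)]
    return [doc for score, doc in sorted(scored, key=lambda s: -s[0])]

print(rank("baking cakes", ["grill the fish", "cake baking tips", "bake a cake"]))
```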
Abstract:
Plant volatiles typically occur as a complex mixture of low-molecular-weight lipophilic compounds derived from different biosynthetic pathways, and are seemingly produced as part of a defense strategy against biotic and abiotic stress, as well as contributing to various physiological functions of the producer organism. The biochemistry and molecular biology of plant volatiles is complex, involving the interplay of several biochemical pathways and hundreds of genes. All plants are able to store and emit volatile organic compounds (VOCs), but the process shows remarkable genotypic variation and phenotypic plasticity. From a physiological standpoint, plant volatiles are involved in three critical processes, namely plant–plant interaction, signaling between symbiotic organisms, and the attraction of pollinating insects. Their role in these "housekeeping" activities underlies agricultural applications that range from the search for sustainable methods of pest control to the production of flavors and fragrances. On the other hand, there is also growing evidence that VOCs are endowed with a range of biological activities in mammals, and that they represent a substantially under-exploited and still largely untapped source of novel drugs and drug leads. This review summarizes recent major developments in the study of the biosynthesis, ecological functions, and medicinal applications of plant VOCs.
Abstract:
Collagen is a major component of the extracellular matrix, and a wide variety of types exist. Cells recognise collagen in different ways, depending on its sequence and structure: some recognise predominantly the primary sequence, some require a triple-helical structure, and others require fibrillar structures. Since collagens are major constituents of the subendothelium that determine the thrombogenicity of the injured or pathological vessel wall, a major role is the induction of platelet activation and aggregation as the start of repair processes. Platelets have at least two direct and one indirect (via von Willebrand factor) receptors for collagen, and collagen has specific recognition motifs for these receptors. These receptors and recognition motifs are under intensive investigation in the search for possible methods to control platelet activation in vivo. A wide range of proteins that inhibit platelet activation by collagen, or induce platelet activation via the collagen receptors on platelets, has been identified and partly characterised from haematophagous insects and other invertebrates, as well as from snake venoms. These will provide model systems to test the effect of inhibiting specific collagen-platelet receptor interactions, both for effectiveness and for side effects, and should provide assay systems for the development of small-molecule inhibitors. Since platelet inhibitors for long-term prophylaxis of cardiovascular diseases are still in clinical trials, there are many unanswered questions about long-term effects, both positive and negative. The major problem that still has to be definitively solved for these alternative approaches to the inhibition of platelet activation is whether they will show advantages in terms of dose-response curves while offering a decreased risk of bleeding problems. Preliminary studies suggest that this is indeed the case.
Abstract:
Neuroimaging and electrophysiological investigations have demonstrated numerous differences in brain morphology and function between chronic schizophrenia patients and healthy controls. Studying patients at the beginning of their disease, without the confounding effects of chronicity, medication, and institutionalization, may provide a better understanding of schizophrenia. Recently, at many institutions around the world, special projects have been launched for the specialized treatment and study of this patient group. In this update, using the PubMed search engine, the authors summarize investigations published between January 2002 and September 2006 that focus on whether signs of disconnectivity already exist early in the disease process. They discuss gray and white matter changes, their impact on symptomatology, electroencephalogram-based studies on connectivity, and possible influences of medication.
Abstract:
The three-step test is central to the regulation of copyright limitations at the international level. Delineating the room for exemptions with abstract criteria, the three-step test is by far the most important and comprehensive basis for the introduction of national use privileges. It is an essential, flexible element in the international limitation infrastructure that allows national law makers to satisfy domestic social, cultural, and economic needs. Given the universal field of application that follows from the test's open-ended wording, the provision creates much more breathing space than the more specific exceptions recognized in international copyright law. EC copyright legislation, however, fails to take advantage of the flexibility inherent in the three-step test. Instead of using the international provision as a means to open up the closed EC catalogue of permissible exceptions, offer sufficient breathing space for social, cultural, and economic needs, and enable EC copyright law to keep pace with the rapid development of the Internet, the Copyright Directive 2001/29/EC encourages the application of the three-step test to further restrict statutory exceptions that are often defined narrowly in national legislation anyway. In the current online environment, however, enhanced flexibility in the field of copyright limitations is indispensable. From a social and cultural perspective, Web 2.0 promotes and enhances freedom of expression and information with its advanced search engine services, interactive platforms, and various forms of user-generated content. From an economic perspective, it creates a parallel universe of traditional content providers relying on copyright protection, and emerging Internet industries whose further development depends on robust copyright limitations. In particular, the newcomers in the online market – social networking sites, video forums, and virtual worlds – promise a remarkable potential for economic growth that has already attracted the attention of the OECD. Against this background, the time is ripe to debate the introduction of an EC fair use doctrine on the basis of the three-step test. Otherwise, EC copyright law is likely to frustrate important opportunities for cultural, social, and economic development. To lay the groundwork for the debate, the differences between the continental European and the Anglo-American approach to copyright limitations (section 1), and the specific merits of these two distinct approaches (section 2), will be discussed first. An analysis of current problems that have arisen under the present dysfunctional EC system (section 3) will then serve as a starting point for proposing an EC fair use doctrine based on the three-step test (section 4). Finally, the international dimension of this fair use proposal will be considered (section 5).
Abstract:
The long-awaited verdict by the German Federal Court of Justice on Google image search has drawn much attention to the problem of copyright infringement by search engines on the Internet. In recent years the question has arisen whether the listing itself in a search engine like Google can be an infringement of copyright. The decision is widely seen as one of the most important of recent years. With considerable effort, the German Federal Court tried to balance the interests of the right holders with those of the digital reality.
Abstract:
Web 2.0 and social networks provided the first impetus for new forms of online teaching that make sustained use of the comprehensive networking of objects and users on the Internet. The diversity of the different systems, however, makes it difficult to use them holistically in a comprehensive learning scenario that meets the demands of the modern information society. This contribution presents a connectivism-based platform for online teaching called "Wiki-Learnia", which covers all essential phases of lifelong learning. Using up-to-date technologies, not only are users connected with one another, but users are also linked with dedicated content and, where applicable, the associated authors and/or tutors. For the former, various Web 2.0 communication tools (social networks, chats, forums, etc.) are employed. The latter rests on the so-called "Learning-Hub" approach, which is instrumented with Web 3.0 mechanisms, in particular a semantic meta-search engine. To demonstrate the practical relevance of the approach, the media-supported junior study program ("Juniorstudium") of the Universität Rostock is presented, a project that prepares upper secondary school students for university study. Against the backdrop of this project's special requirements, the enormous range of functions and the great flexibility of Wiki-Learnia are demonstrated.
Abstract:
This paper presents fuzzy clustering algorithms to establish a grassroots ontology – a machine-generated weak ontology – based on folksonomies. Furthermore, it describes a search engine that retrieves vaguely associated terms and aggregates them into several meaningful cluster categories, based on the introduced weak grassroots ontology. A potential application of this ontology, weblog extraction, is illustrated using a simple example. Added value and possible future studies are discussed in the conclusion.
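The paper's specific fuzzy clustering procedure is not reproduced here; as a minimal sketch, standard fuzzy c-means captures the key property the abstract relies on, namely that one tag can belong to several clusters with graded membership. The tag vectors, cluster count c, and fuzzifier m below are illustrative assumptions:

```python
# Minimal fuzzy c-means sketch (assumptions: tags are embedded as numeric
# co-occurrence vectors; c and m are illustrative, not the paper's settings).
# Soft memberships let one tag sit in several clusters, which is what makes
# the resulting machine-generated ontology "weak".
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, eps=1e-5, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)              # membership rows sum to 1
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U_new = dist ** (-2.0 / (m - 1.0))         # standard FCM membership update
        U_new /= U_new.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < eps:
            return centers, U_new
        U = U_new
    return centers, U

tags = np.array([[2., 0., 1.], [1., 0., 2.], [0., 3., 1.], [0., 2., 2.]])
centers, U = fuzzy_c_means(tags)
print(U)  # graded membership of each tag vector in each cluster
```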
Abstract:
OBJECTIVE: To characterize PubMed usage over a typical day and compare it to previous studies of user behavior on Web search engines. DESIGN: We performed a lexical and semantic analysis of 2,689,166 queries issued on PubMed over 24 consecutive hours on a typical day. MEASUREMENTS: We measured the number of queries, number of distinct users, queries per user, terms per query, common terms, Boolean operator use, common phrases, result set size, and MeSH categories; used semantic measures to group queries into sessions; and studied the addition and removal of terms in consecutive queries to gauge search strategies. RESULTS: The size of the result sets from a sample of queries showed a bimodal distribution, with peaks at approximately 3 and 100 results, suggesting that one large group of queries was tightly focused while another was broad. Like Web search engine sessions, most PubMed sessions consisted of a single query. However, PubMed queries contained more terms. CONCLUSION: PubMed's usage profile should be considered when educating users, building user interfaces, and developing future biomedical information retrieval systems.
Abstract:
OBJECTIVES: To determine the characteristics of popular breast cancer related websites and whether more popular sites are of higher quality. DESIGN: The search engine Google was used to generate a list of websites about breast cancer. Google ranks search results by measures of link popularity, that is, the number of links to a site from other sites. The top 200 sites returned in response to the query "breast cancer" were divided into "more popular" and "less popular" subgroups by three different measures of link popularity: Google rank and the number of links reported independently by Google and by AltaVista (another search engine). MAIN OUTCOME MEASURES: Type and quality of content. RESULTS: More popular sites according to Google rank were more likely than less popular ones to contain information on ongoing clinical trials (27% v 12%, P=0.01), results of trials (12% v 3%, P=0.02), and opportunities for psychosocial adjustment (48% v 23%, P<0.01). These characteristics were also associated with a higher number of links as reported by Google and AltaVista. More popular sites by number of linking sites were also more likely to provide updates on other breast cancer research, information on legislation and advocacy, and a message board service. Measures of quality such as display of authorship, attribution or references, currency of information, and disclosure did not differ between groups. CONCLUSIONS: The popularity of websites is associated with type rather than quality of content. Sites that include content correlated with popularity may best meet the public's desire for information about breast cancer.
Abstract:
The purpose of the Internet-based teachware mySCM is to familiarize students of economics, informatics, and industrial engineering with quantitative methods for supply chain management. The input-output relationships of various optimization methods can be explored by varying input values, parameters, and alternative methods for the same problem. Students can gain extra benefit by passing so-called mini-exams that motivate active learning. mySCM can be used for free, around the clock, from any place where access to the Internet is available.
Abstract:
Two new approaches to quantitatively analyzing diffuse diffraction intensities from faulted layer stacking are reported. The parameters of a probability-based growth model are determined with two iterative global optimization methods: a genetic algorithm (GA) and particle swarm optimization (PSO). The results are compared with those from a third global optimization method, a differential evolution (DE) algorithm [Storn & Price (1997). J. Global Optim. 11, 341–359]. The algorithm efficiencies in the early and late stages of iteration are compared. The accuracy of the optimized parameters improves with increasing size of the simulated crystal volume. The wall-clock time for computing quite large crystal volumes can be kept within reasonable limits by the parallel calculation of many crystals (clones) generated for each model parameter set on a super- or grid computer. The faulted layer stacking in single crystals of trigonal three-pointed-star-shaped tris(bicyclo[2.1.1]hexeno)benzene molecules serves as an example for the numerical computations. Based on the numerical values of seven model parameters (reference parameters), nearly noise-free reference intensities of 14 diffuse streaks were simulated from 1280 clones, each consisting of 96 000 layers (reference crystal). The parameters derived from the reference intensities with GA, PSO and DE were compared with the original reference parameters as a function of the simulated total crystal volume. The statistical distribution of structural motifs in the simulated crystals is in good agreement with that in the reference crystal. The results found with the growth model for layer-stacking disorder are applicable to other disorder types and modeling techniques, Monte Carlo in particular.
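Of the three optimizers compared, DE is fully specified by the cited reference. Below is a minimal sketch of the canonical DE/rand/1/bin scheme of Storn & Price; the quadratic objective, population size, and control parameters F and CR are illustrative stand-ins, not the paper's diffuse-intensity residual or settings:

```python
# Minimal DE/rand/1/bin sketch after Storn & Price (1997), the DE variant
# cited above (assumptions: the quadratic objective and the pop/F/CR settings
# stand in for the paper's diffuse-intensity residual and actual setup).
import numpy as np

def differential_evolution(f, bounds, pop=20, F=0.8, CR=0.9, gens=200, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    X = lo + rng.random((pop, len(bounds))) * (hi - lo)
    fit = np.array([f(x) for x in X])
    for _ in range(gens):
        for i in range(pop):
            idx = [j for j in range(pop) if j != i]
            a, b, c = X[rng.choice(idx, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)   # differential mutation
            cross = rng.random(len(bounds)) < CR        # binomial crossover
            cross[rng.integers(len(bounds))] = True     # force >= 1 mutant gene
            trial = np.where(cross, mutant, X[i])
            f_trial = f(trial)
            if f_trial <= fit[i]:                       # greedy selection
                X[i], fit[i] = trial, f_trial
    return X[fit.argmin()], fit.min()

# Seven parameters in [0, 1], mirroring the seven-model-parameter fit above;
# the stand-in objective has its optimum at 0.3 for every parameter.
best, best_val = differential_evolution(lambda x: float(np.sum((x - 0.3) ** 2)),
                                        bounds=[(0, 1)] * 7)
print(best, best_val)
```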