923 results for Search Engine Optimization Methods
Abstract:
Recently, the target function for crystallographic refinement has been improved through a maximum likelihood analysis, which makes proper allowance for the effects of data quality, model errors, and incompleteness. The maximum likelihood target reduces the significance of false local minima during the refinement process, but it does not completely eliminate them, necessitating the use of stochastic optimization methods such as simulated annealing for poor initial models. It is shown that the combination of maximum likelihood with cross-validation, which reduces overfitting, and simulated annealing by torsion angle molecular dynamics, which simplifies the conformational search problem, results in a major improvement of the radius of convergence of refinement and the accuracy of the refined structure. Torsion angle molecular dynamics and the maximum likelihood target function interact synergistically, the combination of both methods being significantly more powerful than each method individually. This is demonstrated in realistic test cases at two typical minimum Bragg spacings (dmin = 2.0 and 2.8 Å, respectively), illustrating the broad applicability of the combined method. In an application to the refinement of a new crystal structure, the combined method automatically corrected a mistraced loop in a poor initial model, moving the backbone by 4 Å.
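The abstract describes simulated annealing by torsion angle molecular dynamics against a maximum likelihood target, but gives no implementation. Below is a minimal, generic simulated annealing sketch in Python, assuming a placeholder one-dimensional target function and a Gaussian perturbation in place of the real maximum-likelihood target and torsion-angle moves; it only illustrates the stochastic acceptance step that lets refinement escape false local minima, not the authors' refinement program.

```python
import math
import random

def simulated_annealing(x0, target, perturb, t_start=1.0, t_end=1e-3, steps=10_000):
    """Generic simulated annealing: `target` is the cost to minimize (a stand-in
    for a maximum-likelihood refinement target) and `perturb` proposes a random
    move (a stand-in for a torsion-angle change)."""
    x, e = x0, target(x0)
    best_x, best_e = x, e
    for i in range(steps):
        t = t_start * (t_end / t_start) ** (i / steps)   # geometric cooling schedule
        x_new = perturb(x)
        e_new = target(x_new)
        # Always accept downhill moves; accept uphill moves with Boltzmann probability.
        if e_new < e or random.random() < math.exp(-(e_new - e) / t):
            x, e = x_new, e_new
            if e < best_e:
                best_x, best_e = x, e
    return best_x, best_e

# Toy usage: minimize a 1-D multimodal function (not a real refinement target).
f = lambda x: math.sin(5 * x) + 0.1 * x * x
print(simulated_annealing(2.0, f, lambda x: x + random.gauss(0, 0.3)))
```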
Abstract:
Introduction: To analyze the quality of the websites of school catering services and their food-education content, and to gain initial experience with the EDALCAT evaluation tool. Material and methods: Cross-sectional descriptive study. The study population consists of the websites of catering companies responsible for managing school canteens. The sample was obtained using the Google search engine and a ranking of the main catering companies by turnover, selecting those that had a website. For the pilot test, ten websites were selected according to geographic proximity to the city of Alicante and level of turnover. To evaluate the websites, a questionnaire (EDALCAT) was designed, consisting of a first block of quality predictors with 19 variables on reliability, design and navigation, and a second block of specific food-education content with 19 variables on content and educational activities. Results: Positive results were obtained for 31 of the 38 questionnaire variables, the exceptions being the items "Search tool", "Language" (40%) and "Help" (10%) in the quality-predictors block, and the items "Workshops", "Recipes", "Food and nutrition website" (40%) and "Examples" (30%) in the block of specific food-education content. All the websites evaluated exceeded 50% compliance with the quality criteria and the minimum food-education content, and only one of them failed to meet the established minimum activity level. Conclusions: The quality predictors and the specific food-education content gave good results for all the websites evaluated. Most of them obtained a high score overall and in the block-by-block analysis. After the pilot study the questionnaire was modified, yielding the definitive EDALCAT. In general, EDALCAT appears to be suitable for evaluating the quality of catering-service websites and their food-education content; however, the present study cannot be considered a validation of the instrument.
Abstract:
This work follows a feasibility study (187) which suggested that a process for purifying wet-process phosphoric acid by solvent extraction should be economically viable. The work was divided into two main areas: (i) chemical and physical measurements on the three-phase system, with or without impurities; (ii) process simulation and optimization. The object was to test the process technically and economically and to select the optimum solvent. The chemical equilibria and distribution curves for the system water - phosphoric acid - solvent were determined for the solvents n-amyl alcohol, tri-n-butyl phosphate, di-isopropyl ether and methyl isobutyl ketone. Both pure phosphoric acid and acid containing known amounts of naturally occurring impurities (FePO₄, AlPO₄, Ca₃(PO₄)₂ and Mg₃(PO₄)₂) were examined. The hydrodynamic characteristics of the systems were also studied. The experimental results obtained for drop size distribution were compared with those obtainable from Hinze's equation (32), and it was found that they deviated by an amount related to the turbulence. A comprehensive literature survey on the purification of wet-process phosphoric acid by organic solvents was made. The literature regarding solvent extraction fundamentals and equipment, and optimization methods for the envisaged process, was also reviewed. A modified form of the Kremser-Brown and Souders equation to calculate the number of contact stages was derived. The modification takes into account the special nature of the phosphoric acid distribution curves in the studied systems. The process flow-sheet was developed and simulated. Powell's direct search optimization method was selected, in conjunction with the linear search algorithm of Davies, Swann and Campey. The objective function was defined as the total annual manufacturing cost, and the program was employed to find the optimum operating conditions for any one of the chosen solvents. The final results demonstrated the following order of feasibility for purifying wet-process acid: di-isopropyl ether, methyl isobutyl ketone, n-amyl alcohol and tri-n-butyl phosphate.
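The thesis couples Powell's direct search method with the Davies, Swann and Campey line search to minimize the total annual manufacturing cost over a flow-sheet simulation; neither the flow-sheet model nor the original code appears in the abstract. The sketch below is a hedged illustration using SciPy's built-in Powell method on a hypothetical two-variable cost surrogate (solvent-to-feed ratio and number of contact stages, both assumed names), not the actual objective function used in the work.

```python
import numpy as np
from scipy.optimize import minimize

def annual_cost(x):
    """Hypothetical stand-in for the total annual manufacturing cost.
    x[0] = solvent-to-feed ratio, x[1] = number of contact stages (relaxed to a
    continuous variable). The real objective is evaluated by a flow-sheet simulation."""
    ratio, stages = np.abs(x)                    # keep the toy model defined everywhere
    capital = 50_000 * stages ** 0.6             # equipment cost grows with stage count
    solvent_makeup = 120_000 * ratio             # solvent losses grow with solvent use
    recovery_penalty = 300_000 * np.exp(-0.8 * ratio * stages)  # poor recovery is costly
    return capital + solvent_makeup + recovery_penalty

result = minimize(annual_cost, x0=[1.0, 4.0], method="Powell")
print(result.x, result.fun)
```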
Abstract:
When a query is passed to multiple search engines, each search engine returns a ranked list of documents. Researchers have demonstrated that combining results, in the form of a "metasearch engine", produces a significant improvement in coverage and search effectiveness. This paper proposes a linear programming mathematical model for optimizing the ranked list result of a given group of Web search engines for an issued query. An application with a numerical illustration shows the advantages of the proposed method.
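The abstract does not state the exact linear programming formulation. As a hedged illustration of rank aggregation by optimization, the sketch below merges the ranked lists of several hypothetical engines by assigning documents to merged positions so that the total footrule-style displacement from each engine's ranking is minimized; this assignment problem is the integral special case of such a linear program, and all lists and document names are invented for the example.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Ranked lists from three hypothetical engines for the same query
# (position in the list = rank of that document, best first).
lists = [
    ["d1", "d2", "d3", "d4"],
    ["d2", "d1", "d4", "d3"],
    ["d1", "d3", "d2", "d4"],
]
docs = sorted({d for lst in lists for d in lst})

# cost[i][j] = total penalty of placing document i at merged position j,
# summed over engines (absolute distance to each engine's rank).
cost = np.zeros((len(docs), len(docs)))
for lst in lists:
    for i, doc in enumerate(docs):
        for j in range(len(docs)):
            cost[i, j] += abs(lst.index(doc) - j)

rows, cols = linear_sum_assignment(cost)   # LP relaxation of this assignment is integral
merged = [doc for _, doc in sorted(zip(cols, [docs[i] for i in rows]))]
print(merged)   # e.g. ['d1', 'd2', 'd3', 'd4']
```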
Abstract:
* This work was financially supported by RFBF-04-01-00858.
Abstract:
Search engines sometimes apply a search to the full text of documents or web pages, but they can also restrict the search to selected parts of the documents, e.g. their titles. Full-text search may consume a lot of computing resources and time. It may be possible to save resources by applying the search to document titles only, assuming that the title of a document provides a concise representation of its content. We tested this assumption using the Google search engine. We ran search queries defined by users, distinguishing between two types of queries/users: queries of users who are familiar with the area of the search, and queries of users who are not. We found that title searches provide similar, and sometimes even slightly better, results compared with full-text searches. These results hold for both types of queries/users. Moreover, we found an advantage for title search when searching in unfamiliar areas, because the general terms used in queries about unfamiliar areas match better the general terms that tend to be used in document titles.
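As a small illustration of the two search modes compared in the study, the snippet below merely builds a title-restricted query string (using Google's documented allintitle: operator) and the corresponding full-text query for the same terms; it does not call any Google API, and the query terms are invented.

```python
def title_query(terms):
    """Restrict matching to page titles using Google's allintitle: operator."""
    return "allintitle: " + " ".join(terms)

def fulltext_query(terms):
    """Plain query: the engine matches against the full text of pages."""
    return " ".join(terms)

terms = ["stochastic", "optimization", "methods"]
print(title_query(terms))     # allintitle: stochastic optimization methods
print(fulltext_query(terms))  # stochastic optimization methods
```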
Abstract:
The design of interfaces to facilitate user search has become critical for search engines, e-commerce sites, and intranets. This study investigated the use of targeted instructional hints to improve search by measuring the quantitative effects on users' performance and satisfaction. The effects of syntactic, semantic and exemplar search hints on user behavior were evaluated in an empirical investigation using naturalistic scenarios. Combining the three search hint components, each with two levels of intensity, in a factorial design generated eight search engine interfaces. Eighty participants took part in the study and each completed six realistic search tasks. Results revealed that the inclusion of search hints improved user effectiveness, efficiency and confidence when using the search interfaces, but with complex interactions that require specific guidelines for search interface designers. These design guidelines will allow search designers to create more effective interfaces for a variety of search applications.
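As a quick check of the factorial combination described above (three hint components, each at two intensity levels, giving eight interfaces), the snippet below enumerates the 2 x 2 x 2 design; the level labels "low" and "high" are assumptions for illustration.

```python
from itertools import product

# Three hint components, each at two levels of intensity (2 x 2 x 2 = 8 interfaces).
factors = {"syntactic": ["low", "high"],
           "semantic": ["low", "high"],
           "exemplar": ["low", "high"]}

interfaces = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(interfaces))   # 8
print(interfaces[0])     # {'syntactic': 'low', 'semantic': 'low', 'exemplar': 'low'}
```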
Abstract:
Background: Information seeking is an important coping mechanism for dealing with chronic illness. Despite a growing number of mental health websites, there is little understanding of how patients with bipolar disorder use the Internet to seek information. Methods: A 39 question, paper-based, anonymous survey, translated into 12 languages, was completed by 1222 patients in 17 countries as a convenience sample between March 2014 and January 2016. All patients had a diagnosis of bipolar disorder from a psychiatrist. Data were analyzed using descriptive statistics and generalized estimating equations to account for correlated data. Results: 976 (81 % of 1212 valid responses) of the patients used the Internet, and of these 750 (77 %) looked for information on bipolar disorder. When looking online for information, 89 % used a computer rather than a smartphone, and 79 % started with a general search engine. The primary reasons for searching were drug side effects (51 %), to learn anonymously (43 %), and for help coping (39 %). About 1/3 rated their search skills as expert, and 2/3 as basic or intermediate. 59 % preferred a website on mental illness and 33 % preferred Wikipedia. Only 20 % read or participated in online support groups. Most patients (62 %) searched a couple times a year. Online information seeking helped about 2/3 to cope (41 % of the entire sample). About 2/3 did not discuss Internet findings with their doctor. Conclusion: Online information seeking helps many patients to cope although alternative information sources remain important. Most patients do not discuss Internet findings with their doctor, and concern remains about the quality of online information especially related to prescription drugs. Patients may not rate search skills accurately, and may not understand limitations of online privacy. More patient education about online information searching is needed and physicians should recommend a few high quality websites.
Abstract:
Background: Worldwide, it is estimated that there are up to 150 million street children. Street children are an understudied, vulnerable population. While many studies have characterized street children’s physical health, few have addressed the circumstances and barriers to their utilization of health services.
Methods: A systematic literature review was conducted to understand the barriers and facilitators that street children face when accessing healthcare in low- and middle-income countries. Six databases were used to search for peer-reviewed literature, and one database and the Google search engine were used to find grey literature (theses, dissertations, reports, etc.). There were no exclusions based on study design. Studies were eligible for inclusion if the study population included street children, the study location was a low- or middle-income country as defined by the World Bank, and the subject pertained to healthcare.
In addition, a cross-sectional study was conducted between May 2015 and August 2015 with the goal of understanding the knowledge, attitudes, and health-seeking practices of street children residing in Battambang, Cambodia. Time-location and purposive sampling were used to recruit community (control) and street children. Both boys and girls between the ages of 10 and 18 were recruited. Data were collected through a verbally administered survey. The knowledge, attitudes and health-seeking practices of community and street children were compared to determine potential differences in healthcare utilization.
Results: Of the 2933 abstracts screened for inclusion in the systematic literature review, eleven articles met all the inclusion criteria and were found to be relevant. Cost and perceived stigma appeared to be the largest barriers street children faced when attempting to seek care. Street children preferred to receive care from a hospital; however, negative experiences and mistreatment by health providers deterred children from going there. Instead, street children would often self-treat and/or purchase medicine from a pharmacy or drug vendor. Family and peer support were found to be important for facilitating treatment.
The survey found results similar to those of the systematic review. Forty-one community children and thirty-four street children were included in the analysis. Both community and street children reported the hospital as their top choice for care. When asked whether someone went with them to seek care, both community and street children reported that family members, usually mothers, accompanied them. Community and street children both reported perceived stigma. All children had good knowledge of preventative care.
Conclusions: While most current services lack the proper accommodations for street children, there is a great potential to adapt them to better address street children’s needs. Street children need health services that are sensitive to their situation. Subsidies in health service costs or provision of credit may be ways to reduce constraints street children face when deciding to seek healthcare. Health worker education and interventions to reduce stigma are needed to create a positive environment in which street children are admitted and treated for health concerns.
Abstract:
This work applies a hybrid approach to the university curriculum-based course timetabling problem as presented in the 2nd International Timetabling Competition 2007 (ITC2007). The core of the hybrid approach is an artificial bee colony algorithm. Past work has applied artificial bee colony algorithms to university timetabling problems with a high degree of success. Nevertheless, inefficiencies remain in the associated search abilities in terms of exploration and exploitation. To improve the search abilities, this work introduces a hybrid approach, the Nelder-Mead great deluge artificial bee colony algorithm (NMGD-ABC), which combines additional positive elements of particle swarm optimization and the great deluge algorithm. In addition, a Nelder-Mead local search is incorporated into the great deluge algorithm to further enhance the performance of the resulting method. The proposed method is tested on the curriculum-based course timetabling instances of ITC2007. Experimental results reveal that the proposed method produces competitive results compared with other approaches described in the literature.
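The exact NMGD-ABC implementation is not given in the abstract. The sketch below shows only the great deluge component in generic form, assuming a numeric penalty function and neighbourhood move as stand-ins for a timetable evaluation and a timetabling move; it is a minimal illustration of the falling "water level" acceptance rule, not the authors' hybrid algorithm.

```python
import random

def great_deluge(initial, penalty, neighbour, iterations=50_000, decay=None):
    """Great deluge acceptance: a candidate is accepted if its penalty is not
    worse than a 'water level' that is lowered steadily toward zero."""
    current = initial
    current_pen = penalty(current)
    level = current_pen                       # start the water level at the initial penalty
    if decay is None:
        decay = current_pen / iterations      # linear lowering of the level
    best, best_pen = current, current_pen
    for _ in range(iterations):
        cand = neighbour(current)
        cand_pen = penalty(cand)
        if cand_pen <= current_pen or cand_pen <= level:
            current, current_pen = cand, cand_pen
            if cand_pen < best_pen:
                best, best_pen = cand, cand_pen
        level -= decay
    return best, best_pen

# Toy usage on a numeric surrogate for a timetable penalty function.
pen = lambda x: (x - 3.7) ** 2 + 2 * abs(round(x) - x)
move = lambda x: x + random.uniform(-0.5, 0.5)
print(great_deluge(10.0, pen, move))
```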
Abstract:
Sensor networks are formed of a set of devices that can individually take measurements of a particular environment and exchange information so as to obtain a high-level representation of the activities underway in the area of interest. Such distributed sensing, with many devices located close to the phenomena of interest, is relevant in fields such as surveillance, agriculture, environmental observation, industrial monitoring, etc. In this thesis we propose several approaches for the spatio-temporal optimization of the operations of these devices, determining where to place them in the environment and how to control them over time in order to detect the moving targets of interest. The first contribution is a realistic detection model representing the coverage of a sensor network in its environment. To this end we propose a probabilistic 3D model of a sensor's detection capability over its surroundings. This model also incorporates information about the environment through a visibility evaluation based on the field of view. Starting from this detection model, spatial optimization is performed by searching for the best location and orientation of each sensor in the network. For this purpose we propose a new gradient-descent-based algorithm that compared favourably with other generic "black-box" optimization methods in terms of terrain coverage, while being more computationally efficient. Once the sensors are placed in the environment, temporal optimization consists of properly covering a group of moving targets in the environment. First, the future positions of the moving targets detected by the sensors are predicted. The prediction is made either using the history of other targets that have crossed the same environment (long-term prediction) or using only the previous movements of the same target (short-term prediction). We propose new algorithms in each category that perform better than, or produce results comparable to, existing methods. Once the future target locations are predicted, the sensor parameters are optimized so that the targets are properly covered for some time, according to the predictions. To this end, we propose a heuristic sensor-control method based on the probabilistic trajectory predictions of the targets and on the probabilistic coverage of the targets by the sensors. Finally, the proposed spatial and temporal optimization methods were integrated and applied successfully, demonstrating a complete and effective approach for the spatio-temporal optimization of sensor networks.
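The thesis's 3D probabilistic detection model and gradient-based placement algorithm are not reproduced in the abstract. The sketch below is a much simplified 2-D stand-in, assuming a Gaussian per-sensor detection probability and a numerical gradient: it performs gradient ascent on the mean probability that sample points are detected by at least one sensor, illustrating the idea of gradient-based placement optimization without claiming to be the proposed algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
points = rng.uniform(0, 10, size=(200, 2))     # sample points to cover (2-D stand-in for the 3-D model)

def detection_prob(sensors, pts, sigma=1.5):
    """P(point detected by at least one sensor), Gaussian falloff with distance."""
    d2 = ((pts[:, None, :] - sensors[None, :, :]) ** 2).sum(-1)   # (n_pts, n_sensors)
    p_single = np.exp(-d2 / (2 * sigma ** 2))
    return 1 - np.prod(1 - p_single, axis=1)

def coverage(sensors_flat):
    sensors = sensors_flat.reshape(-1, 2)
    return detection_prob(sensors, points).mean()

def numeric_grad(f, x, eps=1e-4):
    """Central-difference gradient, used here instead of an analytic gradient."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (f(x + e) - f(x - e)) / (2 * eps)
    return g

x = rng.uniform(0, 10, size=6)                  # three sensors, (x, y) each
for _ in range(200):
    x += 2.0 * numeric_grad(coverage, x)        # fixed step size for simplicity
print(x.reshape(-1, 2), coverage(x))
```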
Abstract:
INTRODUCTION: Fear and anxiety are part of all human experiences and they may contribute directly to a patient's behavior. The Atraumatic Restorative Treatment (ART) is a technique that may be an alternative approach in treating special care patients or those who suffer fear or anxiety. OBJECTIVE: The aim of this paper is to review the ART technique as an alternative to reduce pain and fear during dental treatment. MATERIAL AND METHODS: A search for the term "atraumatic restorative treatment" was carried out in the MEDLINE search engine. References from the last 10 years containing at least one of the terms "psychological aspects", "discomfort", "fear", "anxiety" or "pain" were selected. RESULTS: A total of 120 references were found, of which only 17 fit the criteria. DISCUSSION: All authors agreed that ART promotes less discomfort for patients, contributing to a reduction of anxiety and fear during dental treatment. Results also indicated that ART minimizes pain reported by patients. CONCLUSIONS: The ART approach can be considered as having favorable characteristics for the patient, promoting an "atraumatic" treatment. This technique may be indicated for patients who suffer from fear or anxiety towards dental treatments and whose behavior may otherwise make treatment unfeasible or even impossible altogether.
Abstract:
Reverse engineering is usually the stepping stone of a variety of attacks aiming at identifying sensitive information (keys, credentials, data, algorithms) or vulnerabilities and flaws for broader exploitation. Software applications are usually deployed as identical binary code installed on millions of computers, enabling an adversary to develop a generic reverse-engineering strategy that, if it works on one code instance, can be applied to crack all the other instances. A solution to mitigate this problem is Software Diversity, which aims at creating several structurally different (but functionally equivalent) binary code versions out of the same source code, so that even if a successful attack can be elaborated for one version, it should not work on a diversified version. In this paper, we address the problem of maximizing software diversity from a search-based optimization point of view. The program to protect is subject to a catalogue of transformations to generate many candidate versions. The problem of selecting the subset of most diversified versions to be deployed is formulated as an optimization problem, which we tackle with different search heuristics. We show the applicability of this approach on some popular Android apps.
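The paper's catalogue of transformations and its specific search heuristics are not detailed in the abstract. As a hedged sketch of the subset-selection step, the code below uses one simple greedy heuristic: given a toy pairwise diversity score between candidate binaries (byte-difference fraction, an assumption standing in for real structural metrics), it repeatedly adds the candidate with the greatest summed diversity to the versions already chosen.

```python
import itertools
import random

def pairwise_diversity(a, b):
    """Toy structural-diversity score between two 'binaries' (byte strings):
    fraction of positions whose bytes differ. A real system would compare
    control-flow graphs, instruction sequences, etc."""
    n = min(len(a), len(b))
    return sum(x != y for x, y in zip(a[:n], b[:n])) / n

def greedy_diverse_subset(versions, k):
    """Greedy heuristic: start from the most diverse pair, then repeatedly add
    the version with the largest summed diversity to the versions already selected."""
    best_pair = max(itertools.combinations(range(len(versions)), 2),
                    key=lambda p: pairwise_diversity(versions[p[0]], versions[p[1]]))
    chosen = list(best_pair)
    while len(chosen) < k:
        rest = [i for i in range(len(versions)) if i not in chosen]
        nxt = max(rest, key=lambda i: sum(pairwise_diversity(versions[i], versions[j])
                                          for j in chosen))
        chosen.append(nxt)
    return chosen

# Toy catalogue of candidate diversified builds.
random.seed(1)
catalogue = [bytes([random.randrange(256) for _ in range(64)]) for _ in range(10)]
print(greedy_diverse_subset(catalogue, 4))
```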
Abstract:
X-ray radiation was discovered more than a hundred years ago, and with technological advances many forms of application have been developed. The constant use of these devices, owing to the valuable help they provide to dental clinicians, means that patients are increasingly exposed to this type of radiation. The main purpose of this descriptive review is to analyze the risks of x-ray radiation and the means of assessing it and protecting against it in the context of its use in dentistry. To this end, a literature review was carried out using the EBSCO host© search engine with the Boolean phrases (X-ray risk OR Radiography risk OR Dental x-ray risk) AND (Dental Medicine OR Oral), with the word "risk" also replaced by "evaluation" and by "protection". This search yielded 82 documents as bibliographic references. The literature review made it possible to understand the difference between ionizing and non-ionizing radiation, which devices are used in dentistry and how they generally work, the units used to measure radiation, the risks inherent in cumulative exposure and their biological consequences, how Portuguese legislation on ionizing radiation has evolved, the various ways of reducing the dose absorbed by the patient, and the protective measures for health professionals exposed to ionizing radiation.
Abstract:
Background: Third molar extraction is one of the most frequently performed procedures in oral surgery. The presence of an impacted third molar can give rise to a variety of complications. Pericoronitis, dental caries and periodontal disease are some of the indications for extraction of impacted third molars. Prophylactic antibiotic therapy for the prevention of postoperative complications such as alveolitis and surgical site infection is still a subject of some controversy, particularly in healthy individuals. Objective: To study the need for prophylactic antibiotic therapy in third molar extraction and its potential risks and benefits. Materials and Methods: A bibliographic search was carried out in the MEDLINE/PubMed database and in the ResearchGate search engine. In addition, articles of interest were collected from the retrieved bibliography, and some books were consulted to complement the information obtained. Conclusion: There is no consensus regarding antibiotic prophylaxis in third molar extraction. Dentists should therefore carefully assess the clinical status of each patient in order to make an informed decision on the administration of antibiotics to prevent postoperative complications of impacted third molar surgery.