954 results for Google Web Tolkien -- Evaluation
Abstract:
This work defines the life cycle and the methodology to follow, according to user-centred design, for a web project with content aimed specifically at elderly people. To that end, it determines and justifies which usability evaluation methods and techniques are the most appropriate for designing a website of this kind.
Abstract:
The Maritime Traffic Control Web project (WCTM) is a web project that will manage information useful for maritime traffic control in a given area of the world. It uses an SOA architecture, based on web services, and provides real-time information about vessels using Google maps.
Abstract:
Weekly bulletin for healthcare professionals from the Secretaría General de Salud Pública y Participación Social of the Consejería de Salud.
Abstract:
The project consists of studying and evaluating different database management systems (DBMSs) for storing information in the context of the Semantic Web. The DBMSs must process, store and manage information that is classified according to semantic criteria and interrelated with related concepts, while allowing communication between systems in a way that is transparent to the user.
Abstract:
A web application for the automatic grading of tests.
Abstract:
The goal of this work is to develop a web application that manages the devices connected to an Asterisk PBX and generates the configuration files, both for the terminals and for the correct operation of the PBX. In addition, a server will be installed connecting the PBX to Google Apps, so that users can use Gmail's video chat as a softphone, allowing them to place and receive calls through the university's PBX from anywhere in the world.
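As a rough illustration of the configuration-file generation this abstract describes, the sketch below renders Asterisk SIP peer sections from a device list. The extensions, secrets and section layout are hypothetical examples, not the project's actual format.

```python
# Hedged sketch: generate Asterisk sip.conf peer entries from a device list.
# Extension numbers, secrets and the chosen options are illustrative only.
def sip_conf_entry(extension, secret, context="internal"):
    """Render one SIP peer section in Asterisk's INI-style sip.conf syntax."""
    return (
        f"[{extension}]\n"
        "type=friend\n"
        f"secret={secret}\n"
        f"context={context}\n"
        "host=dynamic\n"
    )

devices = [("1001", "s3cret"), ("1002", "pa55")]
config = "\n".join(sip_conf_entry(ext, pwd) for ext, pwd in devices)
print(config)
```

Generating the file from a single device inventory keeps the terminal and PBX configurations consistent with each other.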
Abstract:
AIM: The aim of this study was to evaluate a new pedagogical approach to teaching fluid, electrolyte and acid-base pathophysiology to undergraduate students. METHODS: The approach combines traditional lectures, the study of clinical cases on the web and a final interactive discussion of these cases in the classroom. On the web, the students are asked to select the laboratory tests that seem most appropriate for understanding the pathophysiological condition underlying the clinical case. The percentage of students who chose a given test is made available to the teacher, who uses it in an interactive session to stimulate discussion with the whole class. The same teacher used the same case studies for two consecutive years in the third year of the curriculum. RESULTS: The majority of students answered the questions on the web as requested and evaluated their experience with this form of teaching and learning positively. CONCLUSIONS: Complementing traditional lectures with online case-based studies and interactive group discussions therefore represents a simple means of promoting the learning and understanding of complex pathophysiological mechanisms. This simple problem-based approach to teaching and learning may be implemented across all fields of medicine.
Abstract:
Introduction: Our goal was to determine the web content of pest control services and examine the technical information they make available to users through their webpages. Method: A total of 70 webpages of biocide services in the province of Málaga (Spain) were analyzed. We used 15 evaluation indicators grouped into 5 parameters covering the service provider's details; the reliability of the information and services; the accuracy of content and writing style; technical resources; and interaction with users. Sectoral legislation, official product and delivery registers, standards and technical guides were used as test instruments. Results: Companies showed a remarkable degree of awareness in the adoption and use of new technologies. Negative aspects that may undermine user confidence were identified, relating to the reliability of the information and to deficiencies in the description of the service portfolio and company credentials. The integration and use of collaborative Web 2.0 platforms was poorly developed and underexploited. Discussion: User trust can be improved by acting on those aspects that affect the reliability of the information provided on the web.
Abstract:
BACKGROUND: The Internet is increasingly used as a source of information for mental health issues. The burden of obsessive compulsive disorder (OCD) may lead persons with diagnosed or undiagnosed OCD, and their relatives, to search for good quality information on the Web. This study aimed to evaluate the quality of Web-based information on English-language sites dealing with OCD and to compare the quality of websites found through a general and a medically specialized search engine. METHODS: Keywords related to OCD were entered into Google and OmniMedicalSearch. Websites were assessed on the basis of accountability, interactivity, readability, and content quality. The "Health on the Net" (HON) quality label and the Brief DISCERN scale score were used as possible content quality indicators. Of the 235 links identified, 53 websites were analyzed. RESULTS: The content quality of the OCD websites examined was relatively good. The use of a specialized search engine did not offer an advantage in finding websites with better content quality. A score ≥16 on the Brief DISCERN scale is associated with better content quality. CONCLUSION: This study shows the acceptability of the content quality of OCD websites. There is no advantage in searching for information with a specialized search engine rather than a general one. Practical implications: The Internet offers a number of high quality OCD websites. It remains critical, however, to have a provider-patient talk about the information found on the Web.
Abstract:
This final-year project presents the architecture and implementation of a web environment for describing and visualizing real instances of the TSP (Travelling Salesman Problem) on Google Maps, and for subsequently solving them using combinatorial optimization techniques.
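To make the optimization side of such an environment concrete, the sketch below implements the nearest-neighbour heuristic, one simple combinatorial technique for building a TSP tour. The city names and coordinates are hypothetical; the abstract does not say which techniques the project actually uses.

```python
# Hedged sketch of the nearest-neighbour TSP heuristic: starting from the
# first city, always visit the closest unvisited city next. Coordinates
# are illustrative stand-ins for the geographic points shown on the map.
import math

def nearest_neighbour_tour(cities):
    """Build a greedy tour over a dict mapping city name -> (x, y)."""
    names = list(cities)
    tour = [names[0]]
    unvisited = set(names[1:])
    while unvisited:
        last = cities[tour[-1]]
        nxt = min(unvisited, key=lambda c: math.dist(last, cities[c]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

cities = {"A": (0, 0), "B": (1, 0), "C": (5, 0), "D": (1, 1)}
print(nearest_neighbour_tour(cities))  # ['A', 'B', 'D', 'C']
```

The heuristic gives no optimality guarantee, but it produces a reasonable starting tour that stronger techniques (2-opt, metaheuristics) can then improve.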
Analysis and evaluation of techniques for the extraction of classes in the ontology learning process
Abstract:
This paper analyzes and evaluates, in the context of ontology learning, techniques for identifying and extracting candidate terms for the classes of a taxonomy. It also points out some inconsistencies that may occur during the preprocessing of the text corpus, and proposes techniques for obtaining good candidate terms for taxonomy classes.
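One baseline technique for the candidate-term extraction the paper evaluates is frequency filtering over a preprocessed corpus. The sketch below illustrates it; the corpus, stopword list and threshold are illustrative assumptions, not the paper's actual method.

```python
# Hedged sketch: frequency-based extraction of candidate class terms.
# Keep tokens that are not stopwords and occur at least min_freq times.
from collections import Counter

STOPWORDS = {"the", "a", "an", "of", "and", "is", "has", "in"}

def candidate_terms(corpus, min_freq=2):
    """Return terms frequent enough to be candidate taxonomy classes."""
    tokens = [w.lower() for doc in corpus for w in doc.split()]
    counts = Counter(t for t in tokens if t not in STOPWORDS)
    return sorted(t for t, n in counts.items() if n >= min_freq)

corpus = [
    "the engine drives the vehicle",
    "a vehicle is a machine",
    "the machine has an engine",
]
print(candidate_terms(corpus))  # ['engine', 'machine', 'vehicle']
```

Real systems refine this baseline with part-of-speech filters, multi-word term detection and statistical measures such as TF-IDF, which is where the preprocessing inconsistencies the paper discusses can creep in.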
Abstract:
The purpose of this paper is to analyze the current state of digital newspapers, paying special attention to their adaptation to the Web 2.0 environment. To carry out this work, the following variables, among others, were studied: the type of Web 2.0 tools that digital newspapers have incorporated into their websites, the way these tools have changed the process of creating journalistic products, the response that digital newspapers have received from their audience, and the new type of interaction created with users.
Abstract:
This paper presents a novel efficiency-based evaluation of sentence and word aligners. This assessment is critical for their reliable use in industrial scenarios. The evaluation shows that the resources required by the aligners differ rather broadly. Subsequently, we establish limitation mechanisms on a set of aligners deployed as web services. These results, paired with the quality expected from the aligners, allow providers to choose the most appropriate aligner for the task at hand.
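An efficiency evaluation of this kind boils down to measuring the resources an aligner consumes on a batch of sentence pairs. The sketch below times a toy stand-in aligner; the `align` function is purely illustrative and not one of the aligners the paper benchmarks.

```python
# Hedged sketch: wall-clock timing of an aligner over sentence pairs.
# The toy align() pairs words position by position, standing in for a
# real sentence/word aligner whose resource usage we want to measure.
import time

def align(src, tgt):
    """Toy word aligner: pair words position by position."""
    return list(zip(src.split(), tgt.split()))

def measure(pairs):
    """Return total wall-clock seconds spent aligning all pairs."""
    start = time.perf_counter()
    for src, tgt in pairs:
        align(src, tgt)
    return time.perf_counter() - start

pairs = [("the house", "la casa"), ("a dog", "un perro")]
print(f"aligned {len(pairs)} pairs in {measure(pairs):.6f}s")
```

In a web-service deployment, measurements like this feed the limitation mechanisms the paper describes, e.g. per-request time or memory quotas.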
Abstract:
The project consists of a web application for generating dashboards for different clients in an agile and dynamic way, thereby reducing the implementation time of a tool of this kind. The application will be developed in .NET on the server side and with jQuery and the Google Maps API on the client side; the reports and the database will all be implemented with Microsoft SQL Server 2008 R2.