949 results for Web application
Abstract:
This thesis proposes the use of intelligent agents in online learning environments in order to improve student assistance and motivation through personalised content that takes into account the student's learning style and level of knowledge. The proposed agents act as personal assistants that help the student carry out learning activities while measuring their progress and motivation. The agent environment is built on a multi-agent architecture called MASPLANG, designed to provide adaptive support (adaptive presentation and navigation) to an educational hypermedia system developed at the Universitat de Girona to deliver virtual education over the Web. An important aspect of this proposal is the ability to build a hybrid student model that starts from a stereotypical model of the student based on learning styles and is gradually modified as the student interacts with the system (subjective preferences). Within the context of this thesis, learning is defined as the internal process that, under factors of change, results in the acquisition of the internal representation of a piece of knowledge or an attitude. This internal process cannot be measured directly, but only through externally observable demonstrations that constitute the behaviour related to the object of knowledge. Finally, this change is the result of experience or training, and its durability depends on factors such as motivation and commitment. MASPLANG is composed of two levels of agents: the intermediaries, called IA (information agents), at the lower level, and the interface agents, called PDA (assistant agents), at the upper level. The assistant agents attend to students while they work with the teaching material of a course or a learning lesson. This assistance consists of collecting and analysing the students' actions in order to offer personalised content, and of motivating the student during learning by offering feedback content, exercises adapted to their level of knowledge, and messages, through animated and attractive user interfaces. The information agents are responsible for maintaining the pedagogical and domain models and interact directly with the system's databases (the compendium of student activities and the domain model). The operating scenario of MASPLANG is defined by the type of users and the type of content it offers. Since its environment is an educational hypermedia system, users are classified as teachers, who define and prepare the content for adaptive learning, and students, who carry out the learning activities in a personalised way. The student's initial learning profile is captured by evaluating the ILS questionnaire (the diagnostic tool of the FSLSM learning-style model adopted for this study), which is assigned to the student on their first interaction with the system. This questionnaire consists of a set of questions of a psychological nature whose goal is to determine the student's wishes, habits and reactions, which then guide the personalisation of the content and of the learning environment. The student model is then built taking into account this learning profile and the level of knowledge obtained by analysing the student's actions in the environment.
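A minimal sketch of how such a hybrid student model could work, assuming invented class and attribute names (the thesis does not publish its implementation): a stereotype initialised from ILS questionnaire scores over the FSLSM dimensions, gradually adjusted by observed behaviour and exercise results.

```python
# Hypothetical sketch of a hybrid student model in the spirit of MASPLANG:
# an FSLSM stereotype from the ILS questionnaire, refined by interaction.
FSLSM_DIMENSIONS = ("active_reflective", "sensing_intuitive",
                    "visual_verbal", "sequential_global")

class StudentModel:
    def __init__(self, ils_scores, knowledge_level=0.0):
        # ils_scores: dict mapping each FSLSM dimension to an ILS score in [-11, 11],
        # normalised here to [-1, 1].
        self.learning_style = {d: ils_scores[d] / 11.0 for d in FSLSM_DIMENSIONS}
        self.knowledge_level = knowledge_level   # updated from exercise results

    def observe_interaction(self, dimension, evidence, rate=0.1):
        # Shift the stereotype towards the observed behaviour (evidence in [-1, 1]),
        # so subjective preferences gradually override the questionnaire profile.
        old = self.learning_style[dimension]
        self.learning_style[dimension] = (1 - rate) * old + rate * evidence

    def record_exercise(self, score):
        # Blend a new exercise score (0..1) into the knowledge estimate.
        self.knowledge_level = 0.7 * self.knowledge_level + 0.3 * score

# Example: a student diagnosed as strongly visual who keeps choosing text-based
# material; the model slowly drifts towards "verbal".
model = StudentModel({"active_reflective": 3, "sensing_intuitive": -5,
                      "visual_verbal": 9, "sequential_global": 1})
model.observe_interaction("visual_verbal", evidence=-1.0)
model.record_exercise(0.8)
```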
Abstract:
The iSAC project (Servei Intel·ligent d'Atenció Ciutadana via web, an intelligent web-based citizen attention service) started in January 2006, building on new scientific knowledge in intelligent agents together with the application of Information and Communication Technologies (ICT) and search engines. The existing citizen attention service consists of two areas: face-to-face attention at the offices and telephone attention through the Call Center. Staffing and opening-hour limitations reduce the effectiveness of this service. The aim is to develop a product with a technology capable of extending and improving the capacity and quality of citizen attention in public administrations, whatever their size. Even so, this project will be exploited especially by town councils, which citizens approach with all kinds of questions and doubts, usually not restricted to the local sphere. More specifically, the goal is to automate citizen attention through a web portal in order to provide a more effective service.
Abstract:
The amateur birding community has a long and proud tradition of contributing to bird surveys and bird atlases. Coordinated activities such as Breeding Bird Atlases and the Christmas Bird Count are examples of "citizen science" projects. With the advent of technology, Web 2.0 sites such as eBird have been developed to facilitate online sharing of data and thus increase the potential for real-time monitoring. However, as recently articulated in an editorial in this journal and elsewhere, monitoring is best served when based on a priori hypotheses. Harnessing citizen scientists to collect data following a hypothetico-deductive approach carries challenges. Moreover, the use of citizen science in scientific and monitoring studies has raised issues of data accuracy and quality. These issues are compounded when data collection moves into the Web 2.0 world. An examination of the literature from social geography on the concept of "citizen sensors" and volunteered geographic information (VGI) yields thoughtful reflections on the challenges of data quality/data accuracy when applying information from citizen sensors to research and management questions. VGI has been harnessed in a number of contexts, including for environmental and ecological monitoring activities. Here, I argue that conceptualizing a monitoring project as an experiment following the scientific method can further contribute to the use of VGI. I show how principles of experimental design can be applied to monitoring projects to better control for data quality of VGI. This includes suggestions for how citizen sensors can be harnessed to address issues of experimental controls and how to design monitoring projects to increase randomization and replication of sampled data, hence increasing scientific reliability and statistical power.
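As an illustration only (the article contains no code), a short Python sketch of the kind of randomisation and replication that could be imposed on citizen-sensor data collection; site and observer names are hypothetical.

```python
# Illustrative sketch: assigning citizen sensors to survey sites so that
# sampling is randomised and replicated, rather than left to wherever
# volunteers happen to go. Names are hypothetical.
import random

sites = [f"site_{i:02d}" for i in range(20)]        # candidate survey locations
volunteers = [f"observer_{i:02d}" for i in range(8)]
REPLICATES_PER_SITE = 3                              # independent counts per site

random.seed(42)                                      # reproducible design
selected_sites = random.sample(sites, k=10)          # random subset, not self-selected

assignments = []
for site in selected_sites:
    for replicate in range(REPLICATES_PER_SITE):
        observer = random.choice(volunteers)
        assignments.append((site, replicate + 1, observer))

for site, replicate, observer in assignments:
    print(f"{site}\treplicate {replicate}\t{observer}")
```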
Abstract:
The storage and processing capacity realised by computing has led to an explosion of data retention. We have now reached the point of information overload and must begin to use computers to process more complex information. In particular, the proposition of the Semantic Web has given structure to this problem, but it has yet to be realised practically. The largest of its problems is that of ontology construction; without a suitable automatic method, most ontologies will have to be encoded by hand. In this paper we discuss the current methods for semi- and fully automatic construction and their current shortcomings. In particular, we pay attention to the application of ontologies to products and to the practical application of those ontologies.
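A minimal sketch, assuming the rdflib library and invented class/property names, of the hand-encoded product-ontology fragment that automatic construction methods aim to replace:

```python
# Minimal product-ontology fragment built by hand with rdflib.
# The namespace, classes and properties are invented for illustration.
from rdflib import Graph, Namespace, Literal, RDF, RDFS

EX = Namespace("http://example.org/products#")
g = Graph()
g.bind("ex", EX)

# A tiny class hierarchy: Camera is a kind of Product.
g.add((EX.Product, RDF.type, RDFS.Class))
g.add((EX.Camera, RDF.type, RDFS.Class))
g.add((EX.Camera, RDFS.subClassOf, EX.Product))

# A property relating products to their manufacturer.
g.add((EX.manufacturedBy, RDF.type, RDF.Property))
g.add((EX.manufacturedBy, RDFS.domain, EX.Product))

# One concrete product instance.
g.add((EX.alphaCam100, RDF.type, EX.Camera))
g.add((EX.alphaCam100, EX.manufacturedBy, Literal("ExampleCorp")))

print(g.serialize(format="turtle"))
```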
Abstract:
Many producers of geographic information are now disseminating their data using open web service protocols, notably those published by the Open Geospatial Consortium. There are many challenges inherent in running robust and reliable services at reasonable cost. Cloud computing provides a new kind of scalable infrastructure that could address many of these challenges. In this study we implement a Web Map Service for raster imagery within the Google App Engine environment. We discuss the challenges of developing GIS applications within this framework and the performance characteristics of the implementation. Results show that the application scales well to multiple simultaneous users and performance will be adequate for many applications, although concerns remain over issues such as latency spikes. We discuss the feasibility of implementing services within the free usage quotas of Google App Engine and the possibility of extending the approaches in this paper to other GIS applications.
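A simplified sketch, not the authors' implementation, of the request handling a WMS GetMap endpoint involves; only the parameter parsing is shown, using Python's standard library and a hypothetical render_tile function as a placeholder for the raster rendering.

```python
# Simplified sketch of the WMS GetMap parameters a raster map service
# must parse before rendering. render_tile is a hypothetical placeholder.
from urllib.parse import parse_qs

def handle_getmap(query_string):
    params = {k.upper(): v[0] for k, v in parse_qs(query_string).items()}
    if params.get("REQUEST") != "GetMap":
        raise ValueError("only GetMap is sketched here")

    layers = params["LAYERS"].split(",")
    bbox = [float(x) for x in params["BBOX"].split(",")]   # minx, miny, maxx, maxy
    width = int(params["WIDTH"])
    height = int(params["HEIGHT"])
    crs = params.get("CRS", params.get("SRS", "EPSG:4326"))
    fmt = params.get("FORMAT", "image/png")

    # render_tile would look up the requested raster data and return image bytes.
    return render_tile(layers, bbox, width, height, crs, fmt)

def render_tile(layers, bbox, width, height, crs, fmt):
    raise NotImplementedError("image rendering is outside this sketch")
```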
Abstract:
The construction industry has incurred a considerable amount of waste as a result of poor logistics supply chain network management. Therefore, managing logistics in the construction industry is critical. An effective logistics system ensures delivery of the right products and services to the right players at the right time while minimising costs and rewarding all sectors based on value added to the supply chain. This paper reports on an on-going research study on the concept of context-aware service delivery in construction project supply chain logistics. As part of the emerging wireless technologies, an Intelligent Wireless Web (IWW) with context-aware computing capability represents the next-generation ICT application for construction-logistics management. This intelligent system has the potential to serve and improve construction logistics through access to context-specific data, information and services. Existing mobile communication deployments in the construction industry rely on static modes of information delivery and do not take into account the worker's changing context and dynamic project conditions. The major problems in these applications are the lack of context-specificity in the distribution of information, services and other project resources, and the lack of cohesion with the existing desktop-based ICT infrastructure. The research work focuses on identifying the context dimensions (such as user context, environmental context and project context), selecting technologies to capture context parameters (such as wireless sensors and RFID), and selecting supporting technologies such as wireless communication, the Semantic Web, Web Services, agents, etc. The process of integrating Context-Aware Computing and Web Services to facilitate the creation of an intelligent collaboration environment for managing construction logistics will take into account all the necessary critical parameters, such as storage, transportation, distribution and assembly, across both off-site and on-site project activities.
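To make the three context dimensions concrete, a small Python sketch with invented field names and rules, showing user, environmental and project context feeding a context-aware delivery decision:

```python
# Illustrative data model for the context dimensions named above (user,
# environment, project) and a trivial rule that selects context-specific
# services for a site worker. All fields and rules are invented.
from dataclasses import dataclass

@dataclass
class UserContext:
    role: str            # e.g. "crane operator", "site manager"
    location_zone: str   # zone reported by RFID / wireless sensors

@dataclass
class EnvironmentContext:
    weather: str
    network: str         # e.g. "wifi", "gprs"

@dataclass
class ProjectContext:
    phase: str           # e.g. "assembly", "transportation"
    pending_deliveries: int

def select_services(user, env, project):
    services = []
    if project.pending_deliveries and user.role == "site manager":
        services.append("delivery schedule update")
    if env.weather == "rain" and project.phase == "assembly":
        services.append("weather warning for lifting operations")
    if env.network == "gprs":
        services.append("low-bandwidth text summary only")
    return services

print(select_services(UserContext("site manager", "zone-B"),
                      EnvironmentContext("rain", "gprs"),
                      ProjectContext("assembly", 2)))
```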
Abstract:
Resource monitoring in distributed systems is required to understand the 'health' of the overall system and to help identify particular problems, such as dysfunctional hardware or faulty system or application software. Monitoring systems such as GridRM provide the ability to connect to any number of different types of monitoring agents and provide different views of the system, based on a client's particular preferences. Web 2.0 technologies, and in particular 'mashups', are emerging as a promising technique for rapidly constructing rich user interfaces that combine and present data in intuitive ways. This paper describes a Web 2.0 user interface that was created to expose resource data harvested by the GridRM resource monitoring system.
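A small sketch of the mashup pattern the paper describes: pull resource data from the monitoring service and reshape it for a web front end. The URL and JSON layout below are hypothetical; GridRM's real interfaces may differ.

```python
# Sketch of the mashup idea: fetch resource data from a monitoring
# service and reshape it for a browser widget. URL and JSON layout
# are placeholders, not GridRM's actual API.
import json
from urllib.request import urlopen

MONITORING_URL = "http://monitoring.example.org/gridrm/resources.json"

def load_resource_view(url=MONITORING_URL):
    with urlopen(url) as response:
        resources = json.load(response)
    # Keep only the fields a dashboard widget needs, flagging unhealthy nodes.
    return [
        {
            "name": r["hostname"],
            "cpu_load": r["cpu_load"],
            "status": "alert" if r["cpu_load"] > 0.9 else "ok",
        }
        for r in resources
    ]

if __name__ == "__main__":
    print(json.dumps(load_resource_view(), indent=2))
```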
Abstract:
Increasingly, distributed systems are being used to host all manner of applications. While these platforms provide a relatively cheap and effective means of executing applications, so far there has been little work on developing tools and utilities that can help application developers understand problems with the supporting software or the executing applications. To fully understand why an application executing on a distributed system is not behaving as expected, it is important that not only the application but also the underlying middleware and the operating system are analysed; otherwise issues could be missed, and overall performance profiling and fault diagnosis would certainly be harder to understand. We believe that one approach to profiling and analysing distributed systems and their associated applications is via the plethora of log files generated at runtime. In this paper we report on a system (Slogger) that utilises various emerging Semantic Web technologies to gather the heterogeneous log files generated by the various layers in a distributed system and unify them in a common data store. Once unified, the log data can be queried and visualised in order to highlight potential problems or issues that may be occurring in the supporting software or the application itself.
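A sketch, using rdflib and invented vocabulary terms, of the unification step such a system performs: turning heterogeneous log lines into triples in a common store that can then be queried. The ontology shown is not Slogger's.

```python
# Sketch of unifying log lines from different layers into one RDF store.
# The vocabulary (ex:loggedBy, ex:message, ...) is invented for illustration.
from rdflib import Graph, Namespace, Literal, RDF, URIRef

EX = Namespace("http://example.org/logs#")
g = Graph()

raw_logs = [
    ("middleware",  "2008-05-01T10:02:11", "job 42 queued"),
    ("os",          "2008-05-01T10:02:13", "disk /dev/sda1 90% full"),
    ("application", "2008-05-01T10:02:15", "job 42 failed: no space left"),
]

for i, (layer, timestamp, message) in enumerate(raw_logs):
    event = URIRef(f"http://example.org/logs#event{i}")
    g.add((event, RDF.type, EX.LogEvent))
    g.add((event, EX.loggedBy, Literal(layer)))
    g.add((event, EX.timestamp, Literal(timestamp)))
    g.add((event, EX.message, Literal(message)))

# Query the unified store, e.g. for everything mentioning the failing job.
q = """SELECT ?layer ?msg WHERE {
         ?e a ex:LogEvent ; ex:loggedBy ?layer ; ex:message ?msg .
         FILTER(CONTAINS(?msg, "job 42"))
       }"""
for layer, msg in g.query(q, initNs={"ex": EX}):
    print(layer, "|", msg)
```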
Abstract:
OWL-S is an application of OWL, the Web Ontology Language, that describes the semantics of Web Services so that their discovery, selection, invocation and composition can be automated. The research literature reports the use of UML diagrams for the automatic generation of Semantic Web Service descriptions in OWL-S. This paper demonstrates a higher level of automation by generating complete Web applications from OWL-S descriptions that have themselves been generated from UML. Previously, we proposed an approach for processing OWL-S descriptions in order to produce MVC-based skeletons for Web applications. The OWL-S ontology undergoes a series of transformations in order to generate a Model-View-Controller application implemented by a combination of Java Beans, JSP and Servlet code, respectively. In this paper, we show in detail the documents produced at each processing step. We highlight the connections between OWL-S specifications and executable code in the various Java dialects and show the Web interfaces that result from this process.
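The OWL-S to MVC pipeline can be hinted at with a deliberately simplified generator sketch; the process names and the servlet template below are invented and are not the authors' transformation rules.

```python
# Deliberately simplified sketch of the OWL-S -> MVC generation idea:
# for each atomic process found in an OWL-S description, emit a Java
# Servlet (controller) skeleton. Names and template are illustrative only.
SERVLET_TEMPLATE = """public class {name}Servlet extends javax.servlet.http.HttpServlet {{
    // Controller skeleton generated for OWL-S atomic process "{name}".
    protected void doGet(javax.servlet.http.HttpServletRequest req,
                         javax.servlet.http.HttpServletResponse resp)
            throws java.io.IOException {{
        // TODO: call the {name} model bean and forward to {name}.jsp
    }}
}}
"""

def generate_controllers(atomic_processes):
    return {name: SERVLET_TEMPLATE.format(name=name) for name in atomic_processes}

# Hypothetical processes extracted from an OWL-S ontology.
for name, source in generate_controllers(["SearchBook", "PlaceOrder"]).items():
    print(f"// ---- {name}Servlet.java ----")
    print(source)
```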
Abstract:
This degree project was carried out at SSAB Tunnplåt in Borlänge during the spring term of 2004 and comprises 10 weeks of work. SSAB currently handles its communication with the distribution warehouses by fax, telephone or e-mail. Since this is a rather time-consuming way of communicating, SSAB wants a smoother and faster communication solution. The solution SSAB wants is an external Web service solution for establishing secure communication with its distribution warehouses. In parallel with building the Web service solution, a maintenance model was developed; it describes what the maintenance organisation and its routines could look like once the solution is implemented. To create a secure connection to the Web service, a web client is used which in turn calls a COM+ component. This makes it possible to pass the certificate from the web client to the web server hosting the Web service. The COM+ component must have access to a user profile when it communicates with the Web service, in order to establish an SSL connection in the initial stage. The SSL connection is then placed inside the VPN tunnel that mVPN provides via WSSAL.
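For illustration only (the thesis builds this with a COM+ component, not Python), a sketch of the equivalent client-certificate SSL call using the requests library; the URL and file paths are placeholders.

```python
# Sketch of a client-certificate (mutual SSL/TLS) call to a Web service,
# analogous to what the COM+ component performs. URL and paths are placeholders.
import requests

SERVICE_URL = "https://webservice.example.com/DistributionService"

response = requests.post(
    SERVICE_URL,
    data=open("order_message.xml", "rb").read(),
    headers={"Content-Type": "text/xml"},
    cert=("client_certificate.pem", "client_key.pem"),  # client cert presented in the SSL handshake
    verify="ca_bundle.pem",                             # CA bundle used to verify the server
    timeout=30,
)
response.raise_for_status()
print(response.status_code, response.text[:200])
```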