971 results for web-site development
Abstract:
"June 22, 2006."
Abstract:
"B-286196"--P. 3.
Abstract:
This article presents a medical data-advisory web resource developed by the authors. The resource enables information exchange between consumers of medical services, the medical institutions that provide those services, and manufacturers of medical equipment and medications. The article describes the site's main sections and their purposes and capabilities.
Abstract:
Methods for accessing data on the Web have been the focus of active research over the past few years. In this thesis we propose a method for representing Web sites as data sources. We designed Data Extractor, a data retrieval solution that allows us to define queries to Web sites and process the resulting data sets. Data Extractor is being integrated into the MSemODB heterogeneous database management system; with its help, database queries can be distributed over both local and Web data sources within the MSemODB framework. Data Extractor treats Web sites as data sources, controlling query execution and data retrieval, and works as an intermediary between the applications and the sites. It uses a twofold “custom wrapper” approach to information retrieval: wrappers for the majority of sites are easily built using a powerful and expressive scripting language, while complex cases are handled by Java-based wrappers that draw on a specially designed library of data retrieval, parsing, and Web access routines. In addition to wrapper development, we thoroughly investigate issues associated with Web site selection, analysis, and processing. Data Extractor is designed to act as a data retrieval server as well as an embedded data retrieval solution. We also use it to create mobile agents that are shipped over the Internet to the client's computer to perform data retrieval on behalf of the user, which allows Data Extractor to distribute and scale well. This study confirms the feasibility of building custom wrappers for Web sites: the approach provides accurate data retrieval together with the power and flexibility to handle complex cases.
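The "custom wrapper" idea in this abstract can be illustrated with a minimal, self-contained sketch in Python. This is not the thesis's own implementation (which uses a dedicated scripting language and Java routines); the `TableRowExtractor` class, the `wrap_site` function, and the sample page are all hypothetical, showing only the general pattern of a site-specific wrapper that turns a fetched page into structured rows.

```python
# Hypothetical sketch of a site-specific "custom wrapper": parse one
# Web site's result page (here, an HTML table) into tuples that a
# mediator system could treat like rows from a database source.
from html.parser import HTMLParser


class TableRowExtractor(HTMLParser):
    """Collects the text of each <td> cell, grouped by <tr> row."""

    def __init__(self):
        super().__init__()
        self.rows = []
        self._row = None
        self._in_cell = False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr":
            if self._row:
                self.rows.append(self._row)
            self._row = None
        elif tag == "td":
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell and self._row is not None:
            self._row.append(data.strip())


def wrap_site(page_html):
    """Site-specific wrapper: parse a result page into value tuples."""
    parser = TableRowExtractor()
    parser.feed(page_html)
    return [tuple(row) for row in parser.rows]


page = ("<table><tr><td>aspirin</td><td>3.50</td></tr>"
        "<tr><td>ibuprofen</td><td>4.20</td></tr></table>")
print(wrap_site(page))  # [('aspirin', '3.50'), ('ibuprofen', '4.20')]
```

In a full system, `wrap_site` would also issue the HTTP request for a given query; the mediator (here, MSemODB) would then join these tuples with rows from other local or Web sources.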
Abstract:
The purpose of this study was to explore the attitudes, beliefs, and practices of community college professors regarding education for sustainable development (ESD). In-depth interviews with 14 professors from different disciplines were conducted. The participants taught at Miami Dade College, Florida, a Talloires Declaration signatory since 2006, and all had attended Green Studies professional development workshops. Written documents such as assignments and samples of student work were used for triangulation. The annual report of the college’s Earth Ethics Institute and its Web site served as additional sources. The interviews were recorded, transcribed, and analyzed for common themes. The Talloires Declaration’s 10-point action plan and the key characteristics of ESD (UN DESD, 2006) served as the conceptual framework. The study found that the professors considered ESD an essential issue. The majority discussed the economic and social aspects of ESD; however, the environmental aspect was mentioned most frequently. The professors’ conceptualizations of ESD were influenced by their experiences and evidenced by the metaphors they used. Although their engagement with ESD differed, the professors expressed optimism toward ESD-related teaching and learning. They regarded ESD as compatible with their subjects, and most had already been infusing sustainability into their courses or planned to do so. Additionally, the participants’ teaching practices reflected many of the characteristics of ESD. Even though the professors considered ESD challenging, they believed that they could make contributions to the college’s effort. The metaphor of “Planting a Seed” was frequently used to describe this holistic approach. The study also found that many professors regarded interpersonal relationships and communication as significant factors for the advancement of ESD. The participants described several challenges to integrating ESD at their college.
These related to time constraints, density of curriculum, institutional size and fragmentation, dearth of administrative support and incentives, students’ lack of academic preparation and sustainability awareness, students’ inability to focus on ESD because of personal, social, or economic circumstances, and professors’ frustration about a divisive atmosphere as a result of their engagement with sustainability. Despite these obstacles, the professors believed that ESD could be successfully woven into the community college experience.
Abstract:
Maintaining web content can be a difficult task, especially on websites where many users have permission to change the content. Wikis are one example of this type of website: while they allow rapid dissemination of knowledge, they also demand considerable effort to verify the quality of their content. In this thesis we analyze different approaches to modeling websites, especially for content verification, and we contribute an extension to the VeriFLog tool to make it better suited to verifying content on collaborative websites.
Abstract:
Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.
Abstract:
Recently, there has been a growing interest in the field of metabolomics, materialized by a remarkable growth in experimental techniques, available data, and related biological applications. Indeed, techniques such as Nuclear Magnetic Resonance, Gas or Liquid Chromatography, Mass Spectrometry, and Infrared and UV-visible spectroscopies have provided extensive datasets that can help in tasks such as biological and biomedical discovery, biotechnology, and drug development. However, as with other omics data, the analysis of metabolomics datasets poses multiple challenges, both in terms of methodologies and in the development of appropriate computational tools. Indeed, none of the available software tools addresses the multiplicity of existing techniques and data analysis tasks. In this work, we make available a novel R package, named specmine, which provides a set of methods for metabolomics data analysis, including data loading in different formats, pre-processing, metabolite identification, univariate and multivariate data analysis, machine learning, and feature selection. Importantly, the implemented methods provide adequate support for the analysis of data from diverse experimental techniques, integrating a large set of functions from several R packages in a powerful yet simple-to-use environment. The package, already available in CRAN, is accompanied by a web site where users can deposit datasets, scripts, and analysis reports to be shared with the community, promoting the efficient sharing of metabolomics data analysis pipelines.
Abstract:
This project was motivated by the explosion of Internet applications built on location-based services, such as mobility web portals and online vehicle-tracking applications. Google Maps makes it easy to add maps to a website through its API, but OpenLayers, a free JavaScript library, gives us the option of loading map layers and markers from any source. OpenStreetMap provides free geographic data, such as street and road maps. A careful study of the structure and grouping of data in the OSM format, together with the development of a server based on the tile model, are the main starting points for creating our own source of cartographic data. This project analyzes and processes DXF (Drawing eXchange Format) files, converting them to the OSM format. An OSM file contains the geographic information needed for the spatial database from which, among other applications, the resulting maps can be displayed in a vehicle-tracking application or on a web portal.
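The OSM format targeted by the conversion step above is a plain XML document of nodes (points with latitude/longitude) and ways (ordered node references, optionally tagged). The DXF-to-OSM step can be sketched roughly as follows; `polyline_to_osm`, the sample coordinates, and the `highway` tag are illustrative assumptions, and a real converter would also have to reproject DXF drawing coordinates to WGS84, which is omitted here.

```python
# Hypothetical sketch: write one polyline as OSM XML, i.e. a set of
# <node> elements plus one <way> whose <nd> children reference them.
import xml.etree.ElementTree as ET


def polyline_to_osm(points, way_id=-1, tags=None):
    """Build an OSM XML document for one polyline given as (lat, lon) pairs."""
    osm = ET.Element("osm", version="0.6", generator="dxf2osm-sketch")
    way = ET.Element("way", id=str(way_id))
    for i, (lat, lon) in enumerate(points, start=1):
        node_id = str(-i)  # negative ids mark objects not yet in the database
        ET.SubElement(osm, "node", id=node_id, lat=str(lat), lon=str(lon))
        ET.SubElement(way, "nd", ref=node_id)
    for key, value in (tags or {}).items():
        ET.SubElement(way, "tag", k=key, v=value)
    osm.append(way)
    return ET.tostring(osm, encoding="unicode")


xml = polyline_to_osm([(41.50, 2.15), (41.51, 2.16)],
                      tags={"highway": "residential"})
print(xml)
```

The resulting document has the shape a tile server or spatial-database importer expects: all referenced nodes first, then the way that strings them together.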
Abstract:
PTSA is the project resulting from the development of the Telematic Platform for Administrative Services (Plataforma Telemàtica de Serveis Administratius) proposed as an initiative by the Ripollet City Council. With the goal of promoting the use of secure communication channels between citizens and the Administration, it provides accreditation, registration, and query services. Integrated with the ripollet.cat portal, it implements the technologies for handling user accreditation through digital certificates and for the dynamic generation of electronically signed documents. Within the framework of Law 11/2007 on citizens' electronic access to public services, it is a major first step on the long road toward a complete, high-quality electronic Administration.
Abstract:
Development of a web application designed to grow into a virtual community of users, which also integrates a streaming transmission system. Through this system, users can broadcast or listen to a radio program in real time or offline.
Abstract:
Most life science processes involve, at the atomic scale, recognition between two molecules. The prediction of such interactions at the molecular level, by so-called docking software, is a non-trivial task. Docking programs have a wide range of applications ranging from protein engineering to drug design. This article presents SwissDock, a web server dedicated to the docking of small molecules on target proteins. It is based on the EADock DSS engine, combined with setup scripts for curating common problems and for preparing both the target protein and the ligand input files. An efficient Ajax/HTML interface was designed and implemented so that scientists can easily submit dockings and retrieve the predicted complexes. For automated docking tasks, a programmatic SOAP interface has been set up, and template programs can be downloaded in Perl, Python, and PHP. The web site also provides access to a database of manually curated complexes, based on the Ligand Protein Database. A wiki and a forum are available to the community to promote interactions between users. The SwissDock web site is available online at http://www.swissdock.ch. We believe it constitutes a step toward generalizing the use of docking tools beyond the traditional molecular modeling community.
Abstract:
The aim of this work is, on the one hand, to help identify the shortcomings, in both design and content, of the current website of the UOC Humanities and Philology Studies and, on the other, to propose a new design that includes restructuring the content.
Abstract:
This work defines the life cycle and methodology to be followed, according to user-centered design, for a web project oriented toward content for elderly users. To that end, it determines and justifies which usability evaluation methods and techniques are best suited to designing a website of this kind.