959 results for Web access characterization


Relevance:

80.00%

Publisher:

Abstract:

In recent years, several middleware platforms for Wireless Sensor Networks (WSN) have been proposed. Most of these platforms do not consider how to integrate components from generic middleware architectures. Many requirements must be considered in a middleware design for WSN, among them the ability to modify the middleware's source code without changing its external behavior. It is therefore desirable to have a generic middleware architecture that can offer an optimal configuration according to the requirements of the application. Adopting a middleware based on a component model is a promising approach because it allows better abstraction, low coupling, modularization, and built-in management features. Another problem in current middleware is the treatment of interoperability between sensor networks and external networks such as the Web. Most current middleware lacks the functionality to access the data provided by the WSN via the World Wide Web, treating these data as Web resources that can be accessed through protocols already adopted on the World Wide Web. This work therefore presents Midgard, a component-based middleware specifically designed for WSNs that adopts the microkernel and REST architectural patterns. The microkernel pattern complements the component model: the microkernel can be understood as a component that encapsulates the system core and is responsible for initializing core services only when they are needed and removing them when they are no longer needed. REST, in turn, defines a standardized way for different applications to communicate based on standards adopted on the Web, making it possible to treat WSN data as Web resources accessible through protocols already adopted on the World Wide Web. The main goals of Midgard are: (i) to provide easy Web access to data generated by the WSN, exposing such data as Web resources following the principles of the Web of Things paradigm; and (ii) to allow the WSN application developer to instantiate only the specific services required by the application, thus generating a customized middleware and saving node resources. Midgard allows the WSN to be used as a set of Web resources while providing a cohesive and weakly coupled software architecture, addressing interoperability and customization. In addition, Midgard provides two services needed by most WSN applications: (i) configuration and (ii) inspection and adaptation. New services can be implemented by third parties and easily incorporated into the middleware because of its flexible and extensible architecture. According to the assessment, Midgard provides interoperability between the WSN and external networks, such as the Web, as well as between different applications within a single WSN. We also assessed Midgard's memory consumption, application image size, size of messages exchanged in the network, response time, overhead, and scalability. During the evaluation, Midgard proved to satisfy its goals and to be scalable without consuming resources prohibitively.
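
As a rough illustration of the Web of Things style of access described above, the sketch below exposes a single sensor reading as an HTTP resource returning JSON. It is written in Python with Flask purely for illustration; Midgard itself targets resource-constrained sensor nodes, and the URL layout, node identifier, and read_temperature() helper are hypothetical.

```python
# Illustrative sketch only: a sensor value exposed as an addressable Web
# resource, in the spirit of the Web of Things access described above.
from flask import Flask, jsonify

app = Flask(__name__)

def read_temperature(node_id: str) -> float:
    # Placeholder for a real sensor/gateway read; returns a fixed value here.
    return 23.5

@app.route("/nodes/<node_id>/temperature", methods=["GET"])
def temperature(node_id):
    # Each reading is addressable by URL and returned as JSON, so any HTTP
    # client can consume WSN data as an ordinary Web resource.
    return jsonify({"node": node_id, "unit": "celsius",
                    "value": read_temperature(node_id)})

if __name__ == "__main__":
    app.run(port=8080)
```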

Relevance:

80.00%

Publisher:

Abstract:

Environmental monitoring of aquatic systems is an important tool to support the decisions of policy makers and environmental managers. Long-term, continuous collection of environmental data is fundamental to the understanding of an aquatic system. This paper presents the integrated system for environmental monitoring (SIMA), a long-term time-series system with a web-based archive for limnological and meteorological data. The following environmental parameters are measured by SIMA: chlorophyll-a (µg L⁻¹), water surface temperature (°C), water column temperature from a thermistor string (°C), turbidity (NTU), pH, dissolved oxygen concentration (mg L⁻¹), electrical conductivity (µS cm⁻¹), wind speed (m s⁻¹) and direction (°), relative humidity (%), shortwave radiation (W m⁻²) and barometric pressure (hPa). The data are collected at a preprogrammed time interval (1 hour) and transmitted by satellite in quasi-real time to any user within 2500 km of the acquisition point. So far, 11 hydroelectric reservoirs are being monitored with the SIMA buoy. Basic statistics (mean and standard deviation) and an example of the time series of some parameters are displayed in a database with web access. However, sensor and satellite problems occurred due to the high data acquisition frequency. Sensor problems occurred because of the environmental characteristics of each aquatic system: water quality sensors degrade rapidly in acidic waters, rendering the collected data invalid, and data are also rendered invalid when sensors become infested with periphyton. Problems also occurred with the satellites' reception of system data when they passed over the buoy antenna, and data transfer at some inland locations was not completed due to the satellite constellation position. Nevertheless, the integrated system of water quality and meteorological parameters is an important tool for understanding aquatic system dynamics. It can also be used to build hydrodynamic models of the aquatic system, allowing the study of meteorological effects on the water body.
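
A hypothetical sketch of how one hourly SIMA acquisition could be represented, using the parameters and units listed in the abstract; the field names and the Python representation are assumptions, not the system's actual data format.

```python
# Hypothetical record layout for one hourly SIMA acquisition; units follow
# the abstract above, field names are invented for illustration.
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class SimaRecord:
    timestamp: datetime               # one record per preprogrammed hour
    chlorophyll_a: float              # µg L-1
    surface_temperature: float        # °C
    column_temperature: List[float]   # °C, one value per thermistor in the string
    turbidity: float                  # NTU
    ph: float
    dissolved_oxygen: float           # mg L-1
    conductivity: float               # µS cm-1
    wind_speed: float                 # m s-1
    wind_direction: float             # degrees
    relative_humidity: float          # %
    shortwave_radiation: float        # W m-2
    barometric_pressure: float        # hPa
```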

Relevance:

80.00%

Publisher:

Abstract:

Educational institutions of all levels invest large amounts of time and resources in instructional technology, with the goal of enhancing the educational effectiveness of the learning environment. The decisions made by instructors and institutions regarding the implementation of technology are guided by the perceptions of usefulness held by those in control. The primary objective of this mixed-methods study was to examine student and faculty perceptions of the technology used in general education courses at a community college. This study builds upon and challenges the assertions of writers such as Prensky (2001a, 2001b) and Tapscott (1998), who claim that a vast difference in technology perception exists between generational groups, resulting in a diminished usefulness of technology in instruction. In this study, data were gathered through student surveys and interviews and through faculty surveys and interviews. The analysis used Kendall's Tau test for correlation between various student and faculty variables in various groupings, as well as typological analysis of the transcribed interview data. The analysis of the quantitative data revealed no relationship between age and perception of technology's usefulness. A positive relationship was found between the perceived frequency of technology use and the perceived effectiveness of technology, suggesting that both faculty members and students believed that the more technology is used, the more useful it is in instruction. The analysis of the qualitative data revealed that both faculty and students perceive technology to be useful, and that the most significant barriers to technology's usefulness include faulty hardware and software systems, lack of user support, and lack of training for faculty. The results suggest that the differences in perception of technology between generations proposed by Prensky may not exist when comparing adults from the younger generation with adults from the older generation. Further, the study suggests that institutions continue to invest in instructional technology, with a focus on high levels of support and training for faculty, and on more universal availability of specific technologies, including web access, in-class video, and presentation software. Adviser: Ronald Joekel
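
For illustration only, a correlation of the kind described above can be computed with SciPy's Kendall's tau; the ratings below are invented and are not the study's data.

```python
# Illustrative only: computing a Kendall's tau correlation between perceived
# frequency of technology use and perceived usefulness. Data are invented.
from scipy.stats import kendalltau

# e.g., per-respondent ratings on a 1-5 scale
frequency_of_use = [2, 3, 4, 4, 5, 1, 3, 5, 2, 4]
perceived_usefulness = [2, 3, 3, 4, 5, 1, 4, 5, 2, 4]

tau, p_value = kendalltau(frequency_of_use, perceived_usefulness)
print(f"Kendall's tau = {tau:.2f}, p = {p_value:.3f}")
```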

Relevance:

80.00%

Publisher:

Abstract:

Latest issue consulted: New ser., v. 15, no. 2 (2001).

Relevance:

80.00%

Publisher:

Abstract:

Presented is webComputing, a general framework of mathematically oriented services including remote access to hardware and software resources for mathematical computations, and a web interface to dynamic interactive computations and visualization in a diversity of contexts: mathematical research and engineering, computer-aided mathematical/technical education, and distance learning. webComputing builds on the innovative webMathematica technology, which connects the technical computing system Mathematica to a web server and provides tools for building dynamic and interactive web interfaces to Mathematica-based functionality. Discussed are the conception and some of the major components of the webComputing service: Scientific Visualization, Domain-Specific Computations, Interactive Education, and Authoring of Interactive Pages.

Relevance:

80.00%

Publisher:

Abstract:

This research presents several components encompassing the scope of the objective of Data Partitioning and Replication Management in Distributed GIS Databases. Modern Geographic Information System (GIS) databases are often large and complicated, so data partitioning and replication management problems need to be addressed in the development of an efficient and scalable solution. Part of the research is to study the patterns of geographic raster data processing and to propose algorithms that improve the availability of such data. These algorithms and approaches target the granularity of geographic data objects as well as data partitioning in geographic databases in order to achieve high data availability and Quality of Service (QoS), considering distributed data delivery and processing. To achieve this goal, a dynamic, real-time approach is proposed for mosaicking digital images of different temporal and spatial characteristics into tiles. This dynamic approach reuses digital images on demand and generates mosaicked tiles only for the required region according to the user's requirements, such as resolution, temporal range, and target bands, reducing storage redundancy and utilizing available computing and storage resources more efficiently. Another part of the research pursued methods for efficiently acquiring GIS data from external heterogeneous databases and Web services, as well as enhancements to end-user GIS data delivery, automation, and 3D virtual reality presentation. Vast numbers of computing, network, and storage resources available on the Internet sit idle or underutilized. The proposed "Crawling Distributed Operating System" (CDOS) approach employs such resources and creates benefits for the hosts that lend their CPU, network, and storage resources to be used in a GIS database context. The results of this dissertation demonstrate effective ways to develop a highly scalable GIS database. The approach developed in this dissertation has resulted in the creation of the TerraFly GIS database, which is used by the US government, researchers, and the general public to facilitate Web access to remotely sensed imagery and GIS vector information.
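
The on-demand mosaicking idea can be sketched as follows: a tile request carries a region, temporal range, resolution and band list, and only source images that satisfy it are selected for reuse. All names and the selection criteria below are simplifications assumed for illustration; this is not the TerraFly implementation.

```python
# Schematic sketch of on-demand tile mosaicking: select only the source
# images needed for a requested region/time/resolution/bands, so existing
# imagery is reused instead of precomputing full-coverage mosaics.
from dataclasses import dataclass
from datetime import datetime
from typing import List, Tuple

BBox = Tuple[float, float, float, float]  # (min_lon, min_lat, max_lon, max_lat)

@dataclass
class SourceImage:
    bbox: BBox
    acquired: datetime
    resolution_m: float
    bands: List[str]

@dataclass
class TileRequest:
    bbox: BBox
    start: datetime
    end: datetime
    resolution_m: float
    bands: List[str]

def overlaps(a: BBox, b: BBox) -> bool:
    # True if the two bounding boxes intersect.
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

def select_sources(req: TileRequest, catalog: List[SourceImage]) -> List[SourceImage]:
    """Pick only the catalog images relevant to the requested tile."""
    return [img for img in catalog
            if overlaps(img.bbox, req.bbox)
            and req.start <= img.acquired <= req.end
            and img.resolution_m <= req.resolution_m
            and set(req.bands) <= set(img.bands)]
```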

Relevance:

80.00%

Publisher:

Abstract:

Access to the Internet has grown exponentially in Latin America over the past decade. The International Telecommunications Union (ITU) estimates that in 2009 there were 144.5 million Internet users in South America, 6.4 million in Central America, and 8.2 million in the Caribbean, or a total of 159.2 million users in all of Latin America.[1] At that time, ITU reported an estimated 31 million Internet users in Mexico, which would bring the overall number of users in Latin America to 190.2 million people. More recent estimates published by Internet World Stats place Internet access at an estimated 204.6 million users out of a total population of 592.5 million in the region (this figure includes Mexico).[2] According to those figures, 34.5 per cent of the Latin American population now enjoys Internet access. In recent years, universal access policies contributed to the vast increase in digital literacy and Internet use in Argentina, Brazil, Chile, Colombia, and Costa Rica. Although Costa Rica was the first country in the region to adopt a policy of universal access, the most expansive and successful digital inclusion programs in the region have taken hold in Brazil and Chile. These two countries have allocated considerable resources to the promotion of digital literacy and Internet access among low-income and poor populations; in both cases, civil society groups significantly assisted in the promotion of inclusion at the grassroots level. Digital literacy and Internet access have come to represent, particularly in the area of education, a welcome complementary resource for populations chronically underserved in nations with a long-standing record of inadequate public social services. Digital inclusion is vastly expanding throughout the region, thanks to stabilizing economies, increasingly affordable technology, and the rapid growth in the supply of cellular mobile telephony. A recent study by the global advertising agency Razorfish revealed significant shifts in the demographics of digital inclusion in the major economies of South America, where Web access is rapidly increasing among the lower middle class and the working poor.[3] Several researchers have suggested that Internet access will bring about greater civic participation and engagement, although skeptics remain unsure this could happen in Latin America. Yet there have been some recent instances of political mobilization facilitated through the use of the Web and social media applications, beginning in Chile, where nationwide “smart mobs” demonstrated against former Chilean President Michelle Bachelet when she failed to enact education reforms in May 2006. The Internet has also been used by marginalized groups and by guerrilla groups to highlight their stories. In sum, Internet access in Latin America is no longer a medium restricted to the elite. It is rather a public sphere upon which civil society has staked its claim. Some of the examples noted in this study point toward a developing trend whereby civil society, through online grassroots movements, is able to effectively pressure public officials, instill transparency, and demand accountability in government. Access to the Internet has also made it possible for voices on the margins to participate in the conversation in a way that was never previously feasible.

[1] International Telecommunications Union [ITU], “Information Technology Public & Report,” accessed May 15, 2011, http://www.itu.int/.
[2] Internet World Stats, “Internet Usage Statistics for the Americas,” accessed March 24, 2011, http://www.internetworldstats.com/stats2.htm.
[3] J. Crump, “The finch and the fox,” London, UK (2010), http://www.slideshare.net/razorfishmarketing/the-finch-and-the-fox.

Relevance:

40.00%

Publisher:

Abstract:

delta-Atracotoxin-Ar1a (delta-ACTX-Ar1a) is the major polypeptide neurotoxin isolated from the venom of the male Sydney funnel-web spider, Atrax robustus. This neurotoxin targets both insect and mammalian voltage-gated sodium channels, where it competes with scorpion alpha-toxins for neurotoxin receptor site-3 to slow sodium-channel inactivation. Progress in characterizing the structure and mechanism of action of this toxin has been hampered by the limited supply of pure toxin from natural sources. In this paper, we describe the first successful chemical synthesis and oxidative refolding of the four-disulfide bond containing delta-ACTX-Ar1a. This synthesis involved solid-phase Boc chemistry using double coupling, followed by oxidative folding of purified peptide using a buffer of 2 M GdnHCl and glutathione/glutathiol in a 1:1 mixture of 2-propanol (pH 8.5). Successful oxidation and refolding was confirmed using both chemical and pharmacological characterization. Ion spray mass spectrometry was employed to confirm the molecular weight. H-1 NMR analysis showed identical chemical shifts for native and synthetic toxins, indicating that the synthetic toxin adopts the native fold. Pharmacological studies employing whole-cell patch clamp recordings from rat dorsal root ganglion neurons confirmed that synthetic delta-ACTX-Ar1a produced a slowing of the sodium current inactivation and hyperpolarizing shifts in the voltage-dependence of activation and inactivation similar to native toxin. Under current clamp conditions, we show for the first time that delta-ACTX-Ar1a produces spontaneous repetitive plateau potentials underlying the clinical symptoms seen during envenomation. This successful oxidative refolding of synthetic delta-ACTX-Ar1a paves the way for future structure-activity studies to determine the toxin pharmacophore.

Relevance:

40.00%

Publisher:

Abstract:

PADICAT is the web archive created in 2005 in Catalonia (Spain) by the Library of Catalonia (BC), the National Library of Catalonia, with the aim of collecting, processing and providing permanent access to the digital heritage of Catalonia. Its harvesting strategy is based on a hybrid model: massive harvesting of the SPA top-level domain; selective compilation of the web output of Catalan organizations; and focused harvesting of public events. The system provides open access to the whole collection on the Internet. We consider it necessary to complement the current search and visualization software with a new open-source tool, CAT (Curator Archiving Tool), composed of three modules aimed at effectively managing the human cataloguing process; publishing directories of the digital resources and special collections; and offering value-added statistical information to end users. Within the framework of the International Internet Preservation Consortium meeting (Vienna 2010), the progress in the development of this new tool, and the philosophy that motivated its design, are presented to the international community.

Relevance:

40.00%

Publisher:

Abstract:

The steps needed to publish on the Internet a database created with Microsoft Access, using ASP technology, are described. Starting from an example database created to manage the acquisitions process, the different steps required to make it queryable from the web are followed in tutorial form. Finally, some applications of ASP technology that may be useful for libraries and documentation centres are indicated.

Relevance:

40.00%

Publisher:

Abstract:

The elements needed to publish on the Internet a database created with Microsoft Access, using ASP technology, are described. Starting from an example database created to manage the acquisitions process, the different steps required to make it queryable from the web are followed in tutorial form. Finally, some applications of ASP technology that may be useful for libraries and documentation centres are indicated.
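
The article's examples use ASP; as a rough analogue of the same pattern (query the acquisitions database, emit the rows as an HTML table for the web), here is a sketch in Python with pyodbc. The Access driver string, file path, table and column names are assumptions, not the article's code.

```python
# Rough Python analogue (not the article's ASP code) of publishing an Access
# database to the web: query the acquisitions table and render rows as HTML.
import pyodbc

conn = pyodbc.connect(
    r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};"
    r"DBQ=C:\data\acquisitions.mdb"   # hypothetical database file
)
rows = conn.cursor().execute(
    "SELECT title, supplier, order_date FROM acquisitions ORDER BY order_date"
).fetchall()

print("<table>")
for title, supplier, order_date in rows:
    print(f"<tr><td>{title}</td><td>{supplier}</td><td>{order_date}</td></tr>")
print("</table>")
```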

Relevance:

40.00%

Publisher:

Abstract:

Current-day web search engines (e.g., Google) do not crawl and index a significant portion of the Web, and hence web users who rely on search engines alone are unable to discover and access a large amount of information in the non-indexable part of the Web. Specifically, dynamic pages generated from parameters provided by a user via web search forms (or search interfaces) are not indexed by search engines and cannot be found in searchers' results. Such search interfaces provide web users with online access to myriad databases on the Web. To obtain information from a web database of interest, a user issues a query by specifying query terms in a search form and receives the query results: a set of dynamic pages that embed the required information from the database. At the same time, issuing a query via an arbitrary search interface is an extremely complex task for any kind of automatic agent, including web crawlers, which, at least up to the present day, do not even attempt to pass through web forms on a large scale. In this thesis, our primary object of study is the huge portion of the Web (hereafter referred to as the deep Web) hidden behind web search interfaces. We concentrate on three classes of problems around the deep Web: characterizing the deep Web, finding and classifying deep web resources, and querying web databases.

Characterizing the deep Web: Though the term deep Web was coined in 2000, which is a long time ago for any web-related concept or technology, we still do not know many important characteristics of the deep Web. Another matter of concern is that the surveys of the deep Web carried out so far are predominantly based on studies of deep web sites in English. One can therefore expect that findings from these surveys may be biased, especially owing to a steady increase in non-English web content. In this way, surveying national segments of the deep Web is of interest not only to national communities but to the whole web community as well. In this thesis, we propose two new methods for estimating the main parameters of the deep Web. We use the suggested methods to estimate the scale of one specific national segment of the Web and report our findings. We also build and make publicly available a dataset describing more than 200 web databases from this national segment of the Web.

Finding deep web resources: The deep Web has been growing at a very fast pace, and it has been estimated that there are hundreds of thousands of deep web sites. Due to the huge volume of information in the deep Web, there has been significant interest in approaches that allow users and computer applications to leverage this information. Most approaches have assumed that search interfaces to the web databases of interest are already discovered and known to query systems. However, such assumptions do not hold, mostly because of the large scale of the deep Web: for any given domain of interest there are too many web databases with relevant content. Thus, the ability to locate search interfaces to web databases becomes a key requirement for any application accessing the deep Web. In this thesis, we describe the architecture of the I-Crawler, a system for finding and classifying search interfaces. The I-Crawler is intentionally designed to be used in deep Web characterization studies and for constructing directories of deep web resources. Unlike almost all other approaches to the deep Web proposed so far, the I-Crawler is able to recognize and analyze JavaScript-rich and non-HTML searchable forms.

Querying web databases: Retrieving information by filling out web search forms is a typical task for a web user, all the more so because the interfaces of conventional search engines are themselves web forms. At present, a user needs to manually provide input values to search interfaces and then extract the required data from the result pages. Manually filling out forms is cumbersome and often infeasible for complex queries, yet such queries are essential for many web searches, especially in e-commerce. Automating the querying and retrieval of data behind search interfaces is therefore desirable and essential for tasks such as building domain-independent deep web crawlers and automated web agents, searching for domain-specific information (vertical search engines), and extracting and integrating information from various deep web resources. We present a data model for representing search interfaces and discuss techniques for extracting field labels, client-side scripts and structured data from HTML pages. We also describe a representation of result pages and discuss how to extract and store the results of form queries. In addition, we present a user-friendly and expressive form query language that allows one to retrieve information behind search interfaces and extract useful data from the result pages based on specified conditions. We implement a prototype system for querying web databases and describe its architecture and component design.
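
As a simplified illustration of the "finding search interfaces" step, the sketch below flags an HTML page containing a form with a free-text input, a rough proxy for a searchable interface. The real I-Crawler also handles JavaScript-rich and non-HTML forms; this heuristic is an assumption made purely for illustration.

```python
# Simplified heuristic: a page that contains a <form> with a free-text
# <input> is treated as a candidate search interface to a web database.
from html.parser import HTMLParser

class FormFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_form = False
        self.searchable_forms = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "form":
            self.in_form = True
        elif self.in_form and tag == "input" and attrs.get("type", "text") == "text":
            # Inputs default to type="text" when the attribute is omitted.
            self.searchable_forms += 1

    def handle_endtag(self, tag):
        if tag == "form":
            self.in_form = False

def looks_like_search_interface(html: str) -> bool:
    finder = FormFinder()
    finder.feed(html)
    return finder.searchable_forms > 0

print(looks_like_search_interface(
    '<form action="/search"><input type="text" name="q"><input type="submit"></form>'
))  # True
```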

Relevance:

40.00%

Publisher:

Abstract:

This thesis deals with the fabrication and characterization of novel all-fiber components for access networks. All-fiber components offer distinctive advantages due to low forward and backward losses, an epoxy-free optical path and high power handling. A novel fabrication method is realized for monolithic 1x4 couplers, which are vital components in distributed passive optical networks. The fabrication method differs from conventional structures in having a symmetric coupling profile, and hence offers ultra-wideband performance and easy process control. A new structure for 1x4 couplers, formed by fusing five fibers, is proposed to achieve high uniformity, giving uniformity performance equivalent to that of 1x4 planar lightwave splitters. Isolation in fused-fiber WDM devices is improved by integrating long-period gratings. Packaging techniques for fused couplers are analyzed for long-term stability.