957 results for end user programming


Relevance: 80.00%

Publisher: Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 80.00%

Abstract:

Recently, there has been growing interest in developing optical fiber networks to support the increasing bandwidth demands of multimedia applications, such as video conferencing and World Wide Web browsing. One technique for accessing the huge bandwidth available in an optical fiber is wavelength-division multiplexing (WDM). Under WDM, the optical fiber bandwidth is divided into a number of nonoverlapping wavelength bands, each of which may be accessed at peak electronic rates by an end user. By utilizing WDM in optical networks, we can achieve link capacities on the order of 50 THz. The success of WDM networks depends heavily on the available optical device technology. This paper is intended as a tutorial on some of the optical device issues in WDM networks. It discusses the basic principles of optical transmission in fiber and reviews the current state of the art in optical device technology. It introduces some of the basic components in WDM networks, discusses various implementations of these components, and provides insights into their capabilities and limitations. Then, this paper demonstrates how various optical components can be incorporated into WDM optical networks for both local and wide-area applications. Last, the paper provides a brief review of experimental WDM networks that have been implemented.
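The 50 THz figure can be sanity-checked from the width of the fiber's low-loss window. Taking an illustrative window of width Δλ ≈ 350 nm centred near λ ≈ 1450 nm (values assumed here for the arithmetic, not taken from the paper):

\[
\Delta\nu \;\approx\; \frac{c\,\Delta\lambda}{\lambda^{2}}
\;=\; \frac{(3\times10^{8}\,\mathrm{m/s})(350\times10^{-9}\,\mathrm{m})}{(1450\times10^{-9}\,\mathrm{m})^{2}}
\;\approx\; 5\times10^{13}\,\mathrm{Hz} \;=\; 50\,\mathrm{THz}.
\]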

Relevance: 80.00%

Abstract:

Computer and telecommunication networks are changing the world dramatically and will continue to do so in the foreseeable future. The Internet, primarily based on packet switches, provides very flexible data services such as e-mail and access to the World Wide Web. The Internet is a variable-delay, variable-bandwidth network that, in its initial phase, provided no guarantee on quality of service (QoS). New services are being added to the pure data-delivery framework of yesterday, and their high demands on capacity could lead to a “bandwidth crunch” at the core wide-area network, resulting in degradation of service quality. Fortunately, technological innovations have emerged which can provide relief to the end user by overcoming the Internet’s well-known delay and bandwidth limitations. At the physical layer, a major overhaul of existing networks has been envisaged, from electronic media (e.g., twisted pair and cable) to optical fiber, in wide-area, metropolitan-area, and even local-area settings. To exploit the immense bandwidth potential of optical fiber, interesting multiplexing techniques have been developed over the years.

Relevance: 80.00%

Abstract:

Lightpath scheduling is an important capability in next-generation wavelength-division multiplexing (WDM) optical networks: it reserves resources in advance for a specified time period while provisioning end-to-end lightpaths. In a dynamic environment, end-user requests for dynamic scheduled lightpath demands (D-SLDs) must be serviced without knowledge of future requests. Even though the starting time of a request may be hours or days away, the end user nevertheless expects a quick response as to whether the request can be satisfied. We propose a two-phase approach to dynamically schedule and provision D-SLDs. In the first phase, termed the deterministic lightpath scheduling phase, upon arrival of a lightpath request the network control plane schedules a path with guaranteed resources, so that the user gets a quick response with a deterministic lightpath schedule. In the second phase, termed the lightpath re-optimization phase, we re-provision some already scheduled lightpaths to improve network performance. We study two re-optimization scenarios for reallocating network resources while maintaining the existing lightpath schedules. Experimental results show that our proposed two-phase dynamic lightpath scheduling approach can greatly reduce network blocking.
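A minimal sketch of how such a two-phase scheme might be organized is given below. All names, data structures, and policies (first-fit path/wavelength search, per-link interval calendars) are illustrative assumptions; the abstract does not disclose the paper's actual algorithms.

# Phase 1 sketch: deterministic scheduling of a D-SLD on the first
# path/wavelength that is free for the whole requested time window.
class WavelengthCalendar:
    """Per (link, wavelength) list of reserved [start, end) intervals."""
    def __init__(self, links, num_wavelengths):
        self.busy = {(l, w): [] for l in links for w in range(num_wavelengths)}

    def is_free(self, path, w, start, end):
        # A reserved interval (s, e) conflicts iff it overlaps [start, end).
        return all(not (s < end and start < e)
                   for link in path
                   for (s, e) in self.busy[(link, w)])

    def reserve(self, path, w, start, end):
        for link in path:
            self.busy[(link, w)].append((start, end))

def schedule_dsld(calendar, candidate_paths, num_wavelengths, start, end):
    """Return (path, wavelength) immediately, or None if blocked."""
    for path in candidate_paths:          # e.g. k-shortest precomputed routes
        for w in range(num_wavelengths):  # wavelength continuity assumed
            if calendar.is_free(path, w, start, end):
                calendar.reserve(path, w, start, end)
                return path, w
    return None

# Phase 2 (re-optimization) would run in the background: pick already
# scheduled lightpaths and try to move them to other routes/wavelengths,
# keeping each time window intact, so future requests block less often.

cal = WavelengthCalendar(links=["AB", "BC"], num_wavelengths=2)
print(schedule_dsld(cal, [["AB", "BC"]], 2, start=10, end=20))  # (['AB', 'BC'], 0)
print(schedule_dsld(cal, [["AB", "BC"]], 2, start=15, end=25))  # (['AB', 'BC'], 1)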

Relevance: 80.00%

Abstract:

Spreadsheets are widely used but often contain faults. Thus, in prior work we presented a data-flow testing methodology for use with spreadsheets, which studies have shown can be used cost-effectively by end-user programmers. To date, however, the methodology has been investigated across a limited set of spreadsheet language features. Commercial spreadsheet environments are multiparadigm languages, utilizing features not accommodated by our prior approaches. In addition, most spreadsheets contain large numbers of replicated formulas that severely limit the efficiency of data-flow testing approaches. We show how to handle these two issues with a new data-flow adequacy criterion and automated detection of areas of replicated formulas, and report results of a controlled experiment investigating the feasibility of our approach.
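To make the replicated-formula issue concrete, here is a minimal sketch (not the paper's algorithm) that groups copied formulas by normalizing their cell references to relative offsets, so that each region of copies can be treated as a single unit when measuring test adequacy:

import re

def to_relative(formula, row, col):
    """Rewrite A1-style references as row/column offsets so that copied
    formulas in different cells normalize to the same string.
    Handles only single-letter columns -- enough for a sketch."""
    def repl(m):
        c = ord(m.group(1)) - ord("A")
        r = int(m.group(2)) - 1
        return f"R[{r - row}]C[{c - col}]"
    return re.sub(r"([A-Z])([0-9]+)", repl, formula)

def replicated_regions(sheet):
    """Group cells whose normalized formulas coincide; each group could be
    treated as one region for data-flow test adequacy purposes."""
    groups = {}
    for (row, col), formula in sheet.items():
        groups.setdefault(to_relative(formula, row, col), []).append((row, col))
    return [cells for cells in groups.values() if len(cells) > 1]

# C1, C2, C3 all hold copies of "=A1+B1" shifted down one row each:
sheet = {(0, 2): "=A1+B1", (1, 2): "=A2+B2", (2, 2): "=A3+B3"}
print(replicated_regions(sheet))  # [[(0, 2), (1, 2), (2, 2)]]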

Relevance: 80.00%

Abstract:

There is a wide range of video services over complex transmission networks, and in some cases end users fail to receive an acceptable quality level. In this paper, the different factors that degrade users' quality of experience (QoE) in video streaming services that use TCP as the transmission protocol are studied. In this specific service, the impairment factors are the number of pauses, their duration, and their temporal location. Subjective tests were conducted to measure the effect that each temporal segment has on the overall video quality. Because current subjective test methodologies are not adequate for assessing video streaming over TCP, some recommendations are provided here. At the application layer, a customized player is used to evaluate the behavior of the player buffer and, consequently, the end-user QoE. The video subjective test results demonstrate that there is a close correlation between application parameters and subjective scores. Based on this fact, a new metric named VsQM is defined, which considers the importance of the temporal location of pauses when assessing the user QoE of a video streaming service. A useful application scenario is also presented, in which the proposed metric is used to improve video services.
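The abstract does not give VsQM's definition, but a metric that weights pauses by their temporal location could plausibly take a shape such as (purely illustrative, not the published formula):

\[
\mathrm{VsQM} \;\propto\; \sum_{j=1}^{S} w_{j}\,\bigl(\alpha N_{j} + \beta T_{j}\bigr),
\]

where the video is split into S temporal segments, N_j and T_j are the number and total duration of pauses in segment j, the weights w_j encode the importance of each segment's location, and α, β would be fitted against the subjective scores.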

Relevance: 80.00%

Abstract:

There is a wide range of telecommunications services that transmit voice, video, and data through complex transmission networks, and in some cases the service does not reach an acceptable quality level for the end user. In this sense, methods for assessing video and voice quality play a very important role. This paper presents a classification scheme, based on different criteria, for the methods and metrics that have been studied in recent years. It also shows how video quality is affected by degradation in the transmission channel for two kinds of services: digital TV (ISDB-TB), due to fading in the air interface, and a video streaming service on an IP network, due to packet loss. For the digital TV tests, a scenario was set up in which the digital TV transmitter is connected to an RF channel emulator, different fading models are inserted, and at the end the videos are saved on a mobile device. The streaming video tests were performed in an isolated IP network scenario, in which several network conditions were scheduled, resulting in different qualities of video reception. The video quality assessment is performed using objective assessment methods: PSNR, SSIM, and VQM. The results show how losses in the transmission channel affect the quality of the end-user experience in both services studied.
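Of the three objective methods used, PSNR has the simplest closed form; for 8-bit video it is computed frame by frame as

\[
\mathrm{PSNR} = 10\,\log_{10}\frac{255^{2}}{\mathrm{MSE}},
\qquad
\mathrm{MSE} = \frac{1}{MN}\sum_{i=1}^{M}\sum_{j=1}^{N}\bigl(I(i,j)-K(i,j)\bigr)^{2},
\]

where I is the reference frame and K the received frame of size M × N. SSIM and VQM are defined by their respective standard references and involve structural and perceptual terms rather than pure pixel error.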

Relevance: 80.00%

Abstract:

XML similarity evaluation has become a central issue in the database and information communities, its applications ranging over document clustering, version control, data integration and ranked retrieval. Various algorithms for comparing hierarchically structured data, XML documents in particular, have been proposed in the literature. Most of them make use of techniques for finding the edit distance between tree structures, XML documents being commonly modeled as Ordered Labeled Trees. Yet, a thorough investigation of current approaches led us to identify several similarity aspects, i.e., sub-tree related structural and semantic similarities, which are not sufficiently addressed while comparing XML documents. In this paper, we provide an integrated and fine-grained comparison framework to deal with both structural and semantic similarities in XML documents (detecting the occurrences and repetitions of structurally and semantically similar sub-trees), and to allow the end-user to adjust the comparison process according to her requirements. Our framework consists of four main modules for (i) discovering the structural commonalities between sub-trees, (ii) identifying sub-tree semantic resemblances, (iii) computing tree-based edit operations costs, and (iv) computing tree edit distance. Experimental results demonstrate higher comparison accuracy with respect to alternative methods, while timing experiments reflect the impact of semantic similarity on overall system performance.
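For reference, the kind of ordered-tree edit distance such frameworks build on can be written, with unit costs and no optimization, as the following sketch (illustrative only; practical systems use optimized algorithms such as Zhang–Shasha, and this paper additionally weighs structural and semantic costs):

from functools import lru_cache

def size(forest):
    """Number of nodes in a forest (tuple of (label, children) trees)."""
    return sum(1 + size(children) for _, children in forest)

@lru_cache(maxsize=None)
def fdist(F, G):
    """Unit-cost edit distance between two ordered forests."""
    if not F and not G:
        return 0
    if not F:
        return size(G)                      # insert all remaining nodes of G
    if not G:
        return size(F)                      # delete all remaining nodes of F
    (l1, c1), (l2, c2) = F[-1], G[-1]       # rightmost tree of each forest
    return min(
        fdist(F[:-1] + c1, G) + 1,          # delete the root l1
        fdist(F, G[:-1] + c2) + 1,          # insert the root l2
        fdist(F[:-1], G[:-1]) + fdist(c1, c2) + (l1 != l2),  # match/relabel
    )

doc1 = ("book", (("title", ()), ("author", ())))
doc2 = ("book", (("title", ()), ("editor", ())))
print(fdist((doc1,), (doc2,)))              # 1 -- relabel author -> editor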

Relevance: 80.00%

Abstract:

Background: Recent advances in medical and biological technology have stimulated the development of new testing systems that provide huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of a genomics laboratory are typically characterized by a high degree of parallelism in testing and by constant procedure changes.

Results: This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduce the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients with updated results based on the most recent, validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control-flow specifications based on process algebra (ACP). The main difference between our approach and related work is that we join two important aspects: 1) process scalability, achieved through a relational database implementation, and 2) process correctness, ensured using process algebra. Furthermore, the software allows end users to define genetic tests without requiring any knowledge of business process notation or process algebra.

Conclusions: This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have proved the feasibility and shown the usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing through easy end-user interfaces.
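As an illustration of the ACP-style control-flow specifications mentioned above (step names are hypothetical, not taken from the CEGH system), a genetic test could be written with ACP's sequential (·) and parallel (∥) composition operators:

\[
\mathrm{GeneticTest} \;=\; \mathrm{extractDNA}\cdot(\mathrm{amplify}\parallel\mathrm{qualityControl})\cdot\mathrm{analyze}\cdot\mathrm{report},
\]

read as: extraction first, then amplification and quality control in parallel, followed by analysis and reporting in sequence. Storing each named step as a row in a relational database while validating the composed expression algebraically is what lets such a system combine scalability with provable correctness.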

Relevance: 80.00%

Abstract:

In many countries, buildings are responsible for a substantial part of energy consumption, and this share varies according to their energetic and environmental performance. The potential for major reductions in building consumption has been well documented in Brazil. Opportunities have been identified throughout the life cycle of buildings, in part because projects are replicated in diverse locations without the proper adjustments. This article offers a reflection on project processes and how they can be conducted in an integrated way, favoring the use of natural resources and lowering energy consumption. It concludes by indicating that the longest phase in the life cycle of a building is also the phase responsible for its largest energy consumption, not only because of its duration but also because of the interaction with the end user. Therefore, in order to harvest the energy-cost reduction potential of future buildings, designers need a holistic view of the surroundings, end users, materials, and methodologies.

Relevance: 80.00%

Abstract:

The miniaturization race in the hardware industry, aimed at a continuous increase of transistor density on a die, no longer brings corresponding application performance improvements. One of the most promising alternatives is to exploit the heterogeneous nature of common applications in hardware. Supported by reconfigurable computation, which has already proved its efficiency in accelerating data-intensive applications, this concept promises a breakthrough in contemporary technology development. Memory organization in such heterogeneous reconfigurable architectures becomes very critical. Two primary aspects introduce a sophisticated trade-off: on the one hand, a memory subsystem should provide a well-organized distributed data structure and guarantee the required data bandwidth; on the other hand, it should hide the heterogeneous hardware structure from the end user in order to support feasible high-level programmability of the system. This thesis explores heterogeneous reconfigurable hardware architectures and presents possible solutions to the problem of memory organization and data structure. Using the MORPHEUS heterogeneous platform as an example, the discussion follows the complete design cycle, from decision making and justification to hardware realization. Particular emphasis is placed on methods to support high system performance, meet application requirements, and provide a user-friendly programmer interface. As a result, the research introduces a complete heterogeneous platform enhanced with a hierarchical memory organization, which copes with its task by separating computation from communication, providing the reconfigurable engines with computation and configuration data, and unifying heterogeneous computational devices through local storage buffers. It is distinguished from related solutions by its distributed data-flow organization, specifically engineered mechanisms to operate on data in local domains, a communication infrastructure based on a Network-on-Chip, and thorough methods to prevent computation and communication stalls. In addition, a novel technique to accelerate memory access was developed and implemented.
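One classic way to "separate computation from communication" with local storage buffers is ping-pong (double) buffering. The sketch below is purely illustrative and is not the MORPHEUS platform's interface; in hardware, load would be a DMA transfer that genuinely overlaps with compute, whereas the sequential Python only mirrors the buffer hand-off logic:

def stream_process(tiles, load, compute):
    """Conceptually overlap loading tile i+1 with computing tile i."""
    buffers = [None, None]
    results = []
    buffers[0] = load(tiles[0])                 # prefetch the first tile
    for i in range(len(tiles)):
        current = buffers[i % 2]
        if i + 1 < len(tiles):                  # in hardware: kick off DMA for next tile
            buffers[(i + 1) % 2] = load(tiles[i + 1])
        results.append(compute(current))        # engine works on its local buffer
    return results

print(stream_process([1, 2, 3],
                     load=lambda t: t * 10,
                     compute=lambda b: b + 1))  # [11, 21, 31]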

Relevance: 80.00%

Abstract:

In recent years, due to the rapid convergence of multimedia services, the Internet, and wireless communications, there has been a growing trend toward heterogeneity (in terms of channel bandwidths, mobility levels of terminals, and end-user quality-of-service (QoS) requirements) in emerging integrated wired/wireless networks. Moreover, in today's systems a multitude of users coexists within the same network, each with their own QoS requirement and bandwidth availability. In this framework, embedded source coding, which allows partial decoding at various resolutions, is an appealing technique for multimedia transmission. This dissertation covers my PhD research, mainly devoted to the study of embedded multimedia bitstreams in heterogeneous networks, developed at the University of Bologna, advised by Prof. O. Andrisano and Prof. A. Conti, and at the University of California, San Diego (UCSD), where I spent eighteen months as a visiting scholar, advised by Prof. L. B. Milstein and Prof. P. C. Cosman. In order to improve multimedia transmission quality over wireless channels, joint source and channel coding optimization is investigated in a 2D time-frequency resource block for an OFDM system. We show that knowing the order of diversity in the time and/or frequency domain can assist image (video) coding in selecting optimal channel code rates (source and channel code rates). Then, adaptive modulation techniques, aimed at maximizing spectral efficiency, are investigated as another possible solution for improving multimedia transmissions. For both slow and fast adaptive modulation, the effects of imperfect channel estimation are evaluated, showing that the fast technique, optimal in ideal systems, may be outperformed by slow adaptive modulation when a realistic test case is considered. Finally, the effects of co-channel interference and of approximated bit error probability (BEP) expressions are evaluated for adaptive modulation techniques, providing new decision-region concepts and showing how the widely used BEP approximations can lead to a substantial loss in overall performance.
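One widely used approximation of the kind alluded to here is the exponential BEP bound for square M-QAM over an AWGN channel (quoted as standard background; the abstract does not state which approximations the dissertation evaluates):

\[
P_b \;\approx\; 0.2\,\exp\!\left(\frac{-1.5\,\gamma}{M-1}\right),
\]

where γ is the received SNR and M the constellation size. Inverting this expression in γ gives the SNR thresholds at which an adaptive-modulation scheme switches between constellations, which is why the accuracy of the approximation directly shapes the decision regions.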

Relevance: 80.00%

Abstract:

This thesis deals with context-aware services, smart environments, context management, and solutions for device and service interoperability. Multi-vendor devices offer an increasing number of services and end-user applications that base their value on the ability to exploit information originating from the surrounding environment by means of an increasing number of embedded sensors, e.g. GPS, compass, RFID readers, cameras, and so on. However, such devices are usually not able to exchange information because of the lack of shared data storage and common information exchange methods. A large number of standards and domain-specific building blocks are available and heavily used in today's products. However, the use of these solutions based on ready-to-use modules is not without problems: the integration and cooperation of different kinds of modules can be daunting because of growing complexity and dependency. In such scenarios it is interesting to have an infrastructure that makes the coexistence of multi-vendor devices easy, while enabling low-cost development and smooth access to services. This sort of technology glue should reduce both software and hardware integration costs by removing the trouble of interoperability. The result should also lead to faster and simpler design, development, and deployment of cross-domain applications. This thesis mainly focuses on software architectures supporting context-aware service providers, especially on the following subjects:

- user-preference-based service adaptation
- context management
- content management
- information interoperability
- multi-vendor device interoperability
- communication and connectivity interoperability

Experimental activities were carried out in several domains, including cultural heritage and indoor and personal smart spaces, all of which are considered significant test-beds in context-aware computing. The work evolved within European and national projects: on the European side, I carried out my research activity within EPOCH, the FP6 Network of Excellence on “Processing Open Cultural Heritage”, and within SOFIA, a project of the ARTEMIS JU on embedded systems. I worked in cooperation with several international establishments, including the University of Kent, VTT (the Technical Research Centre of Finland), and Eurotech. On the national side, I contributed to a one-to-one research contract between ARCES and Telecom Italia. The first part of the thesis focuses on the problem statement and related work, addressing interoperability issues and the related architecture components. The second part focuses on specific architectures and frameworks:

- MobiComp: a context management framework that I used in cultural heritage applications
- CAB: a context-, preference-, and profile-based application broker which I designed within the EPOCH Network of Excellence
- M3: a “Semantic Web based” information-sharing infrastructure for smart spaces, designed by Nokia within the European project SOFIA
- NoTa: a service- and transport-independent connectivity framework
- OSGi: the well-known Java-based service support framework

The final section is dedicated to the middleware, the tools, and the software agents developed during my doctorate to support context-aware services in smart environments.

Relevance: 80.00%

Abstract:

Precision Agriculture (PA) and the more specific branch of precision horticulture are two very promising sectors. They focus on the use of technologies in agriculture to optimize the use of inputs, so as to reach better efficiency and minimize waste of resources. This important objective has motivated many researchers and companies to search for new technological solutions. Sometimes the effort proved to be a good seed; sometimes it proved an unfeasible idea. As a result, PA, roughly 25 years after its birth, is still a “new” form of management, promising for the future, yet experts and researchers still report a low adoption rate. This work aims to contribute to finding the causes of this low adoption rate and to propose a methodological solution to the problem. The first step was to examine prior research on Precision Agriculture adoption, through both ex ante and ex post approaches. It was considered important to find connections between these two phases of a purchase experience: the ex ante studies deal with potential consumers' perceptions before a usage experience occurs, i.e. before purchasing a technology, while the ex post studies describe the drivers that made a farmer become an end user of PA technology. An example of consumer research is then presented: an ex ante study focused on a pre-prototype technology for fruit production. This kind of research can yield precious information about consumer acceptance before an advanced development phase of the technology is reached, leaving the possibility to change something with the least financial impact. The final step was to develop the pre-prototype technology that was the subject of the consumer acceptance research and to test its technical characteristics.

Relevance: 80.00%

Abstract:

Volatile amines are prominent indicators of food freshness, as they are produced during many microbiological food degradation processes. Monitoring and indicating the volatile amine concentration within the food package by intelligent packaging solutions might therefore be a simple yet powerful way to control food safety throughout the distribution chain.

In this context, this work aims at the formation of colourimetric amine-sensing surfaces on different substrates, especially transparent PET packaging foil. The colour change of the deposited layers should ideally be discernible by the human eye to facilitate the determination by the end user.

Different tailored zinc(II) and chromium(III) metalloporphyrins have been used as chromophores for the colourimetric detection of volatile amines. A new concept to increase the porphyrins' absorbance change upon exposure to amines is introduced. Moreover, the novel porphyrins' processability during the deposition process is increased by their enhanced solubility in non-polar solvents.

The porphyrin chromophores have been successfully incorporated into polysiloxane matrices on different substrates via a dielectric barrier discharge enhanced chemical vapour deposition. This process allows the use of nitrogen as a cheap and abundant plasma gas, produces minor amounts of waste and by-products, and can easily be introduced into (existing) roll-to-roll production lines. The resulting hybrid sensing layers tightly incorporate the porphyrins and moreover form a porous structure that facilitates the amines' diffusion to, and interaction with, the chromophores.

The work is completed by a thorough analysis of the porphyrins' amine-sensing performance in solution as well as in the hybrid coatings. To reveal the underlying interaction mechanisms, the experimental results are supported by DFT calculations. The deposited layers could be used for the detection of NEt3 concentrations below 10 ppm in the gas phase. Moreover, the coated foils have been tested in preliminary food storage experiments.

The mechanistic investigations on the interaction of amines with chromium(III) porphyrins revealed a novel pathway to the formation of chromium(IV) oxido porphyrins. This has been used for electrochemical epoxidation reactions with dioxygen as the formal terminal oxidant.
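The colourimetric read-out ultimately rests on the Beer–Lambert law, stated here as general background rather than as a result of the thesis:

\[
A(\lambda) \;=\; \varepsilon(\lambda)\,c\,\ell,
\]

where a change in the porphyrin's molar absorptivity ε(λ) upon amine binding shifts the absorbance spectrum A, and hence the perceived colour, for a given chromophore concentration c and optical path length ℓ.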