8 results for Network performance

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo


Relevance:

30.00%

Publisher:

Abstract:

The web services (WS) technology provides a comprehensive solution for representing, discovering, and invoking services in a wide variety of environments, including Service Oriented Architectures (SOA) and grid computing systems. At the core of WS technology lie a number of XML-based standards, such as the Simple Object Access Protocol (SOAP), that have successfully ensured WS extensibility, transparency, and interoperability. Nonetheless, there is an increasing demand to enhance WS performance, which is severely impaired by XML's verbosity. SOAP communications produce considerable network traffic, making them unfit for distributed, loosely coupled, and heterogeneous computing environments such as the open Internet. Also, they introduce higher latency and processing delays than other technologies, like Java RMI and CORBA. WS research has recently focused on SOAP performance enhancement. Many approaches build on the observation that SOAP message exchange usually involves highly similar messages (those created by the same implementation usually have the same structure, and those sent from a server to multiple clients tend to show similarities in structure and content). Similarity evaluation and differential encoding have thus emerged as SOAP performance enhancement techniques. The main idea is to identify the common parts of SOAP messages, to be processed only once, avoiding a large amount of overhead. Other approaches investigate nontraditional processor architectures, including micro- and macrolevel parallel processing solutions, so as to further increase the processing rates of SOAP/XML software toolkits. This survey paper provides a concise, yet comprehensive review of the research efforts aimed at SOAP performance enhancement. A unified view of the problem is provided, covering almost every phase of SOAP processing, ranging over message parsing, serialization, deserialization, compression, multicasting, security evaluation, and data/instruction-level processing.
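As a rough illustration of the differential-encoding idea surveyed above (not any specific system from the paper), the sketch below uses Python's `difflib`: sender and receiver share a reference message, and only the delta of an actual message against that reference needs to travel over the wire; the receiver restores the full message from the delta. All message contents here are made up.

```python
import difflib

# Shared reference message (illustrative; both endpoints hold a copy).
REFERENCE = """\
<soap:Envelope>
  <soap:Body>
    <getQuote>
      <symbol>REF</symbol>
    </getQuote>
  </soap:Body>
</soap:Envelope>"""

def encode(message: str) -> list:
    """Encode a message as an ndiff delta against the shared reference."""
    return list(difflib.ndiff(REFERENCE.splitlines(), message.splitlines()))

def decode(delta: list) -> str:
    """Rebuild the full message: restore(delta, 2) yields the second sequence."""
    return "\n".join(difflib.restore(delta, 2))

# A message with the same structure differs from the reference in one line,
# so only a couple of delta lines carry actual content.
message = REFERENCE.replace("REF", "IBM")
delta = encode(message)
assert decode(delta) == message
```

In practice the surveyed approaches operate on parsed XML structure and negotiated templates rather than raw line diffs; this sketch only conveys the "process the common parts once" intuition.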

Relevance:

30.00%

Publisher:

Abstract:

This study analyzed the relationship between slow- and fast-alpha asymmetry within the frontal cortex and the planning, execution, and voluntary control of saccadic eye movements (SEM). Quantitative electroencephalography (qEEG) was recorded using a 20-channel EEG system in 12 healthy participants performing a fixed (i.e., memory-driven) and a random (i.e., stimulus-driven) SEM condition. We found main effects of SEM condition on slow- and fast-alpha asymmetry at electrodes F3-F4, which are located over the premotor cortex, specifically a negative asymmetry between conditions. When analyzing electrodes F7-F8, which are located over the prefrontal cortex, we found a main effect of condition on slow-alpha asymmetry, particularly a positive asymmetry between conditions. In conclusion, the present approach supports the association of the slow- and fast-alpha bands with the planning and preparation of SEM, and the specific role of these sub-bands in both the attention network and the coordination and integration of sensory information with an (oculo)motor response. (C) 2011 Elsevier B.V. All rights reserved.
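For readers unfamiliar with the metric, frontal alpha asymmetry is conventionally computed as the difference of log band powers between homologous right and left electrodes (e.g., F4 vs. F3). The sketch below uses synthetic signals; the band edges (slow alpha taken as 8-10 Hz), sampling rate, and amplitudes are illustrative assumptions, not values from the study.

```python
import numpy as np
from scipy.signal import welch

FS = 256  # sampling rate in Hz (assumed)

def band_power(x, fs, lo, hi):
    """Relative power in [lo, hi] Hz from a Welch power spectral density."""
    freqs, psd = welch(x, fs=fs, nperseg=2 * fs)
    band = (freqs >= lo) & (freqs <= hi)
    return psd[band].sum()

def alpha_asymmetry(left, right, fs=FS, lo=8.0, hi=10.0):
    """ln(right-hemisphere power) - ln(left-hemisphere power) in the sub-band."""
    return np.log(band_power(right, fs, lo, hi)) - np.log(band_power(left, fs, lo, hi))

# Synthetic F3/F4 signals: a 9 Hz rhythm that is stronger on the right,
# so the asymmetry index comes out positive (about ln 4 for these amplitudes).
rng = np.random.default_rng(0)
t = np.arange(10 * FS) / FS
f3 = 1.0 * np.sin(2 * np.pi * 9 * t) + 0.1 * rng.standard_normal(t.size)  # left
f4 = 2.0 * np.sin(2 * np.pi * 9 * t) + 0.1 * rng.standard_normal(t.size)  # right
asym = alpha_asymmetry(f3, f4)
```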

Relevance:

30.00%

Publisher:

Abstract:

Consider a communication system in which a transmitter sends fixed-size packets of data at a uniform rate to a receiver, and in which these devices are connected by a packet-switched network that introduces a random delay to each packet. Here we propose an adaptive clock recovery scheme capable of synchronizing the frequencies and phases of these devices within specified limits of precision. This scheme for achieving frequency and phase synchronization is based on measurements of the packet arrival times at the receiver, which are used to control the dynamics of a digital phase-locked loop. The scheme's performance is evaluated via numerical simulations using realistic parameter values. (C) 2011 Elsevier B.V. All rights reserved.
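A minimal sketch of the kind of loop described, under assumed gains and a toy jitter model (none of these numbers come from the paper): the receiver predicts each packet's arrival time and feeds the prediction error into a proportional path, which re-aligns the phase, and an integral path, which trims the period estimate.

```python
import random

random.seed(1)
T_TRUE = 1.000          # transmitter period (s), unknown to the receiver
KP, KI = 0.1, 0.01      # proportional and integral loop gains (assumed)

t_hat = 1.050           # receiver's initial (wrong) period estimate
pred = 0.0              # predicted arrival time of the next packet
for k in range(5000):
    arrival = k * T_TRUE + random.uniform(0.0, 0.01)  # random network delay
    err = arrival - pred           # phase error seen by the loop
    t_hat += KI * err              # integral path: correct the period estimate
    pred += t_hat + KP * err       # proportional path: re-align the phase
```

With the integral path averaging out the random delay, the period estimate settles near the true transmitter period, while the residual phase offset absorbs the mean network delay.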

Relevance:

30.00%

Publisher:

Abstract:

Trigeneration systems have been used to advantage in recent years in distributed electricity generation, driven by the growth of natural gas pipeline distribution networks, tax incentives, and energy regulation policies. Typically, a trigeneration system produces electrical power while simultaneously supplying heating and cooling loads by recovering the thermal content of the combustion products that would otherwise be discharged to the atmosphere. In this context, two small-scale trigeneration plants were tested for overall efficiency evaluation and operational comparison. The first system is based on a 30 kW (ISO) natural gas powered microturbine; the second uses a 26 kW natural gas powered internal combustion engine coupled to an electrical generator as its prime mover. The stack gases from both machines were directed first to a 17.6 kW ammonia-water absorption refrigeration chiller to produce chilled water, and then to a water heat-recovery boiler to produce hot water. Experimental results are presented along with the system operational parameters relevant to appropriate operation, including natural gas consumption, net electrical and thermal power production (i.e., hot and cold water production rates), the primary energy saving index, and the energy utilization factor over full and partial electrical load conditions. (c) 2011 Elsevier Ltd. All rights reserved.
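The two figures of merit named at the end can be stated compactly. The sketch below uses placeholder power values loosely in the range of the plants described, not the measured results, and the reference efficiencies in the primary-energy-saving index are commonly assumed values, not the paper's.

```python
def energy_utilization_factor(w_el, q_hot, q_cold, fuel_power):
    """EUF: all useful outputs (electricity + heating + cooling) over fuel input."""
    return (w_el + q_hot + q_cold) / fuel_power

def primary_energy_saving(fuel_power, w_el, q_th, eta_el_ref=0.40, eta_th_ref=0.80):
    """PES: fraction of primary energy saved versus separate production of the
    same electricity and heat at assumed reference efficiencies."""
    separate = w_el / eta_el_ref + q_th / eta_th_ref
    return 1.0 - fuel_power / separate

# Placeholder figures (kW): 26 kW of electricity, 40 kW of hot water,
# 17.6 kW of chilling, and 105 kW of natural gas input.
euf = energy_utilization_factor(w_el=26.0, q_hot=40.0, q_cold=17.6, fuel_power=105.0)
pes = primary_energy_saving(fuel_power=105.0, w_el=26.0, q_th=57.6)
```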

Relevance:

30.00%

Publisher:

Abstract:

Traditional supervised data classification considers only the physical features (e.g., distance or similarity) of the input data; here, this type of learning is called low-level classification. The human (animal) brain, on the other hand, performs both low and high orders of learning, and readily identifies patterns according to the semantic meaning of the input data. Data classification that considers not only physical attributes but also pattern formation is referred to here as high-level classification. In this paper, we propose a hybrid classification technique that combines both types of learning. The low-level term can be implemented by any classification technique, while the high-level term is realized by extracting features of the underlying network constructed from the input data. Thus, the former classifies test instances by their physical features or class topologies, while the latter measures the compliance of test instances with the pattern formation of the data. Our study shows that the proposed technique not only realizes classification according to pattern formation, but is also able to improve the performance of traditional classification techniques. Furthermore, as the complexity of the class configuration increases, for instance through mixing among different classes, a larger weight on the high-level term is required for correct classification. This confirms that high-level classification is especially important in complex classification settings. Finally, we show how the proposed technique can be employed in a real-world application, where it is capable of identifying variations and distortions of handwritten digit images; as a result, it improves the overall pattern recognition rate.
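The combination of the two terms can be sketched as a convex mixture of per-class scores. The weight name `lam` and both toy score vectors below are illustrative assumptions; the paper's actual low- and high-level terms come from a trained classifier and from network measures, respectively.

```python
import numpy as np

def hybrid_scores(low, high, lam):
    """Per-class decision scores: (1 - lam) * low-level + lam * high-level."""
    return (1.0 - lam) * np.asarray(low) + lam * np.asarray(high)

# Toy two-class example: the low-level (physical-similarity) term favors
# class 0, while the high-level (pattern-conformance) term favors class 1.
low = [0.60, 0.40]
high = [0.20, 0.80]

cls_low_weighted = int(np.argmax(hybrid_scores(low, high, lam=0.2)))   # -> 0
cls_high_weighted = int(np.argmax(hybrid_scores(low, high, lam=0.8)))  # -> 1
```

Increasing `lam` shifts the decision toward pattern conformance, mirroring the abstract's observation that mixed, complex class configurations need a larger high-level portion.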

Relevance:

30.00%

Publisher:

Abstract:

The concept of industrial clustering has been studied in depth by policy makers and researchers from many fields, mainly due to the competitive advantages it may bring to regional economies. Companies often take part in collaborative initiatives with local partners while also taking advantage of knowledge spillovers to benefit from locating in a cluster. Thus, Knowledge Management (KM) and Performance Management (PM) have become relevant topics for policy makers and cluster associations when undertaking collaborative initiatives. Taking this into account, this paper explores the interplay between the two topics through a case study conducted in a collaborative network formed within a cluster. The results show that KM should be acknowledged as a formal area of cluster management so that PM practices can support knowledge-oriented initiatives and therefore make better use of the new knowledge created. Furthermore, the tacit and explicit knowledge resulting from PM practices needs to be stored and disseminated throughout the cluster as a way of improving managerial practices and regional strategic direction. Knowledge Management Research & Practice (2012) 10, 368-379. doi:10.1057/kmrp.2012.23

Relevance:

30.00%

Publisher:

Abstract:

A neural network model to predict ozone concentration in the São Paulo Metropolitan Area was developed, based on average values of meteorological variables in the morning (8:00-12:00 hr) and afternoon (13:00-17:00 hr) periods. The outputs are the maximum and average ozone concentrations in the afternoon (12:00-17:00 hr). The correlation coefficients between computed and measured values were 0.82 and 0.88 for the maximum and average ozone concentrations, respectively. The model performed well as a prediction tool for the maximum ozone concentration: for prediction periods of 1 to 5 days, failure rates of 0 to 23% (95% confidence) were obtained.
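A toy version of this modeling pipeline, with synthetic data standing in for the meteorological and ozone measurements: a one-hidden-layer network trained by plain batch gradient descent, evaluated with the same correlation-coefficient skill metric the abstract reports. The architecture, data generator, and training settings are all assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic "morning averages" (3 meteorological predictors) and an afternoon
# ozone target depending on them plus noise; the target is standardized
# to keep training stable.
X = rng.uniform(0.0, 1.0, size=(200, 3))
y = 80 * X[:, 0] - 30 * X[:, 1] + 10 * X[:, 2] + rng.normal(0.0, 2.0, 200)
y = (y - y.mean()) / y.std()

W1 = rng.normal(0.0, 0.5, (3, 8)); b1 = np.zeros(8)  # hidden layer: 8 tanh units
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)  # linear output
lr = 0.05
for _ in range(4000):                    # batch gradient descent on mean squared error
    h = np.tanh(X @ W1 + b1)
    pred = (h @ W2 + b2).ravel()
    err = pred - y
    gh = (err[:, None] @ W2.T) * (1.0 - h ** 2)       # backprop through tanh
    W2 -= lr * (h.T @ err[:, None]) / len(y)
    b2 -= lr * err.mean(keepdims=True)
    W1 -= lr * (X.T @ gh) / len(y)
    b1 -= lr * gh.mean(axis=0)

r = np.corrcoef(pred, y)[0, 1]           # skill metric: correlation coefficient
```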

Relevance:

30.00%

Publisher:

Abstract:

The Brazilian network for genotyping is composed of 21 laboratories that perform and analyze genotyping tests for all HIV-infected patients within the public system, performing approximately 25,000 tests per year. We assessed the interlaboratory and intralaboratory reproducibility of genotyping systems by creating and implementing a local external quality control evaluation. Plasma samples from HIV-1-infected individuals (with low and intermediate viral loads) or RNA viral constructs with specific mutations were used. This evaluation included analyses of sensitivity and specificity of the tests based on qualitative and quantitative criteria, which scored laboratory performance on a 100-point system. Five evaluations were performed from 2003 to 2008, with 64% of laboratories scoring over 80 points in 2003, 81% doing so in 2005, 56% in 2006, 91% in 2007, and 90% in 2008 (Kruskal-Wallis, p = 0.003). Increased performance was aided by retraining laboratories that had specific deficiencies. The results emphasize the importance of investing in laboratory training and interpretation of DNA sequencing results, especially in developing countries where public (or scarce) resources are used to manage the AIDS epidemic.
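The year-over-year comparison described above is an ordinary Kruskal-Wallis test on per-laboratory scores grouped by evaluation round. The sketch below runs it on synthetic 0-100 scores for 21 laboratories per round (the actual per-laboratory scores are not given in the abstract), and also computes the fraction of laboratories scoring over 80 points, the summary the abstract reports per year.

```python
import random
from scipy.stats import kruskal

random.seed(0)
# Synthetic per-laboratory scores (0-100 scale) for three evaluation rounds;
# 21 laboratories per round, matching the size of the network described.
rounds = {
    2003: [min(100.0, random.gauss(78, 10)) for _ in range(21)],
    2006: [min(100.0, random.gauss(75, 12)) for _ in range(21)],
    2008: [min(100.0, random.gauss(88, 6)) for _ in range(21)],
}
stat, p = kruskal(*rounds.values())            # H statistic and p-value
passing = {yr: sum(s > 80 for s in scores) / len(scores)
           for yr, scores in rounds.items()}   # fraction scoring over 80 points
```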