938 results for Wide Area Control
Abstract:
This paper proposes the deployment of a neural network computing environment on Active Networks. Active Networks are packet-switched computer networks in which packets can contain code fragments that are executed on the intermediate nodes. This feature allows small pieces of code that deal with computer network problems to be injected directly into the network core, and the adoption of new computing techniques to solve networking problems. The goal of our project is the adoption of a distributed neural network for approaching tasks specific to the computer network environment. Dynamically reconfigurable neural networks are spread over an experimental wide-area backbone of active nodes (ABone) to show the feasibility of the proposed approach.
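The active-packet idea described above can be sketched in a few lines: each packet carries a small code fragment that intermediate nodes execute against their local state. All names here (`ActivePacket`, `Node`, `handle`) are hypothetical illustrations, not the ABone or any Active Networks API.

```python
# Illustrative sketch, assuming a simplified model: a packet carries a code
# fragment that every node on the path executes. Not the ABone API.

class ActivePacket:
    def __init__(self, payload, fragment):
        self.payload = payload      # ordinary data
        self.fragment = fragment    # code fragment run at each hop

class Node:
    def __init__(self, name):
        self.name = name
        self.state = {"hops": 0}    # per-node state the fragment may touch

    def handle(self, packet):
        # Execute the injected fragment with access to node and packet.
        packet.fragment(self, packet)
        return packet

def count_hops(node, packet):
    # A tiny "networking problem" solved in-network: path recording.
    node.state["hops"] += 1
    packet.payload["path"].append(node.name)

pkt = ActivePacket({"path": []}, count_hops)
for node in (Node("A"), Node("B"), Node("C")):
    node.handle(pkt)
print(pkt.payload["path"])  # ['A', 'B', 'C']
```

In a real active network the fragment would be serialized bytecode subject to resource limits; here it is simply a Python callable.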
Abstract:
Recently, two approaches have been introduced that distribute the molecular fragment mining problem. The first applies a master/worker topology; the second, a completely distributed peer-to-peer system, solves the scalability problem caused by the bottleneck at the master node. However, in many real-world scenarios the participating computing nodes cannot communicate directly because of administrative policies such as security restrictions. Thus, potential computing power is not accessible to accelerate the mining run. To overcome this shortcoming, this work introduces a hierarchical topology of computing resources, which distributes the management over several levels and adapts to the natural structure of such multi-domain architectures. The most important aspect is the load-balancing scheme, which has been designed and optimized for the hierarchical structure. The approach allows dynamic aggregation of heterogeneous computing resources and is applied to wide-area network scenarios.
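One ingredient of such a hierarchical scheme can be sketched: a root manager splits work among domain managers in proportion to the worker capacity each domain reports, so no single master must talk to every worker directly. The function below is a generic largest-remainder apportionment, an assumption for illustration, not the paper's actual load-balancing algorithm.

```python
# Hypothetical sketch: proportional task apportionment across administrative
# domains, as one level of a hierarchical load-balancing tree might do it.

def split_proportionally(n_tasks, capacities):
    """Assign n_tasks across domains proportionally to reported capacity,
    using the largest-remainder method so the total is exact."""
    total = sum(capacities)
    shares = [n_tasks * c / total for c in capacities]
    assigned = [int(s) for s in shares]
    # Hand leftover tasks to the domains with the largest fractional parts.
    by_remainder = sorted(range(len(shares)),
                          key=lambda i: shares[i] - assigned[i], reverse=True)
    for i in by_remainder[: n_tasks - sum(assigned)]:
        assigned[i] += 1
    return assigned

# Three domains with 4, 2 and 2 workers each.
print(split_proportionally(100, [4, 2, 2]))  # [50, 25, 25]
```

Each domain manager would then recurse, splitting its own share among sub-domains or workers.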
Abstract:
This paper presents a study on applying integrated Global Positioning System (GPS) and Geographical Information System (GIS) technology to the reduction of construction waste. During the study, a prototype is developed from an automatic data-capture system, such as a barcoding system, for onsite construction material and equipment (M&E) management, while the integrated GPS and GIS technology is coupled to the M&E system over a Wide Area Network (WAN). A case study is then conducted to demonstrate the deployment of the system. Experimental results indicate that the proposed system can minimize the amount of onsite material wastage.
Abstract:
The Java language first came to public attention in 1995. Within a year, it was being speculated that Java might be a good language for parallel and distributed computing. Its core features, including object orientation and platform independence, as well as built-in network support and threads, have encouraged this view. Today, Java is used in almost every type of computer-based system, ranging from sensor networks to high-performance computing platforms, and from enterprise applications to complex research-based simulations. In this paper the key features that make Java a good language for parallel and distributed computing are first discussed. Two Java-based middleware systems are then discussed: MPJ Express, an MPI-like Java messaging system, and Tycho, a wide-area asynchronous messaging framework with an integrated virtual registry. The paper concludes by highlighting the advantages of using Java as middleware to support distributed applications.
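The MPI-style point-to-point pattern that systems like MPJ Express expose (blocking Send/Recv between ranked processes) can be simulated in-process to show the idea. This is a language-neutral sketch with threads and queues standing in for ranks; it is not the MPJ Express API.

```python
# Minimal simulation of MPI-style point-to-point messaging: two "ranks"
# exchange a work list and a partial result via blocking send/recv.
import queue
import threading

mailboxes = {0: queue.Queue(), 1: queue.Queue()}

def send(dest, msg):
    mailboxes[dest].put(msg)          # non-blocking buffered send

def recv(rank):
    return mailboxes[rank].get()      # blocking receive, MPI-style

results = {}

def rank0():
    send(1, list(range(5)))           # ship work to rank 1
    results["reply"] = recv(0)        # wait for the reduced result

def rank1():
    data = recv(1)
    send(0, sum(data))                # send partial sum back

t0 = threading.Thread(target=rank0)
t1 = threading.Thread(target=rank1)
t0.start(); t1.start(); t0.join(); t1.join()
print(results["reply"])  # 10
```

In MPJ Express the same pattern runs across JVMs on separate hosts, with ranks assigned by the runtime rather than by thread.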
Abstract:
Resource monitoring in distributed systems is required to understand the 'health' of the overall system and to help identify particular problems, such as dysfunctional hardware or faulty system or application software. Desirable characteristics for monitoring systems are the ability to connect to any number of different types of monitoring agents and to provide different views of the system based on a client's particular preferences. This paper outlines and discusses the ongoing activities within the GridRM wide-area resource-monitoring project.
Abstract:
Tycho was conceived in 2003 in response to the need of the GridRM [1] resource-monitoring project for a 'light-weight', scalable and easy-to-use wide-area distributed registry and messaging system. Since Tycho's first release in 2006, a number of modifications have been made to the system to make it easier to use and more flexible. Since its inception, Tycho has been utilised across a number of application domains, including wide-area resource monitoring, distributed queries across archival databases, providing services for the nodes of a Cray supercomputer, and transferring multi-terabyte scientific datasets across the Internet. This paper provides an overview of the initial Tycho system, describes a number of applications that utilise Tycho, discusses a number of new utilities, and shows how the Tycho infrastructure has evolved in response to experience of building applications with it.
Abstract:
In any wide-area distributed system there is a need to communicate and interact with a range of networked devices and services, ranging from computer-based resources (CPU, memory and disk) to network components (hubs, routers, gateways) and specialised data sources (embedded devices, sensors, data-feeds). In order for the ensemble of underlying technologies to provide an environment in which virtual organisations can flourish, the resources that comprise the fabric of the Grid must be monitored in a seamless manner that abstracts away the underlying complexity. Furthermore, as various competing Grid middleware offerings are released and evolve, an independent overarching monitoring service should act as a cornerstone that ties these systems together. GridRM is a standards-based approach that is independent of any given middleware and can utilise legacy and emerging resource-monitoring technologies. The main objective of the project is to produce a standardised and extensible architecture that provides seamless mechanisms to interact with native monitoring agents across heterogeneous resources.
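The seamless-mechanism idea can be sketched as an adapter layer: a uniform driver interface normalises heterogeneous native agents into one schema, so clients query resources without knowing which agent backs them. All class and field names below are illustrative assumptions, not GridRM's actual driver API (`ssCpuIdle` is borrowed from the UCD-SNMP MIB purely as an example of a native metric name).

```python
# Hedged sketch of monitoring-agent drivers behind one uniform schema.

class AgentDriver:
    def query(self):
        raise NotImplementedError

class SNMPStyleDriver(AgentDriver):
    """Adapts an SNMP-like agent that reports idle CPU percentage."""
    def __init__(self, raw):
        self.raw = raw
    def query(self):
        return {"cpu_load": 100 - self.raw["ssCpuIdle"]}

class LogFileDriver(AgentDriver):
    """Adapts a legacy agent that only writes key=value text lines."""
    def __init__(self, line):
        self.line = line
    def query(self):
        return {"cpu_load": float(self.line.split("load=")[1])}

def monitor(drivers):
    # One seamless view over all resources, whatever agent backs them.
    return {name: d.query()["cpu_load"] for name, d in drivers.items()}

view = monitor({
    "nodeA": SNMPStyleDriver({"ssCpuIdle": 70}),
    "nodeB": LogFileDriver("ts=123 load=42.5"),
})
print(view)  # {'nodeA': 30, 'nodeB': 42.5}
```

New agent types are accommodated by adding a driver, leaving clients and the normalised schema untouched.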
Abstract:
Electricity load shifting is becoming a big topic in the world of ‘green’ retail. Marks & Spencer (M&S) aims to become the world’s most sustainable retailer (1), and part of that commitment means contributing to the future electricity network. While intelligent operation of fridges and Heating, Ventilation and Air Conditioning (HVAC) systems is a wide area of research, standby generators should be considered too, as they are the most widely adopted form of distributed generation. In this paper, the experience of using standby generators in Northern Ireland to support the grid is shared and the logistics of future projects are discussed. Interactions with maintenance schedules, electricity costs, grid code, staffing and store opening times are discussed, as well as the financial implications of running generators for grid support.
Abstract:
Drawing upon Brazilian experience, this research explores some of the key issues to be addressed in using e-government technical cooperation designed to enhance service provision by Patent Offices in developing countries. While the development of software applications is often seen merely as a technical engineering exercise, localization and adaptation are context-bound matters characterized by many entanglements of humans and non-humans. In this work, the technical, legal and policy implications of technical cooperation are also discussed in a complex and dynamic implementation environment characterized by the influence of powerful hidden agendas associated with the arena of intellectual property (IP), which are shaped by recent technological, economic and social developments in our current knowledge-based economy. This research employs two different theoretical lenses to examine the same case, which consists of the transfer of a Patent Management System (PMS) from the European Patent Office (EPO) to the Brazilian Patent Office, locally named ‘Instituto Nacional da Propriedade Industrial’ (INPI). Fundamentally, we have opted for a multi-paper thesis comprising an introduction, three scientific articles and a concluding chapter that discusses and compares the insights obtained from each article. The first article presents an extensive literature review on e-government and technology transfer. This review allowed the proposition of an integrative meta-model of e-government technology transfer, named the E-government Transfer Model (ETM). Subsequently, in the second article, we present Actor-Network Theory (ANT) as a framework for understanding the processes of transferring e-government technologies from Patent Offices in developed countries to Patent Offices in developing countries.
Overall, ANT is seen as having a potentially wide area of application and as a promising theoretical vehicle in IS research for carrying out a social analysis of the messy and heterogeneous processes that drive technical change. Drawing particularly on the works of Bruno Latour, Michel Callon and John Law, this work applies the theory to a longitudinal study of the management information systems supporting the Brazilian Patent Office restructuring plan, which involved the implementation of a European Patent Management System in Brazil. Based upon the ANT elements, we follow the actors to identify and understand patterns of group formation associated with the technical cooperation between the Brazilian Patent Office (INPI) and the European Patent Office (EPO). This research therefore explores the intricate relationships and interactions between human and non-human actors in their attempts to construct various network alliances, thereby demonstrating that technologies embody compromise. Finally, the third article applies the ETM model as a heuristic frame to examine the same case previously studied from an ANT perspective. We have found evidence that ETM has strong heuristic qualities that can guide practitioners engaged in the transfer of e-government systems from developed to developing countries. The successful implementation of e-government projects in developing countries is important to stimulate economic growth and, as a result, we need to understand the processes through which such projects are implemented and succeed. Here, we attempt to improve understanding of the development and stabilization of a complex socio-technical system in the arena of intellectual property. Our preliminary findings suggest that e-government technology transfer is an inherently political process and that successful outcomes require continuous incremental actions and improvisations to address ongoing issues as they emerge.
Abstract:
This work presents a theoretical and numerical analysis of structures using frequency selective surfaces applied to patch antennas. The FDTD method is used to determine the time-domain reflected fields. Applications of frequency selective surfaces and patch antennas cover a wide area of telecommunications, especially mobile communications, filters and wideband (WB) antennas. Scattering parameters are obtained from the Fourier transform of the transmitted and reflected fields in the time domain. PML absorbing boundary conditions are used, allowing the fields to be determined with little interference from reflections at the limits of the discretized space. Rectangular patches on a dielectric layer, fed by a microstrip line, are considered. Frequency selective surfaces with periodic and quasi-periodic structures are analyzed on both sides of the antenna. A literature review of the use of frequency selective surfaces in patch antennas is also presented. Numerical results are compared with measured results for the return loss of the analyzed structures. Suggestions for the continuation of this work are also presented.
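The FDTD method at the heart of this analysis advances the E and H fields in a leapfrog time-stepping scheme. A minimal one-dimensional sketch (free space, normalised units, hard grid boundaries, no PML) shows the core update; the paper's three-dimensional formulation with PML and FSS geometry is of course far more involved, and all constants here are illustrative.

```python
# Minimal 1-D FDTD sketch: leapfrog updates of Ez and Hy on a staggered grid,
# with a Gaussian pulse injected at the centre. Normalised units fold the
# Courant factor (0.5) into the update coefficients.
import math

n_cells, n_steps = 200, 250
ez = [0.0] * n_cells   # electric field samples
hy = [0.0] * n_cells   # magnetic field samples (offset half a cell)

for t in range(n_steps):
    # Update E from the spatial difference (curl) of H.
    for k in range(1, n_cells):
        ez[k] += 0.5 * (hy[k - 1] - hy[k])
    # Soft Gaussian source at the grid centre.
    ez[n_cells // 2] += math.exp(-0.5 * ((t - 30) / 10) ** 2)
    # Update H from the spatial difference (curl) of E.
    for k in range(n_cells - 1):
        hy[k] += 0.5 * (ez[k] - ez[k + 1])

print(f"peak |Ez| after {n_steps} steps: {max(abs(e) for e in ez):.3f}")
```

Replacing the hard boundaries with a PML, as the paper does, would absorb the outgoing pulse instead of reflecting it; Fourier-transforming the recorded time-domain fields then yields the scattering parameters.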
Abstract:
We investigated the production of interleukin-6 (IL-6) and tumor necrosis factor-alpha (TNF-alpha) during canine visceral leishmaniasis (VL) to gain a better understanding of the role of such multi-functional cytokines in parasite resistance. IL-6 and TNF-alpha levels were measured by capture ELISA in sera from 8 healthy dogs from a non-endemic area (control group) and in sera from 16 dogs from Araçatuba, SP, Brazil, an area endemic for leishmaniasis. The dogs from the endemic area were selected by positive ELISA serology against total Leishmania chagasi antigen, positive spleen imprints for Leishmania, and the presence of at least three clinical signs associated with active visceral leishmaniasis (fever, dermatitis, lymphadenopathy, onychogryphosis, weight loss, cachexia, locomotor difficulty, conjunctivitis, epistaxis, hepatosplenomegaly, edema, and apathy). Enhanced systemic IL-6 production was found in sera from dogs with active disease compared to healthy dogs (t-test, P < 0.05). In contrast, TNF-alpha did not differ between the two groups. There was no correlation between IL-6 production and anti-leishmanial antibody titers in the sera. Our findings suggest that IL-6 is a good marker of active disease during leishmaniasis, and that other cytokines may be involved in the hypergammaglobulinemia characteristic of canine visceral leishmaniasis. (c) 2006 Published by Elsevier B.V.
Abstract:
Satellite remote sensing of ocean colour is the only method currently available for synoptically measuring wide-area properties of ocean ecosystems, such as phytoplankton chlorophyll biomass. Recently, a variety of bio-optical and ecological methods have been established that use satellite data to identify and differentiate between either phytoplankton functional types (PFTs) or phytoplankton size classes (PSCs). In this study, several of these techniques were evaluated against in situ observations to determine their ability to detect dominant phytoplankton size classes (micro-, nano- and picoplankton). The techniques are applied to a 10-year ocean-colour data series from the SeaWiFS satellite sensor and compared with in situ data (6504 samples) from a variety of locations in the global ocean. Results show that spectral-response, ecological and abundance-based approaches can all perform with similar accuracy. Detection of microplankton and picoplankton was generally better than detection of nanoplankton. Abundance-based approaches were shown to provide better spatial retrieval of PSCs. Individual model performance varied according to PSC, input satellite data sources and in situ validation data types. Uncertainty in the comparison procedure and data sources was considered. Improved availability of in situ observations would aid ongoing research in this field. (C) 2010 Elsevier B.V. All rights reserved.
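The gist of an abundance-based approach can be sketched: the dominant size class is inferred from total chlorophyll-a concentration, exploiting the general tendency for picoplankton to dominate low-chlorophyll (oligotrophic) waters and microplankton to dominate high-chlorophyll (eutrophic) waters. The thresholds below are hypothetical placeholders for illustration, not values from this paper or any calibrated model.

```python
# Hedged sketch of an abundance-based dominant-size-class rule.
# The thresholds (0.1 and 1.0 mg m^-3) are hypothetical, not calibrated.

def dominant_size_class(chl_mg_m3, lo=0.1, hi=1.0):
    """Map total chlorophyll-a to the size class assumed dominant."""
    if chl_mg_m3 < lo:
        return "picoplankton"    # oligotrophic waters
    if chl_mg_m3 > hi:
        return "microplankton"   # eutrophic waters
    return "nanoplankton"        # intermediate abundance

print([dominant_size_class(c) for c in (0.03, 0.5, 2.0)])
# ['picoplankton', 'nanoplankton', 'microplankton']
```

Published abundance-based models fit continuous size-fractionated chlorophyll curves rather than hard thresholds, but the mapping from a single satellite-retrieved abundance to size structure is the same in spirit.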
Abstract:
A study of the characteristics and distribution of the soil humus fractions in representative ecosystems of central Brazil was carried out, with special emphasis on comparing soils under virgin vegetation (Cerrado) with those subjected to cultivation. In spite of the contrasting vegetation and cultural practices at the sites studied, the soil humus showed analogous characteristics: there was a negligible amount of plant residues, the humic and fulvic acids amounted to approximately 70% of the total organic carbon, and about 40% of these humic substances were in extremely stable association with the soil mineral fraction, the HCl-HF treatment being required for their extraction. The stability of such organo-mineral complexes increased slightly at the cultivated sites. The study of the humic acid fraction showed increased oxidation and aromaticity at most of the cultivated sites: the lowest values for the IR alkyl vibrations and H/C atomic ratios, and the highest values for the optical density at 465 nm, were observed at sites transformed into orchards, whereas the above changes were small at those used as pasture. The 13C NMR spectra confirmed that the proportion of polyalkyl structures decreased in the humic acids of soils subjected to cultivation, as opposed to that of carboxyl groups. In spite of the high stability inferred for the organic matter throughout the wide area examined, the samples from the original Cerrado, as well as those transformed into pastures, showed higher mineralization rates under laboratory conditions than those from the sites subjected to cultivation. This is partly attributed to the decreased proportions of extractable humic substances in the latter. © 1992.
Abstract:
The purpose of this study was to identify the drugs most often prescribed for hypertension at the Municipal Health Care Center of the town of Rincão, State of São Paulo, Brazil, and the principal interactions arising from their association with other drugs, both anti-hypertensives and drugs of other classes. The study included 725 hypertensive patients registered at this health care center who were seen regularly by a physician every three months. Data were collected on age, sex, occurrence of diabetes, smoking, sedentary lifestyle and overweight, to obtain a profile of the hypertensive population of the area. Control records of all patients were available at the pharmacy in the health care center, where patients obtained their drugs once a month. Of the 725 patients, 38% were male and 62% female. Most (57%) were between 50 and 70 years of age, 21% used tobacco and 43% led a sedentary lifestyle. Single-drug therapy accounted for 33% of the prescriptions and multidrug therapy for 66%. In addition to anti-hypertensives, 50% of the patients took drugs of other therapeutic classes. Of those receiving multidrug therapy, 34% used three or more anti-hypertensives and 66% used only two. Drug interactions were detected in as many as 47% of the prescriptions. Captopril was the drug that showed the most interactions with others (54%), followed by hydrochlorothiazide (27%), furosemide (14%), propranolol (4%), and nifedipine (1%). The analysis revealed that drug consumption by the patients investigated is high, with a concomitantly high number of episodes of drug interaction.