945 results for Computer management
Abstract:
Distributed artificial intelligence systems can be powerful tools in a wide variety of practical applications. Their most striking characteristic, emergent behavior, is also the one most responsible for the difficulty in designing these systems. This work proposes a tool capable of generating individual strategies for the elements of a multi-agent system, thereby providing the group with means of obtaining the desired results while working in a coordinated and cooperative manner. As an application example, a problem was chosen in which a group of predators must catch a prey in a continuous three-dimensional environment. A strategy synthesis system was implemented whose internal mechanism integrates simulators with the Particle Swarm Optimization (PSO) algorithm, a Swarm Intelligence technique. The system was tested in several simulation settings and was able to automatically synthesize successful hunting strategies, showing that the developed tool can provide, as long as it works with well-elaborated patterns, satisfactory solutions to complex problems that are difficult to solve through analytical approaches. (c) 2007 Elsevier Ltd. All rights reserved.
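The abstract names PSO as the optimization engine. The sketch below is a minimal, generic PSO loop in Python, shown only to illustrate the mechanism; the objective function, parameter values and dimensions are illustrative assumptions, not the authors' strategy-synthesis tool.

```python
import random

def pso(objective, dim, n_particles=30, iters=200,
        w=0.72, c1=1.49, c2=1.49, lo=-10.0, hi=10.0):
    # Initialize particle positions and velocities uniformly in the search box.
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    gi = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[gi][:], pbest_val[gi]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Inertia plus cognitive pull (personal best) and social pull (global best).
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy stand-in for a strategy-fitness simulator: minimize a sphere function.
best, best_val = pso(lambda x: sum(v * v for v in x), dim=3)
```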
Abstract:
Simulated annealing (SA) is an optimization technique that can process cost functions with arbitrary degrees of nonlinearity, discontinuity and stochasticity, as well as arbitrary boundary conditions and constraints imposed on these cost functions. The SA technique is applied to the problem of robot path planning. Three situations are considered here: the path represented as a polyline, as a Bezier curve, and as a spline-interpolated curve. In the proposed SA algorithm, the sensitivity of each continuous parameter is evaluated at each iteration, increasing the number of accepted solutions. The sensitivity of each parameter is associated with its probability distribution in the definition of the next candidate. (C) 2010 Elsevier Ltd. All rights reserved.
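As a reference point, a minimal generic SA loop over the free coordinates of a path is sketched below. The per-parameter proposal scale hints at the sensitivity-driven candidate distribution mentioned in the abstract, but the cooling schedule, step sizes and cost function are illustrative assumptions, not the paper's algorithm.

```python
import math
import random

def simulated_annealing(cost, x0, t0=1.0, t_min=1e-4, alpha=0.95, moves_per_temp=50):
    x, fx = x0[:], cost(x0)
    best, fbest = x[:], fx
    step = [1.0] * len(x0)       # per-parameter proposal scale (sensitivity stand-in)
    t = t0
    while t > t_min:
        for _ in range(moves_per_temp):
            i = random.randrange(len(x))
            cand = x[:]
            cand[i] += random.gauss(0.0, step[i] * t)   # perturb one coordinate
            fc = cost(cand)
            # Metropolis rule: accept improvements, sometimes accept worse moves.
            if fc < fx or random.random() < math.exp((fx - fc) / t):
                x, fx = cand, fc
                if fc < fbest:
                    best, fbest = cand[:], fc
        t *= alpha               # geometric cooling
    return best, fbest

# Toy cost: pull two free path parameters toward target values.
print(simulated_annealing(lambda v: sum((a - b) ** 2 for a, b in zip(v, [0.33, 0.67])), [0.0, 0.0]))
```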
Abstract:
This paper addresses the minimization of the mean absolute deviation from a common due date in a two-machine flowshop scheduling problem. We present heuristics that use an algorithm, based on proposed properties, which obtains an optimal schedule for a given job sequence. A new set of benchmark problems is presented with the purpose of evaluating the heuristics. Computational experiments show that the developed heuristics outperform results found in the literature for problems of up to 500 jobs. (C) 2007 Elsevier Ltd. All rights reserved.
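To make the objective concrete, the following sketch evaluates the mean absolute deviation (MAD) from a common due date for a fixed job sequence in a two-machine flowshop. It assumes jobs are processed back to back with no inserted idle time; the paper's algorithm additionally optimizes the schedule for a given sequence, which is not reproduced here.

```python
def mad_two_machine_flowshop(sequence, p1, p2, due_date):
    c1 = 0.0          # completion time on machine 1
    c2 = 0.0          # completion time on machine 2
    deviations = []
    for j in sequence:
        c1 += p1[j]               # machine 1 processes jobs back to back
        c2 = max(c1, c2) + p2[j]  # machine 2 waits for machine 1 and for itself
        deviations.append(abs(c2 - due_date))
    return sum(deviations) / len(deviations)

# Example: 3 jobs with per-machine processing times and a common due date of 10.
print(mad_two_machine_flowshop([0, 1, 2], p1=[2, 3, 1], p2=[4, 1, 2], due_date=10))
```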
Abstract:
The TCP/IP architecture has been consolidated as the standard for distributed systems. However, there is considerable research and discussion about alternatives for the evolution of this architecture. In this study area, this work presents the Title Model, which contributes to supporting application needs through the use of a cross-layer ontology and horizontal addressing in a next-generation Internet. From a practical viewpoint, the network cost reduction is shown for a distributed programming example in networks with layer 2 connectivity. To demonstrate the Title Model's improvement, a network analysis is presented for a message passing interface benchmark that sends a vector of integers and returns its sum. This analysis confirms that the current proposal allows, in this environment, a reduction of 15.23% in total network traffic, in bytes.
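The benchmark the abstract refers to (distribute a vector of integers, return its sum) looks roughly like the following mpi4py sketch when run over a conventional TCP/IP stack; the Title Model itself changes the layers underneath and is not represented in this code. The vector size and process layout are illustrative assumptions.

```python
# Run with: mpiexec -n 4 python vector_sum.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

# The root builds the vector of integers and splits it into one chunk per process.
chunks = None
if rank == 0:
    vector = list(range(100))
    chunks = [vector[i::size] for i in range(size)]

chunk = comm.scatter(chunks, root=0)               # each process receives its share
partial = sum(chunk)                               # local partial sum
total = comm.reduce(partial, op=MPI.SUM, root=0)   # combine the sums on the root
if rank == 0:
    print("sum =", total)                          # 4950 for range(100)
```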
Abstract:
In this work, a system using active RFID tags to supervise truck bulk cargo is described. The tags are attached to the truck bodies, and readers are distributed in the cargo buildings and attached to the weighing stations and discharge platforms. Inspectors are provided with PDAs with cameras and WiFi support, and access points are installed throughout the discharge area to allow effective confirmation of unloading actions and the acquisition of pictures for future audits. Broadband radio equipment is used to establish efficient communication links between the weighing stations and the cargo buildings, which are usually located very far from each other in the field. A web application was specially developed to enable robust communication between the devices, providing efficient device management, data processing and report generation for the operating personnel. The system was deployed in a cargo station of a Brazilian seaport. The results obtained demonstrate the effectiveness of the proposed system.
Abstract:
The increasing adoption of information systems in healthcare has led to a scenario where patient information security is increasingly regarded as a critical issue. Allowing patient information to be put in jeopardy may lead to irreparable physical, moral, and social damage to the patient, potentially shaking the credibility of the healthcare institution. Medical images play a crucial role in this context, given their importance in diagnosis, treatment, and research. Therefore, it is vital to take measures to prevent tampering and to determine their provenance. This demands the adoption of security mechanisms to assure information integrity and authenticity. A number of works in this field are based on two major approaches: the use of metadata and the use of watermarking. However, both approaches still have limitations that must be properly addressed. This paper presents a new method that uses cryptographic means to improve the trustworthiness of medical images, providing a stronger link between the image and the information on its integrity and authenticity without compromising image quality for the end user. The use of Digital Imaging and Communications in Medicine (DICOM) structures is also an advantage for ease of development and deployment.
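To illustrate what a cryptographic link between an image and its integrity/authenticity information can mean in practice, here is a deliberately simplified Python sketch using an HMAC over the pixel data. The paper's actual method works within DICOM structures and is not reproduced here; the key handling and tag placement below are assumptions for illustration only.

```python
import hashlib
import hmac

def tag_image(pixel_bytes: bytes, key: bytes) -> str:
    # Bind an authenticity/integrity tag to the exact pixel content.
    return hmac.new(key, pixel_bytes, hashlib.sha256).hexdigest()

def verify_image(pixel_bytes: bytes, key: bytes, tag: str) -> bool:
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(tag_image(pixel_bytes, key), tag)

key = b"institution-secret"                       # hypothetical key management
tag = tag_image(b"...pixel data...", key)
assert verify_image(b"...pixel data...", key, tag)    # untouched image passes
assert not verify_image(b"tampered data", key, tag)   # tampering is detected
```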
Abstract:
Hub-and-spoke networks are widely studied in the area of location theory. They arise in several contexts, including passenger airlines, postal and parcel delivery, and computer and telecommunication networks. Hub location problems usually involve three simultaneous decisions: the optimal number of hub nodes, their locations, and the allocation of the non-hub nodes to the hubs. In the uncapacitated single allocation hub location problem (USAHLP), hub nodes have no capacity constraints and non-hub nodes must be assigned to only one hub. In this paper, we propose three variants of a simple and efficient multi-start tabu search heuristic, as well as a two-stage integrated tabu search heuristic, to solve this problem. With the multi-start heuristics, several different initial solutions are constructed and then improved by tabu search, while in the two-stage integrated heuristic, tabu search is applied to improve both the locational and allocational parts of the problem. Computational experiments using typical benchmark problems (Civil Aeronautics Board (CAB) and Australian Post (AP) data sets), as well as new and modified instances, show that our approaches consistently return the optimal or best-known results in very short CPU times, thus making it possible to efficiently solve larger instances of the USAHLP than those found in the literature. We also report the integer optimal solutions for all 80 CAB data set instances and the 12 AP instances of up to 100 nodes, as well as for the corresponding newly generated AP instances with reduced fixed costs. Published by Elsevier Ltd.
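The multi-start scheme described above follows a generic pattern: build several initial solutions, improve each with tabu search, and keep the best. The Python skeleton below shows that pattern with a short-term tabu list and an aspiration criterion; the USAHLP-specific cost, neighborhood and construction routines are stubbed out and would have to come from the paper.

```python
def tabu_search(solution, cost, neighbors, tabu_tenure=7, iters=100):
    best, best_cost = solution, cost(solution)
    current = solution
    tabu = {}                                # move -> iteration until which it is tabu
    for it in range(iters):
        # Evaluate the neighborhood and scan candidates from best to worst.
        candidates = [(cost(n), move, n) for move, n in neighbors(current)]
        candidates.sort(key=lambda t: t[0])
        for c, move, n in candidates:
            # Aspiration: a tabu move is allowed if it beats the best known solution.
            if tabu.get(move, -1) < it or c < best_cost:
                current = n
                tabu[move] = it + tabu_tenure
                if c < best_cost:
                    best, best_cost = n, c
                break
    return best, best_cost

def multi_start_tabu(n_starts, random_solution, cost, neighbors):
    # Construct several initial solutions, improve each, keep the overall best.
    runs = (tabu_search(random_solution(), cost, neighbors) for _ in range(n_starts))
    return min(runs, key=lambda r: r[1])
```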
Abstract:
The recent development of industrial processes has brought about the emergence of technologically complex systems. This development has generated the need for research into mathematical techniques capable of dealing with design complexity and validation. Fuzzy models have received particular attention in the area of nonlinear systems identification and analysis due to their capacity to approximate nonlinear behavior and deal with uncertainty. A fuzzy rule-based model suitable for the approximation of many systems and functions is the Takagi-Sugeno (TS) fuzzy model. TS fuzzy models are nonlinear systems described by a set of if-then rules which give local linear representations of an underlying system. Such models can approximate a wide class of nonlinear systems. In this paper, a performance analysis of a system based on a TS fuzzy inference system for the calibration of electronic compass devices is presented. The contribution of the evaluated TS fuzzy inference system is to reduce the error obtained in data acquisition from a digital electronic compass. For reliable operation of the TS fuzzy inference system, adequate error measurements must be taken, and the error noise must be filtered before the TS fuzzy inference system is applied. The proposed method demonstrated an effectiveness of 57% at reducing the total error in the tests considered. (C) 2011 Elsevier Ltd. All rights reserved.
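For readers unfamiliar with the model class, the sketch below shows the core TS inference step: Gaussian memberships weight a set of local linear models, and the output is their normalized weighted average. The rule base and parameters are made-up illustrations, not the paper's compass-calibration rules.

```python
import math

def ts_inference(x, rules):
    """rules: list of (center, sigma, a, b) pairs giving a Gaussian membership
    exp(-((x - center) / sigma)^2) and a local linear model y = a*x + b."""
    weights = [math.exp(-((x - c) / s) ** 2) for c, s, _, _ in rules]
    outputs = [a * x + b for _, _, a, b in rules]
    # Normalized weighted average of the local linear models.
    return sum(w * y for w, y in zip(weights, outputs)) / sum(weights)

# Two illustrative rules approximating a nonlinear correction curve.
rules = [(0.0, 1.0, 0.5, 0.0), (5.0, 2.0, -0.2, 3.0)]
print(ts_inference(2.0, rules))
```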
Abstract:
Since computer viruses pose a serious problem to individual and corporate computer systems, a lot of effort has been dedicated to studying how to avoid their deleterious actions, trying to create anti-virus programs that act as vaccines in personal computers or in strategic network nodes. Another way to combat virus propagation is to establish preventive policies based on the overall operation of a system, which can be modeled with population models similar to those used in epidemiological studies. Here, a modified version of the SIR (Susceptible-Infected-Removed) model is presented, and it is explained how its parameters are related to network characteristics. Then, the disease-free and endemic equilibrium points are calculated, stability and bifurcation conditions are derived, and some numerical simulations are shown. The relations among the model parameters in the several bifurcation conditions allow a network design that minimizes virus risks. (C) 2009 Elsevier Inc. All rights reserved.
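As background to the above, a classical SIR model can be simulated with a few lines of explicit Euler integration. The paper's modified model ties its parameters to network characteristics and adds terms not shown here, so the code below is only a baseline sketch with assumed rates.

```python
def simulate_sir(beta, gamma, s0, i0, r0, dt=0.01, steps=10000):
    s, i, r = s0, i0, r0
    for _ in range(steps):
        new_inf = beta * s * i * dt      # susceptibles infected per step
        new_rec = gamma * i * dt         # infected removed per step
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
    return s, i, r

# beta: infection (contact) rate; gamma: removal rate. When beta*s0/gamma < 1,
# the disease-free equilibrium is stable and the outbreak dies out.
print(simulate_sir(beta=0.3, gamma=0.1, s0=0.99, i0=0.01, r0=0.0))
```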
Abstract:
Computer viruses are an important risk to computational systems, endangering both corporations of all sizes and personal computers used for domestic applications. Here, classical epidemiological models for disease propagation are adapted to computer networks and, by using simple systems identification techniques, a model called SAIC (Susceptible, Antidotal, Infectious, Contaminated) is developed. Real data about computer viruses are used to validate the model. (c) 2008 Elsevier Ltd. All rights reserved.
Abstract:
Only 7% of the once extensive forest along the eastern coast of Brazil remains, and much of that is degraded and threatened by agricultural expansion and urbanization. We wondered if methods similar to those developed to establish fast-growing Eucalyptus plantations might also work to enhance survival and growth of rainforest species on degraded pastures composed of highly competitive C4 grasses. An 8-factor experiment was laid out to contrast the value of different intensities of cultivation, application of fertilizer and weed control on the growth and survival of a mixture of 20 rainforest species planted at two densities: 3 m x 1 m, and 3 m x 2 m. Intensive management increased seedling survival from 90% to 98%, stemwood production and leaf area index (LAI) by ~4-fold, and stemwood production per unit of light absorbed by 30%. Annual growth in stem biomass was closely related to LAI alone (r² = 0.93, p < 0.0001), and the regression improved further in combination with canopy nitrogen content (r² = 0.99, p < 0.0001). Intensive management resulted in a nearly closed forest canopy in less than 4 years, and offers a practical means to establish functional forests on abandoned agricultural land. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
Evaluations of the effects of climatic conditions and forest management intensity on the trunks of Gmelina arborea Linn. Roxb. trees have been restricted to their physical-mechanical properties and use. The present work aims to study the radial variation of the wood anatomy of gmelina trees sampled from plantations at 30 sites in Costa Rica, characterized by two climatic conditions (tropical dry and tropical humid) and three intensities of forest management (intensive, moderate and no management). The analyses demonstrated radial variation in the different anatomical parameters, except for fiber lumen diameter and multiple vessels, in the wood of the gmelina trees. Relationships with climate (tropical humid and dry) were observed for the wood anatomical elements: fibers (width, lumen diameter and length), vessels (multiple vessels, diameter and frequency) and radial parenchyma (height). The radial variations of the wood anatomical elements were also influenced by the management regimes of the gmelina trees.
Abstract:
Tropical forests are characterized by diverse assemblages of plant and animal species compared to temperate forests. A corollary to this general rule is that most tree species, whether valued for timber or not, occur at low densities (<1 adult tree ha⁻¹) or may be locally rare. In the Brazilian Amazon, many of the most highly valued timber species occur at extremely low densities yet are intensively harvested with little regard for impacts on population structures and dynamics. These include big-leaf mahogany (Swietenia macrophylla), ipe (Tabebuia serratifolia and Tabebuia impetiginosa), jatoba (Hymenaea courbaril), and freijo cinza (Cordia goeldiana). Brazilian forest regulations prohibit harvests of species that meet the legal definition of rare - fewer than three trees per 100 ha - but treat all species populations exceeding this density threshold equally. In this paper we simulate logging impacts on a group of timber species occurring at low densities that are widely distributed across eastern and southern Amazonia, based on field data collected at four research sites since 1997, asking: under current Brazilian forest legislation, what are the prospects for second harvests on 30-year cutting cycles, given observed population structures, growth, and mortality rates? Ecologically 'rare' species constitute majorities in the commercial species assemblages in all but one of the seven large-scale inventories we analyzed from sites spanning the Amazon (range 49-100% of total commercial species). Although densities of only six of 37 study species populations met the Brazilian legal definition of a rare species, timber stocks of five of the six timber species declined substantially at all sites between first and second harvests in simulations based on legally allowable harvest intensities. Reducing species-level harvest intensity by increasing minimum felling diameters or increasing seed tree retention levels improved prospects for second harvests of those populations with a relatively high proportion of submerchantable stems, but did not dramatically improve projections for populations with relatively flat diameter distributions. We argue that restrictions on logging very low-density timber tree populations, such as the current Brazilian standard, provide inadequate minimum protection for vulnerable species. Population declines, even if reduced-impact logging (RIL) is eventually adopted uniformly, can be anticipated for a large pool of high-value timber species unless harvest intensities are adapted to timber species population ecology and silvicultural treatments are adopted to remedy poor natural stocking in logged stands. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
Eucalyptus is the dominant and most productive planted forest in Brazil, covering around 3.4 million ha for the production of charcoal, pulp, sawtimber, timber plates, wood foils, plywood and for building purposes. At the early establishment of the forest plantations, during the second half of the 1960s, the eucalypt yield was 10 m³ ha⁻¹ y⁻¹. Now, as a result of investments in research and technology, the average productivity is 38 m³ ha⁻¹ y⁻¹. The productivity restrictions are related to the following environmental factors, in order of importance: water deficits > nutrient deficiency > soil depth and strength. Clonal forests have been fundamental in sites with larger water and nutrient restrictions, where they out-perform those established from traditional seed-based planting stock. When the environmental limitations are small, the productivities of plantations based on clones or seeds appear to be similar. In the long term there are risks to sustainability, because of the low fertility and low reserves of primary minerals in the soils, which are commonly loamy and clayey oxisols and ultisols. Usually, a decline in soil quality is caused by management that does not conserve soil and site resources, that damages soil physical and chemical characteristics, or that applies insufficient or unbalanced fertiliser. The problem is more serious when fast-growing genotypes are planted, which have a high nutrient demand and uptake capacity, and therefore a high nutrient output through harvesting. The need to mobilise less soil by providing more cover and protection, reduce nutrient and organic matter losses, preserve crucial physical properties such as permeability (root growth, infiltration and aeration), improve weed control and reduce costs has led to a progressive increase in the use of minimum cultivation practices during the last 20 years, which has been accepted as a good alternative to maintain or increase site quality in the long term. In this paper we provide a synthesis and critical appraisal of the research results and practical implications of early silvicultural management on the long-term site productivity of fast-growing eucalypt plantations in the Brazilian context.
Abstract:
BACKGROUND: Defoliation by Anticarsia gemmatalis (Hubner), Pseudoplusia includens (Walker), Spodoptera eridania (Cramer), S. cosmioides (Walker) and S. frugiperda (JE Smith) (Lepidoptera: Noctuidae) was evaluated in four soybean genotypes. A multiple-species economic threshold (ET), based upon the species' feeding capacity, is proposed with the aim of improving growers' management decisions on when to initiate control measures for the species complex. RESULTS: Consumption by A. gemmatalis, S. cosmioides or S. eridania on the different genotypes was similar. The highest consumption by P. includens was 92.7 cm² on Codetec 219RR; that of S. frugiperda was 118 cm² on Codetec 219RR and 115.1 cm² on MSoy 8787RR. The insect injury equivalent for S. cosmioides, calculated on the basis of insect consumption, was double the standard consumption of A. gemmatalis, and statistically different from the other species tested, which were similar to each other. CONCLUSIONS: As S. cosmioides always defoliated nearly twice the leaf area of the other species, the injury equivalent would be 2 for this lepidopteran species and 1 for the other species. The recommended multiple-species ET to trigger the beginning of insect control would then be 20 insect equivalents per linear metre. (C) 2010 Society of Chemical Industry
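A small worked example of how the proposed threshold would be applied in field scouting, using the injury equivalents stated above; the field counts are hypothetical.

```python
# Injury equivalents from the abstract: 2 for S. cosmioides, 1 for the others.
INJURY_EQUIVALENT = {"S. cosmioides": 2, "A. gemmatalis": 1,
                     "P. includens": 1, "S. eridania": 1, "S. frugiperda": 1}

def insect_equivalents(counts_per_metre):
    # Weight each species count by its injury equivalent and sum.
    return sum(INJURY_EQUIVALENT[sp] * n for sp, n in counts_per_metre.items())

sample = {"S. cosmioides": 6, "A. gemmatalis": 9}   # hypothetical field sample
total = insect_equivalents(sample)                  # 6*2 + 9*1 = 21
# Control is triggered at 20 insect equivalents per linear metre.
print(total, "-> treat" if total >= 20 else "-> do not treat")
```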