901 results for Computer Network Resources
Abstract:
An administrative border might hinder the optimal allocation of a given set of resources by restricting the flow of goods, services, and people. In this paper we address the question: do administrative borders lead to poor accessibility to public services such as hospitals? In answering the question, we have examined the case of Sweden and its regional borders. We have used detailed data on the Swedish road network, its hospitals, and its geo-coded population. We have assessed the population's spatial accessibility to Swedish hospitals by computing the inhabitants' distance to the nearest hospital. We have also elaborated several scenarios, ranging from strongly confining regional borders to no border confinement at all, and recomputed the accessibility. Our findings imply that administrative borders only marginally worsen accessibility.
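The accessibility measure described above (each inhabitant's network distance to the nearest hospital) can be illustrated with a minimal sketch. The toy road graph, hospital nodes and population counts below are hypothetical, and the multi-source shortest-path call assumes an edge attribute named "length" holding road-segment lengths.

```python
import networkx as nx

# Toy road network: nodes are junctions, edge attribute "length" is road length in km.
G = nx.Graph()
G.add_weighted_edges_from(
    [("a", "b", 2.0), ("b", "c", 3.5), ("c", "d", 1.0), ("b", "d", 4.0), ("d", "e", 2.5)],
    weight="length",
)

hospitals = {"a", "e"}                         # hypothetical hospital locations (network nodes)
population = {"b": 1200, "c": 800, "d": 500}   # hypothetical geo-coded population per node

# One multi-source Dijkstra run gives every node its distance to the nearest hospital.
dist = nx.multi_source_dijkstra_path_length(G, hospitals, weight="length")

# Population-weighted mean distance as a simple accessibility indicator.
total_pop = sum(population.values())
mean_dist = sum(dist[n] * p for n, p in population.items()) / total_pop
print(f"mean distance to nearest hospital: {mean_dist:.2f} km")
```

A border scenario would simply restrict, for each population node, which hospital nodes may serve it before the distances are recomputed.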
Abstract:
An optimal location on the transport infrastructure is a key requirement for many decision-making processes. Most studies have focused on evaluating the performance of optimally locating p facilities by minimizing their distances to a geographically distributed demand (n) as p and n vary. The optimal locations are also sensitive to the geographical context, such as the road network, especially when demand and infrastructure are asymmetrically distributed in the plane. The influence of varying road network density is, however, not a well-studied problem, especially in a real-world context. This paper investigates how the density level of the road network affects finding optimal locations by solving the p-median location problem. A denser network is found to be needed when a larger number of facilities are to be located. The best solution is not always obtained in the most detailed network but at an intermediate density level. The solutions do not improve further, or improve only insignificantly, once the density exceeds 12,000 nodes, and some solutions even deteriorate. The hierarchy of network densities can be used according to location and transportation purposes and can increase the efficiency of heuristic methods. The method in this study can be applied to other location-allocation problems in transportation analysis where the road network density can be differentiated.
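As an illustration of the p-median setting studied here, the sketch below places p facilities among candidate nodes with a simple greedy heuristic over a precomputed distance matrix; the random data and the heuristic are illustrative assumptions, not the algorithm used in the paper.

```python
import numpy as np

def greedy_p_median(dist, p):
    """Greedy heuristic for the p-median problem.

    dist[i, j] = network distance from demand point i to candidate site j.
    Returns the chosen sites and the total demand-to-nearest-site distance.
    """
    n_demand, n_sites = dist.shape
    chosen = []
    best = np.full(n_demand, np.inf)         # current distance to nearest chosen site
    for _ in range(p):
        # Pick the site that most reduces the total assignment distance.
        totals = [np.minimum(best, dist[:, j]).sum() for j in range(n_sites)]
        j_star = int(np.argmin(totals))
        chosen.append(j_star)
        best = np.minimum(best, dist[:, j_star])
    return chosen, best.sum()

# Hypothetical 6 demand points x 4 candidate sites distance matrix.
rng = np.random.default_rng(0)
dist = rng.uniform(1.0, 10.0, size=(6, 4))
sites, cost = greedy_p_median(dist, p=2)
print(sites, round(cost, 2))
```

Rerunning such a heuristic on road networks thinned to different node counts is one simple way to observe the density effects the paper reports.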
Abstract:
For many years, drainage design was mainly about providing sufficient network capacity. This traditional approach has been successful with the aid of computer software and technical guidance. However, the drainage design criteria have been evolving due to rapid population growth, urbanisation, climate change and increasing sustainability awareness. Sustainable drainage systems that bring benefits in addition to water management have been recommended as better alternatives to conventional pipes and storages. Although the concepts and good-practice guidance have been communicated to decision makers and the public for years, network capacity still remains the key design focus in many circumstances, while the additional benefits are generally treated as secondary. Yet the picture is changing. The industry is beginning to realise that delivering multiple benefits should be given top priority, while the drainage service itself can instead be considered a secondary benefit. The shift in focus means the industry has to adapt to new design challenges, and new guidance and computer software are needed to assist decision makers. For this purpose, we developed a new decision support system. The system consists of two main components – a multi-criteria evaluation framework for drainage systems and a multi-objective optimisation tool. Users can systematically quantify the performance, life-cycle costs and benefits of different drainage systems using the evaluation framework. The optimisation tool can assist users in determining combinations of design parameters, such as the sizes, order and type of drainage components, that maximise multiple benefits. In this paper, we will focus on the optimisation component of the decision support framework. The formulation of the optimisation problem, its parameters and its general configuration will be discussed. We will also look at the sensitivity of individual variables and the benchmark results obtained using common multi-objective optimisation algorithms. The work described here is the output of an EngD project funded by EPSRC and XP Solutions.
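A minimal sketch of the kind of trade-off screening such an optimisation component performs: candidate drainage designs are scored on life-cycle cost and an aggregated benefit score, and dominated designs are discarded. The design names and scores are hypothetical; real use would rely on the evaluation framework and a full multi-objective algorithm rather than this simple Pareto filter.

```python
# Each candidate design: (name, life-cycle cost, aggregated benefit score) - all values illustrative.
designs = [
    ("pipes_only",       100.0, 2.0),
    ("pipes_plus_swale", 120.0, 5.5),
    ("pond_and_swale",   140.0, 5.0),
    ("permeable_paving", 160.0, 7.0),
]

def dominates(a, b):
    """a dominates b if a is no worse on both objectives (minimise cost,
    maximise benefit) and strictly better on at least one."""
    no_worse = a[1] <= b[1] and a[2] >= b[2]
    better = a[1] < b[1] or a[2] > b[2]
    return no_worse and better

# Keep only designs not dominated by any other candidate (the Pareto front).
pareto = [d for d in designs if not any(dominates(o, d) for o in designs if o is not d)]
for name, cost, benefit in pareto:
    print(f"{name}: cost={cost}, benefit={benefit}")
```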
Abstract:
This paper describes the formulation of a Multi-objective Pipe Smoothing Genetic Algorithm (MOPSGA) and its application to the least-cost water distribution network design problem. Evolutionary algorithms have been widely utilised for the optimisation of both theoretical and real-world non-linear optimisation problems, including water system design and maintenance problems. In this work we present a pipe smoothing based approach to the creation and mutation of chromosomes which utilises engineering expertise with a view to increasing the performance of the algorithm whilst promoting engineering feasibility within the population of solutions. MOPSGA is based upon the standard Non-dominated Sorting Genetic Algorithm-II (NSGA-II) and incorporates a modified population initialiser and mutation operator which directly target elements of a network with the aim of increasing network smoothness (in terms of progression from one diameter to the next) using network element awareness and an elementary heuristic. The pipe smoothing heuristic used in this algorithm is based upon a fundamental principle employed by water system engineers when designing water distribution pipe networks: the diameter of any pipe is never greater than the sum of the diameters of the pipes directly upstream, resulting in a transition from large to small diameters from the source to the extremities of the network. MOPSGA is assessed on a number of water distribution network benchmarks from the literature, including some real-world, large-scale systems. The performance of MOPSGA is directly compared to that of NSGA-II with regard to solution quality, engineering feasibility (network smoothness) and computational efficiency. MOPSGA is shown to promote both engineering and hydraulic feasibility whilst attaining good infrastructure costs compared to NSGA-II.
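The pipe smoothing principle described above (no pipe with a diameter greater than the sum of the diameters of the pipes directly upstream) can be expressed as a simple feasibility check. The network encoding below is a hypothetical illustration, not MOPSGA itself.

```python
def smoothness_violations(upstream, diameter):
    """Count pipes whose diameter exceeds the sum of the diameters
    of the pipes directly upstream of them.

    upstream[p] -> list of pipe ids feeding into pipe p (empty for source pipes).
    diameter[p] -> pipe diameter in mm.
    """
    violations = []
    for pipe, feeders in upstream.items():
        if feeders and diameter[pipe] > sum(diameter[f] for f in feeders):
            violations.append(pipe)
    return violations

# Hypothetical branched network: p1 and p2 feed p3, p3 feeds p4.
upstream = {"p1": [], "p2": [], "p3": ["p1", "p2"], "p4": ["p3"]}
diameter = {"p1": 150, "p2": 100, "p3": 300, "p4": 200}   # p3 violates: 300 > 150 + 100
print(smoothness_violations(upstream, diameter))          # ['p3']
```

A mutation operator can use such a check to target only the offending elements of a chromosome, which is the spirit of the network-element-aware heuristic described above.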
Abstract:
Many historians claim that we are entering a new era: the era of knowledge and information, the digital era. Two major strategic weapons emerge in this new global environment for companies to be competitive in the twenty-first century: creativity and integration. Many companies are adopting a new organisational structure, the network structure, as a solution for managing creativity and integration. This structure is not concerned with new ways of manipulating subordinates for one's own advantage. On the contrary, it challenges us to rethink the basics: our values, attitudes and assumptions about leadership, work and time. Conventional hierarchical structures do not provide the responsiveness the market demands today, because of the bureaucracy behind every activity. People specialise in small tasks, losing the meaning of their work and their intrinsic motivation. And since people are increasingly recognised as the most important capital of any enterprise, demotivation becomes disastrous for the future of any business. Reciprocity between company and individual is essential. This dissertation analyses the human factor in work carried out within the network structure, drawing a parallel between the proposals of this structure and human needs, and demonstrating the relationship between the organisational structure of creativity and integration and job satisfaction. First, a literature review is presented from three different perspectives. It explains how global transformations are affecting corporate strategy; it then shows the impact of the twenty-first-century strategy inside the organisation; finally, it focuses on the psychological side of the human being, on needs such as autonomy, competence and interpersonal relationships, and on intrinsic and extrinsic satisfaction factors. This makes it possible to assess the impact of a new organisational structure on employee motivation. Next, a pilot study of the satisfaction factors most relevant to people is presented, confirming the importance of intrinsic satisfaction factors. It also shows that satisfaction levels are directly affected by the business environment in which people work, according to their degree of autonomy. Finally, the conclusions of the work are presented, together with practical recommendations for changes in the organisational structure of a company, their costs and how they should be managed in the long term.
Abstract:
The work described in this thesis aims to support the distributed design of integrated systems and considers specifically the need for collaborative interaction among designers. Particular emphasis was given to issues which were only marginally considered in previous approaches, such as the abstraction of the distribution of design automation resources over the network, the possibility of both synchronous and asynchronous interaction among designers, and the support for extensible design data models. Such issues demand a rather complex software infrastructure, as possible solutions must encompass a wide range of software modules: from user interfaces to middleware to databases. To build such a structure, several engineering techniques were employed and some original solutions were devised. The core of the proposed solution is based on the joint application of two homonymous technologies: CAD Frameworks and object-oriented frameworks. The former concept was coined in the late 80's within the electronic design automation community and comprehends a layered software environment which aims to support CAD tool developers, CAD administrators/integrators and designers. The latter, developed during the last decade by the software engineering community, is a software architecture model to build extensible and reusable object-oriented software subsystems. In this work, we proposed to create an object-oriented framework which includes extensible sets of design data primitives and design tool building blocks. Such an object-oriented framework is included within a CAD Framework, where it plays important roles in typical CAD Framework services such as design data representation and management, versioning, user interfaces, design management and tool integration. The implemented CAD Framework - named Cave2 - followed the classical layered architecture presented by Barnes, Harrison, Newton and Spickelmier, but the possibilities granted by the use of the object-oriented framework foundations allowed a series of improvements which were not available in previous approaches:
- Object-oriented frameworks are extensible by design, so this also holds for the implemented sets of design data primitives and design tool building blocks. This means that both the design representation model and the software modules dealing with it can be upgraded or adapted to a particular design methodology, and such extensions and adaptations will still inherit the architectural and functional aspects implemented in the object-oriented framework foundation.
- The design semantics and the design visualization are both part of the object-oriented framework, but in clearly separated models. This allows for different visualization strategies for a given design data set, which gives collaborating parties the flexibility to choose individual visualization settings.
- The control of consistency between semantics and visualization - a particularly important issue in a design environment with multiple views of a single design - is also included in the foundations of the object-oriented framework. Such a mechanism is generic enough to also be used by further extensions of the design data model, as it is based on the inversion of control between view and semantics. The view receives the user input and propagates the event to the semantic model, which evaluates whether a state change is possible; if so, it triggers the change of state of both semantics and view. Our approach took advantage of this inversion of control and included a layer between semantics and view to take into account the possibility of multi-view consistency.
- To optimize the consistency control mechanism between views and semantics, we propose an event-based approach that captures each discrete interaction of a designer with his/her respective design views. The information about each interaction is encapsulated inside an event object, which may be propagated to the design semantics - and thus to other possible views - according to the consistency policy in use. Furthermore, the use of event pools allows for a late synchronization between view and semantics in case of unavailability of a network connection between them.
- The use of proxy objects significantly raised the abstraction of the integration of design automation resources, as either remote or local tools and services are accessed through method calls on a local object. The connection to remote tools and services using a look-up protocol also completely abstracts the network location of such resources, allowing for resource addition and removal during runtime.
- The implemented CAD Framework is completely based on Java technology, so it relies on the Java Virtual Machine as the layer which grants the independence between the CAD Framework and the operating system.
All these improvements contributed to a higher abstraction of the distribution of design automation resources and also introduced a new paradigm for the remote interaction between designers. The resulting CAD Framework is able to support fine-grained collaboration based on events, so every single design update performed by a designer can be propagated to the rest of the design team regardless of their location in the distributed environment. This can increase group awareness and allow a richer transfer of experiences among designers, significantly improving the collaboration potential when compared to previously proposed file-based or record-based approaches. Three different case studies were conducted to validate the proposed approach, each one focusing on a subset of the contributions of this thesis. The first uses the proxy-based resource distribution architecture to implement a prototyping platform using reconfigurable hardware modules. The second extends the foundations of the implemented object-oriented framework to support interface-based design; such extensions - design representation primitives and tool blocks - are used to implement a design entry tool named IBlaDe, which allows the collaborative creation of functional and structural models of integrated systems. The third case study regards the possibility of integrating multimedia metadata into the design data model. Such a possibility is explored in the frame of an online educational and training platform.
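A minimal sketch, in Python with hypothetical class names, of the inversion of control described above: the view captures user input, wraps it in an event object, and forwards it to the semantic model, which decides whether the state change is allowed and, if so, updates itself and notifies all registered views. The thesis implementation is Java-based; this is only a conceptual illustration.

```python
from dataclasses import dataclass

@dataclass
class EditEvent:
    """Encapsulates one discrete designer interaction."""
    element: str
    new_value: object

class SemanticModel:
    def __init__(self):
        self.state = {}
        self.views = []

    def register(self, view):
        self.views.append(view)

    def handle(self, event: EditEvent) -> bool:
        # Evaluate whether the state change is possible (toy rule: value must be present).
        if event.new_value is None:
            return False
        self.state[event.element] = event.new_value
        for view in self.views:              # propagate the accepted change to every view
            view.refresh(event)
        return True

class View:
    def __init__(self, name, model: SemanticModel):
        self.name = name
        self.model = model
        model.register(self)

    def user_input(self, element, value):
        # Inversion of control: the view never changes state itself,
        # it forwards the event to the semantics and waits for the refresh.
        return self.model.handle(EditEvent(element, value))

    def refresh(self, event: EditEvent):
        print(f"{self.name}: {event.element} -> {event.new_value}")

model = SemanticModel()
schematic, layout = View("schematic", model), View("layout", model)
schematic.user_input("net_width", 4)   # both registered views are refreshed
```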
Abstract:
The increase in the number of attacks on computer networks has been addressed by adding resources directly to the active router equipment of these networks. In this context, firewalls have become consolidated as essential elements in the process of controlling the packets entering and leaving a network. With the advent of intrusion detection systems (IDS), efforts have been made to incorporate their signature-based packet filtering, until then a passive element, into the filtering already performed by traditional firewalls. Despite the efficiency of this incorporation in blocking attacks with known signatures, filtering at the application level introduces a natural delay in the analyzed packets and can reduce the machine's performance in filtering the remaining packets, because of the machine resources demanded by this level of filtering. This work presents models that address this problem by re-routing packets for analysis by a sub-network with specific filters. The suggested implementation of this model aims to reduce the performance problem and to open space for scenarios in which other, non-conventional filtering solutions (spam blocking, P2P traffic control/blocking, etc.) can be inserted into the filtering sub-network without overloading the main firewall of a corporate network.
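A conceptual sketch of the re-routing model, with hypothetical rules and addresses: packets whose characteristics match a deep-inspection rule are diverted to the filtering sub-network, while all other traffic stays on the main firewall's fast path. This is only an illustration of the decision, not the paper's implementation.

```python
# Hypothetical next hops.
MAIN_FIREWALL = "10.0.0.1"       # conventional packet filtering, kept lightweight
FILTER_SUBNET = "10.0.10.1"      # sub-network hosting signature, spam and P2P filters

# Hypothetical diversion rules: (protocol, destination port) pairs needing deep inspection.
DEEP_INSPECTION = {("tcp", 25), ("tcp", 80), ("tcp", 6881)}

def next_hop(packet):
    """Decide where to forward a packet described as a dict of header fields."""
    key = (packet["proto"], packet["dport"])
    return FILTER_SUBNET if key in DEEP_INSPECTION else MAIN_FIREWALL

print(next_hop({"proto": "tcp", "dport": 25}))   # diverted to the filtering sub-network
print(next_hop({"proto": "udp", "dport": 53}))   # stays on the main firewall path
```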
Abstract:
Artificial neural networks are usually applied to solve complex problems. In more complex problems, increasing the number of layers and neurons can achieve greater functional efficiency; nevertheless, this leads to greater computational effort. Response time is an important factor in the decision to use neural networks in some systems. Many argue that the computational cost is higher in the training period; however, this phase is carried out only once. Once the network is trained, it is necessary to use the existing computational resources efficiently. In the multicore era, the problem boils down to the efficient use of all available processing cores, although the overhead of parallel computing must be taken into account. In this sense, this paper proposes a modular structure that proved to be more suitable for parallel implementations. We propose to parallelize the feedforward process of an MLP-type artificial neural network, implemented with OpenMP on a shared-memory computer architecture. The research consists of testing and analyzing execution times; speedup, efficiency and parallel scalability are analyzed. In the proposed approach, by reducing the number of connections between remote neurons, the response time of the network decreases and, consequently, so does the total execution time. The time required for communication and synchronization is directly linked to the number of remote neurons in the network, so it is necessary to investigate which distribution of remote connections is best.
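The paper's implementation uses OpenMP on a shared-memory machine; the sketch below only illustrates, in Python/NumPy, how a hidden layer can be split into modules so that each block is computed independently, with the activations crossing module boundaries being the "remote" values whose exchange drives communication and synchronization cost. The array shapes and the two-module split are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
x = rng.normal(size=64)                  # input vector (illustrative size)
W_hidden = rng.normal(size=(32, 64))     # hidden layer weights, 32 neurons
W_out = rng.normal(size=(4, 32))         # output layer weights

# Split the hidden layer into two modules; in the OpenMP version each module
# would be evaluated by a different thread/core in parallel.
blocks = np.array_split(np.arange(32), 2)
hidden = np.empty(32)
for idx in blocks:                       # conceptually a parallel loop
    hidden[idx] = sigmoid(W_hidden[idx] @ x)

# The output layer needs activations from every module: these cross-module
# values play the role of the "remote" neurons discussed above.
y = sigmoid(W_out @ hidden)
print(y.round(3))
```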
Abstract:
Urban stormwater can be considered both a potential water resource and a problem for the proper functioning of the manifold activities of a city, a problem resulting from inappropriate land use and occupation, usually due to poor planning of the occupation of development areas, with little care for the environmental aspects of surface runoff drainage. As a basic premise, mechanisms must be sought to preserve the natural flow regime at all stages of the development of an urban area, preserving the soil infiltration capacity at the scale of the urban area, understanding the mechanisms of natural drainage, and preserving the naturally dynamic areas of water courses, both in the main channel and in the secondary ones. These are challenges for sustainable urban development, in which modern development coexists harmoniously with economic, environmental and social quality. Integrated studies involving the quantity and quality of stormwater are absolutely necessary to achieve understanding and to obtain appropriate technologies, covering both the drainage problems and the use of the water when surface runoff is subjected to adequate management, for example its accumulation in detention reservoirs with the possibility of use for other purposes. This study aims to develop a computer model, adjusted to the prevailing conditions of an experimental urban watershed, in order to enable the implementation of management practices for water resources and hydrological simulations of the quantity and, in a preliminary way, the quality of the stormwater that flows to a pond located at the downstream end of the basin. To this end, the distributed model SWMM was used together with data collected in the basin at the highest possible resolution, so as to allow the simulation of diffuse loads and of the heterogeneous characteristics of the basin, both in terms of hydrological and hydraulic parameters and in terms of land use and occupation. This parallel work should improve the degree of understanding of the phenomena simulated in the basin, as well as the calibration of the models, and it is supported by monitoring data acquired, between 2006 and 2008, within the MAPLU project (Urban Stormwater Management), which belongs to the PROSAB network (Research Program in Basic Sanitation).
Abstract:
This work presents the design, simulation, and analysis of two optical interconnection networks for a Dataflow parallel computer architecture. To verify the performance of the optical interconnection networks on the Dataflow architecture, we analyzed the load balancing among the processors during the execution of parallel programs. Load balancing is a very important parameter because it is directly associated with the degree of parallelism of the dataflow model. This article shows that optical interconnection networks designed with simple optical devices can efficiently meet the dataflow requirements of a high-performance communication system.
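Load balancing among processors, used here as the performance indicator, can be summarised with simple dispersion measures; the per-processor instruction counts below are hypothetical.

```python
import numpy as np

# Hypothetical number of dataflow instructions executed by each processor.
load = np.array([9800, 10250, 10020, 9930])

mean = load.mean()
imbalance = load.max() / mean - 1.0      # relative excess work of the busiest processor
cv = load.std() / mean                   # coefficient of variation across processors
print(f"imbalance = {imbalance:.2%}, coefficient of variation = {cv:.2%}")
```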
Abstract:
Computerized technological resources have become essential in education, particularly for teaching topics that require the performance of specific tasks. These resources can effectively help the execution of such tasks and the teaching-learning process itself. After the development of a Web site on the topic of nursing staff scheduling, this study aimed at comparing the progress of students involved in the teaching-learning process of that topic with and without the use of computer technology. Two random groups of undergraduate nursing students from a public university in São Paulo state, Brazil, were organized: a case group (which used the Web site) and a control group (which did not). Data were collected from 2003 to 2005 after approval by the Research Ethics Committee. Results showed no significant difference in motivation or knowledge acquisition, and a similar performance for the two groups was verified. Other aspects observed were difficulty in doing the nursing staff scheduling exercise and the students' acknowledgment of the topic's importance for their training and professional lives; easy access was considered a positive aspect in favour of maintaining the Web site.
Abstract:
This article presents research resulting from online continuing education courses for teachers, conceived through a partnership between UNESP and a national network of basic education schools. The courses sought to familiarise mathematics teachers with the resources of computer technology, specifically two software packages, Geometricks and Winplot, with respect to their use in the classroom. Some years after the courses took place, the research described here aimed to identify whether and how the software was incorporated into professional practice, in a setting in which teachers can count on laboratories, continuing education and technical support. Through online interviews, we were able to map the teachers' different choices: non-use; use in a way similar (or not) to that experienced in the online course; and interdisciplinary use, showing the varied ways in which the teachers retranslated the course into their practice.
Abstract:
Teaching a course on special electric loads to power engineers in a continuing education program is a difficult task because they are not familiar with switched-topology circuits. Normally, in a typical program, many hours are dedicated to explaining the thyristor switching sequence and to drawing the converter current and terminal voltage waveforms for different operating conditions. This work presents teaching support software intended to optimize the time spent on this task and, mainly, to aid the assimilation of the proposed subjects by studying the static converter under different non-ideal operating conditions.
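As a small numerical companion to the kind of waveform study such software supports, the sketch below evaluates the classic continuous-conduction result for a single-phase fully-controlled thyristor bridge, Vdc = (2·Vm/π)·cos α, over several firing angles. This textbook expression and the supply values are used purely for illustration and are not taken from the paper.

```python
import numpy as np

Vm = 311.0                                  # peak of a 220 V RMS supply (illustrative)
angles_deg = [0, 30, 60, 90]                # firing angles to evaluate
alphas = np.deg2rad(angles_deg)

# Average output voltage of a single-phase fully-controlled bridge,
# assuming continuous conduction and ideal devices.
Vdc = (2 * Vm / np.pi) * np.cos(alphas)
for a, v in zip(angles_deg, Vdc):
    print(f"alpha = {a:3d} deg -> Vdc = {v:7.1f} V")
```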
Abstract:
Piecewise-Linear Programming (PLP) is an important area of Mathematical Programming and concerns the minimisation of a convex separable piecewise-linear objective function, subject to linear constraints. In this paper a subarea of PLP called Network Piecewise-Linear Programming (NPLP) is explored. The paper presents four specialised algorithms for NPLP: (Strongly Feasible) Primal Simplex, Dual Method, Out-of-Kilter and (Strongly Polynomial) Cost-Scaling, and their relative efficiency is studied. A statistically designed experiment is used to perform a computational comparison of the algorithms. The response variable observed in the experiment is the CPU time to solve randomly generated network piecewise-linear problems, classified according to problem class (Transportation, Transshipment and Circulation), problem size, extent of capacitation, and number of breakpoints per arc. Results and conclusions on the performance of the algorithms are reported.
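A standard way to see the connection between these network problems and ordinary min-cost flow, sketched below with hypothetical data: each convex piecewise-linear arc cost is split at its breakpoints into linear pieces with increasing unit costs, after which an off-the-shelf min-cost-flow routine applies. This textbook reformulation is only an illustration, not one of the specialised algorithms compared in the paper.

```python
import networkx as nx

# Ship 4 units from s to t; the arc cost is convex piecewise-linear:
# unit cost 1 for the first 2 units of flow, unit cost 3 afterwards.
segments = [(2, 1), (10, 3)]              # (capacity, unit cost) of each linear piece

G = nx.DiGraph()
G.add_node("s", demand=-4)                # supply node
G.add_node("t", demand=4)                 # demand node
for k, (cap, cost) in enumerate(segments):
    seg = f"seg{k}"                       # auxiliary node so each piece becomes its own arc
    G.add_edge("s", seg, capacity=cap, weight=cost)
    G.add_edge(seg, "t", capacity=cap, weight=0)

flow = nx.min_cost_flow(G)
print(nx.cost_of_flow(G, flow))           # 2*1 + 2*3 = 8
```

Because the slopes increase with flow (convexity), the solver naturally fills the cheaper pieces first, which is what makes this reformulation valid.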
Abstract:
The ability of neural networks to realize some complex nonlinear functions makes them attractive for system identification. This paper describes a novel barrier method using artificial neural networks to solve robust parameter estimation problems for nonlinear models with unknown-but-bounded errors and uncertainties. This problem can be represented by a typical constrained optimization problem. More specifically, a modified Hopfield network is developed and its internal parameters are computed using the valid-subspace technique. These parameters guarantee the network's convergence to its equilibrium points. A solution to the robust estimation problem with unknown-but-bounded error corresponds to an equilibrium point of the network. Simulation results are presented as an illustration of the proposed approach.
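To give a feel for the kind of network dynamics involved, the sketch below iterates a continuous Hopfield-style update that performs gradient descent on a simple quadratic energy E(v) = ½ vᵀQv + cᵀv with box-bounded states. The energy, the projection onto the bounds and the step size are illustrative assumptions, not the valid-subspace construction of the paper.

```python
import numpy as np

# Illustrative quadratic energy E(v) = 0.5 v^T Q v + c^T v with box constraints on v.
Q = np.array([[2.0, 0.5], [0.5, 1.0]])
c = np.array([-1.0, -0.5])
lo, hi = -1.0, 1.0                       # unknown-but-bounded box for the states

v = np.zeros(2)
eta = 0.1                                # step size of the discretised dynamics
for _ in range(200):
    grad = Q @ v + c                     # dE/dv
    v = np.clip(v - eta * grad, lo, hi)  # descend the energy, keep the state in the box
print(v.round(4))                        # an equilibrium point of these dynamics
```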