869 results for NETWORK DESIGN PROBLEMS


Relevance:

80.00%

Publisher:

Abstract:

Since computer viruses pose a serious problem to individual and corporate computer systems, a lot of effort has been dedicated to studying how to avoid their deleterious actions, trying to create anti-virus programs that act as vaccines in personal computers or in strategic network nodes. Another way to combat virus propagation is to establish preventive policies based on the operation of the system as a whole, which can be modeled with population models similar to those used in epidemiological studies. Here, a modified version of the SIR (Susceptible-Infected-Removed) model is presented, and how its parameters relate to network characteristics is explained. Then, disease-free and endemic equilibrium points are calculated, stability and bifurcation conditions are derived, and some numerical simulations are shown. The relations among the model parameters in the several bifurcation conditions allow a network design that minimizes virus risks. (C) 2009 Elsevier Inc. All rights reserved.
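
The abstract does not give the modified model's equations, so the sketch below only illustrates the family it builds on: an SIR model with node turnover, integrated with a simple forward-Euler loop. All parameter values are illustrative assumptions, not the paper's network-dependent parameters.

```python
# SIR dynamics with node turnover (rate mu), used here only as an
# illustration of the model family the paper builds on; the paper's
# modified SIR ties these parameters to network characteristics, which
# is not reproduced here. All numeric values are illustrative.
beta, gamma, mu = 0.4, 0.1, 0.02     # infection, removal, turnover rates
S, I, R = 0.99, 0.01, 0.0            # initial fractions of network nodes
dt, steps = 0.05, 4000

for _ in range(steps):
    dS = mu - beta * S * I - mu * S
    dI = beta * S * I - gamma * I - mu * I
    dR = gamma * I - mu * R
    S, I, R = S + dt * dS, I + dt * dI, R + dt * dR

# For this family, R0 = beta / (gamma + mu) separates the regimes:
# R0 <= 1 -> the disease-free equilibrium is stable,
# R0 >  1 -> an endemic equilibrium with I > 0 appears.
R0 = beta / (gamma + mu)
print(f"R0 = {R0:.2f}, infected fraction after the run: I = {I:.4f}")
```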

Relevance:

80.00%

Publisher:

Abstract:

OctVCE is a Cartesian cell CFD code developed especially for numerical simulations of shock and blast wave interactions with complex geometries. Virtual Cell Embedding (VCE) was chosen as its Cartesian cell kernel as it is simple to code and sufficient for practical engineering design problems. This also makes the code much more 'user-friendly' than structured grid approaches, as the gridding process is done automatically. The CFD methodology relies on a finite-volume formulation of the unsteady Euler equations, solved using a standard explicit Godunov (MUSCL) scheme. Both octree-based adaptive mesh refinement and shared-memory parallel processing capability have also been incorporated. For further details on the theory behind the code, see the companion report 2007/12.
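
The companion report holds the full details; purely as a rough illustration of the finite-volume formulation mentioned above, the sketch below advances the 1D unsteady Euler equations with a first-order Godunov-type update using a Rusanov flux. It is a deliberately simplified stand-in: OctVCE's MUSCL reconstruction, Cartesian VCE cells and octree refinement are not reproduced here.

```python
import numpy as np

gamma = 1.4

def flux(U):
    """Physical flux of the 1D Euler equations for conserved vars [rho, rho*u, E]."""
    rho, mom, E = U
    u = mom / rho
    p = (gamma - 1.0) * (E - 0.5 * rho * u**2)
    return np.array([mom, mom * u + p, u * (E + p)])

def rusanov(UL, UR):
    """Local Lax-Friedrichs (Rusanov) numerical flux at a cell interface."""
    def wavespeed(U):
        rho, mom, E = U
        u = mom / rho
        p = (gamma - 1.0) * (E - 0.5 * rho * u**2)
        return abs(u) + np.sqrt(gamma * p / rho)
    s = max(wavespeed(UL), wavespeed(UR))
    return 0.5 * (flux(UL) + flux(UR)) - 0.5 * s * (UR - UL)

# Sod shock-tube initial condition on a uniform 1D grid of finite-volume cells.
nx, dx = 200, 1.0 / 200
U = np.zeros((3, nx))
U[0, : nx // 2], U[0, nx // 2 :] = 1.0, 0.125                             # density
U[2, : nx // 2], U[2, nx // 2 :] = 1.0 / (gamma - 1), 0.1 / (gamma - 1)   # energy (u = 0)

t, t_end, cfl = 0.0, 0.2, 0.5
while t < t_end:
    speeds = np.abs(U[1] / U[0]) + np.sqrt(
        gamma * (gamma - 1.0) * (U[2] - 0.5 * U[1] ** 2 / U[0]) / U[0]
    )
    dt = min(cfl * dx / speeds.max(), t_end - t)
    F = np.array([rusanov(U[:, i], U[:, i + 1]) for i in range(nx - 1)]).T
    # Conservative finite-volume update of the interior cells.
    U[:, 1:-1] -= dt / dx * (F[:, 1:] - F[:, :-1])
    t += dt

print("density range after the shock-tube run:", U[0].min(), U[0].max())
```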

Relevance:

80.00%

Publisher:

Abstract:

A graph clustering algorithm constructs groups of closely related parts and machines separately. After the groups are matched for the fewest intercell moves, a refining process runs on the initial cell formation to decrease the number of intercell moves. A simple modification of this main approach can deal with some practical constraints, such as the popular constraint of bounding the maximum number of machines in a cell. Our approach yields a large improvement in computational time. More importantly, an improvement is also seen in the number of intercell moves when the computational results are compared with the best known solutions from the literature. (C) 2009 Elsevier Ltd. All rights reserved.
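
As a hedged sketch of the objective this kind of cell-formation procedure minimizes: given a binary machine-part incidence matrix and a tentative assignment of machines and parts to cells, count the operations that fall outside their part's cell (each such exceptional element implies an intercell move). The matrix and assignments below are invented for illustration; the paper's graph-clustering, matching and refinement steps are not reproduced.

```python
import numpy as np

def intercell_moves(incidence, machine_cell, part_cell):
    """Count operations whose machine lies outside the cell of the part
    (exceptional elements), a standard proxy for intercell moves."""
    moves = 0
    n_machines, n_parts = incidence.shape
    for m in range(n_machines):
        for p in range(n_parts):
            if incidence[m, p] and machine_cell[m] != part_cell[p]:
                moves += 1
    return moves

# Tiny illustrative instance: 4 machines x 5 parts, two cells.
incidence = np.array([
    [1, 1, 0, 0, 1],
    [1, 1, 0, 0, 0],
    [0, 0, 1, 1, 0],
    [0, 1, 1, 1, 0],
])
machine_cell = [0, 0, 1, 1]          # cell of each machine
part_cell    = [0, 0, 1, 1, 0]       # cell of each part

print("intercell moves:", intercell_moves(incidence, machine_cell, part_cell))
```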

Relevance:

80.00%

Publisher:

Abstract:

The goal of the present study is to map the possible contributions of participatory online platforms to citizen actions that may help in the fight against cancer and its associated consequences. The research is based on the analysis of online solidarity networks, namely those residing on Facebook and the blogosphere, that citizens have gradually been resorting to. It also draws on the development of newer and more efficient solutions that provide individuals (directly or indirectly affected by oncological disease) with the means to overcome feelings of impotence and fatality. In this chapter, the authors summarize how citizens and institutions use these decentralized, freer participatory platforms, while attempting to unravel existing hype and stigma; they also provide a first survey of the importance and role of institutions in this kind of endeavor; lastly, they present a prototype, developed in the context of the present study, that is specifically dedicated to addressing oncology through social media.

Relevance:

80.00%

Publisher:

Abstract:

The interest in the development of climbing robots has grown rapidly in recent years. Climbing robots are useful devices that can be adopted in a variety of applications, such as maintenance and inspection in the process and construction industries. These systems are mainly adopted in places where direct access by a human operator is very expensive, because of the need for scaffolding, or very dangerous, due to the presence of a hostile environment. The main motivations are to increase operational efficiency, by eliminating the costly assembly of scaffolding, and to protect human health and safety in hazardous tasks. Several climbing robots have already been developed, and others are under development, for applications ranging from cleaning to the inspection of difficult-to-reach constructions.

A wall-climbing robot should not only be light but also have a large payload, so that it may reduce excessive adhesion forces and carry instrumentation during navigation. These machines should be capable of travelling over different types of surfaces, with different inclinations, such as floors, walls, or ceilings, and of moving between such surfaces (Elliot et al. (2006); Sattar et al. (2002)). Furthermore, they should be able to adapt and reconfigure for various environmental conditions and be self-contained. Up to now, considerable research has been devoted to these machines and various types of experimental models have been proposed (according to Chen et al. (2006), over 200 prototypes aimed at such applications had been developed in the world by the year 2006). However, the application of climbing robots is still limited. Apart from a couple of successful industrialized products, most are only prototypes and few of them can be found in common use, due to unsatisfactory performance in on-site tests (regarding aspects such as speed, cost and reliability). Chen et al. (2006) present the main design problems affecting the system performance of climbing robots and also suggest solutions to these problems.

The two major issues in the design of wall-climbing robots are their locomotion and adhesion methods. With respect to locomotion, four types are often considered: crawler, wheeled, legged and propulsion robots. Although the crawler type is able to move relatively fast, it is not adequate for rough environments. On the other hand, the legged type easily copes with obstacles found in the environment, but its speed is generally lower and it requires complex control systems. Regarding adhesion to the surface, the robots should be able to produce a secure gripping force using a lightweight mechanism. The adhesion method is generally classified into four groups: suction force, magnetic, gripping to the surface, and thrust force. Recently, however, new methods for assuring adhesion, based on biological findings, have been proposed. The vacuum principle is light and easy to control, though it presents the problem of supplying compressed air; an alternative, with costs in terms of weight, is the adoption of a vacuum pump. The magnetic principle implies heavy actuators and is used only on ferromagnetic surfaces. Thrust force robots make use of the forces developed by thrusters to adhere to surfaces, but are used in very restricted and specific applications.
Bearing these facts in mind, this chapter presents a survey of different applications and technologies adopted for the implementation of climbing robot locomotion and adhesion to surfaces, focusing on the new technologies that have recently been developed to fulfill these objectives. The chapter is organized as follows. Section two presents several applications of climbing robots. Sections three and four present the main locomotion principles and the main "conventional" technologies for adhering to surfaces, respectively. Section five describes recent biologically inspired technologies for robot adhesion to surfaces. Section six introduces several new architectures for climbing robots. Finally, section seven outlines the main conclusions.

Relevance:

80.00%

Publisher:

Abstract:

The emergence of a new paradigm for the design of multiprocessor systems, the Network-on-Chip (NoC), requires a way to adapt existing IP cores and allow them to be connected to the network. This project presents the design of an interface that adapts an existing IP core, the LEON3, from the AMBA bus protocol to the network protocol. In this way, and building on interface ideas discussed in the state of the art, the processor is decoupled from the design and topology of the network.
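
As a purely conceptual sketch of what such a network interface does (the actual design is hardware, built around the LEON3's AMBA bus signals and a specific NoC flit format, neither of which is reproduced here), the Python below models the decoupling idea: bus-style read/write transactions on one side are wrapped into packets addressed by node id on the other, so the core never sees the network topology. All class and field names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class BusTransaction:
    """A simplified AMBA-like transaction as seen by the IP core."""
    write: bool
    address: int
    data: int | None = None

@dataclass
class NocPacket:
    """A simplified NoC packet; routing uses only the destination node id."""
    dest: int
    payload: dict

class NetworkInterface:
    """Wraps bus transactions into packets and unwraps replies, so the
    core is decoupled from the network topology (illustrative only)."""
    def __init__(self, address_map):
        # address_map: address-range base -> destination node id (assumed mapping).
        self.address_map = address_map

    def to_packet(self, tx: BusTransaction) -> NocPacket:
        dest = self.address_map[tx.address & 0xFFFF0000]
        return NocPacket(dest, {"write": tx.write, "addr": tx.address, "data": tx.data})

    def to_bus_reply(self, pkt: NocPacket) -> int:
        return pkt.payload["data"]

ni = NetworkInterface({0x40000000: 3})
print(ni.to_packet(BusTransaction(write=True, address=0x40000010, data=0xDEAD)))
```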

Relevance:

80.00%

Publisher:

Abstract:

Purpose of the evaluation: This is a scheduled standard mid-term evaluation (MTR) of a UNDP-implemented, GEF LDCF co-financed project. It is conducted by a team consisting of an international and a national independent evaluator. The objective of the MTR, as set out in the Terms of Reference (TORs; Annex 1), is to provide an independent analysis of the progress of the project so far. The MTR aims to:
- identify potential project design problems,
- assess progress towards the achievement of the project objective and outcomes,
- identify and document lessons learned (including lessons that might improve the design and implementation of other projects, including UNDP-GEF supported projects), and
- make recommendations regarding specific actions that should be taken to improve the project.
The MTR is intended to assess signs of project success or failure and identify the necessary changes to be made. The project commenced implementation in the first half of 2010 with the recruitment of project staff. According to the updated project plan, it is due to close in July 2014, with operations scaling down in December 2013 due to funding limits. Because of a slow implementation start, the mid-term evaluation was delayed to July 2013. The intended target audience of the evaluation is:
- the project team and decision makers in the INGRH,
- the GEF and UNFCCC Operational Focal Points,
- the project partners and beneficiaries,
- UNDP in Cape Verde as well as the regional and headquarters (HQ) office levels, and
- the GEF Secretariat.

Relevance:

80.00%

Publisher:

Abstract:

Knowledge about spatial biodiversity patterns is a basic criterion for reserve network design. Although herbarium collections hold large quantities of information, the data are often scattered and cannot supply complete spatial coverage. Alternatively, herbarium data can be used to fit species distribution models, whose predictions provide complete spatial coverage and can be used to derive species richness maps. Here, we build on previous efforts to propose an improved compositionalist framework for using species distribution models to better inform conservation management. We illustrate the approach with models fitted with six different methods and combined using an ensemble approach for 408 plant species in a tropical and megadiverse country (Ecuador). As a complementary view to the traditional richness hotspot methodology, which consists of a simple stacking of species distribution maps, the compositionalist modelling approach used here combines separate predictions for different pools of species to identify areas of alternative suitability for conservation. Our results show that the compositionalist approach captures the established protected areas better than the traditional richness hotspot strategies and allows the identification of areas in Ecuador that would optimally complement the current protection network. Further studies should aim at refining the approach with more groups and additional species information.
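
As a hedged illustration of the traditional richness-hotspot baseline the abstract contrasts with, the sketch below stacks binary species distribution predictions into a richness map and also summarizes richness per species pool, a rough analogue of keeping separate predictions for different pools. It is not the authors' compositionalist framework or their ensemble of six modelling methods; all arrays are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary presence/absence predictions: species x grid cells
# (stand-ins for thresholded species distribution model outputs).
n_species, n_cells = 40, 1000
predictions = rng.random((n_species, n_cells)) < 0.2

# Traditional approach: stack all species into a single richness map,
# then flag the top 5% richest cells as "hotspots".
richness = predictions.sum(axis=0)
hotspots = richness >= np.quantile(richness, 0.95)

# Pool-wise view: keep separate richness surfaces for different species
# pools (here an arbitrary split into two pools for illustration).
pools = {"pool_A": slice(0, 20), "pool_B": slice(20, 40)}
pool_richness = {name: predictions[idx].sum(axis=0) for name, idx in pools.items()}

print("hotspot cells:", int(hotspots.sum()))
print({name: int(r.max()) for name, r in pool_richness.items()})
```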

Relevance:

80.00%

Publisher:

Abstract:

A workshop recently held at the Ecole Polytechnique Federale de Lausanne (EPFL, Switzerland) was dedicated to understanding the genetic basis of adaptive change, taking stock of the different approaches developed in theoretical population genetics and landscape genomics and bringing together knowledge accumulated in both research fields. Indeed, an important challenge in theoretical population genetics is to incorporate the effects of demographic history and population structure. But important design problems (e.g. the focus on populations as units, the focus on hard selective sweeps, and the lack of a hypothesis-based framework in the design of the statistical tests) reduce the capability of these approaches to detect adaptive genetic variation. In parallel, landscape genomics offers a solution to several of these problems and provides a number of advantages (e.g. fast computation, integration of landscape heterogeneity). But the approach makes several implicit assumptions that should be carefully considered (e.g. that selection has had enough time to create a functional relationship between the allele distribution and the environmental variable, or that this functional relationship is constant). To address the respective strengths and weaknesses mentioned above, the workshop brought together a panel of experts from both disciplines to present their work and discuss the relevance of combining these approaches, possibly resulting in a joint software solution in the future.

Relevance:

80.00%

Publisher:

Abstract:

The safe use of nuclear power plants (NPPs) requires a deep understanding of the functioning of the physical processes and systems involved. Studies on thermal hydraulics have been carried out in various separate-effects and integral test facilities at Lappeenranta University of Technology (LUT), either to ensure the functioning of the safety systems of light water reactors (LWR) or to produce validation data for the computer codes used in the safety analyses of NPPs. Several examples of safety studies on the thermal hydraulics of nuclear power plants are discussed. The studies relate to physical phenomena present in different processes in NPPs, such as rewetting of the fuel rods, emergency core cooling (ECC), natural circulation, small break loss-of-coolant accidents (SBLOCA), non-condensable gas release and transport, and passive safety systems. Studies on both VVER and advanced light water reactor (ALWR) systems are included. The set of cases includes separate-effects tests for understanding and modeling a single physical phenomenon, separate-effects tests to study the behavior of an NPP component or a single system, and integral tests to study the behavior of the whole system. The following steps can be found in the studies, though not necessarily all in the same study. Experimental studies as such have provided solutions to existing design problems. Experimental data have been created to validate a single model in a computer code. Validated models are used in various transient analyses of scaled facilities or NPPs. Integral test data are used to validate the computer codes as a whole, to see how the implemented models work together in a code. In the final stage, test results from the facilities are transferred to the NPP scale using computer codes. Some of the experiments have confirmed the expected behavior of the system or procedure studied; in other experiments, certain unexpected phenomena have caused changes to the original design to avoid the recognized problems. This is the main motivation for experimental studies on the thermal hydraulics of NPP safety systems. Naturally, the behavior of new system designs has to be checked with experiments, but so does that of existing designs if they are applied in conditions that differ from those they were originally designed for. New procedures for existing reactors and new safety-related systems have been developed for new nuclear power plant concepts. New experiments have been continuously needed.

Relevance:

80.00%

Publisher:

Abstract:

This master's thesis defines planning principles for an electricity distribution network. The planning principles give network planners guidance on how to design the electricity network in an electrotechnically correct way while taking economic aspects into account. At the beginning of the work, the electrotechnical and economic calculation parameters that affect all planning tasks are defined. Using the correct parameters is absolutely essential for obtaining realistic results. In the final part of the work, a sensitivity analysis is performed on the calculation parameters that have the greatest influence on the results, so that possibly different future conditions can be taken into account. The sections dealing with the actual planning cover, in addition to medium- and low-voltage network planning, the dimensioning of the distribution transformer, and define limits for different network structures. Life-cycle cost thinking is emphasized in the planning tasks in particular, by taking into account the costs arising over the entire holding period of the components.
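
Since the abstract stresses life-cycle cost thinking over a component's entire holding period, a minimal present-value sketch is added here; the cost figures, interest rate and holding period are invented for illustration and are not the thesis's actual planning parameters.

```python
def life_cycle_cost(investment, annual_cost, interest_rate, holding_years):
    """Present value of the investment plus discounted annual costs
    (losses, maintenance, outage costs) over the holding period."""
    discounted = sum(
        annual_cost / (1.0 + interest_rate) ** year
        for year in range(1, holding_years + 1)
    )
    return investment + discounted

# Illustrative comparison of two hypothetical conductor choices.
option_small = life_cycle_cost(investment=20_000, annual_cost=1_800,
                               interest_rate=0.05, holding_years=40)
option_large = life_cycle_cost(investment=26_000, annual_cost=1_100,
                               interest_rate=0.05, holding_years=40)
print(f"small conductor: {option_small:,.0f}  large conductor: {option_large:,.0f}")
```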

Relevance:

80.00%

Publisher:

Abstract:

The goal of this work was to prepare a development plan for the medium-voltage network of KSS Energia Oy. For this purpose, the present state of the network and its performance in back-up supply situations were investigated. A load forecast extending to the year 2020 was prepared for the plan. The development plan identified the areas where load capacity, back-up supply capability, or short-circuit withstand would become obstacles to reliable electricity distribution in the future. The focus areas of the work became the Vuolenkoski, Paimenpolku, and Valkeala areas. To meet the growing power demand in these areas, options for building new substations were investigated. The plan examined and compared the construction of three new substations against the alternative of renovating the medium-voltage network. The work also investigated how the additional power demand caused by load growth in the area can be supplied. The load growth in these areas requires the construction of new substations within the next 10 years.

Relevance:

80.00%

Publisher:

Abstract:

Over 70% of the total costs of an end product are consequences of decisions made during the design process. A search for optimal cross-sections will often have only a marginal effect on the amount of material used if the geometry of a structure is fixed and if the cross-sectional characteristics of its elements are properly designed by conventional methods. In recent years, optimal geometry has become a central area of research in the automated design of structures. It is generally accepted that no single optimisation algorithm is suitable for all engineering design problems; an appropriate algorithm must therefore be selected individually for each optimisation situation. Modelling is the most time-consuming phase in the optimisation of steel and metal structures. In this research, the goal was to develop a method and computer program which reduce the modelling and optimisation time in structural design. The program needed an optimisation algorithm suitable for various engineering design problems. Because finite element modelling is commonly used in the design of steel and metal structures, the interaction between a finite element tool and the optimisation tool needed a practical solution. The developed method and computer programs were tested with standard optimisation tests and practical design optimisation cases. Three generations of computer programs were developed. The programs combine an optimisation problem modelling tool and an FE-modelling program using three alternative methods. The modelling and optimisation were demonstrated in the design of a new boom construction and in the steel structures of flat and ridge roofs. This thesis demonstrates that the most time-consuming modelling work is significantly reduced, modelling errors are reduced, and the results are more reliable. A new selection rule for the evolution algorithm, which eliminates the need for constraint weight factors, is tested with optimisation cases of steel structures that include hundreds of constraints. The tested algorithm can be used almost as a black box, without parameter settings or penalty factors for the constraints.
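
The thesis's exact selection rule is not given in the abstract; the sketch below shows one well-known penalty-free alternative of the same flavour, a feasibility-based tournament comparison (Deb's rules) inside a toy evolutionary loop, so the "no constraint weight factors" idea is concrete. The test problem and all parameters are illustrative, not the thesis's structural cases.

```python
import random

def evaluate(x):
    """Toy constrained problem: minimize sum of squares subject to sum(x) >= 1.
    Returns (objective, total constraint violation)."""
    objective = sum(v * v for v in x)
    violation = max(0.0, 1.0 - sum(x))
    return objective, violation

def better(a, b):
    """Feasibility-based comparison (no penalty weights):
    feasible beats infeasible; among feasible, lower objective wins;
    among infeasible, lower violation wins."""
    (fa, va), (fb, vb) = a[1], b[1]
    if va == 0.0 and vb == 0.0:
        return a if fa <= fb else b
    if va == 0.0 or vb == 0.0:
        return a if va == 0.0 else b
    return a if va <= vb else b

random.seed(1)
dim, pop_size, generations = 5, 30, 200
population = [[random.uniform(-2, 2) for _ in range(dim)] for _ in range(pop_size)]
scored = [(x, evaluate(x)) for x in population]

for _ in range(generations):
    parent = better(random.choice(scored), random.choice(scored))[0]
    child = [v + random.gauss(0.0, 0.1) for v in parent]       # simple mutation
    scored_child = (child, evaluate(child))
    worst_idx = max(range(pop_size), key=lambda i: (scored[i][1][1], scored[i][1][0]))
    if better(scored_child, scored[worst_idx]) is scored_child:
        scored[worst_idx] = scored_child

best = min((s for s in scored if s[1][1] == 0.0), key=lambda s: s[1][0], default=None)
print("best feasible objective:", None if best is None else round(best[1][0], 4))
```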

Relevance:

80.00%

Publisher:

Abstract:

This master's thesis belongs to the research field of telecommunications network planning and fundamentally concerns network modelling. Telecommunications network planning is a complex and demanding problem that involves intricate and time-consuming tasks. This thesis introduces a "multilayer network model" intended to help network planners cope with the complexity of these problems and to reduce the time spent on network planning. The multilayer network model is based on generic objects that are common to all telecommunications networks. This makes the model applicable to arbitrary networks, regardless of network-specific features or the technologies used to implement the network. The model defines a precise terminology and uses three concepts: plane separation, layering, and partitioning. These concepts are described in detail in this work. The internal structure and functionality of the multilayer network model are defined using Unified Modelling Language (UML) notation. The work presents the model's use case, package, and class diagrams. The thesis also presents results obtained by comparing the multilayer network model with other network models. The results show that the multilayer network model has advantages over the other models.
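
Purely as an illustration of the layering and partitioning vocabulary (the thesis defines the model in UML, and its actual class structure is not reproduced here), a minimal sketch of generic, technology-independent network objects could look like the following; all class names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str

@dataclass
class Link:
    a: Node
    b: Node

@dataclass
class Layer:
    """One network layer (e.g. optical, IP) built from generic objects,
    independent of the technology that implements it."""
    name: str
    nodes: list[Node] = field(default_factory=list)
    links: list[Link] = field(default_factory=list)

@dataclass
class Partition:
    """A subset of a layer's nodes handled as one planning unit."""
    name: str
    members: list[Node] = field(default_factory=list)

# Two layers of the same network, plus a partition of the upper layer.
n1, n2, n3 = Node("n1"), Node("n2"), Node("n3")
optical = Layer("optical", [n1, n2, n3], [Link(n1, n2), Link(n2, n3)])
ip = Layer("ip", [n1, n3], [Link(n1, n3)])     # logical link over the optical path
core = Partition("core", [n1, n3])
print(optical.name, len(optical.links), "links;", ip.name, len(ip.links), "link(s);",
      "partition", core.name, "has", len(core.members), "members")
```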

Relevance:

80.00%

Publisher:

Abstract:

A well-known drawback of IP networks is that they cannot guarantee a specific quality of service (QoS) for the packets they transmit. The following two techniques are considered the most promising for providing quality of service: Differentiated Services (DiffServ) and QoS routing. DiffServ is a fairly recent QoS mechanism defined by the IETF for the Internet. DiffServ offers scalable service differentiation without per-hop signalling or per-flow state management. DiffServ is a good example of distributed network design. The goal of this service-level mechanism is to simplify the design of communication systems: a network node can be built from a small, well-defined set of building blocks. QoS routing is a routing mechanism in which traffic routes are determined on the basis of the network's available resources. This work examines a new QoS routing approach called Simple Multipath Routing. The purpose of this work is to design a QoS controller for DiffServ. The QoS controller proposed in this work is an attempt to combine DiffServ and QoS routing mechanisms. The experimental part of the work focuses in particular on QoS routing algorithms.
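
As a hedged sketch of the QoS-routing side (not the Simple Multipath Routing scheme or the proposed DiffServ QoS controller themselves, whose details the abstract does not give), the code below implements a common building block: prune links whose available bandwidth is below the request and run Dijkstra on what remains. The topology and bandwidth figures are invented.

```python
import heapq

def qos_route(graph, src, dst, min_bandwidth):
    """Shortest path (by link cost) using only links whose available
    bandwidth meets the request -- a basic QoS-routing building block.
    graph: node -> list of (neighbor, cost, available_bandwidth)."""
    dist, prev = {src: 0.0}, {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, cost, bw in graph.get(u, []):
            if bw < min_bandwidth:
                continue                      # link cannot carry the flow
            nd = d + cost
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    if dst not in dist:
        return None
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return list(reversed(path))

# Invented example topology: (neighbor, cost, available bandwidth in Mbit/s).
graph = {
    "A": [("B", 1.0, 100), ("C", 1.0, 10)],
    "B": [("D", 1.0, 100)],
    "C": [("D", 1.0, 10)],
    "D": [],
}
print(qos_route(graph, "A", "D", min_bandwidth=50))   # -> ['A', 'B', 'D']
```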