932 results for distributed computing projects


Relevância:

20.00% 20.00%

Publicador:

Resumo:

Peer-reviewed


This thesis addresses the problem of computing the minimal and maximal diameter of the Cayley graph of Coxeter groups. We first present relevant parts of polytope theory and related Coxeter theory. After this, a method of constructing the orthogonal projections of a polytope from R^d onto R^2 and R^3, d ≥ 3, is presented. This method is the Equality Set Projection (ESP) algorithm, which requires a constant number of linear-programming problems per facet of the projection in the absence of degeneracy. The ESP algorithm also allows us to compute projected geometric diameters of high-dimensional polytopes. A representative set of projected polytopes is presented to illustrate the methods adopted in this thesis.
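As a toy illustration of a projected geometric diameter (not the ESP algorithm itself, which solves linear programs per facet of the projection), one can orthogonally project a polytope's vertices onto chosen coordinate axes and take the largest pairwise distance. The unit cube below is an invented example:

```python
import itertools
import math

def project(points, axes=(0, 1)):
    """Orthogonally project d-dimensional points onto the chosen coordinate axes."""
    return [tuple(p[a] for a in axes) for p in points]

def geometric_diameter(points):
    """Largest pairwise Euclidean distance among the (projected) vertices."""
    return max(math.dist(p, q) for p, q in itertools.combinations(points, 2))

# Vertices of the unit cube in R^3; its shadow on the xy-plane is the unit square.
cube = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
shadow = project(cube)                 # projection onto R^2
print(geometric_diameter(shadow))      # diagonal of the unit square: sqrt(2) ≈ 1.4142
```

This brute-force diameter is quadratic in the number of vertices; ESP's value is that it builds the projection itself efficiently for high-dimensional polytopes.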


A visual SLAM system has been implemented and optimised for real-time deployment on an AUV equipped with calibrated stereo cameras. The system incorporates a novel approach to landmark description in which landmarks are local sub-maps that consist of a cloud of 3D points and their associated SIFT/SURF descriptors. Landmarks are also sparsely distributed, which simplifies and accelerates data association and map updates. In addition to landmark-based localisation, the system utilises visual odometry to estimate the pose of the vehicle in 6 degrees of freedom by identifying temporal matches between consecutive local sub-maps and computing the motion. Both the extended Kalman filter and the unscented Kalman filter have been considered for filtering the observations. The output of the filter is also smoothed using the Rauch-Tung-Striebel (RTS) method to obtain a better alignment of the sequence of local sub-maps and to deliver a large-scale 3D acquisition of the surveyed area. Synthetic experiments have been performed using a simulation environment in which ray tracing is used to generate synthetic images for the stereo system.
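The filter-then-smooth pipeline described above can be illustrated with a minimal one-dimensional sketch: a forward Kalman filter pass followed by an RTS backward pass. This is a scalar constant-position model with made-up noise parameters, not the thesis's 6-DoF implementation:

```python
def kalman_rts(zs, q=0.01, r=1.0, x0=0.0, p0=1.0):
    """1-D constant-position Kalman filter followed by an RTS backward smoothing pass.
    q = process noise, r = measurement noise (illustrative values)."""
    xs, ps, xps, pps = [], [], [], []   # filtered / predicted states and variances
    x, p = x0, p0
    for z in zs:
        xp, pp = x, p + q               # predict (identity dynamics)
        k = pp / (pp + r)               # Kalman gain
        x, p = xp + k * (z - xp), (1 - k) * pp
        xs.append(x); ps.append(p); xps.append(xp); pps.append(pp)
    # RTS backward pass: revise each estimate using information from the future
    xs_s = xs[:]
    for i in range(len(zs) - 2, -1, -1):
        c = ps[i] / pps[i + 1]          # smoother gain
        xs_s[i] = xs[i] + c * (xs_s[i + 1] - xps[i + 1])
    return xs_s

zs = [1.0] * 10                # "measurements" of a stationary target
smoothed = kalman_rts(zs)      # early estimates are pulled toward later evidence
```

The benefit the abstract exploits is exactly the backward pass: early poses in the trajectory are corrected by later observations, aligning the sequence of sub-maps.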


The activated sludge process, the main biological technology usually applied in wastewater treatment plants (WWTP), directly depends on living beings (microorganisms), and therefore on unforeseen changes produced by them. Good plant operation can be achieved if the supervisory control system is able to react to changes and deviations in the system and can take the necessary actions to restore the system's performance. These decisions are often based both on physical, chemical and microbiological principles (suitable to be modelled by conventional control algorithms) and on some knowledge (suitable to be modelled by knowledge-based systems). But one of the key problems in knowledge-based control systems design is the development of an architecture able to manage the different elements of the process efficiently (integrated architecture), to learn from previous cases (specific experimental knowledge) and to acquire the domain knowledge (general expert knowledge). These problems increase when the process belongs to an ill-structured domain and is composed of several complex operational units. Therefore, an integrated and distributed AI architecture seems to be a good choice. This paper proposes an integrated and distributed supervisory multi-level architecture for the supervision of WWTP that overcomes some of the main troubles of classical control techniques and of knowledge-based systems applied to real-world systems.
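The combination of numeric checks with knowledge-based rules can be sketched as a minimal rule table. The sensor names, thresholds and actions below are hypothetical illustrations, not the architecture proposed in the paper:

```python
# Hypothetical sensor names and thresholds, for illustration only.
RULES = [
    # (condition over the current readings, recommended supervisory action)
    (lambda r: r["dissolved_oxygen"] < 1.0, "increase aeration"),
    (lambda r: r["sludge_volume_index"] > 150, "suspect bulking: check filamentous bacteria"),
    (lambda r: r["effluent_cod"] > 125, "reduce inflow load / recycle more sludge"),
]

def supervise(readings):
    """Return the actions whose symptom conditions fire on the current readings."""
    return [action for cond, action in RULES if cond(readings)]

print(supervise({"dissolved_oxygen": 0.6,
                 "sludge_volume_index": 180,
                 "effluent_cod": 90}))
```

A real architecture of the kind the paper proposes would layer such rules over numeric controllers and case-based learning; this sketch shows only the rule-firing core.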


This paper reports the results of a three-year study of the effectiveness of mini-projects in a first-year laboratory course in chemistry at a Scottish university. A mini-project is a short practical problem whose solution requires the application of the knowledge and skills developed in previously completed set experiments. A number of recommendations are made about the most appropriate ways of introducing mini-projects into undergraduate laboratory courses. The main hypothesis of the study concerned the value of mini-projects in laboratory courses, formulated within the context of Information Processing Theory.


Laser scanning is becoming an increasingly popular method for measuring 3D objects in industrial design. Laser scanners produce a cloud of 3D points. For CAD software to be able to use such data, however, this point cloud needs to be turned into a vector format. A popular way to do this is to triangulate the assumed surface of the point cloud using alpha shapes. Alpha shapes start from the convex hull of the point cloud and gradually refine it towards the true surface of the object. Often it is nontrivial to decide when to stop this refinement. One criterion for this is to do so when the homology of the object stops changing. This is known as the persistent homology of the object. The goal of this thesis is to develop a way to compute the homology of a given point cloud when processed with alpha shapes, and to infer from it when the persistent homology has been achieved. Practically, the computation of such a characteristic of the target might be applied to power line tower span analysis.
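The stopping criterion described (refine until the homology stops changing) can be illustrated for the simplest invariant, the number of connected components (Betti-0). The sketch below uses a Rips-style distance filtration with a union-find rather than a true alpha-shape/Delaunay computation, and the point data and scales are invented for illustration:

```python
import itertools
import math

def betti0_profile(points, alphas):
    """Number of connected components at each scale alpha, joining points whose
    distance is at most 2*alpha (a Rips-style stand-in for an alpha filtration)."""
    edges = sorted(
        (math.dist(p, q), i, j)
        for (i, p), (j, q) in itertools.combinations(enumerate(points), 2)
    )
    parent = list(range(len(points)))
    def find(i):                          # union-find with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    profile, e, comps = [], 0, len(points)
    for a in sorted(alphas):
        while e < len(edges) and edges[e][0] <= 2 * a:
            _, i, j = edges[e]; e += 1
            ri, rj = find(i), find(j)
            if ri != rj:
                parent[ri] = rj
                comps -= 1
        profile.append(comps)
    return profile

# Two well-separated point pairs: components drop 4 -> 2 -> 1 as alpha grows.
pts = [(0, 0), (0, 1), (10, 0), (10, 1)]
print(betti0_profile(pts, [0.2, 0.6, 6.0]))   # [4, 2, 1]
```

A scale range over which the profile is constant is a persistent feature; in the same spirit, the thesis stops refining the alpha shape once the full homology stabilises.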


Previous genetic studies have demonstrated that natal homing shapes the stock structure of marine turtle nesting populations. However, widespread sharing of common haplotypes based on short segments of the mitochondrial control region often limits resolution of the demographic connectivity of populations. Recent studies employing longer control region sequences to resolve haplotype sharing have focused on regional assessments of genetic structure and phylogeography. Here we synthesize available control region sequences for loggerhead turtles from the Mediterranean Sea, Atlantic, and western Indian Ocean basins. These data represent six of the nine globally significant regional management units (RMUs) for the species and include novel sequence data from Brazil, Cape Verde, South Africa and Oman. Genetic tests of differentiation among 42 rookeries represented by short sequences (380 bp haplotypes from 3,486 samples) and 40 rookeries represented by long sequences (~800 bp haplotypes from 3,434 samples) supported the distinction of the six RMUs analyzed as well as recognition of at least 18 demographically independent management units (MUs) with respect to female natal homing. A total of 59 haplotypes were resolved. These haplotypes belonged to two highly divergent global lineages, with haplogroup I represented primarily by CC-A1, CC-A4, and CC-A11 variants and haplogroup II represented by CC-A2 and derived variants. Geographic distribution patterns of haplogroup II haplotypes and the nested position of CC-A11.6 from Oman among the Atlantic haplotypes invoke recent colonization of the Indian Ocean from the Atlantic for both global lineages. The haplotypes we confirmed for western Indian Ocean RMUs allow reinterpretation of previous mixed stock analysis and further suggest that contemporary migratory connectivity between the Indian and Atlantic Oceans occurs on a broader scale than previously hypothesized. 
This study represents a valuable model for conducting comprehensive international cooperative data management and research in marine ecology.


Simulation has traditionally been used for analyzing the behavior of complex real-world problems. Even though only some features of the problems are considered, simulation time tends to become quite high even for common simulation problems. Parallel and distributed simulation is a viable technique for accelerating the simulations. The success of parallel simulation depends heavily on the combination of the simulation application, algorithm and environment. In this thesis a conservative, parallel simulation algorithm is applied to the simulation of a cellular network application in a distributed workstation environment. This thesis presents a distributed simulation environment, Diworse, which is based on the use of networked workstations. The distributed environment is considered especially hard for conservative simulation algorithms due to the high cost of communication. In this thesis, however, the distributed environment is shown to be a viable alternative if the amount of communication is kept reasonable. Novel ideas of multiple message simulation and channel reduction enable efficient use of this environment for the simulation of a cellular network application. The distribution of the simulation is based on a modification of the well-known Chandy-Misra deadlock avoidance algorithm with null messages. The basic Chandy-Misra algorithm is modified by using the null message cancellation and multiple message simulation techniques. The modifications reduce the number of null messages and the time required for their execution, thus reducing the overall simulation time. The null message cancellation technique reduces the processing time of null messages, as an arriving null message cancels other unprocessed null messages. The multiple message simulation forms groups of messages, as it simulates several messages before it releases the newly created messages. If the message population in the simulation is sufficient, no additional delay is caused by this operation.
A new technique for considering the simulation application is also presented. The performance is improved by establishing a neighborhood for the simulation elements. The neighborhood concept is based on a channel reduction technique, where the properties of the application exclusively determine which connections are necessary when a certain accuracy of simulation results is required. Distributed simulation is also analyzed in order to find out the effect of the different elements in the implemented simulation environment. This analysis is performed using critical path analysis, which allows determination of a lower bound for the simulation time. In this thesis critical times are computed for sequential and parallel traces. The analysis based on sequential traces reveals the parallel properties of the application, whereas the analysis based on parallel traces reveals the properties of the environment and the distribution.
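The null message cancellation idea (an arriving null message supersedes older unprocessed null messages on the same channel) can be sketched as follows. The `Channel` class is a hypothetical minimal model, not the Diworse implementation, and for simplicity it only cancels null messages at the tail of the queue:

```python
from collections import deque

class Channel:
    """Input channel of a conservative logical process with null-message
    cancellation: a newer null message replaces the older, weaker lookahead
    promises still waiting in the queue."""
    def __init__(self):
        self.queue = deque()   # (timestamp, is_null) pairs in arrival order

    def receive(self, timestamp, is_null):
        if is_null:
            # Pop pending nulls made obsolete by the newer lookahead promise.
            while self.queue and self.queue[-1][1] and self.queue[-1][0] <= timestamp:
                self.queue.pop()
        self.queue.append((timestamp, is_null))

ch = Channel()
ch.receive(5, True)     # null: "nothing will arrive before t=5"
ch.receive(8, True)     # newer null cancels the t=5 one
ch.receive(12, False)   # a real event message is never cancelled
print(list(ch.queue))   # [(8, True), (12, False)]
```

The saving is exactly what the abstract claims: the receiving process never wastes time processing a null message whose promise has already been superseded.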


Distributed supply chains, tight delivery times and tailor-made products are typical of international project business. This type of environment sets great challenges for logistics management. The scope of this work was to study logistics and information management in delivery projects. The logistic information of a delivery project is divided into four categories: the information used to plan, execute and follow up the material flow, and the information directly embedded in the material flow. In the practical study the operations of the target company are divided into the main processes supporting project execution. Logistic information is studied by modelling the information flows between the processes. The aim was to identify the information crucial for project materials management and to describe the flow of information between the processes. Results of the study show that the information related to the execution of the material flow is usually emphasized when it comes to operations and tools for data management. The traditional system tools poorly support the planning of material flow in a project environment. In addition, the significance of clearly defined and documented practices is highlighted.


This study examines Smart Grids and distributed generation connected to a single-family house. The distributed generation comprises a small wind power plant and solar panels. The study is done from the consumer's point of view and is divided into two parts: a theoretical part and a research part. The theoretical part consists of the definitions of distributed generation, wind power, solar energy and Smart Grids. The study examines what Smart Grids will enable, and new technology concerning Smart Grids is also examined. The research part introduces the wind and sun conditions of two countries, Finland and Germany. Based on the wind and sun conditions of these two countries, the annual electricity production of the wind power plant and solar panels is calculated. The costs of generating electricity from wind and solar energy are calculated from the annual electricity productions. The study also deals with feed-in tariffs, which are support systems for renewable energy resources. The study examines whether it is cost-effective for consumers to use the produced electricity themselves or to sell it to the grid. Finally, figures for both countries are formed. The figures include the calculated cost of generating electricity from the wind power plant and solar panels, retail and wholesale prices, and feed-in tariffs. In Finland, it is not cost-effective to sell the produced electricity to the grid until support systems are in place. In Germany, it is cost-effective to sell the electricity produced by solar panels to the grid because of the feed-in tariffs. On the other hand, in Germany it is cost-effective to use the electricity produced from wind for one's own consumption, because the retail price is higher than the price obtained for the produced wind electricity.
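The self-use versus feed-in comparison reduces, per produced kilowatt-hour, to comparing the avoided retail price against the price received for selling. A sketch with purely illustrative prices, not the study's actual figures:

```python
def best_use(retail_price, sell_price):
    """Value per produced kWh: self-use avoids buying at the retail price,
    while selling earns the feed-in tariff or wholesale price.
    All prices here are hypothetical."""
    if retail_price >= sell_price:
        return ("self-use", retail_price)
    return ("sell", sell_price)

# Illustrative (not actual) prices in EUR/kWh:
print(best_use(retail_price=0.15, sell_price=0.05))  # no tariff: self-use wins
print(best_use(retail_price=0.25, sell_price=0.45))  # generous solar tariff: selling wins
```

A fuller model would also weigh the production profile against the household load, since self-use is only possible when production and consumption overlap in time.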


In this paper we address the problem of extracting representative point samples from polygonal models. The goal of such a sampling algorithm is to find points that are evenly distributed. We propose star discrepancy as a measure of sampling quality and introduce new sampling methods based on global line distributions. We investigate several line generation algorithms, including an efficient hardware-based sampling method. Our method contributes to the area of point-based graphics by extracting points that are more evenly distributed than those produced by current sampling algorithms.
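Star discrepancy is expensive to compute in higher dimensions, but in one dimension it has a closed form (Niederreiter's formula), which makes the "evenly distributed" criterion concrete. The point sets below are invented for illustration:

```python
def star_discrepancy_1d(xs):
    """Exact star discrepancy of a 1-D point set in [0, 1] via Niederreiter's
    formula: D* = 1/(2n) + max_i | x_(i) - (2i-1)/(2n) |.
    Small values mean the points are evenly distributed."""
    xs = sorted(xs)
    n = len(xs)
    return 1 / (2 * n) + max(abs(x - (2 * i + 1) / (2 * n)) for i, x in enumerate(xs))

even = [0.125, 0.375, 0.625, 0.875]   # interval midpoints: the optimal 4-point set
clustered = [0.10, 0.11, 0.12, 0.13]
print(star_discrepancy_1d(even))       # 0.125 = 1/(2n), the minimum possible
print(star_discrepancy_1d(clustered))  # much larger: badly distributed
```

On surfaces of polygonal models no such closed form exists, which is one reason the paper estimates quality through global line distributions instead.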


Workflow management systems aim at the controlled execution of complex application processes in distributed and heterogeneous environments. These systems will shape the structure of information systems in business and non-business environments. E-business and system integration are fertile soil for workflow (WF) and groupware tools. This thesis studies WF and groupware tools in order to gather in-house knowledge of WF, so that WF solutions can be better utilized in the future, and focuses on SAP Business Workflow in order to find a global solution for Application Link Enabling (ALE) support for system integration. Piloting this solution at Nokia gathers experience with the SAP R/3 WF tool for future development projects. The literary part of this study guides the reader into the world of business process automation, providing a general description of the history, use and potential of WF and groupware software. The empirical part begins with the background of the case study, describing the IT environment that initiated the case: the introduction of SAP R/3 at Nokia, the communication technique in use and the WF tool. The case study is focused on one solution built with SAP Business Workflow. This study provides a concept for monitoring communication between ERP systems and for increasing the quality of system integration. The case study describes a way to create a support model for ALE/EDI interfaces. The support model includes the monitoring organization and the workflow processes for solving the most common IDoc-related errors.


Current and future market trends force companies to be increasingly competitive. Any company that wishes to compete in today's market must consider information one of its main assets. For this reason, a company needs adequate information systems in order to be managed. In SMEs (small and medium-sized enterprises), in most cases, information is spread across several software applications. This mainly leads to duplicated data, maintenance costs and possible errors in the information. Within the SME sector there are construction companies of small or medium size which store a great deal of technical and planning information to carry out their projects. This means they need an efficient application for managing and tracking the production of their tasks, and, on the other hand, the application any SME needs at the administrative and accounting level. The main objective of this work is to build an application that integrates the administrative, accounting and technical information of a construction company. By achieving this objective the construction company saves time in data entry and access, avoids data duplication and reduces maintenance. In short, it reduces the company's costs and increases the reliability of its information. On the market there are applications called ERP (Enterprise Resource Planning) systems, whose great virtue is the integration of data.
Taking advantage of these applications, and selecting the most suitable one, the management and planning functionality needed by the construction company has been built, integrated and customized within the existing processes of the application (accounting, sales, purchasing, human resources, warehouses, etc.). Among the most relevant conclusions of this work, we would like to highlight the improvement and reduction of data entry, access and maintenance, the history the tool provides, and thus the improvement of the company's management and planning.


Software integration is the stage in a software development process in which separate components are assembled into a single product. It is important to manage the risks involved and to be able to integrate smoothly, because software cannot be released without integrating it first. Furthermore, it has been shown that the integration and testing phase can make up 40% of the overall project costs. These issues can be mitigated by using a software engineering practice called continuous integration. This thesis presents how continuous integration was introduced in the author's employer organisation. This includes studying how the continuous integration process works and creating the technical basis for using the process on future projects. The implemented system supports software written in the C and C++ programming languages on the Linux platform, but the general concepts can be applied to any programming language and platform by selecting the appropriate tools. The results demonstrate in detail what issues need to be solved when the process is adopted in a corporate environment. Additionally, they provide an implementation and process description suitable for the organisation. The results show that continuous integration can reduce the risks involved in a software process and increase the quality of the product as well.


The environmental impact of landfills is a growing concern in waste management practice. Thus, assessing the effectiveness of the solutions implemented to mitigate the issue is important. The objectives of the study were to provide an insight into the advantages of landfills and to consolidate the importance of landfill gas among other alternative fuels. Finally, a case study examining the performance of energy production from a landfill site at Ylivieska was carried out to ascertain the viability of a waste-to-energy project. Both qualitative and quantitative methods were applied. The study was conducted in two parts; the first was a review of literature focused on landfill gas developments. Specific considerations were the mechanisms governing the variability of gas production and the mathematical models often used in landfill gas modeling. Furthermore, an analysis of the two main distributed generation technologies used to generate energy from landfills was carried out. The review of literature revealed a high influence of waste segregation and of a high moisture content on the waste stabilization process. It was found that the accuracy of forecasting gas generation rates can be enhanced with both mathematical modeling and field test measurements. The result of the case study mainly indicated the close dependence of the power output on the landfill gas quality and the fuel inlet pressure.
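The mathematical models commonly used for landfill gas generation are first-order decay models in the style of the US EPA's LandGEM. A sketch with illustrative parameter values (`k`, `L0` and the deposit history below are not from the case study):

```python
import math

def gas_generation(year, deposits, k=0.05, L0=100.0):
    """First-order decay model (LandGEM-style): gas generation rate in a given
    year from waste masses deposited in earlier years.
    k  = decay rate constant (1/yr), L0 = generation potential (m^3/tonne);
    both values here are illustrative, not fitted to the Ylivieska site."""
    return sum(
        k * L0 * mass * math.exp(-k * (year - t))
        for t, mass in deposits
        if t <= year
    )

deposits = [(2000, 10_000), (2001, 10_000), (2002, 10_000)]  # tonnes per year
for y in (2002, 2010, 2030):
    print(y, round(gas_generation(y, deposits)))  # rate peaks near closure, then decays
```

The exponential decay is why field measurements remain necessary: small errors in `k` compound over the decades-long generation horizon.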