20 results for Data compression (Electronic computers)

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance:

100.00%

Publisher:

Abstract:

The purpose of the work was to realize a high-speed digital data transfer system for the RPC muon chambers in the CMS experiment at CERN's new LHC accelerator. This large-scale system took many years and many stages of prototyping to develop, and required the participation of tens of people. The system interfaces to the Frontend Boards (FEB) at the 200,000-channel detector and to the trigger and readout electronics in the control room of the experiment. The distance between these two is about 80 metres, and the speed required of the optic links was pushing the limits of available technology when the project was started. Here, as in many other aspects of the design, it was assumed that the features of readily available commercial components would develop in the course of the design work, just as they did. By choosing a high speed it was possible to multiplex the data from some of the chambers into the same fibres to reduce the number of links needed. Further reduction was achieved by employing zero suppression and data compression, and a total of only 660 optical links were needed. Another requirement, which conflicted somewhat with choosing the components as late as possible, was that the design needed to be radiation tolerant to an ionizing dose of 100 Gy and to have a moderate tolerance to Single Event Effects (SEEs). This required some radiation test campaigns, and eventually led to ASICs being chosen for some of the critical parts. The system was made to be as reconfigurable as possible. The reconfiguration needs to be done from a distance, as the electronics is not accessible except during some short and rare service breaks once the accelerator starts running. Therefore reconfigurable logic is extensively used, and the firmware development for the FPGAs constituted a sizable part of the work. Some special techniques needed to be used there too, to achieve the required radiation tolerance.
The system has been demonstrated to work in several laboratory and beam tests, and now we are waiting to see it in action when the LHC starts running in the autumn of 2008.
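The zero suppression mentioned above can be illustrated with a minimal sketch: instead of shipping every channel's value, only (channel, value) pairs for channels that actually fired are sent. The function names and the toy event below are illustrative, not taken from the actual firmware.

```python
# Sketch of zero suppression: keep only the channels that fired.

def zero_suppress(channels):
    """Keep only non-zero channels as (index, value) pairs."""
    return [(i, v) for i, v in enumerate(channels) if v != 0]

def restore(pairs, n_channels):
    """Rebuild the full channel list from the suppressed form."""
    full = [0] * n_channels
    for i, v in pairs:
        full[i] = v
    return full

# A sparse event: 3 hits out of 16 channels.
event = [0, 0, 5, 0, 0, 0, 1, 0, 0, 0, 0, 7, 0, 0, 0, 0]
packed = zero_suppress(event)
assert restore(packed, len(event)) == event  # lossless
assert len(packed) == 3                      # only the hits are transmitted
```

Because detector occupancy is low, the suppressed form is much smaller than the raw channel list, which is what allows several chambers to share one optical link.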

Relevance:

40.00%

Publisher:

Abstract:

The purpose of this master's thesis was to survey the effects, needs and benefits related to EDI, and to prepare for bringing the EDI Gateway module of the Oracle Applications ERP system into production use. Information for the needs assessment was gathered through discussions. New commercially driven initiatives, developed for business-to-business trade and for exploiting internet technology, were examined from an EDI perspective with the future in mind. The most up-to-date information for this thesis was also found on the internet. After this it was possible to carry out a suitably broad but bounded EDI pilot project for creating an EDI concept. The thesis focused more closely on the effects of EDI on purchasing, and it was decided to apply EDI first to purchase orders. The benefits of EDI are difficult to measure numerically; a large amount of money or product units must be handled with an EDI partner sufficiently often. In the EDI deployment phase the main problems are application-related IT problems. The knowledge gained from the surveys and the EDI project can be exploited in further development. Additional measures are needed to create a fully functional system.

Relevance:

40.00%

Publisher:

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

30.00%

Publisher:

Abstract:

Numerical weather prediction and climate simulation have been among the computationally most demanding applications of high performance computing ever since they were started in the 1950s. Since the 1980s, the most powerful computers have featured an ever larger number of processors. By the early 2000s, this number was often several thousand. An operational weather model must use all these processors in a highly coordinated fashion. The critical resource in running such models is not computation, but the amount of necessary communication between the processors. The communication capacity of parallel computers often falls far short of their computational power. The articles in this thesis cover fourteen years of research into how to harness thousands of processors on a single weather forecast or climate simulation, so that the application can benefit as much as possible from the power of parallel high performance computers. The results attained in these articles have already been widely applied, so that currently most of the organizations that carry out global weather forecasting or climate simulation anywhere in the world use methods introduced in them. Some further studies extend the parallelization opportunities into other parts of the weather forecasting environment, in particular to the data assimilation of satellite observations.
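Why communication dominates as processor counts grow can be seen with a back-of-the-envelope surface-to-volume argument: shrinking each processor's subdomain reduces its computation faster than it reduces the halo it must exchange with neighbours. The numbers below are purely illustrative, not from the thesis.

```python
# Surface-to-volume ratio of a 2-D domain decomposition: a proxy for the
# communication-to-computation ratio of a grid-based parallel model.

def surface_to_volume(n, p):
    """n x n grid split over p x p processors.
    Work per processor scales with (n/p)**2; halo exchange with 4*(n/p)."""
    sub = n // p
    work = sub * sub   # points computed per processor per step
    halo = 4 * sub     # boundary points communicated per processor per step
    return halo / work

# More processors -> the communication share of each step grows:
assert surface_to_volume(1024, 4) < surface_to_volume(1024, 32)
```

This is why, past a certain processor count, adding more processors speeds a fixed-size forecast up less and less unless the communication pattern itself is improved.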

Relevance:

30.00%

Publisher:

Abstract:

Technological progress has made a huge amount of data available at increasing spatial and spectral resolutions. Therefore, the compression of hyperspectral data is an area of active research. In some fields, the original quality of a hyperspectral image cannot be compromised, and in these cases lossless compression is mandatory. The main goal of this thesis is to provide improved methods for the lossless compression of hyperspectral images. Both prediction- and transform-based methods are studied. Two kinds of prediction-based methods are studied. In the first method the spectra of a hyperspectral image are first clustered, and an optimized linear predictor is calculated for each cluster. In the second prediction method the linear prediction coefficients are not fixed but are recalculated for each pixel. A parallel implementation of the above-mentioned linear prediction method is also presented. Two transform-based methods are also presented. Vector Quantization (VQ) was used together with a new coding of the residual image. In addition, we have developed a new back end for a compression method utilizing Principal Component Analysis (PCA) and the Integer Wavelet Transform (IWT). The performance of the compression methods is compared to that of other compression methods. The results show that the proposed linear prediction methods outperform the previous methods. In addition, a novel fast exact nearest-neighbor search method is developed. The search method is used to speed up the Linde-Buzo-Gray (LBG) clustering method.
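The core idea behind prediction-based lossless coding can be sketched in a few lines: predict each sample from an already-decoded neighbour and store only the residuals, which are small and highly compressible yet allow exact reconstruction. This toy version uses a trivial previous-sample predictor, not the clustered or per-pixel optimized predictors of the thesis.

```python
# Minimal prediction + residual coding sketch: lossless by construction.

def predict_residuals(spectrum):
    """First sample stored as-is; every later sample as a delta."""
    residuals = [spectrum[0]]
    for k in range(1, len(spectrum)):
        residuals.append(spectrum[k] - spectrum[k - 1])
    return residuals

def reconstruct(residuals):
    """Invert the prediction: cumulative sum of the deltas."""
    out = [residuals[0]]
    for r in residuals[1:]:
        out.append(out[-1] + r)
    return out

band_values = [100, 102, 101, 105, 110, 108]
res = predict_residuals(band_values)
assert reconstruct(res) == band_values  # lossless round trip
```

A better predictor (clustered, or re-fitted per pixel, as in the thesis) drives the residuals closer to zero, which is what the entropy coder ultimately exploits.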

Relevance:

30.00%

Publisher:

Abstract:

The aim of this thesis was to assess and identify the factors affecting the supplier relationship in the cooperation between JOT Automation Group Oyj and its subcontractors, and to create a supplier evaluation process that improves the company's competitiveness. The work focused on suppliers operating in the general materials and components markets in the manufacture of production systems for the electronics industry. First, the literature on supplier relationships and their evaluation was reviewed. To support the theory, interviews were conducted and the primary needs and objectives for the evaluation process were surveyed. The finished process was tested in practice with two case examples. The process took the form of two distinct tools: an audit evaluates the supplier's ability to meet the requirements set for it, while supplier performance measurement continuously tests and compares the actual level of operations against the audit results. The work includes a description of, and instructions for using, the supplier evaluation process. Using the process reduces the supplier-related risks in material availability and procurement. Experience from the case examples showed that the process makes it possible to address the important core areas and to develop them in a way that benefits both the supplier and the buying company. The supplier evaluation process evolves into a working practice for maintaining the relationship between the company and its supplier.

Relevance:

30.00%

Publisher:

Abstract:

The need to compress image data has become ever more evident over the last ten years with the spread of applications based on image data. Particular attention is currently paid to spectral images, whose storage and transfer demand large amounts of disk space and bandwidth. The wavelet transform has proven to be a good solution for lossy data compression. Its implementation in subband coding is based on wavelet filters, and the problem is choosing a suitable wavelet filter for the different kinds of images to be compressed. This work presents a survey of compression methods based on the wavelet transform. The determination of orthogonal filters by parameterisation is the focus of the work. The work also establishes the equivalence of two different approaches by means of algebraic equations. The experimental part contains a set of tests that justify the need for parameterisation: different images require different filters, and different compression ratios are achieved with different filters. Finally, compression of spectral images by means of the wavelet transform is implemented.
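The building block of such subband coding is a single wavelet filtering step. A minimal sketch, using the simplest (Haar) filter pair rather than the parameterised orthogonal filters studied here, shows the split into a coarse average band and a detail band, and that the step is perfectly invertible before any coefficients are discarded.

```python
# One-level Haar wavelet step: the simplest subband decomposition.

def haar_step(x):
    """Split a length-2n signal into n averages and n details."""
    avg = [(x[2 * i] + x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    det = [(x[2 * i] - x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    return avg, det

def haar_inverse(avg, det):
    """Recombine the two subbands into the original signal."""
    out = []
    for a, d in zip(avg, det):
        out += [a + d, a - d]
    return out

sig = [9.0, 7.0, 3.0, 5.0]
a, d = haar_step(sig)
assert haar_inverse(a, d) == sig  # perfect reconstruction
```

Lossy compression then comes from quantising or discarding small detail coefficients; how much energy ends up in the detail band depends on the filter, which is why the filter choice matters per image.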

Relevance:

30.00%

Publisher:

Abstract:

Globalization and the development of the information society are rapidly changing the shape of the modern world. Cities, and especially megacities such as Saint Petersburg, are at the center of these changes. As a result, economic activities connected with receiving and processing information now play a very important role in the economies of megacities, which allows them to be characterized as "information" cities. Despite wide experience in solving information questions, Russia, and Saint Petersburg in particular, lags behind the advanced European countries in the development of information systems. This master's thesis is devoted to the development of an information system (a data transmission network) based on wireless technology in the territory of the Saint Petersburg region, within the framework of the FTOP "Electronic Russia" and RTOP "Electronic Saint-Petersburg" programs. Logically the thesis can be divided into three parts: 1. the problems, purposes, expected results, terms and implementation of the "Electronic Russia" program; 2. a discussion of wireless data transmission networks (description of the technology, justification of the choice, description of signal transmission techniques and types of network topology); 3. realization of the network (organization of the central network node, regional centers and access lines; description of the equipment used; network capabilities), financial provision of the project, and possible network management models.

Relevance:

30.00%

Publisher:

Abstract:

The main purpose of this thesis is to introduce a new lossless compression algorithm for multispectral images. The proposed algorithm is based on reducing the band-ordering problem to the problem of finding a minimum spanning tree in a weighted directed graph, where the set of graph vertices corresponds to the multispectral image bands and the arc weights have been computed using a newly invented adaptive linear prediction model. The adaptive prediction model is an extended unification of 2- and 4-neighbour pixel-context linear prediction schemes. The algorithm provides individual prediction of each image band using the optimal prediction scheme defined by the adaptive prediction model and the optimal predicting band suggested by the minimum spanning tree. Its efficiency has been compared with that of the best lossless compression algorithms for multispectral images; three recently invented algorithms have been considered. The numerical results produced by these algorithms allow the conclusion that the adaptive prediction based algorithm is the best one for the lossless compression of multispectral images. Real multispectral data captured from an airplane have been used for the testing.
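The band-ordering idea can be sketched concretely: treat bands as graph vertices, let the edge weight approximate the cost of predicting one band from another, and let a minimum spanning tree pick, for every band, the cheapest band to predict it from. This simplified sketch uses symmetric weights and Prim's algorithm (the thesis works on a directed graph); the 3-band weight matrix is made up for illustration.

```python
# Band ordering as a minimum spanning tree over prediction costs.
import heapq

def prim_mst(weights):
    """weights[i][j]: cost of predicting band j from band i (symmetric here).
    Returns parent[] giving each band's chosen predicting band (root = -1),
    and the total prediction cost of the tree."""
    n = len(weights)
    parent = [-1] * n
    in_tree = [False] * n
    heap = [(0, 0, -1)]  # (edge cost, band, predicting band)
    total = 0
    while heap:
        cost, v, p = heapq.heappop(heap)
        if in_tree[v]:
            continue
        in_tree[v] = True
        parent[v] = p
        total += cost
        for u in range(n):
            if not in_tree[u]:
                heapq.heappush(heap, (weights[v][u], u, v))
    return parent, total

# Toy costs: band 1 predicts well from band 0, band 2 from band 1.
w = [[0, 2, 9],
     [2, 0, 4],
     [9, 4, 0]]
parent, total = prim_mst(w)
assert total == 6          # edges 0-1 (cost 2) and 1-2 (cost 4)
assert parent == [-1, 0, 1]
```

Each band is then coded with residuals against the band its tree parent points to, so the tree's total weight approximates the total prediction cost of the whole image.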

Relevance:

30.00%

Publisher:

Abstract:

Data traffic caused by mobile advertising client software when it is communicating with the network server can be a pain point for many application developers who are considering advertising-funded application distribution, since the cost of the data transfer might scare their users away from using the applications. For the thesis project, a simulation environment was built to mimic the real client-server solution for measuring the data transfer over varying types of connections with different usage scenarios. For optimising data transfer, a few general-purpose compressors and XML-specific compressors were tried for compressing the XML data, and a few protocol optimisations were implemented. For optimising the cost, cache usage was improved and pre-loading was enhanced to use free connections to load the data. The data traffic structure and the various optimisations were analysed, and it was found that the cache usage and pre-loading should be enhanced and that the protocol should be changed, with report aggregation and compression using WBXML or gzip.
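The kind of saving measured for the report traffic can be illustrated with the standard library: gzip-compressing a repetitive XML report shrinks it substantially while remaining lossless. The XML payload below is a made-up stand-in for the ad client's real reports, not the actual protocol format.

```python
# Gzip compression of a repetitive XML report: smaller on the wire, lossless.
import gzip

report = b"<reports>" + b"".join(
    b'<ad id="%d" shown="1" clicked="0"/>' % i for i in range(100)
) + b"</reports>"

packed = gzip.compress(report)
assert len(packed) < len(report)          # compression pays off
assert gzip.decompress(packed) == report  # round trip is exact
```

The more repetitive the report structure (as with aggregated reports), the better general-purpose compressors like gzip do, which is consistent with the thesis's recommendation to aggregate before compressing.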

Relevance:

30.00%

Publisher:

Abstract:

Power electronic converter drives use, for the sake of high efficiency, pulse-width modulation, which results in sequences of high-voltage, high-frequency, steep-edged pulses. Such a signal contains a set of high harmonics not required for control purposes. The harmonics cause reflections in the cable between the motor and the inverter, leading to faster winding insulation ageing. Bearing failures and problems with electromagnetic compatibility may also result. Electrical du/dt filters provide an effective solution to the problems caused by pulse-width modulation, thereby increasing the performance and service life of the electrical machines. It is shown that RLC filters effectively decrease the reflection phenomena in the cable. Improved (simple, but effective) solutions are found for both differential- and common-mode signals; these solutions use a galvanic connection between the RLC filter star point and the converter DC link. Foil chokes and film capacitors are among the most widely used components in high-power applications. In actual applications they can be placed in different parts of the cabinet. This complicates the arrangement of the cabinet and decreases the reliability of the system. In addition, the inductances of the connection wires may prevent filtration at high frequencies. This thesis introduces a new hybrid LC filter that uses the natural capacitance between the turns of the foil choke, created by integrating an auxiliary layer into it. The main idea of the hybrid LC filter results from the fact that both the foil choke and the film capacitors have the same roll structure. Moreover, the capacitance between the turns (the "intra capacitance") of foil inductors is the reason for the deterioration of their properties at high frequencies. It is shown that the proposed filter achieves a natural cancellation of the intra capacitance. A hybrid LC filter may contain two or more foil layers isolated from each other and coiled on a core.
The core material can be iron or even air, as in the filter considered in this work. One of the foils, called the main foil, can be placed between the inverter and the motor cable. The others, called auxiliary foils, may be connected in star to create differential-mode noise paths, and then coupled to the DC link midpoint to guarantee a traveling path, especially for the common-mode currents. This way, there is a remarkable capacitance between the main foil and the auxiliary foil. Investigations showed that such a system can be described by a simple equivalent LC filter over a wide range of frequencies. Because of its simple hybrid construction, the proposed LC filter can be a cost-effective and competitive solution for modern power drives. In the thesis, the application field of the proposed filter is considered and determined. The basics of hybrid LC filter design are developed further. The high-frequency behaviour of the proposed filter is analysed by simulations. Finally, the thesis presents experimental data proving that the hybrid LC filter can be used for du/dt filtering of PWM pulses and for the reduction of common-mode currents.
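Whatever its physical construction, a filter that can be described by an equivalent LC circuit has the textbook corner frequency f_c = 1 / (2π√(LC)). A quick sketch, with example component values not taken from the thesis:

```python
# Corner (resonant) frequency of an equivalent LC low-pass filter.
import math

def lc_corner_frequency(L, C):
    """f_c in Hz for inductance L (henries) and capacitance C (farads)."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

# Example values: 10 uH, 1 uF -> roughly 50 kHz.
f = lc_corner_frequency(10e-6, 1e-6)
assert 50e3 < f < 51e3
```

Choosing L and C places the corner well below the PWM edge spectrum but above the fundamental, which is the basic design constraint for a du/dt filter.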

Relevance:

30.00%

Publisher:

Abstract:

Personalised ubiquitous services have rapidly proliferated due to technological advancements in sensing and in ubiquitous and mobile computing. Evolving societal trends, business and the economic potential of Personal Information (PI) have overlapped the service niches. At the same time, the societal thirst for more personalised services has increased and is met by soliciting deeper and more privacy-invasive PI from customers. Consequently, traditional privacy challenges are reinforced and new risks unearthed that render classical safeguards ineffective. The absence of solutions for criticising personalised ubiquitous services from a privacy perspective aggravates the situation. This thesis presents a solution permitting users' PI, stored in their mobile terminals, to be disclosed to services in a privacy-preserving manner for personalisation needs. The approach, termed Mobile Electronic Personality Version 2 (ME2.0), is compared to alternative mechanisms. Within ME2.0, the PI-handling vulnerabilities of ubiquitous services are identified, and services are sensitised to their practices and privacy implications. Vulnerabilities where PI may leak through covert solicits, excessive acquisition and legitimate data re-purposing to erode users' privacy are also considered. In this thesis, the design, components, internal structures, architectures, scenarios and evaluations of ME2.0 are detailed. The design addresses the implications and challenges posed by mobile terminals. The discussion of ME2.0 components and internal structures covers the functions related to how PI pieces are stored and handled by terminals and services. The architecture focuses on the different components and their exchanges with services. Scenarios where ME2.0 is used are presented from different environment views, before it is evaluated for performance, privacy and usability.

Relevance:

30.00%

Publisher:

Abstract:

Companies' and organizations' business activities rely on information. IT systems that manage master data must provide the highest level of service, world-class scalability and reliable information. This study discusses product information and how it can support the creation of a spare part catalog and the launching of eBusiness. The study consists of a theoretical and an empirical part. The theoretical part contains a literature review and a framework for the analysis. For the empirical study two companies were selected, and their information management processes were studied and analyzed based on the framework. The empirical results indicate that the challenges the companies face reflect those found in the literature study. The results of the empirical study also show that the companies had recognized the issues that need to be developed and had recognized trends in eBusiness and product information management.

Relevance:

30.00%

Publisher:

Abstract:

The paper is devoted to the study of specific aspects of heat transfer in the combustion chamber of compression-ignited reciprocating internal combustion engines and the possibility of directly measuring the heat flux by means of Gradient Heat Flux Sensors (GHFS). A one-dimensional single-zone model, proposed by Kyung Tae Yun et al. and implemented with the aid of Matlab, was used to obtain an approximate picture of the heat flux behavior in the combustion chamber in relation to the crank angle. The model's numerical output was compared to the experimental results. The experiment was carried out by A. Mityakov on a four-stroke diesel engine, an Indenor XL4D. Local heat fluxes on the surface of the cylinder head were measured with fast-response, high-sensitivity GHFS. The comparison of the numerical data with the experimental results revealed a small deviation in the obtained heat flux values throughout the cycle and a different behavior of the heat flux curve after Top Dead Center.

Relevance:

30.00%

Publisher:

Abstract:

The necessity of integrating EC (Electronic Commerce) and enterprise systems follows from the integrated nature of enterprise systems. The proven benefits of EC in providing competitive advantages to organizations force enterprises to adopt EC and integrate it with their enterprise systems. Integration is a complex task: it must facilitate a seamless flow of information and data between different systems within and across enterprises. Different systems have different platforms; thus, to integrate systems with different platforms and infrastructures, integration technologies such as middleware, SOA (Service-Oriented Architecture), ESB (Enterprise Service Bus), JCA (J2EE Connector Architecture), and B2B (Business-to-Business) integration standards are required. Major software vendors, such as Oracle, IBM, Microsoft, and SAP, suggest various solutions to address EC and enterprise systems integration problems. There is only a limited body of literature covering the integration of EC and enterprise systems in detail. Most of the studies in this area have focused on the factors which influence the adoption of EC by enterprises, or provide only limited information about a specific platform or integration methodology. Therefore, this thesis was conducted to cover the technical details of EC and enterprise systems integration, addressing both the adoption factors and the integration solutions. In this study, a wide range of literature was reviewed and different solutions were investigated. Different enterprise integration approaches as well as the most popular integration technologies were examined. Moreover, various methodologies for integrating EC and enterprise systems were studied in detail and different solutions were compared. The factors influencing the adoption of EC in enterprises were studied based on previous literature and categorized into technical, social, managerial, financial, and human resource factors.
Moreover, the integration technologies were categorized based on three levels of integration: data, application, and process. In addition, different integration approaches were identified and categorized based on their communication model and platform. Different EC integration solutions were also investigated and categorized based on the identified integration approaches. By considering these different aspects of integration, this study is a useful asset to architects, developers, and system integrators seeking to integrate and adopt EC with enterprise systems.