941 results for XML-RPC
Abstract:
The Large Hadron Collider (LHC), constructed at the European Organization for Nuclear Research, CERN, is the world's largest single measuring instrument and currently the most powerful particle accelerator in existence. The LHC hosts six experiment stations, one of which is the Compact Muon Solenoid (CMS). The main purpose of the CMS is to track and study residue particles from proton-proton collisions. The primary detectors utilized in the CMS are resistive plate chambers (RPCs). To obtain data from these detectors, a link system has been designed. The main idea of the link system is to receive data from the detector front-end electronics in parallel form and to transmit it onwards in serial form over an optical fiber. The system is mostly ready and in place. However, a problem has occurred with the innermost RPC detectors, located in the sector labeled RE1/1: the transmission lines for parallel data suffer from signal integrity issues over long distances. As a solution, a new version of the link system has been devised, one that fits into a smaller space and can be located within the CMS, closer to the detectors. So far this RE1/1 link system has been completed only partially, with just the mechanical design and casing done. In this thesis, the link system electronics for the RE1/1 sector are designed by modifying the existing link system concept to better meet the requirements of that sector. In addition to completing a prototype of the RE1/1 link system electronics, some testing of the system has also been carried out to verify the functionality of the design.
Abstract:
Context awareness is emerging on mobile devices. It can be used to improve the usability of a mobile device, and it is particularly important on mobile devices because of their limitations. This work begins with a literature review on context awareness and the mobile environment. To support context awareness, an implementation of a Context Framework exists for Symbian S60 devices. It provides the possibility of exchanging contexts inside the device between the client applications of the local Context Framework. The main contribution of this thesis is the design and implementation of an enhancement to the S60 Context Framework that makes it possible to exchange contexts across device boundaries. With the implemented Context Exchange System, the context exchange depends neither on the type of the context nor on the type of the client, and the clients and contexts can reside on any interconnected device. Use of the system is also independent of the programming language: in addition to Symbian C++ function interfaces, it can be utilized through XML scripts. The Meeting Sniffer application, which uses the Context Exchange System, was also developed in this work. With this application, it is possible to recognize a meeting situation and suggest a device profile change to the user.
Abstract:
The purpose of the work was to realize a high-speed digital data transfer system for the RPC muon chambers of the CMS experiment at CERN's new LHC accelerator. This large-scale system took many years and many stages of prototyping to develop, and required the participation of tens of people. The system interfaces to the Frontend Boards (FEB) of the 200,000-channel detector and to the trigger and readout electronics in the control room of the experiment. The distance between the two is about 80 metres, and the speed required of the optical links was pushing the limits of available technology when the project was started. Here, as in many other aspects of the design, it was assumed that the capabilities of readily available commercial components would develop in the course of the design work, as indeed they did. By choosing a high link speed it was possible to multiplex the data from some of the chambers into the same fibres and thus reduce the number of links needed. Further reduction was achieved by employing zero suppression and data compression, so that a total of only 660 optical links were needed. Another requirement, which conflicted somewhat with choosing the components as late as possible, was that the design had to be radiation tolerant to an ionizing dose of 100 Gy and have a moderate tolerance to Single Event Effects (SEEs). This required several radiation test campaigns and eventually led to ASICs being chosen for some of the critical parts. The system was made as reconfigurable as possible. Reconfiguration has to be done remotely, as the electronics is inaccessible except during short and rare service breaks once the accelerator is running. Reconfigurable logic is therefore used extensively, and firmware development for the FPGAs constituted a sizable part of the work. Special techniques were needed there as well to achieve the required radiation tolerance. The system has been demonstrated to work in several laboratory and beam tests, and we are now waiting to see it in action when the LHC starts running in the autumn of 2008.
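The abstract above mentions zero suppression as one of the techniques used to cut the number of optical links. As a rough, purely illustrative sketch (the actual system implements this in FPGA/ASIC firmware, not in software), zero suppression over a readout frame might look as follows in Python; the frame layout and word width are invented for the example.

# Illustrative zero suppression: transmit only non-empty channel-group words,
# each prefixed with its index, instead of the full fixed-size frame.
# The frame layout and word width are invented for this sketch.

def zero_suppress(frame):
    """frame: list of 8-bit channel-group words read out in parallel."""
    return [(i, word) for i, word in enumerate(frame) if word != 0]

def expand(pairs, length):
    """Rebuild the original fixed-size frame on the receiving side."""
    frame = [0] * length
    for i, word in pairs:
        frame[i] = word
    return frame

raw = [0, 0, 0b10010000, 0, 0, 0, 0b00000001, 0]
packed = zero_suppress(raw)          # [(2, 144), (6, 1)]
assert expand(packed, len(raw)) == raw
print(packed)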
Abstract:
This project presents the implementation of the OpenERP ERP system, covering the sales, purchasing and accounting areas, for a small and medium-sized company whose name is withheld for privacy reasons and whose database contents are fictitious. The company operates in the textile sector and wants the application to behave like an e-commerce site for its special customers while retaining the functionality of an ERP. The project details the features of OpenERP, and presents the requirements the company expects the implementation to meet, the design, the configuration, and the development of a module specific to the company.
Abstract:
This final-year project of the degree in technical engineering in management informatics is intended as a first approach to the world of semantic web analysis. It consists, on the one hand, of creating an ontology to store information coming from the LinkedIn website, so that it can later be analysed and the data filtered in a practical way, avoiding an excess of useless information. On the other hand, the work includes the development of an application for retrieving the information from the LinkedIn website automatically, and a method for importing it into the created ontology.
Abstract:
A study of the technology needed to build a tool that converts text in OpenDocument format to speech and stores the result in an audio file.
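An OpenDocument Text file is a ZIP archive whose body is stored in content.xml, so the text extraction step of such a tool can be sketched briefly in Python; the file name is hypothetical, and the speech synthesis and audio output stages are not shown.

# Minimal sketch: extract plain text from an OpenDocument Text (.odt) file.
# The .odt container is a ZIP archive; the document body is in content.xml.
import zipfile
import xml.etree.ElementTree as ET

def odt_to_text(path):
    with zipfile.ZipFile(path) as odt:
        content = odt.read("content.xml")
    root = ET.fromstring(content)
    # itertext() walks all text nodes regardless of namespace-qualified tags.
    return "".join(root.itertext())

# text = odt_to_text("document.odt")   # hypothetical input file
# ...pass `text` to a speech synthesizer and save the audio.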
Abstract:
Information systems integration is nowadays an important part of companies' operations and of maintaining their competitiveness. Service-oriented architecture and Web services are a new, flexible way to integrate information systems. One of the core components of Web services is UDDI, Universal Description, Discovery and Integration. UDDI works like a service registry: it defines a way to publish, discover and take Web services into use. Web services can be searched for in UDDI by various criteria, such as the location of the service, the company name and the line of business. UDDI is itself a Web service, based on the XML markup language and the SOAP protocol. This thesis examines UDDI in more detail, including its technical aspects. In the view of publishers and users, an essential shortcoming of UDDI has been the lack of security, which has considerably limited its use and adoption. The thesis therefore focuses in particular on security-related issues and solutions, as well as on the significance of UDDI for companies.
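Since UDDI is itself a Web service built on XML and SOAP, an inquiry such as a business search is simply a SOAP message posted to the registry's inquiry endpoint. The Python sketch below illustrates the idea; the endpoint URL and company name are invented, the message follows the UDDI version 2 inquiry API, and details may vary between registry versions.

# Sketch of a UDDI find_business inquiry wrapped in a SOAP envelope.
# The endpoint URL and company name are hypothetical.
FIND_BUSINESS = """<?xml version="1.0" encoding="UTF-8"?>
<Envelope xmlns="http://schemas.xmlsoap.org/soap/envelope/">
  <Body>
    <find_business generic="2.0" xmlns="urn:uddi-org:api_v2">
      <name>Example Company Oy</name>
    </find_business>
  </Body>
</Envelope>"""

# import urllib.request
# req = urllib.request.Request(
#     "https://uddi.example.com/inquiry",          # hypothetical inquiry endpoint
#     data=FIND_BUSINESS.encode("utf-8"),
#     headers={"Content-Type": "text/xml; charset=utf-8",
#              "SOAPAction": '""'},
# )
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode("utf-8"))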
Abstract:
Data traffic generated by mobile advertising client software when communicating with the network server can be a pain point for application developers considering advertising-funded application distribution, since the cost of the data transfer might scare users away from their applications. For the thesis project, a simulation environment was built to mimic the real client-server solution and to measure the data transfer over varying types of connections and different usage scenarios. To optimise the data transfer, a few general-purpose and XML-specific compressors were tried on the XML data, and a few protocol optimisations were implemented. To optimise the cost, cache usage was improved and pre-loading was enhanced to use free connections for loading the data. The structure of the data traffic and the various optimisations were analysed, and it was found that cache usage and pre-loading should be enhanced and that the protocol should be changed to aggregate reports and compress them with WBXML or gzip.
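As an illustration of the kind of saving the compression experiments aim at, the following Python sketch compares the raw and gzip-compressed size of a small, invented XML report; WBXML, the other format mentioned, requires a binary token table and is not shown.

# Quick comparison of raw vs. gzip-compressed size for a small XML report.
# The report structure is invented for this sketch.
import gzip

report = ("<adreport>" +
          "".join(f'<impression ad="banner{i}" shown="1"/>' for i in range(50)) +
          "</adreport>").encode("utf-8")

compressed = gzip.compress(report)
print(len(report), "bytes raw ->", len(compressed), "bytes gzipped")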
Abstract:
As a continuation of the final-year project "Desenvolupament d'un laboratori virtual per a les pràctiques de Biologia Molecular" by Jordi Romero, a complementary tool for molecule visualization has been developed and integrated into the virtual laboratory itself. It is a tool for the graphical visualization of genes, ORFs, markers and restriction sequences of real or fictitious molecules. Being able to work with fictitious molecules is the great advantage over solutions such as GENBANK, which only allows working with its own molecules. Working with fictitious molecules makes it an ideal solution for teaching, since it gives teachers the possibility of preparing exercises or demonstrations with real molecules or with molecules designed specifically for the exercise to be demonstrated. In addition, it can display the different parts visually, simultaneously or separately, offering a first approach to interpreting the results. It also allows genes to be marked, markers to be created, restriction sequences to be located, and the ORFs to be generated for a molecule that we create or for an existing one that we modify. For the implementation, the idea of separating the code from the design in Flash applications has been maintained. To do so, the open-source Ariware ARPv2.02 platform has been used, which proposes a framework for object-oriented Flash applications with the code (ActionScript 2.0 classes) separated from the movieclip. Perl has been used for data processing, as it is widely used in bioinformatics and for its computational speed. The generated data are stored in a MySQL database (freely distributed), from which the data are extracted to generate XML files, using both PHP and the AMFPHP platform as the link between Flash and the other parts.
Abstract:
The RPC Detector Control System (RCS) is the main subject of this PhD work. The project, involving the Lappeenranta University of Technology, the Warsaw University and INFN of Naples, aims to integrate the different subsystems of the RPC detector and its trigger chain in order to develop a common framework for controlling and monitoring the different parts. During the last three years I have been strongly involved in the hardware and software development, construction and commissioning of this project, as its main responsible person and coordinator. The CMS Resistive Plate Chamber (RPC) system consists of 912 double-gap chambers at its start-up in the middle of 2008. Continuous control and monitoring of the detector, the trigger and all the ancillary sub-systems (high voltages, low voltages, environment, gas and cooling) is required to achieve the operational stability and reliability of such a large and complex detector and trigger system. The role of the RPC Detector Control System is to monitor the detector conditions and performance, to control and monitor all subsystems related to the RPC and their electronics, and to store all the information in a dedicated database, called the Condition DB. The RPC DCS therefore has to ensure the safe and correct operation of the sub-detectors during the entire CMS lifetime (more than 10 years), detect abnormal and harmful situations, and take protective and automatic actions to minimize consequential damage. The analysis of the requirements and project challenges, the architecture design and its development, as well as the calibration and commissioning phases, represent the main tasks of the work carried out for this PhD thesis. Different technologies, middleware and solutions have been studied and adopted in the design and development of the various components, and a major challenge was the integration of these parts with each other and into the general CMS control system and data acquisition framework. The RCS installation and commissioning phase, as well as its performance and the first results obtained during the CMS cosmic runs of the last three years, are also presented.
Abstract:
Service-oriented architecture is a new way to build information systems. It is based on packaging logic into general-purpose services that are offered for use by other parts of the system. This means the same things do not need to be implemented several times, and the system can be used efficiently and in versatile ways. Enterprise service buses (ESB products) can be used to manage these services; they contain various mechanisms for routing, transforming and monitoring the message traffic related to the services. Current service-oriented implementations often use Web Service specifications based on XML. These provide a platform-independent foundation that directly fulfils several requirements of service-oriented architecture, and a large number of ready-made extensions exist around the specifications for attaching additional functionality to services. In connection with the Fenix project, the City of Lahti set out to develop a new system suitable for municipal use that applies the principles of service-oriented architecture. The system was divided into clear layers so that the user interface was separated from the service logic by means of a service bus. The system was thus split into logical units, each with a clear role: the back-end services handle the management of business concepts and the related business rules; the user interface layer handles the presentation of information and offers a graphical, browser-based user interface to the services; and the service bus handles the routing of traffic as well as the access rights and usage statistics related to the services. The end result is an indefinitely extensible system on top of which various electronic services between the municipality and its residents can be developed.
Abstract:
During the past decades testing has matured from an ad-hoc activity into an integral part of the development process. The benefits of testing are obvious for modern communication systems, which operate in heterogeneous environments amongst devices from various manufacturers. The increased demand for testing also creates demand for tools and technologies that support and automate testing activities. This thesis discusses the applicability of visualization techniques in the result analysis part of the testing process. In particular, the primary focus of this work is the visualization of test execution logs produced by a TTCN-3 test system. TTCN-3 is an internationally standardized test specification and implementation language. The TTCN-3 standard suite includes a specification of a test logging interface and a graphical presentation format, but no immediate relationship between them. This thesis presents a technique for mapping the log events to the graphical presentation format, along with a concrete implementation that is integrated with the Eclipse Platform and the OpenTTCN Tester toolchain. The results of this work indicate that for the majority of the log events a visual representation may be derived from the TTCN-3 standard suite. The remaining events were analysed, and three categories relevant to either log analysis or the implementation of the visualization tool were identified: events indicating insertion of something into the incoming queue of a port, events indicating a mismatch, and events describing the control flow during the execution. The applicability of the results is limited to the domain of TTCN-3, but the developed mapping and the implementation may be utilized with any TTCN-3 tool that is able to produce the execution log in the standardized XML format.
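The three event categories identified in the analysis can be thought of as a lookup from log event to display category. The Python sketch below illustrates the idea on an invented XML log layout; the real implementation works on the standardized TTCN-3 XML log format and is integrated with Eclipse and OpenTTCN Tester.

# Illustrative mapping from log event names to display categories.
# The event names and XML layout are hypothetical.
import xml.etree.ElementTree as ET

CATEGORY = {
    "enqueued": "port-queue",    # message inserted into a port's incoming queue
    "mismatch": "mismatch",      # received value did not match the template
    "altstep":  "control-flow",  # control-flow event during execution
}

LOG = """<log>
  <event name="enqueued" port="p1"/>
  <event name="mismatch" port="p1"/>
  <event name="altstep"  ref="alt_main"/>
</log>"""

for event in ET.fromstring(LOG):
    name = event.get("name")
    print(name, "->", CATEGORY.get(name, "other"))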
Abstract:
This thesis examines different techniques for implementing a web user interface. From the techniques studied, those best suited to the goals and constraints of the work are selected and used to create the actual user interface layer for an existing web application. The user interfaces themselves are generated automatically with a user interface generator implemented during the work, which makes use of XML description files describing the user interfaces. Of the techniques, the AJAX approach suited our needs best, since it allows partial page updates and thus, thanks to faster page refreshes, usability closer to that of a desktop application. The description files used by the UI generator, in turn, allow UI controls to be modelled in advance in a common control description file and to be easily modified and laid out on a per-page basis. In addition, the user interface layer includes a versatile set of UI controls.
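To give an idea of what generating controls from XML description files can look like, here is a small Python sketch; the description format and control set are invented for the example, and the real generator also wires the controls to AJAX-based partial page updates, which is not shown.

# Sketch: generate HTML form controls from an XML page description.
# The description format is invented for this example.
import xml.etree.ElementTree as ET

PAGE = """<page>
  <control type="text"   name="username" label="User name"/>
  <control type="select" name="language" label="Language" options="fi,en"/>
</page>"""

def render(control):
    kind, name, label = control.get("type"), control.get("name"), control.get("label")
    if kind == "text":
        return f'<label>{label} <input type="text" name="{name}"/></label>'
    if kind == "select":
        opts = "".join(f'<option>{o}</option>' for o in control.get("options").split(","))
        return f'<label>{label} <select name="{name}">{opts}</select></label>'
    return ""

print("\n".join(render(c) for c in ET.fromstring(PAGE)))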
Abstract:
The research problem of this thesis was to identify the alternative ways of implementing EDI connections, together with their costs. To estimate the costs arising from the use of EDI services, an MS Office Excel calculation model was built that can be used to assess the initial acquisition and running costs as well as the profitability of an EDI project. The annual costs and cost savings arising from order handling were estimated with time-driven activity-based costing. The applied theory was limited to activity-based costing and investment appraisal. The price, service quality and other value-added services of operators offering EDI services were surveyed. The cost analysis was limited to determining the initial acquisition costs and the costs incurred during use. The research approach is constructive in nature, as the aim was to create a calculation model that supports management decision-making. Based on the survey, the factors affecting the running costs of EDI fall into three groups: the number of customers and suppliers covered by electronic order messages, electronic invoices, and EDI messages. For EDI messages, the determining factors are the format of the outgoing message and the charging basis.
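Time-driven activity-based costing estimates an activity's cost as unit time multiplied by a capacity cost rate and by volume. The Python sketch below shows the arithmetic with invented placeholder figures; the actual calculation model of the thesis is an MS Office Excel workbook built on the company's own data.

# Rough time-driven activity-based costing sketch for order handling.
# All figures are invented placeholders, not values from the thesis.
capacity_cost_rate = 0.60        # EUR per minute of order-handling capacity
manual_minutes_per_order = 6.0
edi_minutes_per_order = 1.5
orders_per_year = 12000

manual_cost = capacity_cost_rate * manual_minutes_per_order * orders_per_year
edi_cost = capacity_cost_rate * edi_minutes_per_order * orders_per_year
print(f"manual: {manual_cost:.0f} EUR/a, EDI: {edi_cost:.0f} EUR/a, "
      f"saving: {manual_cost - edi_cost:.0f} EUR/a")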
Abstract:
The study examines the signalling of text organisation in research articles (RA) in French. The work concentrates on a particular type of organisation provided by text sequences, i.e. structures that organise the text into items, at least some of which are signalled by markers of addition or order: First… 0… The third point… In addition… / Premièrement… 0… Le troisième point… De plus… By indicating the way the text is organised, these structures guide the reader in the reading process, so that the reader does not need to work out the text structure alone. The aim of the work is to study the factors affecting the marking of text sequences. Why is their structure sometimes signalled explicitly by markers such as secondly, whereas in other places such markers are not used? The corpus is manually XML-annotated and consists of 90 RAs (~800 000 words) in French from the fields of linguistics, education and history. The analysis highlights several factors affecting the marking of text sequences. First, exact markers (such as first) seem to be more frequent in sequences where all the items are explicitly signalled by a marker, whereas additive markers (such as moreover) are used in sequences with both explicitly signalled and unmarked items. The marking of explicitly signalled sequences thus seems to be precise and even repetitive, whereas the signalling of sequences with unmarked items is altogether more vague. Second, the marking of text sequences seems to depend on the length of the text: the longer the text segment, the vaguer the marking. Additive markers and unmarked items are more frequent in longer sequences, possibly covering several pages, whereas shorter sequences are often signalled explicitly by exact markers. The marker types also vary according to sequence length. Anaphoric expressions, such as first, stay fairly close to their referents and are used in short sequences; connectors, such as secondly, are frequently used in sequences of intermediate length; and the longest sequences are often signalled by constructions composed of an ordinal and a noun acting as the subject of the sentence: The first item is… Finally, the marking of text organisation also depends on the discipline the RA belongs to. In linguistics, the marking is fairly frequent and precise; exact markers such as second are the most used, and structures with unmarked items are less common. The marking is also fairly frequent in education; in this field, however, it is less precise than in linguistics, with frequent unmarked items and additive markers. History, on the other hand, is characterised by less frequent marking, and when used, the marking in this field is also less precise and less explicit.
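Because the corpus is manually XML-annotated, tallies such as the share of exact, additive and unmarked items per sequence reduce to simple counts over the annotation. The Python sketch below illustrates this on an invented annotation schema; the actual schema of the thesis corpus is not reproduced here.

# Sketch: tally marker types per text sequence in an XML-annotated corpus.
# The annotation schema is invented for this example.
import xml.etree.ElementTree as ET
from collections import Counter

SAMPLE = """<sequence>
  <item marker="exact">Premièrement ...</item>
  <item>...</item>
  <item marker="additive">De plus ...</item>
</sequence>"""

counts = Counter(item.get("marker", "unmarked")
                 for item in ET.fromstring(SAMPLE))
print(counts)   # Counter({'exact': 1, 'unmarked': 1, 'additive': 1})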