994 results for pacs: data interchange
Abstract:
This work addresses Electronic Data Interchange (EDI): the transfer of structured data, complying with established standard messages, from computer to computer, by electronic means. "Computer to computer" refers to exchanges between software applications: for example, the ordering system sends orders to the central production control system, which then issues the corresponding invoice. Open EDI is the electronic exchange of data between autonomous partners who have associated to exchange structured data intended to be processed by application programs.
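The "standard messages processed by application programs" idea can be pictured with a toy sketch. This is a minimal illustration assuming EDIFACT-style separators; the purchase-order segments below are invented for the example and are not taken from the work itself.

```python
# Minimal sketch: splitting an EDIFACT-style EDI message into segments.
# The message content is illustrative only.

def parse_edifact(message: str) -> list[list[str]]:
    """Split a message into segments, each a list of data elements."""
    segments = []
    for raw in message.strip().split("'"):   # "'" terminates a segment
        raw = raw.strip()
        if raw:
            segments.append(raw.split("+"))  # "+" separates data elements
    return segments

# A toy purchase-order fragment: UNH header, BGM order, UNT trailer.
msg = "UNH+1+ORDERS:D:96A:UN'BGM+220+PO12345+9'UNT+3+1'"
for seg in parse_edifact(msg):
    print(seg)
```

An application on the receiving side would dispatch on the first element of each segment ("UNH", "BGM", ...) to update its own order records, which is what makes the exchange application-to-application rather than person-to-person.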
Abstract:
The purpose of this master's thesis was to survey the effects, needs and benefits related to EDI, and to prepare the EDI Gateway module of the Oracle Applications ERP system for production use. Information for the needs assessment was gathered through discussions. New initiatives derived from commercial starting points, developed for business-to-business trade and the exploitation of internet technology, were examined from an EDI perspective with a view to the future. The most current information for this thesis was also found on the internet. After this it was possible to carry out a suitably broad but bounded EDI pilot project to create an EDI concept. The thesis focused mainly on the effects of EDI in purchasing, and it was decided to apply EDI first to purchase orders. The benefits of EDI are difficult to quantify: a large volume of money or product units must be handled with an EDI partner sufficiently often. In the EDI deployment phase, the main problems are application-related IT problems. The knowledge gained from the surveys and the EDI project can be exploited in further development. Additional measures are needed to create a fully functional system.
Abstract:
This issue describes progress in EDI in Argentina, Brazil, Chile, Mexico, the United States and Venezuela up to August 1996. The information is based on the progress reports prepared by country representatives for the Pan-American EDIFACT Board (PAEB), which coordinates EDI development activities in the Americas.
Abstract:
Every port is unique. Although all ports exist for the same basic purpose (to act as an interface in the transfer from one mode of transport to another), no two are ever organized in the same way. Ports may be classified according to:
Physical conditions: location (geographical position, man-made or natural harbour, estuary location, difficult weather conditions, tides, etc.) and size (large, small or medium-sized).
Use: commercial (general cargo, bulk solids, bulk liquids, oil, break bulk, mixed), passenger, sport and leisure, fishing, mixed, etc.
Ownership: private, municipal, regional or State-owned.
The Port Authority's role in management of the port: overall control (the Port Authority plans, sets up and operates the whole range of services); facilitator (the Port Authority plans and sets up the infrastructure and the superstructure, but services are provided by private companies); or landlord (the Port Authority allows private companies to be responsible for the superstructure and to provide port services).
Different combinations of port types will therefore give rise to different kinds of organization and different information flows, which means that the associated information systems may differ significantly from port to port. Since this paper relates to the port of Barcelona, with its own specific characteristics, the contents may not always be applicable to other ports.
Electronic data interchange. EDI - 1992 and beyond. Conference proceedings. Brussels, September 1989
Abstract:
The CampusSource workshop took place from 10 to 12 October 2006 at the Westfälische Wilhelms-Universität (WWU) in Münster. The central topics of the event were the development of an engine for linking e-learning applications with systems of HIS GmbH, and the creation of teaching and learning content with the goal of reuse. The second chapter collects the event's presentations in Adobe Flash format. Viewing the presentations requires Adobe Flash Player, version 6 or later.
Abstract:
Support systems for programming education are widespread, but common standards for the exchange of general (learning) content and tests do not meet the special requirements of programming exercises, such as handling complex submissions consisting of several files or combining different (automatic) grading procedures. As a result, exercises cannot be exchanged between systems, which would be desirable given the high effort required to develop good exercises. This paper presents an extensible XML-based format for the exchange of programming exercises that is already used prototypically by several systems. The specification of the exchange format is available online [PFMA].
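To make the idea of such an exchange format concrete, here is a hedged sketch of what a task description carrying multiple files and multiple graders might look like. The element and attribute names below are invented for illustration and do not follow the actual [PFMA] specification.

```python
# Hypothetical XML task description: multi-file submission template
# plus a combination of graders. Names are illustrative assumptions.
import xml.etree.ElementTree as ET

task_xml = """
<task lang="java">
  <description>Implement a stack with push and pop.</description>
  <files>
    <file role="template" name="Stack.java"/>
    <file role="test" name="StackTest.java"/>
  </files>
  <graders>
    <grader type="unit-test"/>
    <grader type="style-check"/>
  </graders>
</task>
"""

root = ET.fromstring(task_xml)
files = [f.attrib["name"] for f in root.iter("file")]
graders = [g.attrib["type"] for g in root.iter("grader")]
print(files, graders)
```

The point of an extensible format is that an importing system can ignore grader types it does not support while still reusing the task files, which is what allows exchange between heterogeneous systems.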
Abstract:
Biofilm research is growing more diverse and dependent on high-throughput technologies, and the large-scale production of results aggravates data substantiation. In particular, it is often the case that experimental protocols are adapted to meet the needs of a particular laboratory and no statistical validation of the modified method is provided. This paper discusses the impact of intra-laboratory adaptation and non-rigorous documentation of experimental protocols on biofilm data interchange and validation. The case study is a non-standard, but widely used, workflow for Pseudomonas aeruginosa biofilm development, considering three analysis assays: the crystal violet (CV) assay for biomass quantification, the XTT assay for respiratory activity assessment, and the colony forming units (CFU) assay for determination of cell viability. The ruggedness of the protocol was assessed by introducing small changes in the biofilm growth conditions, which simulate minor protocol adaptations and non-rigorous protocol documentation. Results show that even minor variations in the biofilm growth conditions may affect the results considerably, and that the biofilm analysis assays lack repeatability. Intra-laboratory validation of non-standard protocols is found critical to ensure data quality and enable the comparison of results within and among laboratories.
Abstract:
This report describes the results of the research project investigating the use of advanced field data acquisition technologies for Iowa transportation agencies. The objectives of the research project were to (1) research and evaluate current data acquisition technologies for field data collection, manipulation, and reporting; (2) identify the current field data collection approach and the interest level in applying current technologies within Iowa transportation agencies; and (3) summarize findings, prioritize technology needs, and provide recommendations regarding suitable applications for future development. A steering committee consisting of state, city, and county transportation officials provided guidance during this project. Technologies considered in this study included (1) data storage (bar coding, radio frequency identification, touch buttons, magnetic stripes, and video logging); (2) data recognition (voice recognition and optical character recognition); (3) field referencing systems (global positioning systems [GPS] and geographic information systems [GIS]); (4) data transmission (radio frequency data communications and electronic data interchange); and (5) portable computers (pen-based computers). The literature review revealed that many of these technologies could have useful applications in the transportation industry. A survey was developed to examine current data collection methods and identify the interest in using advanced field data collection technologies. Surveys were sent out to county and city engineers and state representatives responsible for certain programs (e.g., maintenance management and construction management). Results showed that almost all field data are collected using manual approaches and are hand-carried to the office, where they are either entered into a computer or manually stored.
A lack of standardization was apparent in the types of software applications used by each agency; even the types of forms used to manually collect data differed by agency. Furthermore, interest in using advanced field data collection technologies depended upon the technology, program (e.g., pavement or sign management), and agency type (e.g., state, city, or county). The state and larger cities and counties seemed to be interested in using several of the technologies, whereas smaller agencies appeared to have very little interest in using advanced techniques to capture data. A more thorough analysis of the survey results is provided in the report. Recommendations are made to enhance the use of advanced field data acquisition technologies in Iowa transportation agencies: (1) Appoint a statewide task group to coordinate the effort to automate field data collection and reporting within the Iowa transportation agencies. Subgroups representing the cities, counties, and state should be formed with oversight provided by the statewide task group. (2) Educate employees so that they become familiar with the various field data acquisition technologies.
Abstract:
Manual data entry is not only costly in terms of time and money; worse still, it is a source of error. For these reasons, the automated acquisition of information along the production chain is a goal strongly desired by the Group to improve its business. The technologies analysed, by now widespread and standardized on a large scale, such as barcodes, logistics labels and radio-frequency terminals, can bring great benefits to business processes, all the more so when integrated and tailored to the company's ERP systems, allowing rapid and correct recording of information and its immediate dissemination throughout the organization. The analysis of processes and flows highlighted the critical points and made it possible to understand where and when to intervene, with a design that would be as close as possible to the best fit. The release of requirements; goods receipt, mapping and handling in the warehouse; production status; component unloading and production loading in packaging and semi-finishing; the establishment of a customs interchange warehouse; a precise and rapid traceability flow: all these are changes that will modify business processes, streamlining them and freeing resources that can be reinvested in higher value-added activities. The potentially achievable results, also confirmed by the external experience of suppliers and consultants, created the conditions for a rapid study and start of the work: the Group is enthusiastic and eager to complete the project as soon as possible and to go live with the new, streamlined and optimized operating mode.
Abstract:
Advances in communication, navigation and imaging technologies are expected to fundamentally change the methods currently used to collect data. Electronic data interchange strategies will also minimize data handling and automatically update files at the point of capture. This report summarizes the outcome of using a multi-camera platform as a method to collect roadway inventory data. It defines basic system requirements as expressed by users who applied these techniques, and examines how the application of the technology met those needs. A sign inventory case study was used to determine the advantages of creating and maintaining the database and to provide the capability to monitor performance criteria for a Safety Management System. The project identified that at least 75 percent of the data elements needed for a sign inventory can be gathered by viewing a high-resolution image.
Abstract:
Background: Understanding transcriptional regulation by genome-wide microarray studies can contribute to unravel complex relationships between genes. Attempts to standardize the annotation of microarray data include the Minimum Information About a Microarray Experiment (MIAME) recommendations, the MAGE-ML format for data interchange, and the use of controlled vocabularies or ontologies. The existing software systems for microarray data analysis implement the mentioned standards only partially and are often hard to use and extend. Integration of genomic annotation data and other sources of external knowledge using open standards is therefore a key requirement for future integrated analysis systems. Results: The EMMA 2 software has been designed to resolve shortcomings with respect to full MAGE-ML and ontology support and makes use of modern data integration techniques. We present a software system that features comprehensive data analysis functions for spotted arrays, and for the most common synthesized oligo arrays such as Agilent, Affymetrix and NimbleGen. The system is based on the full MAGE object model. Analysis functionality is based on R and Bioconductor packages and can make use of a compute cluster for distributed services. Conclusion: Our model-driven approach for automatically implementing a full MAGE object model provides high flexibility and compatibility. Data integration via SOAP-based web-services is advantageous in a distributed client-server environment as the collaborative analysis of microarray data is gaining more and more relevance in international research consortia. The adequacy of the EMMA 2 software design and implementation has been proven by its application in many distributed functional genomics projects. Its scalability makes the current architecture suited for extensions towards future transcriptomics methods based on high-throughput sequencing approaches which have much higher computational requirements than microarrays.
Abstract:
It has been argued that, beyond software engineering and process engineering, ontological engineering is the third capability needed if successful e-commerce is to be realized. In our experience of building an ontology-based tendering system, we faced the problem of building an ontology. In this paper, we demonstrate how to build ontologies in the tendering domain. The ontology life cycle is identified. The extraction of concepts from existing resources, such as on-line catalogs, is described. We have reused electronic data interchange (EDI) to build conceptual structures in the tendering domain. An algorithm to extract abstract ontological concepts from these structures is proposed.
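The abstract does not give the extraction algorithm itself, so the following is only a toy sketch of the general idea of reusing EDI definitions as a concept source: EDI composite data elements (a named group with component parts) map naturally onto candidate concepts with attributes. The composite codes and the mapping rule below are illustrative assumptions.

```python
# Toy sketch only: deriving candidate ontological concepts from
# EDI-style composite element definitions. Codes and names are
# illustrative, not taken from the paper.

# composite code -> (composite name, component data elements)
edi_composites = {
    "C059": ("Street", ["Street and number", "P.O. box"]),
    "C080": ("Party name", ["Party name line", "Name format code"]),
}

def extract_concepts(composites: dict) -> dict:
    """Turn each composite into a candidate concept whose
    attributes are the composite's component parts."""
    return {
        name: {"edi_code": code, "attributes": parts}
        for code, (name, parts) in composites.items()
    }

concepts = extract_concepts(edi_composites)
print(sorted(concepts))
```

A real pipeline would go further, e.g. merging overlapping composites and arranging the resulting concepts into a hierarchy, but the mapping from reusable interchange structures to concept candidates is the core reuse idea.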