939 results for Electronic data processing
Abstract:
This paper describes data processing in a flaw detector that uses a combined multisectional eddy-current transducer and a heterofrequency magnetic field, and the application of this method to detecting flaws in rods and pipes under conditions of significant transverse displacement.
Abstract:
Configuration of a development environment in the Eclipse IDE. Introduction to GIS: uses, applications and examples. Getting to know the gvSIG tool. Learning the most widespread standards of the Open Geospatial Consortium (OGC), in particular the Web Processing Service. Analysing, designing and developing a client capable of consuming WPS services.
Abstract:
Study of the standards defined by the Open Geospatial Consortium, and more specifically of the Web Processing Service (WPS) standard. The project also had a practical component, consisting of the design and development of a client capable of consuming web services built according to WPS and integrated into the gvSIG platform.
Abstract:
The purpose of this master's thesis was to survey the effects, needs and benefits associated with EDI, and to prepare the introduction of the EDI Gateway module of the Oracle Applications ERP system into production use. Information for the needs assessment was gathered through discussions. New initiatives, derived from commercial premises and developed for business-to-business trade and for exploiting internet technology, were examined from an EDI perspective with a view to the future. The most current information for this thesis was also found on the internet. It was then possible to carry out a suitably broad but bounded EDI pilot project to establish an EDI concept. The thesis focused primarily on the effects of EDI on purchasing, and it was decided to apply EDI first to purchase orders. The benefits of EDI are difficult to measure numerically: large amounts of money or product units must be handled with an EDI partner sufficiently often. In the implementation phase, the main problems are application-related IT problems. The knowledge gained from the surveys and the EDI project can be exploited in further development; additional measures are needed to create a fully functioning system.
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
This article analyses the results of an empirical study of the 200 most popular UK-based websites across various sectors of e-commerce services. The study provides empirical evidence of unlawful processing of personal data. It comprises a survey of the methods used to seek and obtain consent to process personal data for direct marketing and advertising, and a test of the frequency of unsolicited commercial email (UCE) received by customers as a consequence of registering and submitting personal information to a website. Part One of the article presents a conceptual and normative account of data protection, discussing the ethical values on which EU data protection law is grounded and outlining the elements that must be in place to seek and obtain valid consent to process personal data. Part Two discusses the outcomes of the empirical study, which reveals a significant gap between EU legal theory and practice in data protection. Although a wide majority of the websites in the sample (69%) have a system in place to ask separate consent for engaging in marketing activities, only 16.2% of them obtain consent that is valid under the standards set by EU law. The UCE test shows that only one website in three (30.5%) respects the data subject's wish not to receive commercial communications. It also shows that, when submitting personal data in online transactions, there is a high probability (50%) of encountering a website that will ignore the refusal of consent and send UCE. The article concludes that there is a severe lack of compliance among UK online service providers with essential requirements of data protection law, and suggests that the standard of implementation, information and supervision by the UK authorities is inadequate, especially in light of the clarifications provided at EU level.
Abstract:
Background: Expression microarrays are increasingly used to obtain large-scale transcriptomic information on a wide range of biological samples. Nevertheless, there is still much debate on the best ways to process data, to design experiments and to analyse the output. Furthermore, many of the more sophisticated mathematical approaches to data analysis in the literature remain inaccessible to much of the biological research community. In this study we examine ways of extracting and analysing a large data set obtained using the Agilent long-oligonucleotide transcriptomics platform, applied to a set of human macrophage and dendritic cell samples. Results: We describe and validate a series of data extraction, transformation and normalisation steps which are implemented via a new R function. Analysis of replicate normalised reference data demonstrates that intra-array variability is small (only around 2% of the mean log signal), while inter-array variability from replicate array measurements has a standard deviation (SD) of around 0.5 log(2) units (6% of the mean). The common practice of working with ratios of Cy5/Cy3 signal offers little further improvement in terms of reducing error. Comparison to expression data obtained using Arabidopsis samples demonstrates that the large number of genes in each sample showing a low level of transcription reflects the real complexity of the cellular transcriptome. Multidimensional scaling is used to show that the processed data identify an underlying structure which reflects some of the key biological variables that define the data set. This structure is robust, allowing reliable comparison of samples collected over a number of years and by a variety of operators. Conclusions: This study outlines a robust and easily implemented pipeline for extracting, transforming, normalising and visualising transcriptomic array data from the Agilent expression platform. The analysis is used to obtain quantitative estimates of the SD arising from experimental (non-biological) intra- and inter-array variability, and of a lower threshold for determining whether an individual gene is expressed. The study provides a reliable basis for further, more extensive studies of the systems biology of eukaryotic cells.
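The pipeline described in this abstract is implemented as an R function that is not reproduced here. Purely as an illustrative sketch (the function name, the centring scheme and the synthetic shapes are all assumptions, not the authors' code), a log2 transformation with per-array median centring and a replicate-based estimate of inter-array SD might look like:

```python
import numpy as np

def normalise(signal):
    """Illustrative normalisation sketch, not the authors' R function.

    signal: 2-D array of raw intensities, rows = genes, columns = replicate arrays.
    Returns the log2-transformed, per-array median-centred matrix and a crude
    estimate of inter-array variability (mean per-gene SD across replicates).
    """
    log_sig = np.log2(signal)
    centred = log_sig - np.median(log_sig, axis=0)   # centre each array on its median
    inter_sd = centred.std(axis=1, ddof=1).mean()    # SD of each gene across arrays, averaged
    return centred, inter_sd
```

After centring, each array's median is zero, so replicate arrays can be compared directly and the residual spread across replicates gives a rough inter-array SD in log2 units, analogous to the ~0.5 log2-unit figure the abstract reports.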
Abstract:
A student from the Data Processing program at the New York Trade School is shown working. Black and white photograph with some edge damage due to writing in black along the top.
Abstract:
Felice Gigante, a graduate of the New York Trade School Electronics program, works on a machine in his job as Data Processing Customer Engineer for the International Business Machines Corp. Original caption reads, "Felice Gigante - Electronices, International Business Machines Corp." Black and white photograph with caption glued to reverse.
Abstract:
The CMS Collaboration conducted a month-long data-taking exercise, the Cosmic Run At Four Tesla, during October-November 2008, with the goal of commissioning the experiment for extended operation. With all installed detector systems participating, CMS recorded 270 million cosmic ray events with the solenoid at a magnetic field strength of 3.8 T. This paper describes the data flow from the detector through the various online and offline computing systems, as well as the workflows used for recording the data, for aligning and calibrating the detector, and for analysing the data. © 2010 IOP Publishing Ltd and SISSA.