956 results for Automatic Data Processing.


Relevance:

100.00%

Publisher:

Abstract:

Implementation of the GEOSS/GMES initiative requires the creation and integration of service providers, most of which deliver geospatial data output from a Grid system to interactive users. This paper considers the approaches to integrating DOS centers (service providers) used in the Ukrainian segment of GEOSS/GMES and suggests template solutions for geospatial data visualization subsystems. The developed patterns are implemented in the DOS center of the Space Research Institute of the National Academy of Sciences of Ukraine and the National Space Agency of Ukraine (NASU-NSAU).

Relevance:

100.00%

Publisher:

Abstract:

This thesis describes advances in the characterisation, calibration and data processing of optical coherence tomography (OCT) systems. Femtosecond (fs) laser inscription was used for producing OCT-phantoms. Transparent materials are generally inert to infrared radiation, but with fs lasers material modification occurs via non-linear processes when the highly focused light source interacts with the material. This modification is confined to the focal volume and is highly reproducible. In order to select the best inscription parameters, combinations of different inscription parameters were tested, using three fs laser systems with different operating properties, on a variety of materials. This facilitated the understanding of the key characteristics of the produced structures, with the aim of producing viable OCT-phantoms. Finally, OCT-phantoms were successfully designed and fabricated in fused silica. The use of these phantoms to characterise many properties (resolution, distortion, sensitivity decay, scan linearity) of an OCT system was demonstrated. Quantitative methods were developed to support the characterisation of an OCT system collecting images from phantoms and also to improve the quality of the OCT images. Characterisation methods include the measurement of the spatially variant resolution (point spread function (PSF) and modulation transfer function (MTF)), sensitivity and distortion. Processing of OCT data is computationally intensive. Standard central processing unit (CPU) based processing might take several minutes to a few hours to process acquired data, making data processing a significant bottleneck. An alternative is to use expensive hardware-based processing such as field programmable gate arrays (FPGAs). Recently, however, graphics processing unit (GPU) based data processing methods have been developed to minimize processing and rendering time.
These processing techniques include standard-processing methods, which comprise a set of algorithms to process the raw interference data obtained by the detector and generate A-scans. The work presented here describes accelerated data processing and post-processing techniques for OCT systems. The GPU-based processing developed during the PhD was later implemented into a custom-built Fourier domain optical coherence tomography (FD-OCT) system. This system currently processes and renders data in real time; its processing throughput is currently limited by the camera capture rate. OCT-phantoms have been heavily used for the qualitative characterization and adjustment/fine tuning of the operating conditions of the OCT system. Currently, investigations are under way to characterize OCT systems using our phantoms. The work presented in this thesis demonstrates several novel techniques for fabricating OCT-phantoms and accelerating OCT data processing using GPUs. In the process of developing phantoms and quantitative methods, a thorough understanding and practical knowledge of OCT and fs laser processing systems was developed. This understanding led to several novel pieces of research that are not only relevant to OCT but have broader importance. For example, extensive understanding of the properties of fs-inscribed structures will be useful in other photonic applications such as the fabrication of phase masks, waveguides and microfluidic channels. Acceleration of data processing with GPUs is also useful in other fields.
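The standard-processing step described above (raw detector interference data to A-scans) can be sketched on the CPU with NumPy; the window choice, array sizes and the synthetic single-reflector signal below are illustrative assumptions, not the thesis implementation, and a GPU version would run the same per-spectrum steps in parallel:

```python
import numpy as np

def a_scan(spectrum, background):
    """Compute one A-scan from a raw FD-OCT spectral interferogram.

    Illustrative CPU sketch of the standard processing chain:
    background subtraction, apodisation window, inverse FFT,
    then keep the magnitude of the positive-depth half.
    """
    s = spectrum - background            # remove DC / reference term
    s = s * np.hanning(s.size)           # apodise to suppress side lobes
    depth = np.fft.ifft(s)               # spectral domain -> depth domain
    return np.abs(depth[: s.size // 2])  # positive depths only

# Synthetic interferogram: one reflector produces one cosine fringe.
k = np.arange(1024)
raw = 100.0 + 10.0 * np.cos(2 * np.pi * 50 * k / 1024)
profile = a_scan(raw, background=np.full(1024, 100.0))
```

On this synthetic input the reconstructed depth profile peaks at the fringe frequency (bin 50), which is how a single reflector appears in an A-scan.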

Relevance:

100.00%

Publisher:

Abstract:

Data processing services for the Meteosat geostationary satellite are presented. The implemented services correspond to different levels of remote-sensing data processing: noise reduction at the preprocessing level, cloud-mask extraction at the low level, and fractal-dimension estimation at the high level. The cloud mask is obtained as the result of Markovian segmentation of infrared data. To overcome the high computational complexity of Markovian segmentation, a parallel algorithm was developed. The fractal dimension of Meteosat data is estimated using fractional Brownian motion models.
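As a rough illustration of high-level fractal-dimension estimation, the sketch below uses simple box counting on a binary cloud mask. Note that the abstract's actual estimator is based on fractional Brownian motion models, not box counting; the mask and box sizes here are made up for illustration:

```python
import numpy as np

def box_count_dimension(mask, sizes=(1, 2, 4, 8)):
    """Estimate the fractal dimension of a square binary mask by box
    counting: count occupied boxes at each scale, then fit the slope of
    log(count) versus log(1/size). A simple stand-in for the fractional
    Brownian motion estimator mentioned in the abstract."""
    n = mask.shape[0]
    counts = []
    for s in sizes:
        c = 0
        for i in range(0, n, s):
            for j in range(0, n, s):
                if mask[i:i + s, j:j + s].any():  # box contains cloud?
                    c += 1
        counts.append(c)
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope
```

A completely filled mask behaves as a plain 2-D surface, so the estimated dimension comes out as 2.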

Relevance:

100.00%

Publisher:

Abstract:

As massive data sets become increasingly available, people are facing the problem of how to effectively process and understand them. Traditional sequential computing models are giving way to parallel and distributed computing models such as MapReduce, due both to the large size of the data sets and to their high dimensionality. This dissertation, in the same direction as other research based on MapReduce, develops effective techniques and applications that can help people solve large-scale problems with MapReduce. Three different problems are tackled in the dissertation. The first deals with processing terabytes of raster data in a spatial data management system. Aerial imagery files are broken into tiles to enable data-parallel computation. The second and third problems deal with dimension reduction techniques for handling data sets of high dimensionality. Three variants of the nonnegative matrix factorization technique are scaled up to factorize matrices with dimensions on the order of millions in MapReduce, based on different matrix multiplication implementations. Two algorithms, which compute CANDECOMP/PARAFAC and Tucker tensor decompositions respectively, are parallelized in MapReduce by carefully partitioning the data and arranging the computation to maximize data locality and parallelism.
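The matrix-multiplication-in-MapReduce idea underlying such scaling can be illustrated with a toy replication join: the map phase routes each input entry to every output cell it contributes to, and the reduce phase joins A and B values on their shared index and sums the products. The in-memory "shuffle" dict and the fixed 2x2 output are simplifications for illustration, not any particular implementation from the dissertation:

```python
from collections import defaultdict

def mm_map(entry):
    """Map step: replicate each matrix entry to every output cell it feeds.
    Entries are ('A', i, j, v) or ('B', j, k, v); the 2x2 output size is
    hard-coded here for illustration (a real job reads it from config)."""
    name, r, c, v = entry
    if name == 'A':
        return [((r, k), ('A', c, v)) for k in range(2)]
    return [((i, c), ('B', r, v)) for i in range(2)]

def mm_reduce(key, values):
    """Reduce step: join A and B values on the shared inner index j
    and sum the products to produce one output cell C[i, k]."""
    a = {j: v for tag, j, v in values if tag == 'A'}
    b = {j: v for tag, j, v in values if tag == 'B'}
    return key, sum(a[j] * b[j] for j in a if j in b)

# Simulate the shuffle phase with an in-memory dict.
A = [('A', 0, 0, 1.0), ('A', 0, 1, 2.0), ('A', 1, 0, 3.0), ('A', 1, 1, 4.0)]
B = [('B', 0, 0, 5.0), ('B', 0, 1, 6.0), ('B', 1, 0, 7.0), ('B', 1, 1, 8.0)]
shuffle = defaultdict(list)
for entry in A + B:
    for key, val in mm_map(entry):
        shuffle[key].append(val)
C = dict(mm_reduce(k, vs) for k, vs in shuffle.items())
```

The replication in the map phase is what makes shuffle volume the dominant cost, which is why the dissertation's variants differ in how they partition the matrices.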

Relevance:

100.00%

Publisher:

Abstract:

The generation of heterogeneous big data sources with ever-increasing volumes, velocities and veracities over the last few years has inspired the data science and research community to address the challenge of extracting knowledge from big data. Such a wealth of generated data across the board can be intelligently exploited to advance our knowledge about our environment, public health, critical infrastructure and security. In recent years we have developed generic approaches to processing such big data at multiple levels to advance decision support. This work specifically concerns data processing with semantic harmonisation, low-level fusion, analytics, knowledge modelling with high-level fusion, and reasoning. These approaches will be introduced and presented in the context of the TRIDEC project results on critical oil and gas industry drilling operations, and also the ongoing large eVacuate project on critical crowd behaviour detection in confined spaces.

Relevance:

100.00%

Publisher:

Abstract:

The Data Processing Department of the ISHC has developed coding forms for the data to be entered into the program. The Highway Planning and Programming and the Design Departments are responsible for coding and submitting the necessary data forms to Data Processing for the noise prediction on the highway sections.

Relevance:

100.00%

Publisher:

Abstract:

This study examined the properties of ERP effects elicited by unattended (spatially uncued) objects using a short-lag repetition-priming paradigm. Same or different common objects were presented in a yoked prime-probe trial either as intact images or slightly scrambled (half-split) versions. Behaviourally, only objects in a familiar (intact) view showed priming. An enhanced negativity was observed at parietal and occipito-parietal electrode sites within the time window of the posterior N250 after the repetition of intact, but not split, images. An additional post-hoc N2pc analysis of the prime display supported that this result could not be attributed to differences in salience between familiar intact and split views. These results demonstrate that spatially unattended objects undergo visual processing but only if shown in familiar views, indicating a role of holistic processing of objects that is independent of attention.

Relevance:

100.00%

Publisher:

Abstract:

Current physiological sensors are passive and transmit sensed data to a monitoring centre (MC) through a wireless body area network (WBAN) without processing the data intelligently. We propose a solution that discerns data requestors in order to prioritise and infer data, reducing transactions and conserving battery power, which are important requirements of mobile health (mHealth). However, alarms cannot be determined reliably without knowing the activity of the user. For example, a heart rate of 170 beats per minute can be normal during exercise, but an alarm should be raised if this figure is sensed during sleep. To solve this problem, we suggest utilising existing activity recognition (AR) applications. Most health-related wearable devices include accelerometers along with physiological sensors. This paper presents a novel approach and solution that utilises physiological data with AR so that they can provide not only improved and efficient services, such as alarm determination, but also richer health information, which may provide content for new markets as well as additional application services such as converged mobile health with aged-care services. This has been verified by experimental tests using vital signs such as heart rate, respiration rate and body temperature, with a demonstrated outcome of AR accelerometer sensors integrated with an Android app.
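The activity-dependent alarm idea (170 bpm is normal while exercising but alarming during sleep) can be sketched as a lookup of per-activity normal ranges; the thresholds and activity labels below are hypothetical placeholders, not values from the paper:

```python
# Hypothetical per-activity normal heart-rate ranges (bpm) for illustration;
# in the paper these would come from the activity recognition (AR) module.
NORMAL_HR = {
    'sleeping':   (40, 90),
    'resting':    (50, 100),
    'exercising': (90, 180),
}

def should_alarm(heart_rate, activity):
    """Raise an alarm only when the heart rate is abnormal *for the
    current activity*, rather than against a single fixed threshold."""
    low, high = NORMAL_HR.get(activity, (50, 120))  # fallback range
    return not (low <= heart_rate <= high)
```

With a fixed threshold, 170 bpm would always alarm; conditioning on AR output suppresses the false alarm during exercise while still catching it during sleep.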

Relevance:

100.00%

Publisher:

Abstract:

When wearable and personal health devices and sensors capture data such as heart rate and body temperature for fitness tracking and health services, they simply transfer the data without filtering or optimisation. This can overload the sensors and cause rapid battery consumption when they interact with Internet of Things (IoT) networks, which are expected to grow and demand more health data from device wearers. To solve this problem, this paper proposes to infer sensed data in order to reduce the data volume, which in turn reduces bandwidth and battery power consumption, both essential requirements for sensor devices. This is achieved by applying beacon data points after inferencing the data with variance rates, which compare each sensed value with adjacent data before and after it. Experiments verify that this novel approach can reduce data volume by up to 99.5% while maintaining 98.62% accuracy. Whilst most existing work focuses on sensor-network improvements such as routing, operation and data-reading algorithms, we efficiently reduce data volume to cut bandwidth and battery power consumption while maintaining accuracy by implementing intelligence and optimisation in sensor devices.
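A minimal sketch of the beacon-point idea, assuming a single variance-rate threshold against the last transmitted value (the paper's full method also compares each sample with adjacent data before and after it, which is omitted here):

```python
def reduce_stream(samples, variance_rate=0.05):
    """Keep only 'beacon' data points: a sample is transmitted when it
    deviates from the last transmitted value by more than variance_rate
    (relative change). Returns (index, value) pairs to transmit.

    Simplified sketch; the 5% rate is an illustrative assumption."""
    if not samples:
        return []
    kept = [(0, samples[0])]   # always transmit the first point
    last = samples[0]
    for i, x in enumerate(samples[1:], start=1):
        if abs(x - last) > variance_rate * abs(last):
            kept.append((i, x))
            last = x           # beacons become the new reference
    return kept

# A mostly flat heart-rate stream with one excursion: only the
# beacon points at the changes are transmitted.
beacons = reduce_stream([70, 70, 71, 70, 90, 91, 70])
```

The receiver reconstructs the stream by holding each beacon value until the next one arrives, which is where the reported accuracy/volume trade-off comes from.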

Relevance:

100.00%

Publisher:

Abstract:

Human sport doping control analysis is a complex and challenging task for anti-doping laboratories. The List of Prohibited Substances and Methods, updated annually by the World Anti-Doping Agency (WADA), consists of hundreds of chemically and pharmacologically different low and high molecular weight compounds. This poses a considerable challenge for laboratories to analyze them all in a limited amount of time from a limited sample aliquot. The continuous expansion of the Prohibited List obliges laboratories to keep their analytical methods updated and to research new available methodologies. In this thesis, an accurate mass-based analysis employing liquid chromatography-time-of-flight mass spectrometry (LC-TOFMS) was developed and validated to improve the power of doping control analysis. New analytical methods were developed utilizing the high mass accuracy and high information content obtained by TOFMS to generate comprehensive and generic screening procedures. The suitability of LC-TOFMS for comprehensive screening was demonstrated for the first time in the field with mass accuracies better than 1 mDa. Further attention was given to generic sample preparation, an essential part of screening analysis, to rationalize the whole workflow and minimize the need for several separate sample preparation methods. Utilizing both positive and negative ionization allowed the detection of almost 200 prohibited substances. Automatic data processing produced a Microsoft Excel based report highlighting the entries fulfilling the criteria of the reverse database search (retention time (RT), mass accuracy, isotope match). The quantitative performance of LC-TOFMS was demonstrated with morphine, codeine and their intact glucuronide conjugates. After a straightforward sample preparation the compounds were analyzed directly without the need for hydrolysis, solvent transfer, evaporation or reconstitution.
Hydrophilic interaction chromatography (HILIC) provided good chromatographic separation, which was critical for the morphine glucuronide isomers. A wide linear range (50-5000 ng/ml) with good precision (RSD<10%) and accuracy (±10%) was obtained, showing performance comparable to or better than other methods used. In-source collision-induced dissociation (ISCID) allowed confirmation analysis with three diagnostic ions with a median mass accuracy of 1.08 mDa and repeatable ion ratios fulfilling WADA's identification criteria. The suitability of LC-TOFMS for screening of high molecular weight doping agents was demonstrated with plasma volume expanders (PVE), namely dextran and hydroxyethylstarch (HES). Specificity of the assay was improved, since interfering matrix compounds were removed by size exclusion chromatography (SEC). ISCID produced three characteristic ions with an excellent mean mass accuracy of 0.82 mDa at physiological concentration levels. In summary, combined with proper sample preparation and chromatographic separation, TOFMS can be utilized extensively in doping control laboratories for comprehensive screening of chemically different low and high molecular weight compounds, for quantification of threshold substances and even for confirmation. LC-TOFMS rationalized the workflow in doping control laboratories by simplifying the screening scheme, expediting reporting and minimizing analysis costs. Therefore, LC-TOFMS can be exploited widely in doping control, and the need for several separate analysis techniques is reduced.
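The reverse database search used for screening (matching measured peaks against reference compounds on retention time and accurate mass within tolerance) can be sketched as below; the reference entries, retention times and masses are hypothetical illustrations, and a real implementation would also check the isotope match listed in the abstract:

```python
# Hypothetical reference entries for illustration only; a real doping
# control database stores validated RT, exact mass and isotope patterns.
REFERENCE = {
    'morphine': {'rt': 2.10, 'mass': 285.1365},
    'codeine':  {'rt': 3.45, 'mass': 299.1521},
}

def screen(peaks, rt_tol=0.2, mass_tol=0.001):
    """Reverse database search: report each reference compound whose RT
    and accurate mass match some measured peak within tolerance
    (mass_tol in Da; 0.001 Da = 1 mDa, the accuracy level in the thesis)."""
    hits = []
    for name, ref in REFERENCE.items():
        for rt, mass in peaks:
            if (abs(rt - ref['rt']) <= rt_tol
                    and abs(mass - ref['mass']) <= mass_tol):
                hits.append(name)
                break                      # one matching peak is enough
    return hits

# One peak close to the morphine reference, one unrelated peak.
found = screen([(2.15, 285.1360), (9.99, 100.0)])
```

Searching from the reference list toward the data (rather than identifying every peak) is what makes the screen comprehensive yet fast to report.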

Relevance:

100.00%

Publisher:

Abstract:

Comparison of physical and biological environmental factors affecting the aggregation of tunas with the success of fishing by the commercial fleets requires that catch and effort data be examined in greater detail than has been presented in these publications. Consequently, the United States Bureau of Commercial Fisheries Biological Laboratory, San Diego, to serve the needs of its program of research on causes of variations in tuna abundance, made arrangements with the Tuna Commission to summarize these catch and effort data by month, by one-degree area, and by fishing-vessel size class, for the years 1951-1960 for bait boats and 1953-1960 for purse-seiners. The present paper describes the techniques employed in summarizing these data by automatic data processing methods. It also presents the catch and effort information by months, by five-degree areas and certain combinations of five-degree areas for use by fishermen, industry personnel, and research agencies. Because of space limitations and other considerations, the one-degree tabulations are not included but are available at the Tuna Commission and Bureau laboratories.

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: This study describes the prevalence, associated anomalies, and demographic characteristics of cases of multiple congenital anomalies (MCA) in 19 population-based European registries (EUROCAT) covering 959,446 births in 2004 and 2010. METHODS: EUROCAT implemented a computer algorithm for classification of congenital anomaly cases followed by manual review of potential MCA cases by geneticists. MCA cases are defined as cases with two or more major anomalies of different organ systems, excluding sequences, chromosomal and monogenic syndromes. RESULTS: The combination of an epidemiological and clinical approach for classification of cases has improved the quality and accuracy of the MCA data. Total prevalence of MCA cases was 15.8 per 10,000 births. Fetal deaths and termination of pregnancy were significantly more frequent in MCA cases compared with isolated cases (p < 0.001) and MCA cases were more frequently prenatally diagnosed (p < 0.001). Live born infants with MCA were more often born preterm (p < 0.01) and with birth weight < 2500 grams (p < 0.01). Respiratory and ear, face, and neck anomalies were the most likely to occur with other anomalies (34% and 32%) and congenital heart defects and limb anomalies were the least likely to occur with other anomalies (13%) (p < 0.01). However, due to their high prevalence, congenital heart defects were present in half of all MCA cases. Among males with MCA, the frequency of genital anomalies was significantly greater than the frequency of genital anomalies among females with MCA (p < 0.001). CONCLUSION: Although rare, MCA cases are an important public health issue, because of their severity. The EUROCAT database of MCA cases will allow future investigation on the epidemiology of these conditions and related clinical and diagnostic problems.
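The case-classification rule stated in the abstract (two or more major anomalies of different organ systems, excluding sequences and chromosomal or monogenic syndromes) can be sketched as a simple predicate; the data representation and labels below are assumptions for illustration, not the EUROCAT algorithm itself:

```python
EXCLUDED = {'sequence', 'chromosomal', 'monogenic syndrome'}

def is_mca(case_anomalies):
    """Sketch of the MCA rule from the abstract: a case counts as a
    multiple congenital anomaly (MCA) case if it has two or more major
    anomalies in *different* organ systems and is not an excluded
    condition. Each anomaly is an (organ_system, kind) pair."""
    if any(kind in EXCLUDED for _, kind in case_anomalies):
        return False                       # excluded condition
    systems = {organ for organ, _ in case_anomalies}
    return len(systems) >= 2               # distinct organ systems
```

In the study this automated pass was followed by manual review by geneticists, so a predicate like this is only the first filter, not the final classification.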

Relevance:

100.00%

Publisher:

Abstract:

Objectives. To study mortality trends related to Chagas disease taking into account all mentions of this cause listed on any line or part of the death certificate. Methods. Mortality data for 1985-2006 were obtained from the multiple cause-of-death database maintained by the São Paulo State Data Analysis System (SEADE). Chagas disease was classified as the underlying cause-of-death or as an associated cause-of-death (non-underlying). The total number of times Chagas disease was mentioned on the death certificates was also considered. Results. During this 22-year period, there were 40,002 deaths related to Chagas disease: 34,917 (87.29%) classified as the underlying cause-of-death and 5,085 (12.71%) as an associated cause-of-death. The results show a 56.07% decline in the death rate due to Chagas disease as the underlying cause and a stabilized rate as associated cause. The number of deaths was 44.5% higher among men. The fact that 83.5% of the deaths occurred after 45 years of age reflects a cohort effect. The main causes associated with Chagas disease as the underlying cause-of-death were direct complications due to cardiac involvement, such as conduction disorders, arrhythmias and heart failure. Ischemic heart disease, cerebrovascular disorders and neoplasms were the main underlying causes when Chagas was an associated cause-of-death. Conclusions. For the total mentions of Chagas disease, a 51.34% decline in the death rate was observed, whereas the decline in the number of deaths was only 5.91%, being lower among women and showing a shift of deaths to older age brackets. Using the multiple cause-of-death method contributed to the understanding of the natural history of Chagas disease.

Relevance:

100.00%

Publisher:

Abstract:

Description based on: Fiscal year 1984.

Relevance:

100.00%

Publisher:

Abstract:

Mode of access: Internet.