862 results for Cataloging - Data processing
Abstract:
The effect of zirconium on the hot-working characteristics of alpha and alpha-beta brass was studied in the temperature range 500 to 850 °C and the strain rate range 0.001 to 100 s⁻¹. On the basis of the flow stress data, processing maps showing the variation of the efficiency of power dissipation (given by [2m/(m+1)], where m is the strain rate sensitivity) with temperature and strain rate were obtained. The addition of zirconium to alpha brass decreased the maximum efficiency of power dissipation from 53 to 39%, increased the strain rate for dynamic recrystallization (DRX) from 0.001 to 0.1 s⁻¹ and improved the hot workability. Alpha-beta brasses with and without zirconium exhibit a domain in the temperature range 550 to 750 °C and at strain rates lower than 1 s⁻¹, with a maximum efficiency of power dissipation of nearly 50% occurring in the temperature range 700 to 750 °C and at a strain rate of 0.001 s⁻¹. In this domain, the alpha phase undergoes DRX and controls the hot deformation of the alloy, whereas the beta phase deforms superplastically. The addition of zirconium to alpha-beta brass did not affect the processing maps, as it partitions to the beta phase and does not alter the constitutive behavior of the alpha phase.
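The efficiency of power dissipation plotted in these processing maps is a simple function of the strain rate sensitivity m. A minimal sketch of the computation, with illustrative (not measured) flow-stress values:

```python
import math

def strain_rate_sensitivity(sigma1, sigma2, rate1, rate2):
    """Estimate m = d(ln sigma)/d(ln strain rate) from two flow-stress
    readings taken at the same temperature and strain."""
    return (math.log(sigma2) - math.log(sigma1)) / (math.log(rate2) - math.log(rate1))

def power_dissipation_efficiency(m):
    """Efficiency of power dissipation, eta = 2m/(m + 1)."""
    return 2.0 * m / (m + 1.0)

# Illustrative values only: flow stresses of 50 and 100 MPa at
# strain rates of 0.001 and 0.1 s^-1.
m = strain_rate_sensitivity(50.0, 100.0, 0.001, 0.1)
eta = power_dissipation_efficiency(m)
```

An ideal superplastic material (m = 1) gives eta = 1; typical DRX domains correspond to intermediate m values.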
Abstract:
The constitutive behaviour of α-nickel silver in the temperature range 700–950 °C and strain rate range 0.001–100 s⁻¹ was characterized with the help of a processing map generated on the basis of the principles of the "dynamic materials model" of Prasad et al. Using the flow stress data, processing maps showing the variation of the efficiency of power dissipation (given by 2m/(m+1), where m is the strain-rate sensitivity) with temperature and strain rate were obtained. α-nickel silver exhibits a single domain at temperatures greater than 750 °C and at strain rates lower than 1 s⁻¹, with a maximum efficiency of 38% occurring at about 950 °C and at a strain rate of 0.1 s⁻¹. In this domain the material undergoes dynamic recrystallization (DRX). On the basis of a model, it is shown that the DRX is controlled by the rate of interface formation (nucleation), which depends on the diffusion-controlled process of thermal recovery by climb. At high strain rates (10 and 100 s⁻¹) the material undergoes microstructural instabilities, which manifest in the form of adiabatic shear bands and strain markings.
Abstract:
The constitutive behaviour of α-β nickel silver in the temperature range 600–850 °C and strain rate range 0.001–100 s⁻¹ was characterized with the help of a processing map generated on the principles of the dynamic materials model. On the basis of the flow-stress data, processing maps showing the variation of the efficiency of power dissipation (given by [2m/(m+1)], where m is the strain-rate sensitivity) with temperature and strain rate were obtained. α-β nickel silver exhibits a single domain at temperatures greater than 700 °C and at strain rates lower than 1 s⁻¹, with a maximum efficiency of power dissipation of about 42% occurring at about 850 °C and at 0.1 s⁻¹. In this domain, the α phase undergoes dynamic recrystallization and controls the deformation of the alloy, while the β phase deforms superplastically. Optimum conditions for the processing of α-β nickel silver are 850 °C and 0.1 s⁻¹. The material undergoes unstable flow at strain rates of 10 and 100 s⁻¹ and in the temperature range 600–750 °C, manifested in the form of adiabatic shear bands.
The Intelligent Measuring Sub-System in the Computer Integrated and Flexible Laser Processing System
Abstract:
An intelligent measuring sub-system was developed on the basis of the computer integrated and flexible laser processing system. A novel model was built to compensate for the deviations of the main frame, and a newly developed 3-D laser tracker system is applied to adjust the accuracy of the system. By analysing the characteristics of the various automobile dies, which are the main processing objects of the laser processing system, the types of surfaces and borders to be measured and processed are classified. For the different types of surface and border, a 2-D adaptive measuring method based on Bézier curves and a 3-D adaptive measuring method based on spline curves were developed. For the data processing, a new 3-D probe compensation method is described in detail. Measuring experiments and laser processing experiments were carried out to verify the methods. All the methods have been applied in the computer integrated and flexible laser processing system developed by the Institute of Mechanics, CAS.
Abstract:
Statistical analysis of diffusion tensor imaging (DTI) data requires a computational framework that is both numerically tractable (to account for the high dimensional nature of the data) and geometric (to account for the nonlinear nature of diffusion tensors). Building upon earlier studies exploiting a Riemannian framework to address these challenges, the present paper proposes a novel metric and an accompanying computational framework for DTI data processing. The proposed approach grounds the signal processing operations in interpolating curves. Well-chosen interpolating curves are shown to provide a computational framework that is at the same time tractable and information relevant for DTI processing. In addition, and in contrast to earlier methods, it provides an interpolation method which preserves anisotropy, a central information carried by diffusion tensor data. © 2013 Springer Science+Business Media New York.
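The paper's proposed metric is not reproduced here, but the Riemannian machinery it builds on can be illustrated with geodesic interpolation of symmetric positive-definite (SPD) tensors under the log-Euclidean metric, one of the earlier approaches the paper contrasts itself with. Note that, unlike the paper's interpolating-curve method, this baseline does not guarantee anisotropy preservation:

```python
import numpy as np

def log_euclidean_interp(D1, D2, t):
    """Interpolate two SPD diffusion tensors along the geodesic of the
    log-Euclidean metric: exp((1 - t) log D1 + t log D2).
    Matrix log/exp are computed via eigendecomposition, valid for SPD input."""
    def logm(D):
        w, V = np.linalg.eigh(D)
        return V @ np.diag(np.log(w)) @ V.T

    def expm(S):
        w, V = np.linalg.eigh(S)
        return V @ np.diag(np.exp(w)) @ V.T

    return expm((1 - t) * logm(D1) + t * logm(D2))
```

For isotropic tensors the midpoint is the matrix geometric mean, e.g. interpolating I and 4I at t = 0.5 yields 2I.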
Abstract:
Huelse, M, Barr, D R W, Dudek, P: Cellular Automata and non-static image processing for embodied robot systems on a massively parallel processor array. In: Adamatzky, A et al. (eds) AUTOMATA 2008, Theory and Applications of Cellular Automata. Luniver Press, 2008, pp. 504-510. Sponsorship: EPSRC
Abstract:
Plants exhibit different developmental strategies than animals; these are characterized by a tight linkage between environmental conditions and development. As plants have neither specialized sensory organs nor a nervous system, intercellular regulators are essential for their development. Recently, major advances have been made in understanding how intercellular regulation is achieved in plants on a molecular level. Plants use a variety of molecules for intercellular regulation: hormones are used as systemic signals that are interpreted at the individual-cell level; receptor peptide-ligand systems regulate local homeostasis; moving transcriptional regulators act in a switch-like manner over small and large distances. Together, these mechanisms coherently coordinate developmental decisions with resource allocation and growth.
Abstract:
BACKGROUND: Historically, only partial assessments of data quality have been performed in clinical trials, for which the most common method of measuring database error rates has been to compare the case report form (CRF) to database entries and count discrepancies. Importantly, errors arising from medical record abstraction and transcription are rarely evaluated as part of such quality assessments. Electronic Data Capture (EDC) technology has had a further impact, as paper CRFs typically leveraged for quality measurement are not used in EDC processes. METHODS AND PRINCIPAL FINDINGS: The National Institute on Drug Abuse Treatment Clinical Trials Network has developed, implemented, and evaluated methodology for holistically assessing data quality on EDC trials. We characterize the average source-to-database error rate (14.3 errors per 10,000 fields) for the first year of use of the new evaluation method. This error rate was significantly lower than the average of published error rates for source-to-database audits, and was similar to CRF-to-database error rates reported in the published literature. We attribute this largely to an absence of medical record abstraction on the trials we examined, and to an outpatient setting characterized by less acute patient conditions. CONCLUSIONS: Historically, medical record abstraction is the most significant source of error by an order of magnitude, and should be measured and managed during the course of clinical trials. Source-to-database error rates are highly dependent on the amount of structured data collection in the clinical setting and on the complexity of the medical record, dependencies that should be considered when developing data quality benchmarks.
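The error-rate metric quoted above (14.3 errors per 10,000 fields) is a straightforward normalization; a minimal sketch, with illustrative counts:

```python
def error_rate_per_10k(n_errors, n_fields):
    """Express a source-to-database error count as errors per 10,000 fields."""
    return 10_000 * n_errors / n_fields

# Illustrative: 143 discrepancies found when auditing 100,000 fields.
rate = error_rate_per_10k(143, 100_000)
```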
Abstract:
A regularized algorithm for the recovery of band-limited signals from noisy data is described. The regularization is characterized by a single parameter. Iterative and non-iterative implementations of the algorithm are shown to have useful properties, the former offering the advantage of flexibility and the latter a potential for rapid data processing. Comparative results, using experimental data obtained in laser anemometry studies with a photon correlator, are presented both with and without regularization. © 1983 Taylor & Francis Ltd.
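The paper's exact algorithm is not given here; a generic Gerchberg-style iteration with a single damping parameter, matching the general shape described (alternating a band-limit projection with a data-consistency step), can be sketched as:

```python
import numpy as np

def bandlimited_recover(y, band, alpha=0.1, n_iter=50):
    """Iteratively recover a band-limited signal from noisy samples y.
    Each pass zeroes the out-of-band FFT coefficients (keeping the `band`
    lowest positive and negative frequency bins) and then relaxes toward
    the data; alpha is the single regularization/damping parameter.
    A generic sketch, not the paper's exact algorithm."""
    x = y.copy()
    for _ in range(n_iter):
        X = np.fft.fft(x)
        X[band:len(X) - band] = 0.0          # enforce the band limit
        x_bl = np.fft.ifft(X).real
        x = x_bl + alpha * (y - x_bl)        # damped data-consistency step
    return x
```

A signal already inside the band is left essentially unchanged, while out-of-band noise is attenuated by the damping factor.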
Abstract:
A time-of-flight (ToF) mass spectrometer suitable, in terms of sensitivity, detector response and time resolution, for application in fast transient Temporal Analysis of Products (TAP) kinetic catalyst characterization is reported. Technical difficulties associated with such an application, as well as the solutions implemented as adaptations of the ToF apparatus, are discussed. The performance of the ToF was validated, and the full linearity of the detector over the full dynamic range was explored in order to ensure its applicability for TAP. The reported TAP-ToF setup is the first system to achieve a level of sensitivity that allows monitoring of the full 0-200 AMU range simultaneously with sub-millisecond time resolution. In this new setup, the high sensitivity allows the use of low-intensity pulses, ensuring that transport through the reactor occurs in the Knudsen diffusion regime and that the data can therefore be fully analysed using the reported theoretical TAP models and data processing.
Abstract:
Data processing is an essential part of Acoustic Doppler Profiler (ADP) surveys, which have become the standard tool for assessing flow characteristics at tidal power development sites. In most cases, further processing beyond the capabilities of the manufacturer-provided software tools is required. These additional tasks are often implemented by every user in mathematical toolboxes like MATLAB, Octave or Python. This requires the transfer of the data from one system to another and thus increases the possibility of errors. The application of dedicated tools for visualisation of flow or geographic data is also often beneficial, and a wide range of tools are freely available, though again problems arise from the necessity of transferring the data. Furthermore, ADP manufacturers almost exclusively support PCs directly, whereas small computing solutions like tablet computers, often running Android or Linux operating systems, seem better suited for online monitoring or data acquisition in field conditions. While many manufacturers offer support for developers, any such solution is limited to a single device of a single manufacturer. A common data format for all ADP data would allow development of applications and quicker distribution of new post-processing methodologies across the industry.
Abstract:
Quantile normalization (QN) is a technique for microarray data processing and is the default normalization method in the Robust Multi-array Average (RMA) procedure, which was primarily designed for analysing gene expression data from Affymetrix arrays. Given the abundance of Affymetrix microarrays and the popularity of the RMA method, it is crucially important that the normalization procedure is applied appropriately. In this study we carried out simulation experiments and also analysed real microarray data to investigate the suitability of RMA when it is applied to datasets with different groups of biological samples. Our experiments showed that RMA with QN does not preserve the biological signal included in each group, but rather mixes the signals between the groups. We also showed that the Median Polish method in the summarization step of RMA has a similar mixing effect. RMA is one of the most widely used methods in microarray data processing and has been applied to a vast volume of data in biomedical research. The problematic behaviour of this method suggests that previous studies employing RMA could have been misadvised or adversely affected. We therefore think it is crucially important that the research community recognizes the issue and starts to address it. The two core elements of the RMA method, quantile normalization and Median Polish, both have the undesirable effect of mixing biological signals between different sample groups, which can be detrimental to drawing valid biological conclusions and to any subsequent analyses. Based on the evidence presented here and in the literature, we recommend exercising caution when using RMA as a method of processing microarray gene expression data, particularly in situations where there are likely to be unknown subgroups of samples.
Abstract:
Field programmable gate array (FPGA) devices boast abundant resources with which custom accelerator components for signal, image and data processing may be realised; however, realising high-performance, low-cost accelerators currently demands manual register transfer level design. Software-programmable 'soft' processors have been proposed as a way to reduce this design burden, but they are unable to support performance and cost comparable to custom circuits. This paper proposes a new soft processing approach for FPGA which promises to overcome this barrier. A high-performance, fine-grained streaming processor, known as a Streaming Accelerator Element, is proposed which realises accelerators as large-scale custom multicore networks. By adopting a streaming execution approach with advanced program control and memory addressing capabilities, typical program inefficiencies can be almost completely eliminated, enabling performance and cost unprecedented amongst software-programmable solutions. When used to realise accelerators for the fast Fourier transform, motion estimation, matrix multiplication and Sobel edge detection, the proposed architecture is shown to enable real-time performance, with performance and cost comparable to hand-crafted custom circuit accelerators and up to two orders of magnitude beyond existing soft processors.
Abstract:
One of the fundamental problems with image processing of petrographic thin sections is that the appearance (colour/intensity) of a mineral grain will vary with the orientation of the crystal lattice relative to the preferred direction of the polarizing filters on a petrographic microscope. This makes it very difficult to determine grain boundaries, grain orientation and mineral species from a single captured image. To overcome this problem, the Rotating Polarizer Stage was used to replace the fixed polarizer and analyzer on a standard petrographic microscope. The Rotating Polarizer Stage rotates the polarizers while the thin section remains stationary, allowing for better data-gathering possibilities. Instead of capturing a single image of a thin section, six composite data sets are created by rotating the polarizers through 90° (or 180° if quartz c-axis measurements need to be taken) in both plane- and cross-polarized light. The composite data sets can be viewed as separate images and consist of the average intensity image, the maximum intensity image, the minimum intensity image, the maximum position image, the minimum position image and the gradient image. The overall strategy used by the image processing system is to gather the composite data sets, determine the grain boundaries using the gradient image, classify the different mineral species present using the minimum and maximum intensity images, and then perform measurements of grain shape and, where possible, partial crystallographic orientation using the maximum intensity and maximum position images.
Abstract:
The telemetry data processing operations intended for a given mission are pre-defined by the onboard telemetry configuration; the mission trajectory and overall telemetry methodology have stabilized lately for ISRO vehicles. The telemetry data processing problem is reduced through hierarchical problem reduction, whereby the sequencing of operations evolves as the control task and the operations on data as the function tasks. The input, output and execution criteria of each function task are captured in tables, which are examined by the control task; the control task then schedules a function task when its criteria are met.
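The control-task/function-task split can be sketched as a table-driven scheduler. This is a hypothetical illustration; the table fields and the example task are assumptions, not ISRO's actual format:

```python
def control_task(tasks, state):
    """Scan the task table and run each function task whose execution
    criterion is met, writing its result back into the shared state."""
    for task in tasks:
        if task["criterion"](state):
            state[task["output"]] = task["function"](state[task["input"]])
    return state

# Hypothetical function-task table: one entry that doubles a raw telemetry
# frame once raw data is available.
tasks = [{
    "input": "raw",
    "output": "calibrated",
    "function": lambda frame: [2 * v for v in frame],
    "criterion": lambda s: "raw" in s,
}]
```

Separating the sequencing (control task) from the data operations (function tasks) lets the same scheduler drive any mission-specific table.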