978 results for Processing technique
Abstract:
This paper focuses on the development of methods and a cascade of models for flood monitoring and forecasting, and their implementation in a Grid environment. The processing of satellite data for flood extent mapping is done using neural networks. For flood forecasting we use a cascade of models: a regional numerical weather prediction (NWP) model, a hydrological model, and a hydraulic model. The implementation of the developed methods and models in the Grid infrastructure, together with related projects, is discussed.
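A minimal Python sketch of the staged pipeline such a cascade implies is given below; every function, input, and coefficient is an invented placeholder for illustration, not the authors' models.

```python
# Toy sketch of a flood-forecasting cascade: NWP -> hydrological -> hydraulic.
# All functions and coefficients below are invented placeholders.

def nwp_model(humidity: float, pressure_drop_hpa: float) -> float:
    """Toy NWP stage: maps atmospheric inputs to forecast daily rainfall (mm)."""
    return max(0.0, 10.0 * humidity + 2.0 * pressure_drop_hpa)

def hydrological_model(rainfall_mm: float, catchment_km2: float = 500.0) -> float:
    """Toy rainfall-runoff stage: daily rainfall over a catchment to discharge
    (m^3/s). mm * km^2 = 1000 m^3, spread over 86400 s, hence the /86.4."""
    runoff_coeff = 0.3  # fraction of rainfall reaching the river
    return runoff_coeff * rainfall_mm * catchment_km2 / 86.4

def hydraulic_model(discharge_m3s: float) -> float:
    """Toy rating-curve stage: discharge to river stage (m)."""
    return 0.05 * discharge_m3s ** 0.6

# The cascade: each model's output feeds the next one.
stage_m = hydraulic_model(hydrological_model(nwp_model(0.8, 3.0)))
print(f"forecast river stage: {stage_m:.2f} m")
```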
Abstract:
This paper describes a method of signal preprocessing under active monitoring. Suppose we want to solve the inverse problem of obtaining the response of a medium to one powerful signal, which is equivalent to obtaining the transmission function of the medium, but we do not have an opportunity to conduct such an experiment (it might be too expensive or harmful to the environment). In this case we can conduct a series of experiments of relatively low power and superpose the response signals. However, this superposition entails considerable loss of information (especially in the high-frequency domain) due to fluctuations of the phase, the frequency, and the starting time of each individual experiment. The preprocessing technique presented in this paper allows us to substantially restore the response of the medium and consequently to find a better estimate of the transmission function. This technique is based on expanding the initial signal into a system of orthogonal functions.
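The orthogonal-function expansion itself is not reproduced here, but the problem it addresses is easy to simulate. The sketch below, with entirely invented parameters, stacks jittered low-power responses naively and then with a simple cross-correlation alignment (a generic stand-in, not the paper's method) to show how start-time fluctuations destroy high-frequency content in a plain superposition:

```python
# Simulate weak, time-jittered responses and compare naive stacking with
# cross-correlation alignment (a generic stand-in for the paper's
# orthogonal-expansion restoration; all parameters invented).
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 2000)
true_response = np.exp(-((t - 0.5) / 0.01) ** 2)  # sharp, high-frequency arrival

# 200 low-power experiments: random start-time jitter plus strong noise.
traces = [np.roll(true_response, rng.integers(-40, 41)) + rng.normal(0, 0.5, t.size)
          for _ in range(200)]

naive_stack = np.mean(traces, axis=0)  # jitter smears the sharp arrival

# Align each trace to the naive stack via cross-correlation, then re-average.
aligned = []
for tr in traces:
    lag = int(np.argmax(np.correlate(tr, naive_stack, mode="same"))) - t.size // 2
    aligned.append(np.roll(tr, -lag))
aligned_stack = np.mean(aligned, axis=0)  # retains far more of the sharp arrival
```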
Abstract:
After many years of scholarly study, manuscript collections continue to be an important source of novel information for scholars, concerning both the history of earlier times and the development of cultural documentation over the centuries. The D-SCRIBE project aims to support and facilitate current and future efforts in manuscript digitization and processing. It strives toward the creation of a comprehensive software product that can assist content holders in turning an archive of manuscripts into a digital collection using automated methods. In this paper, we focus on the problem of recognizing early Christian Greek manuscripts. We propose a novel digital image binarization scheme for low-quality historical documents, allowing further content exploitation in an efficient way. Based on the existence of closed cavity regions in the majority of characters and character ligatures in these scripts, we propose a novel, segmentation-free, fast and efficient technique that assists the recognition procedure by tracing and recognizing the most frequently appearing characters or character ligatures.
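As a rough illustration of the cavity idea (not the authors' scheme), a generic adaptive binarization followed by contour-hierarchy analysis can locate the closed cavities that loops of characters and ligatures enclose; the file name below is hypothetical and OpenCV is assumed:

```python
# Generic binarization + closed-cavity detection sketch (OpenCV assumed;
# "manuscript_page.png" is a hypothetical input, and this local thresholding
# merely stands in for the paper's dedicated binarization scheme).
import cv2

img = cv2.imread("manuscript_page.png", cv2.IMREAD_GRAYSCALE)

# Local (adaptive) thresholding; ink becomes foreground (255) via INV.
binary = cv2.adaptiveThreshold(img, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                               cv2.THRESH_BINARY_INV, 31, 15)

# With RETR_CCOMP, contours that have a parent (h[3] != -1) are holes, i.e.
# closed cavities enclosed by character strokes (loops of omicron, rho, ...).
contours, hierarchy = cv2.findContours(binary, cv2.RETR_CCOMP,
                                       cv2.CHAIN_APPROX_SIMPLE)
cavities = [c for c, h in zip(contours, hierarchy[0]) if h[3] != -1]
print(f"found {len(cavities)} closed cavity regions")
```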
Abstract:
The solar power satellite is attracting attention as a clean, inexhaustible, large-scale base-load power supply. The following beam-control technology is used: a pilot signal is sent from the power receiving site and, after direction-of-arrival estimation, the beam is directed back to the earth in the same direction. A novel direction-finding algorithm based on the linear prediction technique, exploiting cyclostationary statistical information (spatial and temporal), is explored. Many modulated communication signals exhibit a cyclostationarity (or periodic correlation) property, corresponding to the underlying periodicity arising from carrier frequencies or baud rates. The problem was solved by using both cyclic second-order statistics and cyclic higher-order statistics. By evaluating the corresponding cyclic statistics of the received data at certain cycle frequencies, we can extract the cyclic correlations of only those signals with the same cycle frequency and null out the cyclic correlations of stationary additive noise and of all other co-channel interferences with different cycle frequencies. Thus, the signal detection capability can be significantly improved. The proposed algorithms employ cyclic higher-order statistics of the array output and suppress additive Gaussian noise of unknown spectral content, even when the noise shares common cycle frequencies with the non-Gaussian signals of interest. The proposed method fully exploits temporal information (multiple lags) and can correctly estimate the direction of arrival of desired signals by suppressing undesired signals. Our approach is generalized to direction-of-arrival estimation of cyclostationary coherent signals. In this paper, we propose a new approach to exploiting cyclostationarity that appears more advanced than other existing direction-finding algorithms.
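The selectivity the abstract relies on is easy to demonstrate for the second-order case. In the sketch below (invented parameters, and only the cyclic-correlation building block rather than the full direction-finding algorithm), a BPSK-like signal yields a strong cyclic autocorrelation at its cycle frequency of twice the carrier, while stationary noise and mismatched cycle frequencies average out:

```python
# Cyclic autocorrelation R_x^alpha(tau): nonzero only at the signal's cycle
# frequencies, which is what lets cyclic statistics null stationary noise and
# interference with other cycle frequencies. Parameters are invented.
import numpy as np

rng = np.random.default_rng(2)
fs, fc, n = 10_000.0, 1_000.0, 200_000
t = np.arange(n) / fs

symbols = rng.choice([-1.0, 1.0], n)  # crude BPSK, one symbol per sample
x = symbols * np.cos(2 * np.pi * fc * t) + rng.normal(0, 1.0, n)

def cyclic_autocorr(x, alpha, tau, fs):
    """Time-averaged lag product demodulated at cycle frequency alpha."""
    m = len(x) - tau
    tt = np.arange(m) / fs
    return np.mean(x[tau:] * x[:m] * np.exp(-2j * np.pi * alpha * tt))

print(abs(cyclic_autocorr(x, alpha=2 * fc, tau=0, fs=fs)))          # ~0.25: matching cycle freq
print(abs(cyclic_autocorr(x, alpha=2 * fc + 333.3, tau=0, fs=fs)))  # ~0: wrong cycle freq
```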
Abstract:
The purpose of this study was to investigate the effects of direct instruction in story grammar on the reading and writing achievement of second graders. Three aspects of story grammar (character, setting, and plot) were taught with direct instruction using the concept development technique of deep processing. Deep processing, which included (a) visualization (the drawing of pictures), (b) verbalization (the writing of sentences), (c) the attachment of physical sensations, and (d) the attachment of emotions to concepts, was used to help students make the mental connections necessary for recall and application of character, setting, and plot when constructing meaning in reading and writing. Four existing classrooms consisting of seventy-seven second-grade students were randomly assigned to two treatments, experimental and comparison. Both groups were pretested and posttested for reading achievement using the Gates-MacGinitie Reading Tests. Pretest and posttest writing samples were collected and evaluated. Writing achievement was measured using (a) a primary trait scoring scale (an adapted version of the Glazer Narrative Composition Scale) and (b) a holistic scoring scale by R. J. Pritchard. ANCOVAs were performed on the posttests, adjusted for the pretests, to determine whether the methods differed. There was no significant improvement in reading after the eleven-day experimental period for either group; nor did the two groups differ. There was significant improvement in writing for the experimental group over the comparison group. Pretreatment and posttreatment interviews were selectively collected to evaluate qualitatively whether the students were able to identify and manipulate elements of story grammar and to determine patterns in metacognitive processing. Interviews provided evidence that most students in the experimental group gained, while most students in the comparison group did not gain, in their ability to manipulate, with understanding, the concepts of character, setting, and plot.
Abstract:
This dissertation develops a new mathematical approach that overcomes the effect of a data processing phenomenon known as "histogram binning" inherent to flow cytometry data. A real-time procedure is introduced to prove the effectiveness and fast implementation of such an approach on real-world data. The histogram binning effect is a dilemma posed by two seemingly antagonistic developments: (1) flow cytometry data in its histogram form is extended in its dynamic range to improve its analysis and interpretation, and (2) the inevitable dynamic range extension introduces an unwelcome side effect, the binning effect, which skews the statistics of the data, undermining as a consequence the accuracy of the analysis and the eventual interpretation of the data. Researchers in the field contended with this dilemma for many years, resorting either to hardware approaches that are rather costly and carry inherent calibration and noise effects, or to software techniques based on filtering the binning effect that fail to preserve the statistical content of the original data. The mathematical approach introduced in this dissertation is so appealing that a patent application has been filed. The contribution of this dissertation is an incremental scientific innovation based on a mathematical framework that will allow researchers in the field of flow cytometry to improve the interpretation of data knowing that its statistical meaning has been faithfully preserved for optimized analysis. Furthermore, with the same mathematical foundation, proof of the origin of this inherent artifact is provided. These results are unique in that new mathematical derivations are established to define and solve the critical problem of the binning effect faced at the experimental assessment level, providing a data platform that preserves its statistical content. In addition, a novel method for accumulating the log-transformed data was developed. This new method uses the properties of the transformation of statistical distributions to accumulate the output histogram in a non-integer and multi-channel fashion. Although the mathematics of this new mapping technique seems intricate, the concise nature of the derivations allows for an implementation procedure that lends itself to a real-time implementation using lookup tables, a task that is also introduced in this dissertation.
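The lookup-table idea lends itself to a compact sketch. The following is a hedged illustration in the spirit of the abstract (the actual derivations are not reproduced): each linear ADC channel maps to a non-integer position on the log axis, and its counts are split between the two adjacent output bins via precomputed tables, so the per-event cost is a couple of lookups and multiply-adds:

```python
# LUT-based log-histogram accumulation sketch (illustrative only; bin counts,
# decade span, and the fractional-splitting rule are invented stand-ins for
# the dissertation's derivations).
import numpy as np

ADC_CHANNELS, OUT_BINS, DECADES = 1024, 256, 4.0

ch = np.arange(1, ADC_CHANNELS + 1, dtype=float)
pos = (np.log10(ch / ADC_CHANNELS) / DECADES + 1.0) * OUT_BINS  # non-integer position
pos = np.clip(pos, 0.0, OUT_BINS - 1.0)

lut_lo = np.floor(pos).astype(int)  # lower output bin per input channel
lut_frac = pos - lut_lo             # fractional weight toward the next bin

def accumulate(linear_hist):
    """Spread each channel's counts across two adjacent log bins; real-time
    friendly because only table lookups and multiply-adds are needed."""
    out = np.zeros(OUT_BINS)
    np.add.at(out, lut_lo, linear_hist * (1.0 - lut_frac))
    np.add.at(out, np.minimum(lut_lo + 1, OUT_BINS - 1), linear_hist * lut_frac)
    return out

log_hist = accumulate(np.random.default_rng(3).poisson(5.0, ADC_CHANNELS))
```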
Abstract:
A heuristic for batching orders in a manual order-picking warehouse has been developed. It prioritizes orders based on due time to prevent mixing of orders of different priority levels. The order density of aisles criterion is used to form batches. It also determines the number of pickers required and assigns batches to pickers such that there is a uniform workload per unit of time. The effectiveness of the heuristic was studied by observing computational time and aisle congestion for various numbers of total orders and numbers of orders per batch. An initial heuristic performed well for a small number of orders; for larger numbers of orders, a partitioning technique is computationally more efficient, needing only minutes to solve for thousands of orders while preserving 90% of the batch quality obtained with the original heuristic. Comparative studies between the heuristic and other published heuristics are needed.
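A toy version of such a batching rule is sketched below: orders are taken in due-time sequence and greedily grouped by aisle overlap, a simple stand-in for the order-density-of-aisles criterion (all data and the batch capacity are invented, and picker assignment is omitted):

```python
# Greedy due-time batching by aisle overlap (a toy stand-in for the thesis
# heuristic; orders, aisle sets, and capacity are invented).
from dataclasses import dataclass

@dataclass
class Order:
    oid: int
    due: float          # due time, which also drives priority
    aisles: frozenset   # aisles this order's picks touch

def batch_orders(orders, cap=4):
    batches, pool = [], sorted(orders, key=lambda o: o.due)
    while pool:
        seed = pool.pop(0)
        batch, aisles = [seed], set(seed.aisles)
        while len(batch) < cap and pool:
            # add the pending order whose picks share the most aisles
            best = max(pool, key=lambda o: len(aisles & o.aisles))
            if not aisles & best.aisles:
                break  # no aisle overlap left; start a new batch
            pool.remove(best)
            batch.append(best)
            aisles |= best.aisles
        batches.append(batch)
    return batches

orders = [Order(1, 9.0, frozenset({1, 2})), Order(2, 9.5, frozenset({2, 3})),
          Order(3, 8.0, frozenset({7})), Order(4, 9.2, frozenset({1, 3}))]
print([[o.oid for o in b] for b in batch_orders(orders)])  # [[3], [1, 4, 2]]
```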
Abstract:
As massive data sets become increasingly available, people face the problem of how to process and understand them effectively. Traditional sequential computing models are giving way to parallel and distributed computing models, such as MapReduce, due both to the large size of the data sets and to their high dimensionality. This dissertation, in line with other research based on MapReduce, develops effective techniques and applications using MapReduce that can help people solve large-scale problems. Three different problems are tackled in the dissertation. The first deals with processing terabytes of raster data in a spatial data management system: aerial imagery files are broken into tiles to enable data-parallel computation. The second and third problems deal with dimension reduction techniques for data sets of high dimensionality. Three variants of the nonnegative matrix factorization technique are scaled up to factorize matrices with dimensions on the order of millions in MapReduce, based on different matrix multiplication implementations. Two algorithms, which compute CANDECOMP/PARAFAC and Tucker tensor decompositions respectively, are parallelized in MapReduce by carefully partitioning the data and arranging the computation to maximize data locality and parallelism.
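The matrix-multiplication kernel on which such MapReduce NMF scaling rests reduces to a join on the shared index followed by a keyed sum. A toy in-memory rendering (plain Python generators standing in for actual Hadoop jobs; not the dissertation's implementation) is:

```python
# Toy map/reduce matrix multiplication over sparse triples: map joins A and B
# on the shared index j and emits partial products keyed by output cell (i, k);
# reduce sums them. Real jobs shard these phases across a cluster.
from collections import defaultdict

def map_phase(A_entries, B_entries):
    """A as (i, j, a_ij) triples, B as (j, k, b_jk) triples."""
    B_by_row = defaultdict(list)
    for j, k, b in B_entries:
        B_by_row[j].append((k, b))
    for i, j, a in A_entries:
        for k, b in B_by_row[j]:
            yield (i, k), a * b

def reduce_phase(pairs):
    """Sum partial products per output cell (i, k)."""
    acc = defaultdict(float)
    for key, val in pairs:
        acc[key] += val
    return dict(acc)

A = [(0, 0, 1.0), (0, 1, 2.0), (1, 1, 3.0)]  # sparse [[1, 2], [0, 3]]
B = [(0, 0, 4.0), (1, 0, 5.0), (1, 1, 6.0)]  # sparse [[4, 0], [5, 6]]
print(reduce_phase(map_phase(A, B)))  # {(0,0): 14.0, (0,1): 12.0, (1,0): 15.0, (1,1): 18.0}
```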
Abstract:
We describe a low-energy glow-discharge process using a reactive ion etching system that enables non-circular device patterns, such as squares or hexagons, to be formed from a precursor array of uniform circular openings in polymethyl methacrylate (PMMA) defined by electron beam lithography. This technique is of particular interest for bit-patterned magnetic recording medium fabrication, where close-packed square magnetic bits may improve recording performance. The process and results of generating close-packed square patterns by self-limiting low-energy glow discharge are investigated. Dense magnetic arrays formed by electrochemical deposition of nickel over the self-limiting molds are demonstrated.
Abstract:
Ellipsometry is a well-known optical technique used for the characterization of reflective surfaces and of films between two media. It is based on measuring the change in the state of polarization that occurs as a beam of polarized light is reflected from or transmitted through the film. Measuring this change can be used to calculate parameters of a single-layer film, such as the thickness and the refractive index. However, extracting these parameters of interest requires significant numerical processing because the governing equations are not analytically invertible. Typically, this is done using least-squares solvers, which are slow and adversely affected by local minima in the error surface. This thesis describes the development and implementation of a new technique using only Artificial Neural Networks (ANN) to calculate thin-film parameters. The new method is orders of magnitude faster than preceding methods, and convergence to local minima is completely eliminated.
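The idea can be sketched generically: train a network on synthetic (measurement to parameter) pairs generated by a forward model, so that inversion becomes a single forward pass with no iterative search. In the sketch below the "forward model" is an invented smooth map, not the ellipsometric equations, and scikit-learn's MLPRegressor stands in for whatever network the thesis used:

```python
# ANN inversion sketch: learn the inverse of a forward model from synthetic
# pairs. toy_forward is an invented stand-in for the ellipsometric equations.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)

def toy_forward(thickness_nm, ref_index):
    """Invented injective nonlinear map standing in for (psi, delta) = f(d, n)."""
    log_d = np.log(thickness_nm)
    return np.column_stack([log_d * ref_index, log_d - ref_index])

d = rng.uniform(10.0, 300.0, 20_000)   # film thickness, nm
n = rng.uniform(1.3, 2.5, 20_000)      # refractive index
X, y = toy_forward(d, n), np.column_stack([d, n])

net = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(64, 64),
                                 max_iter=500, random_state=0)).fit(X, y)

# One forward pass inverts a new measurement: should be roughly [120, 1.8].
print(net.predict(toy_forward(np.array([120.0]), np.array([1.8]))))
```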
Abstract:
Artisanal mining is a global phenomenon that threatens environmental health and safety. Ambiguities in the manner in which ore-processing facilities operate hinder the mining capacity of these miners in Ghana. These problems are reviewed with respect to current socio-economic conditions, health and safety, environmental impact, and the use of rudimentary technologies that limits fair-trade deals for miners. This research sought to use an established, data-driven, geographic information system (GIS)-based approach employing spatial analysis to locate a centralized processing facility within the Wassa Amenfi-Prestea Mining Area (WAPMA) in the Western region of Ghana. A spatial analysis technique utilizing ModelBuilder within the ArcGIS geoprocessing environment systematically and simultaneously analyzed a geographical dataset of selected criteria through suitability modeling. The spatial overlay analysis methodology and the multi-criteria decision analysis approach were selected to identify the most preferred locations for siting a processing facility. For an optimal site selection, seven major criteria were considered: proximity to settlements, water resources, artisanal mining sites, roads, railways, tectonic zones, and slopes. Site characterization and environmental considerations incorporated identified constraints, such as proximity to large-scale mines, forest reserves, and state lands. The analysis was limited to criteria that were selected and relevant to the area under investigation. Saaty's analytical hierarchy process was utilized to derive relative importance weights of the criteria, and a weighted linear combination technique was then applied to combine the factors and determine the degree of potential site suitability. The final map output indicates potential sites identified for the establishment of a facility centre. The results obtained provide intuitive areas suitable for consideration.
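The weighting-and-overlay arithmetic behind this workflow (as distinct from the ArcGIS ModelBuilder tooling itself) can be sketched compactly; the 3x3 comparison matrix and the tiny rasters below are invented for illustration:

```python
# AHP weights from the principal eigenvector of a Saaty pairwise-comparison
# matrix, then a weighted linear combination of criterion rasters with a
# constraint mask. Matrix values and rasters are invented.
import numpy as np

# Pairwise comparisons for 3 criteria (e.g. roads, water, settlements).
P = np.array([[1.0,   3.0,   5.0],
              [1/3.0, 1.0,   2.0],
              [1/5.0, 1/2.0, 1.0]])

vals, vecs = np.linalg.eig(P)
k = np.argmax(np.real(vals))
w = np.real(vecs[:, k])
w = w / w.sum()                  # AHP weights, summing to 1
ci = (np.real(vals[k]) - 3) / 2  # consistency index (vs random index 0.58 for n=3)

# Criterion rasters rescaled to a common 0-1 suitability scale.
rasters = np.stack([np.random.default_rng(5).random((4, 4)) for _ in range(3)])
suitability = np.tensordot(w, rasters, axes=1)  # weighted linear combination

# Constraints (e.g. forest reserves, state lands) enter as a 0/1 mask.
mask = np.ones((4, 4)); mask[0, 0] = 0.0
print(suitability * mask)
```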
Abstract:
In industrial plants, oil and oil compounds are usually transported through closed pipelines of circular cross-section. The use of radiotracers in oil transport and processing facilities allows calibrating flowmeters, measuring mean residence time in cracking columns, locating points of obstruction or leakage in underground ducts, and investigating flow behavior in industrial processes such as distillation towers. Inspection techniques using radiotracers are non-destructive, simple, economical, and highly accurate. Among them, Total Count, which uses a small amount of radiotracer with known activity, is acknowledged as an absolute technique for flow rate measurement. To conduct the research, a viscous fluid transport system was designed and assembled, composed of four PVC pipelines, each 13 m long (12 m horizontal and 1 m vertical) with ½, ¾, 1 and 2-inch gauges, respectively, interconnected by maneuvering valves. This system was used to simulate different flow conditions of petroleum compounds and for experimental studies of the flow profile in the horizontal and upward directions. As ¹⁹⁸Au presents a single photopeak (411.8 keV), it was the radioisotope chosen for labeling the oil, which was injected into the transport lines in small amounts (6 mL, around 200 kBq of activity). A 2″ × 2″ NaI scintillation detector with well-defined geometry was used to measure total activity and determine the calibration factor F; positioned after a homogenization distance and connected to a standardized set of nuclear instrumentation modules (NIM), it detected the passage of the radioactive cloud.
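For orientation, the classical Total Count relation (the textbook form of the technique, not a formula quoted from this paper) is:

```latex
\[
  Q = \frac{A\,F}{N}, \qquad N = \int_{0}^{\infty} n(t)\,\mathrm{d}t,
\]
```

where Q is the volumetric flow rate, A the injected tracer activity, F the calibration factor (count rate per unit activity concentration, obtained with the same detector geometry), and N the total number of counts recorded as the radioactive cloud passes the detector.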
Abstract:
The dual problems of sustaining the fast growth of human society and preserving the environment for future generations urge us to shift our focus from exploiting fossil oils to researching and developing more affordable, reliable, and clean energy sources. Humans have a long history of meeting their energy demands with plant biomass, and modern biorefinery technologies enable the effective conversion of biomass into transportation fuels and bulk and fine chemicals, alleviating our reliance on fossil fuel resources of declining supply. With the aim of replacing as much non-renewable carbon from fossil oils as possible with renewable carbon from biomass, innovative R&D activities must strive to enhance the current biorefinery process and secure our energy future. Much of my Ph.D. research is centered on the electrocatalytic conversion of biomass-derived compounds to produce value-added chemicals, biofuels, and electrical energy on model electrocatalysts in AEM/PEM-based continuous-flow electrolysis cell and fuel cell reactors. High electricity generation performance was obtained when glycerol or crude glycerol was employed as the fuel in AEMFCs. The study of selective electrocatalytic oxidation of glycerol shows an electrode-potential-regulated product distribution in which tartronate and mesoxalate can be produced selectively by switching the electrode potential. This finding led to the development of AEMFCs that produce valuable tartronate or mesoxalate with high selectivity and yield while cogenerating electricity. Reaction mechanisms of the electrocatalytic oxidation of ethylene glycol and 1,2-propanediol were further elucidated by means of an on-line sample collection technique and DFT modeling. Besides the electro-oxidation of biorenewable alcohols to chemicals and electricity, the electrocatalytic reduction of keto acids (e.g. levulinic acid) was also studied for upgrading biomass-based feedstocks to biofuels while achieving renewable electricity storage. Meanwhile, the ORR that is coupled at the cathode in AEMFCs was investigated on a non-PGM electrocatalyst with activity comparable to commercial Pt/C. The electro-biorefinery process could be coupled with traditional biorefinery operations and will play a significant role in our energy and chemical landscape.
Abstract:
The formation of reactive oxygen species (ROS) within cells causes damage to biomolecules, including membrane lipids, DNA, proteins, and sugars. An important type of oxidative damage is DNA base hydroxylation, which leads to the formation of 8-oxo-7,8-dihydro-2′-deoxyguanosine (8-oxodG) and 5-hydroxymethyluracil (5-HMUra). Measurement of these biomarkers in urine is challenging due to the low levels of the analytes and the matrix complexity. In order to simultaneously quantify 8-oxodG and 5-HMUra in human urine, a new, reliable and powerful strategy was optimised and validated. It is based on a semi-automatic microextraction by packed sorbent (MEPS) technique, using a new digitally controlled syringe (eVol®) to enhance the extraction efficiency of the target metabolites, followed by fast and sensitive ultra-high pressure liquid chromatography (UHPLC). The optimal methodological conditions involve loading 250 µL of urine sample (1:10 dilution) through a C8 sorbent in a MEPS syringe placed in the semi-automatic eVol® syringe, followed by elution using 90 µL of 20% methanol in 0.01% formic acid solution. The obtained extract is analysed directly in the UHPLC system using a binary mobile phase composed of aqueous 0.1% formic acid and methanol in isocratic elution mode (3.5 min total analysis time). The method was validated in terms of selectivity, linearity, limit of detection (LOD), limit of quantification (LOQ), extraction yield, accuracy, precision, and matrix effect. Satisfactory results were obtained in terms of linearity (r² ≥ 0.991) within the established concentration range. The LOD varied from 0.00005 to 0.04 µg mL⁻¹ and the LOQ from 0.00023 to 0.13 µg mL⁻¹. The extraction yields were between 80.1 and 82.2%, while inter-day precision (n = 3 days) varied between 4.9 and 7.7% and intra-day precision between 1.0 and 8.3%. The main advantages of this approach are the ability to easily collect and store urine samples for further processing, and the high sensitivity, reproducibility, and robustness of eVol®-MEPS combined with UHPLC analysis, providing a fast and reliable assessment of oxidatively damaged DNA.
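As a generic illustration of how such figures of merit are commonly derived from a calibration line (ICH-style 3.3σ/slope and 10σ/slope; not necessarily the exact procedure used in this validation, and the data are invented):

```python
# LOD/LOQ from calibration-curve residuals (generic ICH-style sketch with
# invented data; not necessarily the procedure used in the paper).
import numpy as np

conc = np.array([0.001, 0.005, 0.01, 0.05, 0.1])  # ug/mL, invented standards
resp = np.array([0.9, 4.8, 10.2, 49.5, 101.0])    # peak areas, invented

slope, intercept = np.polyfit(conc, resp, 1)
residual_sd = np.std(resp - (slope * conc + intercept), ddof=2)

lod = 3.3 * residual_sd / slope   # limit of detection
loq = 10.0 * residual_sd / slope  # limit of quantification
print(f"LOD ~ {lod:.5f} ug/mL, LOQ ~ {loq:.5f} ug/mL")
```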