917 results for post-processing method


Relevance:

80.00%

Publisher:

Abstract:

Bangla OCR (Optical Character Recognition) is long-awaited software for the Bengali community all over the world. Numerous efforts suggest that, owing to the inherently complex nature of the Bangla alphabet and its word-formation process, developing a high-fidelity OCR that produces reasonably acceptable output still remains a challenge. One possible route to improvement is post-processing of the OCR's output; algorithms such as edit distance and n-gram statistics have been used to rectify misspelled words in language processing. This work presents the first known approach that uses these algorithms to replace misrecognized words produced by Bangla OCR. The assessment is made on a set of fifty documents written in Bangla script and uses a dictionary of 541,167 words. The proposed correction model corrects several words, lowering the recognition error rate by 2.87% and 3.18% for the character-based n-gram and edit distance algorithms, respectively. The developed system suggests a list of five alternatives for a misspelled word. In 33.82% of cases the correct word is the topmost suggestion in the five-word list for the n-gram algorithm, while for the edit distance algorithm the first suggestion matches correctly in 36.31% of cases. This work opens avenues of thought for further improvements in character recognition.
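
A minimal sketch of the dictionary-based correction step described above, assuming a plain word list and using Levenshtein edit distance to rank the closest candidates; the example dictionary and the cutoff of five suggestions are illustrative, not the system's actual data.

def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]

def suggest(word: str, dictionary: list[str], k: int = 5) -> list[str]:
    """Return the k dictionary words closest to the OCR output word."""
    return sorted(dictionary, key=lambda w: edit_distance(word, w))[:k]

# Usage (toy word list; the real system uses a 541,167-word Bangla dictionary):
dictionary = ["বাংলা", "বই", "বাঘ"]
print(suggest("বাংল", dictionary))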

Relevance:

80.00%

Publisher:

Abstract:

This thesis arises from the need to optimize, in terms of both acoustics and performance, a centrifugal fan already in use at the company. The first three chapters analyze the problem from a theoretical point of view, while the subsequent chapters address it computationally using CFD techniques. The first chapter gives a general treatment of centrifugal fans, focusing on the kinds of problems they encounter. The second chapter presents the theory underlying experimental measurement and acoustic analysis, together with a review of articles describing acoustic optimization techniques for centrifugal fans. The third chapter summarizes the theory behind fluid dynamics and a fluid-dynamic study. The fourth chapter explains how the fluid-dynamic model was built: the problem was analyzed in steady state, exploiting the Moving Reference Frame approach and treating the air as incompressible given the low Mach number. The acoustic analysis was carried out in post-processing using the Proudman model. Finally, the correlation between the three operating points on the fan's actual resistance curve was demonstrated, allowing the results obtained from the analysis of one point to be extended to the other two. The fifth chapter analyzes the results of the CFD simulations and proposes several geometry modifications; the chosen modification improved performance and reduced noise. The conclusions suggest further directions for a more accurate investigation and optimization of the fan.
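
For reference, the Proudman model mentioned above estimates the local broadband acoustic power per unit volume directly from the turbulence quantities of a steady RANS solution. A commonly implemented form is given below; the value of the constant is the one usually quoted in CFD codes and is stated here as an assumption, not taken from the thesis.

P_A = \alpha_\varepsilon \, \rho_0 \, \varepsilon \, M_t^5, \qquad M_t = \frac{\sqrt{2k}}{a_0}, \qquad \alpha_\varepsilon \approx 0.1

where \rho_0 is the far-field density, k the turbulent kinetic energy, \varepsilon its dissipation rate and a_0 the far-field speed of sound; the result is typically reported as an acoustic power level in dB relative to a reference power.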

Relevance:

80.00%

Publisher:

Abstract:

In the era of precision medicine and big medical data sharing, the workflow of large digital radiological datasets must be handled in a productive and effective way. In particular, it is now possible to extract information "hidden" in digital images in order to build diagnostic algorithms that help clinicians set up more personalized therapies, a central target of modern oncological medicine. The digital images generated for a patient carry a "texture" structure that is not visible; it is "hidden" because it cannot be recognized by sight alone. Thanks to artificial intelligence, pre- and post-processing software and mathematical calculation algorithms, a classification can be performed on the non-visible data contained in radiological images. Being able to calculate the volume of tissue body composition could lead to clustered classes of patients placed in standard morphological reference tables based on human anatomy, distinguished by gender and age, and perhaps in the future also by race. Furthermore, the branch of "morpho-radiology" is a useful way to address personalized therapies, which are particularly needed in the oncological field: oncological therapies are no longer based on generic drugs but on targeted, personalized therapy. The lack of gender- and age-specific therapy tables could be filled thanks to the application of morpho-radiology data analysis.
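
A minimal sketch of the kind of "non-visible" image descriptors referred to above, computing simple first-order texture features from a grayscale region of interest; the feature set and the synthetic image are illustrative only, not the study's actual pipeline.

import numpy as np

def first_order_texture_features(image: np.ndarray, n_bins: int = 64) -> dict:
    """Simple first-order texture descriptors of a grayscale image region."""
    values = image.ravel().astype(float)
    hist, _ = np.histogram(values, bins=n_bins)
    p = hist[hist > 0] / hist.sum()                      # bin probabilities
    return {
        "mean": values.mean(),
        "variance": values.var(),
        "skewness": ((values - values.mean()) ** 3).mean() / values.std() ** 3,
        "entropy": float(-(p * np.log2(p)).sum()),
    }

# Usage on a synthetic region (a real pipeline would use a segmented CT/MR region):
roi = np.random.default_rng(0).integers(0, 255, size=(64, 64))
print(first_order_texture_features(roi))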

Relevance:

80.00%

Publisher:

Abstract:

Activation functions play a crucial role in deep learning, since they allow neural networks to learn complex, non-trivial patterns in the data. However, the ability to approximate non-linear functions is a significant limitation when implementing neural networks on a quantum computer to solve typical machine learning tasks. The main burden lies in the unitarity constraint on quantum operators, which forbids non-linearity and poses a considerable obstacle to realizing such non-linear functions in a quantum setting. Nevertheless, several attempts to realize a quantum activation function have been made in the literature. Recently, QSplines were proposed to approximate a non-linear activation function by implementing a quantum version of spline functions. Yet QSplines suffer from several drawbacks. Firstly, the final function estimation requires a post-processing step, so the value of the activation function is not directly available as a quantum state. Secondly, QSplines need many error-corrected qubits and very long quantum circuits. These constraints prevent the adoption of QSplines on near-term quantum devices and limit their generalization capabilities. This thesis aims to overcome these limitations by leveraging hybrid quantum-classical computation. In particular, several methods for Variational Quantum Splines are proposed and implemented, paving the way for complete quantum activation functions and unlocking the full potential of quantum neural networks in quantum machine learning.
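
A purely classical illustration of the spline idea that QSplines build on: approximating a non-linear activation (here a sigmoid) with a handful of linear spline segments. This is not the thesis's variational quantum circuit, just a sketch of the underlying approximation.

import numpy as np

def linear_spline_activation(knots: np.ndarray, values: np.ndarray):
    """Piecewise-linear spline approximating an activation on [knots[0], knots[-1]]."""
    def act(x):
        return np.interp(x, knots, values)   # linear interpolation between knots
    return act

knots = np.linspace(-6.0, 6.0, 9)
sigmoid = 1.0 / (1.0 + np.exp(-knots))
approx = linear_spline_activation(knots, sigmoid)

x = np.linspace(-6.0, 6.0, 1000)
err = np.max(np.abs(approx(x) - 1.0 / (1.0 + np.exp(-x))))
print(f"max approximation error with 8 segments: {err:.4f}")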

Relevance:

80.00%

Publisher:

Abstract:

This study aims to develop and test a method for benchmarking two different Additive Manufacturing systems using a Renishaw Cyclone coordinate measuring machine. In particular, the form-accuracy performance of an FDM system and of a PolyJet system is evaluated, in order to obtain data on the potential of these two technologies for small parts. After a general introduction to Additive Manufacturing, the two techniques under study are described in detail, and the design of the experimental plan is discussed in relation to the goals of the activity and the methods chosen for data acquisition and evaluation. The work starts with the fabrication of a benchmark model, whose geometries are then measured with a coordinate measuring machine to obtain form-accuracy values, expressed as geometric tolerances in the GD&T system. All phases of the experimental activity are then described, beginning with the design of the benchmark model, continuing with the fabrication and measurement processes, and finally deriving the form-accuracy values through post-processing of the data. Lastly, the tolerance values obtained are presented and conclusions are drawn on the outcome of the experimental activity and on the comparison between the two Additive Manufacturing technologies.
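
A minimal sketch of the kind of post-processing step mentioned above: estimating a flatness tolerance from a cloud of CMM points on a nominally flat face as the spread of residuals around a least-squares plane. The data and function name are illustrative, and a least-squares fit is only an approximation of the GD&T minimum-zone definition.

import numpy as np

def flatness_from_points(points: np.ndarray) -> float:
    """Flatness estimate: separation of two parallel planes (least-squares fit) enclosing the points."""
    centroid = points.mean(axis=0)
    # Plane normal = singular vector with the smallest singular value of the centred cloud
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    distances = (points - centroid) @ normal   # signed distances to the fitted plane
    return float(distances.max() - distances.min())

# Usage on synthetic data: a slightly warped, noisy plane sampled on a grid (units: mm)
rng = np.random.default_rng(1)
x, y = np.meshgrid(np.linspace(0, 20, 15), np.linspace(0, 20, 15))
z = 0.002 * x + 0.01 * np.sin(x / 5) + rng.normal(0, 0.005, x.shape)
pts = np.column_stack([x.ravel(), y.ravel(), z.ravel()])
print(f"estimated flatness: {flatness_from_points(pts):.3f} mm")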

Relevance:

80.00%

Publisher:

Abstract:

Nowadays, recommender systems play a key role in managing information overload, particularly in areas such as e-commerce, music and cinema. Despite their benign purpose, however, there has been growing awareness in recent years of their role in creating unwanted effects on society, such as popularity bias or filter bubbles. This thesis investigates the role of recommender systems and their stakeholders in creating such effects. A simulation study is performed using EcoAgent, an RL-based multi-stakeholder recommendation system, in a simulation environment that captures the key interactions among users, suppliers and the recommender system, in order to identify potentially unhealthy scenarios for stakeholders. In particular, we focus on analyzing the document catalog to see how the diversity of topics that users have access to varies over the course of the interactions. Finally, two post-processing methods, one reactive and one proactive, are defined on top of EcoAgent; they allow us to manipulate the agent's behavior in order to study whether and how the topic distribution of documents is affected by content providers and by the fairness of the system.
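
A minimal sketch of one way to quantify the topic diversity discussed above, assuming each recommendation slate is a list of documents with topic labels and using the Shannon entropy of the slate's topic distribution. The names and data are illustrative, not EcoAgent's API.

import math
from collections import Counter

def topic_entropy(recommended_topics: list[str]) -> float:
    """Shannon entropy (bits) of the topic distribution in a recommendation slate."""
    counts = Counter(recommended_topics)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Usage: diversity drops as recommendations concentrate on a single topic
balanced = ["sports", "politics", "music", "science"]
skewed = ["sports", "sports", "sports", "music"]
print(topic_entropy(balanced), topic_entropy(skewed))   # 2.0 vs ~0.81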

Relevance:

50.00%

Publisher:

Abstract:

Proton nuclear magnetic resonance (1H NMR) spectroscopy is a successful technique for detecting biochemical changes in biological samples. However, the achievable NMR resolution is not sufficiently high when the analysis is performed on intact cells. To improve spectral resolution, high-resolution magic angle spinning (HR-MAS) is used and the broad signals are separated by a T2 filter based on the CPMG pulse sequence. Additionally, HR-MAS experiments with a T2 filter are preceded by a water-suppression procedure. The goal of this work is to demonstrate that the experimental procedures of water suppression and T2 or diffusion filters are unnecessary when the filter diagonalization method (FDM) is used to process the time-domain HR-MAS signals. Manipulation of the FDM results, represented as a tabular list of peak positions, widths, amplitudes and phases, allows removal of the water signals without disturbing overlapping or nearby signals. The FDM can also be used for phase correction and noise suppression, and to discriminate between sharp and broad lines. The results demonstrate the applicability of FDM post-acquisition processing for obtaining high-quality HR-MAS spectra of heterogeneous biological materials.
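
A minimal sketch of the post-acquisition idea described above: rebuild a spectrum from a tabular list of (position, width, amplitude, phase) entries, simply omitting peaks that fall inside the water region. The peak table and chemical-shift window are illustrative; a real FDM table comes from fitting the time-domain signal.

import numpy as np

def spectrum_from_peaks(peaks, ppm_axis, exclude=(4.5, 5.0)):
    """Sum Lorentzian lines from a peak table, skipping peaks in the excluded (water) window."""
    spec = np.zeros_like(ppm_axis)
    for position, width, amplitude, phase in peaks:
        if exclude[0] <= position <= exclude[1]:
            continue  # drop water peaks instead of suppressing them experimentally
        lorentz = amplitude * (width / 2) ** 2 / ((ppm_axis - position) ** 2 + (width / 2) ** 2)
        spec += np.cos(phase) * lorentz   # keep only the absorptive part in this sketch
    return spec

# Usage with a toy peak table: (ppm, width_ppm, amplitude, phase_rad)
peaks = [(1.3, 0.02, 1.0, 0.0),
         (3.2, 0.03, 0.6, 0.0),
         (4.7, 0.30, 50.0, 0.0)]  # residual water, removed by the filter above
ppm = np.linspace(0.0, 10.0, 4096)
clean = spectrum_from_peaks(peaks, ppm)
print(clean.max())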

Relevance:

40.00%

Publisher:

Abstract:

A simple, low-cost method for determining volatile contaminants in post-consumer recycled PET flakes was developed and validated using headspace dynamic concentration and gas chromatography with flame-ionization detection (HDC-GC-FID). The analytical parameters, evaluated using surrogates, include correlation coefficient, detection limit, quantification limit, accuracy, intra-assay precision and inter-assay precision. To compare the efficiency of the proposed method with recognized automated techniques, post-consumer PET packaging samples collected in Brazil were used. GC-MS was used to confirm the identity of the substances found in the PET packaging. Some of the identified contaminants were estimated in the post-consumer material at concentrations above 220 ng g-1. The findings corroborate data available in the scientific literature and point to the suitability of the proposed analytical method.
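
A minimal sketch of how two of the validation figures mentioned above, the detection limit (LOD) and quantification limit (LOQ), are commonly estimated from a linear calibration curve using the 3.3·s/slope and 10·s/slope conventions. The calibration data below are illustrative, not the study's results.

import numpy as np

def lod_loq_from_calibration(conc, response):
    """Estimate LOD and LOQ from a linear calibration: 3.3*s/slope and 10*s/slope."""
    slope, intercept = np.polyfit(conc, response, 1)
    residuals = response - (slope * conc + intercept)
    s = residuals.std(ddof=2)          # standard deviation of the regression residuals
    r = np.corrcoef(conc, response)[0, 1]
    return 3.3 * s / slope, 10 * s / slope, r

# Usage with an illustrative calibration set (concentration in ng/g, detector response in area units)
conc = np.array([50, 100, 200, 400, 800], dtype=float)
resp = np.array([1020, 2110, 4040, 8150, 16020], dtype=float)
lod, loq, r = lod_loq_from_calibration(conc, resp)
print(f"LOD ~ {lod:.1f} ng/g, LOQ ~ {loq:.1f} ng/g, r = {r:.4f}")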

Relevance:

40.00%

Publisher:

Abstract:

Frequency deviation is a common problem in power system signal processing. Many power system measurements are carried out at a fixed sampling rate, assuming the system operates at its nominal frequency (50 or 60 Hz). However, the actual frequency may deviate from the nominal value from time to time, owing to disturbances and the subsequent system transients. Measuring signals at a fixed sampling rate may introduce errors in such situations. To achieve high-precision measurement, appropriate algorithms are needed to reduce the impact of frequency deviation in the power system data-acquisition process. This paper proposes an advanced algorithm that enhances the Fourier transform for power system signal processing and effectively corrects frequency deviation under a fixed sampling rate. Accurate measurement of power system signals is essential for the secure and reliable operation of power systems, and the algorithm is readily applicable wherever signal processing is affected by frequency deviation. Both mathematical proof and numerical simulation are given to illustrate the robustness and effectiveness of the proposed algorithm.
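
A simple illustration of the problem described above (not the paper's algorithm): under a fixed sampling rate, a Fourier phasor evaluated at the nominal 50 Hz bin is biased by spectral leakage when the actual frequency drifts, whereas evaluating it at the estimated actual frequency removes most of the bias. The deviation used here is exaggerated for clarity.

import numpy as np

def phasor(signal, fs, f):
    """Single-frequency Fourier phasor of `signal` evaluated at frequency f (Hz)."""
    n = np.arange(len(signal))
    return 2.0 / len(signal) * np.sum(signal * np.exp(-2j * np.pi * f * n / fs))

fs = 1600.0                          # fixed sampling rate, 32 samples per nominal 50 Hz cycle
f_true = 51.0                        # actual system frequency after a disturbance (exaggerated)
t = np.arange(int(fs * 0.2)) / fs    # 0.2 s window, ten nominal cycles
x = 10.0 * np.cos(2 * np.pi * f_true * t + 0.3)

amp_nominal = abs(phasor(x, fs, 50.0))      # assuming nominal frequency: leakage-biased estimate
amp_corrected = abs(phasor(x, fs, f_true))  # using the estimated actual frequency
print(f"nominal-bin amplitude: {amp_nominal:.2f}, frequency-corrected: {amp_corrected:.2f}")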

Relevance:

40.00%

Publisher:

Abstract:

The presence of Mycobacterium bovis in bovine carcasses with lesions suggestive of tuberculosis was evaluated. Seventy-two carcass samples were selected during slaughter inspection procedures in abattoirs in the state of Mato Grosso do Sul, Brazil. Seventeen (23.6%) of the samples yielded colonies suggestive of mycobacteria, which were confirmed to be acid-fast bacilli by Ziehl-Neelsen staining. Polymerase chain reaction (PCR) using primers specific for M. bovis identified M. bovis in 13 (76.5%) isolates. PCR-restriction enzyme pattern analysis of the gene encoding the 65-kDa protein, using two restriction enzymes, identified the remaining four isolates as two M. tuberculosis complex organisms and two nontuberculous mycobacteria. The results are indicative of infection of slaughter cattle by M. bovis and other mycobacteria in the state of Mato Grosso do Sul.

Relevance:

40.00%

Publisher:

Abstract:

BACKGROUND: The annotation of protein post-translational modifications (PTMs) is an important task of UniProtKB curators and, with continuing improvements in experimental methodology, an ever greater number of articles are being published on this topic. To help curators cope with this growing body of information, we have developed a system that extracts information from the scientific literature for the most frequently annotated PTMs in UniProtKB. RESULTS: The procedure uses a pattern-matching and rule-based approach to extract sentences with information on the type and site of modification, and also provides a ranked list of candidate proteins for each modification. For PTM extraction, precision varies from 57% to 94% and recall from 75% to 95%, depending on the type of modification. The procedure was used to track new publications on PTMs and to recover potential supporting evidence for phosphorylation sites annotated on the basis of large-scale proteomics experiments. CONCLUSIONS: The information retrieval and extraction method developed in this study forms the basis of a simple tool for the manual curation of protein post-translational modifications in UniProtKB/Swiss-Prot. Our work demonstrates that even simple text-mining tools can be effectively adapted to database curation tasks, provided that a thorough understanding of the working process and requirements is first obtained. The system can be accessed at http://eagl.unige.ch/PTM/.
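
A minimal sketch of the pattern-matching idea described above, using a regular expression to pull candidate phosphorylation residues and positions from a sentence. The pattern and example sentence are illustrative, not the system's actual extraction rules.

import re

# Matches e.g. "phosphorylation of Ser-15", "phosphorylated at Thr308", "phosphorylation on Tyr 527"
SITE_PATTERN = re.compile(
    r"phosphorylat\w*\s+(?:of|at|on)\s+(Ser|Thr|Tyr)[-\s]?(\d+)",
    re.IGNORECASE,
)

def extract_phosphosites(sentence: str):
    """Return (residue, position) pairs mentioned in a sentence."""
    return [(m.group(1), int(m.group(2))) for m in SITE_PATTERN.finditer(sentence)]

text = "ATM phosphorylates p53, and phosphorylation of Ser-15 stabilizes the protein."
print(extract_phosphosites(text))   # [('Ser', 15)]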

Relevance:

40.00%

Publisher:

Abstract:

When dealing with nonlinear blind processing algorithms (deconvolution or post-nonlinear source separation), complex mathematical estimations must be performed, resulting in very slow algorithms. This is the case, for example, in speech processing, spike-signal deconvolution or microarray data analysis. In this paper, we propose a simple method to reduce the computational time required for the inversion of Wiener systems or the separation of post-nonlinear mixtures, by using a linear approximation in a minimum mutual information algorithm. Simulation results demonstrate that linear spline interpolation is fast and accurate, obtaining very good results (similar to those obtained without approximation) while dramatically decreasing computational time. Cubic spline interpolation also achieves similarly good results, but because of its intrinsic complexity the overall algorithm is much slower and hence not useful for our purpose.
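
A minimal sketch of the approximation trade-off discussed above: approximating a smooth nonlinearity on a coarse knot grid with linear versus cubic spline interpolation and comparing accuracy. The libraries and the test nonlinearity are illustrative choices, not the paper's implementation.

import numpy as np
from scipy.interpolate import CubicSpline

# Smooth invertible nonlinearity used as a stand-in for the estimated post-nonlinear function
f = np.tanh
knots = np.linspace(-3.0, 3.0, 15)
x = np.linspace(-3.0, 3.0, 100_000)

linear_approx = np.interp(x, knots, f(knots))        # linear spline
cubic_approx = CubicSpline(knots, f(knots))(x)       # cubic spline

print("max error, linear:", np.max(np.abs(linear_approx - f(x))))
print("max error, cubic :", np.max(np.abs(cubic_approx - f(x))))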

Relevance:

40.00%

Publisher:

Abstract:

Background: Being physically assaulted is known to increase the risk of the occurrence of post-traumatic stress disorder (PTSD) symptoms but it may also skew judgements about the intentions of other people. The objectives of the study were to assess paranoia and PTSD after an assault and to test whether theory-derived cognitive factors predicted the persistence of these problems. Method: At 4 weeks after hospital attendance due to an assault, 106 people were assessed on multiple symptom measures (including virtual reality) and cognitive factors from models of paranoia and PTSD. The symptom measures were repeated 3 and 6 months later. Results: Factor analysis indicated that paranoia and PTSD were distinct experiences, though positively correlated. At 4 weeks, 33% of participants met diagnostic criteria for PTSD, falling to 16% at follow-up. Of the group at the first assessment, 80% reported that since the assault they were excessively fearful of other people, which over time fell to 66%. Almost all the cognitive factors (including information-processing style during the trauma, mental defeat, qualities of unwanted memories, self-blame, negative thoughts about self, worry, safety behaviours, anomalous internal experiences and cognitive inflexibility) predicted later paranoia and PTSD, but there was little evidence of differential prediction. Conclusions: Paranoia after an assault may be common and distinguishable from PTSD but predicted by a strikingly similar range of factors.