930 results for post-processing


Relevance:

60.00%

Publisher:

Abstract:

The aim of my thesis is to parallelize the Weighted Histogram Analysis Method (WHAM), a popular algorithm used to calculate the free energy of a molecular system in Molecular Dynamics simulations. WHAM works in post-processing, in cooperation with another algorithm called Umbrella Sampling. Umbrella Sampling adds a bias to the potential energy of the system in order to force the system to sample a specific region of configurational space. N independent simulations are performed in order to sample the whole region of interest. Subsequently, the WHAM algorithm is used to estimate the original system energy starting from the N atomic trajectories. The parallelization of WHAM has been performed with CUDA, a language for programming the GPUs of NVIDIA graphics cards, which have a parallel architecture. The parallel implementation can considerably speed up WHAM execution compared to previous serial CPU implementations; the CPU code, however, becomes critically slow at very high numbers of iterations. The algorithm was written in C++ and executed on UNIX systems equipped with NVIDIA graphics cards. The results were satisfying, showing a performance increase when the model was executed on graphics cards of higher compute capability. Nonetheless, the GPUs used to test the algorithm are rather old and not designed for scientific computing. A further performance increase is likely if the algorithm were executed on GPU clusters of high computational efficiency. The thesis is organized as follows: I first describe the mathematical formulation of Umbrella Sampling and the WHAM algorithm, with their applications to the study of ionic channels and to Molecular Docking (Chapter 1); then I present the CUDA architectures used to implement the model (Chapter 2); finally, the results obtained on model systems are presented (Chapter 3).
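The WHAM estimate described above amounts to solving a pair of self-consistent equations: the unbiased bin probabilities depend on per-window free-energy shifts, and vice versa. The following minimal serial Python sketch (not the thesis' C++/CUDA code; the two-window histogram and biases are made up for illustration) shows the iteration that the GPU version parallelizes over bins and windows:

```python
import math

def wham(hist, n_samples, bias, kT=1.0, tol=1e-10, max_iter=10000):
    """Self-consistent WHAM iteration over M bins and S umbrella windows.

    hist[i][b]   -- histogram counts of window i in bin b
    n_samples[i] -- total samples drawn in window i
    bias[i][b]   -- umbrella bias potential of window i at bin b
    Returns the unbiased, normalised bin probabilities P[b].
    """
    S, M = len(hist), len(hist[0])
    c = [[math.exp(-bias[i][b] / kT) for b in range(M)] for i in range(S)]
    f = [0.0] * S                       # free-energy shift of each window
    P = [1.0 / M] * M
    for _ in range(max_iter):
        # unbiased probability of each bin given the current shifts
        P = [sum(hist[i][b] for i in range(S)) /
             sum(n_samples[i] * math.exp(f[i] / kT) * c[i][b]
                 for i in range(S))
             for b in range(M)]
        # shifts that make each window self-consistent with P
        f_new = [-kT * math.log(sum(c[i][b] * P[b] for b in range(M)))
                 for i in range(S)]
        if max(abs(a - b) for a, b in zip(f, f_new)) < tol:
            f = f_new
            break
        f = f_new
    Z = sum(P)
    return [p / Z for p in P]
```

In the GPU implementation, both inner sums (over windows for each bin, and over bins for each window) are natural candidates for parallel reduction.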

Relevance:

60.00%

Publisher:

Abstract:

Several countries have acquired, over the past decades, large amounts of area-covering Airborne Electromagnetic (AEM) data. The contribution of airborne geophysics to both groundwater resource mapping and management has increased dramatically, proving how appropriate those systems are for large-scale, efficient groundwater surveying. We start with the processing and inversion of two AEM datasets from two different systems collected over the Spiritwood Valley Aquifer area, Manitoba, Canada: the AeroTEM III dataset (commissioned by the Geological Survey of Canada in 2010) and the "Full waveform VTEM" dataset, collected and tested over the same survey area during the fall of 2011. We demonstrate that, in the presence of multiple datasets of both AEM and ground data, proper processing, inversion, post-processing, data integration and data calibration form the approach capable of providing reliable and consistent resistivity models. Our approach can be of interest to many end users, ranging from geological surveys and universities to private companies, which often hold large geophysical databases to be interpreted for geological and/or hydrogeological purposes. In this study we investigate in depth the role of integrating several complementary types of geophysical data collected over the same survey area. We show that data integration can improve inversions, reduce ambiguity and deliver high-resolution results. We further use the final, most reliable output resistivity models as a solid basis for building a knowledge-driven 3D geological voxel-based model. A voxel approach allows a quantitative understanding of the hydrogeological setting of the area, and it can further be used to estimate aquifer volumes (i.e. the potential amount of groundwater resources) as well as for hydrogeological flow model prediction.
In addition, we investigated the impact of an AEM dataset on hydrogeological mapping and 3D hydrogeological modeling, comparing it to having only a ground-based TEM dataset and/or only borehole data.
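The voxel-based aquifer-volume estimate mentioned above can be reduced to counting voxels whose inverted resistivity falls in the range interpreted as water-saturated sediments. A toy sketch (the resistivity cutoff `rho_max` is a hypothetical, site-specific assumption, not a value from the study):

```python
def aquifer_volume(resistivity, voxel_volume_m3, rho_max=30.0):
    """Rough aquifer bulk-volume estimate from a voxel resistivity model.

    resistivity     -- flat list of voxel resistivities (ohm*m)
    voxel_volume_m3 -- volume of a single voxel
    rho_max         -- voxels below this threshold are counted as aquifer
                       material (illustrative cutoff, to be calibrated
                       against boreholes and ground TEM)
    """
    n_aquifer = sum(1 for rho in resistivity if rho < rho_max)
    return n_aquifer * voxel_volume_m3
```

Multiplying the bulk volume by an assumed effective porosity would then give the potential groundwater volume.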

Relevance:

60.00%

Publisher:

Abstract:

Non-invasive molecular-imaging technologies are playing a key role in drug discovery, development and delivery. Positron Emission Tomography (PET) is such a molecular imaging technology and a powerful tool for the observation of various diseases in vivo. However, it is limited by the availability of vectors with high selectivity to the target and of radionuclides with a physical half-life matching the biological half-life of the observed process. The 68Ge/68Ga radionuclide generator makes the PET nuclide available anywhere, without an on-site cyclotron. Besides its excellent availability, 68Ga shows nuclide properties well suited for PET, but it has to be coordinated by a chelator to be introduced into a radiopharmaceutical.
However, the physical half-life of 68Ga (67.7 min) might limit the spectrum of clinical applications of 68Ga-labelled radiodiagnostics. Furthermore, 68Ga-labelled analogues of endoradiotherapeuticals of longer biological half-life, such as 90Y- or 177Lu-labelled peptides and proteins, cannot be used to determine individual radiation dosimetry directly.
Thus, radionuclide generator systems providing positron-emitting daughters of extended physical half-life are of renewed interest. In this context, generator-derived positron emitters with longer physical half-lives are needed, such as 72As (T½ = 26 h) from the 72Se/72As generator, or 44Sc (T½ = 3.97 h) from the 44Ti/44Sc generator.
This thesis describes the implementation of radioactive gallium-68 and scandium-44 for molecular imaging and nuclear medical diagnosis, beginning with the chemical separation and purification of 44Ti as the mother radionuclide, the investigation of pilot generators with different elution modes, the construction of a prototype generator, the development and investigation of post-processing of the generator eluate (its concentration and further purification), the labelling chemistry under different conditions, in vitro and in vivo studies of labelled compounds and, finally, in vivo imaging experiments.
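The half-life argument above is just exponential decay: the fraction of activity remaining after time t is A/A0 = 2^(-t/T½). A small sketch comparing the two nuclides named in the text over a hypothetical 90-minute preparation-plus-scan protocol:

```python
def remaining_fraction(t_min, half_life_min):
    """Fraction of radioactivity left after t_min minutes: 2^(-t/T_half)."""
    return 2.0 ** (-t_min / half_life_min)

# Half-lives from the text: 68Ga T1/2 = 67.7 min, 44Sc T1/2 = 3.97 h.
# After an illustrative 90-minute protocol, far more 44Sc activity survives,
# which is why longer-lived generator daughters are of interest.
ga68_left = remaining_fraction(90.0, 67.7)
sc44_left = remaining_fraction(90.0, 3.97 * 60.0)
```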

Relevance:

60.00%

Publisher:

Abstract:

This thesis studies a method to model and virtualize, through Matlab algorithms, the harmonic distortions of a nonlinear audio device, i.e. an "instrument" that, when excited by an audio signal, modifies it by introducing components not previously present. The device chosen for this study is the BOSS SD-1 Super OverDrive pedal for electric guitar, and the "mathematical instrument" that provides its model is the Volterra series expansion. The Volterra series is widely used in the study of nonlinear physical systems whenever one wants to model a system that presents itself as a black box. The Nonlinear Convolution method designed by Angelo Farina has successfully applied this expansion to musical acoustics as well: using an easily realizable measurement technique and the model provided by the diagonal Volterra series, the method characterizes a nonlinear audio device through the nonlinear impulse responses that the device yields in response to a suitable test signal (the Exponential Sine Sweep). The impulse responses of the device are used to derive the Volterra kernels of the series. This method allowed the University of Bologna to obtain a patent for software that virtualizes the nonlinearities of an audio system in post-processing. This thesis builds on the work that led to the patent and introduces two innovations: the test signal was changed (a Synchronized Sine Sweep was used in place of the Exponential Sine Sweep), and a first attempt was made to move the virtualization towards real-time processing, implementing a (post-processing) procedure that creates the kernels as a function of the volume given as input to the nonlinear device.
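The Exponential Sine Sweep mentioned above has a standard closed form: its instantaneous frequency rises exponentially from f1 to f2, which is what lets the deconvolved response separate the harmonic orders in time. A minimal generator (illustrative only; the thesis actually replaces this signal with a Synchronized Sine Sweep):

```python
import math

def exponential_sine_sweep(f1, f2, duration, fs):
    """Exponential sine sweep x(t) = sin(w1*T/R * (exp(t*R/T) - 1)),
    with R = ln(f2/f1): instantaneous frequency rises exponentially
    from f1 to f2 over `duration` seconds, sampled at fs Hz."""
    T = duration
    R = math.log(f2 / f1)
    w1 = 2.0 * math.pi * f1
    n = int(T * fs)
    return [math.sin(w1 * T / R * (math.exp((i / fs) / T * R) - 1.0))
            for i in range(n)]
```

Convolving the device's recorded output with the matched inverse sweep yields the sequence of nonlinear impulse responses from which the diagonal Volterra kernels are extracted.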

Relevance:

60.00%

Publisher:

Abstract:

Analysis of flow and temperature fields in microchannels as the operating conditions and geometry vary. After a brief introduction to the field of microfluidics, the operating conditions that characterize flow in the presence of rarefaction are presented from a theoretical point of view, highlighting how the basic mathematical model of fluid dynamics must be modified to take rarefaction into account. A brief digression follows, illustrating the technologies and fabrication processes of the modern microchannels used in this recent branch of thermo-fluid dynamics. The thermo-hydraulic behaviour of these microchannels is then examined as geometric parameters such as the aspect ratio and the corner radius of curvature of the cross-section vary, through simulations carried out with FlexPDE; this software leaves post-processing to the user.
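The degree of rarefaction that decides which corrections the flow model needs is conventionally measured by the Knudsen number Kn = λ/D_h, the ratio of the gas mean free path to the channel's hydraulic diameter. A small sketch using the commonly cited regime boundaries (the numeric limits are the conventional ones, not values from this work):

```python
def hydraulic_diameter(area, perimeter):
    """D_h = 4A/P for a non-circular cross-section."""
    return 4.0 * area / perimeter

def knudsen(mean_free_path, hydraulic_diam):
    """Kn = lambda / D_h."""
    return mean_free_path / hydraulic_diam

def flow_regime(kn):
    """Conventional rarefaction classification used in microfluidics."""
    if kn < 1e-3:
        return "continuum"          # Navier-Stokes, no-slip walls
    if kn < 1e-1:
        return "slip flow"          # Navier-Stokes + slip/temperature-jump BCs
    if kn < 10.0:
        return "transition"         # kinetic methods needed
    return "free molecular"
```

For air (λ ≈ 68 nm at ambient conditions) a 100 µm channel is still in the continuum regime, while a 10 µm channel already requires slip boundary conditions.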

Relevance:

60.00%

Publisher:

Abstract:

Near-surface geothermal energy makes an important contribution to climate and environmental protection in the field of renewable heat. To optimize the technical use of near-surface geothermal energy, knowledge of the nature of the geological subsurface is crucial. This dissertation deals with the determination of various subsurface parameters at a borehole heat exchanger field. Investigations to determine the thermal conductivity, such as the enhanced Thermal Response Test (eTRT), were carried out, along with subsurface temperature monitoring during the first year of operation. The monitoring showed no mutual interference between individual boreholes. A comparison between the planned and the actual heat demand of the first year of operation revealed a deviation of about 35%, showing that the usage parameters of the installation can significantly influence its efficiency. The eTRT carried out at the example site was checked for reproducibility by means of numerical modelling. For purely conductive heat transport in the subsurface, the maximum deviation of the measurement from the expected value was only about 6%, even under unfavourable conditions. The detection of layers with groundwater flow is also well reproduced in the models. The test's strong dependence on a constant heat input remains problematic: only the determination of the thermal conductivity from the relaxation behaviour of the subsurface yields sufficiently accurate results when the heat input fluctuates. The mathematical post-processing of faulty temperature curves offers an entry point for further research.
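Thermal response tests of this kind are commonly evaluated with the infinite-line-source approximation: at late times the mean fluid temperature grows as T(t) ≈ (q/4πλ)·ln t + const, so the effective thermal conductivity follows from the slope of T against ln t. A minimal sketch of that standard evaluation (synthetic data; not the dissertation's eTRT code):

```python
import math

def conductivity_from_trt(times_s, temps_C, q_per_m):
    """Line-source TRT evaluation: fit T = slope*ln(t) + b by least squares,
    then lambda = q / (4*pi*slope).

    times_s -- measurement times (s), late enough for the log regime
    temps_C -- mean fluid temperatures (deg C)
    q_per_m -- constant heat injection rate per borehole metre (W/m)
    """
    xs = [math.log(t) for t in times_s]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(temps_C) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, temps_C)) /
             sum((x - mx) ** 2 for x in xs))
    return q_per_m / (4.0 * math.pi * slope)
```

The text's point about constant heat input is visible here: if q fluctuates, the fitted slope no longer reflects λ, which is why the relaxation (recovery) phase is the more robust estimator.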

Relevance:

60.00%

Publisher:

Abstract:

In a radar system it is essential to detect, recognize and try to track the path of a possible intruder present in an observed area, with security as the ultimate goal, in military as well as civil settings. In this respect, considerable progress has been made in the creation and development of passive localization systems that can detect a target (whose only property is that of reflecting a signal sent by the transmitter), so that its presence is clearly distinguished from its absence in the surveillance area. In particular, the use of multistatic radar (one transmitter and several receivers) allows greater precision in monitoring the observed area. Among the best technologies supporting this analysis is Ultra Wide-Band (UWB), which exploits a very large bandwidth and can reach centimetre-level precision in indoor scenarios. UWB uses very short, wide-band pulse signals that provide high resolution, enough, in some applications, to see through walls and to easily remove the static elements present in the environment, i.e. the clutter. It is therefore essential to have algorithms for the detection and tracking of the path followed by the target in the area. In this thesis, new algorithms are developed for clustering the signal received from the reflection off the intruder, used to improve its visualization in post-processing. Finally, these algorithms were also applied to experimental measurements carried out with Time Domain PulsOn 410 nodes, with the ultimate goal of detecting the presence of a target in the nodes' observation area.
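Clustering of this kind groups detections that plausibly come from the same physical target. The thesis' own algorithms are not reproduced here; as a stand-in, a greedy single-linkage sketch over 2-D detection coordinates (the distance threshold `eps` is an illustrative parameter):

```python
def cluster(points, eps=0.3):
    """Greedy single-linkage clustering: a point joins a cluster if it lies
    within eps (e.g. metres) of any member; clusters that both match a new
    point are merged. Returns a list of clusters (lists of points)."""
    clusters = []
    for p in points:
        merged = None
        for c in clusters:
            if any((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= eps * eps
                   for q in c):
                if merged is None:
                    c.append(p)
                    merged = c
                else:               # p bridges two clusters: merge them
                    merged.extend(c)
                    c.clear()
        clusters = [c for c in clusters if c]
        if merged is None:
            clusters.append([p])
    return clusters
```

Each resulting cluster can then be summarized (e.g. by its centroid) as one candidate target position for the tracking stage.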

Relevance:

60.00%

Publisher:

Abstract:

This thesis presents the navigation systems that have evolved, along with scientific and technological progress, from the first measurements of the Earth by Hellenic civilization about 2,500 years ago to modern satellite systems and the never-obsolete radio-navigation systems. Navigation systems must meet modern society's ever-growing demand for precision, reliability, continuity and global coverage of the service. It suffices to consider that civil air traffic alone currently carries 5 billion passengers every year, on more than 60 million flights and with 85 million tonnes of cargo (ACI - World Airports Council International, 2012). World maritime freight traffic amounted to about 650 million TEU (twenty-foot equivalent unit, the standard volume measure of ISO container transport, corresponding to about 40 cubic metres) in 2013 alone (IAPH - International Association of Ports and Harbors, 2013). These few but significant numbers indicate an evident need to "guide" this enormous, ever-growing flow of aircraft and ships around the world in the most appropriate way, tracing adequate routes and guaranteeing the necessary safety even in the most delicate phases (take-off and landing for aeroplanes, port manoeuvres for large ships). The thesis investigates which navigation systems can fulfil this "guiding" role for air and maritime transport, and to what extent.

Relevance:

60.00%

Publisher:

Abstract:

The objective of this thesis is the processing of GNSS data in kinematic post-processing mode for structural monitoring and, in a second phase, the study of the precision attainable by the solutions obtained using post-processing algorithms. The object of study is the Garisenda tower, located in piazza Ravegnana, next to the Asinelli tower, in the historic centre of Bologna, which has long been the subject of studies and monitoring because of its particularly critical tilt. A data set of fifteen days was used, from 15/12/2013 to 29/12/2013 inclusive. The data were processed with goGPS, an open-source software package developed by researchers at Politecnico di Milano. Since goGPS is a new code, it had to be tested before valid results could be obtained. The first phase of the thesis therefore addressed the calibration of the parameters that yield the most precise solutions for monitoring purposes, considering the possible choices offered by the code. In particular, calibrated movements were imposed and the solution was observed as the selected parameters varied, choosing the best setting, i.e. the best compromise between the ability to detect the movements and the noise of the series. In the second phase, in order to improve the precision of the solutions, correction methods based on sequential filters were evaluated, and the precision gain resulting from applying such corrections was analysed.
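A sequential filter of the kind evaluated in the second phase can be illustrated, in its simplest scalar form, by a random-walk Kalman filter applied to one noisy coordinate series (an illustrative stand-in, not the specific filters used in the thesis; the variances q and r are made-up tuning values):

```python
def kalman_1d(zs, q=1e-5, r=1e-2, x0=0.0, p0=1.0):
    """Scalar random-walk Kalman filter over a measurement series zs.

    q  -- process variance (how fast the true coordinate may drift)
    r  -- measurement variance (receiver noise)
    Returns the filtered coordinate series.
    """
    x, p = x0, p0
    out = []
    for z in zs:
        p += q                      # predict: state keeps its value, variance grows
        k = p / (p + r)             # Kalman gain
        x += k * (z - x)            # update with the innovation
        p *= (1.0 - k)
        out.append(x)
    return out
```

The trade-off the thesis calibrates appears directly here: a small q suppresses noise but delays the detection of a real (imposed) movement, while a large q tracks movements quickly at the cost of a noisier series.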

Relevance:

60.00%

Publisher:

Abstract:

Statistical shape models (SSMs) have been used widely as a basis for segmenting and interpreting complex anatomical structures. The robustness of these models is sensitive to the registration procedure, i.e., the establishment of a dense correspondence across a training data set. In this work, two SSMs based on the same training data set of scoliotic vertebrae, but on different registration procedures, were compared. The first model was constructed from the original binary masks without applying any image pre- or post-processing; the second was obtained by applying a feature-preserving smoothing method to the original training data set, followed by a standard rasterization algorithm. The accuracy of the correspondences was assessed quantitatively by means of the maximum of the mean minimum distance (MMMD) and the Hausdorff distance (HD). The anatomical validity of the models was quantified by means of three different criteria: compactness, specificity, and model generalization ability. The objective of this study was to compare quasi-identical models based on standard metrics. Preliminary results suggest that the MMMD distance and the eigenvalues are not sensitive metrics for evaluating the performance and robustness of SSMs.
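The Hausdorff distance used above measures the worst-case disagreement between two correspondence point sets: the largest distance from any point of one set to its nearest neighbour in the other, symmetrized. A minimal 2-D sketch (the study's data are 3-D vertebra surfaces; the brute-force search is for illustration only):

```python
def hausdorff(A, B):
    """Symmetric Hausdorff distance between two point sets A and B,
    each a list of (x, y) tuples."""
    def dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

    def directed(X, Y):
        # worst over X of the distance to the nearest point of Y
        return max(min(dist(p, q) for q in Y) for p in X)

    return max(directed(A, B), directed(B, A))
```

The mean minimum distance replaces the outer `max` in `directed` with a mean, which is exactly why it is less sensitive to isolated registration outliers than HD.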

Relevance:

60.00%

Publisher:

Abstract:

We present a new approach for corpus-based speech enhancement that significantly improves over a method published by Xiao and Nickel in 2010. Corpus-based enhancement systems do not merely filter an incoming noisy signal, but resynthesize its speech content via an inventory of pre-recorded clean signals. The goal of the procedure is to perceptually improve the sound of speech signals in background noise. The proposed new method modifies Xiao's method in four significant ways. Firstly, it employs a Gaussian mixture model (GMM) instead of a vector quantizer in the phoneme recognition front-end. Secondly, the state decoding of the recognition stage is supported with an uncertainty modeling technique. With the GMM and the uncertainty modeling it is possible to eliminate the need for noise dependent system training. Thirdly, the post-processing of the original method via sinusoidal modeling is replaced with a powerful cepstral smoothing operation. And lastly, due to the improvements of these modifications, it is possible to extend the operational bandwidth of the procedure from 4 kHz to 8 kHz. The performance of the proposed method was evaluated across different noise types and different signal-to-noise ratios. The new method was able to significantly outperform traditional methods, including the one by Xiao and Nickel, in terms of PESQ scores and other objective quality measures. Results of subjective CMOS tests over a smaller set of test samples support our claims.
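The GMM front-end described above assigns each noisy feature frame a posterior probability over phoneme classes instead of a hard vector-quantizer label. A toy scalar sketch of that posterior computation (one-dimensional features and made-up component parameters; the actual system uses multivariate cepstral features and uncertainty-weighted decoding):

```python
import math

def gmm_posteriors(x, weights, means, variances):
    """Posterior probability of each Gaussian mixture component for a
    scalar feature value x (toy stand-in for the phoneme front-end)."""
    lik = [w * math.exp(-(x - m) ** 2 / (2.0 * v)) / math.sqrt(2.0 * math.pi * v)
           for w, m, v in zip(weights, means, variances)]
    total = sum(lik)
    return [l / total for l in lik]
```

Soft posteriors of this kind are what make it possible to propagate observation uncertainty through the state decoding, removing the need for noise-dependent training.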

Relevance:

60.00%

Publisher:

Abstract:

Forward-looking ground penetrating radar shows promise for detection of improvised explosive devices in active war zones. Because of certain insurmountable physical limitations, post-processing algorithm development is the most popular research topic in this field. One such investigative avenue explores the worthiness of frequency analysis during data post-processing. Using the finite difference time domain numerical method, simulations are run to test both mine and clutter frequency response. Mines are found to respond strongest at low frequencies and cause periodic changes in ground penetrating radar frequency results. These results are called into question, however, when clutter, a phenomenon generally known to be random, is also found to cause periodic frequency effects. Possible causes, including simulation inaccuracy, are considered. Although the clutter models used are found to be inadequately random, specular reflections of differing periodicity are found to return from both the mine and the ground. The presence of these specular reflections offers a potential alternative method of determining a mine’s presence.
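The frequency analysis described above amounts to transforming each time-domain radar trace and inspecting where the energy concentrates. A minimal naive-DFT sketch of that step (illustrative; FDTD post-processing would normally use an FFT on much longer traces):

```python
import cmath
import math

def dft_magnitude(signal):
    """Naive DFT magnitude spectrum of a real time-domain trace, up to the
    Nyquist bin -- the kind of spectrum inspected for low-frequency mine
    responses versus clutter."""
    N = len(signal)
    return [abs(sum(x * cmath.exp(-2j * math.pi * k * n / N)
                    for n, x in enumerate(signal)))
            for k in range(N // 2 + 1)]
```

Periodic changes in the spectrum across adjacent traces, as reported for both mines and clutter, would show up as a regularly moving peak in these magnitude bins.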

Relevance:

60.00%

Publisher:

Abstract:

Ultrasmall superparamagnetic iron oxide (USPIO) particles are promising contrast media, especially for molecular and cellular imaging as well as lymph node staging, owing to their superior NMR efficacy, macrophage uptake and lymphotropic properties. The goal of the present prospective clinical work was to validate quantification of signal decrease on high-resolution T(2)-weighted MR sequences before and 24-36 h after USPIO administration for accurate differentiation between benign and malignant normal-sized pelvic lymph nodes. Fifty-eight patients with bladder or prostate cancer were examined on a 3 T MR unit, and their respective lymph node signal intensities (SI), signal-to-noise (SNR) and contrast-to-noise (CNR) ratios were determined on pre- and post-contrast 3D T(2)-weighted turbo spin echo (TSE) images. Based on histology and/or localization, the USPIO-uptake-related SI/SNR decrease of benign vs malignant and pelvic vs inguinal lymph nodes was compared. Out of 2182 resected lymph nodes, 366 were selected for MRI post-processing. Benign pelvic lymph nodes showed a significantly higher SI/SNR decrease compared with malignant nodes (p < 0.0001). Inguinal lymph nodes, in comparison to pelvic lymph nodes, presented a reduced SI/SNR decrease (p < 0.0001). CNR did not differ significantly between benign and malignant lymph nodes. The receiver operating characteristic analysis yielded an area under the curve of 0.96, and the point of optimal accuracy was found at a threshold value of 13.5% SNR decrease. Overlap of SI and SNR changes between benign and malignant lymph nodes was attributed to partial voluming, lipomatosis, histiocytosis or focal lymphoreticular hyperplasia. USPIO-enhanced MRI improves the diagnostic ability of lymph node staging in normal-sized lymph nodes, although some overlap of SI/SNR changes remained. Quantification of USPIO-dependent SNR decrease will enable the validation of this promising technique with the final goal of improving and individualizing patient care.
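The quantification step reduces to a relative SNR drop and a comparison against the ROC-derived cutoff of 13.5% reported above. A minimal sketch (the function names and the example SNR values are illustrative, not from the study):

```python
def snr_decrease_percent(snr_pre, snr_post):
    """Relative SNR drop after USPIO administration, in percent."""
    return 100.0 * (snr_pre - snr_post) / snr_pre

def classify_node(snr_pre, snr_post, threshold=13.5):
    """Benign nodes take up USPIO and lose signal; a drop above the
    ROC-derived threshold (13.5% in the study) suggests benignity."""
    drop = snr_decrease_percent(snr_pre, snr_post)
    return "benign" if drop > threshold else "suspicious for malignancy"
```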

Relevance:

60.00%

Publisher:

Abstract:

PURPOSE: To prospectively evaluate whether intravenous morphine co-medication improves bile duct visualization of dual-energy CT-cholangiography. MATERIALS AND METHODS: Forty potential donors for living-related liver transplantation underwent CT-cholangiography with infusion of a hepatobiliary contrast agent over 40 min. Twenty minutes after the beginning of the contrast agent infusion, either normal saline (n=20 patients; control group [CG]) or morphine sulfate (n=20 patients; morphine group [MG]) was injected. Forty-five minutes after initiation of the contrast agent, a dual-energy CT acquisition of the liver was performed. Applying dual-energy post-processing, pure iodine images were generated. Primary study goals were determination of bile duct diameters and visualization scores (on a scale of 0 to 3: 0, not visualized; 3, excellent visualization). RESULTS: Bile duct visualization scores for second-order and third-order branch ducts were significantly higher in the MG compared to the CG (2.9 ± 0.1 versus 2.6 ± 0.2 [P<0.001] and 2.7 ± 0.3 versus 2.1 ± 0.6 [P<0.01], respectively). Bile duct diameters for the common duct and main ducts were significantly higher in the MG compared to the CG (5.9 ± 1.3 mm versus 4.9 ± 1.3 mm [P<0.05] and 3.7 ± 1.3 mm versus 2.6 ± 0.5 mm [P<0.01], respectively). CONCLUSION: Intravenous morphine co-medication significantly improved biliary visualization on dual-energy CT-cholangiography in potential donors for living-related liver transplantation.

Relevance:

60.00%

Publisher:

Abstract:

The aim of this prospective trial was to evaluate the sensitivity and specificity of bright-lumen magnetic resonance colonography (MRC) in comparison with conventional colonoscopy (CC). A total of 120 consecutive patients with clinical indications for CC were prospectively examined using MRC (1.5 Tesla), which was then followed by CC. Prior to MRC, the cleansed colon was filled with a gadolinium-water solution. A 3D GRE sequence was performed with the patient in the prone and supine position, each acquired during one breath-hold period. After division of the colon into five segments, interactive data analysis was carried out using three-dimensional post-processing, including a virtual intraluminal view. The results of CC served as the reference standard. MRC was performed successfully in all patients, and no complications occurred. Image quality was diagnostic in 92% (574/620 colonic segments). On a per-patient basis, the results of MRC were as follows: sensitivity 84% (95% CI 71.7-92.3%), specificity 97% (95% CI 89.0-99.6%). Five flat adenomas and 6/16 small polyps (≤5 mm) were not identified by MRC. MRC offers high sensitivity and excellent specificity rates in patients with clinical indications for CC. Improved MRC techniques are needed to detect small polyps and flat adenomas.
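Per-patient sensitivity and specificity of the kind reported above follow directly from the confusion counts against the reference standard. A minimal sketch (the counts in the example are hypothetical numbers chosen to reproduce the reported 84%/97% rates, not the trial's actual tally):

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN), specificity = TN/(TN+FP),
    with colonoscopy findings as the reference standard."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity
```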