816 results for Pollard's rho algorithm


Relevance:

20.00%

Publisher:

Abstract:

After genotoxic stress, SAPK/JNK regulate a variety of target substrates that are important for repair and survival of the cell, and they thereby influence the cell's fate. Whether DNA damage leads to phosphorylation of stress kinases has so far been little investigated. Using repair-deficient cells, the influence of DNA damage induced by cisplatin/transplatin/UV-C on SAPK/JNK activation was examined. Activation of the stress kinases was agent-specific and depended on different repair factors. In repair-deficient cells, the activation correlated in part with the late appearance of DNA strand breaks, but was independent of elevated initial DNA damage. These findings showed that the late activation of SAPK/JNK proceeds in a DNA-damage-dependent manner and that cisplatin and transplatin, when used at equitoxic doses, led to comparable SAPK/JNK activation. Inhibition of Rho GTPases, both by statins and by Clostridium difficile toxin B, further indicated that Rho GTPases may mediate the late DNA-damage-dependent activation of the stress kinases. Inhibition of Rho GTPases by physiologically relevant concentrations of statins protected primary human endothelial cells (HUVECs) against ionising radiation (IR) and doxorubicin. In both cases, statin treatment achieved inhibition of the pro-apoptotic transcription factor p53 as well as of Chk1, which regulates cell-cycle arrest. Effector caspases were not affected by the HMG-CoA reductase inhibitor. Only in the statin-mediated protection against doxorubicin was there a reduction of initial DNA damage in the form of DNA strand breaks; IR-induced DNA strand breaks, by contrast, remained unaffected by statin incubation.
Given their protective properties against IR- and doxorubicin-induced cytotoxicity in endothelial cells and their pro-apoptotic effect on tumour cells, statins might favourably influence the adverse side effects of cytostatic drugs and radiotherapy.

Relevance:

20.00%

Publisher:

Abstract:

Complex network analysis is a very popular topic in computer science. Unfortunately these networks, extracted from different contexts, are usually very large, and their analysis may be complicated: computing metrics on these structures can be very expensive. Among all metrics, we analyse the extraction of subnetworks called communities: groups of nodes that probably play the same role within the whole structure. Community extraction is an interesting operation in many different fields (biology, economics, ...). In this work we present a parallel community detection algorithm that can operate on networks with huge numbers of nodes and edges. After an introduction to graph theory and high-performance computing, we explain our design strategies and our implementation. We then show a performance evaluation carried out on a distributed-memory architecture, i.e. the IBM BlueGene/Q supercomputer "Fermi" at the CINECA supercomputing centre, Italy, and comment on our results.
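The abstract does not detail the algorithm itself; as a point of reference, one common (serial) community detection scheme is label propagation, sketched below on a toy graph. This is an illustrative assumption, not the thesis's parallel algorithm.

```python
import random
from collections import Counter

def label_propagation(adj, max_iter=100, seed=0):
    """Communities emerge as each node repeatedly adopts the label
    most common among its neighbours (serial sketch)."""
    rng = random.Random(seed)
    labels = {v: v for v in adj}          # start: every node is its own community
    nodes = list(adj)
    for _ in range(max_iter):
        rng.shuffle(nodes)                # random order avoids systematic update bias
        changed = False
        for v in nodes:
            if not adj[v]:
                continue
            counts = Counter(labels[u] for u in adj[v])
            top = max(counts.values())
            new = rng.choice([l for l, c in counts.items() if c == top])
            if new != labels[v]:
                labels[v], changed = new, True
        if not changed:                   # stable labelling reached
            break
    return labels

# Toy graph: two triangles joined by one edge.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3],
       3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
labels = label_propagation(adj)
```

A parallel version, as in the thesis, would partition the node set across processes and exchange boundary labels each sweep.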

Relevance:

20.00%

Publisher:

Abstract:

This thesis presents different techniques designed to drive a swarm of robots in an a-priori unknown environment in order to move the group from a starting area to a final one while avoiding obstacles. The presented techniques are based on two different theories, used alone or in combination: Swarm Intelligence (SI) and Graph Theory. Both theories are based on the study of interactions between different entities (also called agents or units) in Multi-Agent Systems (MAS): the first belongs to the Artificial Intelligence context and the second to the Distributed Systems context. These theories, each from its own point of view, exploit the emergent behaviour that comes from the interactive work of the entities in order to achieve a common goal. The flexibility and adaptability of the swarm have been exploited with the aim of overcoming and minimizing difficulties and problems that can affect one or more units of the group, with minimal impact on the whole group and on the common main target. Another aim of this work is to show the importance of the information shared between the units of the group, such as the communication topology, because it helps keep the environmental information detected by each single agent updated across the swarm. Swarm Intelligence has been applied through the Particle Swarm Optimization (PSO) algorithm, taking advantage of its features as a navigation system. Graph Theory has been applied by exploiting Consensus and the agreement protocol, with the aim of maintaining the units in a desired and controlled formation. This approach has been followed in order to preserve the power of PSO while controlling part of its random behaviour with a distributed control algorithm like Consensus.
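The PSO component can be illustrated with a generic, minimal PSO minimising a toy objective. The coefficients and the sphere objective below are standard textbook choices, not the thesis's navigation settings.

```python
import random

def pso(objective, dim=2, n_particles=20, iters=200, seed=1):
    """Minimal Particle Swarm Optimization: each particle is pulled toward
    its personal best position and the swarm's global best position."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and cognitive/social weights (textbook values)
    pos = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:            # update personal best
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:           # update global best
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy objective: the sphere function, minimum 0 at the origin.
best, best_val = pso(lambda p: sum(x * x for x in p))
```

In a navigation setting the "objective" would instead score candidate positions by distance to the goal and obstacle proximity.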

Relevance:

20.00%

Publisher:

Abstract:

The aim of my thesis is to parallelize the Weighted Histogram Analysis Method (WHAM), a popular algorithm used to calculate the free energy of a molecular system in Molecular Dynamics simulations. WHAM works in post-processing in cooperation with another algorithm called Umbrella Sampling. Umbrella Sampling adds a bias to the potential energy of the system in order to force the system to sample a specific region of configurational space. N independent simulations are performed in order to sample the whole region of interest; subsequently, the WHAM algorithm is used to estimate the original system energy starting from the N atomic trajectories. The parallelization of WHAM has been performed with CUDA, a language for programming the GPUs of NVIDIA graphics cards, which have a parallel architecture. The parallel implementation can considerably speed up WHAM execution compared to previous serial CPU implementations; the WHAM CPU code, however, presents some timing criticalities at very high numbers of interactions. The algorithm has been written in C++ and executed on UNIX systems equipped with NVIDIA graphics cards. The results were satisfying, showing a performance increase when the model was executed on graphics cards with higher compute capability. Nonetheless, the GPUs used to test the algorithm are quite old and not designed for scientific computation. It is likely that a further performance increase would be obtained if the algorithm were executed on GPU clusters with a high level of computational efficiency. The thesis is organized as follows: I first describe the mathematical formulation of Umbrella Sampling and the WHAM algorithm with their applications in the study of ionic channels and in molecular docking (Chapter 1); then I present the CUDA architectures used to implement the model (Chapter 2); finally, the results obtained on model systems are presented (Chapter 3).
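The WHAM estimator combines the N biased histograms by a self-consistent iteration. Below is an illustrative serial Python sketch of the standard WHAM equations, not the thesis's CUDA implementation; the histogram data are made up.

```python
import math

def wham(hists, biases, n_samples, kT=1.0, tol=1e-8, max_iter=5000):
    """Serial WHAM: combine histograms from N biased umbrella-sampling
    windows into one unbiased distribution by self-consistent iteration.

    hists[i][b]   -- counts of window i in bin b
    biases[i][b]  -- bias potential w_i evaluated at bin b
    n_samples[i]  -- total number of samples in window i
    """
    n_win, n_bins = len(hists), len(hists[0])
    f = [0.0] * n_win                      # per-window free-energy shifts
    for _ in range(max_iter):
        # Unbiased probability estimate given the current shifts f_i.
        p = [sum(h[b] for h in hists)
             / sum(n_samples[i] * math.exp((f[i] - biases[i][b]) / kT)
                   for i in range(n_win))
             for b in range(n_bins)]
        # Update the shifts from the new estimate.
        f_new = [-kT * math.log(sum(p[b] * math.exp(-biases[i][b] / kT)
                                    for b in range(n_bins)))
                 for i in range(n_win)]
        f_new = [x - f_new[0] for x in f_new]   # fix the arbitrary offset
        converged = max(abs(a - b) for a, b in zip(f_new, f)) < tol
        f = f_new
        if converged:
            break
    z = sum(p)
    return [x / z for x in p], f

# Made-up data: two overlapping windows over four bins.
hists = [[30, 50, 15, 5], [5, 15, 50, 30]]
biases = [[0.0, 0.1, 0.4, 0.9], [0.9, 0.4, 0.1, 0.0]]
prob, f = wham(hists, biases, n_samples=[100, 100])
```

The per-bin sums here are exactly the independent work a GPU implementation would distribute across threads.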

Relevance:

20.00%

Publisher:

Abstract:

Due to its practical importance and inherent complexity, the optimisation of distribution networks for supplying drinking water has been the subject of extensive study for the past 30 years. The optimisation concerns sizing the pipes in the water distribution network (WDN) and/or optimising specific parts of the network such as pumps and tanks, or analysing and optimising the reliability of a WDN. In this thesis, the author has analysed two different WDNs (the Anytown and Cabrera city networks), trying to solve a multi-objective optimisation problem (MOOP). The two main objectives in both cases were the minimisation of energy cost (EUR) or energy consumption (kWh), along with the total number of pump switches (TNps) during a day. For this purpose, a decision support system generator for multi-objective optimisation was used: GANetXL, developed by the Centre for Water Systems at the University of Exeter. GANetXL works by calling the EPANET hydraulic solver each time a hydraulic analysis is required. The main algorithm used was NSGA-II, a second-generation multi-objective optimisation algorithm, which gave us the Pareto front of each configuration. The first experiment carried out concerned the network of Anytown city. It is a big network with a pumping station of four fixed-speed parallel pumps that boost the flow. The main intervention was to replace these pumps with new variable-speed-driven pumps (VSDPs) by installing inverters capable of varying their speed during the day. Great energy and cost savings were thus achieved, along with a minimisation of the number of pump switches. The results of the research are thoroughly illustrated in Chapter 7, with comments and a variety of graphs and configurations. The second experiment concerned the network of Cabrera city, a smaller WDN with a single fixed-speed (FS) pump in the system.
The problem was the same as far as the optimisation process was concerned: the minimisation of energy consumption and, in parallel, the minimisation of TNps. The same optimisation tool (GANetXL) was used. The main scope was to carry out several different experiments covering a vast variety of configurations, using different pumps (this time keeping the FS mode), different tank levels, different pipe diameters and different emitter coefficients. All these different modes produced a large number of results, which are compared in Chapter 8. Concluding, it should be said that the optimisation of WDNs is a very interesting field with a vast space of options: a large number of algorithms to choose from, different techniques and configurations, and different decision support system generators. The researcher has to be ready to "roam" among these choices until a satisfactory result shows that a good optimisation point has been reached.
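The Pareto fronts produced by NSGA-II rest on the notion of dominance between objective vectors. A minimal sketch of dominance and front extraction for the two objectives above; the (energy cost, pump switches) values are hypothetical, not taken from the thesis.

```python
def dominates(a, b):
    """a dominates b (minimisation): no worse in every objective and
    strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical (energy cost in EUR/day, pump switches per day) trade-offs.
solutions = [(120.0, 6), (100.0, 9), (150.0, 3), (130.0, 7), (95.0, 12)]
front = pareto_front(solutions)
```

Here (130.0, 7) is dominated by (120.0, 6), which is cheaper and switches less; the remaining four solutions form the front because each trades one objective against the other.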

Relevance:

20.00%

Publisher:

Abstract:

The aim of this thesis work is the characterisation of an optical sensor for haematocrit reading and the development of the device's calibration algorithm. In other words, using data obtained from a suitably planned calibration session, the developed algorithm returns the interpolation curve of the data that characterises the transducer. The main steps of the thesis work are summarised in the following points: 1) Planning of the calibration session needed for data collection and subsequent construction of a black-box model. Output: the optical-sensor reading (expressed in mV). Input: the haematocrit value expressed in percentage points (this quantity represents the true blood-volume value and was obtained with a blood-centrifugation device). 2) Development of the algorithm. The algorithm, developed and used offline, returns the regression curve of the data. Macroscopically, the code can be divided into two main parts: 1. acquisition of the data coming from the sensor and of the operating state of the two-phase pump; 2. normalisation of the acquired data with respect to the sensor's reference value and implementation of the regression algorithm. The data-normalisation step is a fundamental statistical tool for comparing non-uniform quantities. Existing studies moreover show a morphological change of the red blood cell in response to mechanical stress. A further aspect addressed in this work concerns the blood flow rate determined by the pump and how this quantity can influence the haematocrit reading.
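The calibration step described in point 2 can be sketched as a least-squares fit on normalised readings. The numbers below are a hypothetical calibration session, and a straight-line model is assumed purely for illustration (the thesis does not specify the regression order here).

```python
def fit_linear_calibration(readings_mv, hct_percent, ref_mv):
    """Least-squares linear calibration: normalise the optical readings to
    the sensor's reference value, then fit hct = a * x + b."""
    xs = [r / ref_mv for r in readings_mv]      # normalisation step
    n = len(xs)
    sx, sy = sum(xs), sum(hct_percent)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, hct_percent))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Hypothetical session: sensor readings (mV) vs centrifuge haematocrit (%).
readings = [820.0, 760.0, 700.0, 640.0, 580.0]
hct_true = [20.0, 25.0, 30.0, 35.0, 40.0]
a, b = fit_linear_calibration(readings, hct_true, ref_mv=1000.0)
```

Once a and b are fitted, a new normalised reading x maps to an estimated haematocrit a * x + b.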

Relevance:

20.00%

Publisher:

Abstract:

This study analysed and followed the environmental monitoring related to the project of upgrading the SP46 Rho-Monza road to motorway standards. More precisely, the requalification project involves the construction of new infrastructure as well as the modernisation of the existing one. The study covered the works carried out and the technologies used for environmental monitoring in the municipalities of Milano and Paderno Dugnano: destructive core drilling, the installation of open-pipe piezometers in the boreholes, and chemical-physical measurements on the aquifer intercepted by the piezometers. This made it possible to reconstruct the water-table level of the aquifer directly affected by the SP46 requalification project and intercepted by the piezometers. Secondly, in addition to reconstructing the depth of the water table, the anthropic impact on the groundwater quality of the aquifer was assessed, thanks to the determination of chemical-physical parameters and to laboratory analyses of collected water samples.

Relevance:

20.00%

Publisher:

Abstract:

This thesis work was carried out following the geognostic investigations planned for an engineering project aimed at upgrading the SP46 Rho-Monza road to motorway standards, together with the pre-existing A8/A52 motorway system; the area affected by the works is located in the northern part of the municipality of Milano. The study aimed at evaluating, through specific methodologies and technologies, the hydrodynamic characteristics of the groundwater present in the work area. Following measurements of the piezometric level of the aquifer, taken after the installation of 8 piezometers, a map of the equipotential contours and flow gradients was produced (with the aid of the Surfer 8.0® software, Golden Software Inc.) by spatial interpolation of the measurements with the Kriging method. The resulting reconstruction of the aquifer's configuration provided useful indications for the subsequent design choices.
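Spatial interpolation of piezometric heads can be illustrated with a much simpler deterministic scheme, inverse-distance weighting, standing in here for the Kriging method actually used via Surfer; the coordinates and heads below are hypothetical.

```python
def idw(points, query, power=2.0):
    """Inverse-distance-weighted interpolation of piezometric heads.
    points: list of (x, y, head); query: (x, y) location.
    (A simpler stand-in for the Kriging interpolation used in the thesis.)"""
    num = den = 0.0
    for x, y, h in points:
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        if d2 == 0.0:
            return h                      # query coincides with a piezometer
        w = 1.0 / d2 ** (power / 2.0)
        num += w * h
        den += w
    return num / den

# Hypothetical heads (m a.s.l.) at piezometer locations (coordinates in m).
piezometers = [(0, 0, 130.0), (100, 0, 129.5), (0, 100, 128.8), (100, 100, 128.2)]
head = idw(piezometers, (50, 50))
```

Unlike Kriging, this scheme ignores spatial correlation structure (the variogram) and gives no estimation variance, but the contour-mapping workflow is the same.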

Relevance:

20.00%

Publisher:

Abstract:

This thesis work was carried out at the Medical Physics service of the Policlinico Sant'Orsola-Malpighi in Bologna. The study focused on the comparison between standard reconstruction techniques (Filtered Back Projection, FBP) and iterative ones in Computed Tomography. The work was divided into two parts: in the first, the quality of images acquired with a multislice CT scanner (iCT 128, Philips system) was analysed using both the FBP algorithm and the iterative one (in our case iDose4). To evaluate image quality, the following parameters were analysed: the Noise Power Spectrum (NPS), the Modulation Transfer Function (MTF), and the contrast-to-noise ratio (CNR). The first two quantities were studied by performing measurements on a phantom supplied by the manufacturer, which simulated the body and head parts with two cylinders of 32 and 20 cm respectively. The measurements confirm the noise reduction, but to a different extent for the different convolution filters used. The MTF study instead revealed that using standard or iterative techniques does not change the spatial resolution: the curves obtained are practically identical (apart from the intrinsic differences of the convolution filters), contrary to what the manufacturer claims. For the CNR analysis two phantoms were used: the first, the Catphan 600, is the phantom used to characterise CT systems; the second, the CIRS 061, contains inserts that simulate the presence of lesions with densities typical of the abdominal district. The study showed that, for both phantoms, the contrast-to-noise ratio increases when the iterative reconstruction technique is used.
The second part of the thesis work was to evaluate the dose reduction by considering different protocols used in clinical practice: a large number of examinations were analysed, and the mean CTDI and DLP values were calculated on a sample of examinations with FBP and with iDose4. The results show that the values obtained with the iterative algorithm are below the national diagnostic reference levels (DRLs) and below those obtained without iterative systems.
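The contrast-to-noise ratio used in the phantom study has a simple definition that can be sketched directly; the HU samples below are invented for illustration, and the ROI-based formula shown is one common convention (exact definitions vary between studies).

```python
import statistics

def cnr(roi_lesion, roi_background):
    """Contrast-to-noise ratio between a lesion ROI and a background ROI:
    CNR = |mean_lesion - mean_background| / sd_background."""
    mu_l = statistics.mean(roi_lesion)
    mu_b = statistics.mean(roi_background)
    sd_b = statistics.pstdev(roi_background)   # noise estimate from the background
    return abs(mu_l - mu_b) / sd_b

# Invented HU samples from a phantom insert and the surrounding background.
lesion = [55, 58, 54, 57, 56]
background = [40, 44, 38, 42, 41]
value = cnr(lesion, background)
```

Iterative reconstruction raises CNR mainly by shrinking the denominator: the contrast (numerator) is preserved while the background noise is reduced.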

Relevance:

20.00%

Publisher:

Abstract:

A novel computerized algorithm for hip joint motion simulation and collision detection, called the Equidistant Method, has been developed. It was compared with three pre-existing methods that differ in how they define the hip joint center and in their behavior after collision detection. It was proposed that the Equidistant Method would be the most accurate in detecting the location and extent of femoroacetabular impingement.

Relevance:

20.00%

Publisher:

Abstract:

Residual acetabular dysplasia of the hip in most patients can be corrected by periacetabular osteotomy. However, some patients have intraarticular abnormalities causing insufficient coverage, containment or congruency after periacetabular osteotomy, or extraarticular abnormalities that limit either acetabular correction or hip motion. For these patients, we believe an additional proximal femoral osteotomy can improve coverage, containment, congruency and/or motion.

Relevance:

20.00%

Publisher:

Abstract:

The objective of this study was to assess a pharmacokinetic algorithm to predict ketamine plasma concentration and drive a target-controlled infusion (TCI) in ponies. Firstly, the algorithm was used to simulate the course of ketamine enantiomer plasma concentrations after the administration of an intravenous bolus in six ponies, based on individual pharmacokinetic parameters obtained from a previous experiment. Using the same pharmacokinetic parameters, a TCI of S-ketamine was then performed over 120 min to maintain a plasma concentration of 1 microg/mL. The actual plasma concentrations of S-ketamine were measured from arterial samples using capillary electrophoresis. The performance of the simulation for the administration of a single bolus was very good. During the TCI, the S-ketamine plasma concentrations were maintained within the limits of acceptance (wobble and divergence <20%) at a median of 79% (IQR, 71-90) of the peak concentration reached after the initial bolus. However, in three ponies the steady-state concentrations were significantly higher than targeted. It is hypothesized that an inaccurate estimation of the volume of the central compartment is partly responsible for this difference. The algorithm allowed good predictions for the single bolus administration and an appropriate maintenance of constant plasma concentrations.
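The kind of pharmacokinetic prediction behind such a TCI can be illustrated with the simplest possible model: a one-compartment i.v. bolus. The study uses individual multi-compartment parameters, so this sketch and its numbers are purely illustrative.

```python
import math

def plasma_concentration(dose_ug, vd_ml, half_life_min, t_min):
    """One-compartment i.v. bolus model: C(t) = (dose / Vd) * exp(-k * t),
    with elimination rate k = ln 2 / t_half. Purely illustrative -- the
    study uses individual multi-compartment parameters."""
    k = math.log(2.0) / half_life_min
    return dose_ug / vd_ml * math.exp(-k * t_min)

# Hypothetical pony: 50 mg bolus, Vd = 25 L, half-life 40 min.
c0 = plasma_concentration(50_000.0, 25_000.0, 40.0, 0.0)    # ug/mL at t = 0
c40 = plasma_concentration(50_000.0, 25_000.0, 40.0, 40.0)  # one half-life later
```

Note how an underestimated central-compartment volume Vd inflates the predicted initial concentration, the error mechanism hypothesized in the abstract.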

Relevance:

20.00%

Publisher:

Abstract:

To assess the diagnostic accuracy, image quality, and radiation dose of an iterative reconstruction algorithm compared with a filtered back projection (FBP) algorithm for abdominal computed tomography (CT) at different tube voltages.