857 results for: complex network analysis; time-varying graph mining (TVG); slow-wave sleep (SWS); fault tolerance
Abstract:
3D video-fluoroscopy is an accurate but cumbersome technique for estimating natural or prosthetic human joint kinematics. This dissertation proposes innovative methodologies to improve the reliability and usability of 3D fluoroscopic analysis. Being based on direct radiographic imaging of the joint, and avoiding the soft tissue artefact that limits the accuracy of skin-marker-based techniques, fluoroscopic analysis has a potential accuracy of the order of mm/deg or better. It can provide fundamental information for clinical and methodological applications but, notwithstanding the number of methodological protocols proposed in the literature, time-consuming user interaction is still required to obtain consistent results. This user-dependency has prevented a reliable quantification of the actual accuracy and precision of the methods and, consequently, slowed down their translation to clinical practice. The objective of the present work was to speed up this process by introducing methodological improvements in the analysis. In the thesis, fluoroscopic analysis was characterized in depth, in order to evaluate its pros and cons and to provide reliable solutions to overcome its limitations. To this aim, an analytical approach was followed. The major sources of error were isolated with preliminary in-silico studies as: (a) geometric distortion and calibration errors, (b) 2D image and 3D model resolution, (c) incorrect contour extraction, (d) bone model symmetries, (e) optimization algorithm limitations, (f) user errors. The effect of each criticality was quantified and verified with a preliminary in-vivo study on the elbow joint. The dominant source of error was identified as the limited extent of the convergence domain of the local optimization algorithms, which forced the user to manually specify the starting pose for the estimation process.
To solve this problem, two different approaches were followed: to enlarge the convergence basin of the optimal pose, the local approach used sequential alignment of the 6 degrees of freedom in order of sensitivity, or a geometrical feature-based estimation of the initial conditions for the optimization; the global approach used an unsupervised memetic algorithm to optimally explore the search domain. The performance of the technique was evaluated with a series of in-silico studies and validated in-vitro through a phantom-based comparison with a radiostereometric gold standard. The accuracy of the method is joint-dependent; for the intact knee joint, the new unsupervised algorithm guaranteed a maximum error lower than 0.5 mm for in-plane translations, 10 mm for the out-of-plane translation, and 3 deg for rotations in a mono-planar setup, and lower than 0.5 mm for translations and 1 deg for rotations in a bi-planar setup. The bi-planar setup is best suited when accurate results are needed, such as in methodological research studies, while mono-planar analysis may suffice for clinical applications in which analysis time and cost are an issue. A further reduction of user interaction was obtained for prosthetic joint kinematics: a mixed region-growing and level-set segmentation method was proposed that halved the analysis time by delegating the computational burden to the machine. In-silico and in-vivo studies demonstrated that the reliability of the new semiautomatic method was comparable to a user-defined manual gold standard. The improved fluoroscopic analysis was finally applied to a first in-vivo methodological study of foot kinematics. Preliminary evaluations showed that the presented methodology represents a feasible gold standard for the validation of skin-marker-based foot kinematics protocols.
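The two-stage strategy described above, a global exploration followed by sequential single-degree-of-freedom refinement in order of sensitivity, can be sketched in miniature. The cost function, the three-parameter pose, the step schedule, and the restart count below are illustrative stand-ins, not the thesis's actual image-similarity metric or memetic algorithm:

```python
import random

def cost(pose, target=(1.0, -2.0, 0.5)):
    # Toy substitute for an image-similarity metric: distance to a known pose.
    return sum((p - t) ** 2 for p, t in zip(pose, target))

def coordinate_refine(pose, order, step=1.0, shrink=0.5, levels=20):
    """Local stage: align one degree of freedom at a time, most sensitive first,
    halving the step size once no move at the current scale improves the cost."""
    pose = list(pose)
    s = step
    for _ in range(levels):
        improved = True
        while improved:
            improved = False
            for i in order:          # 'order' would come from a sensitivity analysis
                for delta in (+s, -s):
                    trial = list(pose)
                    trial[i] += delta
                    if cost(trial) < cost(pose):
                        pose, improved = trial, True
        s *= shrink
    return pose

def multistart(n_starts=20, seed=0):
    """Global stage: random restarts widen the effective convergence basin."""
    rng = random.Random(seed)
    best = None
    for _ in range(n_starts):
        start = [rng.uniform(-5, 5) for _ in range(3)]
        candidate = coordinate_refine(start, order=[0, 1, 2])
        if best is None or cost(candidate) < cost(best):
            best = candidate
    return best
```

A memetic algorithm would replace the independent restarts with a population that recombines candidate poses; the sequential refinement step plays the same role as above.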
Abstract:
This doctoral thesis studies blood flow using a finite element code (COMSOL Multiphysics). The artery contains either a Doppler catheter (concentric or eccentric with respect to the axis of symmetry) or a stenosis of varying shape and extent. The arteries are rigid, elastic, or hyperelastic cylindrical solids, with diameters of 6 mm, 5 mm, 4 mm and 2 mm. The blood flow is laminar, both steady and transient, and blood is modelled as a non-Newtonian Casson fluid, modified according to the formulation of Gonzales & Moraga. The numerical analyses are carried out in three-dimensional and two-dimensional domains; in the latter case the fluid-structure interaction is analysed. In the three-dimensional cases, the arteries (fluid-dynamic simulations) are infinitely rigid: once the pressure field is obtained, a structural analysis follows to determine the changes in cross-section and the persistence of the disturbance on the flow. In the three-dimensional cases with a catheter, the blood flow rate is prescribed by three values (maximum, minimum and mean), while for the 2D cases and the three-dimensional cases with stenotic arteries a pressure law reproduces the blood pulse. The mesh is triangular (2D) or tetrahedral (3D), refined at the wall and downstream of the obstacle to capture recirculations. Two appendices accompany the thesis, studying with CFD codes heat transfer in microchannels and the evaporation of water droplets in unconfined systems. Fluid dynamics in microchannels is analogous to haemodynamics in capillaries, and the Eulerian-Lagrangian method (evaporation simulations) schematises the mixed nature of blood. The microchannel part analyses the transient following the application of a time-varying heat flux, varying the inlet velocity and the microchannel dimensions.
The investigation of droplet evaporation is a 3D parametric analysis that examines the weight of each parameter (external temperature, initial diameter, relative humidity, initial velocity, diffusion coefficient) to identify the one that most influences the phenomenon.
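The Casson constitutive law mentioned above can be illustrated with a short sketch of the apparent viscosity it implies. The yield-stress and asymptotic-viscosity defaults below are typical literature figures for blood, used here only as illustrative assumptions (the Gonzales & Moraga modification is not reproduced):

```python
import math

def casson_viscosity(shear_rate, tau_y=0.005, mu_inf=0.0035):
    """Apparent viscosity (Pa*s) of a Casson fluid at a given shear rate (1/s).

    Casson model: sqrt(tau) = sqrt(tau_y) + sqrt(mu_inf * shear_rate),
    so eta = tau / shear_rate = (sqrt(tau_y / shear_rate) + sqrt(mu_inf))**2.
    """
    return (math.sqrt(tau_y / shear_rate) + math.sqrt(mu_inf)) ** 2
```

The formula captures the shear-thinning behaviour relevant to haemodynamics: viscosity is high at low shear rates and tends to the Newtonian limit `mu_inf` at high shear rates.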
Abstract:
Development aid involves a complex network of numerous and extremely heterogeneous actors. Nevertheless, all actors seem to speak the same 'development jargon' and to display a congruence that extends from the donor through the professional consultant to the village chief. And although ideas about what counts as 'good' and 'bad' aid have constantly changed over time, with new paradigms and policies sprouting every few years, the apparent congruence between actors remains more or less unchanged. How can this be explained? Is it a strategy of all actors to get into the pocket of the donor, or are the social dynamics in development aid more complex? When a new development paradigm appears, where does it come from and how does it gain support? Is this support really homogeneous? To answer these questions, a multi-sited ethnography was conducted in the sector of water-related development aid, with a focus on three paradigms currently hegemonic in this sector: Integrated Water Resources Management, Capacity Building, and Adaptation to Climate Change. The sites of inquiry were the headquarters of a multilateral organization, the headquarters of a development NGO, and the Inner Niger Delta in Mali. The research shows that paradigm shifts do not happen overnight: new paradigms have long lines of descent, and they require a lot of work from actors in order to become hegemonic; the actors need to create a tight network of support. Each actor, however, interprets the paradigms in a slightly different way, depending on their position in the network. They implant their own interests in their interpretation of the paradigm (the actors 'translate' their interests), regardless of whether they are the donor, a mediator, or the aid recipient. These translations are necessary to cement and reproduce the network.
Abstract:
In this dissertation, homology modelling and molecular dynamics were used to describe the structure and behaviour of proteins in solution. Small-angle X-ray scattering was used to verify the predictions generated with these computational methods. For alpha-haemolysin, a toxin of Staphylococcus aureus that can form a heptameric pore, the monomeric structure of the protein in solution was described for the first time. Homology modelling based on related proteins, whose monomeric structures were known, was used to predict the monomeric structure of the toxin. The flexibility of structural elements in a molecular dynamics simulation could be correlated with the functionality of the protein: intrinsic flexibility enables the protein to carry out the conformational change to the pore after assembly. Small-angle X-ray scattering demonstrated the differences between the monomeric structure and the structures of the related proteins and supports the proposed structure. Moreover, work on a mutant that is arrested in a so-called pre-pore conformation and unable to form a pore showed that this transition state is positioned with its rotation axis perpendicular to the membrane. A geometric analysis proves that it is sterically possible to reach the pore conformation starting from this conformation; an energetic and kinetic analysis of this conformational change is still pending. A further part of the work deals with the conformational changes of haemocyanins, which were followed experimentally by small-angle X-ray scattering. Conformational changes associated with oxygenation were described for the 24-meric haemocyanins of Eurypelma californicum and Pandinus imperator. For a number of haemocyanins it has been shown that they can develop tyrosinase activity under the influence of the agent SDS.
The conformational change of the haemocyanins of E. californicum and P. imperator upon activation to tyrosinase by SDS was confirmed experimentally, and the arrangement of the dodecamers of the haemocyanins was found to be essential for the activation. Together with other work, the relaxation of the structure under the influence of SDS and the steric effect on the linking subunits b & c are therefore considered the likely cause of the activation to tyrosinase. Custom software for so-called rigid-body modelling on the basis of small-angle X-ray scattering data was written in order to structurally interpret the scattering data of the hexameric haemocyanins of Palinurus elephas and Palinurus argus under the influence of the effectors urate and caffeine. The software is the first implementation of a Monte Carlo algorithm for rigid-body modelling. It supports two variants of the algorithm: in combination with simulated annealing, probable conformations can be filtered out and then described geometrically in a subsequent systematic analysis; alternatively, a further, pure Monte Carlo algorithm can describe the conformation as a density distribution.
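A Monte Carlo rigid-body search of the kind described, combined with simulated annealing, can be sketched as follows. The `discrepancy` function is a toy stand-in for the fit of a rigid-body model against measured scattering curves, and the placement parameters, cooling schedule, and step sizes are illustrative assumptions, not those of the actual software:

```python
import math
import random

def anneal(score, start, step=0.5, cooling=0.995, n_steps=3000, seed=1):
    """Metropolis simulated annealing over rigid-body placement parameters."""
    rng = random.Random(seed)
    x = list(start)
    best = list(x)
    t = 1.0
    for _ in range(n_steps):
        sd = step * max(t, 0.02)           # shrink proposal moves as the system cools
        trial = [xi + rng.gauss(0, sd) for xi in x]
        delta = score(trial) - score(x)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x = trial
        if score(x) < score(best):
            best = list(x)
        t *= cooling
    return best

# Toy stand-in for the discrepancy against measured scattering data:
TARGET = [2.0, -1.0, 0.5]                  # hypothetical placement parameters

def discrepancy(params):
    return sum((p - q) ** 2 for p, q in zip(params, TARGET))
```

In the real application the score would compare computed and measured scattering profiles; the annealing loop itself is unchanged.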
Abstract:
The research activity characterizing the present thesis was mainly centered on the design, development and validation of methodologies for the estimation of stationary and time-varying connectivity between different regions of the human brain during specific complex cognitive tasks. This activity involved two main aspects: i) the development of a stable, consistent and reproducible procedure for functional connectivity estimation with high impact on the neuroscience field, and ii) its application to real data from healthy volunteers eliciting specific cognitive processes (attention and memory). In particular, the methodological issues addressed in the present thesis consisted in devising an approach, applicable in the neuroscience field, able to: i) include all the cerebral sources in the connectivity estimation process; ii) accurately describe the temporal evolution of connectivity networks; iii) assess the significance of connectivity patterns; iv) consistently describe relevant properties of brain networks. The advancements provided in this thesis made it possible to derive quantifiable descriptors of cognitive processes during a high-resolution EEG experiment involving subjects performing complex cognitive tasks.
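A minimal illustration of time-varying connectivity is a correlation computed in sliding windows between two channels. Note that this Pearson-based sketch is only a conceptual stand-in: the estimators actually used for EEG connectivity are multivariate spectral measures, and the signals below are synthetic:

```python
import math

def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

def sliding_connectivity(x, y, window, step=1):
    """Connectivity time course: correlation in overlapping windows."""
    return [pearson(x[i:i + window], y[i:i + window])
            for i in range(0, len(x) - window + 1, step)]
```

Applied to two channels whose coupling flips sign halfway through a recording, the output traces the transition from +1 to -1, which is the kind of temporal evolution the thesis's methods are designed to capture.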
Abstract:
The research centres on the architectural work of Emil Steffann (1899-1968), whose built production comprises, in the short span from 1950 to 1968, the remarkable number of thirty-nine churches, an emblematic case of the design and construction of buildings for Christian worship capable of concretely embodying their liturgical, aesthetic and morphological founding principles. Steffann's architecture, deeply inspired by the religious spirit, tied to primal figures that shape the being-together of the community in the corporeal quality of matter, where the liturgical and monumental presence is expressed in the silence and availability of a space circumscribed by walls and directed by light, contributes to defining, in an objective love for the true, the aesthetic-theological perception and the formative poetics that, in our view, characterise the design and sign of the church. The text constitutes the first complete monographic study of this architectural corpus and is based on direct survey of Steffann's works. The resulting narrative does not follow a chronological order or a presumed importance of the buildings; rather, it seeks out and highlights correspondences between nodes of an ideational network which, with varying degrees of finish, at points not always homogeneous in time and space, denotes an authentic experience of composing and building. The account identifies the architectural objects, discusses their consistency while opening up to other references (in particular the ecclesiological-liturgical thought of Romano Guardini and the aesthetic-theological thought of Hans Urs von Balthasar) capable of illuminating their genesis and manifestation, and finally links them in analogical sequences. A series of original photographic plates, an indispensable and integral part of the research, documents the current state of the sites, further characterising the representational aspect of their architectural composition.
In closing, the architectural synthesis is intended as a tool for verification and design, and hence for future transposition, correlated with the documentary elaboration.
Abstract:
Complex network analysis is a very popular topic in computer science. Unfortunately, these networks, extracted from different contexts, are usually very large, and computing metrics on them can be very expensive. Among such analyses we focus on the extraction of subnetworks called communities: groups of nodes that are likely to play the same role within the whole structure. Community extraction is of interest in many different fields (biology, economics, ...). In this work we present a parallel community detection algorithm that can operate on networks with huge numbers of nodes and edges. After an introduction to graph theory and high performance computing, we explain our design strategies and our implementation. We then show some performance evaluations carried out on a distributed-memory architecture, the IBM BlueGene/Q "Fermi" supercomputer at the CINECA supercomputing centre, Italy, and comment on our results.
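As a conceptual illustration of community detection (a sequential sketch, not the parallel algorithm of the thesis), label propagation on an adjacency-list graph can be written as follows; each node repeatedly adopts the label most common among its neighbours until no label changes:

```python
import random
from collections import Counter

def label_propagation(adj, seed=0, max_iter=100):
    """adj: dict node -> list of neighbours. Returns a node -> community label map."""
    rng = random.Random(seed)
    labels = {v: v for v in adj}          # every node starts in its own community
    nodes = list(adj)
    for _ in range(max_iter):
        rng.shuffle(nodes)                # asynchronous updates in random order
        changed = False
        for v in nodes:
            if not adj[v]:
                continue
            counts = Counter(labels[u] for u in adj[v])
            top = max(counts.values())
            choice = rng.choice([l for l, c in counts.items() if c == top])
            if choice != labels[v]:
                labels[v] = choice
                changed = True
        if not changed:
            break
    return labels
```

Label propagation is attractive for huge graphs because each sweep is linear in the number of edges, which is also why parallel, distributed-memory variants of community detection scale to supercomputers.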
Abstract:
Social networks are one of the "hot" themes in contemporary life and social research. Considering our "embeddedness" in a thick web of social relations is a perspective that could unveil a number of explanations of how people manage their personal and social resources. Looking at how people build and manage their social networks seems an effective way to understand how to help them get the best from their resources. The main aim of this dissertation is to take a closer look at the role of networking behaviors. Antecedents, motivations, different steps and measures of networking behaviors and their outcomes are analyzed and discussed. The results seem to confirm, in a different setting and time perspective, that networking behaviors include different types and goals that change over time. The effects of networking behaviors seem to find empirical confirmation through social network analysis methods. Both personality and situational self-efficacy seem to predict networking behaviors, and different types of motivational drivers seem to be related to different networking behaviors.
Abstract:
The aim of this work is to delimit a critical space in which the concept of modernity in the Spanish-American cultures of the early twentieth century can be rethought. To this end, attention is focused on a literary oeuvre, that of the Uruguayan Julio Herrera y Reissig, which is altogether singular when compared with the aesthetic production most immediately contiguous to it. To read Herrera y Reissig again is, in substance, to re-examine critically the entire modernist epoch, interpreting it not as unitary but as plural and fragmentary. The analysis takes as its starting point the shared cultural coordinates in which those aesthetics took shape and developed, and then proceeds towards a multiplication of paths able to account for the substantial discrepancy in means and ends between Julio Herrera y Reissig and much of the Modernismo contemporary with him. Using the premises of Foucauldian cultural archaeology as a methodological basis, it has been possible to trace, in the Uruguayan's work, a heterogeneous but constant movement of re-emergence and reuse of the most varied experiences of Western thought, aesthetic and otherwise. In the particular uses to which tradition is subjected in Herrera y Reissig's writing, it thus becomes possible to reflect again on the focal points of the experience of modernity: the link between cultural heritage and the present, the relation between traditional sedimentation and novelty and, ultimately, the ways in which literature is allowed to think and speak its own history, past, present and future, and consequently to metabolise it and act upon it once more.
Abstract:
Drawing on the paradigm of social network analysis, the study describes the personal support networks and social capital of a sample of 80 Italians after long-term residential therapeutic treatment for drug addiction. After identifying the profiles of the respondents' social support networks, the ego-centred support networks of drug-free and relapsed subjects were first measured and compared; the study then investigated the network characteristics and the forms of social capital, closure and brokerage, that contribute to maintaining abstinence or to the risk of relapse after treatment. Subjective factors, such as perceived public discrimination and attitude towards work, were also explored in order to investigate their correlation with relapse into substance use. The results show that a lower risk of relapse is positively associated with a stronger attitude towards work, a lower perception of discrimination by society, having support members with a higher socio-economic status who mobilise reputational resources and, finally, having networks that are more heterogeneous in occupation and characterised by higher levels of reciprocity. Moreover, brokerage social capital contributes to maintaining abstinence, since it guarantees the subject access to less homogeneous information and exposure to more numerous and differentiated opportunities. The results therefore demonstrate the important role of personal support networks in preventing or reducing the risk of relapse after treatment, in line with previous research suggesting their incorporation into therapeutic programmes for drug addicts.
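Two of the network characteristics discussed, reciprocity and occupational heterogeneity, have simple standard operationalisations. The sketch below uses the share of reciprocated directed ties and Blau's heterogeneity index; these are common choices in the social network analysis literature, not necessarily the exact measures used in the study:

```python
def reciprocity(ties):
    """ties: set of directed (giver, receiver) support ties.
    Returns the share of ties that are reciprocated."""
    if not ties:
        return 0.0
    return sum(1 for (a, b) in ties if (b, a) in ties) / len(ties)

def occupational_heterogeneity(occupations):
    """Blau's index: 1 - sum of squared category shares.
    0 means everyone shares one occupation; values near 1 mean high diversity."""
    n = len(occupations)
    shares = [occupations.count(c) / n for c in set(occupations)]
    return 1 - sum(s * s for s in shares)
```

Computed over each respondent's ego network, such indices allow the drug-free and relapsed groups to be compared quantitatively.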
Abstract:
Nowadays more and more data is collected in large amounts, so the need to study it both efficiently and profitably is arising: we want to obtain new and significant information that was not known before the analysis. Many graph mining algorithms have been developed by now, but an algebra that could systematically define how to generalize such operations is missing. In order to propel the development of such an algebra for automatic analysis, we propose for the first time (to the best of our knowledge) some primitive operators that may be the prelude to the systematic definition of a hypergraph algebra.
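By way of illustration, two primitive operators of the kind proposed, restriction to a vertex set and dualisation, might be defined on a minimal hypergraph type as follows. The class and operator names here are our own illustrative choices, not those of the thesis:

```python
class Hypergraph:
    """Minimal hypergraph: a set of vertices and a dict of named hyperedges."""

    def __init__(self, vertices, edges):
        self.vertices = set(vertices)
        self.edges = {name: frozenset(members) for name, members in edges.items()}

    def restrict(self, keep):
        """Primitive operator: vertex-induced sub-hypergraph."""
        keep = set(keep)
        edges = {n: e & keep for n, e in self.edges.items() if e & keep}
        return Hypergraph(self.vertices & keep, edges)

    def dual(self):
        """Primitive operator: edges become vertices and vice versa."""
        edges = {v: frozenset(n for n, e in self.edges.items() if v in e)
                 for v in self.vertices}
        return Hypergraph(self.edges.keys(), edges)
```

An algebra would then specify how such operators compose, so that higher-level mining operations can be derived systematically instead of being reimplemented per algorithm.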
Abstract:
In the first chapter, I develop a panel no-cointegration test which extends the bounds test of Pesaran, Shin and Smith (2001) to the panel framework by considering the individual regressions in a Seemingly Unrelated Regression (SUR) system. This makes it possible to take into account unobserved common factors that contemporaneously affect all the units of the panel and provides, at the same time, unit-specific test statistics. Moreover, the approach is particularly suited when the number of individuals in the panel is small relative to the number of time series observations. I develop the algorithm to implement the test and use Monte Carlo simulation to analyze its properties. The small-sample properties of the test are remarkable compared to its single-equation counterpart. I illustrate the use of the test with a test of Purchasing Power Parity in a panel of EU15 countries. In the second chapter of my PhD thesis, I verify the Expectation Hypothesis of the Term Structure (EHTS) in the repurchase agreement (repo) market with a new testing approach. I consider an "inexact" formulation of the EHTS, which models a time-varying component in the risk premia, and I treat the interest rates as a non-stationary cointegrated system. The effect of heteroskedasticity is controlled by means of testing procedures (bootstrap and heteroskedasticity correction) which are robust to variance and covariance shifts over time. I find that the long-run implications of the EHTS are verified. A rolling window analysis clarifies that the EHTS is only rejected in periods of turbulence in financial markets. The third chapter introduces the Stata command "bootrank", which implements the bootstrap likelihood ratio rank test algorithm developed by Cavaliere et al. (2012). The command is illustrated through an empirical application on the term structure of interest rates in the US.
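The resampling principle behind such bootstrap testing procedures can be sketched generically. The following is a plain location-test sketch under an imposed null, not the bootstrap likelihood ratio rank test of Cavaliere et al. (2012) or the `bootrank` implementation:

```python
import random

def mean_stat(xs):
    return sum(xs) / len(xs)

def bootstrap_pvalue(data, statistic, n_boot=2000, seed=0):
    """Share of resampled statistics at least as extreme as the observed one,
    after centring the data at zero to impose the null hypothesis."""
    rng = random.Random(seed)
    obs = statistic(data)
    mean = sum(data) / len(data)
    centred = [x - mean for x in data]            # data consistent with the null
    count = 0
    for _ in range(n_boot):
        sample = [rng.choice(centred) for _ in centred]
        if abs(statistic(sample)) >= abs(obs):
            count += 1
    return count / n_boot
```

The appeal of the bootstrap in the settings above is that the resampling scheme can be chosen to be robust to features such as variance shifts, where asymptotic critical values are unreliable.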
Abstract:
After an introduction on the economy of the ancient world and of Galilee, the thesis offers a historical account of "The Sea of Galilee between antiquity and today" (ch. 3). There follow chapters on "Fishing techniques and equipment" (ch. 4) and on "Cities, villages and fishing areas" (ch. 5). Two chapters deal more specifically with economic activity in the strict sense: "The organisation of the activity" (ch. 6) and "Trade and export" (ch. 7). The thesis closes with two more methodological chapters: an account of the social agents of fishing ("the fishermen") inspired by network analysis, and an anthropological analysis of their way of life (final chapter). The thesis rests essentially on three bodies of documentation: documentary papyri, archaeological data, and historical and literary sources. Many of the documents collected, in Greek, had never been translated into modern languages. In addition to its chapters, the thesis also includes a very extensive documentary appendix.
Abstract:
The development of next-generation microwave technology for backhauling systems is driven by increasing capacity demand. In order to provide higher data rates and throughputs over a point-to-point link, a cost-effective performance improvement is enabled by an enhanced energy efficiency of the transmit power amplification stage, whereas a combination of spectrally efficient modulation formats and wider bandwidths is supported by amplifiers that fulfil strict linearity constraints. An optimal trade-off between these conflicting requirements can be achieved by resorting to flexible digital signal processing techniques at baseband. In such a scenario, adaptive digital pre-distortion is a well-known linearization method and a potentially widely-used solution, since it can be easily integrated into base stations. Its operation can effectively compensate for the inter-modulation distortion introduced by the power amplifier, keeping up with the frequency-dependent, time-varying behaviour of its nonlinear characteristic. In particular, the impact of memory effects becomes more relevant, and their equalisation more challenging, as the discrete input signal features a wider bandwidth and a faster envelope to pre-distort. This thesis project involves the research, design and simulation of a pre-distorter implementation at RTL based on a novel polyphase architecture, which makes it capable of operating on very wideband signals at a sampling rate that complies with the clock speeds actually available in current digital devices. The motivation behind this structure is to carry out feasible pre-distortion for the multi-band, spectrally efficient complex signals carrying multiple channels that are going to be transmitted in near-future high-capacity, high-reliability microwave backhaul links.
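The kind of baseband nonlinearity with memory that a pre-distorter must invert is commonly described by a memory polynomial. The sketch below uses real-valued samples and hand-picked coefficients purely for illustration (an actual pre-distorter works on complex baseband samples with identified coefficients, and the thesis's polyphase RTL architecture is not reproduced):

```python
def memory_polynomial(x, coeffs):
    """Apply a memory-polynomial nonlinearity to a sample sequence x.

    coeffs[q][k] weights the term x[n-q] * |x[n-q]|**k
    (memory depth = len(coeffs), polynomial order = len(coeffs[q])).
    """
    y = []
    for n in range(len(x)):
        acc = 0.0
        for q, taps in enumerate(coeffs):
            if n - q < 0:
                continue             # no samples before the start of the sequence
            s = x[n - q]
            for k, c in enumerate(taps):
                acc += c * s * abs(s) ** k
        y.append(acc)
    return y
```

Cascading a pre-distorter with expansive odd-order terms in front of a compressive amplifier model of the same form largely cancels the inter-modulation distortion, which is the operating principle of digital pre-distortion.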
Abstract:
BACKGROUND: Despite recent algorithmic and conceptual progress, the stoichiometric network analysis of large metabolic models remains a computationally challenging problem. RESULTS: SNA is an interactive, high-performance toolbox for analysing the possible steady-state behaviour of metabolic networks by computing the generating and elementary vectors of their flux and conversion cones. It also supports analysing the steady states by linear programming. The toolbox is implemented mainly in Mathematica and returns numerically exact results. It is available under an open source license from: http://bioinformatics.org/project/?group_id=546. CONCLUSION: Thanks to its performance and modular design, SNA is demonstrably useful in analysing genome-scale metabolic networks. Further, the integration into Mathematica provides a very flexible environment for the subsequent analysis and interpretation of the results.
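The steady-state condition underlying such cone computations is that the stoichiometric matrix annihilates the flux vector, S·v = 0 (production balances consumption for every internal metabolite). A minimal check of this condition, given here only as an illustration and not as part of SNA, is:

```python
def is_steady_state(S, v, tol=1e-9):
    """S: stoichiometric matrix (rows = metabolites, columns = reactions),
    v: flux vector. Steady state requires S @ v = 0 for every metabolite."""
    return all(abs(sum(a * b for a, b in zip(row, v))) <= tol for row in S)
```

For the toy chain "-> A -> B ->" (reactions R1: uptake of A, R2: A -> B, R3: secretion of B), any flux vector with equal rates on all three reactions is a steady state; the flux and conversion cones computed by SNA are exactly the sets of such admissible vectors, subject to irreversibility constraints.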