904 results for Network Analysis Methods
Abstract:
The main aim of this thesis is strongly interdisciplinary: it involves and presumes knowledge of neurophysiology, to understand the mechanisms underlying the studied phenomena; knowledge of and experience in electronics, necessary for the hardware experimental set-up used to acquire neuronal data; and knowledge of informatics and programming, needed to write the code that controls the subjects' behaviour during experiments and the visual presentation of stimuli. Finally, neuronal and statistical models must be well understood to help interpret the data. The project started with a thorough bibliographic search: to date, the mechanisms of heading perception (the perception of the direction of self-motion) are still poorly understood. The main interest is to understand how visual information about our own motion is integrated with eye position information. To investigate the cortical response to visual motion stimuli and its integration with eye position, we decided to study an animal model, using optic flow expansions and contractions as visual stimuli. The first chapter of the thesis presents the basic aims of the research project, together with the reasons why the perception of motion is interesting and important to study. Moreover, this chapter describes the methods my research group considered most adequate to contribute to the scientific community and underlines my personal contribution to the project. The second chapter presents an overview of the background needed to follow the main part of the thesis: it starts with a brief introduction to the central nervous system and cortical functions, then discusses in more depth the association areas, which are the main target of our study. Furthermore, it explains why studies on animal models are necessary to understand mechanisms at the cellular level that could not be addressed in any other way.
The second part of the chapter presents the basics of electrophysiology and cellular communication, together with traditional neuronal data analysis methods. The third chapter is intended to be a helpful resource for future work in the laboratory: it presents the hardware used for the experimental sessions, how to control animal behaviour during the experiments by means of C routines and dedicated software, and how to present visual stimuli on a screen. The fourth chapter is the core of the research project and of the thesis. The methods section presents the experimental paradigms, visual stimuli and data analysis. The results show the responses of area PEc cells to visual motion stimuli combined with different eye positions. In brief, this study led to the identification of different cellular behaviours in relation to the focus of expansion (the direction of motion given by the optic flow pattern) and eye position. The originality and importance of the results are pointed out in the conclusions: this is the first study aimed at investigating the perception of motion in this particular cortical area. In the last paragraph, a neural network model is presented, whose aim is to simulate the pre-saccadic and post-saccadic responses of area PEc neurons during eye movement tasks. The data presented in chapter four are further analysed in chapter five. The analysis started from the observation of the neuronal responses during a 1 s time period in which the visual stimulation was constant. Cell activities clearly showed oscillations in time that had been neglected by the previous analysis based on mean firing frequency. The results distinguished two cellular behaviours by their response characteristics: some neurons showed oscillations that changed depending on eye and optic flow position, while others kept the same oscillation characteristics regardless of the stimulus.
The last chapter discusses the results of the research project, comments on the originality and interdisciplinarity of the study, and proposes some future developments.
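The fifth-chapter analysis contrasts mean firing frequency with the temporal structure of the response. A minimal sketch of that idea (with hypothetical rates and bin sizes, not the thesis's actual pipeline): two cells with identical mean firing frequency are distinguished only by the power spectrum of their binned activity.

```python
import numpy as np

def rate_spectrum(counts, bin_s):
    """Power spectrum of a binned spike-count series (mean removed)."""
    x = counts - counts.mean()
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=bin_s)
    return freqs, power

bin_s = 0.01                                         # 10 ms bins over a 1 s window
t = np.arange(0, 1, bin_s)
flat_rate = np.full(t.size, 40.0)                    # constant 40 spikes/s
osc_rate = 40.0 + 30.0 * np.sin(2 * np.pi * 8 * t)   # 8 Hz modulation

# Both cells have exactly the same mean firing frequency...
assert np.isclose(flat_rate.mean(), osc_rate.mean())

# ...but the spectrum of the modulated cell peaks away from 0 Hz.
f, p = rate_spectrum(osc_rate * bin_s, bin_s)        # expected counts per bin
peak_hz = f[np.argmax(p[1:]) + 1]                    # skip the DC bin
print(peak_hz)   # 8.0
```

An analysis based only on the mean would call these two cells identical; the spectral view separates them, which is the distinction the chapter exploits.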
Abstract:
Resonant laser ionization has established itself as a universal technique for a wide range of applications requiring selective ionization at high efficiency. To this end, two laser systems with different objectives and focuses were developed and applied in this work. The first part of the thesis presents the development of high-resolution resonance ionization mass spectrometry for the ultra-trace detection of 41Ca. For this purpose, three continuous-wave diode lasers were combined with a quadrupole mass spectrometer. At a detection efficiency of 1 × 10^−5, a detection limit of 2 × 10^−13 41Ca/totCa was achieved. The measurement procedure, transferred into routine operation, enabled participation in an interdisciplinary network for osteoporosis research. Comparative measurements of resonance ionization mass spectrometry against all currently existing measurement techniques for 41Ca ultra-trace detection showed very good agreement. The second part of the thesis covers the adaptation of a tunable, high-repetition-rate titanium:sapphire laser system for use at laser ion sources for the selective production of radioactive ion beams. The developed laser system enables efficient resonant excitation of the majority of the elements in the periodic table. For this purpose, a combined frequency-doubling and frequency-tripling unit for the generation of higher harmonics was constructed. The applicability of such an all-solid-state laser system was demonstrated in numerous off-line test measurements both in Mainz and at the ISOL facilities at TRIUMF and ORNL, and led to the first on-line use at TRIUMF.
Abstract:
Neoplastic overgrowth depends on the cooperation of several mutations ultimately leading to major rearrangements in cellular behaviour. The molecular crosstalk occurring between precancerous and normal cells strongly influences the early steps of the tumourigenic process as well as later stages of the disease. Precancerous cells are often removed by cell death from normal tissues but the mechanisms responsible for such fundamental safeguard processes remain in part elusive. To gain insight into these phenomena I took advantage of the clonal analysis methods available in Drosophila for studying the phenotypes due to loss of function of the neoplastic tumour suppressor lethal giant larvae (lgl). I found that lgl mutant cells growing in wild-type imaginal wing discs are subject to the phenomenon of cell competition and are eliminated by JNK-dependent cell death because they express very low levels of dMyc oncoprotein compared to those in the surrounding tissue. Indeed, in non-competitive backgrounds lgl mutant clones are able to overgrow and upregulate dMyc, overwhelming the neighbouring tissue and forming tumourous masses that display several cancer hallmarks. These phenotypes are completely abolished by reducing dMyc abundance within mutant cells while increasing it in lgl clones growing in a competitive context re-establishes their tumourigenic potential. Similarly, the neoplastic growth observed upon the oncogenic cooperation between lgl mutation and activated Ras/Raf/MAPK signalling was found to be characterised by and dependent on the ability of cancerous cells to upregulate dMyc with respect to the adjacent normal tissue, through both transcriptional and post-transcriptional mechanisms, thereby confirming its key role in lgl-induced tumourigenesis. 
These results provide the first evidence that the dMyc oncoprotein is required in lgl mutant tissue to promote invasive overgrowth in developing and adult epithelial tissues, and that dMyc abundance inside versus outside lgl mutant clones plays a key role in driving neoplastic overgrowth.
Abstract:
It is currently widely accepted that understanding complex cell functions requires an integrated network-theoretical approach rather than an isolated view of the different molecular agents. The aim of this thesis was to examine topological properties that mirror known biological aspects by depicting the human protein network with methods from graph and network theory. The presented network is a partial human interactome of 9222 proteins and 36324 interactions, consisting of single interactions reliably extracted from peer-reviewed scientific publications. In general, one can focus on intra- or intermodular characteristics, where a functional module is defined as "a discrete entity whose function is separable from those of other modules". The presented human network is found to be scale-free and hierarchically organised, as shown for yeast networks before. The interactome also exhibits proteins with high betweenness and low connectivity, which are biologically analysed and interpreted here, for the first time, as shuttling proteins between organelles (e.g. ER to Golgi, internal ER protein translocation, peroxisomal import, nuclear pore import/export). As an optimisation for finding proteins that connect modules, a new method is developed here based on proteins located between highly clustered regions, rather than on highly connected regions. As a proof of principle, the Mediator complex, the prime example of a connector complex, is found in first place. Focusing on intramodular aspects, the measurement of k-clique communities discriminates overlapping modules very well. Twenty of the largest identified modules are analysed in detail and annotated to known biological structures (e.g. the proteasome and the NFκB and TGF-β complexes). Additionally, two large and highly interconnected modules for signal transducer and transcription factor proteins are revealed, separated by known shuttling proteins.
These proteins also yield the highest number of redundant shortcuts (calculated via the network skeleton), exhibit the highest numbers of interactions, and might constitute highly interconnected but spatially separated rich-clubs, either for signal transduction or for transcription factors. This design principle allows manifold regulatory events in signal transduction and enables a high diversity of transcription events in the nucleus with a limited set of proteins. Altogether, biological aspects are mirrored by purely topological features, leading to a new view and to new methods that assist the annotation of proteins to biological functions, structures and subcellular localisations. As the human protein network is one of the most complex networks of all, these results will be fruitful for other fields of network theory and will help in understanding complex network functions in general.
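The "high betweenness, low connectivity" signature used above to flag shuttling proteins can be illustrated on a toy graph (hypothetical node names, and Brandes' standard algorithm rather than the thesis's own implementation): a node bridging two dense modules has minimal degree yet maximal betweenness.

```python
from collections import deque

def betweenness(graph):
    """Unweighted betweenness centrality (Brandes' algorithm)."""
    bc = {v: 0.0 for v in graph}
    for s in graph:
        # BFS from s, counting numbers of shortest paths (sigma)
        sigma = {v: 0 for v in graph}; sigma[s] = 1
        dist = {v: -1 for v in graph}; dist[s] = 0
        preds = {v: [] for v in graph}
        order, q = [], deque([s])
        while q:
            v = q.popleft(); order.append(v)
            for w in graph[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1; q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]; preds[w].append(v)
        # accumulate dependencies in reverse BFS order
        delta = {v: 0.0 for v in graph}
        for w in reversed(order):
            for v in preds[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    for v in bc:
        bc[v] /= 2   # undirected: each pair was counted twice
    return bc

# Two dense "modules" joined only through the low-degree connector "x"
g = {
    "a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b", "x"},
    "x": {"c", "d"},
    "d": {"x", "e", "f"}, "e": {"d", "f"}, "f": {"d", "e"},
}
bc = betweenness(g)
degree = {v: len(nbrs) for v, nbrs in g.items()}
# "x" has the minimal degree in the graph yet the highest betweenness:
# every inter-module shortest path crosses it.
print(degree["x"], max(bc, key=bc.get))   # 2 x
```

This is exactly the profile that, in the interactome, singles out connector proteins sitting between modules rather than inside them.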
Abstract:
Primary biogenic aerosol (PBA) particles account for large proportions of air particulate matter, and they can influence the hydrological cycle and climate as nuclei for water droplets and ice crystals in clouds, fog, and precipitation. Moreover, they can cause or enhance human, animal, and plant diseases. The actual abundance and properties of PBA particles and components in the atmosphere are, however, still poorly understood and quantified. In this study, the identity, diversity, and frequency of occurrence of PBA particles were investigated by DNA analysis. Methods for the extraction, amplification, and analysis of DNA from aerosol filter samples were developed and optimized for different types of organisms, including fungi, bacteria, and plants. The investigations were focused on fungal DNA, and over 2500 sequences were obtained from air samples collected at different locations and climatic zones around the world (tropical, mid-latitude, sub-polar; continental, marine). Nearly all fungal DNA sequences could be attributed to the phyla of Ascomycota and Basidiomycota. With regard to species richness, the ratio of Basidiomycota to Ascomycota was much higher in continental air samples (~60:40) than in marine air samples (~30:70). Pronounced differences in the relative abundance and seasonal cycles of various groups of fungi were detected in coarse and fine particulate matter from continental air, with more plant pathogens in the coarse fraction and more human pathogens and allergens in the respirable fine particle fraction (<3 µm). The results of this study provide new information and insights into the sources of PBA particles and the interactions of the biosphere with the atmosphere, climate, and public health.
Abstract:
The study, conducted using the social network analysis paradigm, describes the personal support networks and social capital of a sample of 80 Italians after long-term residential therapeutic treatment for drug addiction. After identifying the profiles of the respondents' social support networks, the ego-centred support networks of drug-free and relapsed subjects were first measured and compared; subsequently, the network characteristics and the forms of social capital (closure and brokerage) that contribute to maintaining abstinence or to the risk of relapse after treatment were investigated. Subjective factors, such as perceived public discrimination and attitude toward work, were also explored to investigate their correlation with repeated substance use. The results of the study show that a lower risk of relapse is positively associated with a stronger attitude toward work, a lower perception of discrimination by society, having support members with a higher socio-economic status who mobilise reputational resources, and having networks that are more occupationally heterogeneous and characterised by higher levels of reciprocity. Moreover, brokerage social capital contributes to maintaining abstinence because it guarantees the subject access to less homogeneous information and exposure to more numerous and differentiated opportunities. The findings of the study thus demonstrate the important role of personal support networks in preventing or reducing the risk of relapse after treatment, in line with previous research suggesting their incorporation into therapeutic programmes for drug addicts.
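The two forms of social capital measured above, closure and brokerage, can be sketched for a single ego-centred support network. This is an illustrative sketch with hypothetical support members, using density among alters as a closure measure and Borgatti's simplified effective size as a brokerage proxy, which may differ from the study's exact operationalisation.

```python
def ego_measures(alter_ties, n_alters):
    """Closure (density among alters) and effective size (Borgatti's
    simplification of Burt's measure) for an unweighted ego network.
    alter_ties is a set of frozenset pairs of alters (ego excluded)."""
    t = len(alter_ties)
    density = 2 * t / (n_alters * (n_alters - 1)) if n_alters > 1 else 0.0
    effective_size = n_alters - 2 * t / n_alters
    return density, effective_size

# Hypothetical respondent with 4 support members who all know each other
closed = {frozenset(p) for p in [("m1", "m2"), ("m1", "m3"), ("m2", "m3"),
                                 ("m1", "m4"), ("m2", "m4"), ("m3", "m4")]}
# Same size, but no ties among the alters: a pure brokerage position
open_net = set()

d1, e1 = ego_measures(closed, 4)     # closure:  density 1.0, eff. size 1.0
d2, e2 = ego_measures(open_net, 4)   # brokerage: density 0.0, eff. size 4.0
```

High density corresponds to the closure form of social capital; high effective size corresponds to brokerage, the form the study links to access to less homogeneous information.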
Abstract:
This thesis reports the evolution of methods for analysing techno-social systems, through the various research experiences directly undertaken. The first case presented is a study based on data mining of a word-association dataset named Human Brain Cloud: its validation is addressed and, also through non-trivial modelling, a better understanding of language properties is presented. Then a real complex-system experiment is introduced: the WideNoise experiment in the context of the EveryAware European project. The project and the course of the experiment are illustrated and the data analysis is presented. Next, the Experimental Tribe platform for social computation is introduced. It has been conceived to help researchers implement web experiments, and also aims to catalyse the cumulative growth of experimental methodologies and the standardisation of the tools cited above. In the last part, three further research experiences already carried out on the Experimental Tribe platform are discussed in detail, from the design of the experiment to the analysis of the results and, eventually, to the modelling of the systems involved. The experiments are: CityRace, about the measurement of human traffic-facing strategies; laPENSOcosì, aiming to unveil the structure of political opinion; and AirProbe, implemented again in the EveryAware project framework, which consisted in monitoring shifts in opinion about air quality in a community informed about local air pollution. In the end, the evolution of the methods for investigating techno-social systems emerges, together with the opportunities and threats offered by this new scientific path.
Abstract:
Over time, Twitter has become a fundamental source of information and news. As a step forward, researchers have tried to analyse whether tweets contain predictive power. In the past, much research in the financial field has proposed functions that take as input all the tweets for a particular stock or index s, analyse them, and predict the price of s. In this work, we take an alternative approach: using stock prices and tweet information, we investigate the following questions. 1. Is there any relation between the number of tweets generated and the volume of stock being exchanged? 2. Is there any relation between the sentiment of the tweets and stock prices? 3. What is the structure of the graph that describes the relationships between users?
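A first pass at the first question, relating tweet counts to trading volume, is a simple correlation between the two daily series. The numbers below are hypothetical, purely to show the computation, not data from the work itself.

```python
import numpy as np

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return float(np.corrcoef(x, y)[0, 1])

# Hypothetical daily series for one stock: tweet counts and shares traded
tweets = [120, 340, 200, 560, 310, 150, 480]
volume = [1.1e6, 2.9e6, 1.8e6, 4.7e6, 2.6e6, 1.3e6, 4.0e6]

r = pearson(tweets, volume)
```

A value of r near 1 would suggest that days with heavy tweet activity coincide with days of heavy trading; in practice one would also check lagged versions of the series to probe predictive power rather than mere co-occurrence.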
Abstract:
Advances in the area of mobile and wireless communication for healthcare (m-Health), along with improvements in information science, allow the design and development of new patient-centric models for the provision of personalised healthcare services, increased patient independence, and improved self-control and self-management capabilities. This paper comprises a brief overview of m-Health applications for the self-management of individuals with diabetes mellitus and the enhancement of their quality of life. Furthermore, the design and development of a mobile phone application for Type 1 Diabetes Mellitus (T1DM) self-management is presented. The technical evaluation of the application, which permits the management of blood glucose measurements, blood pressure measurements, insulin dosage, food/drink intake and physical activity, has shown that the use of mobile phone technologies along with data analysis methods might improve the self-management of T1DM.
Abstract:
Background Pneumococcal conjugate vaccines (PCV) were first licensed for use with 3 primary doses in infancy and a booster dose. The evidence for the effects of different schedules was examined in this systematic review and meta-analysis. Methods We searched 12 databases and trial registers up to March 2010. We selected randomised controlled trials (RCTs), cohort and case–control studies making direct comparisons between PCV schedules with two (2p) or three (3p) primary doses, and with (+1) or without (+0) a booster dose. We extracted data on clinical, nasopharyngeal carriage and immunological outcomes and used meta-analysis to combine results where appropriate. Results Seropositivity levels (antibody concentration ≥0.35 μg/ml) following 3p and 2p PCV schedules were high for most serotypes (5 RCTs). Differences between schedules were generally small and tended to favour 3p schedules, particularly for serotypes 6B and 23F; between-study heterogeneity was high. Seropositivity levels following 3p+1 and 2p+1 schedules were similar, but small differences favouring 3p+1 schedules were seen for serotypes 6B and 23F. We did not identify any RCTs reporting clinical outcomes for these comparisons. In 2 RCTs there was weak evidence of a reduction in carriage of S. pneumoniae serotypes included in the vaccine when 3p+0 schedules were compared with 2p+0 at 6 months of age. Conclusions Most data about the relative effects of different PCV schedules relate to immunological outcomes. Both 3p and 2p schedules result in high levels of seropositivity. The clinical relevance of differences in immunological outcomes between schedules is not known. There is an absence of clinical outcome data from RCTs directly comparing any 2p schedule with any 3p PCV schedule.
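Pooling per-study effects in the presence of the high between-study heterogeneity noted above is typically done with a random-effects model. A sketch of DerSimonian-Laird pooling on hypothetical log odds ratios (illustrative numbers, not the review's actual data or necessarily its exact method):

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate (DerSimonian-Laird) for per-study
    effect sizes (e.g. log odds ratios of seropositivity) and variances."""
    y, v = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / v
    fixed = np.sum(w * y) / np.sum(w)           # fixed-effect estimate
    q = np.sum(w * (y - fixed) ** 2)            # Cochran's Q statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)     # between-study variance
    w_star = 1.0 / (v + tau2)                   # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2

# Hypothetical log odds ratios (3p vs 2p seropositivity) from 5 trials
effects = [0.30, 0.10, 0.45, 0.05, 0.25]
variances = [0.04, 0.03, 0.06, 0.02, 0.05]
pooled, se, tau2 = dersimonian_laird(effects, variances)
```

A non-zero tau2 inflates the pooled standard error, which is how the model acknowledges heterogeneity instead of averaging it away.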
Abstract:
Objectives To compare the use of pair-wise meta-analysis methods to multiple treatment comparison (MTC) methods for evidence-based health-care evaluation to estimate the effectiveness and cost-effectiveness of alternative health-care interventions based on the available evidence. Methods Pair-wise meta-analysis and more complex evidence syntheses, incorporating an MTC component, are applied to three examples: 1) clinical effectiveness of interventions for preventing strokes in people with atrial fibrillation; 2) clinical and cost-effectiveness of using drug-eluting stents in percutaneous coronary intervention in patients with coronary artery disease; and 3) clinical and cost-effectiveness of using neuraminidase inhibitors in the treatment of influenza. We compare the two synthesis approaches with respect to the assumptions made, empirical estimates produced, and conclusions drawn. Results The difference between point estimates of effectiveness produced by the pair-wise and MTC approaches was generally unpredictable, sometimes agreeing closely while in other instances differing considerably. In all three examples, the MTC approach allowed the inclusion of randomized controlled trial evidence ignored in the pair-wise meta-analysis approach. This generally increased the precision of the effectiveness estimates from the MTC model. Conclusions The MTC approach to synthesis allows the evidence base on clinical effectiveness to be treated as a coherent whole, to include more data, and sometimes to relax the assumptions made in the pair-wise approaches. However, MTC models are necessarily more complex than those developed for pair-wise meta-analysis and thus could be seen as less transparent. Therefore, it is important that model details and the assumptions made are carefully reported alongside the results.
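The simplest building block that MTC generalises is the adjusted indirect comparison: two treatments never trialled head-to-head are compared through a shared comparator. A sketch of the Bucher method (hypothetical log odds ratios, not the paper's data; full MTC models are Bayesian and far richer than this):

```python
import numpy as np

def bucher_indirect(d_ab, se_ab, d_cb, se_cb):
    """Adjusted indirect comparison of A vs C via common comparator B:
    d_AC = d_AB - d_CB, with the variances of the two estimates summing."""
    d_ac = d_ab - d_cb
    se_ac = np.sqrt(se_ab ** 2 + se_cb ** 2)
    return d_ac, se_ac

# Hypothetical log odds ratios of two treatments against a shared placebo B
d_ac, se_ac = bucher_indirect(d_ab=-0.50, se_ab=0.15, d_cb=-0.20, se_cb=0.10)
print(round(d_ac, 2), round(se_ac, 2))   # -0.3 0.18
```

Note that the indirect standard error exceeds either direct one; MTC models combine many such direct and indirect paths at once, which is why they can gain precision while a single indirect comparison loses it.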
Abstract:
Transportation corridors in megaregions present a unique challenge for planners because of the high concentration of development, complex interjurisdictional issues, and history of independent development of core urban centers. The concept of resilience, as applied to megaregions, can be used to better understand the performance of these corridors. Resiliency is the ability to recover from or adjust easily to change. Resiliency performance measures can be extended for application to megaregions throughout the United States. When applied to transportation corridors in megaregions and represented by performance measures such as redundancy, continuity, connectivity, and travel time reliability, the concept of resiliency captures the spatial and temporal relationships between the attributes of a corridor, a network, and neighboring facilities over time at the regional and local levels. This paper focuses on the development of performance measures for evaluating corridor resiliency, as well as a plan for implementing analysis methods at the jurisdictional level. The transportation corridor between Boston, Massachusetts, and Washington, D.C., is used as a case study to demonstrate the applicability of these measures to megaregions throughout the country.
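Of the performance measures listed above, travel time reliability is the most directly computable. One common operationalisation (an illustrative choice, not necessarily the paper's) is the buffer index, sketched here with hypothetical corridor travel times:

```python
import numpy as np

def buffer_index(travel_times):
    """Buffer index: the extra near-worst-case time a traveller must budget
    relative to the average trip: (95th percentile - mean) / mean."""
    t = np.asarray(travel_times, float)
    return float((np.percentile(t, 95) - t.mean()) / t.mean())

# Hypothetical corridor travel times (minutes), before and after a disruption
stable = [50, 52, 51, 49, 53, 50, 51, 52]
disrupted = [50, 52, 51, 49, 95, 50, 51, 88]

bi_stable = buffer_index(stable)
bi_disrupted = buffer_index(disrupted)
```

A resilient corridor keeps the index low after a shock, for instance because redundant parallel facilities absorb diverted traffic; tracking the index over time captures the temporal dimension the resiliency concept emphasises.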
Abstract:
BACKGROUND: Despite recent algorithmic and conceptual progress, the stoichiometric network analysis of large metabolic models remains a computationally challenging problem. RESULTS: SNA is an interactive, high-performance toolbox for analysing the possible steady-state behaviour of metabolic networks by computing the generating and elementary vectors of their flux and conversion cones. It also supports analysing the steady states by linear programming. The toolbox is implemented mainly in Mathematica and returns numerically exact results. It is available under an open source license from: http://bioinformatics.org/project/?group_id=546. CONCLUSION: Thanks to its performance and modular design, SNA is demonstrably useful in analysing genome-scale metabolic networks. Furthermore, the integration into Mathematica provides a very flexible environment for the subsequent analysis and interpretation of the results.
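SNA's cone computations handle reaction irreversibility and go well beyond plain linear algebra, but the steady-state condition they build on is simply S v = 0 for stoichiometric matrix S. A minimal sketch of that underlying condition (a toy pathway in Python/NumPy, not SNA itself):

```python
import numpy as np

def steady_state_fluxes(S, tol=1e-10):
    """Basis of the right null space of stoichiometric matrix S (S @ v = 0),
    i.e. flux distributions that leave every internal metabolite balanced."""
    _, sing, vt = np.linalg.svd(S)
    rank = int(np.sum(sing > tol))
    return vt[rank:].T   # columns span {v : S v = 0}

# Toy linear pathway  A -> B -> C  with exchange fluxes in and out:
#   v1: -> A,  v2: A -> B,  v3: B -> C,  v4: C ->
S = np.array([
    [1, -1,  0,  0],   # metabolite A
    [0,  1, -1,  0],   # metabolite B
    [0,  0,  1, -1],   # metabolite C
], float)

N = steady_state_fluxes(S)
# The null space is one-dimensional: at steady state, all four
# reactions must carry the same flux.
v = N[:, 0] / N[0, 0]
print(v)   # prints [1. 1. 1. 1.]
```

Elementary flux modes are, roughly, the minimal such vectors that also respect irreversibility constraints; computing them for genome-scale networks is the hard combinatorial problem the toolbox addresses.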
Abstract:
BACKGROUND: Assessment of lung volume (functional residual capacity, FRC) and ventilation inhomogeneities with an ultrasonic flowmeter and multiple breath washout (MBW) has been used to provide important information about lung disease in infants. Sub-optimal adjustment of the mainstream molar mass (MM) signal for temperature and external deadspace may lead to analysis errors in infants, whose tidal volume changes during breathing are critically small. METHODS: We measured expiratory temperature in human infants at 5 weeks of age and examined the influence of temperature and deadspace changes on FRC results with computer simulation modeling. A new analysis method with optimized temperature and deadspace settings was then derived, tested for robustness to analysis errors and compared with the previously used analysis methods. RESULTS: Temperature in the facemask was higher, and variations of deadspace volumes larger, than previously assumed. Both had considerable impact on FRC and lung clearance index (LCI) results, with high variability when obtained with the previously used analysis model. Using the measured temperature we optimized the model parameters and tested a newly derived analysis method, which was found to be more robust to variations in deadspace. Comparison between the two analysis methods showed systematic differences and a wide scatter. CONCLUSION: Corrected deadspace and more realistic temperature assumptions improved the stability of the analysis of MM measurements obtained by ultrasonic flowmeter in infants. This new analysis method, using the only currently available commercial ultrasonic flowmeter for infants, may help to improve the stability of the analysis and further facilitate assessment of lung volume and ventilation inhomogeneities in infants.
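The FRC analysis described above rests on a mass-balance principle: the inert tracer volume washed out over the test equals FRC times the drop in end-tidal tracer concentration. A sketch of that core relation with hypothetical breath data (the actual device pipeline additionally corrects the molar mass signal for temperature and deadspace, which is precisely what the study improves):

```python
import numpy as np

def frc_from_washout(exp_volumes, exp_tracer_fracs, c_start, c_end):
    """FRC from a multiple-breath washout: net tracer volume washed out
    divided by the drop in end-tidal tracer concentration."""
    tracer_out = float(np.sum(np.asarray(exp_volumes, float)
                              * np.asarray(exp_tracer_fracs, float)))
    return tracer_out / (c_start - c_end)

# Hypothetical infant washout: 30 breaths of 30 ml expired volume, with
# per-breath mean tracer fractions decaying as the gas washes out
vols = [0.03] * 30                         # litres per breath
fracs = np.geomspace(0.18, 0.004, 30)      # mean tracer fraction per breath
frc = frc_from_washout(vols, fracs, c_start=0.20, c_end=0.005)
lci = sum(vols) / frc                      # lung clearance index: CEV / FRC
```

Because FRC sits in the denominator of the LCI, any error in the molar-mass-derived tracer signal propagates into both outcome measures at once, which is why small temperature and deadspace misestimates matter so much in infants.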