834 results for Network Analysis Methods
Abstract:
This programmatic paper investigates the possibilities, opportunities, and risks of analyzing personal and professional online communication from the point of view of interactional sociolinguistics combined with modern social network analysis (SNA). It thus has two complementary goals: one is the exploration of adequate, innovative concepts and methods for analyzing online communication; the other is to use online communication and its ontological and functional specificities to enrich the conceptual and methodological background of SNA. The paper is organized in two parts. It begins with an introduction to recent developments in sociolinguistic social network analysis. Here, three interesting new concepts and tools are discussed: latent versus emergent networks (Watts 1991), coalitions (Fitzmaurice 2000a, Fitzmaurice 2000b), and communities of practice (Wenger 1998).
Abstract:
Computational network analysis provides new methods to analyze the human connectome. Brain structural networks can be characterized by global and local metrics that have recently given promising insights for the diagnosis and further understanding of neurological, psychiatric and neurodegenerative disorders. In order to ensure the validity of results in clinical settings, the precision and repeatability of the networks and the associated metrics must be evaluated. In the present study, nineteen healthy subjects underwent two consecutive measurements, enabling us to test the reproducibility of the brain network and its global and local metrics. As it is known that network topology depends on network density, the effects of setting a common density threshold for all networks were also assessed. Results showed good to excellent repeatability for global metrics, whereas repeatability was more variable for local metrics and some were found to have locally poor repeatability. Moreover, between-subject differences were slightly inflated when the density was not fixed. At the global level, these findings confirm previous results on the validity of global network metrics as clinical biomarkers. However, the new results in this work indicate that the remaining variability at the local level, as well as the effect of methodological choices on network topology, should be considered in the analysis of brain structural networks and especially in network comparisons.
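The global and local graph metrics mentioned in this abstract, and the effect of fixing a common density threshold, can be illustrated with a minimal sketch using NetworkX; the random connectivity matrix, the 90-node size and the 10% density value below are hypothetical placeholders, not the study's data or pipeline.

```python
# Minimal sketch: global and local metrics on a density-thresholded
# connectivity matrix. The random matrix and the 10% density threshold
# are illustrative assumptions only.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
n = 90                                    # e.g. one node per atlas region (hypothetical)
w = rng.random((n, n))
w = (w + w.T) / 2                         # symmetric "connectivity" weights
np.fill_diagonal(w, 0)

density = 0.10                            # common density imposed on all networks
k = int(density * n * (n - 1) / 2)        # number of edges to keep
cutoff = np.sort(w[np.triu_indices(n, 1)])[-k]
adj = (w >= cutoff).astype(int)
np.fill_diagonal(adj, 0)

G = nx.from_numpy_array(adj)

# Global metrics: one value per network
print("global efficiency:", nx.global_efficiency(G))
print("mean clustering  :", nx.average_clustering(G))

# Local metrics: one value per node/region (reported as less repeatable)
local_clustering = nx.clustering(G)
betweenness = nx.betweenness_centrality(G)
```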
Abstract:
The brain is a complex neural network with a hierarchical organization, and the mapping of its elements and connections is an important step towards understanding its function. Recent developments in diffusion-weighted imaging have provided the opportunity to reconstruct the whole-brain structural network in vivo at a large scale and to study the brain's structural substrate in a framework that is close to the current understanding of brain function. However, methods to construct the connectome are still under development and should be carefully evaluated. To this end, the first two studies included in this thesis aimed at improving the analytical tools specific to the methodology of brain structural networks. The first of these papers assessed the repeatability of the most common global and local network metrics used in the literature to characterize the connectome, while the second paper evaluated the validity of further metrics based on the concept of communicability. Communicability is a broader measure of connectivity which also accounts for parallel and indirect connections. These additional paths may be important for reorganizational mechanisms in the presence of lesions as well as for enhancing integration in the network. These studies showed good to excellent repeatability of global network metrics when the same methodological pipeline was applied, but more variability was detected when considering local network metrics or when using different thresholding strategies. In addition, communicability metrics were found to add insight into the integration properties of the network by detecting subsets of nodes that were highly interconnected or vulnerable to lesions. The other two studies used methods based on diffusion-weighted imaging to investigate the relationship between functional and structural connectivity and the etiology of schizophrenia. The third study integrated functional oscillations measured using electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) with diffusion-weighted imaging data. The multimodal approach revealed a positive relationship between individual fluctuations of the EEG alpha frequency and diffusion properties of specific connections of two resting-state networks. Finally, in the fourth study diffusion-weighted imaging was used to probe for a relationship between the underlying white matter tissue structure and season of birth in schizophrenia patients. The results are in line with the neurodevelopmental hypothesis of early pathological mechanisms as the origin of schizophrenia. The different analytical approaches selected in these studies also provide arguments for discussing the current limitations in the analysis of brain structural networks. To sum up, the first two studies presented in this thesis illustrated the potential of brain structural network analysis to provide useful information on features of brain functional segregation and integration using reliable network metrics, while the other two studies presented alternative approaches. The joint discussion of the four studies highlights the benefits and possibilities of analyzing the connectome as well as some current limitations.
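Communicability, the measure evaluated in the second study, weights walks of all lengths between node pairs so that parallel and indirect connections contribute to connectivity. A minimal sketch on a toy graph (not the thesis data), using NetworkX's communicability function, is:

```python
# Sketch of communicability: walks of all lengths between nodes contribute,
# so parallel and indirect paths raise the score. Toy graph only.
import networkx as nx

G = nx.Graph()
G.add_edges_from([(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)])

comm = nx.communicability(G)               # dict of dicts: comm[u][v]
print("shortest path 1 -> 3   :", nx.shortest_path_length(G, 1, 3))
print("communicability 1 -> 3 :", comm[1][3])

# Removing an edge (a "lesion") lowers communicability even though a
# shortest path of the same length between 1 and 3 still exists.
G.remove_edge(0, 2)
print("after lesion 1 -> 3    :", nx.communicability(G)[1][3])
```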
Abstract:
QUESTIONS UNDER STUDY: Patient characteristics and risk factors for death of Swiss trauma patients in the Trauma Audit and Research Network (TARN). METHODS: Descriptive analysis of trauma patients (≥16 years) admitted to a level I trauma centre in Switzerland (September 1, 2009 to August 31, 2010) and entered into TARN. Multivariable logistic regression analysis was used to identify predictors of 30-day mortality. RESULTS: Of 458 patients, 71% were male. The median age was 50.5 years (inter-quartile range [IQR] 32.2-67.7), the median Injury Severity Score (ISS) was 14 (IQR 9-20) and the median Glasgow Coma Score (GCS) was 15 (IQR 14-15). The ISS was >15 for 47% of patients, and 14% had an ISS >25. A total of 17 patients (3.7%) died within 30 days of trauma; all deaths occurred in patients with ISS >15. Most injuries were due to falls <2 m (35%) or road traffic accidents (29%). Injuries to the head (39%) were followed by injuries to the lower limbs (33%), spine (28%) and chest (27%). The time of admission peaked between 12:00 and 22:00, with a second peak between 00:00 and 02:00. A total of 64% of patients were admitted directly to our trauma centre. The median time to CT was 30 min (IQR 18-54 min). In the multivariable regression analysis, the predictors of mortality were older age, higher ISS and lower GCS. CONCLUSIONS: Characteristics of Swiss trauma patients derived from TARN were described for the first time, providing a detailed overview of the institutional trauma population. Based on these results, patient management and hospital resources (e.g. triage of patients, time to CT, staffing during night shifts) could be evaluated as a further step.
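A multivariable logistic regression of the kind used here to identify predictors of 30-day mortality (age, ISS, GCS) can be sketched as below; the synthetic data frame, coefficients and variable names are hypothetical placeholders, not TARN data.

```python
# Sketch: multivariable logistic regression for 30-day mortality with
# age, ISS and GCS as predictors. The synthetic data are placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 458
df = pd.DataFrame({
    "age": rng.uniform(16, 90, n),
    "iss": rng.integers(1, 50, n),
    "gcs": rng.integers(3, 16, n),
})
# Hypothetical outcome: risk rises with age and ISS, falls with GCS.
logit = -6 + 0.03 * df["age"] + 0.08 * df["iss"] - 0.15 * df["gcs"]
df["died_30d"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(df[["age", "iss", "gcs"]])
fit = sm.Logit(df["died_30d"], X).fit(disp=0)
print(fit.summary())
print("odds ratios:\n", np.exp(fit.params))
```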
Abstract:
Accurate quantitative estimation of exposure using retrospective data has been one of the most challenging tasks in the exposure assessment field. To improve these estimates, some models have been developed using published exposure databases with their corresponding exposure determinants. These models are designed to be applied to reported exposure determinants obtained from study subjects or to exposure levels assigned by an industrial hygienist, so that quantitative exposure estimates can be obtained. In an effort to improve the prediction accuracy and generalizability of these models, and taking into account that the limitations encountered in previous studies might be due to limitations in the applicability of traditional statistical methods and concepts, the use of data analysis methods derived from computer science, predominantly machine learning approaches, was proposed and explored in this study. The goal of this study was to develop a set of models using decision tree/ensemble and neural network methods to predict occupational exposure outcomes based on literature-derived databases, and to compare, using cross-validation and data-splitting techniques, the resulting prediction capacity to that of traditional regression models. Two cases were addressed: the categorical case, where the exposure level was measured as an exposure rating following the American Industrial Hygiene Association guidelines, and the continuous case, where the exposure is expressed as a concentration value. Previously developed literature-based exposure databases for 1,1,1-trichloroethane, methylene dichloride and trichloroethylene were used. When compared to regression estimates, results showed better accuracy of decision tree/ensemble techniques for the categorical case, while neural networks were better for estimating continuous exposure values. Overrepresentation of classes and overfitting were the main causes of poor neural network performance and accuracy. Estimates based on literature-based databases using machine learning techniques might provide an advantage when applied to other methodologies that combine expert inputs with current exposure measurements, such as the Bayesian Decision Analysis tool. The use of machine learning techniques to more accurately estimate exposures from literature-based exposure databases might represent a starting point for independence from expert judgment.
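The comparison of ensemble tree methods, neural networks and traditional regression described above can be sketched with scikit-learn for the continuous-exposure case; the synthetic exposure determinants and concentrations below are stand-ins for the literature-derived databases, not the actual data or model settings.

```python
# Sketch: compare linear regression, a tree ensemble and a neural network
# with cross-validation for a continuous exposure value.
# Synthetic determinants/concentrations are illustrative only.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.random((300, 6))                                   # exposure determinants
y = 2 * X[:, 0] + X[:, 1] ** 2 + rng.normal(0, 0.1, 300)   # concentration values

models = {
    "linear regression": LinearRegression(),
    "random forest": RandomForestRegressor(n_estimators=200, random_state=0),
    "neural network": MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000,
                                   random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(name, "mean R^2 =", round(scores.mean(), 3))
```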
Abstract:
There have been recent calls for the field of International Business (IB) to retool its routines by becoming genuinely interdisciplinary. This paper takes such an approach by using recent advances in evolutionary economics and applying them to IB. Evolutionary economists are now viewing the economy as an actual network; consequently, one of the key analytical tools in this approach is network analysis. Some of the basic methods in network analysis are reviewed. The paper then looks at how these tools might be of use in IB studies. In particular, it outlines fruitful research paths in the areas of globalisation and regionalisation, and the measurement of performance in multinational firms and alliances. In each case, propositions are put forward which can be analytically tested with the use of network analysis. The paper concludes with a brief outline of a research agenda which utilises this approach in International Business studies.
Abstract:
Supply Chain Risk Management (SCRM) has become a popular area of research and study in recent years, as highlighted by the number of peer-reviewed articles that have appeared in the academic literature. This, coupled with the realisation by companies that SCRM strategies are required to mitigate the risks they face, makes for challenging research questions in the field of risk management. The challenge that companies face today is not only to identify the types of risks they face, but also to assess the indicators of those risks, which would allow them to mitigate risk before any disruption to the supply chain occurs. The use of social network theory can aid in the identification of disruption risk. This thesis proposes the combination of social networks, behavioural risk indicators and information management to uniquely identify disruption risk. The propositions developed in this thesis from the literature review and an exploratory case study in an aerospace OEM are:
- By improving information flows, through the use of social networks, we can identify supply chain disruption risk.
- The management of information to identify supply chain disruption risk can be explored using push and pull concepts.
The propositions were further explored through four focus group sessions, two within the OEM and two within an academic setting. The literature review conducted by the researcher did not find any studies that have evaluated supply chain disruption risk management in terms of social network analysis or information management. The evaluation of SCRM using these methods is thought to be a unique way of understanding the issues in SCRM that practitioners face today in the aerospace industry.
Abstract:
Small Arms and Light Weapons (SALW) proliferation was taken up by Non-Governmental Organizations (NGOs) as the next important issue in international relations after the success of the International Campaign to Ban Landmines (ICBL). This dissertation focuses on the reasons why the issue of SALW resulted in an Action Program rather than an international convention, a result considered unsuccessful by the advocates of regulating the illicit trade in SALW. The study provides a social movement theoretical approach, using framing, political opportunity and network analysis to explain why the advocates of regulating the illicit trade in SALW did not succeed in their goals. The UN is taken as the arena in which NGOs, States and International Governmental Organizations (IGOs) discussed the illicit trade in SALW. The findings of the study indicate that the political opportunity for the issue of SALW was not ideal. The network of NGOs, States and IGOs was not strong. The NGOs advocating regulation of SALW were divided over the approach to the issue and were part of different coalitions with differing objectives. Despite initial widespread interest among States, only a couple of States were fully committed to the issue until the end. The regional IGOs approached the issue based on their regional priorities and were less interested in an international covenant. The advocates of regulating the illicit trade in SALW attempted to frame SALW as a humanitarian issue rather than as a security issue, but they were not able to use frame alignment to convince States to treat it as such. In conclusion, it can be said that all three elements (framing, political opportunity and the network) played a role in the lack of success of the advocates for regulating the illicit trade in SALW.
Abstract:
In recent years, a surprising new phenomenon has emerged in which globally distributed online communities collaborate to create useful and sophisticated computer software. These open source software groups are composed of generally unaffiliated individuals and organizations who work in a seemingly chaotic fashion and who participate on a voluntary basis without direct financial incentive. The purpose of this research is to investigate the relationship between the social network structure of these intriguing groups and their level of output and activity, where social network structure is defined as 1) closure or connectedness within the group, 2) bridging ties which extend outside of the group, and 3) leader centrality within the group. Based on well-tested theories of social capital and centrality in teams, propositions were formulated suggesting that the social network structures of successful open source software project communities will exhibit high levels of bridging and moderate levels of closure and leader centrality. The research setting was the SourceForge hosting organization, and a study population of 143 project communities was identified. Independent variables included measures of closure and leader centrality defined over conversational ties, along with measures of bridging defined over membership ties. Dependent variables included source code commits and software releases for community output, and software downloads and project site page views for community activity. A cross-sectional study design was used, and archival data were extracted and aggregated for the two-year period following the first release of project software. The resulting compiled variables were analyzed using multiple linear and quadratic regressions, controlling for group size and conversational volume. Contrary to theory-based expectations, the results showed that successful project groups exhibited low levels of closure and that the levels of bridging and leader centrality were not important factors of success. These findings suggest that the creation and use of open source software may represent a fundamentally new socio-technical development process which disrupts the team paradigm and triggers the need for building new theories of collaborative development. These new theories could point towards the broader application of open source methods for the creation of knowledge-based products other than software.
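The three structural constructs used as independent variables (closure within the group, bridging ties reaching outside it, and leader centrality) could be operationalised roughly as in the sketch below; the toy conversational graph, membership sets and project names are hypothetical, not SourceForge data or the study's exact measures.

```python
# Sketch: closure, bridging and leader centrality for one project community.
# The toy conversational graph and membership sets are illustrative only.
import networkx as nx

# Conversational ties among members of one hypothetical project
G = nx.Graph()
G.add_edges_from([("lead", "a"), ("lead", "b"), ("lead", "c"),
                  ("a", "b"), ("b", "c")])

# 1) Closure: how densely connected the group is internally
closure = nx.density(G)

# 2) Bridging: membership ties that reach outside this project
memberships = {"lead": {"proj1", "proj2"}, "a": {"proj1"},
               "b": {"proj1", "proj3"}, "c": {"proj1"}}
bridging = sum(len(m - {"proj1"}) for m in memberships.values())

# 3) Leader centrality within the conversational network
leader_centrality = nx.degree_centrality(G)["lead"]

print("closure:", closure, "bridging:", bridging,
      "leader centrality:", leader_centrality)
```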
Abstract:
The production of artistic prints in the sixteenth- and seventeenth-century Netherlands was an inherently social process. Turning out prints at any reasonable scale depended on fluid coordination between designers, plate cutters, and publishers, roles that, by the sixteenth century, were considered distinguished enough to merit distinct credits engraved on the plates themselves: invenit, fecit/sculpsit, and excudit. While any one designer, plate cutter, or publisher could potentially exercise a great deal of influence over the production of a single print, their individual decisions (Whom to select as an engraver? What subjects to create for a print design? What market to sell to?) would have been variously constrained or encouraged by their position in this larger network (Whom do they already know? And whom, in turn, do their contacts know?). This dissertation addresses the impact of these constraints and affordances through the novel application of computational social network analysis to major databases of surviving prints from this period. This approach is used to evaluate several questions about trends in early modern print production practices that have not been satisfactorily addressed by traditional literature based on case studies alone: Did the social capital demanded by print production result in centralized or distributed production of prints? When, and to what extent, did printmakers and publishers in the Low Countries favor international versus domestic collaborators? And were printmakers under the same pressure as painters to specialize in particular artistic genres? This dissertation ultimately suggests how simple professional incentives endemic to the practice of printmaking may, at large scales, have resulted in quite complex patterns of collaboration and production. The framework of network analysis surfaces the role of certain printmakers who tend to be neglected in aesthetically focused histories of art. This approach also highlights important issues concerning art historians' balancing of individual influence against the impact of longue durée trends. Finally, this dissertation raises questions about the current limitations and future possibilities of combining computational methods with cultural heritage datasets in the pursuit of historical research.
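One way to probe the centralized-versus-distributed production question computationally is to build a collaboration graph from the invenit/fecit/excudit credits on individual plates and measure how concentrated activity is around a few figures; the sketch below uses a tiny hypothetical credit table, not the actual print databases analysed in the dissertation.

```python
# Sketch: collaboration network from designer/engraver/publisher credits
# on individual prints, with a simple degree-centralization index.
# The credit records below are hypothetical.
from itertools import combinations
import networkx as nx

prints = [
    {"invenit": "Goltzius", "sculpsit": "Matham", "excudit": "Goltzius"},
    {"invenit": "Goltzius", "sculpsit": "Saenredam", "excudit": "Visscher"},
    {"invenit": "Bloemaert", "sculpsit": "Matham", "excudit": "Visscher"},
]

G = nx.Graph()
for record in prints:
    people = set(record.values())
    for a, b in combinations(people, 2):    # co-credit on one plate = one tie
        weight = G.get_edge_data(a, b, default={}).get("weight", 0)
        G.add_edge(a, b, weight=weight + 1)

centrality = nx.degree_centrality(G)
# Freeman degree centralization (normalized): 1 = pure hub, 0 = fully even
c_max = max(centrality.values())
centralization = sum(c_max - c for c in centrality.values()) / (len(G) - 2)
print(sorted(centrality.items(), key=lambda kv: -kv[1]))
print("degree centralization:", round(centralization, 3))
```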
Abstract:
Background: Statistical analysis of DNA microarray data provides a valuable diagnostic tool for the investigation of genetic components of diseases. To take advantage of the multitude of available data sets and analysis methods, it is desirable to combine both different algorithms and data from different studies. Applying ensemble learning, consensus clustering and cross-study normalization methods for this purpose in an almost fully automated process, and linking different analysis modules together under a single interface, would simplify many microarray analysis tasks. Results: We present ArrayMining.net, a web application for microarray analysis that provides easy access to a wide choice of feature selection, clustering, prediction, gene set analysis and cross-study normalization methods. In contrast to other microarray-related web tools, multiple algorithms and data sets for an analysis task can be combined using ensemble feature selection, ensemble prediction, consensus clustering and cross-platform data integration. By interlinking different analysis tools in a modular fashion, new exploratory routes become available, e.g. ensemble sample classification using features obtained from a gene set analysis and data from multiple studies. The analysis is further simplified by automatic parameter selection mechanisms and linkage to web tools and databases for functional annotation and literature mining. Conclusion: ArrayMining.net is a free web application for microarray analysis that combines a broad choice of algorithms based on ensemble and consensus methods, using automatic parameter selection and integration with annotation databases.
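The ensemble feature selection idea behind ArrayMining.net (running several selection algorithms and combining their votes) can be sketched as follows; the random expression matrix and the particular selectors chosen are illustrative assumptions, not the web application's actual implementation.

```python
# Sketch of ensemble feature selection: rank genes with several selectors
# and keep the features that all methods agree on. Random data only.
import numpy as np
from sklearn.feature_selection import f_classif, mutual_info_classif
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)
X = rng.normal(size=(60, 500))             # 60 samples x 500 genes (hypothetical)
y = rng.integers(0, 2, 60)                 # two diagnostic classes

scores = {
    "anova": f_classif(X, y)[0],
    "mutual_info": mutual_info_classif(X, y, random_state=0),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0)
                     .fit(X, y).feature_importances_,
}

k = 20
top_sets = [set(np.argsort(s)[-k:]) for s in scores.values()]
consensus = set.intersection(*top_sets)    # genes every selector ranked in its top 20
print("consensus features:", sorted(consensus))
```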