904 results for Network Analysis Methods


Relevance:

100.00%

Publisher:

Abstract:

Supply Chain Risk Management (SCRM) has become a popular area of research and study in recent years, as highlighted by the number of peer-reviewed articles that have appeared in the academic literature. This, coupled with the realisation by companies that SCRM strategies are required to mitigate the risks they face, makes for challenging research questions in the field of risk management. The challenge that companies face today is not only to identify the types of risks they are exposed to, but also to assess the indicators of those risks, so that risk can be mitigated before any disruption to the supply chain occurs. The use of social network theory can aid in the identification of disruption risk. This thesis proposes the combination of social networks, behavioural risk indicators and information management to uniquely identify disruption risk. The propositions developed in this thesis from the literature review and an exploratory case study in an aerospace OEM are: (1) by improving information flows, through the use of social networks, we can identify supply chain disruption risk; and (2) the management of information to identify supply chain disruption risk can be explored using push and pull concepts. The propositions were further explored through four focus group sessions, two within the OEM and two within an academic setting. The literature review conducted by the researcher did not find any studies that evaluated supply chain disruption risk management in terms of social network analysis or information management. The evaluation of SCRM using these methods is thought to be a unique way of understanding the issues in SCRM that practitioners face today in the aerospace industry.
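
The thesis stops at the level of propositions, but the style of social-network reasoning it invokes can be illustrated with a short sketch. The Python snippet below is a hypothetical illustration, not the author's method: the supplier names, the networkx library, and the use of betweenness centrality as a disruption-risk indicator are all assumptions made for demonstration.

```python
# Hypothetical sketch (not the thesis's method): score disruption risk from an
# information-sharing network using betweenness centrality, so that actors who
# broker the most information flow stand out as potential single points of failure.
import networkx as nx

# Made-up directed information-flow ties between an OEM and its suppliers.
edges = [
    ("tier2_A", "tier1_X"), ("tier2_B", "tier1_X"),
    ("tier2_C", "tier1_Y"), ("tier1_X", "OEM"), ("tier1_Y", "OEM"),
]
G = nx.DiGraph(edges)

# High betweenness means many information paths pass through this actor, so a
# failure there (or a loss of visibility of it) is a candidate risk indicator.
risk_scores = nx.betweenness_centrality(G)
for node, score in sorted(risk_scores.items(), key=lambda kv: -kv[1]):
    print(f"{node}: {score:.2f}")
```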

Relevance:

100.00%

Publisher:

Abstract:

Small Arms and Light Weapons (SALW) proliferation was taken up by Non-Governmental Organizations (NGOs) as the next important issue in international relations after the success of the International Campaign to Ban Landmines (ICBL). This dissertation focuses on the reasons why the issue of SALW resulted in an Action Program rather than an international convention, an outcome considered unsuccessful by the advocates of regulating the illicit trade in SALW. The study takes a social movement theoretical approach, using framing, political opportunity and network analysis to explain why the advocates of regulating the illicit trade in SALW did not succeed in their goals. The UN is taken as the arena in which NGOs, States and International Governmental Organizations (IGOs) discussed the illicit trade in SALW. The findings of the study indicate that the political opportunity for the issue of SALW was not ideal. The network of NGOs, States and IGOs was not strong. The NGOs advocating regulation of SALW were divided over the approach to the issue and were part of different coalitions with differing objectives. Despite initial widespread interest among States, only a couple of States were fully committed to the issue until the end. The regional IGOs approached the issue based on their regional priorities and were less interested in an international covenant. The advocates of regulating the illicit trade in SALW attempted to frame SALW as a humanitarian issue rather than as a security issue, but they were not able to use frame alignment to convince States to treat it as such. In conclusion, all three factors (framing, political opportunity and the network) played a role in the lack of success of the advocates for regulating the illicit trade in SALW.

Relevance:

100.00%

Publisher:

Abstract:

In recent years, a surprising new phenomenon has emerged in which globally distributed online communities collaborate to create useful and sophisticated computer software. These open source software groups are composed of generally unaffiliated individuals and organizations who work in a seemingly chaotic fashion and who participate on a voluntary basis without direct financial incentive. The purpose of this research is to investigate the relationship between the social network structure of these intriguing groups and their level of output and activity, where social network structure is defined as 1) closure or connectedness within the group, 2) bridging ties which extend outside of the group, and 3) leader centrality within the group. Based on well-tested theories of social capital and centrality in teams, propositions were formulated which suggest that social network structures associated with successful open source software project communities will exhibit high levels of bridging and moderate levels of closure and leader centrality. The research setting was the SourceForge hosting organization and a study population of 143 project communities was identified. Independent variables included measures of closure and leader centrality defined over conversational ties, along with measures of bridging defined over membership ties. Dependent variables included source code commits and software releases for community output, and software downloads and project site page views for community activity. A cross-sectional study design was used and archival data were extracted and aggregated for the two-year period following the first release of project software. The resulting compiled variables were analyzed using multiple linear and quadratic regressions, controlling for group size and conversational volume. Contrary to theory-based expectations, the surprising results showed that successful project groups exhibited low levels of closure and that the levels of bridging and leader centrality were not important factors of success. These findings suggest that the creation and use of open source software may represent a fundamentally new socio-technical development process which disrupts the team paradigm and which triggers the need for building new theories of collaborative development. These new theories could point towards the broader application of open source methods for the creation of knowledge-based products other than software.
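
The abstract names three structural measures (closure, bridging ties, leader centrality) without giving formulas. The sketch below shows one minimal way such measures could be computed for a single project community; the member names, the networkx-based calculations, and the simple count of outside memberships are assumptions for illustration, and the study's own operationalisations over conversational and membership ties may differ.

```python
# Illustrative sketch (not the study's code): the three structural measures
# named in the abstract, computed for one toy project community.
import networkx as nx

# Hypothetical conversational ties among members of one project ("lead" = project leader).
conv = nx.Graph([("a", "b"), ("b", "c"), ("a", "c"), ("c", "lead"), ("b", "lead")])

# 1) Closure: density of ties within the group.
closure = nx.density(conv)

# 2) Bridging: ties that reach outside the group, here crudely counted from a
#    hypothetical list of each member's memberships in other projects.
outside_memberships = {"a": 2, "b": 0, "c": 1, "lead": 3}
bridging = sum(outside_memberships.values())

# 3) Leader centrality: degree centrality of the project leader within the group.
leader_centrality = nx.degree_centrality(conv)["lead"]

print(closure, bridging, leader_centrality)
```

In the design described above, per-community measures of this kind would then be regressed, linearly and quadratically, against output and activity variables such as commits, releases, downloads and page views.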

Relevance:

100.00%

Publisher:

Abstract:

The production of artistic prints in the sixteenth- and seventeenth-century Netherlands was an inherently social process. Turning out prints at any reasonable scale depended on the fluid coordination between designers, plate cutters, and publishers, roles that, by the sixteenth century, were considered distinguished enough to merit distinct credits engraved on the plates themselves: invenit, fecit/sculpsit, and excudit. While any one designer, plate cutter, or publisher could potentially exercise a great deal of influence over the production of a single print, their individual decisions (Whom to select as an engraver? What subjects to create for a print design? What market to sell to?) would have been variously constrained or encouraged by their position in this larger network (Who do they already know? And who, in turn, do their contacts know?). This dissertation addresses the impact of these constraints and affordances through the novel application of computational social network analysis to major databases of surviving prints from this period. This approach is used to evaluate several questions about trends in early modern print production practices that have not been satisfactorily addressed by traditional literature based on case studies alone: Did the social capital demanded by print production result in centralized or distributed production of prints? When, and to what extent, did printmakers and publishers in the Low Countries favor international versus domestic collaborators? And were printmakers under the same pressure as painters to specialize in particular artistic genres? This dissertation ultimately suggests how simple professional incentives endemic to the practice of printmaking may, at large scales, have resulted in quite complex patterns of collaboration and production. The framework of network analysis surfaces the role of certain printmakers who tend to be neglected in aesthetically focused histories of art. This approach also highlights important issues concerning art historians’ balancing of individual influence versus the impact of longue durée trends. Finally, this dissertation also raises questions about the current limitations and future possibilities of combining computational methods with cultural heritage datasets in the pursuit of historical research.
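
As a rough illustration of the kind of analysis described (not the dissertation's actual pipeline), the sketch below builds a collaboration network from a handful of made-up print records, linking the designer, engraver and publisher credited on each plate, and computes Freeman degree centralization as one indicator of centralized versus distributed production. The records and the choice of centralization measure are assumptions.

```python
# Hypothetical sketch: collaboration network from per-print credits, plus a
# simple degree-centralization score (0 = evenly distributed, 1 = star-like).
import networkx as nx

# Toy records; real data would come from the print databases the study uses.
prints = [
    {"designer": "Maarten de Vos", "engraver": "Crispijn de Passe", "publisher": "de Passe"},
    {"designer": "Maarten de Vos", "engraver": "Jan Sadeler", "publisher": "Sadeler"},
    {"designer": "Hendrick Goltzius", "engraver": "Jacob Matham", "publisher": "Goltzius"},
]

G = nx.Graph()
for p in prints:
    people = set(p.values())
    for a in people:
        for b in people:
            if a < b:
                G.add_edge(a, b)  # everyone credited on the same print is linked

# Freeman degree centralization for an undirected graph.
n = G.number_of_nodes()
deg = dict(G.degree())
max_deg = max(deg.values())
centralization = sum(max_deg - d for d in deg.values()) / ((n - 1) * (n - 2))
print(f"degree centralization: {centralization:.2f}")
```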

Relevance:

100.00%

Publisher:

Abstract:

Background: Statistical analysis of DNA microarray data provides a valuable diagnostic tool for the investigation of genetic components of diseases. To take advantage of the multitude of available data sets and analysis methods, it is desirable to combine both different algorithms and data from different studies. Applying ensemble learning, consensus clustering and cross-study normalization methods for this purpose in an almost fully automated process and linking different analysis modules together under a single interface would simplify many microarray analysis tasks. Results: We present ArrayMining.net, a web application for microarray analysis that provides easy access to a wide choice of feature selection, clustering, prediction, gene set analysis and cross-study normalization methods. In contrast to other microarray-related web tools, multiple algorithms and data sets for an analysis task can be combined using ensemble feature selection, ensemble prediction, consensus clustering and cross-platform data integration. By interlinking different analysis tools in a modular fashion, new exploratory routes become available, e.g. ensemble sample classification using features obtained from a gene set analysis and data from multiple studies. The analysis is further simplified by automatic parameter selection mechanisms and linkage to web tools and databases for functional annotation and literature mining. Conclusion: ArrayMining.net is a free web application for microarray analysis combining a broad choice of algorithms based on ensemble and consensus methods, using automatic parameter selection and integration with annotation databases.
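
As one concrete example of the consensus methods mentioned, the sketch below shows a generic consensus-clustering loop in scikit-learn: repeated k-means runs are combined through a co-association matrix, which is then clustered once more. It is a simplified stand-in rather than ArrayMining.net's implementation, and the data are synthetic.

```python
# Generic consensus clustering sketch (not ArrayMining.net's code): combine
# repeated k-means partitions via a co-association matrix.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=60, n_features=20, centers=3, random_state=0)

n = X.shape[0]
coassoc = np.zeros((n, n))
runs = 25
for seed in range(runs):
    labels = KMeans(n_clusters=3, n_init=10, random_state=seed).fit_predict(X)
    coassoc += (labels[:, None] == labels[None, :])  # 1 where two samples co-cluster
coassoc /= runs

# Final consensus partition: cluster samples by their co-association profiles.
consensus = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(coassoc)
print(consensus)
```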

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this study is to analyze the existing literature on hospitality management by examining all the research papers published in The International Journal of Hospitality Management (IJHM) between 2008 and 2014. The authors apply bibliometric methods – in particular, author citation and co-citation analyses (ACA) – to identify the main research lines within this scientific field; in other words, its ‘intellectual structure’. Social network analysis (SNA) is also used to visualize this structure. The results of the analysis allow us to define the different research lines or fronts which shape the intellectual structure of research on hospitality management.
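
The abstract does not spell out the co-citation computation, but the core of author co-citation analysis can be sketched simply: count how often pairs of authors are cited together across papers' reference lists, treat those counts as weighted network ties, and visualise the result. The toy reference lists and the networkx/matplotlib rendering below are assumptions for illustration, not the paper's data or tooling.

```python
# Simplified ACA sketch: co-citation counts -> weighted graph -> drawing.
from itertools import combinations
from collections import Counter
import networkx as nx
import matplotlib.pyplot as plt

# Each entry: the set of authors cited by one journal article (toy data).
cited_authors_per_paper = [
    {"Baloglu", "Oh", "Kandampully"},
    {"Baloglu", "Oh", "Mattila"},
    {"Kandampully", "Mattila", "Oh"},
]

cocitations = Counter()
for cited in cited_authors_per_paper:
    for a, b in combinations(sorted(cited), 2):
        cocitations[(a, b)] += 1  # how often two authors are cited together

G = nx.Graph()
for (a, b), w in cocitations.items():
    G.add_edge(a, b, weight=w)

nx.draw_networkx(G, with_labels=True)
plt.show()
```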

Relevance:

90.00%

Publisher:

Abstract:

Different types of water bodies, including lakes, streams, and coastal marine waters, are often susceptible to fecal contamination from a range of point and nonpoint sources, and have been evaluated using fecal indicator microorganisms. The most commonly used fecal indicator is Escherichia coli, but traditional cultivation methods do not allow discrimination of the source of pollution. The use of triplex PCR offers a fast and inexpensive approach, and here it enabled the identification of phylogroups. The phylogenetic distribution of E. coli subgroups isolated from water samples revealed higher frequencies of subgroups A1 and B23 in rivers impacted by human pollution sources, while subgroups D1 and D2 were associated with pristine sites, and subgroup B1 with domesticated animal sources, suggesting their use as a first screening for pollution source identification. A simple classification is also proposed based on phylogenetic subgroup distribution using the w-clique metric, enabling differentiation of polluted and unpolluted sites.
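
The abstract does not restate the phylogrouping rules, but triplex PCR phylogroup assignment commonly follows Clermont-style decision rules over the chuA and yjaA genes and the TspE4.C2 fragment. The sketch below encodes one widely used mapping to the subgroups named in the abstract; it is an assumption that the study applied exactly these rules.

```python
# Hedged sketch of a common triplex-PCR phylogrouping decision tree; the
# study's exact subgroup definitions may differ from the mapping assumed here.
def phylo_subgroup(chuA: bool, yjaA: bool, tspE4: bool) -> str:
    if not chuA:
        if tspE4:
            return "B1"
        return "A1" if yjaA else "A0"
    # chuA positive
    if yjaA:
        return "B23" if tspE4 else "B22"
    return "D2" if tspE4 else "D1"

# Example isolates as (chuA, yjaA, TspE4.C2) presence/absence triples.
isolates = [(False, True, False), (True, True, True), (True, False, False)]
print([phylo_subgroup(*i) for i in isolates])  # ['A1', 'B23', 'D1']
```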

Relevance:

90.00%

Publisher:

Abstract:

The research reported here draws on a study of five teenagers from a Dinka-speaking community of Sudanese settling in Australia. A range of factors including language proficiency, social network structure and language attitudes are examined as possible causes for the variability of language use. The results and discussion illustrate how the use of a triangular research approach captured the complexity of the participants' language situation and was critical to developing a full understanding of the interplay of factors influencing the teens' language maintenance and shift in a way that no single method could. Further, it shows that the employment of different methodologies allowed for flexibility in data collection to ensure the fullest response from participants. Overall, this research suggests that for studies of non-standard communities, variability in research methods may prove more of a strength than the use of standardised instruments and approaches.

Relevance:

90.00%

Publisher:

Abstract:

Dual-energy X-ray absorptiometry (DXA) is a widely used method for measuring bone mineral in the growing skeleton. Because scan analysis in children offers a number of challenges, we compared DXA results using six analysis methods at the total proximal femur (PF) and five methods at the femoral neck (FN). In total we assessed 50 scans (25 boys, 25 girls) from two separate studies for cross-sectional differences in bone area, bone mineral content (BMC), and areal bone mineral density (aBMD) and for percentage change over the short term (8 months) and long term (7 years). At the proximal femur for the short-term longitudinal analysis, there was an approximate 3.5% greater change in bone area and BMC when the global region of interest (ROI) was allowed to increase in size between years as compared with when the global ROI was held constant. Trend analysis showed a significant (p < 0.05) difference between scan analysis methods for bone area and BMC across 7 years. At the femoral neck, cross-sectional analysis using a narrower (than default) ROI, without change in location, resulted in a 12.9% and 12.6% smaller bone area and BMC, respectively (both p < 0.001). Changes in FN area and BMC over 8 months were significantly greater (2.3%, p < 0.05) using the narrower FN ROI rather than the default ROI. Similarly, the 7-year longitudinal data revealed that differences between scan analysis methods were greatest when the narrower FN ROI was maintained across all years (p < 0.001). For aBMD there were no significant differences in group means between analysis methods at either the PF or FN. Our findings show the need to standardize the analysis of proximal femur DXA scans in growing children.

Relevance:

90.00%

Publisher:

Abstract:

Functional magnetic resonance imaging (FMRI) analysis methods can be quite generally divided into hypothesis-driven and data-driven approaches. The former are utilised in the majority of FMRI studies, where a specific haemodynamic response is modelled utilising knowledge of event timing during the scan, and is tested against the data using a t test or a correlation analysis. These approaches often lack the flexibility to account for variability in haemodynamic response across subjects and brain regions, which is of specific interest in high-temporal-resolution event-related studies. Current data-driven approaches attempt to identify components of interest in the data, but do not utilise any physiological information for the discrimination of these components. Here we present a hypothesis-driven approach that is an extension of Friman's maximum correlation modelling method (NeuroImage 16, 454-464, 2002), specifically focused on discriminating the temporal characteristics of event-related haemodynamic activity. Test analyses, on both simulated and real event-related FMRI data, will be presented.
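
For readers unfamiliar with the baseline the abstract describes, the sketch below shows the simple hypothesis-driven approach: build an expected haemodynamic response from known event timings and correlate it with each voxel's time series. This is not Friman's maximum correlation modelling or the authors' extension of it; the HRF shape, timings and data are crude, made-up stand-ins.

```python
# Minimal hypothesis-driven FMRI sketch: model regressor from event timings,
# correlated voxel-wise against (simulated) data.
import numpy as np

TR, n_scans = 2.0, 100

# Crude haemodynamic response function (difference of gamma-like curves).
def hrf(tt):
    return (tt ** 5) * np.exp(-tt) / 120.0 - 0.1 * (tt ** 8) * np.exp(-tt) / 40320.0

# Event onsets (seconds) convolved with the HRF to form the model regressor.
onsets = [10, 40, 70, 130, 160]
stimulus = np.zeros(n_scans)
for o in onsets:
    stimulus[int(o / TR)] = 1.0
model = np.convolve(stimulus, hrf(np.arange(0, 30, TR)))[:n_scans]

# Correlate the model with each (simulated) voxel time series.
rng = np.random.default_rng(0)
voxels = rng.standard_normal((500, n_scans)) + 0.5 * model  # toy data
r = np.array([np.corrcoef(model, v)[0, 1] for v in voxels])
print("voxels exceeding r > 0.3:", int((r > 0.3).sum()))
```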

Relevance:

90.00%

Publisher:

Abstract:

OBJECTIVE. Coronary MDCT angiography has been shown to be an accurate noninvasive tool for the diagnosis of obstructive coronary artery disease (CAD). Its sensitivity and negative predictive value for diagnosing the degree of stenosis are unsurpassed among noninvasive testing methods. However, in its current form, it provides no information regarding the physiologic impact of CAD and is a poor predictor of myocardial ischemia. CORE320 is a multicenter, multinational diagnostic study whose primary objective is to evaluate the diagnostic accuracy of 320-MDCT for detecting coronary artery luminal stenosis and corresponding myocardial perfusion deficits in patients with suspected CAD, compared with the reference standard of conventional coronary angiography and SPECT myocardial perfusion imaging. CONCLUSION. We aim to describe the CT acquisition, reconstruction, and analysis methods of the CORE320 study.
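
The accuracy language used here (sensitivity, negative predictive value) reduces to simple ratios over a 2x2 comparison against the reference standard. The counts in the sketch below are invented purely to show the arithmetic and have nothing to do with CORE320's design or results.

```python
# Per-patient sensitivity and NPV from a hypothetical 2x2 table against the
# reference standard (true/false positives and negatives are made-up numbers).
def sensitivity_npv(tp: int, fp: int, fn: int, tn: int):
    sensitivity = tp / (tp + fn)   # fraction of diseased patients detected
    npv = tn / (tn + fn)           # fraction of negative tests that are truly negative
    return sensitivity, npv

sens, npv = sensitivity_npv(tp=90, fp=15, fn=10, tn=85)
print(f"sensitivity={sens:.2f}, NPV={npv:.2f}")
```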

Relevance:

90.00%

Publisher:

Abstract:

Performance indicators in the public sector have often been criticised for being inadequate and not conducive to analysing efficiency. The main objective of this study is to use data envelopment analysis (DEA) to examine the relative efficiency of Australian universities. Three performance models are developed, namely, overall performance, performance on delivery of educational services, and performance on fee-paying enrolments. The findings based on 1995 data show that the university sector was performing well on technical and scale efficiency but there was room for improving performance on fee-paying enrolments. There were also small slacks in input utilisation. More universities were operating at decreasing returns to scale, indicating a potential to downsize. DEA helps in identifying the reference sets for inefficient institutions and objectively determines productivity improvements. As such, it can be a valuable benchmarking tool for educational administrators and assist in more efficient allocation of scarce resources. In the absence of market mechanisms to price educational outputs, which renders traditional production or cost functions inappropriate, universities are particularly obliged to seek alternative efficiency analysis methods such as DEA.
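
As a hedged sketch of what a DEA efficiency score involves (not the study's model or its 1995 data), the snippet below solves the input-oriented CCR envelopment problem for each decision-making unit as a linear programme with SciPy. The universities, inputs and outputs are toy numbers chosen only to make the programme run.

```python
# Input-oriented CCR DEA sketch: min theta s.t. lambda-weighted inputs <= theta * own
# inputs and lambda-weighted outputs >= own outputs, lambda >= 0.
import numpy as np
from scipy.optimize import linprog

# rows = DMUs (universities); toy inputs (e.g. staff, expenditure) and outputs
# (e.g. student load, research output).
X = np.array([[100.0, 50.0], [120.0, 60.0], [80.0, 70.0]])
Y = np.array([[1000.0, 200.0], [900.0, 260.0], [850.0, 300.0]])

def ccr_efficiency(k: int) -> float:
    n, m = X.shape            # n DMUs, m inputs
    s = Y.shape[1]            # outputs
    c = np.zeros(n + 1)       # decision variables: [theta, lambda_1 .. lambda_n]
    c[0] = 1.0
    A_ub, b_ub = [], []
    for i in range(m):        # sum_j lambda_j * x_ji - theta * x_ki <= 0
        A_ub.append(np.concatenate(([-X[k, i]], X[:, i])))
        b_ub.append(0.0)
    for r in range(s):        # -sum_j lambda_j * y_jr <= -y_kr  (i.e. outputs at least matched)
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[k, r])
    bounds = [(0, None)] * (n + 1)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub, bounds=bounds, method="highs")
    return res.fun            # theta* = 1.0 means technically efficient

print([round(ccr_efficiency(k), 3) for k in range(len(X))])
```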

Relevance:

90.00%

Publisher:

Abstract:

The development of the new TOGA (titration and off-gas analysis) sensor for the detailed study of biological processes in wastewater treatment systems is outlined. The main innovation of the sensor is the amalgamation of titrimetric and off-gas measurement techniques. The resulting measured signals are: hydrogen ion production rate (HPR), oxygen transfer rate (OTR), nitrogen transfer rate (NTR), and carbon dioxide transfer rate (CTR). While OTR and NTR are applicable to aerobic and anoxic conditions, respectively, HPR and CTR are useful signals under all of the conditions found in biological wastewater treatment systems, namely, aerobic, anoxic and anaerobic. The sensor is therefore a powerful tool for studying the key biological processes under all these conditions. A major benefit from the integration of the titrimetric and off-gas analysis methods is that the acid/base buffering systems, in particular the bicarbonate system, are properly accounted for. Experimental data resulting from the TOGA sensor in aerobic, anoxic, and anaerobic conditions demonstrates the strength of the new sensor. In the aerobic environment, carbon oxidation (using acetate as an example carbon source) and nitrification are studied. Both the carbon and ammonia removal rates measured by the sensor compare very well with those obtained from off-line chemical analysis. Further, the aerobic acetate removal process is examined at a fundamental level using the metabolic pathway and stoichiometry established in the literature, whereby the rate of formation of storage products is identified. Under anoxic conditions, the denitrification process is monitored and, again, the measured rate of nitrogen gas transfer (NTR) matches well with the removal of the oxidised nitrogen compounds (measured chemically). In the anaerobic environment, the enhanced biological phosphorus process was investigated. In this case, the measured sensor signals (HPR and CTR) resulting from acetate uptake were used to determine the ratio of the rates of carbon dioxide production by competing groups of microorganisms, which consequently is a measure of the activity of these organisms. The sensor involves the use of expensive equipment such as a mass spectrometer and requires special gases to operate, thus incurring significant capital and operational costs. This makes the sensor more an advanced laboratory tool than an on-line sensor. (C) 2003 Wiley Periodicals, Inc.
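
The abstract does not give the rate calculations, but off-gas analysis generally derives transfer rates from a gas-phase mass balance between the inlet and outlet streams. The sketch below shows that generic balance for oxygen; the function, units and numbers are assumptions for illustration, not the TOGA sensor's implementation.

```python
# Generic off-gas mass-balance sketch (not the TOGA sensor's code): the oxygen
# transfer rate follows from the difference between O2 carried into and out of
# the reactor headspace, normalised by liquid volume.
def oxygen_transfer_rate(q_gas_in, y_o2_in, q_gas_out, y_o2_out, v_liquid,
                         molar_volume=24.0):
    """OTR in mol O2 / (L h).

    q_gas_*      : gas flow rates in L/h
    y_o2_*       : O2 mole fractions in the inlet and outlet gas
    v_liquid     : reactor liquid volume in L
    molar_volume : L of gas per mol at working temperature (assumed ~24 L/mol)
    """
    o2_in = q_gas_in * y_o2_in / molar_volume
    o2_out = q_gas_out * y_o2_out / molar_volume
    return (o2_in - o2_out) / v_liquid

print(oxygen_transfer_rate(q_gas_in=60, y_o2_in=0.209,
                           q_gas_out=59, y_o2_out=0.185, v_liquid=4.0))
```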

Relevance:

90.00%

Publisher:

Abstract:

Density functional theory for adsorption in carbons is adapted here to incorporate a random distribution of pore wall thickness in the solid, and it is shown that the mean pore wall thickness is intimately related to the pore size distribution characteristics. For typical carbons the pore walls are estimated to comprise only about two graphene layers, and application of the modified density functional theory approach shows that the commonly used assumption of infinitely thick walls can severely affect the results for adsorption in small pores under both supercritical and subcritical conditions. Under supercritical conditions the Henry's law coefficient is overpredicted by as much as a factor of 2, while under subcritical conditions pore wall heterogeneity appears to modify transitions in small pores into a sequence of smaller ones corresponding to pores with different wall thicknesses. The results suggest the need to improve current pore size distribution analysis methods to allow for pore wall heterogeneity. The density functional theory is further extended here to allow for interpore adsorbate interactions, and it appears that these interactions are negligible for small molecules such as nitrogen but significant for more strongly interacting heavier molecules such as butane, for which the traditional independent pore model may not be adequate.
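
For context, pore size distribution analysis of the kind discussed rests on the generalised adsorption isotherm: the measured isotherm is a weighted integral of single-pore (kernel) isotherms over the pore-width distribution. The sketch below discretises that relation with a made-up kernel function; it does not reproduce the paper's modified density functional theory or its pore-wall-thickness averaging.

```python
# Discretised generalised adsorption isotherm: measured isotherm ~ kernel @ f,
# where f is the pore-width distribution. The kernel here is a placeholder,
# not a DFT calculation.
import numpy as np

pore_widths = np.linspace(0.5, 3.0, 26)     # nm, candidate pore widths
pressures = np.linspace(0.01, 1.0, 50)      # relative pressure p/p0

def single_pore_isotherm(p, width):
    """Placeholder single-pore isotherm: smaller pores fill at lower pressure."""
    return 1.0 / (1.0 + np.exp(-(p - 0.2 * width) / 0.05))

# Kernel matrix: rows = pressures, columns = pore widths.
kernel = np.array([[single_pore_isotherm(p, w) for w in pore_widths]
                   for p in pressures])

# Assume some pore-width distribution f(H); the predicted isotherm is kernel @ f.
f = np.exp(-((pore_widths - 1.2) ** 2) / 0.1)
f /= f.sum()
predicted_isotherm = kernel @ f
print(predicted_isotherm[:5])
```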