869 results for NETWORK DESIGN PROBLEMS


Relevance:

40.00%

Publisher:

Abstract:

Motivated by the need to understand the underlying forces that trigger network evolution, we develop a multilevel, empirically testable theoretical model to examine the relationship between changes in the external environment and network change. We define network change as the dissolution or replacement of an interorganizational tie, as well as the formation of new ties with new or preexisting partners. Previous research has paid scant attention to the organizational consequences of quantum change enveloping entire industries, favoring instead an emphasis on continuous change. To highlight radical change we introduce the concept of the environmental jolt. The September 11 terrorist attacks provide us with a natural experiment to test our hypotheses on the antecedents and consequences of network change. Since network change can be explained at multiple levels, we incorporate firm-level variables as moderators. The empirical setting is the global airline industry, which can be regarded as a constantly changing network of alliances. The study reveals that firms react to environmental jolts by forming homophilous ties and transitive triads, in contrast to non-jolt periods. Moreover, we find that, all else being equal, firms that adopt a brokerage posture have positive returns. However, in the face of an environmental jolt, brokerage relates negatively to firm performance, and this negative relationship is more pronounced for larger firms. Our findings suggest that jolts are an important predictor of network change, that they significantly affect operational returns, and that they should thus be incorporated in studies of network dynamics.
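
The network constructs mentioned above (brokerage postures, transitive triads) can be computed with standard tools. Below is a minimal sketch using the networkx library on a toy alliance graph; the firm names and ties are invented for illustration, and betweenness centrality is used only as one common proxy for brokerage, not as the study's actual measure.

```python
import networkx as nx

# Toy alliance network; nodes and ties are hypothetical, for illustration only.
G = nx.Graph()
G.add_edges_from([
    ("AirA", "AirB"), ("AirB", "AirC"), ("AirA", "AirC"),  # a closed (transitive) triad
    ("AirA", "AirD"), ("AirD", "AirE"),                    # AirD brokers between AirA and AirE
])

# Betweenness centrality: firms that sit on many shortest paths between
# otherwise unconnected partners score high (a common brokerage proxy).
brokerage = nx.betweenness_centrality(G)

# Transitivity: the tendency of ties to close into triads.
clustering = nx.transitivity(G)

print(brokerage)
print(clustering)
```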

Relevance:

40.00%

Publisher:

Abstract:

When designing metaheuristic optimization methods, there is a trade-off between application range and effectiveness. For large real-world instances of combinatorial optimization problems, out-of-the-box metaheuristics often fail, and optimization methods need to be adapted to the problem at hand. Knowledge about the structure of high-quality solutions can be exploited by introducing a so-called bias into one of the components of the metaheuristic used. These problem-specific adaptations make it possible to increase search performance. This thesis analyzes the characteristics of high-quality solutions for three constrained spanning tree problems: the optimal communication spanning tree problem, the quadratic minimum spanning tree problem, and the bounded diameter minimum spanning tree problem. Several relevant tree properties that should be examined when analyzing a constrained spanning tree problem are identified. Based on the insights gained into the structure of high-quality solutions, efficient and robust solution approaches are designed for each of the three problems. Experimental studies compare the performance of the developed approaches with the current state of the art.
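
As a point of reference for the constrained variants listed above, the sketch below (not the thesis's actual approach) builds an unconstrained minimum spanning tree with Kruskal's algorithm via networkx and then measures one of the tree properties of interest, the diameter, which the bounded diameter variant constrains; the graph and weights are invented.

```python
import networkx as nx

# Small weighted graph; edge weights are invented for illustration.
G = nx.Graph()
G.add_weighted_edges_from([
    (0, 1, 2.0), (1, 2, 1.5), (0, 2, 3.0),
    (2, 3, 1.0), (3, 4, 2.5), (1, 4, 4.0),
])

# Unconstrained minimum spanning tree as a baseline (Kruskal's algorithm).
mst = nx.minimum_spanning_tree(G, algorithm="kruskal")

# One structural property of interest: the diameter of the tree
# (the quantity bounded in the bounded diameter MST problem).
diameter = nx.diameter(mst)
weight = mst.size(weight="weight")

print(f"MST weight: {weight}, diameter: {diameter}")
```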

Relevance:

40.00%

Publisher:

Abstract:

The Simulation Automation Framework for Experiments (SAFE) is a project created to raise the level of abstraction in network simulation tools and thereby address issues that undermine credibility. SAFE incorporates best practices in network simulation to automate the experimental process and to guide users in the development of sound scientific studies using the popular ns-3 network simulator. My contributions to the SAFE project are the design of two XML-based languages called NEDL (ns-3 Experiment Description Language) and NSTL (ns-3 Script Templating Language), which facilitate the description of experiments and of network simulation models, respectively. The languages provide a foundation for the construction of better interfaces between the user and the ns-3 simulator. They also provide input to a mechanism that automates the execution of network simulation experiments. Additionally, this thesis demonstrates that one can develop tools to generate ns-3 scripts in Python or C++ automatically from NSTL model descriptions.
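
The abstract does not reproduce the NSTL schema, so the following is only a hypothetical sketch of the underlying idea: generating an ns-3 Python script from a declarative model description via simple string templating. The model fields, the output file name generated_experiment.py, and the emitted script fragment are illustrative and are not the actual NSTL language or SAFE tooling.

```python
from string import Template

# Hypothetical, minimal model description; the real NSTL is an XML language
# whose schema is not reproduced here.
model = {
    "n_nodes": 4,
    "data_rate": "5Mbps",
    "delay": "2ms",
}

# Template for a simplified ns-3 Python script fragment; a real generator
# would also configure applications, tracing, and so on.
script_template = Template("""\
import ns.core
import ns.network
import ns.point_to_point

nodes = ns.network.NodeContainer()
nodes.Create($n_nodes)

p2p = ns.point_to_point.PointToPointHelper()
p2p.SetDeviceAttribute("DataRate", ns.core.StringValue("$data_rate"))
p2p.SetChannelAttribute("Delay", ns.core.StringValue("$delay"))
""")

with open("generated_experiment.py", "w") as f:
    f.write(script_template.substitute(model))
```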

Relevance:

40.00%

Publisher:

Abstract:

Information management is a key aspect of successful construction projects. Inaccurate measurements and conflicting data can lead to costly mistakes, and vague quantities can ruin estimates and schedules. Building information modeling (BIM) augments a 3D model with a wide variety of information, which reduces many sources of error and can detect conflicts before they occur. Because new technology is often more complex, it can be difficult to integrate it effectively with existing business practices. In this paper, we answer two questions: how can BIM add value to construction projects, and what lessons can be learned from other companies that use BIM or similar technology? Previous research focused on the technology as if it were simply a tool, observing problems that occurred while integrating new technology into existing practices. Our research instead looks at the flow of information through a company and its network, seeing all the actors as part of an ecosystem. Building upon this idea, we propose the metaphor of an information supply chain to illustrate how BIM can add value to a construction project. The paper concludes with two case studies. The first illustrates a failure in the flow of information that could have been prevented by using BIM. The second profiles a leading design firm that has used BIM products for many years and shows the real benefits of using this technology.

Relevance:

40.00%

Publisher:

Abstract:

Self-stabilization is a property of a distributed system whereby, regardless of the legitimacy of its current state, the system eventually reaches a legitimate state and remains legitimate thereafter. The elegance of self-stabilization stems from the fact that it distinguishes distributed systems by a strong fault tolerance property against arbitrary state perturbations. The difficulty of designing and reasoning about self-stabilization has been noted by many researchers; most existing techniques for the verification and design of self-stabilization are either brute force or rely on manual approaches that are not amenable to automation. In this dissertation, we first investigate the possibility of automatically designing self-stabilization through global state space exploration. In particular, we develop a set of heuristics for automating the addition of recovery actions to distributed protocols on various network topologies. Our heuristics exploit both the computational power of a single workstation and the parallelism available on computer clusters. We obtain existing and new stabilizing solutions for classical protocols such as maximal matching, ring coloring, mutual exclusion, leader election, and agreement. Second, we consider a foundation for local reasoning about self-stabilization; that is, we study the global behavior of the distributed system by exploring the state space of just one of its components. It turns out that local reasoning about deadlocks and livelocks is possible for an interesting class of protocols whose proof of stabilization is otherwise complex. In particular, we provide necessary and sufficient conditions, verifiable in the local state space of every process, for global deadlock- and livelock-freedom of protocols on ring topologies. Local reasoning potentially circumvents two fundamental problems that complicate the automated design and verification of distributed protocols: (1) state explosion and (2) partial state information. Moreover, local proofs of convergence are independent of the number of processes in the network, so our assertions about deadlocks and livelocks apply to rings of arbitrary size without incurring state explosion.
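
For readers unfamiliar with self-stabilization, the sketch below simulates a classic textbook example on a ring, Dijkstra's K-state token circulation under a central daemon; it is not the dissertation's heuristic or proof technique. Starting from an arbitrary state, the ring converges to a configuration with exactly one privileged process, and closure keeps it there.

```python
import random

def privileged(x, i):
    """True if process i holds a privilege (its move is enabled)."""
    if i == 0:
        return x[0] == x[-1]      # bottom machine
    return x[i] != x[i - 1]       # all other machines

def move(x, i, K):
    """Execute the move of (privileged) process i."""
    if i == 0:
        x[0] = (x[0] + 1) % K
    else:
        x[i] = x[i - 1]

def simulate(n=5, K=6, max_steps=500, seed=1):
    rng = random.Random(seed)
    x = [rng.randrange(K) for _ in range(n)]           # arbitrary initial state
    for t in range(max_steps):
        if sum(privileged(x, i) for i in range(n)) == 1:
            return t                                   # legitimate: exactly one token
        enabled = [i for i in range(n) if privileged(x, i)]
        move(x, rng.choice(enabled), K)                # central daemon picks one move
    return None

print("reached a legitimate configuration after", simulate(), "moves")
```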

Relevance:

40.00%

Publisher:

Abstract:

This research initiative was triggered by the water management problems of the polymer electrolyte membrane fuel cell (PEMFC). In low temperature fuel cells such as the PEMFC, some of the water produced by the chemical reaction remains in its liquid state. Excess water produced by the fuel cell must be removed from the system to avoid flooding of the gas diffusion layers (GDL). The GDL is responsible for transporting reactant gas to the active sites and removing the water produced at those sites. If the GDL is flooded, the supply gas cannot reach the reactive sites and the fuel cell fails. The water removal method chosen in this research is to exert a variable asymmetrical force on a liquid droplet. As the droplet is subjected to an external vibrational force in the form of a periodic wave, it begins to oscillate. A fluidic oscillator is capable of producing a pulsating flow using a simple balance of momentum fluxes between three impinging jets. By connecting the outputs of the oscillator to the gas channels of a fuel cell, a flow pulsation can be imposed on a water droplet formed within the gas channel during fuel cell operation. The lowest frequency produced by this design was approximately 202 Hz, obtained with a 20 inch feedback port and a supply pressure of 5 psig. This was determined by setting up a fluidic network with appropriate data acquisition. The components include a fluidic amplifier, valves and fittings, flow meters, a pressure gage, an NI-DAQ system, Siglab®, Matlab software, and four PCB microphones. The operating environment of the water droplet was reviewed, the speed of the pressure wave traveling down the square channel was precisely estimated, and measurement devices were carefully selected. Alternative measurement devices and their application to pressure wave measurement were also considered. Methods for the experimental setup and possible approaches are recommended, with some discussion of potential problems in implementing this technique. Some computational fluid dynamics analysis was also performed as part of the oscillator design.
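
As a back-of-the-envelope check on the numbers quoted above, the sketch below assumes (this assumption is not stated in the abstract) that the oscillation period of a feedback-type fluidic oscillator is bounded below by the acoustic round-trip time in the feedback port; the gap between this crude bound and the reported 202 Hz would then be attributable to switching and other delays in the amplifier.

```python
# Rough acoustic-delay estimate for a feedback-type fluidic oscillator.
# Assumption (not from the abstract): the period is at least the time a
# pressure wave needs to travel the feedback port and back.
SPEED_OF_SOUND = 343.0          # m/s in air near room temperature
PORT_LENGTH_IN = 20.0           # feedback port length from the abstract, inches
port_length_m = PORT_LENGTH_IN * 0.0254

round_trip_time = 2.0 * port_length_m / SPEED_OF_SOUND   # seconds
upper_bound_freq = 1.0 / round_trip_time                  # Hz

print(f"Acoustic round-trip time: {round_trip_time * 1e3:.2f} ms")
print(f"Upper-bound frequency:    {upper_bound_freq:.0f} Hz")
# The reported lowest frequency (~202 Hz) lies below this bound, consistent
# with additional switching delays in the oscillator.
```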

Relevance:

40.00%

Publisher:

Abstract:

The objective of this report is to study the distributed (decentralized) three-phase optimal power flow (OPF) problem in unbalanced power distribution networks. A full three-phase representation of the distribution network is used to account for its highly unbalanced state. All of the network's series/shunt components and load types/combinations were modeled in a commercial version of the General Algebraic Modeling System (GAMS), a high-level modeling system for mathematical programming and optimization. The OPF problem was successfully implemented and solved with both a centralized and a distributed approach, where the objective is to minimize the active power losses in the entire system. The study was carried out on the IEEE-37 Node Test Feeder. A detailed discussion of all aspects of the problem, starting from the basics, is provided, and full simulation results are given at the end of the report.
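
To give a flavor of the loss-minimization objective, here is a deliberately tiny single-phase, two-bus sketch solved with SciPy rather than GAMS; the resistance, limits, and demand are invented, and the real study solves a full three-phase model of the IEEE-37 node feeder.

```python
from scipy.optimize import minimize

# Toy two-bus, single-phase loss minimization (per-unit values, all invented).
r = 0.05        # line resistance between bus 1 and bus 2, pu
demand = 1.0    # load at bus 2, pu
p2_max = 0.6    # limit of the local generator at bus 2, pu

def losses(p):
    p1, _ = p
    return r * p1**2                       # approximate I^2 R line losses (V ~ 1 pu)

def power_balance(p):
    p1, p2 = p
    return p1 - r * p1**2 + p2 - demand    # power delivered must meet the demand

res = minimize(losses, x0=[0.5, 0.5],
               bounds=[(0.0, 1.5), (0.0, p2_max)],
               constraints=[{"type": "eq", "fun": power_balance}],
               method="SLSQP")

p1, p2 = res.x
print(f"P1 = {p1:.4f} pu, P2 = {p2:.4f} pu, losses = {losses(res.x):.5f} pu")
```

Because importing remote power over the resistive line is the only source of losses in this toy model, the optimizer dispatches the local generator to its limit and covers the remainder, plus the associated losses, from bus 1.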

Relevance:

40.00%

Publisher:

Abstract:

In this paper, a computer-aided diagnostic (CAD) system for the classification of hepatic lesions from computed tomography (CT) images is presented. Regions of interest (ROIs) taken from nonenhanced CT images of normal liver, hepatic cysts, hemangiomas, and hepatocellular carcinomas have been used as input to the system. The proposed system consists of two modules: the feature extraction and the classification modules. The feature extraction module calculates the average gray level and 48 texture characteristics, which are derived from the spatial gray-level co-occurrence matrices obtained from the ROIs. The classifier module consists of three sequentially placed feed-forward neural networks (NNs). The first NN classifies each ROI as a normal or pathological liver region. The pathological liver regions are characterized by the second NN as cyst or "other disease." The third NN classifies "other disease" into hemangioma or hepatocellular carcinoma. Three feature selection techniques have been applied to each individual NN: sequential forward selection, sequential floating forward selection, and a genetic algorithm for feature selection. A comparative study of these dimensionality reduction methods shows that genetic algorithms result in lower-dimensional feature vectors and improved classification performance.
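
A pipeline of this kind can be prototyped with standard open-source tools. The sketch below uses scikit-image's gray-level co-occurrence matrix functions and a scikit-learn feed-forward network as a stand-in for the first stage of the cascade; the ROIs and labels are random synthetic data, so the output is meaningless except as a demonstration of the plumbing, and the feature set is much smaller than the 48 texture characteristics used in the paper.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.neural_network import MLPClassifier

def texture_features(roi, levels=64):
    """Average gray level plus GLCM texture features for one 2D ROI."""
    roi = (roi.astype(float) / roi.max() * (levels - 1)).astype(np.uint8)
    glcm = graycomatrix(roi, distances=[1, 2], angles=[0, np.pi / 2],
                        levels=levels, symmetric=True, normed=True)
    feats = [roi.mean()]                               # average gray level
    for prop in ("contrast", "homogeneity", "energy", "correlation"):
        feats.extend(graycoprops(glcm, prop).ravel())  # one value per distance/angle
    return np.array(feats)

# Synthetic stand-in data: random "ROIs" with random labels, for illustration only.
rng = np.random.default_rng(0)
rois = rng.integers(0, 256, size=(40, 32, 32))
labels = rng.integers(0, 2, size=40)                   # 0 = normal, 1 = pathological

X = np.array([texture_features(r) for r in rois])
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, labels)                                     # first stage of the cascade
print("training accuracy:", clf.score(X, labels))
```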

Relevance:

40.00%

Publisher:

Abstract:

This programmatic paper investigates the possibilities, opportunities, and risks of analyzing personal and professional online communication from the point of view of interactional sociolinguistics combined with modern social network analysis (SNA). It thus has two complementary goals: one is the exploration of adequate, innovative concepts and methods for analyzing online communication; the other is to use online communication and its ontological and functional specificities to enrich the conceptual and methodological background of SNA. The paper is organized in two parts. It begins with an introduction to recent developments in sociolinguistic social network analysis, where three interesting new concepts and tools are discussed: latent versus emergent networks (Watts 1991), coalitions (Fitzmaurice 2000a, Fitzmaurice 2000b), and communities of practice (Wenger 1998).

Relevance:

40.00%

Publisher:

Abstract:

Health care providers face the problem of trying to make decisions with inadequate information and also with an overload of (often contradictory) information. Physicians often choose treatment long before they know which disease is present. Indeed, uncertainty is intrinsic to the practice of medicine. Decision analysis can help physicians structure and work through a medical decision problem, and can provide reassurance that decisions are rational and consistent with the beliefs and preferences of other physicians and patients. The primary purpose of this research project is to develop the theory, methods, techniques and tools necessary for designing and implementing a system to support solving medical decision problems. A case study involving “abdominal pain” serves as a prototype for implementing the system. The research, however, focuses on a generic class of problems and aims at covering theoretical as well as practical aspects of the system developed. The main contributions of this research are: (1) bridging the gap between the statistical approach and the knowledge-based (expert) approach to medical decision making; (2) linking a collection of methods, techniques and tools together to allow for the design of a medical decision support system, based on a framework that involves the Analytic Network Process (ANP), the generalization of the Analytic Hierarchy Process (AHP) to dependence and feedback, for problems involving diagnosis and treatment; (3) enhancing the representation and manipulation of uncertainty in the ANP framework by incorporating group consensus weights; and (4) developing a computer program to assist in the implementation of the system.
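
As background on the AHP, which the ANP generalizes, the sketch below derives priority weights from a single pairwise comparison matrix via its principal eigenvector and computes the consistency index. The comparison values are invented, and a full ANP additionally involves a supermatrix capturing dependence and feedback, which is not shown here.

```python
import numpy as np

# Minimal AHP-style priority calculation. The pairwise comparison values
# below are invented for illustration: A[i, j] encodes how strongly
# criterion i is preferred over criterion j on Saaty's 1-9 scale.
A = np.array([
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
])

# Priorities are the principal right eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency index: CI = (lambda_max - n) / (n - 1).
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)

print("priorities:", np.round(w, 3))
print("consistency index:", round(ci, 4))
```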

Relevance:

40.00%

Publisher:

Abstract:

Objective To determine the comparative effectiveness and safety of current maintenance strategies in preventing exacerbations of asthma. Design Systematic review and network meta-analysis using Bayesian statistics. Data sources Cochrane systematic reviews on chronic asthma, complemented by an updated search when appropriate. Eligibility criteria Trials of adults with asthma randomised to maintenance treatments of at least 24 weeks duration and that reported on asthma exacerbations in full text. Low dose inhaled corticosteroid treatment was the comparator strategy. The primary effectiveness outcome was the rate of severe exacerbations. The secondary outcome was the composite of moderate or severe exacerbations. The rate of withdrawal was analysed as a safety outcome. Results 64 trials with 59 622 patient years of follow-up comparing 15 strategies and placebo were included. For prevention of severe exacerbations, combined inhaled corticosteroids and long acting β agonists as maintenance and reliever treatment and combined inhaled corticosteroids and long acting β agonists in a fixed daily dose performed equally well and were ranked first for effectiveness. The rate ratios compared with low dose inhaled corticosteroids were 0.44 (95% credible interval 0.29 to 0.66) and 0.51 (0.35 to 0.77), respectively. Other combined strategies were not superior to inhaled corticosteroids and all single drug treatments were inferior to single low dose inhaled corticosteroids. Safety was best for conventional best (guideline based) practice and combined maintenance and reliever therapy. Conclusions Strategies with combined inhaled corticosteroids and long acting β agonists are most effective and safe in preventing severe exacerbations of asthma, although some heterogeneity was observed in this network meta-analysis of full text reports.
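
To illustrate how direct evidence against a common comparator yields an indirect comparison between two strategies, here is a minimal frequentist sketch using the Bucher adjusted indirect comparison on the log rate ratio scale. The review itself used a Bayesian model, and while the two point estimates below echo the rate ratios reported above, the standard errors are invented, so the output is purely illustrative.

```python
import math

# Hypothetical direct estimates against a common comparator C (low dose
# inhaled corticosteroids): rate ratio plus standard error on the log scale.
rr_ac, se_ac = 0.44, 0.20   # strategy A vs C (standard error is invented)
rr_bc, se_bc = 0.51, 0.21   # strategy B vs C (standard error is invented)

# Indirect comparison of A vs B via the common comparator (Bucher method):
log_rr_ab = math.log(rr_ac) - math.log(rr_bc)
se_ab = math.sqrt(se_ac**2 + se_bc**2)   # variances add for independent estimates

lo = math.exp(log_rr_ab - 1.96 * se_ab)
hi = math.exp(log_rr_ab + 1.96 * se_ab)
print(f"indirect RR A vs B: {math.exp(log_rr_ab):.2f} (95% CI {lo:.2f} to {hi:.2f})")
```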

Relevance:

40.00%

Publisher:

Abstract:

OBJECTIVE To investigate whether revascularisation improves prognosis compared with medical treatment among patients with stable coronary artery disease. DESIGN Bayesian network meta-analyses to combine direct within trial comparisons between treatments with indirect evidence from other trials while maintaining randomisation. ELIGIBILITY CRITERIA FOR SELECTING STUDIES A strategy of initial medical treatment compared with revascularisation by coronary artery bypass grafting or Food and Drug Administration approved techniques for percutaneous revascularization: balloon angioplasty, bare metal stent, early generation paclitaxel eluting stent, sirolimus eluting stent, and zotarolimus eluting (Endeavor) stent, and new generation everolimus eluting stent, and zotarolimus eluting (Resolute) stent among patients with stable coronary artery disease. DATA SOURCES Medline and Embase from 1980 to 2013 for randomised trials comparing medical treatment with revascularisation. MAIN OUTCOME MEASURE All cause mortality. RESULTS 100 trials in 93 553 patients with 262 090 patient years of follow-up were included. Coronary artery bypass grafting was associated with a survival benefit (rate ratio 0.80, 95% credibility interval 0.70 to 0.91) compared with medical treatment. New generation drug eluting stents (everolimus: 0.75, 0.59 to 0.96; zotarolimus (Resolute): 0.65, 0.42 to 1.00) but not balloon angioplasty (0.85, 0.68 to 1.04), bare metal stents (0.92, 0.79 to 1.05), or early generation drug eluting stents (paclitaxel: 0.92, 0.75 to 1.12; sirolimus: 0.91, 0.75 to 1.10; zotarolimus (Endeavor): 0.88, 0.69 to 1.10) were associated with improved survival compared with medical treatment. Coronary artery bypass grafting reduced the risk of myocardial infarction compared with medical treatment (0.79, 0.63 to 0.99), and everolimus eluting stents showed a trend towards a reduced risk of myocardial infarction (0.75, 0.55 to 1.01). The risk of subsequent revascularisation was noticeably reduced by coronary artery bypass grafting (0.16, 0.13 to 0.20) followed by new generation drug eluting stents (zotarolimus (Resolute): 0.26, 0.17 to 0.40; everolimus: 0.27, 0.21 to 0.35), early generation drug eluting stents (zotarolimus (Endeavor): 0.37, 0.28 to 0.50; sirolimus: 0.29, 0.24 to 0.36; paclitaxel: 0.44, 0.35 to 0.54), and bare metal stents (0.69, 0.59 to 0.81) compared with medical treatment. CONCLUSION Among patients with stable coronary artery disease, coronary artery bypass grafting reduces the risk of death, myocardial infarction, and subsequent revascularisation compared with medical treatment. All stent based coronary revascularisation technologies reduce the need for revascularisation to a variable degree. Our results provide evidence for improved survival with new generation drug eluting stents but no other percutaneous revascularisation technology compared with medical treatment.

Relevance:

40.00%

Publisher:

Abstract:

BACKGROUND Limitations in the primary studies constitute one important factor to be considered in the Grading of Recommendations Assessment, Development and Evaluation (GRADE) system for rating the quality of evidence. However, in a network meta-analysis (NMA), such evaluation poses a special challenge because each network estimate receives different amounts of contribution from various studies via direct as well as indirect routes, and because some biases have directions whose repercussions in the network can be complicated. FINDINGS In this report we use the NMA of maintenance pharmacotherapy of bipolar disorder (17 interventions, 33 studies) and demonstrate how to quantitatively evaluate the impact of study limitations using netweight, a STATA command for NMA. For each network estimate, the percentages of its evidence contributed by direct comparisons at high, moderate, or low risk of bias were quantified. This method has proven flexible enough to accommodate complex biases with direction, such as the bias due to the enrichment design seen in some trials of bipolar maintenance pharmacotherapy. CONCLUSIONS Using netweight, therefore, we can evaluate in a transparent and quantitative manner how the limitations of individual studies in the NMA affect the quality of evidence of each network estimate, even when such limitations have clear directions.
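
The bookkeeping described above reduces to a simple weighted tally once a tool such as netweight has produced the percentage contribution of each direct comparison to a given network estimate. The sketch below is a hypothetical illustration: the comparisons, percentages, and risk-of-bias ratings are invented and are not taken from the bipolar disorder NMA.

```python
# Aggregate the share of evidence behind one network estimate by risk of bias.
contributions = {           # direct comparison -> % contribution (invented)
    "lithium vs placebo": 41.0,
    "quetiapine vs placebo": 28.5,
    "lithium vs quetiapine": 18.0,
    "olanzapine vs placebo": 12.5,
}
risk_of_bias = {            # direct comparison -> rating (invented)
    "lithium vs placebo": "low",
    "quetiapine vs placebo": "moderate",
    "lithium vs quetiapine": "high",
    "olanzapine vs placebo": "moderate",
}

totals = {"low": 0.0, "moderate": 0.0, "high": 0.0}
for comparison, pct in contributions.items():
    totals[risk_of_bias[comparison]] += pct

for level, pct in totals.items():
    print(f"{level:>8} risk of bias: {pct:.1f}% of this network estimate")
```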