127 results for Knowledge Networks


Relevance: 30.00%

Abstract:

The restructuring of power industries has brought fundamental changes to both power system operation and planning. This paper presents a new planning method that uses a multi-objective optimization (MOOP) technique, together with human knowledge, to expand the transmission network in open-access schemes. The method starts with a candidate pool of feasible expansion plans. Subsequent selection of the best candidates is carried out through a MOOP approach in which multiple objectives are tackled simultaneously, aiming to integrate market operation and planning as one unified process in the context of a deregulated system. Human knowledge is applied at both stages to ensure that the selection reflects practical engineering and management concerns. The expansion plan produced by MOOP is assessed against reliability criteria before it is finalized. The proposed method has been tested on the IEEE 14-bus system, and relevant analyses and discussions are presented.
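
As a concrete illustration of the selection stage, the sketch below shows the Pareto-dominance screening that a MOOP approach of this kind implies. It is a generic sketch, not the paper's algorithm: the objective names (investment cost, congestion cost, expected unserved energy) and the candidate values are hypothetical.

# Pareto-dominance screening of candidate expansion plans (all objectives minimized).
def dominates(a, b):
    # a dominates b if it is no worse on every objective and strictly better on at least one
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(plans):
    # keep only the non-dominated candidate plans
    return {name: objs for name, objs in plans.items()
            if not any(dominates(other, objs)
                       for other_name, other in plans.items() if other_name != name)}

# Hypothetical candidate pool: (investment cost, congestion cost, expected unserved energy)
candidates = {
    "plan_A": (120.0, 35.0, 4.2),
    "plan_B": (95.0, 48.0, 5.1),
    "plan_C": (130.0, 30.0, 4.0),
    "plan_D": (140.0, 36.0, 4.5),  # worse than plan_C on all three objectives
}

print(pareto_front(candidates))  # plan_D drops out; the rest go forward

Human knowledge would then act as an additional filter on the surviving plans (for example, right-of-way or staging constraints) before the reliability assessment described above.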

Relevance: 30.00%

Abstract:

An economy is a coordinated system of distributed knowledge. Economic evolution occurs as knowledge grows and the structure of the system changes. This paper is about the role of markets in this process. Traditionally, the theory of markets has not been a central feature of evolutionary economics. This seems to be due to the orthodox view of markets as information-processing mechanisms for finding equilibria. But in economic evolution markets are actually knowledge-structuring mechanisms. What then is the relation between knowledge, information, markets and mechanisms? I argue that an evolutionary theory of markets, in the manner of Loasby (1999), requires a clear formulation of these relations. I suggest that a conception of knowledge and markets in terms of a graphical theory of complex systems furnishes precisely this.

Relevance: 30.00%

Abstract:

Traditional methods of R&D management are no longer sufficient for embracing innovations and leveraging complex new technologies into fully integrated positions within established systems. This paper presents the view that the technology integration process is a result of fundamental interactions embedded in inter-organisational activities. Emerging industries, high-technology companies and knowledge-intensive organisations owe a large part of their viability to complex networks of inter-organisational interactions and relationships. R&D organisations are the gatekeepers of the technology integration process: their initial sanction of, and motivation to develop, a technology provides its first point of entry. Networks rely on the activities of stakeholders to provide the foundations of collaborative R&D activities, business-to-business marketing and strategic alliances. Such complex inter-organisational interactions and relationships influence value creation and organisational goals as stakeholders seek to gain investment opportunities. A theoretical model is developed here that contributes to our understanding of technology integration (adoption) as a dynamic process, one that is simultaneously structured and enacted through the activities of stakeholders and organisations in complex inter-organisational networks of sanction and integration.

Relevance: 30.00%

Abstract:

Network building and the exchange of information by people within networks are crucial to the innovation process. Contrary to older models, in social networks the flow of information is noncontinuous and nonlinear, and there are critical barriers to information flow that operate in problematic ways. New models and new analytic tools are needed for these systems. This paper introduces the concept of virtual circuits and draws on recent concepts of network modelling and design to introduce a probabilistic switch theory that can be described using matrices. It can be used to model multistep information flow between people within organisational networks, to provide formal definitions of efficient and balanced networks, and to describe the distortion of information as it passes along human communication channels. The concept of multi-dimensional information space arises naturally from the use of matrices. The theory, and the use of serial diagonal matrices, has applications to organisational design and to the modelling of other systems. It is hypothesised that opinion leaders or creative individuals are more likely to emerge at information-rich nodes in networks. A mathematical definition of such nodes is developed; it does not invariably correspond to centrality as defined in early work on networks.
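
The matrix formulation lends itself to a short numerical illustration. The sketch below is an assumed, generic version of the idea rather than the paper's switch theory itself: a matrix of pass-on probabilities between people, matrix powers for multistep flow, and a simple per-node "information richness" score.

import numpy as np

# Hypothetical pass-on probabilities: P[i, j] is the probability that person i
# forwards a piece of information to person j in one step.
P = np.array([
    [0.0, 0.6, 0.3, 0.0],
    [0.1, 0.0, 0.5, 0.2],
    [0.0, 0.4, 0.0, 0.7],
    [0.2, 0.0, 0.1, 0.0],
])

def multistep_flow(P, k):
    # (i, j) entry of P^k: expected weight of information reaching j from i
    # along paths of length k, assuming independent forwarding decisions
    return np.linalg.matrix_power(P, k)

def information_richness(P, max_steps=4):
    # total expected inflow to each node over paths of length 1..max_steps;
    # high-scoring nodes are candidates for the information-rich positions
    # where the abstract hypothesises opinion leaders emerge
    flows = sum(np.linalg.matrix_power(P, k) for k in range(1, max_steps + 1))
    return flows.sum(axis=0)

print(multistep_flow(P, 2))
print(information_richness(P))

Such a richness score need not rank nodes in the same order as classical centrality measures, which is the distinction the abstract draws.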

Relevance: 30.00%

Abstract:

Motivation: Targeting peptides direct nascent proteins to their specific subcellular compartments. Knowledge of targeting signals enables informed drug design and reliable annotation of gene products. However, due to the low similarity of such sequences and the dynamic nature of the sorting process, computational prediction of the subcellular localization of proteins is challenging. Results: We contrast the use of feed-forward models, as employed by the popular TargetP/SignalP predictors, with a sequence-biased recurrent network model. The models are evaluated in terms of performance at both the residue level and the sequence level, and the results demonstrate that recurrent networks improve overall prediction performance. Compared to the original results reported for TargetP, an ensemble of the tested models increases accuracy by 6% and 5% on non-plant and plant data, respectively.
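
As an illustration of the sequence-biased recurrent idea (not the specific architecture evaluated here), a minimal bidirectional LSTM that emits a per-residue label distribution could look as follows; the alphabet size, hidden size and label set are assumptions.

import torch
import torch.nn as nn

class ResidueTagger(nn.Module):
    # Toy recurrent model: per-residue scores over illustrative targeting classes
    # (e.g. signal peptide / transit peptide / other).
    def __init__(self, n_amino_acids=20, embed_dim=16, hidden_dim=32, n_classes=3):
        super().__init__()
        self.embed = nn.Embedding(n_amino_acids, embed_dim)
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden_dim, n_classes)

    def forward(self, seq_ids):               # seq_ids: (batch, length) integer-encoded residues
        h, _ = self.rnn(self.embed(seq_ids))  # (batch, length, 2 * hidden_dim)
        return self.out(h)                    # per-residue class logits

model = ResidueTagger()
dummy = torch.randint(0, 20, (1, 70))         # one sequence of 70 residues
probs = torch.softmax(model(dummy), dim=-1)   # residue-level class probabilities
print(probs.shape)                            # torch.Size([1, 70, 3])

A sequence-level compartment call, comparable to the TargetP/SignalP output, would then aggregate these residue-level probabilities, for example over the N-terminal region.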

Relevance: 30.00%

Abstract:

The regulation of osteoclast differentiation in the bone microenvironment is critical for normal bone remodeling, as well as for various human bone diseases. Over the last decade, our knowledge of how osteoclast differentiation occurs has progressed rapidly. We highlight some of the major advances in understanding how cell signaling and transcription are integrated to direct the differentiation of this cell type. These studies used genetic, molecular, and biochemical approaches. Additionally, we summarize data obtained from studies that applied the functional genomic approach of global gene profiling to osteoclast differentiation. These genomic data confirm results from studies using the classical experimental approaches and may also suggest new modes by which osteoclast differentiation and function can be modulated. Two conclusions emerge: osteoclast differentiation depends on a combination of fairly ubiquitously expressed transcription factors rather than on unique osteoclast factors, and the overlay of cell-signaling pathways on this set of transcription factors provides a powerful mechanism to fine-tune the differentiation program in response to the local bone microenvironment.

Relevance: 30.00%

Abstract:

Bistability arises within a wide range of biological systems, from the λ phage switch in bacteria to cellular signal transduction pathways in mammalian cells. Changes in regulatory mechanisms may result in genetic switching in a bistable system. Recently, more and more experimental evidence in the form of bimodal population distributions has indicated that noise plays a very important role in the switching of bistable systems. Although deterministic models have been used to study the existence of bistability under various system conditions, these models cannot capture the cell-to-cell fluctuations involved in genetic switching. However, the development of stochastic models for studying the impact of noise in bistable systems has lagged because of the lack of detailed knowledge of biochemical reactions, kinetic rates, and molecular numbers. In this work, we develop a previously undescribed general technique for building quantitative stochastic models of large-scale genetic regulatory networks by introducing Poisson random variables into deterministic models described by ordinary differential equations. Two stochastic models have been proposed for the genetic toggle switch interfaced with either the SOS signaling pathway or a quorum-sensing signaling pathway, and we have successfully reproduced experimental results showing bimodal population distributions. Because the introduced stochastic models are based on widely used ordinary differential equation models, the success of this work suggests that this approach is a very promising one for studying noise in large-scale genetic regulatory networks.
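
The general recipe described here, replacing the deterministic increments of an ODE model with Poisson random variables over a small time step, can be sketched on a generic two-gene toggle switch. The parameter values below are arbitrary, and the model is a plain toggle rather than the SOS- or quorum-sensing-coupled systems studied in the paper.

import numpy as np

rng = np.random.default_rng(0)

def production_rates(u, v, a1=50.0, a2=50.0, beta=2.0, gamma=2.0):
    # toggle-switch ODE production terms: each repressor inhibits the other
    return a1 / (1.0 + v**beta), a2 / (1.0 + u**gamma)

def poisson_leap(T=200.0, dt=0.1, u0=0, v0=0, deg=1.0):
    # stochastic version of the ODE model: each reaction fires a
    # Poisson-distributed number of times in (t, t + dt)
    u, v = u0, v0
    for _ in range(int(T / dt)):
        prod_u, prod_v = production_rates(u, v)
        u += rng.poisson(prod_u * dt) - min(u, rng.poisson(deg * u * dt))
        v += rng.poisson(prod_v * dt) - min(v, rng.poisson(deg * v * dt))
    return u, v

# Repeating the simulation over many "cells" gives a population distribution;
# a bimodal histogram of u - v reflects noise-driven commitment to one of the
# two stable states.
samples = np.array([poisson_leap() for _ in range(200)])
print(np.histogram(samples[:, 0] - samples[:, 1], bins=5))

The corresponding deterministic model is recovered by replacing each Poisson draw with its mean, which is what makes this construction a direct stochastic extension of an existing ODE description.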

Relevance: 30.00%

Abstract:

Taiwan is embarking on a new phase in its approach to building its national innovative capacity by building the infrastructure for a biotechnology industry. Rather than acting as a “fast follower” of trends developed elsewhere, Taiwan is seeking to evolve the elements of a national innovation system, including upgrading the role of universities in providing fundamental R&D, in providing incubators for new knowledge-based firms, in developing new funding models, and in establishing new biotech-focused science parks. This paper reviews the progress achieved to date and the prospects for this new phase in Taiwan’s transition from imitation to innovation.

Relevance: 20.00%

Abstract:

The discussion about relations between research and design has a number of strands and, presumably, motivations. Putting aside the question of whether or not design or “creative endeavour” should be counted as research, for reasons to do with institutional recognition or reward, the question remains: how, if at all, is design research? This question is unlikely to have attracted much interest but for matters external to Architecture within the modern university. But Architecture as a discipline now needs to understand research much better than in the past, when ‘research’ was whatever went on in building science, history or people/environment studies. In this paper, I begin with some common assumptions about design, considered in relation to research, and suggest how the former can constitute, or be a mode of, the latter. Central to this consideration is an understanding of research as the production of publicly available knowledge. The method is that of conceptual analysis, which is much more fruitful than is usually appreciated. This work is part of a larger project in the philosophy of design, in roughly the analytical tradition.

Relevance: 20.00%

Abstract:

In this paper, we propose a fast adaptive importance sampling method for the efficient simulation of buffer overflow probabilities in queueing networks. The method comprises three stages. First, we estimate the minimum cross-entropy tilting parameter for a small buffer level; next, we use this as a starting value for the estimation of the optimal tilting parameter for the actual (large) buffer level. Finally, the tilting parameter just found is used to estimate the overflow probability of interest. We study various properties of the method in more detail for the M/M/1 queue and conjecture that similar properties also hold for quite general queueing networks. Numerical results support this conjecture and demonstrate the high efficiency of the proposed algorithm.
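
For the M/M/1 case, the final estimation stage can be sketched with the classical state-independent change of measure, in which the arrival and service rates are interchanged; this stands in for the cross-entropy-tuned tilting parameter that the method actually estimates, and the numerical values are illustrative.

import numpy as np

rng = np.random.default_rng(1)

def overflow_prob_is(lam=0.3, mu=1.0, level=20, n_runs=50_000):
    # Estimate P(queue reaches `level` before emptying, starting from 1) for an
    # M/M/1 queue, via importance sampling on the embedded random walk with the
    # arrival and service rates swapped.
    p = lam / (lam + mu)          # up-step probability under the original measure
    q = 1.0 - p
    p_is, q_is = q, p             # tilted measure: swap arrival and service rates
    estimates = np.empty(n_runs)
    for run in range(n_runs):
        x, lr = 1, 1.0
        while 0 < x < level:
            if rng.random() < p_is:   # arrival under the tilted measure
                x += 1
                lr *= p / p_is
            else:                      # departure under the tilted measure
                x -= 1
                lr *= q / q_is
        estimates[run] = lr if x == level else 0.0
    return estimates.mean(), estimates.std() / np.sqrt(n_runs)

estimate, std_err = overflow_prob_is()
rho = 0.3 / 1.0                                   # traffic intensity lam / mu
exact = (1 - 1 / rho) / (1 - (1 / rho) ** 20)     # gambler's-ruin value for level 20
print(estimate, std_err, exact)

Crude Monte Carlo would almost never observe this event at such a low traffic intensity, which is why the tilting (and, in the paper, its adaptive cross-entropy estimation) is needed.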

Relevance: 20.00%

Abstract:

Data mining is the process of identifying valid, implicit, previously unknown, potentially useful and understandable information in large databases. It is an important step in the process of knowledge discovery in databases (Olaru & Wehenkel, 1999). In a data mining process, input data can be structured, semi-structured, or unstructured, and can take the form of text, categorical values, or numerical values. One of the important characteristics of data mining is its ability to deal with data that are large in volume, distributed, time-variant, noisy, and high-dimensional. A large number of data mining algorithms have been developed for different applications. For example, association rule mining can be useful for market basket problems, clustering algorithms can be used to discover trends in unsupervised learning problems, classification algorithms can be applied in decision-making problems, and sequential and time series mining algorithms can be used in predicting events, fault detection, and other supervised learning problems (Vapnik, 1999). Classification is among the most important tasks in data mining, particularly for applications in engineering fields. Together with regression, classification is mainly used for predictive modelling. A number of classification algorithms are now in practical use. According to Sebastiani (2002), the main classification algorithms can be categorized as: decision-tree and rule-based approaches such as C4.5 (Quinlan, 1996); probabilistic methods such as the Bayesian classifier (Lewis, 1998); on-line methods such as Winnow (Littlestone, 1988) and CVFDT (Hulten, 2001); neural network methods (Rumelhart, Hinton & Williams, 1986); example-based methods such as k-nearest neighbours (Duda & Hart, 1973); and SVM (Cortes & Vapnik, 1995). Other important techniques for classification tasks include Associative Classification (Liu et al., 1998) and Ensemble Classification (Tumer, 1996).
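
As a concrete, assumed illustration of the classification task (not code from the chapter), a few of the algorithm families named above can be compared on a toy dataset with scikit-learn:

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier     # decision-tree family (cf. C4.5)
from sklearn.naive_bayes import GaussianNB           # probabilistic / Bayesian family
from sklearn.neighbors import KNeighborsClassifier   # example-based family (k-nearest neighbours)
from sklearn.svm import SVC                          # support vector machines

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "decision tree": DecisionTreeClassifier(random_state=0),
    "naive Bayes": GaussianNB(),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "SVM": SVC(kernel="rbf"),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name:15s} test accuracy: {model.score(X_test, y_test):.3f}")

An ensemble counterpart (for example, sklearn.ensemble.RandomForestClassifier) would slot into the same loop, mirroring the ensemble-classification family cited above.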