842 results for Technology Network
Abstract:
Sample complexity results from computational learning theory, when applied to neural network learning for pattern classification problems, suggest that for good generalization performance the number of training examples should grow at least linearly with the number of adjustable parameters in the network. Results in this paper show that if a large neural network is used for a pattern classification problem and the learning algorithm finds a network with small weights that has small squared error on the training patterns, then the generalization performance depends on the size of the weights rather than the number of weights. For example, consider a two-layer feedforward network of sigmoid units, in which the sum of the magnitudes of the weights associated with each unit is bounded by A and the input dimension is n. We show that the misclassification probability is no more than a certain error estimate (that is related to squared error on the training set) plus A³√((log n)/m) (ignoring log A and log m factors), where m is the number of training patterns. This may explain the generalization performance of neural networks, particularly when the number of training examples is considerably smaller than the number of weights. It also supports heuristics (such as weight decay and early stopping) that attempt to keep the weights small during training. The proof techniques appear to be useful for the analysis of other pattern classifiers: when the input domain is a totally bounded metric space, we use the same approach to give upper bounds on misclassification probability for classifiers with decision boundaries that are far from the training examples.
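Written out explicitly, the bound described in this abstract (ignoring the log A and log m factors it mentions) takes the following form, where the symbol ε̂_m for the training-set error estimate is chosen here for illustration:

```latex
\Pr[\text{misclassification}] \;\le\; \hat{\epsilon}_m \;+\; O\!\left( A^{3} \sqrt{\frac{\log n}{m}} \right)
```

Note that the weight bound A, not the number of weights, drives the complexity term, which is the abstract's central point.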
Abstract:
This important work describes recent theoretical advances in the study of artificial neural networks. It explores probabilistic models of supervised learning problems, and addresses the key statistical and computational questions. Chapters survey research on pattern classification with binary-output networks, including a discussion of the relevance of the Vapnik-Chervonenkis dimension, and of estimates of the dimension for several neural network models. In addition, Anthony and Bartlett develop a model of classification by real-output networks, and demonstrate the usefulness of classification with a "large margin." The authors explain the role of scale-sensitive versions of the Vapnik-Chervonenkis dimension in large margin classification, and in real prediction. Key chapters also discuss the computational complexity of neural network learning, describing a variety of hardness results, and outlining two efficient, constructive learning algorithms. The book is self-contained and accessible to researchers and graduate students in computer science, engineering, and mathematics.
Abstract:
In fault detection and diagnostics, limitations arising from the sensor network architecture are one of the main challenges in evaluating a system's health status. The design of the sensor network architecture is usually not based solely on diagnostic purposes; other factors such as controls, financial constraints, and practical limitations are also involved. As a result, it is quite common to have one sensor (or one set of sensors) monitoring the behaviour of two or more components, which can significantly increase the complexity of diagnostic problems. In this paper a systematic approach is presented to deal with such complexities. It is shown how the problem can be formulated as a Bayesian-network-based diagnostic mechanism with latent variables. The developed approach is also applied to the problem of fault diagnosis in HVAC systems, an application area with considerable modeling and measurement constraints.
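To make the "one sensor, several components" setting concrete, here is a minimal sketch (not the paper's actual model): the health of two components is treated as latent variables, one shared sensor follows an assumed noisy-OR observation model, and the posterior fault probability is computed by exhaustive enumeration. All priors and parameters are illustrative.

```python
# One sensor monitoring two components; component health is latent.
# Posterior fault probabilities via enumeration over a noisy-OR model.
from itertools import product

P_FAULT = {"C1": 0.1, "C2": 0.1}  # hypothetical prior fault probabilities
LEAK, STRENGTH = 0.01, 0.9        # noisy-OR parameters (assumed)

def p_alarm(c1_faulty, c2_faulty):
    """P(sensor alarms | component states) under a noisy-OR model."""
    p_no_alarm = 1 - LEAK
    if c1_faulty:
        p_no_alarm *= 1 - STRENGTH
    if c2_faulty:
        p_no_alarm *= 1 - STRENGTH
    return 1 - p_no_alarm

def posterior_fault(component, alarm=True):
    """P(component faulty | sensor reading), enumerating latent states."""
    idx = {"C1": 0, "C2": 1}[component]
    num = den = 0.0
    for c1, c2 in product([False, True], repeat=2):
        prior = ((P_FAULT["C1"] if c1 else 1 - P_FAULT["C1"]) *
                 (P_FAULT["C2"] if c2 else 1 - P_FAULT["C2"]))
        like = p_alarm(c1, c2) if alarm else 1 - p_alarm(c1, c2)
        joint = prior * like
        den += joint
        if (c1, c2)[idx]:
            num += joint
    return num / den

print(round(posterior_fault("C1"), 3))  # → 0.505
```

Because the single sensor cannot distinguish the two components, an alarm raises both posteriors equally — exactly the kind of ambiguity the paper's latent-variable formulation is designed to handle.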
Abstract:
Data preprocessing is widely recognized as an important stage in anomaly detection. This paper reviews the data preprocessing techniques used by anomaly-based network intrusion detection systems (NIDS), concentrating on which aspects of the network traffic are analyzed, and what feature construction and selection methods have been used. Motivation for the paper comes from the large impact data preprocessing has on the accuracy and capability of anomaly-based NIDS. The review finds that many NIDS limit their view of network traffic to the TCP/IP packet headers. Time-based statistics can be derived from these headers to detect network scans, network worm behavior, and denial of service attacks. A number of other NIDS perform deeper inspection of request packets to detect attacks against network services and network applications. More recent approaches analyze full service responses to detect attacks targeting clients. The review covers a wide range of NIDS, highlighting which classes of attack are detectable by each of these approaches. Data preprocessing is found to predominantly rely on expert domain knowledge for identifying the most relevant parts of network traffic and for constructing the initial candidate set of traffic features. On the other hand, automated methods have been widely used for feature extraction to reduce data dimensionality, and for feature selection to find the most relevant subset of features from this candidate set. The review shows a trend toward deeper packet inspection to construct more relevant features through targeted content parsing. These context-sensitive features are required to detect current attacks.
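An illustrative sketch of the time-based header statistics the review describes: count the distinct destination ports each source IP contacts within a sliding time window, and flag sources that exceed a threshold as potential port scans. The window length and threshold here are assumed values; real NIDS tune them per deployment.

```python
# Port-scan detection from packet headers via windowed distinct-port counts.
from collections import defaultdict

WINDOW_SECS = 10
SCAN_THRESHOLD = 5  # assumed threshold; tuned per deployment in practice

def detect_scans(packets):
    """packets: iterable of (timestamp, src_ip, dst_port) header tuples.

    Returns the set of source IPs that contacted more than SCAN_THRESHOLD
    distinct destination ports within any WINDOW_SECS window."""
    events = defaultdict(list)  # src_ip -> recent [(ts, port), ...]
    alerts = set()
    for ts, src, port in sorted(packets):
        # Drop events that have fallen out of the time window.
        win = events[src] = [(t, p) for t, p in events[src]
                             if ts - t <= WINDOW_SECS]
        win.append((ts, port))
        if len({p for _, p in win}) > SCAN_THRESHOLD:
            alerts.add(src)
    return alerts
```

For example, a source sweeping ports 1–10 within a few seconds is flagged, while a host repeatedly contacting only port 80 is not.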
Abstract:
Almost all metapopulation modelling assumes that connectivity between patches is only a function of distance, and is therefore symmetric. However, connectivity will not depend only on the distance between the patches, as some paths are easy to traverse, while others are difficult. When colonising organisms interact with the heterogeneous landscape between patches, connectivity patterns will invariably be asymmetric. There have been few attempts to theoretically assess the effects of asymmetric connectivity patterns on the dynamics of metapopulations. In this paper, we use the framework of complex networks to investigate whether metapopulation dynamics can be determined by directly analysing the asymmetric connectivity patterns that link the patches. Our analyses focus on “patch occupancy” metapopulation models, which only consider whether a patch is occupied or not. We propose three easily calculated network metrics: the “asymmetry” and “average path strength” of the connectivity pattern, and the “centrality” of each patch. Together, these metrics can be used to predict the length of time a metapopulation is expected to persist, and the relative contribution of each patch to a metapopulation’s viability. Our results clearly demonstrate the negative effect that asymmetry has on metapopulation persistence. Complex network analyses represent a useful new tool for understanding the dynamics of species existing in fragmented landscapes, particularly those existing in large metapopulations.
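To illustrate one of the proposed metrics, here is a simple sketch of quantifying how asymmetric a patch connectivity matrix is. The paper defines its own "asymmetry" metric; the normalized measure below (0 for a symmetric pattern, 1 for a fully one-directional one) is an illustrative stand-in, not the paper's formula.

```python
# Asymmetry of a weighted patch connectivity matrix.
def asymmetry(C):
    """C[i][j]: colonisation strength from patch i to patch j.

    Returns the sum of |C[i][j] - C[j][i]| over patch pairs, normalized
    by the total pairwise connectivity (0 = symmetric, 1 = one-way)."""
    n = len(C)
    num = den = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            num += abs(C[i][j] - C[j][i])
            den += C[i][j] + C[j][i]
    return num / den if den else 0.0

symmetric = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]   # distance-only connectivity
one_way   = [[0, 1, 1], [0, 0, 1], [0, 0, 0]]   # strictly directional flow
print(asymmetry(symmetric), asymmetry(one_way))  # prints: 0.0 1.0
```

Under the paper's results, metapopulations whose connectivity pattern scores high on such a measure would be expected to persist for less time, all else being equal.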
Abstract:
Innovation is vital for the future of Australia's internet economy. Innovations rely on businesses' ability to innovate. Businesses' ability to innovate relies on their employees. The more these individual end users engage in the internet economy, the better businesses' engagement will be. The less these individual end users engage, the less likely a business is to engage and innovate. This means, for the internet economy to function at its fullest potential, it is essential that individual Australians have the capacity to engage with it and participate in it. The Australian federal government is working to facilitate the internet economy through policies, legislation and practices that implement high-speed broadband. The National Broadband Network (NBN) will be a vital tool for Australia's internet economy. Its 'chief importance' is that it will provide faster internet access speeds that will facilitate access to internet services and content. However, an appropriate infrastructure and internet speed is only part of the picture. As the Organisation for Economic Co-operation and Development identified, appropriate government policies are also needed to ensure that vital services are more accessible to consumers. The thesis identifies essential theories and principles underpinning the internet economy, from which the concept of connectedness is developed. Connectedness is defined as the ability of end users to connect with internet content and services, other individuals and organisations, and government. That is, their ability to operate in the internet economy. The NBN will be vital in ensuring connectedness into the future. What is not currently addressed by existing access regimes is how to facilitate end user access capacity and participation. The thesis concludes by making recommendations to the federal government as to what the governing principles of the Australian internet economy should include in order to enable individual end user access capacity.
Abstract:
This paper presents a "research frame" which we have found useful in analyzing complex socio-technical situations. The research frame is based on aspects of actor-network theory: "interessement", "enrollment", "points of passage" and the "trial of strength". Each of these aspects is described in turn, making clear its purpose in the overall research frame. Having established the research frame, it is used to analyse two examples. First, the use of speech recognition technology is examined in two different contexts, showing how to apply the frame to compare and contrast current situations. Next, a current medical consultation context is described and the research frame is used to consider how it could change with innovative technology. In both examples, the research frame shows that the use of an artefact or technology must be considered together with the context in which it is used.
Abstract:
In the current economy, knowledge has been recognized to be a valuable organisational asset, a crucial factor that aids organisations to succeed in highly competitive environments. Many organisations have begun projects and special initiatives aimed at fostering better knowledge sharing amongst their employees. Not surprisingly, information technology (IT) has been a central element of many of these projects and initiatives, as the potential of emerging information technologies such as Web 2.0 for enabling the process of managing organisational knowledge is recognised. This technology could be used as a collaborative system for knowledge management (KM) within enterprises. Enterprise 2.0 is the application of Web 2.0 in an organisational context. Enterprise 2.0 technologies are web-based social software that facilitate collaboration, communication and information flow in a bidirectional manner: an essential aspect of organisational knowledge management. This chapter explains how Enterprise 2.0 technologies (Web 2.0 technologies within organisations) can support knowledge management. The chapter also explores how such technologies support the codifying (technology-centred) and social network (people-centred) approaches of KM, towards bridging the current gap between these two approaches.
Abstract:
In this article I would like to examine the promise and possibilities of music, digital media and the National Broadband Network. I will do this based on concepts that have emerged from a study undertaken by Professor Andrew Brown and me, which categorises technologies into what we term representational technologies and technologies with agency.
Abstract:
Networked control systems (NCSs) offer many advantages over conventional control; however, they also present challenging problems such as network-induced delay and packet losses. This paper proposes an approach of predictive compensation for simultaneous network-induced delays and packet losses. Unlike the majority of existing NCS control methods, the proposed approach addresses co-design of both network and controller. It also relaxes the requirements for precise process models and full understanding of NCS network dynamics. For a series of possible sensor-to-actuator delays, the controller computes a series of corresponding redundant control values and sends them in a single packet to the actuator. Upon receiving the control packet, the actuator measures the actual sensor-to-actuator delay and computes the control signal from the control packet. When packet dropout occurs, the actuator utilizes past control packets to generate an appropriate control signal. The effectiveness of the approach is demonstrated through examples.
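The packet-based compensation scheme described above can be sketched as follows. The names, the toy proportional control law, and the one-step-ahead predictor interface are all illustrative assumptions, not the paper's actual design; the point is the mechanism: one control value per possible delay in each packet, selection by measured delay, and fallback to the last packet on dropout.

```python
MAX_DELAY = 3  # assumed bound on sensor-to-actuator delay, in sample periods

def controller_packet(setpoint, measurement, predict):
    """Compute one redundant control value per possible delay d = 0..MAX_DELAY.

    predict(measurement, d) is a (hypothetical) d-step-ahead state predictor."""
    return [0.5 * (setpoint - predict(measurement, d))  # toy P-control law
            for d in range(MAX_DELAY + 1)]

class Actuator:
    def __init__(self):
        self.last_packet = None

    def actuate(self, packet, measured_delay):
        """Select the control value matching the measured delay; on packet
        dropout (packet is None), fall back to the most recent packet."""
        if packet is not None:
            self.last_packet = packet
        if self.last_packet is None:
            return 0.0                  # no information yet: safe default
        d = min(measured_delay, MAX_DELAY)
        return self.last_packet[d]
```

For instance, with a predictor `predict = lambda y, d: y + d`, setpoint 10 and measurement 4, the controller packs [3.0, 2.5, 2.0, 1.5]; an actuator measuring a 2-period delay applies 2.0, and on a subsequent dropout it reuses the same packet.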
Abstract:
Online social networks can be found everywhere, from chatting websites like MSN and blogs such as MySpace to social media such as YouTube and Second Life. Among them, one interesting type of online social network, the online dating network, is growing fast. This paper analyzes an online dating network from a social network analysis point of view. Observations are made and results are obtained in order to suggest a better recommendation system for people-to-people networks.
Abstract:
Collaborative question answering (cQA) portals such as Yahoo! Answers allow users as askers or answer authors to communicate, and exchange information through the asking and answering of questions in the network. In their current set-up, answers to a question are arranged in chronological order. For effective information retrieval, it will be advantageous to have the users’ answers ranked according to their quality. This paper proposes a novel approach of evaluating and ranking the users’answers and recommending the top-n quality answers to information seekers. The proposed approach is based on a user-reputation method which assigns a score to an answer reflecting its answer author’s reputation level in the network. The proposed approach is evaluated on a dataset collected from a live cQA, namely, Yahoo! Answers. To compare the results obtained by the non-content-based user-reputation method, experiments were also conducted with several content-based methods that assign a score to an answer reflecting its content quality. Various combinations of non-content and content-based scores were also used in comparing results. Empirical analysis shows that the proposed method is able to rank the users’ answers and recommend the top-n answers with good accuracy. Results of the proposed method outperform the content-based methods, various combinations, and the results obtained by the popular link analysis method, HITS.
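A minimal sketch of the reputation-based ranking idea: score each answer by its author's reputation and return the top-n. The paper's actual reputation formula is more involved; the fraction-of-best-answers score and all names below are illustrative stand-ins.

```python
# Rank answers by a simple author-reputation score and return the top n.
def reputation(author, history):
    """history: author -> (best_answer_count, total_answer_count)."""
    best, total = history.get(author, (0, 0))
    return best / total if total else 0.0

def top_n_answers(answers, history, n):
    """answers: list of (answer_text, author); rank by author reputation."""
    return sorted(answers,
                  key=lambda a: reputation(a[1], history),
                  reverse=True)[:n]

# Hypothetical history: alice 8/10 best answers, bob 1/10, carol 5/20.
history = {"alice": (8, 10), "bob": (1, 10), "carol": (5, 20)}
answers = [("ans1", "bob"), ("ans2", "alice"), ("ans3", "carol")]
print(top_n_answers(answers, history, 2))
```

Note this scores an answer purely by who wrote it, independent of its content — which is exactly the contrast the paper's experiments draw against content-based quality scores.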