924 results for 280401 Analysis of Algorithms and Complexity
Abstract:
DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT
Abstract:
Distributed network utility maximization (NUM) is receiving increasing interest for cross-layer optimization problems in multihop wireless networks. Traditional distributed NUM algorithms rely heavily on feedback information exchanged between network elements, such as traffic sources and routers. Because of the distinct features of multihop wireless networks, such as time-varying channels and dynamic network topology, this feedback is usually inaccurate, which poses a major obstacle to applying distributed NUM to wireless networks. The questions to be answered include whether a distributed NUM algorithm can converge with inaccurate feedback and how to design an effective distributed NUM algorithm for wireless networks. In this paper, we first use the infinitesimal perturbation analysis technique to obtain, at the routers, an unbiased gradient estimate of the aggregate rate of the traffic sources based on locally available information. Building on this estimate, we propose a stochastic approximation algorithm to solve the distributed NUM problem with inaccurate feedback. We then prove that, under certain conditions, the proposed algorithm converges to the optimal solution of distributed NUM with perfect feedback. The proposed algorithm is applied to the joint rate and medium access control problem for wireless networks. Numerical results demonstrate the convergence of the proposed algorithm. © 2013 John Wiley & Sons, Ltd.
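The idea of stochastic approximation under noisy feedback can be illustrated with a minimal toy sketch (not the paper's algorithm): a single link of known capacity, log-utility sources, and a dual price updated from a noisy rate measurement with diminishing (Robbins-Monro) step sizes so the feedback noise averages out. All names, the noise model, and the step-size schedule here are illustrative assumptions.

```python
import random

def noisy_num(num_sources=3, capacity=6.0, iters=2000, seed=0):
    """Toy dual-decomposition NUM with log utilities and noisy feedback.

    Each source i maximizes log(x_i) - p * x_i given the link price p,
    so its best response is x_i = 1 / p.  The link updates p from a NOISY
    measurement of the aggregate rate, using diminishing step sizes
    (Robbins-Monro) so the measurement noise averages out over time.
    The optimum is x_i = capacity / num_sources, p = num_sources / capacity.
    """
    rng = random.Random(seed)
    p = 1.0  # link price (dual variable)
    for k in range(1, iters + 1):
        rates = [1.0 / p] * num_sources             # sources' best responses
        measured = sum(rates) + rng.gauss(0, 0.5)   # inaccurate feedback
        step = 0.1 / k                              # diminishing step size
        # dual ascent on the capacity constraint, with a floor to keep p > 0
        p = max(0.1, p + step * (measured - capacity))
    return p, [1.0 / p] * num_sources
```

With three sources and capacity 6, the price should settle near 3/6 = 0.5 despite the noisy rate measurements.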
Abstract:
A polyparametric intelligent information system for diagnosing human functional state in medicine and public health is developed. The essence of the system is a polyparametric description of the human functional state with a unified set of physiological parameters, using a polyparametric cognitive model as the tool for the system analysis of large volumes of data and the diagnosis of the functional state. The model is built on general principles of geometry and symmetry using artificial intelligence algorithms. The architecture of the system is presented. The model allows the analysis of traditional signs (absolute values of electrophysiological parameters) as well as new signs generated by the model, namely the relationships among those parameters. The classification of multidimensional physiological data is performed with a transformer of the model. The results are presented to the physician as a visual graph, a pattern of the individual functional state, which supports clinical syndrome analysis. The level of the human functional state is determined against a developed standard ("ideal") functional state. The complete formalization of the results makes it possible to accumulate physiological data and to analyze them with mathematical methods.
Abstract:
The paper addresses the task of cluster analysis of a given set of objects on the basis of the information contained in the objects' description table. Various methods of cluster analysis are briefly reviewed. A heuristic method and rules for classifying the given set of objects are presented for the cases in which neither the division into classes nor the number of classes is known. The algorithm is verified on a test example and on two program products (PP): learning systems and software for company management. An analysis of the results is presented.
Abstract:
Summarizing the experience accumulated over a long period in the polyparametric cognitive modeling of different physiological processes (electrocardiogram, electroencephalogram, electroreovasogram, and others), and the diagnostic methods developed on this basis, provides grounds for formulating a new methodology of system analysis in biology. The gist of the methodology consists of the parametrization of fractals of electrophysiological processes, a matrix description of the functional state of an object with a unified set of parameters, and the construction of a polyparametric cognitive geometric model with artificial intelligence algorithms. The geometric model displays parameter relationships in a manner adequate to the requirements of the system approach. The objective character of the model elements and the high degree of formalization, which facilitates the use of mathematical methods, are advantages of these models. At the same time, the geometric images are easily interpreted in physiological and clinical terms. Polyparametric modeling is an object-oriented tool with advanced functional facilities and several distinctive features.
Abstract:
Location estimation is important for wireless sensor network (WSN) applications. In this paper we propose a Cramér-Rao bound (CRB) based analytical approach for two centralized multi-hop localization algorithms to gain insight into the error performance and its sensitivity to distance measurement error, anchor node density, and anchor placement. The location estimation performance is compared with that of four distributed multi-hop localization algorithms by simulation to evaluate the efficiency of the proposed analytical approach. The numerical results demonstrate the complex tradeoff between centralized and distributed localization algorithms in accuracy, complexity, and communication overhead. Based on this analysis, an efficient and scalable performance evaluation tool can be designed for localization algorithms in large-scale WSNs, where simulation-based evaluation approaches are impractical. © 2013 IEEE.
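For intuition on how a CRB-based analysis works, here is a minimal sketch (not the paper's multi-hop formulation) for the simplest single-hop case: range-only localization with independent Gaussian range noise, where the Fisher information matrix is the scaled sum of outer products of the unit vectors from the anchors to the node, and the bound on total position variance is the trace of its inverse. The function name and the 2-D, single-hop setting are illustrative assumptions.

```python
import math

def range_crb(node, anchors, sigma):
    """Cramér-Rao bound on total 2-D position variance for range-only
    localization with independent Gaussian range noise of std `sigma`.

    FIM = (1 / sigma^2) * sum_i u_i u_i^T, where u_i is the unit vector
    from anchor i toward the node; the bound returned is trace(FIM^-1).
    """
    f00 = f01 = f11 = 0.0
    for ax, ay in anchors:
        dx, dy = node[0] - ax, node[1] - ay
        d = math.hypot(dx, dy)
        ux, uy = dx / d, dy / d        # unit vector anchor -> node
        f00 += ux * ux
        f01 += ux * uy
        f11 += uy * uy
    f00 /= sigma ** 2
    f01 /= sigma ** 2
    f11 /= sigma ** 2
    det = f00 * f11 - f01 * f01
    # trace of the inverse of a 2x2 matrix: (f00 + f11) / det
    return (f00 + f11) / det
```

With four anchors placed symmetrically around the node and unit noise, the bound evaluates to 1.0, and it scales with the noise variance, which is the kind of sensitivity the analytical approach exposes without simulation.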
Abstract:
The term oxylipin is applied to the oxygenated products of polyunsaturated fatty acids that can arise through either non-enzymatic or enzymatic processes, generating a complex array of products, including alcohols, aldehydes, ketones, acids and hydrocarbon gases. Study of the biosynthetic origin of these products has revealed an array of enzymes involved in their formation and, more recently, a radical pathway. These range from lipoxygenases and α-dioxygenase, which insert both oxygen atoms into the acyl chain to initiate the pathways, to specialised P450 monooxygenases that are responsible for their downstream processing. This latter group includes enzymes at the branch points, such as allene oxide synthase, leading to jasmonate signalling; hydroperoxide lyase, responsible for generating pathogen/pest defensive volatiles; and divinyl ether synthases and peroxygenases involved in the formation of antimicrobial compounds. The complexity of the products generated raises significant challenges for their rapid identification and quantification using metabolic screening methods. Here the current developments in oxylipin metabolism are reviewed together with the emerging technologies required to expand this important field of research, which underpins advances in plant-pest/pathogen interactions.
Abstract:
This research evaluates pattern recognition techniques on a subclass of big data where the dimensionality of the input space (p) is much larger than the number of observations (n). Specifically, we evaluate massive gene expression microarray cancer data where the ratio κ = n/p is less than one. We explore the statistical and computational challenges inherent in these high dimensional low sample size (HDLSS) problems and present statistical machine learning methods used to tackle and circumvent these difficulties. Regularization and kernel algorithms were explored in this research using seven datasets where κ < 1. These techniques require special attention to tuning, necessitating the investigation of several extensions of cross-validation to support better predictive performance. While no single algorithm was universally the best predictor, the regularization technique produced lower test errors in five of the seven datasets studied.
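The tuning issue the abstract highlights can be sketched in miniature: regularized estimators trade bias for variance through a penalty that must itself be selected, typically by cross-validation. The following toy uses scalar ridge regression with leave-one-out cross-validation to pick the penalty; it is a didactic sketch under assumed names, not the study's pipeline (real HDLSS work uses the full multivariate ridge and kernel machinery).

```python
def ridge_fit(xs, ys, lam):
    """Scalar ridge regression through the origin:
    w = sum(x*y) / (sum(x^2) + lam).  lam = 0 gives ordinary least squares."""
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    return sxy / (sxx + lam)

def loo_cv_lambda(xs, ys, lambdas):
    """Pick the ridge penalty with the lowest leave-one-out CV error."""
    best_lam, best_err = None, float("inf")
    for lam in lambdas:
        err = 0.0
        for i in range(len(xs)):
            # fit on all points except i, score on the held-out point
            w = ridge_fit(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:], lam)
            err += (ys[i] - w * xs[i]) ** 2
        if err < best_err:
            best_lam, best_err = lam, err
    return best_lam
```

On noiseless data generated by y = 2x, leave-one-out correctly prefers no penalty; with noisy, high-dimensional data the chosen penalty becomes strictly positive, which is exactly why the tuning step matters.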
Abstract:
Today, the development of domain-specific communication applications is both time-consuming and error-prone, because the low-level communication services provided by existing systems and networks are primitive and often heterogeneous. Multimedia communication applications are typically built on top of low-level network abstractions such as TCP/UDP sockets and the SIP (Session Initiation Protocol) and RTP (Real-time Transport Protocol) APIs. The User-centric Communication Middleware (UCM) is proposed to encapsulate the networking complexity and heterogeneity of basic multimedia and multi-party communication for upper-layer communication applications. UCM provides a unified user-centric communication service to diverse communication applications, ranging from simple phone calls and video conferencing to specialized applications such as disaster management and telemedicine, thereby simplifying the development of domain-specific communication applications. The UCM abstraction and API are proposed to achieve these goals. The dissertation also integrates formal methods into the UCM development process. A formal model of UCM is created using the SAM methodology; some design errors were found during model creation, because the formal method forces a precise description of UCM. Using the SAM tool, the formal UCM model is translated into a Promela model. Some system properties are defined as temporal logic formulas; these are manually translated into Promela, individually integrated with the Promela model of UCM, and verified using the SPIN tool. The formal analysis helps verify system properties (for example, the multiparty multimedia protocol) and uncover bugs in the system.
Abstract:
The central motif of this work is prediction and optimization in the presence of multiple interacting intelligent agents. We use the phrase `intelligent agents' to imply, in some sense, a `bounded rationality', the exact meaning of which varies depending on the setting. Our agents may not be `rational' in the classical game theoretic sense, in that they don't always optimize a global objective. Rather, they rely on heuristics, as is natural for human agents or even software agents operating in the real world. Within this broad framework we study the problem of influence maximization in social networks, where the behavior of agents is myopic but complication stems from the structure of interaction networks. In this setting, we generalize two well-known models and give new algorithms and hardness results for our models. Then we move on to models where the agents reason strategically but are faced with considerable uncertainty. For such games, we give a new solution concept and analyze a real-world game using our techniques. Finally, the richest model we consider is that of Network Cournot Competition, which deals with strategic resource allocation in hypergraphs, where agents reason strategically and their interaction is specified indirectly via the players' utility functions. For this model, we give the first equilibrium computability results. In all of the above problems, we assume that payoffs for the agents are known. However, for real-world games, obtaining the payoffs can be quite challenging. To this end, we also study the inverse problem of inferring payoffs, given game history. We propose and evaluate a data analytic framework and we show that it is fast and performant.
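As background for the influence maximization setting, the classical baseline (not this dissertation's generalized models) is greedy seed selection under the independent cascade model, with the expected spread estimated by Monte Carlo simulation. The function names, the edge-activation probability, and the run counts below are illustrative assumptions.

```python
import random

def simulate_ic(graph, seeds, p, rng):
    """One independent-cascade run over a directed graph given as an
    adjacency dict; each newly active node activates each out-neighbor
    independently with probability p.  Returns the number of active nodes."""
    active = set(seeds)
    frontier = list(seeds)
    while frontier:
        nxt = []
        for u in frontier:
            for v in graph.get(u, []):
                if v not in active and rng.random() < p:
                    active.add(v)
                    nxt.append(v)
        frontier = nxt
    return len(active)

def greedy_seeds(graph, k, p=0.3, runs=200, seed=0):
    """Greedily pick k seeds, each time adding the node with the largest
    Monte Carlo estimate of expected spread (the classical 1 - 1/e
    approximation for this submodular objective)."""
    rng = random.Random(seed)
    nodes = set(graph) | {v for vs in graph.values() for v in vs}
    chosen = []
    for _ in range(k):
        best, best_spread = None, -1.0
        for cand in nodes - set(chosen):
            spread = sum(simulate_ic(graph, chosen + [cand], p, rng)
                         for _ in range(runs)) / runs
            if spread > best_spread:
                best, best_spread = cand, spread
        chosen.append(best)
    return chosen
```

On a star graph the greedy procedure picks the hub, since seeding any leaf activates only that leaf while the hub reaches every neighbor with probability p.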
Abstract:
66 p.
Abstract:
This paper compares the performance of the complex nonlinear least squares algorithm implemented in the LEVM/LEVMW software with the performance of a genetic algorithm in the characterization of an electrical impedance of known topology. The effect of the number of measured frequency points and of measurement uncertainty on the estimation of circuit parameters is presented. The analysis is performed on the equivalent circuit impedance of a humidity sensor.
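To make the comparison concrete, here is a minimal sketch of the genetic-algorithm side for a simple circuit model: fitting the two parameters of a parallel RC impedance to magnitude measurements by elitist truncation selection with multiplicative mutation. The circuit model, parameter ranges, and GA operators are illustrative assumptions, not those of the paper or of LEVM/LEVMW.

```python
import math
import random

def z_model(f, R, C):
    """Impedance magnitude of a parallel RC circuit at frequency f (Hz)."""
    w = 2 * math.pi * f
    return abs(R / (1 + 1j * w * R * C))

def ga_fit(freqs, meas, pop_size=40, gens=60, seed=1):
    """Tiny genetic algorithm estimating (R, C) from |Z| measurements.
    Returns (R, C, sum-of-squares error of the best individual)."""
    rng = random.Random(seed)

    def cost(ind):
        R, C = ind
        return sum((z_model(f, R, C) - m) ** 2 for f, m in zip(freqs, meas))

    pop = [(rng.uniform(1.0, 1e4), rng.uniform(1e-9, 1e-5))
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cost)
        survivors = pop[:pop_size // 2]          # elitist truncation selection
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = rng.sample(survivors, 2)
            # uniform crossover plus +/-10% multiplicative mutation
            R = rng.choice([a[0], b[0]]) * rng.uniform(0.9, 1.1)
            C = rng.choice([a[1], b[1]]) * rng.uniform(0.9, 1.1)
            children.append((R, C))
        pop = survivors + children
    best = min(pop, key=cost)
    return best[0], best[1], cost(best)
```

Because the survivors are carried over unchanged, the best fitness is non-increasing across generations, which is the property the paper's accuracy comparison against complex nonlinear least squares relies on.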
Abstract:
This work analyzes particle filter tracking algorithms for point objects and extended targets on a radar application problem. Through simulations, the number of particles and the process and measurement noise of the particle filter have been optimized. Four scenarios have been considered: a point object with a linear trajectory, a point object with a non-linear trajectory, an extended object with a linear trajectory, and an extended object with a non-linear trajectory. The extended target has been modelled as an ellipse parametrized by the minor and major axes, the orientation angle, and the center coordinates (five parameters overall).
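The core predict/update/resample loop of the bootstrap particle filter used in such tracking studies can be sketched for the simplest possible case, a 1-D random-walk target with Gaussian measurement noise. The state model, noise levels, and function name are illustrative assumptions; the radar scenarios above use far richer state vectors (five parameters for the extended target).

```python
import math
import random

def particle_filter_1d(meas, n_particles=500, q=0.5, r=1.0, seed=0):
    """Bootstrap particle filter for a 1-D random-walk target.

    State model:       x_k = x_{k-1} + N(0, q)
    Measurement model: z_k = x_k + N(0, r)
    Returns the posterior-mean position estimate after each measurement.
    """
    rng = random.Random(seed)
    parts = [rng.gauss(meas[0], r) for _ in range(n_particles)]
    estimates = []
    for z in meas:
        # predict: propagate each particle through the motion model
        parts = [x + rng.gauss(0, q) for x in parts]
        # update: weight particles by the Gaussian measurement likelihood
        weights = [math.exp(-0.5 * ((z - x) / r) ** 2) for x in parts]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        estimates.append(sum(w * x for w, x in zip(weights, parts)))
        # resample (multinomial) to combat weight degeneracy
        parts = rng.choices(parts, weights=weights, k=n_particles)
    return estimates
```

Tuning the particle count and the process/measurement noise parameters q and r, as the work above does by simulation, directly trades off tracking accuracy against computational cost.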
Abstract:
This report describes the realization of a system in which an object detection model is implemented to detect the presence of people in images. The system could be used in several applications: for example, it could be carried on board an aircraft or a drone. In this case, the system is designed so that it can be mounted on light/medium-weight helicopters, helping the operator find people in emergency situations. In the first chapter, the use of helicopters for civil protection is analysed and applications similar to this case study are listed. The second chapter describes the choice of the hardware devices used to implement a prototype system that collects, analyses, and displays images. First, the PC needed to process the images was chosen, based on the characteristics of the analysis algorithms; then a camera compatible with the PC was selected; finally, the battery pack was chosen, taking into account the power consumption of the devices. The third chapter illustrates the algorithms used for image analysis. The fourth briefly analyses some of the regulatory requirements that must be met to carry all the devices on board. The fifth chapter describes the design and modelling, with the CAD software SolidWorks, of the devices and of a prototype case to house them. The sixth chapter discusses additive manufacturing, since the case was printed with this technology. In the seventh chapter, part of the tests that must be carried out on the equipment to certify it are analysed, and some simulations are reported. The eighth chapter shows the results obtained once the object detection model was loaded onto the image analysis hardware. In the ninth chapter, conclusions and future applications are discussed.