994 results for Network Centric Warfare


Relevance: 20.00%

Publisher:

Abstract:

Neural networks have emerged as a prominent research topic, with applications ranging from ECG noise filtering and seismic data analysis to elementary particle detection and electronic music composition. The focus of the proposed work is the application of a massively parallel connectionist network to the detection of a sonar target. The task is divided into: (i) generation of training patterns from sea noise containing the radiated noise of a target, for teaching the network; (ii) selection of a suitable network topology and learning algorithm; and (iii) training of the network and its subsequent testing, in which the network detects, in unknown patterns presented to it, the features it has already learned. A three-layer perceptron trained with backpropagation is first subjected to recursive training with example patterns derived from sea ambient noise with and without the radiated noise of a target. On every presentation, the error at the output of the network is propagated back, and the weights and bias associated with each neuron are modified in proportion to this error measure. During this iterative process the network converges, extracting the target features and encoding them in its generalized weights and biases. In every unknown pattern that the converged network subsequently encounters, it searches for the features already learned and indicates their presence or absence. This target-detection capability is demonstrated by the response of the network to various test patterns. Three network topologies are tried with two variants of backpropagation learning, and the performance of each combination is graded.
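
As a concrete illustration of the training loop described above, the following minimal sketch implements a three-layer perceptron with backpropagation on synthetic "noise only" versus "noise plus target" feature vectors. The layer sizes, learning rate and data are invented for illustration and are not taken from the thesis.

```python
# Minimal three-layer perceptron with backpropagation for a binary
# detection task (target present / absent). Synthetic feature vectors
# stand in for patterns derived from sea ambient noise.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Synthetic training set: 200 patterns of 16 features each.
# "Target present" patterns carry a small additive structure.
X_noise  = rng.normal(0.0, 1.0, (100, 16))
X_target = rng.normal(0.0, 1.0, (100, 16)) + 0.8
X = np.vstack([X_noise, X_target])
y = np.hstack([np.zeros(100), np.ones(100)]).reshape(-1, 1)

# Network: 16 inputs -> 8 hidden -> 1 output (all sigmoid units).
W1 = rng.normal(0, 0.5, (16, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1));  b2 = np.zeros(1)
lr = 0.5

for epoch in range(2000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    err = out - y                          # output error

    # Backward pass: propagate the error, update weights and biases
    d_out = err * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(axis=0)

# Detection on unseen patterns: output near 1 indicates the learned
# target features are present, near 0 indicates their absence.
X_test = rng.normal(0.0, 1.0, (5, 16)) + 0.8
print(sigmoid(sigmoid(X_test @ W1 + b1) @ W2 + b2).ravel())
```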

Relevance: 20.00%

Publisher:

Abstract:

International School of Photonics, Cochin University of Science and Technology

Relevance: 20.00%

Publisher:

Abstract:

The Internet has become a vital part of day-to-day life owing to the revolutionary changes it has brought about in many fields. Dependence on the Internet as an information highway and knowledge bank is increasing exponentially, to the point that going back is beyond imagination, and critical information is now routinely transferred over it. This widespread use of the Internet, coupled with the tremendous growth in e-commerce and m-commerce, has created a vital need for information security. The Internet has also become an active field for crackers and intruders, and all development in this area can come to nothing if fool-proof security of the data, without any chance of adulteration, is not ensured. It is therefore a challenge for the professional community to develop systems that secure the data sent through the Internet. Stream ciphers, hash functions and message authentication codes play vital roles in providing security services such as confidentiality, integrity and authentication for data sent over the Internet. Several popular and dependable techniques of this kind have been in wide use for a long time, and this long-term exposure makes them vulnerable to successful or near-successful attacks; hence there is a need to develop new algorithms with better security. Studies were therefore conducted on the various types of algorithms used in this area, with a focus on identifying the properties that impart security. Using the insight derived from these studies, new algorithms were designed, their performance was studied, and necessary modifications were made, yielding an improved system consisting of a new stream cipher MAJE4, a new hash code JERIM-320 and a new message authentication code MACJER-320. Detailed analysis and comparison with existing popular schemes were carried out to establish their security levels. The Secure Socket Layer (SSL) / Transport Layer Security (TLS) protocol is one of the most widely used security protocols on the Internet. The cryptographic algorithms RC4 and HMAC have been used for security services such as confidentiality and authentication in SSL/TLS, but recent attacks on RC4 and HMAC have raised questions about the reliability of these algorithms. Hence MAJE4 and MACJER-320 have been proposed as substitutes for them. Detailed studies of the performance of these new algorithms show that they are dependable alternatives.
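
MAJE4, JERIM-320 and MACJER-320 are the thesis's own constructions and are not reproduced here; the sketch below only illustrates the general role a message authentication code plays, using the standard HMAC with SHA-256 from Python's library. The key and message are placeholders.

```python
# Illustration of the message-authentication role described above,
# using the standard library's HMAC with SHA-256. This is NOT the
# proposed MACJER-320; it only shows how a shared-key MAC provides
# integrity and authentication for data sent over the network.
import hmac
import hashlib

key = b"shared-secret-key"          # agreed out of band (hypothetical)
message = b"transfer 100 units to account 42"

# Sender computes a tag over the message.
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

# Receiver recomputes the tag and compares in constant time.
def verify(key: bytes, message: bytes, received_tag: str) -> bool:
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, received_tag)

print(verify(key, message, tag))                       # True
print(verify(key, message + b" (tampered)", tag))      # False
```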

Relevance: 20.00%

Publisher:

Abstract:

Microarray data analysis is a data mining tool used to extract meaningful information hidden in biological data. One of its major uses is the reconstruction of gene regulatory networks, which can provide a broader understanding of the functioning of complex cellular systems. Since cancer is a genetic disease arising from abnormal gene function, the identification of cancerous genes and the regulatory pathways they control provides a better platform for understanding tumor formation and development. The major focus of this thesis is to understand the regulation of genes responsible for the development of cancer, particularly colorectal cancer, by analyzing microarray expression data. Four computational algorithms, namely a fuzzy logic algorithm, a modified genetic algorithm, a dynamic neural fuzzy network and a Takagi-Sugeno-Kang-type recurrent neural fuzzy network, are used to extract cancer-specific gene regulatory networks from a plasma RNA dataset of colorectal cancer patients. Plasma RNA is highly attractive for cancer analysis because it requires only a small amount of blood and can be collected repeatedly at any time, allowing analysis of disease progression and treatment response.
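
The fuzzy and neural-fuzzy models named above are beyond a short snippet, so the sketch below uses a plain correlation threshold as a stand-in to show the overall data flow of network reconstruction, from a genes-by-samples expression matrix to a list of candidate regulatory links. Gene names and expression values are illustrative only.

```python
# Minimal sketch of network reconstruction from an expression matrix.
# A simple correlation threshold stands in for the fuzzy-logic,
# genetic-algorithm and neural-fuzzy models used in the thesis,
# just to show the data flow (genes x samples -> edge list).
import numpy as np

rng = np.random.default_rng(1)
genes = ["TP53", "KRAS", "APC", "MYC", "PTEN"]       # illustrative names
expr = rng.normal(size=(len(genes), 12))              # 12 hypothetical samples
expr[3] = 0.9 * expr[1] + 0.1 * rng.normal(size=12)   # make MYC track KRAS

corr = np.corrcoef(expr)                              # gene-gene correlation
threshold = 0.8

edges = [
    (genes[i], genes[j], round(corr[i, j], 2))
    for i in range(len(genes))
    for j in range(i + 1, len(genes))
    if abs(corr[i, j]) >= threshold
]
print(edges)   # candidate regulatory links for downstream validation
```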

Relevance: 20.00%

Publisher:

Abstract:

The evolution of wireless sensor network technology has enabled us to develop advanced systems for real-time monitoring. Wireless sensor networks are increasingly being used for precision agriculture, where their advantages include distributed data collection and monitoring as well as control of climate, irrigation and nutrient supply, thereby decreasing the cost of production and increasing its efficiency. This paper describes the development and deployment of a wireless sensor network for crop monitoring in the paddy fields of Kuttanad, a region of Kerala, a southern state of India.

Relevance: 20.00%

Publisher:

Abstract:

MicroRNAs are short non-coding RNAs that can regulate gene expression during crucial cell processes such as differentiation, proliferation and apoptosis. Changes in miRNA expression profiles play an important role in the development of many cancers, including colorectal cancer (CRC). The identification of cancer-related miRNAs and their target genes is therefore important for cancer biology research. In this paper, we applied a TSK-type recurrent neural fuzzy network (TRNFN) to infer the miRNA–mRNA association network from paired miRNA and mRNA expression profiles of CRC patients. We demonstrated that the proposed method achieved good performance in recovering known, experimentally verified miRNA–mRNA associations. Moreover, our approach proved successful in identifying 17 validated cancer miRNAs that are directly involved in CRC-related pathways. Targeting such miRNAs may help not only to prevent the recurrence of disease but also to control the growth of advanced metastatic tumors. Our regulatory modules provide valuable insights into the pathogenesis of cancer.
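
A common way to quantify the kind of recovery performance mentioned above is to compare predicted associations against an experimentally validated set. The toy sketch below computes precision and recall over placeholder miRNA–mRNA pairs; it is not the paper's evaluation or its data.

```python
# Sketch of scoring recovered miRNA-mRNA associations against
# experimentally validated pairs (precision / recall). The miRNA and
# gene names below are placeholders, not results from the paper.
predicted = {("miR-21", "PTEN"), ("miR-17", "E2F1"), ("miR-x", "GENE1")}
validated = {("miR-21", "PTEN"), ("miR-17", "E2F1"), ("miR-143", "KRAS")}

true_positives = predicted & validated
precision = len(true_positives) / len(predicted)
recall = len(true_positives) / len(validated)

print(f"precision={precision:.2f}, recall={recall:.2f}")
```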

Relevance: 20.00%

Publisher:

Abstract:

One of the major applications of underwater acoustic sensor networks (UWASN) is ocean environment monitoring. Employing data mules is an energy-efficient way of collecting data from the underwater sensor nodes in such a network: a data mule node, such as an autonomous underwater vehicle (AUV), periodically visits the stationary nodes to download their data. By conserving the power that would otherwise be required for long-distance transmission to a remote data sink, this approach extends the network lifetime. In this paper we propose a new MAC protocol that supports a single mobile data mule collecting the data sensed by the stationary nodes in periodic runs through the network; the nodes then need to perform only short-distance, single-hop transmission to the data mule. The proposed protocol is a hybrid that combines schedule-based access among the stationary nodes with handshake-based access for the mobile data mule. The new protocol, RMAC-M, is developed as an extension of the energy-efficient MAC protocol R-MAC by extending the R-MAC slot time to include a contention part for handshake-based data transfer. The mobile node uses a beacon to signal its presence to all nearby nodes, which can then handshake with it for data transfer. Simulation results show that the new protocol provides efficient support for a mobile data mule while preserving the advantages of R-MAC, such as energy efficiency and fairness.
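
The exact timing parameters of RMAC-M are not given in the abstract; the toy sketch below only illustrates the hybrid idea of a scheduled part for stationary nodes followed by a contention window that is used when a data-mule beacon is heard. All durations and behaviour are invented.

```python
# Toy sketch of the RMAC-M idea described above: each slot keeps the
# scheduled (R-MAC style) part for stationary nodes and appends a
# contention part that is used only when a data-mule beacon is heard.
import random
from dataclasses import dataclass

@dataclass
class Slot:
    scheduled_ms: int = 80    # schedule-based access among stationary nodes
    contention_ms: int = 20   # handshake window for the mobile data mule

def run_slot(slot: Slot, beacon_heard: bool, node_id: int) -> str:
    log = [f"node {node_id}: scheduled transfer ({slot.scheduled_ms} ms)"]
    if beacon_heard:
        # Contend with a random backoff, then handshake with the mule.
        backoff = random.randint(0, slot.contention_ms)
        log.append(f"node {node_id}: beacon heard, backoff {backoff} ms, "
                   f"handshake + data upload to mule")
    return "; ".join(log)

for node in range(3):
    print(run_slot(Slot(), beacon_heard=(node == 1), node_id=node))
```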

Relevance: 20.00%

Publisher:

Abstract:

In our study we use a kernel-based machine learning technique, support vector machine regression (SVMReg), to predict the melting point of drug-like compounds in terms of topological descriptors, topological charge indices, connectivity indices and 2D autocorrelations. The model was designed, trained and tested using a dataset of 100 compounds, and it was found that an SVMReg model with an RBF kernel could predict the melting point with a mean absolute error of 15.5854 and a root mean squared error of 19.7576.
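
A minimal sketch of this modelling setup, assuming scikit-learn and a synthetic descriptor matrix in place of the study's 100-compound dataset, is shown below; it demonstrates an RBF-kernel support vector regressor evaluated by MAE and RMSE, not the reported results.

```python
# Sketch of the modelling setup: support vector regression with an
# RBF kernel mapping molecular descriptors to melting point. The
# descriptor matrix here is synthetic; the study used topological
# descriptors, charge indices, connectivity indices and 2D
# autocorrelations for 100 compounds.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.metrics import mean_absolute_error, mean_squared_error

rng = np.random.default_rng(7)
X = rng.normal(size=(100, 20))                       # 20 hypothetical descriptors
y = 150 + X[:, 0] * 25 + X[:, 1] * 10 + rng.normal(0, 15, 100)   # melting point

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=1.0))
model.fit(X_tr, y_tr)
pred = model.predict(X_te)

print("MAE :", mean_absolute_error(y_te, pred))
print("RMSE:", mean_squared_error(y_te, pred) ** 0.5)
```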

Relevance: 20.00%

Publisher:

Abstract:

Metal matrix composites (MMCs) with aluminium (Al) as the matrix phase and silicon carbide particles (SiCp) as the reinforcement phase, i.e. Al-SiCp MMCs, have gained popularity in the recent past. In this competitive age, manufacturing industries strive to produce superior-quality products at a reasonable price. This is possible by achieving higher productivity while machining at optimum combinations of process variables. Their low weight and high strength make these MMCs suitable for a variety of components.

Relevance: 20.00%

Publisher:

Abstract:

Composite Fe3O4–SiO2 materials were prepared by the sol–gel method with tetraethoxysilane and aqueous-based Fe3O4 ferrofluids as precursors. The monoliths obtained were crack-free and showed both optical and magnetic properties. The structural properties were determined by infrared spectroscopy, X-ray diffractometry and transmission electron microscopy. Fe3O4 particles of 20 nm size lie within the pores of the matrix without any strong Si–O–Fe bonding, and the well-established silica network provides effective confinement for these nanoparticles. The composites were transparent in the 600–800 nm regime, and the field-dependent magnetization curves suggest that the composite exhibits superparamagnetic characteristics.

Relevance: 20.00%

Publisher:

Abstract:

Towed array electronics is a multi-channel, simultaneous, real-time, high-speed data acquisition system. Since its assembly is highly manpower-intensive, the cost of such arrays is prohibitive, and any attempt to reduce manufacturing, assembly, testing and maintenance costs is therefore a welcome proposition. The Network Based Towed Array is an innovative concept whose implementation has remarkably simplified fabrication, assembly and testing and revolutionised the towed array scenario. The focus of this paper is to give a good insight into the reliability aspects of the Network Based Towed Array. A case study comparing the conventional array with the network based towed array is also presented.
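
As a rough illustration of why reducing the number of series-connected modules and interconnections matters for array reliability, the sketch below compares two hypothetical configurations using the standard exponential series-system model. Module counts and failure rates are invented and do not come from the paper's case study.

```python
# Illustrative series-system reliability comparison. The figures are
# made up; the case study would supply the real module counts and
# failure rates for the conventional and network-based arrays.
import math

def series_reliability(failure_rate_per_hour: float, n_modules: int,
                       mission_hours: float) -> float:
    # R(t) = exp(-lambda * t) per module; a series system multiplies them.
    return math.exp(-failure_rate_per_hour * mission_hours) ** n_modules

mission = 1000.0  # hours
conventional = series_reliability(2e-6, n_modules=400, mission_hours=mission)
network_based = series_reliability(2e-6, n_modules=120, mission_hours=mission)

print(f"conventional array : R = {conventional:.3f}")
print(f"network-based array: R = {network_based:.3f}")
```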

Relevance: 20.00%

Publisher:

Abstract:

The paper investigates the feasibility of implementing an intelligent classifier for noise sources in the ocean with the help of artificial neural networks, using higher-order spectral features. Non-linear interactions between the component frequencies of the noise data can give rise to phase relations called quadratic phase coupling (QPC), which cannot be characterized by power spectral analysis. Bispectral analysis, a higher-order estimation technique, can however reveal the presence of such phase couplings and provides a measure to quantify them. A feed-forward neural network has been trained and validated with higher-order spectral features.
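
A minimal sketch of the kind of higher-order analysis described above, assuming a synthetic signal with one quadratically phase-coupled triplet, is given below. It computes a direct FFT-based bispectrum estimate and locates the strongest coupling; the feature extraction and neural-network classification stages of the paper are omitted.

```python
# Direct (FFT-based) bispectrum estimate of the kind used to expose
# quadratic phase coupling (QPC). The signal below is synthetic and
# the estimator is deliberately simplified.
import numpy as np

rng = np.random.default_rng(3)
fs, n = 1024, 4096
t = np.arange(n) / fs
# Two tones plus their phase-coupled sum (a QPC triplet) in noise.
f1, f2 = 60.0, 112.0
x = (np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)
     + 0.5 * np.cos(2 * np.pi * (f1 + f2) * t) + 0.3 * rng.normal(size=n))

seg_len, nfft = 256, 256
segments = x[: (n // seg_len) * seg_len].reshape(-1, seg_len)
B = np.zeros((nfft // 2, nfft // 2), dtype=complex)
for seg in segments:
    X = np.fft.fft(seg * np.hanning(seg_len), nfft)
    for k1 in range(nfft // 2):
        for k2 in range(nfft // 2):
            if k1 + k2 < nfft:
                # Bispectrum accumulates coherently only for coupled triplets.
                B[k1, k2] += X[k1] * X[k2] * np.conj(X[k1 + k2])
B /= len(segments)

k1, k2 = np.unravel_index(np.argmax(np.abs(B)), B.shape)
print(f"strongest coupling near ({k1 * fs / nfft:.0f} Hz, {k2 * fs / nfft:.0f} Hz)")
```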

Relevance: 20.00%

Publisher:

Abstract:

The process of developing software that takes advantage of multiple processors is commonly referred to as parallel programming. For various reasons, this process is much harder than the sequential case. For decades, parallel programming was a problem for only a small niche: engineers parallelizing mostly numerical applications in high-performance computing. This has changed with the advent of multi-core processors in mainstream computer architectures, and parallel programming is now a problem for a much larger group of developers. The main objective of this thesis was to find ways to make parallel programming easier for them. Several aims were identified in order to reach this objective: research the state of the art of parallel programming today, improve the education of software developers on the topic, and provide programmers with powerful abstractions to make their work easier. To reach these aims, several key steps were taken. To start with, a survey was conducted among parallel programmers to establish the state of the art. More than 250 people participated, yielding results about the parallel programming systems and languages in use, as well as about common problems with these systems. Furthermore, a study was conducted in university classes on parallel programming. It resulted in a list of frequently made mistakes that were analyzed and used to create a programmers' checklist for avoiding them in the future. For programmers' education, an online resource was set up to collect experience and knowledge in the field of parallel programming, called the Parawiki. Another key step in this direction was the creation of the Thinking Parallel weblog, where more than 50,000 readers to date have read essays on the topic. For the third aim (powerful abstractions), it was decided to concentrate on one parallel programming system: OpenMP. Its ease of use and high level of abstraction were the most important reasons for this decision. Two research directions were pursued. The first resulted in a parallel library called AthenaMP. It contains so-called generic components, derived from design patterns for parallel programming. These include functionality to enhance the locks provided by OpenMP, to perform operations on large amounts of data (data-parallel programming), and to enable the implementation of irregular algorithms using task pools. AthenaMP itself serves a triple role: the components are well documented and can be used directly in programs; developers can study the source code and learn from it; and compiler writers can use it as a testing ground for their OpenMP compilers. The second research direction was targeted at changing the OpenMP specification to make the system more powerful. The main contributions here were a proposal to enable thread cancellation and a proposal to avoid busy waiting. Both were implemented in a research compiler, shown to be useful in example applications, and proposed to the OpenMP Language Committee.
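
AthenaMP itself is a C++/OpenMP library and is not reproduced here; as a language-neutral illustration of the task-pool pattern it provides for irregular algorithms, the sketch below uses Python's standard concurrent.futures, where tasks may spawn further tasks until the pool drains.

```python
# Task-pool pattern for irregular algorithms, illustrated with the
# standard library's thread pool (not OpenMP or AthenaMP). Each task
# may generate an unpredictable number of child tasks.
from concurrent.futures import ThreadPoolExecutor, wait, FIRST_COMPLETED

def process(node: int) -> list[int]:
    # Toy rule: nodes below 16 spawn two children, the rest spawn none.
    return [node * 2, node * 2 + 1] if node < 16 else []

def task_pool(seed_tasks: list[int]) -> list[int]:
    done_nodes = []
    with ThreadPoolExecutor(max_workers=4) as pool:
        pending = {pool.submit(process, n): n for n in seed_tasks}
        while pending:
            finished, _ = wait(pending, return_when=FIRST_COMPLETED)
            for fut in finished:
                done_nodes.append(pending.pop(fut))
                for child in fut.result():
                    pending[pool.submit(process, child)] = child
    return done_nodes

print(sorted(task_pool([1])))   # visits a small, irregular tree of tasks
```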

Relevance: 20.00%

Publisher:

Abstract:

Social resource sharing systems like YouTube and del.icio.us have acquired a large number of users within the last few years. They provide rich resources for data analysis, information retrieval, and knowledge discovery applications. A first step towards this end is to gain better insights into the content and structure of these systems. In this paper, we analyse the main network characteristics of two of these systems. We consider their underlying data structures, so-called folksonomies, as tri-partite hypergraphs, and adapt classical network measures such as characteristic path length and clustering coefficient to them. Subsequently, we introduce a network of tag co-occurrence and investigate some of its statistical properties, focusing on correlations in node connectivity and pointing out features that reflect emergent semantics within the folksonomy. We show that simple statistical indicators unambiguously spot non-social behavior such as spam.
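
A small sketch of how a tag co-occurrence network can be built from folksonomy posts, using invented (user, resource, tag-set) triples rather than the datasets analysed in the paper, is shown below.

```python
# Tag co-occurrence network from folksonomy posts: two tags are linked
# whenever they appear together in a post, weighted by the number of
# such posts. The posts below are invented for illustration.
from collections import Counter
from itertools import combinations

posts = [
    ("alice", "video1", {"music", "guitar", "tutorial"}),
    ("bob",   "video1", {"music", "live"}),
    ("carol", "page7",  {"tutorial", "python", "music"}),
    ("dave",  "page9",  {"spam", "spam-offer"}),
]

cooccurrence = Counter()
for _user, _resource, tags in posts:
    for t1, t2 in combinations(sorted(tags), 2):
        cooccurrence[(t1, t2)] += 1

for (t1, t2), weight in cooccurrence.most_common():
    print(f"{t1} -- {t2}  (weight {weight})")
```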
