937 results for clonal selection algorithm


Relevance: 80.00%

Publisher:

Abstract:

Staphylococcus aureus produces a virulence factor, protein A (SpA), that contains five homologous Ig-binding domains. The interactions of SpA with the Fab region of membrane-anchored Igs can stimulate a large fraction of B cells, contributing to lymphocyte clonal selection. To understand the molecular basis for this activity, we have solved the crystal structure of the complex between domain D of SpA and the Fab fragment of a human IgM antibody to 2.7-Å resolution. In the complex, helices II and III of domain D interact with the variable region of the Fab heavy chain (VH) through framework residues, without the involvement of the hypervariable regions implicated in antigen recognition. The contact residues are highly conserved in human VH3 antibodies but not in other families. The contact residues from domain D also are conserved among all SpA Ig-binding domains, suggesting that each could bind in a similar manner. Features of this interaction parallel those reported for staphylococcal enterotoxins that are superantigens for many T cells. The structural homology between Ig VH regions and the T-cell receptor Vβ regions facilitates their comparison, and both types of interactions involve a lymphocyte receptor surface remote from the antigen binding site. However, T-cell superantigens reportedly interact through hydrogen bonds with T-cell receptor Vβ backbone atoms in a primary sequence-independent manner, whereas SpA relies on sequence-restricted conformational binding with residue side chains, suggesting that this common bacterial pathogen has adopted distinct molecular recognition strategies for affecting large sets of B and T lymphocytes.

Relevance: 80.00%

Publisher:

Abstract:

The hierarchical properties of potential energy landscapes have been used to gain insight into thermodynamic and kinetic properties of protein ensembles. It also may be possible to use them to direct computational searches for thermodynamically stable macroscopic states, i.e., computational protein folding. To this end, we have developed a top-down search procedure in which conformation space is recursively dissected according to the intrinsic hierarchical structure of a landscape's effective-energy barriers. This procedure generates an inverted tree similar to the disconnectivity graphs generated by local minima-clustering methods, but it fundamentally differs in the manner in which the portion of the tree that is to be computationally explored is selected. A key ingredient is a branch-selection algorithm that takes advantage of statistically predictive properties of the landscape to guide searches down the tree branches that are most likely to lead to the physically relevant macroscopic states. Using the computational folding of a β-hairpin-forming peptide as an example, we show that such predictive properties indeed exist and can be used for structure prediction by free-energy global minimization.
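
As context for the branch-selection idea described above, the following is a minimal, hypothetical sketch of a top-down descent through a barrier tree in which branches are ranked by a statistical score estimated from sampled effective energies. The Branch structure, the sampling stub and the score are illustrative assumptions, not the authors' actual procedure.

```python
# Hypothetical sketch: top-down dissection of a landscape's barrier tree,
# keeping only branches whose sampled energies look statistically promising.
# Branch, sample_energies and predictive_score are illustrative stand-ins.
import heapq
import random
from dataclasses import dataclass, field
from typing import List

@dataclass
class Branch:
    label: str
    depth: int
    children: List["Branch"] = field(default_factory=list)

def sample_energies(branch, n=32):
    """Stand-in for sampling effective energies of conformations within this branch."""
    rng = random.Random(hash(branch.label) & 0xFFFF)
    return [rng.gauss(-branch.depth, 1.0) for _ in range(n)]

def predictive_score(branch):
    """Lower is better: mean sampled energy, with a bonus for a wide spread
    (a branch statistically more likely to hide a deeper minimum)."""
    e = sample_energies(branch)
    return sum(e) / len(e) - 0.5 * (max(e) - min(e))

def descend(root, beam_width=2, max_depth=5):
    """Recursively dissect the tree, exploring only the most promising branches."""
    frontier = [root]
    for _ in range(max_depth):
        children = [c for b in frontier for c in b.children]
        if not children:
            break
        frontier = heapq.nsmallest(beam_width, children, key=predictive_score)
    return frontier

# Tiny example tree.
root = Branch("root", 0, [Branch("a", 1), Branch("b", 1, [Branch("b1", 2)])])
print([b.label for b in descend(root)])
```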

Relevance: 80.00%

Publisher:

Abstract:

The rapid progress in the cloning of proteoglycan genes has enabled investigators to examine in depth the functional roles these polyhedric molecules play in the control of cell proliferation. Decorin, a leucine-rich proteoglycan expressed by most connective tissues, is a prototype molecule that regulates cellular growth via two mechanisms: modulation of growth factor activity and matrix assembly. We now provide direct evidence that human colon cancer cells stably transfected with decorin cDNA exhibit a marked suppression of the transformed phenotype: the cells have a reduced growth rate in vitro, form small colonies in soft agar, and do not generate tumors in scid/scid mice. Several independent clones are arrested in the G1 phase of the cell cycle, and their growth can be restored by treatment with decorin antisense oligodeoxynucleotides. These effects are independent of growth factors and are not due to either clonal selection or integration site of the decorin gene. These findings correlate well with the observation that decorin gene expression is markedly up-regulated during quiescence. Decorin thus appears to be one component of a negative loop that controls cell growth.

Relevance: 80.00%

Publisher:

Abstract:

We have used the Two-Degree Field (2dF) instrument on the Anglo-Australian Telescope (AAT) to obtain redshifts of a sample of z < 3 and 18.0 < g < 21.85 quasars selected from Sloan Digital Sky Survey (SDSS) imaging. These data are part of a larger joint programme between the SDSS and 2dF communities to obtain spectra of faint quasars and luminous red galaxies, namely the 2dF-SDSS LRG and QSO (2SLAQ) Survey. We describe the quasar selection algorithm and present the resulting number counts and luminosity function of 5645 quasars in 105.7 deg². The bright-end number counts and luminosity functions agree well with determinations from the 2dF QSO Redshift Survey (2QZ) data to g ~ 20.2. However, at the faint end, the 2SLAQ number counts and luminosity functions are steeper (i.e. require more faint quasars) than the final 2QZ results from Croom et al., but are consistent with the preliminary 2QZ results from Boyle et al. Using the functional form adopted for the 2QZ analysis (a double power law with pure luminosity evolution characterized by a second-order polynomial in redshift), we find a faint-end slope of β = −1.78 ± 0.03 if we allow all of the parameters to vary, and β = −1.45 ± 0.03 if we allow only the faint-end slope and normalization to vary (holding all other parameters equal to the final 2QZ values). Over the magnitude range covered by the 2SLAQ survey, our maximum-likelihood fit to the data yields 32 per cent more quasars than the final 2QZ parametrization, but is not inconsistent with other g > 21 deep surveys for quasars. The 2SLAQ data exhibit no well-defined 'break' in the number counts or luminosity function, but do clearly flatten with increasing magnitude. Finally, we find that the shape of the quasar luminosity function derived from 2SLAQ is in good agreement with that derived from Type I quasars found in hard X-ray surveys.
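
For reference, here is a small illustrative implementation of the functional form quoted above: a double power law with pure luminosity evolution, where the break magnitude follows a second-order polynomial in redshift. Except for the faint-end slope β = −1.45 taken from the abstract, the parameter values below are placeholders, not the 2SLAQ or 2QZ best-fit values.

```python
def m_g_star(z, mg0=-21.6, k1=1.36, k2=-0.27):
    """Break magnitude evolving as M_g*(z) = M_g*(0) - 2.5 (k1 z + k2 z^2)."""
    return mg0 - 2.5 * (k1 * z + k2 * z**2)

def phi(Mg, z, phi_star=1.0e-6, alpha=-3.3, beta=-1.45):
    """Double power-law space density Phi(M_g, z), per mag per Mpc^3."""
    dM = Mg - m_g_star(z)
    return phi_star / (10 ** (0.4 * (alpha + 1) * dM) + 10 ** (0.4 * (beta + 1) * dM))

# Example: relative densities one magnitude brighter and fainter than the break at z = 1.5.
print(phi(m_g_star(1.5) - 1.0, 1.5), phi(m_g_star(1.5) + 1.0, 1.5))
```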

Relevance: 80.00%

Publisher:

Abstract:

This paper presents a fast part-based subspace selection algorithm, termed the binary sparse nonnegative matrix factorization (B-SNMF). Both the training process and the testing process of B-SNMF are much faster than those of binary principal component analysis (B-PCA). In addition, B-SNMF is more robust to occlusions in images. Experimental results on face images demonstrate the effectiveness and the efficiency of the proposed B-SNMF.
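
The abstract gives no algorithmic detail, so the sketch below shows only a generic sparse NMF with multiplicative updates and an L1 penalty, as rough context for what a part-based nonnegative factorization does. It is not the authors' B-SNMF, and the rank, penalty weight and iteration count are arbitrary.

```python
# Generic sparse NMF via multiplicative updates with an L1 penalty on H.
# This is a plain-NMF sketch for context only, not the B-SNMF algorithm itself.
import numpy as np

def sparse_nmf(V, rank, sparsity=0.1, iters=200, eps=1e-9, seed=0):
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank))
    H = rng.random((rank, m))
    for _ in range(iters):
        # Update H with the L1 sparsity term added to the denominator.
        H *= (W.T @ V) / (W.T @ W @ H + sparsity + eps)
        # Standard Lee-Seung multiplicative update for W.
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Example: factor a random non-negative "image" matrix into 10 parts.
V = np.abs(np.random.default_rng(1).random((64, 48)))
W, H = sparse_nmf(V, rank=10)
print(np.linalg.norm(V - W @ H))
```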

Relevance: 80.00%

Publisher:

Abstract:

With the rapid growth of the Internet, computer attacks are increasing at a fast pace and can easily cause millions of dollars in damage to an organization. Detecting these attacks is an important issue of computer security. There are many types of attacks, and they fall into four main categories: Denial of Service (DoS), Probe, User to Root (U2R), and Remote to Local (R2L) attacks. Within these categories, DoS and Probe attacks show up with great frequency over short periods of time when they attack systems. They differ from normal traffic data and can be easily separated from normal activities. On the contrary, U2R and R2L attacks are embedded in the data portions of the packets and normally involve only a single connection, which makes it difficult to achieve satisfactory detection accuracy for these two attack types. Therefore, we focus on studying the ambiguity problem between normal activities and U2R/R2L attacks. The goal is to build a detection system that can accurately and quickly detect these two attacks. In this dissertation, we design a two-phase intrusion detection approach. In the first phase, a correlation-based feature selection algorithm is proposed to improve the speed of detection. Features with poor ability to predict attack signatures and features inter-correlated with one or more other features are considered redundant; such features are removed, and only indispensable information about the original feature space remains. In the second phase, we develop an ensemble intrusion detection system to achieve accurate detection performance. The proposed method includes multiple feature-selecting intrusion detectors and a data mining intrusion detector. The former consist of a set of detectors, each of which uses a fuzzy clustering technique and belief theory to solve the ambiguity problem. The latter applies a data mining technique to automatically extract computer users' normal behavior from training network traffic data. The final decision is a combination of the outputs of the feature-selecting and data mining detectors. The experimental results indicate that our ensemble approach not only significantly reduces the detection time but also effectively detects U2R and R2L attacks that contain degrees of ambiguous information.
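
As a rough illustration of the first-phase idea (not the dissertation's exact algorithm), the sketch below keeps features ranked by correlation with the attack label and drops those that are weak predictors or strongly inter-correlated with an already-kept feature. The thresholds and the synthetic data are assumptions.

```python
# Hedged sketch of a correlation-based feature filter in the spirit described:
# drop features weakly correlated with the label, then drop features strongly
# inter-correlated with a feature already kept. Thresholds are illustrative.
import numpy as np

def correlation_filter(X, y, min_relevance=0.1, max_redundancy=0.9):
    n_features = X.shape[1]
    relevance = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n_features)])
    order = np.argsort(-relevance)          # rank by predictive ability
    kept = []
    for j in order:
        if relevance[j] < min_relevance:
            continue                        # poor predictor of attack signatures
        redundant = any(abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) > max_redundancy
                        for k in kept)
        if not redundant:
            kept.append(j)
    return kept

# Example on synthetic traffic features where only column 3 carries signal.
rng = np.random.default_rng(0)
X = rng.random((500, 20))
y = (X[:, 3] + 0.1 * rng.random(500) > 0.5).astype(float)
print(correlation_filter(X, y))
```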

Relevance: 80.00%

Publisher:

Abstract:

The ability to use Software Defined Radio (SDR) in civilian mobile applications will make it possible for the next generation of mobile devices to handle multi-standard personal wireless devices and ubiquitous wireless devices. The original military standard created many beneficial characteristics for SDR, but resulted in a number of disadvantages as well. Many challenges in commercializing SDR are still the subject of interest in the software radio research community. Four main issues that have already been addressed are performance, size, weight, and power. This investigation presents an in-depth study of SDR inter-component communications in terms of the total link delay related to the number of components and packet sizes in systems based on Software Communication Architecture (SCA). The study is based on the investigation of a controlled-environment platform. Results suggest that the total link delay does not increase linearly with the number of components and the packet sizes. A closed-form expression for the delay was modeled using a logistic function in terms of the number of components and packet sizes. The model performed well when the number of components was large. For mobile applications, energy consumption has become one of the most crucial limitations. SDR will not only provide the flexibility of multi-protocol support, but this desirable feature will also bring a choice of mobile protocols. Having such a variety of choices available creates a problem in the selection of the most appropriate protocol to transmit. An investigation into a real-time algorithm to optimize energy efficiency was also performed. Communication energy models, including switching estimation, were used to develop a waveform selection algorithm. Simulations were performed to validate the concept.
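
Purely as an illustration of the two modeling ideas mentioned (a logistic-style delay model and an energy-based waveform choice), the sketch below uses made-up coefficients and a toy per-byte energy model; none of these values come from the study.

```python
# Illustrative only: a logistic-shaped link-delay model in the number of
# components and packet size, plus a toy waveform choice that minimizes an
# assumed transmit-plus-switch energy. All coefficients are placeholders.
import math

def link_delay(n_components, packet_bytes, L=12.0, a=0.25, b=0.0015, x0=10.0):
    """Saturating (logistic-shaped) total link delay in milliseconds."""
    x = a * n_components + b * packet_bytes        # combined load proxy
    return L / (1.0 + math.exp(-(x - x0)))

def select_waveform(protocols, payload_bytes):
    """Pick the protocol with the lowest estimated transmit-plus-switch energy."""
    return min(protocols,
               key=lambda p: p["joule_per_byte"] * payload_bytes + p["switch_cost_j"])

protocols = [
    {"name": "wifi", "joule_per_byte": 2e-6, "switch_cost_j": 0.05},
    {"name": "bluetooth", "joule_per_byte": 5e-6, "switch_cost_j": 0.01},
]
print(round(link_delay(8, 1024), 3), select_waveform(protocols, 4096)["name"])
```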

Relevance: 80.00%

Publisher:

Abstract:

A number of studies in the areas of Biomedical Engineering and Health Sciences have employed machine learning tools to develop methods capable of identifying patterns in different sets of data. Despite its eradication in many countries of the developed world, Hansen’s disease still affects a large part of the population in countries such as India and Brazil. In this context, this research proposes to develop a method that makes it possible to understand in the future how Hansen’s disease affects facial muscles. Using surface electromyography, a system was adapted to capture signals from the largest possible number of facial muscles. We first reviewed the literature to learn how researchers around the globe have been working with diseases that affect the peripheral nervous system and how electromyography has contributed to the understanding of these diseases. From these data, a protocol was proposed to collect facial surface electromyographic (sEMG) signals with a high signal-to-noise ratio. After collecting the signals, we looked for a method that would enable the visualization of this information and guarantee that the method used presented satisfactory results. After verifying the method's efficiency, we investigated which information could be extracted from the electromyographic signal representing the collected data. Since studies demonstrating which information could contribute to a better understanding of this pathology were not found in the literature, parameters of amplitude, frequency and entropy were extracted from the signal, and a feature selection was performed to look for the features that best distinguish a healthy individual from a pathological one. We then tried to identify the classifier that best discriminates individuals from different groups, as well as the set of parameters of this classifier that would bring the best outcome. It was found that the protocol proposed in this study and the adaptation with disposable electrodes available on the market proved effective and capable of being used in other studies that intend to collect data from facial electromyography. The feature selection algorithm also showed that not all of the features extracted from the signal are significant for data classification, with some being more relevant than others. The Support Vector Machine (SVM) classifier proved efficient when an adequate kernel function was used for the muscle from which information was to be extracted; each investigated muscle presented different results when the classifier used linear, radial and polynomial kernel functions. Even though we have focused on Hansen’s disease, the method applied here can be used to study facial electromyography in other pathologies.
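
To make the pipeline concrete, here is a small hedged sketch of the kind of processing described: amplitude (RMS), frequency (mean power frequency) and spectral entropy features computed per sEMG window, followed by SVM classifiers with linear, radial (RBF) and polynomial kernels. The synthetic data, window length and exact feature definitions are illustrative assumptions, not the study's protocol.

```python
# Sketch: amplitude, frequency and entropy features per sEMG window, then SVMs
# with linear, radial and polynomial kernels compared by cross-validation.
import numpy as np
from scipy.signal import welch
from scipy.stats import entropy
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def emg_features(window, fs=1000):
    rms = np.sqrt(np.mean(window**2))              # amplitude feature
    f, pxx = welch(window, fs=fs)
    mean_freq = np.sum(f * pxx) / np.sum(pxx)      # frequency feature
    spec_entropy = entropy(pxx / np.sum(pxx))      # entropy feature
    return [rms, mean_freq, spec_entropy]

# Synthetic stand-in data: two groups with different signal amplitudes.
rng = np.random.default_rng(0)
X = np.array([emg_features(rng.standard_normal(1000) * (1 + label))
              for label in (0, 1) for _ in range(40)])
y = np.repeat([0, 1], 40)

for kernel in ("linear", "rbf", "poly"):
    scores = cross_val_score(SVC(kernel=kernel), X, y, cv=5)
    print(kernel, scores.mean())
```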

Relevance: 80.00%

Publisher:

Abstract:

This paper presents a multiple-robot formation manoeuvring and collision avoidance strategy. The direction priority sequential selection algorithm is employed to obtain the raw path, and a new algorithm is then proposed to calculate the turning-compliant waypoints supporting the multi-robot formation manoeuvre. The collision avoidance strategy based on formation control is presented, translating the collision avoidance problem into the stability problem of the formation. The extension-decomposition-aggregation scheme is then applied to solve the formation control problem and thereby achieve collision avoidance during the formation manoeuvre. A simulation study finally shows that the collision avoidance problem can be conveniently solved if the stability of the constructed formation, including unidentified objects, can be ensured.
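
As a generic geometric illustration of turning-compliant waypoints (not the paper's algorithm), the sketch below trims each corner of a raw piecewise-linear path by a fillet sized to a minimum turning radius; the example path and radius are arbitrary.

```python
# Generic sketch: replace each interior corner of a raw path with two waypoints
# offset along the incoming and outgoing segments by r*tan(theta/2), so a robot
# with minimum turning radius r can follow the path. Not the paper's method.
import math

def fillet_waypoints(path, turn_radius):
    out = [path[0]]
    for i in range(1, len(path) - 1):
        (x0, y0), (x1, y1), (x2, y2) = path[i - 1], path[i], path[i + 1]
        a = math.atan2(y1 - y0, x1 - x0)                          # incoming heading
        b = math.atan2(y2 - y1, x2 - x1)                          # outgoing heading
        theta = abs(math.atan2(math.sin(b - a), math.cos(b - a)))  # turn angle
        d = turn_radius * math.tan(theta / 2.0)                    # trim distance
        out.append((x1 - d * math.cos(a), y1 - d * math.sin(a)))   # turn entry point
        out.append((x1 + d * math.cos(b), y1 + d * math.sin(b)))   # turn exit point
    out.append(path[-1])
    return out

print(fillet_waypoints([(0, 0), (10, 0), (10, 10)], turn_radius=2.0))
```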

Relevance: 80.00%

Publisher:

Abstract:

Artificial immune systems, more specifically the negative selection algorithm, have previously been applied to intrusion detection. The aim of this research is to develop an intrusion detection system based on a novel concept in immunology, the Danger Theory. Dendritic Cells (DCs) are antigen presenting cells and key to the activation of the human immune system. DCs perform the vital role of combining signals from the host tissue and correlate these signals with proteins known as antigens. In algorithmic terms, individual DCs perform multi-sensor data fusion based on time-windows. The whole population of DCs asynchronously correlates the fused signals with a secondary data stream. The behaviour of human DCs is abstracted to form the DC Algorithm (DCA), which is implemented using an immune inspired framework, libtissue. This system is used to detect context switching for a basic machine learning dataset and to detect outgoing portscans in real-time. Experimental results show a significant difference between an outgoing portscan and normal traffic.
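
For orientation, the following is a heavily simplified, illustrative take on a DCA-style detector rather than the libtissue implementation described above: each cell fuses PAMP, danger and safe signals over the antigens it has sampled, migrates once its costimulation crosses a threshold, and presents those antigens in a mature (anomalous) or semi-mature (normal) context, yielding an MCAV-like score per antigen. The signal weights and thresholds are assumptions.

```python
# Simplified DCA-style signal fusion; weights and thresholds are illustrative.
import random
from collections import defaultdict

WEIGHTS = {"csm": (2, 1, 2), "k": (2, 1, -3)}  # weights for (pamp, danger, safe)

class DendriticCell:
    def __init__(self, threshold):
        self.threshold = threshold
        self.csm = 0.0   # costimulation: decides when the cell migrates
        self.k = 0.0     # context: > 0 leans anomalous, < 0 leans normal
        self.antigens = []

    def sample(self, antigen, pamp, danger, safe):
        self.antigens.append(antigen)
        signals = (pamp, danger, safe)
        self.csm += sum(w * s for w, s in zip(WEIGHTS["csm"], signals))
        self.k += sum(w * s for w, s in zip(WEIGHTS["k"], signals))
        return self.csm >= self.threshold   # time to migrate and present

def run(events, n_cells=10, seed=0):
    rng = random.Random(seed)
    cells = [DendriticCell(rng.uniform(5, 15)) for _ in range(n_cells)]
    mature, presented = defaultdict(int), defaultdict(int)
    for antigen, pamp, danger, safe in events:
        i = rng.randrange(n_cells)
        if cells[i].sample(antigen, pamp, danger, safe):
            for a in cells[i].antigens:
                presented[a] += 1
                mature[a] += int(cells[i].k > 0)
            cells[i] = DendriticCell(rng.uniform(5, 15))   # fresh cell replaces the migrant
    return {a: mature[a] / presented[a] for a in presented}  # MCAV-like anomaly score

# Toy stream: mostly safe events for one process, then danger-heavy events for a scan.
events = [("proc_42", 0.0, 0.1, 0.9)] * 30 + [("scan_7", 0.8, 0.9, 0.0)] * 30
print(run(events))
```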

Relevance: 80.00%

Publisher:

Abstract:

This paper presents a methodology for short-term load forecasting based on genetic algorithm feature selection and artificial neural network modeling. A feed-forward artificial neural network is used to model the 24-h-ahead load based on past consumption, weather and stock index data. A genetic algorithm is used to find the best subset of variables for modeling. Three data sets from different geographical locations, encompassing areas of different dimensions with distinct load profiles, are used to evaluate the methodology. The developed approach was found to generate models achieving a minimum mean average percentage error under 2%. The feature selection algorithm was able to significantly reduce the number of used features and increase the accuracy of the models.
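
The sketch below illustrates the general wrapper idea: a genetic algorithm searching over binary feature masks, with a small feed-forward network scored by cross-validation as the fitness. The population size, rates, network layout and scoring metric are illustrative assumptions, not the paper's configuration.

```python
# Sketch of GA-based feature selection wrapped around a feed-forward network.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

def fitness(mask, X, y):
    """Cross-validated score of a small feed-forward net on the masked features."""
    if not mask.any():
        return -np.inf
    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=400, random_state=0)
    return cross_val_score(model, X[:, mask], y, cv=3,
                           scoring="neg_mean_absolute_percentage_error").mean()

def ga_select(X, y, pop=20, gens=10, p_mut=0.05, seed=0):
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    population = rng.random((pop, n)) < 0.5                     # binary feature masks
    for _ in range(gens):
        scores = np.array([fitness(m, X, y) for m in population])
        parents = population[np.argsort(scores)[-pop // 2:]]    # keep the better half
        children = []
        while len(children) < pop:
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n)
            child = np.concatenate([a[:cut], b[cut:]])          # one-point crossover
            child = child ^ (rng.random(n) < p_mut)             # bit-flip mutation
            children.append(child)
        population = np.array(children)
    best = max(population, key=lambda m: fitness(m, X, y))
    return np.flatnonzero(best)                                 # indices of selected features
```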

Relevance: 40.00%

Publisher:

Abstract:

The process of resources systems selection plays an important part in the integration of Distributed/Agile/Virtual Enterprises (D/A/VEs). However, resources systems selection is still a difficult problem to solve in a D/A/VE, as is pointed out in this paper. Broadly speaking, the selection problem has been approached from different perspectives, giving rise to different kinds of models and algorithms to solve it. To assist the development of an intelligent and flexible web prototype (broker) tool that integrates all the selection model activities and tools, and that can adapt to each D/A/VE project or instance (the major goal of our overall project), in this paper we present a formulation of one kind of resources selection problem and the limitations of the algorithms proposed to solve it. We formulate a particular case of the problem as an integer program, which is solved using simplex and branch-and-bound algorithms, and identify their performance limitations (in terms of processing time) based on simulation results. These limitations depend on the number of processing tasks and on the number of pre-selected resources per processing task, defining the domain of applicability of the algorithms for the problem studied. The limitations detected point to the need for other kinds of algorithms (approximate solution algorithms) outside the domain of applicability found for the algorithms simulated. However, for a broker tool, knowledge of the algorithms' limitations is very important in order to develop and select, based on problem features, the most suitable algorithm that guarantees good performance.
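
As a small illustration of the kind of integer-programming formulation discussed (not the paper's instance), the sketch below assigns exactly one pre-selected resource to each processing task at minimum total cost and hands the model to an off-the-shelf branch-and-bound MILP solver; the cost matrix and problem size are made up.

```python
# Toy resources-selection ILP: one pre-selected resource per processing task,
# minimum total cost, solved with SciPy's branch-and-bound MILP interface.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

rng = np.random.default_rng(0)
n_tasks, n_candidates = 6, 4                     # pre-selected resources per processing task
cost = rng.integers(1, 20, size=(n_tasks, n_candidates)).astype(float)

c = cost.ravel()                                 # decision vector x[i, j], flattened row-wise
# Each processing task must be assigned exactly one of its pre-selected resources.
A = np.zeros((n_tasks, n_tasks * n_candidates))
for i in range(n_tasks):
    A[i, i * n_candidates:(i + 1) * n_candidates] = 1.0
assignment = LinearConstraint(A, lb=1, ub=1)

result = milp(c, constraints=[assignment],
              integrality=np.ones_like(c),       # all variables integer (binary with the bounds)
              bounds=Bounds(0, 1))
selection = result.x.reshape(n_tasks, n_candidates).argmax(axis=1)
print(result.fun, selection)
```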