927 results for Negative Selection Algorithm
Abstract:
With the rapid growth of the Internet, computer attacks are increasing at a fast pace and can easily cause millions of dollars in damage to an organization. Detecting these attacks is an important issue in computer security. There are many types of attacks, and they fall into four main categories: Denial of Service (DoS) attacks, Probe attacks, User to Root (U2R) attacks, and Remote to Local (R2L) attacks. Of these, DoS and Probe attacks appear with high frequency over short periods when they strike a system; they differ from normal traffic data and can easily be separated from normal activities. By contrast, U2R and R2L attacks are embedded in the data portions of packets and normally involve only a single connection, making it difficult to achieve satisfactory detection accuracy for these two attack types. We therefore focus on the ambiguity problem between normal activities and U2R/R2L attacks, with the goal of building a detection system that can detect these two attacks accurately and quickly. In this dissertation, we design a two-phase intrusion detection approach. In the first phase, a correlation-based feature selection algorithm is proposed to speed up detection. Features with poor ability to predict attack signatures, and features inter-correlated with one or more other features, are considered redundant; such features are removed so that only the indispensable information from the original feature space remains. In the second phase, we develop an ensemble intrusion detection system to achieve accurate detection performance. The proposed method combines multiple feature-selecting intrusion detectors with a data mining intrusion detector. The former consist of a set of detectors, each of which uses a fuzzy clustering technique and belief theory to resolve the ambiguity problem.
The latter applies a data mining technique to automatically extract computer users’ normal behavior from training network traffic data. The final decision combines the outputs of the feature-selecting and data mining detectors. The experimental results indicate that our ensemble approach not only significantly reduces detection time but also effectively detects U2R and R2L attacks that contain degrees of ambiguous information.
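The first-phase idea (drop features that predict the attack signatures poorly, and drop features inter-correlated with features already kept) can be sketched as follows. This is a minimal illustration, not the dissertation's exact algorithm: the Pearson measure, the two thresholds and the greedy relevance ordering are all assumptions.

```python
from math import sqrt

def pearson(a, b):
    """Pearson correlation of two equal-length value lists (0.0 if degenerate)."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sqrt(sum((x - ma) ** 2 for x in a))
    vb = sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb) if va and vb else 0.0

def select_features(columns, labels, relevance_min=0.1, redundancy_max=0.9):
    """columns: list of feature columns (each a list of sample values).

    Keeps features correlated with the label but not with each other.
    Thresholds are illustrative.
    """
    relevance = [abs(pearson(col, labels)) for col in columns]
    order = sorted(range(len(columns)), key=lambda j: -relevance[j])
    selected = []
    for j in order:
        if relevance[j] < relevance_min:
            continue  # weak predictor of the attack signature: treated as redundant
        # Drop features highly inter-correlated with one already kept.
        if all(abs(pearson(columns[j], columns[k])) < redundancy_max for k in selected):
            selected.append(j)
    return sorted(selected)
```

On a toy data set where one feature duplicates another and a third is uncorrelated with the label, only one informative column survives, which is the "indispensable information" effect the abstract describes.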
Abstract:
The ability to use Software Defined Radio (SDR) in civilian mobile applications will make it possible for the next generation of mobile devices to handle multi-standard personal wireless devices and ubiquitous wireless devices. The original military standard created many beneficial characteristics for SDR, but resulted in a number of disadvantages as well. Many challenges in commercializing SDR are still of interest to the software radio research community; four main issues that have already been addressed are performance, size, weight, and power. This investigation presents an in-depth study of SDR inter-component communications, in terms of total link delay as a function of the number of components and packet sizes, in systems based on the Software Communication Architecture (SCA). The study is based on the investigation of a controlled-environment platform. Results suggest that the total link delay does not increase linearly with the number of components or the packet size. A closed-form expression for the delay was modeled using a logistic function of the number of components and packet sizes, and the model performed well when the number of components was large. For mobile applications, energy consumption has become one of the most crucial limitations. SDR will not only provide the flexibility of multi-protocol support, but this desirable feature will also bring a choice of mobile protocols; having such a variety of choices available creates the problem of selecting the most appropriate protocol for transmission. An investigation of a real-time algorithm to optimize energy efficiency was also performed: communication energy models, including switching estimation, were used to develop a waveform selection algorithm, and simulations were performed to validate the concept.
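A logistic delay model of the kind described can be sketched as below. The parameter values, and the way component count and packet size are folded into a single load variable, are hypothetical; they serve only to show the saturating (non-linear) shape the abstract reports, where the fit is best at large component counts.

```python
from math import exp

def logistic_delay(n_components, packet_kb, d_max=5.0, k=0.6, x0=8.0):
    """Illustrative logistic fit for total link delay (ms).

    d_max: saturation delay; k: steepness; x0: load at the inflection point.
    The combined load x is a hypothetical weighting of the two predictors.
    """
    x = n_components + 0.5 * packet_kb
    return d_max / (1.0 + exp(-k * (x - x0)))
```

The curve rises with both predictors but flattens toward d_max, matching the finding that delay does not grow linearly with the number of components and packet size.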
Abstract:
The umbu tree (Spondias tuberosa Arruda) is a fruit tree native to northeastern Brazil, with great economic, social and ecological importance for the northeastern semi-arid region. Despite this role, the umbu tree has come under negative pressure due to disorderly extractivism and to negative selection of its fruits, which, together with deforestation and seed dormancy, contribute to a year-on-year decrease in production; studies that contribute to the improvement and conservation of this species are therefore needed. Given the risks to the conservation of the species and its usefulness to the population, combining plant biotechnology, as a tool that can be used to increase production, with the perception of the gathering communities, by valuing the population's point of view and knowledge, can facilitate its conservation. This work aimed to develop propagation methods for the umbu tree and to contribute to its conservation through biotechnology, with the specific objectives of contributing to the conservation of this species; determining concentrations of BAP and NAA for bud formation; testing the efficiency of different substrates and concentrations of gibberellic acid on germination in vitro and ex vitro; and capturing the perception of families in communities that gather umbu. To study germination, seeds were inoculated in different substrates (vermiculite, vermiculite + clay, clay, clay + manure, and manure + vermiculite) and at different concentrations of gibberellic acid (0 mg, 250 mg and 500 mg). For bud formation, BAP at 0.1 mg.L-1 was combined with different concentrations of NAA (0.2, 0.4 and 0.8 mg.L-1). The perception study was conducted by applying a semi-structured questionnaire in the Malhada Vermelha community. The experiments showed vermiculite and a gibberellic acid concentration of 500 mg to be the best conditions for germination.
The combination of 0.1 mg.L-1 BAP with 0.2 mg.L-1 NAA provided the best bud formation. The questionnaires revealed that the population is aware of the decrease in the number of umbu plants and fruits in the region, and is concerned about their conservation.
Abstract:
A number of studies in Biomedical Engineering and the Health Sciences have employed machine learning tools to develop methods capable of identifying patterns in different data sets. Despite its elimination in many countries of the developed world, Hansen’s disease still affects a large part of the population in countries such as India and Brazil. In this context, this research proposes to develop a method that will make it possible to understand how Hansen’s disease affects the facial muscles. Using surface electromyography, a system was adapted to capture signals from the largest possible number of facial muscles. We first surveyed the literature to learn how researchers around the globe have been studying diseases that affect the peripheral nervous system and how electromyography has contributed to the understanding of these diseases. From these data, a protocol was proposed for collecting facial surface electromyographic (sEMG) signals with a high signal-to-noise ratio. After collecting the signals, we looked for a method that would enable visualization of this information and verified that it produced satisfactory results. After verifying the method's efficiency, we investigated which information could be extracted from the electromyographic signal representing the collected data. Since no studies demonstrating which information could contribute to a better understanding of this pathology were found in the literature, amplitude, frequency and entropy parameters were extracted from the signal, and feature selection was performed to find the features that best distinguish a healthy individual from a pathological one.
Next, we sought to identify the classifier that best discriminates individuals from different groups, and the set of parameters of this classifier that yields the best outcome. The protocol proposed in this study, together with the adaptation using disposable electrodes available on the market, proved effective and suitable for use in other studies collecting facial electromyography data. The feature selection algorithm also showed that not all features extracted from the signal are significant for data classification, some being more relevant than others. The Support Vector Machine (SVM) classifier proved efficient when an appropriate kernel function was matched to the muscle from which information was to be extracted: each investigated muscle gave different results with linear, radial and polynomial kernel functions. Although we focused on Hansen’s disease, the method applied here can be used to study facial electromyography in other pathologies.
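The per-muscle kernel comparison the abstract describes could be set up as in the sketch below. The scikit-learn API calls are real, but the data is synthetic, standing in for the sEMG amplitude/frequency/entropy features of one muscle; the cross-validation setup is an assumption.

```python
# Compare linear, radial (RBF) and polynomial SVM kernels on one feature set,
# as would be done separately for each investigated muscle.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for (samples x extracted sEMG features) of one muscle.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

for kernel in ("linear", "rbf", "poly"):
    clf = make_pipeline(StandardScaler(), SVC(kernel=kernel))
    scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validated accuracy
    print(kernel, round(scores.mean(), 3))
```

Running this loop per muscle, and keeping the kernel with the best cross-validated score, mirrors the finding that different muscles favor different kernel functions.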
Abstract:
Cancer comprises a collection of diseases, all of which begin with abnormal tissue growth triggered by various stimuli, including (but not limited to) heredity, genetic mutation, exposure to harmful substances, radiation, poor diet and lack of exercise. Early detection of cancer is vital to providing life-saving therapeutic intervention. However, current detection methods (e.g., tissue biopsy, endoscopy and medical imaging) often suffer from low patient compliance and an elevated risk of complications in elderly patients. As such, many are looking to “liquid biopsies” for clues to the presence and status of cancer, owing to their minimal invasiveness and ability to provide rich information about the native tumor. In such liquid biopsies, peripheral blood is drawn from patients and screened for key biomarkers, chiefly circulating tumor cells (CTCs). Capturing, enumerating and analyzing the genetic and metabolomic characteristics of these CTCs may hold the key to guiding doctors toward understanding the source of cancer at an earlier stage, for more efficacious disease management.
The isolation of CTCs from whole blood, however, remains a significant challenge due to their (i) low abundance, (ii) lack of a universal surface marker and (iii) epithelial-mesenchymal transition, which down-regulates common surface markers (e.g., EpCAM) and reduces their likelihood of detection via positive selection assays. These factors underscore the need for an improved cell isolation strategy that can collect CTCs via both positive and negative selection modalities, avoiding reliance on a single marker, or set of markers, for more accurate enumeration and diagnosis.
The technologies proposed herein offer a unique set of strategies to focus, sort and template cells in three independent microfluidic modules. The first module exploits ultrasonic standing waves and a class of elastomeric particles for the rapid and discriminate sequestration of cells. This type of cell handling holds promise not only in sorting, but also in the isolation of soluble markers from biofluids. The second module contains components to focus (i.e., arrange) cells via forces from acoustic standing waves and separate cells in a high-throughput fashion via free-flow magnetophoresis. The third module uses a printed array of micromagnets to capture magnetically labeled cells into well-defined compartments, enabling on-chip staining and single-cell analysis. These technologies can operate in standalone formats, or can be adapted to operate with established analytical technologies, such as flow cytometry. A key advantage of these innovations is their ability to process erythrocyte-lysed blood in a rapid (and thus high-throughput) fashion. They can process fluids at a variety of concentrations and flow rates, target cells with various immunophenotypes and sort cells via positive (and potentially negative) selection. These technologies are chip-based and fabricated using standard clean room equipment, pointing toward a disposable clinical tool. With further optimization in design and performance, these technologies might aid in the early detection, and potentially treatment, of cancer and various other physical ailments.
Abstract:
Breast cancer is the most frequently diagnosed cancer in women, accounting for over 25% of cancer diagnoses and 13% of cancer-related deaths in Canadian women. There are many types of therapies for the treatment or management of breast cancer, with chemotherapy being one of the most widely used. Taxol (paclitaxel) is one of the most extensively used chemotherapeutic agents for treating cancers of the breast and numerous other sites. Taxol stabilizes microtubules during mitosis, arresting the cell cycle until the cell eventually undergoes apoptosis. Although Taxol has had significant benefits in many patients, response rates range from only 25-69%, and over half of Taxol-treated patients eventually acquire resistance to the drug. Drug resistance remains one of the greatest barriers to effective cancer treatment, yet little has been discerned about resistance to Taxol despite its widespread clinical use. Kinases are known to be heavily involved in cancer development and progression, and several kinases have been linked to resistance to Taxol and other chemotherapeutic agents. However, a systematic screen for kinases regulating Taxol resistance is lacking. Thus, in this study, a set of kinome-wide screens was conducted to interrogate the involvement of kinases in the Taxol response. Positive-selection and negative-selection CRISPR-Cas9 screens were conducted in which a pooled library of 5070 sgRNAs targeting 507 kinase-encoding genes was introduced into Taxol-sensitive (WT) or Taxol-resistant (TxR) MCF-7 breast cancer cells, which were then treated with Taxol. Next generation sequencing (NGS) was performed on cells that survived Taxol treatment, allowing identification and quantitation of sgRNAs. STK38, Blk, FASTK and Nek3 stand out as kinases potentially critical for Taxol-induced apoptosis. Furthermore, the kinases CDKL1 and FRK may have a role in Taxol resistance.
Further validation of these candidate kinases will provide novel pre-clinical data about potential predictive biomarkers or therapeutic targets for breast cancer patients in the future.
Abstract:
This paper presents a formation manoeuvring and collision avoidance strategy for multiple robots. The direction priority sequential selection algorithm is employed to obtain the raw path, and a new algorithm is then proposed to calculate turning-compliant waypoints supporting the multi-robot formation manoeuvre. The collision avoidance strategy, based on formation control, translates the collision avoidance problem into a stability problem for the formation. The extension-decomposition-aggregation scheme is then applied to solve the formation control problem and thereby achieve collision avoidance during the formation manoeuvre. A simulation study finally shows that the collision avoidance problem can be conveniently solved if the stability of the constructed formation, including unidentified objects, can be guaranteed.
Abstract:
Susceptibility to autoimmune diseases results from the encounter of a complex and long-evolved genetic context with a no less complex and changing environment. Major actors in maintaining health are regulatory T cells (Treg), which primarily dampen the large subset of autoreactive lymphocytes escaping thymic negative selection. Here, we asked directly whether Treg participate in defining susceptibility and resistance to Experimental Autoimmune Prostatitis (EAP). We analyzed three common laboratory strains of mice presenting different susceptibility to autoimmune prostatitis upon immunization with prostate proteins: NOD, C57BL/6 and BALB/c, which can be ranked along a disease score from severe through mild to undetectable, respectively. Upon mild and transient depletion of Treg at the induction phase of EAP, each model moved up this score, most remarkably the BALB/c mice, which switched from a resistant to a susceptible phenotype. We further show that disease is associated with the upregulation of CXCR3 expression on effector T cells, a process requiring IFNγ. Together with recent advances on environmental factors affecting Treg, these findings provide a likely cellular and molecular explanation for the recent rise in the incidence of autoimmune diseases.
Abstract:
Chromosomal translocations of the MLL gene are known to lead to the development of acute leukemias. Translocation with one of its most common fusion partners, ENL, can give rise to acute leukemias of several different types from this same translocation. Once leukemogenesis has been initiated by the MLL-ENL fusion, its role in maintaining the leukemic phenotype is not yet well understood. To better understand the importance of MLL-ENL after leukemogenesis, purified human umbilical cord blood stem/progenitor cells were transduced with a virus expressing the MLL-ENL fusion gene flanked by LoxP sites, together with the eGFP marker. These infected cells were then injected into our irradiated immunodeficient mouse model and observed for 24 weeks for the development of acute leukemias. The mice were then sacrificed, and bone marrow and spleen cells were analyzed by flow cytometry to determine whether the xenograft had generated a leukemia in our model and, if so, its phenotype. Mice injected with the MLL-ENL-infected cells generated exclusively B-cell acute lymphoblastic leukemias. Cells isolated from these primary leukemias were subsequently infected with a lentivirus expressing Cre recombinase and the BFP marker, in order to excise the MLL-ENL gene from the leukemic cells via the LoxP sites. The cells were then sorted for the BFP marker and injected into secondary mice to see whether the leukemic stem cells could still regenerate the leukemia. The consequences of the absence of MLL-ENL for the maintenance of the leukemic phenotype could not be verified, however, because of an error in the Cre recombinase sequence, but we did observe the regeneration of secondary leukemias.
Abstract:
This paper presents a methodology for short-term load forecasting based on genetic algorithm feature selection and artificial neural network modeling. A feed-forward artificial neural network is used to model the 24-h-ahead load based on past consumption, weather and stock index data. A genetic algorithm is used to find the best subset of variables for modeling. Three data sets from different geographical locations, encompassing areas of different dimensions with distinct load profiles, are used to evaluate the methodology. The developed approach was found to generate models achieving a mean absolute percentage error under 2%. The feature selection algorithm was able to significantly reduce the number of features used and increase the accuracy of the models.
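A genetic algorithm over binary feature masks, of the kind the paper uses, might look like the sketch below. The fitness function is a stand-in: in the paper it would be the (negated) validation error of the neural network trained on the selected subset. The constants, the set of "useful" features and the truncation-selection scheme are all illustrative assumptions.

```python
import random

N_FEATURES = 12
USEFUL = {0, 3, 7}  # hypothetical informative features

def fitness(mask):
    # Reward covering useful features; penalise subset size (parsimony).
    hits = sum(1 for j in USEFUL if mask[j])
    return 10 * hits - sum(mask)

def evolve(pop_size=30, generations=40, p_mut=0.05, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(N_FEATURES)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # truncation selection (keeps the best)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, N_FEATURES)  # one-point crossover
            child = a[:cut] + b[cut:]
            child = [g ^ (rng.random() < p_mut) for g in child]  # bit-flip mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)
```

Because the top half of each generation is carried over unchanged, the best fitness never decreases, and the surviving masks converge toward small subsets covering the informative features, which is the size-reduction-with-accuracy-gain behavior the abstract reports.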
Abstract:
The process of resources systems selection plays an important part in Distributed/Agile/Virtual Enterprise (D/A/VE) integration. However, resources systems selection remains a difficult problem to solve in a D/A/VE, as this paper points out. Globally, the selection problem has been approached from different angles, giving rise to different kinds of models/algorithms to solve it. To support the development of an intelligent and flexible web prototype tool (broker tool) that integrates all the selection model activities and tools, and that can adapt to each D/A/VE project or instance (the major goal of our final project), this paper presents a formulation of one kind of resources selection problem and the limitations of the algorithms proposed to solve it. We formulate a particular case of the problem as an integer program, which is solved using simplex and branch-and-bound algorithms, and identify their performance limitations (in terms of processing time) based on simulation results. These limitations depend on the number of processing tasks and on the number of pre-selected resources per processing task, defining the domain of applicability of the algorithms for the problem studied. The limitations detected point to the need for other kinds of algorithms (approximate solution algorithms) outside the domain of applicability found for the algorithms simulated. For a broker tool, however, knowledge of the algorithms' limitations is very important, so that, based on the problem's features, the most suitable algorithm can be developed and selected to guarantee good performance.
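A toy version of the selection problem (choose one resource per processing task to minimise total cost) solved by depth-first branch-and-bound illustrates why processing time grows with the number of tasks and of pre-selected resources per task: the search tree has one branch per pre-selected resource at each task level. The costs are hypothetical, and real formulations add coupling constraints that make the greedy bound non-trivial; this sketch only shows the mechanics.

```python
def branch_and_bound(costs):
    """costs[t][r] = cost of pre-selected resource r for task t.

    Returns (best_cost, choice) where choice[t] is the resource picked for task t.
    """
    n = len(costs)
    # Lower bound: sum of the cheapest remaining resource for each unassigned task.
    suffix_min = [0.0] * (n + 1)
    for t in range(n - 1, -1, -1):
        suffix_min[t] = suffix_min[t + 1] + min(costs[t])
    best = [float("inf"), None]

    def dfs(t, cost_so_far, choice):
        if cost_so_far + suffix_min[t] >= best[0]:
            return  # prune: this branch cannot beat the incumbent
        if t == n:
            best[0], best[1] = cost_so_far, choice[:]
            return
        for r, c in enumerate(costs[t]):
            choice.append(r)
            dfs(t + 1, cost_so_far + c, choice)
            choice.pop()

    dfs(0, 0.0, [])
    return best[0], best[1]
```

Even with pruning, the worst case enumerates every resource at every task, so run time can grow with (resources per task) to the power of (tasks), which is the kind of processing-time limitation the simulations expose and the motivation for approximate algorithms outside that domain.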
Abstract:
Lynch syndrome is one of the most common hereditary colorectal cancer (CRC) syndromes and is caused by germline mutations of the MLH1 and MSH2 genes and, more rarely, of the MSH6, PMS2 and MLH3 genes. Whereas the absence of the MSH2 protein is predictive of Lynch syndrome, this is not the case for the absence of the MLH1 protein. The purpose of this study was to develop a sensitive and cost-effective algorithm to select Lynch syndrome cases among patients with MLH1 immunohistochemical silencing. Eleven sporadic CRC and 16 Lynch syndrome cases with MLH1 protein abnormalities were selected. The BRAF c.1799T>A mutation (p.Val600Glu) was analyzed by direct sequencing after PCR amplification of exon 15. Methylation of the MLH1 promoter was determined by Methylation-Sensitive Single-Strand Conformation Analysis. In patients with Lynch syndrome, there was no BRAF mutation, and only one case showed MLH1 methylation (6%). In sporadic CRC, all cases were MLH1 methylated (100%); 8 of 11 cases carried the above BRAF mutation (73%), whereas only 3 cases were BRAF wild type (27%). We propose the following algorithm: (1) no further molecular analysis should be performed for CRC exhibiting MLH1 methylation and the BRAF mutation, and these cases should be considered sporadic CRC; (2) CRC with unmethylated MLH1 and negative for the BRAF mutation should be considered Lynch syndrome; and (3) only a small fraction of CRC with MLH1 promoter methylation but negative for the BRAF mutation should be true Lynch syndrome cases. These potentially Lynch syndrome patients should be offered genetic counselling before searching for MLH1 gene mutations.
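The proposed three-branch algorithm can be written directly as a decision function over the two molecular test results. The return labels paraphrase the abstract; the fourth branch (unmethylated MLH1 with the BRAF mutation), which was not observed in this series, is an added guard, not part of the published algorithm.

```python
def classify_mlh1_deficient_crc(mlh1_methylated: bool, braf_v600e: bool) -> str:
    """Triage a CRC case with MLH1 immunohistochemical silencing.

    Mirrors branches (1)-(3) of the proposed algorithm.
    """
    if mlh1_methylated and braf_v600e:
        # (1) No further molecular analysis needed.
        return "sporadic CRC"
    if not mlh1_methylated and not braf_v600e:
        # (2) Consider Lynch syndrome.
        return "Lynch syndrome: offer counselling and MLH1 mutation search"
    if mlh1_methylated and not braf_v600e:
        # (3) Only a small fraction will be true Lynch syndrome.
        return "possible Lynch syndrome: counselling before MLH1 testing"
    # Unmethylated MLH1 with BRAF mutation was not observed in this series;
    # flag such cases for individual review (added guard, not from the paper).
    return "unclassified: review individually"
```

Encoding the triage this way makes the cost saving explicit: only the methylated/BRAF-negative branch proceeds to the expensive germline MLH1 mutation search after counselling.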