929 results for Lattice theory - Computer programs


Relevance:

30.00%

Publisher:

Abstract:

We present ideas about creating a next-generation Intrusion Detection System (IDS) based on the latest immunological theories. The central challenge in computer security is distinguishing between normal and potentially harmful activity. For half a century, developers have protected their systems by coding rules that identify and block specific events. However, the nature of current and future threats, in conjunction with ever larger IT systems, urgently requires the development of automated and adaptive defensive tools. A promising solution is emerging in the form of Artificial Immune Systems (AIS): the Human Immune System (HIS) can detect and defend against harmful and previously unseen invaders, so can we not build a similar IDS for our computers? Presumably, such systems would then have the same beneficial properties as the HIS, such as error tolerance, adaptation and self-monitoring. Current AIS have been successful on test systems, but the algorithms rely on self-nonself discrimination, as stipulated in classical immunology. However, immunologists are increasingly finding fault with traditional self-nonself thinking, and a new 'Danger Theory' (DT) is emerging. This new theory suggests that the immune system reacts to threats based on the correlation of various (danger) signals, and it provides a method of 'grounding' the immune response, i.e. linking it directly to the attacker. Little is currently understood of the precise nature and correlation of these signals, and the theory is a topic of hot debate. The aim of this research is to investigate this correlation and to translate the DT into the realm of computer security, thereby creating AIS that are no longer limited by self-nonself discrimination. It should be noted that we do not intend to defend this controversial theory per se, although as a deliverable this project will add to the body of knowledge in this area. Rather, we are interested in its merits for scaling up AIS applications by overcoming self-nonself discrimination problems.
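
As a toy illustration of the danger-signal idea described above (not the project's own algorithm), the sketch below combines several host-level signals into a weighted 'danger score' per source within a sliding time window and raises an alert only when correlated signals exceed a threshold. The signal names, weights, window length and threshold are all hypothetical assumptions chosen purely for illustration.

```python
from dataclasses import dataclass

# Hypothetical danger signals and weights; real AIS work would derive these
# from immunological analogies and empirical tuning, not fixed constants.
SIGNAL_WEIGHTS = {
    "cpu_spike": 0.3,
    "unexpected_outbound_conn": 0.5,
    "memory_fault": 0.4,
}
ALERT_THRESHOLD = 0.8  # assumed threshold for raising an alert


@dataclass
class Event:
    time: float   # seconds since start of observation
    signal: str   # one of the keys in SIGNAL_WEIGHTS
    source: str   # process or host that emitted the signal


def danger_score(events, window_start, window_end):
    """Sum weighted danger signals observed inside a time window."""
    return sum(
        SIGNAL_WEIGHTS.get(e.signal, 0.0)
        for e in events
        if window_start <= e.time < window_end
    )


def correlate(events, window=10.0):
    """Slide a fixed window over the trace and flag windows whose combined
    danger score crosses the threshold, grouped by source so the response
    stays 'grounded' to a candidate attacker."""
    alerts = []
    if not events:
        return alerts
    t_end = max(e.time for e in events)
    t = 0.0
    while t <= t_end:
        by_source = {}
        for e in events:
            if t <= e.time < t + window:
                by_source.setdefault(e.source, []).append(e)
        for source, evs in by_source.items():
            score = danger_score(evs, t, t + window)
            if score >= ALERT_THRESHOLD:
                alerts.append((t, source, round(score, 2)))
        t += window
    return alerts


if __name__ == "__main__":
    trace = [
        Event(1.0, "cpu_spike", "proc_42"),
        Event(2.5, "unexpected_outbound_conn", "proc_42"),
        Event(3.0, "memory_fault", "proc_42"),
        Event(15.0, "cpu_spike", "proc_7"),
    ]
    print(correlate(trace))  # only proc_42's correlated burst should alert
```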

Relevance:

30.00%

Publisher:

Abstract:

A poster of this paper will be presented at the 25th International Conference on Parallel Architecture and Compilation Technology (PACT ’16), September 11-15, 2016, Haifa, Israel.

Relevance:

30.00%

Publisher:

Abstract:

A primary goal of this dissertation is to understand the links between mathematical models that describe crystal surfaces at three fundamental length scales: the scale of individual atoms, the scale of collections of atoms forming crystal defects, and the macroscopic scale. Characterizing connections between different classes of models is a critical task for gaining insight into the physics they describe, a long-standing objective in applied analysis, and also highly relevant in engineering applications. The key concept I use in each problem addressed in this thesis is coarse graining, which is a strategy for connecting fine representations or models with coarser representations. Often this idea is invoked to reduce a large discrete system to an appropriate continuum description, e.g. individual particles are represented by a continuous density. While there is no general theory of coarse graining, one closely related mathematical approach is asymptotic analysis, i.e. the description of limiting behavior as some parameter becomes very large or very small. In the case of crystalline solids, it is natural to consider cases where the number of particles is large or where the lattice spacing is small. Limits such as these often make explicit the nature of links between models capturing different scales, and, once established, provide a means of improving our understanding, or the models themselves. Finding appropriate variables whose limits illustrate the important connections between models is no easy task, however. This is one area where computer simulation is extremely helpful, as it allows us to see the results of complex dynamics and gather clues regarding the roles of different physical quantities. On the other hand, connections between models enable the development of novel multiscale computational schemes, so understanding can assist computation and vice versa. Some of these ideas are demonstrated in this thesis. The important outcomes of this thesis include: (1) a systematic derivation of the step-flow model of Burton, Cabrera, and Frank, with corrections, from an atomistic solid-on-solid-type model in 1+1 dimensions; (2) the inclusion of an atomistically motivated transport mechanism in an island dynamics model, allowing for a more detailed account of mound evolution; and (3) the development of a hybrid discrete-continuum scheme for simulating the relaxation of a faceted crystal mound. Central to all of these modeling and simulation efforts is the presence of steps composed of individual layers of atoms on vicinal crystal surfaces. Consequently, a recurring theme in this research is the observation that mesoscale defects play a crucial role in crystal morphological evolution.
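
As a rough picture of the kind of atomistic starting point mentioned in outcome (1), the sketch below evolves a minimal 1+1-dimensional solid-on-solid surface with Metropolis hop dynamics; the bond energy, temperature and lattice size are arbitrary assumptions for illustration, and this is not the model analyzed in the dissertation.

```python
import math
import random

# Minimal 1+1D solid-on-solid (SOS) surface: integer column heights h[i],
# energy E = J * sum_i |h[i+1] - h[i]|, evolved by Metropolis hops of
# single atoms between neighboring columns.  J, T and the lattice size
# are illustrative assumptions only.
J = 1.0    # bond energy (arbitrary units)
T = 0.8    # temperature in units of J / k_B
N = 64     # number of columns (periodic boundary conditions)


def surface_energy(h):
    return J * sum(abs(h[(k + 1) % N] - h[k]) for k in range(N))


def metropolis_step(h):
    """Attempt to move one atom from a random column to a neighbor."""
    i = random.randrange(N)
    j = (i + random.choice((-1, 1))) % N
    before = surface_energy(h)
    h[i] -= 1
    h[j] += 1
    dE = surface_energy(h) - before
    if dE > 0 and random.random() >= math.exp(-dE / T):
        h[i] += 1          # reject: undo the move
        h[j] -= 1


def roughness(h):
    mean = sum(h) / N
    return (sum((x - mean) ** 2 for x in h) / N) ** 0.5


if __name__ == "__main__":
    random.seed(0)
    h = [0] * N                      # flat initial surface
    for sweep in range(500):
        for _ in range(N):
            metropolis_step(h)
    print("surface width after relaxation:", round(roughness(h), 3))
```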

Relevance:

30.00%

Publisher:

Abstract:

Metamaterials are 1D, 2D or 3D arrays of artificial atoms. The artificial atoms, called "meta-atoms", can be any components with tailorable electromagnetic properties, such as resonators, LC circuits, nanoparticles, and so on. By designing the properties of individual meta-atoms and the interactions created by placing them in a lattice, one can create a metamaterial with intriguing properties not found in nature. My Ph.D. work examines meta-atoms based on radio-frequency superconducting quantum interference devices (rf-SQUIDs); their tunability with dc magnetic field, rf magnetic field, and temperature is studied. The rf-SQUIDs are superconducting split-ring resonators in which the usual capacitance is supplemented with a Josephson junction, which introduces strong nonlinearity in the rf properties. At relatively low rf magnetic field, a tunability of the resonant frequency by dc magnetic field of up to 80 THz/Gauss is observed, and a total frequency tunability of 100% is achieved. The macroscopic quantum superconducting metamaterial also shows manipulable self-induced broadband transparency due to a qualitatively novel nonlinear mechanism that is different from conventional electromagnetically induced transparency (EIT) or its classical analogs. A near-complete disappearance of resonant absorption under a range of applied rf flux is observed experimentally and explained theoretically. The transparency arises from intrinsic bi-stability and can easily be switched on and off by altering the rf and dc magnetic fields, temperature, and history. Hysteretic in situ 100% tunability of the transparency paves the way for auto-cloaking metamaterials, intensity-dependent filters, and fast-tunable power limiters. An rf-SQUID metamaterial is shown to have qualitatively the same behavior as a single rf-SQUID with regard to dc flux, rf flux and temperature tuning. The two-tone response of self-resonant rf-SQUID meta-atoms and metamaterials is then studied via intermodulation (IM) measurements over a broad range of tone frequencies and tone powers. A sharp onset followed by a surprising, strongly suppressed IM region near the resonance is observed. This behavior can be understood using methods from nonlinear dynamics; the sharp onset and the gap in IM are due to sudden state jumps during a beat of the two-tone sum input signal. The theory predicts that the IM can be manipulated with tone power, center frequency, frequency difference between the two tones, and temperature. This quantitative understanding potentially allows the design of rf-SQUID metamaterials with either very low or very high IM response.
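
The dc-flux tuning described above can be illustrated with the standard small-signal model of a single non-hysteretic rf-SQUID: the junction contributes a flux-dependent Josephson inductance, so the loop resonance shifts as f_res = f_geo*sqrt(1 + beta*cos(delta)), where delta solves delta + beta*sin(delta) = 2*pi*Phi_dc/Phi_0, f_geo = 1/(2*pi*sqrt(L*C)) and beta = 2*pi*L*I_c/Phi_0. The sketch below evaluates this relation; the loop inductance, capacitance and critical current are illustrative assumptions, not the thesis devices.

```python
import numpy as np

# Linearized resonance of a single non-hysteretic rf-SQUID meta-atom.
# Flux quantization: delta + beta*sin(delta) = 2*pi*phi_ext (phi_ext = Phi_dc/Phi_0)
# Small-signal resonance: f_res = f_geo * sqrt(1 + beta*cos(delta)).
# Parameter values are illustrative assumptions only.
PHI_0 = 2.067833848e-15  # magnetic flux quantum (Wb)
L = 100e-12              # assumed loop inductance (H)
C = 0.4e-12              # assumed junction/shunt capacitance (F)
I_C = 1.0e-6             # assumed junction critical current (A)

BETA = 2 * np.pi * L * I_C / PHI_0
F_GEO = 1.0 / (2 * np.pi * np.sqrt(L * C))


def junction_phase(phi_ext, iterations=200):
    """Solve delta + beta*sin(delta) = 2*pi*phi_ext by fixed-point iteration
    (converges in the non-hysteretic regime beta < 1)."""
    delta = 2 * np.pi * phi_ext
    for _ in range(iterations):
        delta = 2 * np.pi * phi_ext - BETA * np.sin(delta)
    return delta


def resonant_frequency(phi_ext):
    delta = junction_phase(phi_ext)
    return F_GEO * np.sqrt(1.0 + BETA * np.cos(delta))


if __name__ == "__main__":
    print(f"beta_rf = {BETA:.2f} (should be < 1 for a single-valued response)")
    for phi in (0.0, 0.25, 0.5):   # applied dc flux in units of Phi_0
        f = resonant_frequency(phi)
        print(f"Phi_dc = {phi:.2f} Phi_0 -> f_res = {f / 1e9:.2f} GHz")
```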

Relevance:

30.00%

Publisher:

Abstract:

We present a brief historical note on the evolution of line transect sampling and its theoretical developments. We describe line transect sampling theory as proposed by Buckland (1992), and present the most relevant issues concerning the modeling of the detection function. We describe the CDM principle (Rissanen, 1978) and its application to histogram density estimation (Kontkanen and Myllymäki, 2006), with a practical example using a mixture of densities. We then apply the approach to estimate the probability of detection and the animal population density in the context of line transect sampling. Two classical examples from the distance sampling literature are analyzed and the results compared. In order to evaluate the proposed methodology, we carry out a simulation study based on the wooden stakes example, using the half-normal, hazard-rate, exponential, and uniform-with-cosine detection functions. The results were obtained using the program DISTANCE (Thomas et al., in press) and an algorithm written in the C language, kindly provided by Professor Petri Kontkanen (Department of Computer Science, University of Helsinki). Programs were also developed to compute confidence intervals using the bootstrap technique (Efron, 1978). Finally, the results are presented and discussed, with suggestions for future developments.
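
For readers unfamiliar with distance sampling, the sketch below shows the basic line-transect estimator that underlies such studies: a half-normal detection function g(x) = exp(-x^2/(2*sigma^2)) is fitted to perpendicular distances by maximum likelihood, the effective strip half-width mu = sigma*sqrt(pi/2) is computed, density is estimated as D = n/(2*mu*L), and a simple percentile bootstrap gives a confidence interval. The distances and transect length are made up for illustration; this is not the CDM-based estimator developed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative perpendicular distances (meters) and total transect length;
# these numbers are invented and do not come from the thesis data sets.
distances = np.array([0.5, 1.2, 2.3, 0.8, 3.1, 1.7, 0.2, 2.9, 4.0, 1.1,
                      0.6, 2.2, 1.9, 0.9, 3.5])
L_TOTAL = 2000.0   # total transect length (m)


def halfnormal_density(x, L):
    """Line-transect density estimate with an untruncated half-normal
    detection function g(x) = exp(-x^2 / (2 sigma^2)).
    MLE: sigma^2 = mean(x^2); effective strip half-width mu = sigma*sqrt(pi/2);
    density D = n / (2 * mu * L)."""
    sigma = np.sqrt(np.mean(x ** 2))
    mu = sigma * np.sqrt(np.pi / 2.0)
    return len(x) / (2.0 * mu * L)


def bootstrap_ci(x, L, n_boot=2000, alpha=0.05):
    """Percentile bootstrap interval, resampling the observed distances."""
    estimates = np.empty(n_boot)
    for b in range(n_boot):
        resample = rng.choice(x, size=len(x), replace=True)
        estimates[b] = halfnormal_density(resample, L)
    lo, hi = np.quantile(estimates, [alpha / 2, 1 - alpha / 2])
    return lo, hi


if __name__ == "__main__":
    d_hat = halfnormal_density(distances, L_TOTAL)
    lo, hi = bootstrap_ci(distances, L_TOTAL)
    print(f"density estimate: {d_hat:.5f} animals per m^2")
    print(f"95% bootstrap CI: [{lo:.5f}, {hi:.5f}]")
```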

Relevance:

30.00%

Publisher:

Abstract:

The main objectives of this thesis are to validate an improved principal components analysis (IPCA) algorithm on images; to design and simulate a digital model for image compression, face recognition and image detection using the principal components analysis (PCA) algorithm and the IPCA algorithm; to design and simulate an optical model for face recognition and object detection using the joint transform correlator (JTC); to establish detection and recognition thresholds for each model; to compare the performance of the PCA algorithm with that of the IPCA algorithm in compression, recognition and detection; and to compare the performance of the digital model with that of the optical model in recognition and detection. The MATLAB software was used to simulate the models. PCA is a technique for identifying patterns in data and representing the data so as to highlight their similarities and differences. Identifying patterns in high-dimensional data (more than three dimensions) is difficult because a graphical representation of the data is impossible; PCA is therefore a powerful method for analyzing such data. IPCA is another statistical tool for identifying patterns in data; it uses information theory to improve PCA. The joint transform correlator (JTC) is an optical correlator used to synthesize a frequency-plane filter for coherent optical systems. In general, the IPCA algorithm performs better than the PCA algorithm in most of the applications studied. It is better than the PCA algorithm in image compression because it achieves higher compression, more accurate reconstruction, and faster processing with acceptable error; it is also better than the PCA algorithm in real-time image detection because it achieves the smallest error rate together with remarkable speed. On the other hand, the PCA algorithm performs better than the IPCA algorithm in face recognition because it offers an acceptable error rate, easy calculation, and reasonable speed. Finally, in detection and recognition, the digital model performs better than the optical model.
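
A minimal numerical sketch of the PCA side of this comparison is given below (the IPCA variant and the optical JTC model are not reproduced here): image rows are projected onto the leading principal components and reconstructed, trading reconstruction error against compression. The synthetic image and the numbers of retained components are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 64x64 "image" with low-rank structure plus noise
# (a stand-in for the real image data used in the thesis).
u = rng.normal(size=(64, 4))
v = rng.normal(size=(4, 64))
image = u @ v + 0.1 * rng.normal(size=(64, 64))


def pca_compress(img, n_components):
    """Treat each row as an observation, center, and keep the leading
    principal components obtained from the SVD of the centered data."""
    mean = img.mean(axis=0)
    centered = img - mean
    U, S, Vt = np.linalg.svd(centered, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]   # projections onto the axes
    axes = Vt[:n_components]                           # principal axes
    return scores, axes, mean


def pca_reconstruct(scores, axes, mean):
    return scores @ axes + mean


if __name__ == "__main__":
    for k in (2, 4, 8, 16):
        scores, axes, mean = pca_compress(image, k)
        recon = pca_reconstruct(scores, axes, mean)
        err = np.linalg.norm(image - recon) / np.linalg.norm(image)
        stored = scores.size + axes.size + mean.size
        ratio = image.size / stored
        print(f"k={k:2d}  relative error={err:.3f}  compression ratio={ratio:.1f}x")
```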

Relevance:

30.00%

Publisher:

Abstract:

Computer simulation programs are essential tools for scientists and engineers seeking to understand a particular system of interest. As expected, the complexity of the software increases with the depth of the model used. In addition to the exigent demands of software engineering, verification of simulation programs is especially challenging because the models represented are complex and ridden with unknowns that will be discovered by developers in an iterative process. To manage such complexity, advanced verification techniques for continually matching the intended model to the implemented model are necessary. Therefore, the main goal of this research work is to design a useful verification and validation framework that is able to identify model representation errors and is applicable to generic simulators. The framework that was developed and implemented consists of two parts. The first part is the First-Order Logic Constraint Specification Language (FOLCSL), which enables users to specify the invariants of a model under consideration. From the first-order logic specification, the FOLCSL translator automatically synthesizes a verification program that reads the event trace generated by a simulator and signals whether all invariants are respected. The second part consists of mining the temporal flow of events using a newly developed representation called the State Flow Temporal Analysis Graph (SFTAG). While the first part seeks an assurance of implementation correctness by checking that the model invariants hold, the second part derives an extended model of the implementation and hence enables a deeper understanding of what was implemented. The main application studied in this work is the validation of the timing behavior of micro-architecture simulators. The study includes SFTAGs generated for a wide set of benchmark programs and their analysis using several artificial intelligence algorithms. This work improves computer architecture research and verification processes, as shown by the case studies and experiments that have been conducted.
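
The first part of the framework can be pictured with a much simpler stand-in than FOLCSL: an invariant expressed as a plain predicate is checked against every event in a trace emitted by a simulator, and violations are reported. The event fields and the example invariant below are hypothetical and only illustrate the trace-checking idea; they do not reflect the actual FOLCSL syntax or the generated verification programs.

```python
from dataclasses import dataclass
from typing import Callable, Iterable, List


# Hypothetical simulator event: the real framework reads an event trace
# produced by the simulator under verification.
@dataclass
class Event:
    cycle: int
    kind: str        # e.g. "issue", "commit"
    instr_id: int


@dataclass
class Invariant:
    name: str
    check: Callable[[List[Event], Event], bool]   # (history, event) -> ok?


def verify(trace: Iterable[Event], invariants: List[Invariant]):
    """Replay the trace and report every (event, invariant) violation."""
    history: List[Event] = []
    violations = []
    for event in trace:
        for inv in invariants:
            if not inv.check(history, event):
                violations.append((event, inv.name))
        history.append(event)
    return violations


# Example invariant: an instruction may only commit after it was issued.
def committed_after_issue(history: List[Event], event: Event) -> bool:
    if event.kind != "commit":
        return True
    return any(e.kind == "issue" and e.instr_id == event.instr_id
               for e in history)


if __name__ == "__main__":
    trace = [
        Event(cycle=1, kind="issue", instr_id=1),
        Event(cycle=3, kind="commit", instr_id=1),
        Event(cycle=4, kind="commit", instr_id=2),   # violates the invariant
    ]
    invs = [Invariant("commit-after-issue", committed_after_issue)]
    for event, name in verify(trace, invs):
        print(f"cycle {event.cycle}: invariant '{name}' violated by {event}")
```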

Relevance:

30.00%

Publisher:

Abstract:

Career Academy instructors’ technical literacy is vital to the academic success of students. This non-experimental ex post facto study examined the relationships between the level of technical literacy of instructors in career academies and student academic performance. It was also undertaken to explore the relationship between the pedagogical training of instructors and the academic performance of students. Out of a heterogeneous population of 564 teachers in six targeted schools, 136 teachers (26.0%) responded to an online survey. The survey was designed to gather demographic and teaching experience data. Each demographic item was linked by the researchers to teachers’ technology use in the classroom. Student achievement was measured by student learning gains as assessed by the reading section of the FCAT from the previous to the present school year. Linear and hierarchical regressions were conducted to examine the research questions. To examine possible differences in the research variables by teacher gender and teacher race/ethnicity, a series of one-way ANOVAs was conducted. As revealed by the ANOVA results, there were no statistically significant group differences in any of the research variables by teacher gender or teacher race/ethnicity. Greater student learning gains were associated with greater teacher technical expertise in integrating computers and technology into the classroom, even after controlling for teacher attitude towards computers. Neither teacher attitude toward technology integration nor years of experience integrating computers into the curriculum significantly predicted student learning gains in the regression models. Implications for HRD theory, research, and practice suggest that identifying teachers’ levels of technical literacy may help improve student academic performance by informing professional development strategies and new parameters for defining highly qualified instructors with 21st century skills. District professional development programs can benefit by increasing their offerings to include more computer and information communication technology courses. Teacher preparation programs can benefit by including technical literacy as part of their curriculum. State certification requirements could be expanded to include formal surveys to assess teacher use of technology.
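
The hierarchical regression mentioned above enters predictor blocks in stages and asks how much variance each block adds. The sketch below illustrates that procedure with statsmodels on simulated data; the variable names echo the study's constructs (learning gains, teacher attitude, technical expertise), but the data, effect sizes and model are invented for illustration only.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 136  # same order of magnitude as the survey respondents; data are simulated

# Simulated teacher-level variables (purely illustrative, not the study data).
attitude = rng.normal(0, 1, n)                                   # attitude toward computers
expertise = 0.5 * attitude + rng.normal(0, 1, n)                 # technical expertise
gains = 0.4 * expertise + 0.1 * attitude + rng.normal(0, 1, n)   # learning gains

df = pd.DataFrame({"gains": gains, "attitude": attitude, "expertise": expertise})

# Step 1: control variable only.
step1 = sm.OLS(df["gains"], sm.add_constant(df[["attitude"]])).fit()
# Step 2: add the predictor of interest on top of the control.
step2 = sm.OLS(df["gains"], sm.add_constant(df[["attitude", "expertise"]])).fit()

print(f"R^2 with attitude only:              {step1.rsquared:.3f}")
print(f"R^2 with attitude + expertise:       {step2.rsquared:.3f}")
print(f"Delta R^2 attributable to expertise: {step2.rsquared - step1.rsquared:.3f}")
print(step2.summary().tables[1])   # coefficients for the full model
```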

Relevance:

30.00%

Publisher:

Abstract:

In the presented thesis work, the meshfree method with distance fields was coupled with the lattice Boltzmann method to obtain solutions of fluid-structure interaction problems. The thesis work involved the development and implementation of numerical algorithms, data structures, and software. Numerical and computational properties of the coupling algorithm combining the meshfree method with distance fields and the lattice Boltzmann method were investigated. The convergence and accuracy of the methodology were validated against analytical solutions. The research focused on fluid-structure interaction solutions in complex, mesh-resistant domains, as both the lattice Boltzmann method and the meshfree method with distance fields are particularly adept in these situations. Furthermore, the fluid solution provided by the lattice Boltzmann method is massively scalable, allowing extensive use of cutting-edge parallel computing resources to accelerate this phase of the solution process. The meshfree method with distance fields allows exact satisfaction of boundary conditions, making it possible to capture exactly the effects of the fluid field on the solid structure.
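
To give a flavor of the lattice Boltzmann side of the coupling, the sketch below implements a bare D2Q9 BGK collide-and-stream update with periodic boundaries, with no solid coupling and no distance fields; the relaxation time, lattice size and initial condition are arbitrary illustrative choices unrelated to the thesis software.

```python
import numpy as np

# D2Q9 lattice: discrete velocities and weights.
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)
TAU = 0.8           # assumed BGK relaxation time
NX, NY = 64, 32     # lattice size (illustrative)


def equilibrium(rho, ux, uy):
    """Standard second-order D2Q9 equilibrium distribution."""
    cu = c[:, 0, None, None] * ux + c[:, 1, None, None] * uy
    usq = ux**2 + uy**2
    return w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)


def collide_and_stream(f):
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    feq = equilibrium(rho, ux, uy)
    f_post = f - (f - feq) / TAU                   # BGK collision
    for i in range(9):                             # periodic streaming
        f_post[i] = np.roll(f_post[i], shift=(c[i, 0], c[i, 1]), axis=(0, 1))
    return f_post


if __name__ == "__main__":
    # Start from rest with a small density perturbation in the center.
    rho0 = np.ones((NX, NY))
    rho0[NX // 2, NY // 2] += 0.01
    f = equilibrium(rho0, np.zeros((NX, NY)), np.zeros((NX, NY)))
    for step in range(200):
        f = collide_and_stream(f)
    print("mass conserved:", np.isclose(f.sum(), rho0.sum()))
```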

Relevance:

30.00%

Publisher:

Abstract:

Support Vector Machines (SVMs) are widely used classifiers for detecting physiological patterns in Human-Computer Interaction (HCI). Their success is due to their versatility, robustness, and the wide availability of free dedicated toolboxes. Frequently in the literature, insufficient detail about the SVM implementation and/or parameter selection is reported, making it impossible to reproduce a study's analysis and results. In order to perform an optimized classification and report a proper description of the results, it is necessary to have a comprehensive critical overview of the application of SVM. The aim of this paper is to provide a review of the usage of SVM in the determination of brain and muscle patterns for HCI, focusing on electroencephalography (EEG) and electromyography (EMG) techniques. In particular, an overview of the basic principles of SVM theory is outlined, together with a description of several relevant literature implementations. Furthermore, details concerning the reviewed papers are listed in tables, and statistics on SVM use in the literature are presented. The suitability of SVM for HCI is discussed, and critical comparisons with other classifiers are reported.
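
In the spirit of the reporting guidelines discussed in the paper, the sketch below shows a fully specified SVM pipeline on synthetic two-class 'EEG-like' feature vectors: standardization, an RBF kernel, an explicit hyperparameter grid and cross-validation, so that everything needed to reproduce the numbers is stated. The data and hyperparameter values are invented for illustration and are not taken from any reviewed study.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, StratifiedKFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(7)

# Synthetic two-class feature matrix standing in for EEG/EMG features
# (e.g. band powers): 200 trials x 16 features, classes shifted apart.
X = rng.normal(size=(200, 16))
y = np.repeat([0, 1], 100)
X[y == 1, :4] += 0.8            # make the first four features discriminative

# Fully specified pipeline and hyperparameter grid, so results are reproducible.
pipeline = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
param_grid = {"svc__C": [0.1, 1, 10], "svc__gamma": ["scale", 0.01, 0.1]}
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)

search = GridSearchCV(pipeline, param_grid, cv=cv, scoring="accuracy")
search.fit(X, y)

print("best parameters:", search.best_params_)
print(f"cross-validated accuracy: {search.best_score_:.3f}")
```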

Relevance:

30.00%

Publisher:

Abstract:

The use of InGaAs metamorphic buffer layers (MBLs) to facilitate the growth of lattice-mismatched heterostructures constitutes an attractive approach to developing long-wavelength semiconductor lasers on GaAs substrates, since they offer the improved carrier and optical confinement associated with GaAs-based materials. We present a theoretical study of GaAs-based 1.3 and 1.55 μm (Al)InGaAs quantum well (QW) lasers grown on InGaAs MBLs. We demonstrate that optimised 1.3 μm metamorphic devices offer low threshold current densities and high differential gain, which compare favourably with InP-based devices. Overall, our analysis highlights and quantifies the potential of metamorphic QWs for the development of GaAs-based long-wavelength semiconductor lasers, and also provides guidelines for the design of optimised devices.
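
The lattice mismatch that such a metamorphic buffer must accommodate can be estimated with Vegard's law, interpolating the InGaAs lattice constant between GaAs and InAs and comparing it with the GaAs substrate. The sketch below uses standard room-temperature lattice constants; the indium fractions shown are examples only, not the compositions designed in this work.

```python
# Vegard's-law estimate of the lattice mismatch an InGaAs metamorphic buffer
# must accommodate on a GaAs substrate.  Lattice constants are standard
# room-temperature values; the indium fractions are illustrative examples.
A_GAAS = 5.6533   # GaAs lattice constant (angstrom)
A_INAS = 6.0583   # InAs lattice constant (angstrom)


def ingaas_lattice_constant(x_in: float) -> float:
    """Linear (Vegard) interpolation for In(x)Ga(1-x)As."""
    return x_in * A_INAS + (1.0 - x_in) * A_GAAS


def mismatch_vs_gaas(x_in: float) -> float:
    """Relative lattice mismatch (a_epi - a_sub) / a_sub."""
    return (ingaas_lattice_constant(x_in) - A_GAAS) / A_GAAS


if __name__ == "__main__":
    for x in (0.1, 0.2, 0.3):     # example In fractions for a metamorphic buffer
        print(f"In fraction {x:.1f}: a = {ingaas_lattice_constant(x):.4f} A, "
              f"mismatch = {100 * mismatch_vs_gaas(x):.2f} %")
```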

Relevance:

30.00%

Publisher:

Abstract:

The very nature of computer science, with its constant change, forces those who wish to keep up to adapt and react quickly. Large companies invest in staying up to date in order to generate revenue and remain active on the market. Universities, on the other hand, need to apply the same practice of staying up to date with industry needs in order to produce industry-ready engineers. By interviewing former students, now engineers in industry, and current university staff, this thesis aims to learn whether there is room for enhancing the education through different lecturing approaches and/or curriculum adaptation and development. To address these concerns, qualitative research was conducted, focusing on data collection through semi-structured life-world interviews. The method used follows the seven stages of research interviewing introduced by Kvale and focuses on collecting and preparing relevant data for analysis. The collected data is transcribed, refined, and further analyzed in the “Findings and analysis” chapter. The focus of the analysis was answering the three research questions: how higher education affects a Computer Science and Informatics engineer’s job, how to better manage the transition from studies to working in industry, and how to develop a curriculum that supports the previous two. Unaltered quoted extracts are presented and individually analyzed. To paint a better picture, a theme-wise analysis is presented, summarizing valuable themes that were repeated throughout the interviewing phase. The findings imply that there are several factors directly influencing the quality of education. On the student side, these mostly concern expectations of and dedication to the studies; on the university side, commitment to the curriculum development process. Due to time and resource limitations, this research provides findings based on a narrowed scope, although it can serve as a solid foundation for further development, possibly as PhD research.

Relevance:

30.00%

Publisher:

Abstract:

We introduce a covariant approach in Minkowski space for the description of quarks and mesons that exhibits both chiral-symmetry breaking and confinement. In a simple model for the interquark interaction, the quark mass function is obtained and used in the calculation of the pion form factor. We study the effects of the mass function and the different quark pole contributions on the pion form factor.

Relevance:

30.00%

Publisher:

Abstract:

This thesis focuses on the viscoelastic behavior of macro-synthetic fiber-reinforced concrete (MSFRC) with polypropylene fibers, studied numerically when subjected to temperature variations (-30 °C to +60 °C). LDPM (lattice discrete particle model), a meso-scale model for heterogeneous composites, is used. To reproduce the MSFRC structural behavior, an extended version of LDPM that includes fiber effects through fiber-concrete interface micromechanics, called LDPM-F, is applied. Model calibration is performed based on three-point bending, cube, and cylinder tests for plain concrete and MSFRC. This is followed by a comprehensive literature study on the variation of mechanical properties with temperature for the individual fibers and for plain concrete. This literature study and past experimental test results constitute inputs for the final numerical simulations. The numerical response of the MSFRC three-point bending test is reproduced and compared with previously conducted experimental results, and conclusions are drawn. The LDPM numerical model is successfully calibrated using experimental responses of plain concrete. Fiber-concrete interface micromechanical parameters are subsequently fixed, and LDPM-F models are calibrated based on the MSFRC three-point bending test at room temperature. The number of fibers contributing to the crack-bridging mechanism is computed and found to be in good agreement with experimental counts. Temperature variation models for the individual constituents of MSFRC, fibers and plain concrete, are implemented in LDPM-F. The model is validated against the MSFRC three-point bending stress-CMOD (crack mouth opening displacement) response reproduced at -30 °C, -15 °C, 0 °C, +20 °C, +40 °C and +60 °C. It is found that the model can describe the temperature-dependent behavior of MSFRC well. At positive temperatures, simulated responses are in good agreement; slight disagreement in the negative temperature regime suggests the need for an in-depth study of fiber-matrix interface bond behavior with varying temperature.
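
The fiber count mentioned above (the number of fibers bridging a crack plane) is often estimated in the fiber-reinforced-concrete literature with the simple relation N ≈ α·V_f·A_c/A_f, where α is an orientation factor (about 0.5 for 3D random orientation), V_f the fiber volume fraction, A_c the cracked cross-section area and A_f the cross-section area of a single fiber. The sketch below evaluates this relation with made-up values; it is not the LDPM-F counting procedure used in the thesis.

```python
import math

# Simple estimate of the number of fibers crossing a crack plane:
#   N = alpha * V_f * A_c / A_f
# alpha : orientation factor (~0.5 for a 3D random fiber orientation)
# V_f   : fiber volume fraction
# A_c   : cracked cross-section area
# A_f   : cross-section area of a single fiber
# All numbers below are illustrative assumptions, not the thesis values.
ALPHA = 0.5                 # 3D random orientation factor
V_F = 0.005                 # 0.5 % fiber volume fraction
FIBER_DIAMETER = 0.8e-3     # m (typical macro-synthetic fiber diameter)
SECTION_WIDTH = 0.15        # m (beam width, illustrative)
SECTION_HEIGHT = 0.125      # m (ligament height above the notch, illustrative)


def fibers_crossing_crack(alpha, v_f, a_crack, fiber_diameter):
    a_fiber = math.pi * fiber_diameter ** 2 / 4.0
    return alpha * v_f * a_crack / a_fiber


if __name__ == "__main__":
    a_crack = SECTION_WIDTH * SECTION_HEIGHT
    n = fibers_crossing_crack(ALPHA, V_F, a_crack, FIBER_DIAMETER)
    print(f"estimated fibers bridging the crack plane: {n:.0f}")
```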