904 results for Dunkl Kernel


Relevance:

20.00%

Publisher:

Abstract:

Kernel-level malware is one of the most dangerous threats to the security of users on the Internet, so there is an urgent need for its detection. The most popular detection approach is misuse-based detection, but it cannot keep up with today's advanced malware, which increasingly applies polymorphism and obfuscation. In this thesis, we present our integrity-based detection for kernel-level malware, which does not rely on malware-specific features.

We have developed an integrity analysis system that can derive and monitor integrity properties for commodity operating system kernels. Our system focuses on two classes of integrity properties: data invariants and the integrity of Kernel Queue (KQ) requests.

We adopt static analysis for data invariant detection and overcome several technical challenges: field sensitivity, array sensitivity, and pointer analysis. We identify data invariants that are critical to system runtime integrity from Linux kernel 2.4.32 and the Windows Research Kernel (WRK), with very low false positive and false negative rates. We then develop an Invariant Monitor to guard these data invariants against real-world malware. In our experiments, Invariant Monitor detects ten real-world Linux rootkits, nine real-world Windows malware samples, and one synthetic Windows malware sample.

We leverage static and dynamic analysis of the kernel and device drivers to learn the legitimate KQ requests. Based on the learned KQ requests, we build KQguard to protect KQs. At runtime, KQguard rejects all unknown KQ requests that cannot be validated. We apply KQguard to the WRK and the Linux kernel, and extensive experimental evaluation shows that KQguard is efficient (up to 5.6% overhead) and effective (achieving zero false positives against representative benign workloads after appropriate training, and very low false negatives against 125 real-world malware samples and nine synthetic attacks).

In our system, Invariant Monitor and KQguard cooperate to protect data invariants and KQs in the target kernel. By monitoring these integrity properties, we can detect malware through its violation of them during execution.
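The invariant-checking idea can be illustrated with a toy sketch. This is not the thesis' code; the table, address range, and function names below are all hypothetical, and the invariant shown (syscall-table entries must point into kernel text) is just one classic example of a data invariant a rootkit violates:

```python
# Toy data-invariant check (hypothetical, not the thesis' Invariant Monitor).
# Invariant: every entry of a kernel "syscall table" must point into the
# trusted kernel text region; a rootkit hooking an entry violates this.

KERNEL_TEXT = range(0xC0100000, 0xC0400000)  # assumed text-segment bounds

def check_syscall_table(table):
    """Return the indices whose targets fall outside the trusted region."""
    return [i for i, addr in enumerate(table) if addr not in KERNEL_TEXT]

clean = [0xC0123456, 0xC01FFFFF]
hooked = clean + [0xDEADBEEF]   # hook installed outside kernel text
print(check_syscall_table(clean))   # no violations: []
print(check_syscall_table(hooked))  # the hooked entry is flagged: [2]
```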

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this research is to develop an optimal kernel for use in a real-time engineering and communications system. Since the application is a real-time system, relevant real-time issues are studied in conjunction with kernel-related issues. The emphasis of the research is the development of a kernel that not only adheres to the criteria of a real-time environment, namely determinism and performance, but also provides the flexibility and portability associated with non-real-time environments. The essence of the research is to study how features found in non-real-time systems could be applied to a real-time system in order to produce an optimal kernel that provides flexibility and architecture independence while maintaining the performance needed by most engineering applications. Traditionally, real-time kernels have been developed in assembly language. By utilizing the powerful constructs of the C language, a real-time kernel was developed that addresses the goals of flexibility and portability while still meeting the real-time criteria. The kernel is implemented on 68010/20/30/40 microprocessor-based systems.

Relevance:

20.00%

Publisher:

Abstract:

This thesis concerns the study and implementation of a multiple kernel learning (MKL) algorithm for the classification and regression of neuroimaging data and, in particular, of functional connectivity graphs. MKL algorithms employ a weighted sum of several kernels (i.e., similarity measures) and allow the features useful for discriminating the instances to be selected while the classifier/regressor itself is trained. The novel contribution of this thesis is the study of a new kernel between functional connectivity graphs, with the distinctive property of preserving the information on the importance of each single region of interest (ROI), using the lp norm as the weight-update method in order to obtain sparse solutions. The algorithm was validated on synthetic connectivity maps and applied to a dataset of 32 patients affected by mild cognitive impairment and small vessel disease, 16 of whom underwent cognitive rehabilitation between a baseline functional magnetic resonance imaging exam and a follow-up one. The connectivity maps were obtained with the CONN toolbox. The classifier was able to discriminate the two groups of patients in a nested leave-one-out configuration with an accuracy of 87.5%. This thesis work was carried out during a research stay at the School of Computer Science and Electronic Engineering of the University of Essex (Colchester, UK).
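The core MKL ingredient, a weighted sum of precomputed kernels with lp-normalized weights, can be sketched as follows. This is a generic illustration under assumed toy matrices; the thesis' graph kernel and its exact weight-update rule are not reproduced here:

```python
# Generic multiple-kernel combination sketch (illustrative, not the thesis code).
import numpy as np

def combine_kernels(kernels, weights):
    """Weighted sum K = sum_m w_m * K_m of precomputed kernel matrices."""
    return sum(w * K for w, K in zip(weights, kernels))

def lp_normalize(weights, p=1.5):
    """Scale nonnegative weights so their lp norm is 1; values of p close
    to 1 favor sparser weight vectors, as sought in the thesis."""
    w = np.maximum(weights, 0.0)
    norm = np.sum(w ** p) ** (1.0 / p)
    return w / norm if norm > 0 else w

K1 = np.eye(3)           # toy kernel matrices standing in for graph kernels
K2 = np.ones((3, 3))
K = combine_kernels([K1, K2], lp_normalize(np.array([2.0, 1.0])))
```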

Relevance:

20.00%

Publisher:

Abstract:

The Bahadur representation and its applications have attracted a large number of publications and presentations on a wide variety of problems. Mixing dependency is weak enough to describe the dependence structure of random variables, including observations in time series and longitudinal studies. This note proves the Bahadur representation of sample quantiles for strongly mixing random variables (including ρ-mixing and φ-mixing) under very weak mixing coefficients. As an application, asymptotic normality is derived. These results greatly improve those recently reported in the literature.
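For reference, the classical form of the representation being extended can be stated as follows (textbook statement for the p-th sample quantile; the note's exact remainder rate under mixing is not reproduced here):

```latex
% Bahadur representation of the sample quantile \hat{\xi}_{p,n},
% with F the distribution function and density f(\xi_p) > 0:
\hat{\xi}_{p,n} \;=\; \xi_p \;+\; \frac{p - F_n(\xi_p)}{f(\xi_p)} \;+\; R_n,
\qquad
F_n(x) \;=\; \frac{1}{n}\sum_{i=1}^{n} \mathbf{1}\{X_i \le x\},
% and the asymptotic normality that follows from it:
\sqrt{n}\,\bigl(\hat{\xi}_{p,n} - \xi_p\bigr)
\;\xrightarrow{d}\;
N\!\Bigl(0,\; \frac{\sigma_p^2}{f(\xi_p)^2}\Bigr),
```

where σ_p² is the long-run variance of 1{X_i ≤ ξ_p} under the mixing condition (reducing to p(1−p) in the i.i.d. case).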

Relevance:

20.00%

Publisher:

Abstract:

To understand our project, we must understand DEVS. DES (discrete event systems) is among the most popular classes of systems to formalize, and in the 1970s the mathematician Bernard Zeigler proposed a general formalism for representing them. This formalism, called DEVS (Discrete EVent System Specification), is the most general formalism for dealing with DES. DEVS can represent any system whose behavior can be described as a sequence of discrete events, characterized by a time base in which only a finite number of events can occur. DEVS modeling and simulation has multiple implementations in several programming languages, for example Java, C#, or C++. However, there is a need for a stable distributed platform that provides interoperability mechanics and integrates diverse DEVS models. In this project, we are given as base code the xDEVS core in Java, in its sequential and parallel versions. Our task is to implement the core in a distributed fashion, so that a DEVS system can be split across several machines. For this we used Java sockets to make the data transmission as efficient as possible. Initially, the number of machines that will connect to the server must be specified. Once they have connected, each is sent the specific work it must simulate. Note that there are two ways to split a DEVS system, both of which are implemented in our project. The first is to split it into atomic models, the indivisible subsystems of a DEVS system. The second is to split the functions of all the subsystems into groups and distribute them among the machines.

In summary, our distributed system works by starting the work assigned to the first client; once it finishes, it updates the server's information, and the server sends the order to the next client, and so on.
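The connect-then-hand-off protocol described above can be sketched in a few lines. This is a Python stand-in for the project's Java sockets, with made-up names and a toy "done" acknowledgement; it is not the xDEVS API:

```python
# Toy round-robin coordination sketch (illustrative, not the project's code).
import socket
import threading

def serve_round_robin(jobs, host="127.0.0.1"):
    """Accept len(jobs) clients, then hand each its job in turn, waiting
    for a 'done' acknowledgement before ordering the next client to run."""
    srv = socket.socket()
    srv.bind((host, 0))
    srv.listen()
    port = srv.getsockname()[1]

    def run():
        conns = [srv.accept()[0] for _ in jobs]   # wait for all machines
        for conn, job in zip(conns, jobs):        # sequential hand-off
            conn.sendall(job.encode())
            assert conn.recv(16) == b"done"       # client acknowledgement
            conn.close()
        srv.close()

    threading.Thread(target=run, daemon=True).start()
    return port

def client(port):
    """Connect, receive the assigned work, acknowledge, and return it."""
    with socket.socket() as s:
        s.connect(("127.0.0.1", port))
        job = s.recv(1024).decode()
        s.sendall(b"done")
        return job
```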

Relevance:

20.00%

Publisher:

Abstract:

Virtually every sector of business and industry that uses computing, including financial analysis, search engines, and electronic commerce, incorporates Big Data analysis into its business model. Sophisticated clustering algorithms are popular for deducing the nature of data by assigning labels to unlabeled data. We address two main challenges in Big Data. First, by definition, the volume of Big Data is too large to be loaded into a computer's memory (this volume changes based on the computer used or available, but there is always a data set that is too large for any computer). Second, in real-time applications, the velocity of new incoming data prevents historical data from being stored and future data from being accessed. Therefore, we propose our Streaming Kernel Fuzzy c-Means (stKFCM) algorithm, which significantly reduces both computational complexity and space complexity. The proposed stKFCM requires only O(n²) memory, where n is the (predetermined) size of a data subset (or data chunk) at each time step, which makes the algorithm truly scalable (as n can be chosen based on the available memory). Furthermore, only 2n² elements of the full N × N (where N >> n) kernel matrix need to be calculated at each time step, reducing both the time spent producing kernel elements and the complexity of the FCM algorithm. Empirical results show that stKFCM, even with relatively small n, can cluster as accurately as kernel fuzzy c-means run on the entire data set while achieving a significant speedup.
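The memory pattern behind the 2n² figure can be illustrated as follows: per time step, only the chunk's own n × n kernel block and an n × n block against n retained summary points are formed, never the full N × N matrix. This is a generic sketch (an assumed RBF kernel, made-up data), not the authors' implementation:

```python
# Sketch of the per-step kernel blocks a streaming kernel method needs
# (illustrative only; not the stKFCM implementation).
import numpy as np

def rbf_block(A, B, gamma=0.5):
    """RBF kernel block: K[i, j] = exp(-gamma * ||A_i - B_j||^2)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def stream_kernel_blocks(chunks, retained):
    """Yield, per chunk, the n x n self-block and the n x n cross-block
    against the retained points: 2 n^2 entries per step, never N x N."""
    for X in chunks:
        yield rbf_block(X, X), rbf_block(X, retained)

rng = np.random.default_rng(0)
retained = rng.normal(size=(4, 2))            # n retained summary points
chunks = [rng.normal(size=(4, 2)) for _ in range(3)]
for K_self, K_cross in stream_kernel_blocks(chunks, retained):
    pass  # each step sees only two 4 x 4 blocks
```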

Relevance:

20.00%

Publisher:

Abstract:

In this thesis we study the heat kernel, a useful tool for analyzing various properties of different quantum field theories. In particular, we focus on the study of the one-loop effective action and the application of worldline path integrals to derive perturbatively the heat kernel coefficients for the Proca theory of massive vector fields. It turns out that the worldline path integral method encounters difficulties when the differential operator of the heat kernel is of non-minimal kind. More precisely, a direct recasting of the differential operator in terms of worldline path integrals produces a non-perturbative vertex in the classical action, and the path integral cannot be solved. In this work we seek ways to circumvent this issue and suggest how to address similar problems in other contexts.
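For orientation, the heat kernel coefficients mentioned here are those of the standard small-t expansion, stated below in its textbook form for a minimal second-order operator (the thesis' non-minimal Proca operator is precisely the case where this direct form breaks down):

```latex
% Small-t (Seeley-DeWitt) expansion of the heat kernel trace for a minimal
% operator H = -\Box + V on d-dimensional flat space (textbook form):
\mathrm{Tr}\, e^{-tH} \;\sim\; \frac{1}{(4\pi t)^{d/2}}
\int d^d x \; \mathrm{tr} \sum_{k \ge 0} t^{k}\, a_{k}(x),
\qquad
a_0 = 1, \quad a_1 = -V, \quad a_2 = \tfrac{1}{2} V^2 - \tfrac{1}{6}\,\Box V .
```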

Relevance:

10.00%

Publisher:

Abstract:

Shelled, roasted, and salted cashew nut kernels were packaged in three flexible materials with different barrier properties (PP/PE = polypropylene/polyethylene; PETmet/PE = metallized polyethylene terephthalate/polyethylene; PET/Al/LDPE = polyethylene terephthalate/aluminum foil/low-density polyethylene). Kernels were stored for one year at 30°C and 80% relative humidity. Quantitative descriptive sensory analysis (QDA) was performed at the end of the storage period. The descriptive terms obtained for kernel characterization were brown color, color uniformity, and rugosity for appearance; toasted kernel, sweet, old, and rancid for odor; toasted kernel, sweet, old, rancid, salty, and bitter for taste; and crispness for texture. QDA showed that the factors responsible for the decrease in sensory quality after one year of storage were increases in old and rancid aroma and taste, a decrease in roasted-kernel aroma and taste, and a decrease in crispness. The decrease in sensory quality was greatest in kernels packaged in PP/PE.

Relevance:

10.00%

Publisher:

Abstract:

The objective was to identify factors associated with edentulism and its spatial risk among the elderly. A cross-sectional study was carried out on a sample of 372 individuals aged 60 years and over in the municipality of Botucatu, São Paulo, Brazil, in 2005. Crude and adjusted prevalence ratios were estimated using Poisson regression with robust variance estimation and hierarchical modeling procedures. The spatial analysis was performed with kernel density estimates. The prevalence of edentulism was 63.17%. The sociodemographic factors associated with edentulism were low schooling, a higher number of persons per room, not owning a car, and older age, along with the presence of comorbidities, the lack of a regular dentist, and having had the last dental visit three or more years earlier. The spatial analysis showed higher risk in peripheral areas. The results provide a better understanding of tooth loss among the elderly, supporting the planning of collective health actions.
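The kernel density estimation used for the spatial risk surface can be sketched generically. The coordinates below are made up and the bandwidth is arbitrary; this only illustrates the technique, not the study's data or settings:

```python
# Minimal 2D Gaussian kernel density sketch of the kind used for spatial
# risk surfaces (illustrative; not the study's estimates).
import numpy as np

def kde2d(points, grid, bandwidth=1.0):
    """Gaussian KDE evaluated at each grid location."""
    d2 = ((grid[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * bandwidth**2)).sum(1) / (
        len(points) * 2 * np.pi * bandwidth**2)

cases = np.array([[0.0, 0.0], [0.2, 0.1], [5.0, 5.0]])   # hypothetical cases
grid = np.array([[0.0, 0.0], [5.0, 5.0], [10.0, 10.0]])  # evaluation points
density = kde2d(cases, grid)  # higher where cases cluster
```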

Relevance:

10.00%

Publisher:

Abstract:

The objective of this study was to evaluate the agronomic characteristics, bromatological-chemical composition, and digestibility of 11 corn (Zea mays) cultivars harvested at two cutting heights. Cultivars D 766, D 657, D 1000, P 3021, P 3041, C 805, C 333, AG 5011, FO 01, CO 9621, and BR 205 were evaluated when harvested 5 cm above the ground (low) and 5 cm below the insertion of the first ear (high). The experiment was designed in randomized blocks, with three replicates, arranged in an 11 x 2 factorial scheme. The cultivars presented similar productions of forage dry matter and grain. The percentages of the stalk, leaf, straw, cob, and kernel fractions differed among cultivars, as did the dry matter content of the whole plant at harvest. Considering the whole plant, only the contents of gross energy and of nitrogen in neutral detergent fiber, and the in vitro neutral and acid detergent fiber digestibility, did not differ among cultivars. Increasing the cutting height improved forage quality due to the reduction of the stalk and leaf fractions and of the cell wall constituents.