904 results for Run length


Relevance:

100.00%

Publisher:

Abstract:

N-gram analysis is an approach that investigates the structure of a program using bytes, characters, or text strings. This research uses dynamic analysis to investigate malware detection with a classification approach based on N-gram analysis. A key issue with dynamic analysis is how long a program must be run to ensure a correct classification. The motivation for this research is to find the optimum subset of operational codes (opcodes) that are the best indicators of malware, and to determine how long a program must be monitored to ensure an accurate support vector machine (SVM) classification of benign and malicious software. The experiments in this study represent programs as opcode density histograms obtained through dynamic analysis over different program run periods. An SVM is used as the program classifier to assess how well different program run lengths can detect the presence of malicious software. The findings show that malware can be detected with different program run lengths using a small number of opcodes.
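
A minimal sketch of the pipeline this abstract describes, assuming opcode traces are available as lists of mnemonics; the opcode subset, traces, and labels below are hypothetical placeholders, not the paper's data.

```python
from collections import Counter

import numpy as np
from sklearn.svm import SVC

# Hypothetical opcode subset; the paper searches for the optimal one.
OPCODES = ["mov", "push", "pop", "call", "jmp", "add", "xor", "ret"]

def opcode_density_histogram(trace):
    """Normalize opcode counts by trace length to form a density histogram."""
    counts = Counter(trace)
    total = max(len(trace), 1)
    return np.array([counts[op] / total for op in OPCODES])

# Dummy traces standing in for opcode sequences recorded over a run period.
traces = [
    ["mov", "push", "call", "mov", "ret"],
    ["mov", "add", "push", "pop", "ret"],
    ["xor", "xor", "jmp", "call", "xor"],
    ["xor", "jmp", "xor", "push", "jmp"],
]
labels = [0, 0, 1, 1]  # 0 = benign, 1 = malicious (dummy labels)

X = np.array([opcode_density_histogram(t) for t in traces])
clf = SVC(kernel="rbf").fit(X, labels)
print(clf.predict([opcode_density_histogram(["xor", "jmp", "xor"])]))
```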

Relevance:

60.00%

Publisher:

Abstract:

In this paper, we propose a new multi-class steganalysis for binary images. The proposed method can identify the type of steganographic technique used by examining the given binary image. In addition, it is capable of differentiating an image with a hidden message from one without. To do this, we extract features from the binary image using a combination of methods extended from our previous work and new methods proposed in this paper. Based on the extracted feature sets, we construct our multi-class steganalyzer on an SVM classifier. We also present empirical results demonstrating that the proposed method can effectively identify five different types of steganography.
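
The abstract's feature set is not reproduced here, so the sketch below substitutes two toy binary-image features and shows only the multi-class SVM stage, with dummy images and labels.

```python
import numpy as np
from sklearn.svm import SVC

def features(img):
    """Two toy features for a 0/1 image: pixel density and transition rate."""
    flips = np.abs(np.diff(img, axis=1)).sum()  # horizontal 0/1 transitions
    return [img.mean(), flips / img.size]

rng = np.random.default_rng(0)
imgs = [rng.integers(0, 2, (32, 32)) for _ in range(60)]  # dummy binary images
X = [features(im) for im in imgs]
y = rng.integers(0, 6, 60)  # 0 = no hidden message, 1..5 = five stego methods

clf = SVC().fit(X, y)  # sklearn's SVC handles multi-class one-vs-one internally
```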

Relevance:

60.00%

Publisher:

Abstract:

In this paper, we propose a new blind steganalytic method to detect the presence of secret messages embedded in black-and-white images by steganographic techniques. We start by extracting several matrices, such as the run-length matrix, the gap-length matrix, and the pixel-difference matrix. We also apply a characteristic function to these matrices to enhance their discriminative capability. We then compute statistics, including the mean, variance, kurtosis, and skewness, to form our feature sets. Empirical results demonstrate that the proposed method can effectively detect three different types of steganography, supporting its universality as a blind steganalyzer. In addition, the experimental results show that the proposed method is capable of detecting small amounts of embedded data.
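
A sketch of one feature family in the spirit of this abstract: horizontal run lengths, a DFT-magnitude characteristic-function step, and the four summary statistics named; the exact matrix definitions used in the paper are assumptions here.

```python
import numpy as np
from scipy.stats import kurtosis, skew

def run_lengths(row):
    """Lengths of maximal constant runs in a 0/1 row."""
    edges = np.flatnonzero(np.diff(row)) + 1
    return np.diff(np.r_[0, edges, len(row)])

img = np.random.default_rng(1).integers(0, 2, (64, 64))  # dummy binary image
runs = np.concatenate([run_lengths(r) for r in img])
hist = np.bincount(runs)[1:]            # run-length histogram (length 1 upward)
cf = np.abs(np.fft.fft(hist))           # characteristic function via DFT magnitude
feats = [cf.mean(), cf.var(), kurtosis(cf), skew(cf)]
print(feats)
```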

Relevance:

60.00%

Publisher:

Abstract:

In this paper, we propose a new state transition based embedding (STBE) technique for audio watermarking with high fidelity. Furthermore, we propose a new correlation based encoding (CBE) scheme for the binary logo image in order to enhance the payload capacity. The result of CBE is also compared with standard run-length encoding (RLE) and Huffman compression schemes. Most watermarking algorithms embed a given watermark bit by modulating a selected transform-domain feature of an audio segment. In the proposed STBE method, instead of modulating the feature of every segment, our aim is to retain the default value of this feature for most segments, so that high quality of the watermarked audio is maintained. Here, the difference between the mean values (Mdiff) of insignificant complex cepstrum transform (CCT) coefficients of down-sampled subsets is selected as a robust feature for embedding. Mdiff values of the frames are changed only when certain conditions are met. Hence, almost 50% of the time the segments are left unchanged, yet STBE can still convey the watermark information to the receiver side. STBE also exhibits a partial restoration feature by which the watermarked audio can be partially restored after extraction of the watermark at the detector side. Psychoacoustic model analysis showed that the noise-masking ratio (NMR) of our system is less than -10 dB. As amplitude scaling in the time domain does not affect the selected insignificant CCT coefficients, strong invariance to amplitude-scaling attacks is also proved theoretically. Experimental results reveal that the proposed watermarking scheme maintains high audio quality while remaining robust to common attacks such as MP3 compression, amplitude scaling, additive noise, and re-quantization.
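
The CBE scheme itself is not spelled out in the abstract; the sketch below shows only the run-length encoding baseline it is compared against, applied to a dummy binary logo bitstream.

```python
import numpy as np

def rle_encode(bits):
    """Encode a 0/1 sequence as (value, run_length) pairs."""
    bits = np.asarray(bits)
    edges = np.flatnonzero(np.diff(bits)) + 1
    starts = np.r_[0, edges]
    lengths = np.diff(np.r_[starts, len(bits)])
    return list(zip(bits[starts].tolist(), lengths.tolist()))

logo = np.array([0, 0, 0, 1, 1, 0, 1, 1, 1, 1])  # dummy logo bitstream
print(rle_encode(logo))                          # [(0, 3), (1, 2), (0, 1), (1, 4)]
```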

Relevance:

60.00%

Publisher:

Abstract:

The authors analyzed several cytomorphonuclear parameters related to chromatin distribution and DNA ploidy in typical and atypical carcinoids and in small cell lung cancers. Nuclear measurements and analysis were performed with a SAMBA 200 (TITN, Grenoble, France) cell image processor, with software allowing the discrimination of parameters computed on cytospin preparations of Feulgen-stained nuclei extracted from deparaffinized tumor tissues. The authors' results indicate a significant increase in DNA content, assessed by integrated optical density (IOD), from typical carcinoids to small cell lung carcinomas, with atypical carcinoids showing an intermediate value. Parameters related to hyperchromatism (short and long run length and variance of optical density) also characterize the atypical carcinoids as intermediate between typical carcinoids and small cell lung cancers. The systematic measurement of these cytomorphonuclear parameters seems to define an objective, reproducible "scale" of differentiation that helps to define the atypical carcinoid and may be of value in establishing cytologic criteria for differential diagnosis.
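
A sketch of the two run-length texture measures named above, short- and long-run emphasis, computed from horizontal gray-level runs; the quantization scheme and the dummy image are assumptions, not the SAMBA software's definitions.

```python
import numpy as np

def run_emphasis(img, levels=8):
    """Short- and long-run emphasis over horizontal gray-level runs."""
    q = np.clip((img * levels).astype(int), 0, levels - 1)  # quantize a [0,1) image
    runs = []
    for row in q:
        edges = np.flatnonzero(np.diff(row)) + 1
        runs.extend(np.diff(np.r_[0, edges, len(row)]))
    runs = np.asarray(runs, dtype=float)
    sre = np.mean(1.0 / runs**2)  # short runs dominate for fine chromatin
    lre = np.mean(runs**2)        # long runs dominate for coarse chromatin
    return sre, lre

nucleus = np.random.default_rng(2).random((32, 32))  # dummy optical-density map
print(run_emphasis(nucleus))
```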

Relevance:

60.00%

Publisher:

Abstract:

The design of a System-on-a-Chip (SoC) demonstrator for a baseline JPEG encoder core is presented. It combines a highly optimized Discrete Cosine Transform (DCT) and quantization unit with an entropy coder realized using off-the-shelf synthesizable IP cores (run-length coder, Huffman coder, and data packer). When synthesized in a 0.35 µm CMOS process, the core can operate at speeds up to 100 MHz and contains 50k gates plus 11.5 kbits of RAM, approximately 20% less than similar JPEG encoder designs reported in the literature. When targeted at an FPGA, the core can operate at up to 30 MHz and is capable of compressing 9-bit full-frame color input data at NTSC or PAL rates.
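
A minimal sketch of the entropy-coder front end such a core includes: zigzag scan of a quantized 8x8 DCT block followed by JPEG-style (zero-run, value) pairs; Huffman coding and bit packing are omitted.

```python
import numpy as np

# Zigzag scan order for an 8x8 block: anti-diagonals, alternating direction.
ZIGZAG = sorted(((i, j) for i in range(8) for j in range(8)),
                key=lambda p: (p[0] + p[1],
                               p[0] if (p[0] + p[1]) % 2 else p[1]))

def rle_ac(block):
    """Run-length code the 63 AC coefficients of a quantized 8x8 block."""
    ac = [block[i, j] for i, j in ZIGZAG][1:]  # skip the DC coefficient
    pairs, zeros = [], 0
    for v in ac:
        if v == 0:
            zeros += 1
        else:
            pairs.append((zeros, v))
            zeros = 0
    pairs.append((0, 0))  # end-of-block marker
    return pairs

block = np.zeros((8, 8), dtype=int)
block[0, 0], block[0, 1], block[2, 0] = 50, -3, 2  # dummy quantized coefficients
print(rle_ac(block))  # [(0, -3), (1, 2), (0, 0)]
```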

Relevance:

60.00%

Publisher:

Abstract:

This master's thesis aims to determine whether the summer convective precipitation simulated by the Canadian regional climate model (MRCC) is stationary over time. To answer this question, we propose a frequentist statistical methodology and a Bayesian one. For the frequentist approach, we used standard quality control together with the CUSUM to determine whether the mean has increased over the years. For the Bayesian approach, we compared the posterior distributions of precipitation over time: we modeled the posterior density of a given period and compared it with the posterior density of another period further removed in time. For the comparison, we used statistics based on the Hellinger distance, the J-divergence, and the L2 norm. Throughout this thesis, we used the ARL (average run length) to calibrate and to compare each of our tools, so a large part of the thesis is dedicated to the study of the ARL. Once the tools were properly calibrated, we used simulations to compare them. Finally, we analyzed the MRCC data to determine whether they are stationary.
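
A minimal sketch of the frequentist tool named above: a two-sided CUSUM for a shift in the mean. The reference value k and decision threshold h are illustrative, not the thesis's calibration; averaging the detection index over many replications estimates the ARL.

```python
import numpy as np

def cusum(x, target, k=0.5, h=5.0):
    """Return the first index where the CUSUM statistic crosses h, or None."""
    s_hi = s_lo = 0.0
    for i, v in enumerate(x):
        s_hi = max(0.0, s_hi + (v - target) - k)  # accumulates upward shifts
        s_lo = max(0.0, s_lo - (v - target) - k)  # accumulates downward shifts
        if s_hi > h or s_lo > h:
            return i
    return None

rng = np.random.default_rng(3)
series = np.r_[rng.normal(0, 1, 200), rng.normal(1, 1, 200)]  # mean shifts at t=200
print(cusum(series, target=0.0))
```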

Relevance:

60.00%

Publisher:

Abstract:

As the number of processors in distributed-memory multiprocessors grows, efficiently supporting a shared-memory programming model becomes difficult. We have designed the Protocol for Hierarchical Directories (PHD) to allow shared-memory support for systems containing massive numbers of processors. PHD eliminates bandwidth problems by using a scalable network, decreases hot spots by not relying on a single point to distribute blocks, and uses a scalable amount of space for its directories. PHD provides a shared-memory model by synthesizing a global shared memory from the local memories of processors. PHD supports sequentially consistent read, write, and test-and-set operations. This thesis also introduces a method of describing locality for hierarchical protocols and employs this method in the derivation of an abstract model of the protocol behavior. An embedded model, based on the work of Johnson [ISCA19], describes the protocol behavior when mapped to a k-ary n-cube. The thesis uses these two models to study the average height in the hierarchy that operations reach, the longest path messages travel, the number of messages that operations generate, the inter-transaction issue time, and the protocol overhead for different locality parameters, degrees of multithreading, and machine sizes. We determine that multithreading is only useful for approximately two to four threads; any additional interleaving does not decrease the overall latency. For small machines and high-locality applications, this limitation is due mainly to the length of the running threads. For large machines with medium to low locality, it is due mainly to the protocol overhead being too large. Our study using the embedded model shows that in situations where the run length between references to shared memory is at least an order of magnitude longer than the time to process a single state transition in the protocol, applications exhibit good performance. If separate controllers for processing protocol requests are included, the protocol scales to 32k-processor machines as long as the application exhibits hierarchical locality: at least 22% of the global references must be satisfiable locally, and at most 35% of the global references may reach the top level of the hierarchy.
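
A back-of-envelope sketch, not the thesis's embedded model: a classic multithreaded-processor utilization estimate with run length R between remote references, remote latency L, and context-switch cost C, which echoes why only a handful of threads help.

```python
def utilization(threads, R, L, C=10):
    # With enough threads to hide latency, the pipeline is limited only by
    # useful work plus switch overhead; otherwise it idles waiting on memory.
    saturated = R / (R + C)
    latency_bound = threads * R / (R + L)
    return min(saturated, latency_bound)

# Illustrative numbers only: utilization saturates between 2 and 4 threads.
for n in (1, 2, 4, 8):
    print(n, round(utilization(n, R=50, L=100), 2))
```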

Relevance:

60.00%

Publisher:

Abstract:

Observations of turbulent fluxes of momentum, heat and moisture from low-level aircraft data are presented. Fluxes are calculated using the eddy covariance technique from flight legs typically ∼40 m above the sea surface. Over 400 runs of 2 min (∼12 km) from 26 flights are evaluated. Flight legs are mainly from around the British Isles although a small number are from around Iceland and Norway. Sea-surface temperature (SST) observations from two on-board sensors (the ARIES interferometer and a Heimann radiometer) and a satellite-based analysis (OSTIA) are used to determine an improved SST estimate. Most of the observations are from moderate to strong wind speed conditions, the latter being a regime short of validation data for the bulk flux algorithms that are necessary for numerical weather prediction and climate models. Observations from both statically stable and unstable atmospheric boundary-layer conditions are presented. There is a particular focus on several flights made as part of the DIAMET (Diabatic influence on mesoscale structures in extratropical storms) project. Observed neutral exchange coefficients are in the same range as previous studies, although higher for the momentum coefficient, and are broadly consistent with the COARE 3.0 bulk flux algorithm, as well as the surface exchange schemes used in the ECMWF and Met Office models. Examining the results as a function of aircraft heading shows higher fluxes and exchange coefficients in the across-wind direction, compared to along-wind (although this comparison is limited by the relatively small number of along-wind legs). A multi-resolution spectral decomposition technique demonstrates a lengthening of spatial scales in along-wind variances in along-wind legs, implying the boundary-layer eddies are elongated in the along-wind direction. The along-wind runs may not be able to adequately capture the full range of turbulent exchange that is occurring because elongation places the largest eddies outside of the run length.
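
A minimal sketch of the eddy covariance calculation used above: a turbulent flux is the mean product of the fluctuating parts of vertical velocity and the transported quantity over a run. The variable names and dummy series are illustrative.

```python
import numpy as np

def eddy_covariance_flux(w, c):
    """Kinematic flux <w'c'> from time series of vertical wind w and scalar c."""
    w_prime = w - w.mean()  # deviations from the run mean
    c_prime = c - c.mean()
    return np.mean(w_prime * c_prime)

rng = np.random.default_rng(4)
w = rng.normal(0.0, 0.5, 3000)                       # dummy vertical velocity (m/s)
theta = 288.0 + 0.2 * w + rng.normal(0, 0.1, 3000)   # dummy correlated temperature (K)
print(eddy_covariance_flux(w, theta))                # kinematic heat flux (K m/s)
```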

Relevance:

60.00%

Publisher:

Abstract:

Thirty-five esophageal biopsies from patients in Linxian province, China, were studied by the digital signature method, classified by two observers with extensive experience in gastrointestinal pathology as normal, dysplasia, or carcinoma (8 normal cases, 6 mild dysplasias, 8 moderate dysplasias, 4 severe dysplasias, 4 carcinomas suspected of invasion, and 5 invasive carcinomas). The objective of the work was to characterize the nuclei of the cell populations of these cases so as to allow diagnostic information, and information of possible prognostic implication, to be derived from the quantitative study of the nuclear characteristics of each case or diagnostic category. The biopsies were stained by the Feulgen method, and 48 to 50 nuclei from each were selected and digitized. From each nucleus, 93 karyometric features were extracted and arbitrarily arranged in a histogram designated the nuclear signature. The arithmetic mean of each feature over the nuclei of a biopsy yielded the digital signature of the case. Discriminant function analysis, based on the 15 karyometric features that offered the best discrimination between diagnostic categories, showed that the group classified as normal was clearly distinct from the other categories. Total optical density increased progressively with the classification of the biopsies, from normal to severe dysplasia, the carcinoma value being similar to that of moderate dysplasia. The run-length matrix showed the same profile; that is, both features offered clear discrimination between the diagnostic categories, except for invasive carcinoma, whose values overlapped those of moderate dysplasia. The study demonstrated the feasibility of quantifying nuclear characteristics through digital nuclear signatures, presented as histograms, which showed statistically significant differences between diagnostic categories and a progressive increase of the measured values along the spectrum of lesions.
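
A sketch of the signature construction described above, with dummy values in place of the 93 real karyometric features: each nucleus yields a feature vector (its nuclear signature), and the case's digital signature is the arithmetic mean over its nuclei.

```python
import numpy as np

rng = np.random.default_rng(5)
n_nuclei, n_features = 50, 93
nuclear_signatures = rng.random((n_nuclei, n_features))  # one row per nucleus
case_signature = nuclear_signatures.mean(axis=0)         # the case's digital signature
print(case_signature.shape)                              # (93,)
```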

Relevance:

60.00%

Publisher:

Abstract:

In this article we consider a control chart based on the sample variances of two quality characteristics. The points plotted on the chart correspond to the maximum of these two statistics. The main reason to prefer the proposed chart to the generalized variance |S| chart is its better diagnostic feature: with the new chart it is easier to relate an out-of-control signal to the variables whose parameters have moved away from their in-control values. We study the chart's efficiency for different shifts in the covariance matrix, obtaining the average run length (ARL), which measures the effectiveness of a control chart in detecting process shifts. The proposed chart always detects process disturbances faster than the generalized variance |S| chart. The same holds when the sample size is variable, except in a few cases in which it switches between a small size and a very large size.
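
A Monte Carlo sketch of the ARL quantity the article uses: plot the larger of two sample variances each period and count periods until it exceeds a control limit. The limit, sample size, and shift below are illustrative, not the article's design.

```python
import numpy as np

def run_length(limit, sigma1=1.0, sigma2=1.0, n=5, rng=None):
    """Periods until max of the two sample variances exceeds the limit."""
    rng = rng or np.random.default_rng()
    t = 0
    while True:
        t += 1
        s1 = rng.normal(0, sigma1, n).var(ddof=1)
        s2 = rng.normal(0, sigma2, n).var(ddof=1)
        if max(s1, s2) > limit:
            return t

rng = np.random.default_rng(6)
arl = np.mean([run_length(limit=3.0, sigma1=1.5, rng=rng) for _ in range(2000)])
print(arl)  # estimated average run length under a shift in the first variance
```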

Relevance:

60.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

60.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

60.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)