997 results for 662


Relevance:

10.00%

Publisher:

Abstract:

The Knapsack Cryptosystem of Merkle and Hellman (1978) is one of the earliest public-key cryptography schemes. The security of the method relies on the difficulty of solving Subset Sum Problems (also known as Knapsack Problems). In this paper, we first provide a brief history of knapsack-based cryptosystems and the cryptanalytic attacks against them. We then review advances in integer programming approaches to 0-1 Knapsack Problems, with a focus on polyhedral studies of the convex hull of the integer set. Finally, we discuss potential future research directions for applying integer programming to the cryptanalysis of knapsack ciphers.
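To make concrete why the scheme's security reduces to subset sums, the following is a minimal toy sketch of a Merkle-Hellman-style knapsack cipher. The key-generation procedure, parameter sizes, and function names are simplified assumptions for illustration only, not the parameters of the original 1978 construction.

```python
# Toy sketch: breaking the public ciphertext without the trapdoor is a Subset Sum Problem.
import random
from math import gcd

def keygen(n=8):
    # Private key: superincreasing sequence w, modulus q > sum(w), multiplier r coprime to q.
    w, total = [], 0
    for _ in range(n):
        nxt = random.randint(total + 1, 2 * total + 2)
        w.append(nxt)
        total += nxt
    q = random.randint(total + 1, 2 * total)
    r = next(x for x in range(2, q) if gcd(x, q) == 1)
    b = [(r * wi) % q for wi in w]          # public "hard" knapsack
    return (w, q, r), b

def encrypt(bits, b):
    # Ciphertext = subset sum over the public knapsack.
    return sum(bi for bit, bi in zip(bits, b) if bit)

def decrypt(c, w, q, r):
    # The trapdoor maps the hard instance back to an easy superincreasing one.
    c = (c * pow(r, -1, q)) % q
    bits = []
    for wi in reversed(w):
        take = 1 if c >= wi else 0
        bits.append(take)
        c -= wi * take
    return bits[::-1]

if __name__ == "__main__":
    (w, q, r), b = keygen()
    message = [1, 0, 1, 1, 0, 0, 1, 0]
    assert decrypt(encrypt(message, b), w, q, r) == message
```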

Relevance:

10.00%

Publisher:

Abstract:

Images published on online social sites such as Facebook are increasingly misused for malicious purposes. However, existing image forensic research assumes that the investigator can confiscate every piece of evidence, and hence overlooks the fact that the original image is difficult to obtain. Because Facebook applies Discrete Cosine Transform (DCT)-based compression to uploaded images, we are able to detect modified images that are re-uploaded to Facebook. Specifically, we propose a novel method to detect the presence of double compression in the spatial domain of the image: we select small image patches from a given image, define a distance metric to measure the differences between compressed images, and propose an algorithm to infer whether the given image is double compressed without reference to the original image. To demonstrate the correctness of our algorithm, we show that it correctly predicts the number of compressions applied to a Facebook image.
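The sketch below is not the authors' patch-based algorithm; it only illustrates the general recompression-distance idea behind spatial-domain double-compression detection, using Pillow and NumPy. The quality setting, threshold, and filename are illustrative assumptions.

```python
# Heuristic: an image that has already been JPEG-compressed at a similar quality
# changes very little when recompressed, so a small recompression distance is
# suspicious. This is an assumption-laden sketch, not the paper's method.
import io
import numpy as np
from PIL import Image

def recompression_distance(img: Image.Image, quality: int = 75) -> float:
    """Mean absolute pixel difference between an image and its re-saved JPEG."""
    buf = io.BytesIO()
    img.convert("RGB").save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    recompressed = Image.open(buf).convert("RGB")
    a = np.asarray(img.convert("RGB"), dtype=np.float32)
    b = np.asarray(recompressed, dtype=np.float32)
    return float(np.abs(a - b).mean())

def looks_double_compressed(path: str, quality: int = 75, threshold: float = 1.0) -> bool:
    # Threshold chosen arbitrarily for illustration; a real detector would
    # calibrate it (e.g. per patch) against known single-compressed images.
    img = Image.open(path)
    return recompression_distance(img, quality) < threshold

if __name__ == "__main__":
    print(looks_double_compressed("photo_from_facebook.jpg"))  # hypothetical file
```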

Relevance:

10.00%

Publisher:

Abstract:

Reducing the size of ensemble classifiers is important for various security applications. Most known pruning algorithms belong to one of three categories: ranking-based, clustering-based, and optimisation-based methods. The present paper introduces and investigates a new pruning technique, called the Three-Level Pruning Technique (TLPT) because it combines all three approaches across three levels of the process. The paper investigates the TLPT method combining the state-of-the-art ranking of Ensemble Pruning via Individual Contribution ordering (EPIC), the clustering of K-Means Pruning (KMP), and the optimisation method of Directed Hill Climbing Ensemble Pruning (DHCEP) on a phishing dataset. The new experiments presented in this paper show that TLPT is competitive with EPIC, KMP and DHCEP, and can achieve better outcomes. These experimental results demonstrate the effectiveness of the TLPT technique in this information security application.
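As a rough illustration of how ranking, clustering, and hill-climbing can be chained, here is a simplified sketch for a fitted scikit-learn ensemble (e.g. the estimators_ list of a bagging classifier plus a held-out validation set). It is not the EPIC, KMP or DHCEP algorithms; the parameter values, the majority-vote scoring, and the assumption of non-negative integer class labels are all illustrative choices.

```python
import numpy as np
from sklearn.cluster import KMeans

def majority_vote_accuracy(members, X_val, y_val):
    votes = np.array([m.predict(X_val) for m in members]).astype(int)
    majority = np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
    return float((majority == y_val).mean())

def three_level_prune(members, X_val, y_val, keep_ranked=20, n_clusters=8, final_size=5):
    # Level 1 (ranking): keep the individually most accurate ensemble members.
    accs = [m.score(X_val, y_val) for m in members]
    ranked = [members[i] for i in np.argsort(accs)[::-1][:keep_ranked]]

    # Level 2 (clustering): cluster prediction vectors, keep one member per cluster
    # to drop near-duplicate classifiers.
    preds = np.array([m.predict(X_val) for m in ranked])
    km = KMeans(n_clusters=min(n_clusters, len(ranked)), n_init=10, random_state=0)
    labels = km.fit_predict(preds)
    reps = [ranked[int(np.where(labels == c)[0][0])] for c in np.unique(labels)]

    # Level 3 (optimisation): greedy forward hill-climbing on the validation
    # accuracy of the majority vote of the growing sub-ensemble.
    chosen = []
    while reps and len(chosen) < final_size:
        best = max(reps, key=lambda m: majority_vote_accuracy(chosen + [m], X_val, y_val))
        chosen.append(best)
        reps.remove(best)
    return chosen
```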

Relevance:

10.00%

Publisher:

Abstract:

This paper reports on the increasing popularity of outsourcing academic work among university students, motivated by the lure of lucrative dividends and visa opportunities. Owing to a lack of formal methods for detecting such transactions, freelance websites are thriving as facilitators of the trade in outsourced assignments. This is compounded by the fact that many university staff have neither the time nor the training to perform complex media analysis and forensic investigations. This paper proposes a method to aid in the identification of those who outsource assignment work on the most popular such site, freelancer.com. We include a recent real-world case study to demonstrate the relevance and applicability of our methodology. In this case study, a suspect attempts to evade detection through the use of anti-forensics, demonstrating students' awareness of, and capability with, evasion techniques.

Relevance:

10.00%

Publisher:

Abstract:

Phylogenetic generalised least squares (PGLS) is one of the most commonly employed phylogenetic comparative methods. The technique, a modification of generalised least squares, uses knowledge of phylogenetic relationships to produce an estimate of expected covariance in cross-species data. Closely related species are assumed to have more similar traits because of their shared ancestry and hence produce more similar residuals from the least squares regression line. By taking into account the expected covariance structure of these residuals, modified slope and intercept estimates are generated that can account for interspecific autocorrelation due to phylogeny. Here, we provide a basic conceptual background to PGLS for those unfamiliar with the approach. We describe the requirements for a PGLS analysis and highlight the packages that can be used to implement the method. We show how phylogeny is used to calculate the expected covariance structure in the data and how this is applied to the generalised least squares regression equation. We demonstrate how PGLS can incorporate information about phylogenetic signal, the extent to which closely related species truly are similar, and how it controls for this signal appropriately, thereby negating concerns about unnecessarily ‘correcting’ for phylogeny. In addition to discussing the appropriate way to present the results of PGLS analyses, we highlight some common misconceptions about the approach and commonly encountered problems with the method. These include misunderstandings about what phylogenetic signal refers to in the context of PGLS (residual errors, not the traits themselves), and issues associated with unknown or uncertain phylogeny.
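The core calculation is the standard generalised least squares estimator with a phylogenetic covariance matrix, beta_hat = (X' V^-1 X)^-1 X' V^-1 y. The NumPy sketch below assumes V has already been derived from the tree (for example, shared branch lengths under Brownian motion); the variable names and toy values are illustrative, and real analyses would normally use dedicated comparative-methods packages rather than code like this.

```python
import numpy as np

def pgls_coefficients(X, y, V):
    """Generalised least squares: beta_hat = (X' V^-1 X)^-1 X' V^-1 y."""
    X = np.column_stack([np.ones(len(y)), X])   # add an intercept column
    Vinv = np.linalg.inv(V)
    return np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)  # [intercept, slope]

if __name__ == "__main__":
    # Three species; V encodes that species 0 and 1 share more history with each
    # other than either does with species 2 (toy values, not real branch lengths).
    V = np.array([[1.0, 0.6, 0.1],
                  [0.6, 1.0, 0.1],
                  [0.1, 0.1, 1.0]])
    X = np.array([1.0, 2.0, 3.0])   # predictor trait
    y = np.array([2.1, 3.9, 6.2])   # response trait
    print(pgls_coefficients(X, y, V))
```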

Relevance:

10.00%

Publisher:

Abstract:

With the increasing use of location-based services, location privacy has recently raised serious concerns. To protect a user from being identified, a cloaked spatial region that contains the user's k-1 nearest neighbors is used to replace the user's exact position. In this paper, we consider location-aware applications in which the services offered differ among regions. To search for nearest neighbors, we define a novel distance measurement that combines semantic distance and Euclidean distance to address the privacy-preservation issue in such applications. We also propose an algorithm, kNNH, to implement our proposed method. The experimental results further suggest that the proposed distance metric and algorithm can successfully retain the utility of the location services while preserving users' privacy.
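The sketch below illustrates the general idea of weighting a semantic distance against the Euclidean distance when picking the k-1 neighbors for a cloaking region. The weighting scheme, the toy region-based semantic distance, and the data structures are assumptions made for illustration; they are not the paper's kNNH algorithm.

```python
import math
from typing import List, Tuple

# (x, y, region_label) -- the region label stands in for whatever semantic
# attribute the application uses to distinguish services between regions.
User = Tuple[float, float, str]

def combined_distance(a: User, b: User, semantic_weight: float = 0.5) -> float:
    euclidean = math.hypot(a[0] - b[0], a[1] - b[1])
    # Toy semantic distance: 0 if both users are in the same service region, 1 otherwise.
    semantic = 0.0 if a[2] == b[2] else 1.0
    return semantic_weight * semantic + (1.0 - semantic_weight) * euclidean

def cloaking_neighbors(user: User, others: List[User], k: int) -> List[User]:
    """Return the k-1 nearest neighbors under the combined distance."""
    return sorted(others, key=lambda o: combined_distance(user, o))[: k - 1]

if __name__ == "__main__":
    me = (0.0, 0.0, "shopping")
    crowd = [(0.2, 0.1, "shopping"), (0.1, 0.3, "residential"), (1.5, 0.2, "shopping")]
    print(cloaking_neighbors(me, crowd, k=3))
```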

Relevance:

10.00%

Publisher:

Abstract:

The book provides an analysis of the grocery retail market in a very large number of countries, together with an international report written by an economist. The chapter on Australia describes how Australian law addresses competition concerns arising from the grocery retail market, analyses its success in addressing these concerns and considers possible future reform.

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND AND AIMS: Problem gamblers are not a homogeneous group and recent data suggest that subtyping can improve treatment outcomes. This study administered three readiness rulers and aimed to identify subtypes of gamblers accessing a national web-based counselling service based on these rulers. METHODS: Participants were 1204 gamblers (99.4% problem gamblers) who accessed a single session of web-based counselling in Australia. Measures included three readiness rulers (importance, readiness and confidence to resist an urge to gamble), demographics and the Problem Gambling Severity Index (PGSI). RESULTS: Gamblers reported high importance of change [mean = 9.2, standard deviation (SD) = 1.51] and readiness to change (mean = 8.86, SD = 1.84), but lower confidence to resist an urge to gamble (mean = 3.93, SD = 2.44) compared with importance and readiness. The statistical fit indices of a latent class analysis identified a four-class model. Subtype 1 was characterized by a very high readiness to change and very low confidence to resist an urge to gamble (n = 662, 55.0%) and subtype 2 reported high readiness and low confidence (n = 358, 29.7%). Subtype 3 reported moderate ratings on all three rulers (n = 139, 11.6%) and subtype 4 reported high importance of change but low readiness and confidence (n = 45, 3.7%). A multinomial logistic regression indicated that subtypes differed by gender (P < 0.001), age (P = 0.01), gambling activity (P < 0.05), preferred mode of gambling (P < 0.001) and PGSI score (P < 0.001). CONCLUSIONS: Problem gamblers in Australia who seek web-based counselling comprise four distinct subgroups based on self-reported levels of readiness to change, confidence to resist the urge to gamble and importance of change.

Relevance:

10.00%

Publisher:

Abstract:

Requirements written in multiple languages can be error-prone, inconsistent and incorrect. In a Malaysian setting, software engineers are exposed to both Malay and English requirements, which can be challenging for them, especially when capturing and analyzing requirements. Further, they face difficulties in modeling requirements using semi-formal or formal models. This paper introduces a new approach, Pair-Oriented Requirements Engineering (PORE), that uses an Essential Use Case (EUC) model to capture and analyze multi-lingual requirements. The approach is intended to assist practitioners in developing correct and consistent requirements as well as in developing teamwork skills. Two quasi-experimental studies, involving 80 participants in the first study and 38 in a subsequent study, were conducted to evaluate the effectiveness of the approach with respect to correctness and the time spent capturing multi-lingual requirements. It was found that PORE improves accuracy and hence helps users develop higher-quality requirements models.

Relevance:

10.00%

Publisher:

Abstract:

Capturing security requirements is a complex process, but it is crucial to the success of a secure software product. Hence, requirements engineers need security knowledge when eliciting and analyzing security requirements from business requirements. However, the majority of requirements engineers lack such knowledge and skills, and they face difficulties in capturing and understanding many security terms and issues. This results in inaccurate, inconsistent and incomplete security requirements that in turn may lead to insecure software systems. In this paper, we describe a new approach to capturing security requirements using an extended Essential Use Cases (EUC) model. This approach enhances the process of capturing and analyzing security requirements to produce accurate and complete requirements. We have evaluated our prototype tool through usability testing and through an assessment, by security engineering experts, of the quality of the generated EUC security patterns.

Relevance:

10.00%

Publisher:

Abstract:

Dissertation presented to the Graduate Program in Communication (Programa de Pós-Graduação em Comunicação) of the Universidade Municipal de São Caetano do Sul in fulfilment of the requirements for the degree of Master in Communication.

Relevance:

10.00%

Publisher:

Abstract:

Cardiomyoplasty (CDM) has been proposed as an alternative surgical treatment for patients with advanced dilated or ischaemic cardiomyopathy. Clinical and experimental results show that the procedure attenuates ventricular remodelling through dynamic or passive compression of the myocardium by the latissimus dorsi (LD) muscle. In addition, studies have observed the formation of collateral vessels from the LD to the heart after CDM. Myocardial infarction (MI) induces ventricular dysfunction and remodelling and has been widely used in the literature as an experimental model of myocardial ischaemia. The application of angiogenic factors directly into the ischaemic myocardium has shown positive results in stimulating collateral formation. The aim of the present study was to evaluate the effects of CDM combined with VEGF165 treatment on ventricular function and on the development of extramyocardial collateral flow in infarcted rats. Male Wistar rats (n=57, 220-250 g) were divided into infarcted and control groups. The temporal changes induced by MI (ligation of the left coronary artery) were evaluated at 14 (IM-14) and 56 (IM-56) days after infarction and compared with the respective controls (C-14 and C-56). Control (C-CDM) and infarcted (IM-CDM) animals underwent passive CDM (without LD stimulation) 14 days after MI and were evaluated at 56 days. Control rats underwent sham MI and CDM surgery (S-IMCDM) and infarcted rats underwent sham CDM surgery (IM-SCDM) to check for changes induced by the surgical procedures themselves. A group of infarcted rats received a 25 µg dose of VEGF165 into the main artery of the LD immediately before CDM (14 days after MI) and was evaluated at 56 days (IMCDM-VEGF). At the end of the protocol the animals were anaesthetised (sodium pentobarbital, 40 mg/kg) and the right carotid artery was cannulated to record arterial pressure. The cannula was then advanced into the left ventricle (LV) to record ventricular pressure. Pressure signals were recorded and processed with a data-acquisition system (CODAS, 1 kHz). Cardiac output (CO) and regional flows (heart and kidneys) were assessed by infusing 300,000 blue microspheres into the left ventricle. After infusion of 50,000 yellow microspheres into the main artery of the LD, the extramyocardial collateral flow from the LD to the heart (FCO LD→heart) was quantified by dividing the number of yellow microspheres in the heart by the number of yellow microspheres in the LD. After occlusion of the LD artery, 300,000 blue microspheres were infused into the LV and the extramyocardial collateral flow from the heart to the LD (FCO heart→LD) was assessed by dividing the number of blue microspheres in the LD by the number of blue microspheres in the heart. MI induced hypotension and an increase in end-diastolic pressure (EDP) in the IM-14 (84±6 and 6.88±2.6 mmHg) and IM-56 (98±3 and 15.4±2 mmHg) groups relative to the respective controls (C-14: 102±4 and -3.2±0.5; C-56: 114±3 and 0.5±1.7 mmHg). Cardiac output was lower in the IM-56 group (49.5±9 ml/min) than in the IM-14 group (72±9 ml/min). The maximum rate of LV relaxation (-dP/dt) was reduced in the IM-14 (-2416±415 vs -4141±309 mmHg/s in C-14) and IM-56 (-3062±254 vs -4191±354 mmHg/s in C-56) groups, and the maximum rate of LV contraction (+dP/dt) only in the IM-56 group (4191±354 vs 5420±355 mmHg/s in C-56).
MI did not alter coronary flow or coronary vascular resistance; however, renal flow was reduced and renal vascular resistance increased in the IM-56 group compared with the C-56 group. Animals at 56 days after MI showed increased ventricular weight (VW) and ventricular weight/body weight ratio (VW/BW) relative to controls (1.3±0.04 vs 0.98±0.04 g and 3.37±0.08 vs 2.54±0.09 mg/g in C-56). Infarct size was smaller in the IM-14 group (35±3% of the LV) than in the IM-56 group (44±2% of the LV). The sham groups showed no changes in the evaluated parameters relative to their controls. Infarcted rats submitted to CDM showed neither the hypotension (105±2 mmHg) nor the increased EDP (4.8±1.7 mmHg) observed in the IM-56 group. Heart rate, CO, peripheral vascular resistance, and coronary flow and resistance were similar among the C-56, IM-56, C-CDM and IM-CDM groups. +dP/dt and -dP/dt were reduced in the C-CDM and IM-CDM groups relative to the C-56 group. Renal flow and vascular resistance were normalised in IM-CDM rats. VW (1.11±0.04 g) and the VW/BW ratio (2.94±0.09 mg/g) were similar to C-56 values, and infarct size was similar in the IM-56 and IM-CDM groups (44±2 vs 45±3% of the LV). The IMCDM-VEGF group showed normalisation of haemodynamic and morphometric parameters similar to that of the IM-CDM group when compared with the IM-56 group. Coronary resistance was reduced in IMCDM-VEGF animals (22.07±2.01 mmHg/ml/min/g) compared with the C-CDM group (37.81±4 mmHg/ml/min/g), although coronary flow was similar among the groups submitted to CDM. FCO heart→LD occurred predominantly in the C-CDM and IMCDM-VEGF groups (70% and 83.3% vs 28.6% in IM-CDM), whereas FCO LD→heart was observed in all animals of the IM-CDM and IMCDM-VEGF groups (20% in C-CDM). VEGF165 administration increased FCO LD→heart both in absolute values and normalised per gram (24.85±10.3% and 62.29±23.27%/g) relative to the C-CDM (0.88±0.89% and 1.42±1.42%/g) and IM-CDM (4.43±1.45% and 7.66±2.34%/g) groups. Normalised FCO LD→heart was greater in IM-CDM than in C-CDM animals. The IMCDM-VEGF group (4.47±1.46%/g) showed greater normalised FCO heart→LD than the IM-CDM group (2.43±1.44%/g). Infarct size was smaller in the IMCDM-VEGF group (36±3% of the LV) than in the IM-56 and IM-CDM groups. Positive correlations were found between FCO LD→heart and stroke volume (r=0.7) and coronary flow (r=0.7), and between EDP and the VW/BW ratio (r=0.8) and infarct size (r=0.6). In addition, inverse correlations were observed between infarct size and stroke volume (r=0.8) and FCO LD→heart (r=0.7). These results indicate that passive CDM prevented LV dysfunction and remodelling in infarcted rats. VEGF165 application reduced infarct size, which may be associated with the increase in extramyocardial collateral flow from the LD to the heart observed in this VEGF-treated group. These findings suggest that angiogenic factors such as VEGF165 can improve perfusion of the ischaemic regions of the infarcted heart, limiting tissue loss. This effect, combined with the passive compression of the infarcted LV by the LD after CDM, may prevent the dysfunction resulting from myocardial ischaemia.
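Expressed as equations, the two collateral-flow indices described above are simply the microsphere count ratios already defined in the text, where N denotes the number of microspheres of the stated colour recovered in each tissue:

\[
\mathrm{FCO}_{\mathrm{LD}\rightarrow\mathrm{heart}} = \frac{N^{\text{yellow}}_{\text{heart}}}{N^{\text{yellow}}_{\text{LD}}},
\qquad
\mathrm{FCO}_{\mathrm{heart}\rightarrow\mathrm{LD}} = \frac{N^{\text{blue}}_{\text{LD}}}{N^{\text{blue}}_{\text{heart}}}
\]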