38 results for Probability Metrics

in Deakin Research Online - Australia


Relevance:

100.00%

Publisher:

Abstract:

Flash crowds and DDoS (Distributed Denial-of-Service) attacks have very similar properties in terms of Internet traffic; however, Flash crowds are legitimate flows while DDoS attacks are illegitimate flows, and DDoS attacks have been a serious threat to Internet security and stability. In this paper we propose a set of novel methods using probability metrics to distinguish DDoS attacks from Flash crowds effectively, and our simulations show that the proposed methods work well. In particular, these methods can not only distinguish DDoS attacks from Flash crowds clearly, but can also distinguish an anomalous flow, whether a DDoS attack flow or a Flash crowd flow, from normal network flow effectively. Furthermore, we show that our proposed hybrid probability metrics can greatly reduce both false positive and false negative rates in detection.
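The abstract does not name the specific probability metrics used. As a minimal sketch, assuming a discrete distance such as total variation over per-source request distributions (the `flash_profile` reference distribution and the 0.4 threshold are illustrative assumptions, not values from the paper), a detector might look like:

```python
def normalize(counts):
    """Turn raw per-source request counts into a probability distribution."""
    total = sum(counts.values())
    return {src: c / total for src, c in counts.items()}

def total_variation(p, q):
    """Total variation distance between two discrete distributions
    (0 = identical, 1 = disjoint supports)."""
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(k, 0.0) - q.get(k, 0.0)) for k in support)

def classify_flow(observed_counts, flash_profile, threshold=0.4):
    """Flag a flow as a suspected DDoS attack when its source distribution is
    far from a known Flash-crowd profile; the threshold is illustrative only."""
    d = total_variation(normalize(observed_counts), flash_profile)
    return "ddos" if d > threshold else "flash_crowd"
```

The same skeleton works with other distances (e.g. Hellinger) by swapping out `total_variation`.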

Relevance:

20.00%

Publisher:

Abstract:

Sixteen young (25±2.6 years) and 16 older individuals (69±4.4 years) walked normally and then terminated walking rapidly. A visual stopping stimulus was presented 10 ms following ground contact (short delay) and, in another condition, 450 ms prior to toe-off (long delay). Stimulus probability was either high (80% of trials) or low (10%). The younger group stopped faster (463 vs. 574 ms) despite also walking faster (1.29 vs. 1.17 m·s⁻¹). The longer delay decreased one-step responses, but older participants used significantly more (slower) two-step stopping, which increased stopping time and distance. The additional step may have been pre-planned to maintain medial-lateral stability.

Relevance:

20.00%

Publisher:

Abstract:

The role of marketing channels is to implement marketing strategy. The difficulty of channel strategy is compounded by the emergence of e-channels and the need to integrate e-channels into traditional or “bricks and mortar” channels (Rowley 2002). As a result, managing performance across a greater number of channels with diverse characteristics is more difficult.

Organizational and marketing performance is to some degree a function of the quality of channel implementation, and particularly of channel performance measurement. The channels literature suggests a "channel performance metric paradox": approaches to channel performance metrics have been mutually orthogonal or even negatively correlated (Jeuland & Shugan 1983; Lewis & Lambert 1991; Larson & Lusch 1992). This paradox implies that it is impossible for all channel performance metrics to be maximized simultaneously, and that tradeoffs exist.

This paper proposes a research model and propositions that extend previous research and attempt to reconcile this "channel performance metric paradox". The model assumes that testing the relationship between the Miles and Snow strategy types and a comprehensive range of channel performance metrics may explain the paradox. Previous implementation performance research has focused more on the Porter strategies than on the Miles and Snow strategy types.

Relevance:

20.00%

Publisher:

Abstract:

This article presents a profiling tool for identifying students' knowledge of chance.

Relevance:

20.00%

Publisher:

Abstract:

Image fusion quality metrics have evolved from image processing quality metrics. They measure the quality of fused images by estimating how much localized information has been transferred from the source images into the fused image. However, this technique assumes that it is actually possible to fuse two images into one without any loss. In practice, some features must be sacrificed or relaxed in both source images. Relaxed features might be very important, such as edges, gradients and texture elements, and the importance of a given feature is application dependent. This paper presents a new method for image fusion quality assessment, based on estimating how much valuable information has not been transferred.
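The abstract does not give the exact formulation. A toy sketch of the "lost information" idea, assuming 1-D signals and absolute first differences as a crude stand-in for localized feature strength (both assumptions mine, not the paper's), might be:

```python
def gradients(signal):
    """Absolute first differences as a crude proxy for local edge strength."""
    return [abs(b - a) for a, b in zip(signal, signal[1:])]

def lost_information(source_a, source_b, fused):
    """Sum of feature strength present in either source but missing from the
    fused signal; lower is better (0 = nothing valuable was dropped)."""
    ga, gb, gf = gradients(source_a), gradients(source_b), gradients(fused)
    return sum(max(0.0, max(a, b) - f) for a, b, f in zip(ga, gb, gf))
```

A 2-D version would use gradient magnitudes over local windows instead of first differences.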

Relevance:

20.00%

Publisher:

Abstract:

Specific scales were developed for discriminating child sexual offenders with different classes of victim. The project demonstrates a method of individualising scores on actuarial risk assessment measures in a way that makes them more meaningful for those involved in decision-making about individual child sexual offenders. At present, the only quantifiable approach to specific decision-making relies on a general prediction of future behaviour based on group data. The Bayesian approach is one method that can be used to assist decision-makers to use this information in ways that lead to more appropriate management of risk. Ultimately, better management of known child sexual offenders will lead to fewer offences and a reduction in the number of children whose lives are profoundly affected by sexual victimisation.

Relevance:

20.00%

Publisher:

Abstract:

This thesis reports on a quantitative exposure assessment and on an analysis of the attributes of the data used in the estimations, in particular distinguishing between its uncertainty and variability. A retrospective assessment of exposure to benzene was carried out for a case-control study of leukaemia in the Australian petroleum industry. The study used the mean of personal task-based measurements (Base Estimates) in a deterministic algorithm and applied factors to model back to places, times, etc. for which no exposure measurements were available. Mean daily exposures were estimated, on an individual-subject basis, by summing the task-based exposures. These mean exposures were multiplied by the years spent in each job to provide exposure estimates in ppm-years, which were summed to provide a Cumulative Estimate for each subject. Validation was completed for the model and key inputs. Exposures were low; most jobs were below a TWA of 5 ppm benzene. Exposures in terminals were generally higher than at refineries. Cumulative Estimates ranged from 0.005 to 50.9 ppm-years, with 84 percent less than 10 ppm-years. Exposure probability distributions were developed for tanker drivers using Monte Carlo simulation of the exposure estimation algorithm. The outcome was a lognormal distribution of exposure for each driver. These provide the basis for alternative risk assessment metrics, e.g. the frequency of short but intense exposures, which contributed only minimally to the long-term average exposure but may increase the risk of leukaemia. The effects of different inputs to the model were examined and their significance assessed using Monte Carlo simulation. The Base Estimates were the most important determinant of exposure in the model. The sources of variability in the measured data were examined, including the effect of having censored data and the between- and within-worker variability.
The sources of uncertainty in the exposure estimates were analysed and consequential improvements in exposure assessment identified. Monte Carlo sampling was also used to examine the uncertainties and variability associated with the tanker drivers' exposure assessment, to derive an estimate of the range and to put confidence intervals on the daily mean exposures. The identified uncertainty was less than the variability associated with the estimates. The traditional approach to exposure estimation typically derives only point estimates of mean exposure. The approach developed here allows a range of exposure estimates to be made and provides a more flexible and improved basis for risk assessment.
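The estimation algorithm described above (task-based daily exposures, multiplied by years per job and summed to a Cumulative Estimate, with Monte Carlo simulation over lognormal inputs) can be sketched as follows; the work history, geometric standard deviations and trial count are hypothetical placeholders, not values from the thesis:

```python
import math
import random

def simulate_cumulative_exposure(jobs, n_trials=5000, seed=42):
    """Monte Carlo sketch of the exposure algorithm: for each trial, draw a
    lognormal daily mean exposure (ppm) per job, convert to ppm-years by
    multiplying by years in the job, and sum across jobs into a Cumulative
    Estimate. Returns a distribution of cumulative exposures."""
    rng = random.Random(seed)
    cumulative = []
    for _ in range(n_trials):
        total = 0.0
        for geo_mean_ppm, gsd, years in jobs:
            mu, sigma = math.log(geo_mean_ppm), math.log(gsd)
            total += rng.lognormvariate(mu, sigma) * years
        cumulative.append(total)
    return cumulative

# Hypothetical work history: (geometric mean ppm, geometric SD, years in job).
history = [(0.5, 2.0, 10), (1.5, 2.5, 5)]
estimates = simulate_cumulative_exposure(history)
```

Percentiles of `estimates` then give the range and confidence intervals that a single deterministic point estimate cannot.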

Relevance:

20.00%

Publisher:

Abstract:

A dichotomy in female extrapair copulation (EPC) behavior, with some females seeking EPC and others not, is inferred if the observed distribution of extrapair young (EPY) over broods differs from a random process at the level of individual offspring (binomial, hypergeometric, or Poisson). A review of the literature shows that such null models are virtually always rejected, often with large effect sizes. We formulate an alternative null model, which assumes that 1) the number of EPC has a random (Poisson) distribution across females (broods) and that 2) the probability for an offspring to be of extrapair origin is zero without any EPC and increases with the number of EPC. Our brood-level model can accommodate the bimodality of both zero and medium rates of EPY typically found in empirical data, and fitting our model to the EPY production of 7 passerine bird species shows evidence of a nonrandom distribution of EPY in only 2 species. We therefore argue that 1) dichotomy in extrapair mate choice cannot be inferred only from a significant deviation of the observed distribution of EPY from a random process at the level of offspring and that 2) additional empirical work testing the contrasting critical predictions of the classic and our alternative null models is required.
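The alternative null model can be simulated directly; the Poisson rate, the functional form linking EPC count to paternity probability, and the brood size below are illustrative assumptions, not parameters from the paper:

```python
import math
import random

def poisson(rng, lam):
    """Poisson draw via Knuth's multiplication method (fine for small lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def p_extrapair(k, p=0.3):
    """Probability an offspring is extrapair given k EPC: zero without any
    EPC, increasing with k (one hedged choice of functional form)."""
    return 1.0 - (1.0 - p) ** k

def simulate_broods(n_broods=1000, brood_size=5, lam=1.0, seed=7):
    """Draw a Poisson number of EPC per female, then the number of extrapair
    young (EPY) in her brood; returns the EPY count for each brood."""
    rng = random.Random(seed)
    epy = []
    for _ in range(n_broods):
        q = p_extrapair(poisson(rng, lam))
        epy.append(sum(1 for _ in range(brood_size) if rng.random() < q))
    return epy
```

The resulting EPY histogram shows the excess of zero-EPY broods (females with no EPC) alongside broods with intermediate EPY rates, without assuming any dichotomy among females.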

Relevance:

20.00%

Publisher:

Abstract:

The use of Ecstasy and related drugs (ERD) has increasingly been the focus of epidemiological and other public-health-related research. One of the more promising methods is the use of the Internet as a recruitment and survey tool. However, there remain methodological concerns and questions about representativeness. Three samples of ERD users in Melbourne, Australia, surveyed in 2004 are compared in terms of a number of key demographic and drug use variables. The Internet, face-to-face, and probability sampling methods appear to access similar but not identical groups of ERD users. Implications and limitations of the study are noted and future research is recommended.

Relevance:

20.00%

Publisher:

Abstract:

There are two common means of propagating worms: scanning vulnerable computers in the network and sending out malicious email attachments. Modeling the propagation of worms can help us understand how worms spread and devise effective defence strategies. Most traditional models simulate only the overall scale of the infected network at each time tick, making them unsuitable for examining the propagation procedure among individual nodes in depth. For this reason, this paper proposes a novel probability matrix to model the propagation mechanism of the two main classes of worms (scanning and email worms) by concentrating on the propagation probability. The objective of this paper is to assess the spreading and work out an effective scheme against the worms. In order to evaluate the effects of each major component in our probability model, we implement a series of experiments for both worms. From the results, network administrators can decide how to reduce the number of vulnerable nodes to a certain threshold for scanning worms, and how to immunize the highly connected nodes to prevent propagation of email worms.
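As a hedged illustration of a probability-matrix propagation model (the paper's exact update rule is not given in this abstract), one can track per-node marginal infection probabilities, where `P[i][j]` is the chance that node i infects node j in one time tick and attempts are assumed independent:

```python
def propagate(q, P):
    """One tick of a probability-matrix worm model: q[j] is the probability
    node j is infected; each node i mounts an independent infection attempt
    on j that succeeds with probability q[i] * P[i][j]."""
    n = len(q)
    new_q = []
    for j in range(n):
        escape = 1.0  # probability node j dodges every infection attempt
        for i in range(n):
            if i != j:
                escape *= 1.0 - q[i] * P[i][j]
        new_q.append(1.0 - (1.0 - q[j]) * escape)
    return new_q

# Three nodes, node 0 initially infected, fully connected, 50% per-tick success.
P = [[0.0, 0.5, 0.5], [0.5, 0.0, 0.5], [0.5, 0.5, 0.0]]
q = propagate([1.0, 0.0, 0.0], P)
```

Zeroing row and column entries of `P` for a patched node models the immunization decisions mentioned above.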

Relevance:

20.00%

Publisher:

Abstract:

Active Peer-to-Peer worms are a great threat to network security since they can propagate in automated ways and flood the Internet within a very short duration. Modeling the propagation process can help us devise effective strategies against a worm's spread. This paper presents a study on modeling a worm's propagation probability in a P2P overlay network and proposes an optimized patch strategy for defenders. First, we present a probability matrix model to construct the propagation of P2P worms. Our model involves three indispensable aspects of propagation: infected state, vulnerability distribution and patch strategy. Based on a fully connected graph, our comprehensive model is highly suited to real-world cases like Code Red II. Finally, by inspecting the propagation procedure, we propose four basic tactics for the defense of P2P botnets. The rationale is exposed by our simulated experiments, and the results show these tactics are effective and have considerable worth when applied in real-world networks.