953 results for Probability Metrics


Relevance:

100.00%

Publisher:

Abstract:

Flash crowds and DDoS (Distributed Denial-of-Service) attacks exhibit very similar properties in terms of Internet traffic; however, Flash crowds are legitimate flows while DDoS attacks are illegitimate, and DDoS attacks have been a serious threat to Internet security and stability. In this paper we propose a set of novel methods that use probability metrics to distinguish DDoS attacks from Flash crowds effectively, and our simulations show that the proposed methods work well. In particular, these methods can not only distinguish DDoS attacks from Flash crowds clearly, but can also distinguish an anomalous flow, whether a DDoS attack flow or a Flash crowd flow, from normal network flow effectively. Furthermore, we show that our proposed hybrid probability metrics can greatly reduce both false positive and false negative rates in detection.
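The abstract does not name the paper's specific probability metrics, so as an illustrative sketch only: one common probability metric, the total variation distance between empirical per-source traffic distributions, can separate a highly uniform botnet flow from a more diverse flash crowd. All traffic figures below are hypothetical.

```python
import numpy as np

def total_variation(p, q):
    """Total variation distance between two discrete distributions: 0.5 * L1 norm."""
    return 0.5 * np.abs(p - q).sum()

def flow_distribution(packet_counts):
    """Normalize per-source packet counts into an empirical distribution."""
    counts = np.asarray(packet_counts, dtype=float)
    return counts / counts.sum()

# Hypothetical traffic samples: packets per source address (binned)
baseline = flow_distribution([50, 50, 50, 50, 50])   # normal, even mix
flash    = flow_distribution([90, 85, 95, 70, 60])   # heavy but diverse sources
ddos     = flow_distribution([200, 198, 202, 1, 1])  # near-identical bot sources

# The DDoS flow's distance from the normal baseline is much larger than
# the flash crowd's, so thresholding the metric separates the two.
d_flash = total_variation(baseline, flash)
d_ddos  = total_variation(baseline, ddos)
```

A real detector would compare distributions across many flows and time windows; this only shows why a probability metric is a usable discriminator.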

Relevance:

60.00%

Publisher:

Abstract:

* Research supported by NATO GRANT CRG 900 798 and by Humboldt Award for U.S. Scientists.

Relevance:

30.00%

Publisher:

Abstract:

The quality of conceptual business process models is highly relevant for the design of corresponding information systems. In particular, a precise measurement of model characteristics can be beneficial from a business perspective, helping to save costs thanks to early error detection. This is just as true from a software engineering point of view; in this latter case, models facilitate stakeholder communication and software system design. Research has investigated several proposed measures for business process models, largely from a correlational perspective. This is helpful for understanding, for example, size and complexity as general driving forces of error probability. Yet design decisions usually have to build on thresholds, which can reliably indicate that a certain counter-action has to be taken. This cannot be achieved by providing measures alone; it requires a systematic identification of effective and meaningful thresholds. In this paper, we derive thresholds for a set of structural measures for predicting errors in conceptual process models. To this end, we use a collection of 2,000 business process models from practice as a means of determining thresholds, applying an adaptation of the ROC curves method. Furthermore, an extensive validation of the derived thresholds was conducted using 429 EPC models from an Australian financial institution. Finally, significant thresholds were adapted to refine existing modeling guidelines in a quantitative way.
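The threshold-derivation idea can be sketched with a minimal ROC-style cutoff search, here using Youden's J statistic (TPR − FPR) as the selection criterion. The measure values and error labels below are invented for illustration, and the paper's actual adaptation of the ROC method may differ.

```python
import numpy as np

def roc_threshold(values, has_error):
    """Pick the cutoff for a model measure (e.g. size) that best separates
    error-prone from error-free models, maximizing Youden's J = TPR - FPR."""
    values = np.asarray(values, dtype=float)
    labels = np.asarray(has_error, dtype=bool)
    best_t, best_j = None, -1.0
    for t in np.unique(values):          # candidate cutoffs, ascending
        pred = values >= t               # flag models at/above the cutoff
        tpr = (pred & labels).sum() / labels.sum()
        fpr = (pred & ~labels).sum() / (~labels).sum()
        j = tpr - fpr
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Hypothetical data: model size (nodes) and whether the model had an error
sizes  = [10, 12, 15, 20, 25, 30, 40, 50, 60, 80]
errors = [0,  0,  0,  0,  1,  0,  1,  1,  1,  1]
t, j = roc_threshold(sizes, errors)      # cutoff that best separates the classes
```

The resulting cutoff is the kind of actionable threshold the abstract argues for: "models with more than t nodes warrant counter-action."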

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this research is to assess the daylight performance of buildings with climate-responsive envelopes of complex geometry that integrate shading devices in the façade. To this end, two case studies are chosen for their complex geometries and integrated daylighting devices. The effect of different parameters of the daylighting devices is analysed through climate-based daylight metrics.

Relevance:

30.00%

Publisher:

Abstract:

We assess the predictive ability of three VPIN metrics on the basis of two highly volatile market events in China, and examine the association between VPIN and toxicity-induced volatility through conditional probability analysis and multiple regression. We examine the dynamic relationship between VPIN and high-frequency liquidity using Vector Auto-Regression models, Granger causality tests, and impulse response analysis. Our results suggest that Bulk Volume VPIN has the best risk-warning effect among the major VPIN metrics. VPIN has a positive association with market volatility induced by toxic information flow. Most importantly, we document a positive feedback effect between VPIN and high-frequency liquidity, in which a negative liquidity shock drives up VPIN, which in turn leads to further liquidity drain. Our study provides empirical evidence of an intrinsic game between informed traders and market makers when facing toxic information in the high-frequency trading world.
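As a rough sketch of the Bulk Volume VPIN idea: each bar's volume is split into estimated buy and sell components from the standardized price change (the bulk volume classification of Easley, López de Prado and O'Hara), and the order-flow imbalance is averaged over buckets. The equal-count bucketing below is a simplification for illustration (true VPIN uses equal-volume buckets), and the paper's exact construction may differ.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2)))

def bulk_volume_vpin(price_changes, volumes, sigma, n_buckets):
    """Sketch of Bulk Volume VPIN: classify each bar's volume into buy/sell
    using the standardized price change, then average the absolute
    buy-sell imbalance over buckets."""
    imbalances = []
    for dp, v in zip(price_changes, volumes):
        buy = v * norm_cdf(dp / sigma)   # BVC buy-volume estimate
        sell = v - buy
        imbalances.append(abs(buy - sell) / v)
    # simplification: equal-count bucketing (real VPIN buckets by equal volume)
    k = len(imbalances) // n_buckets
    buckets = [sum(imbalances[i * k:(i + 1) * k]) / k for i in range(n_buckets)]
    return sum(buckets) / n_buckets
```

With no price movement the buy/sell split is 50/50 and VPIN is 0; strongly one-sided price moves push it toward 1, which is the "toxicity" signal the abstract refers to.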

Relevance:

30.00%

Publisher:

Abstract:

The occupant impact velocity (OIV) and acceleration severity index (ASI) are competing measures of crash severity used to assess occupant injury risk in full-scale crash tests involving roadside safety hardware, e.g., guardrails. Delta-V, the maximum change in vehicle velocity, is the traditional metric of crash severity for real-world crashes. This study compares the ability of the OIV, ASI, and delta-V to discriminate between serious and non-serious occupant injury in real-world frontal collisions. Vehicle kinematics data from event data recorders (EDRs) were matched with detailed occupant injury information for 180 real-world crashes. Cumulative probability of injury risk curves were generated using binary logistic regression for belted and unbelted data subsets. By comparing the available fit statistics and performing a separate ROC curve analysis, the more computationally intensive OIV and ASI were found to offer no significant predictive advantage over the simpler delta-V.
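A cumulative injury-risk curve of the kind described is a logistic function of the severity metric. The sketch below uses delta-V; the coefficients are hypothetical placeholders for illustration, not the study's fitted values.

```python
import math

# Hypothetical coefficients for illustration only; the study estimates these
# by binary logistic regression on 180 EDR-matched crashes.
B0, B1 = -5.0, 0.15   # intercept and delta-V slope (delta-V in km/h)

def injury_risk(delta_v):
    """Cumulative probability of serious occupant injury as a logistic
    function of delta-V: P = 1 / (1 + exp(-(B0 + B1 * delta_v)))."""
    return 1.0 / (1.0 + math.exp(-(B0 + B1 * delta_v)))
```

The comparison in the study then reduces to asking whether replacing delta-V with OIV or ASI in such a model improves fit statistics or ROC performance.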

Relevance:

30.00%

Publisher:

Abstract:

Geometric dependencies are investigated for an analytical representation of the probability density function (pdf) of the travel time between a random point and a known or another random point under Tchebyshev's metric. In the most common case, a rectangular service area, the pdf of this random variable depends directly on the position of the server. Two approaches are introduced for the exact analytical calculation of the pdf: an ad-hoc approach, useful for manually solving a specific case, and superposition, an algorithmic approach for the general case. The main concept of each approach is explained, and a short comparison is made to verify their consistency.
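For intuition, the distribution of the Tchebyshev (Chebyshev) travel distance can be checked by Monte Carlo simulation against the exact value in a simple configuration: a server at the centre of a unit square, where the CDF is (2d)^2 for 0 ≤ d ≤ 0.5. This is a sanity check, not either of the paper's analytical approaches.

```python
import random

def chebyshev_cdf_mc(server, d, n=100_000, seed=1):
    """Monte Carlo estimate of P(T <= d) for the Chebyshev travel distance
    from a fixed server to a uniform random point in the unit square."""
    random.seed(seed)
    sx, sy = server
    hits = sum(
        1 for _ in range(n)
        if max(abs(random.random() - sx), abs(random.random() - sy)) <= d
    )
    return hits / n

# Server at the centre: exact CDF at d = 0.3 is (2 * 0.3)**2 = 0.36
est = chebyshev_cdf_mc((0.5, 0.5), 0.3)
```

Moving the server off-centre clips the square of side 2d against the service-area boundary, which is exactly why the pdf depends on the server's position.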

Relevance:

20.00%

Publisher:

Abstract:

Australia’s civil infrastructure assets of roads, bridges, railways, buildings and other structures are worth billions of dollars. Road assets alone are valued at around A$140 billion. As the condition of assets deteriorates over time, close to A$10 billion is spent annually on maintaining Australia's roads, the equivalent of A$27 million per day. To manage road infrastructure effectively, road agencies need, first, to optimise expenditure on asset data collection without jeopardising the reliability of using the optimised data to predict maintenance and rehabilitation costs. Second, road agencies need to predict the deterioration rates of infrastructure accurately, reflecting local conditions, so that budgets can be estimated with confidence. Finally, the prediction of budgets for maintenance and rehabilitation must provide a certain degree of reliability. A procedure for assessing investment decisions for road asset management has been developed. The procedure includes:

• A methodology for optimising asset data collection;
• A methodology for calibrating deterioration prediction models;
• A methodology for assessing risk-adjusted estimates for life-cycle costs;
• A decision framework in the form of a risk map.

Relevance:

20.00%

Publisher:

Abstract:

Measuring the social and environmental metrics of property is necessary for meaningful triple bottom line (TBL) assessments. This paper demonstrates how relevant indicators derived from environmental rating systems provide for reasonably straightforward collations of performance scores that support adjustments based on a sliding scale. It also highlights the absence of a corresponding consensus on the important social metrics representing the third leg of the TBL tripod. Assessing TBL may be unavoidably imprecise, but if valuers and managers continue to ignore TBL concerns, their assessments may soon be less relevant given the emerging institutional milieu informing and reflecting business practices and societal expectations.