790 results for IPv6, Denial of Service, Coloured Petri Nets, Risk Analysis, IPv6threats
Abstract:
National Highway Traffic Safety Administration, Office of Research and Development, Washington, D.C.
Abstract:
Areas of the landscape that are priorities for conservation should be those that are both vulnerable to threatening processes and that, if lost or degraded, will result in conservation targets being compromised. While much attention is directed towards understanding the patterns of biodiversity, much less is given to determining the areas of the landscape most vulnerable to threats. We assessed the relative vulnerability of remaining areas of native forest to conversion to plantations in the ecologically significant temperate rainforest region of south central Chile. The area of the study region is 4.2 million ha and the extent of plantations is approximately 200,000 ha. First, the spatial distribution of native forest conversion to plantations was determined. The variables related to the spatial distribution of this threatening process were identified through the development of a classification tree and the generation of a multivariate, spatially explicit, statistical model. The model of native forest conversion explained 43% of the deviance and the discrimination ability of the model was high. Predictions were made of where native forest conversion is likely to occur in the future. Due to patterns of climate, topography, soils and proximity to infrastructure and towns, remaining forest areas differ in their relative risk of being converted to plantations. Another factor that may increase the vulnerability of remaining native forest in a subset of the study region is the proposed construction of a highway. We found that 90% of the area of existing plantations within this region is within 2.5 km of roads. When the predictions of native forest conversion were recalculated accounting for the construction of this highway, it was found that approximately 27,000 ha of native forest had an increased probability of conversion. The areas of native forest identified to be vulnerable to conversion are outside of the existing reserve network. (C) 2004 Elsevier Ltd. All rights reserved.
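A minimal sketch of the kind of spatially explicit classification-tree modelling described above, using scikit-learn on simulated covariates; the predictor names (slope, distance to road, distance to town) are assumptions, and the study's actual data and model are not reproduced:

```python
# Hedged illustration only: a classification tree relating simulated
# forest-conversion outcomes to hypothetical spatial covariates.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.uniform(0, 40, n),   # slope (degrees) -- assumed covariate
    rng.uniform(0, 20, n),   # distance to nearest road (km) -- assumed covariate
    rng.uniform(0, 50, n),   # distance to nearest town (km) -- assumed covariate
])
# Toy rule: conversion more likely on gentle slopes close to roads.
p = 1 / (1 + np.exp(-(2.0 - 0.08 * X[:, 0] - 0.4 * X[:, 1])))
y = rng.binomial(1, p)

tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, y)
print("AUC (discrimination):", roc_auc_score(y, tree.predict_proba(X)[:, 1]))
```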
Abstract:
Objective: To evaluate the cost of atrial fibrillation (AF) to health and social services in the UK in 1995 and, based on epidemiological trends, to project this estimate to 2000. Design, setting, and main outcome measures: Contemporary estimates of health care activity related to AF were applied to the whole population of the UK on an age and sex specific basis for the year 1995. The activities considered (and costs calculated) were hospital admissions, outpatient consultations, general practice consultations, and drug treatment (including the cost of monitoring anticoagulant treatment). By adjusting for the progressive aging of the British population and related increases in hospital admissions, the cost of AF was also projected to the year 2000. Results: There were 534 000 people with AF in the UK during 1995. The 'direct' cost of health care for these patients was £244 million (≈€350 million) or 0.62% of total National Health Service (NHS) expenditure. Hospitalisations and drug prescriptions accounted for 50% and 20% of this expenditure, respectively. Long term nursing home care after hospital admission cost an additional £46.4 million (≈€66 million). The direct cost of AF rose to £459 million (≈€655 million) in 2000, equivalent to 0.97% of total NHS expenditure based on 1995 figures. Nursing home costs rose to £111 million (≈€160 million). Conclusions: AF is an extremely costly public health problem.
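As a rough illustration of the costing approach (age- and sex-specific activity applied to population counts, then projected forward), a hedged sketch with entirely assumed prevalence, cost, and growth figures rather than the study's inputs:

```python
# Hedged illustration only: all numbers below are placeholders.
import pandas as pd

pop = pd.DataFrame({
    "age_band": ["45-64", "65-74", "75+"],
    "men":      [7.0e6, 2.2e6, 1.3e6],   # assumed population counts
    "women":    [7.1e6, 2.5e6, 2.3e6],
})
prevalence = {"45-64": 0.005, "65-74": 0.04, "75+": 0.09}   # assumed AF rates
cost_per_patient = 460.0                                    # assumed £ per year

pop["af_patients"] = (pop["men"] + pop["women"]) * pop["age_band"].map(prevalence)
cost_1995 = (pop["af_patients"] * cost_per_patient).sum()
cost_2000 = cost_1995 * 1.12   # assumed growth from ageing and rising admissions
print(f"estimated direct cost 1995: £{cost_1995/1e6:.0f}m, 2000: £{cost_2000/1e6:.0f}m")
```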
Abstract:
The aim of this study was to apply multifailure survival methods to analyze time to multiple occurrences of basal cell carcinoma (BCC). Data from 4.5 years of follow-up in a randomized controlled trial, the Nambour Skin Cancer Prevention Trial (1992-1996), to evaluate skin cancer prevention were used to assess the influence of sunscreen application on the time to first BCC and the time to subsequent BCCs. Three different approaches of time to ordered multiple events were applied and compared: the Andersen-Gill, Wei-Lin-Weissfeld, and Prentice-Williams-Peterson models. Robust variance estimation approaches were used for all multifailure survival models. Sunscreen treatment was not associated with time to first occurrence of a BCC (hazard ratio = 1.04, 95% confidence interval: 0.79, 1.45). Time to subsequent BCC tumors using the Andersen-Gill model resulted in a lower estimated hazard among the daily sunscreen application group, although statistical significance was not reached (hazard ratio = 0.82, 95% confidence interval: 0.59, 1.15). Similarly, both the Wei-Lin-Weissfeld marginal-hazards and the Prentice-Williams-Peterson gap-time models revealed trends toward a lower risk of subsequent BCC tumors among the sunscreen intervention group. These results demonstrate the importance of conducting multiple-event analysis for recurring events, as risk factors for a single event may differ from those where repeated events are considered.
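A minimal sketch of a marginal Cox model with a robust (sandwich) variance for repeated events, assuming the lifelines library and a hypothetical long-format data frame; it is not the trial's actual Andersen-Gill, Wei-Lin-Weissfeld, or Prentice-Williams-Peterson analysis:

```python
# Hedged illustration only: toy data, hypothetical column names.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "subject_id": [1, 1, 2, 2, 3, 4, 4, 5],
    "time":       [1.2, 3.0, 0.8, 4.5, 2.1, 1.5, 3.9, 4.4],  # years to each record's end
    "event":      [1,   0,   1,   0,   0,   1,   1,   0],    # 1 = BCC diagnosed
    "sunscreen":  [1,   1,   0,   0,   1,   0,   0,   1],    # daily-sunscreen arm
})

cph = CoxPHFitter()
# cluster_col requests a robust sandwich variance that accounts for the
# correlation between multiple records from the same participant.
cph.fit(df, duration_col="time", event_col="event", cluster_col="subject_id")
cph.print_summary()
```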
Abstract:
This paper describes an experiment in designing, implementing and testing a Transport layer cluster scheduling and dispatching architecture. The motivation for the experiment was the hypothesis that a Transport layer clustering solution may offer advantages over the existing industry-standard Network layer and Data Link layer approaches. The critical success factors initially established to guide and evaluate the experiment were reduced dispatcher work load, reduced dispatcher internal state memory requirements, distributed denial of service resilience, and cluster software design simplicity. The functional design stage of the experiment produced a Transport layer strategy for scheduling and load balancing based on the specification of two new TCP options. Implementation required the introduction of the newly specified TCP options into the Linux (2.4) kernel. The implementation produced an extended Linux Socket API to facilitate user-process access to the additional TCP capability. The testing stage of the experiment confirmed the operational efficiency of the solution.
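A hedged sketch of how a user process might exercise an extended socket API of this general kind; the option name and number below are purely hypothetical, since the paper's actual TCP options and kernel interface are not specified here:

```python
# Hedged illustration only: the option constant is invented for this sketch.
import socket

TCP_CLUSTER_DISPATCH = 122   # hypothetical option number, illustration only

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    # Ask a (hypothetical) kernel extension to tag this connection for
    # Transport-layer dispatching to a back-end cluster node.
    sock.setsockopt(socket.IPPROTO_TCP, TCP_CLUSTER_DISPATCH, 1)
except OSError:
    # On a stock kernel the unknown option is simply rejected.
    print("kernel does not support the experimental TCP option")
finally:
    sock.close()
```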
Abstract:
An inherent weakness in the management of large scale projects is the failure to achieve the scheduled completion date. When projects are planned with the objective of on-time completion, the initial planning plays a vital role in the successful achievement of project deadlines. Cost and quality are additional priorities when such projects are being executed. This article proposes a methodology for achieving the planned duration of a project through risk analysis with the application of a Monte Carlo simulation technique. The methodology is demonstrated using a case application of a cross-country petroleum pipeline construction project.
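A minimal sketch of Monte Carlo risk analysis of project duration, assuming a simple serial chain of activities with triangular three-point estimates rather than the paper's actual pipeline case data:

```python
# Hedged illustration only: activity names and durations are invented.
import numpy as np

rng = np.random.default_rng(42)
activities = [                # (optimistic, most likely, pessimistic) in days
    (20, 30, 55),             # right-of-way clearance
    (40, 60, 95),             # pipe laying
    (10, 15, 30),             # testing and commissioning
]

n_runs = 10_000
total = np.zeros(n_runs)
for lo, mode, hi in activities:
    total += rng.triangular(lo, mode, hi, size=n_runs)

print("mean duration:", round(total.mean(), 1))
print("P90 duration :", round(np.percentile(total, 90), 1))
print("P(finish within 120 days):", (total <= 120).mean())
```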
Abstract:
This paper examines the source country determinants of FDI into Japan. The paper highlights certain methodological and theoretical weaknesses in the previous literature and offers some explanations for hitherto ambiguous results. Specifically, the paper highlights the importance of panel data analysis, and the identification of fixed effects in the analysis rather than simply pooling the data. Indeed, we argue that many of the results reported elsewhere are a feature of this mis-specification. To this end, pooled, fixed effects and random effects estimates are compared. The results suggest that FDI into Japan is inversely related to trade flows, such that trade and FDI are substitutes. Moreover, the results also suggest that FDI increases with home country political and economic stability. The paper also shows that previously reported results, regarding the importance of exchange rates, relative borrowing costs and labour costs in explaining FDI flows, are sensitive to the econometric specification and estimation approach. The paper also discusses the importance of these results within a policy context. In recent years Japan has sought to attract FDI, though many firms still complain of barriers to inward investment penetration in Japan. The results show that cultural and geographic distance are only of marginal importance in explaining FDI, and that the results are consistent with the market-seeking explanation of FDI. As such, the attitude to risk in the source country is strongly related to the size of FDI flows to Japan. © 2007 The Authors Journal compilation © 2007 Blackwell Publishing Ltd.
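A minimal sketch of the pooled-versus-fixed-effects comparison with statsmodels on simulated panel data; the variable names (fdi, trade, stability) and the toy specification are assumptions, not the paper's model:

```python
# Hedged illustration only: simulated source-country panel.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
countries = ["US", "UK", "DE", "FR", "KR"]
years = range(1990, 2001)
df = pd.DataFrame([(c, y) for c in countries for y in years], columns=["country", "year"])
df["trade"] = rng.normal(10, 2, len(df))
df["stability"] = rng.normal(0, 1, len(df))
df["fdi"] = 5 - 0.3 * df["trade"] + 0.5 * df["stability"] + rng.normal(0, 1, len(df))

pooled = smf.ols("fdi ~ trade + stability", data=df).fit()
# Fixed effects via source-country dummies, contrasted with the pooled model.
fixed = smf.ols("fdi ~ trade + stability + C(country)", data=df).fit()
print(pooled.params[["trade", "stability"]])
print(fixed.params[["trade", "stability"]])
```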
Abstract:
This chapter provides the theoretical foundation and background on the data envelopment analysis (DEA) method. We first introduce the basic DEA models. The balance of this chapter focuses on evidence showing that DEA has been extensively applied for measuring efficiency and productivity of services, including financial services (banking, insurance, securities, and fund management), professional services, health services, education services, environmental and public services, energy services, logistics, tourism, information technology, telecommunications, transport, distribution, audio-visual, media, entertainment, cultural and other business services. Finally, we provide information on the use of the Performance Improvement Management Software (PIM-DEA). A free limited version of this software and the downloading procedure are also included in this chapter.
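A minimal sketch of a basic input-oriented, constant-returns (CCR) DEA model solved as a linear programme with SciPy on toy data; this illustrates the envelopment formulation only and is unrelated to the PIM-DEA software:

```python
# Hedged illustration only: toy decision-making units (DMUs).
import numpy as np
from scipy.optimize import linprog

X = np.array([[20.0, 300], [30.0, 200], [40.0, 100], [20.0, 200]])  # inputs per DMU
Y = np.array([[100.0], [80.0], [60.0], [90.0]])                     # outputs per DMU

def ccr_efficiency(o, X, Y):
    """Input-oriented CCR efficiency of DMU o: minimise theta s.t.
    sum_j lambda_j * x_ij <= theta * x_io and sum_j lambda_j * y_rj >= y_ro."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                       # minimise theta
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])      # input constraints
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])       # output constraints
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[o]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

for o in range(len(X)):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o, X, Y):.3f}")
```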
Abstract:
Assessing factors that predict new product success (NPS) holds critical importance for companies, as research shows that despite considerable new product investment, success rates are generally below 25%. Over the decades, meta-analytical attempts have been made to summarize empirical findings on NPS factors. However, market environment changes such as increased global competition, as well as methodological advancements in meta-analytical research, present a timely opportunity to augment their results. Hence, a key objective of this research is to provide an updated and extended meta-analytic investigation of the factors affecting NPS. Using Henard and Szymanski's meta-analysis as the most comprehensive recent summary of empirical findings, this study updates their findings by analyzing articles published from 1999 through 2011, the period following the original meta-analysis. Based on 233 empirical studies (from 204 manuscripts) on NPS, with a total of 2618 effect sizes, this study also takes advantage of more recent methodological developments by re-calculating the effects of the meta-analysis employing a random effects model. The study's scope broadens by including overlooked but important additional variables, notably “country culture,” and discusses substantive differences between the updated meta-analysis and its predecessor. Results reveal generally weaker effect sizes than those reported by Henard and Szymanski in 2001, and provide evolutionary evidence of decreased effects of common success factors over time. Moreover, culture emerges as an important moderating factor, weakening effect sizes for individualistic countries and strengthening effects for risk-averse countries, highlighting the importance of further investigating culture's role in product innovation studies, and of tracking changes of success factors of product innovations. Finally, a sharp increase since 1999 in studies investigating product and process characteristics identifies a significant shift in research interest in new product development success factors. The finding that the importance of success factors generally declines over time calls for new theoretical approaches to better capture the nature of new product development (NPD) success factors. One might speculate that the potential to create competitive advantages through an understanding of NPD success factors is reduced as knowledge of these factors becomes more widespread among managers. Results also imply that managers attempting to improve success rates of NPDs need to consider national culture as this factor exhibits a strong moderating effect: Working in varied cultural contexts will result in differing antecedents of successful new product ventures.
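A minimal sketch of a DerSimonian-Laird random-effects pooling step of the kind referred to above, on illustrative effect sizes and variances (not the study's 2618 effects):

```python
# Hedged illustration only: five invented effect sizes.
import numpy as np

effects = np.array([0.30, 0.22, 0.15, 0.41, 0.05])
variances = np.array([0.010, 0.020, 0.015, 0.030, 0.012])

w = 1 / variances                                  # fixed-effect weights
fixed = np.sum(w * effects) / np.sum(w)
Q = np.sum(w * (effects - fixed) ** 2)             # heterogeneity statistic
df = len(effects) - 1
C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - df) / C)                      # between-study variance

w_star = 1 / (variances + tau2)                    # random-effects weights
pooled = np.sum(w_star * effects) / np.sum(w_star)
se = np.sqrt(1 / np.sum(w_star))
print(f"pooled effect = {pooled:.3f} "
      f"(95% CI {pooled - 1.96*se:.3f}, {pooled + 1.96*se:.3f}), tau^2 = {tau2:.4f}")
```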
Abstract:
With the rapid growth of the Internet, computer attacks are increasing at a fast pace and can easily cause millions of dollars in damage to an organization. Detecting these attacks is an important issue of computer security. There are many types of attacks, and they fall into four main categories: Denial of Service (DoS) attacks, Probe attacks, User to Root (U2R) attacks, and Remote to Local (R2L) attacks. Within these categories, DoS and Probe attacks continuously show up with greater frequency in a short period of time when they attack systems. They are different from the normal traffic data and can be easily separated from normal activities. On the contrary, U2R and R2L attacks are embedded in the data portions of the packets and normally involve only a single connection. It becomes difficult to achieve satisfactory detection accuracy for these two attacks. Therefore, we focus on studying the ambiguity problem between normal activities and U2R/R2L attacks. The goal is to build a detection system that can accurately and quickly detect these two attacks. In this dissertation, we design a two-phase intrusion detection approach. In the first phase, a correlation-based feature selection algorithm is proposed to advance the speed of detection. Features with poor prediction ability for the signatures of attacks and features inter-correlated with one or more other features are considered redundant. Such features are removed and only indispensable information about the original feature space remains. In the second phase, we develop an ensemble intrusion detection system to achieve accurate detection performance. The proposed method includes multiple feature selecting intrusion detectors and a data mining intrusion detector. The former consists of a set of detectors, each of which uses a fuzzy clustering technique and belief theory to solve the ambiguity problem. The latter applies a data mining technique to automatically extract computer users’ normal behavior from training network traffic data. The final decision is a combination of the outputs of the feature selecting and data mining detectors. The experimental results indicate that our ensemble approach not only significantly reduces the detection time but also effectively detects U2R and R2L attacks that contain degrees of ambiguous information.
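A minimal sketch of the first phase only, dropping features that are highly inter-correlated with others, using pandas; the dissertation's actual algorithm, fuzzy clustering, and belief-theory ensemble are not reproduced here:

```python
# Hedged illustration only: simulated traffic features, one made nearly redundant.
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
df = pd.DataFrame({
    "duration":          rng.exponential(1.0, 500),
    "src_bytes":         rng.exponential(500, 500),
    "num_failed_logins": rng.poisson(0.2, 500),
})
df["dst_bytes"] = df["src_bytes"] * 0.9 + rng.normal(0, 10, 500)  # nearly redundant

def drop_correlated(features: pd.DataFrame, threshold: float = 0.9) -> pd.DataFrame:
    """Remove any feature whose absolute correlation with an earlier feature
    exceeds the threshold."""
    corr = features.corr().abs()
    upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
    to_drop = [col for col in upper.columns if (upper[col] > threshold).any()]
    return features.drop(columns=to_drop)

print("kept features:", list(drop_correlated(df).columns))
```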
Abstract:
In this study we have identified key genes that are critical in the development of astrocytic tumors. Meta-analysis of microarray studies which compared normal tissue to astrocytoma revealed a set of 646 differentially expressed genes in the majority of astrocytoma. Reverse engineering of these 646 genes using Bayesian network analysis produced a gene network for each grade of astrocytoma (Grade I–IV), and ‘key genes’ within each grade were identified. Genes found to be most influential to development of the highest grade of astrocytoma, Glioblastoma multiforme, were: COL4A1, EGFR, BTF3, MPP2, RAB31, CDK4, CD99, ANXA2, TOP2A, and SERBP1. All of these genes were up-regulated, except MPP2 (down-regulated). These 10 genes were able to predict tumor status with 96–100% confidence when using logistic regression, cross validation, and support vector machine analysis. The Markov Blanket genes interact with NF-κB, ERK, MAPK, VEGF, growth hormone and collagen to produce a network whose top biological functions are cancer, neurological disease, and cellular movement. Three of the 10 genes - EGFR, COL4A1, and CDK4, in particular, seemed to be potential ‘hubs of activity’. Modified expression of these 10 Markov Blanket genes increases lifetime risk of developing glioblastoma compared to the normal population. The glioblastoma risk estimates were dramatically increased with joint effects of 4 or more Markov Blanket genes. Joint interaction effects of 4, 5, 6, 7, 8, 9 or 10 Markov Blanket genes produced 9, 13, 20.9, 26.7, 52.8, 53.2, 78.1 or 85.9% increases, respectively, in lifetime risk of developing glioblastoma compared to the normal population. In summary, it appears that modified expression of several ‘key genes’ may be required for the development of glioblastoma. Further studies are needed to validate these ‘key genes’ as useful tools for early detection and novel therapeutic options for these tumors.
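A minimal sketch of the classification step (logistic regression with cross-validation) on simulated expression values for the 10 named genes; the Bayesian-network and Markov-blanket construction itself is not shown, and the data are invented:

```python
# Hedged illustration only: simulated expression data, not the study's microarrays.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

genes = ["COL4A1", "EGFR", "BTF3", "MPP2", "RAB31",
         "CDK4", "CD99", "ANXA2", "TOP2A", "SERBP1"]

rng = np.random.default_rng(3)
n = 200
y = rng.integers(0, 2, n)                      # 0 = normal tissue, 1 = tumor
X = rng.normal(0, 1, (n, len(genes)))
X[y == 1] += 0.8                               # simulated up-regulation in tumors
X[y == 1, genes.index("MPP2")] -= 1.6          # MPP2 simulated as down-regulated

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy:", scores.mean().round(3))
```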
Abstract:
This paper proposes a technique to defeat Denial of Service (DoS) and Distributed Denial of Service (DDoS) attacks in Ad Hoc Networks. The technique is divided into two main parts, combining game theory and cryptographic puzzles. First, a new client puzzle is introduced to prevent DoS attacks in such networks. The second part presents a multiplayer game that takes place between the nodes of an ad hoc network and is based on fundamental principles of game theory. By combining computational problems with puzzles, the efficiency and latency of the communicating nodes are improved, as is resistance to DoS and DDoS attacks. Experimental results show the effectiveness of the approach for devices with limited resources and for environments like ad hoc networks where nodes must exchange information quickly.
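A minimal sketch of a hash-based client puzzle of the general kind used against DoS (the server issues a nonce and difficulty; the client searches for a value whose SHA-256 digest falls below a target); this is illustrative only, not the paper's specific puzzle or game:

```python
# Hedged illustration only: a generic proof-of-work style client puzzle.
import hashlib
import os

def issue_puzzle(difficulty_bits: int = 18) -> tuple[bytes, int]:
    """Server side: hand the client a random nonce and a difficulty level."""
    return os.urandom(16), difficulty_bits

def solve_puzzle(nonce: bytes, difficulty_bits: int) -> int:
    """Client side: brute-force a counter whose hash has enough leading zero bits."""
    target = 1 << (256 - difficulty_bits)
    counter = 0
    while True:
        digest = hashlib.sha256(nonce + counter.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return counter
        counter += 1

def verify(nonce: bytes, difficulty_bits: int, solution: int) -> bool:
    """Server side: one hash suffices to check the client's work."""
    digest = hashlib.sha256(nonce + solution.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

nonce, bits = issue_puzzle()
sol = solve_puzzle(nonce, bits)
print("verified:", verify(nonce, bits, sol))
```

The asymmetry is the point: solving costs the requesting node many hash evaluations, while verification costs the defender only one, which is what makes the puzzle useful for rate-limiting resource-constrained ad hoc nodes under attack.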
Abstract:
This chapter sets out a comprehensive analysis of the regulation of money market funds in the EU and US. The theoretical framework has unique cases and examples and includes checklists to assist with the practice of fund management and legal risk analysis.