Abstract:
This thesis investigates and develops techniques for accurately detecting Internet-based Distributed Denial-of-Service (DDoS) attacks, in which an adversary harnesses the power of thousands of compromised machines to disrupt the normal operations of a Web-service provider, resulting in significant down-time and financial losses. It also develops methods to differentiate these attacks from similar-looking benign surges in web traffic known as Flash Events (FEs). Finally, the thesis addresses an intrinsic challenge in DDoS research, namely the extreme scarcity of public-domain datasets (due to legal and privacy issues), by developing techniques to realistically emulate DDoS attack and FE traffic.
Abstract:
Transit passenger market segmentation enables transit operators to target different classes of transit users to provide customized information and services. Smart Card (SC) data, from Automated Fare Collection systems, facilitate the understanding of the multiday travel regularity of transit passengers and can be used to segment them into identifiable classes of similar behaviors and needs. However, the use of SC data for market segmentation has attracted very limited attention in the literature. This paper proposes a novel methodology for mining spatial and temporal travel regularity from each individual passenger’s historical SC transactions and grouping passengers into four segments of transit users. After reconstructing the travel itineraries from historical SC transactions, the paper adopts the Density-Based Spatial Clustering of Applications with Noise (DBSCAN) algorithm to mine the travel regularity of each SC user. The travel regularity is then used to segment SC users through an a priori market segmentation approach. The proposed methodology helps transit operators understand their passengers and provide them with tailored information and services.
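The abstract above names DBSCAN as its clustering step but gives no implementation detail. As a loose, generic sketch (not the paper's implementation; the stop coordinates, `eps` and `min_pts` values below are purely hypothetical), density-based clustering of boarding locations can be expressed as:

```python
from math import hypot

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: label each point with a cluster id, or -1 for noise."""
    labels = [None] * len(points)

    def neighbours(i):
        # All points within eps of point i (including i itself).
        return [j for j, q in enumerate(points)
                if hypot(points[i][0] - q[0], points[i][1] - q[1]) <= eps]

    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seed = neighbours(i)
        if len(seed) < min_pts:
            labels[i] = -1          # noise (may later be claimed as a border point)
            continue
        cluster += 1                # i is a core point: start a new cluster
        labels[i] = cluster
        queue = [j for j in seed if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster  # former noise becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nbrs = neighbours(j)
            if len(nbrs) >= min_pts:
                queue.extend(nbrs)   # j is also a core point: keep expanding
    return labels

# Hypothetical boarding stops (x, y): two tight groups plus one outlier.
stops = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10), (50, 50)]
labels = dbscan(stops, eps=2.0, min_pts=2)
```

In a travel-regularity setting, the noise label (-1) would correspond to irregular, one-off trips, while each cluster would mark a repeatedly visited location.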
Abstract:
Flash flood disasters happen suddenly. The Toowoomba Lockyer Valley flash flood in January 2011 was not forecast by the Bureau of Meteorology until after it had occurred. Domestic and wild animals gave the first warning of the disaster in the days leading up to the event, and large animals gave warnings on the morning of the disaster. Twenty-three people in the disaster zone, including five children, died. More than 500 people were listed as missing. Some of those who died perished because they stayed in the disaster zone to look after their animals while other members of their family escaped to safety. Some people who were in danger refused to be rescued because they could not take their pets with them. During a year spent recording accounts of the survivors of the disaster, animals were often mentioned by survivors. Despite the obvious perils, people risked their lives to save their animals; people saw animals try to save each other; animals rescued people; people rescued animals; animals survived where people died; animals were used to find human victims in the weeks after the disaster; and animals died. The stories of the flood present challenges for pet owners, farmers, counter-disaster planners, weather forecasters and emergency responders in preparing for disasters, responding to them and recovering after them.
Abstract:
To authenticate card holders conducting electronic transactions on mobile devices, VISA and MasterCard independently proposed two electronic payment protocols: Visa 3-D Secure and MasterCard SecureCode. The protocols use pre-registered passwords to provide card holder authentication, with Secure Sockets Layer/Transport Layer Security (SSL/TLS) for data confidentiality over wired networks and Wireless Transport Layer Security (WTLS) between a wireless device and a Wireless Application Protocol (WAP) gateway. The paper presents our analysis of the security properties of the proposed protocols using the formal-method tools Casper and FDR2. We also highlight issues concerning payment security in the proposed protocols.
Abstract:
"Eight people are dead and there are grave fears the toll may rise with at least 70 missing after flash floods swept through southeastern Queensland."
Abstract:
This report presents the final deliverable from the project titled ‘Conceptual and statistical framework for a water quality component of an integrated report card’, funded by the Marine and Tropical Sciences Research Facility (MTSRF; Project 3.7.7). The key management driver of this, and of a number of other MTSRF projects concerned with indicator development, is the requirement for state and federal government authorities and other stakeholders to provide robust assessments of the present ‘state’ or ‘health’ of regional ecosystems in the Great Barrier Reef (GBR) catchments and adjacent marine waters. An integrated report card format that encompasses both biophysical and socioeconomic factors is an appropriate framework through which to deliver these assessments and meet a variety of reporting requirements. It is now well recognised that a ‘report card’ format for environmental reporting is very effective for community and stakeholder communication and engagement, and can be a key driver in galvanising community and political commitment and action. Although a report card needs to be understandable by all levels of the community, it also needs to be underpinned by sound, quality-assured science. In this regard, this project set out to develop approaches to the statistical issues that arise from the amalgamation or integration of sets of discrete indicators into a final score or assessment of the state of the system. In brief, the two main issues are (1) selecting, measuring and interpreting specific indicators that vary in both space and time, and (2) integrating a range of indicators in such a way as to provide a succinct but robust overview of the state of the system. Although there is considerable research on, and knowledge of, the use of indicators to inform the management of ecological, social and economic systems, methods for how best to integrate multiple disparate indicators remain poorly developed.
Therefore, the objectives of this project were to (i) focus on statistical approaches aimed at ensuring that estimates of individual indicators are as robust as possible, and (ii) present methods that can be used to report on the overall state of the system by integrating estimates of individual indicators. It was agreed at the outset that this project would focus on developing methods for a water quality report card. This was driven largely by the requirements of the Reef Water Quality Protection Plan (RWQPP) and led to strong partner engagement with the Reef Water Quality Partnership.
Abstract:
Transit passenger market segmentation enables transit operators to target different classes of transit users for targeted surveys and various operational and strategic planning improvements. However, the existing market segmentation studies in the literature have generally been done using passenger surveys, which have various limitations. Smart card (SC) data from an automated fare collection system facilitate the understanding of the multiday travel pattern of transit passengers and can be used to segment them into identifiable types of similar behaviors and needs. This paper proposes a comprehensive methodology for passenger segmentation solely using SC data. After reconstructing the travel itineraries from SC transactions, this paper adopts the density-based spatial clustering of applications with noise (DBSCAN) algorithm to mine the travel pattern of each SC user. An a priori market segmentation approach then segments transit passengers into four identifiable types. The proposed methodology helps transit operators understand their passengers and provide them with tailored information and services.
Abstract:
Defectivity has historically been identified as a leading technical roadblock to the implementation of nanoimprint lithography for semiconductor high-volume manufacturing. The lack of confidence in nanoimprint's ability to meet defect requirements originates in part from the industry's past experience with 1× lithography and the shortage of end-user-generated defect data. SEMATECH has therefore initiated a defect assessment aimed at addressing these concerns. The goal is to determine whether nanoimprint, specifically Jet and Flash Imprint Lithography (J-FIL) from Molecular Imprints, is capable of meeting semiconductor industry defect requirements. At this time, several cycles of learning have been completed in SEMATECH's defect assessment, with promising results. J-FIL process random defectivity of <0.1 def/cm² has been demonstrated using a 120 nm half-pitch template, providing proof of concept that a low-defect nanoimprint process is possible. Template defectivity has also improved significantly, as shown by a pre-production-grade template at 80 nm pitch. Cycles of learning continue on feature sizes down to 22 nm. © 2011 SPIE.
Abstract:
Tunable charge-trapping behaviors, including unipolar trapping of one type of charge carrier and ambipolar trapping of both electrons and holes in a complementary manner, are highly desirable for low-power multibit flash memory design. Here, we adopt a strategy of tuning the Fermi level of reduced graphene oxide (rGO) through self-assembled monolayer (SAM) functionalization, forming p-type and n-type doped rGO with a wide range of work-function manipulation. The functionalized rGO can act as the charge-trapping layer in ambipolar flash memories, and a dramatic transition of charging behavior from unipolar trapping of electrons to ambipolar trapping, and eventually to unipolar trapping of holes, was achieved. Adjustable hole/electron injection barriers induce a controllable Vth shift in the memory transistor after the programming operation. Finally, we transfer the ambipolar memory onto flexible substrates and study its charge-trapping properties over various bending cycles. SAM-functionalized rGO can be a promising candidate for next-generation nonvolatile memories.
Abstract:
Smart card Automated Fare Collection (AFC) data have been extensively exploited to understand passenger behavior, passenger segmentation and trip purpose, and to improve transit planning through spatial travel pattern analysis. The literature has been evolving from simple to more sophisticated methods, such as from aggregated to individual travel pattern analysis, and from stop-to-stop to flexible stop aggregation. However, high computational complexity has limited the practical application of these methods. This paper proposes a new algorithm, the Weighted Stop Density Based Scanning Algorithm with Noise (WS-DBSCAN), based on the classical Density-Based Spatial Clustering of Applications with Noise (DBSCAN) algorithm, to detect and update daily changes in travel patterns. WS-DBSCAN converts the classical, quadratic-complexity DBSCAN computation into a problem of sub-quadratic complexity. A numerical experiment using real AFC data from South East Queensland, Australia, shows that the algorithm requires only 0.45% of the computation time of classical DBSCAN while producing the same clustering results.
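The abstract does not describe WS-DBSCAN's internals. One plausible reading of the 'weighted stop' idea (an assumption on our part, not the paper's algorithm) is that repeated visits to the same stop are collapsed into a single weighted point before clustering, shrinking the DBSCAN input from all transactions to the much smaller set of unique stops:

```python
from collections import Counter

def weighted_stops(daily_boardings):
    """Collapse a multiday list of boarding stop ids into (stop, weight) pairs.

    Illustrative only: the actual WS-DBSCAN weighting scheme is not specified
    in the abstract. The point is that clustering the unique weighted stops,
    rather than every raw transaction, reduces the DBSCAN input size.
    """
    counts = Counter()
    for day in daily_boardings:
        counts.update(day)          # accumulate visit counts per stop id
    return sorted(counts.items())

# Hypothetical three days of boardings at stops A, B, C.
days = [["A", "A", "B"], ["A", "C"], ["A", "B"]]
weights = weighted_stops(days)      # [('A', 4), ('B', 2), ('C', 1)]
```

When a new day of transactions arrives, only the counters of the affected stops change, which is consistent with the abstract's claim of cheap daily updates.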
Abstract:
We review studies of Nelson's (1976) Modified Card Sorting Test (MCST) that have examined the performance of subjects with frontal lobe dysfunction. Six studies investigated the performance of normal controls and patients with frontal lobe dysfunction, whereas four studies compared the performance of frontal and nonfrontal patients. One further study compared the performance of amnesic patients both on the MCST and on the original Wisconsin Card Sorting Test (WCST). Evidence regarding the MCST's differential sensitivity to frontal lobe dysfunction is weak, as is the evidence regarding the equivalence of the MCST and WCST. It is likely that the MCST is an altogether different test from the standard version. In the absence of proper normative data for the MCST, we provide a table of scores derived from the control groups of various studies. Given the paucity of evidence, further research is required before the MCST can be recommended for use as a marker of frontal lobe dysfunction.
Abstract:
Objectives. To investigate the test-retest stability of a standardized version of Nelson's (1976) Modified Card Sorting Test (MCST) and its relationships with demographic variables in a sample of healthy older adults. Design. A standard card order and administration were devised for the MCST and administered to participants at an initial assessment, and again at a second session conducted a minimum of six months later in order to examine its test-retest stability. Participants were also administered the WAIS-R at initial assessment in order to provide a measure of psychometric intelligence. Methods. Thirty-six (24 female, 12 male) healthy older adults aged 52 to 77 years with mean education 12.42 years (SD = 3.53) completed the MCST on two occasions approximately 7.5 months (SD = 1.61) apart. Stability coefficients and test-retest differences were calculated for the range of scores. The effect of gender on MCST performance was examined. Correlations between MCST scores and age, education and WAIS-R IQs were also determined. Results. Stability coefficients ranged from .26 for the percent perseverative errors measure to .49 for the failure to maintain set measure. Several measures were significantly correlated with age, education and WAIS-R IQs, although no effect of gender on MCST performance was found. Conclusions. None of the stability coefficients reached the level required for clinical decision making. The results indicate that participants' age, education, and intelligence need to be considered when interpreting MCST performance. Normative studies of MCST performance as well as further studies with patients with executive dysfunction are needed.
Abstract:
An intrinsic challenge associated with evaluating proposed techniques for detecting Distributed Denial-of-Service (DDoS) attacks and distinguishing them from Flash Events (FEs) is the extreme scarcity of publicly available real-world traffic traces. Those available are either heavily anonymised or too old to accurately reflect the current trends in DDoS attacks and FEs. This paper proposes a traffic generation and testbed framework for synthetically generating different types of realistic DDoS attacks, FEs and other benign traffic traces, and for monitoring their effects on the target. Using only modest hardware resources, the proposed framework, built around a customised software traffic generator called ‘Botloader’, is capable of generating a configurable mix of two-way traffic for emulating large-scale DDoS attacks, FEs or benign traffic traces that are experimentally reproducible. Botloader uses IP aliasing, a well-known technique available on most computing platforms, to create thousands of interactive UDP/TCP endpoints on a single computer, each bound to a unique IP address, to emulate large numbers of simultaneous attackers or benign clients.
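As a small illustration of the IP-aliasing idea (a hedged sketch, not Botloader's actual code; the helper name and address counts are invented), on Linux the entire 127.0.0.0/8 block is routed to the loopback interface, so several endpoints with distinct source IPs can be created without any `ip addr add` configuration:

```python
import socket

def make_endpoints(base="127.0.0.", first=1, count=3, port=0):
    """Bind one UDP socket per loopback address to emulate distinct clients.

    Hypothetical helper: Botloader configures real IP aliases and scales to
    thousands of endpoints; this demo relies on Linux routing all of
    127.0.0.0/8 to the loopback interface.
    """
    endpoints = []
    for i in range(first, first + count):
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        s.bind((f"{base}{i}", port))   # port 0 -> ephemeral port per socket
        endpoints.append(s)
    return endpoints

clients = make_endpoints(count=3)
addrs = [s.getsockname()[0] for s in clients]   # one source IP per socket
for s in clients:
    s.close()
```

Each socket carries its own source address, so traffic from the set looks to the target like traffic from many distinct hosts.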