825 results for Event Procedure


Relevance: 20.00%

Abstract:

A general procedure to determine the principal domain (i.e., the nonredundant region of computation) of any higher-order spectrum is presented, using the bispectrum as an example. The procedure is then applied to derive the principal domain of the trispectrum of a real-valued, stationary time series. These results are easily extended to compute the principal domains of other higher-order spectra.
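
For reference, the redundancy that a principal domain removes follows from the basic symmetries of the bispectrum of a real-valued series. The relations below, and the commonly quoted triangular region for a band-limited signal sampled at rate f_s, are given only as context; the paper derives the principal domains rigorously, and discrete-time sampling introduces further considerations not captured here.

```latex
% Bispectrum of a real-valued, stationary series x(t), with X(f) its Fourier transform.
\[
  B(f_1, f_2) \;=\; \mathrm{E}\!\left[\, X(f_1)\, X(f_2)\, X^{*}(f_1 + f_2) \right]
\]
% Two basic symmetries that make most of the bifrequency plane redundant:
\[
  B(f_1, f_2) = B(f_2, f_1), \qquad B(-f_1, -f_2) = B^{*}(f_1, f_2)
\]
% Commonly quoted principal domain for a band-limited real signal:
\[
  \bigl\{ (f_1, f_2) : 0 \le f_2 \le f_1,\;\; f_1 + f_2 \le f_s/2 \bigr\}
\]
```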

Relevance: 20.00%

Abstract:

Distributed Denial-of-Service (DDoS) attacks continue to be one of the most pernicious threats to the delivery of services over the Internet. Not only are DDoS attacks present in many guises, but they are also continuously evolving as new vulnerabilities are exploited. Hence, accurate detection of these attacks remains a challenging problem and a necessity for ensuring high-end network security. An intrinsic challenge in addressing this problem is to effectively distinguish these Denial-of-Service attacks from similar-looking Flash Events (FEs) created by legitimate clients. A considerable overlap between the general characteristics of FEs and DDoS attacks makes it difficult to precisely separate these two classes of Internet activity. In this paper we propose parameters which can be used to explicitly distinguish FEs from DDoS attacks and analyse two real-world, publicly available datasets to validate our proposal. Our analysis shows that even though FEs appear very similar to DDoS attacks, there are several subtle dissimilarities which can be exploited to separate these two classes of events.
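
As a rough illustration of the kind of per-event traffic parameters that can separate the two classes, the sketch below computes two coarse features. The features (fraction of previously unseen source IPs and mean requests per source) and all names are assumptions chosen for demonstration, not the parameters proposed in the paper.

```python
# Illustrative only: the paper's actual distinguishing parameters are not listed in
# this abstract, so the two features below are assumptions made for demonstration.
from collections import Counter

def traffic_features(requests, known_clients):
    """requests: iterable of (timestamp, source_ip) pairs for one event window;
    known_clients: set of source IPs seen before the event started."""
    sources = [ip for _, ip in requests]
    counts = Counter(sources)
    # Flash events tend to be dominated by legitimate, previously seen clients,
    # whereas DDoS bots are often sources never observed before the event.
    new_source_fraction = sum(1 for ip in counts if ip not in known_clients) / max(len(counts), 1)
    # Bots frequently issue uniform, elevated request rates per source.
    mean_requests_per_source = len(sources) / max(len(counts), 1)
    return new_source_fraction, mean_requests_per_source

# Toy usage
reqs = [(0.1, "10.0.0.1"), (0.2, "10.0.0.2"), (0.3, "10.0.0.1")]
print(traffic_features(reqs, known_clients={"10.0.0.1"}))
```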

Relevance: 20.00%

Abstract:

Unusual event detection in crowded scenes remains challenging because of the diversity of events and noise. In this paper, we present a novel approach for unusual event detection via sparse reconstruction of dynamic textures over an overcomplete basis set, with the dynamic texture described by local binary patterns from three orthogonal planes (LBP-TOP). The overcomplete basis set is learnt from training data in which only normal items are observed. In the detection process, given a new observation, we compute the sparse coefficients using the Dantzig Selector algorithm proposed in the compressed sensing literature. The reconstruction errors are then computed, and abnormal items are detected on that basis. Our approach can be used to detect both local and global abnormal events. We evaluate our algorithm on the UCSD Abnormality Datasets for local anomaly detection, where it is shown to outperform current state-of-the-art approaches, and we also obtain promising results for rapid escape detection using the PETS2009 dataset.
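
A minimal sketch of the reconstruction-error idea is given below, assuming LBP-TOP feature vectors have already been extracted and substituting scikit-learn's lasso_lars sparse coder for the Dantzig Selector used in the paper.

```python
# Sketch of reconstruction-error anomaly scoring over an overcomplete basis.
# Assumptions: LBP-TOP features are already extracted (X_train from normal footage
# only, X_test from new footage); the sparse coder differs from the paper's.
import numpy as np
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(0)
X_train = rng.normal(size=(150, 32))        # stand-ins for normal LBP-TOP features
X_test = rng.normal(size=(10, 32))          # stand-ins for new observations

dico = DictionaryLearning(n_components=64,  # overcomplete: more atoms than dimensions
                          transform_algorithm="lasso_lars",
                          transform_alpha=0.1,
                          random_state=0)
dico.fit(X_train)

codes = dico.transform(X_test)              # sparse coefficients for each observation
recon = codes @ dico.components_            # reconstruction from the learnt basis
errors = np.linalg.norm(X_test - recon, axis=1)
abnormal = errors > np.percentile(errors, 95)   # threshold choice is application-specific
print(errors.round(2), abnormal)
```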

Relevance: 20.00%

Abstract:

The stochastic simulation algorithm was introduced by Gillespie and, in a different form, by Kurtz. There have been many attempts at accelerating the algorithm without deviating from the behavior of the simulated system. The crux of the explicit τ-leaping procedure is the use of Poisson random variables to approximate the number of occurrences of each type of reaction event during a carefully selected time period, τ. This method is acceptable provided the leap condition, that no propensity function changes “significantly” during any time-step, is met. With this method there is a possibility that species numbers can artificially become negative. Several recent papers have demonstrated methods that avoid this situation. One such method classifies as critical those reactions in danger of sending species populations negative. At most one of these critical reactions is allowed to occur in the next time-step. We argue that the criticality of a reactant species and its dependent reaction channels should be related to the probability of the species number becoming negative. This way, only reactions that, if fired, produce a high probability of driving a reactant population negative are labeled critical. The number of firings of more reaction channels can then be approximated using Poisson random variables, speeding up the simulation while maintaining accuracy. In implementing this revised method of criticality selection we make use of the probability distribution from which the random variable describing the change in species number is drawn. We give several numerical examples to demonstrate the effectiveness of our new method.
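
The sketch below shows a bare-bones explicit τ-leap step with Poisson draws and a crude copy-number criticality check; the two-reaction system and the fallback rule are illustrative assumptions, not the probability-based criticality criterion proposed here.

```python
# Bare-bones explicit tau-leap step, showing where the Poisson approximation can
# push a population negative. The toy system and the simple "critical" check are
# assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(1)

# Toy system on species (S1, S2):  R1: S1 -> 0,   R2: S1 + S2 -> S2
stoich = np.array([[-1, 0],    # state change caused by one firing of R1
                   [-1, 0]])   # state change caused by one firing of R2
rates = np.array([0.3, 0.002])

def propensities(x):
    return np.array([rates[0] * x[0], rates[1] * x[0] * x[1]])

def tau_leap_step(x, tau, n_crit=10):
    a = propensities(x)
    if a.sum() == 0:
        return x                            # nothing can fire
    if x[0] < n_crit:                       # crude criticality: few copies left
        j = rng.choice(len(a), p=a / a.sum())
        return x + stoich[j]                # fire a single reaction exactly
    k = rng.poisson(a * tau)                # Poisson number of firings per channel
    return np.maximum(x + stoich.T @ k, 0)  # clamp as a last-resort guard

x = np.array([500, 100])
for _ in range(20):
    x = tau_leap_step(x, tau=0.5)
print(x)
```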

Relevance: 20.00%

Abstract:

Few studies have investigated iatrogenic outcomes from the viewpoint of patient experience. To address this anomaly, the broad aim of this research is to explore the lived experience of patient harm. Patient harm is defined as major harm to the patient, either psychosocial or physical in nature, resulting from any aspect of health care. Utilising the method of Consensual Qualitative Research (CQR), in-depth interviews are conducted with twenty-four volunteer research participants who self-report having been severely harmed by an invasive medical procedure. A standardised measure of emotional distress, the Impact of Event Scale (IES), is additionally employed for purposes of triangulation. Thematic analysis of transcript data indicates numerous findings, including: (i) difficulties regarding patients' prior understanding of risks involved with their medical procedure; (ii) the problematic response of the health system post-procedure; (iii) multiple adverse effects upon life functioning; (iv) limited recourse options for patients; and (v) the approach desired in terms of how patient harm should be systemically handled. In addition, IES results indicate a clinically significant level of distress in the sample as a whole. To discuss findings, a cross-disciplinary approach is adopted that draws upon sociology, medicine, medical anthropology, psychology, philosophy, history, ethics, law, and political theory. Furthermore, an overall explanatory framework is proposed in terms of the master themes of power and trauma. In terms of the theme of power, a postmodernist analysis explores the politics of patient harm, particularly the dynamics surrounding the politics of knowledge (e.g., notions of subjective versus objective knowledge, informed consent, and open disclosure). This analysis suggests that patient care is not the prime function of the health system, which appears more focussed upon serving the interests of those in the upper levels of its hierarchy. In terms of the master theme of trauma, current understandings of posttraumatic stress disorder (PTSD) are critiqued, and based on data from this research as well as the international literature, a new model of trauma is proposed. This model is based upon the principle of homeostasis observed in biology, whereby within every cell or organism a state of equilibrium is sought and maintained. The proposed model identifies several bio-psychosocial markers of trauma across its three main phases. These trauma markers include: (i) a profound sense of loss; (ii) a lack of perceived control; (iii) passive trauma processing responses; (iv) an identity crisis; (v) a quest to fully understand the trauma event; (vi) a need for social validation of the traumatic experience; and (vii) posttraumatic adaptation with the possibility of positive change. To further explore the master themes of power and trauma, a natural group interview is carried out at a meeting of a patient support group for arachnoiditis. Observations at this meeting and members' stories in general support the homeostatic model of trauma, particularly the quest to find answers in the face of distressing experience, as well as the need for social recognition of that experience. In addition, the sociopolitical response to arachnoiditis highlights how public domains of knowledge are largely constructed and controlled by vested interests. Implications of the data overall are discussed in terms of a cultural revolution being needed in health care to position core values around a prime focus upon patients as human beings.

Relevance: 20.00%

Abstract:

Modelling events in densely crowded environments remains challenging due to the diversity of events and the noise in the scene. We propose a novel approach for anomalous event detection in crowded scenes using dynamic textures described by the Local Binary Patterns from Three Orthogonal Planes (LBP-TOP) descriptor. The scene is divided into spatio-temporal patches from which LBP-TOP based dynamic textures are extracted. We apply hierarchical Bayesian models to detect the patches containing unusual events. Our method is an unsupervised approach, and it does not rely on object tracking or background subtraction. We show that our approach outperforms existing state-of-the-art algorithms for anomalous event detection on the UCSD dataset.
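
A simplified sketch of the descriptor stage is given below: it encodes only the three central orthogonal slices of a spatio-temporal patch with scikit-image's 2D LBP, whereas full LBP-TOP pools codes over every voxel, and the hierarchical Bayesian detection stage is not shown.

```python
# Simplified LBP-TOP-style descriptor for one spatio-temporal patch (a sketch, not
# the full LBP-TOP computation used in the paper).
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_top_descriptor(patch, P=8, R=1.0):
    """patch: uint8 array of shape (T, H, W) holding grey-level video data."""
    t, h, w = patch.shape
    planes = [patch[t // 2, :, :],    # XY plane (appearance)
              patch[:, h // 2, :],    # XT plane (horizontal motion)
              patch[:, :, w // 2]]    # YT plane (vertical motion)
    hists = []
    for plane in planes:
        codes = local_binary_pattern(plane, P, R, method="uniform")
        hist, _ = np.histogram(codes, bins=np.arange(P + 3), density=True)
        hists.append(hist)
    return np.concatenate(hists)      # one normalised histogram per plane

patch = (np.random.default_rng(2).random((16, 24, 24)) * 255).astype(np.uint8)
print(lbp_top_descriptor(patch).shape)   # 3 * (P + 2) = 30 bins
```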

Relevance: 20.00%

Abstract:

Travel time is an important network performance measure, and it quantifies congestion in a manner easily understood by all transport users. In urban networks, travel time estimation is challenging for a number of reasons, such as fluctuations in traffic flow due to traffic signals and significant flow to/from mid-link sinks/sources. The classical analytical procedure utilises cumulative plots at upstream and downstream locations to estimate travel time between the two locations. In this paper, we discuss the issues and challenges with the classical analytical procedure, such as its vulnerability to non-conservation of flow between the two locations. The complexity of exit-movement-specific travel time is also discussed. Recently, we developed a methodology utilising the classical procedure to estimate average travel time and its statistics on urban links (Bhaskar, Chung et al. 2010), in which detector, signal and probe vehicle data are fused. In this paper we extend the methodology to route travel time estimation and test its performance using simulation. The originality lies in defining cumulative plots for each exit turning movement utilising a historical database which is self-updated after each estimation. The performance is also compared with a method based solely on probe vehicles (Probe-only). The performance of the proposed methodology has been found to be insensitive to different route flows, with an average accuracy of more than 94% given one probe per estimation interval, which is more than a 5% increase in accuracy with respect to the Probe-only method.
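
For context, the classical cumulative-plot estimate works as follows: under conservation of flow and first-in-first-out conditions, the travel time of the n-th vehicle is the horizontal gap between the upstream and downstream cumulative count curves at height n. The sketch below uses synthetic passage times and ignores mid-link sinks/sources and signal effects, which are exactly the complications discussed above.

```python
# Classical cumulative-plot estimate on synthetic data: the cumulative count at a
# detector reaches level n at the n-th sorted passage time, so the travel time of
# the n-th vehicle is the horizontal gap between the two curves at height n.
import numpy as np

rng = np.random.default_rng(3)
arrivals = np.cumsum(rng.exponential(2.0, size=200))        # upstream passage times (s)
departures = arrivals + 60.0 + rng.normal(0.0, 5.0, 200)    # downstream passage times (s)

travel_times = np.sort(departures) - np.sort(arrivals)      # horizontal gaps at n = 1..200
print(round(travel_times.mean(), 1), "s average link travel time")
```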

Relevance: 20.00%

Abstract:

INTRODUCTION: Workforce planning for first aid and medical coverage of mass gatherings is hampered by limited research. In particular, the characteristics and likely presentation patterns of low-volume mass gatherings of between several hundred and several thousand people are poorly described in the existing literature. OBJECTIVES: This study was conducted to: 1. Describe key patient and event characteristics of medical presentations at a series of mass gatherings, including events smaller than those previously described in the literature; 2. Determine whether event type and event size affect the mean number of patients presenting for treatment per event, and specifically, whether the 1:2,000 deployment rule used by St John Ambulance Australia is appropriate; and 3. Identify factors that are predictive of injury at mass gatherings. METHODS: A retrospective, observational, case-series design was used to examine all cases treated by two Divisions of St John Ambulance (Queensland) in the greater metropolitan Brisbane region over a three-year period (01 January 2002-31 December 2004). Data were obtained from routinely collected patient treatment forms completed by St John officers at the time of treatment. Event-related data (e.g., weather, event size) were obtained from event forms designed for this study. Outcome measures included: total and average number of patient presentations for each event; event type; and event size category. Descriptive analyses were conducted using chi-square tests, and mean presentations per event and event type were investigated using Kruskal-Wallis tests. Logistic regression analyses were used to identify variables independently associated with injury presentation (compared with non-injury presentations). RESULTS: Over the three-year study period, St John Ambulance officers treated 705 patients over 156 separate events. The mean number of patients who presented with any medical condition at small events (less than or equal to 2,000 attendees) did not differ significantly from that of large (>2,000 attendees) events (4.44 vs. 4.67, F = 0.72, df = 1, 154, p = 0.79). Logistic regression analyses indicated that presentation with an injury compared with non-injury was independently associated with male gender, winter season, and sporting events, even after adjusting for relevant variables. CONCLUSIONS: In this study of low-volume mass gatherings, a similar number of patients sought medical treatment at small (<2,000 patrons) and large (>2,000 patrons) events. This demonstrates that for low-volume mass gatherings, planning based solely on anticipated event size may be flawed, and could lead to inappropriate levels of first-aid coverage. This study also highlights the importance of considering other factors, such as event type and patient characteristics, when determining appropriate first-aid resourcing for low-volume events. Additionally, identification of factors predictive of injury presentations at mass gatherings has the potential to significantly enhance the ability of event coordinators to plan effective prevention strategies and response capability for these events.
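
A sketch of the kind of logistic regression reported here (injury versus non-injury presentation modelled on gender, season and event type) is shown below; the data frame and its values are invented toy data, not the St John dataset.

```python
# Illustrative logistic regression on invented toy data (not the study's data):
# injury vs non-injury presentation on gender, season and event type.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "injury":   [1, 0, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0],
    "male":     [1, 0, 1, 1, 0, 0, 1, 1, 0, 0, 1, 0],
    "winter":   [1, 0, 0, 1, 1, 0, 0, 1, 1, 0, 1, 0],
    "sporting": [1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 1],
})

model = smf.logit("injury ~ male + winter + sporting", data=df).fit(disp=0)
print(model.params)        # log-odds; exponentiate for odds ratios
```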

Relevance: 20.00%

Abstract:

In this study we set out to dissociate the developmental time course of automatic symbolic number processing and cognitive control functions in grade 1-3 British primary school children. Event-related potential (ERP) and behavioral data were collected in a physical size discrimination numerical Stroop task. Task-irrelevant numerical information was already processed automatically in grade 1. Weakening interference and strengthening facilitation indicated the parallel development of general cognitive control and automatic number processing. Relationships among ERP and behavioral effects suggest that control functions play a larger role in younger children and that the automaticity of number processing increases from grade 1 to 3.
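
For clarity, interference and facilitation in a size-congruity numerical Stroop task are usually quantified relative to a neutral condition, as in the sketch below (the reaction times are invented illustrative values, not the study's data).

```python
# How interference and facilitation effects are typically computed from mean
# reaction times (toy numbers for illustration only):
# interference = RT(incongruent) - RT(neutral); facilitation = RT(neutral) - RT(congruent).
mean_rt_ms = {"congruent": 620.0, "neutral": 650.0, "incongruent": 700.0}

interference = mean_rt_ms["incongruent"] - mean_rt_ms["neutral"]   # 50 ms
facilitation = mean_rt_ms["neutral"] - mean_rt_ms["congruent"]     # 30 ms
print(interference, facilitation)
```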

Relevance: 20.00%

Abstract:

In Mango Boulevard Pty Ltd v Spencer [2010] QCA 207, a self-executing order had been made in consequence of continuing default by parties to the proceedings in meeting their disclosure obligations. The case involved several questions about the construction and implications of the self-executing order. This note focuses on the aspects of the case relating to that order.

Relevance: 20.00%

Abstract:

In Legal Services Commissioner v Wright [2010] QCA 321 the Queensland Court of Appeal allowed an appeal from the first instance decision. The decision involved the construction of “third party payer” in Part 3.4 of the Legal Profession Act 2007 (Qld).

Relevance: 20.00%

Abstract:

In Bowenbrae Pty Ltd v Flying Fighters Maintenance and Restoration [2010] QDC 347 Reid DCJ made orders requiring the plaintiffs to make application under the Freedom of Information Act 1982 (Cth) (“the FOI Act”) for documents sought by the defendant.

Relevance: 20.00%

Abstract:

In practice, parallel-machine job-shop scheduling (PMJSS) is very useful in the development of standard modelling approaches and generic solution techniques for many real-world scheduling problems. In this paper, based on the analysis of structural properties in an extended disjunctive graph model, a hybrid shifting bottleneck procedure (HSBP) algorithm combined with a Tabu Search metaheuristic is developed to deal with the PMJSS problem. The original SBP algorithm for job-shop scheduling (JSS) has been significantly improved to solve the PMJSS problem with four novelties: i) a topological-sequence algorithm is proposed to decompose the PMJSS problem into a set of single-machine scheduling (SMS) and/or parallel-machine scheduling (PMS) subproblems; ii) a modified Carlier algorithm based on the proposed lemmas and proofs is developed to solve the SMS subproblem; iii) the Jackson rule is extended to solve the PMS subproblem; iv) a Tabu Search metaheuristic is embedded within the SBP framework to optimise the JSS and PMJSS cases. The computational experiments show that the proposed HSBP is very efficient in solving the JSS and PMJSS problems.
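
A generic Tabu Search skeleton of the kind embedded under the SBP framework is sketched below; the neighbourhood, the evaluation function (e.g. the makespan of a PMJSS schedule) and the tabu attributes are left abstract, so this is a schematic outline rather than the authors' HSBP algorithm.

```python
# Generic Tabu Search skeleton: move/tabu-list/aspiration loop, left abstract.
from collections import deque

def tabu_search(initial, neighbours, evaluate, max_iters=200, tenure=10):
    """initial: starting solution; neighbours(s) -> list of (move, new_solution);
    evaluate(s) -> cost to minimise (e.g. schedule makespan)."""
    current, best = initial, initial
    best_cost = evaluate(best)
    tabu = deque(maxlen=tenure)                    # recently used move attributes
    for _ in range(max_iters):
        candidates = [(m, s) for m, s in neighbours(current)
                      if m not in tabu or evaluate(s) < best_cost]  # aspiration criterion
        if not candidates:
            break
        move, current = min(candidates, key=lambda ms: evaluate(ms[1]))
        tabu.append(move)
        if evaluate(current) < best_cost:
            best, best_cost = current, evaluate(current)
    return best, best_cost

# Toy usage: minimise a 1-D quadratic over integer moves of +/-1.
sol, cost = tabu_search(
    initial=17,
    neighbours=lambda s: [(+1, s + 1), (-1, s - 1)],
    evaluate=lambda s: (s - 3) ** 2)
print(sol, cost)
```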

Relevance: 20.00%

Abstract:

Background: Access to cardiac services is essential for appropriate implementation of evidence-based therapies to improve outcomes. The Cardiac Accessibility and Remoteness Index for Australia (Cardiac ARIA) aimed to derive an objective, geographic measure reflecting access to cardiac services. Methods: An expert panel defined an evidence-based clinical pathway. Using Geographic Information Systems (GIS), a numeric/alpha index was developed at two points along the continuum of care. The acute category (numeric) measured the time from the emergency call to arrival at an appropriate medical facility via road ambulance. The aftercare category (alpha) measured access to four basic services (family doctor, pharmacy, cardiac rehabilitation, and pathology services) when a patient returned to their community. Results: The numeric index ranged from 1 (access to a principal referral center with cardiac catheterization service ≤ 1 hour) to 8 (no ambulance service, > 3 hours to medical facility, air transport required). The alphabetic index ranged from A (all 4 services available within 1 hour drive-time) to E (no services available within 1 hour). 13.9 million (71%) Australians resided within Cardiac ARIA 1A locations (hospital with cardiac catheterization laboratory and all aftercare within 1 hour). Those outside Cardiac 1A were over-represented by people aged over 65 years (32%) and Indigenous people (60%). Conclusion: The Cardiac ARIA index demonstrated substantial inequity in access to cardiac services in Australia. This methodology can be used to inform cardiology health service planning and could be applied to other common disease states in other regions of the world.
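
A toy sketch of how the aftercare (alphabetic) category could be assigned from drive-time data is given below. The abstract defines only the endpoints (A = all four services within a one-hour drive, E = none), so grading the intermediate letters by the count of reachable services is an assumption made purely for illustration.

```python
# Illustrative assignment of the aftercare category from drive times.
# Assumption: intermediate letters B-D correspond to 3, 2 and 1 reachable services.
SERVICES = ("family_doctor", "pharmacy", "cardiac_rehabilitation", "pathology")

def aftercare_category(drive_times_hours):
    """drive_times_hours: mapping from service name to drive time in hours."""
    reachable = sum(1 for s in SERVICES
                    if drive_times_hours.get(s, float("inf")) <= 1.0)
    return {4: "A", 3: "B", 2: "C", 1: "D", 0: "E"}[reachable]

print(aftercare_category({"family_doctor": 0.4, "pharmacy": 0.6,
                          "cardiac_rehabilitation": 2.5, "pathology": 0.9}))  # -> "B"
```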

Relevance: 20.00%

Abstract:

Background Comprehensive geriatric assessment has been shown to improve patient outcomes, but the geriatricians who deliver it are in short supply. A web-based method of comprehensive geriatric assessment has been developed with the potential to improve access to specialist geriatric expertise. The current study aims to test the reliability and safety of comprehensive geriatric assessment performed “online” in making geriatric triage decisions. It will also explore the accuracy of the procedure in identifying common geriatric syndromes, and its cost relative to conventional “live” consultations. Methods/Design The study population will consist of 270 acutely hospitalized patients referred for geriatric consultation at three sites. Paired assessments (live and online) will be conducted by independent, blinded geriatricians and the level of agreement examined. This will be compared with the level of agreement between two independent, blinded geriatricians each consulting with the patient in person (i.e. “live”). Agreement between the triage decisions from live-live assessments and from live-online assessments will be calculated using kappa statistics. Agreement between the online and live detection of common geriatric syndromes will also be assessed using kappa statistics. Resource use data will be collected for online and live-live assessments to allow comparison between the two procedures. Discussion If the online approach is found to be less precise than live assessment, further analysis will seek to identify patient subgroups where disagreement is more likely. This may enable a protocol to be developed that avoids unsafe clinical decisions at a distance. Trial registration number: ACTRN12611000936921
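
The planned agreement analysis can be illustrated with Cohen's kappa on paired triage decisions, as sketched below with invented labels.

```python
# Sketch of the agreement analysis: Cohen's kappa between paired triage decisions
# (live vs. online, or live vs. live). The labels below are invented examples.
from sklearn.metrics import cohen_kappa_score

live_triage   = ["home", "rehab", "acute", "home", "rehab", "acute", "home", "rehab"]
online_triage = ["home", "rehab", "acute", "rehab", "rehab", "acute", "home", "home"]

kappa = cohen_kappa_score(live_triage, online_triage)
print(round(kappa, 2))   # 1.0 = perfect agreement, 0 = chance-level agreement
```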