954 results for Critical incident technique


Relevance:

20.00%

Publisher:

Abstract:

One of the promises of New Labour was that government policy would be grounded in 'evidence-based research'. In recent years some academics have come to question whether the government has delivered on this promise. Professors Reece Walters and Tim Hope offer two contributions to this debate, arguing that rather than the 'evidence base', it is political considerations that govern the commissioning, production and dissemination of Home Office research. As the first monograph in our 'Evidence-based policy' series, Critical thinking about the uses of research carries a thought-provoking set of arguments.

Relevance:

20.00%

Publisher:

Abstract:

This work offers a critical introduction to sociology for New Zealand students. Written in an accessible narrative style, it seeks to challenge and debunk students' assumptions about key elements of their social worlds, encouraging them to develop a "critical imagination" as a tool to identify broader social themes in personal issues.

Relevance:

20.00%

Publisher:

Abstract:

Existing secure software development principles tend to focus on coding vulnerabilities, such as buffer or integer overflows, that apply to individual program statements, or on issues associated with the run-time environment, such as component isolation. Here we instead consider software security from the perspective of potential information flow through a program's object-oriented module structure. In particular, we define a set of quantifiable "security metrics" which allow programmers to quickly and easily assess the overall security of a given source code program or object-oriented design. Although measuring quality attributes of object-oriented programs for properties such as maintainability and performance has been well covered in the literature, metrics which measure the quality of information security have received little attention. Moreover, existing security-relevant metrics assess a system either at a very high level, i.e., the whole system, or at a fine level of granularity, i.e., with respect to individual statements. These approaches make it hard and expensive to recognise a secure system from an early stage of development. Instead, our security metrics are based on well-established compositional properties of object-oriented programs (i.e., data encapsulation, cohesion, coupling, composition, extensibility, inheritance and design size), combined with data flow analysis principles that trace potential information flow between high- and low-security system variables. We first define a set of metrics to assess the security quality of a given object-oriented system based on its design artifacts, allowing defects to be detected at an early stage of development. We then extend these metrics to produce a second set applicable to object-oriented program source code. The resulting metrics make it easy to compare the relative security of functionally equivalent system designs or source code programs so that, for instance, the security of two different revisions of the same system can be compared directly. This capability is further used to study the impact of specific refactoring rules on system security more generally, at both the design and code levels. By measuring the relative security of various programs refactored using different rules, we thus provide guidelines for the safe application of refactoring steps to security-critical programs. Finally, to make it easy and efficient to measure a system design or program's security, we have also developed a stand-alone software tool which automatically analyses and measures the security of UML designs and Java program code. The tool's capabilities are demonstrated by applying it to a number of security-critical system designs and Java programs. Notably, the validity of the metrics is demonstrated empirically through measurements that confirm our expectation that program security typically improves as bugs are fixed, but worsens as new functionality is added.
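To make the flavour of such metrics concrete, here is a minimal sketch in Python of one hypothetical encapsulation-style measure: the fraction of "classified" (high-security) attributes in a class design that are publicly accessible. The metric, names and scoring are illustrative assumptions, not the metric suite defined in the work itself.

```python
from dataclasses import dataclass

@dataclass
class Attribute:
    name: str
    is_classified: bool   # carries high-security data
    is_public: bool       # accessible from outside its class

@dataclass
class ClassDesign:
    name: str
    attributes: list

def classified_accessibility(design: ClassDesign) -> float:
    """Fraction of classified attributes that are publicly accessible;
    0.0 is best (full encapsulation), 1.0 is worst (fully exposed)."""
    classified = [a for a in design.attributes if a.is_classified]
    if not classified:
        return 0.0
    exposed = sum(1 for a in classified if a.is_public)
    return exposed / len(classified)

# Two functionally equivalent designs of the same class, compared directly:
leaky = ClassDesign("Account", [
    Attribute("pin", is_classified=True, is_public=True),
    Attribute("balance", is_classified=True, is_public=False),
    Attribute("branch", is_classified=False, is_public=True),
])
tight = ClassDesign("Account", [
    Attribute("pin", is_classified=True, is_public=False),
    Attribute("balance", is_classified=True, is_public=False),
    Attribute("branch", is_classified=False, is_public=True),
])

print(classified_accessibility(leaky))  # 0.5
print(classified_accessibility(tight))  # 0.0 -- relatively more secure
```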

Relevance:

20.00%

Publisher:

Abstract:

The concept of local accumulation time (LAT) was introduced by Berezhkovskii and coworkers in 2010–2011 to give a finite measure of the time required for the transient solution of a reaction–diffusion equation to approach the steady-state solution (Biophys J. 99, L59 (2010); Phys Rev E. 83, 051906 (2011)). Such a measure is referred to as a critical time. Here, we show that LAT is, in fact, identical to the concept of mean action time (MAT) that was first introduced by McNabb in 1991 (IMA J Appl Math. 47, 193 (1991)). Although McNabb's initial argument was motivated by considering the mean particle lifetime (MPLT) for a linear death process, he applied the ideas to study diffusion. We extend the work of these authors by deriving expressions for the MAT for a general one-dimensional linear advection–diffusion–reaction problem. Using a combination of continuum and discrete approaches, we show that MAT and MPLT are equivalent for certain uniform-to-uniform transitions; these results provide a practical interpretation for MAT, by directly linking the stochastic microscopic processes to a meaningful macroscopic timescale. We find that for more general transitions, the equivalence between MAT and MPLT does not hold. Unlike other critical time definitions, we show that it is possible to evaluate the MAT without solving the underlying partial differential equation (PDE). This makes MAT a simple and attractive quantity for practical situations. Finally, our work explores the accuracy of certain approximations derived using the MAT, showing that useful approximations for nonlinear kinetic processes can be obtained, again without treating the governing PDE directly.
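As a sketch of the construction (in notation common in this literature, not taken from the abstract itself): for a solution c(x,t) relaxing from an initial state c_0(x) to a steady state c_∞(x), the MAT is the mean of the distribution obtained by normalising the approach to steady state.

```latex
% Normalised transition function: F(x,t) rises from 0 to 1 as c(x,t)
% relaxes from the initial state c_0(x) to the steady state c_\infty(x),
% so \partial F/\partial t behaves as a density over transition times.
F(x,t) = 1 - \frac{c(x,t) - c_\infty(x)}{c_0(x) - c_\infty(x)}

% Mean action time: after integration by parts, only a time integral of
% 1 - F is needed. It is this integral that can often be evaluated
% directly from the governing equation, without solving the PDE for
% c(x,t) itself.
\tau(x) = \int_0^\infty t \, \frac{\partial F}{\partial t} \, dt
        = \int_0^\infty \bigl( 1 - F(x,t) \bigr) \, dt
```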

Relevance:

20.00%

Publisher:

Abstract:

This paper presents the benefits and issues related to travel time prediction on urban networks. Travel time information quantifies congestion and is perhaps the most important network performance measure. Travel time prediction has been an active area of research for the last five decades, and activities related to ITS have increased researchers' attention to better and more accurate real-time prediction of travel time. The majority of the literature on travel time prediction is applicable to freeways where, under non-incident conditions, traffic flow is not affected by external factors such as traffic control signals and opposing traffic flows. In urban environments the problem is more complicated, due to conflict areas (intersections), mid-link sources and sinks, etc., and needs to be addressed.

Relevance:

20.00%

Publisher:

Abstract:

It appears that few of the students holding ‘socially idealistic’ goals upon entering law school actually maintain these upon graduation. The critical legal narrative, which explains and seeks to act upon this shift in the graduate’s ‘legal identity’, posits that these ideals are repressed through power relations that create passive receptacles into which professional ideologies can be deposited, in the interests of those advantaged by the social and legal status quo. Using the work of Michel Foucault, this paper unpacks the assumptions underpinning this narrative, particularly its arguments about ideology, power, and the subject. In doing so, it will argue this narrative provides an untenable basis for political action within legal education. By interrogating this narrative, this paper provides a new way of understanding the construction of the legal identity through legal education, and a new basis for political action within law school.

Relevance:

20.00%

Publisher:

Abstract:

This article explores power within legal education scholarship. It suggests that power relations are not effectively reflected on within this scholarship, and it provokes legal educators to consider power more explicitly and effectively. It then outlines, in depth, a conceptual and methodological approach based on Michel Foucault's concept of 'governmentality' to assist in such an analysis. By detailing the conceptual moves required in order to research power in legal education more effectively, this article seeks to stimulate new reflection and thought about the practice and scholarship of legal education, and to allow political interventions to become more ethically sensitive and potentially more effective.

Relevance:

20.00%

Publisher:

Abstract:

Those working in the critical criminology tradition have been centrally concerned with the social construction, variability and contingency of the criminal label. The concern is no less salient to a consideration of critical criminology itself, and any history of critical criminology (in Australia or elsewhere) should itself aim to be critical in this sense. The point applies with equal force to both of the terms 'critical' and 'criminology'. The want of a stable theoretical object has meant that criminology itself needs to be seen not as a distinct discipline but as a composite intellectual and governmental hybrid, a field of studies that overlaps and intersects many others (sociology, law, psychology, history, anthropology, social work, media studies and youth studies, to name only a few). In consequence, much of the most powerful work on subjects of criminological inquiry is undertaken by scholars who do not necessarily define themselves as criminologists first and foremost, or at all. For reasons that should later become obvious, this is even more pronounced in the Australian context. Although we may appear at times to be claiming such work for criminology, our purpose is to recognize its impact on and in critical criminology in Australia.

Relevance:

20.00%

Publisher:

Abstract:

Background: In-depth investigation of crash risks informs prevention and safety promotion programmes. Traditionally, such investigations are conducted using exposure-controlled or case-control methodologies. However, these studies need either observational data for control cases or exogenous exposure data, such as vehicle-kilometres travelled, entry flow, or the product of conflicting flows for a particular traffic location or site. These data are not readily available and often require extensive data collection effort on a system-wide basis. Aim: The objective of this research is to propose an alternative methodology for investigating the crash risks of a road user group in different circumstances using readily available traffic police crash data. Methods: This study employs a combination of a log-linear model and the quasi-induced exposure technique to estimate the crash risks of a road user group. While the log-linear model reveals the significant interactions, and thus the prevalence of crashes of a road user group under various sets of traffic, environmental and roadway factors, the quasi-induced exposure technique estimates the relative exposure of that road user group for the same set of explanatory variables. The combination of these two techniques therefore provides relative measures of crash risk under various roadway, environmental and traffic conditions. The proposed methodology is illustrated using five years of Brisbane motorcycle crash data. Results: Interpretation of the results for different combinations of interacting factors shows that the poor conspicuity of motorcycles is a predominant cause of motorcycle crashes. The inability of other drivers to correctly judge the speed and distance of an oncoming motorcyclist is also evident in right-of-way violation motorcycle crashes at intersections. Discussion and Conclusions: The combination of a log-linear model and the induced exposure technique is a promising methodology that can be applied to better estimate the crash risks of other road users. This study also highlights the importance of considering interaction effects to better understand hazardous situations. A further study comparing the proposed methodology with the case-control method would be useful.
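A minimal sketch of the quasi-induced exposure step in Python: in two-vehicle crashes the not-at-fault parties are treated as a random sample of traffic, so their distribution estimates each group's exposure. The column names and counts below are invented for illustration, and the log-linear modelling stage is omitted.

```python
import pandas as pd

# Hypothetical two-vehicle crash records (one row per involved party);
# values are invented, not drawn from the Brisbane data set.
crashes = pd.DataFrame({
    "road_user": ["motorcycle", "car", "car", "motorcycle", "car",
                  "motorcycle", "car", "car", "motorcycle", "car"],
    "at_fault":  [True, False, True, False, False,
                  True, True, False, True, False],
})

# Share of each road user group among at-fault and not-at-fault parties.
at_fault = crashes[crashes["at_fault"]]["road_user"].value_counts(normalize=True)
not_at_fault = crashes[~crashes["at_fault"]]["road_user"].value_counts(normalize=True)

# Relative crash involvement ratio: values above 1 mean the group is
# over-involved as the at-fault party relative to its estimated exposure.
relative_risk = (at_fault / not_at_fault).rename("involvement_ratio")
print(relative_risk)
```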

Relevance:

20.00%

Publisher:

Abstract:

In recent years, a number of phylogenetic methods have been developed for estimating molecular rates and divergence dates under models that relax the molecular clock constraint by allowing rate change throughout the tree. These methods are being used with increasing frequency, but there have been few studies into their accuracy. We tested the accuracy of several relaxed-clock methods (penalized likelihood and Bayesian inference using various models of rate change) using nucleotide sequences simulated on a nine-taxon tree. When the sequences evolved with a constant rate, the methods were able to infer rates accurately, but estimates were more precise when a molecular clock was assumed. When the sequences evolved under a model of autocorrelated rate change, rates were accurately estimated using penalized likelihood and by Bayesian inference using lognormal and exponential models of rate change, while other models did not perform as well. When the sequences evolved under a model of uncorrelated rate change, only Bayesian inference using an exponential rate model performed well. Collectively, the results provide a strong recommendation for using the exponential model of rate change if a conservative approach to divergence time estimation is required. A case study is presented in which we use a simulation-based approach to examine the hypothesis of elevated rates in the Cambrian period, and it is found that these high rate estimates might be an artifact of the rate estimation method. If this bias is present, then the ages of metazoan divergences would be systematically underestimated. The results of this study have implications for studies of molecular rates and divergence dates.
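As a sketch of the two families of rate models being compared, here is a toy simulation in Python. The tree, parameterisation and model details are illustrative assumptions of our own, not the nine-taxon tree or priors used in the study.

```python
import math
import random

random.seed(1)

# A toy rooted tree: node -> (branch length, children).
TREE = {
    "root": (0.0, ["A", "B"]),
    "A":    (0.5, ["A1", "A2"]),
    "B":    (0.8, []),
    "A1":   (0.3, []),
    "A2":   (0.4, []),
}

def autocorrelated_rates(node="root", parent_rate=1.0, nu=0.2, rates=None):
    """Autocorrelated model: each branch's rate is drawn from a lognormal
    centred on its parent's rate, with variance growing along the branch."""
    if rates is None:
        rates = {}
    length, children = TREE[node]
    if node == "root":
        rate = parent_rate
    else:
        sigma2 = nu * length
        mu = math.log(parent_rate) - sigma2 / 2.0   # so E[rate] = parent_rate
        rate = random.lognormvariate(mu, math.sqrt(sigma2))
    rates[node] = rate
    for child in children:
        autocorrelated_rates(child, rate, nu, rates)
    return rates

def uncorrelated_rates(nu=0.2, mean_rate=1.0):
    """Uncorrelated model: every branch rate drawn i.i.d. from one lognormal."""
    mu = math.log(mean_rate) - nu / 2.0
    return {n: random.lognormvariate(mu, math.sqrt(nu))
            for n in TREE if n != "root"}

print(autocorrelated_rates())  # rates drift gradually along lineages
print(uncorrelated_rates())    # rates vary independently across branches
```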

Relevance:

20.00%

Publisher:

Abstract:

This letter presents a technique to assess the overall network performance of sampled value process buses based on IEC 61850-9-2, using measurements from a single location in the network. The method is based upon the use of Ethernet cards with externally synchronized time stamping, together with characteristics of the process bus protocol. The application and utility of the method are demonstrated by measuring the latency introduced by Ethernet switches. Network latency can be measured from a single set of captures, rather than by comparing source and destination captures. Absolute latency measures will greatly assist the design, testing, commissioning and maintenance of these critical data networks.
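A minimal sketch of how a single synchronized capture could yield absolute latency, assuming (as is common for 50 Hz systems) that the merging unit publishes 4000 sampled-value frames per second and that each frame's smpCnt resets to zero on the second boundary. The rate and field names are assumptions for illustration, not details taken from the letter.

```python
SAMPLE_RATE = 4000  # frames per second (80 samples/cycle at 50 Hz); assumed

def frame_latency(capture_ts: float, smp_cnt: int) -> float:
    """Latency of one SV frame: the synchronized capture timestamp minus
    the instant the sample was nominally taken and published."""
    second = int(capture_ts)                   # top of the current second
    nominal = second + smp_cnt / SAMPLE_RATE   # when this sample was due
    return capture_ts - nominal

# Example: the frame carrying smpCnt 1200 (nominally due at t = 0.3 s
# past the second) arrives 86 microseconds late at the capture point.
print(frame_latency(12.300086, 1200))  # ~8.6e-05 s
```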

Relevance:

20.00%

Publisher:

Abstract:

Aim: The purpose of this study was to examine the relationship between registered nurses' (RN) job satisfaction and their intention to leave critical care nursing in Saudi Arabia. Background: Many studies have identified critical care areas as stressful work environments for nurses and have identified factors contributing to job satisfaction and staff retention. However, very little research has examined these relationships in the Saudi context. Design and Methods: This study utilised an exploratory, cross-sectional survey design to examine the relationship between RN job satisfaction and intention to leave at King Abdul-Aziz University Hospital, Saudi Arabia. Respondents completed a self-administered survey including demographic items and validated measures of job satisfaction and intention to leave. A convenience sample of 182 RNs working in critical care areas during the data collection period was included. Results: Regression analysis predicting RN intention to leave found that demographic variables, including age, parental status and length of ICU experience, and three of the job satisfaction subscales, namely perceived workload, professional support, and pay and prospects for promotion, were significantly associated with the outcome variable. Conclusion: This study adds to the existing literature on the relationship between job satisfaction and intention to leave critical care areas among RNs working in Saudi Arabia. These findings point to the need for management and policy interventions targeting nurses' workloads, professional support, and pay and promotion in order to improve nurse retention.
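A structural sketch of such a regression in Python with statsmodels. The variable names mirror the abstract, but all values are invented, and the model form (OLS on a continuous intention-to-leave score) is an assumption rather than the analysis actually reported.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative records only: ten invented respondents.
df = pd.DataFrame({
    "intention_to_leave": [3.2, 4.1, 2.0, 4.8, 1.5, 3.9, 2.7, 4.4, 2.2, 3.5],
    "age":                [28, 24, 41, 26, 45, 30, 38, 25, 43, 29],
    "has_children":       [0, 0, 1, 0, 1, 0, 1, 0, 1, 0],
    "icu_years":          [2, 1, 10, 3, 15, 4, 8, 2, 12, 3],
    "workload":           [4.5, 4.8, 2.9, 4.9, 2.1, 4.2, 3.0, 4.7, 2.5, 4.0],
    "prof_support":       [2.1, 1.8, 4.0, 1.5, 4.5, 2.3, 3.8, 1.9, 4.1, 2.6],
    "pay_prospects":      [2.0, 1.5, 3.9, 1.2, 4.2, 2.4, 3.5, 1.6, 4.0, 2.2],
})

# Demographics plus three job satisfaction subscales as predictors.
model = smf.ols(
    "intention_to_leave ~ age + has_children + icu_years"
    " + workload + prof_support + pay_prospects",
    data=df,
).fit()
print(model.params.round(2))  # use model.summary() for the full table
```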

Relevance:

20.00%

Publisher:

Abstract:

The Six Sigma technique is a quality management strategy utilised for improving quality and productivity in manufacturing processes. Inspired by Deming's Plan-Do-Check-Act (PDCA) cycle, it offers two major project methodologies, DMAIC and DMADV, each comprising five phases. The DMAIC methodology, used for improving an existing manufacturing process through the phases Define, Measure, Analyse, Improve and Control, is used throughout this research. The mask industry has become significant since the outbreak of serious diseases such as Severe Acute Respiratory Syndrome (SARS), bird flu, influenza, swine flu and hay fever. Protecting the respiratory system has therefore become a fundamental requirement for preventing respiratory diseases, and masks are among the most appropriate protective products inasmuch as they are effective in protecting the respiratory tract and resisting airborne virus infection. To satisfy varied customer requirements, thousands of mask products are designed for the market. Masks are also widely used across industries, including the medical, semiconductor, food, traditional manufacturing, and metal industries. Because masks are used to prevent dangerous diseases and safeguard people, their quality is a priority, and quality improvement techniques are of very high significance in the mask industry. The purpose of this research project is, firstly, to investigate current quality control practices in the mask industry; then, to explore the feasibility of using the Six Sigma technique in that industry; and, finally, to implement the Six Sigma technique in a case company to develop and evaluate its product quality process. This research mainly investigates the quality problems of the mask industry and the effectiveness of the Six Sigma technique in that industry, with the United Excel Enterprise Corporation (UEE) Company as the case company. The DMAIC project methodology of the Six Sigma technique is adopted and developed in this research. This research makes a significant contribution to knowledge. First, the main results contribute to discovering the root causes of quality problems in the mask industry. Second, the company was able to increase not only its acceptance rate but also its quality level by utilising the Six Sigma technique; hence, utilising the Six Sigma technique could increase the production capacity of the company. Third, the Six Sigma technique needs to be extensively modified to improve quality control in the mask industry. The impact of the Six Sigma technique on the overall performance of the business organisation should be further explored in future research.
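For readers unfamiliar with how "quality level" is quantified in Six Sigma work, here is a minimal sketch of the conventional conversion from defects per million opportunities (DPMO) to a sigma level; the mask counts in the example are invented for illustration.

```python
from statistics import NormalDist

def sigma_level(defects: int, units: int, opportunities_per_unit: int,
                shift: float = 1.5) -> tuple:
    """Convert a defect count into DPMO and a sigma level, using the
    customary 1.5-sigma shift between long- and short-term performance."""
    dpmo = defects / (units * opportunities_per_unit) * 1_000_000
    # Map the long-term defect rate to a z-score, then apply the shift.
    z = NormalDist().inv_cdf(1 - dpmo / 1_000_000)
    return dpmo, z + shift

# Example: 120 defective masks in a lot of 10,000, one opportunity each.
dpmo, sigma = sigma_level(120, 10_000, 1)
print(f"DPMO = {dpmo:.0f}, sigma level = {sigma:.2f}")  # DPMO = 12000, ~3.76
```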

Relevance:

20.00%

Publisher:

Abstract:

Purpose: To provide an overview and a critical appraisal of systematic reviews (SRs) of published interventions for the prevention/management of radiation dermatitis. Methods and Materials: We searched Medline, CINAHL, Embase, and the Cochrane Library. We also manually searched the individual reference lists of potentially eligible articles and a number of key journals in the topic area. Two authors screened all potential articles and included eligible SRs. Two authors critically appraised and extracted key findings from the included reviews using AMSTAR (the measurement tool for "assessment of multiple systematic reviews"). Results: Of 1837 potential titles, 6 SRs were included. A number of interventions have been reported to be potentially beneficial for managing radiation dermatitis. Interventions evaluated in these reviews included skin care advice, steroidal/nonsteroidal topical agents, systemic therapies, modes of radiation delivery, and dressings. However, all the included SRs reported that there is insufficient evidence to support any single effective intervention. The methodological quality of the included studies varied, and methodological shortfalls in these reviews might bias the overall results or recommendations for clinical practice. Conclusions: An up-to-date, high-quality SR on the prevention/management of radiation dermatitis is needed to guide practice and direct future research. We recommend that clinicians and guideline developers critically evaluate the information in SRs in their decision making.