173 results for Trial and error
Abstract:
In light of the high stakes of the Deepwater Horizon civil trial and the important precedent-setting role that the case will have on the assessment of future marine disasters, the methodologies underpinning the calculations of damage on both sides will be subjected to considerable scrutiny. Despite the importance of the case, however, there seems to be a pronounced lack of convergence on this question in the academic literature. Contributions from scientific journals frequently make comparisons to the Ixtoc I oil spill off the coast of Mexico in 1979; the legal literature, by stark contrast, seems to be much more focused on the Exxon Valdez spill that occurred off the shores of Alaska in 1989. This paper accordingly calls for a more thorough consideration of other analogs beyond the Exxon Valdez spill, most notably the Ixtoc I incident, in arriving at an assessment of the damage caused by the Deepwater Horizon disaster.
Abstract:
Numbers, rates and proportions of those remanded in custody have increased significantly in recent decades across a range of jurisdictions. In Australia they have doubled since the early 1980s, such that close to one in four prisoners is currently unconvicted. Taking NSW as a case study and drawing on the recent New South Wales Law Reform Commission Report on Bail (2012), this article will identify the key drivers of this increase in NSW, predominantly a form of legislative hyperactivity involving constant changes to the Bail Act 1978 (NSW), changes which remove or restrict the presumption in favour of bail for a wide range of offences. The article will then examine some of the conceptual, cultural and practice shifts underlying the increase. These include: a shift away from a conception of bail as a procedural issue predominantly concerned with securing the attendance of the accused at trial and the integrity of the trial, to the use of bail for crime prevention purposes; the diminishing force of the presumption of innocence; the framing of a false opposition between an individual interest in liberty and a public interest in safety; a shift from determination of the individual case by reference to its own particular circumstances to determination by its classification within pre‐set legislative categories of offence types and previous convictions; a double jeopardy effect arising in relation to people with previous convictions for which they have already been punished; and an unacknowledged preventive detention effect arising from the increased emphasis on risk. Many of these conceptual shifts are apparent in the explosion in bail conditions and the KPI‐driven policing of bail conditions and consequent rise in revocations, especially in relation to juveniles. The paper will conclude with a note on the NSW Government’s response to the NSW LRC Report in the form of a Bail Bill (2013) and brief speculation as to its likely effects.
Abstract:
On 25 March 1997 the Witness program on Channel 7 screened a story about the conviction of Neil Chidiac in February 1989 for conspiracy to import a trafficable quantity of heroin in NSW. The program questioned the justice of Chidiac's conviction and filmed his recent release from prison on parole after serving over eight years in prison, still protesting his innocence. Witness featured an interview with the chief Crown witness against Chidiac, Alfred Oti, in which Oti completely repudiated the testimony he gave at the trial and admitted to lying at the behest of the police in order to secure advantages for himself...
Abstract:
The failure of medical practitioners to consistently discharge their obligation to report sudden or unnatural deaths to coroners has rightly prompted concern. Following recent public scandals, coroners and health authorities have increasingly developed procedures to ensure that concerning deaths are reported to coroners. However, the negative consequences of deaths being unnecessarily reported have received less attention: unnecessary intrusion into bereavement; a waste of public resources; and added delay and hindrance to the investigation of matters needing a coroner’s attention. Traditionally, coroners have largely and unquestioningly assumed jurisdiction over any death for which a medical practitioner has not issued a cause of death certificate. The Office of the State Coroner in Queensland has recently trialled a system to assess more rigorously whether deaths apparently resulting from natural causes that have been reported to a coroner should be investigated by the coroner, rather than being finalised by a doctor issuing a cause of death certificate. This article describes that trial and its results.
Improving the performance of nutrition screening through a series of quality improvement initiatives
Abstract:
Background: Nutrition screening identifies patients at risk of malnutrition to facilitate early nutritional intervention. Studies have reported incompletion and error rates of 30-90% for a range of commonly used screening tools. This study aims to investigate the incompletion and error rates of 3-Minute Nutrition Screening (3-MinNS) and the effect of quality improvement initiatives in improving the overall performance of the screening tool and the referral process for at-risk patients. Methods: Annual audits were carried out from 2008-2013 on 4467 patients. Value Stream Mapping, the Plan-Do-Check-Act cycle and Root Cause Analysis were used in this study to identify gaps and determine the best intervention. The intervention included 1) implementing a nutrition screening protocol, 2) nutrition screening training, 3) nurse empowerment for online dietetics referral of at-risk cases, 4) a closed-loop feedback system and 5) removing the component of 3-MinNS that caused the most errors without compromising its sensitivity and specificity. Results: Nutrition screening error rates were 33% and 31%, with 5% and 8% blank or missing forms, in 2008 and 2009 respectively. For patients at risk of malnutrition, referral to dietetics took up to 7.5 days, with 10% not referred at all. After the intervention, the proportion not referred decreased to 7% (2010), 4% (2011) and 3% (2012 and 2013), and the mean turnaround time from screening to referral was reduced significantly from 4.3 ± 1.8 days to 0.3 ± 0.4 days (p < 0.001). Error rates were reduced to 25% (2010), 15% (2011), 7% (2012) and 5% (2013), and the percentage of blank or missing forms fell to 1% and remained at that level. Conclusion: Quality improvement initiatives were effective in reducing the incompletion and error rates of nutrition screening and led to sustainable improvements in the referral process for patients at nutritional risk.
Abstract:
Neuropsychological tests requiring patients to find a path through a maze can be used to assess visuospatial memory performance in temporal lobe pathology, particularly in the hippocampus. Alternatively, they have been used as a task sensitive to executive function in patients with frontal lobe damage. We measured performance on the Austin Maze in patients with unilateral left and right temporal lobe epilepsy (TLE), with and without hippocampal sclerosis, compared to healthy controls. Performance was correlated with a number of other neuropsychological tests to identify the cognitive components that may be associated with poor Austin Maze performance. Patients with right TLE were significantly impaired on the Austin Maze task relative to patients with left TLE and controls, and error scores correlated with their performance on the Block Design task. The performance of patients with left TLE was also impaired relative to controls; however, errors correlated with performance on tests of executive function and delayed recall. The presence of hippocampal sclerosis did not have an impact on maze performance. A discriminant function analysis indicated that the Austin Maze alone correctly classified 73.5% of patients as having right TLE. In summary, impaired performance on the Austin Maze task is more suggestive of right than left TLE; however, impaired performance on this visuospatial task does not necessarily involve the hippocampus. The relationship of the Austin Maze task with other neuropsychological tests suggests that differential cognitive components may underlie performance decrements in right versus left TLE.
Abstract:
INTRODUCTION: In 2008, the US FDA required all new glucose-lowering therapies to show cardiovascular safety, and this applies to the dipeptidyl peptidase-4 inhibitors ('gliptins'). AREAS COVERED: The cardiovascular safety trials of saxagliptin and alogliptin have recently been published and are the subject of this evaluation. EXPERT OPINION: The Saxagliptin Assessment of Vascular Outcomes Recorded in Patients with Diabetes Mellitus - Thrombolysis in Myocardial Infarction 53 trial and the Examination of Cardiovascular Outcomes with Alogliptin versus Standard of Care were both multicentre, randomised, double-blind, placebo-controlled, Phase IV clinical trials. These trials showed that saxagliptin and alogliptin did not increase the primary end point, which was a composite of cardiovascular outcomes that did not include hospitalisations for heart failure. However, saxagliptin significantly increased hospitalisation for heart failure, which was a component of the secondary end point. The effect of alogliptin on hospitalisations for heart failure has not been reported. Neither agent improved cardiovascular outcomes. As there is no published evidence of improved outcomes with gliptins, it is unclear to us why these agents are so widely available for use. We suggest that the use of gliptins be restricted to Phase IV clinical trials until such time as cardiovascular safety and benefits/superiority are clearly established.
Abstract:
Acoustic sensors allow scientists to scale environmental monitoring over large spatiotemporal scales. The faunal vocalisations captured by these sensors can answer ecological questions; however, identifying these vocalisations within recorded audio is difficult: automatic recognition is currently intractable and manual recognition is slow and error prone. In this paper, a semi-automated approach to call recognition is presented. An automated decision support tool is tested that assists users in the manual annotation process. The respective strengths of human and computer analysis are used to complement one another. The tool recommends the species of an unknown vocalisation and thereby minimises the need for the memorisation of a large corpus of vocalisations. In the case of a folksonomic tagging system, recommending species tags also minimises the proliferation of redundant tag categories. We describe two algorithms: (1) a “naïve” decision support tool (16%–64% sensitivity) with an efficiency of O(n), which becomes unscalable as more data are added, and (2) a scalable alternative with 48% sensitivity and an efficiency of O(log n). The improved algorithm was also tested in an HTML-based annotation prototype. The result of this work is a decision support tool for annotating faunal acoustic events that may be utilised by other bioacoustics projects.
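The abstract above contrasts a naïve O(n) matching strategy with an O(log n) indexed lookup. The following minimal Python sketch illustrates only that complexity gap, under the assumption that each call is reduced to a single numeric feature key; it is not the paper's recommender, and the function names, reference values and species labels are hypothetical.

import bisect

def naive_recommend(query_key, references):
    """O(n): compare the query against every labelled reference call."""
    best_label, best_dist = None, float("inf")
    for key, label in references:  # one pass over all n reference calls
        dist = abs(query_key - key)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

def indexed_recommend(query_key, sorted_keys, labels):
    """O(log n): binary search over a pre-sorted feature index."""
    i = bisect.bisect_left(sorted_keys, query_key)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(sorted_keys)]
    if not candidates:
        return None
    best = min(candidates, key=lambda j: abs(sorted_keys[j] - query_key))
    return labels[best]

# Hypothetical usage: both return the same recommendation, but the second
# scales logarithmically as the reference corpus grows.
# refs = [(0.12, "Litoria fallax"), (0.43, "Corvus orru"), (0.80, "Ninox boobook")]
# keys, labels = zip(*sorted(refs))
# naive_recommend(0.45, refs); indexed_recommend(0.45, list(keys), list(labels))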
Abstract:
The development and maintenance of large and complex ontologies are often time-consuming and error-prone. Thus, automated ontology learning and revision have attracted intensive research interest. In data-centric applications where ontologies are designed or automatically learnt from the data, when new data instances are added that contradict the ontology, it is often desirable to incrementally revise the ontology according to the added data. This problem can be intuitively formulated as the problem of revising a TBox by an ABox. In this paper we introduce a model-theoretic approach to such an ontology revision problem by using a novel alternative semantic characterisation of DL-Lite ontologies. We show some desired properties of our ontology revision. We have also developed an algorithm for reasoning with the ontology revision without computing the revision result. The algorithm is efficient, as its computational complexity is in coNP in the worst case and in PTIME when the size of the new data is bounded.
Abstract:
The control of environmental factors in open-office environments, such as lighting and temperature, is becoming increasingly automated. This development means that office inhabitants are losing the ability to manually adjust environmental conditions according to their needs. In this paper we describe the design, use and evaluation of MiniOrb, a system that employs ambient and tangible interaction mechanisms to allow inhabitants of office environments to maintain awareness of environmental factors, report on their own subjectively perceived office comfort levels and see how these compare to group average preferences. The system is complemented by a mobile application, which enables users to see and set the same sensor values and preferences, but using a screen-based interface. We give an account of the system’s design and outline the results of an in-situ trial and user study. Our results show that devices that combine ambient and tangible interaction approaches are well suited to the task of recording indoor climate preferences and afford a rich set of possible interactions that can complement those enabled by more conventional screen-based interfaces.
Abstract:
What helps us determine whether a word is a noun or a verb, without conscious awareness? We report on cues in the way individual English words are spelled, and, for the first time, identify their neural correlates via functional magnetic resonance imaging (fMRI). We used a lexical decision task with trisyllabic nouns and verbs containing orthographic cues that are either consistent or inconsistent with the spelling patterns of words from that grammatical category. Significant linear increases in response times and error rates were observed as orthography became less consistent, paralleled by significant linear decreases in blood oxygen level dependent (BOLD) signal in the left supramarginal gyrus of the left inferior parietal lobule, a brain region implicated in visual word recognition. A similar pattern was observed in the left superior parietal lobule. These findings align with an emergentist view of grammatical category processing which results from sensitivity to multiple probabilistic cues.
Abstract:
Ankylosing spondylitis (AS) is a chronic inflammatory arthritis that affects the spine and sacroiliac joints. It causes significant disability and is associated with a number of other features including peripheral arthritis, anterior uveitis, psoriasis and inflammatory bowel disease (IBD). Significant progress has been made in the genetics of AS in the last five years, leading to new treatments in trial and major leaps in understanding of the aetiopathogenesis of the disease.
Abstract:
A decision-theoretic framework is proposed for designing sequential dose-finding trials with multiple outcomes. The optimal strategy is solvable theoretically via backward induction. However, for dose-finding studies involving k doses, the computational complexity is the same as that of the bandit problem with k dependent arms, which is computationally prohibitive. We therefore provide two computationally compromised strategies, which are of practical interest as the computational complexity is greatly reduced: one is closely related to the continual reassessment method (CRM), and the other improves on CRM and approximates the optimal strategy more closely. In particular, we present the framework for phase I/II trials with multiple outcomes. Applications to a pediatric HIV trial and a cancer chemotherapy trial are given to illustrate the proposed approach. Simulation results for the two trials show that the computationally compromised strategies can perform well and appear to be ethical for allocating patients. The proposed framework can provide better approximation to the optimal strategy if more extensive computing is available.
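For reference, the continual reassessment method mentioned above can be summarised in a short sketch. The Python below implements a standard one-parameter power-model CRM update on a discrete grid; it is not the paper's decision-theoretic framework or either of its compromised strategies, and the skeleton, prior and example values are assumptions for illustration only.

import math

def crm_next_dose(skeleton, target, doses_given, toxicities):
    """Standard one-parameter CRM: p_i(a) = skeleton[i] ** exp(a), N(0, 1) prior on a.
    Returns the index of the dose whose posterior mean toxicity is closest to target."""
    a_grid = [-3.0 + 6.0 * k / 200 for k in range(201)]  # discrete grid for the model parameter a
    post = []
    for a in a_grid:
        logp = -0.5 * a * a  # normal prior, up to an additive constant
        for d, y in zip(doses_given, toxicities):  # binomial likelihood of observed outcomes
            p = min(max(skeleton[d] ** math.exp(a), 1e-10), 1 - 1e-10)
            logp += y * math.log(p) + (1 - y) * math.log(1 - p)
        post.append(math.exp(logp))
    z = sum(post)
    weights = [w / z for w in post]
    # posterior mean toxicity at each dose; recommend the dose nearest the target rate
    est = [sum(w * s ** math.exp(a) for w, a in zip(weights, a_grid)) for s in skeleton]
    return min(range(len(skeleton)), key=lambda i: abs(est[i] - target))

# Hypothetical usage: skeleton of prior toxicity guesses, 25% target toxicity rate,
# three patients treated so far at dose indices 0, 1, 1 with toxicity outcomes 0, 0, 1.
# crm_next_dose([0.05, 0.12, 0.25, 0.40], 0.25, [0, 1, 1], [0, 0, 1])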
Abstract:
Multi-objective optimization is an active field of research with broad applicability in aeronautics. This report details a variant of the original NSGA-II software aimed at improving the performance of this widely used genetic algorithm in finding the optimal Pareto front of a multi-objective optimization problem, for use in UAV and aircraft design and optimization. The original NSGA-II works on a population of predetermined constant size, and its computational cost to evaluate one generation is O(mn^2), where m is the number of objective functions and n is the population size. The basic idea motivating this work is to reduce the computational cost of the NSGA-II algorithm by making it work on a population of variable size, in order to obtain better convergence towards the Pareto front in less time. In this work a number of test functions will be run with both the original NSGA-II and the VPNSGA-II algorithms; each test will be timed in order to measure the computational cost of each trial, and the results will be compared.
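For context on the O(mn^2) per-generation cost quoted above, the Python sketch below shows the non-dominated sorting step of a standard NSGA-II, whose pairwise dominance comparisons account for that complexity; the variable-population VPNSGA-II modification described in the report is not shown, and the function name and example values are illustrative only.

def non_dominated_sort(objs):
    """Partition a population into Pareto fronts (minimisation).
    The pairwise dominance checks over n individuals and m objectives
    are the source of NSGA-II's O(mn^2) per-generation cost."""
    n = len(objs)

    def dominates(a, b):
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    dominated = [[] for _ in range(n)]  # dominated[i]: individuals dominated by i
    counts = [0] * n                    # counts[i]: number of individuals dominating i
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if dominates(objs[i], objs[j]):
                dominated[i].append(j)
            elif dominates(objs[j], objs[i]):
                counts[i] += 1
    fronts = [[i for i in range(n) if counts[i] == 0]]
    while fronts[-1]:
        nxt = []
        for i in fronts[-1]:
            for j in dominated[i]:
                counts[j] -= 1
                if counts[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
    return fronts[:-1]

# Hypothetical usage with two objectives to minimise:
# non_dominated_sort([(1, 5), (2, 2), (3, 1), (4, 4)]) -> [[0, 1, 2], [3]]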
Abstract:
DNA evidence has made a significant contribution to criminal investigations in Australia and around the world since it was widely adopted in the 1990s (Gans & Urbas 2002). The direct matching of DNA profiles, such as comparing one obtained from a crime scene with one obtained from a suspect or database, remains a widely used technique in criminal investigations. A range of new DNA profiling techniques continues to be developed and applied in criminal investigations around the world (Smith & Urbas 2012). This paper is the third in a series by the Australian Institute of Criminology (AIC) on DNA evidence. The first, published in 1990 when the technology was in its relative infancy, outlined the scientific background for DNA evidence, considered early issues such as scientific reliability and privacy and described its application in early criminal cases (Easteal & Easteal 1990). The second, published in 2002, expanded on the scientific background and discussed a significant number of Australian cases in a 12-year period, illustrating issues that had arisen in investigations, at trial and in the use of DNA in the review of convictions and acquittals (Gans & Urbas 2002). There have been some significant developments in the science and technology behind DNA evidence in the 13 years since 2002 that have important implications for law enforcement and the legal system. These are discussed through a review of relevant legal cases and the latest empirical evidence. This paper is structured in three sections. The first examines the scientific techniques and how they have been applied in police investigations, drawing on a number of recent cases to illustrate them. The second considers empirical research evaluating DNA evidence and databases and the impact DNA has on investigative and court outcomes. The final section discusses significant cases that establish legal precedent relating to DNA evidence in criminal trials where significant issues have arisen or new techniques have been applied that have not yet been widely discussed in the literature. The paper concludes by reflecting on implications for policy and practice.