69 results for Error judicial
Abstract:
Sexual harassment can be conceptualized as a series of interactions between harassers and targets that either inhibit or increase outrage by third parties. The outrage management model predicts the kinds of actions likely to be used by perpetrators to minimize outrage, predicts the consequences of failing to use these tactics (namely, backfire), and recommends countertactics to increase outrage. Using this framework, our archival study examined outrage-management tactics reported as evidence in 23 judicial decisions of sexual harassment cases in Australia. The decisions contained precise, detailed information about the circumstances leading to the claim; the events which transpired in the courtroom, including direct quotations; and the judges' interpretations and findings. We found evidence that harassers minimize outrage by covering up the actions, devaluing the target, reinterpreting the events, using official channels to give an appearance of justice, and intimidating or bribing people involved. Targets can respond using countertactics of exposure, validation, reframing, mobilization of support, and resistance. Although there are limitations to using judicial decisions as a source of information, our study points to the value of studying tactics and the importance to harassers of minimizing outrage from their actions. The findings also highlight that, given the limitations of statutory and organizational protections in reducing the incidence and severity of sexual harassment in the community, individual responses may be effective as part of a multilevel response in reducing the incidence and impact of workplace sexual harassment as a gendered harm.
Abstract:
An algorithm based on the concept of combining Kalman filter and Least Error Square (LES) techniques is proposed in this paper. The algorithm is intended to estimate signal attributes such as amplitude, frequency, and phase angle in online mode. This technique can be used in protection relays, digital AVRs, DGs, DSTATCOMs, FACTS, and other power electronics applications. The Kalman filter is modified to operate on a fictitious input signal and provides precise estimation results that are insensitive to noise and other disturbances. At the same time, the LES system is arranged to operate in critical transient cases to compensate for the delay and inaccuracy introduced by the response of the standard Kalman filter. Practical considerations such as the effect of noise, higher-order harmonics, and computational issues of the algorithm are considered and tested in the paper. Several computer simulations and a laboratory test are presented to highlight the usefulness of the proposed method. Simulation results show that the proposed technique can simultaneously estimate the signal attributes, even when the signal is highly distorted due to the presence of non-linear loads and noise.
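As an illustration of the least-error-squares (LES) side of such an approach, the sketch below fits the amplitude and phase of a sinusoid at an assumed nominal frequency from noisy samples. The frequency, sampling rate, and window length are illustrative choices, and the paper's modified Kalman filter formulation is not reproduced here.

```python
import numpy as np

# Minimal LES sketch: estimate amplitude and phase of a sinusoid at an
# assumed nominal frequency f0 from noisy samples. All parameter values
# below are illustrative, not taken from the paper.
f0 = 50.0          # assumed nominal frequency (Hz)
fs = 1600.0        # assumed sampling rate (Hz)
N = 32             # samples per estimation window

t = np.arange(N) / fs
rng = np.random.default_rng(0)
true_amp, true_phase = 1.0, 0.3
x = true_amp * np.cos(2 * np.pi * f0 * t + true_phase) + 0.05 * rng.standard_normal(N)

# Model x[k] ~ a*cos(w*t[k]) + b*sin(w*t[k]); solve for (a, b) in the
# least-squares sense, then recover amplitude and phase.
H = np.column_stack([np.cos(2 * np.pi * f0 * t), np.sin(2 * np.pi * f0 * t)])
a, b = np.linalg.lstsq(H, x, rcond=None)[0]

amp = np.hypot(a, b)
phase = np.arctan2(-b, a)  # since a = amp*cos(phase), b = -amp*sin(phase)
print(f"estimated amplitude = {amp:.3f}, phase = {phase:.3f} rad")
```

In a combined scheme of the kind described, an estimate like this would back up the Kalman filter during critical transients rather than replace it.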
Abstract:
Internationally, sentencing research has largely neglected the impact of Indigeneity on sentencing outcomes. Using data from Western Australia’s higher courts for the years 2003–05, we investigate the direct and interactive effects of Indigenous status on the judicial decision to imprison. Unlike prior research on race/ethnicity in which minority offenders are often found to be more harshly treated by sentencing courts, we find that Indigenous status has no direct effect on the decision to imprison, after adjusting for other sentencing factors (especially past and current criminality). However, there are sub-group differences: Indigenous males are more likely to receive a prison sentence compared to non-Indigenous females. We draw on the focal concerns perspective of judicial decision making in interpreting our findings.
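As a hedged illustration of estimating direct and interactive effects on a binary imprisonment decision, the sketch below fits a logistic regression with an interaction term to synthetic data. The variables, sample size, and data-generating process are invented for illustration and are not the Western Australian court data or the study's model specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data: imprisonment driven mainly by prior record, with
# an assumed interaction between Indigenous status and sex (illustrative only).
rng = np.random.default_rng(42)
n = 2000
df = pd.DataFrame({
    "indigenous": rng.integers(0, 2, n),
    "male": rng.integers(0, 2, n),
    "prior_record": rng.poisson(1.5, n),   # crude proxy for past criminality
})
logit = -2.0 + 0.8 * df.prior_record + 0.6 * df.indigenous * df.male
df["imprisoned"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# 'indigenous * male' expands to both main effects plus their interaction,
# so the fit reports direct and interactive effects together.
model = smf.logit("imprisoned ~ indigenous * male + prior_record", data=df).fit(disp=False)
print(model.summary())
```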
Abstract:
As order dependencies between process tasks can get complex, it is easy to make mistakes in process model design, especially behavioral ones such as deadlocks. Notions such as soundness formalize behavioral errors, and tools exist that can identify such errors. However, these tools do not provide assistance with correcting the process models. Error correction can be very challenging, as the intentions of the process modeler are not known and there may be many ways in which an error can be corrected. We present a novel technique for automatic error correction in process models based on simulated annealing. Using this technique, a number of process model alternatives are identified that resolve one or more errors in the original model. The technique is implemented and validated on a sample of industrial process models. The tests show that at least one sound solution can be found for each input model and that the response times are short.
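A minimal sketch of the simulated-annealing search underlying such a technique is given below. The cost function and edit operator are placeholders: in the paper the cost would come from a soundness check and the edits would be structural changes to the process model, whereas here they are abstract callables supplied by the caller.

```python
import math
import random

def simulated_annealing(model, count_errors, random_edit,
                        t_start=10.0, t_end=0.01, alpha=0.95, steps_per_t=50):
    """Search for a variant of 'model' with fewer behavioral errors."""
    current, current_cost = model, count_errors(model)
    best, best_cost = current, current_cost
    t = t_start
    while t > t_end and best_cost > 0:
        for _ in range(steps_per_t):
            candidate = random_edit(current)        # propose a modified model
            cost = count_errors(candidate)
            # Accept improvements always; accept worse candidates with a
            # probability that shrinks as the temperature drops.
            if cost < current_cost or random.random() < math.exp((current_cost - cost) / t):
                current, current_cost = candidate, cost
                if cost < best_cost:
                    best, best_cost = candidate, cost
        t *= alpha                                  # cool the temperature
    return best, best_cost
```

Keeping several low-cost candidates instead of a single best one would yield the set of alternative corrected models the abstract mentions.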
Abstract:
Gay community media functions as a system with three nodes, in which the flows of information and capital theoretically benefit all parties: the gay community gains a sense of cohesion and citizenship through media; the gay media outlets profit from advertisers’ capital; and advertisers recoup their investments in lucrative ‘pink dollar’ revenue. But if a necessary corollary of all communication systems is error or noise, where—and what—are the errors in this system? In this paper we argue that the ‘error’ in the gay media system is Queerness, and that the gay media system ejects (in a process of Kristevan abjection) these Queer identities in order to function successfully. We examine the ways in which Queer identities are excluded from representation in such media through a discourse and content analysis of The Sydney Star Observer (Australia’s largest gay and lesbian paper). First, we analyse the way Queer bodies are excluded from the discourses that construct and reinforce both the ideal gay male body and the notions of homosexual essence required for that body to be meaningful. We then argue that abject Queerness returns in the SSO’s discourses of public health through the conspicuous absence of the AIDS-inflicted body (which we read as the epitome of the abject Queer), since this absence paradoxically conjures up a trace of that which the system tries to expel. We conclude by arguing that because the ‘Queer error’ is integral to the SSO, gay community media should practise a politics of Queer inclusion rather than exclusion.
Abstract:
Despite the benefits of technology, safety planners still face difficulties in explaining errors related to the use of different technologies and in evaluating how those errors affect the performance of safety decision making. This paper presents a preliminary error impact analysis testbed to model object identification and tracking errors caused by image-based devices and algorithms and to analyze the impact of those errors on spatial safety assessment of earthmoving and surface mining activities. More specifically, this research designed a testbed to model workspaces for earthmoving operations, to simulate safety-related violations, and to apply different object identification and tracking errors to the data collected and processed for spatial safety assessment. Three different cases were analyzed based on actual earthmoving operations conducted at a limestone quarry. Using the testbed, the impacts of the errors were investigated for safety planning purposes.
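A minimal sketch of this kind of error impact analysis is shown below, using assumed values for the safety radius, position noise, and detection miss rate (none taken from the paper): tracking errors are injected into simulated positions, and their effect on a simple proximity-based violation check is counted.

```python
import numpy as np

rng = np.random.default_rng(1)
n_frames = 1000
safety_radius = 5.0  # assumed minimum worker-machine separation (m)

# Simulated ground-truth positions of one worker and one machine per frame.
worker = rng.uniform(0, 50, size=(n_frames, 2))
machine = rng.uniform(0, 50, size=(n_frames, 2))
true_violation = np.linalg.norm(worker - machine, axis=1) < safety_radius

# Simple tracking-error model: Gaussian position noise plus a miss rate.
pos_noise, miss_rate = 1.0, 0.05
noisy_worker = worker + rng.normal(0, pos_noise, worker.shape)
detected = rng.random(n_frames) > miss_rate
est_violation = detected & (np.linalg.norm(noisy_worker - machine, axis=1) < safety_radius)

missed = np.sum(true_violation & ~est_violation)
false_alarms = np.sum(~true_violation & est_violation)
print(f"true violations: {true_violation.sum()}, missed: {missed}, false alarms: {false_alarms}")
```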
Abstract:
A good faith reading of core international protection obligations requires that states employ appropriate legislative, administrative and judicial mechanisms to ensure the enjoyment of a fair and effective asylum process. Restrictive asylum policies instead seek to ‘denationalize’ the asylum process by eroding access to national statutory, judicial and executive safeguards that ensure a full and fair hearing of an asylum claim. From a broader perspective, the argument in this thesis recognizes that international human rights depend on domestic institutions for their effective implementation, and that a rights-based international legal order requires that power is limited, whether that power is expressed as an instance of the sovereign right of states in international law or as the authority of governments under domestic constitutions.
Abstract:
Nationally, there is much legislation regulating land sale transactions, particularly in relation to seller disclosure of information. The statutes require strict compliance by a seller, failing which, in general, a buyer can terminate the contract. In a number of instances, when buyers have sought to exercise these rights, sellers have alleged that buyers have either expressly or by conduct waived their rights to rely upon these statutes. This article examines the nature of these rights in this context, whether they are capable of waiver and, if so, what words or conduct might be sufficient to amount to waiver. The analysis finds that the law is in a very unsatisfactory state and that those rules that can be identified as relevant are applied unevenly, and concludes that sellers have, in the main, been unsuccessful in defeating buyers' statutory rights as a result of an alleged waiver by those buyers.
Abstract:
We study model selection strategies based on penalized empirical loss minimization. We point out a tight relationship between error estimation and data-based complexity penalization: any good error estimate may be converted into a data-based penalty function and the performance of the estimate is governed by the quality of the error estimate. We consider several penalty functions, involving error estimates on independent test data, empirical VC dimension, empirical VC entropy, and margin-based quantities. We also consider the maximal difference between the error on the first half of the training data and the second half, and the expected maximal discrepancy, a closely related capacity estimate that can be calculated by Monte Carlo integration. Maximal discrepancy penalty functions are appealing for pattern classification problems, since their computation is equivalent to empirical risk minimization over the training data with some labels flipped.
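The label-flipping computation mentioned in the last sentence can be sketched as follows. Maximizing the discrepancy between the two half-sample errors over a class is equivalent to minimizing empirical risk on a copy of the data whose first-half labels are flipped; in this sketch, depth-1 decision trees stand in for the function class, and the fitted learner only approximates exact empirical risk minimization.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def maximal_discrepancy(X, y, make_learner):
    """Data-based penalty: max over the class of (error on half 1 - error on half 2)."""
    n = len(y) // 2
    y_flipped = y.copy()
    y_flipped[:n] = 1 - y_flipped[:n]        # flip labels on the first half
    f = make_learner().fit(X, y_flipped)     # approximate ERM on the flipped sample
    pred = f.predict(X)
    err1 = np.mean(pred[:n] != y[:n])        # errors measured against original labels
    err2 = np.mean(pred[n:2 * n] != y[n:2 * n])
    return err1 - err2

# Toy data; the class of depth-1 trees is an arbitrary illustrative choice.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] + 0.5 * rng.normal(size=200) > 0).astype(int)
penalty = maximal_discrepancy(X, y, lambda: DecisionTreeClassifier(max_depth=1))
print(f"maximal discrepancy penalty = {penalty:.3f}")
```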
Abstract:
We consider complexity penalization methods for model selection. These methods aim to choose a model to optimally trade off estimation and approximation errors by minimizing the sum of an empirical risk term and a complexity penalty. It is well known that if we use a bound on the maximal deviation between empirical and true risks as a complexity penalty, then the risk of our choice is no more than the approximation error plus twice the complexity penalty. There are many cases, however, where complexity penalties like this give loose upper bounds on the estimation error. In particular, if we choose a function from a suitably simple convex function class with a strictly convex loss function, then the estimation error (the difference between the risk of the empirical risk minimizer and the minimal risk in the class) approaches zero at a faster rate than the maximal deviation between empirical and true risks. In this paper, we address the question of whether it is possible to design a complexity penalized model selection method for these situations. We show that, provided the sequence of models is ordered by inclusion, in these cases we can use tight upper bounds on estimation error as a complexity penalty. Surprisingly, this is the case even in situations when the difference between the empirical risk and true risk (and indeed the error of any estimate of the approximation error) decreases much more slowly than the complexity penalty. We give an oracle inequality showing that the resulting model selection method chooses a function with risk no more than the approximation error plus a constant times the complexity penalty.
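In assumed notation (not taken from the paper), with a sequence of models F_k ordered by inclusion, true risk R, minimal risk R*, and complexity penalty pen_n(F_k), the kind of oracle inequality described has the schematic form:

```latex
% Schematic only: notation assumed for illustration, not reproduced from the paper.
\[
  R(\hat{f}) - R^{*}
  \;\le\;
  \min_{k}\Bigl\{
    \underbrace{\inf_{f \in F_k} R(f) - R^{*}}_{\text{approximation error of } F_k}
    \;+\; C\,\mathrm{pen}_n(F_k)
  \Bigr\},
\]
```

where \hat{f} is the function chosen by the penalized selection method and C is a constant.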