187 results for principled


Relevance: 10.00%

Abstract:

Alternative dispute resolution, or 'ADR', is defined by the National Alternative Dispute Resolution Advisory Council as: "… an umbrella term for processes, other than judicial determination, in which an impartial person assists those in a dispute to resolve the issues between them". ADR is commonly used as an abbreviation for alternative dispute resolution, but can also be used to mean assisted or appropriate dispute resolution. Some also use the term ADR to include approaches that enable parties to prevent or manage their own disputes without outside assistance.

A broad range of ADR processes are used in legal practice contexts, including, for example, arbitration, conciliation, mediation, negotiation, conferencing, case appraisal and neutral evaluation. Hybrid processes are also used, such as med-arb, in which the practitioner starts by using mediation and then shifts to using arbitration.

ADR processes generally fall into one of three categories: facilitative, advisory or determinative. In a facilitative process, the ADR practitioner assists the parties to reach a mutually agreeable outcome by helping them to identify the issues in dispute and to develop a range of options for resolving it. Mediation and facilitated negotiation are examples of facilitative processes. Advisory ADR processes involve the practitioner appraising the dispute, providing advice as to the facts of the dispute and the law, and then, in some cases, articulating possible or appropriate outcomes and how they might be achieved. Case appraisal and neutral evaluation are examples of advisory processes. In a determinative ADR process, the practitioner evaluates the dispute (which may include the hearing of formal evidence from the parties) and makes a determination. Arbitration is an example of a determinative process.

The use of ADR processes has increased significantly in recent years. Indeed, in a range of contemporary legal contexts the use of an ADR process is now required before a party is able to file a matter in court. For example, Juliet Behrens discusses in Chapter 11 of this book how the Family Law Act 1975 (Cth) now effectively mandates attendance at pre-filing family dispute resolution in parenting disputes. At the state level in Queensland, for example, attendance at a conciliation conference can be required in anti-discrimination matters and is encouraged in residential tenancy matters, while in personal injuries matters the parties must attend a preliminary compulsory conference.

Certain ADR processes are used more commonly in the resolution of particular disputes. In family law contexts, mediation and conciliation are generally used because they provide the parties with flexibility in terms of process and outcome while still ensuring that negotiations occur in a positive, structured and facilitated framework. In commercial contexts, arbitration and neutral evaluation are often used because they can provide the parties with a determination of the dispute that is factually and legally principled, but which is also private and more timely than going to court.

Women, as legal personalities and citizens of society, can find themselves involved in any sort of legal dispute, and therefore all forms of ADR are relevant to women. Perhaps most commonly, however, women come into contact with facilitative ADR processes; for example, through involvement in family law disputes women will encounter family dispute resolution processes such as mediation. In this chapter, therefore, the focus is on facilitative ADR processes and, particularly, issues for women in terms of their participation in such processes. The aim of this chapter is to provide legal practitioners with an understanding of issues for women in ADR, to inform their approach to representing women clients in such processes and to guide them in preparing women clients for participation in ADR. The chapter begins with a consideration of the ways in which facilitative ADR processes are positive for women participants. Next, some of the disadvantages for women in ADR are explored. Finally, the chapter offers ways in which legal practitioners can effectively prepare women clients for participation in ADR.

Before embarking on a discussion of issues for women in ADR, it is important to acknowledge that women's experiences in these dispute resolution environments, whilst often sharing commonalities, are diverse and informed by a range of factors specific to each individual woman; for example, her race or socio-economic background. This discussion, therefore, addresses some common issues for women in ADR that are fundamentally gender based. It must be noted, however, that providing advice to women clients about participating in ADR processes requires legal practitioners to have a very good understanding of the client as an individual, and of her particular needs and interests. Some sources of diversity are discussed in Chapters 13, 14 and 15.

Relevance: 10.00%

Abstract:

Wound healing and tumour growth involve collective cell spreading, which is driven by individual motility and proliferation events within a population of cells. Mathematical models are often used to interpret experimental data and to estimate the parameters so that predictions can be made. Existing methods for parameter estimation typically assume that these parameters are constants and often ignore any uncertainty in the estimated values. We use approximate Bayesian computation (ABC) to estimate the cell diffusivity, D, and the cell proliferation rate, λ, from a discrete model of collective cell spreading, and we quantify the uncertainty associated with these estimates using Bayesian inference. We use a detailed experimental data set describing the collective cell spreading of 3T3 fibroblast cells. The ABC analysis is conducted for different combinations of initial cell densities and experimental times in two separate scenarios: (i) where collective cell spreading is driven by cell motility alone, and (ii) where collective cell spreading is driven by combined cell motility and cell proliferation. We find that D can be estimated precisely, with a small coefficient of variation (CV) of 2–6%. Our results indicate that D appears to depend on the experimental time, which is a feature that has been previously overlooked. Assuming that the values of D are the same in both experimental scenarios, we use the information about D from the first experimental scenario to obtain reasonably precise estimates of λ, with a CV between 4 and 12%. Our estimates of D and λ are consistent with previously reported values; however, our method is based on a straightforward measurement of the position of the leading edge whereas previous approaches have involved expensive cell counting techniques. Additional insights gained using a fully Bayesian approach justify the computational cost, especially since it allows us to accommodate information from different experiments in a principled way.
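
As an illustration of the inference machinery described in this abstract, the following is a minimal sketch of ABC rejection sampling for (D, λ). The simulator, the priors, the parameter ranges and the summary statistic are all assumptions made for illustration; the paper's discrete spreading model is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_leading_edge(D, lam, t_obs):
    """Hypothetical stand-in for the discrete spreading model: returns the
    leading-edge position at each observation time. A real implementation
    would simulate individual motility and proliferation events."""
    return np.sqrt(4.0 * D * t_obs) * (1.0 + lam * t_obs)  # illustrative only

def abc_rejection(observed, t_obs, n_draws=100_000, keep_frac=0.001):
    """Basic ABC rejection: draw (D, lam) from uniform priors, simulate,
    and keep the draws whose summary statistic lies closest to the data."""
    D = rng.uniform(100.0, 3000.0, n_draws)  # diffusivity prior (assumed range)
    lam = rng.uniform(0.0, 0.1, n_draws)     # proliferation-rate prior (assumed)
    dist = np.array([np.sum((simulate_leading_edge(d, l, t_obs) - observed) ** 2)
                     for d, l in zip(D, lam)])
    keep = dist.argsort()[: int(keep_frac * n_draws)]
    return D[keep], lam[keep]                # approximate posterior samples

# Usage: given leading-edge positions and times from an experiment,
#   D_post, lam_post = abc_rejection(observed_positions, observation_times)
#   print(D_post.mean(), D_post.std() / D_post.mean())  # mean and CV
```

Rejection sampling is the simplest ABC variant: keeping only the closest draws approximates the posterior without ever evaluating a likelihood, which is what makes the approach attractive for discrete simulation models like the one described.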

Relevance: 10.00%

Abstract:

In the internet age, copyright owners are increasingly looking to online intermediaries to take steps to prevent copyright infringement. Sometimes these intermediaries are closely tied to the acts of infringement; sometimes – as in the case of ISPs – they are not. In 2012, the Australian High Court decided the Roadshow Films v iiNet case, in which it held that an Australian ISP was not liable under copyright’s authorization doctrine, which asks whether the intermediary has sanctioned, approved or countenanced the infringement. The Australian Copyright Act 1968 directs a court to consider, in these situations, whether the intermediary had the power to prevent the infringement and whether it took any reasonable steps to prevent or avoid the infringement. It is generally not difficult for a court to find the power to prevent infringement – power to prevent can include an unrefined technical ability to disconnect users from the copyright source, such as an ISP terminating users’ internet accounts. In the iiNet case, the High Court eschewed this broad approach in favor of focusing on a notion of control that was influenced by principles of tort law. In tort, when a plaintiff asserts that a defendant should be liable for failing to act to prevent harm caused to the plaintiff by a third party, there is a heavy burden on the plaintiff to show that the defendant had a duty to act. The duty must be clear and specific, and will often hinge on the degree of control that the defendant was able to exercise over the third party. Control in these circumstances relates directly to control over the third party’s actions in inflicting the harm. Thus, in iiNet’s case, the control would need to be directed to the third party’s infringing use of BitTorrent; control over a person’s ability to access the internet is too imprecise. Further, when considering omissions to act, tort law differentiates between the ability to control and the ability to hinder. The ability to control may establish a duty to act, and the court will then look to small measures taken to prevent the harm to determine whether these satisfy the duty. But the ability to hinder will not suffice to establish liability in the absence of control. This article argues that an inquiry grounded in control as defined in tort law would provide a more principled framework for assessing the liability of passive intermediaries in copyright. In particular, it would set a higher, more stable benchmark for determining the copyright liability of passive intermediaries, based on the degree of actual, direct control that the intermediary can exercise over the infringing actions of its users. This approach would provide greater clarity and consistency than has existed to date in this area of copyright law in Australia.

Relevance: 10.00%

Abstract:

This thesis is a study of women's participation in Bhutan's new democracy. It exposes the patriarchy embedded in Bhutanese society, which is reinforced through cultural practices and the legal framework, and reveals how the public/private dichotomy, the low educational attainment of girls and the gendered division of labour derail women's public life. It discloses a masculine-driven party politics and the challenges of being a woman in a world of men. Nonetheless, the first trailblazing women parliamentarians demonstrated a principled, feminine political leadership in a masculine environment. Semi-structured interviews, document review and participant observation were used to collect data.

Relevance: 10.00%

Abstract:

In the internet age, copyright owners are increasingly looking to online intermediaries to take steps to prevent copyright infringement. Sometimes these intermediaries are closely tied to the acts of infringement; sometimes – as in the case of ISPs – they are not. In 2012, the Australian High Court decided the Roadshow Films v iiNet case, in which it held that an Australian ISP was not liable under copyright’s authorization doctrine, which asks whether the intermediary has sanctioned, approved or countenanced the infringement. The Australian Copyright Act 1968 directs a court to consider, in these situations, whether the intermediary had the power to prevent the infringement and whether it took any reasonable steps to prevent or avoid the infringement. It is generally not difficult for a court to find the power to prevent infringement – power to prevent can include an unrefined technical ability to disconnect users from the copyright source, such as an ISP terminating users’ internet accounts. In the iiNet case, the High Court eschewed this broad approach in favor of focusing on a notion of control that was influenced by principles of tort law. In tort, when a plaintiff asserts that a defendant should be liable for failing to act to prevent harm caused to the plaintiff by a third party, there is a heavy burden on the plaintiff to show that the defendant had a duty to act. The duty must be clear and specific, and will often hinge on the degree of control that the defendant was able to exercise over the third party. Control in these circumstances relates directly to control over the third party’s actions in inflicting the harm. Thus, in iiNet’s case, the control would need to be directed to the third party’s infringing use of BitTorrent; control over a person’s ability to access the internet is too imprecise. Further, when considering omissions to act, tort law differentiates between the ability to control and the ability to hinder. The ability to control may establish a duty to act, and the court will then look to small measures taken to prevent the harm to determine whether these satisfy the duty. But the ability to hinder will not suffice to establish liability in the absence of control. This chapter argues that an inquiry grounded in control as defined in tort law would provide a more principled framework for assessing the liability of passive intermediaries in copyright. In particular, it would set a higher, more stable benchmark for determining the copyright liability of passive intermediaries, based on the degree of actual, direct control that the intermediary can exercise over the infringing actions of its users. This approach would provide greater clarity and consistency than has existed to date in this area of copyright law in Australia.

Relevance: 10.00%

Abstract:

Multielectrode neurophysiological recording and high-resolution neuroimaging generate multivariate data that are the basis for understanding the patterns of neural interactions. How to extract directions of information flow in brain networks from these data remains a key challenge. Research over the last few years has identified Granger causality as a statistically principled technique to furnish this capability. The estimation of Granger causality currently requires autoregressive modeling of neural data. Here, we propose a nonparametric approach based on widely used Fourier and wavelet transforms to estimate both pairwise and conditional measures of Granger causality, eliminating the need of explicit autoregressive data modeling. We demonstrate the effectiveness of this approach by applying it to synthetic data generated by network models with known connectivity and to local field potentials recorded from monkeys performing a sensorimotor task.
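
This abstract's contribution is precisely that it avoids autoregressive modeling. For contrast, here is a minimal sketch of the standard parametric (AR-based) time-domain Granger causality that such nonparametric spectral estimators replace; the AR order and the plain least-squares fitting are illustrative choices, and the signals are assumed zero-mean.

```python
import numpy as np

def residual_variance(y, X):
    """Least-squares AR fit; returns the variance of the residuals."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.var(y - X @ beta)

def granger_x_to_y(x, y, p=5):
    """Time-domain Granger causality from x to y with AR order p:
    log ratio of residual variances, restricted model (y's own past)
    vs. full model (y's past plus x's past)."""
    n = len(y)
    y_lags = np.column_stack([y[p - k - 1 : n - k - 1] for k in range(p)])
    x_lags = np.column_stack([x[p - k - 1 : n - k - 1] for k in range(p)])
    target = y[p:]
    var_restricted = residual_variance(target, y_lags)
    var_full = residual_variance(target, np.hstack([y_lags, x_lags]))
    return np.log(var_restricted / var_full)  # > 0: x's past helps predict y
```

A value near zero says the past of x adds nothing to predicting y beyond y's own past. The spectral (Fourier and wavelet) formulations decompose this same quantity by frequency, which is what the nonparametric approach estimates directly from the data.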

Relevance: 10.00%

Abstract:

We study the problem of uncertainty in the entries of the kernel matrix arising in the SVM formulation. Using Chance Constraint Programming and a novel large deviation inequality, we derive a formulation which is robust to such noise. The resulting formulation applies when the noise is Gaussian or has finite support. The formulation in general is non-convex, but in several cases of interest it reduces to a convex program. The problem of uncertainty in the kernel matrix is motivated by the real-world problem of classifying proteins when the structures are provided with some uncertainty. The formulation derived here naturally incorporates such uncertainty in a principled manner, leading to significant improvements over the state of the art.
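
The abstract does not reproduce the large deviation inequality itself, but as a reference point, in the Gaussian case a chance constraint of this general shape admits the standard deterministic second-order cone reformulation:

```latex
% Chance constraint on an affine function of Gaussian data,
% \xi \sim \mathcal{N}(\mu, \Sigma):
\Pr\bigl(\xi^{\top} w \le b\bigr) \ge 1 - \eta
\quad \Longleftrightarrow \quad
\mu^{\top} w + \Phi^{-1}(1 - \eta)\,\sqrt{w^{\top} \Sigma\, w} \le b
% where \Phi^{-1} is the standard normal quantile function; for
% \eta \le 1/2 the right-hand constraint is second-order cone
% representable, hence convex.
```

Reformulations of this kind are what turn a probabilistic robustness requirement into something a convex solver can handle, which is the spirit of the reduction to a convex program described in the abstract.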

Relevance: 10.00%

Abstract:

When hosting XML information on relational backends, a mapping has to be established between the schemas of the information source and the target storage repositories. A rich body of recent literature exists for mapping isolated components of XML Schema to their relational counterparts, especially with regard to table configurations. In this paper, we present the Elixir system for designing industrial-strength mappings for real-world applications. Specifically, it produces an information-preserving holistic mapping that transforms the complete XML world-view (XML schema with constraints, XML documents, and XQuery queries including triggers and views) into a full-scale relational mapping (table definitions, integrity constraints, indices, triggers and views) that is tuned to the application workload. A key design feature of Elixir is that it performs all its mapping-related optimizations in the XML source space, rather than in the relational target space. Further, unlike the XML mapping tools of commercial database systems, which rely heavily on user inputs, Elixir takes a principled cost-based approach to automatically find an efficient relational mapping. A prototype of Elixir is operational, and we quantitatively demonstrate its functionality and efficacy on a variety of real-life XML schemas.
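
Elixir's actual cost model and search procedure are not described in the abstract. The toy sketch below only illustrates what a cost-based choice of relational mapping tuned to a workload means: enumerate candidate table configurations and score each against query frequencies. The element names and the cost formula are entirely hypothetical.

```python
from itertools import product

# Hypothetical optional elements of an XML schema: each can be inlined
# into its parent's table or stored in a separate (joined) table.
ELEMENTS = ["address", "phone", "order"]

def candidate_cost(config, workload):
    """Toy cost model (an assumption, not Elixir's): a separate table adds
    a join for every query touching that element; inlining widens rows."""
    cost = 0.0
    for query_elems, freq in workload:
        joins = sum(1 for e in query_elems if config[e] == "separate")
        row_width = 1 + sum(1 for e in ELEMENTS if config[e] == "inline")
        cost += freq * (1.0 * joins + 0.1 * row_width)
    return cost

def best_mapping(workload):
    """Exhaustively search inline/separate choices for the cheapest mapping."""
    best = None
    for choices in product(["inline", "separate"], repeat=len(ELEMENTS)):
        config = dict(zip(ELEMENTS, choices))
        c = candidate_cost(config, workload)
        if best is None or c < best[0]:
            best = (c, config)
    return best

# Workload entries: (elements touched by the query, relative frequency).
print(best_mapping([(["order"], 0.7), (["address", "phone"], 0.3)]))
```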

Relevance: 10.00%

Abstract:

In this paper, we present a new algorithm for learning oblique decision trees. Most of the current decision tree algorithms rely on impurity measures to assess the goodness of hyperplanes at each node while learning a decision tree in top-down fashion. These impurity measures do not properly capture the geometric structures in the data. Motivated by this, our algorithm uses a strategy for assessing the hyperplanes in such a way that the geometric structure in the data is taken into account. At each node of the decision tree, we find the clustering hyperplanes for both the classes and use their angle bisectors as the split rule at that node. We show through empirical studies that this idea leads to small decision trees and better performance. We also present some analysis to show that the angle bisectors of clustering hyperplanes that we use as the split rules at each node are solutions of an interesting optimization problem and hence argue that this is a principled method of learning a decision tree.
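
A minimal sketch of the split rule this abstract describes: fit one clustering hyperplane per class and split on an angle bisector of the two. The hyperplane fit below (least-variance direction through the class mean) is one simple stand-in; the paper defines its own clustering hyperplanes and selects between the two bisectors by an impurity-style criterion.

```python
import numpy as np

def clustering_hyperplane(X):
    """Fit a hyperplane w.x + b = 0 lying close to the points of one class:
    w is the direction of least variance (the last right-singular vector)
    and the plane passes through the class mean. Assumes more points than
    dimensions. (A simple stand-in for the paper's clustering hyperplanes.)"""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    w = Vt[-1]                       # unit normal, least-variance direction
    return w, -w @ mu

def bisector_splits(X_pos, X_neg):
    """Angle bisectors of the two class hyperplanes: after normalizing each
    (w, b), the bisectors are their sum and their difference."""
    w1, b1 = clustering_hyperplane(X_pos)
    w2, b2 = clustering_hyperplane(X_neg)
    bis_sum = (w1 + w2, b1 + b2)     # normals are already unit norm
    bis_diff = (w1 - w2, b1 - b2)
    return bis_sum, bis_diff         # choose the one with lower impurity
```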

Relevance: 10.00%

Abstract:

We address the task of mapping a given textual domain model (e.g., an industry-standard reference model) for a given domain (e.g., ERP) to the source code of an independently developed application in the same domain. This has applications in improving the understandability of an existing application, migrating it to a more flexible architecture, or integrating it with other related applications. We use the vector-space model to abstractly represent domain model elements as well as source-code artifacts. The key novelty in our approach is to leverage the relationships between source-code artifacts in a principled way to improve the mapping process. We describe experiments wherein we apply our approach to the task of matching two real, open-source applications to corresponding industry-standard domain models. We demonstrate the overall usefulness of our approach, as well as the role of our propagation techniques in improving the precision and recall of the mapping task.
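
A small sketch of the two ingredients named here: vector-space (term-vector) similarity between domain-model elements and code artifacts, plus propagation of scores along relationships between artifacts. The blending scheme and the `alpha` weight are assumptions; the paper's propagation techniques differ in detail.

```python
import numpy as np
from collections import Counter

def tf_vector(tokens, vocab):
    """Term-frequency vector over a shared vocabulary."""
    counts = Counter(tokens)
    return np.array([counts[t] for t in vocab], dtype=float)

def cosine(u, v):
    n = np.linalg.norm(u) * np.linalg.norm(v)
    return (u @ v) / n if n else 0.0

def match_with_propagation(model_elems, code_artifacts, call_edges, alpha=0.3):
    """model_elems / code_artifacts: name -> token list; call_edges: list of
    (caller, callee) pairs between artifacts. Base score is cosine similarity;
    each artifact's score is then blended with its neighbors' scores."""
    vocab = sorted({t for toks in list(model_elems.values())
                    + list(code_artifacts.values()) for t in toks})
    mvec = {m: tf_vector(toks, vocab) for m, toks in model_elems.items()}
    cvec = {c: tf_vector(toks, vocab) for c, toks in code_artifacts.items()}
    base = {(m, c): cosine(mvec[m], cvec[c]) for m in mvec for c in cvec}
    neighbors = {c: [callee for caller, callee in call_edges if caller == c]
                 for c in cvec}
    return {(m, c): (1 - alpha) * score
                    + alpha * np.mean([base[(m, d)] for d in neighbors[c]]
                                      or [0.0])
            for (m, c), score in base.items()}
```

The intuition behind propagating along relationships: evidence about an artifact's neighbors in the call graph bears on the artifact itself, even when its own identifiers are uninformative.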

Relevance: 10.00%

Abstract:

This work addresses the problem of estimating the optimal value function in a Markov Decision Process from observed state-action pairs. We adopt a Bayesian approach to inference, which allows both the model to be estimated and predictions about actions to be made in a unified framework, providing a principled approach to mimicry of a controller on the basis of observed data. A new Markov chain Monte Carlo (MCMC) sampler is devised for simulation from the posterior distribution over the optimal value function. The sampler includes a parameter-expansion step, which is shown to be essential for good convergence properties of the MCMC sampler. As an illustration, the method is applied to learning a human controller.
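
One plausible minimal reading of this setup is sketched below: place a prior over the value function, score candidates against the observed state-action pairs with a soft-optimality (Boltzmann) likelihood, and sample the posterior with random-walk Metropolis. The likelihood, prior and proposal are all assumptions; the paper's sampler additionally relies on a parameter-expansion step that is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_likelihood(Q, data, beta=5.0):
    """Soft-optimality assumption: observed actions follow a Boltzmann
    (softmax) policy under the candidate action-value function Q."""
    ll = 0.0
    for s, a in data:                    # data: list of (state, action) pairs
        logits = beta * Q[s]
        ll += logits[a] - np.logaddexp.reduce(logits)
    return ll

def mh_sampler(data, n_states, n_actions, n_iter=5000, step=0.1):
    """Random-walk Metropolis over Q(s, a) with a standard normal prior."""
    Q = np.zeros((n_states, n_actions))
    lp = log_likelihood(Q, data) - 0.5 * np.sum(Q ** 2)
    samples = []
    for _ in range(n_iter):
        Q_new = Q + step * rng.standard_normal(Q.shape)
        lp_new = log_likelihood(Q_new, data) - 0.5 * np.sum(Q_new ** 2)
        if np.log(rng.uniform()) < lp_new - lp:  # Metropolis accept/reject
            Q, lp = Q_new, lp_new
        samples.append(Q.copy())
    return samples  # posterior draws; average the tail for a point estimate
```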

Relevance: 10.00%

Abstract:

This work attempts to present a response from ontological Personalism to principlist Bioethics, grounding itself in an ethics of the virtues that appeals to the classical notion of moral virtue as a good operative habit that makes the work good and the doer good. The virtues of prudence, justice, fortitude and temperance are thus analysed, anthropologically and ethically, showing the application of each of them to the bioethical field.

Relevance: 10.00%

Abstract:

The brain is perhaps the most complex system to have ever been subjected to rigorous scientific investigation. The scale is staggering: over 10^11 neurons, each making an average of 10^3 synapses, with computation occurring on scales ranging from a single dendritic spine, to an entire cortical area. Slowly, we are beginning to acquire experimental tools that can gather the massive amounts of data needed to characterize this system. However, to understand and interpret these data will also require substantial strides in inferential and statistical techniques. This dissertation attempts to meet this need, extending and applying the modern tools of latent variable modeling to problems in neural data analysis.

It is divided into two parts. The first begins with an exposition of the general techniques of latent variable modeling. A new, extremely general optimization algorithm, called Relaxation Expectation Maximization (REM), is proposed that may be used to learn the optimal parameter values of arbitrary latent variable models. This algorithm appears to alleviate the common problem of convergence to local, sub-optimal likelihood maxima. REM leads to a natural framework for model size selection; in combination with standard model selection techniques, the quality of fits may be further improved, while the appropriate model size is automatically and efficiently determined. Next, a new latent variable model, the mixture of sparse hidden Markov models, is introduced, and approximate inference and learning algorithms are derived for it. This model is applied in the second part of the thesis.

The second part brings the technology of part I to bear on two important problems in experimental neuroscience. The first is known as spike sorting; this is the problem of separating the spikes from different neurons embedded within an extracellular recording. The dissertation offers the first thorough statistical analysis of this problem, which then yields the first powerful probabilistic solution. The second problem addressed is that of characterizing the distribution of spike trains recorded from the same neuron under identical experimental conditions. A latent variable model is proposed. Inference and learning in this model leads to new principled algorithms for smoothing and clustering of spike data.
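
For concreteness, the block below gives a baseline EM update for a spherical Gaussian mixture, the kind of latent variable model that arises in spike sorting when waveform features are clustered by putative neuron. This is only the standard EM step structure; the dissertation's REM algorithm and its mixture of sparse hidden Markov models are considerably richer.

```python
import numpy as np

def em_gmm(X, k, n_iter=200, seed=0):
    """EM for a k-component spherical Gaussian mixture on feature matrix X
    (one row per spike). Returns mixture weights, means and variances."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    mu = X[rng.choice(n, size=k, replace=False)]   # initialize means on data
    var = np.full(k, X.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] = P(component j | spike i)
        sq = ((X[:, None, :] - mu[None]) ** 2).sum(-1)       # (n, k) distances
        log_r = np.log(pi) - 0.5 * d * np.log(var) - sq / (2.0 * var)
        log_r -= log_r.max(axis=1, keepdims=True)            # for stability
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, then variances about new means
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r.T @ X) / nk[:, None]
        sq = ((X[:, None, :] - mu[None]) ** 2).sum(-1)
        var = (r * sq).sum(axis=0) / (d * nk)
    return pi, mu, var
```

Each E-step computes exactly the posterior over the latent cluster assignment; convergence to local likelihood maxima is the failure mode that the REM relaxation described above is designed to alleviate.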