959 results for LIKELIHOOD PRINCIPLE


Relevance:

20.00%

Publisher:

Abstract:

This discussion paper is intended to provide background material for the workshop organised by Queensland University of Technology (QUT) on 17 October 2014. The overall purpose of the workshop is to better understand the relationship between the precautionary principle and endangered species management in Australia. In particular, we are looking for real-life examples (or hypotheticals) of where the principle is (or is not) being applied in relation to Australia’s endangered species. A wide variety of participants have been invited to the workshop, including scientists, representatives of NGOs, lawyers and academics. Whilst some very general information is outlined below, we encourage all participants to bring their own thoughts on how the precautionary principle should operate and to reflect on examples of where you have seen it work (or not work) in Australia. The sharing of your own case studies is thus encouraged.

Relevance:

20.00%

Publisher:

Abstract:

A teaching laboratory experiment is described that uses Archimedes’ principle to investigate precisely the effect of global warming on the oceans. A large component of sea-level rise is the thermal expansion of seawater: the volume of a given mass of water increases as its density decreases with increasing temperature. Water close to 0 °C is placed in a beaker, and a glass marble hung from an electronic balance is immersed in the water. As the water warms, the apparent weight of the marble increases because the buoyant force falls with the decreasing water density. In the experiment performed in this paper, a balance with a precision of 0.1 mg was used with a marble of volume 40.0 cm³ and mass 99.3 g, yielding water density measurements with an average error of -0.008 ± 0.011%.
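The density calculation behind this experiment is straightforward. The sketch below (Python) uses the marble volume and mass quoted above; the balance readings are purely hypothetical illustration values, not figures reported in the paper.

```python
# Sketch of the density calculation behind the experiment described above.
# Known quantities from the abstract: marble volume 40.0 cm^3, marble mass 99.3 g.
# The balance readings (apparent mass of the submerged marble) are hypothetical.

MARBLE_VOLUME_CM3 = 40.0   # cm^3
MARBLE_MASS_G = 99.3       # g

def water_density(apparent_mass_g: float) -> float:
    """Recover water density (g/cm^3) from the apparent mass of the submerged marble.

    Archimedes' principle: apparent mass = true mass - rho_water * volume,
    so rho_water = (true mass - apparent mass) / volume.
    """
    return (MARBLE_MASS_G - apparent_mass_g) / MARBLE_VOLUME_CM3

# Near 0 degC (rho ~ 0.9998 g/cm^3) the balance would read about
# 99.3 - 0.9998 * 40.0 = 59.31 g; warmer, less dense water gives a larger reading.
for reading in (59.308, 59.330, 59.370):   # hypothetical balance readings in g
    print(f"apparent mass {reading:.3f} g -> density {water_density(reading):.5f} g/cm^3")
```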

Relevance:

20.00%

Publisher:

Abstract:

This report provides an evaluation of the implementation of the Polluter Pays Principle (PPP) – a principle of international environmental law – in the context of pollution from sugarcane farming affecting Australia’s Great Barrier Reef (GBR). The research was part of an experiment to test methods for evaluating the effectiveness of environmental laws. Overall, we found that whilst the PPP is reflected to a limited extent in Australian law (more so in Queensland law than at the national level), the behaviour one would expect in implementing the principle was largely inadequate. Evidence of a longer-term, explicit commitment to the PPP was particularly weak.

Relevance:

20.00%

Publisher:

Abstract:

This article addresses the need for an implementation mechanism for the protection of refugees’ rights, focusing on the principle of non-refoulement. It is contended that the principle forms part of customary international law, under which it is binding on all states irrespective of whether or not they are parties to the 1951 Convention Relating to the Status of Refugees or its 1967 Protocol. Over the last decade, the U.S. and its allies have been fighting to curb terrorism, which has raised many issues such as human rights violations, deportation, expulsion, extradition and rendition. Pakistan has played a critical role in the war against terrorism, particularly in relation to the war in Afghanistan. The particular concern of this article is the violation of refugees’ rights in Pakistan in 2008 and 2010. The article highlights the legislation regarding the non-expulsion of Afghan refugees from Pakistan to a territory where they have a well-founded fear of persecution. The article is divided into three parts: the first deals with the principle of non-refoulement, the second with exceptions to the principle, and the last discusses violations of the principle in Pakistan with reference to Afghan refugees.

Relevance:

20.00%

Publisher:

Abstract:

The Fabens method is commonly used to estimate the growth parameters k and l-infinity in the von Bertalanffy model from tag-recapture data. However, the Fabens method of estimation has an inherent bias when individual growth is variable. This paper presents an asymptotically unbiased method using a maximum likelihood approach that takes account of individual variability in both maximum length and age-at-tagging. It is assumed that each individual's growth follows a von Bertalanffy curve with its own maximum length and age-at-tagging. The parameter k is assumed to be constant to ensure that the mean growth follows a von Bertalanffy curve and to avoid overparameterization. Our method also makes more efficient use of the measurements at tagging and recapture and includes diagnostic techniques for checking distributional assumptions. The method is reasonably robust and performs better than the Fabens method when individual growth differs from the von Bertalanffy relationship. When measurement error is negligible, the estimation involves maximizing the profile likelihood of one parameter only. The method is applied to tag-recapture data for the grooved tiger prawn (Penaeus semisulcatus) from the Gulf of Carpentaria, Australia.
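As a rough illustration of the likelihood-based idea (not the paper's full method), the sketch below fits a Fabens-style model in which only L-infinity varies between individuals, assumed normally distributed, conditioning on length at tagging and ignoring measurement error and variability in age-at-tagging; the data and starting values are hypothetical.

```python
# Simplified sketch: maximum likelihood for a growth-increment model with individual
# variability in L-infinity only (normal with mean mu_inf and sd sigma_inf), shared k.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def neg_log_lik(params, l1, l2, dt):
    """Negative log-likelihood of recapture lengths l2 given tagging lengths l1."""
    k, mu_inf, sigma_inf = params
    if k <= 0 or sigma_inf <= 0:
        return np.inf
    growth_frac = 1.0 - np.exp(-k * dt)         # fraction of remaining growth realised
    mean_l2 = l1 + (mu_inf - l1) * growth_frac  # expected length at recapture
    sd_l2 = sigma_inf * growth_frac             # spread induced by variable L-infinity
    return -np.sum(norm.logpdf(l2, loc=mean_l2, scale=sd_l2))

# Hypothetical tag-recapture data: lengths in mm, times at liberty in years.
l1 = np.array([25.0, 30.0, 28.0, 35.0, 22.0])
l2 = np.array([33.0, 36.0, 35.5, 38.0, 31.0])
dt = np.array([0.50, 0.40, 0.60, 0.30, 0.70])

fit = minimize(neg_log_lik, x0=[1.0, 45.0, 3.0], args=(l1, l2, dt), method="Nelder-Mead")
k_hat, mu_inf_hat, sigma_inf_hat = fit.x
print(f"k = {k_hat:.3f}, mean L-infinity = {mu_inf_hat:.1f}, sd = {sigma_inf_hat:.2f}")
```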

Relevance:

20.00%

Publisher:

Abstract:

We propose a simple method of constructing quasi-likelihood functions for dependent data based on conditional mean-variance relationships, and apply the method to estimating the fractal dimension from box-counting data. Simulation studies were carried out to compare this method with traditional methods. We also applied the technique to real data from fishing grounds in the Gulf of Carpentaria, Australia.
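For context, the sketch below shows the traditional box-counting regression that such quasi-likelihood methods aim to improve upon (counts across box sizes are strongly dependent); the point set is an arbitrary illustration, not data from the paper.

```python
# Minimal sketch of the traditional box-counting estimate of a fractal dimension:
# count the boxes of side eps occupied by the data, then regress log N(eps) on log(1/eps).
import numpy as np

def box_counts(points, eps_values):
    """Number of occupied boxes of side eps covering a set of 2-D points."""
    counts = []
    for eps in eps_values:
        occupied = {tuple(idx) for idx in np.floor(points / eps).astype(int)}
        counts.append(len(occupied))
    return np.array(counts)

rng = np.random.default_rng(0)
points = rng.random((20000, 2))          # uniform points in the unit square (dimension 2)

eps_values = np.array([0.2, 0.1, 0.05, 0.025])
counts = box_counts(points, eps_values)

# The slope of log N(eps) versus log(1/eps) estimates the box-counting dimension.
slope, _ = np.polyfit(np.log(1.0 / eps_values), np.log(counts), 1)
print(f"estimated box-counting dimension: {slope:.2f}")   # close to 2 here
```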

Relevance:

20.00%

Publisher:

Abstract:

We consider estimation of mortality rates and growth parameters from length-frequency data for a fish stock and derive the underlying length distribution of the population and the catch when there is individual variability in the von Bertalanffy growth parameter L-infinity. The model is flexible enough to accommodate (1) any recruitment pattern as a function of both time and length, (2) length-specific selectivity, and (3) fishing effort that varies over time. The maximum likelihood method gives consistent estimates, provided the underlying distribution for individual variation in growth is correctly specified. Simulation results indicate that our method is reasonably robust to violations of the assumptions. The method is applied to data for the tiger prawn (Penaeus semisulcatus) to obtain estimates of natural and fishing mortality.
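As a minimal sketch of one ingredient of such a model (assuming, purely for illustration, normally distributed L-infinity across individuals and age measured from t0 = 0), individual variability in L-infinity propagates directly to the length-at-age distribution:

```latex
L_i(t) = L_{\infty,i}\left(1 - e^{-kt}\right), \qquad
L_{\infty,i} \sim N(\mu_\infty, \sigma_\infty^2)
\;\Longrightarrow\;
L_i(t) \sim N\!\left(\mu_\infty\left(1 - e^{-kt}\right),\;
\sigma_\infty^2\left(1 - e^{-kt}\right)^2\right)
```

The population length distribution is then a mixture of these age-specific distributions weighted by recruitment and survival, with length-specific selectivity applied to obtain the catch.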

Relevance:

20.00%

Publisher:

Abstract:

A simple stochastic model of a fish population subject to natural and fishing mortalities is described. The fishing effort is assumed to vary over different periods but to be constant within each period. A maximum-likelihood approach is developed for estimating natural mortality (M) and the catchability coefficient (q) simultaneously from catch-and-effort data. If there is not enough contrast in the data to provide reliable estimates of both M and q, as is often the case in practice, the method can be used to obtain the best possible values of q for a range of possible values of M. These techniques are illustrated with tiger prawn (Penaeus semisulcatus) data from the Northern Prawn Fishery of Australia.
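The profiling idea can be sketched as follows: fix M at each of several plausible values and maximize the likelihood over q alone. The sketch below uses a Baranov-type catch equation with F_t = qE_t and lognormal observation error; the initial abundance, error standard deviation and data are hypothetical illustration values, not quantities from the paper.

```python
# Hedged sketch: profile the catchability q over a grid of fixed natural mortality M.
import numpy as np
from scipy.optimize import minimize_scalar

effort = np.array([1.0, 1.2, 0.8, 1.5, 1.1])              # fishing effort per period
catches = np.array([210.0, 230.0, 150.0, 240.0, 160.0])   # observed catches (hypothetical)
N0 = 1000.0                                                # assumed initial abundance

def neg_log_lik(q, M, sigma=0.2):
    """Lognormal negative log-likelihood of the catches for given q and M."""
    if q <= 0:
        return np.inf
    N, nll = N0, 0.0
    for E, C_obs in zip(effort, catches):
        F = q * E
        Z = F + M
        C_pred = (F / Z) * (1.0 - np.exp(-Z)) * N          # Baranov catch equation
        nll += 0.5 * ((np.log(C_obs) - np.log(C_pred)) / sigma) ** 2
        N *= np.exp(-Z)                                    # survivors to the next period
    return nll

# Profile: for each plausible M, find the best-fitting catchability q.
for M in (0.02, 0.04, 0.06):
    res = minimize_scalar(neg_log_lik, bounds=(1e-4, 2.0), args=(M,), method="bounded")
    print(f"M = {M:.2f} -> q_hat = {res.x:.4f}, nll = {res.fun:.2f}")
```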

Relevance:

20.00%

Publisher:

Abstract:

Quasi-likelihood (QL) methods are often used to account for overdispersion in categorical data. This paper proposes a new way of constructing a QL function that stems from the conditional mean-variance relationship. Unlike traditional QL approaches to categorical data, this QL function is, in general, not a scaled version of the ordinary log-likelihood function. A simulation study is carried out to examine the performance of the proposed QL method. Fish mortality data from quantal response experiments are used for illustration.
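For comparison, the sketch below shows a traditional quasi-binomial treatment of overdispersed quantal-response data (dispersion estimated from Pearson's chi-square), the kind of approach the proposed QL construction departs from; the dose-mortality data are hypothetical.

```python
# Traditional quasi-likelihood baseline for overdispersed quantal-response data.
import numpy as np
import statsmodels.api as sm

dose = np.array([1.0, 2.0, 4.0, 8.0, 16.0])   # hypothetical doses
deaths = np.array([2, 6, 12, 17, 19])          # hypothetical deaths out of 20 fish per dose
total = np.array([20, 20, 20, 20, 20])

X = sm.add_constant(np.log(dose))                   # logistic regression on log dose
endog = np.column_stack([deaths, total - deaths])   # successes / failures

# scale='X2' estimates the dispersion from Pearson's chi-square, i.e. the
# quasi-binomial adjustment Var(y) = phi * mu * (1 - mu) / n.
fit = sm.GLM(endog, X, family=sm.families.Binomial()).fit(scale="X2")
print(fit.summary())
print("estimated dispersion (phi):", fit.scale)
```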

Relevance:

20.00%

Publisher:

Abstract:

Recent axiomatic derivations of the maximum entropy principle from consistency conditions are critically examined. We show that proper application of consistency conditions alone allows a wider class of functionals, essentially of the form ∫dx p(x)[p(x)/g(x)]^s for some real number s, to be used for inductive inference, and the commonly used form −∫dx p(x) ln[p(x)/g(x)] is only a particular case. The role of the prior density g(x) is clarified. It can be regarded as a geometric factor describing the coordinate system used; it does not represent information of the same kind as that obtained by measurements on the system in the form of expectation values.

Relevance:

20.00%

Publisher:

Abstract:

This report is the result of a small-scale experiment looking at improving methods for evaluating environmental laws. The objective of this research was to evaluate the effectiveness of the precautionary principle – an accepted principle of international environmental law – in the context of Australia’s endangered species. Two case studies were selected by our team: the (Great) White Shark and an endangered native Australian plant known as Tylophora linearis.

Relevance:

20.00%

Publisher:

Abstract:

Having the ability to work with complex models can be highly beneficial, but the computational cost of doing so is often large. Complex models often have intractable likelihoods, so methods that directly use the likelihood function are infeasible. In these situations, the benefits of working with likelihood-free methods become apparent. Likelihood-free methods, such as parametric Bayesian indirect likelihood, which uses the likelihood of an alternative parametric auxiliary model, have been explored throughout the literature as a good alternative when the model of interest is complex. One of these methods is the synthetic likelihood (SL), which assumes a multivariate normal approximation to the likelihood of a summary statistic of interest. This paper explores the accuracy and computational efficiency of the Bayesian version of the synthetic likelihood (BSL) approach in comparison with a competitor known as approximate Bayesian computation (ABC), as well as its sensitivity to tuning parameters and assumptions. We relate BSL to pseudo-marginal methods and propose an alternative SL that uses an unbiased estimator of the exact working normal likelihood when the summary statistic has a multivariate normal distribution. Several applications of varying complexity are considered to illustrate the findings of this paper.
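The core synthetic-likelihood computation can be sketched in a few lines: simulate the model many times at a candidate parameter value, fit a multivariate normal to the simulated summary statistics, and evaluate the observed summary under that normal. The toy simulator and summaries below are illustrative assumptions, not the applications considered in the paper.

```python
# Minimal sketch of the synthetic likelihood (SL) for a toy simulator.
import numpy as np
from scipy.stats import multivariate_normal

def simulate(theta, rng, n_obs=100):
    """Toy simulator: i.i.d. normal data with mean theta[0] and sd theta[1]."""
    return rng.normal(theta[0], theta[1], size=n_obs)

def summary(x):
    """Summary statistic: sample mean and log sample standard deviation."""
    return np.array([x.mean(), np.log(x.std(ddof=1))])

def log_synthetic_likelihood(theta, s_obs, rng, n_sim=200):
    """Gaussian synthetic log-likelihood of s_obs at parameter theta."""
    sims = np.array([summary(simulate(theta, rng)) for _ in range(n_sim)])
    mu_hat = sims.mean(axis=0)                     # fitted normal mean
    sigma_hat = np.cov(sims, rowvar=False)         # fitted normal covariance
    return multivariate_normal.logpdf(s_obs, mean=mu_hat, cov=sigma_hat)

rng = np.random.default_rng(1)
s_obs = summary(simulate([0.5, 1.0], rng))         # pretend these are the observed data
for theta in ([0.0, 1.0], [0.5, 1.0], [1.0, 1.0]):
    print(theta, log_synthetic_likelihood(theta, s_obs, rng))
```

In BSL this estimated log-likelihood simply replaces the exact log-likelihood inside a standard MCMC sampler over the parameters, which is what connects the approach to pseudo-marginal methods.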

Relevance:

20.00%

Publisher:

Abstract:

Minimum Description Length (MDL) is an information-theoretic principle that can be used for model selection and other statistical inference tasks. There are various ways to use the principle in practice. One theoretically valid way is to use the normalized maximum likelihood (NML) criterion. Due to computational difficulties, this approach has not been used very often. This thesis presents efficient floating-point algorithms that make it possible to compute the NML for multinomial, Naive Bayes and Bayesian forest models. None of the presented algorithms relies on asymptotic analysis, and for the first two model classes we also discuss how to compute exact rational-number solutions.
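To make the object of these algorithms concrete, the brute-force sketch below computes the NML normalizer (parametric complexity) and the stochastic complexity for a small multinomial sample by direct enumeration over count vectors, which is feasible only for small n and K; the efficient algorithms in the thesis exist precisely to avoid this kind of computation. The observed counts are hypothetical.

```python
# Brute-force NML for a K-category multinomial: enumerate all count vectors.
from math import factorial, log, prod

def compositions(n, k):
    """All ways to write n as an ordered sum of k non-negative integers."""
    if k == 1:
        yield (n,)
        return
    for first in range(n + 1):
        for rest in compositions(n - first, k - 1):
            yield (first,) + rest

def multinomial_coefficient(counts):
    n = sum(counts)
    return factorial(n) // prod(factorial(c) for c in counts)

def parametric_complexity(K, n):
    """Normalizing term of the NML distribution for a K-category multinomial."""
    total = 0.0
    for counts in compositions(n, K):
        p_hat = prod((c / n) ** c for c in counts if c > 0)   # maximized probability
        total += multinomial_coefficient(counts) * p_hat
    return total

# Stochastic complexity of an observed sequence with these category counts:
counts = (7, 2, 1)                       # hypothetical data: n = 10, K = 3
n, K = sum(counts), len(counts)
log_ml = sum(c * log(c / n) for c in counts if c > 0)         # log P(x^n | ML parameters)
stochastic_complexity = -log_ml + log(parametric_complexity(K, n))
print(f"regret term log C({K}, {n}) = {log(parametric_complexity(K, n)):.4f}")
print(f"stochastic complexity = {stochastic_complexity:.4f} nats")
```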

Relevance:

20.00%

Publisher:

Abstract:

The Minimum Description Length (MDL) principle is a general, well-founded theoretical formalization of statistical modeling. The most important notion of MDL is the stochastic complexity, which can be interpreted as the shortest description length of a given sample of data relative to a model class. The exact definition of the stochastic complexity has gone through several evolutionary steps. The latest instantiation is based on the so-called Normalized Maximum Likelihood (NML) distribution, which has been shown to possess several important theoretical properties. However, applications of this modern version of MDL have been quite rare because of computational complexity problems: for discrete data, the definition of NML involves an exponential sum, and in the case of continuous data, a multi-dimensional integral that is usually infeasible to evaluate or even to approximate accurately. In this doctoral dissertation, we present mathematical techniques for computing NML efficiently for some model families involving discrete data. We also show how these techniques can be used to apply MDL in two practical applications: histogram density estimation and clustering of multi-dimensional data.
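For reference, for discrete data the NML distribution and the associated stochastic complexity mentioned above are defined as

```latex
P_{\mathrm{NML}}(x^n \mid \mathcal{M}) =
\frac{P\!\left(x^n \mid \hat{\theta}(x^n), \mathcal{M}\right)}
     {\sum_{y^n} P\!\left(y^n \mid \hat{\theta}(y^n), \mathcal{M}\right)},
\qquad
\mathrm{SC}(x^n \mid \mathcal{M}) = -\log P_{\mathrm{NML}}(x^n \mid \mathcal{M}),
```

where the denominator runs over all data sets of the same size; this is the exponential sum (or, for continuous data, the multi-dimensional integral) that makes direct evaluation infeasible.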

Relevance:

20.00%

Publisher:

Abstract:

It is common to model the dynamics of fisheries using natural and fishing mortality rates estimated independently in two separate analyses. Fishing mortality is routinely estimated from widely available logbook data, whereas natural mortality estimates have often required more specific, less frequently available data. However, in the case of the fishery for the brown tiger prawn (Penaeus esculentus) in Moreton Bay, both fishing and natural mortality rates have been estimated from logbook data. The present work extended the fishing mortality model to incorporate an eco-physiological response of the tiger prawn to temperature, and allowed recruitment timing to vary from year to year. These ecological characteristics of the dynamics of this fishery were ignored in the separate model that estimated natural mortality. Therefore, we propose to estimate both natural and fishing mortality rates within a single model using a consistent set of hypotheses. This approach was applied to Moreton Bay brown tiger prawn data collected between 1990 and 2010. Natural mortality was estimated by maximum likelihood to be 0.032 ± 0.002 week⁻¹, approximately 30% lower than the fixed value used in previous models of this fishery (0.045 week⁻¹).
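To put the two weekly rates in perspective, the small sketch below converts them to the annual natural survival they imply under constant instantaneous mortality (a back-of-the-envelope comparison, not a calculation from the paper).

```python
# Annual survival implied by a constant instantaneous weekly natural mortality rate M:
# survival over a year = exp(-M * 52).
from math import exp

for label, M_week in (("this study", 0.032), ("previous models", 0.045)):
    annual_survival = exp(-M_week * 52)
    print(f"{label}: M = {M_week} week^-1 -> annual natural survival ~ {annual_survival:.1%}")
```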