959 results for LIKELIHOOD PRINCIPLE


Relevance: 10.00%

Abstract:

The drive for comparability of financial information is to enable users to distinguish similarities and differences in economic activities, both for a single entity over time and across entities, so that resource-allocation decisions are better informed. With the increased globalisation of economic activity, enhanced international comparability of financial statements is often used as an argument to advance the convergence of local accounting standards towards International Financial Reporting Standards (IFRS). Differences in the underlying economic substance of transactions between jurisdictions, together with accounting standards that allow alternative treatments, may render this expectation of increased comparability unrealistic. Motivated by observations that, as a construct, comparability is under-researched and not well understood, we develop a comparability framework that distinguishes four types of comparability. In applying this framework to pension accounting in the Australian and US contexts, we highlight a dilemma: while regulators seek to increase the likelihood that similar events are accounted for similarly, an unintended consequence may be that preparers are forced to apply similar accounting treatment to events that are, in substance, different.

Abstract:

Scientific efforts to understand and reduce the occurrence of road crashes continue to expand, particularly for vulnerable road user groups. Three groups receiving increasing attention in the literature are younger drivers, motorcyclists and older drivers. These three groups are at elevated risk of being in a crash or seriously injured, and research continues to focus on the origins of this risk as well as the development of appropriate countermeasures to improve driving outcomes for these cohorts. However, it remains unclear which factors contribute most to crash risk, or which countermeasures are likely to produce the greatest long-term positive effects on road safety. This paper reviews research on the personal and environmental factors that increase crash risk for these groups and considers directions for future research in the respective areas. A major theme to emerge from this review is that while there is a plethora of individual and situational factors that influence the likelihood of crashes, these factors often combine in an additive manner to exacerbate the risk of both injury and fatality. Additionally, a number of risk factors are pertinent to all three road user groups, particularly age and level of driving experience. As a result, targeted interventions that address these factors are likely to maximise the flow-on benefits to a wider range of road users. Finally, further research is needed to bridge the research-to-practice gap, so that evidence-based findings are directly translated into effective policies that improve safety outcomes.

Abstract:

This paper focuses on codes of practice in domestic (in-country) and international (out of country) philanthropic giving/grantmaking, their similarities and differences. Codes of principle and practice are interesting not so much because they accurately reflect differences in practice on the ground, but rather because they indicate what is considered important or relevant, as well as aspirational. Codes tell us what people are most concerned about – what is seen to be in need of regulation or reminder.

Abstract:

This paper describes a new system, dubbed Continuous Appearance-based Trajectory Simultaneous Localisation and Mapping (CAT-SLAM), which augments sequential appearance-based place recognition with local metric pose filtering to improve the frequency and reliability of appearance-based loop closure. As in other approaches to appearance-based mapping, loop closure is performed without calculating global feature geometry or performing 3D map construction. Loop-closure filtering uses a probabilistic distribution of possible loop closures along the robot’s previous trajectory, represented as a list of previously visited locations connected by odometric information. Sequential appearance-based place recognition and local metric pose filtering are evaluated simultaneously using a Rao–Blackwellised particle filter, which weights particles based on appearance matching over sequential frames and on the similarity of robot motion along the trajectory. The particle filter explicitly models both the likelihood of revisiting previous locations and of exploring new locations. A modified resampling scheme counters particle deprivation and allows loop-closure updates to be performed in constant time for a given environment. We compare the performance of CAT-SLAM with FAB-MAP (a state-of-the-art appearance-only SLAM algorithm) on multiple real-world datasets, demonstrating an increase in the number of correct loop closures detected by CAT-SLAM.
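The weight-update and resampling step described above can be sketched in a few lines. The function names, the toy likelihood values, and the choice of systematic (low-variance) resampling as the particle-deprivation countermeasure are illustrative assumptions for this sketch, not the paper's implementation.

```python
import numpy as np

def update_weights(weights, appearance_lik, motion_lik):
    # Reweight each particle by the product of its appearance-match
    # likelihood and its motion-similarity likelihood, then normalise.
    w = weights * appearance_lik * motion_lik
    return w / w.sum()

def systematic_resample(weights, seed=0):
    # Low-variance systematic resampling: one uniform draw, offsets
    # spread evenly over [0, 1), which counters particle deprivation.
    rng = np.random.default_rng(seed)
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n
    return np.searchsorted(np.cumsum(weights), positions)

# Toy update: 5 particles along the trajectory; particle 2 sits at a
# location whose appearance matches the current camera frame well.
w = np.full(5, 0.2)
appearance = np.array([0.1, 0.1, 0.9, 0.1, 0.1])
motion = np.array([0.5, 0.9, 0.9, 0.5, 0.2])
w = update_weights(w, appearance, motion)
idx = systematic_resample(w)
```

After the update, mass concentrates on the particle supported by both cues, and resampling duplicates it while pruning negligible particles.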

Abstract:

Establishing a persistent presence in the ocean with an AUV, in order to observe temporal variability of large-scale ocean processes, requires a unique sensor platform. In this paper, we propose a strategy that utilizes ocean model predictions to increase the autonomy and control of Lagrangian or profiling floats for precisely this purpose. An A* planner is applied to a local controllability map generated from predictions of ocean currents to compute a path between prescribed waypoints that has the highest likelihood of successful execution. The control to follow the planned path is computed with a model predictive controller, designed to select the best depth for the vehicle so that it can exploit ambient currents to reach the goal waypoint. Mission constraints are employed to simulate a practical data-collection mission. Results are presented in simulation for a mission off the coast of Los Angeles, CA, USA, and demonstrate the surprising ability of a Lagrangian float to reach a desired location.
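The planning step can be illustrated with a minimal grid-based A* search. Treating each cell's cost as the negative log of the predicted probability of successfully transiting it (so the cheapest path is the most likely to execute) is our assumption for this sketch, and the grid values are invented.

```python
import heapq
from itertools import count

def a_star(cost, start, goal):
    # A* over a 2-D grid of positive per-cell transit costs. If each
    # cost were -log(predicted transit success probability), the
    # minimum-cost path would be the most likely to execute.
    rows, cols = len(cost), len(cost[0])
    cmin = min(min(row) for row in cost)
    heur = lambda r, c: cmin * (abs(r - goal[0]) + abs(c - goal[1]))  # admissible
    tick = count()  # heap tie-breaker
    frontier = [(heur(*start), next(tick), 0.0, start, None)]
    parent, best_g = {}, {start: 0.0}
    while frontier:
        _, _, g, cell, prev = heapq.heappop(frontier)
        if cell in parent:
            continue
        parent[cell] = prev
        if cell == goal:
            break
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in parent:
                ng = g + cost[nr][nc]
                if ng < best_g.get((nr, nc), float("inf")):
                    best_g[(nr, nc)] = ng
                    heapq.heappush(
                        frontier,
                        (ng + heur(nr, nc), next(tick), ng, (nr, nc), cell))
    path, cell = [], goal
    while cell is not None:
        path.append(cell)
        cell = parent[cell]
    return path[::-1]

# Toy controllability map: the middle column has adverse currents,
# so the planner detours around it rather than crossing directly.
grid = [[1, 9, 1],
        [1, 9, 1],
        [1, 1, 1]]
path = a_star(grid, (0, 0), (0, 2))
```

The returned path hugs the cheap cells, which is exactly the behaviour wanted when cost encodes the risk of failing to transit a cell.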

Abstract:

The emerging principle of a “responsibility to protect” (R2P) presents a direct challenge to China’s traditional emphasis on the twin principles of non-intervention in the domestic affairs of other states and non-use of military force. This paper considers the evolution of China’s relationship with R2P over the past ten years. In particular, it examines how China engaged with R2P during the recent Libyan crisis, and considers what impact this conflict may have, first, on Chinese attitudes to R2P and, second, on the future development and implementation of the doctrine itself. This paper argues that China’s decision to allow the passage of Security Council resolution 1973, authorising force in Libya, was shaped by an unusual set of political and factual circumstances, and should not be viewed as evidence of a dramatic shift in Chinese attitudes towards R2P. More broadly, controversy over the scope of NATO’s military action in Libya has raised questions about R2P’s legitimacy, which have contributed to a lack of timely international action in Syria. In the short term at least, this post-Libya backlash against R2P is likely to constrain the Security Council’s ability to respond decisively to other civilian protection situations.

Abstract:

Recent increases in cycling have led to many media articles highlighting concerns about interactions between cyclists and pedestrians on footpaths and off-road paths. Under the Australian Road Rules, adults are not allowed to ride on footpaths unless accompanying a child 12 years of age or younger; however, this rule does not apply in Queensland. This paper reviews international studies that examine the safety of footpath cycling for both cyclists and pedestrians, along with relevant Australian crash and injury data. The results of a survey of more than 2,500 Queensland adult cyclists are presented in terms of the frequency of footpath cycling, the characteristics of those cyclists and the characteristics of self-reported footpath crashes. A third of the respondents reported riding on the footpath and, of those, about two-thirds did so reluctantly. Riding on the footpath was more common for utilitarian trips and for new riders, although the average distance ridden on footpaths was greater for experienced riders. About 5% of distance ridden, and a similar percentage of self-reported crashes, occurred on footpaths. These data are discussed in terms of the Safe Systems principle of separating road users with vastly different levels of kinetic energy. The paper concludes that footpaths are important facilities for both inexperienced and experienced riders and for utilitarian riding, especially in locations that riders consider do not provide a safe system for cycling.

Abstract:

This paper reports on the development of a good practice guide that will offer the higher education sector a framework for safeguarding student learning engagement. The good practice guide and framework are underpinned by a set of principles initially identified as themes in the social justice literature, which were refined following the consolidation of data collected from eight selected “good practice” Australasian universities and feedback gathered at various forums and presentations. The good practice guide will provide the sector with examples of institution-wide efforts that respond to national priorities for student retention, and will also provide exemplars of institutional practices for each principle to facilitate the uptake of sector-wide good practice. Participants will be provided with the opportunity to discuss the social justice principles and the draft good practice guide, and to identify the practical applications of the guide within individual institutions.

Abstract:

In this paper, we apply a simulation-based approach to estimating transmission rates of nosocomial pathogens. In particular, the objective is to infer the transmission rate between colonised health-care practitioners and uncolonised patients (and vice versa) solely from routinely collected incidence data. The method, based on approximate Bayesian computation, is substantially less computationally intensive and easier to implement than the likelihood-based approaches we refer to here. We find that, by replacing the likelihood with a comparison of an efficient summary statistic between observed and simulated data, little is lost in the precision of the estimated transmission rates. Furthermore, we investigate the impact of incorporating uncertainty in previously fixed parameters on the precision of the estimated transmission rates.
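The "simulate, summarise, compare" loop at the heart of approximate Bayesian computation can be sketched as follows. The toy Poisson incidence model, the flat prior, the mean as the summary statistic, and all names and values here are our illustrative assumptions; the paper's model of ward transmission is far richer.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in "observed" incidence: weekly colonisation counts simulated
# from a known rate (0.8) purely so the sketch is self-contained; the
# paper would use routinely collected hospital incidence data instead.
observed = rng.poisson(0.8, size=200)
s_obs = observed.mean()                     # efficient summary statistic

def abc_rejection(n_draws=20000, tol=0.05):
    # ABC rejection: draw a candidate rate from a flat prior, simulate
    # data under it, and accept the candidate when the simulated and
    # observed summary statistics agree to within `tol`. No likelihood
    # is ever evaluated.
    accepted = []
    for rate in rng.uniform(0.0, 2.0, size=n_draws):
        if abs(rng.poisson(rate, size=200).mean() - s_obs) < tol:
            accepted.append(rate)
    return np.array(accepted)

posterior = abc_rejection()
```

The accepted draws approximate the posterior over the rate; shrinking `tol` sharpens the approximation at the price of fewer acceptances, which is the key trade-off behind the paper's point about precision.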

Abstract:

In response to developments in international trade and an increased focus on international transfer-pricing issues, Canada’s minister of finance announced in the 1997 budget that the Department of Finance would undertake a review of the transfer-pricing provisions in the Income Tax Act. On September 11, 1997, the Department of Finance released draft transfer-pricing legislation and Revenue Canada released revised draft Information Circular 87-2R. The legislation was subsequently amended and included in Bill C-28, which received first reading on December 10, 1997. The new rules are intended to update Canada’s international transfer-pricing practices. In particular, they attempt to harmonize the standards in the Income Tax Act with the arm’s-length principle established in the OECD’s transfer-pricing guidelines. The new rules also set out contemporaneous documentation requirements in respect of cross-border related-party transactions, facilitate administration of the law by Revenue Canada, and provide for a penalty where transfer prices do not comply with the arm’s-length principle. The Australian tax authorities have similarly reviewed and updated their transfer-pricing practices. Since 1992, the Australian commissioner of taxation has issued three rulings and seven draft rulings directly relating to international transfer pricing. These rulings outline the selection and application of transfer-pricing methodologies, documentation requirements, and penalties for non-compliance. The Australian Taxation Office supports the use of advance pricing agreements (APAs) and has expanded its audit strategy by conducting transfer-pricing risk assessment reviews. This article presents a detailed review of Australia’s transfer-pricing policy and practices, which address essentially the same concerns as those at which the new Canadian rules are directed. This review provides a framework for comparison of the approaches adopted in the two jurisdictions.
The author concludes that although these approaches differ in some respects, ultimately they produce a similar result. Both regimes set a clear standard to be met by multinational enterprises in establishing transfer prices. Both provide for audits and penalties in the event of noncompliance. And both offer the alternative of an APA as a means of avoiding transfer-pricing disputes with Australian and Canadian tax authorities.

Abstract:

In recent years, a number of phylogenetic methods have been developed for estimating molecular rates and divergence dates under models that relax the molecular clock constraint by allowing rate change throughout the tree. These methods are being used with increasing frequency, but there have been few studies into their accuracy. We tested the accuracy of several relaxed-clock methods (penalized likelihood and Bayesian inference using various models of rate change) using nucleotide sequences simulated on a nine-taxon tree. When the sequences evolved with a constant rate, the methods were able to infer rates accurately, but estimates were more precise when a molecular clock was assumed. When the sequences evolved under a model of autocorrelated rate change, rates were accurately estimated using penalized likelihood and by Bayesian inference using lognormal and exponential models of rate change, while other models did not perform as well. When the sequences evolved under a model of uncorrelated rate change, only Bayesian inference using an exponential rate model performed well. Collectively, the results provide a strong recommendation for using the exponential model of rate change if a conservative approach to divergence time estimation is required. A case study is presented in which we use a simulation-based approach to examine the hypothesis of elevated rates in the Cambrian period, and it is found that these high rate estimates might be an artifact of the rate estimation method. If this bias is present, then the ages of metazoan divergences would be systematically underestimated. The results of this study have implications for studies of molecular rates and divergence dates.
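The confounding these methods must untangle — a branch length is the product of a rate and a time — can be shown in a few lines. The lognormal rate model echoes one of the relaxed-clock models tested above, but the parameters, branch durations, and variable names are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Branch length = substitution rate x time. Under "uncorrelated rate
# change" each branch draws its own rate independently; here from a
# lognormal with arbitrary illustrative parameters.
durations = np.array([10.0, 20.0, 5.0, 15.0, 30.0])    # branch times
rates = rng.lognormal(mean=np.log(0.01), sigma=0.5, size=durations.size)
branch_lengths = rates * durations                      # expected subs/site

# A strict molecular clock forces one shared rate; its best single
# estimate is total length over total time:
clock_rate = branch_lengths.sum() / durations.sum()

# The per-branch deviation from that clock rate is the variation a
# relaxed-clock model (lognormal, exponential, ...) must absorb:
deviation = rates / clock_rate
```

When `sigma` is set to zero the data are clock-like and the single-rate estimate recovers every branch exactly, which mirrors the simulation result above that clock-based estimates are more precise when the clock actually holds.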

Abstract:

Despite recent methodological advances in inferring the time-scale of biological evolution from molecular data, the fundamental question of whether our substitution models are sufficiently well specified to accurately estimate branch lengths has received little attention. I examine this implicit assumption of all molecular dating methods using a vertebrate mitochondrial protein-coding dataset. Comparison with analyses in which the data are RY-coded (AG → R; CT → Y) suggests that even rates-across-sites maximum likelihood greatly under-compensates for multiple substitutions among the standard (ACGT) NT-coded data, which has been subject to greater phylogenetic signal erosion. Accordingly, the fossil record indicates that branch lengths inferred from the NT-coded data translate into divergence time overestimates when calibrated from deeper in the tree. Intriguingly, RY-coding led to the opposite result. The underlying NT and RY substitution model misspecifications likely relate, respectively, to “hidden” rate heterogeneity and to changes in substitution processes across the tree, for which I provide simulated examples. Given the magnitude of the inferred molecular dating errors, branch-length estimation biases may partly explain current conflicts with some palaeontological dating estimates.
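The RY-recoding itself is mechanical, as this minimal helper (our own, not from the paper) shows:

```python
# RY-recoding collapses purines (A, G) to R and pyrimidines (C, T) to Y.
# Transitions within each class become invisible, so only transversions
# remain informative; this discards the changes most prone to saturation.
RY_TABLE = str.maketrans("AGCTagct", "RRYYrryy")

def ry_code(seq):
    # Recode a nucleotide string; characters outside AGCT (gaps,
    # ambiguity codes) pass through unchanged.
    return seq.translate(RY_TABLE)

ry_code("ACGTTGCA")  # "RYRYYRYR"
```

Branch lengths are then estimated from the two-state (R/Y) alignment with a binary substitution model, which is what allows the NT- versus RY-based date estimates to be contrasted.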

Abstract:

The opening phrase of the title is from Charles Darwin’s notebooks (Schweber 1977). It is a double reminder: firstly, that mainstream evolutionary theory is not just about describing nature but is particularly looking for mechanisms or ‘causes’; and secondly, that there will usually be several causes affecting any particular outcome. The second part of the title reflects our concern at the almost universal rejection of the idea that biological mechanisms are sufficient for macroevolutionary changes, thus rejecting a cornerstone of Darwinian evolutionary theory. Our primary aim here is to consider ways of making it easier to develop and to test hypotheses about evolution. Formalizing hypotheses can help generate tests. In an absolute sense, some of the discussion by scientists about evolution is little better than the lack of reasoning used by those advocating intelligent design. Our discussion is set in a Popperian framework, where science is defined as that area of study in which it is possible, in principle, to find evidence against hypotheses – they are in principle falsifiable. With time, however, the boundaries of science keep expanding. In the past, some aspects of evolution were outside the current boundaries of falsifiable science, but new techniques and ideas are increasingly expanding those boundaries, and it is appropriate to re-examine some topics. Over the last few decades there appears to have been an increasingly strong assumption to look first (and only) for a physical cause. This decision is virtually never formally discussed; it is simply assumed that some physical factor ‘drives’ evolution. It is necessary to examine our assumptions much more carefully. What is meant by physical factors ‘driving’ evolution, or by an ‘explosive radiation’? Our discussion focuses on two of the six mass extinctions: the fifth, the events of the Late Cretaceous; and the sixth, which began at least 50,000 years ago and is ongoing.

Cretaceous/Tertiary boundary: the rise of birds and mammals. We have had a long-term interest (Cooper and Penny 1997) in designing tests to help evaluate whether the processes of microevolution are sufficient to explain macroevolution. The real challenge is to formulate hypotheses in a testable way. For example, the number of lineages of birds and mammals that survived from the Cretaceous to the present is one test. Our first estimate was 22 for birds, and current work is tending to increase this value. This still does not consider lineages that survived into the Tertiary and then went extinct later. Our initial suggestion was probably too narrow in that it lumped four models from Penny and Phillips (2004) into one. This reduction is too simplistic: we need to know about survival, and about ecological and morphological divergences during the Late Cretaceous, and whether crown groups of avian or mammalian orders may have existed back into the Cretaceous. More recently (Penny and Phillips 2004) we have formalized hypotheses about dinosaurs and pterosaurs, with the prediction that interactions between mammals (and ground-feeding birds) and dinosaurs would be most likely to affect the smallest dinosaurs, and similarly that interactions between birds and pterosaurs would particularly affect the smaller pterosaurs. There is now evidence for both classes of interactions, with the smallest dinosaurs and pterosaurs declining first, as predicted. Thus, testable models are now possible.

Mass extinction number six: human impacts. On a broad scale, there is a good correlation between the time of human arrival and increased extinctions (Hurles et al. 2003; Martin 2005; Figure 1). However, it is necessary to distinguish different time scales (Penny 2005), and on a finer scale there are still large numbers of possibilities. In Hurles et al. (2003) we mentioned habitat modification (including the use of fire) and introduced plants and animals (including kiore), in addition to direct predation (the ‘overkill’ hypothesis). We also need to consider the prey switching that occurs in early human societies, as evidenced by the results of Wragg (1995) on the middens of different ages on Henderson Island in the Pitcairn group. In addition, the presence of human-wary or human-adapted animals will affect their representation in the subfossil record. A better understanding of human impacts world-wide, in conjunction with pre-scientific knowledge, will make it easier to discuss the issues by removing ‘blame’. While spontaneous generation was still universally accepted, there was the expectation that animals would continue to reappear. New Zealand is one of the very best locations in the world to study many of these issues: apart from the marine fossil record, some human impact events are extremely recent and the remains less disrupted by time.

Abstract:

An increase in the likelihood of navigational collisions in port waters has focused attention on the collision avoidance process in port traffic safety. The most widely used on-board collision avoidance system is the automatic radar plotting aid, a passive warning system that triggers an alert based on the pilot’s pre-defined distance and time proximity thresholds at the closest points of approach in encounters with nearby vessels. To better support pilots’ decision making in close-quarters situations, collision risk should be treated as a continuous monotonic function of the proximities, and risk perception should be modelled probabilistically. This paper derives an ordered probit regression model to study perceived collision risks. To illustrate the procedure, risk ratings from Singapore port pilots were obtained to calibrate the regression model. The results demonstrate that a framework based on the probabilistic risk assessment model can give a better understanding of collision risk and help define a more appropriate level of evasive action.
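An ordered probit can be fitted by maximum likelihood in a few lines: a latent risk score is driven by covariates and binned by ordered cut-points into the observed rating categories. The synthetic "risk rating" data, the single proximity covariate, and all names below are illustrative assumptions, not the pilots' survey or the paper's specification.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# Synthetic ordinal risk ratings: one proximity covariate drives a
# standard-normal latent risk, binned by two cut-points into the
# ordered categories 0 < 1 < 2.
n = 500
x = rng.normal(size=n)
latent = 1.2 * x + rng.normal(size=n)
y = np.searchsorted(np.array([-0.5, 0.8]), latent)

def neg_loglik(params):
    # Ordered probit: P(y=k) = Phi(tau_k - x*beta) - Phi(tau_{k-1} - x*beta),
    # with tau_0 = -inf and tau_K = +inf. Parameterising the gap between
    # cut-points on the log scale keeps them ordered during optimisation.
    beta, t0, log_gap = params
    taus = np.array([-np.inf, t0, t0 + np.exp(log_gap), np.inf])
    eta = beta * x
    p = norm.cdf(taus[y + 1] - eta) - norm.cdf(taus[y] - eta)
    return -np.log(np.clip(p, 1e-12, None)).sum()

fit = minimize(neg_loglik, x0=[0.0, -1.0, 0.0], method="Nelder-Mead")
beta_hat, t0_hat = fit.x[0], fit.x[1]
```

In practice one would reach for a ready-made implementation (e.g. an ordered-model class in a statistics package), but the hand-rolled likelihood makes the cut-point structure — the part that encodes "risk is ordinal, not just categorical" — explicit.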