896 results for Law of Propagation of Uncertainty
Abstract:
There are many issues associated with good faith that will ultimately confront the Australian High Court, and a number of these have been well canvassed. However, one significant issue has attracted relatively little comment. To date, a number of Australian courts (lower in the judicial hierarchy) have been prepared to hold directly, tacitly accept, or assume (without making a final determination) that good faith is implied (as a matter of law) in the performance and enforcement of a very broad class of contract, namely commercial contracts per se. This broad approach is demonstrated in decisions of the Federal Court, the New South Wales Court of Appeal, and the Supreme Courts of Victoria and Western Australia, and has crept into pleadings in commercial matters in Queensland.
Abstract:
As regional and continental carbon balances of terrestrial ecosystems become available, it is becoming clear that soils are the largest source of uncertainty. Repeated inventories of soil organic carbon (SOC) organized in soil monitoring networks (SMNs) are being implemented in a number of countries. This paper reviews the concepts and design of SMNs in ten countries, and discusses the contribution of such networks to reducing the uncertainty of soil carbon balances. Some SMNs are designed to estimate country-specific land use or management effects on SOC stocks, while others collect soil carbon and ancillary data to provide a nationally consistent assessment of soil carbon condition across the major land-use/soil-type combinations. The former use a single sampling campaign of paired sites, while the latter use both systematic (usually grid-based) and stratified repeated sampling campaigns (at 5–10 year intervals) with densities of one site per 10–1,040 km². For paired sites, multiple samples are taken at each site to allow statistical analysis, while for single sites, composite samples are taken. In both cases, fixed depth increments together with samples for bulk density and stone content are recommended. Samples should be archived to allow re-measurement using updated techniques. Information on land management and, where possible, land use history should be systematically recorded for each site. A case study of the agricultural frontier in Brazil is presented, in which land use effect factors are calculated in order to quantify the CO2 fluxes from national land use/management conversion matrices. Process-based SOC models can be run for the individual points of the SMN, provided detailed land management records are available. Such studies are still rare, as most SMNs have been implemented recently or are in progress. Examples from the USA and Belgium show that uncertainties in SOC change range from 1.6–6.5 Mg C ha⁻¹ for the prediction of SOC stock changes on individual sites to 11.72 Mg C ha⁻¹, or 34% of the median SOC change, for soil/land use/climate units. For national SOC monitoring, stratified sampling appears to be the most straightforward way of attributing SOC values to units with similar soil/land use/climate conditions (i.e. a spatially implicit upscaling approach). Keywords: soil monitoring networks; soil organic carbon; modeling; sampling design
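Since the review centres on uncertainty in SOC stocks, a worked example may help. Below is a minimal sketch, assuming the standard fixed-depth stock equation (stock = carbon concentration × bulk density × depth × (1 − stone fraction)) and the first-order Gaussian law of propagation of uncertainty for a product of independent factors; all input values and their relative uncertainties are illustrative, not taken from the paper.

```python
# Sketch: fixed-depth SOC stock for one depth increment, with its standard
# uncertainty propagated from the quantities SMNs record (carbon concentration,
# bulk density, depth, stone content). Illustrative numbers only.
import math

def soc_stock_mg_ha(oc_pct, bd_g_cm3, depth_cm, stone_frac):
    """SOC stock (Mg C ha^-1): OC% x bulk density (g cm^-3) x depth (cm) x (1 - stones)."""
    return oc_pct * bd_g_cm3 * depth_cm * (1.0 - stone_frac)

def propagated_rel_uncertainty(rel_uncertainties):
    """First-order relative uncertainty of a product of independent factors."""
    return math.sqrt(sum(u ** 2 for u in rel_uncertainties))

stock = soc_stock_mg_ha(oc_pct=1.2, bd_g_cm3=1.4, depth_cm=30, stone_frac=0.05)
# Relative standard uncertainty (u/x) of each factor, in the same order.
rel_u = propagated_rel_uncertainty([0.10, 0.08, 0.02, 0.05])
print(f"SOC stock: {stock:.1f} +/- {stock * rel_u:.1f} Mg C ha^-1")
```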
Abstract:
Historically there has been a correlation between economic cycles and litigation in the area of professional negligence relating to valuers. Negligence actions have principally been instigated by financiers over valuations prepared during more buoyant economic times where there has been a subsequent loss due to a reduction in property value. More specifically, during periods of economic downturn such as 1982 to 1983 and 1990 to 1998 there has been an increased focus by academic writers on professional negligence as it relates to property valuers. Based on historical trends, it is anticipated that the end of an extended period of economic prosperity, such as Australia has recently experienced, will once again be marked by an increase in litigation against valuers for professional negligence. However, the context of valuers' liability has become increasingly complex as a result of statutory reforms introduced in response to the Review of the Law of Negligence Final Report 2002 ("the IPP Report"), in particular the Civil Liability Acts and their proportionate liability provisions. This paper examines valuers' liability for professional negligence in the context of statutory reforms in Queensland and recent case law, to determine the most significant impacts of recent statutory reform on property valuers.
Abstract:
Aim: In this paper we discuss the use of the Precede-Proceed model when investigating health promotion options for breast cancer survivors. Background: Adherence to recommended health behaviors can optimize well-being after cancer treatment. Guided by the Precede-Proceed approach, we studied the behaviors of breast cancer survivors in our health service area. Data sources: The interview data from the cohort of breast cancer survivors are used in this paper to illustrate the use of Precede-Proceed in this nursing research context. Interview data were collected from June to December 2009. We also searched Medline, CINAHL, PsycINFO and PsycEXTRA up to 2010 for relevant literature in English to interrogate the data from other theoretical perspectives. Discussion: The Precede-Proceed model is theoretically complex. The deductive analytic process guided by the model usefully explained some of the health behaviors of cancer survivors, although it could not explicate many other findings. A complementary inductive approach to the analysis, and subsequent interpretation by way of Uncertainty in Illness Theory and other psychosocial perspectives, provided a comprehensive account of the qualitative data and resulted in contextually relevant recommendations for nursing practice. Implications for nursing: Nursing researchers using Precede-Proceed should maintain theoretical flexibility when interpreting qualitative data. Perspectives not embedded in the model might need to be considered to ensure that the data are analyzed in a contextually relevant way. Conclusion: Precede-Proceed provides a robust framework for nursing researchers investigating health promotion in cancer survivors; however, theoretical lenses additional to those embedded in the model can enhance data interpretation.
Abstract:
There are several ways in which the Commissioner of Taxation may indirectly obtain priority over unsecured creditors. This is contrary to the principle of pari passu, a principle endorsed by the 1988 Harmer Report as a fundamental objective of the law of insolvency. As the law and practice of Australia's taxation regime evolve, the law is being drafted in a manner that is inconsistent with the principle of pari passu. The natural consequence of this development is that it places at risk the capacity of corporate and bankruptcy laws to coexist and cooperate with taxation laws. This article posits that undermining the consistency of Commonwealth legislative objectives is undesirable. The authors suggest that one means of addressing the inconsistency is to examine whether there is a clearly aligned theoretical basis for the development of these areas of law, and the extent to which that alignment addresses the inconsistencies. This forms the basis for the recommendations made to resolve such inconsistencies, using statutory priorities as an exemplar.
Abstract:
Statement: Jams, Jelly Beans and the Fruits of Passion. Let us search, instead, for an epistemology of practice implicit in the artistic, intuitive processes which some practitioners do bring to situations of uncertainty, instability, uniqueness, and value conflict. (Schön 1983, p40) Game On was born out of the idea of creative community; finding, networking, supporting and inspiring the people behind the face of an industry, those in the mist of the machine and those intending to join. We understood this moment to be a pivotal opportunity to nurture a new emerging form of game making, in an era of change where the old industry models were proving to be unsustainable. As soon as we started putting people into a room under pressure, to make something in 48hrs, a whole pile of evolutionary creative responses emerged. People refashioned their craft in a moment of intense creativity that demanded different ways of working, an adaptive approach to the craft of making games – small, fast, indie. An event like the 48hrs forces participants' attention onto the process as much as the outcome. As one game industry professional taking part in a challenge for the first time observed: there are three paths in the genesis from idea to finished work – the path that focuses on mechanics, the path that focuses on team structure and roles, and the path that focuses on the idea, the spirit – and the more successful teams put the spirit of the work first and foremost. The spirit drives the adaptation; it becomes improvisation. As Schön says: "Improvisation consists in varying, combining and recombining a set of figures within the schema which bounds and gives coherence to the performance." (1983, p55). This improvisational approach is all about those making the games: the people and the principles of their creative process. This documentation evidences the intensity of their passion, determination and the shit that they are prepared to put themselves through to achieve their goal – to win a cup full of jellybeans and make a working game in 48hrs. 48hr is a project where, on all levels, analogue meets digital. This concept was further explored through the documentation process. All of these pictures were taken with a 1945 Leica III camera. The use of this classic, film-based camera gives the images a granularity and depth; this older, slower technology exposes the very human moments of digital creativity. ____________________________ Schön, D. A. 1983, The Reflective Practitioner: How Professionals Think in Action, Basic Books, New York.
Abstract:
Airports worldwide represent key forms of critical infrastructure, in addition to serving as nodes in the international aviation network. While the continued operation of airports is critical to the functioning of reliable air passenger and freight transportation, these infrastructure systems face a number of sources of disturbance that threaten their operational viability. Recent examples of high-magnitude events include the eruption of Iceland's Eyjafjallajökull volcano (Folattau and Schofield 2010), the failure of multiple systems at the opening of Heathrow's Terminal 5 (Brady and Davies 2010) and the 2007 Glasgow airport terrorist attack (Crichton 2008). While these newsworthy events do occur, a multitude of lower-level, more common disturbances also have the potential to cause significant discontinuity to airport operations. Regional airports face a unique set of challenges, particularly in a nation like Australia where they serve to link otherwise remote and isolated communities to metropolitan hubs (Wheeler 2005), often without the resources and political attention received by larger capital city airports. This paper discusses conceptual relationships between Business Continuity Management (BCM) and High Reliability Theory, and proposes BCM as an appropriate risk-based management process to ensure continued airport operation in the face of uncertainty. In addition, it argues that correctly implemented BCM can lead to highly reliable organisations. This is framed within the broader context of critical infrastructures and the need for adequate crisis management approaches suited to their unique requirements (Boin and McConnell 2007).
Abstract:
Many corporations and individuals realize that environmental sustainability is an urgent problem to address. In this chapter, we contribute to the emerging academic discussion by proposing two innovative approaches for engaging in the development of environmentally sustainable business processes. First, we describe an extended process modeling approach for capturing and documenting the carbon dioxide emissions produced during the execution of a business process; for illustration, we apply this approach to the case of a government Shared Service provider. Second, we introduce an analysis method for measuring the carbon dioxide emissions produced during the execution of a business process; to illustrate this approach, we apply it in the real-life case of a European airport and show how this information can be leveraged in the re-design of "green" business processes.
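To make the measurement idea concrete, here is a minimal sketch of activity-level carbon accounting: each activity in an executed process trace is annotated with an emission factor, and the trace total is aggregated. The activity names, emission factors, and log format are hypothetical illustrations, not the chapter's actual method or data.

```python
# Sketch: per-activity emission factors applied to an executed process trace.
from dataclasses import dataclass

# kg CO2 per executed activity instance (hypothetical factors).
EMISSION_FACTORS = {
    "check_in_passenger": 0.02,
    "screen_baggage": 0.15,
    "tow_aircraft": 12.5,
}

@dataclass
class ActivityInstance:
    name: str
    count: int = 1  # number of executions recorded in the trace

def trace_emissions_kg(trace):
    """Total CO2 (kg) attributed to one executed process trace."""
    return sum(EMISSION_FACTORS[a.name] * a.count for a in trace)

trace = [ActivityInstance("check_in_passenger", 180),
         ActivityInstance("screen_baggage", 180),
         ActivityInstance("tow_aircraft", 1)]
print(f"trace total: {trace_emissions_kg(trace):.1f} kg CO2")
```

Aggregating such per-trace totals across process variants is what would let a re-design effort compare the carbon footprint of alternative "green" process designs.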
Abstract:
Nowadays, Workflow Management Systems (WfMSs) and, more generally, Process Management Systems (PMSs), which are process-aware Information Systems (PAISs), are widely used to support many human organizational activities, ranging from well-understood, relatively stable and structured processes (supply chain management, postal delivery tracking, etc.) to processes that are more complicated, less structured and may exhibit a high degree of variation (health-care, emergency management, etc.). Every aspect of a business process involves a certain amount of knowledge, which may be complex depending on the domain of interest. The adequate representation of this knowledge is determined by the modeling language used. Some processes behave in a way that is well understood, predictable and repeatable: the tasks are clearly delineated and the control flow is straightforward. Recent discussions, however, illustrate the increasing demand for solutions for knowledge-intensive processes, where these characteristics are less applicable. The actors involved in the conduct of a knowledge-intensive process have to deal with a high degree of uncertainty. Tasks may be hard to perform and the order in which they need to be performed may be highly variable. Modeling knowledge-intensive processes can be complex, as it may be hard to capture at design-time what knowledge is available at run-time. In realistic environments, for example, actors lack important knowledge at execution time, or this knowledge can become obsolete as the process progresses. Even if each actor (at some point) has perfect knowledge of the world, it may not be certain of its beliefs at later points in time, since tasks by other actors may change the world without those changes being perceived. Typically, a knowledge-intensive process cannot be adequately modeled by classical, state-of-the-art process/workflow modeling approaches. In some respects there is a lack of maturity when it comes to the semantic aspects involved, both in terms of capturing them and reasoning about them. The main focus of the 1st International Workshop on Knowledge-intensive Business Processes (KiBP 2012) was investigating how techniques from different fields, such as Artificial Intelligence (AI), Knowledge Representation (KR), Business Process Management (BPM), Service Oriented Computing (SOC), etc., can be combined with the aim of improving the modeling and enactment phases of a knowledge-intensive process. KiBP 2012 was held as part of the program of the 2012 Knowledge Representation & Reasoning International Conference (KR 2012) in Rome, Italy, in June 2012. The workshop was hosted by the Dipartimento di Ingegneria Informatica, Automatica e Gestionale Antonio Ruberti of Sapienza Università di Roma, with financial support of the University, through grant 2010-C26A107CN9 TESTMED, and the EU Commission through the projects FP7-25888 Greener Buildings and FP7-257899 Smart Vortex. This volume contains the 5 papers accepted and presented at the workshop. Each paper was reviewed by three members of the internationally renowned Program Committee. In addition, a further paper was invited for inclusion in the workshop proceedings and for presentation at the workshop. Two keynote talks, one by Marlon Dumas (Institute of Computer Science, University of Tartu, Estonia) on "Integrated Data and Process Management: Finally?"
and the other by Yves Lesperance (Department of Computer Science and Engineering, York University, Canada) on "A Logic-Based Approach to Business Processes Customization", completed the scientific program. We would like to thank all the Program Committee members for their valuable work in selecting the papers, Andrea Marrella for his valuable work as publication and publicity chair of the workshop, and Carola Aiello and the consulting agency Consulta Umbria for the organization of this successful event.
Abstract:
Enabling web-based service networks and ecosystems requires a way of describing services by a "commercial envelope", as discussed in Chapter 1. A uniform conception of services across all walks of life (including technical services) is required, capturing business, operational and technical aspects. Our proposed Unified Service Description Language (USDL) therefore draws from and generalizes the best-of-breed approaches presented in Part I. The following chapter presents the design rationale of USDL, where the different aspects are placed in a framework of description requirements. The subsequent chapters of this part then provide details on specific aspects such as pricing or legal issues.
Abstract:
In this paper, the goal of identifying disease subgroups based on differences in observed symptom profiles is considered. Commonly referred to as phenotype identification, solutions to this task often involve the application of unsupervised clustering techniques. In this paper, we investigate the application of a Dirichlet Process mixture (DPM) model for this task. This model is defined by placing a Dirichlet Process (DP) prior on the unknown components of a mixture model, allowing uncertainty about the partitioning of observed data into homogeneous subgroups to be expressed. To exemplify this approach, an application to phenotype identification in Parkinson's disease (PD) is considered, with symptom profiles collected using the Unified Parkinson's Disease Rating Scale (UPDRS). Keywords: clustering; Dirichlet Process mixture; Parkinson's disease; UPDRS.
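As a concrete illustration of the clustering machinery, the sketch below uses scikit-learn's truncated variational approximation to a DP mixture (BayesianGaussianMixture with a Dirichlet Process weight prior) on synthetic stand-ins for UPDRS-style symptom profiles. The paper's actual model, data, and inference scheme (e.g. MCMC sampling) may differ.

```python
# Sketch: DP mixture clustering of symptom profiles; the DP prior lets the
# data determine how many of the truncated components receive real weight.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Synthetic "symptom profiles": two latent subgroups on a 4-item scale.
X = np.vstack([
    rng.normal(loc=[1, 1, 2, 1], scale=0.5, size=(60, 4)),
    rng.normal(loc=[3, 4, 3, 4], scale=0.5, size=(40, 4)),
])

dpm = BayesianGaussianMixture(
    n_components=10,                                   # truncation level
    weight_concentration_prior_type="dirichlet_process",
    weight_concentration_prior=1.0,                    # DP concentration alpha
    covariance_type="full",
    random_state=0,
).fit(X)

labels = dpm.predict(X)                                # hard subgroup labels
active = np.flatnonzero(dpm.weights_ > 0.01)           # components actually used
print(f"clusters with weight > 1%: {active.size}")
print("mixture weights:", np.round(dpm.weights_[active], 3))
```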
Abstract:
Much of our understanding of human thinking is based on probabilistic models. This innovative book by Jerome R. Busemeyer and Peter D. Bruza argues that, actually, the underlying mathematical structures from quantum theory provide a much better account of human thinking than traditional models. They introduce the foundations for modelling probabilistic-dynamic systems using two aspects of quantum theory. The first, "contextuality", is a way to understand interference effects found with inferences and decisions under conditions of uncertainty. The second, "entanglement", allows cognitive phenomena to be modelled in non-reductionist ways. Employing these principles drawn from quantum theory allows us to view human cognition and decision in a totally new light...
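The "interference" idea can be made concrete with a few lines of arithmetic. Below is a minimal sketch, assuming two complex amplitudes for two indistinguishable paths to the same judgment; the probability of the outcome picks up a cross term that the classical law of total probability lacks. The amplitude values are arbitrary illustrations, not the authors' models.

```python
# Sketch: quantum interference vs classical total probability.
import cmath

a1 = 0.6 * cmath.exp(1j * 0.0)   # amplitude via path 1
a2 = 0.5 * cmath.exp(1j * 2.0)   # amplitude via path 2 (relative phase 2 rad)

classical = abs(a1) ** 2 + abs(a2) ** 2   # sum of path probabilities
quantum = abs(a1 + a2) ** 2               # adds cross term 2*Re(a1*conj(a2))
print(f"classical: {classical:.3f}, quantum: {quantum:.3f}, "
      f"interference: {quantum - classical:+.3f}")
```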
Abstract:
Background: Total hip arthroplasty (THA) is a commonly performed procedure and numbers are increasing with ageing populations. One of the most serious complications of THA is surgical site infection (SSI), caused by pathogens entering the wound during the procedure. SSIs are associated with a substantial burden for health services, increased mortality and reduced functional outcomes in patients. Numerous approaches to preventing these infections exist, but there is no gold standard in practice and the cost-effectiveness of alternative strategies is largely unknown.
Objectives: The aim of this project was to evaluate the cost-effectiveness of strategies claiming to reduce deep surgical site infections following total hip arthroplasty in Australia. The objectives were:
1. Identification of competing strategies, or combinations of strategies, that are clinically relevant to the control of SSI related to hip arthroplasty
2. Evidence synthesis and pooling of results to assess the volume and quality of evidence claiming to reduce the risk of SSI following total hip arthroplasty
3. Construction of an economic decision model incorporating cost and health outcomes for each of the identified strategies
4. Quantification of the effect of uncertainty in the model
5. Assessment of the value of perfect information among model parameters to inform future data collection
Methods: The literature relating to SSI in THA was reviewed, in particular to establish definitions of these concepts, understand mechanisms of aetiology and microbiology, risk factors, diagnosis and consequences, as well as to give an overview of existing infection prevention measures. Published economic evaluations on this topic were also reviewed and limitations for Australian decision-makers identified. A Markov state-transition model was developed for the Australian context and subsequently validated by clinicians. The model was designed to capture key events related to deep SSI occurring within the first 12 months following primary THA. Relevant infection prevention measures were selected by reviewing clinical guideline recommendations combined with expert elicitation. Strategies selected for evaluation were the routine use of pre-operative antibiotic prophylaxis (AP) versus no use of antibiotic prophylaxis (No AP), or AP in combination with antibiotic-impregnated cement (AP & ABC) or laminar air operating rooms (AP & LOR). The best available evidence for clinical effect size and utility parameters was harvested from the medical literature using reproducible methods. Queensland hospital data were extracted to inform patients' transitions between model health states and the related costs captured in assigned treatment codes. Costs related to infection prevention were derived from reliable hospital records and expert opinion. Uncertainty in model input parameters was explored in probabilistic sensitivity analyses and scenario analyses, and the value of perfect information was estimated.
Results: The cost-effectiveness analysis was performed from a health services perspective using a hypothetical cohort of 30,000 THA patients aged 65 years. The baseline rate of deep SSI was 0.96% within one year of a primary THA. The routine use of antibiotic prophylaxis (AP) was highly cost-effective and resulted in cost savings of over $1.6m whilst generating an extra 163 QALYs (without consideration of uncertainty). Deterministic and probabilistic analysis (considering uncertainty) identified antibiotic prophylaxis combined with antibiotic-impregnated cement (AP & ABC) as the most cost-effective strategy. Using AP & ABC generated the highest net monetary benefit (NMB), an incremental $3.1m NMB compared to using antibiotic prophylaxis alone, with a very low error probability (<5%) that this strategy might not have the largest NMB. Not using antibiotic prophylaxis (No AP) or combining antibiotic prophylaxis with laminar air operating rooms (AP & LOR) resulted in worse health outcomes and higher costs. Sensitivity analyses showed that the model was sensitive to the initial cohort starting age and the additional costs of ABC, but the best strategy did not change, even for extreme values. The cost-effectiveness improved for a higher proportion of cemented primary THAs and higher baseline rates of deep SSI. The value of perfect information indicated that no additional research is required to support the model conclusions.
Conclusions: Preventing deep SSI with antibiotic prophylaxis and antibiotic-impregnated cement has been shown to improve health outcomes among hospitalised patients, save lives and enhance resource allocation. By implementing a more beneficial infection control strategy, scarce health care resources can be used more efficiently to the benefit of all members of society. The results of this project provide Australian policy makers with key information about how to efficiently manage risks of infection in THA.
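The decision rule underlying these results can be illustrated in a few lines. Below is a minimal sketch of a net-monetary-benefit comparison under parameter uncertainty: NMB = λ × QALYs − cost per strategy, with the probability that each strategy is optimal estimated across probabilistic sensitivity analysis draws. The willingness-to-pay threshold, distributions, and all numbers are illustrative assumptions, not the thesis's actual model inputs.

```python
# Sketch: NMB ranking of infection-control strategies under uncertainty.
import numpy as np

rng = np.random.default_rng(42)
LAMBDA = 50_000          # willingness to pay per QALY (assumed)
N = 10_000               # probabilistic sensitivity analysis draws

# Per-patient (cost, QALY) draws per strategy -- hypothetical values.
strategies = {
    "No AP":    (rng.normal(20_500, 600, N), rng.normal(0.820, 0.010, N)),
    "AP":       (rng.normal(20_400, 500, N), rng.normal(0.825, 0.010, N)),
    "AP & ABC": (rng.normal(20_300, 500, N), rng.normal(0.827, 0.010, N)),
}

nmb = {name: LAMBDA * q - c for name, (c, q) in strategies.items()}
best = np.vstack(list(nmb.values())).argmax(axis=0)   # optimal strategy per draw
for i, name in enumerate(nmb):
    print(f"{name:9s} mean NMB: {nmb[name].mean():12,.0f}  "
          f"P(optimal): {(best == i).mean():.2f}")
```

The "error probability" reported in the thesis corresponds to 1 − P(optimal) for the best strategy under this kind of draw-by-draw comparison.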
Abstract:
This study contributes to the understanding of the contribution of financial reserves to sustaining nonprofit organisations. Recognising the limited recent Australian research in the area of nonprofit financial vulnerability, it specifically examines financial reserves held by signatories to the Code of Conduct of the Australian Council for International Development (ACFID) for the years 2006 to 2010. As this period includes the Global Financial Crisis, it presents a unique opportunity to observe the role of savings in a period of heightened financial threats to sustainability. The need for nonprofit entities to maintain reserves, while appearing intuitively evident, is neither unanimously accepted nor supported by established theoretic constructs. Some early frameworks attempt to explain the savings behaviour of nonprofit organisations and its role in organisational sustainability. Where researchers have considered the issue, its treatment has usually been either purely descriptive or, alternatively, peripheral to a broader attempt to predict financial vulnerability. Given the importance of nonprofit entities to civil society, the sustainability of these organisations during times of economic contraction, such as the recent Global Financial Crisis, is a significant issue. Widespread failure of nonprofits, or even the perception of failure, would directly affect not only those individuals who access their public goods and services, but also public confidence in both government and the sector's ability to manage and achieve its purpose. This study attempts to 'shine a light' on the paradox inherent in considering nonprofit savings. On the one hand, a prevailing public view is that nonprofit organisations should not hoard and indeed should spend all of their funds on the direct achievement of their purposes. Against this is the commonsense need for a financial buffer, if only to allow for the day-to-day contingencies of pay rises and cost increases. At the entity level, the extent of reserves accumulated (or not) is an important consideration for Management Boards. The general public are also interested in knowing the level of funds held by nonprofits, as a measure of both their commitment to purpose and their effectiveness. There is a need to communicate the level and prevalence of reserve holdings, balancing the prudent hedging of uncertainty against a sense of resource hoarding in the minds of donors. Finally, funders (especially governments) are interested in knowing the appropriate level of reserves to facilitate the ongoing sustainability of the sector. This is particularly so where organisations are involved in the provision of essential public goods and services. At a scholarly level, the study seeks to provide a rationale for this behaviour within the context of appropriate theory. At a practical level, the study seeks to give an indication of the drivers for savings and the actual levels of reserves held within the sector studied, as well as an indication as to whether the presence of reserves mitigated the effects of financial turmoil during the Global Financial Crisis. The argument is not whether there is a need to ensure the sustainability of nonprofits, but rather how it is to be done, and whether the holding of reserves (net assets) is an essential element in achieving this.
While the study offers no simple answers, it does appear that the organisations studied present as two groups: the 'savers', who build reserves and keep 'money in the bank', and the 'spender-delivers', who put their resources 'on the ground'. To progress an understanding of this dichotomy, the study suggests a need to move from its current approach to one that more closely explores, through accounts-based empirical work, donor attitudes and nonprofit Management Board strategy.