747 results for statistical framework
Abstract:
Renewable energy is commonly considered a technological addition to urban environments. By contrast, this PhD research used a holistic approach to develop a design framework for integrating local electricity production into the ecological function and cultural use of public space. The framework addresses social engagement related to public interaction, and economic engagement related to the estimated quantity of electricity produced, in conjunction with environmental engagement related to the embodied energy required to construct the renewable energy infrastructure. The outcomes will contribute to social and environmental change by engaging society, enriching the local economy and strengthening social networks.
Abstract:
Baby Boomers are a generation of lifelong association joiners, but following generations prefer spontaneous and episodic volunteering. This trend is apparent not only during natural disasters, but in most other spheres of volunteering. Legal liability for such volunteers is a growing concern, which, if unresolved, may dampen civic participation. We critically examine the current treatment of these liabilities through legislation, insurance and risk management.
Abstract:
This article considers analytical techniques for measuring and planning railway capacity expansion activities. A preliminary mathematical framework involving track duplication and section subdivisions is proposed for this task. These features have been included because they have a great effect on railway network performance; additional motivation arises from the limitations of prior models, which have not included them.
Abstract:
Quantifying the stiffness properties of soft tissues is essential for the diagnosis of many cardiovascular diseases such as atherosclerosis. In these pathologies it is widely agreed that arterial wall stiffness is an indicator of vulnerability. The present paper focuses on the carotid artery and proposes a new inversion methodology for deriving the stiffness properties of the wall from cine-MRI (magnetic resonance imaging) data. We address this problem by setting up a cost function defined as the distance between the modeled pixel signals and the measured ones. Minimizing this cost function yields the unknown stiffness properties of both the arterial wall and the surrounding tissues. The sensitivity of the identified properties to various sources of uncertainty is studied. Validation of the method is performed on a rubber phantom. The elastic modulus identified using the developed methodology has a mean error of 9.6%. The method is then applied to two young healthy subjects as a proof of practical feasibility, with identified values of 625 kPa and 587 kPa for one carotid artery of each subject.
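The inversion idea in this abstract can be sketched in miniature: pick the stiffness that minimizes a least-squares cost between modeled and measured signals. The thin-wall forward model, geometry, pressures, and grid-search minimizer below are all illustrative assumptions, not the paper's actual MRI signal model or optimizer.

```python
# Toy inversion: recover wall stiffness E by minimizing a least-squares cost.
# Forward model, numbers, and grid search are assumptions for illustration only.

def forward_displacement(E_kpa, pressures_kpa, radius_mm=3.0, thickness_mm=0.6):
    """Toy thin-wall model: radial displacement ~ P * r^2 / (E * h)."""
    return [p * radius_mm**2 / (E_kpa * thickness_mm) for p in pressures_kpa]

def cost(E_kpa, pressures_kpa, measured_mm):
    """Sum of squared distances between modeled and measured signals."""
    modeled = forward_displacement(E_kpa, pressures_kpa)
    return sum((m - d) ** 2 for m, d in zip(modeled, measured_mm))

def identify_stiffness(pressures_kpa, measured_mm, lo=100.0, hi=1500.0, steps=1400):
    """Grid-search minimizer standing in for a proper optimizer."""
    grid = [lo + i * (hi - lo) / steps for i in range(steps + 1)]
    return min(grid, key=lambda E: cost(E, pressures_kpa, measured_mm))

if __name__ == "__main__":
    true_E = 625.0  # kPa, one of the values reported in the abstract
    pressures = [10.0, 12.0, 14.0, 16.0]  # synthetic pulse pressures
    measured = forward_displacement(true_E, pressures)  # noise-free "measurements"
    print(identify_stiffness(pressures, measured))  # → 625.0
```

With noisy real data the cost has no exact zero, which is why the paper studies sensitivity to various sources of uncertainty.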
Abstract:
This paper proposes new metrics and a performance-assessment framework for vision-based weed and fruit detection and classification algorithms. In order to compare algorithms, and make a decision on which one to use for a particular application, it is necessary to take into account that the performance obtained in a series of tests is subject to uncertainty. Such characterisation of uncertainty seems not to be captured by the performance metrics currently reported in the literature. Therefore, we pose the problem as a general problem of scientific inference, which arises out of incomplete information, and propose as a metric of performance the (posterior) predictive probabilities that the algorithms will provide a correct outcome for target and background detection. We detail the framework through which these predictive probabilities can be obtained, which is Bayesian in nature. As an illustrative example, we apply the framework to the assessment of performance of four algorithms that could potentially be used in the detection of capsicums (peppers).
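A minimal sketch of such a posterior predictive metric: assuming each test image is an independent Bernoulli trial with a Beta(1, 1) prior on the algorithm's success probability, the posterior predictive probability of a correct outcome on the next image has the closed form (successes + 1) / (trials + 2). The trial counts below are invented, not results from the paper.

```python
# Posterior predictive probability of a correct detection under a Beta prior.
# Beta(1, 1) prior and the trial counts are illustrative assumptions.

def posterior_predictive_correct(successes, trials, a=1.0, b=1.0):
    """Posterior predictive P(next outcome correct) under a Beta(a, b) prior."""
    return (a + successes) / (a + b + trials)

# Compare two hypothetical detectors evaluated on the same 50 test images.
p1 = posterior_predictive_correct(45, 50)  # 45/50 correct detections
p2 = posterior_predictive_correct(40, 50)  # 40/50 correct detections
print(round(p1, 3), round(p2, 3))
```

Note that with no data the metric is 0.5, not undefined, which is exactly the kind of uncertainty-aware behaviour a raw success rate lacks.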
Abstract:
Purpose – Business models to date have remained the creation of management; however, it is the belief of the authors that designers should be critically approaching, challenging and creating new business models as part of their practice. This belief portrays a new era where business model constructs become the new design brief of the future and fuel design and innovation to work together at the strategic level of an organisation. Design/methodology/approach – The purpose of this paper is to explore and investigate business model design. The research followed a deductive structured qualitative content analysis approach utilizing a predetermined categorization matrix. The analysis of forty business cases uncovered commonalities of key strategic drivers behind these innovative business models. Findings – Five business model typologies were derived from this content analysis, from which quick prototypes of new business models can be created. Research limitations/implications – Implications from this research suggest there is no "one right" model; rather, through experimentation, the generation of many unique and diverse concepts can result in greater possibilities for future innovation and sustained competitive advantage. Originality/value – This paper builds upon the emerging research and exploration into the importance and relevance of dynamic, design-driven approaches to the creation of innovative business models. These models aim to synthesize knowledge gained from real-world examples into a tangible, accessible and thought-provoking framework that provides new prototyping templates to aid the process of business model experimentation.
Abstract:
The export of sediments from coastal catchments can have detrimental impacts on estuaries and nearshore reef ecosystems such as the Great Barrier Reef. Catchment management approaches aimed at reducing sediment loads require monitoring to evaluate their effectiveness in reducing loads over time. However, load estimation is not a trivial task due to the complex behaviour of constituents in natural streams, the variability of water flows and often a limited amount of data. Regression is commonly used for load estimation and provides a fundamental tool for trend estimation by standardising for other time-specific covariates such as flow. This study investigates whether load estimates and the resultant power to detect trends can be enhanced by (i) modelling the error structure so that temporal correlation can be better quantified, (ii) making use of predictive variables, and (iii) identifying an efficient and feasible sampling strategy that may be used to reduce sampling error. To achieve this, we propose a new regression model that includes an innovative compounding-errors model structure and uses two additional predictive variables (average discounted flow and turbidity). By combining this modelling approach with a new, regularly optimised sampling strategy, which adds uniformity to the event sampling strategy, the predictive power was increased to 90%. Using the enhanced regression model proposed here, it was possible to detect a trend of 20% over 20 years. This result is in stark contrast to previous conclusions presented in the literature.
Abstract:
We consider the development of statistical models for prediction of constituent concentration of riverine pollutants, which is a key step in load estimation from frequent flow rate data and less frequently collected concentration data. We consider how to capture the impacts of past flow patterns via the average discounted flow (ADF), which discounts the past flux based on the time elapsed - more recent fluxes are given more weight. However, the effectiveness of ADF depends critically on the choice of the discount factor, which reflects the unknown environmental cumulating process of the concentration compounds. We propose to choose the discount factor by maximizing the adjusted R² values or the Nash-Sutcliffe model efficiency coefficient. The R² values are adjusted to take account of the number of parameters in the model fit. The resulting optimal discount factor can be interpreted as a measure of constituent exhaustion rate during flood events. To evaluate the performance of the proposed regression estimators, we examine two different sampling scenarios by resampling fortnightly and opportunistically from two real daily datasets, which come from two United States Geological Survey (USGS) gaging stations located in the Des Plaines River and Illinois River basins. The generalized rating-curve approach produces biased estimates of the total sediment loads by -30% to 83%, whereas the new approaches produce much lower biases, ranging from -24% to 35%. This substantial improvement in the estimates of the total load is due to the fact that predictability of concentration is greatly improved by the additional predictors.
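The ADF construction and the discount-factor search can be sketched as follows. The recursive exponentially-weighted form, the synthetic data, and the use of a plain squared correlation in place of the paper's adjusted R² / Nash-Sutcliffe criteria are simplifying assumptions.

```python
# Sketch of average discounted flow (ADF) with a grid search for the discount
# factor. EWMA form, data, and the fit criterion are illustrative assumptions.

def adf_series(flows, discount):
    """ADF_t = discount * ADF_{t-1} + (1 - discount) * flow_t (recursive form)."""
    out, acc = [], flows[0]
    for q in flows:
        acc = discount * acc + (1 - discount) * q
        out.append(acc)
    return out

def r_squared(x, y):
    """Squared Pearson correlation, standing in for adjusted R-squared."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

def best_discount(flows, concentrations, grid=None):
    """Choose the discount factor that best explains the concentrations."""
    grid = grid or [i / 100 for i in range(1, 100)]
    return max(grid, key=lambda d: r_squared(adf_series(flows, d), concentrations))

if __name__ == "__main__":
    flows = [5.0, 20.0, 8.0, 30.0, 12.0, 25.0, 6.0, 18.0]  # synthetic daily flows
    conc = adf_series(flows, 0.8)  # pretend concentrations track ADF, delta = 0.8
    print(best_discount(flows, conc))
```

In this noise-free toy the search recovers the discount factor used to generate the concentrations; with real data the criterion would be the adjusted R² or Nash-Sutcliffe coefficient of the full regression fit.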
Abstract:
Power calculation and sample size determination are critical in designing environmental monitoring programs. The traditional approach based on comparing mean values may become statistically inappropriate, and even invalid, when substantial proportions of the response values are below the detection limits or censored, because strong distributional assumptions have to be made on the censored observations when implementing the traditional procedures. In this paper, we propose a quantile methodology that is robust to outliers and can also handle data with a substantial proportion of below-detection-limit observations without the need to impute the censored values. As a demonstration, we applied the methods to a nutrient monitoring project, which is a part of the Perth Long-Term Ocean Outlet Monitoring Program. In this example, the sample size required by our quantile methodology is, in fact, smaller than that required by the traditional t-test, illustrating the merit of our method.
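The key reason a quantile approach can sidestep imputation can be shown in a few lines: provided the censored fraction stays below the quantile level, the sample quantile is identical whatever value is imputed for the below-detection-limit (BDL) observations. The detection limit and data below are made-up numbers, not the Perth monitoring data.

```python
# Why quantiles tolerate censoring: imputing BDL values as 0 or as half the
# detection limit leaves the upper quantile unchanged. Data are invented.

import math

def sample_quantile(values, q):
    """Order-statistic quantile: the ceil(q*n)-th smallest value."""
    s = sorted(values)
    return s[math.ceil(q * len(s)) - 1]

detection_limit = 1.0
# None marks a censored (BDL) nutrient measurement; 3 of 10 values censored.
raw = [1.2, 3.4, None, 2.2, None, 5.1, 1.8, 2.9, None, 4.0]

# Two very different imputations for the BDL values...
as_zero = [0.0 if v is None else v for v in raw]
as_half = [detection_limit / 2 if v is None else v for v in raw]

# ...yet the 0.8 quantile is the same, because 30% censored < 80% level.
print(sample_quantile(as_zero, 0.8), sample_quantile(as_half, 0.8))
```

A mean-based comparison, by contrast, changes with every choice of imputed value, which is where the strong distributional assumptions enter.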
Abstract:
Mapping and evaluating a student's progress on placement is a core element of social work education, but scant attention has been paid to how to effectively create and assess student learning and performance. This paper outlines a project undertaken by the Combined Schools of Social Work to develop a common learning and assessment tool that is being used by all social work schools in Victoria. The paper describes how the Common Assessment Tool (CAT) was developed, drawing on the Australian Association of Social Work Practice Standards, leading to seven key learning areas that form the basis of the assessment of a student's readiness for practice. An evaluation of the usefulness of the CAT was completed by field educators, liaison staff, and students, which confirmed that the CAT was a useful framework for evaluating students' learning goals. The feedback also identified a number of problematic features that were addressed in a revised CAT and rating scale.
Abstract:
So far, most Phase II trials have been designed and analysed under a frequentist framework. Under this framework, a trial is designed so that the overall Type I and Type II errors of the trial are controlled at some desired levels. Recently, a number of articles have advocated the use of Bayesian designs in practice. Under a Bayesian framework, a trial is designed so that the trial stops when the posterior probability for the treatment lies within certain prespecified thresholds. In this article, we argue that trials under a Bayesian framework can also be designed to control frequentist error rates. We introduce a Bayesian version of Simon's well-known two-stage design to achieve this goal. We also consider two other errors, which are called Bayesian errors in this article because of their similarities to posterior probabilities. We show that our method can also control these Bayesian-type errors. We compare our method with other recent Bayesian designs in a numerical study and discuss the implications of different designs on error rates. An example of a clinical trial for patients with nasopharyngeal carcinoma is used to illustrate the differences among the designs.
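Checking frequentist error rates for a two-stage design reduces to exact binomial sums, which can be sketched directly. The design parameters (n1=10, r1=1, n=29, r=5) and response rates (p0=0.1, p1=0.3) below are illustrative choices in the style of Simon's design, not the trial described in the abstract.

```python
# Exact frequentist operating characteristics of a two-stage design.
# Design parameters and response rates are illustrative assumptions.

from math import comb

def binom_pmf(k, n, p):
    """Binomial probability of exactly k responses in n patients."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def reject_prob(p, n1, r1, n, r):
    """P(> r1 responses in stage 1 AND > r responses overall) at response rate p."""
    n2 = n - n1
    total = 0.0
    for k1 in range(r1 + 1, n1 + 1):        # stage 1 continues past r1
        for k2 in range(n2 + 1):
            if k1 + k2 > r:                 # final count exceeds r: reject H0
                total += binom_pmf(k1, n1, p) * binom_pmf(k2, n2, p)
    return total

alpha = reject_prob(0.10, n1=10, r1=1, n=29, r=5)  # Type I error under p0 = 0.1
power = reject_prob(0.30, n1=10, r1=1, n=29, r=5)  # power under p1 = 0.3
print(round(alpha, 3), round(power, 3))
```

A Bayesian stopping rule can be evaluated the same way: fix the rule, then sum the binomial probabilities of the sample paths on which it rejects, under p0 and p1, to read off the frequentist error rates it induces.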
Abstract:
Statistical analyses of health program participation seek to address a number of objectives compatible with the evaluation of demand for current resources. In this spirit, a spatial hierarchical model is developed for disentangling patterns in participation at the small area level, as a function of population-based demand and additional variation. For the former, a constrained gravity model is proposed to quantify factors associated with spatial choice and account for competition effects, for programs delivered by multiple clinics. The implications of gravity model misspecification within a mixed effects framework are also explored. The proposed model is applied to participation data from a no-fee mammography program in Brisbane, Australia. Attention is paid to the interpretation of various model outputs and their relevance for public health policy.
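The competition effect of a constrained gravity model can be sketched as follows: each small area's demand is shared among clinics in proportion to attractiveness decayed by distance, so improving one clinic draws participants away from the others. The exponential decay form, parameter values, and data below are invented for illustration.

```python
# Origin-constrained gravity model: rows are small areas, columns are clinics.
# Decay form, beta, and all numbers are illustrative assumptions.

import math

def gravity_flows(demand, attractiveness, distances, beta=1.5):
    """Split each area's demand across clinics by attractiveness * exp(-beta*d)."""
    flows = []
    for i, d_i in enumerate(demand):
        weights = [a * math.exp(-beta * distances[i][j])
                   for j, a in enumerate(attractiveness)]
        total = sum(weights)
        flows.append([d_i * w / total for w in weights])
    return flows

# Two areas, two clinics; the constraint means each area's demand is fully
# allocated, so clinics compete for a fixed pool of participants.
flows = gravity_flows(demand=[100.0, 60.0],
                      attractiveness=[1.0, 2.0],
                      distances=[[1.0, 3.0], [2.0, 1.0]])
print([round(sum(row), 6) for row in flows])  # each row sums to the area demand
```

The "additional variation" in the abstract's hierarchical model would then sit on top of these expected flows as small-area random effects.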
Abstract:
Although statistical data in some developed countries indicate that migrant workers are nearly 30% more likely to have work-related injuries than local workers, no equivalent official injury/incident statistics on the health and safety (H&S) of migrant workers are currently tracked in Australia. With increasing numbers of migrant workers having joined Australia's extractive industries infrastructure and commercial construction industry, this suggests the need for investigation. A particular issue is that a lack of H&S communication is one of the key factors leading to construction industry accidents/incidents, as it prevents workers from effectively receiving H&S training and acquiring H&S information. Migrant workers whose first languages are not English are particularly affected by this problem, and ways are needed to improve their situation. The research aims to do this by evaluating the H&S communication problems of migrant workers and identifying an effective H&S communication structure. An overview of the challenge being addressed by the research is first provided, followed by a description of the research framework and a report of the initial findings, from which recommendations are provided for improving H&S performance in the construction industry.
Abstract:
An Electronic Medical Record (EMR) is a system that has been embraced by healthcare providers worldwide. However, the implementation success of EMRs has varied widely. Studies have identified both barriers to and facilitators for implementing EMRs within healthcare organisations. In Saudi Arabia (SA), the majority of healthcare providers manage patient records manually. As public hospitals are a major provider of health services in SA and have been shown to face more EMR implementation barriers than private hospitals, there is a need for an implementation framework to guide EMR implementation in Saudi public hospitals. This doctoral project therefore aimed to develop an evidence-based EMR implementation framework for public hospitals in SA, informed by those who work at the micro- and macro-implementation levels and by the extant literature, and sensitive to the cultural, resource-related, technological, organisational, and environmental issues of SA.
Abstract:
Pharmaceutical Care is defined as "the responsible provision of drug therapy for the purpose of achieving definite outcomes that improve a patient's quality of life". One of the fundamental concepts in understanding needs for pharmaceutical care is Drug-Related Problems (DRPs). As the complexity of medication treatment increases, identification of DRPs by healthcare professionals remains vital to patient safety and Quality Use of Medicines (QUM). DRPs have been used by many researchers to evaluate QUM in different settings. DRPs, however, present a list of potential problems, not a strategic framework for assessing a medication regimen.