89 results for data quality issues
Abstract:
This paper proposes a distributed control approach to coordinate multiple energy storage units (ESUs) to avoid violations of voltage and thermal constraints, which are among the main power quality challenges for future distribution networks. ESUs are usually connected to a network through voltage source converters. In this paper, both the active and reactive power of the ESU converters are used to address these power quality issues: reactive power is used for voltage support, while active power is used to manage network loading. The proposed method is applied to two typical distribution networks, and simulation results are presented to demonstrate its effectiveness.
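A minimal sketch of the active/reactive power split described above, assuming a simple local droop rule (the gains, limits and setpoints below are illustrative, not the paper's actual controller):

```python
# Hypothetical local droop-style controller for one energy storage unit (ESU).
# Reactive power responds to voltage deviation; active power responds to
# feeder loading. All gains, limits and references are illustrative only.

def esu_setpoints(v_pu, loading_pu, q_max=0.5, p_max=1.0,
                  kq=2.0, kp=1.5, v_ref=1.0, loading_ref=0.8):
    """Return (p, q) converter setpoints in per-unit."""
    # Reactive power for voltage support: absorb when V is high, inject when low.
    q = max(-q_max, min(q_max, kq * (v_ref - v_pu)))
    # Active power for loading management: charge when the feeder is lightly
    # loaded, discharge when loading exceeds the reference.
    p = max(-p_max, min(p_max, kp * (loading_pu - loading_ref)))
    return p, q

print(esu_setpoints(v_pu=1.04, loading_pu=0.95))  # absorb VArs, discharge
```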
Abstract:
Energy policy is driving renewable energy deployment, with most developed countries having some form of renewable energy portfolio standard and emissions reduction target. To deliver on these ambitious targets, commercially available renewable energy technologies such as wind and solar are being deployed, but these inherently suffer from intermittency of supply. To overcome this, storage options will need to be introduced into the distribution network, with benefits for both demand management and power system quality. Determining how storage can be utilised most effectively within the distribution network will allow an even greater proportion of our energy demand to be met from renewable resources and help meet the aspirational targets set. The distribution network will become a network of smart grids, but to work efficiently and effectively, the power quality issues surrounding intermittency must be overcome, with storage being a major factor in this solution.
Abstract:
Police-reported crash data are the primary source of crash information in most jurisdictions. However, the definition of serious injury within police-reported data is not consistent across jurisdictions and may not be accurate. With the Australian National Road Safety Strategy targeting the reduction of serious injuries, there is a greater need to assess the accuracy of the methods used to identify these injuries. A possible source of more accurate information on injury severity is hospital data. While other studies have compared police and hospital data to highlight under-reporting in police-reported data, little attention has been given to the accuracy of the methods used by police to identify serious injuries. The current study aimed to assess how accurately serious injuries are identified in police-reported crash data by comparing the profiles of transport-related injuries in the Queensland Road Crash Database with an aligned sample of data from the Queensland Hospital Admitted Patients Data Collection. Results showed that, while a similar number of traffic injuries was recorded in both data sets, the profile of these injuries differed by gender, age, location, and road user. The results suggest that the ‘hospitalisation’ severity category used by police may not reflect true hospitalisations in all cases. Further, they highlight the wide variety of severity levels within hospitalised cases that are not captured by the current police-reported definitions. While a data linkage study is required to confirm these results, they highlight that reliance on police-reported serious traffic injury data alone could result in inaccurate estimates of the impact and cost of crashes and lead to a misallocation of valuable resources.
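As an illustration of the kind of profile comparison described, a chi-square test of independence can contrast injury distributions across two data sources (the counts below are hypothetical, not the study's data):

```python
# Illustrative comparison of injury profiles between two data sets
# (hypothetical counts; the study's actual data are not reproduced here).
import pandas as pd
from scipy.stats import chi2_contingency

# Rows: age bands; columns: injury counts in each data source.
profile = pd.DataFrame(
    {"police": [310, 540, 290, 160], "hospital": [260, 480, 360, 200]},
    index=["0-16", "17-39", "40-64", "65+"],
)

chi2, p, dof, _ = chi2_contingency(profile.values)
print(f"chi2={chi2:.1f}, dof={dof}, p={p:.4f}")  # differing profiles -> small p
```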
Abstract:
This paper proposes an approach to obtain localisation that is robust to smoke by exploiting multiple sensing modalities: visual and infrared (IR) cameras. The localisation is based on a state-of-the-art visual SLAM algorithm. First, we show that reasonably accurate localisation can be obtained in the presence of smoke by using only an IR camera, a sensor that is hardly affected by smoke, unlike a visual camera operating in the visible spectrum. Second, we demonstrate that improved results can be obtained by combining the information from the two sensor modalities. Third, we show that by detecting the impact of smoke on the visual images using a data quality metric, we can anticipate and mitigate the degradation in localisation performance by discarding the most affected data. The experimental validation presents multiple trajectories estimated by the various methods considered, all thoroughly compared to an accurate dGPS/INS reference.
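A minimal sketch of the quality-gated fallback idea, assuming a simple contrast-based score as a stand-in for the paper's data quality metric:

```python
# Minimal sketch of quality-gated frame selection. The quality score here
# (global contrast) is an assumed stand-in for the paper's actual metric.
import numpy as np

def quality(frame: np.ndarray) -> float:
    """Hypothetical quality score: standard deviation of pixel intensities."""
    return float(frame.std())

def select_frames(visual_frames, ir_frames, threshold=20.0):
    """Prefer the visual frame; fall back to IR when smoke degrades it."""
    for vis, ir in zip(visual_frames, ir_frames):
        yield vis if quality(vis) >= threshold else ir

# Usage with synthetic 8-bit images: a flat (smoke-washed) visual frame
# falls below the threshold, so the IR frame is used instead.
clear = np.random.randint(0, 255, (480, 640), dtype=np.uint8)
smoky = np.full((480, 640), 128, dtype=np.uint8)
ir = np.random.randint(0, 255, (480, 640), dtype=np.uint8)
chosen = list(select_frames([clear, smoky], [ir, ir]))
```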
Abstract:
Creating an authentic assessment that simultaneously assesses competencies, scene management, communication and overall patient care is challenging in the competitive tertiary education market. Increasing student numbers and the cost of evaluating scenario-based competencies reinforce the need for consistent objectivity and timely feedback to students on their performance. The objective structured clinical examination (OSCE) is currently the most flexible approach to competency-based formative and summative assessment and is widely used within paramedic degree programs. Students are understandably compelled to perform well and can be frustrated by not receiving timely and appropriate feedback. Increasingly, a number of products aimed at providing a more efficient and paperless approach have begun to enter the market. These products, it is suggested, are aimed at medicine programs rather than allied health professions, and are limited to one operating system, thereby ignoring issues surrounding equity and accessibility. OSCE Online aims to address this gap in the market and is tailored to these disciplines. The application will provide a service that can be both tailored and standardised from a pre-written bank, depending upon requirements, to fit around the needs of clinical competency assessment. Delivering authentic assessments that address student milestones in their training to become paramedics is the cornerstone of OSCE Online. By not being restricted to a specific device, it will address issues of functionality, adaptability, accessibility, authenticity and, importantly, transparency and accountability by producing contemporaneous data that allow issues to be easily identified and rectified.
Abstract:
Background Most studies examining determinants of rising rates of caesarean section have examined patterns in documented reasons for caesarean over time in a single location. Further insights could be gleaned from cross-cultural research that examines practice patterns in locations with disparate rates of caesarean section at a single time point. Methods We compared both rates of and main reason for pre-labour and intrapartum caesarean between England and Queensland, Australia, using data from retrospective cross-sectional surveys of women who had recently given birth in England (n = 5,250) and Queensland (n = 3,467). Results Women in Queensland were more likely to have had a caesarean birth (36.2%) than women in England (25.1% of births; OR = 1.44, 95% CI = 1.28-1.61), after adjustment for obstetric characteristics. Between-country differences were found for rates of pre-labour caesarean (21.2% vs. 12.2%) but not for intrapartum caesarean or assisted vaginal birth. Compared to women in England, women in Queensland with a history of caesarean were more likely to have had a pre-labour caesarean and more likely to have had an intrapartum caesarean, due only to a previous caesarean. Among women with no previous caesarean, Queensland women were more likely than women in England to have had a caesarean due to suspected disproportion and failure to progress in labour. Conclusions The higher rates of caesarean birth in Queensland are largely attributable to higher rates of caesarean for women with a previous caesarean, and for the main reason of having had a previous caesarean. Variation between countries may be accounted for by the absence of a single, comprehensive clinical guideline for caesarean section in Queensland. Keywords: Caesarean section; Childbirth; Pregnancy; Cross-cultural comparison; Vaginal birth after caesarean; Previous caesarean section; Patient-reported data; Quality improvement
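For reference, an odds ratio and its 95% confidence interval can be computed from a 2x2 table as below. The counts are back-calculated from the reported sample sizes and caesarean rates, so the crude OR computed here differs from the adjusted OR = 1.44 reported after controlling for obstetric characteristics:

```python
# Worked example: odds ratio with a 95% confidence interval from a 2x2 table.
# Counts are derived from the reported rates (36.2% of 3,467; 25.1% of 5,250);
# this yields the crude OR, not the paper's adjusted OR of 1.44.
import math

a, b = 1255, 2212   # Queensland: caesarean, non-caesarean
c, d = 1318, 3932   # England:    caesarean, non-caesarean

or_ = (a * d) / (b * c)
se = math.sqrt(1/a + 1/b + 1/c + 1/d)       # SE of log(OR)
lo = math.exp(math.log(or_) - 1.96 * se)
hi = math.exp(math.log(or_) + 1.96 * se)
print(f"crude OR={or_:.2f}, 95% CI=({lo:.2f}, {hi:.2f})")
```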
Abstract:
This project developed an innovative approach to the smart coordination of available energy resources to improve the level of PV integration in the distribution network. Voltage and loading issues are the main concerns for the future electricity grid and need to be avoided using such resources. A distributed control structure was proposed for the resources in the distribution network to avoid the noted power quality issues.
Abstract:
Variations in the treatment of patients with similar symptoms across different hospitals substantially impact the quality and costs of healthcare. Consequently, it is important to understand the similarities and differences between the practices of different hospitals. This paper presents a case study on the application of process mining techniques to measure and quantify the differences in the treatment of patients presenting with chest pain symptoms across four South Australian hospitals. Our case study focuses on cross-organisational benchmarking of processes and their performance. Techniques such as clustering, process discovery, performance analysis, and scientific workflows were applied to facilitate these comparative analyses. Lessons learned in overcoming unique challenges in cross-organisational process mining, such as ensuring population comparability, data granularity comparability, and experimental repeatability, are also presented.
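As a hedged illustration of per-hospital process discovery for this kind of benchmarking, a sketch using the open-source pm4py library (the paper's actual tooling may differ; file names are hypothetical):

```python
# Illustrative cross-organisational process discovery with pm4py.
# One model per hospital enables side-by-side comparison of practices.
import pm4py

models = {}
for hospital in ["hospital_a", "hospital_b", "hospital_c", "hospital_d"]:
    # Hypothetical per-hospital event logs of chest pain presentations.
    log = pm4py.read_xes(f"{hospital}_chest_pain.xes")
    # Process discovery: one Petri net per hospital.
    net, im, fm = pm4py.discover_petri_net_inductive(log)
    models[hospital] = (net, im, fm)
    # Directly-follows graph with frequencies supports performance analysis.
    dfg, starts, ends = pm4py.discover_dfg(log)
```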
Abstract:
Stormwater pollution is linked to stream ecosystem degradation. Various types of modelling techniques are adopted to predict stormwater pollution. The accuracy of the predictions provided by these models depends on data quality, appropriate estimation of model parameters, and the validation undertaken. Unlike water quantity data, available water quality datasets in urban areas span only relatively short time scales, which limits the applicability of the developed models in engineering and ecological assessment of urban waterways. This paper presents the application of leave-one-out (LOO) and Monte Carlo cross validation (MCCV) procedures in a Monte Carlo framework for the validation and estimation of the uncertainty associated with pollutant wash-off when models are developed using a limited dataset. It was found that MCCV is likely to yield a more realistic measure of model coefficients than LOO. Most importantly, MCCV and LOO were found to be effective for model validation when dealing with small sample sizes, which otherwise hinder detailed model validation and can undermine the effectiveness of stormwater quality management strategies.
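A minimal sketch contrasting the two validation schemes on a small synthetic dataset (scikit-learn's ShuffleSplit serves as MCCV here; the model and data are illustrative only, not the paper's wash-off model):

```python
# Minimal sketch contrasting leave-one-out (LOO) with Monte Carlo cross
# validation (MCCV) on a small synthetic dataset.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, ShuffleSplit, cross_val_score

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (15, 2))          # e.g. rainfall intensity, duration
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.1, 15)

model = LinearRegression()
loo = cross_val_score(model, X, y, cv=LeaveOneOut(),
                      scoring="neg_mean_squared_error")
# MCCV: repeated random train/test splits, here 100 repetitions.
mccv = cross_val_score(model, X, y,
                       cv=ShuffleSplit(n_splits=100, test_size=0.3,
                                       random_state=0),
                       scoring="neg_mean_squared_error")
print(f"LOO MSE: {-loo.mean():.4f}, MCCV MSE: {-mccv.mean():.4f} "
      f"(+/- {mccv.std():.4f})")
```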
Abstract:
Stochastic modelling is critical in GNSS data processing. Currently, GNSS data processing commonly relies on an empirical stochastic model that may not reflect the actual data quality or noise characteristics. This paper examines real-time GNSS observation noise estimation methods that determine the observation variance from a single receiver's data stream. The methods involve three steps: forming a linear combination, handling the ionosphere and ambiguity biases, and estimating the variance. Two distinct approaches are applied to overcome the ionosphere and ambiguity biases: the time-differenced method and the polynomial prediction method, respectively. The real-time variance estimation methods are compared with the zero-baseline and short-baseline methods. The proposed method requires only single-receiver observations and is thus applicable to both differenced and un-differenced data processing modes. However, the methods may be limited to normal ionosphere conditions and GNSS receivers with low-autocorrelation noise. Experimental results also indicate that the proposed method can yield more realistic parameter precision.
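The time-differenced idea can be illustrated as follows: differencing consecutive epochs cancels slowly varying biases such as the ionosphere and ambiguity, and for white noise Var(diff) = 2*sigma^2, so sigma can be recovered from the differenced series (synthetic data, illustrative only):

```python
# Minimal sketch of the time-differenced method: epoch-to-epoch differences
# cancel slowly varying biases, leaving (approximately) doubled white-noise
# variance. Synthetic data; units and magnitudes are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(3600)                        # 1 Hz, one hour
bias = 0.5 * np.sin(2 * np.pi * t / 3600)  # slow ionosphere-like drift (m)
noise = rng.normal(0, 0.003, t.size)       # 3 mm observation noise
obs = bias + noise

diff = np.diff(obs)                        # epoch-to-epoch differences
sigma_hat = np.sqrt(np.var(diff) / 2.0)    # Var(diff) = 2 * sigma^2
print(f"estimated sigma = {sigma_hat * 1000:.2f} mm (true: 3 mm)")
```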
Abstract:
The importance of passenger experience in aviation has become well understood in the last several years. It is now generally accepted that the provision of a good passenger experience is not an option but a necessity from an aviation profitability perspective. In this paper, we paint a picture of the future passenger experience by consolidating a number of industry and research perspectives. Using the future passenger experience as a starting point, we explore the components needed to enable this future vision. From this bottom-up approach, we identify the need to resolve data formatting and data ownership issues. The resolution of these data integration issues is necessary to enable the seamless future travel experience envisioned by the aviation industry. By looking at the passenger experience from this bottom-up, data-centric perspective, we identify a potential shift in the way future passenger terminals will be designed. Whereas the design of terminals is currently largely an architectural practice, in the near future the design of the terminal building may become more of a virtual technology practice. This, of course, will pose a new set of challenges to designers of airport terminal environments.
Abstract:
Purpose of this paper This research aims to examine the effects of inadequate documentation on the cost management and tendering processes in Managing Contractor Contracts, using Fixed Lump Sum as a benchmark. Design/methodology/approach A questionnaire survey was conducted with industry practitioners to solicit their views on documentation quality issues associated with the construction industry. This was followed by a series of semi-structured interviews with the purpose of validating the survey findings. Findings and value The results showed that documentation quality remains a significant issue, contributing to the industry's inefficiency and poor reputation. The level of satisfaction with individual attributes of documentation quality varies. Attributes that do appear to be affected by the choice of procurement method include coordination, buildability, efficiency, completeness and delivery time. Similarly, the use and effectiveness of risk mitigation techniques appear to vary between the methods, based on a number of factors such as documentation completeness, early involvement and fast tracking. Originality/value of paper This research fills a gap in the existing body of knowledge, where there have been few studies on whether the choice of a project procurement system influences documentation quality and the level of that impact. Conclusions Ultimately, the research concludes that the entire project team, including the client and designers, should carefully consider the individual project's requirements and compare them to the trade-offs associated with documentation quality and the procurement method. While documentation quality is definitely an issue to be improved upon, by identifying the project's performance requirements a procurement method can be chosen to maximise the likelihood that those requirements will be met. This allows the aspects of documentation quality considered most important to the individual project to be managed appropriately.
Abstract:
Speculative property developers, criticised for building dog boxes and the slums of tomorrow, are generally hated by urban planners and the public alike. But the doors of state governments are seemingly always open to developers and their lobbyists. Politicians find it hard to say no to the demands of the development industry for concessions because of the contribution housing construction makes to the economic bottom line and because there is a need for well located housing. New supply is also seen as a solution to declining housing affordability. Classical economic theory, however, is too simplistic for housing supply. Instead, an offshoot of Game Theory, Market Design, not only offers greater insight into apartment supply but can also simultaneously address price, design and quality issues. New research reveals that the most significant risk in residential development is settlement risk: when buyers fail to proceed with their purchase despite there being a pre-sale contract. At the point of settlement, the developer has expended all the project funds only to see forecast revenue evaporate. While new buyers may be found, this process is likely to strip the profitability out of the project. As the global financial crisis exposed, buyers are inclined to walk if property values slide. This settlement problem reflects a poor legal mechanism (the pre-sale contract) and a lack of incentive for truthfulness. A second problem is the search cost of finding buyers. At around 10% of project costs, pre-sales are more expensive to developers than finance. This is where Market Design comes in.
Abstract:
This research used design science research methods to develop, instantiate, implement, and measure the acceptance of a novel software artefact. The primary purpose of this software artefact was to enhance data collection, improve its quality and enable its capture in classroom environments without distracting from the teaching activity. The artefact set is an iOS app, with supporting web services and technologies designed in response to teacher and pastoral care needs. System analysis and design used Enterprise Architecture methods. The novel component of the iOS app implemented proximity detection to identify the student through their iPad and automatically link to that student's data. The use of this novel software artefact and web services was trialled in a school setting, measuring user acceptance and system utility. This integrated system was shown to improve the accuracy, consistency, completeness and timeliness of captured data and the utility of the input and reporting systems.
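A hedged sketch of the server-side lookup implied by the proximity design, assuming a simple HTTP service; the actual artefact's web services, identifiers and schema are not published here:

```python
# Illustrative server-side sketch: a beacon/device identifier reported by
# the iOS app is resolved to the student's record before data capture.
# Service, endpoint and registry below are hypothetical.
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical registry populated from the school's enrolment system.
BEACON_TO_STUDENT = {"ipad-beacon-0042": {"student_id": "S1023",
                                          "name": "Example Student"}}

@app.route("/api/resolve", methods=["POST"])
def resolve_student():
    beacon_id = request.get_json(force=True).get("beacon_id")
    student = BEACON_TO_STUDENT.get(beacon_id)
    if student is None:
        return jsonify({"error": "unknown beacon"}), 404
    # The caller then attaches captured observations to this student record.
    return jsonify(student)

if __name__ == "__main__":
    app.run()
```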