809 results for Modified Super-Time-Stepping
Abstract:
Electronic Health Record (EHR) systems are being introduced to overcome the limitations associated with paper-based and isolated Electronic Medical Record (EMR) systems. This is accomplished by aggregating medical data and consolidating them in one digital repository. Though an EHR system provides obvious functional benefits, there is a growing concern about the privacy and reliability (trustworthiness) of Electronic Health Records. Security requirements such as confidentiality, integrity, and availability can be satisfied by traditional hard security mechanisms. However, measuring data trustworthiness from the perspective of data entry is an issue that cannot be solved with traditional mechanisms, especially since degrees of trust change over time. In this paper, we introduce a Time-variant Medical Data Trustworthiness (TMDT) assessment model to evaluate the trustworthiness of medical data by evaluating the trustworthiness of its sources, namely the healthcare organisation where the data was created and the medical practitioner who diagnosed the patient and authorised entry of this data into the patient’s medical record, with respect to a certain period of time. The result can then be used by the EHR system to manipulate health record metadata to alert medical practitioners relying on the information to possible reliability problems.
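The abstract's time-variant trust idea can be illustrated with a minimal sketch. The scale, decay rule, and aggregation below are illustrative assumptions, not the TMDT model itself: it assumes trust in each source (organisation and practitioner) decays exponentially with the age of the assessment, and combines the two by a simple product.

```python
from math import exp, log

def trust_score(base_trust, age_days, half_life_days=365.0):
    """Decay a source's trust score as the assessment ages.

    base_trust: trust in [0, 1] at entry time (hypothetical scale).
    half_life_days: assumed decay constant; the paper's actual
    time-variant model may use a different function entirely.
    """
    return base_trust * exp(-log(2) * age_days / half_life_days)

def record_trust(org_trust, practitioner_trust, age_days):
    # Combine the two source scores; a plain product is one
    # illustrative aggregation rule, not the paper's.
    return trust_score(org_trust, age_days) * trust_score(practitioner_trust, age_days)
```

An EHR system could attach such a score to record metadata and flag entries whose value falls below a threshold.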
Abstract:
Purpose – The purpose of this paper is to determine consumer perceptions of service quality in wet markets and supermarkets in Hong Kong. Design/methodology/approach – A questionnaire was developed and distributed via a convenience sample to consumers in shopping malls in Causeway Bay, Mong Kok and Tsuen Wan. Findings – The study finds that supermarkets outperformed wet markets across all aspects of service quality as measured by SERVQUAL-P. Research limitations/implications – Implications suggest that wet market vendors are not providing the level of service quality demanded by their customers. In particular, findings suggest that wet market vendors need to improve the visual attractiveness of their stalls, work on making them look more professional and start using more modern equipment. Practical implications – Wet market vendors in conjunction with government representatives need to develop standards of service quality for wet markets across Hong Kong. This is imperative if the wet market model is to survive in what is a highly competitive food retailing industry. Without action, it appears that the supermarketization of the Hong Kong food retailing industry will continue unabated. Originality/value – This paper adds to a small but growing research stream examining service quality in the food retailing industry in Hong Kong. It provides empirical results that guide suggested actions for change.
Abstract:
In this third Quantum Interaction (QI) meeting it is time to examine our failures. One of the weakest elements of QI as a field is its continuing lack of models displaying proper evolutionary dynamics. This paper presents an overview of the modern generalised approach to the derivation of time evolution equations in physics, showing how the notion of symmetry is essential to the extraction of operators in quantum theory. The form that symmetry might take in non-physical models is explored, with a number of viable avenues identified.
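The standard physical derivation the abstract alludes to can be sketched in one step: time-translation symmetry gives a one-parameter unitary group, and its generator is the extracted operator.

```latex
% Time-translation symmetry implies a one-parameter unitary group
% U(t) = e^{-iHt/\hbar} (Stone's theorem), whose self-adjoint
% generator H is the operator "extracted" from the symmetry.
% Differentiating \psi(t) = U(t)\,\psi(0) yields the evolution equation:
i\hbar \frac{\partial}{\partial t}\,\psi(t) = H\,\psi(t)
```

The paper's question is what plays the role of such a symmetry, and hence of the generator, in non-physical QI models.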
Abstract:
In this paper, we use time series analysis to evaluate predictive scenarios using search engine transactional logs. Our goal is to develop models for the analysis of searchers’ behaviors over time and investigate if time series analysis is a valid method for predicting relationships between searcher actions. Time series analysis is a method often used to understand the underlying characteristics of temporal data in order to make forecasts. In this study, we used a Web search engine transactional log and time series analysis to investigate users’ actions. We conducted our analysis in two phases. In the initial phase, we employed a basic analysis and found that 10% of searchers clicked on sponsored links. However, from 22:00 to 24:00, searchers almost exclusively clicked on the organic links, with almost no clicks on sponsored links. In the second and more extensive phase, we used a one-step prediction time series analysis method along with a transfer function method. The period rarely affects navigational and transactional queries, while rates for transactional queries vary during different periods. Our results show that the average length of a searcher session is approximately 2.9 interactions and that this average is consistent across time periods. Most importantly, our findings show that searchers who submit the shortest queries (i.e., in number of terms) click on the highest-ranked results. We discuss implications, including predictive value, and future research.
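The one-step prediction setup the abstract mentions can be sketched with a minimal example. Simple exponential smoothing stands in here for whatever model the study actually fit; the smoothing weight `alpha` is an illustrative choice, not a value from the paper.

```python
def one_step_forecasts(series, alpha=0.3):
    """Simple exponential smoothing, emitting one-step-ahead forecasts.

    Each forecast for time t is made using only observations up to
    t-1, mirroring the one-step prediction evaluation in the study.
    alpha is an assumed smoothing weight, not the paper's.
    """
    forecasts = []
    level = series[0]            # initialise with the first observation
    for y in series:
        forecasts.append(level)  # forecast issued before seeing y
        level = alpha * y + (1 - alpha) * level
    return forecasts
```

Applied to an hourly count of, say, sponsored-link clicks, the gap between each forecast and the observed value measures how predictable searcher actions are over time.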
Abstract:
When complex projects go wrong they can go horribly wrong, with severe financial consequences. We are undertaking research to develop leading performance indicators for complex projects: metrics to provide early warning of potential difficulties. The assessment of success of complex projects can be made by a range of stakeholders over different time scales, against different levels of project results: the project’s outputs at the end of the project; the project’s outcomes in the months following project completion; and the project’s impact in the years following completion. We aim to identify leading performance indicators, which may include both success criteria and success factors, and which can be measured by the project team during project delivery to forecast success as assessed by key stakeholders in the days, months and years following the project. The hope is that the leading performance indicators will act as alarm bells, showing when a project is deviating from plan so that early corrective action can be taken. It may be that different combinations of the leading performance indicators will be appropriate depending on the nature of project complexity. In this paper we develop a new model of project success, whereby success is assessed by different stakeholders over different time frames against different levels of project results. We then relate this to measurements that can be taken during project delivery. A methodology is described to evaluate the early parts of this model, and its implications and limitations are described. This paper describes work in progress.
Abstract:
Purpose – The paper aims to describe a workforce-planning model developed in-house in an Australian university library that is based on rigorous environmental scanning of an institution, the profession and the sector. Design/methodology/approach – The paper uses a case study that describes the stages of the planning process undertaken to develop the Library’s Workforce Plan and the documentation produced. Findings – While it has been found that the process has had successful and productive outcomes, workforce planning is an ongoing process. To remain effective, the workforce plan needs to be reviewed annually in the context of the library’s overall planning program. This is imperative if the plan is to remain current and to be regarded as a living document that will continue to guide library practice. Research limitations/implications – Although a single case study, the work has been contextualized within the wider research into workforce planning. Practical implications – The paper provides a model that can easily be deployed within a library without external or specialist consultant skills, and due to its scalability can be applied at department or wider level. Originality/value – The paper identifies the trends impacting on, and the emerging opportunities for, university libraries and provides a model for workforce planning that recognizes the context and culture of the organization as key drivers in determining workforce planning. Keywords - Australia, University libraries, Academic libraries, Change management, Manpower planning Paper type - Case study
Abstract:
This paper presents a model to estimate travel time using cumulative plots. Three cases are considered: (i) case-Det, using only detector data; (ii) case-DetSig, using detector data and signal controller data; and (iii) case-DetSigSFR, using detector data, signal controller data and saturation flow rate. The performance of the model for different detection intervals is evaluated. It is observed that the detection interval is not critical if signal timings are available: comparable accuracy can be obtained from a larger detection interval with signal timings or from a shorter detection interval without them. The performance for case-DetSig and case-DetSigSFR is consistent, with accuracy generally above 95%, whereas case-Det is highly sensitive to the signal phases in the detection interval and its performance is uncertain if the detection interval is an integral multiple of the signal cycle.
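The core of the cumulative-plot method can be sketched in a few lines: under first-in-first-out flow, the travel time of the n-th vehicle is the horizontal distance between the upstream and downstream cumulative count curves at count n. The input format below (sorted lists of `(time, cumulative_count)` points) is an assumption for illustration; the paper's case-Det/DetSig/DetSigSFR variants differ in how these curves are reconstructed from detector and signal data.

```python
import bisect

def travel_time(upstream, downstream, vehicle_n):
    """Travel time of the n-th vehicle as the horizontal gap between
    cumulative upstream and downstream count curves (FIFO assumed)."""
    def crossing_time(curve, n):
        # first time the cumulative count reaches n (step interpolation)
        counts = [c for _, c in curve]
        i = bisect.bisect_left(counts, n)
        return curve[i][0]
    return crossing_time(downstream, vehicle_n) - crossing_time(upstream, vehicle_n)
```

For example, if the 5th vehicle crosses the upstream detector at t=10 s and the downstream detector at t=30 s, the estimated travel time is 20 s.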
Abstract:
Fatigue and overwork are problems experienced by numerous employees in many industry sectors. Focusing on work-life balance reframes the ‘problem’ of long work hours as an issue of working time duration. Flexible work options achieved by re-organising working time arrangements are key to an organisational response that delivers work-life balance, and usually involve changing the internal structure of work time. This study examines the effect of compressed long weekly working hours and the consequent ‘long break’ on work-life balance. Using Spillover theory and Border theory, this research considers organisational and personal determinants of overwork and fatigue. It concludes that compressed long work hours with a long break provide better work-life balance. Further, a long break allows workers to gain ‘personal time’ and to overcome fatigue.
Abstract:
As the paper’s subtitle suggests, broadband has had a remarkably checkered trajectory in Australia. It was synonymous with the early 1990s information superhighway and seemed to presage a moment in which “content is [to be] king”. It disappeared almost entirely as a public priority in the mid to late 1990s as infrastructure and content were disconnected in services frameworks focused on information and communication technologies. And it came back in the 2000s as a critical infrastructure for innovation and the knowledge economy. But this time content was not king but rather an intermediate input at the service of innovating industries and processes. Broadband was a critical infrastructure for the digitally-based creative industries. Today the quality of the broadband infrastructure in Australia—itself an outcome of these different policy frameworks—is derided as “fraudband”, holding back business, creativity and consumer uptake. In this paper I use the checkered trajectory of broadband on Australian political and policy horizons as a stepping-off point to reflect on the ideas governing these changing governmental and public settings. This history enables me to explore how content and infrastructure are simultaneously connected and disconnected in our thinking. Finally, I make some remarks about the way communication, particularly media communication, has come to be marginally positioned after initially appearing so central.
Abstract:
This article observes a paradox in the recent history of the Special Broadcasting Service. It is argued that, in contrast to the Australian Broadcasting Corporation, the role and general direction of SBS were not extensively debated as part of the ‘culture wars’ that occurred during the years of the Howard government. While that made SBS a less fraught space during that period, it may now be a factor in the comparative lack of support being given by the Rudd Labor government to SBS in comparison with the ABC, as some of the ‘special’ status of SBS has been blunted by its drift towards more mainstream programming and a mixed economy of commercial advertising, as well as government funding.
Abstract:
Aims: Influenza is commonly spread by infectious aerosols; however, detection of viruses in aerosols is not sensitive enough to confirm the characteristics of virus aerosols. The aim of this study was to develop an assay for respiratory viruses sufficiently sensitive to be used in epidemiological studies. Method: A two-step, nested real-time PCR assay was developed for MS2 bacteriophage, and for influenza A and B, parainfluenza 1 and human respiratory syncytial virus. Outer primer pairs were designed to nest each existing real-time PCR assay. The sensitivities of the nested real-time PCR assays were compared to those of existing real-time PCR assays. Both assays were applied in an aerosol study to compare their detection limits in air samples. Conclusions: The nested real-time PCR assays were found to be several logs more sensitive than the real-time PCR assays, with lower levels of virus detected at lower Ct values. The nested real-time PCR assay successfully detected MS2 in air samples, whereas the real-time assay did not. Significance and Impact of the Study: The sensitive assays for respiratory viruses will permit further research using air samples from naturally generated virus aerosols. This will inform current knowledge regarding the risks associated with the spread of viruses through aerosol transmission.
Abstract:
Review of 'The Pineapple Queen', La Boite Theatre Company, published in The Australian, 31 July 2009.