406 results for federated search tool


Relevance: 20.00%

Abstract:

Aim This paper is a report of a study conducted to validate an instrument for measuring advanced practice nursing role delineation in an international contemporary health service context using the Delphi technique. Background Although most countries now have clear definitions and competency standards for nurse practitioners, no such clarity exists for many advanced practice nurse roles, leaving healthcare providers uncertain whether their service needs can or should be met by an advanced practice nurse or a nurse practitioner. The validation of a tool depicting advanced practice nursing is essential for the appropriate deployment of advanced practice nurses. This paper reports the second phase of a three-phase study to develop an operational framework for assigning advanced practice nursing roles. Method An expert panel was established to review the activities in the Strong Model of Advanced Practice Role Delineation tool. Using the Delphi technique, data were collected via an online survey through a series of iterative rounds in 2008. Feedback and statistical summaries of responses were distributed to the panel until the 75% consensus cut-off was obtained. Results After three rounds and modification of five activities, consensus was obtained for validation of the content of this tool. Conclusion The Strong Model of Advanced Practice Role Delineation tool is valid for depicting the dimensions of practice of the advanced practice role in an international contemporary health service context, thereby having the potential to optimize the utilization of the advanced practice nursing workforce.
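The iterative consensus rule described above (feedback and summaries are redistributed until 75% of the panel agrees on each activity) can be sketched as follows. The activity names, rating scale and panel responses are invented for illustration.

```python
def has_consensus(ratings, cutoff=0.75):
    """True when the most common rating reaches the consensus cut-off."""
    top = max(ratings.count(r) for r in set(ratings))
    return top / len(ratings) >= cutoff

# Hypothetical panel responses for two activities under review.
panel_ratings = {
    "activity 1": ["agree", "agree", "agree", "disagree"],     # 3/4 = 75%
    "activity 2": ["agree", "disagree", "agree", "disagree"],  # 2/4 = 50%
}
for activity, ratings in panel_ratings.items():
    verdict = "consensus" if has_consensus(ratings) else "another round"
    print(activity, "->", verdict)
```

Activities below the cut-off would be modified and returned to the panel in the next Delphi round.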

Relevance: 20.00%

Abstract:

The Lane Change Test (LCT) is one of the growing number of methods developed to quantify driving performance degradation brought about by the use of in-vehicle devices. Beyond its validity and reliability, for such a test to be of practical use, it must also be sensitive to the varied demands of individual tasks. The current study evaluated the ability of several recent LCT lateral control and event detection parameters to discriminate between visual-manual and cognitive surrogate In-Vehicle Information System tasks with different levels of demand. Twenty-seven participants (mean age 24.4 years) completed a PC version of the LCT while performing visual search and math problem solving tasks. A number of the lateral control metrics were found to be sensitive to task differences, but the event detection metrics were less able to discriminate between tasks. The mean deviation and lane excursion measures were able to distinguish between the visual and cognitive tasks, but were less sensitive to the different levels of task demand. The other LCT metrics examined were less sensitive to task differences. A major factor influencing the sensitivity of at least some of the LCT metrics could be the type of lane change instructions given to participants. The provision of clear and explicit lane change instructions and further refinement of its metrics will be essential for increasing the utility of the LCT as an evaluation tool.
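The mean-deviation metric mentioned above can be illustrated as the average absolute difference between the driven lateral position and a normative lane-change path sampled at equal intervals. The positions below are invented values, not LCT data.

```python
def mean_deviation(driven, normative):
    """Average absolute lateral offset (metres) from the normative path."""
    assert len(driven) == len(normative)
    return sum(abs(d - n) for d, n in zip(driven, normative)) / len(driven)

normative = [0.0, 0.0, 1.75, 3.5, 3.5]  # idealised lane-change path (m)
driven    = [0.1, 0.2, 1.50, 3.3, 3.6]  # hypothetical driven positions (m)
print(round(mean_deviation(driven, normative), 3))
```

A larger value indicates greater degradation of lateral control, for example while a secondary task competes for attention.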

Relevance: 20.00%

Abstract:

Digital collections are growing exponentially in size as the information age takes a firm grip on all aspects of society. As a result Information Retrieval (IR) has become an increasingly important area of research. It promises to provide new and more effective ways for users to find information relevant to their search intentions. Document clustering is one of the many tools in the IR toolbox and is far from being perfected. It groups documents that share common features. This grouping allows a user to quickly identify relevant information. If these groups are misleading then valuable information can accidentally be ignored. Therefore, the study and analysis of the quality of document clustering is important. With more and more digital information available, the performance of these algorithms is also of interest. An algorithm with a time complexity of O(n²) can quickly become impractical when clustering a corpus containing millions of documents. Therefore, the investigation of algorithms and data structures to perform clustering in an efficient manner is vital to its success as an IR tool. Document classification is another tool frequently used in the IR field. It predicts categories of new documents based on an existing database of (document, category) pairs. Support Vector Machines (SVM) have been found to be effective when classifying text documents. As the algorithms for classification are both efficient and of high quality, the largest gains can be made from improvements to representation. Document representations are vital for both clustering and classification. Representations exploit the content and structure of documents. Dimensionality reduction can improve the effectiveness of existing representations in terms of quality and run-time performance. Research into these areas is another way to improve the efficiency and quality of clustering and classification results. Evaluating document clustering is a difficult task. Intrinsic measures of quality such as distortion only indicate how well an algorithm minimised a similarity function in a particular vector space. Intrinsic comparisons are inherently limited by the given representation and are not comparable between different representations. Extrinsic measures of quality compare a clustering solution to a “ground truth” solution. This allows comparison between different approaches. As the “ground truth” is created by humans it can suffer from the fact that not every human interprets a topic in the same manner. Whether a document belongs to a particular topic or not can be subjective.
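The two families of measures discussed above can be made concrete with a small sketch: distortion (an intrinsic measure, the sum of squared distances to cluster centroids) and purity (a simple extrinsic measure against a hand-made ground truth). The points, assignments and topic labels are illustrative.

```python
def distortion(points, assignment, centroids):
    """Intrinsic: sum of squared distances from each point to its centroid."""
    return sum(
        sum((p - c) ** 2 for p, c in zip(pt, centroids[k]))
        for pt, k in zip(points, assignment)
    )

def purity(assignment, truth):
    """Extrinsic: fraction of documents in the majority class of their cluster."""
    correct = 0
    for k in set(assignment):
        members = [t for a, t in zip(assignment, truth) if a == k]
        correct += max(members.count(t) for t in set(members))
    return correct / len(assignment)

points = [(0.0, 0.0), (0.1, 0.0), (1.0, 1.0), (0.9, 1.1)]   # toy document vectors
assignment = [0, 0, 1, 1]                                    # clustering solution
centroids = {0: (0.05, 0.0), 1: (0.95, 1.05)}
truth = ["sports", "sports", "finance", "finance"]           # human-made labels
print(distortion(points, assignment, centroids), purity(assignment, truth))
```

Note that distortion values are only comparable within one representation, whereas purity lets different approaches be compared, subject to the subjectivity of the ground truth itself.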

Relevance: 20.00%

Abstract:

Value Management (VM) has been proven to provide a structured framework, together with other supporting tools and techniques, that facilitates effective decision-making in many types of projects, thus achieving ‘best value’ for clients. One of the major success factors of VM in achieving better project outcomes for clients is the provision of beneficial input by the multi-disciplinary team members involved in critical decision-making discussions during the early stages of construction projects. This paper describes a doctoral research proposal based on the application of VM in design and build construction projects, focusing especially on the design stage. The research aims to study the effects of implementing VM in design and build construction projects, in particular how well the methodology addresses issues related to cost overruns resulting from poor coordination and the overlooking of critical constructability issues amongst team members in construction projects in Malaysia. It is proposed that through contractors’ early involvement during the design stage, combined with the use of the VM methodology, particularly as a decision-making tool, better optimization of construction cost can be achieved, thus promoting more efficient and effective constructability. The main methods used in this research are a thorough literature study, semi-structured interviews and a survey of major stakeholders, a detailed case study, and a VM workshop with focus group discussions involving construction professionals, in order to explore and possibly develop a framework and a specific methodology for facilitating the successful application of VM within design and build construction projects.

Relevance: 20.00%

Abstract:

Purpose: This paper aims to show that identification of expectations and software functional requirements via consultation with potential users is an integral component of the development of an emergency department patient admissions prediction tool. ---------- Design/methodology/approach: Thematic analysis of semi-structured interviews with 14 key health staff delivered rich data regarding existing practice and future needs. Participants included emergency department staff, bed managers, nurse unit managers, directors of nursing, and personnel from health administration. ---------- Findings: Participants contributed contextual insights on the current system of admissions, revealing a culture of crisis, imbued with misplaced communication. Their expectations and requirements of a potential predictive tool provided strategic data that moderated the development of the Emergency Department Patient Admissions Prediction Tool, based on their insistence that it feature availability, reliability and relevance. In order to deliver these stipulations, participants stressed that it should be incorporated, validated, defined and timely. ---------- Research limitations/implications: Participants were envisaging a concept and use of a tool that was somewhat hypothetical. However, further research will evaluate the tool in practice. ---------- Practical implications: Participants' unsolicited recommendations regarding implementation will not only inform a subsequent phase of the tool evaluation, but are eminently applicable to any process of implementation in a healthcare setting. ---------- Originality/value: The consultative process engaged clinicians and the paper delivers an insider view of an overburdened system, rather than an outsider's observations.

Relevance: 20.00%

Abstract:

The field of collaborative health planning faces significant challenges posed by the lack of effective information, systems and a framework to organise that information. Such a framework is critical in order to make accessible and informed decisions for planning healthy cities. The challenges have been exacerbated by the rise of the healthy cities movement, as a result of which there have been more frequent calls for localised, collaborative and evidence-based decision-making. Some studies suggest that the use of ICT-based tools in health planning may lead to increased collaboration between stakeholders and the community, improved accuracy and quality of the decision-making process, and improved availability of data and information for health decision-makers as well as health service planners. Research has justified the use of decision support systems (DSS) in planning for healthy cities, as these systems have been found to improve the planning process. DSS are information and communication technology (ICT) tools, including geographic information systems (GIS), that provide the mechanisms to help decision-makers and related stakeholders assess complex problems and solve them in a meaningful way. Consequently, it is now more possible than ever before to make use of ICT-based tools in health planning. However, knowledge about the nature and use of DSS within collaborative health planning is relatively limited. In particular, little research has been conducted in terms of evaluating the impact of adopting these tools upon stakeholders, policy-makers and decision-makers within the health planning field. This paper presents an integrated method that has been developed to facilitate informed decision-making in the health planning process. Specifically, the paper describes the participatory process that has been adopted to develop an online GIS-based DSS for health planners. The literature states that the overall aim of DSS is to improve the efficiency of the decisions made by stakeholders, optimising their overall performance and minimising judgmental biases. For this reason, the paper examines the effectiveness and impact of an innovative online GIS-based DSS on health planners. The case study of the online DSS is set within a unique settings-based initiative designed to plan for and improve the health capacity of the Logan-Beaudesert area, Australia: the Logan-Beaudesert Health Coalition (LBHC). The paper outlines the impact of implementing the ICT-based DSS. In conclusion, the paper emphasises the need for the proposed tool to enhance health planning.

Relevance: 20.00%

Abstract:

Groundwater is increasingly recognised as an important yet vulnerable natural resource, and a key consideration in water cycle management. However, communicating sub-surface water system behaviour, an important part of encouraging better water management, is visually difficult. Modern 3D visualisation techniques can be used to communicate these complex behaviours effectively, to engage and inform community stakeholders. Most software developed for this purpose is expensive and requires specialist skills. The Groundwater Visualisation System (GVS) developed by QUT integrates a wide range of surface and sub-surface data to produce a 3D visualisation of the behaviour, structure and connectivity of groundwater/surface water systems. Surface data (elevation, surface water, land use, vegetation and geology) and data collected from boreholes (bore locations and subsurface geology) are combined to visualise the nature, structure and connectivity of these systems. Time-series data (water levels, groundwater quality, rainfall, stream flow and groundwater abstraction) are displayed as animations within the 3D framework, or graphically, to show how water system conditions change over time. GVS delivers an interactive, stand-alone 3D visualisation product that can be used in a standard PC environment; no specialised training or modelling skills are required. The software has been used extensively in the SEQ region to inform and engage water managers and the community alike. Examples will be given of GVS visualisations developed in areas where there have been community concerns around groundwater over-use and contamination.

Relevance: 20.00%

Abstract:

Consider a person searching electronic health records: a search for the term ‘cracked skull’ should return documents that contain the term ‘cranium fracture’. An information retrieval system is required that matches concepts, not just keywords. Furthermore, determining the relevance of a query to a document requires inference – it is not simply a matter of matching concepts. For example, a document containing ‘dialysis machine’ should align with a query for ‘kidney disease’. Collectively, we describe this problem as the ‘semantic gap’ – the difference between the raw medical data and the way a human interprets it. This paper presents an approach to semantic search of health records that combines two previous approaches: an ontological approach using the SNOMED CT medical ontology, and a distributional approach using semantic space (vector space) models. Our approach will be applied to a specific problem in health informatics: the matching of electronic patient records to clinical trials.
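As a toy illustration of the distributional side of this approach, terms can be represented as vectors and compared by cosine similarity, so related terms match even when the strings differ. The vectors below are invented; a real system would learn them from a corpus and combine them with SNOMED CT concept matching, as the paper proposes.

```python
import math

def cosine(u, v):
    """Cosine similarity between two term vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical distributional vectors over three context features.
vectors = {
    "skull":    [0.9, 0.1, 0.0],
    "cranium":  [0.8, 0.2, 0.1],
    "dialysis": [0.0, 0.1, 0.9],
}
print(round(cosine(vectors["skull"], vectors["cranium"]), 3))   # high: related
print(round(cosine(vectors["skull"], vectors["dialysis"]), 3))  # low: unrelated
```

The high skull/cranium score despite zero string overlap is exactly the behaviour keyword matching cannot provide.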

Relevance: 20.00%

Abstract:

This study investigates the application of local search methods to the railway junction traffic conflict-resolution problem, with the objective of attaining a quick and reasonable solution. A procedure based on local search relies on finding a better solution than the current one by searching the neighbourhood of the current one. The structure of the neighbourhood is therefore very important to an efficient local search procedure. In this paper, the formulation of the structure of the solution, which is the right-of-way sequence assignment, is first described. Two new neighbourhood definitions are then proposed and the performance of the corresponding local search procedures is evaluated by simulation. It is shown that they provide similar results but can be used to handle different traffic conditions and system requirements.
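The neighbourhood-based procedure described above can be sketched generically: move to the best neighbour of the current solution until no neighbour improves the cost. The swap-adjacent neighbourhood and the weighted-position cost below are illustrative stand-ins, not the paper's junction-delay model or its two proposed neighbourhoods.

```python
def neighbours(seq):
    """Swap-adjacent neighbourhood: all sequences one transposition away."""
    for i in range(len(seq) - 1):
        n = list(seq)
        n[i], n[i + 1] = n[i + 1], n[i]
        yield tuple(n)

def local_search(seq, cost):
    """Descend to the best neighbour until no neighbour improves the cost."""
    current = tuple(seq)
    while True:
        best = min(neighbours(current), key=cost)
        if cost(best) >= cost(current):
            return current
        current = best

# Stand-in cost: trains with higher weight should get right-of-way earlier.
weights = {"A": 3, "B": 1, "C": 2}

def cost(seq):
    return sum(pos * weights[t] for pos, t in enumerate(seq))

print(local_search(("B", "C", "A"), cost))
```

Changing the `neighbours` generator is all it takes to trial a different neighbourhood structure, which is why that choice dominates the efficiency of the procedure.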

Relevance: 20.00%

Abstract:

Software used by architectural and industrial designers has moved from being a tool for drafting towards use in verification, simulation, project management and remote project sharing. In more advanced models, parameters for the designed object can be adjusted so that a family of variations can be produced rapidly. With advances in computer-aided design technology, numerous design options can now be generated and analyzed in real time. However, the use of digital tools to support design as an activity is still at an early stage and has largely been limited in functionality with regard to the design process. To date, major CAD vendors have not developed an integrated tool that is able both to leverage specialized design knowledge from various discipline domains (known as expert knowledge systems) and to support the creation of design alternatives that satisfy different forms of constraints. We propose that evolutionary computing and machine learning be linked with parametric design techniques to record and respond to a designer’s own way of working and design history. It is expected that this will lead to results that impact future work on design support systems (ergonomics and interface) as well as implicit constraint and problem definition for problems that are difficult to quantify.
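A minimal sketch of the evolutionary side of this proposal: a parameter vector for a designed object is repeatedly mutated and selected against a fitness function that stands in for design constraints. The parameter names, targets and fitness are invented for illustration.

```python
import random

random.seed(42)  # reproducible run

TARGET = {"width": 4.0, "height": 2.5}  # hypothetical constraint targets

def fitness(params):
    """Lower is better: squared distance from the constraint targets."""
    return sum((params[k] - TARGET[k]) ** 2 for k in TARGET)

def mutate(params, scale=0.2):
    """Perturb each design parameter by a small random amount."""
    return {k: v + random.uniform(-scale, scale) for k, v in params.items()}

def evolve(start, generations=200, population=20):
    """Keep the fittest of each generation's mutated candidates."""
    best = start
    for _ in range(generations):
        candidates = [mutate(best) for _ in range(population)] + [best]
        best = min(candidates, key=fitness)
    return best

result = evolve({"width": 1.0, "height": 1.0})
print({k: round(v, 2) for k, v in result.items()})
```

In the proposed setting, the fitness function would instead encode recorded designer preferences and discipline-specific constraints rather than a fixed numeric target.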

Relevance: 20.00%

Abstract:

Currently in Australia, there are no decision support tools for traffic and transport engineers to assess the crash risk potential of proposed road projects at design level. A selection of equivalent tools already exists for traffic performance assessment, e.g. aaSIDRA or VISSIM. The Urban Crash Risk Assessment Tool (UCRAT) was developed for VicRoads by ARRB Group to promote methodical identification of future crash risks arising from proposed road infrastructure, where safety cannot be evaluated based on past crash history. The tool will assist practitioners with key design decisions to arrive at the safest and most cost-optimal design options. This paper details the development and application of the UCRAT software. This professional tool may be used to calculate an expected mean number of casualty crashes for an intersection, a road link or a defined road network consisting of a number of such elements. The mean number of crashes provides a measure of risk associated with the proposed functional design and allows evaluation of alternative options. The tool is based on historical data for existing road infrastructure in metropolitan Melbourne and takes into account the influence of key design features, traffic volumes, road function and the speed environment. Crash prediction modelling and risk assessment approaches were combined to develop its unique algorithms. The tool has application in projects such as road access proposals associated with land use developments, public transport integration projects and new road corridor upgrade proposals.
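UCRAT's own algorithms are not published in this abstract; as a hedged illustration, crash prediction models of the kind it combines often take a power-function form in the traffic volumes. The function and coefficients below are invented placeholders, not calibrated values.

```python
def expected_crashes(major_aadt, minor_aadt, a=1e-4, b1=0.6, b2=0.4):
    """Illustrative expected mean number of casualty crashes per year at an
    intersection, as a power function of major- and minor-road traffic
    volumes (AADT). Coefficients a, b1, b2 are hypothetical."""
    return a * major_aadt ** b1 * minor_aadt ** b2

# Comparing two hypothetical functional design options by predicted risk:
option_a = expected_crashes(major_aadt=20000, minor_aadt=3000)
option_b = expected_crashes(major_aadt=15000, minor_aadt=8000)
print(round(option_a, 3), round(option_b, 3))
```

Evaluating alternative options then reduces to comparing their predicted mean crash numbers, which is the mode of use the paper describes.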

Relevance: 20.00%

Abstract:

As a result of the managerial reforms adopted by government agencies since the 1980s, the stakeholder approach has become more widely accepted as a strategic management tool. However, it remains a difficult and demanding task for agencies to determine who their stakeholders are and to optimise interactions with them. This paper examines how government agencies identify, classify and engage with stakeholders who have competing demands, differing access to resources and the ability to exert political pressure. To do this, the stakeholder approaches of nine agencies at three levels of government in Queensland were studied. This resulted in the development of a Stakeholder Classification Model for Public Agencies, which could be used to create more focused and relevant stakeholder interventions.

Relevance: 20.00%

Abstract:

User-Web interaction has emerged as an important research area in the field of information science. In this study, we examine extensively the Web searching performed by general users. Our goal is to investigate the effects of users' cognitive styles on their Web search behavior in relation to two broad components: Information Searching and Information Processing Approaches. We use questionnaires, a measure of cognitive style, Web session logs and think-aloud protocols as the data collection instruments. Our findings show that wholistic Web users tend to adopt a top-down approach to Web searching, searching first for a generic topic and then reformulating their queries to search for specific information; they tend to prefer reading to process information. Analytic users tend to prefer a bottom-up approach to information searching, and they process information by scanning search result pages.

Relevance: 20.00%

Abstract:

The theory of nonlinear dynamic systems provides new methods to handle complex systems. Chaos theory offers new concepts, algorithms and methods for processing, enhancing and analyzing measured signals. In recent years, researchers have been applying concepts from this theory to bio-signal analysis. In this work, the complex dynamics of bio-signals such as the electrocardiogram (ECG) and electroencephalogram (EEG) are analyzed using the tools of nonlinear systems theory. In the modern industrialized countries, several hundred thousand people die every year due to sudden cardiac death. The electrocardiogram (ECG) is an important bio-signal representing the sum total of millions of cardiac cell depolarization potentials. It contains important insight into the state of health and the nature of the disease afflicting the heart. Heart rate variability (HRV) refers to the regulation of the sinoatrial node, the natural pacemaker of the heart, by the sympathetic and parasympathetic branches of the autonomic nervous system. Heart rate variability analysis is an important tool for observing the heart's ability to respond to the normal regulatory impulses that affect its rhythm. A computer-based intelligent system for analysis of cardiac states is very useful in diagnostics and disease management. Like many bio-signals, HRV signals are non-linear in nature. Higher order spectral analysis (HOS) is known to be a good tool for the analysis of non-linear systems and provides good noise immunity. In this work, we studied the HOS of the HRV signals of normal heartbeat and four classes of arrhythmia. This thesis presents some general characteristics for each of these classes of HRV signals in the bispectrum and bicoherence plots. Several features were extracted from the HOS and subjected to an Analysis of Variance (ANOVA) test. The results are very promising for cardiac arrhythmia classification, with a number of features yielding a p-value < 0.02 in the ANOVA test.
An automated intelligent system for the identification of cardiac health is very useful in healthcare technology. In this work, seven features were extracted from the heart rate signals using HOS and fed to a support vector machine (SVM) for classification. The performance evaluation protocol in this thesis uses 330 subjects consisting of five different kinds of cardiac disease conditions. The classifier achieved a sensitivity of 90% and a specificity of 89%. This system is ready to run on larger data sets. In EEG analysis, the search for hidden information for the identification of seizures has a long history. Epilepsy is a pathological condition characterized by the spontaneous and unforeseeable occurrence of seizures, during which the perception or behavior of patients is disturbed. Automatic early detection of seizure onset would help patients and observers to take appropriate precautions. Various methods have been proposed to predict the onset of seizures based on EEG recordings. The use of nonlinear features motivated by higher order spectra (HOS) has been reported to be a promising approach to differentiate between normal, background (pre-ictal) and epileptic EEG signals. In this work, these features are used to train both a Gaussian mixture model (GMM) classifier and a support vector machine (SVM) classifier. Results show that the classifiers were able to achieve 93.11% and 92.67% classification accuracy, respectively, with selected HOS-based features. About 2 hours of EEG recordings from 10 patients were used in this study. This thesis introduces unique bispectrum and bicoherence plots for various cardiac conditions and for normal, background and epileptic EEG signals. These plots reveal distinct patterns that are useful for visual interpretation by those without a deep understanding of spectral analysis, such as medical practitioners.
The thesis includes original contributions in extracting features from HRV and EEG signals using HOS and entropy, in analyzing the statistical properties of such features on real data, and in automated classification using these features with GMM and SVM classifiers.
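The pipeline shape described above (signal, higher-order feature extraction, classification) can be sketched with a toy example: derive a heart-rate series from RR intervals and compute a normalised third-order moment, a simple cousin of the bispectrum-based features the thesis actually uses with SVM and GMM classifiers. The helper names and RR-interval values below are invented for illustration.

```python
def heart_rate(rr_intervals_s):
    """Instantaneous heart rate (beats/min) from RR intervals in seconds."""
    return [60.0 / rr for rr in rr_intervals_s]

def third_moment(xs):
    """Normalised third-order moment (skewness) of a signal: one simple
    higher-order statistic; the thesis uses full bispectrum/bicoherence."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    return sum((x - mean) ** 3 for x in xs) / (n * var ** 1.5)

normal     = [0.80, 0.82, 0.79, 0.81, 0.80, 0.83]  # regular rhythm (s)
arrhythmic = [0.80, 0.45, 1.20, 0.50, 1.10, 0.48]  # erratic rhythm (s)

for name, rr in [("normal", normal), ("arrhythmic", arrhythmic)]:
    print(name, round(third_moment(heart_rate(rr)), 3))
```

Features of this kind, computed per recording, are what gets fed to the SVM or GMM classifier in the work described.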