943 results for Question-answering systems
Abstract:
The objective of this research was to develop a question prompt list aimed at increasing question asking and reducing the unmet information needs of adults with primary brain tumours, and to pilot the question prompt list to determine its suitability for the intended population. Thematic analysis of existing resources was used to create a draft which was refined via interviews with 12 brain tumour patients and six relatives, readability testing and review by health professionals. A non-randomised before–after pilot study with 20 brain tumour patients was used to assess the acceptability and usefulness of the question prompt list, compared with a ‘standard brochure’, and the feasibility of evaluation strategies. The question prompt list developed covered seven main topics (diagnosis, prognosis, symptoms and changes, treatment, support, after treatment finishes and the health professional team). Pilot study participants provided with the question prompt list agreed that it was helpful (7/7), contained questions that were useful to them (7/7) and prompted them to ask their medical oncologist questions (5/7). The question prompt list is acceptable to patients and contains questions relevant to them. Research is now needed to assess its effectiveness in increasing question asking and reducing unmet information needs.
Abstract:
The Cardiac Access-Remoteness Index of Australia (Cardiac ARIA) used geographic information systems (GIS) to model population-level, road-network accessibility to cardiac services before and after a cardiac event for all 20,387 population localities in Australia. The index ranged from 1A (access to all cardiac services within 1 h driving time) to 8E (limited or no access). The methodology derived an objective geographic measure of accessibility to required cardiac services across Australia. Approximately 71% of the 2006 Australian population had very good access to acute hospital services and services after hospital discharge. This GIS model could be applied to other regions or health conditions where spatially enabled data were available.
Abstract:
This summary is based on an international review of leading peer reviewed journals, in both technical and management fields. It draws on highly cited articles published between 2000 and 2009 to investigate the research question, "What are the diffusion determinants for passive building technologies in Australia?". Using a conceptual framework drawn from the innovation systems literature, this paper synthesises and interprets the literature to map the current state of passive building technologies in Australia and to analyse the drivers for, and obstacles to, their optimal diffusion. The paper concludes that the government has a key role to play through its influence over the specification of building codes.
Abstract:
To ensure infrastructure assets are procured and maintained by government on behalf of citizens, appropriate policy and institutional architecture is needed, particularly if a fundamental shift to more sustainable infrastructure is the goal. The shift in recent years from competitive and resource-intensive procurement to more collaborative and sustainable approaches to infrastructure governance is considered a major transition in infrastructure procurement systems. In order to better understand this transition in infrastructure procurement arrangements, the concept of emergence from Complex Adaptive Systems (CAS) theory is offered as a key construct. Emergence holds that micro interactions can result in emergent macro order. Applying the concept of emergence to infrastructure procurement, this research examines how interaction of agents in individual projects can result in different industry structural characteristics. The paper concludes that CAS theory, and particularly the concept of ‘emergence’, provides a useful construct to understand infrastructure procurement dynamics and progress towards sustainability.
Abstract:
Society faces an unprecedented global education challenge to equip professionals with the knowledge and skills to address emerging 21st Century challenges, spanning climate change mitigation through to adaptation measures to deal with issues such as temperature and sea level rise, and diminishing fresh water and fossil fuel reserves. This paper discusses the potential for systemic and synergistic integration of curriculum with campus operations to accelerate curriculum renewal towards ESD, drawing on the authors' experiences within engineering education. The paper begins by providing a brief overview of the need for timely curriculum renewal towards ESD in tertiary education. The paper then highlights some examples of academic barriers that need to be overcome for integration efforts to be successful, and opportunities for promoting the benefits of such integration. The paper concludes by discussing the rationale for planning green campus initiatives within a larger system of curriculum renewal considerations, including awareness raising and developing a common understanding, identifying and mapping graduate attributes, curriculum auditing, content development and strategic renewal, and bridging and outreach.
Abstract:
Air conditioning systems have become an integral part of many modern buildings. Proper design and operation of air conditioning systems have significant impact not only on the energy use and greenhouse gas emissions from the buildings, but also on the thermal comfort and productivity of the occupants. In this paper, the purpose of and need for installing air conditioning systems is first introduced. The methods used for the classification of air conditioning systems are then presented. This is followed by a discussion on the pros and cons of each type of air conditioning system, including both common and new air conditioning technologies. The procedures used to design air conditioning systems are also outlined, and the implications of air conditioning systems, including design, selection, operation and maintenance, on building energy efficiency are also discussed.
Abstract:
Distributed generators (DGs) are defined as generators that are connected to a distribution network. The direction of the power flow and of the short-circuit current in such a network may change compared with a network without DGs, so conventional protective relay schemes no longer meet the requirements of this emerging situation. As the number and capacity of DGs in the distribution network increase, the problem of coordinating protective relays becomes more challenging. Given this background, the protective relay coordination problem in distribution systems is investigated, with directional overcurrent relays taken as an example, and formulated as a mixed integer nonlinear programming problem. A mathematical model describing this problem is first developed, and the well-developed differential evolution algorithm is then used to solve it. Finally, a sample system is used to demonstrate the feasibility and efficiency of the developed method.
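The relay-coordination formulation can be illustrated with a small sketch. The two-relay feeder, IEC standard-inverse curve constants, variable bounds and penalty weight below are all illustrative assumptions, and the hand-rolled differential evolution is a generic textbook variant, not the paper's actual model or solver settings:

```python
import random

# IEC standard-inverse curve: t = TDS * 0.14 / (M**0.02 - 1),
# where M is the fault current as a multiple of the relay's pickup current.
def op_time(tds, m):
    return tds * 0.14 / (m ** 0.02 - 1)

# Hypothetical two-relay feeder: relay 0 is primary, relay 1 its backup.
M = [5.0, 3.0]   # assumed fault-current multiples seen by each relay
CTI = 0.3        # coordination time interval between backup and primary, s

def cost(x):
    t0, t1 = op_time(x[0], M[0]), op_time(x[1], M[1])
    # Penalise violation of the coordination constraint t1 - t0 >= CTI.
    return t0 + t1 + 1e3 * max(0.0, CTI - (t1 - t0))

def differential_evolution(cost, bounds, pop_size=20, gens=200,
                           F=0.5, CR=0.9, seed=1):
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            # Mutate three distinct other members, then crossover per dimension.
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            trial = [min(max(a[d] + F * (b[d] - c[d]), bounds[d][0]),
                         bounds[d][1]) if rng.random() < CR else pop[i][d]
                     for d in range(dim)]
            if cost(trial) <= cost(pop[i]):   # greedy selection
                pop[i] = trial
    return min(pop, key=cost)

best_tds = differential_evolution(cost, [(0.05, 1.0), (0.05, 1.0)])
```

The penalty term turns the constrained coordination problem into an unconstrained search, which is the usual way population-based heuristics such as differential evolution handle relay-coordination constraints.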
Abstract:
Children who have suffered physical or sexual abuse are as vulnerable as adult trauma victims to experiencing "secondary trauma", in which the reactions of the family or broader system exacerbate the child's difficulties. Three clinical cases (a 7-year-old male, an 8-year-old male, and a 7-year-old female) are presented that suggest that this secondary trauma can be made worse either by excessive or insufficient provision of individual child psychotherapy, or by the way the system interprets and reacts to these clinical decisions. Types of secondary trauma and their interactions with clinical decisions are discussed. Ways of framing clinical decisions to minimize the potential secondary trauma are presented.
Abstract:
Background: Previous research identified that primary brain tumour patients have significant psychological morbidity and unmet needs, particularly the need for more information and support. However, the utility of strategies to improve information provision in this setting is unknown. This study involved the development and piloting of a brain tumour specific question prompt list (QPL). A QPL is a list of questions patients may find useful to ask their health professionals, and is designed to facilitate communication and information exchange. Methods: Thematic analysis of QPLs developed for other chronic diseases and brain tumour specific patient resources informed a draft QPL. Subsequent refinement of the QPL involved an iterative process of interviews and review with 12 recently diagnosed patients and six caregivers. Final revisions were made following readability analyses and review by health professionals. Piloting of the QPL is underway using a non-randomised control group trial with patients undergoing treatment for a primary brain tumour in Brisbane, Queensland. Following baseline interviews, consenting participants are provided with the QPL or standard information materials. Follow-up interviews four to six weeks later allow assessment of the acceptability of the QPL, how it is used by patients, impact on information needs, and feasibility of recruitment, implementation and outcome assessment. Results: The final QPL was determined to be readable at the sixth grade level. It contains seven sections: diagnosis, prognosis, symptoms and changes, the health professional team, support, treatment and management, and post-treatment concerns. At this time, fourteen participants have been recruited for the pilot, and data collection completed for eleven. Data collection and preliminary analysis are expected to be completed in time for presentation at the conference.
Conclusions: If acceptable to participants, the QPL may encourage patients, doctors and nurses to communicate more effectively, reducing unmet information needs and ultimately improving psychological wellbeing.
Abstract:
This project investigates machine listening and improvisation in interactive music systems with the goal of improvising musically appropriate accompaniment to an audio stream in real-time. The input audio may be from a live musical ensemble, or playback of a recording for use by a DJ. I present a collection of robust techniques for machine listening in the context of Western popular dance music genres, and strategies of improvisation to allow for intuitive and musically salient interaction in live performance. The findings are embodied in a computational agent – the Jambot – capable of real-time musical improvisation in an ensemble setting. Conceptually the agent’s functionality is split into three domains: reception, analysis and generation. The project has resulted in novel techniques for addressing a range of issues in each of these domains. In the reception domain I present a novel suite of onset detection algorithms for real-time detection and classification of percussive onsets. This suite achieves reasonable discrimination between the kick, snare and hi-hat attacks of a standard drum-kit, with sufficiently low-latency to allow perceptually simultaneous triggering of accompaniment notes. The onset detection algorithms are designed to operate in the context of complex polyphonic audio. In the analysis domain I present novel beat-tracking and metre-induction algorithms that operate in real-time and are responsive to change in a live setting. I also present a novel analytic model of rhythm, based on musically salient features. This model informs the generation process, affording intuitive parametric control and allowing for the creation of a broad range of interesting rhythms. In the generation domain I present a novel improvisatory architecture drawing on theories of music perception, which provides a mechanism for the real-time generation of complementary accompaniment in an ensemble setting. 
All of these innovations have been combined into a computational agent – the Jambot, which is capable of producing improvised percussive musical accompaniment to an audio stream in real-time. I situate the architectural philosophy of the Jambot within contemporary debate regarding the nature of cognition and artificial intelligence, and argue for an approach to algorithmic improvisation that privileges the minimisation of cognitive dissonance in human-computer interaction. This thesis contains extensive written discussions of the Jambot and its component algorithms, along with some comparative analyses of aspects of its operation and aesthetic evaluations of its output. The accompanying CD contains the Jambot software, along with video documentation of experiments and performances conducted during the project.
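The kind of low-latency percussive onset detection described above can be illustrated with a minimal energy-flux detector run on a synthetic signal. This is a generic textbook baseline, not the Jambot's actual algorithm suite; the sample rate, frame size, burst shapes and threshold are all arbitrary assumptions:

```python
import math

SR = 8000  # assumed sample rate

def synth():
    # One second of silence with two decaying percussive bursts.
    sig = [0.0] * SR
    for onset in (2000, 5000):
        for n in range(400):
            sig[onset + n] += math.exp(-n / 60.0) * math.sin(0.3 * n)
    return sig

def onset_frames(sig, frame=256):
    # Frame-wise energy, then half-wave-rectified first difference ("flux").
    energies = [sum(x * x for x in sig[i:i + frame])
                for i in range(0, len(sig) - frame, frame)]
    flux = [max(0.0, e1 - e0) for e0, e1 in zip(energies, energies[1:])]
    thresh = 3.0 * sum(flux) / len(flux)   # crude global threshold
    # Local maxima above the threshold are reported as onset frames.
    return [i + 1 for i in range(1, len(flux) - 1)
            if flux[i] > thresh and flux[i] >= flux[i - 1]
            and flux[i] >= flux[i + 1]]

onsets = onset_frames(synth())
```

Real-time systems like the one described would instead use spectral features and per-band processing to discriminate kick, snare and hi-hat, but the flux-plus-peak-picking skeleton is the same.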
Abstract:
In recent times, light gauge steel framed (LSF) structures, such as cold-formed steel wall systems, are increasingly used, but without a full understanding of their fire performance. Traditionally the fire resistance rating of these load-bearing LSF wall systems is based on approximate prescriptive methods developed from limited fire tests. Very often they are limited to standard wall configurations used by the industry. Increased fire rating is provided simply by adding more plasterboards to these walls. This is not an acceptable situation as it not only inhibits innovation and structural and cost efficiencies but also casts doubt over the fire safety of these wall systems. Hence a detailed fire research study into the performance of LSF wall systems was undertaken using full scale fire tests and extensive numerical studies. A new composite wall panel developed at QUT was also considered in this study, where the insulation was used externally between the plasterboards on both sides of the steel wall frame instead of locating it in the cavity. Three full scale fire tests of LSF wall systems built using the new composite panel system were undertaken at a higher load ratio using a gas furnace designed to deliver heat in accordance with the standard time temperature curve in AS 1530.4 (SA, 2005). Fire tests included the measurements of load-deformation characteristics of LSF walls until failure as well as associated time-temperature measurements across the thickness and along the length of all the specimens. Tests of LSF walls under axial compression load have shown the improvement to their fire performance and fire resistance rating when the new composite panel was used. Hence this research recommends the use of the new composite panel system for cold-formed LSF walls. The numerical study was undertaken using the finite element program ABAQUS.
The finite element analyses were conducted under both steady state and transient state conditions using the measured hot and cold flange temperature distributions from the fire tests. The elevated temperature reduction factors for mechanical properties were based on the equations proposed by Dolamune Kankanamge and Mahendran (2011). These finite element models were first validated by comparing their results with experimental test results from this study and Kolarkar (2010). The developed finite element models were able to predict the failure times to within 5 minutes. The validated model was then used in a detailed numerical study into the strength of cold-formed thin-walled steel channels used in both the conventional and the new composite panel systems to increase the understanding of their behaviour under nonuniform elevated temperature conditions and to develop fire design rules. The measured time-temperature distributions obtained from the fire tests were used. Since the fire tests showed that the plasterboards provided sufficient lateral restraint until the failure of LSF wall panels, this assumption was also used in the analyses and was further validated by comparison with experimental results. Hence in this study of LSF wall studs, only the flexural buckling about the major axis and local buckling were considered. A new fire design method was proposed using AS/NZS 4600 (SA, 2005), NAS (AISI, 2007) and Eurocode 3 Part 1.3 (ECS, 2006). The importance of considering thermal bowing, magnified thermal bowing and neutral axis shift in the fire design was also investigated. A spreadsheet-based design tool was developed based on the above design codes to predict the failure load ratio versus time and temperature for varying LSF wall configurations including insulations. Idealised time-temperature profiles were developed based on the measured temperature values of the studs.
This was used in a detailed numerical study to fully understand the structural behaviour of LSF wall panels. Appropriate equations were proposed to find the critical temperatures for different composite panels, varying in steel thickness, steel grade and screw spacing, for any load ratio. Hence useful and simple design rules were proposed based on the current cold-formed steel structures and fire design standards, and their accuracy and advantages were discussed. The results were also used to validate the fire design rules developed based on AS/NZS 4600 (SA, 2005) and Eurocode 3 Part 1.3 (ECS, 2006). This demonstrated the significant improvements to the design method when compared to the currently used prescriptive design methods for LSF wall systems under fire conditions. In summary, this research has developed comprehensive experimental and numerical thermal and structural performance data for both the conventional and the proposed new load bearing LSF wall systems under standard fire conditions. Finite element models were developed to predict the failure times of LSF walls accurately. Idealised hot flange temperature profiles were developed for non-insulated, cavity and externally insulated load bearing wall systems. Suitable fire design rules and spreadsheet-based design tools were developed based on the existing standards to predict the ultimate failure load, failure times and failure temperatures of LSF wall studs. Simplified equations were proposed to find the critical temperatures for varying wall panel configurations and load ratios. The results from this research are useful to both structural and fire engineers and researchers. Most importantly, this research has significantly improved the knowledge and understanding of cold-formed LSF load-bearing walls under standard fire conditions.
Abstract:
The changing R&D Tax Concession has been touted as the biggest reform to business innovation policy in over a decade. But, is it a changing tax for changing times? This paper addresses this question and further asks ‘what’s tax got to do with it?’. To answer this question, the paper argues that rather than substantive tax reform, the proposed measures simply alter the criteria and means by which companies become eligible for a Federal Government subsidy for qualifying R&D activity. It further argues that when considered as part of the broader innovation agenda, the R&D Tax Concession should be evaluated as a government spending program in the same way as any direct spending on innovation. When this is done, the tax regime is arguably only the administrative policy instrument by which the subsidy is delivered. However, it is proposed that this may not be best practice to distribute those funds fairly, efficiently, and without distortion, while at the same time maintaining adequate government control and accountability. Finally, in answering the question of ‘what’s tax got to do with it?’ the paper concludes that the answer is ‘very little’.
Abstract:
Effective enterprise information security policy management requires review and assessment activities to ensure information security policies are aligned with business goals and objectives. As security policy management involves the elements of the policy development process and the security policy as output, the context for security policy assessment requires goal-based metrics for these two elements. However, current security management assessment methods only provide checklist-style assessments that are predefined by industry best practices and do not allow for developing specific goal-based metrics. Utilizing theories drawn from the literature, this paper proposes the Enterprise Information Security Policy Assessment approach, which expands on the Goal-Question-Metric (GQM) approach. The proposed assessment approach is then applied in a case scenario example to illustrate a practical application. It is shown that the proposed framework addresses the requirement for developing assessment metrics and allows for the concurrent undertaking of process-based and product-based assessment. Recommendations for further research include conducting empirical research to validate the propositions and applying the proposed assessment approach in case studies, providing opportunities to introduce further enhancements to the approach.
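The Goal-Question-Metric structure that the proposed assessment approach builds on can be sketched as a small tree linking one goal to questions and measurable metrics. The goal, questions, metric names, values and targets below are invented for illustration; they are not the paper's actual instrument:

```python
from dataclasses import dataclass, field

# Leaf of the GQM tree: a named measurement with a target value.
@dataclass
class Metric:
    name: str
    value: float
    target: float
    def satisfied(self):
        return self.value >= self.target

# A question is answered by aggregating its metrics.
@dataclass
class Question:
    text: str
    metrics: list = field(default_factory=list)
    def satisfied(self):
        return all(m.satisfied() for m in self.metrics)

# A goal is assessed question by question.
@dataclass
class Goal:
    statement: str
    questions: list = field(default_factory=list)
    def assess(self):
        return {q.text: q.satisfied() for q in self.questions}

goal = Goal(
    "Keep the security policy aligned with business objectives",
    questions=[
        Question("Is the policy reviewed regularly?",
                 [Metric("reviews_per_year", value=2, target=1)]),
        Question("Do staff comply with the policy?",
                 [Metric("compliance_rate", value=0.8, target=0.95)]),
    ],
)
report = goal.assess()
```

The point of the GQM refinement is that each metric is derived from a question, and each question from a goal, so the resulting report is traceable back to the business objective rather than to a generic best-practice checklist.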
Abstract:
A distributed fuzzy system is a real-time fuzzy system in which the input, output and computation may be located on different networked computing nodes. The ability for a distributed software application, such as a distributed fuzzy system, to adapt to changes in the computing network at runtime can provide real-time performance improvement and fault-tolerance. This paper introduces an Adaptable Mobile Component Framework (AMCF) that provides a distributed dataflow-based platform with a fine-grained level of runtime reconfigurability. The execution location of small fragments (possibly as little as a few machine-code instructions) of an AMCF application can be moved between different computing nodes at runtime. A case study is included that demonstrates the applicability of the AMCF to a distributed fuzzy system scenario involving multiple physical agents (such as autonomous robots). Using the AMCF, fuzzy systems can now be developed such that they can be distributed automatically across multiple computing nodes and are adaptable to runtime changes in the networked computing environment. This provides the opportunity to improve the performance of fuzzy systems deployed in scenarios where the computing environment is resource-constrained and volatile, such as multiple autonomous robots, smart environments and sensor networks.
Abstract:
The question of under what conditions conceptual representation is compositional remains debatable within cognitive science. This paper develops a mathematical apparatus for a probabilistic representation of concepts, drawing upon methods from quantum theory to propose a formal test that can determine whether a specific conceptual combination is compositional or not. This test examines a joint probability distribution modeling the combination, asking whether or not it is factorizable. Empirical studies indicate that some combinations should be considered non-compositional.
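The core of such a factorizability test can be illustrated on toy 2x2 joint distributions: a combination is treated as compositional here when the joint distribution factorises into the product of its marginals. The distributions and tolerance below are illustrative assumptions, not the paper's empirical data or its full quantum-theoretic apparatus:

```python
from itertools import product

def is_factorizable(joint, tol=1e-9):
    # joint maps (a, b) outcome pairs to probabilities.
    rows = sorted({a for a, _ in joint})
    cols = sorted({b for _, b in joint})
    # Marginal distributions P(a) and P(b).
    pa = {a: sum(joint[(a, b)] for b in cols) for a in rows}
    pb = {b: sum(joint[(a, b)] for a in rows) for b in cols}
    # Factorizable iff P(a, b) == P(a) * P(b) for every outcome pair.
    return all(abs(joint[(a, b)] - pa[a] * pb[b]) <= tol
               for a, b in product(rows, cols))

# Invented examples: one independent joint, one with correlated outcomes.
independent = {(0, 0): 0.06, (0, 1): 0.14, (1, 0): 0.24, (1, 1): 0.56}
correlated  = {(0, 0): 0.40, (0, 1): 0.10, (1, 0): 0.10, (1, 1): 0.40}
```

Under this criterion the first distribution passes (it equals the product of its marginals) and the second fails, which is the sense in which a combination modelled by the second would be flagged as non-compositional.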