412 results for Requirements elicitation techniques
Abstract:
We compare the consistency of choices in two methods used to elicit risk preferences, at both the aggregate and the individual level. We asked subjects to choose twice from a list of nine decisions between two lotteries, as introduced by Holt and Laury (2002, 2005), alternating with nine decisions using the budget approach introduced by Andreoni and Harbaugh (2009). We find that while the results are (roughly) consistent at the aggregate (subject-pool) level, behaviour at the individual (within-subject) level is far from consistent. Within each method as well as across methods we observe low correlations. This again calls into question the reliability of experimental risk elicitation measures and the ability to use results from such methods to control for the risk aversion of subjects when explaining effects in other experimental games.
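The within-subject consistency examined here can be quantified, for instance, as the correlation between each subject's switch points across the two rounds of a Holt-Laury style list. A minimal sketch follows; the subject data and the choice of Spearman's rank correlation are illustrative assumptions, not the authors' procedure.

```python
# Minimal sketch: quantifying within-subject consistency across two
# rounds of a Holt-Laury style price list as the rank correlation of
# switch points (number of safe choices out of nine). Data are synthetic.
import numpy as np
from scipy.stats import spearmanr

round1 = np.array([4, 6, 5, 7, 3, 8, 5, 4, 6, 2])  # safe choices, round 1
round2 = np.array([5, 4, 7, 6, 4, 6, 3, 5, 8, 3])  # safe choices, round 2

# Aggregate (subject-pool) comparison: the two means can be very close...
print(f"pool means: {round1.mean():.1f} vs {round2.mean():.1f}")
# ...while the individual (within-subject) correlation stays low.
rho, p = spearmanr(round1, round2)
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")
```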
Abstract:
Purpose. To compare radiological records of 90 consecutive patients who underwent cemented total hip arthroplasty (THA) with or without use of the Rim Cutter to prepare the acetabulum. Methods. The acetabulum of 45 patients was prepared using the Rim Cutter, whereas the device was not used in the other 45 patients. Postoperative radiographs were evaluated using a digital templating system to measure (1) the positions of the operated hips with respect to the normal, contralateral hips (the centre of rotation of the socket, the height of the centre of rotation above the teardrop, and the lateralisation of the centre of rotation from the teardrop) and (2) the uniformity and width of the cement mantle in the 3 DeLee and Charnley acetabular zones, as well as the number of radiolucencies in these zones. Results. The study group showed improved radiological parameters: hips were closer to the anatomic centre of rotation both vertically (1.5 vs. 3.7 mm, p<0.001) and horizontally (1.8 vs. 4.4 mm, p<0.001), and had consistently thicker and more uniform cement mantles (p<0.001). There were 2 radiolucent lines in the control group but none in the study group. Conclusion. The Rim Cutter resulted in more accurate placement of the centre of rotation of a cemented prosthetic socket and produced a thicker, more congruent cement mantle with fewer radiolucent lines.
Abstract:
Sound tagging has been studied for years. Among all sound types, music, speech, and environmental sound are the three most active research areas. This survey aims to provide an overview of the state-of-the-art development in these areas. We begin by discussing the meaning of tagging in the different sound areas. Some examples of sound tagging applications are introduced to illustrate the significance of this research. Typical tagging techniques include manual, automatic, and semi-automatic approaches. After reviewing work in music, speech, and environmental sound tagging, we compare them and summarise the research progress to date. Research gaps are identified for each area, and the common features of and differences between the three areas are discussed as well. Published datasets, tools used by researchers, and evaluation measures frequently applied in the analysis are listed. Finally, we summarise the worldwide distribution of countries engaged in sound tagging research over the years.
Abstract:
Critically ill patients receiving extracorporeal membrane oxygenation (ECMO) are often noted to have increased sedation requirements, yet data on sedation in this complex group of patients are limited. The aim of our study was to characterise the sedation requirements of adult patients receiving ECMO for cardiorespiratory failure. A retrospective chart review was performed to collect sedation data for 30 consecutive patients who received venovenous or venoarterial ECMO between April 2009 and March 2011. To test for a difference in doses over time we used a regression model. The dose of midazolam received on ECMO support increased by an average of 18 mg per day (95% confidence interval 8 to 29 mg, P=0.001), while the dose of morphine increased by 29 mg per day (95% confidence interval 4 to 53 mg, P=0.021). The venovenous group received a daily midazolam dose that was 157 mg higher than the venoarterial group (95% confidence interval 53 to 261 mg, P=0.005). We did not observe any significant increase in fentanyl doses over time (95% confidence interval 1269 to 4337 µg, P=0.94). There is a significant increase in dose requirements for morphine and midazolam during ECMO, and patients on venovenous ECMO received higher sedative doses than patients on venoarterial ECMO. Future research should focus on the mechanisms behind these changes and identify the drugs most suitable for sedation during ECMO.
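For illustration, the time trend reported above can be estimated as an ordinary least squares slope of daily dose on ECMO day, in the spirit of the regression model the authors describe. The dose series below is synthetic; the authors' actual model specification is not given in the abstract.

```python
# Minimal sketch: estimating the average daily change in sedative dose
# with an OLS slope. The dose series is synthetic and purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
days = np.arange(1, 11)                              # ECMO day 1..10
midazolam_mg = 40 + 18 * days + rng.normal(0, 15, size=days.size)

slope, intercept = np.polyfit(days, midazolam_mg, 1)  # dose = a + b*day
print(f"estimated increase: {slope:.1f} mg/day")
```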
Abstract:
Noradrenaline, which occurs naturally in the body, binds to beta-adrenoceptors on the heart, causing the heart to beat faster and with greater force in response to increased demand. This enables the heart to provide oxygenated blood to vital organs. Prolonged overstimulation by noradrenaline, however, can be harmful to the heart and lead to the progression of heart disease. In these circumstances beta-adrenoceptors are blocked with drugs called beta-blockers, which block the effects of noradrenaline by binding to the same site on the beta-adrenoceptor. Some beta-blockers, such as CGP12177, can also cause increases in heart rate; it was therefore proposed that CGP12177 binds at a different site to noradrenaline. The aim of this study was to determine where CGP12177 binds on the beta-adrenoceptor. The results revealed a separate binding site, named beta-1-low. These findings may lead to the development of improved beta-blockers for the management of heart conditions.
Abstract:
CTAC2012 was the 16th biennial Computational Techniques and Applications Conference, held at Queensland University of Technology from 23 to 26 September 2012. The ANZIAM Special Interest Group in Computational Techniques and Applications is responsible for the CTAC meetings, the first of which was held in 1981.
Abstract:
An optical system which performs the multiplication of binary numbers is described and proof-of-principle experiments are performed. The simultaneous generation of all partial products, optical regrouping of bit products, and optical carry look-ahead addition are novel features of the proposed scheme which takes advantage of the parallel operations capability of optical computers. The proposed processor uses liquid crystal light valves (LCLVs). By space-sharing the LCLVs one such system could function as an array of multipliers. Together with the optical carry look-ahead adders described, this would constitute an optical matrix-vector multiplier.
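As a pure-software illustration of the arithmetic this optical scheme parallelises, the sketch below forms all partial products simultaneously and combines them using the carry look-ahead recurrence c[i+1] = g[i] OR (p[i] AND c[i]). It is a hypothetical analogue of the two arithmetic steps, not a model of the LCLV hardware.

```python
# Minimal sketch: binary multiplication via simultaneous partial-product
# generation plus carry look-ahead addition (bit lists are little-endian).
def carry_lookahead_add(a_bits, b_bits):
    """Add two equal-length bit lists using the look-ahead recurrence."""
    n = len(a_bits)
    g = [a & b for a, b in zip(a_bits, b_bits)]   # generate signals
    p = [a ^ b for a, b in zip(a_bits, b_bits)]   # propagate signals
    carry = [0] * (n + 1)
    for i in range(n):
        # c[i+1] = g[i] | (p[i] & c[i]); hardware derives all carries from
        # g and p in parallel, which is the source of the speed-up.
        carry[i + 1] = g[i] | (p[i] & carry[i])
    return [p[i] ^ carry[i] for i in range(n)] + [carry[n]]

def multiply(x, y, width=4):
    xb = [(x >> i) & 1 for i in range(width)]
    yb = [(y >> i) & 1 for i in range(width)]
    # Step 1: all partial products are formed at once (the parallel step).
    partials = [[0] * j + [xi & yj for xi in xb] for j, yj in enumerate(yb)]
    # Step 2: accumulate the rows with the carry look-ahead adder.
    total = [0] * (2 * width)
    for row in partials:
        row = row + [0] * (len(total) - len(row))  # align bit positions
        total = carry_lookahead_add(total, row)
    return sum(bit << i for i, bit in enumerate(total))

assert multiply(13, 11) == 143  # 1101 x 1011
print(multiply(13, 11))
```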
Abstract:
Introduction: Undergraduate students studying the Bachelor of Radiation Therapy at Queensland University of Technology (QUT) attend clinical placements at a number of department sites across Queensland. To ensure that the curriculum prepares students for the most common treatments and current techniques in use in these departments, a curriculum matching exercise was performed. Methods: A cross-sectional census was performed on a pre-determined "Snapshot" date in 2012. This was undertaken by the clinical education staff in each department, who used a standardized proforma to count the number of patients as well as prescription, equipment, and technique data for a list of tumour site categories. This information was combined into aggregate anonymized data. Results: All 12 Queensland radiation therapy clinical sites participated in the Snapshot data collection exercise, producing a comprehensive overview of clinical practice on the chosen day. A total of 59 different tumour sites were treated and, as expected, the most common treatment sites were prostate and breast, comprising 46% of patients treated. Data analysis also indicated that intensity-modulated radiotherapy (IMRT) use is relatively high, with 19.6% of patients receiving IMRT treatment on the day. Both IMRT and image-guided radiotherapy (IGRT) indications matched recommendations from the evidence. Conclusion: The Snapshot method proved to be a feasible and efficient method of gathering useful data to inform the curriculum matching exercise.
Abstract:
Traffic congestion has a significant impact on the economy and the environment. Encouraging the use of multi-modal transport (public transport, bicycle, park'n'ride, etc.) has been identified by traffic operators as a good strategy for tackling congestion and its detrimental environmental impacts. A multi-modal, multi-objective trip planner provides users with various multi-modal options optimised for the objectives they prefer (cheapest, fastest, safest, etc.) and has the potential to reduce congestion on both a temporal and a spatial scale. Computing multi-modal, multi-objective trips is a complicated mathematical problem, as it must integrate and utilise a diverse range of large data sets, including both road network information and public transport schedules, while optimising a number of competing objectives, where fully optimising one objective, such as travel time, can adversely affect others, such as cost. The relationship between these objectives can also be quite subjective, as their priorities vary from user to user. This paper first outlines the various data requirements and formats needed for the multi-modal, multi-objective trip planner to operate, including static information about the physical infrastructure within Brisbane as well as real-time and historical data to predict traffic flow on the road network and the status of public transport. It then presents the graph data structures representing the road and public transport networks within Brisbane that are used in the trip planner to calculate optimal routes. This allows for an investigation into the various shortest-path algorithms researched over the last few decades, and provides a foundation for the construction of the multi-modal, multi-objective trip planner through the development of innovative new algorithms that can operate on these large, diverse data sets and competing objectives.
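One common way to make such competing objectives operational is to scalarise each edge's objective vector with user-supplied preference weights and run a standard shortest-path algorithm. The sketch below does this with Dijkstra's algorithm on a toy network; the network, modes, and weights are hypothetical stand-ins, not the Brisbane data sets described above.

```python
# Minimal sketch: weighted-sum scalarisation of a multi-objective network
# plus Dijkstra's algorithm. Toy data only.
import heapq

def best_trip(graph, source, target, weights):
    """graph: node -> list of (neighbour, objective_vector) edges."""
    def scalar(vec):  # collapse (time, cost, ...) into one edge weight
        return sum(w * v for w, v in zip(weights, vec))

    dist = {source: 0.0}
    queue = [(0.0, source)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == target:
            return d
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nxt, vec in graph.get(node, []):
            nd = d + scalar(vec)
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(queue, (nd, nxt))
    return float("inf")

# Edge vectors are (travel_time_min, fare_dollars). A cost-sensitive
# user (fare weighted 5x) is routed via the bus rather than the taxi.
network = {
    "home":    [("station", (10, 0.0)), ("cbd", (25, 30.0))],  # walk / taxi
    "station": [("cbd", (20, 4.5))],                           # bus
}
print(best_trip(network, "home", "cbd", weights=(1.0, 5.0)))   # 52.5
```

Weighted-sum scalarisation is only one design choice: it cannot reach non-convex points of the Pareto front, which is one reason label-correcting algorithms that maintain a full Pareto set per node are also candidates for a planner of this kind.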
Abstract:
This paper presents a new approach for the inclusion of human expert cognition into autonomous trajectory planning for unmanned aerial systems (UASs) operating in low-altitude environments. During typical UAS operations, multiple objectives may exist; therefore, the use of multicriteria decision aid techniques can potentially allow for convergence to trajectory solutions which better reflect overall mission requirements. In that context, additive multiattribute value theory has been applied to optimize trajectories with respect to multiple objectives. A graphical user interface was developed to allow for knowledge capture from a human decision maker (HDM) through simulated decision scenarios. The expert decision data gathered are converted into value functions and corresponding criteria weightings using utility additive theory. The inclusion of preferences elicited from HDM data within an automated decision system allows for the generation of trajectories which more closely represent the candidate HDM decision preferences. This approach has been demonstrated in this paper through simulation using a fixed-wing UAS operating in low-altitude environments.
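The additive multiattribute value model referred to above scores a candidate trajectory as V(t) = Σᵢ wᵢ·vᵢ(xᵢ), where the weights wᵢ and single-criterion value functions vᵢ are derived from the human decision maker's choices. A minimal sketch follows, with hypothetical criteria, weights, and value functions standing in for elicited ones.

```python
# Minimal sketch of an additive multiattribute value function used to
# rank candidate trajectories. Criteria, weights and value functions are
# hypothetical stand-ins for those elicited from the human decision maker.
def additive_value(criteria, weights, value_fns):
    """V(t) = sum_i w_i * v_i(x_i) over the trajectory's criteria values."""
    return sum(w * fn(x) for x, w, fn in zip(criteria, weights, value_fns))

# Criteria per trajectory: (path length km, terrain clearance m, overflights)
value_fns = [
    lambda length: max(0.0, 1 - length / 50),     # shorter is better
    lambda clearance: min(clearance / 150, 1.0),  # more clearance is better
    lambda n: 1.0 / (1 + n),                      # fewer noise overflights
]
weights = [0.5, 0.3, 0.2]  # elicited weights, summing to one

trajectories = {"A": (20, 120, 3), "B": (35, 150, 0)}
ranked = sorted(trajectories, reverse=True,
                key=lambda k: additive_value(trajectories[k], weights, value_fns))
print(ranked)  # ['B', 'A']: B's clearance and quiet routing outweigh its length
```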
Abstract:
In Hill v Robertson Suspension Systems Pty Ltd [2009] QDC 165, McGill DCJ considered the procedural requirements for the service of originating process on a company, and for proving that service for the purpose of obtaining default judgment. The judge's views adopt a strict and technical construction of the requirements for an affidavit of service under r 120(1)(b). Though clearly obiter, they may well affect the approach taken on applications to enter or set aside default judgments in the lower courts. Pending further judicial consideration of the issue, it is suggested that the prudent course is to ensure that the deponent of an affidavit of service effected under s 109X(1)(a) of the Act deposes not only to the location of the registered office of the company but also, at a minimum, provides the source of that information.
Abstract:
One of the primary desired capabilities of any future air traffic separation management system is the ability to provide early conflict detection and resolution effectively and efficiently. In this paper, we consider the risk of conflict as a primary measurement to be used for early conflict detection. This paper focuses on developing a novel approach to assess the impact of different measurement uncertainty models on the estimated risk of conflict. The measurement uncertainty model can be used to represent different sensor accuracy and sensor choices. Our study demonstrates the value of modelling measurement uncertainty in the conflict risk estimation problem and presents techniques providing a means of assessing sensor requirements to achieve desired conflict detection performance.
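To illustrate how a measurement uncertainty model feeds into an estimated risk of conflict, the sketch below perturbs the measured relative position of two aircraft with Gaussian noise (a stand-in for a given sensor's accuracy) and estimates, by Monte Carlo, the probability that the predicted miss distance falls below a separation minimum. The geometry, noise levels, and threshold are hypothetical, not the paper's models.

```python
# Minimal sketch: Monte Carlo estimate of conflict risk under a Gaussian
# measurement-uncertainty model (2D, constant relative velocity).
import numpy as np

rng = np.random.default_rng(42)

def conflict_risk(rel_pos, rel_vel, sigma_pos, horizon=300.0,
                  sep_min=9260.0, samples=100_000):   # sep_min ~ 5 NM in m
    """P(predicted miss distance < sep_min within the look-ahead horizon)."""
    pos = rel_pos + rng.normal(0.0, sigma_pos, size=(samples, 2))
    # Time of closest approach for each noisy sample, clipped to [0, horizon].
    t = np.clip(-(pos @ rel_vel) / (rel_vel @ rel_vel), 0.0, horizon)
    closest = pos + t[:, None] * rel_vel
    return np.mean(np.linalg.norm(closest, axis=1) < sep_min)

# A coarser sensor (larger sigma) turns a clear miss into appreciable risk.
for sigma in (100.0, 1000.0):
    p = conflict_risk(np.array([20000.0, 10000.0]),
                      np.array([-120.0, 0.0]), sigma)
    print(f"sigma = {sigma:6.0f} m -> estimated conflict risk = {p:.3f}")
```

Comparing the estimated risk across sigma values is one simple way of assessing, as the paper proposes, what sensor accuracy is required to achieve a desired conflict detection performance.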
Abstract:
Norms regulate the behaviour of their subjects and define what is legal and what is illegal. Norms typically describe the conditions under which they are applicable and the normative effects that result from their application. Process models, on the other hand, specify how a business operation or service is to be carried out to achieve a desired outcome. Norms can have a significant impact on how business operations are conducted, and they can apply to the whole or part of a business process. For example, they may impose conditions on different aspects of a process (e.g., that tasks be performed in a specific sequence (control flow), at a specific time or within a certain time frame (temporal aspect), or by specific people (resources)). We propose a framework that provides the formal semantics of the normative requirements for determining whether a business process complies with a normative document (where a normative document can be understood in a very broad sense, ranging from internal policies to best-practice policies to statutory acts). We also present a classification of normative requirements based on the notion of different types of obligations and the effects of violating these obligations.
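To make the idea concrete, the sketch below checks a process execution trace against two simple obligation patterns of the kind such a classification distinguishes: an achievement obligation (a task must occur before a deadline event) and a maintenance obligation (a condition must hold throughout an interval). The task names and rules are hypothetical illustrations, not the paper's formal semantics.

```python
# Minimal sketch: checking two obligation types against a process trace.
def complies_achievement(trace, task, deadline):
    """Achievement: task must appear in the trace before the deadline event."""
    cutoff = trace.index(deadline) if deadline in trace else len(trace)
    return task in trace[:cutoff]

def complies_maintenance(trace, forbidden, start, end):
    """Maintenance: forbidden event must not occur between start and end."""
    i, j = trace.index(start), trace.index(end)
    return forbidden not in trace[i:j + 1]

trace = ["receive_order", "credit_check", "approve", "ship", "invoice"]
print(complies_achievement(trace, "credit_check", deadline="approve"))  # True
print(complies_maintenance(trace, "ship", "receive_order", "approve"))  # True
```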
Abstract:
With the explosive growth of resources available through the Internet, information mismatching and overload have become a severe concern for users. Web users are commonly overwhelmed by the huge volume of information and are faced with the challenge of finding the most relevant and reliable information in a timely manner. Personalised information gathering and recommender systems represent state-of-the-art tools for efficient selection of the most relevant and reliable information resources, and interest in such systems has increased dramatically over the last few years. However, web personalisation has not yet been well exploited, and difficulties arise in selecting resources through recommender systems from both a technological and a social perspective. Aiming to promote high-quality research to overcome these challenges, this paper provides a comprehensive survey of recent work and achievements in the areas of personalised web information gathering and recommender systems. The survey covers the concept-based techniques exploited in personalised information gathering and recommender systems.
Abstract:
The configuration of comprehensive Enterprise Systems to meet the specific requirements of an organisation continues, to this day, to consume significant resources, and the consequences of failing implementation projects are severe and may even threaten the organisation's existence. This paper proposes a method which aims at increasing the efficiency of Enterprise Systems implementations. First, we argue that process modelling languages featuring different degrees of abstraction already exist for different user groups and purposes, which makes it necessary to integrate them; we describe how to do this using the meta models of the involved languages. Second, we motivate that an integrated process model based on the integrated meta model needs to be configurable, and we elaborate on the mechanisms by which this model configuration can be achieved. We introduce a business example using SAP modelling techniques to illustrate the proposed method.
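The configuration mechanism can be pictured as switching optional elements of an integrated reference model on or off before the model is used. A minimal sketch under that reading; the element names and the ON/OFF settings are illustrative, not the SAP or meta-model notation used in the paper.

```python
# Minimal sketch: configuring a reference process model by keeping
# mandatory tasks and switching configurable ones ON or OFF.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    configurable: bool = False  # mandatory tasks are always kept

def configure(reference_model, settings):
    """Keep mandatory tasks; keep configurable ones only if switched ON."""
    return [t for t in reference_model
            if not t.configurable or settings.get(t.name) == "ON"]

reference = [Task("create_order"), Task("credit_check", configurable=True),
             Task("export_declaration", configurable=True), Task("deliver")]
domestic = configure(reference, {"credit_check": "ON",
                                 "export_declaration": "OFF"})
print([t.name for t in domestic])  # ['create_order', 'credit_check', 'deliver']
```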