885 results for Tool support
Abstract:
The health system is one sector dealing with a deluge of complex data. Many healthcare organisations struggle to utilise these volumes of health data effectively and efficiently, and many still run stand-alone systems that are not integrated for information management and decision-making. This shows there is a need for an effective system to capture, collate and distribute health data; implementing the data warehouse concept in healthcare is therefore potentially one solution for integrating health data. Data warehousing has been used to support business intelligence and decision-making in many other sectors, such as engineering, defence and retail. The research problem addressed is: how can data warehousing assist the decision-making process in healthcare? To address this problem, the investigation was narrowed to focus on a cardiac surgery unit, using the cardiac surgery unit at The Prince Charles Hospital (TPCH) as the case study. The cardiac surgery unit at TPCH uses a stand-alone database of patient clinical data, which supports clinical audit, service management and research functions. However, interaction between the cardiac surgery unit's information system and other units is minimal, with only limited and basic two-way interaction with the other clinical and administrative databases at TPCH that support decision-making processes. The aims of this research are to investigate the decision-making issues healthcare professionals face with the current information systems, and how decision-making might be improved within this healthcare setting by implementing an aligned data warehouse model or models. As part of the research, a suitable data warehouse prototype was proposed and developed based on the cardiac surgery unit's needs, integrating the Intensive Care Unit database, the Clinical Costing unit database (Transition II) and the Quality and Safety unit database [electronic discharge summary (e-DS)], with the goal of improving current decision-making processes. The main objectives of this research are to improve access to integrated clinical and financial data, providing potentially better information for decision-making. Drawing on the questionnaire findings and the consulted literature, the results indicate that a centralised data warehouse model is appropriate for the cardiac surgery unit at this stage. A centralised data warehouse model addresses current needs and can later be upgraded to an enterprise-wide warehouse model or a federated data warehouse model, as discussed in many of the consulted publications. The data warehouse prototype was developed using SAS Enterprise Data Integration Studio 4.2, and the data were analysed using SAS Enterprise Edition 4.3. In the final stage, the data warehouse prototype was evaluated by collecting feedback from end users, using output created from the prototype as examples of the data desired and possible in a data warehouse environment. According to this feedback, implementation of a data warehouse was seen as a useful tool to inform management options, provide a more complete representation of factors related to a decision scenario and potentially reduce information product development time. However, this research faced many constraints.
These included technical issues, such as data incompatibilities and the integration of the cardiac surgery database and e-DS database servers; Queensland Health information restrictions (information-related policies, patient data confidentiality and ethics requirements); limited availability of support from IT technical staff; and time restrictions. These factors influenced the warehouse model development process, necessitating an incremental approach, and highlight the many practical barriers to data warehousing and integration at the clinical service level. Limitations included the use of a small convenience sample of survey respondents and a single-site case study design. As mentioned previously, the proposed data warehouse is a prototype and was developed using only four database repositories. Despite this constraint, the research demonstrates that implementing a data warehouse at the service level supports decision-making and reduces data quality issues related to access and availability, providing many benefits. Output reports produced from the data warehouse prototype demonstrated usefulness for improving decision-making in the management of clinical services, and for quality and safety monitoring for better clinical care. In the future, the centralised model selected can be upgraded to an enterprise-wide architecture by integrating additional hospital units' databases.
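As a rough sketch of the centralised star-schema idea described above, the following Python/SQLite snippet joins clinical and financial records around a single fact table. All table and column names are hypothetical illustrations; the actual prototype was built with SAS Enterprise Data Integration Studio, not Python.

```python
# Minimal sketch of a centralised warehouse: one conformed fact table
# links records drawn from the four source systems (cardiac surgery DB,
# ICU DB, Transition II costing, e-DS). Names are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_patient   (patient_id INTEGER PRIMARY KEY, urn TEXT);
CREATE TABLE dim_procedure (procedure_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_episode (
    episode_id    INTEGER PRIMARY KEY,
    patient_id    INTEGER REFERENCES dim_patient,
    procedure_id  INTEGER REFERENCES dim_procedure,
    icu_hours     REAL,    -- sourced from the ICU database
    total_cost    REAL,    -- sourced from Transition II clinical costing
    discharge_summary_sent INTEGER  -- sourced from the e-DS quality/safety DB
);
""")

cur.execute("INSERT INTO dim_patient VALUES (1, 'URN0001')")
cur.execute("INSERT INTO dim_procedure VALUES (1, 'CABG')")
cur.execute("INSERT INTO fact_episode VALUES (1, 1, 1, 36.5, 42000.0, 1)")

# A decision-support style query spanning clinical and financial data:
# average ICU stay and average cost per procedure type.
for row in cur.execute("""
    SELECT p.name, AVG(f.icu_hours), AVG(f.total_cost)
    FROM fact_episode f JOIN dim_procedure p USING (procedure_id)
    GROUP BY p.name
"""):
    print(row)
```

The point of the single fact table is exactly the one the abstract makes: once clinical and costing records share conformed keys, one query can answer questions that previously required manual collation across stand-alone systems.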
Abstract:
With the growth and development of communication technology there is an increasing need for the use of interception technologies in modern policing. Law enforcement agencies are faced with increasingly sophisticated and complex criminal networks that utilise modern communication technology as a basis for their criminal success. In particular, transnational organised crime (TOC) is a diverse and complicated arena, costing global society in excess of $3 trillion annually, a figure that continues to grow (Borger, 2007) as crime groups take advantage of disappearing borders and greater profit markets. However, whilst communication can be a critical success factor for criminal enterprise, it is also a key vulnerability. It is this vulnerability that the use of communication interception technology (CIT), such as phone taps or email interception, can exploit. As such, law enforcement agencies now need a method and framework that allows them to utilise CIT to combat these crimes efficiently and successfully. This paper provides a review of current literature with the specific purpose of considering the effectiveness of CIT in the fight against TOC and the groundwork that must be laid in order for it to be fully exploited. In doing so, it fills an important gap in current research by focusing on the practical implementation of CIT, as opposed to the traditional area of privacy concerns that arise with intrusive methods of investigation. The findings support the notion that CIT is an essential intelligence-gathering tool with a strong place within the modern policing arena. The paper identifies that the most effective use of CIT is grounded within a proactive, intelligence-led framework, and concludes that for this to happen, Australian authorities and law enforcement agencies must re-evaluate and address the current legislative and operational constraints placed on the use of CIT and the culture that surrounds intelligence in policing.
Abstract:
With the current curriculum focus on correlating classroom problem-solving lessons to real-world contexts, is LEGO robotics an effective problem-solving tool? The present study was designed to investigate this question and to ascertain what problem-solving strategies primary students engaged with when working with LEGO robotics, and whether the students were able to effectively relate their problem-solving strategies to real-world contexts. The qualitative study involved 23 Grade 6 students participating in robotics activities at a Brisbane primary school. The study included data collected from researcher observations of student problem-solving discussions, collected software programs, and data from a student-completed questionnaire. Results from the study indicated that the robotics activities assisted students to reflect on the problem-solving decisions they made. The study also highlighted that the students were able to relate their problem-solving strategies to real-world contexts. The study demonstrated that while LEGO robotics can be considered a useful problem-solving tool in the classroom, careful teacher scaffolding needs to be implemented with regard to correlating LEGO with authentic problem solving. Further research into how teachers can best embed real-world contexts into effective robotics lessons is recommended.
Abstract:
Increasingly, studies are reported that examine how conceptual modeling is conducted in practice. Yet, typically the studies to date have examined in isolation how modeling grammars can be, or are, used to develop models of information systems or organizational processes, without considering that such modeling is typically done by means of a modeling tool that extends the modeling functionality offered by a grammar through complementary features. This paper extends the literature by examining how the use of seven different features of modeling tools affects the usage beliefs users develop when using modeling grammars for process modeling. We show that five distinct tool features positively affect the usefulness, ease-of-use and satisfaction beliefs of users. We offer a number of interpretations of the findings. We also describe how the results inform decisions of relevance to developers of modeling tools as well as managers in charge of making modeling-related investment decisions.
Abstract:
Technologies and languages for integrated processes are a relatively recent innovation, and over that period many divergent waves of innovation have transformed process integration. Like sockets and distributed objects, early workflow systems offered programming interfaces that connected the process modelling layer to any middleware. BPM systems emerged later, connecting the modelling world to middleware through components. While BPM systems increased ease of use (modelling convenience), long-standing and complex interactions involving many process instances remained difficult to model. Enterprise Service Buses (ESBs) followed, connecting process models to heterogeneous forms of middleware. ESBs, however, generally forced modellers to choose a particular underlying middleware and to stick to it, despite their ability to connect with many forms of middleware. Furthermore, ESBs encourage process integrations to be modelled on their own, logically separate from the process model. This can lead to an inability to reason about long-standing conversations at the process layer. Technologies and languages for process integration generally lack formality, which has led to arbitrariness in the underlying language building blocks. Conceptual holes exist in a range of technologies and languages for process integration, and this can lead to customer dissatisfaction and to integration projects failing to reach their potential. Standards for process integration share fundamental flaws similar to those of the languages and technologies, and are also in direct competition with other standards, causing a lack of clarity. Thus the area of greatest risk in a BPM project remains process integration, despite major advancements in the technology base. This research examines some fundamental aspects of communication middleware and how these fundamental building blocks of integration can be brought to the process modelling layer in a technology-agnostic manner. This way, process modelling can be conceptually complete without becoming stuck in a particular middleware technology. Coloured Petri nets are used to define a formal semantics for the fundamental aspects of communication middleware; they provide the means to define and model the dynamic aspects of various integration middleware. Process integration patterns are used as a tool to codify common problems to be solved. Object Role Modelling, a formal modelling technique, was used to define the syntax of a proposed process integration language. This thesis provides several contributions to the field of process integration. It proposes a framework defining the key notions of integration middleware, providing a conceptual foundation upon which a process integration language could be built. It defines an architecture that allows various forms of middleware to be aggregated and reasoned about at the process layer. It provides a comprehensive set of process integration patterns, which constitute a benchmark for the kinds of problems a process integration language must support. Finally, it proposes a process integration modelling language and a partial implementation that is able to enact the language. A process integration pilot project in a German hospital, based on the ideas in this thesis, is briefly described at the end of the thesis.
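To make the Coloured Petri net reading of middleware concrete, here is a minimal hand-rolled sketch in Python (not the thesis's actual formalisation): places hold multisets of coloured tokens, transitions consume and produce tokens, and a send/deliver pair models an asynchronous channel between two processes.

```python
# A toy token-game sketch of how a coloured Petri net can give semantics
# to an asynchronous middleware channel. Tokens carry data ("colours");
# a real CPN semantics would add guards over colour sets and concurrent
# enabling, which this illustration omits.
from collections import Counter

class Place:
    def __init__(self, name):
        self.name, self.tokens = name, Counter()
    def add(self, token, n=1):
        self.tokens[token] += n

class Transition:
    """Fires by consuming one token matching `guard` from `src`
    and producing the `arc`-transformed token in `dst`."""
    def __init__(self, name, src, dst, guard=lambda t: True, arc=lambda t: t):
        self.name, self.src, self.dst = name, src, dst
        self.guard, self.arc = guard, arc
    def fire(self):
        candidates = [t for t, n in self.src.tokens.items()
                      if n > 0 and self.guard(t)]
        if not candidates:
            return False
        token = candidates[0]
        self.src.tokens[token] -= 1
        self.dst.add(self.arc(token))
        return True

# Model: process A sends, the middleware channel buffers, process B receives.
out_q, channel, in_q = Place("A.out"), Place("channel"), Place("B.in")
send    = Transition("send",    out_q,   channel)
deliver = Transition("deliver", channel, in_q)

out_q.add(("msg", 1)); out_q.add(("msg", 2))
while send.fire() or deliver.fire():   # fire interleaved until quiescent
    pass
print(in_q.tokens)   # both messages delivered to B's in-queue
```

The value of this style of model, as the thesis argues, is that the same token-game semantics can describe queues, topics or request/reply channels at the process layer without committing to any particular middleware product.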
Abstract:
Proteases regulate a spectrum of diverse physiological processes, and dysregulation of proteolytic activity drives a plethora of pathological conditions. Understanding protease function is essential to appreciating many aspects of normal physiology and progression of disease. Consequently, development of potent and specific inhibitors of proteolytic enzymes is vital to provide tools for the dissection of protease function in biological systems and for the treatment of diseases linked to aberrant proteolytic activity. The studies in this thesis describe the rational design of potent inhibitors of three proteases that are implicated in disease development. Additionally, key features of the interaction of proteases and their cognate inhibitors or substrates are analysed and a series of rational inhibitor design principles are expounded and tested. Rational design of protease inhibitors relies on a comprehensive understanding of protease structure and biochemistry. Analysis of known protease cleavage sites in proteins and peptides is a commonly used source of such information. However, model peptide substrate and protein sequences have widely differing levels of backbone constraint and hence can adopt highly divergent structures when binding to a protease’s active site. This may result in identical sequences in peptides and proteins having different conformations and diverse spatial distribution of amino acid functionalities. Regardless of this, protein and peptide cleavage sites are often regarded as being equivalent. One of the key findings in the following studies is a definitive demonstration of the lack of equivalence between these two classes of substrate and invalidation of the common practice of using the sequences of model peptide substrates to predict cleavage of proteins in vivo. Another important feature for protease substrate recognition is subsite cooperativity. This type of cooperativity is commonly referred to as protease or substrate binding subsite cooperativity and is distinct from allosteric cooperativity, where binding of a molecule distant from the protease active site affects the binding affinity of a substrate. Subsite cooperativity may be intramolecular where neighbouring residues in substrates are interacting, affecting the scissile bond’s susceptibility to protease cleavage. Subsite cooperativity can also be intermolecular where a particular residue’s contribution to binding affinity changes depending on the identity of neighbouring amino acids. Although numerous studies have identified subsite cooperativity effects, these findings are frequently ignored in investigations probing subsite selectivity by screening against diverse combinatorial libraries of peptides (positional scanning synthetic combinatorial library; PS-SCL). This strategy for determining cleavage specificity relies on the averaged rates of hydrolysis for an uncharacterised ensemble of peptide sequences, as opposed to the defined rate of hydrolysis of a known specific substrate. Further, since PS-SCL screens probe the preference of the various protease subsites independently, this method is inherently unable to detect subsite cooperativity. However, mean hydrolysis rates from PS-SCL screens are often interpreted as being comparable to those produced by single peptide cleavages. 
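The masking effect described here is easy to see with a toy calculation. The sketch below uses invented hydrolysis rates (not data from this thesis) for a two-subsite protease: averaging over mixtures, as a PS-SCL screen effectively does, points to a different sequence than testing each discrete peptide, as an SML screen would.

```python
# Toy numeric illustration (invented rates) of why positional-scanning
# libraries (PS-SCL) can miss cooperative substrates: per-position
# averages can favour a sequence that is not the fastest-cleaved one.
from itertools import product

# Hypothetical hydrolysis rates for a two-subsite protease.
# (B, Y) is a strongly cooperative pair; its high rate is not
# predictable from either residue's average contribution alone.
rates = {("A", "X"): 6.0, ("A", "Y"): 6.0,
         ("B", "X"): 0.0, ("B", "Y"): 10.0}

p1_residues, p2_residues = ("A", "B"), ("X", "Y")

# PS-SCL view: fix one position, average over the mixture at the other.
p1_avg = {r: sum(rates[(r, s)] for s in p2_residues) / len(p2_residues)
          for r in p1_residues}
p2_avg = {s: sum(rates[(r, s)] for r in p1_residues) / len(p1_residues)
          for s in p2_residues}

ps_scl_pick = (max(p1_avg, key=p1_avg.get), max(p2_avg, key=p2_avg.get))
true_best   = max(product(p1_residues, p2_residues), key=rates.get)

print("PS-SCL positional averages:", p1_avg, p2_avg)
print("PS-SCL predicted substrate:", ps_scl_pick, "rate =", rates[ps_scl_pick])
print("True best substrate (SML): ", true_best,  "rate =", rates[true_best])
# PS-SCL picks ('A', 'Y') at rate 6.0; the cooperative ('B', 'Y') at
# rate 10.0 is only found by testing discrete sequences individually.
```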
Before this study, no large systematic evaluation had been made to determine the level of correlation between protease selectivity as predicted by screening against a library of combinatorial peptides and cleavage of individual peptides. This subject is specifically explored in the studies described here. In order to establish whether PS-SCL screens could accurately determine the substrate preferences of proteases, a systematic comparison of data from PS-SCLs with libraries containing individually synthesised peptides (sparse matrix library; SML) was carried out. These SML libraries were designed to include all possible sequence combinations of the residues suggested to be preferred by a protease using the PS-SCL method. SML screening against the three serine proteases kallikrein 4 (KLK4), kallikrein 14 (KLK14) and plasmin revealed highly preferred peptide substrates that could not have been deduced by PS-SCL screening alone. Comparing protease subsite preference profiles from screens of the two types of peptide libraries showed that the most preferred substrates were not detected by PS-SCL screening, as a consequence of intermolecular cooperativity being negated by the very nature of PS-SCL screening. Sequences that are highly favoured as a result of intermolecular cooperativity achieve optimal protease subsite occupancy, and thereby interact with very specific determinants of the protease. Identifying these substrate sequences is important since they may be used to produce potent and selective inhibitors of proteolytic enzymes. This study found that highly favoured substrate sequences that relied on intermolecular cooperativity allowed for the production of potent inhibitors of KLK4, KLK14 and plasmin. Peptide aldehydes based on preferred plasmin sequences produced high-affinity transition state analogue inhibitors for this protease. The most potent of these maintained specificity over plasma kallikrein (known to have a very similar substrate preference to plasmin). Furthermore, the efficiency of this inhibitor in blocking fibrinolysis in vitro was comparable to aprotinin, which previously saw clinical use to reduce perioperative bleeding. One substrate sequence particularly favoured by KLK4 was substituted into the 14-amino-acid circular sunflower trypsin inhibitor (SFTI). This resulted in a highly potent and selective inhibitor (SFTI-FCQR) which attenuated protease-activated receptor signalling by KLK4 in vitro. Moreover, SFTI-FCQR and paclitaxel synergistically reduced growth of ovarian cancer cells in vitro, making this inhibitor a lead compound for further therapeutic development. Similar incorporation of a preferred KLK14 amino acid sequence into the SFTI scaffold produced a potent inhibitor for this protease. However, the conformationally constrained SFTI backbone enforced a different intramolecular cooperativity, which masked a KLK14-specific determinant. As a consequence, the level of selectivity achievable was lower than that found for the KLK4 inhibitor. Standard mechanism inhibitors such as SFTI rely on a stable acyl-enzyme intermediate for high-affinity binding. This is achieved by a conformationally constrained canonical binding loop that allows for reformation of the scissile peptide bond after cleavage. Amino acid substitutions within the inhibitor to target a particular protease may compromise structural determinants that support the rigidity of the binding loop, and thereby prevent the engineered inhibitor from reaching its full potential.
An in silico analysis was carried out to examine the potential for further improvements to the potency and selectivity of the SFTI-based KLK4 and KLK14 inhibitors. Molecular dynamics simulations suggested that the substitutions within SFTI required to target KLK4 and KLK14 had compromised the intramolecular hydrogen bond network of the inhibitor and caused a concomitant loss of binding loop stability. Furthermore, in silico amino acid substitution revealed a consistent correlation between a higher frequency of formation, and number, of internal hydrogen bonds in SFTI variants and lower inhibition constants. These predictions allowed for the production of second-generation inhibitors with enhanced binding affinity toward both targets, and highlight the importance of considering intramolecular cooperativity effects when engineering proteins or circular peptides to target proteases. The findings from this study show that although PS-SCLs are a useful tool for high-throughput screening of approximate protease preference, later refinement by SML screening is needed to reveal optimal subsite occupancy due to cooperativity in substrate recognition. This investigation has also demonstrated the importance of maintaining structural determinants of backbone constraint and conformation when engineering standard mechanism inhibitors for new targets. Combined, these results show that backbone conformation and amino acid cooperativity have more prominent roles than previously appreciated in determining substrate/inhibitor specificity and binding affinity. The three key inhibitors designed during this investigation are now being developed as lead compounds for cancer chemotherapy, control of fibrinolysis and cosmeceutical applications. These compounds form the basis of a portfolio of intellectual property which will be further developed in the coming years.
Abstract:
There is a continued need to consider ways to prevent early adolescent engagement in a variety of harmful risk-taking behaviours, for example violence, road-related risks and alcohol use. The current prospective study examined adolescents' reports of intervening to try to stop friends' engagement in such behaviours among 207 early adolescents (mean age = 13.51 years, 50.1% female). Findings showed that intervening behaviour after three months was predicted by the confidence to intervene, which in turn was predicted by student and teacher support, although not parental support. The findings suggest that the benefits of positive relationship experiences might extend to the safety of early adolescent friendship groups, particularly through the development of confidence to try to stop friends' risky and dangerous behaviours. Findings from the study support the important role of the school in creating a culture of positive adolescent behaviour whereby young people take social responsibility.
Abstract:
Current urban development in South East Queensland (SEQ) is impacted by a number of factors: growth and sprawl eroding subtropical character and identity; changing demographics and housing needs; lack of developable land; rising transport costs; diminishing fresh water supply; high energy consumption; and generic building designs which ignore local climate, landscape and lifestyle conditions. The Subtropical Row House project sought to research 'best practice' planning and design for contemporary and future needs of urban development in SEQ, and to stimulate higher-density housing responses that achieve sustainable, low-energy and low-water outcomes and support subtropical character and identity, by developing a workable new typology for homes that the local market can adopt. The methodology was the charrette, an established methodology in architecture and design. Four leading Queensland architectural firms were invited to form multidisciplinary creative teams. During the two-day charrette, the teams visited a selected greenfield site, defined the problems and issues, developed ideas and solutions, and benchmarked the performance of their designs using the Green Building Council of Australia's pilot Green Star Multi-Unit Residential tool. Each of the four resulting designs expresses a positive relationship with climate and place by demonstrating: suitability for the subtropical climate; flexibility for a diversity of households; integrated building/site/vegetation strategies; market appeal to occupants and developers; affordability in operation; constructability by 'domestic' builders; and reduced energy, water and wastage. The project was awarded a Regional Commendation by the Australian Institute of Architects.
Abstract:
With the rapid and continuing growth of learning support initiatives in mathematics and statistics in many parts of the world, and with the likelihood that this trend will continue, there is a need to ensure that robust and coherent measures are in place to evaluate the effectiveness of these initiatives. The nature of learning support brings challenges for measurement and analysis of its effects. After briefly reviewing the purpose of, rationale for, and extent of current provision, this article provides a framework for those working in learning support to think about how their efforts can be evaluated. It provides references and specific examples of how workers in this field are collecting, analysing and reporting their findings. The framework is used to structure evaluation in terms of usage of the facilities, resources and services provided, and also in terms of improvements in the performance of the students and staff who engage with them. Very recent developments have started to address the effects of learning support on the development of deeper approaches to learning, the affective domain, and communities of practice of both learners and teachers. This article is intended as a stimulus to those who work in mathematics and statistics support to gather even richer, more valuable forms of data. It provides a 'toolkit' for those interested in the evaluation of learning support, and closes by referring to an online resource being developed to archive the growing body of evidence.
Abstract:
The widespread development of Decision Support Systems (DSS) in construction indicates that the evaluation of such software has become more important than before. However, most research in the construction discipline has not attempted to assess DSS usability. Therefore, little is known about how to properly evaluate a DSS for a specific problem. In this paper, we present a practical framework that can guide DSS evaluation. It focuses on how to evaluate software designed specifically for the consultant selection problem. The framework features two main components, i.e. Sub-system Validation and Face Validation. Two case studies of consultant selection at the Malaysian Department of Irrigation and Drainage were integrated into this framework. Inter-disciplinary areas such as Software Engineering, Human Computer Interaction (HCI) and Construction Project Management underpin the discussion of the paper. It is anticipated that this work can foster better DSS development and quality decision-making that accurately meets the client's expectations and needs.
Abstract:
Buildings are among the most significant infrastructures in modern societies. The construction and operation of modern buildings consume a considerable amount of energy and materials, and therefore contribute significantly to climate change. In order to reduce the environmental impact of buildings, various green building rating tools have been developed. In this paper, energy use in the building sector in Australia and around the world is first reviewed. This is followed by discussion of the development and scope of various green building rating tools, with a particular focus on the Green Star rating scheme developed in Australia. It is shown that Green Star has significant implications for almost every aspect of the design of HVAC systems, including the selection of air handling and distribution systems, fluid handling systems, refrigeration systems, heat rejection systems and building control systems.
Abstract:
A hospital consists of a number of wards, units and departments that provide a variety of medical services and interact on a day-to-day basis. Nearly every department within a hospital schedules patients for the operating theatre (OT), and most wards receive patients from the OT following post-operative recovery. Because of the interrelationships between units, disruptions and cancellations within the OT can have a flow-on effect on the rest of the hospital. This often results in dissatisfied patients, nurses and doctors, escalating waiting lists, inefficient resource usage and undesirable waiting times. The objective of this study is to use Operational Research methodologies to enhance the performance of the operating theatre by improving elective patient planning through robust scheduling, and by improving the overall responsiveness to emergency patients through solving the disruption management and rescheduling problem. OT scheduling considers two types of patients: elective and emergency. Elective patients are selected from a waiting list and scheduled in advance based on resource availability and a set of objectives; this type of scheduling is referred to as 'offline scheduling'. Disruptions to this schedule can occur for various reasons, including variations in length of treatment, equipment restrictions or breakdown, unforeseen delays, and the arrival of emergency patients, which may compete for resources. Emergency patients consist of acute patients requiring surgical intervention or in-patients whose conditions have deteriorated. These may or may not be urgent and are triaged accordingly. Most hospitals reserve theatres for emergency cases, but when these or other resources are unavailable, disruptions to the elective schedule result, such as delays in surgery start time, elective surgery cancellations or transfers to another institution. Scheduling of emergency patients and the handling of schedule disruptions is an 'online' process typically handled by OT staff, meaning that decisions are made 'on the spot' in a 'real-time' environment. There are three key stages to this study: (1) analyse the performance of the operating theatre department using simulation as a decision support tool, changing system parameters and elective scheduling policies and observing the effect on the system's performance measures; (2) improve the viability of elective schedules by making offline schedules more robust to differences between expected and actual treatment times, using robust scheduling techniques, thereby improving access to care and responsiveness to emergency patients; and (3) address the disruption management and rescheduling problem (which incorporates emergency arrivals) using innovative robust reactive scheduling techniques, with the robust schedule forming the baseline for the online robust reactive scheduling model.
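As a loose illustration of stage (2), the sketch below builds an elective list with slack proportional to each case's duration variability and simulates how well the planned start times hold up. The case data, buffer rule and simulation are illustrative assumptions, not the study's actual model.

```python
# A minimal robust-scheduling sketch: protect an elective OT list
# against duration variability by inserting slack proportional to each
# case's standard deviation, then estimate start-time delay by simulation.
import random

# (name, mean duration, std dev) in minutes for one theatre-day list.
cases = [("CABG", 240, 45), ("Valve repair", 200, 35), ("Angioplasty", 90, 20)]
Z = 0.5   # buffer aggressiveness: slack = Z * std dev per case

def robust_schedule(cases, z):
    """Planned start times, with per-case slack appended to durations."""
    schedule, t = [], 0.0
    for name, mean, std in cases:
        schedule.append((name, t))
        t += mean + z * std          # slack absorbs overruns downstream
    return schedule

def simulate(schedule, cases, runs=10_000):
    """Average start delay vs plan under normally distributed durations."""
    late = 0.0
    for _ in range(runs):
        t = 0.0
        for (name, planned), (_, mean, std) in zip(schedule, cases):
            t = max(t, planned)                    # never start early
            late += max(0.0, t - planned)          # accumulated delay
            t += max(10.0, random.gauss(mean, std))  # realised duration
    return late / (runs * len(cases))

plan = robust_schedule(cases, Z)
print("planned starts:", plan)
print("mean start delay (min): %.1f" % simulate(plan, cases))
```

Raising Z trades theatre utilisation (idle slack) for schedule stability; comparing delay and idle time across Z values is the kind of robustness/efficiency trade-off the simulation stage of such a study would quantify.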
Abstract:
The purpose of this paper is to explore the means of building the capacity of those running an organisation designed to support and resource start-ups and growing micro-enterprises among some of the world's urban poor. The project is based in Beira, Mozambique, one of the poorest countries in the world. The result of this study is the development of a model for providing ongoing, inexpensive, effective capacity building in developing economies. The model also provides a base for the further development of strategies to provide better support to micro-entrepreneurs in poor developing economies.
Abstract:
Regulatory commentators have identified the need for more responsive regulation to allow enforcement agencies to respond to different types and degrees of non-compliance. One tool considered to support responsive enforcement is the Enforceable Undertaking (EU). EUs are used extensively by Australian regulators in decisions that forego litigation in exchange for offenders promising to (amongst other things) correct their behaviour and comply in the future. This arguably allows regulatory agencies greater flexibility in how they obtain compliance with regulations. EUs became an additional enforcement tool for the Fair Work Ombudsman (FWO) under the Fair Work Act 2009. This paper is a preliminary exploration of the comparative use of EUs by the Australian Competition and Consumer Commission and the FWO, to assess their effectiveness for the minimum labour standards environment.