153 results for problem-based methodology
Abstract:
In the field of the semantic grid, QoS-based Web service composition is an important problem. In a semantic- and service-rich environment such as the semantic grid, context constraints on Web services are common, so composition must consider not only the QoS properties of Web services but also the inter-service dependencies and conflicts that these context constraints create. In this paper, we present a repair genetic algorithm, the minimal-conflict hill-climbing repair genetic algorithm, to address the Web service composition optimization problem in the presence of domain constraints and inter-service dependencies and conflicts. Experimental results demonstrate the scalability and effectiveness of the genetic algorithm.
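As a sketch of the repair idea only (the service names, conflict pairs, and data structures below are invented for illustration, not taken from the paper), a minimal-conflict hill-climbing repair step over a composition plan might look like:

```python
# Illustrative sketch: each abstract task slot selects one concrete
# service, and CONFLICTS lists service pairs that may not co-occur,
# modelling the inter-service constraints described above. QoS-based
# fitness selection would be handled by the surrounding GA.
CANDIDATES = {0: ['a1', 'a2'], 1: ['b1', 'b2'], 2: ['c1', 'c2']}
CONFLICTS = {('a1', 'b1'), ('b2', 'c1')}   # assumed example data

def conflicts(plan):
    """Count violated conflict constraints in a plan."""
    chosen = set(plan.values())
    return sum(1 for pair in CONFLICTS if set(pair) <= chosen)

def repair(plan):
    """Minimal-conflict hill climbing: repeatedly apply the single
    service reassignment that removes the most violations."""
    while conflicts(plan) > 0:
        best = (0, None, None)
        for slot, options in CANDIDATES.items():
            for svc in options:
                trial = dict(plan)
                trial[slot] = svc
                delta = conflicts(plan) - conflicts(trial)
                if delta > best[0]:
                    best = (delta, slot, svc)
        if best[1] is None:       # no improving move exists: stop
            break
        plan[best[1]] = best[2]
    return plan

plan = repair({0: 'a1', 1: 'b1', 2: 'c1'})   # starts with one conflict
```

In a full GA, this repair would run on each offspring after crossover and mutation, so the population only ever contains feasible compositions.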
Abstract:
In this age of evidence-based practice, nurses are increasingly expected to use research evidence in a systematic and judicious way when making decisions about patient care practices. Clinicians recognise the role of research when it provides valid, realistic answers in practical situations. Nonetheless, research is still perceived by some nurses as external to practice, and implementing research findings in practice is often difficult. Since their conceptual origins in the 1960s, the emergence and growth of Nursing Development Units and, later, Practice Development Units have been described in the literature as strategic, organisational vehicles for changing the way nurses think about nursing by promoting and supporting a culture of inquiry and research-based practice. Thus, some scholars argue that practice development is situated in the gap between research and practice. Since the 1990s, the discourse has shifted from the structure and outcomes of developing practice to the process of developing practice, using a Practice Development methodology, underpinned by critical social science theory, as a vehicle for changing the culture and context of care. The nursing and practice development literature is dominated by descriptive reports of local practice development activity, typically focusing on reflection on processes or outcomes of processes, and describing perceived benefits. However, despite the volume of published literature, there is little published empirical research in the Australian or international context on the effectiveness of Practice Development as a methodology for changing the culture and context of care, leaving a gap in the literature. The aim of this study was to develop, implement and evaluate the effectiveness of a Practice Development model for clinical practice review and change in changing the culture and context of care for nurses working in an acute care setting.
A longitudinal, pre-test/post-test, non-equivalent control group design was used to answer the following research questions: 1. Is there a relationship between nurses' perceptions of the culture and context of care and nurses' perceptions of research and evidence-based practice? 2. Is there a relationship between engagement in a facilitated process of Practice Development and change in nurses' perceptions of the culture and context of care? 3. Is there a relationship between engagement in a facilitated process of Practice Development and change in nurses' perceptions of research and evidence-based practice? Through a critical analysis of the literature and synthesis of the findings of past evaluations of Nursing and Practice Development structures and processes, this research has identified key attributes consistent throughout the chronological and theoretical development of Nursing and Practice Development that exemplify a culture and context of care that is conducive to creating a culture of inquiry and evidence-based practice. The study findings were then used in the development, validation and testing of an instrument to measure change in the culture and context of care. Furthermore, this research has also provided empirical evidence of the relationship of the key attributes to each other and to barriers to research and evidence-based practice. The research also provides empirical evidence regarding the effectiveness of a Practice Development methodology in changing the culture and context of care. This research is noteworthy in its contribution to advancing the discipline of nursing by providing evidence of the degree to which attributes of the culture and context of care, namely autonomy and control, workplace empowerment and constructive team dynamics, can be connected to engagement with research and evidence-based practice.
Abstract:
Purpose: Computer vision has been widely used in the inspection of electronic components. This paper proposes a computer vision system for the automatic detection, localisation, and segmentation of solder joints on Printed Circuit Boards (PCBs) under different illumination conditions. Design/methodology/approach: An illumination normalization approach is applied to an image, which can effectively and efficiently eliminate the effect of uneven illumination while keeping the properties of the processed image the same as in the corresponding image under normal lighting conditions. Consequently, the need for special lighting and instrumental setup in detecting solder joints can be reduced. These normalised images are insensitive to illumination variations and are used for the subsequent solder joint detection stages. In the segmentation approach, the PCB image is transformed from an RGB color space to a YIQ color space for the effective detection of solder joints from the background. Findings: The segmentation results show that the proposed approach improves the performance significantly for images under varying illumination conditions. Research limitations/implications: This paper proposes a front-end system for the automatic detection, localisation, and segmentation of solder joint defects. Further research is required to complete the full system, including the classification of solder joint defects. Practical implications: The methodology presented in this paper can be an effective method to reduce cost and improve quality in the production of PCBs in the manufacturing industry. Originality/value: This research proposes the automatic location, identification and segmentation of solder joints under different illumination conditions.
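The RGB-to-YIQ step can be sketched with the standard NTSC transform (the exact coefficient variant the paper uses is an assumption here); YIQ separates luminance (Y) from chrominance (I, Q), which is what makes the greyish solder stand out from coloured board regions:

```python
def rgb_to_yiq(r, g, b):
    """Standard NTSC RGB -> YIQ transform (coefficients assumed; the
    paper may use a slightly different variant). Inputs in [0, 1]."""
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance
    i = 0.596 * r - 0.274 * g - 0.322 * b   # orange-blue chrominance
    q = 0.211 * r - 0.523 * g + 0.312 * b   # purple-green chrominance
    return y, i, q

# A pure grey pixel (e.g. solder) carries no chrominance, so it
# separates from coloured pads and substrate mainly on I and Q.
y, i, q = rgb_to_yiq(0.5, 0.5, 0.5)
```

Thresholding in the I/Q plane then isolates low-chrominance solder regions regardless of the brightness variations the normalization stage has already flattened.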
Abstract:
Many current chemistry programs privilege de-contextualised conceptual learning, often limited by a narrow selection of pedagogies that too often ignore the realities of students' own lives and interests (e.g., Tytler, 2007). One new approach that offers hope for improving students' engagement in learning chemistry and the perceived relevance of chemistry is the context-based approach. This study investigated how teaching and learning occurred in one year 11 context-based chemistry classroom. Through an interpretive methodology using a case study design, the teaching and learning that occurred during one term (ten weeks) of a unit on Water Quality are described. The researcher was a participant observer in the study who co-designed the unit of work with the teacher. The research questions explored the structure and implementation of the context-based approach, the circumstances by which students connected concepts and context in the context-based classroom, and the outcomes of the approach for the students and the teacher. A dialectical sociocultural theoretical framework using the dialectics of structure | agency and agency | passivity was used as a lens to explore the interactions between learners in different fields, such as the field of the classroom and the field of the local community. The findings of this study highlight the difficulties teachers face when implementing a new pedagogical approach. Time constraints, together with the need for students to demonstrate a level of conceptual understanding that satisfied the teacher, hindered a full implementation of the approach. The study found that for high (above average) and sound (average) achieving students, connections between the sanctioned science content of the school curriculum and the students' out-of-school worlds were realised when students actively engaged in fields that contextualised inquiry and gave them purpose for learning.
Fluid transitions, or the toing and froing between concepts and contexts, occurred when structures in the classroom afforded students the agency to connect concepts and contexts. The implications for teaching by a context-based approach suggest that keeping the context central, by teaching content on a "need-to-know" basis, contextualises the chemistry for students. Also, if teachers provide opportunities for student-student interaction and written work, student learning can improve.
Abstract:
The load–frequency control (LFC) problem has been one of the major subjects in power systems. In practice, LFC systems use proportional–integral (PI) controllers. However, since these controllers are designed using a linear model, the non-linearities of the system are not accounted for, and they are incapable of achieving good dynamic performance over a wide range of operating conditions in a multi-area power system. Given the distributed nature of a multi-area power system, a strategy for solving this problem using a multi-agent reinforcement learning (MARL) approach is presented. It consists of two agents in each power area: the estimator agent provides the area control error (ACE) signal based on the frequency bias estimation, and the controller agent uses reinforcement learning to control the power system, with genetic algorithm optimisation used to tune its parameters. This method does not depend on any knowledge of the system and admits considerable flexibility in defining the control objective. Also, by finding the ACE signal based on the frequency bias estimation, the LFC performance is improved, and by using MARL, parallel computation is realised, leading to a high degree of scalability. Here, to illustrate the accuracy of the proposed approach, a three-area power system example is given with two scenarios.
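The controller agent's learning rule can be sketched as a tabular Q-learning update (a toy illustration only: the state discretisation, action set, and the alpha/gamma values are assumptions, and the paper additionally tunes parameters with a genetic algorithm):

```python
# Toy sketch of one reinforcement-learning step for the controller
# agent. State = sign of the discretised ACE signal supplied by the
# estimator agent; action = change applied to the control signal;
# reward penalises the magnitude of the ACE. All values are assumed.
ACTIONS = (-0.1, 0.0, 0.1)
STATES = (-1, 0, 1)
Q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
alpha, gamma = 0.5, 0.9   # learning rate and discount (assumed)

def update(state, action, reward, next_state):
    """Standard Q-learning backup toward reward + discounted best
    next-state value."""
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += alpha * (reward + gamma * best_next
                                   - Q[(state, action)])

# One step: positive ACE, corrective negative action, reward = -|ACE|.
update(1, -0.1, -0.2, 0)
```

Because each area's agents learn only from locally observed signals, many such updates can run in parallel across areas, which is the source of the scalability claimed above.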
Abstract:
In a power network, when a propagating energy wave caused by a disturbance hits a weak link, a reflection appears and some of the energy is transferred across the link. In this work, an analytical descriptive methodology is proposed to study the dynamical stability of a large-scale power system. For this purpose, the electrical indices (angle, or voltage/frequency) measured at different points in the network following a fault are used, and the behaviour of the waves propagated through the lines, nodes and buses is studied. This work introduces a new tool for power system stability analysis based on a descriptive study of electrical measurements. The proposed methodology is also useful for detecting contingency conditions and for the synthesis of an effective emergency control scheme.
Abstract:
Appearance-based mapping and localisation is especially challenging when separate processes of mapping and localisation occur at different times of day. The problem is exacerbated in the outdoors where continuous change in sun angle can drastically affect the appearance of a scene. We confront this challenge by fusing the probabilistic local feature based data association method of FAB-MAP with the pose cell filtering and experience mapping of RatSLAM. We evaluate the effectiveness of our amalgamation of methods using five datasets captured throughout the day from a single camera driven through a network of suburban streets. We show further results when the streets are re-visited three weeks later, and draw conclusions on the value of the system for lifelong mapping.
Abstract:
Competent navigation in an environment is a major requirement for an autonomous mobile robot to accomplish its mission. Nowadays, many successful systems for navigating a mobile robot use an internal map which represents the environment in a detailed geometric manner. However, building, maintaining and using such environment maps for navigation is difficult because of perceptual aliasing and measurement noise. Moreover, geometric maps require the processing of huge amounts of data, which is computationally expensive. This thesis addresses the problem of vision-based topological mapping and localisation for mobile robot navigation. Topological maps are concise and graphical representations of environments that are scalable and amenable to symbolic manipulation. Thus, they are well-suited for basic robot navigation applications, and also provide a representational basis for the procedural and semantic information needed for higher-level robotic tasks. In order to make vision-based topological navigation suitable for inexpensive mobile robots for the mass market, we propose to characterise key places of the environment based on their visual appearance through colour histograms. The approach for representing places using visual appearance is based on the fact that colour histograms change slowly as the field of vision sweeps the scene when a robot moves through an environment. Hence, a place represents a region of the environment rather than a single position. We demonstrate in experiments using an indoor data set that a topological map in which places are characterised using visual appearance augmented with metric clues provides sufficient information to perform continuous metric localisation which is robust to the kidnapped robot problem. Many topological mapping methods build a topological map by clustering visual observations to places.
However, due to perceptual aliasing, observations from different places may be mapped to the same place representative in the topological map. A main contribution of this thesis is a novel approach for dealing with the perceptual aliasing problem in topological mapping. We propose to incorporate neighbourhood relations for disambiguating places which otherwise are indistinguishable. We present a constraint-based stochastic local search method which integrates the approach for place disambiguation in order to induce a topological map. Experiments show that the proposed method is capable of mapping environments with a high degree of perceptual aliasing, and that a small map is found quickly. Moreover, the method of using neighbourhood information for place disambiguation is integrated into a framework for topological off-line simultaneous localisation and mapping which does not require an initial categorisation of visual observations. Experiments on an indoor data set demonstrate the suitability of our method to reliably localise the robot while building a topological map.
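The appearance-based place characterisation described above can be sketched with a histogram-intersection similarity over normalised colour histograms (the bin counts below are invented for illustration):

```python
def normalise(hist):
    """Convert raw bin counts to a distribution summing to 1."""
    total = sum(hist)
    return [h / total for h in hist]

def intersection(h1, h2):
    """Histogram intersection similarity in [0, 1]. Because colour
    histograms change slowly as the camera sweeps a scene, nearby
    viewpoints in the same place score highly."""
    return sum(min(a, b) for a, b in zip(h1, h2))

place = normalise([40, 10, 50])      # stored appearance of a place
obs_near = normalise([38, 12, 50])   # observation close to that place
obs_far = normalise([5, 90, 5])      # observation from elsewhere

score_near = intersection(place, obs_near)
score_far = intersection(place, obs_far)
```

Under perceptual aliasing, two distinct places can still produce near-identical histograms; the neighbourhood-relation approach above resolves such ties using which places were observed nearby, rather than appearance alone.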
Abstract:
A configurable process model describes a family of similar process models in a given domain. Such a model can be configured to obtain a specific process model that is subsequently used to handle individual cases, for instance, to process customer orders. Process configuration is notoriously difficult as there may be all kinds of interdependencies between configuration decisions. In fact, an incorrect configuration may lead to behavioral issues such as deadlocks and livelocks. To address this problem, we present a novel verification approach inspired by the "operating guidelines" used for partner synthesis. We view the configuration process as an external service, and compute a characterization of all such services which meet particular requirements using the notion of a configuration guideline. As a result, we can characterize all feasible configurations (i.e., configurations without behavioral problems) at design time, instead of repeatedly checking each individual configuration while configuring a process model.
Abstract:
In the paper, the flow-shop scheduling problem with parallel machines at each stage (machine center) is studied. For each job, its release and due dates as well as a processing time for each of its operations are given. The scheduling criterion consists of three parts: the total weighted earliness, the total weighted tardiness and the total weighted waiting time. The criterion takes into account the costs of storing semi-manufactured products in the course of production and ready-made products, as well as penalties for not meeting the deadlines stated in the conditions of the contract with the customer. To solve the problem, three constructive algorithms and three metaheuristics (based on Tabu Search and Simulated Annealing techniques) are developed and experimentally analyzed. All the proposed algorithms operate on the notion of a so-called operation processing order, i.e. the order of operations on each machine. We show that the problem of schedule construction on the basis of a given operation processing order can be reduced to a linear programming task. We also propose an approximation algorithm for schedule construction and show the conditions for its optimality.
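The three-part criterion can be made concrete with a small evaluation sketch (the job fields and weights below are invented for illustration; the paper's notation may differ):

```python
def criterion(jobs):
    """Total weighted earliness + weighted tardiness + weighted
    waiting time for a set of scheduled jobs. Each job carries its
    release date, start time, completion time, due date, and the
    three weights (all field names assumed for illustration)."""
    total = 0.0
    for j in jobs:
        earliness = max(0, j['due'] - j['completion'])   # storage cost
        tardiness = max(0, j['completion'] - j['due'])   # lateness penalty
        waiting = j['start'] - j['release']              # in-process storage
        total += (j['w_e'] * earliness + j['w_t'] * tardiness
                  + j['w_w'] * waiting)
    return total

jobs = [
    {'release': 0, 'start': 2, 'completion': 5, 'due': 6,
     'w_e': 1.0, 'w_t': 3.0, 'w_w': 0.5},   # finishes early, waited 2
    {'release': 1, 'start': 1, 'completion': 9, 'due': 7,
     'w_e': 1.0, 'w_t': 3.0, 'w_w': 0.5},   # finishes 2 units late
]
```

Given a fixed operation processing order, only the start and completion times remain free, and minimising this piecewise-linear criterion over them is what reduces schedule construction to a linear program.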
Abstract:
Information Retrieval is an important albeit imperfect component of information technologies. The problem of insufficient diversity of retrieved documents is one of the primary issues studied in this research. This study shows that this problem leads to a decrease in precision and recall, the traditional measures of information retrieval effectiveness. This thesis presents an adaptive IR system based on the theory of adaptive dual control. The aim of the approach is the optimization of retrieval precision after all feedback has been issued. This is done by increasing the diversity of retrieved documents. This study shows that the value of recall reflects this diversity. The Probability Ranking Principle is viewed in the literature as the "bedrock" of current probabilistic Information Retrieval theory. Neither the proposed approach nor other methods of diversification of retrieved documents from the literature conform to this principle. This study shows by counterexample that the Probability Ranking Principle does not in general lead to optimal precision in a search session with feedback (for which it may not have been designed but is actively used). Retrieval precision of the search session should be optimized with a multistage stochastic programming model to accomplish the aim. However, such models are computationally intractable. Therefore, approximate linear multistage stochastic programming models are derived in this study, where the multistage improvement of the probability distribution is modelled using the proposed feedback correctness method. The proposed optimization models are based on several assumptions, starting with the assumption that Information Retrieval is conducted in units of topics. The use of clusters is the primary reason why a new method of probability estimation is proposed. The adaptive dual control of the topic-based IR system was evaluated in a series of experiments conducted on the Reuters, Wikipedia and TREC collections of documents.
The Wikipedia experiment revealed that the dual control feedback mechanism improves precision and S-recall when all the underlying assumptions are satisfied. In the TREC experiment, this feedback mechanism was compared to a state-of-the-art adaptive IR system based on BM-25 term weighting and the Rocchio relevance feedback algorithm. The baseline system exhibited better effectiveness than the cluster-based optimization model of ADTIR. The main reason for this was the insufficient quality of the clusters generated from the TREC collection, which violated the underlying assumption.
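The diversity measure reported above, S-recall (subtopic recall), can be sketched as the fraction of a topic's subtopics covered by the retrieved documents (the document-to-subtopic mapping below is invented for illustration):

```python
def s_recall(retrieved, subtopics_of, n_subtopics):
    """Subtopic recall: |subtopics covered by the retrieved set| / |all
    subtopics of the topic|. A diverse ranking covers more subtopics
    with the same number of documents."""
    covered = set()
    for doc in retrieved:
        covered |= subtopics_of.get(doc, set())
    return len(covered) / n_subtopics

# Assumed example: three documents, three subtopics of one topic.
subtopics_of = {'d1': {1}, 'd2': {1, 2}, 'd3': {3}}
coverage = s_recall(['d1', 'd2'], subtopics_of, n_subtopics=3)
```

Note that swapping the redundant 'd2' for 'd3' would raise coverage to 1.0 without retrieving more documents, which is exactly the behaviour a diversity-aware ranking rewards and a pure relevance ranking ignores.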
Abstract:
Interdisciplinary studies are fundamental to the signature practices for the middle years of schooling. Middle years researchers claim that interdisciplinarity in teaching appropriately meets the needs of early adolescents by tying concepts together, providing frameworks for the relevance of knowledge, and demonstrating the linking of disparate information for the solution of novel problems. Cognitive research is not wholeheartedly supportive of this position. Learning theorists assert that the application of knowledge in novel situations for the solution of problems is actually dependent on deep discipline-based understandings. The present research contrasts the capabilities of early adolescent students from discipline-based and interdisciplinary curriculum schooling contexts to successfully solve multifaceted real-world problems. This will inform the development of effective management of the middle years of schooling curriculum.
Abstract:
Purpose: Although the branding literature emerged during the 1940s, research relating to tourism destination branding has only gained momentum since the late 1990s. In particular, there remains a lack of theory addressing the measurement of the effectiveness of destination branding over time. The purpose of the research was to test the effectiveness of a model of consumer-based brand equity (CBBE) for a country destination. Design/methodology: A model of consumer-based brand equity was adapted from the marketing literature and applied to a nation context. The model was tested using structural equation modelling with data from a large Chilean sample (n=845), comprising a mix of previous visitors and non-visitors. The model fits the data well. Findings: This paper reports the results of an investigation into brand equity for Australia as a long-haul destination in an emerging market. The research took place just before the launch of the nation's fourth new brand campaign in six years. The results indicate Australia is a well-known but not compelling destination brand for tourists in Chile, which reflects the lower priority the South American market has been given by the national tourism office (NTO). Practical implications: It is suggested that CBBE measures could be analysed at various points in time to track any strengthening or weakening of market perceptions in relation to brand objectives. A standard CBBE instrument could provide long-term effectiveness performance measures regardless of changes in destination marketing organisation (DMO) staff, advertising agency, other stakeholders, and budget. Originality/value: This study contributes to the nation-branding literature by being one of the first to test the efficacy of a model of consumer-based brand equity for a tourism destination brand.
Abstract:
It is widely held that strong relationships exist between housing, economic status, and well-being. This is exemplified by widespread housing stock surpluses in many countries, which threaten to destabilise numerous aspects of individual and community life. However, the position of housing demand and supply is not consistent. The Australian position provides a distinct contrast, whereby seemingly inexorable housing demand generally remains a critical issue affecting the socio-economic landscape. Underpinned by high levels of immigration, and further buoyed by sustained historically low interest rates, increasing income levels, and increased government assistance for first home buyers, this strong housing demand ensures that elements related to housing affordability continue to gain prominence. A significant, but less visible factor impacting housing affordability – particularly new housing development – relates to holding costs. These costs are in many ways "hidden" and cannot always be easily identified. Although it is only one contributor, the nature and extent of its impact requires elucidation. In its simplest form, it commences with a calculation of the interest or opportunity cost of land holding. However, there is significantly more complexity for major new developments – particularly greenfield property development. Preliminary analysis conducted by the author suggests that even small shifts in the primary factors impacting holding costs can appreciably affect housing affordability – and notably, to a greater extent than commonly held. Even so, their importance and perceived high-level impact can be gauged from the unprecedented level of attention policy makers have given them over recent years.
This may be evidenced by the embedding of specific strategies to address burgeoning holding costs (and particularly those cost savings associated with streamlining regulatory assessment) within statutory instruments such as the Queensland Housing Affordability Strategy and the South East Queensland Regional Plan. However, several key issues require investigation. Firstly, the computation and methodology behind the calculation of holding costs varies widely. In fact, it is not only variable, but in some instances completely ignored. Secondly, some ambiguity exists in terms of the inclusion of various elements of holding costs, thereby affecting the assessment of their relative contribution. Perhaps this may in part be explained by their nature: such costs are not always immediately apparent. Some forms of holding costs are not as visible as the more tangible cost items associated with greenfield development, such as regulatory fees, government taxes, acquisition costs, selling fees, commissions and others. Holding costs are also more difficult to evaluate since, for the most part, they must ultimately be assessed over time in an ever-changing environment, based on their strong relationship with opportunity cost, which is in turn dependent, inter alia, upon prevailing inflation and/or interest rates. By extending research in the general area of housing affordability, this thesis seeks to provide a more detailed investigation of those elements related to holding costs, and in so doing determine the size of their impact specifically on the end user. This will involve the development of soundly based economic and econometric models which seek to clarify the component impacts of holding costs. Ultimately, there are significant policy implications in relation to the framework used in Australian jurisdictions to promote, retain, or otherwise maximise the opportunities for affordable housing.
Abstract:
The Simultaneous Localisation And Mapping (SLAM) problem is one of the major challenges in mobile robotics. Probabilistic techniques using high-end range-finding devices are well established in the field, but recent work has investigated vision-only approaches. We present an alternative approach to the leading existing techniques, which extracts approximate rotational and translational velocity information from a vehicle-mounted consumer camera, without tracking landmarks. When coupled with an existing SLAM system, the vision module is able to map a 45-metre-long indoor loop and a 1.6-km-long outdoor road loop, without any parameter or system adjustment between tests. The work serves as a promising pilot study into ground-based vision-only SLAM, with minimal geometric interpretation of the environment.