153 results for problem-based methodology
Abstract:
Purpose of review: To critique the recent literature on telephone, correspondence-based, and computerized interventions for alcohol problems, which enhance or substitute for practitioner-delivered treatments. Recent findings: There is an unmet need for screening, assessment and intervention for alcohol problems, in part because of the difficulty in accessing such treatment within the current health care system. Research on the efficacy of correspondence or electronic (for example, Internet-based) interventions is beginning to emerge. In the period 2003–2004 we identified nine acceptability or feasibility studies of these approaches and seven efficacy trials covering a wide range of settings. These modes of intervention are acceptable to patients and the public, and with careful planning, can be implemented in a variety of settings. Treatment trials demonstrate the efficacy of these interventions in reducing hazardous drinking by university students, in delaying initiation of heavy drinking in children and adolescents, and, intriguingly, in addressing insomnia among recovering alcoholics. Summary: There is strong support among potential users for alcohol interventions that employ telephone assistance, written correspondence, and the Internet. These new technologies offer the prospect of increasing the reach of interventions for problem drinking and being cost-effective alternatives or supplements to face-to-face health service delivery.
Abstract:
INTRODUCTION In their target article, Yuri Hanin and Muza Hanina outlined a novel multidisciplinary approach to performance optimisation for sport psychologists called the Identification-Control-Correction (ICC) programme. According to the authors, this empirically verified, psycho-pedagogical strategy is designed to improve the quality of coaching and the consistency of performance in highly skilled athletes, and involves a number of steps including: (i) identifying and increasing self-awareness of ‘optimal’ and ‘non-optimal’ movement patterns for individual athletes; (ii) learning to deliberately control the process of task execution; and (iii) correcting habitual and random errors and managing radical changes of movement patterns. Although no specific examples were provided, the ICC programme has apparently been successful in enhancing the performance of Olympic-level athletes. In this commentary, we address what we consider to be some important issues arising from the target article. We specifically focus attention on the contentious topic of optimisation in neurobiological movement systems, the role of constraints in shaping emergent movement patterns, and the functional role of movement variability in producing stable performance outcomes. In our view, the target article and, indeed, the proposed ICC programme would benefit from a dynamical systems theoretical backdrop rather than the cognitive scientific approach that appears to be advocated. Although Hanin and Hanina made reference to, and attempted to integrate, constructs typically associated with dynamical systems theoretical accounts of motor control and learning (e.g., Bernstein’s problem, movement variability), these ideas required more detailed elaboration, which we provide in this commentary.
Abstract:
In the field of the semantic grid, QoS-based Web service scheduling for workflow optimization is an important problem. However, in a semantic- and service-rich environment like the semantic grid, context constraints on Web services are very common, so scheduling must consider not only the quality properties of Web services but also the inter-service dependencies formed by the context constraints imposed on them. In this paper, we present a repair genetic algorithm, namely the minimal-conflict hill-climbing repair genetic algorithm, to address scheduling optimization problems in workflow applications in the presence of domain constraints and inter-service dependencies. Experimental results demonstrate the scalability and effectiveness of the genetic algorithm.
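The abstract does not give the algorithm's details, but the named technique can be sketched as follows. This is a minimal, hypothetical Python illustration: the QoS table, the constraint encoding, and all numbers are invented, and the repair step simply reassigns the most conflicted task to its least-conflicting service (a minimal-conflict hill-climbing move).

```python
import random

random.seed(1)

# Hypothetical toy instance: 3 workflow tasks, 3 candidate services each.
# QOS[t][s] is the utility of assigning task t to service s (higher = better).
QOS = [[3, 5, 2], [4, 1, 6], [2, 7, 3]]
# Inter-service dependency constraints: (t1, s1, t2, s2) means assigning
# task t1 to service s1 conflicts with assigning task t2 to service s2.
CONFLICTS = {(0, 1, 1, 2), (1, 2, 2, 1)}

def conflicts_at(sched, t):
    """Count violated constraints that involve task t."""
    return sum(1 for (a, sa, b, sb) in CONFLICTS
               if sched[a] == sa and sched[b] == sb and t in (a, b))

def repair(sched):
    """Minimal-conflict hill climbing: repeatedly move the most conflicted
    task to the service that minimises its conflicts."""
    for _ in range(10):  # bounded number of repair passes
        worst = max(range(len(sched)), key=lambda t: conflicts_at(sched, t))
        if conflicts_at(sched, worst) == 0:
            break
        sched[worst] = min(
            range(len(QOS[worst])),
            key=lambda s: conflicts_at(sched[:worst] + [s] + sched[worst + 1:], worst))
    return sched

def fitness(sched):
    return sum(QOS[t][s] for t, s in enumerate(sched))

def evolve(pop_size=10, gens=30):
    pop = [repair([random.randrange(3) for _ in range(3)]) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]          # truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, 3)       # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:          # mutation
                child[random.randrange(3)] = random.randrange(3)
            children.append(repair(child))     # repair keeps offspring feasible
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

In this toy instance the repaired offspring are always feasible because every task has at least one conflict-free service, so the hill-climbing repair terminates in a few passes.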
Abstract:
In Web service based systems, new value-added Web services can be constructed by integrating existing Web services. A Web service may have many implementations, which are functionally identical but have different Quality of Service (QoS) attributes, such as response time, price, reputation, reliability, and availability. Thus, a significant research problem in Web service composition is how to select an implementation for each of the component Web services so that the overall QoS of the composite Web service is optimal. This is the so-called QoS-aware Web service composition problem. In some composite Web services there are dependencies and conflicts between the Web service implementations, yet existing approaches cannot handle such constraints. This paper tackles the QoS-aware Web service composition problem with inter-service dependencies and conflicts using a penalty-based genetic algorithm (GA). Experimental results demonstrate the effectiveness and scalability of the penalty-based GA.
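As a hedged illustration of the penalty idea only (the paper's actual QoS model and constraint encoding are not given in the abstract), a fitness function can subtract a large penalty per violated inter-service constraint, so infeasible composites lose to feasible ones without any explicit repair step. All data below are hypothetical.

```python
# Hypothetical toy data: two abstract tasks, two implementations each.
RESPONSE = [[3.0, 5.0],    # utilities of the implementations of task 0
            [4.0, 1.0]]    # utilities of the implementations of task 1
# (task_a, impl_a, task_b, impl_b): selecting both together is forbidden.
FORBIDDEN = {(0, 1, 1, 0)}

def violations(plan):
    """Number of forbidden implementation pairs selected by the plan."""
    return sum(1 for (a, sa, b, sb) in FORBIDDEN
               if plan[a] == sa and plan[b] == sb)

def penalised_fitness(plan, penalty=100.0):
    """Aggregate utility minus a large penalty per constraint violation."""
    return sum(RESPONSE[t][s] for t, s in enumerate(plan)) - penalty * violations(plan)

# The raw-utility optimum [1, 0] (5 + 4 = 9) is infeasible, so under the
# penalty the feasible plan [0, 0] (3 + 4 = 7) scores higher.
assert penalised_fitness([1, 0]) < penalised_fitness([0, 0])
```

Any GA that maximises `penalised_fitness` will then be steered toward feasible compositions as long as the penalty exceeds the largest achievable utility gap.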
Abstract:
An information filtering (IF) system monitors an incoming document stream to find the documents that match the information needs specified by the user profiles. Learning to use the user profiles effectively is one of the most challenging tasks when developing an IF system. With the document selection criteria better defined based on the users’ needs, filtering large streams of information can be more efficient and effective. To learn the user profiles, term-based approaches have been widely used in the IF community because of their simplicity and directness, and are relatively well established. However, these approaches have problems when dealing with polysemy and synonymy, which often lead to an information overload problem. Recently, pattern-based approaches (or Pattern Taxonomy Models (PTM) [160]) have been proposed for IF by the data mining community. These approaches are better at capturing semantic information and have shown encouraging results for improving the effectiveness of IF systems. On the other hand, pattern discovery from large data streams is not computationally efficient, and these approaches must also deal with low-frequency pattern issues. The measures used by the data mining techniques (for example, “support” and “confidence”) to learn the profile have turned out to be unsuitable for filtering and can lead to a mismatch problem. This thesis uses rough set-based reasoning (term-based) and a pattern mining approach as a unified framework for information filtering to overcome the aforementioned problems. The system consists of two stages: a topic filtering stage and a pattern mining stage. The topic filtering stage is intended to minimize information overload by filtering out the most likely irrelevant information based on the user profiles. A novel user-profile learning method and a theoretical model of the threshold setting have been developed by using rough set decision theory.
The second stage (pattern mining) aims at solving the problem of information mismatch. This stage is precision-oriented. A new document-ranking function has been derived by exploiting the patterns in the pattern taxonomy, so that the most likely relevant documents are assigned higher scores by the ranking function. Because a relatively small number of documents is left after the first stage, the computational cost is markedly reduced; at the same time, pattern discovery yields more accurate results. The overall performance of the system was improved significantly. The new two-stage information filtering model has been evaluated by extensive experiments. Tests were based on well-known IR benchmarking processes, using the latest version of the Reuters dataset, namely the Reuters Corpus Volume 1 (RCV1). The performance of the new two-stage model was compared with both term-based and data mining-based IF models. The results demonstrate that the proposed information filtering system significantly outperforms the other IF systems, such as the traditional Rocchio IF model, the state-of-the-art term-based models, including BM25 and Support Vector Machines (SVM), and the Pattern Taxonomy Model (PTM).
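The two-stage structure described above can be sketched in miniature. Everything below is hypothetical toy data, not the thesis's rough set model or pattern taxonomy: stage one drops documents whose topic score falls below a threshold, and stage two re-ranks the survivors by the number of matched patterns.

```python
# Hypothetical documents: a precomputed topic score plus a term set.
docs = {
    "d1": {"topic_score": 0.9, "terms": {"data", "mining", "pattern"}},
    "d2": {"topic_score": 0.2, "terms": {"football", "score"}},
    "d3": {"topic_score": 0.7, "terms": {"data", "stream"}},
}
# Invented discriminative patterns (term sets) for the user's topic.
PATTERNS = [{"data", "mining"}, {"data", "stream"}, {"pattern"}]

def two_stage(docs, threshold=0.5):
    # Stage 1 (recall-oriented): filter out likely-irrelevant documents.
    survivors = {d: v for d, v in docs.items() if v["topic_score"] >= threshold}

    # Stage 2 (precision-oriented): rank survivors by matched patterns.
    def rank(v):
        return sum(1 for p in PATTERNS if p <= v["terms"])  # subset match

    return sorted(survivors, key=lambda d: rank(survivors[d]), reverse=True)

ranking = two_stage(docs)  # d2 is filtered out; d1 matches more patterns than d3
```

Only the surviving documents reach the (more expensive) pattern-matching stage, which mirrors the cost argument in the abstract.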
Abstract:
The problem of impostor dataset selection for GMM-based speaker verification is addressed through the recently proposed data-driven background dataset refinement technique. The SVM-based refinement technique selects from a candidate impostor dataset those examples that are most frequently selected as support vectors when training a set of SVMs on a development corpus. This study demonstrates the versatility of dataset refinement in the task of selecting suitable impostor datasets for use in GMM-based speaker verification. The use of refined Z- and T-norm datasets provided performance gains of 15% in EER in the NIST 2006 SRE over the use of heuristically selected datasets. The refined datasets were shown to generalise well to the unseen data of the NIST 2008 SRE.
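The selection rule at the heart of the refinement technique (keep the candidate impostor examples most frequently chosen as support vectors) can be sketched as follows, assuming the support-vector index sets from the development SVMs are already available; the indices below are invented for illustration.

```python
from collections import Counter

# Hypothetical support-vector index sets, one per SVM trained on the
# development corpus; each index refers to a candidate impostor example.
sv_sets = [
    {0, 2, 5},
    {2, 5, 7},
    {1, 2, 5},
]

# Count how often each candidate was selected as a support vector,
# then keep the most frequently selected examples as the refined dataset.
counts = Counter(i for sv in sv_sets for i in sv)
refined = [i for i, _ in counts.most_common(2)]  # keep the top-2 candidates
```

Examples 2 and 5 appear in every SVM's support set, so they survive refinement; the size cutoff (here 2) would in practice be tuned on development data.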
Abstract:
This paper presents a novel approach to estimating the confidence interval of speaker verification scores. This approach is utilised to minimise the utterance lengths required to produce a confident verification decision. The confidence estimation method is also extended to address both the high correlation of consecutive frame scores and the need for robustness with very limited training samples. The proposed technique achieves a drastic reduction in the typical data requirements for producing confident decisions in an automatic speaker verification system. When evaluated on the NIST 2005 SRE, the early verification decision method demonstrates that an average of 5–10 seconds of speech is sufficient to produce verification rates approaching those achieved previously using an average in excess of 100 seconds of speech.
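A rough sketch of the early-decision idea: accumulate per-frame scores, form a confidence interval on their mean, and stop as soon as the interval clears the decision threshold. The correlation handling here (shrinking the effective sample size by a constant factor) and all numbers are simplifying assumptions, not the paper's method.

```python
import math

def decide_early(frame_scores, threshold=0.0, z=1.96, corr_discount=0.5):
    """Return (decision, frames used). Stops once the z-level confidence
    interval on the mean frame score no longer straddles the threshold."""
    total = total_sq = 0.0
    n = 0
    for s in frame_scores:
        n += 1
        total += s
        total_sq += s * s
        if n < 10:                      # need a few frames to estimate variance
            continue
        mean = total / n
        var = max(total_sq / n - mean * mean, 1e-12)
        n_eff = n * corr_discount       # crude correction: correlated frames
                                        # carry less independent information
        half_width = z * math.sqrt(var / n_eff)
        if mean - half_width > threshold:
            return "accept", n
        if mean + half_width < threshold:
            return "reject", n
    return "undecided", n

# A consistently positive score stream triggers an accept long before
# all 100 frames are consumed.
decision, used = decide_early([1.0, 1.2, 0.9, 1.1] * 25)
```

With a low-variance stream the decision fires as soon as the variance estimate is available, illustrating why confident decisions can be reached from only a few seconds of speech.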
Abstract:
The current understanding of students’ group metacognition is limited, as research on metacognition has focused mainly on the individual student. The aim of this study was to address this gap by developing a conceptual model to inform the use of scaffolds to facilitate group metacognition during mathematical problem solving in computer supported collaborative learning (CSCL) environments. An initial conceptual framework based on the literature on metacognition, cooperative learning, cooperative group metacognition, and computer supported collaborative learning was used to inform the study. To achieve the study aim, a design research methodology incorporating two cycles was used. The first cycle focused on the within-group metacognition of sixteen groups of primary school students working together around the computer; the second cycle included the between-group metacognition of six groups of primary school students working together in the Knowledge Forum® CSCL environment. The study found that providing groups with group metacognitive scaffolds resulted in groups planning, monitoring, and evaluating the task and team aspects of their group work. The metacognitive scaffolds allowed students to focus on how their group was completing the problem-solving task and working together as a team. From these findings, a revised conceptual model to inform the use of scaffolds to facilitate group metacognition during mathematical problem solving in CSCL environments was generated.
Abstract:
Mobile robots are widely used in many industrial fields. Research on path planning for mobile robots is one of the most important aspects of mobile robots research. Path planning for a mobile robot is the task of finding a collision-free route, through the robot’s environment with obstacles, from a specified start location to a desired goal destination while satisfying certain optimization criteria. Most of the existing path planning methods, such as the visibility graph, cell decomposition, and potential field methods, are designed with a focus on static environments, in which there are only stationary obstacles. However, in practical systems such as marine science research, mining robotics, and RoboCup games, robots usually face dynamic environments, in which both moving and stationary obstacles exist. Because of the complexity of dynamic environments, research on path planning in environments with dynamic obstacles is limited: few papers have been published in this area, in comparison with hundreds of reports on path planning in stationary environments in the open literature. Recently, a genetic algorithm based approach has been introduced to plan the optimal path for a mobile robot in a dynamic environment with moving obstacles. However, as the number of obstacles in the environment increases, and as the moving speed and direction of the robot and obstacles change, the size of the problem to be solved increases sharply. Consequently, the performance of the genetic algorithm based approach deteriorates significantly. This motivates the research of this work. This research develops and implements a simulated annealing algorithm based approach to find the optimal path for a mobile robot in a dynamic environment with moving obstacles. The simulated annealing algorithm is an optimization algorithm similar to the genetic algorithm in principle.
However, our investigation and simulations have indicated that the simulated annealing algorithm based approach is simpler and easier to implement. Its performance is also shown to be superior to that of the genetic algorithm based approach in both online and offline processing times, as well as in obtaining the optimal solution for path planning of the robot in the dynamic environment. The first step of many path planning methods is to search for an initial feasible path for the robot. A commonly used method for finding the initial path is to randomly pick vertices of the obstacles in the search space. This is time consuming in both static and dynamic path planning, and has an important impact on the efficiency of dynamic path planning. This research proposes a heuristic method to search for the feasible initial path efficiently. The heuristic method is then incorporated into the proposed simulated annealing algorithm based approach for dynamic robot path planning. Simulation experiments have shown that with the incorporation of the heuristic method, the developed simulated annealing algorithm based approach requires much shorter processing time to obtain the optimal solutions in the dynamic path planning problem. Furthermore, the quality of the solution, as characterized by the length of the planned path, is also improved by the incorporated heuristic method for both online and offline path planning.
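As a hedged illustration of the underlying simulated annealing mechanics only (the dynamic-obstacle model and the proposed heuristic initialisation are not reproduced here), a toy planner can anneal a few 2-D waypoints between fixed start and goal points, penalising waypoints that fall inside a circular obstacle; all coordinates and parameters are invented.

```python
import math
import random

random.seed(0)

# Hypothetical static scenario: a straight line from START to GOAL would
# pass through the obstacle, so the planner must route around it.
START, GOAL = (0.0, 0.0), (10.0, 0.0)
OBSTACLE, RADIUS = (5.0, 0.0), 1.5

def cost(waypoints):
    """Path length plus a steep penalty for each point inside the obstacle."""
    pts = [START] + waypoints + [GOAL]
    length = sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))
    penalty = sum(100.0 for p in pts if math.dist(p, OBSTACLE) < RADIUS)
    return length + penalty

def anneal(n_waypoints=3, steps=3000, t0=5.0):
    path = [(random.uniform(0, 10), random.uniform(-5, 5))
            for _ in range(n_waypoints)]
    best, best_c = list(path), cost(path)
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-3          # linear cooling schedule
        cand = list(path)
        i = random.randrange(n_waypoints)        # perturb one waypoint
        cand[i] = (cand[i][0] + random.gauss(0, 0.5),
                   cand[i][1] + random.gauss(0, 0.5))
        d = cost(cand) - cost(path)
        if d < 0 or random.random() < math.exp(-d / t):  # Metropolis rule
            path = cand
            if cost(path) < best_c:
                best, best_c = list(path), cost(path)
    return best, best_c

best_path, best_cost = anneal()  # best_cost < 100 means no waypoint collides
```

The Metropolis acceptance of some uphill moves at high temperature is what distinguishes this from plain hill climbing and lets the planner escape paths trapped against the obstacle.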
Abstract:
It has long been recognised that government and public sector services suffer an innovation deficit compared to private or market-based services. This paper argues that this can be explained as an unintended consequence of the concerted public sector drive toward the elimination of waste through efficiency, accountability and transparency. Yet in an evolving economy this can be a false efficiency, as it also eliminates the 'good waste' that is a necessary cost of experimentation. This results in a systematic trade-off in the public sector between the static efficiency of minimizing the misuse of public resources and the dynamic efficiency of experimentation. This trade-off is inherently biased against risk and uncertainty, and thereby explains why governments find service innovation so difficult. In the drive to eliminate static inefficiencies, many political systems have subsequently overshot and stifled policy innovation. I propose the 'Red Queen' solution of adaptive economic policy.
Abstract:
Purpose: The purpose of this article is to investigate the engineering of creative urban regions through knowledge-based urban development. In recent years city administrators have realised the importance of engineering and orchestrating knowledge city formation through visioning and planning for economic, socio-cultural and physical development. For that purpose a new development paradigm of ‘knowledge-based urban development’ has been formed, and has quickly found implementation ground in many parts of the globe.----- Design/methodology/approach: The paper reviews the literature and examines global best practice experiences in order to determine how cities are engineering their creative urban regions so as to establish a base for knowledge city formation.----- Findings: The paper sheds light on the different development approaches for creative urban regions, and concludes with recommendations for urban administrations planning for knowledge-based development of creative urban regions.----- Originality/value: The paper provides invaluable insights and discussion on the vital role of planning for knowledge-based urban development of creative urban regions.
Abstract:
Cultural objects are increasingly generated and stored in digital form, yet effective methods for their indexing and retrieval remain an important area of research. The main problem arises from the disconnection between the content-based indexing approach used by computer scientists and the description-based approach used by information scientists. There is also a lack of representational schemes that allow the alignment of semantics and context with keywords and low-level features that can be automatically extracted from the content of these cultural objects. This paper presents an integrated approach to address these problems, taking advantage of both computer science and information science approaches. We first discuss the requirements from a number of perspectives: users, content providers, content managers and technical systems. We then present an overview of our system architecture and describe the techniques which underlie the major components of the system. These include: automatic object category detection; user-driven tagging; metadata transform and augmentation; and an expression language for digital cultural objects. In addition, we discuss our experience in testing and evaluating some existing collections, analyse the difficulties encountered and propose ways to address these problems.
Abstract:
We consider the problem of designing a surveillance system to detect a broad range of invasive species across a heterogeneous sampling frame. We present a model to detect a range of invertebrate invasives whilst addressing the challenges of multiple data sources, stratifying for differential risk, managing labour costs and providing sufficient power of detection. We determine the number of detection devices required and their allocation across the landscape within limiting resource constraints. The resulting plan will lead to reduced financial and ecological costs and an optimal surveillance system.
Abstract:
Purpose: A population based, cross-sectional telephone survey was conducted to estimate the total penetrance of contact lens wear in Australia. Methods: A total of 42,749 households around Australia were randomly selected from the national electronic telephone directory based on postcode distribution. Before contact was attempted, letters of introduction were sent. The number of individuals and contact lens wearers in each household was ascertained, and lens wearers were interviewed to determine details of lens type and mode of wear using a structured questionnaire. Results: Of households contacted, 59.2% (19,171/32,405) agreed to participate. Response rates were only marginally higher amongst households that first received a letter of introduction. In these households, 35,914 individuals were identified, of whom 1,798 were contact lens wearers. The penetrance of contact lens wear during the study period was 5.01% (95% CI: 4.78-5.24). Soft hydrogel lenses had the largest penetrance in the community (66.7% of all wearers); however, their market share decreased significantly over the study period with increased uptake of newly introduced lens types. Conclusions: The penetrance of contact lens wear concurs with market estimates and equates to approximately 680,000 contact lens wearers aged between 15 and 64 years in Australia. The low response rate obtained in this study highlights the difficulty in contemporary use of telephone survey methodology.
Abstract:
The establishment of corporate objectives regarding economic, environmental, social, and ethical responsibilities, to inform business practice, has been gaining credibility in the business sector since the early 1990s. This is witnessed through (i) the formation of international forums for sustainable and accountable development, (ii) the emergence of standards, systems, and frameworks to provide common ground for regulatory and corporate dialogue, and (iii) the significant quantum of relevant popular and academic literature in a diverse range of disciplines. How, then, has this move towards greater corporate responsibility become evident in the provision of major urban infrastructure projects? The gap identified, in both academic literature and industry practice, is a structured and auditable link between corporate intent and project outcomes. Little literature has been discovered which makes a link between corporate responsibility, project performance indicators (or critical success factors), and major infrastructure provision. This search revealed that a comprehensive mapping framework, from an organisation’s corporate objectives through to intended, anticipated and actual outcomes and impacts, has not yet been developed for the delivery of such projects. The research problem thus explored is ‘the need to better identify, map and account for the outcomes, impacts and risks associated with economic, environmental, social and ethical outcomes and impacts which arise from major economic infrastructure projects, both now, and into the future’. The methodology being used to undertake this research is based on Checkland’s soft system methodology, engaging in action research on three collaborative case studies. A key outcome of this research is a value-mapping framework applicable to Australian public sector agencies.
This is a decision-making methodology that will enable project teams responsible for delivering major projects to better identify and align project objectives and impacts with stated corporate objectives.