818 results for knowledge based systems
Abstract:
Globalisation and the emergence of knowledge-based economies have forced many countries to reform their education systems. Enhancing human capital to meet the modern-day demands of a knowledge economy, and equipping the new generation with the capacity to meet the challenges of the 21st century, has become a priority. This change is particularly necessary in economies, such as Kuwait's, that have depended on the exploitation of non-renewable natural resources. Transitioning from a resource-based economy to an economy based on knowledge and intellectual skills poses a key challenge for an education system. Significant in the development of this new economy has been the expansion of Information and Communication Technology (ICT). In education in particular, ICT is a tool for transforming the educational setting. However, transformation is only successful where there are effective change management strategies and appropriate leadership. At the school level, rapid changes have affected the role that principals take, particularly in relation to leading the change process. Therefore, this study investigated the leadership practices of school principals for embedding ICT into schools. The case study assessed two Kuwaiti secondary schools, both with well-established ICT programs. Data collection used a mixed-methods design to address the purpose of the study, namely, to examine the leadership practices of school principals when managing the change processes associated with embedding ICT in the context of Kuwait. A theoretical model of principal leadership, developed from the literature, was used to document and analyse the practices of the respective school principals.
The study used the following five data sources: (a) face-to-face interviews (with each school principal) and two focus group interviews (with five teachers and five students from each school); (b) school documents (related to the implementation and embedding of ICT); (c) one survey (of all teachers in each school); (d) an open-ended questionnaire (of participating principals and teachers); and (e) the observation of ICT activities (ICT professional development activities and instruction meetings). The study revealed a range of strategies used by the principals that aligned with the theoretical perspective. However, these strategies needed to be refined and selectively used to fit the Kuwaiti context, both culturally and organisationally. The principals of Schools A and B employed three key strategies to maximise the extent to which teaching staff incorporated ICT into their teaching and learning practices. These strategies were: (a) encouragement for teaching staff to implement ICT in their teaching; (b) support to meet the material and human needs of teaching staff using ICT; and (c) provision of instructions and guidance for teaching staff on how and why such behaviours and practices should be performed. These strategies provided the basic leadership practices required to construct a successful ICT embedding process. Hence, a revised model of leadership with applicability to the adoption of ICT in Kuwait was developed. The findings provide a better understanding of how a school principal's leadership practices affect the ICT embedding process. Hence, the outcomes of this study inform emerging countries that are also undergoing major ICT-related change, for example, other members of the Cooperation Council for the Arab States of the Gulf.
From an educational perspective, this knowledge has the potential to support ICT-based learning environments and to help educational practitioners effectively integrate ICT into teaching and learning, facilitating students' ICT engagement and preparing them for the ICT development challenges associated with the new economy by increasing their knowledge and performance. Further, the study offers practical strategies that have been shown to work for school principals leading ICT implementation in Kuwait. These strategies include how to deal with shortages in school budgets, how to promote the ICT vision, and how to develop approaches that build a collaborative culture in schools.
Abstract:
A cost estimation method is required to estimate the life cycle cost of a product family at the early stage of product development in order to evaluate the product family design. Existing cost estimation techniques have difficulty estimating the life cycle cost of a product family at this early stage. This paper proposes a framework that combines a knowledge-based system with activity-based costing techniques to estimate the life cycle cost of a product family at the early stage of product development. The inputs of the framework are the product family structure and its sub-functions. The output of the framework is the life cycle cost of the product family, comprising all costs at each product family level and the costs of each product life cycle stage. The proposed framework provides a life cycle cost estimation tool for a product family at the early stage of product development using high-level information as its input. The framework makes it possible to estimate the life cycle cost of various product families that use any type of product structure. It provides detailed information on the activity and resource costs of both parts and products, which can assist the designer in analyzing the cost of the product family design. In addition, it can reduce the amount of information and time required to construct the cost estimation system.
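The core arithmetic of activity-based costing can be illustrated with a minimal sketch: each activity gets a cost-driver rate and a driver quantity, and activity costs are rolled up by life cycle stage. All activity names, rates and quantities below are hypothetical and are not taken from the paper.

```python
# Illustrative activity-based costing roll-up over life cycle stages.
# Activity names, stages, rates and quantities are invented for the sketch.
from collections import defaultdict

# Each activity: (life_cycle_stage, cost_driver_rate, driver_quantity)
activities = {
    "machining": ("production", 12.0, 4),    # $/machine-hour x hours
    "assembly": ("production", 8.0, 2),
    "packaging": ("distribution", 1.5, 3),
    "recycling": ("end_of_life", 2.0, 1),
}

def life_cycle_cost(activities):
    """Sum activity costs (rate x driver quantity) per life cycle stage."""
    stage_costs = defaultdict(float)
    for stage, rate, qty in activities.values():
        stage_costs[stage] += rate * qty
    return dict(stage_costs), sum(stage_costs.values())

stages, total = life_cycle_cost(activities)
print(stages)  # per-stage cost breakdown
print(total)   # total life cycle cost -> 70.5
```

The per-stage breakdown mirrors the framework's output of costs for each product life cycle stage, while the total corresponds to the product-level life cycle cost.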
Abstract:
Flexible information exchange is critical to successful design-analysis integration, but current top-down, standards-based and model-oriented strategies impose restrictions that contradict this flexibility. In this article we present a bottom-up, user-controlled and process-oriented approach to linking design and analysis applications that is more responsive to the varied needs of designers and design teams. Drawing on research into scientific workflows, we present a framework for integration that capitalises on advances in cloud computing to connect discrete tools via flexible and distributed process networks. We then discuss how a shared mapping process that is flexible and user-friendly supports non-programmers in creating these custom connections. Adopting a services-oriented system architecture, we propose a web-based platform that enables data, semantics and models to be shared on the fly. We then discuss potential challenges and opportunities for its development as a flexible, visual, collaborative, scalable and open system.
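A user-defined mapping step of the kind described can be sketched as a small dictionary of field renames and unit conversions between one tool's output and another tool's expected input. The tool roles, field names and units below are invented for illustration and are not from the article.

```python
# Sketch of a user-controlled mapping between two tools in a process
# network. Field names and the mm -> m conversion are hypothetical.

def map_fields(record, mapping):
    """Rename and transform fields so one tool's output fits another's input."""
    return {dst: fn(record[src]) for dst, (src, fn) in mapping.items()}

# A design tool emits wall geometry in millimetres; an analysis tool
# expects metres. The mapping is data a non-programmer could author.
design_output = {"wall_height_mm": 3200, "material": "concrete"}
mapping = {
    "height_m": ("wall_height_mm", lambda v: v / 1000),
    "material": ("material", str),
}
analysis_input = map_fields(design_output, mapping)
print(analysis_input)  # {'height_m': 3.2, 'material': 'concrete'}
```

Because the mapping is plain data rather than code, it matches the bottom-up spirit of letting users rather than standards bodies define each connection.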
Abstract:
Flexible information exchange is critical to successful design integration, but current top-down, standards-based and model-oriented strategies impose restrictions that are contradictory to this flexibility. In this paper we present a bottom-up, user-controlled and process-oriented approach to linking design and analysis applications that is more responsive to the varied needs of designers and design teams. Drawing on research into scientific workflows, we present a framework for integration that capitalises on advances in cloud computing to connect discrete tools via flexible and distributed process networks. Adopting a services-oriented system architecture, we propose a web-based platform that enables data, semantics and models to be shared on the fly. We discuss potential challenges and opportunities for its development as a flexible, visual, collaborative, scalable and open system.
Abstract:
Changing environments present a number of challenges to mobile robots, one of the most significant being mapping and localisation. This problem is particularly significant in vision-based systems, where illumination and weather changes can cause feature-based techniques to fail. In many applications only sections of an environment undergo extreme perceptual change. Some range-based sensor mapping approaches exploit this property by combining occasional place recognition with the assumption that odometry is accurate over short periods of time. In this paper, we develop this idea in the visual domain, using occasional vision-driven loop closures to infer loop closures in nearby locations where visual recognition is difficult due to extreme change. We demonstrate successful map creation in an environment in which change is significant but constrained to one area, where both the vanilla CAT-Graph and a Sum of Absolute Differences matcher fail; we use the described techniques to link dissimilar images from matching locations, and test the robustness of the system against false inferences.
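The core inference idea admits a very small sketch: given one vision-driven loop closure between pose indices on two traverses, and trusting odometry over a short window, closures can be hypothesised at neighbouring offsets even where appearance matching fails. This is an illustrative simplification with made-up pose indices, not the paper's algorithm.

```python
# Hypothetical sketch: propagate a single visual loop closure (i, j)
# to nearby pose pairs, assuming odometry drift is negligible over a
# short window around the match.

def infer_loop_closures(seed, window):
    """Return inferred (i+k, j+k) closure pairs around a seed match."""
    i, j = seed
    return [(i + k, j + k) for k in range(-window, window + 1) if k != 0]

# One confident visual match between pose 120 (first pass) and pose 480
# (second pass); infer closures two poses either side of it.
inferred = infer_loop_closures(seed=(120, 480), window=2)
print(inferred)  # [(118, 478), (119, 479), (121, 481), (122, 482)]
```

In practice the inferred closures would be added as constraints to the pose graph and checked against false inferences, as the abstract's final evaluation describes.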
Abstract:
Dynamic capability theory asserts that the learning capabilities of construction organisations influence the degree to which value-for-money (VfM) is achieved on collaborative projects. However, little research has been conducted to verify this relationship. The evidence is particularly limited within the empirical context of infrastructure delivery in Australia. Primarily drawing on the theoretical perspectives of the resource-based view of the firm (e.g. Barney 1991), dynamic capabilities (e.g. Helfat et al. 2007), absorptive capacity (e.g. Lane et al. 2006) and knowledge management (e.g. Nonaka 1994), this paper conceptualises learning capability as a knowledge-based dynamic capability. Learning capability builds on the micro-foundations of higher-order learning routines, which are deliberately developed by construction organisations for managing collaborative projects. Based on this conceptualisation of learning capability, an exploratory case study was conducted. The study investigated the operational and higher-order learning routines adopted by a project alliance team to successfully achieve VfM. The case study demonstrated that the learning routines of the alliance project were developed and modified through the continual joint learning activities of the participant organisations. Project-level learning routines were found to significantly influence the development of organisational-level learning routines. In turn, the learning outcomes generated from the alliance project appeared to significantly influence the development of the project management routines and contractual arrangements applied by the participant organisations in subsequent collaborative projects. The case study findings imply that the higher-order learning routines that underpin the learning capability of construction organisations have the potential to influence the VfM achieved on both current and future collaborative projects.
Abstract:
Learning capability (LC) is a special dynamic capability that a firm purposefully builds to develop a cognitive focus, so as to enable the configuration and improvement of other capabilities (both dynamic and operational) to create and respond to market changes. Empirical evidence regarding the essential role of LC in leveraging operational manufacturing capabilities is, however, limited in the literature. This study takes a routine-based approach to understanding capability, and focuses on demonstrating the leveraging power of LC over two essential operational capabilities within the manufacturing context, i.e., operational new product development capability (ONPDC) and operational supplier integration capability (OSIC). A mixed-methods research framework was used, combining sources of evidence derived from a survey study and a multiple case study. This study identified high-level routines of LC that can be designed and controlled by managers and practitioners to reconfigure the underlying routines of ONPDC and OSIC and achieve superior performance in a turbulent environment. Hence, the study advances the notion of knowledge-based dynamic capabilities, such as LC, as routine bundles. It also provides an impetus for managing manufacturing operations from a capability-based perspective in the fast-changing knowledge era.
Abstract:
Particulate matter research is essential because of the well-known significant adverse effects of aerosol particles on human health and the environment. In particular, identification of the origin or sources of particulate matter emissions is of paramount importance in assisting efforts to control and reduce air pollution in the atmosphere. This thesis aims to: identify the sources of particulate matter; compare pollution conditions at urban, rural and roadside receptor sites; combine information about the sources with meteorological conditions at the sites to locate the emission sources; compare sources based on particle size or mass; and, ultimately, provide the basis for control and reduction of particulate matter concentrations in the atmosphere. To achieve these objectives, data were obtained from assorted local and international receptor sites over long sampling periods. The samples were analysed using Ion Beam Analysis and Scanning Mobility Particle Sizer methods to measure the particle mass with chemical composition and the particle size distribution, respectively. Advanced data analysis techniques were employed to derive information from these large, complex data sets. Multi-Criteria Decision Making (MCDM), a ranking method, drew on data variability to examine the overall trends, and provided a rank ordering of the sites and the years in which sampling was conducted. Coupled with the receptor model Positive Matrix Factorisation (PMF), the pollution emission sources were identified and meaningful information pertinent to the prioritisation of control and reduction strategies was obtained. This thesis is presented in the thesis-by-publication format. It includes four refereed papers which together demonstrate a novel combination of data analysis techniques that enabled particulate matter sources to be identified and sampling sites/years to be ranked.
The strength of this source identification process was corroborated when the analysis procedure was expanded to encompass multiple receptor sites. Initially applied to identify the contributing sources at roadside and suburban sites in Brisbane, the technique was subsequently applied to three receptor sites (roadside, urban and rural) located in Hong Kong. The comparable results from these international and national sites over several sampling periods indicated similarities in source contributions between receptor site types, irrespective of global location, and suggested the need to apply these methods to air pollution investigations worldwide. Furthermore, an investigation into particle size distribution data was conducted to deduce the sources of aerosol emissions based on particle size and elemental composition. Considering the adverse effects on human health caused by small-sized particles, knowledge of particle size distributions and their elemental composition provides a different perspective on the pollution problem. This thesis clearly illustrates that the application of an innovative combination of advanced data interpretation methods to identify particulate matter sources and rank sampling sites/years provides the basis for the prioritisation of future air pollution control measures. Moreover, this study contributes significantly to knowledge of the chemical composition of airborne particulate matter in Brisbane, Australia, and of the identity and plausible locations of the contributing sources. Such novel source apportionment and ranking procedures are ultimately applicable to environmental investigations worldwide.
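The receptor-modelling step can be sketched as a non-negative matrix factorisation: PMF decomposes the samples-by-species concentration matrix X into non-negative source contributions G and source profiles F, with X ≈ GF. The multiplicative updates below are plain unweighted NMF, a simplified stand-in for the uncertainty-weighted least-squares solution PMF actually uses, and the data are synthetic.

```python
import numpy as np

# Simplified stand-in for PMF: non-negative factorisation X ~= G @ F,
# where rows of F are source profiles and columns of G are per-sample
# source contributions. Synthetic rank-2 data, multiplicative updates.
rng = np.random.default_rng(0)
n_samples, n_species, n_sources = 30, 8, 2
X = rng.random((n_samples, n_sources)) @ rng.random((n_sources, n_species))

G = rng.random((n_samples, n_sources)) + 0.1
F = rng.random((n_sources, n_species)) + 0.1
for _ in range(500):
    F *= (G.T @ X) / (G.T @ G @ F + 1e-12)   # update source profiles
    G *= (X @ F.T) / (G @ F @ F.T + 1e-12)   # update source contributions

err = np.linalg.norm(X - G @ F) / np.linalg.norm(X)
print(round(err, 4))  # small relative reconstruction error
```

The non-negativity constraints are what make the factors physically interpretable as emission sources, which is the property the thesis exploits when coupling PMF with MCDM ranking.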
Abstract:
We introduce the use of Ingenuity Pathway Analysis to analyze global metabonomics data in order to characterize the phenotypic biochemical perturbations and potential mechanisms of gentamicin-induced toxicity in multiple organs. A single dose of gentamicin was administered to Sprague Dawley rats (200 mg/kg, n = 6), and urine samples were collected at -24-0 h pre-dosage and 0-24, 24-48, 48-72 and 72-96 h post-dosage of gentamicin. The urine metabonomics analysis was performed by UPLC/MS, and the mass spectral signals of the detected metabolites were systematically deconvoluted and analyzed by pattern recognition analyses (heatmap, PCA and PLS-DA), revealing a time dependency of the biochemical perturbations induced by gentamicin toxicity. As a result, the holistic metabolome change induced by gentamicin toxicity in the animals was characterized. Several metabolites involved in amino acid metabolism were identified in urine, and it was confirmed that gentamicin's biochemical perturbations can be foreseen from these biomarkers. Notably, gentamicin was found to induce toxicity in multiple organ systems in the laboratory rats. The knowledge-based Ingenuity Pathway Analysis revealed gentamicin-induced liver and heart toxicity, along with the previously known toxicity in the kidney. The metabolites creatine, nicotinic acid, prostaglandin E2 and cholic acid were identified and validated as phenotypic biomarkers of gentamicin-induced toxicity. Altogether, the significance of metabonomics analyses in the assessment of drug toxicity is highlighted once more; furthermore, this work demonstrates the powerful predictive potential of Ingenuity Pathway Analysis for the study of drug toxicity and its value as a complement to metabonomics-based assessment of drug toxicity.
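The pattern-recognition step can be illustrated with a toy PCA: projecting a samples-by-metabolites intensity matrix onto its principal components separates pre- and post-dose profiles when dosing shifts metabolite levels. The data below are synthetic, not the study's measurements.

```python
import numpy as np

# Toy PCA on a synthetic samples x metabolites intensity matrix:
# 6 pre-dose and 6 post-dose urine profiles, 5 metabolites, with the
# post-dose group shifted to mimic a dosing effect.
rng = np.random.default_rng(1)
pre = rng.normal(0.0, 0.2, size=(6, 5))    # pre-dose profiles
post = rng.normal(1.0, 0.2, size=(6, 5))   # post-dose profiles (shifted)
X = np.vstack([pre, post])

Xc = X - X.mean(axis=0)                    # mean-centre each metabolite
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T                         # projections onto the PCs

# The two groups land on opposite sides of the origin along PC1.
print(scores[:6, 0].mean() * scores[6:, 0].mean() < 0)  # True
```

PLS-DA, also used in the study, adds the class labels to the decomposition; plain PCA is shown here because it needs only the intensity matrix.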
Abstract:
Policy makers increasingly recognise that an educated workforce with a high proportion of Science, Technology, Engineering and Mathematics (STEM) graduates is a prerequisite for a knowledge-based, innovative economy. Over the past ten years, the proportion of first university degrees awarded in Australia in STEM fields has remained below the global average, decreasing from 22.2% in 2002 to 18.8% in 2010 [1]. These trends are mirrored by declines of between 20% and 30% in the proportions of high school students enrolled in science or maths. These trends are not unique to Australia, but their impact is of concern throughout the policy-making community. To redress these demographic trends, QUT embarked upon a long-term investment strategy to integrate education and research into the physical and virtual infrastructure of the campus, recognising that the expectations of students change as rapidly as technology and learning practices change. To implement this strategy, refurbishment and re-building of physical infrastructure is accompanied by upgraded technologies, not only for learning but also for research. QUT's vision for its city-based campuses is to create vibrant and attractive places to learn and research, and to link strongly to the wider surrounding community. Over a five-year period, physical infrastructure at the Gardens Point campus was substantially reconfigured in two key stages: (a) a >$50m refurbishment of heritage-listed buildings to encompass public, retail and social spaces, learning and teaching "test beds" and research laboratories; and (b) demolition of five buildings, to be replaced by a $230m, >40,000 m2 Science and Engineering Centre designed to accommodate retail, recreation, services, education and research in an integrated, coordinated precinct.
This landmark project is characterised by (i) self-evident, collaborative spaces for learning, research and social engagement; (ii) sustainable building practices and sustainable ongoing operation; and (iii) dynamic and mobile re-configuration of spaces or staffing to meet demand. Innovative spaces allow for transformative, cohort-driven learning and the collaborative use of space to undertake joint class projects. Research laboratories are aggregated, centralised and "on display" to the public, students and staff. A major visualisation space, the largest multi-touch, multi-user facility constructed to date, is a centrepiece feature that focuses on demonstrating scientific and engineering principles or science-oriented scenes at large scale (e.g. the Great Barrier Reef). Content on this visualisation facility is integrated with regional school curricula and supports an in-house schools program for student and teacher engagement. Researchers are accommodated in combined open-plan and office floor-space (80% open plan) to encourage interdisciplinary engagement and the cross-fertilisation of skills, ideas and projects. This combination of spaces re-invigorates the on-campus experience, extends educational engagement across all ages and rapidly enhances research collaboration.
Abstract:
Recently, vision-based systems have been deployed in professional sports to track the ball and players to enhance the analysis of matches. Due to their unobtrusive nature, vision-based approaches are preferred to wearable sensors (e.g. GPS or RFID sensors), as they do not require players or balls to be instrumented prior to matches. Unfortunately, in continuous team sports where players need to be tracked continuously over long periods of time (e.g. 35 minutes in field hockey or 45 minutes in soccer), current vision-based tracking approaches are not reliable enough to provide fully automatic solutions. As such, human intervention is required to fix up missed or false detections. However, in instances where a human cannot intervene due to the sheer amount of data being generated, the data cannot be used because of the missing/noisy detections. In this paper, we investigate two representations based on raw player detections (and not tracking) which are immune to missed and false detections. Specifically, we show that both team occupancy maps and centroids can be used to detect team activities, while the occupancy maps can also be used to retrieve specific team activities. An evaluation on over 8 hours of field hockey data captured at a recent international tournament demonstrates the validity of the proposed approach.
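A team occupancy map of the kind described can be sketched by quantising the pitch into a coarse grid and counting player detections per cell; because it aggregates raw detections rather than tracks, a missed or false detection perturbs only a single cell count. The grid size and detection positions below are hypothetical (the pitch dimensions are standard field hockey, 91.4 m x 55 m).

```python
import numpy as np

# Sketch of a team occupancy map: bin raw (x, y) player detections
# into a coarse grid over the pitch. Positions and grid are made up.

def occupancy_map(detections, pitch=(91.4, 55.0), grid=(4, 3)):
    """Count detections per cell of a grid[0] x grid[1] pitch partition."""
    gx, gy = grid
    counts = np.zeros((gx, gy), dtype=int)
    for x, y in detections:
        i = min(int(x / pitch[0] * gx), gx - 1)  # clamp to last cell
        j = min(int(y / pitch[1] * gy), gy - 1)
        counts[i, j] += 1
    return counts

team = [(5.0, 10.0), (6.0, 12.0), (50.0, 30.0), (88.0, 50.0)]
m = occupancy_map(team)
print(m.sum())  # 4 detections binned
```

Per-frame maps like this (and their centroids) give fixed-length features that can be fed to an activity classifier without ever resolving player identities.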
Abstract:
In this paper, we describe a method to represent and discover adversarial group behavior in a continuous domain. In comparison to other types of behavior, adversarial behavior is heavily structured, as the location of a player (or agent) depends both on their teammates and adversaries, in addition to the tactics or strategies of the team. We present a method which can exploit this relationship through the use of a spatiotemporal basis model. As players constantly change roles during a match, we show that employing a "role-based" representation, instead of one based on player "identity", can best exploit the playing structure. As vision-based systems currently do not provide perfect detection/tracking (e.g. missed or false detections), we show that our compact representation can effectively "denoise" erroneous detections as well as enable temporal analysis, which was previously prohibitive due to the dimensionality of the signal. To evaluate our approach, we used a fully instrumented field-hockey pitch with 8 fixed high-definition (HD) cameras, evaluated our approach on approximately 200,000 frames of data from a state-of-the-art real-time player detector, and compared it to manually labelled data.
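The "role-based" representation can be sketched as a per-frame assignment of unordered detections to a fixed set of role templates, minimising total squared distance. The brute-force assignment and the positions below are a hypothetical simplification for illustration, not the paper's formulation.

```python
from itertools import permutations

# Sketch of role assignment: map each frame's unordered detections to
# fixed role templates by minimum total squared distance. Positions
# and role labels are invented; brute force is fine for tiny examples.

def assign_roles(roles, detections):
    """Return, per role, the index of the detection assigned to it."""
    def cost(perm):
        return sum((rx - detections[p][0]) ** 2 + (ry - detections[p][1]) ** 2
                   for (rx, ry), p in zip(roles, perm))
    return min(permutations(range(len(detections))), key=cost)

roles = [(10.0, 10.0), (50.0, 30.0), (80.0, 20.0)]  # e.g. back, mid, forward
frame = [(79.0, 21.0), (11.0, 9.0), (49.0, 31.0)]   # unordered detections
print(assign_roles(roles, frame))  # (1, 2, 0): detection index per role
```

Ordering detections by role makes the per-frame vector consistent over time, which is what allows a low-dimensional basis to denoise detections and support temporal analysis; for full teams a polynomial-time assignment solver would replace the brute-force search.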
Abstract:
Building knowledge economies seems synonymous with re-imaging urban fabrics. Cities that produce vibrant public realms are believed to have better success in distinguishing themselves within a highly competitive market. Many governments are investing heavily in cultural enhancements to build distinctive cosmopolitan centers, in which public art is emerging as a significant stakeholder. Brisbane's goal to grow a knowledge-based economy similarly addresses public art. To stimulate engagement with public art, Brisbane City Council has delivered an online public art catalogue and assembled three public art trails, with a fourth newly augmented. While many pieces along these trails are obviously public, others question the term 'public' through an obscured milieu where a 'look but don't touch' policy is subtly implied. This study investigates the interactional relationship between publics and public art and, in doing so, explores the concept of accessibility. This paper recommends that installations of sculpture within an emerging city should be considered in terms of economic output, measured by the degree to which the public engages.
Abstract:
Objectives: Experiential knowledge of elite athletes and coaches was investigated to reveal insights on expertise acquisition in cricket fast bowling. Design: Twenty-one past or present elite cricket fast bowlers and coaches of national or international level were interviewed using an in-depth, open-ended, semi-structured approach. Methods: Participants were asked about specific factors which they believed were markers of fast bowling expertise potential. Of specific interest was the relative importance of each potential component of fast bowling expertise and how components interacted or developed over time. Results: The importance of intrinsic motivation early in development was highlighted, along with physical, psychological and technical attributes. Results supported a multiplicative and interactive complex systems model of talent development in fast bowling, in which component weightings varied due to individual differences among potential experts. Dropout rates among potential experts were attributed to current, misconceived talent identification programmes and coaching practices; early maturation and physical attributes; injuries; and a lack of key psychological attributes and skills. Conclusions: Data are consistent with a dynamical systems model of expertise acquisition in fast bowling, with numerous trajectories available for talent development. Further work is needed to relate experiential and theoretical knowledge on expertise in other sports.