581 results for pacs: knowledge engineering techniques
Abstract:
Foreword: In this paper I call upon a praxiological approach. Praxeology (an early alteration of praxiology) is the study of human action and conduct. The name praxeology/praxiology takes its root in praxis, Medieval Latin, from the Greek for doing, action, from prassein, to do, practice (Merriam-Webster Dictionary). Having been involved in project management education, research and practice for the last twenty years, I have constantly tried to improve and to provide a better understanding/knowledge of the field and related practice, and as a consequence to widen and deepen the competencies of the people I was working with (and my own competencies as well!), assuming that better project management leads to more efficient and effective use of resources, the development of people and, in the end, to a better world. For some time I have perceived a need to clarify the foundations of the discipline of project management, or at least to elucidate what these foundations could be. An immodest task, one might say! But not a neutral one! I am constantly surprised by the way the world (i.e., organizations, universities, students and professional bodies) sees project management: as a set of methods, techniques and tools interacting with other fields – general management, engineering, construction, information systems, etc. – bringing some effective ways of dealing with various sets of problems, from launching a new satellite to product development through to organizational change.
Abstract:
This paper introduces the Weighted Linear Discriminant Analysis (WLDA) technique, based upon the weighted pairwise Fisher criterion, for the purposes of improving i-vector speaker verification in the presence of high intersession variability. By taking advantage of the speaker discriminative information that is available in the distances between pairs of speakers clustered in the development i-vector space, the WLDA technique is shown to provide an improvement in speaker verification performance over traditional Linear Discriminant Analysis (LDA) approaches. A similar approach is also taken to extend the recently developed Source Normalised LDA (SNLDA) into Weighted SNLDA (WSNLDA) which, similarly, shows an improvement in speaker verification performance in both matched and mismatched enrolment/verification conditions. Based upon the results presented within this paper using the NIST 2008 Speaker Recognition Evaluation dataset, we believe that both WLDA and WSNLDA are viable as replacement techniques to improve the performance of LDA and SNLDA-based i-vector speaker verification.
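The weighted pairwise Fisher criterion underlying WLDA can be sketched in a few lines of numpy. This is only an illustration under stated assumptions: the weighting function w(d) = 1/d² and all names below are our own choices, not the exact formulation used in the paper.

```python
import numpy as np

def weighted_lda(X, y, w=lambda d: 1.0 / (d * d)):
    """Projection directions from a weighted pairwise Fisher criterion.

    Pairs of classes whose means lie close together receive larger
    weights (here w(d) = 1/d^2, an illustrative choice), so the
    projection works harder to separate easily confused speakers.
    """
    classes = np.unique(y)
    mu = np.array([X[y == c].mean(axis=0) for c in classes])
    # Within-class scatter, pooled over all classes.
    Sw = sum(np.cov(X[y == c].T, bias=True) * (y == c).sum() for c in classes)
    # Weighted between-class scatter, summed over all class pairs.
    Sb = np.zeros((X.shape[1], X.shape[1]))
    for i in range(len(classes)):
        for j in range(i + 1, len(classes)):
            diff = (mu[i] - mu[j])[:, None]
            Sb += w(np.linalg.norm(diff)) * (diff @ diff.T)
    # Eigenvectors of Sw^-1 Sb, sorted by eigenvalue, give the projection.
    vals, vecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    return vecs.real[:, np.argsort(vals.real)[::-1]]
```

With w constant, this reduces to ordinary LDA; a distance-dependent w is what makes the criterion "weighted".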
Abstract:
With the advent of social web initiatives, some argued that these new emerging tools might be useful for tacit knowledge sharing by providing interactive and collaborative technologies. However, there is still a paucity of literature on how, and to what extent, social media can facilitate tacit knowledge sharing. Therefore, this paper theoretically investigates and maps social media concepts and characteristics against the requirements of tacit knowledge creation and sharing. Through a systematic literature review, five major requirements were found that need to be present in an environment that involves tacit knowledge sharing. These requirements were then analyzed against social media concepts and characteristics to see how they map together. The results showed that social media can satisfy some of the main requirements of tacit knowledge sharing. The relationships are illustrated in a conceptual framework, and further empirical studies are suggested to validate the findings of this study.
Abstract:
This paper presents an extended granule mining based methodology to effectively describe the relationships between granules not only by traditional support and confidence, but by diversity and condition diversity as well. Diversity measures how diversely a granule is associated with other granules, and it provides a novel kind of knowledge in databases. We also provide an algorithm to implement the proposed methodology. Experiments conducted to characterize a real network traffic data collection show that the proposed concepts and algorithm are promising.
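The support/diversity idea can be made concrete with a small sketch. Caution: the paper's formal definition of diversity is not given in the abstract, so the reading below (fraction of other granules that ever co-occur with a given granule) is purely our illustration.

```python
def granule_measures(transactions, g):
    """Support of granule g, plus a hedged 'diversity' reading: the
    fraction of all other granules that co-occur with g at least once.
    The paper's exact diversity definition may differ; this is only
    an illustration of the kind of measure involved.
    """
    n = len(transactions)
    support = sum(g in t for t in transactions) / n
    partners, universe = set(), set()
    for t in transactions:
        universe |= set(t)
        if g in t:
            partners |= set(t) - {g}
    diversity = len(partners) / max(len(universe - {g}), 1)
    return support, diversity
```

A granule that always appears with the same few partners scores low diversity; one that mixes with many different granules scores high.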
Abstract:
Sustainability has emerged as a primary context for engineering education in the 21st century, particularly in the sub-discipline of chemical engineering. However, there is confusion over how to integrate sustainability knowledge and skills systemically within bachelor degrees. This paper addresses this challenge, using a case study of an Australian chemical engineering degree to highlight important practical considerations for embedding sustainability at the core of the curriculum. The paper begins with context for considering a systematic process for rapid curriculum renewal. The authors then summarise a 2-year federally funded project, which comprised piloting a model for rapid curriculum renewal led by the chemical engineering staff. Model elements contributing to the renewal of this engineering degree and described in this paper include: industry outreach; staff professional development; attribute identification and alignment; program mapping; and curriculum and teaching resource development. Personal reflections on the progress and process of rapid curriculum renewal in sustainability by the authors and participating engineering staff are presented as a means to discuss and identify methodological improvements, as well as to highlight barriers to project implementation. It is hoped that this paper will provide an example of a formalised methodology on which program reform and curriculum renewal for sustainability can be built in other higher education institutions.
Abstract:
Web 2.0 is a new generation of online applications that permit people to collaborate and share information online. The use of such applications by employees enhances knowledge management (KM) in organisations. Employee involvement is a critical success factor, as the concept is based on openness, engagement and collaboration between people, where organisational knowledge is derived from employees' experience, skills and best practices. Consequently, employees' perceptions are recognised as an important factor in Web 2.0 adoption for KM and worthy of investigation. Few studies define and explore employees' acceptance of enterprise 2.0 for KM. This paper provides a systematic review of the literature and presents the findings as a preliminary conceptual model, representing the first stage of an ongoing research project that will culminate in an empirical study. Reviewing available studies in the technology acceptance, knowledge management and enterprise 2.0 literatures helps to identify the potential user acceptance factors of enterprise 2.0. The preliminary conceptual model is a refinement of the theory of planned behaviour (TPB): the user acceptance factors have been mapped onto the TPB's main components (behavioural attitude, subjective norms and behavioural control), which are the determinants of an individual's intention to perform a particular behaviour.
Abstract:
EMR (Electronic Medical Record) is an emerging technology that blends the non-IT and IT areas, and one way to link them is to construct databases. Nowadays, an EMR supports before- and after-treatment care for patients and should satisfy all stakeholders, such as practitioners, nurses, researchers, administrators and financial departments. For database maintenance, the DAS (Data as a Service) model is one solution for outsourcing. However, there are scalability and strategy issues to consider when planning to use the DAS model properly. We constructed three kinds of databases, scaling from 5K to 2560K records: plain-text, MS built-in encryption (an in-house model) and a custom AES (Advanced Encryption Standard) DAS model. To make the custom AES-DAS model perform better, we also devised a bucket index using a Bloom filter. The simulation showed that response times increased arithmetically at first but, beyond a certain threshold, increased exponentially. In conclusion, if the database model is close to an in-house model, then vendor technology is a good way to obtain consistent query response times. If the model is a DAS model, it is easy to outsource the database, and techniques like the bucket index enhance its utilisation. To get faster query response times, database design considerations such as field types are also important. This study suggests cloud computing as a next DAS model to satisfy the scalability and security issues.
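A bucket index with Bloom filters, as used to speed up the custom AES-DAS model, can be sketched as follows. The bucket count, filter sizes and all names here are illustrative assumptions, not the paper's exact design; the point is that the server can rule out buckets of encrypted records without decrypting anything.

```python
import hashlib

class BloomBucketIndex:
    """Illustrative bucket index over encrypted records: plaintext
    values hash into coarse buckets, and each bucket keeps a small
    Bloom filter (a bitmask) of the values stored in it."""

    def __init__(self, n_buckets=16, m_bits=64, k_hashes=3):
        self.n_buckets, self.m, self.k = n_buckets, m_bits, k_hashes
        self.filters = [0] * n_buckets  # one Bloom bitmask per bucket

    def _bucket(self, value):
        return hash(value) % self.n_buckets

    def _bits(self, value):
        # k independent bit positions derived from SHA-256 digests.
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{value}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, value):
        b = self._bucket(value)
        for bit in self._bits(value):
            self.filters[b] |= 1 << bit
        return b  # the encrypted record would be stored under bucket b

    def candidate_buckets(self, value):
        """Buckets that may contain value (no false negatives;
        occasional false positives are the Bloom filter trade-off)."""
        b = self._bucket(value)
        if all(self.filters[b] >> bit & 1 for bit in self._bits(value)):
            return [b]
        return []
```

A query first asks `candidate_buckets`, then decrypts and filters only the returned buckets, which is where the response-time savings come from.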
Abstract:
Electronic Health Record (EHR) retrieval processes are complex and place exponentially growing demands on Information Technology (IT) resources, in particular memory usage. A Database-as-a-Service (DAS) model approach is proposed to meet the scalability requirements of EHR retrieval processes. A simulation study using a range of EHR record volumes with the DAS model is presented. A bucket-indexing model, incorporating partitioning fields and Bloom filters within a Singleton design pattern, was used to implement a custom database encryption system. It provided faster responses for range queries than for the other query types used, such as aggregation queries, across the DAS, built-in encryption and plain-text DBMS configurations. The study also presents constraints of the approach that should be considered in other practical applications.
Abstract:
Acoustic emission (AE) analysis is one of several diagnostic techniques available nowadays for structural health monitoring (SHM) of engineering structures. Its advantages over other techniques include high sensitivity to crack growth and the capability of monitoring a structure in real time. Acoustic emission is the phenomenon of rapid release of energy within a material, by crack initiation or growth, in the form of stress waves. In the AE technique, these stress waves are recorded by suitable sensors placed on the surface of a structure, and the recorded signals are subsequently analysed to gather information about the nature of the source. By enabling early detection of crack growth, the AE technique helps in planning timely retrofitting, other maintenance jobs or even replacement of the structure if required. In spite of being a promising tool, some challenges still stand in the way of the successful application of the AE technique. A large amount of data is generated during AE testing, so effective data analysis is necessary, especially for long-term monitoring uses. Appropriate analysis of AE data for quantification of damage level is an area that has received considerable attention. Various approaches available for damage quantification and severity assessment are discussed in this paper, with special focus on civil infrastructure such as bridges. One method, called improved b-value analysis, is used to analyse data collected from laboratory testing.
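The improved b-value (Ib) mentioned above is commonly formulated, following work attributed to Shiotani and colleagues, from the mean and standard deviation of AE hit amplitudes. A minimal sketch assuming that common formulation and default alpha coefficients (the paper may use different values):

```python
import numpy as np

def improved_b_value(amplitudes_db, alpha1=1.0, alpha2=1.0):
    """Improved b-value (Ib) of AE amplitude data, per dB.

    N(x) counts hits with amplitude >= x; mu and sigma are the sample
    mean and standard deviation of the amplitudes (in dB):
        Ib = [log10 N(mu - a1*sigma) - log10 N(mu + a2*sigma)]
             / ((a1 + a2) * sigma)
    """
    a = np.asarray(amplitudes_db, dtype=float)
    mu, sigma = a.mean(), a.std()
    n_low = (a >= mu - alpha1 * sigma).sum()
    n_high = (a >= mu + alpha2 * sigma).sum()
    if n_low == 0 or n_high == 0:
        raise ValueError("not enough hits in the amplitude tails")
    return (np.log10(n_low) - np.log10(n_high)) / ((alpha1 + alpha2) * sigma)
```

In damage-severity studies a drop in Ib over time is typically read as a shift toward larger-amplitude events, i.e. macro-crack growth.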
A tan in a test tube - in vitro models for investigating ultraviolet radiation-induced damage in skin
Abstract:
Presently, global rates of skin cancers induced by ultraviolet radiation (UVR) exposure are on the rise. In view of this, current knowledge gaps in the biology of photocarcinogenesis and skin cancer progression urgently need to be addressed. One factor that has limited skin cancer research has been the need for a reproducible and physiologically-relevant model able to represent the complexity of human skin. This review outlines the main currently-used in vitro models of UVR-induced skin damage. This includes the use of conventional two-dimensional cell culture techniques and the major animal models that have been employed in photobiology and photocarcinogenesis research. Additionally, the progression towards the use of cultured skin explants and tissue-engineered skin constructs, and their utility as models of native skin's responses to UVR are described. The inherent advantages and disadvantages of these in vitro systems are also discussed.
Abstract:
Mesenchymal stem cells (MSCs) are undifferentiated, multipotent stem cells with the ability to self-renew. They can differentiate into many types of terminal cells, such as osteoblasts, chondrocytes, adipocytes, myocytes, and neurons, and have been applied in tissue engineering as the main cell type for regenerating new tissues. However, a number of issues remain concerning the use of MSCs, such as cell surface markers, the factors that determine their differentiation into terminal cells, and the mechanisms whereby growth factors stimulate MSCs. In this chapter, we discuss how proteomic techniques have contributed to our current knowledge and how they can be used to address issues currently facing MSC research. The application of proteomics has led to the identification of a specific pattern of cell surface protein expression in MSCs. The technique has also contributed to the study of the regulatory network governing MSC differentiation into terminally differentiated cells, including osteocytes, chondrocytes, adipocytes, neurons, cardiomyocytes, hepatocytes, and pancreatic islet cells, and has helped elucidate mechanisms of growth factor-stimulated differentiation of MSCs. Proteomics alone cannot, however, reveal the precise role of a particular pathway and must therefore be combined with other approaches for this purpose. A new generation of proteomic techniques has recently been developed, which will enable a more comprehensive study of MSCs.
Abstract:
This thesis develops a detailed conceptual design method and a system software architecture, defined with a parametric and generative evolutionary design system, to support an integrated, interdisciplinary building design approach. The research recognises the need to shift design effort toward the earliest phases of the design process to support crucial design decisions that have a substantial cost implication for the overall project budget. The overall motivation of the research is to improve the quality of designs produced at the author's employer, the General Directorate of Major Works (GDMW) of the Saudi Arabian Armed Forces. GDMW produces many buildings that have standard requirements across a wide range of environmental and social circumstances, so a rapid means of customising designs for local circumstances would have significant benefits. The research considers the use of evolutionary genetic algorithms in the design process and their ability to generate and assess a wider range of potential design solutions than a human could manage. This wider-ranging assessment, during the early stages of the design process, means that the generated solutions will be more appropriate for the defined design problem. The research proposes a design method and system that promote a collaborative relationship between human creativity and computer capability. A tectonic design approach is adopted as a process-oriented design that values the process of design as much as the product. The aim is to connect the evolutionary system to performance assessment applications, which are used as prioritised fitness functions. This produces design solutions that respond to their environmental and functional requirements. This integrated, interdisciplinary approach to design produces solutions through a design process that considers and balances the requirements of all aspects of the design.
Since this thesis covers a wide area of research material, a 'methodological pluralism' approach was used, incorporating both prescriptive and descriptive research methods. Multiple models of research were combined, and the overall research was undertaken in three main stages: conceptualisation, development and evaluation. The first two stages lay the foundations for the specification of the proposed system, where key aspects of the system that have not previously been proven in the literature were implemented to test the feasibility of the system. By combining the existing knowledge in the area with the newly verified key aspects of the proposed system, this research can form the base for a future software development project. The evaluation stage, which includes building the prototype system to test and evaluate system performance against the criteria defined in the earlier stage, is not within the scope of this thesis. The research results in a conceptual design method and a proposed system software architecture. The proposed system is called the 'Hierarchical Evolutionary Algorithmic Design (HEAD) System'. The HEAD system has been shown to be feasible through an initial illustrative paper-based simulation. The HEAD system consists of two main components: the 'Design Schema' and the 'Synthesis Algorithms'. The HEAD system reflects the major research contribution in the way it is conceptualised, while secondary contributions are achieved within the system components. The design schema provides constraints on the generation of designs, enabling the designer to create a wide range of potential designs that can then be analysed for desirable characteristics. The design schema supports the digital representation of designers' creativity within a dynamic design framework that can be encoded and then executed through the use of evolutionary genetic algorithms.
The design schema incorporates 2D and 3D geometry and graph theory for space layout planning and building formation, using the Lowest Common Design Denominator (LCDD) of a parameterised 2D module and a 3D structural module. This provides a bridge between the standard adjacency requirements and the evolutionary system. The use of graphs as an input to the evolutionary algorithm supports the introduction of constraints in a way that is not supported by standard evolutionary techniques. The process of design synthesis is guided by a higher-level description of the building that supports geometrical constraints. The Synthesis Algorithms component analyses designs at four levels: 'Room', 'Layout', 'Building' and 'Optimisation'. At each level, multiple fitness functions are embedded into the genetic algorithm to target the specific requirements of the relevant decomposed part of the design problem. Decomposing the design problem so that the design requirements of each level can be dealt with separately, and then reassembling them in a bottom-up approach, reduces the generation of non-viable solutions by constraining the options available at the next higher level. The iterative approach of exploring the range of design solutions through modification of the design schema, as the understanding of the design problem improves, assists in identifying conflicts in the design requirements. Additionally, the hierarchical set-up allows the embedding of multiple fitness functions into the genetic algorithm, each relevant to a specific level. This supports an integrated multi-level, multi-disciplinary approach. The HEAD system promotes a collaborative relationship between human creativity and computer capability. The design schema component, as the input to the procedural algorithms, enables the encoding of certain aspects of the designer's subjective creativity.
By focusing on finding solutions for the relevant sub-problems at the appropriate levels of detail, the hierarchical nature of the system assists in the design decision-making process.
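The idea of a genetic algorithm with prioritised, level-specific fitness functions can be illustrated with a toy sketch. Everything below is our own illustrative assumption, not the HEAD system's actual schema or algorithms: individuals are (width, depth) room candidates, and a lexicographic tuple of scores stands in for prioritised fitness functions.

```python
import random

def evolve(seed_pop, mutate, fitnesses, generations=50, keep=10):
    """Tiny generic GA loop: each level of a hierarchical scheme would
    call this with its own mutation operator and its own prioritised
    list of fitness functions (earlier entries dominate)."""
    pop = list(seed_pop)

    def score(ind):
        # Lexicographic comparison of the tuple realises the priority
        # ordering among fitness functions.
        return tuple(f(ind) for f in fitnesses)

    for _ in range(generations):
        pop.sort(key=score, reverse=True)        # best individuals first
        parents = pop[:keep]                     # elitist selection
        pop = parents + [mutate(random.choice(parents))
                         for _ in range(len(pop) - keep)]
    return max(pop, key=score)

# Hypothetical 'Room' level: evolve a (width, depth) pair toward a
# 20 m^2 target area first, then toward a small perimeter.
seed = [(random.uniform(1, 10), random.uniform(1, 10)) for _ in range(30)]
mutate_room = lambda r: (max(1.0, r[0] + random.gauss(0, 0.3)),
                         max(1.0, r[1] + random.gauss(0, 0.3)))
best = evolve(seed, mutate_room,
              fitnesses=[lambda r: -abs(r[0] * r[1] - 20.0),  # area target dominates
                         lambda r: -2.0 * (r[0] + r[1])])     # then minimise perimeter
```

In a hierarchical arrangement, the surviving room candidates would become the alphabet for a 'Layout'-level run with its own operators and fitness functions.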
Abstract:
Companies face the challenges of expanding their markets, improving products, services and processes, and exploiting intellectual capital in a dynamic network. Therefore, more companies are turning to an Enterprise System (ES). Knowledge management (KM) has also received considerable attention and is continuously gaining the interest of industry, enterprises, and academia. For an ES, KM can provide support across the entire lifecycle, from selection and implementation to use. In addition, it is recognised that an ontology is an appropriate means to establish a common consensus for communication, as well as to support a diversity of KM activities, such as knowledge storage, retrieval, sharing, and dissemination. This paper examines the role of ontology-based KM for ES (OKES) and investigates the possible integration of ontology-based KM and ES. The authors develop a taxonomy as a framework for understanding OKES research. In order to achieve the objective of this study, a systematic review of existing research was conducted. Based on a theoretical framework covering the ES lifecycle, KM, KM for ES, ontology, and ontology-based KM, a taxonomy for OKES is established.
Abstract:
Bone's capacity to repair following trauma is both unique and astounding. However, fractures sometimes fail to heal. Hence, the goal of fracture treatment is the restoration of bone's structure, composition and function. Fracture fixation devices should provide a favourable mechanical and biological environment for healing to occur. The use of internal fixation is increasing as these devices may be applied with less invasive techniques. Recent studies suggest, however, that internal fixation devices may be overly stiff and suppress callus formation. The degree of mechanical stability influences the healing outcome, and it is determined by the stiffness of the fixation device and the degree of limb loading. This project aims to characterise the fixation stability of an internal plate fixation device and the influence of modifications to its configuration on implant stability. As there are no standardised methods for the determination of fixation stiffness, the first part of this project compares different methodologies and determines the most appropriate method to characterise the stiffness of internal plate fixators. The stiffness of a fixation device also influences the physiological loads experienced by the healing bone. Since bone adapts to applied load through a remodelling process, undesirable changes could occur during the period of treatment with an implant. The second part of this project develops a methodology to quantify remodelling changes. This quantification is expected to aid our understanding of the patterns of implant-related remodelling and of the factors driving the remodelling process. Knowledge gained in this project is useful for understanding how the configuration of internal fixation devices can promote timely healing and prevent undesirable bone loss.
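One common way to characterise construct stiffness, used here purely as a minimal sketch (the thesis compares several methodologies, and this is only one illustrative choice), is the slope of the linear region of a load-displacement curve:

```python
import numpy as np

def construct_stiffness(load_n, displacement_mm):
    """Construct stiffness (N/mm) as the least-squares slope of a
    load-displacement curve.  Assumes the supplied points lie in the
    linear (elastic) region of the test."""
    slope, _intercept = np.polyfit(displacement_mm, load_n, 1)
    return slope
```

For example, a plate that deflects 0.5 mm per 100 N of applied load has a stiffness of 200 N/mm; comparing such slopes across plate configurations is one way to rank their relative rigidity.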
Abstract:
The construction phase of building projects is often a crucial factor in the success or failure of projects. Project managers are believed to play a significant role in firms' success and competitiveness. Therefore, it is important for firms to better understand the demands of managing projects and the competencies that project managers require for more effective project delivery. In a survey of building project managers in the state of Queensland, Australia, it was found that management and information management systems are the top-ranking competencies required by effective project managers. Furthermore, a significant number of respondents identified the site manager, construction manager and client's representative as the three individuals whose close and regular contact with project managers has the greatest influence on project managers' performance. Based on these findings, an intra-project workgroups model is proposed to help project managers facilitate more effective management of people and information on building projects.