921 results for pacs: C6170K knowledge engineering techniques


Relevância:

40.00%

Publicador:

Resumo:

Even though research on innovation in services has expanded remarkably, especially during the past two decades, there is still a need to deepen understanding of the special characteristics of service innovation. In addition to studying innovation in service companies and industries, research has recently also focused more on services in innovation, as the significance of so-called knowledge intensive business services (KIBS) for the competitive edge of their clients, other companies, regions and even nations has been demonstrated in several previous studies. This study focuses on technology-based KIBS firms, and the technology and engineering consulting (TEC) sector in particular. These firms have multiple roles in innovation systems, and thus there is also a need for in-depth studies that increase knowledge about the types and dimensions of service innovations, as well as the underlying mechanisms and procedures that make innovations successful. The main aim of this study is to generate new knowledge in the fragmented research field of service innovation management by recognizing the different types of innovations in TEC services and some of the enablers of and barriers to innovation capacity in the field, especially from the knowledge management perspective. The study also aims to shed light on some of the existing routines and new constructions needed for enhancing service innovation and knowledge processing activities in KIBS companies of the TEC sector. The main data in this research include literature reviews and public data sources, combined with a qualitative research approach of exploratory case studies based on interviews at technology consulting companies in Singapore in 2006. These complement the qualitative interview data gathered previously in Finland during a larger research project in 2004-2005. The data is also supplemented by a survey conducted in Singapore.
The respondents for the survey by Tan (2007) were technology consulting companies operating in the Singapore region. The purpose of the quantitative part of the study was to validate and further examine specific aspects such as the influence of knowledge management activities on innovativeness and the different types of service innovations in which the technology consultancies are involved. Singapore is known as a South-east Asian knowledge hub and is thus a significant research area where several multinational knowledge-intensive service firms operate. Typically, the service innovations identified in the studied TEC firms were formed by several dimensions of innovation. In addition to technological aspects, innovations were, for instance, related to new client interfaces and service delivery processes. The main enablers of and barriers to innovation seem to be partly similar in Singaporean firms as compared to the earlier study of Finnish TEC firms. The empirical studies also brought forth the significance of various sources of knowledge and knowledge processing activities as the main driving forces of service innovation in technology-related KIBS firms. A framework was also developed to study the effect of knowledge processing capabilities, as well as some moderators, on the innovativeness of TEC firms. Especially efficient knowledge acquisition and environmental dynamism seem to influence the innovativeness of TEC firms positively. The results of the study also contribute to the present service innovation literature by focusing more on 'innovation within KIBS' rather than 'innovation through KIBS', which has been the typical viewpoint stressed in the previous literature. Additionally, the study provides several possibilities for further research.
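The kind of framework described here, where a moderator such as environmental dynamism conditions the effect of knowledge acquisition on innovativeness, is typically estimated with a moderated regression. The sketch below is purely illustrative: the data, variable names, and coefficient values are hypothetical assumptions, not the study's actual data or model.

```python
import numpy as np

# Hypothetical moderated-regression sketch: innovativeness as a function of
# knowledge acquisition (ka), environmental dynamism (dyn), and their
# interaction (the moderation effect).
rng = np.random.default_rng(0)
n = 200
ka = rng.normal(size=n)   # knowledge acquisition score (assumed, standardized)
dyn = rng.normal(size=n)  # environmental dynamism score (assumed, standardized)

# Assumed "true" effects used only to generate the synthetic outcome.
innovativeness = (0.5 + 0.8 * ka + 0.4 * dyn + 0.3 * ka * dyn
                  + rng.normal(scale=0.1, size=n))

# Design matrix: intercept, both main effects, and the interaction term.
X = np.column_stack([np.ones(n), ka, dyn, ka * dyn])
coef, *_ = np.linalg.lstsq(X, innovativeness, rcond=None)
b0, b_ka, b_dyn, b_inter = coef
print(f"ka effect: {b_ka:.2f}, dynamism effect: {b_dyn:.2f}, "
      f"interaction: {b_inter:.2f}")
```

A positive estimate on the interaction term would correspond to the finding that knowledge acquisition matters more for innovativeness in dynamic environments.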

Relevância:

40.00%

Publicador:

Resumo:

Micronization techniques based on supercritical fluids (SCFs) are promising for the production of particles with controlled size and distribution. The interest of the pharmaceutical field in the development of SCF techniques is increasing due to the need for clean processes and reduced energy consumption, and due to their many possible applications. The food field is still far from applying SCF micronization techniques, but there is increasing interest, mainly for the processing of products with high added value. The aim of this study is to use SCF micronization techniques for the production of particles of pharmaceuticals and food ingredients with controlled particle size and morphology, and to examine their production at semi-industrial scale. The results obtained are also used to understand the processes from the perspective of broader application within the pharmaceutical and food industries. Certain pharmaceuticals, a biopolymer and a food ingredient have been tested using the supercritical antisolvent micronization (SAS) or supercritical assisted atomization (SAA) techniques. The reproducibility of the SAS technique has been studied using physically different apparatuses at both laboratory and semi-industrial scale. Moreover, a comparison between semi-continuous and batch mode has been performed. The behaviour of the system during the SAS process has been observed using a windowed precipitation vessel. The micronized powders have been characterized by particle size and distribution, morphology and crystallinity. Several analyses have been performed to verify whether the SCF process modified the structure of the compound or caused degradation or contamination of the product. The different powder morphologies obtained have been linked to the position of the process operating point with respect to the vapour-liquid equilibrium (VLE) of the systems studied, that is, mainly to the position of the mixture critical point (MCP).
Spherical micro-, submicro- and nanoparticles, expanded microparticles (balloons) and crystals were obtained by SAS. The particles obtained were amorphous or had different degrees of crystallinity and, in some cases, different pseudo-polymorphic or polymorphic forms. A compound that could not be processed using SAS was micronized by SAA, and amorphous particles were obtained that were stable in vials at room temperature. The SCF micronization techniques studied proved to be effective and versatile for the production of particles for several uses. Furthermore, the findings of this study and the acquired knowledge of the proposed processes allow a more conscious application of SCF techniques to obtain products with the desired characteristics, and enable the use of their principles in broader applications.

Relevância:

40.00%

Publicador:

Resumo:

Formal software development processes and well-defined development methodologies are nowadays seen as the definitive way to produce high-quality software within time limits and budgets. The variety of such high-level methodologies is huge, ranging from rigorous process frameworks like CMMI and RUP to more lightweight agile methodologies. The need for managing this variety, and the fact that practically every software development organization has its own unique set of development processes and methods, have created the profession of software process engineer. Different kinds of informal and formal software process modeling languages are essential tools for process engineers. These are used to define processes in a way which allows easy management of processes, for example process dissemination, process tailoring and process enactment. Process modeling languages are usually used as a tool for process engineering, where the main focus is on the processes themselves. This dissertation has a different emphasis: it analyses modern software development process modeling from the software developers' point of view. The goal of the dissertation is to investigate whether software process modeling and software process models aid software developers in their day-to-day work, and what the main mechanisms for this are. The focus of the work is on the Software Process Engineering Metamodel (SPEM) framework, currently one of the most influential process modeling notations in software engineering. The research theme is elaborated through six scientific articles which represent the dissertation research done on process modeling over an approximately five-year period. The research follows the classical engineering research discipline, where the current situation is analyzed, a potentially better solution is developed, and finally its implications are analyzed.
The research applies a variety of different research techniques, ranging from literature surveys to qualitative studies done amongst software practitioners. The key finding of the dissertation is that software process modeling notations and techniques are usually developed in process engineering terms. As a consequence, the connection between the process models and actual development work is loose. In addition, modeling standards like SPEM are partially incomplete when it comes to pragmatic process modeling needs, like lightweight modeling and combining pre-defined process components. This leads to a situation where the full potential of process modeling techniques for aiding daily development activities cannot be achieved. Despite these difficulties, the dissertation shows that it is possible to use modeling standards like SPEM to aid software developers in their work. The dissertation presents a lightweight modeling technique which software development teams can use to quickly analyze their work practices in a more objective manner. The dissertation also shows how process modeling can be used to compare different software development situations more easily and to analyze their differences in a systematic way. Models also help to share this knowledge with others. A qualitative study done amongst Finnish software practitioners verifies the conclusions of the other studies in the dissertation. Although processes and development methodologies are seen as an essential part of software development, process modeling techniques are rarely used during daily development work. However, the potential of these techniques intrigues the practitioners. In conclusion, the dissertation shows that process modeling techniques, most commonly used as tools for process engineers, can also be used as tools for organizing daily software development work.
This work presents theoretical solutions for bringing process modeling closer to ground-level software development activities. These theories are shown to be feasible through several case studies where the modeling techniques are used, e.g., to find differences in the work methods of the members of a software team and to share process knowledge with a wider audience.

Relevância:

40.00%

Publicador:

Resumo:

Information and communication technologies are the tools that underpin the emerging "Knowledge Society". Exchange of information or knowledge between people and through networks of people has always taken place. But ICT has radically changed the magnitude of this exchange, and thus factors such as the timeliness of information and information dissemination patterns have become more important than ever. Since information and knowledge are so vital for all-round human development, libraries and institutions that manage these resources are indeed invaluable. So, Library and Information Centres have a key role in the acquisition, processing, preservation and dissemination of information and knowledge. In the modern context, the library provides services based on different types of documents such as manuscripts, printed, digital, etc. At the same time, the acquisition, access, processing, service, etc. of these resources have become more complicated than ever before. ICT has been instrumental in extending libraries beyond the physical walls of a building and in providing assistance in navigating and analyzing tremendous amounts of knowledge with a variety of digital tools. Thus, modern libraries are increasingly being re-defined as places to get unrestricted access to information in many formats and from many sources. The research was conducted in the university libraries of Kerala State, India. It was identified that even though information resources are flooding in the world over and several technologies have emerged to manage the situation for providing effective services to their clientele, most of the university libraries in Kerala were unable to exploit these technologies to the maximum level. Though the libraries have automated many of their functions, a wide gap prevails between the possible services and the provided services.
There are many good examples the world over of the application of ICTs in libraries for the maximization of services, and many such libraries have adopted the principles of re-engineering and re-defining as a management strategy. Hence this study was targeted at examining how effectively modern ICTs have been adopted in our libraries for maximizing the efficiency of operations and services, and whether the principles of re-engineering and re-defining can be applied towards this. Data was collected from library users, viz. student as well as faculty users, library professionals and university librarians, using structured questionnaires. This has been supplemented by observation of the working of the libraries, discussions and interviews with the different types of users and staff, review of literature, etc. Personal observation of the organizational set-up, management practices, functions, facilities, resources, utilization of information resources and facilities by the users, etc. of the university libraries in Kerala has been made. Statistical techniques like percentage, mean, weighted mean, standard deviation, correlation, trend analysis, etc. have been used to analyse the data. All the libraries could exploit only a very few of the possibilities of modern ICTs, and hence they could not achieve effective Universal Bibliographic Control or the desired efficiency and effectiveness in services. Because of this, the users as well as the professionals are dissatisfied.
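The descriptive statistics listed here (percentage, mean, weighted mean, standard deviation) are straightforward to compute. A minimal sketch follows, using entirely hypothetical Likert-scale survey responses, since the study's actual data are not reproduced here.

```python
from statistics import mean, stdev

# Hypothetical 1-5 Likert responses on satisfaction with ICT-based services,
# and assumed respondent-group weights (e.g. to balance user categories).
responses = [4, 3, 5, 2, 4, 3, 4, 5, 3, 2]
weights   = [1, 2, 1, 1, 3, 1, 2, 1, 1, 1]

# Percentage of respondents rating 4 or above ("satisfied").
pct_satisfied = 100 * sum(r >= 4 for r in responses) / len(responses)

# Weighted mean: each response scaled by its group weight.
weighted_mean = sum(r * w for r, w in zip(responses, weights)) / sum(weights)

print(f"mean={mean(responses):.2f}, sd={stdev(responses):.2f}, "
      f"weighted mean={weighted_mean:.2f}, % satisfied={pct_satisfied:.0f}%")
```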
Functional effectiveness in the acquisition, access and processing of information resources in various formats, development and maintenance of OPAC and WebOPAC, digital document delivery to remote users, Web-based clearing of library counter services and resources, development of full-text databases, digital libraries and institutional repositories, consortia-based operations for e-journals and databases, user education and information literacy, professional development with stress on ICTs, network administration and website maintenance, marketing of information, etc. are major areas that need special attention to improve the situation. Finance, the level of ICT knowledge among library staff, professional dynamism and leadership, the vision and support of administrators and policy makers, the prevailing educational set-up and social environment in the state, etc. are some of the major hurdles to reaping the maximum possibilities of ICTs in the university libraries of Kerala. The principles of Business Process Re-engineering are found suitable for effective application to re-structure and re-define the operations and service systems of the libraries. Most of the conventional departments or divisions prevailing in the university libraries were functioning as watertight compartments, and their existing management system was too rigid to adopt the principles of change management. Hence, a thorough re-structuring of the divisions was indicated. Consortia-based activities and the pooling and sharing of information resources were advocated to meet the varied needs of the users in the main campuses and off-campuses of the universities, affiliated colleges and remote stations. A uniform staff policy similar to that prevailing in CSIR, DRDO, ISRO, etc. has been proposed by the study, not only for the university libraries in Kerala but for the entire country. Restructuring of LIS education, and the integrated and planned development of school, college, research and public library systems, etc., were also justified for reaping the maximum benefits of modern ICTs.

Relevância:

40.00%

Publicador:

Resumo:

With the increase in population and urbanization, housing and the construction of various facilities have become a problem. Having exhausted all the trouble-free land, man is now on the lookout for techniques to improve areas which were originally considered uninhabitable. Thus this study is based on the nature and engineering behavior of the soft clays covering long stretches of the coastline, and on methods to improve their geotechnical properties. The main aim of the present investigation is to study in detail the physical and engineering behavior of the marine clays of Cochin. While it is well known that these marine clays have been posing numerous problems to foundation engineers all along, the relevant literature reveals that no systematic and comprehensive study has been attempted to date. The knowledge gained through the study is used to improve these properties with appropriate additives.

Relevância:

40.00%

Publicador:

Resumo:

Among many other knowledge representation formalisms, Ontologies and Formal Concept Analysis (FCA) aim at modeling 'concepts'. We discuss how these two formalisms may complement one another from an application point of view. In particular, we will see how FCA can be used to support Ontology Engineering, and how ontologies can be exploited in FCA applications. The interplay of FCA and ontologies is studied along the life cycle of an ontology: (i) FCA can support the building of the ontology as a learning technique. (ii) The established ontology can be analyzed and navigated by using techniques of FCA. (iii) Last but not least, the ontology may be used to improve an FCA application.
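FCA's core operation, extracting all formal concepts (closed extent/intent pairs) from an object-attribute context, can be sketched in a few lines. The toy context below is invented for illustration; the brute-force enumeration shown is only practical for small contexts, whereas real FCA tools use algorithms such as NextClosure.

```python
from itertools import combinations

# Toy formal context: objects mapped to the attributes they possess.
context = {
    "duck":  {"flies", "swims"},
    "eagle": {"flies", "hunts"},
    "shark": {"swims", "hunts"},
}
attributes = set().union(*context.values())

def extent(intent_set):
    # Objects having every attribute in the intent.
    return {o for o, attrs in context.items() if intent_set <= attrs}

def intent(objs):
    # Attributes shared by every object in the extent.
    return set.intersection(*(context[o] for o in objs)) if objs else set(attributes)

# Enumerate all concepts: apply the closure extent -> intent to every
# candidate attribute set and keep the distinct closed pairs.
concepts = set()
for r in range(len(attributes) + 1):
    for combo in combinations(sorted(attributes), r):
        e = extent(set(combo))
        concepts.add((frozenset(e), frozenset(intent(e))))

for e, i in sorted(concepts, key=lambda c: -len(c[0])):
    print(sorted(e), "<->", sorted(i))
```

Ordering these concepts by extent inclusion yields the concept lattice, which is the structure an ontology engineer would navigate or learn from.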

Relevância:

40.00%

Publicador:

Resumo:

Synechocystis PCC 6803 is a photosynthetic bacterium that has the potential to make bioproducts from carbon dioxide and light. Biochemical production from photosynthetic organisms is attractive because it replaces the typical bioprocessing steps of crop growth, milling, and fermentation, with a one-step photosynthetic process. However, low yields and slow growth rates limit the economic potential of such endeavors. Rational metabolic engineering methods are hindered by limited cellular knowledge and inadequate models of Synechocystis. Instead, inverse metabolic engineering, a scheme based on combinatorial gene searches which does not require detailed cellular models, but can exploit sequence data and existing molecular biological techniques, was used to find genes that (1) improve the production of the biopolymer poly-3-hydroxybutyrate (PHB) and (2) increase the growth rate. A fluorescence activated cell sorting assay was developed to screen for high PHB producing clones. Separately, serial sub-culturing was used to select clones that improve growth rate. Novel gene knock-outs were identified that increase PHB production and others that increase the specific growth rate. These improvements make this system more attractive for industrial use and demonstrate the power of inverse metabolic engineering to identify novel phenotype-associated genes in poorly understood systems.

Relevância:

40.00%

Publicador:

Resumo:

This text will help health institutions organize and prepare the information needed to undertake the long and tortuous path of determining the cost/benefit ratio and of accreditation. It may also be very useful for undergraduate and graduate students of biomedical engineering programs who wish to specialize in the management of biomedical equipment technologies and clinical engineering. It can also be used as a reference guide by people directly involved in the health sector in maintenance, clinical engineering, or hospital services departments.

Relevância:

40.00%

Publicador:

Resumo:

Context: Learning can be regarded as knowledge construction, in which prior knowledge and experience serve as a basis for learners to expand their knowledge base. Such a process of knowledge construction has to take place continuously in order to enhance the learners' competence in a competitive working environment. As information consumers, individual users demand personalised information provision which meets their own specific purposes, goals, and expectations. Objectives: The current methods in requirements engineering are capable of modelling the common user's behaviour in the domain of knowledge construction. The users' requirements can be represented as a case in the defined structure, which can be reasoned over to enable requirements analysis. Such analysis needs to be enhanced so that personalised information provision can be tackled and modelled. However, there is a lack of suitable modelling methods to achieve this end. This paper presents a new ontological method for capturing an individual user's requirements and transforming the requirements into personalised information provision specifications. Hence the right information can be provided to the right user for the right purpose. Method: An experiment was conducted based on the qualitative method. A medium-sized group of users participated to validate the method and its techniques, i.e. articulates, maps, configures, and learning content. The results were used as feedback for improvement. Result: The research work has produced an ontology model with a set of techniques which support the functions for profiling users' requirements, reasoning over requirements patterns, generating workflow from norms, and formulating information provision specifications. Conclusion: The current requirements engineering approaches provide the methodical capability for developing solutions. Our research outcome, i.e. the ontology model with the techniques, can further enhance the RE approaches for modelling the individual user's needs and discovering the user's requirements.
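The pipeline of profiling a user's requirements and formulating a provision specification can be hinted at with a small rule-based sketch. Everything below, the profile fields, the rules, and the specification keys, is hypothetical and stands in for the paper's ontology model and norms, which are not detailed here.

```python
# Hypothetical user requirements profile (not the paper's actual structure).
user_profile = {
    "role": "learner",
    "goal": "expand-knowledge-base",
    "prior_knowledge": ["python-basics"],
}

# Toy rule base standing in for the ontology's requirements patterns and
# norms: each rule maps a matched profile condition to provision choices.
rules = [
    (lambda p: p["goal"] == "expand-knowledge-base",
     {"content": "intermediate-material", "format": "tutorial"}),
    (lambda p: "python-basics" in p["prior_knowledge"],
     {"exclude": "python-basics"}),
]

# Formulate the personalised information provision specification by
# accumulating the provisions of every rule the profile satisfies.
spec = {}
for matches, provision in rules:
    if matches(user_profile):
        spec.update(provision)
print(spec)
```

In the actual method, the matching and the derivation of the workflow would be driven by the ontology and norms rather than hand-written lambdas.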

Relevância:

40.00%

Publicador:

Resumo:

In the past decade, the analysis of data has faced the challenge of dealing with very large and complex datasets and the real-time generation of data. Technologies to store and access these complex and large datasets are in place. However, robust and scalable analysis technologies are needed to extract meaningful information from these datasets. The research field of Information Visualization and Visual Data Analytics addresses this need. Information visualization and data mining are often used complementarily to each other. Their common goal is the extraction of meaningful information from complex and possibly large data. However, whereas data mining focuses on the use of silicon hardware, visualization techniques also aim to harness the powerful image-processing capabilities of the human brain. This article highlights the research on data visualization and visual analytics techniques. Furthermore, we highlight existing visual analytics techniques, systems, and applications, including a perspective on the field from the chemical process industry.

Relevância:

40.00%

Publicador:

Resumo:

This paper identifies characteristics of knowledge intensive processes and a method to improve their performance, based on an analysis of investment banking front office processes. The inability to improve these processes using standard process improvement techniques confirmed that much of the process was not codified and depended on tacit knowledge and skills. This led to the use of a semi-structured analysis of the characteristics of the processes via a questionnaire to identify knowledge intensive process characteristics, which adds to existing theory. Further work identified innovative process analysis and change techniques that could generate improvements, based on an analysis of their properties and the issue drivers. An improvement methodology was developed to harness a number of techniques that were found to be effective in resolving the issue drivers and improving these knowledge intensive processes.