908 results for Capability Maturity Model for Software
Abstract:
The aim of this research is to assess the acquisition and absorption technology capabilities of the public sector in developing countries, with specific focus on the State of Kuwait. The assessment of these two capabilities was conducted using a model originally designed for the private sector. In addition, the research aims to propose a framework to enhance the technological capability of developing countries, as well as the performance of the public sector. To achieve these aims, an investigation of the technology transfer process in three public ministries in Kuwait was conducted. The prime interest of this investigation was to evaluate the role of the transferred technology in enhancing the indigenous technological capability of the public sector. The research is based on a case study approach, comprising a main case study (Ministry of Electricity and Water) and three minor case studies. Based on the outcomes of an extensive literature review and the preliminary sectoral visits, the research question and four hypotheses were formulated. These hypotheses were then tested using an interview-based survey and documentation. The findings of the research revealed the weakness of the acquisition and absorption technological capabilities of the public sector. Consequently, the public sector relies extensively on foreign contractors and expatriates to compensate for this weakness. It was also found that the Kuwaiti Government has not taken the necessary measures to develop its technological capability. This research proposes a framework which could lead, if properly managed, to the enhancement of indigenous capability. It also proposes how to improve performance and productivity in the public sector. Finally, the findings suggest that the assessment model, with minor adjustment, is applicable to the public sector.
Abstract:
The technique of growing human leukaemic cells in diffusion chambers was developed to enable chemicals to be assessed for their ability to induce terminal differentiation. HL-60 promyelocytic leukaemia cell growth, in a lucite chamber with a Millipore filter, was optimised by use of a lateral incision site. Chambers were constructed using 0.45 µm filters and contained 150 µl of serum-free HL-60 cells at a density of 1×10⁶ cells/ml. The chambers were implanted into CBA/Ca mice, and spontaneous terminal differentiation of the cells to granulocytes was prevented by the use of serum-free medium. Under these conditions there was an initial growth lag of 72 hours and a logarithmic phase of growth for 96 hours; the cell number reached a plateau after 168 hours of culture in vivo. The amount of drug in the plasma of the animal, and in chambers that had been implanted for 5 days, was determined after a single i.p. injection of equitoxic doses of N-methylformamide, N-ethylformamide, tetramethylurea, N-dibutylformamide, N-tetramethylbutylformamide and hexamethylenebisacetamide. Concentrations of both TMU and HMBA were obtained in the plasma and in the chamber which were pharmacologically effective for the induction of differentiation of HL-60 cells in vitro, that is 12 mM TMU and 5 mM HMBA. A 4-day regime of treatment of animals implanted with chambers demonstrated that TMU and HMBA induced terminal differentiation of 50% and 35%, respectively, of the implanted HL-60 cells to granulocyte-like cells, as assessed by measurement of functional and biochemical markers of maturity. None of the other agents attained concentrations in the plasma that were pharmacologically effective for the induction of differentiation of the cells in vitro, and none was able to induce terminal differentiation of the cells in vivo.
Abstract:
Product design decisions can have a significant impact on the financial and operational performance of manufacturing companies. Therefore, good analysis of the financial impact of design decisions is required if the profitability of the business is to be maximised. The product design process can be viewed as a chain of decisions which links decisions about the concept to decisions about the detail. The idea of decision chains can be extended to include the design and operation of the 'downstream' business processes which manufacture and support the product. These chains of decisions are not independent but are interrelated in a complex manner. Dealing with the interdependencies requires a modelling approach which represents all the chains of decisions, to a level of detail not normally considered in the analysis of product design. The operational, control and financial elements of a manufacturing business constitute a dynamic system. These elements interact with each other and with external elements (i.e. customers and suppliers). Analysing the chain of decisions for such an environment requires the application of simulation techniques, not just to any one area of interest, but to the whole business, i.e. an enterprise simulation. To investigate the capability and viability of enterprise simulation, an experimental 'Whole Business Simulation' system has been developed. This system combines specialist simulation elements and standard operational applications software packages to create a model that incorporates all the key elements of a manufacturing business, including its customers and suppliers. By means of a series of experiments, the performance of this system was compared with a range of existing analysis tools (i.e. DFX, capacity calculation, a shop floor simulator, and a business planner driven by a shop floor simulator).
Abstract:
Many planning and control tools, especially network analysis, have been developed in the last four decades. The majority of them were created in military organizations to solve the problem of planning and controlling research and development projects. The original version of the network model (i.e. CPM/PERT) was transplanted to the construction industry without consideration of the special nature and environment of construction projects. It suited the purpose of setting up targets and defining objectives, but it failed to satisfy the requirements of detailed planning and control at the site level. Several analytical and heuristic rule-based methods were designed and combined with the structure of CPM to eliminate its deficiencies. None of them provides a complete solution to the problem of resource, time and cost control. VERT was designed to deal with new ventures. It is suitable for project evaluation at the development stage. CYCLONE, on the other hand, is concerned with the design and micro-analysis of the production process. This work introduces an extensive critical review of the available planning techniques and addresses the problem of planning for site operation and control. Based on an outline of the nature of site control, this research developed a simulation-based network model which combines part of the logic of both VERT and CYCLONE. Several new nodes were designed to model the availability and flow of resources and the overhead and operating costs, together with special nodes for evaluating time and cost. A large software package was written to handle the input, the simulation process and the output of the model. This package is designed to run on any microcomputer using the MS-DOS operating system. Data from real-life projects were used to demonstrate the capability of the technique.
Finally, a set of conclusions is drawn regarding the features and limitations of the proposed model, and recommendations for future work are outlined at the end of this thesis.
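The CPM network structure that the proposed model extends can be sketched with the standard forward/backward pass (a minimal Python illustration; the activity names and durations are invented, and the thesis's actual package adds simulation, resource and cost nodes on top of this basic computation):

```python
# Minimal critical-path (CPM) sketch over an activity-on-node network.
def cpm(activities):
    """activities: {name: (duration, [predecessors])} -> (makespan, critical set)."""
    # Topological order via repeated scan (fine for small networks).
    order, placed = [], set()
    while len(order) < len(activities):
        for a, (_, preds) in activities.items():
            if a not in placed and all(p in placed for p in preds):
                order.append(a)
                placed.add(a)
    es, ef = {}, {}
    for a in order:                       # forward pass: earliest start/finish
        d, preds = activities[a]
        es[a] = max((ef[p] for p in preds), default=0)
        ef[a] = es[a] + d
    makespan = max(ef.values())
    lf, ls = {}, {}
    for a in reversed(order):             # backward pass: latest start/finish
        succs = [s for s, (_, ps) in activities.items() if a in ps]
        lf[a] = min((ls[s] for s in succs), default=makespan)
        ls[a] = lf[a] - activities[a][0]
    critical = {a for a in order if es[a] == ls[a]}  # zero total float
    return makespan, critical

acts = {"A": (3, []), "B": (2, ["A"]), "C": (4, ["A"]), "D": (1, ["B", "C"])}
makespan, critical = cpm(acts)
print(makespan, sorted(critical))  # → 8 ['A', 'C', 'D']
```

The simulation-based model described above replaces the fixed durations in such a network with sampled ones and adds resource and cost nodes, so each run of the simulation yields one realisation of the schedule.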
Abstract:
The work described was carried out as part of a collaborative Alvey software engineering project (project number SE057). The project collaborators were the Inter-Disciplinary Higher Degrees Scheme of the University of Aston in Birmingham, BIS Applied Systems Ltd. (BIS) and the British Steel Corporation. The aim of the project was to investigate the potential application of knowledge-based systems (KBSs) to the design of commercial data processing (DP) systems. The work was primarily concerned with BIS's Structured Systems Design (SSD) methodology for DP systems development and how users of this methodology could be supported using KBS tools. The problems encountered by users of SSD are discussed and potential forms of computer-based support for inexpert designers are identified. The architecture for a support environment for SSD is proposed based on the integration of KBS and non-KBS tools for individual design tasks within SSD - the Intellipse system. The Intellipse system has two modes of operation - Advisor and Designer. The design, implementation and user-evaluation of Advisor are discussed. The results of a Designer feasibility study, the aim of which was to analyse major design tasks in SSD to assess their suitability for KBS support, are reported. The potential role of KBS tools in the domain of database design is discussed. The project involved extensive knowledge engineering sessions with expert DP systems designers. Some practical lessons in relation to KBS development are derived from this experience. The nature of the expertise possessed by expert designers is discussed. The need for operational KBSs to be built to the same standards as other commercial and industrial software is identified. A comparison between current KBS and conventional DP systems development is made. On the basis of this analysis, a structured development method for KBSs is proposed - the POLITE model. Some initial results of applying this method to KBS development are discussed.
Several areas for further research and development are identified.
Abstract:
Component-based development (CBD) has become an important emerging topic in the software engineering field. It promises long-sought-after benefits such as increased software reuse, reduced development time to market and, hence, reduced software production cost. Despite this huge potential, the lack of reasoning support and of a development environment for component modeling and verification may hinder its adoption. Methods and tools that can support component model analysis are highly appreciated by industry. Such tool support should be fully automated as well as efficient. At the same time, the reasoning tool should scale well, as it may need to handle the hundreds or even thousands of components that a modern software system may have. Furthermore, a distributed environment that can effectively manage and compose components is also desirable. In this paper, we present an approach to the modeling and verification of a newly proposed component model using Semantic Web languages and their reasoning tools. We use the Web Ontology Language and the Semantic Web Rule Language to precisely capture the inter-relationships and constraints among the entities in a component model. Semantic Web reasoning tools are deployed to perform automated analysis of the component models. Moreover, we also propose a service-oriented architecture (SOA)-based Semantic Web environment for CBD. The adoption of Semantic Web services and SOA makes our component environment more reusable, scalable, dynamic and adaptive.
Abstract:
The 21-day experimental gingivitis model, an established noninvasive model of inflammation in response to increasing bacterial accumulation in humans, is designed to enable the study of both the induction and resolution of inflammation. Here, we have analyzed gingival crevicular fluid, an oral fluid comprising a serum transudate and tissue exudates, by LC-MS/MS using Fourier transform ion cyclotron resonance mass spectrometry and iTRAQ isobaric mass tags, to establish meta-proteomic profiles of inflammation-induced changes in proteins in healthy young volunteers. Across the course of experimentally induced gingivitis, we identified 16 bacterial and 186 human proteins. Although the abundances of the bacterial proteins identified did not vary temporally, Fusobacterium outer membrane proteins were detected. Fusobacterium species have previously been associated with periodontal health or disease. The human proteins identified spanned a wide range of compartments (both extracellular and intracellular) and functions, including serum proteins, proteins displaying antibacterial properties, and proteins with functions associated with cellular transcription, DNA binding, the cytoskeleton, cell adhesion, and cilia. PolySNAP3 clustering software was used in a multilayered analytical approach. Clusters of proteins that were associated with changes in the clinical parameters included neuronal and synapse-associated proteins.
Abstract:
A nature-inspired decentralised multi-agent algorithm is proposed to solve a problem of distributed task allocation in which cities produce and store batches of different mail types. Agents must collect and process the mail batches without global knowledge of their environment or communication between agents. The problem is constrained so that agents are penalised for switching mail types. When an agent processes a mail batch of a different type from the previous one, it must undergo a change-over, with repeated change-overs rendering the agent inactive. The efficiency (average amount of mail retrieved) and the flexibility (ability of the agents to react to changes in the environment) are investigated in both static and dynamic environments and with respect to sudden changes. New rules for mail selection and specialisation are proposed and are shown to exhibit improved efficiency and flexibility compared to existing ones. We employ an evolutionary algorithm which allows the various rules to evolve and compete. Apart from obtaining optimised parameters for the various rules for any environment, we also observe extinction and speciation.
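The change-over penalty at the heart of this model can be made concrete with a toy calculation (the tick costs and mail sequences below are invented for illustration; they are not the paper's actual mail-selection rules):

```python
# Toy sketch of the change-over penalty: processing a batch of a different
# mail type than the previous one costs extra ticks, so specialised agents
# retrieve more mail per tick than agents that switch types frequently.
def mail_processed(sequence, changeover_cost=1):
    """Total ticks needed to process a sequence of mail-type labels."""
    ticks, last = 0, None
    for t in sequence:
        if last is not None and t != last:
            ticks += changeover_cost  # change-over before processing
        ticks += 1                    # one tick to process the batch itself
        last = t
    return ticks

# A specialised agent vs. an agent that alternates between two types:
print(mail_processed("AAAABBBB"))  # → 9 ticks (one change-over)
print(mail_processed("ABABABAB"))  # → 15 ticks (seven change-overs)
```

Both agents process eight batches, but the alternating one spends two-thirds more time doing so; this is the efficiency pressure that the selection and specialisation rules exploit.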
Abstract:
Objective: Biomedical event extraction concerns the extraction, from the literature, of events describing changes in the state of bio-molecules. Compared to the protein-protein interaction (PPI) extraction task, which often only involves the extraction of binary relations between two proteins, biomedical event extraction is much harder, since it needs to deal with complex events consisting of embedded or hierarchical relations among proteins, events, and their textual triggers. In this paper, we propose an information extraction system based on the hidden vector state (HVS) model, called HVS-BioEvent, for biomedical event extraction, and investigate its capability in extracting complex events. Methods and material: HVS has previously been employed for extracting PPIs. In HVS-BioEvent, we propose an automated way to generate abstract annotations for HVS training and further propose novel machine learning approaches for event trigger word identification and for biomedical event extraction from the HVS parse results. Results: Our proposed system achieves an F-score of 49.57% on the corpus used in the BioNLP'09 shared task, which is only 2.38% lower than the best performing system, by UTurku, in the BioNLP'09 shared task. Nevertheless, HVS-BioEvent outperforms UTurku's system on complex event extraction, with 36.57% vs. 30.52% achieved for extracting regulation events and 40.61% vs. 38.99% for negative regulation events. Conclusions: The results suggest that the HVS model, with its hierarchical hidden state structure, is indeed more suitable for complex event extraction, since it can naturally model embedded structural context in sentences.
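The quoted F-scores combine precision and recall in the usual way; as a quick sketch (the true/false positive and false negative counts below are invented to illustrate the formula, not taken from the BioNLP'09 evaluation):

```python
# F1 score: the harmonic mean of precision and recall, as used in
# information-extraction evaluations. Counts here are illustrative only.
def f1(tp, fp, fn):
    precision = tp / (tp + fp)  # fraction of extracted events that are correct
    recall = tp / (tp + fn)     # fraction of gold events that were extracted
    return 2 * precision * recall / (precision + recall)

print(round(100 * f1(tp=172, fp=180, fn=170), 2))  # → 49.57
```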
Abstract:
A quasi-biotic model of knowledge evolution has been applied to manufacturing technology capability development, which includes product design and development and manufacturing process/workflow improvement. The concepts of “knowledge genes” and “knowledge body” are introduced to explain the evolution of technological capability. It is shown that knowledge development within the enterprise happens as a result of interactions between an enterprise’s internal knowledge and that acquired from external sources, catalysed by: (a) internal mechanisms, resources and incentives, and (b) actions and policies of external agencies. A matrix specifying factors contributing to knowledge development and types of manufacturing capabilities (product design, equipment development or use, and workflow) is developed to explain technological knowledge development. The case studies of Tianjin Pipe Corporation (TPCO) and Tianjin Tianduan Press Co. are presented to illustrate the application of the matrix.
Abstract:
One of the issues in the innovation system literature is the examination of the technological learning strategies of laggard nations. Two distinct bodies of literature have contributed to our insight into the forces driving learning and innovation: National Systems of Innovation (NSI) and the technological learning literature. Although both literatures yield insights into the catch-up strategies of 'latecomer' nations, the explanatory power of each literature by itself is limited. In this paper, a possible way of linking the macro- and the micro-level approaches by incorporating enterprises as active learning entities into the learning and innovation system is proposed. The proposed model has been used to develop research hypotheses and indicate research directions, and is relevant for investigating the learning strategies of firms in less technologically intensive industries outside East Asia.
Abstract:
This paper proposes a conceptual model for a firm's capability to calibrate supply chain knowledge (CCK). Knowledge calibration is achieved when there is a match between managers' ex ante confidence in the accuracy of held knowledge and the ex post accuracy of that knowledge. Knowledge calibration is closely related to knowledge utility, or willingness to use the available ex ante knowledge: a manager uses the ex ante knowledge if he/she is confident in the accuracy of that knowledge, and does not use it, or uses it with reservation, when confidence is low. Thus, knowledge calibration attained through the firm's CCK enables managers to deal with incomplete and uncertain information and enhances the quality of decisions. In the supply chain context, although demand- and supply-related knowledge is available, supply chain inefficiencies, such as the bullwhip effect, remain. These issues may be caused not by a lack of knowledge but by a firm's lack of capability to sense potential disagreement between knowledge accuracy and confidence. Therefore, this paper contributes to the understanding of supply chain knowledge utilization by defining CCK and identifying a set of antecedents and consequences of CCK in the supply chain context.
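The notion of calibration described above, confidence matching accuracy, can be illustrated with a small numeric sketch (the scoring rule and the forecast numbers are illustrative assumptions, not the paper's construct measures):

```python
# Illustrative calibration check: compare a manager's stated confidence in
# each piece of knowledge (0..1) with whether it proved accurate (0 or 1).
# A simple mean-absolute-gap score is used here as a proxy for calibration.
def calibration_gap(confidence, correct):
    """0.0 = perfectly calibrated; larger values = over- or under-confidence."""
    pairs = list(zip(confidence, correct))
    return sum(abs(c - o) for c, o in pairs) / len(pairs)

# Hypothetical demand forecasts: confidence vs. whether each proved accurate.
conf = [0.9, 0.8, 0.9, 0.6]
hit  = [1,   0,   1,   1]
print(round(calibration_gap(conf, hit), 3))  # → 0.35
```

The second forecast drives most of the gap: high confidence (0.8) paired with an inaccurate outcome is exactly the accuracy-confidence disagreement that CCK is meant to detect.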
Abstract:
Engineering adaptive software is an increasingly complex task. Here, we demonstrate Genie, a tool that supports the modelling, generation, and operation of highly reconfigurable, component-based systems. We showcase how Genie is used in two case studies: i) the development and operation of an adaptive flood warning system, and ii) a service discovery application. In this context, adaptation is enabled by the Gridkit reflective middleware platform.
Abstract:
This paper develops and tests a learning organization model derived from the HRM and dynamic capability literatures in order to ascertain the model's applicability across divergent global contexts. We define a learning organization as one capable of achieving ongoing strategic renewal, arguing on the basis of dynamic capability theory that the model has three necessary antecedents: HRM focus, developmental orientation and customer-facing remit. Drawing on a sample comprising nearly 6000 organizations across 15 countries, we show that learning organizations exhibit higher performance than their less learning-inclined counterparts. We also demonstrate that innovation fully mediates the relationship between our conceptualization of the learning organization and organizational performance in 11 of the 15 countries we examined. To our knowledge, this is the first time these questions have been tested in a major cross-global study, and our work contributes to both the HRM and dynamic capability literatures, especially where the focus is the applicability of best practice parameters across national boundaries.
Abstract:
We have attempted to bring together two areas which are challenging for both IS research and practice: forms of coordination and management of knowledge in the context of global, virtual software development projects. We developed a more comprehensive, knowledge-based model of how coordination can be achieved, and illustrated the heuristic and explanatory power of the model when applied to global software projects experiencing different degrees of success. We first reviewed the literature on coordination and determined what is known about coordination of knowledge in global software projects. From this we developed a new, distinctive knowledge-based model of coordination, which was then employed to analyze two case studies of global software projects, at SAP and Baan, to illustrate the utility of the model.