959 results for Process capability index
Abstract:
This master's thesis defines a technology monitoring process with which a high-technology company can steer its operations. Monitoring technological development is essential for high-technology companies; such companies need a well-defined system with which they can monitor and forecast technological development. The thesis argues that technology monitoring and competitive intelligence are subfields of business intelligence that complement and support each other. An important observation is that the business intelligence process is, above all, an organizational learning process; it follows that any BI process should be based on the processes through which organizations learn. The thesis also describes how business intelligence, knowledge management, and organizational learning relate to one another. Technology monitoring is a vital function for a high-technology company; it is needed in many areas of strategic management, at least in technology, marketing, and human resource management. Technology monitoring is also found to be a highly important core competence for a high-technology company, one that cannot be fully outsourced. The thesis presents a technology monitoring process based on a generic business intelligence process and on a competitive intelligence process derived from it, and proposes how technology monitoring could be organized in a high-technology company. The proposed solution is based on the concept of a community of practice: a voluntary team whose members share an interest in a topic and a desire to learn. A clear need for unified and coordinated technology monitoring has been identified in the case company. The thesis presents a preliminary technology monitoring process for the case company and identifies the customers and performers of that process.
Abstract:
The importance of software to modern society is growing continuously. Many software projects struggle with staying on schedule, maintaining high productivity, and achieving sufficiently high quality. To minimize these problems, large investments have been made in software process improvement, motivated by the assumption that the capability of the software development process directly determines product quality. The purpose of this study was to examine the possibilities of software process improvement. Existing models, techniques, and methodologies for software development and software process improvement were presented, their suitability was analyzed, and a recommendation on the use of the models was given.
Abstract:
Background: Ethical conflicts are arising as a result of the growing complexity of clinical care, coupled with technological advances. Most studies that have developed instruments for measuring ethical conflict base their measures on the variables "frequency" and "degree of conflict". In our view, however, these variables are insufficient for explaining the root of ethical conflicts. Consequently, the present study formulates a conceptual model that also includes the variable "exposure to conflict", as well as considering six "types of ethical conflict". An instrument was then designed to measure the ethical conflicts experienced by nurses who work with critical care patients. The paper describes the development process and validation of this instrument, the Ethical Conflict in Nursing Questionnaire Critical Care Version (ECNQ-CCV). Methods: The sample comprised 205 nursing professionals from the critical care units of two hospitals in Barcelona (Spain). The ECNQ-CCV presents 19 nursing scenarios with the potential to produce ethical conflict in the critical care setting. Exposure to ethical conflict was assessed by means of the Index of Exposure to Ethical Conflict (IEEC), a specific index developed to provide a reference value for each respondent by combining the intensity and frequency of occurrence of each scenario featured in the ECNQ-CCV. Following content validation, construct validity was assessed by means of Exploratory Factor Analysis (EFA), while Cronbach's alpha was used to evaluate the instrument's reliability. All analyses were performed using the statistical software PASW v19. Results: Cronbach's alpha for the ECNQ-CCV as a whole was 0.882, which is higher than the values reported for certain other related instruments. The EFA suggested a unidimensional structure, with one component accounting for 33.41% of the explained variance. Conclusions: The ECNQ-CCV is shown to be a valid and reliable instrument for use in critical care units. Its structure is such that the four variables on which our model of ethical conflict is based may be studied separately or in combination. The critical care nurses in this sample present moderate levels of exposure to ethical conflict. This study represents the first evaluation of the ECNQ-CCV.
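As a point of reference for the reliability figure reported above, here is a minimal sketch of how Cronbach's alpha is computed from a respondents-by-items score matrix; the data are illustrative, not the study's.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of item scores."""
    k = scores.shape[1]                          # number of items (scenarios)
    item_vars = scores.var(axis=0, ddof=1)       # per-item sample variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of respondents' totals
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Illustrative data: 5 respondents rating 4 conflict scenarios (not study data).
scores = np.array([[3, 4, 3, 5],
                   [2, 3, 2, 4],
                   [4, 4, 5, 5],
                   [1, 2, 1, 3],
                   [3, 3, 4, 4]])
print(f"alpha = {cronbach_alpha(scores):.3f}")
```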
Abstract:
Validation and verification operations encounter various challenges in the product development process. Requirements for a faster development cycle place new demands on the component development process, and verification and validation are usually the largest activities, consuming up to 40-50% of R&D resources. This research studies validation and verification as part of the case company's component development process. The goal is to define a framework that can be used to improve the evaluation and development of validation and verification capability in display module development projects. The research reviews the definition and background of validation and verification, together with theories of project management, systems, organisational learning, and causality. The framework and key findings of the research are presented, and a feedback system based on the framework is defined and implemented in the case company. The research is divided into a theoretical part, conducted as a literature review, and an empirical part, conducted as a case study using the constructive and design research methods. As a result, a framework for capability evaluation and development was defined and developed. A key finding was that a double-loop learning approach combined with the validation and verification V+ model enables the definition of a feedback reporting solution; in addition, some minor changes to the validation and verification process were proposed. A few concerns are raised about the validity and reliability of the study, the most important concerning the selected research method and model: the final state can be normative, the researcher may set the study results before the actual study, and in the initial state the researcher may describe expectations for the study. Finally, the reliability and validity of the work are discussed.
Abstract:
In the field of observational methodology the observer is obviously a central figure, and close attention should be paid to the process through which he or she acquires, applies, and maintains the required skills. Basic training in how to apply the operational definitions of categories and the rules for coding, coupled with the opportunity to use the observation instrument in real-life situations, can have a positive effect on the degree of agreement achieved when evaluating intra- and inter-observer reliability. Several authors, including Arias, Argudo, and Alonso (2009) and Medina and Delgado (1999), have put forward proposals for the process of basic and applied training in this context. Reid and De Master (1982) focus on the observer's performance and on how to maintain the acquired skills, arguing that periodic checks are needed after initial training because an observer may, over time, become less reliable due to the inherent complexity of category systems. The purpose of this subsequent training is to maintain acceptable levels of observer reliability. Various strategies can be used to this end, including providing feedback about those categories associated with a good reliability index, or offering re-training in how to apply those that yield lower indices. The aim of this study is to develop a performance-based index that is capable of assessing an observer's ability to produce reliable observations in conjunction with other observers.
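The study's own index is not specified in the abstract; as a hedged illustration of the kind of pairwise agreement measure such an index builds on, here is a minimal sketch of Cohen's kappa for two observers coding the same events (the codings are invented, not from the study).

```python
from collections import Counter

def cohen_kappa(coder_a, coder_b):
    """Cohen's kappa for two observers' categorical codings of the same events."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n  # raw agreement
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)  # chance agreement
    return (observed - expected) / (1 - expected)

# Illustrative codings of 10 observed events (categories A-C).
a = list("AABBCCABCA")
b = list("AABBCABBCA")
print(f"kappa = {cohen_kappa(a, b):.2f}")
```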
Abstract:
A company's capability to map out its cost position relative to other market players is important for competitive decision making. One aspect of cost position is direct product cost, which illustrates the cost efficiency of a company's product designs. If a company can evaluate and compare its own and other market players' direct product costs, it can make better decisions in product development and management, manufacturing, sourcing, etc. The main objective of this thesis was to develop a cost evaluation process for competitors' products. This objective includes a process description and an analysis tool for cost evaluations; process implementation is discussed as well. The main result of this thesis was a process description consisting of a sixteen-step process and an Excel-based analysis tool. Since the literature in this field was quite limited, the proposed solution combines several theoretical concepts, drawing on reverse engineering, product cost assessment, benchmarking, and cost-based decision making. The proposal will lead to more systematic and standardized cost position analyses and result in better cost transparency in decision making.
Abstract:
Roasting is one of the most complex coffee processing steps because heat and mass are transferred simultaneously. During this process, beans lose mass through fast physical and chemical changes that set the color and flavor of the commercial coffee beverage. We therefore aimed to assess the kinetics of mass loss in commercially roasted coffee beans as a function of heating throughout the process. We used 350-g samples of processed Arabica coffee beans with a water content of 0.1217 kg kg-1, together with a continuous gas-fired roaster. The roaster had initial temperatures of 285, 325, 345, and 380 °C, decreasing during the process to 255, 285, 305, and 335 °C, respectively. Mass loss was calculated as the difference between grain weight before and after roasting. We observed a linear variation directly dependent on roaster temperature. A constant mass-loss rate was obtained for each process temperature, and the rates were described by the Arrhenius model with r2 above 0.98. For the roaster under non-isothermal conditions, the activation energy required to initiate mass loss in commercial coffee roasting was 52.27 kJ mol-1.
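As a sketch of the Arrhenius treatment mentioned above, an activation energy can be recovered from temperature-dependent rate constants via the linearized form ln k = ln A - Ea/(R T); the rate values below are illustrative, not the study's measurements.

```python
import numpy as np

R = 8.314  # universal gas constant, J mol-1 K-1

# Illustrative (not the study's) mass-loss rate constants k at roaster temperatures.
T = np.array([285.0, 325.0, 345.0, 380.0]) + 273.15   # K
k = np.array([0.012, 0.021, 0.027, 0.041])            # arbitrary units

# Linearized Arrhenius: ln k = ln A - Ea / (R T)  ->  slope = -Ea / R
slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
Ea = -slope * R / 1000.0                              # kJ mol-1
print(f"Ea ~ {Ea:.1f} kJ mol-1, A ~ {np.exp(intercept):.3g}")
```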
Abstract:
Abstract—This paper discusses existing military capability models and proposes a comprehensive capability meta-model (CCMM) which unites the existing capability models into an integrated and hierarchical whole. The Zachman Framework for Enterprise Architecture is used as a structure for the CCMM. The CCMM takes into account the abstraction level, the primary area of application, stakeholders, intrinsic process, and life cycle considerations of each existing capability model, and shows how the models relate to each other. The validity of the CCMM was verified through a survey of subject matter experts. The results suggest that the CCMM is of practical value to various capability stakeholders in many ways, such as helping to improve communication between the different capability communities.
Abstract:
The papermaking industry has been continuously developing intelligent solutions to characterize the raw materials it uses, to control the manufacturing process in a robust way, and to guarantee the desired quality of the end product. With much improved imaging techniques and image-based analysis methods, it has become possible to look inside the manufacturing pipeline and propose more effective alternatives to human expertise. This study focuses on the development of image analysis methods for the pulping process of papermaking. Pulping starts with disintegrating wood and forming the fiber suspension, which is subsequently bleached, mixed with additives and chemicals, and finally dried and shipped to the papermaking mills. At each stage of the process it is important to analyze the properties of the raw material to guarantee product quality. In order to evaluate the properties of fibers, the main component of the pulp suspension, a framework for fiber characterization based on microscopic images is proposed in this thesis as the first contribution. The framework allows computation of fiber length and curl index, correlating well with the ground-truth values. The bubble detection method, the second contribution, was developed to estimate the gas volume at the delignification stage of the pulping process based on high-resolution in-line imaging. The gas volume was estimated accurately and the solution enabled just-in-time process termination, whereas accurate estimation of bubble size categories remained challenging. As the third contribution, optical flow computation was studied and the methods were successfully applied to pulp flow velocity estimation based on double-exposed images. Finally, a framework for classifying dirt particles in dried pulp sheets, including semisynthetic ground-truth generation, feature selection, and a performance comparison of state-of-the-art classification techniques, was proposed as the fourth contribution. The framework was successfully tested on semisynthetic and real-world pulp sheet images. These four contributions assist in developing integrated factory-level vision-based process control.
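For the fiber characterization mentioned above, a commonly used definition of curl index (assumed here; the thesis's exact formula is not given in the abstract) is the fiber's contour length divided by its end-to-end distance, minus one, computed from a traced centerline:

```python
import numpy as np

def curl_index(centerline: np.ndarray) -> float:
    """Curl index of a fiber from its traced centerline (N x 2 pixel coordinates).

    Uses the common definition curl = L / d - 1, where L is the contour
    length and d the end-to-end distance (an assumption, not the thesis's formula).
    """
    segs = np.diff(centerline, axis=0)
    contour = np.hypot(segs[:, 0], segs[:, 1]).sum()       # summed segment lengths
    end_to_end = np.hypot(*(centerline[-1] - centerline[0]))
    return contour / end_to_end - 1.0

# Illustrative gently curved fiber centerline (pixel coordinates).
t = np.linspace(0.0, np.pi / 3.0, 50)
fiber = np.column_stack([100.0 * t, 20.0 * np.sin(t)])
print(f"curl index = {curl_index(fiber):.3f}")
```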
Abstract:
In the last two centuries, papers have been published that include measurements of the germination process. The high diversity of mathematical expressions has made comparisons between papers, and sometimes the interpretation of results, difficult. This paper therefore reviews measurements of the germination process, analyzing the various mathematical expressions found in the specific literature and recovering the history, meaning, and limitations of some germination measurements. Among the measurements covered are germinability, germination time, the coefficient of uniformity of germination (CUG), the coefficient of variation of the germination time (CVt), the germination rate (mean rate, weighted mean rate, coefficient of velocity, germination rate of George, Timson's index, GV or Czabator's index; Throneberry and Smith's method and its adaptations, including Maguire's rate; ERI or emergence rate index, germination index, and its modifications), the uncertainty associated with the distribution of the relative frequency of germination (U), and the synchronization index (Z). The limits of the germination measurements are included to make interpretation and decisions during comparisons easier. Time, rate, homogeneity, and synchrony are aspects that can be measured, describing the dynamics of the germination process. These characteristics are important not only for physiologists and seed technologists, but also for ecologists, because it is possible to predict the degree of success of a species based on the capacity of its harvested seeds to spread germination through time, permitting the recruitment in the environment of some part of the seedlings formed.
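A minimal sketch of a few of the reviewed measurements under their commonly used definitions (the paper's notation may differ); the counts and times are illustrative, not from the paper.

```python
import math

def germination_measures(counts, times, n_seeds):
    """Common germination measurements from daily counts n_i at times t_i."""
    total = sum(counts)
    germinability = 100.0 * total / n_seeds                # % of seeds germinated
    mean_time = sum(n * t for n, t in zip(counts, times)) / total
    mean_rate = 1.0 / mean_time                            # reciprocal of mean time
    freqs = [n / total for n in counts]
    uncertainty = -sum(f * math.log2(f) for f in freqs if f > 0)  # U, in bits
    synchrony = (sum(math.comb(n, 2) for n in counts)
                 / math.comb(total, 2))                    # Z index
    return germinability, mean_time, mean_rate, uncertainty, synchrony

# Illustrative data: 50 seeds, counts of newly germinated seeds on days 2-6.
counts, times = [5, 18, 12, 6, 2], [2, 3, 4, 5, 6]
G, t_mean, v_mean, U, Z = germination_measures(counts, times, n_seeds=50)
print(f"G={G:.0f}%  t={t_mean:.2f} d  v={v_mean:.3f} d-1  U={U:.2f} bits  Z={Z:.3f}")
```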
Abstract:
The effect of two concentrations of caffeine (1500 mg/ml and 2500 mg/ml) on mitotic indices of Drosophila prosaltans was analyzed in larval brain cells. Although the differences detected between treated and control cells were not significant, the percentages obtained suggest a possible effect of caffeine in slowing the process of cell division.
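The comparison rests on the mitotic index, i.e. the fraction of scored cells found in division; a minimal sketch with invented counts (not the study's data):

```python
def mitotic_index(dividing_cells: int, total_cells: int) -> float:
    """Mitotic index as a percentage of scored cells found in mitosis."""
    return 100.0 * dividing_cells / total_cells

# Illustrative counts for a control and a treated slide (not the study's data).
print(f"control: {mitotic_index(42, 1000):.1f}%  treated: {mitotic_index(35, 1000):.1f}%")
```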
Abstract:
The aim of this study was to determine the influence of process parameters and Passion Fruit Fiber (PFF) addition on the Glycemic Index (GI) of an extruded breakfast cereal. A 2³ Central Composite Rotational Design (CCRD) was used, with the following independent variables: raw material moisture content (18-28%), 2nd and 3rd barrel zone temperatures (120-160 °C), and PFF (0-30%). The raw materials (organic corn flour and organic PFF) were characterized for proximate composition, particle size, and in vitro GI, and the extrudates were characterized for in vitro GI. Response Surface Methodology (RSM) and Principal Component Analysis (PCA) were used to analyze the results. Corn flour and PFF presented 8.55 and 7.63% protein, 2.61 and 0.60% fat, 0.52 and 6.17% ash, and 78.77 and 78.86% carbohydrates (3 and 64% total dietary fiber), respectively. The corn flour particle size distribution was homogeneous, while PFF presented a heterogeneous particle size distribution. Corn flour and PFF presented GI values of 48 and 45, respectively. With RSM, no effect of the variables on the GI of the extrudates was observed (average value of 48.41), but PCA showed that the GI tended to be lower when processing at lower (<128 °C) and at higher (>158 °C) temperatures. Compared to white bread, the extrudates showed a reduction of the GI of up to 50% and could be considered an interesting alternative in weight and glycemia control diets.
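In vitro GI values like those above are often derived from starch hydrolysis curves via a hydrolysis index (HI) and the Goñi et al. (1997) regression GI = 39.71 + 0.549 HI; whether this study used that equation is not stated in the abstract, so the sketch below is an assumption, with invented curves.

```python
def auc(y, t):
    """Trapezoidal area under a hydrolysis curve y(t)."""
    return sum((t2 - t1) * (y1 + y2) / 2.0
               for t1, t2, y1, y2 in zip(t, t[1:], y, y[1:]))

def estimated_gi(hi: float) -> float:
    """In vitro GI from HI via Goni et al. (1997): GI = 39.71 + 0.549 * HI (assumed)."""
    return 39.71 + 0.549 * hi

# Illustrative hydrolysis curves (% starch hydrolyzed over 180 min), not study data.
t = [0, 30, 60, 90, 120, 180]
sample = [0, 25, 40, 48, 52, 55]
bread = [0, 45, 65, 72, 75, 78]       # white bread reference, HI = 100 by definition
hi = 100.0 * auc(sample, t) / auc(bread, t)
print(f"HI = {hi:.1f}, estimated GI = {estimated_gi(hi):.1f}")
```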
Abstract:
A blend of 50% Potato Starch (PS), 35% Quality Protein Maize (QPM), and 15% Soybean Meal (SM) was used to prepare expanded pellets in a laboratory extruder with a 1.5 × 20.0 × 100.0 mm die-nozzle. The independent variables analyzed were Barrel Temperature (BT) (75-140 °C) and Feed Moisture (FM) (16-30%). The effect of the extrusion variables was investigated in terms of Expansion Index (EI), apparent density (ApD), Penetration Force (PF), Specific Mechanical Energy (SME), viscosity profiles, DSC, crystallinity by X-ray diffraction, and Scanning Electron Microscopy (SEM). PF decreased from 30 to 4 kgf as both independent variables (BT and FM) increased. SME was affected only by FM and decreased as this variable increased. The optimal region showed that the maximum EI was found for BT in the range 123-140 °C and FM in the range 27-31%. The extruded pellets obtained from the optimal processing region were probably not completely degraded, as shown by the structural characterization. Acceptable expanded pellets could be produced from a blend of PS, QPM, and SM by extrusion cooking.
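A minimal sketch of two of the response variables named above under common definitions (which may differ from the study's exact ones, e.g. sectional versus volumetric expansion); all numbers are invented.

```python
import math

def expansion_index(extrudate_width_mm: float, die_width_mm: float) -> float:
    """Sectional expansion index: extrudate dimension over die dimension."""
    return extrudate_width_mm / die_width_mm

def sme_kj_per_kg(torque_nm: float, screw_rpm: float, feed_kg_h: float) -> float:
    """Specific mechanical energy: net mechanical power over mass throughput."""
    power_w = torque_nm * 2.0 * math.pi * screw_rpm / 60.0  # W = N*m * rad/s
    return power_w / (feed_kg_h / 3600.0) / 1000.0          # kJ/kg

# Illustrative values (not the study's measurements).
print(f"EI  = {expansion_index(28.0, 20.0):.2f}")
print(f"SME = {sme_kj_per_kg(torque_nm=12.0, screw_rpm=200.0, feed_kg_h=5.0):.0f} kJ/kg")
```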