924 results for Distributed process model
Abstract:
The convergence of electronic business and mobility, together with the accelerating pace of technological innovation, has generated interest in wireless business solutions. The aim of this master's thesis was to study the evaluation and development process of wireless e-business applications. The work focuses on wireless tracking of the paper industry supply chain. The study presents a definition of wireless e-business, describes the different application areas of wireless technology, and discusses the strategic and technological dimensions of the application evaluation and development process. The work establishes a framework for examining the significance of wireless technologies in logistics. The most significant result of the study is a process model for evaluating and developing applications. A wireless application developed with the model proved useful in supply chain management.
Abstract:
The aim of this study was to create an outsourcing process for pharmaceutical product development. The study focuses on two main questions. The first question is "What is the outsourcing process model?" In the second phase, the key success factors of the outsourcing process are identified. As a result of the literature review, a general outsourcing process was created. Transaction cost economics and the resource-based view were used to derive a theoretical framework for the process by combining the existing processes presented in the literature. The process model is intended to apply to outsourcing broadly. The general outsourcing process was then developed further with the key factors that affect the success of pharmaceutical product development and with interviews of pharmaceutical outsourcing experts. The result of the research is a process consisting of seven phases, with key activities and expected outputs for each phase. In addition, a strategic decision-making framework for the outsourcing decision in pharmaceutical product development is given, as well as tools for selecting a supplier and preparing a structured contract. The study also gives recommendations for managing the outsourcing process.
Abstract:
The field of requirements management is highly complex. Its terminology is varied, and the same terms can mean different things to different people. The purpose of this work is to clarify the field of requirements management. It answers questions such as what requirements management is and how it can be carried out. The work concentrates on the areas of requirements analysis and validation, so in this respect it also answers more detailed questions, such as how the traceability, documentation, analysis, and validation of collected requirements can be performed. Through this work, requirements management can be introduced to a company, and its different units can gain a shared understanding of requirements management. The study presents requirements management as a process that comprises requirements traceability, requirements documentation, requirements change management, and requirements specification. Requirements specification can be further divided into requirements elicitation, analysis and negotiation, and validation. The work presents a generic requirements management process model. The model shows that requirements management is a continuous process in which all activities are linked to one another. These activities are performed more or less concurrently. The model is presented in a generic form so that it can be used in system and product development projects as well as in internal development projects. It indicates that requirements should be refined as early as possible so that the number of changes in later stages of development can be minimized. Some changes are unavoidable, so a traceability manual and traceability practices should be developed to support change management. Requirements management is examined in an ongoing development project. The examination investigates which requirements management practices and which analysis and validation methods are used, and what could be done to improve requirements management in the project.
Abstract:
The target of this thesis is to develop a brand positioning process model for the case company's international operations. The model will make the process more effective and decrease the risk that relevant aspects are forgotten. The focus is on international operations, although brand positioning can generally be seen as a standardized subject and, thus, there is no need to distinguish between market areas. A constructive research approach is chosen as the research method. Internal interviews are conducted in order to give much-needed insight into the case company's current processes and circumstances. The model is built based on theory, the interviews, and internal and external material. The most difficult part in building the model is to determine the order of the phases; deciding the number of phases can also be problematic. The model should be brief and assertive in order to reduce the risk of misunderstanding between employees from different units. Based on the analysis of the interviews and the theory, the brand positioning process model is presented with an indication of the order of each phase. The model is divided into three main groups: Analyzing the Environment, Determining the Brand Position, and Documenting the BPS. The benefits of the model are that overlapping work can be reduced, overly similar brands can be noticed, and it is easier to train new employees.
Abstract:
Investigation of a high-pressure pretreatment process for gold leaching is the objective of the present master's thesis. Gold ores and concentrates that cannot be easily treated by a leaching process are called "refractory". These ores or concentrates often have a high content of sulfur and arsenic, which renders the precious metal inaccessible to the leaching agents. Since refractory ores account for a considerable share of the gold manufacturing industry, the pressure oxidation method (autoclave method) is considered one of the possible ways to overcome the related problems. Mathematical modeling is the main approach of this thesis and was used to investigate the high-pressure oxidation process. For this task, the available information from the literature concerning this phenomenon, including chemistry, mass transfer and kinetics, reaction conditions, applied apparatus and application, was collected and studied. The modeling part covers the investigation of pyrite oxidation kinetics in order to create a descriptive mathematical model. The following major steps were completed: creation of a process model using the available knowledge; estimation of unknown parameters and determination of goodness of fit; and study of the reliability of the model and its parameters.
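As a hedged illustration of the parameter-estimation and goodness-of-fit steps described above, the sketch below fits an assumed first-order conversion model X(t) = 1 - exp(-k t) to synthetic batch-oxidation data; the rate law, the value of k, and the data are illustrative assumptions, not the thesis's actual model.

```python
# Hypothetical sketch: fitting a first-order pyrite-oxidation conversion model
# X(t) = 1 - exp(-k * t) to synthetic autoclave data and checking the fit.
# The rate law, parameter value, and data are illustrative assumptions.
import numpy as np
from scipy.optimize import curve_fit

def conversion(t, k):
    """First-order conversion model X(t) = 1 - exp(-k t)."""
    return 1.0 - np.exp(-k * t)

# Synthetic "measurements": true k = 0.05 1/min plus noise
t_data = np.linspace(0, 120, 13)                      # min
rng = np.random.default_rng(0)
x_data = conversion(t_data, 0.05) + rng.normal(0, 0.02, t_data.size)

k_hat, k_cov = curve_fit(conversion, t_data, x_data, p0=[0.01])
residuals = x_data - conversion(t_data, k_hat[0])
ss_res = np.sum(residuals**2)
ss_tot = np.sum((x_data - x_data.mean())**2)
r_squared = 1.0 - ss_res / ss_tot                     # goodness of fit

print(f"estimated k = {k_hat[0]:.4f} 1/min, std = {np.sqrt(k_cov[0, 0]):.4f}")
print(f"R^2 = {r_squared:.3f}")
```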
Abstract:
The development of carbon capture and storage (CCS) has raised interest towards novel fluidised bed (FB) energy applications. In these applications, limestone can be utilized for SO2 and/or CO2 capture. The conditions in the new applications differ from the traditional atmospheric and pressurised circulating fluidised bed (CFB) combustion conditions in which limestone is successfully used for SO2 capture. In this work, a detailed physical single-particle model for limestone, with a description of the mass and energy transfer inside the particle, was developed. The novelty of this model lies in taking into account the simultaneous reactions, changing conditions, and the effect of advection. In particular, the capability to study the cyclic behaviour of limestone on both sides of the calcination-carbonation equilibrium curve is important in the novel conditions. The significance of including advection or assuming diffusion control was studied for calcination. In particular, the effect of advection on the calcination reaction in the novel combustion atmosphere was shown. The model was tested against experimental data: sulphur capture was studied in a laboratory reactor in different fluidised bed conditions. Different conversion levels and sulphation patterns were examined in different atmospheres for one limestone type. The conversion curves were well predicted with the model, and the mechanisms leading to the conversion patterns were explained with the model simulations. In this work, it was also evaluated whether the transient environment has an effect on limestone behaviour compared to averaged conditions, and in which conditions the effect is largest. The difference between the averaged and transient conditions was notable only in conditions close to the calcination-carbonation equilibrium curve. The results of this study suggest that the development of a simplified particle model requires a proper understanding of the physical and chemical processes taking place in the particle during the reactions. The results of the study will be needed when analysing complex limestone reaction phenomena or when developing the description of limestone behaviour in comprehensive 3D process models. In order to transfer the experimental observations to furnace conditions, the relevant mechanisms that take place need to be understood before the important ones can be selected for a 3D process model. This study revealed the sulphur capture behaviour under transient oxy-fuel conditions, which is important when the oxy-fuel CFB process and process model are developed.
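The cyclic calcination-carbonation behaviour mentioned above hinges on which side of the equilibrium curve the local gas conditions lie. The following minimal sketch (not the thesis model) makes that check explicit, using a commonly quoted equilibrium correlation that is assumed here purely for illustration.

```python
# Minimal sketch of the calcination-carbonation equilibrium check that a limestone
# particle model needs: given local temperature and CO2 partial pressure, decide
# whether CaCO3 tends to calcine or CaO tends to carbonate. The correlation
# p_eq = 4.137e7 * exp(-20474 / T) atm is a commonly quoted fit, assumed here for
# illustration; the thesis model may use a different one.
import math

def co2_equilibrium_pressure_atm(temperature_k: float) -> float:
    """Equilibrium CO2 partial pressure over CaCO3/CaO (illustrative correlation)."""
    return 4.137e7 * math.exp(-20474.0 / temperature_k)

def limestone_regime(temperature_k: float, p_co2_atm: float) -> str:
    p_eq = co2_equilibrium_pressure_atm(temperature_k)
    if p_co2_atm < p_eq:
        return "calcination (CaCO3 -> CaO + CO2)"
    return "carbonation (CaO + CO2 -> CaCO3)"

# Oxy-fuel-like conditions: a high CO2 partial pressure shifts the balance
for T, p in [(1123.0, 0.15), (1123.0, 0.80), (1223.0, 0.80)]:
    print(f"T = {T:.0f} K, pCO2 = {p:.2f} atm -> {limestone_regime(T, p)}")
```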
Abstract:
The main objective of the study was to form a strategic process model and a project management tool to help IFRS change implementation projects in the future. These research results were designed based on the theoretical framework of Total Quality Management and on the facts that were collected during the empirical case study of the IAS 17 change. The usage of the process-oriented approach in IFRS standard change implementation after the initial IFRS implementation is rationalized with the following arguments: 1) well-designed process tools lead to optimization of resources; 2) with the help of process stages and related tasks it is easy to ensure an efficient way of working and managing the project, as well as to make sure that all necessary stakeholders are included in the change process. This research follows the qualitative approach and the analysis is descriptive in format. The first part of the study is a literature review and the latter part has been conducted as a case study. The data has been collected in the case company through interviews and observation. The main findings are a process model for the IFRS standard change process and a checklist-formatted management tool for upcoming IFRS standard change projects. The process flow follows the main cornerstones of the IASB's standard-setting process, and the management tool has been divided into stages accordingly.
Abstract:
This thesis studies the possibility of using lean tools and methods in a quotation process carried out in an office environment. The aim of the study was to find out and test the relevant lean tools and methods which can help to balance and standardize the quotation process, and to reduce the variance in quotation lead times and quality. Seminal works, research, and guidebooks related to the topic were used as the basis for the theory development. Based on the literature review and the case company's own lean experience, the applicable lean tools and methods were selected to be tested by a sales support team. Leveling production, by means of product categorization and value stream mapping, was a key method used to balance the quotation process. The 5S method was started concurrently for standardizing the work. Results of the testing period showed that lean tools and methods are applicable in an office process, and the selected tools and methods helped to balance and standardize the quotation process. The case company's sales support team decided to implement the new lean-based quotation process model.
Abstract:
years 8 months) and 24 older (M = 7 years 4 months) children. A Monitoring Process Model (MPM) was developed and tested in order to ascertain at which component process of the MPM age differences would emerge. The MPM had four components: (1) assessment; (2) evaluation; (3) planning; and (4) behavioural control. The MPM was assessed directly using a referential communication task in which the children were asked to make a series of five Lego buildings (a baseline condition and one building for each MPM component). Children listened to instructions from one experimenter while a second experimenter in the room (a confederate) interjected varying levels of verbal feedback in order to assist the children and control the component of the MPM. This design allowed us to determine at which "stage" of processing children would most likely have difficulty monitoring themselves in this social-cognitive task. Developmental differences were observed for the evaluation, planning, and behavioural control components, suggesting that older children were able to be more successful with the more explicit metacomponents. Interestingly, however, there was no age difference in terms of Lego task success in the baseline condition, suggesting that without the intervention of the confederate younger children monitored the task about as well as older children. This pattern of results indicates that the younger children were disrupted by the feedback rather than helped. On the other hand, the older children were able to incorporate the feedback offered by the confederate into a plan of action. Another aim of this study was to assess similar processing components to those investigated by the MPM Lego task in a more naturalistic observation. Together, the use of the Lego Task (a social-cognitive task) and the naturalistic social interaction allowed for the appraisal of cross-domain continuities and discontinuities in monitoring behaviours. In this vein, analyses were undertaken in order to ascertain whether or not successful performance in the MPM Lego Task would predict cross-domain competence in the more naturalistic social interchange. Indeed, success in the two latter components of the MPM (planning and behavioural control) was related to overall competence in the naturalistic task. However, this cross-domain prediction was not evident for all levels of the naturalistic interchange, suggesting that the nature of the feedback a child receives is an important determinant of response competency. Individual difference measures reflecting the children's general cognitive capacity (Working Memory and Digit Span) and verbal ability (vocabulary) were also taken in an effort to account for more variance in the prediction of task success. However, these individual difference measures did not serve to enhance the prediction of task performance in either the Lego Task or the naturalistic task. Similarly, parental responses to questionnaires pertaining to their child's temperament and social experience also failed to increase prediction of task performance. On-line measures of the children's engagement, positive affect, and anxiety also failed to predict competence ratings.
Abstract:
In this paper, we develop finite-sample inference procedures for stationary and nonstationary autoregressive (AR) models. The method is based on special properties of Markov processes and a split-sample technique. The results on Markovian processes (intercalary independence and truncation) only require the existence of conditional densities. They are proved for possibly nonstationary and/or non-Gaussian multivariate Markov processes. In the context of a linear regression model with AR(1) errors, we show how these results can be used to simplify the distributional properties of the model by conditioning a subset of the data on the remaining observations. This transformation leads to a new model which has the form of a two-sided autoregression to which standard classical linear regression inference techniques can be applied. We show how to derive tests and confidence sets for the mean and/or autoregressive parameters of the model. We also develop a test on the order of an autoregression. We show that a combination of subsample-based inferences can improve the performance of the procedure. An application to U.S. domestic investment data illustrates the method.
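As a rough illustration of the data transformation behind the method, the sketch below simulates an AR(1) series and regresses the odd-indexed observations on their two neighbours, i.e., the two-sided autoregression to which classical linear regression inference can then be applied. The simulation settings are assumptions, and the paper's actual test construction is not reproduced here.

```python
# Illustrative sketch of the split-sample / two-sided autoregression idea:
# simulate an AR(1) series, then regress the odd-indexed observations on their
# two even-indexed neighbours. Parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
n, rho = 401, 0.7
y = np.zeros(n)
for t in range(1, n):                       # simulate AR(1): y_t = rho*y_{t-1} + e_t
    y[t] = rho * y[t - 1] + rng.normal()

odd = np.arange(1, n - 1, 2)                # observations conditioned on neighbours
X = np.column_stack([y[odd - 1], y[odd + 1]])   # two-sided regressors y_{t-1}, y_{t+1}
beta, *_ = np.linalg.lstsq(X, y[odd], rcond=None)

print("two-sided AR coefficients:", beta)   # both close to rho/(1+rho^2) for an AR(1)
```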
Abstract:
We present a statistical image-based shape + structure model for Bayesian visual hull reconstruction and 3D structure inference. The 3D shape of a class of objects is represented by sets of contours from silhouette views simultaneously observed from multiple calibrated cameras. Bayesian reconstructions of new shapes are then estimated using a prior density constructed with a mixture model and probabilistic principal components analysis. We show how the use of a class-specific prior in a visual hull reconstruction can reduce the effect of segmentation errors from the silhouette extraction process. The proposed method is applied to a data set of pedestrian images, and improvements in the approximate 3D models under various noise conditions are shown. We further augment the shape model to incorporate structural features of interest; unknown structural parameters for a novel set of contours are then inferred via the Bayesian reconstruction process. Model matching and parameter inference are done entirely in the image domain and require no explicit 3D construction. Our shape model enables accurate estimation of structure despite segmentation errors or missing views in the input silhouettes, and works even with only a single input view. Using a data set of thousands of pedestrian images generated from a synthetic model, we can accurately infer the 3D locations of 19 joints on the body based on observed silhouette contours from real images.
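A minimal sketch of the prior-based reconstruction mechanism follows: a low-dimensional linear (PCA) shape model is learned from vectorized contours, and a corrupted contour is projected onto it to suppress segmentation noise. The synthetic ellipse contours and the plain single-view PCA stand in for the paper's multi-view mixture-of-PPCA prior and are assumptions for illustration only.

```python
# Sketch: learn a linear shape model over vectorized contours, then project a
# noisy contour onto the learned subspace to suppress segmentation noise.
# Synthetic ellipses stand in for silhouette contours (illustrative assumption).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
angles = np.linspace(0, 2 * np.pi, 64, endpoint=False)

def contour(a, b):
    """Vectorized ellipse contour (x1..x64, y1..y64) as a stand-in for a silhouette."""
    return np.concatenate([a * np.cos(angles), b * np.sin(angles)])

# Training "shapes": ellipses with varying axis lengths
train = np.array([contour(1.0 + 0.3 * rng.normal(), 2.0 + 0.3 * rng.normal())
                  for _ in range(500)])

pca = PCA(n_components=5).fit(train)

noisy = contour(1.1, 2.1) + rng.normal(0, 0.15, 128)       # corrupted observation
reconstructed = pca.inverse_transform(pca.transform(noisy[None, :]))[0]

clean = contour(1.1, 2.1)
print("error before:", np.linalg.norm(noisy - clean))
print("error after: ", np.linalg.norm(reconstructed - clean))
```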
Abstract:
We present the sensitivity analysis of a brand perception and marketing investment adjustment model developed at the Simulation Laboratory of the Universidad del Rosario. This degree project consists of an introduction to the topic of sensitivity analysis and its complement, uncertainty analysis. Both analyses are then demonstrated using a simple example of the model's application, through the exhaustive and rigorous application of the steps described in the first part. This is followed by a discussion of the problem of measuring magnitudes, which proves to be the most complex factor in applying the model in a practical context, and finally conclusions are drawn about the results of the analyses.
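Since the underlying brand-perception model is not reproduced here, the sketch below illustrates the two analyses in generic form: a one-at-a-time sensitivity analysis and a Monte Carlo uncertainty analysis applied to a toy response function whose inputs and ranges are purely illustrative assumptions.

```python
# Generic sketch of the two analyses the abstract distinguishes, applied to a toy
# response function. f(), its parameters, and the ranges are illustrative only.
import numpy as np

def f(ad_spend, word_of_mouth, decay):
    """Toy 'brand perception' response; stands in for the simulation model."""
    return ad_spend * word_of_mouth / (1.0 + decay)

base = {"ad_spend": 100.0, "word_of_mouth": 0.6, "decay": 0.2}
y0 = f(**base)

# One-at-a-time sensitivity analysis: +/-10% perturbation of each input
for name in base:
    hi = dict(base, **{name: base[name] * 1.1})
    lo = dict(base, **{name: base[name] * 0.9})
    swing = (f(**hi) - f(**lo)) / y0
    print(f"{name:14s} relative swing: {swing:+.2%}")

# Uncertainty analysis: propagate assumed input distributions through the model
rng = np.random.default_rng(7)
samples = f(rng.normal(100, 10, 10_000),
            rng.uniform(0.4, 0.8, 10_000),
            rng.uniform(0.1, 0.3, 10_000))
print(f"output mean = {samples.mean():.1f}, 90% interval = "
      f"[{np.percentile(samples, 5):.1f}, {np.percentile(samples, 95):.1f}]")
```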
Abstract:
Global hydrological models (GHMs) model the land surface hydrologic dynamics of continental-scale river basins. Here we describe one such GHM, the Macro-scale Probability-Distributed Moisture model.09 (Mac-PDM.09). The model has undergone a number of revisions since it was last applied in the hydrological literature. This paper serves to provide a detailed description of the latest version of the model. The main revisions include the following: (1) the ability for the model to be run for n repetitions, which provides more robust estimates of extreme hydrological behaviour, (2) the ability of the model to use a gridded field of the coefficient of variation (CV) of daily rainfall for the stochastic disaggregation of monthly precipitation to daily precipitation, and (3) the model can now be forced with daily input climate data as well as monthly input climate data. We demonstrate the effects that each of these three revisions has on simulated runoff relative to before the revisions were applied. Importantly, we show that when Mac-PDM.09 is forced with monthly input data, it results in a negative runoff bias relative to when daily forcings are applied, for regions of the globe where the day-to-day variability in relative humidity is high. The runoff bias can be up to -80% for a small selection of catchments, but the absolute magnitude of the bias may be small. As such, we recommend that future applications of Mac-PDM.09 that use monthly climate forcings acknowledge the bias as a limitation of the model. The performance of Mac-PDM.09 is evaluated by validating simulated runoff against observed runoff for 50 catchments. We also present a sensitivity analysis that demonstrates that simulated runoff is considerably more sensitive to the method of PE calculation than to perturbations in soil moisture and field capacity parameters.
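Revision (2) relies on disaggregating a monthly precipitation total into daily values with a prescribed CV of daily rainfall. The sketch below shows one way such a disaggregation can be done, assuming a gamma distribution for daily totals; Mac-PDM.09's actual scheme (distribution choice, wet-day handling, gridding) may differ.

```python
# Illustrative sketch: stochastic disaggregation of a monthly precipitation total
# into daily values with a prescribed coefficient of variation (CV). The gamma
# distribution is assumed here purely for illustration.
import numpy as np

def disaggregate_month(monthly_total_mm: float, cv_daily: float,
                       n_days: int = 30, seed: int = 0) -> np.ndarray:
    """Draw n_days gamma-distributed daily totals with the given CV,
    rescaled so they sum exactly to the monthly total."""
    rng = np.random.default_rng(seed)
    shape = 1.0 / cv_daily**2            # for a gamma distribution, CV = 1/sqrt(shape)
    daily = rng.gamma(shape, 1.0, n_days)
    return daily * (monthly_total_mm / daily.sum())

daily = disaggregate_month(monthly_total_mm=120.0, cv_daily=1.5)
print(f"sum = {daily.sum():.1f} mm, max day = {daily.max():.1f} mm, "
      f"realised CV = {daily.std() / daily.mean():.2f}")
```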
Abstract:
Purpose – The purpose of this paper is to propose a process model for knowledge transfer, using theories relating to knowledge communication and knowledge translation. Design/methodology/approach – Most of what is put forward in this paper is based on a research project titled "Procurement for innovation and knowledge transfer (ProFIK)". The project is funded by a UK government research council – the Engineering and Physical Sciences Research Council (EPSRC). The discussion is mainly grounded in a thorough review of the literature accomplished as part of the research project. Findings – The process model developed in this paper builds upon the theory of knowledge transfer and the theory of communication. Knowledge transfer, per se, is not a mere transfer of knowledge. It involves different stages of knowledge transformation. Depending on the context of knowledge transfer, it can also be influenced by many factors, some positive and some negative. The developed model of knowledge transfer attempts to encapsulate all these issues in order to create a holistic framework. Originality/value – An attempt has been made in the paper to combine some of the significant theories and findings relating to knowledge transfer, making the paper an original and valuable one.
Abstract:
The incorporation of numerical weather predictions (NWP) into a flood forecasting system can increase forecast lead times from a few hours to a few days. A single NWP forecast from a single forecast centre, however, is insufficient, as it involves considerable non-predictable uncertainties and leads to a high number of false alarms. The availability of global ensemble numerical weather prediction systems through the THORPEX Interactive Grand Global Ensemble (TIGGE) offers a new opportunity for flood forecasting. The Grid-Xinanjiang distributed hydrological model, which is based on the Xinanjiang model theory and the topographical information of each grid cell extracted from the Digital Elevation Model (DEM), is coupled with ensemble weather predictions based on the TIGGE database (CMC, CMA, ECMWF, UKMO, NCEP) for flood forecasting. This paper presents a case study using the coupled flood forecasting model on the Xixian catchment (a drainage area of 8826 km2) located in Henan province, China. A probabilistic discharge is provided as the end product of the flood forecast. Results show that the combination of the Grid-Xinanjiang model and the TIGGE database provides a promising tool for early warning of flood events several days ahead.
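The probabilistic discharge product can be summarised as an exceedance probability over the ensemble. The sketch below illustrates this with synthetic numbers standing in for Grid-Xinanjiang simulations forced by individual TIGGE members; the member count, discharge values, and warning threshold are illustrative assumptions.

```python
# Sketch of a "probabilistic discharge" product: given discharge simulations driven
# by each ensemble member (synthetic values here), estimate the probability of
# exceeding a flood-warning threshold at each lead time. All numbers are assumed.
import numpy as np

rng = np.random.default_rng(3)
n_members, n_lead_days = 50, 5
threshold_m3s = 800.0                                      # hypothetical warning level

# Synthetic ensemble of simulated discharge [m3/s], rising towards a flood peak
discharge = rng.normal(loc=[500, 650, 800, 950, 900], scale=120,
                       size=(n_members, n_lead_days))

exceedance_prob = (discharge > threshold_m3s).mean(axis=0) # fraction of members above
for day, p in enumerate(exceedance_prob, start=1):
    print(f"lead day {day}: P(Q > {threshold_m3s:.0f} m3/s) = {p:.2f}")
```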