926 results for Process Models
Abstract:
Darwin observed that multiple, lowly organized, rudimentary, or exaggerated structures show increased relative variability. However, the cellular basis for these laws has never been investigated. Some animals, such as the nematode Caenorhabditis elegans, are famous for having organs that possess the same number of cells in all individuals, a property known as eutely. But for most multicellular creatures, the extent of cell number variability is unknown. Here we estimate variability in organ cell number for a variety of animals, plants, slime moulds, and volvocine algae. We find that the mean and variance in cell number obey a power law with an exponent of 2, comparable to Taylor's law in ecological processes. Relative cell number variability, as measured by the coefficient of variation, differs widely across taxa and tissues, but is generally independent of mean cell number among homologous tissues of closely related species. We show that the power law for cell number variability can be explained by stochastic branching process models based on the properties of cell lineages. We also identify taxa in which the precision of developmental control appears to have evolved. We propose that the scale independence of relative cell number variability is maintained by natural selection.
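The exponent-2 power law described above falls out of even the simplest stochastic branching model of a cell lineage. The sketch below is a toy illustration in plain Python, not the authors' models; the division probability, lineage depths, and replicate counts are arbitrary choices. It grows "organs" by random cell division and recovers a Taylor's-law exponent close to 2 (equivalently, a roughly constant coefficient of variation across mean sizes):

```python
import math
import random

random.seed(0)

def organ_size(generations, p_divide=0.9):
    """Grow one 'organ' from a single founder cell: in every round,
    each cell divides with probability p_divide, independently."""
    n = 1
    for _ in range(generations):
        n += sum(1 for _ in range(n) if random.random() < p_divide)
    return n

# Mean and variance of final cell number across simulated individuals,
# for lineages of increasing depth (so mean size spans ~1.5 decades).
stats = []
for gens in range(4, 10):
    sizes = [organ_size(gens) for _ in range(1000)]
    m = sum(sizes) / len(sizes)
    v = sum((s - m) ** 2 for s in sizes) / (len(sizes) - 1)
    stats.append((m, v))

# Least-squares slope of log(variance) against log(mean): the
# Taylor's-law exponent, which the branching process drives to ~2.
xs = [math.log(m) for m, _ in stats]
ys = [math.log(v) for _, v in stats]
xb, yb = sum(xs) / len(xs), sum(ys) / len(ys)
slope = (sum((x - xb) * (y - yb) for x, y in zip(xs, ys))
         / sum((x - xb) ** 2 for x in xs))
print(f"Taylor's-law exponent estimate: {slope:.2f}")
```

The slope lands near 2 because, for a supercritical branching process, the scaled population size converges to a limit with fixed relative variability, so variance scales as the square of the mean.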
Abstract:
The organizational structure of companies in the biomass energy sector, as regards supply chain management, can be greatly improved through the use of software decision-support tools. These tools should be able to provide real-time alternative scenarios when deviations from the initial production plans are observed. To make this possible, it is necessary to have representative production chain process models in which several scenarios and solutions can be evaluated accurately. Due to its nature, this type of process is most adequately represented by means of event-based models. In particular, this work presents the modelling of a typical biomass production chain using the SimEvents computing platform. Details of the conceptual model, as well as simulation results, are provided throughout the article.
Abstract:
Dynamic spatial analysis addresses the computational aspects of space–time processing. This paper describes the development of a spatial analysis tool and modelling framework that together offer a solution for simulating landscape processes, and advocates a better approach to integrating landscape spatial analysis with Geographical Information Systems. Enhancements include special spatial operators and map algebra language constructs to handle dispersal and advective flows over landscape surfaces. These functional components of landscape modelling are developed in a modular way and are linked together in a modelling framework that performs dynamic simulation. The concepts and framework are demonstrated using a hydrological modelling example. The approach provides a modelling environment in which scientists and land resource managers can write and visualize spatial process models with ease.
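Dispersal and advective-flow operators of the map-algebra kind described here can be sketched in a few lines. The example below is a toy illustration using NumPy (periodic boundaries are chosen purely for simplicity, and the operators are not the paper's own constructs):

```python
import numpy as np

def disperse(grid, rate=0.2):
    """One map-algebra 'spread' step: each cell passes a fraction
    `rate` of its value in equal shares to its four orthogonal
    neighbours (periodic boundaries, so mass is exactly conserved)."""
    share = grid * (rate / 4.0)
    return (grid * (1.0 - rate)
            + np.roll(share, 1, axis=0) + np.roll(share, -1, axis=0)
            + np.roll(share, 1, axis=1) + np.roll(share, -1, axis=1))

def advect_east(grid, speed=1):
    """A minimal advective-flow operator: translate the whole field
    `speed` cells eastward along the column axis."""
    return np.roll(grid, speed, axis=1)

# A point source (e.g. a pollutant pulse) on a 9 x 9 landscape grid.
field = np.zeros((9, 9))
field[4, 4] = 100.0

for _ in range(5):                        # five simulated time steps
    field = advect_east(disperse(field))

print(f"total mass: {field.sum():.1f}")   # conserved at 100.0
print(f"peak value: {field.max():.1f}")   # below 100: the plume has spread
```

Chaining such primitive operators per time step is the essence of dynamic map algebra: each step is itself a map-algebra expression over the whole surface.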
Abstract:
In this paper, we present a top-down approach to integrated process modelling and distributed process execution. The integrated process model can be utilized for global monitoring and visualization, and the distributed process models for local execution. Our main focus in this paper is the presentation of an approach to support the automatic generation and linking of distributed process models from an integrated process definition.
Abstract:
A Bayesian procedure for the retrieval of wind vectors over the ocean using satellite-borne scatterometers requires realistic prior near-surface wind field models over the oceans. We have implemented carefully chosen vector Gaussian Process models; however, in some cases these models are too smooth to reproduce real atmospheric features, such as fronts. At the scale of the scatterometer observations, fronts appear as discontinuities in wind direction. Due to the nature of the retrieval problem, a simple discontinuity model is not feasible, and hence we have developed a constrained discontinuity vector Gaussian Process model which ensures realistic fronts. We describe the generative model and show how to compute the data likelihood given the model. We show the results of inference using the model with Markov chain Monte Carlo methods on both synthetic and real data.
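A one-dimensional toy example makes the smoothness problem concrete. The sketch below (NumPy; the squared-exponential kernel, length-scale, and front location are illustrative choices, not the authors' constrained vector model) draws from a smooth GP prior, and then from a crude split-domain alternative with independent GPs on either side of a front:

```python
import numpy as np

rng = np.random.default_rng(1)

def se_kernel(x1, x2, variance=1.0, lengthscale=0.5):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

x = np.linspace(0.0, 5.0, 200)
K = se_kernel(x, x) + 1e-6 * np.eye(x.size)   # jitter for stability

# A draw from the smooth GP prior: it varies gradually everywhere,
# which is exactly why it cannot reproduce a sharp front.
smooth_draw = np.linalg.cholesky(K) @ rng.standard_normal(x.size)

# A crude discontinuity alternative: independent GPs on either side
# of a front placed (for illustration only) at x = 2.5.
front_draw = np.empty_like(x)
for mask in (x < 2.5, x >= 2.5):
    n = int(mask.sum())
    Km = se_kernel(x[mask], x[mask]) + 1e-6 * np.eye(n)
    front_draw[mask] = np.linalg.cholesky(Km) @ rng.standard_normal(n)

print(f"largest step, smooth GP: {np.abs(np.diff(smooth_draw)).max():.3f}")
print(f"largest step, split GP:  {np.abs(np.diff(front_draw)).max():.3f}")
```

The split-domain draw can jump at the front, but such a naive construction is exactly the "simple discontinuity model" the abstract says is infeasible for the retrieval problem, which motivates the constrained formulation.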
Abstract:
We discuss aggregation of data from neuropsychological patients and the process of evaluating models using data from a series of patients. We argue that aggregation can be misleading but not aggregating can also result in information loss. The basis for combining data needs to be theoretically defined, and the particular method of aggregation depends on the theoretical question and characteristics of the data. We present examples, often drawn from our own research, to illustrate these points. We also argue that statistical models and formal methods of model selection are a useful way to test theoretical accounts using data from several patients in multiple-case studies or case series. Statistical models can often measure fit in a way that explicitly captures what a theory allows; the parameter values that result from model fitting often measure theoretically important dimensions and can lead to more constrained theories or new predictions; and model selection allows the strength of evidence for models to be quantified without forcing this into the artificial binary choice that characterizes hypothesis testing methods. Methods that aggregate and then formally model patient data, however, are not automatically preferred to other methods. Which method is preferred depends on the question to be addressed, characteristics of the data, and practical issues like availability of suitable patients, but case series, multiple-case studies, single-case studies, statistical models, and process models should be complementary methods when guided by theory development.
Abstract:
On the basis of a convolutional (Hamming) version of the recent Neural Network Assembly Memory Model (NNAMM), optimal receiver operating characteristics (ROCs) have been derived analytically for an intact two-layer autoassociative Hopfield network. A method for explicitly taking into account the a priori probabilities of alternative hypotheses on the structure of the information initiating memory trace retrieval is introduced, together with modified ROCs (mROCs: a posteriori probabilities of correct recall vs. false-alarm probability). The comparison of empirical and calculated ROCs (or mROCs) demonstrates that they coincide quantitatively, and in this way the intensities of cues used in the corresponding experiments may be estimated. It has been found that the basic ROC properties which are among the experimental findings underpinning dual-process models of recognition memory can be explained within our one-factor NNAMM.
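For readers unfamiliar with ROCs and their prior-weighted variants, the sketch below computes ROC points for a generic unequal-variance signal-detection model and then a posterior "correct recall" probability under varying priors, in the spirit of the mROCs described here. It is a textbook illustration, not the NNAMM derivation; all parameter values are arbitrary:

```python
import math

def phi_c(z):
    """Standard-normal survival function P(Z > z)."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Unequal-variance signal-detection ROC: memory strength of "old"
# (studied) items ~ N(d', sigma^2), of "new" items ~ N(0, 1).
# Sweeping the decision criterion traces out the ROC curve.
d_prime, sigma = 1.5, 1.25
criteria = [k / 4.0 for k in range(-8, 13)]          # c from -2.0 to 3.0
roc = [(phi_c(c), phi_c((c - d_prime) / sigma)) for c in criteria]
#        false-alarm rate      hit rate

def posterior_old(fa, hit, p_old):
    """mROC-style quantity: posterior probability that a test item is
    'old' given a positive response, for prior probability p_old."""
    return (p_old * hit) / (p_old * hit + (1.0 - p_old) * fa)

fa, hit = roc[10]                                    # criterion c = 0.5
for p in (0.25, 0.5, 0.75):
    print(f"p_old={p}: posterior correct recall = {posterior_old(fa, hit, p):.3f}")
```

Plotting hit rate against false-alarm rate across criteria gives the familiar curvilinear ROC; re-expressing each point through the posterior for different priors is one simple way to build the "a posteriori probability of correct recall vs. false alarm" axes mentioned above.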
Abstract:
In the Light Controlled Factory, part-to-part assembly and reduced weight will be enabled through the use of predictive fitting processes; low-cost, high-accuracy reconfigurable tooling will be made possible by active compensation; improved control will allow accurate robotic machining; and quality will be improved through the use of traceable, uncertainty-based quality control throughout the production system. A number of challenges must be overcome before this vision is realized: 1) controlling industrial robots for accurate machining; 2) compensation of measurements for thermal expansion; 3) compensation of measurements for refractive index changes; 4) development of embedded metrology tooling for in-tooling measurement and active tooling compensation; and 5) development of software for the planning and control of integrated metrology networks, based on quality control with uncertainty evaluation and control systems for predictive processes. This paper describes how these challenges are being addressed, in particular the central challenge of developing large-volume measurement process models within an integrated dimensional variation management (IDVM) system.
Abstract:
A class of multi-process models is developed for collections of time indexed count data. Autocorrelation in counts is achieved with dynamic models for the natural parameter of the binomial distribution. In addition to modeling binomial time series, the framework includes dynamic models for multinomial and Poisson time series. Markov chain Monte Carlo (MCMC) and Pólya-Gamma data augmentation (Polson et al., 2013) are critical for fitting multi-process models of counts. To facilitate computation when the counts are high, a Gaussian approximation to the Pólya-Gamma random variable is developed.
Three applied analyses are presented to explore the utility and versatility of the framework. The first analysis develops a model for complex dynamic behavior of themes in collections of text documents. Documents are modeled as a “bag of words”, and the multinomial distribution is used to characterize uncertainty in the vocabulary terms appearing in each document. State-space models for the natural parameters of the multinomial distribution induce autocorrelation in themes and their proportional representation in the corpus over time.
The second analysis develops a dynamic mixed membership model for Poisson counts. The model is applied to a collection of time series which record neuron level firing patterns in rhesus monkeys. The monkey is exposed to two sounds simultaneously, and Gaussian processes are used to smoothly model the time-varying rate at which the neuron’s firing pattern fluctuates between features associated with each sound in isolation.
The third analysis presents a switching dynamic generalized linear model for the time-varying home run totals of professional baseball players. The model endows each player with an age specific latent natural ability class and a performance enhancing drug (PED) use indicator. As players age, they randomly transition through a sequence of ability classes in a manner consistent with traditional aging patterns. When the performance of the player significantly deviates from the expected aging pattern, he is identified as a player whose performance is consistent with PED use.
All three models provide a mechanism for sharing information across related series locally in time. The models are fit with variations on the Pólya-Gamma Gibbs sampler, MCMC convergence diagnostics are developed, and reproducible inference is emphasized throughout the dissertation.
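The Pólya-Gamma augmentation relied on throughout this framework can be made concrete with a small sampler sketch. The function below approximates a PG(b, c) draw by truncating the infinite sum-of-gammas representation from Polson, Scott & Windle (2013); their exact sampler uses an accept/reject scheme instead, so this is only an illustration, and the truncation length is an arbitrary choice:

```python
import math
import random

random.seed(42)

def polya_gamma_approx(b, c, n_terms=200):
    """Approximate draw from PG(b, c) via the truncated sum-of-gammas
    representation:
        omega = (1 / (2 pi^2)) * sum_k g_k / ((k - 1/2)^2 + c^2 / (4 pi^2)),
    with g_k ~ Gamma(b, 1).  Illustrative only; not the exact sampler."""
    offset = (c ** 2) / (4.0 * math.pi ** 2)
    total = 0.0
    for k in range(1, n_terms + 1):
        g_k = random.gammavariate(b, 1.0)
        total += g_k / ((k - 0.5) ** 2 + offset)
    return total / (2.0 * math.pi ** 2)

# Sanity check against the known mean E[omega] = b/(2c) * tanh(c/2).
b, c = 1.0, 2.0
draws = [polya_gamma_approx(b, c) for _ in range(2000)]
sample_mean = sum(draws) / len(draws)
exact_mean = b / (2.0 * c) * math.tanh(c / 2.0)
print(f"sample mean {sample_mean:.4f} vs exact mean {exact_mean:.4f}")
```

In the Gibbs sampler, one such latent omega is drawn per binomial observation, after which the dynamic natural parameters have conditionally Gaussian updates; that conjugacy is the entire appeal of the augmentation.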
Abstract:
Aims/Purpose: Protocols are evidence-based, structured guides for directing care to achieve improvements. But translating that evidence into practice is a major challenge. It is not acceptable simply to introduce a protocol and expect it to be adopted and lead to a change in practice. Implementation requires effective leadership and management. This presentation describes an implementation strategy that should promote successful adoption and lead to practice change.
Presentation description: There are many social and behavioural change models to assist and guide practice change. Choosing a model to guide implementation is important for providing a framework for action. The change process requires careful thought, from the protocol itself to the policies and politics within the ICU. In this presentation, I discuss a useful pragmatic guide called the 6SQUID (6 Steps in QUality Intervention Development). This was initially designed for public health interventions, but the model has wider applicability and has similarities with other change process models. Steps requiring consideration include examining the purpose and the need for change; the staff that will be affected and the impact on their workload; and the evidence base supporting the protocol. Subsequent steps in the process that the ICU manager should consider are the change mechanism (widespread multi-disciplinary consultation; adapting the protocol to the local ICU); and identifying how to deliver the change mechanism (educational workshops and preparing staff for the changes are imperative). Recognising the barriers to implementation and change and addressing these locally is also important. Once the protocol has been implemented, there is generally a learning curve before it becomes embedded in practice. Audit and feedback on adherence are useful strategies to monitor and sustain the changes.
Conclusion: Managing change successfully will promote a positive experience for staff. In turn, this will encourage a culture of enthusiasm for translating evidence into practice.
Abstract:
Changes in the economic world and the emergence of the Internet as a tool for communication and integration among markets have forced organizations to adopt a different, process-oriented structure with a focus on information management. Information technology has thus gained prominence in the organizational context, with an increase in the complexity and range of services provided by this function. Moreover, outsourcing has become an important model for a flexible corporate structure, helping organizations achieve better results in carrying out their activities and processes and become more competitive. IT outsourcing requires a series of steps, ranging from strategic assessment to the management of the outsourced service. These steps can influence the form of contracting services, varying with the types of service providers and contractors. The study therefore aimed to identify how the IT outsourcing process influences the use of models for contracting services. To this end, a multiple-case study was conducted involving two companies in Rio Grande do Norte State, specifically in the health sector. Data were collected from the CIOs of the companies surveyed through semi-structured interviews. The results show that a more structured outsourcing process favours the use of a more advanced contracting model. Certain features of these steps carry this influence more clearly, such as the goals pursued by outsourcing, the criteria used in selecting the supplier, contract negotiation, how services are transitioned, and the use of management methods, though this can vary with the level of maturity of the relationship between the companies examined. It was also found that the contracting model used may in turn influence how the IT outsourcing process is developed, requiring, or not, greater formalization and organization.
Abstract:
Designing usable software brings benefits to end users and other stakeholders. In e-commerce, usability is vital, because customers easily move on to the next site if they cannot find what they are looking for. Studies show that usability affects purchase decisions. Usability also matters for customer satisfaction, which in turn affects customer loyalty. This thesis examined how usability is designed in practice compared with theoretical recommendations. The case study concerned a project to redesign the online store of an international furniture retailer. The need for redesign arose from the inadequate usability of the previous version of the store. The project was carried out with the agile Scrum method. The empirical material was collected through semi-structured interviews with people involved in user experience design. The interview themes were drawn up on the basis of the theoretical material. The theoretical part examined the principles, process, and methods of usability design. Twelve principles that support and characterize user-centred design were identified in previous research. Usability is designed through a user-centred process: the various process models treated as central the definition and understanding of the context of use, measurable usability requirements, empirical evaluation of design solutions, and the iterative nature of the design process. The thesis also examined which design methods researchers propose for usability design and which, according to survey studies, are actually used. In the online store project, usability design partly differed from the theoretical recommendations. Not all project participants had knowledge of the context of use, and usability requirements had not been set in the way the theory intends. Similarities were also found.
In the online store project, design solutions were evaluated empirically with representatives of real users. The design process was iterative, i.e., there was readiness to change design solutions as a result of evaluation. Based on the study, it is recommended that communication in the online store project be improved, because knowledge of the context of use did not reach everyone working on the project. Theory should place still greater emphasis on the importance of communication. It is also suggested that theory should give better practical guidance for producing requirement specifications. Keywords: usability, user-centred design, usability principles, usability design methods, agile software development, case study