21 results for Process Modeling, Collaboration, Distributed Modeling, Collaborative Technology

at Université de Lausanne, Switzerland


Relevance: 100.00%

Abstract:

This dissertation focuses on new technology commercialization, innovation, and new business development. Industry-based novel technology may achieve commercialization through its transfer to a large research laboratory acting as a lead user and technical partner, providing the new technology with complementary assets and meaningful initial use in social practice. The research lab benefits from the new technology and innovation through major performance improvements and cost savings. Such mutually beneficial collaboration between the lab and the firm does not require any additional administrative effort or funds from the lab, yet it requires openness to technologies and partner companies that may not previously have been known to the lab. Labs achieve these benefits by applying a proactive procurement model that promotes active pre-tender search for new technologies and pre-tender testing and piloting of these technological options. The collaboration works best when it is based on the development needs of both parties: first, the lab has significant engineering activity with well-defined technological needs, and second, the firm has an advanced prototype technology but still needs further testing, piloting, and an initial market and references to achieve a market breakthrough. The empirical evidence of the dissertation is based on a longitudinal multiple-case study with the European Laboratory for Particle Physics. The key theoretical contribution of this study is that large research labs, including basic-research labs, play an important role in product and business development toward the end, rather than the front end, of the innovation process. This also implies that product orientation and business orientation can contribute to basic research. The study provides practical managerial and policy guidelines on how to initiate and manage mutually beneficial lab-industry collaboration and proactive procurement.

Relevance: 100.00%

Abstract:

The rapid adoption of online media like Facebook, Twitter, or WikiLeaks leaves us with little time to think. Where is information technology taking us, our society, and our democratic institutions? Is the Web replicating social divides that already exist offline, or does collaborative technology pave the way for a more equal society? How do we find the right balance between openness and privacy? Can social media improve civic participation, or do they breed superficial exchange and the promotion of false information? These and many other questions arise when one starts to look at the Internet, society, and politics. The first part of this paper gives an overview of the social changes that accompany the rise of the Web. The second part reviews how the Web is being used for political participation in Switzerland and abroad.

Relevance: 100.00%

Abstract:

This paper shows how to recursively calculate the moments of the accumulated and discounted value of cash flows when the instantaneous rates of return follow a conditional ARMA process with normally distributed innovations. We investigate various moment-based approaches to approximating the distribution of the accumulated value of cash flows and assess their performance through Monte Carlo simulations. We discuss potential uses in insurance, especially in the context of asset-liability management for pension funds.
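As a rough illustration of the quantities involved (not the paper's recursive moment formulas), the following sketch estimates the first two moments of the accumulated value of unit cash flows by Monte Carlo simulation, assuming a Gaussian AR(1) special case of the conditional ARMA return process; all parameter values are made up for the example.

```python
import numpy as np

# Monte Carlo sketch (not the paper's recursive formulas): accumulated value at
# time n of unit cash flows paid at times 1..n, when the instantaneous rate of
# return delta_t follows a Gaussian AR(1) process. Parameters are illustrative.
rng = np.random.default_rng(0)
n, n_sims = 20, 100_000
mu, phi, sigma = 0.04, 0.6, 0.01            # assumed AR(1) parameters

delta = np.empty((n_sims, n))
delta[:, 0] = mu + sigma * rng.standard_normal(n_sims)
for t in range(1, n):
    delta[:, t] = mu + phi * (delta[:, t - 1] - mu) + sigma * rng.standard_normal(n_sims)

# Each payment of 1 at time t accumulates at rates delta_{t+1}, ..., delta_n.
cum_from = np.cumsum(delta[:, ::-1], axis=1)[:, ::-1]   # sum_{s=t}^{n} delta_s
growth = np.exp(cum_from - delta)                       # exp(sum_{s=t+1}^{n} delta_s)
accumulated = growth.sum(axis=1)

print("estimated mean:", accumulated.mean())
print("estimated variance:", accumulated.var())
```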

Relevance: 50.00%

Abstract:

The Mont Collon mafic complex is one of the best preserved examples of the Early Permian magmatism in the Central Alps, related to the intra-continental collapse of the Variscan belt. It consists mostly (> 95 vol.%) of ol+hy-normative plagioclase-wehrlites and olivine- and cpx-gabbros with cumulitic structures, crosscut by acid dikes. Pegmatitic gabbros, troctolites and anorthosites outcrop locally. A well-preserved cumulitic sequence is exposed in the Dents de Bertol area (center of the intrusion). P-T calculations indicate that this layered magma chamber was emplaced at mid-crustal levels, at about 0.5 GPa and 1100 °C. The Mont Collon cumulitic rocks record little magmatic differentiation, as illustrated by the restricted range of clinopyroxene mg-number (Mg#cpx = 83-89). Whole-rock incompatible trace-element contents (e.g. Nb, Zr, Ba) vary widely and without correlation with major-element composition. These features are characteristic of an in-situ crystallization process with variable amounts of interstitial liquid L trapped between the cumulus mineral phases. LA-ICPMS measurements show that trace-element distribution in the latter is homogeneous, pointing to subsolidus re-equilibration between crystals and interstitial melts. A quantitative model based on Langmuir's in-situ crystallization equation successfully reproduces the REE concentrations in cumulitic minerals of all rock facies of the intrusion. The calculated amounts of interstitial liquid L vary between 0 and 35% for degrees of differentiation F of 0 to 20%, relative to the least evolved facies of the intrusion. L values correlate well with the modal proportions of interstitial amphibole and with whole-rock incompatible trace-element concentrations (e.g. Zr, Nb) of the tested samples. However, the in-situ crystallization model reaches its limitations with rocks containing a high modal content of REE-bearing minerals (i.e. zircon), such as pegmatitic gabbros. Dikes of anorthositic composition, locally crosscutting the layered lithologies, show that the Mont Collon rocks evolved in an open system, with mixing of intercumulus liquids of different origins and possibly contrasting compositions. The proposed model cannot resolve these complex open systems, but migrating liquids could be partly responsible for the observed dispersion of points in some correlation diagrams. The absence of significant differentiation, with recurrent lithologies in the cumulitic pile of Dents de Bertol, points to an efficiently convective magma chamber with possible periodic replenishment.
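To make the role of the trapped-liquid fraction L concrete, here is a minimal mass-balance sketch, deliberately simpler than Langmuir's in-situ crystallization equation used in the study: the whole-rock content of an incompatible element is treated as a mixture of cumulus crystals and trapped interstitial liquid, with illustrative partition coefficients and concentrations.

```python
# Minimal mass-balance sketch (not Langmuir's full in-situ crystallization
# equation): whole-rock concentration of a trace element in a cumulate made of
# cumulus crystals plus a trapped interstitial liquid fraction L.
def whole_rock_concentration(c_liquid, bulk_d, trapped_liquid_fraction):
    """c_liquid: element concentration in the interstitial liquid,
    bulk_d: bulk crystal/liquid partition coefficient of the cumulus minerals,
    trapped_liquid_fraction: L, mass fraction of trapped interstitial liquid."""
    crystals = (1.0 - trapped_liquid_fraction) * bulk_d * c_liquid
    trapped = trapped_liquid_fraction * c_liquid
    return crystals + trapped

# For a highly incompatible element (bulk D ~ 0.01, illustrative) the whole-rock
# content is controlled almost entirely by L, as observed in the Mont Collon rocks.
for L in (0.0, 0.1, 0.2, 0.35):
    print(L, whole_rock_concentration(c_liquid=10.0, bulk_d=0.01, trapped_liquid_fraction=L))
```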

Relevance: 50.00%

Abstract:

PECUBE is a three-dimensional thermal-kinematic code capable of solving the heat production-diffusion-advection equation under a temporally varying surface boundary condition. It was initially developed to assess the effects of time-varying surface topography (relief) on low-temperature thermochronological datasets. Thermochronometric ages are predicted by tracking the time-temperature histories of rock particles ending up at the surface and combining these with various age-prediction models. In the decade since its inception, the PECUBE code has been under continuous development as its use has widened to address different tectonic-geomorphic problems. This paper describes several major recent improvements to the code, including its integration with an inverse-modeling package based on the Neighborhood Algorithm, the incorporation of fault-controlled kinematics, several different ways to address topographic and drainage change through time, the ability to predict subsurface (tunnel or borehole) data, the prediction of detrital thermochronology data and a method to compare these with observations, and the coupling with landscape-evolution (or surface-process) models. Each new development is described together with one or several applications, so that the reader and potential user can clearly assess and make use of the capabilities of PECUBE. We end by describing some developments that are currently underway or should take place in the foreseeable future.
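As a toy illustration of the kind of calculation PECUBE performs, and not its transient 3-D heat-equation solution, the sketch below tracks a rock particle exhumed at a constant rate through a steady linear geotherm and takes the predicted thermochronometric age as the time since the particle last cooled below a nominal closure temperature; all values are illustrative.

```python
import numpy as np

# Toy age prediction (not PECUBE's transient 3-D solver): constant exhumation
# through a steady-state linear geotherm, with the age taken as the time since
# the cooling path last crossed a nominal closure temperature.
exhumation_rate = 0.5e-3        # m/yr
geothermal_gradient = 25.0      # degC/km
surface_temperature = 10.0      # degC
closure_temperature = 110.0     # degC, roughly apatite fission-track

times = np.linspace(20e6, 0.0, 2001)              # yr before present
depth_km = exhumation_rate * times / 1000.0       # particle depth at each time (km)
temperature = surface_temperature + geothermal_gradient * depth_km

below = temperature <= closure_temperature        # times at which the particle is below Tc
predicted_age = times[below][0]                   # oldest such time = crossing of Tc
print(f"predicted age: {predicted_age / 1e6:.2f} Myr")
```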

Relevance: 50.00%

Abstract:

1. Species distribution modelling is used increasingly in both applied and theoretical research to predict how species are distributed and to understand attributes of species' environmental requirements. In species distribution modelling, various statistical methods are used that combine species occurrence data with environmental spatial data layers to predict the suitability of any site for that species. While the number of data sharing initiatives involving species' occurrences in the scientific community has increased dramatically over the past few years, various data quality and methodological concerns related to using these data for species distribution modelling have not been addressed adequately.

2. We evaluated how uncertainty in georeferences and associated locational error in occurrences influence species distribution modelling using two treatments: (1) a control treatment where models were calibrated with original, accurate data and (2) an error treatment where data were first degraded spatially to simulate locational error. To incorporate error into the coordinates, we moved each coordinate with a random number drawn from the normal distribution with a mean of zero and a standard deviation of 5 km. We evaluated the influence of error on the performance of 10 commonly used distributional modelling techniques applied to 40 species in four distinct geographical regions.

3. Locational error in occurrences reduced model performance in three of these regions; relatively accurate predictions of species distributions were possible for most species, even with degraded occurrences. Two species distribution modelling techniques, boosted regression trees and maximum entropy, were the best performing models in the face of locational errors. The results obtained with boosted regression trees were only slightly degraded by errors in location, and the results obtained with the maximum entropy approach were not affected by such errors.

4. Synthesis and applications. To use the vast array of occurrence data that exists currently for research and management relating to the geographical ranges of species, modellers need to know the influence of locational error on model quality and whether some modelling techniques are particularly robust to error. We show that certain modelling techniques are particularly robust to a moderate level of locational error and that useful predictions of species distributions can be made even when occurrence data include some error.
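A minimal sketch of the error treatment described in point 2, assuming projected (metric) coordinates; the coordinates and random seed are hypothetical.

```python
import numpy as np

# Sketch of the error treatment: displace each occurrence coordinate by a
# normal deviate with mean zero and a 5 km standard deviation. Coordinates are
# assumed to be in a projected (metric) system.
rng = np.random.default_rng(42)

def degrade_occurrences(xy_metres, sd_km=5.0):
    """xy_metres: (n, 2) array of projected x/y occurrence coordinates."""
    noise = rng.normal(loc=0.0, scale=sd_km * 1000.0, size=xy_metres.shape)
    return xy_metres + noise

occurrences = np.array([[600_000.0, 200_000.0],
                        [610_500.0, 195_250.0]])
print(degrade_occurrences(occurrences))
```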

Relevance: 50.00%

Abstract:

Empirical modeling of exposure levels has been popular for identifying exposure determinants in occupational hygiene. Traditional data-driven methods used to choose a model on which to base inferences have typically not accounted for the uncertainty linked to the process of selecting the final model. Several new approaches propose making statistical inferences from a set of plausible models rather than from a single model regarded as 'best'. This paper introduces the multimodel averaging approach described in the monograph by Burnham and Anderson. In their approach, a set of plausible models is defined a priori by taking into account the sample size and previous knowledge of variables that influence exposure levels. The Akaike information criterion is then calculated to evaluate the relative support of the data for each model, expressed as an Akaike weight, interpreted as the probability that the model is the best approximating model given the model set. The model weights can then be used to rank models, quantify the evidence favoring one over another, perform multimodel prediction, estimate the relative influence of the potential predictors, and estimate multimodel-averaged effects of determinants. The whole approach is illustrated with the analysis of a data set of 1500 volatile organic compound exposure levels collected by the Institute for Work and Health (Lausanne, Switzerland) over 20 years, each concentration having been divided by the relevant Swiss occupational exposure limit and log-transformed before analysis. Multimodel inference represents a promising procedure for modeling exposure levels: it incorporates the notion that several models can be supported by the data and permits, to a certain extent, the evaluation of model selection uncertainty, which is seldom mentioned in current practice.
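The core computation of the approach, converting AIC values into Akaike weights and forming a model-averaged estimate, can be sketched in a few lines; the AIC values and model predictions below are placeholders, not results from the study.

```python
import numpy as np

# Sketch of Burnham & Anderson-style multimodel averaging: convert AIC values
# for a set of candidate models into Akaike weights and compute a
# model-averaged prediction. Values below are illustrative.
def akaike_weights(aic):
    delta = np.asarray(aic) - np.min(aic)          # AIC differences
    rel_likelihood = np.exp(-0.5 * delta)          # relative likelihood of each model
    return rel_likelihood / rel_likelihood.sum()   # weights sum to 1

aic_values = [152.3, 153.1, 158.7]                 # e.g. three candidate exposure models
predictions = np.array([1.8, 2.1, 1.2])            # each model's predicted log exposure

w = akaike_weights(aic_values)
print("Akaike weights:", w)
print("model-averaged prediction:", np.dot(w, predictions))
```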

Relevance: 50.00%

Abstract:

MOTIVATION: In silico modeling of gene regulatory networks has gained momentum recently due to increased interest in analyzing the dynamics of biological systems. This has been further facilitated by the increasing availability of experimental data on gene-gene, protein-protein and gene-protein interactions. The two dynamical properties that are often experimentally testable are perturbations and stable steady states. Although a lot of work has been done on the identification of steady states, little has been reported on in silico modeling of cellular differentiation processes. RESULTS: In this manuscript, we provide algorithms based on reduced ordered binary decision diagrams (ROBDDs) for Boolean modeling of gene regulatory networks. Algorithms for synchronous and asynchronous transition models are proposed and their computational properties analyzed. These algorithms allow users to compute cyclic attractors of large networks for which this is currently not feasible with existing software. We thereby provide a framework for analyzing the effect of multiple gene perturbation protocols on cell differentiation processes. The algorithms were validated on the T-helper model, correctly identifying its steady states and the Th1-Th2 cellular differentiation process. AVAILABILITY: The software binaries for Windows and Linux platforms can be downloaded from http://si2.epfl.ch/~garg/genysis.html.
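For readers unfamiliar with attractor analysis, the sketch below enumerates the attractors of a tiny synchronous Boolean network by explicit state iteration; it illustrates the concept only and does not use the ROBDD-based algorithms implemented in the paper's software, and the three-gene network is invented for the example.

```python
# Synchronous Boolean network attractors by explicit state enumeration
# (illustrative only; not the ROBDD-based approach of the paper).
update_rules = {
    "A": lambda s: s["B"] and not s["C"],
    "B": lambda s: s["A"],
    "C": lambda s: not s["A"],
}
genes = list(update_rules)

def step(state):
    named = dict(zip(genes, state))
    return tuple(update_rules[g](named) for g in genes)

attractors = set()
for start in range(2 ** len(genes)):
    state = tuple(bool(start >> i & 1) for i in range(len(genes)))
    seen = {}
    while state not in seen:
        seen[state] = len(seen)
        state = step(state)
    cycle_start = seen[state]
    # Canonical representation of the cycle reached from this initial state.
    cycle = tuple(sorted(s for s, i in seen.items() if i >= cycle_start))
    attractors.add(cycle)

print(attractors)   # fixed points are cycles of length 1, others are cyclic attractors
```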

Relevance: 50.00%

Abstract:

Hidden Markov models (HMMs) are probabilistic models that are well suited to many tasks in bioinformatics, for example predicting the occurrence of specific motifs in biological sequences. MAMOT is a command-line program for Unix-like operating systems, including Mac OS X, that we developed to allow scientists to apply HMMs more easily in their research. One can define the architecture and initial parameters of the model in a text file and then use MAMOT for parameter optimization on example data, for decoding (e.g. predicting motif occurrences in sequences), and for producing stochastic sequences generated according to the probabilistic model. Two examples for which models are provided are coiled-coil domains in protein sequences and protein binding sites in DNA. Useful features include the use of pseudocounts, state tying and the fixing of selected parameters during learning, and the inclusion of prior probabilities in decoding. AVAILABILITY: MAMOT is implemented in C++ and distributed under the GNU General Public Licence (GPL). The software, documentation, and example model files can be found at http://bcf.isb-sib.ch/mamot
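As an illustration of the kind of decoding such a tool performs, here is a generic Viterbi sketch, not MAMOT's actual model file format or implementation: a two-state HMM distinguishing an AT-rich "motif" state from background in a DNA sequence, with all probabilities invented for the example.

```python
import numpy as np

# Generic Viterbi decoding for a two-state HMM over DNA (illustrative only).
states = ["background", "motif"]
symbols = {"A": 0, "C": 1, "G": 2, "T": 3}
start = np.log([0.9, 0.1])
trans = np.log([[0.95, 0.05],
                [0.10, 0.90]])
emit = np.log([[0.25, 0.25, 0.25, 0.25],      # background: uniform
               [0.40, 0.10, 0.10, 0.40]])     # motif: AT-rich (assumed)

def viterbi(sequence):
    obs = [symbols[c] for c in sequence]
    v = start + emit[:, obs[0]]
    back = []
    for o in obs[1:]:
        scores = v[:, None] + trans            # scores[i, j]: best path ending in i, then i -> j
        back.append(scores.argmax(axis=0))
        v = scores.max(axis=0) + emit[:, o]
    path = [int(v.argmax())]
    for b in reversed(back):                   # backtrack the most probable state path
        path.append(int(b[path[-1]]))
    return [states[i] for i in reversed(path)]

print(viterbi("CGCGATATATGCGC"))
```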

Relevance: 50.00%

Abstract:

Natural selection is typically exerted at specific life stages. If natural selection takes place before a trait can be measured, using conventional models can lead to incorrect inference about population parameters. When the missing-data process is related to the trait of interest, valid inference requires explicit modeling of the missingness process. We propose a joint modeling approach, a shared parameter model, to account for nonrandom missing data. It consists of an animal model for the phenotypic data and a logistic model for the missingness process, linked by the additive genetic effects. A Bayesian approach is taken and inference is made using integrated nested Laplace approximations. From a simulation study we find that wrongly assuming that missing data are missing at random can result in severely biased estimates of additive genetic variance. Using real data from a wild population of Swiss barn owls Tyto alba, our model indicates that the missing individuals would have displayed large black spots, and we conclude that genes affecting this trait are already under selection before it is expressed. Our model is a tool for correctly estimating the magnitude of both natural selection and additive genetic variance.
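The bias that motivates the shared parameter model can be seen in a much simplified simulation (no pedigree, animal model, or INLA here): when the probability of missingness depends on the trait, naive statistics computed from the observed records are distorted; parameter values are illustrative.

```python
import numpy as np

# Simplified illustration of non-random missingness: low-trait individuals are
# removed by selection before measurement, biasing naive variance estimates.
rng = np.random.default_rng(1)
n = 50_000
breeding_value = rng.normal(0.0, np.sqrt(0.4), n)     # additive genetic effect
environment = rng.normal(0.0, np.sqrt(0.6), n)
trait = breeding_value + environment

# Probability of being observed increases with the trait value (logistic link).
p_observed = 1.0 / (1.0 + np.exp(-2.0 * trait))
observed = rng.uniform(size=n) < p_observed

print("true trait variance:     ", trait.var())
print("variance among observed: ", trait[observed].var())   # noticeably reduced
```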

Relevance: 50.00%

Abstract:

In the context of investigating the use of automated fingerprint identification systems (AFIS) for the evaluation of fingerprint evidence, the current study presents investigations into the variability of scores from an AFIS system when fingermarks from a known donor are compared to fingerprints that are not from the same source. The ultimate goal is to propose a model, based on likelihood ratios, that allows the evaluation of mark-to-print comparisons. In particular, through its use of AFIS technology, the model benefits from the possibility of using a large amount of data, as well as from an already built-in proximity measure, the AFIS score. More precisely, the numerator of the LR is obtained from scores of comparisons between impressions from the same source showing the same minutia configuration. The denominator of the LR is obtained by extracting scores from comparisons of the questioned mark with a database of non-matching sources. This paper focuses solely on the assignment of the denominator of the LR, which we refer to by the generic term of between-finger variability. The issues addressed in relation to between-finger variability are the required sample size, the influence of the finger number and general pattern, and the influence of the number of minutiae included and their configuration on a given finger. Results show that reliable estimation of between-finger variability is feasible with 10,000 scores. These scores should come from the appropriate finger number/general pattern combination as defined by the mark. Furthermore, strategies for obtaining between-finger variability when these elements cannot be conclusively determined from the mark (or, for finger number, from its position with respect to other marks) are presented. These results immediately allow case-by-case estimation of between-finger variability in an operational setting.
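A minimal sketch of how the LR denominator could be assigned from between-finger variability: fit a density to scores from comparisons of the questioned mark against non-matching fingers and evaluate it at the observed score. The scores below are simulated placeholders, not real AFIS output, and kernel density estimation is used here only as one possible density model.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Sketch of assigning the LR denominator: density of "between-finger" scores
# (questioned mark vs. non-matching fingers), evaluated at the observed score.
# Scores are simulated placeholders, not real AFIS output.
rng = np.random.default_rng(7)
non_match_scores = rng.gamma(shape=2.0, scale=150.0, size=10_000)   # ~10,000 comparisons

denominator_density = gaussian_kde(non_match_scores)
observed_score = 2_500.0
print("density under the non-match distribution:", denominator_density(observed_score)[0])
```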

Relevance: 50.00%

Abstract:

Despite their limited proliferation capacity, regulatory T cells (T(regs)) constitute a population maintained over the entire lifetime of a human organism. The means by which T(regs) sustain a stable pool in vivo are controversial. Using a mathematical model, we address this issue by evaluating several biological scenarios of the origins and the proliferation capacity of two subsets of T(regs): precursor CD4(+)CD25(+)CD45RO(-) and mature CD4(+)CD25(+)CD45RO(+) cells. The lifelong dynamics of T(regs) are described by a set of ordinary differential equations, driven by a stochastic process representing the major immune reactions involving these cells. The model dynamics are validated using data from human donors of different ages. Analysis of the data led to the identification of two properties of the dynamics: (1) the equilibrium in the CD4(+)CD25(+)FoxP3(+) T(regs) population is maintained over both precursor and mature T(regs) pools together, and (2) the ratio between precursor and mature T(regs) is inverted in the early years of adulthood. Using the model, we then identified three biologically relevant scenarios that have the above properties: (1) the only source of mature T(regs) is the antigen-driven differentiation of precursors that acquire the mature profile in the periphery, and the proliferation of T(regs) is essential for the development and maintenance of the pool; or there exist other sources of mature T(regs), such as (2) a homeostatic density-dependent regulation or (3) thymus- or effector-derived T(regs), in which cases antigen-induced proliferation is not necessary for the development of a stable pool of T(regs). This is the first time that a mathematical model built to describe the in vivo dynamics of regulatory T cells has been validated using human data. The model provides an invaluable tool for estimating the number of regulatory T cells as a function of time in the blood of patients who have received a solid-organ transplant or suffer from an autoimmune disease.
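An illustrative two-compartment ODE, not the authors' exact equations (which are additionally driven by a stochastic immune-reaction process), shows the type of precursor/mature dynamics being modeled; with the placeholder rates below, the precursor-to-mature ratio inverts over time, qualitatively echoing property (2).

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative two-compartment Treg model: precursors P produced by the thymus,
# differentiating into mature Tregs M, with proliferation and death. All rates
# are placeholders, not the authors' fitted parameters.
def treg_dynamics(t, y, thymic_output=1.0, differentiation=0.05,
                  proliferation=0.02, death_p=0.03, death_m=0.04):
    P, M = y
    dP = thymic_output - differentiation * P - death_p * P
    dM = differentiation * P + proliferation * M - death_m * M
    return [dP, dM]

solution = solve_ivp(treg_dynamics, t_span=(0.0, 80.0), y0=[20.0, 5.0],
                     t_eval=np.linspace(0.0, 80.0, 9))
for t, P, M in zip(solution.t, *solution.y):
    print(f"age {t:4.0f}: precursor {P:6.1f}  mature {M:6.1f}")
```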

Relevance: 50.00%

Abstract:

A remarkable feature of the carcinogenicity of inorganic arsenic is that, while human exposures to high concentrations of inorganic arsenic in drinking water are associated with increases in skin, lung, and bladder cancer, inorganic arsenic has not typically caused tumors in standard laboratory animal test protocols. Inorganic arsenic administered for periods of up to 2 yr to various strains of laboratory mice, including the Swiss CD-1, Swiss CR:NIH(S), C57Bl/6p53(+/-), and C57Bl/6p53(+/+), has not resulted in significant increases in tumor incidence. However, Ng et al. (1999) reported a 40% tumor incidence in C57Bl/6J mice exposed to arsenic in their drinking water throughout their lifetime, with no tumors reported in controls. In order to investigate the potential role of tissue dosimetry in differential susceptibility to arsenic carcinogenicity, a physiologically based pharmacokinetic (PBPK) model for inorganic arsenic in the rat, hamster, monkey, and human (Mann et al., 1996a, 1996b) was extended to describe the kinetics in the mouse. The PBPK model was parameterized in the mouse using published data from acute exposures of B6C3F1 mice to arsenate, arsenite, monomethylarsonic acid (MMA), and dimethylarsinic acid (DMA), and validated using data from acute exposures of C57Black mice. Predictions of the acute model were then compared with data from chronic exposures. There was no evidence of changes in the apparent volume of distribution or in the tissue-plasma concentration ratios between acute and chronic exposure that might support the possibility of inducible arsenite efflux. The PBPK model was also used to project tissue dosimetry in the C57Bl/6J study, in comparison with tissue levels in studies of shorter duration but higher arsenic treatment concentrations. The model evaluation indicates that pharmacokinetic factors do not explain the difference in outcomes across the various mouse bioassays. Other possible explanations may relate to strain-specific differences or to the different durations of dosing in each of the mouse studies, given the evidence that inorganic arsenic is likely to be active in the later stages of the carcinogenic process.
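As a deliberately minimal kinetic sketch, far simpler than the multi-compartment PBPK model with arsenic methylation used in the study, the snippet below projects a body burden from constant drinking-water intake with first-order elimination; all parameters are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Minimal one-compartment kinetic sketch (not the study's PBPK model): constant
# drinking-water intake with first-order elimination. Parameters are illustrative.
intake_rate = 10.0      # ug/day absorbed from drinking water
k_elim = 0.7            # 1/day, first-order elimination rate constant

def body_burden(t, a):
    return [intake_rate - k_elim * a[0]]

sol = solve_ivp(body_burden, (0.0, 30.0), [0.0], t_eval=np.linspace(0.0, 30.0, 7))
for t, a in zip(sol.t, sol.y[0]):
    print(f"day {t:4.1f}: body burden {a:6.2f} ug")
print("analytic steady state:", intake_rate / k_elim, "ug")
```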

Relevance: 50.00%

Abstract:

Lesions of anatomical brain networks result in functional disturbances of brain systems and behavior which depend sensitively, often unpredictably, on the lesion site. The availability of whole-brain maps of structural connections within the human cerebrum and our increased understanding of the physiology and large-scale dynamics of cortical networks allow us to investigate the functional consequences of focal brain lesions in a computational model. We simulate the dynamic effects of lesions placed in different regions of the cerebral cortex by recording changes in the pattern of endogenous ("resting-state") neural activity. We find that lesions produce specific patterns of altered functional connectivity among distant regions of cortex, often affecting both cortical hemispheres. The magnitude of these dynamic effects depends on the lesion location and is partly predicted by structural network properties of the lesion site. In the model, lesions along the cortical midline and in the vicinity of the temporo-parietal junction result in large and widely distributed changes in functional connectivity, while lesions of primary sensory or motor regions remain more localized. The model suggests that dynamic lesion effects can be predicted on the basis of specific network measures of structural brain networks and that these effects may be related to known behavioral and cognitive consequences of brain lesions.
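A toy version of the lesion analysis, using a linear autoregressive surrogate rather than the neural-mass model of the study, can be sketched as follows: simulate "resting-state" activity on a structural coupling matrix, take pairwise correlations as functional connectivity, delete one node, and measure the change among the remaining nodes; the network and parameters are random placeholders.

```python
import numpy as np

# Toy lesion analysis on a random structural network (linear autoregressive
# surrogate, not the neural-mass model used in the study).
rng = np.random.default_rng(3)
n_nodes, n_steps, coupling = 20, 5000, 0.1

structural = rng.uniform(0.0, 1.0, (n_nodes, n_nodes)) * (rng.uniform(size=(n_nodes, n_nodes)) < 0.2)
np.fill_diagonal(structural, 0.0)

def functional_connectivity(weights):
    """Pairwise correlations of noise-driven linear dynamics on the network."""
    x = np.zeros(weights.shape[0])
    series = np.empty((n_steps, weights.shape[0]))
    for t in range(n_steps):
        x = coupling * weights @ x + rng.standard_normal(weights.shape[0])
        series[t] = x
    return np.corrcoef(series.T)

fc_intact = functional_connectivity(structural)
lesion = 0                                         # index of the lesioned region
keep = np.arange(n_nodes) != lesion
fc_lesioned = functional_connectivity(structural[np.ix_(keep, keep)])

change = np.abs(fc_intact[np.ix_(keep, keep)] - fc_lesioned).mean()
print("mean change in functional connectivity after lesion:", change)
```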

Relevance: 50.00%

Abstract:

Because of the increase in workplace automation and the diversification of industrial processes, workplaces have become more and more complex. The classical approaches used to address workplace hazard concerns, such as checklists or sequence models, are therefore of limited use in such complex systems. Moreover, because of the multifaceted nature of workplaces, the use of single-oriented methods, such as AEA (man oriented), FMEA (system oriented), or HAZOP (process oriented), is not satisfactory. A dynamic modeling approach allowing multiple-oriented analyses may constitute an alternative that overcomes this limitation. The qualitative modeling aspects of the MORM (man-machine occupational risk modeling) model are discussed in this article. The model, built with an object-oriented Petri net tool (CO-OPN), has been developed to simulate and analyze industrial processes from an occupational health and safety (OH&S) perspective. The industrial process is modeled as a set of interconnected subnets (state spaces) describing its constitutive machines. Process-related factors are introduced explicitly through machine interconnections and flow properties. Man-machine interactions are modeled as triggering events for the machines' state spaces, with the CREAM cognitive behavior model used to establish the relevant triggering events. In the CO-OPN formalism, the model is expressed as a set of interconnected CO-OPN objects defined over data types expressing the measure attached to the flow of entities transiting through the machines. Constraints on the measures assigned to these entities are used to determine the state changes in each machine. Interconnecting machines implies the composition of such flows and consequently the interconnection of the measure constraints. This is reflected in the construction of constraint-enrichment hierarchies, which can be used for simulation and analysis optimization within a clear mathematical framework. The use of Petri nets to perform multiple-oriented analyses opens perspectives in the field of industrial risk management. It may significantly reduce the duration of the assessment process and, above all, it opens perspectives for risk comparison and integrated risk management. Moreover, because of the generic nature of the model and tool used, the same concepts and patterns may be used to model a wide range of systems and application fields.
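For readers unfamiliar with Petri nets, the sketch below shows an ordinary place/transition net with token firing, illustrating how machine state changes can be driven by the flow of entities; it is not the object-oriented CO-OPN formalism used for the MORM model, and the places and transitions are invented for the example.

```python
# Ordinary place/transition Petri net with token firing (illustrative only;
# not the CO-OPN formalism): a machine processes waiting parts one at a time.
marking = {"part_waiting": 3, "machine_idle": 1, "machine_busy": 0, "part_processed": 0}
transitions = {
    "start_processing": ({"part_waiting": 1, "machine_idle": 1}, {"machine_busy": 1}),
    "finish_processing": ({"machine_busy": 1}, {"machine_idle": 1, "part_processed": 1}),
}

def enabled(name):
    pre, _ = transitions[name]
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(name):
    pre, post = transitions[name]
    for p, n in pre.items():          # consume input tokens
        marking[p] -= n
    for p, n in post.items():         # produce output tokens
        marking[p] = marking.get(p, 0) + n

while enabled("start_processing") or enabled("finish_processing"):
    fire("start_processing" if enabled("start_processing") else "finish_processing")

print(marking)   # all parts end up processed, machine back to idle
```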