951 results for "Model for bringing into play"
Abstract:
Nowadays, electronic data-processing systems are increasingly significant in the industrial sector. Many needs arise in the fields of authentication systems, avionics, data-storage equipment, telecommunications, and so on. These technological needs must be handled by a system that is reliable, robust, fully responsive to external events, and that correctly meets the imposed timing constraints so as to fulfil its purpose efficiently. This is where real-time embedded systems come into play: they offer high reliability and availability, fast response to external events, strong operational guarantees and a wide range of applications. This project is intended as an introduction to the world of embedded systems and also explains how the FreeRTOS real-time operating system works, with its programming model based on mutually independent tasks. We give an overview of its operating characteristics, how it organizes tasks by means of a scheduler, and some examples of how to design applications with it.
Abstract:
Uncertainty quantification of petroleum reservoir models is one of the present challenges, which is usually approached with a wide range of geostatistical tools linked with statistical optimisation and/or inference algorithms. The paper considers a data-driven approach to modelling uncertainty in spatial predictions. The proposed semi-supervised Support Vector Regression (SVR) model has demonstrated its capability to represent realistic features and to describe the stochastic variability and non-uniqueness of spatial properties. It is able to capture and preserve key spatial dependencies such as connectivity, which is often difficult to achieve with two-point geostatistical models. Semi-supervised SVR is designed to integrate various kinds of conditioning data and to learn dependencies from them. A stochastic semi-supervised SVR model is integrated into a Bayesian framework to quantify uncertainty with multiple models fitted to dynamic observations. The developed approach is illustrated with a reservoir case study. The resulting probabilistic production forecasts are described by uncertainty envelopes.
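As a rough orientation for readers unfamiliar with SVR-based spatial prediction, the sketch below fits a plain supervised SVR (assuming scikit-learn is available) to synthetic well data and predicts on a grid. It is not the semi-supervised, stochastic variant developed in the paper; all names and values are illustrative.

```python
# Plain supervised SVR used as a spatial predictor: train on scattered
# synthetic "well" observations (x, y) -> porosity, then predict on a grid.
# This only illustrates the general idea, not the paper's semi-supervised model.
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
wells = rng.uniform(0.0, 1000.0, size=(50, 2))            # well locations (m)
porosity = 0.2 + 0.05 * np.sin(wells[:, 0] / 150.0) + rng.normal(0, 0.005, 50)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.005))
model.fit(wells, porosity)

# Predict porosity on a 20 x 20 grid covering the synthetic field.
gx, gy = np.meshgrid(np.linspace(0, 1000, 20), np.linspace(0, 1000, 20))
grid = np.column_stack([gx.ravel(), gy.ravel()])
prediction = model.predict(grid).reshape(gx.shape)
print(prediction.min(), prediction.max())
```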
Abstract:
The New Economic Geography literature allows detailed analysis of the factors that determine the location decisions of firms in integrated markets. However, the competitive process is modelled in a rather rudimentary way, and the empirical evidence has usually been obtained from reduced-form econometric specifications. This study describes a structural model that takes into account strategic interactions between firms. We investigate the relationship between the degree of perceived competition (not only from local firms but also from firms in other regions) and geographic concentration. The preliminary results indicate that, in aggregate terms, local firms present stronger competition than firms in other regions. Moreover, it is confirmed that greater geographical concentration of production reduces market power, due to the intensification of local competition; however, its impact on production costs is unclear.
Abstract:
An extension of spin density functional theory that simultaneously accounts for dielectric mismatch between neighboring materials and for nonparabolicity corrections originating from interactions between the conduction and valence bands is presented. This method is employed to calculate ground-state and addition energy spectra of homogeneous and multishell spherical quantum dots. Our calculations reveal that these corrections become especially relevant when they come into play simultaneously in regimes of strong spatial confinement.
Abstract:
Time is embedded in any sensory experience: the movements of a dance, the rhythm of a piece of music, the words of a speaker are all examples of temporally structured sensory events. In humans, whether and how visual cortices perform temporal processing remains unclear. Here we show that both primary visual cortex (V1) and extrastriate area V5/MT are causally involved in encoding and keeping time in memory, and that this involvement is independent of low-level visual processing. Most importantly, we demonstrate that V1 and V5/MT come into play simultaneously and seem to be functionally linked during interval encoding, whereas they operate serially (V1 followed by V5/MT) and seem to be independent while maintaining temporal information in working memory. These data help to refine our knowledge of the functional properties of human visual cortex, highlighting the contribution and the temporal dynamics of V1 and V5/MT in the processing of the temporal aspects of visual information.
Abstract:
Photopolymerization is commonly used in a broad range of bioapplications, such as drug delivery, tissue engineering, and surgical implants, where liquid materials are injected and then hardened by means of illumination to create a solid polymer network. However, photopolymerization using a probe, e.g., a needle guiding both the liquid and the curing illumination, has not been thoroughly investigated. We present a Monte Carlo model that takes into account the dynamic absorption and scattering parameters, as well as the solid-liquid boundaries of the photopolymer, to yield the shape and volume of minimally invasively injected, photopolymerized hydrogels. In the first part of the article, our model is validated using a set of well-known poly(ethylene glycol) dimethacrylate hydrogels, showing excellent agreement between simulated and experimental volume growth rates. In the second part, in situ experimental results and simulations for photopolymerization in tissue cavities are presented. It was found that a cavity with a volume of 152 mm³ can be photopolymerized from the output of a 0.28 mm² fiber by adding scattering lipid particles, whereas only a volume of 38 mm³ (25%) was achieved without particles. The proposed model provides a simple and robust method to solve complex photopolymerization problems, where the dimension of the light source is much smaller than the volume of the photopolymerizable hydrogel.
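For intuition about the photon-transport core of such a model, here is a stripped-down Monte Carlo random-walk sketch in a homogeneous medium with fixed absorption and scattering coefficients. The published model additionally handles dynamic optical properties and solid-liquid boundaries, which are omitted here, and all parameter values are invented.

```python
# Minimal Monte Carlo photon random walk in a homogeneous medium with
# absorption coefficient mu_a and scattering coefficient mu_s (1/mm).
# It tallies the fraction of launched energy absorbed within a given radius
# of the source; boundaries and dynamic optical properties are ignored.
import numpy as np

def absorbed_fraction(mu_a, mu_s, radius_mm, n_photons=2000, seed=1):
    rng = np.random.default_rng(seed)
    mu_t = mu_a + mu_s
    absorbed_inside = 0.0
    for _ in range(n_photons):
        pos = np.zeros(3)
        direction = np.array([0.0, 0.0, 1.0])     # launched along the fiber axis
        weight = 1.0
        while weight > 1e-3:
            step = -np.log(rng.random()) / mu_t   # free path length (mm)
            pos = pos + step * direction
            deposit = weight * mu_a / mu_t        # partial absorption at each event
            if np.linalg.norm(pos) <= radius_mm:
                absorbed_inside += deposit
            weight -= deposit
            # isotropic scattering: new direction drawn uniformly on the sphere
            cos_t = 2.0 * rng.random() - 1.0
            phi = 2.0 * np.pi * rng.random()
            sin_t = np.sqrt(1.0 - cos_t ** 2)
            direction = np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])
    return absorbed_inside / n_photons

print(absorbed_fraction(mu_a=0.1, mu_s=2.0, radius_mm=3.0))
```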
Abstract:
A critical issue in brain energy metabolism is whether lactate produced within the brain by astrocytes is taken up and metabolized by neurons upon activation. Although there is ample evidence that neurons can efficiently use lactate as an energy substrate, at least in vitro, few experimental data exist to indicate that it is indeed the case in vivo. To address this question, we used a modeling approach to determine which mechanisms are necessary to explain typical brain lactate kinetics observed upon activation. On the basis of a previously validated model that takes into account the compartmentalization of energy metabolism, we developed a mathematical model of brain lactate kinetics, which was applied to published data describing the changes in extracellular lactate levels upon activation. Results show that the initial dip in the extracellular lactate concentration observed at the onset of stimulation can only be satisfactorily explained by a rapid uptake within an intraparenchymal cellular compartment. In contrast, neither blood flow increase, nor extracellular pH variation can be major causes of the lactate initial dip, whereas tissue lactate diffusion only tends to reduce its amplitude. The kinetic properties of monocarboxylate transporter isoforms strongly suggest that neurons represent the most likely compartment for activation-induced lactate uptake and that neuronal lactate utilization occurring early after activation onset is responsible for the initial dip in brain lactate levels observed in both animals and humans.
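To make the compartmental reasoning concrete, the toy ordinary-differential-equation sketch below shows how an activation-driven increase in uptake that precedes the increase in supply produces an initial dip followed by an overshoot in extracellular lactate. This is not the authors' validated model; every rate constant is invented.

```python
# Toy single-pool model of extracellular lactate: supplied by astrocytes,
# removed by neuronal uptake. During stimulation the uptake rate rises first,
# and the supply rises with a delay, producing a dip followed by an overshoot.
# All rate constants are invented; the published compartmental model is richer.
from scipy.integrate import solve_ivp

def rhs(t, y):
    lac = y[0]                                          # extracellular lactate (mM)
    uptake_k = 0.075 if 10.0 <= t <= 30.0 else 0.03     # fast neuronal uptake during stimulation
    supply = 0.15 if 15.0 <= t <= 40.0 else 0.06        # astrocytic supply rises with a delay
    return [supply - uptake_k * lac]

sol = solve_ivp(rhs, (0.0, 60.0), [2.0], max_step=0.1)
print(f"baseline {sol.y[0][0]:.2f} mM, dip {sol.y[0].min():.2f} mM, peak {sol.y[0].max():.2f} mM")
```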
Abstract:
Since integral abutment bridges decrease the initial and maintenance costs of bridges, they provide an attractive alternative for bridge designers. The objective of this project is to develop rational and experimentally verified design recommendations for these bridges. Field testing consisted of instrumenting two bridges in Iowa to monitor air and bridge temperatures, bridge displacements, and pile strains. Core samples were also collected to determine coefficients of thermal expansion for the two bridges. Design values for the coefficient of thermal expansion of concrete are recommended, as well as revised temperature ranges for the deck and girders of steel and concrete bridges. A girder extension model is developed to predict the longitudinal bridge displacements caused by changing bridge temperatures. Abutment rotations and passive soil pressures behind the abutment were neglected. The model is subdivided into segments that have uniform temperatures, coefficients of expansion, and moduli of elasticity. Weak axis pile strains were predicted using a fixed-head model. The pile is idealized as an equivalent cantilever with a length determined by the surrounding soil conditions and pile properties. Both the girder extension model and the fixed-head model are conservative for design purposes. A longitudinal frame model is developed to account for abutment rotations. The frame model better predicts both the longitudinal displacement and weak axis pile strains than do the simpler models. A lateral frame model is presented to predict the lateral motion of skewed bridges and the associated strong axis pile strains. Full passive soil pressure is assumed on the abutment face. Two alternatives for the pile design are presented. Alternative One is the more conservative and includes thermally induced stresses. Alternative Two neglects thermally induced stresses but allows for the partial formation of plastic hinges (inelastic redistribution of forces). Ductility criteria are presented for this alternative. Both alternatives are illustrated in a design example.
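The girder extension model amounts to summing the free thermal expansion of segments with uniform properties. A minimal sketch of that calculation follows, with illustrative segment values rather than the Iowa field data.

```python
# Girder-extension idea in miniature: total longitudinal movement is the sum of
# free thermal expansions of segments with uniform temperature, coefficient of
# expansion and length. Segment values below are illustrative only.
def girder_extension(segments):
    """segments: iterable of (length_mm, alpha_per_degC, delta_T_degC)."""
    return sum(length * alpha * d_t for length, alpha, d_t in segments)

bridge = [
    (30_000.0, 10.8e-6, 40.0),   # concrete segment: 30 m, alpha, temperature rise
    (30_000.0, 11.7e-6, 45.0),   # steel segment
]
print(f"predicted longitudinal expansion: {girder_extension(bridge):.1f} mm")
```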
Abstract:
The theory of language has occupied a special place in the history of Indian thought. Indian philosophers give particular attention to the analysis of the cognition obtained from language, known under the generic name of śābdabodha. This term is used to denote, among other things, the cognition episode of the hearer, the content of which is described in the form of a paraphrase of a sentence represented as a hierarchical structure. Philosophers submit the meaning of the component items of a sentence and their relationship to a thorough examination, and represent the content of the resulting cognition as a paraphrase centred on a meaning element that is taken as the principal qualificand (mukhyaviśesya) and is qualified by the other meaning elements. This analysis is the object of continuous debate over a period of more than a thousand years between the philosophers of the schools of Mīmāmsā, Nyāya (mainly in its Navya form) and Vyākarana. While these philosophers are in complete agreement on the idea that the cognition of sentence meaning has a hierarchical structure and share the concept of a single principal qualificand (qualified by other meaning elements), they strongly disagree on the question of which meaning element has this role and by which morphological item it is expressed. This disagreement is the central point of their debate and gives rise to competing versions of this theory. The Mīmāmsakas argue that the principal qualificand is what they call bhāvanā, 'bringing into being', 'efficient force' or 'productive operation', expressed by the verbal affix and distinct from the specific procedures signified by the verbal root; the Naiyāyikas generally take it to be the meaning of the word with the first case ending, while the Vaiyākaranas take it to be the operation expressed by the verbal root. All the participants rely on the Pāninian grammar, insofar as the Mīmāmsakas and Naiyāyikas do not compose a new grammar of Sanskrit, but they use different interpretive strategies in order to justify their views, which are often in overt contradiction with the interpretation of the Pāninian rules accepted by the Vaiyākaranas. In each of the three positions, weakness in one area is compensated by strength in another, and the cumulative force of the total argumentation shows that no position can be declared correct or overall superior to the others. This book is an attempt to understand this debate, and to show that, to make full sense of the irreconcilable positions of the three schools, one must go beyond linguistic factors and consider the very beginnings of each school's concern with the issue under scrutiny. The texts, and particularly the late texts of each school, present very complex versions of the theory, yet the key to understanding why these positions remain irreconcilable seems to lie elsewhere, in spite of extensive argumentation involving a great deal of linguistic and logical technicalities. Historically, this theory arises in Mīmāmsā (with Sabara and Kumārila), then in Nyāya (with Udayana), in a doctrinal and theological context, as a byproduct of the debate over Vedic authority. The Navya-Vaiyākaranas enter this debate last (with Bhattoji Dīksita and Kaunda Bhatta), with the declared aim of refuting the arguments of the Mīmāmsakas and Naiyāyikas by bringing to light the shortcomings in their understanding of Pāninian grammar.
The central argument has focused on the capacity of the initial contexts, with the network of issues to which the principal qualificand theory is connected, to render intelligible the presuppositions and aims behind the complex linguistic justification of the classical and late stages of this debate. Reading the debate in this light not only reveals the rationality and internal coherence of each position beyond the linguistic arguments, but makes it possible to understand why the thinkers of the three schools have continued to hold on to three mutually exclusive positions. They are defending not only their version of the principal qualificand theory, but (though not openly acknowledged) the entire network of arguments, linguistic and/or extra-linguistic, to which this theory is connected, as well as the presuppositions and aims underlying these arguments.
Abstract:
The final year project came to us as an opportunity to get involved in a topic that appeared attractive during our economics degree: statistics and its application to the analysis of economic data, i.e. econometrics. Moreover, the combination of econometrics and computer science is a very hot topic nowadays, given the Information Technologies boom of the last decades and the consequent exponential increase in the amount of data collected and stored every day. Data analysts able to deal with Big Data and to extract useful results from it are in great demand, and, in our understanding, the work they do, although sometimes controversial in terms of ethics, is a clear source of added value both for private corporations and for the public sector. For these reasons, the essence of this project is the study of a statistical instrument suitable for the analysis of large datasets and directly related to computer science: Partial Correlation Networks. The structure of the project has been determined by our objectives as the work developed. First, the characteristics of the studied instrument are explained, from the basic ideas up to the features of the model behind it, with the final goal of presenting the SPACE model as a tool for estimating interconnections between elements in large data sets. Afterwards, an illustrative simulation is performed in order to show the power and efficiency of the model presented. Finally, the model is put into practice by analysing a relatively large set of real-world data, with the objective of assessing whether the proposed statistical instrument is valid and useful when applied to a real multivariate time series. In short, our main goals are to present the model and to evaluate whether Partial Correlation Network Analysis is an effective, useful instrument that allows valuable results to be found in Big Data. The findings throughout this project suggest that the Partial Correlation Estimation by Joint Sparse Regression Models approach presented by Peng et al. (2009) works well under the assumption of sparsity of the data. Moreover, partial correlation networks are shown to be a very valid tool to represent cross-sectional interconnections between elements in large data sets. The scope of this project is nevertheless limited, as there are some sections in which deeper analysis would have been appropriate: considering intertemporal connections between elements, the choice of the tuning parameter lambda, and a deeper analysis of the results in the real-data application are examples of aspects in which this project could be extended. To sum up, the analysed statistical tool has proved to be a very useful instrument for finding relationships that connect the elements present in a large data set; partial correlation networks allow the owner of such a data set to observe and analyse linkages that might otherwise have been overlooked.
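For readers who want to see what a partial correlation network is built from, the sketch below computes partial correlations via the inverse covariance (precision) matrix. The project itself relies on the SPACE joint sparse regression estimator of Peng et al. (2009); this textbook route does not implement it and is only stable when the number of observations comfortably exceeds the number of variables.

```python
# Partial correlations from the precision (inverse covariance) matrix:
# rho_ij = -P_ij / sqrt(P_ii * P_jj). This is the classical estimator, shown
# only for illustration; it is not the SPACE sparse regression approach.
import numpy as np

def partial_correlations(data):
    precision = np.linalg.inv(np.cov(data, rowvar=False))
    d = np.sqrt(np.diag(precision))
    pcorr = -precision / np.outer(d, d)
    np.fill_diagonal(pcorr, 1.0)
    return pcorr

rng = np.random.default_rng(0)
x = rng.normal(size=(500, 5))
x[:, 1] += 0.8 * x[:, 0]          # induce a direct link between variables 0 and 1
print(np.round(partial_correlations(x), 2))
```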
Abstract:
Proteins can switch between different conformations in response to stimuli, such as pH or temperature variations, or to the binding of ligands. Such plasticity and its kinetics can have a crucial functional role, and their characterization has taken center stage in protein research. Topoisomerases, for example, are particularly interesting enzymes capable of managing tangled and supercoiled double-stranded DNA, thus facilitating many physiological processes. In this work, we describe the use of a cantilever-based nanomotion sensor to characterize the dynamics of human topoisomerase II (Topo II) enzymes and their response to different kinds of ligands, such as ATP, which enhances the conformational dynamics. The sensitivity and time resolution of this sensor make it possible to determine quantitatively the correlation between ATP concentration and the rate of Topo II conformational changes. Furthermore, we show how to rationalize the experimental results in a comprehensive model that takes into account both the physics of the cantilever and the dynamics of the ATPase cycle of the enzyme, shedding light on the kinetics of the process. Finally, we study the effect of aclarubicin, an anticancer drug, demonstrating that it directly affects the Topo II molecule, inhibiting its conformational changes. These results pave the way for a new approach to studying the intrinsic dynamics of proteins and protein complexes, with applications ranging from fundamental proteomics to drug discovery and development, and possibly to clinical practice.
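As a purely illustrative way to parameterize the reported dependence of the conformational-change rate on ATP concentration, a generic Michaelis-Menten saturation law could look as follows. The constants are invented and this is not the cantilever-plus-ATPase-cycle model fitted in the paper.

```python
# Generic Michaelis-Menten-style saturation of the conformational-change rate
# with ATP concentration: rate = k_cat * [ATP] / (K_M + [ATP]).
# k_cat and K_M below are invented illustration values, not fitted parameters.
def conformational_rate(atp_mM, k_cat=2.0, k_m=0.3):
    return k_cat * atp_mM / (k_m + atp_mM)

for atp in (0.1, 0.3, 1.0, 3.0):
    print(f"[ATP] = {atp:.1f} mM -> rate = {conformational_rate(atp):.2f} 1/s")
```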
Abstract:
Conservation biology is commonly associated with the protection of small, endangered populations. Nevertheless, large or potentially large populations may also need management to prevent the negative effects of overpopulation. As there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct methods and models are needed. The aim of this work was to develop predictive models of large-population dynamics, as well as software tools to estimate the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex), which has expanded strongly in Switzerland since its reintroduction at the beginning of the 20th century, was used as the case study. This task was achieved in three steps. First, a local population dynamics model was developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix, extended with density dependence, environmental stochasticity and culling. This model was implemented in a management-support software package, named SIM-Ibex, which allows census data maintenance, automated parameter estimation, and the tuning and simulation of culling strategies. However, population dynamics is driven not only by demographic factors but also by dispersal and colonisation of new areas, so habitat suitability and obstacles to dispersal also had to be modelled. A software package named Biomapper was therefore developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA), whose principle is to compute niche marginality and specialisation factors from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover all operations of data importation, predictor preparation, ENFA and habitat suitability map computation, and validation and further processing of results; a module also allows mapping of dispersal barriers and corridors. The application domain of ENFA was then explored by means of a simulated (virtual) species distribution. Compared with a commonly used habitat suitability method, the Generalised Linear Model (GLM), ENFA proved better suited to spreading or cryptic species. Demographic and landscape information were finally merged into a global model. To cope with landscape realism and the technical constraints of modelling large populations, a cellular automaton approach was chosen: the study area is modelled by a lattice of hexagonal cells, each characterised by a few fixed properties (a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells) and one variable, population density. The latter varies according to local reproduction, survival and dispersal dynamics, modified by density dependence and stochasticity. A software tool named HexaSpace was developed to perform two functions: (1) calibrating the automaton on the basis of local population dynamics models (e.g. computed by SIM-Ibex) and a habitat suitability map (e.g. computed by Biomapper); (2) running simulations. It allows the spread of an invading species to be studied across a complex landscape made up of areas of varying suitability and dispersal barriers. This model was applied to the history of the ibex reintroduction in the Bernese Alps (Switzerland). SIM-Ibex is now used by governmental wildlife managers to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. Likewise, although HexaSpace was originally designed for terrestrial animal species, it could easily be extended to model plant propagation or the dispersal of flying animals. Because these tools were designed to proceed from raw data to a complex, realistic model, and because they have an intuitive user interface, they lend themselves to many applications in conservation biology. Moreover, these approaches can also address theoretical questions in population and landscape ecology.
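To illustrate the kind of local dynamics model described here, the sketch below projects an age-structured population with a Leslie matrix and a simple logistic density dependence acting on recruitment. The matrix entries and carrying capacity are invented, not the fitted ibex parameters used in SIM-Ibex, and environmental stochasticity and culling are omitted.

```python
# Age-structured projection in the spirit of a Leslie-matrix local model:
# the matrix projects the population one year ahead, and a logistic crowding
# factor scales recruitment as total numbers approach carrying capacity.
# All demographic values are illustrative, not fitted ibex data.
import numpy as np

leslie = np.array([
    [0.0, 0.4, 0.8],   # per-capita recruitment from age classes 0, 1, 2+
    [0.6, 0.0, 0.0],   # survival from age class 0 to 1
    [0.0, 0.8, 0.7],   # survival into and within the 2+ class
])
carrying_capacity = 500.0
pop = np.array([100.0, 50.0, 30.0])     # initial numbers per age class

for year in range(25):
    crowding = max(0.0, 1.0 - pop.sum() / carrying_capacity)
    step = leslie.copy()
    step[0, :] *= crowding              # density dependence acts on recruitment only
    pop = step @ pop

print(np.round(pop, 1), round(float(pop.sum()), 1))
```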
Abstract:
A rigorous unit operation model is developed for vapor membrane separation. The new model is able to describe temperature-, pressure-, and concentration-dependent permeation, as well as real-fluid effects, in vapor and gas separation with hydrocarbon-selective rubbery polymeric membranes. Permeation through the membrane is described by a separate treatment of sorption and diffusion within the membrane. Chemical engineering thermodynamics is used to describe the equilibrium sorption of vapors and gases in rubbery membranes with equation-of-state models for polymeric systems, and a new modification of the UNIFAC model is proposed for this purpose. Various thermodynamic models are extensively compared in order to verify their ability to predict and correlate experimental vapor-liquid equilibrium data. Penetrant transport through the selective layer of the membrane is described with the generalized Maxwell-Stefan equations, which account for the bulk flux contribution as well as the diffusive coupling effect. A method is described to compute and correlate binary penetrant-membrane diffusion coefficients from experimental permeability coefficients at different temperatures and pressures. A fluid flow model for spiral-wound modules is derived from the conservation equations of mass, momentum, and energy; the conservation equations are presented in discretized form using the control volume approach. Combining the permeation model and the fluid flow model yields the desired rigorous model for vapor membrane separation. The model is implemented in an in-house process simulator, so that vapor membrane separation can be evaluated as an integral part of a process flowsheet.
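As an orientation point only, the sketch below evaluates the elementary solution-diffusion flux J = P (p_feed − p_permeate) / l with permeability P = D·S. The work itself uses equation-of-state/UNIFAC sorption and the generalized Maxwell-Stefan equations, which this simplification ignores; all numbers are illustrative.

```python
# Elementary solution-diffusion estimate of permeate flux through a rubbery
# membrane: permeability P = D * S, flux J = P * (p_feed - p_permeate) / l.
# This neglects diffusive coupling and real-fluid effects; values are invented.
def permeate_flux(diffusivity_m2_s, solubility_mol_m3_Pa,
                  p_feed_Pa, p_perm_Pa, thickness_m):
    permeability = diffusivity_m2_s * solubility_mol_m3_Pa        # mol/(m.s.Pa)
    return permeability * (p_feed_Pa - p_perm_Pa) / thickness_m   # mol/(m2.s)

flux = permeate_flux(diffusivity_m2_s=2.0e-10,
                     solubility_mol_m3_Pa=1.5e-3,
                     p_feed_Pa=3.0e5,
                     p_perm_Pa=0.2e5,
                     thickness_m=2.0e-6)
print(f"permeate flux ~ {flux:.3e} mol/(m2*s)")
```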
Abstract:
This thesis studies pump selection in an integrated simulation environment. The purpose of the study is to develop the infrastructure of the data and model gallery (Galleria) and its existing tools so that, in the near future, external selector components can be implemented in the Galleria database easily and quickly. The process integration data and model gallery is an infrastructure specification and a set of tools for storing, in a web-based database, the model and parameter data needed to simulate process components and sub-processes, so that the data are easily available to users of simulation software and other process designers. The literature part of the study focuses on how pump selection proceeds and on the factors that affect the selection from the process designer's point of view. In the applied part, a pump selector was created and tested in a development environment corresponding to the Galleria environment. Process component data obtained from an equipment manufacturer were used in developing the pump selector. The implementation of the pump selector is based on the hydraulic selection of the pump, and a centrifugal pump was used as the example case.
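A minimal sketch of what a hydraulic pre-selection component might do, assuming a quadratic system curve and quadratic pump curves: compute the required head at the design flow and pick the first candidate whose curve still meets it with a margin. The pump names, coefficients and margin are invented and do not come from the manufacturer data used in the thesis.

```python
# Hydraulic pre-selection sketch: required head at the design flow is a static
# head plus a friction term proportional to Q^2; each candidate pump is modeled
# by a quadratic head curve H(Q) = a - b*Q^2. All numbers are invented.
def required_head(q_m3h, static_head_m=12.0, k_friction=0.004):
    return static_head_m + k_friction * q_m3h ** 2

pumps = {
    "CP-40": (25.0, 0.006),   # (shutoff head a [m], curve coefficient b)
    "CP-65": (32.0, 0.003),
    "CP-80": (45.0, 0.002),
}

def select_pump(q_m3h, margin_m=1.0):
    duty_head = required_head(q_m3h)
    for name, (a, b) in pumps.items():
        if a - b * q_m3h ** 2 >= duty_head + margin_m:
            return name, duty_head
    return None, duty_head

name, head = select_pump(60.0)
print(f"duty point: 60 m3/h at {head:.1f} m -> selected: {name}")
```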