813 results for Constraint based modelling
Abstract:
Random distributed feedback (DFB) fiber lasers have attracted great attention since their first demonstration [1]. Despite major advances in practical laser systems, the spectral properties of random DFB fiber lasers are far from being understood or even numerically modelled. To date, only the generation power could be calculated and optimized numerically [1,2] or analytically [3] within the power balance model. However, the spectral and statistical properties of random DFB fiber lasers cannot be found in this way. Here we present the first numerical modelling of a random DFB fiber laser, including its spectral and statistical properties, using an NLSE-based model. © 2013 IEEE.
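The abstract leaves the NLSE-based model unspecified; as a sketch only, the symmetric split-step Fourier method commonly used to integrate NLSE propagation models can look as follows (the parameter values and Gaussian input field are illustrative assumptions, not values from the paper):

```python
import numpy as np

def nlse_step(A, dz, beta2, gamma, omega):
    """One symmetric split-step Fourier step for a scalar NLSE with
    group-velocity dispersion (beta2) and Kerr nonlinearity (gamma)."""
    half = np.exp(0.5j * beta2 * omega**2 * (dz / 2))  # linear half step
    A = np.fft.ifft(half * np.fft.fft(A))
    A = A * np.exp(1j * gamma * np.abs(A)**2 * dz)     # nonlinear full step
    A = np.fft.ifft(half * np.fft.fft(A))              # linear half step
    return A

n = 256
t = np.linspace(-10.0, 10.0, n, endpoint=False)
omega = 2.0 * np.pi * np.fft.fftfreq(n, d=t[1] - t[0])
A = np.exp(-t**2).astype(complex)   # illustrative Gaussian input field
for _ in range(100):
    A = nlse_step(A, dz=0.01, beta2=-1.0, gamma=1.0, omega=omega)
```

Both sub-steps are unit-modulus multiplications, so the scheme conserves total power to machine precision, a useful sanity check before building spectral statistics on top of it.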
Abstract:
Homogeneous secondary pyrolysis is a category of reactions that follows primary pyrolysis and is presumed important for fast pyrolysis. To couple the comprehensive chemistry with the fluid dynamics, a probability density function (PDF) approach is used, with a kinetic scheme comprising 134 species and 4169 reactions. With the aid of acceleration techniques, most importantly dimension reduction, chemistry agglomeration and in situ adaptive tabulation (ISAT), a solution was obtained within reasonable time. More work is required; however, a solution has been obtained for levoglucosan (C6H10O5) fed through the inlet with the fluidizing gas at 500 °C. 88.6% of the levoglucosan remained non-decomposed, and 19 different decomposition product species were found above 0.01% by weight. The proposed homogeneous secondary pyrolysis scheme can thus be implemented in a CFD environment, and acceleration techniques can speed up the calculation for application in engineering settings.
Abstract:
In 2006, a large and prolonged bloom of the dinoflagellate Karenia mikimotoi occurred in Scottish coastal waters, causing extensive mortalities of benthic organisms including annelids and molluscs and some species of fish (Davidson et al., 2009). A coupled hydrodynamic-algal transport model was developed to track the progression of the bloom around the Scottish coast during June–September 2006 and hence investigate the processes controlling the bloom dynamics. Within this individual-based model, cells were capable of growth, mortality and phototaxis and were transported by the physical processes of advection and turbulent diffusion, using current velocities extracted from operational simulations of the MRCS ocean circulation model of the North-west European continental shelf. Vertical and horizontal turbulent diffusion of cells were treated using a random walk approach. Comparison of model output with remotely sensed chlorophyll concentrations and cell counts from coastal monitoring stations indicated that it was necessary to include multiple spatially distinct seed populations of K. mikimotoi at separate locations on the shelf edge to capture the qualitative pattern of bloom transport and development. We interpret this as indicating that the source population was being transported northwards by the Hebridean slope current, from where colonies of K. mikimotoi were injected onto the continental shelf by eddies or other transient exchange processes. The model was used to investigate the effects on simulated K. mikimotoi transport and dispersal of: (1) the distribution of the initial seed population; (2) algal growth and mortality; (3) water temperature; (4) the vertical movement of particles by diurnal migration and eddy diffusion; (5) the relative role of the shelf edge and coastal currents; (6) the role of wind forcing.
The numerical experiments emphasized the requirement for a physiologically based biological model and indicated that improved modelling of future blooms will potentially benefit from better parameterisation of the temperature dependence of both growth and mortality, and from finer spatial and temporal hydrodynamic resolution.
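The random-walk treatment of advection and turbulent diffusion described above can be sketched in a few lines; the current field, diffusivity, particle count and time step below are placeholder values, and a real implementation would interpolate (u, v) from the MRCS model output:

```python
import numpy as np

rng = np.random.default_rng(42)

def transport_step(x, y, u, v, K_h, dt, rng):
    """Advect particles by the current (u, v) and add a random-walk
    displacement equivalent to a constant horizontal diffusivity K_h."""
    sigma = np.sqrt(2.0 * K_h * dt)   # std dev of one random-walk step
    x = x + u * dt + rng.normal(0.0, sigma, size=x.shape)
    y = y + v * dt + rng.normal(0.0, sigma, size=y.shape)
    return x, y

# 10,000 cells released at the origin in a uniform 0.1 m/s eastward current
x = np.zeros(10_000)
y = np.zeros(10_000)
for _ in range(100):
    x, y = transport_step(x, y, u=0.1, v=0.0, K_h=10.0, dt=600.0, rng=rng)
```

For constant K_h the ensemble variance grows as 2·K_h·t per axis while the centroid drifts with the mean current, which is the analytic behaviour this scheme reproduces.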
Abstract:
The voltage profile of the catenary between traction substations (TSSs) is affected by the trolleybus current intake and by its position with respect to the TSSs: the higher the current requested by the bus and the further the bus from the TSSs, the deeper the voltage drop. When the voltage drops below 500 V, the trolleybus is forced to decrease its consumption by reducing its input current. This thesis analyses the improvements that the installation of a BESS produces in the operation of a particularly loaded FS of the DC trolleybus network of the city of Bologna. The stationary BESS is charged by the TSSs during off-peak times and delivers the stored energy when the catenary is overloaded, alleviating the load on the TSSs and reducing the voltage drops. Only IMC buses are considered, in the prospect of a future phase-out of all internal combustion engine vehicles. These trolleybuses cause deeper voltage drops because they absorb enough current both to power their traction motor and to recharge the on-board battery. The control of the BESS aims to keep the catenary voltage within the admissible voltage range and ensures that all physical limitations are met. A model of the FS Marconi Trento Trieste is implemented in the Simulink environment to simulate its daily operation and compare the behavior of the trolleybus network with and without the BESS. From the simulation without the BESS, the best location for the energy storage system is deduced and the battery control is tuned. Furthermore, from knowledge of the load curve and the battery control trans-characteristic, a prediction of the voltage distribution at the BESS connection point is formulated. The prediction is then compared with the simulation results to validate the Simulink model. The BESS decreases the voltage drops along the catenary, the Joule losses, and the current delivered by the TSSs, indicating that a BESS can be a solution to improve the operation of the trolleybus network.
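The dependence of the voltage drop on bus current and position described above can be illustrated with a minimal resistive DC model of a section fed at both ends; the 750 V level, line resistance and currents are hypothetical figures, not data from the Bologna network:

```python
def catenary_voltage(v_tss, r_per_km, length_km, pos_km, current_a):
    """Voltage at a trolleybus drawing `current_a` amps from a catenary
    section fed at both ends by substations held at `v_tss` volts.
    The two feed paths act as parallel resistances."""
    r_left = r_per_km * pos_km
    r_right = r_per_km * (length_km - pos_km)
    r_eq = (r_left * r_right) / (r_left + r_right)   # parallel combination
    return v_tss - current_a * r_eq

# Hypothetical figures: 750 V nominal, 0.1 ohm/km, 2 km section,
# bus at the midpoint drawing 400 A (IMC buses draw extra current
# to recharge the on-board battery, deepening the drop)
v = catenary_voltage(v_tss=750.0, r_per_km=0.1, length_km=2.0,
                     pos_km=1.0, current_a=400.0)
```

The midpoint is the worst case, since r_eq = r·d·(L−d)/L is maximised at d = L/2; a BESS connected near that point supplies part of the current locally and flattens the profile.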
Abstract:
Numerous works have been conducted on modelling basic compliant elements such as wire beams, and closed-form analytical models of most basic compliant elements have been well developed. However, the modelling of complex compliant mechanisms is still challenging. This paper proposes a constraint-force-based (CFB) modelling approach for compliant mechanisms, with a particular emphasis on modelling complex compliant mechanisms. The proposed CFB modelling approach can be regarded as an improved free-body-diagram (FBD) based modelling approach, and can be extended to the development of the screw-theory-based design approach. A compliant mechanism can be decomposed into rigid stages and compliant modules. A compliant module can offer elastic forces due to its deformation. Such elastic forces are regarded as variable constraint forces in the CFB modelling approach. Additionally, the CFB modelling approach defines the external forces applied on a compliant mechanism as constant constraint forces. If a compliant mechanism is at static equilibrium, all the rigid stages are also at static equilibrium under the influence of the variable and constant constraint forces. Therefore, the constraint-force equilibrium equations for all the rigid stages can be obtained, and the analytical model of the compliant mechanism can be derived from these equilibrium equations. The CFB modelling approach can model a compliant mechanism linearly or nonlinearly, can obtain the displacement of any point on the rigid stages, and allows external forces to be exerted at any position on the rigid stages. Compared with the FBD-based modelling approach, the CFB modelling approach does not need to identify the possible deformed configuration of a complex compliant mechanism to obtain the geometric compatibility conditions and the force equilibrium equations. Additionally, the mathematical expressions in the CFB approach have an easily understood physical meaning.
Using the CFB modelling approach, the variable constraint forces of three compliant modules, a wire beam, a four-beam compliant module and an eight-beam compliant module, have been derived in this paper. Based on these variable constraint forces, the linear and non-linear models of a decoupled XYZ compliant parallel mechanism are derived, and verified by FEA simulations and experimental tests.
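A minimal linear sketch of the constraint-force balance for a single rigid stage: each compliant module i contributes a variable constraint force −K_i·d, the external load is a constant constraint force f, and static equilibrium gives (Σ K_i)·d = f. The two diagonal 3-DoF stiffness matrices below are invented for illustration and are not the wire-beam, four-beam or eight-beam modules of the paper:

```python
import numpy as np

def solve_stage(module_stiffnesses, external_force):
    """Linear CFB-style equilibrium of one rigid stage: each compliant
    module i exerts the variable constraint force -K_i @ d, the external
    load f is a constant constraint force, so (sum_i K_i) @ d = f."""
    K = sum(module_stiffnesses)
    return np.linalg.solve(K, external_force)

# Two hypothetical planar modules (DoFs: x, y, theta) acting in parallel
K1 = np.diag([100.0, 100.0, 50.0])
K2 = np.diag([200.0, 50.0, 50.0])
f = np.array([30.0, 15.0, 10.0])
d = solve_stage([K1, K2], f)   # stage displacement at equilibrium
```

The nonlinear case replaces −K_i·d with a displacement-dependent constraint force and solves the same balance iteratively, which is where the approach avoids guessing the deformed configuration by hand.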
Abstract:
Softeam has over 20 years of experience providing UML-based modelling solutions, such as its Modelio modelling tool, and its Constellation enterprise model management and collaboration environment. Due to the increasing number and size of the models used by Softeam’s clients, Softeam joined the MONDO FP7 EU research project, which worked on solutions for these scalability challenges and produced the Hawk model indexer among other results. This paper presents the technical details and several case studies on the integration of Hawk into Softeam’s toolset. The first case study measured the performance of Hawk’s Modelio support using varying amounts of memory for the Neo4j backend. In another case study, Hawk was integrated into Constellation to provide scalable global querying of model repositories. Finally, the combination of Hawk and the Epsilon Generation Language was compared against Modelio for document generation: for the largest model, Hawk was two orders of magnitude faster.
Abstract:
The Murray-Darling Basin comprises over 1 million km²; it lies within four states and one territory; and over 12,800 GL of irrigation water is used to produce over 40% of the nation's gross value of agricultural production. This production comes from a diverse collection of sometimes mutually exclusive commodities (e.g. pasture, stone fruit, grapes, cotton and field crops). The supply of water for irrigation is subject to climatic and policy uncertainty. Variable inflows mean that water property rights do not provide a guaranteed supply. With increasing public scrutiny and environmental issues facing irrigators, greater pressure is being placed on this finite resource. The uncertainty of the water supply and of water quality (salinity), combined with where water is utilised while attempting to maximise return on investment, makes for an interesting research field. A GAMS-based and an Excel-based modelling approach are used and compared to ask: where should we allocate water? Amongst which commodities? And how does this affect both the quantity and the quality of water along the Murray-Darling river system?
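With a single shared water constraint and per-commodity caps, the "where should we allocate water" question reduces, in its simplest deterministic form, to filling commodities in descending order of gross margin per ML; the margins and caps below are invented for illustration, and a full GAMS model would add salinity and river-flow constraints:

```python
def allocate_water(demands, total):
    """Greedy allocation for one linear water constraint with box caps:
    fill commodities in descending order of gross margin per ML until
    the total allocation is exhausted (optimal for this simple case)."""
    alloc = {}
    remaining = total
    for name, margin, cap in sorted(demands, key=lambda d: -d[1]):
        take = min(cap, remaining)
        alloc[name] = take
        remaining -= take
    return alloc

# Hypothetical gross margins ($/ML) and per-commodity water caps (ML)
demands = [("pasture", 120.0, 600.0),
           ("grapes", 900.0, 200.0),
           ("cotton", 400.0, 500.0)]
alloc = allocate_water(demands, total=1000.0)
```

Once salinity or conveyance losses couple the commodities, the greedy rule no longer suffices and a proper LP/NLP formulation of the kind GAMS solves is needed.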
Abstract:
Combating climate change is one of the key tasks of humanity in the 21st century. One of the leading causes is carbon dioxide emissions from the use of fossil fuels. Renewable energy sources should be used instead of relying on oil, gas and coal. In Finland a significant amount of energy is produced from wood. The use of wood chips is expected to increase significantly in the future, by over 60%. The aim of this research is to improve understanding of the costs of wood chip supply chains. This is done using simulation as the main research method. The simulation model combines agent-based modelling and discrete event simulation to imitate the wood chip supply chain. This thesis concentrates on the use of simulation-based decision support systems in strategic decision-making. The simulation model is part of a decision support system, which connects the simulation model to databases and also provides a graphical user interface for the decision-maker. The main analysis conducted with the decision support system compares a traditional supply chain to a supply chain utilizing specialized containers. According to the analysis, the container supply chain achieves lower costs than the traditional supply chain. A container supply chain can also be scaled up more easily thanks to faster emptying operations. Initially the container operations would supply only part of the fuel needs of a power plant and would complement the current supply chain. The model can be expanded to include intermodal supply chains, as with increased future demand there are not enough wood chips located close to current and future power plants.
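The claimed advantage of faster container emptying can be illustrated with a minimal single-bay queueing sketch in the discrete-event style the thesis uses; all durations are hypothetical:

```python
def simulate_deliveries(n_trucks, unload_minutes, interarrival):
    """Minimal discrete-event sketch: trucks arrive every `interarrival`
    minutes and queue for one unloading bay; returns the total minutes
    all trucks spend waiting."""
    bay_free_at = 0.0
    total_wait = 0.0
    for i in range(n_trucks):
        arrival = i * interarrival
        start = max(arrival, bay_free_at)   # wait if the bay is busy
        total_wait += start - arrival
        bay_free_at = start + unload_minutes
    return total_wait

# Traditional unloading (45 min) vs. a fast container swap (15 min)
slow = simulate_deliveries(10, unload_minutes=45.0, interarrival=30.0)
fast = simulate_deliveries(10, unload_minutes=15.0, interarrival=30.0)
```

When service time exceeds the interarrival time the queue grows linearly, so waiting cost compounds with fleet size; a container swap below the interarrival time eliminates queueing entirely in this toy setting.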
Abstract:
Formal methods provide a means of reasoning about computer programs in order to prove correctness criteria. One subtype of formal methods is based on the weakest-precondition predicate transformer semantics and uses guarded commands as the basic modelling construct. Examples of such formalisms are Action Systems and Event-B. Guarded commands can intuitively be understood as actions that may be triggered when an associated guard condition holds. Guarded commands whose guards hold are nondeterministically chosen for execution, but no further control flow is present by default. Such a modelling approach is convenient for proving correctness, and the Refinement Calculus allows for a stepwise development method. It also has a parallel interpretation facilitating the development of concurrent software, and it is suitable for describing event-driven scenarios. However, in many application areas the execution paradigm traditionally used comprises more explicit control flow, which constitutes an obstacle to using the above-mentioned formal methods. In this thesis, we study how guarded-command-based modelling approaches can be conveniently and efficiently scheduled in different scenarios. We first focus on the modelling of trust for transactions in a social networking setting. Due to the event-based nature of the scenario, the use of guarded commands turns out to be relatively straightforward. We continue by studying the modelling of concurrent software, with particular focus on compute-intensive scenarios. We go from theoretical considerations to the feasibility of implementation by evaluating the performance and scalability of executing a case study model in parallel, using automatic scheduling performed by a dedicated scheduler. Finally, we propose a more explicit and non-centralised approach in which the flow of each task is controlled by a schedule of its own.
The schedules are expressed in a dedicated scheduling language, and patterns assist the developer in proving correctness of the scheduled model with respect to the original one.
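The default execution model described above, repeated nondeterministic choice among enabled guarded commands with no further control flow, can be sketched as a tiny interpreter; the token-transfer model is a toy example, not one from the thesis:

```python
import random

def run_guarded(state, actions, rng, max_steps=1000):
    """Execute a guarded-command model: repeatedly pick, at random,
    one action whose guard holds and apply its body; stop when no
    guard is enabled (or after max_steps)."""
    for _ in range(max_steps):
        enabled = [body for guard, body in actions if guard(state)]
        if not enabled:
            break                      # no guard holds: termination
        state = rng.choice(enabled)(state)
    return state

# Toy model: transfer tokens one at a time from 'a' to 'b';
# the single action is (guard, body)
actions = [
    (lambda s: s["a"] > 0,
     lambda s: {"a": s["a"] - 1, "b": s["b"] + 1}),
]
final = run_guarded({"a": 5, "b": 0}, actions, random.Random(0))
```

A scheduler in the sense of the thesis replaces the `rng.choice` line with an explicit policy over the enabled actions, which is exactly the degree of freedom the dedicated scheduling language constrains.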
Abstract:
The aim of this research is to develop a tool that allows coopetitive relationships between organizations to be organized on the basis of a two-sided Internet platform. The main result of this master's thesis is a detailed description of the concept of lead-generating, Internet-platform-based coopetition. Using agent-based modelling and simulation, results were obtained which suggest that the developed concept can have a positive effect on some particular industries (e.g. the web-design studio market) and can potentially bring benefits and extra profitability to most companies operating in such an industry. The results also suggest that the developed instrument can increase the degree of transparency of the market to which it is applied.
Abstract:
Computational biology is the research area that contributes to the analysis of biological data through the development of algorithms addressing significant research problems. The data from molecular biology include DNA, RNA, protein and gene expression data. Gene expression data provide the expression level of genes under different conditions. Gene expression is the process of transcribing the DNA sequence of a gene into mRNA sequences, which in turn are later translated into proteins. The number of copies of mRNA produced is called the expression level of a gene. Gene expression data are organized in the form of a matrix. Rows in the matrix represent genes and columns represent experimental conditions. Experimental conditions can be different tissue types or time points. Entries in the gene expression matrix are real values. Through the analysis of gene expression data it is possible to determine behavioural patterns of genes, such as the similarity of their behaviour, the nature of their interaction, their respective contributions to the same pathways, and so on. Similar expression patterns are exhibited by genes participating in the same biological process. These patterns have immense relevance and application in bioinformatics and clinical research. They are used in the medical domain to aid more accurate diagnosis, prognosis, treatment planning, drug discovery and protein network analysis. To identify various patterns from gene expression data, data mining techniques are essential. Clustering is an important data mining technique for the analysis of gene expression data. To overcome the problems associated with clustering, biclustering is introduced. Biclustering refers to the simultaneous clustering of both rows and columns of a data matrix.
Clustering is a global model, whereas biclustering is a local model. Discovering local expression patterns is essential for identifying many genetic pathways that are not apparent otherwise. It is therefore necessary to move beyond the clustering paradigm towards approaches capable of discovering local patterns in gene expression data. A bicluster is a submatrix of the gene expression data matrix. The rows and columns in the submatrix need not be contiguous as in the gene expression data matrix. Biclusters are not disjoint. Computation of biclusters is costly because one has to consider all combinations of columns and rows in order to find all the biclusters. The search space for the biclustering problem is 2^(m+n), where m and n are the numbers of genes and conditions respectively. Usually m+n is more than 3000. The biclustering problem is NP-hard. Biclustering is a powerful analytical tool for the biologist. The research reported in this thesis addresses the problem of biclustering. Ten algorithms are developed for the identification of coherent biclusters from gene expression data. All these algorithms make use of a measure called the mean squared residue to search for biclusters. The objective is to identify biclusters of maximum size with a mean squared residue lower than a given threshold.
All these algorithms begin the search from tightly coregulated submatrices called seeds. These seeds are generated by the K-Means clustering algorithm. The algorithms developed can be classified as constraint-based, greedy and metaheuristic. Constraint-based algorithms use one or more of several constraints, namely the MSR threshold and the MSR difference threshold. The greedy approach makes a locally optimal choice at each stage with the objective of finding the global optimum. In the metaheuristic approaches, Particle Swarm Optimization (PSO) and variants of the Greedy Randomized Adaptive Search Procedure (GRASP) are used for the identification of biclusters. These algorithms are applied to the Yeast and Lymphoma datasets. Biologically relevant and statistically significant biclusters are identified by all these algorithms and validated against the Gene Ontology database. All these algorithms are compared with other biclustering algorithms. The algorithms developed in this work overcome some of the problems associated with existing algorithms. With some of the algorithms developed in this work, biclusters with very high row variance, higher than the row variance of any other algorithm using the mean squared residue, are identified from both the Yeast and Lymphoma datasets. Such biclusters, which reflect significant changes in expression level, are highly relevant biologically.
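The mean squared residue underlying all ten algorithms is the Cheng–Church coherence measure; a minimal implementation follows, applied to a perfectly additive toy bicluster (which by construction has zero residue):

```python
import numpy as np

def mean_squared_residue(submatrix):
    """Mean squared residue of a bicluster: measures how far each
    entry deviates from an additive row-effect + column-effect model.
    Zero means a perfectly coherent (additive) bicluster."""
    a = np.asarray(submatrix, dtype=float)
    row_mean = a.mean(axis=1, keepdims=True)
    col_mean = a.mean(axis=0, keepdims=True)
    residue = a - row_mean - col_mean + a.mean()
    return float((residue ** 2).mean())

# Each entry is row offset + column offset, so the MSR is zero
perfect = np.array([[1.0, 2.0, 3.0],
                    [2.0, 3.0, 4.0],
                    [5.0, 6.0, 7.0]])
msr = mean_squared_residue(perfect)
```

A seed-growing algorithm would repeatedly add the row or column that keeps this value below the threshold, which is where the constraint-based, greedy and metaheuristic variants differ.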
Abstract:
In many real-world contexts individuals find themselves in situations where they have to decide between behaviours that serve a collective purpose and behaviours that satisfy their private interests, ignoring the collective. In some cases the underlying social dilemma (Dawes, 1980) is solved and we observe collective action (Olson, 1965). In others, social mobilisation is unsuccessful. The central topic of social dilemma research is the identification and understanding of the mechanisms which yield the observed cooperation and therefore resolve the social dilemma. It is the purpose of this thesis to contribute to this research field for the case of public good dilemmas. To do so, existing work relevant to this problem domain is reviewed and a set of mandatory requirements is derived which guides the theory and method development of the thesis. In particular, the thesis focusses on dynamic processes of social mobilisation which can foster or inhibit collective action. The basic understanding is that success or failure of the required process of social mobilisation is determined by the heterogeneous individual preferences of the members of a providing group, the social structure in which the acting individuals are embedded, and the embedding of the individuals in economic, political, biophysical, or other external contexts. To account for these aspects and for the dynamics involved, the methodical approach of the thesis is computer simulation, in particular agent-based modelling and simulation of social systems. Particularly conducive are agent models which ground the simulation of human behaviour in suitable psychological theories of action. The thesis develops the action theory HAPPenInGS (Heterogeneous Agents Providing Public Goods) and demonstrates its embedding into different agent-based simulations.
The thesis substantiates the particular added value of this methodical approach: starting out from a theory of individual behaviour, the emergence of collective patterns of behaviour becomes observable in simulations. In addition, the underlying collective dynamics may be scrutinised and assessed by scenario analysis. The results of such experiments reveal insights into processes of social mobilisation which go beyond classical empirical approaches, and yield policy recommendations on promising intervention measures.
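As a hedged illustration of how heterogeneous individual preferences drive mobilisation dynamics, a Granovetter-style threshold cascade (a deliberately minimal stand-in for HAPPenInGS-based agents, not the thesis model itself) can be written as:

```python
import numpy as np

def mobilisation(thresholds, steps=100):
    """Threshold dynamics of social mobilisation: an agent contributes
    once the fraction of contributors reaches its personal threshold.
    Returns the final fraction of contributors (a fixed point)."""
    thresholds = np.sort(np.asarray(thresholds, dtype=float))
    active = 0.0
    for _ in range(steps):
        new_active = np.mean(thresholds <= active)
        if new_active == active:
            break                     # fixed point reached
        active = new_active
    return active

# Uniformly spread thresholds cascade to full participation...
full = mobilisation(np.arange(100) / 100.0)
# ...while a population whose lowest threshold exceeds zero never starts
none = mobilisation(np.full(100, 0.2))
```

The contrast between the two runs shows why heterogeneity matters: the distribution of preferences, not just their mean, decides whether mobilisation succeeds or collapses.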
The current state of agent-based models and their impact on organizational research
Abstract:
In today's hyperconnected, dynamic world laden with uncertainty, conventional analytical methods and models are showing their limitations. Organizations therefore require useful tools that employ information technology and computational simulation models as mechanisms for decision-making and problem solving. One of the most recent, powerful and promising is agent-based modelling and simulation (ABMS). Many organizations, including consulting firms, use this technique to understand phenomena, evaluate strategies and solve problems of various kinds. Despite this, there is (to our knowledge) no survey of the current state of ABMS and its application to organizational research. It is also worth noting that, owing to its novelty, the subject has not been sufficiently disseminated and developed in Latin America. Consequently, this project aims to produce a state-of-the-art survey of ABMS and its impact on organizational research.
Abstract:
The modelling of nonlinear stochastic dynamical processes from data involves solving the problems of data gathering, preprocessing, model architecture selection, learning or adaptation, parametric evaluation and model validation. For a given model architecture, such as associative memory networks, a common problem in nonlinear modelling is "the curse of dimensionality". A series of complementary data-based constructive identification schemes, mainly based on, but not limited to, operating-point-dependent fuzzy models, are introduced in this paper with the aim of overcoming the curse of dimensionality. These include (i) a mixture-of-experts algorithm based on a forward constrained regression algorithm; (ii) an inherently parsimonious Delaunay input-space partition based piecewise local linear modelling concept; (iii) a neurofuzzy model constructive approach based on forward orthogonal least squares and optimal experimental design; and finally (iv) a neurofuzzy model construction algorithm based on basis functions that are Bézier-Bernstein polynomial functions and the additive decomposition. Illustrative examples demonstrate their applicability, showing that the final major hurdle in data-based modelling has almost been removed.