967 results for Multilevel Systems Model
Abstract:
A system is said to be "instantaneous" when, for a given constant input, an equilibrium output is obtained after a while. In the meantime, the output changes from its initial value towards the equilibrium one. This is the transient period of the system, and transients are important features of open-respirometry systems. During transients, one cannot compute the input amplitude directly from the output. The existing models (e.g., first- or second-order dynamics) cannot account for many of the features observed in real open-respirometry systems, such as time lag. Nor do these models explain what should be expected when a system is speeded up or slowed down. The purpose of the present study was to develop a mechanistic approach to the dynamics of open-respirometry systems, employing basic thermodynamic concepts. It is demonstrated that all the main relevant features of the output dynamics are due to, and can be adequately explained by, a distribution of apparent velocities within the set of molecules travelling along the system. The importance of the rate at which the molecules leave the sensor is explored for the first time. The study addresses the difference between calibrating a system with a continuous input and with a "unit impulse": the former truly reveals the dynamics of the system, while the latter represents the first derivative (in time) of the former and thus cannot adequately be employed in determining the apparent time constant. We also demonstrate why the apparent order of the output changes with volume or flow.
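For a classical first-order system, the relationship the abstract draws between the continuous-input (step) calibration and the unit-impulse calibration can be checked directly: the impulse response is the time derivative of the step response. A minimal numerical sketch, assuming a simple first-order system with an arbitrary time constant (not the distributed-velocity model developed in the study):

```python
import math

def step_response(t, tau):
    """Response to a continuous (step) input of unit amplitude."""
    return 1.0 - math.exp(-t / tau)

def impulse_response(t, tau):
    """Response to a unit impulse: the time derivative of the step response."""
    return math.exp(-t / tau) / tau

# Central-difference check that the impulse response equals d/dt of the step response
tau, t, h = 2.0, 1.5, 1e-6
deriv = (step_response(t + h, tau) - step_response(t - h, tau)) / (2 * h)
print(abs(deriv - impulse_response(t, tau)) < 1e-6)  # True
```

This is why differentiating noisy impulse-calibration data is a poor basis for time-constant determination: the derivative relationship amplifies measurement noise.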
Abstract:
This thesis is a literature study that develops a conceptual model of decision making and decision support in service systems. The study is related to the Ä-Logi (Intelligent Service Logic for Welfare Sector Services) research project, and its objective is to develop the theoretical framework necessary to enable further research based on the project's results and material. The study first examines the concepts of service and service systems, focusing on the characteristics of service systems and their implications for decision making and decision support, to provide the basis for developing the conceptual model. Based on the identified service system characteristics, an integrated model of service systems is proposed that views service systems through a number of interrelated perspectives, each offering different but complementary implications for the nature of decision making and the requirements for decision support in service systems. Based on this model, it is proposed that different types of decision making contexts can be identified in service systems, which may be dominated by different types of decision making processes and require different types of decision support, depending on the characteristics of the context and its decision making processes. The proposed conceptual model examines the characteristics of decision making contexts and processes in service systems, and their typical requirements for decision support. First, a characterization of different types of decision making contexts in service systems is proposed, based on the Cynefin framework and the identified service system characteristics.
Second, the nature of decision making processes in service systems is proposed to be dual: both rational and naturalistic decision making processes exist in service systems and play important, complementary roles. Finally, a characterization of typical requirements for decision support in service systems is proposed, examining the decision support requirements associated with different types of decision making processes in characteristically different types of decision making contexts. It is proposed that decision support for processes based on rational decision making can rely on organizational decision support models, while decision support for processes based on naturalistic decision making should rely on supporting the decision makers' situation awareness and facilitating the development of their tacit knowledge of the system and its tasks. Based on the proposed conceptual model, a further research process is proposed. The study additionally provides a number of new perspectives on the characteristics of service systems and on the nature of decision making and the requirements for decision support in service systems, which can provide a basis for further discussion and research, and support practice alike.
Abstract:
The recombinant heat shock protein (18 kDa-hsp) from Mycobacterium leprae was studied as a T-epitope model for vaccine development. We present a structural analysis of the stability of recombinant 18 kDa-hsp during different processing steps. Circular dichroism and ELISA were used to monitor protein structure after thermal stress, lyophilization and chemical modification. We observed that the 18 kDa-hsp is extremely resistant to a wide range of temperatures (60% of activity is retained at 80°C for 20 min). N-Acylation increased its ordered structure by 4% and decreased its β-T1 structure by 2%. ELISA demonstrated that the native conformation of the 18 kDa-hsp was preserved after hydrophobic modification by acylation. The recombinant 18 kDa-hsp resists a wide range of temperatures and chemical modifications without loss of its main characteristic, which is to serve as a source of T epitopes. This resistance is probably directly related to its lack of organization at the level of tertiary and secondary structures.
Abstract:
Concentrated solar power (CSP) is a renewable energy technology that could contribute to overcoming global problems related to pollution emissions and increasing energy demand. CSP utilizes solar irradiation, which is a variable source of energy. To utilize CSP technology in energy production and reliably operate a solar field including a thermal energy storage system, dynamic simulation tools are needed to study the dynamics of the solar field, optimize production and develop control systems. The objective of this Master's thesis is to compare different concentrated solar power technologies and configure a dynamic model of one selected CSP field design in the dynamic simulation program Apros, owned by VTT and Fortum. The configured model is based on German Novatec Solar's linear Fresnel reflector design. Solar collector components, including dimensions and performance calculation, were developed, as well as a simple solar field control system. The preliminary simulation results of two simulation cases under clear-sky conditions were good: the desired, stable superheated steam conditions were maintained in both cases, while, as expected, the amount of steam produced was reduced in the case with lower irradiation. From the model development process it can be concluded that the configured model works successfully and that Apros is a capable and flexible tool for configuring new solar field models and control systems and for simulating solar field dynamic behaviour.
Abstract:
This master's thesis was done for the Drive! project, in which a new electric motor solution for mobile working machines is developed. A generic simulation model will be used as a marketing and development tool. It can be used to model a wide variety of vehicles with and without the electric motor, and to show customers the difference between traditionally built vehicles and those with the new electric motor solution. Customers can also use the simulation model to research different solutions for their own vehicles. At the start of the project it was decided that the MeVEA software would be used as the main simulation program, and Simulink would only be used to simulate the operation of electrical components. Development of the generic model started with research into these two software applications, the simulation models made with them, and how such simulation models can be built faster. The best findings were used to build the generic simulation model. The finished generic model can be used to produce new tractor models for real-time simulation at short notice. All information about a model is collected in one datasheet, which can easily be filled in by the user. After the datasheet is filled in, a script automatically builds the new simulation model in seconds. At the moment the generic model is capable of building simulation models for a wide variety of tractors, but it can easily be adapted to other vehicle types that would also benefit greatly from an electric drive solution, for example wheel loaders and harvesters.
Abstract:
Nowadays, computer-based systems tend to become more complex and control increasingly critical functions affecting different areas of human activity. Failures of such systems might result in loss of human lives as well as significant damage to the environment; therefore, their safety needs to be ensured. However, the development of safety-critical systems is not a trivial exercise. Hence, to preclude design faults and guarantee the desired behaviour, different industrial standards prescribe the use of rigorous techniques for the development and verification of such systems. The more critical the system, the more rigorous the approach that should be undertaken. To ensure the safety of a critical computer-based system, satisfaction of the safety requirements imposed on the system should be demonstrated. This task involves a number of activities. In particular, a set of safety requirements is usually derived by conducting various safety analysis techniques. Strong assurance that the system satisfies the safety requirements can be provided by formal methods, i.e., mathematically based techniques. At the same time, the evidence that the system under consideration meets the imposed safety requirements might be demonstrated by constructing safety cases. However, the overall safety assurance process of critical computer-based systems remains insufficiently defined for the following reasons. Firstly, there are semantic differences between safety requirements and formal models: informally represented safety requirements should be translated into the underlying formal language to enable further verification. Secondly, the development of formal models of complex systems can be labour-intensive and time-consuming. Thirdly, there are only a few well-defined methods for the integration of formal verification results into safety cases.
This thesis proposes an integrated approach to the rigorous development and verification of safety-critical systems that (1) facilitates elicitation of safety requirements and their incorporation into formal models, (2) simplifies formal modelling and verification by proposing specification and refinement patterns, and (3) assists in the construction of safety cases from the artefacts generated by formal reasoning. Our chosen formal framework is Event-B. It allows us to tackle the complexity of safety-critical systems as well as to structure safety requirements by applying abstraction and stepwise refinement. The Rodin platform, a tool supporting Event-B, assists in automatic model transformations and proof-based verification of the desired system properties. The proposed approach has been validated by several case studies from different application domains.
Abstract:
The objective of this Master's thesis is to develop a model that estimates net working capital (NWC) monthly over a one-year period. The study is conducted as constructive research using a case study. The estimation model is designed for the needs of a case company that operates in project business. The net working capital components should be linked together by an automatic model and estimated individually, including advanced components of NWC such as percentage-of-completion (POC) receivables. The net working capital estimation model of this study contains three parts: an output template, an input template and a calculation model. The output template gets estimate values automatically from the input template and the calculation model. Estimate values of the more stable NWC components are entered manually into the input template. The calculation model gets estimate values for the major components automatically from the company's systems, using historical data and existing plans. A precondition for the functionality of the estimation calculation is that sales are estimated over a one-year period, because sales are linked to all NWC components.
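The dependence of all NWC components on the sales estimate can be illustrated with a standard turnover-days calculation: receivables, inventories and payables are each derived from daily sales. This is a generic textbook sketch with hypothetical names and parameter values, far simpler than the thesis's three-template model (POC receivables, for instance, are omitted):

```python
def nwc_estimate(monthly_sales, dso=45, dio=30, dpo=40, other_components=0.0):
    """Monthly net-working-capital estimate driven by a sales forecast.

    Receivables, inventories and payables are tied to sales via assumed
    turnover days (DSO, DIO, DPO); the more stable components are entered
    manually through `other_components`. All figures are hypothetical.
    """
    daily_sales = monthly_sales / 30.0
    receivables = daily_sales * dso   # days sales outstanding
    inventories = daily_sales * dio   # days inventory outstanding
    payables = daily_sales * dpo      # days payables outstanding
    return receivables + inventories - payables + other_components

print(nwc_estimate(300000))  # 350000.0
```

Because every term scales with `monthly_sales`, any error in the sales estimate propagates to every NWC component, which is exactly why the sales forecast is stated as a precondition.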
Abstract:
Sensory analysis was used to obtain an overall flavour description of reaction mixtures containing 5'-IMP and cysteine. Ribose/cysteine systems were used as reference systems. Results from triangle tests and aroma profiling show a clear correlation between the terms used and the volatile analyses described in the literature for these model systems. For instance, reactions at pH 3.0 and 4.5 for 5'-IMP/cysteine systems, which were described as "meaty" and "boiled meat" by panellists, presented, in the literature, the highest number of "meaty" compounds in volatile analysis (1, 7, 8, 20).
Abstract:
Fluid handling systems such as pump and fan systems have been found to have significant potential for energy efficiency improvements. To deliver this energy saving potential, easily implementable methods are needed to monitor the system output, because this information is needed to identify inefficient operation of the fluid handling system and to control the output of the pumping system according to process needs. Model-based pump or fan monitoring methods implemented in variable-speed drives have proven able to give information on the system output without additional metering; however, the current model-based methods may not be usable or sufficiently accurate across the whole operating range of the fluid handling device. To apply model-based system monitoring to a wider selection of systems and to improve the accuracy of the monitoring, this paper proposes a new method for pump and fan output monitoring with variable-speed drives. The method uses a combination of already known operating point estimation methods. Laboratory measurements are used to verify the benefits and applicability of the improved estimation method, and the new method is compared with five previously introduced model-based estimation methods. According to the laboratory measurements, the new estimation method is the most accurate and reliable of the model-based estimation methods.
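A common building block of such model-based estimation is the QH-curve method: the pump curve published for nominal speed is scaled with the affinity laws and intersected with the system curve. The sketch below illustrates only this one ingredient, with hypothetical curve coefficients and a purely frictional system curve; it is not the combined method proposed in the paper:

```python
def pump_head(q, rel_speed, a=40.0, b=0.002):
    """Pump QH curve scaled with the affinity laws: H = a*r^2 - b*q^2.

    `rel_speed` is the speed relative to nominal (known to the drive);
    a and b are hypothetical catalogue-curve coefficients.
    """
    return a * rel_speed ** 2 - b * q ** 2

def operating_point(rel_speed, a=40.0, b=0.002, k=0.001):
    """Intersect the scaled pump curve with a system curve H = k*q^2.

    For these simple quadratics the intersection has a closed form:
    a*r^2 - b*q^2 = k*q^2  =>  q = r * sqrt(a / (b + k)).
    """
    q = rel_speed * (a / (b + k)) ** 0.5
    return q, k * q ** 2

q, h = operating_point(0.8)  # estimated flow and head at 80 % speed
```

Real drives estimate the operating point from measured shaft power or current rather than an assumed system curve, which is one reason single methods lose accuracy in parts of the operating range.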
Abstract:
With the new age of the Internet of Things (IoT), everyday objects such as mobile smart devices are being equipped with cheap sensors and low-energy wireless communication capability. Mobile smart devices (phones, tablets) have become ubiquitous, with nearly everyone having access to at least one. There is an opportunity to build innovative applications and services by exploiting these devices' untapped rechargeable energy, sensing and processing capabilities. In this thesis, we propose, develop, implement and evaluate LoadIoT, a peer-to-peer load balancing scheme that can distribute tasks among a plethora of mobile smart devices in the IoT world. We develop and demonstrate an Android-based proof-of-concept load-balancing application. We also present a model of the system, which is used to validate the efficiency of the load balancing approach under varying application scenarios. Load balancing concepts can be applied to IoT scenarios involving smart devices, reducing the traffic sent to the cloud and the energy consumption of the devices. The data acquired from the experimental outcomes enable us to determine the feasibility and cost-effectiveness of load-balanced P2P smartphone-based applications.
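A peer-to-peer balancing decision of this kind can be sketched as a greedy heuristic that weighs each device's accumulated load against its battery level. This is an illustrative toy, not the LoadIoT scheme itself; the device names and the cost model are assumptions:

```python
def assign_tasks(tasks, devices):
    """Greedy balancing sketch: each (task, cost) goes to the device with
    the lowest projected load-to-battery ratio.

    `devices` maps device name -> battery level; `tasks` is a list of
    (task_name, cost) pairs. Purely illustrative heuristic.
    """
    loads = {name: 0.0 for name in devices}
    assignment = {}
    for task, cost in tasks:
        target = min(devices, key=lambda d: (loads[d] + cost) / devices[d])
        loads[target] += cost
        assignment[task] = target
    return assignment, loads

# Hypothetical devices and tasks:
assignment, loads = assign_tasks([('t1', 10.0), ('t2', 30.0)],
                                 {'phone': 100.0, 'tablet': 90.0})
print(assignment)  # {'t1': 'phone', 't2': 'tablet'}
```

Weighting by battery level captures the energy argument in the abstract: work migrates towards devices that can best afford it instead of every device offloading to the cloud.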
Abstract:
Resilience is the property of a system to remain trustworthy despite changes. Changes of a different nature, whether due to failures of system components or varying operational conditions, significantly increase the complexity of system development. Therefore, advanced development technologies are required to build robust and flexible system architectures capable of adapting to such changes. Moreover, powerful quantitative techniques are needed to assess the impact of these changes on various system characteristics. Architectural flexibility is achieved by embedding into the system design the mechanisms for identifying changes and reacting to them. Hence a resilient system should have both advanced monitoring and error detection capabilities to recognise changes, and sophisticated reconfiguration mechanisms to adapt to them. The aim of such reconfiguration is to ensure that the system stays operational, i.e., remains capable of achieving its goals. The design, verification and assessment of system reconfiguration mechanisms is a challenging and error-prone engineering task. In this thesis, we propose and validate a formal framework for the development and assessment of resilient systems. Such a framework provides us with the means to specify and verify complex component interactions, model their cooperative behaviour in achieving system goals, and analyse the chosen reconfiguration strategies. Due to the variety of properties to be analysed, such a framework should have an integrated nature. To ensure the system's functional correctness, it should rely on formal modelling and verification, while, to assess the impact of changes on properties such as performance and reliability, it should be combined with quantitative analysis. To ensure the scalability of the proposed framework, we choose Event-B as the basis for reasoning about functional correctness.
Event-B is a state-based formal approach that promotes the correct-by-construction development paradigm and formal verification by theorem proving. Event-B has mature industrial-strength tool support: the Rodin platform. Proof-based verification, as well as the reliance on abstraction and decomposition adopted in Event-B, provides designers with powerful support for the development of complex systems. Moreover, top-down system development by refinement allows developers to explicitly express and verify critical system-level properties. Besides ensuring functional correctness, to achieve resilience we also need to analyse a number of non-functional characteristics, such as reliability and performance. Therefore, in this thesis we also demonstrate how formal development in Event-B can be combined with quantitative analysis. Namely, we experiment with integrating techniques such as probabilistic model checking in PRISM and discrete-event simulation in SimPy with formal development in Event-B. Such an integration allows us to assess how changes and different reconfiguration strategies affect the overall system resilience. The approach proposed in this thesis is validated by a number of case studies from areas such as robotics, space, healthcare and cloud computing.
Abstract:
Fluid flow behaviour in porous media is a conundrum. This research therefore focuses on the filtration-volumetric characterisation of fractured carbonate sediments, coupled with their proper simulation. Rock properties such as pore volume, permeability and porosity are measured in the laboratory; then phase permeabilities and oil recovery as a function of flow rate are assessed. Furthermore, the rheological properties of three oils are measured and analysed. Finally, based on the rock and fluid properties, a model is built in COMSOL Multiphysics in order to compare the experimental and simulated results. The rock analyses show a linear relation between flow rate and differential pressure, from which phase permeabilities and the pressure gradient are determined; eventually, the oil recovery under low and high flow rates is established. In addition, the oils reveal thixotropic properties as well as non-Newtonian behaviour described by the Bingham model; consequently, a Carreau viscosity model for the used oil is given. Given these points, the model for oil and water is built in COMSOL Multiphysics, whereupon the correspondence between experimental and simulated results is successfully analysed and compared. Finally, a two-phase displacement model is elaborated.
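The Carreau model mentioned above is a standard four-parameter viscosity law, mu(g) = mu_inf + (mu0 - mu_inf) * (1 + (lambda*g)^2)^((n-1)/2), interpolating between a zero-shear and an infinite-shear viscosity. A minimal implementation (the parameter values in the demo are illustrative, not the measured oil properties):

```python
def carreau_viscosity(shear_rate, mu0, mu_inf, lam, n):
    """Carreau model: mu = mu_inf + (mu0 - mu_inf) * (1 + (lam*g)^2)^((n-1)/2).

    mu0: zero-shear viscosity, mu_inf: infinite-shear viscosity,
    lam: relaxation time, n: power-law index (n < 1 => shear-thinning).
    """
    return mu_inf + (mu0 - mu_inf) * (1.0 + (lam * shear_rate) ** 2) ** ((n - 1.0) / 2.0)

# Illustrative shear-thinning oil: viscosity falls from mu0 towards mu_inf
for g in (0.0, 1.0, 100.0):
    print(g, carreau_viscosity(g, 1.0, 0.01, 1.0, 0.5))
```

Unlike the Bingham model, the Carreau law has no yield-stress discontinuity, which is one reason it is often preferred as a smooth constitutive input for finite-element solvers such as COMSOL.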
Abstract:
Building a computational model for complex biological systems is an iterative process. It starts from an abstraction of the process and then incorporates more details regarding the specific biochemical reactions, which changes the model fit. Meanwhile, the model's numerical properties, such as its numerical fit and validation, should be preserved. However, refitting the model after each refinement iteration is computationally expensive. There is an alternative approach, known as quantitative model refinement, which ensures that the model fit is preserved without the need to refit the model after each refinement iteration. The aim of this thesis is to develop and implement a tool called ModelRef, which performs quantitative model refinement automatically. It is implemented both as a stand-alone Java application and as a component of the Anduril framework. ModelRef performs data refinement of a model and generates the results in two well-known formats (SBML and CPS). The tool successfully reduces the time and resources needed, as well as the errors generated, compared with the traditional reiteration of the whole model fitting procedure.
Abstract:
The advancement of science and technology makes it clear that no single perspective is any longer sufficient to describe the true nature of any phenomenon. That is why interdisciplinary research is gaining more attention over time. An excellent example of this type of research is natural computing, which stands on the borderline between biology and computer science. The contribution of research done in natural computing is twofold: on one hand, it sheds light on how nature works and how it processes information and, on the other hand, it provides some guidelines on how to design bio-inspired technologies. The first direction in this thesis focuses on a nature-inspired process called gene assembly in ciliates. The second studies reaction systems, a modelling framework whose rationale is built upon the biochemical interactions happening within a cell. The process of gene assembly in ciliates has attracted a lot of attention as a research topic in the past 15 years. Two main modelling frameworks were initially proposed at the end of the 1990s to capture the ciliates' gene assembly process, namely the intermolecular model and the intramolecular model. They were followed by other model proposals such as template-based assembly and DNA rearrangement pathways recombination models. In this thesis we are interested in a variation of the intramolecular model called the simple gene assembly model, which focuses on the simplest possible folds in the assembly process. We propose a new framework called directed overlap-inclusion (DOI) graphs to overcome the limitations that previously introduced models faced in capturing all the combinatorial details of the simple gene assembly process. We investigate a number of combinatorial properties of these graphs, including a necessary property in terms of forbidden induced subgraphs.
We also introduce DOI graph-based rewriting rules that capture all the operations of the simple gene assembly model and prove that they are equivalent to the string-based formalization of the model. Reaction systems (RS) are another nature-inspired modelling framework studied in this thesis. Their rationale is based upon two main regulation mechanisms, facilitation and inhibition, which control the interactions between biochemical reactions. Reaction systems are a complementary modelling framework to traditional quantitative frameworks, focusing on explicit cause-effect relationships between reactions. The explicit formulation of the facilitation and inhibition mechanisms behind reactions, as well as the focus on interactions between reactions (rather than the dynamics of concentrations), makes their applicability potentially wide and useful beyond biological case studies. In this thesis, we construct a reaction system model corresponding to the heat shock response mechanism, based on a novel concept of dominance graph that captures the competition for resources in the ODE model. We also introduce for RS various concepts inspired by biology, e.g., mass conservation, steady state and periodicity, to enable model checking of reaction-system-based models. We prove that the complexity of the decision problems related to these properties varies from P to NP-complete and coNP-complete to PSPACE-complete. We further focus on the mass conservation relation in an RS, introduce the conservation dependency graph to capture the relation between the species, and propose an algorithm to list the conserved sets of a given reaction system.
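The single-step semantics of a reaction system is compact enough to state directly: a reaction (reactants, inhibitors, products) is enabled on a state when all its reactants are present and none of its inhibitors are, and the next state is the union of the products of all enabled reactions. A minimal sketch of this standard semantics (the example reactions are made up, not the heat shock response model from the thesis):

```python
def rs_result(state, reactions):
    """One step of a reaction system: a reaction (reactants, inhibitors,
    products) is enabled when reactants <= state and no inhibitor is in
    state; the next state is the union of products of enabled reactions."""
    out = set()
    for reactants, inhibitors, products in reactions:
        if reactants <= state and not (inhibitors & state):
            out |= products
    return out

# Made-up example: 'a' produces 'c' unless inhibited by 'b'; 'c' regenerates 'a'.
reactions = [({'a'}, {'b'}, {'c'}), ({'c'}, set(), {'a'})]
print(rs_result({'a'}, reactions))       # {'c'}
print(rs_result({'a', 'b'}, reactions))  # set()
```

Note the "no permanency" feature visible here: anything not produced by an enabled reaction vanishes from the next state, which is what makes concepts such as mass conservation non-trivial for RS.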
Abstract:
This research studied project performance measurement from the perspective of strategic management. The objective was to find a generic model for project performance measurement that emphasizes strategy and decision making. The research followed the guidelines of a constructive research methodology. As a result, the study suggests a model that measures projects with multiple metrics during and after the project. Measurement after the project is suggested to be linked to the strategic performance measures of the company, and should be conducted through centralized project portfolio management, e.g., by the project management office in the organization. After the project, the metrics measure the project's actual benefit realization. During the project, the metrics are universal and measure the relation of accomplished objectives to costs, schedule and internal resource usage. The outcomes of these measures should be forecast using qualitative or stochastic methods. A solid theoretical background for the model was found in the literature covering performance measurement, projects and uncertainty. The study states that the model can be implemented in companies; this statement is supported by empirical evidence from a single case study. The gathering of empirical evidence about the actual usefulness of the model in companies is left for future evaluative research.