842 results for Adaptive system theory
Abstract:
The purpose of the study is to determine the general features of a supply chain performance management system, to assess the current state of performance management in the case company's mills, and to make proposals for improvement, i.e. what the future state of the performance management system would look like. The study covers four phases consisting of theory and case company parts. The theoretical review gives an understanding of performance management and measurement. The current state analysis assesses the current state of performance management in the mills. Results and proposals for improvement are derived from the current state analysis, and finally the conclusions with answers to the research questions are presented. A supply chain performance management system consists of five areas: performance measurement and metrics; action plans; performance tracking; performance dialogue; and rewards, consequences and actions. The results of the study revealed that all mills were at a fairly average level in performance management and that there is room for improvement. The performance improvement matrix created served as a tool for assessing current performance management and could also work as a tool for mapping the current state after a future transformation process. Limited harmonization was revealed, as there were different ways to work and manage performance in the mills. Many good ideas existed, though actions are needed to make progress. There is also a need to harmonize the KPI structure.
Abstract:
Through advances in technology, System-on-Chip design is moving towards integrating tens to hundreds of intellectual property blocks into a single chip. In such a many-core system, on-chip communication becomes a performance bottleneck for high performance designs. Network-on-Chip (NoC) has emerged as a viable solution for the communication challenges in highly complex chips. The NoC architecture paradigm, based on a modular packet-switched mechanism, can address many of the on-chip communication challenges such as wiring complexity, communication latency, and bandwidth. Furthermore, the combined benefits of 3D IC and NoC schemes provide the possibility of designing a high performance system in a limited chip area. The major advantages of 3D NoCs are the considerable reductions in average latency and power consumption. There are several factors degrading the performance of NoCs. In this thesis, we investigate three main performance-limiting factors: network congestion, faults, and the lack of efficient multicast support. We address these issues by means of routing algorithms. Congestion of data packets may lead to increased network latency and power consumption. Thus, we propose three different approaches for alleviating such congestion in the network. The first approach is based on measuring the congestion information in different regions of the network, distributing the information over the network, and utilizing this information when making a routing decision. The second approach employs a learning method to dynamically find the less congested routes according to the underlying traffic. The third approach is based on a fuzzy-logic technique to make better routing decisions when traffic information of different routes is available. Faults affect performance significantly, as packets must then take longer paths in order to be routed around the faults, which in turn increases congestion around the faulty regions.
We propose four methods to tolerate faults at the link and switch level by using only the shortest paths, as long as such a path exists. The unique characteristic of these methods is that they tolerate faults while also maintaining the performance of the NoC. To the best of our knowledge, these algorithms are the first approaches to bypass faults prior to reaching them while avoiding unnecessary misrouting of packets. Current implementations of multicast communication result in a significant performance loss for unicast traffic. This is due to the fact that the routing rules of multicast packets limit the adaptivity of unicast packets. We present an approach in which both unicast and multicast packets can be efficiently routed within the network. While providing more efficient multicast support, the proposed approach does not affect the performance of unicast routing at all. In addition, in order to reduce the overall path length of multicast packets, we present several partitioning methods along with their analytical models for latency measurement. This approach is discussed in the context of 3D mesh networks.
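The congestion-aware routing idea behind the first approach can be sketched as follows (an illustrative toy on a 2D mesh, not the thesis's algorithms; the node coordinates and congestion map are hypothetical):

```python
# Minimal sketch of congestion-aware minimal adaptive routing on a 2D mesh.
# At each hop the router picks, among the productive (shortest-path)
# directions, the neighbour with the lowest reported congestion.

def route_hop(cur, dst, congestion):
    """cur, dst: (x, y) coordinates; congestion: dict mapping node -> load."""
    x, y = cur
    dx, dy = dst[0] - x, dst[1] - y
    candidates = []
    if dx:
        candidates.append((x + (1 if dx > 0 else -1), y))
    if dy:
        candidates.append((x, y + (1 if dy > 0 else -1)))
    # Choose the productive neighbour with the lowest reported congestion.
    return min(candidates, key=lambda n: congestion.get(n, 0))

def route(src, dst, congestion):
    """Return the full path src -> dst using per-hop adaptive choices."""
    path = [src]
    while path[-1] != dst:
        path.append(route_hop(path[-1], dst, congestion))
    return path
```

Because only productive directions are considered, every path stays minimal; the congestion map merely steers the choice among equally short routes.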
Abstract:
This paper presents an HP-adaptive procedure with hierarchical formulation for the Boundary Element Method in 2-D elasticity problems. Firstly, the H, P and HP formulations are defined. Then the hierarchical concept, which allows a substantial reduction in the dimension of the equation system, is introduced. The error estimator used is based on the residual computed over each node inside an element. Finally, the HP strategy is defined and applied to two examples.
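The adaptive refinement loop behind such H/P strategies can be sketched generically (illustrative only; the paper's residual error estimator and the hierarchical BEM formulation are not reproduced, and the refinement thresholds are assumptions):

```python
# Generic sketch of an HP-adaptive refinement step: elements whose estimated
# error exceeds a fraction of the maximum are refined, by raising the
# polynomial order (p) while possible, otherwise by splitting (h).

def refine(elements, error, frac=0.5, p_max=4):
    """elements: list of (length, order); error: per-element estimates."""
    e_max = max(error)
    out = []
    for (h, p), e in zip(elements, error):
        if e < frac * e_max:
            out.append((h, p))                    # accurate enough: keep
        elif p < p_max:
            out.append((h, p + 1))                # p-refinement: raise order
        else:
            out.extend([(h / 2, p), (h / 2, p)])  # h-refinement: split
    return out
```

Real HP strategies use smoothness indicators to decide between h and p; the order cap here is just a stand-in for that decision.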
Abstract:
This paper analyzes the local dynamical behavior of a slewing flexible structure considering nonlinear curvature. The dynamics of the original (nonlinear) governing equations of motion are reduced to the center manifold in the neighborhood of an equilibrium solution with the purpose of studying the local stability of the system. At this critical point, a Hopf bifurcation occurs. In this region, one can find values of the control parameter (structural damping coefficient) for which the system is unstable and values for which the system's stability is assured (periodic motion). This local analysis of the system reduced to the center manifold establishes the stable/unstable behavior of the original system around a known solution.
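The role of the control parameter at a Hopf bifurcation can be illustrated with the supercritical Hopf normal form (a generic textbook sketch, not the paper's slewing-structure equations; `mu` stands in for the bifurcation parameter):

```python
# Supercritical Hopf normal form in polar coordinates: r' = mu*r - r**3,
# theta' = omega. For mu < 0 the equilibrium is stable; for mu > 0
# trajectories settle on a limit cycle of radius sqrt(mu) (periodic motion).

def radius_after(mu, r0=0.1, dt=1e-3, steps=200_000):
    """Integrate r' = mu*r - r**3 with forward Euler and return the final r."""
    r = r0
    for _ in range(steps):
        r += dt * (mu * r - r**3)
    return r
```

Crossing mu = 0 switches the equilibrium from stable (r decays to 0) to unstable (r approaches the limit-cycle radius), mirroring the stable/unstable regimes found on the center manifold.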
Abstract:
In 1859, Charles Darwin published his theory of evolution by natural selection, a process driven by fitness benefits and fitness costs at the individual level. Traditionally, evolution has been investigated by biologists, but it has inspired mathematical approaches, too. For example, adaptive dynamics has proven to be a very applicable framework for this purpose. Its core concept is the invasion fitness, the sign of which tells whether a mutant phenotype can invade the prevalent phenotype. In this thesis, four real-world applications to evolutionary questions are provided. Inspiration for the first two studies arose from a cold-adapted species, the American pika. First, it is studied how global climate change may affect the evolution of dispersal and the viability of pika metapopulations. Based on the results gained here, it is shown that the evolution of dispersal can result in extinction, and indeed the evolution of dispersal should be incorporated into the viability analysis of species living in fragmented habitats. The second study focuses on the evolution of density-dependent dispersal in metapopulations with small habitat patches. It revealed a surprising, counterintuitive evolutionary phenomenon: how non-monotone density-dependent dispersal may evolve. Cooperation is surprisingly common at many levels of life, despite its obvious vulnerability to selfish cheating. This motivated two applications. First, it is shown that density-dependent cooperative investment can evolve to have a qualitatively different form, monotone or non-monotone, depending on modelling details. The last study investigates the evolution of investing in two public-goods resources. The results suggest one general path by which labour division can arise via evolutionary branching. In addition to the applications, two novel methodological derivations of fitness measures in structured metapopulations are given.
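The invasion-fitness recipe at the heart of adaptive dynamics can be sketched with a toy model (illustrative only, not one of the thesis's four applications; the Gaussian carrying capacity is an assumption):

```python
# Toy adaptive-dynamics workflow: invasion fitness s(y; x) of a rare mutant
# with trait y in a resident population with trait x at its ecological
# equilibrium, for logistic growth with carrying capacity K(x) = exp(-x**2).

import math

def K(x):
    return math.exp(-x**2)

def invasion_fitness(y, x):
    """Per-capita growth of a rare mutant y when resident x sits at N* = K(x)."""
    return 1.0 - K(x) / K(y)

def selection_gradient(x, h=1e-6):
    """Numerical derivative of s(y; x) with respect to y, evaluated at y = x."""
    return (invasion_fitness(x + h, x) - invasion_fitness(x - h, x)) / (2 * h)
```

The sign of `invasion_fitness` tells whether a mutant can invade; the gradient vanishes at x = 0, the singular strategy, where the full machinery (branching, evolutionary suicide) takes over.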
Abstract:
Data management consists of collecting, storing, and processing data into a format that provides value-adding information for the decision-making process. The development of data management has enabled the design of increasingly effective database management systems to support business needs. Therefore, in addition to advanced systems designed for reporting purposes, operational systems also allow reporting and data analysis. The research method used in the theory part is qualitative research, and the research type in the empirical part is a case study. The objective of this paper is to examine database management system requirements from the reporting management and data management perspectives. In the theory part these requirements are identified and the appropriateness of the relational data model is evaluated. In addition, key performance indicators applied to the operational monitoring of production are studied. The study revealed that appropriate operational key performance indicators of production take into account time, quality, flexibility and cost aspects. Manufacturing efficiency in particular has been highlighted. In this paper, reporting management is defined as the continuous monitoring of given performance measures. According to the literature review, a data management tool should cover performance, usability, reliability, scalability, and data privacy aspects in order to fulfill the demands of reporting management. A framework is created for the system development phase based on these requirements and is used in the empirical part of the thesis, where such a system is designed and created for reporting management purposes for a company operating in the manufacturing industry. Relational data modeling and database architectures are utilized when the system is built on a relational database platform.
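As a hedged illustration of a production KPI combining the time and quality aspects mentioned (the thesis gives no formulas; the function and numbers here are hypothetical), Overall Equipment Effectiveness (OEE) is a standard manufacturing-efficiency measure:

```python
# OEE = availability * performance * quality, a common composite KPI for
# the operational monitoring of production. All inputs are illustrative.

def oee(planned_time, run_time, ideal_cycle_time, total_count, good_count):
    availability = run_time / planned_time                   # time aspect
    performance = (ideal_cycle_time * total_count) / run_time  # speed aspect
    quality = good_count / total_count                       # quality aspect
    return availability * performance * quality
```

A reporting system of the kind described would recompute such measures continuously from the operational database.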
Abstract:
This thesis is concerned with state and parameter estimation in state space models. The estimation of states and parameters is an important task when mathematical modeling is applied to many different application areas, such as global positioning systems, target tracking, navigation, brain imaging, the spread of infectious diseases, biological processes, telecommunications, audio signal processing, stochastic optimal control, machine learning, and physical systems. In Bayesian settings, the estimation of states or parameters amounts to the computation of the posterior probability density function. Except for a very restricted number of models, it is impossible to compute this density function in closed form. Hence, we need approximation methods. A state estimation problem involves estimating the states (latent variables) that are not directly observed in the output of the system. In this thesis, we use the Kalman filter, extended Kalman filter, Gauss–Hermite filters, and particle filters to estimate the states based on the available measurements. Among these filters, particle filters are numerical methods for approximating the filtering distributions of non-linear, non-Gaussian state space models via Monte Carlo. The performance of a particle filter depends heavily on the chosen importance distribution; for instance, an inappropriate choice of importance distribution can lead to the failure of convergence of the particle filter algorithm. In this thesis, we analyze the theoretical Lᵖ particle filter convergence with general importance distributions, where p ≥ 2 is an integer. A parameter estimation problem is concerned with inferring the model parameters from measurements. For high-dimensional complex models, parameter estimation can be done by Markov chain Monte Carlo (MCMC) methods. In its operation, the MCMC method requires the unnormalized posterior distribution of the parameters and a proposal distribution.
In this thesis, we show how the posterior density function of the parameters of a state space model can be computed by filtering-based methods, where the states are integrated out. This type of computation is then applied to estimate the parameters of stochastic differential equations. Furthermore, we compute the partial derivatives of the log-posterior density function and use the hybrid Monte Carlo and scaled conjugate gradient methods to infer the parameters of stochastic differential equations. The computational efficiency of MCMC methods depends highly on the chosen proposal distribution. A commonly used proposal distribution is the Gaussian, in which case the covariance matrix must be well tuned. To tune it, adaptive MCMC methods can be used. In this thesis, we propose a new way of updating the covariance matrix using the variational Bayesian adaptive Kalman filter algorithm.
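The filtering recursion referred to above can be made concrete with a minimal scalar Kalman filter (a sketch of the textbook algorithm, not the thesis's code; the random-walk model and noise values are illustrative assumptions):

```python
# Scalar Kalman filter for a random-walk model: x_k = x_{k-1} + w_k,
# y_k = x_k + v_k, with process noise variance q and measurement noise
# variance r. Each step is a predict/update pair.

def kalman_1d(measurements, q=1e-3, r=0.1, m0=0.0, p0=1.0):
    """Return the sequence of filtered means for the given measurements."""
    m, p = m0, p0
    means = []
    for y in measurements:
        p = p + q                 # predict: variance grows by process noise
        k = p / (p + r)           # Kalman gain
        m = m + k * (y - m)       # update mean with the innovation
        p = (1 - k) * p           # update variance
        means.append(m)
    return means
```

The same predict/update pattern generalizes to the matrix-valued case and underlies the extended and Gauss–Hermite variants; filtering-based parameter estimation evaluates the likelihood from the innovations produced by such a recursion.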
Abstract:
Performance measurement produces information about the operation of a business process. On the basis of this information, the performance of the company can be followed and improved. A balanced performance measurement system can monitor performance from several perspectives, and business processes can be led according to the company strategy. A major part of a company's costs originates from purchased goods or services, which are an output of the buying process, emphasising the importance of reliable performance measurement of the purchasing process. In the study, the theory of balanced performance measurement is reviewed and a framework for a purchasing process performance measurement system is designed. The designed balanced performance measurement system for the purchasing process is tested in the case company, paying attention to the available data and to other environmental enablers. The balanced purchasing performance measurement system is tested and improved during the test period, with attention paid to the definition and scaling of objectives. The development initiatives found are carried out, especially in the scaling of indicators. Finally, the results of the study are evaluated, and conclusions and additional research areas are proposed.
Abstract:
Ecological specialization in resource utilization has various facets, ranging from nutritional resources via the host use of parasites or phytophagous insects to local adaptation in different habitats. Therefore, the evolution of specialization affects the evolution of most other traits, which makes it one of the core issues in the theory of evolution. Hence, the evolution of specialization has gained an enormous amount of research interest, starting already from Darwin's Origin of Species in 1859. The vast majority of theoretical studies have, however, focused on the mathematically simplest case with well-mixed populations and equilibrium dynamics. This thesis explores the possibilities of extending the evolutionary analysis of resource usage to spatially heterogeneous metapopulation models and to models with non-equilibrium dynamics. These extensions are enabled by recent advances in the field of adaptive dynamics, which allow for a mechanistic derivation of the invasion-fitness function based on the ecological dynamics. In the evolutionary analyses, special focus is placed on the case of two substitutable renewable resources. In this case, the most striking questions are whether a generalist species is able to coexist with the two specialist species, and whether such trimorphic coexistence can be attained through natural selection starting from a monomorphic population. This is shown to be possible both due to spatial heterogeneity and due to non-equilibrium dynamics. In addition, it is shown that chaotic dynamics may sometimes inflict evolutionary suicide or cyclic evolutionary dynamics. Moreover, the relations between various ecological parameters and the evolutionary dynamics are investigated. In particular, the relation between specialization and dispersal propensity turns out to be counter-intuitively non-monotonic.
This observation served as inspiration for the analysis of the joint evolution of dispersal and specialization, which may provide the most natural explanation for the observed coexistence of specialist and generalist species.
Abstract:
The topic of the present doctoral dissertation is the analysis of the phonological and tonal structures of a previously largely undescribed language, namely Samue. It is a Gur language belonging to the Niger-Congo language phylum and is spoken in Burkina Faso. The data were collected during a fieldwork period in a Sama village; they include 1,800 lexical items, thousands of elicited sentences and 30 oral texts. The data were first transcribed phonetically, and then the phonological and tonal analyses were conducted. The results show that the phonological system of Samue, with its phoneme inventory and phonological processes, has the same characteristics as other related Gur languages, although some particularities were found, such as the voicing and lenition of stop consonants in medial positions. The tonal analysis revealed three level tones, which have both lexical and grammatical functions. A particularity of the tonal system is the regressive Mid tone spreading in the verb phrase. The theoretical framework used in the study is Optimality Theory. Optimality Theory is rarely used in the analysis of an entire language system, and thus an objective was to see whether the theory was applicable to this type of work. Within the tonal analysis especially, some language-specific constraints had to be created, although the basic Optimality Theory principle is the universal nature of the constraints. These constraints define the well-formedness of language structures and are ranked differently in different languages. This study gives new insights into typological phenomena in Gur languages. It is also a fundamental starting point for the Samue language in relation to the establishment of an orthography. From the theoretical point of view, the study shows that Optimality Theory is largely applicable to the analysis of an entire sound system.
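Constraint evaluation in Optimality Theory is itself algorithmic and can be sketched as code (a toy evaluator with hypothetical candidates and constraints, not Samue data):

```python
# Toy Optimality Theory evaluation: the winning candidate is the one whose
# violation profile, read in constraint-ranking order, is lexicographically
# smallest.

def ot_winner(candidates, ranking, violations):
    """candidates: list of forms; ranking: constraints from highest- to
    lowest-ranked; violations: dict (candidate, constraint) -> count."""
    def profile(cand):
        return tuple(violations.get((cand, c), 0) for c in ranking)
    return min(candidates, key=profile)
```

Reranking the same universal constraints can select a different winner, which is how OT models cross-linguistic variation with one constraint set.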
Abstract:
Modern food systems face complex global challenges such as climate change, resource scarcity, population growth, concentration and globalization. It is not possible to forecast how all these challenges will affect food systems, but futures research methods make it possible to better understand possible futures and thereby increase futures awareness. In this thesis, a two-round online Delphi method was utilized to research experts’ opinions about the present and future resilience of the Finnish food system up to 2050. The first-round questionnaire was constructed based on resilience indicators developed for agroecosystems. The sub-systems in the study were primary production (the main focus), food industry, retail and consumption. Based on the results of the first round, future images were constructed for the primary production and food industry sub-sections. The second round asked for the experts’ opinions on the probability and desirability of the future images. In addition, panarchy scenarios were constructed using the adaptive cycle and panarchy frameworks. Furthermore, a new approach to general resilience indicators was developed, combining the “categories” of social-ecological systems (structure, behaviors and governance) with general resilience parameters (tightness of feedbacks, modularity, diversity, the amount of change a system can withstand, capacity for learning, and self-organizing behavior). The results indicate that there are strengths in the Finnish food system for building resilience. According to the experts, organic farms and larger farms are perceived as socially self-organized, which can promote innovations and new experimentation for adaptation to changing circumstances. In addition, organic farms are currently seen as the most ecologically self-regulated farms. There are also weaknesses in the Finnish food system restricting resilience building.
It is important to reach optimal redundancy, in which efficiency and resilience are in balance. In the whole food system, the retail sector will probably face the most dramatic changes in the future, especially when the panarchy scenarios and future images are considered. The profitability of farms is, and will remain, a critical cornerstone of overall resilience in primary production. All in all, the food system experts have very positive views concerning the resilience development of the Finnish food system in the future. Sometimes small and local is beautiful; sometimes large and international is more resilient. However, when the probability and desirability of the future images were questioned, there were significant deviations. It appears that experts do not always believe that desirable futures will materialize.
Abstract:
In this work, bromelain was recovered from ground pineapple stem and rind by means of precipitation with alcohol at low temperature. Bromelain is the name of a group of powerful protein-digesting, or proteolytic, enzymes that are particularly useful for reducing muscle and tissue inflammation and as a digestive aid. Temperature control is crucial to avoid irreversible protein denaturation and consequently to improve the quality of the recovered enzyme. The process was carried out alternately in two fed-batch pilot tanks: a glass tank and a stainless steel tank. Aliquots containing 100 mL of pineapple aqueous extract were fed into the tank. Inside the jacketed tank, the protein was exposed to unsteady operating conditions during the addition of the precipitating agent (ethanol 99.5%), because the "aqueous extract to ethanol" dilution ratio and the heat transfer area changed. The coolant flow rate was manipulated through a variable-speed pump. Fine-tuned conventional and adaptive PID controllers were implemented online using a fieldbus digital control system. The processing performance efficiency was enhanced, and so was the quality (enzyme activity) of the product.
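A minimal discrete PID loop of the kind described can be sketched as follows (the gains, sample time, and temperature/pump pairing are illustrative assumptions, not the paper's tuning):

```python
# Discrete PID controller: the output would set the coolant pump speed from
# the error between the temperature setpoint and the measured temperature.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measurement):
        err = setpoint - measurement
        self.integral += err * self.dt              # accumulate I term
        deriv = (err - self.prev_err) / self.dt     # finite-difference D term
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv
```

An adaptive variant, as used in the paper, would additionally retune the gains online as the dilution ratio and heat transfer area change during ethanol addition.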
Abstract:
Optimization of quantum measurement processes plays a pivotal role in carrying out better, i.e., more accurate or less disruptive, measurements and experiments on a quantum system. In particular, convex optimization, i.e., identifying the extreme points of the convex sets and subsets of quantum measuring devices, plays an important part in quantum optimization, since the typical figures of merit for measuring processes are affine functionals. In this thesis, we discuss results determining the extreme quantum devices and their relevance, e.g., in quantum-compatibility-related questions. In particular, we see that a compatible device pair in which one device is extreme can be joined into a single apparatus in an essentially unique way. Moreover, we show that the question of whether a pair of quantum observables can be measured jointly can often be formulated in a weaker form when some of the observables involved are extreme. Another major line of research treated in this thesis deals with the convex analysis of special restricted quantum device sets, covariance structures or, in particular, generalized imprimitivity systems. Some results on the structure of covariant observables and instruments are presented, as well as results identifying the extreme points of covariance structures in quantum theory. As a special case study, not published anywhere before, we study the structure of Euclidean-covariant localization observables for spin-0 particles. We also discuss the general form of Weyl-covariant phase-space instruments. Finally, certain optimality measures originating from convex geometry are introduced for quantum devices: boundariness, which measures how ‘close’ a quantum apparatus is to the algebraic boundary of the device set, and the robustness of incompatibility, which quantifies the level of incompatibility of a quantum device pair by measuring the highest amount of noise the pair tolerates without becoming compatible.
Boundariness is further associated with minimum-error discrimination of quantum devices, and the robustness of incompatibility is shown to behave monotonically under certain compatibility-non-decreasing operations. Moreover, the value of the robustness of incompatibility is given for a few special device pairs.
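One well-known special case makes the noise-threshold idea concrete (a sketch of the standard qubit result, not the thesis's general treatment): two unbiased dichotomic qubit observables with Bloch vectors a and b are jointly measurable iff |a + b| + |a - b| <= 2, so uniformly scaling both vectors by a noise parameter t gives a closed-form largest compatible t:

```python
# Largest uniform noise parameter t keeping a pair of unbiased dichotomic
# qubit observables (Bloch vectors a, b) jointly measurable, using the
# criterion |a + b| + |a - b| <= 2 applied to (t*a, t*b).

import math

def norm(v):
    return math.sqrt(sum(x * x for x in v))

def max_compatible_noise(a, b):
    """Largest t in [0, 1] such that (t*a, t*b) are jointly measurable."""
    s = norm([x + y for x, y in zip(a, b)]) + norm([x - y for x, y in zip(a, b)])
    return min(1.0, 2.0 / s)
```

For orthogonal spin directions (e.g. sigma_x versus sigma_z) this gives t = 1/sqrt(2), the textbook threshold; parallel directions are compatible without any added noise.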
Abstract:
This work presents a synopsis of efficient power management strategies for achieving the most economical power and energy consumption in multicore systems, FPGA and NoC platforms. A practical approach was taken in order to validate the significance of the Adaptive Power Management Algorithm (APMA) proposed for the system developed for this thesis project. The system comprises an arithmetic logic unit, up and down counters, an adder, a state machine and a multiplexer. The aims of this project were, firstly, to develop a system to be used for the power management work; secondly, to perform area and power analysis of the system on several scalable technology platforms (UMC 90 nm at 1.2 V, UMC 90 nm at 1.32 V and UMC 0.18 μm at 1.80 V) in order to examine the differences in the system's area and power consumption on these platforms; and thirdly, to explore strategies for reducing the system's power consumption and to propose an adaptive power management algorithm for doing so. The strategies introduced in this work comprise Dynamic Voltage and Frequency Scaling (DVFS) and task parallelism. After development, the system was run on an FPGA board (essentially a NoC platform) and on the technology platforms listed above. The system synthesis was successfully accomplished, the analysis of the simulated results shows that the system meets all functional requirements, and the power consumption and area utilization were recorded and analyzed in chapter 7 of this work.
This work extensively reviewed various strategies for managing power consumption drawn from quantitative research by many researchers and companies. It is a mixture of literature analysis and experimental laboratory work, and it condenses and presents the basic concepts of power management strategy from high-quality technical papers.
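The DVFS strategy reviewed above can be sketched as a simple policy over discrete voltage/frequency operating points (illustrative, not the proposed APMA; the operating points are hypothetical, loosely inspired by the 1.2 V and 1.32 V corners mentioned in the abstract):

```python
# Dynamic power scales roughly as P ~ C * V**2 * f, so lowering voltage and
# frequency together when utilization is low cuts power superlinearly.

# Hypothetical operating points: (voltage in volts, frequency in MHz).
LEVELS = [(0.9, 100), (1.2, 200), (1.32, 250)]

def pick_level(utilization):
    """Choose the lowest operating point whose frequency covers the demanded
    load, expressed as a fraction of the top frequency."""
    f_max = LEVELS[-1][1]
    for v, f in LEVELS:
        if f >= utilization * f_max:
            return (v, f)
    return LEVELS[-1]

def dynamic_power(v, f, c=1.0):
    """Approximate dynamic power for switched capacitance c."""
    return c * v**2 * f
```

A real adaptive policy would also account for level-switch latency and static leakage, but the V-squared dependence is what makes joint voltage/frequency scaling so effective.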