69 results for dynamic barrier
Abstract:
With the shift towards many-core computer architectures, dataflow programming has been proposed as one potential solution for producing software that scales to a varying number of processor cores. Programming for parallel architectures is considered difficult, as the current popular programming languages are inherently sequential and introducing parallelism is typically left to the programmer. Dataflow, however, is inherently parallel, describing an application as a directed graph where nodes represent calculations and edges represent data dependencies in the form of queues. These queues are the only allowed communication between the nodes, making the dependencies between the nodes explicit and thereby also the parallelism. Once a node has sufficient inputs available, it can, independently of any other node, perform calculations, consume inputs, and produce outputs. Dataflow models have existed for several decades and have become popular for describing signal processing applications, as the graph representation is a very natural representation within this field; digital filters are typically described with boxes and arrows in textbooks as well. Dataflow is also becoming more interesting in other domains, and in principle any application working on an information stream fits the dataflow paradigm. Such applications include, among others, network protocols, cryptography, and multimedia applications. As an example, the MPEG group standardized a dataflow language called RVC-CAL to be used within reconfigurable video coding. Describing a video coder as a dataflow network instead of with conventional programming languages makes the coder more readable, as it describes how the video data flows through the different coding tools. While dataflow provides an intuitive representation for many applications, it also introduces some new problems that need to be solved in order for dataflow to be more widely used.
The explicit parallelism of a dataflow program is descriptive and enables improved utilization of available processing units; however, the independent nodes also imply that some kind of scheduling is required. The need for efficient scheduling becomes even more evident when the number of nodes is larger than the number of processing units and several nodes run concurrently on one processor core. There exist several dataflow models of computation, with different trade-offs between expressiveness and analyzability. These vary from rather restricted but statically schedulable models, with minimal scheduling overhead, to dynamic models where each firing requires a firing rule to be evaluated. The model used in this work, namely RVC-CAL, is a very expressive language, and in the general case it requires dynamic scheduling; however, the strong encapsulation of dataflow nodes enables analysis, and the scheduling overhead can be reduced by using quasi-static, or piecewise static, scheduling techniques. The scheduling problem is concerned with finding the few scheduling decisions that must be made at run-time, while most decisions are pre-calculated. The result is then a set of static schedules, as small as possible, that are dynamically scheduled. To identify these dynamic decisions and to find the concrete schedules, this thesis shows how quasi-static scheduling can be represented as a model checking problem. This involves identifying the relevant information needed to generate a minimal but complete model for model checking. The model must describe everything that may affect scheduling of the application while omitting everything else in order to avoid state space explosion. This kind of simplification is necessary to make the state space analysis feasible. For the model checker to find the actual schedules, a set of scheduling strategies is defined which is able to produce quasi-static schedulers for a wide range of applications.
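The firing discipline underlying these models, in which a node may fire only when its firing rule is satisfied by the tokens queued on its inputs, can be sketched in a few lines (a hypothetical Python `Actor` class, not RVC-CAL syntax):

```python
from collections import deque

class Actor:
    """A dataflow node: fires only when each input queue holds enough tokens."""
    def __init__(self, name, needs, fn):
        self.name = name
        self.needs = needs          # tokens required per input queue
        self.fn = fn                # computation performed on firing
        self.inputs = [deque() for _ in needs]
        self.output = deque()       # queue consumed by downstream actors

    def can_fire(self):
        # Firing rule: every input queue holds at least needs[i] tokens.
        return all(len(q) >= n for q, n in zip(self.inputs, self.needs))

    def fire(self):
        # Consume the required tokens and produce one output token.
        args = [[q.popleft() for _ in range(n)]
                for q, n in zip(self.inputs, self.needs)]
        self.output.append(self.fn(*args))

# A two-input adder: it fires once one token is queued on each input.
add = Actor("add", needs=[1, 1], fn=lambda a, b: a[0] + b[0])
add.inputs[0].append(2)
assert not add.can_fire()        # second input still empty
add.inputs[1].append(3)
if add.can_fire():
    add.fire()
print(add.output[0])             # -> 5
```

A dynamic scheduler would evaluate `can_fire` before every firing; a quasi-static schedule amortizes that test over a pre-computed sequence of firings.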
The results of this work show that actor composition with quasi-static scheduling can be used to transform dataflow programs to fit many different computer architectures with different types and numbers of cores. This, in turn, enables dataflow to provide a more platform-independent representation, as one application can be fitted to a specific processor architecture without changing the actual program representation. Instead, the program representation is optimized by the development tools, in the context of design space exploration, to fit the target platform. This work focuses on representing the dataflow scheduling problem as a model checking problem and is implemented as part of a compiler infrastructure. The thesis also presents experimental results as evidence of the usefulness of the approach.
Abstract:
Recently, due to increasing total construction and transportation costs and the difficulties associated with handling massive structural components or assemblies, there has been increasing financial pressure to reduce structural weight. Furthermore, advances in material technology, coupled with continuing advances in design tools and techniques, have encouraged engineers to vary and combine materials, offering new opportunities to reduce the weight of mechanical structures. These new lower-mass systems, however, are more susceptible to inherent imbalances, a weakness that can result in higher shock and harmonic resonances, which lead to poor structural dynamic performance. The objective of this thesis is the modeling of layered sheet steel elements to accurately predict their dynamic performance. During the development of the layered sheet steel model, a numerical modeling approach, Finite Element Analysis, and Experimental Modal Analysis are applied in building a modal model of the layered sheet steel elements. Furthermore, to gain a better understanding of the dynamic behavior of layered sheet steel, several binding methods have been studied to understand and demonstrate how a binding method affects the dynamic behavior of layered sheet steel elements compared to a single homogeneous steel plate. Based on the developed layered sheet steel model, the dynamic behavior of a lightweight wheel structure, to be used as the stator structure of an outer-rotor Direct-Drive Permanent Magnet Synchronous Generator designed for high-power wind turbines, is studied.
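As a toy illustration of the modal-analysis idea (not the layered-sheet model itself), the natural frequencies of an undamped two-degree-of-freedom mass-spring chain follow from det(K - w^2 M) = 0, which for a 2x2 system reduces to a quadratic in w^2; all values below are illustrative:

```python
import math

def modal_frequencies_2dof(m1, m2, k1, k2):
    """Natural frequencies (rad/s) of a grounded two-mass spring chain.

    Solves det(K - L*M) = 0 with L = w^2,
        M = diag(m1, m2),  K = [[k1 + k2, -k2], [-k2, k2]],
    which expands to the quadratic a*L^2 + b*L + c = 0 below.
    """
    a = m1 * m2
    b = -((k1 + k2) * m2 + k2 * m1)
    c = k1 * k2
    disc = math.sqrt(b * b - 4.0 * a * c)
    lams = sorted([(-b - disc) / (2 * a), (-b + disc) / (2 * a)])
    return [math.sqrt(lam) for lam in lams]

# Equal masses and stiffnesses: the classic w = sqrt((3 -/+ sqrt(5))/2 * k/m)
w1, w2 = modal_frequencies_2dof(1.0, 1.0, 100.0, 100.0)
print(round(w1, 3), round(w2, 3))   # -> 6.18 16.18
```

Finite element modal analysis does the same eigenvalue computation with many thousands of degrees of freedom, which is why a reduced experimental modal model is useful for validation.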
Abstract:
The main objective of the present study was to determine the best approach for coating paperboard trays at the pressing stage. The coating gives the paperboard enhanced barrier and mechanical properties. The whole process chain of the barrier coating development was studied in the research. The methodology applied includes finding the optimum temperature at which good adhesion and bonding are formed between the paperboard and the skin film. After coating, the mechanical properties, such as cracking and curling, and the barrier properties were evaluated.
Abstract:
Tool center point calibration is a known problem in industrial robotics. The major focus of academic research is to enhance the accuracy and repeatability of next-generation robots. However, operators of currently available robots are working within the limits of the robot's repeatability and require calibration methods suitable for these basic applications. This study was conducted in association with Stresstech Oy, which provides solutions for manufacturing quality control. Their sensor, based on the Barkhausen noise effect, requires accurate positioning. The accuracy requirement presents a tool center point calibration problem when measurements are executed with an industrial robot. Multiple options for automatic tool center point calibration are available on the market. Manufacturers provide customized calibrators for most robot types and tools; with the handmade sensors and multiple robot types that Stresstech uses, this would require a great deal of labor. This thesis introduces a calibration method that is suitable for all robots that have two free digital input ports. It builds on the traditional method of using a light barrier to detect the tool in the robot coordinate system; however, this method utilizes two parallel light barriers to simultaneously measure and detect the center axis of the tool. Rotations about two axes are defined by the center axis. The last rotation, about the Z-axis, is calculated for tools whose widths differ along the X- and Y-axes. The results indicate that this method is suitable for calibrating the geometric tool center point of a Barkhausen noise sensor. In the repeatability tests, a standard deviation within the robot's repeatability was achieved. The Barkhausen noise signal was also evaluated after recalibration, and the results indicate correct calibration. However, future studies should be conducted using a more accurate manipulator, since the method employs the robot itself as a measuring device.
Abstract:
Rolling element bearings are essential components of rotating machinery. The spherical roller bearing (SRB) is one variant seeing increasing use, because it is self-aligning and can support high loads. It is becoming increasingly important to understand how the SRB responds dynamically under a variety of conditions. This doctoral dissertation introduces a computationally efficient, three-degree-of-freedom SRB model that was developed to predict the transient dynamic behaviors of a rotor-SRB system. In the model, bearing forces and deflections were calculated as a function of contact deformation and bearing geometry parameters according to nonlinear Hertzian contact theory. The results reveal how some of the more important parameters, such as diametral clearance, the number of rollers, and the osculation number, influence ultimate bearing performance. Distributed defects, such as waviness of the inner and outer rings, and localized defects, such as inner and outer ring defects, are taken into consideration in the proposed model. Simulation results were verified against results obtained by applying the analytical formula for spherical roller bearing radial deflection and against commercial bearing analysis software. Following model verification, a numerical simulation was carried out successfully for a full rotor-bearing system to demonstrate the application of this newly developed SRB model in a typical real-world analysis. Accuracy of the model was verified by comparing measured to predicted behaviors for equivalent systems.
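The nonlinear Hertzian load-deflection relation at the heart of such bearing models can be sketched as follows; the exponent 10/9 is the classical value for line (roller) contact, and all numeric values are illustrative, not the bearing parameters used in the dissertation:

```python
import math

def hertz_contact_force(delta, k_c, n=10.0 / 9.0):
    """Single-contact Hertzian force F = k_c * delta**n.

    n = 10/9 is the classical exponent for line (roller) contact;
    a ball (point) contact would use n = 3/2. A contact transmits
    no tension, so non-positive deflection gives zero force.
    """
    return k_c * delta ** n if delta > 0 else 0.0

def radial_bearing_force(x, n_rollers, k_c, c_d):
    """Sum the roller contact forces around the ring for a radial
    displacement x of the inner ring; c_d is the diametral clearance.
    (Illustrative planar sketch, not the dissertation's 3-DOF model.)"""
    f = 0.0
    for i in range(n_rollers):
        phi = 2.0 * math.pi * i / n_rollers        # roller position
        delta = x * math.cos(phi) - c_d / 2.0      # local interference
        f += hertz_contact_force(delta, k_c) * math.cos(phi)
    return f

# Stiffening nonlinearity: doubling the displacement more than doubles
# the restoring force once the clearance has been taken up.
f1 = radial_bearing_force(10e-6, 20, 5.0e9, 10e-6)
f2 = radial_bearing_force(20e-6, 20, 5.0e9, 10e-6)
```

This clearance-plus-power-law structure is what makes the bearing force nonlinear and the rotor response sensitive to diametral clearance and roller count.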
Abstract:
In the literature, CO2 liquefaction is well studied with steady-state modeling. Steady-state modeling gives an overview of the process, but it does not give information about process behavior during transients. In this master's thesis, three dynamic models of CO2 liquefaction were made and tested: a straight multi-stage compression model and two compression-liquid-pumping models, one with and one without cold energy recovery. The models were built with the Apros software and were also used to verify that Apros is capable of modeling phase changes and the supercritical state of CO2. The models were verified against compressor manufacturer's data and simulation results presented in the literature. Of the models made in this thesis, the straight compression model was found to be the most energy efficient and the fastest to react to transients. Apros was also found to be a capable tool for dynamic liquefaction modeling.
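The difference between steady-state and dynamic modeling can be illustrated with a toy first-order lag integrated explicitly in time (a generic sketch, unrelated to Apros or the actual liquefaction circuits):

```python
def simulate_first_order(setpoint, tau, t_end, dt=0.01, y0=0.0):
    """Explicit-Euler integration of dy/dt = (setpoint - y) / tau.

    A steady-state model only reports the end value y = setpoint;
    the dynamic model additionally shows the transient, e.g. that
    the response needs roughly 3*tau to reach 95 % of the setpoint.
    """
    y, t, trace = y0, 0.0, []
    while t < t_end:
        y += dt * (setpoint - y) / tau   # Euler step
        t += dt
        trace.append((round(t, 4), y))
    return trace

trace = simulate_first_order(setpoint=1.0, tau=2.0, t_end=10.0)
# After three time constants (t = 6 s) the response exceeds 95 %.
```

Dynamic process simulators solve large systems of such differential equations simultaneously, which is what allows them to rank models by how fast they react to transients.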
Abstract:
Traditionally real estate has been seen as a good diversification tool for a stock portfolio due to the lower return and volatility characteristics of real estate investments. However, the diversification benefits of a multi-asset portfolio depend on how the different asset classes co-move in the short- and long-run. As the asset classes are affected by the same macroeconomic factors, interrelationships limiting the diversification benefits could exist. This master’s thesis aims to identify such dynamic linkages in the Finnish real estate and stock markets. The results are beneficial for portfolio optimization tasks as well as for policy-making. The real estate industry can be divided into direct and securitized markets. In this thesis the direct market is depicted by the Finnish housing market index. The securitized market is proxied by the Finnish all-sectors securitized real estate index and by a European residential Real Estate Investment Trust index. The stock market is depicted by OMX Helsinki Cap index. Several macroeconomic variables are incorporated as well. The methodology of this thesis is based on the Vector Autoregressive (VAR) models. The long-run dynamic linkages are studied with Johansen’s cointegration tests and the short-run interrelationships are examined with Granger-causality tests. In addition, impulse response functions and forecast error variance decomposition analyses are used for robustness checks. The results show that long-run co-movement, or cointegration, did not exist between the housing and stock markets during the sample period. This indicates diversification benefits in the long-run. However, cointegration between the stock and securitized real estate markets was identified. This indicates limited diversification benefits and shows that the listed real estate market in Finland is not matured enough to be considered a separate market from the general stock market. 
Moreover, while securitized real estate was shown to cointegrate with the housing market in the long-run, the two markets are still too different in their characteristics to be used as substitutes in a multi-asset portfolio. This implies that the capital intensiveness of housing investments cannot be circumvented by investing in securitized real estate.
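Cointegration means that two non-stationary series share a common stochastic trend, so some linear combination of them is stationary. The thesis applies Johansen's procedure; as a much simpler illustration of the idea, here is an Engle-Granger-flavoured sketch on synthetic data (not the thesis method or data):

```python
import random

def ols_slope(x, y):
    """Least-squares slope and intercept of y regressed on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    beta = (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))
    return beta, my - beta * mx

def ar1_coefficient(e):
    """Lag-1 autoregression coefficient of a residual series."""
    beta, _ = ols_slope(e[:-1], e[1:])
    return beta

# Two series sharing a common stochastic trend (cointegrated by
# construction): y_t = 2 * x_t + stationary noise.
random.seed(1)
x = [0.0]
for _ in range(2000):
    x.append(x[-1] + random.gauss(0, 1))          # random walk
y = [2.0 * v + random.gauss(0, 1) for v in x]

beta, alpha = ols_slope(x, y)
resid = [b - (alpha + beta * a) for a, b in zip(x, y)]
rho = ar1_coefficient(resid)
# beta is close to 2 and rho is well below 1: the residual
# mean-reverts, so the pair behaves as cointegrated.
```

When the residual instead behaves like a random walk (rho near 1), no stationary combination exists, which is the "no cointegration" outcome that signals long-run diversification benefits.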
Abstract:
The purpose of this thesis is to find the optimal heat recovery solution for Wärtsilä's dynamic district heating power plant in the German energy market, where the government pays subsidies for CHP plants in order to increase their share of domestic power production to 25 % by 2020. Dozens of different heat recovery connections have been simulated in order to determine the most efficient ones. The purpose is also to study the feasibility of the different heat recovery connections for the dynamic district heating power plant in the German market, taking into consideration the day-ahead electricity prices, district heating network temperatures, and CHP subsidies. The auxiliary cooling, dynamic operation, and cost efficiency of the power plant are also investigated.
Abstract:
This thesis studies the impact of the latest Russian crisis on global markets, and especially on Central and Eastern Europe. The results are compared to other shocks and crises over the last twenty years to see how significant they have been. The cointegration process of Central and Eastern European financial markets is also reviewed and updated. Using three separate conditional correlation GARCH models, the latest crisis is not found to have initiated surges in conditional correlations similar to those of previous crises over the last two decades. Market cointegration for Central and Eastern Europe is found to have stalled somewhat after the initial correlation increases that followed EU accession.
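The conditional correlation GARCH models mentioned above build on the univariate GARCH(1,1) variance recursion; a minimal sketch of that recursion (illustrative parameters, not estimates from the thesis data):

```python
def garch11_variance(returns, omega, alpha, beta, var0=None):
    """Conditional variance path of a GARCH(1,1) model:

        sigma2_t = omega + alpha * r_{t-1}**2 + beta * sigma2_{t-1}

    Started at the unconditional variance omega / (1 - alpha - beta)
    unless var0 is given. (Univariate recursion only; multivariate
    conditional-correlation models are layered on top of it.)
    """
    var = var0 if var0 is not None else omega / (1.0 - alpha - beta)
    path = [var]
    for r in returns[:-1]:
        var = omega + alpha * r * r + beta * var
        path.append(var)
    return path

# A large return shock raises next-period conditional variance, which
# then decays at rate (alpha + beta) back toward the long-run level.
path = garch11_variance([0.0, 5.0, 0.0, 0.0, 0.0],
                        omega=0.1, alpha=0.1, beta=0.8)
```

A crisis shows up in such models as a jump in conditional variances and, in the multivariate versions, in the conditional correlations between markets.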
Abstract:
Guided by the social-ecological conceptualization of bullying, this thesis examines the implications of classroom and school contexts—that is, students’ shared microsystems—for peer-to-peer bullying and antibullying practices. Included are four original publications, three of which are empirical studies utilizing data from a large Finnish sample of students in the upper grade levels of elementary school. Both self- and peer reports of bullying and victimization are utilized, and the hierarchical nature of the data collected from students nested within school ecologies is accounted for by multilevel modeling techniques. The first objective of the thesis is to simultaneously examine risk factors for victimization at individual, classroom, and school levels (Study I). The second objective is to uncover the individual- and classroom-level working mechanisms of the KiVa antibullying program which has been shown to be effective in reducing bullying problems in Finnish schools (Study II). Thirdly, an overview of the extant literature on classroom- and school-level contributions to bullying and victimization is provided (Study III). Finally, attention is paid to the assessment of victimization and, more specifically, to how the classroom context influences the concordance between self- and peer reports of victimization (Study IV). Findings demonstrate the multiple ways in which contextual factors, and importantly students’ perceptions thereof, contribute to the bullying dynamic and efforts to counteract it. Whereas certain popular beliefs regarding the implications of classroom and school contexts do not receive support, the role of peer contextual factors and the significance of students’ perceptions of teachers’ attitudes toward bullying are highlighted. Directions for future research and school-based antibullying practices are suggested.
Abstract:
The increasing use of energy, food, and materials by the growing world population is leading to a situation where alternative solutions from renewable carbon resources are sought after. The growing use of plastics depends on raw-oil production, which is politically governed, and the oil refining required for polymer manufacturing is not sustainable in terms of carbon footprint. The amount of packaging is also increasing. Packaging utilises not only cardboard and paper but also plastics. The synthetic petroleum-derived plastics and inner coatings in food packaging can be substituted with polymeric materials from renewable resources. The trees in Finnish forests constitute a huge resource, which ought to be utilised more effectively than it is today. One underutilised component of the forests is the wood-derived hemicelluloses, although spruce O-acetyl-galactoglucomannans (GGMs) have previously shown high potential for material applications and can be recovered on a large scale. Hemicelluloses are hydrophilic in their native state, which restricts their use in food packaging for non-dry items. To cope with this challenge, we intended to make GGMs more hydrophobic or amphiphilic by chemical grafting, with the aim of using them for barrier applications. Methods of esterification with anhydrides and cationic etherification with a trimethyl ammonium moiety were established. A method of controlled synthesis was developed to obtain the desired properties by altering the temperature, the reaction time, the quantity of the reagent, and even the solvent used for purification of the products. Numerous analytical tools, such as NMR, FTIR, SEC-MALLS/RI, MALDI-TOF-MS, RP-HPLC, and polyelectrolyte titration, were used to evaluate the products from different perspectives and to acquire parallel proofs of their chemical structure.
Modified GGMs with different degrees of substitution, and the correspondingly different levels of hydrophobicity, were applied as coatings on cartonboard and on nanofibrillated cellulose-GGM films to provide barrier functionality. Water dispersibility in processing was maintained with GGM esters of low DS. The use of chemically functionalised GGMs was evaluated for barriers against water, oxygen, and grease for food packaging purposes. The results clearly show that GGM derivatives have high potential to function as a barrier material in food packaging.