89 results for DYNAMIC PROGRAMMING


Relevance:

20.00%

Publisher:

Abstract:

The objective of this thesis is to identify the key and most suitable customer portfolio models and customer matrices for characterizing customer relationships. The study focuses on valuing customer relationships and identifying key accounts in the case company. The most central and suitable customer portfolio models are taken into account in the evaluation of customers. The theoretical part of the thesis presents the best-known and most widely used customer portfolio models and matrices based on the literature of the field. In addition, perspectives from the theories of relationship marketing, customer relationship management and product portfolios are combined with the customer portfolio models. The most important literature sources come from the fields of management and marketing. The empirical part of the thesis presents the case company and its current customer relationship management practice. Furthermore, improvements to the company's current method of valuing customer relationships are proposed, so that the calculation of customer relationship value corresponds as closely as possible to the company's current needs. A focus group interview is also used to determine the value of customer relationships. Key accounts are identified, and the situation is illustrated by placing the key accounts in a customer portfolio.

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this study was to increase the understanding of the role and nature of trust in asymmetric technology partnership formation. In the knowledge-based "learning race", knowledge is considered a primary source of competitive advantage. In the emerging ICT sector, the high pace of technological change, the convergence of technologies and industries, and the increasing complexity and uncertainty have forced even the largest players to seek cooperation for complementary knowledge and capabilities. Small technology firms need the complementary resources and legitimacy of large firms to grow and compete in the global marketplace. Most of the earlier research indicates, however, that partnerships with asymmetric size, managerial resources and cultures have failed. A basic assumption supported by earlier research was that trust is a critical factor in asymmetric technology partnership formation. Asymmetric technology partnership formation is a dynamic and multi-dimensional process, and consequently a holistic research approach was selected. The research issue was approached from different levels: the individual decision-maker, the firm, and the relationship between the parties. The impact of the dynamic environment and technology content was also analyzed. A multitheoretical approach and a qualitative research method, with in-depth interviews in five large ICT companies and eight small ICT companies, enabled a holistic and rich view of the research issue. The study contributes to the scarce understanding of the nature and evolution of trust in asymmetric technology partnership formation. It also sheds light on the specific nature of asymmetric technology partnerships. The partnerships were found to be tentative, and the diverse strategic intent of small and large technology firms appeared as a major challenge. The role of the boundary spanner was highlighted as a possibility to reconcile the incompatible organizational cultures. A shared vision was found to be a precondition for individual-based fast trust leading to intuitive decision-making and experimentation. The relationships were tentative, and they were continuously re-evaluated through the key actors' sense-making of the technology content, asymmetry and the dynamic environment. A multi-dimensional conceptualization of trust was created, and propositions on the role and nature of trust are given for further research.

Relevance:

20.00%

Publisher:

Abstract:

The objective of this study is to show that bone strains due to dynamic mechanical loading during physical activity can be analyzed using the flexible multibody simulation approach. Strains within the bone tissue play a major role in bone (re)modeling. Previous studies have shown that dynamic loading seems to be more important for bone (re)modeling than static loading. The finite element method has been used previously to assess bone strains. However, the finite element method may be limited to static analysis of bone strains, due to the expensive computation required for dynamic analysis, especially for a biomechanical system consisting of several bodies. Further, strain gauges implanted in vivo on bone surfaces have previously been used to quantify the mechanical loading environment of the skeleton. However, in vivo strain measurement requires invasive methodology, which is challenging and limited to certain regions of superficial bones only, such as the anterior surface of the tibia. In this study, an alternative numerical approach to analyzing in vivo strains, based on the flexible multibody simulation approach, is proposed. In order to investigate the reliability of the proposed approach, three three-dimensional musculoskeletal models, in which the right tibia is assumed to be flexible, are used as demonstration examples. The models are employed in a forward dynamics simulation in order to predict the tibial strains during a level walking exercise. The flexible tibial model is developed using the actual geometry of the subject's tibia, obtained from three-dimensional reconstruction of magnetic resonance images. An inverse dynamics simulation, based on motion capture data obtained from walking at a constant velocity, is used to calculate the desired contraction trajectory for each muscle. In the forward dynamics simulation, a proportional-derivative (PD) servo controller is used to calculate each muscle force required to reproduce the motion, based on the desired muscle contraction trajectory obtained from the inverse dynamics simulation. Experimental measurements are used to verify the models and to check their accuracy in replicating the realistic mechanical loading environment measured in the walking test. The strain results predicted by the models are consistent with in vivo strain measurements reported in the literature. In conclusion, the non-invasive flexible multibody simulation approach may be used as a surrogate for experimental bone strain measurement, and thus be of use in detailed strain estimation of bones in different applications. Consequently, the information obtained from the present approach might be useful in clinical applications, including optimizing implant design and devising exercises to prevent bone fragility, accelerate fracture healing and reduce osteoporotic bone loss.
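
As a rough illustration of the forward dynamics step described above, the following is a minimal sketch of a proportional-derivative servo controller tracking a desired muscle contraction trajectory. The gains, trajectories and toy plant are illustrative assumptions, not values from the study.

```python
import numpy as np

def pd_muscle_force(desired_len, desired_vel, actual_len, actual_vel,
                    kp=2000.0, kd=50.0):
    """PD control law: a force that tracks the desired contraction trajectory."""
    return kp * (desired_len - actual_len) + kd * (desired_vel - actual_vel)

# Example: track a sinusoidal contraction trajectory over one gait cycle.
t = np.linspace(0.0, 1.0, 1001)
dt = t[1] - t[0]
desired = 0.05 * np.sin(2 * np.pi * t)   # desired contraction length [m]
desired_vel = np.gradient(desired, t)    # desired contraction rate [m/s]

actual, actual_vel = 0.0, 0.0            # toy first-order plant state
for d, dv in zip(desired, desired_vel):
    force = pd_muscle_force(d, dv, actual, actual_vel)
    actual_vel = force / 1.0e4           # toy plant response to the applied force
    actual += actual_vel * dt
```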

Relevance:

20.00%

Publisher:

Abstract:

COD discharges out of processes have increased in line with rising brightness demands for mechanical pulp and papers. The share of lignin-like substances in COD discharges is on average 75%. In this thesis, a plant dynamic model was created and validated as a means to predict the COD loading and discharges out of a mill. The assays were carried out in an integrated paper mill producing mechanical printing papers. The objective of modeling the plant dynamics was to predict day averages of the COD load and the discharges out of the mill. Online data, such as 1) the levels of the large storage towers of pulp and white water, 2) pulp dosages, 3) production rates and 4) internal white water flows and discharges, were used to create transients in the solids and white water balances, referred to as "plant dynamics". A conversion coefficient between TOC and COD was verified and used to predict the COD flows to the waste water treatment plant from the measured TOC. The COD load was modeled with an uncertainty similar to that of the reference TOC sampling. The water balance of the waste water treatment was validated against the reference COD concentrations. The deviation of the COD predictions from the references was within the same range as that of the TOC predictions. The modeled yield losses and retention values of TOC in the pulping and bleaching processes, and the modeled fixing of colloidal TOC to solids between the pulping plant and the aeration basin of the waste water treatment plant, were similar to references presented in the literature. The validated water balances of the waste water treatment plant and the reduction model for lignin-like substances produced a valid prediction of the COD discharges out of the mill. A 30% increase in the release of lignin-like substances during production problems was observed in the pulping and bleaching processes, and the same increase was observed in the COD discharges out of the waste water treatment. In predicting the annual COD discharge, it was noticed that the reduction of lignin varies widely from year to year and from one mill to another. This made it difficult to compare the parameters of COD discharges validated in the plant dynamic simulation with those of another mill producing mechanical printing papers. However, the trend in COD discharges when moving from unbleached pulp towards high-brightness TMP remained valid.
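
The TOC-to-COD conversion at the heart of the load prediction can be sketched as follows; the coefficient and flow figures are illustrative placeholders, not the values verified in the thesis.

```python
# Predict the daily COD load to the waste water treatment plant from online
# TOC data via a conversion coefficient, as described above.
K_COD_PER_TOC = 3.0  # assumed kg COD per kg TOC (order of magnitude only)

def cod_load(flow_m3_per_d, toc_mg_per_l, k=K_COD_PER_TOC):
    """Daily COD load [kg/d] from flow [m3/d] and TOC concentration [mg/l]."""
    toc_load_kg_per_d = flow_m3_per_d * toc_mg_per_l / 1000.0
    return k * toc_load_kg_per_d

# Example: one day's average effluent flow and TOC concentration.
print(cod_load(flow_m3_per_d=20_000, toc_mg_per_l=250))  # -> 15000.0 kg COD/d
```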

Relevance:

20.00%

Publisher:

Abstract:

This paper is a literature review that describes the state of the art in the construction of permanent magnet generators and motors and discusses current and possible applications of these machines in industry. Permanent magnet machines are a well-known class of rotating and linear electric machines, used for many years in industrial applications. Particular interest in permanent magnet generators is connected with wind turbines, which are becoming increasingly popular. Geared and direct-driven permanent magnet generators are described. A classification of direct-driven permanent magnet generators is given. Design aspects of permanent magnet generators are presented. Permanent magnet generator designs for wind turbines are highlighted. Dynamics and vibration problems of permanent magnet generators covered in the literature are presented. The application of the finite element method to the solution of mechanical problems in the field of permanent magnet generators is discussed.

Relevance:

20.00%

Publisher:

Abstract:

Technical analysis of Low Voltage Direct Current (LVDC) distribution systems shows that LVDC transmission provides higher customer voltage quality. One of the problems in LVDC distribution networks is that converters are required at both ends of the DC line. Because the converters do not produce pure DC voltage but introduce fluctuations as well, large electrolytic capacitors are required to reduce voltage distortions on the DC side. This master's thesis focuses on calculating the required DC-link capacitance for LVDC transmission and on estimating the influence of different parameters on the voltage quality. The goal is to investigate methods for estimating the DC-link capacitance and its location in the transmission line.
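
One common sizing method, sketched below, is the energy-balance estimate C >= P*dt / (V_dc * dV): the capacitor must carry the load power over one ripple period while the peak-to-peak ripple stays within specification. The figures are illustrative assumptions, not results from the thesis.

```python
def dc_link_capacitance(power_w, v_dc, ripple_pp, ripple_freq_hz):
    """Required DC-link capacitance [F] to keep ripple below ripple_pp volts."""
    dt = 1.0 / ripple_freq_hz  # one ripple period [s]
    return power_w * dt / (v_dc * ripple_pp)

# Example: 16 kW load on a 750 V DC link, 5 % peak-to-peak ripple allowed,
# 300 Hz ripple (six-pulse rectifier on a 50 Hz grid).
c = dc_link_capacitance(power_w=16e3, v_dc=750.0,
                        ripple_pp=0.05 * 750.0, ripple_freq_hz=300.0)
print(f"{c * 1e3:.1f} mF")  # -> about 1.9 mF
```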

Relevance:

20.00%

Publisher:

Abstract:

Today's organizations must have the ability to react to rapid changes in the market. These rapid changes create pressure to continuously find new, efficient ways to organize work practices. Increased competition requires businesses to become more effective, to pay attention to the quality of management, and to make people understand their work's impact on the final result. The fundamentals of continuous improvement are the systematic and agile tackling of identified individual process constraints, and the fact that nothing ultimately improves without change. Successful continuous improvement requires management commitment, education, implementation, measurement, recognition and regeneration. These ingredients form the foundation both for breakthrough projects and for small-step ongoing improvement activities. One part of an organization's management system is its quality tools, which provide systematic methodologies for identifying problems, defining their root causes, finding solutions, gathering and sorting data, supporting decision-making, implementing changes, and many other management tasks. Organizational change management includes processes and tools for managing the people side of organizational change. These tools include a structured approach that can be used for the effective transition of organizations through change. When combined with an understanding of change management for individuals, these tools provide a framework for managing people in change.

Relevance:

20.00%

Publisher:

Abstract:

Passenger information on a train consists of the train's origin, intermediate and destination station information shown on the exterior side displays of the cars, together with the train and car sales numbers, and, inside the cars, of automatic announcements and the static and dynamic information shown on the cabin displays. In this work, a passenger information system is implemented for use in passenger trains. The train's data is entered into the system before the journey begins, after which the system operates automatically without requiring any action from the train crew. In exceptional situations, the crew can disable the system or select pre-programmed special announcements. The chosen implementation method was the C programming language on an embedded hardware platform designed for railway use and running the Linux operating system.
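
A minimal sketch of the automatic-operation idea described above. The real system was implemented in C on an embedded Linux platform; the class and method names here are hypothetical.

```python
class PassengerInfoSystem:
    def __init__(self, train_number, stations):
        self.train_number = train_number
        self.stations = stations  # ordered list of upcoming stops, loaded pre-trip
        self.next_stop = 0
        self.enabled = True       # crew can disable in exceptional situations

    def on_departure(self):
        """Announce the next station automatically, without crew action."""
        if self.enabled and self.next_stop < len(self.stations):
            print(f"Next station: {self.stations[self.next_stop]}")
            self.next_stop += 1

    def special_announcement(self, message):
        """Crew-selected pre-programmed announcement."""
        print(message)

pis = PassengerInfoSystem("IC 59", ["Tikkurila", "Tampere", "Oulu"])
pis.on_departure()  # -> Next station: Tikkurila
```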

Relevance:

20.00%

Publisher:

Abstract:

This study presents an automatic, computer-aided analytical method called Comparison Structure Analysis (CSA), which can be applied to different dimensions of music. The aim of CSA is first and foremost practical: to produce dynamic and understandable representations of musical properties by evaluating the prevalence of a chosen musical data structure through a musical piece. Such a comparison structure may refer to a mathematical vector, a set, a matrix or another type of data structure, or even a combination of data structures. CSA depends on an abstract systematic segmentation that allows for a statistical or mathematical survey of the data. To choose a comparison structure is to tune the apparatus to be sensitive to an exclusive set of musical properties. CSA settles somewhere between traditional music analysis and computer-aided music information retrieval (MIR). Theoretically defined musical entities, such as pitch-class sets, set-classes and particular rhythm patterns, are detected in compositions using pattern extraction and pattern comparison algorithms that are typical within the field of MIR. In principle, the idea of comparison structure analysis can be applied to any time-series-type data and, in the music-analytical context, to polyphonic as well as homophonic music. Tonal trends, set-class similarities, invertible counterpoints, voice-leading similarities, short-term modulations, rhythmic similarities and multiparametric changes in musical texture were studied. Since CSA allows for a highly accurate classification of compositions, its methods may be applicable to symbolic music information retrieval as well. The strength of CSA lies especially in the possibility of comparing observations concerning different musical parameters and of combining CSA with statistical and perhaps other music-analytical methods. The results of CSA are dependent on the quality of the similarity measure. New similarity measures for tonal stability, rhythmic similarity and set-class similarity were proposed. The most advanced results were attained by employing automated function generation (comparable to so-called genetic programming) to search for an optimal model for set-class similarity measurement. However, the results of CSA seem to agree strongly, regardless of the type of similarity function employed in the analysis.
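
The core idea of evaluating the prevalence of a comparison structure through a piece can be sketched as follows; the windowed Jaccard overlap used here is an illustrative stand-in for the thesis's similarity measures.

```python
# Slide a fixed window over a pitch sequence, form each window's pitch-class
# set, and measure its overlap with a chosen reference structure. The result
# is a curve showing how prevalent the structure is through the piece.
def pitch_class_set(pitches):
    return frozenset(p % 12 for p in pitches)

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 1.0

def comparison_curve(pitches, reference, window=8):
    ref = pitch_class_set(reference)
    return [jaccard(pitch_class_set(pitches[i:i + window]), ref)
            for i in range(len(pitches) - window + 1)]

# Example: compare a melody against the C major triad {0, 4, 7}.
melody = [60, 62, 64, 65, 67, 69, 71, 72, 71, 67, 64, 60]
print(comparison_curve(melody, reference=[0, 4, 7]))
```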

Relevance:

20.00%

Publisher:

Abstract:

Learning from demonstration is becoming increasingly popular as an efficient way of robot programming. Not only scientific interest serves as inspiration here, but also the possibility of producing machines that would find application in different areas of life: robots helping with the daily routine at home, high-performance automata in industry, or friendly toys for children. One way to teach a robot to fulfill complex tasks is to start with simple training exercises and combine them to form more difficult behavior. The objective of this Master's thesis work was to study robot programming with visual input. Dynamic movement primitives (DMPs) were chosen as the tool for motion learning and generation. Modeling a movement as a spring system driven by an external force that makes the system move, DMPs represent the motion as a set of non-linear differential equations. During the experiments, the properties of DMPs, such as temporal and spatial invariance, were examined. The effects of the DMP parameters, including the spring coefficient, damping factor and temporal scaling, on the generated trajectory were studied.
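
A minimal sketch of a discrete dynamic movement primitive in the spirit described above (after Ijspeert et al.): a damped spring pulled toward a goal, plus a phase-dependent forcing term that shapes the trajectory. The forcing term is left at zero here, where a learned term would be fitted from a demonstration; parameter values are illustrative.

```python
import numpy as np

def dmp_rollout(x0, g, K=150.0, D=25.0, tau=1.0, alpha_s=4.0,
                dt=0.001, steps=1000, forcing=lambda s: 0.0):
    x, v, s = x0, 0.0, 1.0
    traj = []
    for _ in range(steps):
        f = forcing(s)
        v += dt / tau * (K * (g - x) - D * v + f)  # transformation system
        x += dt / tau * v
        s += dt / tau * (-alpha_s * s)             # canonical system (phase)
        traj.append(x)
    return np.array(traj)

# Temporal scaling: doubling tau replays the same path at half the speed;
# changing g rescales the path spatially (the invariances examined above).
path = dmp_rollout(x0=0.0, g=1.0)
print(path[-1])  # converges near the goal g = 1.0
```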

Relevance:

20.00%

Publisher:

Abstract:

The skill of programming is a key asset for every computer science student. Many studies have shown that this is a hard skill to learn and that the outcomes of programming courses have often been substandard. Thus, a range of methods and tools have been developed to assist students' learning processes. One of the biggest fields in computer science education is the use of visualizations as a learning aid, and many visualization-based tools have been developed to aid the learning process during the last few decades. The studies conducted in this thesis focus on two visualization-based tools, TRAKLA2 and ViLLE. This thesis includes results from multiple empirical studies about what kinds of effects the introduction and usage of these tools have on students' opinions and performance, and what the implications are from a teacher's point of view. The results show that students preferred to do web-based exercises and felt that those exercises contributed to their learning. The usage of the tools motivated students to work harder during their courses, which showed in overall course performance and drop-out statistics. We have also shown that visualization-based tools can be used to enhance the learning process, and that one of the key factors is a higher and more active level of engagement (see the Engagement Taxonomy by Naps et al., 2002). Automatic grading accompanied by immediate feedback helps students to overcome obstacles during the learning process and to grasp the key elements of the learning task. Such tools can help us cope with the fact that many programming courses are overcrowded while teaching resources are limited: they allow us to tackle this problem by utilizing automatic assessment in exercises that are best suited to the web (like tracing and simulation), since this supports students' independent learning regardless of time and place. In summary, we can use our courses' resources more efficiently to increase the quality of the learning experience of the students and the teaching experience of the teacher, and even to increase the performance of the students. There are also methodological results from this thesis that contribute to developing insight into the conduct of empirical evaluations of new tools or techniques. When we evaluate a new tool, especially one accompanied by visualization, we need to give a proper introduction to it and to the graphical notation used by the tool. The standard procedure should also include capturing the screen with audio to confirm that the participants of the experiment are doing what they are supposed to do. By taking such measures in studies of the learning impact of visualization support, we can avoid drawing false conclusions from our experiments. As computer science educators, we face two important challenges. Firstly, we need to start delivering the message, in our own institutions and all over the world, about new, scientifically proven innovations in teaching like TRAKLA2 and ViLLE. Secondly, we have the relevant experience of conducting teaching-related experiments, and thus we can support our colleagues in learning the essential know-how of research-based improvement of their teaching. This approach can transform work on academic teaching into publications, and by utilizing it we can significantly increase the adoption of new tools and techniques and the overall knowledge of best practices.
In the future, we need to combine our forces and tackle these universal and common problems together by creating multi-national and multi-institutional research projects. We need to create a community and a platform in which we can share these best practices and at the same time conduct multi-national research projects easily.

Relevance:

20.00%

Publisher:

Abstract:

Western societies are faced with the fact that overweight, impaired glucose regulation and elevated blood pressure are already prevalent in pediatric populations. This will inevitably mean an increase in later manifestations of cardio-metabolic diseases. The dilemma has been suggested to stem from fetal life, and it is surmised that the early nutritional environment plays an important role in the process called programming. The aim of the present study was to characterize the early nutritional determinants associated with cardio-metabolic risk factors in fetuses, infants and children. Further, the study was designed to establish whether dietary counseling initiated in early pregnancy can modify this cascade. Healthy mother-child pairs (n=256) participating in a dietary intervention study were followed from early pregnancy to childhood. The intervention included detailed dietary counseling by a nutritionist, targeting saturated fat intake in excess of recommendations and fiber consumption below recommendations. Cardio-metabolic programming was studied by characterizing the offspring's cardio-metabolic risk factors, such as over-activation of the autonomic nervous system, elevated blood pressure and an adverse metabolic status (e.g. a high serum split proinsulin concentration). Fetal cardiac sympathovagal activation was measured during labor. Postnatally, the children's blood pressure was measured at six-month and four-year follow-up visits. Further, the infants' metabolic status was assessed by means of growth and serum biomarkers (32-33 split proinsulin, leptin and adiponectin) at the age of six months. This study proved that fetal cardiac sympathovagal activity was positively associated with maternal pre-pregnancy body mass index, indicating adverse cardio-metabolic programming in the offspring. Further, a reduced risk of high split proinsulin in infancy and lower blood pressure in childhood were found in those offspring whose mothers' weight gain and amount and type of dietary fats during pregnancy were as recommended. Of note, maternal dietary counseling from early pregnancy onwards could ameliorate the offspring's metabolic status by reducing the risk of a high split proinsulin concentration, although it had no effect on the other cardio-metabolic markers in the offspring. In the postnatal period, breastfeeding proved to entail benefits in cardio-metabolic programming. Finally, the recommended dietary protein and total fat content in the child's diet were important nutritional determinants reducing blood pressure at the age of four years. The intrauterine and immediate postnatal periods comprise a window of opportunity for interventions aiming to reduce the risk of cardio-metabolic disorders, and bring the prospect of achieving health benefits over one generation.

Relevance:

20.00%

Publisher:

Abstract:

The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find these difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So-called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure, and then incrementally extends the program in steps of adding code and proving after each addition that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers. Invariant-based programming is a new method that has not yet been evaluated in large-scale studies. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools to assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, automatic derivation and discharging of verification conditions, and interactive proofs. It is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover. Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for the total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and that allows errors in the program specification to be detected efficiently. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically, and lets the remaining conditions be proved interactively. The tool is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure. Conditions that are not discharged automatically may be proved interactively using the PVS proof assistant. The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. The programmer reduces a large verification problem with the aid of the tool into a set of smaller problems (lemmas), and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm. Tool-supported verification often has little to no presence in computer science (CS) curricula.
Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early and practically oriented programming courses. Our hypothesis is that verification could be introduced early in the CS education, and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first and second year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
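
As a runtime-checked illustration of the invariant-based workflow: state the invariant first, then add code and check after each step that the invariant still holds. Socos discharges these conditions statically with a theorem prover; the executable assertions below only mimic that discipline.

```python
def sum_array(a):
    i, total = 0, 0

    # Invariant: total equals the sum of a[0:i], and 0 <= i <= len(a).
    def invariant():
        return total == sum(a[:i]) and 0 <= i <= len(a)

    assert invariant()        # invariant established before the loop
    while i < len(a):
        total += a[i]
        i += 1
        assert invariant()    # re-checked after each code addition
    # On exit, i == len(a), so the invariant gives total == sum(a).
    return total

print(sum_array([3, 1, 4, 1, 5]))  # -> 14
```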

Relevance:

20.00%

Publisher:

Abstract:

Programming and mathematics are core areas of computer science (CS) and consequently also important parts of CS education. Introductory instruction in these two topics is, however, not without problems. Studies show that CS students find programming difficult to learn and that teaching mathematical topics to CS novices is challenging. One reason for the latter is the disconnection between mathematics and programming found in many CS curricula, which results in students not seeing the relevance of the subject for their studies. In addition, reports indicate that students' mathematical capability and maturity levels are dropping. The challenges faced when teaching mathematics and programming at CS departments can also be traced back to gaps in students' prior education. In Finland, the high school curriculum does not include CS as a subject; instead, the focus is on learning to use the computer and its applications as tools. Similarly, many of the mathematics courses emphasize the application of formulas, while logic, formalisms and proofs, which are important in CS, are avoided. Consequently, high school graduates are not well prepared for studies in CS. Motivated by these challenges, the goal of the present work is to describe new approaches to teaching mathematics and programming aimed at addressing these issues: Structured derivations is a logic-based approach to teaching mathematics in which formalisms and justifications are made explicit. The aim is to help students become better at communicating their reasoning using mathematical language and logical notation, at the same time as they become more confident with formalisms. The Python programming language was originally designed with education in mind and has a simple syntax compared to many other popular languages. The aim of using it in instruction is to address algorithms and their implementation in a way that allows the focus to be put on learning algorithmic thinking and programming instead of on learning a complex syntax. Invariant-based programming is a diagrammatic approach to developing programs that are correct by construction. The approach is based on elementary propositional and predicate logic, and makes explicit the underlying mathematical foundations of programming. The aim is also to show how mathematics in general, and logic in particular, can be used to create better programs.
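
To give a flavor of the structured-derivations format described above, here is a small calculational derivation of our own (not taken from the thesis), in which every step carries an explicit justification:

```latex
\begin{align*}
    & (x + 1)^2 - (x - 1)^2                       \\
={} & \{\ \text{expand both squares}\ \}          \\
    & (x^2 + 2x + 1) - (x^2 - 2x + 1)             \\
={} & \{\ \text{cancel } x^2 \text{ and } 1\ \}   \\
    & 4x
\end{align*}
```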