879 results for 3D printing, steel bars, calibration of design values, correlation


Relevance:

100.00%

Publisher:

Abstract:

Visualization is a relatively recent tool available to engineers for enhancing transportation project design through improved communication, decision making, and stakeholder feedback. Current visualization techniques include image composites, video composites, 2D drawings, drive-through or fly-through animations, 3D rendering models, virtual reality, and 4D CAD. These methods are used mainly to communicate within the design and construction team and between the team and external stakeholders. Use of visualization improves understanding of design intent and project concepts and facilitates effective decision making. However, visualization tools are typically used for presentation only in large-scale urban projects. Visualization is not widely accepted due to a lack of demonstrated engineering benefits for typical agency projects, such as small- and medium-sized projects, rural projects, and projects where external stakeholder communication is not a major issue. Furthermore, there is a perceived high cost of investment of both financial and human capital in adopting visualization tools. The most advanced visualization technique of virtual reality has only been used in academic research settings, and 4D CAD has been used on a very limited basis for highly complicated specialty projects. However, there are a number of less intensive visualization methods available which may provide some benefit to many agency projects. In this paper, we present the results of a feasibility study examining the use of visualization and simulation applications for improving highway planning, design, construction, and safety and mobility.

Relevance:

100.00%

Publisher:

Abstract:

Determination of fat-free mass (FFM) and fat mass (FM) is of considerable interest in the evaluation of nutritional status. In recent years, bioelectrical impedance analysis (BIA) has emerged as a simple, reproducible method used for the evaluation of FFM and FM, but the lack of reference values reduces its utility to evaluate nutritional status. The aim of this study was to determine reference values for FFM, FM, and %FM by BIA in a white population of healthy subjects, to observe the changes in these values with age, and to develop percentile distributions for these parameters. Whole-body resistance of 1838 healthy white men and 1555 women, aged 15-64 y, was determined by using four skin electrodes on the right hand and foot. FFM and FM were calculated according to formulas validated for the subject groups and analyzed for age decades. This is the first study to present BIA-determined age- and sex-specific percentiles for FFM, FM, and %FM for healthy subjects, aged 15-64 y. Mean FM and %FM increased progressively in men and after age 45 y in women. The results suggest that any weight gain noted with age is due to a gain in FM. In conclusion, the data presented as percentiles can serve as reference to evaluate the normality of body composition of healthy and ill subject groups at a given age.
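
The abstract does not state which validated prediction equations were applied; purely as a hedged illustration, the Python sketch below derives FFM, FM and %FM from whole-body resistance using one published single-frequency BIA equation for adults (assumed here to be the Kyle et al. 2001 equation), which may differ from the formulas actually used in this study.

    # Illustrative only: the study used formulas "validated for the subject groups",
    # which are not specified in the abstract. The equation below (assumed: Kyle et
    # al., 2001, single-frequency BIA for adults) is a stand-in for the example.

    def body_composition(height_cm, weight_kg, resistance_ohm, reactance_ohm, male):
        """Return (FFM in kg, FM in kg, %FM)."""
        ffm = (-4.104
               + 0.518 * height_cm**2 / resistance_ohm
               + 0.231 * weight_kg
               + 0.130 * reactance_ohm
               + 4.229 * (1 if male else 0))
        fm = weight_kg - ffm
        return ffm, fm, 100.0 * fm / weight_kg

    # Example: 175 cm, 70 kg man with R = 450 ohm, Xc = 50 ohm
    print(body_composition(175, 70.0, 450.0, 50.0, male=True))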

Relevance:

100.00%

Publisher:

Abstract:

The flourishing number of publications on the use of isotope ratio mass spectrometry (IRMS) in forensic science denotes the enthusiasm and the attraction generated by this technology. IRMS has demonstrated its potential to distinguish chemically identical compounds coming from different sources. Despite the numerous applications of IRMS to a wide range of forensic materials, its implementation in a forensic framework is less straightforward than it appears. In addition, each laboratory has developed its own strategy of analysis on calibration, sequence design, standards utilisation and data treatment, without a clear consensus. Through the experience acquired from research undertaken in different forensic fields, we propose a methodological framework of the whole process using IRMS methods. We emphasize the importance of considering isotopic results as part of a whole approach when applying this technology to a particular forensic issue. The process is divided into six different steps, which should be considered for a thoughtful and relevant application. The dissection of this process into fundamental steps, further detailed, enables a better understanding of the essential, though not exhaustive, factors that have to be considered in order to obtain results of sufficient quality and robustness to proceed to retrospective analyses or interlaboratory comparisons.
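
As a hedged illustration of the data-treatment step, the sketch below performs a generic two-point normalisation of measured delta values onto a reference scale using two standards; the standards and values are placeholders and do not reproduce any laboratory strategy discussed in the paper.

    # Generic two-point normalisation of measured delta values (per mil) onto a
    # reference scale, where delta = (R_sample / R_standard - 1) * 1000.
    # The standards and numbers below are placeholders, not data from the paper.

    def normalise(deltas_measured, std1, std2):
        """std1, std2: (measured, accepted) delta values of two reference materials."""
        m1, a1 = std1
        m2, a2 = std2
        slope = (a2 - a1) / (m2 - m1)
        return [a1 + slope * (d - m1) for d in deltas_measured]

    # Hypothetical sequence: two certified standards bracketing the samples
    samples = [-26.8, -27.1, -25.9]                       # measured delta 13C, per mil
    print(normalise(samples, std1=(-29.9, -30.0), std2=(-11.7, -11.4)))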

Relevance:

100.00%

Publisher:

Abstract:

The objective of this work was to estimate genetic parameters and to evaluate simultaneous selection for root yield and for adaptability and stability of cassava genotypes. The effects of genotypes were assumed as fixed and random, and the mixed model methodology (REML/Blup) was used to estimate genetic parameters and the harmonic mean of the relative performance of genotypic values (HMRPGV), for simultaneous selection purposes. Ten genotypes were analyzed in a complete randomized block design, with four replicates. The experiment was carried out in the municipalities of Altamira, Santarém, and Santa Luzia do Pará in the state of Pará, Brazil, in the growing seasons of 2009/2010, 2010/2011, and 2011/2012. Roots were harvested 12 months after planting, in all tested locations. Root yield had low coefficients of genotypic variation (4.25%) and broad-sense heritability of individual plots (0.0424), which resulted in low genetic gain. Due to the low genotypic correlation (0.15), genotype classification as to root yield varied according to the environment. Genotypes CPATU 060, CPATU 229, and CPATU 404 stood out as to their yield, adaptability, and stability.
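
For readers unfamiliar with the HMRPGV statistic, the sketch below shows the calculation once genotypic values per environment are available: each value is expressed relative to its environment mean and the harmonic mean is taken across environments. The genotype names echo the abstract, but the numbers are invented for illustration.

    # Sketch of the HMRPGV calculation from genotypic values (e.g. BLUPs) per
    # environment; the values below are invented, not the trial data of the study.
    import numpy as np

    # rows: genotypes, columns: environments (location x season combinations)
    genotypic_values = {
        "CPATU 060": [28.1, 25.4, 30.2],
        "CPATU 229": [27.5, 26.8, 29.0],
        "CPATU 404": [26.9, 27.2, 28.4],
    }

    env_means = np.mean(list(genotypic_values.values()), axis=0)

    def hmrpgv(values):
        rel = np.asarray(values) / env_means      # relative performance per environment
        return len(rel) / np.sum(1.0 / rel)       # harmonic mean across environments

    for genotype, values in genotypic_values.items():
        print(genotype, round(hmrpgv(values), 3))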

Relevance:

100.00%

Publisher:

Abstract:

An integrated system of design for manufacturing and assembly (DFMA) and internet-based collaborative design is presented to support product design, the manufacturing process, and assembly planning for an axial eccentric oil-pump. The presented system manages and schedules group-oriented collaborative activities. Design guidelines for internet-based collaborative design and DFMA are given. The components and manufacturing stages of the axial eccentric oil-pump are described in detail. The file formats of the presented system cover the data types for collaborative design of the product, assembly design, assembly planning and assembly system design. Product design and assembly planning can be carried out synchronously and intelligently, and they are integrated under internet-based collaborative design and DFMA. Technologies for collaborative modelling, collaborative manufacturing, and internet-based collaborative assembly of the specific pump construction are developed. A seven-level security scheme is presented to ensure the security of the internet-based collaborative design system.

Relevance:

100.00%

Publisher:

Abstract:

The aim of this Master's thesis was to obtain basic knowledge of the factors that affect ink drying on different paper surfaces in inkjet printing. The goal was to gather information on the inks used in the most common inkjet printing technologies, on how papers affect ink drying, and on the methods available for determining the factors governing ink drying. In addition, the purpose was to verify whether the DIGAT device, used for determining the absorption time of inkjet inks, can be used to determine and predict the drying of different inks on different paper surfaces, and to search for correlations between ink absorption time, technical paper properties and inkjet print quality. The literature part reviewed different inkjet printing methods, the inks used in them, and ink compositions. Paper-ink interactions and inkjet print quality were also examined. In the experimental part, ink absorption into paper was studied with the DIGAT device using six different inks. Technical paper properties and properties related to inkjet print quality were determined from the paper samples. Inkjet print quality was examined by printing a test image with three different printers: a Canon Bubble Jet i950, an HP DeskJet Cxi970 and an Epson Stylus C46. It was found that the DIGAT device is not suitable for determining ink absorption times on glossy samples; in this study, a sample with a gloss of 65% was too glossy to be measured with the DIGAT device. In addition, the absorption measurements showed that different inks set differently on the paper surface and that the pigment-based ink had the longest setting time. Ink absorption times were fastest on special inkjet paper and slowest on coated, dense papers. Correlations between ink absorption time, technical paper properties and inkjet print quality were difficult to detect; the results were ink- and printer-specific. Only a few technical paper properties were found to correlate well with ink absorption: Gurley-Hill porosity, the ash and calcium carbonate contents of the paper, and K&N ink absorption. Likewise, only a few correlations were found between inkjet print quality and ink absorption: density, mottling and bleeding. Based on this study, the DIGAT device is well suited to describing density, mottling and bleeding among the inkjet print quality properties, and it can thus be used to help predict drying time and its effect on these properties. Print-through properties cannot be studied with the DIGAT device, since they depend more on the basis weight, thickness and porosity of the paper than on ink absorption time. Of the technical paper properties, Gurley-Hill porosity, the ash and CaCO3 contents of the paper, and K&N ink absorption describe ink absorption time into paper well, whereas Cobb, HST and the polar and dispersive components do not. It appears that the test image currently in use at the UPM Research Centre is not suitable for monitoring the quality of high-speed printing. The test image works well with desktop printers and with conventional copy papers and inkjet papers intended for slow printing, but it does not reveal the interplay between printing speed and ink drying speed, so it is not suitable for characterizing high-speed printing.
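
The correlation screening described above amounts to pairwise correlation of measured absorption times against each paper property; a minimal sketch with invented data follows.

    # Minimal correlation screen between ink absorption time and paper properties.
    # The numbers are invented placeholders; the thesis data are not reproduced here.
    import numpy as np

    absorption_time_s = np.array([0.8, 1.5, 2.9, 4.2, 6.0])      # one value per paper sample
    properties = {
        "gurley_hill_porosity_s": np.array([12, 25, 60, 110, 180]),
        "ash_content_pct":        np.array([18, 15, 12, 10, 8]),
        "cobb_g_per_m2":          np.array([22, 24, 21, 23, 22]),
    }

    for name, values in properties.items():
        r = np.corrcoef(absorption_time_s, values)[0, 1]
        print(f"{name}: r = {r:.2f}")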

Relevance:

100.00%

Publisher:

Abstract:

The thin disk and fiber lasers are new solid-state laser technologies that offer a combination of high beam quality and a wavelength that is easily absorbed by metal surfaces, and they are expected to challenge the CO2 and Nd:YAG lasers in cutting of metals of thick sections (thickness greater than 2 mm). This thesis studied the potential of the disk and fiber lasers for cutting applications and the benefits of their better beam quality. The literature review covered the principles of the disk laser, high power fiber laser, CO2 laser and Nd:YAG laser as well as the principle of laser cutting. The cutting experiments were made with the disk, fiber and CO2 lasers using nitrogen as an assist gas. The test material was austenitic stainless steel of sheet thickness 1.3 mm, 2.3 mm, 4.3 mm and 6.2 mm for the disk and fiber laser cutting experiments, and of sheet thickness 1.3 mm, 1.85 mm, 4.4 mm and 6.4 mm for the CO2 laser cutting experiments. The experiments focused on the maximum cutting speeds with appropriate cut quality. Kerf width, cut edge perpendicularity and surface roughness were the cut characteristics used to analyze the cut quality. Attempts were made to draw conclusions on the influence of high beam quality on the cutting speed and cut quality. The disk and fiber laser cutting speeds were very high at 1.3 mm and 2.3 mm sheet thickness, and the cut quality was good. The disk and fiber laser cutting speeds were lower at 4.3 mm and 6.2 mm sheet thickness, but there was still a considerable percentage increase in cutting speed compared to the CO2 laser at similar sheet thickness. However, the cut quality at 6.2 mm thickness was not very good for the disk and fiber laser cutting experiments, but could probably be improved by proper selection of cutting parameters.
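
A first-order way to see why cutting speed drops so sharply with sheet thickness is a melt-and-blow power balance, in which the absorbed power must melt the material removed from the kerf per unit time. The sketch below applies this textbook estimate with rough property values for austenitic stainless steel; it is not the model or the data of the thesis.

    # First-order melt-and-blow estimate of maximum cutting speed:
    # absorbed power ~ kerf_width * thickness * speed * rho * (cp*(Tm - T0) + Lf).
    # Property values are rough textbook figures, not thesis parameters.

    def max_cutting_speed(power_w, absorptivity, kerf_m, thickness_m):
        rho = 7900.0                   # kg/m^3
        cp = 500.0                     # J/(kg K)
        t_melt, t_0 = 1723.0, 293.0    # K
        latent_fusion = 2.7e5          # J/kg
        energy_per_m3 = rho * (cp * (t_melt - t_0) + latent_fusion)
        return absorptivity * power_w / (kerf_m * thickness_m * energy_per_m3)

    # e.g. 2 kW beam absorbed at 40 %, 0.2 mm kerf, thin vs thick sheet
    for thickness in (1.3e-3, 6.2e-3):
        v = max_cutting_speed(2000.0, 0.4, 0.2e-3, thickness)
        print(f"{thickness*1e3:.1f} mm: about {v*60:.1f} m/min")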

Relevance:

100.00%

Publisher:

Abstract:

The ultimate goal of any research in the mechanism/kinematic/design area may be called predictive design, i.e. the optimisation of mechanism proportions in the design stage without requiring extensive life and wear testing. This is an ambitious goal and can be realised through development and refinement of numerical (computational) technology in order to facilitate the design analysis and optimisation of complex mechanisms, mechanical components and systems. As a part of the systematic design methodology, this thesis concentrates on kinematic synthesis (kinematic design and analysis) methods in the mechanism synthesis process. The main task of kinematic design is to find all possible solutions, in the form of structural parameters, that accomplish the desired requirements of motion. The main formulations of kinematic design can be broadly divided into exact synthesis and approximate synthesis formulations. The exact synthesis formulation is based on solving n linear or nonlinear equations in n variables, and the solutions are obtained by adopting closed-form classical or modern algebraic solution methods or by using numerical solution methods based on polynomial continuation or homotopy. The approximate synthesis formulation is based on minimising the approximation error by direct optimisation. The main drawbacks of the exact synthesis formulation are: (ia) limitations on the number of design specifications and (iia) failure in handling design constraints, especially inequality constraints. The main drawbacks of approximate synthesis formulations are: (ib) it is difficult to choose a proper initial linkage and (iib) it is hard to find more than one solution. Recent formulations for solving the approximate synthesis problem adopt polynomial continuation, providing several solutions, but they cannot handle inequality constraints. Based on practical design needs, a mixed exact-approximate position synthesis with two exact and an unlimited number of approximate positions has also been developed. The solution space is presented as a ground pivot map, but the pole between the exact positions cannot be selected as a ground pivot. In this thesis the exact synthesis problem of planar mechanisms is solved by generating all possible solutions for the optimisation process, including solutions in positive-dimensional solution sets, within inequality constraints on the structural parameters. Through the literature research it is first shown that the algebraic and numerical solution methods used in the research area of computational kinematics are capable of solving non-parametric algebraic systems of n equations in n variables but cannot handle the singularities associated with positive-dimensional solution sets. In this thesis the problem of positive-dimensional solution sets is solved by adopting the main principles from the mathematical research area of algebraic geometry for solving parametric algebraic systems of n equations in at least n+1 variables (parametric in the mathematical sense that all parameter values for which the system is solvable, including the degenerate cases, are considered). By applying the developed solution method to the dyadic equations in direct polynomial form for two to three precision points, it has been algebraically proved and numerically demonstrated that the map of the ground pivots is ambiguous and that the singularities associated with positive-dimensional solution sets can be solved.
The positive-dimensional solution sets associated with the poles might contain physically meaningful solutions in the form of optimal, defect-free mechanisms. Traditionally, the mechanism optimisation of hydraulically driven boom mechanisms is done at an early stage of the design process. This results in optimal component design rather than optimal system-level design. Modern mechanism optimisation at system level demands integration of kinematic design methods with mechanical system simulation techniques. In this thesis a new kinematic design method for hydraulically driven boom mechanisms is developed and integrated with mechanical system simulation techniques. The developed kinematic design method is based on combinations of the two-precision-point formulation and on optimisation of substructures (with mathematical programming techniques or with optimisation methods based on probability and statistics) using criteria calculated from the system-level response of multi-degree-of-freedom mechanisms. For example, by adopting the mixed exact-approximate position synthesis in direct optimisation (using mathematical programming techniques) with two exact positions and an unlimited number of approximate positions, the drawbacks (ia)-(iib) are eliminated. The design principles of the developed method are based on the design-tree approach to mechanical systems, and the design method is, in principle, capable of capturing the interrelationship between kinematic and dynamic synthesis simultaneously when the developed kinematic design method is integrated with mechanical system simulation techniques.
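
For context, the classical complex-number (standard-form) dyad equation underlying this synthesis problem can be solved linearly for three precision points once the crank rotations are chosen freely; the sketch below shows that textbook solution with invented task data, not the algebraic-geometry formulation developed in the thesis.

    # Textbook standard-form dyad synthesis for three precision points (motion
    # generation): W*(e^{i*beta_j} - 1) + Z*(e^{i*alpha_j} - 1) = delta_j, j = 2, 3.
    # The prescribed displacements/rotations and free choices below are invented.
    import numpy as np

    delta = np.array([1.0 + 0.8j, 2.1 + 1.1j])   # precision-point displacements
    alpha = np.deg2rad([12.0, 27.0])             # prescribed coupler rotations
    beta = np.deg2rad([25.0, 55.0])              # free-choice crank rotations

    A = np.array([[np.exp(1j * beta[0]) - 1, np.exp(1j * alpha[0]) - 1],
                  [np.exp(1j * beta[1]) - 1, np.exp(1j * alpha[1]) - 1]])
    W, Z = np.linalg.solve(A, delta)             # dyad vectors in the first position
    ground_pivot = -(W + Z)                      # fixed pivot relative to precision point 1
    print(W, Z, ground_pivot)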

Relevance:

100.00%

Publisher:

Abstract:

This research has focused on the development of a tuned systematic design methodology, which gives the best performance in a computer-aided environment and utilises a cross-technological approach, specially tested with and for laser-processed microwave mechanics. A tuned design process scheme is also presented. Because of the currently large production volumes of microwave and radio frequency mechanics, even slight improvements in design methodologies or manufacturing technologies would offer reasonable possibilities for cost reduction. The typical number of required iteration cycles could be reduced to one fifth of normal. The research area dealing with the methodologies is divided into function-oriented, performance-oriented or manufacturability-oriented product design. Alternatively, various approaches can be developed for customer-oriented, quality-oriented, cost-oriented or organisation-oriented design. However, the real need for improvements lies between these two extremes. This means that an effective methodology for designers should not be too limited (like performance-oriented design) or too general (like organisation-oriented design), but it should include the context of the design environment. This is the area on which the current research is focused. To test the developed tuned design methodology for laser processing (TDMLP) and the tuned optimising algorithm for laser processing (TOLP), seven different industrial product applications for microwave mechanics were designed, CAD-modelled and manufactured by laser in small production series. To verify that the performance of these products meets the required level and to ensure the objectiveness of the results, extensive laboratory tests were used for all designed prototypes. As an example, a Ku-band horn antenna can be laser processed from steel in 2 minutes while obtaining electrical performance comparable to that of classical aluminium units, and the residual resistance of a laser joint in steel could be limited to 72 milliohms.

Relevance:

100.00%

Publisher:

Abstract:

Software engineering is criticized for not being engineering or a 'well-developed' science at all. Software engineers seem not to know exactly how long their projects will last, what they will cost, and whether the software will work properly after release. Measurements have to be taken in software projects to improve this situation. It is of limited use to only collect metrics afterwards; the values of the relevant metrics have to be predicted, too. The predictions (i.e. estimates) form the basis for proper project management. One of the most painful problems in software projects is effort estimation. It has a clear and central effect on other project attributes like cost and schedule, and on product attributes like size and quality. Effort estimation can be used for several purposes. In this thesis only effort estimation in software projects for project management purposes is discussed. There is a short introduction to measurement issues, and some metrics relevant in the estimation context are presented. Effort estimation methods are covered quite broadly. The main new contribution of this thesis is the new estimation model that has been created. It makes use of the basic concepts of Function Point Analysis, but avoids the problems and pitfalls found in that method. It is relatively easy to use and learn. Effort estimation accuracy has improved significantly after taking this model into use. A major innovation related to the new estimation model is the identified need for hierarchical software size measurement. The author of this thesis has developed a three-level solution for the estimation model. All currently used size metrics are static in nature, but this new proposed metric is dynamic: it makes use of the increased understanding of the nature of the work as specification and design work proceed, and it thus 'grows up' along with the software project. The development of an effort estimation model is not possible without gathering and analyzing history data. However, there are many problems with data in software engineering. A major roadblock is the amount and quality of the data available. This thesis shows some useful techniques that have been successful in gathering and analyzing the data needed. An estimation process is needed to ensure that methods are used in a proper way, that estimates are stored, reported and analyzed properly, and that they are used for project management activities. A higher-level mechanism called a measurement framework is also introduced briefly. The purpose of the framework is to define and maintain a measurement or estimation process. Without a proper framework, the estimation capability of an organization declines; it requires effort even to maintain an achieved level of estimation accuracy. Estimation results over several successive releases are analyzed, and it is clearly seen that the new estimation model works and that the estimation improvement actions have been successful. The calibration of the hierarchical model is a critical activity. An example is shown to shed more light on the calibration and the model itself, together with remarks about the sensitivity of the model. Finally, an example of usage is shown.
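
The hierarchical, FPA-based model itself is not reproduced here; purely as an illustration of calibrating an estimation model against history data and checking its accuracy, the sketch below fits a conventional power-law model, effort = a * size^b, and reports the mean magnitude of relative error (MMRE) on invented history data.

    # Illustration only: fit effort = a * size^b to history data (log-log least
    # squares) and report MMRE. This is a generic model, not the hierarchical
    # FPA-based model developed in the thesis; the history data are invented.
    import numpy as np

    size_fp = np.array([120, 250, 400, 640, 900])          # e.g. function points
    effort_h = np.array([800, 1900, 3100, 5200, 7600])     # person-hours

    b, log_a = np.polyfit(np.log(size_fp), np.log(effort_h), 1)
    a = np.exp(log_a)
    predicted = a * size_fp**b
    mmre = np.mean(np.abs(predicted - effort_h) / effort_h)
    print(f"effort = {a:.2f} * FP^{b:.2f}, MMRE = {mmre:.2%}")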

Relevance:

100.00%

Publisher:

Abstract:

Mechanistic soil-crop models have become indispensable tools to investigate the effect of management practices on the productivity or environmental impacts of arable crops. Ideally these models may claim to be universally applicable because they simulate the major processes governing the fate of inputs such as fertiliser nitrogen or pesticides. However, because they deal with complex systems and uncertain phenomena, site-specific calibration is usually a prerequisite to ensure their predictions are realistic. This statement implies that some experimental knowledge on the system to be simulated should be available prior to any modelling attempt, and raises a tremendous limitation to practical applications of models. Because the demand for more general simulation results is high, modellers have nevertheless taken the bold step of extrapolating a model tested within a limited sample of real conditions to a much larger domain. While methodological questions are often disregarded in this extrapolation process, they are specifically addressed in this paper, in particular the issue of a priori model parameterisation. We thus implemented and tested a standard procedure to parameterise the soil components of a modified version of the CERES models. The procedure converts routinely-available soil properties into functional characteristics by means of pedo-transfer functions. The resulting predictions of soil water and nitrogen dynamics, as well as crop biomass, nitrogen content and leaf area index, were compared to observations from trials conducted in five locations across Europe (southern Italy, northern Spain, northern France and northern Germany). In three cases, the model's performance was judged acceptable when compared to experimental errors on the measurements, based on a test of the model's root mean squared error (RMSE). Significant deviations between observations and model outputs were however noted in all sites, and could be ascribed to various model routines. In decreasing importance, these were: water balance, the turnover of soil organic matter, and crop N uptake. A better match to field observations could therefore be achieved by visually adjusting related parameters, such as field-capacity water content or the size of soil microbial biomass. As a result, model predictions fell within the measurement errors in all sites for most variables, and the model's RMSE was within the range of published values for similar tests. We conclude that the proposed a priori method yields acceptable simulations with only a 50% probability, a figure which may be greatly increased through a posteriori calibration. Modellers should thus exercise caution when extrapolating their models to a large sample of pedo-climatic conditions for which they have only limited information.
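
The RMSE test referred to above compares the model's root mean squared error for a variable with the experimental error on the corresponding measurements; a minimal sketch with invented numbers follows.

    # Minimal version of the RMSE-based test: the model is judged acceptable for a
    # variable when its RMSE does not exceed the measurement error. Values invented.
    import numpy as np

    observed = np.array([2.10, 3.45, 4.80, 5.10])     # e.g. crop biomass, t/ha
    simulated = np.array([2.35, 3.10, 4.55, 5.60])
    measurement_error = 0.45                          # same unit as the variable

    rmse = np.sqrt(np.mean((simulated - observed) ** 2))
    print(f"RMSE = {rmse:.2f}",
          "acceptable" if rmse <= measurement_error else "not acceptable")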

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE: Body mass index (BMI) may cluster in space among adults and be spatially dependent. Whether and how BMI clusters evolve over time in a population is currently unknown. We aimed to determine the spatial dependence of BMI and its 5-year evolution in a Swiss general adult urban population, taking into account the neighbourhood-level and individual-level characteristics. DESIGN: Cohort study. SETTING: Swiss general urban population. PARTICIPANTS: 6481 georeferenced individuals from the CoLaus cohort at baseline (age range 35-74 years, period=2003-2006) and 4460 at follow-up (period=2009-2012). OUTCOME MEASURES: Body weight and height were measured by trained healthcare professionals with participants standing without shoes in light indoor clothing. BMI was calculated as weight (kg) divided by height squared (m(2)). Participants were geocoded using their postal address (geographic coordinates of the place of residence). Getis-Ord Gi statistic was used to measure the spatial dependence of BMI values at baseline and its evolution at follow-up. RESULTS: BMI was not randomly distributed across the city. At baseline and at follow-up, significant clusters of high versus low BMIs were identified and remained stable during the two periods. These clusters were meaningfully attenuated after adjustment for neighbourhood-level income but not individual-level characteristics. Similar results were observed among participants who showed a significant weight gain. CONCLUSIONS: To the best of our knowledge, this is the first study to report longitudinal changes in BMI clusters in adults from a general population. Spatial clusters of high BMI persisted over a 5-year period and were mainly influenced by neighbourhood-level income.
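
As an illustration of the spatial statistic used, the sketch below computes Getis-Ord Gi* z-scores from scratch for point-referenced BMI values with a binary fixed-distance band; the coordinates, BMI values and neighbourhood radius are invented, since the abstract does not specify the weighting scheme actually applied.

    # From-scratch Getis-Ord Gi* z-scores (Ord & Getis, 1995) for georeferenced BMI
    # values with a binary fixed-distance-band weight matrix. All data are invented.
    import numpy as np

    rng = np.random.default_rng(0)
    coords = rng.uniform(0, 10_000, size=(200, 2))    # metres, e.g. city coordinates
    bmi = rng.normal(25.5, 4.0, size=200)
    band = 1_000.0                                    # neighbourhood radius, metres

    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    w = (d <= band).astype(float)                     # Gi* includes the site itself

    n = len(bmi)
    xbar, s = bmi.mean(), bmi.std()
    wx = w @ bmi
    wsum = w.sum(axis=1)
    w2sum = (w ** 2).sum(axis=1)
    gi_star = (wx - xbar * wsum) / (s * np.sqrt((n * w2sum - wsum ** 2) / (n - 1)))

    print("hot spots (z > 1.96):", int((gi_star > 1.96).sum()))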

Relevance:

100.00%

Publisher:

Abstract:

In this paper, a manufacturability analysis and a collection of design aspects are made for a microwave test-fixture. Aspects of applying systematic design to the design and manufacturing of a microwave test-fixture are also analysed. Special questionnaires for the component and its machining are prepared in order to capture the information needed to ensure the DFM(A) aspects of the component. Aspects that make the microwave test-fixture easy to machine are collected. Material selection is discussed, and the stages of prototype manufacturing are presented.

Relevance:

100.00%

Publisher:

Abstract:

We live in an era defined by a wealth of open and readily available information, and the accelerated evolution of social, mobile and creative technologies. The provision of knowledge, once a primary role of educators, is now devolved to an immense web of free and readily accessible sources. Consequently, educators need to redefine their role not just 'from sage on the stage to guide on the side' but, as more and more voices insist, as 'designers for learning'. The call for such a repositioning of educators is heard from leaders in the field of technology-enhanced learning (TEL) and resonates well with the growing culture of design-based research in Education. However, it is still struggling to find a foothold in educational practice. We contend that the root causes of this discrepancy are the lack of articulation of design practices and methods, a shortage of tools and representations to support such practices, the lack of a culture of teacher-as-designer among practitioners, and insufficient theoretical development. The Art and Science of Learning Design (ASLD) explores the frameworks, methods, and tools available for teachers, technologists and researchers interested in designing for learning. Learning Design theories arising from research findings are explored, drawing upon research and practitioner experiences. The book then surveys current trends in the practices, methods, and methodologies of Learning Design. Highlighting the translation of theory into practice, it showcases some of the latest tools that support the learning design process itself.