862 results for machine tools and accessories


Relevance: 100.00%

Abstract:

Non-invasive documentation methods such as surface scanning and radiological imaging are gaining importance in the forensic field. These three-dimensional technologies provide digital 3D data, which are processed and handled on the computer. However, the sense of touch is lost in this virtual approach. A haptic device restores the sense of touch for handling and feeling digital 3D data. The multifunctional application of a haptic device for forensic work is evaluated and illustrated in three different cases: the non-invasive representation of bone fractures of the lower extremities caused by traffic accidents; the comparison of bone injuries with the presumed injury-inflicting instrument; and, in a gunshot case, the identification of the gun by its muzzle imprint and the reconstruction of the holding position of the gun. The 3D models of the bones are generated from Computed Tomography (CT) images. The 3D models of the exterior injuries, the injury-inflicting tools and the bone injuries, where higher resolution is necessary, are created by optical surface scanning. The haptic device is used in combination with the software FreeForm Modelling Plus to touch the surface of the 3D models in order to feel minute injuries and tool surfaces, to reposition displaced bone parts and to compare an injury-causing instrument with an injury. The repositioning of 3D models in a reconstruction is easier, faster and more precise when the sense of touch and user-friendly movement in 3D space are available. For presentation purposes, the fracture lines of bones are coloured. This work demonstrates that the haptic device is a suitable and efficient tool in forensic science, and that it offers a new way of handling digital data in virtual 3D space.
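
As a rough illustration of the first modelling step described above, the sketch below extracts a triangulated bone surface from a CT volume with a marching-cubes isosurface; the threshold value, array shapes and function names are illustrative assumptions, not the workflow of the cited study.

```python
# Hypothetical sketch: extracting a bone surface mesh from a CT volume, the
# kind of 3D model the abstract describes feeding into a haptic tool.
# The HU threshold and the synthetic volume below are illustrative only.
import numpy as np
from skimage import measure

def ct_to_bone_mesh(ct_volume: np.ndarray, bone_threshold: float = 300.0):
    """Extract an isosurface (vertices, faces, normals) at a bone-like threshold."""
    verts, faces, normals, _ = measure.marching_cubes(ct_volume, level=bone_threshold)
    return verts, faces, normals

if __name__ == "__main__":
    # Synthetic stand-in for a CT stack: a bright sphere in a dark volume.
    z, y, x = np.mgrid[-32:32, -32:32, -32:32]
    volume = 1000.0 * ((x**2 + y**2 + z**2) < 20**2)
    verts, faces, _ = ct_to_bone_mesh(volume)
    print(f"mesh: {len(verts)} vertices, {len(faces)} faces")
```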

Relevance: 100.00%

Abstract:

Metals price risk management is a key issue in metal markets because of the uncertainty of commodity price fluctuations, exchange rates and interest rate changes, and the resulting large price risk for both metals producers and consumers. It is therefore taken into account by all participants in metal markets, including producers, consumers, merchants, banks, investment funds, speculators and traders. Managing price risk provides stable income for both producers and consumers, and so increases the chance that a firm will invest in attractive projects. The purpose of this research is to evaluate risk management strategies in the copper market.

The main tools and strategies of price risk management are hedging and other derivatives such as futures contracts, swaps and options contracts. Hedging is a transaction designed to reduce or eliminate price risk. Derivatives are financial instruments whose returns are derived from other financial instruments, and they are commonly used for managing financial risks. Although derivatives have existed in some form for centuries, their growth has accelerated rapidly during the last 20 years, and they are now widely used by financial institutions, corporations, professional investors and individuals. This project focuses on the over-the-counter (OTC) market and its products, such as exotic options, particularly Asian options.

The first part of the project describes basic derivatives and risk management strategies. It also discusses basic concepts of spot and futures (forward) markets, the benefits and costs of risk management, and the risks and rewards of positions in derivative markets. The second part considers the valuation of commodity derivatives. Here, the options pricing model DerivaGem is applied to Asian call and put options on London Metal Exchange (LME) copper, because it is important to understand how Asian options are valued and to compare theoretical option values with observed market values.

Predicting future trends in copper prices is essential for managing market price risk successfully. The third part therefore discusses econometric commodity models. Based on this literature review, the fourth part of the project reports the construction and testing of an econometric model designed to forecast the monthly average price of copper on the LME. More specifically, this part shows how LME copper prices can be explained by means of a simultaneous-equation structural model (two-stage least squares regression) connecting supply and demand variables. The simultaneous econometric model built for the copper industry is

$$
\begin{cases}
Q_t^D = e^{-5.0485}\, P_{t-1}^{-0.1868}\, GDP_t^{1.7151}\, e^{0.0158\, IP_t} \\
Q_t^S = e^{-3.0785}\, P_{t-1}^{0.5960}\, T_t^{0.1408}\, P_{OIL,t}^{-0.1559}\, USDI_t^{1.2432}\, LIBOR_{t-6}^{-0.0561} \\
Q_t^D = Q_t^S
\end{cases}
$$

which yields the price equation

$$
P_{t-1}^{CU} = e^{-2.5165}\, GDP_t^{2.1910}\, e^{0.0202\, IP_t}\, T_t^{-0.1799}\, P_{OIL,t}^{0.1991}\, USDI_t^{-1.5881}\, LIBOR_{t-6}^{0.0717}
$$

where Q_t^D and Q_t^S are world demand for and supply of copper at time t, respectively. P_{t-1} is the lagged price of copper, which is the focus of the analysis in this part. GDP_t is world gross domestic product at time t, representing aggregate economic activity; in addition, global industrial production growth, denoted IP_t, is included in the model.
T_t is a time variable, which serves as a useful proxy for technological change. The price of oil at time t, P_{OIL,t}, proxies the cost of energy in producing copper. USDI_t is the U.S. dollar index at time t, an important variable for explaining copper supply and copper prices. Finally, LIBOR_{t-6} is the 6-month-lagged 1-year London Interbank Offered Rate. Although the model may be applicable to other base metals' industries, omitted exogenous variables, such as the price of a substitute or a combined variable related to substitute prices, have not been considered in this study. Based on this econometric model, and using Monte-Carlo simulation, the probabilities that the monthly average copper prices in 2006 and 2007 will exceed a given option strike price are determined. The final part evaluates risk management strategies, including option strategies, metal swaps and simple options, in relation to the simulation results. Basic option strategies, such as bull spreads, bear spreads and butterfly spreads created from both call and put options for 2006 and 2007, are evaluated, and each risk management strategy for 2006 and 2007 is analysed on the basis of the day's data and the price prediction model. Applications stemming from this project include valuing Asian options, developing a copper price prediction model, forecasting and planning, and decision making for price risk management in the copper market.
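
To make the valuation step concrete, here is a minimal Monte-Carlo sketch for an arithmetic-average Asian call under geometric Brownian motion. It is not DerivaGem's implementation, and the spot, strike, rate and volatility figures are hypothetical.

```python
# Minimal sketch: Monte-Carlo pricing of an arithmetic-average Asian call
# under geometric Brownian motion. NOT DerivaGem's model; all market
# parameters below are illustrative assumptions.
import numpy as np

def asian_call_mc(s0, strike, rate, sigma, maturity,
                  n_steps=252, n_paths=20_000, seed=42):
    rng = np.random.default_rng(seed)
    dt = maturity / n_steps
    # Simulate log-price increments for all paths at once.
    z = rng.standard_normal((n_paths, n_steps))
    log_paths = np.cumsum((rate - 0.5 * sigma**2) * dt
                          + sigma * np.sqrt(dt) * z, axis=1)
    paths = s0 * np.exp(log_paths)
    avg_price = paths.mean(axis=1)  # arithmetic average over the fixing dates
    payoff = np.maximum(avg_price - strike, 0.0)
    return np.exp(-rate * maturity) * payoff.mean()

# Example: a 1-year Asian call on copper in USD/tonne (numbers hypothetical).
print(round(asian_call_mc(s0=6000, strike=6200, rate=0.05,
                          sigma=0.25, maturity=1.0), 2))
```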

Relevance: 100.00%

Abstract:

Chapter 1 introduces the basic tools and mechanics used within this thesis. Most of the definitions used in the thesis are given there, together with a basic survey of topics in graph theory and design theory pertinent to the topics studied. In Chapter 2, we are concerned with fixed block configuration group divisible designs, GDD(n, m, k; λ1, λ2). We study those GDDs in which each block has configuration (s, t), that is, GDDs in which each block has exactly s points from one of the two groups and t points from the other. Chapter 2 begins with an overview of previous results and constructions for small group sizes and block sizes 3, 4 and 5, and is largely devoted to constructions and results for GDDs with two groups and block size 6. We show that the necessary conditions are sufficient for the existence of GDD(n, 2, 6; λ1, λ2) with fixed block configuration (3, 3). For configuration (1, 5), we give minimal or near-minimal index constructions for all group sizes n ≥ 5 except n = 10, 15, 160 and 190. For configuration (2, 4), we provide constructions for several families of GDD(n, 2, 6; λ1, λ2)s. Chapter 3 addresses the characterization of (3, r)-regular graphs. We begin with previous results on the well-studied class of (2, r)-regular graphs and some results on the structure of large (t, r)-regular graphs. We then completely characterize all (3, 1)-regular and (3, 2)-regular graphs, and sharpen existing bounds on the order of large (3, r)-regular graphs of a certain form for r ≥ 3. Finally, the appendix gives computational data resulting from Sage and C programs used to generate the (3, 3)-regular graphs on fewer than 10 vertices.
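
For readers who want to experiment, the following brute-force checker assumes the definition of (t, r)-regularity suggested by the abstract's usage, namely that every independent set S of t vertices satisfies |N(S)| = r; if the thesis uses a variant definition, the predicate below would need adjusting.

```python
# Brute-force check of (t, r)-regularity, ASSUMING the definition that every
# independent set S of size t has open neighborhood of size exactly r.
# A sketch for small graphs only; the thesis used Sage and C for generation.
from itertools import combinations
import networkx as nx

def is_t_r_regular(graph: nx.Graph, t: int, r: int) -> bool:
    for subset in combinations(graph.nodes, t):
        if any(graph.has_edge(u, v) for u, v in combinations(subset, 2)):
            continue  # skip: not an independent set
        neighborhood = set().union(*(graph[v] for v in subset))
        if len(neighborhood) != r:
            return False
    return True

print(is_t_r_regular(nx.cycle_graph(5), 2, 3))  # True: C5 is (2, 3)-regular
```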

Relevance: 100.00%

Abstract:

Prediction of radiated fields from transmission lines has not previously been studied from a panoptical power system perspective. The application of BPL (broadband over power line) technologies to overhead transmission lines would benefit greatly from an ability to simulate real power system environments, not limited to the transmission lines themselves. Presently, circuit-based transmission line models used by EMTP-type programs utilize Carson's formula for a waveguide parallel to an interface. This formula is not valid for calculations at high frequencies when the effects of earth return currents are considered. This thesis explains the challenges of developing improved models, explores an approach that combines circuit-based and electromagnetics modeling to predict radiated fields from transmission lines, exposes inadequacies of existing simulation tools, and suggests methods for extending the validity of transmission line models into very high frequency ranges.

Electromagnetics programs are commonly used to study radiated fields from transmission lines. However, the approach proposed here is also able to incorporate the other components of a power system through the combined use of EMTP-type models. Carson's formulas address the series impedance of electrical conductors above and parallel to the earth. These equations have been analyzed to expose their inherent assumptions and implications, and their lack of validity at higher frequencies has been demonstrated, showing the need to replace Carson's formulas for these types of studies.

This body of work leads to several conclusions about the relatively new study of BPL. Foremost, there is a gap in modeling capabilities which has been bridged through the integration of circuit-based and electromagnetics modeling, allowing more realistic prediction of BPL performance and radiated fields. The proposed approach is limited in its scope of validity by the formulas used by EMTP-type software. To extend the range of validity, a new set of equations must be identified and implemented in the approach, and several potential methods of implementation have been explored. Though an appropriate set of equations has not yet been identified, further research in this area will benefit from a clear depiction of the next important steps and how they can be accomplished.
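
Carson's full earth-return correction is an infinite integral; a widely used closed-form engineering approximation (the Deri complex-penetration-depth formula) is sketched below to illustrate the kind of per-unit-length impedance computation at issue. This is neither Carson's exact series nor the improved high-frequency formulation the thesis calls for, and the conductor data are invented.

```python
# Sketch: per-unit-length self-impedance with earth return for a conductor at
# height h above lossy ground, via the Deri complex-penetration-depth
# approximation to Carson's integral. Conductor and soil data are made up.
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability, H/m

def self_impedance_earth_return(freq_hz, height_m, radius_m, earth_conductivity):
    """Z' in ohm/m: external self-impedance including the earth-return term."""
    omega = 2 * np.pi * freq_hz
    p = 1.0 / np.sqrt(1j * omega * MU0 * earth_conductivity)  # complex depth
    return 1j * omega * MU0 / (2 * np.pi) * np.log(2 * (height_m + p) / radius_m)

# 15 m high conductor, 1.5 cm radius, typical soil (0.01 S/m), power vs. BPL band
for f in (60.0, 10e6):
    z = self_impedance_earth_return(f, 15.0, 0.015, 0.01)
    print(f"{f:>10.0f} Hz: Z' = {z.real:.6f} + j{z.imag:.6f} ohm/m")
```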

Relevance: 100.00%

Abstract:

Chapter 1 introduces the basic tools and mechanics used within this thesis, touches on some historical uses and background, and contains the majority of the definitions. In Chapter 2 we consider the question of whether one can decompose λ copies of monochromatic Kv into copies of Kk such that each copy of Kk contains at most one edge from each Kv; this is called a proper edge coloring (Hurd, Sarvate [29]). The majority of this chapter is a wide variety of examples explaining the constructions used in Chapters 3 and 4. In Chapters 3 and 4 we investigate how to properly color BIBD(v, k, λ) for k = 4 and 5. Besides direct constructions of relatively small BIBDs, we also prove some of the generalized constructions used within. In Chapter 5 we present an alternate solution to the problems of Chapters 3 and 4: a purely graph-theoretical approach using matchings, augmenting paths, and theorems about the edge-chromatic number is used to develop a theorem that covers all possible cases. We also discuss how this method performs compared to the methods of Chapters 3 and 4. In Chapter 6, we switch topics to Latin rectangles that have the same number of symbols and an equivalently sized matrix as Latin squares. Suppose ab = n². We define an equitable Latin rectangle as an a × b matrix on a set of n symbols where each symbol appears either ⌊b/n⌋ or ⌈b/n⌉ times in each row of the matrix and either ⌊a/n⌋ or ⌈a/n⌉ times in each column. Two equitable Latin rectangles are orthogonal in the usual way. We denote a set of k mutually orthogonal equitable a × b Latin rectangles as a k-MOELR(a, b; n). We show that there exists a k-MOELR(a, b; n) with k ≥ 3 for all a, b, n, with some exceptions.
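
The equitability condition on rows and columns can be checked mechanically; the sketch below implements exactly the floor/ceiling counting condition defined above, with a small hypothetical example.

```python
# Check the equitable-Latin-rectangle condition from the abstract: in an
# a x b matrix on n symbols (ab = n^2), every symbol occurs floor(b/n) or
# ceil(b/n) times per row and floor(a/n) or ceil(a/n) times per column.
from collections import Counter
from math import ceil, floor

def is_equitable(matrix, n):
    a, b = len(matrix), len(matrix[0])
    assert a * b == n * n, "dimensions must satisfy ab = n^2"
    rows_ok = all(
        Counter(row)[s] in (floor(b / n), ceil(b / n))
        for row in matrix for s in range(n)
    )
    cols = [[matrix[i][j] for i in range(a)] for j in range(b)]
    cols_ok = all(
        Counter(col)[s] in (floor(a / n), ceil(a / n))
        for col in cols for s in range(n)
    )
    return rows_ok and cols_ok

# A 2 x 8 rectangle on n = 4 symbols (2*8 = 16 = 4^2): each symbol twice per
# row, and zero or one time per column (floor(2/4) = 0, ceil(2/4) = 1).
print(is_equitable([[0, 1, 2, 3, 0, 1, 2, 3],
                    [1, 0, 3, 2, 2, 3, 0, 1]], n=4))  # True
```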

Relevance: 100.00%

Abstract:

This doctoral thesis presents computational work, and its synthesis with experiments, for internal (tube and channel geometries) as well as external (flow of a pure vapor over a horizontal plate) condensing flows. The computational work obtains accurate numerical simulations of the full two-dimensional governing equations for steady and unsteady condensing flows in gravity and zero-gravity (0g) environments. The thesis investigates flow features, flow regimes, attainability issues, stability issues, and responses to boundary fluctuations for condensing flows in different flow situations. This research finds new features of unsteady solutions of condensing flows, reveals interesting differences between gravity-driven and shear-driven situations, and discovers novel boundary-condition sensitivities of shear-driven internal condensing flows. The synthesis of computational and experimental results presented here for gravity-driven in-tube flows lays the framework for future two-phase component analysis in any thermal system. It is shown for both gravity-driven and shear-driven internal condensing flows that the steady governing equations have unique solutions for a given inlet pressure, a given inlet vapor mass flow rate, and a fixed cooling method for the condensing surface. However, the unsteady equations of shear-driven internal condensing flows can yield different "quasi-steady" solutions based on different specifications of exit pressure (equivalently, exit mass flow rate) applied concurrently with the inlet pressure specification. This thesis presents a novel categorization of internal condensing flows based on their sensitivity to concurrently applied boundary (inlet and exit) conditions. The computational investigations of an external shear-driven flow of vapor condensing over a horizontal plate show the limits of applicability of the analytical solution. Simulations of this external condensing flow address its stability issues and shed light on flow regime transitions caused by ever-present bottom-wall vibrations; in particular, the laminar-to-turbulent transition for these flows can be affected by such vibrations. Detailed dynamic stability analysis of this shear-driven external condensing flow results in the introduction of a new variable that characterizes the ratio of the strength of the underlying stabilizing attractor to that of the destabilizing vibrations. Besides the development of CFD tools and computational algorithms, the direct application of this research is in the effective prediction and design of two-phase components in thermal systems used in different applications. Some of the important internal condensing flow results concerning sensitivities to boundary fluctuations are also expected to apply to the flow boiling phenomenon. The novel flow sensitivities discovered through this research, if employed effectively after system-level analysis, will enable the development of better control strategies in ground- and space-based two-phase thermal systems.

Relevance: 100.00%

Abstract:

Determining how an exhaust system will perform acoustically before a prototype muffler is built can save the designer a substantial amount of both time and resources. To use the available simulation tools effectively, it is important to understand which tool is most effective for the intended analysis and how typical elements of an exhaust system affect muffler performance. This thesis presents an in-depth look at the available tools and their most beneficial uses. A full parametric study was conducted using the FEM method for typical muffler elements and correlated with experimental results. The thesis lays out the groundwork for accurately predicting free-field sound pressure levels for an exhaust system with the engine properties included. The accuracy of the model depends heavily on a correct temperature profile in addition to accurate source properties; these factors are discussed in detail, and methods for determining them are presented. The secondary effects of mean flow, which affects both acoustic wave propagation and flow noise generation, are also discussed, and effective ways of predicting them are described. Experimental models are tested on a flow rig that showcases these phenomena.
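
As a point of reference for the FEM results discussed above, the classical one-dimensional plane-wave formula for the transmission loss of a simple expansion chamber is easy to compute; the chamber geometry below is hypothetical, and the speed of sound would in practice follow from the temperature profile mentioned above.

```python
# Textbook plane-wave transmission loss of a simple expansion-chamber muffler:
# TL = 10*log10(1 + 0.25*(m - 1/m)^2 * sin^2(k*L)), m = area expansion ratio.
# This is the classical 1-D result, not the thesis's FEM model; geometry and
# the (temperature-dependent) speed of sound are assumptions.
import numpy as np

def expansion_chamber_tl(freq_hz, area_ratio_m, length_m, speed_of_sound=343.0):
    k = 2 * np.pi * freq_hz / speed_of_sound  # acoustic wavenumber
    return 10 * np.log10(1 + 0.25 * (area_ratio_m - 1 / area_ratio_m) ** 2
                         * np.sin(k * length_m) ** 2)

freqs = np.linspace(20, 2000, 5)
print(expansion_chamber_tl(freqs, area_ratio_m=9.0, length_m=0.3))
```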

Relevance: 100.00%

Abstract:

The developmental processes and functions of an organism are controlled by its genes and the proteins derived from those genes. The identification of key genes and the reconstruction of gene networks can provide a model to help us understand the regulatory mechanisms behind the initiation and progression of biological processes or functional abnormalities (e.g. diseases) in living organisms. In this dissertation, I have developed statistical methods to identify the genes and transcription factors (TFs) involved in biological processes, constructed their regulatory networks, and evaluated existing association methods to find robust methods for coexpression analyses. Two kinds of data sets were used for this work: genotype data and gene expression microarray data. On the basis of these data sets, the dissertation has two major parts, together forming six chapters. The first part deals with developing association methods for rare variants using genotype data (chapters 4 and 5). The second part deals with developing and evaluating statistical methods to identify genes and TFs involved in biological processes, and with constructing their regulatory networks using gene expression data (chapters 2, 3 and 6). For the first part, I developed two methods to find the groupwise association of rare variants with given diseases or traits. The first method is based on kernel machine learning and can be applied to both quantitative and qualitative traits; simulation results showed that it has improved power over the existing weighted sum (WS) method in most settings. The second method uses multiple phenotypes to select a few top significant genes, then finds the association of each gene with each phenotype while controlling for population stratification by adjusting the data for ancestry using principal components. This method was applied to GAW 17 data and identified several disease risk genes. For the second part, I worked on three problems. The first was the evaluation of eight gene association methods; a very comprehensive comparison with further analysis clearly demonstrates the distinct and common performance characteristics of these eight methods. For the second problem, an algorithm named the bottom-up graphical Gaussian model was developed to identify the TFs that regulate pathway genes and to reconstruct their hierarchical regulatory networks; it produced very significant results and is the first report of such hierarchical networks for these pathways. The third problem dealt with developing another algorithm, the top-down graphical Gaussian model, which identifies the network governed by a specific TF; the network produced by this algorithm is shown to be of very high accuracy.
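
The graphical-Gaussian idea underlying the two network algorithms, reading conditional-dependence edges off a sparse precision matrix, can be illustrated generically; the sketch below uses an off-the-shelf graphical lasso on synthetic data and is not the dissertation's bottom-up or top-down algorithm.

```python
# Generic sketch of the graphical-Gaussian idea: estimate a sparse precision
# matrix from expression data and take its nonzero off-diagonal entries as
# conditional-dependence edges. Synthetic data; not the thesis's algorithms.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
expr = rng.standard_normal((200, 10))  # 200 samples x 10 "genes" (synthetic)
expr[:, 1] += expr[:, 0]               # make genes 0 and 1 co-expressed

model = GraphicalLasso(alpha=0.2).fit(expr)
prec = model.precision_
edges = [(i, j) for i in range(10) for j in range(i + 1, 10)
         if abs(prec[i, j]) > 1e-6]
print("conditional-dependence edges:", edges)
```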

Relevance: 100.00%

Abstract:

In the face of increasing globalisation, and a collision between global communication systems and local traditions, this book offers innovative trans-disciplinary analyses of the value of traditional cultural expressions (TCE) and suggests appropriate protection mechanisms for them. It combines approaches from history, philosophy, anthropology, sociology and law, and charts previously untravelled paths for developing new policy tools and legal designs that go beyond conventional copyright models. Its authors extend their reflections to the specific features of the digital environment which, despite increasing the risks of misappropriation of traditional knowledge and creativity, may equally offer new opportunities for revitalising indigenous peoples' values and providing for the sustainability of TCE. The book will appeal to scholars interested in multidisciplinary analyses of the fragmentation of international law in the field of intellectual property and traditional cultural expressions. It will also be valuable reading for those working on broader governance and human rights issues.

Relevance: 100.00%

Abstract:

Background: Patients' health-related quality of life (HRQoL) has rarely been systematically monitored in general practice. Electronic tools and practice training might facilitate the routine application of HRQoL questionnaires, and thorough piloting of innovative procedures is strongly recommended before the conduct of large-scale studies. We therefore aimed to assess i) the feasibility and acceptance of HRQoL assessment using tablet computers in general practice and ii) the perceived practical utility of HRQoL results, and iii) to identify possible barriers hindering wider application of this approach.

Methods: Two HRQoL questionnaires (St. George's Respiratory Questionnaire, SGRQ, and EORTC QLQ-C30) were presented electronically on portable tablet computers. Wireless network (WLAN) integration into the practice computer systems of 14 German general practices with varying infrastructure allowed automatic data exchange and the generation of a printout or a PDF file. General practitioners (GPs) and practice assistants were trained in a 1-hour course, after which they could invite patients with chronic diseases to fill in the electronic questionnaire during their waiting time. We surveyed patients, practice assistants and GPs regarding their acceptance of this tool in semi-structured telephone interviews. The number of assessments, HRQoL results and interview responses were analysed using quantitative and qualitative methods.

Results: Over the course of 1 year, 523 patients filled in the electronic questionnaires (1–5 times; 664 total assessments). On average, results showed specific HRQoL impairments, e.g. with respect to fatigue, pain and sleep disturbances. The number of electronic assessments varied substantially between practices. A total of 280 patients, 27 practice assistants and 17 GPs participated in the telephone interviews. Almost all GPs (16/17 = 94%; 95% CI = 73–99%), most practice assistants (19/27 = 70%; 95% CI = 50–86%) and the majority of patients (240/280 = 86%; 95% CI = 82–91%) indicated that they would welcome the use of electronic HRQoL questionnaires in the future. GPs rated the availability of local health services (e.g. supportive care, physiotherapy) (mean: 9.4 ± 1.0 SD; scale: 1–10), sufficient extra time (8.9 ± 1.5) and easy interpretation of HRQoL results (8.6 ± 1.6) as the most important prerequisites for their use. They believed HRQoL assessment facilitated both communication and follow-up of patients' conditions. Practice assistants emphasised that the process demonstrated an extra commitment to patient-centred care; patients viewed it as a tool that contributed to the physicians' understanding of their personal condition and circumstances.

Conclusion: This pilot study indicates that electronic HRQoL assessment is technically feasible in general practices. It can provide clinically significant information, which can be used either in the consultation for routine care or for research purposes. While GPs, practice assistants and patients were generally positive about the electronic procedure, several barriers (e.g. practices' lack of time and routine in HRQoL assessment) need to be overcome to enable broader application of electronic questionnaires in everyday medical practice.
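
The abstract does not state which interval method produced the reported confidence intervals; the Wilson score interval, sketched below, reproduces the 73–99% interval for 16/17 and is one standard choice for small samples.

```python
# Sketch: 95% confidence intervals for the acceptance proportions reported in
# the abstract. The paper does not state its interval method; the Wilson score
# interval used here is one standard choice and matches the reported 73-99%
# for the 16/17 GP result.
from statsmodels.stats.proportion import proportion_confint

for label, count, nobs in [("GPs", 16, 17),
                           ("practice assistants", 19, 27),
                           ("patients", 240, 280)]:
    lo, hi = proportion_confint(count, nobs, alpha=0.05, method="wilson")
    print(f"{label}: {count}/{nobs} = {count/nobs:.0%}, 95% CI {lo:.0%}-{hi:.0%}")
```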

Relevance: 100.00%

Abstract:

Object-oriented modelling languages such as EMOF are often used to specify domain-specific meta-models, but these modelling languages lack the ability to describe behavior or operational semantics. Several approaches have used a subset of Java mixed with OCL as an executable meta-language. In this experience report we show how we use Smalltalk as an executable meta-language in the context of the Moose reengineering environment, and we present how we implemented EMOF and its behavioral aspects. Over the last decade we have validated this approach by incrementally building a meta-described reengineering environment. Such an approach bridges the gap between a code-oriented view and a meta-model-driven one. It avoids the creation of yet another language and reuses the infrastructure and run-time of the underlying implementation language. It offers a uniform way of letting developers focus on their tasks while at the same time allowing them to meta-describe their domain model. The advantage of our approach is that developers use the same tools and environment they use for their regular tasks. Still, the approach is not Smalltalk-specific: it can be applied to any language offering an introspective API, such as Ruby, Python, CLOS, Java or C#.
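
Since the abstract argues the approach carries over to any language with an introspective API, a toy version of the idea in Python might look as follows; this is an illustration of meta-describing a domain class via reflection, not the Moose/EMOF API.

```python
# Toy illustration of deriving an EMOF-like meta-description of a domain
# class from a language's own introspective API (here Python). This is an
# assumption-laden sketch, not how Moose/EMOF is actually implemented.
import inspect
from dataclasses import dataclass, fields

@dataclass
class Method:
    name: str
    loc: int  # a toy source metric attached to the domain model

def describe(cls):
    """Build a meta-description (attributes and methods) via reflection."""
    attrs = ({f.name: f.type for f in fields(cls)}
             if hasattr(cls, "__dataclass_fields__") else {})
    # Includes generated methods such as __init__ and __repr__.
    methods = [name for name, _ in inspect.getmembers(cls, inspect.isfunction)]
    return {"class": cls.__name__, "attributes": attrs, "methods": methods}

print(describe(Method))
```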

Relevance: 100.00%

Abstract:

Psychosocial factors have been described as affecting cellular immune measures in healthy subjects. In patients with early breast cancer, we explored bi-directional psycho-immune effects to determine whether subjective burden has an impact on immune measures, and vice versa. Patients (n = 239) operated on for early breast cancer and randomized into International Breast Cancer Study Group (IBCSG) adjuvant clinical trials were assessed immediately before the beginning of adjuvant treatment (baseline) and 3 and 6 months thereafter, at the beginning of the corresponding treatment cycle. Cellular immune measures (leukocyte, lymphocyte and lymphocyte subset counts), markers of activation of the cellular immune system (beta2-microglobulin and soluble interleukin-2 receptor (IL-2r) serum levels), and self-reported subjective burden (global indicators of physical well-being, mood and coping effort) were assessed concurrently. The relationship between subjective burden and gradients of immune measures was investigated with regression analyses controlling for adjuvant treatment. There was a pattern of small negative associations between all variables assessing subjective burden before the beginning of adjuvant therapy and the gradients of the markers of activation of the cellular immune system and NK cell counts. In particular, better mood predicted a decline in the course of beta2-microglobulin and IL-2r at months 3 and 6, and the gradient of beta2-microglobulin was associated with mood and coping effort at month 3. However, the effect sizes were very small. In conclusion, this explorative investigation gave an indication that subjective burden affects, and is affected by, markers of activation of the cellular immune system during the first 3 and 6 months of adjuvant therapy. The question of clinical significance remains unanswered, and these associations should be investigated further with refined assessment tools and schedules.

Relevance: 100.00%

Abstract:

Understanding the functioning of brains is an extremely challenging endeavour, both for researchers and for students. Interactive media and tools, such as simulations, databases, visualizations and virtual laboratories, have proved indispensable not only in research but also in education for helping to understand brain function. Accordingly, a wide range of such media and tools is now available, and it is getting increasingly difficult to maintain an overall picture. Written by researchers, tool developers and experienced academic teachers, this special issue of Brains, Minds & Media covers a broad range of interactive research media and tools, with a strong emphasis on their use in neural and cognitive science education. The focus lies not only on the tools themselves but also on the question of how research tools can significantly enhance learning and teaching, and how curricular integration can be achieved. This collection gives a comprehensive overview of existing tools and their usage, as well as the underlying educational ideas, and thus provides an orientation guide not only for teaching researchers but also for interested teachers and students.

Relevance: 100.00%

Abstract:

Maintenance, which must always ensure plant and machine availability with the lowest possible use of resources, is becoming ever more important as a value-creating factor for the enterprise. A prerequisite for exploiting the existing potential is new tools and approaches whose implementation makes it possible to ensure availability efficiently. Against this background, a concept for usage-dependent maintenance was developed in subproject C3 of DFG Paketantrag 672. Based on the known relationship between the usage and the wear of risk-prone components of intralogistics systems, the stresses caused by future system loads can be anticipated. Maintenance measures and technical availability thus become plannable in a demand-oriented and resource-optimal way.
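
The abstract does not specify the wear model; as a purely hypothetical illustration of usage-dependent maintenance planning, the sketch below anticipates damage from a forecast load spectrum using linear damage accumulation (Miner's rule) with invented endurance data.

```python
# Hypothetical sketch of usage-dependent maintenance planning, assuming a
# linear damage-accumulation (Miner's rule) wear model; the abstract does not
# state the model used in subproject C3. Load forecast and S-N data are made up.
def accumulated_damage(cycles_by_load, cycles_to_failure_by_load):
    """Miner's rule: damage D = sum(n_i / N_i); maintenance is due as D -> 1."""
    return sum(n / cycles_to_failure_by_load[load]
               for load, n in cycles_by_load.items())

# Anticipated cycles per load level for the next planning period (invented)
forecast = {"light": 40_000, "medium": 12_000, "heavy": 1_500}
# Component endurance (cycles to failure) per load level, e.g. from S-N curves
endurance = {"light": 2_000_000, "medium": 400_000, "heavy": 30_000}

d = accumulated_damage(forecast, endurance)
print(f"forecast damage share: {d:.1%} -> schedule maintenance before D reaches 100%")
```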