903 results for planning of experiments
Abstract:
In this study, we show the use of three-dimensional printed models for preoperative planning of transcatheter valve replacement in a patient with an extreme porcelain aorta. A 70-year-old man with severe aortic stenosis and a porcelain aorta was referred to our center for transcatheter aortic valve replacement. Unfortunately, the patient died after the procedure, probably because of an ischemic event. We therefore fabricated three-dimensional models to evaluate the potential value of these constructs for preoperative surgical planning and simulation of transcatheter valve replacement.
Abstract:
With many different investigators studying the same disease and with a strong commitment to publish supporting data in the scientific community, there are often many different datasets available for any given disease. Hence there is substantial interest in finding methods for combining these datasets to provide better and more detailed understanding of the underlying biology. We consider the synthesis of different microarray data sets using a random effects paradigm and demonstrate how relatively standard statistical approaches yield good results. We identify a number of important and substantive areas which require further investigation.
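The random-effects synthesis of effect estimates across datasets can be sketched with the standard DerSimonian-Laird method-of-moments estimator; the function name is illustrative, and this generic estimator is an assumption rather than the exact approach used in the paper:

```python
def dersimonian_laird(effects, variances):
    """Combine per-dataset effect estimates under a random-effects model.

    effects: per-study effect sizes; variances: their within-study variances.
    Returns (pooled_effect, tau2), where tau2 is the estimated
    between-study variance (the random effect).
    """
    k = len(effects)
    w = [1.0 / v for v in variances]                              # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    Q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))   # heterogeneity statistic
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (Q - (k - 1)) / c)                            # DL estimator, truncated at 0
    w_re = [1.0 / (v + tau2) for v in variances]                  # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    return pooled, tau2
```

With homogeneous studies, Q falls below k - 1 and tau2 truncates to zero, so the pooled estimate coincides with the fixed-effect one.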
Abstract:
A patient-specific surface model of the proximal femur plays an important role in planning and supporting various computer-assisted surgical procedures, including total hip replacement, hip resurfacing, and osteotomy of the proximal femur. The common approach to deriving 3D models of the proximal femur is to use imaging techniques such as computed tomography (CT) or magnetic resonance imaging (MRI). However, the high logistic effort, the extra radiation dose (CT imaging), and the large quantity of data to be acquired and processed make them less practical. In this paper, we present an integrated approach using a multi-level point distribution model (ML-PDM) to reconstruct a patient-specific model of the proximal femur from sparse data available intra-operatively. We present results of experiments performed on dry cadaveric bones using dozens of 3D points, as well as experiments using a limited number of 2D X-ray images, which demonstrate the promising accuracy of the present approach.
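Fitting a point distribution model to sparse data can be sketched as a least-squares fit of shape-mode coefficients to the observed coordinates. This toy single-level version (the mean shape, modes, and indices below are hypothetical) omits the multi-level refinement that distinguishes the ML-PDM:

```python
def fit_shape_model(mean, modes, observed):
    """Least-squares fit of shape-model coefficients b so that
    (mean + modes @ b) matches sparsely observed coordinates.

    mean: list of n coordinates of the mean shape; modes: n x m matrix of
    variation modes (lists of lists); observed: dict {index: value} of the
    sparse measurements (needs at least m observations).
    Solves the m x m normal equations by Gaussian elimination.
    """
    idx = sorted(observed)
    m = len(modes[0])
    # Normal equations A^T A b = A^T r restricted to the observed rows
    ata = [[sum(modes[i][p] * modes[i][q] for i in idx) for q in range(m)]
           for p in range(m)]
    atr = [sum(modes[i][p] * (observed[i] - mean[i]) for i in idx)
           for p in range(m)]
    # Gaussian elimination with partial pivoting
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        atr[col], atr[piv] = atr[piv], atr[col]
        for r in range(col + 1, m):
            f = ata[r][col] / ata[col][col]
            for c in range(col, m):
                ata[r][c] -= f * ata[col][c]
            atr[r] -= f * atr[col]
    # Back substitution
    b = [0.0] * m
    for r in range(m - 1, -1, -1):
        b[r] = (atr[r] - sum(ata[r][c] * b[c] for c in range(r + 1, m))) / ata[r][r]
    return b
```

The recovered coefficients can then be pushed back through the modes to predict the unobserved parts of the surface.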
Abstract:
This article addresses kriging-based optimization of stochastic simulators. Many such simulators depend on factors that tune the level of precision of the response, the gain in accuracy coming at the price of computational time. The contribution of this work is two-fold: first, we propose a quantile-based criterion for the sequential design of experiments, in the fashion of the classical expected improvement criterion, which allows an elegant treatment of heterogeneous response precisions. Second, we present a procedure for allocating the computational time given to each measurement, allowing a better distribution of the computational effort and increased efficiency. Finally, the optimization method is applied to an original application in nuclear criticality safety. This article has supplementary material available online. The proposed criterion is available in the R package DiceOptim.
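The classical expected improvement criterion that the proposed quantile-based criterion extends is computed from a kriging model's predictive mean and standard deviation at a candidate point; a minimal sketch for minimization (the function name is an assumption, and this is the baseline criterion, not the paper's quantile variant):

```python
import math

def expected_improvement(mu, sigma, f_min):
    """Classical expected improvement for minimization:
    EI = (f_min - mu) * Phi(z) + sigma * phi(z), with z = (f_min - mu) / sigma,
    where mu and sigma are the kriging predictive mean and standard deviation
    at the candidate point and f_min is the best observed value so far.
    """
    if sigma <= 0.0:
        return max(0.0, f_min - mu)          # deterministic prediction
    z = (f_min - mu) / sigma
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)   # standard normal pdf
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))          # standard normal cdf
    return (f_min - mu) * Phi + sigma * phi
```

The sequential design then evaluates the simulator where this criterion is largest; the quantile-based variant additionally folds the heterogeneous observation noise into the criterion.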
Abstract:
Localized Magnetic Resonance Spectroscopy (MRS) is in widespread use for clinical brain research. Standard acquisition sequences for one-dimensional spectra suffer from substantial overlap of spectral contributions from many metabolites. Therefore, specially tuned editing sequences or two-dimensional acquisition schemes are applied to extend the information content. Tuning specific acquisition parameters allows the sequences to be made more efficient or more specific for certain target metabolites. Cramér-Rao bounds have been used in other fields for the optimization of experiments and are shown here to be very useful as design criteria for localized MRS sequence optimization. The principle is illustrated for one- and two-dimensional MRS, in particular the 2D separation experiment, where the usual restriction to equidistant echo time spacings and equal acquisition times per echo time can be abolished. Particular emphasis is placed on optimizing experiments for the quantification of GABA and glutamate. The basic principles are verified by Monte Carlo simulations and in vivo for repeated acquisitions of generalized two-dimensional separation brain spectra obtained from healthy subjects, expanded by bootstrapping for a better definition of the quantification uncertainties.
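A Cramér-Rao bound used as a design criterion is the diagonal of the inverse Fisher information matrix of the signal model. As a stand-in for the far richer MRS signal models, a toy mono-exponential decay s(t) = A exp(-t/T2) with Gaussian noise makes the computation concrete (the function name and model are illustrative assumptions):

```python
import math

def crb_exp_decay(A, T2, times, sigma):
    """Cramér-Rao lower bounds for the amplitude A and decay time T2 of the
    signal model s(t) = A * exp(-t / T2) sampled at `times` with additive
    Gaussian noise of standard deviation sigma.

    Builds the 2x2 Fisher information matrix from the model derivatives and
    inverts it analytically; returns (var_A_bound, var_T2_bound).
    """
    dA  = [math.exp(-t / T2) for t in times]                   # ds/dA
    dT2 = [A * t / T2 ** 2 * math.exp(-t / T2) for t in times] # ds/dT2
    Iaa = sum(x * x for x in dA) / sigma ** 2
    Iat = sum(x * y for x, y in zip(dA, dT2)) / sigma ** 2
    Itt = sum(y * y for y in dT2) / sigma ** 2
    det = Iaa * Itt - Iat * Iat
    return Itt / det, Iaa / det   # diagonal of the inverse Fisher matrix
```

Sequence optimization then amounts to choosing the sampling times (here, the echo time pattern) that minimize these bounds for the target parameters; note the bounds scale with sigma squared, so halving the noise quarters them.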
Abstract:
Off-site effects of soil erosion are becoming increasingly important, particularly the pollution of surface waters. To develop environmentally efficient and cost-effective mitigation options, it is essential to identify areas that bear both a high erosion risk and high connectivity to surface waters. This paper introduces a simple risk assessment tool that allows the delineation of potential critical source areas (CSA) of sediment input into surface waters for the agricultural areas of Switzerland. The basis is the erosion risk map with a 2 m resolution (ERM2) and the drainage network, which is extended by drained roads, farm tracks, and slope depressions. The probability of hydrological and sedimentological connectivity is assessed by combining soil erosion risk and the extended drainage network with flow distance calculation. A GIS environment with multiple-flow accumulation algorithms is used for routing runoff generation and flow pathways. The result is a high-resolution connectivity map of the agricultural area of Switzerland (888,050 ha). Fifty-five percent of the computed agricultural area is potentially connected with surface waters; 45% is not connected. Surprisingly, the larger share, 34% (62% of the connected area), is indirectly connected with surface waters through drained roads, and only 21% is directly connected. The reason is the topographic complexity and patchiness of the landscape due to a dense road and drainage network. A total of 24% of the connected area, or 13% of the computed agricultural area, is rated with a high connectivity probability. On these CSA an adapted land use is recommended, supported by vegetated buffer strips preventing sediment load. Even areas that are far away from open water bodies can be indirectly connected and need to be included in the planning of mitigation measures. Thus, the connectivity map presented is an important decision-making tool for policy-makers and extension services.
The map is published on the web and thus available for application.
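The flow-routing step behind such a connectivity map can be illustrated with a minimal single-direction (D8-style) flow accumulation on a small elevation grid. The study itself uses multiple-flow accumulation algorithms, so this is only the simpler cousin of that technique, with hypothetical data:

```python
def flow_accumulation(elev):
    """Single-direction (D8-style) flow accumulation on an elevation grid.

    Each cell contributes one unit of area and drains its accumulated area
    to its steepest downslope 8-neighbour.  Cells are processed from highest
    to lowest so upslope contributions arrive before a cell drains.
    """
    rows, cols = len(elev), len(elev[0])
    acc = [[1.0] * cols for _ in range(rows)]   # each cell contributes itself
    cells = sorted(((elev[r][c], r, c) for r in range(rows) for c in range(cols)),
                   reverse=True)
    for z, r, c in cells:
        best_drop, target = 0.0, None
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                rr, cc = r + dr, c + dc
                if (dr or dc) and 0 <= rr < rows and 0 <= cc < cols:
                    drop = z - elev[rr][cc]
                    if drop > best_drop:
                        best_drop, target = drop, (rr, cc)
        if target:                              # pass accumulated area downslope
            acc[target[0]][target[1]] += acc[r][c]
    return acc
```

Cells with large accumulated area mark concentrated flow pathways; intersecting them with the (extended) drainage network is the essence of the connectivity assessment.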
Abstract:
Proton therapy is growing increasingly popular due to its superior dose characteristics compared to conventional photon therapy. Protons travel a finite range in the patient body and stop, thereby delivering no dose beyond their range. However, because the range of a proton beam is heavily dependent on the tissue density along its beam path, uncertainties in patient setup position and inherent range calculation can degrade the dose distribution significantly. Despite these challenges, which are unique to proton therapy, current management of the uncertainties during treatment planning of proton therapy has been similar to that of conventional photon therapy. The goal of this dissertation research was to develop a treatment planning method and a plan evaluation method that address proton-specific issues regarding setup and range uncertainties. Treatment plan design method adapted to proton therapy: Currently, for proton therapy using a scanning beam delivery system, setup uncertainties are largely accounted for by geometrically expanding a clinical target volume (CTV) to a planning target volume (PTV). However, a PTV alone cannot adequately account for range uncertainties coupled to misaligned patient anatomy in the beam path, since it does not account for the change in tissue density. To remedy this problem, we proposed a beam-specific PTV (bsPTV) that accounts for the change in tissue density along the beam path due to the uncertainties. Our proposed method was successfully implemented, and its superiority over the conventional PTV was shown through a controlled experiment. Furthermore, we have shown that the bsPTV concept can be incorporated into beam angle optimization for better target coverage and normal tissue sparing for a selected lung cancer patient.
Treatment plan evaluation method adapted to proton therapy: The dose-volume histogram of the clinical target volume (CTV) or any other volume of interest at the time of planning does not represent the most probable dosimetric outcome of a given plan, as it does not include the uncertainties mentioned earlier. Currently, the PTV is used as a surrogate of the CTV's worst-case scenario for target dose estimation. However, because proton dose distributions are subject to change under these uncertainties, the validity of the PTV analysis method is questionable. To remedy this problem, we proposed the use of statistical parameters to quantify uncertainties on both the dose-volume histogram and the dose distribution directly. The robust plan analysis tool was successfully implemented to compute both the expectation value and the standard deviation of dosimetric parameters of a treatment plan under the uncertainties. For 15 lung cancer patients, the proposed method was used to quantify the dosimetric difference between the nominal situation and its expected value under the uncertainties.
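Computing the expectation value and standard deviation of a dosimetric parameter over uncertainty scenarios reduces to a probability-weighted sum; a minimal sketch in which the dose metric and the scenario set are purely illustrative stand-ins for a real plan's recomputed dose distributions:

```python
import math

def robust_dose_stats(metric, scenarios, probs):
    """Expectation value and standard deviation of a dosimetric parameter
    over setup/range uncertainty scenarios.

    metric: function mapping a scenario (e.g. an isocenter shift in mm) to
    the dosimetric parameter of interest; probs: scenario probabilities
    (must sum to 1).
    """
    values = [metric(s) for s in scenarios]
    mean = sum(p * v for p, v in zip(probs, values))
    var = sum(p * (v - mean) ** 2 for p, v in zip(probs, values))
    return mean, math.sqrt(var)
```

In a real robust analysis each scenario would require recomputing the dose distribution (shifted setup, scaled range) before extracting the metric; the statistics themselves are this simple.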
Abstract:
The agro-areas of the Arroyos Menores (La Colacha) basins, west and south of Río Cuarto (Prov. of Córdoba, Argentina), are very fertile but have high soil losses. Extreme rain events, inundations and other severe erosion forming gullies urgently demand action in this area to avoid soil degradation and erosion while supporting good levels of agro production. The authors first improved the hydrologic data on La Colacha, evaluated the systems of soil uses and actions that could be recommended considering the relevant aspects of the study area, and applied decision support systems (DSS) with mathematical tools for the planning of defences and uses of soils in these areas. These were conducted using multi-criteria models, in multi-criteria decision making (MCDM): first discrete MCDM to choose among global types of soil use, and then continuous MCDM to evaluate and optimize combined actions, including the repartition of soil use and the necessary levels of works for soil conservation and for hydraulic management to protect these basins against erosion. Relatively global solutions for the La Colacha area have been defined and were optimised by Linear Programming in Goal Programming forms, presented as Weighted or Lexicographic Goal Programming and as Compromise Programming. The decision methods used are described, indicating the algorithms used, and examples are given for some representative scenarios of the La Colacha area.
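Weighted goal programming can be illustrated with a small sketch that trades off two hypothetical soil-use goals; a grid search over the allocation fractions stands in for the Linear Programming solver used in the study, and all coefficients below are invented for illustration:

```python
def weighted_goal_programming(goals, weights, step=0.05):
    """Weighted goal programming over two decision variables x1, x2
    (e.g. the fractions of a basin under crops and under protective cover,
    with x1 + x2 <= 1), solved by grid search instead of an LP solver.

    goals: list of (a1, a2, target) tuples meaning a1*x1 + a2*x2 should hit
    target; weights: penalty weight for each goal's deviation.
    Minimizes the weighted sum of absolute deviations from the targets.
    """
    best, best_x = float("inf"), None
    n = int(round(1.0 / step))
    for i in range(n + 1):
        for j in range(n + 1 - i):            # enforce x1 + x2 <= 1
            x1, x2 = i * step, j * step
            dev = sum(w * abs(a1 * x1 + a2 * x2 - t)
                      for (a1, a2, t), w in zip(goals, weights))
            if dev < best:
                best, best_x = dev, (x1, x2)
    return best_x, best
```

An LP formulation would replace each absolute deviation with a pair of non-negative slack variables (the d+ and d- of goal programming); lexicographic variants instead minimize the deviations in strict priority order.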
Abstract:
Experimental software engineering includes several processes, the most representative being running experiments, running replications and synthesizing the results of multiple replications. Of these processes, only the first is relatively well established in software engineering. Problems of information management and communication among researchers are among the obstacles to progress in the replication and synthesis processes. Software engineering experimentation has expanded considerably over the last few years. This has brought with it the invention of experimental process support proposals. However, few of these proposals provide integral support, including the replication and synthesis processes; most focus on experiment execution. This paper proposes an infrastructure providing integral support for the experimental research process, specializing in the replication and synthesis of a family of experiments. The research has been divided into stages or phases, whose transition milestones are marked by the attainment of their goals. Each goal exactly matches an artifact or product. Within each stage, we adopt cycles of successive approximations (generate-and-test cycles), where each approximation includes a different viewpoint or input. Each cycle ends with the product approval.
Abstract:
In the 2000s, the Internet became the preferred means for citizens to communicate. YouTube, Twitter, Facebook, LinkedIn, i.e., the social networks in general, appeared together with Web 2.0, which allows an extraordinary interaction between citizens and democratic institutions. Trade unions constantly fight governments' decisions, especially in periods of crisis like the one that the world, Europe and, in particular, Portugal are facing. In this regard, the use of e-participation platforms is expected to strengthen the relationship between trade unions and the education community. This paper reports research on the planning and running of a series of online public consultation experiments launched by teachers' trade unions. These experiments are compared with those of other countries, such as Australia, the United Kingdom and the United States of America. A quantitative analysis of the results regarding hits, subscriptions, and response rates is presented and compared with the 90-9-1 rule, the ASCU model and data from government agencies. The experiments used Liberopinion, an online platform that supports bidirectional asynchronous communication. By promoting the quality of interaction between actors, these experiments are expected to lead to a better understanding of the benefits of such collaborative environments.
Abstract:
In marine environments, sediments from different sources are stirred and dispersed, generating beds that are composed of mixed and layered sediments of differing grain sizes. Traditional engineering formulations used to predict erosion thresholds are, however, generally derived for unimodal sediment distributions, and so may be inadequate for commonly occurring coastal sediments. We tested the transport behavior of deposited and mixed sediment beds consisting of a simplified two-grain fraction (silt (D50 = 55 µm) and sand (D50 = 300 µm)) in a laboratory-based annular flume, with the objective of investigating the parameters controlling the stability of a sediment bed. To mimic the recent deposition of particles following large storm events and the longer-term result of the incorporation of fines in coarse sediment, we designed two suites of experiments: (1) "the layering experiment", in which a sandy bed was covered by a thin layer of silt of varying thickness (0.2 - 3 mm; 0.5 - 3.7 wt %, dry weight, in a layer 10 cm deep); and (2) "the mixing experiment", where the bed was composed of sand homogeneously mixed with small amounts of silt (0.07 - 0.7 wt %, dry weight). To initiate erosion and to detect a possible stabilizing effect in both settings, we increased the flow speeds in increments up to 0.30 m/s. Results showed that the sediment bed (or the underlying sand bed in the case of the layering experiment) stabilized with increasing silt content. The increasing sediment stability was defined by a shift of the initial threshold conditions towards higher flow speeds, combined with, in the case of the mixed bed, decreasing erosion rates. Our results show that even extremely low concentrations of silt play a stabilizing role (1.4% silt (wt %) on a layered sediment bed of 10 cm thickness). In the case of a mixed sediment bed, 0.18% silt (wt %, in a sample of 10 cm depth) stabilized the bed.
Both cases show that the depositional history of the sediment fractions can change the erosion characteristics of the seabed. These observations are summarized in a conceptual model which suggests that, in addition to its effect on surface roughness, silt stabilizes the sand bed by plugging pore spaces and reducing inflow into the bed, hence increasing bed stability. Measurements of hydraulic conductivity on similar bed assemblages qualitatively supported this conclusion by showing that silt could decrease the permeability by up to 22% in the case of a layered bed and by up to 70% in the case of a mixed bed.
Abstract:
At head of title: State of Illinois.
Abstract:
There is an alternative model of the one-way ANOVA, called the 'random effects' model or 'nested' design, in which the objective is not to test specific effects but to estimate the degree of variation of a particular measurement and to compare different sources of variation that influence the measurement in space and/or time. The most important statistics from a random effects model are the components of variance, which estimate the variance associated with each of the sources of variation influencing a measurement. The nested design is particularly useful in preliminary experiments designed to estimate different sources of variation and in the planning of appropriate sampling strategies.
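The components of variance follow from the usual one-way ANOVA mean squares by the method of moments; a minimal sketch, assuming a balanced design (equal numbers of replicates per group):

```python
def variance_components(groups):
    """Variance components of a balanced one-way random-effects ANOVA.

    groups: list of equal-sized lists of replicate measurements, one list
    per random group (e.g. per site or per sampling time).
    Returns (sigma2_within, sigma2_among): the within-group variance and the
    among-group component, from MS_among = sigma2_within + n * sigma2_among.
    """
    k, n = len(groups), len(groups[0])
    grand = sum(sum(g) for g in groups) / (k * n)
    means = [sum(g) / n for g in groups]
    ss_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    ss_among = n * sum((m - grand) ** 2 for m in means)
    ms_within = ss_within / (k * (n - 1))    # estimates sigma2_within
    ms_among = ss_among / (k - 1)            # estimates sigma2_within + n*sigma2_among
    return ms_within, max(0.0, (ms_among - ms_within) / n)
```

Comparing the two components shows which level of the design (replicates within a group versus groups themselves) dominates the variation, which is exactly the information needed to plan a sampling strategy.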
Abstract:
The manufacture of copper alloy flat rolled metals involves hot and cold rolling operations, together with annealing and other secondary processes, to transform castings (mainly slabs and cakes) into such shapes as strip, plate and sheet. Production is mainly to customer orders in a wide range of specifications for dimensions and properties. However, order quantities are often small, so process planning plays an important role in this industry. Much research has been done in the past on the technology of flat rolling and the details of the operations; however, there is little or no evidence of any research on the planning of processes for this type of manufacture. Practical observation in a number of rolling mills has established the type of manual process planning traditionally used in this industry. This manual approach, however, has inherent drawbacks, being particularly dependent on individual planners who gain their knowledge over a long span of practical experience. The introduction of the retrieval CAPP approach to this industry was a first step towards reducing these problems, but it could not provide a long-term answer because of the need for an experienced planner to supervise the generation of any plan. It also fails to take account of the dynamic nature of the parameters involved in the planning, such as the availability of resources, operating conditions and variations in costs. The other alternative is the use of a generative approach to planning in the rolling mill context. In this thesis, generative methods are developed for the selection of optimal routes for single orders and then for batches of orders, bearing in mind equipment restrictions, production costs and material yield. The batch order process planning involves the use of a special cluster analysis algorithm for optimal grouping of the orders. This research concentrates on cold-rolling operations.
A prototype model of the proposed CAPP system, including both single order and batch order planning options, has been developed and tested on real order data in the industry. The results were satisfactory and compared very favourably with the existing manual and retrieval methods.
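The order-grouping step can be illustrated with a generic greedy batching by specification similarity; the thesis uses a special cluster analysis algorithm, so the rule and tolerances below are stand-ins, and the order attributes (finished gauge and width) are hypothetical:

```python
def batch_orders(orders, gauge_tol=0.05, width_tol=10.0):
    """Greedy grouping of rolling-mill orders into batches that could share a
    processing route: an order joins the first batch whose seed order has a
    finished gauge and width within tolerance; otherwise it seeds a new batch.

    orders: list of (gauge_mm, width_mm) tuples.
    Returns a list of batches, each a list of orders.
    """
    batches = []                         # list of (seed_order, members)
    for g, w in orders:
        for seed, members in batches:
            if abs(g - seed[0]) <= gauge_tol and abs(w - seed[1]) <= width_tol:
                members.append((g, w))
                break
        else:
            batches.append(((g, w), [(g, w)]))
    return [members for _, members in batches]
```

A proper cluster analysis would compare all pairs rather than greedy seeds, but the outcome is the same in spirit: similar orders share a route, improving yield and reducing changeovers.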
Abstract:
The development of new, health-supporting food of high quality and the optimization of food technological processes today require the application of statistical methods of experimental design. The principles and steps of the statistical planning and evaluation of experiments are explained. Using the example of the development of a gluten-free rusk (zwieback) enriched with roughage compounds, the application of a simplex-centroid mixture design is shown. The results are illustrated by different graphics.
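A simplex-centroid design for q mixture components contains one blend for every non-empty subset of components, each mixed in equal proportions (2^q - 1 runs in total); a minimal generator, with the function name assumed:

```python
from itertools import combinations

def simplex_centroid(q):
    """Simplex-centroid mixture design for q components: the q pure blends,
    all 1:1 binary blends, all 1:1:1 ternary blends, ..., up to the overall
    centroid where every component has proportion 1/q."""
    design = []
    for size in range(1, q + 1):
        for subset in combinations(range(q), size):
            design.append(tuple(1.0 / size if i in subset else 0.0
                                for i in range(q)))
    return design
```

For three components (e.g. three flour/roughage fractions of the rusk recipe) this yields seven runs: three pure blends, three binary midpoints and the centroid, each row summing to 1 as mixture proportions must.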