937 results for Editor of flow analysis methods


Relevance:

100.00%

Publisher:

Abstract:

ABSTRACT: ONTOLOGIES AND METHODS FOR INTEROPERABILITY OF ENGINEERING ANALYSIS MODELS (EAMS) IN AN E-DESIGN ENVIRONMENT. September 2007. Neelima Kanuri, B.S., Birla Institute of Technology and Sciences Pilani, India; M.S., University of Massachusetts Amherst. Directed by: Professor Ian Grosse. Interoperability is the ability of two or more systems to exchange and reuse information efficiently. This thesis presents new techniques for interoperating engineering tools using ontologies as the basis for representing, visualizing, reasoning about, and securely exchanging abstract engineering knowledge between software systems. The specific engineering domain that is the primary focus of this report is the modeling knowledge associated with the development of engineering analysis models (EAMs). This abstract modeling knowledge has been used to support integration of analysis and optimization tools in iSIGHT-FD, a commercial engineering environment. ANSYS, a commercial FEA tool, has been wrapped as an analysis service available inside of iSIGHT-FD. An engineering analysis modeling (EAM) ontology has been developed and instantiated to form a knowledge base for representing analysis modeling knowledge. The instances of the knowledge base are the analysis models of real-world applications. To illustrate how abstract modeling knowledge can be exploited for useful purposes, a cantilever I-beam design optimization problem has been used as a test-bed proof-of-concept application. Two distinct finite element models of the I-beam are available to analyze a given beam design: a beam-element finite element model with potentially lower accuracy but significantly reduced computational cost, and a high-fidelity, high-cost, shell-element finite element model. The goal is to obtain an optimized I-beam design at minimum computational expense. An intelligent knowledge-based (KB) tool was developed and implemented in FiPER.
This tool reasons about the modeling knowledge to intelligently shift between the beam and shell element models during an optimization process, selecting the best analysis model for a given optimization design state. In addition to improved interoperability and design optimization, methods are developed and presented that demonstrate the ability to operate on ontological knowledge bases to perform important engineering tasks. One such method is an automatic technical report generation method, which converts the modeling knowledge associated with an analysis model into a flat technical report. The second is a secure knowledge sharing method, which allocates permissions to portions of knowledge to control knowledge access and sharing. Together, these methods enable recipient-specific, fine-grained, controlled knowledge viewing and sharing in an engineering workflow integration environment such as iSIGHT-FD, and they help reduce the large-scale inefficiencies in current product design and development cycles that stem from poor knowledge sharing and reuse between people and software engineering tools. This work is a significant advance in both the understanding and the application of knowledge integration in a distributed engineering design framework.
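The switching rule the thesis describes can be sketched as follows. This is a hypothetical illustration, not the FiPER tool itself; the function name, the `beam_model_error` field, and the thresholds are all invented for the example.

```python
# Hypothetical sketch of the model-selection rule described above: use the
# cheap beam-element model while it is accurate enough for the current design
# state, and switch to the costly shell-element model when tighter accuracy
# is required (e.g. near the optimum).

def select_model(design_state, accuracy_needed):
    """Return which analysis model to run for this optimization step.

    design_state:    dict with an estimated beam-model relative error
    accuracy_needed: maximum acceptable relative error at this stage
    """
    if design_state['beam_model_error'] <= accuracy_needed:
        return 'beam'    # low cost, adequate accuracy
    return 'shell'       # high fidelity, high cost

# Early in the optimization, coarse answers suffice:
early = select_model({'beam_model_error': 0.08}, accuracy_needed=0.10)
# Near convergence, a tighter tolerance forces the shell model:
late = select_model({'beam_model_error': 0.08}, accuracy_needed=0.02)
print(early, late)
```

The design choice is simply to spend expensive shell-element evaluations only where the optimizer needs their accuracy.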

Relevance:

100.00%

Publisher:

Abstract:

The important active and passive role of mineral dust aerosol in the climate and the global carbon cycle over the last glacial/interglacial cycles has been recognized. However, little data on the most important aeolian dust-derived biological micronutrient, iron (Fe), has so far been available from ice cores from Greenland or Antarctica. Furthermore, Fe deposition reconstructions derived from the palaeoproxies particulate dust and calcium differ significantly from the Fe flux data available. The ability to measure high temporal resolution Fe data in polar ice cores is crucial for the study of the timing and magnitude of relationships between geochemical events and biological responses in the open ocean. This work adapts an existing flow injection analysis (FIA) methodology for low-level trace Fe determinations to an existing glaciochemical analysis system, continuous flow analysis (CFA) of ice cores. Fe-induced oxidation of N,N′-dimethyl-p-phenylenediamine (DPD) is used to quantify the biologically more important and easily leachable Fe fraction released in a controlled digestion step at pH ∼1.0. The developed method was successfully applied to the determination of labile Fe in ice-core samples collected from the Antarctic Byrd ice core and the Greenland Ice-Core Project (GRIP) ice core.

Relevance:

100.00%

Publisher:

Abstract:

Phosphorus (P) is an essential macronutrient for all living organisms. Phosphorus is often present in nature as the soluble phosphate ion PO4³⁻ and has biological, terrestrial, and marine emission sources. Thus PO4³⁻ detected in ice cores has the potential to be an important tracer for biological activity in the past. In this study a continuous and highly sensitive absorption method for detection of dissolved reactive phosphorus (DRP) in ice cores has been developed using a molybdate reagent and a 2-m liquid waveguide capillary cell (LWCC). DRP is the soluble form of the nutrient phosphorus, which reacts with molybdate. The method was optimized to meet the low concentrations of DRP in Greenland ice, with a depth resolution of approximately 2 cm and an analytical uncertainty of 1.1 nM (0.1 ppb) PO4³⁻. The method has been applied to segments of a shallow firn core from Northeast Greenland, indicating a mean concentration level of 2.74 nM (0.26 ppb) PO4³⁻ for the period 1930–2005 with a standard deviation of 1.37 nM (0.13 ppb) PO4³⁻ and values reaching as high as 10.52 nM (1 ppb) PO4³⁻. Similar levels were detected for the period 1771–1823. Based on impurity abundances, dust and biogenic particles were found to be the most likely sources of DRP deposited in Northeast Greenland.
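An absorption method like this ultimately converts an optical reading into a concentration through a linear (Beer-Lambert) calibration against standards. The sketch below shows that conversion step; the standard concentrations and absorbance values are invented for illustration, not the paper's calibration data.

```python
# Hypothetical sketch: converting LWCC absorbance readings to DRP
# concentrations (nM) via a linear calibration fitted to standards.

def linear_fit(x, y):
    """Ordinary least-squares fit y = a*x + b; returns (a, b)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Calibration standards: DRP concentration (nM) vs. measured absorbance (AU).
standards_nm = [0.0, 2.0, 5.0, 10.0]
absorbance = [0.001, 0.021, 0.051, 0.101]  # illustrative values only

slope, intercept = linear_fit(standards_nm, absorbance)

def absorbance_to_nm(a):
    """Invert the calibration line to get concentration in nM."""
    return (a - intercept) / slope

sample = absorbance_to_nm(0.028)  # a hypothetical ice-core sample reading
print(round(sample, 2))
```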

Relevance:

100.00%

Publisher:

Abstract:

Linkage disequilibrium methods can be used to find genes influencing quantitative trait variation in humans. Linkage disequilibrium methods can require smaller sample sizes than linkage equilibrium methods, such as the variance component approach, to find loci with a specific effect size. The increase in power comes at the expense of requiring more markers to be typed to scan the entire genome. This thesis compares different linkage disequilibrium methods to determine which factors influence the power to detect disequilibrium. The costs of disequilibrium and equilibrium tests were compared to determine whether the savings in phenotyping costs when using disequilibrium methods outweigh the additional genotyping costs. Nine linkage disequilibrium tests were examined by simulation. Five tests involve selecting isolated unrelated individuals, while four involve the selection of parent-child trios (TDT). All nine tests were found to identify disequilibrium with the correct significance level in Hardy-Weinberg populations. Increasing linked genetic variance and trait allele frequency increased the power to detect disequilibrium, while increasing the number of generations and the distance between marker and trait loci decreased it. Discordant sampling was used for several of the tests; the more stringent the sampling, the greater the power to detect disequilibrium in a sample of given size. The power to detect disequilibrium was not affected by the presence of polygenic effects. When the trait locus had more than two trait alleles, the power of the tests maximized to less than one.
For the simulation methods used here, when there were more than two trait alleles there was a probability equal to 1 minus the heterozygosity of the marker locus that both trait alleles were in disequilibrium with the same marker allele, resulting in the marker being uninformative for disequilibrium. The five tests using isolated unrelated individuals were found to have excess error rates when there was disequilibrium due to population admixture. Increased error rates also resulted from increased unlinked major gene effects, discordant trait allele frequency, and increased disequilibrium. Polygenic effects did not affect the error rates. The TDT (Transmission Disequilibrium Test) based tests were not liable to any increase in error rates. For all sample ascertainment costs, for recent mutations (<100 generations) linkage disequilibrium tests were less expensive to carry out than the variance component test. Candidate gene scans saved even more money. The use of recently admixed populations also decreased the cost of performing a linkage disequilibrium test.
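The robustness of the TDT to admixture noted above comes from its within-family design; its statistic is a simple McNemar-style chi-square on transmissions from heterozygous parents. A minimal sketch, with invented transmission counts:

```python
# Minimal sketch of the Transmission Disequilibrium Test (TDT) statistic.
# b and c count transmissions vs. non-transmissions of a candidate allele
# from heterozygous parents to affected offspring; counts are hypothetical.

def tdt_chi_square(b, c):
    """McNemar-style TDT statistic, chi-square with 1 degree of freedom."""
    return (b - c) ** 2 / (b + c)

b, c = 60, 40  # hypothetical transmission counts
stat = tdt_chi_square(b, c)
# Under H0 (no linkage and disequilibrium) the statistic is chi-square(1);
# 3.84 is the 5% critical value.
significant = stat > 3.84
print(stat, significant)
```

Because only transmissions within families are compared, population stratification cannot inflate b relative to c, which is why the TDT-based tests above showed no excess error under admixture.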

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVES To assess the available evidence on the effectiveness of accelerated orthodontic tooth movement through surgical and non-surgical approaches in orthodontic patients. METHODS Randomized controlled trials and controlled clinical trials were identified through electronic and hand searches (last update: March 2014). Orthognathic surgery, distraction osteogenesis, and pharmacological approaches were excluded. Risk of bias was assessed using the Cochrane risk of bias tool. RESULTS Eighteen trials involving 354 participants were included for qualitative and quantitative synthesis. Eight trials reported on low-intensity laser, one on photobiomodulation, one on pulsed electromagnetic fields, seven on corticotomy, and one on interseptal bone reduction. Two studies on corticotomy and two on low-intensity laser, which had low or unclear risk of bias, were mathematically combined using the random-effects model. Higher canine retraction rate was evident with corticotomy during the first month of therapy (WMD=0.73; 95% CI: 0.28, 1.19; p<0.01) and with low-intensity laser (WMD=0.42 mm/month; 95% CI: 0.26, 0.57; p<0.001) over a period longer than 3 months. The quality of evidence supporting the interventions is moderate for laser therapy and low for corticotomy. CONCLUSIONS There is some evidence that low-intensity laser therapy and corticotomy are effective, whereas the evidence is weak for interseptal bone reduction and very weak for photobiomodulation and pulsed electromagnetic fields. Overall, the results should be interpreted with caution given the small number, quality, and heterogeneity of the included studies. Further research is required in this field with additional attention to application protocols, adverse effects, and cost-benefit analysis.
CLINICAL SIGNIFICANCE From the qualitative and quantitative synthesis of the studies, it could be concluded that there is some evidence that low-intensity laser therapy and corticotomy are associated with accelerated orthodontic tooth movement, while further investigation is required before routine application.
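The "random-effects model" used to pool the trials above is commonly the DerSimonian-Laird method. The sketch below shows that pooling step on invented effect sizes and standard errors; it is not the review's data or its exact software.

```python
# Hedged sketch of DerSimonian-Laird random-effects pooling of weighted mean
# differences (WMDs). Effects and standard errors below are invented.

import math

def dersimonian_laird(effects, ses):
    """Pool study effects under a random-effects model; returns (pooled, se)."""
    w = [1 / se ** 2 for se in ses]                      # fixed-effect weights
    fixed = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
    q = sum(wi * (ei - fixed) ** 2 for wi, ei in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                        # between-study variance
    w_star = [1 / (se ** 2 + tau2) for se in ses]        # random-effects weights
    pooled = sum(wi * ei for wi, ei in zip(w_star, effects)) / sum(w_star)
    se_pooled = math.sqrt(1 / sum(w_star))
    return pooled, se_pooled

effects = [0.6, 0.9]      # hypothetical WMDs (mm/month) from two trials
ses = [0.15, 0.20]        # hypothetical standard errors
pooled, se = dersimonian_laird(effects, ses)
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se          # 95% CI
print(round(pooled, 2), round(lo, 2), round(hi, 2))
```

The between-study variance tau² widens the confidence interval relative to a fixed-effect pool, reflecting the heterogeneity the review cautions about.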

Relevance:

100.00%

Publisher:

Abstract:

ATLAS measurements of the azimuthal anisotropy in lead–lead collisions at √s_NN = 2.76 TeV are shown using a dataset of approximately 7 μb⁻¹ collected at the LHC in 2010. The measurements are performed for charged particles with transverse momenta 0.5 < pT < 20 GeV and in the pseudorapidity range |η| < 2.5. The anisotropy is characterized by the Fourier coefficients, vn, of the charged-particle azimuthal angle distribution for n = 2–4. The Fourier coefficients are evaluated using multi-particle cumulants calculated with the generating function method. Results on the transverse momentum, pseudorapidity and centrality dependence of the vn coefficients are presented. The elliptic flow, v2, is obtained from the two-, four-, six- and eight-particle cumulants, while higher-order coefficients, v3 and v4, are determined with two- and four-particle cumulants. Flow harmonics vn measured with four-particle cumulants are significantly reduced compared to the measurement involving two-particle cumulants. A comparison to vn measurements obtained using different analysis methods and previously reported by the LHC experiments is also shown. Results of measurements of flow fluctuations evaluated with multi-particle cumulants are shown as a function of transverse momentum and collision centrality. Models of the initial spatial geometry and its fluctuations fail to describe the flow fluctuation measurements.
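As a toy illustration of the simplest of these estimators (not the ATLAS analysis code), the two-particle cumulant c2{2} can be computed from per-event flow vectors Q2, with self-pairs removed, and v2{2} = sqrt(c2{2}). The event sample below is generated from an azimuthal distribution with a known v2 = 0.1.

```python
# Toy sketch: estimating the elliptic flow coefficient v2 from the
# two-particle cumulant using per-event Q-vectors. Events are sampled from
# dN/dphi proportional to 1 + 2*v2*cos(2*phi) with v2 = 0.1.

import cmath
import math
import random

random.seed(1)
V2_TRUE = 0.1

def sample_phi():
    """Accept-reject sampling from 1 + 2*v2*cos(2*phi)."""
    while True:
        phi = random.uniform(0.0, 2.0 * math.pi)
        envelope = 1.0 + 2.0 * V2_TRUE
        if random.uniform(0.0, envelope) <= 1.0 + 2.0 * V2_TRUE * math.cos(2.0 * phi):
            return phi

def c2_two_particle(events):
    """Event-averaged two-particle cumulant <<cos 2(phi1 - phi2)>>."""
    num, den = 0.0, 0.0
    for phis in events:
        m = len(phis)
        q2 = sum(cmath.exp(2j * phi) for phi in phis)   # flow vector Q2
        num += abs(q2) ** 2 - m                         # removes self-pairs
        den += m * (m - 1)
    return num / den

events = [[sample_phi() for _ in range(200)] for _ in range(300)]
v2 = math.sqrt(c2_two_particle(events))                 # v2{2} estimate
print(round(v2, 3))
```

Unlike this two-particle estimate, the four-particle cumulant mentioned in the abstract suppresses few-particle ("non-flow") correlations, which is one reason v2{4} comes out systematically below v2{2}.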

Relevance:

100.00%

Publisher:

Abstract:

Through the correct implementation of lean manufacturing methods, a company can greatly improve its business. Over a period of three months at TTM Technologies, I applied my knowledge to fix existing problems and streamline production. In addition, other trouble areas in their production process were discovered and proper lean methods were used to address them. TTM Technologies saw many changes in the right direction over this time period.

Relevance:

100.00%

Publisher:

Abstract:

The pattern of births during the week has been reported by many studies: births occurring on weekends are consistently fewer than births occurring on weekdays. This study employed two statistical methods, two-way ANOVA and the two-way Friedman test, to analyse daily variation among 222,735 births from 2005–2007 in Harris County, Texas. The two methods were compared on their assumptions, procedures and results. Both tests showed a significant result, indicating that births are not uniformly distributed across the week. Multiple comparisons demonstrated that births occurring on weekends were significantly different from births occurring on weekdays, with the fewest on Sundays.
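The nonparametric half of that comparison can be sketched directly: the Friedman test ranks the seven days within each week (block) and tests whether the rank sums differ. The weekly counts below are invented, not the Harris County data.

```python
# Hedged sketch of the Friedman test on daily birth counts: days of the week
# are treatments, weeks are blocks. Counts are hypothetical.

def ranks(row):
    """Rank values within one block, averaging ties."""
    order = sorted(range(len(row)), key=lambda i: row[i])
    r = [0.0] * len(row)
    i = 0
    while i < len(row):
        j = i
        while j + 1 < len(row) and row[order[j + 1]] == row[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # average rank for the tie group
        for t in range(i, j + 1):
            r[order[t]] = avg
        i = j + 1
    return r

def friedman_statistic(blocks):
    """Classic Friedman chi-square: 12/(n*k*(k+1)) * sum(Rj^2) - 3*n*(k+1)."""
    n, k = len(blocks), len(blocks[0])
    col_sums = [0.0] * k
    for row in blocks:
        for j, rj in enumerate(ranks(row)):
            col_sums[j] += rj
    return 12.0 / (n * k * (k + 1)) * sum(s * s for s in col_sums) - 3.0 * n * (k + 1)

# Hypothetical weekly birth counts, Sunday..Saturday, for 4 weeks:
weeks = [
    [310, 420, 445, 430, 440, 425, 330],
    [300, 430, 450, 445, 435, 430, 325],
    [315, 425, 440, 450, 445, 420, 320],
    [305, 435, 455, 440, 450, 415, 335],
]
stat = friedman_statistic(weeks)
# With k = 7 days, df = 6; the 5% critical value is about 12.59.
print(round(stat, 2), stat > 12.59)
```

Because it uses only within-week ranks, the Friedman test drops the normality and equal-variance assumptions that two-way ANOVA needs, which is the methodological contrast the study examines.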

Relevance:

100.00%

Publisher:

Abstract:

Studies have shown that rare genetic variants can have stronger effects in predisposing to common diseases, and several statistical methods have been developed for association studies involving rare variants. To better understand how these statistical methods perform, we compared two recently developed rare-variant statistical methods (VT and C-alpha) on 10,000 simulated re-sequencing data sets with disease status and the corresponding 10,000 simulated null data sets. The SLC1A1 gene has been suggested to be associated with diastolic blood pressure (DBP) in previous studies. In the current study, we applied the VT and C-alpha methods to empirical re-sequencing data for the SLC1A1 gene from 300 whites and 200 blacks. We found that the VT method obtains higher power and performs better than the C-alpha method with the simulated data we used. Type I errors were well controlled for both methods. In addition, both the VT and C-alpha methods suggested no statistical evidence for an association between the SLC1A1 gene and DBP. Overall, our findings provide a useful comparison of the two statistical methods for future reference and preliminary findings on the association between the SLC1A1 gene and blood pressure.
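For context, the generic form of the C-alpha statistic can be sketched as follows. This is a hedged illustration of the statistic's usual definition, not the implementation compared in the study, and the genotype counts are invented.

```python
# Hedged sketch of a C-alpha-style statistic for rare variants. For each
# variant i with n_i minor-allele copies in the sample, y_i of which occur in
# cases, the test asks whether the case/control split of copies is
# overdispersed relative to a binomial with p0 = fraction of case chromosomes.

import math

def binom_pmf(u, n, p):
    return math.comb(n, u) * p ** u * (1 - p) ** (n - u)

def c_alpha(copies_in_cases, total_copies, p0):
    """Return the standardized statistic Z = T / sqrt(c)."""
    t = 0.0
    c = 0.0
    for y, n in zip(copies_in_cases, total_copies):
        expected_var = n * p0 * (1 - p0)
        t += (y - n * p0) ** 2 - expected_var
        # exact null-variance contribution for a variant with n copies
        c += sum(binom_pmf(u, n, p0) * ((u - n * p0) ** 2 - expected_var) ** 2
                 for u in range(n + 1))
    return t / math.sqrt(c)

# Hypothetical data: 5 rare variants, half of the sample are cases (p0 = 0.5).
y = [4, 0, 3, 5, 1]     # copies observed in cases
n = [5, 3, 4, 6, 4]     # total copies in the sample
z = c_alpha(y, n, 0.5)
print(round(z, 2))
```

Because it tests overdispersion rather than a one-directional burden, C-alpha retains power when a gene carries a mix of risk and protective variants, which is the usual rationale for comparing it against burden-style tests like VT.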

Relevance:

100.00%

Publisher:

Abstract:

Information on the possible resource value of sea-floor manganese nodule deposits in the eastern North Pacific has been obtained through a study of records and collections of the 1972 Sea Scope Expedition.

Relevance:

100.00%

Publisher:

Abstract:

This paper addresses the practicality of global flow analysis in logic program compilation, in terms of both speed and precision of analysis. It discusses design and implementation aspects of two practical abstract interpretation-based flow analysis systems: MA3, the MOO And-parallel Analyzer and Annotator, and Ms, an experimental mode inference system developed for SB-Prolog. The paper also provides performance data obtained from these implementations. Based on these results, it is concluded that the overhead of global flow analysis is not prohibitive, while the results of analysis can be quite precise and useful.
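Neither MA3 nor Ms is reproduced here; as a minimal illustration of the fixpoint iteration at the heart of such abstract interpretation-based analyzers, the sketch below runs a toy groundness analysis over invented predicates and transfer functions (a two-element lattice: 'ground' below 'any').

```python
# Toy sketch of abstract interpretation-based flow analysis: each predicate
# argument is described as 'ground' or 'any', and abstract transfer functions
# (one per clause body, invented here) are iterated to a least fixpoint.

def join(a, b):
    """Lattice join: 'ground' < 'any'; the join picks the less precise value."""
    return 'any' if 'any' in (a, b) else 'ground'

def analyze(transfer, initial):
    """Iterate the transfer functions until the description stabilizes."""
    state = dict(initial)
    changed = True
    while changed:
        changed = False
        for pred, fn in transfer.items():
            new = fn(state)
            if new != state[pred]:
                state[pred] = join(state[pred], new)
                changed = True
    return state

# Toy program:
#   p(X) :- X = a.          % p's argument is always ground
#   q(X) :- p(X).           % q inherits groundness from p
#   r(X) :- q(X) ; s(X).    % r joins the descriptions of q and s
#   s(X).                   % s constrains nothing: argument may be any term
transfer = {
    'p': lambda st: 'ground',
    'q': lambda st: st['p'],
    'r': lambda st: join(st['q'], st['s']),
    's': lambda st: 'any',
}
initial = {k: 'ground' for k in transfer}   # start from the most precise guess
result = analyze(transfer, initial)
print(result)
```

The analysis converges because the lattice is finite and every update moves monotonically upward; the same termination argument underlies the practical analyzers the paper measures.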

Relevance:

100.00%

Publisher:

Abstract:

Abstract interpretation-based data-flow analysis of logic programs is at this point relatively well understood from the point of view of general frameworks and abstract domains. On the other hand, comparatively little attention has been given to the problems which arise when analysis of a full, practical dialect of the Prolog language is attempted, and only a few solutions to these problems have been proposed to date. Such problems relate to dealing correctly with all builtins, including meta-logical and extra-logical predicates, with dynamic predicates (where the program is modified during execution), and with the absence of certain program text during compilation. Existing proposals for dealing with such issues generally restrict in one way or another the classes of programs which can be analyzed if the information from analysis is to be used for program optimization. This paper attempts to fill this gap by considering a full dialect of Prolog, essentially following the recently proposed ISO standard, pointing out the problems that may arise in the analysis of such a dialect, and proposing a combination of known and novel solutions that together allow the correct analysis of arbitrary programs using the full power of the language.

Relevance:

100.00%

Publisher:

Abstract:

Quantitative descriptive analysis (QDA) is used to describe the nature and intensity of sensory properties from a single evaluation of a product, whereas temporal dominance of sensations (TDS) is primarily used to identify dominant sensory properties over time. Previous studies with TDS have focused on model systems, but this is the first study to use a sequential approach, i.e. QDA then TDS, in measuring sensory properties of a commercial product category, using the same set of trained assessors (n = 11). The main objectives of this study were: (1) to investigate the benefits of using a sequential approach of QDA and TDS and (2) to explore the impact of sample composition on taste and flavour perceptions in blackcurrant squashes. The present study proposes an alternative way of determining the choice of attributes for TDS measurement, based on data obtained from previous QDA studies where available. Both methods indicated that the flavour profile was primarily influenced by the level of dilution and the complexity of sample composition combined with blackcurrant juice content. In addition, artificial sweeteners were found to modify the quality of sweetness and could also contribute to bitter notes. Using QDA and TDS in tandem was shown to be more beneficial than using either on its own, enabling a more complete sensory profile of the products.
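The core TDS computation can be sketched in a few lines: at each time point, the dominance rate of an attribute is the proportion of evaluations in which assessors selected it as dominant. The attribute names and evaluation data below are invented, not the study's blackcurrant-squash results.

```python
# Hypothetical sketch of computing TDS dominance rates. Each run records the
# attribute selected as dominant at successive time points.

def dominance_rates(runs, attributes):
    """Return {attribute: [dominance rate at each time point]}."""
    n_runs = len(runs)
    n_times = len(runs[0])
    rates = {a: [0.0] * n_times for a in attributes}
    for run in runs:
        for t, dominant in enumerate(run):
            rates[dominant][t] += 1.0 / n_runs
    return rates

attributes = ['sweet', 'blackcurrant', 'bitter']
runs = [
    ['sweet', 'blackcurrant', 'bitter'],
    ['sweet', 'blackcurrant', 'blackcurrant'],
    ['blackcurrant', 'blackcurrant', 'bitter'],
    ['sweet', 'sweet', 'bitter'],
]
rates = dominance_rates(runs, attributes)
print(rates['sweet'])   # dominance of 'sweet' across the three time points
```

Plotted against time, these curves show the succession of dominant sensations that a single-point QDA profile cannot capture, which is the complementarity the study exploits.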