904 results for Test data
Abstract:
To enable a mathematically and physically sound execution of the fatigue test and a correct interpretation of its results, statistical evaluation methods are used to assist in the analysis of fatigue testing data. The main objective of this work is to develop step-by-step instructions for the statistical analysis of laboratory fatigue data. The scope of this project is to provide practical cases that answer the various questions raised in the treatment of test data, applying the methods and formulae in the document IIW-XIII-2138-06 (Best Practice Guide on the Statistical Analysis of Fatigue Data). Generally, the questions in the data sheets involve the following aspects: estimation of the necessary sample size, verification of the statistical equivalence of the collated sets of data, and determination of characteristic curves in different cases. The series of comprehensive examples given in this thesis demonstrates the various statistical methods and supports a sound procedure for creating reliable calculation rules for fatigue analysis.
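As an illustration of the kind of calculation such instructions cover, the following Python sketch fits a mean S-N curve by linear regression in log-log space and shifts it downward by a one-sided tolerance factor to obtain a characteristic curve. It uses invented data, and the tolerance factor is a common non-central-t approximation rather than the exact formulae of IIW-XIII-2138-06.

import numpy as np
from scipy import stats

def characteristic_sn_curve(stress_ranges, cycles, survival=0.95, confidence=0.75):
    # Fit log N = b0 + b1 * log S and shift the mean curve down by k standard
    # deviations of the residuals to obtain a characteristic curve.
    x = np.log10(stress_ranges)
    y = np.log10(cycles)
    n = len(x)
    b1, b0 = np.polyfit(x, y, 1)              # slope and intercept of the mean curve
    resid = y - (b0 + b1 * x)
    s = np.sqrt(np.sum(resid**2) / (n - 2))   # standard deviation of log N about the fit
    # One-sided tolerance factor (non-central-t approximation)
    zp = stats.norm.ppf(survival)
    k = stats.nct.ppf(confidence, df=n - 2, nc=zp * np.sqrt(n)) / np.sqrt(n)
    return b0, b1, s, k, b0 - k * s           # characteristic intercept

# Hypothetical fatigue results: stress range in MPa, cycles to failure
S = np.array([200, 180, 160, 150, 140, 130, 120, 110, 100, 90])
N = np.array([1.2e5, 2.0e5, 3.5e5, 4.8e5, 6.5e5, 9.0e5, 1.4e6, 2.1e6, 3.5e6, 6.0e6])
print(characteristic_sn_curve(S, N))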
Case study of the use of remotely sensed data for modeling flood inundation on the river Severn, UK.
Abstract:
A methodology for using remotely sensed data to both generate and evaluate a hydraulic model of floodplain inundation is presented for a rural case study in the United Kingdom: Upton-upon-Severn. Remotely sensed data have been processed and assembled to provide an excellent test data set for both model construction and validation. In order to assess the usefulness of the data and the issues encountered in their use, two models for floodplain inundation were constructed: one based on an industry standard one-dimensional approach and the other based on a simple two-dimensional approach. The results and their implications for the future use of remotely sensed data for predicting flood inundation are discussed. Key conclusions for the use of remotely sensed data are that care must be taken to integrate different data sources for both model construction and validation and that improvements in ground height data shift the focus in terms of model uncertainties to other sources such as boundary conditions. The differences between the two models are found to be of minor significance.
Abstract:
In many industries, for example the automotive industry, digital mock-ups are used to verify the design and the function of a product on a virtual prototype. One use case is the verification of safety clearances between individual components, the so-called clearance analysis. Engineers determine for specific components whether they maintain a prescribed safety clearance to the surrounding components, both in their rest position and during a motion. If components fall below the safety clearance, their shape or position must be changed. For this, it is important to know exactly which regions of the components violate the safety clearance.
In this thesis we present a solution for the real-time computation of all regions between two geometric objects that fall below the safety clearance. Each object is given as a set of primitives (e.g., triangles). For every instant at which a transformation is applied to one of the objects, we compute the set of all primitives that fall below the safety clearance and refer to them as the set of all tolerance-violating primitives. We present a holistic solution that can be divided into the following three major topics.
In the first part of this thesis we examine algorithms that check, for two triangles, whether they are tolerance-violating. We present several approaches for triangle-triangle tolerance tests and show that dedicated tolerance tests perform significantly better than the distance computations used so far. The focus of our work is the development of a novel tolerance test that operates in dual space. In all our benchmarks for computing all tolerance-violating primitives, our dual-space approach consistently proves to be the fastest.
The second part of this thesis deals with data structures and algorithms for the real-time computation of all tolerance-violating primitives between two geometric objects. We develop a combined data structure consisting of a flat hierarchical data structure and several uniform grids. To guarantee efficient running times, it is especially important to account for the required safety clearance in the design of the data structures and the query algorithms. We present solutions that quickly determine the set of primitive pairs that need to be tested. In addition, we develop strategies for detecting primitives as tolerance-violating without computing an expensive primitive-primitive tolerance test. In our benchmarks we show that our solutions are able to compute, in real time, all tolerance-violating primitives between two complex geometric objects, each consisting of many hundreds of thousands of primitives.
In the third part we present a novel, memory-optimized data structure for managing the cell contents of the uniform grids used before. We call this data structure Shrubs. Previous approaches to memory optimization of uniform grids mainly rely on hashing methods, which, however, do not reduce the memory consumption of the cell contents. In our application, neighbouring cells often have similar contents. Exploiting this redundancy, our approach losslessly compresses the memory footprint of the cell contents of a uniform grid to one fifth of its previous size and decompresses it at runtime.
Finally, we show how our solution for computing all tolerance-violating primitives can be applied in practice. Beyond pure clearance analysis, we demonstrate applications to various path-planning problems.
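To give a flavour of the broad-phase step described above, the following Python sketch hashes the bounding boxes of one object's triangles into a uniform grid whose cell size matches the safety clearance and then collects candidate pairs for the second object; only these candidates would be passed on to an exact triangle-triangle tolerance test. It is a minimal illustration of the general idea and does not reproduce the combined hierarchy, the dual-space test or the Shrubs compression developed in the thesis.

from collections import defaultdict
from itertools import product
import numpy as np

def grid_candidate_pairs(tris_a, tris_b, clearance):
    # tris_a, tris_b: arrays of shape (n, 3, 3) holding triangle vertices.
    # Returns index pairs (i, j) whose bounding boxes may be closer than the clearance.
    cell = max(clearance, 1e-9)
    grid = defaultdict(list)

    def cells(tri, inflate=0.0):
        lo = np.floor((tri.min(axis=0) - inflate) / cell).astype(int)
        hi = np.floor((tri.max(axis=0) + inflate) / cell).astype(int)
        return product(*(range(lo[d], hi[d] + 1) for d in range(3)))

    for i, tri in enumerate(tris_a):          # hash object A into the grid
        for c in cells(tri):
            grid[c].append(i)

    pairs = set()
    for j, tri in enumerate(tris_b):          # query with clearance-inflated boxes of B
        for c in cells(tri, inflate=clearance):
            for i in grid[c]:
                pairs.add((i, j))
    return pairs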
Abstract:
The purpose of this project was to investigate the effect of using data collection technology on student attitudes towards science instruction. The study was conducted over the course of two years at Madison High School in Adrian, Michigan, primarily in college preparatory physics classes, but also in one college preparatory chemistry class and one environmental science class. A preliminary study was conducted at a Lenawee County Intermediate Schools student summer environmental science day camp. The data collection technology used was a combination of Texas Instruments TI-84 Silver Plus graphing calculators and Vernier LabPro data collection sleds with various probeware attachments, including motion sensors, pH probes and accelerometers. Students were given written procedures for most laboratory activities and were provided with data tables and analysis questions to answer about the activities. The first year of the study included a pretest and posttest measuring student attitudes towards the class they were enrolled in. Pretest and posttest data were analyzed to determine effect size, which was found to be very small (Coe, 2002). The second year of the study focused only on a physics class and used Keller's ARCS model for measuring student motivation based on the four aspects of motivation: Attention, Relevance, Confidence and Satisfaction (Keller, 2010). According to this model, there were two distinct groups in the class, one of which was motivated to learn and the other of which was not. The data suggest that the use of data collection technology in science classes should be started early in a student's career, possibly in early middle school or late elementary school. This would build familiarity with the equipment and allow for greater exploration by students as they progress through high school and into upper-level science courses.
Abstract:
The variability of toxicity data contained within databases was investigated using the widely used US EPA ECOTOX database as an example. Fish acute lethality (LC50) values for 44 compounds (for which at least 10 data entries existed) were extracted from the ECOTOX database yielding a total of 4654 test records. Significant variability of LC50 test results was observed, exceeding several orders of magnitude. In an attempt to systematically explore potential causes of the data variability, the influence of biological factors (such as test species or life stages) and physical factors (such as water temperature, pH or water hardness) were examined. Even after eliminating the influence of these inherent factors, considerable data variability remained, suggesting an important role of factors relating to technical and measurement procedures. The analysis, however, was limited by pronounced gaps in the test documentation. Of the 4654 extracted test reports, 66.5% provided no information on the fish life stage used for testing. Likewise, water temperature, hardness or pH were not recorded in 19.6%, 48.2% and 41.2% of the data entries, respectively. From these findings, we recommend the rigorous control of data entries ensuring complete recording of testing conditions. A more consistent database will help to better discriminate between technical and natural variability of the test data, which is of importance in ecological risk assessment for extrapolation from laboratory tests to the field, and also might help to develop correction factors that account for systematic differences in test results caused by species, life stage or test conditions.
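A minimal sketch of this kind of screening, assuming a hypothetical CSV extract of the database (the file name and column names are invented), could compute the per-compound spread of LC50 values in orders of magnitude and the fraction of records with missing test conditions:

import numpy as np
import pandas as pd

df = pd.read_csv("ecotox_lc50_extract.csv")   # columns assumed: compound, life_stage, temp_C, pH, hardness, lc50_ug_per_L

# Spread of LC50 values per compound, expressed in orders of magnitude
spread = (df.groupby("compound")["lc50_ug_per_L"]
            .agg(lambda v: np.log10(v.max()) - np.log10(v.min()))
            .sort_values(ascending=False))

# Fraction of records missing each test-condition field
missing = df[["life_stage", "temp_C", "pH", "hardness"]].isna().mean()

print(spread.head())
print(missing)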
Abstract:
Field material testing provides firsthand information on pavement conditions, which is most helpful in evaluating performance and identifying preventive maintenance or overlay strategies. The high variability of field asphalt concrete due to construction raises the demand for accuracy of the test. Accordingly, the objective of this study is to propose a reliable and repeatable methodology to evaluate the fracture properties of field-aged asphalt concrete using the overlay test (OT). The OT is selected because of its efficiency and feasibility for asphalt field cores with diverse dimensions. The fracture properties refer to the Paris' law parameters based on the pseudo J-integral (A and n) because of the sound physical significance of the pseudo J-integral with respect to characterizing the cracking process. In order to determine A and n, a two-step OT protocol is designed to characterize the undamaged and damaged behaviors of asphalt field cores. To ensure the accuracy of the determined undamaged and fracture properties, a new analysis method is then developed for data processing, which combines finite element simulations with mechanical analysis of the viscoelastic force equilibrium and the evolution of pseudo displacement work in the OT specimen. Finally, theoretical equations are derived to calculate A and n directly from the OT data, and the accuracy of the determined fracture properties is verified. The proposed methodology is applied to a total of 27 asphalt field cores obtained from a field project in Texas, including the control Hot Mix Asphalt (HMA) and two types of warm mix asphalt (WMA). The results demonstrate a high linear correlation between n and −log A for all the tested field cores. Investigations of the effect of field aging on the fracture properties confirm that n is a good indicator to quantify the cracking resistance of asphalt concrete. It is also indicated that summer climatic conditions clearly accelerate the rate of aging. The impact of the WMA technologies on the fracture properties of asphalt concrete is visualized by comparing the n-values: the Evotherm WMA technology slightly improves the cracking resistance, while the foaming WMA technology provides fracture properties comparable to those of the HMA. After 15 months of aging in the field, the cracking resistance does not differ significantly between the HMA and the WMAs, which is confirmed by observations of field distresses.
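The last step of determining the Paris' law parameters can be illustrated by a log-log regression: for a relation of the form dc/dN = A·(J_R)^n, n is the slope and log A the intercept of the fitted line. The Python sketch below shows only this fitting step, with hypothetical inputs, and omits the finite element simulations and viscoelastic analysis that the proposed methodology uses to obtain the crack growth and pseudo J-integral histories.

import numpy as np

def fit_paris_pseudo_j(crack_growth_rate, pseudo_j):
    # Fit dc/dN = A * (J_R)^n in log-log space; returns n (slope) and log10(A) (intercept).
    n, log_A = np.polyfit(np.log10(pseudo_j), np.log10(crack_growth_rate), 1)
    return n, log_A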
Abstract:
The electronics industry is experiencing two trends, one of which is the drive towards miniaturization of electronic products. The in-circuit testing predominantly used for continuity testing of printed circuit boards (PCBs) can no longer meet the demands of smaller-size circuits. This has led to the development of moving probe testing equipment. Moving Probe Test opens up the opportunity to test PCBs where the test points are on a small pitch (distance between points). However, since the test uses probes that move sequentially to perform the test, the total test time is much greater than for traditional in-circuit test. While significant effort has concentrated on the equipment design and development, little work has examined algorithms for efficient test sequencing. The test sequence has the greatest impact on total test time, which in turn determines the production cycle time of the product. Minimizing total test time is an NP-hard problem similar to the traveling salesman problem, except with two traveling salesmen that must coordinate their movements. The main goal of this thesis was to develop a heuristic algorithm to minimize the Flying Probe test time and to evaluate it against a "Nearest Neighbor" algorithm. The algorithm was implemented with Visual Basic and an MS Access database, and it was evaluated with actual PCB test data taken from industry. A statistical analysis with 95% C.C. was performed to test the hypothesis that the proposed algorithm finds a sequence whose total test time is less than the total test time found by the "Nearest Neighbor" approach. Findings demonstrated that the proposed heuristic algorithm reduces the total test time and that production cycle time can therefore be reduced through proper sequencing.
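For reference, the baseline against which the heuristic was compared can be sketched as a generic single-probe "Nearest Neighbor" ordering; the coordinates below are invented, and the two-probe coordination developed in the thesis is not reproduced here.

import math

def nearest_neighbor_sequence(points, start=0):
    # Always move the probe to the closest unvisited test point.
    unvisited = set(range(len(points)))
    order = [start]
    unvisited.remove(start)
    while unvisited:
        last = points[order[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
        order.append(nxt)
        unvisited.remove(nxt)
    return order

# Hypothetical probe target coordinates in millimetres
pads = [(0, 0), (12, 5), (3, 9), (20, 2), (7, 14)]
print(nearest_neighbor_sequence(pads))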
Abstract:
The viscosity of ionic liquids (ILs) has been modeled as a function of temperature at atmospheric pressure using a new method based on the UNIFAC–VISCO method. This model extends the calculations previously reported by our group (see Zhao et al., J. Chem. Eng. Data 2016, 61, 2160–2169), which used 154 experimental viscosity data points of 25 ionic liquids for the regression of a set of binary interaction parameters and ion Vogel–Fulcher–Tammann (VFT) parameters. Discrepancies in the experimental data for the same IL affect the quality of the correlation and thus the development of the predictive method. In this work, mathematical gnostics was used to analyze the experimental data from different sources and to recommend one set of reliable data for each IL. These recommended data (819 data points in total) for 70 ILs were correlated using this model to obtain an extended set of binary interaction parameters and ion VFT parameters, with a regression accuracy of 1.4%. In addition, 966 experimental viscosity data points for 11 binary mixtures of ILs were collected from the literature to establish this model. The binary data consist of 128 training data points used for the optimization of binary interaction parameters and 838 test data points used for comparison with the evaluated values. The relative average absolute deviation (RAAD) for training and test is 2.9% and 3.9%, respectively.
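For context, the Vogel–Fulcher–Tammann temperature dependence referred to above has the generic form

\ln \eta = A_{\mathrm{VFT}} + \frac{B_{\mathrm{VFT}}}{T - T_{0}}

where \eta is the viscosity, T the temperature, and A_VFT, B_VFT and T_0 the fitted VFT parameters; the ion-specific parametrisation and the UNIFAC–VISCO mixing rules used in the paper are more elaborate than this single-substance form.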
Abstract:
Numerical modelling and simulations are needed to develop and test specific analysis methods by providing test data before BIRDY is launched. This document describes the "satellite data simulator", a multi-sensor, multi-spectral satellite simulator produced especially for the BIRDY mission, which could also be used to analyse data from other satellite missions providing energetic particle data in the Solar System.
Abstract:
This document does NOT address the issue of oxygen data quality control (either real-time or delayed mode). As a preliminary step towards that goal, this document seeks to ensure that all countries deploying floats equipped with oxygen sensors document the data and metadata related to these floats properly. We produced this document in response to action item 14 from the AST-10 meeting in Hangzhou (March 22-23, 2009). Action item 14: Denis Gilbert to work with Taiyo Kobayashi and Virginie Thierry to ensure DACs are processing oxygen data according to recommendations. If the recommendations contained herein are followed, we will end up with a more uniform set of oxygen data within the Argo data system, allowing users to begin analysing not only their own oxygen data but also those of others, in the true spirit of Argo data sharing. Indications provided in this document are valid as of the date of writing. It is very likely that changes in sensors, calibrations and conversion equations will occur in the future. Please contact V. Thierry (vthierry@ifremer.fr) about any inconsistencies or missing information. A dedicated webpage on the Argo Data Management website (www) contains all information regarding Argo oxygen data management: current and previous versions of this cookbook, oxygen sensor manuals, calibration sheet examples, examples of Matlab code to process oxygen data, test data, etc.
Abstract:
This thesis describes the development and correlation of a thermal model that forms the foundation of a thermal capacitance spacecraft propellant load estimator. Specific details of creating the thermal model for the diaphragm propellant tank used on NASA's Magnetospheric Multiscale (MMS) spacecraft using ANSYS, and of the correlation process implemented, are presented. The thermal model was correlated to within ±3 °C of the thermal vacuum test data and was determined to be sufficient for making future propellant predictions on MMS. The model was also found to be relatively sensitive to uncertainties in the applied heat flux and in the mass knowledge of the tank. More work is needed to improve temperature predictions in the upper hemisphere of the propellant tank, where predictions were found to be 2-2.5 °C lower than the test data. A road map for applying the model to predict propellant loads on the actual MMS spacecraft in 2017-2018 is also presented.
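The underlying principle of a thermal capacitance gauge can be summarised by a simple lumped energy balance, given here only as an assumed illustration (the thesis relies on the correlated ANSYS model rather than a closed-form expression): a known heat input Q raises the tank temperature by \Delta T, so the propellant mass follows from the total thermal capacitance after subtracting the known hardware contribution,

m_{\mathrm{prop}} = \frac{1}{c_{\mathrm{prop}}}\left(\frac{Q}{\Delta T} - m_{\mathrm{tank}}\, c_{\mathrm{tank}}\right)

where c_prop and c_tank are the specific heat capacities of the propellant and the tank hardware.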
Abstract:
This study evaluated comparatively the adhesion of Epiphany and AH Plus endodontic sealers to human root dentin treated with 1% NaOCl and 1% NaOCl+17% EDTA, using the push-out test. Sixty root cylinders obtained from maxillary canines had the canals prepared and were randomly assigned to 3 groups (n=20), according to root dentin treatment: GI - distilled water (control), GII - 1% NaOCl and GIII - 1% NaOCl+17% EDTA. Each group was divided into 2 subgroups (n=10) filled with either Epiphany or AH Plus. Bond strength push-out test data (kN) were obtained and analyzed statistically by ANOVA and Tukey's post-hoc test. There was statistically significant difference between sealers (AH Plus: 0.78 ± 0.13; Epiphany: 0.61 ± 0.19; p<0.01) and among root dentin treatments (distilled water: 0.58 ± 0.19; 1% NaOCl: 0.71 ± 0.12; 1% NaOCl+17% EDTA: 0.80 ± 0.17; p<0.05). In conclusion, AH Plus sealer presented greater adhesion to dentin than Epiphany, regardless of the treatment of root canal walls.
Abstract:
Background: To determine the prevalence of diabetes mellitus (DM) and impaired glucose tolerance (IGT) in a rural community (Bengo) of Angola. Methods: A random sample of 421 subjects aged 30 to 69 years (30% men and 70% women) was selected from three villages of Bengo province. This cross-sectional household survey was conducted using a multistage cluster sampling design. First, clinical and anthropometric data were obtained and the fasting capillary glucose level was determined. Subjects who screened positive (fasting capillary glucose >= 100 mg/dl and < 200 mg/dl) and each sixth consecutive subject who screened negative (fasting capillary glucose < 100 mg/dl) were submitted to the second phase of the survey, consisting of the 75 g oral glucose tolerance test. Data were analyzed using SAS statistical software. Results: The prevalence rates of diabetes mellitus and IGT were 2.8% and 8.1%, respectively. The age group with the highest prevalence of diabetes was 60 to 69 years (42%). The prevalence of impaired glucose tolerance was 38% in the 40 to 49 year age group and increased with age, with the 50 to 59 and 60 to 69 year age groups together accounting for 50% of all subjects with impaired glucose tolerance. The prevalence of diabetes mellitus did not differ significantly between men (3.2%) and women (2.7%) (p = 0.47). On the other hand, the prevalence of impaired glucose tolerance among women was almost twice that found in men (9.1% vs. 5.6%, respectively). Overweight was present in 66.7% of the individuals with diabetes mellitus, and 26.5% of the individuals with impaired glucose tolerance were overweight or obese. Conclusions: Although the prevalence of diabetes mellitus was low, the prevalence of impaired glucose tolerance is within an intermediary range, suggesting a future increase in the frequency of diabetes in this population.
Abstract:
A large percentage of pile caps support only one column, and the pile caps in turn are supported by only a few piles. These are typically short and deep members with overall span-depth ratios of less than 1.5. Codes of practice do not provide uniform treatment for the design of these types of pile caps. These members have traditionally been designed as beams spanning between piles, with the depth selected to avoid shear failures and the amount of longitudinal reinforcement selected to provide sufficient flexural capacity as calculated by engineering beam theory. More recently, the strut-and-tie method has been used for the design of pile caps (disturbed or D-regions), in which the load path is envisaged as a three-dimensional truss, with compressive forces supported by concrete struts between the column and piles and tensile forces carried by reinforcing steel located between piles. Neither of these models has provided uniform factors of safety against failure or been able to predict whether failure will occur by flexure (ductile mode) or shear (brittle mode). In this paper, an analytical model based on the strut-and-tie approach is presented. The proposed model has been calibrated using an extensive experimental database of pile caps subjected to compression and evaluated analytically for more complex loading conditions. It has been proven to be applicable across a broad range of test data and can predict the failure modes, cracking, yielding, and failure loads of four-pile caps with reasonable accuracy.
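As a point of reference for the truss idealisation mentioned above, the Python sketch below evaluates the tie forces of a symmetric four-pile cap under a centric column load using elementary node equilibrium; it is a textbook-style illustration, not the calibrated analytical model proposed in the paper.

def four_pile_tie_forces(P, ex, ey, d):
    # Symmetric four-pile cap under a centric column load P: each pile resists P/4,
    # struts run from the column node down to each pile node, and the horizontal
    # strut components are balanced by reinforcement ties between the piles.
    # ex, ey: plan distances from the column centre to each pile; d: vertical lever arm.
    pile_load = P / 4.0
    tie_x = pile_load * ex / d   # tie force in each tie parallel to x
    tie_y = pile_load * ey / d   # tie force in each tie parallel to y
    return tie_x, tie_y

# Example: 2000 kN column load, piles at +/-0.6 m in both directions, 0.7 m lever arm
print(four_pile_tie_forces(2000.0, 0.6, 0.6, 0.7))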
Abstract:
This chapter is concerned with the acquisition and analysis of test data for determining whether the flexural strength of granite cladding under extreme conditions is adequate to ensure that reliability requirements are satisfied.
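A minimal sketch of one way such test data can be reduced to a characteristic value, assuming invented strength results and a two-parameter Weibull model commonly applied to brittle stone (the chapter's actual acceptance criteria and reliability targets are not reproduced here):

import numpy as np
from scipy import stats

# Hypothetical flexural strength results for granite panels, in MPa
strengths = np.array([10.2, 11.5, 9.8, 12.1, 10.9, 11.8, 9.5, 10.6, 11.2, 10.1])

# Fit a two-parameter Weibull distribution (location fixed at zero) and estimate
# a low fractile that could serve as a characteristic design value.
shape, loc, scale = stats.weibull_min.fit(strengths, floc=0)
characteristic = stats.weibull_min.ppf(0.05, shape, loc=loc, scale=scale)
print(shape, scale, characteristic)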