983 results for Simulation Testing


Relevance: 20.00%

Abstract:

The main function of a roadway culvert is to effectively convey drainage flow during normal and extreme hydrologic conditions. This function is often impaired by sediment blockage of the culvert. This research sought to understand the mechanics of the sedimentation process at multi-box culverts and to develop self-cleaning systems that flush out sediment deposits using the power of the drainage flows themselves. The research entailed field observations, laboratory experiments, and numerical simulations. The specific role of each of these investigative tools is summarized below: a) The field observations were aimed at understanding typical sedimentation patterns and their dependence on culvert geometry and hydrodynamic conditions during normal and extreme hydrologic events. b) The laboratory experiments were used to model the sedimentation process observed in situ and to test alternative self-cleaning concepts applied to culverts. The major tasks of the initial laboratory model study were to accurately replicate the culvert performance curves and the dynamics of the sedimentation process, and to provide benchmark data for validating the numerical simulations. c) The numerical simulations enhanced the understanding of the sedimentation processes and were used to test flow cases complementary to those run in the physical model, reducing the number of (more expensive) laboratory tests. Using the findings from the laboratory and simulation work, self-cleaning culvert concepts were developed and tested for a range of flow conditions. The alternative concepts were screened through experimental studies in a 1:20 scale model guided by numerical simulations. Finally, performance studies of the most promising design alternatives were conducted in the 1:20 hydraulic model to verify that the proposed systems operate satisfactorily under conditions closer to natural scale.
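Physical models like the 1:20 scale model above are typically operated under Froude similitude; as a hedged illustration (these scaling laws are standard open-channel modelling practice, not details taken from this study), the model-to-prototype conversion ratios can be computed as:

```python
import math

def froude_ratios(length_scale: float) -> dict:
    """Froude-similitude ratios (model -> prototype) for a geometric
    length scale L_r (e.g. 20 for a 1:20 model)."""
    return {
        "length": length_scale,               # L_r
        "velocity": math.sqrt(length_scale),  # L_r^0.5
        "time": math.sqrt(length_scale),      # L_r^0.5
        "discharge": length_scale ** 2.5,     # L_r^2.5
    }

r = froude_ratios(20)
# A model velocity v_m corresponds to v_m * r["velocity"] at prototype scale.
```

For a 1:20 model, a measured model velocity thus scales up by a factor of about 4.47 and a measured discharge by a factor of about 1789.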

Relevance: 20.00%

Abstract:

The current study was initiated to quantify the stresses induced in critical details on the reinforcing jacket and the tower itself through field instrumentation, load testing, and long-term monitoring. Strain gages were installed on both the tower and the reinforcing jacket, and additional strain gages were installed on two anchor rods. Tests were conducted with and without the reinforcing jacket installed. Data collected from all strain gages during static load testing were used to study the stress distribution in the tower under known loads, both with and without the reinforcing jacket. The tower was tested dynamically by first applying a static load and then quickly releasing it, causing the tower to vibrate freely. Furthermore, the tower was monitored for over a year to obtain stress-range histograms at the critical details for use in a fatigue evaluation. During the long-term monitoring, triggered time-history data were also recorded to study the wind-loading phenomena that excite the tower.
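Stress-range histograms like those collected during the long-term monitoring are commonly turned into a fatigue estimate with a Miner's-rule damage sum; a minimal sketch, assuming an AASHTO-style S-N curve N = A/S^3 (the detail constant and the histogram values below are illustrative assumptions, not data from this study):

```python
def miner_damage(stress_ranges_ksi, cycle_counts, detail_constant=44.0e8):
    """Miner's-rule damage sum D = sum(n_i / N_i), where N_i = A / S_i^3
    is the fatigue life at stress range S_i (AASHTO-style exponent 3).
    D >= 1.0 implies the fatigue life is exhausted.
    detail_constant (A, in ksi^3-cycles) is illustrative only."""
    damage = 0.0
    for s, n in zip(stress_ranges_ksi, cycle_counts):
        cycles_to_failure = detail_constant / s ** 3
        damage += n / cycles_to_failure
    return damage

# Histogram bins (midpoint stress range in ksi, observed cycle count):
d = miner_damage([2.0, 4.0, 6.0], [1.2e6, 3.0e5, 5.0e4])
```

The remaining fatigue life can then be projected by extrapolating the annual damage increment until the sum reaches 1.0.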

Relevance: 20.00%

Abstract:

In response to the Federal Highway Administration (FHWA) mandate that Load and Resistance Factor Design (LRFD) be implemented on all new bridge projects initiated after October 1, 2007, the Iowa Highway Research Board (IHRB) sponsored these research projects to develop regional LRFD recommendations. The LRFD development was performed using the Iowa Department of Transportation (DOT) Pile Load Test database (PILOT). To increase the data points for LRFD development, develop LRFD recommendations for dynamic methods, and validate the results of the LRFD calibration, 10 full-scale field tests on the most commonly used steel H-piles (e.g., HP 10 x 42) were conducted throughout Iowa. Detailed in situ soil investigations were carried out, push-in pressure cells were installed, and laboratory soil tests were performed. Pile responses during driving, at the end of driving (EOD), and at re-strikes were monitored using the Pile Driving Analyzer (PDA), followed by CAse Pile Wave Analysis Program (CAPWAP) analyses. Hammer blow counts were recorded for the Wave Equation Analysis Program (WEAP) and for dynamic formulas. Static load tests (SLTs) were performed and the pile capacities were determined based on Davisson's criterion. These extensive experimental studies generated important data for analytical and computational investigations. The measured SLT load-displacement curves were compared with results simulated using the TZPILE program and the modified borehole shear test method. Two analytical pile-setup quantification methods, expressed in terms of soil properties, were developed and validated. A new calibration procedure was developed to incorporate pile setup into LRFD.
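Davisson's criterion locates the failure load where the measured load-settlement curve crosses an offset elastic line; a minimal sketch of that construction (the units, pile properties, and interpolation details below are illustrative assumptions, not values from the study):

```python
import numpy as np

def davisson_capacity(loads, settlements, pile_length, area, modulus, diameter):
    """Pile capacity by Davisson's offset-limit criterion: the load at which
    the measured settlement reaches the elastic compression P*L/(A*E) plus an
    offset of 0.15 in + D/120. Consistent US units assumed (kips, in, ksi).
    Assumes the first load step lies below the offset line."""
    loads = np.asarray(loads, dtype=float)
    settlements = np.asarray(settlements, dtype=float)
    limit = loads * pile_length / (area * modulus) + 0.15 + diameter / 120.0
    excess = settlements - limit          # crosses zero at the failure load
    crossed = excess >= 0
    if not crossed.any():
        return None                       # offset limit never reached
    i = int(crossed.argmax())             # first point past the limit line
    # linear interpolation between the bracketing load steps
    p0, p1 = loads[i - 1], loads[i]
    e0, e1 = excess[i - 1], excess[i]
    return float(p0 - e0 * (p1 - p0) / (e1 - e0))
```

If the test is stopped before the curve reaches the offset line, the criterion returns no capacity, which matches how SLTs that do not plunge are reported.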

Relevance: 20.00%

Abstract:

Simulated-annealing-based conditional simulations provide a flexible means of quantitatively integrating diverse types of subsurface data. Although such techniques are increasingly used in hydrocarbon reservoir characterization studies, their potential in environmental, engineering and hydrological investigations is still largely unexploited. Here, we introduce a novel simulated annealing (SA) algorithm geared towards the integration of high-resolution geophysical and hydrological data which, compared to more conventional approaches, provides significant advancements in the way that large-scale structural information in the geophysical data is accounted for. Model perturbations in the annealing procedure are made by drawing from a probability distribution for the target parameter conditioned on the geophysical data. This is the only place where geophysical information is utilized in our algorithm, in marked contrast to other approaches, where model perturbations are made by swapping values in the simulation grid and agreement with soft data is enforced through a correlation coefficient constraint. Another major feature of our algorithm is the way in which available geostatistical information is utilized. Instead of constraining realizations to match a parametric target covariance model over a wide range of spatial lags, we constrain the realizations only at the smaller lags where the available geophysical data cannot provide enough information. Thus we allow the larger-scale subsurface features resolved by the geophysical data to have due control over the output realizations. Further, since the only component of the SA objective function required in our approach is a covariance constraint at small lags, our method has improved convergence and computational efficiency compared with more traditional methods.
Here, we present the results of applying our algorithm to the integration of porosity log and tomographic crosshole georadar data to generate stochastic realizations of the local-scale porosity structure. Our procedure is first tested on a synthetic data set, and then applied to data collected at the Boise Hydrogeophysical Research Site.
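The perturbation and objective-function ideas described above can be sketched schematically in one dimension; this toy version (the function names, the Gaussian stand-in for the data-conditioned distribution, and the cooling schedule are all illustrative assumptions, not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def small_lag_misfit(field, target_cov, max_lag=3):
    """Objective: squared misfit of the empirical covariance against the
    target model at small lags only; larger-scale structure is left to be
    controlled by the conditioning (geophysical) data."""
    f = field - field.mean()
    return sum((np.mean(f[:-lag] * f[lag:]) - target_cov[lag]) ** 2
               for lag in range(1, max_lag + 1))

def anneal(n, conditional_sampler, target_cov, steps=5000, t0=1.0, cool=0.999):
    """Annealing loop: every perturbation is a fresh draw from the
    data-conditioned distribution, accepted with the Metropolis rule."""
    field = conditional_sampler(n)              # initial conditioned draw
    energy = small_lag_misfit(field, target_cov)
    temp = t0
    for _ in range(steps):
        trial = field.copy()
        trial[rng.integers(n)] = conditional_sampler(1)[0]
        e_new = small_lag_misfit(trial, target_cov)
        if e_new < energy or rng.random() < np.exp((energy - e_new) / temp):
            field, energy = trial, e_new
        temp *= cool
    return field
```

The key design choice mirrors the text: the geophysical data enter only through `conditional_sampler`, while the covariance constraint acts only at lags 1 to `max_lag`.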

Relevance: 20.00%

Abstract:

The Iowa Pore Index (IPI) test characterizes the pore system of carbonate (limestone and dolomite) rocks by using pressurized water to infiltrate the pores. The technique provides quantitative results for the primary and capillary (secondary) pores in carbonate rocks. These results are used in conjunction with chemical and mineralogical test results to calculate a quality number, which serves as a predictor of aggregate performance in Portland cement concrete (PCC) and leads to the durability classification of the aggregate. This study had two main objectives: to determine the effect of aggregate size on IPI test results, and to establish the precision of the IPI test and test apparatus. It was found that smaller aggregate size fractions could be correlated to the standard 1/2”-3/4” sample. Generally, a decrease in particle size was accompanied by a slight decrease in IPI values. The IPI testing also showed fairly good agreement of the secondary pore index number between the 1/2”-3/4” and the 3/8”-1/2” fractions. The #4-3/8” fraction showed a greater difference in the secondary number from the 1/2”-3/4” fraction. The precision of the IPI test was established as a repeatability standard deviation (Sr) of 2.85 (primary) and 0.87 (secondary), with a repeatability limit (%r) of 8.5% and 14.9% for the primary and secondary values, respectively.
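Precision figures of this kind conventionally take the repeatability limit as 2.8 times the repeatability standard deviation, expressed as a percentage of the mean; a minimal sketch of that calculation (the 2.8 factor is the standard ASTM E691-style convention, assumed here rather than taken from the study):

```python
import statistics

def repeatability(replicates):
    """Repeatability statistics for a set of replicate test results:
    Sr        -> repeatability standard deviation
    r         -> repeatability limit, 2.8 * Sr (95% limit for two results)
    percent_r -> r as a percentage of the mean result."""
    mean = statistics.fmean(replicates)
    sr = statistics.stdev(replicates)
    r = 2.8 * sr
    return {"Sr": sr, "r": r, "percent_r": 100.0 * r / mean}

# E.g. hypothetical replicate primary IPI readings on one sample:
stats = repeatability([33.1, 34.0, 32.5, 33.6])
```

Two single-operator results on the same material would then be expected to differ by more than `r` only about 5% of the time.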

Relevance: 20.00%

Abstract:

Many patients with malignant gliomas do not respond to alkylating agent chemotherapy. Alkylator resistance of glioma cells is mainly mediated by the DNA repair enzyme O(6)-methylguanine-DNA methyltransferase (MGMT). Epigenetic silencing of the MGMT gene by promoter methylation in glioma cells compromises this DNA repair mechanism and increases chemosensitivity. MGMT promoter methylation is, therefore, a strong prognostic biomarker in paediatric and adult patients with glioblastoma treated with temozolomide. Notably, elderly patients (>65-70 years) with glioblastoma whose tumours lack MGMT promoter methylation derive minimal benefit from such chemotherapy. Thus, MGMT promoter methylation status has become a frequently requested laboratory test in neuro-oncology. This Review presents current data on the prognostic and predictive relevance of MGMT testing, discusses clinical trials that have used MGMT status to select participants, evaluates known issues concerning the molecular testing procedure, and addresses the necessity for molecular-context-dependent interpretation of MGMT test results. It also considers whether MGMT promoter methylation testing should be offered to all individuals with glioblastoma or only to elderly patients and those in clinical trials, the justifications for withholding alkylating agent chemotherapy in patients with MGMT-unmethylated glioblastomas outside clinical trials, and the potential role for MGMT testing in other gliomas.

Relevance: 20.00%

Abstract:

OBJECTIVES: To determine 1) HIV testing practices in a 1400-bed university hospital where local HIV prevalence is 0.4%, and 2) the effect on testing practices of national HIV testing guidelines, revised in March 2010, recommending Physician-Initiated Counselling and Testing (PICT). METHODS: Using two hospital databases, we determined the number of HIV tests performed by selected clinical services and the number of patients tested as a percentage of the number seen per service (the 'testing rate'). To explore the effect of the revised national guidelines, we examined testing rates for the two years before and the two years after publication of the PICT guidelines. RESULTS: Across the clinical services combined, 253,178 patients were seen and 9,183 tests were performed (80 of which were positive; 0.9%) during the four-year study period. The emergency department (ED) performed the second highest number of tests but had the lowest testing rate (0.9-1.1%). Among inpatient services, neurology and psychiatry had higher testing rates than internal medicine (19.7% and 9.6% versus 8%, respectively). There was no significant increase in testing rates, either globally or in the majority of the clinical services examined, and no increase in new HIV diagnoses after the PICT recommendations. CONCLUSIONS: Using a simple two-database tool, we observed no global improvement in HIV testing rates in our hospital following the new national guidelines, but we did identify services where testing practices merit improvement. This study may illustrate the limits of PICT strategies based on physician risk assessment, compared with an opt-out approach.

Relevance: 20.00%

Abstract:

When decommissioning a nuclear facility, it is important to be able to estimate the activity levels of potentially radioactive samples and compare them with the clearance values defined by regulatory authorities. This paper presents a method of calibrating a clearance box monitor based on practical experimental measurements and Monte Carlo simulations. Adjusting the simulation to experimental data obtained with a simple point source permits the computation of absolute calibration factors for more complex geometries, with an accuracy of slightly more than 20%. The uncertainty of the calibration factor can be reduced to about 10% when the simulation is used relatively, in direct comparison with a measurement performed in the same geometry but with another nuclide. The simulation can also be used to validate the experimental calibration procedure when the sample is assumed to be homogeneous but the calibration factor is derived from a plate phantom. For more realistic geometries, such as a small gravel dumpster, Monte Carlo simulation shows that the calibration factor obtained with a larger homogeneous phantom is correct to within about 20%, provided sample density is taken into account as the influencing parameter. Finally, simulation can be used to estimate the effect of a contamination hotspot. The research supporting this paper shows that, if the sample is assumed to be homogeneously contaminated, activity can be largely underestimated for a centrally located hotspot and overestimated for a peripherally located hotspot. This demonstrates the usefulness of complementing experimental methods with Monte Carlo simulations to estimate calibration factors that cannot be measured directly because of a lack of suitable material or specific geometries.
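The "relative" use of the simulation described above amounts to transferring a calibration factor measured for one nuclide to another nuclide in the same geometry via the ratio of simulated detection efficiencies; a minimal sketch (the function name and all numerical values are illustrative assumptions):

```python
def transfer_calibration(cf_measured_ref, eff_sim_ref, eff_sim_target):
    """Transfer a measured calibration factor (activity per count rate)
    from a reference nuclide to a target nuclide measured in the same
    geometry, using the ratio of Monte Carlo simulated efficiencies.
    A lower simulated efficiency for the target implies a larger factor."""
    return cf_measured_ref * eff_sim_ref / eff_sim_target

# E.g. a factor measured with one nuclide, transferred to another
# (all numbers hypothetical):
cf_target = transfer_calibration(cf_measured_ref=12.5,
                                 eff_sim_ref=0.034,
                                 eff_sim_target=0.051)
```

Because the geometry-dependent systematics cancel in the efficiency ratio, this relative use is what brings the quoted uncertainty down to about 10%.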

Relevance: 20.00%

Abstract:

Is it possible to perfectly simulate a signature, in the particular and challenging case where the signature is simple? A set of signatures from six writers, considered simple on the basis of highlighted criteria, was sampled. These signatures were given to forgers who were asked to produce freehand simulations. The simulations that reproduced the features of the reference signatures were submitted for evaluation to forensic document examiners through proficiency testing. The results suggest that there is no perfect simulation. With the supplementary aim of assessing the influence of the forger's skill on the results, forgers were selected from three distinct populations that differ according to professional criteria. The results indicate some differences in graphical capability between individuals; however, no trend could be established regarding age, degrees, years of practice or time dedicated to the exercise. The findings show that simulation is easier when a graphical compatibility exists between the forger's own writing and the signature to be reproduced. Moreover, a general difficulty in preserving proportions and slant, as well as the shape of capital letters and initials, was noticed.

Relevance: 20.00%

Abstract:

In this paper, a hybrid simulation-based algorithm is proposed for the Stochastic Flow Shop Problem. The main idea of the methodology is to transform the stochastic problem into a deterministic one and then apply simulation to the latter. To achieve this, we rely on Monte Carlo simulation and an adapted version of a deterministic heuristic. The approach aims to provide flexibility and simplicity, since it is not constrained by prior assumptions and relies on well-tested heuristics.
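The transform-then-simulate idea can be sketched as follows: replace the stochastic processing times by their expectations, sequence the jobs with a deterministic heuristic (NEH is used here as a stand-in for the paper's adapted heuristic), and score the resulting sequence by Monte Carlo simulation; all details are illustrative:

```python
def makespan(seq, times):
    """Permutation flow-shop makespan for job order `seq`;
    times[j][m] is job j's processing time on machine m."""
    m = len(times[0])
    comp = [0.0] * m                     # completion time of last job per machine
    for j in seq:
        for k in range(m):
            comp[k] = max(comp[k], comp[k - 1] if k else 0.0) + times[j][k]
    return comp[-1]

def neh(times):
    """NEH heuristic on a deterministic (here: expected) time matrix:
    order jobs by decreasing total time, insert each at its best position."""
    order = sorted(range(len(times)), key=lambda j: -sum(times[j]))
    seq = []
    for j in order:
        seq = min((seq[:i] + [j] + seq[i:] for i in range(len(seq) + 1)),
                  key=lambda s: makespan(s, times))
    return seq

def monte_carlo_makespan(seq, sample_times, n=1000):
    """Expected makespan of `seq` under stochastic times, by simulation;
    sample_times() returns one random realisation of the time matrix."""
    return sum(makespan(seq, sample_times()) for _ in range(n)) / n
```

Sequencing on expected times and then evaluating (or comparing candidate sequences) with `monte_carlo_makespan` is the two-stage structure the abstract describes.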

Relevance: 20.00%

Abstract:

The objective of this study was to improve the simulation of node number in soybean cultivars with determinate stem habit. A nonlinear model considering two approaches for inputting daily air temperature data (daily mean temperature versus daily minimum/maximum air temperatures) was used. Main-stem node number data for ten soybean cultivars were collected in a three-year field experiment (2004/2005 to 2006/2007) at Santa Maria, RS, Brazil. Node number was simulated using the Soydev model, which has a nonlinear temperature response function [f(T)]. The f(T) was calculated using two methods: from the daily mean air temperature, calculated as the arithmetic average of the daily minimum and maximum air temperatures (Soydev tmean); and by calculating one f(T) from the daily minimum air temperature and another from the daily maximum, and then averaging the two f(T)s (Soydev tmm). Root mean square error (RMSE) and deviations (simulated minus observed) were used as statistics to evaluate the performance of the two versions of Soydev. Simulations of node number in soybean were better with the Soydev tmm version, with an RMSE of 0.5 to 1.4 nodes. Node number could also be simulated for several soybean cultivars using a single set of model coefficients, with an RMSE of 0.8 to 2.4 nodes.
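The two temperature-input methods differ only in where the nonlinear response function is evaluated; a minimal sketch with a generic beta-type f(T) (the function form and cardinal temperatures are illustrative, not Soydev's calibrated coefficients):

```python
import math

def f_temp(t, tmin=7.6, topt=31.0, tmax=40.0):
    """Generic beta-type temperature response in [0, 1]; the cardinal
    temperatures are illustrative, not Soydev's coefficients."""
    if t <= tmin or t >= tmax:
        return 0.0
    alpha = math.log(2.0) / math.log((tmax - tmin) / (topt - tmin))
    x = (t - tmin) ** alpha
    xo = (topt - tmin) ** alpha
    return (2.0 * x * xo - x * x) / (xo * xo)

def ft_tmean(tn, tx):
    """'tmean' approach: evaluate f(T) at the daily mean temperature."""
    return f_temp((tn + tx) / 2.0)

def ft_tmm(tn, tx):
    """'tmm' approach: average f(Tmin) and f(Tmax)."""
    return (f_temp(tn) + f_temp(tx)) / 2.0

def rmse(sim, obs):
    """Root mean square error between simulated and observed values."""
    return math.sqrt(sum((s - o) ** 2 for s, o in zip(sim, obs)) / len(sim))
```

Because f(T) is nonlinear, `ft_tmm` and `ft_tmean` generally disagree, which is exactly the difference the two Soydev versions exploit.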


Relevance: 20.00%

Abstract:

Platelet P2Y12 receptor inhibition with clopidogrel, prasugrel or ticagrelor plays a key role in preventing recurrent ischaemic events after percutaneous coronary intervention, whether in acute coronary syndromes or in elective settings. The degree of platelet inhibition depends on the antiplatelet medication used and is influenced by clinical and genetic factors. A therapeutic window exists: on one side, efficient anti-aggregation is required to reduce cardiovascular events; on the other, excessive platelet inhibition carries a risk of bleeding complications. This article describes current knowledge of platelet function tests and genetic tests and summarises their role in clinical practice.

Relevance: 20.00%

Abstract:

We present a novel numerical algorithm for the simulation of seismic wave propagation in porous media, which is particularly suitable for the accurate modelling of surface wave-type phenomena. The differential equations of motion are based on Biot's theory of poro-elasticity and solved with a pseudospectral approach using Fourier and Chebyshev methods to compute the spatial derivatives along the horizontal and vertical directions, respectively. The time solver is a splitting algorithm that accounts for the stiffness of the differential equations. Due to the Chebyshev operator, the grid spacing in the vertical direction is non-uniform and characterized by a denser spatial sampling in the vicinity of interfaces, which allows for a numerically stable and accurate evaluation of higher-order surface wave modes. We stretch the grid in the vertical direction to increase the minimum grid spacing and reduce the computational cost. The free-surface boundary conditions are implemented with a characteristics approach, in which the characteristic variables are evaluated at zero viscosity. The same procedure is used to model seismic wave propagation at the interface between a fluid and a porous medium. In this case, each medium is represented by a different grid and the two grids are combined through a domain-decomposition method. This wavefield decomposition accounts for the discontinuity of variables and is crucial for an accurate interface treatment. We simulate seismic wave propagation with open-pore and sealed-pore boundary conditions and verify the validity and accuracy of the algorithm by comparing the numerical simulations to analytical solutions, based on zero viscosity, obtained with the Cagniard-de Hoop method. Finally, we illustrate the suitability of our algorithm for more complex models of porous media involving viscous pore fluids and strongly heterogeneous distributions of the elastic and hydraulic material properties.
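The Fourier part of the pseudospectral derivative computation can be sketched in a few lines for a periodic domain (this illustrates the standard method, not the authors' code):

```python
import numpy as np

def fourier_deriv(f, length):
    """Spectral first derivative of periodic samples f on a domain of the
    given length: transform, multiply by i*k in wavenumber space, invert."""
    n = f.size
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=length / n)  # angular wavenumbers
    return np.real(np.fft.ifft(1j * k * np.fft.fft(f)))

# Verify against an analytical derivative: d/dx sin(x) = cos(x)
x = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
err = np.max(np.abs(fourier_deriv(np.sin(x), 2.0 * np.pi) - np.cos(x)))
# err is at machine-precision level for this band-limited signal
```

For band-limited fields the error is at round-off level, which is why pseudospectral schemes need far fewer grid points per wavelength than finite differences; the vertical (Chebyshev) direction uses the analogous non-periodic transform instead.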