997 results for Parameter testing
Abstract:
To improve current neutron capture therapy, several liposomal formulations of the neutron capture agent gadolinium were developed and tested in a glioma cell model. The formulations were analyzed with respect to physicochemical and biological parameters such as size, zeta potential, uptake into cancer cells and performance under neutron irradiation. The neutron and photon dose derived from intracellular as well as extracellular Gd was calculated via Monte Carlo simulations and correlated with the reduction of cell survival after irradiation. To investigate the suitability of Gd as a radiosensitizer for photon radiation, cells were also irradiated with synchrotron radiation in addition to clinically used photons generated by a linear accelerator. Irradiation with neutrons led to significantly lower survival of Gd-liposome-treated F98 and LN229 cells compared to irradiated control cells and cells treated with non-liposomal Gd-DTPA. The correlation between Gd content and dose and the respective cell survival showed a proportional relationship for most of the applied formulations. Photon irradiation experiments provided a proof of principle for the radiosensitizer approach, although the photon spectra currently used have to be optimized for higher radiosensitizer efficiency. In conclusion, the newly developed Gd-liposomes show great potential for improving radiation treatment options for highly malignant glioblastoma.
Abstract:
OBJECTIVE Texture analysis is an alternative method to quantitatively assess MR images. In this study, we introduce dynamic texture parameter analysis (DTPA), a novel technique to investigate the temporal evolution of texture parameters using dynamic susceptibility contrast-enhanced (DSCE) imaging. Here, we aim to introduce the method and its application to enhancing lesions (EL), non-enhancing lesions (NEL) and normal-appearing white matter (NAWM) in multiple sclerosis (MS). METHODS We investigated 18 patients with MS and clinically isolated syndrome (CIS), according to the 2010 McDonald criteria, using DSCE imaging at different field strengths (1.5 and 3 Tesla). Tissues of interest (TOIs) were defined within 27 EL, 29 NEL and 37 NAWM areas after normalization, and eight histogram-based texture parameter maps (TPMs) were computed. TPMs quantify the heterogeneity of the TOI. For every TOI, the average, variance, skewness, kurtosis and variance-of-the-variance statistical parameters were calculated. These TOI parameters were further analyzed using one-way ANOVA followed by multiple Wilcoxon rank-sum tests corrected for multiple comparisons. RESULTS Tissue- and time-dependent differences were observed in the dynamics of the computed texture parameters. Sixteen parameters discriminated between EL, NEL and NAWM (pAVG = 0.0005). Significant differences in the DTPA texture maps were found during inflow (52 parameters), outflow (40 parameters) and reperfusion (62 parameters). The strongest discriminators among the TPMs were the variance-related parameters, while skewness and kurtosis TPMs were in general less sensitive in detecting differences between the tissues. CONCLUSION DTPA of DSCE image time series revealed characteristic time responses for ELs, NELs and NAWM. This may be further used for a refined quantitative grading of MS lesions during their evolution from the acute to the chronic state.
DTPA discriminates lesions beyond features of enhancement or T2 hypersignal on a numeric scale, allowing for a more subtle grading of MS lesions.
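The per-TOI statistics named above (average, variance, skewness, kurtosis) are standard histogram moments. A minimal sketch in plain Python, with illustrative voxel values rather than the study's DSCE intensities:

```python
import math

def toi_statistics(values):
    """Histogram-based summary statistics for a tissue of interest (TOI):
    average, variance, skewness and excess kurtosis of the voxel values."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    sd = math.sqrt(var)
    skew = sum(((v - mean) / sd) ** 3 for v in values) / n
    kurt = sum(((v - mean) / sd) ** 4 for v in values) / n - 3.0
    return {"average": mean, "variance": var, "skewness": skew, "kurtosis": kurt}

# Illustrative intensities, not data from the study
stats = toi_statistics([1.0, 2.0, 2.0, 3.0, 10.0])
```

The variance-of-the-variance parameter would follow by applying the same variance computation to local variance maps; it is omitted here.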
Abstract:
Requirements for testing include advance specification of the conditional rate density (probability per unit time, area, and magnitude) or, alternatively, probabilities for specified intervals of time, space, and magnitude. Here I consider testing fully specified hypotheses, with no parameter adjustments or arbitrary decisions allowed during the test period. Because it may take decades to validate prediction methods, it is worthwhile to formulate testable hypotheses carefully in advance. Earthquake prediction generally implies that the probability will be temporarily higher than normal. Such a statement requires knowledge of "normal behavior"; that is, it requires a null hypothesis. Hypotheses can be tested in three ways: (i) by comparing the number of actual earthquakes to the number predicted, (ii) by comparing the likelihood score of actual earthquakes to the predicted distribution, and (iii) by comparing the likelihood ratio to that of a null hypothesis. The first two tests are purely self-consistency tests, while the third is a direct comparison of two hypotheses. Predictions made without a statement of probability are very difficult to test, and any test must be based on the ratio of earthquakes in and out of the forecast regions.
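Test (i), comparing the number of actual earthquakes to the number predicted, is often formulated against a Poisson count distribution. A minimal sketch (the function name and the two-sided construction are illustrative, not taken from the paper):

```python
import math

def poisson_n_test(n_observed, n_forecast):
    """Two-sided count test: probability of a count at least as extreme
    as n_observed if earthquakes follow a Poisson distribution with the
    forecast rate. Small values argue against the forecast."""
    def cdf(k, lam):
        # P(N <= k) for a Poisson(lam) variable
        return sum(math.exp(-lam) * lam ** i / math.factorial(i) for i in range(k + 1))
    p_low = cdf(n_observed, n_forecast)             # P(N <= observed)
    p_high = 1.0 - cdf(n_observed - 1, n_forecast)  # P(N >= observed)
    return min(1.0, 2.0 * min(p_low, p_high))
```

Tests (ii) and (iii) would instead score the observed catalog's likelihood under the forecast's full space-time-magnitude rate density.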
Abstract:
The susceptibility of clay-bearing rocks to weathering (erosion and/or differential degradation) is known to influence the stability of heterogeneous slopes. However, not all of these rocks show the same behaviour, as there are considerable differences in the speed and type of weathering observed. It is therefore very important to establish relationships between behaviour quantified in the laboratory and that observed in the field. The slake durability test is the laboratory test most commonly used to evaluate the relationship between slaking behaviour and rock durability. However, it has a number of disadvantages: it does not account for changes in the shape and size of fragments retained on the 2 mm sieve, nor does its most commonly used index (Id2) accurately reflect the weathering behaviour observed in the field. The main aim of this paper is to propose a simple methodology, for use by practitioners, for characterizing the weathering behaviour of carbonate lithologies that outcrop in heterogeneous rock masses (such as Flysch slopes). To this end, the Potential Degradation Index (PDI) is proposed. It is calculated from the fragment size distribution curves of the material retained in the drum after each cycle of the slake durability test, with the number of slaking cycles increased to five. Through laboratory testing of 117 samples of carbonate rocks extracted from strata in selected slopes, six rock types were established based on their slaking behaviour, corresponding to the different weathering behaviours observed in the field.
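The conventional index criticized above, Id2, is simply the percentage of initial dry mass retained in the drum after the second slaking cycle. A minimal sketch over five cycles (the masses are illustrative; the proposed PDI additionally needs the full fragment-size distribution per cycle, which is not reproduced here):

```python
def slake_durability_indices(dry_masses):
    """Slake durability index after each cycle, as a percentage of the
    initial dry mass retained in the drum. dry_masses[0] is the initial
    mass; the remaining entries are the retained masses after cycles 1..n.
    Id2 is the value after the second cycle."""
    m0 = dry_masses[0]
    return [100.0 * m / m0 for m in dry_masses[1:]]

# Illustrative masses in grams: initial, then after cycles 1..5
indices = slake_durability_indices([500.0, 480.0, 465.0, 450.0, 430.0, 405.0])
id2 = indices[1]
```

A sample losing mass this slowly would rate as highly durable on the Id2 scale, even though the retained fragments may be breaking down in shape and size, which is exactly the limitation that motivates the PDI.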
Abstract:
National Highway Traffic Safety Administration, Washington, D.C.
Abstract:
National Highway Traffic Safety Administration, Washington, D.C.
Abstract:
National Highway Traffic Safety Administration, Washington, D.C.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
Are persistent marketing effects most likely to appear right after the introduction of a product? The authors give an affirmative answer to this question by developing a model that explicitly describes how persistent and transient marketing effects evolve over time. The proposed model provides managers with a valuable tool to evaluate their allocation of marketing expenditures over time. An application of the model to many pharmaceutical products, estimated through (exact initial) Kalman filtering, indicates that both persistent and transient effects occur predominantly immediately after a brand's introduction. Thereafter, the size of the effects declines. The authors theoretically and empirically compare their methodology with one based on unit root testing and demonstrate that the need for unit root tests creates difficulties in applying conventional persistence modeling. The authors recommend that marketing models either accommodate persistent effects that change over time or be applied only to mature brands or limited time windows.
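Effects that evolve over time, as described above, are the kind of quantity a Kalman filter estimates recursively. A minimal sketch of a scalar random-walk-coefficient filter (the noise variances, initialization, and model form are illustrative, not the paper's exact-initial specification):

```python
def kalman_tv_coeff(y, x, q=0.1, r=1.0):
    """Kalman filter for y_t = beta_t * x_t + eps_t, with the marketing
    effect following a random walk: beta_t = beta_{t-1} + eta_t.
    q and r are the state and observation noise variances (illustrative).
    Returns the filtered path of beta_t."""
    beta, p = 0.0, 1e6  # near-diffuse initial state
    path = []
    for yt, xt in zip(y, x):
        p = p + q                            # predict state variance
        k = p * xt / (xt * xt * p + r)       # Kalman gain
        beta = beta + k * (yt - xt * beta)   # update with the new observation
        p = (1.0 - k * xt) * p
        path.append(beta)
    return path
```

On noise-free data generated with a constant effect, the filtered coefficient converges to that constant within a few observations.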
Abstract:
Many papers claim that a Log Periodic Power Law (LPPL) model fitted to financial market bubbles that precede large market falls or 'crashes' contains parameters that are confined within certain ranges. Further, it is claimed that the underlying model is based on influence percolation and a martingale condition. This paper examines these claims and their validity for capturing large price falls in the Hang Seng stock market index over the period 1970 to 2008. The fitted LPPLs have parameter values within the ranges specified post hoc by Johansen and Sornette (2001) for only seven of the 11 crashes identified in this period. Interestingly, the LPPL fit could have predicted the substantial fall in the Hang Seng index during the recent global downturn. Overall, the mechanism posited as underlying the LPPL model does not do so, and the data used to support the fit of the LPPL model to bubbles do so only partially.
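The LPPL model referred to above is conventionally written as ln p(t) = A + B(tc - t)^m + C(tc - t)^m cos(w ln(tc - t) - phi), where tc is the putative crash time; the claimed parameter confinement applies to quantities such as the exponent m and the log-frequency w. A minimal sketch of the functional form (the specific post-hoc numeric ranges are not reproduced here):

```python
import math

def lppl(t, tc, A, B, m, C, omega, phi):
    """Log-periodic power law for the log-price at time t before a crash
    at time tc:
        ln p(t) = A + B*(tc-t)**m + C*(tc-t)**m * cos(omega*ln(tc-t) - phi)
    Valid for t < tc; with C = 0 it reduces to a pure power law."""
    dt = tc - t
    return A + B * dt ** m + C * dt ** m * math.cos(omega * math.log(dt) - phi)
```

Fitting this to a price series is a nonlinear least-squares problem; the debate summarized above concerns whether the fitted m, omega and related parameters actually fall in the claimed ranges across crashes.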
Abstract:
Long-span bridges are flexible and therefore sensitive to wind-induced effects. One way to improve the stability of long-span bridges against flutter is to use cross-sections with twin side-by-side decks. However, this can amplify responses due to vortex-induced oscillations. Wind tunnel testing is a well-established practice for evaluating the stability of bridges against wind loads. To study the response of the prototype in the laboratory, dynamic similarity requirements should be satisfied. One of the parameters that is normally violated in wind tunnel testing is the Reynolds number. In this dissertation, the effects of Reynolds number on the aerodynamics of a double-deck bridge were evaluated by measuring fluctuating forces on a motionless sectional model of a bridge at different wind speeds representing different Reynolds regimes. The efficacy of vortex mitigation devices was also evaluated at different Reynolds number regimes. Another parameter that is frequently ignored in wind tunnel studies is the correct simulation of turbulence characteristics. Due to the difficulties of simulating flow with a large turbulence length scale on a sectional model, wind tunnel tests are often performed in smooth flow as a conservative approach. The validity of simplifying assumptions in the calculation of buffeting loads, the direct impact of turbulence, needs to be verified for twin-deck bridges. The effects of turbulence characteristics were investigated by testing sectional models of a twin-deck bridge under two different turbulent flow conditions. Not only do the flow properties play an important role in the aerodynamic response of the bridge, but the geometry of the cross-section is also expected to have significant effects.
In this dissertation, the effects of deck details, such as the width of the gap between the twin decks and the traffic barriers, on the aerodynamic characteristics of a twin-deck bridge were investigated, with particular attention to the vortex shedding forces, with the aim of clarifying how these shape details can alter the wind-induced responses. Finally, a summary of the issues involved in designing a dynamic test rig for high Reynolds number tests is given, using the studied cross-section as an example.
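The Reynolds number whose similarity is violated in sectional-model tests is Re = U L / nu, with U the wind speed, L a characteristic length (e.g. deck depth or width) and nu the kinematic viscosity of air. A minimal sketch showing why model-scale Re falls short of full scale:

```python
def reynolds_number(velocity, length, nu=1.5e-5):
    """Re = U * L / nu, with nu defaulting to the kinematic viscosity of
    air at roughly 20 C (~1.5e-5 m^2/s). Sectional models use much
    smaller lengths (and often lower speeds) than the prototype, so the
    laboratory Re is orders of magnitude below full scale."""
    return velocity * length / nu

# Illustrative values: 10 m/s over a 0.3 m model deck depth
re_model = reynolds_number(10.0, 0.3)
# versus, say, 30 m/s over a 3 m full-scale deck depth
re_full = reynolds_number(30.0, 3.0)
```

This gap is the reason the dissertation tests the motionless sectional model across several wind speeds, to probe different Reynolds regimes directly.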
Abstract:
We propose a mechanism for testing the theory of collapse models, such as continuous spontaneous localization (CSL), by examining the parametric heating rate of a trapped nanosphere. The random localizations of the center of mass of a given particle predicted by the CSL model can be understood as a stochastic force constituting a source of heating for the nanosphere. We show that by utilising a Paul trap to levitate the particle, together with optical cooling, it is possible to reduce environmental decoherence to such a level that CSL dominates the dynamics and contributes the main source of heating. We show that this approach allows measurements to be made on a timescale of seconds, and that the free parameter λ_CSL which characterises the model ought to be testable down to values as low as 10^{-12} Hz.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
Accurate estimation of road pavement geometry and layer material properties through the use of proper nondestructive testing and sensor technologies is essential for evaluating a pavement's structural condition and determining options for maintenance and rehabilitation. For these purposes, pavement deflection basins produced by the nondestructive Falling Weight Deflectometer (FWD) test are commonly used. The FWD test drops weights on the pavement to simulate traffic loads and measures the resulting pavement deflection basins. Backcalculation of pavement geometry and layer properties from FWD deflections is a difficult inverse problem, and its solution with conventional mathematical methods is often challenging due to the ill-posed nature of the problem. In this dissertation, a hybrid algorithm was developed to seek robust and fast solutions to this inverse problem. The algorithm is based on soft computing techniques, mainly Artificial Neural Networks (ANNs) and Genetic Algorithms (GAs), as well as numerical analysis techniques to properly simulate the geomechanical system. A widely used layered pavement analysis program, ILLI-PAVE, was employed in the analyses of flexible pavements of various types, including full-depth asphalt and conventional flexible pavements built on either lime-stabilized soils or untreated subgrade. Nonlinear properties of the subgrade soil and the base course aggregate as transportation geomaterials were also considered. A computer program, the Soft Computing Based System Identifier (SOFTSYS), was developed. In SOFTSYS, ANNs were used as surrogate models of the nonlinear finite element program ILLI-PAVE to provide faster solutions. The deflections obtained from FWD tests in the field were matched with the predictions obtained from the numerical simulations to develop the SOFTSYS models.
The solution to the inverse problem for multi-layered pavements is computationally hard to achieve and is often not feasible due to field variability and the quality of the collected data. The primary difficulty in the analysis arises from the substantial increase in the degree of non-uniqueness of the mapping from the pavement layer parameters to the FWD deflections. The insensitivity of some layer properties lowered the performance of the SOFTSYS models. Still, the SOFTSYS models were shown to work effectively with synthetic data obtained from ILLI-PAVE finite element solutions. In general, SOFTSYS solutions very closely matched the ILLI-PAVE mechanistic pavement analysis results. For validation, field-collected FWD data were successfully used to predict pavement layer thicknesses and layer moduli of in-service flexible pavements. Some of the very promising SOFTSYS results indicated average absolute errors on the order of 2%, 7%, and 4% for the Hot Mix Asphalt (HMA) thickness estimation of full-depth asphalt pavements, full-depth pavements on lime-stabilized soils, and conventional flexible pavements, respectively. The field validations of SOFTSYS also produced meaningful results: thickness data obtained from Ground Penetrating Radar testing matched reasonably well with predictions from the SOFTSYS models. The differences observed in the HMA and lime-stabilized soil layer thicknesses were attributed to deflection data variability from the FWD tests. The backcalculated asphalt concrete layer thicknesses matched better for full-depth asphalt flexible pavements built on lime-stabilized soils than for conventional flexible pavements. Overall, SOFTSYS was capable of producing reliable thickness estimates despite the variability of field-constructed asphalt layer thicknesses.
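The backcalculation loop described above, matching measured FWD deflections against surrogate-model predictions with a GA, can be sketched as follows. This is a toy illustration only: `forward` stands in for the ANN surrogate of ILLI-PAVE, and all names, operators, and settings are assumptions, not SOFTSYS internals.

```python
import random

def backcalculate(measured, forward, bounds, pop_size=40, generations=60, seed=1):
    """Toy GA: search layer parameters (within bounds) so that the
    surrogate forward model reproduces the measured FWD deflections.
    Fitness is the negative squared deflection mismatch."""
    rng = random.Random(seed)

    def fitness(ind):
        return -sum((p - m) ** 2 for p, m in zip(forward(ind), measured))

    # random initial population within the parameter bounds
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # elitism: keep the best half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]   # averaging crossover
            j = rng.randrange(len(child))                  # mutate one gene
            lo, hi = bounds[j]
            child[j] = min(hi, max(lo, child[j] + rng.gauss(0, 0.1 * (hi - lo))))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)
```

In SOFTSYS the forward evaluation is the expensive step, which is exactly why an ANN surrogate of the finite element model makes a population-based search like this affordable.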