972 results for Three Step Test
Abstract:
This doctoral thesis, entitled Contribution to the analysis, design and assessment of compact antenna test ranges at millimeter wavelengths, aims to deepen the understanding of a particular antenna measurement system, the compact range, operating in the millimeter-wavelength frequency bands. The thesis was developed at the Radiation Group (GR), an antenna laboratory belonging to the Signals, Systems and Radiocommunications department (SSR) of the Technical University of Madrid (UPM). The Radiation Group has extensive experience in antenna measurements and currently runs four facilities with different configurations: a Gregorian compact antenna test range, a spherical near-field range, a planar near-field range, and a semianechoic arch system. The research carried out for this thesis extends the first of these measurement configurations to higher frequencies, beyond the microwave region where the Radiation Group already offers customer-level performance. To reach this goal, a set of scientific tasks was carried out sequentially; they are described succinctly below. The first step was a review of the state of the art. The study of the scientific literature covered measurement practices in compact antenna test ranges together with the particularities of millimeter-wavelength technology. Where such facilities are concerned, the joint study of both fields converged on a series of technological challenges that become serious bottlenecks at different stages: analysis, design, and assessment. Second, after this overview, the focus was placed on electromagnetic analysis algorithms. These formulations make it possible to evaluate electromagnetic features of interest, such as the phase of the field distribution or the stray-signal behavior of particular structures, when they interact with sources of electromagnetic waves. A properly operated CATR facility relies on collimating optics that are large in terms of wavelengths. Accordingly, the electromagnetic analysis introduces a very large number of mathematical unknowns that grows with frequency, following different polynomial laws depending on the algorithm used. The optics configuration of interest here was the reflection-type serrated-edge collimator. The analysis of these devices requires flexible handling of almost arbitrary scattering geometries, and this flexibility becomes the core of an algorithm's ability to support the subsequent design tasks. The contribution of this thesis in this field is a formulation that is powerful both in the range of geometries it can analyze and in computational terms. Two algorithms were developed. Although based on the same hybridization principle, they reach different orders of physical accuracy at different computational cost. Their CATR design capabilities were compared, leading to both qualitative and quantitative conclusions about their scope. Third, attention shifted from analysis and design towards range assessment. Millimeter wavelengths imply strict mechanical tolerances and fine setup adjustment. In addition, the large number of unknowns already faced at the analysis stage reappears in the in-chamber field-probing stage.
The naturally lower dynamic range available from semiconductor millimeter-wave sources also requires longer integration times at each probing point. These peculiarities dramatically increase the difficulty of assessing CATR facilities beyond microwaves. The bottleneck becomes so tight that it compromises range characterization above a certain limit frequency, which typically lies in the lowest segment of the millimeter-wavelength band, whereas the value of range assessment, on the contrary, lies towards the highest segment. This thesis contributes to this scenario by developing quiet-zone probing techniques that achieve substantial data-reduction ratios. As a side benefit, they increase the robustness of the results to noise, which amounts to a virtual increase in the setup's available dynamic range. Fourth, the environmental sensitivity of millimeter wavelengths was addressed. Electromagnetic experiments are well known to drift because the results depend on the surrounding environment; at millimeter wavelengths this sensitivity relegates many practices that are routine at microwave frequencies to the experimental stage. In particular, even when the atmosphere evolves within acceptable conditioning bounds, drift phenomena can completely mask the experimental results. The contribution of this thesis in this respect is an electrical model of the indoor atmosphere of a CATR as a function of the environmental variables that affect the range's performance. A simple model was developed that relates a high-level phenomenon, feed-probe phase drift, to low-level quantities that are easy to sample: relative humidity and temperature. With this model, environmental compensation can be performed and the effective chamber conditioning is extended towards higher frequencies. The purpose of this thesis is therefore to advance the understanding of compact antenna test ranges at millimeter wavelengths. This knowledge is delivered through the sequential stages of a CATR's conception, from early low-level electromagnetic analysis to the assessment of an operational facility, stages at each of which bottlenecks currently exist that seriously compromise antenna measurement practice at millimeter wavelengths.
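The abstract does not give the form of the atmospheric model, but as a rough illustration of the idea, the feed-probe phase drift can be related to temperature and relative humidity through the standard radio refractivity expression of ITU-R P.453. The sketch below is not the thesis' actual model; the path length, frequency, ambient pressure, and environmental values are illustrative assumptions.

```python
# Minimal sketch (not the thesis' model): relate feed-probe phase drift to
# indoor temperature and relative humidity via the ITU-R P.453 radio
# refractivity expression. Path length, frequency and pressure are assumed.
import math

C = 299_792_458.0  # speed of light, m/s

def refractivity(temp_c: float, rh_percent: float, pressure_hpa: float = 1013.25) -> float:
    """Radio refractivity N (N-units) from temperature, relative humidity and pressure."""
    t_k = temp_c + 273.15
    # Saturation water-vapour pressure (Buck equation, hPa)
    e_s = 6.1121 * math.exp(17.502 * temp_c / (240.97 + temp_c))
    e = rh_percent / 100.0 * e_s          # partial water-vapour pressure, hPa
    return 77.6 / t_k * (pressure_hpa + 4810.0 * e / t_k)

def phase_deg(freq_hz: float, path_m: float, temp_c: float, rh_percent: float) -> float:
    """One-way electrical phase of the feed-probe path for a given atmosphere."""
    n = 1.0 + refractivity(temp_c, rh_percent) * 1e-6
    return math.degrees(2.0 * math.pi * n * path_m * freq_hz / C)

# Example: drift of a 10 m feed-probe path at 100 GHz when the chamber moves
# from 22 degC / 40 % RH to 23 degC / 45 % RH (values chosen for illustration).
drift = phase_deg(100e9, 10.0, 23.0, 45.0) - phase_deg(100e9, 10.0, 22.0, 40.0)
print(f"predicted phase drift: {drift:.1f} deg")
```

Under these assumptions, a one-degree rise in temperature together with a few percent more humidity already shifts the path phase by several degrees at 100 GHz, which illustrates why environmental compensation matters at the high end of the band.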
Abstract:
This paper is the first step of a research project on the richness of the soundscape of the Plaza Mayor, which arises from the old, traditional shops and bars under its porticoes together with the large daily influx of people. In this paper we study the sound preferences of the salespeople and bartenders at those traditional shops. These preferences include particular sounds, and the time and date of occurrence of specific annoying and pleasant sounds perceived in the square and in the shops surrounding it. To carry out this study, several noise-level measurements and socio-acoustic surveys were conducted. We also attempt to correlate sound preferences and annoyance with the noise levels of specific events occurring in this particular square.
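As a minimal illustration of the kind of correlation analysis mentioned above (not the paper's actual data or method), one could relate the mean annoyance rating reported for each sound event to its measured noise level; all event values below are hypothetical placeholders.

```python
# Illustrative sketch only: correlating survey annoyance ratings with measured
# noise levels for specific sound events. All numbers are hypothetical.
import numpy as np
from scipy.stats import spearmanr

# Mean annoyance rating (1-5 scale) reported by shopkeepers for each event
annoyance = np.array([4.2, 3.8, 2.1, 1.5, 3.0])
# Measured LAeq (dBA) for the same events
laeq_dba = np.array([78.0, 74.5, 65.0, 60.5, 70.0])

rho, p_value = spearmanr(annoyance, laeq_dba)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```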
Abstract:
Comparison of mitochondrial and morphological divergence in eight populations of a widespread leaf-litter skink is used to determine the relative importance of geographic isolation and natural selection in generating phenotypic diversity in the Wet Tropics Rainforest region of Australia. The populations occur in two geographically isolated regions, and within each region, in two different habitats (closed rainforest and tall open forest) that span a well characterized ecological gradient. Morphological differences among ancient geographic isolates (separated for several million years, judging by their mitochondrial DNA sequence divergence) were slight, but morphological and life history differences among habitats were large and occurred despite moderate to high levels of mitochondrial gene flow. A field experiment identified avian predation as one potential agent of natural selection. These results indicate that natural selection operating across ecological gradients can be more important than geographic isolation in similar habitats in generating phenotypic diversity. In addition, our results indicate that selection is sufficiently strong to overcome the homogenizing effects of gene flow, a necessary first step toward speciation in continuously distributed populations. Because ecological gradients may be a source of evolutionary novelty, and perhaps new species, their conservation warrants greater attention. This is particularly true in tropical regions, where most reserves do not include ecological gradients and transitional habitats.
Abstract:
From a set of gonioapparent automotive samples from different manufacturers, we selected 28 low-chroma color pairs with relatively small color differences, predominantly in lightness. These color pairs were visually assessed against a gray scale at six different viewing angles by a panel of 10 observers. Using the Standardized Residual Sum of Squares (STRESS) index, the results of our visual experiment were tested against the predictions of 12 modern color-difference formulas. Based on a weighted STRESS index accounting for the uncertainty in the visual assessments, the best predictions of our whole experiment were achieved by the AUDI2000, CAM02-SCD, CAM02-UCS, and OSA-GP-Euclidean color-difference formulas, which were not statistically significantly different from one another. A two-step optimization of the original AUDI2000 color-difference formula resulted in a modified AUDI2000 formula that performed both significantly better than the original formula and below the experimental inter-observer variability. Nevertheless, the proposal of a new, revised AUDI2000 color-difference formula requires additional experimental data.
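For reference, the STRESS index compares the color differences computed by a formula with the visual differences from the observers; a minimal sketch of the form commonly used in color-difference studies is given below, with made-up sample values (not the experiment's data).

```python
# Minimal sketch of the STRESS index between computed color differences (dE)
# and visual differences (dV): STRESS = 100*sqrt(sum((dE - F1*dV)^2) / sum((F1*dV)^2)),
# with F1 = sum(dE^2) / sum(dE*dV). Sample values are illustrative only.
import numpy as np

def stress(dE: np.ndarray, dV: np.ndarray) -> float:
    """Standardized Residual Sum of Squares; 0 means perfect agreement."""
    F1 = np.sum(dE**2) / np.sum(dE * dV)      # scaling factor between dE and dV
    return 100.0 * np.sqrt(np.sum((dE - F1 * dV) ** 2) / np.sum((F1 * dV) ** 2))

# Example: computed differences from a formula vs. gray-scale visual ratings
dE = np.array([0.8, 1.2, 0.5, 1.9, 1.1])
dV = np.array([0.9, 1.0, 0.6, 2.1, 1.0])
print(f"STRESS = {stress(dE, dV):.1f}")       # lower values = better prediction
```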
Abstract:
The focus of this research was defined by a poorly characterised filtration train employed to clarify culture broth containing monoclonal antibodies secreted by GS-NSO cells: the filtration train blinded unpredictably, and the ability of the positively charged filters to adsorb DNA from process material was unknown. To direct the development of an assay to quantify the ability of depth filters to adsorb DNA, the molecular weight of DNA from a large-scale, fed-batch, mammalian cell culture vessel was evaluated as process material passed through the initial stages of the purification scheme. High molecular weight DNA was substantially cleared from the broth after passage through a disc stack centrifuge, and the remaining low molecular weight DNA was largely unaffected by passage through a series of depth filters and a sterilising grade membrane. Removal of high molecular weight DNA was shown to be coupled with clarification of the process stream. The DNA from cell culture supernatant showed a pattern of internucleosomal cleavage of chromatin when fractionated by electrophoresis, but the presence of both necrotic and apoptotic cells throughout the fermentation meant that the origin of the fragmented DNA could not be unequivocally determined. An intercalating fluorochrome, PicoGreen, was selected for development of a suitable DNA assay because of its ability to respond to low molecular weight DNA. It was assessed for its ability to determine the concentration of DNA in clarified mammalian cell culture broths containing pertinent monoclonal antibodies. Fluorescent signal suppression was ameliorated by sample dilution or by performing the assay above the pI of the secreted IgG. The source of fluorescence in clarified culture broth was validated by incubation with RNase A and DNase I. At least 89.0% of the fluorescence was attributable to nucleic acid, and pre-digestion with RNase A was shown to be a requirement for successful quantification of DNA in such samples. Application of the fluorescence-based assay resulted in characterisation of the physical parameters governing adsorption of DNA by various positively charged depth filters and membranes in test solutions, and of the DNA adsorption profile of the manufacturing-scale filtration train. Buffers that reduced or neutralised the depth filter or membrane charge, and those that impeded hydrophobic interactions, were shown to affect their operational capacity, demonstrating that DNA was adsorbed by a combination of electrostatic and hydrophobic interactions. Production-scale centrifugation of harvest broth containing therapeutic protein reduced total DNA in the process stream from 79.8 μg ml-1 to 9.3 μg ml-1, whereas the DNA content of pre- and post-filtration supernatant samples was only marginally reduced, from 6.3 to 6.0 μg ml-1 respectively. Hence the filtration train was shown to be ineffective in DNA removal. Historically, blinding of the depth filters had been unpredictable, with data such as numbers of viable cells, non-viable cells, product titre, or process shape (batch, fed-batch, or draw and fill) failing to inform on the durability of depth filters in the harvest step. To investigate this, key fouling contaminants were identified by challenging depth filters with the same mass of one of the following: viable healthy cells, cells that had died by the process of apoptosis, and cells that had died through the process of necrosis.
The pressure increase across a Cuno Zeta Plus 10SP depth filter was 2.8 and 16.5 times more sensitive to debris from apoptotic and necrotic cells, respectively, than to viable cells. The condition of DNA released into the culture broth was assessed: necrotic cells released predominantly high molecular weight DNA, in contrast to apoptotic cells, which released chiefly low molecular weight DNA. The blinding of the filters was found to be largely unaffected by variations in the particle size distribution of material in, and the viscosity of, the solutions with which they were challenged. The exceptional response of the depth filters to necrotic cells may explain the previously noted unpredictable filter blinding, whereby a number of necrotic cells have a more significant impact on the life of a depth filter than a similar number of viable or apoptotic cells. In a final set of experiments, the pressure drop caused by non-viable necrotic culture broths that had been treated with DNase I or benzonase was smaller than that caused by untreated broths: the ability of the enzyme-treated cultures to foul the depth filter was reduced by 70.4% and 75.4% respectively, indicating the importance of DNA in the blinding of the depth filter studied.
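A quick arithmetic check using the concentrations quoted above shows how sharply the two harvest steps differ in DNA clearance; the percentages below are derived directly from the reported values.

```python
# Back-of-the-envelope DNA clearance across the two steps reported above.
def percent_removed(c_in: float, c_out: float) -> float:
    """Percentage of DNA removed across a process step."""
    return 100.0 * (c_in - c_out) / c_in

# Disc stack centrifuge: 79.8 -> 9.3 ug/ml
print(f"centrifugation: {percent_removed(79.8, 9.3):.1f} % of DNA removed")
# Depth-filtration train: 6.3 -> 6.0 ug/ml
print(f"filtration:     {percent_removed(6.3, 6.0):.1f} % of DNA removed")
```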
Abstract:
Purpose: To evaluate the effect of reducing the number of visual acuity measurements made in a defocus curve on the quality of data quantified. Setting: Midland Eye, Solihull, United Kingdom. Design: Evaluation of a technique. Methods: Defocus curves were constructed by measuring visual acuity on a distance logMAR letter chart, randomizing the test letters between lens presentations. The lens powers evaluated ranged between +1.50 diopters (D) and -5.00 D in 0.50 D steps, which were also presented in a randomized order. Defocus curves were measured binocularly with the Tecnis diffractive, Rezoom refractive, Lentis rotationally asymmetric segmented (+3.00 D addition [add]), and Finevision trifocal multifocal intraocular lenses (IOLs) implanted bilaterally, and also for the diffractive IOL and refractive or rotationally asymmetric segmented (+3.00 D and +1.50 D adds) multifocal IOLs implanted contralaterally. Relative and absolute range of clear-focus metrics and area metrics were calculated for curves fitted using 0.50 D, 1.00 D, and 1.50 D steps and a near add-specific profile (ie, distance, half the near add, and the full near-add powers). Results: A significant difference in simulated results was found in at least 1 of the relative or absolute range of clear-focus or area metrics for each of the multifocal designs examined when the defocus-curve step size was increased (P<.05). Conclusion: Faster methods of capturing defocus curves from multifocal IOL designs appear to distort the metric results and are therefore not valid. Financial Disclosure: No author has a financial or proprietary interest in any material or method mentioned. © 2013 ASCRS and ESCRS.
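The abstract does not spell out how the range-of-clear-focus and area metrics are computed; the sketch below shows one common way to derive them from a measured defocus curve, assuming an illustrative 0.3 logMAR acuity criterion and made-up acuity data (neither taken from the study).

```python
# Illustrative defocus-curve metrics (assumed criterion and data, not the
# study's): range of clear focus as the span of defocus where acuity stays at
# or better than 0.3 logMAR, and an area metric via the trapezoidal rule.
import numpy as np

defocus_d = np.arange(1.50, -5.01, -0.50)    # lens powers from +1.50 D to -5.00 D
acuity_logmar = np.array([0.40, 0.30, 0.18, 0.08, 0.02, 0.10, 0.22,
                          0.30, 0.28, 0.20, 0.24, 0.30, 0.40, 0.52])

criterion = 0.3                               # logMAR cut-off, assumed for illustration
clear = defocus_d[acuity_logmar <= criterion]
range_of_clear_focus = clear.max() - clear.min()

# Area metric: integrate the acuity margin below the criterion over defocus
gain = np.clip(criterion - acuity_logmar, 0.0, None)
step = 0.50                                   # D, spacing between defocus levels
area = np.sum((gain[:-1] + gain[1:]) / 2.0) * step   # trapezoidal rule

print(f"range of clear focus: {range_of_clear_focus:.2f} D")
print(f"area metric: {area:.2f} logMAR*D")
```

Coarser step sizes simply drop rows from these arrays, which is why the metrics can shift when the curve is sampled more sparsely.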
Abstract:
Field material testing provides firsthand information on pavement conditions, which is most helpful in evaluating performance and identifying preventive maintenance or overlay strategies. The high variability of field asphalt concrete due to construction raises the demand for accuracy of the test. Accordingly, the objective of this study is to propose a reliable and repeatable methodology to evaluate the fracture properties of field-aged asphalt concrete using the overlay test (OT). The OT is selected because of its efficiency and feasibility for asphalt field cores with diverse dimensions. The fracture properties refer to the Paris' law parameters based on the pseudo J-integral (A and n), because of the sound physical significance of the pseudo J-integral with respect to characterizing the cracking process. In order to determine A and n, a two-step OT protocol is designed to characterize the undamaged and damaged behaviors of asphalt field cores. To ensure the accuracy of the determined undamaged and fracture properties, a new analysis method is then developed for data processing, which combines finite element simulations with mechanical analysis of viscoelastic force equilibrium and the evolution of pseudo displacement work in the OT specimen. Finally, theoretical equations are derived to calculate A and n directly from the OT test data, and the accuracy of the determined fracture properties is verified. The proposed methodology is applied to a total of 27 asphalt field cores obtained from a field project in Texas, including the control Hot Mix Asphalt (HMA) and two types of warm mix asphalt (WMA). The results demonstrate a high linear correlation between n and −log A for all the tested field cores. Investigation of the effect of field aging on the fracture properties confirms that n is a good indicator for quantifying the cracking resistance of asphalt concrete, and also indicates that summer climatic conditions clearly accelerate the rate of aging. The impact of the WMA technologies on the fracture properties of asphalt concrete is visualized by comparing the n-values: the Evotherm WMA technology slightly improves the cracking resistance, while the foaming WMA technology provides fracture properties comparable with those of the HMA. After 15 months of aging in the field, the cracking resistance does not differ significantly between the HMA and the WMAs, which is confirmed by observations of field distresses.
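As a rough sketch of the final fitting step (not the authors' full derivation), once the crack growth per cycle and the corresponding pseudo J-integral are available for a specimen, A and n of the modified Paris' law can be recovered by linear regression in log-log space; all values below are placeholders.

```python
# Illustrative fit of the modified Paris' law dc/dN = A * (J_R)^n from
# hypothetical crack-growth data (not the study's measurements).
import numpy as np

pseudo_J = np.array([0.8, 1.5, 2.6, 4.0, 6.5])                  # pseudo J-integral values
dc_dN    = np.array([2.1e-4, 7.5e-4, 2.3e-3, 5.8e-3, 1.6e-2])   # crack growth per cycle

# log10(dc/dN) = log10(A) + n * log10(J_R): slope gives n, intercept gives log10(A)
n, logA = np.polyfit(np.log10(pseudo_J), np.log10(dc_dN), 1)
A = 10.0 ** logA
print(f"n = {n:.2f}, A = {A:.2e}, -log A = {-logA:.2f}")
```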
Abstract:
This study developed a reliable and repeatable methodology to evaluate the fracture properties of asphalt mixtures with an overlay test (OT). In the proposed methodology, first, a two-step OT protocol was used to characterize the undamaged and damaged behaviors of asphalt mixtures. Second, a new methodology combining the mechanical analysis of viscoelastic force equilibrium in the OT specimen and finite element simulations was used to determine the undamaged properties and crack growth function of asphalt mixtures. Third, a modified Paris's law, replacing the stress intensity factor with the pseudo J-integral, was employed to characterize the fracture behavior of asphalt mixtures. Theoretical equations were derived to calculate the parameters A and n (defined as the fracture properties) in the modified Paris's law. The study used a detailed example to calculate A and n from the OT data. The proposed methodology was successfully applied to evaluate the impact of warm-mix asphalt (WMA) technologies on fracture properties. The results of the tested specimens showed that Evotherm WMA technology slightly improved the cracking resistance of asphalt mixtures, while foaming WMA technology provided comparable fracture properties. In addition, the study found that, in general, A decreased as n increased, and a linear relationship between 2log(A) and n was established.
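The reported linear relationship between 2log(A) and n can be checked across specimens with an ordinary least-squares fit; the (A, n) pairs in the sketch below are hypothetical placeholders, not the study's data.

```python
# Sketch of the cross-specimen check: how well 2*log10(A) is described as a
# linear function of n. The (A, n) pairs are hypothetical placeholders.
import numpy as np

A_vals = np.array([3.2e-5, 1.1e-5, 4.5e-6, 1.8e-6, 6.0e-7])
n_vals = np.array([2.1, 2.4, 2.7, 3.0, 3.3])

y = 2.0 * np.log10(A_vals)
slope, intercept = np.polyfit(n_vals, y, 1)
r = np.corrcoef(n_vals, y)[0, 1]
print(f"2*log10(A) = {slope:.2f} * n + {intercept:.2f}  (r = {r:.3f})")
```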
Abstract:
Fixed-step-size (FSS) and Bayesian staircases are widely used methods to estimate sensory thresholds in 2AFC tasks, although a direct comparison of both types of procedure under identical conditions has not previously been reported. A simulation study and an empirical test were conducted to compare the performance of optimized Bayesian staircases with that of four optimized variants of FSS staircase differing as to up-down rule. The ultimate goal was to determine whether FSS or Bayesian staircases are the best choice in experimental psychophysics. The comparison considered the properties of the estimates (i.e. bias and standard errors) in relation to their cost (i.e. the number of trials to completion). The simulation study showed that mean estimates of Bayesian and FSS staircases are dependable when sufficient trials are given and that, in both cases, the standard deviation (SD) of the estimates decreases with number of trials, although the SD of Bayesian estimates is always lower than that of FSS estimates (and thus, Bayesian staircases are more efficient). The empirical test did not support these conclusions, as (1) neither procedure rendered estimates converging on some value, (2) standard deviations did not follow the expected pattern of decrease with number of trials, and (3) both procedures appeared to be equally efficient. Potential factors explaining the discrepancies between simulation and empirical results are commented upon and, all things considered, a sensible recommendation is for psychophysicists to run no fewer than 18 and no more than 30 reversals of an FSS staircase implementing the 1-up/3-down rule.
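For concreteness, the recommended procedure, a fixed-step-size staircase with the 1-up/3-down rule stopped after a set number of reversals, can be sketched as follows; the simulated observer, step size, and starting level are illustrative assumptions, not the paper's simulation setup.

```python
# Minimal sketch of a fixed-step-size 1-up/3-down staircase for a simulated
# 2AFC task, stopping after a fixed number of reversals. Parameter values
# (step size, starting level, simulated observer) are illustrative only.
import random
import math

def p_correct(level: float, threshold: float = 1.0, slope: float = 8.0) -> float:
    """Simulated 2AFC observer: Weibull psychometric function with a 50% guess rate."""
    return 0.5 + 0.5 * (1.0 - math.exp(-((level / threshold) ** slope)))

def run_staircase(start: float = 2.0, step: float = 0.1, n_reversals: int = 18) -> float:
    """1-up/3-down staircase; returns the mean level at the last reversals."""
    level, reversals, correct_run, last_dir = start, [], 0, 0
    while len(reversals) < n_reversals:
        if random.random() < p_correct(level):    # simulated trial outcome
            correct_run += 1
            if correct_run < 3:
                continue                          # no level change yet
            correct_run, direction = 0, -1        # 3 correct in a row -> step down
        else:
            correct_run, direction = 0, +1        # any error -> step up
        if last_dir and direction != last_dir:
            reversals.append(level)               # direction change = reversal
        last_dir = direction
        level = max(level + direction * step, step)
    return sum(reversals[-12:]) / len(reversals[-12:])

# The 1-up/3-down rule targets roughly 79.4% correct on the psychometric function.
print(f"estimated threshold: {run_staircase():.2f}")
```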