64 results for laser threshold
Abstract:
Light Detection And Ranging (LIDAR) is an important modality in terrain and land surveying for many environmental, engineering and civil applications. This paper presents the framework for a recently developed unsupervised classification algorithm called Skewness Balancing for object and ground point separation in airborne LIDAR data. The main advantages of the algorithm are threshold-freedom and independence from LIDAR data format and resolution, while preserving object and terrain details. In this contribution, the framework for Skewness Balancing is built with a prediction model in which unknown LIDAR tiles can be categorised as “hilly” or “moderate” terrain. Accuracy assessment of the model is carried out using cross-validation, with an overall accuracy of 95%. An extension to the algorithm is developed to address the overclassification issue for hilly terrain. For moderate terrain, the results show that, in the classified tiles, detached objects (buildings and vegetation) and attached objects (bridges and motorway junctions) are separated from bare earth (ground, roads and yards), which makes Skewness Balancing well suited for integration into geographic information system (GIS) software packages.
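As a rough illustration of the core idea behind Skewness Balancing (the paper's full procedure, including the hilly/moderate prediction model, is not reproduced here), a minimal Python sketch, assuming the algorithm iteratively removes the highest returns until the skewness of the remaining elevation distribution is no longer positive:

import numpy as np
from scipy.stats import skew

def skewness_balancing(elevations):
    """Separate object and ground returns by removing the highest
    elevations until the skewness of the remaining distribution is <= 0.
    Returns a boolean mask that is True for points kept as ground."""
    order = np.argsort(elevations)          # ascending by height
    z = elevations[order]
    n = len(z)
    while n > 2 and skew(z[:n]) > 0:        # positive skew: tall objects remain
        n -= 1                              # drop the current highest point
    ground = np.zeros(len(z), dtype=bool)
    ground[order[:n]] = True
    return ground

# Toy example: flat terrain around 100 m with a few tall objects
z = np.concatenate([np.random.normal(100, 0.5, 1000),
                    np.random.normal(112, 2.0, 50)])
mask = skewness_balancing(z)
print(mask.sum(), "of", len(z), "points labelled as ground")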
Abstract:
Laser beams emitted from the Geoscience Laser Altimeter System (GLAS), as well as other spaceborne laser instruments, can only penetrate clouds to a limit of a few optical depths. As a result, only optical depths of thinner clouds (< about 3 for GLAS) are retrieved from the reflected lidar signal. This paper presents a comprehensive study of possible retrievals of optical depth of thick clouds using solar background light and treating GLAS as a solar radiometer. To do so one must first calibrate the reflected solar radiation received by the photon-counting detectors of the GLAS 532-nm channel, the primary channel for atmospheric products. Solar background radiation is regarded as noise to be subtracted in the retrieval process of the lidar products. However, once calibrated, it becomes a signal that can be used in studying the properties of optically thick clouds. In this paper, three calibration methods are presented: (i) calibration with coincident airborne and GLAS observations, (ii) calibration with coincident Geostationary Operational Environmental Satellite (GOES) and GLAS observations of deep convective clouds, and (iii) calibration from first principles using optical depth of thin water clouds over ocean retrieved by GLAS active remote sensing. Results from the three methods agree well with each other. Cloud optical depth (COD) is retrieved from the calibrated solar background signal using a one-channel retrieval. Comparison with COD retrieved from GOES during GLAS overpasses shows that the average difference between the two retrievals is 24%. As an example, the COD values retrieved from GLAS solar background are illustrated for a marine stratocumulus cloud field that is too thick to be penetrated by the GLAS laser. Based on this study, optical depths for thick clouds will be provided as a supplementary product to the existing operational GLAS cloud products in future GLAS data releases.
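A schematic sketch of what a one-channel retrieval of this kind involves, with purely illustrative numbers (in practice the reflectance-vs-COD relation would come from a radiative transfer model for each shot's geometry; none of the values below are from the paper):

import numpy as np

# Hypothetical monotonic relation between cloud optical depth and 532-nm
# cloud reflectance, as would be precomputed with a radiative transfer
# model for a given viewing/solar geometry.  Illustrative values only.
cod_grid  = np.array([1, 2, 5, 10, 20, 40, 80])
refl_grid = np.array([0.10, 0.17, 0.32, 0.48, 0.63, 0.74, 0.81])

def cod_from_background(radiance, solar_irradiance, mu0):
    """One-channel retrieval: convert the calibrated solar background
    radiance to an apparent cloud reflectance, then invert the
    reflectance-vs-COD curve by interpolation."""
    reflectance = np.pi * radiance / (mu0 * solar_irradiance)
    return np.interp(reflectance, refl_grid, cod_grid)

# Example: calibrated background radiance of 0.12 W m^-2 sr^-1 nm^-1,
# solar spectral irradiance of ~1.85 W m^-2 nm^-1 near 532 nm, sun 30 deg off zenith
print(cod_from_background(0.12, 1.85, np.cos(np.radians(30.0))))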
Abstract:
We present an application of cavity-enhanced absorption spectroscopy with an off-axis alignment of the cavity formed by two spherical mirrors and with time integration of the cavity-output intensity for detection of nitrogen dioxide (NO2) and iodine monoxide (IO) radicals using a violet laser diode at λ = 404.278 nm. A noise-equivalent (1σ = root-mean-square variation of the signal) fractional absorption for one optical pass of 4.5×10⁻⁸ was demonstrated with a mirror reflectivity of ~0.99925, a cavity length of 0.22 m and a lock-in-amplifier time constant of 3 s. Noise-equivalent detection sensitivities towards nitrogen dioxide of 1.8×10¹⁰ molecule cm⁻³ and towards the IO radical of 3.3×10⁹ molecule cm⁻³ were achieved in flow tubes with an inner diameter of 4 cm for a lock-in-amplifier time constant of 3 s. Alkyl peroxy radicals were detected using chemical titration with excess nitric oxide (RO2 + NO → RO + NO2). Measurement of oxygen-atom concentrations was accomplished by determining the depletion of NO2 in the reaction NO2 + O → NO + O2. Noise-equivalent concentrations of alkyl peroxy radicals and oxygen atoms were 3×10¹⁰ molecule cm⁻³ in the discharge-flow-tube experiments.
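The quoted NO2 sensitivity is consistent with dividing the noise-equivalent fractional absorption per pass by the absorption path (the 4 cm flow-tube diameter rather than the full cavity length) and an assumed NO2 absorption cross-section of roughly 6×10⁻¹⁹ cm² near 404 nm (the cross-section value is an assumption, not taken from the paper):

\[
n_{\min} \approx \frac{A_{\min}}{\sigma_{\mathrm{NO_2}}\, L_{\mathrm{abs}}}
         = \frac{4.5\times10^{-8}}{(6\times10^{-19}\,\mathrm{cm^2})(4\,\mathrm{cm})}
         \approx 1.9\times10^{10}\ \mathrm{molecule\ cm^{-3}},
\]

in line with the 1.8×10¹⁰ molecule cm⁻³ stated above.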
Abstract:
A new approach is presented to identify the number of incoming signals in antenna array processing. The new method exploits the inherent properties of the noise eigenvalues of the covariance matrix of the array output. A single threshold is established that incorporates information about the signal and noise strength, the data length, and the array size. When subspace-based algorithms are adopted, the computational cost of the signal-number detector is almost negligible. The performance of the threshold is robust against low SNR and short data length.
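For illustration, a minimal eigenvalue-threshold detector in Python in the spirit described (the threshold formula below is a generic placeholder based on array size M and snapshot count N, not the statistic derived in the paper):

import numpy as np

def estimate_num_signals(X, margin=1.5):
    """Illustrative eigenvalue-threshold detector: count sample-covariance
    eigenvalues that exceed a noise threshold depending on the array size M
    and the number of snapshots N.  X is an (M, N) array of snapshots."""
    M, N = X.shape
    R = (X @ X.conj().T) / N                        # sample covariance matrix
    eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]  # descending
    noise_power = eigvals[-(M // 2):].mean()        # crude noise-floor estimate
    threshold = margin * noise_power * (1.0 + np.sqrt(M / N)) ** 2
    return int(np.sum(eigvals > threshold))

# Two unit-power sources on an 8-element half-wavelength ULA, 200 snapshots
rng = np.random.default_rng(0)
M, N = 8, 200
angles = np.deg2rad([10.0, 25.0])
A = np.exp(1j * np.pi * np.outer(np.arange(M), np.sin(angles)))   # steering vectors
S = (rng.standard_normal((2, N)) + 1j * rng.standard_normal((2, N))) / np.sqrt(2)
noise = 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))) / np.sqrt(2)
X = A @ S + noise
print(estimate_num_signals(X))   # expected: 2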
Abstract:
Area-wide development viability appraisals are undertaken to determine the economic feasibility of policy targets in relation to planning obligations. Essentially, development viability appraisals consist of a series of residual valuations of hypothetical development sites across a local authority area at a particular point in time. The valuations incorporate the estimated financial implications of the proposed level of planning obligations. To determine viability, the output land values are benchmarked against threshold land value, so the basis on which this threshold is established and the level at which it is set are critical to development viability appraisal at the policy-setting (area-wide) level. Essentially, it is an estimate of the value at which a landowner would be prepared to sell. If the estimated site values are higher than the threshold land value, the policy target is considered viable. This paper investigates the effectiveness of existing methods of determining threshold land value. These are tested against the relationship between development value and costs. Modelling reveals that a threshold land value that is not related to shifts in development value renders marginal sites unviable and fails to collect proportionate planning obligations from high-value/low-cost sites. Testing the model against national average house prices and build costs reveals the high degree of volatility in residual land values over time and underlines the importance of making threshold land value relative to the main driver of this volatility, namely development value.
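For illustration, the residual valuation underlying each site appraisal can be written in a simplified form (the figures below are illustrative, not the paper's model):

\[
\mathrm{RLV} = \mathrm{GDV} - \bigl(\mathrm{build\ costs} + \mathrm{fees} + \mathrm{finance} + \mathrm{developer's\ profit} + \mathrm{planning\ obligations}\bigr),
\]

and the policy target is judged viable for a site when RLV is at least the threshold land value. For example, with a gross development value of £10.0m, combined costs, fees, finance and profit of £8.0m and obligations of £0.8m, the residual land value is £1.2m, so the target is viable for that site only if the threshold land value does not exceed £1.2m.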
Abstract:
This mixed-method study tracked social interaction and adaptation among 20 international postgraduates on a 1-year programme in the UK, examining assumptions that language proficiency and interactional engagement directly underpin sociocultural adaptation. Participants remained frustrated by a perceived ‘threshold’ barring successful interaction with English speakers, while reporting reluctance to take up available opportunities, independent of language proficiency and sociocultural adaptation. We challenge linear models of adaptation and call for assistance to international students in crossing the threshold to successful interaction.
Abstract:
Cell patterning commonly employs photolithographic methods for the microfabrication of structures on silicon chips. These require expensive photo-mask development and complex photolithographic processing. Laser-based patterning of cells has been studied in vitro, and laser ablation of polymers is an active area of research promising high aspect ratios. This paper describes how 800 nm femtosecond infrared (IR) laser radiation can be successfully used to perform laser ablative micromachining of parylene-C on SiO2 substrates for the patterning of human hNT astrocytes (derived from the human teratocarcinoma cell line (hNT)), whilst 248 nm nanosecond ultra-violet laser radiation produces photo-oxidation of the parylene-C and destroys cell patterning. In this work, we report the laser ablation methods used and the ablation characteristics of parylene-C for a range of IR pulse fluences. Results are presented that support the validity of using IR laser ablative micromachining for patterning human hNT astrocyte cells. We report the variation in yield of patterned hNT astrocytes on parylene-C with laser pulse spacing, pulse number, pulse fluence and parylene-C strip width. The findings demonstrate how laser ablative micromachining of parylene-C on SiO2 substrates can offer an accessible alternative for rapid prototyping and high-yield cell patterning, with broad application to multi-electrode arrays, cellular micro-arrays and microfluidics.
Abstract:
This paper describes the use of 800 nm femtosecond infrared (IR) and 248 nm nanosecond ultraviolet (UV) laser radiation in performing ablative micromachining of parylene-C on SiO2 substrates for the patterning of human hNT astrocytes. Results are presented that support the validity of using IR laser ablative micromachining for patterning human hNT astrocyte cells, while UV laser radiation produces photo-oxidation of the parylene-C and destroys cell patterning. The findings demonstrate how IR laser ablative micromachining of parylene-C on SiO2 substrates can offer a low-cost, accessible alternative for rapid prototyping and high-yield cell patterning.
Abstract:
We test the expectations theory of the term structure of U.S. interest rates in nonlinear systems. These models allow the response of the change in short rates to past values of the spread to depend upon the level of the spread. The nonlinear system is tested against a linear system, and the results of testing the expectations theory in both models are contrasted. We find that the results of tests of the implications of the expectations theory depend on the size and sign of the spread. The long-maturity spread predicts future changes in the short rate only when the spread is high.
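An illustrative two-regime (threshold) specification of the kind described, where $r_t$ is the short rate, $s_t$ the long-short spread and $\gamma$ the threshold (the paper's actual system may differ):

\[
\Delta r_t =
\begin{cases}
\alpha_1 + \beta_1 s_{t-1} + \varepsilon_t, & s_{t-1} \le \gamma,\\[2pt]
\alpha_2 + \beta_2 s_{t-1} + \varepsilon_t, & s_{t-1} > \gamma,
\end{cases}
\]

so the expectations-theory prediction of a positive response of short-rate changes to the spread can hold in one regime (e.g. high spreads) and fail in the other.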
Abstract:
BACKGROUND. To use spectra acquired by matrix-assisted laser desorption/ionization (MALDI) mass spectrometry (MS) from pre- and post-digital rectal examination (DRE) urine samples to search for discriminating peaks that can adequately distinguish between benign and malignant prostate conditions, and to identify the peaks’ underlying biomolecules. METHODS. Twenty-five participants with prostate cancer (PCa) and 27 participants with a variety of benign prostatic conditions, as confirmed by a 10-core tissue biopsy, were included. Pre- and post-DRE urine samples were prepared for MALDI MS profiling using an automated clean-up procedure. Following mass spectra collection and processing, peak mass and intensity were extracted and subjected to statistical analysis to identify peaks capable of distinguishing between benign and malignant conditions. Logistic regression was used to combine markers to create a sensitive and specific test. RESULTS. A peak at m/z 10,760 was identified as β-microseminoprotein (β-MSMB) and, based on the peak’s average area, was found to be significantly lower in urine from PCa participants. By combining serum prostate-specific antigen (PSA) levels with MALDI MS-measured β-MSMB levels, optimum threshold values obtained from receiver operating characteristic (ROC) curves gave an increased sensitivity of 96% at a specificity of 26%. CONCLUSIONS. These results demonstrate that, with a simple sample clean-up followed by MALDI MS profiling, significant differences in MSMB abundance can be found in post-DRE urine samples. Combining these with serum PSA levels obtained from a standard clinical assay led to high classification accuracy for PCa in the studied sample set. Our results need to be validated in a larger multicenter prospective randomized clinical trial.
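A minimal sketch, with entirely hypothetical data, of the marker-combination step described (logistic regression over PSA and the MALDI-measured β-MSMB peak area, with an operating point read off the ROC curve):

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve

# Hypothetical per-participant data: serum PSA (ng/mL), MALDI-measured
# beta-MSMB peak area (arbitrary units) and biopsy outcome (1 = cancer).
rng = np.random.default_rng(1)
n_cancer, n_benign = 25, 27
psa  = np.concatenate([rng.lognormal(1.8, 0.5, n_cancer), rng.lognormal(1.4, 0.5, n_benign)])
msmb = np.concatenate([rng.normal(0.8, 0.3, n_cancer),   rng.normal(1.2, 0.3, n_benign)])
y    = np.concatenate([np.ones(n_cancer), np.zeros(n_benign)])

X = np.column_stack([psa, msmb])
clf = LogisticRegression().fit(X, y)          # combine the two markers
scores = clf.predict_proba(X)[:, 1]

# Choose an operating point: the highest-specificity threshold that still
# achieves at least 96% sensitivity on this (hypothetical) sample.
fpr, tpr, thresholds = roc_curve(y, scores)
ok = tpr >= 0.96
print("sensitivity %.2f, specificity %.2f" % (tpr[ok][0], 1 - fpr[ok][0]))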
Abstract:
Although financial theory rests heavily upon the assumption that asset returns are normally distributed, value indices of commercial real estate display significant departures from normality. In this paper, we apply and compare the properties of two recently proposed regime-switching models for value indices of commercial real estate in the US and the UK, both of which relax the assumption that observations are drawn from a single distribution with constant mean and variance. Statistical tests of the models' specification indicate that the Markov switching model is better able to capture the non-stationary features of the data than the threshold autoregressive model, although both provide better descriptions of the data than models that allow for only one state. Our results have several implications for theoretical models and empirical research in finance.
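For reference, the simplest two-state Markov switching model of this kind lets the mean and variance of index returns $r_t$ depend on an unobserved state $S_t$ that follows a first-order Markov chain (an illustrative form, not necessarily the exact specification estimated in the paper):

\[
r_t = \mu_{S_t} + \sigma_{S_t}\,\varepsilon_t,\qquad \varepsilon_t \sim N(0,1),\qquad
\Pr(S_t = j \mid S_{t-1} = i) = p_{ij},\qquad S_t \in \{1,2\}.
\]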
Abstract:
This paper combines and generalizes a number of recent time series models of daily exchange rate series by using a SETAR model which also allows the variance equation of a GARCH specification for the error terms to be drawn from more than one regime. An application of the model to the French Franc/Deutschmark exchange rate demonstrates that out-of-sample forecasts of exchange rate volatility are also improved when the restriction that the data are drawn from a single regime is removed. This result highlights the importance of considering both types of regime shift (i.e. thresholds in variance as well as in mean) when analysing financial time series.
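An illustrative SETAR specification with a regime-dependent GARCH(1,1) variance equation, of the kind the abstract describes (threshold variable, delay and orders are chosen here for illustration):

\[
y_t = \phi_0^{(j)} + \phi_1^{(j)} y_{t-1} + \varepsilon_t,\qquad
\varepsilon_t \mid \mathcal{F}_{t-1} \sim N\!\bigl(0, h_t^{(j)}\bigr),\qquad
j = \begin{cases}1, & y_{t-d} \le \kappa,\\ 2, & y_{t-d} > \kappa,\end{cases}
\]
\[
h_t^{(j)} = \omega^{(j)} + \alpha^{(j)} \varepsilon_{t-1}^2 + \beta^{(j)} h_{t-1},
\]

so both the conditional-mean and the conditional-variance parameters switch with the regime of the threshold variable $y_{t-d}$.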
Abstract:
Taste and smell detection threshold measurements are frequently time-consuming, especially when the method involves reversing the concentrations presented in order to replicate and improve the accuracy of the results. These multiple replications are likely to cause sensory and cognitive fatigue, which may be more pronounced in elderly populations. A new rapid detection threshold methodology was developed that quickly locates the likely position of each individual's sensory detection threshold and then refines this by presenting multiple concentrations around that point to determine the threshold. This study evaluates the reliability and validity of this method. Findings indicate that this new rapid detection threshold methodology was appropriate for identifying differences in sensory detection thresholds between different populations and has the benefit of providing a shorter assessment of detection thresholds. The results indicated that this method is appropriate for determining individual as well as group detection thresholds.
Abstract:
Sensory thresholds are often collected through ascending forced-choice methods. Group thresholds are important for comparing stimuli or populations; yet the method has two problems. An individual may guess the correct answer by chance at any concentration step, and might detect correctly at low concentrations but become adapted or fatigued at higher concentrations. The survival analysis method deals with both issues. Individual sequences of incorrect and correct answers are adjusted, taking into account the group performance at each concentration. The adjustment reduces the probability that runs of consecutive correct answers are due to chance. Adjusted sequences are submitted to survival analysis to determine group thresholds. The technique was applied to an aroma threshold and a taste threshold study. It resulted in group thresholds similar to those from ASTM or logarithmic regression procedures. Significant differences in taste thresholds between younger and older adults were determined. The approach provides a more robust technique than previous estimation methods.
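A minimal sketch of the survival-analysis step with hypothetical panel data (the paper's chance-correction adjustment of individual answer sequences is not reproduced; here each panellist simply contributes the concentration step at which detection first occurred, right-censored if it never did):

import numpy as np

def kaplan_meier_median(step_detected, detected):
    """Group detection threshold (median) from right-censored individual data.
    step_detected[i]: concentration step at which panellist i first showed a
    sustained run of correct answers; detected[i] is False if the panellist
    never did (censored at the highest step presented)."""
    steps = np.asarray(step_detected, dtype=float)
    events = np.asarray(detected, dtype=bool)
    survival = 1.0
    for t in np.unique(steps):
        at_risk = np.sum(steps >= t)           # still "undetected" entering step t
        d = np.sum((steps == t) & events)      # detections at step t
        survival *= 1.0 - d / at_risk          # Kaplan-Meier product-limit update
        if survival <= 0.5:                    # half the panel has detected
            return t
    return np.nan                              # median not reached (heavy censoring)

# Hypothetical panel of 8: step index of first sustained correct run, or
# censored (False) if the panellist never detected within the presented range.
steps    = [3, 4, 4, 5, 6, 6, 7, 7]
detected = [True, True, True, True, True, False, True, False]
print(kaplan_meier_median(steps, detected))    # group (median) threshold step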