987 results for Threshold Limit Values


Relevance:

30.00%

Publisher:

Abstract:

A pulsed Nd-YAG laser beam is used to produce a transient refractive index gradient in air adjoining the plane surface of the sample material. This refractive index gradient is probed by a continuous He-Ne laser beam propagating parallel to the sample surface. The observed deflection signals produced by the probe beam exhibit drastic variations when the pump laser energy density crosses the damage threshold for the sample. The measurements are used to estimate the damage threshold for a few polymer samples. The present values are found to be in good agreement with those determined by other methods.

Relevance:

30.00%

Publisher:

Abstract:

The acoustic signals generated in solids by interaction with a pulsed laser beam are used to determine the ablation threshold of bulk polymer samples of teflon (polytetrafluoroethylene) and nylon under irradiation from a Q-switched Nd:YAG laser at 1.06 µm wavelength. A suitably designed piezoelectric transducer is employed to detect the photoacoustic (PA) signals generated in this process. It is observed that an abrupt increase in the amplitude of the PA signal occurs at the ablation threshold. Distinct threshold values also exist, corresponding to the different damage mechanisms (changes in surface morphology, bond breaking and melting) operative at different laser energy densities.
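The threshold-identification step described in this and the neighbouring photothermal-deflection abstracts amounts to locating the fluence at which the measured signal amplitude jumps abruptly. A minimal sketch of that step, assuming hypothetical fluence/amplitude data and a hypothetical jump-ratio criterion (not the authors' actual analysis):

```python
import numpy as np

def estimate_threshold(fluence, amplitude, jump_ratio=3.0):
    """Estimate the damage/ablation threshold as the lowest fluence at which
    the signal amplitude jumps by more than `jump_ratio` relative to the
    preceding point."""
    order = np.argsort(fluence)
    f, a = np.asarray(fluence, float)[order], np.asarray(amplitude, float)[order]
    for i in range(1, len(f)):
        if a[i - 1] > 0 and a[i] / a[i - 1] >= jump_ratio:
            return f[i]
    return None  # no abrupt jump found below the highest fluence probed

# Hypothetical PA-signal data (arbitrary units): the amplitude jumps near 2.5 J/cm^2.
fluence_j_cm2 = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5]
pa_amplitude = [0.02, 0.03, 0.04, 0.05, 0.40, 0.55, 0.70]
print(estimate_threshold(fluence_j_cm2, pa_amplitude))  # -> 2.5
```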

Relevance:

30.00%

Publisher:

Abstract:

Laser-induced damage and ablation thresholds of bulk superconducting samples of Bi2(SrCa)xCu3Oy (x = 2, 2.2, 2.6, 2.8, 3) and Bi1.6(Pb)xSr2Ca2Cu3Oy (x = 0, 0.1, 0.2, 0.3, 0.4) for irradiation with a 1.06 µm beam from a Nd:YAG laser have been determined as a function of x by the pulsed photothermal deflection technique. The threshold values of power density for ablation as well as damage are found to increase with increasing values of x in both systems, while in the Pb-doped system the threshold values decrease above a specific value of x, coinciding with the point at which the Tc also begins to fall.

Relevance:

30.00%

Publisher:

Abstract:

The photothermal deflection technique was used to determine the laser damage threshold of polymer samples of teflon (PTFE) and nylon. The experiment was conducted using a Q-switched Nd:YAG laser operating at its fundamental wavelength (1.06 µm, pulse width 10 ns FWHM) as the irradiation source and a He-Ne laser as the probe beam, along with a position-sensitive detector. The damage threshold values determined by the photothermal deflection method were in good agreement with those determined by other methods.

Relevance:

30.00%

Publisher:

Abstract:

Biclustering is the simultaneous clustering of both rows and columns of a data matrix. A measure called the Mean Squared Residue (MSR) is used to evaluate simultaneously the coherence of rows and columns within a submatrix. In this paper a novel algorithm is developed for biclustering gene expression data using the newly introduced concept of an MSR difference threshold. In the first step, high-quality bicluster seeds are generated using the K-Means clustering algorithm. Then more genes and conditions (nodes) are added to the bicluster. Before a node is added, the MSR X of the bicluster is calculated; after the node is added, the MSR Y is calculated again. The added node is deleted if Y minus X is greater than the MSR difference threshold, or if Y is greater than the MSR threshold, which depends on the dataset. The MSR difference threshold differs for the gene list and the condition list, and it also depends on the dataset; proper values should be identified through experimentation in order to obtain biclusters of high quality. The results obtained on benchmark datasets clearly indicate that this algorithm is better than many of the existing biclustering algorithms.
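The Mean Squared Residue of a submatrix is the mean of the squared residues r_ij = a_ij − (row mean)_i − (column mean)_j + (overall mean). A minimal sketch of the node-addition rule described above, with hypothetical threshold values and the K-Means seed-generation step omitted:

```python
import numpy as np

def msr(submatrix):
    """Mean Squared Residue of a bicluster: mean of squared residues
    r_ij = a_ij - rowmean_i - colmean_j + overallmean."""
    a = np.asarray(submatrix, dtype=float)
    residue = a - a.mean(axis=1, keepdims=True) - a.mean(axis=0, keepdims=True) + a.mean()
    return float((residue ** 2).mean())

def try_add_row(data, rows, cols, candidate_row, msr_threshold, msr_diff_threshold):
    """Tentatively add a row (gene) to the bicluster; keep it only if the MSR
    increase stays within the difference threshold and the overall MSR limit."""
    x = msr(data[np.ix_(rows, cols)])                    # MSR before adding the node
    y = msr(data[np.ix_(rows + [candidate_row], cols)])  # MSR after adding the node
    if y - x > msr_diff_threshold or y > msr_threshold:
        return rows                  # reject the candidate node
    return rows + [candidate_row]    # accept it

# Hypothetical expression matrix and thresholds (both are dataset-dependent in practice).
data = np.random.default_rng(0).normal(size=(20, 10))
rows, cols = [0, 1, 2], [0, 1, 2, 3]
rows = try_add_row(data, rows, cols, candidate_row=5,
                   msr_threshold=1.2, msr_diff_threshold=0.1)
print(rows)
```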

Relevance:

30.00%

Publisher:

Abstract:

The photoionization cross sections for the production of the Kr II 4s state and Kr II satellite states were studied in the 4s ionization threshold region. The interference of direct photoionization and ionization through the autoionization decay of doubly-excited states was considered. In the calculations of doubly-excited state energies, performed by a configuration interaction technique, the 4p spin-orbit interaction and the (Kr II core)-(excited electron) Coulomb interaction were included. The theoretical cross sections are in many cases in good agreement with the measured values. Strong resonant features in the satellite spectra with threshold energies greater than 30 eV are predicted.

Relevance:

30.00%

Publisher:

Abstract:

Hourly data (1994–2009) of surface ozone concentrations at eight monitoring sites have been investigated to assess target-level and long-term-objective exceedances and their trends. The European Union (EU) ozone target value for human health (60 ppb, maximum daily 8-hour running mean) has been exceeded in a number of years at almost all sites, but the set limit of 25 exceedances in one year has never been exceeded. The second-highest annual hourly and fourth-highest annual 8-hourly mean ozone concentrations have shown a statistically significant negative trend for the inland sites of Cork-Glashaboy, Monaghan and Lough Navar and no significant trend for the Mace Head site. Peak afternoon ozone concentrations averaged over the three-year period 2007–2009 are lower than the corresponding values for the three-year period 1996–1998 at two sites, Cork-Glashaboy and Lough Navar. The EU long-term objective value of AOT40 (Accumulated Ozone exposure over a Threshold of 40 ppb) for the protection of vegetation (3 ppm-hours, calculated from May to July) has been exceeded, on an individual-year basis, at two sites: Mace Head and Valentia. The critical level for the protection of forest (10 ppm-hours from April to September) has not been exceeded at any site except Valentia in the year 2003. AOT40-vegetation shows a significant negative trend in the 3-year running average at the Cork-Glashaboy (−0.13 ± 0.02 ppm-hours per year) and Lough Navar (−0.05 ± 0.02 ppm-hours per year) sites, and a negative but not statistically significant trend at Monaghan (−0.03 ± 0.03 ppm-hours per year). No statistically significant trend was observed for the coastal site of Mace Head. Overall, with the exception of the Mace Head and Monaghan sites, ozone measurement records at Irish sites show a downward trend in the peak values that affect human health and vegetation.
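AOT40 accumulates the excess of hourly ozone concentrations above 40 ppb over the relevant growing season (May–July for the vegetation metric cited above, April–September for forests). A minimal sketch with hypothetical data, assuming the hourly series has already been restricted to the daytime hours used for AOT40:

```python
def aot40_ppm_hours(hourly_ppb, months, start_month=5, end_month=7):
    """Accumulated Ozone exposure over a Threshold of 40 ppb, in ppm-hours.
    `hourly_ppb` and `months` are parallel sequences: the hourly ozone value
    and the calendar month of that hour."""
    excess_ppb_hours = sum(
        max(0.0, o3 - 40.0)
        for o3, m in zip(hourly_ppb, months)
        if start_month <= m <= end_month
    )
    return excess_ppb_hours / 1000.0  # convert ppb-hours to ppm-hours

# Hypothetical hourly ozone values (ppb) with their calendar months.
ozone = [35, 42, 55, 60, 38, 47]
month = [5, 5, 6, 6, 7, 8]  # the August hour falls outside the May-July window
print(aot40_ppm_hours(ozone, month))  # (0 + 2 + 15 + 20 + 0) / 1000 = 0.037 ppm-hours
```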

Relevance:

30.00%

Publisher:

Abstract:

Area-wide development viability appraisals are undertaken to determine the economic feasibility of policy targets in relation to planning obligations. Essentially, a development viability appraisal consists of a series of residual valuations of hypothetical development sites across a local authority area at a particular point in time. The valuations incorporate the estimated financial implications of the proposed level of planning obligations. To determine viability, the output land values are benchmarked against a threshold land value, so the basis on which this threshold is established and the level at which it is set are critical to development viability appraisal at the policy-setting (area-wide) level. Essentially, the threshold is an estimate of the value at which a landowner would be prepared to sell: if the estimated site values are higher than the threshold land value, the policy target is considered viable. This paper investigates the effectiveness of existing methods of determining threshold land value, testing them against the relationship between development value and costs. Modelling reveals that a threshold land value that is not related to shifts in development value renders marginal sites unviable and fails to collect proportionate planning obligations from high-value/low-cost sites. Testing the model against national average house prices and build costs reveals the high degree of volatility in residual land values over time and underlines the importance of making the threshold land value relative to the main driver of this volatility, namely development value.
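The viability test described above reduces to comparing a residual land value (development value less costs, planning obligations and the developer's return) with the threshold land value. A minimal arithmetic sketch with hypothetical figures and a hypothetical profit allowance, not the appraisal model used in the paper:

```python
def residual_land_value(development_value, build_costs, planning_obligations,
                        profit_rate=0.20):
    """Simplified residual valuation: development value less build costs,
    planning obligations and the developer's required profit (taken here,
    purely for illustration, as a share of development value)."""
    profit = profit_rate * development_value
    return development_value - build_costs - planning_obligations - profit

def is_viable(residual, threshold_land_value):
    """The policy target is treated as viable if the residual land value meets
    or exceeds the threshold land value (the landowner's minimum sale price)."""
    return residual >= threshold_land_value

# Hypothetical site, all figures in GBP: residual = 500,000, so the target is viable.
residual = residual_land_value(development_value=5_000_000,
                               build_costs=3_200_000,
                               planning_obligations=300_000)
print(residual, is_viable(residual, threshold_land_value=400_000))
```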

Relevance:

30.00%

Publisher:

Abstract:

We test the expectations theory of the term structure of U.S. interest rates in nonlinear systems. These models allow the response of the change in short rates to past values of the spread to depend upon the level of the spread. The nonlinear system is tested against a linear system, and the results of testing the expectations theory in both models are contrasted. We find that the results of tests of the implications of the expectations theory depend on the size and sign of the spread. The long maturity spread predicts future changes of the short rate only when it is high.
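A nonlinear system of this kind can be illustrated by a simple two-regime threshold regression, in which the response of the change in the short rate to the lagged spread differs according to whether the spread is above or below a threshold. A minimal sketch with simulated data and a hypothetical threshold of zero; this illustrates the general idea, not the authors' specification:

```python
import numpy as np

def fit_threshold_regression(d_short, spread_lag, threshold=0.0):
    """Fit separate OLS regressions of the change in the short rate on the
    lagged spread for the regimes spread_lag > threshold and <= threshold.
    Returns (intercept, slope) for each regime."""
    results = {}
    for name, mask in [("high", spread_lag > threshold),
                       ("low", spread_lag <= threshold)]:
        X = np.column_stack([np.ones(mask.sum()), spread_lag[mask]])
        beta, *_ = np.linalg.lstsq(X, d_short[mask], rcond=None)
        results[name] = beta
    return results

# Simulated data in which the spread predicts short-rate changes only when it is high.
rng = np.random.default_rng(1)
spread_lag = rng.normal(scale=1.0, size=500)
d_short = np.where(spread_lag > 0.0, 0.5 * spread_lag, 0.0) + rng.normal(scale=0.1, size=500)
print(fit_threshold_regression(d_short, spread_lag, threshold=0.0))
```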

Relevance:

30.00%

Publisher:

Abstract:

Model intercomparisons have identified important deficits in the representation of the stable boundary layer by turbulence parametrizations used in current weather and climate models. However, detrimental impacts of more realistic schemes on the large-scale flow have hindered progress in this area. Here we implement a total turbulent energy (TTE) scheme into the climate model ECHAM6. The TTE scheme considers the effects of Earth's rotation and static stability on the turbulence length scale. In contrast to the previously used turbulence scheme, the TTE scheme also implicitly represents the entrainment flux in a dry convective boundary layer. Reducing the previously exaggerated surface drag in stable boundary layers indeed causes an increase in southern-hemispheric zonal winds and large-scale pressure gradients beyond observed values. These biases can be largely removed by increasing the parametrized orographic drag. Reducing the neutral-limit turbulent Prandtl number warms and moistens low-latitude boundary layers and acts to reduce longstanding radiation biases in the stratocumulus regions, the Southern Ocean and the equatorial cold tongue that are common to many climate models.

Relevance:

30.00%

Publisher:

Abstract:

In this work we have studied the effects of random biquadratic couplings and random fields in spin-glass models using the replica method. The effect of a random biquadratic coupling was studied in two spin-1 spin-glass models: in one case the interactions occur between pairs of spins, whereas in the second one the interactions occur between p spins and the limit p → ∞ is considered. Both couplings (spin glass and biquadratic) have zero-mean Gaussian probability distributions. In the first model, the replica-symmetric assumption reveals that the system presents two phases, namely, paramagnetic and spin glass, separated by a continuous transition line. The stability analysis of the replica-symmetric solution yields, besides the usual instability associated with spin-glass ordering, a new phase due to the random biquadratic couplings between the spins. For the case p → ∞, the replica-symmetric assumption again yields only two phases, namely, paramagnetic and quadrupolar. In both these phases the spin-glass order parameter is zero, and both are stable under the Almeida-Thouless stability analysis; one of them presents negative entropy at low temperatures. Carrying out one step of replica-symmetry breaking, we find that a new phase, the biquadratic glass phase, emerges. In this way we obtain the correct phase diagram, with three first-order transition lines that merge at a common triple point. The effects of random fields were studied in the Sherrington-Kirkpatrick model in the presence of an external random magnetic field following a trimodal distribution, P(h_i) = p δ(h_i − h_0) + p_0 δ(h_i) + p δ(h_i + h_0). It is shown that the border of the ferromagnetic phase may present, for conveniently chosen values of p_0 and h_0, first-order phase transitions, as well as tricritical points at finite temperatures. It is verified that the first-order phase transitions are directly related to the dilution in the fields: the extent of these transitions is reduced for increasing values of p_0. In fact, the threshold value of p_0 above which all phase transitions are continuous is calculated analytically. The stability analysis of the replica-symmetric solution is performed and the regions of validity of such a solution are identified.

Relevance:

30.00%

Publisher:

Abstract:

The objectives of the present study were to evaluate the correlation between the electronic somatic cell count (eSCC) and Somaticell® under different milk somatic cell count (SCC) levels and mastitis-causing pathogens, and to calculate the sensitivity, specificity and predictive values of Somaticell® using the different SCC limits established by different countries. Three hundred and forty milk samples were collected aseptically after the California Mastitis Test (CMT) was performed. Somaticell® and eSCC were carried out on all milk samples. The correlation between Somaticell® and the electronic count was determined according to the CMT, the isolated pathogen and the eSCC score. According to the established SCC scores, 26.5% of the milk samples presented score 1 (69–166 × 10³ cells mL-1), 26.8% score 2 (167–418 × 10³ cells mL-1), 27.4% score 3 (419–760 × 10³ cells mL-1) and 19.4% score 4 (761–1,970 × 10³ cells mL-1). The eSCC and Somaticell® showed a positive correlation for almost all of the scores studied (except scores 2 and 3). The r value obtained between SCC and Somaticell® was 0.32. It was observed that, as the established SCC limit increased, sensitivity decreased and specificity increased. The predictive values remained constant across all limits. When the SCC limit was low (<760,000 cells mL-1), Somaticell® gave results consistently higher than the SCC values, whereas for samples with high SCC, Somaticell® yielded lower counts than the eSCC. The correlation between the two methods remained relatively constant under all conditions, and the sensitivity and specificity of the test are highly dependent on the limit established. The results of this work suggest that Somaticell® is not useful for evaluating milk SCC, since its results differ significantly from the eSCC; however, it can be used as a screening method, like the CMT, for detecting increases in milk SCC.
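The sensitivity and specificity of the test at a given SCC limit follow from cross-classifying the Somaticell® result against the electronic count at that limit. A minimal sketch, assuming hypothetical paired counts and a hypothetical limit of 400,000 cells/mL:

```python
def sensitivity_specificity(test_counts, reference_counts, limit):
    """Classify each paired sample as positive (count above the limit) or
    negative by both the test and the reference method, then compute
    sensitivity = TP/(TP+FN) and specificity = TN/(TN+FP)."""
    tp = fp = tn = fn = 0
    for test, ref in zip(test_counts, reference_counts):
        test_pos, ref_pos = test > limit, ref > limit
        if ref_pos and test_pos:
            tp += 1
        elif ref_pos and not test_pos:
            fn += 1
        elif not ref_pos and test_pos:
            fp += 1
        else:
            tn += 1
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical paired counts (cells/mL): test method vs electronic SCC.
somaticell = [250_000, 500_000, 900_000, 300_000, 1_200_000]
electronic = [200_000, 450_000, 1_000_000, 600_000, 1_500_000]
print(sensitivity_specificity(somaticell, electronic, limit=400_000))  # (0.75, 1.0)
```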

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

Lethargic crab disease (LCD) is an emerging disease that has decimated native populations of the mangrove land crab (Ucides cordatus, Decapoda: Ocypodidae) along the Brazilian coast. Several potential etiological agents have been linked with LCD, but only in 2005 was it proved to be caused by an ascomycete fungus. This is the first attempt to develop a mathematical model describing the epidemiological dynamics of LCD. The model presents four possible scenarios: the trivial equilibrium, the disease-free equilibrium, the endemic equilibrium, and limit cycles arising from a Hopf bifurcation. The threshold values depend on the basic reproductive numbers of the crabs and the fungi, and on the infection rate. These scenarios depend on both the biological assumptions and the temporal evolution of the disease. Numerical simulations corroborate the analytical results and illustrate the different temporal dynamics of the crab and fungus populations.
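The abstract does not give the model equations. Purely as an illustration of how a threshold in the infection rate separates disease-free from endemic behaviour, here is a minimal sketch of a generic host-pathogen model with hypothetical parameters; it is not the authors' LCD model:

```python
def simulate_si(beta, b=1.0, d=0.05, gamma=0.2, s0=20.0, i0=1.0,
                steps=5000, dt=0.05):
    """Forward-Euler integration of a generic susceptible-infected host model
    with constant recruitment b, natural mortality d and disease-induced
    removal gamma:
        dS/dt = b - beta*S*I - d*S
        dI/dt = beta*S*I - (d + gamma)*I
    Hypothetical structure and parameters, for illustration only."""
    s, i = s0, i0
    for _ in range(steps):
        ds = b - beta * s * i - d * s
        di = beta * s * i - (d + gamma) * i
        s, i = s + dt * ds, i + dt * di
    return s, i

# Threshold in the infection rate: the disease persists only if
# R0 = beta * (b/d) / (d + gamma) > 1, i.e. beta > d * (d + gamma) / b.
beta_threshold = 0.05 * (0.05 + 0.2) / 1.0  # = 0.0125
print(simulate_si(beta=0.5 * beta_threshold))  # below threshold: infection dies out
print(simulate_si(beta=2.0 * beta_threshold))  # above threshold: infection persists
```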

Relevance:

30.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)