123 results for Tolerance Threshold


Relevance: 20.00%

Abstract:

A major concern in stiffener run-out regions, where the stiffener is terminated due to a cut-out, intersecting rib, or some other structural feature which interrupts the load path, is the relatively weak skin–stiffener interface in the absence of mechanical fasteners. More damage-tolerant stiffener run-outs are clearly required, and these are investigated in this paper. Using a parametric finite element analysis, the run-out region was optimised for stable debonding crack growth. Both the modified run-out and a baseline configuration were manufactured and tested. Damage initiation and propagation were investigated in detail using state-of-the-art monitoring equipment, including Acoustic Emission and Digital Image Correlation. As expected, the baseline configuration failed catastrophically. The modified run-out showed improved crack-growth stability, but subsequent delamination failure in the stiffener promptly led to catastrophic failure.

Relevance: 20.00%

Abstract:

A criterion is derived for delamination onset in transversely isotropic laminated plates under small-mass, high-velocity impact. The resulting delamination threshold load is about 21% higher than the corresponding quasi-static threshold load. A closed-form approximation for the peak impact load is then used to predict the delamination threshold velocity. The theory is validated for a range of test cases by comparison with 3D finite element simulations using LS-DYNA and a newly developed interface element to model delamination onset and growth. The predicted delamination threshold loads and velocities are in very good agreement with the finite element simulations. Good agreement is also shown in a comparison with published experimental results. In contrast to quasi-static impacts, delamination growth occurs under a rapidly decreasing load. Inclusion of finite-thickness effects and a proper description of the contact stiffness are found to be vital for accurate prediction of the delamination threshold velocity.
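To make the scaling concrete, here is a minimal sketch combining the widely used quasi-static mode-II threshold-load expression with a linear-spring peak-load estimate (F_peak = V·sqrt(mk)) to back out a threshold velocity. The paper's own closed-form approximation may differ, and every parameter value below is an illustrative assumption, not a value from the study.

```python
import math

# Illustrative laminate/impactor parameters (assumptions, not the paper's values)
E = 60e9       # effective modulus, Pa
nu = 0.3       # Poisson's ratio
h = 4e-3       # plate thickness, m
G_IIc = 800.0  # mode-II fracture toughness, J/m^2
m = 0.01       # impactor mass, kg (small-mass regime)
k = 5e6        # linearised contact stiffness, N/m

# Quasi-static delamination threshold load (standard mode-II expression)
P_qs = math.sqrt(8 * math.pi**2 * E * h**3 * G_IIc / (9 * (1 - nu**2)))

# Small-mass dynamic threshold: ~21% above the quasi-static value, per the abstract
P_dyn = 1.21 * P_qs

# For a linear-spring impact the peak load is F_peak = V * sqrt(m * k),
# so the delamination threshold velocity is the V at which F_peak = P_dyn
V_thr = P_dyn / math.sqrt(m * k)
print(f"P_qs = {P_qs/1e3:.1f} kN, P_dyn = {P_dyn/1e3:.1f} kN, V_thr = {V_thr:.1f} m/s")
```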

Relevance: 20.00%

Abstract:

BACKGROUND: To compare the ability of Glaucoma Progression Analysis (GPA) and Threshold Noiseless Trend (TNT) programs to detect visual-field deterioration.

METHODS: Patients with open-angle glaucoma followed for a minimum of 2 years and with a minimum of seven reliable visual fields were included. Progression was assessed subjectively by four masked glaucoma experts and compared with GPA and TNT results. Each case was judged to be stable, deteriorated, or suspicious of deterioration.

RESULTS: A total of 56 eyes of 42 patients were followed with a mean of 7.8 (SD 1.0) tests over an average of 5.5 (1.04) years. Interobserver agreement on detecting progression was good (mean kappa = 0.57). Progression was detected in 10-19 eyes by the experts, in six by GPA, and in 24 by TNT. Using the consensus expert opinion as the gold standard (all four clinicians detected progression), the GPA sensitivity and specificity were 75% and 83%, respectively, while the TNT sensitivity and specificity were 100% and 77%, respectively.

CONCLUSION: TNT showed greater concordance with the experts than GPA in the detection of visual-field deterioration. GPA showed a high specificity but lower sensitivity, mainly detecting cases of high focality and pronounced mean defect slopes.
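For readers who want the arithmetic behind figures like these, a small sketch of sensitivity/specificity against a consensus gold standard follows; the labels are made up for illustration and do not reproduce the study data.

```python
# Sensitivity/specificity of a progression test against a consensus gold standard
def sensitivity_specificity(test, gold):
    tp = sum(t and g for t, g in zip(test, gold))          # true positives
    tn = sum(not t and not g for t, g in zip(test, gold))  # true negatives
    fp = sum(t and not g for t, g in zip(test, gold))      # false positives
    fn = sum(not t and g for t, g in zip(test, gold))      # false negatives
    return tp / (tp + fn), tn / (tn + fp)

gold = [True] * 4 + [False] * 20                          # consensus: 4 progressing eyes
test = [True] * 3 + [False] + [False] * 16 + [True] * 4   # one miss, four false alarms
sens, spec = sensitivity_specificity(test, gold)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")  # 75%, 80%
```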

Relevance: 20.00%

Abstract:

Purpose: The authors sought to quantify neighboring and distant interpoint correlations of threshold values within the visual field in patients with glaucoma.

Methods: Visual fields of patients with confirmed or suspected glaucoma were analyzed (n = 255). One eye per patient was included. Patients were examined using program 32 of the Octopus 1-2-3. Linear regression analysis between each location and the rest of the points of the visual field was performed, and the correlation coefficient was calculated. The degree of correlation was categorized as high (r > 0.66), moderate (0.66 ≥ r > 0.33), or low (r ≤ 0.33). The standard error of threshold estimation was calculated.

Results: Most locations of the visual field had high or moderate correlations with neighboring points and with distant locations corresponding to the same nerve fiber bundle. Locations of the visual field had low correlations with those of the opposite hemifield, with the exception of locations temporal to the blind spot. The standard error of threshold estimation increased from 0.6 to 0.9 dB with an r reduction of 0.1.

Conclusion: Locations of the visual field have the highest interpoint correlation with neighboring points and with distant points in areas corresponding to the distribution of the retinal nerve fiber layer. The quantification of interpoint correlations may be useful in the design and interpretation of visual field tests in patients with glaucoma.
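A minimal sketch of the correlation step: compute the location-by-location correlation matrix and bin it with the paper's cut-offs. The data array is simulated for illustration, so with random inputs almost everything lands in the "low" bin.

```python
import numpy as np

# Hypothetical (patients x locations) matrix of threshold values in dB
rng = np.random.default_rng(0)
fields = rng.normal(25, 5, size=(255, 76))

# Location-by-location correlation matrix
r = np.corrcoef(fields, rowvar=False)

# Bin with the paper's cut-offs: high r > 0.66, moderate 0.66 >= r > 0.33, low r <= 0.33
labels = np.where(r > 0.66, "high", np.where(r > 0.33, "moderate", "low"))
np.fill_diagonal(labels, "self")
print(labels[0, :5])
```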

Relevance: 20.00%

Abstract:

PURPOSE: To evaluate the sensitivity and specificity of the screening mode of the Humphrey-Welch Allyn frequency-doubling technology (FDT), Octopus tendency-oriented perimetry (TOP), and the Humphrey Swedish Interactive Threshold Algorithm (SITA)-fast (HSF) in patients with glaucoma.

DESIGN: A comparative consecutive case series.

METHODS: This was a prospective study which took place in the glaucoma unit of an academic department of ophthalmology. One eye of 70 consecutive glaucoma patients and 28 age-matched normal subjects was studied. Eyes were examined with the C-20 program of FDT, G1-TOP, and 24-2 HSF in one visit and in random order. The gold standard for glaucoma was the presence of a typical glaucomatous optic disk appearance on stereoscopic examination, as judged by a glaucoma expert. The sensitivity, specificity, positive and negative predictive values, and receiver operating characteristic (ROC) curves of two algorithms for the FDT screening test, two algorithms for TOP, and three algorithms for HSF, as defined before the start of this study, were evaluated. The time required for each test was also analyzed.

RESULTS: Values for the area under the ROC curve ranged from 82.5% to 93.9%. The largest area (93.9%) under the ROC curve was obtained with the FDT criterion defining abnormality as the presence of at least one abnormal location. Mean test time was 1.08 ± 0.28 minutes, 2.31 ± 0.28 minutes, and 4.14 ± 0.57 minutes for the FDT, TOP, and HSF, respectively. The difference in testing time was statistically significant (P < .0001).

CONCLUSIONS: The C-20 FDT, G1-TOP, and 24-2 HSF appear to be useful tools to diagnose glaucoma. The C-20 FDT and G1-TOP take approximately one-quarter and one-half, respectively, of the time taken by the 24-2 HSF.
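ROC areas like those quoted above can be estimated with a rank-based (Mann-Whitney) estimator; the sketch below uses simulated scores for 70 glaucoma and 28 normal eyes, so its AUC is illustrative only.

```python
import numpy as np

def roc_auc(scores, labels):
    """Rank-based (Mann-Whitney) estimate of the area under the ROC curve."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = labels.astype(bool)
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

rng = np.random.default_rng(1)
labels = np.r_[np.ones(70), np.zeros(28)]   # 70 glaucoma eyes, 28 normals
scores = rng.normal(labels * 1.5, 1.0)      # hypothetical abnormality scores
print(f"AUC = {roc_auc(scores, labels):.3f}")
```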

Relevance: 20.00%

Abstract:

Biocides play an essential role in limiting the spread of infectious disease. The food industry is dependent on these agents, and their increasing use is a matter for concern. Specifically, the emergence of bacteria demonstrating increased tolerance to biocides, coupled with the potential for the development of a phenotype of cross-resistance to clinically important antimicrobial compounds, needs to be assessed. In this study, we investigated the tolerance of a collection of susceptible and multidrug-resistant (MDR) Salmonella enterica strains to a panel of seven commercially available food-grade biocide formulations. We explored their abilities to adapt to these formulations and their active biocidal agents, i.e., triclosan, chlorhexidine, hydrogen peroxide, and benzalkonium chloride, after sequential rounds of in vitro selection. Finally, cross-tolerance of different categories of biocidal formulations, their active agents, and the potential for coselection of resistance to clinically important antibiotics were investigated. Six of seven food-grade biocide formulations were bactericidal at their recommended working concentrations. All showed reduced activity against both surface-dried and biofilm cultures. A stable phenotype of tolerance to biocide formulations could not be selected. Upon exposure of Salmonella strains to an active biocidal compound, a high level of tolerance was selected in a number of Salmonella serotypes. No cross-tolerance to the different biocidal agents or food-grade biocide formulations was observed. Most tolerant isolates displayed changes in their patterns of susceptibility to antimicrobial compounds. Food industry biocides are effective against planktonic Salmonella. When exposed to sublethal concentrations of individual active biocidal agents, tolerant isolates may emerge. This emergence was associated with changes in antimicrobial susceptibilities.

Relevance: 20.00%

Abstract:

The aim of this study was to characterize the transcriptome of a balanced polymorphism, under the regulation of a single gene, for phosphate fertilizer responsiveness/arsenate tolerance in wild grass Holcus lanatus genotypes screened from the same habitat.

De novo transcriptome sequencing, RNAseq (RNA sequencing), and single nucleotide polymorphism (SNP) calling were conducted on RNA extracted from H. lanatus. Roche 454 sequencing data were assembled into c. 22 000 isotigs, and paired-end Illumina reads for phosphorus-starved (P−) and phosphorus-treated (P+) genovars of tolerant (T) and nontolerant (N) phenotypes were mapped to this reference transcriptome.

Heatmaps of the gene expression data showed strong clustering of each P+/P− treated genovar, as well as clustering by N/T phenotype. Statistical analysis identified 87 isotigs as significantly differentially expressed between N and T phenotypes and 258 between P+ and P− treated plants. SNPs and transcript expression that systematically differed between N and T phenotypes had regulatory functions, namely proteases, kinases, ribonuclear RNA-binding proteins and transposable elements.

A single gene for arsenate tolerance led to distinct phenotype transcriptomes and SNP profiles, with large differences in upstream post-translational and post-transcriptional regulatory genes rather than in genes directly involved in P nutrition transport and metabolism per se.

Relevance: 20.00%

Abstract:

We propose a low-complexity closed-loop spatial multiplexing method with limited feedback over multiple-input multiple-output (MIMO) fading channels. The transmit adaptation is performed simply by selecting the transmit antennas (or substreams) whose signal-to-noise ratios exceed a given threshold, with a fixed nonadaptive constellation and fixed transmit power per substream. We analyze the performance of the proposed system by deriving closed-form expressions for spectral efficiency, average transmit power, and bit error rate (BER). Depending on practical system design constraints, the threshold is chosen either to maximize the spectral efficiency subject to average transmit power and average BER constraints, or to minimize the average BER subject to average transmit power and spectral efficiency constraints. We present numerical and Monte Carlo simulation results that validate our analysis. Compared to open-loop spatial multiplexing and other approaches that select the best antenna subset in spatial multiplexing, the numerical results illustrate that the proposed technique obtains significant power gains at the same BER and spectral efficiency. We also provide numerical results that show improvement over rate-adaptive orthogonal space-time block coding, which requires highly complex constellation adaptation. We analyze the impact of feedback delay using analytical and Monte Carlo approaches. The proposed approach is arguably the simplest possible adaptive spatial multiplexing system from an implementation point of view. However, our approach and analysis can be extended to other systems using multiple constellations and power levels.
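A minimal sketch of the selection rule described above: each substream is switched on only if its instantaneous SNR clears the threshold, with a fixed constellation and fixed per-substream power. The antenna count, threshold, and per-stream rate are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(2)
n_tx = 4               # transmit antennas / substreams (assumption)
snr_avg_db = 10.0      # average per-antenna SNR (assumption)
threshold_db = 6.0     # selection threshold (assumption)
rate_per_stream = 2.0  # bits/symbol of the fixed constellation (assumption)

# Rayleigh fading: per-antenna SNR is exponentially distributed about its mean
snr = rng.exponential(10 ** (snr_avg_db / 10), size=n_tx)

# Feed back one bit per substream: transmit only where SNR exceeds the threshold
active = snr > 10 ** (threshold_db / 10)
print(f"active substreams: {np.flatnonzero(active)}, "
      f"instantaneous rate: {active.sum() * rate_per_stream} bits/symbol")
```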

Relevance: 20.00%

Abstract:

We employ the time-dependent R-matrix (TDRM) method to calculate anisotropy parameters for positive and negative sidebands of selected harmonics generated by two-color two-photon above-threshold ionization of argon. We consider odd harmonics of an 800-nm field, ranging from the 13th to the 19th harmonic, overlapped by the fundamental 800-nm IR field. The anisotropy parameters obtained using the TDRM method are compared with those obtained using second-order perturbation theory with a model potential and with a soft-photon approximation. Where available, a comparison is also made to published experimental results. All three theoretical approaches provide similar values for the anisotropy parameters, with the TDRM values lying closest to the published experimental ones. At high photon energies, the differences among the theoretical methods become less significant.

Relevance: 20.00%

Abstract:

The response of arsenate-tolerant and non-tolerant Holcus lanatus L. phenotypes, where tolerance is achieved through suppression of high-affinity phosphate/arsenate root uptake, was investigated under different growth regimes to establish why a tolerance polymorphism is found in populations growing on uncontaminated soil. Tolerant plants screened from an arsenic-uncontaminated population, when grown on soil from the population's origin, differed from non-tolerants in their biomass allocation under phosphate fertilization: non-tolerants put more resources into tiller production and down-regulated investment in root production under phosphate fertilization, while tolerants tillered less effectively and did not alter resource allocation to shoot biomass under phosphate fertilization. The two phenotypes also differed in their shoot mineral status, with one phenotype having higher concentrations of copper, cadmium, lead and manganese, but phosphorus status differed little, suggesting tight homeostasis. The polymorphism was also widely present (40%) in other wild grass species, suggesting an important ecological role for this gene, which can be screened through plant root response to arsenate.

Relevance: 20.00%

Abstract:

The environmental quality of land can be assessed by calculating relevant threshold values, which differentiate between concentrations of elements resulting from geogenic and diffuse anthropogenic sources and concentrations generated by point sources. A simple process allowing the calculation of these typical threshold values (TTVs) was applied across a region of highly complex geology (Northern Ireland) to six elements of interest: arsenic, chromium, copper, lead, nickel and vanadium. Three methods for identifying domains (areas where a readily identifiable factor can be shown to control the concentration of an element) were used: k-means cluster analysis, boxplots and empirical cumulative distribution functions (ECDFs). The ECDF method was the most efficient at determining areas of both elevated and reduced concentrations and was used to identify domains in this investigation. Two statistical methods for calculating normal background concentrations (NBCs) and upper limits of geochemical baseline variation (ULBLs), currently used in conjunction with legislative regimes in the UK and Finland respectively, were applied within each domain. The NBC methodology was constructed to run within a specific legislative framework, and its use on this soil geochemical data set was influenced by the presence of skewed distributions and outliers. In contrast, the ULBL methodology was found to calculate more appropriate TTVs that were generally more conservative than the NBCs. TTVs indicate what a "typical" concentration of an element would be within a defined geographical area and should be considered alongside the risk that each element poses in these areas to determine potential risk to receptors.
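As a rough illustration of the workflow: sort the concentrations to form an ECDF (inspected for breaks when identifying domains), then compute a robust upper limit within a domain. The median + 2·MAD rule on log concentrations below is a common robust choice standing in for the NBC/ULBL procedures, whose details differ; the data are simulated.

```python
import numpy as np

rng = np.random.default_rng(3)
conc = np.exp(rng.normal(np.log(20), 0.5, size=500))  # hypothetical As concentrations, mg/kg

# ECDF of the domain's concentrations; in practice this is plotted and
# inspected for breaks that separate geochemical domains
x = np.sort(conc)
ecdf = np.arange(1, len(x) + 1) / len(x)
print(f"90th percentile: {x[np.searchsorted(ecdf, 0.9)]:.1f} mg/kg")

# Robust upper limit on log-transformed data: median + 2 * (scaled MAD),
# back-transformed to mg/kg as a TTV-style threshold
log_c = np.log(conc)
mad = 1.4826 * np.median(np.abs(log_c - np.median(log_c)))
upper_limit = np.exp(np.median(log_c) + 2 * mad)
print(f"TTV-style upper limit: {upper_limit:.1f} mg/kg")
```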