925 results for Antibiotic sensitivity test


Relevance:

20.00%

Publisher:

Abstract:

The paper describes the sensitivity of simulated precipitation to changes in the convective relaxation time scale (TAU) of the Zhang and McFarlane (ZM) cumulus parameterization in the NCAR Community Atmosphere Model version 3 (CAM3). In the default configuration of the model, the prescribed value of TAU, the characteristic time scale with which convective available potential energy (CAPE) is removed at an exponential rate by convection, is 1 h. However, recent observational findings suggest that it is larger by around one order of magnitude. To explore the sensitivity of the model simulation to TAU, two model frameworks were used, namely aqua-planet and actual-planet configurations. Numerical integrations were carried out with different values of TAU, and their effect on the simulated precipitation was analyzed. The aqua-planet simulations reveal that as TAU increases, the rate of deep convective precipitation (DCP) decreases, leading to an accumulation of convective instability in the atmosphere; consequently, the moisture content in the lower and mid-troposphere increases. In response, shallow convective precipitation (SCP) and large-scale precipitation (LSP) intensify, predominantly the SCP, thereby capping the accumulation of convective instability. The total precipitation (TP) remains approximately constant, but the proportions of the three components change significantly, which in turn alters the vertical distribution of precipitation production. With increasing TAU, the vertical structure of moist heating changes from a vertically extended profile to a bottom-heavy profile, and the altitude of the maximum vertical velocity shifts from the upper to the lower troposphere. A similar response was seen in the actual-planet simulations. With an increase in TAU from 1 h to 8 h, there was a significant improvement in the simulation of the seasonal mean precipitation, and the fraction of deep convective precipitation was in much better agreement with satellite observations.
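
As a worked illustration of the relaxation closure described above (a minimal sketch; the symbols τ for TAU and CAPE₀ for the initial value are introduced here for illustration and do not appear in the abstract):

```latex
\frac{d\,\mathrm{CAPE}}{dt} = -\frac{\mathrm{CAPE}}{\tau}
\quad\Longrightarrow\quad
\mathrm{CAPE}(t) = \mathrm{CAPE}_0\, e^{-t/\tau}
```

Increasing τ from 1 h to 8 h thus slows the removal of instability by a factor of eight, consistent with the accumulation of convective instability reported above.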

Relevance:

20.00%

Publisher:

Abstract:

Most women acquire a genital high-risk human papillomavirus (HPV) infection during their lifetime, but the infection seldom persists and leads to cervical cancer. However, it is currently not possible to identify the women who will develop HPV-mediated cervical cancer, and this often results in large-scale follow-up and overtreatment of infections that are likely to regress spontaneously. Thus, it is important to obtain more information on the course of HPV infection and to find markers that could help identify HPV-infected women at risk of progression to cervical lesions and ultimately cancer. Nitric oxide is a free radical gas that takes part in both immune responses and carcinogenesis. Nitric oxide is also produced by cervical cells, and it is therefore possible that cervical nitric oxide affects HPV infection. In the present study, comprising 801 women enrolled at the University of Helsinki between 2006 and 2011, the association between HPV and cervical nitric oxide was evaluated. Nitric oxide levels were measured via its metabolites nitrate and nitrite (NOx) by spectrophotometry, and the expression of the nitric-oxide-producing enzymes endothelial and inducible nitric oxide synthase (eNOS, iNOS) by Western blotting. Women infected with HPV had twofold higher cervical fluid NOx levels compared with non-infected women. The expression levels of both eNOS and iNOS were also higher in HPV-infected women. Chlamydia trachomatis, another sexually transmitted disease and an independent risk factor for cervical cancer, was likewise accompanied by elevated NOx levels, whereas the vaginal infections bacterial vaginosis and Candida had no effect on NOx levels. The significance of the elevated HPV-related cervical nitric oxide was evaluated in a 12-month follow-up study, which revealed that high baseline cervical fluid NOx levels favored HPV persistence with an odds ratio (OR) of 4.1. However, the low sensitivity (33%) and high false negative rate (67%) restrict the clinical use of the current NOx test. This study indicates that nitric oxide favors HPV persistence and thus appears to be one of the cofactors associated with a risk of carcinogenesis.
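
For reference, the two performance figures quoted above are linked by a standard identity, and the reported OR is conventionally computed from the 2×2 table cross-classifying baseline NOx status against HPV persistence (the cell labels a-d are the usual epidemiological convention, not values from the study):

```latex
\mathrm{FNR} = 1 - \mathrm{sensitivity} = 1 - 0.33 = 0.67,
\qquad
\mathrm{OR} = \frac{a\,d}{b\,c}
```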

Relevance:

20.00%

Publisher:

Abstract:

Recently reported experimental results on the rotation sensitivity of Lau fringes to the spatial coherence of the source have been theoretically analyzed and explained on the basis of coherence theory. A theoretical plot of the rotation angle required for the Lau fringes to vanish is obtained as a function of the coherence length of the illumination used in the Lau experiment. The theoretical results compare well with the experimental observations. The analysis, as well as the experiment, could form the basis of a simple measurement of the coherence length of the illumination in a plane.

Relevance:

20.00%

Publisher:

Abstract:

A critical test has been presented to establish the nature of the kinetic pathways for the decomposition of an Fe-12 at.% Si alloy below the metastable tricritical point. The results, based on measurements of saturation magnetization, establish that congruent B2 → D0(3) ordering precedes the development of a B2 + D0(3) two-phase field, consistent with the 1976 predictions of Allen and Cahn.

Relevance:

20.00%

Publisher:

Abstract:

The blood-brain barrier (BBB) is a unique barrier that strictly regulates the entry of endogenous substrates and xenobiotics into the brain, owing to its tight junctions and the array of transporters and metabolic enzymes it expresses. Determining brain concentrations in vivo is difficult, laborious and expensive, which creates interest in developing predictive tools of brain distribution. Predicting brain concentrations is important even in early drug development, to ensure the efficacy of central nervous system (CNS) targeted drugs and the safety of non-CNS drugs. The literature review covers the most common current in vitro, in vivo and in silico methods of studying transport into the brain, concentrating on transporter effects. The consequences of efflux mediated by P-glycoprotein, the most widely characterized transporter expressed at the BBB, are also discussed. The aim of the experimental study was to build a pharmacokinetic (PK) model describing P-glycoprotein substrate drug concentrations in the brain using commonly measured in vivo parameters of brain distribution. The possibility of replacing in vivo parameter values with their in vitro counterparts was also studied. All data for the study were taken from the literature. A simple two-compartment PK model was built using the Stella™ software. Brain concentrations of morphine, loperamide and quinidine were simulated and compared with published studies. The correlation of in vitro measured efflux ratios (ER) across different studies was evaluated, in addition to the correlation between in vitro and in vivo measured ER. A Stella™ model was also constructed to simulate an in vitro transcellular monolayer experiment, in order to study the sensitivity of the measured ER to changes in passive permeability and Michaelis-Menten kinetic parameter values. Interspecies differences between rats and mice were investigated with regard to brain permeability and drug binding in brain tissue. Although the PK brain model captured the concentration-time profiles of all three compounds in both brain and plasma, and performed fairly well for morphine, it underestimated brain concentrations for quinidine and overestimated them for loperamide. Because the ratio of concentrations in brain and blood depends on the ER, the variable values cited for this parameter, and its inaccuracy, could be one explanation for the failure of the predictions; validation of the model with more compounds is needed to draw further conclusions. In vitro ER showed variable correlation between studies, indicating variability due to experimental factors such as test concentration, but overall the differences were small. Good correlation between in vitro and in vivo ER at low concentrations supports the use of in vitro ER in the PK model. The in vitro simulation illustrated that, in the simulation setting, efflux is significant only at low passive permeability, which highlights that the cell model used to measure ER must have low enough paracellular permeability to correctly mimic the in vivo situation.
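
The thesis built its model in Stella™; as a rough illustration of what a two-compartment plasma/brain model with efflux looks like, here is a minimal sketch in Python. All rate constants, and the first-order efflux term standing in for P-glycoprotein transport, are hypothetical placeholders, not values or equations from the study:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical first-order rate constants (1/h); not values from the study.
k_in = 0.5        # passive uptake, plasma -> brain
k_out = 0.5       # passive loss, brain -> plasma
k_efflux = 2.0    # efflux standing in for P-glycoprotein, brain -> plasma
k_el = 0.3        # systemic elimination from plasma

def rhs(t, y):
    """y[0] = amount in plasma, y[1] = amount in brain."""
    plasma, brain = y
    d_plasma = -k_in * plasma + (k_out + k_efflux) * brain - k_el * plasma
    d_brain = k_in * plasma - (k_out + k_efflux) * brain
    return [d_plasma, d_brain]

# Unit bolus in plasma at t = 0, simulated over 24 h.
sol = solve_ivp(rhs, (0.0, 24.0), y0=[1.0, 0.0], dense_output=True)
t = np.linspace(0.0, 24.0, 7)
plasma, brain = sol.sol(t)
print("brain/plasma ratio over time:", np.round(brain / plasma, 3))
```

A stronger efflux term lowers the steady brain-to-plasma ratio, which is the qualitative behaviour the ER parameter controls in the model described above.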

Relevance:

20.00%

Publisher:

Abstract:

Stress relaxation testing is often utilised to determine whether athermal straining contributes to plastic flow: if the plastic strain rate is continuous across the transition from tension to relaxation, then plastic strain is fully thermally activated. This method was applied to an aged type 316 stainless steel tested in the temperature range 973–1123 K and to high-purity Al in the recrystallised, annealed condition tested in the temperature range 274–417 K. The results indicate that plastic strain is thermally activated in these materials at the corresponding test temperatures. For Al, because of its high strain rate sensitivity, it was necessary to adopt a back-extrapolation procedure to correct for the finite period the crosshead requires to decelerate from the constant tensile speed to a dead stop for stress relaxation.
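
During a relaxation hold the total strain is constant, so the elastic and plastic strain rates must cancel and the plastic strain rate is inferred from the stress decay. This is the standard relation behind the test, stated here for reference (with E the effective machine-plus-specimen modulus, a symbol not used in the abstract):

```latex
\dot{\varepsilon}_e + \dot{\varepsilon}_p = 0
\quad\Longrightarrow\quad
\dot{\varepsilon}_p = -\frac{\dot{\sigma}}{E}
```

Continuity of the plastic strain rate so obtained across the tension-to-relaxation transition is then the criterion for fully thermally activated flow used above.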

Relevance:

20.00%

Publisher:

Abstract:

Processing maps for the hot working of AISI 304L stainless steel have been developed on the basis of flow stress data generated by compression and torsion in the temperature range 600–1200 °C and strain rate range 0.1–100 s−1. The efficiency of power dissipation, given by 2m/(m+1) where m is the strain rate sensitivity, is plotted as a function of temperature and strain rate to obtain a processing map, which is interpreted on the basis of the Dynamic Materials Model. The maps obtained by compression as well as torsion exhibit a domain of dynamic recrystallization with its peak efficiency occurring at 1200 °C and 0.1 s−1; these are the optimum hot-working parameters, and they may be obtained by either test technique. The peak efficiency for dynamic recrystallization in torsion is apparently higher (64%) than that obtained in constant-true-strain-rate compression (41%), and the difference is explained on the basis of the strain rate variations occurring across the section of the solid torsion bar. A region of flow instability occurs at lower temperatures (below 1000 °C) and higher strain rates (above 1 s−1) and is wider in torsion than in compression. To achieve complete microstructural control in a component, the state of stress will have to be considered.
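
The map quantity used above is the standard Dynamic Materials Model efficiency; for reference (the definition of m is the usual one and is not spelled out in the abstract):

```latex
m = \left.\frac{\partial \ln \sigma}{\partial \ln \dot{\varepsilon}}\right|_{T},
\qquad
\eta = \frac{2m}{m+1}
```

Inverting this relation, the reported peak efficiencies of 64% and 41% correspond to strain rate sensitivities of m ≈ 0.47 and m ≈ 0.26, respectively.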

Relevance:

20.00%

Publisher:

Abstract:

A general procedure for arriving at 3-D models of disulphide-rich polypeptide systems based on covalent cross-link constraints has been developed. The procedure, which has been coded as a computer program, RANMOD, assigns a large number of random, permitted backbone conformations to the polypeptide and identifies stereochemically acceptable structures as plausible models based on strainless disulphide bridge modelling. Disulphide bond modelling is performed using the procedure MODIP, developed earlier in connection with the choice of suitable sites at which disulphide bonds could be engineered in proteins (Sowdhamini, R., Srinivasan, N., Shoichet, B., Santi, D.V., Ramakrishnan, C. and Balaram, P. (1989) Protein Engng, 3, 95-103). RANMOD has been tested on small disulphide loops, and the resulting structures compared against preferred backbone conformations derived from an analysis of a putative disulphide sub-database and from model calculations. Applied to disulphide-rich peptides, RANMOD gives rise to several stereochemically acceptable structures. The results obtained on the modelling of two test cases, α-conotoxin GI and endothelin I, are presented. Available NMR data suggest that such small systems exhibit conformational heterogeneity in solution; hence this approach, which yields several distinct models, is particularly attractive for the study of conformational excursions.
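
To convey the flavour of the random-sampling-plus-constraint-filtering idea (deliberately not the actual RANMOD algorithm, which works with full backbone stereochemistry and MODIP bridge geometry), here is a simplified toy in Python: a freely jointed Cα chain is sampled at random, and a conformer is kept only if the loop termini are close enough for a strainless bridge. All numeric cut-offs are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

BOND_LENGTH = 3.8   # approximate Calpha-Calpha virtual bond length (angstroms)
SS_DISTANCE = 5.5   # rough upper bound on terminus separation for a bridge
N_RESIDUES = 12     # loop size between the two bridged cysteines
N_TRIALS = 20_000

def random_chain(n, rng):
    """Freely jointed chain: n-1 random unit steps of fixed bond length."""
    steps = rng.normal(size=(n - 1, 3))
    steps /= np.linalg.norm(steps, axis=1, keepdims=True)
    return np.vstack([np.zeros(3), np.cumsum(BOND_LENGTH * steps, axis=0)])

accepted = 0
for _ in range(N_TRIALS):
    chain = random_chain(N_RESIDUES, rng)
    # Keep only conformers whose termini could be spanned by an S-S bridge.
    if np.linalg.norm(chain[-1] - chain[0]) <= SS_DISTANCE:
        accepted += 1

print(f"{accepted} of {N_TRIALS} random conformers satisfy the S-S constraint")
```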

Relevance:

20.00%

Publisher:

Abstract:

The no-hiding theorem says that if any physical process leads to the bleaching of quantum information from the original system, then the information must reside in the rest of the Universe, with none of it hidden in the correlations between the two subsystems. Here, we report an experimental test of the no-hiding theorem using the technique of nuclear magnetic resonance. We use the quantum state randomization of a qubit as one example of the bleaching process and show that the missing information can be fully recovered, up to local unitary transformations, in the ancilla qubits.
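
As a small numerical illustration of the bleaching step mentioned above, the Python sketch below shows that a uniform Pauli twirl (one standard realization of single-qubit state randomization) maps every input state to the maximally mixed state I/2; the recovery-from-ancilla part of the experiment is not reproduced here:

```python
import numpy as np

# Single-qubit Pauli matrices.
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def randomize(rho):
    """Uniform Pauli twirl: average the state over conjugation by {I, X, Y, Z}."""
    return sum(P @ rho @ P.conj().T for P in (I, X, Y, Z)) / 4

# An arbitrary pure input state |psi> = cos(t)|0> + sin(t)|1>.
t = 0.3
psi = np.array([np.cos(t), np.sin(t)])
rho = np.outer(psi, psi.conj())

print(np.round(randomize(rho), 10))  # -> I/2, independent of the input state
```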

Relevance:

20.00%

Publisher:

Abstract:

The problem of structural system identification when measurements originate from multiple tests and multiple sensors is considered. An offline solution to this problem using bootstrap particle filtering is proposed. The central idea of the proposed method is the introduction of a dummy independent variable that allows for the simultaneous assimilation of multiple measurements in a sequential manner. The method can treat linear and nonlinear structural models, and allows for measurements of strains and displacements under static and dynamic loads. Illustrative examples consider measurement data from numerical models as well as from laboratory experiments. The results from the proposed method are compared with those from a Kalman filter-based approach, and the superior performance of the proposed method is demonstrated.
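
For readers unfamiliar with the underlying filter, here is a minimal generic bootstrap (sampling-importance-resampling) particle filter in Python on a scalar toy model; it illustrates only the predict/weight/resample cycle, not the paper's dummy-variable scheme for multi-test assimilation, and the toy state-space model is an assumption standing in for a structural model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy state-space model:
#   x_k = x_{k-1} + w_k,  w_k ~ N(0, q)   (state evolution)
#   y_k = x_k + v_k,      v_k ~ N(0, r)   (sensor measurement)
q, r, n_steps, n_particles = 0.1, 0.5, 50, 1000

# Simulate a reference trajectory and its noisy measurements.
x_true = np.cumsum(rng.normal(0, np.sqrt(q), n_steps))
y = x_true + rng.normal(0, np.sqrt(r), n_steps)

particles = rng.normal(0, 1, n_particles)
estimates = []
for k in range(n_steps):
    # 1. Predict: propagate each particle through the state equation.
    particles += rng.normal(0, np.sqrt(q), n_particles)
    # 2. Update: weight particles by the measurement likelihood.
    w = np.exp(-0.5 * (y[k] - particles) ** 2 / r)
    w /= w.sum()
    estimates.append(np.sum(w * particles))
    # 3. Resample (the "bootstrap" step): redraw particles by weight.
    particles = rng.choice(particles, size=n_particles, p=w)

print("RMS error:", np.sqrt(np.mean((np.array(estimates) - x_true) ** 2)))
```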

Relevance:

20.00%

Publisher:

Abstract:

Random Access Scan (RAS), which addresses individual flip-flops in a design using a memory-array-like row and column decoder architecture, has recently attracted widespread attention due to its potential for lower test application time, test data volume and test power dissipation compared with traditional serial scan. This is because typically only a very limited number of random "care" bits in a test response need to be modified to create the next test vector; unlike traditional scan, most flip-flops need not be updated. Test application efficiency can be further improved by organizing the access by word instead of by bit. In this paper we present a new decoder structure that exploits basis vectors and linear algebra to further optimize test application in RAS by performing write operations on multiple bits consecutively. Simulations performed on benchmark circuits show an average speed-up of 2-3 times in test write time compared with conventional RAS.
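
The abstract does not give the decoder's construction, but the linear-algebra ingredient it names amounts to computation over GF(2): expressing a target word as an XOR combination of stored basis vectors. A minimal, self-contained Python sketch of that step (the example basis and all names are hypothetical, not from the paper):

```python
import numpy as np

def solve_gf2(A, b):
    """Solve A x = b over GF(2) by Gaussian elimination (A square, invertible)."""
    A, b, n = A.copy() % 2, b.copy() % 2, len(b)
    for col in range(n):
        pivot = next(r for r in range(col, n) if A[r, col])  # find a pivot row
        A[[col, pivot]] = A[[pivot, col]]
        b[[col, pivot]] = b[[pivot, col]]
        for r in range(n):
            if r != col and A[r, col]:      # eliminate this column elsewhere
                A[r] ^= A[col]
                b[r] ^= b[col]
    return b

# Basis vectors stored as rows: the standard basis plus a parity row (example).
basis = np.array([[1, 0, 0],
                  [0, 1, 0],
                  [1, 1, 1]], dtype=np.uint8)
target = np.array([1, 0, 1], dtype=np.uint8)

# Which basis rows must be XOR-combined to produce the target word?
coeffs = solve_gf2(basis.T, target)
print(coeffs)  # -> [0 1 1]: rows 1 and 2 XOR to the target
```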

Relevance:

20.00%

Publisher:

Abstract:

In the area of testing communication systems, the interfaces between the systems under test and their testers have a great impact on test generation and fault detectability. Several types of such interfaces have been standardized by the International Organization for Standardization (ISO). A general distributed test architecture, containing distributed interfaces, has been presented in the literature for testing distributed systems based on the Open Distributed Processing (ODP) Basic Reference Model (BRM); it is a generalized version of the ISO distributed test architecture. We study in this paper the issue of test selection with respect to such a test architecture. In particular, we consider communication systems that can be modeled by finite state machines with several distributed interfaces, called ports. A test generation method is developed for generating test sequences for such finite state machines, based on the idea of synchronizable test sequences. Starting from the initial effort by Sarikaya, a certain amount of work has been done on generating test sequences for finite state machines with respect to the ISO distributed test architecture, all of it based on the idea of modifying existing test generation methods to produce synchronizable test sequences. However, none of this work studies the fault coverage provided by its methods. We investigate the issue of fault coverage and point out that the methods given in the literature for the distributed test architecture cannot ensure the same fault coverage as the corresponding original testing methods. We also study the limitation of fault detectability in the distributed test architecture.
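
To make the notion of a synchronizable test sequence concrete, here is a small Python sketch of the standard pairwise check for a multi-port FSM: the tester supplying the next input must have participated in the previous transition, either by sending its input or by observing one of its outputs. The data layout is an illustrative assumption, not taken from the paper:

```python
from dataclasses import dataclass

@dataclass
class Transition:
    in_port: str      # port whose tester supplies the input
    out_ports: set    # ports whose testers observe the outputs

def is_synchronizable(seq):
    """Check every consecutive pair of transitions: the port sending the next
    input must have taken part in the previous transition, or that tester
    cannot know when to apply its input (a synchronization problem)."""
    for prev, nxt in zip(seq, seq[1:]):
        if nxt.in_port != prev.in_port and nxt.in_port not in prev.out_ports:
            return False
    return True

# Two-port example with upper ("U") and lower ("L") testers.
ok  = [Transition("U", {"L"}), Transition("L", {"U"})]
bad = [Transition("U", {"U"}), Transition("L", {"U"})]
print(is_synchronizable(ok), is_synchronizable(bad))  # True False
```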