920 results for Residue curve maps
Abstract:
Drosophila mutants have played an important role in elucidating the physiologic function of genes. Large-scale projects have succeeded in producing mutations in a large proportion of Drosophila genes. Many mutant fly lines have also been produced through the efforts of individual laboratories over the past century. In an effort to make some of these mutants more useful to the research community, we systematically mapped a large number of mutations affecting genes in the proximal half of chromosome arm 2L to more precisely defined regions, delimited by deficiency intervals and, when possible, by individual complementation groups. To further analyze regions 36 and 39-40, we produced 11 new deficiencies with gamma irradiation, and we constructed 6 new deficiencies in region 30-33 using the DrosDel system. trans-heterozygous combinations of deficiencies revealed 5 additional functions essential for viability or fertility.
Abstract:
Although the influence of cytochrome P450 inhibitory drugs on the area under the curve (AUC) of cyclosporine (CsA) has been described, data concerning the impact of these substances on the shape of the blood concentration curve are scarce. By assessment of CsA blood levels before and 1, 2, and 4 hr after oral intake (C0, C1, C2, and C4, respectively) CsA profiling examinations were performed in 20 lung transplant recipients taking 400 mg, 200 mg, and no itraconazole, respectively. The three groups showed comparable results for C0, C2, and AUC(0-12). Greater values were found for Cmax, Cmax-C0, peak-trough fluctuation and rise to Cmax in favor of the non-itraconazole group. Additionally, tmax was shorter in the non-itraconazole group. Comedication with the metabolic inhibitor itraconazole is associated with a flattening of the CsA blood concentration profile in lung transplant recipients. These changes cannot be assessed by isolated C0, C2, or AUC(0-12) values alone.
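AUC values such as the AUC(0-12) above are typically estimated from sparse concentration-time points. As a generic illustration only (not the study's specific limited-sampling model), the linear trapezoidal rule can be sketched as follows; the sample profile is hypothetical:

```python
def auc_trapezoid(times, concs):
    """Linear trapezoidal estimate of the area under a
    concentration-time curve (times in h, concs in ng/mL)."""
    return sum((t2 - t1) * (c1 + c2) / 2.0
               for (t1, c1), (t2, c2) in zip(zip(times, concs),
                                             zip(times[1:], concs[1:])))

# hypothetical C0, C1, C2, C4 profile (ng/mL)
auc = auc_trapezoid([0, 1, 2, 4], [100, 400, 300, 200])  # 1100.0 ng*h/mL
```

Note that a truncated profile (0-4 hr) underestimates the full AUC(0-12), which is one reason limited-sampling strategies are validated against full profiles.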
Abstract:
Fluid optimization is a major contributor to improved patient outcomes. Unfortunately, anesthesiologists are often in doubt whether an additional fluid bolus will improve the patient's hemodynamics or not, as excess fluid may even jeopardize the patient's condition. This article discusses physiological concepts of liberal versus restrictive fluid management, followed by a discussion of the respective capabilities of various monitors to predict fluid responsiveness. The parameter difference in pulse pressure (dPP), derived from heart-lung interaction in mechanically ventilated patients, is discussed in detail. The dPP cutoff value of 13% to predict fluid responsiveness is presented together with several assessment techniques of dPP. Finally, confounding variables on dPP measurements, such as ventilation parameters, pneumoperitoneum, and use of norepinephrine, are also mentioned.
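The dPP parameter is computed from the maximal and minimal pulse pressure observed over one mechanical breath. A minimal sketch of the standard formula follows (the article's specific assessment techniques are not reproduced here, and the beat-to-beat values are hypothetical):

```python
def dpp_percent(pp_max, pp_min):
    """dPP (%) = 100 * (PPmax - PPmin) / mean(PPmax, PPmin),
    where PPmax and PPmin are the largest and smallest pulse
    pressures (mmHg) over a single respiratory cycle."""
    return 100.0 * (pp_max - pp_min) / ((pp_max + pp_min) / 2.0)

# hypothetical beat-to-beat pulse pressures (mmHg) over one breath
pps = [48, 52, 55, 50, 45]
dpp = dpp_percent(max(pps), min(pps))  # 20.0 -> above the 13% cutoff
```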
Abstract:
The problem of re-sampling spatially distributed data organized into regular or irregular grids to finer or coarser resolution is a common task in data processing. This procedure is known as 'gridding' or 're-binning'. Depending on the quantity the data represents, the gridding-algorithm has to meet different requirements. For example, histogrammed physical quantities such as mass or energy have to be re-binned in order to conserve the overall integral. Moreover, if the quantity is positive definite, negative sampling values should be avoided. The gridding process requires a re-distribution of the original data set to a user-requested grid according to a distribution function. The distribution function can be determined on the basis of the given data by interpolation methods. In general, accurate interpolation with respect to multiple boundary conditions of heavily fluctuating data requires polynomial interpolation functions of second or even higher order. However, this may result in unrealistic deviations (overshoots or undershoots) of the interpolation function from the data. Accordingly, the re-sampled data may overestimate or underestimate the given data by a significant amount. The gridding-algorithm presented in this work was developed in order to overcome these problems. Instead of a straightforward interpolation of the given data using high-order polynomials, a parametrized Hermitian interpolation curve was used to approximate the integrated data set. A single parameter is determined by which the user can control the behavior of the interpolation function, i.e. the amount of overshoot and undershoot. Furthermore, it is shown how the algorithm can be extended to multidimensional grids. The algorithm was compared to commonly used gridding-algorithms using linear and cubic interpolation functions. 
It is shown that such interpolation functions may overestimate or underestimate the source data by about 10-20%, while the new algorithm can be tuned to significantly reduce these interpolation errors. The accuracy of the new algorithm was tested on a series of x-ray CT-images (head and neck, lung, pelvis). The new algorithm significantly improves the accuracy of the sampled images in terms of the mean square error and a quality index introduced by Wang and Bovik (2002 IEEE Signal Process. Lett. 9 81-4).
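The core idea, interpolating the integrated (cumulative) data rather than the data itself so that the overall integral is conserved, can be illustrated with a deliberately simple sketch. It uses plain linear interpolation of the cumulative sum, not the paper's parametrized Hermitian curves, and all names are illustrative:

```python
import bisect

def rebin_conservative(old_edges, counts, new_edges):
    """Re-bin a 1-D histogram onto new bin edges by linearly
    interpolating its cumulative integral. The total integral is
    conserved exactly, and counts stay non-negative for
    non-negative input."""
    # cumulative integral evaluated at the old bin edges
    cum = [0.0]
    for c in counts:
        cum.append(cum[-1] + c)

    def cum_at(x):
        # linear interpolation of the cumulative curve
        if x <= old_edges[0]:
            return cum[0]
        if x >= old_edges[-1]:
            return cum[-1]
        i = bisect.bisect_right(old_edges, x) - 1
        t = (x - old_edges[i]) / (old_edges[i + 1] - old_edges[i])
        return cum[i] + t * (cum[i + 1] - cum[i])

    f = [cum_at(e) for e in new_edges]
    return [f[i + 1] - f[i] for i in range(len(f) - 1)]

# split two coarse bins into four fine ones: the total (6.0) is unchanged
rebin_conservative([0, 1, 2], [4.0, 2.0], [0, 0.5, 1.0, 1.5, 2.0])
# -> [2.0, 2.0, 1.0, 1.0]
```

Linear interpolation of the cumulative corresponds to a piecewise-constant density within each old bin; the paper's Hermitian scheme replaces this with a smooth, tunable curve while keeping the same conservation property.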
Abstract:
Purpose: Development of an interpolation algorithm for re-sampling spatially distributed CT data with the following features: global and local integral conservation, avoidance of negative interpolation values for positively defined datasets, and the ability to control re-sampling artifacts. Method and Materials: The interpolation can be separated into two steps: first, the discrete CT data have to be continuously distributed by an analytic function considering the boundary conditions. Generally, this function is determined by piecewise interpolation. Instead of using linear or high-order polynomial interpolations, which do not fulfill all the above-mentioned features, a special form of Hermitian curve interpolation is used to solve the interpolation problem with respect to the required boundary conditions. A single parameter is determined, by which the behavior of the interpolation function is controlled. Second, the interpolated data have to be re-distributed with respect to the requested grid. Results: The new algorithm was compared with commonly used interpolation functions based on linear and second-order polynomials. It is demonstrated that these interpolation functions may over- or underestimate the source data by about 10%-20%, while the parameter of the new algorithm can be adjusted in order to significantly reduce these interpolation errors. Finally, the performance and accuracy of the algorithm were tested by re-gridding a series of X-ray CT images. Conclusion: Inaccurate sampling values may occur due to the lack of integral conservation. Re-sampling algorithms using high-order polynomial interpolation functions may result in significant artifacts in the re-sampled data. Such artifacts can be avoided by using the new algorithm based on Hermitian curve interpolation.
Abstract:
Global transcriptomic and proteomic profiling platforms have yielded important insights into the complex response to ionizing radiation (IR). Nonetheless, little is known about the ways in which small cellular metabolite concentrations change in response to IR. Here, a metabolomics approach using ultraperformance liquid chromatography coupled with electrospray time-of-flight mass spectrometry was used to profile, over time, the hydrophilic metabolome of TK6 cells exposed to IR doses ranging from 0.5 to 8.0 Gy. Multivariate data analysis of the positive ions revealed dose- and time-dependent clustering of the irradiated cells and identified certain constituents of the water-soluble metabolome as being significantly depleted as early as 1 h after IR. Tandem mass spectrometry was used to confirm metabolite identity. Many of the depleted metabolites are associated with oxidative stress and DNA repair pathways. Included are reduced glutathione, adenosine monophosphate, nicotinamide adenine dinucleotide, and spermine. Similar measurements were performed with a transformed fibroblast cell line, BJ, and it was found that a subset of the identified TK6 metabolites were effective in IR dose discrimination. The GEDI (Gene Expression Dynamics Inspector) algorithm, which is based on self-organizing maps, was used to visualize dynamic global changes in the TK6 metabolome that resulted from IR. It revealed dose-dependent clustering of ions sharing the same trends in concentration change across radiation doses. "Radiation metabolomics," the application of metabolomic analysis to the field of radiobiology, promises to increase our understanding of cellular responses to stressors such as radiation.
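GEDI is a published tool; purely as an illustrative stand-in, a toy one-dimensional self-organizing map shows the underlying idea of grouping ions that share the same trend in concentration change across doses. All data and parameters here are hypothetical, and this sketch is not the GEDI implementation:

```python
import random

def train_som(vectors, n_units=4, epochs=50, lr0=0.5, seed=0):
    """Minimal 1-D self-organizing map: each unit holds a prototype
    'trend' vector; inputs pull their best-matching unit (and, more
    weakly, its neighbours) toward themselves."""
    rng = random.Random(seed)
    dim = len(vectors[0])
    units = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(n_units)]
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)  # decaying learning rate
        for v in vectors:
            # best-matching unit by squared Euclidean distance
            bmu = min(range(n_units),
                      key=lambda u: sum((a - b) ** 2
                                        for a, b in zip(units[u], v)))
            for u in range(n_units):
                # simple neighbourhood: full pull for the BMU,
                # half pull for its immediate neighbours
                h = 1.0 if u == bmu else (0.5 if abs(u - bmu) == 1 else 0.0)
                units[u] = [w + lr * h * (x - w)
                            for w, x in zip(units[u], v)]
    return units

def assign(vectors, units):
    """Map each vector to the index of its best-matching unit."""
    return [min(range(len(units)),
                key=lambda u: sum((a - b) ** 2
                                  for a, b in zip(units[u], v)))
            for v in vectors]

# hypothetical dose-response trends: two rising ions, two falling ions
trends = [[0, 1, 2], [0.1, 1.1, 2.1], [2, 1, 0], [2.1, 1.1, 0.1]]
units = train_som(trends)
labels = assign(trends, units)
```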
Abstract:
Several months were required to produce a single gram of indium. Consequently, the industrial history of the metal is extremely short. In view of the unique properties that indium has demonstrated in this short period, it is probable that indium is still in its early stage of development. However, the commercial applications of the metal are well established and indium is now produced on a commercial scale. It is obtainable as the metal or in solution for electroplating.
Abstract:
The intensive postwar search for new petroleum horizons has resulted in widespread prospecting in the northern Great Plains. No commercial production has as yet been derived from Ordovician or Devonian rocks in Montana, but the relatively few tests that have penetrated to critical depths have disclosed encouraging conditions which merit further consideration, especially in Devonian strata.
Abstract:
More than 3000 years ago, men began quenching and tempering tools to improve their physical properties. The ancient people found that iron was easier to shape and form in a heated condition. Charcoal was used as the fuel, and when the shaping process was completed, the smiths cooled the piece in the most obvious way, quenching in water. Quite unintentionally, these people stumbled on the process for improving the properties of iron, and the art of blacksmithing began.
Abstract:
The present chapter gives a comprehensive introduction to the display and quantitative characterization of scalp field data. After introducing the construction of scalp field maps, different interpolation methods, the effect of the recording reference and the computation of spatial derivatives are discussed. The arguments raised in this first part have important implications for resolving a potential ambiguity in the interpretation of differences of scalp field data. In the second part of the chapter, different approaches for comparing scalp field data are described. All of these comparisons can be interpreted in terms of differences of intracerebral sources, either in strength or in location and orientation, in a nonambiguous way. In the present chapter we only refer to scalp field potentials, but mapping can also be used to display other features, such as power or statistical values. However, the rules for comparing and interpreting scalp field potentials might not apply to such data.

Generic form of scalp field data

Electroencephalogram (EEG) and event-related potential (ERP) recordings consist of one value for each sample in time and for each electrode. The recorded EEG and ERP data thus represent a two-dimensional array, with one dimension corresponding to the variable "time" and the other dimension corresponding to the variable "space" or electrode. Table 2.1 shows ERP measurements over a brief time period. The ERP data (averaged over a group of healthy subjects) were recorded with 19 electrodes during a visual paradigm. The parietal midline Pz electrode has been used as the reference electrode.
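As a concrete illustration of the reference issue raised above: re-referencing recorded potentials to the average reference subtracts the instantaneous mean across electrodes, which shifts every value but leaves the between-electrode differences, i.e. the map topography, untouched. A minimal sketch with hypothetical data:

```python
def to_average_reference(frames):
    """Re-reference EEG/ERP data to the average reference.

    frames: list of time samples, each a list of electrode potentials
    (microvolts) recorded against a common reference (e.g. Pz).
    Subtracting the per-sample mean changes every single value but
    preserves all between-electrode differences - the scalp map."""
    out = []
    for frame in frames:
        mean = sum(frame) / len(frame)
        out.append([v - mean for v in frame])
    return out

# one hypothetical time sample from three electrodes (uV)
to_average_reference([[1.0, 2.0, 6.0]])  # -> [[-2.0, -1.0, 3.0]]
```

The difference between any two electrodes (here 6.0 - 1.0 = 5.0 uV) is identical before and after re-referencing, which is why reference-free measures are built from such differences.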
Diffusion Dynamics of Energy Efficient Buildings. Actor's Cognitive Maps of the Construction Process