898 results for "Sensitivity analysis"


Abstract:

Changes in the amount, composition, and configuration of mature forest cover can have significant consequences for wildlife populations. The response of wildlife to forest patterns is of concern to forest managers because it lies at the heart of competing approaches to forest planning, such as aggregated versus dispersed harvest block layouts. In this study, we developed a species assessment framework to evaluate the outcomes of forest management scenarios against biodiversity conservation objectives. Scenarios were assessed in the context of the broad range of forest structures and patterns that would be expected to occur under natural disturbance and succession processes. Spatial habitat models were used to predict the effects of varying degrees of mature forest cover amount, composition, and configuration on habitat occupancy for a set of 13 focal songbird species. We used a spatially explicit harvest scheduling program to model forest management options and simulate the future forest conditions resulting from alternative management scenarios, and a process-based fire simulation model to simulate the future forest conditions resulting from natural wildfire disturbance. Spatial pattern signatures were derived for both habitat occupancy and forest conditions, and these were placed in the context of the simulated range of natural variation. Strategic policy analyses were set in the context of current Ontario forest management policies, including the use of sequential time-restricted harvest blocks (created for woodland caribou (Rangifer tarandus) conservation) and delayed harvest areas (created for American marten (Martes americana atrata) conservation). This approach increased the realism of the analysis but reduced the generality of the interpretations. We found that forest management options that create linear strips of old forest deviated the most from simulated natural patterns and had the greatest negative effects on habitat occupancy, whereas policy options that specify the deferral and timing of harvest for large blocks helped ensure the stable presence of an intact mature forest matrix over time. The management scenario focused on maintaining compositional targets best supported biodiversity objectives by providing the composition patterns required by the 13 focal species, but it could be improved by adding broad-scale spatial objectives to better maintain large blocks of interior forest habitat through time.
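The central comparison in this framework, placing a scenario's spatial pattern signature within the simulated range of natural variation (RNV), reduces to a percentile check against an ensemble of natural-disturbance runs. A minimal sketch in Python; the function name, the 95% bounds, and the lognormal stand-in ensemble are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def rnv_departure(natural_runs: np.ndarray, scenario_value: float) -> dict:
    """Place a scenario's pattern metric in the context of the simulated
    range of natural variation (RNV) from many fire-succession runs."""
    lo, hi = np.percentile(natural_runs, [2.5, 97.5])  # central 95% of RNV
    percentile = (natural_runs < scenario_value).mean() * 100
    return {
        "within_rnv": lo <= scenario_value <= hi,
        "percentile": percentile,
        "rnv_bounds": (lo, hi),
    }

# e.g. mean interior-forest patch size (ha) from 500 simulated fire histories
natural = np.random.lognormal(mean=5.0, sigma=0.4, size=500)
print(rnv_departure(natural, scenario_value=60.0))
```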

Abstract:

For the tracking of extrema associated with weather systems to be applicable to a broad range of fields, it is necessary to remove a background field that represents the slowly varying, large spatial scales. The sensitivity of the tracking analysis to the form of background field removed is explored for the Northern Hemisphere winter storm tracks in three contrasting fields from an integration of the U.K. Met Office's (UKMO) Hadley Centre Climate Model (HadAM3). Several methods of background removal are explored, from the simple subtraction of the climatology to the more sophisticated removal of the planetary scales. Two temporal filters are also considered: a 2-6-day Lanczos bandpass filter and a 20-day high-pass Fourier filter. The analysis indicates that the simple subtraction of the climatology tends to change the nature of the systems to the extent that systems are redistributed relative to the climatological background, resulting in very similar statistical distributions for positive and negative anomalies. The optimal planetary wave filter removes total wavenumbers less than or equal to a number in the range 5-7, resulting in distributions more easily related to particular types of weather system. Of the temporal filters, the 2-6-day bandpass filter is found to have a detrimental impact on the individual weather systems, giving the storm tracks a weak waveguide type of behavior. The 20-day high-pass temporal filter is less aggressive than the 2-6-day filter and produces results falling between those of the climatological and 2-6-day filters.
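The planetary-scale filtering step can be illustrated in a few lines. The sketch below zeroes low zonal wavenumbers along longitude with a 1-D FFT; the published filter removes total (spherical-harmonic) wavenumbers n <= 5-7, so this is a deliberate simplification, and the field shape is an assumption:

```python
import numpy as np

def remove_planetary_background(field: np.ndarray, n_cut: int = 5) -> np.ndarray:
    """Crude analogue of the planetary-wave filter: zero zonal wavenumbers
    <= n_cut along the longitude axis. (The published method removes total
    spherical-harmonic wavenumbers; this 1-D FFT version is a simplification.)"""
    spec = np.fft.rfft(field, axis=-1)     # field shaped (..., nlon)
    spec[..., : n_cut + 1] = 0.0           # drop the mean and wavenumbers 1..n_cut
    return np.fft.irfft(spec, n=field.shape[-1], axis=-1)

# toy vorticity-like field: (time, lat, lon)
field = np.random.randn(10, 73, 144)
anomaly = remove_planetary_background(field, n_cut=6)
```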

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Abstract:

Meta-analysis of predictive values is usually discouraged because these values are directly affected by disease prevalence, but sensitivity and specificity sometimes show substantial heterogeneity as well. We propose a bivariate random-effects logit-normal model for the meta-analysis of the positive predictive value (PPV) and negative predictive value (NPV) of diagnostic tests.
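One plausible formalization of such a model, assuming the usual binomial within-study structure (the paper's exact parameterization may differ): for study $i$, let $t_i$ of the $p_i$ test positives be truly diseased and $f_i$ of the $n_i$ test negatives be truly disease-free; then

$$t_i \sim \operatorname{Bin}(p_i,\ \mathrm{PPV}_i), \qquad f_i \sim \operatorname{Bin}(n_i,\ \mathrm{NPV}_i),$$

$$\begin{pmatrix} \operatorname{logit} \mathrm{PPV}_i \\ \operatorname{logit} \mathrm{NPV}_i \end{pmatrix} \sim \mathcal{N}\!\left( \begin{pmatrix} \mu_P \\ \mu_N \end{pmatrix},\ \begin{pmatrix} \sigma_P^2 & \rho\,\sigma_P \sigma_N \\ \rho\,\sigma_P \sigma_N & \sigma_N^2 \end{pmatrix} \right),$$

so that between-study heterogeneity in PPV and NPV, and the correlation between them, are modeled jointly on the logit scale.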

Abstract:

We describe the measurement, at 100 K, of the SIMS relative sensitivity factors (RSFs) of the main physiological cations Na+, K+, Mg2+, and Ca2+ in frozen-hydrated (F-H) ionic solutions. Freezing was performed by either plunge freezing or high-pressure freezing. We also report the measurement of the RSFs in flax fibers, which are a model for ions in the plant cell wall, and in F-H ionic samples, which are a model for ions in the vacuole. RSFs were determined under bombardment with neutral oxygen (FAB) for both the fibers and the F-H samples. We show that referencing to ice-characteristic secondary ions is of little value in determining RSFs and that referencing to K is preferable. The RSFs of Na relative to K and of Ca relative to Mg in F-H samples are similar to their respective values in fiber samples, whereas the RSFs of both Ca and Mg relative to K are lower in fibers than in F-H samples. Our data show that the physical factors important for the determination of the RSFs are not the same in F-H samples as in homogeneous matrices. They also show that it is possible to perform a relative SIMS quantification of the cations in frozen-hydrated samples with an accuracy on the order of 15%. Referencing to K permits the quantification of ionic ratios even when the absolute concentration of the reference ion is unknown, which is essential for physiological studies of F-H biological samples.
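Relative quantification with RSFs referenced to K amounts to scaling measured secondary-ion intensity ratios. A hedged sketch, assuming the convention c_X / c_K = RSF × (I_X / I_K); RSF definitions vary by instrument and author, and the factor values below are purely illustrative, not the measured ones:

```python
# Illustrative RSFs of each cation relative to K (NOT the paper's values).
RSF_VS_K = {"Na": 1.3, "Mg": 0.7, "Ca": 0.5}

def ionic_ratio_vs_K(element: str, counts: float, counts_K: float) -> float:
    """Concentration ratio c_X/c_K from secondary-ion count rates,
    under the assumed convention c_X/c_K = RSF * (I_X/I_K)."""
    return RSF_VS_K[element] * counts / counts_K

# e.g. Na/K ratio in a frozen-hydrated sample from raw count rates
print(ionic_ratio_vs_K("Na", counts=4.2e5, counts_K=9.0e5))
```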

Abstract:

OBJECTIVE: The purpose of our study was to evaluate the efficacy of CT histogram analysis for further characterization of lipid-poor adenomas on unenhanced CT. MATERIALS AND METHODS: One hundred thirty-two adrenal nodules were identified in 104 patients with lung cancer who underwent PET/CT. Sixty-five nodules were classified as lipid-rich adenomas if they had an unenhanced CT attenuation of less than or equal to 10 H. Thirty-one masses were classified as lipid-poor adenomas if they had an unenhanced CT attenuation greater than 10 H and stability for more than 1 year. Thirty-six masses were classified as lung cancer metastases if they showed rapid growth in 1 year (n = 27) or were biopsy-proven (n = 9). Histogram analysis was performed for all lesions to provide the mean attenuation value and percentage of negative pixels. RESULTS: All lipid-rich adenomas had more than 10% negative pixels; 51.6% of lipid-poor adenomas had more than 10% negative pixels and would have been classified as indeterminate nodules on the basis of mean attenuation alone. None of the metastases had more than 10% negative pixels. Using an unenhanced CT mean attenuation threshold of less than 10 H yielded a sensitivity of 68% and specificity of 100% for the diagnosis of an adenoma. Using an unenhanced CT threshold of more than 10% negative pixels yielded a sensitivity of 84% and specificity of 100% for the diagnosis of an adenoma. CONCLUSION: CT histogram analysis is superior to mean CT attenuation analysis for the evaluation of adrenal nodules and may help decrease referrals for additional imaging or biopsy.
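The histogram criterion above is simple to state in code. A minimal sketch, using the thresholds reported in the abstract (mean attenuation <= 10 H, >10% negative pixels); the function name and the synthetic ROI are illustrative:

```python
import numpy as np

def classify_adrenal_nodule(roi_hu: np.ndarray) -> str:
    """Classify a nodule from its unenhanced-CT ROI pixel values (in H),
    per the thresholds described in the abstract."""
    mean_hu = roi_hu.mean()
    pct_negative = (roi_hu < 0).mean() * 100
    if mean_hu <= 10:          # lipid-rich adenoma by mean attenuation alone
        return "adenoma (mean attenuation <= 10 H)"
    if pct_negative > 10:      # lipid-poor adenoma rescued by histogram analysis
        return "adenoma (>10% negative pixels)"
    return "indeterminate: consider further imaging or biopsy"

roi = np.random.normal(loc=15, scale=20, size=500)  # synthetic ROI pixel values
print(classify_adrenal_nodule(roi))
```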

Abstract:

Tumor necrosis factor (TNF) is known to have antiproliferative effects on a wide variety of tumor cells but proliferative effects on normal cells. However, the molecular basis for such differences in the action of TNF is unknown. The overall objectives of my research were to investigate the role of oncogenes in TNF sensitivity and to delineate some of the molecular mechanisms involved in TNF sensitivity and resistance. To accomplish these objectives, I transfected TNF-resistant C3H mouse embryo fibroblasts (10T1/2) with an activated Ha-ras oncogene and determined whether these cells exhibit altered sensitivity to TNF. The results indicated that 10T1/2 cells transfected with an activated Ha-ras oncogene (10T-EJ) not only produced tumors in nude mice but also exhibited extreme sensitivity to cytolysis by TNF. In contrast, 10T1/2 cells transfected with the pSV2-neo gene alone were resistant to the cytotoxic effects of TNF. I also found that TNF-induced cell death was mediated through apoptosis. The differential sensitivity of the 10T1/2 and 10T-EJ cell lines to TNF was not due to differences in the number of TNF receptors on their cell surface. In addition, TNF-resistant revertants isolated from Ha-ras-transformed, TNF-sensitive cells still expressed the same amount of p21 as TNF-sensitive cells and were still tumorigenic, suggesting that Ha-ras-induced transformation and TNF sensitivity may follow different pathways. Interestingly, TNF-resistant but not TNF-sensitive cells expressed higher levels of bcl-2, c-myc, and manganese superoxide dismutase (MnSOD) mRNA following exposure to TNF, whereas TNF treatment resulted in only a marginal induction of p53 mRNA in both cell types. Based on these results, I conclude that (i) the Ha-ras oncogene induces both transformation and TNF sensitivity, (ii) TNF-induced cytotoxicity involves apoptosis, and (iii) TNF-induced upregulation of the bcl-2, c-myc, and MnSOD genes is associated with TNF resistance in C3H mouse embryo fibroblasts.

Abstract:

The major complications for tumor therapy are (i) tumor spread (metastasis); (ii) the mixed nature of tumors (heterogeneity); and (iii) the capacity of tumors to evolve (progression). To study these tumor characteristics, the rat 13762NF mammary adenocarcinoma was cloned and studied for metastatic properties and sensitivities to therapy (chemotherapy, radiation, and hyperthermia). The cell clones were heterogeneous, and no correlation between metastatic potential and therapeutic sensitivities was observed. Further, these phenotypes were unstable during passage in vitro; yet the changes were clone dependent and reproducible using different cryoprotected cell stocks. To understand the phenotypic instability, subclones were isolated from low- and high-passage cell clones. Each subclone possessed a unique composite phenotype. Again, no apparent correlation was seen between metastatic potential and sensitivity to therapy. The results demonstrated that (1) tumor cells are heterogeneous for multiple phenotypes; (2) tumor cells are unstable for multiple phenotypes; (3) the magnitude, direction, and time of occurrence of phenotypic drift are clone dependent; (4) the sensitivity of cell clones to ionizing (gamma) radiation, heat, and chemotherapy agents is independent of their metastatic potential; (5) shifts in metastatic potential and sensitivity to therapy may occur simultaneously but are not linked; and (6) tumor cells independently diverge to form several subpopulations with unique phenotypic profiles.

Abstract:

Resource Analysis (a.k.a. Cost Analysis) tries to approximate the cost of executing a program as a function of its input data sizes, without actually having to execute the program. While powerful resource analysis frameworks for object-oriented programs existed before this thesis, advanced aspects affecting the efficiency, accuracy, and reliability of the results still need to be investigated in depth. This thesis tackles that need from four different perspectives. (1) Shared mutable data structures are the bane of formal reasoning and static analysis. Analyses that keep track of heap-allocated data are referred to as heap-sensitive. Recent work proposes locality conditions for soundly tracking field accesses by means of ghost non-heap-allocated variables. This thesis presents two extensions to that approach: the first considers array accesses in addition to object fields, while the second handles cases in which the locality conditions cannot be proven unconditionally, by finding aliasing preconditions under which tracking such heap locations is feasible. (2) The aim of incremental analysis is, given a program, its analysis results, and a series of changes to the program, to obtain the new analysis results as efficiently as possible and, ideally, without re-analyzing fragments of code that are not affected by the changes. During software development, programs are modified constantly, but most analyzers still read and analyze the entire program at once in a non-incremental way. This thesis presents an incremental resource usage analysis which, after a change to the program is made, reconstructs the cost upper bounds of all affected methods incrementally. To this end, we propose (i) a multi-domain incremental fixed-point algorithm that can be used by all the global analyses required to infer the cost, and (ii) a novel form of cost summaries that allows us to incrementally reconstruct only those components of cost functions affected by the change. (3) Resource guarantees that are automatically inferred by static analysis tools are generally not considered completely trustworthy unless the tool implementation or its results are formally verified. Performing full-blown verification of such tools is a daunting task, since they are large and complex. This thesis instead develops a formal framework for verifying the resource guarantees produced by the analyzers. We have implemented this idea using COSTA, a state-of-the-art cost analyzer for Java programs, and KeY, a state-of-the-art verification tool for Java source code: COSTA derives upper bounds for Java programs, while KeY proves the validity of these bounds and provides a certificate. The main contribution of this work is to show that the proposed tool cooperation can automatically produce verified resource guarantees. (4) Distribution and concurrency are now mainstream. Concurrent objects form a well-established model for distributed concurrent systems: objects are the concurrency units, and they communicate via asynchronous method calls. Distribution suggests that the analysis must infer the cost of the diverse distributed components separately. This thesis proposes a novel object-sensitive cost analysis which, by using the results gathered by a points-to analysis, keeps the cost of the diverse distributed components separate.
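To give a feel for how an incremental fixed point avoids whole-program re-analysis, here is a minimal worklist sketch over a call graph: only methods whose summaries may change are re-analyzed, and updates propagate to callers. All names, the toy cost domain, and the call graph are illustrative; this is not the actual COSTA algorithm.

```python
from collections import deque

def incremental_fixpoint(callers, summaries, analyze, changed):
    """Re-analyze only the methods reachable (via callers) from a change.
    `analyze(m, summaries)` recomputes m's summary from its callees'."""
    work = deque(changed)
    while work:
        m = work.popleft()
        new = analyze(m, summaries)
        if new != summaries.get(m):      # summary changed: callers are stale
            summaries[m] = new
            work.extend(callers.get(m, ()))
    return summaries

# Toy cost domain: summary(m) = own base cost + sum of callee summaries.
callees = {"main": ["f"], "f": ["g"], "g": []}
callers = {"g": ["f"], "f": ["main"]}
base = {"main": 1, "f": 1, "g": 5}       # 'g' was just edited (base cost grew)
old = {"main": 3, "f": 2, "g": 1}        # results from the previous analysis

def analyze(m, s):
    return base[m] + sum(s[c] for c in callees[m])

print(incremental_fixpoint(callers, old, analyze, changed=["g"]))
# -> {'main': 7, 'f': 6, 'g': 5}; only g, f, and main were re-analyzed
```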

Abstract:

Statistical methods for the analysis of PSIR MRI

Abstract:

UPM Activities on Sensitivity and Uncertainty Analysis of Assembly Depletion

Abstract:

Triticum aestivum aluminum-activated malate transporter (TaALMT1) is the founding member of a unique gene family of anion transporters (ALMTs) that mediate the efflux of organic acids. A small sub-group of root-localized ALMTs, including TaALMT1, is physiologically associated with in planta aluminum (Al) resistance. TaALMT1 exhibits significant enhancement of transport activity in response to extracellular Al. In this study, we integrated structure–function analyses of structurally altered TaALMT1 proteins expressed in Xenopus oocytes with phylogenetic analyses of the ALMT family. Our aim is to re-examine the role of protein domains in terms of their potential involvement in the Al-dependent enhancement (i.e. Al-responsiveness) of TaALMT1 transport activity, as well as the roles of all its 43 negatively charged amino acid residues. Our results indicate that the N-domain, which is predicted to form the conductive pathway, mediates ion transport even in the absence of the C-domain. However, segments in both domains are involved in Al3+ sensing. We identified two regions, one at the N-terminus and a hydrophobic region at the C-terminus, that jointly contribute to the Al-response phenotype. Interestingly, the characteristic motif at the N-terminus appears to be specific for Al-responsive ALMTs. Our study highlights the need to include a comprehensive phylogenetic analysis when drawing inferences from structure–function analyses, as a significant proportion of the functional changes observed for TaALMT1 are most likely the result of alterations in the overall structural integrity of ALMT family proteins rather than modifications of specific sites involved in Al3+ sensing.

Abstract:

A validation of the burn-up simulation system EVOLCODE 2.0 is presented, based on experimental measurements of U and Pu isotopes and of some fission-fragment production ratios after a burn-up of around 30 GWd/tU in a pressurized light water reactor (PWR). This work provides an in-depth analysis of the validation results, including the possible sources of the uncertainties. An uncertainty analysis based on the sensitivity methodology has also been performed, providing the uncertainties in the isotopic content propagated from the cross-section uncertainties. An improvement of the classical sensitivity/uncertainty (S/U) model has been developed to take into account the implicit dependence on the neutron flux normalization, that is, the effect of the constant power of the reactor. This improved S/U methodology, usually neglected in studies of this kind, proves to be an important contribution to explaining some simulation-experiment discrepancies: for the most relevant actinides, the cross-section uncertainties are an important contributor to the simulation uncertainties, of the same order of magnitude as, and sometimes even larger than, the experimental uncertainties and the experiment-simulation differences. Additionally, some hints for the improvement of the JEFF-3.1.1 fission yield library and for the correction of some errata in the experimental data are presented.
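In the classical S/U approach, the relative uncertainty of an isotopic concentration $N$ is propagated from the cross-section covariance matrix by the sandwich rule. One way to write the flux-normalization correction described above (the notation here is assumed, not taken from the paper) is

$$\left(\frac{\Delta N}{N}\right)^{2} = S\, C_{\sigma}\, S^{\mathsf{T}}, \qquad S_{j} = \frac{\sigma_{j}}{N}\,\frac{\mathrm{d}N}{\mathrm{d}\sigma_{j}} = \frac{\sigma_{j}}{N} \left( \frac{\partial N}{\partial \sigma_{j}} + \frac{\partial N}{\partial \phi}\, \frac{\partial \phi}{\partial \sigma_{j}} \right),$$

where the second term is the implicit contribution: a perturbed cross section changes the flux $\phi$, which must readjust so that the reactor power remains constant.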

Abstract:

Early detection is an effective means of reducing cancer mortality. Here, we describe a highly sensitive high-throughput screen that can identify panels of markers for the early detection of solid tumor cells disseminated in peripheral blood. The method is a two-step combination of differential display and high-sensitivity cDNA arrays. In a primary screen, differential display identified 170 candidate marker genes differentially expressed between breast tumor cells and normal breast epithelial cells. In a secondary screen, high-sensitivity arrays assessed the expression levels of these genes in 48 blood samples, 22 from healthy volunteers and 26 from breast cancer patients. Cluster analysis identified a group of 12 genes that were elevated in the blood of cancer patients. Permutation analysis of individual genes defined five core genes (P ≤ 0.05, permax test). As a group, the 12 genes generally distinguished accurately between healthy volunteers and patients with breast cancer. Mean expression levels of the 12 genes were elevated in 77% (10 of 13) of untreated invasive cancer patients, and cluster analysis correctly classified volunteers and patients (P = 0.0022, Fisher's exact test). Quantitative real-time PCR confirmed the array results and indicated that the sensitivity of the assay (1:2 × 10^8 transcripts) is sufficient to detect disseminated solid tumor cells in blood. Expression-based blood assays developed with the screening approach described here have the potential to detect and classify solid tumor cells originating from virtually any primary site in the body.
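The per-gene permutation step can be sketched generically: shuffle the patient/volunteer labels many times and ask how often a shuffled difference in mean expression matches the observed one. A minimal Python sketch; the paper's permax statistic adjusts for multiplicity across genes, so this one-gene version and the synthetic data are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(0)

def permutation_pvalue(patients: np.ndarray, controls: np.ndarray,
                       n_perm: int = 10_000) -> float:
    """One-sided label-permutation p-value for the observed
    patient-minus-control difference in mean expression of one gene."""
    pooled = np.concatenate([patients, controls])
    observed = patients.mean() - controls.mean()
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # random relabeling of the 48 samples
        diff = pooled[: len(patients)].mean() - pooled[len(patients):].mean()
        count += diff >= observed
    return (count + 1) / (n_perm + 1)

# synthetic log-expression for one candidate marker gene
pat = rng.normal(1.0, 1.0, 26)   # 26 breast cancer patients
ctl = rng.normal(0.0, 1.0, 22)   # 22 healthy volunteers
print(permutation_pvalue(pat, ctl))
```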