992 results for Reference level
Abstract:
Human exposure to Bisphenol A (BPA) results mainly from ingestion of food and beverages. Information regarding BPA effects on colon cancer, one of the major causes of death in developed countries, is still scarce. Likewise, little is known about BPA drug interactions, although a potential role in doxorubicin (DOX) chemoresistance has been suggested. This study aims to assess potential interactions between BPA and DOX in HT29 colon cancer cells. HT29 cell response was evaluated after exposure to BPA, DOX, or co-exposure to both chemicals. Transcriptional analysis of several cancer-associated genes (c-fos, AURKA, p21, bcl-xl and CLU) shows that BPA exposure induces slight up-regulation exclusively of bcl-xl without affecting cell viability. On the other hand, a sub-therapeutic DOX concentration (40 nM) results in highly altered c-fos, bcl-xl and CLU transcript levels, and this is not affected by co-exposure with BPA. Conversely, DOX at a therapeutic concentration (4 μM) results in distinct and very severe transcriptional alterations of c-fos, AURKA, p21 and CLU that are counteracted by co-exposure with BPA, resulting in transcript levels similar to those of the control. Co-exposure with BPA slightly decreases apoptosis relative to 4 μM DOX alone without affecting DOX-induced loss of cell viability. These results suggest that BPA exposure can influence chemotherapy outcomes and therefore emphasize the need for a better understanding of BPA interactions with chemotherapeutic agents in the context of risk assessment.
Abstract:
Objective: To suggest a national value for the diagnostic reference level (DRL), in terms of administered activity in MBq.kg⁻¹, for nuclear medicine procedures with fluorodeoxyglucose (18F-FDG) in whole-body positron emission tomography (PET) scans of adult patients. Materials and Methods: A survey of the 18F-FDG activity values administered in Brazilian clinics was undertaken by means of a questionnaire covering the number and manufacturer of the installed equipment, model and detector type. The suggested DRL value was based on the calculation of the third quartile of the distribution of activity values reported by the clinics. Results: Among the surveyed Brazilian clinics, 58% responded completely or partially to the questionnaire, and the results demonstrated variation of up to 100% in the reported radiopharmaceutical activity. The suggested DRL for 18F-FDG/PET activity was 5.54 MBq.kg⁻¹ (0.149 mCi.kg⁻¹). Conclusion: The present study demonstrated the lack of standardization in administered radiopharmaceutical activities for PET procedures in Brazil, corroborating the need for an official DRL value to be adopted in the country. The suggested DRL value demonstrates that there is room for optimization of the procedures and of the 18F-FDG/PET activities administered in Brazilian clinics, in order to reduce the doses delivered to patients. It is important to highlight that this value should be revised, and the procedures further optimized, at least every five years.
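As a minimal illustration of the DRL convention this abstract describes (third quartile of the distribution of reported activities), the sketch below uses hypothetical survey values, not the study's data; only the percentile rule and the 1 mCi = 37 MBq conversion are taken from the abstract.

```python
import numpy as np

# Hypothetical per-kilogram 18F-FDG activities (MBq/kg) reported by
# surveyed clinics -- illustrative values, not the study's survey data.
activities = np.array([3.2, 4.1, 4.8, 5.0, 5.2, 5.5, 5.6, 6.0, 6.4, 7.1])

# DRL convention: third quartile (75th percentile) of the distribution.
drl_mbq_kg = np.percentile(activities, 75)

# Unit conversion: 1 mCi = 37 MBq.
drl_mci_kg = drl_mbq_kg / 37.0

print(f"suggested DRL: {drl_mbq_kg:.2f} MBq/kg ({drl_mci_kg:.3f} mCi/kg)")
```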
Abstract:
We study an intertemporal asset pricing model in which a representative consumer maximizes expected utility derived both from the ratio of his consumption to some reference level and from this level itself. If the reference consumption level is assumed to be determined by past consumption levels, the model generalizes the usual habit formation specifications. When the reference level growth rate is made dependent on the market portfolio return and on past consumption growth, the model mixes a consumption CAPM with habit formation together with the CAPM. It therefore provides, in an expected utility framework, a generalization of the recursive non-expected utility model of Epstein and Zin (1989). When we estimate this specification with aggregate per capita consumption, we obtain economically plausible values of the preference parameters, in contrast with either the habit formation or the Epstein-Zin case taken separately. All tests performed with various preference specifications confirm that the reference level enters significantly in the pricing kernel.
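A sketch of the preference specification the abstract describes, in assumed notation (C_t consumption, X_t the reference level, β the subjective discount factor); the functional form of u and of the reference-level law of motion f are left abstract, since the abstract does not pin them down:

```latex
\[
  U_0 \;=\; \mathbb{E}_0 \sum_{t=0}^{\infty} \beta^{t}\,
            u\!\left(\frac{C_t}{X_t},\, X_t\right),
  \qquad
  X_t \;=\; f\!\left(C_{t-1}, C_{t-2}, \dots\right).
\]
```

When X_t depends only on past consumption, the specification nests habit formation; letting the growth rate of X_t also load on the market portfolio return produces the mix of consumption CAPM and CAPM discussed above.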
Abstract:
The quantification of the available energy in the environment is important because it determines photosynthesis, evapotranspiration and, therefore, the final yield of crops. Instruments for measuring the energy balance are costly, and indirect estimation alternatives are desirable. This study assessed the performance of Deardorff's model over a cycle of a sugarcane crop in Piracicaba, State of São Paulo, Brazil, in comparison with the aerodynamic method. This mechanistic model simulates the energy fluxes (sensible heat, latent heat and net radiation) at three levels (atmosphere, canopy and soil) using only air temperature, relative humidity and wind speed measured at a reference level above the canopy, the crop leaf area index, and some pre-calibrated parameters (canopy albedo, soil emissivity, atmospheric transmissivity and hydrological characteristics of the soil). The analysis was made for different time scales, insolation conditions and seasons (spring, summer and autumn). Analyzing all data at 15-minute intervals, the model performed well in simulating net radiation under different insolation conditions and seasons. The latent and sensible heat fluxes in the atmosphere did not differ from the aerodynamic-method data during the autumn. The sensible heat flux in the soil was poorly simulated by the model, owing to the poor performance of the soil water balance method. In general, Deardorff's model improved the flux simulations in comparison with the aerodynamic method when more insolation was available in the environment.
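The fluxes compared in this study are tied together by the surface energy balance, Rn = H + LE + G. A minimal sketch of that closure, with illustrative values; the function and the numbers are assumptions for illustration, not the study's measurements or Deardorff's full model:

```python
def latent_heat_residual(rn: float, h: float, g: float) -> float:
    """Latent heat flux LE (W/m2) as the residual of Rn = H + LE + G."""
    return rn - h - g

# Illustrative midday values (W/m2), not data from the sugarcane site.
le = latent_heat_residual(rn=450.0, h=120.0, g=35.0)
print(f"LE = {le:.0f} W/m2")  # 295 W/m2 available for evapotranspiration
```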
Abstract:
Master's dissertation, Economics and Business Sciences, 6 December 2012, Universidade dos Açores.
Abstract:
Computed tomography (CT) is currently the imaging method that contributes most to the collective dose resulting from medical exposures. This study aims to determine the CT dose index (CTDI) and dose-length product (DLP) values for adult head and chest examinations on a multidetector CT scanner, and to carry out objective and subjective analyses of image quality. CTDI and DLP values were determined using an ionization chamber and head and chest phantoms. Objective and subjective image-quality analyses were also performed, using the Catphan® 500 phantom and human observers, respectively. The results were above the European guidelines for the head protocol (CTDIvol = 80.13 mGy and DLP = 1209.22 mGy.cm) and below them for the chest protocol (CTDIvol = 8.37 mGy and DLP = 274.71 mGy.cm). In the objective image-quality analysis, with the exception of low-contrast resolution in the head protocol, all other criteria analyzed were in conformity with the legislation. In the subjective image-quality analysis, there was a statistically significant difference between the scores assigned by the observers to the images for the evaluated parameters (p = 0.000-0.005).
Abstract:
Background: Gene expression analysis has emerged as a major biological research area, with real-time quantitative reverse transcription PCR (RT-QPCR) being one of the most accurate and widely used techniques for expression profiling of selected genes. In order to obtain results that are comparable across assays, a stable normalization strategy is required. In general, the normalization of PCR measurements between different samples uses one to several control genes (e.g. housekeeping genes), from which a baseline reference level is constructed. Thus, the choice of the control genes is of utmost importance, yet there is no generally accepted standard technique for screening a large number of candidates and identifying the best ones. Results: We propose a novel approach for scoring and ranking candidate genes for their suitability as control genes. Our approach relies on publicly available microarray data and allows the combination of multiple data sets originating from different platforms and/or representing different pathologies. The use of microarray data allows the screening of tens of thousands of genes, producing very comprehensive lists of candidates. We also provide two lists of candidate control genes: one which is breast cancer-specific and one with more general applicability. Two genes from the breast cancer list which had not been previously used as control genes are identified and validated by RT-QPCR. Open source R functions are available at http://www.isrec.isb-sib.ch/~vpopovic/research/ Conclusion: We proposed a new method for identifying candidate control genes for RT-QPCR which was able to rank thousands of genes according to some predefined suitability criteria, and we applied it to the case of breast cancer. We also empirically showed that translating the results from the microarray to the PCR platform was achievable.
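As a toy illustration of scoring control-gene candidates from microarray data (the paper's actual suitability criteria are not reproduced here), one simple stand-in stability score is the coefficient of variation across samples, restricted to well-expressed genes:

```python
import numpy as np
import pandas as pd

# Hypothetical genes x samples matrix of log2 microarray intensities.
rng = np.random.default_rng(0)
expr = pd.DataFrame(rng.normal(loc=8.0, scale=1.0, size=(10_000, 40)),
                    index=[f"gene_{i}" for i in range(10_000)])

mean = expr.mean(axis=1)
cv = expr.std(axis=1) / mean  # lower CV = more stable expression

# Rank well-expressed genes by stability; the top of the list are the
# most promising control-gene candidates under this toy criterion.
candidates = cv[mean >= mean.median()].sort_values()
print(candidates.head(10))
```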
Abstract:
Climate science indicates that climate stabilization requires low GHG emissions. Is this consistent with nondecreasing human welfare? Our welfare or utility index emphasizes education, knowledge, and the environment. We construct and calibrate a multigenerational model with intertemporal links provided by education, physical capital, knowledge and the environment. We reject discounted utilitarianism and adopt, first, the Pure Sustainability Optimization (or Intergenerational Maximin) criterion, and, second, the Sustainable Growth Optimization criterion, that maximizes the utility of the first generation subject to a given future rate of growth. We apply these criteria to our calibrated model via a novel algorithm inspired by the turnpike property. The computed paths yield levels of utility higher than the level at reference year 2000 for all generations. They require the doubling of the fraction of labor resources devoted to the creation of knowledge relative to the reference level, whereas the fractions of labor allocated to consumption and leisure are similar to the reference ones. On the other hand, higher growth rates require substantial increases in the fraction of labor devoted to education, together with moderate increases in the fractions of labor devoted to knowledge and the investment in physical capital.
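The two planner problems named in the abstract can be written compactly as follows, with U_t the utility of generation t and g the required growth rate (notation assumed, since the abstract states the criteria only in words):

```latex
\[
  \text{Intergenerational Maximin:}\qquad
  \max_{\{\text{paths}\}} \; \min_{t \ge 1} \, U_t
\]
\[
  \text{Sustainable Growth:}\qquad
  \max_{\{\text{paths}\}} \; U_1
  \quad \text{s.t.} \quad U_{t+1} \ge (1+g)\,U_t \;\; \forall\, t \ge 1 .
\]
```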
Abstract:
PURPOSE: To investigate the relationship between hemoglobin (Hgb) and brain tissue oxygen tension (PbtO2) after severe traumatic brain injury (TBI) and to examine its impact on outcome. METHODS: This was a retrospective analysis of a prospective cohort of severe TBI patients whose PbtO2 was monitored. The relationship between Hgb, categorized into four quartiles (≤9; 9-10; 10.1-11; >11 g/dl), and PbtO2 was analyzed using mixed-effects models. Anemia with compromised PbtO2 was defined as episodes of Hgb ≤ 9 g/dl with simultaneous PbtO2 < 20 mmHg. Outcome was assessed at 30 days using the Glasgow outcome score (GOS), dichotomized as favorable (GOS 4-5) vs. unfavorable (GOS 1-3). RESULTS: We analyzed 474 simultaneous Hgb and PbtO2 samples from 80 patients (mean age 44 ± 20 years, median GCS 4 (3-7)). Using Hgb > 11 g/dl as the reference level, and controlling for important physiologic covariates (CPP, PaO2, PaCO2), Hgb ≤ 9 g/dl was the only Hgb level associated with lower PbtO2 (coefficient -6.53 (95% CI -9.13 to -3.94), p < 0.001). Anemia with simultaneous PbtO2 < 20 mmHg, but not anemia alone, increased the risk of unfavorable outcome (odds ratio 6.24 (95% CI 1.61 to 24.22), p = 0.008), controlling for age, GCS, Marshall CT grade, and APACHE II score. CONCLUSIONS: In this cohort of severe TBI patients whose PbtO2 was monitored, a Hgb level no greater than 9 g/dl was associated with compromised PbtO2. Anemia with simultaneously compromised PbtO2, but not anemia alone, was a risk factor for unfavorable outcome, irrespective of injury severity.
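A sketch of the kind of mixed-effects analysis the abstract describes, using statsmodels; the file name and column names are hypothetical, and the quartile with Hgb > 11 g/dl is set as the reference level, as in the abstract:

```python
import pandas as pd
import statsmodels.formula.api as smf

# One row per simultaneous Hgb/PbtO2 sample (hypothetical file/columns).
df = pd.read_csv("tbi_samples.csv")

# Random intercept per patient accounts for repeated samples; CPP, PaO2
# and PaCO2 enter as the physiologic covariates named in the abstract.
model = smf.mixedlm(
    "pbto2 ~ C(hgb_cat, Treatment('>11')) + cpp + pao2 + paco2",
    data=df,
    groups=df["patient_id"],
)
print(model.fit().summary())
```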
Abstract:
The purpose of this article is to review strategies for controlling patient dose in adult and pediatric computed tomography (CT), taking into account the change of technology from single-detector row CT to multi-detector row CT. First, the relationships between the computed tomography dose index, the dose-length product, and the effective dose in adult and pediatric CT are reviewed, along with the diagnostic reference level concept. Then the effect of image noise as a function of volume computed tomography dose index, reconstructed slice thickness, and patient size is described. Finally, the potential of tube current modulation CT is discussed.
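The quantities named above are linked by DLP = CTDIvol × scan length and, approximately, effective dose E ≈ k·DLP, with k a body-region conversion coefficient. A minimal sketch; the k values below are commonly cited adult coefficients and should be treated as indicative only:

```python
# Commonly cited adult conversion coefficients, mSv per mGy.cm
# (indicative values only; consult current guidance for actual work).
K_ADULT = {"head": 0.0021, "chest": 0.014}

def effective_dose_msv(ctdivol_mgy: float, length_cm: float, region: str) -> float:
    """E ~= k * DLP, with DLP = CTDIvol * scan length."""
    dlp_mgy_cm = ctdivol_mgy * length_cm
    return K_ADULT[region] * dlp_mgy_cm

# Example: chest acquisition, CTDIvol = 8.4 mGy over a 33 cm scan range.
print(f"E ~ {effective_dose_msv(8.4, 33.0, 'chest'):.1f} mSv")  # ~3.9 mSv
```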
Abstract:
Purpose: Diagnostic radiology involving ionizing radiation often provides crucial information but also involves risk. Estimated cancer risks associated with CT range between 1 in 1000 and 1 in 10,000, depending on age and exposure settings. The aim of this contribution is to provide radiologists with a way to inform patients about these risks on both a collective and an individual basis. Materials and methods: After a brief review of the effects of ionizing radiation, the conversion from dose indicators into effective dose is presented for radiography, fluoroscopy and CT. The Diagnostic Reference Level (DRL) concept is then introduced to enable the reader to compare the levels of exposure of various examinations. Finally, the limits of the effective dose concept are explained and risk projections after various radiological procedures for adults and children are presented. Results: From an individual standpoint, the benefit of a well-justified and optimized CT examination clearly outweighs its risk of inducing a fatal cancer. The uncertainties associated with the effective dose concept should be kept in mind in order to avoid cancer risk projections after an examination on an individual basis. Conclusion: Risk factors and effective dose are not the simplest tools to communicate with when dealing with radiological risks. Thus, a set of risk categories should be preferred, as proposed in ICRP (International Commission on Radiological Protection) report 99.
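For orders of magnitude, a population-level risk projection multiplies effective dose by a nominal risk coefficient on the order of 5% per Sv. A toy sketch, with the coefficient assumed and, as the abstract itself cautions, not to be applied to individuals:

```python
NOMINAL_RISK_PER_SV = 0.05  # assumed whole-population coefficient

def projected_cancer_risk(dose_msv: float) -> float:
    """Nominal lifetime cancer risk from an effective dose in mSv."""
    return (dose_msv / 1000.0) * NOMINAL_RISK_PER_SV

risk = projected_cancer_risk(10.0)   # a ~10 mSv CT examination
print(f"~1 in {round(1 / risk):,}")  # ~1 in 2,000
```

The result sits inside the 1-in-1000 to 1-in-10,000 range quoted in the abstract.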
Abstract:
The objective of the thesis is to structure and model the factors that contribute to and can be used in evaluating project success. The purpose of this thesis is to enhance the understanding of three research topics. The goal setting process, success evaluation and decision-making process are studied in the context of a project, a business unit and its business environment. To achieve the objective, three research questions are posed: 1) how to set measurable project goals, 2) how to evaluate project success, and 3) how to affect project success with managerial decisions. The main theoretical contribution comes from deriving a synthesis of these research topics, which have mostly been discussed apart from each other in prior research. The research strategy of the study has features from at least the constructive, nomothetical, and decision-oriented research approaches. This strategy guides the theoretical and empirical parts of the study. Relevant concepts and a framework are composed on the basis of the prior research contributions within the problem area. A literature review is used to derive constructs of factors within the framework. They are related to project goal setting, success evaluation, and decision making. On the basis of this, the case study method is applied to complement the framework. The empirical data includes one product development program, three construction projects, as well as one organization development, hardware/software, and marketing project in their contexts. In two of the case studies the analytic hierarchy process is used to formulate a hierarchical model that returns a numerical evaluation of the degree of project success. It has its origin in the solution idea, which in turn has its foundation in the notion of project success. The achieved results are condensed in the form of a process model that integrates project goal setting, success evaluation and decision making. The process of project goal setting is analysed as part of an open system that includes a project, the business unit and its competitive environment. Four main constructs of factors are suggested. First, the project characteristics and requirements are clarified. The second and third constructs comprise the components of client/market segment attractiveness and sources of competitive advantage. Together they determine the competitive position of a business unit. Fourth, the relevant goals and the situation of a business unit are clarified to stress their contribution to the project goals. Empirical evidence is gained on the exploitation of increased knowledge and on the reaction to changes in the business environment during a project to ensure project success. The relevance of a successful project to a company or a business unit tends to increase the higher the reference level of project goals is set. However, normal performance, or sometimes performance below this normal level, is intentionally accepted. Success measures make project success quantifiable. There are result-oriented, process-oriented and resource-oriented success measures. The study also links result measurements to enablers that portray the key processes. The success measures can be classified into success domains determining the areas on which success is assessed. Empirical evidence is gained on six success domains: strategy, project implementation, product, stakeholder relationships, learning situation and company functions.
However, some project goals, like safety, can be assessed using success measures that belong to two success domains. For example, a safety index is used for assessing occupational safety during a project, which is related to project implementation. Product safety requirements, in turn, are connected to the product characteristics and thus to the product-related success domain. Strategic success measures can be used to weave the project phases together. Empirical evidence on their static nature is gained. In order-oriented projects the project phases are often contractually divided between different suppliers or contractors. A project from the supplier's perspective can represent only a part of the "whole project" viewed from the client's perspective. Therefore static success measures are mostly used within the contractually agreed project scope and duration. Proof is also acquired on the dynamic use of operational success measures. They help to focus on the key issues during each project phase. Furthermore, it is shown that the original success domains and success measures, their weights and target values can change dynamically. New success measures can replace the old ones to correspond better with the emphasis of the particular project phase. This adjustment concentrates on the key decision milestones. As a conclusion, the study suggests a combination of static and dynamic success measures. Their linkage to an incentive system can make the project management proactive, enable fast feedback and enhance the motivation of the personnel. It is argued that the sequence of effective decisions is closely linked to the dynamic control of project success. According to the definition used, effective decisions aim at adequate decision quality and decision implementation. The findings support the view that project managers construct and use a chain of key decision milestones to evaluate and affect success during a project. These milestones can be seen as part of the business processes. Different managers prioritise the key decision milestones to a varying degree. Divergent managerial perspectives, power, responsibilities and involvement during a project offer some explanation for this. Finally, the study introduces the use of Hard Gate and Soft Gate decision milestones. The managers may use the former milestones to provide decision support on result measurements and ad hoc critical conditions. In the latter milestones they may make intermediate success evaluation also on the basis of other types of success measures, like process and resource measures.
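As a minimal sketch of the analytic hierarchy process step mentioned in this abstract (the comparison values, criteria and ratings are illustrative, not taken from the case studies): criterion weights are the normalized principal eigenvector of a pairwise comparison matrix, and the degree of project success is the weighted sum of per-criterion ratings.

```python
import numpy as np

# Illustrative 3x3 pairwise comparison matrix over success criteria.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# AHP weights: normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, np.argmax(eigvals.real)].real
weights = principal / principal.sum()

# Degree of project success: weighted per-criterion ratings in [0, 1].
ratings = np.array([0.8, 0.6, 0.9])
print("weights:", np.round(weights, 3))
print("success score:", round(float(weights @ ratings), 3))
```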
Abstract:
With the aim of monitoring the dynamics of the Livingston Island ice cap, the Departament de Geodinàmica i Geofísica of the Universitat de Barcelona began yearly surveys on Johnsons Glacier in the austral summer of 1994-95. During this field campaign 10 shallow ice cores were sampled with a manual vertical ice-core drilling machine. The objectives were: i) to detect the tephra layer accumulated on the glacier surface, attributed to the 1970 Deception Island pyroclastic eruption and today interstratified; ii) to verify whether this layer might serve as a reference level; iii) to measure the 137Cs radio-isotope concentration accumulated in the 1965 snow stratum; iv) to use this isochrone layer as a means of verifying the age of the 1970 tephra layer; and v) to calculate both the equilibrium line of the glacier and the average mass balance over the last 28 years (1965-1993). The stratigraphy of the cores, their cumulative density curves and the isothermal ice temperatures recorded confirm that Johnsons Glacier is a temperate glacier. Wind, solar radiation heating and liquid water are the main agents controlling the vertical and horizontal redistribution of the volcanic and cryoclastic particles that are sedimented and remain interstratified within the glacier. It is because of this redistribution that the 1970 tephra layer does not always serve as a very good reference level. The position of the equilibrium line altitude (ELA) in 1993, obtained by the 137Cs spectrometric analysis, varies from about 200 m a.s.l. to 250 m a.s.l. This indicates a rising trend in the equilibrium line altitude from the beginning of the 1970s to the present day. The varying slope orientation of Johnsons Glacier relative to the prevailing NE wind gives rise to large local differences in snow accumulation, which locally modifies the equilibrium line altitude. In the cores studied, 137Cs appears to be associated with the 1970 tephra layer. This indicates an intense ablation episode throughout the sampled area (at least up to 330 m a.s.l.), which probably occurred synchronously with the 1970 tephra deposition or later. A rough estimate of the specific mass balance reveals a considerable gradient of accumulation increasing with altitude.
Abstract:
The aim of this paper is to discuss the trend of overvaluation of the Brazilian currency in the 2000s, presenting an econometric model to estimate the real exchange rate (RER) and to determine which reference level of the RER should guide long-term economic policy. In the econometric model, we consider long-term structural and short-term components, both of which may be responsible for explaining the overvaluation trend of the Brazilian currency. Our econometric exercise confirms that the Brazilian currency was persistently overvalued throughout almost all of the period under analysis, and we suggest that the long-term reference level of the real exchange rate was reached in 2004. In July 2014, the average nominal exchange rate should have been around 2.90 Brazilian reais per dollar (against an observed nominal rate of 2.22 Brazilian reais per dollar) to restore the 2004 real reference level (average of the year). That is, according to our estimates, in July 2014 the Brazilian real was overvalued by 30.6 per cent in real terms relative to the reference level. Based on these findings, we conclude the paper by suggesting a mix of policy instruments that should have been used in order to reverse the overvaluation trend of the Brazilian real exchange rate, including a medium- to long-run target for the real exchange rate that would favor resource allocation toward more technology-intensive sectors.
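The 30.6 per cent figure follows directly from the two rates quoted in the abstract; a one-line check:

```python
reference_rate = 2.90  # BRL/USD needed to restore the 2004 real level
observed_rate = 2.22   # average observed nominal rate, July 2014

overvaluation = reference_rate / observed_rate - 1
print(f"overvaluation: {overvaluation:.1%}")  # ~30.6%
```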