964 results for Baseline forecast
Abstract:
BACKGROUND: The baseline susceptibility of primary HIV-2 to maraviroc (MVC) and other entry inhibitors is currently unknown. METHODS: The susceptibility of 19 HIV-2 isolates obtained from asymptomatic and AIDS patients and seven HIV-1 clinical isolates to the fusion inhibitors enfuvirtide (ENF) and T-1249, and to the coreceptor antagonists AMD3100, TAK-779 and MVC, was measured using a TZM-bl cell-based assay. The 50% inhibitory concentration (IC50), 90% inhibitory concentration (IC90) and dose-response curve slopes were determined for each drug. RESULTS: ENF and T-1249 were significantly less active on HIV-2 than on HIV-1 (211- and 2-fold, respectively). AMD3100 and TAK-779 inhibited HIV-2 and HIV-1 CXCR4 tropic (X4) and CCR5 tropic (R5) variants with similar IC50 and IC90 values. MVC, however, inhibited the replication of R5 HIV-2 variants with significantly higher IC90 values (42.7 versus 9.7 nM; P<0.0001) and lower slope values (0.7 versus 1.3; P<0.0001) than HIV-1. HIV-2 R5 variants derived from AIDS patients were significantly less sensitive to MVC than variants from asymptomatic patients, this being inversely correlated with the absolute number of CD4+ T-cells. CONCLUSIONS: T-1249 is a potent inhibitor of HIV-2 replication indicating that new fusion inhibitors might be useful to treat HIV-2 infection. Coreceptor antagonists TAK-779 and AMD3100 are also potent inhibitors of HIV-2 replication. The reduced sensitivity of R5 variants to MVC, especially in severely immunodeficient patients, indicates that the treatment of HIV-2-infected patients with MVC might require higher dosages than those used in HIV-1 patients, and should be adjusted to the disease stage.
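The relationship between the IC50, IC90 and slope values reported above follows the standard Hill (median-effect) dose-response model; the sketch below is a generic illustration of that model, not the study's analysis code, and the function names are invented here:

```python
def fraction_inhibited(conc, ic50, slope):
    """Hill/median-effect dose-response: fraction of viral replication
    inhibited at drug concentration `conc`."""
    return conc ** slope / (conc ** slope + ic50 ** slope)

def icx(ic50, slope, x):
    """Concentration producing x% inhibition: IC_x = IC50 * (x/(100-x))**(1/slope)."""
    return ic50 * (x / (100 - x)) ** (1 / slope)

# A shallower slope stretches the curve: for the same IC50, the slope of 0.7
# reported for R5 HIV-2 implies a much larger IC90/IC50 ratio than HIV-1's 1.3.
ratio_hiv2 = icx(1.0, 0.7, 90)   # about 23x the IC50
ratio_hiv1 = icx(1.0, 1.3, 90)   # about 5.4x the IC50
```

This is why the lower slope for HIV-2 matters clinically: even with a comparable IC50, far more drug is needed to reach 90% inhibition.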
Abstract:
This work presents the archaeometallurgical study of a group of metallic artefacts found at the Moinhos de Golas site, Vila Real (northern Portugal), which can generically be attributed to Proto-history (1st millennium BC, Late Bronze Age and Iron Age). The collection comprises 35 objects: weapons, ornaments and tools, together with others of difficult classification, such as rings, bars and one small thin bent sheet. Some of the objects can typologically be attributed to the Late Bronze Age; others are more difficult to attribute specifically. The archaeometallurgical study involved X-ray digital radiography, elemental analysis by micro-energy-dispersive X-ray fluorescence spectrometry and scanning electron microscopy with energy-dispersive spectroscopy, and microstructural observation by optical microscopy and scanning electron microscopy. The radiographic images revealed structural heterogeneities frequently related to the degradation of some artefacts, and the elemental analysis showed that the majority of the artefacts were produced in a binary bronze alloy (Cu-Sn) (73%), with others produced in copper (15%) and three artefacts in brass (Cu-Zn(-Sn-Pb)). Within each type of alloy there is a certain variability in composition and in the type of inclusions. The microstructural observations revealed that the majority of the artefacts underwent cycles of thermo-mechanical processing after casting. The diversity of metals/alloys identified was a discovery of great interest, particularly the presence of brasses. Their presence can be interpreted as imports related to the circulation of exogenous products during Proto-history and/or as the deposition of materials at the site at different moments, from the Late Bronze Age/Early Iron Age transition (Orientalizing period) onwards, including the Roman period.
Abstract:
This study analyses the access to and use of financial services by small business owners in the cities of Mozambique, as an important tool for boosting economic growth and diminishing inequality. It correlates owners' and business characteristics with the probability of adopting Points-of-Sale (POS), Mobile Banking and Mobile Money in everyday transactions. The main findings highlight that the use of POS is mostly affected by the size of the business and the volume of transactions (both positively correlated with POS adoption), while the use of mobile phone technologies for payments predominantly depends on the owner's age and whether he/she is a frequent cellphone user. Moreover, to increase the use of electronic means of payment it is necessary to increase financial literacy and improve banking services.
Abstract:
There are two significant reasons for the uncertainty of water demand: on one hand, an evolving technological world marked by accelerated change in lifestyles and consumption patterns; on the other, intensifying climate change. Given such an uncertain future, what enables policymakers to assess the state of water resources, which are affected by withdrawals and demands? Through a case study based on thirteen years of observation data in the Zayandeh Rud River basin in Isfahan province, Iran, this paper forecasts a wide range of urban water demand possibilities in order to create a portfolio of plans that could be utilized by different water managers. Two existing methods are compared and contrasted, and the Random Walk Methodology, referred to here as the 'On Uncertainty Path', is recommended to managers because it takes the uncertainties into account. This On Uncertainty Path combines a dynamic forecasting method with system simulation. The outcomes show the advantage of such methods, particularly for places where climate change will aggravate water scarcity, such as Iran.
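The idea of a "portfolio" of demand futures, rather than a single point forecast, can be sketched with a generic random-walk-with-drift simulator. This is a hedged illustration of the general technique; the function, parameter names and numbers are assumptions, not the paper's model:

```python
import random

def demand_paths(last_obs, drift, sigma, horizon, n_paths, seed=0):
    """Simulate n_paths random-walk-with-drift trajectories of water demand,
    producing a spread of possible futures instead of one point forecast."""
    rng = random.Random(seed)
    paths = []
    for _ in range(n_paths):
        level, path = last_obs, []
        for _ in range(horizon):
            # random walk with drift: next level = current + drift + noise
            level += drift + rng.gauss(0.0, sigma)
            path.append(level)
        paths.append(path)
    return paths

# e.g. 100 five-period demand trajectories from a last observation of 300 units
portfolio = demand_paths(last_obs=300.0, drift=2.0, sigma=5.0, horizon=5, n_paths=100)
```

Planners can then read off the spread of the simulated trajectories at each horizon instead of committing to a single deterministic forecast.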
Abstract:
This study assessed the development of sludge treatment and reuse policy since the original 1993 National Sludge Strategy Report (Weston-FTA, 1993). A review of the 48 sludge treatment centres, current wastewater treatment systems, and current or planned sludge treatment and reuse systems was carried out. Sludges from all Regional Sludge Treatment Centres (areas) were characterised through analysis of selected parameters. There have been many changes to the original policy as a result of boundary reviews, delays in developing sludge management plans, developments in technology and changes in tendering policy, most notably a move to design-build-operate (DBO) projects. As a result, there are now 35 designated Hub Centres. Only five of the Hub Centres are producing Class A Biosolids: Ringsend, Killarney, Carlow, Navan and Osberstown. Ringsend is the only Hub Centre that is fully operational, treating sludge from surrounding regions by Thermal Drying. Killarney is producing Class A Biosolids using Autothermal Thermophilic Aerobic Digestion (ATAD) but is not yet treating imported sludge. The remaining three plants are producing Class A Biosolids using Alkaline Stabilisation. Anaerobic Digestion with post-pasteurisation is the most common form of sludge treatment, with 11 Hub Centres proposing to use it. One plant is using ATAD, two intend to use Alkaline Stabilisation, seven have selected Thermal Drying and three have selected Composting. The remaining plants have not yet decided which sludge treatment to select, because of incomplete Sludge Management Plans and pending DBO contracts. Analysis of sludges from the Hub Centres showed that all Irish sewage sludge is safe for agricultural reuse as defined by the Waste Management (Use of Sewage Sludge in Agriculture) Regulations (S.I. 267/2001), provided that a nutrient management plan is taken into consideration and that the soil limits of the 1998 Waste Management Regulations (S.I. 148/1998) are not exceeded.
Abstract:
Block factor methods offer an attractive approach to forecasting with many predictors. These extract the information in these predictors into factors reflecting different blocks of variables (e.g. a price block, a housing block, a financial block, etc.). However, a forecasting model which simply includes all blocks as predictors risks being over-parameterized. Thus, it is desirable to use a methodology which allows for different parsimonious forecasting models to hold at different points in time. In this paper, we use dynamic model averaging and dynamic model selection to achieve this goal. These methods automatically alter the weights attached to different forecasting models as evidence comes in about which has forecast well in the recent past. In an empirical study involving forecasting output growth and inflation using 139 UK monthly time series variables, we find that the set of predictors changes substantially over time. Furthermore, our results show that dynamic model averaging and model selection can greatly improve forecast performance relative to traditional forecasting methods.
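The weight-updating mechanism described above can be sketched as a forgetting-factor recursion, the form in which dynamic model averaging is commonly implemented (after Raftery et al.). This is an illustrative sketch under that assumption, not the authors' code:

```python
def dma_update(weights, likelihoods, alpha=0.99):
    """One dynamic model averaging step.

    weights: current model probabilities (sum to 1).
    likelihoods: each model's predictive likelihood of the new observation.
    alpha: forgetting factor in (0, 1]; values below 1 discount old evidence.
    """
    # Prediction step: flatten past weights toward uniformity via forgetting.
    predicted = [w ** alpha for w in weights]
    s = sum(predicted)
    predicted = [p / s for p in predicted]
    # Update step: re-weight by how well each model forecast the new data point.
    posterior = [p * l for p, l in zip(predicted, likelihoods)]
    s = sum(posterior)
    return [p / s for p in posterior]
```

With a forgetting factor just below one, models that have forecast well in the recent past gain weight quickly, which is how the effective set of predictors can change substantially over time.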
Abstract:
The monetary policy reaction function of the Bank of England is estimated by the standard GMM approach and the ex-ante forecast method developed by Goodhart (2005), with particular attention to the horizons for inflation and output at which each approach gives the best fit. The horizons for the ex-ante approach are much closer to what is implied by the Bank’s view of the transmission mechanism, while the GMM approach produces an implausibly slow adjustment of the interest rate, and suffers from a weak instruments problem. These findings suggest a strong preference for the ex-ante approach.
Abstract:
Using survey expectations data and Markov-switching models, this paper evaluates the characteristics and evolution of investors' forecast errors about the yen/dollar exchange rate. Since our model is derived from the uncovered interest rate parity (UIRP) condition and our data cover a period of low interest rates, this study is also related to the forward premium puzzle and the currency carry trade strategy. We obtain the following results. First, with the same forecast horizon, exchange rate forecasts are homogeneous among different industry types, but within the same industry, exchange rate forecasts differ if the forecast time horizon is different. In particular, investors tend to undervalue the future exchange rate for long-term forecast horizons; however, in the short run they tend to overvalue the future exchange rate. Second, while forecast errors are found to be partly driven by interest rate spreads, evidence against the UIRP is provided regardless of the forecasting time horizon; the forward premium puzzle becomes more significant in shorter-term forecasting errors. Consistent with this finding, our coefficients on interest rate spreads provide indirect evidence of the yen carry trade over only a short-term forecast horizon. Furthermore, the carry trade seems to be active when there is a clear indication that the interest rate will be low in the future.
Abstract:
A forecast of nonepidemic morbidity due to acute respiratory infections was carried out using time series analysis. The data consisted of weekly reports of medical patient consultations from ambulatory facilities across the whole country. A version of a regression model was fitted to the data. Using this approach, we were able to detect the starting date of the epidemic under routine surveillance conditions for various age groups. It will be necessary to improve the data reporting system in order to introduce these procedures at the local health center level, as well as at the provincial level.
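Epidemic-onset detection of this kind is commonly done by fitting a Serfling-type regression baseline to nonepidemic weeks and flagging the first week that exceeds it by a threshold. The abstract does not give the exact model, so the sketch below is a hedged stand-in under that assumption, with invented names and numbers:

```python
import math

def serfling_baseline(weeks, trend, amp, phase, period=52):
    """Expected nonepidemic morbidity: linear trend plus one seasonal harmonic
    (a Serfling-type regression baseline)."""
    return [trend[0] + trend[1] * t + amp * math.cos(2 * math.pi * (t - phase) / period)
            for t in weeks]

def epidemic_start(observed, baseline, threshold_factor=1.2):
    """Return the first week index where observed cases exceed the baseline
    by the given factor, or None if no week does."""
    for t, (obs, base) in enumerate(zip(observed, baseline)):
        if obs > threshold_factor * base:
            return t
    return None
```

In practice the baseline coefficients would be estimated from several years of nonepidemic weekly reports, and the threshold from the residual variance.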
Abstract:
BACKGROUND: Management of blood pressure (BP) in acute ischemic stroke is controversial. The present study aims to explore the association between baseline BP levels and BP change and outcome in the overall stroke population and in specific subgroups with regard to the presence of arterial hypertensive disease and prior antihypertensive treatment. METHODS: All patients registered in the Acute STroke Registry and Analysis of Lausanne (ASTRAL) between 2003 and 2009 were analyzed. Unfavorable outcome was defined as modified Rankin score more than 2. A local polynomial surface algorithm was used to assess the effect of BP values on outcome in the overall population and in predefined subgroups. RESULTS: Up to a certain point, as initial BP was increasing, optimal outcome was seen with a progressively more substantial BP decrease over the next 24-48 h. Patients without hypertensive disease and an initially low BP seemed to benefit from an increase of BP. In patients with hypertensive disease, initial BP and its subsequent changes seemed to have less influence on clinical outcome. Patients who were previously treated with antihypertensives did not tolerate initially low BPs well. CONCLUSION: Optimal outcome in acute ischemic stroke may be determined not only by initial BP levels but also by the direction and magnitude of associated BP change over the first 24-48 h.
Abstract:
Exogenous administration of glucocorticoids is a widely used and efficient tool to investigate the effects of elevated concentrations of these hormones in field studies. Because the effects of corticosterone are dose- and duration-dependent, the exact course of plasma corticosterone levels after exogenous administration needs to be known. We tested the performance of self-degradable corticosterone pellets (implanted under the skin) in elevating plasma corticosterone levels. We monitored baseline (sampled within 3 min after capture) total corticosterone levels and investigated potential interactions with corticosteroid-binding globulin (CBG) capacity and the endogenous corticosterone response to handling in Eurasian kestrel Falco tinnunculus and barn owl Tyto alba nestlings. Corticosterone pellets designed for a 7-day release in rodents elevated circulating baseline total corticosterone for only 2-3 days compared to placebo-nestlings. The highest levels occurred 1-2 days after implantation, and levels decreased strongly thereafter. CBG capacity was also increased, resulting in a smaller, but still significant, increase in baseline free corticosterone levels. The release of endogenous corticosterone as a response to handling was strong in placebo-nestlings, but absent 2 and 8 days after corticosterone pellet implantation. This indicates a potential shut-down of the hypothalamo-pituitary-adrenal axis after the 2-3 days of elevated baseline corticosterone levels. Twenty days after pellet implantation, the endogenous corticosterone response to handling of nestlings implanted with corticosterone pellets attained similar levels as in placebo-nestlings. Self-degradable pellets proved to be an efficient tool to artificially elevate circulating baseline corticosterone, especially in field studies, requiring only one intervention. The resulting peak-like elevation of circulating corticosterone, the concomitant elevation of CBG capacity, and the absence of an endogenous corticosterone response to an acute stressor have to be taken into account.
Abstract:
Report of the findings of the Institute's Health Impact Assessment (HIA) work programme in 2001, in order to (a) record the baseline of HIA awareness, activity and thinking in Ireland and Northern Ireland and (b) identify the issues around its implementation.
Abstract:
Despite the central role of quantitative PCR (qPCR) in the quantification of mRNA transcripts, most analyses of qPCR data are still delegated to the software that comes with the qPCR apparatus. This is especially true for the handling of the fluorescence baseline. This article shows that baseline estimation errors are directly reflected in the observed PCR efficiency values and are thus propagated exponentially in the estimated starting concentrations as well as 'fold-difference' results. Because of the unknown origin and kinetics of the baseline fluorescence, the fluorescence values monitored in the initial cycles of the PCR reaction cannot be used to estimate a useful baseline value. An algorithm that estimates the baseline by reconstructing the log-linear phase downward from the early plateau phase of the PCR reaction was developed and shown to lead to very reproducible PCR efficiency values. PCR efficiency values were determined per sample by fitting a regression line to a subset of data points in the log-linear phase. The variability, as well as the bias, in qPCR results was significantly reduced when the mean of these PCR efficiencies per amplicon was used in the calculation of an estimate of the starting concentration per sample.
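The exponential error propagation described above is easy to demonstrate numerically. In the standard back-calculation, the starting concentration is N0 = Nq / E^Cq, so any error in the efficiency estimate E (for example, from a mis-set fluorescence baseline) is raised to the power of the quantification cycle Cq. The numbers below are illustrative, not from the article:

```python
def starting_concentration(nq, efficiency, cq):
    """Back-calculate the starting concentration from the quantification
    threshold: N0 = Nq / E**Cq, with E the per-cycle PCR efficiency
    (between 1 and 2)."""
    return nq / efficiency ** cq

# A small efficiency error caused by a baseline estimation error is
# amplified exponentially in the estimated starting concentration:
true_n0 = starting_concentration(1.0, 1.90, 25)    # true efficiency 1.90
biased_n0 = starting_concentration(1.0, 1.95, 25)  # baseline-biased estimate
fold_error = true_n0 / biased_n0                   # roughly 1.9-fold at Cq = 25
```

An efficiency error of under 3% thus nearly doubles the estimated starting concentration at Cq = 25, which is why careful baseline reconstruction pays off directly in the reproducibility of 'fold-difference' results.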